Category archive: hybrid cloud

Red Hat CEO pins 21% growth on hybrid cloud market

Red Hat demonstrated healthy growth in its quarterly earnings, with CEO James Whitehurst attributing the success to the growing hybrid cloud market.

The company reported Q4 revenues of $544 million and total revenues for the year of $2.05 billion, both up 21% on the previous year in constant currency. It now claims to be the only open-source company to have crossed the $2 billion revenue milestone.

“Our results reflect the fact that enterprises are increasingly adopting hybrid cloud infrastructures and open source technologies, and they are turning to Red Hat as their strategic partner as they do so,” said Whitehurst. “First, the fourth quarter marked our 56th consecutive quarter of revenue growth which contributed to Red Hat’s first year of crossing the $2 billion in total revenue milestone.”

While public cloud has been dominating the headlines in recent weeks, the Red Hat team remain positive that the hybrid cloud market will ultimately deliver on expectations. “Public cloud has been a great resource for us to reach new customers, including small and medium-sized businesses,” said Whitehurst.

“During meetings Frank (Frank Calderoni, CFO) and I have hosted over the quarter, investors have asked whether the public cloud is a positive driver for Red Hat. We firmly believe that it will be a hybrid cloud world, where applications will run across four – all four footprints; physical, virtual, public cloud, and private cloud.

“Our revenue from private IaaS, PaaS and cloud management technologies is growing nearly twice as fast as our public cloud revenue did when it was at the same size.”

Although it is unsurprising that Red Hat strongly backs the hybrid cloud model, security and data protection concerns in the industry add weight to the position. Despite progress made in the delivery and management of public cloud platforms, recent research has shown that enterprise decision makers are still concerned about the level of security offered in public cloud, and also about where their data will reside geographically. Both concerns are seemingly driving hybrid cloud adoption, which gives enterprises full control over how and where company-critical data is stored.

Over the last 12 months, Red Hat has also confirmed a number of partnerships with major players in the public cloud space to increase its footprint. Last year, a partnership was announced with Microsoft, which became a Red Hat Certified Cloud and Service Provider, enabling customers to run their Red Hat Enterprise Linux applications and workloads on Microsoft Azure. The Certified Cloud and Service Provider programme also includes relationships with Google and Rackspace. Red Hat claims that these relationships have generated more than $100 million in revenue, a 90% increase year-on-year.

“In Q4, we further expanded our technology offerings that can be consumed in the cloud. For instance, RHEL on-demand was activated on Azure in February,” said Whitehurst. “OpenShift, our PaaS solution, and our storage technology will be added to the Google cloud. And the RHEL OpenStack platform is now available at Rackspace as a managed service.”

Despite increased competition in the market over recent years, Red Hat has proved effective at holding onto customers. The largest 25 contracts that were up for renewal in the last quarter were all renewed, and the new deals were 25% higher in aggregate. The company also claims that 498 of its largest 500 deals over the last five years have been renewed.

“We never want to lose a deal, if we do, we never give up trying to win back the business,” said Calderoni. “This quarter, I am pleased to report that we closed a multi-million-dollar ‘win-back’ of one of those two former top deals.”

The company also forecasts revenues of between $558 million and $566 million for Q1, and between $2.38 billion and $2.42 billion for the financial year.

Natural Resources Wales extends cloud ERP relationship with Trustmarque

Systems integrator Trustmarque has announced it will continue its work with Natural Resources Wales, focusing on disaster recovery, and application and infrastructure support.

The agreement, which has now been in place for two years, was initially launched to help Natural Resources Wales simplify its IT estate following the merger of three different bodies. Natural Resources Wales was created through the merger of the Countryside Council for Wales, Environment Agency Wales, and Forestry Commission Wales, all of which operated on different ERP systems.

“The creation of Natural Resources Wales resulted in a complex and disparate IT estate, and over the past two years Trustmarque has helped us effectively simplify it,” said Paul Subacchi, Head of Business Support Services at Natural Resources Wales. “Our ERP system is absolutely critical to the organisation, enabling us to become more efficient and offer greater self-service functionality to our employees.  Cloud is a significant part of our IT strategy, so we need a platform that is available, resilient, flexible and secure to deliver our ERP system.”

Initially, the project focused on consolidating the ERP systems used for finance and HR onto a single platform, delivered through a combination of cloud, on-premise and managed services. Trustmarque will now deliver Natural Resources Wales’ sole ERP system as a private cloud service, as well as creating a self-service portal, MyNRW, for the organisation’s 2,000 employees.

Security was an important consideration for Natural Resources Wales, as Trustmarque has to continually demonstrate that it meets the minimum security requirements set out by G-Cloud. The requirements range from encryption to protect consumer data in transit, to Trustmarque staff security screening and consumer separation, to ensuring that its own supply chain meets the same standards.

“The work we have done with NRW throughout our collaboration is testament to Trustmarque’s end-to-end IT service capabilities and our expertise in delivering cloud services,” said Mike Henson, Cloud and Managed Services Director at Trustmarque. “By selecting the Trustmarque Cloud, Natural Resources Wales is now able to realise the benefits of its Unit 4 ERP system via a secure and robust platform.

“We’ve also removed the potential ‘headache’ that software licensing can cause, allowing Natural Resources Wales to focus on its core business without any compliance concerns. We see our continuing partnership with Natural Resources Wales as an important and valuable digital transformation programme, and look forward to our future work together.”

Head in the clouds? What to consider when selecting a hybrid cloud partner

The benefits of any cloud solution rely heavily on how well it’s built and how much advance planning goes into the design. Developing any organisation’s hybrid cloud infrastructure is no small feat, as there are many facets at play, from hardware selection to resource allocation. So how do you get the most from your hybrid cloud provider?

Here are six important considerations to make when designing and building out your hybrid cloud:

  1. Right-sizing workloads

One of the biggest advantages of a hybrid cloud service is the ability to match each IT workload to the environment that best suits it. You can build out a hybrid cloud solution with incredible hardware and impressive infrastructure, but if you don’t tailor the infrastructure to the specific demands of each workload, you may end up with performance snags, improper capacity allocation, poor availability or wasted resources. Dynamic or more volatile workloads are well suited to the hyper-scalability and speedy provisioning of cloud hosting, as are any cloud-native apps your business relies on. Performance workloads that require higher IOPS (input/output operations per second) and CPU utilisation are typically much better suited to a private cloud infrastructure if they have elastic qualities or requirements for self-service. More persistent workloads almost always deliver greater value and efficiency on dedicated servers in a managed hosting or colocation environment. Another key benefit of a hybrid cloud configuration is that the organisation only pays for extra compute resources as required.
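
The matching logic above can be sketched as a simple decision rule. This is an illustrative toy only: the workload traits, field names and placement labels are invented for the example, not part of any product.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    volatile: bool    # bursty or unpredictable demand
    high_iops: bool   # sustained storage/CPU performance requirements
    persistent: bool  # steady, long-running demand

def place(w: Workload) -> str:
    """Suggest a footprint for a workload, following the right-sizing guidance."""
    if w.volatile:
        return "cloud hosting"             # hyper-scalability, speedy provisioning
    if w.high_iops:
        return "private cloud"             # dedicated performance, self-service
    if w.persistent:
        return "dedicated/managed hosting" # best value for steady workloads
    return "review manually"

print(place(Workload("web-frontend", volatile=True, high_iops=False, persistent=False)))
```

In practice the inputs would come from monitoring data rather than hand-set flags, but the principle is the same: classify the workload first, then pick the footprint.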

  2. Security and compliance: securing data in a hybrid cloud

Different workloads may also have different security or compliance requirements, which dictate a certain type of IT infrastructure hosting environment. For example, your most confidential data shouldn’t be hosted in a multi-tenant environment, especially if the business is subject to Health Insurance Portability and Accountability Act (HIPAA) or PCI compliance requirements. It might seem obvious, but when right-sizing your workloads, don’t overlook which data must be isolated, and be sure to encrypt any data you opt to host in the cloud. While cloud hosting providers can’t achieve compliance for you, most offer an array of managed IT security solutions. Some even offer a third-party-audited Attestation of Compliance to help you document for auditors how their best practices validate against your organisation’s compliance needs.
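
To make the isolation point concrete, a pre-flight check could flag sensitive data classes before placement. The data-class names and rules below are hypothetical illustrations, not a compliance implementation.

```python
# Map illustrative data classes to the regime that restricts them (assumed labels).
SENSITIVE_CLASSES = {"phi": "HIPAA", "cardholder": "PCI DSS"}

def placement_allowed(data_classes, multi_tenant: bool, encrypted: bool):
    """Return (ok, reasons): whether this data set may be hosted as described."""
    reasons = []
    for cls in data_classes:
        if cls in SENSITIVE_CLASSES and multi_tenant:
            reasons.append(f"{cls} data falls under {SENSITIVE_CLASSES[cls]}; isolate it")
    if data_classes and not encrypted:
        reasons.append("encrypt data hosted in the cloud")
    return (not reasons, reasons)

ok, why = placement_allowed({"phi"}, multi_tenant=True, encrypted=True)
print(ok, why)
```

A real deployment would draw these rules from the organisation’s data classification policy; the value of the sketch is simply that the check runs before the workload lands anywhere.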

  3. Data centre footprint: important considerations

There is a myriad of reasons an organisation may wish to outsource its IT infrastructure: from shrinking its IT footprint and driving greater efficiencies, to securing capacity for future growth, or simply streamlining core business functions. The bottom line is that data centres require massive amounts of capital expenditure to build and maintain, and legacy infrastructure becomes obsolete over time. This can place a huge upfront capital strain on any mid-to-large-sized business’s expenditure planning.

But data centre consolidation takes discipline, prioritisation and solid growth planning. The ability to migrate workloads to a single, unified platform consisting of a mix of cloud, hosting and data centre colocation gives your IT operations greater flexibility and control, enabling a company to migrate workloads on its own terms with a central partner answerable for the result.

  4. Hardware needs

For larger workloads, should you host on-premises, in a private cloud, or through colocation, and what performance do you need from your hardware suppliers? A truly hybrid IT outsourcing solution lets you deploy the best mix of enterprise-class, brand-name hardware, which you either manage yourself or consume fully managed from a cloud hosting service provider. Performance requirements, configuration characteristics, your organisation’s access to specific domain expertise (in storage, networking, virtualisation, etc.) and the state of your current hardware often dictate the infrastructure mix you adopt. It may be the right time to review your inventory and decommission hardware reaching end of life. Document the server decommissioning and migration process thoroughly to ensure no data is lost mid-migration, and follow your lifecycle plan through for decommissioning servers.

  5. Personnel requirements

When designing and building any new IT infrastructure, it’s easy to get so caught up in the technology that you forget about the people who manage it. With cloud and managed hosting, you benefit from your provider’s expertise and their SLAs, so you don’t have to dedicate your own IT resources to maintaining those particular servers. This frees up valuable staff bandwidth, so your team can focus on tasks core to business growth, or train for the skills needed to handle the trickier configuration issues you introduce to your IT infrastructure.

  6. When to implement disaster recovery

A recent study by Databarracks found that 73% of UK SMEs have no proper disaster recovery plan in place in the event of data loss, so it’s well worth considering what your business continuity plan is in the event of a sustained outage. Building redundancy and failover into your cloud environment is an essential part of any defined disaster recovery service.

For instance, you might wish to mirror a dedicated server environment on cloud virtual machines – paying a small storage fee to house the redundant environment, but only paying for compute if you actually have to fail over. That’s just one of the ways a truly hybrid solution can work for you. When updating your disaster recovery plans to accommodate your new infrastructure, it’s essential to determine your Recovery Point Objective and Recovery Time Objective (RPO/RTO) on a workload-by-workload basis, and to design your solution with those priorities in mind.
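
The workload-by-workload objectives can be recorded as a simple table, from which operational settings follow; for example, the RPO bounds how stale the last good copy may be, so the backup or replication interval must not exceed it. The workloads and durations below are hypothetical.

```python
from datetime import timedelta

# Hypothetical per-workload objectives: (RPO, RTO)
objectives = {
    "orders-db": (timedelta(minutes=5), timedelta(minutes=30)),
    "reporting": (timedelta(hours=24),  timedelta(hours=8)),
}

def max_backup_interval(workload: str) -> timedelta:
    """Data loss is bounded by the time since the last good copy,
    so the backup interval must not exceed the workload's RPO."""
    rpo, _rto = objectives[workload]
    return rpo

print(max_backup_interval("orders-db"))   # the orders database needs near-continuous replication
print(max_backup_interval("reporting"))   # a nightly backup suffices here
```

The RTO column would similarly drive the choice between warm standby (minutes) and restore-from-backup (hours) for each workload.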

Written by Annette Murphy, Commercial Director for Northern Europe at Zayo Group

Hybrid environments and IoT pose biggest threats to infosec – F5

Service providers and enterprises face an insecure networking environment in coming years as more applications, data and services are sent to the cloud, according to networking vendor F5, writes Telecoms.com.

Speaking at the F5 Forum in London, VP of UK and Ireland Keith Bird stressed that security is now front and centre not only for the CTO and CEO, but for consumers, as intrusions and security breaches regularly make headlines. Bird pointed to the hybrid on-premise/cloud environment, in which an increasing number of enterprises and service providers operate, as a huge challenge looming for the information security industry.

“Not so long ago, we looked at just single points of entry. In today’s hybrid world, we’ve got apps in the data centre or in the cloud as SaaS and this is only increasing,” he said. “What we know for sure is that there is no longer a perimeter to the network – that’s totally disappeared.”

“81% of people we recently surveyed said they plan on operating in a hybrid environment, while 20% said they’re now moving over half of their corporate applications to the cloud. Even some of the largest companies in the world are taking up to 90% of their applications to the cloud.”

Given the volume and nature of data being hosted in the cloud, firms are far more accountable and held to tighter information security standards today than they have ever been. The average financial impact of an information security breach is now in the region of $7.2 million, according to F5 research.

“The average cost of a security breach consists of $110,000 lost revenue per hour of downtime – but the effect on a company’s website or application is costing potential business,” said Bird. “The average customer will abandon an attempted session after roughly four seconds of inactivity, so there’s new business being lost as well.”
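
Taking the quoted per-hour figure at face value, the revenue component of an outage is simple arithmetic; the outage length below is a hypothetical input, not a figure from the article.

```python
# Per the F5 figure quoted above: $110,000 in lost revenue per hour of downtime.
LOST_REVENUE_PER_HOUR = 110_000  # USD

def downtime_revenue_loss(hours_down: float) -> float:
    """Estimate the lost-revenue component of a breach from its downtime."""
    return hours_down * LOST_REVENUE_PER_HOUR

# e.g. a hypothetical six-hour outage:
print(downtime_revenue_loss(6))  # 660000
```

This deliberately excludes the harder-to-quantify costs Bird mentions, such as sessions abandoned after a few seconds of inactivity.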

Of the threats F5 is currently seeing, according to its customer surveys, the evolving nature and sophistication of attacks ranks highest, with the internal threat of employee ignorance a close second.

“So what are the top security challenges our customers are seeing?” said Bird. “58% are seeing increasingly sophisticated attacks on their networks, from zero-day to zero-second. 52% were concerned that their own employees don’t realise the impact of not following security policies. Obviously plenty of people said they don’t have enough budget, but that’s not quite the biggest problem facing security departments today.”

F5’s Technical Director Gary Newe, who’s responsible for field systems engineering, said the looming prospect of IoT “scares the bejesus” out of him.

“We’ve all heard about the IoT,” he said before pointing to the connected fridge as a farcically insecure IoT device. “There are 3 billion devices which run Java, which makes 3 billion hackable devices, and that scares the bejesus out of me. This isn’t just a potential impact to the enterprise, but it could have a massive impact on consumers and families. Fitness trackers, for example, just encourage people to give a tonne of data over to companies we don’t know about, and we don’t know how good their security is.”

The scariest bit, Newe emphasised, is the growing knowledge and intelligence of more technically adept youngsters today, and how the rate of technological change will only exacerbate the requirement for a fresh approach to network security.

“Change is coming at a pace, the likes of which we’ve never seen nor ever anticipated,” he said. “We’re building big walls around our networks, but hackers are just walking through the legitimate front doors we’re putting in instead.

“The scariest thing is that the OECD [Organisation for Economic Cooperation and Development] has said the average IQ today is 10 points higher than it was 20 years ago. So teenagers today are smarter than we ever were, they’ve got more compute power than we ever had, and they’re bored. That, to me, is terrifying.”

Microsoft strengthens cloud offering by bringing SQL Server to Linux

Microsoft is bringing its SQL Server to Linux, enabling SQL Server to deliver a consistent data platform across Windows and Linux, as well as on-premises and cloud.

The move has surprised some corners of the industry, as Microsoft moves away from its tradition of creating business software that runs only on the Windows operating system. It has historically been difficult to manage certain Microsoft products on anything other than a Windows server.

Microsoft has always sold PC software that can run on competitors’ machines, but Chief Executive Satya Nadella broadened the horizons of the business upon his appointment through a number of initiatives. One of the most notable moves was decoupling Microsoft’s Azure cloud computing system from Windows, and this week’s announcement continues the trend.

The news has been lauded by most as an astute move, strengthening Microsoft’s position in the market. According to Gartner, the number of Linux servers shipped increased to 3.6 million in 2014 from 2.4 million in 2011; Microsoft in the same period saw its shipments drop from 6.5 million to 6.2 million. The move opens up a new wave of potential customers for Microsoft and reduces concerns about lock-in.

Scott Guthrie, Microsoft EVP, Cloud and Enterprise Group, commented on the company’s official blog: “SQL Server on Linux will provide customers with even more flexibility in their data solution. One with mission-critical performance, industry-leading TCO, best-in-class security, and hybrid cloud innovations – like Stretch Database, which lets customers access their data on-premises and in the cloud whenever they want at low cost – all built in. We are bringing the core relational database capabilities to preview today, and are targeting availability in mid-2017.”

The announcement also detailed a number of key features of SQL Server 2016, focused on the critical areas of data and security. Encryption capabilities that keep data encrypted at rest, in motion and in memory are among the USPs, building on Microsoft’s marketing messages over the last 12 months.

Furthering its efforts to diversify the business, Microsoft announced last week that it would acquire mobile app development platform provider Xamarin.

Incorporating Xamarin into the Microsoft business will enhance its base of developer tools and services, once again building on the theme of broadening market appeal and opening new customer avenues for the tech giant.

VMware’s new launches target hybrid cloud and software defined data centres

Virtualisation giant VMware has announced two new updates which promise to strengthen its management of the hybrid cloud and of hyperconverged software.

The new version of the VMware vRealize Suite is purpose-built for the hybrid cloud, the vendor claims. Meanwhile, in a separate release, the new Virtual SAN 6.2 caters for all-flash hyper-converged systems to turbo-charge cloud computing, adding options for data deduplication, data compression and erasure coding for as little as one dollar per usable gigabyte.

The new VMware vRealize Business for Cloud 7 promises to address intelligent operations, infrastructure modernisation and DevOps challenges. VMware claims the vRealize Suite manages all the computing, storage, network and application services across hybrid cloud environments. DevOps-ready IT, for example, lets IT teams build a cloud for development teams with a complete application stack, supporting developer choice in the form of both API and GUI access to resources. The possibilities are widened by Code Stream, a continuous delivery feature which speeds up application delivery.

Meanwhile, in the engine room of the cloud, VMware vRealize Operations 6.2 creates the capacity for intelligent workload placement and tight integration with VMware’s vSphere Distributed Resource Scheduler.

Meanwhile, VMware has made new engineering improvements to its high-performance infrastructure for the software-defined data centre (SDDC). The key to this improvement is the VMware Virtual SAN technology, which has been sold to 3,000 enterprise data centre customers in the 21 months since its initial release.

The new hyper-converged software, created by a blend of VMware vSphere, Virtual SAN and vCenter Server, converts Intel-based x86 servers and direct-attached storage into unified, simple and ‘robust’ units of high-performance computing infrastructure, VMware claims. This slashes hardware and software costs and management complexity while boosting performance, the company claims.

“VMware’s hyper-converged software is gaining customer traction due to its simple, cost-effective and high-performance architecture,” said Yanbing Li, general manager of the Storage and Availability Business Unit at VMware. Virtual SAN 6.2 delivers up to ten times the efficiency, she claimed.

Cloud merits acknowledged but adoption concerns linger – Oracle report

Cloud technology is almost universally acknowledged for its catalysing effect on innovation and customer retention, according to new research from Oracle. However, major barriers to adoption remain.

In Oracle’s study, 92% of its sample of industry leaders testified that the cloud enables them to innovate faster. It also helps companies stay afloat, with nearly three quarters (73%) reporting that cloud technology has helped them retain existing customers more effectively. The cloud also comes out well as a strategic weapon, with 76% of enterprises saying the newer, more flexible model for handling information helps them win new customers.

However, the study conducted for Oracle by IDG Connect indicates there is much room for improvement in the adoption of cloud computing. Only half (51%) of the survey sample say their businesses will have reached cloud maturity within two years. According to an Oracle statement, this is a consequence of current uncertainty about moving to the cloud.

Though a compromise between privately owned IT systems and publicly available services is seen as the obvious choice, there are grave concerns about hybrid cloud adoption. Instead of getting the best of both worlds with a hybrid system, many users (60%) reported that the thought of managing multiple IT architectures was off-putting. There are fears about the reliability and availability of network bandwidth, cited by 57% of the survey as a barrier to adoption. A lack of trust in the relationship with IT suppliers was also a major concern for 52% of the survey sample. Meanwhile, those building private cloud infrastructures continue to see security as the prime concern, according to Oracle.

Attitudes could change, but that involves converting the considerable opposition of cloud-sceptics. There are still significantly large numbers of IT experts who say that winning over key business decision makers is their biggest challenge; 29% of those surveyed identified this as an issue.

Johan Doruiter, Oracle’s Senior VP of Systems in EMEA, remained optimistic. “As cloud rapidly reaches maturity, we are seeing a shift in how enterprises perceive the chief benefits and barriers to adoption,” he said. “Traditional concerns have been replaced by the operational worries.”

Cirba expands its infrastructure management optimiser to the public cloud

Infrastructure management vendor Cirba has announced a new workload routing and management option for hybrid clouds. The Cirba infrastructure resource juggling service can now support cloud systems from Microsoft Azure, Amazon Web Services (AWS) and IBM SoftLayer, allowing users to extend their internal management to straddle the public cloud too.

Cirba’s service provides the decision control points which automatically determine where applications can safely run in hybrid environments. It decides where each task runs by conducting detailed analysis of each application’s requirements, then calculates how best to match them against the security, cost and technical resources available. The service now extends beyond private infrastructure to include public clouds.

Though originally designed as an internal system for juggling resources more efficiently, from Thursday Cirba is offering new integrations to Azure, AWS and SoftLayer in order to bring centralised management for enterprise applications across hybrid cloud environments.

Cirba claims that customers will now have extended visibility into applications that are hosted externally. This means clients can judge whether their cloud vendor is apportioning the appropriate level of resources, it claims. Clients will also be able to assess these applications against on-premise hosting environments to determine whether they should be brought back in-house, claims Andrew Hillier, co-founder and CTO of Cirba. “Without analytics, organisations cannot automate their processes nor can they effectively determine how to meet application requirements without risk or excessive cost,” said Hillier.

Cirba says that the new additions mean it can now support a range of systems, adding to its existing support for internal VMware vCenter, Microsoft Hyper-V, IBM PowerVM on AIX and Red Hat Enterprise Virtualisation-based environments.

In June 2016, Cirba aims to update its Reservation Console to create a centralised, policy-based control system for hybrid clouds.

VMware broadens its Horizon 7 and Horizon Air

Cloud infrastructure vendor VMware has announced that VMware Horizon 7 and VMware Horizon Air will be simpler to set up, faster, easier to maintain and more flexible.

Version 7 of Horizon promises new features that VMware describes under the headings of Just in Time Delivery, Blast Extreme, application lifecycle management, smart policies and integration with VMware Workspace ONE.

The new Just in Time Delivery option, a product of Instant Clone Technology (formerly Project Fargo), means managers can provision 2,000 desktops in under 20 minutes. Blast Extreme offers options for GPU offload to increase scale, plus mobile network support. The new application lifecycle efficiencies promise to cut storage and operational costs by up to 70% and to slash the time needed for managing images by up to 95%. Smart policies and integration with VMware Workspace ONE will both improve management of internal resources.

Meanwhile, VMware’s Horizon Air has a hybrid mode that will pave the way for simple out-of-the-box set-up, the vendor says. Once that is achieved, users can create and scale desktops faster, with new Instant Clone technology integrated with VMware’s App Volumes and User Environment Management technologies.

Another major advantage, according to VMware, is greater hybrid cloud flexibility. In practical terms this will mean that applications and desktop workloads can be moved back and forth from on-premises data centres to the cloud more effectively. The process will be managed from a consistent Cloud Control Plane and will support the use of the cloud as a primary system for everyday work, or as a secondary use case for desktop bursting or disaster recovery.

VMware Horizon 7 and VMware Horizon Air with Hybrid-mode are expected to be generally available before March 2016. VMware Horizon 7 pricing starts at $250 per user for on-premises perpetual licenses. VMware Horizon Air Hybrid-mode cloud subscription pricing starts at $16 per user per month for named users and $26 per user per month for concurrent connections.

“VMware Horizon Air will provide our team with the opportunity to scale desktops on an as needed basis with access to VMware’s public cloud in both Europe and the United States,” said user Jason Bullock, executive director of IT global infrastructure and support at BDP International.

IBM and Catalogic Software combine to slash costs of data management

IBM and Catalogic Software have jointly launched a new set of systems which combine Catalogic’s copy data manager ECX with IBM’s storage offerings, in a bid to help clients trim the excessive costs of duplicate data.

The objective is to make DevOps and hybrid cloud initiatives easier and less wasteful for IBM clients by automating storage and data management, creating self-service options and providing access to Catalogic software through IBM’s RESTful API management.

Catalogic’s ECX is described as a virtual appliance that runs on a client’s existing infrastructure and acts as a control point for storage controllers, storage software systems and hypervisors. IBM says it has validated the system through months of testing, and that the two products work in tandem to improve the operations of the core data centre. The combination creates new tools necessary for supporting new workload environments and use cases, according to a Catalogic statement.

Today’s core data centre architectures and associated processes are reliable and secure, but they don’t lend themselves to agility and flexibility. Catalogic’s ECX has given IBM a means of creating the former without sacrificing the latter, said IBM. The key is making the storage infrastructure more flexible so that data can be virtualised and kept in one place, rather than endlessly replicated for a variety of different project teams. One benefit is that live environments can support key IT functions that rely on copies of production data without massively expanding the data footprint. ECX and IBM’s services can jointly create a culture of elasticity and sharing of cloud resources across a variety of functions, including disaster recovery, test and development, analytics and other departments.

The lower operating costs of cloud resources and the manual effort saved through ECX’s cloud automation will bring up to a 300% return on investment, claims IBM.

Among the systems that ECX can now combine with are IBM’s Storwize family of hybrid flash/HDD systems, the SAN Volume Controller, FlashSystem V9000, Hybrid Cloud Operations with IBM SoftLayer and IBM Spectrum Protect.

“Copy data management can significantly improve data access and availability and create remarkable cost savings,” said Bina Hallman, VP of IBM Storage and Software Defined Systems.