Why hybrid cloud is the way forward for IT leaders

As enterprises increasingly recognise the significant value and opportunity that cloud computing presents, they continue to invest in and grow their cloud strategies. According to Gartner, the use of cloud computing is growing, and by 2016 it will account for the bulk of new IT spend.

In spite of this, Gartner estimated that by 2020 on-premise cloud will still account for 70 per cent of the total market, and VMware CEO Pat Gelsinger stated that over 92 per cent of current cloud deployments are still on-premise or private. Indeed, in NaviSite’s own recent survey of over 250 UK and US IT professionals, over 89 per cent of UK respondents stated that deploying some form of private cloud and hybrid infrastructure is a priority within the next 12 months.

There are many good reasons why organisations still opt to run applications on hardware owned and managed in-house. Most organisations still have large investments in technology, people and processes that cannot simply be written off; certain workloads still do not suit virtualised or multi-tenanted platforms; renting resources is not always cheaper or better than owning them; and there are valid security and compliance reasons for keeping certain data on-premise.

In spite of these concerns, however, the public cloud continues to grow at a ferocious rate, validating the benefits this infrastructure delivery model offers. That certain data and workloads remain better suited to a private cloud or a physical hosted platform is the caveat that opens the door to hybrid solutions. Although many UK businesses have migrated certain applications to the cloud, over three quarters of respondents in NaviSite’s recent survey had migrated under 50 per cent of their infrastructure to the cloud.

A hybrid solution gives organisations the option of scaling resources for specific workloads and running each application on the platform most appropriate to the task, as the sketch below illustrates. A highly dynamic application with varying spikes in demand may be best supported in the public cloud, a performance-intensive application may be better suited to the private cloud, and a dataset with high regulatory requirements may need to remain on a physical hosted platform. A hybrid solution allows an organisation to place its data where regulatory or security requirements dictate. This is significant, as 59 per cent of UK IT professionals surveyed by NaviSite still cite security as their main concern with cloud migration.
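To make that placement logic concrete, here is a minimal Python sketch of such a decision rule. The workload attributes and platform names are illustrative assumptions, not any vendor’s actual API.

```python
# Hypothetical sketch: route each workload to the platform class
# described above based on its characteristics.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    spiky_demand: bool          # highly dynamic, with varying spikes
    performance_critical: bool  # needs dedicated, tuned resources
    regulated_data: bool        # subject to regulatory/security requirements

def place(w: Workload) -> str:
    """Return the platform best suited to this workload."""
    if w.regulated_data:
        return "physical hosted platform"  # keep data where rules dictate
    if w.performance_critical:
        return "private cloud"             # predictable, dedicated capacity
    if w.spiky_demand:
        return "public cloud"              # elastic scaling for bursts
    return "private cloud"                 # sensible default for steady loads

print(place(Workload("web-frontend", True, False, False)))  # -> public cloud
print(place(Workload("payments-data", False, True, True)))  # -> physical hosted platform
```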

Hybrid continues to grow because it offers organisations the best of both worlds. For IT leaders, a hybrid strategy that pragmatically embraces the new, whilst making the best use of existing investments, is essential. By going hybrid, today’s IT leaders can pick the best-fit strategy for the current demands of their business, within a flexible framework that will enable them to manage future change.

Microservices and IoT Panel | @CloudExpo [#DevOps #IoT #Microservices]

Buzzword alert: Microservices and IoT at a DevOps conference? What could possibly go wrong? Join this panel of experts as they peel away the buzz and discuss the important architectural principles behind implementing IoT solutions for the enterprise. As remote IoT devices and sensors become increasingly intelligent, they become part of our distributed cloud environment, and we must architect and code accordingly. At the very least, you’ll have no problem filling in your buzzword bingo cards.

How Do You Operationalize a Hybrid World? By @LMacVittie | @CloudExpo [#Cloud]

One of the unintended consequences of cloud is the operational inconsistency it introduces. That inconsistency arises because cloud commoditizes the infrastructure we’re used to having control over and visibility into. Everything from the core network to the application services upon which business and operations rely to ensure performance, availability and security is often obscured behind simplified services whose policies and configurations cannot be reconciled with those we maintain on-premise.

US Army deploys hybrid cloud for logistics data analysis

The US Army is working with IBM to deploy a hybrid cloud platform to support its logistics system

The US Army is partnering with IBM to deploy a hybrid cloud platform to support data warehousing and data analysis for its Logistics Support Activity (LOGSA) platform, the Army’s logistics support service.

LOGSA provides logistics information capabilities through analytics tools and BI solutions to acquire, manage, equip and sustain the materiel needs of the organisation. It is also the home of the Logistics Information Warehouse (LIW), the Army’s official data system for collecting, storing, organising and delivering logistics data.

The Army said it is working with IBM to deploy LOGSA, which IBM said is the US federal government’s largest logistics system, on an internal hybrid cloud platform in a bid to improve its ability to connect to other IT systems, broaden the organisation’s analytics capabilities, and save money (the Army reckons up to 50 per cent).

Anne Altman, general manager for U.S. Federal at IBM, said: “The Army not only recognized a trend in IT that could transform how they deliver services to their logistics personnel around the world, they also implemented a cloud environment quickly and are already experiencing significant benefits. They’re taking advantage of the inherent benefits of hybrid cloud: security and the ability to connect it with an existing IT system. It also gives the Army the flexibility to incorporate new analytics services and mobile capabilities.”

VMware, Telstra bring virtualisation giant’s public cloud to Australia

Telstra and VMware are bringing the virtualisation incumbent’s public cloud service to Australia

VMware announced it is partnering with Telstra to bring its vCloud Air service to Australia.

VMware said the initial VMware vCloud Air deployment in Australia is hosted out of an unspecified Telstra datacentre.

“We continue to see growing client adoption and interest as we build out VMware vCloud Air with our newest service location in Australia,” said Bill Fathers, executive vice president and general manager, Cloud Services Business Unit, VMware.

“VMware’s new Australia service location enables local IT teams, developers and lines of business to create and build their hybrid cloud environments on an agile and resilient IT platform that supports rapid innovation and business transformation,” Fathers said.

Last July VMware made a massive push into the Asia Pacific region, inking deals with SoftBank in Japan and China Telecom in China to bring its public cloud service to the area. The company said it is adding an Australian location in a bid to appeal to users with strict data residency requirements.

Duncan Bennet, vice president and managing director, VMware A/NZ, added: “Australian businesses will have the ability to seamlessly extend applications into the cloud without any additional configuration, and will have peace of mind, knowing this IT infrastructure will provide a level of reliability and business continuity comparable to in-house IT. It means businesses can quickly respond to changing business conditions, and scale IT up and down as required without disruption to the overall business.”

Telstra has over the past couple of years inked a number of partnerships with large enterprise IT incumbents to strengthen its position in the cloud segment. It was one of the first companies to sign up to Cisco’s Intercloud programme last year, and earlier this month it announced a partnership with IBM that will see the Australian telco offer local customers direct network access to SoftLayer cloud infrastructure.

Red Hat, Dell redouble OpenStack private cloud efforts

Red Hat and Dell are co-developing OpenStack-based private cloud solutions

Red Hat and Dell have announced a series of co-engineered, high-density servers the companies claim are optimised for large-scale OpenStack deployments.

The co-engineered servers ship with Red Hat Enterprise Linux 7 and are based on Dell PowerEdge R630 and R730xd high-density rack servers, the former ideal for compute and the latter optimised for storage utilisation.

“Enterprise customers are requiring robust and rapidly scalable cloud infrastructures that deliver business results,” said Jim Ganthier, vice president and general manager, Dell Engineered Solutions and Cloud.

“Dell and Red Hat continue to jointly deliver cost effective, open source-based cloud computing solutions that provide greater agility to our customers, and this newest version of the Dell Red Hat Cloud Solution leverages best of breed technology from both companies to do so,” Ganthier said.

Radhesh Balakrishnan, general manager, OpenStack at Red Hat, said: “Red Hat’s ongoing collaboration with Dell to co-engineer enterprise-grade cloud solutions is further enhancing OpenStack to be production-ready. End customers continue to benefit from Red Hat’s strategic partnership with Dell as we deliver joint solutions that streamline deployment and accelerate time to innovation.”

A number of interrelated forces seem to be at play here. In revealing its fourth quarter 2015 financial results last month, Red Hat said deals involving OpenStack-based offerings tripled compared with the fourth quarter of 2014. And with HP pushing its Helion OpenStack-based private cloud offerings hard, it seems reasonable to expect that Dell, one of its largest private cloud rivals, would want to counter with OpenStack-integrated offerings of its own.

Microsoft unveils Hyper-V containers, nano servers

Microsoft has unveiled Hyper-V containers and nano servers

Microsoft has unveiled a number of updates to Windows Server, including Hyper-V containers, which are essentially Docker containers embedded in Hyper-V VMs, and nano servers, a slimmed-down Windows Server image.

Microsoft said Hyper-V containers are ideal for users who want virtualisation-grade isolation but still want to run their workloads within Docker containers in a Windows ecosystem.

“Through this new first-of-its-kind offering, Hyper-V Containers will ensure code running in one container remains isolated and cannot impact the host operating system or other containers running on the same host,” explained Mike Neil, general manager for Windows Server at Microsoft, in a recent blog post.

“In addition, applications developed for Windows Server Containers can be deployed as a Hyper-V Container without modification, providing greater flexibility for operators who need to choose degrees of density, agility, and isolation in a multi-platform, multi-application environment.”
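As an illustration of what choosing an isolation level looks like in practice, the following minimal sketch uses the Docker SDK for Python on a Windows host to request Hyper-V isolation for a container. The image tag is an assumption for illustration, and the feature requires a Windows version with Hyper-V container support.

```python
# Minimal sketch, assuming a Windows host with Docker, the Docker SDK
# for Python (pip install docker), and a Windows base image available.
import docker

client = docker.from_env()

# isolation="hyperv" asks Docker to run the container inside a lightweight
# Hyper-V VM instead of sharing the host kernel, giving the
# virtualisation-grade isolation described above. The same image can run
# unmodified with isolation="process" as a plain Windows Server Container.
output = client.containers.run(
    "mcr.microsoft.com/windows/servercore:ltsc2022",  # illustrative image tag
    "cmd /c echo hello from a Hyper-V isolated container",
    isolation="hyperv",
    remove=True,
)
print(output.decode())
```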

Windows Server Containers will be enabled in the next release of Windows Server, which is due to be demoed in the coming weeks; the move makes good on Microsoft’s commitment to make the Windows Server ecosystem (including Azure) Docker-friendly.

The company also unveiled what it’s calling nano servers, a “purpose-built OS” that is essentially a stripped-down Windows Server image optimised for cloud and container workloads. It can be deployed onto bare metal, and because Microsoft has removed so much code, it boots up and runs more quickly.

“To achieve these benefits, we removed the GUI stack, 32-bit support (WOW64), MSI and a number of default Server Core components. There is no local logon or Remote Desktop support. All management is performed remotely via WMI and PowerShell. We are also adding Windows Server Roles and Features using Features on Demand and DISM. We are improving remote manageability via PowerShell with Desired State Configuration as well as remote file transfer, remote script authoring and remote debugging. We are working on a set of new Web-based management tools to replace local inbox management tools,” the company explained.

“Because Nano Server is a refactored version of Windows Server it will be API-compatible with other versions of Windows Server within the subset of components it includes. Visual Studio is fully supported with Nano Server, including remote debugging functionality and notifications when APIs reference unsupported Nano Server components.”
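Since there is no local logon, day-to-day administration of a nano server would happen over remote PowerShell or WMI, as the quote above describes. Here is a minimal sketch of that workflow using the third-party pywinrm library; the host name and credentials are placeholders.

```python
# Minimal sketch of remote PowerShell management, as described above,
# using pywinrm (pip install pywinrm). Host name and credentials are
# placeholders; WinRM must be enabled on the target machine.
import winrm

session = winrm.Session(
    "nano-server-01",                  # hypothetical host name
    auth=("administrator", "s3cret"),  # placeholder credentials
    transport="ntlm",
)

# Run a PowerShell command on the remote machine and print its output.
result = session.run_ps("Get-Service | Where-Object Status -eq 'Running'")
print(result.std_out.decode())
```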

The move is a sign Microsoft is keen to keep its on-premise and cloud platforms ahead of the technology curve, and is likely to appeal to .NET developers who are attracted to some of the benefits of containers while wanting to stay firmly within a Windows world in terms of the tools and code they use. Still, the company said it is working with Chef to ensure nano servers work well with Chef’s DevOps tools.

Cloud security vendor Palerra scores $17m

Palerra is among a number of cloud security startups combining predictive analytics and machine learning algorithms to bolster cloud security

Cloud security vendor Palerra has secured $17m in series B funding, a move the company said would help accelerate sales and marketing efforts around its predictive analytics and threat detection services.

Palerra’s flagship service, LORIC, combines threat detection and predictive analytics to provide automatic incident response and remediation for malicious traffic flowing to a range of cloud services and platforms.

Over the past few years we’ve seen a flurry of cloud security startups emerge, all deploying analytics and machine learning algorithms to detect perceived and actual threats and respond in real time, so it would seem enterprises are becoming spoilt for choice.
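As a generic illustration of this class of technique (and not Palerra’s actual method), the sketch below trains an unsupervised anomaly detector on a baseline of normal activity and flags outliers in new events, using scikit-learn’s IsolationForest. The features are invented placeholders.

```python
# Generic sketch of ML-based threat detection: learn "normal" from a
# baseline, then flag deviations. Features are invented placeholders
# (requests/min, bytes sent, failed logins); not any vendor's method.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Baseline of normal cloud activity: 1,000 samples with 3 features.
normal = rng.normal(loc=[100, 5_000, 1], scale=[10, 500, 1], size=(1000, 3))

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New events: two ordinary-looking samples and one exfiltration-like burst.
new_events = np.array([
    [102.0, 5_100.0, 0.0],    # looks normal
    [97.0, 4_850.0, 2.0],     # looks normal
    [400.0, 90_000.0, 60.0],  # anomalous: huge egress plus failed logins
])
print(model.predict(new_events))  # 1 = normal, -1 = flagged as an anomaly
```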

The $17m round was led by August Capital, with participation from current investors Norwest Venture Partners (NVP), Wing Venture Capital and Engineering Capital, and brings the total amount secured by the firm to $25m.

The funds will be used to bolster sales and marketing efforts at the firm.

“The dramatic rise in adoption of cloud services by today’s enterprises against the backdrop of our generation’s most potent cyber threats has necessitated a new approach. LORIC was designed to meet these threats head on and this new round underscores our commitment to deliver the most powerful cloud security solution in the industry,” said Rohit Gupta, founder and chief executive officer of Palerra.

“As the perimeter disintegrates into a set of federated cloud-based and on-premises infrastructures, effective monitoring becomes almost impossible, unless security controls are embedded in these heterogeneous environments. This will require enterprises to reconsider and possibly redesign their security architecture and corresponding security controls by placing those controls in the cloud,” Gupta added.

Security failing to keep pace with cloud technology adoption, report finds

Cloud service providers (CSPs) can no longer treat security as a luxury add-on, and customers have to ensure their providers take care of the issue, a new report asserts.

The research, the latest cab off the rank from Ovum and FireHost, entitled “The Role of Security in Cloud Adoption within the Enterprise”, offers sound advice to vendors and users alike. True, it’s stuff everyone will have heard before – but it’s worth repeating.

“On too many occasions, security has been positioned as an afterthought when new technology initiatives have been brought to market,” Ovum analyst Andrew Kellett writes. “Any service that includes access via public networks cannot ignore user and data protection requirements.”

It’s certainly a view FireHost agrees with. “For too long, businesses have made assumptions about the security of their cloud service providers,” said Eleri Gibbon, FireHost EMEA VP. “In the instance of a data breach, the client suffers the consequences. That doesn’t sit right with me – after all, if your house falls down unexpectedly, you’d expect people to ask questions about how it was built in the first place.”

It’s safe to say, too, that companies aren’t exactly over-confident in their providers’ ability to put out the fires. Ovum research shows 92% of enterprises globally have concerns with their CSP over shared cloud infrastructure security issues. The numbers are similar for lack of control over where data is kept (92%) and lack of visibility into the security controls available (91%).

What may be driving this? If the CSP can’t deal with threats, don’t expect the customer to: a recent Informatica and Ponemon Institute study found 60% of global respondents were “not confident” they had the ability to proactively respond to cloud-based data threats.

However, not all is lost. Kellett argues security should be seen as a “positive driver” for organisations. “Despite well-known security and compliance concerns, there are positives to be gained from working with a cloud-based service provider that includes security and compliance facilities as baked-in components of its overall service delivery model,” he wrote.

“All cloud solutions should be expected to include elements of security as part of the overall offering, but not all cloud security has been created equally or built to achieve the same levels of protection,” he added.

HP Cannot Compete As Public Cloud Service Provider

One year ago, HP thought it would be competing with Amazon, Google and Microsoft to become the leader in cloud services. HP has re-branded and re-launched its cloud services many times, most recently as Helion. However, the customer base is practically non-existent.

Last year HP acquired Eucalyptus, an open-source vendor marketed as being Amazon Web Services compatible. This deal made no sense and just added to HP’s gloomy cloud history. Though HP is ceding the public cloud, it is still selling servers, and its largest customers are cloud companies and cloud behemoths. For other companies, HP hopes to build smaller cloud systems in ways that let them also utilize Amazon, Microsoft and other services.

For example, a company could use HP hardware to create content, Microsoft’s cloud to handle email or heavy information workloads, and Salesforce.com as a cloud platform for sharing information.

HP was the leader in selling computing services to businesses, so it looked as though selling computing in a new way would be easy for the company. However, public clouds operate at a scale that is difficult to master, with more than a million servers each, which makes it very hard for newcomers to enter the market.

Enabling companies to create their own software applications is an important aspect of corporate technology, and it is an area where HP is seriously lacking. HP has put its engineers and salespeople together to become better acquainted with each other’s services, in order to promote the sharing of assets and collaboration.
