VMware’s Software on Amazon Cloud – Surprise!

For many years, VMware and Amazon have stood on opposite sides of the enterprise computing world. While VMware asked customers to run their businesses on their own servers, Amazon encouraged companies to move those workloads to the cloud. Now, the two companies have teamed up to combine their approaches and services.

Beginning next year, VMware’s software will run on Amazon’s cloud, giving VMware customers the ability to use their existing tools to manage their servers, except that those servers will be located in the cloud. Alongside this, users can also make use of Amazon’s database and storage tools and services.

You can already run VMware’s virtual machines on Amazon’s cloud, and can even use VMware’s management tool, vCenter, to manage them. So what’s different about this partnership? To start with, the two companies have decided to create a new version of the Amazon cloud that allows VMware’s virtual machines to run directly on it, without an Amazon software layer in between. The partnership will also give users the flexibility to run their software both in the cloud and in their existing data centers.

In many ways, this partnership reiterates that the cloud is both the present and the future, and that no business can afford to ignore it. It is also a significant milestone in the world of cloud computing, as VMware has gone from seeing Amazon as a rival to admitting that its products are the future.

VMware made a brief foray into the cloud world with its own product, vCloud Air, but it never really took off. As a strategic move, the company has decided to focus on its core business of running virtual machines while expanding the capabilities that allow those virtual machines to run in the cloud. This way, VMware can cater both to businesses that want to stay in their own data centers and to those that want to move to the cloud, a move likely to expand its market reach and customer base as it adapts to a changing digital environment.

This partnership is significant for Amazon too, as it is an opportunity to reach customers who have not yet migrated to the cloud. It gives Amazon a better foothold among conservative businesses that still keep data on their own servers for reasons ranging from lack of knowledge to security concerns, and it can help Amazon reach these customers before its rivals do, increasing its share of a competitive market.

According to International Data Corporation (IDC), this deal will be significant for VMware in the short term, but in the longer term Amazon will be the bigger beneficiary, as it can bring more corporate customers under the AWS umbrella. The biggest winners, of course, are the customers, who can now choose to run their operations both on their own VMware-equipped data centers and on the Amazon cloud.

 


AWS and VMware announce hybrid cloud partnership


It had been rumoured, but now it’s confirmed: Amazon Web Services (AWS) and VMware have announced a strategic alliance which will culminate in a new hybrid cloud service snappily titled ‘VMware Cloud on AWS’.

The two companies say the announcement will give customers a full software-defined data centre (SDDC) experience, combining their leadership in private and public cloud. The service will run on AWS bare metal infrastructure, while the SDDC side will come predominantly from VMware Cloud Foundation, including VMware vSphere, VMware Virtual SAN and NSX virtualisation technologies.

The product will be available from ‘mid 2017’ onwards and will be sold by VMware as an on-demand, elastically scalable service. The companies said pricing information would be made available closer to the availability date.

“VMware Cloud on AWS offers our customers the best of both worlds,” said VMware CEO Pat Gelsinger in a statement. “This new service will make it easier for customers to preserve their investment in existing applications and processes while taking advantage of the global footprint, advanced capabilities, and scale of the AWS public cloud.”

Andy Jassy, AWS CEO, added: “Most enterprises are already virtualised using VMware, and now with VMware Cloud on AWS, for the first time, it will be easy for customers to operate a consistent and seamless hybrid IT environment using their existing VMware tools on AWS, and without having to purchase custom hardware, rewrite their applications, or modify their operating model.”

VMware has been busy on the partnership front of late; earlier this week, there was an update on the company’s collaborations with IBM, first announced back in February. According to IBM, 1,000 joint customers are now moving their VMware environments to the IBM Cloud, and IBM is mobilising and training 4,000 global service consultants to help VMware customers access and leverage its cloud.

At VMworld in August, the company made note of its SDDC capabilities; VMware Cloud Foundation was claimed to offer “a new ‘as a service’ option that delivers the full power of the SDDC in a hybrid cloud environment.” As this publication noted at the time, VMware’s role in the industry landscape appears to be “an enabler for businesses running on other, more populous clouds.” In that regard, you can’t get any bigger than AWS.

[session] Organizational Agility for Digital Transformation | @CloudExpo #Cloud #DigitalTransformation

Successful digital transformation requires new organizational competencies and capabilities. Research tells us that the biggest impediment to successful transformation is human; consequently, the biggest enabler is a properly skilled and empowered workforce. In the digital age, new individual and collective competencies are required.
In his session at 19th Cloud Expo, Bob Newhouse, CEO and founder of Agilitiv, will draw together recent research and lessons learned from emerging and established companies, providing a road map for engaging our organizations in the creation and support of digital value chains.


The Anonymous Neighbor Problem in IT | @CloudExpo #Cloud #DigitalTransformation

COMED, my power company, sends out a monthly report that shows my energy consumption relative to my neighbors’. Every month I’m considerably higher than all of them. The report also lists things I could do to reduce my energy consumption. The problem is that it doesn’t take into account my house size relative to the other houses in my neighborhood: my house is the largest model and accounts for about one-third of the local sample. If I were to cut my energy consumption to match the monthly average of my neighbors, I would probably have to adopt a lifestyle akin to the Amish. Hence, I call this the Anonymous Neighbor Problem, and I believe it is responsible for driving decisions that are thrust upon IT leadership by executive business managers.
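A fairer version of that report would normalise the raw totals by a size factor before comparing. A quick Python illustration, using entirely made-up figures, of how per-square-foot consumption changes the picture:

```python
# Made-up monthly figures, purely to illustrate the normalisation point.
homes = {
    "mine":       {"kwh": 1400, "sq_ft": 3200},
    "neighbor_a": {"kwh": 900,  "sq_ft": 1600},
    "neighbor_b": {"kwh": 1000, "sq_ft": 1800},
}

for name, h in homes.items():
    per_sq_ft = h["kwh"] / h["sq_ft"]
    print(f"{name}: {h['kwh']} kWh total, {per_sq_ft:.2f} kWh per sq ft")

# Raw totals make "mine" look like the outlier; normalised by size, its
# consumption is roughly in line with the smaller houses. The same trap
# appears when executives compare IT spend across businesses of very
# different scale without a normalising denominator.
```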


One in three public sector firms have ‘no concrete plans’ for cloud


A new research report from software and business services provider Agilisys has found that more than a third of public sector firms have ‘no concrete plans’ to adopt cloud-based technology.

The findings, which appear in the 2016 Cloud Adoption Survey, show that of the 180 respondents, 36% had no plans to migrate to the cloud.

Almost 60% said that cloud adoption would help to reduce on-site maintenance and IT support requirements, which may indicate it is not a lack of knowledge holding these companies back, although ‘confusion’ and ‘uncertainty’ were cited as key reasons hindering adoption. Perhaps not surprisingly, data compliance and security were also seen as primary obstacles.

“While the report shows that adoption of cloud services is underway in the public sector, some organisations have concerns about their capability to manage compliance and regulation within the cloud as well as how to transform, migrate and operate services in this new environment,” said Sean Grimes, Agilisys IT services director.

“Changing any IT system has inherent challenges, and moving to the cloud is no different. But working with the right partner, the shift should be seen as a transformative move about which organisations can feel positive, potentially providing more control over specific aspects of their IT, not less,” Grimes added.

Naturally, the research also serves Agilisys’s own interests; alongside the report, the company has recently launched its Community Cloud initiative aimed specifically at public sector organisations, with Microsoft confirmed as the first partner to join.

Recent research from Eduserv, a company which offers public sector IT services, found that 12% of all UK council authorities, or 50 councils overall, account for 90% of G-Cloud local government spend. As the company noted, local governments and authorities are more than likely using cloud services, if not officially then in a shadow IT role, so it would make more sense to create official policies given the data risks involved.

IBM launches object storage for hybrid clouds, updates on VMware partnership


IBM has made a couple of major cloudy launches, announcing what it claims is the industry’s first object storage service for hybrid clouds, as well as elaborating on the partnership with VMware announced earlier this year.

IBM Cloud Object Storage is based primarily on technology from Cleversafe, which was acquired by the Armonk giant in November last year, and on SecureSlice, which combines ‘erasure coding’ – whereby data is broken into fragments and stored across a set of different locations – with encryption and decryption.

Alongside security, IBM is also touting the availability and economic benefits of its new storage, saying it can tolerate ‘even catastrophic regional outages’ with continuous availability inherent in its architecture. The company argues that in internal testing comparing IBM Cloud Object Storage Vault Cross-Region Services to a ‘leading vendor’, it was 24% less expensive when managing half a petabyte of data.

“As clients continue to move massive workloads to hybrid clouds there is a need for an easier, more secure and economical way to store and manage mounting volumes of digital information,” said Robert LeBlanc, IBM Cloud senior vice president in a statement. “With today’s announcement, IBM becomes the leading cloud vendor to provide clients the flexibility and availability of object data storage across on-premises and public clouds.”

As is often the way, IBM has unveiled a brand spanking new customer with its launch, and in this case it is link management platform Bitly, which is using the cloud object storage to analyse the more than 10 billion clicks it processes each month globally. “We turned exclusively to IBM Cloud because of its leadership in data services,” said Robert Platzer, Bitly CTO. “Through this partnership IBM will help us transform our business and build a variety of new cloud services – from advanced analytics and data mining to data research – into our software platform.”

Regarding VMware, IBM has pushed out a number of stats: 1,000 joint customers are now moving their VMware environments to the IBM Cloud, and IBM is mobilising and training 4,000 global service consultants to help VMware customers access and leverage its cloud. Customers involved in this partnership, including Marriott International and Monitise, were referenced at the recent VMworld event in August.

A Look Into IBM’s Cloud Object Storage

IBM has set a high standard for cloud storage with its new Cloud Object Storage service. The service allows organizations to store any amount of unstructured data on their own servers, in the cloud, or in any combination of the two, at one of the lowest rates in the cloud storage market today. It will be available from October 13, 2016 in the US, and from April 1, 2017 in Europe.

This service is built on a technology called SecureSlice, which combines encryption, erasure coding, and geographical dispersal of data. Encryption encodes data so that only authorized parties can read it. Erasure coding protects data by breaking it into segments, expanding them, and encoding them with redundant pieces; the resulting fragments are then stored across different devices or geographical locations. Because lost or corrupted fragments can be reconstructed from the remaining ones, erasure coding is often positioned as a replacement for RAID, since the time and overhead needed to rebuild data are greatly reduced. Finally, geographic dispersal spreads the fragments across different locations for greater security and redundancy. IBM acquired the SecureSlice technology when it bought Cleversafe last year for $1.3 billion.
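As a rough illustration of the erasure-coding idea only (SecureSlice itself is proprietary and uses far stronger codes, with encryption layered on top), the sketch below uses single XOR parity in Python: data is split into fragments, one redundant fragment is added, and any one lost fragment can be rebuilt from the rest.

```python
# Toy erasure-coding sketch: split data into k fragments plus one XOR
# parity fragment; any single missing fragment can be reconstructed.
# Illustrative only; production systems use codes such as Reed-Solomon
# that survive multiple losses.
from functools import reduce


def encode(data: bytes, k: int = 3) -> list:
    """Return k data fragments plus one parity fragment."""
    frag_len = -(-len(data) // k)  # ceiling division
    padded = data.ljust(frag_len * k, b"\x00")
    frags = [padded[i * frag_len:(i + 1) * frag_len] for i in range(k)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*frags))
    return frags + [parity]


def recover(frags: list) -> list:
    """Rebuild at most one missing fragment (marked as None)."""
    missing = [i for i, f in enumerate(frags) if f is None]
    if len(missing) > 1:
        raise ValueError("single parity can only survive one lost fragment")
    if missing:
        present = [f for f in frags if f is not None]
        # XOR of all surviving fragments equals the missing one.
        frags[missing[0]] = bytes(
            reduce(lambda a, b: a ^ b, col) for col in zip(*present)
        )
    return frags


pieces = encode(b"object data dispersed across locations", k=3)
pieces[1] = None                      # simulate losing one fragment
restored = recover(pieces)
print(b"".join(restored[:-1]).rstrip(b"\x00"))  # padding stripped for the demo
```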

Users have multiple storage options. One, called Cross Regional Service, disperses the encoded data across three separate cloud regions in different parts of the world, while another, called Regional Service, stores it across multiple data centers within a single region. Whichever option customers choose, their data is made secure and redundant by the SecureSlice technology.

With this service and its options, IBM has extended SecureSlice technology to hybrid clouds, giving customers more flexibility and scalability without compromising their control over in-house infrastructure. The product comes at a time when IDC has predicted that more than 80% of enterprises will adopt hybrid cloud architectures by 2018, so acquiring Cleversafe and extending its technology to hybrid cloud environments is a strategic move by IBM to tap into this growing market.

In terms of cost, too, the service is likely to be a good deal for customers: IBM claims it costs 25 percent less than Amazon Web Services’ S3 storage service, and it believes many customers already using IBM’s cloud services will be willing to adopt it. According to Russ Kennedy, VP of product strategy, users who run apps on Amazon Web Services can also use the service to store their data, as it supports the S3 interface; the same applies to OpenStack Swift customers, as Cloud Object Storage supports that API as well.
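Because the service speaks the S3 API, existing S3 tooling should in principle only need its endpoint changed. A minimal sketch using the boto3 library; the endpoint URL, bucket name, and credentials below are placeholders rather than real IBM values:

```python
# Minimal sketch of talking to an S3-compatible object store by overriding
# the endpoint. Endpoint, bucket, and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://cos.example.com",  # hypothetical service endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.put_object(Bucket="my-bucket", Key="clicks/2016-10.csv",
              Body=b"url,clicks\nexample.com,42\n")
obj = s3.get_object(Bucket="my-bucket", Key="clicks/2016-10.csv")
print(obj["Body"].read().decode())
```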

This service has already been deployed at a few early-adopter companies, and many more are expected to adopt it in the next few months.


Why a new era of computing has arrived with the software-defined data centre


The delivery of IT services on-demand is becoming an increasingly common ambition for enterprises. As businesses are introduced to public cloud platforms such as Amazon Web Services, they are expecting the same flexibility, real-time delivery and cost savings from services across the entire IT landscape.

In turn, this is leading to a brand new world of computing, one in which old client-server models are being turned on their heads, replaced by mobile and cloud computing. By necessity, this is also leading to radical changes across the entire IT infrastructure layer in the data centre.

Let’s take storage as an example. For decades, storage has been defined by closed, proprietary and monolithic hardware-centric architectures, built for single applications and local network access, with limited redundancy and manual management. And in a traditional data centre, each component of the infrastructure has an independent set of management requirements. Trying to provide dynamic workload delivery is a complex and time-consuming process; manual infrastructure reconfiguration is required and new hardware is often essential. In practice, this places an onerous workload on IT staff who struggle with a lack of agility and leaves the data centre exposed to human error.

With these new requirements placed on IT infrastructure comes the development of the software-defined data centre, which is driving change across the entire industry. For example, all primary storage is moving to flash by necessity, with research firm IDC expecting all-flash arrays to dominate primary storage market spend by 2019.

The software-defined data centre

An essential characteristic of this new model is software that is decoupled from hardware. While its definition varies among vendors, it essentially allows storage to be pooled and assigned to applications as needs change, and, depending on the architecture, greatly increases the ability to scale out.

The decoupling of software and hardware has also introduced automation into the data centre. The ability to abstract service offerings from the underlying infrastructure paves the way for the delivery of new enterprise cloud consumption models based on the specific needs of the business, such as infrastructure as a service, a converged or hyper-converged infrastructure, or even software-based infrastructure on commodity hardware.

Ultimately, this storage model is defining the next-generation data centre.

From the static to the dynamic

The software-defined data centre is characterised by the ability to deploy, manage, consume and release resources, and monitor infrastructure, all with fine-tuned control. This approach eliminates IT silos and underpins a move from a static to an elastic model, enabling new levels of agility, which are essential to the successful delivery of enterprise services via the cloud.

For example, with software-defined storage that virtualises performance separately from capacity, IT managers gain unprecedented storage control: they can dial performance up or down as required, add either resource in small increments without disruption so the system scales out easily, and use data services such as deduplication, snapshots and replication, all in real time.
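A toy model of that separation, with hypothetical class and method names rather than any vendor’s actual API: capacity and a performance ceiling are provisioned as independent knobs on the same volume, so either can be adjusted without touching the other.

```python
# Illustrative only: a volume whose capacity (GB) and performance (IOPS)
# are provisioned and adjusted independently. Names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Volume:
    name: str
    capacity_gb: int
    iops_limit: int

    def resize(self, capacity_gb: Optional[int] = None,
               iops_limit: Optional[int] = None) -> None:
        """Dial either knob up or down without disrupting the other."""
        if capacity_gb is not None:
            self.capacity_gb = capacity_gb
        if iops_limit is not None:
            self.iops_limit = iops_limit


vol = Volume("analytics-db", capacity_gb=500, iops_limit=2000)
vol.resize(iops_limit=10000)   # month-end reporting needs more performance
vol.resize(capacity_gb=750)    # grow capacity later, independently
print(vol)
```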

Added to this is network functions virtualisation (NFV), which offers a new way to design, deploy and manage networking services. Similar to other next-generation data centre services, it decouples network functions from proprietary hardware appliances so they can run in software. Allied with software-defined networking (SDN), the result is a powerful architecture that is dynamic, manageable, cost-effective, and adaptable – making it ideal for the high-bandwidth and dynamic nature of cloud platforms.

Benefits you can’t ignore

Taken together, software-defined storage and automation, NFV, SDN and the software-defined data centre radically improve service delivery, dramatically reduce costs and enable levels of flexibility not previously seen.

As a result, we’re now seeing a reduction in the amount of time required to manage storage by dynamically classifying and moving data between storage tiers. By transferring infrequently accessed data to lower-cost drives, organisations save money and improve performance by lightening the load. Further, by reducing the number of active files, we’re seeing shorter daily backup times.
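The classification logic behind that kind of tiering can be sketched in a few lines; the tier names, age thresholds, and sample files below are invented for the example.

```python
# Toy illustration of automated storage tiering: place data on cheaper
# media the longer it goes without being accessed. Thresholds are made up.
from datetime import datetime, timedelta

TIERS = [
    ("hot",  timedelta(days=7),  "all-flash"),
    ("warm", timedelta(days=90), "hybrid disk"),
]
COLD_TIER = "cold"  # low-cost archive media


def pick_tier(last_access: datetime, now: datetime) -> str:
    age = now - last_access
    for name, max_age, _media in TIERS:
        if age <= max_age:
            return name
    return COLD_TIER


now = datetime(2016, 10, 15)
files = {
    "q3-report.xlsx":    datetime(2016, 10, 12),
    "logo-final-v2.png": datetime(2016, 8, 1),
    "2013-archive.zip":  datetime(2013, 5, 20),
}
for path, last_access in files.items():
    print(f"{path} -> {pick_tier(last_access, now)}")
```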

It’s hardly surprising, then, that the software-defined data centre is inevitable, and many organisations are already undertaking this journey. If you haven’t yet set out on this road, how you develop the model depends on whether you are building a new data centre or updating an existing one.

Wherever you are on the journey, it’s an investment in the business. If you’re struggling with the decision, ask whether you want an investment that gains value over time and grows with the business, or one you wrestle with as you try to make the old hardware-centric IT model fit a new era of computing.

Operational benefits aside, the cost drivers for software-defined data centres are certainly compelling. Take network virtualisation as an example: instead of applications taking an average of 12 weeks to deploy in an enterprise, deployment can take minutes, with all of the required network capabilities and security policies attached to the application.

This should certainly impress business decision makers, as far less time and money is spent on implementation and the return on investment begins almost immediately.

Today, many companies are already exploring, planning or virtualising their networks as they move from the client-server era to the mobile-cloud era. They understand only too well that the software-defined data centre is more agile, secure, scalable and cost-effective to run than a traditional hardware-centric data centre.

If you’re struggling to convince senior executives to make the leap, speak to them in the language of business benefits to achieve buy-in: long-term cost savings and return on investment, greater flexibility to scale up and down as required, and IT services delivered on demand and in real time.

It is also worth spelling out that agile public cloud computing and on-demand IT are not some abstract future; they are already here, and a growing number of organisations are reaping the benefits.

Cloud Migration Dollars and Sense | @CloudExpo #API #SaaS #ITaaS #Cloud

The explosion of cloud onto the enterprise scene has revolutionized how businesses across the size spectrum do business, yet there’s a price tag tucked into this cloud’s silver lining that smart decision-makers should pay heed to.
This is the era of what I like to call the consumerization of enterprise software. It’s bringing tremendous benefit to the business world, but it’s also bringing a great deal of change, particularly to how business software is adopted and used, and enterprises of all sizes need to familiarize themselves with best practices to leverage the power of the cloud.


JetBlue’s #DevOps | @DevOpsSummit #APM #DevOps #ContinuousDelivery

The next BriefingsDirect Voice of the Customer performance engineering case study discussion examines how JetBlue Airways in New York uses virtual environments to reduce software development costs, centralize performance testing, and create a climate for continuous integration and real-time monitoring of mobile applications.
