Tech News Recap for the Week of 10/3/2016

Were you busy this week? Here’s a tech news recap of articles you may have missed for the week of 10/3/2016.

VMware will be teaming up with Amazon Web Services to make it easier for customers to run VMware software on AWS. A hacker has released the source code of Mirai, an Internet of Things malware used to launch large-scale DDoS attacks. A startup in California is working on launching a cloud service that would put physical satellite data centers in orbit around the Earth. Cloud IT infrastructure spending is on the rise, the NBA is holding a hackathon, there’s disruption in the storage market, and more top news this week you may have missed!

Remember, to stay up-to-date on the latest tech news throughout the week, follow @GreenPagesIT on Twitter.

Tech News Recap

Did you miss VMworld? Register for our upcoming webinar to get all of the most important updates from Las Vegas and Barcelona.

 

By Ben Stephenson, GreenPages Technology Solutions

Hypervisors in cloud computing: What is out there for you?

Image credit: iStock.com/Avalon_Studio

Choosing a cloud provider may seem like a trivial decision. You could go with the choice most of the world has already made and pick the clear leader in the cloud space, AWS. However, as the public cloud market matures, the number and range of viable enterprise options keeps expanding. Depending on your requirements, you might find one of the other cloud providers more suitable, not least because of the hypervisor your enterprise already uses. For example, for VMware vSphere users, vCloud might be the “natural” public cloud choice.

In this article, we discuss several public cloud vendors and consider their underlying infrastructure and the hypervisors that they use to operate their clouds.

Public cloud providers

There are several key players in today’s cloud market, each offering different advantages.

Amazon Web Services (AWS): Amazon not only runs one of the world’s largest online stores; it is also the biggest public cloud provider in the world. It is safe to say that Amazon was the first to offer the ability to run workloads in the cloud at large scale.

Today, AWS is clearly perceived as the market leader. Of all the cloud providers, AWS offers the richest set of features. It also has the biggest share of the cloud Infrastructure as a Service (IaaS) market, and has been recognized as such for the past six years in Gartner’s de facto “state of the union” reports on the space.

AWS has built up its proprietary platform over the years, and its business model is based entirely on the assumption that everything can and will run in the public cloud. AWS has no on-premises solution.

Microsoft Azure: Microsoft Azure has been around since 2008, when it was announced as Windows Azure; it became generally available in 2010 and was renamed Microsoft Azure in 2014. Microsoft is well known for its stronghold in the enterprise market, particularly for desktop software as well as enterprise software such as Exchange and SQL Server. A few years ago, Microsoft plunged headfirst into virtualization.

As a result of Microsoft’s sheer size and market share, Azure is perceived as a market leader alongside AWS. The option to mix and match workloads between your data center and the cloud has been a real temptation for many enterprises. Microsoft recently announced Azure Stack, which is Microsoft’s Azure cloud deployed within your organization’s data center. Under the hood, the stack includes Microsoft’s Hyper-V, Windows, and networking and storage capabilities. Ultimately this creates a very compelling hybrid solution for enterprises.

Google Cloud Platform (GCP): We all know that Google runs the biggest search engine in the world. Google also got into the business of public cloud computing with the announcement of Google Compute Engine in June 2012 and has been a contender in this market ever since. GCP does not yet offer the vast range of services available from AWS, although Google is continuously adding more services to compete. It does, however, have a number of differentiators that allow it to stand out from the competition. Shorter billing increments, for example, give customers per-minute pricing instead of the per-hour pricing used by AWS.
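
To illustrate why billing granularity matters, here’s a rough sketch comparing the two models in Python; the hourly rate is a made-up placeholder rather than an actual AWS or GCP price.

```python
# Hypothetical comparison of per-hour vs per-minute billing.
# The rate below is a placeholder, not a real AWS or GCP price.
HOURLY_RATE = 0.10  # assumed $/hour for a comparable instance


def per_hour_cost(runtime_minutes, rate=HOURLY_RATE):
    """Per-hour billing rounds the runtime up to whole hours."""
    hours = -(-runtime_minutes // 60)  # ceiling division
    return hours * rate


def per_minute_cost(runtime_minutes, rate=HOURLY_RATE):
    """Per-minute billing charges only for the minutes actually used."""
    return runtime_minutes / 60 * rate


# A 65-minute batch job is billed as 2 full hours under per-hour
# billing, but only ~1.08 hours under per-minute billing.
print(per_hour_cost(65))    # 0.2
print(per_minute_cost(65))  # ~0.108
```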

Unlike AWS, which runs on the Xen hypervisor, Google Cloud runs on KVM. But like AWS, Google Cloud is a proprietary platform that relies on the assumption that anything and everything will run in the public cloud, and it therefore does not offer an on-premises solution.

Rackspace: Rackspace is extremely well known in the hosted infrastructure space. The company takes pride in its Fanatical Support and first dipped its toes into the public cloud market almost 10 years ago.

Rackspace’s cloud is built on OpenStack, but the company has added a significant amount of customization to its platform that will never become part of the community. The customizations include its networking stack, its load balancing service, and the entire billing and UI layer, none of which are vanilla OpenStack.

Rackspace offers a private cloud solution as well as a supported and managed service, and this is where it differs from AWS and Google Cloud: Rackspace will set up a cloud for you, then support and manage it as well.

VMware vCloud: VMware has been the clear visionary and leader in the enterprise virtualization space for many years, although its attempts at becoming a public cloud market leader never really took off. There are a number of possible explanations as to why, be it licensing, pricing, or the fact that VMware was very late getting into the game.

VMware originally did not offer a public cloud service of its own. It sold a private cloud product and offered public cloud services through partners and the ecosystem. That changed a few years ago when VMware tried to break into the public cloud market with vCloud Air, which has never been deemed a success.

Hypervisor of choice

The underlying hypervisor capabilities depend on your choice of cloud provider. These are the main hypervisors of choice in use today:

KVM: In 2008, Red Hat acquired Qumranet (the creators of KVM) and has since put its full support and effort into developing KVM. It is important to note that KVM is an open source project, meaning there are no license fees involved.

KVM runs on most Linux distributions today and is perceived as the default hypervisor in virtually all virtualization and cloud products offered by Linux vendors. It is also the default hypervisor for most clouds today, probably making it one of the most widely used hypervisors in the world.
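
If you want to see whether a Linux box is ready to run KVM guests, a rough check (sketched below in Python, assuming a standard /proc layout) is to confirm that the CPU exposes hardware virtualization and that the kvm kernel module is loaded.

```python
# Rough, Linux-only check for KVM readiness:
#  - the CPU must expose hardware virtualization (vmx for Intel, svm for AMD)
#  - the kvm kernel module must be loaded


def cpu_supports_virtualization():
    """Look for the vmx/svm CPU flags in /proc/cpuinfo."""
    with open("/proc/cpuinfo") as f:
        flags = f.read()
    return "vmx" in flags or "svm" in flags


def kvm_module_loaded():
    """Check /proc/modules for the kvm (or kvm_intel/kvm_amd) module."""
    with open("/proc/modules") as f:
        return any(line.startswith("kvm") for line in f)


if __name__ == "__main__":
    print("Hardware virtualization:", cpu_supports_virtualization())
    print("KVM module loaded:", kvm_module_loaded())
```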

Xen: As an open source hypervisor, Xen has been on quite a journey, starting at the University of Cambridge, moving to XenSource, which was acquired by Citrix, and finally landing at its current home, the Linux Foundation.

AWS is the biggest cloud provider using Xen today, where it remains the hypervisor of choice. Xen offers a number of advantages over KVM, such as more efficient paravirtualization thanks to the closer access Xen has to the physical hardware, and the fact that it is a more mature product. Xen is not part of the Linux operating system itself, whereas KVM is part of the Linux kernel.
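
If you’re curious which hypervisor a given cloud instance actually runs on, one rough way to tell from inside a Linux guest is to read the platform vendor string exposed via DMI. The vendor-to-hypervisor mapping below is illustrative only – the exact strings vary by provider and instance generation.

```python
# Rough hint at the underlying hypervisor, read from inside a Linux guest.
# The mapping below is illustrative; real vendor strings vary by provider.


def guest_hypervisor_hint():
    try:
        with open("/sys/class/dmi/id/sys_vendor") as f:
            vendor = f.read().strip()
    except FileNotFoundError:
        return "unknown (no DMI information exposed)"
    hints = {
        "Xen": "Xen (e.g. AWS EC2)",
        "QEMU": "KVM/QEMU",
        "Google": "KVM (Google Compute Engine)",
        "Microsoft Corporation": "Hyper-V (e.g. Azure)",
        "VMware, Inc.": "ESXi / vSphere",
    }
    return hints.get(vendor, vendor)


print(guest_hypervisor_hint())
```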

Hyper-V: Hyper-V is a Microsoft product and, as such, it does not come free. Yes, there are free versions available; however, these have many built-in limitations, and managing them at any scale becomes impractical without putting out some hard cash.

Microsoft and Hyper-V have long been at war with VMware. Over the past few years, Microsoft has managed to chisel away at VMware’s market share in the enterprise by providing a product that does most of what vSphere can do, at a more attractive price. It is a natural choice if your workloads are Microsoft based, and Microsoft is also looking to support any and all Linux flavours in the future.

ESXi: The feature-rich hypervisor that many enterprises use is ESXi (vSphere). Of course, this is not a free product; VMware has built its whole company on its hypervisor, and for many years this has been a great strategy.

ESXi supports practically any guest operating system, be it Linux, Windows, or almost any esoteric flavor you could imagine. But this is first and foremost an enterprise solution, one that might not be cost effective for everyone, especially if you are just starting out.

One of the biggest advantages of running a cloud platform on an enterprise hypervisor is the capabilities that come built in. Two examples (and, for transparency, this is also true for Microsoft’s stack) are host restart and instance scheduling. VMware has built-in high availability (HA): if a host fails, all of its instances are restarted on another host in the cluster, and your cloud solution does not need to manage or worry about any of it.

Another example is DRS (Distributed Resource Scheduler), which is responsible for moving instances between hypervisor nodes in the cluster. Again, this is taken care of at the hypervisor layer. Unlike other cloud platforms, this rebalancing happens on an ongoing basis, not only when an instance is first placed.
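
To give a feel for the kind of decision DRS automates, here is a deliberately naive sketch of load-based rebalancing. VMware’s actual algorithm is proprietary and far more sophisticated; the host names and load figures below are made up.

```python
# Toy rebalancer in the spirit of DRS: if the busiest host is much more
# loaded than the idlest one, migrate one VM across. Not VMware's
# algorithm - only an illustration of what the hypervisor layer does.


def rebalance(hosts, threshold=0.20):
    """hosts: dict mapping host name -> list of (vm_name, cpu_load) tuples."""
    load = {h: sum(cpu for _, cpu in vms) for h, vms in hosts.items()}
    busiest = max(load, key=load.get)
    idlest = min(load, key=load.get)
    if busiest != idlest and load[busiest] - load[idlest] > threshold and hosts[busiest]:
        vm = hosts[busiest].pop()       # pick a VM to migrate
        hosts[idlest].append(vm)        # "vMotion" it to the idle host
        return f"moved {vm[0]} from {busiest} to {idlest}"
    return "cluster already balanced"


cluster = {
    "esx-01": [("web-1", 0.40), ("db-1", 0.35)],
    "esx-02": [("web-2", 0.20)],
}
print(rebalance(cluster))  # moved db-1 from esx-01 to esx-02
```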

Docker and a final note

It would be remiss not to mention the new kid (or container) on the block: Docker. Docker can run inside an instance on any of the hypervisors above. It is effectively another abstraction layer on top of the hypervisor, which lets you treat the hypervisor, and in turn the cloud you are running on, as a commodity. Docker enables you to move between them far more easily than was previously possible.
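
As a minimal sketch of that portability, the snippet below uses the Docker SDK for Python to run a container; the same image runs unchanged whether the host happens to be a Xen, KVM, Hyper-V or ESXi guest. It assumes a local Docker daemon and the docker package are available.

```python
# Minimal sketch: run the same container image regardless of which
# hypervisor (or cloud) the host happens to be running on.
# Assumes a local Docker daemon and the "docker" Python package.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# The container only sees the Linux kernel, so the underlying
# hypervisor is invisible to it.
output = client.containers.run(
    "alpine:latest",
    ["echo", "hello from a container"],
    remove=True,  # clean up the container after it exits
)
print(output.decode().strip())
```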

Even though your choice of cloud provider will be based on a number of criteria, such as maturity, feature set or geographical location, there are cases where the underlying hypervisor’s capabilities will also be a contributing factor in your decision. Understanding the features each hypervisor provides, its history, and its share of the market will help you make the best decision about which cloud to run your workloads on.

AT&T and Amazon Come Together to Provide Integrated Cloud Offerings

Dallas-based telecom leader AT&T has signed a multi-year agreement with Amazon Web Services (AWS) to offer joint services in the areas of cloud networking, IoT, security, and analytics. The terms of the deal were not disclosed, though a press release from AT&T said the partnership would give its customers access to the AWS Cloud in many ways.

This is a significant partnership for both companies, as it helps each tap into the strength of the other. Telecom companies like AT&T do not have cloud infrastructure on the scale of Amazon’s or Microsoft’s, so partnerships are the best way to beef up their offerings. It would take enormous amounts of money and many years for AT&T to build an infrastructure the size of AWS, which is why it makes sense to tap into the strength of an existing cloud service provider.

Earlier, AT&T signed an agreement with Microsoft to use its Azure platform to securely move customer data between private and public clouds. It also entered into an agreement with IBM in 2012 for a similar service.

If you’re wondering why AT&T signed another agreement with AWS, it is to focus on these three specific areas:

  • Under this partnership, AT&T’s NetBond customers can establish faster and more secure connections to the AWS Cloud. NetBond is an MPLS VPN service that connects enterprise applications to public clouds. Over the last year, NetBond has seen a four-fold increase in traffic, and this partnership is expected to provide enhanced visibility, security, and automation for those customers.
  • Through this partnership, AT&T plans to gain a stronger foothold in the IoT market by allowing AT&T devices to send data to the cloud seamlessly. This is a huge market for AT&T, as many connected devices such as cars and fitness machines carry sensors that collect pertinent information from their users. A reliable network is needed to send this data to the cloud, and AT&T wants to position itself as the leading network provider for doing so. Its network already includes 29 million connected devices, and these numbers could grow hugely with cloud integration.
  • The final focus area is improving security on cloud platforms so that the response time to threats is greatly reduced. This is in line with AT&T’s plan to boost its Threat Intellect platform, launched this summer. It is AT&T’s own machine learning system that identifies threats in real time, and it could get a big boost from AWS’s cloud security and infrastructure.

These three focus areas could expand in the future, as AT&T looks to consolidate its cloud business and make up for declining growth in its traditional telecom business.

To implement the terms of the partnership, the two companies plan to assign specialists to a joint effort working on the specified areas.

In all, this is another significant partnership that is sure to augur well not just for the companies, but also for the industry and its customers as a whole.


[slides] The Future of #IoT Data | @ThingsExpo #M2M #API #BigData

The Quantified Economy represents the total addressable market (TAM) for IoT which, according to a recent IDC report, will grow to an unprecedented $1.3 trillion by 2019. With this, the third wave of the Internet – the global proliferation of connected devices, appliances and sensors – is poised to take off in 2016.
In his session at @ThingsExpo, David McLauchlan, CEO and co-founder of Buddy Platform, discussed how the ability to access and analyze the massive volume of streaming data from millions of connected devices in real time and at scale will enable prediction and optimization, the areas where IoT’s greatest value lies.


Getting funding models fit for cloud: How to stop holding businesses back

Image credit: iStock.com/tonefotografia

When well implemented, cloud caters to the business need to be more agile, flexible and responsive. But many companies using cloud struggle to reap its full benefits. This often has less to do with cloud itself and more to do with how it is managed across different functions. More specifically, businesses are held back by outdated, decentralised funding and investment models, and by lines of business taking responsibility for IT deployment.

Line of business managers implementing cloud solutions to meet an immediate need may well solve a short-term problem. But even one line of business taking control of its own IT spending can have serious knock-on effects right across the business. We dug deeper to find out how this is affecting businesses across EMEA in our new report, which examines how culture affects cloud success.

Our research finds the issue of ‘shadow IT’ – when line of business departments implement cloud technology independently of IT – is not uncommon, with 66% of CIOs controlling less than half of their company’s IT budget.

Another point was made clear: cloud funding models require urgent change in order for businesses to achieve greater flexibility (74%), deliver more cloud services (72%), and increase innovation (66%). A new approach to cloud funding would also help reduce overall IT costs, according to 70% of those surveyed.

Added to this, disjointed projects and implementations, duplicated IT resources and teams working at tangents are all major consequences of decentralised IT funding models. 35% of survey respondents working in IT said lines of business departments are purchasing cloud services the company already has.

As if that wasn’t bad enough, calculating return on investment on these deployments is often tricky, particularly with hidden costs associated with shadow IT implementations. We found that 33% of respondents working in technology said lines of business operating without the input of the IT team spent too much on IT, 30% said they bought the wrong cloud services, 38% felt it added complexity to IT delivery, and 35% said it increased concerns around security.

The solution isn’t to take these IT decisions away from HR or marketing departments altogether, but to centralise control of IT funding and deployment with the CIO. By doing this and working more closely with the CEO and CFO, the CIO can better steer the technology strategy needed for an enterprise cloud model which connects all cloud resources, whether public, private, hybrid, on-premises, converged infrastructure or any other permutation. Such a model can eliminate the cloud data silos which 46% of businesses say they experience.

Crucially, in order to make this model as effective as possible, the way funding is structured must also change. The traditional view of IT as a cost centre means that cloud investments are not tied to revenue or innovation potential – and can fail as a result. Instead, IT funding should be tied to key projects and based around line of business activities.

By taking this approach, teams will be able to acquire services from a knowledgeable, specialist IT team that can offer solutions for cloud issues. For instance, an internal purchasing model which frames IT as a profit centre will remove issues related to poorly calculated ROI, reduce spending on cloud services, and strip out the common complexities of an organisation-wide IT solution.

A whopping 95% of IT respondents we surveyed believe line of business managers buying IT adds unnecessary complexity to their role. It’s imperative for businesses to get past this problem with an integrated approach to IT in which the CIO works with lines of business to develop an organisation-wide enterprise cloud model. If combined with centralised, activity-based funding in which lines of business are accountable for their tech purchases, this approach will open the door to greater agility, flexibility and responsiveness. It will also spell the end of business silos and profligacy – music to the ears of any business.

Commvault “Bronze Sponsor” of @CloudExpo | @Commvault #SDS #DataCenter

SYS-CON Events announced today that Commvault, a global leader in enterprise data protection and information management, has been named “Bronze Sponsor” of SYS-CON’s 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Commvault is a leading provider of data protection and information management solutions, helping companies worldwide activate their data to drive more value and business insight and to transform modern data environments. With solutions and services delivered directly and through a worldwide network of partners and service providers, Commvault solutions comprise one of the industry’s leading portfolios in data protection and recovery, cloud, virtualization, archive, file sync and share.


Announcing @SecureChannels to Exhibit at @CloudExpo | #IoT #InfoSec

SYS-CON Events announced today that Secure Channels will exhibit at the 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
The bedrock of Secure Channels Technology is a uniquely modified and enhanced process based on superencipherment. Superencipherment is the process of encrypting an already encrypted message one or more times, either using the same or a different algorithm.
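
As a toy illustration of the general idea (not Secure Channels’ actual, proprietary scheme), the sketch below uses the Python cryptography package to encrypt a message with one key and then encrypt the resulting ciphertext again with a second, independent key.

```python
# Toy superencipherment: the ciphertext from the first pass is
# encrypted again with a second, independent key. Illustrative only -
# not Secure Channels' proprietary technology.
from cryptography.fernet import Fernet

inner_key, outer_key = Fernet.generate_key(), Fernet.generate_key()

ciphertext = Fernet(inner_key).encrypt(b"payload")        # first encryption
superciphertext = Fernet(outer_key).encrypt(ciphertext)   # encrypt the ciphertext again

# Decryption peels off the layers in the opposite order.
recovered = Fernet(inner_key).decrypt(Fernet(outer_key).decrypt(superciphertext))
assert recovered == b"payload"
```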


Announcing @CloudBerryLab Named “Cloud Backup Sponsor” of @CloudExpo Silicon Valley | #IoT #Cloud #BigData

SYS-CON Events announced today that CloudBerry Lab has been named “Media Sponsor” of SYS-CON’s 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
CloudBerry Backup is a leading cross-platform cloud backup and disaster recovery solution integrated with major public cloud services, such as Amazon Web Services, Microsoft Azure and Google Cloud Platform.


Day 3 Keynote By @ShengLiang | @CloudExpo #Cloud #DevOps #Containers

In his keynote at 19th Cloud Expo, Sheng Liang, co-founder and CEO of Rancher Labs, will discuss the technological advances and new business opportunities created by the rapid adoption of containers. With the success of Amazon Web Services (AWS) and various open source technologies used to build private clouds, cloud computing has become an essential component of IT strategy. However, users continue to face challenges in implementing clouds, as older technologies evolve and newer ones like Docker containers gain prominence. He’ll explore these challenges and how to address them, while considering how containers will influence the direction of cloud computing.


CDW’s New Cloud Offering

CDW Corporation, a leading provider of technology products and services for businesses, has launched a new cloud offering called Cloud Planning Services. As the name suggests, this service revolves around providing the right inputs and helping companies devise an appropriate cloud strategy.

This is a comprehensive service that teaches businesses what cloud is and the benefits that come from it. Further, Cloud Planning Services experts work with each business to create the cloud strategy that works best for it.

To this end, four workshops are offered under this service, ranging in intensity from a one-day seminar on cloud basics to a seven-week workshop that takes participants into the depths of cloud computing, storage, and security. Participants also have the opportunity to work on customized projects based on the needs of each institution. The workshop options include:

  • Cloud 101 – This is a one-day workshop covering the latest trends and developments in cloud. It aims to help organizations understand the power of cloud and the ways and means by which they can move their operations to it.
  • Cloud 201 – This is a one-week workshop that takes a detailed look into the customer’s IT environment and their business needs. Accordingly, business cases are constructed to support a cloud strategy. In this workshop, a high-level roadmap for cloud is formulated based on the current state of cloud operations and business environment.
  • Cloud 202 – This three-week workshop includes Cloud 201, as well as other aspects such as Total Cost of Ownership (TCO) and tips for selecting a cloud vendor whose services match the company’s requirements. An overview of cloud security and compliance is also covered in this workshop.
  • Cloud 301 – This is a comprehensive seven-week process that encompasses all of the above and, in addition, covers the financial and non-financial considerations of adopting a particular cloud strategy. During these seven weeks, companies also learn about different IT delivery models, including an assessment of their current data center costs. Finally, cloud experts at CDW work with each company to formulate the best cloud strategy.

Beyond these workshops, CDW’s Cloud Planning Services also works with businesses to provide complete cloud life-cycle visibility, infrastructure cost analysis, Infrastructure-as-a-Service (IaaS) details, pre- and post-migration testing, and IaaS validation.

This new service is the perfect complement to CDW’s existing cloud offerings, which include security and collaboration apps delivered as SaaS and a wide mix of infrastructure delivered as IaaS.

An important aspect of this planning service is that all advice and suggestions are vendor-neutral, giving companies plenty of flexibility to choose their preferred providers. In this sense, CDW’s Cloud Planning Services explains cloud and all its aspects without pushing CDW’s own products on its customers.

Such an offering can be a win-win: it helps CDW expand its customer base and generate a new revenue stream, while end customers get unbiased, vendor-neutral guidance on cloud strategy.
