Predictable Performance & Granular Control By @KellyMurphyGS | @CloudExpo [#Cloud]

Traditional storage was designed to present a LUN to a server. This worked well until that server became a hypervisor running 10, 20 or 30 virtual servers, all pointing to the same LUN. Adding to the complexity, all hypervisors need to point to the same LUN to enable VM mobility or high availability. Architecturally, this creates a many-to-one relationship that makes optimization impossible. Each VM has a different I/O pattern. When they all mix together, the aggregate I/O becomes highly random, creating performance issues with no way to isolate problems or control I/O per VM.
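A toy simulation makes this concrete (an illustrative sketch, not from the article, of what is often called the "I/O blender" effect): each VM's stream is perfectly sequential on its own, yet the interleaved trace the array sees is almost never sequential.

```python
import random

# Each VM issues a perfectly sequential stream of block addresses
# within its own region of the shared LUN.
def vm_stream(vm_id, start_block, length):
    return [(vm_id, start_block + i) for i in range(length)]

streams = [vm_stream(vm, vm * 100_000, 50) for vm in range(30)]

# The hypervisor multiplexes every VM's queue onto the same LUN;
# arrival order depends on scheduling, approximated here by a shuffle.
aggregate = [io for stream in streams for io in stream]
random.shuffle(aggregate)

# How often are consecutive I/Os on the LUN actually sequential?
sequential = sum(1 for a, b in zip(aggregate, aggregate[1:]) if b[1] == a[1] + 1)
print(f"sequential pairs: {sequential} of {len(aggregate) - 1}")
```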


Why the future is cloud autonomics


Many great innovations have come out of cloud computing, such as on-demand infrastructure, consumption-based pricing and access to global computing resources. However, these powerful innovations have come at a high cost: complexity.

Managing cloud infrastructure today is substantially more complex than managing traditional data center infrastructure. While some of the complexity is a direct consequence of operating in a highly transient and shared computing environment, most has to do with the unintended side effects of cloud computing. For example, consumption-based pricing allows us to pay for only what we use, but requires careful monitoring and continuous optimization to avoid resource waste and poor cost management.

API-driven resources allow us to launch compute and storage with a few lines of code, but require that the resources be highly configurable to support the varied needs of their different users. Purpose-built services (e.g. Amazon S3) substantially reduce the barrier to building new types of applications, but require that we obtain the necessary expertise to manage and operate the new services.
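It really can be just a few lines. A minimal sketch using AWS's boto3 SDK (the AMI ID, key pair and region below are placeholders), where every parameter is one of the configuration decisions someone must make and govern:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# One API call provisions a server; the parameters are where the
# configurability -- and the management burden -- lives.
response = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder machine image
    InstanceType="t2.micro",
    KeyName="my-key",         # placeholder key pair
    MinCount=1,
    MaxCount=1,
)
print(response["Instances"][0]["InstanceId"])
```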

The rise of cloud management

These early experiences with the growing complexity of managing cloud environments spawned a new generation of open source and commercial products intended to help us contain this complexity. However, the continued pace of innovation in public, private and hybrid clouds, combined with the increasing importance of multi-cloud environments, has continued to widen the gap between the complexity of our infrastructure and our ability to manage that infrastructure. It has become increasingly clear that something needs to change to support the cloud environments of tomorrow.

The genesis of autonomic computing

In 2001, IBM released a manifesto predicting a looming software complexity crisis caused by our inability to manage the rapid growth in computation, communication and information technology. The solution proposed was autonomic computing, a term which took its name from the autonomic nervous system, essentially an automatic control system for the human body. In its manifesto, IBM defined autonomic computing as self-managing systems that can configure, optimize, heal and protect themselves without human intervention.

While the paper launched a field of research that remains active today, its impact has not been widely felt in the industry, in large part because the gap IBM forecasted did not become a substantial impediment to businesses until the early mainstream adoption of cloud computing, almost a decade later. But now, with the gap between the complexity of infrastructure and the ability of our software to manage that complexity continuing to widen, the IBM manifesto seems suddenly prescient.

The impact of cloud autonomics

Cloud autonomics is the use of autonomic computing to enable organisations to more effectively harness the power of cloud computing by automating management through business policies. Instead of relying on manual processes to optimise the cost, usage, security and performance of cloud infrastructure, a CIO/CTO can set out business policies that specify how they want their infrastructure to be managed and allow an autonomic system to execute those policies.

Cloud autonomics envisions a future in which businesses manage their clouds like brokerages manage equity trading – with policy-aware automation. A business will configure an autonomic system with governance rules. The system will then continuously monitor the business’s cloud environment and, when the environment goes out of compliance, will make the necessary changes to bring it back in line. Some sample policies cloud autonomics anticipates include the following (a sketch of how such rules might be expressed appears after the list):

Cost

  • The automated purchase of reserved capacity in support of organizational or functional needs (e.g. AWS reservations).
  • The automated movement of an idempotent workload from one cloud provider to another to obtain cost efficiencies.

Availability

  • The automated migration of data to another region in support of business service level agreements (SLAs).
  • The automated migration and/or backup of storage from one medium to another (e.g. migrating data in AWS EBS to S3 or Glacier).

Performance

  • The automated upgrade of a workload’s machine type to support more efficient operation of workloads that do not scale horizontally.

Security

  • The automated change of network and/or endpoint security to conform to business policies.

Usage

  • The automated shutdown of idle or long-running instances in support of business policies (e.g. shut down idle development infrastructure that has been running for more than a week).
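The article does not prescribe a concrete policy format. As an illustrative sketch only, rules like those above might be expressed declaratively, here as plain Python data for a hypothetical autonomic engine:

```python
# Hypothetical policy definitions -- names, fields and actions are
# illustrative, not part of any real autonomic product.
POLICIES = [
    {
        "name": "shutdown-idle-dev",       # the usage policy from the list above
        "match": {"env": "development", "state": "idle"},
        "condition": lambda res: res["uptime_days"] > 7,
        "action": "stop_instance",
    },
    {
        "name": "buy-reserved-capacity",   # the cost policy from the list above
        "match": {"utilisation": "steady"},
        "condition": lambda res: res["running_days"] > 90,
        "action": "purchase_reservation",
    },
]
```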

Making cloud autonomics work for you

Cloud autonomics requires a system capable of executing the collect, analyze, decide and act loop defined in autonomic computing. Unlike the broader autonomic computing vision, however, a cloud autonomic system relies on policy-driven optimization rather than artificial intelligence. The system monitors one or more cloud providers and the customer infrastructure running within them, evaluates user-defined policies and, using optimization algorithms, identifies changes that would bring the infrastructure in line with those policies. These recommendations may optionally require external approval before execution, and the system can seek privileges from an external system to execute approved changes.
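A minimal sketch of such a loop, assuming hypothetical `cloud`, `policy` and `approver` interfaces (the article describes the loop itself, not a concrete API):

```python
import time

def autonomic_loop(cloud, policies, approver=None, interval=300):
    """Collect -> analyze -> decide -> act, repeated indefinitely."""
    while True:
        inventory = cloud.collect()          # collect: inventory all resources
        for policy in policies:
            for resource in inventory:
                # analyze: find resources a policy covers but that violate it
                if policy.matches(resource) and not policy.compliant(resource):
                    action = policy.remediation(resource)      # decide
                    if approver is None or approver(action):   # optional approval
                        cloud.execute(action)                  # act
        time.sleep(interval)
```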

A cloud autonomic system is capable of continual, automated optimization of cloud infrastructure based on business policies, both within a single cloud environment and across multiple clouds. It promises a future in which organizations can define policies and have them securely enforced down to microsecond-level decisions, resulting in consistent implementation and optimal resource utilization.

The Role of Internet Performance in #Cloud Services By @Dyn | @CloudExpo

With worldwide spending on cloud services and infrastructure growing by 23% in 2015 to $118B, it is clear that cloud services are here to stay. Yet the rate of cloud adoption varies across companies and markets around the world. With thousands of outages and hijacks across the Internet every day, one reason for hesitation is a lack of faith in the quality of Internet performance.
In his session at 16th Cloud Expo, Michael Kane, Senior Manager at Dyn, will explore how Internet performance affects your end-user’s experience and how you can make variations in the Internet work to your competitive advantage.


The power game: Ensuring the right data centre relationship


By Steve Davis, Marketing Director, Next Generation Data

Britain’s internet demand is expanding so fast that it could consume the nation’s entire power supply by 2035, scientists told the Royal Society earlier this month.

Hopefully it won’t come to that, but Andrew Ellis, professor of optical communications at Aston University, told the “Capacity Crunch” conference that data storage and transmission on the internet, along with devices such as PCs and TVs, are already consuming at least 8% and as much as 16% of Britain’s power — and that figure is doubling every four years.
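The compounding behind such projections is simple to check. A rough back-of-envelope sketch, assuming the quoted starting shares and a constant four-year doubling period (real projections also factor in efficiency gains and growth in total supply, which is why published dates vary):

```python
# Share of national power consumed by the internet, doubling every
# four years from the quoted 8-16% range. Illustrative arithmetic only.
for start in (0.08, 0.16):
    share, year = start, 2015
    while share < 1.0:
        share *= 2
        year += 4
    print(f"from {start:.0%} in 2015: reaches 100% around {year}")
```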

This problem is already hitting some data centres, and as a result they are limiting the power available to the racks hosted in their facilities. In some cases this can be as low as 2 kW. This was fine a few years ago, but the higher-density racks now being installed demand much more power. And while virtualisation and other new technologies allow huge improvements in IT efficiency, more power to each rack is a prerequisite to run them.

Because of this, it is not unusual for some data centres to force customers to take (and pay for) more space than they need, purely to deliver the power required.
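The space penalty is simple arithmetic. An illustrative sketch with made-up numbers: a deployment needing 40 kW of IT load fits in four racks where 10 kW per rack is on offer, but a 2 kW cap forces the same kit across 20 racks' worth of floor space:

```python
import math

def racks_needed(total_kw, per_rack_cap_kw):
    # Power, not floor space, ends up determining the rack count.
    return math.ceil(total_kw / per_rack_cap_kw)

it_load_kw = 40  # illustrative deployment
print(racks_needed(it_load_kw, 2))   # 20 racks under a 2 kW cap
print(racks_needed(it_load_kw, 10))  # 4 racks where 10 kW is available
```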

This problem is only going to get worse. Make sure you choose a data centre which has the ability to deliver the right levels of power now, has plenty in reserve for the future and won’t penalise you by making you take more space than you really need.

Also ensure your data centre provider can actually deliver the amount of power it has contracted to. There have been cases where power availability has been ‘oversold’ as some providers have gambled on the fact that not all customers will use all their allocated amount of power at the same time.

A data centre relationship should last for many years. Apart from risking business continuity and competitive edge through unreliable or insufficient power, having to make an unplanned move from one data centre provider to another will be a time-consuming and costly exercise.

For data centre users it is essential to do your homework sooner rather than later: increasingly, when it comes to the power game, there will be facilities that have the power and facilities that don’t.

AWS unveils programme to train, attract students to cloud

Amazon has launched a programme to help attract students to its cloud services


Amazon has launched AWS Educate in a bid to help educators and students cultivate cloud-centric development and operations skills, and attract the next generation of users to its cloud services ecosystem.

The company plans to offer students and educators credits for AWS cloud services and make available cloud-related educational content for teachers to use as course materials. Amazon said the move is intended to help train students on cloud, which it said is becoming the default environment for developing and deploying greenfield applications.

“For years, the AWS educational grants program has put cloud technology in the hands of educators and students, giving them the ability to put big ideas into action. We’ve seen students develop assistive computer vision technology in collaboration with the National Federation of the Blind, and aspiring entrepreneurs take a web startup from conception to launch within 60 hours,” said Teresa Carlson, vice president, worldwide public sector, AWS.

“Based on the feedback and success of our grant recipients and the global need for cloud-skilled workers, we developed AWS Educate to help even more students learn cloud technology firsthand in the classroom. We’re pleased to offer AWS Educate to educators, students and educational institutions around the world,” Carlson said.

Students and educators at any educational institution can join the programme and can apply to redeem AWS credits for a range of its services including Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), Amazon Relational Database Service (Amazon RDS), Amazon CloudFront, Amazon DynamoDB, Amazon Elastic MapReduce (Amazon EMR), Amazon Redshift, and Amazon Glacier.

Programme participants will also get access to online training materials and app testing labs, collaboration forums and materials uploaded from other educators.

Telecom, WebRTC and the ‘Internet of Things’ By @RingPlusMobile | @ThingsExpo [#IoT #WebRTC]

The worldwide cellular network will be the backbone of the future IoT, and the telecom industry is clamoring to get on board as more than just a data pipe.
In his session at @ThingsExpo, Evan McGee, CTO of Ring Plus, Inc., discussed what service operators can offer that would benefit IoT entrepreneurs, inventors, and consumers.
Evan McGee is the CTO of RingPlus, a leading innovative U.S. MVNO and wireless enabler. His focus is on combining web technologies with traditional telecom to create a new breed of unified communication that is easily accessible to the general consumer. With over a decade of experience in telecom and associated technologies, Evan is demonstrating the power of OSS to further human and machine-to-machine innovation.


CenturyLink adds clean cloud datacentre in Washington

CenturyLink has added a datacentre in Washington to its footprint


CenturyLink this week opened a datacentre in Moses Lake, Washington, which is powered in part by hydroelectric energy.

The facility is powered in part by hydroelectric generators located on the nearby Columbia River, and because the local climate allows for significant use of free-air cooling (which is much less power-intensive than traditional cooling methods), the company said the datacentre has among the lowest power usage effectiveness (PUE) ratings in the industry.
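PUE is total facility power divided by the power actually delivered to IT equipment, so a lower figure means less overhead lost to cooling and distribution, with 1.0 as the theoretical ideal. A quick sketch with illustrative numbers (not CenturyLink's figures):

```python
def pue(total_facility_kw, it_equipment_kw):
    # PUE = total facility power / IT equipment power (1.0 is ideal).
    return total_facility_kw / it_equipment_kw

# Illustrative figures: conventional chilled cooling vs free-air cooling.
print(pue(1800, 1000))  # 1.8, typical of older facilities
print(pue(1150, 1000))  # 1.15, the sort of figure free-air cooling enables
```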

“CenturyLink’s new low-cost power datacentre services provide many benefits to our customers, including a highly resilient solution coupled with power costs and efficiency metrics that rank among the best in the industry, and the facility serves as an excellent disaster recovery location,” said David Meredith, senior vice president, CenturyLink. “Enterprises enjoy global access to CenturyLink’s portfolio of cloud and managed hybrid IT services, and we continue to extend the reach of our data center footprint to new markets to meet the needs of our customers.”

The datacentre is being hosted by Server Farm Realty, a managed datacentre and colocation provider, and offers access to cloud, colocation, networking and managed services.

This is the second datacentre CenturyLink has added to its footprint in recent months. Two weeks ago the company announced a partnership with NextDC to broaden its datacentre footprint in Australia, and in March brought its cloud platform online in Singapore.

While datacentres are typically located close to large metropolitan centres, Kelly Quinn, research manager with IDC, reckons CenturyLink’s latest datacentre could bring more attention to the region’s potential as a hub for other facilities.

“The central part of Washington state is one of the geographies in which I see substantial potential for further growth as a datacentre hub,” Quinn said.

“Its potential stems from the area’s abundance of natural, power-generating resources, and its relative immunity from natural disasters.”

“It also may offer customers who are ‘green’ conscious the ability to work with a provider that can satisfy their datacentre needs with renewable energy sources,” Quinn added.

USA Freedom Act passes, ending bulk data collection

The USA Freedom Act will end the kind of bulk data gathering made familiar by the PRISM programme and other NSA initiatives


The USA Freedom Act, a bipartisan bill aimed at reforming the US Patriot Act that would, among other things, end the kind of bulk data collection Edward Snowden revealed two years ago, passed the House of Representatives by a wide margin this week. The move may be welcome news to telcos and cloud service providers alike, many of which lobbied hard for US surveillance reform.

The bill, which passed by a vote of 328 for to 88 against, ends the bulk collection of communications metadata under various legal authorities, covering not only the telephony metadata collected under Section 215 but also internet metadata that has been or could be collected under other legal authorities.

It will also allow companies to be more transparent about the demands placed on them by legal authorities, and will create new oversight and accountability mechanisms that shed more light on the decisions reached by the Foreign Intelligence Surveillance Court (FISC), which has so far operated in a deeply secretive manner and with little interference.

“This bill is an extremely well-drafted compromise—the product of nearly two years of work.  It effectively protects Americans’ civil liberties and our national security.  I am very proud of the USA Freedom Act and am confident it is the most responsible path forward,” said Jim Sensenbrenner, Republican Representative for Wisconsin’s fifth district.

“If the Patriot Act authorities expire, and the FISC approves bulk collection under a different authority, how would the public know?  Without the USA Freedom Act, they won’t.  Allowing the PATRIOT Act authorities to expire sounds like a civil libertarian victory, but it will actually mean less privacy and more risk.”

“Let’s not kill these important reforms because we wish the bill did more.  There is no perfect.  Every bill we vote on could do more,” he added.

Others, including Ted Lieu (D-CA), voted against the proposed reforms because the bill didn’t go far enough.

“While I appreciate a number of the reforms in the bill and understand the need for secure counter-espionage and terrorism investigations, I believe our nation is better served by allowing Section 215 to expire completely and replacing it with a measure that finds a better balance between national security interests and protecting the civil liberties of Americans,” Lieu said.

“Beyond Section 215, I am troubled that the USA Freedom Act would leave in place Sections 505 and 702, provisions that also allow sweeping data collection and backdoor searches circumventing encryption that can result in the collection of information of US citizens not identified in warrants.  The loopholes left in place will continue to undermine the trust of the American people.”

“A federal district court struck down the NSA’s spying on Americans and called the NSA PRISM program ‘Orwellian.’ A federal appellate court ruled last week that the NSA’s bulk collection program was illegal. Despite these two court decisions, the NSA continues to operate its unconstitutional and illegal programs.”

Many cloud service providers and telecoms companies have for the past two years (since Snowden’s NSA-related revelations primarily) voiced concerns that failure to reform US surveillance practices could alienate customers both foreign and domestic. Microsoft and Google have been particularly vocal about this in recent months.

Google’s vice president of public policy and government affairs in the Americas, Susan Molinari, trumpeted her support of the bill. She said the bill takes a big step forward in surveillance reform “while preserving important national security authorities.”

“It ends bulk collection of communications metadata under various legal authorities, allows companies like Google to disclose national security demands with greater granularity, and creates new accountability and oversight mechanisms.”

“The bill’s authors have worked hard to forge a bipartisan consensus, and the approved bill is supported by the Obama Administration, including the intelligence community. The bill now moves to the other side of the Capitol, and we hope that the Senate will use the June 1 expiration of Section 215 and other legal authorities to modernize and reform our surveillance programs, while recognizing the importance of protecting Americans from harm,” she added.

US-based telco Verizon declined to comment on the passage of the bill.

Cisco Q3: Enterprise shines while service provider biz struggles

Cisco said its enterprise business is looking strong but its service provider segment still sees challenges


Cisco reported third quarter 2015 revenues of $12.1bn this week, just over 5 per cent more than it raked in during the same quarter last year. In a call with analysts, Cisco execs said the company sees continued growth in its enterprise segment, but its service provider business continues to struggle.

Revenue for the first nine months of fiscal 2015 was $36.3bn, up from $34.8bn for the first nine months of fiscal 2014.

Kelly Kramer, Cisco executive vice president and chief financial officer, said the company saw a good balance across its portfolio, with its enterprise segment looking fairly strong, much like the previous quarter.

UCS revenues for the quarter were $3bn, sequentially flat but a 30 per cent year-on-year increase, and the company said it is seeing growth in its converged infrastructure offerings (those co-developed with VCE and IBM). Its cloud revenues grew 11 per cent year-on-year, mostly on growth in its conferencing cloud software.

On the call, Cisco chairman and chief executive John Chambers said the company is seeing better performance in its enterprise segment than its service provider business – the latter hindered in part by an industry-wide slowdown in spending over the past few quarters.

“In enterprise, the shift to selling outcomes, not products, is resulting in larger opportunities and dramatic increases in pipeline. In US enterprise, for example, the value of our pipeline of deals over $1 million increased approximately 60 per cent year-over-year, with the average deal size up over 30 per cent,” he said.

“We are managing continued challenges in our service provider business, which declined 7 per cent, as global service provider capex remained under pressure and industry consolidation continues. We believe the organisational changes we have made in our global service provider organisation are working, and we are very focused on growing our share of wallet.”


Chambers also said Cisco’s intercloud strategy announced last year will kick into “phase 2” shortly, and while he declined to specifically outline what that entails, he did shed some light on the programme’s challenges in getting other service providers on board.

“The pieces that we were missing was how do you go into this new environment where each of these ‘public clouds in clouds’ are separate? And you have to be on different vendors or different companies’ tech to have the ability to go into it. So what we’re looking at first is an architecture and it cements our relationships in service providers. And then it really comes through to how you monetise it over time.”

“This will just take time to monetize, but the effect we see indirectly is already huge when you talk about a Deutsche Telekom or a Telstra and our relationships with those,” he added.

Samsung announces open Internet of Things platform

Samsung has launched an IoT platform


Samsung has announced the launch of a platform for the Internet of Things, “ARTIK”, which it claims is completely open and covers the entire software and hardware requirements of IoT.

Samsung Electronics’ president and chief strategy officer, Young Sohn, described ARTIK as “the industry’s most advanced, open and secure platform for developing IoT products.” Samsung claims it will enable developers to customise and deploy IoT devices, as well as the services they deliver.

From a hardware perspective, ARTIK comes in three flavours, imaginatively titled ARTIK 1, 5 and 10 respectively. ARTIK 1 is the 144mm² embedded module designed for small form-factor IoT applications and utilises Bluetooth/BLE technology for low-power, short-range communications. ARTIK 5, Samsung says, is intended for small-to-medium size devices, such as home hubs and drones, and comes with a 1GHz dual-core processor, flash memory and on-board DRAM.

ARTIK 10, meanwhile, is the full-fat module with an eight-core processor, HD video encoding/processing, 2GB DRAM and 16GB flash, plus a variety of short-range, low-power communications technologies such as WiFi, Bluetooth and ZigBee. Samsung reckons it’s ideal for media applications, industrial IoT and home servers.

“Industry requirements for IoT devices vary in terms of battery life, computational horsepower and form factor,” said Sohn. “With this family of ARTIK offerings, Samsung is directly addressing the needs of the widest range of customers, uses and applications. ARTIK allows developers to rapidly turn great ideas into market leading IoT products and applications.”

ARTIK also incorporates a number of software considerations to give it credibility as a holistic IoT platform, according to Samsung. Technical aspects of the platform include security and privacy, local storage and computational capabilities, low-power architecture, small form factor, and compatibility with the major connectivity protocols. Finally, the platform comes with a software stack intended to let developers work directly with the application framework, removing the need to build low-level software.