Microsoft’s Underwater Data Center

Microsoft has developed a data center in an extremely unusual place: the bottom of the Pacific Ocean. The company recently unveiled Project Natick, an initiative to bring cloud computing infrastructure closer to the coastal populations it serves. An estimated half of the world’s population lives within 200km of the ocean, so placing data centers underwater can strategically improve efficiency and response times.

In 2015, Microsoft built a capsule filled with pressurized nitrogen that housed a single rack of servers, with heat exchangers clamped to the hull to regulate the temperature inside. The capsule was submerged 30 feet underwater off the California coast for 100 days. Such capsules could have their computing hardware replaced every five years; with a capsule lifespan of 20 years, that allows four generations of hardware per vessel. Microsoft aims to make these underwater data centers self-sufficient by powering them with renewable energy, and is exploring wave and tidal power.

Microsoft explained that “Project Natick reflects Microsoft’s ongoing quest for cloud data center solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable… The vision of operating containerized data centers offshore near major population centers anticipates a highly interactive future requiring data resources located close to users. Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment.”

Microsoft stated: “We see this as an opportunity to field long-lived, resilient datacenters that operate ‘lights out’ — nobody on site — with very high reliability for the entire life of the deployment, possibly as long as 10 years.”

Additional comments:

Microsoft writer Athima Chansanchai has commented: “That’s one of the big advantages of the underwater data center scheme—reducing latency by closing the distance to populations and thereby speeding data transmission. Half the world’s population, Cutler says, lives within 120 miles of the sea, which makes it an appealing option…Cooling is an important aspect of data centers, which normally run up substantial costs operating chiller plants and the like to keep the computers inside from overheating. The cold environment of the deep seas automatically makes data centers less costly and more energy-efficient.” (The Cutler quoted is Ben Cutler, who manages Project Natick.)

The state of the cloud in 2016: DevOps, Docker momentum grows as hybrid hits its stride

Hybrid cloud adoption has grown significantly while DevOps and Docker uptake continues to explode, according to the latest RightScale State of the Cloud research report.

The report, which surveyed more than 1,000 technical professionals, found that the more things change, the more they stay the same with regard to the overall trends pervading the cloud industry. Hybrid cloud adoption among respondents has gone up from 58% in 2015 to 71%, with private cloud going up from 63% to 77%. Similarly, DevOps adoption rose to almost three quarters (74%), while Docker uptake more than doubled, from 13% last year to 27% in 2016.

Four in five companies polled have at least dipped their toes in cloudy waters: 29% are heavy users, 25% have apps running, and 26% are on their first project. Only 9% said they had no plans to move off-premise. With this maturity, RightScale argues, the job role of cloud architect has risen in prominence: 40% of architect respondents now identify as cloud architects, compared with 44% as IT architects and 16% as development architects.

The number of clouds companies are using is also on the up: organisations are leveraging an average of three public and three private clouds each, running applications on 1.5 public clouds and 1.7 private clouds while experimenting on a further 1.5 and 1.3 respectively. Smaller businesses are more likely to go all-in on public cloud (24%, versus 10% of enterprises), while the most popular enterprise mix was 20% public to 80% private workloads, chosen by 39% of enterprises against 22% of SMBs.

Elsewhere, Amazon Web Services (AWS) was used by 57% of those polled, unsurprisingly making it the most popular cloud provider by far, yet the year-over-year figures show minimal growth, as enterprise usage increased while small business usage dropped. Microsoft Azure infrastructure as a service grew from 12% of respondents in 2015 to 17% in 2016, while its platform as a service rose four points year over year to 13%.

These figures correlate with recent analysis on the cloud infrastructure market from Synergy Research, which argued that despite AWS having a yawning gap over the rest of the field, other vendors – in particular Microsoft – were growing at a faster pace.

“The 2016 State of the Cloud survey shows that cloud adoption is growing and hybrid cloud adoption has now hit its stride,” said Kim Weins, RightScale VP marketing, in a company blog post. “The strong growth in the use of private cloud, combined with the ubiquity of public cloud, means that a super-majority of organisations are now operating in a hybrid environment.”

Read more: State of the Cloud 2015: DevOps adoption rises and hybrid cloud strategy deepens in new study

IoT and the Future of Utilities | @ThingsExpo #IoT #M2M #InternetOfThings

As utility companies continue to track usage, consumers are growing equally cognizant of their individual energy use, especially with the growth of private alternative energy sources such as home solar panels or wind power. That’s why this week’s IIoT top news is focused on the utility technology of tomorrow. The practice of selling privately generated energy back to the smart grid is still in its infancy, but consumers now expect device connectivity to track, say, the amount of energy used by the lights or the refrigerator during nighttime, off-peak hours. Because of that, the onus lies on utility companies to gather data and deploy advanced analytics that can translate it into relevant information for the average user, as sketched below.
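
As a minimal illustration of the kind of analytics involved, the sketch below splits smart-meter readings into peak and off-peak usage per device. The tariff window, data shape, and figures are all invented for the example; real utilities define their own off-peak hours and data feeds.

```python
from collections import defaultdict

# Hypothetical off-peak window; actual tariff hours vary by utility.
OFF_PEAK_HOURS = set(range(0, 7))  # midnight through 6am

def split_usage(readings):
    """Aggregate per-device kWh into peak and off-peak buckets.

    `readings` is an iterable of (device, hour_of_day, kwh) tuples,
    e.g. drawn from a smart-meter feed.
    """
    totals = defaultdict(lambda: {"peak": 0.0, "off_peak": 0.0})
    for device, hour, kwh in readings:
        bucket = "off_peak" if hour in OFF_PEAK_HOURS else "peak"
        totals[device][bucket] += kwh
    return dict(totals)

readings = [
    ("refrigerator", 2, 0.12),   # 2am reading: off-peak
    ("refrigerator", 14, 0.15),  # 2pm reading: peak
    ("lights", 21, 0.30),        # 9pm reading: peak
]
print(split_usage(readings))
# {'refrigerator': {'peak': 0.15, 'off_peak': 0.12}, 'lights': {'peak': 0.3, 'off_peak': 0.0}}
```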

read more

Integrating Agile and DevOps | @DevOpsSummit #DevOps #IoT #Microservices

Faster business cycles and digital consumers are forcing enterprises to respond more quickly to customer needs and competitive demands. Successful integration of DevOps and Agile development will be key to business success in today’s digital economy.
In his session at DevOps Summit, Pradeep Prabhu, Co-Founder & CEO of Cloudmunch, covered the critical practices that enterprises should consider to seamlessly integrate Agile and DevOps processes, along with the barriers to implementing this in the enterprise. He also provided examples of how an integrated approach has helped major companies embrace the cloud-first, DevOps-at-scale approach needed to optimize speed and delivery in today’s hyper-competitive marketplace.

read more

Talk to Your Microservice Via a Chat Bot, Not UI | @CloudExpo #BigData #Microservices

In most cases it is convenient to have some human interaction with a web (micro-)service, no matter how small it is. A traditional approach would be to create an HTTP interface where user requests are dispatched and HTML/CSS pages are served. That approach suits a website, but it is not really convenient for a web service, which is not meant to be good-looking or UX-optimized, only up and running 24×7. Instead, talking to a web service in chat-bot mode can be much more convenient, both for the user and for the web service developer.
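
To make the idea concrete, here is a minimal sketch of such a chat-style endpoint using only Python’s standard library. The commands, port, and dispatch logic are invented for illustration; a real service would route commands to its actual operations.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_command(text):
    """Dispatch a plain-text chat command to the service's logic."""
    if text == "status":
        return "service is up"
    if text.startswith("echo "):
        return text[len("echo "):]
    return "unknown command; try 'status' or 'echo <message>'"

class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the raw command text instead of rendering an HTML page.
        length = int(self.headers.get("Content-Length", 0))
        command = self.rfile.read(length).decode("utf-8").strip()
        reply = handle_command(command)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(reply.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ChatHandler).serve_forever()
```

A user can then “talk” to the service with something as simple as curl -d 'status' http://localhost:8080/, and the same text protocol plugs naturally into a chat platform’s bot API.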

read more

Announcing @MenAndMice to Exhibit at @CloudExpo | #Cloud #BigData #IoT

SYS-CON Events announced today that Men & Mice, the leading global provider of DNS, DHCP and IP address management overlay solutions, will exhibit at SYS-CON’s 18th International Cloud Expo®, which will take place on June 7-9, 2016, at the Javits Center in New York City, NY.
The Men & Mice Suite overlay solution is already known for its powerful application in heterogeneous operating environments, enabling enterprises to scale without fuss. Building on a solid range of diverse platform support, the Men & Mice Suite is taking overlay dexterity one step further into the cloud through full support for Microsoft Azure DNS (since January 2016) and Amazon Route 53 on AWS (since 2014). Major cloud platform support allows Men & Mice Suite users free rein to extend their infrastructure into the cloud if, and when, they need it.
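
The Men & Mice Suite drives this through its own management interfaces; purely for a sense of what such a DNS overlay automates, here is the kind of underlying Route 53 operation involved, sketched with AWS’s boto3 SDK. The zone ID, record name, and address are placeholders.

```python
import boto3

route53 = boto3.client("route53")

# Upsert an A record in a hosted zone, the basic building block of
# extending on-premise DNS management into the cloud.
route53.change_resource_record_sets(
    HostedZoneId="Z123EXAMPLE",  # placeholder hosted-zone ID
    ChangeBatch={
        "Comment": "point a service name at a cloud-hosted address",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com.",
                "Type": "A",
                "TTL": 300,
                "ResourceRecords": [{"Value": "203.0.113.10"}],
            },
        }],
    },
)
```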

read more

Time-to-Value in Big Data Deployments | @CloudExpo #BigData #IoT #M2M

As enterprises work to take advantage of Big Data technologies, they frequently become distracted by product-level decisions. In most new Big Data builds this approach is completely counter-productive: it presupposes tools that may not be a fit for development teams, forces IT to take on the burden of evaluating and maintaining unfamiliar technology, and represents a major up-front expense.
In his session at @BigDataExpo at @ThingsExpo, Andrew Warfield, CTO and Co-Founder of Coho Data, will discuss how enterprises can bring Big Data tools to a point of customer value. He’ll cover core lessons about how to approach Big Data and how to succeed with it in your environment.

read more

Cloud Security Industry to Register Impressive Growth | @CloudExpo #Cloud

In a recent market study from Transparency Market Research (TMR), the global cloud security market is projected to grow at a CAGR of 12.80% from 2015 to 2022. The report, titled “Cloud Security Market – Global Industry Analysis, Size, Share, Growth, Trends and Forecast 2014 – 2022”, states that the global cloud security market will reach a valuation of US$11.8 bn by 2022, up from its 2014 valuation of US$4.5 bn. According to the report, factors such as the rapid proliferation of handheld devices, the increasing trend of CYOD and BYOD policies in corporate organizations, increasing use of cloud-based computing by small and medium scale businesses, and the ease of use of cloud security services are driving the cloud security market.
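
Those figures are internally consistent: compounding the 2014 valuation forward over the eight years to 2022 reproduces the stated growth rate, as this quick check shows (values taken from the report as quoted above).

```python
# Sanity check on TMR's projection: US$4.5 bn in 2014 growing to
# US$11.8 bn in 2022 implies a compound annual growth rate of ~12.8%.
start, end, years = 4.5, 11.8, 2022 - 2014
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.2%}")  # -> CAGR: 12.81%, matching the reported 12.80%
```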

read more

Unit Testing Legacy Code: Three Reasons to Reconsider By @ReselBob | @CloudExpo #Cloud

Teams that have embraced DevOps and begun using the practice of test-driven development are familiar with the headaches that accompany testing legacy code.
This is particularly true for companies with applications out in production that have been working for years but have no formal tests or testing regimen associated with their codebases.

Most of the time the actual code for these applications is available; many times it is not. In fact, both working knowledge of the code and the developer who wrote it might have left the building years ago.
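
A common first step in this situation is a characterization test: rather than asserting what the code should do, you pin down what it currently does, so that later refactoring has a safety net. Below is a minimal sketch, with a hypothetical legacy function standing in for real production code.

```python
import unittest

def legacy_discount(price, customer_type):
    """Stand-in for an undocumented function found in production."""
    if customer_type == "gold":
        return price * 0.9
    if customer_type == "silver":
        return price * 0.95
    return price

class CharacterizeLegacyDiscount(unittest.TestCase):
    """Tests that record current behavior, correct or not."""

    def test_gold_customers_get_ten_percent_off(self):
        self.assertAlmostEqual(legacy_discount(100.0, "gold"), 90.0)

    def test_unknown_customer_types_pay_full_price(self):
        # Observed behavior; it may or may not be what the business intends.
        self.assertEqual(legacy_discount(100.0, "bronze"), 100.0)

if __name__ == "__main__":
    unittest.main()
```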

read more