Category archive: Azure

Microsoft targets customer datacentres with Azure Stack

Microsoft is bolstering its hybrid cloud appeal on the one hand, and going head to head with other large incumbents on the other

Microsoft revealed a series of updates to its server and cloud technologies aimed at bridging the divide between Azure and Windows Server.

The company announced Azure Stack, software that consists of the architecture and microservices Microsoft deploys to run its public cloud version of Azure, including some of the latest additions to the platform, like Azure Service Fabric and Azure App Fabric, which have made the architecture much more container-like.

Built on the same core technology as Azure but deployed in a customer’s datacentre, Azure Stack makes critical use of, among other things, some of the company’s investments in software-defined networking, the company said.

The company also said it has worked a number of bugs out of the next version of Windows Server (2016), with the second preview being made available this week; the next version of Windows Server will include a number of updates announced last month, including Hyper-V containers and nano servers, which are effectively Dockerised and slimmed-down Windows Server images, respectively.

Azure Stack will preview this summer and Windows Server 2016 is already available for preview.

The company also announced Microsoft Operations Management Suite (OMS), a hybrid cloud management service that supports Azure, AWS, Windows Server, Linux, VMware, and OpenStack.

For Microsoft the updates signal a significant push into hybrid cloud as it looks to align the architecture of its Windows Server and Azure offerings and help customers manage workloads and operations in a multi-cloud world. Interestingly, by taking the Azure architecture directly to customer datacentres it is effectively going head-to-head with other IaaS software vendors selling alternatives like OpenStack and CloudStack – Dell, HP, Cisco, Red Hat, IBM and so forth – which is in some ways new territory for the cloud giant.

Microsoft jumps into the data lake

At the company’s annual Build conference this week Microsoft unveiled, among other things, an Azure Data Lake service, which the company is pitching as a hyperscale big data repository for all kinds of data.

The data lake concept is a fairly new one, the gist of it being that data of varying types and structures is created at such a high velocity and in such large volumes that it’s prompting a necessary evolution in the applications and platforms required to handle that data.

It’s really about being able to store all that data in a volume-optimised (and cost-efficient) way that maintains the integrity of that information when you shift it somewhere else, whether to an application, an analytics engine or a data warehouse.

“While the potential of the data lake can be profound, it has yet to be fully realized. Limits to storage capacity, hardware acquisition, scalability, performance and cost are all potential reasons why customers haven’t been able to implement a data lake,” explained Oliver Chiu, Microsoft’s product marketing manager for Hadoop, big data and data warehousing.

The company is pitching the Azure Data Lake service as a means of running Hadoop and advanced analytics using Microsoft’s own Azure HDInsight, as well as Revolution R Enterprise and the Hadoop distributions developed by Hortonworks and Cloudera.

It’s built to support “massively parallel queries” so information is discoverable in a timely fashion, and to handle high volumes of small writes, which the company said makes the service ideal for Internet of Things applications.

“Microsoft has been on a journey for broad big data adoption with a suite of big data and advanced analytics solutions like Azure HDInsight, Azure Data Factory, Revolution R Enterprise and Azure Machine Learning. We are excited for what Azure Data Lake will bring to this ecosystem, and when our customers can run all of their analysis on exabytes of data,” Chiu explained.

Pivotal is also among a handful of vendors seriously bought into the concept of data lakes. However, although Chiu alluded to cost and performance issues associated with the data lake approach, many enterprises aren’t yet at a stage where the variety, velocity and volume of data their systems ingest are prompting a conceptual change in how that data is perceived, stored or curated; in a nutshell, many enterprises are still too siloed – not least in how they treat data.

Datacastle, 21Vianet partner on cloud data protection, backup in China

Datacastle is partnering with 21Vianet to deploy its cloud backup solutions in China

Backup provider Datacastle has partnered with 21Vianet in a deal that will see the Chinese datacentre services firm resell Datacastle’s cloud-based backup and data protection solutions to customers in China.

The solution is being deployed on Microsoft Azure, which partners with 21Vianet to host its infrastructure-as-a-service in the region.

“21Vianet is committed to bringing the worldwide best-in-class cloud solutions on Microsoft Azure in China,” said Wing Ker, president of Microsoft Cloud Operations at 21Vianet. “Enterprises in China will now have endpoint data protection option to protect against ransomware, data loss, and data breach through our partnership with Datacastle.”

Ron Faith, chief executive officer of Datacastle said: “Given 21Vianet’s expertise operating Microsoft Azure in China and their trusted status as a datacentre service provider, customers in China will get the best performance, reliability and security.”

Microsoft and 21Vianet announced general availability of Microsoft Azure services in China just over a year ago, amid much fanfare. The service launched with about 3,000 clients signed up to use it, and Ralph Haupter, corporate vice-president and chief executive of Microsoft Greater China, recently said Azure has since accumulated more than 50,000 customers, mainly SMEs.

Microsoft debuts container-like architecture for cloud

Microsoft is trying to push more cloud-friendly architectures

Microsoft has announced Azure Service Fabric, a framework for ISVs and startups developing highly scalable cloud applications that combines a range of microservices, orchestration, automation and monitoring tools. The move comes as the software company looks to deepen its use of – and ties to – open source tech.

Azure Service Fabric, which is based in part on technology included in Azure App Fabric, breaks apps apart into a wide range of small, independently versioned microservices, so that apps created on the platform don’t need to be re-coded in order to scale past a certain point. The result, the company said, is the ability to develop highly scalable applications while enabling low-level automation and orchestration of their constituent services.

“Service Fabric was born from our years of experience delivering mission-critical cloud services and has been in production for more than five years. It provides the foundational technology upon which we run our Azure core infrastructure and also powers services like Skype for Business, InTune, Event Hubs, DocumentDB, Azure SQL Database (across more than 1.4 million customer databases) and Bing Cortana – which can scale to process more than 500 million evaluations per second,” explained Mark Russinovich, chief technology officer of Microsoft Azure.

“This experience has enabled us to design a platform that intrinsically understands the available infrastructure resources and needs of applications, enabling automatically updating, self-healing behaviour that is essential to delivering highly available and durable services at hyper-scale.”
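To make the decomposition idea concrete, here is a minimal, purely illustrative sketch in Python – not the Service Fabric programming model, which is exposed through Microsoft’s own SDKs – of an application split into independently versioned services that can be scaled and upgraded without touching one another:

```python
# Illustrative only: an app modelled as independently versioned
# microservices, each scaled and upgraded on its own. Service Fabric
# performs this kind of orchestration (plus placement, healing and
# monitoring) for real clusters; the service names here are made up.
from dataclasses import dataclass

@dataclass
class Microservice:
    name: str
    version: str
    instances: int

# One "application" as a set of small, separately deployable parts.
app = {
    "checkout":  Microservice("checkout",  "1.4.2", instances=3),
    "inventory": Microservice("inventory", "2.0.0", instances=5),
    "emailer":   Microservice("emailer",   "1.0.9", instances=1),
}

def scale(service: str, instances: int) -> None:
    """Scale one service without redeploying or re-coding the others."""
    app[service].instances = instances

def upgrade(service: str, version: str) -> None:
    """Roll one service forward independently of its neighbours."""
    app[service].version = version

scale("checkout", 10)        # absorb a traffic spike in one hot path
upgrade("emailer", "1.1.0")  # ship a fix without touching checkout
for s in app.values():
    print(f"{s.name} v{s.version} x{s.instances}")
```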

A preview of the service will be released to developers at the company’s Build conference next week.

The move is part of a broader architectural shift in the software stack powering cloud services today. It’s clear the traditional OS / hypervisor model is limited in its ability to ensure services are scalable and resilient for high-I/O applications, which has manifested in, among other things, a shift towards breaking applications down into a series of connected microservices – something many associate with Docker, OpenStack and other open source software projects.

Speaking of open source, the move comes just days after Microsoft announced MS Open Tech, the standalone open source subsidiary of Microsoft, will re-join the company, in a move the company hopes will drive further engagement with open source communities.

“The goal of the organization was to accelerate Microsoft’s open collaboration with the industry by delivering critical interoperable technologies in partnership with open source and open standards communities. Today, MS Open Tech has reached its key goals, and open source technologies and engineering practices are rapidly becoming mainstream across Microsoft. It’s now time for MS Open Tech to rejoin Microsoft Corp, and help the company take its next steps in deepening its engagement with open source and open standards,” explained Jean Paoli, president of Microsoft Open Technologies.

“As MS Open Tech rejoins Microsoft, team members will play a broader role in the open advocacy mission with teams across the company, including the creation of the Microsoft Open Technology Programs Office. The Programs Office will scale the learnings and practices in working with open source and open standards that have been developed in MS Open Tech across the whole company.”

Fujitsu, Microsoft collaborate on Azure, Internet of Things

Fujitsu and Microsoft are partnering on IoT for farming and agriculture

Fujitsu and Microsoft announced an Internet of Things partnership focused on blending the former’s devices and IoT services for agriculture and manufacturing, powered by Windows software and Azure cloud services.

The move will see the two companies offer a solution that blends Fujitsu’s Eco-Management Dashboard, an IoT service for the agricultural sector, and Microsoft’s Azure database services, so that data collected from sensors deployed throughout those operations can be analysed to help firms save money and streamline processes.

The companies said the platform has uses in other sectors and can be tailored to a range of different niche verticals.

“Leveraging the Fujitsu Eco-Management Dashboard solution alongside Microsoft Azure and the Fujitsu IoT/M2M platform, we are able to deliver real-time visualisation of the engineering process for big data analytics to improve the entire production process and inform decision-making,” said Hiroyuki Sakai, corporate executive officer, executive vice president, head of global marketing at Fujitsu.

“We are proud to partner with Fujitsu to enable the next generation of manufacturing business models and services enabled by IoT along with advanced analytics capabilities like machine learning,” said Sanjay Ravi, managing director, Discrete Manufacturing Industry at Microsoft. “Fujitsu’s innovation will drive new levels of operational excellence and accelerate the pace of digital business transformation in manufacturing.”

Fujitsu has been doubling down on IoT this year, with manufacturing looking to be a strong sector for those kinds of services, according to analysts. In January the company announced plans to expand its two core datacentres in Japan in a bid to meet accelerating demand for its cloud and IoT services.

The 2nd annual Internet of Things World event, to be held in San Francisco in May, is due to address some of the challenges facing the industry around IoT.

Microsoft unveils Hyper-V containers, nano servers

Microsoft has unveiled Hyper-V containers and nano servers

Microsoft has unveiled a number of updates to Windows Server, including Hyper-V containers, which are essentially Docker containers embedded in Hyper-V VMs, and nano servers, slimmed-down Windows Server images.

Microsoft said Hyper-V containers are ideal for users that want virtualisation-grade isolation, but still want to run their workloads within Docker containers in a Windows ecosystem.

“Through this new first-of-its-kind offering, Hyper-V Containers will ensure code running in one container remains isolated and cannot impact the host operating system or other containers running on the same host,” explained Mike Neil, general manager for Windows Server at Microsoft, in a recent blog post.

“In addition, applications developed for Windows Server Containers can be deployed as a Hyper-V Container without modification, providing greater flexibility for operators who need to choose degrees of density, agility, and isolation in a multi-platform, multi-application environment.”
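For a feel of what that choice of isolation level looks like at deploy time, here is a hedged sketch in Python that shells out to the Docker CLI; the --isolation flag is the switch Docker ultimately exposed on Windows hosts, and the image name is a placeholder:

```python
# Sketch: deploying the same containerised app with either shared-kernel
# (process) isolation or Hyper-V isolation. Assumes a Windows Docker host;
# "contoso/web-app:1.0" is a made-up image name.
import subprocess

def run_container(image: str, hyperv: bool) -> None:
    """Launch a container, choosing the isolation level per workload."""
    isolation = "hyperv" if hyperv else "process"
    subprocess.run(
        ["docker", "run", "-d", "--isolation", isolation, image],
        check=True,
    )

# Same image, no code changes -- only the isolation level differs.
run_container("contoso/web-app:1.0", hyperv=False)  # Windows Server Container
run_container("contoso/web-app:1.0", hyperv=True)   # Hyper-V Container
```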

Windows Server Containers will be enabled in the next release of Windows Server, which is due to be demoed in the coming weeks, making good on Microsoft’s commitment to make the Windows Server ecosystem (including Azure) Docker-friendly.

The company also unveiled what it’s calling nano servers: a “purpose-built OS” that is essentially a stripped-down Windows Server image optimised for cloud and container workloads. Nano servers can be deployed onto bare metal, and because Microsoft removed so much code, they boot up and run more quickly.

“To achieve these benefits, we removed the GUI stack, 32 bit support (WOW64), MSI and a number of default Server Core components. There is no local logon or Remote Desktop support. All management is performed remotely via WMI and PowerShell. We are also adding Windows Server Roles and Features using Features on Demand and DISM. We are improving remote manageability via PowerShell with Desired State Configuration as well as remote file transfer, remote script authoring and remote debugging.  We are working on a set of new Web-based management tools to replace local inbox management tools,” the company explained.

“Because Nano Server is a refactored version of Windows Server it will be API-compatible with other versions of Windows Server within the subset of components it includes. Visual Studio is fully supported with Nano Server, including remote debugging functionality and notifications when APIs reference unsupported Nano Server components.”
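Because there is no local logon, even routine administration happens over remoting. The sketch below shows what that might look like from a management workstation; it assumes the third-party pywinrm Python package, WinRM enabled on the target, and placeholder hostname and credentials:

```python
# A minimal sketch of headless, remote management of a Nano Server-style
# host over WinRM. Assumes: `pip install pywinrm`, WinRM enabled on the
# target, and the placeholder host/credentials below.
import winrm

session = winrm.Session("nano-host.example.com",
                        auth=("administrator", "<password>"))

# There is no GUI or local logon, so even basic inspection happens
# through remote PowerShell.
result = session.run_ps(
    "Get-Process | Sort-Object CPU -Descending | Select-Object -First 5"
)
print(result.std_out.decode())

# Restart a service remotely in the same way.
session.run_ps("Restart-Service -Name W3SVC")
```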

The move is a sign Microsoft is keen to keep its on-premises and cloud platforms ahead of the technology curve, and is likely to appeal to .NET developers attracted to some of the benefits of containers while wanting to stay firmly within a Windows world in terms of the tools and code used. Still, the company said it is working with Chef to ensure nano servers work well with Chef’s DevOps tools.

Fun Facts about Microsoft Azure

Looking for some helpful facts about Microsoft Azure? For those out there who may be confused about the Microsoft Azure solutions offered to date, here is the first in a series of posts about the cool new features of Microsoft’s premium cloud offering, Azure.

Azure Backup, ok… wait, what? I need to do backup in the cloud? No one told me that!

Yes, Virginia, you need a backup solution in the cloud. To keep this at a high level, I’ve outlined below what the Azure Backup offering really is. There are several protections built into the Azure platform that help customers protect their data, as well as options to recover from a failure.

In a normal on-premises scenario, host-based hardware and networking failures are protected at the hypervisor level. In Azure you do not see this because control of the hypervisor has been removed. Azure, however, is designed to be highly available, meeting and exceeding the posted SLAs associated with the service.

Hardware failures of storage are also protected against within Azure. At the lowest end you have Locally Redundant Storage (LRS), which maintains three copies of your data within a region. The more common and industry-preferred method is Geo-Redundant Storage (GRS), which keeps three copies in your region and three additional copies in another datacenter, geographically dispersed based on a complex algorithm. These protections help ensure the survivability of your workloads.

Important to note: the copies in the second datacenter are crash-consistent, so they should not be considered a backup of the data but more of a recovery mechanism for a disaster.
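To show where that LRS-versus-GRS choice actually gets made, here is a sketch using the current Azure SDK for Python (which postdates this post); the subscription ID, resource group, account name and region below are all placeholders:

```python
# Sketch: choosing the storage redundancy level (LRS vs GRS) when
# creating a storage account. Assumes
# `pip install azure-identity azure-mgmt-storage` and placeholder names.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.storage_accounts.begin_create(
    "my-resource-group",
    "mybackupstorage01",
    {
        "location": "eastus",
        "kind": "StorageV2",
        # "Standard_LRS" keeps three copies in one region;
        # "Standard_GRS" adds three more in the paired region.
        "sku": {"name": "Standard_GRS"},
    },
)
account = poller.result()
print(account.name, account.sku.name)
```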

Did I hear you just ask about Recovery Services in Azure? Why yes, we have two to talk about today.

  • Azure Backup
  • Azure Site Recovery

Azure Site Recovery – This service orchestrates site recovery and also provides a destination for replicated virtual machines. Microsoft currently supports Hyper-V to Azure, Hyper-V to Hyper-V, and VMware to VMware recovery scenarios with this method.

Azure Backup is a destination for your backups. Microsoft offers traditional agents for Windows Backup and for the preferred platform, Microsoft System Center 2012 – Data Protection Manager. Azure holds up to 120 copies of the data in the cloud, which can be restored as needed. At this time the Azure Windows backup agent only protects files; it will not do full-system or bare-metal backups of Azure VMs.

As of this blog post, the recommended way to get a traditional full-system backup is a two-step process: use Windows Backup, which can capture a System State backup, then enable Azure Backup to copy that backup into your Azure Backup vault.
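Scripted end to end, that two-step flow might look like the sketch below; wbadmin is the built-in Windows Backup CLI, while the Azure Backup agent cmdlets and the target volume shown here are assumptions for illustration:

```python
# Sketch of the two-step approach: capture a System State backup with
# Windows Backup (wbadmin), then trigger the Azure Backup agent so the
# result lands in the Azure Backup vault. E: is a placeholder volume.
import subprocess

# Step 1: local System State backup to a dedicated volume.
subprocess.run(
    ["wbadmin", "start", "systemstatebackup", "-backupTarget:E:", "-quiet"],
    check=True,
)

# Step 2: kick off the Azure Backup agent job that protects that volume.
# The MSOnlineBackup module ships with the Azure Backup agent; treat this
# particular invocation as illustrative.
subprocess.run(
    [
        "powershell", "-NoProfile", "-Command",
        "Import-Module MSOnlineBackup; Get-OBPolicy | Start-OBBackup",
    ],
    check=True,
)
```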

Two other methods exist, though the jury is still out on the validity of these offerings: VM Capture and Blob Snapshot.

  • VM Capture – equivalent to a VM snapshot
  • Blob Snapshot – equivalent to a LUN snapshot

As I said, these are options, but many consider them too immature at this time and they are not widely adopted. Hopefully this provides some clarity around Azure; as with all things Microsoft cloud-related, new features arrive almost daily now. Check back again for more updates on what Azure can do for your organization!

By David Barter, Practice Manager, Microsoft Technologies

Microsoft Azure – It’s More Than Just Portability

When people discuss Microsoft Azure, they often think about portability to the cloud. One of the misconceptions about the Azure cloud is that you’re just taking your on-prem virtual machines and moving them to the cloud when, in reality, Azure is much more than that. It is about VM portability, but it is also about running different platforms in the cloud. Using instances allows users to move, say, a web server to an instance in the Azure cloud so they don’t have to worry about the patching and management of that server from month to month; instead, they know it’s already taken care of for them. Other benefits include uptime SLAs and backup solutions.

Watch the video below with DJ Ferrara to learn more about the benefits Microsoft Azure has to offer.

Microsoft Azure – What are the benefits?

http://www.youtube.com/watch?v=yfsobUCjff0

What are your thoughts on Microsoft Azure? Has your organization utilized the platform? Any plans to use Azure in the future? Why or why not?

To hear more from DJ, watch his video blog discussing the pros and cons of different public cloud platforms and when it makes sense to use each. If you’d like to speak with DJ more about the Azure cloud, email us at socialmedia@greenpages.com.

Video with DJ Ferrara, Vice President & Enterprise Architect

Comparing Cloud Platforms: When it Makes Sense to Use Each

Video with DJ Ferrara, Vice President & Enterprise Architect

http://www.youtube.com/watch?v=Gn9-VJ92yxc

In this video, DJ discusses the pros and cons different cloud providers have to offer. When does it make sense to use vCloud Air (note: this was filmed right before VMware announced the name change from vCHS)? What about Azure? How about Amazon?

If you’re interested in learning more, read this ebook about the evolution of the corporate IT department.

Nirvanix Shutdown: Collateral Damage in Big Players’ Price War?

The sudden shutdown of Nirvanix, an early but recently faltering participant in the “pure-play” online storage space dominated by the likes of AWS S3, Microsoft Azure and Google, is in large part a result of downward pressure on prices as the big players continually lower theirs. Amazon, for instance, launched S3 in 2006 and charged $0.15 per gigabyte-month. After many step-wise price cuts S3 is down to $0.095 per gigabyte-month.
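A quick back-of-the-envelope comparison using only those two headline rates shows the scale of the squeeze (real bills also reflected tiered pricing and request/egress charges):

```python
# Rough cost comparison at the two headline S3 rates quoted above,
# for a hypothetical 50 TB archive. Ignores tiering and egress fees.
LAUNCH_RATE = 0.15   # USD per GB-month, S3 at launch in 2006
LATER_RATE = 0.095   # USD per GB-month after successive cuts

gb = 50 * 1024  # 50 TB expressed in GB

print(f"At launch pricing:  ${gb * LAUNCH_RATE:,.2f}/month")   # $7,680.00
print(f"At current pricing: ${gb * LATER_RATE:,.2f}/month")    # $4,864.00
print(f"Cumulative cut:     {1 - LATER_RATE / LAUNCH_RATE:.0%}")  # 37%
```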

Pure online storage is fast becoming the sole province of vendors who either enjoy economies of scale, or who treat their offerings as a loss-leader to get other business (or a combination of both).

Smaller players may have to add value in other ways to survive. Nirvanix was not profitable, and when their latest round of funding came up short, it was the last nail in their coffin.