Category archive: Azure

Toyota and Microsoft launch connected car initiative

Japanese car brand Toyota has teamed up with Microsoft to launch Toyota Connected, a new joint venture to further the car manufacturer’s efforts towards autonomous vehicles.

Toyota Connected builds on a long-standing relationship with Microsoft, leveraging Azure cloud technology to make the connected driving experience smarter. Based in Plano, Texas, Toyota Connected will expand the company’s capabilities in data management and the development of data services.

“Toyota Connected will help free our customers from the tyranny of technology. It will make lives easier and help us to return to our humanity,” said Zack Hicks, CEO of Toyota Connected.  “From telematics services that learn from your habits and preferences, to use-based insurance pricing models that respond to actual driving patterns, to connected vehicle networks that can share road condition and traffic information, our goal is to deliver services that make lives easier.”

The connected car market has been growing healthily in recent years, but it is not new territory for Microsoft or Toyota: the two companies have been collaborating on telematics since 2011, working on services such as infotainment and real-time traffic updates. A 2015 report stated that connected car services will account for nearly $40 billion in annual revenue by 2020, while big data and analytics technology investments will reach $5 billion across the industry in the same period.

The new company has been given two mandates: first, to support product development for customers, dealers, distributors and partners through advanced data analytics; and second, to build on Toyota’s existing partnership with Microsoft to accelerate R&D efforts and deliver new connected car solutions. The company has stated that its vision is to “humanize the driving experience while pushing the technology into the background”.

The launch of Toyota Connected will enable the organization to consolidate R&D programs into one business unit, which it claims will ensure that all initiatives remain customer-centric. Initiatives will focus on a number of areas including in-car services and telematics, home/IoT connectivity, personalization and smart city integration.

As part of the launch, Toyota will also adopt Microsoft’s Azure cloud computing platform, employing a hybrid solution globally, whilst also housing a number of Microsoft engineers in its offices in Plano.

“Toyota is taking a bold step creating a company dedicated to bringing cloud intelligence into the driving experience,” said Kurt DelBene, EVP, Corporate Strategy and Planning at Microsoft. “We look forward to working with Toyota Connected to harness the power of data to make driving more personal, intuitive and safe.”

Azure Site Recovery: 4 Things You Need to Know

Disaster recovery has traditionally been a complex and expensive proposition for many organizations. Many have chosen to rely on backups of data as their method of disaster recovery. This approach is cost-effective; however, it can result in extended downtime during a disaster while new servers are provisioned (referred to as the Recovery Time Objective, or RTO) and potentially large losses of data created between the time of the backup and the time of the failure (referred to as the Recovery Point Objective, or RPO). In the worst-case scenario, the backups are not viable at all and there is a total loss. For those who have looked into more advanced disaster recovery models, the complexity and costs of such a system quickly add up. Azure Site Recovery helps bring disaster recovery to all companies in four key ways.

Azure Site Recovery makes disaster recovery easy by delivering it as a cloud-hosted service

Azure Site Recovery lives within the Microsoft cloud and is controlled and configured through the Azure Management Portal. There is no requirement to patch or maintain servers; it’s disaster recovery orchestration as a service. Using Site Recovery does not require that you use Azure as the destination of replication: it can protect your workloads between two company-owned sites. For example, if you have a branch office and a home office that both run VMware or Hyper-V, you can use Azure Site Recovery to replicate, protect and fail over workloads between your existing sites. It can also, optionally, replicate data directly to Azure, which avoids the expense and complexity of building and maintaining a disaster recovery site.

Azure Site Recovery is capable of handling almost any source workload and platform

Azure Site Recovery offers an impressive list of platforms and applications it can protect. It can protect any workload running on VMware virtual machines on vSphere or ESXi, Hyper-V VMs with or without System Center Virtual Machine Manager and, yes, even physical workloads can be replicated and failed over to Azure. Microsoft has worked internally with its application teams to make sure Azure Site Recovery works with many of the most popular Microsoft solutions including Active Directory, DNS, web apps (IIS, SQL), SCOM, SharePoint, Exchange (non-DAG), Remote Desktop/VDI, Dynamics AX, Dynamics CRM, and Windows File Server. It has also independently tested protection of SAP, Linux (OS and apps) and Oracle workloads.

Azure Site Recovery has predictable and affordable pricing

Unlike traditional disaster recovery products that require building and maintaining a warm or hot DR site, Site Recovery allows you to replicate VMs to Azure. Azure Site Recovery offers a simple pricing model that makes it easy to estimate costs. For virtual machines protected between company-owned sites, it is a flat $16/month per protected virtual machine. If you are protecting your workloads to Azure, it is $54/month per protected server. In addition, the first 31 days of protection for any server are free, which allows you to try out and test Azure Site Recovery before you have to pay for it. It is also a way to use Azure Site Recovery to migrate your workloads to Azure for free.
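As a quick illustration, here is a sketch of estimating a monthly bill from those list prices alone (the figures come straight from the paragraph above; an actual Azure bill may also include storage and any compute consumed during a failover):

```python
# Rough cost estimate from the list prices quoted above:
# $16/month per VM protected between company-owned sites,
# $54/month per server protected to Azure; first 31 days free.
SITE_TO_SITE = 16   # USD per protected VM per month
TO_AZURE = 54       # USD per protected server per month

def monthly_cost(site_to_site_vms: int, to_azure_servers: int) -> int:
    """Steady-state monthly charge once each server's free 31 days are over."""
    return site_to_site_vms * SITE_TO_SITE + to_azure_servers * TO_AZURE

# Example: 20 VMs replicated between two offices, plus 10 servers protected to Azure.
print(monthly_cost(20, 10))  # -> 860 (USD/month)
```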


Azure Site Recovery is secure and reliable

Azure Site Recovery continuously monitors the replication and health of the protected workloads from Azure. If data cannot be replicated, you can configure alerts to notify you by email. Protecting the privacy of your data is a top priority in Site Recovery: all communication between your on-premises environment and Azure is sent over SSL-encrypted channels, and all of your data is encrypted both in transit and at rest in Azure. Azure Site Recovery also lets you run a test failover without impacting your production workloads.

For these reasons, companies should consider adding Azure Site Recovery to their business continuity and disaster recovery toolbox.

[If you’re looking for more Microsoft resources, download our recent webinar on strategies for migrating to Office 365]


By Justin Gallagher, Enterprise Consultant

Microsoft adds Red Hat Linux, Containers and OneOps options to Azure

Microsoft has launched a trio of initiatives aimed at widening the options available to potential clients of its Azure cloud services.

It made the announcements through the Azure Blog, promising the availability of new Red Hat Enterprise Linux ‘instances’ (i.e. units of computing resources) and a new application lifecycle manager, OneOps, and showcasing a preview of an imminent Azure Container Service.

The Red Hat Enterprise Linux instances are available from the Azure Marketplace. According to the blog, 60 percent of the images available are now Linux-based. Microsoft claims its hybrid model can be running ‘in minutes’, with Red Hat Enterprise Linux images available on Azure Marketplace on a pay-as-you-go model with hourly billing.

Among the eligible products are Red Hat Enterprise Linux, Red Hat JBoss Enterprise Application Platform, Red Hat JBoss Enterprise Web Server, Red Hat Gluster Storage and Red Hat OpenShift.

“Both Microsoft and I love Linux,” said Corey Sanders, Azure’s Director of Program Management. The new instances will help cloud users cater for on-demand workloads, development and testing, and cloud bursting in a simple, easily quantifiable system, Sanders said. The Red Hat Enterprise Linux 6.7 and 7.2 images are now live in all regions except China and US Government regions.

The imminent Azure Container Service – currently available for preview – will build on previous Docker and Mesosphere initiatives to make it easier to provision clusters of Azure Virtual Machines for containerized applications. The process will be a lot quicker since the machines will have been pre-configured with open source components, Sanders said.

Sanders also disclosed that Microsoft has certified for the Azure Marketplace a group of Linux images created by Bitnami. Meanwhile, Microsoft’s new OneOps offering on Azure, which gives clients the use of an open-source cloud and application lifecycle management platform, is a product of a collaboration with the WalmartLabs team (the IT offshoot of retail giant Walmart).

Microsoft creates Azure hub for Internet of Things

Microsoft has made its new Azure IoT Hub generally available. In a statement, it claims the new system will be a simple bridge between its customers’ devices and their systems in the cloud. It claims that the new preconfigured IoT offering, when used with the Azure IoT Suite, can be used to create a machine-to-machine network and a storage system for its data in minutes.

The new Azure IoT Hub promises ‘secure, reliable two-way communication from device to cloud and cloud to device’. It uses the open protocols widely adopted in machine-to-machine technology, such as MQTT, HTTPS and AMQPS. Microsoft claims the IoT Hub will easily integrate with other Azure services like Azure Machine Learning and Azure Stream Analytics. The Machine Learning service uses algorithms to spot patterns (such as unusual activity, hacking attempts or commercial trends) that might be useful to data scientists. Azure Stream Analytics allows data scientists and decision makers to act on those insights in real time, through a system with the capacity to simultaneously monitor millions of devices and take automatic action.
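To make the device-to-cloud path concrete, here is a minimal sketch of a device publishing one message to an IoT hub over MQTT, one of the open protocols mentioned above. This is not official Microsoft sample code: it assumes the open-source paho-mqtt package, and the hub name, device ID and key are placeholders.

```python
# Sketch: send one device-to-cloud message to an Azure IoT Hub over MQTT.
import base64, hashlib, hmac, ssl, time, urllib.parse

import paho.mqtt.client as mqtt

HUB_HOST = "myhub.azure-devices.net"    # placeholder hub name
DEVICE_ID = "mydevice"                  # placeholder device ID
DEVICE_KEY = "ZmFrZS1kZXZpY2Uta2V5"     # placeholder base64 device key

def sas_token(uri: str, key: str, ttl: int = 3600) -> str:
    """Shared access signature, following Azure's documented signing scheme."""
    expiry = int(time.time()) + ttl
    to_sign = f"{urllib.parse.quote(uri, safe='')}\n{expiry}".encode()
    sig = base64.b64encode(hmac.new(base64.b64decode(key), to_sign, hashlib.sha256).digest())
    return (f"SharedAccessSignature sr={urllib.parse.quote(uri, safe='')}"
            f"&sig={urllib.parse.quote(sig)}&se={expiry}")

client = mqtt.Client(client_id=DEVICE_ID, protocol=mqtt.MQTTv311)
client.username_pw_set(username=f"{HUB_HOST}/{DEVICE_ID}",
                       password=sas_token(f"{HUB_HOST}/devices/{DEVICE_ID}", DEVICE_KEY))
client.tls_set(cert_reqs=ssl.CERT_REQUIRED, tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect(HUB_HOST, port=8883)
# Device-to-cloud messages are published on this per-device topic:
client.publish(f"devices/{DEVICE_ID}/messages/events/", payload='{"temp": 21.5}', qos=1)
```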

Microsoft launched the Azure IoT Suite in September 2015 with a pledge to guarantee standards through its Certified for IoT programme, promising to verify partners that work with operating systems such as Linux, mbed, RTOS and Windows. Microsoft claims its initial backers were Arduino, Beagleboard, Freescale, Intel, Raspberry Pi, Samsung and Texas Instruments. In the three months since the IoT Suite’s launch it has added ‘nearly 30’ more partners, it claims, notably Advantech, Dell, HPE, and Libelium.

“IoT is poised for dramatic growth in 2016 and we can’t wait to see what our customers and partners will continue to build on our offerings. We’re just getting started,” wrote blog author Sam George, Microsoft’s partner director for Azure IoT.

Containers at Christmas: wrapping, cloud and competition

As anyone who’s ever been disappointed by a Christmas present will tell you – shiny packaging can be very misleading. As we hear all the time, it’s what’s inside that counts…

What, then, are we to make of the Docker hype, centred precisely on shiny, new packaging? (Docker is the vendor that two years ago found a way to containerise applications; other types of containers – operating system containers – have been around for a couple of decades.)

It is not all about the packaging, of course. Perhaps we should say that what matters most is what the package is placed on, and how it is managed (amongst other things)?

Regardless, containers are one part of a changing cloud, data centre and enterprise IT landscape, with the ‘cloud native’ movement widely seen as driving a significant shift in enterprise infrastructure and application development.

What the industry is trying to figure out, and what could prove the most disruptive angle to watch as more and more enterprises roll out containers into production, is the developing competition within this whole container/cloud/data centre market.

The question of competition is a very hot topic in the container, devops and cloud space. Nobody could have thought the OCI co-operation between Docker and CoreOS meant they were suddenly BFFs. Indeed, the drive to become the enterprise container of choice now seems to be at the forefront of both companies’ plans. Is this, however, the most dynamic relationship in the space? What about the Google-Docker-Mesos orchestration game? It would seem that Google’s trusted container experience is already allowing it to gain favour with enterprises, with Kubernetes taking a lead. And with CoreOS in bed with Google’s open source Kubernetes, placing it at the heart of Tectonic, does this mean that CoreOS has a stronger play in the enterprise market than Docker? We will wait and see…

We will also wait and see how the Big Cloud Three will come out of the expected container-driven market shift. Somebody described AWS as ‘a BT’ to me…that is, the incumbent who will be affected most by the new disruptive changes brought by containers, since it makes a lot of money from an older model of infrastructure….

Microsoft’s container ambition is also being watched closely. There is a lot of interest from both the development and IT Ops communities in its play in the emerging ecosystem. At a recent meet-up, an Azure evangelist had to field a number of deeply technical questions about exactly how Microsoft’s containers fare next to Linux’s. The question is whether, when assessing who will win the largest piece of the enterprise pie, this will prove the crux of the matter.

Containers are not merely changing the enterprise cloud game (with third place Google seemingly getting it very right) but also driving the IT Ops’ DevOps dream to reality; in fact, many are predicting that it could eventually prove a bit of a threat to Chef and Puppet’s future….

So, maybe kids at Christmas have got it right….it is all about the wrapping and boxes! We’ll have to wait a little longer than Christmas Day to find out.

Written by Lucy Ashton, Head of Content & Production, Container World

Azure Backup gets fine-tuned with speed, cache and retention improvements

Microsoft’s Azure has promised more speed, lower cache demands and better data retention among a range of improvements to its cloud backup services for enterprise data.

Azure Backup now uses a Windows technology called the Update Sequence Number (USN) Journal to track the files that have changed between consecutive backups. The journal records changes to files and directories on the volume, which helps identify changed files quickly.

The upshot of this tweak is a faster backup time. “We’ve seen up to a 50% reduction of backup times when using this optimization,” said Giridhar Mosay, Azure’s Program Manager for Cloud and Enterprise. Individual file server backup times will vary according to the number and size of files and the directory structure, Mosay warned.
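Conceptually, the optimisation works like the toy sketch below. This is an illustration of the idea only, not the Windows USN Journal API: each file change is stamped with an ever-increasing sequence number, so a backup job only has to ask for records newer than the number it recorded at the previous backup.

```python
# Toy model of change tracking with an update sequence number (USN).
# The file system appends a journal record each time a file changes; a
# backup job remembers the highest USN it processed and, next time,
# copies only files with newer records -- no content scanning required.
journal = [
    (101, "/data/reports/q1.xlsx"),
    (102, "/data/readme.txt"),
    (107, "/data/reports/q1.xlsx"),   # changed again after the last backup
]

def changed_since(last_backup_usn: int) -> set[str]:
    """Paths touched since the previous backup."""
    return {path for usn, path in journal if usn > last_backup_usn}

print(changed_since(102))  # {'/data/reports/q1.xlsx'}
```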

A new algorithm for computing metadata has cut the amount of cache space needed for each Azure Backup by 66%. The standard allocation of 15% of the size of the volume being backed up to Azure had proved prohibitive for volumes greater than 10TB. The new algorithm makes cataloguing the file space to be backed up much more efficient, creating so much less metadata that it demands only 5% cache space, or less. Azure is now reducing its requirement for cache space to a third of the old level.
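A quick worked example using the percentages above makes the change concrete:

```python
# Worked example of the cache-space reduction described above.
volume_tb = 10                  # size of the volume being backed up
old_cache = 0.15 * volume_tb    # former requirement: 15% of volume size
new_cache = 0.05 * volume_tb    # new requirement: 5%, a third of the old level
print(f"{old_cache} TB -> {new_cache} TB")  # 1.5 TB -> 0.5 TB for a 10 TB volume
```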

Meanwhile the resilience of the system has improved as Azure Backup has increased the number of recovery points for cloud backups. This allows for flexible retention policies to meet stringent compliance requirements such as HIPAA (the federal Health Insurance Portability and Accountability Act of 1996) for large enterprises. The new maximum number of recovery points has increased from 366 to 9999.
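For a sense of scale, assuming one recovery point per day (the actual schedule is configurable, so this is illustrative only), the old and new caps translate roughly as follows:

```python
# Rough retention implied by the recovery-point caps, at one point per day.
old_cap, new_cap = 366, 9999
print(old_cap / 365.25, new_cap / 365.25)  # ~1 year vs ~27 years of retention
```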

Other tweaks include more timeouts across the various phases of the backup process to ensure that long-running jobs complete reliably. Cloud backups will also run a bit more efficiently as a result of decoupling the processes of cataloguing and uploading the backup data. Intermittent failures in the service that handles incremental backups have also been identified and resolved, according to Mosay. “We are continuing our journey to make Azure backup enterprise grade,” he said.

Microsoft Blog: The cloud for any app and every developer

Below is an excerpt from a recent post on the Microsoft Azure blog by Nicole Herskowitz.

At Microsoft, our vision for Azure is to enable every developer to be able to create, deploy and manage any application in the cloud, regardless of the tools, technology, architecture or platform they prefer. We continue to innovate in delivering services on Microsoft Azure, often in close partnership with leading innovators across many technologies, to ensure open source and third party offerings have first-class support on Azure. Today we’re announcing new technologies and capabilities that advance our mission to make Azure the preferred cloud for any app and every developer — from back-end cloud services to higher level platform services, to the development process itself.

For building highly scalable back-end services in the cloud many developers are turning to microservice architectures. The independent nature of these microservices offers superior application lifecycle management, performance at scale, 24×7 availability and cost efficiency compared with traditional monolithic architectures for service based apps. Today, we’re announcing the public preview of Azure Service Fabric, Microsoft’s platform for developing and operating microservice-based applications. Service Fabric also brings new innovations to microservice development with support for reliable, stateful services for low-latency partitioned data access at scale, and the Actor programming model which drastically simplifies building high-scale microservice applications.

We’ve already seen strong interest in Service Fabric with over 300 customers and partners already building on the platform during the private preview. With the availability of public preview in Azure, you can now explore the scale-out potential of Service Fabric combined with dedicated Visual Studio tooling. Today, Service Fabric is available on Azure and will extend to Windows Server, Linux and other cloud providers next year providing application portability and hybrid scenarios. To get started, download the SDK, check out our getting started videos and documentation and deploy your application to a cluster live in Azure.

For developers who want to build powerful, enterprise grade web and mobile apps that connect to data in the cloud or on-premises, Azure App Service is a highly productive platform for building scalable apps in .NET, NodeJS, PHP, Python or Java as well as engaging mobile apps for iOS, Android and Windows. Azure App Service is one of our most popular Azure services used by more than 60% of customers to host over 700,000 apps. Building on this success, today we announced new capabilities in Azure App Service including:

  • Single sign-on using EasyAuth across all app types making authentication easy, everywhere
  • Code-free interface and data design for rapid development of data-driven Node.js apps
  • API app innovations extended to all app types, eliminating the need for an API gateway


To read the entire post, click here.


Interested in learning about common migration problems with Microsoft Office 365? Download our latest on-demand webinar.


Avere-Microsoft joint effort enables Azure hybrids

Enterprise storage vendor Avere Systems is to work with Microsoft so that its Virtual FXT Edge filers can be used with Microsoft Azure.

The hardware maker, which specialises in creating storage devices that cater for hybrid cloud set-ups, says the two vendors are collaborating to make it easier and cheaper to get the qualities of the cloud from IT infrastructure that is situated ‘on premises’.

The system aims to simplify the task of providing computing power, memory and storage on demand for enterprise IT staff who are not specialists in running cloud services. The Avere technology is designed to make data held on network-attached storage (NAS) more readily accessible to Azure, so that users experience minimal latency.

The rationale, according to Avere, is that many companies want the liquidity of cloud computing but are not allowed to move their data off the company premises. Its solution was to invent a ‘virtual NAS’ system that is easy for an enterprise IT department employee to install and manage. The system is nevertheless sophisticated enough to provide multi-protocol file access (including NFS and SMB) and clustering, making it powerful enough to deliver high availability, scalable performance and capacity.

As hybrid cloud systems become the de facto standard for enterprises, it’s important that they are easy enough for IT department employees to manage, according to Nicole Herskowitz, Senior Director of Product Marketing for Microsoft Azure.

By adapting the system to work smoothly with Azure, enterprise IT department managers can deploy thousands of Azure HPC instances on demand to crunch data with low latency and no data migration. This means businesses can tap into the hyper-converged infrastructure of Azure with ease, without breaking the bank, Avere claims.

“At Avere, we’ve been dedicated to shattering the myth that organizations can’t have enterprise NAS performance in the public cloud,” said Rebecca Thompson, VP of Marketing at Avere Systems. “With Microsoft we’re helping enterprises harness the computing power of Microsoft Azure, which is used by 57% of Fortune 500 companies for big data applications.”

Red Hat launches Cloud Access on Microsoft Azure

Red Hat has followed its recent declaration of a partnership with Microsoft by announcing the availability of Red Hat Cloud Access on Microsoft Azure.

The Access service will make it easier for subscribers to move any eligible, unused Red Hat subscriptions from their data centre to the Azure cloud. Red Hat Cloud Access will combine the support relationship they enjoy with Red Hat with the cloud computing power of Azure, the software vendor said on its official blog. Cloud Access extends to Red Hat Enterprise Linux, Red Hat JBoss Middleware, Red Hat Gluster Storage and OpenShift Enterprise. The blog hints that more collaborations with Microsoft are to come.

Meanwhile, in his company blog, Azure CTO Mark Russinovich previewed the coming Azure Virtual Machine Scale Sets offering. VM Scale Sets are an Azure Compute resource that allows users to create and manage a collection of virtual machines as a set. These scale sets are designed for building large-scale services targeting big computing, big data and containerized workloads, all of which are increasing in significance as cloud computing evolves, said Russinovich.

By integrating with Azure Insights Autoscale, they provide the capacity to expand and contract to fit requirements with no need to pre-provision virtual machines. This allows users to match their consumption of computing resources to their application needs more accurately.

VM Scale Sets can be controlled within Azure Resource Manager templates and they will support Windows and Linux platform images, as well as custom images and extensions. “When you define a VM Scale Set, you only define the resources you need, so besides making it easier to define your Azure infrastructure, this also allows Azure to optimize calls to the underlying fabric, providing greater efficiency,” said Russinovich. “To deploy a scale set, all you need is an Azure subscription.”
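As a rough sketch of what defining ‘only the resources you need’ looks like, the core of a scale set in a Resource Manager template is a single resource whose sku.capacity sets the instance count and whose virtualMachineProfile is stamped out for every instance. The abbreviated definition below is hypothetical, expressed here as a Python dict with placeholder values; the official example templates mentioned next are the authoritative reference.

```python
# Abbreviated shape of a VM Scale Set resource as it appears in an
# Azure Resource Manager template; values are placeholders and many
# required fields (network profile, API version, etc.) are omitted.
scale_set = {
    "type": "Microsoft.Compute/virtualMachineScaleSets",
    "name": "myScaleSet",
    "location": "westus",
    "sku": {"name": "Standard_A1", "capacity": 5},  # five identical VM instances
    "properties": {
        "upgradePolicy": {"mode": "Manual"},
        "virtualMachineProfile": {
            # a single profile applied to every instance in the set
            "osProfile": {"computerNamePrefix": "vmss", "adminUsername": "azureuser"},
            "storageProfile": {
                "imageReference": {"publisher": "Canonical", "offer": "UbuntuServer"}
            },
        },
    },
}
```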

Example Virtual Machine Scale Set templates are available on the GitHub repository.

Bringing the Cloud to Life with the Microsoft Experience Center

It’s often difficult to get a legitimate user experience when viewing a canned demo. That’s why I’m a big fan of the Microsoft Experience Center. It’s a mobile kit that operates out of the cloud through an Office 365 instance. This allows users to get a legitimate experience of interacting with Microsoft productivity solutions (while having access to experts to answer questions and provide guidance), because it’s not a prepared environment running over faster-than-normal internet. It runs over whatever connection the building you’re in provides, so you can get a real understanding of what the experience will be like accessing these applications from the cloud. Watch the video below where I discuss the Microsoft Experience Center in more detail, including the process, benefits, and key takeaways you’ll leave with.

If you’re interested in learning more about Microsoft Office 365, I’ll be hosting a webinar on November 18th entitled “Microsoft Office 365: Expectations vs. Reality. Strategies for Migrating & Supporting Mobile Workforces.” Register here!

This video is also available on GreenPages’ YouTube Channel



By David Barter, Practice Manager, Microsoft Technologies