Microsoft moves Dynamics AX into the cloud

Microsoft says the latest incarnation of Dynamics AX will mark its transformation from a packaged application to a cloud service.

On Thursday the vendor announced the latest release of its flagship enterprise resource planning (ERP) system will be generally available in the first quarter of 2016. The main difference, it said, is that the ERP is now a service designed for the cloud.

A public preview of the new solution will be available to customers and partners in early December. The release's new name, simply Microsoft Dynamics AX, marks a departure from branding tied to the year or version of the product, a hallmark of packaged software. From now on, the company said, the branding will underscore that Dynamics AX is a cloud-based service that will be updated regularly.

Microsoft said it will also introduce a new, simpler and more transparent subscription pricing model to make it easier for companies to buy the system as they need it. Dynamics AX will offer a new user experience that looks and works like Microsoft Office and shares information with Dynamics CRM and Office 365, according to the vendor. It will also combine near-real-time analytics powered by Azure Machine Learning with data visualisation through Power BI embedded in the application, giving users greater predictive capabilities.

In response to usability analysis, Dynamics AX will gain a browser-based HTML5 client and a new touch-enabled, modern user interface. As a cloud system, it will adopt the highly visual design principles more typical of consumer applications, according to Microsoft.

The classic rigidity of ERP systems is a thing of the past, according to Scott Guthrie, executive VP of Microsoft's Cloud and Enterprise group. “Our ambition to build the intelligent cloud comes to life with apps optimised for modern business. When you combine the hyperscale, enterprise-grade and hybrid-cloud capabilities of Microsoft Azure with the real-time insights and intuitive user experience of Dynamics AX, organisations and individuals are empowered to transform their business operations,” said Guthrie.

WANdisco’s new Fusion system aims to take the fear out of cloud migration

Software vendor WANdisco has announced six new products designed to make cloud migration easier and less risky as companies plan to move away from DIY computing.

The vendor claims its latest Fusion system creates a safety net of continuous availability and streaming back-up. Building on that, the platform offers uninterrupted migration and gives hybrid cloud systems the capacity to expand across both private and public clouds if necessary. These four fundamental capabilities are built on six new software plug-ins designed to make the transition from production systems to live cloud systems smoother, says DevOps specialist WANdisco.

The backbone of Fusion is WANdisco’s replication technology, which ensures that all servers and clusters are fully readable and writeable, always in sync and can recover automatically from each other after planned or unplanned downtime.
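
The announcement does not detail how Fusion's replication works internally, so the TypeScript sketch below is a generic illustration of the active-active idea described above, not WANdisco's implementation: every replica accepts writes, a shared agreement step puts those writes into a single global order, and each replica applies them in that order so all copies converge. The class names and the single in-process counter are invented for the example; a real system would use a distributed consensus protocol for the ordering step.

```typescript
// Generic active-active replication sketch (illustrative only, not WANdisco's code).
type Write = { seq: number; key: string; value: string };

// Stand-in for the agreement step that orders writes globally. A real system
// would run a distributed consensus protocol here, not a local counter.
class Sequencer {
  private next = 0;
  order(key: string, value: string): Write {
    return { seq: this.next++, key, value };
  }
}

class Replica {
  private data = new Map<string, string>();
  private applied = -1;          // highest sequence number applied so far
  private pending: Write[] = []; // writes that arrived out of order

  // Apply writes strictly in sequence order; buffer anything that arrives early.
  receive(write: Write): void {
    this.pending.push(write);
    this.pending.sort((a, b) => a.seq - b.seq);
    while (this.pending.length > 0 && this.pending[0].seq === this.applied + 1) {
      const w = this.pending.shift()!;
      this.data.set(w.key, w.value);
      this.applied = w.seq;
    }
  }

  read(key: string): string | undefined {
    return this.data.get(key);
  }
}

// Both sites accept writes; delivery order differs, but the agreed sequence
// numbers mean both replicas end up in the same state.
const sequencer = new Sequencer();
const siteA = new Replica();
const siteB = new Replica();

const w1 = sequencer.order("order-42", "created"); // write originating at site A
const w2 = sequencer.order("order-42", "shipped"); // write originating at site B

[w1, w2].forEach((w) => siteA.receive(w)); // in-order delivery
[w2, w1].forEach((w) => siteB.receive(w)); // out-of-order delivery

console.log(siteA.read("order-42"), siteB.read("order-42")); // "shipped" "shipped"
```

Delivering the same agreed sequence to every site is what lets each server remain writeable while still being able to recover from its peers after downtime.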

The plug-ins that address continuous availability, data consistency and disaster recovery are named Active-Active Disaster Recovery, Active-Active Hive and Active-Active HBase. The first guarantees data consistency with failover and automated recovery over any network, and prevents Hadoop cluster downtime and data loss. The second ensures consistent query results across all clusters and locations. The third aims to provide continuous availability and consistency across all locations.

Three further plug-ins address the heightened exposure created when companies move their systems from behind a company firewall onto a public cloud. These plug-ins are named Active Back-up, Active Migration and Hybrid Cloud. To supplement these offerings, WANdisco has also introduced the Fusion Software Development Kit (SDK) so that enterprise IT departments can program their own modifications.

“Ease of use isn’t the first thing that comes to mind when one thinks about Big Data, so WANdisco Fusion sets out to simplify the Hadoop crossing,” said WANdisco CEO David Richards.

Hybrid IT Governance: Automation Is Key By @Kevin_Jackson | @CloudExpo #Cloud

As cloud computing continues to grow in importance, enterprises are coming to a new realization. In their almost rampant embrace of the cost savings associated with public cloud, many are only now coming to grips with the information technology governance challenge posed by vastly different traditional and cloud computing operational models. Often referred to as hybrid IT, supporting both models has left many executives trying to cope with a lack of hybrid IT operational experience. Challenges can also include security concerns, financial management changes and even dramatic cultural changes.
This myriad of challenges translates into enterprise risk across multiple levels, namely:

read more

How data centre investments are transforming the IT industry – and why the trend won't slow down


As global spend on cloud infrastructure continues to rocket, leading cloud providers are having to up their game and invest billions of dollars in expanding their networks of hyperscale data centres.

That’s according to the latest data released by Synergy Research. The company notes the top four cloud providers – Amazon Web Services, Microsoft, Google, and IBM – have approximately 110 data centres located in 20 different countries.

More than $25 billion has been spent on recent mergers and acquisitions related to data centres, Synergy argues, with Equinix, Digital Realty, NTT and IBM leading the activity. IBM paid $2bn for SoftLayer in 2013, while NTT's spree of acquisitions over the past five years includes Dimension Data ($3.2bn), Raging Wire ($0.4bn) and e-shelter ($0.5bn). Digital Realty has over the past five years bought 365 Main ($0.7bn), Sentrum ($1.1bn) and Telx ($1.9bn), while Equinix purchased Switch and Data in 2010 ($0.7bn) and is awaiting completion of its $3.4bn purchase of TelecityGroup.

This overall trend is having a huge impact on how companies support their IT needs. As previous Synergy reports have explored, the four leading cloud infrastructure service providers are growing at rates far in excess of the market. Meanwhile, as service provider data centre spend continues to rise, outlay on enterprise data centre equipment remains static.

“This is a time of unprecedented change in the IT industry,” explained John Dinsdale, a chief analyst and research director at Synergy Research. “End users are getting access to flexible and agile IT services that they could only dream about a few years ago and CIOs are pulling back from buying and managing their own data centres.

“It’s all change,” he added. “Companies like AWS and Microsoft are now major players in enterprise IT; IBM is totally reinventing itself; companies like Equinix and NTT are amassing huge data centre footprints, while HP and Cisco are aggressively growing their cloud technology business units.

“We do not expect the rate of change to lessen over the coming years.”

PubNub Announces BLOCKS | @ThingsExpo @PubNub #IoT #Microservices

PubNub has announced the release of BLOCKS, a set of customizable microservices that give developers a simple way to add code and deploy features for realtime apps. PubNub BLOCKS executes business logic directly on the data streaming through PubNub’s network without splitting it off to an intermediary server controlled by the customer. This revolutionary approach streamlines app development, reduces endpoint-to-endpoint latency, and allows apps to better leverage the enormous scalability of PubNub’s Data Stream Network.
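
As a rough illustration of that model, the TypeScript-flavoured sketch below shows what a BLOCKS-style event handler could look like: a small function deployed into PubNub's network that enriches each message in flight, with no customer-run intermediary server involved. The request shape and method names are assumptions made for the example rather than a statement of the exact BLOCKS API.

```typescript
// Illustrative BLOCKS-style event handler (field and method names are assumptions).
// The function runs inside PubNub's network for every message published on a
// channel it is bound to, and can modify the message before delivery.

type StreamRequest = {
  message: Record<string, unknown>; // the published payload
  ok: () => Promise<StreamRequest>; // pass the (possibly modified) message on
};

export default (request: StreamRequest): Promise<StreamRequest> => {
  // Enrich the message in flight: stamp it and flag alerts as high priority.
  request.message.received_at = Date.now();
  request.message.priority = request.message.type === "alert" ? "high" : "normal";
  return request.ok(); // continue delivery to subscribers
};
```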

read more

Big Data to Smart Data | @ThingsExpo @CloudianStorage #IoT #M2M #BigData

As organizations realize the scope of the Internet of Things, gaining key insights from Big Data through the use of advanced analytics becomes crucial. However, IoT also creates the need for petabyte-scale storage of data from millions of devices. A new type of storage is required which seamlessly integrates robust data analytics with massive scale. These storage systems will act as “smart systems” that provide in-place analytics, speeding discovery and enabling businesses to quickly derive meaningful and actionable insights.
In his session at @ThingsExpo, Paul Turner, Chief Marketing Officer at Cloudian, addressed how and where we’ll see smart data first make an impact.

read more

Clouds Are Coming! | @CloudExpo @ZertoCorp #IoT #M2M #API #BigData

The cloud. Like a comic book superhero, there seems to be no problem it can’t fix or cost it can’t slash. Yet making the transition is not always easy, and production environments are still largely on premises. Taking some practical and sensible steps to reduce risk can help provide a basis for a successful cloud transition.
A plethora of surveys from the likes of IDG and Gartner show that more than 70 percent of enterprises have deployed at least one cloud application or workload. Yet a closer look at the data reveals that less than half of these cloud projects involve production workloads, which suggests there is still apprehension about using cloud for critical core infrastructure.

read more

New Canonical offering makes it cheaper to run OpenStack on Autopilot

Canonical has launched an OpenStack Autopilot system which it claims will make it so much easier to create and run clouds using open systems that it will ‘dramatically’ cut the cost of ownership. In a statement it promised that the need for staff and consultants will fall as a result of the pre-engineered simplicity built into its OpenStack-based system.

The OpenStack Autopilot is a new feature in Canonical’s Landscape management system, a platform based on Linux. The Autopilot can add hardware to an existing cloud, making it easy to grow a private cloud as storage and compute needs change.

According to Canonical, the biggest challenge for OpenStack operators today is adapting their cloud to requirements dynamically, since customers' computing demands are invariably volatile and unpredictable. The cost of doing this manually, which involves redesigning entire swathes of infrastructure, is proving prohibitive for many clients, it said. The Autopilot provides a best-practice cloud architecture and automates that entire process, it claims.

Canonical is the company behind Ubuntu, the most widely used cloud platform and the most popular OpenStack distribution. According to the latest Linux Foundation survey, 65% of large-scale production OpenStack clouds are built on Ubuntu.

The Autopilot presents users with a range of software-defined storage and networking options, studies the available hardware allocated to the cloud, creates an optimised reference architecture for that cluster and installs the cloud from scratch, according to Canonical.
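
As a purely conceptual illustration of that planning step (not Canonical's code or interface), the TypeScript sketch below shows what turning an operator's choices plus a hardware inventory into a reference architecture could amount to. The option names and the sizing heuristic are invented for the example.

```typescript
// Conceptual sketch of an Autopilot-style planning step (illustrative only).

type Machine = { hostname: string; cores: number; ramGb: number; disks: number };
type Choices = { storage: "ceph" | "swift"; network: "ovs" | "other-sdn" };
type Plan = { choices: Choices; controllers: string[]; storage: string[]; compute: string[] };

function planReferenceArchitecture(machines: Machine[], choices: Choices): Plan {
  // Invented heuristic: the three largest machines host the control plane,
  // disk-heavy nodes serve storage, and everything else runs compute.
  const byRam = [...machines].sort((a, b) => b.ramGb - a.ramGb);
  const controllers = byRam.slice(0, 3).map((m) => m.hostname);
  const storage = machines.filter((m) => m.disks >= 4).map((m) => m.hostname);
  const compute = machines
    .filter((m) => !storage.includes(m.hostname))
    .map((m) => m.hostname);
  return { choices, controllers, storage, compute };
}

// Example: four allocated machines, Ceph for storage, Open vSwitch for networking.
const plan = planReferenceArchitecture(
  [
    { hostname: "node1", cores: 16, ramGb: 128, disks: 6 },
    { hostname: "node2", cores: 16, ramGb: 128, disks: 6 },
    { hostname: "node3", cores: 8, ramGb: 64, disks: 2 },
    { hostname: "node4", cores: 8, ramGb: 64, disks: 2 },
  ],
  { storage: "ceph", network: "ovs" }
);
console.log(plan);
```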

The OpenStack Autopilot is so simple to use that any enterprise can create its own private cloud without hiring specialists, according to Mark Baker, Canonical’s cloud product management leader.

“Over time the Autopilot will manage the cloud, handling upgrades and dealing with operational issues as they occur,” said Baker.