Red Hat launches Cloud Access on Microsoft Azure

Red Hat has followed its recent announcement of a partnership with Microsoft by making Red Hat Cloud Access available on Microsoft Azure.

The Access service will make it easier for subscribers to move any eligible, unused Red Hat subscriptions from their data centre to the Azure cloud. Red Hat Cloud Access will combine the support relationship they enjoy with Red Hat with the cloud computing power of Azure, the software vendor said on its official blog. Cloud Access extends to Red Hat Enterprise Linux, Red Hat JBoss Middleware, Red Hat Gluster Storage and OpenShift Enterprise. The blog hints that more collaborations with Microsoft are to come.

Meanwhile, on his company blog, Azure CTO Mark Russinovich announced a public preview of the forthcoming Azure Virtual Machine Scale Sets offering. VM Scale Sets are an Azure Compute resource that allows users to create and manage a collection of virtual machines as a set. These scale sets are designed for building large-scale services targeting big compute, big data and containerized workloads, all of which are increasing in significance as cloud computing evolves, said Russinovich.

By integrating with Azure Insights Autoscale, scale sets can expand and contract to fit demand with no need to pre-provision virtual machines. This allows users to match their consumption of computing resources to their application needs more accurately.

VM Scale Sets can be controlled within Azure Resource Manager templates and they will support Windows and Linux platform images, as well as custom images and extensions. “When you define a VM Scale Set, you only define the resources you need, so besides making it easier to define your Azure infrastructure, this also allows Azure to optimize calls to the underlying fabric, providing greater efficiency,” said Russinovich. “To deploy a scale set, all you need is an Azure subscription.”
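To make the idea concrete, here is a minimal sketch of what such a scale set definition might look like, written as a Python script that emits the ARM JSON. The resource name, image reference, API version and VM size are illustrative assumptions rather than details from the announcement, and a deployable template would also need network settings and parameters.

```python
import json

# Illustrative ARM template fragment defining a VM Scale Set.
# Names, location, API version and sizes are placeholder assumptions;
# a deployable template would also define a networkProfile.
template = {
    "$schema": "https://schema.management.azure.com/schemas/"
               "2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [{
        "type": "Microsoft.Compute/virtualMachineScaleSets",
        "name": "demoScaleSet",
        "apiVersion": "2015-06-15",
        "location": "westus",
        # "capacity" is the number of VM instances in the set.
        "sku": {"name": "Standard_A1", "tier": "Standard", "capacity": 3},
        "properties": {
            "upgradePolicy": {"mode": "Manual"},
            # A single profile describes every VM; Azure fans it out.
            "virtualMachineProfile": {
                "osProfile": {
                    "computerNamePrefix": "demo",
                    "adminUsername": "azureuser",
                },
                "storageProfile": {
                    "imageReference": {
                        "publisher": "Canonical",
                        "offer": "UbuntuServer",
                        "sku": "14.04.2-LTS",
                        "version": "latest",
                    }
                },
            },
        },
    }],
}

# Write the template out, ready to hand to an Azure deployment.
with open("scaleset-template.json", "w") as f:
    json.dump(template, f, indent=2)
```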

Example Virtual Machine Scale Set templates are available on GitHub.

2016 Conference Tracks Announced | @CloudExpo @ThingsExpo #IoT #M2M

There are over 120 breakout sessions in all, with Keynotes, General Sessions, and Power Panels adding to three days of incredibly rich presentations and content. Join @ThingsExpo conference chair Roger Strukhoff (@IoT2040), June 7-9, 2016 in New York City, for three days of intense ‘Internet of Things’ discussion and focus, including Big Data’s indispensable role in IoT, Smart Grids and Industrial Internet of Things, Wearables and Consumer IoT, as well as (new) IoT’s use in Vertical Markets.

It’s time to take your big data strategy beyond the data lake

A well-established – and unsettling – metric we hear anecdotally from organisations is that analysts spend 70% or more of their time hunting and gathering data and less than 30% of their time deriving insights.

The promise of big data is to reverse this 70/30 ratio. Organisations striving to build a data lake and use data for competitive advantage are seduced by the opportunity to discover new insights to better target customers, improve operations, lower costs, and discover new scientific breakthroughs.

“Big data” is no longer on the horizon; it’s here. According to Gartner, 40% of organisations have already deployed big data.  But a closer look reveals that only 13% of organisations have put their big data solution into a production environment.

Why are organisations struggling to implement big data?  We’ve spoken with customers across industries from around the world, and the barriers to deploying big data in a production environment tend to fall into three categories: findability of data; simplifying data access and making the data more consumable for users; and protecting data privacy and security.

Findability

Big data systems let organisations ingest any type of data, such as social media, clickstream, wearable devices, images, emails, documents, and data from relational databases. This ability to break down silos and gather a wide variety of data at speed is a key strength of Hadoop.

Our research, however, suggests that organisations are struggling with how to convert raw, unstructured, and semi-structured information into linkable, consistently defined, analytics-ready digital assets. For example, how does a hospital link a particular gene variant with a patient population or a manufacturer find Tier 1 customers who are dissatisfied with their product? 

To solve this problem, a big data solution needs to be able to automatically decorate data with rich and extensible metadata including data definitions, data quality metrics, retention policies, digital rights, and provenance.  Moreover, a big data system needs to build powerful indexes to let users interactively explore vast and diverse data, generating results with sub-second performance.
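As a rough illustration of what that decoration could look like, here is a hypothetical metadata record in Python; the field names and values are invented for this sketch rather than drawn from any particular catalog product.

```python
from dataclasses import dataclass, field
from datetime import date

# A hypothetical metadata "decoration" attached to an ingested dataset.
# Real catalogs define their own, richer schemas.
@dataclass
class DatasetMetadata:
    name: str
    definition: str              # business definition of the data
    quality_score: float         # e.g. share of records passing checks
    retention_until: date        # retention policy marker
    digital_rights: str          # licensing / usage rights
    provenance: list = field(default_factory=list)  # lineage of sources

record = DatasetMetadata(
    name="patient_genomics_2015",
    definition="De-identified gene variant calls by patient cohort",
    quality_score=0.97,
    retention_until=date(2022, 1, 1),
    digital_rights="internal-research-only",
    provenance=["lab_feed_v2", "ehr_extract_2015_10"],
)
print(record.name, record.quality_score)
```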

Simplifying data access and consumability

The secondary use of data has traditionally been the preserve of data scientists and analysts who produce business intelligence reports and analytics. But as organisations strive to become more agile and data-driven, these analysts are increasingly being pushed to the limit.

Organisations we speak with are looking to empower knowledge workers with self-service access to information. Intuitive visualisation and analytics tools like Tableau and QlikView are an important part of the solution. But for self-serve data to be truly consumable, information-based workers need a catalog of curated datasets they can draw from with a simple point-and-click user interface, eliminating the requirement for advanced SQL and complex schema expertise.
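One way to picture such a catalog is as a mapping from friendly dataset names to vetted queries, so the worker picks a dataset instead of writing SQL. A minimal sketch follows; the dataset names, queries and helper function are hypothetical.

```python
# A toy curated catalog: point-and-click dataset choices resolve to
# queries written and vetted in advance by the data team.
CATALOG = {
    "Tier 1 customer complaints": (
        "SELECT customer_id, complaint_text FROM complaints "
        "WHERE tier = 1 ORDER BY created_at DESC"
    ),
    "Monthly sales by region": (
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ),
}

def run_dataset(name, connection):
    """Execute the curated query behind a dataset choice."""
    return connection.execute(CATALOG[name]).fetchall()

# Usage, assuming a database in which the referenced tables exist:
#   import sqlite3
#   conn = sqlite3.connect("warehouse.db")
#   rows = run_dataset("Monthly sales by region", conn)
```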

Protecting privacy and security

Ironically, one of the most common concerns we come across in discussions with large organisations – and also perhaps the biggest barrier to deploying big data in a production environment – is the concern that “If I can easily consolidate my organisation’s data for secondary use, what prevents anyone from seeing everything?”

Big data systems are notoriously weak in managing information privacy and security. Following an extensive review of industry best practices, we believe that the globally ratified Privacy by Design framework presents a powerful seven-point model for scalable system design.

A big data system should be able to control who is allowed to see and do what with the data.  For example, a CEO may be allowed to view but not download top secret data; the Accounts Receivable department can view a fully identified dataset but download only de-identified data; and a researcher may be able to view a de-identified dataset and collaborate with an external partner who can only see a subset of the de-identified data.  More advanced systems can also enforce who can see and do what on mobile devices or from outside the corporate firewall.

A scalable and secure system should be able to de-identify information on the fly and control who is allowed to see and do what, depending on user authorisations. A robust security model that reduces the risk of a data breach should be able to set universal and consistent policy enforcement rules.
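A toy sketch of that kind of rule enforcement follows; the roles, actions and masking logic are illustrative assumptions rather than any vendor's actual security model, but they mirror the CEO and accounts receivable examples above.

```python
# Policy table: (role, action) -> the form of data the caller receives.
# Pairs absent from the table are denied outright.
POLICIES = {
    ("ceo", "view"): "identified",
    ("accounts_receivable", "view"): "identified",
    ("accounts_receivable", "download"): "de-identified",
    ("researcher", "view"): "de-identified",
}

def de_identify(rows):
    """De-identify on the fly; here we simply drop the name column."""
    return [{k: v for k, v in row.items() if k != "name"} for row in rows]

def fetch(role, action, rows):
    level = POLICIES.get((role, action))
    if level is None:
        raise PermissionError(f"{role} may not {action} this dataset")
    return rows if level == "identified" else de_identify(rows)

rows = [{"name": "Ada Lovelace", "balance": 120.0}]
print(fetch("researcher", "view", rows))  # balance only, no name
try:
    fetch("ceo", "download", rows)        # not permitted by the policy
except PermissionError as err:
    print(err)
```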

Start at the data lake – but don’t stop there

So by all means, start your big data strategy and implement your data lake. Just don’t stop there. Look for a solution that has a flexible, comprehensive metadata infrastructure out-of-the-box that lets you quickly find and link the right information; gives your end-users self-service access to data without requiring them to become experts in SQL and complex database schemas; and universally and consistently enforces fine-grained privacy and security.

Does this sound hard? It doesn’t have to be. You can certainly grow your internal development team to build your data lake from a commercial Hadoop distribution, but there are also players who understand the problem and are building a supported best-of-breed solution. Whichever way you go, it’s time to become more data-driven. So go ahead and enjoy the lake. The water’s fine.

Cloud-Enabled Innovation Will Empower and Disrupt By @DHDeans | @CloudExpo #Cloud

Today, CEOs in all industries and geographies recognize that in 2016 they’ll have an important choice to make regarding the development of a cohesive digital business transformation agenda — either be empowered by superior cloud-enabled innovations, or risk being disrupted by more progressive market leaders.
The savvy business technology application leaders are already moving from proof-of-concept cloud computing environments to trusting these platforms with their mission-critical workloads. According to the latest worldwide market study by Cisco Systems, this pervasive trend continues to accelerate as demand for cloud resources increases exponentially.

Cloud Expo Summary By @coyboy02 | @CloudExpo #IoT #Cloud #DevOps

Day 1 of Cloud Expo in Santa Clara, California opened with an inspiring keynote from Jason Bloomberg, president of Intellyx and an industry expert on architecting agility for enterprise environments.
The ProfitBricks team was especially impressed with Jason’s talk about how digital transformation involves more organizational change than simple technology change, and how today’s businesses need to capitalize on disruption not only to innovate at a more rapid pace but also to drive organization-wide transformation. He further discussed how employing modern digital practices when architecting infrastructure solutions, such as self-organization and DevOps, can provide a blueprint for reimagining how ongoing innovation can continually drive strategic value in new ways.

Funding of Dell EMC acquisition could scupper deal – report

Dell’s $67 billion merger with storage giant EMC could raise a tax challenge that would make their integration unfeasible, according to a Re/code report. But the difficulties may be reflected across the industry as cloud drives future convergence, according to one analyst.

Dell’s funding of the EMC takeover, using a new type of stock share, is under regulatory review and could lead to a $9 billion tax bill. The tax logic is built on the success enjoyed by EMC’s investment in the software vendor VMware, whose value rose by ‘tens of billions of dollars’ after EMC acquired it in 2003.

Dell plans to offer EMC shareholders $33.15 a share for the company, with $24.05 in cash and the balance from tracking stock linked to VMware. EMC owns an 81% stake in VMware. The tracking stock would offset the debt Dell would otherwise be burdened with and help Dell avoid tax.

But the scheme is likely to be reviewed by federal regulators, who may revise the tax burden to as much as $9 billion, according to the report’s sources.

It reports that Dell management are trying to ensure that key aspects of the deal don’t qualify for the level of transaction tax that would make the merger fail. Dell has reportedly hired the New York firm of Simpson Thacher & Bartlett to influence events in Dell’s favour.

The deal isn’t off yet, but Dell’s funding options are a bit brazen, said analyst Clive Longbottom at Quocirca. “Dell would do better to sell 61% of the VMware shares direct to the market instead,” said Longbottom.

“Making VMware a totally separate business would bring in $20 billion, would be totally legal and would result in a much lower tax bill,” said Longbottom. The wider issue for the cloud industry, he said, is the scale of consolidation being driven by the cloud and the lengths to which companies are being forced to go to finance it. “These tracking shares worry me. It seems to be a case of smoke and mirrors,” said Longbottom.

“This was never going to be an easy deal and Michael Dell has already stated that it will be a minimum of 9 months before it can close. It then comes down to whether the US government decides that having EMC and Dell survive is a good thing and whether other vendors are also happy for the government to allow this to happen.”

Microsoft launches cloud-based blockchain tech for would-be bitcoin traders

Microsoft has launched a cloud-based ledger system for would-be bitcoin traders. The Azure service provider aims to ease traditional bankers and financial services companies into a new, increasingly legitimised market as it gains currency in the world’s finance centres.

The system uses blockchain technology provided by New York-based financial technology start-up ConsenSys. It will give financial institutions the means to create affordable testing and proof-of-concept models as they examine the feasibility of bitcoin trading.

Blockchain technology’s ability to secure and validate any exchange of data will help convince compliance-constrained financial institutions that this form of trading is no more dangerous than any other high-speed automated trading environment, according to Microsoft. In a bitcoin system the ConsenSys blockchain will be used as a large decentralized ledger which keeps track of every bitcoin transaction.

Cloud technology could aggregate sufficient processing power to cater for all fluctuations in demand for capacity in online trading. In turn, this means that the IT service provider, whether internal or external, can guarantee that every transaction is verified and shared across a global computer network. The omnipresence of the blockchain reporting system makes it impossible for outside interference to go unmonitored.
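The mechanics behind that tamper-evidence can be sketched in a few lines: each block commits to the hash of its predecessor, so altering any recorded transaction invalidates every subsequent block. This is a toy illustration of the general idea, not ConsenSys’s or Ethereum’s implementation.

```python
import hashlib
import json
import time

def block_hash(body):
    # Hash a block body deterministically (sorted keys, stable encoding).
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"prev": prev, "time": time.time(), "tx": transactions}
    chain.append(dict(body, hash=block_hash(body)))

def verify(chain):
    prev = "0" * 64
    for block in chain:
        body = {"prev": block["prev"], "time": block["time"], "tx": block["tx"]}
        if block["hash"] != block_hash(body) or block["prev"] != prev:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, [{"from": "alice", "to": "bob", "amount": 0.5}])
append_block(chain, [{"from": "bob", "to": "carol", "amount": 0.2}])
print(verify(chain))                # True
chain[0]["tx"][0]["amount"] = 99.0  # tamper with recorded history
print(verify(chain))                # False: later hashes no longer match
```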

The Microsoft blockchain service, launched on November 10th, also uses Ethereum’s programmable blockchain technology, which will be delivered to existing banking and insurance clients already using Microsoft’s Azure cloud service. According to Microsoft, four global financial institutions have already subscribed to the service.

Until now, blockchain has been the ‘major pain point’ in bitcoin trading, according to Marley Gray, Microsoft’s director of tech strategy for financial services. Gray told Reuters that cloud technology had made blockchain affordable and easy enough to adopt: it now takes only 20 minutes and no previous experience to spin up a private blockchain. Microsoft said it has simplified the system with templates it created, used in combination with its cloud-based flexible computing model.

The new testing systems made possible by ConsenSys create a ‘fail fast, fail cheap’ model that allows finance companies to explore the full range of possibilities of this new type of trading, said Gray.

SAP claims to have simplified its Business Suite for all enterprise cloud users

SAP has updated the suite of on-premise and cloud editions included in its Business Suite 4 SAP HANA range, which it promised will simplify processes in a range of functions and lines of business.

The announcement of improvements to its SAP S/4HANA Enterprise Management system was made at SAP TechEd in Barcelona.

SAP claimed it has invented new, simpler and faster ways of using its systems in eight major areas that affect users working in finance, sales, service, marketing, commerce, procurement and sourcing, manufacturing, supply chain, asset management, research and development and human resources.

Finance, sales and purchasing staff will benefit from new ways of optimizing working capital, with new accounts payable and receivable cockpits. In the supply chain, workers can benefit from a new system with fewer stock buffers and a simplified data model for inventory management. This, says SAP, will speed things up by catering for more real-time, high-volume processing. In other departments, procurement, sourcing and supply chain management professionals will benefit from new levels of instant insight into stock and material flow.

Meanwhile, in other parts of the enterprise, cloud users in production departments will benefit from shorter manufacturing cycles, as a result of streamlined material flow for internal requirements and for material requirements planning. Project managers and supervisors will become more productive, claims SAP, thanks to augmented reactivity with real-time monitoring of production orders for flow and critical issues.

Operations managers, on the other hand, will make better operational decisions with easier simulation of supply alternatives. Buyers will be able to lower their procurement costs as a result of advances in standard integration with the Ariba Network. Meanwhile, on the shop floor, enterprises will be able to offer better customer service thanks to a new sales order fulfilment cockpit that can identify and clear bottlenecks instantly.

These line-of-business improvements will be found across the SAP portfolio in applications such as SAP Cash Management, SAP SuccessFactors, the Ariba Network in procurement and SAP hybris systems for marketing and commerce.

“We worked together closely to identify where digitized operations can provide the most value,” said Bernd Leukert, member of the Executive Board of SAP SE, Products & Innovation.

Bringing the Cloud to Life with the Microsoft Experience Center

It’s oftentimes difficult to get a legitimate user experience when viewing a canned demo. That’s why I’m a big fan of the Microsoft Experience Center. It’s a mobile kit that operates out of the cloud through an Office 365 instance. This allows users to get that legitimate experience of interacting with Microsoft productivity solutions (while having access to experts to answer questions and provide guidance), because it’s not a prepared environment, nor is it running over unusually fast internet. It’s running over whatever the building you’re in is providing, so you can get a real understanding of what the experience will be like accessing these applications from the cloud. Watch the video below, where I discuss the Microsoft Experience Center in more detail, including the process, benefits, and key takeaways you’ll leave with.

If you’re interested in learning more about Microsoft Office 365, I’ll be hosting a webinar on November 18th entitled “Microsoft Office 365: Expectations vs. Reality. Strategies for Migrating & Supporting Mobile Workforces.” Register here!

This video is also available on GreenPages’ YouTube Channel

By David Barter, Practice Manager, Microsoft Technologies

Join Parallels at Angelbeat in Charlotte and Philadelphia!

The Parallels team has just returned from Angelbeat Miami and is already planning the next big Angelbeat events—one in Charlotte on November 10, 2015 and one in Philadelphia on November 16, 2015. Will you be there? Ron Gerber, CEO of Angelbeat, will once again welcome the most relevant technology experts. Developers, business analysts, and IT managers […]
