Volkswagen moves to OpenStack platform with start-up Mirantis

Car manufacturer Volkswagen Group has chosen OpenStack as the global standard for its next-generation private cloud platform, as part of a worldwide standardization project to reduce IT costs and increase automation.

Volkswagen signed the deal with start-up Mirantis ahead of the industry's major players. The move represents one of the biggest wins to date for the start-up, and it would appear that Red Hat has lost out on a healthy deal in the process. Volkswagen is currently a Red Hat customer, though it is not clear to what degree that relationship will continue.

“As the automotive industry shifts to the service economy, Volkswagen is poised for agile software innovation,” said Mario Müller, VP IT Infrastructure at Volkswagen. “The team at Mirantis gives us a robust, hardened distribution, deep technical expertise, a commitment to the OpenStack community, and the ability to drive cloud transformation at Volkswagen. Mirantis OpenStack is the engine that lets Volkswagen’s developers build and deliver software faster.”

Volkswagen highlighted that the move to a private cloud platform will enable the business to compete better in an ever more digitally enabled world. Müller said four trends drove the company towards a more agile cloud computing platform, which also enables greater automation and a less time-consuming procurement process.

“Ubiquitous connectivity means we’ll have 50 billion smart sensors in end devices by 2030,” said Müller. “Cloud computing means data access everywhere. That means the amount of stored data doubles every two years. Third, social media. We have 1.3 billion Facebook users today, heading towards 7 billion. And big data. We can do real-time analysis of mass amounts of data.”

Volkswagen is moving its infrastructure first to an Infrastructure-as-a-Service (IaaS) model, with Platform-as-a-Service (PaaS) as the end state. Under IaaS, Volkswagen manages the middleware, runtime, data and applications, while Mirantis manages the operating system, virtualization layer, servers, storage and networking; under PaaS, the company will manage only data and applications. The transition to IaaS was completed at the end of 2015, and the company aims to have PaaS up and running by July of this year.
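
To make the service-model split concrete, the division of responsibilities described above can be sketched as a simple lookup table. This is an illustration of the standard IaaS/PaaS layering only; the layer names and code are hypothetical, not Volkswagen's or Mirantis' actual tooling.

```python
# Hypothetical sketch of the IaaS/PaaS responsibility split described above.
RESPONSIBILITY = {
    "iaas": {
        # Under IaaS, Volkswagen keeps everything above the operating system.
        "volkswagen": {"applications", "data", "runtime", "middleware"},
        "mirantis": {"operating_system", "virtualization",
                     "servers", "storage", "networking"},
    },
    "paas": {
        # Under PaaS, Volkswagen manages only data and applications.
        "volkswagen": {"applications", "data"},
        "mirantis": {"runtime", "middleware", "operating_system",
                     "virtualization", "servers", "storage", "networking"},
    },
}

def who_manages(model: str, layer: str) -> str:
    """Return which party manages a given layer under a given service model."""
    for party, layers in RESPONSIBILITY[model].items():
        if layer in layers:
            return party
    raise ValueError(f"unknown layer: {layer!r}")

print(who_manages("iaas", "middleware"))  # -> volkswagen
print(who_manages("paas", "middleware"))  # -> mirantis
```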

“First, it’s a service (the current IaaS platform) and not simply dedicated hardware,” said Müller. “The target VW internal audience is administrators and technical competence centres. It’s not designed for end users. The IaaS services provide virtualized hardware compute, networking and storage running on Linux with root access. Connectivity is via VW’s intranet. It’s not yet connected to the Internet. It doesn’t support legacy applications and we don’t yet offer central backups. It’s available to our teams in America, Europe and Asia.”

The deal represents a major win for Mirantis, which previously counted Red Hat as one of its investors. In two rounds of fund-raising in January and June 2013, Mirantis raised $10 million in growth capital funding from various venture capitalists, as well as a further $10 million from Red Hat, Ericsson and SAP. The company then launched its own OpenStack distribution in October 2013, putting it in direct competition with Red Hat, though the technology still worked with Red Hat operating systems.

The relationship between the two companies appears to have soured in recent years: in May 2014, Red Hat announced it would no longer provide support to Linux customers using non-Red Hat versions of OpenStack, a move at odds with the spirit of the open source community, and in November of that year the company also ordered all employees to stop working with Mirantis. The saga does not appear to have affected Mirantis’ standing in the market.

“OpenStack is the open source cloud standard offering companies a fast path to cloud innovation,” said Marque Teegardin, SVP at Mirantis. “It is our privilege to partner with Europe’s largest automaker and we are thrilled to support them as they use the software to out-innovate competitors and expand their business on a global scale.”

NHK World Featured @ThingsExpo | #IoT #M2M #BigData #InternetOfThings

NHK, Japan’s public broadcaster, will feature the upcoming @ThingsExpo Silicon Valley in a special ‘Internet of Things’ and smart technology documentary to be filmed on the expo floor from November 3 to 5, 2015, in Santa Clara. NHK is Japan’s sole public TV network, equivalent to the BBC in the UK and the largest broadcaster in Asia, with many award-winning science and technology programs. The program, to be aired during the peak viewership season of the year, is expected to have a major impact on the industry in Japan. The film’s director is writing a scenario to fit the story, which will be turned in to the network in the next few days.

SDN and the software defined data centre: Opportunities and challenges ahead

Imagine you had just arrived at London Euston railway station and wanted to walk to Leicester Square. A map from a well-known search engine provider reliably informs you that it is roughly a 30-minute jaunt, but if you did not have that option, you could ask passers-by for directions ad nauseam at each junction until – assuming you do not get duff advice – you reach your destination.

This is a good analogy for the state of more traditional networks – ‘move to the next hop in the network and a decision is made’ – versus software defined networking (SDN) with a controller, according to Michael Allen, EMEA VP and chief technical officer of network monitoring provider Dynatrace. He sees SDN, and its close sibling the software defined data centre (SDDC), as the next flavour of virtualisation, but one that remains a little out of reach for large enterprises.
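
A minimal sketch makes the analogy concrete. The topology, node names and link costs below are invented for illustration; the point is only that hop-by-hop forwarding decides one junction at a time, while a controller with a global view computes the whole route up front.

```python
import heapq

# Hypothetical topology: node -> {neighbour: link cost}.
GRAPH = {
    "euston":       {"holborn": 4, "kings_cross": 3},
    "kings_cross":  {"holborn": 6},
    "holborn":      {"leicester_sq": 4},
    "leicester_sq": {},
}

def hop_by_hop(src, dst):
    """Traditional forwarding: at each junction, greedily take the
    cheapest-looking next hop (like asking passers-by for directions).
    On a general topology this can loop or pick a poor route."""
    path, node = [src], src
    while node != dst:
        if not GRAPH[node]:
            return None  # dead end: we got duff advice
        node = min(GRAPH[node], key=GRAPH[node].get)
        path.append(node)
    return path

def controller_path(src, dst):
    """SDN-style: the controller knows the whole graph and runs
    Dijkstra's algorithm to pick the cheapest end-to-end route."""
    queue, visited = [(0, src, [src])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return path
        if node in visited:
            continue
        visited.add(node)
        for nxt, c in GRAPH[node].items():
            heapq.heappush(queue, (cost + c, nxt, path + [nxt]))
    return None

print(hop_by_hop("euston", "leicester_sq"))       # greedy route, total cost 13
print(controller_path("euston", "leicester_sq"))  # optimal route, total cost 8
```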

“Very, very few enterprises are doing it today, [but] many are looking at it,” Allen tells CloudTech. “It has the eye of the CTO office, it is the next wave of virtualisation, there’s a lot of promise behind it… [but] I have not seen one large enterprise introduction where we’re having to actually plug into an SDN-enabled network. So I think it’s still very much at the hype point.”

This view of the software defined data centre is shared by the analyst houses. A report from Gartner in September argued that, long term, it would be crucial for IT organisations: by 2020, the programmatic capabilities of an SDDC will be required by three in four Global 2000 enterprises looking to deliver a DevOps and hybrid cloud approach.

Allen compares where software defined computing stands today with where cloud was two years ago. “With cloud, I see so many large enterprises doing that today, putting mission critical platforms on public cloud,” he explains. “That’s where I see SDN today. When we talk about data centre virtualisation, it first started off with rationalisation, and then consolidation of workloads, and then we started to move into the more dynamic nature of it.

“I think today when we’re looking at SDN we’re starting to see more of the static nature of that, VPNs, and the tunnels being set up,” he adds. “The dynamic nature of it in terms of the control, of the intelligence – maybe that’s a couple of years out.”

Recent research appears to back that statement up. A report from 451 Research released last month argued software defined infrastructure (SDI) was a key growth area for the enterprise data centre – yet only one in five organisations polled (21%) said they had achieved SDI. Allen argues the survey results are not particularly surprising, but sounds a note of caution about taking things a step further.

“I think still today, we’re just starting to see in the last 12 to 18 months some of the initial SDN players who are kind of startup players now start to be acquired by the larger networking and optimisation players,” he says. “For the enterprises, it’s both a benefit for all of the promise having everything virtualised brings…but if we look at the complexity that dynamic virtualisation or VM migration had on trying to manage that software defined data centre, this for me is probably even worse than the complexity of just virtual machines moving around.

“If they’re in that ‘blueprint’ rigidity that keeps machines sensibly arranged with regards to physical infrastructure, you could have virtual machines running in the same web server, app server, or fabric stack,” Allen adds. “You’re not only dealing with the dynamics of the server virtualisation controllers, but you’re also dealing with the SDN controller’s decisions as well.

“So I think it’s going to be an even bigger headache for people to manage application user experience, application performance, as the network becomes more dynamic and less predictable.”

This is where Dynatrace aims to help in the form of greater visibility in the network. With the release of the latest version of its Data Center Real User Monitoring (DC RUM) software, Dynatrace aims to deliver end user and performance monitoring for software defined networks and, in time, speed up organisations’ transition to more service-oriented IT. The growing trend of shadow IT means the IT department needs to get some power back, according to Allen.

“Because we have visibility on all users, all transactions and all applications, whether it’s internally hosted, cloud hosted or third party, it allows IT to get a little bit back in control,” he explains. “We’re providing visibility getting control of all those services. What’s changed this week? Are there new services the business is using? Where are those services hosted? Are they compliant with corporate policy?

“If organisations put too many machines into a virtual LAN, you create a broadcast domain across the network that can actually end up slowing things down – if that domain is too large and too distributed, then it can cause massive issues for the physical infrastructure,” says Allen. “So providing end user performance visibility…takes those meaningful measures, like having a stopwatch sitting on the edge of the end user, and then when that performance is slowed being able to correlate it, not just to produce data that humans then have to look at, but to automatically do anomaly detection on that data.”
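
What “automatic anomaly detection on that data” might look like in its simplest form is sketched below. This is a generic rolling-baseline check, assumed purely for illustration; it is not Dynatrace’s actual algorithm.

```python
from statistics import mean, stdev

def detect_anomalies(samples_ms, window=20, k=3.0):
    """Flag response-time samples more than k standard deviations
    above the rolling baseline of the previous `window` samples -
    the 'stopwatch at the edge of the end user' idea."""
    for i in range(window, len(samples_ms)):
        baseline = samples_ms[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and samples_ms[i] > mu + k * sigma:
            yield i, samples_ms[i]

# Mostly steady page-load times with one slow outlier at the end:
latencies = [120, 118, 125, 122, 119] * 5 + [480]
print(list(detect_anomalies(latencies)))  # flags the 480 ms sample
```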

And so back to our intrepid traveller at Euston, who has decided to ditch the walking plan and go to Leicester Square by car, armed with a trusty sat nav. A roadblock has caused the software to recalibrate – and here the analogy provides a glimpse into the future challenges of software defined computing. “When you’re in a traffic jam on a road system, everyone’s sat navs behave the same,” says Allen, “so what might appear a quick route very quickly becomes highly congested – they’re not aware of the other traffic going on. I think that’s going to be a challenge moving forward. We’ve got these hybrid networks where SDN is being rolled in, but it’s not fully governing everything.”

The future of SDN and the SDDC is clearly bright – but there are still a few kinks to iron out before we get there.

Equinix launches Data Hub solution for customers on the edge

Data centre company Equinix has launched its Data Hub solution to enable enterprise customers to develop large data repositories and dispense, consume and process the data at the edge.

As part of an Interconnection Oriented Architecture, the Data Hub is a bundled solution consisting of pre-configured colocation and power combined with cloud-integrated data storage, and will work in conjunction with the company’s Performance Hub solution. The company highlighted that while Performance Hub addresses the deployment of network equipment and interconnection inside Equinix data centres, Data Hub enables the deployment of IT equipment integrated with Performance Hub.

The launch builds on a number of trends within the industry, including the growing volume of data utilized by enterprise organizations as IoT and big data capabilities are implemented. According to research from Statista, the number of connected devices is forecast to reach 50 billion units worldwide by 2020. The company believes this healthy growth in data consumption will increase the need for organizations to rethink their existing IT infrastructure and develop an Interconnection Oriented Architecture at the edge.

“Data, and its exponential growth, continues to be an ongoing concern for enterprise IT,” said Lance Weaver, VP, Product Offers and Platform Strategy at Equinix. “And there is no expectation it will slow down in the near future. To keep up with this relentless data rise, it is critical for the enterprise to rethink its IT architecture and focus on an interconnection-first strategy. Data Hub is the newest solution from Equinix and we are confident that it will provide our enterprise customers with the data management solutions they need today, while providing for growth tomorrow.”

The company has claimed a number of use cases, including cloud-integrated tiered storage, big data analytics infrastructure and multi-site deployment for data redundancy, in which an enterprise’s data is synchronously replicated across sites.
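
The multi-site redundancy case rests on synchronous replication, which can be sketched in a few lines. Everything here (the site names, the in-memory store) is a hypothetical illustration of the general pattern, not the Data Hub product or its API.

```python
class Site:
    """A stand-in for one data centre location's storage."""
    def __init__(self, name):
        self.name, self.store = name, {}

    def write(self, key, value):
        self.store[key] = value
        return True  # in reality: a durable commit plus a network ack

class SyncReplicatedStore:
    """Synchronous replication: a write is acknowledged only after
    every site has accepted it, so any site can serve reads at the
    edge with zero replication lag."""
    def __init__(self, sites):
        self.sites = sites

    def put(self, key, value):
        if not all(site.write(key, value) for site in self.sites):
            raise IOError("a replica rejected the write; not acknowledged")

    def get(self, key, site):
        return site.store[key]  # every replica holds the same data

hub = SyncReplicatedStore([Site("site-a"), Site("site-b")])
hub.put("sensor-42", {"temp_c": 21.4})
print(hub.get("sensor-42", hub.sites[1]))  # read back from the other site
```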

“With the explosive growth of mobile, social, cloud and big data, an enterprise data center strategy needs to evolve from merely housing servers to becoming the foundation for new data-driven business models,” said Dan Vesset, Group VP, Analytics and Information Management at research firm IDC. “Equinix, with its global platform of data centers and interconnection-first approach, offers the type of platform needed to create a flexible datacenter environment for innovation – specifically in the realm of data management at the network edge.”

Understanding DevOps | @DevOpsSummit @IBMDevOps #DevOps

A simple description of DevOps runs as follows: ‘An approach to Application Delivery that applies Lean principles to accelerate feedback and improve time to market.’
What does this mean? In a nutshell, it means that DevOps is a set of principles and practices that enables an organization to make its delivery of applications ‘lean’ and efficient, while leveraging feedback from customers and users to continuously improve.

Toyota and Microsoft launch connected car initiative

Japanese car brand Toyota has teamed up with Microsoft to launch Toyota Connected, a new joint venture to further the car manufacturer’s efforts towards autonomous vehicles.

Toyota Connected builds on a long-standing relationship with Microsoft, leveraging Azure cloud technology to make the connected driving experience smarter. Based in Plano, Texas, Toyota Connected will expand the company’s capabilities in data management and the development of data services.

“Toyota Connected will help free our customers from the tyranny of technology. It will make lives easier and help us to return to our humanity,” said Zack Hicks, CEO of Toyota Connected.  “From telematics services that learn from your habits and preferences, to use-based insurance pricing models that respond to actual driving patterns, to connected vehicle networks that can share road condition and traffic information, our goal is to deliver services that make lives easier.”

The connected cars market has been growing healthily in recent years, but is not new to Microsoft or Toyota: the two companies have been collaborating on telematics since 2011, working on services such as infotainment and real-time traffic updates. A 2015 report stated that connected car services will account for nearly $40 billion in annual revenue by 2020, while big data and analytics technology investments will reach $5 billion across the industry in the same period.

The new company has been given two mandates: first, to support product development for customers, dealers, distributors and partners through advanced data analytics solutions; and second, to build on Toyota’s existing partnership with Microsoft to accelerate R&D efforts and deliver new connected car solutions. The company has stated that its vision is to “humanize the driving experience while pushing the technology into the background”.

The launch of Toyota Connected will also enable the organization to consolidate R&D programs into one business unit, which it claims will ensure that all initiatives remain customer-centric. Initiatives will focus on a number of areas, including in-car services and telematics, home/IoT connectivity, personalization and smart city integration.

As part of the launch, Toyota will also adopt Microsoft’s Azure cloud computing platform, employing a hybrid solution globally, whilst also housing a number of Microsoft engineers in its offices in Plano.

“Toyota is taking a bold step creating a company dedicated to bringing cloud intelligence into the driving experience,” said Kurt Del Bene, EVP, Corporate Strategy and Planning at Microsoft. “We look forward to working with Toyota Connected to harness the power of data to make driving more personal, intuitive and safe.”

Salesforce acquires AI start-up

Salesforce is set to acquire deep learning start-up MetaMind in an effort to bolster its artificial intelligence capabilities.

While terms of the deal have not been released, it would appear to be an “acqui-hire” style agreement, as Salesforce will integrate MetaMind’s technology into its current services. Long-term intentions have not been announced, though MetaMind’s capabilities will be used in the first instance to automate and personalize customer support.

“With MetaMind and Salesforce coming together, we’ll be able to offer customers real AI solutions with breakthrough capabilities that further automate and personalize customer support, marketing automation, and many other business processes,” said MetaMind Founder Richard Socher. “We’ll extend Salesforce’s data science capabilities by embedding deep learning within the Salesforce platform.”

MetaMind’s expertise grew out of Socher’s PhD research into deep learning. The company teaches machines to recognize images and understand natural language, using models that operate in a way loosely analogous to the networks of neurons in the human brain. While these capabilities have largely been limited to internet giants such as Facebook, Google and Baidu, Socher founded MetaMind with the ethos of building “deep learning technologies anyone can use, not just the internet giants”. The company was initially funded by Salesforce CEO Marc Benioff and venture capital fund Khosla Ventures.
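
The underlying idea, layers of artificial “neurons” whose connection weights are adjusted from examples, can be shown in a deliberately tiny sketch. This toy network learning XOR via backpropagation is purely illustrative and has nothing to do with MetaMind’s actual models.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)    # input -> 8 hidden neurons
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)    # hidden -> output neuron
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):                    # plain gradient descent
    h = sigmoid(X @ W1 + b1)             # hidden activations
    out = sigmoid(h @ W2 + b2)           # the network's prediction
    d_out = (out - y) * out * (1 - out)  # error signal at the output...
    d_h = (d_out @ W2.T) * h * (1 - h)   # ...backpropagated one layer
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())              # heads towards [0, 1, 1, 0]
```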

The acquisition builds on growing AI momentum within the industry as a whole, with the giants competing for the leading spot in the emerging segment. With Microsoft, Google, IBM and Facebook all making strides in recent weeks, it should not be seen as a surprise that one of the world’s largest CRM brands is also entering the fray.

“Over the past year and a half, we’ve been on a mission to empower business users with state of the art deep learning technology to simplify, improve and automate decision making,” said Socher. “And now, we’ll be able to continue our journey at Salesforce on a much larger scale, with the resources and ecosystem of one of the world’s most innovative and influential enterprise software companies.”

For unpaid web users, MetaMind’s products will be discontinued on May 4; for paid users, on June 4. Although the long-term strategy behind the acquisition has not been made entirely clear, Socher highlighted that the MetaMind team’s research will continue and that it is still receiving CVs for new positions.

Outside of the AI space, Salesforce has also signed an agreement with NEC to establish its second data centre in Japan to support its growing customer base across the Asia-Pacific region. Japan’s public cloud service market grew to 2.6 billion yen in 2015 and is forecast to reach 6.3 billion yen by 2020.

“Salesforce’s plans to open a second data centre in the Kansai area reflects our commitment to Japan and the Asia-Pacific region,” said Shinichi Koide, CEO at Salesforce Japan. “Salesforce continues to increase its strategic investments in the market, enabling local companies to leverage the latest cloud, mobile, social, data science and IoT innovations to create connected experiences that matter to their customers.”

While Salesforce is still considered the market leader, Oracle and Larry Ellison have actively targeted Salesforce’s market share, and Oracle still appears to measure itself against Salesforce’s success. For a company that has built its reputation on innovation, it should come as no surprise that Salesforce is pursuing technologies such as artificial intelligence to bolster its product offering and reinforce its position as industry leader.

Digital Transformation: Seven Big Traps to avoid in Implementing Bimodal IT

‘Bimodal IT’ is a term coined by Gartner. It describes an approach that keeps the lights on with mission-critical but stable core IT systems (Mode 1), while taking another route (Mode 2) to deliver the innovative new applications required to digitally transform and differentiate the business.

Both streams of IT are critical. Mode 1 requires highly specialised programmers and long, detailed development cycles; control, detailed planning and process adherence take priority, and projects are technical, requiring little involvement from business teams. Mode 2 requires a high degree of business involvement, fast turnaround and frequent updates: effectively a quick sprint to rapidly transform business ideas into applications.

According to a recent survey by the analyst group, nearly 40 per cent of CIOs have embraced bimodal IT, with the majority of the remainder planning to follow in the next three years. Those yet to implement bimodal IT were tellingly those who also fared worst in terms of digital strategy performance.

If you’re one of the recently converted, you won’t want to rush blindly into bimodal IT, oblivious to the mistakes made by those who have already ventured down that path.

Based on experience over many customer projects, here are seven mistakes and misconceptions I’ve learned firms need to avoid when implementing bimodal IT:

1. Thinking bimodal IT impacts only IT – In transforming how IT operates, bimodal IT changes the way the business operates too. Mode 2 is about bringing IT and business together to collaboratively bring new ideas to market. This requires the business to be much more actively involved, as well as take different approaches to planning, budgeting and decision making.

2. Lacking strong (business) leadership – Strong IT and business leadership is absolutely critical to implementing bimodal IT. The individual responsible for operationally setting up Mode 2 needs to be a strong leader, and ideally even a business leader. That’s because the goals and KPIs of Mode 2 are so completely different from Mode 1. When Mode 2 is set up by someone with a Mode 1 mind-set, they tend to focus on the wrong things (e.g. upfront planning vs. learning as you go, technical evaluations vs. business value etc.), limiting the team’s chance of success.

3. Confusing Mode 2 with ‘agile’ – One of the biggest misconceptions about bimodal IT is that Mode 2 is synonymous with agile. Don’t get me wrong; iterative development is a key part of it. Because requirements for digital applications are often fuzzy, teams need to work in short, iterative cycles, creating functionality, releasing it, and iterating continually based on user feedback. But Mode 2’s process element extends beyond agile, encompassing DevOps practices (to achieve the deployment agility required for continuous iteration) and new governance models.

4. Not creating dedicated teams for Mode 1/2 – Organisations that have one team serving as both Mode 1 and Mode 2 will inevitably fail. For starters, Mode 1 always takes precedence over Mode 2. When your SAP production instance goes down, your team is going to drop everything to put out the fire, leaving the innovation project on the shelf. Second, Mode 1 and Mode 2 require a different set of people, processes and platforms. By forcing one team to perform double duty, you’re not setting yourself up for success.

5. Overlooking the Matchmaker role – When building your Mode 2 team, it’s important to identify the individual(s) who will help cultivate and prioritise new project ideas through a strong dialogue with the business. These matchmakers have a deep understanding of, and trusted relationship with, the business, which they can leverage to uncover new opportunities that can be exploited with Mode 2. Without them, it’s much harder to identify projects that deliver real business impact.

6. Keeping Mode 1 and 2 completely separate – While we believe Mode 1 and Mode 2 teams should have separate reporting structures, the two teams should never be isolated from each other. In fact, the two should collaborate and work closely together, whether to integrate a Mode 2 digital application with a system of record or to transfer maintenance of a digital application to Mode 1 once it becomes mission critical, requiring stability and security over speed and agility.

7. Ignoring technical debt – Mode 2 is a great way to rapidly bring new applications to market. However, you can’t move fast at the expense of accumulating technical debt along the way. It is important to ensure maintainability, refactoring applications over time as required.

While 75 per cent of IT organisations will have a bimodal capability by 2017, Gartner predicts that half of those will make a mess of it. Don’t be one of them! Avoid the mistakes above so that you implement bimodal IT properly and sustainably, with a focus on the right business outcomes to drive your digital innovation initiatives forward.

Written by Roald Kruit, Co-founder at Mendix