A tale of two ITs


Gartner calls it ‘bimodal IT’; Ovum calls it ‘multimodal IT’; IDC calls it the ‘third platform’. Whatever you choose to call it, these are all names for the same evolution in IT: a shift towards deploying more user-centric, mobile-friendly software and services that are more scalable, flexible and easily integrated than the previous generation of IT services. And while the cloud has evolved into an essential delivery mechanism for this next generation of services, it is also prompting big changes in IT itself, says Werner Knoblich, senior vice president and general manager of Red Hat in EMEA.

“The challenge with cloud isn’t really a technology one,” Knoblich explains, “but the requirements of how IT needs to change in order to support these technologies and services. All of the goals, key metrics, ways of doing business with vendors and service providers have changed.”

Much of what Knoblich says will resonate with any large organisation managing a big legacy estate while trying to adopt more mobile and cloud services; the contrast between the ‘two ITs’ can be quite jarring.

The chief goal used to be reliability; now it’s agility. In the traditional world of IT the focus was on price for performance; now it’s about customer experience. In traditional IT the most common approach to development was the classic ‘waterfall’ approach – requirements, design, implementation, verification, maintenance; now it’s all about agile and continuous delivery.

Most assets requiring management were once physical; now they’re virtual machines and microservices. The applications being adopted today aren’t the monolithic beasts of old, but modular, cloud-native apps running in Linux containers or on platforms like OpenStack (or both).

It isn’t just the suppliers that have changed, but also the way they are sourced. In the traditional world, long-term, large-scale, multifaceted deals were the norm; now there are lots of young, small suppliers, contracted on short terms or on a pay-as-you-go basis.

“You really need a different kind of IT, and people who are very good in the traditional mode aren’t necessarily the ones that will be good in this new hybrid world,” he says. “It’s not just hybrid cloud but hybrid IT.”

The challenges are cultural, organisational, and technical. According to the 2015 BCN Annual Industry Survey, which polled over 700 senior IT decision makers, over 67 per cent of enterprises plan to implement multiple cloud services over the next 18 months, but close to 70 per cent were worried about how those services would integrate with other cloud services, and 90 per cent were concerned about how they would integrate those cloud services with their legacy or on-premise services.

That said, open source technologies that also make use of open standards play a massive role in ensuring cloud-to-cloud and cloud-to-legacy integrations are achievable and, where possible, seamless – workload portability is one of the main reasons Linux containers are gaining so much traction and mind share today. And open source technology is something Red Hat knows a thing or two about.

Beyond its long history in server and desktop operating systems (Red Hat Enterprise Linux) and middleware (JBoss), the company is a big sponsor and early backer of OpenStack, the increasingly popular cloud-building software built on a Linux foundation. It helped create an open source platform as a service, OpenShift. The company is also working on Atomic Host, a slimmed-down version of RHEL designed to host Linux containers, with support for other open source container technologies including Kubernetes and Docker, the darlings of the container community.

“Our legacy in open source is extremely important and even more important in cloud than the traditional IT world,” Knoblich says.

“All of the innovation happening today in cloud is open source – think of Docker, OpenStack, Cloud Foundry, Kubernetes, and you can’t really think of one pure proprietary offering that can match these in terms of the pace of innovation and the rate at which new features are being added,” he explains.

But many companies, mostly the large supertankers, don’t yet see themselves as ready to embrace these new technologies and platforms – not just because they lack the type or volume of workloads to migrate, but because doing so requires a huge cultural and organisational shift. And cultural and organisational shifts are typically rife with political struggles, resentment, and budgetary wrestling.

“You can’t just install OpenStack or Dockerise your applications and ‘boom’, you’re ready for cloud – it just doesn’t work that way. Many of the companies that are successfully embracing these platforms and digitising their organisations set up a second IT department that operates in parallel to the traditional one, and only seed out the processes, practices and technologies they’ve embraced once critical mass is reached. Until that happens, they risk getting stuck in the traditional IT mentality.”

An effective open hybrid approach ultimately means not only embracing open source solutions and technologies, but recognising that some large, monolithic applications – say, COBOL-based mainframe apps – won’t make it into this new world; neither will the processes needed to maintain those systems.

“For some industries, like insurance for instance, there isn’t a recognised need to ditch those systems and processes. But for others, particularly those being heavily disrupted, that’s not the case. Look at Volkswagen. They don’t just see Mercedes, BMW and Tesla as competitors – they see Google and Apple as competitors too because the car becomes a technology platform for services.”

“No industry is secure from disruption, particularly from players that scarcely existed a few years ago, which is why IT will be multi-modal for many, many years to come,” he concludes.

This interview was developed in partnership with Red Hat

Transform Your Android into a Workstation

The smartphone revolution has significantly affected IT networks and software market shares. While Windows applications lead the software market, Android is quickly taking over the mobile market. According to NetMarketShare, the Windows desktop OS market share is more than 80%, while Android stands at 28% across all versions. Looking at these numbers, it is […]

The post Transform Your Android into a Workstation appeared first on Parallels Blog.

Google’s Cloud Services

In order to compete with cloud giants such as Amazon, IBM, and Microsoft, Google has created two new cloud services, Cloud Dataflow and Cloud Pub/Sub, to handle big data needs. Cloud Dataflow performs complex computations on large amounts of data in either batch or streaming mode. Cloud Pub/Sub sends and receives data between applications in the form of messages.
Cloud Dataflow product manager Eric Schmidt and Cloud Pub/Sub product manager Rohit Khare wrote in a blog post: “These fully-managed services remove the operational burden found in traditional data processing system. They enable you to build applications on a platform that can scale with the growth of your business and drive down data processing latency, all while processing your data efficiently and reliably. Every day, customers use Google Cloud Platform to execute business-critical big data processing workloads, including: financial fraud detection, genomics analysis, inventory management, click-stream analysis, A/B user interaction testing and cloud-scale ETL.”
The authors describe Cloud Dataflow as “specifically designed to remove the complexity of developing separate systems for batch and streaming data sources by providing a unified programming model.” The service builds on earlier Google innovations such as MapReduce, FlumeJava, and MillWheel.
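To make the unified model concrete, here is a minimal word-count pipeline written with the Apache Beam Python SDK, the open source descendant of the Dataflow programming model. The SDK post-dates this announcement, so treat it as an illustrative sketch of the model rather than the exact API described above; the pipeline and step names are arbitrary.

```python
# A minimal batch word count in the Dataflow/Beam programming model.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # DirectRunner locally; DataflowRunner on GCP
    (
        pipeline
        | "Read" >> beam.Create(["the cloud", "the race to zero"])
        | "Split" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```

The point of the unified model is that a streaming job keeps this same pipeline shape: only the read step and the runner change, which is the complexity Dataflow is designed to remove.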
Cloud Pub/Sub “addresses a broad range of scenarios with a single API, a managed service that eliminates those tradeoffs, and remains cost-effective as you grow, with pricing as low as 5¢ per million message operations for sustained usage.”
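For a feel of the messaging API’s shape, here is a minimal publish call using the google-cloud-pubsub Python client. The client library is a later surface than the service announced here, so this is a hedged sketch; the project and topic names are placeholders.

```python
# Publishing a single message with the google-cloud-pubsub client library.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# "my-project" and "clickstream" are placeholder names, not real resources.
topic_path = publisher.topic_path("my-project", "clickstream")

# publish() returns a future; result() blocks until the service
# acknowledges the message and returns its server-assigned ID.
future = publisher.publish(topic_path, b"page_view", user_id="42")
print("published message", future.result())
```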

The post Google’s Cloud Services appeared first on Cloud News Daily.

The race to zero: On the path to no-cost cloud computing


It seems that everyone is in a race, and nowhere is the pace of modern business more evident than in cloud computing. While prices for just about everything else are going up, cloud storage prices are nosediving. Competition between leaders such as Google, Amazon, and Box is driving prices down, and providers across the industry are being pressured to join the “race to zero”.

This race has been accelerating. Microsoft began providing unlimited storage in October 2014 for its OneDrive service included with Office 365, in response to Google Apps for Work, which provided numerous features that threatened to outdo Microsoft’s well-known business products. InformationWeek provided insight into this story, and also noted a natural check on the race: while vendors are providing ever cheaper and even free storage, managing the underlying physical resources remains expensive.

For example, Amazon made 47 price reductions in six years by late 2014, while its revenue growth slowed from about 80% to below 39% between 2011 and 2014.

Many experts are not shy to admit that storage will soon be free. However, there are ways to make the economics work, and Office 365 is an example: the product is not free and requires an annual subscription, so technically users still pay for what they get, with storage essentially bundled in. This represents another trend in the race to zero.

Cloud services worth paying for

Cloud services company Box is also on the bandwagon. Offering affordable storage, it has introduced services people and companies don’t hesitate to pay for. Security is one of them. Nobody is willing to skimp on security, as data is more vulnerable than ever. Box is also developing apps for document and project management as well as collaboration. Dropbox is another example: it is pushing Dropbox for Business with a range of security and administrative functions.

In 2014, IBM announced it was investing $1.2 billion in expanding its data centres, and Microsoft has said it is spending $750 million improving its Wyoming data centre alone. The trend is to offer services that companies pay for not in a one-off purchase but throughout the year – and the investment required to make those services compelling means only a few companies can handle the pressure.

Memory prices plummet

The falling price of memory is supporting the race to zero. According to Business Insider, a hard drive with a gigabyte of capacity cost $9,000 in 1993; in 2013, the same amount of storage cost about $0.04. Providers have also passed on savings in processing power: the $3 million IBM mainframe that supported a lunar landing mission in 1969 had 65 kilobytes of memory and operated at 0.043MHz, while the iPhone 6, at 16GB and 2.6GHz, costs about $200.
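Taken at face value, the article’s two price points imply that storage prices fell by roughly 46% a year for two decades. A quick back-of-the-envelope check, using only the figures quoted above:

```python
# Compound annual price decline implied by the article's figures:
# $9,000 per GB in 1993 versus $0.04 per GB in 2013.
start_price, end_price, years = 9_000.0, 0.04, 2013 - 1993
annual_change = (end_price / start_price) ** (1 / years) - 1
print(f"{annual_change:.1%} per year")  # prints: -46.0% per year
```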

Isn’t it really a race to the top?

It seems the race to zero is being driven in part by competition, not just by the minimal cost of memory today. Dallas Salazar, writing for Seeking Alpha, discusses this very topic: there is a logical notion that cloud storage gets people’s attention. Amazon is now offering mobile apps for Cloud Drive, a product available for $60 per year under a plan it calls Unlimited Everything; the Unlimited Photos plan is even cheaper, and the storage has no limits at all. Free storage is a pretty good selling point for larger-scale paid services as companies become more knowledgeable about cloud computing.

Whether free storage is a strategy to draw customers, or simply a passing of dirt cheap costs from provider to customer, there is no doubt the race to zero is well and truly on.

Feeling insecure? It’s time to find a data centre


By Steve Davis, marketing director, NGD

It seems hardly a day goes by without a market survey uncovering yet more shock-horror findings about companies of all shapes and sizes believing or knowing they are insecure when it comes to their data. Yet many still admit to avoiding best practice even though threat levels are higher than ever.

Equally surprising, many organisations still view colocation data centres as out of reach and soldier on with their servers and critical data on the premises. In fact, colocation data storage and hosting is more accessible than ever to businesses in this country, thanks to the tumbling cost of high-speed fibre network connectivity. This is allowing more data centre operators to locate well away from the traditional data centre heartlands of London and inside the M25, to places where real estate and labour costs are cheaper. In turn, this is creating more competition, which is being reflected in lower customer pricing.

Some very large and modern data centres now have the economies of scale to go even further, combining the benefits of a less costly location with a much reduced minimum size threshold for data hosting and storage. This also opens the door for start-ups and smaller businesses. For example, an offer of rack space, cooling, power and connectivity infrastructure for under £25 per day in a Tier 3 data centre was simply off the radar a year ago.

All aboard

So with the growing choice of good-quality, affordable colocation facilities now available, the question for companies is which colo partner to pick and how to differentiate between them. It is no longer a case of ‘how’ to keep data and businesses safe and secure; the solution to that problem is already out there.

But when it comes to data centre pricing, it’s rather like booking a flight and comparing a budget airline or charter flight with a major scheduled carrier. On the face of it, the budget and charter guys will most likely appear cheaper and better value – until you click through to the next page and see all the catches: compulsory and optional extras, inconvenient flight times, lower frequency, hold baggage weight limits, charges to sit together or for more legroom, meals and refreshments. It all starts to add up.

Certainly, sufficient space for now and the future, proven certifiable security and operational credentials, and high levels of resilience and power, along with disaster recovery and business continuity contingencies, are all very important criteria to evaluate carefully. But watch out for the small print, hidden costs, and get-out clauses.

Similarly, in their efforts to attract smaller customers into colo, some providers will be less transparent than others. Be sure to look beyond the headline deals, especially on rack power and connectivity, infrastructure level and official certifications, and the type of service level agreements on offer.

There are also practical things to consider, such as the availability of (free) parking and meeting rooms, as well as transportation and installation of your existing and new IT systems – often known as server migration. Ideally you will want a provider who can move your IT in for you, sparing you the headache of setting things up.

A few colo providers will be all-inclusive, some limited, and others will offer pretty much everything but charge for it all on top. Either way, going that extra mile of due diligence on who to select as your partner will ensure a safe and secure stay at your chosen data centre destination – without any unforeseen surprises!

Google, Microsoft punt big data integration services into GA

Big cloud incumbents are doubling down on data integration

Google and Microsoft have both announced the general release of Cloud Dataflow and Azure Data Factory, their respective cloud-based data integration services.

Google’s Cloud Dataflow is designed to integrate separate databases and data systems – both streaming and batch – in one programming model while giving apps full access to, and the ability to customise, that data; it is essentially a way to reduce operational overhead when doing big data analysis in the cloud.

Microsoft’s Azure Data Factory is a slightly different offering. It is a data integration and automation service that regulates the data pipelines connecting a range of databases and data systems with applications. The pipelines can be scheduled to ingest, prep, transform, analyse and publish data, with ADF automating and orchestrating the more complex transactions.
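As a rough illustration of the pattern such services manage, the sketch below models an ingest-transform-publish pipeline in plain Python. It is emphatically not the ADF API – ADF pipelines are authored as JSON resources and executed by the managed service – but it shows the shape of the work being automated.

```python
# A toy, in-memory illustration of the ingest -> transform -> publish
# pipeline pattern that services like Azure Data Factory schedule and
# manage. This is a conceptual sketch, not the ADF API.
from typing import Callable, Iterable

def run_pipeline(source: Iterable[dict],
                 transforms: list[Callable[[dict], dict]],
                 sink: Callable[[dict], None]) -> None:
    """Pull each record from the source, apply each transform in order,
    then hand the result to the sink."""
    for record in source:
        for transform in transforms:
            record = transform(record)
        sink(record)

# Stand-ins for a real database source and an analytics sink.
rows = [{"price": "10"}, {"price": "25"}]
run_pipeline(rows,
             transforms=[lambda r: {"price": int(r["price"])}],
             sink=print)
```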

ADF is actually one of the core components of Microsoft’s Cortana analytics offering, and is deployed to automate the movement and transformation of data from disparate sources.

The maturation and commoditisation of data integration and automation is a positive sign for an industry that has for a very long while leaned heavily on expensive bespoke data integration. As more cloud incumbents bring their own integration offerings to the table it will be interesting to see how some of the bigger players in data integration and automation, like Informatica or Teradata, respond.

Dropbox promotes former Google engineer to infrastructure lead

Dropbox has promoted one of its lead engineers, Akhil Gupta, to vice president of infrastructure, the latest in a series of executive shakeups at the firm.

Gupta has been with Dropbox for the past three years and has helped the company develop its infrastructure strategy. As vice president of infrastructure he will continue to oversee physical and technical infrastructure operations and security, and lead the company’s datacentre scale-out globally.

Before joining Dropbox Gupta spent seven years with Google, managing the search giant’s advertising technical infrastructure.

The infrastructure appointment comes amidst a broader executive shakeup at the cloud storage firm.

Last month the company hired Thomas Hansen, who most recently served as worldwide vice president of small and medium business at Microsoft where he led SME sales globally, to the newly created role of global vice president of sales & channel. It also hired former Twitter product management specialist Todd Jackson as Dropbox’s first vice president of product.

With the new hires, Dropbox is looking to bolster its position in the enterprise – the quickest route to gaining seats – against rivals like Box, which heavily targets niche verticals and large traditional organisations as well as startups and smaller firms. Dropbox claims to have over 100,000 businesses and 400 million users on its platform, while Box maintains it has closer to 44,000 organisations as customers.

Jennifer Kent of Parks Associates on IoT and healthcare

BCN spoke to Jennifer Kent, Director of Research Quality and Product Development at Parks Associates, about the anticipated impact IoT will have on healthcare.

BCN: Can you give us a sense of how big an impact the Internet of Things could have on health in the coming years?

Jennifer Kent: Because the healthcare space has been slow to digitize records and processes, the IoT stands to disrupt healthcare to an even greater extent than will be experienced in other industries. Health systems are just now getting to a point where medical record digitization and electronic communication are resulting in organizational efficiencies.

The wave of new data that will result from the mass connection of medical and consumer health devices to the Internet, as part of the IoT, will give care providers real insight for the first time into patients’ behaviour outside of the office. Parks Associates estimates that the average consumer spends less than 1% of their time interacting with health care providers in care facilities. The rest of consumers’ lives are lived at home and on-the-go, engaging with their families, cooking and eating food, consuming entertainment, exercising, and managing their work lives – all of which impact their health status. The IoT can help care providers bridge the gap with their patients, and can potentially provide insight into the sources of motivation and types of care plans that are most effective for specific individuals.

 

Do you see IoT healthcare as an essentially self-enclosed ecosystem, or one that will touch consumer IoT?

IoT healthcare will absolutely touch consumer IoT, at least in healthcare markets where consumers have some responsibility for healthcare costs, or in markets that tie provider payments to patients’ actual health outcomes. In either scenario, the consumer is motivated to take a greater interest in their own self-care, driving up connected health device and application use. While user-generated data from consumer IoT devices will be less clinically accurate or reliable, this great flood of data still has the potential to result in better outcomes, and health industry players will have an interest in integrating that data with data produced via IoT healthcare sources.

 

Medical data is very well protected – and quite rightly – but how big a challenge is this to the development of effective medical IoT, which after all depends on the ability to effectively share information?

All healthcare markets must have clear regulations that govern health data protection, so that all players can ensure their IoT programs comply with those regulations. Care providers’ liability concerns, along with the investments in infrastructure necessary to protect such data, have created an opportunity for vendors to offer solutions that take on the burden of regulatory compliance for their clients. Furthermore, application and device developers on the consumer IoT side that border very closely on the medical IoT vertical can seek regulatory approval – even if not required – as a means of attaining premium brand status with consumers and differentiation from the many untested consumer-facing applications on the market.

Finally, consumers can be motivated to permit their medical data to be shared, for the right incentive. Parks Associates data show that no less than 40% of medical device users in the US would share the data from their devices in order to identify and resolve device problems. About a third of medical device users in the US would share data from their devices for a discount on health insurance premiums. Effective incentives will vary depending on each market’s healthcare system, but care providers, device manufacturers, app developers, and others who come into contact with medical device data should investigate whether potential obstacles related to data protection could be circumvented by incentivizing device end-users to permit data sharing.

 

You’re going to be at Internet of Things World Europe (5–7 October 2015, Maritim proArte, Berlin). What are you looking forward to discussing there and learning about?

While connected devices have been around for decades, the concept of the Internet of Things – in which connected devices communicate in a meaningful way across silos – is at a very early and formative stage. Industry executives can learn much from their peers and from visionary thinkers at this stage, before winners and losers have been decided, business plans hardened, and innovation slowed. The conversations among attendees at events like Internet of Things World Europe can shape the future and practical implementation of the IoT. I look forward to learning how industry leaders are applying lessons learned from early initiatives across markets and solution types.

File Sharing in Parallels Access 3.0

There are a number of great features in the recently released Parallels Access 3.0, but the one I believe that I’ll use the most is the ability to share files directly from my Mac, without any uploading to a cloud service. Don’t get me wrong—I love the Cloud, and Dropbox has saved me more times […]

The post File Sharing in Parallels Access 3.0 appeared first on Parallels Blog.

Ode to Clippy, the Microsoft Word Paper Clip

What’s the best thing to come out of the Windows 10 launch? Believe it or not, it wasn’t a specific feature: it was this sketch from Jimmy Fallon on The Tonight Show featuring “Clippit” or “Clippy”, the handy paperclip from past versions of Microsoft Word. Talk about a Throwback Thursday! (Heads up: the below video is […]

The post Ode to Clippy, the Microsoft Word Paper Clip appeared first on Parallels Blog.