Assessing the business case for hybrid cloud services adoption


Forging a viable business technology strategy for today’s networked economy is a high priority for most forward-thinking CEOs. Their guidance to CIOs is to fuse existing IT infrastructure with modern cloud services, and to ensure the shift to a hybrid IT model supports the organization’s key commercial expansion objectives.

The savvy leaders who have a superior approach can extract greater value from their legacy IT investments, and launch new initiatives based upon public and private cloud computing services. This is the new normal – CIOs must create the optimal blended environment for purposeful technology-enabled innovation.

Primary motivation

Organizations are increasingly integrating cloud computing and storage resources with traditional IT infrastructure to accommodate dynamic needs and specific business priorities, according to a global study of 500 hybrid cloud decision makers by the IBM Center for Applied Insights.

This thought-provoking study found that improving productivity is currently the number one goal of cloud service adoption, as the most progressive senior executives plan to offload some of their IT resources and management complexity to the cloud.

A close second goal of digital transformation is improved security and risk reduction — using the flexibility of a hybrid solution to choose which workloads and data to move to the cloud and which to maintain on-premise.

The other two most mentioned goals by survey respondents are IT infrastructure cost reduction – i.e. shifting costs from fixed IT to as-needed cloud services – and scalability to handle dynamic IT workloads.

Why maturity matters

Following their detailed analysis, the IBM report authors grouped the survey respondents into three categories, based upon the maturity of their hybrid management capabilities and whether they’re reporting a strategic edge that’s realised from their hybrid cloud deployments.

Frontrunners are gaining a competitive advantage through hybrid cloud and are managing their environment in an integrated, comprehensive fashion for high visibility and control (for example, through a single data-driven dashboard). Challengers are on the journey toward competitive advantage, but haven’t fully achieved unified management of their hybrid cloud environment. Chasers are not yet using hybrid cloud to drive competitive advantage and are in the early stages of gaining integrated control over their hybrid environments.

Benefits of hybrid leadership

What’s the big advantage of being a visionary frontrunner in your industry? Frontrunners are achieving noteworthy business outcomes with hybrid cloud — such as productivity gains, including cutting operational costs and maximizing the value of existing IT infrastructure — at a higher rate than other organizations.

Frontrunners are also more effectively using hybrid cloud to drive digital business innovation, including the creation of new products and services and the expansion into new markets. They also apply hybrid cloud solutions to experiment with cognitive computing and the Internet of Things (IoT) — which have the potential to enable the development of new business models.

Moreover, the frontrunners are using hybrid solutions as a driver of business process change, with 85 percent reporting that hybrid cloud service adoption is accelerating a progressive digital transformation agenda at their companies.

Hybrid challenges and opportunities

Most frontrunners say they have achieved measurable progress from their hybrid cloud efforts. However, more than one in four have experienced difficulty executing their plan to integrate legacy IT infrastructure and cloud computing environments.

Finding and retaining the technical staff with the desired experience is often a challenge, with one in three survey respondents citing an internal skills gap as a big unresolved issue. Furthermore, while companies adopt cloud services to improve security, it remains their number one concern.

In fact, frontrunners cite management complexity and security as major obstructions to progress. Over three-quarters report that hybrid introduces greater IT management complexity into their environment, and 70 percent say that their hybrid environment causes them greater security concerns.

How are these early-adopters leveraging hybrid environments to achieve a meaningful and substantive competitive advantage? Study findings indicate that they apply a very intentional and holistic approach to implementing and managing their hybrid solutions.

Culture is a key to ongoing hybrid success

The established frontrunners also understand that the complexity of hybrid environments is best tackled through a collaborative approach to IT investment decision making — bringing both the managers of IT organizations and Line of Business (LoB) leaders together on a common cause.

The IBM study findings also uncovered that in almost three-quarters of frontrunner organizations, hybrid cloud has elevated the extent to which the CIO is now acting as a trusted adviser to the overall business leadership team.

The collective C-suite and senior IT roles collaborate on key technology decisions that impact business goals. This newfound collaboration sheds light on the benefits of Shadow IT, where the progressive LoB leadership is already using forward-looking cloud services to advance their growth agenda.

Moreover, 81 percent of the frontrunners report that hybrid cloud is helping to reduce Shadow IT growth within their organisations. This proves, once again, that savvy CIOs are the ones proactively embracing the shift to hybrid IT models, thereby regaining workload deployment momentum that was lost to prior computing and storage resource provisioning constraints.

Gartner forecasts the end of ‘no-cloud’ corporate policies by 2020


A ‘no internet’ policy would be implausible for any business today, even if the idea might have seemed reasonable when the web was in an embryonic state. Cloud policies at work will follow the same path, according to analyst house Gartner, which predicts that by 2020, a corporate ‘no-cloud’ policy will be a thing of the past.

The analysts did not stop there, however, filling out a few more betting slips in the process. Gartner also predicts that by 2019, more than 30% of the 100 largest vendors’ new software investments will be cloud-only, rather than cloud-first, while by 2020, more compute power will be sold by IaaS and PaaS providers than sold and deployed into enterprise data centres.

Naturally, this prediction does not mean that cloud versus on-premises will be entirely black and white. Gartner insists that not everything will be cloud-based, and with good reason; as this publication examined last week, some workloads will have difficulty in moving. Yet it is the idea of organisations having nothing at all in the cloud that Gartner is targeting.

Jeffrey Mann, Gartner research vice president, argues many organisations that claim to be cloud-free will doubtless be dealing with an influx of shadow IT anyway, making the position even harder to sustain. “We believe that this position will become increasingly untenable,” he said. “Cloud will increasingly be the default option for software deployment. The same is true for custom software, which increasingly is designed for some variation of public or private cloud.”

As a result, Gartner insists that hybrid is the way forward. “Enterprises and vendors need to focus on managing and leveraging the hybrid combination of on-premises, off-premises, cloud and non-cloud architectures, with a focus on managing cloud-delivered capacity efficiently and effectively,” said Thomas J. Bittman, vice president and distinguished analyst.

Cyber security top of the list for European Commission after launch of €1.8bn initiative

The European Commission has launched a new public-private partnership aimed at tackling the challenges of cyber security, and helping European companies become more competitive, reports Telecoms.com.

As part of the partnership, the EC will invest roughly €450 million and will encourage industry to contribute substantially, targeting a total investment of €1.8 billion by 2020. The new initiative will take shape around four pillars.

Firstly, the EC will encourage member states to make the most of the cooperation mechanisms under the new Network and Information Security (NIS) directive. Secondly, the EC will explore the possibility of creating a framework for certification of security products, which can then be distributed in any member state. Thirdly, the EC will establish a contractual public-private partnership with industry to nurture innovation. And finally, funds will be created to enable SMEs to source investment and scale up.

“Europe needs high quality, affordable and interoperable cybersecurity products and services,” said Günther H. Oettinger, Commissioner for the Digital Economy and Society. “There is a major opportunity for our cybersecurity industry to compete in a fast-growing global market. We call on Member States and all cybersecurity bodies to strengthen cooperation and pool their knowledge, information and expertise to increase Europe’s cyber resilience. The milestone partnership on cybersecurity signed today with the industry is a major step.”

The new strategy builds on the EC’s ‘Open, Safe and Secure Cyberspace’ strategy which was launched in 2013 to ‘protect open internet and online freedom and opportunity’. While the initiative has launched a number of new legislative actions, there would appear to be little evidence much else has been achieved aside from ‘ensuring cooperation’, ‘ensuring a culture of security’ and ‘stepping up cooperation across Europe’. While previous work has been generalist and vague, the new proposition does at least offer encouragement there will be more concrete work achieved.

The NIS directive will support strategic cooperation and the exchange of relevant information between member states, as well as creating a number of new bodies, including the EU Agency for Network and Information Security (ENISA), the EU Computer Emergency Response Team (CERT-EU) and the European Cybercrime Centre (EC3) at Europol. The plan is to deliver a blueprint during the first half of 2017, and then deliver the initiative itself in an as-yet-undefined timeframe. The EC has outlined a specific plan, though the lack of a timeframe undermines some of the credibility it has gained.

“Without trust and security, there can be no Digital Single Market. Europe has to be ready to tackle cyber-threats that are increasingly sophisticated and do not recognise borders,” said Andrus Ansip, Vice-President for the Digital Single Market. “Today, we are proposing concrete measures to strengthen Europe’s resilience against such attacks and secure the capacity needed for building and expanding our digital economy.”

Bulgarian gov writes open source into law

The Bulgarian government has passed a number of amendments to the Electronic Governance Act which require all code written for the government to be open source.

The announcement was made public through the blog of Bozhidar Bozhanov, who is currently acting as an adviser to the Deputy Prime Minister responsible for e-governance systems and policies. The new policy doesn’t mean the entire country will be moving to Linux, though it is one of the first examples of a government putting the concept of open source into legislation. Article 58.a of the act states:

“When the subject of the contract includes the development of computer programs: (a) computer programs must meet the criteria for open source software. (b) All copyright and related rights on the relevant computer programs, their source code, the design of interfaces and databases which are subject to the order should arise for the principal in full, without limitations in the use, modification and distribution. (c) Development should be done in the repository maintained by the Agency in accordance with Art.7cpt.18.”

The amendment will not impact current contracts, nor insist that the major vendors give away the source of their products; it focuses only on custom-written code. When the government procures IT services or software for which custom code will be written specifically for the project, the act ensures this code will be open sourced for the rest of the country to use.

“After all, it’s paid by tax-payers money and they should both be able to see it and benefit from it,” said Bozhanov on the blog. “A new government agency is tasked with enforcing the law and with setting up the public repository (which will likely be mirrored to GitHub).

“The fact that something is in the law doesn’t mean it’s a fact, though. The programming community should insist on it being enforced. At the same time some companies will surely try to circumvent it.”

An Overview of Current OEM Technology

Original equipment manufacturer (OEM) technology is a general term that is commonly encountered when you purchase IT products, whether hardware or software. It involves partnerships between different companies to deliver a best-of-breed solution to the end user that is significantly more cost-effective. The term original equipment manufacturer is used differently by different people. Initially, the […]

The post An Overview of Current OEM Technology appeared first on Parallels Blog.

Upgrading Parallels Desktop 8 to Parallels Desktop 11 for Mac

Time flies by fast and technology changes every day. It’s not surprising that some of us are left behind with older software versions on our computers. In most cases, the most recent version of almost any app is more […]

The post Upgrading Parallels Desktop 8 to Parallels Desktop 11 for Mac appeared first on Parallels Blog.

Analysing the evolution of the SaaS market: $50bn spending expected by 2024


The global software as a service (SaaS) market is expected to exceed $50 billion in product spending by 2024, up from $12bn this year, according to a report released by price comparison site Better Buys.
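For scale, the forecast implies a compound annual growth rate that can be sanity-checked in a few lines; the eight-year horizon from 2016 to 2024 is our assumption based on the article’s dates, not a figure from the report:

```python
# Implied compound annual growth rate for SaaS product spending,
# growing from $12bn (this year) to $50bn (2024).
start, end, years = 12e9, 50e9, 8  # eight-year horizon is assumed

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 19-20% a year
```

In other words, the forecast assumes SaaS spending compounds at close to 20% annually for the rest of the decade.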

The report, an examination of the state of the SaaS market which takes its cues from various industry reports, argues that almost two thirds (64%) of small and medium businesses rely on cloud technologies to drive growth and workforce efficiency, while more than three quarters (78%) of companies polled expect to expand the number of SaaS platforms they use in the next three years, increasing the number of apps they run accordingly.

In terms of vendors, Salesforce (11% of market share in 2015) reigns supreme, according to the Better Buys verdict, with Microsoft (8%), Adobe (6%) and SAP (5%) behind. The researchers argue that despite an increase in horizontal SaaS development, such as Salesforce and Slack, vertical-specific software remains the cornerstone of the overall market: “Subscription growth in business intelligence, security, IT, and enterprise vertical applications continue to rise.”

The researchers also give an idea of how the ‘software as a service’ landscape has evolved, arguing that the first such example was Dun & Bradstreet, founded back in 1824, whose mission was to create “a network of correspondents to serve as a source of reliable, objective credit information.” Compare that with Moodwire, founded in 2015 with the goal of deriving actionable insights for businesses from a plethora of data, analytics, web pages and social postings, and the evolution is fascinating.

The report also gives good news as to the number of jobs in the market, at least for US-based IT professionals. The top states for SaaS jobs per 100,000 people were Massachusetts (172), Maryland (162), Washington (150), Virginia (146) and Colorado (144), according to the researchers.

“SaaS providers are finding that fewer customers require education on the benefits of SaaS versus on-premise solutions as more SMBs are used to working with SaaS,” the report concludes. “This has forced SaaS providers to shift their focus from customer education to nurturing and maintaining customer relationships to keep them and minimise loss to growing competition.”

You can read the Better Buys analysis here.

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016, in which it claims only 2% of the world’s organizations would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements in the form of Safe Harbour and law agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics suggesting the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months compared with the same period prior.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be that the more protection you take on, the more protected you are. This just seems logical. However, the study shows that the more vendors you have in your stable, the more data you will leak.

The average data loss incident sees a company lose 2.36TB of data, which would be considered substantial; however, it could be worse. The study showed organizations that used one vendor lost on average 0.83TB per incident, those with two vendors 2.04TB, and those with three vendors 2.58TB. For those who used four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate that the more layers of security you have, the more secure you will be; however, this is only the case if the systems are compatible with each other. It should be highlighted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more coordinated approach to data protection.

And they are expensive…

Using the same breakdown as before, the average cost of lost data was $900,000. For those with one vendor, the cost was $636,361; with two, $789,193; and with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.
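Pairing these costs with the per-incident loss volumes above gives a rough, back-of-envelope cost per terabyte lost. The pairing is our own arithmetic on the report’s figures, not a number EMC publishes:

```python
# Rough back-of-envelope: pair EMC's per-incident data-loss volumes
# with the corresponding average costs, keyed by number of vendors
# used (4 stands in for "four or more").
tb_lost = {1: 0.83, 2: 2.04, 3: 2.58, 4: 5.47}                # TB per incident
cost    = {1: 636_361, 2: 789_193, 3: 911_030, 4: 1_767_000}  # USD per incident

for vendors in sorted(tb_lost):
    per_tb = cost[vendors] / tb_lost[vendors]
    print(f"{vendors} vendor(s): ${per_tb:,.0f} per TB lost")
```

Read this way, single-vendor incidents are the most expensive per terabyte (around $767k/TB versus roughly $323k/TB for four or more vendors), which is consistent with the larger multi-vendor losses coming from larger companies with more data at stake.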

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries which would be considered furthest ahead of the trend for data protection.

EMC graded each country on how effectively it was implementing the right technologies and culture to prevent data loss within organizations. 17 countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%); however, China and Mexico led the charge at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through something of a technology revolution, growing in the last 20 years from only 10% of people having a mobile phone to 68% this year, 70% of which are smartphones. Mexico is now the 11th largest economy in terms of purchasing power, with millennials being the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were widespread, though the causes which cannot be controlled were at the top of the list. Hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself. The study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel CTO Raj Samani, who highlighted that the attitude towards security (not just data protection) needs to shift, as there is no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people blame the vendors, it would appear this responsibility is being realized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. Such errors will never be removed entirely, as numerous day-to-day decisions are made on intuition and gut feel, which is a necessity for certain aspects of the business.

One area which could be seen as a potential danger is artificial intelligence. As AI advances, machines will become more human-like, and thus more capable of making decisions based on intuition. If that is the ambition, an intuitive decision-making machine would surely present the same kind of security defect a human does. Admittedly the risk per decision would be substantially smaller, but on the other hand, the machine would be making many times more decisions than the human.

All in all, the report raises more questions than it provides answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, it does not appear the same organizations are getting any better at protecting themselves. The fact that 13% more organizations suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether an individual felt their organization was well enough protected. Only 18% believe they are.

SEC filing shows LinkedIn negotiating skills are worth $5bn

The US Securities and Exchange Commission has released filings outlining the road to Microsoft’s acquisition of LinkedIn, during which $5 billion was added to the value of the deal, reports Telecoms.com.

Five parties were involved in the saga, which eventually led to the news breaking on June 13 that Microsoft had agreed to acquire LinkedIn in an all-cash deal worth $26.2 billion. Although it has not been confirmed by the companies themselves, according to Re/code Party A, which kicked off the frenzy, was Salesforce. Party B was Google, which was also interested in pursuing the acquisition.

Party C and Party D were contacted by LinkedIn CEO Jeff Weiner to register interest, however both parties declined after a couple of days’ consideration. Party C remains unknown, though Party D is believed to be Facebook, which, even if it had shown interest in the deal, may have faced a tough time getting the agreement past competition authorities.

In terms of the timeline, a business combination was first discussed by Weiner and Microsoft CEO Satya Nadella during a meeting on February 16, with Party A being brought into the frame almost a month later on March 10. Salesforce CEO Marc Benioff has confirmed several times in recent weeks that his team was in discussions with LinkedIn regarding the acquisition. In the following days, Party B was brought into the mix, also declaring interest. Once the interest of Parties A and B was understood, Microsoft was brought back into the mix on March 15, with the report stating:

“Mr. Weiner called Mr. Nadella to inquire as to whether Microsoft was interested in discussing further a potential acquisition of LinkedIn, and explained that, although LinkedIn was not for sale, others had expressed interest in an acquisition. Mr. Nadella responded that he would discuss the matter further with Microsoft’s board of directors.”

Prior to the agreement LinkedIn was valued at roughly $130 per share, with the initial offer recorded at $160. Microsoft eventually paid $196 per share, though this was not the highest bid received: the company referred to as Party A put forward an offer of $200 per share, though this would have been half cash and half shares. Weiner’s negotiating skills seemingly added approximately 50% to the value of LinkedIn shares, bumping up the total value of the deal by $5 billion.
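The $5 billion figure roughly checks out: dividing the deal value by the final share price gives an implied share count, and the jump from the initial $160 offer to $196 accounts for close to $5bn. Note the share count here is inferred from the totals, not taken from the filing:

```python
# Back-of-envelope check on the value added during negotiation.
deal_value = 26.2e9      # all-cash deal, USD
final_price = 196.0      # USD per share actually paid
initial_offer = 160.0    # USD per share, first recorded offer

shares = deal_value / final_price              # implied share count (~133.7M)
value_added = (final_price - initial_offer) * shares
print(f"Implied shares: {shares/1e6:.1f}M, value added: ${value_added/1e9:.2f}bn")
```

The arithmetic gives roughly $4.8bn, so the reported $5 billion appears to be a rounded figure.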

The exclusivity agreement was signed on May 14, though pressure had been put on LinkedIn by both Microsoft and Party A in the weeks prior. It would appear Party A had not been deterred by the agreement, as additional bids were made, once again driving up the perceived value of LinkedIn shares. Microsoft’s offer of $182 was no longer perceived as high enough, and the company was encouraged to match Party A’s offer of $200. The report states LinkedIn Executive Chairman Reid Hoffman was in favour of an all-cash deal, allowing Microsoft extra negotiating room. Nadella was eventually informed on June 10 that the offer had been authorized by the LinkedIn Transactions Committee.

Although Microsoft could be seen to be overpaying, it is worth noting LinkedIn has been valued higher. The company launched its IPO in 2011 and had a promising 2013, increasing its share price from $113.50 to over $200 across the 12-month period. Share prices rose to over $250 last November, but following quarterly results in February, shares dropped 44% after the company projected full-year revenues of $3.6 billion to $3.65 billion, versus the $3.9 billion expected by analysts. Considering the fall in fortunes, it may be fair to assume shareholders would be pleased with the deal’s value approaching $200 per share.

Microsoft was a relatively quiet player in the social market prior to the acquisition, and the deal could be seen as a means to penetrate the burgeoning market segment. Although the place of social media in the workplace remains to be seen, Microsoft has essentially bought a substantial amount of data, including on numerous high-net-worth individuals and important decision makers throughout the world. LinkedIn currently has roughly 431 million members and is considered the largest professional social network worldwide.

Another explanation for the deal could be the value of Microsoft to IT decision makers. A report from JPMorgan stated Microsoft would be considered the most important vendor by CIOs to their organization due to the variety of services offered. AWS is generally considered to be the number one player in the public cloud market, though Microsoft offers a wider range of enterprise products including servers, data centres, security solutions, and cloud offerings, amongst many more. Now social can be added to the list. As Microsoft increases its offerings, it could penetrate further into a company’s fabric, making it a much more complicated decision to change vendor.

Telstra adds IoT and big data offering to Network and Services biz unit

Australian telco Telstra has continued efforts to bolster its Network Applications and Services (NAS) business unit through the acquisition of Readify, reports Telecoms.com.

The company has been vocal about its aims for the NAS business unit as it has sought to expand through numerous acquisitions in recent years. Aside from the Readify deal, the company has also incorporated O2 Networks, Bridge Point Communications, Kloud and North Shore Connections, as well as numerous partnerships including with cloud security start-up vArmour.

“This arm of the business (NAS) has been a strong growth area for Telstra, achieving double-digit growth in revenue driven by business momentum in Asia, as well as advances in technology in the cloud computing space,” said a statement on the company website. “We are well equipped to continue to capitalise on this growth and ensure our focus on NAS continues to drive revenue.”

Readify, which currently offers enterprise cloud application solutions as well as Big Data and IoT, will provide an additional platform for Telstra to drive digital transformation for its enterprise customers in domestic and global markets. The offering builds on the January acquisition of Kloud, which offers cloud migration services and unified communications solutions, as well as contact centre provider North Shore Connections in 2013, network integration services provider O2 Networks in 2014, and security, networking and data management provider Bridgepoint, also in 2014.

“Readify will provide application development and data analytics services, nicely complementing Kloud’s existing services,” said Telstra Executive Director Global Enterprise and Services, Michelle Bendschneider. “It will enable Telstra to add incremental value to customers in enterprise cloud applications, API-based customisation and extensions as well as business technology advisory services.”

Back in April, the company announced a business multi-cloud connecting solution, which supports numerous hybrid cloud offerings including Azure, AWS, VMware and IBM. The one-to-many “gateway” model will enable Australian customers to connect to Microsoft Azure, Office 365, AWS, IBM SoftLayer and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

The cloud and enterprise services market has been a long-held ambition of the company, though it got off to a slow start. Back in 2014, its national rival Optus Business stole a march on Telstra by acquiring Ensyst, winner of Australian Country Partner of the Year at the Microsoft Worldwide Partner Awards that same year, as it looked to grow its own cloud proposition. It would appear Telstra is making up for lost time through an accelerated program of product releases and acquisitions.