All posts by James

Amazon Web Services announces next generation EC2 instances

(c)iStock.com/zakokor

Amazon Web Services (AWS) has announced M4 instances for its EC2 cloud, adding another selection of compute instances to an already well-established list.

The M4 instances will deliver processing power via custom 2.4 GHz Intel Xeon E5-2676 (Haswell) processors, and aim to provide lower network latency and jitter – the variation in packet arrival times – through Enhanced Networking. M4 also offers dedicated bandwidth to Amazon Elastic Block Store (EBS).

AWS claims M4 instances will suit a wide variety of applications, such as relational and in-memory databases, as well as gaming servers.

“Amazon EC2 provides a comprehensive selection of instances to support virtually any workload, and we continue to deliver new technologies and high performance in our current generation instances,” said Matt Garman, AWS VP for EC2, in a statement. “With these capabilities, M4 is one of our most powerful instance types and a terrific choice for workloads requiring a balance of compute, memory, and network resources,” he added.

AWS customers can launch M4 instances using the AWS Management Console, AWS Command Line Interface, AWS SDKs, as well as third party libraries.
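To illustrate, a launch from the AWS CLI might look like the sketch below. The AMI, key pair, and subnet IDs are placeholders, and the command assumes you have already configured AWS credentials:

```shell
# Launch a single m4.large instance (all resource IDs below are placeholders).
aws ec2 run-instances \
    --image-id ami-xxxxxxxx \
    --instance-type m4.large \
    --count 1 \
    --key-name my-key-pair \
    --subnet-id subnet-xxxxxxxx
```

The same launch can be scripted through any of the AWS SDKs, for example via the EC2 client's `run_instances` call in boto3 for Python.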

The latest instances add to AWS’ ecosystem, but also to the complexity of what is on offer. As a result, companies like 2nd Watch, which manages more than 10,000 AWS instances for enterprises, derive value from consulting. Recent figures from the company revealed EC2 remained the most popular AWS service, with 98% of customers using it, just ahead of S3 (97%).

Zacks.com, an analyst house, argued following the launch of M4: “AWS is the biggest public cloud in the market. But the competition in the cloud market is intensifying, and so is the cloud storage war between Microsoft and Google. But amid this war, we remain extremely positive about AWS’s growth prospects. The latest launch is basically an added feather to its cap.”

You can find out more about M4 instances here.

CIO survey reveals importance of mainframe – but also the skills gap with it

(c)iStock.com/mevans

IBM celebrated the 50th anniversary of the mainframe last year – and new research from Compuware reveals the technology is still as important as ever for CIOs.

The research, which polled 350 enterprise CIOs, found almost nine in 10 (88%) see the mainframe as a “key business asset over the next decade”, while a similar number (89%) see mainframe code as “valuable corporate intellectual property” and nearly four in five (78%) believe it is a “key enabler of innovation”.

As William Rabie, head of cloud EMEA and APAC at iland wrote in this publication earlier this month, client-server technology never completely replaced the mainframe. Even though it’s very much a legacy technology, it still has its place in industries such as banking and defence.

While it’s clear that CIOs see the mainframe playing a key role in the future of the digital enterprise, there are various concerns related to its development. As senior platform professionals retire and leave the business, who will step up and deal with the mainframe in their place? Three quarters (75%) of CIOs admit that distributed app developers have little understanding of the mainframe, and a similar proportion (70%) are concerned a lack of documentation will create risk in the company.

39% of respondents said they had no explicit plans for addressing shortages of mainframe developers in their organisation. So who is going to fill the gap? Compuware CEO Chris O’Malley is concerned by the findings, comparing the situation to the Millennium Bug.

“CIOs clearly need to re-prioritise investments in the mainframe in order to maximise the value IT delivers to the business and to effectively mitigate the risk associated with the generational shift in IT staffing,” he said. “Not since Y2K has the mainframe required as much CIO attention and direct involvement.

“Hope is never a good mainframe strategy,” he added.

Box announces strong financial figures, raises forecast

(c)iStock.com/ngkaki

Enterprise cloud storage provider Box has announced first quarter revenue figures of $65.6 million, a 45% increase year over year.

The company noted billings of $69.8m for the first quarter of fiscal 2016 and a non-GAAP operating loss of $32.6m – 50% of revenue. This contrasts with the first quarter of fiscal 2015, when non-GAAP operating loss was 69% of revenue. GAAP operating loss for this quarter stood at 71% of revenue, compared to the previous year’s 83%.
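As a quick back-of-the-envelope check (the arithmetic here is ours, using only the headline numbers from the announcement), the reported loss-to-revenue ratio can be reproduced directly:

```python
# Box Q1 FY2016 headline figures from the announcement, in $ millions.
revenue = 65.6
non_gaap_operating_loss = 32.6

# Operating loss as a share of revenue; the report rounds this to 50%.
ratio = non_gaap_operating_loss / revenue * 100
print(f"Non-GAAP operating loss: {ratio:.1f}% of revenue")
```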

Box also gave an update on its customer numbers: the addition of 2,000 customers in the quarter for a total of more than 47,000 globally, growth in paying customers to include more than 51% of the Fortune 500, and surpassing 37 million registered users.

According to Reuters, the company raised its full-year forecast to $286m-$290m, up from $281m-$285m. Shares rose over 8% in extended trading on Wednesday.

Dylan Smith, Box co-founder and CFO, said in a statement: “We are proud to have achieved revenue growth of 45% year over year, driven by our continued success moving up market and closing more enterprise deals. While we continue to focus on investing in technology innovation and growth, we also remain committed to achieving positive free cash flow. Our Q1 results show the progress we have made toward this milestone as we demonstrated significant improvement in our operating cash flow.”

Box has made a series of interesting announcements in recent months, ranging from the customer win with the US Department of Justice to the appointment of Sonny Hashmi, former CIO of the General Services Administration, to help lead the company’s efforts in the federal IT space. It’s still early days, but there are certainly encouraging signs for the California-based firm.

Number of cloud apps in the enterprise declines for first time – is IT fighting back?

(c)iStock.com/DragonImages

The average number of cloud apps used per enterprise has declined for the first time, according to the latest data from Netskope.

This shift – which the company attributes to “consolidation efforts from IT begin[ning] to take hold” – comes against a backdrop of poor enterprise-grade security in most cloud apps: 89.6% of the apps in use are not enterprise-ready, and 90% of data loss prevention violations occur in cloud storage apps.

The latter point is of most concern. Netskope identified violations by discovering content at rest in sanctioned cloud apps via their APIs, alongside inspecting content in-line in real-time as per enterprises’ DLP policies, all through the Netskope Active Platform.

The five cloud app categories with the highest volume of policy violations are cloud storage, webmail, finance and accounting, social, and CRM. In the case of cloud storage, the action most likely to cause a violation is download, followed by login and upload. Personally identifiable information (PII) appears 27% of the time, followed by payment card information (24%) and what Netskope categorises as confidential or top secret documents.

The average number of apps used by enterprises has declined from 511 in Netskope’s last quarterly report, to 483 this time round.

“With so many cloud apps in the enterprise lacking the capabilities required for safe enablement, it is imperative that IT possess a holistic view of cloud app usage to inform proactive policies that reduce the risk of losing sensitive data,” said Sanjay Beri, Netskope CEO and founder. “More than just knowing where violations occur, it’s important to know how they are occurring and what steps can be taken to mitigate such behaviours.”

The most recent report, which CloudTech examined back in April, showed more than 15% of European organisations use more than 1000 cloud apps, with Google Drive, Facebook, and Twitter the most popular.

Latest research studies examine enterprise cloud and big data adoption strategies

(c)iStock.com/Barcin

The latest figures on European cloud computing adoption, this time from managed services provider Easynet, show only one in 10 enterprises are using public cloud.

The research, which polled 660 IT decision makers at companies with more than 1000 employees, found that cloud had been adopted by almost three quarters (74%) of European enterprises. In total, almost half (47%) used private cloud, compared with 17% utilising hybrid cloud, and the remaining 26% naturally staying on-premise.

The study found discrepancies – 17% of firms polled in Belgium use public cloud, ahead of the European average – and some less surprising results. The business and consumer services (30%) and IT and computer services (21%) sectors were most likely to lead the trend towards adoption of hybrid cloud, while banking and financial services were among the most likely to favour private cloud.

Belgium, along with the UK, had the highest proportion of hybrid cloud users, at 23% and 22% respectively, while the government sector was most likely to opt for on-premise hosting (52%).

Elsewhere, a study from managed public cloud provider 2nd Watch has revealed three in five US companies are currently engaged in a big data project, with a further 20% soon to begin one. The leading force behind big data projects is predominantly the CEO (32%), followed by the CIO (25%) and line of business managers (18%).

The survey also revealed how strategic imperatives are changing to address the needs of big data deployments. Almost three quarters (71%) of those polled said their company had already adopted a new data warehouse or was considering purchasing one. Established enterprise database vendors, such as IBM (38%), Oracle (31%) and HP (24%) are the most likely choices.

New CenturyLink EMEA MD Richard Warley: In IaaS, the telco should win

Picture credit: CenturyLink

Richard Warley, the new EMEA managing director of communications giant CenturyLink, argues the natural winners in the cloud and infrastructure as a service (IaaS) space should be the telcos – and he’s relishing the chance of leading the charge.

Warley’s appointment represents a near full-circle journey: he served as managing director of SAVVIS – which CenturyLink acquired in 2011 – until 2008. Indeed, it goes further than that; the new EMEA MD was speaking to CloudTech from a CenturyLink office in Denver which previously belonged to software manufacturer Quest – now part of Dell – whose 1996 IPO Warley worked on in a previous life as an investment banker.

Having been out of the fold for the best part of seven years, Warley notes CenturyLink has changed ‘radically’, admitting that when SAVVIS was acquired by the Louisiana telco, his first thought was ‘who the hell are CenturyLink?’ Yet his reasons for taking the job are clear. “Part of the reason I came back is the company’s strategic vision is absolutely in the sweet spot of what customers are looking for,” he explains.

“If you look at some of the natural competitors from the global infrastructure side, BT, Vodafone, Verizon, AT&T, all those folks are all predominantly interested in the consumer and the retail play,” he adds. “CenturyLink has said [they are] interested in satisfying the enterprise’s infrastructure problems on a local basis, in region and on a global basis.”

This feeds into Warley’s belief as to why the telco should win – the complexity of bringing together the network and the data centre from an operational and a sales standpoint. It can be quickly summed up in two terms: ‘hybrid IT’ and ‘telco cloud’. Both terms are less than ideal for Warley – they’re both pretty vague, for a start – but he says the latter has an ‘element of truth’ to it.

“The natural winners in the cloud space, and the infrastructure as a service space, are the telcos – or should be, for a number of reasons,” he argues, adding: “The cloud doesn’t work without the network. Increasingly, we’ve seen Microsoft, Amazon and certainly Google getting into the network space in rather a big way, and making substantial network investments, because the cloud doesn’t work if there’s no network to connect it.”

The second reason, a point echoed by CTO Jared Wray when he spoke to this publication last year, is that the cloud is, primarily, a utility service. Who has had experience in offering utility services for donkey’s years? The telcos.

Warley explains: “It takes a lot of innovative expertise to imagine the cloud and then to code it, but over a period of time it will become commoditised to an extent, and the people who can run infrastructure efficiently and cost effectively should be the telcos.” You put the legacy experience and the network together and it should be a recipe for success. Yet there’s a caveat.

“From a customer’s perspective this makes perfect sense, because the customer needs the network, the data centre, dedicated or traditional IT managed services, and needs cloud,” says Warley. “When you look at it from the vendor’s perspective, it looks difficult because you’ve got different skillsets, different operational frameworks that people want to work in.”

This, incorporating DevOps as opposed to a traditional ITIL (Information Technology Infrastructure Library) model, is a key tenet for CenturyLink, primarily through the CenturyLink Cloud Development Centre in Seattle.

From the enterprise customer’s perspective, the sentiment is still focused on ‘journey to the cloud’. How do they make their infrastructure more agile and cost effective, yet without sacrificing the stable operational state of a managed service? “That is the overwhelming topic of discussion amongst the client base,” Warley says, “and we are ideally positioned to have that discussion with our clients and to help them on both sides.”

Warley’s opinions on the competition aren’t restricted to the telcos. The more recognised IaaS players, he argues, don’t come from a service provider perspective as much as the telcos, ceding competitive advantage. The IaaS market as a whole, he notes, is growing rapidly and eating into the total infrastructure market, but will hit ‘increasing headwinds’ in the corporate market. He notes IBM, with its legacy in traditional IT, can stand on both sides of the argument, yet is ‘not sure how happy SoftLayer is in that environment’.

All considered, it’s exactly the sort of mission statement one would expect from the new EMEA managing director of a company in an extremely interesting strategic position. “I wouldn’t say that this is an incrementalist company,” Warley adds. “When it decides to do something, it says ‘this is what we’re going to do and this is how we’re going to go about it’.

“That’s how they’ve come from a switchboard in Louisiana to a very substantial company in a short space of time.”

Why human error is still the biggest risk to your cloud system going down

(c)iStock.com/mediaphotos

The number one risk to system availability remains human error, according to the latest disaster recovery industry report from CloudEndure.

The research examines the various protocols businesses have in place for downtime if – or when – it occurs. On a scale of one to 10, human errors – including application bugs – hit 8.1, compared to network failures (7.2), cloud provider downtime (6.9) and external threats (6.7).

Even though the majority (83%) of organisations have an SLA goal of 99.9% or better, this often doesn’t translate into actual results: 44% of firms said they had at least one outage in the past three months, with 27% admitting their systems had gone down within the past month. Only 9% of respondents said their systems had never gone down.

Most intriguingly, more than a quarter of firms surveyed (28%) don’t measure service availability at all, and 15% said they do not share system availability numbers with customers. 37% said they meet their availability goals consistently, with 50% saying they hit their goals “most of the time.”

It’s worth asking what the accepted definition of ‘downtime’ is – the report does not give a clear one. Half of respondents say downtime simply means the system is not accessible, while roughly a quarter say it means the system is accessible but performance is highly degraded (26%) or some functions are not operational (24%).

Overwhelmingly, the respondents’ cloud provider of choice was Amazon Web Services (AWS). 59% of those polled said they used public cloud, with three quarters (74%) of that number opting for Amazon, ahead of Microsoft (7%), Google (6%) and Rackspace (4%). Not surprisingly, service availability was considered most critical to the customers of 33% of firms.

The report’s main claim is a “strong correlation” between the cost of downtime and the average hours per week invested in disaster recovery. 49% of respondents said they used their own measurement tools, with a quarter (24%) using some sort of third-party tool. According to respondents, remote storage backup (57%) is the most frequently used strategy to ensure system availability, ahead of storage replication (46%).

Previous reports from CloudEndure examined AWS and Microsoft Azure uptime figures for 2014: AWS showed a 41% reduction in performance issues quarter to quarter last year, while there were significantly more service interruptions in the last three quarters for Azure. 

Dropbox opens up on enterprise cloud strategy with security and integration updates

(c)iStock.com/KIVILCIM PINAR

Dropbox has announced new features in administration, security and integration in a bid to change the way the cloud storage provider works for business.

The company is introducing tighter account security through two-step verification and tiered administrative controls, as well as an extension to the Dropbox for Business API with new capabilities for shared folders.

CloudLock, Netskope and SkySync are among the data migration providers who are already beginning to build integrations, alongside Israeli firm Adallom, which recently announced it was looking after the security for Dropbox for Business.

“This is a major milestone for Dropbox for Business,” said UK country manager Mark van der Linden. “We’re making a step change to ensure we help businesses be the best place to get work done. We listened to our customers and we’re delivering the features they need most in the areas of security, administration and integration.”

Dropbox is making a series of plays to beef up its enterprise portfolio in a bid to convince businesses they can safely store their business critical data with the cloud storage provider.

In May, Dropbox announced it had achieved ISO/IEC 27018 privacy standard certification, running through to September 2017. Other competitors, such as Box, are still waiting for the go-ahead with their FedRAMP certification, despite announcing a big customer win in the form of the US Department of Justice.

Dropbox for Business has over 100,000 global customers, including MIT, News Corp and National Geographic, while the business itself is also expanding; the company has opened up seven new global offices since the beginning of 2014.

Cloud migration still poses various challenges, according to new research

(c)iStock.com/pinstock

The Cloud Industry Forum (CIF) has called on cloud service providers (CSPs) to do more to assist with the process of cloud migration for their customers after new research reveals difficulties associated with moving to the cloud.

The CIF survey, which polled 250 senior IT and business decision-makers from the public and private sectors, found that only 10% of respondents said their transition to cloud services “could not have been improved.” 38% of those polled said they had issues relating to the complexity of migration, while a further 30% had difficulties with data sovereignty.

Nor do the challenges begin only once migration is underway: more than a quarter (27%) said they faced contractual obstacles from the start, such as clarity over liability. A similar number (28%) said they encountered a brief drop in employee productivity.

Previous research from the CIF has focused predominantly on the growth in cloud adoption. This time around, however, cloud providers are urged to focus more on their customers to aid migration. Michel Robert, Claranet UK managing director, said: “In spite of the growing maturity of the delivery model, cloud migration issues haven’t gone away and are increasing along with adoption levels.

“The IT arrangements of many businesses, particularly those in the mid-market, are incredibly complex, and, increasingly, service providers need to take more of an active role in helping businesses to unpick them and devise migration strategies for their customers,” he added.

In May, the CIF argued the uptake of cloud computing services in business will increase due to the impending shutdown of Windows Server 2003 (WS2003) support, with CRM the most likely application to become cloud-based in the next 12 months. This migration path is long and twisted, as Nick East, CEO of hybrid cloud provider Zynstra, told this publication in April.

Alex Hilton, Cloud Industry Forum CEO, explained that customers should check their service provider has appropriate credentials, but equally should have sufficient knowledge of how cloud services can support their business objectives. Elsewhere, Oscar Arean, technical operations manager at Databarracks, argues the “slow” migration from WS2003 so far has been driven by caution rather than complacency.

Enterprise recognises need to move from public storage to private cloud

(c)iStock.com/nadla

Employees at more than half (55%) of organisations use public file sync and share (FSS) services. Yet almost three quarters (73%) said they were looking for, or had already implemented, an alternative, according to a new report from CTERA.

More than half (59%) of that number said they favoured a private cloud FSS solution run either on hosted infrastructure or in their own data centre.

35% of organisations experienced corporate data leakage in 2014 as a result of employees sharing files via typically unsanctioned FSS services. The key driver of enterprise file sync and share (EFSS) adoption is naturally security, alongside improved collaboration between internal and external users and an increased need for file access on mobile devices.

“To the enterprise’s credit, the vast majority of organisations are pursuing alternatives to consumer-grade file sharing services that carry security threats and legal risks that companies simply cannot afford,” the report notes.

The most interesting aspect of the report involved the usage and uptake of cloud storage gateways – network appliances or servers which translate storage APIs to block-based storage protocols in order to lower monthly charges and boost data security. Almost seven in 10 organisations have implemented or considered cloud storage gateways; the key reasons include off-site storage and universal access (71%), cost savings (71%), and minimising IT overhead (56%). Of that number, 33% of respondents had implemented a cloud storage gateway, with 36% considering one.

Naturally, this is a concern which has been raised by cloud storage providers such as Dropbox and Box, who implement their own enterprise solutions. Last week, Box announced it was working with the US Department of Justice in a major customer win, while Dropbox’s partnerships with Microsoft in recent months have been indicative of a need to beef up its enterprise presence.

“The report highlights the growing interest and need for enterprise-grade, private cloud storage solutions that empower companies with more visibility and control over FSS services,” CTERA concludes. “By providing added security with full integration into organisations’ existing storage and IT infrastructure, these services are also improving the end-user experience – a combination that effectively can put an end to rogue FSS usage.”

You can read the full report here.