Box to tap NTT’s VPN in Japan

Box is teaming up with NTT Com to launch Box over VPN

Box and NTT Com have announced a partnership that will see the cloud storage incumbent offer access to its services through NTT’s VPN service. The companies said the move will improve confidence in cloud services among Japanese enterprises and expand the reach of both companies in the local IT services market.

Box also said the ‘Box over VPN’ scheme would improve network security for users and broaden the range of enterprise customers it caters to in the region, in particular enabling it to tap into government and financial services institutions.

“We’re thrilled to partner with NTT Com to help create transformative software for Japanese businesses in every industry,” said Box chief executive and founder Aaron Levie.

“This partnership will help more organizations to benefit from entirely new ways of working by elevating technology to enable secure collaboration and content management across geographical boundaries, while still meeting demands for robust control.”

Hidemune Sugawara, head of application & contents service, senior vice president of NTT Com, said: “By delivering added value based on NTT Com’s expertise in network security, we look forward to providing Box over VPN to a wide range of Japanese businesses. The partnership will enable Box to be combined with ID Federation and Salesforce over VPN, both of which are provided by NTT Com, which will help to expand our file-collaboration businesses targeting large enterprises.”

Japan has one of the most mature cloud services markets in the Asia Pacific region, which as a whole is expected to generate about $7.4bn in 2015 according to Gartner.

Parallels 10 for 10: Dress Up Your MacBook with RAWBKNY

When talking about some of our favorite things we use here at Parallels, we would be remiss if we didn’t mention something as stylish and functional as accessories by RAWBKNY. In fact, our Sr. Director of Global Online Marketing, Farees, sports a Parallels-branded RAWBKNY case that always grabs attention—no matter where in the world he […]

The post Parallels 10 for 10: Dress Up Your MacBook with RAWBKNY appeared first on Parallels Blog.

Making SIEM Easier to Achieve By @MattKiernan | @CloudExpo #Cloud

Log data provides the most granular view into what is happening across your systems, applications, and end users. Logs can show you where the issues are in real time, and provide a historical trending view over time. Logs give you the whole picture.

A June 2014 Gartner report on Security Information and Event Management (SIEM) reveals that many surveyed SIEM users cite “cost” (both in terms of price and effort) as one of the biggest challenges presented by traditional SIEM tools of today. The irony of this insight is that reducing the cost and complexity of managing security information and events should be the primary function of a SIEM tool.
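To ground the point about real-time views and historical trending, here is a minimal sketch that counts error events per hour from a plain-text log. The log path and line format are hypothetical, and this is not tied to any particular SIEM product.

```python
# Minimal log-trending sketch: count ERROR events per hour from a plain-text log.
# Assumes a hypothetical line format: "2015-06-08T14:03:12 ERROR payment-service timed out"
from collections import Counter

errors_per_hour = Counter()

with open("app.log") as f:              # hypothetical log file
    for line in f:
        parts = line.split(" ", 2)
        if len(parts) >= 2 and parts[1] == "ERROR":
            hour = parts[0][:13]        # e.g. "2015-06-08T14"
            errors_per_hour[hour] += 1

# A simple trending view: the hours with the most errors first
for hour, count in errors_per_hour.most_common(10):
    print(hour, count)
```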

Day 3 Keynote at @CloudExpo By @RJRogers87 | #IoT #Cloud #BigData

In his keynote at 16th Cloud Expo, Rodney Rogers, CEO of Virtustream, discussed the evolution of the company from inception to its recent acquisition by EMC – including personal insights, lessons learned (and some WTF moments) along the way. Learn how Virtustream’s unique approach of combining the economics and elasticity of the consumer cloud model with proper performance, application automation and security into a platform became a breakout success with enterprise customers and a natural fit for the EMC Federation.

Deutsche Telekom wants to double cloud revenues by 2018

Deutsche Telekom wants to bolster its cloud business

Deutsche Telekom said this week it aims to redouble efforts to beat out big IT incumbents in the increasingly lucrative cloud services segment. Through T-Systems, the telco’s IT-focused subsidiary, it intends to double cloud revenues over the next three years.

The company said it wants to start generating upwards of two billion euros annually from cloud services by 2018, double what it says it currently pulls in.

“At Deutsche Telekom, we want to grow by more than 20 percent each year in the field of cloud platforms, and to become the leading provider for businesses in Europe,” said Ferri Abolhassan, head of the IT Division at T-Systems.

Last year revenues from cloud solutions, in particular private cloud services, increased by double digits at the firm, Abolhassan explained. But with the battle for cloud revenue heating up among more traditional IT service providers and vendors, the company needs to scale up its cloud activities both within and outside T-Systems.

“The market for services from the public cloud – infrastructure, platforms and applications – that can be accessed through the public Internet promises further growth. In conjunction with partners, Deutsche Telekom plans to pit itself more strongly against the Internet corporations Google and Amazon in future. To achieve this, the departments within Deutsche Telekom’s segments are now stepping up their cloud activities across the Group,” he said, adding that DT will also continue to try and differentiate on security.

Telcos haven’t been the natural choice for enterprise IT professionals, but over the past few years many, like DT, have stepped up their cloud strategies. That move largely sees them acquiring successful cloud incumbents and integrating them into their own operations – Verizon’s acquisition of Terremark, for instance, or CenturyLink’s acquisition of Savvis – and using their existing commercial telecoms and managed services clients as direct channels.

Partnerships are also key in this segment, and earlier this year DT announced a flurry of cloud-centric deals with Cisco, Huawei, SAP and Salesforce. The latest push could be a sign DT will soon ramp up partnerships with other big cloud providers or ISVs – or head down the M&A route.

Tech News Recap for the Week of 6/8/2015

Were you busy last week? Here’s a quick tech news recap of articles you may have missed from the week of 6/8/2015.

55% of enterprises predict cloud will enable new business models in 3 years. SDN, NFV and Network Virtualization markets are expected to grow at a CAGR of close to 60% over the next 5 years. VMware’s AirWatch has been named a leader in mobility by Gartner. VMware has also expanded its EVO Rail portfolio.

Windows Server 2003 End-of-Life is right around the corner! Learn more & create an action plan!

 

By Ben Stephenson, Emerging Media Specialist

Living in a hybrid world: From public to private cloud and back again

Orlando Bayter, chief exec and founder of Ormuco

The view often propagated by IT vendors is that public cloud is already capable of delivering a seamless extension between on-premise private cloud platforms and public, shared infrastructure. But Orlando Bayter, chief executive and founder of Ormuco, says the industry is only at the outset of delivering a deeply interwoven fabric of private and public cloud services.

Demand for that kind of seamlessness hasn’t been around for very long, admittedly. It’s no great secret that in the early days of cloud, demand for public cloud services was spurred largely by the slow pace at which traditional IT organisations often move. As a result, every time a developer wanted to build an application they would simply swipe the credit card and go, billing back to IT at some later point. So the first big use case for hybrid cloud emerged when those developers needed to bring their apps back in-house, where they would live and probably die.

But as the security practices of cloud service providers continue to improve, along with enterprise confidence in cloud more broadly, cloud bursting – the ability to use a mix of public and private cloud resources to fit the utilisation needs of an app – has become more widely talked about. It’s usually cost prohibitive and far too time consuming to scale private cloud resources quickly enough to meet the changing demands of today’s increasingly web-based apps, so cloud bursting has become the natural next step in the hybrid cloud world.

Orlando will be speaking at the Cloud World Forum in London June 24-25. Click here to register.

There are, however, still preciously few platforms that offer this kind of capability in a fast and dynamic way. Open source projects like OpenStack or more proprietary variants like VMware’s vCloud or Microsoft’s Azure Stack (and all the tooling around these platforms or architectures) are at the end of the day all being developed with a view towards supporting the deployment and management of workloads that can exist in as many places as possible, whether on-premise or in a cloud vendor’s datacentre.

“Let’s say as a developer you want to take an application you’ve developed in a private cloud in Germany and move it onto a public cloud platform in the US. Even for the more monolithic migration jobs you’re still going to have to do all sorts of re-coding, re-mapping and security upgrades, to make the move,” Bayter says.

“Then when you actually go live, and have apps running in both the private and public cloud, the harsh reality is most enterprises have multiple management and orchestration tools – usually one for the public cloud and one for the private; it’s redundant, and inefficient.”

Ormuco is one company trying to solve these challenges. It has built a platform based on HP Helion OpenStack and offers both private and public instances, which can be managed through a single pane of glass; it has built its own layer in between to abstract the resources underneath.

It has multiple datacentres in the US and Europe from which it offers both private and public instances, as well as the ability to burst into its cloud platform using on-premise OpenStack-based clouds. The company is also a member of the HP Helion Network, which Bayter says gives it a growing channel and the ability to offer more granular data protection tools to customers.
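Ormuco has not published its management layer’s API, but a rough sense of what single-script access to a private and a public OpenStack cloud looks like can be sketched with the generic openstacksdk client. The cloud names below are hypothetical entries in a local clouds.yaml file; this is not Ormuco’s actual platform code.

```python
# Sketch: list servers from an on-premise and a hosted OpenStack cloud in one script.
# "onprem-private" and "hosted-public" are hypothetical clouds.yaml entries.
import openstack

for cloud_name in ("onprem-private", "hosted-public"):
    conn = openstack.connect(cloud=cloud_name)   # credentials resolved from clouds.yaml
    print("Servers in %s:" % cloud_name)
    for server in conn.compute.servers():
        print("  %s (%s)" % (server.name, server.status))
```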

“The OpenStack community has been trying to bake some of these capabilities into the core open source code, but the reality is it only achieved a sliver of these capabilities by May this year,” he said, alluding to the recent OpenStack Summit in Vancouver where new capabilities around federated cloud identity were announced and demoed.

“The other issue is simplicity. A year and a half ago, everyone was talking about OpenStack but nobody was buying it. Now service providers are buying but enterprises are not. Specifically with enterprises, the belief is that OpenStack will be easier and easier as time goes on, but I don’t think that’s necessarily going to be the case,” he explains.

“The core features may become a bit easier, but the whole solution may not; there are so many things going into it that it’s likely to get clunkier, more complex, and more difficult to manage. It could become prohibitively complex.”

That’s not to say federated identity or cloud federation is a lost cause – on the contrary, Bayter says it’s the next horizon for cloud. The company is currently working on a set of technologies that would enable any organisation whose infrastructure lies significantly underutilised for long periods to rent out that infrastructure in a federated model.

Ormuco would verify and certify the infrastructure, and allocate a performance rating that would change dynamically along with the demands being placed on that infrastructure – like an AirBnB for OpenStack cloud users. Customers renting cloud resources in this market could also choose where their data is hosted.

“Imagine a university or a science lab that scales and uses its infrastructure at very particular times; the rest of the time that infrastructure is fairly underused. What if they could make money from that?”

There are still many unanswered questions – whether the returns for renting organisations would justify the extra costs (energy, for instance) associated with running that infrastructure, where the burden of support lies (enterprises need solid SLAs for production workloads), and how that influences what kinds of workloads end up on rented kit – but the idea is interesting and definitely consistent with the line of thinking being promoted by the OpenStack community, among others, in open source cloud.

“Imagine the power, the size of that cloud,” says Bayter. “That’s the cloud that will win out.”

This interview was produced in partnership with Ormuco

Is force of habit defining your hybrid cloud destiny?

Experience breeds habit, which isn’t necessarily the best thing strategically

I’ve been playing somewhat of a game over recent months. It’s a fun game for all the family and might be called “Guess my job”. It’s simple to play. All you need to do is ask someone the question: “What is a hybrid cloud?” Then, based upon their answer, you make your choice. Having been playing this for a while, I’m now pretty good at predicting someone’s viewpoint from their job role, or vice versa.

And the point of all this? Simply, that people’s viewpoints are constrained by their experiences and what keeps them busy day-to-day, so they often miss an opportunity to do something different. For those people working day-to-day in a traditional IT department, keeping systems up and running, hybrid cloud is all about integrating an existing on-site system with an off-site cloud. This is a nice, easy one to grasp in principle, but the reality is somewhat harder to realize.

The idea of connecting an on-site System of Record to a cloud-based System of Engagement, pulling data from both to generate new insights, is conceptually well understood. That said, organisations making production use of such arrangements are few and far between. A typical example is combining historical customer transaction information with real-time geospatial, social and mobile data, then applying analytics to generate new insights which uncover new sales potential. For many organizations, though, the challenge of granting access to the existing enterprise systems is simply too great. Security concerns, the ability to embrace the speed of change required, and the challenge of extracting the right data in a form that is immediately usable by the analytical tools may simply be a hurdle too high. Indeed, many clients I’ve worked with have stated that they’re simply not going to do this. They understand the benefits, but the pain they see themselves having to go through to get them makes this unattractive to pursue.
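As a rough illustration of the kind of combination described above, the sketch below joins historical transaction records with recent location data to flag likely sales prospects. The file names, column names and thresholds are all hypothetical.

```python
# Conceptual System of Record / System of Engagement join: merge historical
# transactions with recent location check-ins to surface sales prospects.
# All inputs and thresholds are hypothetical.
import pandas as pd

transactions = pd.read_csv("transactions.csv")   # customer_id, total_spend
checkins = pd.read_csv("checkins.csv")           # customer_id, store_distance_km

combined = transactions.merge(checkins, on="customer_id", how="inner")

# "New insight": high-spending customers currently near a store
prospects = combined[(combined["total_spend"] > 1000) &
                     (combined["store_distance_km"] < 1.0)]
print(prospects[["customer_id", "total_spend", "store_distance_km"]])
```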

So, if this story aligns with your view of hybrid cloud and you’ve already put it in the “too hard” box then what is your way forward?

For most organizations, no single cloud provider is going to provide all of the services they might want to consume. Implicitly, then, if they need to bring data from these disparate cloud services together, there is a hybrid cloud use case: linking cloud to cloud. Even in the on-site to off-site hybrid cloud case there are real differences between a static relationship and one where you are dynamically bursting in and out of off-site capacity. Many organizations are looking to cloud as a more effective and agile platform for backup and archiving, or for disaster recovery. All of these are hybrid cloud use cases too, but if you’ve already written off ‘hybrid’ then you’re likely missing very real opportunities to do what is right for the business.

Regardless of the hybrid cloud use case, you need to keep in mind three key principles:

  1. Portability – the ability to run and consume services and data from wherever it is most appropriate to do so, be that cloud or non-cloud, on-site or off-site.
  2. Security, visibility and control – to be assured that end-to-end, regardless of where the ‘end’ is, you are running services in such a way that they are appropriately secure, well managed and their characteristics are well understood.
  3. Developer productivity – developers should be focused on solving business problems and not be constrained by needing to worry about how or when supporting infrastructure platforms are being deployed.  They should be able to consume and integrate services from many different sources to solve problems rather than having to create everything they need from scratch.

Business applications need to be portable such that they can both run, and consume other services, from wherever is most appropriate. To do that, your developers need to be unconstrained by the underlying platform(s) so they can develop for any cloud or on-site IT platform. All this needs to be done in a way that allows enterprise controls, visibility and security to be extended to the cloud platforms being used.

If you come from that traditional IT department background, you’ll be familiar with the processes that are in place to ensure that systems are well managed, change is controlled and service levels are maintained. These processes may not be compatible with the ways that clouds open up new opportunities. This leads to the need to look at creating a “two-speed” IT organisation that provides the rigor needed for the Systems of Record whilst enabling rapid change and delivery in the Systems of Engagement space.

Cloud generates innovation and hence diversity.  Economics, regulation and open communities drive standardization and it is this, and in particular open standards, which facilitates integration in all of these hybrid cases.

So, ask yourself: with more than 65 per cent of enterprise IT organizations making commitments on hybrid cloud technologies before 2016, are you ensuring that your definitions – and hence your technology choices – reflect future opportunities rather than past prejudices?

Written by John Easton, IBM distinguished engineer and leading cloud advisor for Europe

CIF: ‘Lack of trust holding back cloud adoption’

CIF: ‘Cloud users are still citing the same inhibitors’

Security, privacy and lack of control are still the leading inhibitors holding enterprises back from adopting cloud services, according to the Cloud Industry Forum’s latest research.

The CIF, which polled 250 senior IT decision-makers in the UK earlier this year to better understand where cloud services fit into their overall IT strategies, said when asked about their biggest concerns during the decision-making process to move to the cloud, 70 per cent cited data security and 61 per cent data privacy.

Both are up from the 2014 figures of 61 per cent and 54 per cent, respectively.

“Hybrid will be the modus operandi for the majority of organisations for the foreseeable future, being either not yet ready to move everything to the cloud, or unwilling to. There are a number of contributing factors here: fear of losing control of IT systems, security and privacy concerns, and lack of budget currently stand in the way of greater adoption of cloud by businesses,” said Alex Hilton, chief executive of the CIF.

“The primary issue relates to trust: trust that cloud-based data will be appropriately secured, that it won’t be compromised or inadvertently accessed, and that businesses will be able to retrieve and migrate their data when a contract terminates.”

About 40 per cent of respondents were also concerned they would lose control/manageability of their IT systems when moving to cloud, up from 24 per cent last year.

Richard Pharro, chief executive of APM Group, the CIF’s independent certification partner, said cloud providers need to improve how they disclose their privacy and security practices in order to inspire more confidence among current and potential users.

“Some Cloud providers are opaque in the way that they operate,” Pharro said, pointing to the prevalence of click-through licenses, some of which are littered with unrealistic terms and conditions. He added that improving public disclosure in cloud contracts could go some way towards improving trust and confidence among customers.

IBM calls Apache Spark “most important new open source project in a decade”

IBM is throwing its weight behind Apache Spark in a bid to bolster its IoT strategy

IBM said it will throw its weight behind Apache Spark, an open source community developing a processing engine for large-scale datasets, putting thousands of internal developers to work on Spark-related projects and contributing its machine learning technology to the code ecosystem.

Spark, an Apache open source project born in 2009, is essentially an engine that can process vast amounts of data very quickly. It runs in Hadoop clusters through YARN or as a standalone deployment and can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat; it currently supports Scala, Java and Python.

It is designed to perform general data processing (like MapReduce), but one of the exciting things about Spark is that it can also handle newer workloads like streaming data, interactive queries, and machine learning – making it a good match for Internet of Things applications, which is why IBM is so keen to go big on supporting the project.
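For readers unfamiliar with Spark, the snippet below is a minimal PySpark sketch (Spark 1.x-style API) of the kind of general data processing described above. The input path and line format are hypothetical; it simply counts events per device from a text file.

```python
# Minimal PySpark sketch: MapReduce-style counting of events per device.
# Assumes a local Spark installation and a hypothetical input file whose
# lines look like "device_id,reading".
from pyspark import SparkContext

sc = SparkContext("local[*]", "device-event-count")

lines = sc.textFile("sensor_events.csv")                      # hypothetical path

counts = (lines.map(lambda line: (line.split(",")[0], 1))     # (device_id, 1)
               .reduceByKey(lambda a, b: a + b))              # sum per device

for device, n in counts.take(10):
    print(device, n)

sc.stop()
```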

The company said the technology brings huge advances when processing massive datasets generated by Internet of Things devices, improving the performance of data-dependent apps.

“IBM has been a decades long leader in open source innovation. We believe strongly in the power of open source as the basis to build value for clients, and are fully committed to Spark as a foundational technology platform for accelerating innovation and driving analytics across every business in a fundamental way,” said Beth Smith, general manager, analytics platform, IBM Analytics.

“Our clients will benefit as we help them embrace Spark to advance their own data strategies to drive business transformation and competitive differentiation,” Smith said.

In addition to joining the Spark community, IBM said it would build the technology into the majority of its big data offerings, and offer Spark-as-a-Service on Bluemix. It also said it will open source its IBM SystemML machine learning technology, and collaborate with Databricks, a Spark-as-a-Service provider, to advance Spark’s machine learning capabilities.