The One Public Estate programme: A catalyst for public sector cloud adoption

(c)iStock.com/AnthiaCumming

In 2016, the One Public Estate programme will see 100 councils receive £31 million of funding to help them make better use of public sector property.

The idea is that local public sector bodies – from neighbouring councils to emergency services – start disposing of, or sharing, buildings in order to reduce running costs, free up property/land for redevelopment and raise money.

In a sector where money is scarce, one of the most compelling areas for potential cost savings is closing down public-sector-owned datacentres. An IT estate that runs on datacentres sited in council buildings is expensive for a number of reasons: not only do they take up space – often in expensive city centre locations – that could be used for higher-value activities, they also carry a significant overhead cost to maintain effectively.

Moving to third party cloud services providers, perhaps via colocation as a ‘staging’ point, frees up space whilst reducing the need for in-house teams to ‘keep the lights on’. It also allows organisations to clean up their balance sheets, remove the need for capital investment and move IT spend to a ’pay as you use’ operating expense.

The second business reason for moving to an external provider is to create an IT estate that is suitable for the immediate digital transformation challenges and flexible enough to accommodate the further changes that inevitably lie ahead. An external partner can provide cloud infrastructure that scales up and down with the needs of the organisation, keeps data secure, and offers an environment where information can be shared by the range of partners who need to work together to plan and deliver public services.

The final reason for moving to an external provider is the opportunity to reshape in-house council IT teams to better meet the future needs of the organisation. That future moves away from the old model of an in-house team that builds and operates an IT estate, towards a service integration and management team that partners with providers to solve business problems, enable change and provide a strategic platform roadmap to support new service delivery in the longer term.

One Public Estate is an important agenda for local government in 2016. Whether or not a council is in receipt of OPE funding, it provides a strong business rationale for IT teams to make their own case for moving their organisation’s IT infrastructure from in-house to an external provider.

The storm brewing: What is fuelling public cloud growth?

(c)iStock.com/jkitan

Public cloud adoption is set to continue to grow in 2016. However, recently published research from IT monitoring provider ScienceLogic confirmed there is still some confusion, which could impact enterprise adoption.

Fundamentally, there is a lack of understanding around the ability to simplify workload visibility and management for IT teams as cloud usage becomes more mainstream. Almost half of the IT decision makers surveyed (46%) were unsure of how to proactively monitor the workloads in their public cloud environments, highlighting the need for a solution offering improved visibility, monitoring and infrastructure control.

Some of this confusion stems from the fact that cloud storage, on which all cloud services are built, has moved on from being used simply for archiving, backup and recovery, and is now being used for business-critical applications and infrastructure. Shifting to a system requiring less management is therefore a daunting decision for many CIOs, even when putting faith in the cloud as a longer-term storage strategy.

At the same time, service providers offering cloud solutions need to be confident they can guarantee the performance of business-critical applications, as even the smallest amount of downtime or reduction in service level could have a massive impact on their clients. For instance, a provider who manages transactional software for a commercial bank or large utilities firm could rack up damages running into the millions if an outage occurred for even a few hours.

IT decision makers should shop around for a suitable cloud provider. In doing so, there are a number of key things they should be looking for, including:

  • Built on an all-flash storage architecture – to enable the delivery of consistent application performance
  • True flexibility and scalability – to allow for linear, predictable performance gains as requirements increase
  • RAID-less data protection – to ensure predictable performance in any failure condition
  • Balanced load distribution – to eliminate unpredictable performance hot spots
  • Quality of Service control – to eliminate “noisy neighbour” issues and guarantee performance
  • Performance visualisation – to control performance independently of capacity, on demand
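To make the Quality of Service point above concrete: a provider can only guarantee per-volume performance if the minimum IOPS it has promised across all volumes never exceeds what the cluster can deliver. The sketch below illustrates that admission-control idea in Python; the `Volume` class, field names and capacity figures are purely illustrative, not any vendor’s actual API.

```python
# Hypothetical per-volume QoS settings: a guaranteed minimum and a
# burst maximum IOPS figure, as in the checklist above.
from dataclasses import dataclass

@dataclass
class Volume:
    name: str
    min_iops: int   # guaranteed floor
    max_iops: int   # ceiling that caps "noisy neighbours"

def can_admit(volumes, new_volume, cluster_iops_capacity):
    """Admit a new volume only if every guaranteed minimum can still be
    honoured simultaneously; otherwise the guarantees are fiction."""
    committed = sum(v.min_iops for v in volumes) + new_volume.min_iops
    return committed <= cluster_iops_capacity

existing = [Volume("db", 5000, 15000), Volume("mail", 2000, 8000)]
print(can_admit(existing, Volume("analytics", 3000, 20000), 12000))  # True: 10000 <= 12000
print(can_admit(existing, Volume("video", 6000, 25000), 12000))      # False: 13000 > 12000
```

The maximum caps are what stop a single busy tenant from starving its neighbours; the minimums are what let the provider put a guarantee in the service level agreement.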

1&1 is an example of a cloud service provider that made the decision to switch to a next generation data centre model. The company moved to a new storage architecture to reduce operational management complexity and support the delivery of best in class services to its enterprise customers. It chose an all-flash array, designed to guarantee performance to clients, which was lacking with the previous system.

At the heart of this issue, regardless of the current system being deployed to deliver cloud services, is the need for businesses to have an infrastructure that provides the right level of support for the applications they are running, be they business critical or not. Predictability, visibility and the ability to control both capacity and performance easily and with agility will be the key differentiator in the marketplace.

This will make the cloud storage sector a particularly interesting one, not just in 2016, but also for the foreseeable future.

Rapid Changes in Immature Technologies Are Holding You Back By @JohnBasso | @CloudExpo #Cloud

Many of the technologies we use in our personal and professional lives every day are changing at a very rapid pace. So, do you try to keep up, and if so, how? Simple. You let go. Firms often hold on to outdated technology because of switching costs, insecurity about the best tech choice, lack of expertise, and a host of other reasons. In staying married to old technologies, firms make switching more and more difficult. One of the most pervasive technology trends where this phenomenon can be seen is the conversion from internally maintained servers to cloud-hosted servers.

read more

Four Biggest Drivers of Cloud Security Innovation in 2016 By @Auxome | @CloudExpo #Cloud

The rise of cloud-based infrastructure was one of the biggest developments in IT during the past few years, and now we are seeing extensive innovations in cloud security as well. More companies are moving their business-critical data away from onsite data centers and into cloud-based infrastructure. With that in mind, 2016 is going to be another dynamic year for cloud security, as more users and IT teams will be looking for ways to enhance their cloud security while achieving heightened visibility of their cloud-based IT assets. The migration of business workloads to the cloud brings many benefits, but one potential challenge is that the “old ways” of managing IT security don’t work as well in the cloud environment.

read more

[video] SQL Orchestration with @DanKLynn | @CloudExpo @AgilDataInc #Cloud

“There’s a very common problem that the enterprise data industry hasn’t solved yet – as companies add data and new features into their application they add new data stores that are specialized for those purposes and it becomes an operational nightmare to manage. We think we have an interesting angle on that,” explained Dan Lynn, CEO of AgilData, in this SYS-CON.tv interview at 17th Cloud Expo, held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA.

read more

AWS opens up EC2 Container Registry to all

Cloud giant Amazon Web Services (AWS) has opened up its technology for storing and managing application container images to public consumption.

The AWS EC2 Container Registry Service (ECR) had been available exclusively to industry insiders who attended its launch at the AWS re:Invent conference in Las Vegas in October. However, AWS has now decided to level the playing field, its Senior Product Manager Andrew Thomas revealed, guest writing on the blog of AWS chief technologist Jeff Barr. Thomas invited all interested cloud operators to apply for access.

As containers have become the de facto method for packaging application code, all cloud service providers are competing to fine-tune the process of running code within these constraints as an alternative to using virtual machines. But developers have fed back teething problems to AWS, Thomas reports in the blog.

ECR, explains Thomas, is a managed Docker container registry designed to simplify the management of Docker container images, which developers have told him has proved difficult. Running a self-hosted Docker image registry for a large-scale infrastructure project involves pulling hundreds of images at once, which makes self-hosting impractical, especially with the added complexity of spanning two or more AWS regions. AWS clients also wanted fine-grained access control to images without having to manage certificates or credentials, Thomas said.

Management aside, there is a security dividend too, according to Thomas. “This makes it easier for developers to evaluate potential security threats before pushing to Amazon ECR,” he said. “It also allows developers to monitor their containers running in production.”

There is no charge for transferring data into the Amazon EC2 Container Registry. Storage costs 10 cents per gigabyte per month, and all new AWS customers receive 500MB of storage a month for a year.
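As a back-of-the-envelope sketch of those published figures, a monthly storage bill can be estimated as follows. The function name and rounding are illustrative, and the free tier is modelled simply as 500 MB deducted from the stored total.

```python
def monthly_ecr_storage_cost(gb_stored, free_tier_mb=500, price_per_gb=0.10):
    """Estimate monthly ECR storage cost from the quoted figures:
    10 cents per GB-month, with 500 MB/month free for new AWS
    customers in their first year."""
    billable_gb = max(0.0, gb_stored - free_tier_mb / 1024)
    return round(billable_gb * price_per_gb, 2)

print(monthly_ecr_storage_cost(0.3))   # fits in the free tier -> 0.0
print(monthly_ecr_storage_cost(50))    # ~49.51 GB billable -> 4.95
```

In other words, a team storing 50GB of container images would pay roughly five dollars a month; for most development shops the storage line item is negligible next to the operational cost of self-hosting a registry.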

The Registry is integrated with Amazon ECS and the Docker CLI (command line interface), in order to simplify development and production workflows. “Users can push container images to Amazon ECR using the Docker CLI from the development machine and Amazon ECS can pull them directly for production,” said Thomas.
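The push/pull workflow Thomas describes hinges on the fully qualified image name, which embeds the account, region and repository in the registry hostname. A small sketch of that naming convention, with a placeholder account ID, region and repository (the helper function itself is illustrative, not part of any AWS SDK):

```python
def ecr_image_uri(account_id, region, repository, tag="latest"):
    """Build the fully qualified image name the Docker CLI expects
    when pushing to Amazon ECR (registry host + repository + tag)."""
    return f"{account_id}.dkr.ecr.{region}.amazonaws.com/{repository}:{tag}"

uri = ecr_image_uri("123456789012", "us-east-1", "my-web-app", "v1.2")
print(uri)  # 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-web-app:v1.2

# A developer would then tag and push from the development machine
# with the standard Docker CLI, roughly:
#   docker tag my-web-app:v1.2 <uri>
#   docker push <uri>
# and an ECS task definition referencing <uri> pulls it in production.
```

Because the account and region live in the hostname, the same repository name can exist independently in every region, which is part of why spanning regions with a self-hosted registry was painful in the first place.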

The service has been live since December 21st in the US East (Northern Virginia) region, with more regions on the way soon.

ESI installs HPC data centre to support virtual prototyping

Manufacturing service provider ESI Group has announced that a new high performance computing (HPC) system is powering its cloud-based virtual prototyping service for a range of industries across Europe.

The new European HPC-driven data centre is based on the Teratec Campus in Paris, close to Europe’s biggest HPC centre, the Très Grand Centre de Calcul, the data centre of the French Alternative Energies and Atomic Energy Commission (CEA). The location was chosen to make collaborative HPC projects possible, according to ESI. The 13,000 square metre CEA campus has a supercomputer with a peak performance of 200 teraflops and the CURIE supercomputer, capable of 2 petaflops.

ESI’s virtual prototyping – a product development process run on computer-aided design (CAD), computer-automated design (CAutoD) and computer-aided engineering (CAE) software to validate designs – is increasingly run in the cloud, it reports. Before manufacturers commit to making a physical prototype, they create a 3D computer-generated model and simulate different test environments.

The launch of the new HPC data centre gives ESI a cloud computing point of delivery (PoD) to serve all 40 of ESI’s offices across Europe and the world. The HPC cloud PoD will also act as a platform for ESI’s new software development and engineering services.

The HPC facility was built by data centre specialist Legrand. The new HPC system is needed to meet the change in workloads driven by virtualisation and cloud computing, with annual data growth expected to rise from 50% in 2010 to 4400% by 2020, according to Pascal Perrin, Datacenter Business Development Manager at Legrand.

Legrand subsidiary Minkels supplied and installed the supporting data centre hardware, including housing, UPS, cooling, monitoring and power distribution systems. The main challenge with supporting a supercomputer that can ramp up CPU activity by the petaflop, with petabytes of data moving in and out of memory, is securing the supporting resources, said Perrin. “Our solutions ensure the electrical and digital supply of the data centre at all times,” he said.

[slides] Cloud Lock-In | @CloudExpo #PaaS #Cloud

Clearly the way forward is to move to the cloud, be it bare metal, VMs or containers. One aspect of the current public clouds that is slowing this migration is cloud lock-in: every cloud vendor is trying to make it very difficult to move out once a customer has chosen its cloud.
In his session at 17th Cloud Expo, Naveen Nimmu, CEO of Clouber, Inc., advocated that making inter-cloud migration as simple as changing airlines would help the entire industry adopt the cloud quickly, without worrying about lock-in fears. In fact, standard APIs for IaaS would help PaaS grow much faster, as there would be no need for every PaaS vendor to support a plethora of clouds.

read more

Pitfalls of Microsoft O365 Migrations Part 1: Mailbox Size, Spam Filtering, & Address Change

There are several pitfalls that organizations experience when doing Microsoft O365 migrations. This is the first part of a three-part video series where I outline some of the most common pitfalls I’ve seen organizations run into. A lot of people don’t fully understand how much impact your IT deficit has on your ability to migrate data. In this first video, I discuss Mailbox Size, Spam Filtering, & Address Changes. If you’re looking for more information around O365 migrations, I recently held a webinar with a couple of my colleagues that takes a deep dive into the topic.

Microsoft O365 Migrations Part 1

Interested in learning more about Microsoft O365 Migrations? Download David’s recent webinar, “Microsoft Office 365: Expectations vs. Reality”.

By David Barter, Practice Manager, Microsoft Technologies

Converting VirtualBox to Parallels Desktop

Guest blog by Sylvester Sebastian Nino, Parallels Support Team. While working with Parallels Desktop and helping our customers, I often go through our knowledgebase as the best source of product-related information. For instance, I recently helped a customer over the phone convert his virtual machine from VirtualBox to Parallels Desktop by simply going through the […]

The post Converting VirtualBox to Parallels Desktop appeared first on Parallels Blog.