Category archive: centurylink

CenturyLink launches automation offering for hybrid and multi-cloud

CenturyLink has launched Runner, its new configuration management and orchestration service designed for hybrid and multi-cloud environments.

The new offering is built with openness in mind, enabling automation in any cloud or data centre, including CenturyLink’s own cloud platform, third-party cloud providers, and on-premises infrastructure and devices. At its heart, Runner delivers an open source automation and orchestration engine as a service.

“Runner is a new product from CenturyLink Cloud that enables fast, easy automation and orchestration on the CenturyLink Cloud Platform, as well as third-party cloud providers and on-premises infrastructure and devices,” said Chris Kent, Runner Product Owner at CenturyLink. “Runner provides the ability to quickly provision and modify resources on any environment, and gives users a true Hybrid IT solution, regardless of where their resources are.

“Runner, at its core, is an Ansible engine. On top of that engine exists several other custom services and APIs we’ve created, many of which were created in tandem with the Runner job service to enhance the job execution capabilities.”
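
Since Runner is described as an Ansible engine fronted by job APIs, the interaction pattern is easy to picture: submit a playbook and an inventory to a job endpoint, then poll for the result. The sketch below is a hedged illustration only – the URL, payload fields and authentication are assumptions, not CenturyLink’s documented Runner API.

```python
import requests

# Hypothetical sketch of submitting an Ansible-style job to a Runner-like
# job service. The endpoint, payload fields and auth header are assumptions
# for illustration, not CenturyLink's documented API.
RUNNER_API = "https://runner.example.invalid/v1/jobs"  # placeholder endpoint
TOKEN = "YOUR_BEARER_TOKEN"                            # placeholder credential

job = {
    "playbook": "provision_webserver.yml",    # Ansible playbook to execute
    "inventory": ["10.0.0.5", "10.0.0.6"],    # target hosts, in any cloud or on-premises
    "extra_vars": {"app_version": "1.4.2"},   # variables passed through to the playbook
}

resp = requests.post(
    RUNNER_API,
    json=job,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Job queued:", resp.json().get("job_id"))
```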

The new offering is built on the assumption that customers do not have the time or resources to effectively manage a hybrid or multi-cloud environment, and it also targets cases where customers need better distribution of workloads in case of failures. The team seems to be focusing on execution speed and a reduction in human error as the features that will differentiate Runner in an already competitive market. CenturyLink is also setting itself apart by focusing the technology on managing and automating the infrastructure itself, rather than on the connections between environments, as some competitors do.

Tackling the resource gap in the transition to hybrid IT

Is hybrid IT inevitable? That’s a question we ask customers a lot. From our discussions with CIOs and CEOs there is one overriding response, and that is the need for change. It is very clear that across all sectors, CEOs are challenging their IT departments to innovate – to come up with something different.

Established companies are seeing new threats coming into the market. These new players are lean, hungry and driving innovation through their use of IT solutions. Our view is that more than 70 percent of all CEOs are putting a much bigger ask on their IT departments than they did a few years ago.

There has never been so much focus on the CIO or IT departmental manager from a strategic standpoint. IT directors need to demonstrate how they can drive more uptime, improve the customer experience, or enhance the e-commerce proposition for instance, in a bid to win new business. For them, it is time to step up to the plate. But in reality there’s little or no increase in budget to accommodate these new demands.

We call the difference between what the IT department is being asked to do, and what it is able to do, the resources gap. With the rate of change in the IT landscape increasing, the demands on CIOs from the business growing, and little or no increase in IT budgets from one year to the next, that gap is only going to get wider.

But by changing their way of working, companies can free up additional resources to go and find their innovative zeal and get closer to meeting their business’ demands. Embracing Hybrid IT as their infrastructure strategy can extend the range of resources available to companies and their ability to meet business demands almost overnight.

Innovate your way to growth

A hybrid IT environment combines a company’s existing on-premises resources with public and private cloud offerings from a third-party hosting company. Hybrid IT has the ability to provide the best of both worlds – sensitive data can still be retained in-house, whilst the cloud, either private or public, provides the resources and computing power needed to scale up (or down) when necessary.

Traditionally, 80 percent of an IT department’s budget is spent just ‘keeping the lights on’. That means keeping servers working, powering desktop PCs, backing up data and carrying out general maintenance.

But with the CEO now raising the bar, more innovation in the cloud is required. Companies need to keep their operation running but reapportion the budget so they can become more agile, adaptable and versatile to keep up with today’s modern business needs.

This is where Hybrid IT comes in. Companies can mix and match their needs to any type of solution. That can be their existing in-house capability, or they can share the resources and expertise of a managed services provider. The cloud can be private – servers that are the exclusive preserve of one company – or public, sharing utilities with a number of other companies.

Costs are kept to a minimum because the company only pays for what they use. They can own the computing power, but not the hardware. Crucially, it can be switched on or off according to needs. So, if there is a peak in demand, a busy time of year, a last minute rush, they can turn on this resource to match the demand. And off again.

This is the journey to the Hybrid cloud and the birth of the agile, innovative market-focused company.

Meeting the market needs

Moving to hybrid IT is a journey. Choosing the right partner to make that journey with is crucial to the success of the business. In the past, businesses could get away with a rigid customer/supplier relationship with their service provider. Now, there needs to be a much greater emphasis on creating a partnership so that the managed services provider can really get to understand the business. Only by truly getting under the skin of a business can the layers be peeled back to reveal a solution to the underlying problem.

The relationship between customer and managed service provider is now also much more strategic and contextual. The end users are looking for outcomes, not just equipment to plug a gap.

As an example, take an airline company operating in a highly competitive environment. They view themselves not as a people-transportation business, but as a retailer providing a full shopping service (with a trip across the Atlantic thrown in). They want to use cloud services to take their customers on a digital experience, so the journey starts the minute a customer buys a ticket.

When the passenger arrives at the airport, they need to check in, choose the seats they want, do the bag drop and clear security, all using online booking systems. Once in the lounge, they’ll access the Wi-Fi, check their Hotmail, browse Facebook, start sharing pictures and so on. They may also make last-minute adjustments to their journey, such as changing their booking or choosing to sit in a different part of the aircraft.

Merely saying “we’re going to do this using the cloud” is likely to lead to the project misfiring. As a good partner, the service provider should have experience of building and running both traditional infrastructure environments and new ones based on innovative cloud solutions, so that they can bring ‘real world’ transformation experience to the partnership. Importantly, they must also have the confidence to demonstrate digital leadership and an understanding of the business and its strategy, to add real value to that customer as it undertakes the journey of digital transformation.

Costs can certainly be rationalised along the way. Ultimately, with a hybrid system you only pay for what you use, so the peak periods will cost the same as, or less than, the off-peak operating expenses. With added security, compute power, speed, cost efficiencies and ‘value-added’ services, hybrid IT can provide the agility businesses need.

With these solutions, companies have no need to ‘mind the gap’ between the resources they need and the budget they have. Hybrid IT has the ability to bridge that gap and ensure businesses operate with the agility and speed they need to meet the needs of the competitive modern world.

 

Written by Jonathan Barrett, Vice President of Sales, CenturyLink, EMEA

The end of the artisan world of IT computing

We are all working toward an era of autonomics ‒ a time when machines not only automate key processes and tasks, but truly begin to analyse and make decisions for themselves. We are on the cusp of a golden age for our ability to utilise the capacity of the machines that we create.

There is a lot of research into autonomic cloud computing, and therefore a lot of definitions of what it is. The definition from Webopedia probably does the best job of describing autonomic computing.

Autonomic computing, it says, is: “A type of computing model in which the system is self-healing, self-configured, self-protected and self-managed. Designed to mimic the human body’s nervous system in that the autonomic nervous system acts and reacts to stimuli independent of the individual’s conscious input.

“An autonomic computing environment functions with a high level of artificial intelligence while remaining invisible to the users. Just as the human body acts and responds without the individual controlling functions (e.g., internal temperature rises and falls, breathing rate fluctuates, glands secrete hormones in response to stimulus), the autonomic computing environment operates organically in response to the input it collects.”

Some of the features of autonomic computing are available today for organisations that have completed – or at least partly completed – their journey to the cloud. The more information that machines can interpret, the more opportunity they have to understand the world around them.
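
To make “self-healing” concrete, the control loop an autonomic system runs looks something like the following minimal sketch: observe, compare against a policy, and act without operator input. The service name and the use of systemd here are illustrative assumptions, not any particular vendor’s implementation.

```python
import subprocess
import time

SERVICE = "myapp"      # placeholder service name
CHECK_INTERVAL = 30    # seconds between health checks

def is_healthy(service):
    """Placeholder health check: ask systemd whether the service is active."""
    result = subprocess.run(["systemctl", "is-active", "--quiet", service])
    return result.returncode == 0

def heal(service):
    """Self-healing action: restart the failed service without operator input."""
    subprocess.run(["systemctl", "restart", service], check=False)

while True:
    if not is_healthy(SERVICE):
        heal(SERVICE)            # react to the stimulus automatically
    time.sleep(CHECK_INTERVAL)
```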

It spells the death of the artisan IT worker – a person working exclusively for one company, maintaining the servers and systems that kept it running. Today, the ‘cloud’ has turned computing on its head. Companies can access computing services and storage at the click of a button, providing scalability, agility and control to exactly meet their needs. Companies pay for what they get and can scale up or down instantly. What’s more, they don’t need an army of IT artisans to keep the operation running.

This, of course, assumes that the applications that leverage the cloud have been developed to be cloud-native, using a model like the twelve-factor app methodology developed by Adam Wiggins, who co-founded Heroku. However, many existing applications and the software stacks that support them can also use the cloud successfully.
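
One of the simplest principles of that twelve-factor model is that configuration lives in the environment rather than in the code, which is what lets the same build run unchanged on-premises or in any cloud. A minimal sketch (the variable names are illustrative):

```python
import os

# Twelve-factor style configuration: settings come from the environment,
# so the same build runs unchanged on-premises or in any cloud.
# The variable names below are illustrative.
DATABASE_URL = os.environ["DATABASE_URL"]                 # required setting
WORKER_COUNT = int(os.environ.get("WORKER_COUNT", "4"))   # optional, with a default

print(f"Connecting to {DATABASE_URL} with {WORKER_COUNT} workers")
```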

More and more companies are beginning to realise the benefit that cloud can provide, either private, public or hybrid. For start-ups, the decision is easy. They are ‘cloud first’ businesses with no overheads or legacy IT infrastructure to slow them down. For CIOs of larger organisations, it’s a different picture. They need to move from a complex, heterogeneous IT infrastructure into the highly orchestrated and automated – and ultimately, highly scalable and autonomic – homogeneous new world.

CIOs are looking for companies with deep domain expertise as well as infrastructure at scale. In the switch to cloud services, the provision of managed services remains essential. To ensure a smooth and successful journey to the cloud, enterprises need a company that can bridge the gap between the heterogeneous and homogeneous infrastructure.

Using a trusted service provider to bridge that gap is vital to maintain a consistent service level for the business users who consume the hosted application. But a cloud user has many more choices to make in the provision of their services. Companies can take a ‘do it myself’ approach, where they are willing to outsource their web platform but keep control of testing and development. Alternatively, they can take a ‘do it with me’ approach, working closely with a provider in areas such as managed security and managed application services. This spreads the responsibility between customer and provider, and the split can be decided at the outset of the contract.

In the final ‘do it for me’ scenario, trust in the service provider is absolute. It allows the enterprise customer to focus fully on the business outcomes. As more services are brought into the automation layer, delivery picks up speed which in turn means quick, predictable and high-quality service.

Hybrid cloud presents a scenario of the ‘best of both worlds’. Companies are secure in the knowledge that their most valuable data assets are still either on premise in the company’s own private servers or within a trusted hosting facility utilising isolated services. At the same time, they can rely on the flexibility of cloud to provide computing services that can be scaled up or down at will, at a much better price point than would otherwise be the case.

Companies who learn to trust their service provider will get the best user experience. In essence, the provider must become an extension of the customer’s business and not operate on the fringes as a vendor.

People, processes and technology all go together to create an IT solution. But they need to integrate between the company and the service provider as part of a cohesive solution to meet the company’s needs. The solution needs to be relevant for today but able to evolve in the future as business priorities change. Only then can we work toward a future where autonomics begins to play a much bigger part in our working lives.

Eventually, autonomic computing can evolve almost naturally, much like human intelligence has over the millennia. The only difference is that with cloud computing the advances will be made in years, not thousands of years. We are not there yet, but watch this space. In your lifetime, we are more than likely to make that breakthrough to lead us into a brave new world of cloud computing.

 

Written by Jamie Tyler, CenturyLink’s Director of Solutions Engineering, EMEA

The Six Myths of Hybrid IT

Bennett: It is time to debunk some hybrid cloud myths

Many companies face an ongoing dilemma: How to get the most out of legacy IT equipment and applications (many of which host mission-critical applications like their ERP, accounting/payroll systems, etc.), while taking advantage of the latest technological advances to keep their company competitive and nimble.

The combination of cloud and third-party datacentres has caused a shift in the way we approach building and maintaining our IT infrastructure. A best-of-breed approach previously meant a blending of heterogeneous technology solutions into an IT ecosystem. It now focuses on the services and technologies that remain on-premises and those that ultimately will be migrated off-premises.

A hybrid approach to IT infrastructure enables internal IT groups to support legacy systems with the flexibility to optimise service delivery and performance through third-party providers. Reconciling resources leads to improved business agility, more rapid delivery of services, exposure to innovative technologies, and increased network availability and business uptime, without having to make the budget case for CAPEX investment. However, despite its many benefits, a blended on-premises and off-premises operating model is fraught with misconceptions and myths that perpetuate a “what-if?” mentality, often stalling innovation and business initiatives.

Here are the facts behind some of the most widespread hybrid IT myths:

Myth #1: “I can do it better myself.”

If you’re in IT and not aligned with business objectives, you may eventually find yourself out of a job. The hard truth is that you can’t be better at everything. Technology is driving change so rapidly that almost no one can keep up.

So while it’s not always easy to say “I can’t do everything as well as someone else can,” it’s perfectly acceptable to stick to what you’re good at and then evaluate other opportunities to evolve your business. In this case, that means outsourcing select IT functionality where you can realise improved capabilities and value. Let expert IT outsourcing providers do what they do best, managing IT infrastructure for companies 24/7/365, while you concentrate on the IT strategy that keeps your business competitive and strong.

Myth #2: “I’ll lose control in a hybrid IT environment.”

A functional IT leader with responsibility for infrastructure that management wants to outsource may fear the loss of his or her team’s jobs. Instead, the day-to-day management of the company’s infrastructure might be better served off-premises, allowing the IT leader to focus on the strategy and direction of the IT functions that differentiate the business, in order to stay ahead of fast-moving market innovation and customer demands.

In the early days of IT, it was one size fits all. Today, an IT leader has more control than ever. For example, you can buy a service that comes with little management and spin resources up using embedded API interfaces. The days when you bought a managed service and had no control, or visibility, over it are gone. With the availability of portals, plug-ins and platforms, internal teams have as much control as they want – whether they prefer their environment managed by a third party or want to manage it outright on their own.

Myth #3: “Hybrid IT is too hard to manage.”

Do you want to differentiate your IT capabilities as a means to better support the business? If you do want to manage it on your own, you need to have the people and processes in place to do so. An alternative is to partner with a service provider offering multiple off-premises options and a more agile operating model than doing it all on your own. Many providers bundle management interfaces, orchestration, automation and portals with their offerings, giving IT complete transparency and granular control over the outsourced solution. These portals are also API-enabled, so they can be integrated with any internal tools you have already invested in and provide end-to-end visibility into the entire hybrid environment.

Myth #4: “Hybrid IT is less secure than my dedicated environment.”

In reality, today’s IT service providers are likely more compliant than your business could ever achieve on its own. To be constantly diligent and compliant, a company may need to employ a team of internal IT security professionals to manage day-to-day security concerns. Instead, it makes sense to let a team of external experts worry about data security and provide a “lessons-learned” approach to your company’s security practice.

There are cases where insourcing makes sense, especially when it comes to the business’ mission-critical applications. Some data should absolutely be kept as secure and as close to your users as possible. However, outsourced infrastructure is increasingly becoming more secure because providers focus exclusively on the technology and how it enables their users. For example, most cloud providers will encrypt your data and hand the key to you only. As a result, secure integration of disparate solutions is quickly becoming the rule, rather than the exception.
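
That key-handling point can be illustrated with client-side encryption, where the customer generates and keeps the key and the provider only ever stores ciphertext. Below is a minimal sketch using the Python cryptography library; the upload step is a placeholder:

```python
from cryptography.fernet import Fernet

# The customer generates and retains the key; the provider only ever sees ciphertext.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"payroll record: sensitive"
ciphertext = cipher.encrypt(plaintext)   # only this ciphertext is uploaded

# ... upload ciphertext to the cloud provider (placeholder step) ...

# Later, only the key holder can recover the data.
assert cipher.decrypt(ciphertext) == plaintext
```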

Myth #5: “Hybrid IT is inherently less reliable than the way we do it now.”

Placing computing closer to users and, in parallel, spreading it across multiple locations results in a more resilient application than running it in a single, fixed location. In fact, the more mission-critical the application, the more you should spread it across multiple providers and locations. If you build an application for the cloud, you are not relying on any single component being up for the application as a whole to remain available. This “shared nothing” approach to infrastructure and application design not only makes your critical applications more available, it also adds a level of scalability that is not available in traditional in-house-only approaches.
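
A simple way to picture the “shared nothing” idea is a client that treats several independently deployed copies of an application as interchangeable and quietly fails over between them, so no single provider or location has to be up. The endpoints below are placeholders for illustration:

```python
import requests

# Placeholder endpoints: the same stateless application deployed independently
# in several locations (different providers or regions).
ENDPOINTS = [
    "https://eu.app.example.invalid/health",
    "https://us.app.example.invalid/health",
    "https://ap.app.example.invalid/health",
]

def first_available(endpoints):
    """Return the first endpoint that answers, so no single site is critical."""
    for url in endpoints:
        try:
            if requests.get(url, timeout=2).ok:
                return url
        except requests.RequestException:
            continue   # that location is down; try the next one
    raise RuntimeError("no location available")

print("Serving from:", first_available(ENDPOINTS))
```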

Myth #6: “This is too hard to budget for.”

Today’s managed service providers can perform budgeting as well as reporting on your behalf. Again, internal IT can own this, empowering it to recommend whether to insource or outsource a particular aspect of infrastructure based on the needs of the business. When it comes to registration, costs and other considerations, partnering with a third-party service can become a huge value-add for the business.

Adopting a hybrid IT model lowers the risk to your IT resources and the business they support. You don’t have to make huge investments all at once: you can start incrementally, picking the options that help you in the short term and, as you gain experience, jumping in with both feet later. Hybrid IT lets you evolve your infrastructure as your business needs change.

If IT and technology have taught us anything, it’s that you can’t afford to let fear prevent your company from doing what it must to remain competitive.

Written by Mike Bennett, vice president global datacentre acquisition and expansion, CenturyLink EMEA

CenturyLink open sources more cloud tech

CenturyLink has open sourced a batch of cloud tools

CenturyLink has open sourced a number of tools aimed at improving provisioning for Chef on VMware infrastructure as well as Docker deployment, orchestration and monitoring.

The projects open sourced by the company include a Chef provisioning driver for vSphere; Lorry.io, a tool for creating, composing and validating docker-compose.yml files; and imagelayers.io, a tool that visualises Docker image layers to give developers more visibility into their workloads.
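
To give a flavour of what a Compose-file validator does, the sketch below performs the kind of basic structural checks such a tool might run. This is an illustrative approximation in Python, not Lorry.io’s actual implementation:

```python
import yaml  # PyYAML

def validate_compose(path):
    """Very rough structural checks on a docker-compose.yml file."""
    errors = []
    with open(path) as f:
        doc = yaml.safe_load(f)
    if not isinstance(doc, dict):
        return ["file is not a YAML mapping"]
    services = doc.get("services", doc)   # v1 files list services at the top level
    if not isinstance(services, dict):
        return ["no services mapping found"]
    for name, svc in services.items():
        if not isinstance(svc, dict):
            errors.append(f"service '{name}' is not a mapping")
        elif "image" not in svc and "build" not in svc:
            errors.append(f"service '{name}' needs an 'image' or 'build' key")
    return errors

print(validate_compose("docker-compose.yml") or "looks valid")
```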

“The embrace of open-source technologies within the enterprise continues to rise, and we are proud to be huge open-source advocates and contributors at CenturyLink,” said Jared Wray, senior vice president of platforms at CenturyLink.

“We believe it’s critical to be active in the open-source community, building flexible and feature-rich tools that enable new possibilities for developers.”

While CenturyLink’s cloud platform is proprietary and developed in-house, Wray has repeatedly said open source technologies form an essential part of the cloud ecosystem – Wray himself was a big contributor to Cloud Foundry, the open source PaaS tool, when developing Iron Foundry.

The company has also open sourced other tools in the past. Last summer it released Panamax, a Docker management platform designed to ease the development and deployment of any application sitting within a Docker environment, into the open source world. It has also open sourced a number of tools designed to help developers assess the total cost of ownership of multiple cloud platforms.

HP, CenturyLink buddy-up on hybrid cloud

CenturyLink and HP are partnering on hybrid cloud

HP and CenturyLink announced a deal this week that will see HP resell CenturyLink’s cloud services to its partners as part of the HP PartnerOne programme.

As part of the deal HP customers will have access to the full range of CenturyLink services, which are built using HP technology, including managed hosting, colocation, storage, big data and cloud.

“CenturyLink solutions, powered by HP, provide compelling value for organizations seeking hybrid IT solutions,” said James Parker, senior vice president, partner, financial and international, at CenturyLink. “CenturyLink complements the HP portfolio with a breadth of hybrid solutions for enterprises, offering customers the ability to choose the services that make the most sense today, while retaining the flexibility to evolve as business demands shift.”

HP said the move will help CenturyLink expand its reach to new customers, while HP exploits new opportunities to build hybrid cloud solutions for existing customers.

“As businesses map out a path to the cloud, they need flexibility in how they consume and leverage IT services,” said Eric Koach, vice president of sales, Enterprise Group, central region, HP.

“HP cloud, software and infrastructure solutions help CenturyLink and HP enable clients to build, manage and secure a cloud environment aligned with their strategy, across infrastructure, information and critical applications,” Koach said.

Since splitting up, HP has divided its partner programmes into the PartnerOne programme for service providers and the Helion PartnerOne programme, the latter of which largely includes service providers building solutions on top of OpenStack or Cloud Foundry.

CenturyLink adds clean cloud datacentre in Washington

CenturyLink has added a datacentre in Washington to its footprint

CenturyLink opened a datacentre in Moses Lake, Washington this week that is powered in part by hydroelectric energy.

The facility is powered in part by hydroelectric generators located on the nearby Columbia River, and because the local climate allows for significant use of free-air cooling (which is much less power-intensive than traditional cooling methods) the company said the datacentre has among the lowest power usage effectiveness (PUE) ratings in the industry.
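
For context, PUE is simply the total energy a facility draws divided by the energy delivered to the IT equipment, so a value close to 1.0 means very little is spent on cooling and power distribution. The figures below are made up purely to show the arithmetic:

```python
# PUE = total facility energy / IT equipment energy (figures are illustrative only)
total_facility_kwh = 1_150_000
it_equipment_kwh = 1_000_000

pue = total_facility_kwh / it_equipment_kwh
print(f"PUE = {pue:.2f}")   # 1.15 here: most of the power reaches the IT load
```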

“CenturyLink’s new low-cost power datacentre services provide many benefits to our customers, including a highly resilient solution coupled with power costs and efficiency metrics that rank among the best in the industry, and the facility serves as an excellent disaster recovery location,” said David Meredith, senior vice president, CenturyLink. “Enterprises enjoy global access to CenturyLink’s portfolio of cloud and managed hybrid IT services, and we continue to extend the reach of our data center footprint to new markets to meet the needs of our customers.”

The datacentre is being hosted by Server Farm Realty, a managed datacentre and colocation provider, and offers access to cloud, colocation, networking and managed services.

This is the second datacentre CenturyLink has added to its footprint in recent months. Two weeks ago the company announced a partnership with NextDC to broaden its datacentre footprint in Australia, and in March brought its cloud platform online in Singapore.

While most datacentres are located close to large metropolitan centres, Kelly Quinn, research manager with IDC, reckons CenturyLink’s latest datacentre could bring more attention to the region’s potential as a hub for other facilities.

“The central part of Washington state is one of the geographies in which I see substantial potential for further growth as a datacentre hub,” Quinn said.

“Its potential stems from the area’s abundance of natural, power-generating resources, and its relative immunity from natural disasters.”

“It also may offer customers who are ‘green’ conscious the ability to work with a provider that can satisfy their datacentre needs with renewable energy sources,” Quinn added.

CenturyLink partners with NextDC on Australian cloud expansion

CenturyLink is partnering with NextDC to bolster the reach of its cloud services in Australia

CenturyLink is expanding its cloud footprint in Australia this week, partnering with local datacentre incumbent NextDC to bolster its managed and hybrid cloud services in the region.

CenturyLink already has a datacentre presence in Australia, but the partnership announced this week will see it offer its managed hosting, colocation and cloud services from NextDC’s network of datacentres in Sydney, Melbourne, Brisbane, Canberra and Perth.

“We are eager to offer our managed hybrid IT services and consistent IT experience to multinational corporations in Australia, one of the most connected countries in the world,” said Gery Messer, managing director, Asia Pacific, at CenturyLink.

“The extension of CenturyLink’s datacentre footprint into Australia signifies our commitment to serve growing customer demand for IT services in the Asia-Pacific region,” Messer added.

Craig Scroggie, chief executive officer of NextDC commented: “NextDC’s agreement with CenturyLink continues the trend of the world’s top IT providers utilizing NextDC’s national datacentre network to provide services. CenturyLink is an important new member of our ecosystem of carriers, cloud and IT service providers, and its presence will essentially open up a world of new possibilities for Australian organizations on their journey to a hybrid cloud model.”

Like many American cloud incumbents, CenturyLink views APAC as a key market moving forward. Last month the company launched a cloud node in Singapore and last year it set up a datacentre in Shanghai, China, all in a move to meet growing demand for its services in the region.

CenturyLink acquires Orchestrate to strengthen DBaaS offering

CenturyLink has acquired Orchestrate to strengthen its database-as-a-service proposition

CenturyLink has acquired Orchestrate, a database-as-a-service provider specialising in delivering fully managed, high performance, fault tolerant NoSQL database technologies.

CenturyLink said that Orchestrate, which partners with AWS on public cloud hosting for its clients’ datasets, will help bolster its cloud-based database and managed services propositions.

“CenturyLink’s customers, like most enterprises, are expressing interest in solutions that help them meet the performance, scalability and agile development needs of large-scale big data analytics,” said Glen Post, chief executive officer and president of CenturyLink.

“The Orchestrate database service’s ease of use and ability to support multiple database technologies have emerged as key differentiators that we are eager to offer our customers through the CenturyLink Cloud platform,” Post said.

As for drivers of the acquisition, the company said growing use of the Internet of Things is creating more demand for fully managed NoSQL technologies. Orchestrate offers a managed service that abstracts away much of the underlying hardware and database-specific coding, delivering an API that enables developers to store and query JSON data easily.
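
The general shape of such a service is a key/value-plus-query interface over HTTP: put a JSON document into a collection, then fetch or search it later without running any database servers yourself. The base URL, paths and query syntax below are placeholders, not Orchestrate’s actual API:

```python
import requests

# Placeholder base URL and credentials: not Orchestrate's documented API,
# just the general shape of a managed JSON store-and-query service.
BASE = "https://dbaas.example.invalid/v0"
AUTH = ("API_KEY", "")

# Store a JSON document under a collection and key.
doc = {"name": "sensor-42", "reading": 21.7, "unit": "celsius"}
requests.put(f"{BASE}/devices/sensor-42", json=doc, auth=AUTH, timeout=10)

# Query the collection later, without managing any database servers.
resp = requests.get(
    f"{BASE}/devices", params={"query": "reading>20"}, auth=AUTH, timeout=10
)
print(resp.json())
```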

The acquisition will see the Orchestrate services team join CenturyLink’s product development and technology organisation, with Orchestrate co-founders Antony Falco and Ian Plosker as well as vice president of engineering Dave Smith joining the company.

“CenturyLink Cloud features one of the most sophisticated service infrastructures in the market, with a great interface and lots of options for managing complex workflow and third-party applications in the cloud,” Falco said. “Orchestrate’s database service takes the same approach to delivering cost efficiency and ease of use. Enterprise customers are increasingly expecting one global platform to provide these services.”

CenturyLink expands public cloud in APAC

CenturyLink is expanding its public cloud platform in Singapore

American telco CenturyLink has expanded the presence of its public cloud platform to Singapore in a bid to cater to growing regional demand for cloud services.

CenturyLink, which recently expanded its managed services presence in China and its private cloud services in Europe and the UK, is adding public cloud nodes to one of its Singapore datacentres.

“The launch of a CenturyLink Cloud node in Singapore further enhances our position as a leading managed hybrid IT provider for businesses with operations in the Asia-Pacific region,” said Gery Messer, CenturyLink managing director, Asia Pacific.

“We continue to invest in the high-growth Asia-Pacific region to meet increasing customer demand,” Messer said.

The company said it wants to cater to what it sees as growing demand for cloud services in the region, citing Frost & Sullivan figures that show the Asia-Pacific region spent almost $6.6bn on public cloud services last year. That firm predicts annual cloud services spending in the region will exceed $20bn by 2018.

The move also comes at a time when the Singapore Government is looking to invest more in both using cloud services and growing usage of cloud platforms in the region.

Last year the Infocomm Development Authority of Singapore (IDA) said it was working with Amazon Web Services to trial a data-as-a-service project that the organisations believe will help increase the visibility of privately held data sets.

The agency also signed a Memorandum of Intent with AWS that will see the cloud provider offer US$3,000 in usage credits to each of the first 25 companies to sign up to the pilot, to go towards the cost of hosting their dataset registries or datasets.

The agency has also announced similar partnerships in the past with Pivotal and Red Hat.