CenturyLink partners with NextDC on Australian cloud expansion

CenturyLink is partnering with NextDC to bolster the reach of its cloud services in Australia

CenturyLink is expanding its cloud footprint in Australia this week, partnering with local datacentre incumbent NextDC to extend its managed and hybrid cloud services across the region.

CenturyLink already has a datacentre presence in Australia, but the partnership announced this week will see CenturyLink offer its managed hosting, colocation and cloud services from NextDC’s network of datacentres in Sydney, Melbourne, Brisbane, Canberra and Perth.

“We are eager to offer our managed hybrid IT services and consistent IT experience to multinational corporations in Australia, one of the most connected countries in the world,” said Gery Messer, managing director, Asia Pacific, at CenturyLink.

“The extension of CenturyLink’s datacentre footprint into Australia signifies our commitment to serve growing customer demand for IT services in the Asia-Pacific region,” Messer added.

Craig Scroggie, chief executive officer of NextDC, commented: “NextDC’s agreement with CenturyLink continues the trend of the world’s top IT providers utilizing NextDC’s national datacentre network to provide services. CenturyLink is an important new member of our ecosystem of carriers, cloud and IT service providers, and its presence will essentially open up a world of new possibilities for Australian organizations on their journey to a hybrid cloud model.”

Like many American cloud incumbents, CenturyLink views APAC as a key market. Last month the company launched a cloud node in Singapore, and last year it set up a datacentre in Shanghai, China, all in a bid to meet growing demand for its services in the region.

Sharing Options in Parallels Desktop for Mac

Guest blog by Manoj Dhanasekar, Parallels Support Team. Parallels Desktop® for Mac provides close integration between your Mac OS X and Windows virtual machines. You should have Parallels Tools installed in a virtual machine in order to take advantage of this integration, and also make sure the Isolate Mac from Windows option is disabled in the […]

The post Sharing Options in Parallels Desktop for Mac appeared first on Parallels Blog.

Dr. Martens and Parallels 2X Remote Application Server (RAS) Team Up

Dr. Martens logo courtesy of Wikipedia under Fair Use. Parallels® announced today that global clothing and footwear brand Dr. Martens has selected Parallels 2X® Remote Application Server (RAS), parallels.com/ras, to deliver applications to its employees. Dr. Martens uses Parallels 2X RAS to provide secure access to core business applications from multiple locations through multiple devices, […]

The post Dr. Martens and Parallels 2X Remote Application Server (RAS) Team Up appeared first on Parallels Blog.

RTC Client for Lync From @GENBAND | @ThingsExpo [#IoT #WebRTC]

GENBAND introduced its Real Time Communications (RTC) Client for Lync to seamlessly combine real-time communications with Lync Instant Messaging (IM) and Presence.
“We’re shaking up the economics of delivering Unified Communications (UC) and offering a compelling way to integrate previously bespoke communications technologies,” said Carl Baptiste, GENBAND’s Senior Vice President, Enterprise Solutions. “We’re offering enterprises the best of both worlds by combining our own high-availability voice, video and collaboration with Lync’s IM and Presence, creating a single, web-centric client. Our RTC Client for Lync is delivered like a web page but operates like a traditional desktop or mobile application. More importantly, it looks familiar to Lync users so they don’t need to be retrained and IT organizations aren’t burdened with the cost and complexity of expanding Lync infrastructure to support real-time communications.”

Former Tesco IT lead: Don’t narrow standardization to just technology

Enterprise technologists need to consider standardisation, orchestration and automation as these extend to business processes

Moving to the cloud means focusing not only on orchestration and standardisation at the technical level but also on change at the business process level, explained Tomas Kadlec, who until recently served as group infrastructure IT director at Tesco.

Kadlec, who was speaking at the Telco Cloud Forum in London this week, said technologists within enterprises too often narrow their focus to the technologies needed to solve a problem or supply a need.

“In cloud everyone’s always talking about automation and orchestration, and as a technologist and an enthusiast within IT you always focus on the IT element,” he said. “So you go down and automate it, deploy orchestration, and make it brilliant. But suddenly you realize this one step is just that, one step in a much larger process.”

Kadlec was most recently responsible for Tesco’s IT infrastructure strategy globally, and has spent the better part of the past few years building a private cloud deployment model the company could easily drop into regional datacentres to power the troubled European food retailer’s operations locally and beyond.

This was done largely to improve the services it provides to clients and colleagues within the company’s brick and mortar shops, and support a growing range of internal applications and digital services.

He explained that as cloud services become more prominent within organisations, technologists need to start addressing standardisation, automation and orchestration at the business process and organisational level – which is where many of the challenges really lie. Doing so will help sell these technologies to other areas of the business outside IT, and help accelerate positive change in these organisations.

“How do you actually bring people together? How do you make sure the Korean team isn’t thinking of moving to Windows 8 or something while the UK team doesn’t even have the capability? It’s about simplification of the landscape, speaking the same language,” he said.

“So as an IT pro there is an element of having to go back to the customers and bridging that gap between needs and capabilities. Standardization is not just technical, it’s really about mindset and organizational change,” he added.

Telefónica: Unifying cloud and comms can make us more efficient

Juan Manuel Moreno, global cloud director at Telefónica

Telefónica is working to roll out a wide range of internally focused and customer-facing cloud services on both its own UNICA and OpenStack platforms, but the company’s global cloud director Juan Manuel Moreno said the challenges of shifting such a large company aren’t purely technical.

Moreno, who was speaking at the Telco Cloud Forum in London Tuesday, said the company has been in the process of transforming its technology platforms and internal operations for some time.

It’s no secret telcos like Telefónica have in recent years worked to strengthen their cloud capabilities in a bid to broaden their service portfolios and battle dwindling voice and data revenues – and to regain ground lost to more nimble OTT players.

“We are reliant on the full ecosystem of providers of technology… using technology partners to build these new services. But it’s a challenge because these players are moving so quickly,” Moreno said. “One of the biggest challenges in this space is that everyone and everything is moving so quickly.”

Much of what the company has done externally is intricately linked with its efforts to virtualise core elements of its own networks, plans it originally set out in 2014 with the aim of virtualising 30 per cent of all new infrastructure by 2016. The company has taken many of those platforms and used them as the foundation for its customer-facing cloud services.

“Unifying communications and cloud or IT is an opportunity to be more efficient, to discover more services, and of course to develop a broad portfolio of services,” he explained.

But he was fairly candid about the challenges involved with moving Telefónica, a large, global service provider with very heterogeneous technology platforms and business processes, over to a more automated, cloud-centric operational model and product portfolio.

He said overcoming the business process-related challenges would also be key if the company is to effectively capture more revenue from its traditional base as well as new segments (he said Telefónica sees SMBs as the largest commercial opportunity for the foreseeable future).

“In cloud the technology is available for everyone now… But we still have to transform internally in order to really strengthen our market position,” he added.

Windows Server 2003 end of life upgrade: Why many companies are leaving it “quite late”

(c)iStock.com/unclejoe

Everyone occasionally likes a comfortable pair of shoes. Businesses, when forced to upgrade their legacy tech, are no different. UK-based cloud provider Exponential-e has sounded a dire warning to enterprises hoping to cling on to their Windows Server 2003 deployments as the 100-day countdown to support switch-off begins at Microsoft towers.

The arguments against keeping Windows Server 2003, for which support will be switched off on July 14, are straightforward: it’ll be a hacker’s dream, there will be huge compliance headaches, and so on. Yet according to Exponential-e, almost two thirds (63%) of respondents to its research said they did not know what the shutdown meant for them or their business.

More than a third (35%) admitted they did not feel fully prepared to deal with the end of life process, while only 16% of organisations said they were looking at a cloud-based solution for future deployments.

Nick East is CEO of hybrid IT provider Zynstra, which offers a Windows Server 2003 migration path. He says, based on his experiences with customers, plenty have moved over in good time, but he’s not surprised by the results.

“If I talk to other IT service providers and other technology providers in the industry, large and small, there’s a fairly consistent message: many organisations are leaving this quite late,” he tells CloudTech. “With a relatively short amount of time to go, lots of organisations are still running Windows Server 2003, and in fact in some cases not really yet having made a detailed enough assessment of the extent of the impact of the WS2003 platform on critical parts of their business.”

East notes “it’s never too late to do something”, and in some cases migration has been completed in less than four weeks. It all depends on what data and applications you have to move over. The Zynstra boss describes a “divide and conquer” approach in which a customer had an easy migration path for everything bar one application, which was instead kept running on Server 2003 in an isolated virtual machine.
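
The “divide and conquer” triage East describes can be sketched in a few lines. This is a hypothetical illustration, not Zynstra’s actual process: the application names and the `has_migration_path` flag are invented for the example.

```python
# Hypothetical triage of a Windows Server 2003 estate: migrate everything
# with a clean path now, and quarantine the stragglers on an isolated VM.
apps = [
    {"name": "email", "has_migration_path": True},
    {"name": "file-share", "has_migration_path": True},
    {"name": "legacy-erp", "has_migration_path": False},  # the one holdout
]

# Applications with a migration path move off Server 2003 immediately.
migrate_now = [a["name"] for a in apps if a["has_migration_path"]]

# The rest stay on Server 2003, but fenced off in a virtual machine.
keep_on_isolated_vm = [a["name"] for a in apps if not a["has_migration_path"]]

print(migrate_now)          # ['email', 'file-share']
print(keep_on_isolated_vm)  # ['legacy-erp']
```

The point of the split is that one awkward application no longer holds the whole migration hostage, which is how a four-week timescale becomes plausible.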

“One thing that is clear…is the most expensive upgrade strategy is the one that you do in crisis,” he explains. “It’s expensive because you’ve done it after the event, so you really don’t have any options.

“It’s also expensive in the long term because even if you have an Elastoplast approach to the process, then you end up paying for that much further down the line as well.”

Regular readers of sister publication Enterprise AppsTech will have spotted this trend before with Windows XP. The end of life for XP was slated for April 8 2014, and it was obvious many firms were unprepared. One unnamed tech firm told this correspondent at the time that it had only moved off XP with weeks to spare, describing it as ‘do as we say, not as we do.’ It was chaos; as a result, Microsoft announced an extension to its antimalware support until July 14 2015. Even so, XP today accounts for almost 17% of device share.

East “sympathises” with Microsoft, yet argues that despite the importance of the software lifecycle, the industry could do better. “This challenge of keeping systems up to date, whether it’s features, bug fixes or security patches, is a standard part of the software industry,” he says. “I think that the software industry should do a better job of making that transparent to customers.”

However, his message is clear to companies who haven’t weighed up their options yet. “Although there may still be time, it may be too late, [but] you can’t bury your head in the sand,” he says. “You’ve just got to get on and build the right IT strategy. The sooner you start, the sooner you’ll know the impact, and the less likely you’ll be to face a crisis upgrade.”

Containers in the Cloud By @Nimbix | @CloudExpo [#Cloud]

What’s hot in today’s cloud computing world? Containers are fast becoming a viable alternative to virtualization for the right use cases. But to understand why containers can be a better option, we need to first understand their origins.
In basic terms, containers are application-centric environments that help isolate and run workloads far more efficiently than the traditional hypervisor technology found in commodity cloud Infrastructure as a Service.

Modern operating systems (Linux, Windows, etc.) are made up of two basic parts: kernel space and user space. As its name implies, kernel space is home to the operating system kernel – the low-level instructions that boot the machine, control hardware, provide subsystems (e.g. networking and storage), and schedule tasks. Tasks (processes, threads, etc.) run in user space, which is home to applications and services. Different operating systems have different levels of modularity and different functional splits between kernel space and user space, but most architectures are conceptually the same.

While hypervisors run virtual machines comprising both spaces, containers encapsulate just the user space, greatly reducing complexity and redundancy. The immediate benefit is higher performance and less “bloat”, which is extremely important to the economics of cloud computing. The popularity of containers is a direct result of the realization that hypervisor-based technologies are expensive to host and manage for many types of applications.

Locking Down Super-Users for Maximum Cybersecurity By @Centrify | @CloudExpo [#Cloud]

Privileged Identity Management (PIM) is the lowest common denominator in today’s most treacherous corporate and governmental security breaches. Or more accurately: Privilege Mismanagement. Sony, Target, Anthem, JP Morgan Chase, the city of San Francisco and many others succumbed to the reality that the identity of a single super-user account can be subverted for the purposes of manipulating sensitive organizational data, correspondence, commercial goods and intellectual property.

Internet of Things in the Kitchen By @BobGourley | @ThingsExpo [#IoT]

Who needs to go out for dinner, when you have a tireless master chef waiting at home to prepare a gourmet meal for you, any time, day or night? That’s the premise behind a new invention from Moley Robotics, a tech company based in London. Moley unveiled their new “robochef” prototype on April 14, 2015 at the Hannover Messe technology fair in Germany.

Using hands produced by Shadow Robot Company, the robot chef is capable of producing meals from a collection of over 2,000 recipes, using natural motions learned from Tim Anderson, a human chef and winner of the BBC’s MasterChef competition in 2011. Eventually, home chefs will be able to use a motion capture system to “teach” the robot new recipes, or download them from an iTunes-like service.
