NEC and partners in Europe to develop converged cloud-based 5G network

A new consortium of technology vendors and academics is collaborating on a project to build future 5G mobile networks on superfluid (AKA cloud) principles.

The Superfluidity Project is part of the European H2020 5G Public-Private Partnership (5G PPP) initiative. The stated aim of the project is to define and develop a converged, cloud-based 5G virtual network and service platform. According to a statement, the network will be distributed over the mobile edge and core of 5G networks extending up to data centres.

The collaborators in the project include a number of top IT brands, including NEC, Citrix, Intel, OnApp and Red Hat. Telecoms equipment maker Alcatel-Lucent is to act as technical coordinator. Among the telcos taking part in the project are British Telecom, PT Inovação e Sistemas and Telefónica. A number of research and academic bodies across Europe are also joining the consortium, including CNIT (which will be project coordinator), Ben-Gurion University, the University of Liège, Dresden University of Technology and the University Politehnica of Bucharest.

There will also be input from a trio of small and medium-sized enterprises. EBlink, Telcaria Ideas and Unified Streaming will all help shape the development of the superfluid 5G network of the future.

The goal is to give the network a ‘superfluid’ set of properties: location, time, scale and hardware independence. These properties would be exemplified through unlimited growth potential, instant service migration and complete transparency of services.

The work, planned as a 30-month project, began on 1 July 2015.

If successful, the Superfluidity project will tackle crucial shortcomings in today’s networks and improve on the long and wasteful provisioning processes currently employed to meet demand on mobile operator networks, according to a statement from NEC.

Software market frustrating for enterprise users, says Gemalto research

Software licensing is still causing enterprises grief, according to new research from security firm Gemalto. The biggest pain points and causes of frustration are the inflexibility of licensing arrangements and unhelpful delivery options.

According to the State of Software Monetization report, software vendors must change if they’re to satisfy enterprise user demand. This means delivering software as a service and making it accessible across multiple devices, it concludes.

The disparity between customer demand and vendor supply has been created by shifting tastes among enterprise software customers. This is a function of the ‘bring your own device’ (BYOD) phenomenon, which has been partly created by intelligent device manufacturers and mobile phone makers. However, despite this demand for more flexibility, vendors have not been able to follow suit with a comparably flexible and adaptable approach to software licensing and packaging, the report says.

The most frequently voiced complaint, from 87% of the survey sample, concerned the cost of renewing and managing licenses. Almost as many (83%) complained about time needlessly wasted on unfriendly processes for renewing and managing licenses, and 82% cited time and costs lost to non-product-related development. Most of the survey sample (68%) said they had little idea of how the products they buy are being used in the enterprise.

Four out of five respondents believe that software needs to be future-proofed to be successful.

The report was compiled from feedback from 600 enterprise software users and 180 independent software vendors (ISVs) about the headaches of software licensing and packaging.

Software consumption is changing and customers only want to pay for what they use, according to Shlomo Weiss, Senior VP for Software Monetization at Gemalto. “Delivering software, in ways that customers want to consume it, is critical for creating a user experience that sells,” said Weiss.

GreenPages is Helping Dead River Become a Transformational Services Provider

David Widener is the Director of IT & Project Management at Dead River Company and is one of the most cutting-edge people we work with. Dead River Company is New England’s largest energy marketer, meaning it provides wholesale commercial and residential energy services in the form of oil, propane and, in some cases, natural gas. David is the most senior leader for both IT and Project Management, with 20 years of IT experience.

From an IT perspective, Dead River is just rounding the curve from being infrastructure blockers and tacklers to becoming the transformational services provider they know they need to be. David understands that, even as a traditional energy marketer, he needs to change the way he does business in order to gain a competitive advantage, including integrating new tools, new services and new processes. This falls directly in line with GreenPages’ launch of our Transformation Services Group.

Watch the video to hear David discuss the projects he has worked on with GreenPages, including Managed Services, Cloud Services, and Project Management initiatives, his experiences along the way, and his recommendations for peers who want to transform their IT departments in a similar fashion!

Watch the video on GreenPages’ YouTube Channel

We’ll be holding a webinar on 11/18 entitled, “Microsoft Office 365: Expectations vs. Reality.” Register now!

How to Run Microsoft Word on Mac

When it comes to productivity applications, Microsoft Word takes the cake. In fact, I’d even go so far as to say that Microsoft Word is more than a productivity app; it’s a utility. A necessity for anyone who owns a computer—and yes, even if you own a Mac. I use Microsoft Word every day. I used […]

How the boardroom is becoming the key driver of cloud adoption

Three in five (61%) cloud migration projects are driven by business leaders, according to the latest research from Rackspace and Vanson Bourne.

The study, which polled 250 IT decision makers and 250 business leaders throughout October, showed plenty of boardroom-centric concerns topping the list for driving cloud adoption. The most popular motivation for migrating to the cloud was reducing IT cost (61%), followed by increased organisational resilience (50%), improved security (38%) and increased agility (38%).

Yet the research also points to a disparity between the boardroom and the server room. Only a third (33%) of the IT decision makers polled said they are highly experienced with cloud-based infrastructure, with more than half admitting they looked to a third party for support.

Nothing wrong with that, of course, but Rackspace naturally argues that specialist support is key to freeing up employees’ time. With support, businesses were more able to focus on streamlining operations (49%) and making sure operational models fit (48%), while without support businesses focused on making sure they have the right tools (44%) and the right skills (42%).

More than half (58%) of those polled said their objectives had been completely met by moving to the cloud, while almost nine in ten (88%) said their organisation’s business goals had at least partly been met. Just over half (53%) of the companies surveyed had used third-party support.

“A move to the cloud is now an organisation-wide business activity rather than simply a function of the IT department,” said Darren Norfolk, Rackspace UK managing director. “Whether business leaders understand the practicalities of a cloud migration project or not, there appears to be broad acceptance that it is a ‘platform play’ that they can use to innovate and grow.

“Increased communication across all levels of the business will create new opportunities for the cloud to have a direct impact on the bottom line,” he added.

Recent research from the Cloud Industry Forum (CIF) found that overall cloud adoption rate in the UK is now at 84%, with almost four in five (78%) of cloud users having adopted two or more cloud services. Norfolk argues that currently “fewer questions are being asked but plenty is being delivered” in terms of adoption.

Ovum Cloud Security

Tim Jennings, an Ovum analyst, has argued that although there are many fears surrounding the security of the cloud, the increasing number of data breaches is more likely to push enterprises toward cloud services. This trend reflects the increasing maturity of the cloud environment. Jennings commented in a blog: “Given that data security and privacy concerns have been an inhibitor during the early stages of cloud adoption, it is somewhat ironic that the continued spate of high-profile customer data breaches is likely to push more enterprises toward cloud services. One can envisage, therefore, pointed conversations within boardrooms as CIOs and chief security officers are questioned about the likelihood of their organizations being the next to suffer reputational damage through the exposure of customer data. Many organizations will conclude that using the expertise of a third party is a more reliable approach than depending on in-house resources.”

To a certain extent, some degree of vulnerability will always remain. Jennings added, “Many have been like rabbits caught in the headlights, seemingly having little insight into the root cause of the failure, the extent of the consequences, or the actions required for remediation.”

Outsourcing to modern cloud providers appears to be the logical move. Cloud providers have invested heavily in security, covering everything from the physical security of a data center to encryption of customer data and advanced security intelligence.

While it is unrealistic for large companies to replicate these sophisticated cloud environments created by experts, adopting a public cloud environment is not always the safer option. As Jennings put it: “It may be that enterprises prefer to use either an on premise or virtual private cloud, while still taking advantage of a specialist provider’s management and security capabilities. Nor does it mean that the responsibility for security and customer data passes away from the enterprise—even though the delivery of these capabilities is in the hands of the third party, governance and control must be retained in-house.”

Network-Aware Orchestration: The Next Level of SD-WAN

Enterprise networks have become complex. They were designed and deployed to meet a specific set of business requirements at a specific point in time. Configuration modifications were rare, and manual or semi-automated processes together with strict change control procedures were enough to maintain reliability and consistent service levels across the organization.
Business needs have shifted dramatically. The adoption of cloud services, business application-focused requirements and evolving security policies require IT organizations to deploy configuration changes continuously. The common approach of either manually performing the necessary changes or simply replacing the complete device configuration, rebooting and hoping it works creates unacceptable risk and potential network interruptions. Enterprises are therefore looking for better ways to automate the management of their networks: leveraging existing capabilities to optimize performance and reducing operational risk through standardization and best-practice architectures.
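
To make the contrast concrete, here is a minimal Python sketch of the delta-based approach described above: compute only the settings that differ between the running and desired configurations and touch nothing else. The configuration keys and the config_delta helper are hypothetical illustrations, not any vendor's real API.

```python
# Hypothetical sketch: apply only the configuration delta rather than
# replacing the entire device configuration and rebooting.

def config_delta(running: dict, desired: dict) -> dict:
    """Return only the settings that must change."""
    return {k: v for k, v in desired.items() if running.get(k) != v}

running = {"mtu": 1500, "vlan": 10, "qos_profile": "default"}
desired = {"mtu": 9000, "vlan": 10, "qos_profile": "voice"}

# Only 'mtu' and 'qos_profile' are touched; 'vlan' is left alone,
# so the device never needs a full config replacement or reboot.
print(config_delta(running, desired))  # {'mtu': 9000, 'qos_profile': 'voice'}
```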

Linux Foundation wants to extend Swagger in connected buildings

Members of the Linux Foundation have met in San Francisco to push the newly announced Open API Initiative. The collective wants to harmonise efforts in the development of connected building technology.

Founding members of the Open API Initiative, including Google, IBM, Intuit, Microsoft and PayPal, want to extend the reach of Swagger, the popular framework for building application programming interfaces (APIs). Their collective ambition is to create a vendor-neutral, portable and open specification for providing metadata for APIs based on representational state transfer (REST).

A new open specification could let humans and computers discover and understand the capabilities of connected building services with minimal implementation logic. The Initiative will also promote the use of an open API standard.

Swagger, created in 2010 and released under an open source license a year later, is a description format used by developers in a broad range of industries. It is used to design and deliver APIs that support a variety of connected applications and services. Downloads of Swagger and Swagger tooling have tripled in the last year, as it has become the most popular open source framework for defining and creating RESTful APIs.
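
As a concrete illustration of what such a machine-readable description looks like (not an example from the article), here is a minimal Swagger 2.0 document expressed as a Python dict; the endpoint and fields are hypothetical, and real specifications are usually authored in YAML or JSON.

```python
import json

# A minimal, hypothetical Swagger 2.0 description of one REST endpoint.
spec = {
    "swagger": "2.0",
    "info": {"title": "Building Sensors API", "version": "1.0.0"},
    "paths": {
        "/sensors/{id}/temperature": {
            "get": {
                "summary": "Read a sensor's current temperature",
                "parameters": [{
                    "name": "id", "in": "path",
                    "required": True, "type": "string",
                }],
                "responses": {
                    "200": {"description": "Current reading in degrees C"}
                },
            }
        }
    },
}

# Both humans and tooling can consume this metadata to discover
# what the API offers without reading its implementation.
print(json.dumps(spec, indent=2))
```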

SmartBear recently acquired the Swagger API open source project from Reverb Technologies and is now working with its industry peers to ensure the specification and format can continue to evolve. The open governance model for the Open API Initiative includes a Technical Developer Committee (TDC) that will manage the specification and keep users abreast of developments.

“Swagger is considered one of the most popular frameworks for building APIs. When an open source project reaches this level of maturity, it just can’t be managed by one entity,” said Jim Zemlin, executive director at The Linux Foundation. “The Open API Initiative will extend this technology to advance connected application development through open standards.”

AWS announces UK will be its third region in the EU by 2017

Amazon Web Services (AWS) is to add a UK region to its empire. On its opening date, mooted for the end of 2016 or early 2017, it will be AWS’s third region in the European Union and its 12th in the world.

The presence of an AWS region brings lower latency and strong data sovereignty to local users.

Amazon organises its ‘elastic computing’ by hosting it in multiple locations worldwide, divided into regions and Availability Zones. Each region is a separate geographical area containing multiple, isolated locations known as Availability Zones. The rationale is to give each computing ‘instance’ (or user) instant local response backed by geographically diverse redundancy.
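
For readers who want to explore this region and Availability Zone structure programmatically, the sketch below uses boto3, the AWS SDK for Python; it assumes boto3 is installed and AWS credentials are configured.

```python
import boto3

# List all regions visible to this account, then the Availability
# Zones inside one region. Assumes AWS credentials are configured.
ec2 = boto3.client("ec2", region_name="eu-west-1")  # Dublin

regions = ec2.describe_regions()["Regions"]
print([r["RegionName"] for r in regions])

zones = ec2.describe_availability_zones()["AvailabilityZones"]
for z in zones:
    print(z["ZoneName"], z["State"])  # e.g. eu-west-1a available
```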

Announcing the new UK base on his blog, Amazon CTO Werner Vogels promised that Britain’s local and global enterprises, institutes and government departments will all get faster AWS cloud services than they have been getting. The new region will be coupled – for failover purposes – with the existing AWS regions in Dublin and Frankfurt. This local presence, says AWS, will provide lower-latency access to websites, mobile applications, games, SaaS applications, big data analysis and Internet of Things (IoT) apps.

“We are committed to our customers’ need for capacity,” said Vogels, who promised ‘powerful AWS services that eliminate the heavy lifting of the underlying IT infrastructure’.

The UK government’s Trade and Investment Minister Lord Maude described the decision as ‘great news for the UK’. The choice of the UK as the third European presence for AWS is “further proof the UK is the most favoured location in Europe for inward investment,” said Maude.

By providing commercial cloud services from data centres in the UK, AWS will create more healthy competition and innovation in the UK data centre market, according to HM Government Chief Technology Officer Liam Maxwell. “This is good news for the UK government given the significant amount of data we hold that needs to be kept onshore,” said Maxwell.

Yesterday, AWS evangelist Jeff Barr revealed in his blog that AWS will be opening a region in South Korea in early 2016, its fifth region in Asia Pacific.

EC calls for Safer Harbour agreement – issues new guidance

The European Commission has issued new guidance to companies on transatlantic data transfers and has called for a rapid creation of a new framework.

In October BCN reported how a ruling in the case of Schrems vs Data Protection Commissioner rendered the US-EU Safe Harbour Agreement invalid, after it was revealed that EU citizens’ data was being accessed by the US National Security Agency (NSA).

The Commission said it has stepped up talks with US authorities on a new framework and issued guidance to help companies comply with the ruling and work with alternative transfer tools.

“We need an agreement with our US partners in the next three months,” said EC Vice-President Andrus Ansip, who is responsible for the Digital Single Market. “The Commission has been asked to take swift action: this is what we are doing. Today we provide clear guidelines and we commit to a clear timeframe to conclude current negotiations.”

“Citizens need robust safeguards of their fundamental rights and businesses need clarity in the transition period,” said Commissioner Vera Jourová, adding that 4,000 companies currently rely on the transatlantic data pact.

The EC guidelines advise businesses on how data transfers can continue to be pursued in the interim period, covering issues such as standard contractual clauses, Binding Corporate Rules for intra-group transfers, derogations and the conclusion or performance of a contract. The guideline document, which is 7,981 words long, runs to 16 pages of challenging reading and is open to interpretation.

“As confirmed by the Article 29 Working Party, alternative tools authorising data flows can still be used by companies for lawful data transfers to third countries like the United States,” concludes the guidance document. “However, the Commission considers that a renewed and sound framework for transfers of personal data to the United States remains a key priority.”

Enforcement against non-compliance with the Safe Harbour court ruling comes into effect at the end of January 2016.