Category Archives: Networks

Ericsson claims a world first with transcontinental 5G trial

Ericsson, Deutsche Telekom and SK Telecom have announced a partnership to deploy the world’s first transcontinental 5G trial network, reports Telecoms.com.

The objective of the agreement is to deliver consistent quality of service and roaming experiences for advanced 5G use cases with enhanced global reach. Ericsson will act as the sole supplier to the project, which will include technologies such as NFV, software-defined infrastructure, distributed cloud and network slicing.

Last October, Ericsson and SK Telecom conducted a successful demonstration of network slicing technology, which featured the creation of virtual network slices optimized for services including super multi-view and augmented reality/virtual reality, Internet of Things offerings and enterprise solutions.

“5G is very different from its predecessors in that the system is built as a platform to provide tailored services optimized for individual customer’s needs, at a global scale,” said Alex Jinsung Choi, CTO at SK Telecom. “Through this three-party collaboration, we will be able to better understand and build a 5G system that can provide consistent and enhanced user experience across the globe.”

Alongside the announcement, Ericsson and SK Telecom also successfully completed a demonstration of 5G software-defined telecommunications infrastructure, using the vendor’s Hyperscale Datacenter System (HDS) 8000 solution. The pair claims this is a world first which will enable dynamic composition of network components to meet the scale requirements of 5G services.

Software-defined telecommunications infrastructure is one of the enablers of network slicing, which will allow operators to create individual virtualized environments optimized for specific users. The demonstration itself focused on two use cases: ultra-micro-network end-to-end (E2E) slicing for personalized services, and ultra-large-network E2E slicing for high-capacity processing.
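
To make the idea concrete, the toy sketch below models how an operator might describe those two slice types as resource profiles. It is purely illustrative: the class and the numbers are hypothetical, and real slices are defined through vendor orchestration systems rather than application code.

```python
# Illustrative sketch only: a toy model of the two slice types from the
# demonstration. All names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class NetworkSlice:
    name: str
    max_latency_ms: float      # end-to-end latency budget
    min_throughput_mbps: float
    isolated: bool             # whether the slice gets dedicated resources

# Ultra-micro slice: a small, personalised slice per user or service
ar_vr_slice = NetworkSlice("ar-vr-personal", max_latency_ms=10.0,
                           min_throughput_mbps=100.0, isolated=True)

# Ultra-large slice: high-capacity processing shared across many users
iot_bulk_slice = NetworkSlice("iot-bulk", max_latency_ms=500.0,
                              min_throughput_mbps=10.0, isolated=False)

for s in (ar_vr_slice, iot_bulk_slice):
    print(f"{s.name}: <= {s.max_latency_ms} ms, >= {s.min_throughput_mbps} Mbps")
```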

“SDTI is an innovative technology that enhances network efficiency by flexibly constructing hardware components to satisfy the infrastructure performance requirements of diverse 5G services,” said Park Jin-hyo, Head of Network Technology R&D Center at SK Telecom.

Finally, Ericsson has announced another partnership with Japanese telco KDDI with the ambition of delivering IoT on a global scale and providing enhanced connectivity services to KDDI customers.

The partnership will focus on Ericsson’s cloud-based IoT platform to deliver services such as IoT connectivity management, subscription management, network connectivity administration and flexible billing services. The pair claims the new proposition will enable KDDI’s customers to deploy, manage and scale IoT connected devices and applications globally.
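
As a rough illustration of what connectivity management means in practice, the sketch below shows what activating a device subscription against a cloud IoT platform’s REST API might look like. The endpoint, fields and credentials are entirely hypothetical and do not describe Ericsson’s actual platform API.

```python
# Purely illustrative: a hypothetical connectivity-management call.
# Endpoint, payload fields and token are invented for this sketch.
import requests

BASE_URL = "https://iot-platform.example.com/api/v1"  # hypothetical
TOKEN = "replace-with-real-credentials"

def activate_subscription(iccid: str) -> dict:
    """Move a device SIM from a provisioned to an active billing state."""
    resp = requests.post(
        f"{BASE_URL}/subscriptions/{iccid}/activate",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"ratePlan": "global-roaming-basic"},  # hypothetical field
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(activate_subscription("8944500000000000000"))
```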

IoT represents a significant opportunity for enterprise customers and operators alike, as it significantly increases both the amount of data available and the number of access points to customers worldwide. Research firm Statista estimates the number of connected devices worldwide could exceed 50 billion, though definitions of what counts as a connected device, or an IoT-connected device, vary.

“KDDI has for a long time been committed to building the communication environment to connect with world operators in order to support the global businesses of our customers,” said Keiichi Mori, GM of KDDI’s IoT Business Development Division. “We believe that by adopting DCP, we will be able to leverage Ericsson’s connection with world carriers and furthermore promote our unified service deployment globally to customers as they start worldwide IoT deployments.”

What does Clinton have in store for the tech industry?

Hillary Clinton has recently released her campaign promises for the technology sector should she be elected as President Obama’s successor in November, reports Telecoms.com.

The technology agenda covered a broad and varied set of issues within the technology industry, including the digital jobs front, universal high-speed internet for the US, data transmission across jurisdictions, technological innovation and the adoption of technology in government. Although the statement does indicate a strong stance on moving technology to the top of the political agenda, there does seem to be an element of ‘buzzword chasing’ to gain the support of the country’s tech giants.

“Today’s dynamic and competitive global economy demands an ambitious national commitment to technology, innovation and entrepreneurship,” the statement read. “America led the world in the internet revolution, and, today, technology and the internet are transforming nearly every sector of our economy—from manufacturing and transportation, to energy and healthcare.”

But what did we learn about America’s technology future?

Focus on 5G and new technologies

One of the more prominent buzzwords through the beginning of 2016 has been 5G, which is seemingly the go-to phrase for the majority of new product launches and marketing campaigns. The Clinton camp has aligned itself with the buzz by committing to deploying 5G networks (no timeframe given), as well as opening up opportunities for a variety of next-generation technologies.

“Widely deployed 5G networks, and new unlicensed and shared spectrum technologies, are essential platforms that will support the Internet of Things, smart factories, driverless cars, and much more—developments with enormous potential to create jobs and improve people’s lives,” the statement said.

The deployment of 5G has been split into two separate areas. Firstly, use of the spectrum will be reviewed with the intention of identifying underutilized bands, including those reserved for the government, and reallocating them to speed deployment. Secondly, government research grants will be awarded to various vendors to advance wireless and data technologies directed towards social priorities including healthcare, the environment, public safety and social welfare.

A recent report from Ovum suggested the US is on the right track for the deployment of 5G, as its analysts believe it will be one of the leading countries for the technology. Ovum predicts there will be at least 24 million 5G subscribers by the end of 2021, of which 40% (roughly 9.6 million) will be located in North America.

Data transmission between the US and EU

From a data transmission perspective, the Clinton team seemingly takes offence at the European Court of Justice’s decision to strike down Safe Harbour, and at the varied reception for the EU-US Privacy Shield. The Clinton team appears to be under the assumption that the deal between the EU and US was struck down for economic reasons, as opposed to data protection concerns.

“The power of the internet is in part its global nature. Yet increasing numbers of countries have closed off their digital borders or are insisting on “data localization” to attempt to maintain control or unfairly advantage their own companies,” the statement said. “When Hillary was Secretary of State, the United States led the world in safeguarding the free flow of information including through the adoption by the OECD countries of the first Internet Policymaking Principles.

“Hillary supports efforts such as the U.S.-EU Privacy Shield to find alignment in national data privacy laws and protect data movement across borders. And she will promote the free flow of information in international fora.”

While it could be considered encouraging that the Clinton team’s mission is to open up the channels between the two regions again, it does seem to have missed the point of why the agreement was shot down in the first place. The statement seemingly implies EU countries refused the agreement on the grounds of promoting the interests of their own companies, as opposed to privacy concerns and the US attitude to government agencies’ access to personal information.

Safe Harbour, the initial transatlantic agreement, was shot down last October, and its proposed successor has come under similar criticism. Only last month the European Data Protection Supervisor, Giovanni Buttarelli, outlined concerns over whether the proposed agreement will provide adequate protection against indiscriminate surveillance, as well as over its obligations on oversight, transparency, redress and data protection rights.

“I appreciate the efforts made to develop a solution to replace Safe Harbour but the Privacy Shield as it stands is not robust enough to withstand future legal scrutiny before the Court,” said Buttarelli. “Significant improvements are needed should the European Commission wish to adopt an adequacy decision, to respect the essence of key data protection principles with particular regard to necessity, proportionality and redress mechanisms. Moreover, it’s time to develop a longer term solution in the transatlantic dialogue.”

The Clinton team can continue to discuss changes to transatlantic data transmission policy should it choose, but positive moves are highly unlikely until it gets to grips with the basic concerns of EU policymakers.

Access to government data

Currently certain offices and data sets are accessible to the general public, and this is an area which would be expanded under a Clinton administration. The concept is a sound one: giving entrepreneurs and businesses access to the data could show how money could be saved or used more efficiently, or how new technologies could improve the effectiveness of government. There could, however, be a downside.

“The Obama Administration broke new ground in making open and machine-readable the default for new government information, launching Data.gov and charging each agency with maintaining data as a valuable asset,” the statement said. “Hillary will continue and accelerate the Administration’s open data initiatives, including in areas such as health care, education, and criminal justice.”

The downside has the potential to ruin any politician: the programme opens the door to criticism from all sides and will offer ammunition to any opposition.

Connecting American Citizens

One of the most prominent points of the document was the country’s commitment to ensuring each household and business has the opportunity to be connected to high-speed broadband. While this could be considered an effective sound-bite for the party, it is not a new idea by any means. A recent report highlighted that a surprising number of Americans do not currently have access to broadband. Although it may be expected that rural communities would struggle at times, the report indicated 27% of New York and 25% of Los Angeles respectively would be classed in the “Urban Broadband Unconnected” category, which could be considered more unusual.

The Connect America Fund, the Rural Utilities Service Program and the Broadband Technology Opportunities Program are all well-established operations (the Rural Utilities Service has been around since 1935), and drums which previous presidents have also banged. Clinton has said very little new here and has made little fresh commitment to the initiatives.

The team has, however, committed to a $25 billion Infrastructure Bank which would enable local authorities to apply for grants to make improvements. This is a new concept which Clinton plans to introduce, though the details of how it will be funded, what the criteria for application will be, and whether there are any stipulations on which vendors the money can be spent with, are not laid out.

5G will be commercially ready by 2021 – Ovum

Analyst firm Ovum has released its inaugural 5G Subscription Forecasts this week, estimating there will be 24 million 5G subscriptions worldwide at the end of 2021, reports Telecoms.com.

The team at Ovum believes 5G commercial services will be a normalized aspect of the telco landscape by 2020, though the market will be dominated by North America and Asia, which will account for as much as 80% of global 5G subscriptions by 2021. Europe will take only 10% of the pie, with the Middle East and Africa splitting the remaining 10%.

While 5G could be considered an advertising buzzword within the telco industry on the whole, the need for speed has been driven primarily by advancements in user device capabilities. Claims to be the first vendor to achieve 5G status are not uncommon; Nokia was the latest to make such a statement, claiming its 5G network stack, which combines radio access technology with a cloud-based packet core running on top of an AirFrame data centre platform, is the foundation of a commercial 5G architecture. This in itself is a bold claim, as 5G standards are yet to be fully ratified.

“The main use case for 5G through 2021 will be enhanced mobile broadband services, although fixed broadband services will also be supported, especially in the US,” said Mike Roberts, Ovum Practice Leader for carrier strategy and technology. “Over time 5G will support a host of use cases including Internet of Things and mission-critical communications, but Ovum does not believe those use cases will be supported by standardized 5G services through 2021.”

While there have been announcements claiming organizations are 5G-ready, a number of these have been excluded from the research. Numerous operators have claimed they will launch 5G prior to the 2020 date stated in the research, though these services will generally be deployed on networks and devices which don’t comply with 5G standards. Such examples are seemingly nothing more than advertising ploys playing on the excitement around 5G. Ovum defines a 5G subscription as an active connection to a 5G network via a 5G device, with 5G further defined as a system based on and complying with 3GPP 5G standards.

In the UK, Ofcom has stated 5G services could be commercially available by 2020, though this claim has not been backed up by the team at Ovum. While it has not directly commented on the state of play in the UK, Ovum believes the majority of subscribers will be located in the US, Japan, China and South Korea, countries where operators have set more aggressive deadlines for the introduction of 5G services.

One area which has not been cleared up is the future spectrum requirements of 5G. Use cases for the high-speed networks have already been outlined, ranging from scientific research to financial trading and weather monitoring, though how the supply of spectrum will be split between immediate and future needs is still unknown.

“5G must deliver a further step change in the capacity of wireless networks, over and above that currently being delivered by 4G,” said Steve Unger, Group Director at Ofcom, on the regulator’s website. “No network has infinite capacity, but we need to move closer to the ideal of there always being sufficient capacity to meet consumers’ needs.”

BT beefs up Cloud of Clouds security with Palo Alto Wildfire

BT is to install a cloud-based breach prevention security system from Palo Alto Networks (PAN) into its global Cloud of Clouds, writes Telecoms.com.

Under the terms of a new BT-PAN agreement, BT’s existing Assure Managed Firewall service will now include PAN’s cloud-based WildFire malware prevention system, described as a ‘key component’ of the security specialist’s Next-Generation Security Platform. In a statement, BT said the installation is part of a long term plan to roll out stronger protection for its cloud-based applications as it aims to encourage enterprise customers to benefit from its Cloud of Clouds.

Though enterprises are keen to embark on digital transformation, security concerns continue to hold them back, according to Mark Hughes, CEO of BT Security. The most obviously profitable use cases for cloud computing, such as big-data analytics and access to more cloud-based applications, are the very attractions that cyber criminals are most likely to target. In the rush to provide greater levels of security, telcos and cloud service providers face an investment protection challenge, Hughes said.

While enterprise customers need to access these applications quickly and securely, they must also find future-proof tools that can grow with the cloud and won’t have to be expensively replaced in a few years’ time. “Enterprises need security that can protect them against targeted and increasingly sophisticated cyber-attacks. They need a tougher lining around their cloud services,” said Hughes.

Palo Alto Networks will provide intelligent security monitoring and management services to the BT Assure portfolio of security services. The Palo Alto Networks Next-Generation Security Platform includes a Next-Generation Firewall, Advanced Endpoint Protection and a Threat Intelligence Cloud.

BT recently won two new cloud service contracts with the European Commission worth £24 million, bringing its total of EC contract wins to four in 12 months, BCN reported in January. With data security an increasingly sensitive issue for the EC (as reported in BCN), BT has taken on a challenging brief to provide public and private cloud services across 52 major European institutions, agencies and bodies.

Spotify shifts all music from data centres to Google Cloud

Music streaming service Spotify has announced that it is to switch its approach to storing tunes for customers and is copying all the music from its data centres onto Google’s Cloud Platform.

In a blog post, Spotify’s VP of Engineering & Infrastructure Nicholas Harteau explained that though the company’s data centres had served it well, the cloud is now sufficiently mature to surpass the level of quality, performance and cost Spotify got from owning its infrastructure. Spotify will now get its platform infrastructure from Google Cloud Platform ‘everywhere’, Harteau revealed.

“This is a big deal,” he said. Though Spotify has taken a traditional approach to delivering its music streams, it no longer feels it needs to buy or lease data-centre space, server hardware and networking gear to guarantee being as close to its customers as possible, according to Harteau.

“Like good engineers, we asked ourselves: do we really need to do all this stuff? For a long time the answer was yes. Recently that balance has shifted,” he said.

Operating data centres has been a painful necessity for Spotify since it began in 2008, because it was the only way to guarantee the quality, performance and cost of its service. These days, however, the storage, computing and network services available from cloud providers are as high quality, high performance and low cost as anything Spotify could create from the traditional ownership model, said Harteau.

Harteau explained why Spotify preferred Google’s cloud service to that of runaway market leader Amazon Web Services (AWS). The decision was shaped by Spotify’s experience with Google’s data platform and tools. “Good infrastructure isn’t just about keeping things up and running, it’s about making all of our teams more efficient and more effective, and Google’s data stack does that for us in spades,” he continued.

Harteau cited Dataproc’s batch processing, event delivery with Pub/Sub and the ‘nearly magical’ capacity of BigQuery as the three most persuasive features of Google’s cloud service offering.
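
For a flavour of the event-delivery piece, the snippet below publishes a small event with the google-cloud-pubsub Python client. This is a minimal sketch, not Spotify’s code; the project and topic names are placeholders.

```python
# Minimal sketch of event delivery with Google Cloud Pub/Sub, using the
# modern google-cloud-pubsub client. Project and topic are placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "playback-events")

# Publish a small playback event; the payload must be a bytestring, and
# extra keyword arguments become string-valued message attributes.
future = publisher.publish(
    topic_path,
    b'{"track_id": "abc123", "event": "play"}',
    origin="example-service",
)
print(f"Published message id: {future.result()}")
```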

IBM launches Swift answer to Lambda at Interconnect 2016

IBM has unveiled a raft of new announcements today at Interconnect 2016, its largest ever cloud event. The rally, in Las Vegas, attracted 25,000 clients, partners and developers, who were briefed on new partnerships with VMware, Bitly, Gigster and GitHub, on IBM’s work with Apple’s Swift language and Watson APIs, and on a new platform, Bluemix OpenWhisk.

Bluemix OpenWhisk is IBM’s answer to Amazon Web Services’ event-driven system Lambda, which allows developers to create automated responses to events when certain conditions are met. Automated responses have become a critical area for public cloud service providers, and BCN recently reported how Google launched Google Cloud Functions to match the AWS offering. All these systems aim to give developers a way to program responses without needing to implement integration-related changes in the architecture, but IBM claims OpenWhisk is the only one whose underlying code will be available under an open-source licence on the code publishing site GitHub.
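
An OpenWhisk action is simply a function that receives an event’s parameters and returns a result. Actions were initially written in JavaScript or Swift, with more runtimes (including Python) added later; for consistency with the other sketches here, a minimal Python-style action looks like this:

```python
# A minimal OpenWhisk-style action: the platform invokes main() with a
# dict of event parameters and expects a dict back. The "name" parameter
# is just an example.
def main(params):
    name = params.get("name", "world")
    return {"greeting": f"Hello, {name}!"}
```

Once deployed (for example with `wsk action create hello hello.py`), the action can be bound to a trigger via a rule so that it fires automatically whenever a matching event arrives.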

By allowing all users open access to inspect the code, IBM says it can inspire greater levels of developer collaboration. IBM said OpenWhisk is highly customisable, through either web services or commands, and can be adapted to company requirements rather than being an inflexible cloud service.

OpenWhisk will work with both the server-side JavaScript framework and Apple’s increasingly popular Swift programming language. With a range of application programming interfaces (APIs) IBM claims the OpenWhisk service will have greater flexibility than the rival services from Google and AWS.

In a statement IBM explained the next phase of its plan to bring Swift to the cloud, with a preview of a Swift runtime and a Swift Package Catalog to help developers create apps for the enterprise. The new Swift runtime builds on the Swift Sandbox IBM launched in December, and allows developers to write applications in Swift in the cloud and create continuous integration (CI) and continuous delivery (CD) pipelines that run apps written in Swift in production on the IBM public cloud.

IBM also announced a new option for users to run GitHub Enterprise on top of IBM Bluemix and in a company’s own data centre infrastructure.

In another announcement IBM gave details of a new partnership with VMware aimed at helping enterprises take better advantage of the cloud’s speed and economics. A new working arrangement means enterprise customers will find it easier to extend their existing workloads from their on-premises software-defined data centre to the cloud. The partnership gives IBM users the option to run VMware computing, storage and networking workloads on top of the IBM cloud. The new level of integration applies to vSphere, Virtual SAN, NSX, vCenter and vRealize Automation. In addition the IBM cloud is now part of the vCloud Air Network from VMware, and the two partners will jointly sell hybrid cloud services.

HPE scoops two telco client wins for cloud service projects

Hewlett Packard Enterprise (HPE) has announced partnerships with telcos Swisscom and Telecom Italia subsidiary Telecom Personal to share its cloud service expertise and boost its presence in the comms industry.

In the Swisscom project HPE’s brief is to impose a network function virtualization (NFV) discipline on the IT and telecoms infrastructure, using its OpenNFV systems. Swisscom claims it is one of the world’s first communication service providers (CSPs) to pioneer the use of NFV to offer virtual customer premise equipment (vCPE) to its business customers.

In January BCN reported that HPE had launched an initiative to simplify hybrid cloud management for telcos using a new Service Director offering. Among the productivity benefits mooted for HPE Service Director 1.0 were options for pre-configured systems addressing specific use cases as extensions to the base product, starting with HPE Service Director for vCPE 1.0.

In the Swisscom project HPE will use its HPE Virtual Services Router and HPE Technology Services in tandem with Service Director to create Swisscom’s new vCPE model. The objective is to allow Swisscom to manage its customers’ network infrastructure from a centralised location and provide networking services on demand. This will cut costs for the telco, speed up service provision and boost the availability of services. It could also, claims HPE, make it easier to create new services in future.

Argentina-based Telecom Personal has asked HPE to modernise its network in order to use 4G/LTE technologies to cater for an increasing appetite for data services among subscribers. HPE has been appointed to re-engineer the infrastructure and to expand and upgrade part of its network core. The success of the project will be judged on whether HPE can deliver a measurable improvement in service experience, network speeds and capacity, according to Paolo Perfetti, Telecom Personal’s CTO.

Yesterday BCN reported that HPE has launched AppPulse Trace, a service that developer clients can use to monitor the performance of their cloud apps.

Oracle gets tax breaks to build cloud campus in Texas

Oracle has unveiled plans for a technology campus in Austin, Texas in a bid to expand its workforce by 50% in three years. By creating a combined office and housing complex, it is looking for millennials who want to work and live on site while selling cloud computing systems.

Oracle is also to close its Oregon offices and incorporate those facilities into the new Texas complex. No details were given about staff relocation.

The move is part of a state initiative, including tax breaks and low regulation, as Texas positions itself as a home for innovation and technology. “I will continue to pursue policies that invite the expansion and relocation of tech companies to Texas,” said Texas State Governor Greg Abbott.

The site will include cheap accommodation as Oracle competes for talent in a region with a high concentration of technology start-ups. Its recruitment drive will be aimed at graduates and technical professionals at early stages in their career with the majority of new jobs being created in Oracle’s cloud sales organisation, Oracle Direct.

Oracle is to work with local firms in building the campus, the plans for which include the consolidation of Oracle’s facilities in Oregon. In the first phase it will build a 560,000 square foot complex on the waterfront of Austin’s Lady Bird Lake. It is also building a housing complex next door, with 295 apartments, for employee housing.

Austin’s technology community is teeming with creative and innovative thinkers and the town is a natural choice for investment and growth, claimed Oracle Direct’s Senior VP Scott Armour. “Our campus will inspire, support and attract top talent, with a special focus on the needs of millennials,” said Armour.

Austin’s biggest problems are affordability and mobility, according to Austin’s Mayor Steve Adler. “I look forward to working with Oracle to tackle our biggest challenges,” he said.

Cisco boosts SDN range with ACI update

Cisco claims that customers can take a further step towards network automation with the launch of a new release of its Application Centric Infrastructure (ACI) software, part of its software-defined networking range.

Despite massive demand, only 5% of networks are automated, according to Cisco’s own customer feedback. In response it has moved to simplify the task by making it easier to address all the various autonomous segments of a complicated network infrastructure.

The new software revision of ACI makes it capable of microsegmentation of both physical (i.e. bare metal) applications and virtualized applications, which are separated from the hardware by virtual operating systems such as VMware VDS and Microsoft Hyper-V. By extending ACI across multi-site environments it will enable cloud operators and network managers to devise policy-driven automation of multiple data centres.

In addition, Cisco claimed it has paved the way for integration with Docker containers through its contributions to open source. This, it said, means customers can get a consistent policy model and have more options to choose from when using the Cisco Application Policy Infrastructure Controller (APIC).
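
As a hedged sketch of what policy-driven automation looks like at the API level, the snippet below authenticates against an APIC controller and posts a minimal tenant object over its REST interface. The host, credentials and simplified payload are placeholders; production deployments would typically use Cisco’s SDKs and richer policy objects.

```python
# Illustrative sketch: driving ACI policy through the APIC REST API with
# plain HTTP. Controller address and credentials are placeholders.
import requests

APIC = "https://apic.example.com"  # placeholder controller address
session = requests.Session()

# Authenticate; APIC returns a session cookie used on subsequent calls.
login = {"aaaUser": {"attributes": {"name": "admin", "pwd": "secret"}}}
session.post(f"{APIC}/api/aaaLogin.json", json=login,
             verify=False).raise_for_status()

# Create (or update) a tenant, the root container for ACI policy.
tenant = {"fvTenant": {"attributes": {"name": "demo-tenant"}}}
resp = session.post(f"{APIC}/api/mo/uni.json", json=tenant)
resp.raise_for_status()
print(resp.json())
```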

ACI now supports automated service insertion for any third-party service running between layers four and seven of the network stack, it said. More support will be put behind cloud automation tools like VMware vRealize Automation and OpenStack, including open standards-based OpFlex support with Open vSwitch (OVS).

The ACI ecosystem now makes the automation of entire application suites possible, including Platform as a Service (PaaS) and Software as a Service (SaaS), and there are now over 5,000 ACI-ready Nexus 9000 customers using Cisco’s open platform, it said.

“Customers tell me that only five to ten percent of their networks are automated today,” said Soni Jiandani, SVP at Cisco. Though customers are eager to adopt comprehensive automation for their networks and network services through a single pane of management, they haven’t managed it yet. However, since several ACI customers have achieved full automation, this could be the next step, said Jiandani.

IBM open-sources machine learning SystemML

IBM is aiming to popularise its proprietary machine learning program SystemML through open-source communities.

Announcing the decision to share the system’s source code on the company blog, IBM’s Analytics VP Rob Thomas said application developers are in need of a good translator. This was a reference to the huge challenges developers face when combining information from different sources into data-heavy applications running on a variety of computers, said Thomas. It is also a reference to the transformation of a little-used proprietary IBM system into a popular, widely adopted artificial intelligence tool for the big data market. The vehicle for this transformation, according to Thomas, will be the open-source community.

IBM claims SystemML is now freely available to share and modify through the Apache Software Foundation open-source organisation. Apache, which manages 150 open-source projects, represents the first step to widespread adoption, Thomas said. The new Apache Incubator project will be code-named Apache SystemML.

The machine learning platform originally came out of IBM’s Almaden research lab ten years ago, when IBM was looking for ways to simplify the creation of customized machine-learning software, Thomas said. Now that it is in the public domain, it could be used by a developer of cloud-based services to create risk-modeling and fraud prevention software for the financial services industry, Thomas said.

The current version of SystemML could work well with the Apache Spark project, Thomas said, since Spark is designed for processing large amounts of data streaming in from continuous sources such as monitors and smartphones. SystemML will save companies valuable time by allowing developers to write a single machine learning algorithm and automatically scale it up using the open-source data analytics tools Spark and Hadoop.

MLlib, the machine learning library for Spark, provides developers with a rich set of machine learning algorithms, according to Thomas, and SystemML enables developers to translate those algorithms so they can easily digest different kinds of data and run on different kinds of computers.
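
The sketch below illustrates that write-once, scale-anywhere idea using the Python MLContext API that later shipped with Apache SystemML (exact names varied across versions, so treat this as indicative rather than definitive). The DML script expresses the algorithm once; SystemML decides how to execute it, locally or across a Spark cluster.

```python
# Hedged sketch of SystemML's MLContext API with PySpark. The DML script
# is the algorithm; SystemML plans its execution without the author
# rewriting it for different cluster sizes.
from pyspark.sql import SparkSession
from systemml import MLContext, dml

spark = SparkSession.builder.appName("systemml-demo").getOrCreate()
ml = MLContext(spark)

# A tiny declarative ML (DML) script: generate a random matrix, sum it.
script = dml("""
    X = rand(rows=1000, cols=100)
    s = sum(X)
""").output("s")

result = ml.execute(script)
print("sum:", result.get("s"))
```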

“We believe that Apache Spark is the most important new open-source project in a decade. We’re embedding Spark into our Analytics and Commerce platforms, offering Spark as a service on IBM Cloud, and putting more than 3,500 IBM researchers and developers to work on Spark-related projects,” said Thomas.

While other tech companies have open-sourced machine learning technologies, these are generally niche, specialised tools for training neural networks. IBM aims to popularise machine learning within Spark and Hadoop, whose ubiquity will be critical in the long run, said Thomas.