
European Commission to reform mobile cloud services regulations – report

The EC is looking to create a level playing field in how telcos and mobile cloud service providers are regulated


The European Commission is considering plans to reform how mobile cloud service providers, also known as Over The Top (OTT) companies, are regulated, according to a report in the Financial Times.

Draft documents unveiled by the commission indicate that the initiative to create a level playing field between the telecoms industry, cable operators and mobile cloud services such as WhatsApp and Skype has not been forgotten.

According to the Commission, telcos are currently forced to compete with OTT services “without being subject to the same regulatory regime”, and it intends to create a “fair and future-proof regulatory environment for all services”.

One of the main directives of the digital single market proposals advocated by the commission relates to the roll-out of superfast broadband infrastructure across the continent. With traditional revenue streams for telcos, such as calls and messaging, on the decline, operators frequently point the finger at OTT services for enabling free and wide-reaching services.

As a consequence, operators claim they lack the incentive to invest in overhauling increasingly dated copper network infrastructure, particularly around the last mile.

That said, telcos remain hesitant to give their competitors free access to high-speed broadband infrastructure if they aren’t able to suitably monetise the service, which is where net neutrality enters the picture. Aside from the debate raging in the US of late, net neutrality formed one of the cornerstones of Neelie Kroes’ digital single market proposals, along with the abolition of consumer roaming fees.

Last month, Telecoms.com reported that the European Union’s Telecoms Council effectively conceded that a U-turn on its net neutrality ambitions was on the cards. There has yet to be an update on whether the open letter signed by more than 100 MEPs has convinced the Council to steer clear of paid prioritisation of any kind.

It is believed the commission intends to unveil its new digital single market strategy on 6 May.

Taipei Computer Association, Government launch Big Data Alliance

TCA, government officials launching the Big Data Alliance in Taipei


The Taipei Computer Association and Taiwanese government-sponsored institutions have jointly launched the Big Data Alliance, aimed at driving the use of analytics and open data in academia, industry and the public sector.

The Alliance plans to drive the use of analytics and open data throughout industry and government to “transform and optimise services, and create business opportunities,” and hopes big data can be used to improve public policy – everything from financial management to transportation optimisation – and create a large commercial ecosystem for new applications.

The group also wants to help foster more big data skills among the domestic workforce, and plans to work with major local universities to train more data and information scientists. Alliance stakeholders include National Taiwan University and National Taiwan University of Science and Technology, as well as firms like IBM, Far EasTone Telecommunications and Asus, but any data owners, analysts and domain experts are free to join the Alliance.

Taiwanese universities have been fairly active in partnering with large incumbents to help accelerate the use of big data services. Last year National Cheng Kung University (NCKU) in southern Taiwan signed a memorandum of understanding with Japanese technology provider Fujitsu, which saw the two organisations partner to build out a big data analytics platform and nurture big data skills in academia.

NTT Com subsidiary RagingWire launches California datacentre

RagingWire claims the new facility gives it the largest datacentre in California


RagingWire Data Centers, a subsidiary of Japanese telecoms giant NTT Com, has cut the ribbon on its latest datacentre, known as California Sacramento 3 or CA3.

RagingWire is among a number of incumbents (Alibaba, Time Warner, Equinix) to have bolstered their cloud presence in the state of late.

The 180,000 square foot facility packs 14 megawatts of power and 70,000 square feet of server space. It is located on, and fully integrated with, the company’s 500,000 square foot datacentre campus in Sacramento, which includes two other datacentres (CA1 and CA2); combined, the company said, the campus creates the largest datacentre in the state of California (680,000 square feet).

“Today is a big day for RagingWire, our customers, and our partners,” said George Macricostas, founder, chairman, and chief executive of RagingWire Data Centers. “The CA3 data center is the next step in RagingWire’s U.S. expansion and a new component for the global data center portfolio of NTT Communications. CA3 is truly a world-class data center.”

The move marks another big expansion for NTT Com, which together with its subsidiaries operates over 130 datacentres globally. The company said the latest facility is aimed at companies operating in the Bay Area and Silicon Valley.

“RagingWire has been a strategic addition to the NTT Communications family of companies. We look forward to working with you to deliver information and communications technology solutions worldwide,” said Akira Arima, chief executive of NTT Communications.

Sinopec taps Alibaba for cloud, analytics services

Sinopec is working with Aliyun to roll out a series of cloud and big data services


Aliyun, Alibaba’s cloud services division, is working with China Petroleum & Chemical Corporation (Sinopec) to roll out a set of cloud-based services and big data technologies to enable the firm to improve its exploration and production operations.

In a statement to BCN the companies said they will work together to roll out a “shared platform for building-based business systems, big data analytics” and other IT services tailored to the petroleum industry.

“We hope to be able to use Alibaba’s technology and experience in dealing with large-scale system architecture, multi-service data sharing, data applications in the large-scale petrochemical, oil and chemical industry operations,” Sinopec said.

The two companies also plan to explore the role of cloud and big data in connected vehicles.

Just last month Aliyun opened its first overseas datacentre in Silicon Valley, a move the Chinese e-commerce giant said will bolster its appeal to Chinese multinational companies.

The company has already firmed up partnerships with large multinationals including PayPal and Dutch electronics giant Philips. The company has five datacentres in China.

It would seem a number of large oil and gas firms have begun to warm to the cloud of late. Earlier this week Anadarko Petroleum Corporation announced it had signed a five-year deal that will see the firm roll out PetroDE’s cloud-based oil and gas field evaluation analytics service.

Equinix announces sixth London datacentre

Equinix has announced five new datacentres globally in the past month


Datacentre giant Equinix has announced the launch of its sixth London-based International Business Exchange (IBX) datacentre.

Equinix said the datacentre, LD6, will offer customers the ability to leverage its cloud interconnection service – which lets users create private network links to Microsoft Azure, Amazon Web Services (AWS) and Google Cloud services among others.

The company said the $79m facility, which is located in Slough, is extremely energy efficient (LEED Gold-accredited), and utilises mass air cooling technology with indirect heat exchange and 100 per cent natural ventilation.

It measures 236,000 square feet (roughly 22,000 square metres) and has capacity for 1,385 cabinets, with the ability to add another 1,385 in phase two of the facility’s development. Once phase two is complete, the Equinix London Slough campus will provide more than 388,000 square feet (36,000 square metres) of colocation space interconnected by more than a thousand dark fibre links.

“LD6 is one of the most technically advanced datacentres in the UK. It has been designed to ensure that we can continue to provide state-of-the-art colocation for our current and future customers,” said Russell Poole, managing director, Equinix UK. “This latest addition to our thriving London campus sets new standards in efficiency and sustainability.”

The facility is among five new datacentres announced last month. Equinix announced plans in March to roll out new state-of-the-art datacentres in New York, Singapore, Melbourne and Toronto.

Google boosts cloud-based big data services

Google is bolstering its big data services


Google announced a series of big data service updates to its cloud platform this week in a bid to strengthen its growing portfolio of data services.

The company announced the beta launch of Google Cloud Dataflow, a Java-based service that lets users build, deploy and run data processing pipelines for applications such as ETL, analytics, real-time computation and process orchestration, while abstracting away infrastructure concerns like cluster management.

The service is integrated with Google’s monitoring tools and the company said it’s built from the ground up for fault-tolerance.

“We’ve been tackling challenging big data problems for more than a decade and are well aware of the difference that simple yet powerful data processing tools make. We have translated our experience from MapReduce, FlumeJava, and MillWheel into a single product, Google Cloud Dataflow,” the company explained in a recent blog post.

“It’s designed to reduce operational overhead and make programming and data analysis your only job, whether you’re a data scientist, data analyst or data-centric software developer. Along with other Google Cloud Platform big data services, Cloud Dataflow embodies the kind of highly productive and fully managed services designed to use big data, the cloud way.”
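The pipeline model described above can be illustrated with a toy sketch in plain Python. This is not the actual Cloud Dataflow SDK; the `Pipeline` class and `apply`/`run` names here are invented for illustration. The idea is simply that users compose transforms into a chain, and the service handles execution:

```python
from typing import Callable, Iterable


class Pipeline:
    """Toy pipeline: a source plus a chain of lazily applied transforms."""

    def __init__(self, source: Iterable):
        self._source = source
        self._transforms: list = []

    def apply(self, fn: Callable[[Iterable], Iterable]) -> "Pipeline":
        self._transforms.append(fn)
        return self

    def run(self) -> list:
        data = self._source
        for fn in self._transforms:
            data = fn(data)
        return list(data)


# A small ETL-style flow: parse strings to ints, then drop negatives.
result = (Pipeline(["3", "7", "-1", "12"])
          .apply(lambda xs: (int(x) for x in xs))       # transform
          .apply(lambda xs: (x for x in xs if x >= 0))  # filter
          .run())
print(result)  # [3, 7, 12]
```

In the real service the transforms run on managed workers rather than in-process, which is where the promised fault tolerance and monitoring integration come in.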

The company also added a number of features to BigQuery, Google’s cloud SQL service: row-level permissioning for data protection, improved performance (the ingestion limit has been raised to 100,000 rows per second), and availability in Europe.
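Row-level permissioning can be pictured with a small sketch (a toy illustration only, not BigQuery’s actual mechanism or API): each row carries an access list, and a query returns only the rows the calling user is entitled to see.

```python
# Hypothetical table where each row records who may read it.
rows = [
    {"region": "EU", "revenue": 100, "allowed": {"alice", "bob"}},
    {"region": "US", "revenue": 250, "allowed": {"alice"}},
]


def query(user: str, table: list) -> list:
    """Return the rows visible to `user`, minus the access-control column."""
    return [{k: v for k, v in row.items() if k != "allowed"}
            for row in table if user in row["allowed"]]


print(query("bob", rows))    # [{'region': 'EU', 'revenue': 100}]
print(query("alice", rows))  # both rows
```

The point is that the filter is enforced by the service at query time, so consumers never have to see (or be trusted with) rows outside their entitlement.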

Google has largely focused its attention on other areas of the stack of late. The company has been driving its container scheduling and deployment initiative Kubernetes quite hard, as well as its hybrid cloud initiatives (Mirantis, VMware). It also recently introduced a log analysis service for Google Cloud and App Engine users.

IBM makes cyber threat data available as a cloud security service

IBM is launching a cybersecurity cloud service


IBM has unveiled a cloud-based cybersecurity service which includes hundreds of terabytes of raw aggregated threat intelligence data, which can be expanded upon by users who sign up to the service.

At about 700TB, IBM’s X-Force Exchange service is being pitched by the firm as one of the largest and most complete catalogues of cybersecurity vulnerability data in the world.

The threat information is based on over 25 billion web pages and images collected from a network of over 270 million endpoints, and will also include real-time data provided by others on the service (so effectively, the more people join, the more robust the service gets).
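That crowd-sourcing effect can be sketched in a few lines (invented for illustration, not IBM’s X-Force Exchange API): each participant contributes a feed of threat indicators, and corroboration across contributors raises confidence in an indicator.

```python
def merge_feeds(feeds: list) -> dict:
    """Count how many contributors reported each threat indicator."""
    sightings: dict = {}
    for feed in feeds:
        for indicator in feed:
            sightings[indicator] = sightings.get(indicator, 0) + 1
    return sightings


# Two hypothetical contributors; one indicator is corroborated by both.
feed_a = {"203.0.113.9", "evil.example.com"}
feed_b = {"203.0.113.9"}
merged = merge_feeds([feed_a, feed_b])  # '203.0.113.9' seen twice
```

A real exchange would weight contributors, age out stale indicators and attach context, but the network effect is the same: each new member makes every indicator’s corroboration count more meaningful.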

“The IBM X-Force Exchange platform will foster collaboration on a scale necessary to counter the rapidly rising and sophisticated threats that companies are facing from cybercriminals,” said Brendan Hannigan, general manager, IBM Security.

“We’re taking the lead by opening up our own deep and global network of cyberthreat research, customers, technologies and experts. By inviting the industry to join our efforts and share their own intelligence, we’re aiming to accelerate the formation of the networks and relationships we need to fight hackers,” Hannigan said.

Last year IBM made a number of acquisitions to bolster end-point and cloud security (CrossIdeas, Lighthouse), and adding cyber threat detection to the mix creates a nicely rounded security portfolio. But the move also puts it in direct competition with a wide range of managed security service providers that have been playing in this space for years and going after the same verticals (oil & gas, financial services, retail, media, etc.), so it will be interesting to see how IBM differentiates itself.

NCR offers cloud control for Android-based ATMs

NCR says Kalpana can nearly halve the time it takes to deploy new services


NCR has announced a radical new approach to ATM network deployment, with a cloud-based enterprise application allowing banks to control and manage thin-client devices running a locked-down version of the Android operating system.

Called Kalpana, the software can result in a 40 per cent reduction in cost of ownership and halves the time it takes to develop and deploy new services, the company claims.

Robert Johnson, global director of software solutions at NCR, said banks are under pressure to improve services and reduce costs and so are “making very careful technical choices”. With increased emphasis on digital mobile and internet banking the ATM channel “has started looking a little disconnected”, he added.

ATM architecture development has been relatively static over the past 10 to 15 years, and while the devices have become more sophisticated in terms of features such as cash deposit and recycling, colour screens and online chat capabilities, essentially they are all customised PCs, creating problems of management, security and cost.

According to Andy Monahan, vice president of software engineering and general manager for Kalpana at NCR, the recent migration from Microsoft Windows XP to Windows 7 “has forced a rethink” and a few years ago the company decided that Monahan’s team at NCR’s Global R&D centre in Dundee, Scotland, should “take a blank sheet of paper and ask ‘what should we build next’”.

“There are two main discussions that we have with CIOs,” says Monahan. “One is whether there is a viable alternative to Windows and the second is about the fact that the banks have a fairly clear idea of the IT architecture they want, and they want it to be consistent.”

The choice of operating system was relatively straightforward, he said. “When you look across the spectrum of embedded operating systems Android is pretty standard.”

By having a thin-client, Android-based device “everything that is customised is removed from the device and taken into the enterprise”, from where it can be configured and managed remotely. It also dramatically improves security. “By removing everything from the ATM thin-client and taking it into the cloud you create a locked-down environment that is very secure: there’s no BIOS and there’s no hard drives, just a secure boot loader that validates the kernel and checks all the certificates,” said Monahan.
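The boot flow Monahan describes can be reduced to a minimal sketch: a loader that validates the kernel image against a trusted digest before starting it. Real secure boot verifies certificate-signed images down a chain of trust; a bare hash comparison stands in for that here, purely for illustration.

```python
import hashlib

# Digest of the known-good kernel, provisioned from the enterprise side.
TRUSTED_DIGEST = hashlib.sha256(b"known-good kernel image").hexdigest()


def secure_boot(kernel_image: bytes) -> bool:
    """Boot only if the kernel image matches the trusted digest."""
    return hashlib.sha256(kernel_image).hexdigest() == TRUSTED_DIGEST


print(secure_boot(b"known-good kernel image"))  # True
print(secure_boot(b"tampered kernel image"))    # False
```

Because the thin client holds nothing but the loader and the trust anchor, any tampering with the image simply fails verification and the device never starts.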

By having the management systems in the enterprise software stack, it is easier to develop new applications and services alongside other channels such as mobile and internet, ensuring a faster time-to-market and a more consistent customer experience, as well as simplifying management and maintenance tasks.

The first ATM to work in the Kalpana environment is NCR’s new Cx110, which uses Android tablet technology. Cardtronics, the world’s largest retail ATM owner/operator, has already taken delivery of the Kalpana software and Cx110 ATMs and plans to pilot them at locations in the Dallas-Fort Worth area, beginning “in the next few weeks”.

Ericsson, Eindhoven University drive connected car partnership

Ericsson has been pushing its connected car platform the past couple of years


Swedish infrastructure giant Ericsson has announced a new partnership with Eindhoven University of Technology focused on advancing the intelligent capabilities of automotive vehicles, starting with a solar-powered connected car, reports Telecoms.com.

The car, which will compete in a 3,000km race from Darwin to Adelaide in Australia as part of Solar Team Eindhoven, will be fully solar-powered, and Ericsson will be looking to drive intelligence in the vehicle based on the Connected Traffic Cloud platform it announced at Mobile World Congress in March.

Connected Traffic Cloud is a managed service capable of sharing two-way data between connected cars and road traffic authorities. In the context of the World Solar Challenge, Ericsson will be looking to aggregate car, traffic and weather data, conduct in-depth analytics and maximise the energy and power consumption efficiency of the vehicle.
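The kind of aggregation described might look like the following sketch: car, traffic and weather readings are combined into a single record an analytics layer could act on. The formula is a made-up toy model for illustration, not Ericsson’s Connected Traffic Cloud.

```python
def aggregate(car: dict, traffic: dict, weather: dict) -> dict:
    """Fuse car, traffic and weather readings into one record (toy model)."""
    # Congestion and headwind both push predicted energy use up.
    wh_per_km = (car["base_wh_per_km"]
                 * (1 + traffic["congestion"])           # 0.0 = free-flowing
                 * (1 + 0.02 * max(weather["headwind_ms"], 0)))
    return {"speed_kmh": car["speed_kmh"],
            "predicted_wh_per_km": round(wh_per_km, 1)}


reading = aggregate({"speed_kmh": 80, "base_wh_per_km": 16.0},
                    {"congestion": 0.25},
                    {"headwind_ms": 5.0})
print(reading)  # {'speed_kmh': 80, 'predicted_wh_per_km': 22.0}
```

In the solar challenge setting, feeding such fused records into a planner is what would let the team trade speed against the remaining battery and forecast sunlight.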

Announced at Mobile World Congress earlier this year, Ericsson at the time said connected cars and road authorities utilising the platform will benefit from enhanced road safety, improved traffic flow and vehicle performance. The company has previously partnered on similar initiatives with Volvo, the Swedish Transport Administration (Trafikverket) and the Norwegian Public Roads Administration (Statens Vegvesen).

Orvar Hurtig, head of industry and society at Ericsson, said real-time data analysis is the key to driving more intelligent road networks.

“Mobile connectivity is increasingly a must-have feature in cars, thanks to both consumer demand for infotainment and a wide range of regulatory initiatives that aim to increase road safety,” he said. “As a result, vehicles are becoming a major source of data that could be used to improve road traffic authorities’ ability to manage traffic and prevent avoidable accidents. Connected Traffic Cloud is the means by which that data could be shared.”


Gartner: IoT requires new architecture strategy

The IoT will require a new architectural approach, Gartner claims


Enterprise IT professionals and architects will need to develop new architectures in order to help mitigate the technological, legal and reputational risks associated with delivering Internet of Things services, Gartner claims.

It has been suggested by some industry specialists that the Internet of Things has the potential to cripple existing datacentre infrastructure from a technical perspective, and could also create new or heightened risks around data security and regulatory compliance.

That said, Gartner believes developing the right architecture to handle the wealth of data generated by IoT sensors will be key to ensuring infrastructure can keep pace with new services being rolled out, and will help mitigate other, non-technical risks.

Mike Walker, research director at Gartner, said enterprises need to understand not just the opportunities this wealth of information can generate, but the risks as well. He said the anticipated data growth may require organisations to develop new competencies around regulatory compliance, and to reassess the impact of security breaches on corporate reputation.

“Enterprise architects need to determine the potential impact, both positive and negative, of IoT technologies and then create actionable deliverables that can define which business opportunities should be pursued as a result,” Walker said.

“The first step is bringing together various business unit and IT leaders to explore how the IoT can impact their respective business domains, and agree on actionable business scenarios that will require deep collaboration between them.”

Walker suggested that organisations create internal competency centres to help coordinate activities across internal stakeholders.

“Organisations must understand the profound impact new sources of information will have. Enterprise architects are best positioned to discuss and enable the most lucrative opportunities in partnership with business unit and IT leaders. At the same time, they must work with chief data officers and security officers to structure this data in a way that mitigates the worst risks of pursuing these opportunities,” Walker added.