All posts by Jamie Davies

BT and Daisy announce £70mn partnership

BT has announced a £70 million partnership with Daisy Group which will give the latter’s customers access to BT’s Wholesale Hosted Centrex (WHC) platform, reports Telecoms.com.

Daisy’s customers will be migrated onto the platform over the next 18 months, giving them access to cloud-based unified communications services including cloud call recording, HD voice, call analytics and web collaboration.

“Many businesses are now hosting their communication services using cloud technology to make them accessible to all, using any fixed or mobile device, at any time, wherever they might be,” said Gerry McQuade, CEO of BT Wholesale and Ventures. “BT and Daisy Group have been pioneers of that trend, so I’m delighted that we’re coming together to bring customers a powerful combination of experience, scale and expertise.

“We believe the rapid pace of change will continue over the coming years, and we’re looking forward to helping both Daisy and BT customers reap the benefits that change will bring.”

The Cloud of Clouds initiative launched by BT has been one of the cornerstones of its enterprise business strategy for some time. Last month, Oracle and BT announced a new partnership which allows customers to use features of the BT Cloud Connect environment to gain direct connectivity to the Oracle Cloud.

The relationship between the two companies is a long-standing one, and was extended in 2011 when the pair announced a strategic partnership allowing BT to sell wholesale calls, Ethernet and broadband products to Daisy’s customers. As part of that initial partnership, Daisy became a third-party supplier of PBX telephone system maintenance and engineering services to BT.

“We are committed to supporting our customers and partners as the business digitisation journey continues to unfold,” said Neil Muller, CEO of Daisy Group. “This collaboration with BT ensures that we are at the forefront of providing the latest in cloud solutions, increasing customers’ levels of capability and confidence as they continue to manage the relentlessness of technological change. I am hugely proud of Daisy’s relationship with BT and this is a perfect opportunity to further enhance our capability and provide our customers and partners with an industry leading cloud solution.”

Image recognition startup joins Google in France

Google has continued its charge into the artificial intelligence market by purchasing French image recognition startup Moodstocks, reports Telecoms.com.

Moodstocks, founded in 2008, develops machine learning-based image recognition technology for smartphones, which has been described by its developers as the ‘Shazam for images’. The financial terms of the agreement have not been disclosed.

“Ever since we started Moodstocks, our dream has been to give eyes to machines by turning cameras into smart sensors able to make sense of their surroundings,” Moodstocks said on its website. “Today, we’re thrilled to announce that we’ve reached an agreement to join forces with Google in order to deploy our work at scale. We expect the acquisition to be completed in the next few weeks.”

Artificial intelligence is one of the focal points of Google’s strategy moving forward, as confirmed by CEO Sundar Pichai during the company’s recent earnings call, though the focus can be traced back to the $625 million DeepMind acquisition in 2014. Although DeepMind is arguably the most advanced AI system in the industry, and Telecoms.com readers recently named Google the leader in the AI segment in a poll, the company has seemingly been playing catch-up with the likes of Watson and AWS, whose offerings have been in the public eye for a substantially longer period of time.

The recognition tools are most likely to be incorporated into the Android operating system, though Moodstocks customers will be able to continue using the service until the end of their subscriptions. Moodstocks will be folded into Google’s R&D centre in France, where the team will work alongside engineers focusing on the development of YouTube and Chrome, two offerings where there could be a link to the Moodstocks technology.

“Many Google services use machine learning to make them simpler and more useful in everyday life – such as Google Translate, Smart Reply in Inbox, or the Google app,” said Vincent Simonet, head of the R&D centre at Google’s French unit. “We have made great strides in visual recognition: you can now search Google Photos for terms such as ‘party’ or ‘beach’ and the application will surface relevant pictures without you ever having needed to categorise them manually.”

Last month, Google also announced it was expanding its machine learning research team by opening a dedicated office in Zurich. The team will focus on three areas specifically: machine intelligence, natural language processing and understanding, and machine perception.

Elsewhere in the industry, Twitter completed the acquisition of Magic Pony last month, reportedly for $150 million. Magic Pony, which offers visual processing technology, was one of the more public moves made by the social media network, whose relative quietness on AI could be seen as unusual given the platform lends itself well to the technology. Microsoft also announced the purchase of Wand Labs, building on the ‘Conversation-as-a-Platform’ proposition put forward by CEO Satya Nadella at Build 2016.

Equinix makes $874m data centre deal to keep EC happy

Equinix has announced the sale of eight data centres across Europe to Digital Realty Trust for approximately $874 million, reports Telecoms.com.

The sale forms part of a trade-off with competition authorities over Equinix’s acquisition of Telecity, which was completed in January. For that acquisition to be approved by the European Commission, eight data centres had to be relinquished by Equinix, which have now been confirmed as:

Recently acquired Telecity assets:

  • Bonnington House (London)
  • Sovereign House (London)
  • Meridian Gate (London)
  • Oliver’s Yard (London)
  • Science Park (Amsterdam)
  • Amstel Business Park I (Amsterdam)
  • Lyonerstrasse (Frankfurt)

Existing Equinix assets:

  • West Drayton data centre in London

The $3.8 billion acquisition of Telecity added 34 data centres to the Equinix portfolio, and more than doubled the company’s footprint in Europe. Equinix claims it is now the largest retail colocation provider in Europe and globally. Through the deal, Equinix opened up new markets in Dublin, Helsinki, Istanbul, Manchester, Sofia, Stockholm, and Warsaw, now totalling 145 IBX data centre facilities in 40 markets worldwide.

“Equinix’s acquisition of TelecityGroup added critical network and cloud density to better serve our global customers,” said Steve Smith, CEO at Equinix. “Completing this last milestone in the acquisition process paves the way for us to focus fully on helping our enterprise customers leverage our highly interconnected, global data centers for accelerated business performance and innovation.

“Additionally, the purchase of the Paris facilities is an important step in managing our real estate portfolio and ensuring we have the ability to add more capacity in this key market in the future.”

Oracle and Fujitsu partner up to tackle Japanese market

Oracle and Fujitsu have announced a partnership to deliver Oracle cloud application and platform services to Japanese customers, reports Telecoms.com.

As part of the agreement, Fujitsu will install Oracle Cloud services in its data centres in Japan and connect them to its Cloud Service K5 in order to deliver enterprise-grade cloud services. The first service to be connected will be Oracle’s Human Capital Management (HCM) Cloud, though the arrangement will extend further to include offerings such as the Database Cloud Service.

“In order to realize the full business potential of cloud computing, organizations need secure, reliable and high-performing cloud solutions,” said Edward Screven, Chief Corporate Architect at Oracle. “For over three decades, Oracle and Fujitsu have worked together using our combined R&D, product depth and global reach to create innovative solutions enabling customers to scale their organizations and achieve a competitive advantage. Oracle’s new strategic alliance with Fujitsu will allow companies in Japan to take advantage of an integrated cloud offering to support their transition to the cloud.”

In delivering the HCM solution first and foremost, Oracle is living up to its promise of targeting this aspect of the SaaS market. Back in March, when the team released its quarterly statement, CTO Larry Ellison took aim at Salesforce, mentioning the company six times in a relatively short statement. Oracle has targeted the HCM and Enterprise Resource Planning (ERP) SaaS markets, as it believes they are currently underserved.

“Oracle Fusion ERP is the overall market leader in the enterprise cloud ERP market. I should say we have more than 10 times the number of ERP customers than Workday. And ERP has always been a much larger market than CRM. Salesforce.com is missing all of that ERP market opportunity,” said Ellison during the earnings call. “And that in turn should make it easy for Oracle to pass Salesforce.com and become the largest SaaS and PaaS cloud company in the world.”

Widely regarded as a slow starter in the cloud market, Oracle now appears to be gathering pace through various acquisitions and partnerships. Considering the resources the company has at its disposal, it should not be seen as a surprise that Oracle is making strides in the industry.

Cyber security top of the list for European Commission after launch of €1.8bn initiative

The European Commission has launched a new public-private partnership aimed at tackling the challenges of cyber security, and helping European companies become more competitive, reports Telecoms.com.

As part of the partnership, the EC will invest roughly €450 million and will encourage industry to contribute the bulk of the remainder, targeting a total investment of €1.8 billion by 2020. The new initiative will take shape through four pillars.

Firstly, the EC will encourage member states to make the most of the cooperation mechanisms under the new Network and Information Security (NIS) Directive. Secondly, the EC will explore the possibility of creating a framework for certification of security products, which could then be distributed in any member state. Thirdly, the EC will establish a contractual public-private partnership with industry to nurture innovation. And finally, the team will create funds to enable SMEs to source investment and scale up.

“Europe needs high quality, affordable and interoperable cybersecurity products and services,” said Günther H. Oettinger, Commissioner for the Digital Economy and Society. “There is a major opportunity for our cybersecurity industry to compete in a fast-growing global market. We call on Member States and all cybersecurity bodies to strengthen cooperation and pool their knowledge, information and expertise to increase Europe’s cyber resilience. The milestone partnership on cybersecurity signed today with the industry is a major step.”

The new strategy builds on the EC’s ‘Open, Safe and Secure Cyberspace’ strategy, launched in 2013 to ‘protect open internet and online freedom and opportunity’. While that initiative produced a number of new legislative actions, there appears to be little evidence much else was achieved beyond ‘ensuring cooperation’, ‘ensuring a culture of security’ and ‘stepping up cooperation across Europe’. While previous work has been generalist and vague, the new proposition does at least offer encouragement that more concrete work will be achieved.

The NIS Directive will support strategic cooperation and the exchange of relevant information between member states, working alongside existing bodies including the EU Agency for Network and Information Security (ENISA), the EU Computer Emergency Response Team (CERT-EU) and the European Cybercrime Centre (EC3) at Europol. The plan is to deliver a blueprint during the first half of 2017, and then deliver the initiative itself over an undefined timeframe. The EC has outlined a specific plan, though the lack of a timeframe undermines some of the credibility gained.

“Without trust and security, there can be no Digital Single Market. Europe has to be ready to tackle cyber-threats that are increasingly sophisticated and do not recognise borders,” said Andrus Ansip, Vice-President for the Digital Single Market. “Today, we are proposing concrete measures to strengthen Europe’s resilience against such attacks and secure the capacity needed for building and expanding our digital economy.”

Bulgarian gov writes open source into law

The Bulgarian government has introduced a number of amendments to its Electronic Governance Act which require all code written for the government to be open source.

The announcement was made public through the blog of Bozhidar Bozhanov, who is currently acting as an advisor to the Deputy Prime Minister responsible for e-governance systems and policies. The new policy doesn’t mean the entire country will be moving to Linux, though it is one of the first examples of a government putting the concept of open source into legislation. Article 58a of the act states:

“When the subject of the contract includes the development of computer programs: (a) computer programs must meet the criteria for open source software. (b) All copyright and related rights on the relevant computer programs, their source code, the design of interfaces and databases which are subject to the order should arise for the principal in full, without limitations in the use, modification and distribution. (c) Development should be done in the repository maintained by the Agency in accordance with Art.7cpt.18.”

The amendment will not affect current contracts, nor does it insist that major vendors give away the source of their products; it focuses only on custom-written code. When the government procures IT services or software for which custom code will be written specifically for the project, the act ensures this code will be open sourced for the rest of the country to use.

“After all, it’s paid by tax-payers money and they should both be able to see it and benefit from it,” said Bozhanov on the blog. “A new government agency is tasked with enforcing the law and with setting up the public repository (which will likely be mirrored to GitHub).

“The fact that something is in the law doesn’t mean it’s a fact, though. The programming community should insist on it being enforced. At the same time some companies will surely try to circumvent it.”

What did we learn from EMC’s data protection report?

EMC has recently released its Global Data Protection Index 2016, in which it claims only 2% of organizations worldwide would be considered ‘leaders’ in protecting their own assets, reports Telecoms.com.

Data has dominated the headlines in recent months as breaches have made customers question how well enterprise organizations can manage and protect data. Combined with transatlantic disagreements over Safe Harbour and law enforcement agencies’ access to personal data, the ability to remain secure and credible is now more of a priority for decision makers.

“Our customers are facing a rapidly evolving data protection landscape on a number of fronts, whether it’s to protect modern cloud computing environments or to shield against devastating cyber-attacks,” said David Goulden, CEO of EMC Information Infrastructure. “Our research shows that many businesses are unaware of the potential impact and are failing to plan for them, which is a threat in itself.”

EMC’s report outlined a number of challenges and statistics which suggest the majority of the industry is not where it should be with regard to data protection. While only 2% of the industry would be considered leaders in the data protection category, 52% are still evaluating the options available to them. Overall, 13% more businesses suffered data loss in the last twelve months compared to the same period prior to that.

But what are the over-arching lessons we learned from the report?

Vendors: Less is more

A fair assumption for most people would be that the more protection you take on, the more protected you are. This just seems logical. However, the study shows that the more vendors you count in your stable, the more data you stand to lose.

The average data loss incident costs a company 2.36TB of data, which is substantial, though it could be worse. The study showed organizations who used one vendor lost on average 0.83TB per incident, two vendors 2.04TB and three vendors 2.58TB. For those who used four or more vendors, an average of 5.47TB of data was lost per incident.

Common sense would dictate that the more layers of security you have, the more secure you will be, but this is only the case if the systems are compatible with each other. It should be highlighted that those who lost the larger data sets are likely to be the larger companies, with more data to lose, though the study does seem to suggest there needs to be a more coordinated approach to data protection.

And they are expensive…

Using the same breakdown as before, the average cost of lost data was $900,000. For those with one vendor, the cost was $636,361; for those with two, $789,193; and for those with three vendors the cost was just above the average at $911,030. When companies bring in four or more vendors, the average cost of data loss rises to $1.767 million.
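Laid out side by side, the figures quoted in the study look as follows (a quick illustrative tabulation; the percentage differences against the $900,000 average are our own arithmetic, not figures from the report):

```python
# Average cost of data loss per incident, by number of data protection vendors,
# as quoted in the EMC study. The % deltas against the overall average are
# calculated here for illustration only.
average_cost = 900_000

cost_by_vendor_count = {
    "1 vendor": 636_361,
    "2 vendors": 789_193,
    "3 vendors": 911_030,
    "4+ vendors": 1_767_000,
}

for vendors, cost in cost_by_vendor_count.items():
    delta = (cost - average_cost) / average_cost
    print(f"{vendors:>10}: ${cost:>9,} ({delta:+.0%} vs the $900k average)")
```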

China and Mexico are the best

While it may be surprising, considering many of the latest breakthroughs in the data world have come from Silicon Valley or Israel, China and Mexico are the two countries considered furthest ahead of the curve for data protection.

EMC graded each country on how effective it was at implementing the right technologies and culture to prevent data loss within organizations. 17 countries featured ahead of the curve, including the usual suspects of the UK (13.5% ahead of the curve), the US (8%), Japan (1%) and South Korea (9%); however, China and Mexico led the charge at 20% and 17% ahead respectively.

While it may not be considered that unusual for China to have a strong handle on data within its own borders, Mexico is a little more surprising (at least to us at Telecoms.com). The country has gone through something of a technology revolution in recent years, growing over the last 20 years from a country where only 10% of people had a mobile phone to 68% this year, 70% of which are smartphones. Mexico is now the 11th largest economy in terms of purchasing power, with millennials the largest demographic. With the population becoming more affluent, and no longer constrained by the faults of the pre-internet world, the trend should continue. Keep up the good work Mexico.

Human error is still a talking point

When looking at the causes of data loss, the results were wide-ranging, though the causes which cannot be controlled were at the top of the list. Hardware failure, power loss and software failure accounted for 45%, 35% and 34% respectively.

That said, the industry does now appear to be taking responsibility for the data itself. The study showed only 10% of data loss incidents were blamed on the vendor. A couple of weeks ago we spoke to Intel Security CTO Raj Samani, who highlighted to us that the attitude towards security (not just data protection) needs to shift, as there are no means to outsource risk. Minimizing risk is achievable, but irrespective of what agreements are undertaken with vendors, the risk still remains with you. As fewer people are blaming the vendors, it would appear this responsibility is being recognized.

Human error is another area which remains high on the agenda, as the study showed it accounts for 20% of all instances of data loss. While some of these instances can be blamed on leaving a laptop in the pub or losing a phone on the train, there are examples where simple mistakes in the workplace are to blame. These will never be eliminated entirely, as numerous day-to-day decisions are made on intuition and gut-feel, which remains a necessity for certain aspects of the business.

Another area which could be seen as a potential danger is artificial intelligence. As AI advances, machines will become more human-like, and thus more capable of making decisions based on intuition. If this is the ambition, an intuitive decision-making machine would surely present the same kind of security weakness a human does. Admittedly the risk per decision would be substantially smaller, but then again, the machine would be making many times more decisions than a human.

All in all, the report raises more questions than it answers. While security has been pushed to the top of the agenda for numerous organizations, receiving additional investment and attention, those same organizations do not appear to be getting any better at protecting themselves. The fact that 13% more organizations suffered data loss in the last 12 months suggests it could be getting worse.

To finish, the study asked whether individuals felt their organization was well enough protected. Only 18% believe they are.

SEC filing shows LinkedIn negotiating skills are worth $5bn

The US Securities and Exchange Commission has released filings outlining the road to Microsoft’s acquisition of LinkedIn, during which $5 billion was added to the value of the deal, reports Telecoms.com.

Five parties were involved in the saga, which eventually led to the news breaking on June 13 that Microsoft had agreed to acquire LinkedIn in an all-cash deal worth $26.2 billion. Although it has not been confirmed by the companies themselves, according to Re/code Party A, which kicked off the frenzy, was Salesforce. Party B was Google, which was also interested in pursuing the acquisition.

Party C and Party D were contacted by LinkedIn CEO Jeff Weiner to register interest; however, both declined after a couple of days’ consideration. Party C remains unknown, though Party D is believed to be Facebook, which, even if it had shown interest in the deal, may have faced a tough time getting the agreement past competition authorities.

In terms of the timeline, a business combination was first discussed by Weiner and Microsoft CEO Satya Nadella during a meeting on February 16, with Party A being brought into the frame almost a month later on March 10. Salesforce CEO Marc Benioff has confirmed several times in recent weeks that his team were in discussions with LinkedIn regarding an acquisition. In the following days, Party B was brought into the mix, also declaring interest. Once the interest of Parties A and B was understood, Microsoft was brought back into the picture on March 15, with the filing stating:

“Mr. Weiner called Mr. Nadella to inquire as to whether Microsoft was interested in discussing further a potential acquisition of LinkedIn, and explained that, although LinkedIn was not for sale, others had expressed interest in an acquisition. Mr. Nadella responded that he would discuss the matter further with Microsoft’s board of directors.”

Prior to the agreement LinkedIn was valued at roughly $130 per share, with the initial offer recorded at $160. Microsoft eventually paid $196 per share, though this was not the highest bid received. The company referred to as Party A in the document put forward an offer of $200 per share, though this would have been half cash and half shares in the company. Weiner’s negotiating skills have seemingly added approximately 50% to the value of LinkedIn’s shares, bumping up the total value of the deal by around $5 billion.
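As a rough back-of-envelope check of those figures (the share count below is simply implied by the reported $26.2 billion total and the $196 final price, not a number taken from the filing):

```python
# Rough back-of-envelope check of the figures quoted above. The share count is
# implied from the reported deal total rather than taken from the SEC filing.
pre_deal_price = 130.0   # approximate LinkedIn share price before talks began
initial_offer = 160.0    # Microsoft's first recorded offer per share
final_price = 196.0      # per-share price Microsoft eventually agreed to pay
deal_total = 26.2e9      # reported all-cash deal value in dollars

implied_shares = deal_total / final_price                # ~133.7 million shares
premium = final_price / pre_deal_price - 1               # ~51%, i.e. roughly 50%
uplift = (final_price - initial_offer) * implied_shares  # ~$4.8bn over the opening offer

print(f"Implied shares:      {implied_shares / 1e6:.1f} million")
print(f"Premium vs pre-deal: {premium:.0%}")
print(f"Uplift vs $160 bid:  ${uplift / 1e9:.1f} billion")
```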

The exclusivity agreement was signed on May 14, though pressure had been put on LinkedIn by both Microsoft and Party A in the weeks prior. Party A appears not to have been deterred by the agreement, as additional bids were made, once again driving up the perceived value of LinkedIn shares. Microsoft’s offer of $182 was no longer perceived as high enough, and the company was encouraged to match Party A’s offer of $200. The filing states LinkedIn Executive Chairman Reid Hoffman was in favour of an all-cash deal, which gave Microsoft extra negotiating room. Nadella was eventually informed on June 10 that the offer had been authorized by the LinkedIn Transactions Committee.

Although Microsoft could be seen to be overpaying, it is worth noting LinkedIn has been valued higher in the past. The company launched its IPO in 2011 and had a promising 2013, with the share price rising from $113.50 to over $200 across the 12-month period. Shares rose to over $250 last November, but following quarterly results in February the price dropped 44% after the company projected full-year revenues of $3.6 billion to $3.65 billion, versus the $3.9 billion expected by analysts. Considering the fall in fortunes, it may be fair to assume shareholders would be pleased with a deal valuing the company at close to $200 per share.

Microsoft was a relatively quiet player in the social market prior to the acquisition, and the deal could be seen as a means to penetrate the burgeoning market segment. Although the place of social media in the workplace remains to be seen, Microsoft has essentially bought a substantial amount of data, covering numerous high-net-worth individuals and important decision makers throughout the world. LinkedIn currently has roughly 431 million members and is considered the largest professional social network worldwide.

Another explanation for the deal could be the value of Microsoft to IT decision makers. A report from JPMorgan found CIOs consider Microsoft the most important vendor to their organizations due to the variety of services offered. AWS is generally considered the number one player in the public cloud market, though Microsoft offers a wider range of enterprise products including servers, data centres, security solutions and cloud offerings, amongst many more. Now social can be added to the list. As Microsoft increases its offerings, it can penetrate further into a company’s fabric, making it a much more complicated decision to change vendor.

Telstra adds IoT and big data offering to Network Applications and Services biz unit

Australian telco Telstra has continued efforts to bolster its Network Applications and Services (NAS) business unit through acquiring Readify, reports Telecoms.com.

The company has been vocal about its aims for the NAS business unit as it has sought to expand through numerous acquisitions in recent years. Aside from the Readify deal, the company has also incorporated O2 Networks, Bridge Point Communications, Kloud and North Shore Connections, and formed numerous partnerships, including with cloud security start-up vArmour.

“This arm of the business (NAS) has been a strong growth area for Telstra, achieving double-digit growth in revenue driven by business momentum in Asia, as well as advances in technology in the cloud computing space,” said a statement on the company website. “We are well equipped to continue to capitalise on this growth and ensure our focus on NAS continues to drive revenue.”

Readify, which currently offers enterprise cloud application solutions as well as big data and IoT services, will provide an additional platform for Telstra to drive digital transformation for its enterprise customers in domestic and global markets. The offering builds on the January acquisition of Kloud, which offers cloud migration services, as well as the earlier purchases of unified communications and contact centre provider North Shore Connections in 2013, network integration services provider O2 Networks in 2014 and security, networking and data management provider Bridgepoint, also in 2014.

“Readify will provide application development and data analytics services, nicely complementing Kloud’s existing services,” said Telstra Executive Director Global Enterprise and Services, Michelle Bendschneider. “It will enable Telstra to add incremental value to customers in enterprise cloud applications, API-based customisation and extensions as well as business technology advisory services.”

Back in April, the company announced a business multi-cloud connecting solution, which supports numerous hybrid cloud offerings including Azure, AWS, VMware and IBM. The one-to-many “gateway” model will enable Australian customers to connect to Microsoft Azure, Office 365, AWS, IBM SoftLayer and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

The cloud and enterprise services market has been a long-term ambition of the company, though it did get off to a slow start. Back in 2014, national rival Optus Business stole a march on Telstra by acquiring Ensyst, winner of Australian Country Partner of the Year at that year’s Microsoft Worldwide Partner Awards, as it looked to grow its own cloud proposition. It would appear Telstra is making up for lost time through an accelerated programme of product releases and acquisitions.

Contract dispute with HPE costs Oracle $3bn

Oracle has released a statement declaring it will appeal a jury decision to side with HPE in a long-running contract dispute worth $3 billion.

The dispute dates back to 2011, when Oracle decided to stop creating new versions of its database and other software for systems running Intel’s Itanium chip. HPE claimed the decision violated the contractual terms between the organizations, a claim the jury upheld. Oracle also claimed Intel had decided to stop supporting Itanium and shift focus to its x86 microprocessors, which the chip-maker has denied.

“Five years ago, Oracle made a software development announcement which accurately reflected the future of the Itanium microprocessor,” said Dorian Daley, General Counsel of Oracle. “Two trials have now demonstrated clearly that the Itanium chip was nearing end of life, HP knew it, and was actively hiding that fact from its customers.

“Oracle never believed it had a contract to continue to port our software to Itanium indefinitely and we do not believe so today; nevertheless, Oracle has been providing all its latest software for the Itanium systems since the original ruling while HP and Intel stopped developing systems years ago.”

Back in 2012, Santa Clara court Judge James Kleinberg ruled that Oracle would have to maintain its end of the contract for as long as HPE remained in the Itanium game. This decision was appealed by Oracle, which delayed the damages trial.

HPE had been seeking damages of $3 billion – $1.7 billion in lost sales before the case started, plus $1.3 billion in post-trial sales – which was awarded in full by the jury. Daley has unsurprisingly stated Oracle will appeal the decision, which could mean the saga continues for some time.

Oracle has been having a tough time in the courtroom of late; it was also seeking $8.8 billion in damages from Google over the unlicensed use of Java in a case dating back to 2010. The recent ruling there was a victory for Google, as the jury found Android does not infringe Oracle-owned copyrights because its re-implementation of 37 Java APIs is protected by ‘fair use’. Oracle again stated it would appeal the decision, rounding off a tough couple of months for its legal team.