Tag Archives: cloud

Red Hat boosts API management biz with 3scale acquisition

Red Hat has confirmed it has entered into a definitive agreement to acquire 3scale, a provider of API management technology, reports Telecoms.com.

The two companies have been in partnership since early 2015 to create a platform for API-based application development, and the acquisition is expected to close in June 2016. 3scale currently provides developers with the tools to create, manage and scale APIs, and also recently introduced a containerized version of its API Gateway for Red Hat OpenShift. The tool enables users to create applications with microservices distributed across diverse, hybrid environments. Upon completion of the transaction, the team said on its blog that it will open source the code almost immediately.

Red Hat claims the API management platform offered by 3scale complements various aspects of its portfolio well, most notably the JBoss Middleware portfolio, as well as the elastic cloud environment provided by OpenShift. Although the company has not confirmed whether the 3scale brand will continue in the long term, it does have a technology roadmap based on current customer requirements and the competitive landscape, which will be honoured.

“3scale complements our existing middleware product portfolio and Red Hat OpenShift by enabling companies to create and publish APIs with tools such as Red Hat JBoss Fuse, and then manage and drive adoption of those APIs once they have been published,” said Craig Muzilla, SVP of Application Platforms Business at Red Hat.

Red Hat hopes the acquisition will prove to be a differentiator in a crowded market, as it believes API management offerings could be the make-or-break factor for a number of new customers evaluating integration solutions. This, coupled with API management becoming a more important requirement in cloud application platforms, is the basis of the transaction. Acquiring 3scale enables Red Hat to address these evolving requirements quickly, as it follows the wider industry trend of acquiring to innovate rather than growing organically.

Alongside the acquisition, the team also announced its quarterly results which demonstrated healthy growth. Q1 revenue was reported at $568 million, up 18% year-on-year, with subscription revenues at $502 million, also up 18% year-on-year. Subscription revenue from Application Development-related and other emerging technologies offerings for the quarter was $98 million, an increase of 39%.

“Digital transformation and cloud computing are changing the way companies compete in virtually every industry today,” said Jim Whitehurst, CEO of Red Hat. “Organizations that rapidly embrace agile IT technology are succeeding as industry innovation accelerates around them. Our open source-based technologies are helping customers capture the business benefits associated with this rapid rate of change.”

In terms of the outlook for the remainder of 2016 and beyond, containers are a technology which has been prioritized by the business.

“We actually see containers as a great opportunity for us to continue to differentiate around, a, kernel space and user space being consistent,” said Whitehurst in the company’s earnings call. “So having the same host and technology in the container itself. And then secondly just the ability to lifecycle manage against that.

“So containers overall are good for Linux because it helps it grow overall share versus Windows. And then within that we think we have a definitely differentiated position given our position in the OS. So that’s why we can see continued double digit growth in general in the OS category which includes containers.”

Google Fiber adds Miami and Boston to roster

Google has entered into a definitive agreement to acquire Webpass to boost its Google Fiber business unit and add to its wireless broadband ambitions, reports Telecoms.com.

The acquisition builds on an area of innovation the Google Fiber team has been investigating. Webpass has paired its fiber network with wireless technology, an idea the Google team has been testing in Kansas City this year. Back in April, Google was given approval to test its 3.5 GHz wireless broadband capabilities using antennas on light poles and various other structures in and around the Kansas City area. The FCC commented the innovation could create a new flavour of Wi-Fi or even an LTE Unlicensed band.

Webpass was founded in 2003 and claims to have customers in the “tens of thousands”, primarily apartment blocks and business users, two demographics which are likely to be of interest to Google. Webpass has focused its sights on business users in recent months, providing services in the range of 100 megabits per second to one gigabit per second, and also operates in two markets where Google Fiber has no exposure: Miami and Boston.

“Google Fiber’s resources will enable Webpass to grow faster and reach many more customers than we could as a standalone company,” said Charles Barr, President at Webpass. “I’m very much looking forward to this next chapter for Webpass, and let me take this opportunity to once again say thank you to all of our loyal customers. We are thrilled to be on this journey together.”

While the deal is still subject to the customary approval process from regulators, it is the first acquisition for the Google Fiber business, indicating the company’s intentions in the arena. The Google Fiber business has been growing at a healthy rate over the last 18 months, and the addition of Webpass will give the company traction in five significant US markets, including major cities such as San Francisco, San Diego, Miami, Chicago, and Boston.

UK retailer Boots deputizes in-store app to capitalize on mobility trends

UK retailer Boots has announced it has launched a new app, Sales Assist, to make it easier for customers to get hold of the products they need.

The app itself is based on the upward trend of customers using devices to gain better value for their pounds as they shop on the high street. By incorporating iPads in a number of shops throughout the UK, the app supports the retailer’s vision to use mobility to change the way customers shop.

“At Boots UK we’re investing in innovative new technology to further improve the retail experience for our customers, and mobility is at the forefront of this transformation,” said Robin Phillips, Director of Omnichannel and Development at Boots UK. “By developing Sales Assist, in collaboration with IBM and Apple, and launching it on the 3,700 iPads in our stores, we’re integrating our digital and in-store presence to deliver an even better shopping environment for customers.

“The unique tool allows our colleagues to quickly show product information, ratings and reviews, look up inventory online and make recommendations based on online analytics, all from the shop floor. It will help even our smallest stores feel like a flagship shop, with access to the entire Boots range at their fingertips.”

Boots is using Bluemix, IBM’s cloud platform, to link Sales Assist with the company’s applications and data. The app itself links into the boots.com database, allowing shop assistants to locate items, but also to use the power of analytics to drive recommendations and impulse buys. The team have not stated how the app will evolve in the future, though there is the potential for artificial intelligence to be incorporated to drive additional sales in and out of the store.
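The Sales Assist internals have not been published, but the workflow described – look up stock for an item, then surface analytics-driven recommendations – can be sketched as two simple service calls. The endpoint, URL and field names below are entirely hypothetical, included only to illustrate the pattern:

```python
# Illustrative sketch only: the real Sales Assist APIs are not public.
# Assumes a hypothetical REST inventory service; all URLs, fields and
# identifiers below are invented for illustration.
import requests

BASE_URL = "https://api.store.example.com"  # hypothetical endpoint

def sales_assist_lookup(sku: str, store_id: str) -> None:
    # 1. Stock check: where can the customer get the item right now?
    stock = requests.get(f"{BASE_URL}/inventory/{sku}", params={"store": store_id}).json()
    print(f"{stock['name']}: {stock['in_store']} in this store, {stock['online']} online")

    # 2. Analytics-driven suggestions, e.g. items frequently bought together.
    recs = requests.get(f"{BASE_URL}/recommendations/{sku}").json()
    for item in recs["items"]:
        print("Customers also bought:", item["name"])

sales_assist_lookup(sku="1234567", store_id="LDN-001")
```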

What did we learn at Cloud & DevOps World?

The newly branded Cloud & DevOps World kicked off yesterday with one theme prominent throughout the various theatres: cloud is no longer a disruption, but what can be achieved through the cloud is still puzzling decision makers, reports Telecoms.com.

One word heard more than any other was maturity, as there appears to be a general consensus that cloud computing has matured as a concept, process and business model. Although finding the greatest value from the cloud is still a challenge, there is a general feeling that those in the IT world are becoming more adventurous and more willing to experiment.

Speaking in the Business Transformation theatre, Hotels.com CIO Thierry Bedos opened the conference with a look at future trends in cloud computing. Maturity was the main thread of the talk, as Bedos pointed out that AWS’ dominant position as market leader and innovator is starting to loosen. While it would generally be considered strange to call tech giants such as Google and Microsoft challenger brands, it would be fair in the context of public cloud. But perhaps not for much longer, as the gap is narrowing. For Bedos, this competition is a clear indication of a maturing market.

Alongside Bedos, Oracle’s Neil Sholay gave us insight into the world of data analytics, machine learning and AI in the Oracle Labs. Bill Gates famously said “Content is King”, and while this remains true, Sholay believes we can now go further and live by the rule “Corpus is King”. Content is still of value, though the technologies and business practices used to deliver content have dated the phrase. The value of content now lies in mastering its delivery through effective analytics to ensure automation, context and insight. A content campaign is only as good as the information you feed it to provide value to the consumer.

The Cyber & Cloud Security theatre told a slightly different story, but maturity was still a strong theme. ETSI & GSMA Security Working Group Chairperson Charles Brookson commented to us that while there is still a lot of work to do to ensure security, decision makers are maturing in the sense that they have accepted 100% security is unachievable, and that remaining as secure as possible for as long as possible is the new objective.

For a number of the delegates and speakers this is a new mind-set which has been embraced, though there are still some technical drawbacks. Futuristic advances such as biometric security are set to become a possibility in the near future, and Birmingham University’s David Deighton showed his team has made solid progress in the area. Failure rates are still at 2%, which was generally received as too high, but this has been reduced from 15% in a matter of months. The team would appear to be heading in the right direction, at a healthy pace.

Once again the concept of failure was addressed in the IoT & Data Analytics theatre, as conference Chairperson Emil Berthelsen (Machine Research) told us the important lesson of the day was to set the right expectations. Some projects will succeed and some will not, but there is no such thing as failure. The concept of IoT is now beginning to gain traction in the enterprise world, once again showing maturity, but for Berthelsen, the importance of scalability, security and data in IoT solutions was most evident throughout the day.

Day 1 showed us one thing above all else: we’re making progress, but we’re not quite there yet.

UK citizens trust EU countries with data more than the UK

With the countdown to the Brexit vote in its final days, research from Blue Coat has highlighted that British respondents would be more trusting if their data was stored in an EU country as opposed to the UK.

Although the margin is small, 40% of respondents believe the EU is a safer bet for the storage of data, whereas only 38% chose the UK. Germany was perceived as the most trustworthy state, which could be seen as unsurprising as the country is generally viewed as having the most stringent data protection laws. France ranked second, while the UK sat third.

While the true impact of Brexit will only be known following the vote, the role of the UK in the technology world could be impacted by the decision. The research showed a notable preference for storing data in countries which are part of the EU and under the influence of the European Commission’s General Data Protection Regulation. Looking across the Atlantic, trust in the US is higher within the UK than in the rest of Europe, though it could still be considered very low. In the UK, 13% said they would trust the US with their data, whereas this number drops to 3% where France and Germany are concerned.

“The EU regulatory landscape is set to radically change with the introduction of the GDPR legislation and this research highlights the level of distrust in countries outside the EU,” said Robert Arandjelovic, Director of Product Marketing EMEA at Blue Coat Systems. “Respondents prefer to keep their data within the EU, supporting new European data protection legislation.

“More concerning is the fact that almost half of respondents would trust any country to store their data, indicating too many employees simply don’t pay enough attention to where their work data is held. This presents a risk to enterprises, even if their employees treat where it is being hosted with little interest.”

While the impact of the Brexit vote is entirely theoretical at the moment, leaving the union could spell difficult times for the UK, as respondents favour countries which are inside the EU. What is apparent from the statistics is that the US still has substantial work to do to counter the ill effects of the Safe Harbour agreement, which was struck down last October. The survey indicates the replacement policy, the EU-US Privacy Shield, has not met the requirements of EU citizens, as trust in the US remains low.

Dell sells software business for $2bn to fund EMC deal

Dell has announced Francisco Partners and Elliott Management have agreed to purchase its software business unit as the company moves towards deadline day for the EMC merger, reports Telecoms.com.

The deal, initially reported by Reuters, will include the Quest Software and SonicWALL assets, reportedly for just over $2 billion. Both assets were acquired by Dell in recent years for a combined total of $3.6 billion, and while this could be seen as a big loss for the company, details of what the transaction will include and what will remain in the Dell business have not been confirmed.

The acquisition represents two growing trends within the industry. Firstly, private equity firms have been making some notable moves in recent weeks, possibly indicating that confidence in backing cloud companies has returned. Vista Equity Partners bought Marketo for $1.8 billion last month, then followed this up with a deal for Ping Identity for $600 million. Thoma Bravo also bought Qlik for $3 billion, and Providence Strategic Growth recently invested $130 million in Logic Monitor.

Secondly, Dell is starting to peel back layers of its business. For the most part, this shouldn’t be seen as a particular surprise; an acquisition the size of the one Dell is currently going through requires funding, and there is also likely to be a certain level of crossover between the two business units. Characterising the sale of Quest Software and SonicWALL, as well as Dell Services in March, as panic sales could be tempting, though the moves could also be seen as logical.

Dell’s buy-out of EMC was initially announced in October last year at $67 billion, billed as one of the largest acquisitions in the history of the technology industry. At EMC World this year, the team took the chance to launch the new brand, Dell Technologies, and also to outline the integration strategy of the two tech giants. Dell’s Chief Integration Officer Rory Read and EMC’s COO of the Global Enterprise Services business unit Howard Elias highlighted that while a reduction in headcount and sales would be limited, it would not be entirely avoidable; two companies as large as Dell and EMC are naturally going to have crossover.

The sales to Francisco Partners and Elliott Management can be seen as a means to raise capital for the acquisition; this is hardly surprising, as it was highly unlikely that $67 billion was going to be found down the back of the sofa. The team have not commented on the specifics of the agreement to date, however one thing it does highlight is that such sales are a necessity to fund one of the largest deals in the history of the technology industry.

Demystifying the three myths of the cloud database

The cloud is here and it’s here to stay. The cost savings, flexibility and added agility alone mean that cloud is a force to be reckoned with.

However, many businesses are struggling to figure out exactly how to get the most out of the cloud; particularly when choosing what infrastructure elements to leave on-premises and which to migrate to the cloud. A recent SolarWinds survey found that only 42 per cent of businesses will have half or more of their organisations’ total IT infrastructure in the cloud within the next three to five years. Furthermore, seven per cent say their organisation has not yet migrated any infrastructure at all to the cloud, though many of these plan to once they have considered what to transfer and how to do it.

One of the more controversial moves when it comes to migrating infrastructure to the cloud is the database. Hesitancy in making the shift to the cloud is clear, with nearly three quarters (73%) of organisations stating they have yet to do so – but why is this?

The database is often seen as the most critical piece of IT infrastructure when it comes to performance, and lies at the heart of most applications, meaning changes are perceived as risky. If moving the database or changing the way it operates has a negative effect, the ripple could impact the entire business, for example through the loss of important data.

While on some level this fear is justifiable, there are certainly a few reasons which could be defined as myths, or misconceptions, rather than reality:

Myth 1: Need high performance and availability? The cloud is not a suitable fit.

Several years ago, during the early days of the cloud, the ‘one size fits all’ approach may have been fact; however, with the natural maturation of the technology, we’re at a point where databases in the cloud can meet the needs of even the most demanding applications.

The reality of today’s cloud storage systems is that there are very powerful database services available in the cloud, many based on SSD drives offering up to 48,000 IOPS and 800MBps throughput per instance. Also, while outages in the cloud were a common annoyance two to three years ago, today’s cloud providers often exceed what most on-premises systems are able to deliver. Today’s cloud provider SLAs, combined with the ease of setting up replicas and standby systems and the durability of the data stored, are often able to deliver better results.

This is not to say that the database administrator (DBA) is free of responsibility. While the cloud provider will take care of some of the heavy lifting involved with configuration and administrative tasks, the DBA is still responsible for overall performance. The DBA therefore still needs to pay close attention to resource contention, bottlenecks, query tuning, execution plans, etc. – some of which may mean new performance analysis tools are needed.
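As a concrete example of the query-tuning work that stays with the DBA, the snippet below pulls the execution plan for a suspect query from a cloud-hosted database. It is a minimal sketch assuming a PostgreSQL instance and the psycopg2 driver; the hostname, credentials and query are placeholders:

```python
# Minimal sketch: inspect the execution plan of a slow query on a
# cloud-hosted PostgreSQL instance. Connection details are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="mydb.cloud-provider.example.com",  # hypothetical managed instance
    dbname="appdb",
    user="dba",
    password="change-me",
)

with conn, conn.cursor() as cur:
    # EXPLAIN (ANALYZE, BUFFERS) executes the query and reports the actual
    # plan, timings and buffer usage, which is where bottlenecks and
    # missing indexes usually reveal themselves.
    cur.execute(
        "EXPLAIN (ANALYZE, BUFFERS) "
        "SELECT * FROM orders WHERE customer_id = %s",
        (42,),
    )
    for (line,) in cur.fetchall():
        print(line)
```

The same plan output a DBA would read on-premises is still available in the cloud; what changes is the layer of infrastructure metrics the provider keeps to itself.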

Myth 2: The cloud is not secure.

Even though security should always be a concern, just because you can stroll into a server room and physically see the server racks doesn’t necessarily mean they are more secure than the cloud. In fact, there have been many more high-profile security breaches involving on-premises systems than public cloud.

The truth is the cloud can be extremely secure; you just need a plan. When using a cloud provider, security is not entirely their responsibility. Instead, it needs to be thought of as a shared job – they provide reasonably secure systems, and you are responsible for secure architecture and processes.

You need to be very clear about the risks, the corporate security regulations which need to be abided by and the compliance certifications that must be achieved. Also, by developing a thorough understanding of your cloud provider’s security model, you will be able to implement proper encryption, key management, access control, patching, log analysis, etc. to complement what the cloud provider offers and take advantage of its security capabilities. With this collaborative approach to security and an in-depth understanding of each side’s role, you can ensure that your data is as safe, if not safer, than if it were on physical server racks down the hall.
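As one concrete example of holding up your end of that shared responsibility, the sketch below turns on default server-side encryption for an object storage bucket with a customer-managed key. It assumes AWS as the provider and the boto3 SDK; the bucket name and key alias are hypothetical, and other providers expose equivalent controls:

```python
# Sketch of a customer-side security control, assuming AWS and boto3:
# enforce default server-side encryption on a bucket using a
# customer-managed KMS key. Bucket and key names are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="corp-db-backups",  # hypothetical bucket
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    # The key policy, rotation and access control around
                    # this key remain your responsibility, not the provider's.
                    "KMSMasterKeyID": "alias/db-backup-key",
                }
            }
        ]
    },
)
```

The provider encrypts the disks; deciding which key is used, who can touch it and how access is logged is the part of the job that stays with you.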

Myth 3: If I use cloud I will have no control of my database.

This is another half-truth. Although migrating your database to the cloud does hand over some of the day-to-day maintenance control to your provider, when it comes to performance your control won’t and shouldn’t be any less.

As mentioned above, an essential step to ensure that you remain in control of your database is to understand your cloud provider’s service details. You need to understand its SLAs, review its recommended architecture, stay on top of new services and capabilities, and be very aware of scheduled maintenance which may impact your workloads. It’s also important to take into account data transfer and latency for backups, and to keep all your databases in sync, especially if your database-dependent applications need to integrate with one another and are not in the same cloud deployment.

Finally, keep a copy of your data with a different vendor in a different location. If you take an active role in managing backup and recovery, you will be less likely to lose important data in the unlikely event of vendor failure or outage. The truth is that most cloud providers offer plenty of options, giving you the level of control you need for each workload.
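A minimal sketch of that advice, assuming both vendors expose an S3-compatible object storage API reachable through boto3; every endpoint, bucket and credential below is a placeholder:

```python
# Keep a second copy of the nightly database dump with an independent
# vendor in another location. Assumes S3-compatible APIs on both sides;
# endpoints, buckets and credentials are placeholders.
import boto3

DUMP_FILE = "nightly-backup.sql.gz"

# Primary provider (default endpoint and credentials).
primary = boto3.client("s3")
primary.upload_file(DUMP_FILE, "corp-db-backups", DUMP_FILE)

# Independent secondary vendor, reached via its S3-compatible endpoint.
# If the primary vendor has an outage, recovery starts from this copy.
secondary = boto3.client(
    "s3",
    endpoint_url="https://objects.other-vendor.example.com",
    aws_access_key_id="SECONDARY_KEY_ID",
    aws_secret_access_key="SECONDARY_SECRET",
)
secondary.upload_file(DUMP_FILE, "corp-db-backups-offsite", DUMP_FILE)
```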

Conclusion

The decision to migrate a database to the cloud is not an easy one, nor should it be. Many things need to be taken into account and the benefits and drawbacks need to be weighed up. However, given the tools available and the maturity of the cloud market today, deciding not to explore cloud as an option for your database could be short-sighted.

Written by Gerardo Dada, Head Geek at SolarWinds

Oracle sets sights on IaaS market as it reports 49% cloud growth

Oracle has reported its 2016 Q4 results, stating revenue over the period declined 1% to $10.6 billion, though its cloud business grew 49% to $859 million, reports Telecoms.com.

2016 has seen Oracle spend almost $2 billion on cloud-specific organizations, as the tech giant continues efforts to shift the business’s focus to the burgeoning cloud market. While Oracle could be seen as one of the industry’s elder statesmen, its efforts in the M&A market are seemingly paying off, as PaaS and SaaS continue to demonstrate healthy growth to compensate for dwindling legacy business units. The team have also outlined plans to make strides in the IaaS market segment.

Growth in the SaaS and PaaS business has been accelerating in recent years, as CEO Safra Catz quoted 20% growth in 2014, 34% in 2015, and now 52% over the course of FY 2016. Q4 gross margin for SaaS and PaaS was 57%, up from 40% during the same period last year. The business would appear to be making healthy progress, and Catz does not seem to be content with current growth levels. The team have ambitions to raise gross margin to 80% in the mid-term, and expect year-on-year cloud revenue growth of 75% to 80% for Q1 FY 2017.

“For most companies as their business grows, the growth rates go down,” said Catz. “In our case, as the business grows, the growth rates are continuing to increase. Now, as regards our cloud revenue accounting, we have reviewed it carefully and are completely confident that it is 100% accurate and if anything slightly conservative.”

Moving forward, CTO Larry Ellison highlighted the team’s plan to drive rapid expansion of the cloud business. The Oracle team are targeting growth rates double those of competitors, as the ambition is now to be the first SaaS company to make $10 billion in annual revenue. The team are not only targeting the customer experience markets, but also the Enterprise Resource Planning and Human Capital Management segments, where it believes there will be higher growth rates.

“We’re a major player in ERP and HCM,” said Ellison. “We’re almost the only player in supply chain and manufacturing. We’re the number one player in marketing. We’re very competitive. We’re number one – tied for number one in service.”

Secondly, the team will also be aiming to facilitate growth by expanding its IaaS data centre focus, currently an ‘also ran’ part of the cloud business. Ellison claims Oracle is in a strong position to grow in this area, having invested heavily in second generation data centres, and points to the potential of combining PaaS and IaaS for the company’s installed base of database customers, helping them move to the cloud.

“And we built, again, the second generation data centre, which we think is highly competitive with anything out there – lower cost, better performance, better security, better reliability than any of our competitors – and there’s huge demand for it, and we’re now starting to bring customers into that,” said Ellison. “We think that’s another very important driver to Oracle for overall growth.”

The last few years have seen a considerable transformation of the Oracle business, as it has invested heavily in the development of new technology, as well as acquisitions, seemingly hedging its bets to buy its way into the cloud market. The numbers quoted by Catz and Ellison indicate there has been some traction, and the market does seem to be reacting positively to the new Oracle proposition.

In terms of the IaaS market, success remains to be seen. Although Oracle has the potential to put considerable weight behind any move in this market, it is going to be playing catch-up with some noteworthy players who have plenty of cash themselves. Whether Oracle has the ability to catch the likes of AWS, Microsoft Azure and Google, as well as the smaller players in the market, is an open question, though its success in the SaaS and PaaS markets does show some promise.

Mozilla Firefox launches container feature for multiple online personas

The Mozilla Firefox team has announced it will integrate a new container-driven feature to allow users to sign into multiple accounts on the same site simultaneously.

While the concept of using technology to manage multiple accounts and different personas is not a new idea, the practicalities have been out of reach. With the new feature, users will be able to sign into multiple accounts in different contexts for such uses as personal email, work accounts, banking and shopping. Twitter is one of the most relevant examples, as it is not uncommon for individuals to have multiple Twitter accounts for work and personal life.
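Mozilla has not published implementation details beyond its blog post, but the core idea – separate cookie and session state per context – can be illustrated outside the browser. As a loose analogy in Python, two requests.Session objects keep independent cookie jars, so the same site can be signed into twice without the identities bleeding together (the login URL and form fields here are invented):

```python
# Loose analogy only - this is not Firefox code. Two sessions with
# independent cookie jars mimic what per-container state separation
# gives the browser user. URL and form fields are hypothetical.
import requests

work = requests.Session()
personal = requests.Session()

# Each session receives and stores its own auth cookie on login.
work.post("https://example.com/login", data={"user": "work_account", "password": "..."})
personal.post("https://example.com/login", data={"user": "personal_account", "password": "..."})

# The same page can now be fetched as two different signed-in identities.
print(work.get("https://example.com/account").status_code)
print(personal.get("https://example.com/account").status_code)
```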

“We all portray different characteristics of ourselves in different situations,” said Tanvi Vyas, one of the security engineers working on the project, on the company blog. “The way I speak with my son is much different than the way I communicate with my coworkers. The things I tell my friends are different than what I tell my parents. I’m much more guarded when withdrawing money from the bank than I am when shopping at the grocery store. I have the ability to use multiple identities in multiple contexts. But when I use the web, I can’t do that very well.

“The Containers feature attempts to solve this problem: empowering Firefox to help segregate my online identities in the same way I can segregate my real life identities.”

The Mozilla Firefox team is one of the first to have put the idea into practice, though it admits there are a number of challenges to come. Questions the team now needs to answer include:

  • How will users know what context they are operating in?
  • What if the user makes a mistake and uses the wrong context; can the user recover?
  • Can the browser assist by automatically assigning websites to Containers so that users don’t have to manage their identities by themselves?
  • What heuristics would the browser use for such assignments?

“We don’t have the answers to all of these questions yet, but hope to start uncovering some of them with user research and feedback,” said Vyas. “The Containers implementation in Nightly Firefox is a basic implementation that allows the user to manage identities with a minimal user interface.”


Machine learning front and centre of R&D for Microsoft and Google

Microsoft and Google have announced plans to expand their machine learning capabilities, through acquisition and new research offices respectively, reports Telecoms.com.

Building on the ‘Conversation-as-a-Platform’ proposition put forward by CEO Satya Nadella at Build 2016, the Microsoft team has announced plans to acquire Wand Labs. The purchase will add weight to the ‘Conversation-as-a-Platform’ strategy, as well as supporting innovation ambitions for Bing intelligence.

“Wand Labs’ technology and talent will strengthen our position in the emerging era of conversational intelligence, where we bring together the power of human language with advanced machine intelligence,” said David Ku, Corporate Vice President of the Information Platform Group on the company’s official blog. “It builds on and extends the power of the Bing, Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

More specifically, Wand Labs adds expertise in semantic ontologies, services mapping, third-party developer integration and conversational interfaces to the Microsoft engineering team. The ambition of the overarching project is to make the customer experience more seamless by harnessing human language in an artificial environment.

Microsoft’s move into the world of artificial intelligence and machine learning has not been a smooth ride to date, though this has not seemed to hinder investment. Back in March, the company’s AI-inspired Twitter account Tay went into meltdown mode, though the team pushed forward, updating its Cortana Intelligence Suite and releasing its Skype Bot Platform. Nadella has repeatedly highlighted that artificial intelligence and machine learning are the future for the company, stating at Build 2016:

“As an industry, we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence. At Microsoft, we call this Conversation-as-a-Platform, and it builds on and extends the power of the Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

Google’s efforts in the machine learning world have also been pushed forward this week, as the team announced on its blog a dedicated machine learning research group based in its Zurich offices. The group will focus on three areas specifically: machine intelligence, natural language processing & understanding, and machine perception.

Like Microsoft, Google has prioritized artificial intelligence and machine learning, though both companies will be playing catch-up with the likes of IBM and AWS, whose AI propositions have been in the market for some time. Back in April, Google CEO Sundar Pichai said in the company’s earnings call “overall, I do think in the long run, I think we will evolve in computing from a mobile first to an AI first world,” outlining the ambitions of the team.

Google itself already has a number of machine learning capabilities incorporated in its product portfolio, though these could be considered relatively rudimentary. Translate, Photo Search and SmartReply for Inbox already contain aspects of machine learning, though the team are targeting more complex and accurate competencies.

Elsewhere, Twitter has announced on its blog that advertisers will now be able to utilize emoji keyword targeting for Twitter Ads. The new feature uses emoji activity as a signal of a person’s mood or mindset, allowing advertisers to communicate marketing messages more effectively while minimizing the potential for backlash from disgruntled Twitter users. Although the blog does not mention the use of machine learning, it does leave open the opportunity for future innovation in the area.