Category archive: News & Analysis

AWS launches new features at Chicago Summit

Amazon Web Services has launched a number of new features, along with the announcement that AWS Import/Export Snowball is now available in four new regions, including Europe.

Speaking at the AWS Chicago Summit, the team announced several updates, including new security features, tools which simplify the movement of data around an organization’s cloud, platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, testing features, and authentication services.

First up is an update to AWS Device Farm, a service introduced last June which enables customers to test mobile apps on real devices. The service is built on the concept of ‘write once, test everywhere’, giving developers the chance to test apps in more than 200 unique environments (a variety of carriers, manufacturers, models, operating systems and so on). The update now provides customers with remote access to devices for interactive testing.

Writing on the AWS blog, Jeff Barr, Chief Evangelist at Amazon Web Services, said: “you simply open a new session on the desired device, wait (generally a minute or two) until the device is available, and then interact with the device via the AWS Management Console. You can gesture, swipe, and interact with devices in real time directly through your web browser as if the device was on your desk or in your hand. This includes installing and running applications.”

Amazon S3 and Snowball, designed to speed up the data migration process, also received attention during the event. AWS Import/Export Snowball was launched for customers who intend to move larger amounts of data, generally 10 terabytes or more, and has now been beefed up once again. New features for S3 make use of the AWS edge infrastructure to increase speed, and Snowball now has a larger capacity as well as being available in four new regions.

“Many AWS customers are now using AWS Import/Export Snowball to move large amounts of data in and out of the AWS Cloud,” said Barr. “The original Snowball appliances had a capacity of 50 terabytes. Today we are launching a newer appliance with 80 terabytes of capacity.”
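Barr’s capacity figures make the case for shipping an appliance rather than pushing data over the wire. As a rough, illustrative sketch (the 1 Gbps link speed and 80% sustained utilisation are assumptions chosen for the example, not AWS figures):

```python
TB = 10**12  # terabyte in bytes (decimal, as storage vendors count)

def transfer_days(terabytes: float, link_gbps: float, utilisation: float = 0.8) -> float:
    """Days needed to push `terabytes` over a link of `link_gbps`,
    assuming a sustained utilisation fraction of the nominal bandwidth."""
    bits = terabytes * TB * 8
    seconds = bits / (link_gbps * 10**9 * utilisation)
    return seconds / 86_400

# An 80 TB Snowball-sized dataset over a well-utilised 1 Gbps line:
print(f"{transfer_days(80, 1.0):.1f} days")  # well over a week, before retries
```

A courier round-trip typically beats that comfortably, and the gap widens with larger datasets or thinner links, which is the economics behind the appliance model.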

Amazon Kinesis, a service which enables users to manage data that is streamed into the cloud, has been updated to allow users to deploy, run, and scale Elasticsearch in the AWS Cloud, as well as to interact with Amazon CloudWatch, its monitoring service.

The Cognito service allows apps to add authentication, user management, and data synchronization without having to write backend code or manage any infrastructure. The ‘Your User Pools’ feature update allows developers to build a user directory that can scale to hundreds of millions of users, to help manage the authentication process.

“Using a user pool gives you detailed control over the sign-up and sign-in aspects of your web and mobile SaaS apps, games, and so forth,” said Barr. “Building and running a directory service at scale is not easy, but is definitely undifferentiated heavy lifting, with the added security burden that comes when you are managing user names, passwords, email addresses, and other sensitive pieces of information. You don’t need to build or run your own directory service when you use Cognito Identity.”

Finally, Elastic Beanstalk, which automatically deploys and runs apps on Amazon’s cloud infrastructure, has also been updated with support for managed platform updates. Developers can now select a maintenance window, and the new feature will update the environment to the latest platform version automatically.

“The updates are installed using an immutable deployment model to ensure that no changes are made to the existing environment until the updated replacement instances are available and deemed healthy (according to the health check that you have configured for the application),” said Barr.
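In Elastic Beanstalk terms, the maintenance window and update level are ordinary environment option settings. A minimal sketch of such a configuration file, assuming the managed-actions option namespaces as documented at the time (names worth verifying against current AWS docs):

```yaml
# .ebextensions/managed-updates.config  (illustrative; check current option names)
option_settings:
  aws:elasticbeanstalk:managedactions:
    ManagedActionsEnabled: true          # opt the environment in
    PreferredStartTime: "Sun:02:00"      # weekly maintenance window (UTC)
  aws:elasticbeanstalk:managedactions:platformupdate:
    UpdateLevel: minor                   # apply minor and patch platform versions
</imports>
```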

Leadership restructure has little impact as VMware reports 5% growth

VMware has reported healthy growth during its Q1 earnings call, despite disruption in the management team over the period.

Revenues for the first quarter were reported at $1.59 billion, an increase of 5% in comparison to the same period in 2015, though license revenues saw a drop of 1% to $572 million. The company now expects second-quarter revenue of $1.66 billion to $1.71 billion, compared with analysts’ average estimate of $1.66 billion.

“Q1 was a good start to 2016, both for results and against our strategic goal of building momentum for our newer growth businesses and in the cloud,” said Patrick Gelsinger, CEO at VMware. “Our results were in line with our expectations for the period and support our outlook for the full year.”

Over the course of the period there were concerns that changes in the leadership team, and a wider restructure, would impact the performance of the business as a whole. Carl Eschenbach announced last month that he would be leaving his post as VMware President and COO to join venture capital firm Sequoia Capital as a partner. CFO Jonathan Chadwick also left the business in January.

Eschenbach joined the firm in 2002 as VP of Sales, was appointed co-President and COO in 2011, and became standalone President in 2012. During Eschenbach’s time at VMware, revenues grew from $31 million in 2002 to more than $6 billion in 2015. The changes in leadership do not appear to have stifled the company’s performance, as its cloud business units performed healthily over the first quarter.

“We think on the executive side, it really is the combination of being able to attract new players than – I mentioned Rajiv (Rajiv Ramaswami, GM, Networking and Security) we brought in a leader for China, Bernard (Bernard Kwok, Greater China President); we’ve been able to continue to attract talent,” said Gelsinger. “We’ve also had commented on our very strong bench, and – like Maurizio (Maurizio Carli, VP Worldwide Sales), we had brought him over from Europe a year plus ago to prepare for this eventuality, and so we had been grooming and preparing for these transitions.”

The company also reported healthy growth for its cloud business unit, including NSX, VSAN, End-User Computing and vCloud Air Network. The company highlighted standalone vSphere license bookings were less than 35% of total bookings, a figure which was more than 50% two years ago. The team claim this reduction demonstrates the product offering has been successfully diversified.

“Turning to hybrid cloud. Total bookings for vCloud Air Network grew over 25% year-over-year,” said Zane Rowe, CFO at VMware. “We see significant interest from cloud and service providers around the world wanting to utilize our hybrid cloud technologies. For example, as Pat mentioned earlier, IBM will be delivering a complete SDDC offering based on VMware’s technologies across their expanded footprint of cloud data centres worldwide. vCloud Air also performed well in Q1 with large enterprise customer adoption.”

In terms of long-term strategy, Gelsinger outlined a three-point plan to facilitate VMware’s growth in the cloud market segment. First, the business will consolidate its position in the private cloud space, a segment it describes as the ‘foundation of our business’. Second, through the vCloud Air service and vCloud Air Network, the company aims to encourage its customers to extend their private cloud into the public cloud. And finally, it will connect, manage and secure end points across a range of public clouds, including Amazon Web Services and Microsoft Azure.

Telstra launches one-to-many Cloud Gateway offering

Australian telco Telstra has bolstered its position in the growing cloud market with the launch of Cloud Gateway.

Cloud Gateway is Telstra’s new solution enabling businesses to connect to multiple public cloud environments through a one-to-many “gateway” model over Telstra’s IP network.

“Most organisations don’t realise the full value of cloud out of a single service,” said Philip Jones, Global Products and Solutions at Telstra. “Instead, our customers are investing in sophisticated hybrid cloud environments, which come with their own range of fragmented networking challenges.

“These include managing multiple vendors, portals and contracts, while trying to maintain a high level of security, performance and operational efficiency. We believe that just because these solutions are sophisticated, doesn’t mean that they should also be complex. Cloud Gateway is Telstra’s simple way to connect multiple clouds, and create hybrid environments.”

The product offering will enable Australian customers to connect to Microsoft Azure, Office 365, AWS, IBM SoftLayer, and VMware vCloud Air, while international customers can only connect to AWS and IBM SoftLayer for the moment.

“Telstra is very well positioned to help customers with hybrid and multi-cloud strategies, as we bring the cloud and the network together,” said Jones. “The network is the fundamental piece of the puzzle that helps provide a secure and reliable application experience. Having a single touchpoint also helps reduce IT complexity, enabling our customers to maximise the benefits of investing in cloud.”

Telstra has been making moves within the cloud space in recent months, following the announcement of a cloud innovation centre in February. The centre was launched alongside partners AWS and Ericsson with the focus of accelerating the adoption of cloud technologies.

“Telstra’s vision is to build a trusted network service for mission critical cloud data, and we are excited to explore the opportunity of bringing this vision to life with Ericsson and AWS,” said Vish Nandlall, CTO of Telstra, at the time of the announcement. “The Cloud Innovation Center at Gurrowa intends to bring together cloud experts from Ericsson, AWS and Telstra to encourage cloud adoption and the development of new business opportunities for Telstra and our customers.”

Microsoft enters the containers race

Microsoft has cashed in on one of the industry’s trending technologies, with the announcement of the general availability of the Azure Container Service.

The Microsoft container service, initially announced in September 2015 and released for public preview in February, is built on open source software and offers a choice between the DC/OS and Docker Swarm orchestration engines.

“I’m excited to announce the general availability of the Azure Container Service; the simplest, most open and flexible way to run your container applications in the cloud,” said Ross Gardler, Senior Program Manager at Microsoft, on the company’s blog. “Organizations are already experimenting with container technology in an effort to understand what they mean for applications in the cloud and on-premises, and how to best use them for their specific development and IT operations scenarios.”

While the growth of container technology has been well documented in recent months, a number of industry commentators have raised concerns about how well the technology is understood within enterprise organizations themselves. A recent survey from the Cloud & DevOps World event highlighted that 74% of respondents agreed with the statement “everyone has heard of containers, but no-one really understands what containers are.”

Aside from confusion surrounding the definition and use cases of containers, the Microsoft team believes the growth of the technology is being stunted by management and orchestration challenges. While the technology offers organizations numerous benefits, traditional means of managing it have proven ineffective.

“Azure Container Service addresses these challenges by providing simplified configurations of proven open source container orchestration technology, optimized to run in the cloud,” said Gardler. “With just a few clicks you can deploy your container-based applications on a framework designed to help manage the complexity of containers deployed at scale, in production.”

Alongside the availability announcement, Microsoft has also joined a new open source DC/OS project, enabling customers to use Mesosphere’s Data Center Operating System to orchestrate their container projects. The project brings together the expertise of more than 50 partners to drive usability within the software-defined economy.

The Docker Swarm version ensures any Docker-compliant tooling can be used with the service. Azure Container Service provides a ‘Docker native’ solution using the same open source technologies as Docker’s Universal Control Plane, allowing customers to upgrade as and when required.

IBM reports cloud growth amid 16th quarterly revenue decline

IBM has reported healthy growth for its cloud and strategic imperatives business units, despite witnessing revenue declines for the 16th straight quarter.

The strategic imperatives units, which include the cloud, analytics, mobile, social and security services, delivered $29.8 billion in revenue over the last 12 months, accounting for 37% of total revenues, with cloud accounting for $10.8 billion.

“We delivered $18.7 billion in revenue, $2.3 billion in net income and operating earnings per share of $2.35,” said Martin Schroeter, CFO at IBM. “Importantly, we also made significant investments and took significant actions to accelerate our transformation and move our business into new areas.”

Specifically in Q1, total revenues for the group dropped by 5% to $18.7 billion, while the strategic imperatives unit grew 14% to $7 billion, with cloud accounting for $2.6 billion, a 34% year-on-year increase. The company also announced or closed ten acquisitions during the quarter, investing just over $2.5 billion in new businesses including Bluewolf, a Salesforce partner; Truven, a provider of cloud-based healthcare data; and The Weather Company’s digital assets.

While the company built its reputation in the traditional IT market segment, sliding revenues and growing enterprise attention to cloud solutions have forced a transformation play for the tech giant, one which would appear to be paying off well.

“We’re continuing to expand our Watson ecosystem and reach,” said Schroeter. “Over the last 12 months, the number of developers using Watson APIs is up over 300% and the number of enterprises we’ve engaged with has doubled. Watson solutions are being built, used, and deployed in more than 45 countries and across 20 different industries.”

Watson would appear to be one of the main focal points for IBM’s new cloud-orientated business model, as the cognitive computing platform has formed the basis of numerous PR campaigns throughout the year, highlighting client wins from pharmaceutical giant Pfizer and the McLaren Honda Formula One team.

“Our enterprise clients are looking to get greater value from their data and IT environment,” said Schroeter. “They’re not just focused on reducing cost and driving efficiency but using data to improve decision-making and outcomes. They’re looking to become digital enterprises that are differentiated by Cognitive. We’re creating Cognitive Solutions that marry digital business with digital intelligence. We’re bringing our industry expertise together with these cognitive solutions and we’re building it all on cloud platforms.”

Geographically, the company highlighted that business was relatively consistent worldwide, though the Asia-Pacific region did demonstrate growth. EMEA and North America showed slight declines, albeit improvements on previous quarters, while Latin America continued to prove tough for IBM. The company has a large business unit in the region, but cited volatile economic and political conditions in Brazil as the reason for the declines.

Although the company has not halted the revenue declines which have been a constant in recent years, the strategic imperatives units appear to be taking a stronger role in the fortunes of the business. IBM has grown its capabilities in numerous developing markets in recent months, including cloud video platforms and user experience, though it appears to be backing cognitive computing for future growth.

“As we build new businesses in areas like Watson Health and Watson Internet of Things, this requires different skills and to be in different places,” said Schroeter. “I mentioned earlier that over the last year we’ve added over 6000 resources in Watson Health and added over 1000 security experts. These are specialized skills in highly competitive areas. So this is not about reducing our capacity; this is about transforming our workforce.

“So where are we in the transformation? It is continued focus on shifting our investments into those strategic imperatives, it is making sure that the space we’re moving to is higher margin and higher profit opportunity for us and then making sure we’re investing aggressively to keep those businesses growing.”

While IBM is not out of the woods yet, the recent quarterlies did beat analyst predictions and its acquisition activities would appear to be more aggressive than others in the space. The company is seemingly not wasting any time in positioning itself firmly in the cloud space, though it does appear executives are backing the growth of cognitive computing, and Watson’s market penetration in particular, as the catalyst for future success of Big Blue.

IBM announces four new clients for video business unit

IBM has revealed four new client wins for its video business, IBM Cloud Video, a couple of hours ahead of its quarterly earnings announcement.

Speaking at NAB Show, the company announced that Comic-Con HQ, the Canadian Broadcasting Corporation, AOL and Broadway Video will all now be utilizing the IBM video platform. The company expects the market to exceed $100 billion in the next three years, and digital video to account for 80% of all internet traffic by 2019.

“IBM is at the forefront of the industry at a time when video is the driving influence in how organizations communicate, share information, and entertain,” said Braxton Jarratt, General Manager of the IBM Cloud Video business unit. “Today’s announcements will be viewed as a significant milestone in the company’s cloud video strategy, as IBM makes the sharing, distribution, and management of video increasingly simple across any device.”

IBM announced the acquisition of Ustream in January though financials of the agreement were not disclosed. Ustream created a cloud model to support live and on-demand video streams and claimed to have 80 million viewers per month from customers including NASA, Samsung, Facebook, Nike and The Discovery Channel. The IBM Cloud Video business unit was formed by the combination of IBM’s R&D dollars alongside acquisitions of Clearleap, Ustream, Aspera and Cleversafe.

The deal with Comic-Con HQ will offer numerous services including subscriber and content management, billing, and video compatibility on multiple devices. The Canadian Broadcasting Corporation will be using IBM’s tech to support its next-generation, ad-supported streaming video service. AOL will be using transfer and automation software from Aspera (an IBM company) to power its media management platform.

The news comes ahead of the company’s quarterly earnings, in which analysts expect IBM to announce further revenue declines. The company has reported revenue declines for 15 straight quarters, though these trends are not unique to IBM, having been witnessed by several tech giants primarily associated with now-legacy IT. The move into cloud computing is seemingly one of a number of strategies set in place for IBM to counter negative growth and carve a new niche in the digital ecosystem.

Oracle acquisition boosts position in data-as-a-service market

Oracle has announced its intention to acquire Israeli machine-learning company Crosswise, in a bid to strengthen its Data-as-a-Service offering.

The Crosswise technology offers marketers and publishers the opportunity to improve cross-device advertising, personalization and analytics, and builds on Oracle’s efforts to bolster its position in the smart data market segment.

“Uniting identity across desktop, browsers and mobile apps to create a meaningful and consistent relationship with customers and prospects has become one of the critical challenges for marketers,” said Omar Tawakol, General Manager at Oracle Data Cloud. “Identification methods are different on every device and across every channel, and solving this can enable marketers to have a significantly more effective dialogue with the consumer and save billions of advertising dollars.”

The team claims that by combining the Crosswise capabilities with its Data Cloud portfolio, marketers will be able to build a graphical representation of how consumers interact with their digital devices. Oracle already has such an offering within its portfolio, though the company claims the Crosswise capabilities increase the accuracy of the data, which in theory lets marketers better allocate advertising budgets.

“Oracle Data Cloud is the fastest growing global Data as a Service business, aggregating more than 3 billion profiles from over 15 million websites in its data marketplace and operating the most accurate ID Graph to enable understanding of consumer behaviour across all media channels,” said Tawakol. “The addition of Crosswise further broadens the Oracle ID Graph to construct a complete view of consumers’ digital interactions across multiple devices.”

The acquisition builds on moves by Oracle over recent years to bolster its cloud business. The company bought Ravello Systems for an estimated $500 million in February, as well as numerous acquisitions in 2015 including CloudMonkey, Maxymiser, and StackEngine.

Microsoft grows in SaaS market but Salesforce still leads the way

New findings from Synergy Research Group highlight that Microsoft is growing healthily in the Software-as-a-Service (SaaS) market segment, but Salesforce remains the market leader.

According to the research, Microsoft demonstrated the second highest level of growth within the segment at 70% year-on-year, behind only SAP at 73%, but still sits second in the market share rankings. Salesforce was one of only four companies in the segment’s top ten to demonstrate less than 50% growth, yet it still accounts for just under 15% of the worldwide SaaS market share. Adobe, IBM, Oracle, Google, ADP, Intuit and Workday complete the top ten.

“In many ways SaaS is a more mature market than other cloud markets like IaaS or PaaS,” said John Dinsdale, Chief Analyst at Synergy Research Group. “However, even for SaaS it is still early days in terms of market adoption. It is notable that the big three traditional software vendors – Microsoft, Oracle and IBM – are all now growing their SaaS revenues faster than the overall market and yet SaaS accounts for less than 8% of their total software revenues.”

The Software-as-a-Service market has been demonstrating healthy growth in recent years: Synergy estimates the segment has grown by 40% over the last 12 months, and expects it to triple over the next five years. The growth claims are also supported by research from Cisco, whose team predicted last year that by 2019, 59% of total cloud workloads will be SaaS, compared to 45% in 2014.
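Those two figures are consistent with each other: tripling over five years requires a compound annual growth rate of only about 25%, well below the 40% just recorded. A quick check:

```python
def implied_cagr(multiple: float, years: int) -> float:
    """Compound annual growth rate implied by growing `multiple`-fold over `years`."""
    return multiple ** (1 / years) - 1

# Market triples over five years:
print(f"{implied_cagr(3, 5):.1%}")  # about 24.6% per year
```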

The research also highlights Microsoft as making positive steps in the consumer SaaS market segment alongside its enterprise business. While the consumer segment is roughly a third of the size of the enterprise market, the company’s growth in this area exceeds that of competitors who currently hold a more assured position in the space.

Met Office launches weather app on hybrid cloud platform

The Met Office has launched its latest app on its new hybrid cloud platform, Weather Cloud, in an effort to increase the speed of delivery and accuracy of its weather data to customers.

The platform enables the company to process meteorological data for mobile, at scale, across all Met Office platforms, ensuring the team can deliver information to the public during extreme weather events. In designing the app, the team took a DevOps-oriented approach, releasing a Minimum Viable Product (MVP) in the first instance while monitoring customer feedback to refine the proposition.

“We know that more and more people are choosing mobile devices to access their weather information from the Met Office and it’s vital we continue to address this changing behaviour so we can deliver our world-class weather service,” said Owen Tribe, Head of Digital at the Met Office. “The new app technology will enable us to evolve our digital presence and the ways in which people want to access their weather information in the future.”

During Storm Katie in March, the Met Office saw a 200% increase in traffic, with over 8 million visits over the course of the weekend. The team claims the new Weather Cloud platform will better enable it to deal with increased traffic and to plan for short-term weather events. The company also highlighted the ability to scale down in times of lesser demand, reducing the public funds spent on the platform.

Weather Cloud was implemented in AWS with assistance from CloudReach, though the DevOps journey has been maintained as the team continue to make updates to the app based on customer feedback.

“The Met Office now has AWS Cloud infrastructure supporting its services, which can respond to changes in demand quickly, is highly resilient in case of any failures and supports stringent security requirements,” said James Monico, Founder at CloudReach. “Using AWS means that the Met Office does not have to maintain hardware that would otherwise be unused for large parts of the year, but it can instead add and remove resources quickly and dynamically as demand fluctuates.”

New EU data regulations receive warm reception from industry

The European Union has finally rubber-stamped a refresh of the General Data Protection Regulation (GDPR) that offers greater protection for individuals, but at the cost of a greater burden on businesses, reports Telecoms.com.

In customary EU fashion, this is the culmination of four years of to-ing and fro-ing since the refresh was first proposed. Even the final sign-off took four months to complete, with the text having been agreed last December. Furthermore, the new regulations won’t come into force until May 2018, giving all businesses that keep data on European citizens, which must include pretty much every multinational, two years to comply.

“The new rules will give users back the right to decide on their own private data,” said Green MEP Jan Philipp Albrecht, who led the drafting process. “Businesses that have accessed users’ data for a specific purpose would generally not be allowed to collect the data without the user being asked. Users will have to give clear consent for their data to be used. Crucially, firms contravening these rules will face fines of up to 4% of worldwide annual turnover, which could imply € billions for the major global online corporations.

“The new rules will give businesses legal certainty by creating one unified data protection standard across Europe. This implies less bureaucracy and creates a level playing field for all business on the European market. Under the new rules, businesses would also have to appoint a data protection officer if they are handling significant amounts of sensitive data or monitoring the behaviour of many consumers.”

Industry reaction has been broadly positive, but with caveats, mainly concerning how easy it will be to comply and some concern about the high ceiling for potential fines. Compounding this is a requirement for companies to disclose data breaches within 72 hours of becoming aware of them, which is a pretty small window.

“This will be a technical challenge for those businesses unaccustomed to such stringent measures,” said David Mount of Micro Focus. “They will need to identify the breach itself and the information assets likely to have been affected so they can give an accurate assessment of the risks to the authorities and consumers.

“While this may seem like a positive step towards improved data protection, the US example shows that in reality there can be an unintended consequence of ‘data breach fatigue’. Consumers become accustomed to receiving frequent data breach notifications for even very minor breaches, and as a result it can be hard for them to distinguish serious breaches requiring action from minor events which can be safely ignored. The effect is that sometimes consumers can’t see the wood for the trees, and may start to ignore all warnings – which somewhat negates the point of the measure.

“It is now up to European data privacy regulators to work together to ensure that the GDPR rules are implemented in a way that supports economic growth and improved competitiveness,” said John Giusti, Chief Regulatory Officer of the GSMA. “Regulators will need to exercise particular care in interpreting GDPR requirements – around consent, profiling, pseudonymous data, privacy impact assessments and transfers of data to third countries – to avoid stifling innovation in the digital and mobile sectors.

“All eyes are now on the review of the e-Privacy Directive. The right balance needs to be struck between protecting confidentiality of communications and fostering a market where innovation and investment will flourish. To this end, the GSMA calls on legislators to address the inconsistencies between the existing e-Privacy Directive 2002/58/EC and the GDPR.”

The e-Privacy Directive covers things like tracking and cookies and seems to focus specifically on telecoms companies in the way they process personal data. So for the telecoms sector specifically this refresh could be even more important than the GDPR. The European Commission initiated a consultation on ePrivacy earlier this week and will conclude it on 5 July this year.

William Long, a partner at Sidley Austin, warned that individual countries may view the new GDPR differently. “There are still a number of issues where some member states have fought successfully to implement their own national law requirements, for instance in the area of health data, and this will no doubt lead to certain complexities and inconsistencies,” he said.

“However, organisations should be under no doubt that now is the time to start the process for ensuring privacy compliance with the Regulations. The penalties for non-compliance are significant – at up to 4% of annual worldwide turnover or 20 million euros, whichever is the greater. Importantly, companies outside of Europe, such as those in the US who offer goods and services to Europeans, will fall under the scope of this legislation and will face the same penalties for non-compliance.”
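The penalty ceiling Long describes is a greater-of rule, which is worth seeing with numbers; a sketch (the turnover figures are invented for the example):

```python
def max_gdpr_fine(worldwide_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: 4% of annual worldwide turnover
    or EUR 20 million, whichever is the greater."""
    return max(0.04 * worldwide_turnover_eur, 20_000_000)

# A company turning over EUR 10 billion faces a ceiling of EUR 400 million:
print(f"{max_gdpr_fine(10e9):,.0f}")  # 400,000,000
# A firm with EUR 50 million turnover still faces the EUR 20m floor:
print(f"{max_gdpr_fine(50e6):,.0f}")  # 20,000,000
```

The flat floor is what makes the regime bite for smaller companies, while the percentage is what makes it material for the global giants.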

“Our own research shows that globally, 52% of the information organisations are storing and hoarding is completely unknown – even to them, we call this ‘Dark Data’,” said David Mosely of Veritas. “Furthermore, 40% of stored data hasn’t even been looked at in more than three years. How can companies know they’re compliant if they don’t even know what they’re storing? This is why GDPR represents such a potentially massive task, and businesses need to start tackling it now.”

“In order for data to remain secure, there are three core components that are now vital for EU businesses,” said Nikki Parker of Covata. “Firstly, encryption is no longer an optional extra. It provides the last line of defence against would-be snoopers and companies must encrypt all personally identifiable information (PII).

“The second component is identity. True data control involves knowing exactly who has access to it and this can be achieved through encryption key management. Enabling businesses to see who has requested and used which keys ensures a comprehensive audit trail, a requirement of the new regulation.

“Finally, businesses must set internal policies that specifically outline how data can be used, for example, whether data is allowed to leave the EU or whether it can be downloaded. Applying policies to each piece of data means access can be revoked at any moment if the company feels it is in violation of the ruling.”
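Parker’s three components are prescriptive advice rather than a specific product recipe. As one illustrative sketch of the identity idea, keyed pseudonymisation lets records be linked only by holders of the key, giving the audit-trail control she describes (Python standard library; the key and e-mail address below are invented for the example):

```python
import hashlib
import hmac

def pseudonymise(pii: str, key: bytes) -> str:
    """Replace a piece of PII with a keyed, irreversible token.
    Only holders of `key` can recompute (and thus link) the token."""
    return hmac.new(key, pii.encode("utf-8"), hashlib.sha256).hexdigest()

key = b"rotate-me-via-a-key-management-service"  # illustrative key only
token = pseudonymise("alice@example.com", key)

# Same input and same key give the same token, so records can still be
# joined; without the key the address cannot be recovered or re-derived.
assert token == pseudonymise("alice@example.com", key)
print(token[:16], "...")
```

This covers only pseudonymisation; the full encryption and policy controls Parker outlines would sit on top of a proper key management service.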

All this is happening in parallel with the overhaul of the rules governing data transfer between Europe and the US, known as the Privacy Shield. By the time the GDPR comes into force pretty much all companies are going to have to tread a lot more carefully in the way they handle their customers’ data and it will be interesting to see how the first major transgression is handled.