IBM doubles down on developers and open source

IBM is launching a cloud-based open source platform and putting its own tech at the core of it

IBM launched developerWorks Open this week, a platform aimed at developers who want to build open source solutions in collaboration with IBM, using the company’s technology as a foundation.

The cloud-based platform will provide access to emerging IBM tech and expertise in the form of blogs, informational videos and other multimedia, and the opportunity to collaborate with specialists.

The company said it plans to contribute upwards of 50 projects to the initiative spanning various applications in cloud, analytics and mobile, and will also make the contributed services available on Bluemix.

“IBM firmly believes that open source is the foundation of innovative application development in the cloud,” said IBM vice president of cloud architecture and technology Angel Diaz. “With developerWorks Open, we are open sourcing additional IBM innovations that we feel have the potential to grow the community and ecosystem and eventually become established technologies.”

The company is also launching a set of open source projects specifically targeting applications and workflows in a number of industry verticals including healthcare, mobile, retail, insurance and banking. It said much of the open source development today, while promising, “lacks a strategic focus” on business requirements.

IBM has in recent years looked to bolster its open source strategy, in part by creating and owning its own communities. In 2013, for instance, it launched the OpenPOWER Foundation, a group of technology companies innovating with and on top of its POWER8 processor architecture.

The company has also thrown its weight behind a number of large cloud-centric open source projects including OpenStack, Cloud Foundry (on which Bluemix is based), Docker and more recently, Apache Spark.

IBM, Microsoft struggle while SAP largely bucks the trend

IBM, Microsoft and SAP all released their financial results this week

IBM reported steep declines and Microsoft a record quarterly loss this week as the two companies released their Q2 financial results, but SAP seems to have bucked the trend with close to 130 per cent growth in cloud revenues and 13 per cent growth in revenue.

IBM revealed second quarter net income from continuing operations was $3.5bn compared with $4.3bn in the second quarter of 2014, a decrease of 17 per cent, and revenue was down 13 per cent, much of which it blamed on recent large divestitures and related cash impairments.

Year-on-year growth in its cloud business – from $2.8bn in the second quarter last year to $4.5bn in Q2 2015 – and ten per cent growth in its analytics business have not fully compensated for the challenges the company is facing elsewhere in its business. The company’s revenues have now been in decline for almost three years.

“Our results for the first half of 2015 demonstrate that we continue to transform our business to higher value and return value to shareholders,” said Ginni Rometty, IBM chairman, president and chief executive officer. “We expanded margins, continued to innovate across our portfolio and delivered strong growth in our strategic imperatives of cloud, analytics and engagement, which are becoming a significant part of our business.”

Microsoft saw quarterly revenues hit $22.2bn in Q2 this year, but the company reported a record quarterly net loss of $3.2bn, much of which resulted from the $7.5bn write-down of its ailing Nokia phone business, with other restructuring-related costs nearing $1bn. The company also said the strengthening of the dollar relative to other currencies had a significant impact on its results.

But Microsoft reported commercial cloud revenue grew 88 per cent in the quarter, driven largely by Office 365, Azure and Dynamics CRM Online uptake, while the division selling on-premise licenses for its productivity offerings saw revenue decline 4 per cent; the company said it added roughly 3 million cloud users in the quarter.

“In our commercial business we continue to transform the product mix to annuity cloud solutions and now have 75,000 partners transacting in our cloud,” said Kevin Turner, chief operating officer at Microsoft.

German software giant SAP seems to be one of the few large incumbents bucking the trend this quarter. The company revealed cloud subscriptions and support revenue grew 129 per cent in Q2, new cloud bookings were up 162 per cent, and it more than doubled its SAP HANA customer base year on year (from 3,600 to over 7,200). The company reported quarterly operating profit rose 13 per cent to €1.39bn.

“Our second quarter growth in new cloud bookings was significantly higher than in the first quarter. This momentum showed across our entire cloud and business network portfolio,” said SAP chief financial officer Luka Mucic. “Our operating profit performance is beginning to reflect the business transformation we initiated to make SAP ready for the future. We are on track to achieve our full year business outlook.”

The results come as all three companies – Microsoft, IBM and SAP – continue ambitious redeployment and reorganisation efforts to address a shift in the market towards cloud services and away from legacy software and services.

Microsoft Plans to Buy Security Firm Adallom

Microsoft is reportedly set to pay $320 million in cash for Adallom, a startup with software for monitoring the use of cloud-based services. A source has claimed that all 90 employees, including the 30 based in the US, will function as an independent unit of Microsoft and will handle cybersecurity-related work for the company.

While Microsoft has declined to comment on the reported deal, the Wall Street Journal claims: “According to the people familiar with the matter, Adallom, which employs 90 people world-wide, will continue to operate from Israel, building up Microsoft’s cybersecurity-focused operations in the country.” The deal was first reported by Israeli media outlets Calcalist and Globes, with the Wall Street Journal following.

Microsoft has continued to make the cloud a priority across the whole company, and building an intelligent cloud platform is one of its three stated areas of investment. Cloud security is vital as the company shifts more of its business to internet-based services, hence the move to acquire Adallom. Office 365 usage and revenue increased during the first quarter of 2015, and Microsoft wants to protect that momentum.

This is just one of many Microsoft partnerships and acquisitions this year. Microsoft previously acquired a provider of machine learning technologies for e-discovery and information governance, whose software uses advanced text analytics to perform multidimensional analyses of data collections, intelligently sorting documents into themes, grouping near-duplicates and isolating unique data. Microsoft has also purchased N-trig and Aorato.

IBM unveils developer friendly open source cloud projects

IBM is ramping up its commitment to open source technologies by releasing a new platform which enables developers to build cloud applications.

The platform, called developerWorks Open, allows developers not only to download the code but also to access blogs and videos, forming a global network intended to accelerate development projects.

IBM has had more than 20 years in the open source computing biz, becoming a main player in projects such as Apache, Linux and Eclipse. Big Blue is a platinum member of the OpenStack foundation, which announced a partnership with Google earlier this month.

The company is also spearheading a project called the Academic Initiative for Cloud, which aims to equip the next generation of developers with IBM’s platform-as-a-service offering, Bluemix. More than 200 universities in 36 countries have signed up for the initiative, meaning cloud development curricula can potentially reach more than 20,000 students.

Evidently, there is a benefit for IBM – if the next generation are good, and used to Bluemix technology, then Big Blue can snap them up. It’s similar to Rackspace, another big open source advocate, investing in a data science boot camp for PhD students last year.

But developerWorks Open is also about giving something back. IBM is open sourcing a number of its MobileFirst apps – 10 more of which quietly hit the stands this week as part of IBM and Apple’s ongoing partnership – for a variety of industries, including healthcare, retail, insurance and banking.

IBM argues developerWorks Open comes at “an important time” for cloud developers, stressing the need to simplify implementations. Dr Angel Diaz, VP cloud architecture and technology, said: “IBM firmly believes that open source is the foundation of innovative application development in the cloud.

“With developerWorks Open, we are open sourcing additional IBM innovations that we feel have the potential to grow the community and ecosystem and eventually become established technologies,” he added.

What ‘Should’ Come from the Cybersecurity Sprint | @CloudExpo #Cloud

So, in the wake of the OPM hack, the federal CIO (Tony Scott) has directed government agencies to get serious about cybersecurity. The “30-day sprint” directs agencies to patch all known vulnerabilities; use information provided by Homeland Security to identify and mitigate known threats; limit the number of privileged users and tighten access controls; and “dramatically accelerate” the use of personal identity verification (PIV) cards and other forms of multifactor authentication.

[video] Scalability in the Cloud with Charlie Fei | @CloudExpo @AICIncSolutions #Cloud

“AIC is a manufacturer of server products – server storage. With the onset of cloud computing we’re moving toward the full infrastructure solution, including storage nodes, compute nodes and networking,” explained Charlie Fei, product manager at AIC, in this SYS-CON.tv interview at 16th Cloud Expo, held June 9-11, 2015, at the Javits Center in New York City.

AppZero to On-board Windows Server 2003 Apps to Microsoft Azure | @CloudExpo #Cloud

AppZero has announced availability of the AppZero Service Provider edition for Microsoft Azure (AppZero SP for Azure), a software-as-a-service (SaaS) portal for quickly moving Windows Server applications to Azure, including those running on Windows Server 2003, which reaches end of support today. Those wishing to try the service can log in at azure.appzero.com and move their first five enterprise applications to Azure at no cost. AppZero, which has added more than fifty new certified partners over the last year to assist customers with Windows Server 2003 planning and migration, also announced new 24/7 worldwide support and the opening of a new European office based in the Netherlands to help meet international demand. AppZero is already Microsoft Azure Certified and can be purchased and deployed from the Microsoft Azure Marketplace.

Will datacentre economics paralyse the Internet of Things?

The way data and datacentres are managed may need to change drastically in the IoT era

The statistics predicting what the Internet of Things (IoT) will look like and when it will take shape vary widely. Whether you believe there will be 25 billion or 50 billion internet-enabled devices by 2020, there will certainly be far more devices than there are today. Forrester has predicted that 82% of companies will be using IoT applications by 2017. But unless CIOs pay close attention to the economics of the datacentre, they will struggle to be successful. The sheer volume of data we expect to manage across these IoT infrastructures could paralyse companies and their investments in technology.

The Value of Information is Relative

ABI Research has calculated that there will be 16 Zettabytes of data by 2020. Consider this next to another industry estimate that there will be 44 Zettabytes by 2020, while others have said that humanity had produced only 2.7 Zettabytes up to 2013. Bottom line: the exponential growth in data is huge.

The natural first instinct for any datacentre manager or CIO is to consider where he or she will put that data. Depending on the industry sector, there are regulatory and legal requirements which mean companies will have to be able to collect, process and analyse runaway amounts of data. By 2019, one estimate suggests, that could mean processing 2 Zettabytes a month.

One way to react is to simply buy more hardware. From a database perspective the traditional approach would be to create more clusters in order to manage such huge stores of data. However, a critical element of IoT is that it’s based on low-cost technology, and although the individual pieces of data have a value, there is a limit to that value. For example, you do not need to be told every hour by your talking fridge that you need more milk or be informed by your smart heating system what the temperature is at home.  While IoT will lead to smart devices everywhere, its value is relative to the actionable insight it offers.

A key element of the cost-benefit equation that needs more consideration is the impact of investment requirements at the backend of an IoT data infrastructure. As the IoT creates a world of smart devices distributed across networks, CIOs have to decide whether collection, storage and analytics happen locally, near the device, or are driven to a centralised management system. There could be some logic to keeping the intelligence local, depending on the application, because it could speed up the process of providing actionable insight. The company could use low-cost, commoditised devices to collect information, but the approach will still become prohibitively expensive if the company has to buy vast numbers of costly database licenses to ensure the system performs efficiently – never mind the cost of integrating data from such a distributed architecture.

As a result, the Internet of Things represents a great opportunity for open source software thanks to its cost effectiveness versus traditional database solutions. Today, open source-based databases have the functionality, scalability and reliability to cope with the explosion in data that comes with the IoT while transforming the economics of the datacentre. Gartner’s recent Open Source Database Management report endorsed the point when it said: “Open source RDBMSs have matured and today can be considered by information leaders, DBAs and application development management as a standard infrastructure choice for a large majority of new enterprise applications.”

The Cost of Integrating Structured and Unstructured

There are other key considerations when calculating the economic impact of the IoT on the datacentre. The world of IoT will be made up of a wide variety of data, structured and unstructured. Already, the need for working with unstructured data has given rise to NoSQL-only niche solutions. The deployment of these types of databases, spurred on by the rise of Internet-based applications and their popularity with developers, is proliferating because they offer the affordability of open source. Yet their use is leading to operational and integration headaches as data silos spring up all around the IT infrastructure due to limitations in these NoSQL-only solutions. In some cases, such as where ACID properties are required and robust DBA tools are available, it may be more efficient to use a relational database with NoSQL capabilities built in and get the best of both worlds rather than create yet another data silo (a rough sketch of this hybrid approach follows below). In other cases, such as for very high velocity data streams, keeping the data in these newer data stores and integrating them may be optimal.
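
As a rough illustration of that hybrid approach, the sketch below stores schema-free device events in a Postgres JSONB column alongside conventional relational columns, keeping ACID guarantees and SQL querying without spinning up a separate NoSQL store. It is a minimal sketch only: the connection string, table name and payload fields are hypothetical, and it assumes Postgres 9.4 or later with the psycopg2 driver.

```python
# Minimal sketch, assuming a reachable Postgres 9.4+ instance and psycopg2.
# Connection string, table name and payload fields are hypothetical.
import psycopg2
from psycopg2.extras import Json

conn = psycopg2.connect("dbname=iot user=postgres")
cur = conn.cursor()

# Relational columns for what is fixed, a JSONB column for what is not.
cur.execute("""
    CREATE TABLE IF NOT EXISTS device_events (
        id          bigserial PRIMARY KEY,
        device_id   text NOT NULL,
        received_at timestamptz NOT NULL DEFAULT now(),
        payload     jsonb NOT NULL
    );
""")

# Store the event exactly as the device sent it, inside an ACID transaction.
event = {"type": "fridge", "milk_level": 0.2, "temp_c": 4.1}
cur.execute(
    "INSERT INTO device_events (device_id, payload) VALUES (%s, %s)",
    ("fridge-42", Json(event)),
)

# Query inside the document with ordinary SQL; a GIN index on payload can be
# added later if these lookups become frequent.
cur.execute("""
    SELECT device_id, payload->>'milk_level'
    FROM device_events
    WHERE payload->>'type' = 'fridge';
""")
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```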

A key priority for every CIO is integrating information as economically as possible so organizations can create a complete picture of their business and their customers. The Postgres community has been at the forefront of addressing this challenge with the creation of Foreign Data Wrappers (FDWs), which can integrate data from disparate sources like MongoDB, Hadoop and MySQL. FDWs link external data stores to Postgres databases so users can access and manipulate data from foreign sources as if it were part of the native Postgres tables (see the sketch below). Such simple, inexpensive solutions for connecting the new data streams emerging along with the Internet of Everything will be critical to unlocking value from data.
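
The snippet below is a minimal sketch of that pattern using postgres_fdw, the wrapper that ships with Postgres, driven from Python via psycopg2; a MongoDB or Hadoop source would follow the same steps with its own wrapper. The server name, remote host, credentials, column definitions and the local devices table it joins against are all hypothetical.

```python
# Minimal sketch, assuming the postgres_fdw extension is installed and a
# remote telemetry database is reachable. All names and credentials are
# hypothetical placeholders.
import psycopg2

conn = psycopg2.connect("dbname=warehouse user=postgres")
conn.autocommit = True
cur = conn.cursor()

# One-time setup: register the wrapper, the remote server and a user mapping.
cur.execute("CREATE EXTENSION IF NOT EXISTS postgres_fdw;")
cur.execute("""
    CREATE SERVER sensor_store
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'sensors.internal', dbname 'telemetry');
""")
cur.execute("""
    CREATE USER MAPPING FOR CURRENT_USER
        SERVER sensor_store
        OPTIONS (user 'reader', password 'secret');
""")

# Expose a remote table locally as a foreign table.
cur.execute("""
    CREATE FOREIGN TABLE readings (
        device_id   text,
        recorded_at timestamptz,
        temperature numeric
    ) SERVER sensor_store
      OPTIONS (schema_name 'public', table_name 'readings');
""")

# The foreign table can now be joined with native tables as if it were local.
cur.execute("""
    SELECT d.region, avg(r.temperature) AS avg_temp
    FROM readings r
    JOIN devices d ON d.id = r.device_id
    GROUP BY d.region;
""")
print(cur.fetchall())
```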

The Internet of Things promises a great transformation in the ability of enterprises to holistically understand their business and customer environment in real time and deliver superior customer engagement.  It is critical, though, that CIOs understand the economic impact on their datacentre investments.  The IoT creates a number of new challenges, which can be addressed using the right technology strategy.

Written by Pierre Fricke, vice president of product, EnterpriseDB

HP inks deal with SunEdison to power its cloud with clean energy

HP is the latest cloud player to boost its green credentials

HP signed a 12-year power purchasing agreement with SunEdison this week that will see it power its cloud datacentres with renewable energy.

The deal will see SunEdison begin construction on a massive wind farm in Texas, which when completed will generate 300 MW of power. The wind farm will be acquired by TerraForm, a global owner and operator of renewable energy plants, in 2016 once it becomes operational.

HP said the 112 MW of wind power it has agreed to purchase is enough to supply all of its Texas-based datacentre operations, and will enable the company to reach its 2020 operational greenhouse gas (GHG) emissions reduction goal by the end of this year, five years ahead of schedule.

“This agreement represents the latest step we are taking on HP’s journey to reduce our carbon footprint across our entire value chain, while creating a stronger, more resilient company and a sustainable world,” said Gabi Zedlmayer, vice president and chief progress officer, Corporate Affairs, HP.

“It’s an important milestone in driving HP Living Progress as we work to create a better future for everyone through our actions and innovations,” Zedlmayer said.

Paul Gaynor, executive vice president, Americas and EMEA, SunEdison said: “By powering their data centers with renewable energy, HP is taking an important step toward a clean energy future while lowering their operating costs. At the same time, HP’s commitment allows us to build this project which creates valuable local jobs and ensures Texan electricity customers get cost-effective energy.”

HP is the latest cloud player to bolster its green credentials. Amazon recently announced two clean energy projects in the US within a month of one another, one in Virginia and the other in North Carolina.

Vodafone Italy launches NFV, cloud-based VoLTE

Vodafone Italy is working with Huawei on what the two claim to be the world’s first cloud-based VoLTE deployment

Vodafone is the latest carrier to push ahead with rolling out a voice over LTE (VoLTE) service, with its Italian subsidiary launching the service, reports Telecoms.com.

What sets this VoLTE project apart from those of other operators pursuing the calling technology, however, is Huawei’s contribution in launching the service on a cloud-based IMS core network. Essentially, the launch is a live demonstration of NFV in action, relying on NFV-compliant core network solutions that are interoperable with commercial off-the-shelf (COTS) infrastructure. In this instance, the IMS and element management system (EMS) are virtualized and managed by the snappily titled “MANO-VNFM” (management and orchestration – virtualized network function manager).

Huawei reckons this constitutes a world first, and builds upon work conducted during ETSI NFV ISG’s proof of concept trials. ZTE, China Unicom and HP collaborated on developing a VoLTE service based on vEPC (evolved packet core) and vIMS architecture during one such PoC, and it seems Huawei and Vodafone have steamed ahead with a real-world deployment since the project was demonstrated in January.

A statement released by Huawei referenced the NFV partnership with Vodafone in the wider context of converging the ICT and telecoms worlds. “These innovations are the fruits of partnerships with major operators and join solution optimisation as ongoing processes at Huawei,” it said. “Media plane acceleration, fully automated operation, NFV-based capability exposure, and intelligent network slicing are key areas for NFV consolidation. These future goals are the core of Huawei’s commitment to facilitating cloud transformation for operators.”

Vodafone Italy’s VoLTE rollout, while allegedly the first to utilise NFV infrastructure, is one of a growing number of European rollouts. Vodafone Germany launched the service in March, and the operator is targeting a launch in the UK market at some point this summer. EE and Three, meanwhile, are both looking at a summer 2015 launch date for VoLTE services, as Europe plays catch-up with the Far East, which is already leading the way with mature rollouts of the next-generation calling technology.