Since OpenStack was announced at OSCON in the summer of 2010, the momentum behind this new open source platform has been nothing short of spectacular. Startups and enterprises alike have placed strategic bets on monetizing the OpenStack wave in various ways. As an ecosystem insider and one of the founding sponsors of the OpenStack Foundation, I wanted to offer my views on how various organizations are looking to skin this cat.
I’d like to focus on three of the many efforts currently underway. These three, in particular, happen to be the most vocal about their position and represent three distinct strategy camps. They are Nebula with its OpenStack appliance; Piston with its PentOS cloud operating system; and Dell’s Crowbar, an OpenStack installer.
Cloud Computing in Higher Education
The irony about cloud computing in the higher education environment is that most schools have already been using it to some extent but may not even realize it.
Gmail is one example. Yahoo Mail is another. The fact is web-based applications, which many schools rely on for daily communication, don’t always register with most people as being part of the cloud computing trend. But they are, given that they essentially fit the layman’s rudimentary explanation of the cloud: where storage and computing capacity exist (provided by a vendor) so all that is needed on a PC, laptop, tablet or smartphone is a browser. There are more “technical details” to actual cloud infrastructure, platforms and delivery, but for the purposes here, we will stick with the basic view.
There’s no question that cloud computing usage has exploded and will continue unabated. An article in the September 30, 2011 issue of Campus Technology stated that a new industry forecast is predicting that cloud computing will account for 33 percent of all data center traffic by 2015 – tripling the current percentage and about 12 times the total current volume.
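Those forecast figures carry an implication worth spelling out: if cloud traffic in 2015 is 12 times today's volume but its share of all data center traffic only triples, then total data center traffic must roughly quadruple over the same period. A quick sanity check of that arithmetic (the forecast's exact baseline numbers aren't published here, so the figures are used only as stated):

```python
# Sanity check on the forecast: cloud traffic grows to 12x today's
# volume while its share of all data center traffic triples (to 33%).
# The implied growth in *total* traffic is the ratio of the two.
volume_growth = 12   # cloud traffic in 2015 vs. today
share_growth = 3     # cloud's share triples (roughly 11% -> 33%)
total_growth = volume_growth / share_growth
print(total_growth)  # 4.0 -- total data center traffic roughly quadruples
```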
Release of OpenNebula 3.4.1
The OpenNebula project has just announced the general availability of OpenNebula 3.4.1. This is a maintenance release that fixes bugs reported by the community and adds new languages for Sunstone and the Self-Service portals. OpenNebula 3.4 (Wild Duck) was released three weeks ago, bringing countless valuable contributions from many members of our community, especially Research in Motion, Logica, Terradue 2.0, CloudWeavers, Clemson University, and Vilnius University.
OpenNebula 3.4 incorporated support for multiple Datastores, which provides extreme flexibility in planning the storage backend along with important performance benefits, such as balancing I/O operations, defining different SLA policies and features for different VM types or users, or easily scaling the cloud storage. Additionally, OpenNebula 3.4 featured improvements in other areas, such as support for clusters (resource pools), new tree-like menus for Sunstone, and the addition of the Elastic IP calls to the EC2 Query API.
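To make the multi-Datastore feature concrete, a datastore in OpenNebula is defined with a short template file and registered through the CLI. A minimal sketch (the name and driver choices below are illustrative; check the 3.4 release documentation for the drivers your storage backend actually supports):

```
# images_ds.conf -- illustrative datastore template (names are hypothetical)
NAME   = shared_images      # datastore name
DS_MAD = fs                 # filesystem datastore driver
TM_MAD = shared             # transfer images via a shared filesystem

# Register it with the OpenNebula CLI:
#   $ onedatastore create images_ds.conf
```

Defining separate datastores like this is what lets an administrator route different VM types or users onto storage with different performance or SLA characteristics.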
Cloud Expo New York: The Growing Big Data Tools Landscape
Hadoop, MapReduce, Hive, HBase, Lucene, Solr? The only thing growing faster than enterprise data these days is the landscape of big data tools. These tools are designed to help organizations turn big data into opportunity by gaining deeper insight into massive volumes of information. A recent Gartner report predicts that enterprise data will increase by 650% over the next five years, which means that the time is now for IT decision makers to determine which big data tools are the best – and most cost-effective – for their organization.
In his session at the 10th International Cloud Expo, David Lucas, Chief Strategy Officer at GCE, will run through what enterprises really need to know about the growing set of big data tools – including those being leveraged by organizations today as well as the new and innovative tools just arriving on the scene. He will also help attendees gain greater insight on how to match the right data center tools to a specific enterprise challenge and what processes need to change to handle big data.
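For readers new to the tools named above, the MapReduce model at the heart of Hadoop is easy to sketch: a map step emits key/value pairs, a shuffle groups them by key, and a reduce step folds each group into a result. A toy in-process word count in Python (illustrative only; a real Hadoop job distributes these phases across a cluster):

```python
from collections import defaultdict

def map_phase(documents):
    """Emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group emitted values by key, as Hadoop's shuffle phase does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Fold each group of counts into a total per word."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data tools", "big data big opportunities"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"])   # 3
print(counts["data"])  # 2
```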
An IT Forecast: Where Cloud, Mobile and Data Are Taking Us
A thorough piece on CNET by Gordon Haff looks at the interconnectivity of cloud computing, mobility and Big Data, and sees these three forces as instrumental in shaping the future of IT.
“Through the lens of next-generation IT, think of cloud computing as being about trends in computer architectures, how applications are loaded onto those systems and made to do useful work, how servers communicate with each other and with the outside world, and how administrators manage and provide access,” Haff writes.
He says the trend also covers the infrastructure and plumbing that make it possible to effectively coordinate data centers full of systems increasingly working as a unified compute resource as opposed to islands of specialized capacity.
Computing is constantly evolving. What makes today particularly interesting is that “we seem to be in the midst of convergent trends of a certain momentum and maturity to reinforce each other in significant ways. That’s what is happening with cloud computing, mobility and Big Data,” Haff writes.
Informatica Upgrades Its iPaaS
Informatica, whose cloud platform already processes upwards of a billion cloud integration transactions a day, has upgraded the offering. It reckons its new Cloud Spring 2012 release will deliver the industry’s most comprehensive cloud integration platform-as-a-service (iPaaS).
The biggest addition is a new Cloud Developer Edition that consists of a cloud connector toolkit and dynamic cloud integration templates for rapid connectivity to applications. Developers can embed end-user customizable integration logic and connectivity into cloud applications and platforms.
System integrators and ISVs should be able to build, customize and deliver native connectivity to any cloud or on-premises business and social applications that have published Web Services APIs.
Cloud Expo NY: Enterprise Transformation Using Cloud and Cloud Platforms
Cloud is a shift from the focus on underlying technology implementation to leveraging existing implementations and further building upon them. Cloud orchestration or a network of clouds is the wave of the future where these clouds can operate with elasticity, scalability, and efficiency. Effective service management is an important aspect of managing such networks. The transition to the cloud will enable the further aggregation of composite web services and enhanced business-to-business capabilities for integrating processes and applications.
In his session at the 10th International Cloud Expo, Ajay Budhraja, CTO at the Department of Justice, will discuss how, for example, the requirement to access multiple clouds will cause a shift toward identity management services and single sign-on capabilities. With cloud services, a traditional project that simply collected survey information from customers and produced reports was transitioned to leverage an authentication service, a cloud customer information service, a cloud reporting service and other cloud services, quickly providing a scalable, highly integrated solution.
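The single sign-on pattern described above can be illustrated with a shared-secret token: one identity service signs an assertion, and each participating cloud service verifies the signature instead of maintaining its own credential store. A minimal HMAC-based sketch in Python (the secret and user names are hypothetical; production SSO across clouds would use a standard such as SAML or OAuth rather than a hand-rolled token):

```python
import hashlib
import hmac

SHARED_SECRET = b"demo-secret"  # hypothetical; real deployments use managed keys

def issue_token(user):
    """Identity service: sign the user id so other services can trust it."""
    sig = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_token(token):
    """Any cloud service: check the signature without its own password store."""
    user, sig = token.rsplit(":", 1)
    expected = hmac.new(SHARED_SECRET, user.encode(), hashlib.sha256).hexdigest()
    return user if hmac.compare_digest(sig, expected) else None

token = issue_token("alice")
print(verify_token(token))           # alice
print(verify_token("alice:forged"))  # None -- tampered token is rejected
```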
LogicMonitor Takes SaaS-based Approach
It’s 11pm – do you know where your data are?
This venerable public-service announcement could serve as a slogan for LogicMonitor, which partners with the likes of NetApp, VMware, Dell, HP, and Citrix to deliver SaaS-based monitoring software.
Case in point: company CEO Kevin McGibben points out that during a big Amazon reboot earlier this year, “if you didn’t have notification tools in place for that reboot and if (Amazon’s) monitoring was in that cloud, then you weren’t notified at the time.” Furthermore, he points out that “interdependencies in the entire stack were affected” by the reboot.
Santa Barbara, CA-based LogicMonitor monitors “physical, virtual, and cloud-based IT environments,” McGibben says. “Numerous data sources with literally hundreds of device types and technologies are monitored.”
McGibben says the company’s customers are using cloud-computing initiatives because “they have to stay nimble, so are constantly adding data sources to their stacks. Well more than half of our customers have at least some multi-tenant or some presence in the cloud.”
“Most of our customers are going from legacy (infrastructure) to embracing virtualization and hosted services and using private cloud, while figuring out public cloud. We also work with companies who are building their own private clouds.”
The company works on month-to-month contracts, and allows business-side people to “bring business metrics into the system and plot them. (There are) multiple dashboards, so you can wake up in the morning, drink your coffee, look at overall metrics, and get a big overview. You can see how much money you’re making – or not, if you have an interruption.”
This doesn’t imply that you must have someone sitting there with coffee in hand around the clock, McGibben points out. “There is automated alerting, so you don’t just have to stare at the glass,” he says. “You can drop a lightweight Java collector in (for the data folks), which watches anything to be monitored – apps, servers, networking, storage, virtualization, etc. It will use whatever protocol is appropriate for polling.”
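The collector-plus-alerting loop McGibben describes reduces to a simple pattern: poll a set of data sources, compare each reading to a threshold, and raise an alert only when a threshold is crossed, so nobody has to stare at the glass. A toy Python sketch (the metrics and thresholds are invented for illustration; LogicMonitor's actual collector is a Java agent speaking protocols such as SNMP or WMI against real devices):

```python
def poll(sources):
    """Poll each data source once; here sources are plain callables."""
    return {name: read() for name, read in sources.items()}

def check_alerts(readings, thresholds):
    """Return an alert message for any metric over its threshold."""
    return [f"ALERT: {name}={value} exceeds {thresholds[name]}"
            for name, value in readings.items()
            if value > thresholds[name]]

# Hypothetical metrics standing in for real device polling.
sources = {"cpu_pct": lambda: 93, "disk_pct": lambda: 41}
thresholds = {"cpu_pct": 90, "disk_pct": 85}

alerts = check_alerts(poll(sources), thresholds)
print(alerts)  # ['ALERT: cpu_pct=93 exceeds 90']
```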
You Can Kiss That Old 19-Inch Rack Good-Bye
A growing throng of Open Compute Project (OCP) disciples converged on Rackspace headquarters in San Antonio, Texas, this week with a mission: overturn the established sixty-year-old EIA 310-D rack standard, inherited from railroad signaling relays and telephone switching, and substitute in its place Open Rack, the first rack standard designed specifically for data centers – especially big hyper-scale data centers like Facebook’s.
Facebook set Open Compute in train a year ago to solve problems it was having shoehorning the compute, storage and networking density it needed into the traditional server rack, a form factor its hardware chief calls “blades gone bad.”
Blades supposedly go bad because of what OCP founding board member Andy Bechtolsheim calls “gratuitous differentiation” on the part of vendors and their lock-in-seeking proprietary designs that sacrifice interoperability.
Crash Course in Open Source Cloud Computing at Cloud Expo New York
Very few trends in IT have generated as much buzz as cloud computing. In his session at the 10th International Cloud Expo, Mark Hinkle, Director, Cloud Computing Community at Citrix, will cut through the hype and quickly clarify the ontology for cloud computing. The bulk of the conversation will focus on the open source software that can be used to build compute clouds (infrastructure-as-a-service) and the complementary open source management tools that can be combined to automate the management of cloud computing environments.
The session will appeal to anyone who has a good grasp of traditional data center infrastructure but is struggling with the benefits and migration path to a cloud computing environment. Systems administrators and IT generalists will leave the discussion with a general overview of the options at their disposal to effectively build and manage their own cloud computing environments using free and open source software.