Best Practices for Growing your VPS and Cloud Server Business

Parallels recently published a case study looking at how Net Logistics, a leading Australian web hosting provider, created new revenue opportunities with Virtual Private Server (VPS) offerings. In the case study, Net Logistics Business Development Manager Joseph Salim described how it took just three years to go from having no VPS offerings to VPS accounting for 40% of their customer base. Today, 60% of all new customers buy VPS from Net Logistics.

The demand for Virtual Private Servers continues to explode, spanning everything from a single VPS to elastic, hourly-billed cloud infrastructure. Deploying cost-effective VPS and cloud server solutions is the key to accelerating revenue and realizing profit. Our upcoming webinar on May 9 focuses on growing your VPS and cloud server business. Lowell Anderson, Parallels Director of IaaS Product Marketing, will share the results of our latest market research and best practices for optimizing your cost structure and driving profit from your VPS solutions. Liam Eagle, Editor in Chief of Web Host Industry Review (The WHIR), will moderate the discussion across a range of topics, including market survey results describing VPS revenue opportunities, buyer preferences and what drives purchasing of VPS services, best practices for profiting from VPS services, and how to plan for the future.

Would you like to find out how your business can capitalize on this market opportunity by offering VPS to your customers? Please join us May 9, 2012, from 11am-12pm Pacific (2pm-3pm Eastern) for our Best Practices for Growing your VPS and Cloud Server Business webinar.

To register or invite customers to attend, click here.

Release of OpenNebula 3.4.1

The OpenNebula project has just announced the general availability of OpenNebula 3.4.1. This is a maintenance release that fixes bugs reported by the community and adds new languages for the Sunstone and Self-Service portals. OpenNebula 3.4 (Wild Duck) was released three weeks ago, bringing countless valuable contributions from many members of our community, especially Research in Motion, Logica, Terradue 2.0, CloudWeavers, Clemson University, and Vilnius University.
OpenNebula 3.4 incorporated support for multiple Datastores, which provides extreme flexibility in planning the storage backend and important performance benefits, such as balancing I/O operations, defining different SLA policies and features for different VM types or users, and easily scaling the cloud storage. Additionally, OpenNebula 3.4 also featured improvements in other areas, such as support for clusters (resource pools), new tree-like menus for Sunstone, and the addition of the Elastic IP calls to the EC2 Query API.
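The I/O-balancing idea behind multiple datastores can be pictured with a small sketch. This is a hypothetical placement policy written purely for illustration; the field names and the policy are assumptions, not OpenNebula's actual scheduler:

```python
# Illustrative sketch: place each new VM image on the datastore with the
# most usable free capacity, spreading storage I/O across backends.
# Hypothetical policy and field names, not OpenNebula code.

def pick_datastore(datastores):
    """Return the name of the datastore with the most usable free space."""
    return max(datastores, key=lambda ds: ds["free_mb"] - ds["reserved_mb"])["name"]

datastores = [
    {"name": "ssd_pool",  "free_mb": 20000, "reserved_mb": 5000},
    {"name": "sata_pool", "free_mb": 80000, "reserved_mb": 1000},
]
```

A real deployment would also weigh datastore class (SSD vs. SATA) against the SLA of the requesting user or VM type, which is the kind of per-datastore policy the release describes.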

read more

Cloud Expo New York: The Growing Big Data Tools Landscape

Hadoop, MapReduce, Hive, HBase, Lucene, Solr? The only thing growing faster than enterprise data these days is the landscape of big data tools. These tools are designed to help organizations turn big data into opportunity by gaining deeper insight into massive volumes of information. A recent Gartner report predicts that enterprise data will increase by 650% over the next five years, which means the time is now for IT decision makers to determine which big data tools are the best, and most cost-effective, for their organization.
In his session at the 10th International Cloud Expo, David Lucas, Chief Strategy Officer at GCE, will run through what enterprises really need to know about the growing set of big data tools – including those being leveraged by organizations today as well as the new and innovative tools just arriving on the scene. He will also help attendees gain greater insight on how to match the right data center tools to a specific enterprise challenge and what processes need to change to handle big data.
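The MapReduce model named above is simple enough to sketch in a few lines of plain Python. The function names here are illustrative and not tied to Hadoop's API:

```python
# Minimal sketch of the MapReduce model: map emits (key, value) pairs,
# the framework groups pairs by key ("shuffle"), and reduce aggregates
# each group. Here: the classic word count.
from collections import defaultdict

def map_phase(document):
    # Emit (word, 1) for every word in the document.
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    groups = defaultdict(int)   # shuffle: group values by key
    for word, count in pairs:
        groups[word] += count   # reduce: sum the counts per word
    return dict(groups)

def word_count(documents):
    pairs = []
    for doc in documents:
        pairs.extend(map_phase(doc))
    return reduce_phase(pairs)
```

The value of frameworks like Hadoop is that the map and reduce steps run in parallel across a cluster, with the shuffle handled by the framework; the logic itself stays this small.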

read more

An IT Forecast: Where Cloud, Mobile and Data Are Taking Us

A thorough piece on CNET by Gordon Haff looks at the interconnectivity of cloud computing, mobility and Big Data, and sees these three forces as instrumental in shaping the future of IT.
“Through the lens of next-generation IT, think of cloud computing as being about trends in computer architectures, how applications are loaded onto those systems and made to do useful work, how servers communicate with each other and with the outside world, and how administrators manage and provide access,” Haff writes.
He says the trend also covers the infrastructure and plumbing that make it possible to effectively coordinate data centers full of systems increasingly working as a unified compute resource as opposed to islands of specialized capacity.
Computing is constantly evolving. What makes today particularly interesting is that “we seem to be in the midst of convergent trends of a certain momentum and maturity to reinforce each other in significant ways. That’s what is happening with cloud computing, mobility and Big Data,” Haff writes.

read more

Informatica Upgrades Its iPaaS

Informatica, whose Cloud already processes upwards of a billion cloud integration transactions a day, has upgraded the service. It reckons its new Cloud Spring 2012 release will deliver the industry’s most comprehensive cloud integration platform-as-a-service (iPaaS).
The biggest addition is a new Cloud Developer Edition that consists of a cloud connector toolkit and dynamic cloud integration templates for rapid connectivity to applications. Developers can embed end-user customizable integration logic and connectivity into cloud applications and platforms.
System integrators and ISVs should be able to build, customize and deliver native connectivity to any cloud or on-premise business and social applications that have published Web Services APIs.

read more

Cloud Expo NY: Enterprise Transformation Using Cloud and Cloud Platforms

Cloud is a shift from the focus on underlying technology implementation to leveraging existing implementations and further building upon them. Cloud orchestration or a network of clouds is the wave of the future where these clouds can operate with elasticity, scalability, and efficiency. Effective service management is an important aspect of managing such networks. The transition to the cloud will enable the further aggregation of composite web services and enhanced business-to-business capabilities for integrating processes and applications.
In his session at the 10th International Cloud Expo, Ajay Budhraja, CTO at the Department of Justice, will discuss how, for example, the requirement to access multiple clouds will drive a shift toward identity management services and single sign-on capabilities. With cloud services, a traditional project that merely collected survey information from customers and produced reports was transitioned to leverage an authentication service, a cloud customer-information service, a cloud reporting service, and other cloud services to deliver a scalable, highly integrated solution quickly.
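As a rough illustration of how single sign-on lets multiple cloud services trust one authentication step, here is a hedged sketch using a shared-secret signed token. The scheme and names are assumptions for illustration only, not any agency's actual design:

```python
# Illustrative SSO sketch: an identity service signs a token once; every
# participating cloud service verifies the same signature instead of
# authenticating the user again. Hypothetical scheme; real deployments
# use standards such as SAML or OAuth.
import hashlib
import hmac

SHARED_KEY = b"demo-key"  # stand-in; real systems use PKI or per-service keys

def issue_token(user):
    """Identity service: sign the user identity once at login."""
    sig = hmac.new(SHARED_KEY, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def verify_token(token):
    """Any service: accept the token if the signature checks out."""
    user, _, sig = token.partition(":")
    expected = hmac.new(SHARED_KEY, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```

Each downstream service (survey, reporting, customer information) only needs `verify_token`, which is the property that makes multi-cloud access practical without repeated logins.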

read more

LogicMonitor Takes SaaS-based Approach

It’s 11pm – do you know where your data are?

This venerable public-service announcement could serve as a slogan for LogicMonitor, which partners with the likes of NetApp, VMware, Dell, HP, and Citrix to deliver SaaS-based monitoring software.

Case in point: company CEO Kevin McGibben points out that during a big Amazon reboot earlier this year, “if you didn’t have notification tools in place for that reboot and if (Amazon’s) monitoring was in that cloud, then you weren’t notified at the time.” Furthermore, he points out that “interdependencies in the entire stack were affected” by the reboot.

Santa Barbara, CA-based LogicMonitor monitors “physical, virtual, and cloud-based IT environments,” McGibben says. “Numerous data sources with literally hundreds of device types and technologies are monitored.”

McGibben says the company’s customers are pursuing cloud-computing initiatives because “they have to stay nimble, so are constantly adding data sources to their stacks. Well more than half of our customers have at least some multi-tenant or some presence in the cloud.”

“Most of our customers are going from legacy (infrastructure) to embracing virtualization, hosted services and using private cloud, while figuring out public cloud. We also work with companies who are building their own private clouds.”

The company works on month-to-month contracts, and allows business-side people to “bring business metrics into the system and plot them. (There are) multiple dashboards, so you can wake up in the morning, drink your coffee, look at overall metrics, and get a big overview. You can see how much money you’re making – or not, if you have an interruption.”

This doesn’t imply that you must have someone sitting there with coffee in hand around the clock, McGibben points out. “There is automated alerting, so you don’t just have to stare at the glass,” he says. “You can drop a lightweight Java collector in (for the data folks), which watches anything to be monitored – apps, servers, networking, storage, virtualization, etc. It will use whatever protocol is appropriate for polling.”
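LogicMonitor’s collector internals aren’t public, but the automated-alerting idea McGibben describes (poll metrics, compare against thresholds, raise alerts) can be sketched as follows; the metric names and thresholds are hypothetical:

```python
# Illustrative threshold-alerting pass: a collector has polled a set of
# metrics; alerts fire for any metric that exceeds its configured
# threshold, so nobody has to stare at a dashboard. Hypothetical names,
# not LogicMonitor's implementation.

def evaluate(samples, thresholds):
    """Return alert messages for any sampled metric over its threshold."""
    return [
        f"ALERT {name}: {value} > {thresholds[name]}"
        for name, value in samples.items()
        if name in thresholds and value > thresholds[name]
    ]

samples = {"cpu_pct": 97, "disk_pct": 40}     # latest polled values
thresholds = {"cpu_pct": 90, "disk_pct": 85}  # configured limits
```

In a real product this loop runs continuously per protocol (SNMP, WMI, JMX, etc.) and routes alerts to notification channels; the core comparison is this simple.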

read more

You Can Kiss That Old 19-Inch Rack Good-Bye

A growing throng of Open Compute Project (OCP) disciples converged on Rackspace headquarters in San Antonio, Texas, this week to overturn the established sixty-year-old EIA 310-D rack standard, inherited from railroad signaling relays and telephone switching, and substitute in its place Open Rack, the first rack standard designed expressly for data centers, especially big hyper-scale data centers like Facebook’s.
Facebook set Open Compute in train a year ago to solve problems it was having trying to shoehorn the compute, storage and networking density it needed into the traditional server rack, a form factor its hardware master calls “blades gone bad.”
Blades supposedly go bad because of what OCP founding board member Andy Bechtolsheim calls “gratuitous differentiation” on the part of vendors and their lock-in-seeking proprietary designs that sacrifice interoperability.

read more

Crash Course in Open Source Cloud Computing at Cloud Expo New York

Very few trends in IT have generated as much buzz as cloud computing. In his session at the 10th International Cloud Expo, Mark Hinkle, Director, Cloud Computing Community at Citrix, will cut through the hype and quickly clarify the ontology for cloud computing. The bulk of the conversation will focus on the open source software that can be used to build compute clouds (infrastructure-as-a-service) and the complementary open source management tools that can be combined to automate the management of cloud computing environments.
The session will appeal to anyone who has a good grasp of traditional data center infrastructure but is struggling with the benefits and migration path to a cloud computing environment. Systems administrators and IT generalists will leave the discussion with a general overview of the options at their disposal to effectively build and manage their own cloud computing environments using free and open source software.

read more

Cloud Expo New York: Infrastructure Planning in Next-Gen Data Centers

Capacity management may not be dead yet, but with the adoption of private clouds it’s barely recognizable. IT organizations are radically changing how they plan and manage infrastructure to cope with the complexity of these large-scale shared environments and to prevent the over-provisioning that results from old-school planning approaches.
In his session at the 10th International Cloud Expo, Andrew Hillier, Co-Founder and CTO at CiRBA, will outline best practices for gaining control over dynamic capacity supply and workload demand in large-scale virtual and cloud infrastructure. He will also discuss how leading Fortune 500 organizations brought together infrastructure teams, capacity teams and application owners to increase agility, reduce risk and costs by optimizing infrastructure planning and management processes.
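The over-provisioning that old-school planning produces can be seen in a toy calculation: sizing a dedicated slice for each workload’s peak versus pooling demand in a shared environment. This is an illustrative sketch, not CiRBA’s methodology:

```python
# Toy comparison of capacity planning approaches. Per-workload planning
# rounds every workload's peak up to whole hosts; shared planning pools
# the demand first and rounds up once. Numbers are hypothetical.
import math

def hosts_needed(demands_gb, host_capacity_gb):
    """Return (dedicated, shared) host counts for the same workloads."""
    # Old-school: one dedicated allocation per workload, each rounded up.
    dedicated = sum(math.ceil(d / host_capacity_gb) for d in demands_gb)
    # Shared cloud: pool all demand, then round up once.
    shared = math.ceil(sum(demands_gb) / host_capacity_gb)
    return dedicated, shared
```

For four workloads peaking at 10, 12, 9 and 11 GB of RAM on 32 GB hosts, dedicated planning buys four hosts while pooled planning needs two; real planning also has to model workload peaks that coincide, which is where dynamic supply/demand analysis comes in.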

read more