Cloud Computing: CiRBA 7.1 Now Available

Console, enabling IT organizations to optimize capacity decisions, VM placements and resource allocations for AIX-based IBM PowerVM environments. This latest version allows customers to leverage the same technology they use to optimize VMware and Red Hat Enterprise Virtualization environments to manage virtualized AIX infrastructure.
According to Andrew Hillier, CiRBA CTO and co-founder, “It becomes an analytics challenge, and the key is to strike the optimal balance of efficiency and risk given the infrastructure capabilities, the requirements of the workloads being hosted, and the policies governing the relationship between the two.”

read more

Cloud Key Management – Addressing Data Encryption in the Public Cloud

Cloud computing is gaining more and more traction across enterprises and SMB organizations. Its many benefits and attractive cost structure provide an alternative to the traditional data center, but at the same time cloud data security, cloud encryption and cloud key management remain top concerns. Thought leaders and analysts agree that cloud data encryption is a fundamental first step, but a look at the fine print reveals a more complicated situation.
We commonly identify three approaches to cloud key management, each with its pros and cons. The first is to use the encryption provided by your cloud provider. The pros are obvious: it is easy to deploy and manage, and it integrates transparently with your cloud data layer. But the cost is high: you trust your cloud provider with what should be your best-kept secret, your encryption keys. Data security expert Rich Mogull described this well on his blog. The second approach is to trust a third party with your encryption keys. This eliminates some of the cloud's flexibility advantages, since encryption is no longer integrated with your cloud, and it still carries the same risk as before: you are trusting a third party with your keys. The third approach is to implement a key management server back in the physical data center. While this is indeed secure, it eliminates many cloud advantages and forces you back into your data center, when what you wanted was to migrate to the cloud.
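To make the trade-off concrete, here is a toy envelope-encryption sketch in Python (standard library only) of the idea behind keeping key material under your own control: a data key is wrapped locally with a master key, and only the wrapped form ever leaves your premises. The function names are illustrative, and the one-time-pad XOR wrap stands in for a real key-wrap algorithm such as AES Key Wrap (RFC 3394).

```python
import secrets

# Envelope encryption sketch: the data key is generated and wrapped
# locally, so the cloud provider never sees it in the clear.

def generate_key(nbytes: int = 32) -> bytes:
    """Generate a random key with a cryptographically secure RNG."""
    return secrets.token_bytes(nbytes)

def wrap_key(data_key: bytes, kek: bytes) -> bytes:
    """One-time-pad wrap: secure only if the KEK is random, the same
    length, and used exactly once; real systems use AES Key Wrap."""
    assert len(kek) == len(data_key)
    return bytes(a ^ b for a, b in zip(data_key, kek))

def unwrap_key(wrapped: bytes, kek: bytes) -> bytes:
    return wrap_key(wrapped, kek)  # XOR is its own inverse

kek = generate_key()               # master key stays on-premises
data_key = generate_key()          # used to encrypt data in the cloud
wrapped = wrap_key(data_key, kek)  # only this ciphertext is stored remotely
assert unwrap_key(wrapped, kek) == data_key
```

The point of the sketch is the data flow, not the cipher: whoever holds the key-encryption key controls access to the data, regardless of where the encrypted bytes live.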

read more

Best Practices in Cloud Security

One news item that attracted media attention last week was the theft of nearly 450,000 passwords from a Yahoo service called ‘Yahoo Voice’. Communications on the incident state that SQL injection was the primary technique the hackers used to extract the information from the databases and publish it.
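The SQL injection technique mentioned above exploits queries built by string concatenation. A minimal sketch using Python's built-in sqlite3 module (the table and values are hypothetical) shows the difference between a vulnerable query and a parameterized one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "s3cret"), ("bob", "hunter2")])

attacker_input = "' OR '1'='1"

# Vulnerable: concatenating user input lets it rewrite the query logic.
query = "SELECT * FROM users WHERE name = '" + attacker_input + "'"
leaked = conn.execute(query).fetchall()   # every row comes back

# Safe: a parameterized query treats the input as data, not as SQL.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attacker_input,)
).fetchall()                              # no rows match
```

Parameterized queries (and their ORM equivalents) are the standard defense, which is why they appear in virtually every list of secure development practices.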
Subsequent communications show the affected company taking more precautions to ensure that security is its highest priority. Events like these also tend to shake cloud adoption at the enterprise level, where they can increase the fear, uncertainty and doubt in the minds of CIOs.
However, the following best practices and guidelines should be adopted by any enterprise taking up hybrid cloud computing, and a one-off incident should not dampen its road map to hybrid computing adoption.

read more

How to Survive in the Cloud (Infographic)

With Microsoft’s announcements at this year’s Worldwide Partner Conference (WPC), it is evident that change is coming to all enterprise software resellers. To better prepare for the inevitable transition to the cloud, here is an infographic that compares what it was like to sell an on-premises solution with what it will be like to sell “in the cloud”.

read more

Penguin Computing Offers HPC Compute Clouds Built for Academia, Research

Penguin Computing today announced partnerships with multiple universities to enable easy, quick and unbureaucratic on-demand access to scalable HPC compute resources for academic researchers.

“Penguin Computing has traditionally been very successful with HPC deployments in academic environments with widely varying workloads, many departments competing for resources and very limited budgets for capital expenses, a cloud based model for compute resources makes perfect sense,” says Tom Coull, Senior Vice President and General Manager of Software and Services at Penguin Computing. “The new partnerships help academic institutions with a flexible cloud based resource allocation for their researchers. At the same time, they present an opportunity for IT departments to create an ongoing revenue stream by offering researchers from other schools access to their cloud.”

Penguin has implemented three versions of academic HPC clouds:

Hybrid Clouds – A local ‘on-site’ cluster configured to support the use of Penguin-on-Demand (POD) cloud resources as needed, on a pay-as-you-go basis. Local compute resources can be provisioned for average demand, and utilization peaks can be offloaded transparently. This model lowers the initial capital expense, and for temporary workload peaks excess cycles are provided cost-effectively by Penguin’s public HPC cloud. Examples of hybrid cloud deployments include the University of Delaware and Memphis University.

Channel Partnerships – Between universities and Penguin Computing, these allow educational institutions to become distributors for POD compute cycles. University departments with limited access to compute resources for research can use Penguin’s virtual supercomputer on demand, paying as they go, which allows them to fund the usage from their operational IT budget. When departments use the university’s HPC cloud, the revenue can supplement funding for IT staff or projects, increasing the department’s capabilities. This model has been successfully implemented at the California Institute of Technology in conjunction with Penguin’s PODshell, a web-service-based solution that supports the submission and monitoring of HPC cloud compute jobs from any Linux system with Internet connectivity.

Combination Hybrid / Channel – The benefits of the first two models have been combined at Indiana University (IU) as a public-private partnership. Penguin leverages the university’s HPC facilities and human resources, while IU benefits from fast access to local compute resources and Penguin’s HPC experience. IU can use POD resources and provide compute capacity to other academic institutions. The agreement between IU and Penguin also has the support of a group of founding user-partners, including the University of Virginia, the University of California, Berkeley, and the University of Michigan, who along with IU will be users of the new service. The POD colocation offers access through the high-speed national research network Internet2 and is integrated with the XSEDE infrastructure that enables scientists to transparently share computing resources.

“This is a great example of a community cloud service,” said Brad Wheeler, vice president for information technology and CIO at Indiana University. “By working together in a productive private-public partnership, we can achieve cost savings through larger scales while also ensuring security and managing the terms of service in the interests of researchers.”

For more information about Penguin Computing’s HPC compute resources, please visit www.penguincomputing.com.


Cloud Computing: Rackspace to Extend Open Ecosystem of Cloud Technologies

Rackspace Hosting on Wednesday announced that it has upgraded its Cloud Tools Marketplace to enhance the customer experience for accessing the ecosystem of cutting-edge applications, tools and solutions.
John Engates, chief technology officer at Rackspace, noted that “the Cloud Tools Marketplace provides an ideal customer experience by making it easier for them to identify the most appropriate technologies to use with their open cloud deployment.”
Through the Cloud Tools Marketplace, developers and enterprise IT professionals building on the open Rackspace Cloud, as well as first-generation Cloud Servers, can evaluate more than one hundred industry-leading technologies to deliver advanced capabilities for their environments hosted at Rackspace.

read more

Counting the Cost of Cloud

IT costs were always a worry, but only an occasional one. Cloud computing has changed that.
Here’s how it used to be. The New System was proposed. Costs were estimated, more or less accurately, for computing resources, staff increases, maintenance contracts, consultants and outsourcing. The battle was fought, the New System was approved, the checks were signed, and everyone could forget about costs for a while and concentrate on other issues, such as making the New System actually work.
One of the essential characteristics of cloud computing is “measured service.” Resource usage is measured by the byte transmitted, the byte stored, and the millisecond of processing time. Charges are broken down by the hour and billed by the month. This can change the way people make decisions.
“The New System is really popular. It’s being used much more than expected.”
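Under metered billing, that popularity shows up directly on the invoice. A small Python sketch with purely hypothetical rates (not any provider's actual pricing) shows how usage-based charges scale with demand:

```python
# Hypothetical metered rates, for illustration only; these are not any
# provider's actual prices.
RATE_PER_GB_TRANSFER = 0.12  # dollars per GB transmitted
RATE_PER_GB_STORED   = 0.10  # dollars per GB stored, per month
RATE_PER_CPU_HOUR    = 0.08  # dollars per hour of processing time

def monthly_cost(gb_out: float, gb_stored: float, cpu_hours: float) -> float:
    """Sum the metered charges for one month of usage."""
    return (gb_out * RATE_PER_GB_TRANSFER
            + gb_stored * RATE_PER_GB_STORED
            + cpu_hours * RATE_PER_CPU_HOUR)

# If the New System is twice as popular as expected, the bill doubles too.
print(f"${monthly_cost(500, 200, 720):.2f}")    # expected demand
print(f"${monthly_cost(1000, 400, 1440):.2f}")  # twice the demand
```

With a fixed capital purchase, unexpected popularity was the data center's problem; with measured service, it is the budget's.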

read more

Greater New York Chamber of Commerce “Association Sponsor” of Cloud Expo

SYS-CON Events announced today that the Greater New York Chamber of Commerce has been named “Association Sponsor” of SYS-CON’s 11th International Cloud Expo, which will take place on November 5–8, 2012, at the Santa Clara Convention Center in Santa Clara, CA.
The mission of the Chamber is to improve the business climate and quality of living in the New York Metropolitan Area for residents, workers and visitors. It provides valuable services to over 20,000 business and civic leaders who represent the backbone of the Greater New York business community.

read more

Survey: Emerging Markets Ready to Adopt Cloud Computing Services

Emerging economies are ripe markets for cloud computing services – including paid services – the Business Software Alliance reported today.
BSA President and CEO Robert Holleyman noted that “we’re seeing a leapfrog effect. A lot of recent adopters of computers and information technology are jumping straight to the cloud.”
BSA partnered with Ipsos Public Affairs to survey nearly 15,000 computer users in 33 countries about their understanding and use of cloud computing.

read more
