Scaling Big Data in the Cloud at Cloud Expo New York

Need to scale your data tier? The foundation of every application is the database layer, and today application architects have more choices than ever. With these choices come new questions: Which database technology is best for your application? How can your application take advantage of Big Data technology? Can you run your relational database at Big Data scale? What does it take to implement a comprehensive data infrastructure, including your core database, incorporating SQL, NoSQL, and Big Data platforms?
In his session at the 10th International Cloud Expo, Cory Isaacson, CEO/CTO of CodeFutures Corporation, will answer these questions and more. You will learn exactly what it takes to scale your data tier, and how to keep it reliable – despite the challenges presented in Cloud environments. He will also provide a quick review of the primary types of database platforms, enabling you to choose the best technology for your application challenges. He will close with a high volume social application/gaming case study, showing you exactly what it takes to run a high-volume, multi-terabyte database infrastructure in the cloud.
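The session abstract itself contains no code; purely as a hedged illustration of the horizontal-partitioning (sharding) technique such talks typically cover, the sketch below routes rows to one of several databases by hashing a key. The shard count, table, and file names are invented, and this is not CodeFutures' actual product.
```python
import hashlib
import sqlite3  # stand-in for any relational backend; names below are hypothetical

# Distribute a "scores" table across four independent shards.
SHARDS = [sqlite3.connect(f"users_shard_{i}.db") for i in range(4)]
for db in SHARDS:
    db.execute("CREATE TABLE IF NOT EXISTS scores (user_id TEXT, score INTEGER)")

def shard_for(user_id: str) -> sqlite3.Connection:
    """Route a key deterministically to a shard so each shard holds one slice of the data."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

def save_score(user_id: str, score: int) -> None:
    db = shard_for(user_id)
    db.execute("INSERT INTO scores (user_id, score) VALUES (?, ?)", (user_id, score))
    db.commit()

save_score("player-42", 9001)
```
In practice, cross-shard queries, rebalancing, and replication are where most of the real difficulty lies, which is what sessions like this one address.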

Cloud Expo New York: How to Capitalize on the Cloud and Protect Your Data

As commercial enterprises and the federal sector begin to embrace the public cloud, large organizations are facing a roadblock when it comes to compliance with various data privacy laws and industry regulations. Whether your company falls under E.U. data protection laws, HIPAA PHI guidelines, the U.S.’s FedRAMP requirements, or other regulatory guidelines, the public cloud is not out of reach.
In his session at the 10th International Cloud Expo, Mike Morrissey, VP, Research & Development at PerspecSys, will introduce some of the key privacy, residency, and security areas to consider when adopting public cloud, and best practices for securing the cloud for mission-critical applications without disrupting user functionality.
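PerspecSys’ gateway is proprietary and not described in detail here; the following is only a generic sketch of one common approach to the residency problem, in which sensitive fields are swapped for random tokens before a record leaves the premises, so the cloud application never sees the clear values. All names are hypothetical.
```python
import secrets

# Toy token vault: substitutes random tokens for sensitive fields before a
# record is sent to a public cloud application. The vault stays on premises.
_vault: dict[str, str] = {}  # token -> original value

def tokenize(value: str) -> str:
    token = "tok_" + secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    return _vault[token]

record = {"name": "Alice Example", "ssn": "123-45-6789", "plan": "gold"}
outbound = {**record, "name": tokenize(record["name"]), "ssn": tokenize(record["ssn"])}
# 'outbound' can be stored in the cloud; only the local vault can reverse it.
print(outbound)
print(detokenize(outbound["ssn"]))
```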

Cloudera KK to Help Spur Hadoop Adoption in APAC Region

Cloudera, the leading provider of Apache Hadoop-based data management software, services and training, today announced that it has established a Japanese subsidiary, Cloudera KK, and an office in Japan. Cloudera’s formal presence extends availability of its products and support offerings to Japanese enterprises that have deployed or are seeking to deploy Apache Hadoop and related technologies in the Hadoop stack to unlock business insights from their Big Data.

Joyent Cloud Europe Unveiled in Amsterdam

“Many of our Joyent Cloud customers have experienced significant growth based on the superior user experience and performance we deliver on Joyent Cloud,” said Steve Tuck, EVP and GM of Joyent Cloud, Joyent’s public cloud line of business, as Joyent Cloud Europe was unveiled this week: an extension of the US-based Joyent Cloud, located in Amsterdam, that will provide high-performance, real-time Infrastructure-as-a-Service in Europe.

Will Cloud Become the De Facto Standard for Computing?

“The recent TOSCA initiative has made interoperability for cloud computing closer than ever,” observed Andrew Hillier, co-founder and CTO of CiRBA, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. “However, until players like Amazon and Google join in,” Hillier continued, “it will be difficult for organizations to move from one cloud to the other without risks to their data and infrastructure.”
Cloud Computing Journal: Agree or disagree? – “While the IT savings aspect is compelling, the strongest benefit of cloud computing is how it enhances business agility.”
Andrew Hillier: Although savings and agility are both compelling benefits, it’s usually agility that’s realized first. This isn’t because it is a higher priority, but because it occurs earlier in the cloud adoption process. The push toward standardization and self-service can rapidly increase flexibility and decrease provisioning time, but can actually work against efficiency (much to the surprise of many cloud adopters). The resulting environments are difficult to manage, and many organizations end up with higher spend (for external clouds) or much lower density (internal clouds) than they originally envisioned. Fortunately, by adopting more sophisticated methods of planning and controlling these environments, workload placements and resource allocations can be safely optimized, eliminating over-provisioning once and for all and turning the cloud adoption process into the “win-win” that was originally targeted.
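Hillier does not spell out CiRBA’s algorithms in this Q&A; purely as an assumed illustration of the kind of placement optimization he alludes to, the sketch below packs workload CPU demands onto as few hosts as a simple first-fit-decreasing heuristic allows. The workload names and capacities are invented.
```python
# Hypothetical first-fit-decreasing placement: pack workload CPU demands
# onto the fewest hosts without exceeding each host's capacity.
def place(workloads: dict[str, float], host_capacity: float) -> list[dict[str, float]]:
    hosts: list[dict[str, float]] = []
    for name, demand in sorted(workloads.items(), key=lambda kv: kv[1], reverse=True):
        for host in hosts:
            if sum(host.values()) + demand <= host_capacity:
                host[name] = demand  # fits on an existing host
                break
        else:
            hosts.append({name: demand})  # open a new host
    return hosts

demands = {"web": 3.0, "db": 6.0, "batch": 4.5, "cache": 1.5, "etl": 2.0}
for i, host in enumerate(place(demands, host_capacity=8.0)):
    print(f"host {i}: {host}")
```
Production capacity planning also has to account for memory, I/O, affinity, and headroom policies, which is why unmanaged environments drift toward the over-provisioning Hillier describes.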

Do Object Storage Plays Displace File Systems or Are They Absorbed?

NAB kept me totally away from all the interesting online discussions last week. It’s too late to respond to @JoinToigo’s tweet (in Dutch we’d call this “figs after Easter”, i.e., too late to matter), but I thought I’d share my thoughts in a bit more than 140 characters.
The short answer is no … but a better answer is very much *yes*.
The first file systems were not designed with the thought of petabytes of data. I don’t know what the exact projections were back then, but gigabytes must have sounded pretty sci-fi. Bytes and kilobytes were a lot more common. We didn’t think that we’d soon all be creating tens if not hundreds of multi-megabyte files per day.
File systems have of course evolved a lot and some have become so popular you could actually say they have a fan base (I’d need to do research on ZFS fan clubs). It is clear that the file system has played a very important role in the evolution of the computer industry. In my list of features that helped to make computers a commodity, the file system would probably be in the top three (with the windows-style GUI and the mouse). The file system enables the use of directories, which have been the most important tool to keep our data organized.
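To make the contrast concrete (this is my own toy model, not any particular product), the snippet below compares a hierarchical path lookup with a flat object store in which a key maps directly to data plus metadata, so retrieval can go by key or by metadata rather than by directory traversal.
```python
from pathlib import Path

# Hierarchical file system: organization lives in the directory tree.
doc = Path("projects") / "2012" / "q2" / "report.pdf"

# Flat object store (toy in-memory model): organization lives in keys and metadata.
bucket: dict[str, dict] = {}
bucket["projects/2012/q2/report.pdf"] = {
    "data": b"%PDF-1.4 ...",
    "metadata": {"owner": "finance", "quarter": "Q2-2012", "type": "report"},
}

# Retrieve by metadata, with no directory traversal involved.
reports = [key for key, obj in bucket.items() if obj["metadata"]["type"] == "report"]
print(doc, reports)
```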

Information Delivery 2.0 – Reference Architecture

With new sources of data now flowing into the enterprise, it is time to look at the issues with Information Delivery 1.0 in today’s enterprises.
Disparate data sources: most enterprises have grown multiple database platforms, and even within a single platform, multiple databases, for various reasons. Enterprises spend a great deal of pain and effort on ETL to keep this data synchronized.
Enterprises are slowly bringing big data and unstructured data into their information delivery scope, but lack a clear means of integrating them.
Rich media content (audio, video, music, and other binary content) is finding its place, but the real context of that content is identified by its metadata rather than by the content itself.
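Purely as an assumed sketch of the integration gap described above, the snippet below pulls records from two hypothetical sources and normalizes them into a single delivery-ready shape, with rich media represented only by its metadata.
```python
# Hypothetical sources: a relational extract and a rich-media catalog.
crm_rows = [{"cust_id": 7, "name": "Acme", "region": "EMEA"}]
media_items = [{"file": "acme_demo.mp4", "customer": "Acme",
                "tags": ["demo", "video"], "duration_s": 312}]

def unify(crm, media):
    """Normalize disparate records into one delivery-ready shape.

    Media is described by its metadata (tags, duration), not its binary content."""
    view = []
    for row in crm:
        view.append({"entity": row["name"], "kind": "customer", "attrs": row})
    for item in media:
        view.append({"entity": item["customer"], "kind": "media",
                     "attrs": {k: v for k, v in item.items() if k != "file"},
                     "ref": item["file"]})
    return view

for record in unify(crm_rows, media_items):
    print(record)
```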

IBM’s Buying Vivisimo for Its Big Data Push

In the name of its Hadoop-based Big Data platform, IBM is buying Carnegie Mellon spin-off and enterprise search house Vivisimo on undisclosed terms.
The Pittsburgh ISV, which has its own search and navigation system, is supposed to be good at “capturing and delivering quality information across the broadest range of data sources, no matter what format it is, or where it resides,” providing a “single view across the enterprise.” It’s all automated and can be used standalone or embedded.
Vivisimo saw all of $5.66 million in funding from 2000 through 2008, according to CrunchBase, including a $4 million A round led by North Atlantic Capital.
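Vivisimo’s engine is proprietary, so the following is only a hedged sketch of the federated “single view” idea: fan a query out to several hypothetical sources and merge the hits into one ranked list. The source names and scoring are invented.
```python
# Illustrative federation: query several sources and merge results into one view.
def search_wiki(q):  return [{"src": "wiki", "title": f"{q} runbook", "score": 0.72}]
def search_crm(q):   return [{"src": "crm", "title": f"{q} account", "score": 0.91}]
def search_files(q): return [{"src": "files", "title": f"{q} spec.docx", "score": 0.55}]

SOURCES = [search_wiki, search_crm, search_files]

def federated_search(query: str) -> list[dict]:
    hits = [hit for source in SOURCES for hit in source(query)]
    return sorted(hits, key=lambda h: h["score"], reverse=True)  # single ranked view

print(federated_search("Hadoop cluster"))
```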

AppZero Announces Availability of zapp Cloud Migrator

Billed as the fastest and most flexible way to move server applications to any cloud, AppZero took a market-setting step forward today with the release of zapp cloud migrator. The technology extracts Windows server applications from production environments and packages them for movement to any cloud, without re-engineering, change, or lock-in.
Applications packaged by zapp can be copied and run on any cloud or data center server with the ease of an enterprise app store. This capability is well suited to hybrid/federated cloud scenarios in which enterprise workloads are moved on premises or to clouds in response to business requirements.
For use cases that call for ease of application on-boarding with no further planned movement, AppZero also offers a dissolve function. Upon deployment, dissolve removes the AppZero packaging, installing the application into the OS.
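AppZero has not published zapp’s internals here; the snippet below is only a conceptual sketch of the package, copy, and dissolve lifecycle the announcement describes, with invented function names and a plain archive standing in for the real packaging format.
```python
import shutil
from pathlib import Path

# Conceptual package -> copy -> dissolve lifecycle (names and format invented).
def package(app_dir: str, archive_base: str) -> str:
    """Capture an application's files into a portable archive (the packaging idea)."""
    return shutil.make_archive(archive_base, "zip", app_dir)

def copy_to_target(archive: str, target_root: str) -> Path:
    """Move the archive to another server or cloud instance."""
    dest = Path(target_root)
    dest.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy(archive, dest))

def dissolve(archive: Path, install_dir: str) -> None:
    """Unpack into the OS and discard the packaging, as the dissolve step does."""
    shutil.unpack_archive(archive, install_dir)
    archive.unlink()
```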

Guest post: The Taxonomy of IT – Part 5

25th April 2012

Journey to the Cloud’s Geoff Smith provides the final part of his five-part blog post series, in which he looks at how to classify current changes in the IT department. This week he focuses on the concept of species and explains why the idea of classifying IT is so important.