All entries posted by Latest News from Cloud Computing Journal

The Best Way To Get Hadoop Is Getting Better

Good news for Big Data users: Cloudera recently released the second and final beta for Cloudera’s Distribution Including Apache Hadoop version 4 (CDH4), meaning that the official CDH4 release is coming soon. If you aren’t already using CDH, Cloudera offers the leading open-source distribution of the fast, agile, and reliable distributed computing platform changing the way enterprises […]

read more

GovCloud 2.0 Apps for Citizen Service

When it comes to embracing social media and applications, many businesses seem to question the benefit. True, if you are a consumer goods manufacturer, there is inherent benefit in associating yourself with social media and consumer applications, but what about for Government agencies? Does it make sense for Governments, especially municipal, to look at creating […]

read more

Scaling Big Data in the Cloud at Cloud Expo New York

Need to scale your data tier? The foundation of every application is the database layer, and today application architects have more choices than ever. With these choices come new questions: Which database technology is best for your application? How can your application take advantage of Big Data technology? Can you run your relational database at Big Data scale? What does it take to implement a comprehensive data infrastructure, including your core database, incorporating SQL, NoSQL, and Big Data platforms?
In his session at the 10th International Cloud Expo, Cory Isaacson, CEO/CTO of CodeFutures Corporation, will answer these questions and more. You will learn exactly what it takes to scale your data tier, and how to keep it reliable – despite the challenges presented in Cloud environments. He will also provide a quick review of the primary types of database platforms, enabling you to choose the best technology for your application challenges. He will close with a high volume social application/gaming case study, showing you exactly what it takes to run a high-volume, multi-terabyte database infrastructure in the cloud.
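One common way to scale a relational data tier of the kind described above is hash-based sharding, where each record is routed to one of several databases by hashing its key. The sketch below is a generic illustration of that technique in Python; the shard names and routing scheme are assumptions for illustration, not CodeFutures' actual product:

```python
import hashlib

# Hypothetical shard identifiers -- placeholders for real database
# connection strings.
SHARDS = ["db-shard-0", "db-shard-1", "db-shard-2", "db-shard-3"]

def shard_for(key: str) -> str:
    """Route a record to a shard by hashing its key.

    A stable hash (rather than Python's per-process hash()) keeps the
    key-to-shard mapping identical across machines and restarts.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

# All rows for one user hash to the same shard, so per-user queries
# touch a single database no matter how many shards exist.
```

Hash routing spreads load evenly, but it makes cross-shard queries and re-sharding (changing the shard count) harder; consistent hashing is the usual refinement for the latter.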

read more

Cloud Expo New York: How to Capitalize on the Cloud and Protect Your Data

As commercial enterprises and the federal sector begin to embrace the public cloud, large organizations are facing a roadblock when it comes to compliance with various data privacy laws and industry regulations. Whether your company falls under E.U. data protection laws, HIPAA PHI guidelines, the U.S.’s FedRAMP requirements, or other regulatory guidelines, the public cloud is not out of reach.
In his session at the 10th International Cloud Expo, Mike Morrissey, VP, Research & Development at PerspecSys, will introduce some of the key privacy, residency, and security areas to consider when adopting public cloud, and best practices for securing the cloud for mission-critical applications without disrupting user functionality.
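One common pattern for the residency and privacy concerns described above is tokenization: sensitive field values are replaced with surrogate tokens before a record leaves the premises, and only the local system can map tokens back. The sketch below is a minimal, generic illustration of that technique, not PerspecSys's actual gateway:

```python
import secrets

class Tokenizer:
    """Swap sensitive values for random tokens; the token-to-value
    vault stays on-premises and never reaches the cloud."""

    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

t = Tokenizer()
# Only the tokenized record is sent to the public cloud; the provider
# never sees the real SSN, yet the application can still store and
# retrieve the record normally.
record = {"name": "Alice", "ssn": t.tokenize("078-05-1120")}
```

Because the cloud copy contains no sensitive plaintext, data-residency rules that restrict where the real values may live can be satisfied while the application itself runs in the public cloud.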

read more

Cloudera KK to Help Spur Hadoop Adoption in APAC Region

Cloudera, the leading provider of Apache Hadoop-based data management software, services and training, today announced that it has established a Japanese subsidiary, Cloudera KK, and an office in Japan. Cloudera’s formal presence extends availability of its products and support offerings to Japanese enterprises that have deployed or are seeking to deploy Apache Hadoop and related technologies in the Hadoop stack to unlock business insights from their Big Data.

read more

Joyent Cloud Europe Unveiled in Amsterdam

Joyent Cloud Europe was unveiled this week in Amsterdam, an extension of the US-based Joyent Cloud that will provide high-performance, real-time Infrastructure-as-a-Service in Europe. “Many of our Joyent Cloud customers have experienced significant growth based on the superior user experience and performance we deliver on Joyent Cloud,” said Steve Tuck, EVP and GM of Joyent Cloud, Joyent’s public cloud line of business.

read more

Will Cloud Become the De Facto Standard for Computing?

“The recent TOSCA initiative has made interoperability for cloud computing closer than ever,” observed Andrew Hillier, co-founder and CTO of CiRBA, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. “However, until players like Amazon and Google join in,” Hillier continued, “it will be difficult for organizations to move from one cloud to the other without risks to their data and infrastructure.”
Cloud Computing Journal: Agree or disagree? – “While the IT savings aspect is compelling, the strongest benefit of cloud computing is how it enhances business agility.”
Andrew Hillier: Although savings and agility are both compelling benefits, it’s usually agility that’s realized first. This isn’t because it is a higher priority, but because it occurs earlier in the cloud adoption process. The push toward standardization and self-service can rapidly increase flexibility and decrease provisioning time, but can actually work against efficiency (much to the surprise of many cloud adopters). The resulting environments are difficult to manage, and many organizations end up with higher spend (for external clouds) or much lower density (internal clouds) than they originally envisioned. Fortunately, by adopting more sophisticated methods of planning and controlling these environments, workload placements and resource allocations can be safely optimized, eliminating over-provisioning once and for all and turning the cloud adoption process into the “win-win” that was originally targeted.

read more

Do Object Storage Plays Displace File Systems or Are They Absorbed?

NAB kept me away from all the interesting online discussions last week. It’s too late to respond to @JoinToigo’s tweet (in Dutch we’d call this “figs after Easter”: arriving too late to be of much use), but I thought I’d share my thoughts in a bit more than 140 characters.
The short answer is no … but a better answer is very much *yes*.
The first file systems were not designed with petabytes of data in mind. I don’t know what the exact projections were back then, but gigabytes must have sounded pretty sci-fi; bytes and kilobytes were far more common. We didn’t expect that we’d soon all be creating tens, if not hundreds, of multi-megabyte files per day.
File systems have of course evolved a lot and some have become so popular you could actually say they have a fan base (I’d need to do research on ZFS fan clubs). It is clear that the file system has played a very important role in the evolution of the computer industry. In my list of features that helped to make computers a commodity, the file system would probably be in the top three (with the windows-style GUI and the mouse). The file system enables the use of directories, which have been the most important tool to keep our data organized.
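The contrast between the two models can be made concrete with a toy in-memory store. This is a sketch under the assumption that an object store is a flat key space with per-object metadata, in the style of S3-like APIs, where directory hierarchy is only a naming convention:

```python
# A file system resolves a path one directory at a time; an object
# store is a flat namespace where the "path" is an opaque key plus
# metadata attached to each object.
object_store = {}  # key -> (data bytes, metadata dict)

def put(key: str, data: bytes, **metadata):
    object_store[key] = (data, metadata)

def get(key: str) -> bytes:
    return object_store[key][0]

# "Directories" exist only inside the key string:
put("photos/2012/04/nab.jpg", b"...", content_type="image/jpeg")
put("photos/2012/04/booth.jpg", b"...", content_type="image/jpeg")

# Listing a "directory" is a prefix scan over the flat key space,
# not a walk of a directory tree:
april = [k for k in object_store if k.startswith("photos/2012/04/")]
```

The flat namespace is what lets object stores scale to billions of objects: there is no tree to traverse or lock, only a key lookup, which is also why a file-system-style view can be layered on top of it rather than the other way around.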

read more

Information Delivery 2.0 – Reference Architecture

With new sources of data flowing into the enterprise, it is time to look at the shortcomings of Information Delivery 1.0 as practiced in current enterprises.
Disparate data sources: most enterprises have accumulated multiple database platforms, and even within a single platform, multiple databases, for various reasons. Enterprises expend a great deal of pain and effort on ETL to keep this data synchronized.
Enterprises are slowly incorporating big data and unstructured data into their information delivery scope, but lack clear means to integrate them.
Rich media content (audio, video, music, and other binary content) is finding its place, but the real context of that content is identified more by the metadata than by the content itself.
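The metadata point can be sketched concretely: the binary payload of rich media is opaque to search, so applications locate such content by querying a metadata index. The file names and fields below are hypothetical, chosen only to illustrate the idea:

```python
# Hypothetical metadata index: the media blobs themselves cannot be
# searched by value, so all lookup goes through their metadata.
media = {
    "intro.mp4":   {"type": "video", "speaker": "A. Smith", "topic": "cloud"},
    "keynote.mp3": {"type": "audio", "speaker": "B. Jones", "topic": "big data"},
}

def find(**criteria):
    """Return keys of media items whose metadata matches every criterion."""
    return [key for key, meta in media.items()
            if all(meta.get(field) == value for field, value in criteria.items())]

find(type="video")      # -> ["intro.mp4"]
find(topic="big data")  # -> ["keynote.mp3"]
```

Integrating rich media into an information-delivery architecture therefore amounts largely to capturing and standardizing this metadata at ingest time, since that is the only handle the rest of the system has on the content.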

read more