ClearStory Data Nails $9 Million A Round

The ballyhooed “Big Data for the masses” start-up ClearStory Data, which made a bit of a splash back in March when it got seed funding from Andreessen Horowitz, Google Ventures and a bunch of industry luminaries, has closed a $9 million A round led by chi-chi Silicon Valley VC firm Kleiner Perkins, with Andreessen Horowitz and Google Ventures pitching in.
Former Twitter VP of engineering Mike Abbott, now a managing partner at Kleiner, will join the board.
The start-up is promising to deliver a newfangled Big Data analytics solution for the non-techie business user, like Mad Men’s mythical marketing maven Don Draper, that integrates private and public data sources in a sleek interactive engine. The platform is supposed to simplify access to relational databases, Hadoop, web and social application interfaces, and third-party data at scale to search out trends and patterns.

read more

GigaSpaces Cloudify Partners with OpsCode for Chef Integration

GigaSpaces Technologies, with its new release of the open source Cloudify product, has partnered with OpsCode for a dedicated Chef integration that caters to diverse application stacks and systems.

“The concept of DevOps and recipes can go well beyond setup, to actually manage the entire lifecycle of your applications—from setup, to monitoring, through maintaining high availability, and auto-scaling when required.  This is where Cloudify and Chef come together,” says Bryan Hale, Director of Business Development for OpsCode. “By enabling users to leverage the power and variety of Chef recipes and cookbooks to deploy services, Cloudify supports comprehensive application level orchestration on any cloud.”

In addition to the integration with Chef, this new release also includes the following features:

  • Complete application-level orchestration, allowing automated provisioning, deployment, management and scaling of complex multi-tier apps to any cloud environment.
  • Built-in, ready-to-use recipes for common big data components, such as Hadoop, Cassandra and MongoDB.
  • Support for non-virtualized environments (a.k.a. Bring Your Own Node), allowing you to treat an arbitrary set of servers as your “Cloud” and have Cloudify deploy and manage applications on them.
  • A comprehensive REST API for easy integration with third-party tooling and programmatic access (see the sketch after this list).
  • Support for all the common cloud infrastructures, including OpenStack, HP Cloud, Rackspace, Windows Azure, Apache CloudStack, Amazon AWS and vCloud.
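
As a rough illustration of the kind of third-party integration a REST API like this enables, here is a minimal Python sketch. The manager host, port, endpoint paths and payload fields are hypothetical placeholders for illustration, not the documented Cloudify API; check the project's REST reference for the real routes.

```python
import requests

# Hypothetical Cloudify manager endpoint; host, port and paths are
# placeholders, not taken from the Cloudify documentation.
BASE_URL = "http://cloudify-manager.example.com:8100/api"

def list_applications():
    """Fetch the applications currently managed (illustrative path)."""
    resp = requests.get(BASE_URL + "/applications")
    resp.raise_for_status()
    return resp.json()

def deploy_application(name, recipe_url):
    """Ask the manager to deploy an application from a recipe (illustrative payload)."""
    payload = {"name": name, "recipe": recipe_url}
    resp = requests.post(BASE_URL + "/applications", json=payload)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(list_applications())
```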

In addition, Cloudify now also simplifies the complexities involved in deploying big data applications to the cloud. The massive computing and storage resources needed to support big data deployments make cloud environments, public and private, an ideal fit. But managing big data applications on the cloud is no easy feat: these systems often include other services such as relational and non-relational databases, stream processing tools, web front ends and more, where each framework comes with its own management, installation, configuration, and scaling mechanisms. With its new built-in recipes, Cloudify provides consistent management and cloud portability for popular big data tools, dramatically reducing the operational and infrastructure costs involved in running these systems.

“We’re seeing a growing market trend for the need to migrate applications – not just in one-off processes anymore – but on a much larger scale, by enterprises, managed service providers, and ISVs alike, who are looking to take advantage of the cloud promise—while until now, only about 5% have actually been able to do so,” says Uri Cohen, Vice President of Product Management at GigaSpaces. “The beauty of Cloudify and its recipe-based model is that it enables you to simply and smoothly take both new and existing applications to the cloud by the tens and hundreds through Cloudify’s built-in recipes and the new integration with OpsCode’s Chef, in very short time frames.”

You can fork the Cloudify code on GitHub or download the Cloudify GA release.

Compuware and REALTECH Partner

Compuware Corporation on Wednesday announced that REALTECH AG is incorporating Compuware APM – dynaTrace Data Center Real User Monitoring (DC RUM) – into the latest release of its IT service management software, theGuard! Service Management Center. This will enable REALTECH to enhance application performance management, driven by end-user experience, across SAP, web, non-web and cloud applications.

“We are excited to partner with Compuware to deliver world-class APM capabilities to our current and future customers of theGuard!,” said Cornelia Pohl-Springer, Head of Product Management for theGuard! at REALTECH. “Our combined solution enables customers to have full visibility and control of their IT environments with high operational efficiency while ensuring customer satisfaction. Customers are able to quickly prioritize problems by business impact and get a complete view of all business processes from an end-user perspective.”

Compuware APM is the leader in application performance management technology for SAP and non-SAP environments. Compuware’s DC RUM solution is based on highly scalable agentless technology and offers the broadest application and technology platform support in the market. REALTECH’s theGuard! is Europe’s leading software for enterprise-wide IT service management. From managing applications and any type of IT-enabled business process to monitoring networks and systems, theGuard! supports the most important aspects of IT management today.

read more

‘Real Me’ as-a-Service: Cloud Privacy by Design for E-Health

Naturally, one of the critical areas specified in the Canadian E-Health Cloud strategy document is the risk related to data privacy.
Specifically, in section 8 (pages 42 through 49), it describes the comprehensive standards, audit and certification frameworks that will be required to protect this next major phase of Cloud adoption.
CHI points to the number one risk issue cited by CIOs – fears of inadequate data privacy protections – and describes the various components of what is required to address these risks, including due diligence procedures and state-of-the-art privacy controls.

read more

Drastic Measures Not Needed with DRaaS

Perhaps the only thing worse than a disaster happening is seeing it coming and knowing nothing can be done to stop it. Businesses along the northeastern seaboard had several days of warning before Hurricane Sandy struck – certainly not enough time to implement a disaster recovery plan from scratch. Even more painful is the knowledge that some disaster recovery plans would not have been enough: physical backup systems in separate geographic areas may have suffered the same losses as the home site due to the sheer size of the storm.
Most disasters come with no warning at all. Explosions, power outages, and simple equipment failure can cause the same damage. Operations go down, customers suffer, and revenues tank. Once business recovers, the harder work begins: wooing back customers and convincing new ones of the company’s reliability.

read more

Amazon Web Services fills out its big data cloud platform

Tony Baer, Principal Analyst, IT Enterprise Solutions, Ovum

Announcements of new data platforms were highlighted at Amazon Web Services’ (AWS) first-ever “re:Invent” user conference this week. Among the headlines, AWS announced Amazon Redshift, a managed, petabyte-scale data warehouse service that includes technology components licensed from ParAccel. Amazon Redshift will deliver the power of massively parallel columnar databases at a commodity price for data warehousing customers.
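
For a sense of what provisioning such a warehouse looks like programmatically, here is a minimal Python sketch using the boto3 AWS SDK (a later tool than was available at announcement time, used here purely for illustration); the cluster identifier, node type, size and credentials are placeholder assumptions.

```python
import boto3

# Credentials and region are assumed to come from the standard AWS
# credential chain (environment variables, config file, instance role).
redshift = boto3.client("redshift", region_name="us-east-1")

# Provision a small multi-node cluster; all names and sizes are illustrative.
redshift.create_cluster(
    ClusterIdentifier="demo-warehouse",
    ClusterType="multi-node",
    NodeType="dc2.large",
    NumberOfNodes=2,
    MasterUsername="admin",
    MasterUserPassword="ChangeMe123!",  # never hard-code real credentials
)

# Check cluster status; it moves to "available" once provisioning completes.
desc = redshift.describe_clusters(ClusterIdentifier="demo-warehouse")
print(desc["Clusters"][0]["ClusterStatus"])
```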

AWS also announced AWS Data Pipeline, a utility that simplifies the orchestration of data flows between AWS-based and on-premises data sources and AWS-based processing services.
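
To illustrate the orchestration model, the sketch below creates and activates a bare pipeline, again with boto3 for illustration; the pipeline name, unique token and the single “Default” object are placeholder assumptions rather than a working data flow, which would add data nodes and activities.

```python
import boto3

dp = boto3.client("datapipeline", region_name="us-east-1")

# Create a pipeline shell; uniqueId guards against duplicate creation on retry.
pipeline = dp.create_pipeline(name="demo-flow", uniqueId="demo-flow-token")
pipeline_id = pipeline["pipelineId"]

# A real definition would include data nodes and activities; this minimal
# "Default" object only sets the schedule type (illustrative).
dp.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        {
            "id": "Default",
            "name": "Default",
            "fields": [{"key": "scheduleType", "stringValue": "ondemand"}],
        }
    ],
)

# Activation tells the service to start scheduling the pipeline's work.
dp.activate_pipeline(pipelineId=pipeline_id)
```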

These new platforms and services will help fill out and add connective tissue to the various AWS platforms and services. Now that Amazon is automating data flows, it should take the next step and add integrated SQL views to NoSQL data stores, something that other providers, including Microsoft, Cloudera, and Hortonworks, are already pursuing.

AWS’s evolving …

2013 Outlook: A CIO’s Perspective

Journey to the Cloud recently sat down with GreenPages Chief Information and Technology Officer Kevin Hall to talk about the outlook for 2013.

JTC: As CIO at GreenPages what are your major priorities heading into 2013?

KH: As CIO, my major priorities are to continue to rationalize and prioritize within the organization. By rationalize I mean looking at what it is we think the business needs vs. what it is we have, and by prioritize I mean looking at where there are differences between what we have and what we need and then building and operationalizing to get what we need into production.  We are working through that process right now. More specifically, we’re actively trying to do all of this in a way that will simultaneously help the business have more velocity and, as a percentage of revenue, cost less. We’re trying to do more with less, faster.

JTC: What do you think will be some of the biggest IT challenges CIOs will face in 2013?

KH:  I think number one is staying relevant with their business. A huge challenge is being able to understand what it is the business actually needs.  Another big challenge is accepting the fact of life that the business has to actively participate with IT in building out IT. In other words, we have to accept the fact that our business users are oftentimes going to know about technologies that we don’t or are going to be asking questions that we don’t have the answers for. All parties will have to work together to figure it out.

JTC: Any predictions for how the IT landscape will look in 2013 and beyond?

KH: Overall, I think there is a very positive outlook for IT as we move into the future. Whether or not the economy turns around (and I believe it is going to), all businesses are seeking to leverage technology. Based on our conversations with our customers, no one has made any statements to say “hey, we’ve got it all figured out, there is nothing left to do.” Everyone understands that more can be done and that we aren’t at the end of driving business value from IT. More specifically, one thing I would have people keep an eye on is the software-defined data center. Important companies like VMware, EMC, and Cisco, amongst others, are rapidly moving to a place that reduces datacenters to icons, so that just as easily as we can spin up virtual machines now, we will be able to spin up datacenters in the future. This will allow us to support high velocity and agility.

JTC: Anything that surprised you about the technology landscape in 2012?

KH: Given a great deal of confusion in our economy, I think I was surprised by how positive the end of the year turned out. The thought seems to be that it must be easy for anyone seeking to hire great people right now due to a high rate of unemployment, but in IT people who get it technically and from a business perspective are working, and they are highly valued by their organizations. Another thing I was surprised about is the determination businesses have to go around, or not use, IT if IT is not being responsive. Now we’re in an age where end users have more choices and a reasonably astute business person can acquire an “as a Service” technology quickly, even though it may be less than fully optimized and there may be issues (security comes to mind). Inside a company, employees may prefer to work with IT, but if IT moves too slowly or appears to just say “no,” people will figure out how to get it done without them.

JTC: What are some of the biggest misconception organizations have about the cloud heading into 2013?

KH: I think a major misconception about cloud is the amount these technologies are actually being used in one’s organization. It is rare to find a CIO (myself included until recently) who has evaluated just how much cloud technology is truly being used in their business. Are they aware of every single app being used? How about every “as a Service” that is being procured in some way without IT involvement? When they think of their platform, are they including all of the traditional IT assets as well as all the “aaS” and cloud assets at their company? It goes back to how we as IT professionals can’t be meaningful when we are not even sure of exactly what is going on within the walls of our own company.

JTC: Any recommendations for IT Decision makers who are trying to decide where to allocate their 2013 budgets?

KH: I think IT Decision Makers need to be working with colleagues throughout the company to see what they need to get done and then build out budgets accordingly so they truly support the goals of the business. They need to be prepared to be agile so that unexpected, yet important, business decisions that pop up throughout the year can be supported. Furthermore, they need to be prepared from a velocity standpoint so that when a decision is made, the IT department can go from thought to action very quickly.

OpenNebula in Amazon EC2: A Private Cloud Within a Public Cloud

C12G Labs, the company behind the OpenNebula project, has just announced the availability of the OpenNebula Sandbox for Amazon EC2. OpenNebula is a widely deployed open-source management solution for enterprise data center virtualization and private cloud computing that implements the Amazon EC2 interface. With the new Sandbox you can deploy, in a single click, an AWS-compatible OpenNebula cloud nested within Amazon EC2, where you can launch virtual machines inside an Amazon instance.

The OpenNebula Sandbox is a CentOS 6.3 virtual machine appliance with a pre-configured, automated installation of OpenNebula 3.8, a virtualization node using emulation (QEMU) ready to execute virtual machines, and prepared images to offer a feature-rich cloud experience. Users are able to log into the OpenNebula cloud, monitor the managed resources, and launch virtual machine instances without the hassle of configuring a physical infrastructure.
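
Once the Sandbox is up, virtual machines can also be driven through OpenNebula’s XML-RPC API. Below is a minimal Python sketch against that API; the endpoint address, credentials and template values are placeholder assumptions, and the exact method signature should be checked against the OpenNebula 3.8 documentation.

```python
import xmlrpc.client

# Placeholder endpoint: the Sandbox instance's public address with
# OpenNebula's default XML-RPC port (2633). The session string is
# "user:password"; these credentials are illustrative placeholders.
ENDPOINT = "http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:2633/RPC2"
SESSION = "oneadmin:opennebula"

server = xmlrpc.client.ServerProxy(ENDPOINT)

# Minimal VM template using the Sandbox's QEMU emulation (illustrative values).
template = 'NAME = "test-vm"\nCPU = 1\nMEMORY = 128'

# one.vm.allocate is assumed to return [success_flag, vm_id_or_error, errcode].
ok, result, _ = server.one.vm.allocate(SESSION, template)
if ok:
    print("Launched VM with id", result)
else:
    print("Allocation failed:", result)
```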

read more

Security’s FUD Factor

Had a short but interesting Twitter exchange with @securityincite, @Gillis57 and @essobi (Mike Rothman, Gillis Jones and not sure (sorry!!), respectively) about using Fear, Uncertainty and Doubt when talking IT security services. @Gillis57 initially asked, ‘Question: We discuss FUD constantly (and I agree that it’s too prominent) But isn’t security inherently built upon fear?’ I sent an ’09 Rothman article (@securityincite said it was ‘old school’ but it still has some great comments) about that very topic. Soon, @essobi chimed in with, ‘Our foundation shouldn’t be fear, it should be education. 😀’ @Gillis57 responded, ‘So, look. I agree wholeheartedly, but why do people need to be educated?’ @essobi answered, ‘imo? Bad programming/exploitable logic processes. we need to raise the bar or lower expectations.’ @Gillis57 added, ‘I really don’t think we need to keep selling fear, but denying that we are a fear based industry isn’t helping.’ @securityincite wizdom’d with, ‘Fear is a tactic like anything else. Depends in situation, context, catalyst. And use sparingly.’ And I conceded that, ‘splitting hairs, but I try to talk about risk rather than fear – what’s the risk if… which often generates fear.’

read more