Does Hadoop make the storage and operation of arbitrary enterprise data both scalable and cost-effective?
In his session at the 11th International Cloud Expo, Ari Zilka, Chief Products Officer at Hortonworks, will answer frequently asked questions like, “What does it take to move Hadoop from one or two use cases in the average organization to an enterprise-wide data management platform?” and “What can a Hadoop user do to make their jobs easier and more efficient?” He will also project a roadmap two to five years out, explaining how the community can help Hadoop find its rightful place at the core of big data storage and management.
athenahealth Acquires Healthcare Data Services
athenahealth, Inc., a provider of cloud-based practice management, electronic health record (EHR), and care coordination services, today announced that it has signed a definitive agreement to acquire Healthcare Data Services LLC (HDS), a web-based solutions provider and expert in health care data analysis and population health management for payers and providers. It is anticipated that the transaction will close in October 2012.
This acquisition is expected to expand athenahealth’s cloud-based services portfolio to include high-value, population-based cost and quality data analysis and reporting capabilities. By expanding its services in this way, athenahealth will be strengthening its ability to support health care organizations to navigate the growing number of risk-based payment models, and align care coordination with patient population needs.
“Value-based payment models are fundamentally changing the way patient care is coordinated, delivered, and reimbursed,” said Jonathan Bush, CEO and chairman of athenahealth. “With HDS, we can help health care organizations to thrive in the face of change—to drive down costs through smart, high quality care coordination and to understand the totality of services being provided. This acquisition supports our existing efforts to create an information backbone that makes health care work as it should.”
New payment models offered by the U.S. government and commercial health plans aim to create a reimbursement system that links care reimbursement to the quality of care delivered and, ultimately, to reduce overall health care expenses for populations of patients. These risk-based contracts come in a variety of models, including pay-for-performance (P4P) incentives, bundled payments, shared savings, and global capitation; each requires improved insight into patient population data so that health care organizations can gauge and manage patient needs while simultaneously tracking and adjusting their own performance against risk-based reimbursement contracts.
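A toy illustration (all numbers invented, not drawn from any actual contract) of one of the risk-based models named above, shared savings: the provider keeps an agreed share of any spending that comes in below a benchmark for its patient population.

```python
# Shared-savings arithmetic with hypothetical figures.
benchmark_spend = 10_000_000.0  # agreed spending target for the population (assumed)
actual_spend = 9_200_000.0      # what care actually cost (assumed)
shared_rate = 0.5               # provider's contractual share of savings (assumed)

# Savings only accrue when actual spending beats the benchmark.
savings = max(0.0, benchmark_spend - actual_spend)
provider_bonus = savings * shared_rate
print(provider_bonus)  # 400000.0
```

Tracking this requires exactly the population-level cost data the article describes: without it, a provider cannot know where it stands against the benchmark until settlement.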
The addition of HDS would bring many advantages and synergies that would expand athenahealth’s recognition as a single-source provider of best-in-class workflow and data insight solutions to support all payment models:
- Proven Innovation Excellence – HDS has led the market for population health tools in Massachusetts, which in turn has led the country in transforming toward more of an accountable care health care marketplace. Together, the combined organization would be able to help a broader set of health care organizations to drive quality and improvement to the way care is coordinated and reimbursed.
- Key and Differentiating Services – The proven services and value of HDS would be able to leverage athenahealth’s investment, sales force capacity, and complementary cloud-based services. HDS complements athenahealth’s commitment to support risk-based payment models, as well as athenahealth’s mission to be the best in the world at getting medical caregivers paid for doing the right thing.
- Advancement to Health Information Open Exchange – With HDS, athenahealth could better support health care organizations in their access and use of data as a means to drive clinical and financial improvement. HDS complements athenahealth’s clinical-cycle, patient-cycle, and care coordination workflows, as well as athenahealth’s efforts to create a health care information backbone to make health care work as it should.
- Accelerated Growth within the Large-to-Enterprise Market – HDS brings valuable relationships within the medium-to-large health systems market. Greater access to these organizations would accelerate athenahealth’s delivery of its cloud-based services and in turn contribute to growing revenue streams.
“By combining HDS with athenahealth, we expect to create a comprehensive, easy-to-access platform for health care organizations to take on and succeed in the face of payment reform and the shift to accountable care,” said Jonathan Porter, co-founder of HDS. “The massive changes going on in health care reimbursement create challenges, but more so create opportunities. Together with athenahealth, we would be able to bridge the gap between providers, payers, and patients by providing ready-to-use population health management capabilities as part of EHR workflows that support restructured payment models and ensure the delivery of appropriate and necessary care to patients.”
Keep the Old, In with the New
For decades, the infrastructure needed to keep your public-facing websites online has had a relatively simple design. After all, it was all contained in one or a few datacenters under your control, or under the control of a contractor who did your bidding where network architecture was concerned.
Requests came in, DNS resolved them, the correct server or pool of servers handled them. There was security, and sometimes a few other steps in between, but that still didn’t change that it was a pretty simplistic architecture. Even after virtualization took hold in the datacenter, the servers were still the same, just where they sat became more fluid.
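That traditional flow can be sketched in a few lines: resolve a name to the addresses behind it, then rotate requests across the pool round-robin. This is only an illustration of the pattern described above (the hostname is a placeholder, and real load balancers are far more sophisticated).

```python
import itertools
import socket

def resolve_pool(hostname: str):
    """Return the distinct IPv4 addresses a name resolves to."""
    infos = socket.getaddrinfo(hostname, 80, socket.AF_INET, socket.SOCK_STREAM)
    return sorted({info[4][0] for info in infos})

# Round-robin over whatever the name resolves to, as classic DNS-based
# load distribution did. "localhost" stands in for a real site name.
pool = itertools.cycle(resolve_pool("localhost"))
next_server = next(pool)  # the address this request would be sent to
```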
Cloud Encryption and Healthcare “as a Service” Solutions
Healthcare “as a Service” providers are coming up strong in the market right now. It’s a growing segment, attracting a lot of interest from businesses and investors, but as expected, cloud security (more specifically, cloud encryption and HIPAA requirements) is critical for customers considering a healthcare application as a service. In this article I’ll review some of these aspects, as well as data security solutions such as split-key management.
Our recent conversations with Healthcare SaaS providers have highlighted several common aspects, across many solutions.
Obviously, they all need to be “cloudy” (else they would not be SaaS providers). Their customers will access them through the web, and their business model includes a pay-as-you-go component. They need their underlying infrastructure to support this model.
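To make the split-key idea mentioned above concrete, here is a minimal sketch using XOR secret sharing: the data-encryption key is split into two random shares, so neither the cloud provider nor the key service alone can recover it. This illustrates the general concept only, not any vendor’s actual scheme.

```python
import os

def split_key(key: bytes):
    """Split a key into two shares; both are needed to reconstruct it."""
    share1 = os.urandom(len(key))                       # random mask held by one party
    share2 = bytes(a ^ b for a, b in zip(key, share1))  # key XOR mask, held by the other
    return share1, share2

def join_key(share1: bytes, share2: bytes) -> bytes:
    """Recombine the two shares into the original key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

key = os.urandom(32)  # a 256-bit data-encryption key
s1, s2 = split_key(key)
assert join_key(s1, s2) == key
```

Either share on its own is indistinguishable from random bytes, which is why splitting custody across parties limits what any single breach exposes.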
It’s DevOps, or It’s the Wrong Conversation
As I was watching this thread develop, with various comments from people that live and breathe IT, one thing kept coming to mind. IT people often try and justify new technology with technology reasoning. It’s analogous to answering a question with another question.
Far too often, because IT has almost always been looked at as a cost center and measured for ROI based on cost reduction or productivity improvements, technologists feel the need to drive the justification for a new project based on cost.
A More Practical View of Cloud Brokers
#cloud The conventional view of cloud brokers misses the need to enforce policies and ensure compliance
During a dinner at VMworld organized by Lilac Schoenbeck of BMC, we had the chance to chat about cloud and related issues with Kia Behnia, CTO at BMC. Discussion turned, naturally I think, to process. That could be because BMC is heavily invested in automating and orchestrating processes. Despite the nomenclature (business process management), for IT this is a focus on operational process automation, though eventually IT will have to raise the bar and focus on the more businessy aspects of IT and operations.
Alex Williams postulated the decreasing need for IT in an increasingly cloudy world. On the surface this generally seems to be an accurate observation. After all, when business users can provision applications a la SaaS to serve their needs do you really need IT? Even in cases where you’re deploying a fairly simple web site, the process has become so abstracted as to comprise the push of a button, dragging some components after specifying a template, and voila! Web site deployed, no IT necessary.
While from a technical difficulty perspective this may be true (and if we say it is, it is for only the smallest of organizations) there are many responsibilities of IT that are simply overlooked and, as we all know, underappreciated for what they provide, not the least of which is being able to understand the technical implications of regulations and requirements like HIPAA, PCI-DSS, and SOX – all of which have some technical aspect to them and need to be enforced, well, with technology.
See, choosing a cloud deployment environment is not just about “will this workload run in cloud X”. It’s far more complex than that, with many more variables that are often hidden from the end-user, a.k.a. the business peoples. Yes, cost is important. Yes, performance is important. And these are characteristics we may be able to gather with a cloud broker. But what we can’t know is whether or not a particular cloud will be able to enforce other policies – those handed down by governments around the globe and those put into writing by the organization itself.
Imagine the horror of a CxO upon discovering an errant employee with a credit card has just violated a regulation that will result in Severe Financial Penalties or worse – jail. These are serious issues that conventional views of cloud brokers simply do not take into account. It’s one thing to violate an organizational policy regarding e-mailing confidential data to your Gmail account, it’s quite another to violate some of the government regulations that govern not only data at rest but in flight.
A PRACTICAL VIEW of CLOUD BROKERS
Thus, it seems a more practical view of cloud brokers is necessary; a view that enables such solutions to consider not only performance and price, but also the ability to adhere to and enforce corporate and regulatory policies. Such a data center-hosted cloud broker would be able to take these very important factors into consideration when making decisions regarding the optimal deployment environment for a given application. That may be a public cloud, it may be a private cloud, it may be a dynamic data center. The resulting decision (and options) are not nearly as important as the ability for IT to ensure that the technical aspects of policies are included in the decision-making process.
And it must be IT that codifies those requirements into a policy that can be leveraged by the broker and ultimately the end-user to help make deployment decisions. Business users, when faced with requirements for web application firewalls in PCI-DSS, for example, or ensuring a default “deny all” policy on firewalls and routers, are unlikely to be able to evaluate public cloud offerings for their ability to meet such requirements. That’s the role of IT, and even wearing rainbow-colored cloud glasses can’t eliminate the very real and important role IT has to play here.
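A policy-aware broker of the kind described here could be sketched like this: candidate environments are filtered by the regulatory controls they can enforce before price enters the comparison. The data model, environment names, and prices are all invented for illustration, not any vendor’s API.

```python
from dataclasses import dataclass, field

@dataclass
class Environment:
    name: str
    hourly_cost: float
    controls: set = field(default_factory=set)  # controls it can enforce, e.g. {"PCI-DSS"}

def choose_deployment(candidates, required_controls):
    """Return the cheapest environment that enforces every required control."""
    compliant = [e for e in candidates if required_controls <= e.controls]
    if not compliant:
        return None  # no option qualifies; keep the workload in a controlled local environment
    return min(compliant, key=lambda e: e.hourly_cost)

envs = [
    Environment("public-cloud-x", 0.08, {"PCI-DSS"}),
    Environment("public-cloud-y", 0.05, set()),
    Environment("private-cloud", 0.12, {"PCI-DSS", "HIPAA", "SOX"}),
]
print(choose_deployment(envs, {"PCI-DSS", "HIPAA"}).name)  # private-cloud
```

The point of the sketch is the ordering: compliance is a hard filter supplied by IT’s codified policy, while cost and performance only rank the environments that survive it.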
The role of IT may be changing, transforming, but it is in no way being eliminated or decreasing in importance. In fact, given the nature of today’s environments and threat landscape, the importance of IT in helping to determine deployment locations that at a minimum meet organizational and regulatory requirements is paramount to enabling business users to have more control over their own destiny, as it were.
So while cloud brokers currently appear to be external services, often provided by SIs with a vested interest in cloud migration and the services they bring to the table, ultimately these beasts will become enterprise-deployed services capable of making policy-based decisions that include the technical details and requirements of application deployment along with the more businessy details such as costs.
The role of IT will never really be eliminated. It will morph, it will transform, it will expand and contract over time. But business and operational regulations cannot be encapsulated into policies without IT. And for those applications that cannot be deployed into public environments without violating those policies, there needs to be a controlled, local environment into which they can be deployed.
Lori MacVittie is a Senior Technical Marketing Manager, responsible for education and evangelism across F5’s entire product suite. Prior to joining F5, MacVittie was an award-winning technology editor at Network Computing Magazine. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University. She is the author of XAML in a Nutshell and a co-author of The Cloud Security Rules.
How To Define Cloud Education?
You could say I believe in cloud computing. You could say the same for anyone reading this article. And you could say the same for anyone attending or exhibiting at the upcoming Cloud Expo.
We’ve now moved well beyond the basic question, “what is cloud computing?”
But I’m curious: what is a cloud computing education? How should college students, mid-career technologists, and senior executives be informed, trained, and certified? Should there be such a thing as a cloud computing major and post-grad cloud computing studies?
These questions spring to mind as I prepare to deliver a short seminar on the topic at the Tau Institute, a small research organization I’ve co-founded in two locations: my Illinois hometown and Manila, Philippines.
There is certainly a global aspect to cloud computing, now that we’re in an era where technological advancement happens simultaneously (if unevenly) throughout the world.
But more important is the idea of a universal foundation to cloud. What should people know, and when should they know it?
I’m participating in a new podcast series called Run!, which tackles the subject of “what we do with technology and what it’s doing to us.” We recorded an episode last night in which a major cloud company CTO talked of dealing with three dozen platforms in his job.
Is there such a thing as a basic, standard cloud education? If so, what languages, frameworks, and general understanding should it encompass? Please let me know your thoughts, as I work to build such a thing – a cloud computing education/certification program – within the Tau Institute.
Cloud enthusiasm continues to rise – but are security fears abating?
A TechSoup Global survey has revealed that most non-governmental organisations (NGOs) are keen to move to the cloud but need to be better informed about its strengths and weaknesses.
TechSoup’s 2012 Global Cloud Computing Survey interviewed over 10,000 NGOs, non-profits and charities and found that many of these organisations were unaware they were already using cloud services.
Sound familiar? A study from Citrix last month showed that 95% of 1,000 Americans surveyed who thought they weren’t using the cloud actually were, so the knowledge gap about what the cloud encompasses extends to NGOs as well as consumers.
Despite this, TechSoup Global – itself a non-profit – noted in its response how many NGOs were utilising the cloud.
The key takeaways from the report are:
- 90% of those surveyed are already using cloud computing, with 53% hoping to move “a significant portion” of their portfolio into the …
Monitoring in the Cloud
“To manage a private cloud infrastructure, we have focused our efforts on the challenges that face our customers when deploying a private cloud for their collaboration environment.”
How did you come to the private cloud approach?
Bystran: As you may know, we have had over 600 customers for more than 15 years now. Most of our customers are large companies with multiple operating units and an international footprint. For example, in Europe we provide our services to many of the big banks and major industries such as Total, Technip, and Bayer.
These customers have high expectations, so we continuously improve our product with their feedback and future concerns in mind. Because of these deep relationships our product management and development teams often work in a very direct way to fit their needs.
Amazon Starts Spot Market for Reserved Instances
Amazon Web Services has set up an online marketplace where customers can sell their excess EC2 Reserved Instances to other AWS customers.
Reserved Instances cost less because buyers make a one-time payment to reserve compute capacity for a specified term and get a discount on the hourly charges for that instance.
Amazon will charge sellers a 12% fee.
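The trade-off behind that pricing model is a simple break-even calculation: the one-time payment only pays off past a certain number of hours of use, which is exactly why an owner whose project ends early might want to sell the remaining term. All prices below are hypothetical, not actual 2012 AWS rates.

```python
# Back-of-the-envelope On-Demand vs. Reserved comparison (illustrative prices).
on_demand_hourly = 0.10   # $/hour for On-Demand (assumed)
reserved_upfront = 250.0  # one-time Reserved Instance payment (assumed)
reserved_hourly = 0.04    # discounted $/hour for the reserved capacity (assumed)

def total_cost(hours, upfront=0.0, hourly=on_demand_hourly):
    """Total spend for running an instance for a given number of hours."""
    return upfront + hours * hourly

# Break-even: upfront + h * reserved_hourly = h * on_demand_hourly
break_even_hours = reserved_upfront / (on_demand_hourly - reserved_hourly)
print(round(break_even_hours))  # 4167 hours, roughly six months of continuous use
```

Below the break-even point the unused portion of the reservation is sunk cost, which is the gap the new marketplace lets sellers recover.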
Interested parties can browse the site for term lengths and pricing options outside the standard one-year and three-year terms.
AWS suggests owners may want to move instances to a new AWS Region, change to a new instance type or sell capacity for projects that end before the term expires.