Protecting and Preserving Our Digital Lives is a Task We Want to Have Already Done

I once read that a favorite writer of mine, when told by people he met at cocktail parties how much they “wanted to write,” would reply, “No, you want to have written.”

Protecting and preserving our digital lives is much the same — we want to have already taken care of it. We don’t actually want to go through the hassle of doing it.

An article by Rick Broida in PC World sums it up thus:

There are two kinds of people in the world: Those who have lost critical data, and those who will. In other words, if you use technology long enough and neglect to back up your data, you’re guaranteed to have at least one extremely bad day.

The article goes on to outline “How to build a bulletproof cloud backup system without spending a dime.” There’s a lot to do and it all takes effort, but he’s right. Whether you adopt all of his recommendations or only some, it’s a good place to start thinking about the steps we all need to take.

Here’s an idea: Come up with a plan and implement it in pieces until you get to the point where you know you are ready for the digital disaster that is out there waiting for us all.
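As one concrete first piece of such a plan, even a few lines of scripting can create a dated local archive of a critical folder before you ever touch a cloud service. A minimal sketch (the source and destination paths are whatever you choose):

```python
"""Minimal versioned-backup sketch: copy a source folder into a dated
zip archive so at least one extra copy of critical data always exists."""
import shutil
from datetime import datetime
from pathlib import Path

def backup(source: str, dest_root: str) -> Path:
    """Archive `source` as dest_root/backup-YYYYMMDD-HHMMSS.zip."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = Path(dest_root) / f"backup-{stamp}"
    dest.parent.mkdir(parents=True, exist_ok=True)
    # make_archive appends ".zip" and returns the archive's full path
    return Path(shutil.make_archive(str(dest), "zip", source))
```

Run something like this on a schedule (cron, Task Scheduler) and you have one piece done; the article's cloud-based recommendations can layer on top of it.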

 

Global Interoperability Consortium’s Cloud Computing Project


Eric Vollmecke of the Network Centric Operations Industry Consortium reports the proliferation of geospatial information will pose problems for disaster responders and describes a project designed to move critical data more efficiently using an open cloud-based infrastructure.
 
 

WASHINGTON, April 30, 2013 /PRNewswire-USNewswire/ — Managing and
disseminating the rapidly increasing amount of geospatial data will be a
huge challenge for governments and civilians responding to the world’s
next big disaster, Eric Vollmecke of the Network Centric Operations
Industry Consortium (NCOIC) told 350 global leaders at the NATO Network
Enabled Capability conference held in Lisbon, Portugal, April 23-25.

“From an operational perspective, there is an insatiable appetite for
overhead imagery to build situational awareness. Currently, platforms
keep growing to collect and disseminate the necessary information. This
information is not timely in its response, it’s unwieldy in its
deployment and it lacks the flexibility to enable cross-domain
interoperability,” said Vollmecke. “Unless we get our arms around all of
this, the amount of data will be overwhelming and we will miss precious
days trying to get the right information to the right international
stakeholders so they can do their work and not sit waiting on the
sidelines.”

Vollmecke said the use of a cloud computing environment will improve
the ability to quickly share critical information between nations and
non-governmental organizations. He described the Cloud Concept and
Demonstration project that NCOIC is working on for the National
Geospatial-Intelligence Agency (NGA).

The NGA project is a collaborative effort by NCOIC and its
member-companies to show the interoperability and movement of data in an
open cloud-based infrastructure. NGA is providing unclassified data
that supports a scenario depicting the 2010 earthquake in Haiti. The
project builds on a series of successful lab interoperability
demonstrations based on Haiti that NCOIC conducted in 2010.

“In Haiti, we collected a huge amount of data compared to the
Asian tsunami of 2004. Tomorrow the amount of data could be 100-fold
and one organization alone will not be able to manage the inputs,” said
Vollmecke, who is also a major general in the U.S. Air National Guard
and, while on active duty, commanded two airlift wings during the 2010
Haitian crisis. “With the NGA community cloud project, NCOIC is testing a
collaborative, real-time environment that has both suppliers and
consumers of data at different security levels.”

Information technology solutions provider NJVC is serving as team
leader of the NGA project and participants include Boeing, The Aerospace
Corporation and Open Geospatial Consortium. “NCOIC has assembled a team
you would not normally see on a government-led project,” Vollmecke told
the NATO audience. “Using a consortium is the most rapid and effective
way to facilitate the advancement and deployment of technology. The
parties can set aside their traditional roles and aren’t subject to the
contractual and legal walls that typically are put up between government
and contractors. The exchange of information and ideas is more
free-flowing.”

Vollmecke, who is NCOIC program director, reported that Cycle One of
the NGA project is complete and the cloud infrastructure has been
defined and built, with the team establishing standards and processes,
utilizing best practices, and addressing potential problems such as
ownership, bandwidth, latency, availability, access and security.

In Cycle Two, set to begin in May, NCOIC member-companies will test
out the infrastructure. They will function as “actors” — information
consumers and providers like police, firefighters, rescue workers,
medical personnel, etc. — who plug into the clouds and use the
geospatial data to activate unique, sometimes proprietary, applications
that demonstrate end-user capabilities.

“The key is to have a core or resident capability in the cloud that
can be rapidly expanded on demand, when there is an event or disaster,”
said Vollmecke. “This will free up intelligence analysts to work their
problems, while putting geospatial information into the hands of other
users. Cloud technology can improve everyone’s capability and
effectiveness, while reducing cost, time and risk.”
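Vollmecke's "core or resident capability that can be rapidly expanded on demand" can be sketched in a few lines. This is purely illustrative — the worker counts and per-worker throughput below are invented for the sketch, not NCOIC's actual design:

```python
"""Illustrative elastic-capacity sketch: a small always-on 'resident'
pool that surges on demand during an event, then shrinks back."""

RESIDENT_WORKERS = 2       # always-on core capability (assumed)
MAX_SURGE_WORKERS = 50     # hypothetical ceiling during a disaster
REQUESTS_PER_WORKER = 100  # assumed throughput per worker

def workers_needed(pending_requests: int) -> int:
    """Scale worker count to demand, never below the resident core."""
    surge = -(-pending_requests // REQUESTS_PER_WORKER)  # ceiling division
    return min(max(RESIDENT_WORKERS, surge), MAX_SURGE_WORKERS)
```

In quiet periods only the resident core runs (and bills); when a disaster drives demand up, capacity follows it rather than being provisioned in advance.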

About NCOIC

The Network Centric Operations Industry Consortium’s core capability
is enabling cross-domain interoperability among and between such areas
as aerospace, civil and military operations, air traffic management,
health care and more. NCOIC is a global not-for-profit organization with
more than 60 members representing 12 countries. It has an eight-year
history of developing net-centric skills and tools that help its members
and customers to operate effectively across diverse global market
sectors and domains. For more information, visit www.ncoic.org

SOURCE Network Centric Operations Industry Consortium

Web site: www.ncoic.org


Cloud Musings on Forbes

( Thank you. If you enjoyed this article, get free updates by email or RSS – © Copyright Kevin L. Jackson 2012)


read more

Stackdriver’s AWS & Rackspace Cloud Monitor Betas

Bain Capital-backed Stackdriver, a public cloud monitoring start-up, pushed its widgetry out into public beta Tuesday.
The SaaS stuff currently works on Amazon Web Services and the Rackspace Cloud.
The year-old company says it’s got close to 100 AWS customers using its eponymous Stackdriver Intelligent Monitoring to reduce the complexity of managing their cloud-powered applications.
Market research done by Stackdriver’s founders Dan Belcher and Izzy Azeri, both veterans of VMware, found the cloud wasn’t delivering all the elasticity, performance and security it’s cracked up to deliver, forcing users to hire DevOps engineers and write their own scripts, especially as their cloud environments grew larger and more complex.
It also found that the tools to hand weren’t designed with the cloud’s dynamic nature in mind.
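The do-it-yourself scripts those users were writing tend to look something like the probe below — poll an endpoint, time it, flag failures — which works fine until the environment grows past a handful of services. (The URL and timeout here are hypothetical, for illustration only.)

```python
"""Sketch of a hand-rolled availability check of the kind cloud users
wrote before turning to hosted monitoring tools."""
import time
import urllib.request

def check_endpoint(url: str, timeout_s: float = 5.0) -> tuple[bool, float]:
    """Return (healthy, latency_seconds) for a single HTTP probe."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            healthy = resp.status == 200
    except OSError:
        healthy = False  # connection refused, DNS failure, timeout, ...
    return healthy, time.monotonic() - start
```

Multiply this by dozens of instances, add alerting, dashboards and on-call rotation, and the appeal of a hosted monitoring service becomes obvious.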

read more

NETWAYS Becomes First OpenNebula Premium Partner

As an OpenNebula Premium Partner, NETWAYS continues years of collaboration with C12G Labs. From sharing a stand at CeBIT to joint training courses and troubleshooting in customer projects, the two companies enjoy a longstanding partnership. Their work together strengthens OpenNebula development and its continued relevance to operations in production environments. As Ignacio Martin, Manager of C12G Labs, notes:
“Getting feedback from partners like NETWAYS on OpenNebula in production helps us to stay on top of the growing needs of cloud operators and systems integrators.”

read more

Driving Cloud Innovation: SSDs Change Cloud Storage Paradigm

IT has more opportunities than ever before with the growth in users, devices, data and secure cloud services. This creates not only a more enriching experience for users, but more opportunities for businesses. The key to capitalizing on these opportunities is to have the right tools in place to help scale operations. In his Day 3 Keynote at 12th Cloud Expo | Cloud Expo New York [June 10-13, 2013], Intel’s Rob Crooke will describe the range of products that Intel provides to support different usage models faced by IT with a specific emphasis on the advancement in SSDs.

read more

Cloud Expo New York: When APM Meets Big Data

Analyzing Hadoop jobs and speeding them up is often a tedious and time-consuming effort that requires experts.
In his session at the 12th International Cloud Expo, Michael Kopp, a technology strategist in the Compuware APM center of excellence, will show you how proven APM techniques can be used to speed up your Hadoop jobs at the core without going through tons of log files, beyond just adding more hardware and within minutes instead of hours or days.
Michael Kopp is a technology strategist in the Compuware APM center of excellence and has more than 10 years of experience as an architect and developer in the Java/JEE space. Additionally, Kopp specializes in architecture and performance of large-scale production deployments with a special focus on virtualization and cloud.
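Kopp's point that jobs can be sped up "at the core" rather than by adding hardware has a classic illustration: pre-aggregating map output with a combiner so far less data crosses the shuffle. The pure-Python sketch below simulates that effect on a word count; it is a generic MapReduce illustration, not Compuware's APM technique:

```python
"""Pure-Python illustration of why a combiner speeds up a MapReduce job:
it shrinks the data shuffled between map and reduce. Not Hadoop code."""
from collections import Counter

def map_words(lines):
    """Map phase: emit one (word, 1) pair per word."""
    return [(w, 1) for line in lines for w in line.split()]

def combine(pairs):
    """Combiner: pre-aggregate on the mapper, before the shuffle."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return list(counts.items())

lines = ["to be or not to be"] * 1000
mapped = map_words(lines)      # 6000 pairs would cross the shuffle
combined = combine(mapped)     # only 4 pairs need to cross it
```

The job's answer is unchanged, but the shuffle — often the dominant cost — moves orders of magnitude less data, which is exactly the kind of structural fix profiling is meant to surface.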

read more

Verizon to Go into the Cloud Storage Biz

Verizon Wireless is going into the cloud storage business for tablets and smartphones up against the likes of Dropbox.
It ultimately expects to let customers transfer information between operating systems.
The Verizon Cloud service will roll out soon, starting with Android, moving on to iOS and then to other operating systems later this year.
Verizon would like to see consumers install the upcoming secure app on all their devices including Windows computers so they can upload content and sync it in the cloud.
The company reasons that the content on the widgets is “more valuable than the device itself” and that it can give folks, for some reason all too careless with their widgets, a place to go to restore this content if they lose the device.
The widgetry will be able to handle text messages, call logs, contacts, music, multimedia and other files, with up to 125GB of storage.

read more

Achieving the Full Promise of the Cloud

As the cloud market explodes, a distinct chasm has become apparent between the operation of infrastructure and applications. Sharing, isolation and load balancing issues in the network, combined with high density virtualization in compute and storage resources, can adversely impact the performance of applications running across the network, frustrating application developers and end users alike.
Enter application defined networking (ADN), which offers great business value for cloud users. A complement to software defined networking (SDN), ADN enables applications to directly control and adapt their networking for optimal performance across both public and private clouds, without compromising on portability or security. With ADN solutions, developers and administrators can automatically instrument, analyze and reconfigure the virtual network of resources to ensure that their cloud-based applications will perform optimally under highly variable conditions, and that they can quickly respond should outages or problems occur.

read more

2013 CRM market share: 40% of CRM systems sold are SaaS-based

Last year, four out of every ten CRM systems sold were SaaS-based, and the trend is accelerating.

In the recent Gartner report Market Share Analysis: Customer Relationship Management Software, Worldwide, 2012, published April 18, 2013, the authors provide insights into why the worldwide CRM market experienced 12% growth in 2012, three times the average of all enterprise software categories.

Gartner cites demand they are seeing from their enterprise clients for CRM systems that can help acquire customers, analyze and act on customer behaviours, and increase all-channel management performance.  Big data inquiries are also increasing in CRM, driven by the interest enterprise clients have in getting more value from social network data and interactions.

Key take-aways from the report include the following:

  • The worldwide CRM market grew from $16B to $18B, a 12.5% growth rate from 2011 to 2012.
  • 80% of all CRM software in 2012 was sold in …
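The figures above are internally consistent: growing from $16B to $18B is exactly a 12.5% increase, which is the 12% worldwide growth Gartner cites, before rounding. A quick check:

```python
# Verify the growth rate implied by the Gartner figures cited above.
market_2011_b = 16.0  # worldwide CRM market, $B, 2011
market_2012_b = 18.0  # worldwide CRM market, $B, 2012

growth_rate = (market_2012_b - market_2011_b) / market_2011_b
assert growth_rate == 0.125  # i.e., 12.5%
```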

Cloud Expo New York: Aligning Your Cloud Security with the Business

Multi-tenant environments bring a new set of challenges when it comes to security.
In his session at the 12th International Cloud Expo, Omar Khawaja, Global Principal, Security Solutions, at Verizon Terremark, will provide a structured and sequenced data-centric security approach to identify and align users, systems and access to data. Cloud Expo attendees will learn the key considerations to be had when outsourcing IT environments to the cloud as well as the importance of a model-driven template.
Omar Khawaja is Global Principal, Security Solutions, at Verizon Terremark.

read more