The Newest Data-Storage Device is DNA?

By Randy Weis

Molecular and DNA Storage Devices: “Ripped from the headlines!”

-Researchers used synthetic DNA encoded to create the zeros and ones of digital technology.

-MIT Scientists Achieve Molecular Data Storage Breakthrough

-DNA may soon be used for storage: All of the world’s information, about 1.8 zettabytes, could be stored in about four grams of DNA

-Harvard stores 70 billion books using DNA: Research team stores 5.5 petabits, or 1 million gigabits, per cubic millimeter in DNA storage medium

-IBM using DNA, nanotech to build next-generation chips: DNA works with nanotubes to build more powerful, energy-efficient, easy-to-manufacture chips

Don’t rush out to your reseller yet! This stuff is more in the realm of science fiction at the moment, although the reference links at the end of this post are to serious scientific journals. It is tough out here at the bleeding edge of storage technology to find commercial or even academic applications for the very latest, but this kind of storage technology, along with quantum storage and holographic storage, will change the world. Wearable, embedded storage technology for consumers may be a decade or more down the road, but you know that there will be military and research applications long before Apple gets this embedded in the latest 100 TB iPod. OK, deep breath. More realistically, where will this technology be put into action first? First, let’s see how it works.

DNA is a three-dimensional medium, with a density of up to a zettabyte per cubic millimeter. Some of this work is being done with artificial DNA injected into genetically modified bacteria (a Japanese research project from last year), using a commercially available genetic sequencer.

More recently, researchers in Britain encoded the “I Have a Dream” speech and some Shakespeare sonnets in synthetic DNA strands. Since DNA can be recovered from 20,000-year-old woolly mammoth bones, this has far greater potential for long-term retrievable storage than, say, optical disks (notorious back in the ’90s for delaminating after five years).
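To make the encoding step concrete, here is a toy sketch of mapping binary data to nucleotides. The published scheme is more elaborate (it uses a rotating base-3 code so the same base never repeats, plus addressing and fourfold redundancy); this two-bits-per-base mapping is only an illustration of the principle.

```python
# Toy sketch (illustration only): map each pair of bits to one nucleotide.
# The real Nature scheme avoids error-prone runs of repeated bases; this
# simple mapping just shows how zeros and ones become A, C, G and T.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(BITS_FOR_BASE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")
print(strand)          # CACACATGCAAC
print(decode(strand))  # b'DNA'
```

The round trip is lossless, which is the whole point: the strand is just another way of writing the bits down, in a medium that survives for millennia.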

Reading the DNA is more complicated and expensive, and the “recording” process is very slow. It should be noted that no one is suggesting storing data in a living creature at this point.

Molecular storage is also showing promise: binding different molecules into a “supramolecule” can store up to 1 petabyte per square inch. But this is a storage medium in two dimensions, not three, and it still requires temperatures of -9 degrees Celsius (considered close to “room temperature” by physicists). This work was done in India and Germany. IBM is working with DNA and carbon nanotube “scaffolding” to build nano devices in its labs today.

Where would this be put to work first? Google and other search engines, for one. Any storage manufacturer would be interested—EMC DNA, anyone? Suggested use cases: globally and nationally important information of “historical value,” and the medium-term archiving of information of high personal value that you want to preserve for a couple of generations, such as a wedding video for grandchildren to see. The slow process of laying the data down and then decoding it makes data archiving the most likely first use case. The entire Library of Congress could be stored in something the size of a couple of sugar cubes, for instance.

What was once unthinkable (or at least only in the realm of science fiction) has become reality in many cases: drones, handheld computers with more processing power than the systems that sent man to the moon, and terabyte storage in home computers. The future of data storage is very bright and impossible to predict. Stay tuned.

Here is a graphic from the journal Nature (the Shakespeare sonnets paper), “Towards practical, high-capacity, low-maintenance information storage in synthesized DNA”: http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11875.html#/ref10

Click here to learn more about how GreenPages can help you with your organization’s storage strategy

Other References:

Researchers used synthetic DNA encoded to create the zeros and ones of digital technology.

http://www.usatoday.com/story/news/nation/2013/01/23/dna-information-storage/1858801/

MIT Scientists Achieve Molecular Data Storage Breakthrough

http://idealab.talkingpointsmemo.com/2013/01/mit-scientists-achieve-molecular-data-storage-near-room-temperature.php

DNA may soon be used for storage

http://www.computerworld.com/s/article/9236176/DNA_may_soon_be_used_for_storage?source=CTWNLE_nlt_storage_2013-01-28

Harvard stores 70 billion books using DNA

http://www.computerworld.com/s/article/9230401/Harvard_stores_70_billion_books_using_DNA

IBM using DNA, nanotech to build next-generation chips

http://www.computerworld.com/s/article/9136744/IBM_using_DNA_nanotech_to_build_next_generation_chips

Analysing the evolution of single sign-on

Replacing mainframes with 21st century identity

By Paul Madsen, senior technical architect

The concept of single sign-on (SSO) is not a new one, and over the years it has successfully bridged the gap between security and productivity for organizations all over the globe.

Allowing users to authenticate once to gain access to enterprise applications improves access security and user productivity by reducing the need for passwords.
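The core idea (authenticate once, then present proof of that authentication to every application) can be sketched with a signed token. This is a minimal illustration, not any particular SSO product: the shared secret, claim names, and token format here are all hypothetical, and real deployments use standards such as SAML or Kerberos rather than hand-rolled tokens.

```python
import base64
import hashlib
import hmac
import json
import time

# Hypothetical secret shared between the identity provider and the apps.
SECRET = b"shared-idp-secret"

def issue_token(user: str, ttl: int = 3600) -> str:
    """Identity provider signs one token after a single login."""
    payload = json.dumps({"sub": user, "exp": time.time() + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token: str):
    """Any participating app checks the same token; no new password needed.

    Returns the user name, or None if the token is tampered with or expired.
    """
    encoded, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(encoded)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: token was forged or altered
    claims = json.loads(payload)
    return claims["sub"] if claims["exp"] > time.time() else None

token = issue_token("alice")
print(verify_token(token))  # alice
```

Every application trusts the signature instead of prompting for credentials again, which is exactly the security-plus-productivity trade the article describes.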

In the days of mainframes, SSO was used to help maintain productivity and security from inside the protection of firewalls. As organizations moved to custom-built authentication systems in the 1990s, it became recognized as enterprise SSO (ESSO) and later evolved into browser-based plugin or web-proxy methods known as web access management (WAM). IT’s focus was on integrating applications exclusively within the network perimeter.

However, as enterprises shifted toward cloud-based services at the turn of the century and software-as-a-service (SaaS) applications became more …

Instant Cloud Storage Migration

Companies want to migrate data to the cloud and work from the cloud directly. However, they don’t have time to wait if the data migration takes a long time. Here is a typical conversation around data migration to the cloud.
“Can Gladinet work as a shared network drive for 20 users within my company? I want to have employees copy and edit files and folders in one centralized drive. For example, the drive would show up in the “My Computer” section of Windows. We don’t want something that is only shareable via web sites through web browser. Also, is there lag time for when we drop files to the drive? We want instant access to the files once they are uploaded.” – A customer told us their story and the solution they are looking for.
The customer then explained their existing solution and why it didn’t fit their requirement. “Our main problem with the other service we are testing now is that it took 20 minutes to upload 500MB of data, but then another 5 hours to sync or whatever it was doing so the files would show up on other computers.”

read more

89 Degrees Launches ECHO Email Optimization Plan

89 Degrees, a customer engagement agency that leverages data and analytically driven strategy for maximum ROI, has launched a new service, ECHO, an email capabilities & health optimization plan that maximizes email investments for increased impact and ROI.

“Industry surveys show that the amount of email sent by retailers in 2012 jumped by at least 19 percent, and with good reason,” said Arthur Sweetser, CMO of 89 Degrees and recent speaker at the Email Evolution Conference. “Email is evolving and remains highly profitable; the evidence keeps mounting and businesses know it is a smart bet for their budgets.”

89 Degrees also hears from many marketers who are convinced of email’s value, but still don’t know if they are making the most of their investment. Or they may not be achieving the growth numbers on which they were counting.

“When businesses are allocating a sizable portion of their marketing budget to email marketing, they want to be sure they are spending smart,” continued Sweetser. “That’s why we introduced ECHO – to help marketers reach and exceed their goals, increasing their success with each following campaign.”

Kaplan Launches Education Startup Accelerator

Kaplan, Inc., the education services subsidiary of The Washington Post Company, announced today the launch of the Kaplan EdTech Accelerator, an intensive three-month mentoring and business development program for 10 startup companies, in collaboration with TechStars, a nationally recognized startup accelerator.

The Kaplan EdTech Accelerator will select startups using technology to create products and services across the broad spectrum of education including K-12, higher education, professional education, lifelong learning, and other areas. TechStars will invest $20,000 in each company accepted into the program.

The Kaplan EdTech Accelerator is the first corporate sponsored accelerator focused exclusively on the education sector, using TechStars’ mentor-driven, deep immersion model. TechStars has completed 15 accelerator programs and its selected companies have attracted more than $285 million in funding in the past six years.

The Kaplan EdTech Accelerator will host the startups, to be chosen by application, at its offices in New York City’s West Village neighborhood from June to September 2013. They will be mentored by industry leaders, such as Kaplan, Inc. Chairman and CEO Andy Rosen, TechStars founder and CEO David Cohen, Washington Post Company Chairman and CEO Don Graham, noted venture capitalist and Foundry Group Managing Director Brad Feld, and many notable founders of ed-tech companies, including Jose Ferreira of Knewton and Eren Bali of Udemy.

Additionally, Kaplan will provide the startups with office space and facilities, and other resources as they work to build their companies and products. This support includes access to Kaplan’s proprietary “Kaplan Way for Learning” program, which harnesses the latest learnings from the fields of science, instructional design, and technology to support the development of highly effective, evidence-based learning products. Kaplan also has tremendous reach in education with more than one million students enrolled annually, taught by 10,000-plus instructors globally, relationships with 300-plus U.S. school districts, more than 20 university partners worldwide, and thousands of corporate customers.

The program will culminate in Demo Day, when the startups’ founders will present to an elite group of angel and venture investors and education industry influencers, with the goal of securing funding to grow their companies.

“We’re thrilled about partnering with TechStars to launch the Kaplan EdTech Accelerator,” said Andy Rosen, Kaplan, Inc. chairman and CEO. “Kaplan’s mission is to provide students around the world with the best, most efficient means to achieving their educational goals. Ongoing cultivation of new innovations from all across the sector—in ways like this accelerator program—is embedded in our company’s history.”

From its start, Kaplan has pioneered notable education innovations. It has, for example, launched the first wholly online law school in the U.S.; built its online university into one of the country’s largest higher education institutions; and, more recently, created mobile delivery systems for its test prep and professional education customers, a new prior learning assessment service for adult learners, and an innovative, large-scale online instructional platform, KAPx. Kaplan will be making available as mentors several of those Kaplan professionals who have driven many of these innovations.

The application deadline is April 14, 2013. Selected companies will be contacted in late April, and the program will begin in June. Further details and the application for the program are available at KaplanEdTechAccelerator.com.

Weekly Roundup: Amazon Price Cut on RDS Multi-AZ & More

Last week, the cloud world received lots of new offerings and releases from various providers. There were new releases from CloudStack, Google, StackMob and MongoHQ, as well as a new feature release from Amazon. Microsoft published an article on choosing the best SQL database solution in Windows Azure. There was also acquisition news from VMware and an analyst report on Rackspace’s fourth-quarter results.
Early last week, IaaS leader Amazon introduced a new Route 53 DNS Failover feature. This feature pairs well with Amazon S3’s website hosting to create a simple, low-cost and reliable way to deploy a backup website, directing customers to the backup site when the primary website goes down. Next, Amazon announced price reductions of 15% to 32% on the relational database Multi-AZ deployment model, which maintains a standby instance with an up-to-date copy of the primary database; the exact reduction varies with the region where the Multi-AZ deployment occurs. Finally, Amazon announced the launch of Amazon Redshift to all its customers, available from the AWS management console. This allows users to start with a few hundred terabytes and then scale up to a petabyte or more.
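To see what that DNS failover pairing looks like in practice, here is a sketch of the record change batch Route 53 expects: a primary record guarded by a health check, and a secondary record pointing at the S3-hosted backup site. The domain, IP address, S3 endpoint, and health-check ID below are made up for illustration; in real use you would submit a batch like this through the Route 53 API (for example, boto’s change_resource_record_sets call).

```python
# Sketch of a Route 53 failover change batch. All identifiers are
# hypothetical; this builds the request body only and calls no AWS API.
def failover_records(domain, primary_ip, backup_s3_endpoint, health_check_id):
    return {
        "Changes": [
            {   # Primary record: served while its health check passes.
                "Action": "CREATE",
                "ResourceRecordSet": {
                    "Name": domain, "Type": "A", "TTL": 60,
                    "SetIdentifier": "primary", "Failover": "PRIMARY",
                    "HealthCheckId": health_check_id,
                    "ResourceRecords": [{"Value": primary_ip}],
                },
            },
            {   # Secondary record: the S3-hosted backup site takes over
                # automatically when the primary's health check fails.
                "Action": "CREATE",
                "ResourceRecordSet": {
                    "Name": domain, "Type": "CNAME", "TTL": 60,
                    "SetIdentifier": "backup", "Failover": "SECONDARY",
                    "ResourceRecords": [{"Value": backup_s3_endpoint}],
                },
            },
        ]
    }

batch = failover_records(
    "example.com.", "203.0.113.10",
    "example.com.s3-website-us-east-1.amazonaws.com", "hc-1234")
print(batch["Changes"][0]["ResourceRecordSet"]["Failover"])  # PRIMARY
```

The low TTL matters here: it bounds how long clients keep resolving to the dead primary before DNS steers them to the backup.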

read more

CRaSH for Mule, an Introduction

This blog post is the formal introduction to the CRaSH console for Mule, which I’ve been working on for the past month or so. I’ve decided to interview myself about it because, hey, if I don’t do it, who will?
So what is it? It is a shell that runs embedded in Mule and gives command-line access to a variety of Mule’s internal moving parts. It’s built on the excellent CRaSH project, a toolkit created by Julien Viet and sponsored by eXo Platform, which allows the easy creation of embedded shells.
And what can it do? Well, that’s easy to find out. Let’s connect to CRaSH for Mule and ask for help:

read more

Opcos looking to enterprise for cloud revenue

Although large enterprise customers remain the focus for mobile operators worldwide when it comes to deploying cloud services, operators feel they are unlikely to generate a significant return on investment from cloud services in the short term.

The Telecoms.com Intelligence Industry Survey asked what percentage of revenues operators are likely to invest in cloud services over the next two years. The most common response was between 11 and 20 per cent, with 28.9 per cent opting for this bracket. When asked what percentage of revenue respondents believe operators will see in return for that investment, the most common response, chosen by 38.1 per cent, was zero to 10 per cent.