Nasa making use of cloud computing

Nasa’s Jet Propulsion Laboratory has announced that it will be looking to use cloud computing platforms to power the administration’s next generation of projects.

The news was revealed at the 2012 re:Invent conference, where Nasa IT CTO Tom Soderstrom and software engineer Khawaja Shams explained that staff at the Jet Propulsion Laboratory are using cloud hosting platforms to power the Mars Curiosity rover mission.

So, not only could cloud computing potentially help save the planet, it can also help explore other planets!

The rover has already made some fantastic discoveries on Mars, and it makes use of cloud computing and storage platforms to communicate and to process data gathered during the mission, which is then sent back to mission control.

Furthermore, Tom Soderstrom and Khawaja Shams have stated that the rover was a basis …

SYS-CON.tv: There’s No Business Like Cloud and Big Data

In this “CEO Power Panel” moderated by Cloud Expo Conference Chair Jeremy Geelan at the 11th International Cloud Expo in Santa Clara, CA, Chris C. Kemp, Co-Founder & CEO of Nebula; Lawrence Guillory, CEO of Racemi Inc.; Brian Patrick Donaghy, CEO of Appcore, LLC; Rob Clyde, CEO of Adaptive Computing; Joe Weinman, Sr. VP of Cloud Services & Strategy at Telx; Art Landro, CEO of Cordys; and Rodney Rogers, Chairman & Chief Executive Officer of Virtustream, discussed the business-related issues that are currently front of mind for chief executives of today’s foremost Cloud and Big Data-based companies.

read more

Everything You Wanted to Know About Cloud Hosting

Cloud hosting has been creating quite a buzz lately as the latest, and some claim also the best, type of web hosting around. It is quite different from your usual hosting server and certainly requires a lot of skill to use. Most of the time it cannot be run by a single person, but rather by a team of people who can get the job done at a quicker pace. One of the most popular websites in the world that uses cloud hosting is Google. Ever seen Google’s website down? No. So there really must be some excellent points to cloud hosting, which we will now look at.
The single most important thing to know about cloud hosting is that your website’s components and files are spread across multiple servers, instead of everything being packed onto one. They all remain available at the same time, and any required piece of information can be fetched when needed. This distribution of files is what gives cloud hosting its most impressive advantage: almost zero downtime. If one server goes down, another server automatically takes over serving the affected pages. There is a tiny bit of downtime during the switchover, but it happens in milliseconds, which makes it almost unnoticeable. Other websites that use this type of hosting are Amazon and PayPal, as well as any other site that simply cannot afford to go down even for a second.
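The failover behaviour described above can be sketched in a few lines. This is an illustrative model only, not any particular host’s implementation; the server names and the health map are invented:

```python
# Minimal sketch of cloud-hosting failover: requests go to the first
# replica that reports healthy, so one server going down only costs
# the time it takes to skip to the next one.

def pick_server(replicas, is_healthy):
    """Return the first healthy replica, or None if all are down."""
    for server in replicas:
        if is_healthy(server):
            return server
    return None

# Toy health map standing in for real health checks.
health = {"eu-1": False, "us-1": True, "us-2": True}

server = pick_server(["eu-1", "us-1", "us-2"], lambda s: health[s])
print(server)  # "us-1" -- eu-1 is down, so traffic fails over to us-1
```

A real load balancer would probe replicas continuously rather than on each request, but the principle of routing around a dead server is the same.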

read more

Airclic Helps Dock Workers Stage, Load Shipments from Multiple Suppliers

Airclic, a global provider of cloud-based software for mobile supply chain and logistics operations, today announced that it has released the latest version of Transport Perform® (version r12.4), its cloud-based 3PL transportation solution that improves order accuracy and operational efficiency. With this release, Airclic has expanded its support for high-volume cross-docking, allowing dock workers to receive, stage and load shipments from one or more suppliers in parallel and validate received items or shipments in real time against Advanced Shipment Notice data.

Transport Perform is a third-party logistics (3PL) mobile transportation management product that works with any device or carrier. It has been designed specifically to improve the efficiency and accuracy of cross-docking and delivery within a logistics operation by providing visibility into the mobile supply chain and giving an organization the features and functionality it needs to enable a more scalable and reliable workflow. Costly and time-consuming paper-based processing can be eliminated, along with received-versus-loaded-versus-delivered errors, reducing overall costs and increasing customer confidence that shipments have been executed correctly. Transport Perform is intuitive and easy to use, ensuring a seamless deployment and rapid ROI; customers have reported saving an average of $400 per route, per month.

With Transport Perform r12.4, the full and accurate data required to process shipments from suppliers is captured, validated and assured from the first point of contact on the dock. Inbound shipments are validated in real time against advanced shipment data (at item level, order level and supplier container level), and dock workers are automatically directed to the correct outbound container and route based on shipment details such as end destination or customer location. The release also boasts highly configurable workflows to ensure that dock worker activity is based on the requirements of the original supplier.
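The real-time validation step amounts to reconciling what is scanned on the dock against the Advanced Shipment Notice line items. Here is a minimal sketch of that idea; the item codes and data shapes are invented for illustration and do not reflect Airclic’s actual data model:

```python
from collections import Counter

def validate_receipt(asn_lines, scanned_items):
    """Compare scanned items against Advanced Shipment Notice quantities.

    asn_lines: {item_code: expected_quantity}
    scanned_items: list of item codes scanned on the dock
    Returns (missing, unexpected) as {item_code: quantity} dicts.
    """
    scanned = Counter(scanned_items)
    # Items the ASN promised but the dock has not (fully) received.
    missing = {code: qty - scanned.get(code, 0)
               for code, qty in asn_lines.items()
               if scanned.get(code, 0) < qty}
    # Items scanned in excess of (or absent from) the ASN.
    unexpected = {code: n - asn_lines.get(code, 0)
                  for code, n in scanned.items()
                  if n > asn_lines.get(code, 0)}
    return missing, unexpected

asn = {"SKU-100": 2, "SKU-200": 1}
missing, unexpected = validate_receipt(asn, ["SKU-100", "SKU-300"])
print(missing)     # {'SKU-100': 1, 'SKU-200': 1}
print(unexpected)  # {'SKU-300': 1}
```

In a production system this check would run per scan so the worker is alerted immediately, rather than after the whole shipment is staged.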

“Organizations that have automated their transportation logistics and delivery systems have improved customer satisfaction while reducing the cost of doing business,” said Pól Sweeney, CTO, Airclic. “Improved accuracy and better efficiency have increased customer confidence and given 3PL providers using Transport Perform an advantage over the competition. The new capabilities announced today will help them accurately record items received on the dock, staged and loaded, all at the same time and in real time.”

The key features of Transport Perform r12.4 include:

  • A graphical interface for administrators, enabling them to fully set up
    and configure workflow forms for dock workers and drivers to ensure
    the accuracy of information captured during receiving and staging
  • Data validation and post-processing behavior definition capabilities
  • Customizable Perform Reports designed to support cross-dock receiving
    and post-receiving reconciliation with supplier systems based upon the
    unique rules of each
  • Allowance for the upload of custom attributes as part of the customer,
    depot or supplier location bulk upload process


Cloud Computing: Google Out to Thump Amazon

Google’s got Amazon Web Services and Rackspace/OpenStack in its sights.
It’s upgrading its young Infrastructure-as-a-Service Compute Engine, still in preview, and piling on 36 new types of server configurations with different virtual cores, memory and disk types, up from the four basic instances it had at introduction in June.
They include high-memory and high-CPU instances as well as a lower-cost diskless file configuration for applications that don’t need a dedicated disk attached to their server.
Google, whose service is reputed to be “blazingly fast” compared to Amazon’s, also cut prices on its four main instances by around 5% and on its standard storage by around 20%.

read more

Yet Another Round of AWS Storage Price Cuts

Timed for announcement at AWS re:Invent, and nicely juxtaposed against this week’s similar storage price cuts by Google, Amazon has trimmed S3 and EBS prices.

They’ve reduced the price of Amazon S3 storage by 24-28% in the US Standard Region and made similar price reductions in all nine regions worldwide, as well as reducing the price of Reduced Redundancy Storage (RRS). Here are the new prices for Standard Storage in the US Standard Region:

Tier                 Old Price (GB / month)   New Price   Change
First 1 TB / month   $0.125                   $0.095      24%
Next 49 TB           $0.110                   $0.080      27%
Next 450 TB          $0.095                   $0.070      26%
Next 500 TB          $0.090                   $0.065      28%
Next 4000 TB         $0.080                   $0.060      25%
Over 5000 TB         $0.055                   $0.055      No change

The new prices are listed on the Amazon S3 pricing announcement page. The new prices take effect on December 1, 2012 and will be applied automatically.
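As a worked example of how the tiered rates above add up on a bill (assuming 1 TB = 1,024 GB and the new US Standard Region prices; this is an illustrative calculation, not AWS billing code):

```python
# New US Standard Region S3 Standard Storage prices, per GB-month.
TB = 1024  # GB per TB, as used in storage billing
TIERS = [
    (1 * TB,       0.095),  # first 1 TB
    (49 * TB,      0.080),  # next 49 TB
    (450 * TB,     0.070),  # next 450 TB
    (500 * TB,     0.065),  # next 500 TB
    (4000 * TB,    0.060),  # next 4000 TB
    (float("inf"), 0.055),  # over 5000 TB
]

def s3_monthly_cost(gb):
    """Monthly cost in dollars for `gb` gigabytes of Standard Storage."""
    cost, remaining = 0.0, gb
    for size, price in TIERS:
        used = min(remaining, size)
        cost += used * price
        remaining -= used
        if remaining <= 0:
            break
    return cost

# 2 TB a month: 1,024 GB at $0.095 plus 1,024 GB at $0.080
print(round(s3_monthly_cost(2 * TB), 2))  # 179.2
```

So a customer storing 2 TB would now pay about $179 a month, versus roughly $235 at the old $0.125/$0.110 rates.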

Amazon also reduced the per-gigabyte storage cost for EBS snapshots, again world-wide. Here are the new prices:

Region                          Old Price (GB / month)   New Price   Change
US East (N. Virginia)           $0.125                   $0.095      24%
US West (Oregon)                $0.125                   $0.095      24%
US West (Northern California)   $0.140                   $0.105      25%
EU (Ireland)                    $0.125                   $0.095      24%
Asia Pacific (Singapore)        $0.125                   $0.095      24%
Asia Pacific (Tokyo)            $0.130                   $0.100      23%
Asia Pacific (Sydney)           $0.140                   $0.105      25%
South America (Sao Paulo)       $0.170                   $0.130      24%


Gartner Highlights Cloud Data Encryption Gateways

Last month Gartner Analyst Jay Heiser conducted an extremely informative and thought-provoking webinar entitled “The Current and Future State of Cloud Security, Risk and Privacy.” During the presentation, Mr. Heiser highlighted what he called the “Public Cloud Risk Gap,” characterized in part by inadequate processes and technologies by the cloud service providers and in part by a lack of diligence and planning by enterprises using public cloud applications. In many ways, it was a call to arms to ensure that adequate controls, thought and preparation are put to use before public clouds are adopted by enterprises and public sector organizations.
From the side of the cloud application provider, the webinar noted that most cloud service offerings are incomplete when measured against traditional “on-premise” security standards, that there are relatively few security-related Service Level Agreements (SLAs), and that there is minimal transparency into the security posture of most cloud services. From the enterprise side (the cloud service consumer), Mr. Heiser pointed out that organizations frequently come to the table with inadequate planning and consideration in the area of security requirements definition, and with an incomplete data sensitivity classification governing their data assets. Despite this, the webinar highlighted that organizations of all sizes are increasingly willing to place their data externally, and they are increasingly likely to have at least some formalized processes for assessing the associated risk, which is good news.

read more

Eucalyptus CEO Mickos: "We Each Have Our Own Sweet Spot"

Eucalyptus was originally known as one of the key open-source contenders in the battle for private cloud computing customers. Later, it became known as the company that broke with the pack to become compatible with AWS public cloud services, thus becoming a player in the world of hybrid cloud.

Now Eucalyptus has moved to version 3.2, with a new GUI that enables self-service as well as simplified cloud administration and usage reporting.

The new GUI “allows users to perform self-service operations including the provisioning of instances, keypair and password creation, Elastic Block Store (EBS) volume and snapshot operations, image catalog listing and registration, user group operations, and elastic IP operations,” according to the company.

Eucalyptus 3.2 is more robust than previous versions as well, with the company stating its “load-test harness may handle up to 6000 concurrent connections to 20 test applications across multiple groups and phases of development.”

I asked Eucalyptus CEO Marten Mickos how he views the company’s positioning now and how much of 3.2’s improvements and enhancements came from customer feedback.

“Most of the new features came from customer requests,” he said. “One of our big corporate customers asked for the VNX adapter, for example. The reporting and logging was asked for by customers (in addition to other features).”

Mickos also said he thought that he and the company’s major competitors “each have our own sweet spot. We are the only AWS-compatible enterprise-focused cloud platform. OpenStack is popular among those who build their own cloud platform. CloudStack is popular among service providers. VMware’s vCloud Director is not very popular, but to the degree it is used, it is used by customers who have gone all-VMware.”

read more

Why governance must drive all security initiatives…even cloud

“The ‘how’ may change, but the ‘what’ is fundamental to risk management.”

I heard these sage words at a recent ISSA (Information Systems Security Association) meeting from a CIO speaking about security from the cloud.

He continued, “Risk is not unique to the cloud. The cloud experiences the same issues that affect any outsourcing or third-party deliverable. It is bounded by the same concerns regarding governance: does it meet the requirements of my industry? Is my data free from co-mingling? Are the proper notification protocols in place?”

Do a Google search on “cloud security” and the first entry is “How secure is the cloud?” True professionals know the argument is not about technology or how security is delivered, but rather one of governance. You need to know exactly who HAS access to what resources and if these levels of access are appropriate.
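That governance question, who has access to what and whether those levels are appropriate, lends itself to a simple audit check. A minimal sketch, assuming you can export actual grants and role policies as sets (the users, roles and resource names here are invented):

```python
def audit_grants(grants, role_policy):
    """Flag access grants that exceed what a user's role permits.

    grants:      {user: set of resources actually granted}
    role_policy: {user: set of resources the user's role should allow}
    Returns {user: set of excess resources} for users with overbroad access.
    """
    violations = {}
    for user, resources in grants.items():
        allowed = role_policy.get(user, set())
        excess = resources - allowed  # granted but not permitted
        if excess:
            violations[user] = excess
    return violations

grants = {"alice": {"crm", "billing"}, "bob": {"crm"}}
policy = {"alice": {"crm"}, "bob": {"crm"}}
print(audit_grants(grants, policy))  # {'alice': {'billing'}}
```

Running a check like this on a schedule, against whatever identity store the cloud provider exposes, turns the governance question from a one-off review into an ongoing control.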

You need to know who IS accessing resources …

Cloud 2.0 : Re-inventing CRM

Cloud computing isn’t just re-inventing technology; it will also drive the evolution of the business practices the technology is used for.
For example CRM: Customer Relationship Management.
This is a discipline that started with simple contact-management apps like ACT!, moved through Goldmine, and then of course Salesforce.com.

Since the ASP (Application Service Provider) phase of the Cloud evolution we have had the social media explosion, and so the principal category to add is “social media CRM”. After that came Cloud, and so we’re now at a phase best described as Cloud 2.0.

This is most powerfully demonstrated by the public sector, where CRM is about ‘citizen engagement’ and where the core expression of the model can be referenced through ‘CORE’ design, standing for Community Oriented Re-Engineering.
In short, this reflects the simple point that online is about communities; how you re-engineer your business processes to harness this principle is the fundamental idea of CORE design, and therefore how it can be used to implement a Cloud 2.0 strategy.

read more