Amazon Web Services to offer new hierarchical storage options after customer feedback

Amazon Web Services (AWS) is adding a new storage class for information that is accessed infrequently but still needs to be retrieved quickly when required.

The announcement was made by AWS chief evangelist Jeff Barr on the company blog. Customer feedback prompted AWS to analyse usage patterns, Barr said. Its analysts found that many customers store rarely read backup and log files, which compete for resources with shared documents and raw data awaiting immediate analysis. Most users access their files frequently shortly after uploading them, after which activity drops off significantly with age. Information that is important but not immediately urgent needed a new storage model, said Barr.

In response, AWS has unveiled S3 Standard – Infrequent Access (Standard – IA), creating a hierarchy of pricing options based on frequency of access. Customers now have a choice of three S3 storage classes: Standard, Standard – IA and Glacier, all offering the same 99.999999999 per cent durability. Standard – IA carries a service level agreement (SLA) of 99 per cent availability and is priced accordingly: from $0.0125 per gigabyte per month, with a 30-day minimum storage duration for billing and a $0.01 per gigabyte retrieval charge. The usual data transfer and request charges apply.

For billing purposes, objects that are smaller than 128 kilobytes are charged for 128 kilobytes of storage. AWS says this new pricing model will make its storage class more economical for long-term storage, backups and disaster recovery.
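Worked through, the pricing looks like this. The following Python sketch is illustrative only: it uses just the figures quoted above (real AWS bills add request and transfer charges), and the function and constant names are our own.

# Hypothetical Standard-IA cost estimator, based only on the prices
# quoted above; real bills also include request and transfer charges.
STORAGE_PRICE_PER_GB_MONTH = 0.0125   # $ per GB-month, Standard-IA
RETRIEVAL_PRICE_PER_GB = 0.01         # $ per GB retrieved
MIN_BILLABLE_BYTES = 128 * 1024       # objects under 128 KB bill as 128 KB

def estimate_monthly_cost(object_sizes_bytes, gb_retrieved=0.0):
    """Estimate one month's Standard-IA charges for a set of objects."""
    billable = sum(max(size, MIN_BILLABLE_BYTES) for size in object_sizes_bytes)
    gb_stored = billable / 1024 ** 3
    return (gb_stored * STORAGE_PRICE_PER_GB_MONTH
            + gb_retrieved * RETRIEVAL_PRICE_PER_GB)

# 10,000 4 KB log files each bill as 128 KB: ~1.22 GB stored, ~$0.015/month.
print(f"${estimate_monthly_cost([4 * 1024] * 10_000):.4f}")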

AWS has also introduced a lifecycle policy option, in a system that emulates the hierarchical storage model of centralised computing. Users can now create policies that automate the movement of data between Amazon S3 storage classes over time. Typically, according to Barr, customers will move data uploaded to the Standard storage class into Standard – IA once it is 30 days old, and on to Amazon Glacier after another 60 days, where data storage costs $0.01 per gigabyte per month.
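As a sketch of what such a policy looks like in practice, here is a minimal example using the AWS SDK for Python (boto3). The bucket name and key prefix are hypothetical, and the 30- and 90-day thresholds mirror the timeline Barr describes.

# Minimal lifecycle policy sketch using boto3 (assumes AWS credentials
# are already configured). Bucket name and prefix are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-log-archive",           # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-down-ageing-data",
            "Filter": {"Prefix": "logs/"},  # hypothetical key prefix
            "Status": "Enabled",
            "Transitions": [
                # Move to Standard-IA at 30 days, then to Glacier at
                # 90 days (another 60 days on), as in Barr's example.
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }]
    },
)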

Software and platforms as a service driving our growth, says Oracle

Oracle’s latest quarterly results show the increasing strategic importance of revenue from cloud software and platforms as a service, according to the vendor. Executive chairman Larry Ellison also claimed the sales figures show Oracle will soon overtake Salesforce as the top-selling cloud operator.

The official figures for Oracle’s fiscal 2016 Q1 period show total revenues of $8.4 billion, representing a two per cent fall in US dollars but a seven per cent rise in constant currency. Oracle attributed the fall to the current strength of the US dollar.

However, a clearer pattern emerged in software sales when benchmarking everything in US dollars. While revenues from on-premise software were down two per cent at $6.5 billion, total cloud revenues were up 29 per cent at $611 million. Revenue from cloud software as a service (SaaS) and platform as a service (PaaS) was $451 million, a 34 per cent increase, while cloud infrastructure as a service (IaaS) revenues rose 16 per cent to $160 million over the same period.

Meanwhile, Oracle’s total hardware revenue for the period, $1.1 billion, declined three per cent. On the same US dollar benchmark, Oracle’s services revenues more or less stagnated at $862 million, a rise of one per cent.

Growth is being driven by SaaS and PaaS, according to Oracle CEO Safra Catz. “Cloud subscription contracts almost tripled in the quarter,” said Catz. “As our cloud business scales up, we plan to double our SaaS and PaaS cloud margins over the next two years. Rapidly growing cloud revenue combined with a doubling of cloud margins will have a huge impact on growth going forward.”

Oracle’s cloud revenue growth is being driven by a year-over-year bookings rise of more than 150 per cent in Q1, reported Oracle’s co-CEO Mark Hurd. “Our increasing revenue growth rate is in sharp contrast to our primary cloud competitor’s revenue growth rates, which are on their way down.”

Oracle is still on target to book up to $2.0 billion of new SaaS and PaaS business this fiscal year, claimed executive chairman Larry Ellison. “That means Oracle would sell between 50 per cent more and double the amount of new cloud business that Salesforce plans to sell in their current fiscal year. Oracle is the world’s second largest SaaS and PaaS company, but we are rapidly closing in on number one.”

How IoT Security could change infrastructure forever

On September 22nd and 23rd, the first-ever dedicated IoT Security conference and exhibition will take place in Boston.

While at first glance this may appear to concern a specific and rather specialized area, the relationship of the Internet of Things to the broad issue of human security may well prove much more far-reaching and fundamental.

After all, the development of the Internet itself was driven by a Cold War desire to create resilient computer networks that could withstand a nuclear attack. This threat inspired a whole new architecture for sharing and protecting information – one that was intentionally decentralized.

History suggests that precaution can be a key driver of technological innovation. In changing things to protect them, we often open up unforeseen new opportunities.

Which is why, if we return to 2015, there is something fascinating in seeing the same decentralized architectures applied to real-world infrastructures in the name of collective safety.

“When you apply this kind of Internet-type architecture to core infrastructure – whether it’s water or energy or transportation – these systems start looking a lot more like the Internet,” says John Miri, Chief Administrative Officer at the Lower Colorado River Authority (LCRA) and a speaker at this month’s Boston event. “You start to see water systems, flood data systems and, hopefully, electric grids that are less centralized, more resilient and more difficult to disrupt.”

The LCRA is an 80-year-old institution with roots in the Great Depression, entrusted with providing reliable water, flood protection and electricity to Central Texas and beyond. The area LCRA serves covers several of the fastest-growing cities in the United States, meaning it faces some substantial demands on its infrastructure.

“Providing the water and power to support growing communities and a growing business and industrial base is no small task,” Miri says. Indeed, LCRA has broken ground on a new quarter-of-a-billion-dollar reservoir, the region’s first new water supply in decades.

Many of these additional demands make safety and security more important than ever.

“LCRA is now the second largest electric transmission utility in Texas. Our high tension transmission lines go across a large portion of the state. Protecting the electric grid is a pretty hot topic,” Miri says.

Such threats encompass the hypothetical “bad actors” Miri mentions, but also less hypothetical dangers to the infrastructure.

“When you have a flood, we may have to intentionally shut down electric substations. Everyone knows electricity and water don’t mix – but even having the situational awareness to know that water is approaching a substation is very important to us in keeping the lights on. Using these kinds of smart networks to get a better picture of the threats and dangers to the power grid helps us protect it rather than just saying ‘build more,’” Miri says.

Similarly, a vast number of sensors throughout its Hydromet network enable LCRA to better monitor water levels – and to effectively manage floods.

“By adopting a new, more open, shared technology approach, we could expand the infrastructure we have for flood data collection at a 90% lower cost than if we had done it the traditional way. The technology actually opens up our infrastructure to a very wide region that never considered it before. We can offer a level of flood monitoring across a wider region and extend it to rural and agricultural communities and other areas that might not have the resources to gain access to this technology.”

Looking ahead, Miri says, there are new opportunities to apply this decentralized, Internet-style architecture to other projects.

“I think when you look forward 10, 15 or 20 years, the whole infrastructure may work differently. It opens up new possibilities and business models that we didn’t have before. For instance, Texas is on the coast. As with any coastal area, we spend time thinking about desalination. Some of the work we’ve been doing on the Internet of Things is making people think that maybe we don’t need a couple of giant desalination plants – which has been the approach in Australia and Israel – but a number of smaller plants that are networked together and share the water more efficiently. In the longer term, IoT may actually change the infrastructure itself, which would be very exciting.”

It will be interesting to look back one day at this month’s inaugural IoT Security event and see how many of the topics discussed went on to fundamentally reshape their wider domains.

New IBM cloud service could help car makers harness the internet of things and cut emissions


IBM has launched a cloud service that aims to harness the power of the Internet of Things (IoT) so that car makers can cut the costs of production and ownership, and reduce pollution.

The new service could help the likes of BMW and Mercedes make better use of the mass of data created by a car’s intelligent sensors, and use that intelligence to make drivers more efficient. The service aggregates data about the vehicles, their drivers and their passengers. IBM claims it could cut carbon emissions by reducing fuel consumption through better driving techniques, smarter route choices and sensible loading.

On the supplier side, the improved intelligence, says IBM, could help vehicle manufacturers to lower both the cost of production and ownership, through techniques such as predictive vehicle maintenance, real-time engine diagnostics and chassis stress analysis.

The IBM Internet of Things (IoT) for Automotive system is available on IBM Cloud’s SoftLayer infrastructure. IBM says it will analyse primary and secondary sources of intelligence. In addition to primary sources, such as geolocation data collected in the car, it will use external sources such as the car maker’s customer data and vehicle history. It will also use data from parking providers.

Automotive supplier Continental uses IBM MessageSight and IBM InfoSphere Streams, components of the IBM IoT for Automotive solution, to help manage complex data streams and apply analytics to its eHorizon system. This allows vehicle electronics to anticipate road conditions using digital mapping and crowd sourced data.
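MessageSight is, at its core, a messaging broker that speaks MQTT, so a feel for this kind of data flow can be had from a generic MQTT sketch. The following is illustrative only: the broker address, topic layout and payload fields are assumptions rather than any part of IBM’s API, and it uses the open-source paho-mqtt client.

# Illustrative only: a vehicle publishing telemetry over MQTT, the
# protocol spoken by brokers such as IBM MessageSight. Broker host,
# topic scheme and payload fields are hypothetical.
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x-style API)

BROKER_HOST = "broker.example.com"        # hypothetical broker
TOPIC = "vehicles/VIN12345/telemetry"     # hypothetical topic scheme

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()  # background network loop for QoS 1 acknowledgements

for _ in range(5):  # publish one reading per second
    reading = {
        "ts": time.time(),
        "lat": 52.37, "lon": 4.89,   # sample geolocation
        "speed_kmh": 87,
        "engine_temp_c": 92,
    }
    client.publish(TOPIC, json.dumps(reading), qos=1)
    time.sleep(1)

client.loop_stop()
client.disconnect()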

According to Telefonica’s 2013 Connected Car Industry Report, nine in ten new cars will be equipped with extensive connectivity services by 2020. IBM’s mission is to make sense of this mass of big data and put it to good use, said Dirk Wollschlaeger, IBM’s general manager for the global automotive industry. “We have the potential to change how we interact with our vehicles,” said Wollschlaeger.

Here’s Why You Should Upgrade to Parallels Desktop Pro Edition

With the arrival of Parallels Desktop 11 for Mac, we also brought something new to the table: Parallels Desktop for Mac Pro Edition. Regarding this new edition, a lot of our customers have asked us the same question: why should I upgrade to Pro? We’re happy to shed a little light! Here are some of […]


Guest Blog: Gilt’s Business Growth, Powered by the Right IT Tools

This post was written by guest blogger Andrew Robinson, Senior IT Support Engineer at Gilt Groupe K.K. We are extremely excited to share his post as a special guest blog this week! Read on to learn about his experience choosing and using Parallels Desktop for Mac Business Edition. When I first joined […]


Microsoft Purchases Adallom

While Microsoft did not disclose how much it paid for Israeli-based Adallom, TechCrunch has estimated the purchase at around $250 million. The acquisition is Microsoft’s response to the trend towards cloud-based computing: while the company originally sold packaged programs such as its Office software, it has begun shifting to software delivered through cloud-based applications.

Adallom helps customers “protect critical assets across cloud applications,” according to Microsoft vice president Takeshi Numoto’s blog post. “With more frequent and advanced cybersecurity attacks continuing to make headlines, customer concerns around security remain top of mind. These concerns pose real challenges for IT, who are charged with protecting company data in this rapidly evolving mobile-first, cloud-first world.” Adallom works with cloud-based applications such as Salesforce and Dropbox. The three-year-old startup’s technology will also be used in other Microsoft offerings, such as its Enterprise Mobility Suite and Advanced Threat Analytics.


The acquisition was announced the same day Microsoft revealed an expanded partnership with Dell that will broaden the range of Windows 10-powered devices and services available to businesses. At the centre of this push are Microsoft’s Surface Pro tablets and accessories, which Dell will begin selling in Canada and the United States through its online shops, with more markets to be added next year.

“Our global enterprise customers have asked us to match the Surface Pro 3 and Windows 10 experience with enterprise-grade support and services – and our partnerships like this one with Dell will do just that,” Microsoft chief Satya Nadella said in a release. The Dell partnership kicks off a “Surface Enterprise Initiative”, and Microsoft plans to release a new tablet for businesses.

The news came a day before an event hosted by Apple, which was expected to unveil a new tablet that could insert Apple into the Surface’s market.

Microsoft is set to increase the business capabilities of its Windows 10 operating system.


Public cloud generating more than $20bn quarterly for IT companies


A note released by Synergy Research shows the public cloud continues to make significant inroads into the overall IT market, generating over $20 billion (£12.9bn) in quarterly revenues for IT firms.

According to the research, public cloud operators provide $10bn in quarterly revenues for key technology and IT suppliers, while themselves generating $12bn in revenues from infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS).

Synergy classifies the leading vendors into two main categories. On the supply side, the leaders are HP, Cisco, Dell, IBM and Equinix, while for cloud services the market leaders are Amazon Web Services (AWS), Microsoft, Salesforce, Google and IBM. The research firm has tracked these categories separately for several years. In February, AWS hit a five-year high in cloud infrastructure market share, brushing off increasing competition from Microsoft, while earlier this month it was reported that HP had finally overtaken Cisco in cloud infrastructure equipment.

Digging into the revenue generators, hardware and software used to build cloud infrastructure contribute $7bn of spend, growing 26% year on year, while colocation and data centre leasing contribute $2.8bn at 9% growth. On the services side, SaaS contributes $6.6bn, compared with $5.5bn for cloud infrastructure services – IaaS, PaaS, public and hybrid.
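Those breakdowns line up with the headline figures. A quick consistency check, using the numbers as quoted:

# Consistency check of the quoted Synergy figures (all $bn per quarter).
supply_side = 7.0 + 2.8   # infrastructure hardware/software + colocation
service_side = 6.6 + 5.5  # SaaS + cloud infrastructure services
print(supply_side, service_side)  # 9.8 and 12.1, matching the ~$10bn
                                  # and ~$12bn totals cited above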

Synergy argues the public cloud market is characterised by big numbers, high growth rates – and a collection of large vendors at the top.

“While there is still a place for small to medium sized public cloud players, especially on the service side within a specific region, the public cloud really is dominated by hyperscale cloud operators that can afford to build huge data centre footprints that span multiple continents,” said John Dinsdale, chief analyst and research director.

A moral tale: The bank, the insurance company, and the ‘missing’ data


By Steve Davis, Marketing Director, NGD

Last week, a well-known insurance firm released a report berating business owners for not safeguarding themselves against cyber attack and for failing to take out sufficient cyber insurance cover. At the same time, another leading insurer was found to have lost a portable data storage device containing thousands of customer files belonging to a major bank.

As for the first insurance company: sure, cyber data loss insurance is a line of defence, but somewhat after the fact given the damage and chaos that follow such incidents. As for the other insurance company: if it and a bank between them can’t look after their data, who or what can?

Well, there are plenty of alternatives these days for storing and transferring data without carrying it around. How about secure networks connected to servers in high-security data centres?

For me, physical and digital security should go hand in hand: from prison-grade perimeter fencing, security guards, CCTV, infra-red detection and lockable rack cabinets, to the latest and most sophisticated anti-malware and antivirus software available. Clearly, even the largest organisations cannot consistently attain such rigorous levels of security on their own, and it is certainly out of reach for most small and medium-sized firms. Sheer cost and the ever-changing technology landscape see to that.

This is precisely why colocation data centres like NGD have been in the ascendancy for several years now. Organisations of all sizes choose to use them for one primary reason: to keep their data safe and continuously available.

Modern data centres have multiple levels of physical and digital security in place to deter and prevent all perpetrators, from opportunists hoping to walk in and ‘lift’ a server or storage device, through to highly organised and systematic cyber terrorists. Peace of mind is both available and affordable for all, from customers requiring just a quarter or half rack up to those running hundreds of racks.

Although there’s irony in last week’s revelations, there’s also a moral to the tale. No matter what, keep your data out of harm’s way at all times – or it may well come back and bite you! Prevention is far preferable to cure.