JavaScript and HTML5 are evolving to support a genuine component-based framework (Web Components), along with the tools needed to deliver something close to a native experience, including genuinely real-time networking (UDP via WebRTC). HTML5 is gaining built-in templating support, the ability to observe objects (which will speed up Angular) and Web Components (which offer functionality similar to Angular directives). Native support for these features will be a huge performance boost for frameworks such as Polymer and Angular, which currently have to emulate them, and it should also encourage people who are unfamiliar with these next-generation frameworks to get involved. Coming from a gaming background, I have always complained that TCP (WebSockets) is not genuinely real-time, so I look forward to seeing UDP (WebRTC) solutions being delivered, such as desktop sharing in Chrome 34.
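As a rough illustration of the native features mentioned above, here is a minimal sketch of a Web Component built with the Custom Elements and template APIs, plus a WebRTC data channel configured for unordered, no-retransmit (UDP-style) delivery. The ScoreBadge element and the 'game-state' channel are purely illustrative, and a real application would still need signalling to bring the peer connection up.

```typescript
// Minimal sketch: a native Web Component using the Custom Elements and
// <template> APIs, plus a WebRTC data channel configured for UDP-like
// (unordered, lossy) delivery. Element and channel names are illustrative.

// A custom element that stamps out a native <template> -- no framework shim.
class ScoreBadge extends HTMLElement {
  connectedCallback(): void {
    const tpl = document.createElement("template");
    tpl.innerHTML = `<span class="score">0</span>`;
    // Shadow DOM keeps the component's markup and styles encapsulated.
    this.attachShadow({ mode: "open" }).appendChild(tpl.content.cloneNode(true));
  }
}
customElements.define("score-badge", ScoreBadge);

// A WebRTC data channel tuned for real-time game traffic: unordered and
// with no retransmissions, trading reliability for latency (UDP-style),
// unlike the ordered, reliable stream a WebSocket (TCP) provides.
const peer = new RTCPeerConnection();
const channel = peer.createDataChannel("game-state", {
  ordered: false,       // packets may arrive out of order
  maxRetransmits: 0,    // never retransmit -- drop lost packets instead
});
channel.onopen = () => channel.send(JSON.stringify({ x: 10, y: 20 }));
```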
Monthly Archives: February 2016
IoT Open Frameworks By @AllSeenAlliance | @ThingsExpo #IoT #M2M #InternetOfThings
In his session at @ThingsExpo, Noah Harlan, Founder of Two Bulls and President of AllSeen Alliance, will discuss why open source frameworks are vital for the future of IoT.
Noah Harlan is President of AllSeen Alliance and a Founder of Two Bulls, a leading mobile software development company with offices in New York, Berlin, and Melbourne. He is also Managing Director of Digital Strategy for Sullivan NYC, a brand engagement firm based in New York. He has served as an advisor for the White House Office of Science & Technology Policy on gaming and the outdoors, has appeared as a commentator on Bloomberg TV discussing mobile technology, and has an Emmy Award for Advanced Media Interactivity.
Smart Monitoring, Automation, and Open Source at Yahoo By @madgreek65 | @CloudExpo #Cloud
Our guest on the podcast this week is Preeti Somal, VP of Engineering at Yahoo. We discuss the cloud challenges that come with supporting over a billion consumers per day at Yahoo. One way the Yahoo team simplifies this is through automation and smarter monitoring. Yahoo also leverages open source projects such as Hadoop and OpenStack, which helps onboard employees through their many mergers and acquisitions.
The 1-2 Punch of Big Data Processing in the Cloud By @madgreek65 | @CloudExpo #Cloud #BigData
Our guest on the podcast this week is Jason Parsons, Senior Architect and Big Data expert at Cloud Technology Partners. We discuss the benefits of real-time, big data processing in the cloud and why some companies are hesitant to migrate from their traditional data centers. Jason explains why planning is key when moving from on-premises data centers and why all companies should begin their cloud initiatives with a minimum viable product before going into full production.
Mindshift Change – Seven Ways to Think Different in the Cloud By @madgreek65 | @CloudExpo #Cloud
Success in the public cloud requires a complete shift in mindset. Are you thinking about cloud the right way?
Most large enterprises have spent millions of dollars on their public cloud initiatives over the past few years but have made little progress towards achieving their goals of increased agility and reduced operating costs. The main reason is simple: they approached cloud as if it were a datacenter. In order to achieve their lofty goals, enterprises need a complete shift in mindset.
EMC claims it can make data centres All Flash and no downtime
As EMC prepares for its takeover by Dell, it claims it has made ‘significant changes’ to its storage portfolio, converting its primary offering to All Flash, modernising array pricing and introducing a new category of flash storage, DSSD D5, at its Quantum Leap event.
EMC’s flagship VMAX All Flash enterprise data services platform and its new DSSD D5 rack-scale flash system are part of a new drive to persuade data centres to use flash technology as their primary storage medium. The vendor claims that by 2020 all storage used for production applications will be flash-based, with traditional disk relegated to the role of bulk storage and archiving.
The new all-flash portfolio will be used by databases, analytics, server virtual machines and virtual desktop infrastructures, says EMC, which predicts that the need for predictable performance with sub-millisecond latencies will persuade data centres to make the extra investment. EMC’s new XtremIO is designed for high-end enterprise workloads, while VMAX All Flash will consolidate mixed block and file workloads that require up to 99.9999% availability, as well as rich data services, IBM mainframe and iSeries support and scalable storage growth of up to four petabytes (PB) of capacity.
The DSSD D5 Rack-Scale Flash, meanwhile, is aimed at the most performance-intensive traditional and next-generation use cases, such as achieving microsecond response times on Oracle and Hadoop-based analytics jobs. The new VNX Series arrays, in turn, represent an entry-level all-flash offering which starts at $25,000.
EMC announced that the VMAX array has been re-engineered to offer two new all-flash models: the EMC VMAX 450 and EMC VMAX 850. Both are designed to capitalise on the performance of flash and the economics of today’s latest large-capacity SSDs.
Finally, EMC also announced the DSSD D5 which, it claimed, represents a quantum leap in storage technology with its Rack-Scale Flash design. EMC said the new system will be used in demanding production applications such as genetic sequencing calculations, fraud detection, credit card authorisation and advanced analytics.
EMC claims it will deliver a tenfold surge in performance. The storage hardware is capable of 100-microsecond latency, 100 GB/s throughput and up to 10 million IOPS in a 5U system. EMC DSSD D5 will be generally available in March 2016.
EC clears acquisition of EMC by Dell – won’t distort competition
The European Commission has approved the acquisition of storage and software giant EMC by PC and server maker Dell.
In a statement Commissioner Margrethe Vestager declared that the deal meets the criteria of the EU’s Merger Regulation. The strategic importance of the data storage sector meant that the EC was able to approve Dell’s multi-billion dollar takeover of EMC within a short space of time, according to Vestager, who thanked the Federal Trade Commission for close cooperation.
The Commission assessed the effects of the transaction on the market for external enterprise storage systems. The Commission also investigated the risk that the merged entity could attempt to restrict access to VMware’s software for competing hardware vendors. The Commission is convinced there will be no adverse effects on customers, according to Vestager.
The Commission found that the merged entity has a moderate market share in the market for external enterprise storage systems and the increment brought about by the merger is small. The new Dell/EMC entity will continue to face strong competition from established players, such as Hitachi, HP, IBM and NetApp, as well as from new entrants, it said.
Despite VMware’s ‘strong market position’ in server virtualization software, the available evidence led the EC investigators to conclude that the merged entity would have neither the ability nor the incentive to shut out competitors. The likes of Citrix, Microsoft and Red Hat can give it plenty of competition in the server virtualisation market, the EC has judged, and it predicted that the EMC/Dell hybrid won’t have things its own way in new technology markets.
Since customers typically multi-source from more than one server virtualization software provider and VMware’s approach has traditionally been hardware and software-neutral, it offers work opportunities to a large number of vendors. Equally, in the server market, Dell has strong competitors that will continue to operate either in partnership with VMware or with third party virtualisation software providers.
The combination of Dell’s and EMC’s external enterprise storage systems products won’t have an impact on competition given the number of alternatives to VMware’s software.
The Commission also asked whether the merged entity could shut competitors out from the virtualization software used for converged and hyper-converged infrastructure systems. Here it also found there were no concerns raised. The merger, when first reported in BCN in October 2015, was valued at $60 billion.
Most data in the cloud is exposed says Thales/Ponemon study
A new study into encryption and key management suggests that nearly all of the companies in the world are planning to make a potentially fatal security mistake.
If the new global study is an accurate gauge of global trends, 84% of companies across the world are about to commit sensitive data to the cloud by 2018. However, only 37% of the same survey sample has an encryption plan or strategy.
With consultant PwC recently declaring that cloud computing is attracting the attention of the world’s cyber criminals and fuelling a mini-boom in hacking attacks, the lack of data encryption could prove fatally negligent.
The Global Encryption Trends report, commissioned by Thales Security and conducted by IT security think-tank Ponemon, revealed that though the use of encryption is increasing, the security industry isn’t keeping pace with its criminal opponents. In the study Ponemon interviewed 5,009 individuals across multiple industry sectors in 11 of the world’s top economies, including the US, the UK, Germany, France, Brazil and Japan. If that survey is an accurate reflection of the global state of security of the cloud, there are some worrying trends, according to Thales.
While use of encryption is on the up, with nearly three times more organisations classifying themselves as extensive users than a decade ago, there is ‘still some way to go’, according to Thales. In 2005 a Thales study found that 16% of its global survey sample used encryption; by 2015 the proportion of encryption users had risen to 41% of those surveyed. That still means only a minority of companies around the world are applying a baseline level of cyber security, according to John Grimm, Senior Director at Thales e-Security. To make matters worse, over the same period the cyber crime industry has been far more agile and fast-moving.
Other findings were that 40% of cloud data at rest is unprotected and 57% of companies don’t even know where their sensitive data resides. Sensitive data discovery ranked as the top challenge to planning and executing an encryption strategy, according to researchers.
Support for both cloud and on-premises deployment was rated the most important attribute of an encryption solution, and 58% of companies said they leave their cloud provider responsible for protecting sensitive data transferred in the cloud.
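As a concrete illustration of taking ownership of at-rest protection rather than leaving it entirely to the cloud provider, the following is a minimal sketch of encrypting a payload client-side before it is handed to any storage API. It is an illustration only: the AES-256-GCM choice, the in-memory key and the helper names are assumptions, and a production setup would source keys from a key management service rather than generating them locally.

```typescript
// Minimal sketch: encrypting a payload client-side with AES-256-GCM before
// it is handed to any cloud storage API, so the provider only ever sees
// ciphertext. Key handling (KEY below) would normally sit in a KMS/HSM;
// everything here is illustrative rather than a reference implementation.
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

const KEY = randomBytes(32); // in practice: fetched from a key management service

function encrypt(plaintext: Buffer): Buffer {
  const iv = randomBytes(12); // unique nonce per object
  const cipher = createCipheriv("aes-256-gcm", KEY, iv);
  const body = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  // Store nonce + auth tag alongside the ciphertext so it can be verified later.
  return Buffer.concat([iv, cipher.getAuthTag(), body]);
}

function decrypt(blob: Buffer): Buffer {
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(12, 28);
  const body = blob.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", KEY, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(body), decipher.final()]);
}

// Usage: encrypt before upload, decrypt after download.
const sealed = encrypt(Buffer.from("customer record"));
console.log(decrypt(sealed).toString()); // "customer record"
```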
The EU General Data Protection Regulation: Prepare for change
The EU’s wide-ranging rules on General Data Protection Regulation (GDPR) are set to significantly impact all businesses, whether in the EU or further abroad. From 2018, any organisation that collects, uses or shares personal information about European citizens will have to demonstrate compliance with the hotly contested law. This includes using various techniques to ensure that the protection of data is built into the design and infrastructure of an organisation by default.
Securing the data deluge
In an always-on, hyper-connected world, most people think of their data in terms of the live systems that hold their information. In reality, that is just the tip of the iceberg. Data is in fact being copied over and over – for development, testing, quality assurance, training, financial reporting, business intelligence and more. In addition, data is often accessed by third parties, contractors and consultants in other locations or countries, frequently with nothing more than a username and password securing that access.
However, the EU GDPR will change all this. Moving forward, a software developer will need to be as security conscious as a database administrator – a concept that is likely to be foreign to them. Whilst there is already some level of data protection and ownership within organisations, the new regulations will spark an increase in the education, training and tools required to prove compliance with what is and isn’t allowed.
Introducing the carrot and stick
The new regulation cautions that any personal information needs to be “pseudonymised” so that the person is no longer identifiable, essentially introducing a ‘carrot and stick’ approach: the carrot recommends pseudonymisation at specific points and reduces certain obligations on businesses that follow this approach; the stick is the threat of penalties for businesses that are non-compliant.
For many enterprises, this will mean that they need to re-architect operations to accommodate a data-first approach.
Currently, Delphix estimates 90% of data resides as copies in development, testing and reporting shared environments. The first step will be understanding where all the data sits in both production and non-production. The second step will require technology that has the ability to scale and protect all data, not just those bits of information that are the most sensitive.
This will require an investment in new technologies, for example data masking, that can pseudonymise data once and ensure all subsequent copies have the same masking policies applied. However, in the event of a data breach, the cost of these investments is likely to pale into insignificance when compared to potential fines of up to 4 per cent of global turnover.
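To illustrate the kind of consistent masking described above, here is a minimal sketch of deterministic pseudonymisation using a keyed HMAC, so that every copy of a dataset masks the same input to the same token. The field names, the CustomerRecord shape and the MASKING_KEY handling are assumptions for the example, not a description of Delphix or any particular masking product.

```typescript
// Minimal sketch of deterministic pseudonymisation with a keyed HMAC.
// The same input always maps to the same token, so masked copies used in
// development, testing and reporting stay consistent with one another,
// while the original value cannot be recovered without the secret key.
// Field names and key handling here are illustrative only.
import { createHmac } from "crypto";

const MASKING_KEY = process.env.MASKING_KEY ?? "replace-with-a-managed-secret";

function pseudonymise(value: string): string {
  return createHmac("sha256", MASKING_KEY).update(value).digest("hex").slice(0, 16);
}

interface CustomerRecord {
  email: string;
  postcode: string;
  lastOrderTotal: number;
}

// Mask the identifying fields before the record is copied into a
// non-production environment; non-identifying fields pass through.
function maskRecord(record: CustomerRecord): CustomerRecord {
  return {
    ...record,
    email: pseudonymise(record.email),
    postcode: pseudonymise(record.postcode),
  };
}

console.log(maskRecord({
  email: "jane@example.com",
  postcode: "EC1A 1BB",
  lastOrderTotal: 42.5,
}));
```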
Conclusion
Given the ever-increasing occurrence and severity of data breaches, it’s becoming more and more important that customers feel their vendors are advocating a data-first approach. This means setting a data protection strategy that covers the entire organisation and reduces the risks to any individuals that are victim of a data breach.
By advocating the adoption of precautionary measures, the EU GDPR goes some way toward ensuring personal information is protected or rendered useless for any successful cyber-criminals that breach fortress walls. It also better protects organisations from the fallout that any data breach has on their finances, resources or reputation.
Big data and IoT expected to add more than £300bn to UK economy by 2020
Big data analytics and the Internet of Things (IoT) are expected to add £322 billion to the UK economy by 2020, according to research released by the Centre for Economics and Business Research (Cebr) and analytics provider SAS.
The research, which builds upon a previous report from Cebr on big data analytics adoption, argues big data and analytics will contribute £40bn per year between 2015 and 2020. Back in 2012, Cebr put the value of ‘data equity’ at 0.7% of the UK’s gross domestic product, a figure which is expected to rise to 2.2% by 2020. The IoT is expected to contribute £81bn by 2020, equivalent to 0.7% of GDP. Overall, the market for the two by 2020 is expected to be twice the size of the combined education, NHS and defence budgets for 2015.
“Collecting and storing data is only the beginning,” said Cebr CEO Graham Brough, adding: “It is the application of analytics that allows the UK to harness the benefits of big data and the IoT. Our research finds that the majority of firms have implemented between one and three big data analytics solutions.”
“The combined benefits of IoT and big data will fuel our economy like nothing else,” said Mark Wilkinson, SAS regional vice president. “This report illustrates the considerable impact over the coming years of more organisations embracing big data and IoT to improve decision making that affects efficiency, risk management and new business opportunities.”
The telecoms industry currently has the highest rate of big data and IoT adoption, at 67% and 61% respectively, yet healthcare is expected to struggle in both big data analytics (52%) and IoT (26%).
Other research has examined how IoT will infiltrate workplace habits. A report from Webroot and IO argued that more than half of UK businesses polled plan to employ a ‘chief IoT officer’ in the coming year.