[session] Gaining IoT Insight | @ThingsExpo @MSCloud #IoT #M2M #API #RTC #InternetOfThings

As more and more data is generated from a variety of connected devices, the need to get insights from this data and predict future behavior and trends is increasingly essential for businesses. Real-time stream processing is needed in a variety of different industries such as Manufacturing, Oil and Gas, Automobile, Finance, Online Retail, Smart Grids, and Healthcare. Azure Stream Analytics is a fully managed distributed stream computation service that provides low latency, scalable processing of streaming data in the cloud with an enterprise grade SLA. It features built-in integration with Azure Machine Learning, enabling you to invoke machine learning modules over streaming data to perform anomaly detection, classification, and other predictive analytics in real time.
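
The article doesn't include any code, but as a rough, library-free illustration of the kind of real-time anomaly detection it describes, here is a minimal Python sketch using a rolling z-score over a stream of readings. The window size, threshold and sample stream are assumptions for the example; this is not the Azure Stream Analytics or Azure Machine Learning API.

```python
from collections import deque
from statistics import mean, stdev

# Illustrative rolling z-score detector: flag a reading that deviates more
# than `threshold` standard deviations from the recent window. Window size
# and threshold are arbitrary choices for this sketch.
def detect_anomalies(readings, window_size=50, threshold=3.0):
    window = deque(maxlen=window_size)
    for value in readings:
        if len(window) >= 10:                      # wait for a little history
            mu, sigma = mean(window), stdev(window)
            outlier = (value != mu) if sigma == 0 else abs(value - mu) > threshold * sigma
            if outlier:
                yield value, mu, sigma             # emit the outlier and its context
        window.append(value)

# Example: a steady temperature sensor with a single spike.
stream = [20.0] * 30 + [95.0] + [20.0] * 30
for value, mu, sigma in detect_anomalies(stream):
    print(f"anomaly: {value} (window mean {mu:.1f}, std dev {sigma:.1f})")
```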


Make your Sunday League team as ‘smart’ as Borussia Dortmund with IoT

IoT can help make your football team smarter

How, exactly, is IoT changing competitive sports? And how might you, reader, go about making your own modest Sunday League team as ‘smart’ as the likes of AC Milan, Borussia Dortmund and Brazil?

We asked Catapult, a world leader in the field that is responsible for connecting all three (as well as Premier League clubs including Tottenham, West Brom, Newcastle, West Ham and Norwich), exactly how the average sporting Joe could go about it. Here’s what the big teams are increasingly doing, in five easy steps.

Link-up play

The technology itself consists of a small wearable device that sits (a little cyborg-y) at the top of the spine under the uniform, measuring every aspect of an athlete’s movement using a GPS antenna and motion sensors. The measurements include acceleration, deceleration, change of direction and strength – as well as more basic things like speed, distance and heart rate.

Someone’s going to have to take a bit of time off work though! You’ll be looking at a one- or two-day installation on-site with the team, where a sports scientist would set you up with the software.

Nominate a number cruncher

All the raw data you’ll collect is then put through algorithms that provide position-specific and sport-specific data output to a laptop. Many of Catapult’s Premier League and NFL clients hire someone specifically to analyse the amassed data. Do any of your team-mates work in IT or accountancy?

Tackle number crunching

Now you’ve selected your data analyst, you’ll want to start them out on the simpler metrics. Everyone understands distance, for instance (probably the easiest way to gauge how hard an athlete has worked). From there you can look at speed. Combine the two and you’ll have a fuller picture of how much of a shift Dean and Dave have really put in (hangovers notwithstanding).

Beyond this, you can start looking at how quickly you and your team-mates accelerate (not very, probably), and the effect of deceleration on your intensity afterwards. Deceleration is usually the most damaging in terms of tissue injuries.
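
Catapult’s algorithms are proprietary, but the basic metrics above fall out of simple arithmetic on time-stamped position samples. Here is a minimal Python sketch; the one-second sample rate and flat-pitch coordinates are simplifying assumptions, and real devices fuse GPS and inertial data at far higher rates.

```python
import math

# Illustrative only: distance, speed and acceleration from time-stamped
# (x, y) position samples in metres.
def basic_metrics(samples):
    """samples: list of (t_seconds, x_metres, y_metres) tuples."""
    total_distance = 0.0
    speeds = []                                   # (time, metres per second)
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        total_distance += step
        speeds.append((t1, step / (t1 - t0)))
    accels = [(t1, (v1 - v0) / (t1 - t0))
              for (t0, v0), (t1, v1) in zip(speeds, speeds[1:])]
    return total_distance, speeds, accels

# Example: a short sprint sampled once per second.
samples = [(0, 0, 0), (1, 2, 0), (2, 6, 0), (3, 12, 0), (4, 16, 0)]
distance, speeds, accels = basic_metrics(samples)
print(f"distance: {distance:.0f} m, top speed: {max(v for _, v in speeds):.1f} m/s")
print(f"hardest deceleration: {min(a for _, a in accels):.1f} m/s^2")
```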

Higher still up the spectrum of metrics, you’ll encounter a patented algorithm called inertial movement analysis, used to capture ‘micro-movements’ and the like.

Pay up!

Don’t worry, you won’t have to actually buy all the gear (which could well mean your entire team re-mortgaging its homes): most of Catapult’s clients rent the devices…

However, you’ll still be looking at about £100 per unit/player per month, a fairly hefty additional outlay.

Surge up your Sunday League!

However, if you are all sufficiently well-heeled (not to mention obsessively competitive) to make that kind of investment, the benefits could be significant.

Florida State Football’s Jimbo Fisher recently credited the technology with reducing injuries by 88 per cent. It’s one of a number of similarly impressive success stories: reducing injuries is Catapult’s biggest selling point, meaning player shortages and hastily arranged stand-ins could be a thing of the past.

Of course if the costs sound a bit too steep, don’t worry: although the timescale is up in the air, Catapult is ultimately planning to head down the consumer route.

The day could yet come, in the not too distant future, when every team is smart!

How will the wearables market continue to change and evolve? Jim Harper (Director of Sales and Business Development, Bittium) will be leading a discussion on this very topic at this year’s Internet of Things World Europe (Maritim Pro Arte, Berlin, 6th – 7th October 2015).

Interoute opens Trans-Pacific network route between Hong Kong and Los Angeles

Interoute is expanding its fibre network, which will boost its cloud biz

Interoute has added two new independent networking routes between Los Angeles (LA) and Hong Kong to support what it claims is Europe’s biggest cloud service platform.

It described the additions as ‘the final step in creating a fully meshed global network’. With low-latency fibre connecting its territories, the company claims customers get faster access between the USA and Asia.

The trans-Pacific services are built on Interoute’s own private MPLS network. With complete ownership of its network, the service provider claims it can guarantee security. The option to choose between two distinct routes now gives it much higher levels of reliability, Interoute claimed.

Interoute has integrated its MPLS network with its cloud infrastructure platform Interoute Virtual Data Centre (VDC). The VDC, announced in November 2014, was created and run globally in order to simplify the process of running businesses in multiple markets.

The network expansion announced today follows the launch of Interoute IP points of presence (PoPs) and VDC zones in Los Angeles (LA1) and Hong Kong (Hong Kong2). It also follows Interoute’s recent opening of a new PoP in Singapore (Singapore3), in a bid to strengthen its position in one of the world’s biggest financial hubs.

“Our investment in new links between Asia and the USA signifies the next stage in the development of Interoute’s global networked cloud,” said Mark Lewis, Interoute’s communications and connectivity VP. “Customers wishing to expand across the globe need a network and services platform that supports their digital businesses.”

The new route goes live in September 2015.

Interoute’s estate now comprises 12 datacentres, 14 virtual datacentres, and 31 colocation centres, with connections to 195 additional third-party datacentres across Europe, where it owns and operates 24 dense city networks.

The new routes will help Interoute strengthen its offering beyond Europe, according to Lewis. “With the launch of these new connections, Interoute is delivering the network capacity and service platforms that enterprises need to grow across the Pacific and around the world.”

Ciber machine will convert Cobol into cloud-ready code

Legacy code is keeping enterprises from migrating to the cloud

Service provider Ciber claims it has solved one of the most expensive problems in business: upgrading legacy systems to make them secure and cloud friendly.

Its new system, Ciber Momentum, converts code from languages such as Cobol, Ada and Pascal into a more cloud-ready format. By automating the conversion of legacy code into a modern format, the Momentum system creates massive time and money savings on projects that can take up to three years if conducted using human resources, according to Ciber.

Gartner research estimates that companies spend 70 per cent of their IT budget on maintaining existing systems, Ciber claims, leaving only 30 per cent available for new projects. Part of the reason is that few of the programmers familiar with the languages used to create legacy applications are still available for work today.

This means that Cobol writers, for example, are three times as expensive to hire as modern developers and, with few hiring options, companies find it difficult to dictate terms.

Since the conversion of a trading system written in Cobol can take years, this is creating a crippling expense and leaving companies vulnerable to competition from cloud-based start-ups that can move much faster, Ciber claims. Legacy apps are not only inflexible, they are more likely to be a security liability, said Michael Boustridge, president and chief executive of Ciber.

“Most of the time companies get hacked, the criminals are exploiting vulnerabilities of an old system,” said Boustridge. “Legacy computers are not secure.”

Boustridge said Ciber intends to reverse the formula for the industry, so that CIOs will be able to spend 70 per cent of their budgets on new projects and only 30 per cent on maintenance.

The fast track to the cloud can only be 80 to 85 per cent automated, as some human checking and balancing will be necessary. However, Boustridge claimed that conversion project times will be halved.

The automated system will also uncover any anomalies in legacy coding. These logical inconsistencies were often created by programmers who were notorious for over-complicating systems in order to inflate their value to their employers, according to Boustridge. “Anything in the old code that doesn’t add up will be exposed,” said Boustridge.
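
Ciber has not published how Momentum works internally, so the following is only a toy Python sketch of what rule-based conversion plus flagging for human review can look like; the patterns, targets and sample Cobol lines are invented for illustration.

```python
import re

# Toy illustration of rule-based conversion: each rule maps a recognisable
# COBOL construct to a modern-language stand-in, and anything the rules
# cannot handle is flagged for the human review the article mentions.
RULES = [
    (re.compile(r"DISPLAY\s+'(.*)'"), lambda m: f'print("{m.group(1)}")'),
    (re.compile(r"MOVE\s+(\S+)\s+TO\s+(\S+)"),
     lambda m: f"{m.group(2).lower()} = {m.group(1).lower()}"),
]

def convert(source_lines):
    converted, needs_review = [], []
    for lineno, line in enumerate(source_lines, start=1):
        for pattern, rewrite in RULES:
            match = pattern.search(line)
            if match:
                converted.append(rewrite(match))
                break
        else:
            needs_review.append((lineno, line.strip()))   # leave for a human
    return converted, needs_review

cobol = ["DISPLAY 'HELLO'", "MOVE TOTAL-DUE TO WS-AMOUNT", "PERFORM UNTIL EOF"]
code, review = convert(cobol)
print(code)     # ['print("HELLO")', 'ws-amount = total-due']
print(review)   # [(3, 'PERFORM UNTIL EOF')]
```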

The system, now on global release, will be available for partners to white label and offer as part of their own client service.

Why Parallels RAS is a Great Choice for Micro ISV Businesses

Micro ISVs hold a significant share in the software market. A micro ISV can be defined as an independent software vendor with an office size of 1-10. In most cases, a Micro ISV office comprises a single person…who’s an all-out geek when it comes to what they do. (Just like us!) This single person creates the […]

The post Why Parallels RAS is a Great Choice for Micro ISV Businesses appeared first on Parallels Blog.

Are you sure your data centre is really as secure as you think it is?


Data centres are only as secure as the connectivity that links them to a network. Without that, they are prone to cyber-attacks or information security breaches, which can have catastrophic consequences for any organisation that needs to transmit and back up data between its own data centres. Network latency adds to the problem: slow or unreliable transmission to and from data centres, or to the outside world, increases the risks associated with these threats.

According to Clive Longbottom, analyst and Client Services Director at Quocirca, latency can lead to lost transactions whenever there is a failure in the connectivity, application or platform. High latency also makes real-time uses such as voice or video transmission impractical. But he thinks latency is only one part of the equation: there is also a complex mix of network, application and hardware considerations to bear in mind.

“Latency has two effects on security in data centres: the first issue is about how closely you can keep your data centres in synchronicity with each other, and when you are transmitting data you have got to have security keys,” explains David Trossell, CEO of Bridgeworks. So whether you are putting data into the cloud or into a multi-tenanted data centre, it’s crucial to be secure. In other words, no unauthorised person should have the ability to pry into the data itself.

Encrypt data

So underlying the protection of the data centre from an information security perspective is the need for enterprises to encrypt data whenever it is uploaded to the cloud or transmitted between data centres for back-up and retrieval. The encryption needs to occur while the data is at rest, before it is sent across a network. It’s also worth noting that encrypted data is at its most secure when a single party holds the keys to it.

“An AES 256 key offers the strongest encryption, but the problem is that a strong security key takes up more computing power, and yet with encrypted data you shouldn’t be able to perform any de-duplication, which looks for repeated patterns in the data,” says Trossell. To improve the ability to quickly transmit data, most organisations would traditionally opt for a wide area network (WAN) optimisation tool, where the encryption process occurs while the data is in transit using IPsec.
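
To make the idea of encrypting at rest concrete, here is a minimal Python sketch using AES-256-GCM from the widely used cryptography package. The file names and associated data are placeholders, and generating, storing and rotating the key is exactly the key-management problem discussed below.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Minimal sketch: encrypt a backup file with AES-256-GCM before it leaves
# the building. The key must live apart from the data, ideally on site and
# accessible to as few people as possible. File names are placeholders.
key = AESGCM.generate_key(bit_length=256)          # the 256-bit key discussed above
aesgcm = AESGCM(key)

with open("backup.tar", "rb") as f:
    plaintext = f.read()

nonce = os.urandom(12)                             # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, plaintext, b"backup-metadata")

with open("backup.tar.enc", "wb") as f:
    f.write(nonce + ciphertext)                    # this is what crosses the WAN

# Only a holder of the key can recover the data.
recovered = aesgcm.decrypt(nonce, ciphertext, b"backup-metadata")
assert recovered == plaintext
```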

Encrypting the data at rest makes it more secure. With traditional WAN optimisation, the keys would have to be handed to the WAN optimisation engine so it could decrypt and de-duplicate the data before sending it across the wide area network under the IPsec security protocol. The engine would then need to strip off IPsec before the data could be re-encrypted. This means you now have security keys in two places – and that can be the biggest security risk.

“For the highest levels of security, data should be encrypted before it hits storage,” says Longbottom. He adds: “This requires full-stream, speed-capable encryption, and yet this is often not feasible.” The next level, he says, is to store and then encrypt, deleting the unencrypted version afterwards: “This is encryption at rest and on the move, but too many organisations just go for encryption on the move, and so if someone can get to the storage media, then all of the information is there for them to access. What is typically overlooked is key management.”

Mind your keys

“If you lose the keys completely, no-one should be able to get at the information – even yourself; and if they are compromised, you at least have control over them as long as you are aware that you can do something about it,” he explains. However, if the keys are held by a third party, then he says it becomes a richer target for hackers, “as it will likely hold the keys for a group of companies rather than just one, and the speed of response from noticing the breach, to notification to the customer, to action being taken could be a lot longer.”

The trouble is that data which is only encrypted while in transit across the network is often not truly secure. “The issue here is that if you have a high-speed WAN link, then this will inhibit the movement of data and then you are not fulfilling your WAN optimisation,” comments Trossell. His colleague Claire Buchanan, CCO at Bridgeworks, adds: “You are impacting on your recovery time objective (RTO) and on your recovery point objective (RPO).” The RPO is the last point at which the data was backed up, and the RTO is how quickly the data can be retrieved and put back to work.

Gain control

“With encryption at rest the corporate is in full control and it is the sole owner of the key, but normally WAN optimisation tools simply pass the data through with no acceleration and in order to provide some level of security, traditional WAN optimisation tools provide an IPsec layer – but this is not anywhere close to the levels of security that many corporations require”, she explains.

To gain control she thinks that organisations need a new solution. In her view that solution is self-configuring and optimised networks (SCIONs), which provide a high level of security and enable organisations to significantly reduce latency. They use machine intelligence not only to self-configure, but also to self-manage and self-monitor any kind of network – particularly WANs. This makes any kind of transition to cloud-based infrastructure easier to achieve, while providing a secure way for organisations to maximise the utilisation of their infrastructure by up to 98%. SCIONs reduce the effects of latency too, going well beyond transactional flow processing and steady-state infrastructures.

Security compliance used to be comparatively light, but a number of threats have arisen that were not as prominent as they are now. “You have the internal problems, such as the one represented by Snowden, and with more powerful machines the lower encryption of 128 bit is far easier to crack than something with 256 bit encryption, which adds layers of complexity.” Trossell claims that nowadays there are more disgruntled employees than ever – Wikileaks being a case in point – but employees still have to have the keys before they can access the encrypted data.

Longbottom adds that it wasn’t long ago that 40-bit encryption was seen as sufficient: it required a low level of computing resources and in most cases was adequately hard to break. Increased resource availability has since made it breakable in a matter of minutes. “Therefore the move has been to 256 bit – AES, 3DES, Blowfish and so on,” he says, before adding that cloud computing provides a means for hackers to apply brute strength to try and break the keys.
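
To put rough numbers on that, here is a back-of-the-envelope Python calculation; the guess rate is an assumption purely for illustration, and real attacks usually target implementation flaws rather than exhaustive search.

```python
# Rough sense of why 40-bit keys stopped being adequate while 256-bit keys
# remain out of reach: the keyspace doubles with every added bit. The guess
# rate below is an assumption purely for illustration.
guesses_per_second = 10**9                    # assume one billion keys tried per second
for bits in (40, 128, 256):
    seconds = 2**bits / guesses_per_second
    years = seconds / (3600 * 24 * 365)
    print(f"{bits}-bit keyspace: ~{years:.3g} years to exhaust")
# 40-bit falls in under 20 minutes; 128-bit and 256-bit remain astronomical.
```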

The solution is to keep the keys on site, and to limit the number of people who have access to them. By doing this the data, and therefore the data centres, remain secure. “Previously organisations have had no choice but to simply move the encrypted data at a slow speed, and with traditional WAN optimisation it simply passes the data along the pipe without any acceleration,” says Buchanan. Corporations still think this is the only way to go, but that is no longer the case. Encryption is needed to keep data secure whenever it has to be moved between data centres or to the cloud – and it shouldn’t have to come at the cost of speed.

Speed with security

Buchanan adds that WANrockIT – the market leader in SCION solutions – can help organisations to improve the speed and security of this process: “With WANrockIT your encryption is just another block of data to us, accelerated just like any other data without it being touched – plus, if you are using encrypted data, the software has the ability to put IPsec on top so that you effectively get double encryption.”

One anonymous Bridgeworks customer, for example, tried to transfer a 32GB video file over a 500MB satellite link with 600ms of latency, and it took 20 hours to complete. With WANrockIT installed in just 11 minutes, the process took only 10 minutes to complete. Another customer could only do incremental back-ups of 50GB rather than nightly back-ups of 430GB – again the issue was latency, at 86ms. The back-ups took 12 hours on their OC12 pipes, but once WANrockIT was installed the 50GB back-ups were securely completed within 45 minutes. This allowed the full nightly back-ups to complete, and so the organisation could rest easy in the knowledge that its data was secure.
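
Those figures are roughly what simple TCP arithmetic predicts: a single stream can deliver at most one window of data per round trip, so on a 600ms link it is the latency, not the pipe, that sets the ceiling. A back-of-the-envelope Python check, where the 256 KB window size is an assumption for illustration:

```python
# Back-of-the-envelope check of the satellite example above: a single TCP
# stream delivers at most one window of data per round trip, so latency,
# not raw link speed, sets the ceiling. The 256 KB window is an assumption.
window_bytes = 256 * 1024          # assumed TCP window size
rtt_seconds = 0.6                  # the 600ms of latency quoted above
file_bytes = 32 * 10**9            # the 32GB video file

throughput = window_bytes / rtt_seconds            # bytes per second
hours = file_bytes / throughput / 3600

print(f"throughput ~{throughput / 1e6:.2f} MB/s, transfer ~{hours:.1f} hours")
# -> ~0.44 MB/s and roughly 20 hours, in line with the figure quoted above
```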

The security of an organisation’s data centre is therefore as much about its data as it is about how it prevents hacking and unplanned incidents that could stop it from operating. Leaving a data centre without the ability to back up quickly and securely means it is inherently insecure, as it won’t be able to respond whenever a disaster occurs.

So if your data centre relies on sending sensitive data across a network without securing it at rest – before it is transmitted to another data centre or to the cloud – then it is potentially putting itself at risk. With data loss and downtime costing the UK £10.5bn a year, according to the EMC Global Data Protection Index, is it worth the risk? To protect your data centre and to speed up data transfers, use a SCION solution such as WANrockIT that does it quickly and securely.

DataClear to Exhibit at @CloudExpo | #Cloud #DevOps #IoT #Microservices

SYS-CON Events announced today that DataClear Inc. will exhibit at the 17th International Cloud Expo®, which will take place on November 3–5, 2015, at the Santa Clara Convention Center in Santa Clara, CA.
The DataClear ‘BlackBox’ is the only solution that moves your PC, browsing and data out of the United States and away from prying (and spying) eyes. Its solution automatically builds you a clean, on-demand, virus-free, new virtual cloud-based PC outside of the United States, and wipes it clean, destroying it completely when you log out. If you wish to store your data, the solution will include a custom data encryption system with multi-factor authentication. DataClear also provides a corporate solution for multiple networked users and secure cloud-based servers.


Businesses want private cloud as revenue enabler, says IDC

Cisco says cloud – primarily private cloud – is the key to unlocking new business value

Cloud business is moving into a second wave of adoption, according to a global study commissioned by Cisco. Just over half (53 per cent) of the survey group said they expect cloud to raise their revenues in the next two years – with almost as many (44 per cent) identifying private cloud as their chosen enabler.

The lack of private cloud options could be handicapping cloud business, analyst IDC reports, in its Cisco-sponsored Infobrief, “Don’t Get Left Behind: The Business Benefits of Achieving Greater Cloud Adoption”. Only one per cent of organizations claimed to have optimized cloud strategies in place and 32 per cent admitted they have no cloud strategy at all.

Cisco’s customers would be more interested in the second wave of cloud if it resolved their concerns about security, performance, price, control and data protection, according to Nick Earle, its vice president for global cloud and managed services sales. These customer sentiments seem to be reflected in the IDC study, Earle said, and the interest in private and hybrid clouds would seem to confirm it, with 64 per cent of cloud adopters reportedly considering hybrid cloud.

“Our strategy to build private and hybrid infrastructure is reflected in the new IDC study,” said Earle.

The study identifies five levels of cloud maturity: ad hoc, opportunistic, repeatable, managed and optimized. As the cloud strategy of organizations matures, moving from the lowest level ad hoc clouds to fully developed optimized clouds, ‘dramatic’ business benefits materialise, Cisco contends. It quantifies these benefits as revenue growth of 10.4 per cent, IT cost cutting at 77 per cent, a 99 per cent reduction in the time to lay on IT services and applications, a 72 per cent improvement in meeting service level agreements and a doubling of the IT department’s capacity to invest in new projects.

On a macroeconomic level, the study estimated that ‘mature’ cloud organizations gain an average of $1.6 million in additional revenue for every application run on private or public cloud. They also cut the cost per application by $1.2 million by running them in the cloud.

Cisco said that private cloud will improve resource use, allow projects to run at greater scale and will give faster response times, while providing more control and security.

Though concerns about the complexities of hybrid cloud adoption – workload portability, security, and policy enablement – were reflected in the study, up to 70 per cent of respondents expect to migrate data between public and private clouds or among multiple cloud providers.

Three Steps to Enable Rock Solid Cloud Security By @IanKhanLive | @CloudExpo #Cloud

Cloud security is at the top of every CIO’s list. It is also the first subject that comes up when you engage in a discussion about the cloud. Those of us who followed the recent Ashley Madison story (from a tech perspective) would agree that, while the breach happened for many reasons, security is at the heart of it. Here are some key aspects of building a solid organization with cloud security in mind.
