Monthly Archives: April 2016
What did we learn from Verizon’s IoT report?
Verizon has recently released its State of the Market: Internet of Things 2016 report, which outlines the growth and potential of the IoT industry. The report features a number of use cases and details the technology’s rise to prominence, as well as the barriers enterprise organizations will face over the coming months.
Here, we’ve detailed a few of the lessons learnt from the report:
IoT is no longer a nice idea, it’s becoming mainstream
If 2015 was the year IoT gained credibility in the business world, 2016 is the year IoT gains commercial feasibility.
While the concept of IoT has been around for some time, the idea of the technology having a B2B commercial backbone capable of delivering on commercial objectives is now a reality. The potential of IoT has been well discussed, but now we are seeing companies delivering on the promise. IBM is one that has been particularly active in this segment, driving the Watson use case through the press repeatedly this year.
Wearables had a head start on B2B applications, and could be to thank for the relative ease of acceptance within the industry (for both enterprises and consumers). The marketing campaigns surrounding the earliest fitness wearables and smartwatches normalized IoT, paving the way for the B2B sphere. Thanks to these comparatively simple applications, the integration of IoT into manufacturing, healthcare, transportation, utilities, smart cities and almost any other context you can think of has been a seemingly smooth transition.
According to Verizon’s research, IoT network connections have been growing healthily: the number of connections in the utilities industry grew 58% between 2014 and 2015. This is backed up by IDC forecasts, which estimate that IoT market spend will increase from $591.7 billion in 2014 to $1.3 trillion in 2019.
IoT might be entering mainstream, but data could hold it back
Data acquisition, analysis and action might be becoming one of the most repetitive conversations in the industry, but that is for good reason.
A report Verizon recently commissioned from Oxford Economics highlighted that only 8% of businesses are using more than 25% of the IoT data they have collected. In fact, only 50% of the businesses involved in the study said they would be using more than 25% of their collected IoT data in three years’ time.
On the surface, this might not seem like an issue that would cause too many problems, until you take into account the long-term deliverables of IoT. The promise of IoT is the collection of vast quantities of data that allow advanced analytics tools to make accurate predictions and customizations. If only a fraction of the data is being analysed, only a fraction of the promise can be realized.
IoT has hit the mainstream market, but it will never reach its promised deliverables if companies do not analyse more of the data they collect. What is the point in spending millions on sensors, connections, storage and data scientists if the full potential of the technology cannot be achieved? Can the long-term financial security of the IoT industry be guaranteed if the promise is never fully realized?
There could be a number of reasons for the backlog of data, though industry insiders have told BCN the interface required to translate different data sets into a common language for analysis could be one of the reasons for the holdup. It would appear not all of the IoT value chain has evolved at the same pace.
Regulators will have to play a more significant role in the future
Regulation does and will play a major role in the delivery and adoption of IoT. Back in 2007 the Energy Act in the US accelerated the role of IoT in the monitoring of energy consumption, and while this could be considered the initial catalyst, growth has increased year on year ever since.
While this is an instance of regulation giving the IoT industry freedom to grow, it should not be seen as a surprise if regulators put in place rulings which could limit what the industry can and cannot do. Whether it is the ethical use of data, volumes of data which can be collected on a single person or the means in which and where the data is stored, regulation is likely to play a more significant role in coming years.
The report also discusses the security of IoT, a constant barrier for businesses and individuals alike. New regulations are likely to severely punish instances of data loss, and when you consider the sheer volume of data involved should IoT reach its potential, future instances of data loss could be disastrous.
Currently regulation within the IoT market is relatively low-key, encouraging growth of the technology as opposed to monitoring it, however there are a number of areas which need consideration in the short- to mid-term future. Lack of control and information asymmetry, low-quality consent, intrusive identification of behaviour patterns and user profiling and limitations on the possibility of remaining anonymous whilst using services are all areas which should be taken into consideration.
Maintel announces acquisition of Azzurri Communications
Systems integrator Maintel announced it has entered into a conditional agreement to acquire Azzurri Communications.
The acquisition of Azzurri Communications, which offers communication services including telephony, mobile services and document management, is part of a larger push to strengthen Maintel’s market position. The company highlighted that the deal is a strong component in its strategy to grow and diversify its revenue base.
“Azzurri Communications is a highly respected business with a complementary product offering and target market, which will provide enhanced scale and visibility for the combined group,” said Eddie Buxton, CEO at Maintel. “This acquisition will accelerate Maintel’s shift into hosted cloud and data, ensuring we are well positioned to take advantage of these high growth areas of the unified communications market. It will also build scale in managed services, continuing the shift in our business mix, which we have been driving following previous acquisitions.”
The acquisition will broaden Maintel’s offering to include a network services division, a mobile division, managed services, and technology and professional services. “With the acquisition of Azzurri, Maintel will also gain a new set of highly skilled and professional team members. We are looking forward to welcoming Azzurri employees to the group,” said Buxton.
Alongside the announcement, Maintel also reported a 21% increase in revenues for 2015, up to £50.6m. As part of its growth strategy, the company also acquired Datapoint and Proximity Communications in recent months.
“We are really pleased at the prospect of joining Maintel because this enables the combined business to offer its customers a broader range of services,” said Chris Jagusz, CEO of Azzurri Communications. “Our employees will benefit too by being part of one of the most significant players in our market.”
Duo Security and Teneo introduce new authentication system for employee mobility
Duo Security and Teneo have teamed up to deliver cloud-based two-factor authentication that simplifies employees’ access to work networks through their smartphones.
The new system will enable businesses to deploy one-tap authentication via smartphones rather than using separate ID key fobs. Teneo will provide the Duo cloud solution to customer organisations worldwide as a managed service, with employees simply downloading the Duo Security mobile app to their smartphones.
“Duo is an easy step to securing corporate access across all users, in any environment,” said Henry Seddon, VP EMEA at Duo Security. “Easy and effective solutions are key to ensuring trusted access across an entire organisation.”
Duo Security’s two-factor authentication solution works across a wide variety of PCs, Macs, laptops and mobile devices as well as Apple iOS, Google Android and Blackberry operating systems, providing a more flexible two-factor network authentication system.
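Duo’s one-tap push flow is proprietary, but the second factor it replaces in the key-fob scenario is typically a one-time password generated on the device. As a rough illustration of how such smartphone-generated codes work in general, here is a minimal sketch of the standard HOTP/TOTP algorithms (RFC 4226 / RFC 6238) in Python; this is a generic example, not Duo’s actual implementation, and the secret used in the comment is the RFC test key.

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    # HMAC-SHA1 over the 8-byte big-endian counter
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte window
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): HOTP over the time step."""
    return hotp(secret, int(time.time()) // period, digits)

# With the RFC 4226 test secret b"12345678901234567890",
# counter 0 yields "755224" and counter 1 yields "287082".
```

The server holds the same shared secret and accepts a code if it matches the current (or an adjacent) time step, which is what lets a phone app replace a dedicated hardware token.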
“Duo Security ties in with Teneo’s ethos of bringing to market smarter software offerings that make business-critical tasks like security simpler and more intuitive for IT teams and employees alike,” said Marc Sollars, CTO of Teneo. “Even now, many data security set-ups are difficult and represent a kind of rules-based drag on workplace productivity. Duo Security gives forward-thinking customers a simple way to make network access much easier and beef up their overall network security. This ‘one tap’ authentication will become crucial as today’s businesses become ever-more dependent on mobile devices and applications to compete.”
Recent research has highlighted that security continues to be an issue for the enterprise, as employees appear indifferent to security protocols. As employees themselves are seemingly one of the greatest threats to the organization, making security standards as simple as possible would appear to be a sensible strategy for shoring up an organization’s perimeter.
Announcing @DcDnews “Media Sponsor” | @CloudExpo #IoT #DataCenter
SYS-CON Events announced today that DatacenterDynamics has been named “Media Sponsor” of SYS-CON’s 18th International Cloud Expo, which will take place on June 7–9, 2016, at the Javits Center in New York City, NY.
DatacenterDynamics is a brand of DCD Group, a global B2B media and publishing company that develops products to help senior professionals in the world’s most ICT dependent organizations make risk-based infrastructure and capacity decisions.
The Textbook Definition of #ContinuousDelivery | @DevOpsSummit #DevOps
To paraphrase Kent Beck: software delivers no value apart from runtime. Ideas take physical form in hardware, virtualized only part of the way down, and someone other than the developers makes the ideas manifest. So then: ops folks are the crop’s caretakers; developers design the seeds.
Well, of course this isn’t true. Developers don’t turn pie-in-the-sky math into socially constructive math that sysadmins then make useful. No: developers write code that makes things happen. We care about runtime just as much as ops, so we need to know how our software is helping people in reality—and how it can do a better job—in order for us to refine both the ideas and the implementation.
So cycle-time shortening—the metric of continuous delivery, and one of the Twelve Principles of Agile Software—is equally about users and makers. Working software delivered sooner is better for users than equally working software delivered later. And informative feedback gleaned more quickly from real-world use is better for developers than equally informative feedback gathered more slowly. Your customers’ problem space measures your software’s value better than your requirements specification ever could. Calibrate against that space as often as you can.
You Need DevOps By @DMacVittie | @DevOpsSummit @StackIQ #DevOps
For those unfamiliar: as a developer working in marketing for an infrastructure automation company, I have tried to clarify the different versions of DevOps by capitalizing the part that benefits in a given DevOps scenario. In this case we’re talking about operations improvements. While devs – particularly those involved in automation or DevOps – will find it interesting, it really speaks to the growing issues Operations teams are finding.
The problem is right in front of us, we’re confronting it every day, and yet a ton of us aren’t fixing it for our organizations; we’re merely kicking the can down the road.
[session] Bring Added Value to Your Software Product By @MobiDev_ | @CloudExpo #Cloud
Every successful software product evolves from an idea to an enterprise system. Notably, the product owner’s company follows the same path.
In his session at 18th Cloud Expo, Oleksii Ostroverkhyi, CCO at MobiDev, will provide a generalized overview of the evolution of a software product, the product owner, the needs that arise at various stages of this process, and the value brought by a software development partner to the product owner as a response to these needs.
Pfizer utilizes IBM Watson for Parkinson’s research
IBM and Pfizer have announced a research collaboration with the intention of improving how clinicians deliver care to Parkinson’s patients.
The collaboration will be built on a system of sensors, mobile devices, and IBM Watson’s machine learning capabilities, providing real-time disease symptom information to clinicians and researchers. The team aims to gain a better understanding of how the disease progresses and how patients react to certain medications, in order to design future clinical trials and speed up the development of new therapies.
“We have an opportunity to potentially redefine how we think about patient outcomes and 24/7 monitoring, by combining Pfizer’s scientific, medical and regulatory expertise with IBM’s ability to integrate and interpret complex data in innovative ways,” said Mikael Dolsten, President of Pfizer Worldwide R&D.
According to the World Health Organization, neurological diseases such as Parkinson’s affect almost one billion families around the world. Approximately 60,000 Americans are diagnosed with Parkinson’s disease each year, according to the Parkinson’s Disease Foundation, and an estimated seven to 10 million people suffer from the disease globally.
“The key to our success will be to deliver a reliable, scalable system of measurement and analysis that would help inform our clinical programs across important areas of unmet medical need, potentially accelerating the drug development and regulatory approval processes and helping us to get better therapies to patients, faster,” said Dolsten.
The collaboration seeks to create a holistic view of a patient’s well-being by seeking to accurately measure a variety of health indicators. Data generated through the system could also arm researchers with the insights and real-world evidence needed to help accelerate potential new and better therapies.
“With the proliferation of digital health information, one area that remains elusive is the collection of real-time physiological data to support disease management,” said Arvind Krishna, SVP at IBM Research. “We are testing ways to create a system that passively collects data with little to no burden on the patient, and to provide doctors and researchers with objective, real-time insights that we believe could fundamentally change the way patients are monitored and treated.”
[session] Configuring Database as a Service with SoftLayer and Bluemix By @SoftLayer | @CloudExpo @IBMBluemix #Cloud
Following the notion of “The cloud” as a model and not a place, learn how to extend your SoftLayer infrastructure to utilize the PaaS offerings of Bluemix. In his session at 18th Cloud Expo, Ryan Tiffany, a Sales Engineer at SoftLayer, an IBM Company, will utilize both the command line and GUI portals and show you how to order a SoftLayer server and configure a front end application to use the Database as a Service offering from Bluemix.