Lifeboat to Distribute AVG Business Managed Workplace | @CloudExpo @LifeboatVAD #SaaS #Cloud #Storage

Lifeboat Distribution has expanded its distribution agreement with AVG Technologies, allowing distribution into the North American, Latin American and Caribbean marketplaces. AVG Technologies began its partnership with Lifeboat Distribution in 2010.
“Lifeboat Distribution shares our commitment to maximizing opportunities for our channel ecosystem while, at the same time, effectively meeting the security needs of today’s businesses,” said Fred Gerritse, General Manager, AVG Business. “As a leading specialty distributor in security, the company brings a proven track record of providing our AVG Business security portfolio. We are now taking our partnership to the next level to simplify onboarding and implementation of managed security services through expansion of our AVG Managed Workplace solution. We look forward to creating a win-win scenario for our channel partners and their business customers.”


Cloud Computing vs Hyperconvergence | @CloudExpo #SDS #Cloud #Storage

The next generation of platforms is here, with options in both cloud and on-premises hyperconverged infrastructure. How do real people in today’s business world make the right choice about where to move?
As IT departments look to move beyond traditional virtualization into cloud and hyperconverged infrastructure (HCI) platforms, they have a lot to consider. There are many types of organizations with different IT needs, and it is important to determine whether those needs align more with cloud or with HCI. Before I dig into the differences, let me go over the similarities.


Research argues overconfidence in disaster recovery is ‘common and costly’


A new UK study from cloud disaster recovery provider iland has found that 95% of respondents have faced an outage or data loss in the past year – with 87% of that number saying it triggered a failover.

The survey, conducted by Opinion Matters, polled 250 UK decision makers responsible for their company’s IT disaster recovery plans. Of the 87% who had executed a failover, 82% said they were confident it would be successful, yet 55% encountered problems.

The majority of ‘disasters’, as noted by respondents, were more likely to be mundane system failure (53%) or human error (52%) compared to cyber attacks (32%) and environmental issues (20%).

42% of those polled said that if their systems were down for seconds, it would be ‘highly disruptive’ or ‘catastrophic’ for businesses. Naturally, the numbers go up when it comes to minutes (69%) and hours (90%), but only 27% of companies said they were able to recover all their systems immediately following an outage.

98% agreed their systems could be available within 24 hours – clearly not a good enough standard. “What we see a lot of the time is customers are coming in with old solutions where they are getting more hours and days from a recovery time perspective,” Sam Woodcock, iland principal solutions architect, told CloudTech.

Another interesting facet of the research showed the key differences between cloud-based and on-premise disaster recovery (DR). Respondents were more likely to spend more on their on-premise kit, but expected less downtime with it when compared to a cloud solution. Perhaps not surprisingly, iland’s message from the study is that cloud DR can be both cost-effective and minimise downtime.

DR testing was also examined; 37% of respondents are testing infrequently or not at all, while 63% have a trained team that tests DR either quarterly or twice a year. The study argues that more can – and should – be done in this regard.

“Traditionally, a long time back solutions were quite intrusive,” said Woodcock. “Some solutions involved replication actually being paused – reducing your recovery point objectives. That’s obviously not beneficial to the business if a disaster were able to hit during that test – potentially there would be some additional data loss with those more traditional solutions.

“What we’re seeing more, as we’re progressing through, [are] products come to the market that give you the ability to test non-intrusively,” he added. “That’s really giving an upward trend into the frequency that people test, and how frequently people test as well.”

iland gave three recommendations for organisations to get the most out of their disaster recovery plans. Companies need to ensure a balance between downtime and cost, ensure testing is easy and cost-effective, and ensure that their DR solution helps maintain IT compliance.

Anything you can do, AI can do better: Making infrastructure smarter


Data flows and artificial intelligence (AI) are changing the globalised supply chain by sending data around the world. Using the cloud as the data transport method requires more efficient data acceleration. AI is infiltrating many places, and it’s not just about replicating human performance and taking jobs, or streamlining processes. It also helps make technology smarter.

AI can be part of IT infrastructure. David Trossell, CEO and CTO of Bridgeworks, argues that this can be a good thing, and needn’t involve employees being made redundant. In his view, artificial intelligence offers a good story: it can enable organisations to better manage their IT infrastructure and to improve their business intelligence, allowing them to make better decisions at a time when big data volumes are increasing at a tremendous rate. In fact, CTOVision.com claims that some experts predict that by the year 2020, the volume of digital data will reach as high as 40 trillion gigabytes. So the task of sifting through it is going to become harder and harder.

Lucas Carlson, senior vice president of strategy at Automic Software, believes it’s been an interesting summer – and for reasons most of us wouldn’t think about. In his article for VentureBeat, ‘5 ways artificial intelligence will change enterprise IT’, he says artificial intelligence can be used for predicting software failures, detecting cyber-security issues, creating super-programmers, making sense of the Internet of Things, and managing robots in datacentres.

Rise of AI

“Just a few short years ago, artificial intelligence was at the beginning of the hype cycle along with fuzzy logic and other such things. So much was hoped for from these technologies, but they slowly faded away into the background,” says Trossell. This pushed artificial intelligence out of favour, making it more of an academic interest than anything else.

“That was until the ground-breaking and great movie, AI, came along and shocked the world with life-like human thinking. But whilst it brought AI back to the forefront, it did so with the fear that robots would displace humans as the superior race,” he explains. One could add that there have been many science fiction movies like this – such as the Terminator series of films, about the battle between humans and killing machines – cyborgs that take on human form so they can walk unnoticed until they attack their targets, with an eye on defeating humankind. Perhaps the most shocking aspect of the story is that the machines were originally built by our own race.

Trossell thinks the doom and gloom presented by these films will happen well after his lifetime. “Yet hopefully, mankind will choose to make more machine intelligence for our own good, rather than for our own destruction.” Nevertheless, he thinks that AI is slowly re-emerging in a positive light – not as a prophecy of the destruction of mankind, but as a companion. “Many people are still concerned that this will displace jobs and lead to global unemployment, with all the social upheaval and ills that accompany it, but history teaches us otherwise.”

He explains why this is the case: “At the start of the industrial revolution, James Hargreaves’ Spinning Jenny suddenly changed the lives of many cottage industry spinners, replacing many with one largely unskilled worker. Out of this there was a massive increase in employment as other industries mechanised to take advantage of the increase in spun wool. Yes, we did go through the age of the dark satanic mills, but industry is at the heart of our civilisation and many of us enjoy the benefits created by it, with an exceptional quality of life. What this shows us is that while there is short-term displacement in employment, it will be absorbed by new industries created later.”

Expert systems

He says that AI “has the ability to create expert systems that never tire, and they can augment humans.” For example, artificial intelligence is being used to spot breast cancer; because it has the ability to learn, it has improved cancer identification accuracy to as high as 99.5%. AI can be employed in many other situations where it can augment experts to drive efficiency and ROI. “Data is all around us – most of us are drowning in it – but it drives our modern society,” he comments. Yet this leaves the question of where to store this growing volume of data. Another question many organisations need to answer today is how to move it quickly and securely around the globe.

“Data is at the heart of every organisation, and many have a global presence as well as customers worldwide – and increased distances create a major bottleneck,” he warns. “Although network speeds have increased exponentially over the past few years, this has not necessarily improved the performance of transmitting data over distance,” he explains. In his view, what it has done is exacerbate the problem and deliver organisations a poor return on their investment. He says this is all caused by the ghosts that plague networks – latency and packet loss – but these can be mitigated or reduced with the help of AI.

Mitigating latency

“There are techniques that highly skilled engineers or programmers can employ to mitigate some of the effects of latency and packet loss, but the network – and more importantly wide area networks (WANs) – are a living, unstable entity to work with,” he warns. Therefore something has to be done, and preferably without much human intervention; after all, we humans are all prone to making mistakes. Traditionally, maintaining the optimal performance of data flowing across a WAN would require an engineer to constantly measure and tune the network, and this is where errors can be made.

With AI, the goal of maximising performance removes the need for constant human intervention – with the potential benefits of reduced human error, better network management, improved disaster recovery, reduced packet loss and increased accuracy. “Artificial intelligence can discard a fixed set of rules and learn, in a similar way to how it is being deployed in breast cancer screening. It can learn from experience the way the network behaves, and it also knows how data flows across the network,” he claims. This is enabling society to develop a ‘fit and forget’ approach.
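The “learn from experience rather than hand-tune” idea can be sketched, very loosely, as a feedback loop that adjusts a transfer parameter (here, the number of parallel streams) based on measured throughput. Everything below is an illustrative toy: the throughput model and the epsilon-greedy search are my own assumptions for the sketch, not a description of how PORTrockIT, WANrockIT, or any real product works.

```python
import random

def simulated_throughput(streams, rtt_ms=80, loss=0.01):
    """Toy stand-in for a throughput measurement: more parallel streams
    help up to a point, after which self-induced congestion erodes the
    gains. Arbitrary units; illustrative only."""
    per_stream = 1000.0 / rtt_ms            # each stream limited by round-trip time
    congestion = 1.0 + 0.02 * streams ** 2  # penalty grows with stream count
    return streams * per_stream * (1 - loss) / congestion

def tune_streams(trials=200, max_streams=32):
    """Epsilon-greedy search for the stream count that maximises
    measured throughput -- the kind of continual tuning an engineer
    would otherwise have to do by hand."""
    best, best_rate = 1, 0.0
    for _ in range(trials):
        if random.random() < 0.2:           # explore: try a random setting
            candidate = random.randint(1, max_streams)
        else:                               # exploit: nudge the current best
            candidate = max(1, min(max_streams, best + random.choice([-1, 1])))
        rate = simulated_throughput(candidate)
        if rate > best_rate:
            best, best_rate = candidate, rate
    return best
```

With this particular toy model the search settles around seven streams; the point is only that the loop converges on a good setting by measuring and adjusting, without a human constantly retuning as conditions change.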

Two solutions that do this are PORTrockIT and WANrockIT, which help to mitigate the effects of latency and reduce packet loss. Beyond them, Trossell believes that “we have just begun to scratch the surface of the possibilities of AI, and there are many other thought- and time-intensive processes within the IT world that it can be applied to, if only we had the courage to release control.” That must happen, because we humans are reaching the limit of our ability to manage ever-increasing data volumes without some form of machine intelligence to support our endeavours. Could we hand over to AI the management of storage space and the location and accessibility of data? Yes we can, because anything we can do, AI can often do better – and we need its support going forward.

[session] True Sensor #IoT Connection | @ThingsExpo @Neo4j #M2M #Sensors

I’m a lonely sensor. I spend all day telling the world how I’m feeling, but none of the other sensors seem to care.
I want to be connected. I want to build relationships with other sensors to be more useful for my human. I want my human to understand that when my friends next door are too hot for a while, I’ll soon be flaming. And when all my friends go outside without me, I may be left behind.
Don’t just log my data; use the relationship graph.
In his session at @ThingsExpo, Ryan Boyd, Engineer and Head of Developer Relations at Neo4j, will help attendees understand the dependencies and figure out the root cause of the failures.


[session] @SolidFire: DevOps the Lean Cloud | @CloudExpo | #APM #DevOps

All clouds are not equal. To succeed in a DevOps context, organizations should plan to develop/deploy apps across a choice of on-premise and public clouds simultaneously depending on the business needs. This is where the concept of the Lean Cloud comes in – resting on the idea that you often need to relocate your app modules over their life cycles for both innovation and operational efficiency in the cloud.
In his session at @DevOpsSummit at 19th Cloud Expo, Valentin (Val) Bercovici, CTO of SolidFire, will discuss how to leverage this concept to seize on the creativity and business agility needed to make it real.


[whitepaper] Driving #DigitalTransformation | @CloudExpo @IBMSystems #IoT

Information technology is an industry that has always experienced change, and the dramatic change sweeping across the industry today is hardly the first to impact customer investments. However, the rate of change and the potential outcomes of today’s digital transformation have the distinct potential to separate the industry into two camps: organizations that see the change coming, embrace it, and successfully leverage it; and, on the other side, organizations that will find themselves as roadkill on the technology highway.


Container Platforms: Build or Buy? | @DevOpsSummit #DevOps #APM #BigData

Throughout history, various leaders have risen up and tried to unify the world by conquest. Fortunately, none of their plans have succeeded. The world goes on just fine with each country ruling itself; no single ruler is necessary. That’s how it is with the container platform ecosystem, as well.
There’s no need for one all-powerful, all-encompassing container platform. Think about any other technology sector out there – there are always multiple solutions in every space. The same goes for container technology. When you create something that is super scalable, the drawback is that it’s not going to be a one-size-fits-all approach. However, one person’s drawback is another person’s advantage.


Windows 2016 Containers on Azure | @DevOpsSummit #Azure #DevOps #Docker #Containers

The Azure platform runs on virtualized servers, and hence the container hosts described above may not give the same performance as an equivalent container host on a bare-metal server. However, this thought process will help organizations come up with new use cases for Windows 2016 containers as part of their infrastructure equations. And just as the industry has seen with both SMP and MPP architectures, the use of SMP should be viewed with respect to specific workloads.
