All posts by Cloud Pro

How big data can give you a competitive edge in sports


Cloud Pro

30 Jul, 2019

When the athletes dramatised in the 1981 Oscar-winning film Chariots of Fire were competing nearly a century ago, a stopwatch was one of the few devices producing data to measure their sporting achievement. But these days sport is all about measurement and analysing the data it produces. Whether it’s tracking your location, heart rate, oxygen saturation, or nutrition, huge amounts of information are being collected from athletes, including amateurs. But professionals in particular are finding that data collection and analysis could be what gives them the edge in competition.

Sports sensors sit where two of the biggest trends in contemporary computing meet: big data and the Internet of Things. The latter is a significant driver of the former. On the one hand, you need the connected devices that can track the relevant parameters to assess the factors behind sporting excellence and measure improvements. On the other hand, you need powerful data analytics to take the information produced, make sense of it, find trends, and help inform how athletes can do better.

Sales of running watches alone are growing five per cent every year, according to MarketWatch. These have now gone well beyond pedometer wristbands like the original Fitbit, and can include GPS, a heart rate monitor, and even a pulse oximeter to measure blood oxygen levels. They can also link wirelessly to cadence sensors in running shoes and on bikes, monitor your sleep patterns, and then automatically transfer the data collected to the internet. Some sports watches can use an accelerometer to detect which stroke you are using when swimming and when you push off at the beginning of a length, so the number of lengths you swim can be counted automatically.

Even consumer-grade systems can tell you useful things about your exercise ability that can guide how you train, such as VO2 Max, which measures the maximum amount of oxygen a person can utilise during intense exercise. This provides an assessment of cardiovascular fitness and can help you track your progress getting fit as you implement a distance-running programme. However, whilst the amateur trend for tracking exercise is driving device ubiquity and the sheer volume of data, professionals have access to systems that can provide much greater levels of detail, and with it a real edge in performance.
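As a rough illustration of how such a metric can be derived, the published Uth et al. heart-rate-ratio method estimates VO2 Max from just two numbers. The sketch below is illustrative only; consumer watches combine this kind of estimate with pace, motion data and more elaborate proprietary models.

```python
def estimate_vo2_max(hr_max: float, hr_rest: float) -> float:
    """Estimate VO2 Max (ml/kg/min) from the ratio of maximum to
    resting heart rate, per the Uth et al. (2004) formula. A rough
    approximation, not what any particular watch actually computes."""
    return 15.3 * hr_max / hr_rest

# A runner with a maximum heart rate of 190 bpm and a resting rate of 50 bpm:
print(round(estimate_vo2_max(190, 50), 1))  # → 58.1
```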

Data gets results in the beautiful game

For example, data is being used to improve football performance by analysing opposing team strategy and finding ways to combat it. Scottish football team Hearts used information from the InStat database to predict that a high-pressing game using players who could keep the pressure on for many kilometres of running would help them beat Celtic – and it worked. They are not alone, as more than 1,500 clubs and national teams use InStat, giving the company information on more than 400,000 players.

But clubs also build up data on their own players using sophisticated devices from companies such as Catapult Sports that can collect up to 1,000 data points per second. Similarly, the STATsports Apex can calculate more than 50 metrics at once, such as max speed, heart rate, step balance, high metabolic distance and dynamic stress load. This goes beyond the raw numbers, adding interpretation of how the workload is affecting an individual athlete. The data is collected in the cloud for historical comparison. Teams now use this information to help decide which players to purchase to achieve their objectives for the season, employing services like InStat and Opta to provide the details they need.

InStat collects data for football, ice hockey, and basketball, whilst Opta includes these plus cricket, rugby union, baseball, golf, motorsport, tennis and handball amongst others. Although team games have many variables that can make player statistics only part of the picture, sports that focus on individual performance such as athletics can rely heavily on data to provide clear insights on how to aid improvement. This goes well beyond GPS tracking of outdoor events. Wearable devices with accelerometers, magnetometers and gyroscopes can track hundreds of data points to describe an athlete’s physical motion.

Stryd’s running shoe attachment can capture cadence (steps or cycles per minute) and ground contact time. This can be used to analyse running style, which can be compared to previous sessions and other athletes. This information can spot nascent talent or help an athlete hone their style so they can emulate what makes the most successful sportspeople win. It can also detect warning signs like asymmetric movements that might cause a future injury or indicate an impending one. EliteHRV’s sensors can provide high levels of detail on heart rate variability to see the physical effects of different levels of performance, so that athletes can recover adequately from their sessions and not over-train.
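To illustrate the kind of check such data enables, here is a minimal sketch of a left/right symmetry check on ground contact times. The function, the numbers and the idea of flagging a few per cent of difference are all illustrative, not how Stryd or any coaching system actually works.

```python
def contact_time_asymmetry(left_ms, right_ms):
    """Percentage asymmetry between mean left- and right-foot ground
    contact times (milliseconds). A persistent gap of a few per cent
    might warrant attention -- an illustrative heuristic, not a
    clinical threshold."""
    mean_l = sum(left_ms) / len(left_ms)
    mean_r = sum(right_ms) / len(right_ms)
    return abs(mean_l - mean_r) / ((mean_l + mean_r) / 2) * 100

# Invented contact times from a short run segment:
print(round(contact_time_asymmetry([245, 250, 248], [260, 262, 259]), 1))  # → 5.0
```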

The secrets of the perfect golf swing

Another individual sport that is gaining considerable benefit from wearable sensors and data analytics is golf. Any sport using a bat, racquet or club can benefit from analysing a player’s swing, but in golf the swing and body posture are particularly constrained, and there are no additional factors like cross-court movement or ball spin to account for, although atmospheric conditions will have an effect. Systems collecting golf performance data include Opta and ShotLink. The latter has results data dating back to 1983 and tracks 93 events a year.

GolfTEC, in contrast, is more focused on how an individual achieves their performance. The company has developed a SwingTRU Motion Study database of 13,000 pro and amateur golfers that includes information on 48 different body motions per swing. The analysis found six key areas that indicate an excellent player. These factors were discovered by correlating swing data with performance. Similarly, TrackMan uses cameras to analyse a player’s swing to aid training.

Rather than just analysing the past, predicting the future is where the application of big data analytics to sport will prove particularly valuable. As with every area of big data analytics, this will be dramatically affected by the application of AI and machine learning. Any massive store of unstructured data can potentially benefit from AI technology, which can help structure the data and find patterns in it proactively. First you feed in performance data, physical metrics during the activities, nutrition, sleep, atmospherics, plus anything else available. Then AI-empowered analytics will look for patterns that could provide strategies that make a difference, particularly when the margins for winning can be so small.
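As a toy illustration of that pattern-finding step, the sketch below screens a handful of invented training metrics for correlation with race times. Real systems use far richer features and machine-learning models rather than a simple correlation pass, but screening like this is a common starting point.

```python
import numpy as np

# Illustrative, invented data: rows are training blocks, columns are
# weekly distance (km), average hours of sleep and a recovery score;
# "times" holds the resulting 10 km race times in minutes.
features = np.array([
    [60, 7.5, 82], [45, 6.0, 70], [72, 8.0, 88],
    [50, 6.5, 75], [65, 7.0, 80], [40, 5.5, 64],
])
times = np.array([38.2, 41.5, 36.9, 40.3, 38.8, 43.0])

# Which metrics correlate most strongly (negatively) with race time?
for name, column in zip(["distance", "sleep", "recovery"], features.T):
    r = np.corrcoef(column, times)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```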

The systems we’ve discussed here are just the beginning. Data-driven sports are still only in their infancy, with much more to come in the next few years to help athletes find a competitive edge. Sport is just one area where big data and analytics are having a major influence, too. Healthcare, smart cities, and our understanding of the natural world are all seeing dramatic contributions.

Discover more amazing stories powered by big data and Intel technology

The nature of technology


Cloud Pro

29 Jul, 2019

Changes to the planet are happening at an alarming rate, and it’s no secret that we’re partly to blame. But one way in which we are making positive changes to the Earth is through developing technology that can help us protect our planet. And how best to do this? By understanding exactly what is happening now, so we can help predict and influence future events.

The size and complexity of the ecosystem is one of the biggest challenges when monitoring the planet. While plenty of data has been collected over the years, until recently, it has been almost impossible to get the insights needed to truly understand what’s going on.

This is where cutting edge technologies come in. From large scale global mapping to tracking individual species, big data analysis and deep learning are now being used to analyse the Earth and its life.

Macro scale – mapping and modelling the Earth

Forests and oceans cover the majority of the planet. Their health is crucial to the stability of the ecosystem, but monitoring them is no easy feat. Enter digital maps and models. By using big data analytics to create high-resolution maps of the planet, many of which are updated in real time, scientists can literally get the bigger picture.

Global Forest Watch is a UN sponsored web application that uses live satellite images to assess the health of all forests globally. It has provided the tools and data to support projects such as the Amazon Conservation Association, which works to protect biodiversity in the Amazon, and Forest Atlas, which helps manage forest resources in Liberia.

Rezatec, an analytics company, has developed similar forest intelligence, using data analysis techniques and machine learning algorithms to produce crucial insights. In 2017, Rezatec partnered with the Forestry Corporation of New South Wales to add value to its existing data, combining multiple datasets to create more detailed and useful maps. This gave forest managers the tools to spot pests, diseases and environmental changes early.

Similar projects are underway to understand our oceans. Ocean Data Viewer is another resource supported by the UN that collects and curates a wealth of data on coastal and ocean biodiversity from a variety of trusted scientific research facilities. Multiple global data sets and maps can then be viewed and downloaded for research and analysis.

71% of our planet’s surface is covered by water, but we have better maps of Mars than we have of the ocean floor. Now, thanks to data gathered from ships, buoys, and satellites, oceanographers have plenty of information at their disposal. From analysing sediment samples with AI to employing cutting-edge laser techniques, they can now map out features on the seabed such as rivers and underwater volcanoes and detect changes as and when they happen. Advanced analytics, powered by Intel, are helping to make these developments faster and more efficient.

Micro scale – monitoring individual animal species

Decline in an individual species is often an early indicator that something is wrong, as many animals are essential to the Earth’s balance. By analysing animal behaviour, scientists can give governments, business leaders and the global community the tools they need to encourage species to thrive.

From counting whales from space to tagging elephants with AI trackers, there are plenty of new and exciting technological developments giving scientists more insight into our wildlife than ever before. However, it’s the smaller, more elusive creatures that can be the trickiest to understand. Luckily, technologies are now being developed to provide solutions for these species.

Take bees, for example. Through the pollination of crops, they perform an essential role in the global economy. However, their numbers are in worrying decline. In 2016, Intel teamed up with Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO) on a groundbreaking bee-tracking project. Tiny RFID ‘backpacks’ were attached to 10,000 Tasmanian bees to monitor their every move. Meanwhile, their hives were fitted with an Intel Edison board to collect data on internal conditions and honey production. From the data gathered, scientists deduced that the use of pesticides, climate change and the loss of wildflower habitats were all contributing factors to the species’ decline.

Bats are another crucial but elusive species. They are a great indicator of biodiversity, the loss of which has been likened to “burning the library of life”. As bats only thrive when insect species are in abundance, understanding their behaviour and numbers can help scientists assess the health of the local environment.

In an Intel and UCL project, automatic smart ‘Echo Box’ detectors were fitted around the Queen Elizabeth Olympic Park in London. These boxes picked up ultrasonic bat calls, allowing scientists to acoustically track bats in the area. The Intel edge processor in each box then processed and converted the sound files into visual representations of the calls, and deep learning algorithms analysed and logged each pattern. On some nights, over 20,000 calls were detected, indicating a reassuringly high level of bat activity.
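The conversion step, turning ultrasonic audio into a visual representation that a deep-learning model can classify, typically means computing a spectrogram. The sketch below is a minimal, hypothetical version of that stage using a synthetic tone, not the Echo Box’s actual pipeline, which adds filtering, denoising and a trained classifier on top.

```python
import numpy as np

def spectrogram(signal, win=256, hop=128):
    """Convert an audio signal into a log-magnitude spectrogram:
    frequency bins (rows) by time frames (columns). A minimal sketch
    of the representation a call classifier is trained on."""
    frames = [signal[i:i + win] * np.hanning(win)
              for i in range(0, len(signal) - win, hop)]
    mags = np.abs(np.fft.rfft(frames, axis=1))
    return 10 * np.log10(mags.T + 1e-10)

# A synthetic 50 kHz tone standing in for an ultrasonic bat call,
# sampled at 250 kHz (far above consumer audio rates).
rate = 250_000
t = np.arange(0, 0.02, 1 / rate)
call = np.sin(2 * np.pi * 50_000 * t)

spec = spectrogram(call)
peak_bin = spec.mean(axis=1).argmax()
# The energy should peak in the frequency bin nearest 50 kHz.
print(f"energy peaks in bin {peak_bin} of {spec.shape[0]}")
```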

All of these projects highlight the amazing capabilities and flexibilities of today’s technology. From big data analysis to deep learning, the same groundbreaking tools used to empower businesses globally are being applied to make a real difference to the health of the planet.


10 key takeaways from Ingram Micro’s UK Cloud Summit


Cloud Pro

21 Jun, 2019

With research from PwC suggesting that progress in digital transformation stems from collaboration between different perspectives, the channel knows all too well the challenge of sustaining the digital dialogue. By keeping people at the forefront of its mission, the channel continues to disrupt, with its core focus on the human experience propelling it further into discovery.

And this is exactly what made the UK Cloud Summit all the more spectacular: people coming together to enable the human experience that the channel brings. This mindset can be used to encourage our partners to think in the same way and drive business forward.

Being able to host two days of celebration, featuring unique keynotes, insightful breakout tracks and invaluable networking opportunities, is a testament to the great relationships we have with all stakeholders of the channel and our partners.

The cloud-coveted reins were passed to our brilliant speakers, who delivered excellent presentations over the two days, covering contemporary concepts in cyber security, hybrid cloud, collaborative tools and novel ways to enhance your ROI.

So, what are the ten key takeaways from UK Cloud Summit?

1 – Cloud for competitive advantage

Scott Murphy, Director of Cloud at Ingram Micro UK&I, explained that cloud isn’t merely an enabler of revolutionary tech, but plays a deeper role as the crux of transformation: a true instrument of competitive advantage.

Whilst an enabler creates a catalyst for change, it’s a means to an end. To truly harness the digital transformation age, you need a means in itself. Cloud technology is the foundation of a new way of working, collaborating and driving new advanced solutions – solutions that can only come from our partners.

Our role is to unleash and sustain this unprecedented potential.

2 – Education is key to better security

What creates a better future for our cloud solutions? Having access to the right tools so that we can better protect ourselves.

As told by our guest speaker, Alexis Conran (best known for BBC Three’s The Real Hustle), we’re the most vulnerable to cyber threats because of our involvement in the industry. As advocates of technology and cyber security, we can often fall short.

It’s important to still carry out due diligence in our decisions as we navigate a landscape that we’ve often set up ourselves, because as Alexis expressed, nothing is 100% secure.

3 – Service providers need to be service integrators

According to Leigh Schvartz, head of cloud and MSP offerings at Fujitsu, seats provisioned and managed by a service provider are set to double in the next five years.

Opening new potential for enhancing strategic management, this manifests a new way of supporting partners. People now want more of a service integrator: a service provider who can work side-by-side with them to uncover more business opportunities and scale further.

4 – Partnering to offer a wider range of services

Not every provider can service all the needs of their customer. While large companies such as Fujitsu can scale up to orchestrate the major cloud platforms, smaller service provider partners don’t have the resources or the people to be specialised enough to go through all of them – which is why partnerships are emerging in ecosystems.

“So you might have an infrastructure specialist partner, and a co-locator and an application consultancy on Azure, all team up, work together to build something that they could all take to market,” says one speaker at the event.

5 – Providers need to vary their pricing models

Many companies have adopted subscription pricing models, but this may not suit some providers, or indeed some customers. For mid-sized service providers, for example, the heavy upfront investment required can make the as-a-service model hard to accommodate.

Paying for what you use is becoming increasingly popular, but it’s also becoming apparent that businesses prefer a fixed term contract behind this to mitigate risk.

6 – Private hosted cloud is becoming more popular

From a cloud perspective, private environments hosted by the service provider are going to be very popular.

“As people prepare to move things to the public cloud, if that is the direction of travel, actually offsetting it to a private hosted service provider, in the first instance, makes sense,” said one speaker.

7 – Cyber protection has evolved from data protection

The talk around the protection of information and cyber security within enterprises shouldn’t be limited to just hardware. It’s now evolved beyond that, according to Ronin McCurtin, vice president of Northern Europe for Acronis.

He says this now includes privacy, authenticity and security, particularly as the amount of data being processed has exploded in recent years, creating greater demand for multiple storage repositories.

“Suddenly, our data went from being on our laptop to being everywhere,” he says. “We need to provide a solution so we know that our data can be actually protected when it’s out there.”

One example of a threat to enterprises is ransomware. “A lot of companies are going to be attacked by ransomware and we need to make sure that they’re protected from those kinds of situations going forward.”

8 – Authenticity will take centre stage

Verifying that emails, messages and other communications are authentic will become both much more important and easier to do.

McCurtin says that end users will need technologies working together to make sure that messages sent to each other are genuinely from the claimed sender. Notarising such communications using blockchain technology could help in quarantining messages that fail verification.
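The principle can be sketched with a simple hash chain. This is an illustration of the idea only, not Acronis’s implementation; a production system would anchor the hashes to a distributed ledger rather than a local list.

```python
import hashlib
import json
import time

def notarise(message: str, chain: list) -> dict:
    """Append a tamper-evident record of a message to a hash chain.
    Each record commits to the previous one, so rewriting history
    would require recomputing every later hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    record = {"digest": hashlib.sha256(message.encode()).hexdigest(),
              "prev": prev, "ts": time.time()}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)
    return record

chain = []
notarise("Invoice #4411 attached", chain)
notarise("Please update payment details", chain)

# A recipient can check a message against the chain: if its digest
# isn't present, it was never notarised and can be quarantined.
digest = hashlib.sha256(b"Please update payment details").hexdigest()
print(any(r["digest"] == digest for r in chain))  # → True
```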

9 – Embracing a multi-cloud strategy

Rob Price, UK Partner Organisations CTO at Cisco, presented Cisco’s view on how enterprises are embracing a multi-cloud strategy.

Many organisations have come to the cloud via shadow IT and used it in a siloed way. IT managers now need to administrate the many clouds that the organisation uses. Cisco’s advice here is that businesses think about the many options available to them, create implementation plans and proof of value and achievable milestones.

10 – The channel needs to start small in cloud

Executive vice president of Ingram Micro Cloud Global, Nimesh Davé, told delegates at the summit that partners need to “start off small” when moving to the cloud and find a niche where they know they can do well.

This, he says, means building up a practice around small application stacks and being better than anyone else at it.

The cloud is opening the doors for the channel and service providers who put the human experience at the heart of their activities. Digital transformation works best when businesses and partners work together for the greater good – when they work in tandem, they can be unstoppable.

Ingram Micro UK Cloud Summit 2019: A retrospective


Cloud Pro

21 Jun, 2019

Global distributor Ingram Micro held its annual UK Cloud Summit last month, at London’s prestigious Landmark hotel in Marylebone.

A companion event to the company’s flagship Cloud Summit X event, held in San Diego last March, the UK Cloud Summit brought together 128 partners and more than 350 delegates to hear more about Ingram Micro’s plans for its UK channel partners and to network, collaborate and share knowledge with each other.

In addition to senior Ingram Micro executives, attendees heard keynote speeches from industry leaders including Dropbox, McAfee, Microsoft and more. Many speakers highlighted the size and strength of the opportunity facing cloud service providers and resellers.

As Alex Hilton, CEO of trade body the Cloud Industry Forum (CIF), pointed out, although around 70% of organisations say they’re undertaking some kind of digital transformation project, PwC’s research shows that 85% of these projects eventually fail – ample evidence that businesses need support in this area.

Ingram Micro’s executive vice president of global cloud computing Nimesh Davé also noted that models of commerce and economics have moved away from transactional, ownership-based approaches to a situation in which consumers increasingly want to consume everything as a subscription-based service. This includes businesses as well as individuals.

Understanding your customer

The overarching theme, however, was the need for partners to better understand their customers and the business challenges they face.

“Business, and the way you conduct things, has completely changed and it will continue to change,” Davé said. “The way you do things – and the relationship you have with your customers and your partners – will need to get deeper than ever before. Because you’re not only going to have to be looking from the outside, you’re going to need to be looking from the inside.

“You can no longer be there and not be in the business. That means understanding your customer’s business is going to become more and more important, because you are going to move into becoming solution providers.”

This was a sentiment echoed by many of Ingram’s partners. As cloud technologies and digital transformation become ever more essential to modern businesses, channel firms recognised the need to move away from simple, revenue-based deals providing off-the-shelf hardware and services and gravitate towards offering clients more comprehensive, interconnected end-to-end technology packages.

Delegates were also treated to invaluable insight from industry veterans, who shared practical advice for resellers and MSPs, as well as their visions for how channel firms can forge more sustainable, profitable and valuable relationships with their customers. One of the key highlights of the summit was undoubtedly an appearance from Alexis Conran, host of BBC Three’s ‘The Real Hustle’. He explained that having high-end cyber security systems, while vitally important, doesn’t automatically mean a clever and determined attacker can’t get around them through social engineering and other nefarious tactics.

Elsewhere, partners such as Acronis, Cisco, and Fujitsu gave their insights into how channel companies can approach the complexities of issues like building a cohesive multi-cloud offering, creating robust security practices and supporting clients in a services-driven business.

Ever-evolving industry

The event made it clear that the industry is going through a fundamental change. Nowhere was this more evident than the keynote speech from James Chadwick, Microsoft’s cloud services partner lead, during which he gave attendees an inside look at how the tech giant has transformed itself – inside and out – into a company that puts customers and partners first.

Chadwick spoke about some of the areas in which it’s trying to change, like putting time and effort into upskilling its partners as well as its employees, or making more dynamic use of internal data to boost efficiency.

He also cited some decisions from Microsoft which would have been unthinkable just a few years ago, such as making Apple one of its major partners, buying GitHub, or opening itself up to Linux. AI and the IoT are now crucial areas that Microsoft is helping its customers to explore.

“The world’s changed – and suddenly we’re talking about different things. We’re not talking about SharePoint any more. We’re not talking about Exchange,” he said.

“There are some really important topics that we need to be talking to our customers about, and we need to be talking with our partners to our customers about those topics.”

Quality over quantity

The reason many firms were so excited to attend this year was their eagerness to attract new partners of their own. Rather than simply trying to sweep up as many net new partners as possible, however, all of the companies we spoke to emphasised that they were looking for quality rather than quantity.

They also stressed that they were looking to grow their channel ecosystem responsibly, ensuring that every partner still gets the care, attention and support that they need, rather than simply leaving them to their own devices and cashing their cheques.

“I came from an organisation that had 2,000 UK partners, and we’re not ready to support that size of organisation, so we want a core of good quality, focused partners,” said Richard Agnew, EMEA vice president of security firm and Ingram Micro partner Code 42.

“I can’t expect a partner just to decide ‘I’m going to sell Code 42’ and we go ‘great, send us some POs’… it doesn’t work like that. So they need support and training and professional services help and pre-sales help. And we’re of limited scale, so 20 feels like the right sort of number for us to have.”

Ingram Micro’s vision for the channel

One of the biggest things Ingram Micro’s partners will have taken away from the event is the certainty that the company is doing everything it can to support them. Davé repeatedly promised the crowd of assembled delegates that Ingram will continue to invest in its channel, citing Ingram’s belief that “the channel will be around a lot longer than any of us will”.

“The last time I looked into it, about 80% of cloud was done direct,” Ingram Micro’s director of cloud and advanced solutions Scott Murphy told delegates.

“Now, for me, that represents a massive opportunity for channel because as customers have moved into cloud, then they want to go to multi-cloud and they want to talk about ‘how do I manage this, how do I secure it, how do I optimise it?’ They’re not going to get that from one single vendor and that’s the area where channel partners can play.”

Murphy added: “There is a set of services all the way from discovery right through to management and cost optimisation that you can leverage into your client base to help you unearth opportunities and serve to bring value rather than just driving pure consumption.”

The company put its money where its mouth is, too, with the launch of a range of new, partner-focused services designed to make it easier for all of its ecosystem to succeed. ISVs, for example, can use the new Ingram Micro Connect scheme to quickly and easily add themselves to the Ingram Micro Cloud Marketplace, as can resellers who have worked with clients to develop custom apps, allowing them to add new revenue streams to their business with virtually no effort.

Ingram will also be expanding its Comet competition for ISVs to new territories, seeking out best-of-breed software creators to add to its channel.

For resellers, the company announced the new Sales and Marketing Hub, creating a centralised resource where partners can access standardised sales sheets, playbooks and product guides for all the vendors Ingram Micro works with, as well as tools and templates to help them create and launch dynamic, modern and effective multi-channel digital marketing campaigns.

Celebration of excellence

As part of the event, Ingram Micro also honoured its partners with a number of awards recognising outstanding achievement within its channel.

Microsoft claimed the top prize as Vendor of the Year, while Ricoh UK and Softcat picked up MSP of the Year and Charitable Partner of the Year respectively.

Creators and Integrators of the Year went to Dropbox Business, while Highlander was awarded the title of VAR of the Year, and Colt Technology walked away with an award for Cloud Change Agent of the Year.

There was also special recognition for the trailblazing women within the UK channel, with Angelika Schmeing, senior vice president of Operations and Finance at The Henson Group, receiving the Women Leading in Channel award.

For Ingram Micro, the UK Cloud Summit is a great way to hear from its partners about how it’s doing as a distributor, as well as to update them on new announcements from the company, but Murphy says that this isn’t his primary goal with the event.

“For me, the content of the event is key, in terms of being able to share that content with partners, but it’s more about the ecosystem of the community,” he said.

“So, partner A talking to partner B and sharing best practice, talking about how things work for them. That to me is the value – me getting that feedback, and my team getting that feedback.”

For Ingram’s vendor partners, however, it’s also a much-needed reinforcement of the importance of cloud computing in the modern IT landscape.

“I think it’s been too long that some channel partners have ignored cloud while it’s been clearly coming up on the horizon,” says Nigel Hawthorn, EMEA marketing director for McAfee.

“It’s great that Ingram has put together a huge investment in time and effort and money to deliver an agenda that is two days of cloud computing. I’m really pleased about this. Hopefully, together, we can all learn how to do business better in the cloud.”

CIF’s CEO Hilton added: “The value of events like this shouldn’t be underestimated because there is an opportunity for attendees to hear from a mix of vendors and suppliers to really understand and establish what their best-of-breed opportunities are.

“Ingram puts that best-of-breed collection together and that’s an opportunity for the supply chain, and the channel generally, to choose their version of that whether it’s the same as Ingram’s, or whether they augment it and supplement it themselves, but actually just go to market and present a complete solution set for their customers.”

The smarter route to SD-WAN


Cloud Pro

3 Jun, 2019

If you deal with networking technology, there’s one trend that you’ll almost certainly have heard more about than any other, and that’s SD-WAN. It’s been labelled as ‘the next big thing’ for a few years now, with everyone from IT admins to vendors and providers extolling its virtues.

Short for software-defined wide-area network, SD-WAN technology is used to simplify the management of wide-area networks which connect remote locations like data centres or regional offices across large geographic distances. It does this by abstracting the control and management from the network and hardware layer onto a separate layer of software.

This abstraction means that various elements of WAN management can be centralised. For example, MPLS, standard broadband and 4G LTE services can all be bundled together as part of an SD-WAN to provide an overall pool of network capacity to be portioned out. Hardware management can also be policy-based, allowing for zero-touch provisioning and remote configuration. Network virtualisation is employed too, as is a flexible approach to traffic routing.

These capabilities – many of which were previously not available to businesses – are what make SD-WAN so transformative when compared to traditional WANs. Standard WAN deployments are usually very rigid in terms of how they’re configured, and any changes or updates often need to be made locally. They’re expensive too, and often require the installation of specialised hardware throughout the network.

SD-WAN solutions, by contrast, are far more versatile in how they’re configured. Because the control plane is abstracted from the data plane, changes to the network architecture can be made remotely through software, rather than having to reconfigure on-site hardware or invest in additional capacity.

Benefits of SD-WAN

This introduces a number of attendant benefits; because the control of the networking hardware is abstracted, companies aren’t required to make sure all of their equipment is from the same vendor in the interests of compatibility, but can instead invest in more cost-effective generic equipment. Pooling network connections also increases resiliency. If the MPLS connection goes down, for example, 4G or broadband links can automatically pick up the slack without outages.
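
The failover behaviour described above amounts to picking the highest-priority link that is still up. A minimal sketch follows; the link list and priority values are illustrative, not any particular product's data model.

```python
# Illustrative sketch: traffic falls back to the next available link
# when the preferred one (e.g. MPLS) goes down.

def select_link(links: list) -> str:
    """Pick the highest-priority (lowest number) link that is currently up."""
    for link in sorted(links, key=lambda lk: lk["priority"]):
        if link["up"]:
            return link["name"]
    raise RuntimeError("no links available")

links = [
    {"name": "mpls", "priority": 1, "up": False},      # primary has failed
    {"name": "broadband", "priority": 2, "up": True},
    {"name": "4g-lte", "priority": 3, "up": True},
]
print(select_link(links))  # broadband
```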

Remote provisioning and management of equipment also means that network engineers and administrators don’t need to have an on-site presence in every location in order to carry out maintenance and upgrades, thereby saving on travel time and staffing costs. Scaling is easier and cheaper, too; networking equipment for new branches can be shipped directly from the factory to the new location, plugged in and then rapidly and remotely configured based on predefined policies.

This makes it particularly relevant for fast-growing businesses that need to expand without IT concerns becoming a bottleneck. Rather than having to ensure every location has in-house networking staff to handle troubleshooting, most problems can be dealt with remotely. SD-WAN also enables a much greater level of network automation, which allows companies to scale the size of their network without vastly increasing the management and administration overhead for the network team. This, in turn, allows a small team to manage a large SD-WAN that would require a far bigger headcount to run as a traditional WAN.

Potential barriers

However, SD-WAN is not without its challenges, and while managing an SD-WAN system is often simpler and less time-intensive than managing a normal WAN, the same is not always true of setting one up. There are barriers to rolling out an SD-WAN infrastructure that can place it out of the reach of smaller and mid-sized organisations.

Cost is often a key factor here; SD-WAN solutions from large blue-chip vendors are often charged at a high premium, and when combined with existing software licenses and rental fees for the MPLS, broadband and LTE connections that make up the core of the networking stack, this can lead to a high cost-of-entry for companies looking to start their SD-WAN journey.

On top of that, deploying and managing an SD-WAN requires specialised knowledge. Certifications and courses in how to operate SD-WAN systems can cost thousands, and IT professionals who possess said certifications are in high demand and can command significant salaries.

The time investment required for an SD-WAN rollout should not be underestimated, either. Aside from the training necessary to set up and operate it, SD-WAN systems should be thoroughly planned out before deployment. Not only that, but the actual switch-over from regular WAN to SD-WAN takes time – and for over-stretched, time-poor IT teams (who are often the ones who stand to gain the most from SD-WAN) this is time that they’re already spending on the day-to-day needs of their existing network.

Managed SD-WAN solutions

Growing companies can overcome these challenges, however, and the best way of doing so is by working with a partner to deploy a managed SD-WAN solution. Partnering with an experienced, battle-tested SD-WAN provider bypasses the need for investing time and money in up-skilling existing employees, as they already have a team of qualified technicians at their disposal. And, because these technicians will also work with you on designing the architecture, as well as taking on the heavy lifting when it comes to rolling out the network, your staff can focus on the business-critical tasks of maintaining your network without taking time to handle the migration.

Managed SD-WAN services have advantages beyond the initial rollout, though. In this model, the partner is responsible for operating and maintaining the network, rather than the customer. This allows internal IT staff to focus on extracting value from their SD-WAN deployments without having to worry about managing the network as a whole. In practice, this means IT teams can spend their time using the enhanced visibility and granular control offered by SD-WAN to improve the stability and performance of their networks, rather than getting bogged down in tasks that don’t deliver value – all of the advantages of SD-WAN with none of the complexity.

Moreover, choosing Zen Internet as their managed SD-WAN provider allows businesses to reduce the cost of their SD-WAN deployments by bundling all of the costs together into a single package from a single provider. This includes not only the software-based control platform, SD-WAN networking hardware, associated licenses and circuits that make up an SD-WAN network, but also the underlying MPLS, broadband and 4G LTE connectivity that powers it. These technologies are bundled into a single managed service, allowing you to consume SD-WAN on a cloud-style as-a-service basis.

Zen has a rich and proven heritage in business and carrier-grade networking, including large-scale WAN estates, making Zen an ideal partner to both deploy and manage large, complex SD-WAN projects. Zen customers also benefit from award-winning service and support, with a plethora of experienced technical engineers on standby to ensure that problems are resolved rapidly.

SD-WAN can be a daunting prospect for growing businesses, but it doesn’t have to be. Partnering with Zen allows companies to offload the work of setting up and managing a complex, highly-sophisticated SD-WAN model to a company with a strong background in the intricacies of business networking, as well as a broad portfolio of network and connectivity technologies. Supported by Zen, companies are empowered to get the maximum value from their SD-WAN solutions, transforming and future-proofing their organisations without the added headaches and cost that SD-WAN often entails.

To learn more about how an SD-WAN can revolutionise your business, download Zen’s free buyer’s guide.

Raising the bar on enterprise computing


Cloud Pro

23 May, 2019

With its move to the Xeon Scalable architecture, Intel began a revolution in its enterprise processors that went beyond the normal performance and energy efficiency selling points. Combined with new storage, connectivity and memory technologies, Xeon Scalable was a step change. The new 2nd Gen Intel® Xeon® Scalable processors don’t just continue that work but double down on it, with a raft of improvements and upgrades – some revolutionary – that add up to a significant shift in performance for today’s most crucial workloads. Whether you’re looking to push forward with ambitious modernisation strategies or embrace new technologies around AI, 2nd Gen Intel® Xeon® Scalable processors should be part of your plan. The revised architecture doesn’t just give you a speed boost, but opens up a whole new wave of capabilities.

More cores and higher speeds meet AI acceleration

It’s not that this latest processor doesn’t bring conventional performance improvements. Across the line, from the entry-level Bronze processors to the new, high-end Platinum processors, there are increases in frequency, while models from the Silver family upwards get more cores at roughly the same price point, not to mention more L3 cache. For instance, the new Xeon Silver 4214 has 12 cores running at a base frequency of 2.2GHz with a Turbo frequency of 3.2GHz, plus 16.5MB of L3 cache. That’s a big step up from the old Xeon Silver 4114’s 10 cores at 2.2GHz (3.0GHz Turbo) and just 13.75MB of cache – and one that’s replicated as you move up through the line.

At the high-end, the improvements stand out even further. The new Platinum 9200 family has processors with up to 56 cores running 112 threads with a base frequency of 2.6GHz and a Turbo frequency of 3.8GHz. By any yardstick that’s an incredible amount of power. What’s more, these processors have 77MB of L3 cache and support for up to 2,933MHz DDR4 RAM – the fastest ever natively supported by a Xeon processor. Put up to 112 cores at work in a two-socket configuration, and you’re looking at unbelievable levels of performance for a single unit system.

From heavy duty virtualisation scenarios to cutting-edge, high-performance applications, these CPUs are designed to run the most demanding workloads. Intel claims a 33% performance improvement over previous-generation Xeon Scalable processors, or an up to 3.5x improvement over the Xeon E5 processors of five years ago.

Yet Intel’s enhancements run much deeper. The Xeon Scalable architecture introduced the AVX-512 instruction set, with double-width registers and double the number of registers compared with the previous AVX2 instruction set, dramatically accelerating high-performance workloads including AI, cryptography and data protection. The 2nd generation Intel® Xeon® Scalable processor takes that a stage further with AVX-512 Deep Learning Boost (DL Boost) and Vector Neural Network Instructions (VNNI): new instructions designed specifically to enhance AI performance both in the data centre and at the edge.

Deep learning has two major aspects – training and inference – where the algorithm is first trained to assign ‘weights’ to aspects of the data it is fed, then uses those learnt weights to make inferences about new data. DL Boost and VNNI are designed specifically to accelerate the inference process by enabling it to work at lower levels of numerical precision, and to do so without any perceptible compromise on accuracy.

Using a single new instruction to replace the work of three older ones, it can offer serious performance upgrades for deep learning applications such as image recognition, voice recognition and language translation. In internal testing, Intel has seen boosts of up to 30x over previous-generation Xeon Scalable processors. What’s more, these technologies are built to accelerate Intel’s open source MKL-DNN deep learning library, which can be found within the Microsoft Cognitive Toolkit, TensorFlow and BigDL frameworks. There’s no need for developers to rebuild everything to use the new instructions, because they work within the libraries and frameworks DL developers already use.
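
The idea behind lower-precision inference can be illustrated without any special hardware. The sketch below quantises floating-point weights and inputs to int8, computes the dot product in integers, and rescales the result; the values and scale factor are invented for illustration, and real VNNI operates on packed vector registers, not Python lists.

```python
# Illustrative sketch (not Intel's implementation): a dot product computed
# at int8 precision closely approximates the fp32 result, which is the
# principle behind VNNI/DL Boost inference acceleration.

def quantize(values, scale):
    """Map floats to int8 using a simple symmetric scale, clamped to range."""
    return [max(-128, min(127, round(v / scale))) for v in values]

weights = [0.12, -0.45, 0.33, 0.08]
inputs = [0.9, 0.1, -0.7, 0.5]
scale = 0.01  # chosen so these values fit comfortably in int8

qw, qx = quantize(weights, scale), quantize(inputs, scale)

fp32_result = sum(w * x for w, x in zip(weights, inputs))
int8_result = sum(w * x for w, x in zip(qw, qx)) * scale * scale

print(round(fp32_result, 4), round(int8_result, 4))  # -0.128 -0.128
```

The integer arithmetic is much cheaper per operation, and with a well-chosen scale the answer is effectively indistinguishable from the full-precision one.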

With AVX-512, VNNI and DL Boost, more enterprises have the power to harness the potential of deep learning. Workloads that would have pushed previous processors to their limits, like image analysis or complex modelling and simulation, run at much higher speeds. The end result is a lower barrier to entry for cutting-edge DL applications, opening significant research, financial and medical applications up to more organisations and letting them run at truly practical speeds.

The next-gen platform

Of course, the processor isn’t all that matters in a server or system, which is why 2nd Gen Intel® Xeon® Scalable processors are designed to work hand-in-hand with some of Intel’s most powerful technologies. Perhaps the most crucial is Intel® Optane™ DC Persistent Memory, which combines Intel’s 3D XPoint memory media with Intel memory and storage controllers to bring you a new kind of memory, with the performance of RAM but the persistence – and lower costs – of NAND storage.

Optane is widely known as an alternative to NAND-based SSD technology, but in its DC Persistent Memory form it can replace standard DDR4 DIMMs, augmenting the available RAM and acting as a persistent memory store. Paired with a 2nd Gen Intel® Xeon® Scalable processor, you can have up to six Optane DC Persistent Memory modules per socket partnered with at least one DDR4 module. With 128GB, 256GB and 512GB modules available, you can have up to 32TB of low-latency, persistent RAM without the huge costs associated with using conventional DDR4 alone.

The benefits almost speak for themselves. With such lavish quantities of RAM available, there’s scope to run heavier workloads or more virtual machines; Intel testing shows that you can run up to 36% more VMs on 2nd Gen Intel® Xeon® Scalable processors with Intel® Optane™ DC Persistent Memory. What’s more, this same combination opens up powerful but demanding in-memory applications to a much wider range of enterprises, giving more companies the chance to run real-time analytics on near-live data or scour vast datasets for insight. Combine this with the monster AI acceleration of Intel’s new CPUs, and some hugely exciting capabilities hit the mainstream.

Yet there’s still more to these latest Xeon Scalable chips than performance – they’re the foundation of a modern computing platform, built for a connected, data-driven business world. Intel QuickAssist Technology adds hardware acceleration for network security, routing and storage, boosting performance in the software-defined data centre. There’s also support for Intel Ethernet with scalable iWARP RDMA, giving you up to four 10Gbits/sec Ethernet ports for high data throughput between systems with ultra-low latency. Add Intel’s new Ethernet 800 Series network adapters, and you can take the next step into 100Gbits/sec connectivity, for incredible levels of scalability and power.

Security, meanwhile, is enhanced by hardware acceleration for the new Intel Security Libraries (SecL-DC) and Intel Threat Detection Technology, providing a real alternative to expensive hardware security modules and protecting the data centre against incoming threats. This makes it tangibly easier to deliver platforms and services based on trust. Finally, Intel’s Infrastructure Management Technologies provide a robust framework for resource management, with platform-level detection, monitoring, reporting and configuration. It’s the key to controlling and managing your compute and storage resources to improve data centre efficiency and utilisation.

The overall effect? A line of processors that covers the needs of every business, and that provides each one with a secure, robust and scalable platform for the big applications of tomorrow. This isn’t just about efficiency or about delivering your existing capabilities faster, but about empowering your business to do more with the best tools available. Don’t let the 2nd generation name fool you. This isn’t just an upgrade; it’s a game-changer.

Discover more about data innovations at Intel.co.uk

UK Cloud Awards 2019 winners announced


Cloud Pro

17 May, 2019

County Hall in London last night played host to the sixth annual UK Cloud Awards, where accolades were awarded in a number of categories to showcase and reward the innovation and success of this country’s cloud industry.

The awards, which are put on by the Cloud Industry Forum (CIF) and Cloud Pro and supported by Platinum sponsors ScienceLogic and CDW, acknowledged and rewarded the best cloud providers, products and projects from the past year.

“From the standard and sheer number of entries we received for the awards it’s clear that it’s an incredibly exciting time to be in the cloud industry,” said Alex Hilton, CIF’s CEO.

“Having served as a judge every year since we founded the awards, I can say that deciding the winners this year was the toughest one yet, and that making the shortlist is an achievement in itself. However, there can of course only be one winner in each category and a big congratulations is deserved for everyone that took home prizes on the night.”

The expert panel of independent judges was led by head judge Jez Back, and despite the overwhelming number of submissions, just 17 entrants were lucky enough to take awards home on the night.

“The UK Cloud Awards received yet another record number of entries this year, with a depth and quality which proves the health and strength of the UK cloud industry,” Back said.

“The judges and I were looking for examples of unrelenting focus on helping to transform the way organisations operate and delivering meaningful outcomes for their clients. I’m pleased to say that our winners delivered on both counts and I’d like to congratulate the finalists and ultimate winners on their achievements and successes.”

The winners are as follows:

Best in Class

Most Innovative Enterprise Product

Cloe from Densify

Most Innovative SMB Product

GoSimple Tax

Best Cloud Platform Solution

SuiteCloud from Oracle NetSuite

Best Cyber Security Product or Service

OnDMARC from Red Sift

Best Fintech Product or Service

Float

Best AI / ML Enabled Product or Service

Data Science from SnapLogic

Best Data Management Product or Service

Cloud Data Services from Pure Storage

Best Cloud Enabled End User Experience

Dropbox Business from Dropbox

Best Digital Transformation Projects

Public Sector / Third Sector Project

Financial Conduct Authority in partnership with Sopra Steria

Private Sector Enterprise Project

Manchester United in partnership with HCL

Private Sector SMB Project

Media Matters in partnership with Chalkline

DevOps and Functions as a Service Implementation

HeleCloud

Best in Class, Service Providers

Best Cloud Service Provider (CSP)

Workiva

Best Managed Service Provider (MSP)

Unify Communications Ltd

Cloud Migration Partner / Technical Collaboration Project

Ensono at Guinness World Records

Personal Achievement Awards

Cloud Newcomer of the Year

Myles Clarke, Product Manager at Cloud Gateway

Cloud Visionary of the Year

Chris Dunning at TechQuarters

A raffle was held on the night for the UK Cloud Awards’ official charity partner, The Sick Children’s Trust, with prizes donated by Sennheiser and Corona/3 Monkeys Zeno and Dennis Publishing.

Attendees dug deep in their pockets to raise money to support the charity’s efforts in keeping sick children and their families together.

Paul Franklin, group publisher of Channel Pro, Cloud Pro and IT Pro, added: “This year’s UK Cloud Awards was our biggest and best event yet so a huge thanks is owed to our Platinum sponsors CDW and ScienceLogic, and our Award sponsors, Fujitsu, Navisite, TechData and Veritas.

“We established the UK Cloud Awards to champion excellence and highlight the cloud innovation and leadership taking place in the UK, but they wouldn’t be possible without the backing of the industry. We fully expect to repeat the success of this year’s awards in 2020, so watch this space!”

Q&A: Andrew Cowling, Fujitsu Scanners


Cloud Pro

7 May, 2019

What does cloud mean to you and what benefits do you think it brings to businesses?

The cloud is probably the next major step in computing. It enables users to access and make use of software and services remotely via the internet – through their desktop PC, laptop, tablet or phone. For businesses, this means they no longer need super-powerful computers on their premises, whilst still being able to easily access the software they need without any administration or updating headaches. It also means they are no longer tied to a desktop in the office, but can access their work anywhere with an internet connection. Cloud enables greater collaboration across departments, whether in the same building or on different continents. Ultimately, decision-making can be faster and better informed.

Do you think the UK cloud industry has an advantage over other geographies? Are we excelling?

In the UK, we generally have very good internet access, both through the landline network and mobile networks. Our compact geographical size means that mobile coverage is excellent across most of the country. This makes cloud access pretty good wherever you may be. Users are getting used to using cloud services such as Google Docs and Dropbox, so this means they are becoming much more familiar with the concept.

What else do you think needs to be done to champion innovation in the UK cloud industry?

Businesses are already making great strides into the cloud arena. I think it’s now really just about thinking ‘outside the box’ and focusing on ways in which the cloud could benefit their clients. 

Please can you provide a bit more detail for those not familiar with your company?

Fujitsu specialises in producing high-speed, professional document scanners and software. Our scanners can quickly convert paper documents into electronic, searchable PDF documents, saving valuable office space and reducing the risk of loss or damage to the information through fire, flood or theft.

Why have you decided to get involved with the UK Cloud Awards 2019?

We’re really keen to promote the benefits of working in the cloud and associate ourselves with this exciting shift we are seeing in everyday workplaces. Our products are all focused on enabling businesses to be more effective, work smarter, and realise the benefits that moving from paper to digital processes can bring.

Our latest scanners can automatically route scanned documents to the right cloud service at the touch of a button. They can intuitively recognise what is being scanned and route accordingly. For example, this could be your receipts straight to expense systems, business cards to a CRM or those important documents straight to a cloud service for instant access and action.

What key trends/challenges are you seeing with your customers around cloud?

Education is probably the biggest challenge with the cloud. Customers don’t always instantly pick up on the ways in which the cloud can help their business, so part of our job is getting to know the customers and suggesting ways in which they could make the most of it. The next-generation workforce will demand new ways to interact and engage. Right now, though, it is about trying to shift habits and the ‘but this is the way we’ve always done things’ mentality, to help businesses realise that they have to move with the times to remain competitive and profitable.

How do you think the cloud landscape has evolved in the past five years?

Cloud access and use is becoming more ubiquitous through popular software such as Dropbox, Evernote and so on, whilst Microsoft, Google and Amazon Web Services are serving more and more software platforms. It’s also getting easier and cheaper to send mobile data quickly on the go. We are also seeing more niche providers, such as those who manage expenses, so perhaps we will see further evolution as adoption grows.

What do you think has driven this shift?

It’s a mixture of both customer demand and businesses taking advantage of the enormous, scalable computing power available through the cloud, offering new and innovative services for consumers. Customers are keen to take advantage of any new technology that will save time, money, and make their lives easier. Businesses can also take advantage through the increased resilience and security that big cloud service providers are offering.

What other trends and patterns do you see around cloud computing and related technologies?

IT departments around the world are going through a transformation. It’s becoming unnecessary for them to have their own enormous servers and computing power on site when they can outsource this business to cloud computing specialists. This will save administration time and expense, whilst they can easily pay to scale the cloud computing power to suit their needs at any given time.

What role do you see cloud playing in business life a year or five years from now?

Cloud services are already becoming indispensable for businesses. Hopefully, they will benefit our quality of life by allowing more remote and flexible working for a greater majority of employees. This should, ultimately, lead to business cost savings through remote sharing, virtual meetings and more efficient workers who spend less time on the road.

Looking further along the line, how do you see cloud shaping the way we live and work in the future?

It’s clear that this technology is developing at an incredibly fast pace and cloud services will have a huge impact on the workplace in the future. These are exciting times and I can see that the cloud will enable businesses to take advantage of increasingly useful technologies such as artificial intelligence and unthinkable amounts of computing power. I’m really looking forward to seeing Fujitsu continue to develop its ScanSnap Cloud services to make our scanners even more useful to our users in the future.

What can you do with deep learning?


Cloud Pro

29 Apr, 2019

If there’s one resource the world isn’t going to run out of anytime soon it’s data. International analyst firm IDC estimates the ‘Global Datasphere’ – or the total amount of data stored on computers across the world – will grow from 33 zettabytes in 2018 to 175 zettabytes in 2025. Or to put that in a more relatable form, 175 billion of those terabyte hard disks you might find inside one of today’s PCs.
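
The arithmetic behind that comparison is easy to check, using decimal units where one zettabyte is 10^21 bytes and one terabyte is 10^12:

```python
# Sanity-checking the comparison: one zettabyte is a billion terabytes
# (decimal units), so 175ZB really is 175 billion 1TB drives.
ZB_IN_TB = 10**21 // 10**12   # 1,000,000,000
drives = 175 * ZB_IN_TB
print(f"{drives:,}")  # 175,000,000,000
```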

That data pool is an enormous resource, but one that’s far too big for humans to exploit. Instead, we’re going to need to rely on deep learning to make sense of all that data and discover links we don’t even know exist yet. The applications of deep learning are, according to Intel’s AI Technical Solution Specialist, Walter Riviera, “limitless”.

“The coolest application for deep learning is yet to be invented,” he says.

So, what is deep learning and why is it so powerful?

Teaching the brain

Deep learning is a subset of machine learning and artificial intelligence. It is specifically concerned with neural networks – computer systems that are designed to mimic the behaviour of the human brain.

In the same way that our brains make decisions based on multiple sources of ‘data’ – i.e. sight, touch, memory – deep learning also relies on multiple layers of data. A neural network is composed of layers of “virtual or digital neurons,” says Riviera. “The more layers you have, the deeper you go, the cleverer the algorithm.”

There are two key steps in deep learning: training and inference. The first is teaching that virtual brain to do something, the second is deploying that brain to do what it’s supposed to do. Riviera says the process is akin to playing a guitar. When you pick up a guitar, you normally have to tune the strings. So you play a chord and see if it matches the sound of the chord you know to be correct. “Unconsciously, you match the emitted sound with the expected one,” he says. “Somehow you’re measuring the error – the difference between the two.”

If the two chords don’t match, you twiddle the tuning pegs and strum the chord again, repeating the process until the sound from the guitar matches the one in your head. “It’s an iterative process and after a while you can basically drop the guitar, because that’s ready to go,” says Riviera. “What song can you play? Whatever, because it’s good to go.”

In other words, once you’ve trained a neural network to work out what’s right and wrong, it can be used to solve problems that it doesn’t already know the answer to. “In the training phase of a neural network, we provide data with the right answer… because we know what is the expected sound. We allow the neural network to play with that data until we are happy with the expected answer,” says Riviera.

“Once we’re ready to go, because we think the guitar is playing well, so the neural network is actually giving the expected answer or the error is very close to zero, it’s time to take that brain and put it in a camera, or to take decisions in a bank system to tell us that it’s a fraud behaviour.”
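
Riviera’s guitar-tuning analogy maps directly onto a training loop: measure the error against a known answer, make an adjustment, and repeat until the error is near zero. A toy sketch follows, with a single tunable weight, made-up data and simple gradient descent rather than a real neural network:

```python
# Toy sketch of the train-then-infer cycle described above: repeatedly
# measure the error against known answers and nudge the 'weight' until
# the error is effectively zero - then use it on unseen data.

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # inputs with known answers (y = 3x)
w = 0.0     # our single tunable 'string'
lr = 0.05   # how hard we twist the peg each time

for _ in range(200):  # training: iterate until 'tuned'
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))       # 3.0 - the 'guitar' is tuned
print(round(w * 10, 1))  # inference: prediction for unseen input x=10
```
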

Deep learning as a concept isn’t new – indeed, the idea has been around for 40 years. What makes it so exciting now is that we finally have all the pieces in place to unlock its potential.

“We had the theory and the research papers, we had all the concepts, but we were missing two important components, which were the data to learn from and the compute power,” says Riviera. “Today, we have all of these three components – theory, data and infrastructures – but what we’re missing is the fourth pillar, which is creativity. We still don’t know what we can and can’t achieve with deep learning.”

Deeper learning

That’s not to say that deep learning isn’t already being put to amazingly good use.

Any regular commuter will know the sheer fist-thumping frustration of delays and cancelled trains. However, Intel technology is being used to power Resonate’s Luminate platform, which helps one British train company better manage more than 2,000 journeys per day.

Small, Intel-powered gateways are placed on the trackside, monitoring the movements of trains across the network. That is married with other critical data, such as timetables, temporary speed restrictions and logs of any faults across the network. By combining all this data and learning from past behaviour, Luminate can forecast where problems might occur on the network and allow managers to simulate revised schedules without disrupting live rail passengers. The system can also make automatic adjustments to short-term schedules, moving trains to where they are most needed.

The results have been startling. On-time arrivals have increased by 9% since the adoption of the system, with 92% of trains now running to schedule.

Perhaps just as annoying as delayed trains is arriving at the supermarket to find the product you went there for is out of stock. Once again, Intel’s deep learning technology is being used to avert this costly situation for supermarkets.

The Intel-powered Vispera ShelfSight system has cameras mounted in stores, keeping an eye on the supermarket shelves. Deep-learning algorithms are used to train the system to identify individual products and to spot empty spaces on the shelves, or even products accidentally placed in the wrong areas by staff.

Staff are alerted to shortages using mobile devices, so that shelves can be quickly restocked and lost sales are kept to a minimum. And because all that data is fed back to the cloud, sales models can be adjusted and the chances of future shortages of in-demand products are reduced.

Only the start

Yet, as Riviera said earlier, these applications of deep learning are really only the start. He relays the story of an Italian start-up that is using deep learning to create a system in which drones carry human organs from hospital to hospital, eliminating the huge disadvantages of helicopters (too costly) and ambulances (too slow) when it comes to life-critical transplants.

It’s not the only life-saving application he can see for the technology, either. “I’d like to see deep learning building an autonomous system – robots – that can go and collect plastic from the oceans,” he says. “We do have that capability, it’s just about enabling it and developing it.”

“The best [use for deep learning] is yet to be invented,” he concludes.

Discover more about data innovations at Intel.co.uk

Nine AI myths versus reality


Cloud Pro

25 Apr, 2019

Whenever artificial intelligence, aka AI, comes up in conversation, the usual image that springs to most people’s minds is a threatening killer robot along the lines of Terminator that has nothing but murderous intentions towards humanity.

But these days the AI acronym is being liberally sprinkled far and wide, often referring to things that stretch well beyond its primary meaning. In this feature, we look at some of the common myths and misconceptions, compared to the real scientific situation in each case.

1. AI will create a malevolent Skynet-style system that will destroy humanity

Let’s look at the most popular myth first – the scary robot elephant in the room. Terminator is the most well-known example, but it is a recurring sci-fi theme, from 2001: A Space Odyssey to the latest season of Star Trek: Discovery. On the one hand, technology has been automating the delivery of ordnance for decades, with in-missile video footage from the 1991 Gulf War just one watershed moment in a process towards greater autonomy that dates back to the German V1 and V2 rockets of World War II. The US Army has been deploying Unmanned Aerial Vehicles (UAVs) for decades, and now has around 10,000 of them in regular use. But all of these still have human operators for key functions. The US Defense Advanced Research Projects Agency (DARPA) has been awarding grants for the development of UAVs that can navigate themselves indoors. But as the RAND Corporation points out, very few countries use armed drones just yet, and there is much controversy about their value in warfare compared to conventional weapons systems. So even if fully autonomous fighting machines are developed, there are still many hurdles before they are deployed without human oversight, let alone take over the world.

2. AI systems and robots will eventually replace all jobs, making most people redundant

According to a report published by the UK’s Department for Work and Pensions, 8,820,545 jobs could be wiped out by 2030 because of AI, particularly in the retail sector. Aside from the strangely specific number of job losses predicted, it’s worth noting that it won’t just be menial labour that gets replaced. Complex intellectual activities, such as giving legal and medical advice, are already being replicated by expert systems. AI has been making inroads into healthcare, enabling earlier diagnosis without the need for consultation with specialists, who are always at a premium. You can even put your job title into the Will Robots Take My Job website to see how likely you are to be replaced by AI. In reality, though, similar arguments have been made since the agrarian and industrial revolutions. On the one hand, many jobs will be automated by AI; on the other, people can retrain, and young people can be educated in a different direction for the new jobs that are emerging – potentially designing and building those AI-powered robots.

3. Siri, Google Home, Cortana and Alexa are AI

Voice-activated speakers have been a Christmas hit for the last couple of years, and more people are getting used to giving their smartphones verbal commands, too. These are undeniably clever, convenient systems (when they work…), but in reality they are just advanced natural language processing (NLP) recognition algorithms, akin to dictation software like Dragon NaturallySpeaking. There is no original thought going on, just a lot of pre-programmed responses to verbal commands.
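To see what “pre-programmed responses” means in practice, here is a deliberately crude sketch – a made-up keyword table, not any vendor’s actual implementation – of how recognised text can be mapped straight to canned replies with no thinking involved:

```python
# Toy illustration of canned responses: match a keyword in the recognised
# text against a fixed table, return the pre-written reply. No reasoning.

RESPONSES = {
    "weather": "Today will be mostly cloudy with a high of 18C.",
    "time": "It is 9:41 am.",
    "music": "Playing your favourites playlist.",
}

def respond(utterance: str) -> str:
    """Return the canned response for the first keyword found."""
    words = utterance.lower().split()
    for keyword, reply in RESPONSES.items():
        if keyword in words:
            return reply
    return "Sorry, I didn't understand that."

print(respond("what's the weather like"))  # the canned weather reply
print(respond("recite a sonnet"))          # falls through to the default
```

Real assistants use far more sophisticated statistical models to map speech to an “intent”, but the final step is much the same: a fixed action or response chosen by the system’s designers.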

4. AI is a computer version of the human brain

We now get to the main underlying myth of AI – that computers model the human brain. This could be the subject of multiple PhD dissertations, but in a nutshell (and just for starters), computers are still based on the von Neumann machine model of the mid-1940s. This reads data from memory, operates on it, and writes the result back to memory. This is not how brains work. Even multi-core processors, or HPC datacentres full of them, are still much more serial in their operation than a brain, which runs at a low frequency (around 200Hz, compared to multiple gigahertz) but is massively parallel. Not only that, but its inputs and outputs are connected in complicated feedback loops whose workings we still don’t completely understand. This isn’t to say that computer AI isn’t amazingly useful, or that we won’t ever fully understand the human brain. But current AI is at best a very rough simulation, not even close to a digital facsimile of the cerebrum of Homo sapiens.
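The read-operate-write cycle can be sketched as a toy machine – a simplified illustration with a made-up two-instruction program, not a model of any real CPU:

```python
# A toy von Neumann-style machine: one serial loop that reads values
# from memory, operates on them, and writes the result back. Everything
# happens one step at a time -- unlike the brain's massive parallelism.

memory = {"a": 2, "b": 3, "result": 0}

# A tiny "program": each instruction names an operation, two source
# locations to read from, and a destination to write back to.
program = [
    ("add", "a", "b", "result"),       # result = a + b
    ("mul", "result", "a", "result"),  # result = result * a
]

for op, src1, src2, dest in program:         # fetch the next instruction
    x, y = memory[src1], memory[src2]        # read data from memory
    value = x + y if op == "add" else x * y  # operate on it
    memory[dest] = value                     # write the result back

print(memory["result"])  # 10
```

However many cores you add, each one is still running a loop of this shape; a brain has no equivalent central fetch-execute cycle at all.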

5. AI systems can learn for themselves

Another two-letter acronym often found alongside AI is ML (Machine Learning). The common myth is that ML is a fully autonomous process, which will potentially lead to AI that transcends human intelligence and eventually decides to get rid of us (see myth 1 above). However, ML still intrinsically involves teaching by humans. Every AI system needs to be fed source material chosen by people, and has its outputs adjusted by human experts until they work. This is precisely the process that Google’s self-driving car system is going through right now, and it won’t stop even when the technology gets the green light as a commercially available system in new vehicles.
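The human role is visible even in a minimal supervised-learning sketch – a toy perceptron with invented data, nothing like a production system. The examples and their labels are chosen by people; the machine merely adjusts its weights until its answers match theirs:

```python
# Toy supervised learning: a perceptron "learns" only because humans
# supplied labelled examples encoding the rule they had in mind
# (here: label 1 if x + y > 1, with hand-picked training points).

data = [
    ((0.0, 0.0), 0), ((0.2, 0.3), 0), ((0.9, 0.0), 0),
    ((1.0, 0.8), 1), ((0.7, 0.9), 1), ((1.2, 1.1), 1),
]

w1 = w2 = bias = 0.0
lr = 0.1  # learning rate

for _ in range(1000):  # repeated passes over the human-labelled examples
    for (x, y), label in data:
        prediction = 1 if w1 * x + w2 * y + bias > 0 else 0
        error = label - prediction  # the correction comes from the label
        w1 += lr * error * x
        w2 += lr * error * y
        bias += lr * error

# The model now reproduces the humans' rule on the training set.
print(all((1 if w1 * x + w2 * y + bias > 0 else 0) == label
          for (x, y), label in data))  # True
```

Without the human-chosen labels there is nothing to learn from: the “autonomy” of ML is confined to the weight adjustments, not to deciding what counts as a right answer.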

6. AI systems will be much more impartial than human beings

Because an AI’s machine learning is fed by humans, there’s no reason to believe it will be any more impervious to prejudice than the humans that taught it. Early facial recognition systems had trouble correctly identifying some ethnicities, and Microsoft’s Tay Twitter bot was rapidly turned into a rabid racist by the tone of social media conversations. On the other hand, an AI that has been trained to be as impartial as the best human examples will consistently be better in this respect than the worst humans, which is where there is clear value for the legal and medical professions, amongst others.
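How training data carries prejudice into a model can be shown with a deliberately crude sketch – invented, skewed historical records and a “model” that simply predicts the most common past outcome for each group:

```python
from collections import Counter

# Toy illustration of inherited bias: the "model" predicts whatever
# outcome was most frequent for each group in its (skewed, made-up)
# training data, so any prejudice in the history survives training.

# Hypothetical historical decisions: (group, approved?) pairs.
history = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True),
]

def train(records):
    counts = {}
    for group, outcome in records:
        counts.setdefault(group, Counter())[outcome] += 1
    # Predict the most frequent past outcome for each group.
    return {g: c.most_common(1)[0][0] for g, c in counts.items()}

model = train(history)
print(model)  # {'group_a': True, 'group_b': False} -- the skew survives
```

Real systems are vastly more complex, but the principle scales: a model optimised to reproduce historical decisions will reproduce their biases too, unless the data or the training process is deliberately corrected.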

7. AI will soon be smarter than human beings, or never will be

Because AI runs on computers that are not an exact replica of the human brain (see myth 4 above), this is a bit of a trick question. On the one hand, there is no sign that a general AI will transcend overall human intelligence anytime soon, because we still don’t know exactly what the latter is. But on the other hand, much narrower AI has been beating humans for some years now, from IBM’s Deep Blue defeating world chess champion Garry Kasparov in 1997 to IBM’s Watson surpassing the best players of Jeopardy! in 2011. In 2004, none of the vehicles in the DARPA Grand Challenge completed the course, but in 2005, five did, and now we have self-driving cars being tested on public roads. So AIs will likely be smarter than humans in many key areas (and already are in some), but may never be in a general, overall sense – whatever that even means.

8. The technological singularity is approaching, or will never happen

Related to myth 7 above, there is a theory, originally presented by maths and computer science professor Vernor Vinge, that artificial superintelligence will arrive around 2030, after which AI will upgrade itself beyond human understanding and the world will change unrecognisably. This has been dubbed “the singularity”. For the reasons already discussed, that date looks massively optimistic before we even get into the details, given our lack of understanding of the human brain. On the other hand, the singularity doesn’t necessarily have to involve completely human-like intelligence – we could be creating something that isn’t human, just loosely based on us. Nevertheless, the idea still credits computers with the ability to form their own intentions, somehow emerging spontaneously from the advanced technology. Right now, no AI can do anything other than what humans have told it to do, even if your flaky desktop PC might sometimes make you believe otherwise.

9. AI is just sci-fi and nothing that should concern business

Just as AI can create new jobs as well as replace old ones, it’s a topic that should be central to all businesses that intend to survive and grow over the coming decades, rather than being ignored. On the one hand, the scary sci-fi scenarios of human replacement are very distant, if they ever happen at all. But on the other, there are real opportunities to enhance business services and consumer interaction via the more limited systems that have already been shown to be effective. Business intelligence data analytics, forward resource planning, and automated customer service are just the beginning. Companies that embrace AI, whilst being realistic about its limitations, will be the ones that prosper.

Discover more about the AI solutions powered by Intel here