AI, Competition and Balloons | @ExpoDX #AI #IoT #IIoT #DigitalTransformation

W. Edwards Deming taught that quality is achieved by measuring as much as possible and reducing variation, and that variation is reduced by improving the system, not just its pieces. Japan widely adopted Deming’s philosophies in the 1950s and became the second-largest economy in the world. Quality improvement didn’t decrease jobs in Japan; it increased them.
AI now has the ability to expand and codify Deming’s philosophies – to take them to the next level. AI can improve and standardize decision making based on logic, rather than the fear of missing objectives, bonuses or losing one’s job. It can continuously monitor for quality against specifications by analyzing streams of real-time data coming from embedded sensors connected to the IIoT, IoT and IoA (internet of agriculture). This means companies that are aggressive early adopters of these digital technologies will have more knowledge, higher quality and significant competitive advantages, which translate into more demand for their products and growth across sales, customer service, manufacturing, distribution and beyond. It also means aggressive adopters will likely generate more jobs.


10 ways machine learning is revolutionizing marketing

  • 84% of marketing organizations are implementing or expanding AI and machine learning in 2018.
  • 75% of enterprises using AI and machine learning enhance customer satisfaction by more than 10%.
  • 3 in 4 organizations implementing AI and machine learning increase sales of new products and services by more than 10% according to Capgemini.

Measuring marketing’s many contributions to revenue growth is becoming more accurate and real-time thanks to analytics and machine learning. Knowing what’s driving more Marketing Qualified Leads (MQLs) and Sales Qualified Leads (SQLs), how best to optimize marketing campaigns, and how to improve the precision and profitability of pricing are just a few of the many areas where machine learning is revolutionizing marketing.

The best marketers are using machine learning to understand, anticipate and act on the problems their sales prospects are trying to solve, faster and with more clarity than any competitor. The insight to tailor content while qualifying leads for sales to close quickly is being fueled by machine learning-based apps capable of learning what’s most effective for each prospect and customer. Machine learning is taking contextual content, marketing automation (including cross-channel marketing campaigns and lead scoring), personalization, and sales forecasting to a new level of accuracy and speed.

The strongest marketing departments rely on a robust set of analytics and Key Performance Indicators (KPIs) to measure their progress towards revenue and customer growth goals. With machine learning, marketing departments will be able to deliver even more significant contributions to revenue growth, strengthening customer relationships in the process.

The following are 10 ways machine learning is revolutionizing marketing today and in the future:

57% of enterprise executives believe the most significant growth benefit of AI and machine learning will be improving customer experiences and support

44% believe that AI and machine learning will provide the ability to improve on existing products and services. Marketing departments, and the Chief Marketing Officers (CMOs) running them, are the leaders devising and launching new strategies to deliver excellent customer experiences, and are among the earliest adopters of machine learning. Orchestrating every aspect of attracting, selling and serving customers is being improved by marketers using machine learning apps to more accurately predict outcomes. Source: Artificial Intelligence: What’s Possible for Enterprises In 2017 (PDF, 16 pp., no opt-in), Forrester, by Mike Gualtieri, November 1, 2016. Courtesy of The Stack.

58% of enterprises are tackling the most challenging marketing problems with AI and machine learning first, prioritizing personalized customer care and new product development

These “need to do” marketing areas have the highest complexity and the highest benefit. Marketers haven’t been putting as much emphasis on the “must do” areas of high benefit and low complexity, according to Capgemini’s analysis. These application areas include chatbots and virtual assistants, reducing revenue churn, facial recognition, and product and services recommendations. Source: Turning AI into concrete value: the successful implementers’ toolkit, Capgemini Consulting, 2017 (PDF, 28 pp., no opt-in).

By 2020, real-time personalized advertising across digital platforms and optimized message targeting accuracy, context and precision will accelerate

The combined effect of these marketing technology improvements will increase sales effectiveness in retail and B2C-based channels. Sales Qualified Lead (SQL) lead generation will also increase, potentially reducing sales cycles and increasing win rates. Source: Can Machines be Creative? How Technology is Transforming Marketing Personalization and Relevance, IDC White Paper Sponsored by Gerry Brown, July 2017.

Analyze and significantly reduce customer churn using machine learning to streamline risk prediction and intervention models

Instead of relying on expensive and time-consuming approaches to minimize customer churn, telecommunications companies and those in other high-churn industries are turning to machine learning. A risk model determines how actions aimed at averting churn affect churn probability, while an intervention model allows marketers to weigh how the level of intervention could affect both the probability of churn and the customer’s lifetime value (CLV). Source: Analyzing Customer Churn by using Azure Machine Learning.
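
A minimal sketch of the risk-model half of this approach, assuming a hypothetical customer table and thresholds (scikit-learn’s logistic regression stands in for whatever model the Azure solution actually uses):

```python
# Hedged churn-risk sketch: fit a probability-of-churn model,
# then rank customers for intervention. Column names are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("customers.csv")  # assumed columns: tenure_months, support_calls, monthly_spend, churned
X = df[["tenure_months", "support_calls", "monthly_spend"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
risk_model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", risk_model.score(X_test, y_test))

# The intervention model then weighs each customer's churn risk against
# expected CLV to decide whether a retention offer is worth its cost.
df["churn_risk"] = risk_model.predict_proba(X)[:, 1]
targets = df[df["churn_risk"] > 0.7].sort_values("churn_risk", ascending=False)
```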

Price optimization and price elasticity are growing beyond industries with limited inventories including airlines and hotels, proliferating into manufacturing and services

All marketers are increasingly relying on machine learning to define more competitive, contextually relevant pricing. Machine learning apps are scaling price optimization beyond airlines, hotels, and events to encompass product and services pricing scenarios. Machine learning is being used today to determine pricing elasticity for each product, factoring in channel segment, customer segment, sales period and the product’s position in an overall product line pricing strategy. An example is Microsoft Azure’s Interactive Pricing Analytics Pre-Configured Solution (PCS). Source: Azure Cortana Interactive Pricing Analytics Pre-Configured Solution.
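
To make per-product elasticity concrete, here is a minimal sketch (not the Azure PCS itself; the input columns are assumptions). Elasticity is taken as the slope of a log-log regression of units sold on price, so a value well below -1 flags price-sensitive demand:

```python
# Hedged price-elasticity sketch: slope of log(units) on log(price),
# estimated separately for each product. Column names are assumptions.
import numpy as np
import pandas as pd

sales = pd.read_csv("sales_history.csv")  # assumed columns: product_id, price, units_sold

def elasticity(group: pd.DataFrame) -> float:
    # Least-squares slope of log(units_sold) against log(price).
    slope, _intercept = np.polyfit(np.log(group["price"]), np.log(group["units_sold"]), 1)
    return slope

# Elasticity near -1 is roughly revenue-neutral; well below -1 means
# demand for that product is highly price-sensitive.
by_product = sales.groupby("product_id").apply(elasticity)
print(by_product.sort_values().head())
```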

Improving demand forecasting, assortment efficiency and pricing in retail marketing have the potential to deliver a 2% improvement in Earnings Before Interest & Taxes (EBIT), 20% stock reduction and 2 million fewer product returns a year

In Consumer Packaged Goods (CPG) and retail marketing organizations, there’s significant potential for AI and machine learning to improve the entire value chain’s performance. McKinsey found that a concerted approach to applying AI and machine learning across a retailer’s value chain has the potential to deliver a 50% improvement in assortment efficiency and a 30% increase in online sales using dynamic pricing. Source: Artificial Intelligence: The Next Frontier? McKinsey Global Institute (PDF, 80 pp., no opt-in).

Creating and fine-tuning propensity models that guide cross-sell and up-sell strategies by product line, customer segment, and persona

It’s common to find data-driven marketers building and using propensity models to define the products and services with the highest probability of being purchased. Too often, propensity models are based on imported data and built in Microsoft Excel, making their ongoing use time-consuming. Machine learning is streamlining the creation, fine-tuning and revenue contributions of up-sell and cross-sell strategies by automating the entire process, as the sketch below illustrates.
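
The sketch below uses a hypothetical CRM export, assumed column names, and gradient boosting chosen arbitrarily for illustration:

```python
# Hedged propensity sketch: probability each customer buys from a given
# product line, used to rank cross-sell and up-sell targets.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

data = pd.read_csv("crm_export.csv")  # assumed columns: segment, persona, past_purchases, opened_campaign, bought_line_x
features = pd.get_dummies(data[["segment", "persona", "past_purchases", "opened_campaign"]])
model = GradientBoostingClassifier().fit(features, data["bought_line_x"])

# Re-scoring becomes a re-run of the script rather than a manual
# spreadsheet rebuild every time new data is imported.
data["propensity"] = model.predict_proba(features)[:, 1]
upsell_targets = data.sort_values("propensity", ascending=False).head(100)
```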

Lead scoring accuracy is improving, leading to increased sales that are traceable back to initial marketing campaigns and sales strategies

By using machine learning to further qualify customer and prospect lists with relevant data from the web, predictive models can better identify ideal customer profiles. Each sales lead’s predictive score becomes a better predictor of potential new sales, helping sales prioritize time, sales efforts and selling strategies. Two slides from a webinar Mintigo hosted with SiriusDecisions and Sales Hacker give a fascinating look at how machine learning is improving sales effectiveness. Source: Give Your SDRs An Unfair Advantage with Predictive (webinar slides on Slideshare).
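
A hedged sketch of the idea behind predictive lead scoring; the enrichment fields and model choice below are illustrative assumptions, not Mintigo’s actual pipeline:

```python
# Hedged lead-scoring sketch: learn an ideal customer profile from
# historical won/lost outcomes, then rank new leads for sales.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

history = pd.read_csv("closed_leads.csv")  # assumed columns: employees, web_visits, industry_tech, won
new_leads = pd.read_csv("new_leads.csv")   # same feature columns, outcome unknown

features = ["employees", "web_visits", "industry_tech"]
scorer = RandomForestClassifier(n_estimators=200, random_state=0)
scorer.fit(history[features], history["won"])

# A higher score means the lead looks more like past wins, so sales
# can prioritize its time and selling strategy accordingly.
new_leads["score"] = scorer.predict_proba(new_leads[features])[:, 1]
ranked_for_sales = new_leads.sort_values("score", ascending=False)
```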

Identifying and defining the sales projections of specific customer segments and microsegments using RFM (recency, frequency and monetary) modeling within machine learning apps is becoming pervasive

Using RFM analysis as part of a machine learning initiative can provide accurate definitions of the best customers, most loyal, biggest spenders, almost lost, lost customers and lost cheap customers.
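
A minimal RFM sketch in pandas; 1-5 quintile scoring is one common convention, and the input columns and segment readings are assumptions for illustration:

```python
# Hedged RFM sketch: score customers 1-5 on recency, frequency and
# monetary value, then read segments off the combined score.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])  # assumed columns: customer_id, order_date, amount
now = orders["order_date"].max()

rfm = orders.groupby("customer_id").agg(
    recency=("order_date", lambda d: (now - d.max()).days),
    frequency=("order_date", "count"),
    monetary=("amount", "sum"),
)

# Low recency (bought recently) is good, so its labels run 5..1.
rfm["R"] = pd.qcut(rfm["recency"], 5, labels=[5, 4, 3, 2, 1]).astype(int)
rfm["F"] = pd.qcut(rfm["frequency"].rank(method="first"), 5, labels=[1, 2, 3, 4, 5]).astype(int)
rfm["M"] = pd.qcut(rfm["monetary"], 5, labels=[1, 2, 3, 4, 5]).astype(int)

# e.g. "555" = best customers; "155" = big spenders who are almost lost.
rfm["segment"] = rfm[["R", "F", "M"]].astype(str).agg("".join, axis=1)
```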

Optimizing the marketing mix by determining which sales offers, incentive and programs are presented to which prospects through which channels is another way machine learning is revolutionizing marketing

Specific sales offers are created and supported by contextual content and incentives. These are made available to an optimization engine that uses machine learning to continually predict the best combination of marketing mix elements leading to a new sale, up-sell or cross-sell. Amazon’s product recommendation feature is an example of how an e-commerce site uses machine learning to increase up-sell, cross-sell and recommended-product revenue.
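
One simple way to picture such an optimization engine is an epsilon-greedy loop: mostly serve the offer with the best observed conversion rate, while reserving a slice of traffic for exploration. This is a toy sketch of the principle, not Amazon’s recommender:

```python
# Toy next-best-offer sketch: epsilon-greedy selection over offers,
# updating each offer's observed conversion rate as results arrive.
import random

offers = {"10pc_discount": [0, 0], "free_shipping": [0, 0], "bundle": [0, 0]}  # [conversions, impressions]
EPSILON = 0.1  # fraction of traffic reserved for trying other offers

def pick_offer() -> str:
    if random.random() < EPSILON:
        return random.choice(list(offers))  # explore
    # Exploit: highest observed conversion rate so far.
    return max(offers, key=lambda o: offers[o][0] / (offers[o][1] or 1))

def record_result(offer: str, converted: bool) -> None:
    offers[offer][1] += 1
    offers[offer][0] += int(converted)
```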

Data sources on machine learning’s impact on marketing:

4 Ways to Use Machine Learning in Marketing Automation, Medium, March 30, 2017

84 percent of B2C marketing organizations are implementing or expanding AI in 2018. Infographic. Amplero.

AI, Machine Learning, and their Application for Growth, Adelyn Zhou. SlideShare/LinkedIn. Feb. 8, 2018.

AI: The Next Generation of Marketing Driving Competitive Advantage throughout the Customer Life Cycle (PDF, 10 pp., no opt-in), Forrester, February 2017.

An Executive’s Guide to Machine Learning, McKinsey Quarterly. June 2015.

Artificial Intelligence for Marketers 2018: Finding Value beyond the Hype, eMarketer. (PDF, 20 pp., no opt-in). October 2017

Artificial Intelligence: The Next Frontier? McKinsey Global Institute (PDF, 80 pp., no opt-in)

Artificial Intelligence: The Ultimate Technological Disruption Ascends, Woodside Capital Partners. (PDF, 111 pp., no opt-in). January 2017.

AWS Announces Amazon Machine Learning Solutions Lab, Marketing Technology Insights

B2B Predictive Marketing Analytics Platforms: A Marketer’s Guide (PDF, 36 pp., no opt-in), Marketing Land Research Report.

Four Use Cases of Machine Learning in Marketing, Martech Advisor, June 28, 2018.

How Artificial Intelligence and Machine Learning Will Reshape Small Businesses, SMB Group (PDF, 8 pp., no opt-in), May 2017.

How Machine Learning Helps Sales Success (PDF, 12 pp., no opt-in) Cognizant

Inside Salesforce Einstein Artificial Intelligence A Look at Salesforce Einstein Capabilities, Use Cases and Challenges, Doug Henschen, Constellation Research, February 15, 2017

Machine Learning for Marketers (PDF, 91 pp., no opt-in) iPullRank

Machine Learning Marketing – Expert Consensus of 51 Executives and Startups, TechEmergence. May 15, 2017.

Marketing & Sales Big Data, Analytics, and the Future of Marketing & Sales, (PDF, 60 pp., no opt-in), McKinsey & Company.

Sizing the prize – What’s the real value of AI for your business and how can you capitalize? (PDF, 32 pp., no opt-in) PwC, 2017.

The New Frontier of Price Optimization, MIT Technology Review. September 07, 2017.

The Power Of Customer Context, Forrester (PDF, 20 pp., no opt-in) Carlton A. Doty, April 14, 2014. Provided courtesy of Pegasystems.

Turning AI into concrete value: the successful implementers’ toolkit, Capgemini Consulting. 2017. (PDF, 28 pp., no opt-in)

Using machine learning for insurance pricing optimization, Google Cloud Big Data and Machine Learning Blog, March 29, 2017

What Marketers Can Expect from AI in 2018, Jacob Shama. Mintigo. January 16, 2018.

Ingram Micro Cloud becomes headline sponsor for the UK Cloud Awards 2018


Cloud Pro

14 Mar, 2018

Entries are now closed for this year’s UK Cloud Awards and the judges are busy looking through a bumper crop of submissions. But that doesn’t mean things have gone quiet, as we have another exciting announcement to make…

We’re pleased to confirm that Ingram Micro Cloud has come on board as exclusive headline sponsor of the UK Cloud Awards, which are now in their fifth year of operation. 

“The UK has emerged as one of the leading players in the cloud arena, so we have a lot to shout about. We have cultivated a lot of homegrown talent and a rapidly-growing list of world-class cloud companies that are driving innovation and transforming businesses,” said Apay Obang-Oyway, director of cloud and software for the UK & Ireland at Ingram Micro.

“Channel partners, in particular, have risen to the cloud challenge and have been working hard to evolve their operating models to seize the opportunities presented by the cloud delivery model. The UK Cloud Awards offer an important opportunity to celebrate these successes and we are very pleased to be able to lend them our support.”

Brought to you by Cloud Pro in association with the Cloud Industry Forum (CIF) and now the additional support of Ingram Micro Cloud, the awards will take place on 16 May 2018 at the prestigious County Hall in London. The awards aim to bring together the great and the good of the UK cloud industry to celebrate achievements and innovation in this space. 

This year’s judging panel consists of Max Cooter, former Cloud Pro editor, Maggie Holland, our editorial director, and other industry luminaries under the guidance of judging chair Frank Bennett.  

Alex Hilton, CEO of the Cloud Industry Forum, added: “Ingram Micro Cloud is playing a pivotal role in the ongoing transformation of the channel and the wider cloud industry, enabling end users to realise the transformational benefits of cloud. We are therefore delighted that they have lent their support to what is shaping up to be the best and biggest UK Cloud Awards yet. We received a record number of entries this year, and the standard of those entries has never been higher. This is making judging a real challenge, but I look forward to celebrating with the team from Ingram and winners on the night itself!”

Why the Caribbean’s digital future depends upon the cloud

In 2018, many first-world countries are seeing the remarkable, disruptive impact of the digital economy. In the Caribbean, however, there is no consistent delivery of state-of-the-art information and communication technology (ICT) services from country to country, and the availability of online services to citizens can vary widely.

At the same time, Caribbean countries have seen firsthand the debilitating impact of natural disasters. The hurricanes of 2017 were a stark reminder of the need for a more strategic approach to improve resilient infrastructure. Citizens and economies of vulnerable small island states are at risk, as the loss of infrastructure including access to critical IT systems is a major impediment to both the efforts to coordinate post disaster relief efforts and to general economic recovery. The post-Hurricane Maria struggles of Dominica are a sobering reminder of the impact which natural disasters can have on a small country with limited resources. It takes months to recover core infrastructure, including ICT services, after these massive storms.

Today there are several hurdles to putting in place a new, common and resilient infrastructure benefiting consumers and businesses alike. For one, government IT departments own and manage most of the technology internally. Given the limited resources available to these departments, this ownership model can impede the delivery of a high-performing, “always-on” computing environment. Typically, government IT departments have cost constraints which limit their ability to maintain a state-of-the-art environment.

The availability of highly qualified IT personnel required to run sophisticated cloud computing environments is also a contributing factor to the slow growth of cloud computing within governments in the region. Much of their time and resources is spent maintaining basic infrastructure, which hampers their ability to deliver new (or improved) digital applications for citizens and businesses. Governments in the Caribbean are also playing catch-up when it comes to disaster recovery (DR) and business continuity strategies, with many IT departments lacking sophisticated backup architectures.

Call for Change by CARICOM

In 2014, the heads of government of CARICOM, consisting of 15 Caribbean nations, issued a policy directive for the creation of a single ICT space. The vision for this single ICT infrastructure serving governments, IT providers and consumers is to have common policy, legal and regulatory frameworks, a robust national and regional broadband infrastructure and secure management systems.

Cloud computing should be one of the central pillars to this strategy. The shared, distributed, on-demand architecture of the cloud mitigates many of the barriers facing government IT departments across the Caribbean:

  • Governments often lack adequate funds to purchase, provision and maintain the enabling technology for delivering modern applications and services. The ongoing need to invest in maintaining the skills of the government IT workforce is a drain on scarce resources which could be re-directed to frontline services for citizens and businesses.
  • Cloud computing promises equitable access for individual nations to an always-on, high-performing, secure infrastructure. It also ensures timely delivery of new applications and updates needed to compete in today’s economy.
  • In a single CARICOM ICT space, cloud computing infrastructure is ideal for both disaster recovery and business continuity, since systems can be configured to fail over automatically to off-island data centres on unaffected islands, restoring service availability within hours if not minutes (a toy sketch of this pattern follows the list). While this won’t cover all scenarios, it will allow businesses and government offices with backup power sources to reconnect quickly to the network and critical applications.
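
In the sketch, two hypothetical health-check endpoints stand in for the on-island and off-island data centres; the actual traffic switch (DNS, anycast or a load-balancer change) is left abstract:

```python
# Toy failover sketch: poll the on-island primary and direct traffic
# to the off-island secondary when it stops responding.
import time
import urllib.request

PRIMARY = "https://dc-primary.example/health"      # hypothetical on-island endpoint
SECONDARY = "https://dc-secondary.example/health"  # hypothetical off-island endpoint

def healthy(url: str) -> bool:
    try:
        return urllib.request.urlopen(url, timeout=5).status == 200
    except OSError:
        return False

while True:
    if not healthy(PRIMARY):
        # In practice this step would update DNS or a load balancer.
        print(f"Primary unreachable; routing traffic to {SECONDARY}")
    time.sleep(30)
```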

The role of commercial cloud providers

Realizing this vision for all Caribbean countries may take some time, but the capabilities are already available through some regional cloud providers.  These local companies are building the necessary infrastructure to support the single ICT space model, and are investing in the economy to ensure it is viable for all Caribbean nations. Although there are larger cloud service providers available internationally, such as Google, Amazon and Microsoft, these major players haven’t made the move to deploy cloud computing infrastructure in the Caribbean region. Working with global cloud providers may also prove difficult for countries where privacy laws require local data storage or where the privacy laws in the hosting provider’s country differ significantly from those in CARICOM countries.

Commercial cloud providers in the region can deliver a shared, on-demand, scalable, secure and reliable cloud computing model that eliminates the need and expense for each country to build and manage their own private clouds.

Commercial cloud providers also fill an important gap in knowledge. Many government IT departments are running virtual environments, which are not synonymous with cloud computing services, as defined by ISO and NIST. Commercial cloud service providers, by the nature of their business, require their infrastructure and processes to meet these international standards, and can be of great assistance to government IT departments which lack the necessary cloud skill sets to implement and manage a recognized private or hybrid cloud environment. They can work in collaboration with existing government IT departments to ensure a reliable and high-performing environment that better serves citizens and businesses. This collaboration will also help those with a more traditional approach to ICT understand the benefits of incorporating external expertise.

There is also the matter of urgency: how best to fulfill CARICOM’s agenda for 21st century government while also helping local businesses grow and deliver new types of services that can enhance quality of life for all people? It is unrealistic to expect Caribbean governments to make the investment in the technical platforms for ISO-standard business continuity and disaster management, to find and pay for the highly skilled technical resources required to manage increasingly sophisticated IT environments 24/7, or to optimize such an environment for the resilience, security and speed which modern applications require.

By partnering with commercial cloud service providers to deliver and maintain the underlying cloud infrastructure, government IT departments can focus on delivering and facilitating the endpoint applications and services to citizens and businesses.

Long-term benefits of a public-private partnership in the cloud

Many government IT departments fund their ICT requirements through capital expenditure (CAPEX) budgets. As most governments try to limit capital expenditure, it can be difficult for IT departments to get the necessary funding to support their environment, particularly in the event of non-planned projects. This can impede the progress of critical ICT projects.

In addition, since CAPEX budgets must plan for replacing end-of-life ICT equipment, it’s difficult to sustain critical frontline ICT services. Working with a commercial cloud provider gives governments the ability to move to an operational expenditure (OPEX) model where costs are more predictable and consistent, and provides a faster, lower-risk way to adopt cloud computing.
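
A purely illustrative comparison of the two funding models; every figure below is an assumption chosen to show the shape of the trade-off, not a quoted price:

```python
# Illustrative CAPEX-vs-OPEX arithmetic over a five-year horizon.
YEARS = 5
CAPEX_HARDWARE = 500_000        # assumed upfront purchase, refreshed at end of life
CAPEX_ANNUAL_RUNNING = 120_000  # assumed staff, power and maintenance per year
OPEX_MONTHLY_CLOUD = 9_000      # assumed all-in managed-cloud subscription

capex_total = CAPEX_HARDWARE + CAPEX_ANNUAL_RUNNING * YEARS
opex_total = OPEX_MONTHLY_CLOUD * 12 * YEARS

print(f"CAPEX model: {capex_total:,} total, with a large upfront spike and a refresh cliff")
print(f"OPEX model:  {opex_total:,} total, as a predictable {OPEX_MONTHLY_CLOUD:,}/month charge")
```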

On the business side, access to a fast, flexible cloud environment encourages digital entrepreneurship. Companies can be born overnight, through access to advanced cloud and mobile technologies. Startups can focus on business development without needing to invest in and manage IT infrastructure, dramatically reducing cost of entry. A Caribbean-based cloud service brings potential for a new tech sector across the Caribbean. One needs only to look north at the rapid development of the SaaS industry to see the potential.

Better IT services are also a win for consumers, providing the ability to participate in the public forum, share information about political and societal developments and contribute in ways that have become commonplace elsewhere, such as online donations to political campaigns. Digital government brings greater ease in completing personal business, such as registering a vehicle, obtaining a passport or paying a utility bill, with the confidence of using a secure website.

Government IT leaders, IT infrastructure experts, technology vendors and cloud service providers have much to offer by coming together in the common goal of delivering world-class digital platforms and services to the region. The fruition of a secure, cloud-based infrastructure is an important first step in building the Caribbean’s digital future.

Salesforce to buy $100m of Dropbox shares post-IPO


Clare Hopping

14 Mar, 2018

Dropbox has updated its S-1 IPO filing, saying it thinks it will be able to sell shares at between $16 and $18 per share, and it’s going to sell shares worth $100 million to Salesforce immediately after it floats.

The cloud-based file share and sync business is opening up 36,000,000 shares to raise $648 million when it starts trading on the Nasdaq exchange later this month, the filing reveals, valuing the company at between $7 billion and $8 billion when restricted stock units are also taken into consideration. That’s still below the $10 billion the company was worth in 2014 when it raised $350 million in venture funding, but it still means Dropbox is the highest-value tech IPO since Snap went public last year.

Salesforce and Dropbox have formed a pretty close relationship over the last few months, with the SaaS firm most recently announcing plans to integrate its Commerce Cloud and Marketing Cloud services with Dropbox, giving customers access to the cloud storage service. It will also mean Salesforce Quip users can access Dropbox-stored content, and that Dropbox will add support for Quip documents, so it’s no surprise Salesforce plans to buy a large chunk of the available shares.

Dropbox will sell a total of 5,882,353 shares to Salesforce at $17 per share, the midpoint of the company’s estimated price range.

“Salesforce Ventures LLC has entered into an agreement with us pursuant to which it has agreed to purchase $100,000,000 of our Class A common stock in a private placement at a price per share equal to the initial offering price,” Dropbox’s S-1 filing stated. “This transaction is contingent upon, and is scheduled to close immediately subsequent to, the closing of this offering.”


ICO Holder Named @ExpoDX Media Sponsor | @ICOHolder #FinTech #Blockchain #Bitcoin #Ethereum

DXWorldEXPO LLC announced today that ICOHOLDER has been named “Media Sponsor” of Miami Blockchain Event by FinTechEXPO. ICOHOLDER provides detailed information to help the community invest in trustworthy projects. Miami Blockchain Event by FinTechEXPO has opened its Call for Papers; the two-day event will present 20 top Blockchain experts. Speaking inquiries can be submitted by email to info@dxworldexpo.com. The event also offers sponsorship and exhibit opportunities.


Can the cloud go all the way to the edge?

There is a lot of conversation in the industry at the moment around the impact that edge computing is going to have on the cloud: specifically, whether it is going to kill the cloud or, at the very least, strain its ability to cope.

With the proliferation of the Internet of Things (IoT), and the predicted future ubiquity of sensors, the cloud and associated technologies in their current state won’t be able to cope with the sheer amount of data being generated by machines – and won’t be able to keep up with the need for speed that these devices and sensors will demand.

To put it into perspective, a driverless car generates 10GB of data every mile from details like GPS, street signs and other surroundings. In order for the vehicle to respond to that data, it needs to process information at an incredible speed, in ‘real’ real-time to ensure that it doesn’t crash and makes it to its destination. To process data at this kind of speed means it can’t rely on that data being sent back to a central cloud in a datacentre, and then transferred back to the device to take action. If the car is coming up to a stop sign, that data needs to be processed in milliseconds, and this processing will need to be done close to the device, not in the cloud. It is said that for this very reason a self-driving car will, in essence, be a mobile data centre with the ability to analyse terabytes of information in almost real time.
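
Back-of-the-envelope arithmetic makes the latency argument concrete; every number below is an illustrative assumption, not a measured figure:

```python
# Illustrative stop-sign latency budget for a car at roughly 27 m/s (60 mph).
SPEED_M_PER_S = 27
BUDGET_MS = 100        # assumed time allowed to act on the stop sign

CLOUD_RTT_MS = 60      # assumed round trip to a regional data centre
CLOUD_COMPUTE_MS = 30  # assumed processing time in the cloud
EDGE_COMPUTE_MS = 20   # assumed on-vehicle processing time

for label, total_ms in [("cloud", CLOUD_RTT_MS + CLOUD_COMPUTE_MS), ("edge", EDGE_COMPUTE_MS)]:
    metres = SPEED_M_PER_S * total_ms / 1000
    print(f"{label}: {total_ms} ms ({metres:.1f} m travelled), within budget: {total_ms <= BUDGET_MS}")
```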

In his presentation, Return to the Edge and the End of Cloud Computing, Peter Levine talks about how, in the future, we are going to be collecting ‘the world’s information’. We will have smart sensors on everything; from our ovens to our shoes, to our cars and our keys. These devices will sometimes need to speak to each other in real-time or will need to transfer data instantaneously in order to deliver recommendations about whether your soufflé is sinking, or if you need to change your stride pattern when running. This is already beginning with smart watches and home temperature control devices, but it is easy to see how this industry will continue to boom.

The type of data that IoT devices and sensors produce won’t always be simple text-based information that can be sent back and forth quickly over a network. It will be images and videos that demand lots of processing power, without compromising on speed.

With this move to smart devices in every area of our life, many are predicting the end of the cloud as we know it, and a return to distributed computing where the processing is done closer to the edge. Many of the large online TV and film streaming services are already moving their IT systems to regional hubs to be closer to users in particular areas across the country, but in order to function optimally in this new sensor-driven world, the edge has to come much, much closer. 

As technology improves, the ability to monitor more and more physical assets will increase, which will, in turn, increase the amount of data that needs to be processed. For data to be useful in our day-to-day lives, we will want instant insights and recommendations; we will quickly lose patience with waiting ten seconds for data to be sent to the cloud, processed, analysed and sent back to our devices, much like our impatience when waiting for a website to load. As we become increasingly dependent on these devices, we will need information instantaneously, to provide guidance, recommendations and insights, and for the devices themselves to take automatic action.

But what connects everything? Technology. Now, we don’t think that tech – certainly in the sense of data centres or cloud – impacts on our trainers. But in the future, when they’re smart-connected to our Fitbits and phones, trainers will be supported by tech and data. As much as we already talk about our dependence on our phones and technology in everyday life, the world will only become more reliant on tech. From our actions to our travel, to our physical devices, to our cooking and our decision-making.

The opportunity is huge, but is the current cloud roadmap delivered by most cloud vendors going to cope in a true edge-led world?

The cloud develops alongside ingenuity: How cloud is essential to underpin new technologies

The Internet needed a means to maximise its potential – and the solution would need to be highly adaptable and accepting of a wide variety of technology. Cloud computing has evolved with the information age.

The evolution of applications for cloud computing: A history

The Allied Telecom Group describes cloud computing as an interconnected network of remote servers; specifically, Internet-hosted servers that process and manage information, as well as a place to store data. Cloud computing was initially seen as a backup solution for hard drives. However, it soon became much more than this.

Cloud computing development advanced quickly over a very short time. In 2013, mobile banking was already active in cloud computing. By 2015, the cloud was being used by about half of the US government’s agencies. This amounted to $2 billion in spending.

The cloud was no longer just for tech companies. Businesses big and small began to recognise its advantages.

The many uses of cloud computing

As the cloud branched out of the technology space, it started to show its true potential. It quickly became water-cooler jargon as companies across the US, and then throughout the world, adopted it. The business world may have been the most influential in bringing cloud computing to the masses; certainly, IT staffing benefitted from users’ lack of understanding of cloud implementation.

People knew they wanted the cloud before they knew what it was. Unfortunately, the cloud remained an elusive concept to grasp. Logically it should not be; it is in essence a network. However, most users do not see the command line transactions that IT personnel do. They use the cloud behind layers of interfaces. It is not uncommon for even an entire organisation to consider their cloud options without a systematic plan.

It's important to understand as well that the cloud can be extremely dynamic – and it helps to visualise how ingrained the cloud has become in the modern Internet.

Take New Generation Applications and w3schools as examples. New Gen Apps lists several applications for the cloud from a business perspective: scalable usage; chatbots and other communication; productivity; business processes; backup and recovery; application development; test and development; big data analytics; and social networking. Compare this with w3schools’ analysis from the viewpoint of Internet programmers and coders: file storage; photo editing; digital video; Twitter applications; anti-virus; word processing, spreadsheets and presentation software; maps; and eCommerce.

All the potential uses for the cloud are unknown

One could say that the potential applications for the cloud are limited only by the imagination of human beings. Most networks are more capable than initially thought. They pool resources together that bring about great achievements.

Cloud computing harnesses networking. It strives to perfect it. One can think of it as making communication clearer. For instance, it can unify word processing applications with multi-lingual chatboxes, so scientists from around the world can collaborate to write a book in real time. It allows people from every nation to play online games and compete within fractions of a second.

Every last potential use for cloud computing is unknown. What can be predicted, however, is that as new innovations arise, the cloud is a tool and platform essential to practical uses of new technology. As it happens, it is also a means by which old technology or ideas can interface with new ones. It reduces the need for new versions. Like a neutral zone from which all parties can draw resources, the cloud is a medium like no other.

Cloud services resemble a Swiss Army knife

After a while, the cloud begins to look like the Swiss Army knife for Internet applications. There are numerous programming languages, and yet the cloud can benefit them all. Networking is as important to business as it is to education or any congregation of people.

The cloud is a network, as well as a tool that facilitates information across networks. It is not just a chat box or library that everyone can access. It facilitates services on top of the main application. Services can draw from other applications to produce a new product quickly; this was a profound achievement that led to the plethora of mobile applications available today. In addition to its adaptability, the cloud also solved a lot of download issues. It allowed devices to run applications on distant servers – another function which helped build a lot of applications.

Cloud computing is one of those rare occurrences in which ingenuity gains a partner.

What is fog computing?


Nicole Kobie

15 Mar, 2018

The cloud is as ubiquitous in computing as it is in the skies over Britain, but experts have forecast a new meteorologically named IT architecture that could become just as important: fog computing.

What is fog computing?

Let’s cut through the haze: just like cloud computing, fog computing is an architecture for remote document storage, but rather than housing it all on one server (or one company’s servers), your files are distributed. That doesn’t mean there are copies of them on multiple servers, but that the data that makes up your files is spread widely, so no-one but you can see the entire thing.

“Our proposal is based on this idea of a service that renders information completely immaterial – in the sense that, for a given period of time, there’s no place on earth that contains information complete in its entirety,” noted the researchers, Rosario Culmone and Maria Concetta De Vivo of the University of Camerino, who submitted the idea via a paper in the International Journal of Electronic Security and Digital Forensics.

If your files are always split into smaller pieces of data, they’re less useful to hackers, thus boosting security. It also means that if local authorities want to see your files, they won’t be able to access them in their entirety, with the bits spread across multiple jurisdictions.

How does it work on a technical level?

The “fog” uses standard networking protocols in a new way, using virtual buffers in routers to send packets of your data every which way, all the time – so no file ever sits in its entire, full form on a single server at any given time.

The researchers compared it to sending a letter with a tracking device in the mail, but rather than have it delivered to one place, it bounces around from post office to post office. That would make it rather hard for a snoop or thief to find, since there’s no way of knowing if it’s in transit in a postman’s bag, or which sorting office it’s sat in. But the owner of the letter need only enable the tracking device to find it immediately.
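
As a toy illustration of the dispersal principle only (the Camerino proposal keeps fragments moving through routers’ virtual buffers rather than parked on nodes), the sketch below splits data so that no single node ever holds the complete file:

```python
# Toy fragment-dispersal sketch: no single node stores the whole file;
# only the owner, who knows the placement order, can reassemble it.
from itertools import cycle, zip_longest

def disperse(data: bytes, nodes: list[str], chunk_size: int = 1024) -> dict[str, list[bytes]]:
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    placement: dict[str, list[bytes]] = {n: [] for n in nodes}
    for chunk, node in zip(chunks, cycle(nodes)):  # round-robin placement
        placement[node].append(chunk)
    return placement

def reassemble(placement: dict[str, list[bytes]], nodes: list[str]) -> bytes:
    rows = zip_longest(*(placement[n] for n in nodes), fillvalue=b"")
    return b"".join(chunk for row in rows for chunk in row)

nodes = ["router-a", "router-b", "router-c"]
original = b"example file contents " * 100
parts = disperse(original, nodes, chunk_size=64)
assert reassemble(parts, nodes) == original  # only the owner knows the order
```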

Sounds like it could go horribly wrong

There would be bandwidth pressure if we stored our entire collections of data in such a way, but fog computing could offer an alternative to cloud computing for those who need extra secure remote storage.

Isn’t fog computing to do with IoT?

Yes and no. The decentralised storage and computation of Internet of Things data at the edges of networks, rather than in data centres, uses the same weather-themed jargon, although it’s sometimes known as “edge computing”.

When will this be available?

Sorry, the Camerino researchers offered no forecast of when to expect fog computing to be ready for use. We also don’t yet know what the next meteorological IT buzzword will be. We just hope it involves sunshine, this time.

Nutanix acquires app mapping provider Netsil in another nod to multi-cloud rise

The rise of multi-cloud has helped secure another M&A deal in the cloud arena: enterprise cloud services provider Nutanix has announced the acquisition of Netsil, an app mapping, discovery and management software provider.

Netsil, based in San Francisco, aims to give enterprises complete visibility into all of their applications and services. In the words of the company, its ‘algorithm-based and non-invasive technology helps to achieve visibility and control at scale while keeping application transparency.’

As organisations increasingly bring more cloud providers into the fold, with greater numbers of applications, complexity and confusion are often the result. The rise in microservices and containerised applications, with more companies dipping their toes into the water, exacerbates the issue. With siloed IT departments relying on legacy monitoring tools built for static environments and slowly changing application architectures, organisations simply cannot keep up – or so goes the theory.

This is the second deal announced by Nutanix this month. The start of March saw the company agree to acquire Minjar – although the company’s main product, AWS and Azure-based DevOps automation tool Botmetric, will be more familiar to readers.

“Netsil’s innovative technology offers an original approach to simple yet comprehensive application discovery and operations management across multiple cloud environments and will be a powerful addition to Nutanix,” said Sunil Potti, Nutanix chief product and development officer, in a statement. Harjot Gill, CEO and founder of Netsil, added: “Nutanix has built a very solid enterprise cloud OS platform, which, when combined with Netsil’s real-time observability, becomes even more strategic when addressing a growing microservices market.

“We are really happy to be joining a company where Netsil’s capabilities will be used to their fullest.”

Financial terms of the deal, which is subject to the satisfaction of customary closing conditions, were not disclosed.
