It’s time to diversify your cloud portfolio – or bank on failure

You don’t have to be a Wall Street professional to understand the reasons to invest in more than one company. So, why would a business choose a strategy of investing in a single cloud? It doesn’t make sense in today’s marketplace.

Initially, cloud providers like Microsoft Azure, AWS and Google Cloud were commodities, competing mainly on price. But today, these providers are differentiating themselves with unique value-added services and features that suit different businesses. Google, for example, offers Vision AI services, which enable retail businesses to use facial recognition to connect online and offline identities for loyalty programs. But if a company is focused on replatforming some of its legacy applications and bringing them to the cloud, it will probably want to use Microsoft Azure SQL and Azure Active Directory.

In addition to features, companies should consider their locality needs and the regulatory or service-level agreement challenges that each provider might address differently. When going multi-cloud, they should also take a “Best of Cloud” approach, leveraging the optimal feature sets from each provider that make sense for the business.

Ultimately, if a company is not moving toward building a cloud ecosystem of microservices and applications, then it will likely fall behind competitors. Businesses should consider these four principles for building a best-in-class multi-cloud strategy.

Understand the Accordion Thesis

IT is like an accordion: at any moment, there are parts of it that are expanding and parts that are contracting. The contraction is standardisation, where diverse choices are gradually squeezed down to a single standard (like the 19-inch server rack), whereas the expanding parts are the innovations, such as machine learning, chatbots and AI. On the innovation side, an organisation needs to be able to select and experiment with an infinite number of choices. The trick in managing IT is understanding where you’re standardising and where you’re innovating, and trying not to standardise on the innovation side or innovate on the standardisation side.

The hardest part for most IT organisations is the “Fat Middle,” where it’s unreasonable to try and force a single standard choice, but where a plethora of choices is just unnecessary overhead. Most healthy businesses can tolerate two to six choices within each category, for example having both SQL Server and PostgreSQL available.

Companies can use the Accordion Thesis to inform “Build versus Buy” decisions as well. It can also help companies avoid one of the greatest business risks: analysis paralysis. Do pilots and learn things, but have a bias towards building cloud-native applications or microservices that will facilitate innovation, and towards leaning on standardised services provided by the major cloud leaders.

Take control of what’s already in the cloud

Every Fortune 500 company is using the public cloud, but most are likely not using it strategically, or even on purpose. Much of this use comes from rogue lines of business, such as applications the marketing team adopted because it wanted to move fast without informing the rest of the organisation. Part of a company’s cloud strategy is taking control of the ways it’s already using the cloud. As the number of applications continues to expand geometrically, businesses find it harder and harder to understand the operational footprint of this sort of tactical ‘mix and match’ approach.

Build your own ecosystem, don’t depend on others

The more a business has built against one cloud provider, the more likely it has inadvertently embraced that provider’s -isms around things like storage, authentication layers, and access or network topologies, none of which are portable when it shifts to a multi-cloud approach. The sooner a business adopts a multi-cloud strategy, the less rework it will face to make that possible.

Once a business has identified the differentiated features and services it wants to take advantage of, it can bind them to its applications using a service broker. Imagine a service broker like a fund manager: it has access to all the different options and connects them together. This ecosystem is what will differentiate innovation, enabling companies to create additional applications that take advantage of best-in-class features from leading cloud providers.
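To make the fund-manager analogy concrete, here is a minimal sketch of the pattern in Python. Every name in it (the registry, the capabilities, the factories) is hypothetical; a production broker would follow something like the Open Service Broker API rather than this toy registry:

```python
# Toy sketch of the service-broker pattern described above.
# All names (capabilities, providers, factories) are illustrative,
# not any vendor's actual API.

class ServiceBroker:
    """Maps abstract capability names to provider-specific services."""

    def __init__(self):
        self._catalogue = {}

    def register(self, capability, provider, factory):
        # e.g. register("image-analysis", "gcp", make_vision_client)
        self._catalogue[capability] = (provider, factory)

    def bind(self, capability):
        # The application asks for a capability, not a vendor.
        provider, factory = self._catalogue[capability]
        print(f"binding '{capability}' via {provider}")
        return factory()


broker = ServiceBroker()
broker.register("image-analysis", "gcp", lambda: "vision-client")
broker.register("relational-db", "azure", lambda: "azure-sql-client")

# Application code stays the same even if the registrations above
# later point at a different provider offering the same capability.
client = broker.bind("image-analysis")
```

The point of the indirection is that swapping providers becomes a registration change, not an application rewrite.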

Don’t conflate the present with the future

Another reason to diversify a cloud portfolio is that the future is unknown. In 2000, people thought the search market was so crowded that “it was finished”, given established leaders like AltaVista, Lycos, Yahoo! and Excite. Today, there is a similar, albeit misguided, implicit assumption: “Cloud is done because the major players are here.”

AWS was there first and has tapped into a decade of pent-up demand, but it’s foolhardy to believe it will always be the market leader. When a business adopts a standard that has at least two vendors, it can protect itself to some degree against the unknown.

Of course, the engineering team may resist a multi-cloud approach because they assume there is upfront work in learning a new cloud’s commands. It may seem obvious to some, but the languages of Microsoft Azure, AWS and Google Cloud are like Romance languages: they share very similar roots. The differences between providers are minor, so this shouldn’t be a major hurdle; alternatively, a business can use software that automatically translates between these cloud infrastructure providers.
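To illustrate how similarly rooted the providers really are, here is the same operation, uploading one local file to object storage, written against each vendor’s official Python SDK (the bucket and container names and the connection string are placeholders):

```python
# Same operation, three SDKs: upload a local file to object storage.
# Bucket/container names and the connection string are placeholders.

# AWS (boto3)
import boto3
boto3.client("s3").upload_file("report.csv", "my-bucket", "report.csv")

# Azure (azure-storage-blob)
from azure.storage.blob import BlobServiceClient
svc = BlobServiceClient.from_connection_string("<connection-string>")
with open("report.csv", "rb") as f:
    svc.get_blob_client(container="my-container", blob="report.csv").upload_blob(f)

# Google Cloud (google-cloud-storage)
from google.cloud import storage
storage.Client().bucket("my-bucket").blob("report.csv").upload_from_filename("report.csv")
```

The vocabulary differs (bucket, container, blob), but the grammar, authenticate, name a destination, push a file, is the same in all three.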

There’s a reason that the top banks, auto manufacturers, telecoms and other Fortune 100 companies are all adopting multi-cloud strategies: innovation is key to their success. The speed of innovation matters more than ever, especially as the reduced cost of outsourced infrastructure has lowered competitors’ barrier to entry.


IBM and WANdisco team up on relational database tech


Clare Hopping

9 Jan, 2019

IBM and WANdisco have developed relational database technology that extends IBM Big Replicate (WANdisco Fusion) for customers requiring a hybrid cloud solution.

Although WANdisco has previously supported Hadoop, offering a SQL solution on private clouds, this is the first time the company has been in a position to also support relational databases in hybrid set-ups.

IBM Db2 Big SQL, which has been jointly developed with WANdisco, takes IBM Big Replicate one step further, offering a SQL engine for Hadoop, with the added bonus of HDFS, RDBMS, NoSQL databases, object stores and WebHDFS source support.
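To give a flavour of what such a federated engine enables, here is a hedged sketch using IBM’s ibm_db Python driver: a single SQL statement joining a Hadoop-backed table to a federated relational one. The connection details and table names are invented for illustration and are not taken from IBM’s documentation:

```python
# Illustrative only: connection details and table names are invented.
# ibm_db is IBM's Python driver for Db2-family engines, including Big SQL.
import ibm_db

conn = ibm_db.connect(
    "DATABASE=bigsql;HOSTNAME=bigsql-host;PORT=32051;"
    "PROTOCOL=TCPIP;UID=user;PWD=secret", "", ""
)

# One statement spanning two worlds: a Hive/HDFS-backed sales table
# joined against a customer table federated from a relational source.
sql = """
SELECT c.name, SUM(s.amount) AS total
FROM hadoop_sales s
JOIN federated_customers c ON c.id = s.customer_id
GROUP BY c.name
"""

stmt = ibm_db.exec_immediate(conn, sql)
row = ibm_db.fetch_assoc(stmt)
while row:
    print(row["NAME"], row["TOTAL"])
    row = ibm_db.fetch_assoc(stmt)
```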

“This co-engineered SQL solution with IBM is an exciting breakthrough for WANdisco as this is the first time that our technology has been applied to SQL data,” David Richards, chief executive officer and chairman of WANdisco said.

“This launch also represents a significant advancement of our relationship with a key partner and the scope of our addressable market in IBM’s channel. WANdisco’s unique technology presents great opportunity to collaborate with partners to address novel data requirements that previously have not been possible to meet.”

Benefits of IBM Db2 Big SQL include low latency, high security, SQL compatibility, and Enterprise Data Warehouse (EDW) and federation capabilities, giving data scientists and others responsible for processing data the tools to access data and scale their business, however many users need to access the information at once.

“Our close relationship with IBM was built upon further in 2018 with an increased royalty percentage and substantial client contracts,” Richards added. “We look forward to growing opportunities with IBM in the year ahead, leveraging our new co-engineered product to address as yet untapped data requirements.”

IBM makes leap in latest quantum computer release – but what does it all mean?

IBM thinks it has gained a head-start in the slowly emerging quantum market with the launch of what it has described as the first commercially usable integrated quantum computing system.

Quantum computing is based on the principles of quantum mechanics and aims to take advantage of subatomic particles existing in more than one state at any time. Its much-vaunted potential therefore lies in its ability to explore many possibilities at once and perform calculations we can only dream of today.
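For the technically minded, the textbook formalism behind that claim fits in one line (this is standard quantum mechanics, not anything specific to IBM’s system):

```latex
% A single qubit occupies a weighted superposition of both classical
% states; an n-qubit register carries 2^n amplitudes simultaneously,
% the source of both the exponential promise and the fragility.
\[
  |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
  \qquad |\alpha|^2 + |\beta|^2 = 1
\]
```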

It has the potential to leave current standards desperately short. For instance, according to Microsoft, while classical computers would take one billion years to break encryption such as RSA, a quantum system of the kind the company is aiming to put together in 2019 would break it in 100 seconds.

The IBM Q System One, once released, can be seen as something of a milestone. The sheer technological effort required has so far kept practically every quantum project at the laboratory stage: even minimal fluctuations in temperature, or the merest ambient noise, will put qubits awry, and as it stands qubits have a useful lifespan of only around 100 microseconds.

To help mitigate this, IBM’s design includes a nine-foot-tall, nine-foot-wide case of half-inch-thick glass forming an airtight enclosure that can open effortlessly using motor-driven rotation. The company certainly didn’t skimp on the design, enlisting multiple design studios alongside Goppion, a Milan-based manufacturer of high-end museum display cases whose ‘clients’ include the Mona Lisa and the Crown Jewels at the Tower of London.

“For the first time ever, IBM Q System One enables universal approximate superconducting quantum computers to operate beyond the confines of the research lab,” IBM trumpeted in a press release. “The IBM Q System One is a major step forward in the commercialisation of quantum computing,” said Arvind Krishna, SVP hybrid cloud and director of IBM research. “This new system is critical in expanding quantum computing beyond the walls of the research lab as we work to develop practical quantum applications for business and science.”

More than meets the eye

Don’t think you can fill out a form, send it off and then receive a shiny box from IBM in return just yet, however. The company did not reveal how much a machine would theoretically cost, nor even a provisional release date.

The Q System One is a 20-qubit computer which, in real-world terms, unfortunately does not amount to very much today. IBM reached the 20-qubit milestone towards the end of 2017, with 50 the next target. Yet to put that in perspective, James Clarke, who heads up Intel’s quantum research unit, told New Scientist at the start of this year that his team was looking to longer-term goals, saying a device will need a million qubits before it has a truly significant impact.

IEEE, the engineering standards organisation, has long taken a sceptical view of the timeframes attached to quantum initiatives. Travis S. Humble, a distinguished scientist at Oak Ridge National Laboratory who leads the IEEE working group on quantum computing, wrote a note of caution for this publication: despite this being a ‘pivotal point’ in the technology’s development, ‘many eyes [were] watching the road ahead, but [not] moving forward.’

Writing for the IEEE in November Mikhail Dyakonov, researcher in theoretical physics at the University of Montpellier, put it particularly succinctly. “A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe,” he wrote.

So are IBM, Intel, Microsoft et al therefore ploughing millions of dollars into a wild goose chase? Not necessarily. Analyst firm CCS Insight posited that those who are interested in cloud should become interested in quantum and its potential to ease the burden currently carried by silicon-based workloads. The company predicted in October that IBM would win the race to launch the first commercial quantum computing applications, putting the year at 2022.

Alternatively, let The Verge give a different perspective on the Q System One launch. “[They are] still very much experimental devices…supposed to be research tools,” wrote James Vincent, “letting us work out, qubit by qubit, how quantum devices might work at all.”

Picture credit: IBM


CFP Deadline For @CloudEXPO Silicon Valley January 31 | #Cloud #CIO #DevOps #IoT #Blockchain #MachineLearning #ArtificialIntelligence

At CloudEXPO Silicon Valley, June 24-26, 2019, Digital Transformation (DX) is a major focus, with expanded DevOpsSUMMIT and FinTechEXPO programs within the DXWorldEXPO agenda. Successful transformation requires a laser focus on being data-driven and on using every available tool that enables transformation if organizations plan to survive over the long term. A total of 88% of Fortune 500 companies from a generation ago are now out of business; only 12% survive. Similar percentages are found throughout enterprises of all sizes.


Cloud take-up growing 32% year-on-year


Clare Hopping

9 Jan, 2019

The cloud market is growing faster than ever, with vendor revenues swelling by 32% in 2018 and passing $250 billion for the first time, according to Synergy Research’s latest figures.

The fastest growth came in the IaaS and PaaS sub-sectors, each of which grew by around 50% in just 12 months.

Hybrid cloud management software grew by 41% and enterprise SaaS and public cloud infrastructure both increased by 30%. Hosted private cloud infrastructure services weren’t far behind with 29% growth.

“In 2018 cloud started to dominate IT spending in some areas, sucking up potential growth opportunities for non-cloud technologies and services,” said John Dinsdale, a chief analyst and research director at Synergy Research Group.

During 2018, there was a distinct increase in cloud infrastructure spend, which overtook the amount invested in hardware and software across public and private clouds for the first time. Total spend on hardware and software was still substantial, however, surpassing $100 billion, split evenly between public and private cloud environments.

Microsoft, Amazon/AWS, Dell EMC, IBM, Salesforce, Cisco, HPE, Adobe, and VMware revenues accounted for half of all cloud-related income.

Cloud service providers, in particular, had a very good year, generating more than $150 billion in revenues from cloud infrastructure services and enterprise SaaS via applications covering search, social networking, email, e-commerce, gaming and mobile apps.

“Cloud technologies are now generating massive revenues for both cloud service providers and technology vendors and our latest forecasts show that while market growth rates will inevitably erode due to the sheer scale of the numbers, the overall market will double in size in under four years,” Dinsdale added.
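That doubling claim is easy to sanity-check: compound growth above roughly 19% a year doubles a market within four years, so the forecast holds even if growth erodes well below 2018’s 32%. A quick check in Python:

```python
# Years for a market to double at a constant annual growth rate r:
# solve (1 + r)^t = 2  =>  t = ln 2 / ln(1 + r).
import math

for r in (0.32, 0.25, 0.19):
    years = math.log(2) / math.log(1 + r)
    print(f"{r:.0%} annual growth -> doubles in {years:.1f} years")

# 32% -> ~2.5 years; 25% -> ~3.1; 19% -> ~4.0, consistent with the
# forecast that the market doubles even as growth rates erode.
```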

SAP Keynote at @CloudEXPO New York | @SAP @SAPIndia @DilipKhandelwa #SAP #HANA #AI #Cloud #CIO #DigitalTransformation

Cloud is the motor for innovation and digital transformation. CIOs will run 25% of total application workloads in the cloud by the end of 2018, according to a recent Morgan Stanley report. Having the right enterprise cloud strategy in place, often in a multi-cloud environment, also helps companies become a more intelligent business. Companies that master this path have something in common: they create a culture of continuous innovation.

In his presentation, Dilipkumar Khandelwal outlined the latest research and steps companies can take to make innovation a daily work habit by using enterprise cloud computing. He shared examples from companies that have benefited from enterprise cloud computing and took a look into the future of how the cloud helps companies become a more intelligent business.


Andrew Keys #Blockchain Keynote at @EXPOFinTech New York | @ConsenSysAndrew #Ethereum #Bitcoin #FinTech #SmartCities

Andrew Keys is co-founder of ConsenSys Enterprise. He comes to ConsenSys Enterprise with capital markets, technology and entrepreneurial experience. Previously, he worked for UBS investment bank in equities analysis. Later, he was responsible for the creation and distribution of life settlement products to hedge funds and investment banks. Afterwards, he co-founded a revenue cycle management company, where he learned about Bitcoin and eventually Ethereum.


Jonathan Hoppe Keynote at @CloudEXPO New York | @TotalUptime #Cloud #Serverless #DataCenter #CIO #ArtificialIntelligence #DigitalTransformation

Data center, on-premise, public-cloud, private-cloud, multi-cloud, hybrid-cloud, IoT, AI, edge, SaaS, PaaS… it’s an availability, security, performance and integration nightmare even for the best of the best IT experts.

Organizations realize the tremendous benefits of everything the digital transformation has to offer. Cloud adoption rates are increasing significantly, and IT budgets are morphing to follow suit. But distributing applications and infrastructure around increases risk, introduces complexity and challenges availability at every turn.

To embrace DX and to come out on top, there are four underlying principles that should guide you. Understanding these four essentials along with their relevance and impact will elevate you to DX Hero status now. Jonathan will provide a high-level overview of these principles and how some of his organization’s clients have embraced them with resounding success.


1&1 HiDrive Business review: An affordable but sloppy offering


Dave Mitchell

8 Jan, 2019

An affordable cloud service for small businesses but let down by limited features and weak online help

Price: £10 exc VAT

1&1 Internet has decided the time is right to provide its own hosted file-sharing service, and it gets the ball rolling with a tempting proposition. The HiDrive Business plan on review starts at only £10 per month on a one-year contract and supports five users and a total of 1TB of shared cloud storage.

Deployment is swift, as the HiDrive account holder has admin privileges for adding team members from the cloud portal. From the My Team page, you can invite new users by entering their email address and deciding whether to dish out admin privileges and access to shared resources.

The WebDAV protocol is enabled for each user so they can mount their HiDrive cloud storage as a network drive. You’ll need this for macOS and Linux clients as although they can use their personal web portal to access files, HiDrive doesn’t provide desktop apps for these operating systems.
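For scripted access on those platforms, the same WebDAV endpoint can also be driven from code. Below is a minimal sketch using Python’s requests library; the endpoint URL and credentials are placeholders rather than 1&1’s documented values:

```python
# Minimal WebDAV directory listing over HTTPS using requests.
# The endpoint URL and credentials are placeholders, not 1&1's
# documented address.
import requests

WEBDAV_URL = "https://webdav.example-hidrive.com/users/alice/"

resp = requests.request(
    "PROPFIND",                      # WebDAV method for listing properties
    WEBDAV_URL,
    auth=("alice", "password"),      # HiDrive account credentials
    headers={"Depth": "1"},          # immediate children only
)
resp.raise_for_status()
print(resp.status_code)              # 207 Multi-Status on success
print(resp.text[:500])               # XML listing of files and folders
```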

The Windows app handles real-time syncing and provides right-click menu options to open your HiDrive folder in Explorer. However, the app’s web portal link only takes you to the main 1&1 website, where you have to log in again, while its help link directs you to a non-existent web page.

Every user gets a public folder and if they choose to sync this, it becomes available to other team members that have this option enabled. This makes a great general shared repository as any user can drop files into it from their desktop or cloud portal and make them instantly available to the whole team.

You can control access to the public folder for selected users from the admin portal. Their profiles can be edited to block access to this folder or you can permit read-only or read/write access.

The desktop app can sort out WebDAV mappings for you. From the HiDrive Drive panel in the app settings, just choose a drive letter, enter your account password and it’ll map a new drive with direct access to the public folder and your personal cloud storage.

Folders and files can be securely sent to anyone by selecting them from the portal or Windows Explorer and entering an email address in the pop-up window. Password protection, download limits and expiry dates can be applied to the link but the basic spelling mistake in the pre-configured email message body won’t impress potential clients.

The same process is used to invite recipients without a HiDrive account to upload files to your share. This time, you enable write access prior to sending the email and at any time, you can revoke the folder share.

A right-click save feature in Windows Explorer allows selected local folders to be copied over to your cloud storage. These initially appear in the user’s personal cloud folder and can be moved to the public folder if required.

HiDrive provides a simple backup service where you can create a schedule from the admin portal that secures all your team’s cloud data. You can run backups as often as every four hours, retain them for up to a year and restore selected backups as copies.

If you want to run HiDrive backups directly from the desktop app or from iOS and Android mobile devices, you must upgrade to the Pro version. You can back up NAS appliances to the cloud, but you’ll also need the protocol pack upgrade, which adds CIFS/SMB, Rsync and SFTP/FTP support.

HiDrive Business is a simple cloud syncing and sharing option well suited to small businesses, but 1&1’s lack of attention to detail doesn’t inspire confidence. The poor online help and sloppy presentation take the shine off what is otherwise an affordable solution.