The use of containers by developers — and now increasingly IT operators — has grown from infatuation to deep and abiding love. But as with any long-term affair, the honeymoon soon leads to needing to live well together … and maybe even getting some relationship help along the way. And so it goes with container orchestration and automation solutions, which are rapidly emerging as the means to maintain the bliss between rapid container adoption and broad container use among multiple cloud hosts. This BriefingsDirect cloud services maturity discussion focuses on new ways to gain container orchestration, to better use serverless computing models, and employ inclusive management to keep the container love alive.
Monthly Archives: February 2018
Multi-Dimensional View of Cloud Business Data | @CloudExpo #BI #Analytics
As an enterprise, you need to quantify whether you achieved better productivity and reduced cost with cloud deployments. How does your organization’s leadership team correlate and compare which cloud vendor is more cost effective for you? Which region is proving to be more expensive than the others? Do you have tagging guidelines in place to ensure accountability for the cloud resources? The solution lies in having the right set of cloud analytics tools, such as dashboards and reports, to analyze the available data.
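As a minimal sketch of what such analysis looks like in practice, the snippet below aggregates a hypothetical tagged billing export by vendor, region and team tag. All the names and figures here are invented for illustration; real exports from cloud vendors carry far more fields.

```python
from collections import defaultdict

# Hypothetical billing export rows: (vendor, region, team_tag, monthly_cost_usd)
billing_rows = [
    ("vendor_a", "us-east", "payments", 1200.0),
    ("vendor_a", "eu-west", "payments", 1900.0),
    ("vendor_b", "us-east", "analytics", 800.0),
    ("vendor_b", "eu-west", "analytics", 950.0),
]

def cost_by(dimension_index, rows):
    """Sum monthly cost along one dimension (0=vendor, 1=region, 2=tag)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension_index]] += row[3]
    return dict(totals)

by_vendor = cost_by(0, billing_rows)  # which vendor is more cost effective?
by_region = cost_by(1, billing_rows)  # which region is proving more expensive?
by_tag = cost_by(2, billing_rows)     # tagging gives accountability per team
```

The same group-and-sum pattern underlies most cost dashboards; tagging discipline is what makes the third cut possible at all.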
AI and #DigitalTransformation | @ExpoDX #FinTech #AI #ArtificialIntelligence
Fingerspitzengefühl: a German word used to describe the ability to maintain attention to detail in an ever-changing operational and tactical environment by maintaining real-time situational awareness. The term is synonymous with the English expression “keeping one’s finger on the pulse”. The problem with fingerspitzengefühl traditionally, in addition to pronouncing it, has been that it is hard for an individual to scale up. Today that is changing. In a world of sensors, AI and mobile devices, having real-time situational awareness is far easier than ever before. In fact, today the challenge is not how to do it, but what to do with the massive volume of data that can be provided.
How IBM is focusing on containers, blockchain, machine learning and more in its cloud push
Last month, IBM put out its most recent earnings report – and one statistic stood out. For the first time in more than five years, revenue did not decline from the previous year’s quarter.
Cloud revenue was $17 billion for the year, while run rate in the company’s as-a-service offerings was more than $10bn. Speaking to analysts, James Kavanaugh, IBM senior VP and chief financial officer, said the company was ‘doing a lot of work to reposition [the] business, to help move clients to the future, investing, shifting skills and reallocating capital… in short, a lot of heavy lifting.’
Revenues reversing a longstanding downward trend point to IBM enjoying the fruits of its labour – for now. But the truth is that, from containers to blockchain to machine learning, IBM has been looking at a series of emerging technologies for some time – and it is now looking to utilise its expertise to give clients a competitive advantage.
One of the key ways the company is doing this is through Garages, spaces for companies big and small to test out these technologies and see how they affect their operations. IBM launched its tenth, in Copenhagen, earlier this week. And according to John Considine, general manager of cloud infrastructure services, it’s ‘fantastic’ to go through the problem-solving process with organisations.
“It might be our deep learning, our analytics and data, certainly our cloud platform to create a solution that’s, believe it or not, workable,” he tells CloudTech. “[It’s] a short period of time just to say this is feasible, it can be done, and you can get on this innovation and transformation pretty quickly.”
The artificial intelligence and machine learning angle is of particular note given the recent news of an extended partnership between IBM and Salesforce. Salesforce named IBM its preferred cloud provider, while the latter named the former its preferred platform for customer engagement. But the intrigue was on how the two companies would do it – namely, combining the insights of Watson and Einstein respectively.
“One of our theories leading into the cloud, for the past few years, of course is that data is enormously important for the enterprises,” says Considine, “and given more than 80% of the world’s data is still maintained behind the corporate firewall, our focus has been how do we enable the businesses to take advantage of that data, to combine it with new processing techniques, new data sets, and new capabilities.
“[It’s about] all the things associated with machine learning and deep learning, analytics and bringing all of these things together in a form that allows them to tap into those resources and deliver not only application modernisation, but really even process reinvention.”
Another partnership recently continued and extended was that of NVIDIA. The GPU and chip provider is bringing its Tesla V100 GPU to the IBM cloud. Considine explains how important this is for machine learning: training a 300 MB model – something in the region of 300 million trillion maths operations – takes hours on the accelerated hardware rather than years on a data centre full of standard processors. “That kind of acceleration is critical for us and for the market to be able to make the value of these artificial intelligence and deep learning models reachable and actually relevant,” says Considine.
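Those figures survive a back-of-envelope check. The sustained throughputs below are rough assumptions for illustration, not IBM or NVIDIA specifications; real throughput depends heavily on precision, memory bandwidth and how well the workload parallelises.

```python
TOTAL_OPS = 3e20  # "300 million trillion" operations, per the article

# Assumed sustained throughputs in operations per second (illustrative only)
CPU_CORE_OPS = 1e11      # ~100 GFLOPS for a standard server processor
V100_TENSOR_OPS = 1.2e14 # ~120 TFLOPS for a Tesla V100 in mixed precision

def train_time_seconds(total_ops, throughput, devices=1):
    """Idealised wall-clock training time, assuming perfect scaling."""
    return total_ops / (throughput * devices)

# One standard processor: on the order of a century
cpu_years = train_time_seconds(TOTAL_OPS, CPU_CORE_OPS) / (3600 * 24 * 365)

# A rack of 64 V100s: on the order of hours
gpu_hours_64 = train_time_seconds(TOTAL_OPS, V100_TENSOR_OPS, devices=64) / 3600
```

Under these assumptions a lone CPU would need roughly 95 years while 64 GPUs finish in about 11 hours, which is the ‘hours rather than years’ gap Considine describes.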
Take containers as another example. As this publication said at the start of this year, 2017 was notable for the rise of container technologies, not least because the largest cloud vendors, from Amazon Web Services (AWS), to Microsoft, and Oracle, signed up for the Cloud Native Computing Foundation (CNCF), home to the Kubernetes project. IBM – along with Google, Intel, VMware and others – was a founding member in 2015. Todd Moore, IBM vice president for open technologies, sits as the CNCF’s governing board chair.
Considine takes up the story. “A while ago – probably a year and a half ago, in cloud time a long time ago, [we] started making some significant investments in containers,” he says. “Partly because we have so many patents, so much research in the area, we really adopted the value of containers, and have been busily transforming a lot of our services to be container-based internally.
“Part of this process has been to take our learnings about how to make these transformations – and you can think of this for everything in our SaaS properties, even some of our cloud properties, we’ve really gone through a process of effectively retooling to take advantage of this containers environment,” Considine adds. “It really has been good for us internally, helping us chart the roadmap and then work with our customers to take them through that application modernisation in that container journey.”
The earnings report meant another marker in the continual ‘cloud wars’ discussion. Like a lot of things, it depends on what you call cloud. For cloud infrastructure services, AWS is the clear winner. In fact, some may argue, it won a long time ago.
As AWS passed $5 billion in quarterly revenues earlier this month, Synergy Research, a long-time tracker of cloud infrastructure figures, described the market leaders as setting ‘a fierce pace.’ According to the analyst firm’s figures, AWS has approximately the same level of market share as its four nearest competitors combined – Microsoft, IBM, Google and Alibaba – and continues to grow.
Yet there is more to cloud than infrastructure. This is the view that Bob Evans, formerly chief of communications at SAP and Oracle and now a strategic comms advisor, holds. Writing a regular column for Forbes, Evans posits that Microsoft is the real leader – taking into account infrastructure, software, platform and more – ahead of Amazon and IBM. For the most recent quarter, Evans argues that IBM was ‘a huge winner on multiple fronts’.
“The narrative – and it is dead wrong – repeated relentlessly and tirelessly by many in the media and amplified by some analysts that Amazon is the runaway winner in the cloud is baseless, sloppy and terribly misleading,” wrote Evans. “It’s time for that long-running string of extremely fake news that ‘Amazon rules the cloud’ to come to an end.”
So how does IBM see this ‘cloud wars’ narrative? Considine says he understands the issues with analysis when some companies obfuscate their definition of cloud, or what goes in which revenue bucket. Ultimately however, it all comes down to the investment in emerging technologies.
“Here’s the trick: for us, we are the enterprise cloud,” says Considine. “And this investment we’ve made in the technologies, in the underlying infrastructure… [there’s] huge growth this year in our infrastructure both in geographic and capacity expansion, but as well as new features and the rate of delivering new features.
“These are just fundamental things that pile on to deliver the capabilities we need – but these are focused in really providing solutions for the enterprises, and then helping them make this transformation.”
Bessemer sees blockchain, serverless and APIs influencing the cloud arena in 2018
Serverless computing, APIs and blockchain are all going to shape the cloud landscape in 2018 and beyond, according to the latest State of the Cloud report from Bessemer Venture Partners.
The influential annual report, the fourth of its kind, takes a look at the state of investments, IPOs and future trends in the cloud realm. The cloud IPO market recovered somewhat in 2017 compared to 2016’s slump, with Okta, Cloudera and MongoDB among the companies taking the plunge – yet still below historical averages, according to Bessemer.
The portents however are good for future entrants, as there are now almost as many private cloud unicorns – companies valued at more than $1 billion – as public. Private unicorns include Slack, DocuSign and Stripe – all companies that have scored highly on the Forbes 100 top private cloud companies ranking. Another traditionally high performer, Dropbox, has already confidentially filed for IPO this year – if reports are to be believed.
Regarding the rise of serverless computing, Bessemer argues it is being driven by three macro trends: Docker making containers more developer-friendly, more scalable APIs, and open source becoming more available and accepted. As regular readers of this publication will note, containers have been frequently in the news this year already, from Cisco launching its Kubernetes package to Red Hat acquiring CoreOS.
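As a minimal illustration of the serverless model Bessemer describes, here is an AWS Lambda-style Python handler. The event shape mimics an API Gateway request and is an assumption for the example; the point is that the developer ships only a function, while the platform provisions and scales the runtime.

```python
import json

def handler(event, context):
    """Echo a greeting for an API Gateway-style request (illustrative)."""
    # Query parameters may be absent entirely, hence the defensive default
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Locally, the handler is just a function call, which is part of what makes
# serverless code straightforward to unit test
response = handler({"queryStringParameters": {"name": "cloud"}}, None)
```

No server, load balancer or capacity plan appears anywhere in the code, which is precisely the developer-friendliness the report credits for serverless momentum.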
As far as blockchain is concerned, Bessemer cites various examples of organisations in different industries using the technology. The food industry has participants in the shape of Unilever, Dole and Nestle, while aviation has Lufthansa, Airbus and Air France, and retail has Walmart and Alibaba.
The latter can be seen as a focus for another trend the report covers: that of the cloud ‘being flat’. Alibaba – whom Synergy Research among others have placed among the leaders in cloud infrastructure – is cited alongside the likes of Huawei, Atlassian and Adyen as examples of companies outside the US pushing the envelope.
Ultimately, however, whatever technological trends 2018 will bring, it’s all about the balance sheet. As Byron Deeter, partner at Bessemer, explains, “the cloud computing revolution has changed the way we value companies and approach investment opportunities.” As the company said last year, it expects good companies to grow from $1m to $10m in annual recurring revenue (ARR) in four years, but the best companies can do it in two.
You can check out the full presentation here.
Open Source #IoT Ecosystems | @ExpoDX @DellEMC #SmartCities
Widespread fragmentation is stalling the growth of the IIoT and making it difficult for partners to work together. The number of software platforms, apps, hardware and connectivity standards is creating paralysis among businesses that are afraid of being locked into a solution.
EdgeX Foundry is unifying the community around a common IoT edge framework and an ecosystem of interoperable components.
More small businesses paying for cloud storage – but they may not be getting value for money
More than four in five small businesses (81%) say they pay for their cloud storage, representing an increase of 10% over the last three years, according to new data from B2B research firm Clutch.
The study, which polled 300 IT decision makers at US small businesses who use cloud storage, aimed to examine what small businesses need to know when looking for a storage provider, as well as how vendors can better market their tools to small businesses.
The figures show that even small businesses who got on board with cloud storage last year would be seen as laggards; 63% of respondents adopted cloud storage in 2015 or earlier.
The main benefits of cloud storage according to survey respondents are wide-ranging, from access to data, cited by 29% of those polled, to data security (20%) and ease of access in sending and receiving large files (18%). This may be small beer to many organisations, but it seems to be effective if the figures about more businesses paying for their cloud storage are anything to go by.
Organisations are most likely to pay between $51 and $250 per month, with this option cited by 38% of respondents. 23% said they paid between $251 and $1000, with 9% saying they paid more than $1000 per month.
Naturally, while security is always the number one challenge, for smaller businesses price is significant as well. The report notes how, for some small businesses, their only reason to move data to the cloud is to use it as an archive. Almost two thirds (65%) say they want to use the cloud as an archive in some capacity, with 28% of respondents saying they only plan to use cloud storage for archiving.
The report advocates cold storage offerings, such as Amazon Glacier or Google Coldline, as a cost-effective alternative. “If you are looking between a backup solution or a collaboration solution, the collaboration solution is more expensive. The backup is cheaper,” said Istvan Lam, CEO of Tresorit.
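The economics behind that advice can be sketched with some simple arithmetic. The per-GB prices below are assumed, rounded figures for illustration only – real pricing varies by vendor, region and tier, and archive tiers add retrieval fees and delays that this sketch ignores.

```python
# Illustrative per-GB monthly prices (assumptions; check current vendor pricing)
STANDARD_PER_GB = 0.023  # general-purpose object storage
COLD_PER_GB = 0.004      # archive tier such as Amazon Glacier or Google Coldline

def monthly_cost(gb, per_gb_price):
    return gb * per_gb_price

archive_gb = 2000  # 2 TB of rarely accessed backups
standard_cost = monthly_cost(archive_gb, STANDARD_PER_GB)
cold_cost = monthly_cost(archive_gb, COLD_PER_GB)
# Under these assumptions, cold storage is less than a fifth of the price
```

For the 28% of small businesses planning to use cloud storage purely for archiving, a gap of that order is the whole argument for a cold tier.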
Ultimately, however, the report advocates that cloud storage is key for small businesses, but only if utilised efficiently. “Small businesses place emphasis on cloud storage security, and the cloud’s ability to offer flexible and fast access to data. Yet small businesses may not be using cloud storage most effectively in terms of cost or brands,” the report concludes.
“Businesses should place data that is infrequently accessed in a cheaper long-term backup solution,” it adds. “Furthermore, small businesses should ensure the brand of cloud storage they use offers the reliability and security they need.”
This is the latest in a series of studies Clutch has put together on small businesses and cloud computing. As this publication reported in November, a previous study argued organisations were ‘confident’ about cloud storage security, yet many did not follow industry standards.
You can read the full report here.
Microsoft aims to snare Box, Dropbox and Google customers with new OneDrive deal
Look out Box, Dropbox and Google – Microsoft is offering free OneDrive to customers of its cloud storage rivals in an attempt to poach them.
The move was announced on an official Microsoft blog yesterday. The company will offer free OneDrive for Business to Box, Dropbox and Google customers with a couple of caveats: the organisations must not currently be OneDrive for Business or Office 365 customers, and they must make a minimum 500-user commitment. The offer is valid as of February 6 and runs until June 30 2018.
Microsoft added that the growth of OneDrive had been ‘amazing’, and that more than 350,000 organisations were now using the product. These include Accenture, aerospace and defence provider Textron, and Rackspace.
The company has added a variety of enhancements to OneDrive over the past months, including secure external file sharing without the need for a Microsoft account, a self-service file recovery solution, and improved on-demand file browsing and managing.
“OneDrive with Office 365 isn’t just a cloud storage solution,” wrote Ron Markezich, Microsoft corporate vice president. “It’s a core ingredient of the modern workplace.”
Elsewhere, a curious article appeared on Microsoft’s UK news centre with the opening line: “Microsoft has launched the UK’s most powerful cloud services.”
A bold statement indeed. The article relates to the launch of M-Series virtual machines in Azure, which are optimised for large in-memory workloads, such as SAP HANA, and can support up to 128 virtual central processing units (vCPU) and up to 3.8 tebibytes of RAM on a single VM. Microsoft claims this is the most offered by any public cloud.
M-Series VMs were made generally available in December, as well as B-Series VMs, which are aimed at lower CPU workloads such as web servers and small databases. The trick of B-Series is that it offers a slight twist on the usual pay as you go models; customers can build up credits for the predominant low CPU utilisation, which can then be used during spikes. If you have an hour or two, a full list of Azure VM sizes and capabilities is available on the company’s documentation page.
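The B-Series credit mechanism can be sketched as a toy simulation. The baseline, cap and accrual rules below are illustrative assumptions rather than Azure’s actual accounting, which tracks credits per vCPU with its own rates and limits.

```python
BASELINE = 20   # baseline CPU % the instance can always use (assumed)
CAP = 100       # burst ceiling (assumed)

def run(utilisation_trace, credits=0.0):
    """Accrue credits while demand is below baseline; spend them to burst."""
    served = []
    for demand in utilisation_trace:
        if demand <= BASELINE:
            credits += BASELINE - demand       # bank the unused baseline
            served.append(demand)
        else:
            burst = min(demand, CAP) - BASELINE
            usable = min(burst, credits)       # spend banked credits on the spike
            credits -= usable
            served.append(BASELINE + usable)
    return served, credits

# Three quiet periods bank 45 credits; the first spike spends them all,
# so the second spike is throttled back to the baseline
served, remaining = run([5, 5, 5, 80, 80])
```

The pattern rewards exactly the workloads Microsoft names – web servers and small databases that idle most of the time and spike occasionally.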
Why data scientist is the best job in America – according to Glassdoor
- Data scientist has been named the best job in America for three years running, with a median base salary of $110,000 and 4,524 job openings.
- DevOps engineer is the second-best job in 2018, paying a median base salary of $105,000 and 3,369 job openings.
- There are 29,187 software engineering jobs available today, making it the most common job by number of Glassdoor postings in the study.
These and many other fascinating insights are from Glassdoor’s 50 Best Jobs In America For 2018. The Glassdoor Report is viewable online here. Glassdoor’s annual report highlights the 50 best jobs based on each job’s overall Glassdoor Job Score. The Glassdoor Job Score is determined by weighing three key factors equally: earning potential based on median annual base salary, job satisfaction rating, and the number of job openings. Glassdoor’s 2018 report lists jobs that excel across all three dimensions of their Job Score metric. For an excellent overview of the study by Karsten Strauss of Forbes, please see his post, The Best Jobs To Apply For In 2018.
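As a sketch of how an equal-weighted score like this could be computed, consider the function below. The normalisation caps are assumptions made purely for illustration – Glassdoor does not publish its exact formula, only that the three factors are weighted equally on a 5-point scale.

```python
def job_score(salary, satisfaction, openings,
              salary_cap=150_000, openings_cap=30_000):
    """Equal-weighted blend of three factors, each scaled to a 0-5 range.
    The caps used for scaling are hypothetical, not Glassdoor's."""
    salary_component = min(salary / salary_cap, 1.0) * 5
    satisfaction_component = satisfaction  # already on a 0-5 rating scale
    openings_component = min(openings / openings_cap, 1.0) * 5
    return round((salary_component + satisfaction_component
                  + openings_component) / 3, 1)

# Data scientist's figures from the report: $110,000 median base salary,
# 4,524 openings; 4.2 is an assumed satisfaction rating for the example
data_scientist = job_score(110_000, 4.2, 4_524)
```

Equal weighting means a job cannot top the list on salary alone; it also needs satisfied employees and plenty of openings, which is why data scientist keeps winning.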
LinkedIn’s 2017 U.S. Emerging Jobs Report found that there are 9.8 times more Machine Learning Engineers working today than five years ago with 1,829 open positions listed on their site as of last month. Data science and machine learning are generating more jobs than candidates right now, making these two areas the fastest growing tech employment areas today.
Key takeaways from the study include the following:
Six analytics and data science jobs are included in Glassdoor’s 50 best jobs In America for 2018
These include Data Scientist, Analytics Manager, Database Administrator, Data Engineer, Data Analyst and Business Intelligence Developer. The complete list of the top 50 jobs is available in the report, with the analytics and data science jobs highlighted alongside software engineering, which has a record 29,817 open jobs today:
- Median base salary of the 50 best jobs in America is $91,000 with the average salary of the six analytics and data science jobs being $94,167.
- Across all six analytics and data science jobs there are 16,702 openings as of today according to Glassdoor.
- Tech jobs make up 20 of Glassdoor’s 50 Best Jobs in America for 2018, up from 14 jobs in 2017.
Source: Glassdoor Reveals the 50 Best Jobs in America for 2018
The importance of getting security on-board early in cloud projects
It goes without saying that security is a vital part – the most vital part – of cloud projects. But getting security engagement in early will make the whole process easier, from reduced risk of data loss to quicker project delivery.
That is the key finding from a new report by Hurwitz & Associates. The paper, sponsored by Lacework and titled ‘Balancing Velocity and Security in the Cloud’, polled 85 IT leaders from the Americas and Europe and examined the importance of automation in ensuring compliance and predictability in cloud projects, among other issues.
On automation, the paper offers this. “It’s no secret that there is a shortage of security professionals,” the report notes. “It is important that security offerings incorporate automation to allow security teams to address more events and to give junior analysts the ability to handle issues that are typically left for more senior analysts.”
Almost 85% of those polled said they were deploying significant cloud projects, with 35% going cloud first for all projects and 48% saying they were going for selective large projects. More than half (53%) of those polled said the most important characteristic of their cloud was that it was ‘safe and secure’, while the next most popular choice, ‘deliver new services and updates faster’, was selected by only 13% of respondents.
Only two in five (41%) of respondents agreed that their company catches every cyber attack and data breach of its cloud. That said, the researchers argue this figure may be out of kilter. “We suspect that the problem is much more serious than the level of confidence suggests,” the report noted. “It is simply not evident that security leaders are actually identifying all security threats.”
Yet organisations are attempting to improve their security operations. Almost 90% of those polled said their security and cloud operations teams were working closely together. When asked about their most important priority before evaluating a cloud solution, respondents indicated that security comes into play before a line of code is written: 45% opted for project planning, compared with 22% for requirements definition, 13% for technology selection and 10% for project review and approval.
For those who get security onside early in a cloud project, the benefits are evident. Almost all (94%) of those polled said early engagement by the security team resulted in a reduced risk of cybercrime and data loss; 42% opted for faster project delivery and others for more predictable schedule (22%) and lower project costs (18%).
What’s more, when it came to key cloud security requirements, containers were certainly on the horizon. “As containers become the backbone of cloud applications, security teams need to track, identify, and manage containers along with monitoring container-to-container traffic within the cloud, not just from and to the cloud,” the report noted.
“The high velocity and scale of public clouds are shattering everything the security industry has assumed for the past 10 years,” said Sanjay Kalra, co-founder and chief product officer at Lacework. “The acceleration of cloud adoption is now paving the way for security teams to deploy automated security solutions that naturally augment security teams’ ability to continuously validate their cloud configuration for security and maintain secure daily operations in the cloud.”
You can read the full report here (email required).