Mozilla planning revamped Thunderbird for 2019


Keumars Afifi-Sabet

3 Jan, 2019

Mozilla has announced its open source email client Thunderbird will benefit from a redesigned user interface (UI) and better Gmail support within the next year.

As part of its roadmap for 2019, the firm will grow its team by half a dozen members, from eight to 14 engineers, in order to make the service faster and more secure, and to improve the user experience (UX).

Announcing the plans in a blog post, the Firefox developer said it will build on the progress made with the release of Thunderbird 60 in August, which saw major upgrades to its core code and improvements to security and stability.

“We heard from users who upgraded and loved the improvements, and we heard from users who encountered issues with legacy add-ons or other changes that hurt their workflow,” said Thunderbird community manager Ryan Spies.

“We listened, and will continue to listen. We’re going to build upon what made Thunderbird 60 a success, and work to address the concerns of those users who experienced issues with the update.

“Hiring more staff will go a long way to having the manpower needed to build even better releases going forward.”

Mozilla will prioritise Thunderbird’s design and UX improvements, after receiving “considerable feedback” and complaints, with a primary focus on improving compatibility with Google’s Gmail.

This, specifically, will see better support for Gmail’s labels, a way to categorise messages, and improvements to how Gmail-specific features translate to the Thunderbird client.

Among the project’s engineering priorities for the new year will be looking into methods for measuring slowness, and developing fixes to specific bugs that deteriorate the user experience.

The new staff members will also be put to work re-writing parts of the core code and “working toward a multi-process Thunderbird”.

The client’s notifications and encryption settings will also benefit from an overhaul, the firm confirmed.

Thunderbird will seek to integrate its own notifications with a user’s operating system, while Mozilla will allow users to more easily secure their communications after an engineer was recently hired with a specific remit over security.

Mozilla hasn’t yet completely determined its roadmap, and wouldn’t guarantee that all changes outlined, including the UI redesign, would be available in the next Thunderbird release.

Google’s radar-based gesture sensor given the go-ahead


Bobby Hellard

3 Jan, 2019

Google has been given the green light by the FCC to push forward with a radar-based sensor that can recognise hand gestures, with the technology being pegged as a new way to control smartphones and IoT devices.

Project Soli, which is a form of sensor technology that works by emitting electromagnetic waves in a broad beam, was initially blocked due to concerns it would disrupt existing technology.

Radar beam interpreting hand gestures – courtesy of Google 

Objects within the beam scatter energy, reflecting some portion back towards a radar antenna. The reflected signals capture information about the object’s characteristics and dynamics, including size, shape, orientation, material, distance and velocity.
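As a rough sketch of the physics involved, the Doppler shift of a reflected signal encodes a target’s radial velocity. The formula below is standard radar theory rather than anything Soli-specific, and the 60 GHz carrier is an assumption (Soli is reported to operate in the millimetre-wave band), not a confirmed specification:

```python
C = 299_792_458.0  # speed of light, m/s


def doppler_velocity(freq_shift_hz, carrier_hz=60e9):
    """Radial velocity implied by a measured Doppler shift.

    For a monostatic radar, f_d = 2 * v * f_c / c, so
    v = f_d * c / (2 * f_c). The 60 GHz default carrier is an
    illustrative assumption, not a confirmed Soli parameter.
    """
    return freq_shift_hz * C / (2 * carrier_hz)


# A finger moving 0.5 m/s towards a 60 GHz radar shifts the
# return frequency by roughly 200 Hz:
shift = 2 * 0.5 * 60e9 / C
```

Gestures like a pinch produce characteristic patterns of these tiny frequency shifts over time, which is what a classifier can then be trained to recognise.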

However, it’s taken Google a number of years to get Project Soli going, as initially its radar system was unable to accurately pick up user gestures and had trouble isolating each motion. Google attributed these problems to the low power levels the smartwatch had to operate at due to FCC restrictions.

The tech giant applied for a waiver from the FCC to operate at higher power levels, a move initially protested by Facebook, which claimed higher power levels could interfere with existing technology. The dispute has since been settled by the two companies, and Google was granted the waiver after the FCC determined that Project Soli could serve the public interest and had little potential for causing harm.

The approval means that Soli can move forward and create a new way to interact with technology. Thanks to the small size of the chips, they can be fitted into wearables, smartphones and many IoT devices.

Carsten Schwesig, the design lead of Project Soli, said his team wanted to create virtual tools because they recognised that certain control actions, such as a pinch movement, can be read fairly easily.

“Imagine a button between your thumb and index finger — the button’s not there, but pressing it is a very clear action and there is a very natural haptic feedback that occurs as you perform that action,” he said.

Virtual button gif – courtesy of Google

“The hand can both embody a virtual tool and it can also be acting on the virtual tool at the same time. So if we can recognise that action, we have an interesting direction for interacting with technology.”

There is currently no indication as to when the company plans to roll out the new technology.

Microsoft 365 offers new bundles for cyber security and GDPR compliance


Keumars Afifi-Sabet

3 Jan, 2019

Microsoft has added two new security-centric subscription bundles to its Microsoft 365 enterprise suite, to be made available from 1 February 2019.

The new ‘Identity & Threat Protection’ and ‘Information Protection & Compliance’ packages are aimed at enterprise customers unable to commit to the most high-end Microsoft 365 subscriptions, but who still want to benefit from the firm’s security and compliance tools.

The former will include Microsoft Threat Protection, which comprises Azure Advanced Threat Protection (ATP), Windows Defender ATP and Office 365 ATP including Threat Intelligence. Priced at $12 per user per month, it will also offer Microsoft Cloud App Security and Azure Active Directory.

The latter package, which is more geared towards assisting customers with compliance needs such as handling requests under the European Union’s General Data Protection Regulation (GDPR), will be priced at $10 per user per month.

This package combines Office 365 Advanced Compliance and Azure Information Protection, and is designed to aid chief compliance officers with ongoing risk assessments within their organisations. The package will also automatically classify and protect sensitive data and use AI to respond to regulatory requests.

“A big driver of customer adoption of Microsoft 365 is the need for security and compliance solutions in an age of increasingly sophisticated cybersecurity threats, as well as complex information protection needs due to regulations like the GDPR,” said corporate vice president for Microsoft 365 Ron Markezich.

“As we speak to customers about the future of work, we know security and compliance are some of the highest organizational priorities and we hope these new offerings will help them achieve their security and compliance goals.”

These products already exist as part of the Microsoft 365 E5 suite, the most high-end subscription bundle available, but are being made available to existing and prospective customers on a standalone basis.

Microsoft confirmed that neither the price, nor composition, of its existing packages will change.

Why efficient multi-cloud management and DevOps requires transparency

As multi-clouds become the norm, finding and addressing wasteful cloud resources jumps to the top of the list of IT concerns. Keeping cloud management simple, timely, and accurate requires a view into your application usage that is clear and comprehensive.

Hybrid clouds give organisations the ability to get the best of both worlds: on-premises for traditional apps and resources they want to keep close at hand, and in the public cloud to realise the speed, agility, and efficiency of cloud-native applications. The challenge is to maintain the optimal balance between public and private clouds to achieve your business objectives. Doing so requires a 360-degree view of the full application lifecycle.

Companies need to evaluate multi-cloud management platforms and orchestration tools which can take the mystery out of hybrid IT by giving an up-to-date view of resource utilisation without slowing down DevOps activities. According to a recent survey of CIOs and IT managers, 37 percent of respondents identified unpredictable costs as their greatest cloud concern, topped only by security.

Sharpening your view into critical operations

According to studies conducted by ISACA Research, one out of three organisations doesn't calculate cloud computing ROI. This study identifies three "core IT activities" that must be monitored regularly and accurately:

  • Quickly spinning up new cloud environments or adapting old ones 
  • Providing the right services to the right people at the lowest cost possible
  • Keeping those user services and app stacks reliable, secure, and stable

Gaining visibility into application health is one of the top four challenges of multi-cloud management, topped only by security and performance concerns.

The benefits of visibility into all your workloads — in the cloud and on-premises — are demonstrated by an accounting application, which has peaks and valleys of activity in standard business cycles.

One of the greatest impacts of enhanced visibility into application performance and health is the ability of CIOs to partner with the business units that rely on the apps. Knowing how cloud and non-cloud resources are being used in the organisation allows CIOs to recommend specific platforms and services, keep tabs on the inevitable shadow IT projects, and have a more thorough knowledge of what the business units need. 

Bringing monitoring and logging into a consolidated view across clouds with an orchestration platform like Morpheus unlocks the ability to detect app stack outages, scale across platforms and clouds, and otherwise ensure day-2 production application tasks are first-class citizens within the deployment phase of the app lifecycle.

Too often, Biz, Dev and Ops are three separate functions. However, having a full view of multi-cloud operations and the ability to provide services to both business stakeholders and developers on demand can elevate the IT Ops team to a position of business value rather than business frustration.

An opportunity for DevOps to drive the business to new heights

The continuous integration/continuous delivery (CI/CD) nature of DevOps is taking organisations by storm. DevOps teams think in terms of application portability and velocity, which means applications are built independently of where they will live [and] move across the continuum of on-prem, private and public clouds with complete transparency to end users.

There's only one way to achieve such a level of end-user transparency in multi-cloud and hybrid cloud environments: via a single unified interface that is shared by all of the teams that touch the app life cycle. The growing popularity of multi-cloud management platforms such as Morpheus is due in large part to the increasing demand for a single, comprehensive view of diverse public cloud and private cloud services.

It must go beyond a unified interface though. Organisations using configuration management tools as part of their orchestration flow can track configuration state changes in development and then enforce an identical state of dependencies through test and production. When coupled with self-service provisioning, organisations are able to quickly tear down and refresh the entire pipeline at any time because everything is stored and managed as code.

Unified multi-cloud management lets teams execute workflows and determine the best execution venue for workloads by identifying the optimal platform based on cost, reliability, and service portfolio. The single point of control the services provide means users have new levels of order and visibility into multi-cloud environments and governance.
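The kind of venue selection described above amounts to a weighted scoring problem. The sketch below is purely illustrative: the platform names, weights and metric values are invented for the example and are not drawn from any real management product.

```python
# Toy venue selection: score each candidate platform on normalised
# cost, reliability and service-portfolio metrics, then pick the
# highest-scoring one. All names and numbers are hypothetical.
WEIGHTS = {"cost": 0.5, "reliability": 0.3, "services": 0.2}

# Each metric is normalised to 0..1, higher is better
# (for cost, higher means cheaper to run the workload there).
platforms = {
    "public-cloud-a": {"cost": 0.6, "reliability": 0.9, "services": 0.9},
    "public-cloud-b": {"cost": 0.8, "reliability": 0.8, "services": 0.7},
    "private-cloud":  {"cost": 0.9, "reliability": 0.7, "services": 0.4},
}


def score(metrics):
    """Weighted sum of a platform's normalised metrics."""
    return sum(WEIGHTS[k] * v for k, v in metrics.items())


# The "optimal platform" for this workload under these weights:
best = max(platforms, key=lambda name: score(platforms[name]))
```

A real platform would feed live cost and telemetry data into the metrics rather than static numbers, but the governance benefit is the same: the selection criteria become explicit and auditable instead of ad hoc.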


2019 will be the year cloud-native becomes the new norm


Keri Allan

8 Jan, 2019

The vast majority of businesses see cloud as a critical component of their digital transformation strategy – some 68% of businesses already have cloud-based systems in place, or are in the process of implementing them, according to technology consultancy Amido.

But more specifically, businesses are recognising the benefits of cloud-native applications: software designed specifically to run on cloud infrastructure.

In 2018, the Cloud Native Computing Foundation (CNCF), a vendor-neutral home for cloud-native projects, saw its end-user community grow to over 50 members. This includes household names such as Uber, Airbnb, Netflix, Adidas, Spotify, Mastercard and Morgan Stanley.

Cloud-native applications offer hyperscale provisioning, resilience, high availability and responsiveness, all of which help businesses operate faster and with greater flexibility. It’s therefore no real surprise that many industry experts believe 2019 is the year cloud-native will become the ‘new normal’.

The benefits of cloud-native technology

“For CIOs, cloud-native is an enabler; a transformative technology,” says Amido’s chief technology officer (CTO) Simon Evan. “They’re using it to do things they can’t do on premise. Driving this are things like AI workloads, which benefit all sectors from finance through to healthcare and retail. You can free up staff from menial tasks, improve customer experience and benefit from predictive analytics,” he highlights.

“Taking a cloud-native approach means businesses can harness the real power of the cloud to their advantage as it offers them faster responses to the changing needs of the business and the market, ensures their technology portfolio is up to date and driving innovation and improves the customer experience while increasing ROI,” adds Puja Prabhakar, senior director, Applications and Infrastructure at consultancy firm Avanade UKI.

Cloud-native technologies can often become “boring” compared to emerging apps, according to CNCF CTO/COO Chris Aniszczyk, as the tech stabilises and matures over the years. However, he argues this shouldn’t be seen as a negative.

“Boring means organisations can focus on delivering business value, rather than spending time on making the technology usable,” he explains.

Experts advise businesses to embrace these ‘boring’ technologies in 2019, particularly the installation and configuration of platform-as-a-service and container-as-a-service offerings such as Docker, OpenShift and Kubernetes.

“I expect more traction for Kubernetes as more organisations use it for distributed applications across hybrid cloud infrastructure that includes public clouds, private clouds, multiple public clouds, public clouds with on-premise environments and combinations of them all,” says Jay Lyman, principal analyst, Cloud Native and DevOps at 451 Research.

He believes that more organisations will leverage containers and microservices for not only new cloud-native applications but also increasingly those built on traditional and legacy infrastructure.

The rise of serverless in 2019

Prabhakar adds that businesses should also consider how they’re designing their full stack and backend application engineering. Specifically, she believes engineering should be focused on creating applications inherently designed for development on the cloud, such as serverless frameworks, microservices frameworks, API integration frameworks, DevOps, data stores, and machine learning.

Other cloud-native technologies set to take a front-row seat in 2019 include commercialised service mesh offerings, which, according to CNCF’s Aniszczyk, are the next frontier in making service-to-service communication safer, faster and more reliable.

“Service meshes like Linkerd are ready to be used in production deployments and can help businesses scale applications without latency or downtime. They can also be used to help secure traffic between services and applications,” he points out.

Following an explosion of interest in 2018, serverless technologies also look set to pick up momentum in 2019.

“Serverless for enterprise is a huge trend,” says Liz Rice, chair of 2018’s CloudNativeCon and KubeCon events. “We’ll see lots of discussions on how and where enterprises can apply architectures based on serverless functions and perhaps a better understanding of the cultural/DevSecOps implications of serverless functions will emerge in 2019.”

“Serverless won’t be appropriate for all classes of application, and will co-exist alongside container architectures for some time to come,” she adds.

Talent and security challenges remain

In December, a flaw allowing easy access into every single machine in a cluster via the Kubernetes API server was quickly caught and resolved, making security another hot topic. The community came together to discuss how to best solve security challenges facing the open source/cloud-native community and a number of security-related initiatives have been announced to help organisations go beyond what is natively provided by the Kubernetes platform. “And as we go into 2019, I expect we’ll continue to see more efforts crop up,” Aniszczyk says.

In response to all these trends, businesses need to invest not just in technology, but also in acquiring new talent and retraining existing staff in cloud-native methodology and technology.

“Adoption of cloud-native technology will only be held back by the lack of skills in the market,” points out Ilja Summala, CTO of Nordcloud Group.

Lyman agrees that the lack of cloud-native expertise and experience is probably the biggest challenge facing the industry. “Few organisations can find large numbers of Kubernetes and other cloud-native experts and even if they could find them, it is an expensive proposition. This is why including and training existing staff in cloud-native initiatives as much as possible will be critical moving forward.”

He also recommends that talent focuses on open source technology.

“End users have never been as participatory and influential as with Kubernetes,” he explains. “There is ample room to get involved with many open source software projects and Kubernetes Special Interest Groups (SIGs), and this is helping the community to focus more directly on the problems that companies are facing and the objectives they are trying to meet.”

However, there’s one other issue that looks set to take longer to resolve. That’s changing the culture of how we work, a big challenge for business that’s not going to be fixed overnight.

“The shift from monolithic/waterfall to agile/DevOps is more about process and organisational psychology than it is about which technologies to employ,” points out Mark Collier, COO of the OpenStack Foundation. “This has been talked about for several years and it’s not going away anytime soon. It’s the big problem that enterprises must address and it’s going to take years to get there as it’s a generational shift in philosophy.”

Reports: Netflix to hire Activision Blizzard CFO


Adam Shepherd

2 Jan, 2019

Netflix will shortly announce the hiring of Activision Blizzard’s current CFO Spencer Neumann, according to reports.

A source close to the deal has said that Neumann is slated to start at Netflix early this year, according to reports from Reuters. He will reportedly be based in Los Angeles, and will focus on production costs.

Neumann will replace outgoing CFO David Wells, who announced in August that he would be stepping down after a 14-year tenure with the company. Wells has been CFO since 2010, during which time he has helped guide the company from a minor DVD rental service to an omnipresent cultural phenomenon.

Neumann is leaving his current post at Activision Blizzard under something of a cloud. The company publicly disclosed his imminent dismissal as part of a regulatory filing earlier this week, stating only that it was “unrelated to the Company’s financial reporting or disclosure controls and procedures”. He is currently on paid leave.

His tenure as CFO at Activision Blizzard has lasted for less than two years, and the last 12 months have seen the games publisher’s stock price drop by just over 25%. In his absence, his predecessor Dennis Durkin – who is currently acting as chief corporate officer and served as CFO from 2012 to 2017 – will step back into the role.

Neumann has a strong media background and previously had a lengthy career as part of the Disney empire, having started with the company in 1992. His reported focus on production indicates that Netflix is still pursuing its goal of having 50% of its library of movies and TV shows being made in-house – a goal that has seen Netflix spend billions of dollars on creating high-profile shows such as House of Cards and Stranger Things.

Would you quit Facebook for $1,000?


Adam Shepherd

2 Jan, 2019

The average person would need to be paid more than $1,000 (~£790) in order to deactivate their Facebook account for a year, according to recent research.

The study, published in scientific journal PLOS One, found that the social network is still extremely valuable – not just in terms of its share price, but to its users as well. “Our results provide evidence that online services can provide tremendous value to society even if their contribution to GDP is minimal,” the researchers wrote.

In order to test how much Facebook’s users valued it, researchers conducted auctions in which participants submitted bids for how much it would take to get them to deactivate their accounts. To prevent strategic bidding, the auctions were conducted using the ‘second-price’ method, in which the lowest bid wins but the winner is paid the amount of the second-lowest bid, a structure that encourages participants to bid their true price.

The winning bidder was paid by the researchers in exchange for proof that their account had been deactivated, in the knowledge that the researchers would be periodically checking their account.
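The reverse second-price mechanism can be sketched in a few lines of Python. This is an illustrative model only, with invented participant names and bids; the study’s exact procedure is described in the paper:

```python
def run_reverse_second_price(bids):
    """Select the winner of a reverse second-price auction.

    bids: dict mapping participant -> amount ($) they demand in
    exchange for deactivating their account. The lowest bidder
    wins but is paid the second-lowest bid, so no participant
    gains by overstating or understating their true price.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1])
    winner = ranked[0][0]
    payment = ranked[1][1]  # second-lowest bid sets the payout
    return winner, payment


# Hypothetical annual bids in the range the study reported:
bids = {"alice": 800, "bob": 1100, "carol": 2000}
winner, payment = run_reverse_second_price(bids)
# alice wins and is paid bob's bid of $1,100
```

Because the payout is set by someone else’s bid, the dominant strategy is to bid honestly, which is why the averages the researchers report can be read as genuine valuations rather than bargaining positions.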

Two groups of college students and one group of adults were tested using these auctions, with all three groups numbering between 122 and 138 people. One group of students was asked how much it would take for them to quit Facebook for one day, for three days, and for one week, extrapolating from their answers to find out how much it would take to get them to quit for one year. The other two groups were asked for the annual figure outright.

In all three cases, the average annual figure was over $1,000, ranging as high as $2,076 for one of the student groups. Unsurprisingly, the adults proved more willing to give up the social platform, with an average bid of $1,139 for the year. Even so, this means that these users would need to be paid almost $100 per month before they’d sacrifice their Facebook accounts.

The researchers also tested a separate sample of 931 users via Amazon’s Mechanical Turk platform, which yielded an annual average of $1,921 – although the researchers did not check whether the winners had deactivated their accounts following the auction.

The news of Facebook’s enduring popularity may come as a surprise to some; the company has spent the past year mired in a series of seemingly never-ending scandals over issues including data privacy, leadership struggles, information warfare and more.

“Concerns about data privacy, such as Cambridge Analytica’s alleged problematic handling of users’ private information, which are thought to have been used to influence the 2016 United States presidential election, only underscore the value Facebook’s users must derive from the service,” the researchers stated.

“Despite the parade of negative publicity surrounding the Cambridge Analytica revelations in mid-March 2018, Facebook added 70 million users between the end of 2017 and March 31, 2018. This implies the value users derive from the social network more than offsets the privacy concerns.”

Farewell 2018, hello 2019: The last 12 months in cloud – and what’s on the horizon

2018 was yet another fascinating year when it came to cloud computing, along with the emerging technologies which complement and rely on the cloud.

As the ecosystem has matured and organisations’ confidence has increased, conversations have shifted gear accordingly. RightScale’s State of the Cloud report for 2018 back in February found that for every enterprise which hadn’t dipped its toes in one of the major cloud vendor offerings, there were at least two who had. In one of the more outlandish predictions from last year, a report from Citrix in July argued that, by 2025, the term ‘cloud’ will have gone the way of ‘internet’ et al, its ubiquity forcing a semantic sea-change.

With that in mind, what were the key trends of last year? In many ways, they were variations on previous themes. Below, CloudTech explores some of the stories which shaped 2018, and predictions from industry executives on what may happen in 2019.

The behemoths lurch ever further forward

At the end of 2018, according to industry market trackers at Synergy Research, Amazon Web Services (AWS) held just over one third of the cloud infrastructure space. Microsoft was a clear second, holding approximately 14% of the market, ahead of IBM and Google, in joint third, and Alibaba. Meanwhile, capex for data centre infrastructure among the largest players continues to skyrocket.

Again, little has changed with regard to the vendors, with those who follow these companies’ bets purely for the financials missing the bigger story. That said, another benchmark analyst report came to a different conclusion. Gartner’s 2018 Magic Quadrant for infrastructure as a service (IaaS) found room in its leaders’ section for Google, breaking a half-decade duopoly between AWS and Microsoft. Only six vendors made the final cut for inclusion – the lowest ever – showcasing the strength and saturation of the market.

AWS had another stellar year. On the revenue side, $6.7bn for its most recent quarter represented a 45% increase year on year, with profit breaking $2bn, more than half Amazon’s overall figure. On the product side, re:Invent revealed a mix of the new and old, from beefier blockchain and machine learning initiatives – more of which shortly – to a potentially game-changing hybrid cloud renewal of vows with VMware.

For Microsoft, as this publication put it, the message has barely changed, but the numbers just kept going up – a claimed 76% leap in Azure year on year according to the most recent figures. Google meanwhile had a year of mixed emotions. Growth was again strong, with CEO Sundar Pichai crowing in April of ‘significantly larger, more strategic deals’ as the enterprise strategy came more into place in 2018. Yet the departure of Google Cloud chief Diane Greene, announced in November, makes the near future an interesting one.

Blockchain, AI and getting the recipe right

As the sun set on 2017, this publication noted how blockchain and artificial intelligence had made major strides in a cloudy context. 2018 again saw notable leaps in this area. AWS, for instance, which had previously denied being interested in blockchain, changed tack in the first half of the year before adding a managed blockchain service and a quantum ledger database service at re:Invent.

In terms of AI, everyone has their eyes on the prize. Google Cloud’s initiatives in this area were impressive, with the launch of pre-packaged AI services in August particularly notable. AWS customers who signed up in 2018 included Major League Baseball and Formula 1, worth noting because of their insistence on exploring all things machine learning.

From a software perspective, one of the most interesting reports issued last year was from venture capital firm Work-Bench, which explored emerging enterprise software with rip-roaring results. Its analysis was that the behemoths were winning at AI right now, largely by hoovering up the best talent, and that if other companies were to make strides, automated machine learning (AutoML) and a focus on revamping traditional business intelligence were the answer.

On the infrastructure side, organisations’ requirement for deeper insights powered by AI and ML naturally demands a stronger network. Writing in October, regular columnist Sagar Nangare noted how Gartner was ‘emphasising the empowerment of edge-focused infrastructure.’ “Looking at a future of having all digital devices serving to end users, there was a need for such edge topology which can give cloud-like performance closer to devices and reduce the burden on network usage,” he wrote.

One small step for quantum computing – but a leap in the near future?

A trend which became more apparent from the cloud vendor perspective in 2018 was quantum computing – but there is naturally still a long way to go.

In October, this publication attended the annual predictions jamboree of analyst firm CCS Insight. The material around quantum was especially intriguing. “Those interested in cloud are going to differentiate themselves around quantum,” explained Nick McQuire, VP enterprise. “The size of customer workloads is doubling every year as more and more organisations go to cloud. We think quantum over time is going to solve a lot of this strain.”

Naturally, it’s all research and development right now and specific signposts are a little way off. In the same month, Travis Humble, distinguished scientist at Oak Ridge National Laboratory and leader of the IEEE working group for quantum computing devices, wrote on the progress being made at the sharp end.

The ever-changing goalposts for cloud security

As technologies emerge, gaps appear and the race is on between the good and the bad guys. One area which saw a serious rise in 2018 was cryptojacking, where compute resources are hijacked to mine cryptocurrency. While a report from Unit 42 in December mused that the decline in cryptocurrency markets meant cryptojacking was not as hot as previously, cloud infrastructure remains a major target.

A debate which continues to rage is around how much responsibility the biggest cloud players need to take for their customers’ security. Evidently, the messages around shared responsibility are still not getting through to all customers. In November, for instance, AWS launched extra safeguards to ensure customers’ S3 buckets don’t become misconfigured – the redesign of a key dashboard earlier in the year to include bright orange warning indicators having not proved sufficiently effective – helping to ensure a simple user error doesn’t become a major data breach.

The market for cloud access security brokers (CASBs), those vendors who sit between on- and off-premises infrastructure looking for potential mishaps and misconfigurations, has therefore grown. Gartner published a specific Magic Quadrant in November focusing on CASBs, positing that through 2023 “at least 99%” of cloud security issues will be the fault of the customer. Ultimately, as technology has matured, security practices have appeared to stand still.

2018’s major mergers and acquisitions

  • March – Salesforce acquired MuleSoft for $6.5 billion
  • July – Broadcom acquired CA Technologies for $18.9 billion
  • October – Cloudera and Hortonworks announced $5.2 billion merger deal
  • October – IBM acquired Red Hat for $34 billion in total enterprise value
  • November – SAP acquired Qualtrics for $8 billion

What will happen in 2019?

“Multi-cloud strategies reached new levels of adoption last year. But despite considerable uptake already, we expect multi-cloud’s prominence to grow further still in 2019,” said Stephan Fabel, director of product management at Canonical. “Multi-cloud is almost becoming the default cloud strategy as organisations look to avoid vendor lock-in, granting themselves greater flexibility in deploying the most relevant cloud across different departments and functions.

“Instead of being restricted to the ecosystem of one vendor, a multi-cloud approach permits organisations to deploy a mixture of cloud apps to suit their needs across the business, while at the same time technologies such as Kubernetes can be used to containerise and deploy applications across different cloud providers when necessary,” added Fabel.

Containers – a technology whose continued rise was exemplified by the IBM/Red Hat deal in October – will again solidify, but it has to be in line with security priorities. “We’ve seen this all too frequently – speed is good for business, bad for security,” said Dan Hubbard, chief product officer at Lacework. “Security isn’t given the attention it needs and containers can fall victim to loose security management.

“In 2019, we’ll see smart enterprises build containers into their overall security posture and ensure they are using the right processes and tools for development while adhering to security principles,” added Hubbard. “We can expect that to become gospel for companies who really ‘get it’ in terms of effective container strategies. They will realise that there’s no such thing as fast development without security.”

“I will call it and say that Kubernetes has officially won the battle for container orchestration,” added Lee James, CTO EMEA at Rackspace. “But this battle has left us with some interesting observations. While companies are fully embracing the benefits of containers and microservice architecture, usage of Kubernetes is still very much within software development-focused organisations.

“With AWS, Azure, Google and VMware [having] launched or launching their own Kubernetes services, it will be interesting to see whether Kubernetes lock-in starts to surface as take-up grows or whether the multi-cloud brokerage can prevail,” said James.

If 2018 emphasised anything, it was that organisations that fail to become truly data-driven will most likely fail at almost everything else. Naturally, this continued growth is on the minds of many industry execs as 2019 dawns. “Enterprises unable to leverage their data will face extinction – it’s that simple,” said Manish Sood, CEO of Reltio. “Companies that don’t evolve according to new customer expectations or develop innovative business and revenue models will become a part of these stats.”

“In 2019 organisations will discover that assuring the security of their supply chain is a lost cause,” added Steve Durbin, managing director of the Information Security Forum. “Instead, it is time to refocus on managing their key data and understanding where and how it has been shared across multiple channels and boundaries, irrespective of supply chain provider.

“This will cause many organisations to refocus on the traditional confidentiality and integrity components of the information security mix, placing an additional burden on already overstretched security departments.”

Looking at more general business strategies, Neil Thacker, CISO and DPO EMEA at Netskope, argued the need for more advanced adoption through 2019. “In 2019 it will be more cost-effective to move to the cloud for both digital and security transformation,” said Thacker. “By adopting a cloud-first mentality, businesses will be able to effectively future-proof and operate more efficiently than on-premise, legacy solutions.”

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Matt Hancock says every GP practice must offer Skype appointments by 2024


Keumars Afifi-Sabet

2 Jan, 2019

The health secretary, Matt Hancock, has called on every GP practice in England to offer digital appointments within five years as part of plans to shake up an ‘outdated’ IT setup.

With the IT market dominated by just two providers, a new GP IT Futures framework will open up the NHS to investment and encourage competition, Hancock said.

By 2023-24, every patient should be able to access GP services digitally, with practices offering online and video consultations as standard through services such as Skype or Google Hangouts. These will be offered alongside in-person appointments with clinicians.

Executing the plans will free up staff time and reduce delays by hastening the flow of data between GP practices, hospitals and social care institutions, according to the Department of Health and Social Care (DHSC).

The framework will also examine how patient data can be migrated to cloud services, and how both patients and clinicians can access this data securely in real-time.

“Too often the IT used by GPs in the NHS – like other NHS technology – is out of date. It frustrates staff and patients alike, and doesn’t work well with other NHS systems. This must change,” said Matt Hancock.

“I love the NHS and want to build it to be the most advanced health and care system in the world – so we have to develop a culture of enterprise in the health service to allow the best technology to flourish.

“I want to empower the country’s best minds to develop new solutions to make things better for patients, make things better for staff, and make our NHS the very best it can be.”

New standards, to be developed by NHS Digital, will introduce minimum technical requirements for any systems implemented, so they are able to communicate easily and securely, and be upgradeable.

The DHSC says it will seek to end existing contracts with providers which do not adhere to the minimum technical requirements.

“The next generation of IT services for primary care must give more patients easy access to all key aspects of their medical record and provide the highest quality technology for use by GPs,” said NHS Digital’s chief executive Sarah Wilkinson.

“They must also comply with our technology standards to ensure that we can integrate patient records across primary care, secondary care and social care.

“In addition, we intend to strengthen quality controls and service standards, and dramatically improve the ease with which GPs can migrate from one supplier to another.”

Hancock has dedicated a significant portion of his six-month tenure as health secretary to talking up the role technology can play in transforming the NHS, marking his start with a £500 million digital transformation pledge.

Shortly after taking up the role, he also called for the NHS to vastly expand its library of approved apps to support both patients and clinicians.

How the industry cloud computing market is gaining new momentum

As more CIOs and CTOs embrace hybrid IT infrastructure models, incorporating multi-cloud solutions, another key trend is gaining momentum. According to the latest worldwide market study by International Data Corporation (IDC), five large industry groups are expected to spend a total of $37.5 billion on industry cloud solutions in 2018.

The five industry groups are healthcare, the public sector, finance, retail and wholesale, and manufacturing. Among them, manufacturing increased its investment the most, followed by retail and wholesale.

Industry cloud market development

The overall market is expected to reach $45.4 billion in 2019 with three of the five groups growing above the market average of 21.5 percent. Healthcare provider and public sector spending are both forecast to grow below the market average, although their 2019 growth rates will be higher than those for 2018.

"IDC's latest forecast shows that industry cloud growth rates will continue to accelerate over the next three years, which is very unusual for multi-billion-dollar markets," said Frank Gens, senior vice president & chief analyst at IDC. "This growth is being driven by rapidly-digitising industries like healthcare, financial services, and manufacturing, where industry clouds are becoming the cornerstones for next-generation growth and innovation strategies."

From a geographic perspective, the United States market will make up close to three-quarters of the overall market in 2018. Most of the other regions will enjoy stronger than average market growth with Japan and China expected to grow the most year over year at 54 percent and 47 percent respectively.

These two countries are also forecast to grow at an even higher annual rate in 2019. The other regions, such as the U.S., Latin America, and the Middle East & Africa, will also outperform their 2018 growth rates.

The healthcare provider market in the U.S. is expected to pass the $10 billion mark in 2018 for the first time while the Western Europe market for healthcare industry cloud is also forecast to hit a landmark in 2018 by crossing the $1 billion mark.

Relative to all other regions, Japan can be considered a late adopter of industry cloud deployment. That said, the country is expected to fast-track past the $1 billion mark by 2022, while China is expected to reach that landmark two years earlier.

Outlook for industry cloud application growth

According to the IDC assessment, the industry cloud market continues to accelerate as cloud users demand both vertically-specific capabilities in their solutions and industry expertise from their cloud service providers.

To capture this growth, cloud vendors have increasingly shifted their horizontal capabilities to form industry cloud solutions, while industry clouds themselves have created consortiums of collaboration to drive industry innovation.

Healthcare has led industries towards this trend, but the finance, manufacturing, and retail industries have internalised the successes and failures from horizontal platforms to form their own paths.

IDC believes the industry cloud market is among the largest vertical growth opportunities for both technology vendors and professional services firms through 2025.
