Microsoft Azure is best for developers, says Forrester


Clare Hopping

16 Apr, 2018

Forrester has announced that Microsoft Azure and AWS offer customers the best PaaS experiences in its latest Wave report.

The company pitted the world’s 11 leading PaaS vendors – Alibaba, Amazon Web Services (AWS), CenturyLink, Google, IBM, Microsoft, Oracle, Pivotal Software, Red Hat, Salesforce and SAP – against each other across a range of factors, with Microsoft’s and AWS’ platforms offering the best user experience across the board.

Microsoft was the overall winner, with the analyst firm praising its range of services, such as database tools (notably Azure Cosmos DB) and integrations, as well as its strong developer support via preconfigured resources that help businesses get up and running quickly.

“Overall operational tools and features are strong; Microsoft also operates a leading data-center network and offers Azure Stack for on-premises deployment,” the firm noted. “Microsoft offers a range of AI services on Azure, but only Azure ML is distinctive.”

However, Forrester noted some drawbacks to Microsoft’s platform, such as weaker natural language processing, limitations in its function-as-a-service (FaaS) offering and a tendency to release features without proper documentation. In these areas AWS ranked better, putting its suite of PaaS services in second place.

AWS also generates three times as much revenue as Microsoft’s cloud platform and offers much more to the developer community, with more preconfigured services and service bundles putting it hot on Microsoft’s tail. 

Google came in third place, helping developers build apps quickly with its “zero-configuration infrastructure” and backing of open-source platforms such as Kubernetes and TensorFlow. It also offers a range of fully managed services, which puts it only slightly behind Microsoft and AWS.

Oracle, IBM and Salesforce were all noted as strong performers, while SAP, Alibaba, Red Hat and Pivotal were ranked as contenders. CenturyLink came in at the bottom of the leaderboard as a challenger.

IoT Workshop at @ExpoDX New York | @CHarrold303 #AI #IoT #IIoT #SmartCities #DigitalTransformation

IoT is rapidly becoming mainstream as more and more investment is made in the platforms and technology. As this movement continues to expand and gain momentum, it creates a massive wall of noise that can be difficult to sift through. Unfortunately, this makes IoT less approachable for people getting started and can hamper efforts to integrate this key technology into your own portfolio. There are so many connected products already in place today, with many hundreds more on the horizon, that as enablers and supporters we already run the risk of not being able to effectively understand and develop these complex, multi-disciplined solutions on our own. That understanding of the basics – circuits, sensors, and how those things work together and with software – is the key to being able to understand, engineer and support IoT solutions in your own environment.

Julio Villarreal Pelegrino Joins @CloudEXPO Faculty | @RedHat #CloudNative #OpenStack #DevOps #DigitalTransformation

In this presentation, you will learn first-hand what works and what doesn’t while architecting and deploying OpenStack. Some of the topics covered will include: best practices for creating repeatable deployments of OpenStack, multi-site considerations, how to customize OpenStack to integrate with your existing systems, and security best practices.

Why trust and transparency are key for companies complying with new EBA cloud guidance

New guidance from official regulators should be music to the ears of anyone involved in compliance. Clarification, reference points and approved examples make the business of compliance that much more straightforward and are generally welcomed by compliance experts. In that spirit, it was with the best intentions – to clear the pathway to cloud adoption for financial services companies – that the European Banking Authority issued the guidance with which the financial sector must comply by 1 July this year.

Still, compliance experts on both sides of the cloud service provider (CSP)/customer divide might be forgiven for scratching their heads when it comes to interpreting the new directions in a real-world scenario. 

The EBA has opted for a principles-based, technology-neutral approach to the guidance. In some ways this makes sense – technology is evolving at an astonishing rate and being too prescriptive could risk limiting the ability to make the most of the next exciting innovation. However, I feel that financial services companies require more prescriptive standards, certifications and best-practice examples to provide greater clarity and help them unlock the benefits of cloud computing. As a cloud compliance specialist, here is my take on some of the key elements of the EBA guidance and how financial companies and CSPs will need to work together to comply with its principles.

Third-party oversight offers verifiable, auditable trust

The guidance requires that financial organisations seek full understanding of the risks associated with their cloud outsourcing operations and the level of data and system security that CSPs will deliver. Therefore, the initial priority for a financial organisation is to establish that its cloud service provider – or prospective provider – has identified and is operating their risk, security and personal information management systems to a standard that will satisfy the guidance. This is not a small hurdle: the guidance does not specify which of the available standards is acceptable so there is a degree of subjectivity involved in deciding what constitutes a sufficiently rigorous approach. This will likely lead to a longer due diligence and discovery phase. Organisations should look for CSPs that are ISO 27001-certified for information security management as a minimum, but for cloud-specific aspects of security, the Cloud Security Alliance (CSA) Star certification programme provides auditable ongoing assurance that the provider is meeting and sustaining the highest standards.

When it comes to personal information security, the forthcoming EU General Data Protection Regulation (GDPR) has prompted some CSPs leading the market in cloud compliance to certify to BS 10012:2017, which ensures they are operating best-practice systems for data protection under the GDPR and should meet the level of assurance required by the guidance.

Third-party oversight and validation from certifications such as CSA Star and BS 10012:2017, plus transparency into the policies and processes of the cloud provider, give financial institutions deep insight into the operations and procedures of their cloud partners. The key mantra here should be verifiable trust and transparency.

It’s important to note, also, that standards continue to evolve alongside the environment they relate to, and CSPs have to work continuously to maintain certification. Including references in the EBA’s guidelines to industry standards like the ones I’ve mentioned above would provide useful signposts towards the route that financial organisations should take to achieve compliance.

Best practice SLA and monitoring relationships

The ability to continuously monitor the security and risk of cloud service provision is a key requirement of the guidance and will be critical to the success and compliance of the cloud outsourcing relationship. To achieve this, it’s vital that the CSP’s and the financial organisation’s risk and monitoring programmes are aligned. If risk and monitoring programmes have to be deciphered and translated between entities, confusion and disconnects will arise. Again, standards offer a solution: if both entities are aligned to ISO 27001, there is a common approach on which to build an effective monitoring strategy.

A best-practice service level agreement and monitoring relationship should be established at executive level within both organisations, reflecting its importance to both parties. A strong and transparent working partnership between the risk and compliance teams on both sides should underpin the regular cycle of audit, reporting and assurance. Look for a cloud service provider that gives visibility into your cloud resources and the associated security settings and compliance postures, as well as a straightforward means of getting the reporting you need for auditing purposes.

Chain outsourcing: Overcoming the financial sector’s Achilles’ heel

Outsourcing of any kind has historically been a major challenge in the financial sector and is strictly regulated. In recognition of the flexible and collaborative nature of cloud service providers, the new guidance sets out the terms and processes under which chain outsourcing – a cloud provider outsourcing an element of its provision to a third party – is acceptable. As with most aspects of the guidance, strong emphasis is placed on ongoing risk management and transparency between the CSP and the financial organisation. CSPs must agree to notify the financial institution should they subcontract an element of their service to another provider, and must ensure that the subcontracted company meets the same standards set out in the original agreement between the CSP and its customer. Consent from the financial institution is not required, however, as this is deemed impractical. It is the responsibility of the financial organisation to determine whether the third-party outsourcing arrangement now constitutes unacceptable risk.

Throughout all aspects of the EBA guidelines it is abundantly clear that the relationship between financial organisations and their CSPs needs to be extremely close and transparent, and conducted at a senior level. Verifiable trust through certification is the linchpin of the whole relationship, and the partnership will be dysfunctional (and potentially unviable) without this cornerstone in place.

In future guidance, I would like to see the EBA put more definition around the exact standards and best practices it expects to see in financial sector cloud outsourcing projects, but in their absence I hope that financial companies will discover that CSPs themselves can offer the consultative expertise needed to help them unlock the many benefits of the cloud.       

Hong Kong analysis shows importance of cloud technologies in improving productivity and ROI

Cloud technologies are key to Hong Kong businesses, according to a new report from SolarWinds – but containers, blockchain and robotics still have a fair way to go yet.

The findings appear in the company’s latest IT trends report, ‘The Intersection of Hype and Performance.’ The research polled 75 IT practitioners, managers and directors in Hong Kong – with the overall research quizzing more than 800 respondents across four continents – and found cloud and hybrid IT was the most important technology and management tool for organisations’ strategy today, cited by 92% of those polled.

Big data and analytics, cited by 77% of respondents, was also a key tool, ahead of automation (71%), software-defined everything (SDx) (52%), and the Internet of Things (51%). Containers scored 29%, while blockchain (11%) and robotics (9%) fared poorly.

Worryingly, 61% said their IT environment was not performing at its optimal level, compared with only 19% who said it was. In something of an anomaly, 38% of mid-sized businesses said their IT was at optimal level, compared with only 11% apiece for SMBs and enterprises.

Cloud ranked highly in the vast majority of questions asked. Almost half (49%) of overall respondents said it had the best potential to deliver the highest ROI, alongside big data (49%) and automation (47%). 52% of overall respondents – rising to 67% for smaller businesses – said cloud had the greatest potential to deliver further productivity, while 73% identified it as a ‘transformational’ technology, albeit behind big data analytics (85%).

The report notes the importance of automation as forming the next generation of cloud services – a trend this publication has reported on frequently this year. “Where the C-suite considers AI, ML, and deep learning to be fundamental elements of digital transformation, IT professionals are looking toward the technology and processes that underpin continuous integration and delivery – which ultimately enable enhanced performance and digital experience in today’s environments,” the report explains.

Hong Kong is one of the most advanced cloud nations, hitting top spot in the most recent analysis from the Asia Cloud Computing Association (ACCA). The region was praised specifically for its ‘tradition of robust future planning and a strong tech industry… ensuring [its] infrastructure is primed for fast, reliable and secure cloud offerings targeting the entire region.’

“In 2018 more than ever, IT professionals have an opportunity to continue identifying ways to optimise the digital experience for end users in hybrid IT environments while prioritising investments in technologies that will deliver business value visible well beyond IT,” the report concludes. “IT must also be the convening voice in business discussions, showcasing the ongoing value of IT professionals as the partners to the business, supplying expertise and experience on the technologies that will enable the business to deliver digital transformation success.”

Data centres and cloud networks: Security in the modern context

Traditionally, companies have sought to create a hardened IT network perimeter that keeps all potential cyber threats out, protecting the organisation through network security platforms such as firewalls. In the modern context, however, this has become a restrictive and dangerous approach, and I will explain why.

What we think of as traditional firewalls are only really able to inspect unencrypted traffic. This means that attackers use encrypted communications to exploit assets and maintain control over them. Attackers have also moved to exploit changes in application design and implementation, using network paths between application components that traverse internal data centre and cloud networks.

While traditional network security appliances, such as firewalls and Intrusion Prevention Systems (IPS), are still useful for creating choke points in conventional networks, their utility declines rapidly in cloud and distributed networks. This is because the traditional model of network security was based on the assumption that the majority of traffic would be passing from the perimeter “south” towards monolithic service pods, with little traffic propagating across the data centre.  We also assumed that the majority of our services would be hosted in data centres that enterprises would own and deploy themselves. 

In contrast, modern application architecture now takes advantage of highly automated cloud and hosted data centre solutions based on multiple layers of virtualisation. The rise of containerisation and the move towards micro-service architectures have also led to a proliferation of network traffic between workloads within, and across, data centre and cloud networks. Much of this traffic now moves east-west rather than north-south, meaning that the adequacy of traditional security appliances is vastly reduced. It also reduces our visibility into the traffic flows between application components. The automation and orchestration functions within these applications can make it difficult to predict how and where data will transit across the network, and whether the network will be entirely under our control.
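
To make the east-west versus north-south distinction concrete, here is a minimal, hypothetical Python sketch that labels flow records according to whether both endpoints sit inside the estate's internal address ranges. The ranges, addresses and record format are illustrative assumptions rather than output from any particular product.

```python
from ipaddress import ip_address, ip_network

# Illustrative internal ranges for the data centre / cloud estate (assumption).
INTERNAL_NETS = [
    ip_network("10.0.0.0/8"),
    ip_network("172.16.0.0/12"),
    ip_network("192.168.0.0/16"),
]

def is_internal(addr: str) -> bool:
    """True if the address falls inside one of the internal ranges."""
    ip = ip_address(addr)
    return any(ip in net for net in INTERNAL_NETS)

def classify_flow(src: str, dst: str) -> str:
    """Label a flow east-west (internal to internal) or north-south (crosses the perimeter)."""
    return "east-west" if is_internal(src) and is_internal(dst) else "north-south"

flows = [
    ("203.0.113.7", "10.0.1.20"),  # client hitting a front-end service
    ("10.0.1.20", "10.0.2.35"),    # service-to-service call inside the estate
]
for src, dst in flows:
    print(f"{src} -> {dst}: {classify_flow(src, dst)}")
```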

To protect these modern application architectures we need to be able to apply security policy to east – west traffic in a way that is consistent with the automation and orchestration tools available within the enterprise. Conventional network security tools typically integrate poorly with these systems – although vendors continue to improve this situation through the implementation of configuration APIs (application programming interfaces) and the general move to Software Defined Networking (SDN) – which leads to delays in the implementation of new services and the creation of unwelcome blockers in the management of service infrastructures. 

One approach taken by conventional network security vendors has been to create virtual appliance versions of their existing platforms, with the intention that these can be deployed in cloud and virtualised networks in a way that mirrors traditional distributions. Unfortunately, this does nothing to alleviate the key issues with the legacy model of deploying network security, resulting in a broken model that fails to address the crucial requirements of the modern network. A new model for delivering network security is required.

In modern environments, we need network security functions to be heavily automated and capable of integrating with the standard toolsets available to operational teams. New toolsets should devolve network security functions down to the endpoint and/or workload, whilst still providing centralised programmatic methods for configuration. This requirement has led to the development of micro- or nano-segmentation. The approach is based on the need to apply security policies to network traffic regardless of where services are physically deployed, and allows the distribution of fine-grained security policies, usually at the workload or container level. This ensures that traffic between workloads is still subject to inspection and the application of policy, even if it never leaves the physical host on which they are running.
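
To illustrate what a fine-grained, default-deny policy applied at the workload level might look like, here is a hypothetical Python sketch. The labels, ports and rule format are invented for the example and do not reflect any specific vendor's policy language.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    name: str
    labels: frozenset  # e.g. frozenset({"app:billing", "tier:db"})

# Illustrative allow-list: each rule permits traffic from workloads carrying the
# source labels to workloads carrying the destination labels, on a single port.
POLICY = [
    ({"tier:web"}, {"tier:app"}, 8080),
    ({"tier:app"}, {"tier:db"}, 5432),
]

def is_allowed(src: Workload, dst: Workload, port: int) -> bool:
    """Default-deny: traffic passes only if some rule matches both label sets and the port."""
    return any(
        src_sel <= src.labels and dst_sel <= dst.labels and port == allowed_port
        for src_sel, dst_sel, allowed_port in POLICY
    )

web = Workload("web-1", frozenset({"tier:web", "app:billing"}))
db = Workload("db-1", frozenset({"tier:db", "app:billing"}))
print(is_allowed(web, db, 5432))  # False: the web tier may not reach the database directly
```

Because the check is enforced wherever the workload runs, the same rule set applies whether or not the traffic ever leaves the physical host.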

This is a vital point: traditional network security approaches cannot do this, since they require the traffic to break out of the physical host at some point. In the past, attempts to meet this requirement have led to the implementation of highly complex and fragile routing configurations that often sacrifice key advantages of virtualised networks. These workaround solutions have often been exploited by attackers to persist within a compromised network and can enable lateral movement within a service – something that micro-segmentation technology is explicitly designed to prevent.

A nice side-effect of the centralised management of network security policy on workloads is that through logging and other forms of telemetry, it is possible to passively detect and map out application data flows, which is an invaluable feature in highly automated networks spanning multiple data centres and cloud services. This can be combined with active application performance management systems that hook into orchestration and automation platforms to dynamically adjust network configurations and optimise service delivery.
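
As a sketch of how passive flow mapping from such telemetry might work, the snippet below aggregates hypothetical flow-log records into a simple service-to-service flow map; the field names and service names are assumptions made for the example.

```python
from collections import Counter

# Hypothetical flow-log records exported by workload-level agents (fields are assumptions).
flow_logs = [
    {"src": "orders", "dst": "payments", "port": 443},
    {"src": "orders", "dst": "payments", "port": 443},
    {"src": "payments", "dst": "ledger-db", "port": 5432},
]

# Passively build an application flow map: (source, destination, port) -> observed count.
flow_map = Counter((rec["src"], rec["dst"], rec["port"]) for rec in flow_logs)

for (src, dst, port), count in flow_map.items():
    print(f"{src} -> {dst}:{port}  ({count} flows observed)")
```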

The modern enterprise is heavily reliant on the use of cloud and virtualised network services, and it would be foolish to try to shoehorn these services into a traditional network security model that is simply incapable of supporting them fully. Modern enterprises should be deploying new security architectures that support their application infrastructure and can adapt in step with the requirements of consistent service performance and business requirements. 

A key component of this is the deployment of micro-segmentation technology. This ensures that application data flows are adequately protected in a way that is easy to integrate with highly adaptable automation and orchestration tools. The end result is that enterprises can limit their exposure to threats that exploit brittle and low yield traditional security architectures and gain valuable insight into their application infrastructure through passive and active network telemetry.

Parallels Mac Management Update: SCCM Branch Version 1802 will Force PKI Compliance for Users

As you know, Microsoft SCCM is updated periodically with what Microsoft calls branch versions. Since the first branch version, 1511, Parallels® Mac Management for Microsoft® SCCM has not had any downtime due to Microsoft’s changes. Jason Sandys, a Microsoft MVP and friend of Parallels, recently tweeted about the latest branch version, 1802, and some rather big […]

Hybrid cloud security strategies analysed in new research

Hybrid cloud and multi-cloud security are becoming top of mind for organisations – but many still persist with separate best-of-breed tools for each environment rather than combining them into one ‘best-of-suite’ offering.

That is the key finding of a report from Santa Clara-based Cavirin Systems. The report, which polled more than 350 IT admins, IT decision makers and C-suite executives, found 81% of organisations currently deploy a hybrid or multi-cloud strategy, with 11% only going on-premise and 8% using a single cloud provider.

For those with two or more clouds, Azure, cited by almost half of those polled, was the most popular, ahead of IBM (45.8%), Oracle (34.7%), Amazon Web Services (32.3%), and Alibaba (20.7%). 46% said their setup was on-premise with VMs, while the same number cited on-premise with private cloud management.

When it came to what hybrid cloud security meant for those polled, more than two thirds (68.9%) said it meant verification that their public cloud accounts were secure and confirmation that workloads in the cloud, such as VMs and container instances, were secure. More than half (52.6%) said it meant ensuring all sensitive data was kept out of the cloud.

The most popular form of hybrid cloud security architecture is separate best-of-breed tools for on-premise and cloud, cited by more than 60% of respondents. More than half (51.4%) said they used a best-in-suite tool – in other words, a single tool spanning on-premise and cloud. More than a third (36.7%) said they use a cloud access security broker (CASB) tool for their hybrid security management, while one in five are using a dedicated container security tool.

When looking at the overall state of health, only one respondent was brave enough to admit their cybersecurity posture needed ‘immediate help.’ More than half (53.4%) said their outlook was healthy, with 22% saying their posture was impenetrable.

According to separate research from ESG, more than four in five enterprises are adopting a hybrid cloud approach, yet only 30% were using unified security tools spanning both on-premise and cloud. “The fact that this will grow to 70% over the next two years speaks well of Cavirin’s hybrid cloud approach, helping address a key barrier to hybrid cloud adoption – security and visibility,” said Doug Cahill, ESG lead cybersecurity and cloud analyst.

Naturally, Cavirin has a solution to this problem. Its newest product, CyberPosture Intelligence, aims to ‘deliver risk, cybersecurity and compliance management by providing visibility and actionable intelligence to the CISO and other stakeholders across hybrid environments’, in the company’s words.

Public cloud market to surpass $300bn by 2021 says Gartner – with 21% growth this year

Public cloud continues to go up and up: according to Gartner, the market will grow 21.4% in 2018 to total $186.4 billion (£131.4bn).

Almost 40% of this will come from software as a service (SaaS), with a quarter to come from what Gartner calls cloud business process services (BPaaS) – delivering business process outsourcing (BPO) – and 22% to come from infrastructure as a service (IaaS).

IaaS, however, will outstrip BPaaS by 2021 according to Gartner’s predictions. In three years total public cloud service revenues will surpass $300 billion ($302.5bn), with SaaS accounting for 38% of that total, IaaS 27% and BPaaS 19%. SaaS will also hit 45% of total application software spending by 2021.
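
For a rough sense of scale, the 2021 dollar figures implied by those shares can be worked out from the $302.5bn total; the snippet below is purely illustrative arithmetic based on the figures quoted above.

```python
# Implied 2021 revenue by segment, from Gartner's quoted total and percentage shares.
total_2021_bn = 302.5
shares = {"SaaS": 0.38, "IaaS": 0.27, "BPaaS": 0.19}

for segment, share in shares.items():
    print(f"{segment}: roughly ${total_2021_bn * share:.0f}bn")
# SaaS comes out at roughly $115bn, IaaS $82bn and BPaaS $57bn.
```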

When it came to analysing IaaS specifically, Gartner predicts the hyperscale players to increase their dominance. In 2016, the analyst firm said the top 10 players in the market – Amazon Web Services (AWS), Microsoft Azure, Google, IBM and the rest – accounted for half of the total IaaS market. By 2021, this figure will rise to 70%.

“The increasing dominance of the hyperscale IaaS providers creates both enormous opportunities and challenges for end users and other market participants,” said Sid Nag, Gartner research director. “While it enables efficiencies and cost benefits, organisations need to be cautious about IaaS providers potentially gaining unchecked influence over customers and the market.”

Nag noted the rise of multi-cloud as key to this. Organisations will want to move workloads from cloud to cloud without fear of reprisal. Could there be another wave of vendor lock-in? “In response to multi-cloud adoption trends, organisations will increasingly demand a simpler way to move workloads, applications and data across cloud providers’ IaaS offerings without penalties,” said Nag.

Platform as a service (PaaS) will comprise 8% of the total public cloud market this year at a relatively princely $15 billion, while cloud management and security services will total $10.5bn.

What to look for in a secure cloud system


Esther Kezia Thorpe

12 Apr, 2018

Cloud security and concerns around it have dominated conversations about cloud adoption, with a recent study from Ingram Micro revealing that it’s a top concern for 83% of organisations looking for a cloud solution.

But as the available technology advances, cloud suppliers are able to use the industry’s most sophisticated security solutions to protect data, and can justify investment in top-level security because it protects a wide range of customers.

Of course, not all cloud solutions support the same level of security. So what should organisations be looking out for when exploring all the functions offered by vendors to ensure they get the best level of security?


Here are three things to look out for before committing to that cloud contract.

Information access

The first thing to check for is the solution’s ability to share information across departments. This functionality is key for CIOs looking to transform the business by improving customer experience, increasing organisational agility and introducing new digital revenue streams.

Corporations run hundreds, and sometimes even thousands, of interconnected applications to support their operations. Traditional solutions stored information in many different places, so keeping those systems in sync was a challenging task.

True multi-tenant SaaS makes all of this much easier, with human resources, finance and planning data stored in one application. This central design has many benefits: with all systems working from a common framework, there are no inconsistencies in data. It also eradicates the disconnect between the system and its users, a problem prevalent in many legacy systems.

Consequently, security improves with a single version of the software that is continuously updated, scanned and patched. This is much better than working with multiple solutions, and any security-related changes to the system architecture are relayed to all customers simultaneously. If a leading enterprise needs a stringent new security feature, it is available to an SMB as well.

Encryption benefits

In the old days, corporations relied on firewalls to protect information, thinking that once the business had warded off outsiders, information was safe. Such thinking is now very outdated, with hackers able to attack systems at different levels. Once in a system, they stay, often working their way from low-level to high-level security clearances and compromising sensitive information.

One way firms can protect themselves is through encryption. Typically, data is encrypted in transit, but that should be a first step rather than the last. Once information enters the data centre, it sits unencrypted and is therefore vulnerable. To address this problem, organisations need to encrypt information at rest in a persistent data store.
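
As a minimal sketch of what encrypting a record before it reaches a persistent store can look like, the snippet below uses the Fernet interface from the widely used Python cryptography package. Key management, which in practice belongs in a KMS or HSM, is omitted, and the record contents are invented for the example.

```python
from cryptography.fernet import Fernet

# In production the key would come from a KMS or HSM; generating one inline is for illustration only.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"customer_id": 42, "iban": "GB00EXAMPLE"}'  # hypothetical sensitive payload

# Encrypt before the record ever reaches the persistent data store (encryption at rest).
ciphertext = fernet.encrypt(record)
# ...write `ciphertext`, never `record`, to the database or object store...

# Decrypt only inside the trusted service that needs the plaintext.
assert fernet.decrypt(ciphertext) == record
```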

Unfortunately, cloud services built on legacy architectures rarely support the encryption of all customer data at rest because encryption solutions are complex and difficult to implement.

With modern cloud architectures, a good cloud vendor will take on those responsibilities, especially if privacy and security are embedded into the solution’s system right from the start.


Support for third-party standards

Industry and government groups have designed various compliance frameworks to protect customer information, such as the GDPR coming into force in just a few weeks. However, the specifications are only a starting point.

While assessing a solution, organisations should thoroughly examine the various compliance standards and security implementations. Is the service simply aligned with the standard, or has it been certified? How is the information stored? What level of encryption is supported? How are updates handled?

All cloud providers claim to have secure systems, but few offer the higher levels of protection needed for an enterprise’s valuable data. Carefully examining a vendor’s solution, however good it may seem on the surface, is key to a breach-free, compliant cloud future.