LinkedIn finally migrates to Azure


Bobby Hellard

24 Jul, 2019

LinkedIn is set to migrate its computing workloads to its parent company’s Azure public cloud, some three years after being acquired by Microsoft.

The professional social network will shift all workloads from its own data centres to Microsoft’s cloud, with the project expected to take ‘multiple years’ to complete, according to LinkedIn’s senior VP of engineering, Mohak Shroff.

Although Microsoft has largely left LinkedIn to its own management since it bought the platform in 2016, it has used a number of Azure technologies for improvements to content delivery, updates to its news feed and also to keep inappropriate content off the site.

The success of these improvements made Azure the right choice, and was one of the reasons behind the decision to move away from its own data centres, according to Shroff.

“Moving to Azure will give us access to a wide array of hardware and software innovations and unprecedented global scale,” he wrote in a blog post. “This will position us to focus on areas where we can deliver unique value to our members and customers. The cloud holds the future for us and we are confident that Azure is the right platform to build on for years to come.”

In October, Microsoft published details of how it had moved LinkedIn’s 14,000 employees off Google services and onto Office 365, which took a couple of years. The move to Azure, however, may take much longer.

There are a number of Microsoft services that still don’t run on Azure, such as Office 365 and Xbox Live. Similarly, Amazon has been trying to move its own workloads off Oracle databases – most of them legacy Amazon systems set up before AWS existed.

Microsoft’s $1bn OpenAI partnership underpinned with closer Azure ties


Keumars Afifi-Sabet

23 Jul, 2019

Microsoft has invested $1 billion in an industry-wide artificial intelligence (AI) partnership that will harness Azure cloud technology to develop supercomputers for AI.

The not-for-profit organisation OpenAI, co-founded by Tesla CEO Elon Musk, is basing its partnership with Microsoft on three key areas, largely focused on how the firm’s Azure cloud platform can integrate with ongoing work.

The two organisations will jointly build “Azure AI supercomputing technologies”, while OpenAI will port its existing services to run on Microsoft’s cloud platform. Moreover, Microsoft will become OpenAI’s preferred partner for bringing new AI technologies to market as and when they are commercialised.

The initiative will also focus on creating artificial general intelligence (AGI). This differs from conventional AI in its broad and multi-functional nature, as opposed to being developed for specific applications.

Microsoft argues generalisation, and “deep mastery of multiple AI technologies”, will help address some of the world’s most pressing issues. These range from global challenges like climate change to more personalised needs in areas such as healthcare and education.

With its capacity to understand or learn any intellectual task that a human can, AGI is also a popular subject in science-fiction writing, as writers and futurists extrapolate this to machines experiencing consciousness.

“The creation of AGI will be the most important technological development in human history, with the potential to shape the trajectory of humanity,” said OpenAI CEO Sam Altman.

“Our mission is to ensure that AGI technology benefits all of humanity, and we’re working with Microsoft to build the supercomputing foundation on which we’ll build AGI. We believe it’s crucial that AGI is deployed safely and securely and that its economic benefits are widely distributed. We are excited about how deeply Microsoft shares this vision.”

OpenAI was founded in December 2015 as an organisation dedicated to researching next-gen AI technologies and their applications. Its mission centres on developing AI that serves as an extension of individual humans, not a replacement.

It’s a similar AI vision to Microsoft’s, with the industry giant committing to developing AI grounded in an ethical framework. Its foray into automation and machine learning has largely come in the form of voice recognition and medical applications.

It’s a step-change from the culture that led to Microsoft launching, and later shutting down, the infamous Tay bot in 2016. This Twitter-based chatbot was initially designed to emulate a teenage girl but ended up parroting racial slurs and conspiracy theories after it was hijacked by trolls.

Capgemini report shows why AI is the future of cybersecurity

These and many other insights are from Capgemini’s Reinventing Cybersecurity with Artificial Intelligence Report published this week. You can download the report here (28 pp., PDF, free, no opt-in). Capgemini Research Institute surveyed 850 senior executives from seven industries, including consumer products, retail, banking, insurance, automotive, utilities, and telecom. 20% of the executive respondents are CIOs, and 10% are CISOs. Enterprises headquartered in France, Germany, the UK, the US, Australia, the Netherlands, India, Italy, Spain, and Sweden are included in the report. Please see page 21 of the report for a description of the methodology.

Capgemini found that as digital businesses grow, their risk of cyberattacks exponentially increases. 21% said their organization experienced a cybersecurity breach leading to unauthorized access in 2018.

Enterprises are paying a heavy price for cybersecurity breaches: 20% report losses of more than $50 million. Centrify’s most recent survey, Privileged Access Management in the Modern Threatscape, found that 74% of all breaches involved access to a privileged account. Privileged access credentials are hackers’ most popular technique for initiating a breach to exfiltrate valuable data from enterprise systems and sell it on the Dark Web.

Key insights include the following:

69% of enterprises believe AI will be necessary to respond to cyberattacks

The majority of telecom companies (80%) say they are counting on AI to help identify threats and thwart attacks. Capgemini found the telecom industry has the highest reported incidence of losses exceeding $50M, making AI a priority for thwarting costly breaches in that industry.

It’s understandable why Consumer Products (78%) and Banking (75%) are second and third, given each industry’s growing reliance on digitally-based business models. U.S.-based enterprises are placing the highest priority on AI-based cybersecurity applications and platforms, 15% higher than the global average when measured on a country basis.

73% of enterprises are testing use cases for AI for cybersecurity across their organisations today with network security leading all categories

Endpoint security is the third-highest priority for investing in AI-based cybersecurity solutions, given the proliferation of endpoint devices, which are expected to exceed 25 billion by 2021. Internet of Things (IoT) and Industrial Internet of Things (IIoT) sensors, and the systems they enable, are exponentially increasing the number of endpoints and threat surfaces an enterprise needs to protect.

The old “trust but verify” approach to enterprise security can’t keep up with the pace and scale of threatscape growth today. Identities are the new security perimeter, and they require a Zero Trust Security framework to be secure. Be sure to follow Chase Cunningham of Forrester, Principal Analyst, and the leading authority on Zero Trust Security to keep current on this rapidly changing area. You can find his blog here.

51% of executives are making extensive use of AI for cyber threat detection, outpacing prediction and response by a wide margin

Enterprise executives are concentrating their budgets and time on detecting cyber threats using AI above predicting and responding. As enterprises mature in their use and adoption of AI as part of their cybersecurity efforts, prediction and response will correspondingly increase. “AI tools are also getting better at drawing on data sets of wildly different types, allowing the “bigger picture” to be put together from, say, static configuration data, historic local logs, global threat landscapes, and contemporaneous event streams,” said Nicko van Someren, Chief Technology Officer at Absolute Software.

64% say that AI lowers the cost to detect and respond to breaches, and reduces the overall time taken to detect threats and breaches by up to 12%

The reduction in cost for a majority of enterprises ranges from 1% – 15% (with an average of 12%). With AI, the overall time taken to detect threats and breaches is reduced by up to 12%. Dwell time – the amount of time threat actors remain undetected – drops by 11% with the use of AI. This time reduction is achieved by continuously scanning for known or unknown anomalies that show threat patterns. PetSmart, a US-based specialty retailer, was able to save up to $12M by using AI in fraud detection from Kount. By partnering with Kount, PetSmart was able to implement an AI/Machine Learning technology that aggregates millions of transactions and their outcomes.

The technology determines the legitimacy of each transaction by comparing it against all other transactions received. As fraudulent orders were identified, they were canceled, saving the company money and avoiding damage to the brand. The top 9 ways Artificial Intelligence prevents fraud provides insights into how Kount’s approach to unsupervised and supervised machine learning stops fraud.
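This kind of transaction comparison can be sketched in miniature. The example below is illustrative only – it is not Kount’s actual method, and the function name, order history and threshold are all invented for the sketch – but it shows the underlying principle of scoring a new transaction against the distribution of prior ones:

```python
import statistics

def flag_suspicious(history, new_amount, threshold=3.0):
    """Flag a transaction whose amount deviates sharply from prior orders.

    Illustrative z-score check only; production fraud systems weigh many
    more signals (device, location, velocity) than the order amount.
    """
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return False  # no variation in history to compare against
    return abs(new_amount - mean) / stdev > threshold

orders = [25.0, 30.0, 27.5, 22.0, 31.0, 28.0]
print(flag_suspicious(orders, 29.0))   # typical order -> False
print(flag_suspicious(orders, 950.0))  # extreme outlier -> True
```

A real system would, as described above, compare each transaction against millions of others and their outcomes, but the principle is the same: score against the observed distribution and cancel what falls outside it.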

Fraud detection, malware detection, intrusion detection, network risk scoring, and user/machine behavioural analysis are the top five AI use cases for improving cybersecurity

Capgemini analyzed 20 use cases across information technology (IT), operational technology (OT) and the Internet of Things (IoT) and ranked them according to their implementation complexity and resultant benefits (in terms of time reduction).

Based on this analysis, Capgemini recommends a shortlist of five high-potential use cases that combine low complexity with high benefits. 54% of enterprises have already implemented these five high-impact use cases. The following graphic compares the recommended use cases by level of benefit and relative complexity.

56% of senior execs say their cybersecurity analysts are overwhelmed and close to a quarter (23%) are not able to successfully investigate all identified incidents

Capgemini found that hacking organizations are successfully using algorithms to send ‘spear phishing’ tweets (personalized tweets sent to targeted users to trick them into sharing sensitive information). AI can send the tweets six times faster than a human and with twice the success. “It’s no surprise that Capgemini’s data shows that security analysts are overwhelmed. The cybersecurity skills shortage has been growing for some time, and so have the number and complexity of attacks; using machine learning to augment the few available skilled people can help ease this. What’s exciting about the state of the industry right now is that recent advances in Machine Learning methods are poised to make their way into deployable products,” said van Someren.

Conclusion

AI and machine learning are redefining every aspect of cybersecurity today. From improving organizations’ ability to anticipate and thwart breaches, to protecting the proliferating number of threat surfaces with Zero Trust Security frameworks, to making passwords obsolete, AI and machine learning are essential to securing the perimeters of any business.

One of the most vulnerable and fastest-growing threat surfaces is the mobile phone. Two recent research reports – MobileIron’s Say Goodbye to Passwords (4 pp., PDF, opt-in), produced in collaboration with IDG, and Passwordless Authentication: Bridging the Gap Between High-Security and Low-Friction Identity Management (34 pp., PDF, opt-in) by Enterprise Management Associates (EMA) – provide fascinating insights into the passwordless future. They reflect and quantify how ready enterprises are to abandon passwords for more proven authentication techniques, including biometrics and mobile-centric Zero Trust Security platforms.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Slack gives desktop app performance-driven makeover


Bobby Hellard

23 Jul, 2019

Slack has unveiled a new version of its desktop app that’s reportedly faster and more efficient thanks to a complete rebuild of the platform’s underlying technology.

This will be rolled out to users in an update for Slack’s Windows and macOS desktop app. It promises big performance improvements, such as loading 33% faster and using 50% less RAM than before, according to the company.

Over the course of 2018, Slack worked on shifting the web and desktop clients – which essentially use the same codebase – to a modern stack and away from jQuery and other technologies it used when it first launched in 2012. The result is that Slack no longer creates a standalone copy of the app for each workspace, consuming RAM for each instance.

There’s a bit of risk-taking in this update as rewriting or changing the code of a platform can be extremely problematic. The desktop version of Slack is its oldest client, and as such, a few internal cracks were starting to show in the foundation, according to Slack engineers Mark Christian and Johnny Rodgers.

“Conventional wisdom states that rewrites are best avoided, but sometimes the benefits are too great to ignore,” the pair wrote in a blog post. “One of our primary metrics has been memory usage and the new version of Slack delivers…”

“These results have validated all of the work that we’ve put into this new version of Slack and we look forward to continuing to iterate and make it even better as time goes on,” the engineers continued. 

Part of the update included adopting React, a popular JavaScript library, as Slack’s original user interface (UI) was built using HTML templates. These often needed to be manually rebuilt whenever the underlying data changed, and the engineers said it was a “pain” to keep the data model and UI in sync.

The new update keeps the existing codebase but incrementally adds a more modern section of it that the engineers have called “future-proof”. All UI components have been built with React, and this code is both “multi-workspace aware” and loads lazily, fetching only the data it needs – which is why the loading speed has improved.

Since launching, Slack has become the go-to comms app for startups and is often accused of enabling an ‘always-on’ work culture. Such is its popularity, the company recently went public and has become a key rival to some of the biggest tech companies around.

Currently, its main competitor is Microsoft Teams, which recently overtook Slack in terms of users, hitting 13 million – three million more than Slack.

How can VDI support an edge computing strategy?

Virtual desktop infrastructure (VDI) is, as the name suggests, a virtualisation technology that creates individual, fully personalised desktop virtual machines with user profile control. It has been in existence since 2006, and while it has experienced some variation in popularity, in recent times, it has seen considerable growth. This is reflected in market research, which estimates that the global VDI market will be worth around $5 billion by 2022. 

In its early days, the VDI concept was appealing to IT strategists on many levels. By virtualising desktops, not only could businesses reduce hardware costs, but they could also break the costly three-year refresh cycle, simplify desktop management and save IT teams valuable administrative and support time.

But there was a downside. Early VDI technologies were accompanied by a complex and expensive backend infrastructure. Similarly, investing in VDI software brought with it significant licensing fees and vendor hardware lock-in that increased the cost of implementation. These issues became common barriers to adoption of VDI in the enterprise market.

More recently, however, the emergence of edge computing in combination with hyperconverged infrastructure has disrupted existing VDI technologies. But what makes these technologies so well suited to one another?

Reducing the admin burden while improving services

At the heart of the approach is simplicity. Rolling out a hyperconverged edge computing solution is practical for hundreds of users, even when they are supported by small IT teams. That’s because, in general, it’s a technology that doesn’t require specialist knowledge, other than a few hours of training.

That implementation simplicity is evident as soon as the virtual desktops have been rolled out. For instance, software and anti-virus updates can be remotely managed and maintained for each user. And, by centralising and automating other day-to-day tasks, the technology enables IT teams to focus on other issues, such as strategic planning or dealing with unexpected emergencies.

Some edge computing solutions also allow IT teams to take centralised management and administration functionality a stage further and integrate automated disaster recovery capabilities such as replication, snapshot scheduling and file-level recovery. By having this kind of consistent disaster recovery plan running in the background, IT teams don’t need to rely on employees to update their own antivirus software or manage their own data backups. Looking more strategically, full network backups and snapshots of individual desktop profiles can be sent over wider networks to a cloud repository or remote datacentre. 

In the event of a failure at a network access point or terminal, the user can immediately move to a different machine and log back in, and in most cases, continue from where they left off. This plays to the objective of just about every IT team, which is to deliver highly available IT infrastructure.

Security and agility at the edge

A VDI deployment running on a hyperconverged edge computing solution enables users to log on securely to any machine on the network and gain access to their files, emails and applications. They aren’t limited to PC terminals: they can load their personal desktop or applications on a mobile phone or tablet, significantly boosting workforce agility.

Regardless of their location, IT teams can monitor user profiles and receive automated alerts that can help identify potentially suspicious activity or log users out if their account has been inactive for a certain amount of time. A VDI deployment can also offer a cost-effective and secure method to extend network access beyond the office walls to provide remote access to employees wherever they are located.

In many organisations, the security and admin challenges associated with managing BYOD devices are considerable. However, by integrating those devices into an officially sanctioned VDI environment, employee mobiles and tablets can be more effectively protected from potential security risks, so information is better secured against accidental disclosure and loss.

The advent of hyperconverged edge computing, and the accompanying reductions in cost and complexity, have enabled businesses of all sizes to benefit from the technology. Additional functionality, improvements in performance, and the huge benefits provided by simplification of system management have made it equally easy for end users and IT teams to adopt VDI. As the edge computing market continues to flourish, VDI is on course for even more growth in the years ahead.


Arcserve UDP Cloud Direct review: Capable but limited multi-site backup


Dave Mitchell

22 Jul, 2019

A cloud backup solution well suited to protecting distributed offices but spoilt by a lot of rough edges

Price 
£1,599 per year exc VAT

Arcserve’s UDP Cloud Direct is simple to deploy and easy to use – qualities that will immediately endear it to overworked IT departments. Two versions are available: we tested the Backup-as-a-Service (BaaS) edition, which promises pain-free cloud backup and recovery, but you can step up to the Disaster Recovery-as-a-Service (DRaaS) package, which adds in-the-cloud virtualised recovery of key systems.

Pricing for BaaS is based on your cloud storage requirements. The 1TB package costs £1,599 per year, or if you go for a 5TB subscription, you save over £300 per terabyte. DRaaS comes as a separate service, where you’ll pay around £350 per year for a VM with one CPU and 4GB of RAM.

After signing up, you get access to a personalised web portal, from which you can download backup agents for Windows, Mac and Linux – although the latter two are file-only. It took mere seconds to install the agent on our Windows servers and enter our account details, after which each one popped up in the portal. The ease with which clients can be registered and managed makes Arcserve ideal for companies with multiple sites.

All of your clients can be viewed in the portal’s Systems tab, and selecting one takes you to backup task creation. Oddly, you can schedule backups for selected days, but each task can only be run once a day at a specific time: if you want more frequent protection, you’ll have to run extra jobs manually.

File and folder backup is supported on all platforms, and on Windows you can also secure entire systems as images. SQL Server and Exchange databases can be optionally backed up too, and you don’t even have to specify the locations: the agent simply backs up all the databases it can find. You can even back up NetApp filers.

After installing the agent on our Hyper-V host, we were able to browse its VMs from the portal as well, and run agentless backups of selected ones. Installing the Cloud Direct virtual appliance on our VMware ESXi 6.7 lab host let us view its VMs from the portal, and choose which ones to protect with a single click.

Ongoing backup activity can also be monitored from the portal, and the agent has a System Tray popup that shows its progress. If bandwidth is an issue, throttles can be applied to individual tasks in Kbits/sec.

Hybrid backups are simple to set up too because you can add a physical storage location to any task. This can be anything from a local disk or external USB drive to a NAS share or IP SAN; frustratingly, the sparse user manual doesn’t detail how to configure this, but a little experimentation confirms that you can declare disk locations by entering their paths, while NAS shares can be accessed using UNC syntax.

When it’s time to restore your data, you can just head to the web portal, choose a system, select the required recovery point and either restore it in its entirety back to the client or open the browser window and select individual files and folders to recover.

Unfortunately, Exchange item-level restores aren’t supported: you can only select the entire database and recover it back to the host, whereas VMware VMs can be recovered as raw image files or directly to vCenter. It’s not possible to restore files from local devices via the portal either, so if you want to quickly bring back a file from a NAS drive, you’ll need to open up Explorer and copy it across by hand.

Those limitations take some of the shine off Arcserve UDP Cloud Direct, as do the lack of support for hourly backup jobs and poor documentation. If you can live with those specific issues, though, this is a great cloud backup solution that’s ideal for SMEs looking to protect multiple systems and locations from one cloud portal.

Microsoft Azure secures 64% growth with goal to build ‘world’s computer’

Microsoft has unveiled its fourth quarter results to almost universal acclaim, with revenue up 12% to $33.7 billion (£26.9bn) and Azure revenue growth up 64%.

The company has been resolutely sticking to a theme in its earnings reports. "Microsoft Cloud drives record fourth quarter results," the release read this time last year. It was "Microsoft Cloud strength powers record first quarter results" three months later, and for Q2, "Microsoft Cloud strength fuels second quarter results." 

Now, it was simply "Microsoft Cloud powers record fourth quarter results" – and it makes sense why. Revenue across its various buckets went up: intelligent cloud, which covers parts of Azure as well as server products and enterprise services, came in at $11.4 billion, up 19%, while productivity and business processes – focusing predominantly on Office 365 and LinkedIn – reached $11 billion, up 14%. 

Speaking to analysts, Microsoft CEO Satya Nadella said the company was building Azure as 'the world's computer, addressing customers' real-world operational sovereignty and regulatory needs.' Nadella noted the company's presence in Cape Town and Johannesburg, representing the first major move into Africa from a hyperscale cloud provider, as well as increased partnerships with Oracle, Red Hat and VMware.

"Azure is the only cloud with limitless data and analytics capabilities across the customer's entire data estate," said Nadella. "Our differentiated approach, from developer tools and infrastructure, to data and analytics, to AI, is driving growth. The world's leading companies trust Azure for their mission-critical workloads, including more than 95% of the Fortune 500."

Of those, the most recent – announced just yesterday – was AT&T, following a similar deal the telco penned with IBM. The partnership with Oracle, as this publication mused in June, was potentially another one with retail as a focus. Of the three customers cited in the press materials, two were large retailers, in particular Albertsons, whose CIO all but said in January that the move to Microsoft was partly down to Amazon's retail presence.

According to guidance released overnight by Synergy Research, Microsoft remains the clear number two in cloud infrastructure services, well behind AWS but significantly ahead of the chasing pack. The company remains the 'very clear' market leader in software as a service, with its growth rate above the market rate in both SaaS and IaaS.

The two cloud infrastructure leaders' competition is soon set to come to a head with the award of the single-vendor JEDI cloud computing contract from the Pentagon expected to be announced next week. 

You can read Microsoft's full results here.


Microsoft Azure review: Competitive cloud pricing takes a bite out of AWS


K.G. Orphanides

19 Jul, 2019

A clear and cost-effective path for deploying your Microsoft office network in the cloud

Price 
Highly variable

The purchase, roll-out and maintenance of servers for your business’s Microsoft-based network can be prohibitively expensive. But now, if you have a fast enough internet connection, you can deploy Active Directory, file servers, database servers and more in the cloud, and Microsoft’s Azure platform provides a clear and cost-effective way to move server and network infrastructure online.

Microsoft’s Azure cloud platform is a clear favourite for a company that would otherwise use local Windows Server and Active Directory systems to support Windows PCs for staff. While it’s by no means a closed shop – a wide range of Linux distributions are also supported on Azure – Microsoft’s cloud service provides ready integration with your local network and the company’s desktop products.

There are potential disadvantages to using cloud infrastructure for your business: you’ll never have any kind of physical access to or control over your systems, and you’re at the mercy of your host’s outages. But Azure provides a 99.99% uptime guarantee for its virtual machines, and you won’t be responsible for manually maintaining and updating your hardware and licences, which can save on both capital expenditure and time for your IT staff.

You can choose where your Azure servers are hosted, ensuring, for example, that relevant customer data is kept within the same geographic region as those customers and allowing you to get the best possible connection speeds by ensuring that you’re not sending data halfway around the world.

Microsoft Azure review: Deployment

There are almost unlimited potential uses for cloud computing and storage, which can be a barrier to entry in and of itself: it’s important to work out what’s most convenient and cost-effective to keep local and what would be most easily deployed in the cloud.

Active Directory, Windows Server VM and storage options can be used to roll out Infrastructure as a Service for your business on an Azure private virtual network. To make good use of such a setup, you’ll need a fast – and ideally symmetric – internet connection and a reasonably powerful firewall router capable of handling a VPN connection between your local network and the Azure cloud.

There are also plenty of options to cover your public-facing networking needs, from Ubuntu Server images already configured with Apache for web hosting to web app development platforms that make it easy to test and roll out apps in Java, Python, Node.js and more.

Secure cloud storage is available at a range of costs and access tiers (hot, cool and archive), from managed disks for your virtual machines to file shares and scalable containers for unstructured data.

Microsoft Azure review: Pricing

Opting for cloud services rather than physical infrastructure massively reduces your business’s initial outlay on hardware and licences, although you’ll have a higher monthly operational expenditure due to subscription fees and – potentially – support costs, either directly via an Azure support plan or from a specialist IT support firm if you don’t have in-house expertise.

Pricing for complex cloud services is by its nature highly variable: one company’s needs won’t be the same as the next, even if both fall within the SMB bracket.

The simplest subscriptions for most services are on a pay-as-you-go basis, with exact pricing depending on the product specification you opt for, such as virtual machine configuration or quantity of storage, but cost-saving reserved instances – for which you pay an upfront flat rate for a fixed period – are also available.

You’ll need to work out individual pricing based on what systems and infrastructure you need, and we strongly recommend using both Azure’s estimate calculator before you buy and its cost tracking tools afterwards to make sure you’re getting good value for your money. We’re fans of Azure’s costing tools, which are somewhat easier to work with than AWS’s calculator.

On PAYG, a general-purpose VM with 2 virtual CPUs and 8GB of memory, running Windows Server will cost £150.43 per month. A 1024GB HDD to go with that costs £33.58 per month, although a wide variety of larger and smaller HDD and SSD disk options are available. Storage transactions – reads and writes, typically billed in 4MB blocks – are an additional cost, priced at £37.27 per 100,000 transaction units. A variety of snapshotting and backup options for your disks are available at further cost, depending on how much data you need to back up at any given point.

A virtual network is free, but you’ll have to pay for inbound and outbound data transfer – that currently works out at 75p per 100GB. Azure Active Directory Basic costs £11.18 monthly plus 75p per user, per month and an extra £10.43 per ten users if you want multi-factor authentication.

This very basic setup works out at £196.68 per month. While making exact comparisons is challenging due to differing default configurations and terminology, an equivalent Amazon AWS estimate for a Windows Server VM, Active Directory services, Virtual Private Cloud and data throughput came to $276.27 (£248.25) a month, making Azure a slightly better deal for this particular scenario.
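As a sanity check, the line items above can be totted up in a few lines of Python. The per-user and data-transfer quantities (one Azure AD user, roughly 100GB of transfer) are our assumptions for this scenario, not a Microsoft-published configuration, and real bills will vary with usage, region and current pricing:

```python
# Rough monthly cost estimate for the basic Azure setup described above.
# All per-item prices are the PAYG figures quoted in this review.
line_items = {
    "Windows Server VM (2 vCPU, 8GB RAM)": 150.43,
    "1024GB HDD managed disk": 33.58,
    "Azure Active Directory Basic": 11.18,
    "AAD per-user fee (1 user x £0.75)": 0.75,
    "Data transfer (~100GB x £0.75/100GB)": 0.75,
}

total = sum(line_items.values())
for name, cost in line_items.items():
    print(f"{name}: £{cost:.2f}")
# Within a penny of the £196.68 quoted above; per-item rounding
# accounts for the difference.
print(f"Estimated monthly total: £{total:.2f}")
```

Swapping in your own VM sizes, disk options and user counts gives a quick first approximation before you move on to Azure’s full pricing calculator.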

In the case of Windows Server and other licensed Microsoft products, software licence fees are included in the cost of running the deployment. However, the Azure Hybrid Benefit for Windows Server means you can use any physical Windows Server licences you already own to reduce the cost of operating your cloud servers.
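As a sketch of how the Hybrid Benefit is applied in practice, it’s enabled by setting the licence type when creating a VM with the Azure CLI. The resource group and VM names here are purely illustrative placeholders:

```shell
# Illustrative only: rg-demo and vm-demo are placeholder names.
# --license-type Windows_Server tells Azure to bill the VM at the base
# compute rate, on the basis that you're bringing an existing on-premises
# Windows Server licence rather than paying for a new one.
az vm create \
  --resource-group rg-demo \
  --name vm-demo \
  --image Win2019Datacenter \
  --license-type Windows_Server
```

This is a configuration fragment rather than a runnable example – it requires an Azure subscription and an eligible licence with Software Assurance to take effect.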

Microsoft is currently competing heavily against cloud-dominating rival AWS, with a price-match guarantee for comparable services and a pledge to undercut Amazon on Windows and Microsoft SQL databases and servers. There are also plenty of trial options to help you see whether Azure is the platform for you and, Microsoft clearly hopes, get you sufficiently invested.

When you sign up, you get £150 credit to spend on anything you like within 30 days, plus 12 months of basic services for free, including Linux or Windows virtual machines and a pair of 64GB SSDs to use with them, 5GB apiece of blob (unstructured data) and file storage, SQL and Azure Cosmos databases and enough bandwidth in and out of Microsoft’s data centres – 15GB – to take advantage of all that.

Azure’s always-free services include entry-level cloud apps, some Active Directory services, Azure Kubernetes Service container management and deployment, developer tools and free, unlimited private code repositories, push notification sending, 50 virtual networks and more.

Microsoft Azure review: User interface

Azure has departed from its Windows Server-inspired interface choices in recent years, in favour of a more broadly accessible user interface design language that feels less cluttered than that of rivals Google and AWS. The Azure portal home screen provides an overview of the resources you currently have in use, links to monitoring and security tools and a quick-access list of the most popular Azure services – a full, searchable list can be reached via the All services tab in the left-hand menu pane.

This pane also gives you access to all categories of Azure services, from your storage server and virtual machine lists to cost management and analysis tools to help you ensure that you’re not burning more money than you should be.

While it’s a bit of an acquired taste, the interface does an admirable job of making a mind-boggling array of services accessible and easily deployable within a few clicks. You can also create multiple, highly customisable Dashboard displays to track the status of your various hosted systems and services.

When it comes to virtual machine installations, like AWS, Azure expects you to stick to the operating system install images provided, though you can migrate existing on-premises VMs to Azure, assuming your images meet certain compatibility criteria. If you need greater flexibility on this front, such as the ability to upload and install operating systems from your own ISO images, you’ll have to use a more VM-focused cloud services provider such as Vultr, but be aware that this will also involve more manual configuration in general.

Microsoft Azure review: Verdict

There’s a lot of competition in cloud services, but Microsoft is being particularly aggressive when it comes to undercutting Amazon and Google, which makes it a very good option for small businesses right now. A variety of reserved and pay-as-you-go pricing options are available, with enough free credit and services to support a proof-of-concept project before you commit fully.

If you’re looking to deploy cloud-based infrastructure to connect your Windows systems, Azure probably represents one of the easiest routes to doing that, although you’ll still need traditional Windows network and server management expertise to get everything set up: this isn’t a ready-rolled and fully configured solution. However, if you’re looking to deploy Microsoft products in the cloud, Azure is exactly the right tool for the job.

Azure revenue surpasses Windows for the first time


Connor Jones

19 Jul, 2019

Microsoft’s cloud business revenue has surpassed income from its Windows arm for the first time ever, with the company’s fourth-quarter financial results demonstrating the strength of its cloud infrastructure and services.  

It’s just one of the many areas that defied analyst expectations as the Redmond-based company excelled across the board, posting revenue hikes in nearly all areas of the business.

There are no exact figures to illustrate how well Azure is doing specifically because Microsoft bundles the figures into its umbrella “intelligent cloud business”. However, the revenue for the company’s cloud arm increased 19% to $11.4 billion, narrowly beating Windows in its “personal computing” business which made $11.3 billion – an increase of 4%.

The prioritisation of cloud services began when Satya Nadella took the reins in Redmond back in 2014, and ever since, all things cloud have been driving the company’s revenue.

The one division in decline was Microsoft’s gaming business, which pulled down the revenue score for its Personal Computing arm – handing the lead to Intelligent Cloud. Overall gaming revenue declined 10%, with Xbox software and services down 3%.

Other than the minor blip in gaming, the future looks healthy for Microsoft. It continues to build on its lead as the world’s most valuable company, a milestone it reached in April when it surpassed Apple to become valued at over $1 trillion for the first time – a feat owed in no small part to its ever-booming cloud growth.

“It was a record fiscal year for Microsoft, a result of our deep partnerships with leading companies in every industry,” said Satya Nadella, chief executive officer of Microsoft. “Every day we work alongside our customers to help them build their own digital capability – innovating with them, creating new businesses with them, and earning their trust.

“This commitment to our customers’ success is resulting in larger, multi-year commercial cloud agreements and growing momentum across every layer of our technology stack,” he added.

Microsoft currently sits in second place in terms of its market share for cloud business, with Amazon Web Services (AWS) commanding a majority lead. According to Canalys, AWS dominates with a 32.8% market share, Microsoft’s Azure comes in with 14.6% and Google Cloud, which recently pivoted towards hybrid cloud, has 9.9%.

IBM revenue continues to fall as it chases cloud leaders


Bobby Hellard

18 Jul, 2019

IBM has said its acquisition of Red Hat will accelerate growth, despite reporting that current overall revenue has fallen for the fourth straight quarter.

Overall revenue was down 4.2% from the prior year to $19 billion, according to its second-quarter earnings, released on Wednesday. But the tech giant’s cloud business is on the up, with a 5% rise to $5.6 billion.

However, while these figures look positive, they still suggest IBM is far behind rival cloud providers AWS and Microsoft. Amazon’s cloud computing division leads the market and has continued to grow annually, posting a massive 41% jump for the first quarter of 2019, while Microsoft’s Azure posted revenue growth of 73% in the third quarter of its fiscal year. Both companies will report second-quarter earnings in the next few days and are expected to maintain their momentum.

According to IBM’s second-quarter figures, however, it is on track to meet full-year expectations, particularly when the recent acquisition of Red Hat – finalised last week – is excluded.

“In the second quarter, we continued to grow in the high-value areas of the business, led by a strong performance across our Cloud and Cognitive Software segment,” said Ginni Rometty, IBM chairman, president and CEO.

“With the completion of our acquisition of Red Hat, we will provide the only true open hybrid multi-cloud platform in the industry, strengthening our leadership position and uniquely helping clients succeed in chapter 2 of their digital reinventions.”

IBM ended the second quarter with $46.4 billion of cash on hand, of which approximately $34 billion was used in July to close the acquisition of Red Hat. It does seem that the open-source specialist is a big part of IBM’s strategy to close the gap on its rivals, and lends credence to the idea that IBM’s fortunes are largely tied up in the success of Red Hat.

“On August 2, we will discuss how the acquisition of Red Hat will accelerate IBM’s revenue growth, contribute to our high-value model and enhance our free cash flow generation going forward,” said James Kavanaugh, IBM’s senior VP and CFO.