Google Cloud and Bharti Airtel partner in echo of Azure and Jio telecoms deal

Google Cloud’s aggressive push at the start of this year has continued, with the company announcing a partnership with Indian telco Bharti Airtel, as well as a customer win in airline Lufthansa.

The agreement will see Airtel offer Google’s G Suite collaboration tools to small and medium-sized businesses as part of its integrated B2B connectivity solutions.

According to the most recent figures from Statista, Airtel currently sits third in market share for wireless telecom subscribers at 27.58%, behind leader Vodafone Idea and just 0.22 percentage points behind second-placed Reliance Jio (27.8%).

“Airtel and Google Cloud have a shared vision of delighting customers with great products. India, with its growing economy and adoption of digital services, offers one of the biggest opportunities to serve customers with innovative solutions,” said Gopal Vittal, managing director and CEO of Bharti Airtel’s India and South Asia operations. “We are pleased to further strengthen our deep relationship with Google Cloud and build products and services aimed at transforming Indian businesses.”

In August, Reliance Jio signed a 10-year deal with Microsoft which featured a variety of integrations. The telco would move its non-network applications to Azure, while its internal workforce would receive the Microsoft 365 suite. Jio’s connectivity infrastructure would also promote the adoption of Azure as part of the company’s cloud-first strategy.

Google’s partnership with Airtel will aim to make a dent in the Microsoft/Jio deal. “Indian companies are making a massive transformation to the cloud and we’re thrilled to partner with Airtel to support this transition,” said Google Cloud CEO Thomas Kurian.

The Indian market is an interesting one. While, like China, the country performed poorly in the most recent Asia Cloud Computing Association (ACCA) analysis in 2018, this is primarily due to the disparity between the most connected areas and rural ones. According to Synergy Research at the time of the Jio partnership, Amazon Web Services (AWS) and Microsoft were the clear first and second in India. Satyajit Sinha, an analyst at Counterpoint, told the Economic Times in August that the move would require AWS and Google to come up with ‘new, perhaps cheaper’ models for the Indian market.

Elsewhere, Lufthansa Group has chosen Google Cloud to support its operations. Citing machine learning and infrastructure capabilities, the airline group is looking to streamline its data processes, including recommendations to improve customer experience. “We’re bringing the best of Lufthansa Group and Google Cloud together to solve airlines’ biggest challenges and positively impact the travel experience of the more than 145 million passengers that fly annually with them,” said Kurian.

It has not all been good news for Google this week, however. According to reports, healthcare software provider Epic is opting to move ahead with AWS and Azure as its cloud providers, citing a lack of interest in Google Cloud among its customers. Healthcare, retail and financial services are the three primary industries Google is targeting, as Kurian noted at last April’s Google Next.


Announcing the launch of IT Pro 20/20


Cloud Pro

20 Jan, 2020

We understand that as IT professionals, your free time is incredibly valuable. You need to be able to figure out what stories truly matter and what trends are worth your attention, and do so quickly and with confidence. While we take immense pride in the content that we create online, we recognise that a website can be too mercurial for some of our readers.

It’s for this reason that we decided to launch our very own podcast at the end of last year, helping to diversify the types of content we deliver and offering a whole new way of consuming the latest industry stories.

Well, we’re not stopping there. Today, we’re excited to announce the launch of ‘IT Pro 20/20’, a digital magazine that shines a spotlight on the content that we feel every IT professional should be aware of, only in a condensed version that can be read on the go, at a time that suits you.

Each issue will act as a snapshot of the month gone by, serving as an essential guide to the industry’s most important developments and trends, some of which may have been overshadowed by transient news headlines. IT Pro 20/20 will also feature exclusive content – features and analysis that have yet to be published on our website. We want to provide IT decision-makers with a clear view of the month’s events, and a glimpse of the issues that could shape their roles in the months and years to come.

The magazine allows us to showcase our content in a way that we’ve not been able to previously, providing additional context and insight in a format that works on any device, whether online or offline. It’s perfect for those that enjoy a long read, whether that’s on the commute to or from work, or during a lunch break.

IT Pro 20/20 will be available to download for free, released on the last day of every month, starting from 31st January. As each issue will reflect the main themes of the month, we thought it would be best to start with a look at what this year is likely to hold for tech, and at some of the key issues facing IT decision-makers. As we progress through the year, we’re also going to major on certain key events, so keep an eye out for those.

It’s the same high-quality content that you’ve come to expect from IT Pro, only in a format that complements your hectic schedule.

Head over to our sign-up page on our sister site to register for a free issue of IT Pro 20/20, delivered on the last day of each month.

Exploited Internet Explorer flaw won’t be patched until next month


Nicole Kobie

20 Jan, 2020

Microsoft has warned that millions of people still using the Internet Explorer browser could be at risk from a zero-day flaw that is actively being exploited by hackers.

The flaw, which is in a scripting engine of the browser, makes use of memory corruption to execute code. “An attacker who successfully exploited the vulnerability could gain the same user rights as the current user,” Microsoft noted in its security guidance. “If the current user is logged on with administrative user rights, an attacker who successfully exploited the vulnerability could take control of an affected system.”

That could let attackers install programs, access data, or create new accounts, the company noted.

“One way in which the vulnerability could be exploited is via a web-based attack, where users could be lured into visiting a boobytrapped webpage – perhaps via a malicious link in an email,” security and industry analyst Graham Cluley noted in a blog post.

Cluley added that the flaw appeared to be related to a similar vulnerability in Mozilla Firefox spotted earlier this month. The discovery of both flaws was attributed to Qihoo 360; as it reported the Firefox flaw last week, the security firm tweeted that an IE version also existed.

Microsoft said it was aware of “limited targeted attacks” using the vulnerability and that it was working on a fix, suggesting it would come with the next Patch Tuesday, due out on 11 February.

While users will have to wait for a patch, Microsoft noted that anyone running IE on various versions of Windows Server may be protected by default settings called Enhanced Security Configuration. Microsoft also suggested a workaround for other users, which involves restricting access to JScript.dll, though that will have to be undone when the update is issued.

“Blocking access to this library can prevent exploitation of this and similar vulnerabilities that may be present in this old technology,” notes guidance from the CERT Coordination Center at Carnegie Mellon. “When Internet Explorer is used to browse the modern web, jscript9.dll is used by default.”
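For admins who need the stopgap in the meantime, the workaround amounts to denying the Everyone group access to the legacy jscript.dll while leaving jscript9.dll untouched. The commands below reflect what Microsoft’s advisory described at the time for 64-bit systems, run from an elevated command prompt; treat them as indicative and check the current advisory before applying anything:

    takeown /f %windir%\syswow64\jscript.dll
    cacls %windir%\syswow64\jscript.dll /E /P everyone:N
    takeown /f %windir%\system32\jscript.dll
    cacls %windir%\system32\jscript.dll /E /P everyone:N

As noted above, the change has to be undone once the patch arrives, by removing the deny entry again with cacls and the /E /R everyone switches.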

The best mitigation is to switch to a modern browser, with Microsoft referring to IE as a “compatibility solution” for older apps rather than a browser to push out widely to staff. However, according to Net Applications’ Market Share figures, 7.4% of web users are still on IE — two percentage points more than Microsoft’s Edge, which was first released in 2015.

Revamped UK Cloud Awards 2020 now open for entries


Cloud Pro

20 Jan, 2020

The Cloud Industry Forum (CIF) and Cloud Pro are excited to announce that the seventh annual UK Cloud Awards is open for business – with a revamped list of categories and a new location.

Aiming to celebrate the very best successes in digital transformation, diversity, innovation and customer experience, the 2020 UK Cloud Awards is seeking entries for its 20 new-look categories.

Organisations have between now and 20 March to submit an entry to the 2020 UK Cloud Awards through its website, with the shortlist announced on 28 April. The final awards ceremony will then be hosted on 4 June at The Brewery in London.

“Back for a seventh year, the UK Cloud Awards continue to evolve,” said CEO of CIF, Alex Hilton. “We have completely reshaped them, with a fantastic new location and twenty transformational awards that reflect the pace of change and innovation in the UK industry.

“Our emphasis for 2020 is recognising people, projects and innovative companies that are making an impact with their customers. We are thrilled to continue to be involved with shaping the cloud industry.”

The awards are no longer broken down into categories and subcategories, and the list has been expanded from the 17 categories organisations could enter last year.

This beefed-up list in 2020 includes project-based awards for excellence in particular areas, such as FinTech Project of the Year, Big Data/Analytics Project of the Year, and DevOps Project of the Year.

There will also be awards recognising cloud innovation, including Cloud Product Innovation of the Year and Cloud Security Innovation of the Year.

All entries will be assessed and scrutinised by a panel of expert judges, spearheaded by returning head judge Jez Back, managing director of Erebus Technology Consulting.

Joining him are a string of recognised experts, including Splunk chief technology advocate Andi Mann and industry analyst Jon Collins, among other widely recognised industry figures.

“I am delighted to be back once again as Chair of Judges, for what really has become one of the most credible, stand-out events in the UK’s thriving cloud industry,” Back said.

“This year the awards are bigger than ever, with a new list of twenty categories available to enter, covering AI, DevOps, partnerships, cloud project implementation and more.

“We aim to celebrate diversity and talent, and through the UK Cloud Awards recognise the leading figures at the forefront of this innovation.”

The full list of award categories is as follows:

  • Cloud Product Innovation of the Year
  • Cloud Security Innovation of the Year
  • Cloud Collaboration Platform of the Year
  • SaaS Implementation Project of the Year
  • Internet of Things Project of the Year
  • FinTech Project of the Year
  • Big Data/Analytics Project of the Year
  • Machine Learning/Artificial Intelligence Project of the Year
  • DevOps Project of the Year
  • Public Sector Project of the Year
  • Enterprise Project of the Year
  • SMB Project of the Year
  • Digital Transformation Project of the Year
  • Cloud Start Up Company of the Year
  • Cloud Growth Business of the Year
  • Cloud Technology Positive Action Award of the Year
  • Cloud Team of the Year
  • Cloud Channel Partner of the Year
  • Cloud Leader of the Year
  • Newcomer of the Year

“The UK Cloud Awards have gone from strength to strength since their creation,” said Dennis group editor and editorial director of B2B, Maggie Holland.

“I remain as excited about the event and everything it stands for now as I was from day one. As a journalist writing about cloud computing, I get to see first-hand just how much talent and innovation this industry has to offer.

“Yet, as a judge, I am always still surprised and delighted by how the bar keeps on being raised.

“I can’t wait to see the entries for this year’s awards and to be able to help those shortlisted and the winners, as well as the great and good of the cloud industry, celebrate that success at the awards event night itself.”

Why CIOs and CTOs need to realise digital business transformation requires collaboration

The evolution of information technology (IT) leadership – sometimes driven by the line of business (LoB), sometimes by the legacy IT organisation – is reaching a state of balance where business-IT collaboration sets the stage for meaningful digital transformation.

Software application leaders must examine these predictions to learn how this equilibrium will take shape in their organisation. Over the next two years, around half of organisations will experience increased collaboration between their business and IT teams, according to the latest worldwide market study by Gartner.

Gartner believes that the dispute between LoB leaders and traditional CIOs or CTOs over the control of enterprise technology deployment will lessen, as both sides learn that joint participation in the process is critical to the success of innovation within a digital workplace.

Application leadership market development

"Business units and IT teams can no longer function in silos, as distant teams can cause chaos," said Keith Mann, senior research director at Gartner. "Traditionally, each business unit has had its own technology personnel, which has made businesses reluctant to follow the directive of central IT teams."

Increasingly, however, savvy organisations now understand that a unified objective is essential to ensure the integrity and stability of core business. As a result, individuals stay aligned with a common goal, work more collaboratively and implement new business technologies effectively.

The role of 'application leader' has changed significantly with the replacement of manual tasks by cloud-based applications in digital workplaces. According to the Gartner assessment, the application leader must ensure that this transition is supported by appropriate skills and talent.

As more and more organisations opt for cloud-based applications, artificial intelligence (AI) techniques such as machine learning, natural language processing, chatbots and virtual assistants are emerging as digital integrator technologies.

"While the choice of integration technologies continues to expand, the ability to use designed applications and data structures in an integrated manner remains a complex and growing challenge for businesses. In such scenarios, application leaders need to deliver the role of integration specialists in order to ensure that projects are completed faster and at a lower cost," said Mann.

Outlook for application development collaboration

Enterprise application leaders will have to replace the command-and-control model with versatility, diversity and team engagement with key stakeholders. Application leaders must become more people-centric and provide critical support to digital transformation initiatives.

Additionally, in a digital workplace, it is the application leader’s responsibility to serve as the organisational 'nerve centre' by quickly sensing, responding to, and provisioning applications or IT infrastructure.

“Application leaders will bring together business units and central IT teams to form the overall digital business team,” said Mann. That said, in this environment, the successful IT vendors will be those that adapt to this procurement transition – where a variety of buyers and influencers drive the discovery, consideration and ultimate solution selection process.


Expect 2020 to see public and private cloud outspend traditional IT infrastructure, says IDC

Revenue for cloud infrastructure equipment dipped slightly in the most recent quarter, according to IDC – though the firm argues this is more indicative of a wider IT downturn than a cloud-specific malaise.

The analyst firm, in its latest Worldwide Quarterly Cloud IT Infrastructure Tracker report, put the overall quarterly figure at $16.8 billion (£12.9bn) – a decline of 1.8% year over year. IDC increased its forecast for total 2019 spending to $65.4bn, essentially flat on the previous year.

Public cloud took something of a hit in IDC’s figures, declining 3.7% year over year, albeit still recording $11.9bn in quarterly sales. The analyst firm expects more quarterly volatility, particularly as the hyperscalers continue to dominate the market, even as the overall segment trends upwards.

For 2019, public cloud saw minimal change in market share, comprising just over 30% of the overall cloud IT infrastructure market; this is expected to reach almost 40% by 2023. Yet the key year here is 2020, when IDC expects combined public and private cloud spending to outstrip traditional IT on an annual basis. In 2019 the balance nearly tipped, with public and private cloud taking 49.8% of spending for the year, even though individual quarters (53.4% in Q3 2019) had already crossed the line.

When it came to specific vendors, Dell Technologies was the best performer in Q3, capturing a 15.5% market share on revenues of $2.62bn. This was a 2.6% downturn on the previous year, with HPE (8% rise, 11% share), Inspur (14.8% rise, 7.2% share) and Cisco (5% rise, 6.7% share) helping to take up the slack. Lenovo, at $723 million, saw a 20.2% yearly downturn, dropping to fifth in the market.

Looking at specific geographies, decline was noted in the US, Western Europe and Latin America – again, IDC noted, related to the general market blip. Asia Pacific, excluding Japan, saw growth of 1.2% year on year, which IDC again characterised as essentially flat.

You can find out more about the IDC Worldwide Quarterly Cloud IT Infrastructure Tracker here.


Why ‘lift-and-shift’ is an outdated approach


Cloud Pro

17 Jan, 2020

There are many reasons why a business might consider moving some or all of its applications to public cloud platforms like AWS, Azure or Google Cloud. One of the most compelling, however, is the cloud’s ability to reduce the complexity of an organisation’s IT. Removing the need for management of the physical infrastructure that applications run on can yield big benefits for simplification.

Running your applications in the cloud allows you to take advantage of a new compute economy and scale the capacity up and down as needed, as well as letting you hook into a huge range of additional tools and services. While this is well and good for building new applications, porting pre-existing legacy apps over to cloud-based services can often prove challenging.

Organisations that want to migrate pre-existing workloads to a public cloud are faced with a choice: do they re-architect their applications for the cloud, or do they simply attempt to port them to their chosen platform wholesale, with no alteration? For many companies, the latter approach – known as the ‘lift-and-shift’ method – initially sounds like the more attractive option. It allows them to get into the cloud faster, with a smaller amount of work, meaning the IT team has more time to devote to other elements of the migration or to developing entirely new capabilities.

Sadly it’s not quite as simple as that. While some applications can be moved over fairly seamlessly, not all apps are suited to this method. Compatibility is the first issue companies are liable to run into with lift-and-shift; particularly when dealing with legacy applications, there’s a good chance the original code relies on old, outdated software or defunct libraries. This could make running that app in the cloud difficult, if not impossible, without modification. Organisations also misinterpret the business continuity options available in public cloud, sometimes assuming they are the same as their on-premises counterparts.

“In a lot of cases with server-side applications, they’re not delivered and packaged as well as workspace applications are on an end-user’s desktop,” says Lee Wynne, CDW’s Public Cloud Architecture Practice Lead, “so finding somebody who actually installed the application on the server in the first place can be difficult.”

This, Wynne points out, along with a lack of documentation and issues with upgrading the original OS that a virtual machine runs on, can prove “very costly and time consuming” when porting legacy applications running an old OS to the cloud. In terms of business continuity, Wynne says:

“It can take a fair amount of explaining that in the public cloud domain, the ability to move machines from host to host with zero downtime across availability zones isn’t really a thing, therefore if you are moving a critical business workload from your current data centre that is highly protected by various VMware HA features, you need to consider how that will remain online through availability zone outages. In other words, you have to architect for failure”.

Cost modelling is also a critical component, Wynne says, and organisations need to make sure that the cost modelling they’re doing is an accurate representation of what their actual usage will look like.

“The accuracy element of cost modelling is really critical when you’re assessing at scale. You’re not just assessing a couple of VMs, you’re assessing a whole data centre or a few thousand; you’ve got to be accurate with the costs, and you’ve got to be able to get the instance types that are displayed during those accurate cost assessments.

“Therefore picking the tooling and the right telemetry at the beginning, and getting those costs accurate for your business case, is probably one of the first risks that you’ll come across with a cloud migration. Otherwise, you just end up making it three times more expensive than it actually is, and therefore providing executives and decision makers with the wrong information.

“If you think way back when we went from physical servers to virtual servers, no one did an as-is migration of those physical machines – they monitored them over a two-to-three month period, and then they migrated them based on real utilisation. So they cut down memory, cut down CPU, so they could fit as much as possible on the target VMware cluster. And this is exactly the same with public cloud. That’s why you ensure that you do your cost modelling right. It needs to be lean and optimised, as you are paying by the minute or, in some cases, by the second.”
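To make that concrete, here is a minimal right-sizing sketch in Python. It is illustrative only, not CDW’s tooling: the instance catalogue, prices and sample figures are all hypothetical, but the logic mirrors the approach Wynne describes – size each machine to its monitored usage plus headroom, rather than to its current allocation.

    # Hypothetical instance catalogue, ordered cheapest first:
    # (name, vCPUs, memory in GiB, price per hour in $)
    CATALOGUE = [
        ("small",  2,  4, 0.023),
        ("medium", 2,  8, 0.046),
        ("large",  4, 16, 0.092),
        ("xlarge", 8, 32, 0.184),
    ]

    def percentile(samples, pct):
        """Nearest-rank percentile of a list of numeric samples."""
        ordered = sorted(samples)
        index = max(0, int(round(pct / 100 * len(ordered))) - 1)
        return ordered[index]

    def right_size(cpu_samples, mem_samples, headroom=1.2):
        """Pick the cheapest shape covering observed p95 usage plus headroom."""
        cpu_needed = percentile(cpu_samples, 95) * headroom
        mem_needed = percentile(mem_samples, 95) * headroom
        for name, vcpus, mem_gib, price in CATALOGUE:
            if vcpus >= cpu_needed and mem_gib >= mem_needed:
                return name, price
        return CATALOGUE[-1][0], CATALOGUE[-1][3]  # fall back to the largest

    # A VM allocated 8 vCPUs and 32 GiB that actually peaks at around
    # 1.5 cores and 5 GiB maps onto a far cheaper shape.
    name, price = right_size([0.8, 1.1, 1.5, 0.9], [4.2, 4.8, 5.0, 4.5])
    print(f"{name} at ${price}/hour")  # medium at $0.046/hour

Feed it allocated capacity instead of monitored usage and the same estate comes out several times more expensive – which is precisely the cost-modelling trap described above.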

It’s important to establish how your apps interact with each other, too. Very few business applications exist in isolation, so making sure that all of the integrations and connections between software still function as required – both during and after the migration – is vital. For this reason, CDW includes a dependency mapping service as part of its Cloud Plan offering, which analyses the connections between VMs and then groups them together into functions, so that they can be migrated in smaller groups.

“That reduces risk significantly,” Wynne says. “It’s naive to think that if you’re looking to do a migration on a couple of hundred virtual machines, that you’re going to do them all in one go. It’s not the way it works, you do it batch by batch. So what you don’t want to do is introduce risk by migrating a batch of virtual machines over to public cloud and then realise afterwards that actually, these machines are going to communicate back to the source data centre on an application protocol, which is latency-sensitive – so it’ll break it, it won’t work, it’ll be too slow. So you end up having to roll back, or migrate more VMs really quickly that you didn’t plan for.”
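As a toy illustration of what dependency mapping buys you – CDW’s Cloud Plan service is, of course, far richer – grouping VMs by their observed connections is essentially a connected-components problem, with each component becoming one migration batch. A short Python sketch, using hypothetical VM names and flows:

    from collections import defaultdict

    def migration_batches(connections):
        """Group VMs into connected components; each component is one batch."""
        graph = defaultdict(set)
        for a, b in connections:
            graph[a].add(b)
            graph[b].add(a)
        seen, batches = set(), []
        for vm in graph:
            if vm in seen:
                continue
            batch, stack = [], [vm]
            while stack:  # iterative depth-first search
                node = stack.pop()
                if node in seen:
                    continue
                seen.add(node)
                batch.append(node)
                stack.extend(graph[node] - seen)
            batches.append(sorted(batch))
        return batches

    # Flows discovered by monitoring: web1/app1/db1 talk to each other,
    # as do web2/db2, giving two independently movable batches.
    print(migration_batches([("web1", "app1"), ("app1", "db1"), ("web2", "db2")]))
    # [['app1', 'db1', 'web1'], ['db2', 'web2']]

Migrating either batch wholesale keeps chatty, latency-sensitive neighbours together, so nothing ends up calling back across the WAN to the source data centre.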

With all this in mind, it’s absolutely key that when starting a cloud migration project, companies take the time to look at their applications realistically, identifying which of them need to be retooled before they can be moved. There may even be cases where it’s faster to rebuild an application from the ground up, rather than tweaking the existing version for a cloud deployment.

The reduction of operational complexity is a key issue for organisations of all types, and it’s one that the cloud can play a huge role in, but be warned – the process of cloud migration isn’t always a simple matter of scooping up your VMs en masse and dumping them into your chosen cloud environment. A good cloud migration involves looking long and hard at which of your applications truly belong in the cloud, and then investing time and effort in making sure that those applications are engineered to get the maximum benefit from a cloud environment.

Organisations that are looking to start this process don’t have to do it alone; CDW is a tried-and-tested partner that can help guide your business to the cloud and make sure that your applications are delivering the most value with the least overhead.

Get in touch with a CDW Account Director and ask for details on CloudCare® JumpStart and CloudCare® CloudPlan. Visit uk.cdw.com

Future-proofing your cloud strategy


Cloud Pro

20 Jan, 2020

Adopting cloud technologies is something that many organisations have struggled with, and these difficulties have been caused in no small part by large amounts of pre-existing technical debt. Legacy applications and physical infrastructure have, in many cases, become a millstone around the necks of companies who want to capitalise on the economic and transformational benefits of the cloud, leading to long migration roadmaps.

This is a challenge in and of itself, but as companies race to embrace the cloud, it’s imperative to ensure they don’t repeat the same mistakes that caused these headaches in the first place. Companies have often found themselves locked into expensive and inflexible relationships with vendors, and while this is less of an issue with the cloud than with on-premises systems, it has by no means disappeared altogether.

There are ways to mitigate the risk of cloud vendor lock-in, however, and the most important factor is proper planning. Planning is always the first phase of any cloud migration, and it’s vital to ensure that before you even open an account with a cloud provider, you’ve worked out all your application dependencies, compatibility requirements and budgets. You can then assess which cloud providers meet those needs, making it easier to identify an alternative provider to move to in the future.

Businesses should also take a long, hard look at which services they actually need to move to the cloud. Contrary to some popular wisdom, not every application and service is necessarily better off being run in the cloud. Establishing what does and doesn’t need to be migrated can save time and money later on, both in the initial migration process and potentially in subsequent ones, should those workloads be repatriated to on-premises systems.

Organisations should be mindful of contract lengths, too. Long lease terms on physical infrastructure are one of the biggest impediments to organisations’ cloud migrations, and although many cloud providers offer discounts for longer commitments, these contribute to the very lock-in issues you’re trying to avoid.

It’s important to remember that you can design and set up most elements of your public cloud infrastructure without spending a single penny, and you should do this as early as possible. Although public cloud providers market themselves on their ability to get customers up and running quickly, mature organisations that want to undertake a comprehensive cloud deployment will find that setting this up, along with all the associated account structures and security and compliance frameworks, can eat up unexpectedly large amounts of time.

“None of this costs any money in terms of consumption,” says Lee Wynne, CDW’s public cloud architecture practice lead. “You can build out a well-architected multi-account AWS or Azure design, you can do all your billing strategy, and you can design and provision all your platform architecture and it costs you no money. There are no cloud consumption costs at all to do that; you have to know what you are doing but it won’t cost you a penny. So when you’re ready, when that urgent project comes along, and they go ‘yeah, we need to spin this up in Azure or AWS, or GCP ASAP’, then you’re ready to go. Nobody’s waiting for anything, it’s just done and dusted, a big tick in the box for your IT services. There’s nothing dependent on this from a project perspective, no cost assessments, no buzzwords about machine learning and AI strategies, no politics.

“Now, when it comes to thinking about vendor lock-in you start thinking ‘well, should I just be doing this for one platform? Or should I do it for all of them?’, because it’s the same. That strategy is the same across AWS, Azure or Google Cloud. They all have the same thing: a multi-account structure and a platform architecture, all can be designed and configured with no consumption so it’s not too difficult to do that for all of them. And then all of a sudden, you’ve got a little bit more choice and bit more flexibility about where you choose to deploy certain elements. The only complexity is how you set up network comms between them, if that even needs to happen.”
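As a simplified illustration of that point, the skeleton of a multi-account structure can be scripted long before any workload exists, at no consumption cost. The sketch below uses the AWS Organizations API via boto3 and assumes credentials for an organisation’s management account; the OU names are hypothetical, and Azure management groups or Google Cloud folders play the equivalent role on the other platforms.

    import boto3

    org = boto3.client("organizations")

    # The root sits at the top of the organisational tree.
    root_id = org.list_roots()["Roots"][0]["Id"]

    # Carve the estate into organisational units before any workload exists;
    # none of this incurs consumption charges.
    for ou_name in ["security", "shared-services", "workloads-prod", "workloads-dev"]:
        ou = org.create_organizational_unit(ParentId=root_id, Name=ou_name)
        print("created OU:", ou["OrganizationalUnit"]["Id"], ou_name)

With the account structure, billing strategy and guardrails agreed up front, the day an urgent project lands there is nothing left to design – only workloads to deploy.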

Designing your workloads and applications in such a way that they can be deployed across multiple clouds from the outset makes jumping ship to a new provider much easier, and it also has the added bonus of making them much leaner and more portable in general.

Vendors themselves have even started warming to the idea of customers deploying across multiple providers, and multi-cloud tools and functionalities are becoming increasingly common. VMware, for example, has been rapidly expanding its portfolio of multi-cloud management tools and has recently announced plans to incorporate Kubernetes management tools into vSphere, making multi-cloud workload management even simpler. Microsoft has also got in on the trend, announcing earlier this year that Azure Cost Management will help customers manage their AWS spending.

Another important consideration is the tools that you use to build and run your apps. Most organisations will likely construct their apps using open source (or at the very least widely supported) programming languages, but once those applications are in the cloud, it’s very easy to end up accidentally bolting additional compatibility requirements onto them.

“For example, AWS has CodeStar and Azure has Azure DevOps, and they have a lot of native tools in their ecosystem that you can use to wrap around your source code, test and deployment pipelines,” Wynne points out. “Then you become reliant on it; it becomes part of your DevOps process, and then it becomes part of your culture, and it becomes part of your team. So over time, you can easily end up locked into one of those cloud platforms, if you consume a lot of their native services around your source code, and you begin to extend it further into proprietary database services such as Google Firebase, Azure Cosmos and AWS Aurora and serverless monitoring and debugging tools.”

Vendor lock-in remains a major problem for many IT departments, and it can be just as much of a risk when it comes to the cloud. With thorough planning, good design and careful monitoring, however, it is possible to architect your cloud estate in a way that minimises the chance of your organisation repeating the woes of times gone by. For businesses that want to ensure they’re implementing the right cloud strategies and designing their platforms in the most flexible way possible, CDW is an ideal partner for planning and orchestrating your cloud migration, bringing years of experience and unparalleled cloud expertise to bear on your business challenges.

Get in touch with a CDW Account Director and ask for details on CloudCare® JumpStart or CloudCare® CloudPlan. Visit uk.cdw.com

Microsoft plots ‘carbon negative’ target for 2030


Keumars Afifi-Sabet

17 Jan, 2020

Microsoft has outlined a set of ambitious plans to remove more carbon from the atmosphere than it emits by the end of the decade.

By 2030, Microsoft is aiming to be ‘carbon negative’, meaning the carbon it removes from the atmosphere will outweigh the carbon it emits, including emissions from its wider supply chain.

This is in addition to a $1 billion climate fund to accelerate research and development into carbon reduction, capture and removal technologies that do not yet exist.

Moreover, Microsoft wants to continue the trend of lowering emissions while increasing carbon removal so that by 2050 it will, on paper, have removed all the carbon it has emitted since its foundation in 1975.

“While the world will need to reach net-zero, those of us who can afford to move faster and go further should do so,” said Microsoft president Brad Smith.

“We recognize that progress requires not just a bold goal but a detailed plan. As described below, we are launching today an aggressive program to cut our carbon emissions by more than half by 2030, both for our direct emissions and for our entire supply and value chain.

“While we at Microsoft have worked hard to be ‘carbon neutral’ since 2012, our recent work has led us to conclude that this is an area where we’re far better served by humility than pride. And we believe this is true not only for ourselves, but for every business and organization on the planet.”

The industry stalwart is the latest in a string of companies, including Amazon and HP, to enter an arms race geared towards reducing carbon footprints and embracing cleaner, greener technologies.

Amazon, for example, has pledged to be carbon neutral by 2040, while Google Cloud hit its 100% renewable energy goal in April 2018, powering its data centres and offices from renewable sources, including solar and wind. Salesforce, similarly, achieved net-zero greenhouse gas emissions the previous year.

HP, on the other hand, has committed to releasing routine sustainability reports that track its progress in its aims to reduce its carbon footprint. It has started to build many of its products with sustainability in mind, including the forthcoming HP Elite Dragonfly business 2-in-1.

Microsoft says it can achieve its own “aggressive” set of targets by first investing in nature-based initiatives, such as planting trees, with the goal of shifting to technology-based programmes when they become more viable.

The wider strategy, however, encompasses a set of smaller goals that Microsoft hopes to hit along the way to achieving its major targets for 2030 and 2050.

By 2025, for instance, Microsoft is hoping to shift to a 100% supply of renewable energy, while aiming to fully electrify its global campus operations vehicle fleet by 2030.

The firm is hoping to implement new procurement processes and tools to incentivise its suppliers to reduce their carbon emissions too, with these pencilled in for July 2021. For customers, meanwhile, Microsoft will roll out a sustainability calculator and dashboard that estimates emissions from Azure services.

Google-parent Alphabet now worth $1 trillion


Bobby Hellard

17 Jan, 2020

Google’s parent company Alphabet has become the fourth US tech company to reach a market value of $1 trillion (£765 billion), ending trading at $1,451.70 per share on Thursday.

It will hold its quarterly conference call to discuss Q4 and full-year 2019 financial results on 3 February, but Wall Street analysts are expecting it to report revenue of $46.9 billion – up 20% year-on-year.

Alphabet is now part of a US tech elite with Apple, Amazon and Microsoft all having reached the $1 trillion market cap over the past two years. The iPhone maker was the first to surpass the mark in August 2018 with Amazon hitting it a month later. Microsoft was the third company, doing so in April 2019.

Like Amazon and Microsoft, revenues from its cloud ventures are believed to have contributed heavily to Alphabet’s overall growth, with Google Cloud Platform doubling its revenue run rate to $2 billion per quarter between February 2018 and July 2019, according to CNBC.

Cloud, Google’s Play app store and Google’s hardware division have all been key drivers for the company, according to its earnings report. In Q3 of 2019, revenue from these segments of the business increased 39% year-on-year.

Sundar Pichai’s recent promotion to CEO of Alphabet has also helped to increase optimism among stockholders, according to Pivotal Research Group analyst Michael Levine.

“We are incrementally more constructive about what we perceive as multiple ways to get paid under the recently appointed Pichai regime,” Levine wrote, as reported by Markets Insider. His valuation is reportedly based on an estimate of Alphabet’s 2021 EBITDA (earnings before interest, tax, depreciation and amortization).

Pichai took over as CEO of both Alphabet and Google after co-founders Larry Page and Sergey Brin stepped down in December. The change of leadership offers “the most optionality for multiple expansion for the stock we have seen in years,” according to Levine, who also cited Thomas Kurian’s stewardship of Google’s cloud business as a positive sign.