All posts by Rene Millman

University of Texas global database to help scientists explore effects of climate change on North Pole


Rene Millman

22 Oct, 2018

A new database has been created to help track the effects of climate change on the North Pole.

Researchers at the University of Texas at San Antonio have developed the database, called ArcCI (Arctic CyberInfrastructure), which combines thousands of images of the Arctic Ocean taken over the years.

They said the database would help scientists and the wider world see the physical changes occurring in the region, including ice loss. It is hoped the web-based repository will enable researchers to spend more time analysing information rather than just collecting and processing data.

“This is to help scientists spend more time doing the science,” said Professor Alberto Mestas-Nuñez, one of two researchers at the University of Texas at San Antonio working on the on-demand data mining module.

“At the present time there isn’t a place on the internet that provides all these datasets together with an algorithm that allows [extraction of] information,” added Mestas. “Most of the time scientists spend time getting data and preparing it. Typically, it’s about 80% preparing the data and 20% doing the actual science. We want to break that paradigm.”

The system will enable scientists to extract information on various ice properties, including submerged ice, ice concentration, melt ponds and the ice edge – the boundary between an area of ice and the open sea.
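The article doesn’t describe ArcCI’s algorithms, but a minimal sketch of the kind of per-pixel classification such a module performs might look like the following. The band combination and thresholds here are illustrative assumptions, not ArcCI’s actual method:

```python
import numpy as np

def classify_sea_ice(rgb):
    """Toy per-pixel classifier for high-resolution sea-ice imagery.

    rgb: (H, W, 3) float array scaled to [0, 1].
    Returns an (H, W) label array: 0 = open water, 1 = melt pond, 2 = ice.
    The thresholds are illustrative guesses, not ArcCI's real algorithm.
    """
    brightness = rgb.mean(axis=2)
    blueness = rgb[..., 2] - rgb[..., 0]  # blue channel minus red channel

    labels = np.zeros(brightness.shape, dtype=np.uint8)
    labels[(brightness > 0.3) & (blueness > 0.1)] = 1  # bright-ish and blue: melt pond
    labels[brightness > 0.6] = 2                       # very bright: ice
    return labels

# A derived property such as ice concentration is then just the fraction
# of pixels labelled as ice:
# concentration = (classify_sea_ice(image) == 2).mean()
```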

The original idea for the ArcCI database came from Professor Hongjie Xie, the principal investigator of the project at UTSA and a professor in the university’s Department of Geological Sciences. While big data analytics and dashboards have been used in many industries, they have not yet been applied to monitoring ice in the Arctic.

Xie, along with Xin Miao at Missouri State University, started working on the project five years ago. The project has also been funded by the National Science Foundation to develop the database, which uses high-resolution imagery obtained on-site, via satellite or via airborne monitoring.

Currently, the cloud-based system holds about a terabyte of images but will increase in the future as new images are added. The database will also integrate new algorithms as well as additional datasets as they become available.

The cloud framework and interface are being prototyped by Chaowei Yang at George Mason University, another investigator partnering with UTSA. A beta version of ArcCI will be presented at a meeting of the American Geophysical Union to be held in Washington D.C. in December 2018.

Huawei to sell servers powered by own chips


Rene Millman

10 Oct, 2018

Huawei is to begin selling servers powered by its own processors, which will see it move away from Intel CPUs. 

According to a Reuters report, the processors are made by Huawei’s semiconductor subsidiary HiSilicon, whose chips already feature in some of the company’s smartphones and telecommunications equipment.

Currently, Huawei sells servers powered by Intel processors to telecoms and cloud companies. Huawei has not divulged how many of its servers will make use of its own chips.

The report said that Huawei will be using the 7nm-based Ascend 910 chipset; the firm claims this is twice as powerful as competitor Nvidia’s V100. The Ascend 910 chipset will be available in the summer of next year.

The chips won’t be on offer to third parties, according to the firm’s rotating chairman Eric Xu, speaking at the company’s annual global partners’ conference, Huawei Connect.

“Since we do not sell to third parties, there is no direct competition between Huawei and chip vendors,” Xu said on Wednesday, in response to questions about competition from Qualcomm, AMD and Nvidia. “We provide hardware and cloud computing service.”

In addition to the Ascend 910, there is also the Ascend 310, a chip for smart devices. Both chipsets are aimed at artificial intelligence applications. The Ascend 910 is focused on datacentre usage; Huawei said the chip can not only process data faster than competitors’ but can also train machine learning models in minutes. The Ascend 310 is aimed at smart devices as well as Internet of Things devices using artificial intelligence.

“Going forward, we need to think of new ways to prepare our business and industry for change. There are clear signs that AI will change or disrupt a whole host of industries,” said Xu.

The firm unveiled its first AI chip, the Kirin 970, in 2017. In 2018 it introduced another AI chip, the Kirin 980, which is expected to feature in the upcoming Mate 20 flagship handset.

Google Plus to shut down after massive data leak


Rene Millman

9 Oct, 2018

Google is shutting down Google Plus after the service failed to gain traction with people happier with the likes of Facebook and Twitter, and after the company discovered a massive data leak affecting up to half a million users.

In a blog post, Google said that after a major security review, dubbed Project Strobe, the social networking service would close. The review found a sizable flaw in Google Plus APIs that meant malicious apps could extract data such as the name, email address, occupation, gender, and age from a person’s profile.

“It does not include any other data you may have posted or connected to Google+ or any other service, like Google+ posts, messages, Google account data, phone numbers or G Suite content,” said Ben Smith, Google Fellow and vice president of engineering.

Smith said that “the Profiles of up to 500,000 Google+ accounts were potentially affected.” However, Smith added that the API’s log data is kept for only two weeks, and analysis showed that up to 438 applications may have used this API.

“We found no evidence that any developer was aware of this bug, or abusing the API, and we found no evidence that any Profile data was misused,” Smith said. 

Google’s Privacy & Data Protection Office reviewed the issue, looking at the type of data involved, whether the firm could accurately identify which users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response, according to Smith. “None of these thresholds were met in this instance,” he said.

Smith said that despite Google’s engineering teams putting in a lot of effort, “it has not achieved broad consumer or developer adoption, and has seen limited user interaction with apps. The consumer version of Google+ currently has low usage and engagement: 90 percent of Google+ user sessions are less than five seconds.”

Google Plus will come to an end for consumers next August, but business users will still be able to use the service as an internal corporate social network.

The firm has also promised to institute new security rules, including limits around the types of use cases that are permitted to access consumer Gmail data.

“Only apps directly enhancing email functionality – such as email clients, email backup services and productivity services (e.g., CRM and mail merge services) – will be authorised to access this data,” Smith added.
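In practice, limits like these surface to developers as narrower OAuth scopes that users must approve. As a rough illustration using Google’s Python OAuth client – the specific scope and file name are assumptions for the example, not details from the announcement:

```python
# pip install google-auth-oauthlib
from google_auth_oauthlib.flow import InstalledAppFlow

# Request only the narrow scope the app genuinely needs; under rules like
# those described above, broad Gmail access is reserved for apps that
# directly enhance email functionality. The scope here is illustrative.
SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
creds = flow.run_local_server(port=0)  # opens the consent screen in a browser
print("Granted scopes:", creds.scopes)
```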

Google will also remove access to contact interaction data from the Android Contacts API within the next few months. In addition, Google Account permissions dialog boxes will be split to show each requested permission, one at a time, within its own dialog box.

Why GDPR creates a “vicious circle” for marketers


Rene Millman

14 Jun, 2018

New data protection rules will frustrate consumers who demand personalised experiences yet are wary of handing over their data, but organisations that prove trustworthy stand to benefit, according to experts.

The General Data Protection Regulation (GDPR) came into force on 25 May and gives people more control over what personal data organisations can collect, allowing them to move it to other companies or demand organisations delete it altogether.

It also requires companies to be more transparent about how they use people’s personal information and gets rid of passive opt-outs some organisations relied on to obtain customer consent: now people must actively agree to their data being collected and processed.
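What active agreement means in practice is an explicit, per-purpose, timestamped opt-in rather than consent assumed by default. A minimal sketch of such a record follows – the field names are illustrative, not drawn from the regulation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One explicit opt-in for one processing purpose.

    GDPR-style consent is specific, affirmative and revocable, so it is
    stored per purpose with a timestamp and never defaulted to True.
    """
    user_id: str
    purpose: str          # e.g. "email_marketing"
    granted: bool
    recorded_at: datetime

def record_consent(user_id: str, purpose: str, box_ticked: bool) -> ConsentRecord:
    # A pre-ticked box or mere silence does not count as consent;
    # only an affirmative action by the user does.
    return ConsentRecord(user_id, purpose, granted=box_ticked,
                         recorded_at=datetime.now(timezone.utc))
```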

As a result of that, and with GDPR making people more aware of the value of their data, marketers’ jobs are about to get harder, according to The Content Advisory’s privacy lead, Tim Walters, speaking at Aprimo’s recent Sync Europe conference.

“I am convinced that GDPR will rather significantly reduce the amount of first-party and third-party data that marketing teams have access to,” he said.

Walters said data was the fuel for marketing efforts, and pointed to a recent report by Accenture that identified a more systemic, structural problem choking off the supply of that fuel for customer management.

But he pointed to a “vicious circle” created by GDPR, where customers want “hyper-relevant” experiences when shopping online but are very reluctant to hand over the personal data that would inform those experiences.

“Consumers will punish brands that do not provide … relevant experiences by abandoning them for other providers,” he said. “Not because they necessarily know that another provider can provide that experience, but because [they] hope that they can.”

At the same time, GDPR and the high-profile Cambridge Analytica scandal – in which millions of Facebook users’ profile data was harvested to allegedly influence US voters in the 2016 presidential election – have increased public awareness of the risks of sharing data.

“The fact that they don’t know what’s going on with the data that they surrender … means that they are reluctant to provide that personal data, which is precisely what’s necessary to create the kinds of experiences that they demand,” said Walters, adding: “That’s the vicious circle.”

But GDPR provides an opportunity for companies that recognise people are in control of their personal data – creating that trust breaks the vicious circle.

“Every company in the world, whether they are subject to GDPR or not, should be looking for some kind of framework or template or guidebook to show them how to go about putting consumers in control of their data,” Walters concluded.

“That is the only way to make progress. That guidebook or template is exactly what the GDPR is.”

Edmund Breault, head of marketing at Aprimo, told IT Pro that GDPR is absolutely going to help marketing organisations by putting customers at the heart of their efforts.

“While the effort of becoming GDPR-compliant has added a short-term burden on marketing organisations, GDPR fundamentally makes customer-centricity a ‘legal’ requirement,” he said.

“We are now in a trust economy and marketers need to provide capabilities to allow consumers to stay in control of their data.”


Ryanair flies away from Microsoft and into AWS cloud


Rene Millman

11 May, 2018

Ryanair is to move almost all its infrastructure to AWS following a successful migration of some of its IT systems.

The budget airline plans to close the vast majority of its data centres over the next three years. It already runs several core production workloads on AWS, such as Ryanair Rooms and Ryanair.com, and is building a company-wide data lake on Amazon S3, using Amazon Kinesis to gain insights from customer and business data.
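The article doesn’t go into implementation detail, but the stream-into-data-lake pattern it describes typically looks something like the boto3 sketch below; the stream name, bucket and event fields are invented for illustration:

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-1")

# Producer side: push a booking event into a stream for near-real-time
# analysis. "ryanair-bookings" is a hypothetical stream name.
event = {"flight": "FR123", "route": "STN-DUB", "fare": 19.99}
kinesis.put_record(
    StreamName="ryanair-bookings",
    Data=json.dumps(event),
    PartitionKey=event["flight"],
)

# Consumers (Kinesis Data Firehose, or a worker like this) can then land
# the raw events in S3, where they accumulate into the data lake.
s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-data-lake",          # hypothetical bucket name
    Key="bookings/2018/05/11/FR123.json",
    Body=json.dumps(event).encode("utf-8"),
)
```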

The airline is also ditching Microsoft SQL Server in favour of Amazon Aurora to run email marketing campaigns in Europe, in a bid to cut the cost of communicating with its 22 million European subscribers.

“We’ve chosen to work with the world’s leading cloud to develop and deliver services that will transform our customers’ travel experiences. By rebuilding core applications, converting data into actionable insights, and creating intelligent applications, we are putting the solutions in place to continue our leadership in the travel industry,” said Ryanair’s CTO, John Hurley.

It is also working with the AWS ML Solutions Lab to create an application that enables the company to automatically detect surges in demand for flight segments and anticipate schedule changes.

“Machine learning is hugely important to our growth, and we’re pursuing a variety of AWS machine learning services, including Amazon SageMaker, to enhance customer UI experience and personalise the myRyanair portal for every unique traveller,” said Hurley.

“We’re currently trialling Amazon Lex to enhance our customer support experience, by intelligently routing customer support requests to the right type of assistance – whether that be a customer support representative or an artificial intelligence-driven interaction.”
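Lex’s runtime API returns the intent it matched, which is what makes this kind of triage possible. A rough sketch with boto3 – the bot name, alias, region and utterance are invented for the example:

```python
import boto3

lex = boto3.client("lex-runtime", region_name="us-east-1")

# "TravelSupport" and its "prod" alias are hypothetical; post_text returns
# the matched intent, which the caller can use to route the request.
response = lex.post_text(
    botName="TravelSupport",
    botAlias="prod",
    userId="customer-42",
    inputText="I need to change the name on my booking",
)

if response.get("intentName"):   # Lex understood the request
    print("Route to automated flow:", response["intentName"])
else:                            # fall back to a human agent
    print("Route to a customer support representative")
```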

Mike Clayville, vice-president of worldwide commercial sales at AWS, said the airline’s plans are much like those of many other enterprises looking to migrate as many of their existing applications as they can, as quickly as possible.

“Because we have the most comprehensive set of cloud services, including our leading machine learning and deep learning services, Ryanair will be able to employ those services to drive greater customer and employee satisfaction. We’re excited to help them create first-class experiences on AWS as they continue to use our capabilities and services at an accelerated pace,” he said.

Microsoft wants to make Azure your AI destination


Rene Millman

8 May, 2018

Microsoft yesterday set out to open up its cloud platform to developers hoping to build AI applications, revealing a host of machine learning initiatives at its annual Build conference.

Redmond wants its Azure cloud to be the backbone of developers’ AI innovations, with CEO Satya Nadella saying: “The era of the intelligent cloud and intelligent edge is upon us. These advancements create incredible developer opportunity and also come with a responsibility to ensure the technology we build is trusted and benefits all.”

Project Kinect for Azure is a package of sensors, including Microsoft’s next-generation depth camera, with onboard compute designed to allow local devices to benefit from AI capabilities.

Meanwhile, Microsoft’s Speech Devices SDK aims to enable developers to build a variety of voice-enabled scenarios like drive-through ordering systems, in-car or in-home assistants, smart speakers, and other digital assistants.

The tech giant also previewed its Project Brainwave, an architecture for deep neural net processing that Nadella said “will make Azure the fastest cloud for AI”. This is now available on Azure and for edge computing, and is fully integrated with Azure Machine Learning with support for Intel FPGA hardware and ResNet50-based neural networks.

The firm is also open sourcing the Azure IoT Edge Runtime, allowing customers to modify, debug and have more transparency and control over edge applications.

Azure IoT Edge also runs Custom Vision technology, a new service that Microsoft says enables devices such as drones and industrial equipment to work without cloud connectivity. This is the first Azure Cognitive Service to support edge deployment, with more coming to Azure IoT Edge over the next several months, according to the vendor.

“With over 30 cognitive APIs we enable scenarios such as text to speech, speech to text, and speech recognition, and our Cognitive Services are the only AI services that let you custom-train these AI capabilities across all your scenarios,” said Nadella.
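As a flavour of what calling one of those APIs looks like, here is a rough text-to-speech sketch against the speech service’s REST interface of the time. The region, key and voice name are placeholder assumptions, so treat this as a sketch and check the service documentation rather than taking it as authoritative:

```python
import requests

REGION = "westeurope"       # placeholder region
KEY = "YOUR_SPEECH_KEY"     # placeholder Cognitive Services subscription key

# Exchange the subscription key for a short-lived bearer token.
token = requests.post(
    f"https://{REGION}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
    headers={"Ocp-Apim-Subscription-Key": KEY},
).text

# Synthesise speech from SSML; the voice name follows the service's
# documented naming scheme of the day but is an illustrative choice.
ssml = (
    '<speak version="1.0" xml:lang="en-GB"><voice xml:lang="en-GB" '
    'name="Microsoft Server Speech Text to Speech Voice (en-GB, Susan, Apollo)">'
    "Your flight now boards at gate 12.</voice></speak>"
)
resp = requests.post(
    f"https://{REGION}.tts.speech.microsoft.com/cognitiveservices/v1",
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/ssml+xml",
        "X-Microsoft-OutputFormat": "riff-24khz-16bit-mono-pcm",
    },
    data=ssml.encode("utf-8"),
)
with open("gate_call.wav", "wb") as f:
    f.write(resp.content)  # raw WAV audio returned by the service
```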

Also announced was a new SDK for Windows 10 PCs in partnership with drone company DJI. Using Azure cloud, the SDK brings real-time data transfer capabilities to nearly 700 million Windows 10 devices. As part of the commercial partnership, DJI and Microsoft will co-develop tools leveraging Azure IoT Edge and Microsoft’s AI services to make drones available for agriculture, construction, public safety and other verticals.

Nadella also announced the company’s new AI for Accessibility, a $25 million, five-year programme aimed at using AI to help more than one billion people around the world who have disabilities.

The programme consists of grants, technology investments and expertise, and will also incorporate AI for Accessibility innovations into Microsoft Cloud services. Microsoft said that the initiative is similar to its previous AI for Earth scheme.

Intelligent apps

Microsoft also handed developers the ability to introduce more customisation for Microsoft 365 applications, so organisations can tailor them to their needs.

Developers whose businesses use the Microsoft 365 bundle of Office 365, Windows 10 and Enterprise Mobility and Security can now benefit from wider integrations in collaboration app Microsoft Teams, and even publish custom apps on the Teams app store.

There is also deeper SharePoint integration within Microsoft Teams to enable people to pin a SharePoint page directly into channels to promote deeper collaboration. Developers can use modern script-based frameworks like React within their projects to add more pieces that can be organised within SharePoint pages.

The vendor also revealed new Azure Machine Learning and JavaScript custom functions that let developers and organisations create their own additions to the Excel catalogue of formulas.

Google’s workflow-focused Cloud Composer service enters beta


Rene Millman

2 May, 2018

Google has launched a new cloud service called Cloud Composer to help organisations design, create, and manage consistent workflows within Google Cloud Platform (GCP).

The service, which is currently in beta, is designed to develop, schedule and monitor enterprise workflows across internal datacentres or multiple clouds.

It offers end-to-end GCP integration, letting users orchestrate their full GCP pipeline through Cloud Composer.

It also connects a user’s pipeline through a single orchestration service whether a workflow exists on-premises or in multiple clouds.

“We believe there should be an easy and reliable workflow solution at a platform level, like other cloud services,” said James Malone, Google product manager, in a blog post.

“With the aforementioned features and others outlined below, Cloud Composer delivers a single managed solution to create and manage workflows, regardless of where they live, and gives you the portability to take these critical pieces of infrastructure with you if you migrate your environment.”

Google said the service was a “starting point” and a number of features are planned for the future including additional Google Cloud regions, Airflow and Python version selection, and autoscaling.

Cloud Composer and Airflow have support for BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Datastore, Cloud Storage, and Cloud Pub/Sub. The Airflow GCP documentation includes specifics on how to use the operators for these products.
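Since Cloud Composer is built on Apache Airflow, a workflow is simply a Python DAG. A minimal sketch using the BigQuery operator mentioned above – the project, tables and query are placeholders, and the import path and arguments follow 1.10-era Airflow:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.contrib.operators.bigquery_operator import BigQueryOperator

default_args = {
    "owner": "data-eng",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_orders_rollup",        # hypothetical workflow name
    default_args=default_args,
    start_date=datetime(2018, 5, 1),
    schedule_interval="@daily",          # Composer schedules and monitors it
) as dag:

    # Aggregate yesterday's orders into a reporting table in BigQuery.
    rollup = BigQueryOperator(
        task_id="rollup_orders",
        sql="SELECT order_date, SUM(total) AS revenue "
            "FROM `my-project.sales.orders` GROUP BY order_date",
        destination_dataset_table="my-project.reporting.daily_revenue",
        write_disposition="WRITE_TRUNCATE",
        use_legacy_sql=False,
    )
```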

Malone said an earlier alpha programme gave access to hundreds of users, whose feedback helped improve the product.

“At Blue Apron, our data workflows need to operate flawlessly to enable on-time delivery of perishable food. Cloud Composer helps us orchestrate our pipelines more reliably by allowing us to author, schedule, and monitor our workflows from one place using Python,” said Michael Collis, staff software engineer at Blue Apron.

Cloud Composer will have a consumption-based pricing structure covering virtual CPU per hour, GB of storage per month and GB of data transferred per month, as the cloud data orchestrator is based on several Google Cloud Platform components.


How to move your accounting to the cloud


Rene Millman

5 Apr, 2018

If you have always done your accounts with a traditional accounting package, it’s perhaps time you started thinking about moving them to the cloud.

Many organisations have turned to cloud applications to reduce costs or improve operational speed, and those same organisations are now starting to move their accounts to the cloud too. While there’s some trepidation about moving highly sensitive financial information off-premises, the drivers for doing so are becoming harder to ignore.

One such driver is HMRC’s Making Tax Digital initiative. From April 2019, all companies over the VAT threshold of £85,000 will be required to keep digital records and submit them electronically. There are also suggestions that digital submission could become mandatory for income tax as well, meaning businesses will have to review how they engage with clients today and what that might look like in future.

There’s increasing pressure on accountants to do more than has traditionally been expected of them. According to a recent survey of accountants, 83% said clients expect them to do more than they did five years ago, with 42% expecting their accountants to offer business advice as part of the service.

“With this in mind, accountants are looking to lighten their administrative burden, which will enable them to spend more time attending to these new demands,” says Sean Evers, accountant development director at Sage. “Moving accounting to the cloud allows organisations to spend less time on admin and more on attracting and serving customers.”

Benefits of cloud accounting

One of the main benefits of moving accounts to the cloud is quicker access to better software.

“Software is constantly evolving and improving, from bug fixes and patches to new features,” says Kris Brown, UK R&D director at TechnologyOne. “When you enhance the software for one customer, all customers benefit. Not only is it much quicker to deploy cloud solutions, you also gain full control over the deployment process for new software.

“If you only wish to implement certain parts of the solution, you can test those and then leave the other features ‘switched off’.”

Paul McCooey, a partner at chartered accountants Duncan & Toplis, says that recording long lists of payments is a thing of the past, as your bank account automatically syncs with your software.

“[There are] no more manually listing invoices [either]. The software can issue them automatically. And no more lost receipts or unclaimed tax relief – simply take a photograph of your bills or receipts and upload them,” he says.

“You’ll spend less time chasing debts too – cloud accounting software can issue automated reminders and ‘pay now’ buttons as payment prompts.”

Defining the future state

Paul Nicklin, technical director at inniAccounts, says that moving to the cloud isn’t as simple as flicking a switch so you need to do your homework.

“The world is going cloud first, so you’ll have an easier time if everything is in the cloud. However, you still need to consider how systems interact.

“For instance, if you have an ERP system and payroll plugged into an on-premise accounting package it will be more tricky, but far from impossible, to move than if you have cloud-based services that are compatible with your old and new provider,” says Nicklin.

Brown says that the first phase in moving an organisation’s accounts to the cloud is to have a clear view of what success will look like – the future state – and to clearly define the current state.

“Every organisation is unique and will develop its own path to transition based on its existing applications, systems and requirements,” he says.

“It can be as simple as taking a copy of what you have on premise and starting the next day in the cloud. A like-for-like move of business process and software doesn’t have to be a massive task. But a ‘lift and shift’ approach – placing current applications into a hosted environment – won’t deliver true transformation.”

Brown adds that if you are moving to a new provider, then this does need to be considered as a new implementation, and at that point, a change of processes and culture will be required to realise the full benefits of SaaS.

Nicklin believes that if your business is small, and your existing accounts aren’t as streamlined as they could be, then moving to cloud is the perfect opportunity to clean things up.

“In our experience of moving accounts from complex spreadsheets through to exports from accounts packages, we’ve encountered quality issues,” says Nicklin. “Our advice, therefore, tends to be not to bring too much history. It’s often wrong.”

Integrating accounts with other cloud apps

Migrating other parts of the back office to the cloud ensures that the benefits seen by the accounting team are also seen across the board, argues Sage’s Sean Evers.

“Back office functions can often be time- and resource-consuming, but moving to the cloud will take away much of the heavy lifting associated with these processes, allowing organisations to focus on serving their customers and improving workforce efficiency.”

He believes that once accounting is moved to the cloud, organisations should regularly review processes and challenge their efficiency. This is particularly useful for identifying those processes that can be automated or streamlined by using a third-party app, provided it’s able to connect to your cloud accounting service.

“That way, not only are you saving time and money around the accounting aspect, but across your whole business. It also means more time to do what you want to do – whether that’s growing your business or enjoying more personal time,” says Evers.

David Lindores, technical director at Eureka Solutions, says that while it is not a requirement to have other parts of the business in the cloud, it is highly advisable.

“There are integration tools available which allow businesses to transfer data seamlessly between cloud-based software solutions and on-premise software,” he says.


Save the Children: How cloud helps in disaster zones


Rene Millman

21 Mar, 2018

The cloud can greatly benefit charities trying to help people around the world in humanitarian disasters, according to Save the Children’s head of IT.

For the charity’s teams deployed in disaster zones, time is of the essence, and they need to do whatever it takes to save children’s lives, delivering life-saving food, water, healthcare, protection and education, said Gerry Waterfield, head of global IT services at Save the Children International, speaking at Cloud Expo Europe in London today.

Using different cloud services, the charity can mobilise quickly and securely without having to deploy preconfigured devices with its line-of-business suite of applications, Waterfield said. Being able to deploy this so quickly can mean the difference between life and death.

“The work we do is in very difficult locations, so we have to think about connectivity; it is one of the biggest issues we face before we use the cloud,” he said. “The other issue is having power; if there is no power, there is no connectivity, hence no internet.”

The charity works in more than 120 countries around the world and helped 22 million children in 2016. Waterfield said that bandwidth is frequently at a premium and the charity is heavily reliant on costly satellite communications, so the use of lightweight web apps is important.

To get power, and thus connectivity, Save the Children has looked at using solar energy, given the amount of sunlight available in many of the areas where it works.

With so many refugees fleeing war across the Mediterranean Sea, connectivity at sea means the charity can access real-time weather data from the cloud, as well as data on the numbers making the dangerous crossing, so that it is better placed to offer assistance.

To that end, Waterfield said that the charity has used Office 365, as it can be rolled out everywhere to any device. It has also used a cloud-based HR system from Oracle. Waterfield said this has been helpful in emergency situations where volunteers have to be assembled quickly and onboarded as well as in helping select the right people for the right roles on the ground.

Save the Children has also used Kobo Toolbox, which allows workers in emergency situations to create ad-hoc reports, and Facebook’s Workplace as an enterprise social network through which workers can exchange information about situations and projects more quickly.

Going forward, Waterfield said that he would like to see the charity be able to use more technology in the field as this would help more people in crisis situations.

The key challenges of migrating databases to the cloud


Rene Millman

20 Mar, 2018

Today more and more companies are adopting big-data and cloud-based architectures to meet growing business demands. However, while these new technologies offer a number of operational benefits, there are some key problems which must not be overlooked.

Benefits of migration

The key benefits of migrating databases to the cloud are similar to those associated with moving any workload to the public cloud: agility, accessibility, scalability and cost-effectiveness.

“That said, there are also more specific benefits relating to databases that can be achieved,” says Mark Shaw, manager of Systems Engineering Western Europe at Rubrik. “When organisations migrate databases to the cloud, they can take full advantage of the additional services which exist in public cloud that may not be available on-premise. Cloud also offers elastic scalability, which is useful when demand is high for a particular database.”

Eric Schrock, CTO at Delphix, says that organisations may want to re-platform to a new architecture to support evolving business needs.

“This could include moving to a managed database service such as Amazon RDS, adopting open source technology such as PostgreSQL, or moving to a horizontally scalable data store like Cassandra,” says Schrock. “This can lower costs while increasing velocity through modern tooling. This is a significant investment reserved for critical initiatives, but the first step is often getting the application, and hence its data, to the cloud in its current form.”

When should a database be moved?

The scenarios that could merit migrating a database to a cloud service are generally driven by business requirements. For example, when an on-premises application service is being re-architected as part of a digital transformation strategy.

Other examples may include “seasonal businesses which scale on-premises for peak and wish to challenge the status quo”, says Martin Jones, professional services director at SCC. In that case, companies are able to “use cloud database services in line with performance requirements, or when a business wishes to use a wider suite of tools, to gain insight into their application, which could be prohibitive if carried out on-premises,” adds Jones.

Can a database be moved to the cloud in isolation?

Unlike a collaboration system, which is a natural candidate to move in isolation from other IT systems, a database is by its very design connected to a variety of other systems and services making up a multi-tiered architecture.

This architecture can then be queried and updated by users belonging to different groups, such as employees, partners, and customers, and also mined by analysts and researchers seeking future trends.

“It is vitally important to remember, in order for the database to be migrated successfully, that all the associated tiers and systems are known, their function clearly understood, and their performance benchmarked,” says Paul Griffiths, senior director of the Advanced Technology Group at Riverbed.

“Only with a complete view of all the components and their interactions with each other can it be decided which elements need to be migrated along with the database itself.”

Moving a database

Migrating a database to the cloud can seem like a daunting task, and in some cases, it’s the initial challenge of figuring out what all the «moving parts» are which can cause migration projects to fall at the first hurdle.

“This can lead to organisations never being able to fully exploit the benefits of cloud. While that may not bring about the demise of a company, it could affect how competitive they are able to be in the market,” says Griffiths.

Schrock says that the first challenge is realising that database migration is not a one-time event. Moving a database to the cloud is only relevant if you migrate the application too, which is cumbersome and can take anywhere from days to months. During this time, teams must ensure that their applications will run in the cloud, that they can develop effectively within the cloud, and that disruption during the final transition is minimised.

“All of this requires high-quality data in the cloud for continuous testing across the application, cutover process, and SDLC integration,” says Schrock. “Failure to drive quality testing will at best slow down the project, and at worst cause significant disruption, poor quality post-transition, and an inability to move quickly to address problems.”

Shaw says that security concerns are often another huge barrier when it comes to migrating databases to the cloud.

“Data has become the most valuable and important asset for all organisations and, therefore, effective protection is paramount for the success and future growth of the business as a whole,” says Shaw. “If your business’s data protection is not adequate and you suffer a breach then your reputation, brand and, by extension, the entire business is at risk. Hence the reluctance of some businesses to adopt new strategies and embrace cloud.”

Move it, then monitor it

When a database has been migrated, it’s important to use tools like SQL Server Query Store to monitor, evaluate and understand the baseline data – for example, transactional throughput and surges of daily/weekly/monthly activity.

“These metrics will help organisations to determine whether to move from standalone PaaS databases to PaaS Elastic database pools. Lastly, it makes good business sense to use the data to show the TCO reduction to key stakeholders such as the CTO,” says Alan McAlpine, senior consultant of Enterprise Data at IT services firm ECS.
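For a SQL Server source, those baselines can be pulled straight from Query Store’s catalog views. A rough sketch using pyodbc – the connection string is a placeholder, and Query Store must already be enabled on the database:

```python
import pyodbc

# Placeholder connection string; adjust driver, server and auth to taste.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)

# Execution counts and mean duration per query from Query Store's catalog
# views -- a simple throughput/latency baseline to compare against the
# migrated database. avg_duration is reported in microseconds.
rows = conn.execute("""
    SELECT TOP 20
           q.query_id,
           SUM(rs.count_executions)      AS executions,
           AVG(rs.avg_duration) / 1000.0 AS avg_duration_ms
    FROM sys.query_store_query q
    JOIN sys.query_store_plan p           ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats rs ON rs.plan_id = p.plan_id
    GROUP BY q.query_id
    ORDER BY executions DESC
""").fetchall()

for query_id, executions, avg_ms in rows:
    print(f"query {query_id}: {executions} runs, {avg_ms:.1f} ms avg")
```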

Leaving databases on-premise

Not all databases should go to the cloud, explains Roberto Mircoli, EMEA CTO of Virtustream. “Once you have identified a specialised enterprise-class cloud provider which guarantees the level of security, compliance, performance and availability required by your business, then what should be left on-premise is really only what’s constrained by residual latency limitations or extremely stringent regulatory requirements.”

“So, for example, some applications in the automotive industry leverage manufacturing integration and intelligence to collect data in real time from automation systems to plan the procurement of components and materials; in these cases, 20, 15 or even 10ms of latency is not acceptable,” he says.
