Innovation: The financial industry’s best-kept secret?

18 Apr, 2019

The financial services sector (FSS) has reached a tipping point; should it stay the same and hope for the best or should it embrace disruption and associated technologies and emerge different, stronger and better from its legacy cocoon?

But do FSS firms really need to change at all? Perhaps what does need to change is how they are perceived. Indeed, with a reputation of being not just slow, but reluctant to change, FSS organisations have actually been innovating all the while behind the scenes. Their downfall, perhaps, lies in not shouting loudly about it as many other sectors have done.

So, why are those in the industry now starting to talk much more openly about technology and the impact it can have not just on operations, but the customer experience?

We have reached the point where banks, in particular, must consider and position themselves not just as financial entities, but as lifestyle brands too, according to Bharat Bhushan, CTO of banking and financial markets at IBM’s Financial Services division.

“The industry has been innovating in a pull and a push manner. Our ability to interact with our banks using mobile devices doesn’t sound like an innovation, but it isn’t that long ago that we were walking into branches and checking ATMs. Now, people can check their balance several times a day while waiting for trains/planes etc.,” he says.

“It’s important to distinguish between the incumbents and the challengers. FinTech has been one of the reasons for banks to innovate – if they don’t innovate, their lunch will be eaten.”

The devil is in the detail when it comes to moving innovation from theory to reality, though, with execution becoming the real differentiator, according to a recent blog post by Bhushan.

“An organisation with a culture of innovation, collaboration and true customer focus will be able to maintain differentiation and identity in the digital world as they do in the physical world. The winners will use their digital capabilities to draw intelligence from data in real-time and use it to drive behaviour change or convenience,” he added.

Turning adversity into opportunity

With an industry changing almost beyond recognition, despite historic innovation, many FSS organisations feel somewhat unsettled.

Indeed, most of those in the sector entered 2019 feeling less certain about the future than ever.

“[This uncertainty can be] quite uncomfortable for some organisations. They’re thinking ‘we’ve got to change, but we don’t really know where to start.’ We really try and structure the way we help companies by looking at those opportunities. Sometimes they just need a bit of help to figure out how to combine the technology with the business aspects and the user side of things. We can help them get started on the road and often getting started is the hardest bit,” adds Holly Cummins, worldwide development lead of the IBM Cloud Garage.

“Once companies see they can innovate and nothing terrible happened, that is a seed for something that can then ripple out through the organisation and help them innovate in other ways. It’s testing the water and seeing what happens, and that takes away a lot of the fear.”

In engagements with businesses, these pilots or test cases are dubbed ‘minimum viable products’, according to Cummins.

Cummins adds: “We worked on a project years ago with a bank that had quite a large budget set aside for something. It was really innovative around allowing their customers who collected loyalty points to pay with them…

“We tried something smaller. Everybody said it was wonderful, it was innovative but they didn’t want it – their partners just weren’t able to digest it – even though everybody agreed it was great. So, even though the project failed in one sense, the bank was actually delighted as they hadn’t wasted money and knew what to do differently if they tried it again. A good thing came out of something that could be perceived as a failure or negative.”

It will be the organisations that turn negatives into positives that, ultimately, win out.

Open banking, for example, means the industry has much more to gain than it stands to lose.

“When everyone becomes digital, the big challenge for financial institutions is how to differentiate themselves from the nearest competitor. That’s where the magic will start to happen,” Bhushan adds.

“Open Banking requires a cultural shift in a financial institution. Once you form a culture where you are not just behaving like a bank… Open APIs and banking are a fundamental shift from a very closed industry to one on its way to becoming an open platform.

“Historically, you and I have been generating data for these banks for the last 60/70 years since computers were used. That data belonged to the bank. But with open APIs, that data belongs to us. Regulations like GDPR have helped empower that.”
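
To make that shift tangible, here is a minimal sketch of how a third-party application might read account data under an Open Banking-style interface once the customer has granted consent. The host, path, token handling and response keys are illustrative assumptions loosely modelled on the UK account-information specifications, not any particular bank’s API.

```python
# Illustrative sketch of an account-information call under an Open Banking-style
# API. The host, path and token below are placeholders, not any real bank's API.
import requests

ACCESS_TOKEN = "token-obtained-via-oauth2-consent-flow"               # hypothetical
BASE_URL = "https://api.examplebank.co.uk/open-banking/v3.1/aisp"     # hypothetical

def list_accounts():
    """Fetch the accounts the customer has consented to share."""
    response = requests.get(
        f"{BASE_URL}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    # Response shape assumed to follow the open-banking "Data"/"Account" layout.
    return response.json()["Data"]["Account"]

if __name__ == "__main__":
    for account in list_accounts():
        print(account["AccountId"], account.get("Nickname", ""))
```

The point is less the specific call than the model it represents: the customer, rather than the bank, decides which third parties get to see the data.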

Levelling the playing field

Advanced technologies are no longer the preserve of large enterprises or those with infinite pots of cash. Now, all organisations can benefit from the speed and scale of subscription-based services that deliver business-level results in an almost consumer-friendly consumption model.

“Financial institutions need to address how ideas are generated and executed. It’s about failing fast. They should be using the cloud, for example, as a fundamental mechanism. Cloud and everything that goes with cloud is fundamental to that change,” according to Bhushan.

“DevOps in itself is a big shift for some banks – globally not just in the UK. The idea of bottom up and top down innovation [is very new].”

Continuous delivery is an important weapon to have in your arsenal as an FSS organisation, according to Jim McKay, worldwide solutions architect at IBM SoftLayer.

“Ultimately, it’s about how quickly solutions can be developed. How can you evolve the [old] ways to become more agile? And that’s really where we’re seeing the opportunity for cloud,” he says.

The cloud and emerging technologies such as AI and machine learning serve as both a catalyst for and a reason to change for the financial industry. Customers, ultimately, expect the same – if not greater – level of tech sophistication they’re used to in their consumer lives when it comes to everyday tasks such as banking.

Every element of work carried out in the FSS is being transformed by technology, and AI represents a $16 trillion opportunity, according to IBM’s Tiffany Winman, reporting on the industry news and views coming out of IBM’s Think 2019 event earlier this year.

It’s an opinion industry experts share, with analyst firm Gartner suggesting that AI implementation grew by 270% over the last four years and by 37% in the last year alone. This, says Winman, is helping firms digitally reinvent themselves, using AI to reduce costs, create new revenue streams and enhance the customer experience.

But there remains some fear, uncertainty and doubt (FUD) around AI, with concerns specifically focused on job loss or displacement.

“IBM’s approach to AI is not to replace humans, but rather to create augmented intelligence that helps amplify human cognition,” Winman wrote.

“Companies should ask: How do you leverage technology to help humans do what they do, better? And how can technology bring value in such a way that it allows them to do higher value things?”

Rob Thomas, IBM General Manager of Data and AI, has another way of looking at it, saying at Think 2019: “AI is not going to replace managers, but managers who use AI will replace the managers who do not.”

Thomas also called AI the “new electricity”, which should certainly serve as a warning to those still in any doubt as to the transformative power it holds for businesses in all sectors, and the financial industry in particular.

Partnering remains key to success

As in every other industry, it’s important for those in the FSS to recognise that they don’t have to go it alone.

“Banks need to open up to the idea of partnering with other companies and find the right balance between investing in FinTech or partnering. Finding that balance of using FinTech to innovate incrementally or in a process means we finally have a win/win scenario,” Bhushan says, adding that it’s an innovation inhibitor if those organisations are so proprietary that they simply can’t – rather than won’t – work with anyone else.

Blockchain, for example, is a key innovation that offers much to FSS firms. Until recently, however, its brilliance had become somewhat lost in the hype surrounding cryptocurrencies, according to Bhushan.

“I’ve certainly seen a shift [in that thinking] now,” he says. “Blockchain works really well when you have multiple parties. In a mortgage situation, for example, you can have the land registry, banks – all of them participating in the Blockchain. But, it needs all of those parties to come to the table and agree to the standards.”

The real 21st century currency 

Against the backdrop of disruption and innovation, those in the FSS are facing the same security threats – often intensified – as many other industries.

“Cyber security is as much a business problem as a technology one. Every employee is a cyber security manager,” Sean McKee, senior manager of cyber threat management at TD Bank, told delegates at Think 2019.

“What can an organisation do to prepare? Your plan is not worth the paper it’s written on if nobody knows how to use it… TD Bank did a five-month workup for a two-day test. The president of the US bank immediately saw the results of their decisions,” he continued.

McKee outlined four key steps to effective implementation:

  1. Exercise and test your strategy and plan. Come to a cyber range and bring your playbook

  2. Technical security controls testing (ethical hacking committee)

  3. Resource for success. It takes a long time to produce a proper exercise

  4. Continuous improvement

In the FSS sector, security and resilience can often be seen as more of a compliance burden than anything else, but they can also unlock many positives, according to Bhushan.

“I think security can be a very empowering mechanism to enable trust between you and your customers and partners at all levels,” he says.

“In the 21st century, information is money. So, by being secure you are actually accelerating the trust your suppliers and partners have in you and accelerating business results.”

In a world that is travelling in all sorts of different directions at 100 miles per hour, it can be incredibly tempting to simply hit the pause button and take stock. However, by embracing the cloud, continuous delivery and DevOps, FSS firms can keep the lights on, defend against threats and innovate, too.

“Organisations can no longer look to their competitors to decide what to do next. Innovation and reinvention is key to existence in this rapidly changing world. Delivering customers’ needs with agility and pace is vital. Organisations should start on their journey by exploring how data and AI together can uncover the value hidden in their data and then deliver features with simplicity,” Bhushan adds.

“Digital is a prerequisite and a journey, it is not the destination. Creating magical experiences that consumers will pay for is the end game.”

Best Making Tax Digital software 2019: Make sure your accounting is MTD-compliant


K.G. Orphanides

29 Apr, 2019

HMRC’s new Making Tax Digital (MTD) scheme came into effect on 1 April, which means that VAT-registered businesses in the UK with taxable turnover above the VAT threshold must now keep digital records of their VAT transactions and use accounting software or a bridging tool to file their returns online via the MTD portal.

The easiest way to ensure you’re compliant is to use online accounting software that keeps your books correctly, automatically tracks VAT on your transactions and, once linked to your MTD account, lets you file your monthly or quarterly VAT returns in a couple of clicks.
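
For those building or bridging to such software, the filing step itself is a single call to HMRC’s MTD VAT API. The Python sketch below is a minimal illustration of posting the nine-box return; the endpoint shape and field names follow HMRC’s published VAT Returns API, but the access token, VAT registration number and figures are placeholders, and the current developer documentation should be treated as authoritative.

```python
# Minimal sketch of filing a nine-box VAT return via HMRC's MTD VAT API.
# The access token and VRN are placeholders; consult HMRC's developer
# documentation for the authorisation flow and current endpoint details.
import requests

ACCESS_TOKEN = "oauth2-access-token"   # obtained via HMRC's OAuth 2.0 flow
VRN = "123456789"                      # the business's VAT registration number
BASE_URL = "https://api.service.hmrc.gov.uk"

vat_return = {
    "periodKey": "A001",               # identifies the VAT period being filed
    "vatDueSales": 1050.00,
    "vatDueAcquisitions": 0.00,
    "totalVatDue": 1050.00,
    "vatReclaimedCurrPeriod": 250.00,
    "netVatDue": 800.00,
    "totalValueSalesExVAT": 5250,
    "totalValuePurchasesExVAT": 1250,
    "totalValueGoodsSuppliedExVAT": 0,
    "totalAcquisitionsExVAT": 0,
    "finalised": True,                 # legal declaration that the figures are final
}

response = requests.post(
    f"{BASE_URL}/organisations/vat/{VRN}/returns",
    json=vat_return,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/vnd.hmrc.1.0+json",
    },
    timeout=10,
)
response.raise_for_status()
print(response.json())  # includes the processing date and receipt details
```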

Here, we’ve highlighted some of the best MTD-compliant accounting suites for small and medium businesses in the UK, between them suited to a wide range of budgets and requirements.

Intuit Quickbooks

QuickBooks is an absolutely top-notch accounting suite, with features that include time tracking and full payroll with support for PAYE income tax and deductions for staff, billed at £1 per employee per month.

Bank accounts can be linked for automatic syncing or, with a bit of fiddling, added to your Chart of Accounts by hand so you can manually upload statements. It’s extremely polished and comfortable to use, with one of the best web interfaces we’ve seen in an online accounting suite and a similarly efficient mobile app.

Note, though, that QuickBooks’ entry-level £8 per month Self-Employed tier doesn’t support MTD VAT filing, and its higher tiers don’t support self-assessment. This means that QuickBooks is a better choice for bigger SMEs, while sole traders and freelancers should look at alternatives such as FreeAgent.

Charities and rights organisations should also be aware that there’s a cloud over Intuit’s ethical credentials, due to its US lobbying efforts to keep the government from allowing private citizens to file personal taxes online without the use of commercial software such as its own TurboTax product.

Price when reviewed

QuickBooks Essentials: £15 per month (exc VAT)
QuickBooks Plus: £22.50 per month (exc VAT)

Read our full Intuit QuickBooks review for more information.

FreeAgent

Unlike some services, which restrict the number of clients or invoices you can work with at cheaper tiers, FreeAgent’s UK subscription tiers give you everything your business can legally take advantage of, with no other restrictions. Its interface and documentation are also excellent, and we were very pleased to find that FreeAgent, almost uniquely, uses UK data centres.

Thanks to integrated self-assessment submissions, it’s extremely well suited to sole traders and freelancers, whether in service or sales-based industries, while its tier for limited companies includes full real-time payroll filing and PAYE support.

Other features include project and time management, bank account linking and manual account and statement tracking for transaction reconciliation, multicurrency support across all tiers and a detailed automated reminder system for overdue invoices.

Its pay-monthly price is a little higher than that of rivals such as FreshBooks and QuickBooks but, for VAT-registered small businesses, the features FreeAgent provides mean that you’ll never outgrow your accounting software and you won’t have to pay for extra bolt-ons.

Price when reviewed

Sole Traders: £19 per month (exc VAT)
Partnerships: £24 per month (exc VAT)
Limited companies: £29 per month (exc VAT)

Read our full FreeAgent review for more information.

Zoho Books

A wealth of features at even lower subscription tiers, along with the already-great value pricing, makes Zoho Books a very worthwhile choice for small businesses on a budget.

It provides an excellent set of tools for managing your business’s incomings, outgoings and VAT. Its top Professional tier is particularly good for sales-oriented businesses that need inventory tracking. Multicurrency support, time tracking and MTD-compliant VAT returns are present at all tiers. The only major downside is that it doesn’t have any kind of payroll facility – you’ll have to rely on third party software or services for this.

Its entry-level Basic tier is limited to 50 contacts – customers or suppliers – but includes both bank linking and manual statement upload for transaction reconciliation, custom invoices, expense tracking, projects and timesheets. The Standard tier increases your contact limit to 500 and adds support for more automated workflows and modules, plus the ability to log bills, issue vendor credits, use reporting tags, require purchase approval and receive SMS notifications.

The top-end Professional tier has unlimited contacts, 10 user seats, purchase and sales orders, an inventory for basic stock control and support for a custom domain name.

Its interface is a little more complex than some of its rivals, but Zoho Books doesn’t skimp on features. If you can do without integrated payroll, this is one of the most cost-effective ways to track your finances and make sure you’re MTD-compliant.

Price when reviewed:

Zoho Books Basic: £5 per month (exc VAT)
Zoho Books Standard: £10 per month (exc VAT)
Zoho Books Professional: £15 per month (exc VAT)

Read our full Zoho Books review for more information.

CloudBees acquires Electric Cloud to further CI/CD mission

CloudBees has announced the acquisition of San Jose-based software veteran Electric Cloud, with the aim of becoming the ‘first provider of end-to-end continuous integration, continuous delivery, continuous deployment and ARA (application-release automation)’.

The acquisition will marry two leaders in their respective fields. Electric Cloud was named a leader in reports from both Gartner and Forrester, covering application release orchestration and continuous delivery and release automation respectively.

CloudBees’ most famous contribution to the world of software delivery is, of course, Jenkins, an open source automation server used for continuous delivery. The original architects of Jenkins are housed at CloudBees, including CTO Kohsuke Kawaguchi.

Last month, CloudBees announced the launch of the Continuous Delivery Foundation (CDF), leading the initiative alongside the Jenkins Community, Google, and the Linux Foundation. At the time, Kawaguchi said: “The time has come for a robust, vendor-neutral organisation dedicated to advancing continuous delivery. The CDF represents an opportunity to raise the awareness of CD beyond the technology people.”

From Electric Cloud’s side, the company defers to CloudBees as the ‘dominant CI vendor, CD thought leader and innovator’. “CloudBees recognised the enormous value of adding release automation and orchestration to its portfolio,” the company wrote on its acquisition page. “With Electric Cloud, CloudBees integrates the market’s highest-powered release management, orchestration and automation tools into the CloudBees suite, giving organisations the ability to accelerate CD adoption.”

“As of today, we provide customers with best-of-breed CI/CD software from a single vendor, establishing CloudBees as a continuous delivery powerhouse,” said Sacha Labourey, CloudBees CEO and co-founder in a statement. “By combining the strength of CloudBees, Electric Cloud, Jenkins and Jenkins X, CloudBees offers the best CI/CD solution for any application, from classic to Kubernetes, on-premise to cloud, self-managed to self-service.”

Financial terms of the acquisition were not disclosed.


A guide to securing application consistency in multi-cloud environments

Cloud computing is well-trodden ground when it comes to talking about digitalisation. Multi-cloud is another term that has crossed many lips, and something we are still getting to know as efforts continue to ramp up to meet the demands of the digital era. For organisations, the need to adapt and move with the times is apparent, but the demand to transform digitally is even more pressing. The whole ecosystem of growth, progression and delivering on the expectations of the end user rests on the delivery of cutting-edge, end-to-end managed IT services.

It is the kind of transformational change that is driving organisations to seek consistency across multiple environments.

The availability of multiple platforms means organisations can be spoilt for choice, and when they have picked and deployed a wide variety of applications that are subject to shifts and changes, they can lack overall control, support and visibility. And, with the story of multi-cloud barely past its first chapter, there is more still to come and more to be done to ensure support for present and future digital transformation needs.

The multi-cloud approach is enabling new digital workflows, and companies can ensure better collaborative capabilities between what may have once been siloed components.

The landscape is, however, an ever-changing environment.

A customer-centric view means being application-centric, and organisations need to ensure they can support customer-specific services as they evolve, change or become redundant. All businesses must support a huge number of vastly different server applications, with virtual services ranging from very low-throughput single-node IoT devices to highly critical real-time online production servers that need to deliver high-availability online examination software. This is where multi-cloud requires another layer in order to work to its best capabilities in such a carefully balanced environment, and where application delivery services and software-driven infrastructure solutions come into play.

The problem with multi-cloud

The requirement now is for platforms where every application and deployment is managed seamlessly so the business can thrive in the long term. The solution needs to address the holes that a rush to multi-cloud infrastructure can leave, such as a requirement for easier management capabilities and analysis technology that can work across platforms to analyse and troubleshoot applications. Multi-cloud risks adding complexity, but enterprises know they can’t have siloed clouds. Having many different tools to deploy applications in different clouds can fracture development teams, and inconsistent services and processes across clouds defeat the very purpose of a multi-cloud initiative in the first place.

The problems stem from the differences between all of the components of multi-cloud, such as how on-prem data centres work, the requirements of various applications and the disparity between clouds. The compute, storage and networking resources themselves are not the issue; the sticking point is more likely the consistent provisioning and management of every working component.

It is the mission of internal service providers to achieve consistency without slowing everything down. The business wants speed, and that means finding a way to deploy applications on different platforms quickly. In the race to get applications deployed, teams want guardrails to ensure they can move quickly without the risk of driving off the cliff (or, in this case, the cloud edge). They want a safeguard where they know exactly what to expect, whatever the environment they are operating in.

Keeping up with the clouds

The difficulty is that, in the rush to keep up with the latest IT strategies driving digital change across every vertical sector, organisations have struggled to deploy a well-integrated and complete multi-cloud solution. Instead, they have resorted to bolting multiple clouds on to existing structures, in turn leaving a complicated mismatch that can lead to vendor or platform lock-in. It is the kind of thing that requires a whole team of specialists to manage and configure application deployment and delivery in various silos over multiple clouds, as opposed to on-prem.

These problems are real and difficult, but they are not impossible to fix. The solution is abstraction. Too many services today are opinionated about the underlying infrastructure when they shouldn’t have to be. For example, a hardware appliance is confined to the datacentre and many cloud providers offer proprietary services unique to their cloud and their cloud alone. The next generation of application services is abstracted from the underlying infrastructure, software-defined, and opinionated only about the needs of the application (not the infrastructure that delivers it). These software-only services will play a critical role in the data centre and across multiple clouds providing consistent experiences in every environment.

Not only is this a simple solution, but the tools to carry it out are readily available right now. Ultimately, it is the answer where multiple environments must be fully integrated, and it allows organisations to get the most out of multi-cloud, with each application sitting in the cloud that provides maximum benefit to the business.


From research to reality: How the cloud is powering AI


Cloud Pro

18 Apr, 2019

Whether it’s pocket-sized mobile communicators, cars that can drive themselves or a global information-sharing network, scientists and researchers have a history of turning the marvels of technology dreamed up by science fiction writers into reality – and the crowning ambition of that endeavour is the creation of artificial intelligence.

Although we’re still some way from being served by self-aware robot butlers that can reliably pass the Turing test, AI technology has progressed immeasurably in the last decade alone. AI has moved from being the sole province of research projects working with giant supercomputers to something that all of us carry around in our pockets, and cloud computing has been a huge part of that move from research to reality.

The most foundational change came when public cloud offerings like Amazon Web Services and Google Cloud Platform became widely available. The development of AI using methods such as deep learning and neural networks requires a considerable amount of compute power. Once it became possible to rent as many servers as you needed from a cloud provider, tasks that were once only possible for universities and science labs suddenly became accessible to everyone.

Moreover, these servers take advantage of best-in-class hardware from Intel, featuring technical developments specifically designed to enable AI, such as high-performance Xeon Scalable chips and low-latency Optane memory. On top of this, many cloud platform providers have, in recent years, started to specifically cater to machine learning and AI development, offering servers and services tailored to make training deep-learning models quicker and easier than ever.

As these barriers to entry come down, companies and hobbyists around the world have started experimenting with machine learning and AI, exploring the possibilities and pushing the boundaries of what it can do. Much of this research has been shared with the wider community under open source licences. A key example of this is TensorFlow™, a machine learning library developed internally by Google and shared freely with the rest of the world.
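
To give a sense of how low the barrier now is, the short sketch below uses TensorFlow’s bundled MNIST digits to train a small classifier. The same few lines run on a laptop or a rented cloud GPU instance, which is precisely the accessibility shift described above.

```python
# Minimal TensorFlow/Keras example: train a small image classifier on the
# MNIST digits bundled with the library. On a rented cloud GPU instance this
# trains in seconds; the same code scales to far larger models and datasets.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixel values to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```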

Alongside the compute power to train deep-learning models, the cloud has also provided the datasets on which to train them. The development of AI has gone hand in hand with the big data boom, as companies start gathering and storing exponentially more data for analytical purposes. A side effect of this is that there are now huge corpuses of data that can be fed into machine learning models in order to train them in tasks like pattern recognition, clustering and regression.
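
As a hedged illustration of the clustering case, the snippet below groups synthetic ‘customer’ records with scikit-learn’s k-means; in a real deployment the feature matrix would be drawn from one of those large corpuses held in cloud storage rather than generated at random.

```python
# Illustrative clustering example: group synthetic "customer" records with
# k-means. In practice the input matrix would come from a large corpus of
# behavioural data held in cloud storage rather than a random generator.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=42)
# Two illustrative features per customer: monthly spend and visits per month.
spend = np.concatenate([rng.normal(50, 10, 200), rng.normal(400, 60, 200)])
visits = np.concatenate([rng.normal(2, 0.5, 200), rng.normal(12, 2, 200)])
features = StandardScaler().fit_transform(np.column_stack([spend, visits]))

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)
print("cluster sizes:", np.bincount(model.labels_))
```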

All of this makes it much easier to improve and develop AI technology, but there’s one key reason that it’s now a legitimate business tool rather than simply a technical endeavour, and that’s the ease of consumption that cloud models offer. It’s so much easier for customers and end-users to consume AI tools running in the cloud as part of a SaaS application compared to traditional on-premise software.

None of the processing is done locally, so there are no hardware requirements, and because the vendor is responsible for maintaining the AI on a day-to-day basis, there’s no need to hire machine learning or AI specialists. With no extra effort or investment required, companies are becoming more and more comfortable with the idea of integrating AI processes into their day-to-day workflows.

The general public has also grown increasingly familiar with AI technology thanks to the growing prevalence of AI-powered digital assistants like Siri, Alexa and Cortana. These services have helped acclimatise people to working with AI, as well as opening their eyes to the benefits offered by the technology.

These factors have made developing commercial AI more viable, resulting in an explosion of AI-enabled tools and services, many of which have been snapped up by cloud giants like Google and Salesforce and integrated into their product portfolios. Most cloud storage companies, for instance, augment their search capabilities by using machine vision algorithms to accurately identify objects in photographs or text in documents.

AI is also increasingly being used by companies as an initial point of contact for customer service, with chatbots handling Tier One support queries and sales inquiries. A far cry from the long-established automated telephone menus, these programs are intelligent and responsive, and are becoming increasingly common.

It’s not just public clouds that have spurred AI advancements; private and hybrid cloud deployments have also seen a great deal of change. Banking is one area where AI can have a huge impact in terms of analysing huge quantities of data very quickly, but for regulatory reasons, financial firms often can’t – or won’t – use public cloud providers. Instead, these institutions use private clouds to run custom-built or specially-adapted machine learning algorithms to sort through their data.

Intel’s advancements in processor technology have brought cloud-scale computing power within the reach of companies operating their own private cloud, meaning that you no longer need a sizeable data centre to run machine learning applications. Instead, you can run AI tasks on as little as one rack, depending on the size of the deployment. This enables you to keep total control of your data and infrastructure, whilst still taking advantage of cloud-style consumption and delivery models.

In the comparatively short timeframe that cloud computing has been a mainstream phenomenon, AI has gone from being the preserve of academics to a day-to-day reality for businesses around the world; used to perform all kinds of diverse tasks from data analysis to customer service. Machine learning applications are now being developed, deployed and delivered via cloud platforms, empowered and enabled by Intel’s next generation data centre technologies. Whether you’re looking to run your AI applications on a public, private, virtual private or hybrid cloud, Intel is making your AI smarter, stronger and faster than ever before.

Discover more about cloud innovations at Intel.co.uk

The evolution of the data centre


Cloud Pro

18 Apr, 2019

Whether it’s making a credit card purchase, messaging your friends or even simply ordering a pizza, virtually all of the things we do on a daily basis are powered and supported by data centres.

But the data centres we rely on today are a far cry from the technology of the past; they’ve changed almost immeasurably since digital computing took its first early steps in the 1950s and ‘60s. Processing power and capacity have increased exponentially over the years and the infrastructure needed to support modern applications has grown ever more complex.

These advances have been driven by the growing demands of both businesses and consumers. First, the birth of the internet led to an explosion in the number of people consuming online services, which necessitated vast increases in the amount of processing power and capacity that data centres had to offer.

Before long, the need for server capacity spawned the creation of third-party providers, who would host companies’ servers in their own facility, thus removing the initial expense and ongoing overheads of setting up an on-premise data centre for companies. Eventually, as network technology and connectivity improved, this gave way to the cloud computing model where companies rent space not in a data centre, but on the server itself.

Cloud computing has been a major catalyst for change in the data centre; not only have many operating models fundamentally shifted as a result of its rise, but it’s also driven technological advancements like multi-tenant systems, lightning-fast storage and AI applications.

Processing

One of the most foundational changes in data centre technology was the advent of multi-core processors around the turn of the millennium. By fitting two or more processing cores onto a single die, chip manufacturers could radically boost the total performance of data centre hardware, allowing the same workloads to be run with fewer machines.

Multi-core processing also brought huge advantages to virtualisation, which has been a linchpin of the data centre’s growth. Because each processing core runs in parallel with the others, multi-core systems can run huge numbers of virtual machines simultaneously with minimal drops in performance, vastly increasing the number of applications that can be run at once.
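
The effect is easy to demonstrate in miniature from application code. The sketch below, a toy example rather than anything resembling a hypervisor, fans a CPU-bound task out across however many cores the host exposes using Python’s standard process pool.

```python
# Toy illustration of multi-core parallelism: fan a CPU-bound task out across
# all available cores with a process pool. Server workloads and hypervisors
# exploit the same hardware parallelism at a much larger scale.
import os
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit: int) -> int:
    """Deliberately CPU-bound: count primes below `limit` by trial division."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    jobs = [50_000] * (os.cpu_count() or 1)      # one job per reported core
    with ProcessPoolExecutor() as pool:          # one worker process per core by default
        results = list(pool.map(count_primes, jobs))
    print(f"{os.cpu_count()} cores, prime counts: {results}")
```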

Containerisation has had a similar impact; each VM can host multiple containers within it, each of which can host its own application. This allows data centres to exponentially multiply their capacity for applications. As well as spearheading the continued advancement of multi-core processing, Intel has also been a leader in developing virtualisation and container technology, working with engineering partners to make containers and VMs lighter, faster and more resilient.

Cooling

Data centre equipment is highly powerful, but all that power generates large amounts of heat. Unfortunately, server processors are highly sensitive, and need to be kept below a certain temperature in order to ensure optimal performance. In order to maintain this, data centres have to be very carefully climate-controlled, relying on complex and expensive cooling systems that are often the second-largest consumers of power.

While these cooling systems are still very necessary, Intel’s continued advancement in processor technology has made server processors more thermally efficient, generating less heat and therefore requiring less cooling. On top of that, the company also introduced sensors to its server chips in 2011 which allow data centre administrators to measure the temperature and airflow within a data centre. This enables them to better identify hot and cold spots, modelling the placement of new racks and equipment according to temperature conditions.

Along with preventing costly outages, increasing thermal efficiency throughout the data centre also prolongs the lifespan of the servers themselves and reduces the amount of overall cooling necessary, thereby saving administrators money in terms of the substantial operational costs incurred by cooling efforts.

Power

Intel has also steadily improved the power efficiency of its data centre products. Newer chips like its Xeon Scalable range offer greater performance than previous generations, while consuming less electricity. As with improved cooling performance, this saves data centre operators money in operational costs, but it also allows more chips to be packed into the same physical space.

This means that companies can eke more computational power out of the same resources, without needing to invest in more cabinets, increased power consumption or more cooling. Space efficiency is a key concern, too; floor space within a data centre is often in high demand, so the more physical components that can be packed into a single rack, the better.

Storage

The move from traditional spinning-platter HDDs to SSDs was a huge leap forward in this regard, as it meant that storage drives could take up much less space inside a server, albeit at a higher cost. SSDs were also much faster than HDDs at accessing the data stored on them, greatly speeding up overall server operations and enabling much faster performance for tasks like data analytics.

Intel has been instrumental in advancing storage technology through its partnership with Micron, which involved introducing data striping for increased performance and pioneering high-reliability enterprise drives. It also led the workgroup that developed NVMe technology and, more recently, co-developed 3D XPoint memory technology, which offers unparalleled speeds for low-latency workloads. You may be familiar with Intel’s Optane range of memory and storage products, all of which are powered by 3D XPoint.

The end result of all of these numerous changes, developments and advancements has been modern data centres, which are capable of supporting complex, cloud-native workloads. Gone are the days of monolithic mainframes supporting single applications; now, data centres play host to hundreds upon hundreds of sophisticated, multi-core, multi-processor servers, each making use of advanced software-defined networking and low-latency solid state storage drives to power millions of simultaneous applications and processes.

Intel has been at the heart of this change for decades, drawing on its engineering heritage and world-class research expertise to push the boundaries of what data centres are capable of. Whether it’s Optane storage technology, high-performance Xeon Platinum processors or the intelligent software supporting virtualised and containerised applications, Intel remains at the bleeding edge of enterprise processing technology.

Discover more about data storage innovations at Intel.co.uk

Salesforce boosts Einstein portfolio to add more AI into the cloud


Clare Hopping

18 Apr, 2019

Salesforce has unveiled a suite of new Einstein features that will enable developers to custom build AI integrations and machine learning into their apps without labouring over code.

Einstein Platform Services has been enhanced to offer developers more advanced tools, including the drag-and-drop Einstein Translation feature, which can translate any object or field seamlessly.

For example, if a business is dealing with a customer whose native language doesn’t match the operator’s language set, they can be re-routed automatically to a staff member that can communicate in the caller’s native tongue.

Einstein Optical Character Recognition reads documents and autonomously updates the records in Salesforce, saving vital resources that are sometimes wasted on such time-consuming tasks, while Einstein Prediction Builder makes use of Einstein’s powerful prediction capabilities, helping admins build AI models on Salesforce fields or objects.

With Einstein Predictions Service, admins are also able to embed these predictions into third-party systems, whether ERP apps, HR apps or other platforms that need a predictive analytics boost. For example, if Einstein predicts that employees are feeling dissatisfied in the workplace, these predictions can be absorbed into a business’s talent management platform to help advise a future retention strategy.

“The promise of AI is no longer reserved for data scientists. With Einstein, we are empowering developers and admins — the lifeblood of every Salesforce deployment — to make customizable AI a reality for their business,” said John Ball, EVP and general manager at Salesforce Einstein.

“But our mission goes beyond just making the technology more accessible, we are committed to democratizing AI that people can trust and use appropriately.”

Insights 2019: Epicor extends Microsoft Azure partnership to power fresh ERP enhancements


Keumars Afifi-Sabet

17 Apr, 2019

Epicor has built on its partnership with Microsoft Azure to roll out major artificial intelligence (AI) and Internet of Things (IoT) enhancements to its suite of enterprise resource planning (ERP) tools.

Manufacturers and distributors will benefit from upgrades to the company’s Epicor ERP and Prophet 21 platforms respectively, powered by closer integration with Microsoft’s Azure cloud platform almost a year after this partnership was first announced.

Details of both releases were outlined by Epicor’s chief product and technology officer Himanshu Palsule at the company’s annual Insights customer conference, hosted this year at Mandalay Bay, Las Vegas.

In particular, this Azure base layer will power the company’s digital assistant Epicor Virtual Agent (EVA), as well as render better connectivity between people and smart machines in the firm’s Epicor ERP platform for manufacturers.

“We’ve made great progress in moving many of you onto the Azure platform,” Palsule said during his keynote address. “And as you saw with EVA and the connected enterprise, we have now started using the platform elements of Azure.

“We’ve started using the service levels, service fabric and the IoT Hub and other parts of that. And that’s always been our strategy.”

Improvements to the manufacturer-centric Epicor ERP platform see a host of tools such as AI and analytics integrated with the system to enhance the company’s vision for a ‘connected enterprise’.

The Epicor IoT module, in particular, will connect smart machines across the manufacturing floor to Microsoft’s Azure IoT Hub, with data pulled directly from sensors and visualised on the ERP home page.
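
To give a flavour of what that connection involves, the sketch below uses Microsoft’s azure-iot-device Python SDK to push a single sensor reading into an Azure IoT Hub. The connection string, device ID and payload fields are placeholders for illustration; it is a generic example, not Epicor’s integration code.

```python
# Hedged illustration of sending machine telemetry to Azure IoT Hub using the
# azure-iot-device SDK (pip install azure-iot-device). The connection string
# and payload fields are placeholders, not Epicor's actual integration.
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

CONNECTION_STRING = "HostName=example.azure-devices.net;DeviceId=press-01;SharedAccessKey=..."

client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
client.connect()

reading = {
    "machineId": "press-01",        # hypothetical machine identifier
    "spindleTempC": 71.4,
    "vibrationMm": 0.03,
    "timestamp": time.time(),
}

message = Message(json.dumps(reading))
message.content_type = "application/json"
message.content_encoding = "utf-8"

client.send_message(message)        # lands in IoT Hub, ready for downstream dashboards
client.disconnect()
```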

Meanwhile, the Prophet 21 web-based app, targeted at distributors, is encouraging its customers to embrace the public cloud to transform their business processes.

This app was recently ported onto the public cloud in its entirety, with software engineers telling Cloud Pro the most exciting part of the shift is customers’ newfound ability to use the product on any device, from tablets to iMacs.

The latest releases of Epicor’s ERP tools for manufacturers and distributors take steps towards realising the ambitions set out when the strategic partnership was first outlined last year.

As to where the company aims to take this in the future, Palsule told the press at a Q&A following the keynote that customers whose own platforms are tied to Azure will in future be able to draw additional benefits.

He also outlined how the partnership would work in practice and how it would address concerns around security, saying the firm had resisted any temptation to try to solve security itself.

“The way it works is you get the machines to communicate with the IoT Hub, and the IoT Hub has certain restrictions on security protocols. That data then goes into ERP and communicates back,” said Palsule.

“Everything that we’re doing follows the standard Microsoft protocols, so we are relying heavily on this partnership to give us security.”

Microsoft, meanwhile, sees this partnership as the perfect marriage between its underlying cloud infrastructure and the niche expertise that a partner in the mould of Epicor can offer.

“We got our engineering teams together, we sat down, and ultimately worked out all the architectural designs,” Microsoft’s partner director for global ISV alliances and business development Don Woods told Cloud Pro.

“We put the ERP into Azure, and then started going to customers and said ‘look, you want to get into the cloud? Epicor’s moving into the cloud. It’s going to SaaS. You want to benefit from all this? Join us in that movement’.

“And Epicor wanted to make sure that the customers had that capability, because that’s where it’s going.”

VMware’s blockchain now integrates with DAML smart contract language

VMware’s move into the blockchain space makes it the latest cloud vendor to get involved with the technology – and its offering has now been enhanced with the announcement of an integration with Digital Asset.

Digital Asset, which operates DAML, an open source language for constructing smart contracts, is integrating the language with the VMware Blockchain platform. The move to open source DAML was relatively recent, and the company noted its importance when combining with an enterprise-flavoured blockchain offering.

“DAML has been proven to be one of the few smart contract languages capable of modelling truly complex workflows at scale. VMware is delighted to be working together on customer deployments to layer VMware Blockchain alongside DAML,” said Michael DiPetrillo, senior director of blockchain at VMware. “Customers demand choice of language execution environments from their blockchain and DAML adds a truly robust and enterprise-focused language set to a blockchain platform with multi-language support.”

The timeline of the biggest cloud players and their interest in blockchain technologies is an interesting one. Microsoft’s initiatives have been long-standing, as have IBM’s, while Amazon Web Services (AWS) went back on its word to launch a blockchain service last year. VMware launched its own project, Project Concord, at VMworld in Las Vegas last year but followed this up with VMware Blockchain in beta in November.

Despite the interest around blockchain as a whole, energy consumption has been a target for VMware CEO Pat Gelsinger, who at a press conference in November described the technology’s computational complexity as ‘almost criminal.’

VMware was named by Forbes earlier this week in its inaugural Blockchain 50. The report – which carries similarities to its annual Cloud 100 rankings – aimed to provide analysis of those with the most exciting initiatives, based in the US and with a minimum valuation or sales figure of $1 billion.


The state of cloud business intelligence 2019: Digging down on Dresner’s analysis

  • An all-time high 48% of organisations say cloud BI is either “critical” or “very important” to their operations in 2019
  • Marketing and sales place the greatest importance on cloud BI in 2019
  • Small organisations of 100 employees or fewer are the most enthusiastic, perennial adopters and supporters of cloud BI
  • The most preferred cloud BI providers are Amazon Web Services and Microsoft Azure.

These and other insights are from Dresner Advisory Services’ 2019 Cloud Computing and Business Intelligence Market Study. The 8th annual report focuses on end-user deployment trends and attitudes toward cloud computing and business intelligence (BI), defined as the technologies, tools, and solutions that rely on one or more cloud deployment models. What makes the study noteworthy is the depth of focus around the perceived benefits and barriers for cloud BI, the importance of cloud BI, and current and planned usage.

“We began tracking and analysing the cloud BI market dynamic in 2012 when adoption was nascent. Since that time, deployments of public cloud BI applications are increasing, with organisations citing substantial benefits versus traditional on-premises implementations,” said Howard Dresner, founder and chief research officer at Dresner Advisory Services. Please see page 10 of the study for specifics on the methodology.

Key insights gained from the report include the following:

An all-time high 48% of organisations say cloud BI is either “critical” or “very important” to their operations in 2019

Organisations have more confidence in cloud BI than ever before, according to the study’s results. 2019 is seeing a sharp upturn in cloud BI’s importance, driven by the trust and credibility organisations have for accessing, analysing and storing sensitive company data on cloud platforms running BI applications.

Marketing and sales place the greatest importance on cloud BI in 2019

Business intelligence competency centres (BICC) and IT departments have an above-average interest in cloud BI as well, with their combined critical and very important scores being over 50%.

Dresner’s research team found that operations had the greatest duality of scores, with critical and not important being reported at comparable levels for this functional area. Dresner’s analysis indicates operations departments often rely on cloud BI to benchmark and improve existing processes while re-engineering legacy process areas.

Small organisations – of 100 employees or fewer – are the most enthusiastic, perennial adopters and supporters of cloud BI

As has been the case in previous years’ studies, small organisations are leading all others in adopting cloud BI systems and platforms. Perceived importance declines only slightly in mid-sized organisations (101-1,000 employees) and some large organisations (1,001-5,000 employees), where minimum scores of “important” offset declines in “critical”.

The retail/wholesale industry considers cloud BI the most important, followed by technology and advertising industries

Organisations competing in the retail/wholesale industry see the greatest value in adopting cloud BI to gain insights into improving their customer experiences and streamlining supply chains. The technology and advertising industries also see cloud BI as very important to their operations. Just over 30% of respondents in the education industry see cloud BI as very important.

R&D departments are the most prolific users of cloud BI systems today, followed by marketing and sales

The study highlights that R&D’s lead over all other departments in existing cloud BI use reflects the broader range of potential use cases being evaluated in 2019. Marketing and sales is the next most prolific department using cloud BI systems.

Finance leads all others in its adoption of private cloud BI platforms, rivalling IT in its lack of adoption for public clouds

R&D departments are the next most likely to be relying on private clouds currently. Marketing and sales are the most likely to take a balanced approach to private and public cloud adoption, equally adopting private and public cloud BI.

Advanced visualisation, support for ad-hoc queries, personalised dashboards, and data integration/data quality tools/ETL tools are the four most popular cloud BI requirements in 2019

Dresner’s research team found the lowest-ranked cloud BI feature priorities in 2019 are social media analysis, complex event processing, big data, text analytics, and natural language analytics. This year’s analysis of the most and least popular cloud BI requirements closely mirrors traditional BI feature requirements.

Marketing and sales have the greatest interest in several of the most-required features including personalised dashboards, data discovery, data catalog, collaborative support, and natural language analytics

Marketing and sales also have the highest level of interest in the ability to write to transactional applications. R&D leads interest in ad-hoc query, big data, text analytics, and social media analytics.

The retail/wholesale industry leads interest in several features including ad-hoc query, dashboards, data integration, data discovery, production reporting, search interface, data catalog, and ability to write to transactional systems

Technology organisations give the highest score to advanced visualisation and end-user self-service. Healthcare respondents prioritise data mining, end-user data blending, and location analytics, the latter likely for asset tracking purposes. In-memory support scores highest with Financial Services respondent organisations.

Marketing and sales rely on a broader base of third party data connectors to get greater value from their cloud BI systems than their peers

The greater the scale, scope and depth of third-party connectors and integrations, the more valuable marketing and sales data becomes. Relying on connectors for greater insights into sales productivity and performance, social media, online marketing, online data storage, and simple productivity improvements is common in marketing and sales. Finance’s requirement for integration with Salesforce reflects the CRM application’s success in transcending customer relationships into advanced accounting and financial reporting.

Subscription models are now the most preferred licensing strategy for cloud BI and have progressed over the last several years due to lower risk, lower entry costs, and lower carrying costs

Dresner’s research team found that subscription licences and free trials (including trial and buy, which may also lead to subscription) are the two most preferred licensing strategies among cloud BI customers in 2019. Dresner Advisory Services predicts new engagements will be earned using subscription models, which are now seen as, at a minimum, important to approximately 90% of respondents.

60% of organisations adopting cloud BI rank Amazon Web Services first, and 85% rank AWS first or second

43% choose Microsoft Azure first and 69% pick Azure first or second. Google Cloud closely trails Azure as the first choice among users but trails more widely after that. IBM Bluemix is the first choice of 12% of organisations responding in 2019.
