Between the mockups and specs produced by analysts and the resulting applications built by developers, there exists a gulf where projects fail, costs spiral, and applications disappoint. Methodologies like Agile attempt to bridge this gulf with intensified communication, achieving partial success but facing many limitations.
In his session at @DevOpsSummit at 19th Cloud Expo, Charles Kendrick, CTO at Isomorphic Software, presented a revolutionary model enabled by new technologies. Learn how business and development users can collaborate – each using tools appropriate to their expertise – to build mockups and enhance them all the way through functional prototypes to final deployed applications. This approach helps you improve usability, exceed end-user expectations, and still hit project milestones.
Databases in Containers | @DevOpsSummit #DevOps #Docker #Kubernetes
Hardware virtualization and cloud computing allowed us to increase resource utilization and improve our flexibility to respond to business demand. Docker containers are billed as the next quantum leap – but are they? Databases have always presented an additional set of challenges, as they run workloads that demand maximum I/O, network and CPU resources combined with data locality.
How to move your accounting to the cloud
If you have always done your accounts with a traditional accounting package, it’s perhaps time you started thinking about moving them to the cloud.
Many organisations have turned to cloud applications to help reduce costs or improve operational speed, and those same organisations are now starting to move their accounts to the cloud too. While there’s some trepidation about moving highly sensitive financial information off premises, the drivers for doing so are becoming harder to ignore.
One such driver is the Making Tax Digital initiative from HMRC. From April 2019, all companies over the VAT threshold of £85,000 will be required to keep digital records and submit them electronically. There are also suggestions that digital reporting for income tax could become mandatory as well, meaning businesses will have to review all aspects of how they engage with clients today and what that might look like in future.
There’s increasing pressure on accountants to do more than has traditionally been expected of them. According to a recent survey of accountants, 83% said clients expect them to do more than they did five years ago, with 42% expecting their accountants to offer business advice as part of the service.
“With this in mind, accountants are looking to lighten their administrative burden which will enable them to spend more time attending to these new demands,” says Sean Evers, accountant development director at Sage. “Moving accounting to the cloud allows organisations to spend less time on admin and more on attracting and serving customers.”
Benefits of cloud accounting
One of the main benefits of moving accounts to the cloud is quicker access to better software.
“Software is constantly evolving and improving, from bug fixes and patches to new features,” says Kris Brown, UK R&D director at TechnologyOne. “When you enhance the software for one customer, all customers benefit. Not only is it much quicker to deploy cloud solutions, you also gain full control over the deployment process for new software.”
“If you only wish to implement certain parts of the solution, you can test those and then leave the other features ‘switched off’.”
Paul McCooey, a partner at chartered accountants Duncan & Toplis, says that recording long lists of payments is a thing of the past, as your bank account automatically syncs with your software.
“[There are] no more manually listing invoices [either]. The software can issue them automatically. And no more lost receipts or unclaimed tax relief – simply take a photograph of your bills or receipts and upload them,” he says.
“You’ll spend less time chasing debts too – cloud accounting software can issue automated reminders and ‘pay now’ buttons as payment prompts.”
Defining the future state
Paul Nicklin, technical director at inniAccounts, says that moving to the cloud isn’t as simple as flicking a switch, so you need to do your homework.
“The world is going cloud first, so you’ll have an easier time if everything is in the cloud. However, you still need to consider how systems interact.
“For instance, if you have an ERP system and payroll plugged into an on-premise accounting package, it will be more tricky, but far from impossible, to move than if you have cloud-based services that are compatible with your old and new provider,” says Nicklin.
Brown says that the first phase in moving an organisation’s accounts to the cloud is to have a clear view of what success will look like – the future state – and to clearly define the current state.
“Every organisation is unique and will develop its own unique path to transition based on its existing applications, systems and requirements,” he says.
“It can be as simple as taking a copy of what you have on premise and starting the next day in the cloud. A like-for-like move of business process and software doesn’t have to be a massive task. But a ‘lift and shift’ approach – placing current applications into a hosted environment – won’t deliver true transformation.”
Brown adds that if you are moving to a new provider, then this does need to be considered as a new implementation, and at that point, a change of processes and culture will be required to realise the full benefits of SaaS.
Nicklin believes that if your business is small, and your existing accounts aren’t as streamlined as they could be, then moving to cloud is the perfect opportunity to clean things up.
“In our experience of moving accounts from complex spreadsheets through to exports from accounts packages, we’ve encountered quality issues,” says Nicklin. “Our advice, therefore, tends to be not to bring too much history. It’s often wrong.”
Integrating accounts with other cloud apps
Migrating other parts of the back office to the cloud ensures that the benefits seen by the accounting team are felt across the board, argues Sage’s Sean Evers.
“Back office functionalities can often be time and resource consuming but moving to the cloud will take away much of the heavy lifting associated with these processes, allowing organisations to focus on serving their customers and improving workforce efficiency.”
He believes that once accounting is moved to the cloud, organisations should regularly review processes and challenge their efficiency. This is particularly useful for identifying those processes that can be automated or streamlined by using a third-party app, provided it’s able to connect to your cloud accounting service.
“That way, not only are you saving time and money around the accounting aspect, but all over your business. It also means more time to do what you want to do – whether that’s growing your business or more personal time,” says Evers.
David Lindores, technical director at Eureka Solutions, says that while it is not a requirement to have other parts of the business in the cloud, it is highly advisable.
“There are integration tools available which allow businesses to transfer data seamlessly between cloud-based software solutions and on-premise software,” he says.
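To make that concrete, the sketch below shows roughly what such an integration tool does: it reads invoices that have not yet been synced from an on-premise database and pushes them to a cloud accounting service over HTTPS. It is only a minimal illustration – the database schema, endpoint URL, token and field names are hypothetical placeholders rather than any particular vendor’s API.

```python
# Minimal sketch of an on-premise-to-cloud accounting sync.
# The endpoint, token, table and field names are hypothetical placeholders,
# not any specific vendor's API.
import json
import sqlite3
import urllib.request

CLOUD_API_URL = "https://cloud-accounting.example.com/api/invoices"  # hypothetical
API_TOKEN = "replace-with-your-token"                                # hypothetical

def fetch_unsynced_invoices(db_path: str) -> list[dict]:
    """Read invoices not yet pushed to the cloud from the on-premise database."""
    with sqlite3.connect(db_path) as conn:
        conn.row_factory = sqlite3.Row
        rows = conn.execute(
            "SELECT id, customer, amount, issued_on FROM invoices WHERE synced = 0"
        ).fetchall()
    return [dict(row) for row in rows]

def push_invoice(invoice: dict) -> bool:
    """POST a single invoice to the cloud service; return True on success."""
    request = urllib.request.Request(
        CLOUD_API_URL,
        data=json.dumps(invoice).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return response.status in (200, 201)

def mark_synced(db_path: str, invoice_id: int) -> None:
    """Flag an invoice as synced so it is not pushed twice."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("UPDATE invoices SET synced = 1 WHERE id = ?", (invoice_id,))

if __name__ == "__main__":
    for invoice in fetch_unsynced_invoices("onprem_accounts.db"):
        if push_invoice(invoice):
            mark_synced("onprem_accounts.db", invoice["id"])
```

In practice, commercial integration tools layer authentication flows, field mapping, retries and conflict handling on top of this basic pattern, but the underlying shape of the sync is the same.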
Oracle adds AI and bare metal to UK cloud region
Oracle has built out its UK cloud offering with the addition of several new services to its local region.
Those include Oracle’s autonomous database, Exadata, AI and blockchain, which are all now available from its UK data centres, meaning organisations that store their data locally can now leverage these capabilities.
They can also take advantage of Oracle’s bare metal cloud based on its new X7 server, and block storage, both designed for high-performance computing (HPC) workloads.
Claiming the UK-hosted services benefit from “extremely low latency and high bandwidth”, James Stanbridge, VP and lead product manager for Oracle Cloud Infrastructure (OCI) in Europe and Asia, said: “It’s a very significant new region and what Oracle is doing and our strategy has been over many years now is to build new regions and cloud capabilities exactly where and when customers need them to meet their growing needs to complete their transformation to the cloud.”
Oracle touted customers like Bristol startup YellowDog, which specialises in processing HPC workloads, and the National Grid, to show its cloud is attracting users both big and small.
“The goal National Grid has is a complete IT transformation from on-premise to a cloud-first strategy – that’s by no means unusual [among our customers],” Stanbridge told Cloud Pro.
“These IT transformation projects have long lead times on them, much more like six to nine-month projects. I’m seeing that in data centre regions in the EU. We are starting to see some of those large IT transformations move in [to the UK region].”
Oracle’s new additions to its UK region come as part of a new push to expand its global data centre footprint, opening 12 new regions, each comprising three or more facilities, across Asia, Europe, and the Americas.
It added new UK facilities in the first half of 2017, alongside an expansion in Turkey and the US. Oracle spent just $1.7 billion on new cloud regions in 2016, compared with the $31 billion spent by rivals Amazon Web Services (AWS), Microsoft and Google. CEO Mark Hurd defended his company’s outlay by saying that Oracle’s fast hardware meant it could get by with fewer facilities.
Bringing #DevOps to #DataScience | @DevOpsSummit @CAinc #AI #Analytics
There’s no shortage of material about DevOps on the Interwebs. But as you sift through mountains of information, it appears mostly skewed toward better and faster web and mobile app development. Look for details on how the thinking can be applied to the wicked world of data science or analytics and you’ll be hard pressed to find much at all. This seems odd, because a sustainable business isn’t just carved out on the back of web and mobile apps. It’s also dependent on intimately understanding customer data and pivoting towards the opportunities revealed – with yes, web and mobile apps.
Peter Principle and Big Data | @ExpoDX #BigData #Analytics #AI #IoT #IIoT
Wikibon just released their “2017 Big Data Market Forecast.” How rosy that forecast looks depends upon whether you look at Big Data as yet another technology exercise, or if you look at Big Data as a business discipline that organizations can unleash upon competitors and new market opportunities. To quote the research: “The big data market is rapidly evolving. As we predicted, the focus on infrastructure is giving way to a focus on use cases, applications, and creating sustainable business value with big data capabilities.”
Oracle enhances its UK cloud services with new region
Oracle has announced the availability of enhanced UK cloud services promising customers ‘unprecedented levels of performance and availability’.
The company will expand its UK cloud region, which comprises three physically separate, high-bandwidth, low-latency sites, and will offer a range of Oracle PaaS and IaaS services, as well as the highly publicised autonomous database service.
To that end, the company rolled out a couple of satisfied customers: the National Grid, which said using Oracle’s IaaS would help it ‘deliver lower cost services going forward’, and cloud rendering firm YellowDog, which said it would appreciate the lower latency compared with using Oracle’s EU cloud region.
Oracle announced its first autonomous database service, the Oracle Autonomous Data Warehouse Cloud, at an event last week, with attendees told it will be the first of several autonomous PaaS services being delivered this year. In recent earnings reports, Larry Ellison has spoken about little else; last month he told analysts “no other cloud provider has anything like it” and that autonomous mobility, analytics, and integration services were in the pipeline.
Yet of more interest – to this publication at least – is the fact Oracle now appears to be more confident in its global cloud footprint.
About a year ago this reporter contacted the company to ask for a map, or even a list, of its cloud data centre regions – only to be told there wasn’t one. Today, behold the map [below]. The earliest date CloudTech can find for the page’s appearance was November last year, which suggests this is a relatively recent initiative.
The map shows a total of 12 regions across five continents, with Europe represented by London, Slough, Amsterdam and Frankfurt. Oracle’s future plans, however, are far greater. At its CloudWorld event last month, Oracle promised a dozen new regions, including expansion to Canada, China, India and Japan among others.
CLOUD Act brings Microsoft’s US data privacy court spat to an end
A long-running court battle between Microsoft and the US government over data sovereignty appears to be drawing to a close, with both sides calling for the dismissal of the case in light of a new law.
The case, which has been ongoing since 2013, centres around a number of emails stored in Microsoft’s Dublin data centre related to a criminal investigation into alleged drug trafficking. Redmond has repeatedly refused to comply with a warrant to hand over the emails to US law enforcement agencies, arguing that as the information is stored outside of the country it doesn’t fall under the jurisdiction of American authorities.
Prosecutors in the case have countered by saying that because the data is held by an American entity – i.e. Microsoft – it falls under American jurisdiction, basing their arguments on the US Stored Communications Act of 1986.
Over the course of the past five years, the case has been fought all the way up to the US Supreme Court and attracted interventions and amicus briefs from the likes of Apple, Cisco, AT&T and, most recently, the European Union.
It now seems, however, that this bitterly fought battle is coming to an end – for now, at least.
The US Department of Justice (DoJ) has requested the case be dismissed, according to Reuters, with Microsoft backing the call, despite the two presenting arguments to the Supreme Court justices as recently as 27 February.
The reason for the about-turn is a new piece of legislation signed into effect by US president Donald Trump on 22 March, which both sides say effectively resolves the dispute. The CLOUD (Clarifying Lawful Overseas Use of Data) Act states that US law enforcement agencies have the right to issue and enforce warrants relating to data held by American companies abroad, although there is also recourse to object if the order conflicts with foreign laws.
In the filing, Microsoft’s lawyers said: “Microsoft agrees with the government that there is no longer a live case or controversy between the parties with respect to the question presented.”
While it may be the end of this particular case, this isn’t necessarily the end of the story completely. The DoJ has acquired a new warrant for the data governed by the new law, which means there’s still opportunity for Microsoft to object and, indeed, for the whole cycle to start again.
Putting the D in VDI: How virtual desktop infrastructure got its desktop back
The point of virtual desktop infrastructure (VDI) is to offer employees anytime, anywhere accessibility to your organization’s applications and data. VDI is a “desktop away from the desktop.” The problem is that more emphasis has had to be placed on the infrastructure part of VDI due to outdated technology that creates complexity. But newer VDI technology is about to restore the desktop to its rightful place.
A promising beginning
In a tradition dating back more than a quarter of a century, when it’s time to buy desktops, the IT department focuses on how much CPU, memory and storage comes with each desktop. Why should a virtual desktop project be any different? IT staff should spend their time thinking about the same desktop attributes. Instead, they spend most of their time talking about "infrastructure": servers, storage, layers, management tools and much more. Managing all this infrastructure is exhausting and expensive, and when the focus is on the "I" and not on the "D," users end up unhappy and IT staff end up frustrated. How did this happen?
IT typically spends about $1,500 (roughly €1,200) per desktop or laptop. That cost is amortized over three to four years. But the overhead of dealing with physical desktops is often unsustainable, and once the world went mobile, being tethered to a desktop was a sure way to give your competitors the advantage. In response, some IT teams made the decision to deploy virtual desktops and apps.
The promise of VDI was compelling: greater IT efficiency and information security, and a productive mobile workforce. But in order to implement VDI on-premises, IT had to translate those desktop attributes into expensive and complex data center technologies. IT staff started asking questions like, “If I have 1000 users, how many servers do I need? How much shared SAN/NAS storage do I need? In which data centers do I put this infrastructure?”
A complicated middle act
With traditional VDI there are many moving parts. Organizations that choose VDI need to determine how many servers they’ll need. “Which applications are used? What is the CPU and memory usage rate? How many users can I fit onto a certain class of servers? Do I need 20, 30 or 50 servers for 1000 users?” It all depends on usage.
Even trickier is the struggle to determine storage needs. Local storage on PCs is the cheapest storage available – about $100/TB. SAN/NAS can be 25-100 times that cost. If each user had 1TB of storage on their desktop, then you would need 1000 TB of SAN/NAS. That is massively expensive.
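The arithmetic behind these sizing questions is straightforward; the short sketch below reproduces it using the figures quoted above, plus an assumed users-per-server density, which is a placeholder – real densities depend entirely on the application mix.

```python
# Back-of-the-envelope VDI sizing, using the figures quoted in the article.
# The users-per-server density is an assumption for illustration only; real
# values depend on application mix and CPU/memory usage ("20, 30 or 50").
USERS = 1000
USERS_PER_SERVER = 40                              # assumed density
STORAGE_PER_USER_TB = 1.0                          # 1TB per user, as above
LOCAL_COST_PER_TB = 100                            # local PC storage, ~$100/TB
SAN_MULTIPLIER_LOW, SAN_MULTIPLIER_HIGH = 25, 100  # SAN/NAS is 25-100x local

servers_needed = -(-USERS // USERS_PER_SERVER)     # ceiling division
total_storage_tb = USERS * STORAGE_PER_USER_TB
local_cost = total_storage_tb * LOCAL_COST_PER_TB
san_cost_low = local_cost * SAN_MULTIPLIER_LOW
san_cost_high = local_cost * SAN_MULTIPLIER_HIGH

print(f"Servers needed for {USERS} users: {servers_needed}")
print(f"Storage required: {total_storage_tb:.0f} TB")
print(f"Local-equivalent storage cost: ${local_cost:,.0f}")
print(f"SAN/NAS cost range: ${san_cost_low:,.0f} - ${san_cost_high:,.0f}")
```

At the assumed density this works out to 25 servers and a SAN/NAS bill somewhere between $2.5m and $10m, against roughly $100,000 worth of equivalent local disk – the gap the rest of this section is about.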
To keep VDI from dying on the vine, providers created various ways to optimize storage. The conversation went something like this: “Oh, you can optimize with a single image so you don't need to have 1000 copies of Windows OS. Now, let's put in layers so you don't need to have 1000 copies of each application. Wait, what about profile management tools to store end user personalization? You need that, too. Oh, and you can no longer manage it with your existing PC management tools like SCCM and Altiris. So, your VDI infrastructure is a stand-alone management framework.”
Now, these workarounds may seem viable, but they fail to take into account the fact that Windows wasn't architected to operate in this manner. So customers struggle with app compatibility, corrupted profiles and application updates that blow away desktops. At the same time, storage vendors started implementing de-duplication, so that the 1000 copies of Windows and applications in each user's desktop were automatically de-duped at the storage layer. Hyper-converged infrastructure (HCI) vendors ultimately adopted de-duplication as well, and even though HCI began to bring down the cost of VDI implementations, it hasn't gone far enough.
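To see why de-duplication collapses the footprint of a thousand near-identical desktops, consider the toy content-addressed block store below. It is purely illustrative of the principle – real arrays and HCI platforms do this inline, at the block level, with far more sophistication.

```python
# Toy illustration of block-level de-duplication: identical blocks across many
# desktop images are stored once, keyed by their content hash. This is only a
# demonstration of the principle, not how any particular product implements it.
import hashlib

BLOCK_SIZE = 4096

def dedupe(images: list[bytes]) -> dict[str, bytes]:
    """Return a content-addressed block store shared by all images."""
    store: dict[str, bytes] = {}
    for image in images:
        for offset in range(0, len(image), BLOCK_SIZE):
            block = image[offset:offset + BLOCK_SIZE]
            store[hashlib.sha256(block).hexdigest()] = block
    return store

if __name__ == "__main__":
    base_os = b"windows-gold-image" * 10_000                      # stand-in for a shared OS image
    desktops = [base_os + f"user-{i}".encode() for i in range(1000)]  # 1000 nearly identical desktops
    store = dedupe(desktops)
    raw = sum(len(d) for d in desktops)
    deduped = sum(len(b) for b in store.values())
    print(f"Raw size: {raw / 1e6:.1f} MB, deduplicated: {deduped / 1e6:.1f} MB")
```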
At this point in the process, you have to plan for where all this infrastructure is going to live. Which data center should it be in? How far away will all your end users be from that data center? What does that mean for latency? What will the user experience be like? How much bandwidth will they require?
A solid finish
So then, the infrastructure had to be addressed before the desktop could get its moment in the spotlight. IT departments have had to jump through complex infrastructural hoops to deliver a mission-critical workload to a class of users. But there are more important things for IT teams to do and more value for them to add than dealing with all this complexity.
The advent of cloud computing created an opportunity to completely re-imagine what the phrase “virtual desktops” means. Now, the data center is any region of the public cloud you select. Essentially, the infrastructure becomes invisible in that region – at least in terms of you having to worry about it. Desktops can be placed close to the users so they have a great experience. All IT needs to do is determine the configuration of the desktop, just like they determine the configuration of a physical PC.
It is similar and, in fact, simpler to buy a desktop cloud solution than it is to buy a PC. The IT team simply chooses a desktop configuration running in Azure. They order the number of units needed for their end users. Then they use their corporate image to create copies of desktops in the various regions where the users are. But rather than shipping a PC to each user, IT simply emails a link for their desktop.
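As a rough illustration of that workflow – and nothing more – the sketch below models choosing a desktop configuration, grouping users by region, provisioning a desktop per user from a corporate image and sending out links. Every class, region name and the provision() call are hypothetical placeholders, not a real cloud provider's API.

```python
# Illustrative sketch of the ordering workflow described above. All names,
# classes and the provision() call are hypothetical placeholders, not a real
# cloud provider API.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class DesktopConfig:
    vcpus: int
    memory_gb: int
    disk_gb: int
    image: str          # corporate gold image

@dataclass
class User:
    email: str
    region: str

def provision(config: DesktopConfig, region: str, user: User) -> str:
    """Stand-in for a provider's provisioning call; returns a desktop access link."""
    return f"https://desktops.example.com/{region}/{user.email}"  # placeholder URL

def roll_out(config: DesktopConfig, users: list[User]) -> dict[str, list[str]]:
    """Provision one desktop per user, close to that user, and collect the links."""
    by_region: dict[str, list[User]] = defaultdict(list)
    for user in users:
        by_region[user.region].append(user)

    links: dict[str, list[str]] = defaultdict(list)
    for region, regional_users in by_region.items():
        for user in regional_users:
            link = provision(config, region, user)
            links[region].append(link)
            print(f"Emailing {user.email} their desktop link: {link}")
    return links

if __name__ == "__main__":
    config = DesktopConfig(vcpus=4, memory_gb=16, disk_gb=128, image="corp-win10-v42")
    users = [User("alice@example.com", "uk-south"), User("bob@example.com", "us-east")]
    roll_out(config, users)
```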
In this way, cloud computing has restored the emphasis on the desktop and eliminated infrastructure complexity. Instead of fretting over infrastructure, organizations can now focus on what class of desktop their users need. It’s no longer a question of how many servers they will need but how much CPU. And when needs change, the desktop configuration can be modified in minutes. Users will have a better experience and IT teams can concentrate on higher-value projects. Now VDI is delivered simply and efficiently as a turnkey service.
Embracing #Cognitive with Hybrid Cloud | @CloudExpo @IBMcloud @IBMSystems #AI #SDS #ArtificialIntelligence
Leading companies, from the Global Fortune 500 to the smallest companies, are adopting hybrid cloud as the path to business advantage. Hybrid cloud depends on cloud services and on-premises infrastructure working in unison. Successful implementations require new levels of data mobility, enabled by an automated and seamless flow across on-premises and cloud resources. In his general session at 21st Cloud Expo, Greg Tevis, an IBM Storage Software Technical Strategist and Customer Solution Architect, will explore how storage and software-defined solutions from IBM have evolved for the road ahead. Walk away knowing how you can bring new levels of speed, agility and efficiency to the applications and workloads you choose to deploy across a hybrid cloud model.