Apple tests biometric sign-in for iCloud


Bobby Hellard

9 Jul, 2019

Apple is testing biometric security for iCloud in its beta programmes for the next raft of operating systems, according to reports.

Last month, the tech giant announced the beta versions of iOS 13, iPadOS 13 and macOS Catalina, which are being rolled out to testers in the coming weeks. Now, according to 9to5Mac, these operating systems will test a new sign-in process for iCloud on the web, meaning that users can sign in using either Face ID or Touch ID.

When you visit iCloud in the Safari browser on a device running any of the new operating systems, you’ll see a new pop-up prompting you to sign in using your Apple ID with biometrics. Tapping “continue” automatically signs the user in.

Users on devices with Touch ID, including the 2018 MacBook Air and Touch Bar-equipped MacBook Pro models, will also be met with a similar sign-in process that uses their fingerprint to authenticate without requiring two-factor authentication.

Apple’s testing of this easy way of signing into the iCloud website is likely related to its forthcoming Sign In with Apple feature – announced last month – which allows users to sign in to apps and websites using their Apple ID.

The feature is being touted as a more secure alternative to similar sign-in services offered by Facebook, Google, and Twitter since it authenticates the user with Face ID or Touch ID and doesn’t send personal information to app and website developers.

Further privacy boosts are being reported with Sign In with Apple also allowing users to create a randomly generated email address that hides their real one when signing up for a third-party app or service.

The new sign-in feature is coming with Apple’s new operating systems when they’re released this fall, and will be available across macOS, iOS and the web.

Making tax digital: What does it mean for your tax returns?


Barry Collins

11 Jul, 2019

Making Tax Digital is the woefully abject name for the government’s latest attempt to modernise the way we submit our taxes. It’s not as if we’ve been filing with quill and parchment until now, after all.

However, stale branding isn’t the only problem with the new system. Parts of Making Tax Digital have predictably fallen behind schedule, while even the parts that have been introduced this spring have been fudged. Still calculating your VAT returns in spreadsheets? You might get away with it for a little bit longer.

Whether you’re a small business owner or just filing your own income tax return, you’ll eventually be forced to defer to the government’s desire to do everything via accountancy software. Not much to fear there for the average reader, perhaps, although HMRC has plenty to gain when it comes to avoiding costly errors and detecting tax fraud.

There’s also an upside for small business owners. With the software doing the drudge work that your accountant used to spend hours poring over and billing you for, they can now concentrate on helping you grow your business.

We find out what the recent changes mean for your business, evaluate the software that can help you submit your taxes and find out what it all means for business owners.

VAT’s the way to do it

The Making Tax Digital (MTD) programme set out with a bold timetable. However, as with most government IT projects, things haven’t quite gone to plan. “The timeline changes a lot because of HMRC getting behind on the building of the software,” said Chris Barnard, tax manager at accountancy firm UHY Hacker Young. “Originally it was meant to be VAT for everyone next year and then the year after that was going to be self-assessment, but that seems unlikely now. I can see that being three, four years away now.”

VAT has at least made it out of the starting blocks, although only businesses with a taxable turnover of more than £85,000 are required to keep their records digitally and submit their VAT returns using MTD-compatible software.

Filing online isn’t new, of course. Anyone running a VAT-registered business will be familiar with logging on to the HMRC website once a quarter and filling in the nine different boxes that confirm the amount of VAT you’ve charged on sales, the amount of VAT you’re reclaiming on purchases and so forth. Well, now you need a piece of government-approved software to fill those boxes for you.

You might be surprised to learn that’s all the information the government wants – at least for the time being. Although software such as Xero, QuickBooks or Sage will have all the detailed transaction data on every sale you’ve made that quarter, none of that info is currently being passed to HMRC. Instead, the software will simply populate the VAT return figures automatically, rather than you having to enter the figures by hand or copy and paste them out of your accountancy software.

That might sound like a lot of fuss about nothing, but Ed Molyneux, CEO of accountancy software provider FreeAgent, told us that simply preventing copy-and-paste errors will make a big difference to the Revenue. “Where errors tend to occur is in transcribing one thing to another,” said Molyneux. “You know what it’s like, it’s easy to swap digits or miss a digit here and there. [MTD] is better than people fat-fingering numbers into a form.”

But it’s not just about cutting out the typos. “One of the other parts of the rules is those nine [VAT return] boxes have to be digitally linked to the underlying accounting record,” said Barnard.

If you were already submitting your VAT returns using accountancy software, that was already the case, but as Barnard said: “There are tens of thousands of businesses that are just using Excel sheets to do their accounting records, but when HMRC want to do an inquiry, they don’t have the confidence that the numbers they’ve got actually mean anything.”

Xero’s director of partner and product, Damon Anderson, also believes the new legislation will prevent other familiar tax problems for both business owners and the Revenue. “This new legislation enables taxes to be filed digitally and quarterly with greater accuracy and speed,” Anderson said. “As a result, this prevents the end-of-year scramble to harmonise Excel spreadsheets or find paper receipts, making businesses more efficient and more transparent to business partners and to HMRC.”

Whither Excel?

Does that mean Excel is no longer an acceptable means of keeping your company’s books? Not quite. Companies that want to continue to manage their accounts in spreadsheets can do so for the time being, but they will need a piece of “bridging software” to submit the return. Most of the big accountancy software firms are already offering this. “As Excel spreadsheets cannot interact with MTD software, Xero has also developed a bridging software solution to ease the transition for those that aren’t fully online and are worried about looming deadlines,” said Anderson.

The bridging software basically takes the calculated figures from your spreadsheet and populates the VAT return, but the operative word here is “calculated”. You can’t just type figures into a spreadsheet and have them uploaded to HMRC: the Revenue wants to see that you’re doing your sums properly, even if you don’t have to upload the transaction data itself.

“At first, they [HMRC] didn’t want to have spreadsheets at all, but after consultations it was decided that as a compromise, the use of spreadsheets would still be allowed for a limited amount of time,” said Barnard. “If you have a tab for sales, the total sales figure will have to digitally link to the VAT return. You upload the spreadsheet to whatever bridging software you’re using and then you put a cell reference into the bridging software, and that picks a number up from the spreadsheet.”
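As an illustration of the “digital link” Barnard describes, a bridging tool essentially resolves a cell reference against the spreadsheet and copies the calculated total into the relevant VAT return box. The toy sketch below assumes a single-letter column reference and an invented sales tab; real bridging products work against actual Excel files and HMRC’s API.

```python
# Toy illustration of a bridging tool's "digital link": given a cell
# reference such as "B4", read that calculated total from the sheet
# and place it in the corresponding VAT return box. The file layout
# and box name are invented for this example.
import csv
import io

def read_cell(csv_text, cell_ref):
    """Resolve a spreadsheet-style reference like 'B4' against CSV rows
    (single-letter columns only, for simplicity)."""
    col = ord(cell_ref[0].upper()) - ord("A")   # 'B' -> column index 1
    row = int(cell_ref[1:]) - 1                 # '4' -> row index 3
    rows = list(csv.reader(io.StringIO(csv_text)))
    return float(rows[row][col])

# A pretend "sales" tab: individual transactions with a calculated
# total in cell B4 - the figure HMRC wants digitally linked.
sales_tab = "Invoice,Net\nINV-001,1200.00\nINV-002,800.00\nTotal,2000.00\n"

vat_return = {"box_6_total_sales": read_cell(sales_tab, "B4")}
print(vat_return)  # {'box_6_total_sales': 2000.0}
```

The point of the cell reference is that the number flows from the calculated total into the return with no retyping, which is exactly the transcription step MTD is designed to eliminate.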

Eventually, however, businesses will need to migrate to approved accountancy packages.

The bigger picture

Many businesses have been reluctant to move to Making Tax Digital because they fear the Revenue will have access to all their transaction data, according to Barnard. As we explained earlier, that’s not yet the case — MTD only requires the same, sparse VAT data that you were forced to submit previously.

However, there’s no doubt that the long-term goal is to gain access to granular detail of companies’ accounts. That will, of course, provide an enormous fillip for the Revenue’s efforts to clamp down on tax fraud. “Down the road, what is now voluntary information – individual invoice transactions or expenses – will at some point be mandatory parts of the legislation,” said Barnard. “That’s probably two to five years away.

“HMRC is looking to be smarter in the way they do inquiries. In the past it was a random exercise, and they spent a lot of money going out to businesses to do inspections and realising there was a typo on the VAT return or it was absolutely fine. They want to move to artificial intelligence systems like the banks have used to spot trends. If there’s a certain business and their submission falls out of the normal criteria of what that business should be submitting, it will flag up an alert.”

Demanding that businesses use accountancy software instead of Excel spreadsheets has another advantage – there’s an online invoice trail. Whereas businesses might sometimes offer to do a “cash job” and not bother to report the work for VAT purposes, “it’s more difficult now not to record those transactions in the system,” said Barnard. “A lot of the time, a client will want an invoice. That invoice will be sent from Xero or whatever and once it’s in the system that invoice automatically goes on the VAT return.”

Benefits for businesses?

Clearly the government is set to benefit from Making Tax Digital, and it must be good news for the accountancy software firms too: tens of thousands of businesses are now being practically mandated to use their software.

What business owners are probably asking themselves is: what’s in it for us? With accountancy software costing hundreds of pounds a year in subscription fees and all the hassle of learning how to use it, is there any advantage to being forced to do your books in this fashion, instead of dumping a pile of paper invoices and receipts in your accountant’s lap once a year and leaving them to get on with it?

Because almost all of the major software packages do a lot of the accountant’s legwork for them – such as automatically creating VAT returns, balance sheets and end-of-year corporation tax calculations – your accountancy fees should (in theory) be reduced, as your accountant doesn’t have to spend as much time doing all that work themselves.

Of course, if there’s one thing accountants are pretty good at, it’s making money, so even if your fees don’t go down, “your accountant should be able to provide a better service”, according to Barnard. “When I first started training, you’d talk to your client about once a year at the end of their accounting year and you were always working with data that was about 18 months old. The client didn’t know what was going on right here, right now. With cloud accounting, they can get management accounts from the software right away. The accountant can explain what the management accounts mean, in detail, and provide recommendations off that, provide more tax advice, and tax planning. There’s a lot more services accountants should be providing now”.

Molyneux agrees that the software should free accountants to make a more meaningful difference to firms. FreeAgent recently surveyed accountants and found that the worst parts of their job were “chasing clients for data, having to fix errors in clients’ data, transcribing stuff into [different] systems and filing tax returns,” according to Molyneux. “The bit they don’t get to spend enough time with their clients on is how to make the business more successful.”

Molyneux says many sole traders and small business owners are spending “hundreds, if not thousands of pounds a year, just to pull together a set of numbers to file a tax return and stay out of prison”.

“Accountants are saying ‘I can’t sell advisory services to people who bring me a shoebox of receipts every year. But I can sell them those services if we can basically assume the numbers are taken care of’,” Molyneux added.

And talking of staying out of prison, Molyneux adds that using accountancy software may in fact indicate to the authorities that you’re running your accounts properly, lowering the risk of the dreaded tax inspection. “Just the knowledge that the data has come from a trusted source, like a bank transaction feed, and has not just been keyed into a system… with plenty of scope for distortion is a big step in the right direction for them [HMRC] being able to trust that data. If you’ve got a system that meets certain criteria in its trustability or auditability, you might even get a bit of a green light through the filing system.”

China cloud computing market grew 40% year over year, report claims

The cloud computing ecosystem in China continues to grow: according to a new report, the market size rose almost 40% in 2018.

As reported by Xinhua, the China Academy of Information and Communications Technology (CAICT) pegged the Chinese cloud market at 96.28 billion yuan (£11.14bn) last year in its most recent whitepaper, up 39.2% from 2017. The market will almost double in size between now and 2022, CAICT added, with a forecast of 173.1bn yuan.

The whitepaper put the private cloud market at 52.5bn yuan for 2018 – up 23.1% year on year – and expects it to reach 117.2bn yuan by 2022.

By 2020, according to a recent missive from the Ministry of Industry and Information Technology, there will be an additional one million enterprises utilising cloud computing in China.

It is one of various proclamations which underscore the country’s potential. According to the most recent analysis from the Asia Cloud Computing Association (ACCA), in April last year, China placed only above Vietnam in a comparison of the primary Asia-Pacific nations. The majority of the problems cited by ACCA relate to China’s size and the long tail of digital transformation, with the country scoring poorly on infrastructure and connectivity.

“The Chinese government continues to devote considerable fiscal resources to the development and improvement of infrastructure, a move that will undoubtedly pay off in the next few years,” the report noted.

When this translates into the roadmap of vendors, Chinese players are starting to make inroads across Asia-Pacific. According to data from Synergy Research in May, the top three players in China are ranked in the top six across APAC as a whole. Alibaba (#2), Tencent (#4) and Sinnet (#6) sit astride Amazon, Microsoft and Google respectively. A large part of this, the research firm noted, is down to China being ‘by far’ the largest country market, growing ‘much faster’ than the rest of the region.


IDC: Global spending on public cloud services and infrastructure to reach £398b in 2023

The latest IDC study suggests that global spending on public cloud services and infrastructure will more than double by 2023, mainly driven by digital transformation deployments. According to the market researcher, spending will grow at a compound annual rate of 22.3 per cent, from $229b (£182b) in 2019 to nearly $500b (£398b) in 2023.
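IDC’s headline figures are consistent with a quick compound-growth calculation – four years of growth at 22.3 per cent from the 2019 base (the four-year horizon is our reading of the 2019-to-2023 forecast period):

```python
# Sanity check on IDC's numbers: $229b in 2019 compounding at 22.3%
# a year over the four years to 2023.
start_billions = 229.0   # 2019 spending, $b
cagr = 0.223             # 22.3 per cent compound annual growth
years = 4                # 2019 -> 2023

projection = start_billions * (1 + cagr) ** years
print(f"${projection:.0f}b")  # roughly $512b - i.e. "nearly $500b"
```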

Eileen Smith, programme director at IDC, said: “Adoption of public (shared) cloud services continues to grow rapidly. Enterprises, especially in professional services, telecommunications, and retail, continue to shift from traditional application software to software-as-a-service (SaaS) and from traditional infrastructure to infrastructure-as-a-service (IaaS) to empower customer experience and operational-led digital transformation initiatives.”

SaaS will hold more than half of all public cloud spending during the forecast period, says IDC, adding that the market segment comprising applications and system infrastructure software (SIS) will be dominated by applications purchases. The report said: “The leading SaaS applications will be customer relationship management (CRM) and enterprise resource management (ERM). SIS spending will be led by purchases of security software and system and service management software.”

It said that IaaS would be the second largest category of public cloud spending throughout the forecast period, followed by platform-as-a-service (PaaS). IaaS spending, spanning servers and storage devices, will also be the fastest-growing category of cloud spending, with a growth rate of 32 per cent. The market research firm said: “PaaS spending will grow nearly as fast – 29.9 per cent – led by purchases of data management software, application platforms, and integration and orchestration middleware.”

Three industries, namely professional services, discrete manufacturing, and banking, will be responsible for more than one-third of all public cloud services spending during the forecast period.

The report further said: “While SaaS will be the leading category of investment for all industries, IaaS will see its share of spending increase significantly for industries that are building data and compute-intensive services. For example, IaaS spending will represent more than 40 per cent of public cloud services spending by the professional services industry in 2023 compared to less than 30 per cent for most other industries. Professional services will also see the fastest growth in public cloud spending with a five-year CAGR of 25.6 per cent.”


Microsoft spruces up Outlook in a bid to catch up with major G Suite upgrades


Keumars Afifi-Sabet

5 Jul, 2019

Outlook is set to get a range of new features this month, including a dark mode, a redesigned email experience and improvements to calendar synchronisation, as part of a major overhaul of the platform.

Users of Microsoft’s Office 365 email service will see a number of improvements to the way messages can be read, categorised and organised, the firm announced. Changes to calendar and meeting functionality, and a series of significant aesthetic tweaks, make up the full complement of changes.

The new Outlook will feature categories that make it easier to tag, find or organise messages, with users able to add multiple categories to a single message.

A favouriting mechanism, in which contacts, groups or entire categories can be highlighted, also offers easier access to certain aspects of any user’s inbox. As with Gmail, meanwhile, users can also draft multiple emails on-the-go using ‘tabs’ that rest on the lower portion of the user interface (UI).

There’s also a snooze function for emails that need to be dealt with later. Snoozing a message removes it temporarily from the inbox, with it reappearing as an unread message at the top of the pile once the snooze period expires.

Among the most eye-catching features, however, is a new dark mode, which lets users personalise their UI for night-time or low-light browsing. The lights can be turned back on when reading a specific email or composing one by configuring this mode in the settings menu.

The firm’s main rival in this space, Google, has spent the past year or so updating its G Suite productivity suite, including a number of significant changes to Gmail, notably the use of artificial intelligence (AI) for predictive responses and inbox management.

Meanwhile, tweaks are also being made to Outlook’s calendar functionality, including the ability to search across multiple calendars, as well as filters to adjust the parameters when hunting for a person or event.

It’s also now possible to quickly create events and book rooms for meetings from the calendar surface on Outlook, while the ‘week view’ dedicates a larger screen area to today and tomorrow.

The changes will be implemented from late July, with ‘targeted release’ customers no longer able to see an opt-in toggle that switches between the old Outlook and the beta version of the latest iteration.

West Sussex and Surrey Fire services move to Infographics cloud


Bobby Hellard

5 Jul, 2019

West Sussex and Surrey Fire and Rescue services have come together for a digital transformation project that will see both units’ IT operations move to the cloud. 

The first phase of the endeavour will see the two agencies move to an Infographics platform called FireWatch, which is a fully managed cloud delivery and maintenance model running on Microsoft’s Azure platform.

Infographics said the FireWatch Cloud platform will provide a range of benefits to both fire and rescue services, including a connected application for integrated HR, payment, training and development, fleet monitoring and staff availability systems. It will also provide flexible and remote access to their fire service management system, including full management and upgrades, the company claimed.

At the same time as moving to FireWatch Cloud, West Sussex and Surrey fire services will also be moving to a joint control operation using the Vision mobilisation and control platform from Capita. Phase two of the project will see Infographics deploy a single interface, with FireWatch processing live incident data from Capita Vision.

Jon Lacey, West Sussex Fire and Rescue’s acting deputy chief fire officer, said that working in collaboration with the team in Surrey enables them both to become more effective and efficient as a service, as well as opening up the possibility of future collaboration.

“The opportunity to work in collaboration with Infographics and Surrey Fire and Rescue Service will transform the way we mobilise our assets to emergency incidents and provide improved support to our teams across our service,” he said.

How to minimise the risk of outages – with better software testing

The software-as-a-service model has been widely embraced as digital transformation becomes the norm. But with it comes the risk of network outages. IDC has estimated that for the Fortune 1000, the average total cost of unplanned application downtime can range from $1.25 billion to $2.25 billion per year. This risk arises primarily from the rapid iteration of the DevOps methodology and the resulting testing shortfalls.

To protect against certain errors and bugs in software, a new and streamlined approach to software testing is in order.

The DevOps/downtime connection

Development and testing cycles are very different from what they used to be, due to the adoption of the DevOps methodology. To remain competitive, software developers must continually release new application features, sometimes pushing out code updates as fast as they are written. This is a significant change from how software and dev teams traditionally operated. It used to be that teams could test for months, but these sped-up development cycles require testing in days or even hours. This shortened timeframe means that bugs and problems are sometimes pushed through without the testing required, potentially leading to network downtime.

Adding to these challenges, a variety of third-party components must be maintained in a way that balances two opposing forces: changes to a software component may introduce unexplained changes in the behavior of a network service, but failing to update components regularly can expose the software to flaws that could impact security or availability.

Testing shortcomings

It’s pricey to deal with rollbacks and downtime caused by bugs. It typically costs four to five times as much to fix a software bug after release as it does to fix it during the design process. The average cost of network downtime is around $5,600 per minute, according to Gartner analysts.

Financial losses are a problem, but there’s more to be lost here. There’s also the loss of productivity that occurs when your employees are unable to do their work because of an outage. There are the recovery costs of determining what caused the outage and then fixing it. And on top of all of that, there’s also the risk of brand damage wreaked by angry customers who expect your service to be up and working for them at all times. And why shouldn’t they be angry? You promised them a certain level of service, and this downtime has broken their trust.

And there’s another wrinkle. Software bugs cause issues when they are released, but they can also lead to security issues further down the road. These flaws can be exploited later, particularly if they weren’t detected early on. The massive Equifax breach, in which the personal details of more than 140 million Americans were compromised, and the Heartbleed bug are just two examples. In the case of the Heartbleed bug, a vulnerability in the OpenSSL library created significant potential for exploitation by bad actors.

In this environment of continuous integration and delivery, developers make changes to the code that trigger a pipeline of automated tests. The code then gets approved and pushed into production. A staged rollout begins, which allows new changes to be pushed out quickly – but this also relies heavily on the automated test infrastructure.

This is hazardous, since automated tests are looking for specific issues, but they can’t know everything that could possibly go wrong. So then, things go wrong in production. The recent Microsoft Azure outage and Cloudflare’s Cloudbleed vulnerability are examples of how this process can go astray and lead to availability and security consequences.

A new way to test

A solution to the shortcomings of current testing methods would find potential bugs and security concerns prior to release, with speed and precision and without the need to roll back or stage. By simultaneously running live user traffic against the current software version and the proposed upgrade, users would see only the results generated by the current production software unaffected by any flaws in the proposed upgrade. Meanwhile, administrators would be able to see how the old and new configurations respond to actual usage.
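One way to picture this approach is a small “shadow testing” harness: every live request is fed to both the current version and the candidate upgrade, only the production answer is returned to the user, and any divergence is logged for engineers. This is an illustrative sketch of the general technique, not any particular vendor’s implementation; all names are invented.

```python
# Minimal sketch of shadow testing: mirror each request to the current
# and candidate versions, serve only the current result, and record
# divergences for later inspection.
from dataclasses import dataclass, field

@dataclass
class ShadowTester:
    current: callable            # production handler: request -> response
    candidate: callable          # proposed upgrade, same signature
    divergences: list = field(default_factory=list)

    def handle(self, request):
        live = self.current(request)           # the user-visible answer
        try:
            shadow = self.candidate(request)   # would run in parallel in practice
        except Exception as exc:               # a candidate crash never reaches users
            shadow = f"error: {exc}"
        if shadow != live:                     # log the difference, don't fail
            self.divergences.append((request, live, shadow))
        return live                            # users only ever see `live`

# A buggy candidate is detected without any user-facing impact:
tester = ShadowTester(current=lambda r: r * 2, candidate=lambda r: r * 3)
result = tester.handle("x")
print(result, len(tester.divergences))  # xx 1
```

The design choice worth noting is that the comparison is observational: the candidate’s output is never served, so a flawed upgrade costs a log entry rather than an outage or a rollback.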

This would allow teams to keep costs down, while also ensuring both quality and security, and the ability to meet delivery deadlines – which ultimately helps boost return-on-investment. For the development community, building and migrating application stacks to container and virtual environments would become more transparent during development and more secure and available in production when testing and phasing in new software.

Working with production traffic to test software updates lets teams verify upgrades and patches in a real-world scenario. They are able to quickly report on differences in software versions, including content, metadata and application behavior and performance. It becomes possible to investigate and debug issues faster using packet capture and logging. Upgrades of commercial software are easier because risk is reduced.

Toward quality releases

Application downtime is expensive, and it’s all the more painful when it’s discovered that the source is an unforeseen bug or security vulnerability. Testing software updates in production overcomes this issue by finding issues as versions are compared side by side. This method will save development teams time, headaches and rework while enabling the release of a quality product.


Premium email firm Superhuman ends pixel tracking after backlash


Bobby Hellard

4 Jul, 2019

A premium email startup which gained notoriety for letting users see, by default, who opened their messages and where they were opened has apologised and promised to change its service.

Superhuman, a plugin for email accounts that aims to speed up emailing, came under fire for using ‘pixel tracking’ by default and without consent from those that receive messages sent from users of the service.

But, after a number of complaints and a critical blog post that went viral, the company has issued an apology and promised to change its service.

Faster Emails

A lot of people became aware of Superhuman via a New York Times profile. Under a picture of a diamond-encrusted ‘new message’ pendant, the startup was described as a premium app for speeding up emails with AI-powered shortcuts and quirks. One of the reasons it attracted such attention was that it’s a $30-a-month, invitation-only service – aspiring users need to fill in a questionnaire about their email usage to determine whether they need it.

“We have the who’s who of Silicon Valley at this point,” the company’s founder Rahul Vohra, told the NYT. It’s alleged that 180,000 people are on a waiting list to use the service. “We have insane levels of virality that haven’t been seen since Dropbox or Slack,” Vohra added.

What these people are supposedly desperate for is an app that plugs into their existing email account (it currently only works with Gmail and Google G Suite addresses) and promises to speed up the process of emailing others. There are features that let users undo sending, buttons to automatically pull up a contact’s LinkedIn profile, an “instant intro” shortcut that moves the sender of an introductory email to bcc, and a scheduling feature, which sees that you’re typing “next Tuesday” and automatically pulls up your calendar for that day.

But one feature that was briefly mentioned in the piece caught the attention of users on Twitter: email tracking.

Read Receipts

“Superhuman is a surveillance tool that intentionally violates privacy by notifying senders every time their emails have been viewed by recipients,” Mike Davidson, a VP at InVision, tweeted. “I would never trust this company. Only way to make sure your own privacy isn’t violated is to disable images in your own email app.”

Later, in a blog post that went viral, Davidson explained that the email tracking feature, called ‘Read Receipts’ on Superhuman, is a default setting for the service. He wrote that the read/unread status of an email is not something the receiver can opt out of. He showed an example of an email he had sent via Superhuman.

“A running log of every single time you have opened my email, including your location when you opened it,” he explained. “Before we continue, ask yourself if you expect this information to be collected on you and relayed back to your parent, your child, your spouse, your co-worker, a salesperson, an ex, a random stranger, or a stalker every time you read an email. Although some one-to-many email blasting software has used similar technologies to track open rates, the answer is no; most people don’t expect this. People reasonably expect that when – and especially where – they read their email is their own business.”

Pixel Tracking

Every time you view an image while browsing the Web, that image is stored on a server and downloaded to your computer. As such the host server has knowledge of where that computer is and when it downloaded the image. 

This technicality can be used to embed a tiny image, often a mere pixel wide, into web pages and emails, which can be very difficult to spot or transparent.

As such, opening an email containing such an image triggers a download from a server, which exposes the time the recipient opened the email and the computer’s approximate location, effectively serving as the foundation for automatic read receipts.
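The mechanics described above can be sketched in a few lines. This is a minimal illustration of how a tracking pixel works in general, not Superhuman’s actual implementation; the URL, message ID and function names here are invented for the example.

```python
import base64
from datetime import datetime, timezone

# The classic 1x1 transparent GIF payload commonly used as a "tracking pixel".
PIXEL_GIF = base64.b64decode(
    "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
)

def pixel_html(track_url: str, message_id: str) -> str:
    # The sender embeds this tag in the email body; the id ties the
    # image request back to one specific message and recipient.
    return f'<img src="{track_url}?id={message_id}" width="1" height="1" alt="">'

def log_open(message_id: str, client_ip: str) -> str:
    # What the tracking server can record the moment the recipient's email
    # client fetches the image: a timestamp plus the requesting IP address.
    ts = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return f"{ts} open id={message_id} ip={client_ip}"

print(pixel_html("https://example.com/px.gif", "msg-42"))
```

Blocking remote images in an email client, as Davidson suggested, prevents the `<img>` request from ever being made, which is why it defeats this technique.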

As Davidson’s example showed, the emails reveal not only when they were opened but also where. In a sense, then, Superhuman could be used to track people’s locations simply by sending them an email. This comes down to IP addresses: the request for the tracking pixel records the recipient’s IP address, which is how the internet determines where a computer is, both physically and on the network. Criminals sometimes exploit this to work out if a house is empty and ripe for looting.

The Outcome

The exposure Superhuman received in The New York Times quickly turned to controversy, but the company was swift to offer a fix. CEO Rahul Vohra posted an apology in a blog post, saying that, effective immediately, Superhuman would stop tracking locations, delete existing location information and turn off read receipts by default.

“When we built Superhuman, we focused only on the needs of our customers,” he wrote. “We did not consider the potential bad actors. I wholeheartedly apologize for not thinking through this more fully.

“We are removing location information in all read statuses for all emails sent with Superhuman, effective immediately. This will also apply to emails sent in the past.”

The trends emerging to create a more sustainable data centre industry

Harnessing technology to improve sustainability is increasingly becoming a top priority for businesses across all industries. A recent PwC study revealed that by using AI systems across their operations, global agriculture, transport, energy and water sectors could contribute to a 4% reduction of worldwide greenhouse gas emissions by 2030.

One industry emerging at the forefront of sustainability is the data centre industry. While driving energy efficiencies delivers benefits back to the business, it also has a positive impact on clients and, of course, the environment. Any good data centre business knows that taking steps to reduce its facilities’ environmental impact is essential, but how are the best ones already responding to this challenge?

A greener, more responsible cloud

Reducing the amount of electricity required to operate a data centre is an important step in improving sustainability. The Storage Networking Industry Association (SNIA) recently commented that electronics account for 5% of total global energy usage, a figure that will grow to at least 40% by 2030 unless companies make major advances in lowering electricity consumption.

Data centre cooling is one of the main energy costs, and it is rising along with the demand for data centre capacity. Developing in-house water-cooling systems is one way to address this: servers cooled by water rather than by energy-hungry air handling reduce energy consumption and optimise air flows. Because the water-cooling systems are built into racks with integrated heat exchangers and power distribution units, users need less than 10% overhead energy on top of server energy. A typical data centre, in contrast, needs between 40% and 100% more.
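Those overhead figures map directly onto the industry’s standard Power Usage Effectiveness (PUE) metric: total facility energy divided by IT equipment energy. A quick illustrative calculation (the kWh figures are made up for the example):

```python
def pue(it_energy_kwh: float, overhead_kwh: float) -> float:
    # Power Usage Effectiveness = total facility energy / IT equipment energy.
    # An ideal facility with zero overhead would score exactly 1.0.
    return (it_energy_kwh + overhead_kwh) / it_energy_kwh

# Rack-integrated water cooling: under 10% overhead, so PUE around 1.1
print(round(pue(1000, 100), 2))   # 1.1
# Typical facility: 40-100% overhead, so PUE between 1.4 and 2.0
print(round(pue(1000, 400), 2))   # 1.4
print(round(pue(1000, 1000), 2))  # 2.0
```

In other words, a conventional facility can spend as much energy on cooling and other overheads as on the servers themselves, while an integrated water-cooled design keeps that extra spend to a tenth of the IT load.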

Cloud migration service providers allow corporations not only to reduce operating costs and develop new uses, but also to shrink their environmental footprint compared with their legacy, generally less efficient facilities. Aggregating cloud computing needs through large hyperscale cloud service providers becomes part of the answer to controlling the rising energy consumption described above.

On top of that, we’re also seeing providers shift towards using hydro, wind and solar power as part of their energy supply. When locating sites, cloud services providers focus on low carbon energy supply areas, like hydroelectricity in the Quebec region. Alternatively, cloud providers can enter into agreements with energy suppliers to guarantee the origin is from renewable sources.

Simplifying data centre operations

Global cloud data centre traffic was 6 zettabytes (ZB) in 2016 and is projected to reach 19.5 ZB per year by 2021. With this in mind, data centre operators will always need to invest significant amounts in maintaining and improving their efficiency, and there is a more sustainable way to do just that.

For example, through a fully integrated industrial model, providers can build systems that are more energy efficient, and they should always strive to optimise the use of data centre and server resources across their customer base. This can be done through virtualisation, the re-use or recycling of servers, and greater use of green energy.

In addition, dedicated research teams can work alongside production to verify a product’s functionality and assess safety requirements before it is manufactured. Innovation that improves efficiency while using fewer resources supports continuous renewal, fostering agility and the creation of sustainable solutions, and ultimately better services for customers.

Looking ahead to a sustainable future

When it comes to managing and fitting out a data centre, it’s clear that sustainability needs to be top of mind. Those who jump on board with green energy consumption will be well positioned to grow their data centre into the future.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

It’s great to move to a DevOps approach – but have you told anyone else?

One of the key inhibitors to progress in DevOps is internal communication – or the lack of it – according to a new survey.

The report, from Trend Micro, polled 1,310 IT decision makers across enterprise and SMB levels and found almost nine in 10 (89%) are lobbying for greater communication between software development and IT security teams.

More than three quarters (77%) said developers, security and operations teams needed to be in closer contact. So where is the breakdown? Only one in three respondents said DevOps was a shared responsibility between software development and IT operations. Enhancing IT security is a greater priority than anything else in building out a DevOps strategy, respondents added.

In total, just under three quarters (74%) said DevOps initiatives had become more important over the past year. One in three (34%) admitted particular silos were making it harder to create DevOps cultures in their organisation.

“History of software development shows that the biggest and best process improvements never happen quickly due to the most valuable variable – people – who have existing behavioural patterns and cultural components,” said Steve Quane, executive vice president of network defence and hybrid cloud security at Trend Micro. “Organisations implementing a DevOps structure are going in a strong direction, but security cannot be forgotten during this transition.”

Writing for this publication earlier this month, Viktor Farcic, principal software delivery strategist at CloudBees, explored the ideal implementation and template. “DevOps is not about creating more silos, nor is it about renaming existing departments as ‘DevOps departments.’ Instead, it’s about people working together to accomplish a common goal: successful release of new features to production,” wrote Farcic.

“When everyone works as a single team focused on a single product, communication improves, the need for administrative overhead decreases, and ownership is established,” Farcic added. “Working together and understanding how one department’s actions affect others create empathy. As a result, productivity and quality increase, and costs and time to market decrease.”

Evidently, there is still some way to go before this nirvana is achieved.
