All posts by Jamie Davies

Data, data, data. The importance of backing up data

More often than not when browsing the internet each morning you’ll soon discover that today is, in fact, “Talk Like a Pirate Day”, or “Hug a Vegetarian Day”, or something equally humorous. Today, by contrast, is an awareness day which holds some use to the world as a whole.

World Backup Day encourages consumers to back up their family photos, home videos, documents and emails on more than one device. The World Backup Day website lists numerous ways in which a consumer’s data or documents can be lost; however, the day is also very applicable to the world of enterprise IT.

“The rapid increase in the amount of data that consumers and organisations store is one of the biggest challenges facing the backup industry,” says Giri Fox, Director of Technical Services at Rackspace. “Organisations aren’t always sure what data they should be keeping, so to make sure they don’t discard any important data they sometimes end up keeping everything which adds to this swell of data.

“For many companies, a simple backup tool is no longer enough to make sure all these company assets are safe and available, they need support in keeping up with the sheer scale of data and to fix problems when a valuable file or database goes missing.”

The volume of data being utilized (and in some cases not utilized) has grown astronomically, but to a certain degree security and employee behaviour have not kept pace with this growth. Cyber criminals always seem to be one step ahead of the enterprise when attempting to access data, but what is more worrying is the trend of employee indifference to IT security.

A recent survey highlighted that employee negligence and indifference to IT policy is one of the most significant inhibitors to cloud security, with only 35% of respondents stating that they use passwords at work.

“Over recent years, organisations have become far more aware of the importance of backing up their data and we’ve noticed the impact here at Rackspace, where currently we back up 120 PB per month globally,” adds Fox. “One of the main challenges for us is that businesses don’t just want to back up more data than ever before; they want it to be done quicker than ever before.

“Also, the process of doing so has become more complex than it used to be because companies are more conscious than ever of the compliance regulations they have to adhere to. Fortunately, with the development of deduplication techniques, we are now able to back up unique sections of data rather than duplicating large pools continuously, which has sped up the backup process.”
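
Deduplication of this kind typically works by splitting data into chunks, fingerprinting each chunk with a hash, and storing only the chunks whose fingerprints have not been seen before. The snippet below is a minimal, illustrative sketch of that idea in Python; it is not Rackspace’s implementation, and the fixed 4 KB chunk size and SHA-256 fingerprints are assumptions made purely for the example.

```python
import hashlib

CHUNK_SIZE = 4096  # assumed fixed-size chunks; real systems often use variable-size chunking


def deduplicated_backup(path, chunk_store, manifest):
    """Back up a file by storing only chunks not already present in chunk_store.

    chunk_store: dict mapping SHA-256 digest -> chunk bytes (stands in for the backup target)
    manifest:    list of digests recording how to reassemble this particular file
    """
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:      # only unique data is sent and stored
                chunk_store[digest] = chunk
            manifest.append(digest)            # the manifest lets us restore the file later


def restore(manifest, chunk_store, out_path):
    """Rebuild the original file from its manifest and the shared chunk store."""
    with open(out_path, "wb") as out:
        for digest in manifest:
            out.write(chunk_store[digest])
```

Because identical chunks are stored only once, repeated backups of largely unchanged data transfer just what is new, which is the speed-up Fox describes.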

Outside of employee indifference to cloud security, changes to EU-US data protection policy have highlighted the significance of data-backup and prevention of data loss. An instance of data loss could be crippling for an organization, whether it is financial penalties or the loss of information which could prove to be invaluable in the future.

“Initiatives like World Backup Day are a great way of highlighting the importance of backing up in an age where, as Ann Winblad put it, ‘data is the new oil’,” comments Fox.

In a world where data can be seen as one of the most important commodities to any business, the value of securing, backing up and encrypting data cannot be overstated. That said, the majority of the working world (outside of the IT department) does not appreciate the value of security, mostly not out of malice, but because they don’t know any better.

“In the post-Edward Snowden era we’re also seeing just how seriously companies are thinking about encryption. Many companies now want to make sure their backed up data is no longer just encrypted when it goes outside the four walls of a data centre, but inside it as well,” says Fox.
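
Encrypting data on the client before it ever reaches the backup target is one straightforward way to meet the “inside the four walls” expectation Fox describes. Below is a minimal sketch using the widely available Python cryptography library; the key handling (reading a key from an environment variable, or generating a throwaway one) is an assumption for illustration only, and a production system would rely on a proper key management service.

```python
import os
from cryptography.fernet import Fernet  # pip install cryptography

# Illustrative assumption: the key arrives via the BACKUP_KEY environment variable;
# a throwaway key is generated for demonstration if it is not set.
_KEY = os.environ.get("BACKUP_KEY") or Fernet.generate_key()
_CIPHER = Fernet(_KEY)


def encrypt_for_backup(src_path, dst_path):
    """Encrypt a file client-side so it stays protected in transit and at rest in the data centre."""
    with open(src_path, "rb") as src:
        token = _CIPHER.encrypt(src.read())   # authenticated encryption (AES plus HMAC under the hood)
    with open(dst_path, "wb") as dst:
        dst.write(token)


def decrypt_from_backup(src_path, dst_path):
    """Reverse the operation during a restore, using the same key."""
    with open(src_path, "rb") as src:
        data = _CIPHER.decrypt(src.read())
    with open(dst_path, "wb") as dst:
        dst.write(data)
```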

Cloud-based data backup solutions increasing in popularity – survey

Research from Kroll Ontrack has shown that cloud-based backup solutions are increasing in popularity, though hardware-based options still account for the majority.

The survey, which was only open to participants who have experienced the loss of valuable data, highlighted that 51% of respondents are still using hardware-based options, though this figure is down from 68% in 2015. Cloud-based solutions are currently being considered by 23%, an increase from 18% over the last 12 months.

What could cause concern within the industry is that, at the time of data loss, 86% of the respondents said they did have a backup in place, and 48% highlighted they back up their data on a daily basis. If these statistics are to be believed, why is data being lost on such a regular basis? 22% stated the backup was not operating correctly, 21% said the device was not included in backup procedures, and 21% commented the backup was out of date.

“It’s no longer enough to have a backup solution where you just hope for the best,” said Robin England, Senior Research & Development Engineer, Kroll Ontrack. “As our survey results indicate year after year, conducting backups is just one step in an overall backup strategy.”

While security and data protection appear to be at the top of the agenda for most organizations, it would seem human indifference and negligence, as well as a shortage of resources, are not backing up company claims. A number of organizations have recently cited the relaxed approach to security demonstrated by employees as one of the main challenges for enterprise organizations.

The statistics also back this point up, as 54% of respondents highlighted they did not have the time to research and administer an effective backup solution. While the time factor is a significant barrier here, 24% of respondents said the cost was prohibitive, down from 31% in 2015. Combined with the statistic that the number of respondents who do daily backups increased by six percentage points over the same period, the findings could imply that enterprise organizations are taking the process of data backup more seriously.

“Storage devices pack more and more data into smaller and more complex systems,” said England. “This not only requires IT teams to dedicate significant time to actually back up the data, but requires even more time to verify the backups worked properly. IT teams face a challenging balancing act when ensuring all of this is managed effectively.”
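
The verification step England mentions can be as simple as comparing checksums of source files against the copies that were actually written, and flagging anything missing or stale. The script below is a minimal sketch of that kind of check, assuming the source and backup directories sit on paths the script can read; it is an illustration of the idea rather than a substitute for a proper backup product, and the one-day freshness policy is an assumed value.

```python
import hashlib
import os
import time

MAX_AGE_SECONDS = 24 * 3600  # flag backup copies older than a day (assumed policy)


def sha256_of(path):
    """Stream a file through SHA-256 so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()


def verify_backup(source_dir, backup_dir):
    """Report files that are missing from the backup, differ in content, or are out of date."""
    problems = []
    for root, _, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.exists(dst):
                problems.append((rel, "not included in backup"))
            elif sha256_of(src) != sha256_of(dst):
                problems.append((rel, "backup differs from source"))
            elif time.time() - os.path.getmtime(dst) > MAX_AGE_SECONDS:
                problems.append((rel, "backup is out of date"))
    return problems
```

The problems it reports loosely mirror the reasons for loss given by the survey respondents: backups that did not work correctly, files never included in the backup, and backups that are out of date.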

While the statistics are encouraging, it would still appear that human error and a lack of centralized oversight are the underlying causes for data loss.

Socitm outlines concerns for local government ahead of new data protection regulations

The Society of Information Technology Management (Socitm) has stated that local government bodies should review all information governance arrangements in light of changes to EU-US data protection policies.

In its latest briefing, Data protection: <Control><All><Delete>?, Socitm has recommended that all IT professionals update their information, security and data protection policies, as councils could face difficulty in remaining compliant under the new legislative framework.

Data protection has been a hot topic in recent months, following the European Court of Justice striking down the Safe Harbor agreement last year, as well as criticisms of its replacement, the EU-US Privacy Shield. “Legal action in the wake of the Snowden revelations challenged the degree of protection for citizens’ data provided by Safe Harbor,” Socitm said in the statement. “New measures giving foreigners’ data some legal protection have been put in place, but it is not yet known whether the European authorities will consider that US privacy protection is now adequate.”

In recent weeks, privacy activist Max Schrems, who has been linked to the initial downfall of Safe Harbor, said in a statement reacting to Privacy Shield: “Basically, the US openly confirms that it violates EU fundamental rights in at least six cases. The commission claims that there is no ‘bulk surveillance’ any more, when its own documents say the exact opposite.”

Socitm said in the statement that the new European Data Protection Regulation will also update data protection laws in the UK, which currently don’t account for new technologies. The UK’s Data Protection Act was written in 1998, several years before the launch of social media platforms Facebook and Twitter, as well as the surge in data usage from both consumers and enterprises. Socitm stated that councils could be left in a vulnerable position when the regulations are brought in officially.

The regulations, a draft of which was released in January, stated that data protection legislation would have to be updated for the digital age, that consumers would have to have access to their own data to understand how and where it is utilized, and that security standards for an individual’s data would be increased.

The fear here seems to be focused on the volume of changes that would need to be enforced once the new regulations are in place. It would appear Socitm is concerned that local councils will not be able to keep pace, leaving them in a non-compliant and susceptible position.

“Accommodating the changes will be a matter of amending existing processes rather than inventing new ones,” said Dr Andy Hopkirk, Head of Research at Socitm. “Some of the changes could be onerous and problematic. For example, councils will need to be able to deal correctly and completely with ‘right to be forgotten’ requests – perhaps the single greatest challenge in an almost ubiquitously networked and distributed computing world.”

Microsoft pushes forward with AI despite Tay set-back

Microsoft has announced a number of updates for its advanced analytics and machine learning offerings as part of its ‘Conversation-as-a-Platform’ push.

Despite the company facing criticism after its Twitter-inspired PR stunt Tay backfired last week, it has pushed forward within the artificial intelligence space, updating its Cortana Intelligence Suite and releasing its Skype Bot Platform.

“As an industry, we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence,” said Satya Nadella, CEO at Microsoft, at the Build 2016 event. “At Microsoft, we call this Conversation-as-a-Platform, and it builds on and extends the power of the Microsoft Azure, Office 365 and Windows platforms to empower developers everywhere.”

The Cortana Intelligence Suite, formerly known as Cortana Analytics Suite, is built on the company’s on-going research into big data, machine learning, perception, analytics and intelligent bots. The offering allows developers to build apps and bots which interact with customers in a personalized way, but also react to real-world developments in real-time.

Microsoft also announced two new additions to the suite: Microsoft Cognitive Services, formerly known as Project Oxford, and the Microsoft Bot Framework, both of which are still in preview.

The first, Microsoft Cognitive Services, has 22 APIs available for developers, including emotion detection, speech analysis and the Custom Recognition Intelligent Service. The Face API made headlines last year, as the results of an app which estimated user ages were highly varied.

At the time, the team highlighted that “the age and gender-recognition features are labelled as experimental features” and also said that, despite the mistakes the app made, the fact that it trended on Twitter meant the volumes of data collected would aid the company in refining the technology over time.
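
The Cognitive Services APIs are exposed as REST endpoints that take an image (or a piece of text) plus a subscription key and return JSON. The sketch below shows the general calling pattern for face detection using Python’s requests library; the endpoint placeholder and the attributes requested are assumptions based on the preview-era documentation, so treat it as illustrative rather than a definitive client.

```python
import requests  # pip install requests

# Placeholders - the real endpoint region and key come from an Azure subscription.
FACE_ENDPOINT = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "<your-subscription-key>"


def detect_faces(image_path):
    """Send an image to the Face API and return the parsed JSON list of detected faces."""
    headers = {
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/octet-stream",    # raw image bytes in the request body
    }
    params = {"returnFaceAttributes": "age,gender"}    # the attributes the age-guessing app relied on
    with open(image_path, "rb") as f:
        response = requests.post(FACE_ENDPOINT, headers=headers, params=params, data=f.read())
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    for face in detect_faces("photo.jpg"):
        print(face.get("faceAttributes"))
```

The other Cognitive Services endpoints broadly follow the same pattern of a subscription-key header, a small payload and a JSON response.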

The second, Microsoft Bot Framework, can be used with any programming language, enabling developers to build intelligent bots which can converse with customers on a variety of platforms including text/SMS, Office 365 and the web. The bots can also connect to social channels such as Twitter and Slack. The company claims that the bots can be used in a number of complex scenarios, though only simple ones, such as ordering a pizza or booking a hotel room, have been demoed so far.
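
Under the hood the framework follows a simple request/response pattern: the channel (Skype, Slack, SMS and so on) posts each incoming message to a web endpoint the developer hosts, and the bot replies with a message of its own. The sketch below illustrates that pattern with a plain Flask endpoint; it is not the Bot Framework connector API itself, and the route, field names and toy pizza-ordering logic are assumptions for illustration.

```python
from flask import Flask, request, jsonify  # pip install flask

app = Flask(__name__)


def reply_to(text):
    """Toy dialogue logic standing in for a real language-understanding model."""
    text = text.lower()
    if "pizza" in text:
        return "Sure - what toppings would you like on your pizza?"
    if "hotel" in text:
        return "I can help with that. Which city and which dates?"
    return "Sorry, I only know how to order pizza or book a hotel room so far."


@app.route("/api/messages", methods=["POST"])   # hypothetical endpoint the channel would call
def messages():
    activity = request.get_json(force=True)      # incoming message as JSON
    user_text = activity.get("text", "")
    return jsonify({"type": "message", "text": reply_to(user_text)})


if __name__ == "__main__":
    app.run(port=3978)  # a port commonly used for local bot development
```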

The company also announced the launch of its Skype Bot Platform, which enables developers to build bots that interact with customers through Skype’s multiple forms of communication, including text, voice, video and 3D interactive characters. The preview bots are very simple and limited for the moment, however once they are combined with the Cortana Intelligence Suite there could be potential for the bots to appear more human.

While it is early days for the Microsoft AI platforms, the team is riding the waves of both positive and negative headlines and seemingly leading the industry in the AI space. The company’s competitors are also pushing hard in the AI world, though the weight behind this week’s announcements could imply that Microsoft is investing in a more serious manner than others in the industry.

Korean government prioritizes growth of cloud computing

The Korean government has announced a new policy to accelerate the adoption of cloud computing in the country, according to Business Korea.

Speaking at a cloud computing conference in Korea, the Ministry of Science, ICT and Future Planning announced that it will be running a number of initiatives to increase the adoption of cloud computing from 6.4% to 13%, seemingly over the next twelve months. Over the same period, the government also plans to increase the number of Korean cloud companies from 353 to 500, as well as growing private cloud adoption in public institutions to at least 3%.

The Korean government has estimated that should the new initiatives be successful the domestic cloud market could be worth in excess of 1.1 trillion won, roughly £670 million. To support the growth of the industry, the government will also build a cloud computing support centre in Daegu City, which will provide guidance for public institutions who are making the transition.

While the government has laid bare its intentions for the industry, it has not stated how cloud computing is currently perceived by enterprise. The government estimates that 6.4% of businesses in Korea currently utilize the cloud, whereas in the UK the figure is generally viewed as much higher; it has recently been estimated that 93% of enterprises in the UK have adopted the cloud.

In what could be seen as a move to encourage enterprise appetite for the cloud, the government has invited enterprises in need of cloud computing in various industries to join the deregulation task force currently led by IT firms in the private sector.

Alongside this announcement, the government has also prioritized the growth of SMEs through the adoption of cloud. In what appears to be a move to emulate companies such as Uber and AirBnB, the Ministry of Science, ICT and Future Planning will work in collaboration with the Center for Creative Economy &amp; Innovation to provide cloud software and infrastructure to smaller organizations that could not otherwise afford the technology.

In terms of international expansion of the Korean cloud computing industry, the government will once again provide assistance, with a particular focus on the Software-as-a-Service market. It believes SaaS is where the country has the greatest opportunity to compete on an international scale, as there is currently no outright market leader. It also believes the country is in a good position to capitalize on the growing Infrastructure-as-a-Service market in South East Asia.

The success of these cloud initiatives could partly depend on how well the government engages enterprises in the country and builds appetite for a technology whose adoption rate remains low in comparison to other nations.

Brocade makes play for DevOps market with StackStorm acquisition

Networking vendor Brocade has acquired StackStorm, a start-up that builds software for automating datacentre operations.

StackStorm, which describes itself as an organization defined by the DevOps ideology, said on its blog that it will be joining Brocade to help accelerate the company’s efforts to bring DevOps-style, scalable, open-source automation to Brocade’s networking solutions. This is one of the first moves Brocade has made to capitalize on the growing DevOps trend within the industry.

The company was launched in 2013 and left stealth mode in May 2014, promoting itself as a company that can streamline datacentre operations. The model is focused on incorporating a DevOps ideology into the datacentre and automating common tasks; StackStorm claims it can help companies run their facilities like Facebook, where a single person can be responsible for tens of thousands of servers, not just a couple of hundred.
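
StackStorm’s model is event-driven: sensors watch for events, rules match those events against criteria, and matching events trigger actions such as remediation scripts or API calls. The sketch below illustrates that sensor-rule-action loop in plain Python as a conceptual illustration only; it is not StackStorm’s actual API, and the disk-usage trigger and clean-up action are assumed examples.

```python
import shutil
import time


# --- action: what to do when a rule fires (stands in for a StackStorm "action") ---
def purge_tmp_files(payload):
    print(f"Disk on {payload['mount']} at {payload['used_pct']}% - purging temporary files")
    # a real action would call out to a runbook, API or configuration-management tool here


# --- rules: matching criteria plus the action to run (stands in for StackStorm "rules") ---
RULES = [
    {"criteria": lambda event: event["used_pct"] > 90, "action": purge_tmp_files},
]


# --- sensor: emits events describing the state of the system (a StackStorm "sensor") ---
def disk_usage_sensor(mount="/"):
    usage = shutil.disk_usage(mount)
    return {"mount": mount, "used_pct": round(100 * usage.used / usage.total, 1)}


def run_forever(poll_seconds=60):
    """Poll the sensor and dispatch any events whose criteria match - no human in the loop."""
    while True:
        event = disk_usage_sensor()
        for rule in RULES:
            if rule["criteria"](event):
                rule["action"](event)
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_forever()
```

In StackStorm itself the rules are declared as configuration rather than code and the actions are packaged into reusable “packs”, which is what allows the automation to scale across a large estate.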

On the Brocade blog, PG Menon, Senior Director of Technology &amp; Strategy for Switching, Routing and Analytics, said: “Using StackStorm technology, Brocade customers will be able to bring DevOps methods to networking as well as experience many of the benefits of scale-out IT automation enjoyed by the Cloud Titans.

“Simply put, achieving business agility through DevOps methods for IT automation that also includes networking is no longer limited to Cloud Titans. Every IT shop will be able to realize those same benefits.”

While DevOps is seen as one of the strongest-growing trends within the cloud industry, Brocade is building its business case on the fact that the use of DevOps is limited to tech giants at the top of the ladder such as Amazon and Facebook. The company aims to deliver the same agility to smaller organizations that cannot command the software manpower of the industry’s major players in designing and delivering DevOps-enabled business agility.

Under Brocade, the StackStorm technology will be extended to networking, and new integrations will be developed for automation across IT domains such as storage, compute and security. The StackStorm team also highlighted in its blog that it anticipates investment from Brocade to increase the size of its team over the coming months.

Containers and microservices starting to enter mainstream market – survey

A recent survey from NGINX highlighted that containers and microservices are two buzzwords starting to enter the mainstream market as companies target daily software releases.

While daily software releases are the ultimate goal within the industry, 70% of respondents highlighted that they are currently releasing new code only once a week, with only 28% hitting the daily-release target. Barriers cited were a lack of automation tools, a constant trade-off between code quality and expected speed of delivery, and a lack of central management, accountability and collaboration tools.

Containers are now seemingly beginning to enter the mainstream, as 69% of respondents said they were either actively using or investigating the technology. Of the 20% using containers in production, more than a third are running more than 80% of their workloads on containers, and more than half use them for mission-critical applications. The technology is also creating a new buyer audience for vendors, as 74% of respondents said developers were responsible for choosing development and delivery tools, as opposed to heads of departments or managers.

Microservices tell a slightly different story: while adoption levels are similar, at approximately 70% currently using or investigating, the trend is weighted more towards small and medium organizations rather than the blue chips. Of the larger organizations, 26% are researching the technology, 36% are currently using it in development or production, and 38% aren’t using microservices at all.

The survey also demonstrated that AWS continues to dominate, accounting for 49% of market share. Despite Google and Microsoft Azure grabbing headlines recently with a number of new client wins, acquisitions and product releases, the market seemingly still favours AWS, with respondents highlighting an accessible price point as one of the most important factors when selecting a cloud provider.

Continuous integration and continuous delivery are becoming development best practices, as 27% of respondents would now consider their organization to have a mature practice for continuous integration and delivery. At the other end of the scale, roughly a third said they were keen to move forward with continuous integration and delivery but the necessary level of collaboration or understanding is not yet widespread in their organizations.

While the survey does demonstrate that the adoption of cloud-first technologies such as containers is moving beyond the early-adopter stage, it will be some time before such technologies become commonplace in large-scale organizations, where the wheels are slower to turn. Like the cloud business model, containers and microservices seem to offer small and medium-sized operations the opportunity to compete with larger organizations’ budgets through technology innovation, agility and speed of deployment.

NTT Data to acquire Dell Services for $3.06 billion

Japan’s NTT Data is to acquire Dell’s IT Services business for $3.06 billion, in an effort to bolster its footprint in the North American region.

The announcement confirms speculation over recent months as to the future of the IT Services division, as Dell has been rumoured to be searching for a buyer for the business unit to aid financing of the EMC deal. Dell Services was initially formed through the acquisition of Perot Systems in 2009 for $3.9 billion. The new agreement with NTT Data will see Dell absorb an $800 million loss on the division and could indicate that financing the EMC acquisition is more difficult than initially expected.

In December, BCN reported Dell had been facing challenges in financing one of the biggest financial deals in history. For the $63 billion EMC acquisition to proceed, Dell has had to reduce its levels of debt with the Perot Systems business unit rumoured to be a favourite for sale.

Dell Services will initially remain under the leadership of Suresh Vaswani, its current President, who will continue to report to Dell CEO Michael Dell until the completion of the deal. It is believed that as part of the acquisition NTT Data will take on 28,000 Dell employees, though the future leadership of the business has not been confirmed.

“NTT Data is pleased with the unique opportunity to acquire such high-calibre talent, and a corporate culture that shares common values with NTT Data, with emphasis on client first, foresight, teamwork and a commitment to innovation,” said Toshio Iwamoto, President and CEO of NTT Data Corporation. “Welcoming Dell Services to NTT DATA is expected to strengthen our leadership position in the IT Services market and initiates an important business relationship with Dell.”

NTT Data’s acquisition of the IT services division is the largest by the company to date and continues to bolster its North American footprint. Revenues for NTT Data in overseas markets have more than doubled since 2011, and in the same period the company has spent more than $600 million on acquisitions. The company has prioritized growth in the North America region, primarily targeting lucrative contracts in the healthcare, banking, financial services and insurance sectors.

Since 2011 NTT Data has been proactive in bolstering its overseas business with a number of acquisitions throughout the world. In Europe it acquired companies including Everis and Value Team, in North America Optimal Solutions Integration and Carlisle & Gallagher were added, whereas iPay88 increased the company’s footprint in Malaysia.

“There are few acquisition targets in our market that provide this type of unique opportunity to increase our competitiveness and the depth of our market offerings,” said John McCain, CEO of NTT Data. “Dell Services is a very well-run business and we believe its employee base, long-standing client relationships, and the mix of long term and project-based work will enhance our portfolio.”

The deal could indicate that financing the EMC agreement has proved more difficult than initially expected. Dell Services as a business unit had reportedly been valued in the region of $5 billion, which could highlight Dell’s urgency in completing the sale. If the reports are correct, it would appear NTT Data has negotiated a good deal.

IBM launches brain-inspired supercomputer with Lawrence Livermore National Laboratory

IBM and Lawrence Livermore National Laboratory have launched a new project to build a brain-inspired supercomputing platform for deep learning inference.

The project will be built on IBM’s TrueNorth chip, which the company claims will process the equivalent of 16 million neurons and 4 billion synapses while consuming the energy equivalent of a tablet computer. The neural network design of IBM’s Neuromorphic System aims to handle complex cognitive inference tasks, such as pattern recognition and integrated sensory processing, in a far more economical manner than current chips.
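
Neuromorphic chips like TrueNorth compute with spiking neurons: each neuron integrates incoming weighted spikes, fires when its membrane potential crosses a threshold and then resets, which is part of why such chips can run large networks at very low power. The snippet below is a conceptual leaky integrate-and-fire simulation in Python to illustrate that style of computation; it is not IBM’s TrueNorth programming model, and the weights, threshold and leak values are arbitrary.

```python
import numpy as np


def simulate_spiking_layer(input_spikes, weights, threshold=1.0, leak=0.9):
    """Simulate one layer of leaky integrate-and-fire neurons.

    input_spikes: (steps, n_inputs) array of 0/1 spikes arriving at each time step
    weights:      (n_inputs, n_neurons) synaptic weights
    Returns a (steps, n_neurons) array of 0/1 output spikes.
    """
    steps, _ = input_spikes.shape
    n_neurons = weights.shape[1]
    potential = np.zeros(n_neurons)           # membrane potential of each neuron
    output = np.zeros((steps, n_neurons))
    for t in range(steps):
        potential = leak * potential + input_spikes[t] @ weights   # integrate weighted spikes, with leak
        fired = potential >= threshold                             # neurons whose potential crossed threshold
        output[t] = fired
        potential[fired] = 0.0                                     # reset neurons that fired
    return output


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    spikes = (rng.random((100, 16)) < 0.2).astype(float)   # random input spike trains
    w = rng.normal(0.3, 0.1, size=(16, 4))
    out = simulate_spiking_layer(spikes, w)
    print("output spike counts per neuron:", out.sum(axis=0))
```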

“The delivery of this advanced computing platform represents a major milestone as we enter the next era of cognitive computing,” said Dharmendra Modha, Chief Scientist for Brain-inspired Computing at IBM Research. “We value our relationships with the national labs. In fact, prior to design and fabrication, we simulated the IBM TrueNorth processor using LLNL’s Sequoia supercomputer. This collaboration will push the boundaries of brain-inspired computing to enable future systems that deliver unprecedented capability and throughput, while helping to minimize the capital, operating and programming costs – keeping our nation at the leading edge of science and technology.”

The technology itself will be utilized in a number of different ways within the National Nuclear Security Administration (NNSA), including the organization’s Stockpile Stewardship Program, a program of reliability testing and maintenance of its nuclear weapons without the use of nuclear testing.

“Neuromorphic computing opens very exciting new possibilities and is consistent with what we see as the future of the high performance computing and simulation at the heart of our national security missions,” said Jim Brase, Livermore National Laboratory’s Deputy Associate Director for Data Science. “The potential capabilities neuromorphic computing represents and the machine intelligence that these will enable will change how we do science.”

While Artificial Intelligence has been one of the more prominent trends in the cloud computing world, the success of the technology, and of the PR stunts launched around it, has been varied.

AlphaGo is an example of the success of AI: Google DeepMind’s AI program beat world Go champion Lee Se-dol in a five-match series. As traditional machine learning techniques could not be applied in this instance, the team combined an advanced tree search with deep neural networks, allowing the program to readjust its behaviour through reinforcement learning. The win came as a surprise to commentators, as Go relies heavily on intuition and feel.

On the opposite end of the spectrum, Microsoft has had to release an apology after its Twitter-inspired AI stunt backfired. The program tweeted controversial comments as it was unable to grasp the politically incorrect nature of the messages it received from users, as reported by the Independent.
