All posts by Jamie Davies

NetApp takes Q4 hit but predicts turnaround in 2018


NetApp CEO George Kurian

NetApp has reported a year-on-year decline in its Q4 earnings call, though CEO George Kurian remains confident the company will return to moderate revenue growth in 2018.

Revenue for Q4 was $1.38 billion, down from mid-year estimates of $1.425 billion, as the company reported an $8 million loss. Q3 saw the company report a profit of $135 million, while Q4 2015 brought in a $153 million profit. Annual revenue was also down 9% year-on-year to $5.6 billion, with profit falling 59% from $560 million in 2015 to $229 million. Although investors have not reacted positively to the news, the $842 million acquisition of SolidFire hit the books during the period, taking some of the heat off the company’s performance.

“As we discussed last quarter, we’re making fundamental changes to return the Company to revenue growth with improved profitability, cash flow, and shareholder returns,” said Kurian. “To deliver on this commitment, we’re executing a comprehensive and sustained transformation.”

The company has been undertaking a wide-ranging transformation project in recent months, moving the focus away from the traditional portfolio, which declined roughly 40% year-on-year, and towards the strategic growth initiatives, which grew 21% year-on-year. The strategic initiatives now account for 53% of net product revenue.

Priorities for the business will continue to focus on the delivery of the strategic initiatives, including clustered Data ONTAP, branded E-Series and all-flash arrays, as well as the introduction of the next generation of ONTAP in the coming weeks. The team claim the new offering will simplify customers’ IT transformations to modern data centres and hybrid cloud environments, while also giving customers greater flexibility when it comes to engineered systems, software-defined storage, or cloud.

Kurian stated during the call he expects another tough year for fiscal 2017, though the company should be in a position to turn the corner in fiscal 2018, prioritizing the hybrid cloud market, as well as streamlining costs within the organization.

“When I took over as CEO, NetApp was dealing with several internal challenges,” said Kurian. “We were late to the All-Flash-Array market. We were not prepared to assist our installed base of customers in migrating to clustered ONTAP, and we had limited traction in the hybrid cloud.

“Heading into fiscal year ’17, our momentum with customers is accelerating. Data is at the heart of our customers’ IT transformation efforts and this is where NetApp has a profoundly important role to play. Our strategic relevance to customers’ digital transformation roadmaps is evidenced by the growth of our strategic solutions. We’re making meaningful progress, but still have work ahead of us and remain focused on execution. I remain highly confident in NetApp’s potential.”

Under Kurian’s leadership, NetApp has seemingly been forced into a wide-ranging transformation project to remain relevant in current and future market conditions. By its own admission, NetApp was not ready for the cloud-oriented world and was too focused on legacy products; however, the team has outlined a roadmap to drive the company back into growth.

Firstly, the team is positioning the clustered ONTAP offerings, as well as SolidFire, to reinforce the company’s position as a technology supplier to the cloud, covering both service providers and enterprise organizations managing private cloud environments. Kurian claims the company leads the way in enterprise storage and data management technology for OpenStack cloud deployments.

Within the hyperscale segment, the company reported healthy growth for product offerings which combine hyperscale and cloud computing environments. Kurian noted that while it is early days, the company is making progress in carving out market share in the segment.

Finally, Kurian highlighted that there have been a number of examples throughout the year of companies transitioning data back to on-premises platforms from public cloud environments. The team believe the current offering positions them well to capture those workloads as they transition.

Although Kurian has put a positive spin on the year-on-year declines, outlining in depth the company’s strategy to return to prominence within the newly defined market, the market has reacted slightly differently. Shares fell almost 7% in after-hours trading.

CIOs prioritize collaboration to increase security – Intel

Intel Security has released new findings which claim CIOs are targeting collaboration as a means to shore up defences against cyber threats.

Respondents to the survey believe their own organizations could be between 38% and 100% more secure if threat management and incident response personnel and systems could simply collaborate better. The team believe collaboration is one area which is often overlooked, with decision makers often favouring new threat detection or prevention tools, even though security operations’ effectiveness can be increased through better collaboration between silos within the organization.

“Threat management contributions are almost evenly spread among different roles, but there are some notable areas of specialization,” the company stated in its “How Collaboration Can Optimize Security Operations” report. “Every handoff or transition can add significant operational overhead—along with the potential for confusion and chaos and delays in responding. But, on the upside, there is also huge potential for collaboration and increased efficiencies.”

The report states CIOs are still prioritizing new tools as a means to shore up their own perimeters, though collaboration technologies were not far behind in the rankings. 40% of respondents highlighted their spend would be prioritized on better detection tools, 33% pointed towards preventative tools and 32% said improved collaboration between SOC analysts, incident responders and endpoint administrators.

One of the main challenges for these organizations is the process, accuracy and trust in communication. For a number of organizations, data is shared manually and potentially reprocessed several times, increasing the possibility of inaccuracy. Automated collaboration tools ensure data is shared quickly and accurately across an array of different functions and responsibilities. “Trust arises from good communication, transparency, and accountability, all of which engender confidence in the outcome,” the report states.
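The report stops short of prescribing specific tooling, but the underlying idea is straightforward: once an incident is captured as a single structured record, each team appends to it rather than re-keying details by hand. The sketch below is purely illustrative, with hypothetical field names and workflow, not a description of any particular product.

```python
# Minimal sketch of a shared incident record: detection, SOC and response
# tooling all read and append to one structured object, so nothing is retyped.
# Field names and workflow here are hypothetical.
import json
from datetime import datetime, timezone

def new_incident(source_tool, indicator, severity):
    """Create a normalized incident record any downstream tool can consume."""
    return {
        "id": "INC-0001",                      # placeholder identifier
        "created": datetime.now(timezone.utc).isoformat(),
        "source_tool": source_tool,
        "indicator": indicator,
        "severity": severity,
        "timeline": [],                        # each tool appends its actions here
    }

def add_action(incident, tool, action):
    """Append an action so every team sees the same, single version of events."""
    incident["timeline"].append({
        "time": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "action": action,
    })
    return incident

incident = new_incident("ids-sensor", "203.0.113.7", "high")
add_action(incident, "soc-triage", "confirmed malicious, escalated to IR")
add_action(incident, "endpoint-mgmt", "host quarantined")
print(json.dumps(incident, indent=2))
```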

The number of tools being used within these organizations is also a challenge, as data is often transferred between tools, or collected centrally, by hand. The average number of tools companies use to investigate and close an incident is four, though 20% of respondents said they can use up to 20 different products to achieve the same aims, further increasing the challenge. Though larger and more geographically diverse organizations will by definition use more tools, the same principles of collaboration and automation apply, and in theory could increase the security of an organization’s perimeter.

“Tougher new EU data privacy regulations, which are currently in the process of being modernized, will be implemented in 2017,” said Raj Samani, EMEA CTO for Intel Security, in the report. “Organizations will be legally required to implement a security architecture that ensures a secure and trustworthy digital exchange of data throughout the EU. Data privacy needs to be assured at every level and across the entire infrastructure. In light of that, improved incident investigation and response processes that bring together collaborative tools and teams are imperative.”

While most organizations are answering more advanced cyber threats with the implementation of more advanced defence solutions, collaboration could be seen as a complementary measure. Collaboration can contribute to real-time visibility for various teams, improve execution capabilities, as well as speed of response.

Salesforce to run some core services on AWS

Salesforce has announced it will run some of its core services on AWS in various international markets, as well as continuing to invest in its own data centres.

The announcement comes two weeks after the company experienced a database failure on the NA14 instance, which caused a service outage lasting 12 hours for a number of customers in North America.

“With today’s announcement, Salesforce will use AWS to help bring new infrastructure online more quickly and efficiently. The company will also continue to invest in its own data centres,” said Parker Harris, on the company’s blog. “Customers can expect that Salesforce will continue to deliver the same secure, trusted, reliable and available cloud computing services to customers, regardless of the underlying infrastructure.”

While Salesforce does not appear to have suffered any serious negative impact from the outage, the move could be seen as a means to rebuild trust in its robustness, leaning on AWS’ brand credibility to provide assurances. The move would also give the Salesforce team options should another outage occur within its own data centres. The geographies this announcement will apply to had not been announced at the time of writing.

Sales Cloud, Service Cloud, App Cloud, Community Cloud and Analytics Cloud (amongst others) will now be available on AWS, though the move does not mean Salesforce is moving away from its own data centres. Investment will continue, as this appears to be a failsafe for the business. In fact, Heroku, Marketing Cloud Social Studio, SalesforceIQ and IoT Cloud already run on AWS.

“We are excited to expand our strategic relationship with Amazon as our preferred public cloud infrastructure provider,” said Salesforce CEO Marc Benioff. “There is no public cloud infrastructure provider that is more sophisticated or has more robust enterprise capabilities for supporting the needs of our growing global customer base.”

Salesforce’s SMB business leader talks data analytics, AI and the age of entrepreneurship


Sanj Bhayro, SVP EMEA Commercial at Salesforce

While the business world has traditionally favoured the biggest and the richest, cloud as a technology is seen as the great equalizer. Through the transition to the cloud, SMBs are being empowered to take on their enterprise rivals, with the number of wins growing year-on-year.

This, according to Salesforce’s Sanj Bhayro, is one of the most exciting trends we’re now witnessing in business throughout the world. Bhayro currently leads the EMEA SMB business at Salesforce, and for almost 11 years has been part of the team which has seen the power of intelligent CRM systems grow backroom businesses into industry giants. Just look at the growth and influence of companies such as Uber and Airbnb for justification of his claims.

“The SMB business in Salesforce is one of the most exciting, because we get to work with really innovative companies,” said Bhayro. “All the innovation in the industry is coming from these small to medium sized businesses. They are disrupting the traditional market which is in turn forcing the traditional players to transform their own business models.

“Something which is interesting from our perspective at Salesforce is that when we started 17 years ago the internet wasn’t that prevalent, the cloud wasn’t a word that was used that often, and it was the SMB companies who adopted our technology. The cloud offered them the operational efficiency, the scale and the reach to take on these traditional players. These smaller organizations are looking more and more towards technology as the enabler for innovation.”

The majority of SMBs could be considered too small to drive innovation in-house. For the most part, the IT department is small and responsible for ‘keeping the lights on’; working through the cloud has enabled innovation and created opportunities for these organizations. And the ability to be innovative is often much more prominent in smaller organizations.


The fail-fast business model is one which has captured the imagination of numerous enterprise organizations around the world. Amazon CEO Jeffrey Bezos recently claimed the fail-fast model was the catalyst for recent growth within the AWS business, though the majority are seemingly struggling to implement the right culture which encourages learning and innovating through failing. For the majority, failure is simply failure, not part of the journey to success.

But this in itself is one of the ways in which the smaller, more agile organizations are innovating and catching up with enterprise-scale businesses. The implementation of cloud platforms speeds up the failures and lessens their negative impact on the business, further driving the journey to innovation.

“For start-ups and early stage companies, failing is an accepted mentality. How many companies are actually the same as when they started? They failed, learned and then progressed. As businesses become bigger and bigger it becomes a lot more difficult. Certainly for larger companies there is a lot more friction around the fail-fast model. Smaller companies are culturally set up to allow them to pivot and try new things, whereas larger ones, purely because of their size, are constrained.”

Outside of the SMB team, Salesforce engineers have been prioritizing the use of artificial intelligence for future product launches and updates. This was reinforced during the company’s quarterly earnings call in recent weeks, as CEO Marc Benioff backed AI as the next major growth driver. While there is potential for AI in the SMB marketplace, for the moment it is only for those who are ahead of the curve.

For the most part, data analytics is starting to trickle down into smaller organizations, though there is still a substantial amount of data which is not being utilized. For Bhayro, as the concept of the cloud is now ubiquitous, the opportunities are almost limitless, but only once these organizations have got on top of managing their own data and broken down the silos within the business.

“AI translates well into the SMB business model and it will be the SMBs who drive where AI goes,” said Bhayro. “There are generally two camps when it comes to the SMB market: those who are cloud-native and capitalizing on the sharing economy, and those who are more traditional organizations. The shift that the traditional business has to make to break down the silos, and to move towards a cloud back-end, is far more difficult than for a company like Deliveroo who started in the cloud and can scale. Nevertheless, that shift has to be made.”

“So much data is being created and there’s so much that you can do with it. The problem is that so many companies are not doing enough with their data. Recent reports stated that most companies can only analyse 1% of their data. Even before we start moving towards AI technologies, the way we service intelligence is through insight. We need to provide the right tools to make data available and malleable to everybody in your business. These data analytics tools are the first steps, and then we can look forward to AI technologies.”

The UK government has made numerous schemes available to SMBs to encourage the growth of this subsector in recent years, and Bhayro believes these efforts have been paying off in the international markets.

“I’m delighted to say that the UK takes a leadership position (in relation to SMB growth and innovation in comparison to the rest of Europe),” said Bhayro. “Something in the region of 95-96% of the companies in the UK are SMBs, and the government is currently doing the right things to encourage and propel entrepreneurs. I think we’re in the time of entrepreneurship, and this is the time for people to have the vision and grow. These companies are having wonderful ideas, and they are moving into the growth period, but it’s the customer experience which really differentiates them from the competition. Not many of these companies are set up to achieve customer experience objectives, but this is where we (Salesforce) come in.”

Employees not taking advantage of mobility initiatives – survey

Despite mobility being one of the top priorities for organizations throughout the world, research from IDC has shown only 13% of those who are given the option actually work from home.

Enterprise mobility has proved to be one of the more prominent trends emerging out of the evolution to cloud-based platforms, as employees aim to create a working environment which encourages innovation and creativity; however, the study shows this flexibility is not being taken advantage of. One statistic which could be seen as an obstacle to adoption is that two in five line managers admit they do not want their employees to work from home.

Numerous organizations have highlighted mobility strategies as a priority for the coming months, as they aim to utilize the power and freedom of cloud-based applications to increase the productivity of employees. Findings from 451 Research claim 40% of enterprise organizations are prioritizing mobilization of general business apps over the next two years, as opposed to focusing solely on field services and sales teams. The trend towards mobility is also confirmed when assessing the M&A market: in the mobile device management and mobile middleware segment, 28% of the total deals (21 of 74) and 77% of their total value ($3bn of $3.9bn) over the past decade have occurred in the past two years alone.

Although other research has suggested organizations are shifting to a mobility mind-set, IDC’s study has outlined that the drive towards mobility is still in the early-adopter stage, despite numerous organizations claiming its importance. Leadership came in for particular criticism over whether working from home is considered acceptable, as only 43% of employees are confident leadership is fully behind mobility as a concept. Of those who do have the opportunity to work from home, only 14% spend more than half their time outside the office.

From a leadership perspective, new EU regulations regarding the protection, residency and transfer of data could have an impact on attitudes towards mobility, as penalties for non-compliance will be to the tune of €20 million or 4% of the organization’s annual turnover, whichever is greater.

While vendors are striving to improve the efficiency of mobility solutions, as well as championing efforts to make the technologies more secure on the whole, unless adoption of the mobility culture increases on the end-user side, there are unlikely to be any changes in the near future. If the statistics hold true, mobility initiatives will not achieve the required ROI, which could have a negative long-term impact on investment in the mobility segment as a whole.

CSC announces HPE enterprise services merger to create $26bn business


HPE CEO Meg Whitman

CSC has announced it will be merging with the enterprise services segment of HPE, as the latter reported its fourth consecutive quarter of year-over-year revenue growth.

Revenues for 2016 Q2 were reported at $12.7 billion, up more than 1%, as the team attributed the success to its servers, storage, networking and converged infrastructure business units. The enterprise services unit also saw a healthy performance, and will now be spun out and merged with CSC to create a $26 billion organization.

“The transaction is currently targeted to be completed by March 31, 2017,” said HPE CEO Meg Whitman on the company’s earnings call. “For the combined CSC and Enterprise Services, this will create a new company that will be a pure-play global IT services leader. For customers, this means global access to world class offerings in cloud, mobility, application development and modernization, business process services, IT services, big data and analytics, and security.”

The move comes six months after CSC underwent a split of its own, similar to that of HP and HPE. CSC serves commercial and government clients globally, whereas CSRA targets public sector clients in the United States. Following the completion of the transaction next year, CSC’s current president and CEO Mike Lawrie will continue to head up the new company, though the new brand has not been announced as yet. Both companies have seemingly benefited from their respective splits in recent months, demonstrating healthy growth since the two separations.

Since the separation, CSC has been aggressively reinforcing its position in the market with various acquisitions and joint ventures. It created CeleritiFinTech, a joint venture with HCL, to strengthen its position in the banking sector; acquired UXC to increase its footprint in the Australia-New Zealand region; and bought Xchanging to bolster its insurance solutions.

“Our proposed merger with HPE Enterprise Services is a logical next step in CSC’s transformation,” Lawrie said. “As a more powerful and versatile global technology services business, the new company will be well positioned to innovate, compete and serve clients in a rapidly changing marketplace. We are excited by the great potential this merger brings to our people, clients, partners and investors, and by the opportunity to strengthen our relationship and collaboration with HPE.”

In terms of HPE moving forward, Whitman highlighted next-generation software-defined infrastructure as a priority for the business, focused on servers, storage, networking, converged infrastructure, hyper-converged, and Helion. The company has stated it will remain open to future acquisitions, though it would appear there aren’t any major targets in the pipeline, as Whitman seemed ‘standoffish’ during the earnings call.

NTT Data partners with Privitar to make customers GDPR compliant

NTT Data UK has announced a partnership agreement with Privitar to provide data protection solutions built on the new requirements set out by the EU General Data Protection Regulation.

The GDPR requires companies to process and use the personal data of any European customers in a justifiable and ethical manner, whilst also giving increased control of the data back to the customers themselves. As the role of data increases within the business world, customers have become increasingly interested in how their personal information is stored and used. Insight delivered from this data can be used to drive additional revenues for a business, though once the GDPR comes into force in 2018, there will be strict guidance on how the data is used.

NTT Data believes this dynamic will create complications for various organizations, and claims that combining NTT Data’s data and process capabilities with Privitar’s privacy software will create a proposition which complies with all GDPR data requirements.

“By combining NTT DATA’s sector-specific domain knowledge with Privitar’s software we can now deliver programmes that make our clients champions of both privacy and innovation,” said Steve Mitchener, CEO of NTT Data UK. “I’m excited that this partnership will allow our clients to fully utilise their data assets without fear of reputational and financial damage, or regulatory action.”

Let the countdown to GDPR begin

The road to data protection has been a long and confusing one. Despite data protection being one of the biggest concerns of consumers and corporates throughout the world, progress has hardly been moving at breakneck speed; but as of today (May 25th), companies have exactly two years to ensure they are compliant with the EU’s General Data Protection Regulation.

The general objectives of the GDPR are to give citizens back control of their personal data and to simplify the regulatory environment for international business by unifying regulation within the EU. Data protection is a complicated business throughout the EU, mainly due to slight differences from country to country, overlaid with overarching EU regulations and directives which have never made it into regulation.

Conversations surrounding the new regulations have been ongoing since 2012, though companies now have until 25th May 2018 to ensure they are fully compliant. This would seem an adequate amount of time; however, a recent YouGov and Netskope survey highlighted that only one in five are confident they will be compliant within this period. For Eduard Meelhuysen, VP at Netskope, decision makers need to take a step back to get a better understanding of the current state of their data before concentrating on any individual app.

“If they are to comply, IT teams will need to make the most of the two-year grace period which means that both cloud-consuming organisations and cloud vendors will need to take active measures now,” said Meelhuysen. “As a starting point, organisations should take a hard look at how their data are shared and stored, focusing in particular on any cloud apps in use across the organisation.

“The GDPR makes specific provisions for unstructured data of the type created by many cloud apps, data which are typically harder to manage and control. That means organisations need to manage employees’ interactions with the cloud carefully as a key tenet of GDPR compliance.”

“As cloud app use continues to increase within businesses, data will become harder to track and control. But with the GDPR instigating a maximum possible fine of €20 million or 4% of global turnover (whichever is higher) in certain cases, there is now more incentive than ever for companies to focus on data protection. Getting a handle on cloud app use will be a crucial part of ensuring compliance for any organisation, and IT teams will need to start work now to meet the May 2018 compliance deadline.”

One area which has been given attention within the GDPR is data residency. The new rules will require that organizations do not store data in, or transfer data through, countries outside the European Economic Area which do not have equivalently strong data protection standards. The list of countries that meet these standards is a short one, just 11, with one notable absentee, the United States, which could pose problems for numerous organizations.
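In practice, the residency requirement boils down to a pre-transfer check: is every country a piece of personal data will be stored in, or routed through, either in the EEA or on the adequacy list? The sketch below is a minimal illustration of that check, assuming a hypothetical deployment where each storage and transit region is tagged with a country code; the allow-list shown is deliberately partial and illustrative, not the European Commission’s authoritative list.

```python
# Illustrative allow-list: EEA members plus countries with an EU adequacy
# decision would be enumerated here. The entries below are examples only.
EEA_OR_ADEQUATE = {
    "DE", "FR", "IE", "NL",   # illustrative EEA entries
    "CH", "NZ", "IL", "CA",   # illustrative adequacy entries
}

def transfer_allowed(route_countries):
    """Return (allowed, offending) for a proposed storage/transit route."""
    offending = [c for c in route_countries if c not in EEA_OR_ADEQUATE]
    return (len(offending) == 0, offending)

# Example: a route that hops through a US region would be flagged for review.
ok, offending = transfer_allowed(["IE", "US", "DE"])
print(ok, offending)   # False ['US']
```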

While this may be considered one of the headline areas for the GDPR and one which will likely be heavily scrutinized, for Dave Allen, General Counsel at Dyn, concentrating too much on this area could lull companies into a false sense of security.

“As the EU GDPR comes into effect, businesses will need to take a hard look at their current methods of sharing and storing data,” said Allen. “While some Internet companies have begun to address new challenges at the fixed locations where data is stored – this alone will not necessarily be enough to ensure compliance.

“Those companies focusing solely on data residency may well fall victim to a false sense of confidence that sufficient steps have been taken to address these myriad regulations outlined in the GDPR. As the GDPR will hold businesses accountable for their data practices, businesses must recognise that the actual paths data travels are also a key factor to consider. In many ways, the constraints which come with the cross-border routing of data across several sovereign states mean these paths pose a more complex problem to solve.

“Although no silver bullet exists for compliance with the emerging regulations which govern data flows, businesses which rely on the global Internet to serve their customers should be seriously considering visibility into routing paths along both the open Internet and private networks. As we enter an era of emerging geographic restrictions, businesses with access to traffic patterns in real time, in addition to geo-location information, will find themselves in a much stronger position to tackle the challenges posed by the GDPR.”

Overall, the GDPR will ensure companies take a greater level of responsibility for safeguarding the personal data they hold from attacks. Recent months have seen a number of highly publicised attacks significantly impact the reputation of well-known and respected brands, making consumers nervous about what personal information is being held on them. Previously, attacks on such organizations would not have been thought possible; surely they have the budgets to ensure these breaches wouldn’t happen?

Another headline proposition from the GDPR is the consumer’s right to access the data which is stored on them, and also the right to have this data ‘forgotten’. For Jon Geater, CTO at Thales e-Security, this will create numerous challenges and changes to the way in which data is stored and accessed.

“The new rules also make clear another important factor that we should already have known: that you can outsource your risk, but you can’t outsource your responsibility,” said Geater. “If organisations use a third party provider to store and manage data – such as a cloud provider, for example – they are still responsible for its protection and must demonstrate exactly how the data is protected in the remote system. Therefore, formal privacy-by-design techniques need to make their way down the supply chain if companies are to avoid penalties or nightmarish discovery and analysis tasks.

“In addition, organisations will now have to provide citizens with online access to any of their own personal data they store. While the Data Protection Act traditionally allowed anyone to request access to this data, with GDPR in effect organisations must make this available for download ‘where possible’ and ‘without undue delay’.

“This is a very significant change and securing this access will represent a significant challenge to many organisations – especially while still complying with the new tighter rules – and will require robust cybersecurity technology across the board.”

What is clear is that there will be complications. This shouldn’t be considered a massive surprise, as any new regulation is fraught with complications around how to become or remain compliant, but the European Commission isn’t messing around this time. With fines of €20 million or 4% of global turnover (whichever is greater), the stick is a hefty one, and the carrot is yet to be seen.

Accenture outlines future of cloud and data analytics in sport

Although the digital age has created a wealth of opportunities for organizations to create new revenue streams and attract new audiences, maintaining the engagement of these customers is becoming an increasingly difficult job, according to Accenture’s Nick Millman.

The availability and ease of access to information in the 21st century has created a new dynamic in which consumers are becoming increasingly competent at multi-tasking and operating several devices, which has made the task of keeping a viewer’s attention throughout the course of a sporting event more challenging. Millman, who leads Big Data & Analytics Delivery at Accenture, is using this dynamic to create new engagement opportunities for the Six Nations.

“There will be a number of people who will watch the entirety of a match, however there will be others who will be playing with their tablet or phone and enjoying the multi-screen experience,” said Millman. “To keep the level of engagement, sports need to become more digital themselves, providing more insight and data to fans who are watching the game. Ideally you want them to be on their phone looking at something which is relevant to the game as opposed to Facebook or what their friends are doing.”

Accenture first teamed up with the Six Nations as a technology partner four years ago, when the initial partnership focused on demonstrating the company’s mobility capabilities through creating the official app. What started as a basic app now acts as a delivery platform where Accenture can showcase its data analytics capabilities, processing more than 2 million rows of data per game and creating visuals in (near) real-time to tell a different story behind the sport itself.

The data itself is not necessarily of the greatest use to fans, so Accenture has brought in rugby experts year-on-year to help understand the nuances of the information. This year Nick Mallet, Ben Kay and David Flatman helped the team tell the story. It is the same in the business world: data analysts themselves may not be able to make the right decisions when it comes to the application of the data, as they wouldn’t understand the market in the same way as a Managing Director who has been in the industry for 30 years. The application of data in sport and the business world will only be effective when it is merged with expertise and experience to provide context.

“One of the interesting things which we saw is that there is now an interesting dynamic between data-driven decisions and gut feel,” Millman highlighted. “In some cases when you are watching the game you may think that one player would be considered the best on the park, but the data tells a different story. Seeing one hooker, for example, hit every lineout perfectly might make him look like the most effective, but the data might suggest the opposition hooker, who produced several small gains when carrying the ball, had a greater impact on the game.

“This can translate into the business world also, as a marketing team may have a better feel about a product which it wants to push out to the market, but the data team has evidence which shows resource should be focused on a different area of the business,” said Millman. “I don’t think there is a right answer to what is better, data-driven decision making or intuition, but it’s an interesting dynamic. The successful businesses will be the ones who are effective at blending the data and the skills to come to the right outcome.”
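To make the hooker example concrete, the toy calculation below shows how a simple per-player aggregation can tell a different story from the eye test. The event data, column names and numbers are invented purely for illustration and are not Accenture’s actual model or figures.

```python
import pandas as pd

# Invented match events: each row is one action by one of two hypothetical hookers.
events = pd.DataFrame([
    {"player": "Hooker A", "event": "lineout", "success": True,  "metres": 0},
    {"player": "Hooker A", "event": "lineout", "success": True,  "metres": 0},
    {"player": "Hooker A", "event": "carry",   "success": True,  "metres": 1},
    {"player": "Hooker B", "event": "lineout", "success": False, "metres": 0},
    {"player": "Hooker B", "event": "carry",   "success": True,  "metres": 4},
    {"player": "Hooker B", "event": "carry",   "success": True,  "metres": 6},
    {"player": "Hooker B", "event": "carry",   "success": True,  "metres": 3},
])

lineouts = events[events["event"] == "lineout"]
carries = events[events["event"] == "carry"]

# Set-piece accuracy versus ground made with ball in hand, per player.
summary = pd.DataFrame({
    "lineout_success_rate": lineouts.groupby("player")["success"].mean(),
    "carries": carries.groupby("player").size(),
    "metres_carried": carries.groupby("player")["metres"].sum(),
}).fillna(0)

print(summary)
# Hooker A looks flawless at the set piece, but Hooker B's repeated small
# gains add up to far more territory: the "data versus gut feel" tension.
```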

While the role of analytics is becoming more prominent in sport and the business world, there is still some education to be done before the concepts could be considered mainstream. Analytics may be big business in the enterprise segments, but there are still a large proportion of SMBs who do not understand the power of data analytics for their own business. The ability to cross sell, develop a stronger back story of your customer, maintain engagement or even implement artificial intelligence programs is only available once the core competencies of big data and analytics are embraced within the organization.

For Accenture, wearables and IoT are next on the horizon, and potentially virtual reality in the future. This year the app was available on the Apple Watch, and Millman is starting to see trends which could shift the consumption of data once again.

“It’s still early days, but some of the consumption of data is likely to shift from tablets and smartphones,” said Millman. “Like it shifted from desktops to laptops to smartphones and tablets, it may shift to wearable devices in the future.

“Also this year we built a prototype using virtual reality to immerse people in the rugby experience. I’m not sure VR will become mainstream in a sporting context in the next 12-18 months, but I think increasingly VR and AR (augmented reality) will become a part of the sports viewing experience.”

Atlassian launches Bitbucket Pipelines

Atlassian has announced a number of new developments within its team collaboration software portfolio, including the launch of the Bitbucket Pipelines platform.

The new platform extends the cloud-based Bitbucket code repository to provide teams with entire continuous delivery workflows, from source to deployment, in the cloud. The team claim the new proposition helps developers who are struggling to apply on-premises continuous integration and delivery tools as software development and production applications shift into the cloud.
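Bitbucket Pipelines itself is configured through a YAML file kept alongside the code in the repository. Purely as a conceptual illustration, and not Atlassian’s actual syntax, the sketch below shows the basic shape of such a workflow: an ordered set of steps, each a list of commands, with the run halting at the first failure.

```python
# Conceptual sketch only: Bitbucket Pipelines is configured via a YAML file in
# the repository, not Python. This just illustrates the shape of a pipeline:
# ordered steps, each a list of shell commands, halting on the first failure.
import subprocess

PIPELINE = [
    {"name": "build",  "script": ["echo building...", "echo done"]},
    {"name": "test",   "script": ["echo running tests..."]},
    {"name": "deploy", "script": ["echo deploying to a hypothetical target"]},
]

def run_pipeline(steps):
    for step in steps:
        print(f"== {step['name']} ==")
        for command in step["script"]:
            result = subprocess.run(command, shell=True)
            if result.returncode != 0:
                print(f"step '{step['name']}' failed; stopping pipeline")
                return False
    return True

if __name__ == "__main__":
    run_pipeline(PIPELINE)
```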

“Atlassian is helping teams across all industries do amazing things. We’re helping developers at Cochlear build aural implants to help people hear, The Telegraph to inform millions of readers each day, and Lufthansa Systems to provide IT services for everything from aviation safety to entertainment,” said Sri Viswanath, CTO, Atlassian. “The common thread between these teams and your own is the need to work smarter and faster. We’re seeing more and more of these teams choosing to collaborate in the cloud. In fact, over half of our customers choose to collaborate in the cloud and an even higher number of new customers select our cloud offerings.”

Elsewhere, the team also launched a native mobile platform to increase connectivity between departments using the Confluence and JIRA tools, building on enterprise mobility trends, as well as opening up its JIRA Service Desk product to developers to build add-ons that create and update requests or extend JIRA Service Desk’s automation capabilities to react to changes in requests.

The company has also joined the Open API Initiative and replaced its existing API documentation using a custom site generator, RADAR, which it has released as open source to be used by any Open API provider.

“Collectively, we have a lot to gain from an open, widely accepted definition language for REST APIs,” said Viswanath. “We’re committed to actively contributing to the standard and are now a member of the Open API Initiative and the Linux Foundation, alongside industry leaders like Google, Microsoft, PayPal and others.”