Category archive: Interviews

Ticketmaster VP of Engineering talks DevOps

Is there anything that Stephen Williams, VP Engineering at Ticketmaster, can’t do? He has led the technology and development of the International Ticketmaster and Live Nation consumer platforms for the last 10 years, building web and app products across a heterogeneous range of technologies that includes best-of-breed OSS and commercial software founded on Java, PHP and .Net stacks. There is no doubt that he will be a great addition to the speaker line-up at the upcoming DevOps World event on November 4th in London.

Over the last two years Steve has been focusing attention across all international teams to define and direct the change and implementation of co-ordinated engineering strategies, in collaboration with Product and Technical Operations teams. A primary focus has been evangelising and embracing a DevOps culture: Steve led the definition of the Ticketmaster DevOps program, the development of a unique way to visualise the program of work, and the journey the teams are now on. Ahead of the event, Stephen shared some views on DevOps and a few other things besides.

What does your role involve and how are you involved with DevOps?

My role involves overseeing the management of two teams, in London and Sweden, working on the Ticketmaster International and Live Nation consumer platforms, both supporting 10-15 markets and growing. I’m very fortunate in having two great managers across the two platforms, which enables me to also focus on larger strategic projects across our entire international engineering organisation.

I’ve been very involved with DevOps from the start within Ticketmaster International. Another colleague and I defined the International TM DevOps strategy together. Following the release of the strategy we set up focused working groups to create some essential standards around tooling and instrumentation. From there we worked with various teams to convert the strategy into requirements and enable all teams to begin their journey. At the same time, defined KPIs show where DevOps is having a positive impact and allow us to report this back to the business to promote the benefits; if the benefits are not being realised as expected, we can re-evaluate the strategy.
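To make the KPI idea concrete, here is a minimal, hypothetical sketch (not Ticketmaster’s actual tooling or metrics) of how two commonly tracked DevOps indicators – deployment frequency and lead time from commit to production – could be computed from a simple list of deployment records.

```python
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical deployment records: (commit time, production deploy time) pairs.
deployments = [
    (datetime(2015, 10, 1, 9, 0), datetime(2015, 10, 1, 15, 30)),
    (datetime(2015, 10, 2, 11, 0), datetime(2015, 10, 3, 10, 0)),
    (datetime(2015, 10, 5, 14, 0), datetime(2015, 10, 5, 16, 45)),
]

def deployment_frequency(deploys, window_days=7):
    """Average number of deployments per day over the most recent window."""
    cutoff = max(deployed for _, deployed in deploys) - timedelta(days=window_days)
    recent = [deployed for _, deployed in deploys if deployed >= cutoff]
    return len(recent) / window_days

def mean_lead_time_hours(deploys):
    """Mean time from commit to production deploy, in hours."""
    return mean((deployed - committed).total_seconds() / 3600
                for committed, deployed in deploys)

print(f"Deploys per day (7-day window): {deployment_frequency(deployments):.2f}")
print(f"Mean lead time: {mean_lead_time_hours(deployments):.1f} h")
```

In practice figures like these would be pulled from CI/CD and ticketing systems rather than hard-coded, but the calculations reported back to the business stay this simple.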

How have you seen DevOps affecting IT teams’ work?

Our TM DevOps strategy has provided goals and a shared vision for our teams. The strategy is defined in a way that allows each team to select its own path, depending on the context of its needs. If you think of a roadmap, there are many ways to get from point A to point B – you, the driver, determine the most appropriate route depending on various factors. Our strategy works in a similar way.

Having the right vision and the ability to choose your own path has created motivation and a desire to succeed across our teams. It’s inspiring them to want to deploy faster and create opportunities for the business to learn more quickly about new features. Having standards for tooling and best practices is helping to create a culture where more collaboration and sharing of ideas is starting to happen, so we only solve each problem once.

What is the biggest challenge you are facing with DevOps and how are you trying to overcome it?

The biggest challenge is capacity within the systems engineering team to align more closely with the product delivery teams whilst still having large operational support requirements to service. It can be a slow process to unpick some areas and re-align teams when so many demands are coming in. There are several initiatives we’re employing, such as shifting left and moving operational tasks to support teams to free some capacity for the systems engineers to work more closely with developers. A lightweight CAB and improved demand management filtering have also been put in place to funnel requests.

Can you share a book, article or movie that you recently read or watched and that inspired you?

I’m currently working on an organisational strategy to implement competency-based skills frameworks to standardise roles across the international engineering organisation, increase operational efficiency and support career progression and staff satisfaction. Research to develop the strategy led me to several articles and videos on Holacracy. Holacracy attempts to define a way for an organisation to be more flexible by allowing individuals more authority to solve problems and cut through bureaucracy.

It’s a fascinating approach to creating more autonomy, increasing flow and building a higher-performing organisation. If conventional organisational structures are the Windows and macOS of the enterprise, then Holacracy is more like a mobile OS for the organisation. I’m starting to ask: what if we tried this, how could we, and where would the benefits be? A great video to learn more about Holacracy is by Brian Robertson: https://www.youtube.com/watch?v=tJxfJGo-vkI.

What are you hoping to achieve by attending DevOps World?

As with all conferences, there is a bit of promoting our brand by showcasing the exciting work we’re doing at Ticketmaster, but it’s also a learning opportunity: to hear more about DevOps from other people, perhaps using that information to extend our own DevOps strategy or to learn what not to try and which potential risks to watch out for, and to meet new people and build relationships.

Infectious Media CTO on how DevOps is affecting ICT teams

The fourth employee of Infectious Media, Dan de Sybel started his career as an Operations Analyst for Advertising.com where, during a six-year tenure, he launched the European Technology division, producing bespoke international reporting and workflow platforms, as well as numerous time-saving systems and board-level business intelligence.

Dan grew the EU Tech team to 12 people before moving agency side, to Media Contacts UK, part of the Havas Media Group. At Havas, Dan was responsible for key technology partnerships and spearheading the agency’s use of the Right Media exchange under its Adnetik trading division.

At Infectious Media, Dan’s Technology division yielded one of the first Big Data analysis systems to reveal and visualise the wealth of information that RTB provides to its clients. From there, the natural next step was to produce the Impression Desk Bidder to be able to action the insights gained from the data in real time and thus close the loop on the programmatic life cycle. Dan’s team continues to enhance its own systems, whilst integrating the technology of other best-in-class suppliers to provide a platform that caters to each and every one of its clients’ needs.

Ahead of his presentation at DevOps World on November 4th in London, Dan shares his insights on how he feels DevOps is affecting ICT teams, the DevOps challenges he is facing, and what he is doing to overcome them.

What does your role involve and how are you involved with DevOps?

Infectious Media runs its own real-time bidding software that takes part in hundreds of thousands of online auctions for online advertising space every second. As CTO, it’s my job to make sure we have the right team, processes and practices in place so that this high-frequency, low-latency system remains functional 24×7 and adapts to the ever-changing marketplace and standards of the online advertising industry.

DevOps practices naturally evolved at Infectious Media due to our small teams, one-week sprint cycles and the growing complexity of our systems. Our heavy use of the cloud meant that we could experiment frequently with different infrastructure setups and adapt code to deliver the best possible value for the investment we were prepared to make. These conditions resulted in far closer working between developers and operational engineers, and we have not looked back since.

How have you seen DevOps affecting IT teams’ work?

Before adopting the DevOps philosophy, we struggled to bring the real-time bidding system to fruition, never sure if problems originated in the code, in the operational configurations of infrastructure, or in the infrastructure itself. Whilst the cloud brought many benefits, never having complete control of the infrastructure stack led to many latency and performance issues that could not be easily explained. Furthermore, being unable to accurately simulate a real-world environment for testing without spending hundreds of thousands of pounds meant that we had to work out solutions for de-risking the testing of new code in live environments. All of these problems became much easier to deal with once we started following DevOps practices and, as a result, we have a far happier and more productive technology team.
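One common way to de-risk testing new code in a live environment – a general illustration of the “testing on live” idea, not necessarily what Infectious Media built – is to gate the new code path behind a feature flag and expose it to a small, deterministic slice of traffic first. The function names and percentages below are hypothetical.

```python
import hashlib

def in_canary(user_id: str, rollout_percent: float) -> bool:
    """Deterministically place a stable slice of users into the canary group."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100  # stable bucket 0-99 per user
    return bucket < rollout_percent

def choose_bidder(user_id: str) -> str:
    # Start with, say, 5% of traffic on the new bidding logic; widen the
    # percentage as monitoring confirms latency and error rates hold up.
    if in_canary(user_id, rollout_percent=5):
        return "new_bid_algorithm"
    return "current_bid_algorithm"

print(choose_bidder("user-1234"))
```

Because the bucketing is deterministic, each user stays in the same group across requests, and the rollout percentage can be widened gradually while dashboards are watched for regressions.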

What is the biggest challenge you are facing with DevOps and how did/are you trying to overcome it?

The biggest challenge was overcoming the initial inertia to switch to a model that was so far unproven and regarded as a bit of a fad. Explaining agile methodologies and the compromises they involve to senior company execs is hard enough, but as soon as you mention multiple daily release cycles necessitating fewer governance processes and testing on live, you are bound to raise more than a few eyebrows.

Thankfully, we are a progressive company and the results proved the methodology. Since we adopted DevOps, we’ve had fewer outages, safer and more streamlined deployments and, crucially, more features released in less time.

Can you share a book, article or movie that you recently read or watched and that inspired you – in regard to technology?

The Phoenix Project. Perhaps a bit obvious, but it’s enjoyable reading a novel that covers some of the very real problems IT professionals experience in their day-to-day roles, alongside the very solutions we were experimenting with at the time.

What are you hoping to achieve by attending DevOps World?

Really my goal is to understand, and help with, some of the problems that rolling DevOps practices out across larger companies can yield. In many respects, rolling out DevOps in small startups is somewhat easier, as you have far less inertia from tried and trusted practices, comparatively less risk and far fewer people to convince that it’s a good idea. I’ll be interested to hear about other people’s experiences and hopefully be able to share some advice based on our own.

Advancing the cloud in South East Asia

Today is the first day of Cloud South East Asia in Kuala Lumpur, and the attendance alone testifies to the enthusiasm and curiosity around cloud development in the region in general and in Malaysia in particular.

One great authority on the topic is the chair of today’s event, Mike Mudd (pictured), MD at Asian Policy Partners LLC. Following keynotes from the likes of Amazon Web Services and the Asia Cloud Computing Association, Business Cloud News sat down with Mudd to discuss the significance of cloud computing standards in the region, something touched upon by a number of speakers.

BCN: Hi Mike. It was pointed out today that there is a slight disparity between the enthusiasm for the cloud in South East Asia, and the pace of actual adoption in the region. What would you say the big impediments are?

Michael Mudd: Well, there’s the general one, which is what I’ve described as the ‘trusted cloud’. This encompasses two things: one is security, and the other is privacy. The other issue, however, is that only around half of the region here has adequate data protection rules. Some have them on the books but they’re either not enforced, they’re enforced laxly, or they are only applicable to the private sector and not to government. This is quite distinct from privacy laws in, say, Europe, where they apply across all sectors.

In addition, in certain countries, they’re trying to say that you cannot send any personally identifiable information across borders. This is important when it comes to financial information: banks, insurance, stock exchange, this type of thing, as well as healthcare.

And are regional governments taking up the cloud in general?

Forward-looking governments are. Singapore, Hong Kong to a certain degree – but there’s not an idea of a ‘cloud first’ policy yet. It’s still very much ‘hug my server, build my data centre’, etc.

From the point of view of the regulators, particularly the financial services, to do their job they’ve got to be able to audit. And one of the things they consider important to that is being able to physically enter premises if required. Certain jurisdictions want to see servers. If the data is in the cloud, then that too is an issue, and something that has to be addressed.

Do you think that the new Trans-Pacific Partnership could provide a way out of this impasse?

What has been drafted in the TPP, to my understanding (though we’ve still got to see the details), is wording which will enable, or should enable, cross-border data flows to work far more easily. Again, it was only signed two days ago, so we don’t know the exact wording. (Like all trade negotiations it was done in confidence – people complain they’re done in secrecy, but all are done in the same way.)

Why is this so important?

From the point of view of cloud computing, this is new. Most trade agreements deal with traditional things: agriculture was the first thing traded, manufacturing the second, services the third. The fourth one is the new one: trading data, trading information, flowing across borders.

It actually goes right back to the very beginning. Information has always been important for trade: being able to have a free flow of information. I’m not talking about security or government – that kind of thing is always sensitive and will always be treated separately, as it should be – but commercial information is very important. It’s the reason your ATM card works here as well as in London. That’s a cross-border data flow of information!

Standards are only just emerging. We obviously have technical standards – their objective is to enable interoperability between disparate machines. Those kinds of standards have been around a long time – they’re based on industry protocols and so on. What is starting to come up now are management standards: standards emerging very specifically for cloud.

Cloud computing in the public sector

BCN has partnered with the Cloud Asia Forum event to speak to some of its speakers. In this interview we speak to Ben Dornier, Director of Corporate & Community Services, City of Palmerston.

BCN: What does your role involve and how is technology helping your organisation grow and reach more customers? What is the role of Cloud Computing in this?

Ben Dornier: My role includes responsibility for general corporate affairs (finance, city tax revenue, legal affairs, HR, IT, contracts, insurance and risk) as well as governance and strategy (the city strategy, annual budget, annual financial reporting, performance reporting, policy and corporate strategy), and community services (libraries, city recreational facilities, city facilities, city community services).

ICT plays a major role in ensuring this portfolio is not only adequately delivered, but especially in ensuring it is delivered efficiently and sustainably. Cloud computing is a major part of that: several major systems are already in the cloud, and we are transferring all corporate ICT systems into public/private cloud hybrids over the course of this financial year. It has reduced our risk and cost base, and allowed us to shift our emphasis from employing pure technical expertise to technical strategy expertise, allowing us to focus on our core services while improving service standards.

What do you consider as the three main challenges for wide Cloud Computing adoption in Asia and how do you anticipate they can be overcome?

Interesting question, and really I can only answer regarding the public sector – the first is primarily HK-based. I note a reticence amongst public agencies to provide mobility solutions to their employees, and I think this seriously hampers the effectiveness of cloud-based solutions in getting government workers out from behind their desks and into the city’s infrastructure and services, which I believe drags on costs and efficiency. With this as a barrier, many of the benefits of cloud-based solutions will not be as readily apparent to government – and the skill sets of a highly competent, highly mobile workforce will not be an advantage.

Second, I see the structural issues associated with data governance and related policy as a serious barrier, although this is steadily decreasing. As long as policy makers are not actively addressing cloud procurement and adoption issues, the ICT staff supporting internal decision making will not be able to recommend new and innovative models of service delivery without there being fairly high costs associated with development. This continues the prevalence of ‘bespoke systems’ and the myth that ‘our agency and its requirements are unique, and we need a unique system’. I simply do not believe this is true any longer, and nations which address this at a federal or national level are reaping the benefits.

Third, in ‘cloud-readiness’, Asia is rapidly climbing – but this is really a private sector metric. I would strongly advocate that there be a concerted effort in the industry to support a public sector metric, which could bootstrap some of the incredible work happening in the private sector, and be a convincing argument for changes in public policy towards cloud use. Public sector use will be a serious revenue driver once procurement practices are able to support government cloud use in the least restrictive manner appropriate.

How much is Mobility part of your strategy? Is it important for organisations to enable employee mobility and reach out to customers through mobile devices?

Mobility is a ‘force-multiplier’ for us (to borrow a military term), allowing us to increase productivity while reducing pressure on human resources. Municipal employees are able to spend less time at their desks entering data into corporate systems, from inspections and assessments of civic assets to animal and parking infringements. For these staff, less time at the desk means more time doing the work they were hired to do. It also allows us to offer better employment flexibility for staff who would prefer to work part time or odd hours, without some of the productivity issues often associated with workplace flexibility.

We are also finding that young employees are increasingly expecting us to provide this capability, and quickly adopt mobile solutions. As for our city residents, more than 50% are accessing city information through mobile devices when and where they need it, and an increasing proportion of these rely on mobile devices as their primary access. This will only increase.

How do you think Disruptive Technologies affect the way business is done in your industry?

Technology disruption continues to be a key factor, particularly as older, expensive line-of-business systems are proving not nearly as capable as well-managed cloud-based solutions. I believe an increasing disruptor in this area will be cloud-based integration services offering connections which tie multiple cloud-based solutions into effectively a single service from the perspective of the end user.

There will always be a role for major system suppliers, but increasingly the aggregated cloud-based service sector will take a large chunk of market share while reducing the risks associated with big capex spends and expensive implementations. When I am spending tax money, this is an important consideration!

Can you recommend a book, film or article – relevant to cloud and technology – that inspired you?

Being a bit more digital, might I suggest a blog! I have a heavy interest in concepts around ‘smart cities’, a technology disruption occurring around the business of building very expensive but often technologically ‘dumb’ civil infrastructure like bridges and waste facilities. I am an avid reader of posts at Jesse Berst’s “Smart Cities Now” blog, through his site at www.smartcitiescouncil.com. There are a few good blogs in this sector, but I enjoy the variety Jesse’s site provides.

What was your interest in attending Cloud Asia Forum? What are you looking to achieve by attending the event?

Frankly, I know from past experience that I am guaranteed an ‘ah-hah’ moment, or even several, which will change my thinking and perspective on a specific area related to cloud solutions in government. I am looking forward to hearing the speakers and interacting with delegates and finding out where these ‘ah-hah’ moments will occur. This year I am particularly interested in listening to topics covering C-Level persuasion, the translation of the technical advantages of cloud computing into corporate decision making involving non-technical (meaning ICT!) executives. For me, I think this will be helpful in persuading elected officials on their own terms about the benefits of cloud adoption.


Game development and the cloud

BCN has partnered with the Cloud South East Asia event to interview some of its speakers. In this interview we speak to Sherman Chin, Founder & CIO of Sherman3D.

Cloud South East Asia: Please tell us more about Sherman3D and your role in the gaming industry.

Sherman Chin: I started game development during my college days, working on hobby projects. I then graduated with a BSc (Hons) in Computing from the University of Portsmouth, UK, and was the recipient of the 2002 International Game Developers Association scholarship. I formed Sherman3D shortly after and oversaw the entire game development pipeline. Though my experience is in programming, I am able to serve as a bridge between the technical and creative team members.

I worked on over 20 internationally recognized games, including Scribblenauts published by Warner Bros. Interactive Entertainment, Nickelodeon Diego’s Build & Rescue published by 2K Play, and Moshi Monsters Moshling Zoo published by Activision. Sherman3D, incorporated in 2003, is the longest-running Malaysian indie game development company. With Sherman3D, I am the first Malaysian to release a game on Steam, the largest digital distribution platform for games online, after being voted in by international players via the Steam Greenlight process.

Within the gaming industry, I also worked as a producer in Japan, as a project manager in Canada, and as a COO in Malaysia. With over 15 years of experience in the gaming industry, I am currently the external examiner for the games design course at LimKokWing University and a game industry consultant for the Gerson Lehrman Group providing advisory services for international investors.

How has technology such as cloud supported your growth?

One important aspect of cloud technology is how ubiquitous it is. It allows my international development team to work online from anywhere in the world. This has helped us tremendously as we move our development operations online. We have our documents edited and stored online, we have our project management online, we have our video conference sharing sessions online, and we even have our game sessions online.

These online activities are made possible with cloud technology. More directly related to our product, Alpha Kimori was initially coded as a 3D tech demo for the Butterfly.net supercomputing grid, which was showcased at the Electronic Entertainment Expo in 2003.

I continued work on Alpha Kimori as a 2D JRPG that was then featured on the OnLive cloud gaming service for PC, Mac, TV, and mobile. OnLive streamed our game on multiple platforms with minimal effort on our part. Thanks to OnLive, we reached a bigger audience before finally making it on to Steam via the Greenlight voting process by players who wanted to see Alpha Kimori on Steam.

Do you think cloud has an important role in the gaming industry and do providers give you enough support?

Yes, cloud does play an important role in the gaming industry, and providers do give enough support. OnLive was extremely helpful, for example. It was perfect for an asynchronous game such as Alpha Kimori, which has a turn-based battle system. Unfortunately, synchronous real-time games have a more difficult time adapting to the slower response rate from streaming cloud servers. In order to boost response times, servers have to be placed near the players, so depending on the location of the servers, a player’s mileage might vary.

As broadband penetration increases this becomes less of an issue, so early implementations of Cloud gaming may simply have been ahead of their time. I do see a bright future though. We just have to match the best-suited sorts of games to Cloud gaming as the technology progresses.
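To see why server placement matters so much for synchronous games, here is a rough, illustrative back-of-the-envelope calculation (my own numbers, not Sherman3D’s or OnLive’s): signals in optical fibre travel at roughly two-thirds of the speed of light, so distance alone puts a floor under round-trip time before any video encoding, rendering or network queuing is added.

```python
# Rough propagation-delay estimate for a cloud gaming session.
SPEED_IN_FIBRE_KM_S = 200_000  # roughly 2/3 of the speed of light in vacuum

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from fibre propagation alone."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_S * 1000

for km in (100, 1_000, 5_000, 10_000):
    print(f"{km:>6} km away -> at least {min_round_trip_ms(km):5.1f} ms round trip")

# A turn-based battle easily tolerates a few hundred ms of input-to-display
# delay; a twitch-style real-time game typically needs well under ~100 ms,
# which is why streaming servers have to sit close to the player.
```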

What will you be discussing at Cloud South East Asia?

At Cloud South East Asia, I will be discussing how asynchronous Japanese Role Playing Game elements are suitable for Cloud gaming as they require less of a response time compared to synchronous real time battle games. I will also do a post mortem of Alpha Kimori on the Cloud gaming platforms it was on.

Cloud technology was not always a bed of roses for us and we had to adapt as there were not many precedents. In the end though, each cloud gaming platform that Alpha Kimori was on helped us to advance our game content further. I will also talk about the auxiliary resources on the Cloud for game design such as the amazing suite of free technology provided by Google. I will also talk a bit about the sales of Alpha Kimori on Steam and how Cloud technology affects it with features such as Steam Cards.

Why do you think it is an important industry event and who do you look forward to meeting and hearing more from?

Having its roots in Japanese Role Playing Games, Alpha Kimori was selected by the Tokyo Game Show (TGS) committee for its Indie Game Area in September, 2015. Sherman3D is once again honoured to be the only Malaysian indie team sponsored by TGS and as such, we view TGS as an important industry event for us. It will help us penetrate the Japanese market and we look forward to meeting and hearing from potential Japanese business partners willing to help us push the Alpha Kimori intellectual property in Japan.

What is next for Sherman3D?

Sherman3D will go on developing the Alpha Kimori series and licensing our Alpha Kimori intellectual property to other developers worldwide. We want to see our Alpha Kimori universe and brand grow. We are also working on the Alpha Kimori comic and anime series. Ultimately, Sherman3D will spread the Great Doubt philosophy in Alpha Kimori where it is not about the past or the future but our experience in the current moment that counts. Only from now do we see our past and future shaped by our own perspective because the truth is relative to our human senses. Attaching too much to anything causes us suffering and accepting the moment gives us true freedom as it allows us to love without inhibitions. Sherman3D will continue to spread the Great Doubt philosophy in its endeavours in the entertainment industry.

Learn more about how the cloud is developing in South East Asia by attending Cloud South East Asia on 7th & 8th October 2015 at Connexion @ Nexus, KL, Malaysia.


How the cloud enables the Bistip social marketplace to scale efficiently

BCN has partnered with the Cloud South East Asia event to interview some of its speakers. In this interview we speak to Rohit Kanwar, CEO of Indonesian social marketplace service Bistip.

Cloud South East Asia: Who are Bistip and how are you shaking up the Indonesian market?

Rohit Kanwar: Bistip is a peer-to-peer marketplace for social delivery, connecting item seekers and travellers. Travellers can post their trips on the platform, visible to everyone, and item seekers offer them extra money to bring back their desired items. Currently Bistip has close to 35,000 customers and more than 100,000 web visits per month.

We are the logistics industry’s analogue of Uber or Airbnb. Social couriers have existed in most Asian countries for a long time, but they were limited to close friends and families. With the adoption of smartphones and cloud-based technologies, however, it is now faster and easier to scale and roll out the service to more people.

Indonesians love buying high-value goods, which are often expensive to purchase locally; with the help of Bistip they are now accustomed to getting “anything from anywhere globally” within Indonesia, at affordable prices. Our mission is to reduce overseas travel costs by providing travellers with extra money. We are adding 1,000 customers per month, with revenue over 250K USD, and plan to reach 1 million registered customers by 2017.

How is technology helping you grow and reach new customers?

Our customer acquisition strategy is 100% digitally focussed; we are also using online advertising tools to reach out to new customers. We actively use digital technology and analytics to reach the target audience and demographic for each product. We also use social media listening technology to understand customers’ sentiments about our products and services. Modifying products based on customer insight helps us to reach new customers and better serve existing members.

What role does cloud computing play in your business?

All our web and app platforms run on cloud-based technology. Its subscription-based, pay-as-you-go model helps us scale up capacity in a cost-effective way. Our cloud vendors provide us with features such as managed services, web performance dashboards and analytics-integrated platforms, which enable us to focus on our business KPIs instead of day-to-day network operations. Cloud-based technology is a boon to start-ups as it reduces high capex spending on IT infrastructure.

How do you think established, global technology vendors are supporting start-up companies in Indonesia?

Indonesia currently has 70 million smartphones and more than 100 million Internet users. Jakarta generates more tweets than any other city in the world. There is a huge paradigm shift among technology vendors towards Indonesia due to the growing, successful start-up ecosystem in the country. Global technology vendors such as Google, Cloudera, IBM, Microsoft and others are showing a keen interest in understanding the business requirements of start-ups.

Tech vendors have dedicated support teams assisting start-up companies with solving their problems. Global technology vendors regularly arrange boot camps, hackathons and networking seminars in the country in order to support start-ups with fundraising and other mentorship activities. A few technology vendors have venture funds to support start-ups as well. Overall the atmosphere and ecosystem is developing faster than ever before in Indonesia.

What is next for Bistip?

Bistip is coming up with a mobile application, along with new features and services, to reduce the shipping time and provide more security to our customers. It will also have an integrated payment system. Bistip aims to achieve 1 million customers by end of 2017.

Bistip is also exploring the opportunity to start operations in other countries. We have signed two partnerships – with Uber in Indonesia and with Aramex – and are likely to sign a few more with online travel portals and retail stores in the near future. We are also in final-stage talks with a VC to raise further funds. Hopefully our execution will follow our strategy and we can provide our travellers with more benefits and our buyers with a better service.

 

Learn more about how the cloud is developing in South East Asia by attending Cloud South East Asia on 7th & 8th October 2015 at Connexion @ Nexus, KL, Malaysia.


Make your Sunday League team as ‘smart’ as Borussia Dortmund with IoT

IoT can help make your football team smarter

How, exactly, is IoT changing competitive sports? And how might you, reader, go about making your own modest Sunday League team as ‘smart’ as the likes of AC Milan, Borussia Dortmund and Brazil?

We asked Catapult, a world leader in the field and responsible for connecting all three (as well as Premier League clubs including Tottenham, West Brom, Newcastle, West Ham and Norwich), exactly how the average sporting Joe could go about it. Here’s what the big teams are increasingly doing, in five easy steps.

Link-up play

The technology itself consists of a small wearable device that sits (a little cyborg-y) at the top of the spine, under the uniform, measuring every aspect of an athlete’s movement using a GPS antenna and motion sensors. The measurements include acceleration, deceleration, change of direction and strength – as well as more basic things like speed, distance and heart rate.

Someone’s going to have to take a bit of time off work though! You’ll be looking at a one- or two-day installation on-site with the team, where a sports scientist would set you up with the software.

Nominate a number cruncher

All the raw data you collect is then put through algorithms that provide position-specific and sport-specific data output to a laptop. Many of Catapult’s Premier League and NFL clients hire someone specifically to analyse the massed data. Do any of your team-mates work in IT or accountancy?

Tackle number crunching

Now you’ve selected your data analyst, you’ll want to start them out on the simpler metrics. Everyone understands distance, for instance (probably the easiest way to understand how hard an athlete has worked). From there you can look at speed. Combine the two and you’ll have a fuller picture of how much of a shift Dean and Dave have really put in (hangovers notwithstanding).

Beyond this, you can start looking at how quickly you and your team-mates accelerate (not very, probably), and the effect of deceleration on your intensity afterwards. Deceleration is usually the most damaging in terms of soft-tissue injuries.
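As a purely illustrative sketch (not Catapult’s algorithms), the basic metrics above can be derived from timestamped GPS fixes: distance from consecutive positions, speed from distance over time, and acceleration or deceleration from the change in speed between intervals. The sample coordinates below are made up.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000  # Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical samples: (seconds, latitude, longitude) at 1 Hz.
samples = [
    (0, 51.55600, -0.27950),
    (1, 51.55605, -0.27945),
    (2, 51.55612, -0.27938),
]

total_distance = 0.0
prev_speed = None
for (t0, la0, lo0), (t1, la1, lo1) in zip(samples, samples[1:]):
    d = haversine_m(la0, lo0, la1, lo1)
    speed = d / (t1 - t0)                          # m/s over the interval
    total_distance += d
    if prev_speed is not None:
        accel = (speed - prev_speed) / (t1 - t0)   # m/s^2; negative = deceleration
        print(f"t={t1}s speed={speed:.2f} m/s accel={accel:+.2f} m/s^2")
    prev_speed = speed

print(f"Total distance: {total_distance:.1f} m")
```

Real systems fuse GPS with accelerometer and gyroscope data to smooth out noise, but the basic distance, speed and acceleration calculations follow this pattern.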

Higher still up the spectrum of metrics, you can encounter a patented algorithm called inertial movement analysis, used to capture ‘micro-movements’ and the like.

Pay up!

Don’t worry, you won’t have to actually buy all the gear (which could well mean your entire team re-mortgaging its homes): most of Catapult’s clients rent the devices…

However, you’ll still be looking at about £100 per unit/player per month, a fairly hefty additional outlay.

Surge up your Sunday League!

However, if you are all sufficiently well-heeled (not to mention obsessively competitive) to make that kind of investment, the benefits could be significant.

Florida State Football’s Jimbo Fisher recently credited the technology with reducing injuries by 88 per cent. It’s one of a number of similarly impressive success stories: reducing injuries is Catapult’s biggest selling point, meaning player shortages and hastily arranged stand-ins could be a thing of the past.

Of course if the costs sound a bit too steep, don’t worry: although the timescale is up in the air, Catapult is ultimately planning to head down the consumer route.

The day could yet come, in the not too distant future, when every team is smart!

How will the wearables market continue to change and evolve? Jim Harper (Director of Sales and Business Development, Bittium) will be leading a discussion on this very topic at this year’s Internet of Things World Europe (Maritim Pro Arte, Berlin, 6th – 7th October 2015).

A tale of two ITs

Werner Knoblich, senior vice president and general manager of Red Hat in EMEA

Gartner calls it ‘bimodal IT’; Ovum calls it ‘multimodal IT’; IDC calls it the ‘third platform’. Whatever you choose to call it, they are all euphemisms for the same evolution in IT: a shift towards deploying more user-centric, mobile-friendly software and services that are more scalable, flexible and easily integrated than the previous generation of IT services. And while the cloud has evolved as an essential delivery mechanism for this next generation of services, it’s also prompting big changes in IT, says Werner Knoblich, senior vice president and general manager of Red Hat in EMEA.

“The challenge with cloud isn’t really a technology one,” Knoblich explains, “but the requirements of how IT needs to change in order to support these technologies and services. All of the goals, key metrics, ways of doing business with vendors and service providers have changed.”

Most of what Knoblich is saying may resonate with any large organisation managing a large legacy estate that wants to adopt more mobile and cloud services; the contrast between the ‘two ITs’ can be quite jarring.

The chief goal used to be reliability; now it’s agility. In the traditional world of IT the focus was on price for performance; now it’s about customer experience. In traditional IT the most common approach to development was the classic ‘waterfall’ approach – requirements, design, implementation, verification, maintenance; now it’s all about agile and continuous delivery.

Most assets requiring management were once physical; now they’re all virtualised machines and microservices. The applications being adopted today aren’t monolithic beasts as they were traditionally, but modular, cloud-native apps running in Linux containers or platforms like OpenStack (or both).

Not just the suppliers but also the way they are sourced has changed. In the traditional world, long-term, large-scale, multifaceted deals were the norm; now there are lots of young, small suppliers, contracted for short terms or on a pay-as-you-go basis.

“You really need a different kind of IT, and people who are very good in the traditional mode aren’t necessarily the ones that will be good in this new hybrid world,” he says. “It’s not just hybrid cloud but hybrid IT.”

The challenges are cultural, organisational and technical. According to the 2015 BCN Annual Industry Survey, which polled over 700 senior IT decision makers, over 67 per cent of enterprises plan to implement multiple cloud services over the next 18 months, but close to 70 per cent were worried about how those services would integrate with other cloud services, and 90 per cent were concerned about how they would integrate those cloud services with their legacy or on-premise services.

That said, open source technologies that also make use of open standards play a massive role in ensuring cloud-to-cloud and cloud-to-legacy integrations are achievable and, where possible, seamless – one of the main reasons why Linux containers are gaining so much traction and mind share today (workload portability). And open source technology is something Red Hat knows a thing or two about.
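As a small illustration of the workload-portability argument (my own example, using the community Docker SDK for Python rather than any Red Hat-specific tooling), the same container image and command run identically wherever a Docker-compatible daemon is available – a developer laptop, an on-premise host or a cloud VM – with only the connection target changing.

```python
import docker  # pip install docker; assumes a reachable Docker-compatible daemon

# from_env() picks up DOCKER_HOST and related settings, so the same script can
# target a local daemon, an on-premise server or a cloud VM without code changes.
client = docker.from_env()

# Pull and run an unmodified public image; the workload is identical on any host.
output = client.containers.run("alpine:3.19", ["echo", "portable workload"],
                               remove=True)
print(output.decode().strip())
```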

Beyond its long history in server and desktop operating systems (Red Hat Enterprise Linux) and middleware (JBoss), the company is a big sponsor and early backer of OpenStack, the increasingly popular cloud-building software built on a Linux foundation. It helped create an open source platform as a service, OpenShift. The company is also working on Atomic Host, an open source, container-optimised host built on a slimmed-down version of RHEL, with support for other open source container technologies including Kubernetes and Docker, the darlings of the container community.

“Our legacy in open source is extremely important and even more important in cloud than the traditional IT world,” Knoblich says.

“All of the innovation happening today in cloud is open source – think of Docker, OpenStack, Cloud Foundry, Kubernetes, and you can’t really think of one pure proprietary offering that can match these in terms of the pace of innovation and the rate at which new features are being added,” he explains.

But many companies, mostly the large supertankers, don’t yet see themselves as ready to embrace these new technologies and platforms – not just because they don’t have the type or volume of workloads to migrate, but because doing so requires a huge cultural and organisational shift. And cultural as well as organisational shifts are typically rife with political struggles, resentment and budgetary wrestling.

“You can’t just install OpenStack or Dockerise your applications and ‘boom’, you’re ready for cloud – it just doesn’t work that way. Many of the companies that are successfully embracing these platforms and digitising their organisations set up a second IT department that operates in parallel to the traditional one, and can only seed out the processes and practices – and technologies – they’ve embraced when critical mass is reached. Unless that happens, they risk getting stuck back in the traditional IT mentality.”

An effective open hybrid approach ultimately means not only embracing the open source solutions and technologies, but recognising that some large, monolithic applications – say, Cobol-based mainframe apps – won’t make it into this new world; neither will the processes needed to maintain those systems.

“For some industries, like insurance for instance, there isn’t a recognised need to ditch those systems and processes. But for others, particularly those being heavily disrupted, that’s not the case. Look at Volkswagen. They don’t just see Mercedes, BMW and Tesla as competitors – they see Google and Apple as competitors too because the car becomes a technology platform for services.”

“No industry is secure from disruption, particularly from players that scarcely existed a few years ago, which is why IT will be multi-modal for many, many years to come,” he concludes.

This interview was developed in partnership with Red Hat

Jennifer Kent of Parks Associates on IoT and healthcare

BCN spoke to Jennifer Kent, Director of Research Quality and Product Development at Parks Associates, on the anticipated impact IoT will have on healthcare.

BCN: Can you give us a sense of how big an impact the Internet of Things could have on health in the coming years?

Jennifer Kent: Because the healthcare space has been slow to digitize records and processes, the IoT stands to disrupt healthcare to an even greater extent than will be experienced in other industries. Health systems are just now getting to a point where medical record digitization and electronic communication are resulting in organizational efficiencies.

The wave of new data that will result from the mass connection of medical and consumer health devices to the Internet, as part of the IoT, will give care providers real insight for the first time into patients’ behaviour outside of the office. Parks Associates estimates that the average consumer spends less than 1% of their time interacting with health care providers in care facilities. The rest of consumers’ lives are lived at home and on-the-go, engaging with their families, cooking and eating food, consuming entertainment, exercising, and managing their work lives – all of which impact their health status. The IoT can help care providers bridge the gap with their patients, and can potentially provide insight into the sources of motivation and types of care plans that are most effective for specific individuals.

 

Do you see IoT healthcare as an essentially self-enclosed ecosystem, or one that will touch consumer IoT?

IoT healthcare will absolutely touch consumer IoT, at least in healthcare markets where consumers have some responsibility for healthcare costs, or in markets that tie provider payments to patients’ actual health outcomes. In either scenario, the consumer is motivated to take a greater interest in their own self-care, driving up connected health device and application use. While user-generated data from consumer IoT devices will be less clinically accurate or reliable, this great flood of data still has the potential to result in better outcomes, and health industry players will have an interest in integrating that data with data produced via IoT healthcare sources.

 

Medical data is very well protected – and quite rightly – but how big a challenge is this to the development of effective medical IoT, which after all depends on the ability to effectively share information?

All healthcare markets must have clear regulations that govern health data protection, so that all players can ensure their IoT programs comply with those regulations. Care providers’ liability concerns, along with the investments in infrastructure that are necessary to protect such data, have created an opportunity for vendors to create solutions that take on the burden of regulatory compliance for their clients. Furthermore, application and device developers on the consumer IoT side that border very closely on the medical IoT vertical can seek regulatory approval – even if it is not required – as a means of attaining premium brand status with consumers and differentiation from the many untested consumer-facing applications on the market.

Finally, consumers can be motivated to permit their medical data to be shared, for the right incentive. Parks Associates data show that no less than 40% of medical device users in the U.S. would share the data from their devices in order to identify and resolve device problems. About a third of medical device users in the US would share data from their devices for a discount on health insurance premiums. Effective incentives will vary, depending on each market’s healthcare system, but care providers, device manufacturers, app developers and others who come into contact with medical device data should investigate whether potential obstacles related to data protection could be circumvented by incentivizing device end-users to permit data sharing.

 

You’re going to be at Internet of Things World Europe (5–7 October 2015, Maritim proArte, Berlin). What are you looking forward to discussing there and learning about?

While connected devices have been around for decades, the concept of the Internet of Things – in which connected devices communicate in a meaningful way across silos – is at a very early and formative stage. Industry executives can learn much from their peers and from visionary thinkers at this stage, before winners and losers have been decided, business plans hardened, and innovation slowed. The conversations among attendees at events like Internet of Things World Europe can shape the future and practical implementation of the IoT. I look forward to learning how industry leaders are applying lessons learned from early initiatives across markets and solution types.

Enabling smart cities with IoT

The Internet of Things will help make cities smarter

The population of London swells by an additional 10,000 a month, a tendency replicated in cities across the world. To an extent such growth reflects the planet’s burgeoning wider population, and there is even an interesting argument that cities are an efficient way of providing large numbers of people with the resources they need. What we know as the ‘smart city’ may well prove to be the necessary means of managing this latest shift at scale.

Justin Anderson is sympathetic to this assessment. As the chairman of Flexeye, vice chair of techUK’s Internet of Things Council, and a leader of government-funded tech consortium Hypercat and London regeneration project Old Oak Common, he is uniquely positioned to comment on the technological development of our urban spaces.

“We are in an early stage of this next period of the evolution of the way that cities are designed and managed,” he says. “The funny thing about ‘smart’ of course, is that if you look back 5000 years, and someone suggested running water would be a good idea, that would be pretty smart at the time. ‘Smart’ is something that’s always just over the horizon, and we’re just going through another phase of what’s just over the horizon.”

There’s some irony in the fact that Anderson finds himself so profoundly involved in laying the foundations for smarter cities, since architects have been in his family for 400 years, and he intended to go in that direction himself before falling into the study of mathematics – which then led to a career in technology.

“There are lots of similarities between the two,” he says. “Stitching lots of complex things together and being able to visualise how the whole thing might be before it exists. And of course the smart city is a world comprised of both the physical and virtual aspects of infrastructure, both of which need to be tied together to be able to manage cities in a more efficient way.”

Like many of the great urban developments, the smart city is mostly going to be something invisible, something we quickly take for granted.

“We’re not necessarily all going to be directly feeling the might of processing power all around us. I think we’ll see a lot of investment on the industrial level coming into the city that’s going to be invisible to the citizen, but ultimately they will benefit because it’s a bit more friction taken out of their world. It’ll be a gradual evolution of things just working better – and that will have a knock on effect of not having to queue for so long, and life just being a little bit easier.”

There are, however, other ways connectivity could change urban life in the coming years: by reversing the process of urban alienation, and allowing online communities to come together and effect real world change.

“If you can engage citizens in part of that process as a way that they live, and make sure that they feel fully accountable for what the city might be, then there’s also a lot of additional satisfaction that could come from being a part of that city, rather than just a pawn in a larger environment where you really have no say and just have to do what you’ve got to do. Look at something like air quality – to be able to start to get that united force and be able to then put more pressure upon the city authorities to do something about it. Local planning policy is absolutely core in all of this.”

Anderson sees technology as an operative part of the trend towards devolution, with cities and their citizens gaining more and more control of their destiny. “If you build that sort of nuclear community around issues rather than just around streets or neighbourhoods, you get new levels of engagement.” For such changes to be effected, however, there is plenty that still needs doing at the technical level – a message Anderson will be bringing to the Internet of Things World Europe event in Berlin this October.

“I think the most important thing right now is that technology companies come together to agree on a common urban platform that is interoperable, allowing for different components to be used appropriately, and that we don’t find ourselves locked into large systems that mean cities can’t evolve in a flexible and fluid way in the future. We have to have that flexibility designed into this next stage of evolution, and that flexibility comes from interoperability. My drive is to make sure everyone is a believer in interoperability.”