Category archive: Opinion

Why visibility and control are critical for container security

The steady flow of reported vulnerabilities in open source components, such as Heartbleed, Shellshock and Poodle, is making organisations focus increasingly on the security of the software they build. As organisations turn to containers to improve application delivery and agility, the security ramifications of the containers and their contents are coming under increased scrutiny.

An overview of today’s container security initiatives 

Container providers such as Docker and Red Hat are moving aggressively to reassure the marketplace about container security. Their focus so far has been on cryptographically signing and verifying the code and software versions running in Docker users’ software infrastructure, to protect users from malicious backdoors included in shared application images and other potential security threats.

However, this method is coming under scrutiny, as it covers only one aspect of container security: it says nothing about whether the software stacks and application portfolios inside a container are free of known, exploitable versions of open source code.

Without open source hygiene, Docker Content Trust will only ever ensure that Docker images contain the exact same bits that developers originally put there, including any vulnerabilities present in the open source components. It therefore amounts to only a partial solution.

A more holistic approach to container security

Knowing that the container is free of vulnerabilities at the time of initial build and deployment is necessary, but far from sufficient. New vulnerabilities are constantly being discovered, and these often affect older versions of open source components. What’s needed, therefore, is informed open source management: choosing components carefully at selection time and remaining vigilant for newly disclosed vulnerabilities afterwards.

Moreover, the security risk posed by a container also depends on the sensitivity of the data accessed through it, as well as where the container is deployed. For example, whether the container sits on an internal network behind a firewall or is internet-facing will affect the level of risk.

In this context, an internet-facing container is subject to a range of attacks, including cross-site scripting, SQL injection and denial-of-service, that containers deployed on an internal network behind a firewall wouldn’t be exposed to.

For this reason, having visibility into the code inside containers is a critical element of container security, even aside from the issue of security of the containers themselves.

It’s critical to develop robust processes for determining: what open source software resides in, or is deployed along with, an application; where this open source software is located in build trees and system architectures; whether the code exhibits known security vulnerabilities; and whether an accurate open source risk profile exists.
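To make that process concrete, here is a minimal sketch, in Python, of the kind of check such a process might automate: comparing an image’s open source inventory against a list of known-vulnerable versions. The inventory, the vulnerability feed and the audit function are all illustrative; a real pipeline would extract the inventory from the build and query a vulnerability database such as the NVD.

```python
# Minimal sketch: flag known-vulnerable open source components in a
# container image's inventory. All data and names here are illustrative.

# Hypothetical inventory extracted from a container image's build tree
image_components = {
    "openssl": "1.0.1f",   # affected by Heartbleed
    "bash": "4.3",         # affected by Shellshock
    "nginx": "1.9.4",
}

# Hypothetical feed of known-vulnerable versions (a real tool would
# query a vulnerability database such as the NVD)
known_vulnerable = {
    ("openssl", "1.0.1f"): "CVE-2014-0160 (Heartbleed)",
    ("bash", "4.3"): "CVE-2014-6271 (Shellshock)",
}

def audit(components):
    """Return a list of (component, version, advisory) findings."""
    findings = []
    for name, version in components.items():
        advisory = known_vulnerable.get((name, version))
        if advisory:
            findings.append((name, version, advisory))
    return findings

for name, version, advisory in audit(image_components):
    print(f"{name} {version}: {advisory}")
```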

Will security concerns slow container adoption? – The industry analysts’ perspective

Enterprise organisations today are embracing containers because of their proven benefits: improved application scalability, fewer deployment errors, faster time to market and simplified application management. However, just as organisations have moved over the years from viewing open source as a curiosity to understanding its business necessity, containers seem to have reached a similar tipping point. The question now seems to be shifting towards whether security concerns about containers will inhibit further adoption. Industry analysts differ in their assessment of this.

Drawing a parallel to the rapid adoption of virtualisation technologies even before the establishment of security requirements, Dave Bartoletti, Principal Analyst at Forrester Research, believes security concerns won’t significantly slow container adoption. “With virtualization, people deployed anyway, even when security and compliance hadn’t caught up yet, and I think we’ll see a lot of the same with Docker,” according to Bartoletti.

Meanwhile, Adrian Sanabria, Senior Security Analyst at 451 Research, believes enterprises will give containers a wide berth until security standards are identified and established. “The reality is that security is still a barrier today, and some companies won’t go near containers until there are certain standards in place,” he explains.

To overcome these concerns, organisations are best served to take advantage of the automated tools available to gain control over all the elements of their software infrastructure, including containers.

Ultimately, the presence of vulnerabilities in all types of software is inevitable, and open source is no exception. Detection and remediation of vulnerabilities are increasingly seen as a security imperative and a key part of a strong application security strategy.

 

Written by Bill Ledingham, EVP of Engineering and Chief Technology Officer, Black Duck Software.

Preparing for ‘Bring Your Own Cloud’

In 2015, experts expect to see more sync-and-share platforms like Google Drive, SharePoint and Dropbox offer unlimited storage to users at no cost – and an increasing number of employees will no doubt take advantage of these simple-to-use consumer platforms to store corporate documents, whether they are sanctioned by IT or not, turning companies into ‘Bring Your Own Cloud’ free-for-alls.

How can IT leaders prepare for this trend in enterprise?
Firstly, it’s important to realise it is going to happen. This isn’t something IT managers can stop or block – so businesses need to accept reality and plan for it.

IT leaders should consider what’s really important to manage, and select a solution that solves the problem they need to solve. Opting for huge solutions that do everything isn’t always the best option, so teams should identify whether they need to protect data or devices.

Planning how to communicate the new solution to users is something to consider early, and partnering with the business units to deliver the message in terms that are important to them is an invaluable part of the process. The days of IT deploying solutions and expecting usage are long gone.

Using a two-pronged approach is recommended – IT managers should utilise both internal marketing and education to spread awareness about the benefits of the solution, and implement policies to set standards on what is required. Often end users aren’t aware that their organisation even has data security policies, and education can go a long way to getting compliance without being punitive.

What are the benefits of allowing employees to use these services to store corporate information?

The key benefits are mobility, increased productivity, improved user experience, and greater employee satisfaction and control.

What are the biggest implications for security?

The biggest implications for security involve the loss of valuable intellectual property and internal information such as financials and HR data, as well as data leakage, leading to privacy violations and loss of sensitive customer data. In addition, there are potential violations of regulatory policies for healthcare, financial services, and similar industries.

How can companies manage and control the use of these cloud storage apps when employees are using them in a BYOD environment?

In BYO use cases, companies should look for solutions that are focused on securing and managing data rather than devices. In a BYOD environment, IT managers can’t rely on the ability to lock down devices through traditional methods.

Instead, companies must be able to provide workspaces that have secure IT oversight, but also integrate with what is in the current environment.

Often the current environment has data in many places: file servers, private clouds, public clouds, etc. Choosing a data management solution that integrates with where the company’s data lives today will be more suitable than forcing data to be moved to a single location. This will reduce deployment time and give more flexibility later on to choose where to store the data.

How can organisations educate users and create suitable policies around the use of these tools?

Organisations should consider classifying corporate data. Does every piece of data need to be treated the same way?

Creating realistic policies that protect the company from real harm is essential, as is treating highly sensitive data differently from other data and training employees to know the difference. Teams will also find it useful to integrate data security essentials into regular organisational onboarding and training programmes, and to update them as policies evolve.

How can companies find the most suitable alternatives to the free unlimited cloud storage users are turning to, and how do you convince employees to use them over consumer options?

The best solutions balance user experience for end users with robust security, management, and audit controls on the IT side. From a user experience perspective, companies should choose a solution with broad platform adoption, especially for BYOD environments. From a security perspective, choosing a solution that is flexible enough to provide secure IT oversight and that integrates with what you have today will stand the company in good stead. The last thing IT managers want to do is to manage a huge data migration project just to get a data security solution off the ground.

How can companies get around the costs and resources needed to manage their own cloud storage solutions?
Again, flexibility is key here. The best solutions will be flexible enough to integrate with what you have today, but also will allow you to use lower-cost cloud storage when you are ready.

What’s the future of the market for consumer cloud storage – can we expect their use to continue with employees?

Cloud storage in general isn’t going anywhere. The benefits and economics are just too compelling for both consumers and organisations. However, there is and has always been a need to manage corporate data — wherever it resides — in a responsible way. The best way to do this is by using solutions that deliver workspaces that are secure, manageable, and integrated with what businesses and consumers have today.

 

Written by Chanel Chambers, Director of Product Marketing, ShareFile Enterprise, Citrix.

Game development and the cloud

BCN has partnered with the Cloud South East Asia event to interview some of its speakers. In this interview we speak to Sherman Chin, Founder & CIO of Sherman3D.

Cloud South East Asia: Please tell us more about Sherman3D and your role in the gaming industry.

Sherman Chin: I started game development during my college days with hobby projects. I then graduated with a BSc (Hons) in Computing from the University of Portsmouth, UK, and was the recipient of the 2002 International Game Developers Association scholarship. I formed Sherman3D shortly after and oversaw the entire game development pipeline. Though my experience is in programming, I am able to serve as a bridge between the technical and creative team members.

I worked on over 20 internationally recognized games including Scribblenauts published by Warner Bros. Interactive Entertainment, Nickelodeon Diego’s Build & Rescue published by 2K Play, and Moshi Monsters Moshling Zoo published by Activision. Sherman3D is the longest lasting Malaysian indie game development company incorporated since 2003. With Sherman3D, I am the first Malaysian to release a game on Steam, the largest digital distribution platform for games online, after being voted in by international players via the Steam Greenlight process.

Within the gaming industry, I also worked as a producer in Japan, as a project manager in Canada, and as a COO in Malaysia. With over 15 years of experience in the gaming industry, I am currently the external examiner for the games design course at LimKokWing University and a game industry consultant for the Gerson Lehrman Group providing advisory services for international investors.

How has technology such as cloud supported your growth?

One important aspect of cloud technology is how ubiquitous it is. It allows my international development team to work online from anywhere in the world. This has helped us tremendously as we move our development operations online. We have our documents edited and stored online, we have our project management online, we have our video conference sharing sessions online, and we even have our game sessions online.

These online activities are made possible with cloud technology. More directly related to our product, Alpha Kimori was initially coded as a 3D tech demo for the Butterfly.net supercomputing grid, which was showcased at the Electronic Entertainment Expo in 2003.

I continued work on Alpha Kimori as a 2D JRPG that was then featured on the OnLive cloud gaming service for PC, Mac, TV, and mobile. OnLive streamed our game on multiple platforms with minimal effort on our part. Thanks to OnLive, we reached a bigger audience before finally making it on to Steam via the Greenlight voting process by players who wanted to see Alpha Kimori on Steam.

Do you think cloud has an important role in the gaming industry and do providers give you enough support?

Yes, cloud does play an important role in the gaming industry and providers do give enough support. OnLive, for example, was extremely helpful. It was perfect for an asynchronous game such as Alpha Kimori, which has a turn-based battle system. Unfortunately, synchronous real-time games have a more difficult time adapting to the slower response rate from streaming cloud servers. To boost response time, servers have to be placed near the players, so depending on the location of the servers, a player’s mileage might vary.

As broadband penetration increases, this becomes less of an issue, so early implementations of Cloud gaming might simply have been ahead of their time. I do see a bright future though. We just have to match the optimum sort of games to Cloud gaming as the technology progresses.

What will you be discussing at Cloud South East Asia?

At Cloud South East Asia, I will be discussing how asynchronous Japanese Role Playing Game elements are suitable for Cloud gaming as they require less of a response time compared to synchronous real time battle games. I will also do a post mortem of Alpha Kimori on the Cloud gaming platforms it was on.

Cloud technology was not always a bed of roses for us and we had to adapt as there were not many precedents. In the end though, each cloud gaming platform that Alpha Kimori was on helped us to advance our game content further. I will also talk about the auxiliary resources on the Cloud for game design such as the amazing suite of free technology provided by Google. I will also talk a bit about the sales of Alpha Kimori on Steam and how Cloud technology affects it with features such as Steam Cards.

Why do you think it is an important industry event and who do you look forward to meeting and hearing more from?

Having its roots in Japanese Role Playing Games, Alpha Kimori was selected by the Tokyo Game Show (TGS) committee for its Indie Game Area in September, 2015. Sherman3D is once again honoured to be the only Malaysian indie team sponsored by TGS and as such, we view TGS as an important industry event for us. It will help us penetrate the Japanese market and we look forward to meeting and hearing from potential Japanese business partners willing to help us push the Alpha Kimori intellectual property in Japan.

What is next for Sherman3D?

Sherman3D will go on developing the Alpha Kimori series and licensing our Alpha Kimori intellectual property to other developers worldwide. We want to see our Alpha Kimori universe and brand grow. We are also working on the Alpha Kimori comic and anime series. Ultimately, Sherman3D will spread the Great Doubt philosophy in Alpha Kimori where it is not about the past or the future but our experience in the current moment that counts. Only from now do we see our past and future shaped by our own perspective because the truth is relative to our human senses. Attaching too much to anything causes us suffering and accepting the moment gives us true freedom as it allows us to love without inhibitions. Sherman3D will continue to spread the Great Doubt philosophy in its endeavours in the entertainment industry.

Learn more about how the cloud is developing in South East Asia by attending Cloud South East Asia on 7th & 8th October 2015 at Connexion @ Nexus, KL, Malaysia.


Semantic technology: is it the next big thing or just another buzzword?

Most buzzwords circulating right now describe very attention-grabbing products: virtual reality headsets, smart watches, internet-connected toasters. Big Data is the prime example of this: many firms are marketing themselves to be associated with this term and its technologies while it’s ‘of the moment’, but are they really innovating or simply adding some marketing hype to their existing technology? Just how ‘big’ is their Big Data?

On the surface of it, one would expect semantic technology to face similar problems; however, the underlying technology requires a much more subtle approach. The technology is at its best when it’s transparent, built into a set of tools to analyse, categorise and retrieve content and data before it’s even displayed to the end user. While this means it may not experience as much short-term media buzz, it is profoundly changing the way we use the internet and interact with content and data.

This is much bigger than Big Data. But what is semantic technology? Broadly speaking, semantic technologies encode meaning into content and data to enable a computer system to possess human-like understanding and reasoning. There are a number of different approaches to semantic technology, but for the purposes of this article we’ll focus on ‘Linked Data’. In general terms this means creating links between data points within documents and other forms of data containers, rather than between the documents themselves. It is in many ways similar to what Tim Berners-Lee did in creating the standards by which we link documents, just on a more granular scale.

Existing text analysis techniques can identify entities within documents. For example, in the sentence “Haruhiko Kuroda, governor of Bank of Japan, announced 0.1 percent growth,” ‘Haruhiko Kuroda’ and ‘Bank of Japan’ are both entities, and they are ‘tagged’ as such using specialised markup language. These tags are simply a way of highlighting that the text has some significance; it remains with the human user to understand what the tags mean.
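To see what that looks like in practice, here is a minimal sketch using the open source spaCy library (an illustrative choice; the article doesn’t prescribe a particular tool):

```python
# Minimal named-entity tagging sketch using spaCy (illustrative choice;
# requires: pip install spacy && python -m spacy download en_core_web_sm)
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Haruhiko Kuroda, governor of Bank of Japan, announced 0.1 percent growth.")

# Each recognised entity carries a label such as PERSON or ORG;
# understanding what those labels *mean* is still left to the human.
for ent in doc.ents:
    print(ent.text, ent.label_)
# Expected output (model-dependent):
#   Haruhiko Kuroda PERSON
#   Bank of Japan ORG
#   0.1 percent PERCENT
```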

 

[Figure 1: tagging]

Once tagged, entities can then be recognised and have information from various sources associated with them. Groundbreaking? Not really. It’s easy to tag content such that the system knows that “Haruhiko Kuroda” is a type of ‘person’; however, this still requires human input.

[Figure 2: named entity recognition]

Where semantics gets more interesting is in the representation and analysis of the relationships between these entities. Using the same example, the system is able to create a formal, machine-readable relationship between Haruhiko Kuroda, his role as the governor, and the Bank of Japan.

[Figure 3: relation extraction]

For this to happen, the pre-existing environment must be defined. For the system to understand that ‘governor’ is a ‘job’ which exists within the entity ‘Bank of Japan’, a rule must exist which states this as an abstraction. This set of rules is called an ontology.

Think of an ontology as the rule-book: it describes the world in which the source material exists. If semantic technology was used in the context of pharmaceuticals, the ontology would be full of information about classifications of diseases, disorders, body systems and their relationships to each other. If the same technology was used in the context of the football World Cup, the ontology would contain information about footballers, managers, teams and the relationships between those entities.
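As an illustration of what those machine-readable relationships and a fragment of an ontology might look like, here is a small sketch using the rdflib Python library; the example.org namespace and the property names are invented for the example:

```python
# Minimal Linked Data sketch with rdflib (pip install rdflib).
# The example.org namespace and property names are illustrative only.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/")
g = Graph()
g.bind("ex", EX)

# A tiny "ontology" fragment: governor is a kind of job
g.add((EX.Governor, RDF.type, RDFS.Class))
g.add((EX.Governor, RDFS.subClassOf, EX.Job))

# Facts extracted from the sentence
g.add((EX.HaruhikoKuroda, RDF.type, EX.Person))
g.add((EX.BankOfJapan, RDF.type, EX.Organisation))
g.add((EX.HaruhikoKuroda, EX.holdsRole, EX.Governor))
g.add((EX.HaruhikoKuroda, EX.governorOf, EX.BankOfJapan))

print(g.serialize(format="turtle"))
```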

What happens when we put this all together? We can begin to infer relationships between entities in a system that have not been directly linked by human action.

[Figure 4: inference]

An example: a visitor arrives on the website of a newspaper and would like information about bank governors in Asia. Semantic technology allows the website to return a much more sophisticated set of results from the initial search query. Because the system has an understanding of the relationships defining bank governors generally (via the ontology), it is able to leverage the entire database of published text content in a more sophisticated way, capturing relationships that would have been overlooked by keyword matching alone. The result is that the user is provided with content more closely aligned to what they are already reading.

Read the sentence and answer the question: “What is a ‘Haruhiko Kuroda’?” As a human the answer is obvious. He is several things: human, male, and a governor of the Bank of Japan. This is the type of analytical thought process, this ability to assign traits to entities and then use these traits to infer relationships between new entities, that has so far eluded computer systems. The technology allows the inference of relationships that are not specifically stated within the source material: because the system knows that Haruhiko Kuroda is governor of Bank of Japan, it is able to infer that he works with other employees of the Bank of Japan, that he lives in Tokyo, which is in Japan, which is a set of islands in the Pacific.
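Continuing the illustrative rdflib sketch from above, a SPARQL query can surface exactly this kind of unstated relationship by chaining the stated facts together (again, all names are invented):

```python
# Inference-by-chaining sketch: no triple says Kuroda is in Japan,
# but the query derives it from governorOf + locatedIn + partOf.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.HaruhikoKuroda, EX.governorOf, EX.BankOfJapan))
g.add((EX.BankOfJapan, EX.locatedIn, EX.Tokyo))
g.add((EX.Tokyo, EX.partOf, EX.Japan))

query = """
PREFIX ex: <http://example.org/>
SELECT ?person ?place WHERE {
    ?person ex:governorOf ?org .
    ?org ex:locatedIn/ex:partOf* ?place .
}
"""
for person, place in g.query(query):
    print(person, place)   # yields Tokyo and, via partOf, Japan
```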

Companies such as the BBC, which Ontotext has worked with, are sitting on more text data than they have ever experienced before. This is hardly unique to the publishing industry, either. According to Eric Schmidt, former Google CEO and executive chairman of Alphabet, every two days we create as much information as was generated from the dawn of civilisation up until 2003 – and he said that in 2010. Five years later, businesses of all sizes are waking up to this fact: they must invest in the infrastructure to fully take advantage of their own data.

You may not be aware of it, but you are already using semantic technology every day. Take Google search as an example: when you input a search term, for example ‘Bulgaria’, two columns appear. On the left are the actual search results, and on the right are semantic search results: information about the country’s flag, capital, currency and other information that is pulled from various sources based on semantic inference.

Written by Jarred McGinnis, UK managing consultant at Ontotext

How IoT Security could change infrastructure forever

On September 22nd and 23rd, the first-ever dedicated IoT Security conference and exhibition will take place in Boston.

While at first glance this may appear to concern a specific and rather specialized area, the relationship of the Internet of Things to the broad issue of human security may well prove much more far-reaching and fundamental.

After all, the development of the Internet itself was driven by a Cold War desire to create resilient computer networks that could withstand a nuclear attack. This threat inspired a whole new architecture for sharing and protecting information – one that was intentionally decentralized.

History suggests that precaution can be a key driver of technological innovation. In changing things to protect them, we often open up unforeseen new opportunities.

Which is why, if we return to 2015, there is something fascinating in seeing the same decentralized architectures applied to real-world infrastructures in the name of collective safety.

“When you apply this kind of Internet-type architecture to core infrastructure – whether it’s water or energy or transportation – these systems start looking a lot more like the Internet,” says John Miri, Chief Administrative Officer at the Lower Colorado River Authority (LCRA) and a speaker at this month’s Boston event. “You start to see water systems, flood data systems and, hopefully, electric grids that are less centralized, more resilient and more difficult to disrupt.”

The LCRA is an 80-year-old institution with roots in the Great Depression, entrusted with providing reliable water, flood protection and electricity to Central Texas and beyond. The area LCRA serves covers a number of the fastest growing cities in the United States, meaning LCRA faces some pretty substantial demands on its infrastructure.

“Providing the water and power to support growing communities and a growing business and industrial base is no small task,” Miri says. Indeed, LCRA has broken ground on a new quarter-of-a-billion-dollar reservoir, the region’s first new water supply in decades.

Many of these additional demands make safety and security more important than ever.

“LCRA is now the second largest electric transmission utility in Texas. Our high tension transmission lines go across a large portion of the state. Protecting the electric grid is a pretty hot topic,” Miri says.

The threats range from what Miri calls “bad actors” to less hypothetical dangers to the infrastructure.

“When you have a flood, we may have to intentionally shut down electric substations. Everyone knows electricity and water don’t mix – but even having the situational awareness to know that water is approaching a substation is very important to us in keeping the lights on. Using these kinds of smart networks to get a better picture of the threats and dangers to the power grid helps us protect it rather than just saying ‘build more,’” Miri says.

Similarly, a vast number of sensors throughout its Hydromet network enable LCRA to better monitor water levels – and to effectively manage floods.

“By adopting a new, more open, shared technology approach, we could expand the infrastructure we have for flood data collection at a 90% lower cost than if we had done it a traditional way. The technology actually opens up our infrastructure to a very wide region that never considered it before. We can offer a level of flood monitoring across a wider region and extend it to rural and agricultural communities and other areas that might not have the resources to gain access to this technology.”

Looking ahead, Miri says, there are new opportunities to apply this decentralized, Internet-style architecture to other projects.

“I think when you look forward 10, 15 or 20 years, the whole infrastructure may work differently. It opens up new possibilities and business models that we didn’t have before. For instance, Texas is on the coast. As with any coastal area, we spend time thinking about desalination. Some of the work we’ve been doing on the Internet of Things is making people think, maybe we don’t need a couple of giant desalination plants – which has been the approach in Australia and Israel – but a number of smaller plants that are networked together, and share the water more efficiently. In the longer term, IoT may actually change the infrastructure itself, which would be very exciting.”

It could be interesting to one day look back at this month’s inaugural IoT Security event and see how many of the topics discussed went on to fundamentally evolve and affect their wider respective domains.

How the cloud enables the Bistip social marketplace to scale efficiently

BCN has partnered with the Cloud South East Asia event to interview some of its speakers. In this interview we speak to Rohit Kanwar, CEO of Indonesian social marketplace service Bistip.

Cloud South East Asia: Who are Bistip and how are you shaking up the Indonesian market?

Rohit Kanwar: Bistip is a peer-to-peer marketplace for social delivery, through which item seekers and travellers are connected. Bistip travellers can post their trips on the platform, visible to everyone, and item seekers offer them extra money for bringing them their desired items. Currently Bistip has close to 35,000 customers and more than 100,000 web visits per month.

We are analogous, in the logistics industry, to Uber or Airbnb. Social couriers existed in most Asian countries before, but they were limited to close friends and families. However, with the adoption of smartphones and cloud-based technologies it’s faster and easier to scale and roll out services to more people now.

Indonesians love buying high-value goods, which are often expensive to purchase locally; with the help of Bistip they are now accustomed to getting “anything from anywhere globally” within Indonesia, at affordable prices. Our mission is to reduce overseas travel costs by providing travellers with extra money. We are adding 1,000 customers per month, with revenue of over US$250K, and plan to achieve 1 million registered customers by 2017.

How is technology helping you grow and reach new customers?

Our customer acquisition strategy is 100% digitally focussed, and we are using online advertising tools to reach out to new customers. We actively use digital technology and analytics to reach our target audience and demographic for each product. We also use social media listening technology to understand customer sentiment about our products and services. Modifying products based on customer insight helps us reach new customers and serve existing members better.

What role does cloud computing play in your business?

All our web and app based platforms run on cloud technology. Its subscription-based, pay-as-you-go model helps us scale up capacity in a cost-effective way. Our cloud vendors provide us with features such as managed services, web performance dashboards and analytics-integrated platforms, which enable us to focus on our business KPIs instead of day-to-day network operations. Cloud-based technology is a boon to start-ups as it reduces high capex spending on IT infrastructure.

How do you think established, global technology vendors are supporting start-up companies in Indonesia?

Indonesia currently has 70 million smartphones and more than 100 million internet users. Jakarta generates more tweets than any other city in the world. There is a huge paradigm shift among technology vendors towards Indonesia due to the growing, successful start-up ecosystem in the country. Global technology vendors such as Google, Cloudera, IBM and Microsoft are showing a keen interest in understanding the business requirements of start-ups.

Tech vendors have dedicated support teams assisting start-up companies with solving their problems. Global technology vendors regularly arrange boot camps, hackathons and networking seminars in the country in order to support start-ups with fundraising and other mentorship activities. A few technology vendors have venture funds to support start-ups as well. Overall, the atmosphere and ecosystem are developing faster than ever before in Indonesia.

What is next for Bistip?

Bistip is coming up with a mobile application, along with new features and services, to reduce the shipping time and provide more security to our customers. It will also have an integrated payment system. Bistip aims to achieve 1 million customers by end of 2017.

Bistip is also exploring the opportunity to start operations in other countries. We have signed two partnerships in Indonesia, with Uber and Aramex, and are likely to sign a few more with online travel portals and retail stores in the near future. We are also in final-stage talks with a VC about raising further funds. Hopefully our execution will follow our strategy and we can provide our travellers with more benefits and our buyers with a better service.

 

Learn more about how the cloud is developing in South East Asia by attending Cloud South East Asia on 7th & 8th October 2015 at Connexion @ Nexus, KL, Malaysia.


The FT discusses app and cloud strategy

BCN caught up with Christy Ross, Head of Application and Publishing Services, Technology at the Financial Times, to get some insight into the company’s approach to digital publishing, mobile apps and the cloud.

BCN: From a digital perspective, what is the FT currently focussed on?

Christy Ross: Print has been written off for years now, no pun intended, but we’re still doing very well. However, our main interest these days, rather than investing in the print product, is in looking at how we can identify and supply other means of content delivery and then actually make some money from that. Over the past few years we’ve done things to help us maintain a direct relationship with our subscribers, such as building our own web app rather than placing anything on the Apple Store or Play Store.

We have also done a lot around building APIs, so that we can provide distinct feeds of information to businesses, enabling them to come to us and say, ‘we are particularly interested in these areas of news, or analysis, and will pay you for that’. Of course we’ve also seen mobile take off massively, so probably over 50% of our new subscription revenue comes from mobile, rather than from the browser or tablets.

Why is the FT able to be so confident when asking for revenue from its readers?

We’ve been quite lucky. We were one of the first UK newspapers, if not the first, to introduce a paywall. A lot has been made of the fact that paywalls ‘don’t work’, and we’ve seen a number of other daily national papers put them up and pull them back down again, but we are very wedded to ours.

That’s because we are a niche product. If you like, we’re ‘the business world’s second newspaper.’ So in the UK someone will have, say, their Times or the Telegraph (or in the US they’ll have the Washington Post or the New York Times), but then their second newspaper will be the Financial Times. You can’t get our content anywhere else, particularly not the analysis we provide. While we are interested in breaking news and do follow it, our key differentiator is analysis and comment on what is going on in the world and what it means long term. People are able to use these insights in their business decisions – and people are prepared to pay for that.

Is there anything unique about your current mobile application in itself?

At the end of the day we are a content provider. It’s about getting the content out as quickly as we can, and providing the tools to our editorial users so they can concentrate on writing and not worry so much about layout. We’re doing a lot more around templating and metadata, and making our content much richer, so that, when a reader comes on, the related stories actually mean something to them, and it’s easier for them to navigate through our considerable archive on the same people and companies, and form a much more rounded opinion.

What about internal technical innovation?

We’ve built our own private cloud, and we’re also heavily investigating and starting to use AWS, so we’re doing a lot out there to support the public cloud. One of our strategy points is that for any new application or new functionality we look to bring online, we have to start by looking at the public cloud to see if we can host and provide it there, and there has to be a very good technical reason for not doing so. We’re pushing it much more that way.

We have also borrowed a concept from Netflix, their Chaos Monkey approach, where every now and then we deliberately break parts of our estate to see how resilient applications are, and to see how we can react to some of our applications not being available and what that means to our user base. Just a couple of weekends ago we completely turned off one of our UK data centres, where we’d put most of our publishing and membership applications in advance, to see what it did, and also to see whether we could bring up the applications in our other data centres – to see how long it took us and what it meant for things like our recovery time objectives.
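As a rough sketch of the idea rather than the FT’s actual tooling, a chaos experiment can be as simple as picking a random target from a service inventory and disabling it while the team watches recovery; the inventory, the stop_service placeholder and the dry-run default below are all hypothetical:

```python
# Minimal chaos-experiment sketch in the spirit of Netflix's Chaos Monkey.
# The inventory, stop_service and the dry-run default are all hypothetical.
import random
import time

INVENTORY = ["publishing-api", "membership-api", "search", "image-cache"]

def stop_service(name):
    """Placeholder: a real implementation would call an orchestration
    or cloud API to stop the chosen service instance."""
    print(f"[chaos] stopping {name}")

def chaos_round(dry_run=True):
    victim = random.choice(INVENTORY)
    if dry_run:
        print(f"[chaos] would stop {victim} (dry run)")
        return victim
    start = time.time()
    stop_service(victim)
    # ...observe dashboards, alerts and recovery time here...
    print(f"[chaos] {victim} stopped after {time.time() - start:.2f}s")
    return victim

if __name__ == "__main__":
    chaos_round()  # defaults to a dry run: prints the would-be victim only
```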

 

Christy Ross will be appearing at Apps World Europe (18-19 November, ExCeL, London)

Businesses are ready for cloud – but lack of transparency is limiting its usefulness

Despite common perceptions, cutting costs isn’t the primary reason businesses are choosing cloud these days. The other major advantages are the agility and scalability cloud brings, enabling organisations to quickly respond to business demand. The combination of benefits is driving both IT and lines of business to rely on cloud to serve as a foundation for innovation and enablement.

But the advantages of cloud cannot be fully harnessed if transparency into the environments is compromised. Clouds that limit visibility result in significant operational and financial issues, including performance problems or outages, challenges reporting to management, and unexpected bills. In fact, challenges with transparency restrict 63% of organizations from growing their cloud usage. That’s according to a recent global survey conducted by Forrester Consulting that we commissioned. The survey sought insights from 275 IT executives and decision makers who are experienced cloud customers.

When it comes to data about cloud environments, what are organisations looking for from their providers? Clearly security and compliance information is important. Worryingly, 39% of those surveyed said they lacked security data and 47% said they lacked compliance data. Not surprisingly, the majority said they needed on-demand access to necessary reports to make compliance and audit processes easier.

That said, on-demand reporting technology only goes so far, and many respondents wanted suggestions and/or support from experts on staff at the cloud provider. In light of evolving security risks and corporate compliance concerns – especially as lines of business adopt cloud without IT involvement – cloud providers need to simplify the process for ensuring advanced security and compliance in the cloud, not get in the way.

Beyond security and compliance, performance information, historical information and clear details about costs and upcoming bills are also key. Without this, businesses find it hard to plan for or meet the needs of their end users. It also makes it extremely difficult to budget properly.

Just like with their own servers, organisations need to understand the performance of a cloud service to get the most from it, whether that means making sure resources are running properly, anticipating potential issues or preventing wasteful “zombie virtual machines.” Due to a lack of transparency from their cloud providers, more than a third of the respondents in the survey ended up with bills they hadn’t expected and 39% found they were paying for resources they weren’t actually using.

Cloud customers can use data to make better purchasing decisions. Clear information from a cloud provider will help companies discover where they need more resources, or even where they can regain capacity and maximise their spend.

Once again though, beyond the on-demand data, customers require solid support to ensure they are getting what they need from cloud. In the survey, 60% of respondents said that problems with support were restricting their plans to increase their usage of cloud. Issues like slow response times, lack of human support, lack of expertise of the support personnel and higher-than-expected support costs started with the onboarding process and only continued. Aside from preventing customers from reaping the benefits of cloud, these issues leave businesses feeling that they’re seen more as a source of revenue than as a valued cloud customer.

When it comes down to it, cloud customers should not settle for cloud services that limit visibility into their environments. Compromises in transparency mean sacrificing the very agility, scalability and cost benefits that drive organisations to cloud in the first place. And beyond transparency, customers should not underestimate the human element of cloud. A cloud provider’s customer support plays a huge role in speeding return on cloud investment and, ultimately, in determining the success or failure of a cloud initiative.

As the Forrester study states, “Whether you are a first-time cloud user or looking to grow your cloud portfolio, our research shows that your chances of success are greater with a trusted cloud provider at your side — one that gives you the technology and experts to solve your challenges.”

You can read more about the survey findings in the study, “Is Your Cloud Provider Keeping Secrets? Demand Data Transparency, Compliance Expertise, and Human Support From Your Global Cloud Providers.”

Written by Dante Orsini, senior vice president, iland

Sixth-sensors: The future of the Internet of Things and the connected business

IT departments will soon have to worry about IoT

An IT admin walks into his cabin and instantly knows something is wrong. He does not even have to look at his dashboard to identify the problem. Instead, he heads straight to the server room to fix a server that is overheating because of a failed fan.

The IT admin does not have a sixth sense. He was alerted to the problem by an internet-enabled thermostat in the server room, which sensed the rise in temperature and automatically changed the lighting to alert him, through an internet-enabled lightbulb and his smart watch.

This is not the plot of a futuristic Sci-Fi movie. It is 2015 and just one example of how the Internet of Things (IoT) is already at work in business.

Smart living

Every few years, IT communities become awash with new buzzwords and trends that early adopters declare as the next big thing and sceptics decry as impractical and over-hyped. Over time, some fizzle out because of low industry acceptance, while others go on to really disrupt the industry.

From smart cars to watches and even homes, connected technologies are already changing consumer lives, fuelling growing expectations and apprehensions. Last year, the UK government demonstrated its belief in the future potential of the technology when it pledged £45m to develop the IoT, more than doubling the funds available to the UK technology firms developing everyday devices that can communicate over the internet.

In the consumer market, IoT technology is already being lapped up. Within just a few months of its launch, Apple claimed 75% of the smartwatch market. Self-driving cars, however, are yet to take to Britain’s roads. But with prototypes already being piloted and app developers racing to create everything from connected entertainment to automated piloting using GPS, once the infrastructure required to make smart cities a reality is sanctioned by local councils and city mayors, IoT could literally find itself in the driving seat.

Smart workplaces

Outside of very early prototype projects, IoT does not currently rank highly on the enterprise agenda, which typically runs a few years behind the general technology adoption cycle. However, in the not-too-distant future, smart devices will be the norm – IDC estimates the market will be worth $8.9 trillion by 2020, with 212 billion connected devices.

With the promise of enhanced business processes and intelligence, IoT is increasingly being touted as a holy amalgamation of big data, mobility and cloud technology. Despite this, in the short term at least, businesses will be reluctant to allow sensitive data to flow through such internet-enabled devices due to obvious security concerns. The exception is the large businesses that have already explored the potential of machine-to-machine connectivity in their industries, such as automotive and insurance.

Where smart devices are catching on in day-to-day business is in an entirely different function of operations: facilities. What if your management decides to install internet-enabled LED bulbs and thermostats? Will the IoT bring additional responsibilities on to the service desk? A definite yes.

Facilities need to be managed, and that requires a tool to manage them. That’s just the start. For example, each bulb in a smart, IoT-connected environment must be monitored and checked to confirm it is working.

Assuming there are over 100 such appliances in an office environment, consider all the IP addresses that will need to be allocated. A mesh network may also be required to handle that allocation, with each connected device joining an ad-hoc network.
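As a flavour of what that ongoing monitoring duty might look like, here is a minimal reachability sweep over a list of device addresses; the device names, addresses and port are assumptions, and real smart-building gear would more likely be polled over SNMP, MQTT or a vendor API:

```python
# Minimal IoT reachability sweep: try a TCP connection to each device.
# Device names, addresses and the port are illustrative; real devices
# may expose SNMP, MQTT or vendor-specific APIs instead.
import socket

DEVICES = {
    "bulb-meeting-room-1": ("10.0.20.11", 80),
    "bulb-meeting-room-2": ("10.0.20.12", 80),
    "thermostat-server-room": ("10.0.20.50", 80),
}

def is_reachable(host, port, timeout=2.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in DEVICES.items():
    status = "up" if is_reachable(host, port) else "UNREACHABLE"
    print(f"{name:24s} {host}:{port}  {status}")
```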

As previously non-IT facilities start to be connected to the internet, it will be the job of the IT team to make sure they’re working well. As the volume of devices connected to the network grows, securing it will be even more challenging.

Of course, organisations can get around the security challenge by having a local network dedicated only for these devices, but the management of this expanded estate would nonetheless require a dedicated management tool.

Where large organisations have already invested in machine-to-machine (M2M) interactions and deployed connected devices in their facilities, the purpose has typically been to achieve automation and gather more intelligence.

As yet, smaller businesses do not have to worry about automation and logistics at such large scales, and it’s clear that the IoT is not going to transform their business operations overnight. However, before long, IoT will be something all IT departments should learn to manage – especially the new generation of IoT-connected devices which would traditionally have been classed and managed as non-IT assets.

Written by Pradyut Roy, product consultant, ManageEngine

Networking the Future with SDN

SDN will be vital for everything from monitoring to security

The nature of business is constantly changing; customers are demanding faster, more responsive services, and as a result, firms need to ensure that their backend technology is up to scratch. Increasing adoption of the cloud, mobility and big data technologies has encouraged the IT department to address how they can best support these developing trends whilst benefiting the customer and employee experience.

By looking at the heart of their infrastructure, the network, businesses can provide more agile and flexible IT services that can quickly meet user demand. So what improvements can be made to the network to satiate customer demand?

A software-defined network (SDN) is emerging as an obvious approach for technology decision makers, empowering them to provide a faster, more agile and scalable infrastructure. SDN is considered the next evolution of the network, providing a way for businesses to upgrade their networks through software rather than hardware – at a much lower cost.

SDN provides holistic network management and the ability to apply more granular unified security policies whilst reducing operational expenses such as the need to use specific vendor hardware and additional technology investments. In fact, IDC recently predicted that this market is set to grow from $960 million in 2014 to more than $8 billion by 2018, globally.

A Growing Trend

Datacentres and service providers have, until now, been the most common adopters of SDN solutions. As a result there has been a notable improvement in customer service and response times, with firms deploying new and innovative applications quicker than ever. In the past year, we have seen firms in sectors like healthcare and education take advantage of the technology. However, while SDN is developing quickly, it is still in its early stages, with several industries yet to consider it.

One effort to encourage more firms to recognise the benefits of SDN is the OpenDaylight Project, a collaborative open source project which aims to accelerate the adoption of SDN. Having already laid the foundation for SDN deployments today, it is considered to be the central control component and intelligence that allows customers to achieve network-wide objectives in a much more simplified fashion. The community, which includes more than a dozen vendors, is addressing the need for an open reference framework for programmability and control, enabling accelerated innovation for customers of any size and in any vertical.

Driving Business Insights

Looking ahead to the future of this new way of networking, there are a number of ways SDN can benefit the business. For example, SDN looks set to emerge as the new choice for deploying analytics in an economical and distributed way – in part due to the flexible nature of its infrastructure and the growing prominence of APIs – as an SDN-optimised network can be maintained and configured with fewer staff and at a lower cost.

Data analytics-as-a-service is being tipped as the vehicle that will make big data commoditised and consumable for enterprises in the coming years; analyst house IDC found that by 2017, 80% of the CIO’s time will be focused on analytics – and Gartner predicts that by 2017 most business users and analysts in organisations will have access to self-service tools to prepare data for analysis themselves.

However, the right network environment will be key so that data analytics has the right environment to flourish. An SDN implementation offers a more holistic approach to network management with the ability to apply more granular unified security policies while reducing operational expenses. Being able to manage the network centrally is a huge benefit for firms as they look to increase innovation and become more flexible in response to changing technology trends.

Using analytics in tandem with a newly optimized SDN can empower IT to quickly identify any bottlenecks or problems and also help to deploy the fixes. For example, if a firm notices that one of their applications is suffering from a slow response time and sees that part of the network is experiencing a lot of latency at the same time, it could immediately address the issue and re-route traffic to a stronger connection.
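A hedged sketch of that feedback loop is below: poll a latency metric and, past a threshold, ask the controller to steer the flow onto an alternative path. The controller URL, endpoint paths and payload are entirely hypothetical; a real deployment would use its controller’s actual northbound API (OpenDaylight, for instance, exposes RESTCONF).

```python
# Hypothetical SDN feedback-loop sketch: detect latency, reroute traffic.
# The controller URL, endpoint paths and payload shape are invented;
# consult your controller's northbound API documentation for real calls.
import requests

CONTROLLER = "http://sdn-controller.example.local:8181"  # hypothetical
LATENCY_THRESHOLD_MS = 150

def current_latency_ms(link_id):
    # Placeholder: a real system would read this from monitoring/telemetry
    resp = requests.get(f"{CONTROLLER}/metrics/links/{link_id}/latency")
    resp.raise_for_status()
    return resp.json()["latency_ms"]

def reroute(flow_id, new_path):
    # Push an updated path for the flow to the (hypothetical) controller
    resp = requests.put(f"{CONTROLLER}/flows/{flow_id}", json={"path": new_path})
    resp.raise_for_status()

if current_latency_ms("core-link-3") > LATENCY_THRESHOLD_MS:
    reroute("app-frontend-flow", ["edge-1", "agg-2", "core-2"])
```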

Realising the Potential of SDN

In order to implement an SDN solution, enterprises must first familiarise themselves with the technology and its components, and create cross-functional IT teams – spanning applications, security, systems and networking – to establish what they wish to achieve. Secondly, they should investigate best-of-breed vendors that can deliver innovative and reliable SDN solutions which leverage existing investments without the need to overhaul longstanding technologies. This way, businesses can reap the benefits of SDN while saving time as well as money, and mitigate risk.

Using analytics and SDN in combination is just one future possibility which could make it far simpler for businesses to deploy servers and support users in a more cost-effective and less resource-intensive way. It can also provide an overall improved user experience. With SDN offering the power to automate and make the network faster, and big data providing the brains behind the operation, it’s an exciting match that could be an enterprise game changer.

Written by Markus Nispel, vice president of solutions architecture and innovation at Extreme Networks