Category Archives: News & Analysis

Eagle Eye Networks CEO Dean Drako acquires cloud access firm for $50m

Eagle Eye’s CEO and former Barracuda Networks president is buying a cloud access and control company for $50m

Dean Drako, president and chief executive of Eagle Eye Networks and former president of Barracuda Networks, has wholly acquired Brivo, a cloud access control firm, for $50m.

Brivo said its cloud-based access control system, a centralised platform for managing and securing building entry, currently serves more than 6 million users and 100,000 access points.

The acquisition will give Eagle Eye, a specialist in cloud-based video surveillance technology, a flexible access control tool to couple with its current offerings, Drako said.

“My goal was to acquire the physical security industry’s best access control system,” Drako explained.

“Brivo’s true cloud architecture and open API approach put it a generation ahead of other access control systems. Cloud solutions provide exceptional benefits and Brivo is clearly the market and technology leader. Brivo is also committed to strong, long-standing relationships with its channel partners, which I believe is the best strategy for delivering extremely high customer satisfaction.”

Though the two companies will remain separate, Drako will serve as Brivo’s chairman, while Steve Van Till will continue as Brivo’s president and chief executive.

He said Eagle Eye will work to integrate Brivo’s flagship solution, Brivo OnAir, with its cloud security camera system, which will help deliver video verification and natively viewable and searchable video.

“We are extremely excited that Dean Drako has acquired Brivo and is serving as chairman. In addition to Dean’s experience founding and leading Barracuda Networks to be a multi-billion dollar company, he has grown his latest company, Eagle Eye Networks, to be the technology leader in cloud video surveillance,” Van Till said.

“We both share the vision of delivering the tremendous advantages of cloud-based systems to our customers,” he added.

Cisco beefs up Intercloud strategy

Cisco is bolstering its Intercloud programme by partnering with ISVs

Cisco is bolstering its Intercloud strategy this week, announcing partnerships with 35 ISVs, which the company said would help create and offer a wider range of cloud services based on Cisco infrastructure.

The company also announced an Intercloud Marketplace, due to go live in autumn this year, which will be populated with apps certified to run on Intercloud infrastructure.

Cisco said it is partnering with a range of commercial app and development companies including Apprenda, ActiveState and Docker to make their cloud developer environments work on the Intercloud platform.

The company is also partnering with big data solution providers including MapR, Hortonworks, Cloudera and the Apache Hadoop Community to offer hybrid cloud big data implementation support. Additionally, it said it would expose APIs to enable software-based control of networking and security, a move it claims will help developers create Internet of Things services more effectively.

Cisco’s Intercloud strategy has been something of a slow burner, even by Cisco’s own estimates. The company has about 100 Intercloud customers and 65 partners globally, though last month Cisco chairman and chief executive John Chambers said the programme would pick up pace as it moved into “phase 2” of the Intercloud strategy, which is what the Marketplace is all about.

“The pieces that we were missing was how do you go into this new environment where each of these “public clouds in clouds” are separate? And you have to be on different vendors or different companies’ tech to have the ability to go into it. So what we’re looking at first is an architecture and it cements our relationships in service providers. And then it really comes through to how you monetise it over time,” he said at the time.

“This will just take time to monetize, but the effect we see indirectly is already huge when you talk about a Deutsche Telekom or a Telstra and our relationships with those.”

Citizens Bank signs 5-year managed services deal with IBM

Citizens Bank has tapped IBM for a managed services deal

Citizens Bank is moving its back-end technology infrastructure to a managed services environment following the signing of a five-year IT services agreement with IBM.

Using a hybrid IT approach, IBM will optimise the bank’s existing IT infrastructure by integrating automation and predictive analytics technologies to standardise and streamline many of its internal IT systems and processes, including core banking applications, branch operations and online and mobile banking.

“Information technology plays a key role in our ability to anticipate and meet the needs of every customer, across every channel,” said Ken Starkey, chief technology officer, infrastructure services, Citizens Bank. “This agreement with IBM will provide immediate access to new technologies and capabilities, enabling us to create greater efficiencies in support of Citizens’ growth objectives.”

Under the contract, IBM will operate Citizens’ existing and future IT systems located in the bank’s data centres in Rhode Island and North Carolina. The bank already uses IBM systems and technologies. IBM also will support Citizens’ voice and data networks and provide IT support for all Citizens colleagues.

Philip Guido, general manager, IBM Global Technology Services, North America, said: “This is part of a multi-stage transformation of Citizens’ IT environment that lays the foundation for integrating additional IBM solutions in the future, making the bank more agile and responsive to the growing needs of its customers.”

AWS announces huge solar project following criticisms of its green cred

AWS announced a large solar project, part of its commitment to powering all of its global infrastructure with renewables

Amazon announced this week that it has teamed up with Community Energy to build and operate an 80 megawatt (MW) solar farm in Virginia, which the companies claim to be the largest solar farm in the state.

The announcement comes just one day after an environmental advocacy group hit out at AWS over its carbon footprint and energy reporting practices.

The companies said the solar farm, to be named the Amazon Solar Farm US East, will start generating approximately 170,000 megawatt hours (MWh) of solar power annually as early as October 2016 – roughly equivalent to the energy used to power some 15,000 US homes for a year.
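
The announcement leaves the per-home equivalence implicit; a quick back-of-the-envelope check of how the quoted figures relate is sketched below (the per-home consumption and capacity-factor numbers are derived here, not stated by Amazon or Community Energy).

```python
# Rough sanity check on the quoted figures; the derived numbers are ours, not Amazon's.
annual_output_mwh = 170_000   # projected annual generation, MWh
homes_powered = 15_000        # homes the announcement says this equates to
capacity_mw = 80              # nameplate capacity of the solar farm, MW

# Implied annual consumption per home (typical US household use is ~10-11 MWh/year)
mwh_per_home = annual_output_mwh / homes_powered
print(f"Implied consumption per home: {mwh_per_home:.1f} MWh/year")   # ~11.3

# Implied capacity factor of the farm (8,760 hours in a year)
capacity_factor = annual_output_mwh / (capacity_mw * 8760)
print(f"Implied capacity factor: {capacity_factor:.0%}")              # ~24%
```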

Amazon said the power purchasing agreement (PPA) is part of its long-term goal announced last year of powering all of its datacentre infrastructure using 100 per cent renewables. It said as of April this year about a quarter of its infrastructure is powered by renewables.

“We continue to make significant progress towards our long-term commitment to power the global AWS infrastructure with 100 percent renewable energy,” said Jerry Hunter, vice president of infrastructure at Amazon Web Services. “Amazon Solar Farm US East – the second PPA that will serve both existing and planned AWS datacenters in the central and eastern US – has the added benefit of working to increase the availability of renewable energy in the Commonwealth of Virginia.”

Community Energy chief executive Brent Alderfer said: “We are pleased to work with Amazon Web Services to build the largest solar farm in Virginia and one of the largest east of the Mississippi. This project, which wouldn’t have been possible without AWS’ leadership, helps accelerate the commercialization and deployment of solar photovoltaic (PV) technologies at scale in Virginia.”

Earlier this week Green America, a US-based environmental advocacy group, said Amazon is far behind other datacentre operators – including some of its large competitors like Google, Microsoft, Apple and Facebook – in terms of its renewable energy use and reporting practices. Google and Apple have been particularly strong in using or generating renewable energy to power their datacentres, with Apple committed to a number of large solar projects globally.

The group launched a campaign this week aimed at convincing Amazon to alter its environmental strategy. It is calling on Amazon to commit to full use of renewables for its datacentres by 2020 (AWS hasn’t set a target date publicly); submit accurate and complete data to the Carbon Disclosure Project; and issue an annual sustainability report.

ITU to address IoT standardisation for smart cities

The ITU is coordinating standardisation efforts on IoT technologies for smart cities

The ITU has set up a working group to help set out standardisation requirements for Internet of Things (IoT) technologies in smart cities deployments. The ITU said the next five years will be crucial for IoT standards development.

The new ITU-T Study Group for “IoT and its applications, including smart cities and communities” will help coordinate international standards development on the use of IoT and M2M technologies to address urban development challenges.

The organisation said it is “well positioned” to help governments and the private sector capitalise on the potential for IoT to transform city infrastructure through smart buildings, transportation systems modernisation, smart energy and water networks.

ITU secretary-general Houlin Zhao said the new ITU-T Study Group, which will initially be hosted in Singapore, will bring together a diverse selection of stakeholders including ITU’s technical experts as well as national and metropolitan administrations responsible for urban development.

Chaesub Lee, director of the ITU Telecommunication Standardization Bureau said: “The coming five years will be crucial in ensuring that IoT technologies meet their potential. ITU-T is very active in IoT standardization, and we aim to assist cities around the world in creating the conditions necessary for IoT technologies to prove their worth in addressing urban-development challenges.”

Given the nascent state of IoT there are precious few standards. One consequence of developing them within the context of smart cities is that the number of systems requiring integration, and the diversity of their requirements, is significant: while that makes securing consensus on standards more complex, it could also make the resulting standards more robust and more widely applicable to a range of use cases.

Microsoft buys BlueStripe to bolster hybrid cloud monitoring

BlueStripe will bolster Microsoft’s hybrid cloud performance monitoring capabilities

Microsoft has acquired BlueStripe, a vendor of infrastructure monitoring solutions for applications distributed across multiple datacentres and cloud platforms.

BlueStripe and Microsoft have worked together closely over the years, and the company’s solutions are often used to extend Microsoft System Center for application infrastructure performance monitoring across a combination of Microsoft and non-Microsoft stacks.

Microsoft said the acquisition would give a strong boost to its hybrid cloud strategy.

“More and more, businesses are turning to applications to drive innovation and gain competitive advantage. To support this explosion of applications, agile cloud development environments and more componentized architectures and micro-services are growing exponentially. Applications and data are being spread across on-premises datacentres and public, private and hosted clouds as a result. While IT teams may not operate all of the infrastructure where the applications run, they still require visibility and the ability to manage these applications in order to support and protect the business,” explained Mike Neil, general manager of enterprise cloud at Microsoft.

“BlueStripe’s enterprise-class solution enables IT professionals to move from monitoring IT at the infrastructure level to gaining visibility into applications at the transaction level. The technology discovers and maps applications and dependencies, pinpoints problems for faster resolution, and helps maintain SLAs across complex underlying infrastructure. By mapping the structure of distributed applications, BlueStripe also helps in the process of updating applications to more modern platforms and migrating to the cloud.”
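
BlueStripe’s actual implementation isn’t described in any detail here; as a purely illustrative sketch of the dependency-mapping idea Neil outlines, the snippet below builds an application dependency map from observed service-to-service connections and flags the slow ones (all service names, latencies and thresholds are hypothetical).

```python
from collections import defaultdict

# Hypothetical observed connections: (source service, destination service, latency in ms)
observed = [
    ("web-frontend", "order-api", 12.0),
    ("order-api", "payments-db", 48.0),
    ("order-api", "inventory-api", 9.5),
    ("inventory-api", "payments-db", 51.0),
]

# Map each service to the services it depends on
dependencies = defaultdict(set)
for src, dst, _ in observed:
    dependencies[src].add(dst)

# Flag dependencies whose observed latency breaches an (arbitrary) SLA threshold
SLA_MS = 40.0
slow_edges = [(s, d, ms) for s, d, ms in observed if ms > SLA_MS]

print(dict(dependencies))
print("Possible bottlenecks:", slow_edges)  # both slow edges point at payments-db
```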

Microsoft said it will integrate BlueStripe’s solution into its infrastructure and ops management offerings like System Center and Operations Management Suite (OMS), but as part of the acquisition BlueStripe will cease selling its solution in the near term.

The challenge with application monitoring is that it has always involved a tradeoff between flexibility and granularity. That said, BlueStripe’s FactFinder offering could give Microsoft – a strong proponent of hybrid cloud – a big boost with enterprises looking to extend their Microsoft stacks for application deployments, or vice versa.

IBM releases tool to advance cloud app development on OpenPower, OpenStack

IBM has announced a service to help others develop and test OpenPower-based apps

IBM announced the launch of SuperVessel, an open access cloud service developed by the company’s China-based research outfit and designed for developing and testing cloud services based on the OpenPower architecture.

The service, developed by Beijing’s IBM Research and IBM Systems Labs, is open to business partners, application developers and university students for testing and piloting emerging applications that use deep analytics, machine learning and the Internet of Things.

The cloud service is based on the latest Power8 processors (with FPGAs and GPU-based acceleration) and uses OpenStack to orchestrate the underlying cloud resources. The SuperVessel service is sliced up into various “labs”, each focusing on a specific area, and is initially launching with four: Big Data, Internet of Things, Acceleration and Virtualization.
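
The article doesn’t describe SuperVessel’s user-facing interface, but since the service is orchestrated with OpenStack, requesting an instance would presumably go through the standard OpenStack client libraries. A minimal sketch using openstacksdk is below; the cloud name, image, flavour and network are placeholders, not documented SuperVessel resources.

```python
import openstack

# Connect using credentials defined in clouds.yaml under a hypothetical "supervessel" entry
conn = openstack.connect(cloud="supervessel")

# Look up placeholder resources; a Power8-based cloud would expose its own
# ppc64le images and flavours under whatever names its operators chose.
image = conn.compute.find_image("ubuntu-ppc64le")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# Request the instance and wait until it is running
server = conn.compute.create_server(
    name="supervessel-test",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)  # expect "ACTIVE" once the instance is ready
```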

“With the SuperVessel open computing platform, students can experience cutting-edge technologies and turn their fancy ideas into reality. It also helps make our teaching content closer to real life,” said Tsinghua University faculty member Wei Xu. “We want to make better use of SuperVessel in many areas, such as on-line education.”

Terri Virnig, IBM Vice President of Power Ecosystem and Strategy said: “SuperVessel is a significant contribution by IBM Research and Development to OpenPower. Combining advanced technologies from IBM R&D labs and business partners, SuperVessel is becoming the industry’s leading OpenPower research and development environment. It is a way IBM commits to and supports OpenPower ecosystem development, talent growth and research innovation.”

The move is part of a broader effort to cultivate mindshare around IBM’s Power architecture, which it open sourced two years ago; IBM is positioning the architecture as an ideal platform for cloud and big data services. Since the launch of the OpenPower Foundation, the group tasked with coordinating development around Power, it has also been actively working with vendors and cloud service providers to combine a range of open source technologies – for instance, getting OpenStack to work on OpenPower and Open Compute-based hardware.

Toronto real-estate developer, Honeywell partner on IoT for facilities

Honeywell is working with Menkes to deploy IoT systems and analytics in its facilities

Toronto-based real-estate firm Menkes Developments and industrial electronics giant Honeywell have announced a deal that will see the two combine Internet of Things sensors and cloud services to reduce energy and operational costs at one of the real-estate firm’s properties.

The companies will initially deploy a smart facilities system designed by Honeywell at the Telus Tower in Toronto, as well as a cloud-based analytics platform used to monitor and analyse facility performance data and offer recommendations to improve operations.

“We are committed to pushing the boundaries of smart buildings, identifying new methods to leverage connectivity and improve our facilities,” said Sonya Buikema, vice president, commercial property management, Menkes.

“Honeywell’s technology and services complement our philosophy, and expand the ways in which we’re able to drive performance and better serve our customers,” Buikema said.

The companies said they want to use the technologies to explore new opportunities for improving efficiency and environmental impact.

“Even the most advanced facilities will experience a gradual decrease in performance over time, and it can be difficult to identify and address those issues before they negatively impact the bottom line,” said John Rajchert, president of Honeywell Building Solutions. “Honeywell has the tools and expertise to make it easier for companies to not only know what is happening in their facilities, but to also take the appropriate actions to keep them operating at a high level.”
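
Honeywell hasn’t published the analytics behind that claim, but the kind of gradual performance decline Rajchert describes can be illustrated with a very simple baseline comparison; the sketch below uses made-up cooling-efficiency readings and an arbitrary 5 per cent threshold.

```python
import statistics

# Hypothetical daily cooling-energy intensity readings (kWh per degree of cooling)
baseline = [3.1, 3.0, 3.2, 3.1, 3.0, 3.1, 3.2]   # commissioning period
recent   = [3.4, 3.5, 3.3, 3.6, 3.5, 3.4, 3.6]   # latest week

# Relative drift of recent readings against the baseline
drift = (statistics.mean(recent) - statistics.mean(baseline)) / statistics.mean(baseline)

# Flag the facility for review if efficiency has degraded by more than 5%
if drift > 0.05:
    print(f"Cooling efficiency degraded by {drift:.0%} vs baseline - schedule an inspection")
```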

While facilities automation has been around for some time, only recently have vendors like Honeywell begun selling insights-as-a-service through cloud-based analytics platforms.

Facilities, particularly mixed-use spaces, are complex to manage and often include a range of different systems (e.g. motion sensors, temperature control, air filtration, lighting and security), so squeezing operational improvements and insights out of the growing interconnection between these technologies and services is likely to play a big role in real estate moving forward.

Docker startup Rancher Labs secures $10m for container-based IaaS software

Rancher is developing container-based IaaS software

Rancher Labs, a startup developing Linux container-based infrastructure-as-a-service software, has secured $10m in a series A round of funding, which it said would be used to bolster its engineering and development efforts.

Rancher Labs was started by CloudStack founder Sheng Liang and Shannon Williams, a founder of Cloud.com (which was acquired by Citrix in 2011), and offers infrastructure services purpose-built for containers. It has also developed a lightweight Linux OS called RancherOS. “We wanted to run Docker directly on top of the Linux Kernel, and have all user-space Linux services be distributed as Docker containers. By doing this, there would be no need to use a separate software package distribution mechanism for RancherOS itself,” the company explained.
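
RancherOS’s internals aren’t detailed in the article, but the pattern the quote describes – distributing and running services as Docker containers rather than as host packages – can be illustrated with the standard Docker SDK for Python; the image, name and port mapping below are chosen purely for illustration.

```python
import docker

client = docker.from_env()

# Run a service (here an off-the-shelf nginx image) as a container rather than
# as a package installed on the host, so the container lifecycle tooling
# handles distribution, upgrades and restarts.
service = client.containers.run(
    "nginx:alpine",
    name="example-web",
    ports={"80/tcp": 8080},              # expose container port 80 on host port 8080
    restart_policy={"Name": "always"},   # keep the service running across restarts
    detach=True,
)

print(service.name, service.status)
```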

The company said that as technologies like Docker become more popular in production, so do the surrounding requirements: networking (such as load balancing), monitoring, storage management and the other infrastructure services needed to stand up a reliable cloud workload.

“Containers are quickly becoming the de-facto large-scale production platform for application deployment,” Liang said.

“Our goal is to provide organizations with the tools needed to take full advantage of container technology. By developing storage and networking software purpose-built for containers, we are providing organizations with the best possible experience for running Docker in production.”

The company’s goal is to develop all of the infrastructure services necessary to give enterprises confidence in deploying containers in production at scale, and it plans to use the funding to accelerate its development and engineering efforts.

Jishnu Bhattacharjee, managing director at Nexus Venture Partners, one of the company’s investors said: “Software containers have dramatically changed the way DevOps teams work, becoming an essential piece of today’s IT infrastructure. The team at Rancher Labs recognized the technology’s potential early on, along with the pain points associated with it.”

While the technologies and tools to support Linux containers are still young, there seems to be growing momentum behind using them for production deployments; one of the things that makes them so attractive in the cloud world is their scalability, and the ability to drop them into almost any environment – whether bare metal or on a hypervisor.

Green America hits out at Amazon for its dirty cloud

Amazon has committed to bolstering its use of renewables, but Green America thinks it needs to go further

Not-for-profit environmental advocacy group Green America has launched a campaign to try to convince Amazon to reduce its carbon footprint and catch up with the green credentials of other large cloud incumbents.

Green America said Amazon is behind other datacentre operators – including some of its large competitors like Google, Apple and Facebook – in terms of its renewable energy use and reporting practices.

“Every day, tens of millions of consumers are watching movies, reading news articles, and posting to social media sites that all use Amazon Web Services. What they don’t realize is that by using Amazon Web Services they are contributing to climate change,” said Green America’s campaigns director Elizabeth O’Connell.

“Amazon needs to take action now to increase its use of renewables to 100 percent by 2020, so that consumers won’t have to choose between using the internet and protecting the planet,” O’Connell said.

Executive co-director Todd Larsen also commented on Amazon’s green cred: “Amazon lags behind its competitors, such as Google and Microsoft, in using renewable energy for its cloud-based computer servers. Unlike most of its competitors, it also fails to publish a corporate responsibility or sustainability report, and it fails to disclose its emissions and impacts to the Carbon Disclosure Project.”

Amazon has recently taken strides towards making its datacentres greener. In November last year the company committed to using 100 per cent renewable energy for its global infrastructure, bowing to pressure from organisations like Greenpeace which have previously criticised the company’s reporting practices around its carbon footprint. But organisations like Green America still believe the company is way off the mark on its commitment.

Green America’s campaign is calling on Amazon to commit to full use of renewables for its datacentres by 2020; submit accurate and complete data to the Carbon Disclosure Project; and issue an annual sustainability report.

An Amazon spokesperson told BCN that the company and its customers are already showing environmental leadership by adopting cloud services in the first place.

“AWS customers have already shown environmental leadership by moving to cloud computing, which is inherently more environmentally friendly than traditional computing. Any analysis on the climate impact of a datacentre should take into consideration resource utilization and energy efficiency, in addition to power mix,” the spokesperson said.

“On average, AWS customers use 77 per cent fewer servers, 84 per cent less power, and utilize a 28 per cent cleaner power mix, for a total reduction in carbon emissions of 88 per cent from using the AWS Cloud instead of operating their own datacentres. We believe that our focus on resource utilization and energy efficiency, combined with our increasing use of renewable energy, will help our customers achieve their carbon reduction and sustainability goals. We will continue to provide updates of our progress on our AWS & Sustainable Energy page,” she added.
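
Amazon doesn’t show its working for the 88 per cent figure, but the quoted numbers do combine to roughly that value if the power saving and the cleaner power mix are treated as independent, multiplicative factors; that interpretation is ours, not AWS’s.

```python
# Figures quoted by the AWS spokesperson
power_reduction = 0.84        # 84 per cent less power consumed
power_mix_improvement = 0.28  # 28 per cent cleaner power mix

# Assume (our interpretation) the two factors are independent and multiply
remaining_emissions = (1 - power_reduction) * (1 - power_mix_improvement)
total_reduction = 1 - remaining_emissions

print(f"Implied carbon reduction: {total_reduction:.0%}")  # ~88%, matching the quoted figure
```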