Category archive: IBM

IoT, big data used to make Lake George a model for water body preservation

IBM and the Jefferson Project have teamed up to use IoT and big data to monitor and analyse Lake George’s health

The Jefferson Project at Lake George, New York, a collaborative project between Rensselaer Polytechnic Institute, IBM Research and The FUND for Lake George is using Internet of Things sensors and big data analytics to create a model that can be used to help preserve a wide range of water sources.

Researchers have been monitoring the chemistry of algae in Lake George for the past 35 years to demonstrate how the lake is being affected by a range of inputs including pollution, tourism, and weather activity.

But the project has recently entered a new phase in which IBM and Jefferson Project researchers have deployed sophisticated, solar-powered underwater sonar-based sensors to measure a range of data.

Those sensors are linked to custom software running on an IBM Blue Gene/Q supercomputer and to IBM Smarter Planet software deployed in a datacentre on the Rensselaer campus.

Rick Relyea, director of the Jefferson Project at Lake George, said the IoT sensors have greatly improved data accuracy, which has allowed researchers to improve the models they generate from the patterns being observed.

“The Jefferson Project provides the unique opportunity for biologists and environmental scientists to work closely with engineers, physicists, computer scientists and meteorologists to understand large lakes at a level of detail and intensity that is simply unprecedented,” Relyea said.

“Together, we will make tremendous inroads into understanding how lakes naturally behave and how human activities alter biodiversity, the functioning of freshwater ecosystems, and overall water quality.”

The project researchers have already used the preliminary data generated by the sensors to create a range of models that can help predict the impact of weather events, salt run-off and heavy tourism on water circulation and the water body’s chemistry, an approach Relyea said could be applied to many other bodies of water in a bid to improve their health.
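To make the idea concrete, here is a deliberately simplified, hypothetical sketch of the kind of model described above: fitting lake chloride concentration against rainfall and road-salt application with ordinary least squares. The data, variable names and model form are illustrative only and are not the Jefferson Project’s actual models.

```python
# Illustrative sketch only: a toy linear model relating rainfall and road-salt
# application to lake chloride concentration. All values are hypothetical.
import numpy as np

# Hypothetical weekly observations
rainfall = np.array([12.0, 30.5, 8.2, 45.1, 22.3, 5.0])   # mm
road_salt = np.array([0.0, 4.5, 1.2, 6.0, 3.3, 0.0])      # tonnes applied
chloride = np.array([9.1, 14.8, 9.9, 18.2, 13.0, 8.7])    # mg/L measured in the lake

# Fit chloride ~ intercept + rainfall + road_salt with ordinary least squares
X = np.column_stack([np.ones_like(rainfall), rainfall, road_salt])
coeffs, *_ = np.linalg.lstsq(X, chloride, rcond=None)

# Predict chloride for a hypothetical storm week: 40 mm of rain, 5 t of salt
prediction = coeffs @ np.array([1.0, 40.0, 5.0])
print(f"Predicted chloride: {prediction:.1f} mg/L")
```

A real lake model would of course incorporate circulation, temperature and many more inputs; the point here is simply how sensor-derived observations can be turned into a predictive relationship.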

“The major threats to Lake George are many of the same threats to lakes around the world. Too many nutrients coming in from either fertiliser or sewage. We have contaminants coming in, those may be pesticides, it may be road salts. We have development coming in changing the habitat around the lakes, changing how much water run-off there is. And we have invasive species coming in.”

IBM and Box announce global enterprise cloud partnership

US enterprise tech giant IBM has revealed a new global partnership with cloud storage outfit Box to integrate their products and sell into vertically targeted enterprise markets.

More specifically, the strategic alliance will combine Box’s cloud collaboration platform with a number of IBM solutions, including analytics, cloud and security. Both companies will sell the combined products.

“Today’s digital enterprises demand world-class technologies that transform how their organizations operate both internally and externally,” said Aaron Levie, co-founder and CEO of Box. “This extensive alliance between Box and IBM opens up an exciting opportunity for both companies to reach new markets and deliver unified solutions and services that can redefine industries, advance secure collaboration and revolutionize enterprise mobility.”

“This partnership will transform the way work is done in industries and professions that shape our experience every day,” said Bob Picciano, SVP of IBM Analytics. “The impact will be felt by experts and professionals in industries such as healthcare, financial services, law, and engineering who are overwhelmed by today’s digital data and seek better solutions to manage large volumes of information more intelligently and securely. The integration of IBM and Box technologies, combined with our global cloud capabilities and the ability to enrich content with analytics, will help unlock actionable insights for use across the enterprise.”

The alliance will focus on three main areas: content management and social collaboration; enterprise cloud, security and consulting; and custom app development for industries. The general thread of the announcement seems to be a desire to bring cloud applications to regions and industries that are not currently making the most of them and is just the latest in a sequence of collaborations by both Box and IBM.

IBM stands up SoftLayer datacentre in Italy

IBM has added its first SoftLayer datacentre in Italy, and its 24th globally

IBM this week announced the launch of its first SoftLayer datacentre in Italy, located in Cornaredo, Milan.

The company said the datacentre in Milan, a growing hub for cloud services, will enable it to offer a local option for Italian businesses looking to deploy IBM cloud services. The facility, its 24th SoftLayer datacentre globally, has capacity for up to 11,000 servers, a power rating of 2.8 megawatts, and is designed to Tier III specification.
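As a rough back-of-the-envelope check using only the headline figures quoted above, that power rating works out at around 255 watts per server if the facility were fully populated:

```python
# Back-of-the-envelope check using only the figures quoted above
power_watts = 2.8e6   # 2.8 MW facility power rating
servers = 11_000      # stated maximum server capacity
print(round(power_watts / servers))  # ~255 W per server
```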

“The Italian IT sector is changing as startups and enterprises alike are increasingly turning to the cloud to optimize infrastructure, lower IT costs, create new revenue streams, and spur innovation,” said Marc Jones, chief technology officer for SoftLayer.

“The Milan datacentre extends the unique capabilities of our global platform by providing a fast, local onramp to the cloud. Customers have everything they need to quickly build out and test solutions that run the gamut from crunching big data to launching a mobile app globally,” Jones added.

Nicola Ciniero, general manager of IBM Italy, said: “This datacentre represents a financial and technological investment made by a multinational company that has faith in this country’s potential. Having an IBM Cloud presence in Italy will provide local businesses with the right foundation to innovate and thrive on a global level.”

The move comes just a couple of months after IBM added a second SoftLayer datacentre in the Netherlands.

Is force of habit defining your hybrid cloud destiny?

Experience breeds habit, which isn’t necessarily the best thing strategically

I’ve been playing something of a game over recent months. It’s a fun game for all the family and might be called “Guess my job”. It’s simple to play: all you need to do is ask someone the question, “What is a hybrid cloud?”, and based upon their answer you make your choice. Having played this for a while, I’m now pretty good at predicting someone’s viewpoint from their job role, or vice versa.

And the point of all this? Simply that people’s viewpoints are constrained by their experiences and what keeps them busy day-to-day, so they often miss an opportunity to do something different. For those people working day-to-day in a traditional IT department, keeping systems up and running, hybrid cloud is all about integrating an existing on-site system with an off-site cloud. This is a nice, easy one to grasp in principle, but the reality is somewhat harder to realize.

The idea of connecting an on-site System of Record to a cloud-based System of Engagement, pulling data from both to generate new insights, is conceptually well understood. That said, organisations making production use of such arrangements are few and far between. One example is combining historical customer transaction information with real-time geospatial, social and mobile data, then applying analytics to generate new insights that uncover new sales potential. For many organisations, though, the challenge of granting access to existing enterprise systems is simply too great. Security concerns, the ability to embrace the speed of change that is required, and the challenge of extracting the right data in a form that is immediately usable by the analytical tools may simply be hurdles too high. Indeed, many clients I’ve worked with have stated that they’re simply not going to do this. They understand the benefits, but the pain they see themselves having to go through to get them makes this unattractive to pursue.

So, if this story aligns with your view of hybrid cloud and you’ve already put it in the “too hard” box then what is your way forward?

For most organisations, no single cloud provider is going to provide all of the services they might want to consume. Implicitly then, if they need to bring data from these disparate cloud services together, there is a hybrid cloud use case: linking cloud to cloud. Even in the on-site to off-site hybrid cloud case there are real differences between a static relationship and dynamically bursting in and out of off-site capacity. Many organisations are looking to cloud as a more effective and agile platform for backup and archiving, or for disaster recovery. All of these are hybrid cloud use cases too, but if you’ve already written off ‘hybrid’ then you’re likely missing very real opportunities to do what is right for the business.

Regardless of the hybrid cloud use case, you need to keep in mind three key principles:

  1. Portability – the ability to run and consume services and data from wherever it is most appropriate to do so, be that cloud or non-cloud, on-site or off-site.
  2. Security, visibility and control – to be assured that end-to-end, regardless of where the ‘end’ is, you are running services in such a way that they are appropriately secure, well managed and their characteristics are well understood.
  3. Developer productivity – developers should be focused on solving business problems and not be constrained by needing to worry about how or when supporting infrastructure platforms are being deployed.  They should be able to consume and integrate services from many different sources to solve problems rather than having to create everything they need from scratch.

Business applications need to be portable so that they can both run on, and consume other services from, wherever is most appropriate. To do that, your developers need to be unconstrained by the underlying platform(s), so they can develop for any cloud or on-site IT platform. All this needs to be done in a way that allows enterprise controls, visibility and security to be extended to the cloud platforms being used.
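As a loose illustration of that portability principle (not any particular IBM or cloud-provider API), the sketch below shows an application consuming a storage interface whose concrete backend, an on-site file system or a cloud object store, is chosen by configuration rather than baked into the business logic. The CloudObjectStore class is a placeholder to be wired up to a real provider SDK.

```python
# Minimal portability sketch: business logic talks to a BlobStore interface;
# the deployment target is a configuration decision, not a code change.
import os
from abc import ABC, abstractmethod


class BlobStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalFileStore(BlobStore):
    """On-site backend: plain files on a local or NAS-mounted path."""

    def __init__(self, root: str):
        self.root = root
        os.makedirs(root, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        with open(os.path.join(self.root, key), "wb") as f:
            f.write(data)

    def get(self, key: str) -> bytes:
        with open(os.path.join(self.root, key), "rb") as f:
            return f.read()


class CloudObjectStore(BlobStore):
    """Off-site backend: placeholder to be wired to a real object-storage SDK."""

    def put(self, key: str, data: bytes) -> None:
        raise NotImplementedError("plug in a real object-storage client here")

    def get(self, key: str) -> bytes:
        raise NotImplementedError("plug in a real object-storage client here")


def make_store() -> BlobStore:
    # Select the backend from configuration (hypothetical environment variables)
    if os.environ.get("STORE_BACKEND", "local") == "cloud":
        return CloudObjectStore()
    return LocalFileStore(os.environ.get("STORE_ROOT", "./data"))


if __name__ == "__main__":
    store = make_store()
    store.put("report.txt", b"quarterly figures")
    print(store.get("report.txt"))
```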

If you come from that traditional IT department background, you’ll be familiar with the processes that are in place to ensure that systems are well managed, change is controlled and service levels are maintained. These processes may not be compatible with the ways that clouds open up new opportunities. This leads to the need to look at creating a “two-speed” IT organisation that provides the rigour needed for the Systems of Record while enabling rapid change and delivery in the Systems of Engagement space.

Cloud generates innovation and hence diversity. Economics, regulation and open communities drive standardisation, and it is this standardisation, open standards in particular, that facilitates integration in all of these hybrid cases.

So, ask yourself: with more than 65 per cent of enterprise IT organisations making commitments on hybrid cloud technologies before 2016, are you ensuring that your definitions – and hence your technology choices – reflect future opportunities rather than past prejudices?

Written by John Easton, IBM distinguished engineer and leading cloud advisor for Europe

IBM calls Apache Spark “most important new open source project in a decade”

IBM is throwing its weight behind Apache Spark in a bid to bolster its IoT strategy

IBM said it will throw its weight behind Apache Spark, an open source community developing a processing engine for large-scale datasets, putting thousands of internal developers to work on Spark-related projects and contributing its machine learning technology to the code ecosystem.

Spark, an Apache open source project born in 2009, is essentially an engine that can process vast amounts of data very quickly. It runs in Hadoop clusters through YARN or as a standalone deployment and can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat; it currently supports Scala, Java and Python.

It is designed to perform general data processing (like MapReduce), but one of the exciting things about Spark is that it can also handle newer workloads like streaming data, interactive queries and machine learning – making it a good match for Internet of Things applications, which is why IBM is so keen to go big on supporting the project.
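For readers unfamiliar with Spark, here is a minimal PySpark sketch of the general-purpose processing model described above. It assumes a local Spark installation with the pyspark package available; the input path is hypothetical, and the same engine can equally read from HDFS, HBase, Cassandra or Hive through the appropriate connectors.

```python
# Minimal PySpark sketch (assumes pyspark is installed and a local Spark runtime)
from pyspark import SparkContext

sc = SparkContext("local[*]", "word-count-sketch")

# Batch-style processing of a text file; the path is hypothetical
lines = sc.textFile("/tmp/sample.txt")
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

# Pull a small sample of results back to the driver
for word, count in counts.take(10):
    print(word, count)

sc.stop()
```

The same engine exposes streaming, SQL-style interactive queries and machine learning libraries on top of this core, which is what makes it attractive for IoT-style workloads.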

The company said the technology brings huge advances when processing massive datasets generated by Internet of Things devices, improving the performance of data-dependent apps.

“IBM has been a decades long leader in open source innovation. We believe strongly in the power of open source as the basis to build value for clients, and are fully committed to Spark as a foundational technology platform for accelerating innovation and driving analytics across every business in a fundamental way,” said Beth Smith, general manager, analytics platform, IBM Analytics.

“Our clients will benefit as we help them embrace Spark to advance their own data strategies to drive business transformation and competitive differentiation,” Smith said.

In addition to joining the Spark community, IBM said it would build the technology into the majority of its big data offerings and offer Spark-as-a-Service on Bluemix. It also said it will open source its IBM SystemML machine learning technology and collaborate with Databricks, a Spark-as-a-Service provider, to advance Spark’s machine learning capabilities.

IBM, Revive Vending set up London Café that brews analytics-based insights

Honest Café is an unmanned coffee shop powered by Watson analytics

IBM is working with Revive Vending on London-based cafés that use Watson analytics to help improve customer service and form a better understanding of its customers.

The ‘Honest Café’ outlets – there are three in London, with four more planned – are all unmanned; instead, the company deploys high-end vending machines at each location, which serve a mix of health-conscious snacks, juices, food and beverages.

The company is using Watson analytics to compensate for the lack of wait staff, with the cognitive computing platform deployed to trawl through sales data in a bid to unearth buying patterns and improve its marketing effectiveness.

“Because Honest is unmanned, it’s tough going observing what our customers are saying and doing in our cafes,” said Mark Summerill, head of product development at Honest Café. “We don’t know what our customers are into or what they look like. And as a start-up it’s crucial we know what sells and what areas we should push into.”

“We lacked an effective way of analysing the data,” Summerill said. “We don’t have dedicated people on this and the data is sometimes hard to pull together to form a picture.”

“Watson Analytics could help us make sure we offer the right customers the right drink, possibly their favorite drink,” he said.

The company can act on the buying patterns by launching promotional offers on various goods to help improve sales, and it can also correlate the data with social media information to better inform its understanding of Honest Café customers.

“We identified that people who buy as a social experience have different reasons than those who dash in and out grabbing just the one drink,” Summerill said. “They also have a different payment method, the time of the day differs and the day of the week. Knowing this, we can now correctly time promotions and give them an offer or introduce a product that is actually relevant to them.”

“Having direct access to Twitter data for insights into what our buyers are talking about is going to help augment our view and understanding of our customers even further,” he added.
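As a loose illustration of the kind of pattern-mining Summerill describes, done here with pandas on hypothetical transaction records rather than with Watson Analytics itself, the sketch below profiles basket size by hour of day and payment method:

```python
# Illustrative only: hypothetical vending-machine transactions, not Honest Café data
import pandas as pd

transactions = pd.DataFrame({
    "drink":   ["espresso", "green juice", "latte", "espresso", "smoothie"],
    "payment": ["card", "mobile", "card", "mobile", "card"],
    "hour":    [8, 13, 8, 17, 13],
    "items":   [1, 3, 1, 1, 2],
})

# Which time slots and payment methods go with larger, more "social" baskets?
profile = (transactions
           .groupby(["hour", "payment"])["items"]
           .agg(["count", "mean"])
           .rename(columns={"count": "purchases", "mean": "avg_basket_size"}))
print(profile)
```

Segments that stand out, say, larger lunchtime baskets paid by card, are the ones a retailer would then target with timed promotions.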

Citizens Bank signs 5-year managed services deal with IBM

Citizens Bank has tapped IBM for a managed services deal

Citizens Bank is moving its back-end technology infrastructure to a managed services environment following the signing of a five-year IT services agreement with IBM.

Using a hybrid IT approach, IBM will optimise the bank’s existing IT infrastructure by integrating automation and predictive analytics technologies to standardise and streamline many of its internal IT systems and processes, including core banking applications, branch operations and online and mobile banking.

“Information technology plays a key role in our ability to anticipate and meet the needs of every customer, across every channel,” said Ken Starkey, chief technology officer, infrastructure services, Citizens Bank. “This agreement with IBM will provide immediate access to new technologies and capabilities, enabling us to create greater efficiencies in support of Citizens’ growth objectives.”

Under the contract, IBM will operate Citizens’ existing and future IT systems located in the bank’s datacentres in Rhode Island and North Carolina. The bank already uses IBM systems and technologies. IBM will also support Citizens’ voice and data networks and provide IT support for all Citizens colleagues.

Philip Guido, general manager, IBM Global Technology Services, North America, said: “This is part of a multi-stage transformation of Citizens’ IT environment that lays the foundation for integrating additional IBM solutions in the future, making the bank more agile and responsive to the growing needs of its customers.”

IBM partners with SiCAD on cloud-based IoT silicon design and test service

IBM is working with SiCAD to offer an IoT silicon design service in the cloud

IBM is partnering with silicon design platform provider SiCAD to offer cloud-based high-performance services for electronic design automation (EDA), which the companies said can be used to design silicon for smartphones, wearables and Internet of Things devices.

The IBM EDA tools will be delivered via SoftLayer infrastructure on a pay-as-you-go basis, providing on-demand access to silicon design tooling.

IBM will initially deliver three key services: IBM Library Characterization, to create abstract electrical and timing models for chip designs; IBM Logic Verification, to simulate electronic systems and design languages; and IBM Spice, an electronic circuit simulator used to check design integrity and probe chip behaviour.

The company said deployment clusters will be segregated (both compute and networking), so clients won’t share any infrastructure.

“The proliferation of smartphones, tablets, wearable devices and Internet of Things (IoT) products has been the primary driver for increased demand for semiconductor chips. Companies are under pressure to design electronic systems faster, better and cheaper,” said Jai Iyer, founder and chief executive of SiCAD. “A time-based usage model on a need basis makes sense for this industry and will spur innovation in the industry while lowering capital and operations expenses.”

The companies said the partnership will help enable startups designing silicon for IoT applications, a venture made increasingly attractive not only by the explosion of activity around IoT and the need for purpose-built chip architectures, but also by the sheer size of the silicon land-grab in the sector.

IBM, UK gov ink £313m deal to promote big data, cognitive compute research

IBM and the UK government are pouring £313m into big data and cognitive computing R&D

The UK government has signed a deal with IBM that will see the two parties fund a series of initiatives aimed at expanding cognitive computing and big data research.

The £313m partnership will see the UK government commit £113m to expand the Hartree Centre at Daresbury, a publicly funded facility geared towards reducing the cost and improving the efficiency and user-friendliness of high performance computing and big data for research and development purposes.

IBM said it will further support the project with technology and onsite expertise worth up to £200m, including access to the company’s cognitive computing platform Watson. The company will also place 24 IBM researchers at the Centre to help commercialise any promising innovations developed there.

The organisations will also explore how to leverage OpenPower-based systems for high performance computing.

“We live in an information economy – from the smart devices we use every day to the super-computers that helped find the Higgs Boson, the power of advanced computing means we now have access to vast amounts of data,” said UK Minister for Universities and Science Jo Johnson.

“This partnership with IBM, which builds on our £113 million investment to expand the Hartree Centre, will help businesses make the best use of big data to develop better products and services that will boost productivity, drive growth and create jobs.”

David Stokes, chief executive for IBM in the UK and Ireland, said: “We’re at the dawn of a new era of cognitive computing, during which advanced data-centric computing models and open innovation approaches will allow technology to greatly augment decision-making capabilities for business and government.”

“The expansion of our collaboration with STFC builds upon Hartree’s successful engagement with industry and its record in commercialising technological developments, and provides a world-class environment using Watson and OpenPower technologies to extend the boundaries of Big Data and cognitive computing,” he added.

Cisco, IBM spend big in OpenStack-focused land grab

Cisco and IBM have both acquired OpenStack vendors this week

Cisco and IBM have both signed deals to acquire OpenStack vendors this week, with Cisco acquiring Piston Cloud Computing, a firm specialising in OpenStack-based private cloud software, and IBM buying up managed private cloud provider Blue Box.

Piston offers what it calls Piston CloudOS, a souped-up version of OpenStack for private clouds, alongside custom cluster management and monitoring software and APIs. Underneath that sits a custom Linux micro-OS containing all of the code necessary to run CloudOS; the company said it can be characterised as a “bare-metal operating system that is tailor-made for pools of hyper-converged, commodity resources.”

Cisco said Piston will help it deliver on its Intercloud vision, which sees a Cisco-based set of cloud services that can federate with one another; OpenStack seems increasingly to be at the heart of that effort.

“Paired with our recent acquisition of Metacloud, Piston’s distributed systems engineering and OpenStack talent will further enhance our capabilities around cloud automation, availability, and scale. The acquisition of Piston will complement our Intercloud strategy by bringing additional operational experience on the underlying infrastructure that powers Cisco OpenStack Private Cloud,” said Hilton Romanski, corporate development lead at Cisco.

“Additionally, Piston’s deep knowledge of distributed systems and automated deployment will help further enhance our delivery capabilities for customers and partners,” he added.

The move will also give Cisco some strong in-house OpenStack expertise. One of Piston’s co-founders, Chris MacGown, was among the originating members of OpenStack’s Nova-core (compute) development team.

This is the second big OpenStack-focused acquisition Cisco has made in recent months. In September last year Cisco acquired Metacloud, a provider of commercially supported OpenStack. Metacloud also had IP in the networking technology it has integrated into its OpenStack distribution, which gave Cisco’s OpenStack play an SDN boost.

IBM, meanwhile, has acquired managed private cloud provider Blue Box, which the company said would help bolster its hybrid cloud and OpenStack strategy.

The company said the move would enable it to provide a public cloud-like experience within its customers’ datacentres by allowing it to offer a remotely managed OpenStack offering, which it hadn’t previously.

“IBM is dedicated to helping our clients migrate to the cloud in an open, secure, data-rich environment that meets their current and future business needs,” said IBM general manager of cloud services Jim Comfort. “The acquisition of Blue Box accelerates IBM’s open cloud strategy, making it easier for our clients to move data and applications across clouds and adopt hybrid cloud environments.”

IBM said it will continue to support Blue Box customers and use the company as a channel to sell its own cloud services.

Both acquisitions are a sign that the old guard of vendors is putting its money where its mouth is when it comes to OpenStack – particularly as some of them, like HP and Red Hat, double down and aggressively push their own OpenStack wares.