Category archive: IBM

IBM, Mubadala joint venture to bring Watson cloud to MENA

IBM is bringing Watson to the Middle East

IBM is teaming up with Abu Dhabi-based investment firm Mubadala Development Company to create a joint venture, based in Abu Dhabi, that will deliver IBM’s cloud-based Watson service to customers in the Middle East and North Africa (MENA) region.

The companies will set up the joint venture through Mubadala’s subsidiary, Injazat, which will be the sole provider of the Watson platform in the region.

The companies said the move will help create an ecosystem of MENA-based partners, software vendors and startups developing new solutions based on the cognitive compute platform.

“Bringing IBM Watson to the region represents the latest major milestone in the global adoption of cognitive computing,” said Mounir Barakat, executive director of ICT at Aerospace & Engineering Services, Mubadala.

“It also signals Mubadala’s commitment to bringing new technologies and spurring economic growth in the Middle East, another step towards developing the UAE as a hub for the region’s ICT sector,” Barakat said.

Mike Rhodin, senior vice president of IBM Watson, said Mubadala’s knowledge of the local corporate ecosystem will help the company expand its cognitive compute cloud service in the region.

IBM has enjoyed some Watson wins in the financial services, healthcare and utilities sectors, but the company has been fairly quiet on how much the division rakes in. Over the past year it has made strides to expand the platform in the US, Africa and Japan, and recently made a number of strategic acquisitions in software automation to boost Watson’s appeal in customer engagement and health services.

IBM, partners score 7 nm semiconductor breakthrough

IBM, Samsung and Globalfoundries claimed a 7nm semiconductor breakthrough

Giving Moore’s Law a run for its money, IBM, Globalfoundries and Samsung claimed this week to have produced the industry’s first 7 nanometre node test chip with functioning transistors. The breakthrough suggests a massive jump in low-power computing performance may be just over the horizon.

IBM worked with Globalfoundries, to which it divested its chip manufacturing business in October last year, and with Samsung specialists at the SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering (SUNY Poly CNSE) to test a number of silicon innovations developed by IBM researchers, including Silicon Germanium (SiGe) channel transistors and Extreme Ultraviolet (EUV) lithography integration at multiple levels. These techniques were developed to accommodate the different physical behaviour that governs transistors at such small scales.

Most microprocessors found in servers, desktops and laptops today are built on 22nm and 14nm processes, and mobile processors are increasingly moving to 10nm, but IBM claims the 7nm process developed by the semiconductor alliance delivers a 50 per cent area scaling improvement over today’s most advanced chips.
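
That figure is at least plausible on a back-of-envelope basis (the calculation here is ours, not the companies’): chip area scales roughly with the square of the linear feature size, so moving from a 10nm node to a 7nm node gives (7/10)² ≈ 0.49 of the area, in line with the claimed 50 per cent improvement.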

IBM said the move could result in the creation of a chip small and powerful enough to “power everything from smartphones to spacecraft.”

“For business and society to get the most out of tomorrow’s computers and devices, scaling to 7nm and beyond is essential,” said Arvind Krishna, senior vice president and director of IBM Research. “That’s why IBM has remained committed to an aggressive basic research agenda that continually pushes the limits of semiconductor technology. Working with our partners, this milestone builds on decades of research that has set the pace for the microelectronics industry, and positions us to advance our leadership for years to come.”

The companies also said the chips offer a 50 per cent power-to-performance improvement over existing server chips, and could be used in future iterations of Power, the processor architecture IBM opened up in a bid to improve its performance for cloud and big data workloads.

IBM has in recent months ramped up its silicon-focused efforts. The company is partnering with SiCAD to offer cloud-based high performance services for electronic design automation (EDA), which the companies said can be used to design silicon for smartphones, wearables and Internet of Things devices. Earlier this month the company also launched another OpenPower design centre in Europe to target development of high performance computing (HPC) apps based on the Power architecture.

Columbia Pipeline links up with IBM in $180m cloud deal

CPG is sending most of its applications to the cloud

Newly independent Columbia Pipeline Group (CPG) signed a $180m deal with IBM this week that will see IBM support the migration of CPG’s application infrastructure from on-premises datacentres into a hybrid cloud environment.

CPG recently split from NiSource to become an independent midstream pipeline and storage business with 15,000 miles of interstate pipeline, gathering and processing assets extending from New York to the Gulf of Mexico.

The company this week announced it has enlisted IBM, a long-time partner of NiSource, to help it migrate its infrastructure and line-of-business applications (finance, human resources, ERP) off NiSource’s datacentres and into a private cloud platform hosted in IBM’s datacentres in Columbus, Ohio.

The wide-ranging deal will also see CPG lean on IBM’s cloud infrastructure for its network services, help desk, end-user services, cybersecurity, mobile device management and operational big data.

“IBM has been a long-time technology partner for NiSource, providing solutions and services that have helped that company become an energy leader in the U.S.,” said Bob Skaggs, chief executive of CPG. “As an independent business, we are counting on IBM to help provide the continued strong enterprise technology support CPG needs.”

Philip Guido, general manager, IBM Global Technology Services, North America said: “As a premier energy company executing on a significant infrastructure investment program, CPG requires an enterprise technology strategy that’s as forward-thinking and progressive as its business strategy. Employing an IT model incorporating advanced cloud, mobile, analytics and security technologies and services from IBM will effectively support that vision.”

Companies that operate such sensitive infrastructure – like oil and gas pipelines – are generally quite conservative when it comes to where they host their applications and data, though the recent IBM deal speaks to an emerging shift in the sector. Earlier this summer Gaia Gallotti, research manager at IDC Energy Insights, told BCN that cloud is edging higher on the agenda of CIOs in the energy and utilities sector, but that they are struggling with a pretty significant skills gap.

Camden Council uses big data to help reduce fraud, save money

Camden Council is using big data to tackle fraud and save cash as its budgets shrink

Camden Council is using a big data platform to create a ‘Residents Index’ to help tackle debt collection, illegal subletting and fraud.

The service, based on IBM’s InfoSphere platform, centrally stores and manages citizen data collected from 16 different systems across London – including data from Electoral Services, Housing and Council Tax Services – to help give a single view of local residents.

Authorised users can access the platform to search relevant data and highlight discrepancies in the information given to the Council by residents to help reduce fraud and save money on over-procurement of public services.
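
In essence the Residents Index is a record-linkage exercise: records from many source systems are matched on common identifying fields, and disagreements between systems become signals worth investigating. The sketch below illustrates the general technique in Python; the field names, match key and sample data are our own assumptions, not details of Camden’s InfoSphere deployment.

```python
# Illustrative record-linkage sketch. The data, field names and matching
# rule are invented for demonstration; they do not reflect Camden's actual
# InfoSphere configuration.
from collections import defaultdict

# Records drawn from separate council systems
records = [
    {"source": "electoral", "name": "a smith", "dob": "1980-01-02", "address": "1 High St"},
    {"source": "housing", "name": "a smith", "dob": "1980-01-02", "address": "1 High St"},
    {"source": "counciltax", "name": "a smith", "dob": "1980-01-02", "address": "9 Low Rd"},
]

# Build a single view per resident by grouping on a simple match key
index = defaultdict(list)
for record in records:
    index[(record["name"], record["dob"])].append(record)

# Flag residents whose source systems disagree, e.g. on registered address
for key, recs in index.items():
    addresses = {r["address"] for r in recs}
    if len(addresses) > 1:
        print(f"Discrepancy for {key}: {sorted(addresses)}")
```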

It’s also using the Index to improve the accuracy of its electoral register. Using the platform, the Council said it was able to fast-track the registration of more than 80 per cent of its residents and identify new residents who needed to register to vote.

“Big data is revolutionising the way we work across the borough, reducing crime and saving money just when public services are facing huge funding cuts,” said Camden councillor Theo Blackwell.

“Take school admission fraud: parents complain about people gaming the system by pretending to reside in the borough to get their kids into the most sought-after schools. Now, with the Residents Index in place, Council staff can carry out detailed checks and identify previously hidden discrepancies in the information supplied to the Council to prove residency. We have already withdrawn five school places from fraudulent applicants, making sure that school places go fairly to those who are entitled to them.”

“The Resident Index has proven its worth, helping the Council to become more efficient, and now contains over one million relevant records. This is just one example and we have other plans to use the benefits of data technology to improve public services and balance the books.”

Early last year Camden Borough laid out its three-year plan to use more digital services in a bid to save money and improve the services it offers to local residents, which includes using cloud services to cut infrastructure costs and big data platforms to inform decision-making at the Council.

IBM, Nvidia, Mellanox launch OpenPower design centre to target big data apps

IBM has set up another OpenPower design centre in Europe to target big data and HPC

IBM, Nvidia and Mellanox are setting up another OpenPower design centre in Europe to target development of high performance computing (HPC) apps based on the open source Power architecture.

The move will see technical experts from IBM, Nvidia and Mellanox jointly develop applications on OpenPower architecture which take advantage of the companies’ respective technologies – specifically IBM Power CPUs, Nvidia’s Tesla Accelerated Computing Platform and Mellanox InfiniBand networking solution.

The companies said the move will both advance development of HPC software and create new opportunities for software developers to acquire HPC-related skills and experience.

“Our launch of this new centre reinforces IBM’s commitment to open-source collaboration and is a next step in expanding the software and solution ecosystem around OpenPower,” said Dave Turek, IBM’s vice president of HPC Market Engagement.

“Teaming with Nvidia and Mellanox, the centre will allow us to leverage the strengths of each of our companies to extend innovation and bring higher value to our customers around the world,” Turek said.

The centre will be located in IBM’s client centre in Montpellier, France, and will complement the design centre launched at the Jülich Supercomputing Centre in November last year.

IBM has been working with a broad range of stakeholders spanning the technology, research and government sectors on Power-based supercomputers in order to further its ambitions for the Power architecture. The company hopes Power will command roughly a third of the scale-out server market over the next few years.

IoT, big data used to make Lake George a model for water body preservation

IBM and the Jefferson Project have teamed up to use IoT and big data to monitor and analyse Lake George’s health

The Jefferson Project at Lake George, New York, a collaboration between Rensselaer Polytechnic Institute, IBM Research and The FUND for Lake George, is using Internet of Things sensors and big data analytics to create a model that can be used to help preserve a wide range of water sources.

Researchers have been monitoring the chemistry and algae of Lake George for the past 35 years to document how the lake is being affected by a range of inputs including pollution, tourism and weather activity.

But the project has recently entered a new phase, which has seen IBM and Jefferson Project researchers deploy sophisticated solar-powered underwater sonar sensors to measure a range of data.

Those sensors are linked to custom software running on an IBM Blue Gene/Q supercomputer, and to IBM Smarter Planet software deployed in a datacentre on the Rensselaer campus.

Rick Relyea, director of the Jefferson Project at Lake George, said the IoT sensors have greatly improved data accuracy, which has allowed researchers to improve the models they generate from the patterns being observed.

“The Jefferson Project provides the unique opportunity for biologists and environmental scientists to work closely with engineers, physicists, computer scientists and meteorologists to understand large lakes at a level of detail and intensity that is simply unprecedented,” Relyea said.

“Together, we will make tremendous inroads into understanding how lakes naturally behave and how human activities alter biodiversity, the functioning of freshwater ecosystems, and overall water quality.”

The project researchers have already used the preliminary data generated by the sensors to create a range of models that help predict the impact of weather events, salt run-off and heavy tourism on water circulation and the water body’s chemistry. Relyea said these models could be applied to many other bodies of water in a bid to improve their health.

“The major threats to Lake George are many of the same threats to lakes around the world. Too many nutrients coming in from either fertiliser or sewage. We have contaminants coming in, those may be pesticides, it may be road salts. We have development coming in changing the habitat around the lakes, changing how much water run-off there is. And we have invasive species coming in.”

IBM and Box announce global enterprise cloud partnership

US enterprise tech giant IBM has revealed a new global partnership with cloud storage outfit Box to integrate their products and sell into vertically targeted enterprise markets.

More specifically the strategic alliance will combine Box’s cloud collaboration platform with a number of IBM solutions, including analytics, cloud and security. Both companies will sell the combined products.

“Today’s digital enterprises demand world-class technologies that transform how their organizations operate both internally and externally,” said Aaron Levie, co-founder and CEO of Box. “This extensive alliance between Box and IBM opens up an exciting opportunity for both companies to reach new markets and deliver unified solutions and services that can redefine industries, advance secure collaboration and revolutionize enterprise mobility.”

“This partnership will transform the way work is done in industries and professions that shape our experience every day,” said Bob Picciano, SVP of IBM Analytics. “The impact will be felt by experts and professionals in industries such as healthcare, financial services, law, and engineering who are overwhelmed by today’s digital data and seek better solutions to manage large volumes of information more intelligently and securely. The integration of IBM and Box technologies, combined with our global cloud capabilities and the ability to enrich content with analytics, will help unlock actionable insights for use across the enterprise.”

The alliance will focus on three main areas: content management and social collaboration; enterprise cloud, security and consulting; and custom app development for industries. The general thread of the announcement seems to be a desire to bring cloud applications to regions and industries that are not currently making the most of them; it is just the latest in a sequence of collaborations by both Box and IBM.

IBM stands up SoftLayer datacentre in Italy

IBM has added its first SoftLayer datacentre in Italy, and its 24th globally

IBM this week announced the launch of its first SoftLayer datacentre in Italy, located in Cornaredo, Milan.

The company said the datacentre in Milan, a growing hub for cloud services, will enable it to offer a local option for Italian businesses looking to deploy IBM cloud services. The facility, its 24th SoftLayer datacentre globally, has capacity for up to 11,000 servers, a power rating of 2.8 megawatts, and is designed to Tier III spec.

“The Italian IT sector is changing as startups and enterprises alike are increasingly turning to the cloud to optimize infrastructure, lower IT costs, create new revenue streams, and spur innovation,” said Marc Jones, chief technology officer for SoftLayer.

“The Milan datacentre extends the unique capabilities of our global platform by providing a fast, local onramp to the cloud. Customers have everything they need to quickly build out and test solutions that run the gamut from crunching big data to launching a mobile app globally,” Jones added.

Nicola Ciniero, general manager of IBM Italy said: “This datacentre represents a financial and technological investment made by a multinational company that has faith in this country’s potential. Having an IBM Cloud presence in Italy will provide local businesses with the right foundation to innovate and thrive on a global level.”

The move comes just a couple of months after IBM added a second SoftLayer datacentre in the Netherlands.

Is force of habit defining your hybrid cloud destiny?

Experience breeds habit, which isn’t necessarily the best thing strategically

I’ve been playing somewhat of a game over recent months. It’s a fun game for all the family and might be called “Guess my job”. It’s simple to play. All you need to do is ask someone the question: “What is a hybrid cloud?” Then, based upon their answer, you make your choice. Having played this for a while I’m now pretty good at predicting someone’s viewpoint from their job role, or vice versa.

And the point of all this? Simply that people’s viewpoints are constrained by their experiences and what keeps them busy day-to-day, so they often miss an opportunity to do something different. For those working day-to-day in a traditional IT department, keeping systems up and running, hybrid cloud is all about integrating an existing on-site system with an off-site cloud. This is a nice, easy one to grasp in principle, but the reality is somewhat harder to realize.

The idea of connecting an on-site System of Record to a cloud-based System of Engagement, pulling data from both to generate new insights, is conceptually well understood. That said, organisations making production use of such arrangements are few and far between. One example: combining historical customer transaction information with real-time geospatial, social and mobile data, then applying analytics to generate new insights which uncover new sales potential. For many organizations, though, the challenge of granting access to the existing enterprise systems is simply too great. Security concerns, the ability to embrace the speed of change that is required and the challenge of extracting the right data in a form that is immediately usable by the analytical tools may be a hurdle too high. Indeed, many clients I’ve worked with have stated that they’re simply not going to do this. They understand the benefits, but the pain they see themselves having to go through to get them makes this unattractive to pursue.
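
To make that transaction-plus-geospatial example concrete, here is a minimal sketch of the pattern. All of the data, field names and the “high-value customer near a store” rule are invented for illustration; a real deployment would involve secure connectivity into the System of Record, which is precisely the hard part described above.

```python
# Minimal sketch of joining System of Record history with System of
# Engagement events. Data and field names are hypothetical.

# Historical spend per customer, extracted from the on-site System of Record
transactions = {
    101: {"home_store": "Leeds", "total_spend": 420.0},
    102: {"home_store": "York", "total_spend": 80.0},
}

# Real-time geospatial events arriving from the cloud-side System of Engagement
events = [
    {"customer_id": 101, "near_store": "Leeds"},
    {"customer_id": 102, "near_store": "Leeds"},
]

# Simple analytic: flag high-value customers currently near their home store
for event in events:
    history = transactions.get(event["customer_id"])
    if history and history["home_store"] == event["near_store"] \
            and history["total_spend"] > 100:
        print(f"Sales lead: customer {event['customer_id']} is near "
              f"{event['near_store']}")
```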

So, if this story aligns with your view of hybrid cloud and you’ve already put it in the “too hard” box then what is your way forward?

For most organizations, no single cloud provider is going to provide all of the services they might want to consume. Implicitly, then, if they need to bring data from these disparate cloud services together, there is a hybrid cloud use case: linking cloud to cloud. Even in the on-site to off-site hybrid cloud case there are real differences between a static relationship and one where you are dynamically bursting in and out of off-site capacity. Many organizations are looking to cloud as a more effective and agile platform for backup and archiving, or for disaster recovery. All of these are hybrid cloud use cases too, but if you’ve already written off ‘hybrid’ then you’re likely missing very real opportunities to do what is right for the business.

Regardless of the hybrid cloud use case, you need to keep in mind three key principles:

  1. Portability – the ability to run and consume services and data from wherever it is most appropriate to do so, be that cloud or non-cloud, on-site or off-site.
  2. Security, visibility and control – to be assured that end-to-end, regardless of where the ‘end’ is, you are running services in such a way that they are appropriately secure, well managed and their characteristics are well understood.
  3. Developer productivity – developers should be focused on solving business problems and not be constrained by needing to worry about how or when supporting infrastructure platforms are being deployed.  They should be able to consume and integrate services from many different sources to solve problems rather than having to create everything they need from scratch.

Business applications need to be portable such that they can both run, and consume other services, from wherever is most appropriate. To do that, your developers need to be unconstrained by the underlying platform(s), so that they can develop for any cloud or on-site IT platform. All this needs to be done in a way that allows enterprise controls, visibility and security to be extended to the cloud platforms being used.

If you come from that traditional IT department background, you’ll be familiar with the processes that are in place to ensure that systems are well managed, change is controlled and service levels are maintained. These processes may not be compatible with the ways that clouds open up new opportunities. This leads to the need to look at creating a “two-speed” IT organisation, providing the rigor needed for the Systems of Record whilst enabling rapid change and delivery in the Systems of Engagement space.

Cloud generates innovation and hence diversity. Economics, regulation and open communities drive standardization, and it is this, in particular open standards, that facilitates integration in all of these hybrid cases.

So, ask yourself: with more than 65 per cent of enterprise IT organizations committing to hybrid cloud technologies before 2016, are you ensuring that your definitions, and hence your technology choices, reflect future opportunities rather than past prejudices?

Written by John Easton, IBM distinguished engineer and leading cloud advisor for Europe

IBM calls Apache Spark “most important new open source project in a decade”

IBM is throwing its weight behind Apache Spark in a bid to bolster its IoT strategy

IBM said it will throw its weight behind Apache Spark, the open source project building a processing engine for large-scale datasets, putting thousands of internal developers to work on Spark-related projects and contributing its machine learning technology to the code ecosystem.

Spark, an Apache open source project born in 2009, is essentially an engine that can process vast amounts of data very quickly. It runs in Hadoop clusters through YARN or as a standalone deployment and can process data in HDFS, HBase, Cassandra, Hive, and any Hadoop InputFormat; it currently supports Scala, Java and Python.

It is designed to perform general data processing (like MapReduce), but one of the exciting things about Spark is that it can also handle new workloads like streaming data, interactive queries and machine learning, making it a good match for Internet of Things applications, which is why IBM is so keen to go big on supporting the project.
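
As a flavour of what that looks like in practice, here is a minimal, self-contained PySpark sketch covering two of the workloads mentioned above: an interactive-style aggregation and a simple MLlib machine learning step. It uses the current DataFrame API, and the sensor data and column names are invented for illustration.

```python
# Minimal PySpark sketch: a batch/interactive aggregation plus a simple
# MLlib model over invented IoT sensor readings.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("iot-sketch").getOrCreate()

# Simulated sensor readings: (device_id, temperature)
readings = spark.createDataFrame(
    [(1, 20.5), (1, 21.0), (2, 19.2), (2, 35.7)],
    ["device_id", "temperature"])

# Interactive-style query: average temperature per device
readings.groupBy("device_id").avg("temperature").show()

# Machine learning: cluster readings with MLlib's k-means
features = VectorAssembler(
    inputCols=["temperature"], outputCol="features").transform(readings)
model = KMeans(k=2, seed=1).fit(features)
print(model.clusterCenters())

spark.stop()
```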

The company said the technology brings huge advances when processing massive datasets generated by Internet of Things devices, improving the performance of data-dependent apps.

“IBM has been a decades long leader in open source innovation. We believe strongly in the power of open source as the basis to build value for clients, and are fully committed to Spark as a foundational technology platform for accelerating innovation and driving analytics across every business in a fundamental way,” said Beth Smith, general manager, analytics platform, IBM Analytics.

“Our clients will benefit as we help them embrace Spark to advance their own data strategies to drive business transformation and competitive differentiation,” Smith said.

In addition to joining the Spark community, IBM said it would build the technology into the majority of its big data offerings, and offer Spark-as-a-Service on Bluemix. It also said it will open source its IBM SystemML machine learning technology, and collaborate with Databricks, a Spark-as-a-Service provider, to advance Spark’s machine learning capabilities.