IBM reports cloud growth amid 16th quarterly revenue decline

IBM has reported healthy growth for its cloud and strategic imperatives business units, despite witnessing revenue declines for the 16th straight quarter.

The strategic imperatives units, which include cloud, analytics, mobile, social and security services, delivered $29.8 billion in revenue over the last 12 months, accounting for 37% of total revenues, with cloud contributing $10.8 billion.

“We delivered $18.7 billion in revenue, $2.3 billion in net income and operating earnings per share of $2.35,” said Martin Schroeter, CFO at IBM. “Importantly, we also made significant investments and took significant actions to accelerate our transformation and move our business into new areas.”

In Q1 specifically, total revenues dropped 5% to $18.7 billion, while the strategic imperatives units grew 14% to $7 billion, with cloud accounting for $2.6 billion, a 34% year-on-year increase. The company also announced or closed ten acquisitions during the quarter, investing just over $2.5 billion in new businesses including Bluewolf, a Salesforce partner; Truven, a provider of cloud-based healthcare data; and The Weather Company’s digital assets.

While the company built its reputation in the traditional IT market segment, sliding revenues and growing enterprise attention to cloud solutions have forced a transformation play for the tech giant, one which would appear to be paying off.

“We’re continuing to expand our Watson ecosystem and reach,” said Schroeter. “Over the last 12 months, the number of developers using Watson APIs is up over 300% and the number of enterprises we’ve engaged with has doubled. Watson solutions are being built, used, and deployed in more than 45 countries and across 20 different industries.”

Watson would appear to be one of the main focal points of IBM’s new cloud-orientated business model; the cognitive computing platform has formed the basis of numerous PR campaigns throughout the year, highlighting client wins such as pharmaceutical giant Pfizer and the McLaren Honda Formula One team.

“Our enterprise clients are looking to get greater value from their data and IT environment,” said Schroeter. “They’re not just focused on reducing cost and driving efficiency but using data to improve decision-making and outcomes. They’re looking to become digital enterprises that are differentiated by Cognitive. We’re creating Cognitive Solutions that marry digital business with digital intelligence. We’re bringing our industry expertise together with these cognitive solutions and we’re building it all on cloud platforms.”

Geographically, the company highlighted that business was relatively consistent worldwide. The Asia-Pacific region demonstrated growth, while EMEA and North America saw slight declines, albeit with improvements on previous quarters. Latin America, however, continued to prove tough for IBM; the company has a large business unit in the region, but cited the volatile economic and political environment in Brazil as the reason for the declines.

Although the company has not halted the revenue declines that have been a constant for IBM in recent years, the strategic imperatives units would appear to be taking a stronger role in the fortunes of the business. IBM has grown its capabilities in numerous developing markets in recent months, including cloud video platforms and user experience, though it does appear to be backing cognitive computing for future growth.

“As we build new businesses in areas like Watson Health and Watson Internet of Things, this requires different skills and to be in different places,” said Schroeter. “I mentioned earlier that over the last year we’ve added over 6000 resources in Watson Health and added over 1000 security experts. These are specialized skills in highly competitive areas. So this is not about reducing our capacity; this is about transforming our workforce.

“So where are we in the transformation? It is continued focus on shifting our investments into those strategic imperatives, it is making sure that the space we’re moving to is higher margin and higher profit opportunity for us and then making sure we’re investing aggressively to keep those businesses growing.”

While IBM is not out of the woods yet, the recent quarterlies did beat analyst predictions, and its acquisition activity would appear to be more aggressive than that of others in the space. The company is seemingly not wasting any time in positioning itself firmly in the cloud space, and executives appear to be backing the growth of cognitive computing, and Watson’s market penetration in particular, as the catalyst for Big Blue’s future success.

Why the era of the intelligent cloud has arrived


Enterprises are impatient to translate their investments in cloud apps and the insight they provide into business outcomes and solid results today.

The following insights are based on a series of discussions with C-level executives and revenue team leaders across several industries regarding their need for an Intelligent Cloud:

  • In the enterprise, the cloud versus on-premise war is over, and the cloud has won. Nearly all are embracing a hybrid cloud strategy to break down the barriers that held them back from accomplishing more.
  • None of the C-level executives I’ve spoken with recently are satisfied with just measuring cloud adoption. All say they want to measure business outcomes and gain greater insight into how they can better manage revenue and sales cycles.
  • Gaining access to every available legacy and third party system using hybrid cloud strategies is the new normal. Having data that provides enterprise-wide visibility gives enterprises greater control over every aspect of their selling and revenue management processes. And when that’s accomplished, the insights gained from the Intelligent Cloud can quickly be turned into results.

Welcome to the era of the intelligent cloud

The more enterprises seek out insights to drive greater business outcomes, the more it becomes evident the era of the Intelligent Cloud has arrived. C-level execs are looking to scale beyond descriptive analytics that define past performance patterns. What many are after is an entirely new level of insight that is prescriptive and cognitive. Getting greater insight that leads to more favourable business outcomes is what the Intelligent Cloud is all about. The following Intelligent Cloud Maturity Model summarises the maturity levels of enterprises attempting to gain greater insights and drive more profitable business outcomes.

Figure: Intelligent Cloud Maturity Model

Why the intelligent cloud now?  

Line-of-business leaders across all industries want more from their cloud apps than they are getting today. They want the ability to gain greater insights with prescriptive and cognitive analytics. They’re also asking for new apps that give them the flexibility to change selling behaviors quickly. In short, everyone wants to get to the orchestration layer of the maturity model, yet many are stuck staring into a figurative rearview mirror, using only descriptive data to plan future strategies. The future of enterprise cloud computing is all about being able to deliver prescriptive and cognitive intelligence.

Who is delivering the intelligent cloud today?

Just how far the era of the Intelligent Cloud has advanced became apparent during the Microsoft Build Developer Conference last week in San Francisco. A fascinating area discussed was Microsoft Cognitive Services and its implications for the Cortana Intelligence Suite. Microsoft is offering a test drive of Cognitive Services. By combining Cognitive Services and the Cortana Intelligence Suite, Microsoft has created a framework for delivering the Intelligent Cloud. The graphic below shows the Cortana Analytics Suite.

Figure: Cortana Analytics Suite

Apttus, a leader in cloud-based Quote-to-Cash automation software, is announcing the Apttus Intelligent Cloud today. The Apttus Intelligent Cloud drives desired behaviours from everyone on the revenue team and provides prescriptive information to company decision makers, significantly enhancing Apttus’ category-defining Quote-to-Cash applications and maximising revenue for Apttus customers. The Apttus Intelligent Cloud includes the full Apttus Quote-to-Cash Suite, Incentives Suite, and Intelligence Suite. In the interest of full disclosure, I am an employee of Apttus.

pgbackup: Backup PostgreSQL

With barman (pgbarman) we can automate backups and restores of PostgreSQL databases.

On CentOS, we simply need to install the package from EPEL and configure a few basics (/etc/barman/barman.conf):

[barman]
barman_home = /var/lib/barman
barman_user = barman
log_file = /var/log/barman/barman.log
compression = gzip
configuration_files_directory = /etc/barman.d
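
For reference, the installation itself is a one-liner (a sketch, assuming the EPEL repository is already enabled on the machine):

# yum install barman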

We also need to set up passwordless SSH key access in both directions between the barman server and the database user of each instance we want to back up.
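
A minimal sketch of that key exchange, using the hosts from this example (192.168.56.29 runs PostgreSQL, 192.168.56.31 runs barman); user names and key paths may differ on your systems:

# on the barman server, as the barman user
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
ssh-copy-id postgres@192.168.56.29

# on the database server, as the postgres user
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
ssh-copy-id barman@192.168.56.31

# verify each direction works without a password prompt
ssh postgres@192.168.56.29 true    # run from the barman host
ssh barman@192.168.56.31 true      # run from the database host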

For convenience, we configure each backup in its own file in the /etc/barman.d directory, for example /etc/barman.d/pgm.conf:

[pgm]
description = "postgres master"
ssh_command = ssh postgres@192.168.56.29
conninfo = host=192.168.56.29 user=postgres
retention_policy_mode = auto
retention_policy = RECOVERY WINDOW OF 30 days
wal_retention_policy = main

Using the show-server command we can find out which directory the archived WAL files should be delivered to:

# barman show-server pgm | grep incoming_wals_directory
	incoming_wals_directory: /var/lib/barman/pgm/incoming

Using the eyp-postgres Puppet module we can configure an archive command that rsyncs WAL files from the database server to the barman server:

	class { 'postgresql':
		wal_level => 'hot_standby',
		max_wal_senders => '3',
		checkpoint_segments => '8',
		wal_keep_segments => '8',
		archive_mode => true,
		archive_command_custom => 'rsync -a %p barman@192.168.56.31:/var/lib/barman/pgm/incoming/%f',
	}
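
If you are not managing the server with Puppet, here is a sketch of the equivalent settings placed directly in postgresql.conf (based on the parameters above; changing archive_mode requires a restart):

wal_level = hot_standby
max_wal_senders = 3
checkpoint_segments = 8
wal_keep_segments = 8
archive_mode = on
archive_command = 'rsync -a %p barman@192.168.56.31:/var/lib/barman/pgm/incoming/%f'

After restarting PostgreSQL, barman check pgm should report everything OK before the first backup is attempted.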

Once everything is configured, we can take a backup with the backup command, for example:

# barman backup pgm 
Starting backup for server pgm in /var/lib/barman/pgm/base/20160415T165403
Backup start at xlog location: 0/3000020 (000000010000000000000003, 00000020)
Copying files.
Copy done.
Asking PostgreSQL server to finalize the backup.
Backup end at xlog location: 0/30000E0 (000000010000000000000003, 000000E0)
Backup completed
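
Since the goal is automation, backups can be scheduled from the barman user's crontab; an illustrative sketch (the EPEL package may already install the barman cron entry under /etc/cron.d):

# maintenance tasks (WAL archiving and housekeeping) every minute
* * * * * /usr/bin/barman -q cron
# nightly base backup of every configured server
0 4 * * * /usr/bin/barman -q backup all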

Suppose we already have that backup and then do the following in the database:

postgres=# insert into test values('fuckthesystem');
INSERT 0 1
postgres=# select * from test;
       txt        
------------------
 sakura
 enlargeyourpenis
 fuckthesystem
(3 rows)


postgres=# delete from test;
DELETE 2
postgres=# select * from test;
 val 
-----
(0 rows)

postgres=# 

We can see the list of available backups with list-backup:

# barman list-backup pgm
pgm 20160415T165403 - Fri Apr 15 14:54:04 2016 - Size: 19.3 MiB - WAL Size: 0 B

To perform the restore we need some details of the backup, which we can see with show-backup:

# barman show-backup pgm latest
Backup 20160415T165403:
  Server Name            : pgm
  Status                 : DONE
  PostgreSQL Version     : 90216
  PGDATA directory       : /var/lib/pgsql/9.2/data

  Base backup information:
    Disk usage           : 19.3 MiB
    Timeline             : 1
    Begin WAL            : 000000010000000000000003
    End WAL              : 000000010000000000000003
    WAL number           : 0
    Begin time           : 2016-04-15 14:54:01.835645+02:00
    End time             : 2016-04-15 14:54:04.514398+02:00
    Begin Offset         : 32
    End Offset           : 224
    Begin XLOG           : 0/3000020
    End XLOG             : 0/30000E0

  WAL information:
    No of files          : 0
    Disk usage           : 0 B
    Last available       : None

  Catalog information:
    Retention Policy     : VALID
    Previous Backup      : 20160415T142747
    Next Backup          : - (this is the latest base backup)

We need the backup name and the exact “Begin time”. First, we stop the database:

# /etc/init.d/postgresql-9.2 stop

Next, we tell barman to recover from that backup, passing the begin time via --target-time along with the backup name:

# barman recover  --target-time "2016-04-15 14:54:01.835645+02:00" --remote-ssh-command="ssh postgres@192.168.56.29" pgm 20160415T142747 /var/lib/pgsql/9.2/data
Processing xlog segments for pgm
	000000010000000000000001
	000000010000000000000002
	000000010000000000000003
	000000010000000000000003.00000020.backup
	000000010000000000000004
Starting remote restore for server pgm using backup 20160415T165403 
Destination directory: /var/lib/pgsql/9.2/data
Copying the base backup.
Copying required wal segments.
The archive_command was set to 'false' to prevent data losses.

Your PostgreSQL server has been successfully prepared for recovery!

Please review network and archive related settings in the PostgreSQL
configuration file before starting the just recovered instance.

WARNING: Before starting up the recovered PostgreSQL server,
please review also the settings of the following configuration
options as they might interfere with your current recovery attempt:

    external_pid_file = '/var/lock/subsys/postgresql-9.2'			# write an extra PID file

Once recovered, we simply need to bring the instance back up:

# /etc/init.d/postgresql-9.2 start

If we connect, we will see the data as it was at the time of the backup:

[root@pgm ~]# psql -U postgres
psql (9.2.16)
Type "help" for help.

postgres=# select * from test;
       txt        
------------------
 sakura
 enlargeyourpenis
(2 rows)

postgres=# 


Ransomware Evolution | @CloudExpo #InfoSec #DataCenter #Security

Initially, we came across ransomware which exploited the entire system and simply restricted you from interacting with your own device, later requiring you to pay if you wanted to go back and use your computer.

And then it started becoming obsolete because of the end user. People were asking themselves: “That is my computer, would I pay $100 for it? If I don’t really have data, I’d better format my PC and start all over again.” So that strategy of locking access to computers started becoming obsolete. What did the bad guys do? They realized that the previous strategy was only good when the data the computer was holding was valuable. So they started asking ransom for the data, and that’s what they’re doing now.


Cloud-Phobia Bubble | @CloudExpo @AvereSystems #DigitalTransformation

Sometime over the last 10 years, the term “cloud” transformed from a big, fluffy collection of water droplets in the sky to the (somewhat daunting) future of IT and a fundamental part of some of the most innovative technologies in the world today. Cloud adoption rates have been increasing steadily over the years due to its compelling economic and technical advantages, allowing organizations to fulfill user requests instantaneously and only for the period of time that they need it. But for every organization that is fearlessly embracing cloud computing and the many benefits it offers, there is another organization that is hesitantly weighing infrastructure options, fearful of taking the leap.


Cloud Expo Returns to the Javits Center | @CloudExpo #IoT #Cloud

This is not a small hotel event. It is also not a big vendor party where politicians and entertainers are more important than real content.
This is Cloud Expo, the world’s longest-running conference and exhibition focused on Cloud Computing and all that it entails.
If you want serious presentations and valuable insight about Cloud Computing for three straight days, then register now for Cloud Expo.


Critical (Outdoor) IoT Applications | @ThingsExpo #IoT #IIoT #M2M #InternetOfThings

It’s safe to assume that the majority of all Internet of Things (IoT) devices operate near large populations of people. Of course, right? This is where the action happens – smart devices, smart cars, smart infrastructure, smart cities, etc. Plus, the cost of getting “internet-connected” in these areas is relatively low – public access to Wi-Fi is becoming widely available, cellular coverage is blanketed over cities, etc.


[session] The Best Course Around Multi-Cloud Adoption By @Cloudyn_buzz | @CloudExpo #Cloud

Up until last year, enterprises that were looking into cloud services usually undertook a long-term pilot with one of the large cloud providers, running test and dev workloads in the cloud. With cloud’s transition to mainstream adoption in 2015, and with enterprises migrating more and more workloads into the cloud and in between public and private environments, the single-provider approach must be revisited.
In his session at 18th Cloud Expo, Yoav Mor, multi-cloud solution evangelist at Cloudyn, will examine the risks of sourcing IT resources from a single cloud service vendor, as well as the benefits to be reaped by adopting multi-cloud strategies.


‘How to Create Angular 2 Clients for the Cloud’ Workshop at @CloudExpo | #Cloud

Angular 2 is a complete re-write of the popular framework AngularJS. Programming in Angular 2 is greatly simplified – now it’s a component-based well-performing framework. This immersive one-day workshop at 18th Cloud Expo, led by Yakov Fain, a Java Champion and a co-founder of the IT consultancy Farata Systems and the product company SuranceBay, will provide you with everything you wanted to know about Angular 2.


Step-by-Step IoTization | @ThingsExpo #IoT #IIoT #M2M #DigitalTransformation

The paradigm has shifted. A Gartner survey shows that 43% of organizations are using or plan to implement the Internet of Things in 2016. However, more than just a handful of companies are still using old-style, ad-hoc, trial-and-error approaches, unaware of the critical barriers, pain points, traps, and hidden roadblocks. How can you become a winner?
In his session at @ThingsExpo, Tony Shan will present a methodical approach to guide the holistic adoption and enablement of IoT implementations. This overarching model is an adaptive framework to systematically handle the rapid changes and exponential growth, composed of the following modules: Prototype, Extend, Assess, Strategize, Formulate, Accelerate, Initiate, and Revamp (PEAS FAIR). We will zoom in to individual components in each module. Working examples will be discussed to illustrate the practical use of this method in the real world, along with best practices and lessons learned to help organizations IoTize step-by-step.
