All posts by James

Salesforce unleashes Wave analytics cloud, aims to disrupt industry

Picture credit: Salesforce

Cloudy giant Salesforce has announced Wave, the Salesforce Analytics Cloud, at its Dreamforce event in San Francisco. The platform is designed to make it easier than ever to explore data and take action from any device.

You can’t say it wasn’t coming. Last month Salesforce CEO Marc Benioff tweeted out a “top secret” version of the Dreamforce schedule, and it soon came to people’s attention that the first keynote in October was entitled “analytics cloud.”

An updated schedule later that month – again “top secret”, with “do not share or tweet” written underneath – was tweeted out by Benioff, this time with “Project Wave keynote” replacing “analytics cloud”. It is clearly an exciting development at Salesforce HQ: the chief exec was tweeting out updates over the weekend, breaking his own company’s embargo.

“Today, Salesforce is disrupting the analytics market, just as we disrupted the CRM industry 15 years ago,” Benioff said. “We’re not only connecting companies with customers in a whole new way with our Customer Success Platform, we’re empowering companies to know their customers like never before with the groundbreaking Wave Analytics Cloud.”

Chris Barbin, CEO of Appirio, a Salesforce Analytics Cloud partner, said: “Salesforce’s entrance into the business intelligence market with its Analytics Cloud offering is well-timed. Most of the BI tools on the market today are built on last-generation technology that is not focused on the new user-led analysis needs, and they have not been born to take full advantage of the cloud or mobile platforms.

“The benefit of Wave for customers is that it democratises business intelligence; it embraces the new user-led data discovery and puts the power in the hands of everyday users,” he added.

Analytics is a very interesting area for Salesforce; various observers – Ben Kepes of Forbes among them – had seen it as one of the missing pieces of the jigsaw.

Last month Bluewolf released its yearly report showing the state of Salesforce adoption. The report found that the Salesforce Sales Cloud was used by 89% of respondents, Service Cloud by 46%, Community Cloud by 22%, and Marketing Cloud by 18%.

Growth in the latter two was described as ‘on fire’. It’ll be interesting to see how Wave performs alongside them.

IBM Watson lands in Thailand, South Africa and Australia

Picture credit: ChrisDag/Flickr

IBM has announced that its Watson supercomputer is being rolled out in a variety of new locations, including Australia, Thailand, South Africa and Spain.

The global expansion follows the formation of the IBM Watson Group in January. IBM also announced various collaborations with companies, as well as startups creating apps powered by Watson.

In Spain, Watson is being trialled with CaixaBank to develop a cognitive system that will teach it Spanish. Similarly, ANZ Global Wealth is working with Watson to analyse the types of questions coming from both customers and financial advisors, with the aim of offering an improved advice process.

Watson’s credentials in healthcare will also be tested. The supercomputer is being deployed in Bumrungrad International Hospital in Thailand, to improve the quality of cancer care, as well as at Metropolitan Health in South Africa, to provide personalised, outcome-based health services. The latter will be the first commercial application of Watson on the African continent.

Deakin University in Australia is also trialling Watson to develop an online student engagement advisor, building profiles for the university’s 50,000 students and assisting them with everything from where certain buildings are located to which careers they should pursue.

Evidently, the emphasis is on improving outcomes for clients while sharpening Watson’s capabilities. One of Watson’s key tenets is that it continually learns from its mistakes.

CloudTech was treated to a demonstration of Watson Analytics last month, and found some interesting insights. Users have three ways of starting a project: with a question, from a use case, or from the data itself.

Ask Watson Analytics a question – in the case of the demo, ‘why do people purchase?’ – and it returns results framed around drivers, the factors influencing the outcome, rather than raw figures. As Watson Analytics is based in the cloud, it gives great flexibility in adding use cases.
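
For readers curious what “drivers” means in practice, here is a minimal, generic sketch in Python: a feature-importance ranking over made-up purchase data using scikit-learn. It illustrates the idea of ranking drivers of an outcome rather than reporting raw figures; the column names are hypothetical and this is not IBM’s implementation.

```python
# Generic illustration of "driver" analysis (not IBM's implementation):
# rank which factors most influence a purchase outcome. Column names are made up.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.DataFrame({
    "discount_offered": [1, 0, 1, 1, 0, 0, 1, 0],
    "email_opened":     [1, 1, 0, 1, 0, 1, 1, 0],
    "prior_purchases":  [3, 0, 5, 2, 0, 1, 4, 0],
    "purchased":        [1, 0, 1, 1, 0, 0, 1, 0],
})

X, y = df.drop(columns="purchased"), df["purchased"]
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# The "drivers": features ranked by how strongly they predict the outcome
drivers = pd.Series(model.feature_importances_, index=X.columns)
print(drivers.sort_values(ascending=False))
```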

Read more: Watson Analytics: How it makes sales and marketing’s jobs easier

Here’s proof Oracle is now taking cloud computing very seriously

Picture credit: Peter Kaminski/Flickr

It’s taken a long time for Oracle, and former CEO Larry Ellison in particular, to embrace cloud computing. But the tide has well and truly turned: the software giant has hired Google App Engine mastermind Peter Magnusson in a senior VP role.

Magnusson, who was most recently VP of engineering at Snapchat, was previously an engineering director at Google, responsible for its platform as a service offering App Engine, as well as working on strategy for Cloud Platform.

Little is known about the role Magnusson will undertake at Oracle; his LinkedIn page confirms he’s now working at Oracle’s Redwood Shores HQ, but coyly, the job description merely states ‘Oracle public cloud’.

The company has been undergoing a strategic shift over the past couple of months, with founder Ellison stepping sideways to assume the role of chief technology officer and Safra Catz and Mark Hurd appointed as co-CEOs.

At Oracle OpenWorld last month, Ellison was effusive about Oracle’s new cloud capabilities, announcing expansions across IaaS, PaaS and SaaS, as well as a cloud-capable version of Java.

“Extreme performance has always been part of the Oracle brand,” he told delegates. “What has not always been a part of the Oracle brand is the notion of extreme ease of use and extreme low cost.” The Oracle Cloud platform, he said, changes that.

“Our cloud is bigger than people think, and it’s going to get a lot bigger,” he added.

Magnusson’s move from Google to Snapchat was an interesting one at the time, but given Snapchat is one of Google’s biggest cloud customers – a fact the search giant occasionally likes to slip into conversation – the defection made sense. We’ll just have to wait and see what prompted this switch.

Oracle didn’t have any comment to make when CloudTech enquired.

Cloud underpins majority of tech trends for 2015, Gartner analysts find

Picture credit: Pam Broviak/Flickr

Cloud computing is one of the 10 strategic technology trends for 2015, according to analysis from Gartner.

The findings were presented by analysts at the Gartner Symposium/ITxpo earlier this week. Gartner defines a strategic technology trend as one “with the potential for significant impact on the organisation in the next three years.”

David Cearley, vice president and Gartner Fellow, says the 2015 trends fall under three main themes: the merging of the real and virtual worlds; the technology impact of the digital business shift; and ‘intelligence everywhere’.

The latter theme can be seen in several of the trends – computing everywhere, the Internet of Things, and smart machines. In other words, computational power is moving away from the device.

Naturally, cloud and client computing will be a key element of this. For 2015, Gartner argues, the focus will be on promoting centrally coordinated applications that can port across multiple devices.

“Cloud is the new style of elastically scalable, self-service computing, and both internal applications and external applications will be built on this new style,” said Cearley.

“While network and bandwidth costs may continue to favour apps that use the intelligence and storage of the client device effectively, coordination and management will be based in the cloud.”

This more sophisticated definition of cloud computing is in stark contrast to cloud’s position in the latest Gartner hype cycle, where it was stuck firmly in the ‘trough of disillusionment’.

Gartner has made various predictions about cloud computing in the past, with varying degrees of success; these include the death of the traditional IT sourcing model and a move to cloud office systems by 2015.

Other trends include software-defined apps and infrastructure, web-scale IT, advanced analytics, and 3D printing.

Read more about Gartner’s prognosis here.

Financial firms still stricken with fear of the cloud, survey shows

Picture credit: SeniorLiving.Org

Three quarters of businesses in the financial sector are still concerned about adopting cloud-based applications, according to survey data from the NCC Group.

The research, which surveyed CIOs at financial services firms with more than 1,000 employees, found that 72% of respondents fear the cloud because of concerns over data not being backed up and issues around disaster recovery.

Two in five (40%) respondents aren’t currently using cloud because they fear sudden data loss on a mass scale, while three quarters (74%) of firms would need at least a week to implement a disaster recovery plan – 5% said it would take two to three months.

This is a particularly interesting admission given some of the scare stories from cloud computing firms over the past several months. These have ranged from the likes of Joyent and Autotask, which went down for a matter of hours, to vendors such as CodeSpaces, which was forced to cease trading altogether after what it described as a “well-orchestrated” DDoS attack.

“Without a proper disaster recovery plan, a company can quickly fall to its knees,” said Daniel Liptrott, managing director of NCC Group’s escrow division. “However, there are comprehensive backup solutions available to those using cloud applications, so businesses needn’t shy away from cloud adoption due to fear of data loss.”

Liptrott added that businesses spend an awful lot on cloud services, but comparatively little on keeping operations afloat. It’s an important point, and one part of the IT infrastructure companies seem to forget. As Gartner analyst Kyle Hilgendorf wrote, cloud exits are not nearly as sexy as cloud deployments – and your business can be brutally exposed when the proverbial excrement hits the air oscillating machine.

Recent figures from cloud provider Databarracks found that only 30% of smaller businesses had a business continuity plan in place, compared to 54% of medium organisations and 73% of large businesses.

Latest Australian cloud computing policy further adopts “cloud first” approach

Picture credit: Max Anderson/Flickr

The latest draft of Australia’s federal cloud computing policy insists that agencies “must” adopt cloud so long as it’s fit for purpose, protects user data and represents value for money.

The 14-page paper, signed by minister for finance Mathias Cormann and minister for communications Malcolm Turnbull, aims to “reduce the cost of government ICT by eliminating duplication and fragmentation, and will lead by example in using cloud services to reduce costs, lift productivity and develop better services.”

Cloud services will have their value for money determined as per the Commonwealth Procurement Rules, and the security of their data assessed under the Protective Security Policy Framework. The government has issued a six-step outline as a ‘high level approach to the process of evaluating cloud services’:

  • Assess information against legislative and regulatory requirements
  • Evaluate the market for cloud services including existing initiatives by other agencies
  • Determine the suitability of the cloud service against the information requirements
  • Procure and implement the cloud service
  • Monitor the cloud service for performance and compliance
  • Review the cloud service for ongoing benefits realisation

Adoption figures show a slow uptake, according to the report. Cloud procurements in AusTender – the centralised hub for business opportunities, cloud or otherwise – have only totalled $4.7m since July 2010. The Australian government spends approximately $6bn a year on IT.

As a result, the report sets out key plans going forward, including the evaluation of cloud services for new IT projects and the establishment of a Cloud Services Panel by January 2015. The key recommendation, however, is to move AusTender to a cloud-based service, following an options analysis by the Department of Finance.

“Agencies have made limited progress in adopting cloud,” the report warns. “A significant opportunity exists for agencies to increase their use of cloud services through the Australian Government Cloud Computing Policy.”

Despite a few teething snags, the UK government’s G-Cloud programme is making steady progress, replacing the CloudStore with the Digital Marketplace for cloud procurements. Australia currently ranks third in the latest Asia Pacific cloud readiness index from the Asia Cloud Computing Association (ACCA), behind Japan and New Zealand – an increase of four places from the previous year.

Read the full report here.

SolidFire announces new funding, new storage nodes, but no plans for IPO

Picture credit: Bob Mical/Flickr

Flash storage provider SolidFire has beefed up its funding pool with a series D round of $82m, bringing its total up to $150m.

The funding was led by Greenspring Associates, a new investor, along with existing investors NEA, Novak Biddle, Samsung Ventures and Valhalla Partners. SolidFire says the new funds will support a global push and advance its all-flash storage architecture.

“Series D financing for SolidFire is important,” Jay Prassl, SolidFire VP marketing, told CloudTech. “When you’re building an infrastructure company and a storage company like SolidFire, it’s a capital-intensive business.

“This D round funding is very important because it puts SolidFire very much on a path to profitability,” he added. “We are growing a very long term standalone storage company, and raising these funds allows us to really set us up on a path to profitability and leave the options open, if you will, for SolidFire to continue to make additional moves as it goes forward.”

Prassl added there was nothing set in stone regarding an IPO – indeed, reading the history books of what’s happened to storage companies in the past, he admits there’s no prize for going public too early.

Citing Violin Memory as an example, Prassl said: “Going public is often just one step in the process of continuing to grow a company, and it’s a choice you make at a certain point in time. Many companies…have been forced to go public…SolidFire certainly does not want to be in that position.”

SolidFire sees itself squarely at the centre of a key trend in big data architecture, offering storage based on flash memory, a more energy-efficient way of reading and digesting data. It’s evidently a popular idea, as the investment money keeps rolling in.

The firm has also announced the expansion of its SF Series product line with two new storage nodes, offering users a cheaper way to get on board with the product. The SF2405 and SF4805 nodes represent the third generation of SolidFire hardware, with the SF2405 an entry-level release and the SF4805 doubling up on it.

SolidFire says the SF2405 is aimed at IT departments and managers looking to take their first steps towards deploying a private cloud infrastructure, and IT as a service – but it doesn’t mean the company is taking its eye off the ball for its traditional large enterprise customer base.

“It’s cut the entry price point for SolidFire storage systems in half,” Prassl said. “That’s significant because that opens up a broader array of customers to SolidFire’s capabilities that maybe weren’t accessible before.

“Many people often think that a smaller storage node indicates a movement towards smaller target customers, and that’s not the case here,” he added.

Prassl added that the keys to initiating a small cloud environment are consolidation, automation of infrastructure, and the ability to scale – something SolidFire feels it does particularly well.

“So many flash companies out there today are focused on one thing: flash”, he said. “We’re a very different storage company. We use flash, for sure, but we go far beyond the media to deliver these three key areas.”

As Prassl argued, this is one of the main reasons why Greenspring Associates took an interest in investing.

Atos and IOC outline plans for cloud computing in 2016 Olympic Games

Picture credit: Oliver E Hopkins/Flickr

International IT services provider Atos has confirmed that Canopy, a platform as a service cloud offering, will provide the platform for the Olympic Games’ move to the cloud.

Canopy, which is backed by Atos, EMC and VMware, will provide a private cloud solution to transition core planning systems for the Olympics, including accreditation, sport entries and qualification, and workforce management.

Atos has been working with the International Olympic Committee (IOC) since the 1980s, and last year signed a new long term contract to deliver IT solutions for the Olympics.

Neither side is playing down the toughness of the challenge. Over 80 competition and non-competition venues will have their IT infrastructure linked together, totalling hundreds of servers and thousands of laptops and PCs.

“Here we see a paradigm shift,” Atos notes, “from a ‘build each time’ to a ‘build once’ model and delivering services over the cloud.

“Rio 2016 is a key milestone in this transformational shift.”

“Atos is our long-term worldwide IT partner who has played a critical role in helping us deliver seven successful Olympic Games,” said the IOC’s Jean-Benoit Gauthier. “We are now trusting it to transfer the delivery of the IT for the Games to the cloud, so we can continue to innovate and ensure an excellent Games experience for all.”

Preparation for getting Rio’s infrastructure in shape began at the time of the last Olympic Games in 2012 with the design of the systems. Currently the focus is on building the systems ready for testing, and by 2016 the IT equipment will be deployed.

Rio will be the first Games with extensive IT infrastructure built in the cloud, with the technology being too nascent for the London Games in 2012. Gerry Pennell, CIO of LOCOG (London Organising Committee of the Olympic and Paralympic Games) told Computing at the time: “The infrastructure in the cloud is not sufficiently mature enough to support the kinds of things we’re doing in the Olympics.”

Given worries from senior IOC officials concerning the state of preparation for the Olympics – one called it “the worst in living memory” – let’s hope the IT building doesn’t go over time.

AT&T and Amazon Web Services buddy up with NetBond VPN service

Picture credit: Mike Mozart/Flickr

Recent stories arriving at CloudTech HQ have focused on the opportunity the cloud provides for telcos, and this one’s no different: AT&T has announced a partnership with Amazon Web Services (AWS) for AT&T NetBond, its network enabled cloud (NEC) solution.

With NetBond, AT&T customers can utilise a VPN to connect to any cloud compute or IT service environment in AT&T’s partner ecosystem, bypassing the Internet completely. The partnership with AWS enables users to access business applications and information stored in Amazon’s cloud.

Melanie Posey, IDC research vice president, described the collaboration as a “likely game changer” and added: “The addition of AWS broadens AT&T’s already expansive NetBond ecosystem and will give customers highly secure, reliable, and on-demand connections to another key public cloud service provider.”

Current NetBond partners include Microsoft, Salesforce.com, VMware, IBM and Box, covering a large breadth of the cloud computing ecosystem.

AT&T’s particular cloud play is an interesting one. NetBond integrates with customers’ existing VPNs, meaning they don’t need to order more equipment; the technology sidesteps DDoS attacks by keeping traffic on AT&T’s private global network; and AT&T says users can save up to 60% on networking costs and increase performance by up to 50%.

The overall effect is an intriguing one as telcos begin to make their behemoth moves into providing cloud services. It’s always been there in the background, of course, but the string of recent strategic shifts can’t just be a coincidence.

Last week comms provider CenturyLink announced expansion to China – always a good sign of growth – while Ericsson recently announced it had taken a majority stake in platform as a service (PaaS) provider Apcera, adding to its Ericsson Cloud System portfolio.

Telcos see the market opportunity: they’ve already got an expansive network and customer buy-in, and if they add cloud services on top, it has all the ingredients of a winning proposition.

For AT&T in particular, it’s been a long time coming. Back in 1993 the firm released a product called PersonaLink, which aimed to be an ‘electronic meeting place’ for people to share information and documents, and could differentiate between ‘dumb’ and ‘smart’ messages.

“You can think of our meeting place as the cloud,” the video stated. Did AT&T invent the cloud more than 20 years ago, as Salesforce CEO Marc Benioff tweeted back in May? Not quite – the technology came from now defunct tech firm General Magic, whose goal was to distribute the computing load evenly between bigger and smaller devices in the network.

But with a big telco network and a large partner ecosystem, AT&T might be coming good on that video’s promise.

CenturyLink expands further into Asian cloud market with Chinese managed hosting

Picture credit: CenturyLink

Communications provider CenturyLink has opened up a Shanghai data centre and announced the availability of its managed hosting service in China.

The latest expansion continues the telco’s aggressive move into the cloud, after a series of data centre openings this year including Toronto, Reading and Chicago. Back in May CenturyLink slashed prices on its cloud storage, in line with competitors, and in August it announced the launch of CenturyLink Private Cloud, which aims to combine the security of private cloud with the agility of public cloud.

The expansion in Asia brings CenturyLink’s total number of data centres to 57.

“For multinational corporations looking to grow their customer base, entering China presents enormous opportunities and challenges,” said Gery Messer, CenturyLink Asia Pacific managing director. “CenturyLink makes it easy for businesses to host within China’s borders, offering access to the same highly secure managed services and consistent IT experience available across our global footprint.”

The current strategy for CenturyLink is a simple one: build colocation, hybrid and cloud services to customer demand – private and public mashed together. Back in August Richard Seroter, director of CenturyLink Cloud, told CloudTech the company wanted to create a “single pane of glass” effect, whereby customers can manage a variety of agile infrastructure services in one interface.

Last year, a report on behalf of the US-China Economic and Security Review Commission found that ‘distrust in foreign technology’, among other factors, was inhibiting the growth of cloud computing in China. Amazon Web Services (AWS) opened a Chinese region in December last year.

CenturyLink is partnering with Chinese IT solutions provider Neusoft to deliver the managed service. “This collaboration offers clients a unique opportunity to expand their IT presence into China by leveraging the resources of two world-class global IT providers to help simplify the complexities inherent in China,” said Angela Wang, Neusoft senior vice president.