Actifio Partners with Camouflage | @CloudExpo @Actifio #Cloud

Actifio has announced its partnership with Camouflage Software Inc., a leading provider of solutions for data masking in Test Data Management. The partnership brings a best-in-class solution to address the challenges of data access, control, security, and storage costs in the test and development space.
“Software is eating the world,” as Marc Andreessen famously said in 2011. Since then, more and more industries have been transformed by software, to the point where the largest distributor of films has no theaters (Netflix), the largest provider of delivery services has no cars (Uber), and the largest renter of rooms owns no hotels (Airbnb). Software – especially the development of company-specific applications and platforms with the power to create sustainable advantage and even transform entire industries – has become a critical lever of business strategy in even the most unlikely business segments. Meanwhile, physical and technical infrastructure, the source of differentiated capability for decades, if not centuries, is increasingly seen as a commodity to be accessed at the lowest possible cost.

Cloud market growing 28% a year and worth $110 billion says study

Cloud services boomed in 2015 as the barriers to adoption toppled and confidence surged, says a new study. Operators and vendors in the six major cloud services and infrastructure market groups earned $110 billion in the four quarters ending in September 2015, according to new data from Synergy Research Group. This represents an annual growth rate of 28% on average.

The fastest-growing sectors, public infrastructure as a service (IaaS) and platform as a service (PaaS), grew at almost double the average rate, with a 51% increase in revenues over the 12-month period. Perhaps surprisingly, the public cloud is still growing faster than the hybrid cloud, which many pundits have tipped as the immediate future for enterprise computing as companies hedge their bets between on-premise and off-premise computing models.

The private and hybrid cloud infrastructure service markets grew by 45%. Spending on infrastructure hardware and software is still higher than spending on cloud services, but the gap is narrowing rapidly. The top six companies in the hybrid cloud sector were identified as Cisco, HP Enterprise (HPE), AWS, Microsoft, IBM and Salesforce.

Even the lowest performing cloud sectors grew by 16%, Synergy reported.

The latest yearly figures, measured in the period from Q4 2014 to Q3 2015, showed that the total spend on infrastructure hardware and software to build cloud services exceeded $60 billion. Of that $60 billion, at least $30 billion was apportioned to private cloud projects. However, spending on public cloud projects, while in the minority, is growing much more rapidly.

The investments in infrastructure by the cloud service providers brought a return, the analyst firm says: the figures show they helped generate $20 billion in revenues from cloud infrastructure services (IaaS, PaaS, private and hybrid services) and a further $27 billion from SaaS. There were also returns from additional income sources built on supporting internet services such as search, social networking, email and e-commerce. A new sector is emerging, too: unified communications as a service (UCaaS) began to show healthy growth and, according to Synergy, is showing signs of driving some radical changes in business communications.

Last year the cloud became mainstream and moved beyond the early adopter phase as barriers to adoption fell, according to Synergy Research Group founder Jeremy Duke. “Cloud technologies are now generating massive revenues and high growth rates that will continue long into the future,” added the firm’s Chief Analyst John Dinsdale.

BT Cloud Connect to give customer direct link to Salesforce

Telco BT is to give its corporate customers the option of a hotline to Salesforce’s cloud service through its BT Cloud Connect service.

The telco claimed it can provide a high-performance link to Salesforce’s Customer Success Platform and give customers more reliable and faster performance from the system, as part of its Cloud of Clouds programme. BT’s global network connects 200 data centres, 48 of which it owns and operates itself.

The service will be rolled out incrementally from February 2016, starting in the US, followed by Europe and then the rest of the world.

Clients desperately want the cloud to help them manage and access vast amounts of valuable data, but it needs to be made easier for them, according to Keith Langridge, VP of network services at BT Global Services. “Our Cloud of Clouds makes it easier by providing direct, secure and high performance connectivity to the applications. Salesforce is a true pioneer of the technology so this is an important milestone in the delivery of our vision,” said Langridge.

The methods that companies use to connect with the cloud need to be refined, according to Salesforce’s UK MD Andrew Lawson. “BT is accelerating this shift for its customers,” said Lawson. The addition of Salesforce to its Cloud of Clouds roster will transform the way BT’s clients can connect with customers, partners and employees.

OVH claims integration of TimeSeries PaaS with Sigfox IoT cloud

France-based ISP turned cloud operator OVH has announced that its TimeSeries platform service is now integrated with the IoT cloud service provided by IoT specialist Sigfox.

The fine-tuning of the two services was announced at the CES 2016 show in Las Vegas, where the two service providers demonstrated how the OVH TimeSeries platform as a service (PaaS) can analyse and act on data fed in from 7,000 sensors connected to the Sigfox IoT network.

OVH claimed that machine-learning algorithms within the TimeSeries service can identify patterns and automatically trigger actions in response to the perceived situation. Having started as an ISP in Roubaix, France, OVH has evolved into a cloud service provider operating in France, Germany, Italy, Poland, Spain, Ireland, the United Kingdom, the Netherlands, Lithuania, the Czech Republic and Finland.
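
OVH has not published the details of those algorithms, but a minimal sketch of the general idea – a rolling baseline over incoming sensor readings that fires an action when a value deviates sharply – might look like the following Python, where the window size, threshold and trigger are all hypothetical choices, not OVH’s implementation:

    from collections import deque
    from statistics import mean, stdev

    WINDOW = 60          # readings kept for the rolling baseline (hypothetical)
    THRESHOLD_SIGMA = 3  # deviation, in standard deviations, that triggers an action

    # Single shared window for simplicity; a real system would track per sensor.
    history = deque(maxlen=WINDOW)

    def trigger_action(sensor_id, value):
        """Placeholder for the automated response, e.g. calling a webhook."""
        print(f"alert: sensor {sensor_id} reported anomalous value {value}")

    def ingest(sensor_id, value):
        """Feed one reading into the rolling window and act on outliers."""
        if len(history) >= 2:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > THRESHOLD_SIGMA * sigma:
                trigger_action(sensor_id, value)
        history.append(value)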

In November, BCN reported that OVH had launched a public cloud service in the UK with customisable security, as protection against cyber attacks becomes a major selling point alongside open-systems mobility. OVH recently expanded its offering into the US and Canada. It currently has 220,000 servers in 17 data centres, but claims it will have opened 12 new data centres by 2018.

The new integration means that companies can now use OVH’s TimeSeries PaaS application programming interfaces (APIs) to retrieve data. This frees them from having to build and manage their own databases, OVH claims.
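
As a rough illustration of what retrieving readings through such an API could look like – the endpoint URL, authentication scheme and query parameters below are assumptions for the sketch, not OVH’s documented interface:

    import requests

    # Hypothetical endpoint and token; the real TimeSeries API may differ.
    BASE_URL = "https://timeseries.example-ovh.com/api/v1"
    TOKEN = "my-api-token"

    def fetch_series(metric, start, end):
        """Query one metric's readings for a time range over HTTP."""
        resp = requests.get(
            f"{BASE_URL}/query",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params={"metric": metric, "start": start, "end": end},
        )
        resp.raise_for_status()
        return resp.json()

    readings = fetch_series("sensor.temperature",
                            "2016-01-06T00:00:00Z", "2016-01-07T00:00:00Z")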

The integration and demo at CES will help companies to understand the entire value chain of the Internet of Things, according to OVH CEO Laurent Allard. “A turn-key system for storing and managing data and hosting business applications makes it much simpler, quicker and cheaper to get running with the IoT,” said Allard.

In other news, Sigfox has also announced a pilot programme with the French postal service company La Poste. The two companies are collaborating on a new kind of connected postal service.

The Domino programme aims to automate the ordering of parcel pickup and delivery via Sigfox’s IoT network. A regional rollout will start in the first half of 2016.

2015 – The Year of the Open Source Explosion By @JnanDash | @CloudExpo #Cloud

Open source software – software freely shared with the world at large – is an old idea, dating back to the 1980s, when Richard Stallman started preaching the gospel of what he called free software. Then Linus Torvalds started working on Linux in the early 1990s. Today, Linux runs our lives. The Android operating system that runs so many Google phones is based on Linux. When you open a phone app like Twitter or Facebook and pull down all those tweets and status updates, you’re tapping into massive computer data centers filled with hundreds of Linux machines. Linux is the foundation of the Internet.

Infinio Blog: Executive Viewpoint 2016 Prediction

This post originally appeared on Virtual-Strategy Magazine and is authored by Scott Davis, CTO at Infinio, a GreenPages partner.  It does not necessarily reflect the views or opinions of GreenPages Technology Solutions.

It’s that time of year for CTO predictions. The rate of innovation and disruption across IT is certainly accelerating, providing ample opportunities for comment. Although there is a significant amount of disruptive change going on across many disciplines, I wanted to primarily focus on storage observations for 2016.

Emergence of Storage-class Memory

Toward the end of 2016, we’ll see the initial emergence of a technology that I believe will become the successor to flash. This new storage technology (storage-class memory, or SCM) will change today’s storage industry just as fundamentally as flash changed the hard drive industry. Intel and Micron call one version 3D XPoint, while HP and SanDisk have joined forces on another variant.

SCM is a persistent memory technology – 1,000 times faster than flash, 1,000 times more resilient, and, unlike flash, it delivers symmetric read/write performance. SCM devices connect to memory slots in a server and are mapped and accessed much like memory, although they are slightly slower. Unlike previous generations of storage technology, SCM devices can be addressed atomically at either byte-level or block-level granularity. Operating systems will likely expose them either as very fast block storage devices formatted by traditional file systems and databases (for compatibility) or as direct memory-mapped “files” for next-generation applications. Hypervisors will likely expose them as new, specially named and isolated SCM regions for use by applications running inside the guest operating system (OS).
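
To make the memory-mapped access model concrete, here is a minimal Python sketch of byte-addressable writes to a file backed by persistent memory; the mount path is hypothetical, and real SCM programming would also use cache-flush and ordering primitives that Python does not expose:

    import mmap
    import os

    # Hypothetical path: a file on a filesystem backed by an SCM/pmem device.
    PATH = "/mnt/pmem/example.dat"
    SIZE = 4096

    fd = os.open(PATH, os.O_RDWR | os.O_CREAT)
    os.ftruncate(fd, SIZE)

    buf = mmap.mmap(fd, SIZE)  # map the persistent region into the address space
    buf[0:5] = b"hello"        # byte-granularity store, no block I/O round trip
    buf.flush()                # ask the OS to push the update to persistence
    buf.close()
    os.close(fd)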

I expect that SCM will provide unprecedented storage performance, upend the database/file system structures we’ve grown accustomed to, and further drive the trend towards server-side storage processing, shaking up everything from storage economics to application design.

VSAN becomes an Alternative to HCI

Hyperconverged infrastructure (HCI) is a sales strategy wrapped around a software-defined storage architecture that has garnered much attention in the past few years. HCI offerings comprise integrated hardware and software “building blocks” bundled and sold together as a single entity. The hardware is typically a server with direct attached storage disks and PCI-e flash cards. All the software needed to run virtual workloads is packaged as well, including hypervisor, systems management, configuration tools and virtual networking. Perhaps most relevant to our part of the industry, there is always a software-defined storage (SDS) stack bundled with HCI offerings that virtualizes the disks and flash hardware into a virtual storage array while providing storage management capabilities. This SDS stack delivers all the storage services to the virtual machines.

In VMware’s EVO:Rail offering, VMware Virtual SAN (VSAN) is this integrated storage stack. Now battle-tested and rich with enterprise features, VSAN will become more prevalent in the datacenter.  Organizations attracted to cost-effective, high-performance server-side software-defined storage solutions no longer have to embrace the one-size-fits-all hyperconverged infrastructure sales strategy along with it. They will increasingly choose the more customizable VSAN-based solutions, rather than prepackaged HCI offerings, particularly for sophisticated enterprise data center use cases.

Flash Continues to Complement Traditional Spinning Drives, Not Replace Them

While the all-flash array market continues to grow in size, and flash continues to fall in price, the reality of flash production is that the industry does not have the manufacturing capacity necessary to enable flash to supplant hard disk drives. A recent Register article quoted Samsung and Gartner data suggesting that by 2020 the NAND flash industry could produce 253 exabytes (EB) – three times the current manufacturing capacity – at a cost of approximately $23 billion.

Click to read the rest of this post!

Are you interested in learning how Infinio could potentially fit into your IT strategy? Reach out!

Why Parallels Desktop for Mac Business Edition Beats Fusion

Get full control of your VM licenses with Parallels Desktop for Mac Business Edition. Should the management of your virtual machine software be any different than other asset management? If you need to obtain visibility and control of your VM software, it’s time to consider Parallels Desktop for Mac Business Edition. Offering the best and […]

The post Why Parallels Desktop for Mac Business Edition Beats Fusion appeared first on Parallels Blog.

Enterprise IT in 2016: Why it is no longer enough to just be the best in your industry

Over the course of 2015, we’ve seen many of the trends we predicted last year come to fruition. As cloud adoption in the UK soared past 84% in May, it was evident that we had been right that the technology would become even more mainstream. With enterprises aiming to become more mobile, 2015 has also seen the normalisation of “anytime, anywhere working” and the increased dependency on IT to help drive business transformation.

With over two-thirds of IT departments set to increase their operational IT budgets in 2016, the importance of IT in driving business goals does not look likely to diminish anytime soon – the question really lies in how. Although we don’t have a crystal ball to predict anything with certainty, there are some emerging trends that we see shaping enterprise IT in the year ahead:

Businesses will have to use IT to evolve

It is no longer sufficient “just” to be the best in your industry.  Businesses across all industries will need to examine their untapped data assets to drive the next wave of innovation and competition. Disruptive technologies have changed the potential of businesses of all sizes, and within each industry there will be those that will effectively leverage data assets to drive unprecedented levels of competition in 2016.

The nature of security will continue to change rapidly

As we’ve seen over the past 18 months, the nature of the threat has changed in fundamental ways. Perimeter-based security is no longer sufficient – if it ever was in the first place. More than ever, a deep, granular, distributed security model will be needed. Advances in software-defined networking, combined with other non-traditional approaches (think beyond IP and port ACLs!), will be what enables IT to keep pace with the evolving threat.
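
As a toy illustration of moving beyond IP and port ACLs, a distributed policy might key on workload identity and labels rather than network addresses; the rule model below is invented for this sketch rather than taken from any product:

    # Each rule allows traffic between workloads by role label, not by IP or port.
    POLICY = [
        {"from_role": "web", "to_role": "api"},
        {"from_role": "api", "to_role": "db"},
    ]

    def is_allowed(src_labels, dst_labels):
        """Check a connection against label-based rules instead of address ACLs."""
        return any(rule["from_role"] in src_labels and rule["to_role"] in dst_labels
                   for rule in POLICY)

    # A web workload may reach the API tier, but never the database directly.
    assert is_allowed({"web"}, {"api"})
    assert not is_allowed({"web"}, {"db"})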

Understanding of how to map application portfolios to the many cloud models will grow

Just as mainframes still exist (indeed, they are a growth area for some), on-premise IT, private cloud, boutique cloud, and hyperscale cloud will all persist and remain relevant. Much of the new development – so-called “born in the cloud” applications – is likely to align with the hyperscale cloud, while the vast majority of existing enterprise applications may not.

The value proposition of hyperscale cloud will be held back by a shortage of truly able developers

There is already a shortage of developers who can truly capitalise on the value of hyperscale cloud. Indeed, many “born in the cloud” applications are really just traditional designs deployed to the cloud. Applications, even newly developed ones, often rely on the infrastructure layer to provide resilience. The next generation of applications, engineered to provide resilience at the application layer – i.e. those that can tolerate infrastructure failure – will suffer until this developer shortage is addressed. Unlikely to end in 2016, this is a long-term problem that will require one or more higher-education cycles to fully resolve.
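
A minimal sketch of what resilience at the application layer can mean in practice: the application itself retries against a replica in a second failure domain instead of assuming the infrastructure never fails. The endpoints are placeholders:

    import requests

    # Hypothetical replicas of one service in two independent failure domains.
    ENDPOINTS = [
        "https://orders.region-a.example.com/api/orders",
        "https://orders.region-b.example.com/api/orders",
    ]

    def resilient_get(timeout=2.0):
        """Try each replica in turn, tolerating the failure of any one of them."""
        last_error = None
        for url in ENDPOINTS:
            try:
                resp = requests.get(url, timeout=timeout)
                resp.raise_for_status()
                return resp.json()
            except requests.RequestException as err:
                last_error = err  # this replica failed; fall through to the next
        raise RuntimeError(f"all replicas failed: {last_error}")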

There will be a resurgence of the private cloud…but not for long

Early adopters of public cloud will re-evaluate the commercial fit of private cloud – and late adopters may move directly to private cloud due to regulation and compliance needs. Cloud economics are compelling for a wide variety of use cases, but a CAPEX investment supporting a stable, long-term application base often makes sense. In addition, many regulatory bodies regularly lag behind innovation, and private cloud often addresses compliance obligations while still providing many of the benefits of public cloud. However, this resurgence is likely to be short-lived as regulatory bodies catch up, applications evolve, and more flexible pricing models for public cloud prevail.

As we move into 2016, it’s clear that organisations will continue to look to their IT teams to remain competitive – both for developing new business solutions and meeting existing challenges. As such, it’s important that they are prepared to tackle the biggest hurdles and continue to take advantage of the opportunities that IT presents to the enterprise.

AWS announces latest EC2 price cuts, launches South Korea data centre

Amazon Web Services (AWS) has embraced 2016 by lowering the prices of C4, M4, and R3 instances by 5% in its EC2 cloud, as well as announcing the launch of its new Asia Pacific data centre region, in Seoul.

The price reductions for C4 and M4 instances running Linux apply to customers in the US East, US West, Ireland, Frankfurt, Tokyo, Singapore and Sydney regions; the same applies to R3 instances, with the addition of Brazil.

The M4 instances were launched back in June as a ‘next generation’ offering with the aim of providing lower network latency and less packet jitter, a move analyst house Zacks.com described as “an added feather to its cap.”

Customers concerned over how much they will get billed at the end of this month should note that the reductions for on-demand and dedicated hosting apply retroactively as of January 1, while those for reserved instances are in effect as of January 5. The move represents the 51st AWS price cut overall.
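
As a toy calculation of the difference those effective dates make – the hourly rates below are invented for illustration, not AWS’s price list, and real AWS billing is more involved:

    OLD_RATE = 0.100              # $/hour before the cut (hypothetical)
    NEW_RATE = OLD_RATE * 0.95    # 5% reduction
    HOURS_IN_JAN = 31 * 24

    # On-demand: the cut applies retroactively from January 1,
    # so the whole month is billed at the reduced rate.
    on_demand_bill = HOURS_IN_JAN * NEW_RATE

    # Reserved: the cut takes effect from January 5, so the first
    # four days are billed at the old rate in this simplified model.
    hours_before = 4 * 24
    reserved_bill = (hours_before * OLD_RATE
                     + (HOURS_IN_JAN - hours_before) * NEW_RATE)

    print(f"on-demand: ${on_demand_bill:.2f}, reserved: ${reserved_bill:.2f}")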

Elsewhere, the South Korean AWS data centre is now open for business, having been originally announced in November alongside India, Ohio, a second region in China, and the UK – the latter of which this publication recently analysed. The Seoul region has two availability zones, bringing the total to 32 ‘zones’ across 12 geographic regions, including its government cloud. Each cloud provider defines its data centre footprint differently, but for comparison Microsoft – which is also launching a UK data centre this year – lists 20 ‘regions’ in its portfolio.

One company which is grabbing the chance to house data in Seoul through AWS is Korean gaming firm Nexon, which has more than two thirds of its sales coming from overseas. “We are currently running our new mobile MMORPG game, HIT, 100% on AWS,” said Sang-Won Jung, Nexon VP of new development. “With the new AWS region in Korea, we plan to use AWS not just for mobile games, but also for latency sensitive PC games as well.”

You can find out more about the EC2 price cuts here and the Seoul data centre launch here.

Cloudability buys DataHero for more accurate cost analysis

Accounting start-up Cloudability has acquired data visualisation service provider DataHero, another cloud start-up that formed at about the same time.

Oregon-based Cloudability’s growth has come from helping companies track their spending on public cloud infrastructure. It announced the addition of San Francisco-based DataHero on the company blog, hailing the extension of its presence 500 miles away into California.

However, while 12 of DataHero’s staff are to join Cloudability, its founder and CEO Chris Neumann will not, and neither will the company’s CFO, engineering VP or VP of marketing. DataHero will continue to operate as normal for the foreseeable future, until it can be integrated into the Cloudability portfolio.

Cloudability CEO Mat Ellis said the process will involve building a connector to make it easier for clients to use DataHero to ingest different information streams, such as invoices from Zuora and conversions from Google Analytics, into Cloudability. The upshot, he said, is to help clients see what’s happening in their business and get a sense of the business costs that matter, such as the IT cost per new customer or the unit contribution margin after the cost of goods sold. The technology matters to cloud users because it helps companies save money on research and development by bringing them the best of both products in one service. “We both have an awesome dashboard which cost a lot of money to get right,” said Ellis. “Now there’s no need to do that twice.”

As the cloud makes it harder for managers to get a clear picture of their asset performance, the data visualisation market has entered a period of consolidation: Salesforce bought EdgeSpring, Zendesk bought BIME, Microsoft bought Datazen and Cloudera bought DataPad. Cloudability has also acquired start-ups in other areas, such as CloudVertical, RipFog and Attribo.

DataHero had previously raised $10 million in venture funding, the latest round of $6.1 million having been announced in May 2015.

“As companies spend more to run applications on public clouds, managing that cloud spending becomes increasingly urgent, difficult and risk prone,” wrote Ellis on his own blog. “Mastering the cloud at scale requires us to think about spending in a completely new way.”

Instead of asking macro-economic questions about how much IT is costing every year, the new challenge is to provide micro-economic detail about the cost of every activity, he argued. “We should ask the cost of almost anything: each web page served, widget sold, ride taken across town or flight to the other side of the planet,” he said.
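
A sketch of that micro-economic view, with invented figures: divide a period’s cloud spend, attributed by activity, by the number of times each activity occurred to get a unit cost per activity:

    # Invented figures: one month of cloud spend and the activities it served.
    monthly_spend = 42_000.00  # total cloud bill in dollars
    activity_counts = {
        "web_page_served": 120_000_000,
        "widget_sold": 85_000,
        "new_customer": 1_400,
    }

    # Assumed attribution of spend to each activity class (hypothetical).
    spend_share = {"web_page_served": 0.50,
                   "widget_sold": 0.30,
                   "new_customer": 0.20}

    for activity, count in activity_counts.items():
        unit_cost = monthly_spend * spend_share[activity] / count
        print(f"cost per {activity}: ${unit_cost:.6f}")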

The cost of the DataHero acquisition was not disclosed.