All posts by Business Cloud News

EMC and VMware launch hyperconverged VxRail appliance

Storage vendor EMC and virtualisation specialist VMware have jointly launched a family of hyper-converged infrastructure appliances (HCIA) for VMware environments. The plug-and-play appliances are meant to simplify infrastructure management in departments experiencing high growth.

The VxRail appliance family combines EMC’s data services and systems management with VMware software such as vSphere and Virtual SAN. The intention is to create software-defined storage natively integrated with vSphere in a single product family with one point of support. The all-flash VxRail appliances could simplify VMware customer environments and boost performance and capacity in a simple plug-and-play operation, the vendors claim.

The appliances were jointly engineered to integrate virtualisation, computing, storage and data protection in one system with a single point of support, say the vendors. Because they can be clustered at scale, an estate of appliances can grow from supporting two virtual machines (VMs) to thousands of VMs on a ‘pay-as-you-grow’ basis.

Starting prices for small and medium businesses and remote offices are around $60,000, with options for performance-intensive workloads offering up to 76 TB of flash. The appliances will run EMC’s data services, including replication, backup and cloud tiering, at no additional charge. In addition, RecoverPoint for Virtual Machines, Virtual SAN, vSphere Data Protection and EMC Data Domain are all available.

Meanwhile VCE VxRail Manager will provide hardware awareness with timely notifications about the state of applications, VMs and events. VxRail Appliances can use EMC cloud tiering to extend to more than 20 public clouds such as VMware vCloud Air, Amazon Web Services, Microsoft Azure and Virtustream. These can provide an additional 10TB of on-demand cloud storage per appliance.

“The new appliances put IT organisations on a path to eliminating complexity and collapsing cost structures,” said Chad Sakac, President of the Converged Platforms division of EMC.

According to ESG research on hybrid cloud, 70% of IT respondents plan to invest in HCI in the next 24 months. The new appliance family is due out in Q2 2016.

New IBM z13s mainframe was built with a BIOS for hybrid cloud

IBM has designed its latest mainframe to address the challenges stopping hybrid cloud from becoming the de facto model of enterprise computing. The result has been benchmarked by analysts as the world’s most secure server for enterprise hybrid cloud computing.

The new IBM z13s mainframe, unveiled in February and available from March, is pre-installed with high levels of security and a greater capacity to process security functions, according to the manufacturer. The new levels of security are created by embedding IBM’s newly developed cryptography features into the z13s’s hardware. By running cryptography functions in silicon, the mainframe can run its encryption and decryption processes twice as fast as previous generations of machine, boosting the speed of information exchange across the cloud, it claimed.

The new mainframe is the most secure server environment in the world, according to an independent report quoted by IBM from researcher Strategy Analytics (2015 Global Server Hardware and Server OS Reliability Survey).

Encrypting sensitive data across company IT departments, regional offices and the public cloud has become a barrier to adoption of this more efficient model of computing, according to IBM’s senior VP of Systems Tom Rosamilia. In response, the new z13s model adds tamper-resistant, hardware-accelerated cryptographic coprocessor cards. These have faster processors and more memory, encrypting at twice the speed of previous mid-range systems, which means that hybrid clouds can now handle high-volume, cryptographically protected transactions without delay.
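
The performance claim centres on bulk symmetric encryption throughput. As a rough illustration of the kind of encrypt/decrypt workload such coprocessors accelerate, the sketch below times AES-GCM in Python using the cryptography package; this is a hedged, illustrative stand-in only, since on z Systems the same primitive would be dispatched to dedicated hardware rather than a general-purpose library, and the payload size is an arbitrary assumption.

```python
# Illustrative only: times a bulk AES-GCM encrypt/decrypt cycle, the kind of
# symmetric workload IBM says the z13s coprocessors run twice as fast.
# The 'cryptography' package is an assumption; on z Systems the primitive
# would be handled by dedicated crypto hardware, not a userspace library.
import os
import time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)
payload = os.urandom(64 * 1024 * 1024)  # 64 MB of simulated transaction data

start = time.perf_counter()
ciphertext = aesgcm.encrypt(nonce, payload, associated_data=None)
plaintext = aesgcm.decrypt(nonce, ciphertext, associated_data=None)
elapsed = time.perf_counter() - start

assert plaintext == payload
print(f"Encrypt + decrypt of 64 MB took {elapsed:.2f}s "
      f"({128 / elapsed:.0f} MB/s round trip)")
```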

The new model uses the Cyber Security Analytics capability that comes as standard across the z Systems range of mainframes, with the addition of IBM Security QRadar software, which correlates security intelligence from 500 sources to help it spot anomalies and potential threats. This can be used along with the multi-factor authentication built into the z/OS operating system for the mainframe range.

The system also uses IBM’s Security Identity Governance and Intelligence to create policy to govern and audit access, in order to cut internal data loss. Access to application programming interfaces (APIs) and microservices, configurable by IBM integration partners, can be used to shut down any further hybrid computing vulnerabilities according to IBM, which announced the addition of BlackRidge Technology, Forcepoint and RSM Partners to its Ready for IBM Security Intelligence partner programme.

Adobe moves to stop Creative Cloud from deleting user data

Adobe Systems has been forced to take action after its Creative Cloud graphics service started deleting important user data from Mac users’ machines for no apparent reason.

The deletions took place without warning or permission, and appear to have been triggered when users of the Mac version of Creative Cloud logged in following the installation of a new update to the service.

The problem was reported by data backup service Backblaze, which posted a warning to users on its web site outlining the root of the problem. On signing in to Creative Cloud, Mac users were somehow activating a script that deletes the contents of whichever folder sorts first alphabetically in the Mac’s root directory. This was a particularly bad problem for Backblaze users, because the first folder in line for deletion on their machines would be a hidden root folder called .bzvol, which contains critical information. Because hidden folder names beginning with a dot sort ahead of ordinary names, the rogue deletion script hit these users’ most important files first.

Backblaze technical support issued a YouTube video to explain the phenomenon to puzzled users after social media outlets such as Twitter began to register large numbers of complaints about unauthorised data deletions. The problem could be even worse for users who don’t use Backblaze, since the first folder in line for deletion on their Mac root drive would be .DocumentRevisions-V100, a folder that stores the data required for the Mac autosave and Version History functions to work properly. The loss of its contents could leave creative users with severe version control problems and the loss of work they would have assumed was saved automatically.

In other circumstances the bug will empty any folder whose name begins with a space, since such names automatically sort to the top of the listing. A hypothetical reconstruction of the logic appears below.
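
Backblaze’s description amounts to a simple, dangerous pattern: sort the root directory’s entries, take the first one, and remove its contents. The sketch below is a hypothetical reconstruction of that logic, not the actual updater script, which was never published; the dry-run guard is an addition so the example cannot do any harm.

```python
# Hypothetical reconstruction of the reported Creative Cloud bug: the updater
# allegedly cleared the contents of whichever folder sorted first in the root
# directory. DRY_RUN keeps this sketch from actually deleting anything.
import os

ROOT = "/"
DRY_RUN = True  # never run this destructively

# Hidden folders such as ".bzvol" or ".DocumentRevisions-V100" sort ahead of
# ordinary names, which is why they were hit first on affected Macs; a name
# beginning with a space sorts earlier still.
entries = sorted(
    e for e in os.listdir(ROOT)
    if os.path.isdir(os.path.join(ROOT, e))
)
first = os.path.join(ROOT, entries[0])

for name in os.listdir(first):
    target = os.path.join(first, name)
    if DRY_RUN:
        print(f"would delete: {target}")
    # else: shutil.rmtree(target) or os.remove(target)
```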

Adobe said it is stopping the distribution of the update until the issue has been resolved. The version causing the deletions is 3.5.0.206. Adobe has warned Creative Cloud users to delay any updates for the time being.

Verizon announces its plans to pull out of public cloud

Verizon Communications has served notice to its customers that it is to pull the plug on its public cloud offering.

The news emerged when security researcher Kenn White used Twitter to publish a copy of a customer communication from Verizon Communications, which warned clients that Verizon will ‘discontinue its Public Cloud, Reserved Performance and Marketplace services’ on April 12th. As an alternative, Verizon said it will offer Virtual Private Cloud which, it says, provides the cost effectiveness of a multi-tenant public cloud but includes added levels of configuration, control and support. It claims this will improve isolation and control for more advanced businesses.

When Verizon shuts down the virtual servers currently running Public Cloud and Reserved Performance, no data or content will be retained, it told customers, warning that any data not transferred elsewhere before the discontinuation date would be permanently deleted.

Virtual Private Cloud is the service that Verizon intends to carry on offering to the enterprise market, and the company said it is making significant investments in the platform in 2016.

In January BCN reported that Verizon was examining its options for selling its global estate of 48 data centres, a move that would reportedly raise over $2.5 billion and streamline its business. Its colocation portfolio currently generates $275 million a year. Other telcos such as AT&T, CenturyLink and Windstream have also divested themselves of their data centre businesses in recent years.

According to channel publication CRN, Verizon could work with Google to cater for the increasing demand for hybrid cloud systems among enterprise customers, with a Verizon-branded hybrid service running on Google’s public cloud a possibility. This would obviate the need for Verizon to have its own public cloud offering. Neither party has confirmed or denied the speculation about the alleged partnership.

Amazon Web Services buys HPC management specialist Nice

Amazon Web Services (AWS) has announced its intention to acquire high performance computing (HPC) and grid computing specialist NICE.

Details of the takeover of the Asti-based software and services company were not revealed. However, in his company blog AWS chief evangelist Jeff Barr outlined the logic of the acquisition. “These [NICE] products help customers to optimise and centralise their high performance computing and visualization workloads,” wrote Barr. “They also provide tools that are a great fit for distributed workforces making use of mobile devices.”

The NICE brand and team will remain intact in Italy, said Barr. Their brief is to continue to develop and support the company’s EnginFrame and Desktop Cloud Visualization (DCV) products. The only difference, said Barr, is that they now have the backing of the AWS team. In future, however, NICE and AWS are to collaborate on projects to create better tools and services for high performance computing and virtualisation.

NICE describes itself as a ‘Grid and Cloud Solutions’ developer, specialising in technical computing portals, grid and high performance computing (HPC) technologies. Its services include remote visualization, application grid-enablement, data exchange, collaboration, software as a service and grid intelligence.

The EnginFrame product is a grid computing portal designed to make it easier to submit analysis jobs to supercomputers and to manage and monitor the results. EnginFrame is an open framework based on Java, XML and web services. Its purpose is to make it easier to set up user-friendly, application- and data-oriented portals. It simplifies the submission and control of grid-enabled applications, and it monitors workloads, data and licenses from within the same user dashboard. By hiding the diversity and complexity of the native interfaces, it aims to let more users get the full benefit of high performance computing platforms, whose native environments are off-puttingly complex.
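
EnginFrame exposes portal and web-service interfaces rather than a Python API, but the workflow it wraps for its users, submitting a batch job to an HPC scheduler and polling its state, can be sketched as below. The SLURM commands and the job script name are assumptions standing in for whichever scheduler actually sits behind the portal.

```python
# A hedged sketch of the submit-and-monitor workflow a portal like EnginFrame
# wraps for its users. SLURM's sbatch/squeue are assumptions here; EnginFrame
# supports several schedulers and hides these details behind its web UI.
import subprocess
import time

JOB_SCRIPT = "analysis_job.sh"  # hypothetical batch script

# Submit the job; sbatch prints "Submitted batch job <id>".
out = subprocess.run(
    ["sbatch", JOB_SCRIPT], capture_output=True, text=True, check=True
)
job_id = out.stdout.strip().split()[-1]
print(f"submitted job {job_id}")

# Poll the queue until the job no longer appears, i.e. it has finished.
while True:
    q = subprocess.run(
        ["squeue", "-j", job_id, "-h"], capture_output=True, text=True
    )
    if not q.stdout.strip():
        break
    time.sleep(30)
print(f"job {job_id} finished")
```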

Desktop Cloud Visualization is a remote 3D visualization technology that enables technical computing users to connect to OpenGL and DirectX applications running in a data centre. NICE has customers in industries ranging from aerospace to industrial, energy and utilities.

The deal is expected to close by the end of March 2016.

Efficiency gains most compelling reason for cloud, say enterprises

The majority of US enterprises will increase their spending on cloud computing by up to 50% this year, according to US-based researcher Clutch.

Conversely, the research also indicates that 6% of enterprises will cut their spending on cloud. The survey of 300 IT professionals at medium to large enterprises suggests companies put the cloud to different uses, with some using it to manage costs while others wield it as a strategic weapon.

The study found that nearly 30% of the sample will maintain their current levels of cloud spending. Nearly half, 47%, identified efficiency improvements as the main benefit of cloud computing, though there were no figures on whether such gains might encourage companies to spend less on cloud services in future.

The statistics on what the cloud is actually used for suggest it is treated as a cost-saving measure rather than a strategic investment, however. The most popular motive cited for enterprise cloud usage in the US was better file storage, nominated as the primary objective for buying cloud services by 70% of the survey. The next most popular application, backup and disaster recovery, nominated by 62% of the IT professionals, is another cost item. Application deployment was cited by 51% of the sample, but there was no breakdown of whether this was viewed as a cost-saving measure or a strategic investment. Similarly, the figures for buyers who used the cloud for testing, 46%, were not broken down into strategic and cost-saving motives.

Storage costs are the easy win and prove the value of the cloud; tactical use may be a later development, said Duane Tharp, VP of technical sales and services at service provider Cloud-Elements. “The returns on file storage are pretty straightforward. Every company needs file storage,” said Tharp. “The ease of adopting the cloud for file storage could prove the concept and pave the way for the adoption of other use cases later.”

Rackspace launches Red Hat driven Private Cloud

Hosting company Rackspace has launched Private Cloud, which (as the name suggests) is a private cloud delivered ‘as a service’, built on the foundation of OpenStack technology.

The new service joins Rackspace’s portfolio of OpenStack-as-a-service offerings, part of the hosting company’s strategy to simplify and popularise OpenStack private and hybrid clouds.

The service is fully managed by OpenStack and Red Hat experts at Rackspace, backed by the company’s ‘Fanatical Support’ team, and comes with a guarantee of 99.99% OpenStack API uptime. Rackspace contributes to Red Hat’s Enterprise Linux OpenStack Platform by testing and certifying hardware and software compatibility and benchmarking its performance and availability. Rackspace manages and maintains the Red Hat environment, including the underlying Red Hat Enterprise Linux, Red Hat Satellite and Red Hat Enterprise Linux OpenStack Platform.
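
From a customer’s point of view, ‘fully managed’ means the standard OpenStack APIs keep answering while Rackspace handles the plumbing underneath. A minimal sketch using the standard openstacksdk client follows; the cloud name, image, flavor and network names are assumptions for a hypothetical clouds.yaml entry, not Rackspace-specific values.

```python
# Minimal sketch: talking to a managed OpenStack private cloud through the
# standard openstacksdk client. The cloud name "rackspace_private" is an
# assumed entry in the caller's clouds.yaml; the API itself is plain OpenStack.
import openstack

conn = openstack.connect(cloud="rackspace_private")

# List compute instances, exactly as on any OpenStack deployment.
for server in conn.compute.servers():
    print(server.name, server.status)

# Boot a new instance; image, flavor and network names are assumptions.
server = conn.compute.create_server(
    name="demo-vm",
    image_id=conn.compute.find_image("ubuntu-14.04").id,
    flavor_id=conn.compute.find_flavor("m1.small").id,
    networks=[{"uuid": conn.network.find_network("private").id}],
)
conn.compute.wait_for_server(server)
print(f"{server.name} is {server.status}")
```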

It’s all about making the customer’s infrastructure problems go away, according to Darrin Hanson, vice president and general manager of OpenStack Private Cloud at Rackspace. “We help make OpenStack simple by eliminating the complexity and delivering it as a service to customers in their data centre, a Rackspace data centre or in a colocation facility,” said Hanson.

Exponential Docker usage shows container popularity

Adoption of Docker’s containerisation technology has entered a period of explosive growth, with its usage numbers nearly doubling in the last three months, according to the company’s latest figures.

A post on the company blog reports that Docker has now served two billion ‘pulls’ of images. In November 2015 the figure stood at 1.2 billion pulls, and the Docker Hub from which these images are pulled was only launched in March 2013.

Docker’s invention, a self-contained, software-defined file system that encapsulates all the elements of a server in microcosm – code, runtime, system tools and system libraries – has whetted the appetite of developers in the age of the cloud.

In January 2016, Docker users pulled images nearly 7,000 times per minute, four times the run rate of a year earlier. In that one month Docker handled the equivalent of 15% of its total transactions from the previous three years.

The number of ‘pulls’ is significant because each transaction indicates that a Docker engine is downloading an image to create containers from it. Development teams use Docker Hub to publish and consume containerised software and to automate its delivery. The two billion pulls now recorded show how widely the technology is used, and the near-doubling in the last three months points to accelerating adoption of this variation on virtualisation.
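
Each pull in these figures corresponds to an engine fetching an image from the Hub, something a deployment script triggers in a single call. A minimal sketch using the docker-py SDK follows; the nginx image tag is an arbitrary choice for illustration.

```python
# Minimal sketch of what one of Docker's two billion "pulls" looks like from
# the client side: an engine fetches an image from Docker Hub and a container
# is created from it. Uses the docker-py SDK; the image tag is arbitrary.
import docker

client = docker.from_env()

# This single call is counted as one pull in Docker Hub's statistics.
image = client.images.pull("nginx", tag="latest")
print(f"pulled {image.tags}")

# Containers are then created from the pulled image.
container = client.containers.run("nginx:latest", detach=True)
print(f"started container {container.short_id}")
container.stop()
container.remove()
```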

There are currently over 400,000 registered users on Docker Hub. “Our users span from the largest corporations, to newly-launched startups, to the individual Docker enthusiast and their number is increasing every day,” wrote Docker spokesman and blog author Mario Ponticello.

Around a fifth of Docker’s two billion pulls come from its 93 ‘Official Repos’ – a curated set of images from Docker’s partners, including NGINX, Oracle, Node.js and CloudBees. Docker’s security-monitoring service Nautilus maintains the integrity of the Official Repos over time.

“As our ecosystem grows, we’ll be adding single-click deployment and security scanning to the Docker platform,” said Ponticello.

A RightScale study in January 2016 found that 17% of enterprises now have more than 1,000 virtual machines in the public cloud (up 4% in a year), while private clouds show an even stronger appetite for virtualisation techniques, with 31% of enterprises running more than 1,000 VMs, up from 22% in 2015.

Cloud merits acknowledged but adoption concerns linger – Oracle report

Cloud technology is almost universally acknowledged for its catalysing effect on innovation and customer retention, according to new research from Oracle. However, there are still major barriers to adoption.

In Oracle’s study, 92% of its sample group of industry leaders testified that the cloud enables them to innovate faster. It also helps companies hold on to their business, with nearly three quarters (73%) reporting that cloud technology has helped them retain existing customers more effectively. The cloud also comes out well as a strategic weapon, with 76% of enterprises saying that the newer, more flexible model for handling information helps them win new customers.

However, the study conducted for Oracle by IDG Connect indicates there is much room for improvement in the adoption of cloud computing. Only half (51%) of the survey sample say their businesses will have reached cloud maturity within two years. According to an Oracle statement, this is a consequence of current uncertainty about moving to the cloud.

Though a compromise between privately owned IT systems and publicly available services is seen as the obvious choice, there are grave concerns about hybrid cloud adoption. Instead of seeing hybrid as the best of both worlds, many respondents (60%) reported that the thought of managing multiple IT architectures was off-putting. There are fears about the reliability and availability of network bandwidth, cited by 57% of the survey as a barrier to adoption, and a lack of trust in the relationship with IT suppliers was a major concern for 52% of the sample. Meanwhile, those building private cloud infrastructures continue to see security as the prime concern, according to Oracle.

Attitudes could change, but that involves converting the considerable opposition of cloud sceptics. There are still significantly large numbers of IT experts who say that winning over key business decision makers is their biggest challenge; this was identified as an issue by 29% of those surveyed.

Johan Doruiter, Oracle’s Senior VP of Systems in EMEA, remained optimistic. “As cloud rapidly reaches maturity, we are seeing a shift in how enterprises perceive the chief benefits and barriers to adoption,” he said. “Traditional concerns have been replaced by operational worries.”

Cohesity claims data silo fragmentation solution

Santa Clara-based start-up Cohesity claims it can drastically reduce the escalating costs of secondary storage.

The new Cohesity Data Platform achieves this, it reckons, by consolidating all the diverse backup, archive, testing, development and replication systems onto a single, scalable entity.

In response to feedback from early adopters, it has now added site-to-site replication, cloud archive, and hardware-accelerated, 256-bit encryption to version 2.0 of the Data Platform (DP).

The system tackles one of the by-products of the proliferation of cloud systems: the creation of fragmented data silos. These are the after-effects of the rapid, unstructured growth of IT, which led to the adoption of endless varieties of individual systems for handling backup, file services, analytics and other secondary storage use cases. By unifying them, Cohesity claims it can cut the storage footprint of a data centre by 80%, and it promises an immediate, tangible return on investment by removing the need for a separate backup infrastructure.

Among the time-saving features added to the system are automated virtual machine cloning for testing and development and a newly added public cloud archival tier. The latter gives enterprise users the option of spilling their least-used data over to Google Cloud Storage Nearline, Microsoft Azure or Amazon S3 and Glacier in order to cut costs. The Cohesity Data Platform 2.0 also provides ‘adaptive throttling of backup streams’, which minimises the burden that storage places on the production infrastructure.
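
The archival tier described here is, at bottom, a lifecycle policy: cold data migrates to cheaper object storage. As a hedged illustration of what that policy looks like on the Amazon S3 side, the boto3 sketch below transitions objects to Glacier after 90 days; the bucket name, prefix and threshold are assumptions, and Cohesity’s own tiering engine is proprietary and works differently in detail.

```python
# Hedged illustration of a cloud archival tier on the Amazon S3 side: a
# lifecycle rule that migrates cold objects to Glacier. Bucket name, prefix
# and the 90-day threshold are assumptions; Cohesity's tiering is proprietary.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="cohesity-archive-demo",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-cold-secondary-data",
                "Filter": {"Prefix": "secondary/"},
                "Status": "Enabled",
                # Move least-used data to Glacier after 90 days.
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"}
                ],
            }
        ]
    },
)
print("lifecycle rule installed")
```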

“We manage data sprawl with a hyperconverged solution that uses flash, compute and policy-based quality of service,” said Cohesity CEO Mohit Aron.