Category archive: Storage

NetApp partners with Google Cloud to maximise flexibility for cloud data storage

NetApp, a data infrastructure company, has expanded its partnership with Google Cloud to make it easier for organisations to leverage their data for generative AI (GenAI) and other hybrid cloud workloads. NetApp and Google Cloud are announcing the Flex service level for Google Cloud NetApp Volumes, which supports storage volumes of nearly any size. NetApp…


Future-proof your business: cloud storage without the climate cost

With over half of all corporate data held in the cloud as of 2022, demand for cloud storage has never been higher. This has triggered a surge in energy consumption throughout the data centre industry, leading to hefty greenhouse gas (GHG) emissions. Worryingly, the European Commission now estimates that by 2030, EU data centre energy use will…


NetApp unveils updates to industry’s ‘only unified data storage solution’

NetApp, a cloud-led, data-centric software company, has made updates to what it describes as the industry’s only unified data storage solution, including new block storage products, multiple improvements to public cloud storage services, and updates to NetApp Keystone Storage-as-a-Service (STaaS) – all designed to drive simplicity, savings, security and sustainability for customers. Over two…


Cloud Repatriation is picking up speed

Massive data growth, rising costs for cloud services, and the need for more flexibility have given new momentum to hosting data and workloads on-premises, as Eric Bassier, senior director of products at Quantum, explains. The benefits of cloud computing, in general, are undisputed. Cloud usage has grown rapidly over the past decade and particularly in the…


Rescanning iSCSI disks

If we have changed the size of an iSCSI disk, we will need to rescan the iSCSI sessions.

Let's assume we have grown the following disk:

# fdisk -l /dev/sdb

Disk /dev/sdb: 32.2 GB, 32212254720 bytes
64 heads, 32 sectors/track, 30720 cylinders
Units = cylinders of 2048 * 512 = 1048576 bytes

   Device Boot      Start         End      Blocks   Id  System
/dev/sdb1               1       30720    31457264   83  Linux

With iscsiadm we can rescan the iSCSI sessions:

# iscsiadm -m node -R
Rescanning session [sid: 1, target: iqn.1992-08.com.netapp:sn.193472889, portal: 10.12.16.222,3260]

Next, we will see the new size of the disk:

# fdisk -l /dev/sdb

Disk /dev/sdb: 85.8 GB, 85899345920 bytes
64 heads, 32 sectors/track, 81920 cylinders
Units = cylinders of 2048 * 512 = 1048576 bytes

   Device Boot      Start         End      Blocks   Id  System
/dev/sdb1               1       30720    31457264   83  Linux
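
Note that in the output above only the disk size has changed; the partition /dev/sdb1 still shows its original size. If the partition and the filesystem on it are meant to use the new space, they have to be grown separately. A minimal sketch, assuming the partition holds an ext4 filesystem and that the growpart utility (from the cloud-utils package) is available:

# growpart /dev/sdb 1
# resize2fs /dev/sdb1

For other setups (XFS, or an LVM physical volume on the partition) the corresponding tools, such as xfs_growfs or pvresize, would be used instead.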


A Look into Cloudian’s HyperStore 4000

Cloudian has come up with a new cloud data archiving product called HyperStore 4000, which it says offers a range of features to support the changing storage needs of businesses.

HyperStore 4000 offers object storage, which means data is stored as objects rather than as files. The advantage is that each object is a self-contained unit addressed by a flat key, instead of a file nested within the subfolders of a file system.
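
The article itself gives no examples, but object stores of this kind (Cloudian's included) are typically accessed through an S3-compatible API, so a quick sketch can make the flat-key model concrete. The endpoint, bucket name and key below are purely illustrative assumptions:

# aws --endpoint-url https://s3.hyperstore.example.com s3api put-object --bucket archive-demo --key projects/2016/report.pdf --body report.pdf
# aws --endpoint-url https://s3.hyperstore.example.com s3api list-objects-v2 --bucket archive-demo --prefix projects/2016/

Even though the key looks like a path, there is no directory tree behind it: the second command simply filters objects whose flat key starts with the given prefix.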

This cloud archiving option comes with 700 TB of storage and two separate compute nodes per chassis. In practice this gives a high level of flexibility, as the system can be configured as a three-way cluster, and a built-in cloud tiering system helps keep data highly available.

With this product, customers have the choice to store data on their own premises or on popular cloud platforms like AWS, Microsoft Azure and Google Cloud. This data can tier to Cloudian’s public cloud as well.

HyperStore 4000 is a natural addition to Cloudian’s product line, combining the company’s existing expertise with what businesses need today. Data is growing at an astronomical rate: according to a report published by IDC, the amount of data doubles every two years, which means that by 2020 we’ll reach 44 zettabytes (equivalent to 44 trillion gigabytes). Such rapidly growing data needs ample storage space, which is exactly what HyperStore 4000 offers with its 700 TB of capacity.

According to Jon Toor, Cloudian’s Chief Marketing Officer, the device will be particularly useful for industries that deal with large amounts of data, such as genome sequencing companies, video surveillance operations, and the entertainment industry. These industries prefer large on-premises storage over cloud simply because it is safe and they know where their data is located. He also suggested that it could replace tape archives, which are slowly disappearing.

In terms of pricing, too, Cloudian’s HyperStore 4000 is competitive. A report released by the company says the device offers on-premises performance at the price of cloud services, costing 40 percent less per GB than other Cloudian products. This lower pricing is Cloudian’s way of establishing itself in a market dominated by cloud providers with extensive infrastructure.

In fact, cost saving is one of the biggest reasons many companies move to the cloud, so if Cloudian can offer the same cost, there is a good chance customers will buy HyperStore 4000. On top of that, the many recent reports of hacking and data loss have renewed concerns about cloud security, and when Cloudian offers a solution it positions as safer than the cloud at the same price, many businesses are likely to look into it.

In short, Cloudian has come up with a competitive product to bridge the gap between cloud and on-premises storage. Its competitive price, abundant storage and ease of use may well make businesses reconsider their data archiving strategies.


Cisco Introduces New Products

Cloud applications are becoming an integral part of everyday life, both at the personal and professional level, so it is little wonder that companies are vying with each other to tap into this huge market. The latest company to join the enterprise cloud suite bandwagon is Cisco, with the introduction of ONE Enterprise Cloud Suite.

Introduced at Cisco’s Partner Summit in San Francisco on November 1st, ONE Enterprise Cloud Suite is a hybrid cloud software solution that allows companies to make the most of their cloud environment. In many ways it is a self-service portal customized to the needs of end users, app developers, and IT professionals, giving them a flexible environment backed by a solid infrastructure. It offers advanced automation tools for managing the infrastructure, cloud, and other related services, providing real-time diagnostics and historical analysis. One of its most important features is big data automation, which brings greater consistency and reduced risk.

Besides the cloud suite, Cisco also announced a new storage server, the UCS S3260. Part of the Cisco Unified Computing System (UCS), it is the first offering in the UCS S-Series of servers. With a modular architecture that Cisco describes as the first of its kind in the industry, the server offers a level of scalability and cloud connectivity intended to help customers convert their data into useful business insights. It can store terabytes of data, quickly scalable to petabytes if needed, with Cisco’s UCS Manager and unified I/O connectivity.

Another notable aspect of this server is that it consumes almost 50 percent less power than similar servers and takes up to 60 percent less space. This matters because power consumption is one of the biggest components of a server’s operating costs, closely followed by the cost of space. By addressing both, Cisco claims to have lowered the total cost of ownership by almost 50 percent compared with public cloud, while offering the infrastructure to power any kind of workload. This can be particularly important for IoT and other data-driven applications, as the server can be scaled quickly to store the enormous amounts of data generated every minute today. Going forward, the rate of data generation is only going to increase, and servers like the UCS S3260 are expected to meet this growing demand for storage.

Through this new cloud platform and storage server, Cisco wants to provide an architecture that makes it easy for customers to Analyze, Simplify, Automate, and Protect (ASAP) their data, according to a press release from the company. Together, the two announcements should make it easier for businesses to get the most out of hybrid cloud adoption.

With these offerings, Cisco has made a foray into the storage business, building on its successful server business.


A Look Into IBM’s Cloud Object Storage

IBM has set a high standard for cloud storage with its new service, Cloud Object Storage. The service allows organizations to store any amount of unstructured data on their own servers, in the cloud, or in any combination, at one of the lowest rates in the cloud storage market today. It will be available in the US from October 13, 2016 and in Europe from April 1, 2017.

The service is built on a technology called SecureSlice that combines encryption, erasure coding, and geographical dispersal of data. Encryption encodes the data so that only authorized parties can read it. Erasure coding protects the data in a different way: it breaks the data into segments, expands them, and encodes them with redundant pieces, and the resulting fragments are stored across different devices or geographical locations. Because a sufficiently large subset of the fragments is enough to rebuild the original, the method is particularly useful for reconstructing lost or corrupted data; for example, a scheme that expands an object into 16 fragments, any 10 of which can rebuild it, tolerates the loss of any six fragments. In this sense it is a strong replacement for RAID systems, as the time and overhead needed to reconstruct data are greatly reduced. Lastly, geographic dispersal spreads the data across different locations for greater security and redundancy. IBM acquired the SecureSlice technology when it bought a company called Cleversafe last year for $1.3 billion.

There are multiple storage options available to users. One, called Cross Regional Service, lets users disperse their encoded data across three separate cloud regions located in different parts of the world, while another, called Regional Service, stores data in multiple data centers within the same region. Regardless of which option customers choose, their data is made secure and redundant by the SecureSlice technology.

With this service and its many options, IBM has extended the SecureSlice technology to hybrid clouds too, giving customers more flexibility and scalability without compromising control over in-house infrastructure. The product comes at a time when IDC has predicted that more than 80% of enterprises will have adopted hybrid cloud architecture by 2018, and by acquiring Cleversafe and extending it to a hybrid cloud environment IBM has made a strategic move to tap into this growing market.

In terms of cost, too, the service is likely to be a good deal for customers. IBM claims it costs 25 percent less than Amazon Web Services’ S3 storage service, and it believes many customers already using IBM’s cloud services will be willing to adopt it. According to Russ Kennedy, VP of Product Strategy, users who run apps on Amazon Web Services can also use the service to store their data, as it supports the S3 interface; the same applies to OpenStack Swift customers, since Cloud Object Storage supports that API as well.
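
Because the service exposes the S3 interface, an existing S3 client can in principle be pointed at it simply by overriding the endpoint. A minimal sketch with the AWS CLI, where the endpoint URL and bucket name are illustrative assumptions rather than IBM’s actual values:

# aws --endpoint-url https://s3.cos.example.com s3 cp backup.tar.gz s3://my-cos-bucket/backups/
# aws --endpoint-url https://s3.cos.example.com s3 ls s3://my-cos-bucket/backups/

Credentials are configured as they would be for any S3-compatible service; only the endpoint changes.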

This service has already been deployed at a few early-adopter companies, and many more are expected to adopt it in the next few months.


Disruption in the Storage Market: Advances in Technology and Business Models

New technologies, business models, and vendors have led to major disruption in the storage market. Watch the video below to hear Randy Weis discuss the evolution of flash storage, how new business models have driven prices down, and the vendors that are making it possible.

Or watch on YouTube

 

Did you miss VMworld? Register for our upcoming webinar to get all of the most important updates from Las Vegas and Barcelona.

Managed Cloud Storage – What’s the hold up?

Organisations operating in today’s highly competitive, fast-moving world are constantly looking for new ways to deliver services to customers at reduced cost. Cloud technologies in particular are now not only being explored but becoming widely adopted, with new Cloud Industry Forum statistics showing that 80% of UK companies are adopting cloud technology as a key part of their overall IT and business strategy.

That said, the cloud is yet to be widely accepted as the safe storage location the industry says it is. There is still a great deal of apprehension, particularly among larger organisations, about entrusting large volumes of data to the cloud. Indeed, for the last 20 years, storage has been defined by closed, proprietary and in many cases monolithic hardware-centric architectures, built for single applications, local network access, limited redundancy and highly manual operations.

Storage demands are changing

The continuous surge of data in modern society, however, now requires systems with massive scalability, local and remote accessibility, continuous uptime and a high degree of automation, with fewer resources having to manage greater capacity. The cloud is the obvious answer, but there is still hesitancy.

Let’s face it though, anyone who is starting out today is unlikely to go out and buy a whole bunch of servers to deploy locally. They are much more likely to sign up for cloud-based managed services for functions like accounting, HR and expenses, and have a laptop with a big hard drive to store and share files using Gmail, Dropbox and so on. It is true to say that smaller businesses are increasingly using storage inside cloud apps, but for larger businesses, this option is not quite so simple or attractive. Many enterprises are turning to the cloud to host more and more apps but they still tend to keep the bulk of their static data on their own servers, to not only ensure safety and security but also to conduct faster analytics.

The cloud storage door is only slightly ajar

With increasing data volumes and accelerated demand for scalability, you would expect many businesses to be using cloud-based managed storage already. However, the fact remains that there are still many businesses burying their heads in the sand when it comes to cloud storage. As a result, there is quite a bit of fatigue amongst the storage vendors who have been promoting cloud for some time, but not seeing the anticipated take-up. In fact, I would go so far as to say that the door the industry is pushing against is only slightly ajar.

As with most things, there are clouds and there are clouds. At the end of the day, cloud-based storage can be anything an organisation wants it to be – the devil is in the architecture. If you wanted to specify storage that incorporates encryption, a local appliance, secure high-bandwidth internet connectivity, instant access, replication, green and economical storage media – a managed cloud storage service can actually ‘do’ all of these things and indeed, is doing so for many organisations. There is take-up, just not quite as much as many storage vendors would like.

It’s all about the data

Nowadays, for most organisations it is about achieving much more than just the safe storage of data. It is increasingly common to bolt on a range of integrated products and services to achieve specialist goals, and it is becoming rare that anyone wants to just store their data (they want it to work for them). Most organisations want their data to be discoverable and accessible, with integrity guarantees that the data will remain usable in the future, automated data storage workflows and so on. Organisations want to, and need to, realise the value of their data, and are now looking at ways to capitalise on it rather than simply store it away safely.

Some organisations, though, can’t use managed cloud storage for a whole raft of corporate, regulatory and geographical reasons. The on-premises alternative to a cloud solution, however, doesn’t have to be a burden on your IT, with remote management of an on-site storage deployment now a very real option. This acknowledges that storage capabilities specific to an industry or application are now complex; add some additional integrated functionality and it’s not something that local IT can, or wants to, deal with, manage or maintain. And who can blame them? Specialist services require a specialist managed services provider, and that is where outsourcing, even if you can’t use the cloud, can add real value to your business.

What do you want to do with your data?

At the end of the day, the nature of the data you have, what you want to do with it and how you want it managed will drive your storage direction. This includes questions such as whether your data is static or subject to change, whether your storage needs to be on-premises or can be in the cloud, whether you want to back up or archive your data, whether you want an accessible archive or a deep archive, whether you need it to be integrity-guaranteed or something else, and whether your needs are long or short term. The cloud won’t always be the answer; there are trade-offs to be made and priorities to set. Critically, the storage solution you choose needs to be flexible enough to deal with these issues (and how they will shift over time), and that is the difficulty when trying to manage long-term data storage. Everything is available and you can get what you want, but you need to make sure that you are moving to a managed cloud service for the right reasons.

Ever-increasing organisational data volumes will continue to relentlessly drive the data storage industry and today’s storage models need to reflect the changing nature of the way in which businesses operate. Managed storage capabilities need to be designed from the ground up to facilitate organisations in maximising the value they can get from their data and reflect how those same organisations want to access and use it both today, and more importantly, for years to come.

Written by Nik Stanbridge, VP Marketing at Arkivum