Archivo de la categoría: Storage

Flash Storage: Is it right for you?

In this video, I discuss flash storage. Remember, flash storage isn’t just an enterprise play. It’s important to understand how it can be used and when you should purchase it. Who are the major players? What’s the difference between all-flash and hybrid or adaptive flash? What about single-level cell (SLC) versus multi-level cell (MLC)? What’s the pricing like?

What you should be doing is designing a solution that can take advantage of the flash that is right for your applications and that fits your needs and purposes. A combination of flash drives and spinning drives, put together correctly with the right amount of intelligent software, can address nearly everybody’s most critical application requirements without breaking the bank.

 

http://www.youtube.com/watch?v=6Nn1O3C3Vqo

 

If you’re interested in talking more about flash storage, reach out!

 

 

By Randy Weis, Practice Manager, Information Infrastructure

Emerging Technologies Across the Storage Landscape

There has been an influx of emerging technologies across the storage landscape. Many vendors are using the exact same hardware but are figuring out ways to do a lot of smarter things with the software. In this post, I’ll cover a handful of vendors who are doing a great job innovating at the software layer to improve storage technology and performance.

Nimble

Nimble was founded by veterans of Data Domain, the data deduplication company whose success led EMC to acquire it in 2009. Data Domain was known for its massively popular backup targets and was one of the first to compress and deduplicate data as it was being stored, greatly reducing the amount of data that needed to be kept. Nimble applies the same thinking to primary storage: it takes commodity solid state drives and slow 7,200 RPM spinning disks and turns them into an extremely fast, well-performing hybrid SAN, while delivering excellent compression ratios and the best support team in the business. Very simply, they’re doing smarter things with the same technology everyone else is using. The platform is highly scalable and well designed. For example, you can replace the controllers on the array during business hours with no interruption, as opposed to having to wait until off hours as companies have traditionally been forced to do.
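Nimble’s inline compression happens in the array’s software on the data path, and the details are proprietary. As a rough illustration of the idea — compress each block before it lands on disk, keeping the raw form only if compression doesn’t help — here is a minimal Python sketch:

```python
import zlib

def inline_compress(block: bytes) -> bytes:
    """Compress a data block before it hits disk; store whichever
    form (compressed or raw) is smaller."""
    compressed = zlib.compress(block, level=6)
    return compressed if len(compressed) < len(block) else block

# Highly repetitive data (e.g., database pages, logs) compresses well.
raw = b"ABCD" * 4096            # one 16 KiB block
stored = inline_compress(raw)
ratio = len(raw) / len(stored)
print(f"stored {len(stored)} of {len(raw)} bytes ({ratio:.0f}:1)")
```

The real win is that this happens transparently, inline, with no scheduled post-process compression job.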

DataGravity

What’s interesting about DataGravity is that they have taken an entirely different approach to traditional storage. They make arrays that perform on par with just about everyone else’s, yet their secret sauce is taking unstructured, uncategorized data and categorizing it at the time it’s being written. Why is this important? A lot of companies have to keep track of Social Security numbers, credit card numbers, etc. Traditionally, you have to buy expensive software to do this; DataGravity does it at the time the data is written, so you don’t need to invest in any additional software. That sounds too good to be true, right? Every modern SAN has two storage controllers, typically running either active-passive or active-active. DataGravity dedicates one controller to serving traditional storage IO while the other examines the data, categorizes it, and handles data management functions. This eliminates the need for expensive compliance and data protection management software.
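DataGravity’s classification engine is proprietary, but the core idea — tag sensitive data as the write comes in, rather than scanning after the fact — can be sketched in a few lines. The patterns below are deliberately simplified; real compliance scanners validate checksums, context, and locale-specific formats:

```python
import re

# Simplified illustrative patterns -- not production-grade detection.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b"),
}

def classify_on_write(payload: str) -> set:
    """Tag an incoming write with the sensitive-data categories it contains."""
    return {name for name, pat in PATTERNS.items() if pat.search(payload)}

tags = classify_on_write("Customer SSN 123-45-6789, card 4111-1111-1111-1111")
```

Because the classification rides along with the write itself, there is no separate crawl of the filesystem later — which is exactly the "no additional software" pitch.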

Who should take advantage?

Any company that has to deal with regulatory compliance (Healthcare, Finance, etc.).

Simplivity

SimpliVity offers hyper-converged infrastructure similar to Nutanix, VMware EVO: RAIL, and Dell’s VRTX. The piece that makes them unique is their dedication to reducing IO: they compress and deduplicate all data at ingestion, once and forever. This means that if I write a data block and that data is already on the storage system, there is zero IO; I don’t have to rewrite it. Furthermore, I can migrate virtual machines from one data center to another: it’s easy to move a 5 GB virtual machine while writing less than 100 MB across the WAN. Also, when I clone a machine, there is no IO. Traditionally, deduplication is something companies can’t run during work hours because it takes up way too many resources and would bring systems to their knees; you can’t do it without impacting the business. With SimpliVity, there is also no need for a third-party backup vendor: data is spread across nodes and only unique blocks are ever written, so it’s easy to have petabytes of backups living on terabytes of storage.
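The "zero IO for duplicate data" claim is just content-addressed storage: fingerprint each block, and if the fingerprint already exists, record a reference instead of writing. A toy sketch (SimpliVity’s actual implementation uses purpose-built hardware acceleration; this only illustrates the principle):

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: any given block is written at most once."""
    def __init__(self):
        self.blocks = {}        # fingerprint -> block data
        self.io_writes = 0      # physical writes actually issued

    def write(self, block: bytes) -> str:
        fp = hashlib.sha256(block).hexdigest()
        if fp not in self.blocks:       # new data: one physical write
            self.blocks[fp] = block
            self.io_writes += 1
        return fp                       # duplicate: zero IO, just a reference

store = DedupStore()
for _ in range(100):    # e.g., cloning a VM rewrites the same blocks repeatedly
    store.write(b"operating system block")
```

One hundred logical writes, one physical write — the same mechanism is why a migration or backup only ships blocks the destination has never seen.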

Who should take advantage?

We have a client in Massachusetts that is looking to move to a colocation facility in Florida. For this use case, SimpliVity is a quick and easy way to migrate that data geographically without huge impacts on bandwidth, WAN costs, etc.

Pure Storage

If you’re looking for ridiculously fast storage, Pure Storage could be the solution for you. They use the same flash technology as everyone else, but they read and write to it differently, so their arrays are much more efficient and optimized for how flash actually behaves. Typically, vendors have been writing to flash drives the same way they treated spinning disk.
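Why does writing "flash-aware" matter? Flash can only be erased in large blocks, so disk-style in-place overwrites force the device to rewrite a whole erase block for every small update, while a log-structured layout appends updates sequentially. Pure’s actual flash management is proprietary; the contrast can be caricatured like this:

```python
ERASE_BLOCK = 64    # pages per erase block (illustrative figure)

def inplace_update(num_updates: int) -> int:
    """Treat flash like spinning disk: each small overwrite triggers a
    read-modify-erase-write of the whole erase block."""
    return num_updates * ERASE_BLOCK    # pages physically written

def log_structured_update(num_updates: int) -> int:
    """Flash-aware: append updates sequentially, one page write each
    (garbage collection ignored for simplicity)."""
    return num_updates

amplification = inplace_update(1000) / log_structured_update(1000)
```

The gap between the two is the write amplification that flash-aware layouts are designed to avoid, which also extends the life of the drives.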

Who should take advantage?

If your organization has applications that require tremendously fast storage, this could be a good fit for you. One example would be extremely demanding Oracle, SAP, or SQL Server applications.

VMware

VMware brings a lot of great benefits to the table with EVO: RAIL. EVO: RAIL is basically VMware Virtual SAN packaged with prebuilt hardware that can be deployed very quickly and easily. It’s a scalable, software-defined data center building block that provides compute, networking, storage, and management. Furthermore, it’s highly resilient.

Who should take advantage?

This is a good fit for organizations that have branch offices where there is a need for smaller VMware environments at multiple locations. It’s a quick, inexpensive way to manage them centrally from a virtual center.

 

Be sure to keep an eye on HP, which is making innovations in flash storage. More on that soon.

Have you used any of these solutions? How have your experiences been? If you would like to talk more about this, send us an email at socialmedia@greenpages.com

Fun Facts about Microsoft Azure

Looking for some helpful facts about Microsoft Azure? For those out there who may be confused about the Microsoft Azure solutions offered to date, here is the first in a series of posts about the cool new features of Microsoft’s premium cloud offering, Azure.

Azure Backup, ok… wait, what? I need to do backup in the cloud? No one told me that!


Yes, Virginia, you need to have a backup solution in the cloud. To keep this high level, I’ve outlined below what the Azure Backup offering really is. There are several protections built into the Azure platform that help customers protect their data, as well as options to recover from a failure.

In a normal, on-premises scenario, host-based hardware and networking failures are protected at the hypervisor level. In Azure you don’t see this because control of the hypervisor has been removed. Azure, however, is designed to be highly available, meeting and exceeding the posted SLAs associated with the service.

Hardware failures of storage are also protected against within Azure. At the lowest end you have Locally Redundant Storage, where Azure maintains 3 copies of your data within a region. The more common and industry-preferred method is Geo-Redundant Storage, which keeps 3 copies in your region and 3 additional copies in another datacenter, somewhere geographically dispersed based on a complex algorithm. These protections help ensure survivability of your workloads.
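Conceptually, geo-redundant storage fans each write out to six copies: three in the primary region and three in a distant paired region. A sketch of that placement logic (the region names and pairing here are hypothetical illustrations, not Azure’s actual pairing algorithm):

```python
def replica_placements(primary: str, geo_pairs: dict, copies_per_region: int = 3):
    """Return locations for all copies of a block: three across fault
    domains in the primary region, three more in its geo-paired region."""
    secondary = geo_pairs[primary]
    return ([f"{primary}/fault-domain-{i}" for i in range(copies_per_region)] +
            [f"{secondary}/fault-domain-{i}" for i in range(copies_per_region)])

# Hypothetical region pairing, for illustration only.
pairs = {"east-us": "west-us"}
placements = replica_placements("east-us", pairs)
```

Losing a disk, a rack, or even the entire primary datacenter still leaves intact copies elsewhere — which is the survivability point above.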

Important to note: the copies in the second datacenter are crash-consistent copies, so they should not be considered a backup of the data but more of a recovery mechanism for a disaster.

Did I hear you just ask about Recovery Services in Azure? Why yes, we have two to talk about today.

  • Azure Backup
  • Azure Site Recovery

Azure Site Recovery – This scenario both orchestrates site recovery as well as provides a destination for virtual machines. Microsoft currently supports Hyper-V to Azure, Hyper-V to Hyper-V or VMware to VMware recovery scenarios with this method.

Azure Backup is a destination for your backups. Microsoft offers traditional agents for Windows Backup and for the preferred platform, Microsoft System Center 2012 – Data Protection Manager. Keeping the data in the cloud, Azure holds up to 120 copies of the data, which can be restored as needed. At this time, the Azure Windows backup agent only protects files; it will not do full-system or bare-metal backups of Azure VMs.

As of this blog post, to get a traditional full-system backup there is a recommended two-step process: use Windows Backup to capture a System State backup, then enable Azure Backup to capture that into your Azure Backup vault.

Two other methods exist, but the jury is still out on the validity of these offerings: VM Capture and Blob Snapshot.

  • VM Capture – equivalent to a VM snapshot
  • Blob Snapshot – equivalent to a LUN snapshot

As I said, these are options, but many consider them too immature at this time, and as a result they are not widely adopted. Hopefully this provides some clarity around Azure. As with all things Microsoft Cloud related, Microsoft issues new features almost daily now. Check back again for more updates on what Azure can do for your organization!
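To make the snapshot comparison concrete: both VM Capture and Blob Snapshot preserve a point-in-time copy you can roll back to. The real Blob Snapshot is a REST operation against Azure Storage; this toy class only illustrates the semantics:

```python
class Blob:
    """Toy blob supporting point-in-time snapshots."""
    def __init__(self, data: bytes):
        self.data = data
        self.snapshots = []

    def snapshot(self) -> int:
        """Capture current contents; returns a snapshot id."""
        self.snapshots.append(self.data)
        return len(self.snapshots) - 1

    def restore(self, snap_id: int):
        """Roll the blob back to an earlier point in time."""
        self.data = self.snapshots[snap_id]

blob = Blob(b"v1")
snap = blob.snapshot()          # point-in-time copy, like a LUN snapshot
blob.data = b"v2 (corrupted)"   # subsequent writes don't touch the snapshot
blob.restore(snap)
```

The limitation the post alludes to is that a snapshot alone is not a managed backup: there is no retention policy, cataloging, or application consistency around it.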

 

By David Barter, Practice Manager, Microsoft Technologies

Storage Has Evolved – It Now Provides the Context & Management of Data

 

Information infrastructure takes storage, a fundamental part of any data center, and puts context around it, adding value to what has typically been seen as a commodity item.

Bits in and of themselves have little value. Add context and assign value to that information, and it becomes an information infrastructure. Organizations should seek to add value to their datacenter environments by leveraging the advanced technologies that have become part of our landscape, including software-defined storage, solid state storage, and cloud-based storage. Essentially, there is a new way to deliver a datacenter application data infrastructure.

Storage has evolved

 

http://www.youtube.com/watch?v=yzbwG0g-Y7c

Interested in learning more about the latest in storage technologies? Fill out this form and we’ll get back to you!

By Randy Weis, Practice Manager – Information Infrastructure

EMC Acquired TwinStrata in July. What’s This Mean For You Moving Forward?

Video with Randy Weis, Practice Manager, Data Center: http://www.youtube.com/watch?v=McUyYF9NIec Back in July, storage giant EMC acquired TwinStrata. Information infrastructure and storage expert Randy Weis breaks down TwinStrata’s capabilities and explains what this means for your organization. Interested in speaking with Randy about the latest trends in storage? Email us at socialmedia@greenpages.com

Top 25 Findings from Gigaom’s 4th Annual “Future of Cloud Computing” Survey

By Ben Stephenson, Journey to the Cloud

 

Gigaom Research and North Bridge Venture Partners recently released their 4th annual “Future of Cloud Computing” study. There was some great data gathered from the 1,358 respondents surveyed. In case you don’t have time to click through the entire 124-slide SlideShare deck, I’ve pulled out what I think are the 25 most interesting statistics from the study. Here’s the complete deck if you would like to review it in more detail.

 

  • 49% using the cloud for revenue generating or product development activities (Slide 9)
  • 80% of IT budget is used to maintain current systems (Slide 20) <–> GreenPages actually held a webinar recently explaining how organizations can avoid spending the majority of their IT budgets on “keeping the lights on”
  • For IT across all functions tested in the survey, 60-85% of respondents will move some or significant processing to the cloud in the next 12-24 months (Slide 21)
  • Shifting CapEx to OpEx is more important for companies with over 5,000 employees (Slide 27)
  • For respondents moving workloads to the cloud today, 27% said they are motivated to do so because they believe using a cloud platform service will help them lower their capital expenditures (Slide 28)
  • Top inhibitor: security remains the biggest concern. Despite declining slightly last year, it rose again as an issue in 2014 and was cited by 49% of respondents (Slide 55)
  • Privacy is of growing importance. As an inhibitor, Privacy grew from 25% in 2011 to 31% (Slide 57)
  • Over 1/3 see regulatory/compliance as an inhibitor to moving to the cloud (Slide 60)
  • Interoperability concerns dropped by 45%, relatively, over the past two years…but 29% are still concerned about lock-in (Slide 62)
  • Nearly ¼ of respondents still think network bandwidth is an inhibitor (Slide 64)
  • Reliability concerns dropped by half since 2011 (Slide 66)
  • Amazon S3 holds trillions of objects and regularly peaks at 1.5 million requests per second (Slide 71)
  • 90% of world’s data was created in past two years…80% of it is unstructured (Slide 73) <–> Here’s a video blog where Journey to the Cloud blogger Randy Weis talks about big data in more detail
  • Approximately 66% of data is in the cloud today (Slide 74)
  • The number above is expected to grow 73% in two years (Slide 75)
  • 50% of enterprise customers will purchase as much storage in 2014 as they have accumulated in their ENTIRE history (slide 77)
  • IaaS use has jumped from 11% in 2011 to 56% in 2014 & SaaS has increased from 13% in 2011 to 72% in 2014 (Slide 81)
  • Applications Development growing 50% (Slide 84) <–> with the growth of app dev, we’re also seeing the growth of shadow IT. Check out this on-demand webinar “The Rise of Unauthorized AWS Use. How to Address Risks Created by Shadow IT.”
  • PaaS approaching the tipping point! PaaS has increased from 7% in 2011 to 41% in 2014. (Slide 85) <–> See what one of our bloggers, John Dixon, predicted in regards to the rise of PaaS at the beginning of the year.
  • Database as a Service expected to nearly double, from 23% to 44% among users (Slide 86)
  • By 2017, nearly two-thirds of all workloads will be processed in cloud data centers. Growth of workloads in cloud data centers is expected to be five times the growth in traditional workloads between 2012 and 2017. (Slide 87)
  • SDN usage will grow among business users almost threefold…from 11% to 30%  (Slide 89) <–> Check out this video blog where Nick Phelps talks about the business drivers behind SDN.
  • 42% use hybrid cloud now (Slide 93)
  • That 42% will grow to 55% in 2 years (Slide 94) <–> This whitepaper gives a nice breakdown of the future of hybrid cloud management.
  • “This second cloud front will be an order of magnitude bigger than the first cloud front.” (Slide 117). <–> hmmm, where have I heard this one before? Oh, that’s right, GreenPages’ CEO Ron Dupler has been saying it for about two years now.

Definitely some pretty interesting takeaways from this study. What are your thoughts? Did certain findings surprise you?

 

 

 

Dropbox Forced to Kill Shared Links Due to Security Snafu

Oops! Dropbox announced it is killing existing shared links where documents include ordinary hyperlinks to websites. The problem is that the plain old Referer header tells that website the URL the inbound link came from. That’s a standard way sites know where their non-direct traffic is coming from. In this scenario, however, the referrer is the URL of the shared Dropbox document.
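The leak is easy to picture: when a recipient viewing the shared document clicks a hyperlink in it, the browser’s request to that external site carries the document’s secret URL in the Referer header. A minimal sketch of the headers a browser would typically send (URLs hypothetical):

```python
def build_request(target_url: str, current_page: str) -> dict:
    """Headers a browser typically sends when following a link."""
    return {
        "Host": target_url.split("/")[2],
        "Referer": current_page,   # <-- the leak: the secret share URL
    }

# The shared-link URL ends up in the third-party site's access logs.
secret_share = "https://www.dropbox.com/s/SECRET-TOKEN/contract.docx"
headers = build_request("https://example.com/pricing", secret_share)
```

Anyone reading that site’s logs can now open the shared document — no exploit required, just standard browser behavior.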

The symptom Dropbox users will experience? Complaints from recipients that the link they were given doesn’t work (if in doubt, check the link yourself).

From the Dropbox post on the issue:

While we’re unaware of any abuse of this vulnerability, for your safety we’ve taken the following steps to make sure this vulnerability can’t be exploited:

  • For previously shared links to such documents, we’ve disabled access entirely until further notice. We’re working to restore links that aren’t susceptible to this vulnerability over the next few days.
  • In the meantime, as a workaround, you can re-create any shared links that have been turned off.
  • For all shared links created going forward, we’ve patched the vulnerability

Here’s how to rebuild affected links.

A Guide to Successful Big Data Adoption

By Randy Weis, Practice Manager, Data Management & Virtualization

In this video, storage expert Randy Weis talks about the impact big data is having on organizations and provides an outline for the correct approach companies should be taking in regards to big data analytics.

http://www.youtube.com/watch?v=jZ3V2ynOD44

What is your organization doing in regards to big data? Email us at socialmedia@greenpages.com if you would like to talk to Randy in more depth about big data, data management, storage, and more.