Archivo de la categoría: Featured

ATTENTION: Important Information About Microsoft Ending Extended Support!

By Rob O’Shaughnessy, Software Licensing Specialist, Pre-Sales Technical Support

There are only a few months left before Microsoft ends its Extended Support for Windows XP, Office 2003, and Exchange 2003. On April 8, 2014, Extended Support will cease to exist, so you should be making arrangements now to upgrade to the latest editions of Microsoft products and continue to receive the support that Microsoft provides. If you’re looking to upgrade, here are some paths to consider.

Windows XP Professional: Microsoft offers an upgrade price through volume licensing that allows Windows XP Professional users to upgrade to Windows 7/8 Professional. It’s worth noting that if you are running an older PC, you may want to test whether it is compatible with the newer versions of Windows.

Exchange 2003: Microsoft doesn’t offer an upgrade price for Exchange; however, it does offer two options to get the latest edition of Exchange into your environment. The first is on-premises licensing of Exchange 2013. This would be loaded and managed locally; MSRP for Exchange Server is $708. In addition, each device or user accessing Exchange would also require a Client Access License (CAL), which starts at $68 for the Device CAL and $78 for the User CAL.

The other option is to purchase Exchange through Microsoft’s cloud offering, Office 365. Known as Exchange Online, this off-premises subscription license provides the same Exchange experience as on-premises, but without much of the local infrastructure. There are two options: Exchange Online Plan 1, at $4 per user per month, and Exchange Online Plan 2, which adds enterprise features at $8 per user per month.

Office 2003: As with Exchange, Microsoft doesn’t offer an upgrade price for Office, and like Exchange, Office can be purchased as a volume license or through Office 365. The MSRP for Office Standard through volume licensing is $380 a license, and for Professional Plus it’s $508. If you prefer the subscription-based model, Office Professional Plus can be purchased through Office 365 for $15 per user per month; Office 365 subscriptions require a minimum one-year term. If you need to upgrade both Exchange and Office, Microsoft offers an Office 365 plan that includes Office Professional Plus, Exchange Plan 2, SharePoint Plan 2, and Lync Plan 2 for $20 per user per month. There are several volume licensing agreements and Office 365 plans to choose from.
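To make the tradeoff concrete, here is a rough back-of-the-envelope comparison using the list prices above. This is a sketch only: it assumes a hypothetical 100-user shop over three years and ignores Software Assurance, server hardware, Office licenses on the on-prem side, and administrative costs.

```python
# Rough 3-year licensing cost comparison, using the MSRP figures quoted above.
# Assumptions: 100 users, 36 months; hardware, Software Assurance, and
# admin time are deliberately left out.
users = 100
months = 36

# On-premises Exchange 2013: one server license plus a User CAL per user (one-time).
exchange_on_prem = 708 + 78 * users

# Exchange Online Plan 1: $4 per user per month.
exchange_online_p1 = 4 * users * months

# Office 365 plan with Office Pro Plus + Exchange/SharePoint/Lync Plan 2: $20/user/month.
office365_suite = 20 * users * months

print(exchange_on_prem, exchange_online_p1, office365_suite)
```

On these numbers alone, the one-time on-premises licensing is cheaper over three years, which is why the subscription decision usually hinges on the infrastructure and staffing costs the sketch leaves out.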

Lastly, although still a ways away, Extended Support for Windows Server 2003 R2 will be ending on July 14, 2015.

If you are looking for assistance, here is how GreenPages can help:

Licensing: Our top-notch licensing desk can help you understand all the nuances of Microsoft volume licensing as well as Office 365. We can work with you to find the program that fits your organization best, help mitigate costs, and ensure your compliance. We can run license history reports to make sure you’re appropriately licensed and review the various licensing changes in products such as Windows, System Center, and SQL Server.

Migrations & Professional Services: As a dual gold-competency Microsoft service provider, GreenPages can assist you with the migration to a new client platform. This can include services to perform the following:
• Upgrading client computers from Windows XP to Windows 7/8
• Upgrading Office from 2003 to 2013
• Installing and configuring System Center technologies to incorporate upgrades to the latest technologies and implementing automated patching, remote control, software deployment, and Operating System Deployment (OSD)

Upgrading client operating systems and Office packages can be just the tip of the IT iceberg. Let GreenPages help implement a lasting lifecycle management infrastructure for your environment.

When it comes to messaging, there are a number of scenarios to evaluate, including:
• Upgrade Exchange 2003/2007/2010 to Exchange 2013
• Migrate from Exchange to Office 365

Whether an on-premises upgrade is in your forecast, or you are ready to seize the opportunity to move toward a hybrid cloud environment and migrate to Office 365, GreenPages can help with these and other Microsoft projects.

Microsoft Extended Support Ending 4/8/2014

On April 8, 2014, Extended Support will cease to exist. Fill out this form and a GreenPages representative will contact you with more information about how GreenPages can help with licensing, migration, and professional services challenges!

Why Nirvanix Doesn’t Mean the End of Cloud Storage

By Randy Weis, Practice Manager, Virtualization & Data Management

By now everyone is familiar with the Nirvanix fiasco. Now that the dust has settled, I want to talk about the implications it has had, and will have, for the cloud storage market, and to highlight some silver linings organizations can take away from the meltdown.

http://www.youtube.com/watch?v=mtQmGQBWzbc

If you’re looking for more content around storage and information management, check out my recent posts “A Guide to Successful Big Data Adoption” as well as “10 Storage Predictions for 2014.”

Do you have questions for Randy about storage & data management? Email us at socialmedia@greenpages.com

The PaaS Market as We Know it Will Not Die Off

I’ve been hearing a lot about Platform as a Service (PaaS) lately as part of the broader discussion of cloud computing from both customers and in articles across the web. In this post, I’ll describe PaaS, discuss a recent article that came out on the subject, and take a shot at sorting out IaaS, PaaS, and SaaS.

What is PaaS?

First a quick trip down memory lane for me. As an intern in college, one of my tours of duty was through the manufacturing systems department at an automaker. I came to work the first day to find a modest desktop computer loaded with all of the applications I needed to look busy, and a nicely printed sheet with logins to various development systems. My supervisor called the play: “I tell you what I want, you code it up, I’ll take a look at it, and move it to test if it smells ok.” I and ten other ambitious interns were more than happy to spend the summer with what the HR guy called “javaweb.” The next three months went something like this:

Part I: Set up the environment…

  1. SSH to abcweb01dev.company.com, head over to /opt/httpd/conf/httpd.conf, and configure AJP to point to abcapp01dev and abcapp02dev.company.com
  2. SSH to abcapp01dev.company.com, reinstall the Java SDK to the right version, install the proper database JARs, and set up the JDBC connection pool in /opt/tomcat/conf/context.xml
  3. SSH to abcdb01dev.company.com, then create a user and grant rights so the app servers can talk to the database
  4. Write something simple to test everything out
  5. Debug the environment to make sure everything works

Part II: THEN start coding…

  1. SSH to abcweb01dev.company.com, head over to /var/www/html, and work on my HTML login page for starters, other things down the road
  2. SSH to abcapp01dev.company.com, head over to /opt/tomcat/webapps/jpdwebapp/servlet, and code up my Java servlet to process my logins
  3. Open another window, log in to abcweb01dev, and tail -f /var/www/access_log to see new connections being made to the web server
  4. Open another window, log in to abcapp01dev, and tail -f /opt/tomcat/logs/catalina.out to see debug output from my servlet
  5. Open another window, log in to abcapp01dev, and just keep /opt/tomcat/conf/context.xml open
  6. Open another window, log in to abcapp01dev, and run /opt/tomcat/bin/shutdown.sh; sleep 5; /opt/tomcat/bin/startup.sh (every time I make a change to the servlet)

(Host names and directory names have been changed to protect the innocent)
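As an illustration of the Part I plumbing, the JDBC connection pool from step 2 might look something like the fragment below. This is a sketch of a typical Tomcat-era context.xml Resource; the resource name, host, database, driver, and credentials are all made up, consistent with the changed names in the story.

```xml
<!-- /opt/tomcat/conf/context.xml: a JDBC pool the servlets can look up via JNDI.
     All names and credentials here are hypothetical. -->
<Context>
  <Resource name="jdbc/appdb"
            auth="Container"
            type="javax.sql.DataSource"
            maxActive="20" maxIdle="5" maxWait="10000"
            username="app_user" password="changeme"
            driverClassName="com.mysql.jdbc.Driver"
            url="jdbc:mysql://abcdb01dev.company.com:3306/appdb"/>
</Context>
```

Wiring up a handful of fragments like this, across three hosts, is exactly the work a PaaS platform takes off the developer’s plate.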

Setting up the environment was a little frustrating. And I knew that there was more to the story; some basic work, call it Part 0, to get some equipment in the datacenter, the OS installed, and IP addresses assigned. Part I, setting up the environment, is the work you would do to set up a PaaS platform. As a developer, the work in Part I enabled me and my department to do the job in Part II – and we had a job to do – to get information to the guys in the plants who were actually manufacturing product!

So, here’s a rundown:

Part 0: servers, operating systems, patches, IPs… IaaS

Part I: middleware, configuration, basic testing… PaaS

Part II: application development

So, to me, PaaS is all about taking the bits and pieces provided by IaaS, configuring them into a usable platform, and delivering that platform to developers so that they can deliver software to the business. And hopefully the business is better off because of our software. In this case, our software helped the assembly plant identify and reduce “in-system damage” to vehicles – damage that happens as a result of the manufacturing process.

Is the PaaS market as we know it dead?

I’ve read articles predicting the demise of PaaS altogether and others just asking the question about its future. A recent Network World article titled “Is the PaaS market as we know it dying?” discussed the subject. The article makes three main points, referring to 451 Research, Gartner, and other sources.

  1. PaaS features are being swallowed up by IaaS providers
  2. The PaaS market has settled down while the IaaS and SaaS markets have exploded
  3. Pure-play PaaS providers may be squeezed from the market by IaaS and SaaS

I agree with point #1. The evidence is in Amazon Web Services features like Auto Scaling, RDS, SQS, etc. These are fantastic features, but interfacing with them locks developers into using AWS as their single IaaS provider. The IaaS market is still very active, and I think there is a lot to come even though AWS is ahead of other providers at this point. IaaS is a commodity, and embedding specialized (read: PaaS) features in an otherwise IaaS system is a tool to get customers to stick around.

I disagree with point #2. The PaaS market has not settled down – it hasn’t even started yet! The spotlight has been on IaaS and SaaS because these are relatively simple to understand given the recent boom in server virtualization. SaaS also resembles what application service providers (ASPs) used to offer, so many people are already familiar with it. I think PaaS and its concepts are still finding their place.

I also disagree with point #3: the time and opportunity for pure-play PaaS providers is now. IaaS is getting sorted out, and it is clearly a commodity item. As I highlighted earlier, solutions from PaaS providers can ride on top of IaaS. I think that PaaS will be the key to application portability amongst different IaaS providers – kind of like Java: write once, run on any JVM (kind of). As you might know, portability is one of NIST’s key characteristics of cloud computing.

Portability is key. I think PaaS will remain its own concept apart from IaaS and SaaS, and we’ll see some emergence of PaaS in 2014. Why? PaaS is the key to portable applications – once an application is written to a PaaS platform, it can be deployed on different IaaS platforms. It’s also important to note that AWS is almost always associated with IaaS, but it has started to look a lot like a PaaS provider (I touched on this in a blog earlier this month). An application written to use AWS features like Auto Scaling is great, but not very portable. Lastly, the PaaS market is ripe for innovation. Barriers to entry are low, as is required startup capital (there is no need to build a datacenter to build a useful PaaS platform).

This is just my opinion on PaaS – I think the next few years will see a growing interest in PaaS, possibly even over IaaS. I’m interested in hearing what you think about PaaS; feel free to leave me a comment here, find me on Twitter at @dixonjp90, or reach out to us at socialmedia@greenpages.com.

To hear more from John, download his whitepaper on hybrid cloud computing or his ebook on the evolution of the corporate IT department!

A Guide to Successful Big Data Adoption

By Randy Weis, Practice Manager, Data Management & Virtualization

In this video, storage expert Randy Weis talks about the impact big data is having on organizations and outlines the correct approach companies should take to big data analytics.

http://www.youtube.com/watch?v=jZ3V2ynOD44

What is your organization doing with big data? Email us at socialmedia@greenpages.com if you would like to talk to Randy in more depth about big data, data management, storage, and more.

Grading the Internet’s 2014 Tech Predictions

The time is here for bloggers across the internet to make their tech predictions for 2014 and beyond (we have made some ourselves around storage and cloud). In this post, a couple of our authors have weighed in to grade predictions made by others across the web.

Prioritizing Management Tool Consolidation vs. New Acquisitions

Enterprise customers will want to invest in new tools only when necessary. They should look for solutions that can address several of their needs so that they do not have to acquire multiple tools and integrate them. The ability to cover multiple areas of management (performance, configuration and availability) to support multiple technologies (e.g., application tiers) and to operate across multiple platforms (Unix, Windows, virtual) will be important criteria for enterprises to assess what management tools will work for them.  (eweek)

Agree – I have been saying this for a while.  If you want a new tool, get rid of five, consolidate and use what you have now, or get one that really works. (Randy Becker)

Bigger big data spending

IDC predicts spending of more than $14 billion on big data technologies and services or 30% growth year-over-year, “as demand for big data analytics skills continues to outstrip supply.” The cloud will play a bigger role with IDC predicting a race to develop cloud-based platforms capable of streaming data in real time. There will be increased use by enterprises of externally-sourced data and applications and “data brokers will proliferate.” IDC predicts explosive growth in big data analytics services, with the number of providers to triple in three years. 2014 spending on these services will exceed $4.5 billion, growing by 21%. (Forbes)

Absolutely agree with this.  Companies of all sizes are constantly looking to garner more intelligence from the data they have.  Even here at GreenPages we have our own big data issues and will continue to invest in these solutions to solve our own internal business needs. (Chris Ward)

Enterprises Will Shift From Silo to Collaborative Management

In 2014, IT organizations will continue to feel increased pressure from their lines of business. Collaborative management will be a key theme, and organizations will be looking to provide a greater degree of performance visibility across their individual silo tiers to the help desk, so it is easier and faster to troubleshoot problems and identify the tier that is responsible for a problem. (eweek)

Agree – cross domain technology experts are key!  (Randy Becker)

New IT Will Create New Opportunities

Mobility, bring-your-own device (BYOD) and virtual desktops will all continue to gain a foothold in the enterprise. The success of these new technologies will be closely tied to the performance that users can experience when using these technologies. Performance management will grow in importance in these areas, providing scope for innovation and new solutions in the areas of mobility management, VDI management and so on. (eweek)

Disagree – This is backwards. The business is driving change and accountability.  It is not IT that creates new opportunities – it is the business demanding apps that work and perform for the people using them. (Randy Becker)

Here comes the Internet of Things

By 2020, the Internet of Things will generate 30 billion autonomously connected end points and $8.9 trillion in revenues. IDC predicts that in 2014 we will see new partnerships among IT vendors, service providers, and semiconductor vendors that will address this market. Again, China will be a key player:  The average Chinese home in 2030 will have 40–50 intelligent devices/sensors, generating 200TB of data annually. (Forbes)

Totally agree with this one.  Everything and everybody is eventually going to be connected.  I wish I were building a new home right now because there are so many cool things you can do by having numerous household items connected.  I also love it because I know that in 10 years, when my daughter turns 16, I’ll no doubt know in real time where she is and what she is doing.  However, I doubt she’ll appreciate the ‘coolness’ of that.  Although very cool, this concept does introduce some very real challenges around management of all of these devices.  Think about 30 billion devices connected to the net….  We might actually have to start learning about IPv6 soon… (Chris Ward)

Cloud service providers will increasingly drive the IT market

As cloud-dedicated datacenters grow in number and importance, the market for server, storage, and networking components “will increasingly be driven by cloud service providers, who have traditionally favored highly componentized and commoditized designs.” The incumbent IT hardware vendors will be forced to adopt a “cloud-first” strategy, IDC predicts. 25–30% of server shipments will go to datacenters managed by service providers, growing to 43% by 2017. (Forbes)

Not sure I agree with this one for 2014, but I do agree with it in the longer term.  As more and more applications/systems get migrated to public cloud providers, that means less and less hardware/software purchased directly by end-user customers and thus more consolidation at the cloud providers.  This could be a catch-22 for a lot of the traditional IT vendors like HP and Dell.  When’s the last time you walked into an Amazon or Google datacenter and saw racks and racks of HP or Dell gear?  Probably not too recently, as these providers tend to ‘roll their own’ from a hardware perspective.  One thing is for sure…this will get very interesting over the next 24 to 36 months… (Chris Ward)

End-User Experience Will Determine Success

Businesses will expect IT to find problems before their users do, pinpoint the root cause of the problem and solve the problem as early as possible. IT organizations will seek solutions that will allow them to provide great user experience and productivity. (eweek)

Agree – 100% on this one. You need a good POC and pilot that is well managed, with clear goals and objectives. (Randy Becker)

Amazon (and possibly Google) to take on traditional IT suppliers

Amazon Web Services’ “avalanche of platform-as-a-service offerings for developers and higher value services for businesses” will force traditional IT suppliers to “urgently reconfigure themselves.” Google, IDC predicts, will join in the fight, as it realizes “it is at risk of being boxed out of a market where it should be vying for leadership.” (Forbes)

I agree with this one to an extent.  Amazon has certainly captured a good share of the market in two categories, developers and large scale-out applications and I see them continuing to have dominance in these 2 spaces.  However, anyone who thinks that customers are forklift moving traditional production business applications from the datacenter to the public cloud/Amazon should really get out in the field and talk to CIOs and IT admins as this simply isn’t happening.  I’ve had numerous conversations with our own customers around this topic, and when you do the math it just doesn’t make sense in most cases – assuming the customer has an existing investment in hardware/software and some form of datacenter to house it.  That said, where I have seen an uptake of Amazon and other public cloud providers is from startups or companies that are being spun out of a larger parent. Bottom line, Amazon and others will absolutely compete with traditional IT suppliers, just not in a ubiquitous manner. (Chris Ward)

The digitization of all industries

By 2018, 1/3 of share leaders in virtually all industries will be “Amazoned” by new and incumbent players. “A key to competing in these disrupted and reinvented industries,” IDC says, “will be to create industry-focused innovation platforms (like GE’s Predix) that attract and enable large communities of innovators – dozens to hundreds will emerge in the next several years.” Concomitant with this digitization of everything trend, “the IT buyer profile continues to shift to business executives. In 2014, and through 2017, IT spending by groups outside of IT departments will grow at more than 6% per year.” (Forbes)

I would have to agree with this one as well.  The underlying message here is that IT spending decisions continue to shift away from IT and into the hands of the business.  I have seen this happening more and more over the past couple of years and can’t help but believe it will continue in that direction at a rapid pace. (Chris Ward)

What do you think about these predictions? What about Chris and Randy’s take on them?

Download this free eBook about the evolution of the corporate IT department.

Defining Requirements Leads to Successful IT Projects

By Erin Marandola, Contract Administrator, PMP

Simply stated, a successful IT project is one that is completed on time and within budget.  But how do we get there, and why are there so many project failures?  From a service provider’s perspective, a successful project avoids scope creep (the project getting out of control), which adds cost, time, and risk.  The successful project should also avoid gold plating (the addition of unintended features to the final product of the project).  These pitfalls can be easily avoided.  In this blog, I’ll review how properly defining requirements can contribute to a thorough, well-thought-out Statement of Work and lead to a successful project.

If there is a mutually agreed upon Statement of Work outlining the project scope, deliverables, acceptance criteria, and assumptions, each party should have a clear, equal understanding of the project, right?  Not exactly.  A key factor in project failure is neglecting to exhaustively define and document project requirements within the Statement of Work.  When we withhold information, assumptions are made.  Since we don’t all think the same way, this can lead to the service provider believing certain terms and conditions are true, while the customer believes otherwise.   

Looking back at my career, a few project failures come to mind.  In one case, there was a different perception of what was considered in and out of scope between various parties.  For example, the Statement of Work said “Eight (8) hours of post-implementation support.”  The customer assumed the provider would provide support to end users, but the provider assumed the support would be at the system level and provided only to system administrators.  In another case, assumptions were made while scoping the project and writing the Statement of Work, but they were never documented and validated by all parties.  This resulted in an engineer arriving onsite for an Exchange upgrade, only to realize the project could not be completed based on conflicts in the client’s environment.  It was assumed the customer had a Disaster Recovery solution in place that would support the upgrade, but that was not the case.  Had the requirements been documented, this would not have happened. 

Register for our upcoming webinar to learn more about project management best practices

To create a comprehensive Statement of Work, we need to methodically define requirements.  The most crucial ingredient in defining requirements is the stakeholder, defined as anyone with a vested interest in the project or anyone who will be impacted by it.  Stakeholders should be included in meetings where scope and requirements are being defined.  They can open our eyes to the impacts the future project will have on the organization, environment, and processes.  They can also help define the business and functional requirements, and what constraints might hinder project objectives.  Additionally, stakeholders help define what assumptions the project team is working under and how project success will be measured.  The benefit of stakeholder involvement in defining requirements is the collaboration – the stakeholder meetings facilitate consensus among participants, ownership, and buy-in.  The collaborative approach allows stakeholders to assess multiple options to reach the project goals and mutually agree upon the best fit.

Once the requirements from the stakeholder meeting are defined, progressively elaborated and documented, a Statement of Work can be created incorporating the feedback.  Prior to mutual execution of the Statement of Work, it is crucial that the service provider and customer review the document together to ensure both parties understand the business need, desired solution, assumptions, and scope of work.  The Statement of Work should be updated as appropriate based on feedback from the review session(s). 

In summary, defining requirements early on is essential to keeping a future project on track, in scope, and on budget.  Stakeholders are an invaluable resource in defining requirements.  Defining, documenting, and incorporating requirements into the Statement of Work results in a document that is clear, thorough, and easy to manage to, helping to avoid some of the pitfalls alluded to earlier.  Best of all, defining requirements leads to a project that meets the true needs of the organization. If you’re looking for more information around IT project management, our VP of Project Management and our Director of Project Management are holding a webinar on January 23rd to discuss the benefits of creating a Project Management Office.

5 Cloud Predictions for 2014

By John Dixon, LogicsOne

Here are my 5 Cloud Predictions for 2014. As always, leave a comment below and let me know what you think!

1. IaaS prices will drop by at least 20%

Amazon has continued to reduce its pricing since it first launched its cloud services back in 2006. In February of last year, Amazon dropped its prices for the 25th time; by April, prices had dropped for the 30th time, and by the summer it was up to 37 times. Furthermore, there was a 37% drop in hourly costs for dedicated on-demand instances. Microsoft has announced that it will follow AWS’s lead with regard to price cuts. I expect this trend to continue in 2014 and likely 2015. I cover some of these price changes, and the impact they will have on the market as more organizations embrace the public cloud, in more detail in my eBook.

2. We’ll see signs of the shift to PaaS

Amazon is already starting to look more like a PaaS provider than an IaaS provider. Just consider pre-packaged, pre-engineered features like Auto Scaling, CloudWatch, SQS, and RDS, among other services. An application hosted with AWS that uses all of these features looks more like an AWS application and less like a cloud application. Using proprietary features is very convenient, but don’t forget how application portability is impacted. I expect continued innovation in the PaaS market with new providers and technology, while downward price pressure in the IaaS market remains high. Could AWS (focusing on PaaS innovation) one day source its underlying infrastructure from a pure IaaS provider? This is my prediction for the long term – large telecoms like AT&T, Verizon, BT, et al. will eventually own the IaaS market, while Amazon, Google, and Microsoft focus on PaaS innovation and use infrastructure provided by those telecoms. This of course leaves room for startup, niche PaaS providers to build something innovative and leverage quality infrastructure delivered by the telecoms. This is already happening with smaller PaaS providers. Look for signs of this continuing in 2014.

3. “The cloud” will not be regulated

Recently, there have been rumblings about regulating “the cloud,” especially in Europe, and claims that European clouds are safer than American clouds. If we stick with the concept that cloud computing is just another way of running IT (I call it the supply chain for IT service delivery), then the same old data classification and security rules apply. Only now, if you use cloud computing concepts, the need to classify and secure your data appropriately becomes more important. An attempt to regulate cloud computing would certainly have far-reaching economic impacts. This is one to watch, but I don’t expect any legislative action here in 2014.

4. More organizations will look to cloud as enabling DevOps

It’s relatively easy for developers to head out to the cloud, procure needed infrastructure, and get to work quickly. When developers behave like this, they not only write code and test new products, but they also become the administrators of the platforms they own (all the way from underlying code to patching the OS) – development and operations come together. This becomes a bit stickier as things move to production, but the same concept can work (see prediction #5).

5. More organizations will be increasingly interested in governance as they build a DevOps culture

Because developers can quickly bypass traditional procurement processes and controls, new governance concepts will be needed. Notice how I wrote “concepts” and not “controls.” Part of the new role of the IT department is to stay a step ahead of these movements and offer developers new ways to govern their own platforms. For example, a real-time chart showing used vs. budgeted resources will influence a department’s behavior much more effectively than a cold process that ends with “You’re over budget, you need to get approval from an SVP (expected wait time: 2-8 weeks).”

DevOps CIO Dashboard

Service Owner Dashboard

The numbers pictured are fictitious. With the concept of Service Owners, the owner of collaboration services can get a view of the applications and systems that provide the service. The owner can then see that VoIP spending is a little above the others and drill down to see where resources are being spent (on people, processes, or technology). Different ITBM applications display these charts differently, but the premise is the same – real-time visibility into spend. With cloud usage in general gaining steam, it is now possible to adjust the resources allocated to these services. With this type of information available to developers, it is possible to take proactive steps to avoid compromising the budget allocated to a particular application or service. By the same token, this information exposes opportunities to make informed investments in certain areas.
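The used-vs-budgeted idea reduces to a very small amount of logic. The sketch below uses hypothetical service names and dollar figures (not the ones pictured in the dashboards) to flag services that are approaching or exceeding their allocation, which is the signal a service owner would drill into.

```python
# A minimal used-vs-budgeted check: the core of the dashboards described above.
# Service names and dollar figures are hypothetical.
budgets = {"VoIP": 10000, "Email": 8000, "WebEx": 5000}
spend   = {"VoIP": 11500, "Email": 4000, "WebEx": 4800}

def budget_status(budgets, spend, warn_at=0.9):
    """Return each service's utilization ratio and a status a dashboard could color-code."""
    report = {}
    for service, budget in budgets.items():
        used = spend.get(service, 0)
        ratio = used / budget
        if ratio > 1.0:
            status = "over"       # already past the allocation
        elif ratio >= warn_at:
            status = "warning"    # close enough to change behavior now
        else:
            status = "ok"
        report[service] = (round(ratio, 2), status)
    return report

print(budget_status(budgets, spend))
```

A real ITBM tool does this continuously across live cost feeds; the point is that a near-real-time "warning" changes behavior before the cold over-budget approval process ever kicks in.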

So there you have it, my 2014 cloud predictions. What other predictions do you have?

To hear more from John, download his eBook “The Evolution of Your Corporate IT Department” or his whitepaper “Cloud Management, Now”

10 Storage Predictions for 2014

By Randy Weis, Consulting Architect, LogicsOne

As we wrap up 2013 and head into the new year, I wanted to share 10 predictions I have for the storage market in 2014.

  1. DRaaS will be the hottest sector of cloud-based services: Deconstructing the cloud means breaking out specific services that fit the definition of a cloud-type service, such as Disaster Recovery as a Service (DRaaS) and other specialized, targeted uses of shared multi-tenant computing and storage services. Capital expenditures, time to market, and staff training are all issues that prevent companies from developing a disaster recovery strategy and actually implementing it. I predict that DRaaS will be the hottest sector of cloud-based services for small-to-medium businesses and commercial companies. This will impact secondary storage purchases.
  2. Integration of flash storage technology will explode: The market for flash storage is maturing and consolidating. EMC has finally entered the market, and Cisco has purchased Whiptail to integrate flash into its Unified Computing System line. PCIe flash, server flash drives at different tiers of performance and endurance, hybrid flash arrays, and all-flash arrays will all continue to drive the adoption of solid-state storage in mainstream computing.
  3. Storage virtualization – software-defined storage on the rise: VMware is going to make its Virtual SAN (VSAN) technology generally available at the beginning of Q2 2014. This promises to create a brand new tier of storage in datacenters for virtual desktop solutions, disaster recovery, and other specific use cases. EMC has shipped its first release of a software-defined storage product, ViPR. It has a ways to go before it really addresses software-defined storage requirements, but it is a huge play in the sense that it validates a segment of the market that has long had a minuscule share. DataCore has been the only major player in this space for 15 years, and it sees EMC’s announcement as a validation of its approach: decoupling storage management and software from commodity hard drives and proprietary array controllers.
  4. Network Attached Storage (NAS) revolution: We’re undergoing a revolution with the introduction and integration of scale-out NAS technologies. One of the most notable examples is Isilon, purchased by EMC and now appearing as a more fully integrated and broadly available solution with a wide variety of applications. Meanwhile, NetApp continues to innovate in the traditional scale-up NAS market with increasing adoption of ONTAP 8.x. New NAS systems support the most recent release of SMB 3.0, Microsoft’s significant overhaul of its Windows-based file-sharing protocol (also known as CIFS). This has a significant impact on the design of Hyper-V storage and Windows file sharing in general. Client- and server-side failover are now possible with SMB 3.0, which enables the kind of high availability and resiliency for Hyper-V that VMware has long enjoyed as a competitive advantage.
  5. Mobile cloud storage – file sharing will never be the same: Dropbox, Box, Google Drive, Huddle, and other smartphone-friendly, access-anywhere services are revolutionizing the way individual consumers reach their data. This creates security headaches for IT admins, but the vendors are responding with ever-better security built into their products. At the enterprise level, Syncplicity, Panzura, Citrix ShareFile, Nasuni, and other cloud and shared storage technologies provide deep integration with Active Directory and enable the transfer of large files across long distances quickly and securely. These technologies integrate with on-premises NAS systems and cloud storage. Plain and simple, file sharing will never be the same again.
  6. Hyper-converged infrastructure will be a significant trend: The growing market presence of Nutanix, SimpliVity (based in Westborough, MA), and VMware’s VSAN technology will change the way shared storage is viewed in datacenters of every size. These products will not replace shared storage arrays outright but instead provide an integrated, flexible, and modular way to scale virtualized application deployments such as VDI and virtual servers. They all integrate compute, storage, networking (at different levels), and even data protection, eliminating multiple expenditures and multiple points of management. Most importantly, hyper-converged infrastructure allows new deployments to begin small and then scale out without large up-front purchases. This will not work for every tier of application or every company, but it will be a significant trend in 2014.
  7. Big Data will spread throughout industries: Big Data has become as much a buzzword as cloud, but actual use of the technologies we call big data is growing rapidly. Adoption is happening not only at internet giants like Google and companies that track online behavior, but also in industries such as insurance, life sciences, and retail. Integrating big data technologies (e.g., Hadoop, MapReduce) with more traditional SQL database technology allows service providers of any type to extract data from traditional databases and process it at huge scale more efficiently and more quickly, while still gaining the advantages of structured databases. This trend will continue to spread through the many industries that need to manage large amounts of structured and unstructured data.
  8. Object-based storage will grow: Cloud storage will be big news in 2014 for two major reasons. The first stems from the shock waves of Nirvanix going out of business: corporate consumers of cloud storage will be much more cautious and will demand better SLAs to hold cloud storage providers accountable. The second has to do with the adoption of giant, geographically dispersed data sets. Object-based storage has been a little-known but important development in storage technology that allows data sets on the scale of petabytes to be stored and retrieved, both by the people who generate the data and by those who consume it. However, these monstrous data sets can’t be protected by traditional RAID technologies. Providers such as Cleversafe have developed a means of spreading data across multiple locations, preserving its integrity and improving resiliency while continuing to scale to massive capacities.
  9. More data growth: This may seem redundant, but business data is predicted to double every two years. While this may seem like great news for traditional storage vendors, it is even better news for those who provide data storage on a massive scale, and for the technology firms that enable mobile access to that data anywhere while integrating well with existing storage systems. This exponential growth will drive advances in file system technologies, object storage integration, deduplication, high-capacity drives, and storage resource/lifecycle management tools.
  10. Backup and data protection evolution + tape will not die: The data protection market continues to change rapidly as more servers and applications are virtualized or converted to SaaS. Innovations in backup technology include the rapid rise of Veeam as a dominant backup and replication technology – not only for businesses but also for service providers. The Backup as a Service market seems to have stalled because feature sets are limited; however, the appliance model for backups and backup services continues to show high demand. The traditional market leaders face very strong competition from the new players and from longtime competitor CommVault, which has evolved into a true storage resource management play and is rapidly gaining market share as an enterprise solution. Data deduplication has evolved from appliances such as Data Domain into a software feature included in almost every backup product: CommVault, Veeam, Backup Exec, and others all offer server-side deduplication, client-side deduplication, or both. The appliance model for disk-based backups continues to be popular, with Data Domain, ExaGrid, and Avamar as leading examples; EMC dominates this market, and the competition is still trying to capture share. Symantec has even entered the game with its own backup appliances, which are essentially servers preconfigured with its popular software and internal storage. And tape will not die: long-term, high-capacity archives still require tape, primarily for economic reasons. The current generation of tape technology, LTO-6, can hold 2.5 TB of native data (up to 6.25 TB compressed) on a single tape, and tape drives now routinely ship with built-in encryption to avoid the data breaches that were more common in the past with unencrypted tapes.
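Prediction 8 mentions spreading data across multiple locations while preserving integrity. The core idea can be sketched in a few lines; this is a toy illustration using simple XOR parity (which only survives the loss of one fragment), whereas real dispersal systems like Cleversafe use Reed-Solomon-style erasure codes that tolerate multiple failures:

```python
# Toy sketch of data dispersal: split an object into n fragments stored
# at n "sites", plus one XOR parity fragment. Losing any single fragment
# is recoverable. Real object stores use Reed-Solomon erasure coding.

def disperse(data: bytes, n: int):
    """Split data into n equal fragments plus an XOR parity fragment."""
    pad = (-len(data)) % n
    padded = data + b"\x00" * pad          # pad so all fragments are equal size
    size = len(padded) // n
    frags = [padded[i * size:(i + 1) * size] for i in range(n)]
    parity = bytearray(size)
    for f in frags:                        # parity = XOR of all fragments
        parity = bytearray(p ^ b for p, b in zip(parity, f))
    return frags, bytes(parity)

def recover(frags, parity, lost_index: int) -> bytes:
    """Rebuild a lost fragment by XORing the parity with the survivors."""
    rebuilt = bytearray(parity)
    for i, f in enumerate(frags):
        if i == lost_index:
            continue
        rebuilt = bytearray(r ^ b for r, b in zip(rebuilt, f))
    return bytes(rebuilt)

# Example: disperse across 4 sites, then "lose" site 1 and rebuild it.
data = b"petabyte-scale object data"
frags, parity = disperse(data, 4)
assert recover(frags, parity, 1) == frags[1]
```

Because the parity is the XOR of all fragments, XORing it with the surviving fragments cancels everything except the lost one; that is the same algebra RAID 5 uses, just applied across locations instead of disks.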
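The "doubling every two years" in prediction 9 compounds faster than intuition suggests. A quick back-of-the-envelope calculation, assuming a hypothetical starting point of 100 TB of business data:

```python
# Back-of-the-envelope: data doubling every two years.
# The 100 TB starting figure is a made-up example, not from the article.
start_tb = 100
for years in (2, 4, 6, 8, 10):
    projected = start_tb * 2 ** (years / 2)
    print(f"after {years:2d} years: {projected:,.0f} TB")
# Ten years of doubling every two years is 2**5 = 32x: 100 TB becomes 3,200 TB.
```

That 32x multiplier over a decade is why the prediction favors massive-scale providers and lifecycle management tooling over simply buying more of the same arrays.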


So there you have it, my 2014 storage predictions. What do you think? Which do you agree with/disagree with? Did I leave anything off that you think will have a major impact next year? As always, reach out if you have any questions!