Category Archive: Virtualization

Transform Your Android into a Workstation

The smartphone revolution has significantly affected IT networks and software market shares. While Windows applications lead the software market, Android is quickly taking over the mobile market. According to NetMarketShare, Windows holds more than 80% of the desktop OS market, while Android stands at 28% across all versions. Looking at these numbers, it is […]

The post Transform Your Android into a Workstation appeared first on Parallels Blog.

10 Things to Know About Docker

It’s possible that containers and container management tools like Docker will be the single most important thing to happen to the data center since the mainstream adoption of hardware virtualization in the 90s. In the past 12 months, the technology has matured beyond powering large-scale startups like Twitter and Yelp and found its way into the data centers of major banks, retailers, and even NASA. When I first heard about Docker a couple of years ago, I started off as a skeptic. I blew it off as skillful marketing hype around an old concept of Linux containers. But after incorporating it successfully into several projects at Spantree, I am now a convert. It’s saved my team an enormous amount of time, money, and headaches and has become the underpinning of our technical stack.

If you’re anything like me, you’re often time-crunched and may not have a chance to check out every shiny new toy that blows up on GitHub overnight. So this article is an attempt to quickly impart 10 nuggets of wisdom that will help you understand what Docker is and why it’s useful.

Docker is a container management tool.

Docker is an engine designed to help you build, ship, and run application stacks and services as lightweight, portable, and isolated containers. The Docker engine sits directly on top of the host operating system. Its containers share the kernel and hardware of the host machine with roughly the same overhead as processes launched directly on the host.

But Docker itself isn’t a container system; it piggybacks on the container facilities already baked into the OS, such as LXC on Linux. Those facilities have existed in operating systems for many years, but Docker provides a much friendlier image management and deployment system for working with them.
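To make that concrete, here is a minimal sketch using the Docker SDK for Python (the "docker" package). It assumes a local Docker daemon is running, and the alpine image is purely an illustrative choice.

```python
# A minimal sketch using the Docker SDK for Python (pip install docker).
# Assumptions: a local Docker daemon is running, and the "alpine" image is
# an illustrative choice pulled from Docker Hub.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a throwaway container: Docker pulls the image if needed, starts an
# isolated process on the host kernel, and removes the container afterwards.
output = client.containers.run("alpine", ["echo", "hello from a container"], remove=True)
print(output.decode().strip())

# The same engine also manages images and containers for you.
for image in client.images.list():
    print("image:", image.tags)
for container in client.containers.list(all=True):
    print("container:", container.short_id, container.status)
```

The point isn’t the Python itself; it’s that the engine handles pulling images, wiring up isolation, and cleaning up, so you can treat containers like lightweight, disposable processes.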

 

Docker is not a hardware virtualization engine.

When Docker was first released, many people compared it to virtual machine hypervisors like VMware, KVM, and VirtualBox. While Docker solves a lot of the same problems and shares many of the same advantages as hypervisors, it takes a very different approach. Virtual machines emulate hardware. In other words, when you launch a VM and run a program that hits disk, it’s generally talking to a “virtual” disk. When you run a CPU-intensive task, those CPU instructions need to be translated into something the host CPU understands. All these abstractions come at a cost: two disk layers, two network layers, two process schedulers, even two whole operating systems that need to be loaded into memory. These limitations typically mean you can only run a few virtual machines on a given piece of hardware before you start to see an unpleasant amount of overhead and churn. On the other hand, you can theoretically run hundreds of Docker containers on the same host machine without issue.

All that being said, containers aren’t a wholesale replacement for virtual machines. Virtual machines provide a tremendous amount of flexibility in areas where containers generally can’t. For example, if you want to run a Linux guest operating system on top of a Windows host, that’s where virtual machines shine.
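One way to see the kernel-sharing point for yourself: the kernel release reported inside a container matches the host’s, whereas a VM reports its own guest kernel. This is a hedged illustration, assuming a Linux host with a running Docker daemon and the Docker SDK for Python installed.

```python
# Hedged illustration: containers share the host kernel.
# Assumes a Linux host with a running Docker daemon and the "docker" package.
import platform
import docker

client = docker.from_env()

host_kernel = platform.release()
container_kernel = (
    client.containers.run("alpine", ["uname", "-r"], remove=True)
    .decode()
    .strip()
)

print("host kernel:     ", host_kernel)
print("container kernel:", container_kernel)
# On Linux these match, because the container is just an isolated process
# tree on the host kernel; a VM would report its own guest kernel instead.
```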

 

Download the whitepaper to read the rest of the list of 10 Things You Need to Know About Docker


Whitepaper by Cedric Hurst, Principal at Spantree

CTO Focus Interview: Gunnar Berger, Citrix

In the third installment of our CTO Focus Interview series, I got to speak with Gunnar Berger, CTO at Citrix (View Part I and Part II of the series). Gunnar is a well-respected thought leader who previously worked as an analyst at Gartner and joined Citrix last June. He is on a mission to make VDI easier and cheaper to deploy. I’d highly recommend following Gunnar on Twitter to hear more from him.

 

Ben: What are your primary responsibilities at Citrix?

Gunnar: A lot of what I do at Citrix is on the back end and not necessarily public facing. On the public side, it’s more about looking at long-term strategy. Most roadmaps look ahead 12-18 months. I can be influential in these plans, but I am really looking at the longer-term strategy. Where are we going to be in 3-5 years? How do we actually get to that place? How do you take today’s roadmap and drive it toward that 5-year plan? One of the main reasons I took the job at Citrix is that I want to fix VDI. I think it costs too much and is too complex. I think we truly can change VDI at Citrix.

 

Ben: What are some of the challenges you face as a CTO?

Gunnar: One of the main challenges when looking at long term strategies is that things can happen in the short term that can impact those long term plans. That’s every CTO’s challenge regardless of industry. In this particular industry, things change every single day. Every couple of months there is a major merger or acquisition. You have to be nimble and quick and be ready to make adjustments on the fly. My background at Gartner is very relevant here.  I have to make sure I understand where the customer is now and where they will be 3-5 years from now.

If you look at the history of Citrix, look back 5 years and you see they made an incorrect prediction on VDI. You can create a long term strategy and have it not work out. If you aren’t correct with your long term strategy, it’s important to capture that early on and pivot.

 

Ben: What goals do you have for 2015?

Gunnar: I have three main goals heading into 2015. The first is doubling down on applications. The second is to review the complexity and costs of VDI. The third is to “bridge to the cloud.”

1. Double down on applications

Citrix over-rotated on VDI, but now the pendulum is moving back. VDI has a place, but so does RDS. We are doubling down so that XenApp is not a second-class citizen to XenDesktop. Apps are what users want, and XenApp is our tried-and-true solution for pushing these apps out to users on any device.

2. Review complexity and cost of VDI

My overall goal is to make VDI easier and cheaper to deploy. This plays into a long-term strategy. Let’s face it, VDI deployments take a lot of time and money. I can’t remember where I heard this stat, but for every dollar of a VDI sale I need to sell $12 in everything else. For a customer to buy one thing, they need to buy $12 of something else…not an ideal situation for the customer.

We need to solve that issue to make it less costly. I’m unapologetically a fan of VDI. I think it’s an extremely powerful technology that has a lot of great benefits, but it is currently costly and complex. Luckily, in my position I get to work with a lot of really smart people who can solve this, so I’m confident that Citrix will make VDI what I have always wanted it to be.

3. Bridge to the cloud

This is where Citrix Workspace Services comes into play. You will start seeing more and more of this from Citrix over the next several months. Essentially, this is the unification of all of our different products (XenDesktop, XenApp, XenMobile, NetScaler, etc.). We will be “SaaS-ifying” our entire stack, which is a massive undertaking. We really want to improve the admin experience by creating a single administrative interface for users of all the different product suites.

The goal is to provide the enterprise with the same benefits an end user gets from products like the Chromebook – you automatically get the latest version, so you never have to update manually. We want to get to the point where, no matter what, customers are always operating on the most recent versions. This obviously benefits the customer, as they are getting the latest features instantly.

Citrix isn’t going to try to become a cloud provider. To do that you need billions of dollars. We’re building a bridge to enable users to move seamlessly from on-prem to off-prem. You want to be on Azure or Amazon? We can do that.

The idea is that this becomes the middle ground between you and those cloud providers. What I like about being the intermediary is being able to dial up and back between clouds seamlessly to allow customers to stand things up and test them in minutes instead of days.

 

Ben: Citrix has made heavy investments in mobility. Where do you see mobility in the next 3-5 years?

Gunnar: Honestly, I want to stop talking about mobility like it’s something special. Everything we are doing these days is mobile. Mobile Device Management? Mobile Application Management? We need to drop the “mobile” from this story. It’s device management. It’s application management. As far as where mobility fits in with Citrix – it’s inherent to the big picture, much like the necessity to breathe. I say this to paint a picture because it’s in our DNA. This is what Citrix has done for the last 25 years. In today’s world of smartphones and tablets, we take apps and make them run elsewhere, just like we have always done.

 

Ben: Throughout your career, what concept or technology would you say has had the most drastic impact on IT?

Gunnar: Hands down virtualization. Virtualization is the root of where cloud started. Cloud is the most disruptive technology moving forward, and it all started with the hypervisor.

 

Are you a CIO/CTO interested in participating in our Focus Interview series? Email me at bstephenson@greenpages.com

By Ben Stephenson, Emerging Media Specialist

 

Riding on the Cloud – The Business Side of New Technologies

For the last couple of years “The Cloud” has been a buzzword all over the business and IT world.

What is The Cloud? Basically, it is the ability to use remote servers to handle your processing, storage and other IT needs. In the old days you only had the resources that you physically had on your computer; these days that’s not the case. You can “outsource” resources from another computer in a remote location and use them anywhere. This has opened many doors in the world of business and has helped bring new companies onto the internet.

Why? Because of how much it reduces the cost of being on the internet. A server is a costly piece of equipment and not everybody can afford it. Between the initial cost and upkeep of the hardware, you could easily spend a few thousand pounds every year.

The cloud has brought on the Virtual Private Server, which gives you all the benefits of an actual server without the hefty price tag. A hosting company will rent out a piece of their processing capabilities to your company and create a server environment for you. You only pay for what you use and you don’t have to worry about things like hardware failure, power costs or having room for a couple of huge server racks.

But what if your business grows? One of the biggest advantages of the cloud is that it can grow along with your business and your needs. It’s highly scalable and flexible, so if you ever need some extra storage or extra bandwidth, it’s a really easy fix that does not require you to purchase new equipment.

Since your own personal business cloud is by definition a remote solution, this means that you can access it from anywhere and everywhere as long as you have an internet connection. Want to make changes to your server? You can probably do it without leaving your house, even from the comfort of your own bed.

The same applies to your staff. If anyone ever needs to work from home or from another machine that’s not their work computer, all of the important files and resources they could possibly need can be hosted in the cloud, making those files accessible from anywhere. If someone’s office computer breaks there’s a backup and no data is lost.

The Cloud also makes sharing files between members of your staff a lot easier. Since none of the files are hosted on a local machine everybody has access to the files they require. Files update in real time, applications are shared and you can create a business environment that’s exponentially more effective.

Of course, the cloud still offers security and access control so you can keep track of who can see which files. A good cloud services provider also provides protection against malware and other security risks, to make sure that no pesky interlopers get into your files.

If your business is growing and so are your IT needs, then the cloud is an option worth exploring. Embrace the future, adopt new technologies and take your business to the next level.

So You Want to be a Cloud Architect? Part I

Lately, I’ve been taking a look at the role of a cloud architect. What does that role look like? How does one acquire the necessary skills? Where do aspiring cloud architects turn for training? Is it possible to acquire the skills to be an effective cloud architect by taking a course or two? Rick Blaisdell wrote a great blog about a month ago called “Top Cloud Skills Employers are Looking For” that I would highly recommend reading.

Cloud Architect Training, Today

I recently wrote a blog about a Forbes article I read by Jason Bloomberg about the concept of cloudwashing. I think this idea applies very well to the cloud training offerings out there today. If you take a look, many of the cloud training courses that currently exist from established vendors are simply rebranded with “Cloud” as a highlight.

The Valuable Cloud Architect

The cloud architect should possess a healthy combination of the following skills:

  • Technical – especially virtualization, with a splash of programming and automation
  • IT operations – in particular the concept of IT services and an IT services portfolio
  • Understanding of your business, its initiatives, its challenges, etc.

Technical Skills of the Cloud Architect

Technically speaking, cloud architects need to bring their past experience to bear, validated by the following certifications:

  • Current VCP
  • ITIL v3 Foundations
  • At least one vendor certification (e.g., AWS Certified Solution Architect)
  • Basic knowledge of programming and automation(2)

Technical competence is more of a prerequisite for a cloud architect; it should be assumed that a cloud architect has some hands-on experience with these items. As they say, these technical skills are necessary but not sufficient for the complete cloud architect. Having acquired these certifications, an architect would probably be up to speed on the progression IT has made since virtualization became mainstream. If you’ve read this far, you’d probably agree that virtualization alone does not mean cloud computing. However, many of the fundamental characteristics of cloud borrow from virtualization and are practical because of it.

Other Core Skills of the Cloud Architect

Maybe the role of the cloud architect is less technical than we think. Business and market knowledge is absolutely critical for the cloud architect, for several reasons:

  1. Products, features, and prices are changing day to day in the market for cloud services – why is this happening and what will happen tomorrow?
  2. The traditional corporate IT market is, effectively, now a competitor in this market for IT services
  3. Using cloud concepts, new companies are being formed and are growing rapidly – some of these companies may challenge your own business. How can the cloud architect understand them, improve their company’s competitive advantage, recognize partnership opportunities, and bring products to market more rapidly using cloud and other emerging technology?

 

The third point is a bit of a departure from what we’ve typically expected of a cloud architect. It says that a cloud architect should really be a specialist in business issues. More than that, we think that Corporate IT should look to transform itself to specialize in the business rather than in providing IT services. IT departments should be an Innovation Center for the business. More on that in a future post.

The market for cloud computing is changing every day. Established providers like Amazon Web Services introduce new features and products while also dropping prices. Established companies like Microsoft up their game quickly. New companies form to carve out a niche or challenge an established provider. Barriers to entry are low. Economists call this an active competitive marketplace. What does this mean for consumers? Consumers enjoy significant downward pressure on prices for cloud services (especially for commodity IaaS). This also means many new products from which to choose. For this reason, we think the modern cloud architect also needs to have some knowledge of the following:

  • Consulting Experience (particularly, how to assess an organization)
  • Relationships Between the Customer and Provider(1)
  • Essentials of Behavior in a Competitive Market(1) – specialization, substitutes, complements, network effects

I’ve seen a few posts lately suggesting that individuals in corporate IT need to retool. Here is one of those posts. I definitely agree with the author that IT administrators need to shift focus to services rather than servers. What this post leaves out is a recommendation of which new skills are needed. Which skills or certifications should an IT administrator or architect acquire? How do they get them? Later in this series, I’ll propose a training path for the new role of Cloud Architect, explain why certain skills are important, and describe how to acquire them.

In Part II, I’ll propose three different types of cloud architect, outline the responsibilities that individuals in this role might have, and describe a path to obtaining the skills needed to deliver in this new role. Leave a comment below with your thoughts and stay tuned for Part II.

 

By John Dixon, Consulting Architect

Photo credit = http://www.patheos.com/blogs/deaconsbench/2011/10/what-they-didnt-ask/

Updating Your Network Infrastructure for Modern Devices

The world of IT infrastructure is changing today. This is driven by the way companies communicate and the way they send and receive data within their networks, and the development of cloud computing and virtualised servers has reshaped the way we share information with one another.

Cloud computing is a scalable and reliable environment that utilises remote servers to host and store your information. Some of the benefits of cloud computing include improved accessibility, reduced spending on maintaining localised servers, streamlined processes and much more flexibility for businesses and organisations. (To find out more about how cloud computing works and how it can benefit your business, visit PC Mag online.)

Networking and Secure Infrastructures

With the increased accessibility that comes from using servers in the cloud, network security has never been more important. A greater number of people and an increasing number of new devices, including mobile devices, will request access to modern business networks. From laptops and tablets, BlackBerrys and smartphones, to desktop computers and other digital devices, a single business will have a lot of different data handlers to consider.

New devices bring increased complexity in traffic patterns, and, as expected, there are more security threats as more devices request access to your network. With this in mind, today’s IT infrastructure needs to be updated to cope with the increasing amount of data flowing over the network. (For more information on networking, visit Logicalis, an international IT solutions provider.)

The Importance of Accessibility

What’s most important is understanding the value of welcoming such changes to your IT network. Virtualisation can improve the way businesses send and receive information, both internally and externally, and can also help organisations of all sizes cut down on costs in the long run. Cloud servers can also provide added security with data backup, and the development of virtualised computing can reduce planned downtime by up to 90%.

With the growth and development of modern devices it’s now more important than ever to ensure that you have increased accessibility for all business devices. Finding the right IT solutions provider for your business can help you support next-generation technology whilst encouraging better communication between key people in your company. 

Read more on how virtualisation and cloud servers could be redefining the roles of IT within a business on the Logicalis blog

VMware Horizon 6: Updates, Improvements, and Licensing Changes

By Chris Ward, CTO

I got a late start on this blog post, but I’m a fan of the saying “better late than never!” VMware officially shipped Horizon 6, the long-awaited major update to its end-user computing product set, late last month. There are numerous updates and improvements across the product set with this major release, but there is also a change in how it is licensed. In the past, Horizon was consumed either as individual products (VIEW, Mirage, Workspace, etc.) or as a suite that included all components. With this new release, VMware has transitioned to its traditional product hierarchy, which includes Horizon Standard, Advanced, and Enterprise editions.

Each edition builds on the previous one, with additional features added into the mix. The Standard edition basically amounts to what we’ve known as VIEW in the past: the baseline VDI feature set, inclusive of the connection and security servers, the PCoIP protocol, ThinApp application virtualization, and linked-clone functionality. Moving to the Advanced edition adds Mirage management, Remote Desktop Session Host (RDSH), Horizon Workspace, and vSAN integration. The Enterprise edition adds vCOPS monitoring and vCAC/vCenter Orchestrator integration.

One of the more exciting features of Horizon 6 is RDSH application publishing. This is a big deal because it’s been a glaring missing checkbox when comparing Horizon to Citrix in the past. The feature allows you to configure Windows terminal server (RDSH) farms that publish individual applications rather than full desktop sessions, very closely resembling Citrix XenApp. Why is this a big deal? Well, it can save a lot of back-end horsepower when 50 users can share a single RDSH VM to run a few applications rather than needing 50 desktop VMs in traditional VDI. This allows a more flexible architecture, so you can deliver each application in the best way possible rather than being forced into deploying only full desktop operating systems.
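As a rough illustration of that back-end savings argument, here is the back-of-envelope math. Every figure below is an assumption chosen for illustration only, not VMware sizing guidance.

```python
# Back-of-envelope comparison: 50 users on one shared RDSH session host
# versus 50 full desktop VMs. All figures are illustrative assumptions,
# not sizing guidance.
USERS = 50

# Traditional VDI: one desktop VM per user.
DESKTOP_VM_RAM_GB = 2          # assumed RAM per desktop VM
DESKTOP_VM_VCPUS = 2           # assumed vCPUs per desktop VM
vdi_ram_gb = USERS * DESKTOP_VM_RAM_GB
vdi_vcpus = USERS * DESKTOP_VM_VCPUS

# RDSH application publishing: many users share one session host VM.
RDSH_BASE_RAM_GB = 8           # assumed base RAM for the session host
RDSH_RAM_PER_SESSION_GB = 0.5  # assumed incremental RAM per user session
RDSH_VCPUS = 8                 # assumed vCPUs for the session host
rdsh_ram_gb = RDSH_BASE_RAM_GB + USERS * RDSH_RAM_PER_SESSION_GB
rdsh_vcpus = RDSH_VCPUS

print(f"Full-desktop VDI: {vdi_ram_gb} GB RAM, {vdi_vcpus} vCPUs for {USERS} users")
print(f"RDSH publishing:  {rdsh_ram_gb:.0f} GB RAM, {rdsh_vcpus} vCPUs for {USERS} users")
```

Under these assumed numbers, the shared session host needs a fraction of the memory and vCPU of per-user desktop VMs, which is the whole appeal of publishing applications instead of full desktops.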

Mirage integration with the traditional VIEW product has improved as well. While not 100% there, you can now get some of the Mirage OS/application layering functionality inside the VDI environment while still being able to use Mirage in its native capacity as a physical desktop management platform. vSAN integration is a big step forward in potentially minimizing the typically large storage costs of a VDI environment, and the inclusion of vCOPS in the Enterprise edition is great, as it provides very deep insight into what’s going on under the covers of your Horizon infrastructure, including deep PCoIP analytics. Finally, the Workspace component of Horizon has improved greatly, allowing you to provide your end users with a single web page where they can access VDI desktops, RDSH-published applications, Citrix XenApp published applications, ThinApp packaged applications, and SaaS-based apps such as Office 365, Google Apps, etc.

With this release, VMware seems to be delivering on its promise that the EUC space is one of its three strategic focus areas. I look forward to further improvements, along with the integration of AirWatch into the Horizon family in upcoming releases. For now, Horizon 6 is a very big step in the right direction.

Have you tried or migrated to Horizon 6 since the launch?  If so, please share your thoughts!

 

Are you interested in learning about how you can extend your data center into the cloud with VMware vCloud Hybrid Service? Register for our upcoming webinar!


10 Storage Predictions for 2014

By Randy Weis, Consulting Architect, LogicsOne

As we wrap up 2013 and head into the New Year, I wanted to give 10 predictions I have for the storage market for 2014.

  1. DRaaS will be the hottest sector of cloud-based services: Deconstructing the cloud means breaking out specific services that fit the definition of a cloud-type service, such as Disaster Recovery as a Service (DRaaS) and other specialized, targeted uses of shared multi-tenant computing and storage services. Capital expenditures, time to market, and staff training are all issues that prevent companies from developing a disaster recovery strategy and actually implementing it. I predict that DRaaS will be the hottest sector of cloud-based services for small-to-medium businesses and commercial companies. This will impact secondary storage purchases.
  2. Integration of flash storage technology will explode: The market for flash storage is maturing and consolidating. EMC has finally entered the market. Cisco has purchased Whiptail to integrate into its Unified Computing System. PCI flash, server flash drives at different tiers of performance and endurance, hybrid flash arrays, and all-flash arrays will all continue to drive the adoption of solid state storage in mainstream computing.
  3. Storage virtualization – software-defined storage on the rise: VMware is going to make its Virtual SAN (VSAN) technology generally available at the beginning of Q2 2014. This promises to create a brand new tier of storage in datacenters for virtual desktop solutions, disaster recovery, and other specific use cases. EMC has made its first release of a software-defined storage product called ViPR. It has a ways to go before it really begins to address software-defined storage requirements, but it is a huge play in the sense that it validates a segment of the market that has long had a minuscule share. DataCore has been the only major player in this space for 15 years. They see EMC’s announcement as a validation of their approach to decoupling storage management and software from commodity hard drives and proprietary array controllers.
  4. Network Attached Storage (NAS) Revolution: We’re undergoing a revolution with the introduction and integration of scale-out NAS technologies. One of the most notable examples is Isilon, purchased by EMC and now appearing as a more fully integrated and widely available solution with a wide variety of applications. Meanwhile, NetApp continues to innovate in the traditional scale-up NAS market with increasing adoption of ONTAP 8.x. New NAS systems feature support for SMB 3.0, the most recent release of Microsoft’s Windows-based file-sharing protocol (also known as CIFS) and a significant overhaul of it. This has a significant impact on designing Hyper-V storage and Windows file sharing in general. Client- and server-side failover are now possible with SMB 3.0, which enables the kind of high availability and resiliency for Hyper-V that VMware has enjoyed as a competitive advantage.
  5. Mobile Cloud Storage – File Sharing Will Never Be the Same: Dropbox, Box, Google Drive, Huddle, and other smartphone-based methods of accessing data anywhere are revolutionizing the way individual consumers access their data. This creates security headaches for IT admins, but the vendors are responding with better and better security built into their products. At the enterprise level, Syncplicity, Panzura, Citrix ShareFile, Nasuni, and other cloud storage and shared storage technologies are providing deep integration with Active Directory and enabling the transfer of large files across long distances quickly and securely. These technologies integrate with on-premises NAS systems and cloud storage. Plain and simple, file sharing will never be the same again.
  6. Hyper-Converged Infrastructure Will Be a Significant Trend: The market share dominance of Nutanix, SimpliVity (based in Westborough, MA), and VMware’s VSAN technology will change the way shared storage is viewed in datacenters of every size. These products will not replace shared storage arrays but instead provide an integrated, flexible, and modular way to scale virtualized application deployments such as VDI and virtual servers. These technologies all integrate compute, storage, networking (at different levels), and even data protection technology to eliminate multiple expenditures and multiple points of management. Most importantly, hyper-converged infrastructure will allow new deployments to begin small and then scale out without large up-front purchases. This will not work for every tier of application or every company, but it will be a significant trend in 2014.
  7. Big Data Will Spread Throughout Industries: Big Data has become as much a buzzword as cloud, but the actual use of the technologies we call big data is growing rapidly. This adoption is happening not only at internet giants like Google and companies that track online behavior, but also in industries such as insurance, life sciences, and retail. Integration of big data technologies (e.g., Hadoop and MapReduce) with more traditional SQL database technology allows service providers of any type to extract data from traditional databases and process it on a huge scale more efficiently and more quickly, while still retaining the advantages of more structured databases. This trend will continue to spread throughout the many industries that need to manage large amounts of structured and unstructured data.
  8. Object-based storage will grow: Cloud storage will be big news in 2014 for two major reasons. The first stems from the shock waves of Nirvanix going out of business: corporate consumers of cloud storage will be much more cautious and demand better SLAs in order to hold cloud storage providers accountable. The second has to do with the adoption of giant, geographically dispersed data sets. Object-based storage has been a little-known but important development in storage technology that allows data sets on the scale of petabytes to be stored and retrieved by the people who generate data and those who consume it. However, these monstrous data sets can’t be protected by traditional RAID technologies. Providers such as Cleversafe have developed a means of spreading data across multiple locations, preserving its integrity and improving resiliency while continuing to scale to massive amounts.
  9. More Data Growth: This may seem redundant, but business data is predicted to double every two years. While this may seem like great news for traditional storage vendors, it is even better news for those who provide data storage on a massive scale and for the technology firms that enable mobile access to that data anywhere while integrating well with existing storage systems. This exponential data growth will lead to advances in file system technologies, object storage integration, deduplication, high-capacity drives, and storage resource/lifecycle management tools.
  10. Backup and Data Protection Evolution + Tape Will Not Die: The data protection market continues to change rapidly as more servers and applications are virtualized or converted to SaaS. Innovations in backup technology include the rapid rise of Veeam as a dominant backup and replication technology – not only for businesses but also for service providers. The Backup as a Service market seems to have stalled out because feature sets are limited; however, the appliance model for backups and backup services continues to show high demand. The traditional market leaders face very strong competition from the new players and from longtime competitor CommVault, which has evolved into a true storage resource management play and is rapidly gaining market share as an enterprise solution. Data deduplication has evolved from appliances such as Data Domain into a software feature included in almost every backup product out there; CommVault, Veeam, Backup Exec, and others all offer server-side deduplication, client-side deduplication, or both (a toy sketch of the deduplication idea follows this list). The appliance model for disk-based backups continues to be popular, with Data Domain, ExaGrid, and Avamar as leading examples; EMC dominates this market, and the competition is still trying to capture share. Symantec has even entered the game with its own backup appliances, which are essentially servers preconfigured with its popular software and internal storage. Tape will not die: long-term, high-capacity archives still require tape, primarily for economic reasons. The current generation of tape technology, such as LTO-6, can hold up to 6 TB of data on a single tape, and tape drives are routinely made with built-in encryption to avoid the data breaches that were more common in the past with unencrypted tape.
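To ground the deduplication point from the last prediction, here is a toy sketch of the underlying technique: split a data stream into chunks, store each unique chunk once under its content hash, and keep an ordered recipe of hashes to rebuild the original. It is a concept illustration only, not how CommVault, Veeam, or Data Domain actually implement it (real products use smarter, often variable-size chunking).

```python
# Toy illustration of content-hash deduplication. Real backup products use
# far more sophisticated chunking and storage, but the core idea is the same.
import hashlib

CHUNK_SIZE = 4096  # bytes; an arbitrary choice for the example

def deduplicate(data: bytes):
    store = {}    # chunk hash -> chunk bytes (each unique chunk kept once)
    recipe = []   # ordered list of hashes needed to rebuild the stream
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe

def rebuild(store, recipe) -> bytes:
    return b"".join(store[digest] for digest in recipe)

# Highly repetitive data (think nightly full backups) dedupes very well.
data = (b"A" * CHUNK_SIZE) * 100 + (b"B" * CHUNK_SIZE) * 50
store, recipe = deduplicate(data)
assert rebuild(store, recipe) == data

stored = sum(len(chunk) for chunk in store.values())
print(f"original: {len(data)} bytes, unique chunks stored: {stored} bytes")
```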

 

So there you have it, my 2014 storage predictions. What do you think? Which do you agree with/disagree with? Did I leave anything off that you think will have a major impact next year? As always, reach out if you have any questions!

 

The 2013 Tech Industry – A Year in Review

By Chris Ward, CTO, LogicsOne

As 2013 comes to a close and we begin to look forward to what 2014 will bring, I wanted to take a few minutes to reflect back on the past year.  We’ve been talking a lot about that evil word ‘cloud’ for the past 3 to 4 years, but this year put a couple of other terms up in lights including Software Defined X (Datacenter, Networking, Storage, etc.) and Big Data.  Like ‘cloud,’ these two newer terms can easily mean different things to different people, but put in simple terms, in my opinion, there are some generic definitions which apply in almost all cases.  Software Defined X is essentially the concept of taking any ties to specific vendor hardware out of the equation and providing a central point for configuration, again vendor agnostic, except of course for the vendor providing the Software Defined solution :) .  I define Big Data simply as the ability to find a very specific and small needle of data in an incredibly large haystack within a reasonably short amount of time. I see both of these technologies becoming more widely adopted in short order with Big Data technologies already well on the way. 

As for our friend ‘the cloud,’ 2013 did see a good amount of growth in the consumption of cloud services, specifically in the areas of Software as a Service (SaaS) and Infrastructure as a Service (IaaS).  IT has adopted a ‘virtualization first’ strategy over the past 3 to 4 years when it comes to bringing any new workloads into the datacenter, and I anticipate we’ll begin to see a ‘SaaS first’ approach adopted in short order if it is not out there already.  However, I can’t necessarily say the same on the IaaS side as far as ‘IaaS first’ goes.  While IaaS is a great solution for elastic computing, I still see most usage confined to application development or super-large scale-out applications (think Netflix).  The mass adoption of IaaS for simply forklifting existing workloads out of the private datacenter and into the public cloud hasn’t happened.  Why? My opinion is that, for traditional applications, neither the cost nor the operational model makes sense yet.

In relation to ‘cloud,’ I did see a lot of adoption of advanced automation, orchestration, and management tools, and thus an uptick in ‘private clouds.’  There are some fantastic tools now available, both commercial and open source, and I absolutely expect this adoption trend to continue, especially in the enterprise space.  Datacenters with a vast amount of change occurring, whether in production or test/dev, can greatly benefit from these solutions. However, this comes with a word of caution: just because you can doesn’t mean you should.  I say this because I have seen several instances where customers have wanted to automate literally everything in their environments. While that may sound good on the surface, I don’t believe it’s always the right thing to do.  There are times when a human touch remains the best way to go.

As always, there were some big time announcements from major players in the industry. Here are some posts we did with news and updates summaries from VMworld, VMware Partner Exchange, EMC World, Cisco Live and Citrix Synergy. Here’s an additional video from September where Lou Rossi, our VP, Technical Services, explains some new Cisco product announcements. We also hosted a webinar (which you can download here) about VMware’s Horizon Suite, as well as a webinar on our own Cloud Management as a Service offering.

The past few years have seen various predictions about the unsustainability of Moore’s Law, which states that processors will double in computing power every 18-24 months, and 2013 was no exception.  The latest prediction is that by 2020 we’ll reach the 7nm mark and Moore’s Law will no longer hold as an exponential function.  The interesting part is that this prediction is not based on technical limitations but rather economic ones: getting below that 7nm mark will be extremely expensive from a manufacturing perspective and, hey, 64k of RAM is all anyone will ever need, right?  :)

Probably the biggest news of 2013 was the revelation that the National Security Agency (NSA) had undertaken a massive program and seemed to be capturing every packet of data coming into or out of the US across the Internet.   I won’t get into any political discussion here, but suffice it to say this is probably the largest example of ‘big data’ that currently exists.  This also has large potential ramifications for public cloud adoption: security and data integrity have been two of the major roadblocks to adoption, so it certainly doesn’t help that customers may now be concerned about the NSA eavesdropping on everything going on within public datacenters.  It is estimated that public cloud providers may lose as much as $22-35B over the next 3 years as customers slow adoption because of this.  The only good news, at least for now, is that it’s very doubtful the NSA or anyone else on the planet has the means to actually mine anywhere close to 100% of the data they are capturing.  However, like anything else, it’s probably only a matter of time.

What do you think the biggest news/advancements of 2013 were?  I would be interested in your thoughts as well.

Register for our upcoming webinar on December 19th to learn how you can free up your IT team to be working on more strategic projects (while cutting costs!).