Category Archive: SDDC

VMworld 2015: Day One Recap

It was a long but good week out west for VMworld 2015. This year’s event was kicked off by Carl Eschenbach (COO), who said there were roughly 23,000 attendees at the event this year, a new record. Carl highlighted that the core challenges seen today by VMware’s customers are speed, innovation, productivity, agility, security, and cost. Not a huge surprise based on what I have seen with our customer base. Carl then went into how VMware could help customers overcome these challenges and broke the solutions up into the categories of run, build, deliver, and secure. The overarching message here was that VMware is keenly focused on making the first three (run, build, and deliver) easier and on security across all of the various product/solution sets in the portfolio. Carl also hit on freedom, flexibility, and choice as being core to VMware, meaning that they are committed to working with any and all vendors/solutions/products, both upstream in the software world and downstream in the hardware world. We’ve heard this message now for a couple of years, and it’s obvious that VMware is making strides in that area (one example being more and more OpenStack integration points).

Carl then began discussing the concept of a single Unified Hybrid Cloud. In a way, this is very similar to GreenPages’ CMaaS messaging in that we don’t necessarily care where systems and applications physically reside because we can provide a single pane of glass to manage and monitor them regardless of location. In the case of VMware, this means having a common vSphere-based infrastructure in the datacenter or in the cloud and allowing seamless movement of applications across various private or public environments.

Carl then introduced Bill Fathers, the general manager for vCloud Air. Apparently, the recent rumors regarding the death of vCloud Air were greatly exaggerated, as it was front and center in both keynotes and during Sunday’s partner day. As far as vCloud Air adoption goes, Bill said that VMware is seeing the most traction in the areas of DR, application scaling, and mobile development.

Bill brought Raghu Raghuram, who runs the infrastructure and management (SDDC) business, up on stage with him. Raghu, again, kept the conversation at a high level and touched on the rise of the hybrid application and how VMware’s Unified Hybrid Cloud strategy could address it. A hybrid application is one in which some components (typically back-end databases) run in the traditional on-premises datacenter while other components (web servers, middleware servers, etc.) run in a public cloud environment. This really ties into the age-old concept of “cloud bursting,” where one might need to spin up a lot of web servers for a short period of time (Black Friday for retail, Valentine’s Day for flower shops, etc.) and then spin them back down. This has really been a bit of science fiction to date, as most applications were never developed with this in mind and, thus, don’t necessarily play nice in this world. However, VMware (and I can personally attest to this via conversations with customers) is seeing more and more customers develop “cloud native” applications which ARE designed to work this way. I agree; this will be a very powerful cloud use case over the next 12-24 months. I see GreenPages being very well positioned to add a ton of value for our customers in this area, as we have strong teams on both the infrastructure and cloud native application development sides of the equation.
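To make the bursting idea concrete, here is a minimal control-loop sketch in Python. It is purely illustrative: the thresholds and provisioning calls are hypothetical stand-ins, not any particular vendor’s API, and a real implementation would call a monitoring system and a public cloud provisioning API.

import itertools
import random

BURST_THRESHOLD = 0.80     # burst out when the on-prem web tier passes 80% load
SCALE_IN_THRESHOLD = 0.40  # tear burst capacity down once load falls back
_ids = itertools.count(1)

def get_onprem_utilization():
    # Stand-in for a real monitoring query; simulated here so the sketch runs.
    return random.uniform(0.2, 1.0)

def provision_cloud_web_servers(count):
    # Stand-in for a public cloud API call; returns fake instance IDs.
    return ["burst-vm-%d" % next(_ids) for _ in range(count)]

def deprovision(instance_ids):
    print("tearing down:", instance_ids)

def burst_step(burst_instances):
    load = get_onprem_utilization()
    if load > BURST_THRESHOLD:
        # Spike (Black Friday, Valentine's Day): add web servers in the cloud.
        new = provision_cloud_web_servers(5)
        print("load %.2f -> bursting out: %s" % (load, new))
        return burst_instances + new
    if load < SCALE_IN_THRESHOLD and burst_instances:
        # Spike over: spin the rented capacity back down.
        deprovision(burst_instances)
        return []
    return burst_instances

instances = []
for _ in range(10):  # ten simulated polling intervals
    instances = burst_step(instances)

The point of the sketch is that a cloud native application has to tolerate web-tier instances appearing and disappearing at any time, which is exactly what most traditional applications were never built to do.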

Another tight collaboration between Bill’s and Raghu’s teams is Project Skyscraper, the concept of Cross-Cloud vMotion, which, as the name would imply, is the process of moving a live, running virtual machine between a private cloud and vCloud Air (or vice versa) with literally zero downtime. Several technologies come together to make this happen, including NSX to provide the layer 2 stretch between the environments and shared-nothing vMotion/vSphere Replication to handle the data replication and actual movement of the VM. While this is very cool and makes for a great demo, I do question why you would want to do a lot of it. As we know, there is much more to moving an existing application to a cloud environment than simply forklifting what you have today. Typically, you’ll want to re-architect the application to take full advantage of what the public cloud can offer. But if you simply want an active/active datacenter and/or stretch-cluster setup and don’t have your own secondary datacenter or co-lo facility to build it in, this could be a quick way to get there.

Following Raghu was Rodney Rogers, CEO of Virtustream, the hosting provider recently acquired by EMC and the rumored death knell for vCloud Air. Rodney did a great job explaining where Virtustream fits in the cloud arena: it is essentially a place to host business-critical tier 1 applications, like SAP, in a public cloud environment. I won’t go into deep technical detail, but Virtustream has found a way to make hosting these large, critical applications cost effective in a robust/resilient way. I believe the core message here was that Virtustream and vCloud Air are a bit like apples and oranges and that neither is going away. I do believe at some point soon we’ll be hearing about some form of consolidation between the two, so stay tuned!

Ray O’Farrell, the newly appointed CTO and longtime CDO (Chief Development Officer), was next up on stage. He started off talking about containers (Docker, Kubernetes, etc.) in a general sense. He quickly went on to show some pretty cool extensions that VMware is working on so that virtualization admins can have visibility into the container level via traditional management tools such as the vCenter Web Client. This is a bit of a blind spot currently, as the VMware management tools can drill down to the virtual machine level but not to any additional partitioning (such as containers) which may exist within virtual machines. Additionally, Ray announced Project Photon, basically a super-thin hypervisor based on the vSphere kernel which would act as a container platform within the VMware ecosystem. The platform consists of a controller, which VMware will release as open source, and a ‘machine,’ which will be proprietary to VMware as part of the Photon Platform and offered as a paid subscription service. There will also be an integrated bundle of the Pivotal Cloud Foundry platform with Photon as another subscription option. It’s apparent that VMware is really driving hard into the developer space, but it remains to be seen if workloads like big data and containers will embrace a virtual platform. I’ll post a recap of Tuesday’s general session tomorrow!

GreenPages is hosting a webinar on 9/16, “How to Increase Your IT Equity: Deploying a Build-Operate-Transform Model for IT Operations.” Learn how to create long-term value for your organization and meet the increasing demand for services. Register now!

By Chris Ward, CTO

Network Virtualization: A Key Enabler of the SDDC

In this video, Steve Mullaney, VMware’s SVP of the Networking and Security Business Unit, discusses network virtualization. Network virtualization is a key enabler of delivering a software-defined data center. According to Steve, from a customer perspective there really end up being two use cases. The first is an agility use case: increasing speed to innovation. In the past, organizations have had to maintain separate infrastructures for development/test and production. Network virtualization allows people to have one common computing infrastructure that they can logically isolate into separate networks, which easily lets them move workloads between production, dev, and test.

The second use case is security. Network virtualization allows organizations to provide additional security mechanisms within their data centers by using microsegmentation. If a company were to do this with physical firewalls and existing technology, it would be extremely expensive and close to impossible to implement operationally. Network virtualization makes it a possibility.
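To make the idea concrete, here is a minimal, vendor-neutral sketch in Python of what microsegmentation means. The rule set is hypothetical and this is not the NSX API; the point is that policy is enforced between individual workloads inside the data center, not just at the perimeter.

# (source_tier, dest_tier, dest_port) tuples that are explicitly allowed;
# everything else is denied by default.
ALLOW_RULES = {
    ("web", "app", 8443),   # web tier may call the app tier's API port
    ("app", "db", 3306),    # app tier may reach the database
}

def is_allowed(src_tier, dst_tier, dst_port):
    """Default-deny check between two workloads in the same data center."""
    return (src_tier, dst_tier, dst_port) in ALLOW_RULES

assert is_allowed("web", "app", 8443)
assert not is_allowed("web", "db", 3306)  # a compromised web VM cannot reach the DB
assert not is_allowed("web", "web", 22)   # lateral movement within a tier is blocked

Doing this with physical firewalls would mean hairpinning every VM-to-VM flow through an appliance; enforcing the check at the virtual switch layer is what makes it operationally feasible.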

You can hear more from Steve on Twitter: follow @smullaney.

Network Virtualization and the Software Defined Data Center

http://www.youtube.com/watch?v=CfiYqF9EU10

GreenPages is one of VMware’s top partners in the country and last year won its Global Virtualization of Business Critical Applications Award. Email us at socialmedia@greenpages.com to see how GreenPages can help with your VMware initiatives.

VMworld 2014 Recap: SDDC, EUC & Hybrid Cloud

By Chris Ward, CTO

Another year, another VMworld in the books. It was a great event this year with some key announcements and updates. First, some interesting stats: the top 3 strategic priorities for VMware remain unchanged (Software Defined Datacenter/Enterprise, End User Computing, and Hybrid Cloud). Some interesting numbers presented included on-premises infrastructure…

The Big Shift: From Cloud Skeptics & Magic Pills to ITaaS Nirvana

By Ron Dupler, CEO GreenPages Technology Solutions

Over the last 4-6 quarters, we have seen a significant market evolution, with our customers and the overall market moving from theorizing about cloud computing to defining strategies and plans to reap the benefits of cloud computing solutions and implement hybrid cloud models. In a short period of time, we’ve seen IT thought leaders move from debating the reality and importance of cloud computing to trying to understand how to most effectively grasp its benefits to improve organizational efficiency, velocity, and line-of-business empowerment. Today, we see the leading edge of the market aggressively rationalizing their application architectures and driving to hybrid cloud computing models.

Internally, we call this phenomenon The Big Shift. Let’s discuss what we know about The Big Shift. First, for all of the cloud skeptics reading this: it is an undeniable fact that corporate application workloads are moving from customer-owned architectures to public cloud computing platforms. RW Baird released an interesting report in Q4 of 2013 that included the following observations:

  • Corporate workloads are moving to the public cloud.
  • Much of the IT industry has been asleep at the wheel as Big Shift momentum has accelerated, in part because public cloud spending still represents a small portion of overall IT spend.
  • Traditional IT spending is growing in the low single digits (2-3% per year is a good approximation).
  • Cloud spending is growing at 40%-plus per year.
  • What we call The Big Shift is accelerating and is going to have a tremendous impact on the traditional IT industry in the coming years. For every $1.00 increase in public cloud spending, there is a corresponding $3.00-$4.00 decrease in customer-owned IT spend.

There are some other things we know about The Big Shift:

The Big Shift is disrupting old industry paradigms and governance models. We see market evidence of this in traditional IT industry powerhouses like HP and Dell struggling to adapt, reinvent themselves, and maintain relevance and dominance in the new ITaaS era. We even saw perennial powerhouse Cisco lower its 5-year growth forecast during last calendar Q4 due to the forces at play in the market. In short, the Big Shift is driving disruption throughout the entire IT supply chain. Companies tied to the traditional, customer-owned IT world are finding themselves under financial pressure and struggling to adapt. Born-in-the-cloud companies like Amazon are seeing tremendous and accelerating growth as the market embraces ITaaS.

In corporate America, the Big Shift is causing inertia as corporate IT leaders and their staffs reassess their IT strategies and strive to determine how best to execute their IT initiatives in the context of the tremendous market change going on around them. We see many clients who understand the need to drive to an ITaaS model and embrace hybrid cloud architectures but do not know how best to attack that challenge and prepare to manage in a hybrid cloud world. This lack of clarity is causing delays in decision making and stalling important IT initiatives.

Let’s discuss cloud for a bit. Cloud computing is a big topic that elicits emotional reactions. Cloud-speak is pervasive in our industry. By this point, the vast majority of your IT partners and vendors are couching their solutions as cloud, or as-a-service, offerings. Some folks in the industry are bold enough to tell you that they have the magic cloud pill that will lead you to ITaaS nirvana. Because of this, many IT professionals I speak with are sick of talking about cloud and shy away from the topic. My belief is that this avoidance is counterproductive and driven by cloud pervasiveness, a lack of precision and clarity when discussing cloud, and the change pressure the cloud revolution is imposing on all professional technologists. The age-old mandate to embrace change or die has never been more relevant. Therefore, we feel it is imperative to tackle the cloud discussion head on.

Download our free whitepaper, “Cloud Management, Now”

Let me take a stab at clarifying the cloud discussion. Figure 1 below represents the Big Shift. As noted above, it is undeniable that workloads are shifting from private, customer-owned IT architectures to public, customer-rented platforms, i.e. the public cloud. We see three vectors of change in the industry that are defining the cloud revolution.

Cloud Change Vectors

The first vector is the modernization of legacy, customer-owned architectures. The dominant theme here over the past 5-7 years has been the virtualization of the compute layer, and the dominant player during this wave of transformation has been VMware. The first wave of virtualization has slowed in the past 4-6 quarters as the compute virtualization market has matured and the vast majority of x86 workloads have been virtualized. A second wave is just forming that will be every bit as powerful and important as the first. This wave is represented by new, advanced forms of virtualization and the continued abstraction of more complex components of traditional IT infrastructure: networking, storage, and ultimately entire datacenters as we move to a world of the software-defined datacenter (SDDC) in the coming years.

The second vector of change in the cloud era involves deploying automation, orchestration, and service catalogues to enable private cloud computing environments for internal users and lines of business. Private cloud environments are the industry’s and corporate IT’s reaction to the public cloud providers’ ability to provide faster, cheaper, better service levels to corporate end users and lines of business. In short, the private cloud change vector is driven by the fact that internal IT now has competition. Their end users and lines of business, development teams in particular, have new service-level expectations based on their consumer experiences and their ability to get fast, cheap, commodity compute from the likes of Amazon. To compete, corporate IT staffs must enable self-service functionality for their lines of business and development teams by deploying advanced management tools that provide automation, orchestration, and service catalogue functionality.
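As a rough illustration of the self-service idea (the catalogue items and function below are made up, not any particular product’s API), a service catalogue reduces to publishing pre-approved blueprints that users can request without filing a ticket, with an orchestration engine doing the provisioning behind the scenes:

CATALOGUE = {
    "dev-web-vm": {"cpus": 2, "ram_gb": 4,  "lease_days": 30},
    "test-db-vm": {"cpus": 4, "ram_gb": 16, "lease_days": 14},
}

def request_item(user, item):
    """A line-of-business user picks a catalogue item; no ticket required."""
    blueprint = CATALOGUE[item]  # only pre-approved shapes are requestable
    deployment = dict(blueprint, owner=user)
    # ...here an orchestration engine would provision the VM, network, storage, etc.
    print("provisioning %s for %s: %s" % (item, user, deployment))
    return deployment

request_item("dev-team-alice", "dev-web-vm")

The governance lives in the catalogue itself: IT controls which shapes exist and how long they lease, while the requester gets the Amazon-like immediacy described above.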

The third vector of change in the cloud era involves tying the inevitable blend of private, customer-owned architectures together with the public cloud platforms in use today at most companies. The result is a true hybrid cloud architectural model that can be managed, preserving the still-valid command-and-control mandates of traditional corporate IT and balancing those mandates with the end-user empowerment and velocity expected in today’s cloud world.

In the context of these three change vectors we see several approaches within our customer base. We see some customers taking a “boil the ocean” approach and striving to rationalize their entire application portfolios to determine best execution venues and define a path to a true hybrid cloud architecture. We see other customers taking a much more cautious approach and leveraging cloud-based point solutions like desktop and disaster recovery as-a-service to solve old business problems in new ways. Both approaches are valid and depend on use cases, budgets, and philosophical approach (aggressive, leading-edge versus conservative, follow-the-market thinking).

GreenPages’ business strategy in the context of the ITaaS and cloud revolution is simple: we have built an organization that has the people, process, and technologies to provide expert strategic guidance and proven cloud-era solutions for our clients through a historical inflection point in the way information technology is delivered to corporate end users and lines of business. Our Cloud Management as a Service (CMaaS) offering provides a technology platform that helps customers integrate the disparate management tools deployed in their environments and federate alerts through an enterprise command center approach that gives a singular view into physical, virtual, and public cloud workloads. CMaaS also provides cloud service brokerage and governance capabilities, allowing our customers to view price-performance analytics across private and public cloud environments, design service models and view the related bills of material, and view and consolidate billing across multiple public cloud providers. What are your thoughts on the Big Shift? How is your organization addressing the changes in the IT landscape?

The 2013 Tech Industry – A Year in Review

By Chris Ward, CTO, LogicsOne

As 2013 comes to a close and we begin to look forward to what 2014 will bring, I wanted to take a few minutes to reflect on the past year. We’ve been talking a lot about that evil word ‘cloud’ for the past 3 to 4 years, but this year put a couple of other terms up in lights, including Software Defined X (Datacenter, Networking, Storage, etc.) and Big Data. Like ‘cloud,’ these two newer terms can easily mean different things to different people, but put in simple terms, in my opinion, there are some generic definitions which apply in almost all cases. Software Defined X is essentially the concept of taking any ties to specific vendor hardware out of the equation and providing a central, vendor-agnostic point for configuration, except of course for the vendor providing the Software Defined solution. :) I define Big Data simply as the ability to find a very specific and small needle of data in an incredibly large haystack within a reasonably short amount of time. I see both of these technologies becoming more widely adopted in short order, with Big Data technologies already well on the way.
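Taking the needle-in-a-haystack definition literally, here is a toy Python sketch of the core pattern behind most Big Data engines: split the haystack into chunks, scan the chunks in parallel, and combine the few matches. The data is generated in memory so the example actually runs; a real system applies the same pattern across many machines and disks.

from concurrent.futures import ProcessPoolExecutor

NEEDLE = "ERROR code=1337"

def scan_chunk(chunk):
    # "Map" step: each worker scans only its own slice of the haystack.
    return [line for line in chunk if NEEDLE in line]

def make_haystack(n_lines):
    lines = ["INFO all quiet on line %d" % i for i in range(n_lines)]
    lines[n_lines // 2] = "ERROR code=1337 on line %d" % (n_lines // 2)  # the needle
    return lines

if __name__ == "__main__":
    haystack = make_haystack(1_000_000)
    chunks = [haystack[i::8] for i in range(8)]  # eight slices, one per worker
    with ProcessPoolExecutor(max_workers=8) as pool:
        matches = [m for part in pool.map(scan_chunk, chunks) for m in part]
    print(matches)  # the one needle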

As for our friend ‘the cloud,’ 2013 did see a good amount of growth in consumption of cloud services, specifically in the areas of Software as a Service (SaaS) and Infrastructure as a Service (IaaS). IT has adopted a ‘virtualization first’ strategy over the past 3 to 4 years when it comes to bringing any new workloads into the datacenter. I anticipate we’ll begin to see a ‘SaaS first’ approach being adopted in short order if it isn’t out there already. However, I can’t necessarily say the same as far as ‘IaaS first’ goes. While IaaS is a great solution for elastic computing, I still see most usage confined to application development or super-large scale-out application (e.g., Netflix) use cases. The mass adoption of IaaS for simply forklifting existing workloads out of the private datacenter and into the public cloud simply hasn’t happened. Why? My opinion is that for traditional applications, neither the cost model nor the operational model makes sense, yet.

In relation to ‘cloud,’ I did see a lot of adoption of advanced automation, orchestration, and management tools and thus an uptick in ‘private clouds.’ There are some fantastic tools now available both commercially and open source, and I absolutely expect this adoption trend to continue, especially in the Enterprise space. Datacenters, which have a vast amount of change occurring whether in production or test/dev, can greatly benefit from these solutions. However, this comes with a word of caution: just because you can doesn’t mean you should. I say this because I have seen several instances where customers have wanted to automate literally everything in their environments. While that may sound good on the surface, I don’t believe it’s always the right thing to do. There are still times when a human touch remains the best way to go.

As always, there were some big-time announcements from major players in the industry. Here are some posts we did with news and updates summaries from VMworld, VMware Partner Exchange, EMC World, Cisco Live and Citrix Synergy. Here’s an additional video from September where Lou Rossi, our VP, Technical Services, explains some new Cisco product announcements. We also hosted a webinar (which you can download here) about VMware’s Horizon Suite as well as a webinar on our own Cloud Management as a Service offering.

The past few years have seen various predictions relating to the unsustainability of Moore’s Law, which states that processors will double in computing power every 18-24 months, and 2013 was no exception. The latest prediction is that by 2020 we’ll reach the 7nm mark and Moore’s Law will no longer hold as an exponential function. The interesting part is that this prediction is not based on technical limitations but rather economic ones, in that getting below that 7nm mark will be extremely expensive from a manufacturing perspective, and, hey, 640K of RAM is all anyone will ever need, right? :)
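The arithmetic behind that horizon is easy to check: with a doubling every 18 to 24 months, roughly four to five doublings fit between 2013 and 2020, which is why 7nm reads as the end of the runway. A quick illustrative calculation:

# Rough check of the Moore's Law arithmetic in the paragraph above.
years = 2020 - 2013  # horizon of the 7nm prediction
for months_per_doubling in (18, 24):
    doublings = years * 12.0 / months_per_doubling
    print("every %d months: %.1f doublings -> ~%.0fx density" %
          (months_per_doubling, doublings, 2 ** doublings))
# every 18 months: 4.7 doublings -> ~25x density
# every 24 months: 3.5 doublings -> ~11x density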

Probably the biggest news of 2013 was the revelation that the National Security Agency (NSA) had undertaken a massive program and seemed to be capturing every packet of data coming in or out of the US across the Internet. I won’t get into any political discussion here, but suffice it to say this is probably the largest example of ‘big data’ that currently exists. This also has large potential ramifications for public cloud adoption, as security and data integrity have been two of the major roadblocks to adoption, so it certainly doesn’t help that customers may now be concerned about the NSA eavesdropping on everything going on within public datacenters. It is estimated that public cloud providers may lose as much as $22-35B over the next 3 years as a result of customers slowing adoption due to this. The only good news, at least for now, is that it’s very doubtful the NSA or anyone else on the planet has the means to actually mine anywhere close to 100% of the data they are capturing. However, like anything else, it’s probably only a matter of time.

What do you think the biggest news/advancements of 2013 were?  I would be interested in your thoughts as well.

Register for our upcoming webinar on December 19th to learn how you can free up your IT team to work on more strategic projects (while cutting costs!).