Tag archive: cloud

5 Ways to Understand Your Applications and IT Services

How do you view your organization’s applications and IT services? At GreenPages, we often suggest that organizations begin to conceptualize IT services as corporate IT evolves from a technology provider into an innovation center. There are ways to establish and maintain a service portfolio through ITBM (IT Business Management, or IT Financial Management) systems, but these are often out of reach for customers below the enterprise level. However, you can conceptualize IT services by looking at your applications from five different perspectives. Let’s use Microsoft Exchange as an example.

Exchange is an enterprise application that provides email and calendaring. If you’re reading this, there is a good chance that you own servers that host the various components that comprise Exchange. One way to think about cloud is to identify the Exchange servers, their operating systems, the application version, performance requirements, etc., and identify a “place in the cloud” where you can procure servers of similar specifications and migrate the instances. I consider this the infrastructure perspective. When it comes to cloud computing, it is perhaps the least important.

 

To take full advantage of cloud computing, understand your applications and IT services from a few additional perspectives:

  1. Functional
  2. Financial
  3. Operational (including lifecycle)
  4. Organizational
  5. Use-case

 

Hopefully, after looking at these different perspectives, you’ll see Exchange as part of an IT service that fits this description:

“In operation for over 20 years, E-Communications is a business service that allows each of our 1,200 employees to communicate through email, coordinate meetings, find coworkers’ contact information, and organize tasks using their PC, Mac, mobile device, or home computer 24x7x365. The service is supported by Microsoft Exchange and Active Directory, which both run under VMware vSphere. The service requires 1 full-time administrator who added 12 new users and logged 157 support tickets in 2014. In 2014, charges for software maintenance, personnel, infrastructure depreciation, and outside support services totaled $87,456. A software upgrade is planned for 2015. Users do not generally complain about the performance of the service, other than the size of their mailbox quotas (which are limited to 10GB per user). The company as a whole plans to offer telecommuting packages to more than 250 employees in 2015.”

Armed with this understanding of your IT service that includes Exchange, you might take the following action:

  1. Fund an Office 365 migration with capital you had allocated for the Exchange upgrade project
  2. Provide copies of Office applications to telecommuters (without additional charge)
  3. Expand the mailbox quota from 10GB to 50GB
  4. Repurpose your Exchange admin to help telecommuters establish their home offices in 2015
  5. Reduce your spend on E-Communications by more than 50% (from $72.88/user to $35.00/user)
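The per-user figures above follow directly from the service description. A minimal sketch of the arithmetic (the $35.00/user target is the assumed Office 365 price point from the list above):

```python
# Sanity-check the E-Communications per-user cost from the service description:
# 2014 charges for software maintenance, personnel, infrastructure depreciation,
# and outside support services, divided across all employees.

ANNUAL_COST = 87_456   # total 2014 spend on the service
EMPLOYEES = 1_200      # employees served

cost_per_user = ANNUAL_COST / EMPLOYEES
print(f"Current cost per user: ${cost_per_user:.2f}")  # $72.88

target_per_user = 35.00
savings = 1 - target_per_user / cost_per_user
print(f"Savings at ${target_per_user:.2f}/user: {savings:.0%}")
```

Run it and the "more than 50%" reduction checks out: $72.88 down to $35.00 is roughly a 52% cut.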

 

Of course, not every application is easily identifiable as belonging to an IT service. The functionality or financial aspects of IT services are often difficult to quantify. However, at GreenPages, especially when looking at cloud computing options, we recommend examining all of your applications through these five perspectives. For this reason, GreenPages has embedded this process in a piece of software that can quickly build your services portfolio and recommend optimizations based on current offerings available – such as Microsoft Office 365.

What are your thoughts?

You can hear more from John in his eBook, “The Evolution of Your Corporate IT Department.”

By John Dixon, Director of Cloud Services

CIO Focus Interview: David Chou

This is the fourth installment of our CIO Focus Interview series. This time, I spoke with David Chou, the CIO of a large academic medical center. A recognized thought leader, David is on the Huffington Post’s 2015 list of the top 100 most social CIOs on Twitter, and I would definitely recommend following him. Enjoy!

 

Ben: Could you give us some background on your IT experience?

David: I was fortunate to be put on the IT fast track. I was your typical college student getting a BA in Computer Science, and somehow I landed an analyst job at a small community hospital in LA. This allowed me to get the opportunity to really understand the health care industry from an operational standpoint. From there, I focused on understanding operations and then finding the right technologies to fit in. I took the opposite approach than most IT professionals do. I dug deep into the operations model and then figured out which technologies worked well and matched them. That approach led to me getting exposure up the food chain that opened some doors for me. One thing I realized when talking to my counterparts who are successful is that you have to grasp opportunities, even if it means disrupting other aspects of your life.

 

Ben: What is your job like now?

David: Currently, I work at a large academic medical center. In bigger medical centers, there are typically CIOs across all three verticals – healthcare, research, and higher education. Oftentimes, this causes tension and barriers in terms of adoption. In my position, I have control over all three, which is a pretty unique model to have. In addition, we are a public center which also makes us unique in how we operate.

 

Ben: What are your main responsibilities?

David: Today, I manage day-to-day operations and an $82 million budget. Early in my career, the CIO role was operational: transactional data entry, maintaining mainframes, etc. Now it’s a lot more strategic. Technology should be at the core of every organization, and the CIO has to be involved strategically. This means being a part of the executive team and having a seat at the table.

{Follow David on Twitter @dchou1107}

Ben: What areas of IT do you think are having the biggest impact on the industry?

David: Right now the focus is on the “4 pillars” of cloud, mobile, social and big data. Any executive that doesn’t have that vision is not going to be well off in the future. These are extremely important and strategic to me. I am trying to get the organization to adopt the cloud. Organizational culture plays a big role in this. Cloud can be an uncomfortable topic so that’s a barrier. I’m challenging that traditional mindset.

Mobile is also very big for us. Consumers in healthcare want to have personalized medicine. They want to shop for healthcare the same way they shop on Amazon. That’s where I believe healthcare is moving towards – a retail model. Whoever successfully pulls that off first is going to cause a huge disruption. We’re all trying to figure out how to utilize it. We want to be able to predict outcomes and provide the best customer experience possible.

I really believe in the importance of social media and the value of capturing consumer engagement and behavior. In my vertical, it has not been widely adopted yet. The big focus has been on cloud, mobile and big data.

 

Ben: How are you incorporating those technologies in your organization?

David: We’re in the process of incorporating a hybrid cloud model in our environment. From a budgetary and contractual perspective we’re all ready to go; we’re just getting the organization’s terms and conditions aligned with the cloud providers. It’s a challenge for us to get public cloud providers to agree to our terms and conditions.

Our Electronic Medical Record system went live a year ago. Four years ago we had disparate systems that took a lot of manual upkeep. The first step to remedying this was moving from manual to digital. Now that we have that new format, we can take a controlled approach. We’ll look into some consumer-friendly products that allow users to access data and have self-service and provisioning capabilities. After this has been implemented for a year, my goal is to take another look. We’ll have what we need to solve 80% of problems, so the question will be whether that extra 20% is worth a full-blown BI platform for analytics.

 

Ben: What advice do you have for other CIOs starting out in the healthcare industry?

David: Take the time to build that relationship with the business. Learn the terms and lingo. Talking tech won’t work with most business executives so you need to adapt. Ultimately, you need to focus on understanding the needs of the customer and solving those needs.

 

Are you interested in winning a GoPro? Subscribe to our blog by 2/12/2015 for your chance to win!

 

By Ben Stephenson, Emerging Media Specialist

5 Tips to be Prepared to Answer Cloud Questions from the C-Suite

Okay, so here we are in 2015, in this new age of cloud. What should IT professionals do to be ready to answer cloud questions and to migrate? It’s not a matter of if the CIO or CEO asks the question; it’s a matter of when. We, as IT worker bees, are often not privy to the conversations between the uber-competitive CEOs of the world. They wouldn’t be CEOs if they weren’t competitive, Type A individuals. So the question becomes: how do I keep up with the Joneses, a.k.a. my competitors in my market space?


Here are 5 recommendations that should help prepare the IT Director for this request from up on high.

Update your server and application stack

You probably should have run all your updates at year’s end, but you may have been too busy. Now is the time. Update servers, desktops, router firmware, and mobile devices. This is one of the most time-consuming, overlooked, and problem-prone tasks you can undertake (especially when a server doesn’t come back up after a reboot). Do it now and do it right, and you’ll start the year way ahead of the game.

Educate yourself          

Now is the time to read what the market analysts say. Read what the vendors are saying. See what Gartner has to say about top-of-mind solutions, like Microsoft Azure. Don’t wait until the CEO asks, “Hey, what is our cloud strategy?” to run back to your desk and start training. Rollout is for usage, not for running up the learning curve. Also, proactively educate your staff, your users, and your management. In the end, you will be glad you did.

Clarify your cloud strategy

I’m not talking about a hair rinse you should use once a week. This is about clarifying your intentions around cloud adoption for the year with upper management. Get out ahead of this and make sure they know what to expect; in turn, you will foster conversation that gives you insight into what they expect. Starting 2015 without clear expectations on both sides leads to confusion, and eventually a year goes by with nothing accomplished.

Timeline your cloud migration

Take the calendar and break it into milestones. For example, by the end of Q1 you want any hardware or software issues resolved per the Office 365 readiness toolkit results. By the end of Q2 you want a pilot functioning for testing Office 365 or Azure. “Fail to plan, plan to fail,” as my friend and peer Randy Becker says.

Purge the spam

Many technology consumers feel that, much like at home, they should keep any and every e-record at work. That obviously leads to bloat in Exchange databases, file server solutions, and other places. Backups become uncontrollable, and when you finally need to migrate, that one little piece of corrupted spam will stop a mailbox from migrating. When you purge, do it responsibly (i.e., empty the Deleted Items and Sent Items folders), then run the Exchange maintenance tasks and compact the database. The beginning of the year is also a great time to get rid of hardware you no longer need; larger cities usually have computer recycling services that can safely dispose of your old technology. Use them. By tossing out the junk, you’ll make for a much more efficient start to 2015.

So this should help you, the IT director, find a path to be ready to answer the “Cloud Ready” question. To reiterate, it is not a matter of if that question is coming, it is a matter of when. Good Luck and May the Cloud be with You.

If you’re interested in speaking more with David about how you can better prepare yourself for your organization’s transition to the cloud, click here.

 

By David Barter, Practice Manager – Microsoft Technologies

2015 Predictions: Cloud and Software-Defined Technologies

As we kick off the new year, it’s time for us to get our 2015 predictions in. Today, I’ll post predictions from John Dixon around the future of cloud computing as well as from our CTO Chris Ward about software-defined technologies. Later this week, we’ll get some more predictions around security, wireless, end-user computing, and more from some of our other experts.

John Dixon, Director, Cloud Services
On the Internet of Things (IoT) and Experimentation…
In 2015, I expect to see more connected devices and discussion on IoT strategy. I think this is where cloud computing gets really interesting. The accessibility of compute and storage resources on the pay-as-you-go model supports experimentation with a variety of applications and devices. Will consumers want a connected toaster? In years past, companies might form focus groups, do some market research, etc., to pitch the idea to management, get funding, build a team, acquire equipment, then figure out the details of how to do this. Now, it’s entirely possible to assign one individual to experiment and prototype the connected toaster and associated cloud applications. Here’s the thing: the market probably has about zero interest in a connected toaster. However, the experiment might have produced a pattern of a cloud-based application that authenticates and consumes data from a device with little or no compute power. And this pattern is perhaps useful for other products that DO have real applications. In fact, I put together a similar experiment last week with a $50 Raspberry Pi and about $10 of compute from AWS — the application reports on the temperature of my home-brew fermentation containers, and activates a heating source when needed. And, I did indeed discover that the pattern is really, really scalable and useful in general. Give me a call if you want to hear about the details!
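The fermentation experiment can be sketched as a simple control loop with hysteresis, so the heater isn’t rapidly switched on and off around the setpoint. This is not John’s actual code: the setpoint, band, and readings are illustrative assumptions, and a real loop would read a GPIO sensor and publish each reading to a cloud endpoint rather than iterate over a list.

```python
# Hedged sketch of a fermentation temperature controller with hysteresis.
# The target temperature and band are assumptions, not values from the post.

TARGET_F = 68.0   # assumed ale fermentation setpoint (degrees Fahrenheit)
BAND_F = 1.0      # hysteresis band: prevents rapid heater cycling

def control_step(temp_f: float, heater_on: bool) -> bool:
    """Return the new heater state for the current temperature reading."""
    if temp_f < TARGET_F - BAND_F:
        return True                # too cold: turn the heat source on
    if temp_f > TARGET_F + BAND_F:
        return False               # warm enough: turn the heat source off
    return heater_on               # inside the band: hold the current state

# Example pass over a series of readings (stand-ins for sensor polls).
readings = [65.2, 66.8, 67.5, 69.3, 68.4]
state = False
for r in readings:
    state = control_step(r, state)
    print(f"{r:.1f}F -> heater {'ON' if state else 'OFF'}")
```

The hysteresis band is the design choice worth noting: without it, readings that hover near the setpoint would toggle the relay on every poll.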

On the declining interest in “raw” IaaS and the “cloud as a destination” perspective…

I’ve changed my opinion on this over the past year or so. I had thought that the declining price of commodity compute, network, and storage in the cloud meant that organizations would eventually prefer to “forklift move” their infrastructure to a cloud provider. To prepare for this, organizations should design their infrastructure with portability in mind and NOT make use of proprietary features of certain cloud providers (like AWS). As of the end of 2014, I’m thinking differently: DO consider the tradeoff between portability and optimization, but go with optimization. Optimization is more important than infrastructure portability. By optimization in AWS terms, I mean taking advantage of things like Auto Scaling, CloudWatch, S3, SQS, SNS, CloudFront, etc. Pivotal and Cloud Foundry offer similar optimizations. Siding with optimization enables reliability, performance, fault tolerance, and scalability that are not possible in a customer-owned datacenter. I think we’ll see more of this “how do I optimize for the cloud?” discussion in 2015.

2015 predictions

Chris & John presenting a breakout session at our 2014 Summit Event

Chris Ward, CTO

On SDN…

We’ll see much greater adoption of SDN solutions in 2015. We already saw good adoption of VMware’s NSX solution in the second half of 2014 around the micro-segmentation use case. I see that expanding in 2015, plus broader use cases with both NSX and Cisco’s ACI. The expansion of SDN will drag with it an expansion of automation/orchestration adoption, as these technologies are required to fully realize the benefits of broader SDN use cases.

On SDS…

Software-defined storage solutions will become more mainstream by the end of 2015. We’re already seeing a ton of new and interesting SDS solutions in the market, and I see 2015 being a year of maturity. We’ll see several of these solutions drop off the radar while others gain traction, and I have no doubt it will be a very active M&A year in the storage space in general.

 

What do you think about Chris and John’s predictions?

If you would like to hear more from these guys, you can download Chris’ whitepaper on data center migrations and John’s eBook around the evolution of cloud.

 

By Ben Stephenson, Emerging Media Specialist

Our Top 10 Blog Posts of 2014

With the year officially coming to an end, I decided to pick our top 10 blog posts of 2014 (in no particular order)…

 

  1. The Big Shift: From Cloud Skeptics & Magic Pulls to ITaaS Nirvana – In this post, GreenPages’ CEO Ron Dupler covers the shift in the industry that has disrupted old paradigms and driven users to embrace hybrid cloud architectures.
  2. How Software Defined Networking is Enabling the Hybrid Cloud – Networking expert Nick Phelps discusses how software defined networking is enabling the hybrid cloud & creating the networks of tomorrow.
  3. Have You Met My Friend, Cloud Sprawl? – John Dixon explains cloud sprawl and provides advice for IT departments on how to deal with it.
  4. A Guide to Successful Big Data Adoption – In this video, storage expert Randy Weis talks about the impact big data is having on organizations and provides an outline for the correct approach companies should be taking in regards to big data analytics.
  5. Key Announcements from Citrix Synergy 2014 Part 1 and Part 2 – In this 2 part blog series, Randy Becker summarizes the key announcements from the Citrix Synergy event in Anaheim and the impact these changes will have on the industry.
  6. Don’t Be a Michael Scott – Embrace Change in IT – Limitless paper in a paperless world
  7. Managing Resources in the Cloud: How to Control Shadow IT & Enable Business Agility – In this video, our CTO Chris Ward discusses the importance of gaining visibility into Shadow IT and how IT Departments need to offer the same agility to its users that public cloud offerings like Amazon provide.
  8. CIO/CTO Interview Series: Stuart Appley, Rick Blaisdell, Gunnar Berger – This year, we started a CIO/CTO interview series on the blog to get the opinions and insights of some of the top thought leaders out there. Above are the first three of the series.
  9. VDI: You Don’t Need to Take an All-or-Nothing Approach – In this video, Francis Czekalski discusses the benefits of not taking an all-or-nothing approach with VDI.
  10. Network Virtualization: A Key Enabler of the SDDC – This is actually a guest video with the SVP of VMware’s Networking and Security Business Unit.

 

Were there any posts you think should have been included on the list that weren’t?

 

By Ben Stephenson, Emerging Media Specialist

 

CTO Focus Interview: Gunnar Berger, Citrix

In the third installment of our CTO Focus Interview series, I got to speak with Gunnar Berger, CTO at Citrix (view Part I and Part II of the series). Gunnar is a well-respected thought leader who previously worked as an analyst at Gartner and joined Citrix last June. He is on a mission to make VDI easier and cheaper to deploy. I’d highly recommend following Gunnar on Twitter to hear more from him.

 

Ben: What are your primary responsibilities at Citrix?

Gunnar: A lot of what I do at Citrix is on the back end and not necessarily public facing. In the public view, it’s more of looking at a long term strategy. Most roadmaps are looking ahead 12-18 months. I can be influential in these plans, but I am really looking at the longer term strategy. Where are we going to be in 3-5 years? How do we actually get to that place? How do you take today’s roadmap and drive it towards that 5 year plan? One of the main reasons I took the job at Citrix is because I want to fix VDI. I think it costs too much and is too complex. I think we truly can change VDI at Citrix.

 

Ben: What are some of the challenges you face as a CTO?

Gunnar: One of the main challenges when looking at long term strategies is that things can happen in the short term that can impact those long term plans. That’s every CTO’s challenge regardless of industry. In this particular industry, things change every single day. Every couple of months there is a major merger or acquisition. You have to be nimble and quick and be ready to make adjustments on the fly. My background at Gartner is very relevant here.  I have to make sure I understand where the customer is now and where they will be 3-5 years from now.

If you look at the history of Citrix, look back 5 years and you see they made an incorrect prediction on VDI. You can create a long term strategy and have it not work out. If you aren’t correct with your long term strategy, it’s important to capture that early on and pivot.

 

Ben: What goals do you have for 2015?

Gunnar: I have three main goals heading into 2015. The first is doubling down on applications. The second is to review the complexity and costs of VDI. The third is to “bridge to the cloud.”

1. Double down on applications

Citrix over-rotated on VDI, but now the pendulum is moving back. VDI has a place, but so does RDS. We are doubling down so that XenApp is not a second-class citizen to XenDesktop. Apps are what users want, and XenApp is our tried-and-true solution for pushing these apps out to users on any device.

2. Review complexity and cost of VDI

My overall goal is to make VDI easier and cheaper to deploy. This plays into a long-term strategy. Let’s face it: VDI deployments take a lot of time and money. I can’t remember where I heard this stat, but for every dollar of a VDI sale I need to sell $12 in everything else. For a customer to buy one thing, they need to buy $12 of something else… not an ideal situation for the customer.

We need to solve that issue to make it less costly. I’m unapologetically a fan of VDI. I think it’s an extremely powerful technology that has a lot of great benefits, but it is currently costly and complex. Luckily, in my position I get to work with a lot of really smart people who can solve this, so I’m confident that Citrix will make VDI what I have always wanted it to be.

3. Bridge to the cloud

This is where Citrix Workspace Services comes into play. You will start seeing more and more of this from Citrix over the next several months. Essentially this is the unification of all of our different products (i.e. XenDesktop, XenApp, XenMobile, NetScaler, etc.). We will be “SaaS-ifying” our entire stack, which is a massive undertaking. We really want to improve the admin experience by creating a single administrative interface for users of all different product suites.

The goal is to provide the same benefits to an enterprise that an end user receives from products like the Chromebook – automatically getting the latest version so you never have to update manually. We want to get to the point where, no matter what, customers are always operating on the most recent versions. This obviously benefits the customer, as they are getting the latest things instantly.

Citrix isn’t going to try to become a cloud provider. To do that you need billions of dollars. We’re building a bridge to enable users to move seamlessly from on-prem to off-prem. You want to be on Azure or Amazon? We can do that.

The idea is that this becomes the middle ground between you and those cloud providers. What I like about being the intermediary is being able to dial up and back between clouds seamlessly to allow customers to stand things up and test them in minutes instead of days.

 

Ben: Citrix has made heavy investments in mobility. Where do you see mobility in the next 3-5 years?

Gunnar: Honestly, I want to stop talking about mobility like it’s something special. Everything we are doing these days is mobile. Mobile Device Management? Mobile Application Management? We need to drop the “mobile” from this story. It’s device management. It’s application management. As far as where mobility fits in with Citrix – it’s inherent to the big picture, much like the necessity to breathe. I say this to paint a picture because it’s in our DNA. This is what Citrix has done for the last 25 years. In today’s world of smartphones and tablets, we take apps and make them run elsewhere, just like we have always done.

 

Ben: Throughout your career, what concept or technology would you say has had the most drastic impact on IT?

Gunnar: Hands down virtualization. Virtualization is the root of where cloud started. Cloud is the most disruptive technology moving forward, and it all started with the hypervisor.

 

Are you a CIO/CTO interested in participating in our Focus Interview series? Email me at bstephenson@greenpages.com

By Ben Stephenson, Emerging Media Specialist

 

Gartner Data Center Conference: Success in the Cloud & Software Defined Technologies

I just returned from the Gartner Data Center conference in Vegas and wanted to convey some of the highlights of the event. This was my first time attending a Gartner conference, and I found it pretty refreshing, as they take an agnostic approach to all of their sessions, unlike a typical vendor-sponsored event like VMworld, EMC World, or Cisco Live. Most of the sessions I attended were around cloud and software-defined technologies. Below, I’ll bullet out what I consider to be highlights from a few of the sessions.

Building Successful Private/Hybrid Clouds –

 

  • Gartner sees the majority of private cloud deployments as unsuccessful. Here are some common reasons why:
    • Focusing on the wrong benefits. It’s not all about cost in $$. In cloud, true ROI is measured in agility vs dollars and cents
    • Doing too little. A virtualized environment does not equal a private cloud. You must have automation, self-service, monitoring/management, and metering in place at a minimum.
    • Doing too much. Putting applications/workloads in the private cloud that don’t make sense to live there. Not everything is a fit nor can take full advantage of what cloud offers.
    • Failure to change operational models. It’s like being trained to drive an 18 wheeler then getting behind the wheel of a Ferrari and wondering why you ran into that tree.
    • Failure to change the funding model. You must, at a minimum, have a showback mechanism so the business will understand the costs; otherwise they’ll just throw the kitchen sink into the cloud.
    • Using the wrong technologies. Make sure you understand the requirements of your cloud and choose the proper vendors/technologies. Incumbents may not necessarily be the right choice in all situations.
  • Three common use cases for building out a private cloud include outsourcing commodity functions, renovating infrastructure and operations, and innovation/experimentation…but you have to have a good understanding of each of these to be successful (see above).
  • There is a big difference between doing cloud to drive bottom line (cost) savings vs top line (innovation) revenue expansion. Know ‘why’ you are doing cloud!
  • On the hybrid front, it is very rare today to see fully automated environments that span private and public as the technology still has some catching up to do. That said, it will be reality within 24 months without a doubt.
  • In most situations, only 20-50% of all applications/workloads will (or should) live in the cloud infrastructure (private or public) with the remaining living in traditional frameworks. Again, not everything can benefit from the goodness that cloud can bring.

Open Source Management Tools (Free or Flee) –

 

  • Organizations with fewer than 2500 employees typically look at open source tools to save on cost while larger organizations are interested in competitive advantage and improved security.
  • Largest adoption is in the areas of monitoring and server configuration while cloud management platforms (i.e. openstack), networking (i.e. open daylight), and containers (i.e. docker) are gaining momentum.
  • When considering one of these tools, very important to look at how active the community is to ensure relevancy of the tool
  • Where is open source being used in the enterprise today? Almost half (46%) of deployments are departmental while only about 12% of deployments are considered strategic to the overall organization.
  • Best slide I saw at the event which pretty much sums up open source….

 

Gartner Data Center Conference

 

If this makes you excited, then maybe open source is for you.  If not, then perhaps you should run away!

3 Questions to Ask Your SDN Vendor –

  • First, a statistic: organizations that fail to properly integrate their virtualization and networking teams will see a 3x longer MTTR (mean time to resolution) for issues vs. those that do integrate the teams.
  • There are approximately 500 true production SDN deployments in the world today
  • The questions to ask…
    • How to prevent network congestion caused by dynamic workload placement
    • How to connect to bare metal (non-virtualized) servers
    • How to integrate management and visibility between the underlay/overlay
  • There are numerous vendors in this space; it’s not just VMware and Cisco.
  • Like private cloud, you really have to do SDN for the right reasons to be successful.
  • Last year at this conference, there were 0 attendees who indicated they had investigated or deployed SDN. This year, 14% of attendees responded positively.

 

If you’re interested in a deeper discussion around what I heard at the conference, let me know and I’ll be happy to continue to dialogue.

 

By Chris Ward, CTO. Follow Chris on Twitter @ChrisWardTech. You can also download his latest whitepaper on data center transformation.

 

 

Fun Facts about Microsoft Azure

Looking for some helpful facts about Microsoft Azure? For those out there who may be confused about the Microsoft Azure solutions offered to date, here is the first in a series of posts about the cool new features of the Microsoft premium cloud offering, Azure.

Azure Backup, ok… wait, what? I need to do backup in the cloud? No one told me that!


Yes, Virginia, you need to have a backup solution in the cloud. To keep this high level, I’ve attempted to outline below what the Azure Backup offering really is. There are several protections built into the Azure platform that help customers protect their data, as well as options to recover from a failure.

In a normal, on-premises scenario, host-based hardware and networking failures are protected at the hypervisor level. In Azure you do not see this, because control of the hypervisor has been removed. Azure, however, is designed to be highly available, meeting and exceeding the posted SLAs associated with the service.

Hardware failures of storage are also protected against within Azure. At the lowest end you have Locally Redundant Storage, where Azure maintains 3 copies of your data within a region. The more common and industry-preferred method is Geo-Redundant Storage, which keeps 3 copies in your region and 3 additional copies in another datacenter, somewhere geographically dispersed based on a complex algorithm. The above protections help to ensure survivability of your workloads.

Important to note: The copies in the second datacenter are crash consistent copies so it should not be considered a backup of the data but more of a recovery mechanism for a disaster.

Did I hear you just ask about Recovery Services in Azure? Why yes, we have two to talk about today.

  • Azure Backup
  • Azure Site Recovery

Azure Site Recovery – This service both orchestrates site recovery and provides a destination for virtual machines. Microsoft currently supports Hyper-V to Azure, Hyper-V to Hyper-V, or VMware to VMware recovery scenarios with this method.

Azure Backup is a destination for your backups. Microsoft offers traditional agents for Windows Backup as well as the preferred platform, Microsoft System Center 2012 – Data Protection Manager. Keeping the data in the cloud, Azure holds up to 120 copies of the data, which can be restored as needed. At this time, the Azure backup agent for Windows only protects files; it will not do full-system or bare-metal backups of Azure VMs.

As of this blog post, to get a traditional full-system backup there is a recommended two-step process: use Windows Backup, which can capture a System State backup, and then enable Azure Backup to capture that backup into your Azure Backup Vault.

There are two other methods that exist, but the jury is currently out on the validity of these offerings:

  • VM Capture – equivalent to a VM snapshot
  • Blob Snapshot – equivalent to a LUN snapshot

As I said, these are options, but many consider them too immature at this time, and they are not widely adopted. Hopefully this provides some clarity around Azure. As with all things Microsoft Cloud related, Microsoft issues new features almost daily now, so check back again for more updates on what Azure can do for your organization!

 

By David Barter, Practice Manager, Microsoft Technologies

What To Move To the Cloud: A More Mature Model for SMBs

Many SMBs struggle with deciding whether, and what, to move to the cloud. Whether it’s security concerns, cost, or lack of expertise, it’s oftentimes difficult to map out the best possible solution. Here are 8 applications and services to consider when your organization is looking to move to the Cloud and reduce its server footprint.

 


1. Email

Obviously, in this day and age email is a requirement in virtually every business, yet a lot of businesses continue to run Exchange locally. If you are thinking about moving portions of your business out to the cloud, email is a good place to start. Why should you move it to the cloud? Simple: it’s pretty easy to do, and at this point it’s been well documented that mail runs very well in the cloud. It takes a special skill set to run Exchange beyond just adding and managing users. If something goes wrong, it can oftentimes be very complicated to fix, and Exchange can also be pretty complicated to install; a lot of companies do not have access to high-quality Exchange skills. Moving to the cloud solves those issues. Having Exchange in the Cloud also gets your company off the 3-5 year refresh cycle for the hardware that runs Exchange, as well as the upfront cost of the software.

Quick Tip – Most Cloud email providers offer anti-spam/anti-virus functionality as part of their offering. You can also take advantage of Cloud-based AS/AV providers like McAfee’s MX Logic.

2. File Shares

Small to medium-sized businesses have to deal with sharing files securely and easily among their users. Typically, that means a file server running locally in your office or at multiple offices, which presents the challenge of making sure everyone has the correct access and that there is enough storage available. Why should you move to the cloud? There are easy alternatives in the cloud that avoid those challenges, including Microsoft OneDrive, Google Drive, or a file server running in Microsoft Azure. In most cases you can use Active Directory as the central repository of rights, so you manage passwords and permissions in one place.

Quick Tip – OneDrive is included with most Office 365 subscriptions, and you can use Active Directory authentication to provide access to it.

3. Instant Messaging/Online Meetings

This one is pretty self-explanatory. Instant messaging can oftentimes be a quicker and more efficient form of communication than email. There are many platforms out there that can be used, including Microsoft Lync, Skype, and Cisco Jabber, and many of these can be used for online meetings as well, including screen sharing. Your users are looking for these tools, and there are corporate options. With a corporate tool like Lync or Jabber, you can be in control: you can make sure conversations are logged, are secure, and can be tracked. Microsoft Lync is included in Office 365.

Quick Tip – If you have the option, you might as well take advantage of it!

4. Active Directory

It is still a best practice to keep an Active Directory domain controller locally at each physical location to speed the login and authentication process, even when some or most of your applications or services are based in the Cloud. This still leaves most companies with an issue if their site or sites are down for any reason. Microsoft has now provided the ability to run a domain controller in their Cloud with Azure Active Directory, providing the redundancy that many SMBs do not currently have.

Quick Tip – Azure Active Directory is pre-integrated with Salesforce, Office 365, and many other applications. Additionally, you can set up and use multi-factor authentication if needed.

5. Web servers

Web servers are another very easy workload to move to the cloud, whether it’s Rackspace, Amazon, Azure, VMware, etc. The information is typically not highly confidential, so there is much lower risk than putting extremely sensitive data up there. By moving these servers to the cloud, you also keep your website’s traffic off your local Internet connection; it all goes to the cloud instead.

Quick Tip – Most cloud providers offer SQL Server back-ends as part of their offerings, which makes it easy to tie the web server to a backend database. Make sure you ask your provider about this.

6. Backup

A lot of companies are looking for alternate locations to store backup files. It’s easy to back up locally to disk or tape and then move the media offsite, but it’s often cheaper to store backups in the cloud, and doing so helps eliminate the headache of rotating tapes.

Quick Tip – Account for bandwidth needs when you start moving backups to the cloud. This can be a major factor.

7. Disaster Recovery

Now that you have your backups offsite, it’s possible to have the capacity to run virtual machines or servers in the cloud in the event of a disaster. Instead of maintaining your own equipment at another location, you can pay to run your important apps in the cloud in case of disaster. It’s usually going to cost you less to do this.

Quick Tip – Make sure you look at your bandwidth closely when backing up to the Cloud. Measure how much data you need to back up, and then calculate the bandwidth that you will need. Most enterprise-class backup applications allow you to throttle the backups so they do not impact the business.
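The bandwidth math in that tip can be sketched in a few lines. The 50 GB nightly change and 8-hour backup window below are hypothetical example numbers, not recommendations:

```python
# Back-of-the-envelope estimate of the sustained upload bandwidth
# needed to push a backup to the cloud inside a given window.
# Data size and window are hypothetical example inputs.

def required_mbps(data_gb: float, window_hours: float) -> float:
    """Sustained upload rate (megabits/second) needed to move
    data_gb gigabytes within window_hours hours."""
    bits = data_gb * 8 * 1000**3        # decimal gigabytes -> bits
    seconds = window_hours * 3600
    return bits / seconds / 1_000_000   # bits/second -> megabits/second

# e.g., a 50 GB nightly incremental over an 8-hour backup window
print(round(required_mbps(50, 8), 1))  # ~13.9 Mbit/s sustained
```

Running a quick calculation like this before committing to cloud backup tells you whether your existing Internet connection can realistically carry the nightly load, or whether you need throttling, seeding, or a bigger pipe.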

8. Applications

A lot of common applications that SMBs use are offered as a cloud service; Salesforce and Microsoft Dynamics, for example. These companies make and host the product so that you don’t have to host it onsite. You can take advantage of the application for a fraction of the cost and headache.

In conclusion, don’t be afraid to move different portions of your environment to the cloud. For the most part, it’s less expensive and easier than you may think. If I were starting a business today, the only things I would run locally would be an AD controller and a file server. The business can be faster and leaner without the IT infrastructure overhead that one needed to run a business ten years ago.

Looking for more tips? Download this whitepaper written by our CTO Chris Ward, “8 Point Checklist for a Successful Data Center Move.”

 

By Chris Chesley, Solutions Architect

Cloud Computing in 2020: Looking into that Crystal Ball

Recently, @thedodgeretort of Enterprise CIO Forum held a Twitter chat about what cloud computing in 2020 will look like, and I decided to write up a quick blog sharing my thoughts on the topic. Looking into the crystal ball, I see a few things happening with cloud by 2020, call it 5 years out. First, cloud will transform into more of a utility and a grid of computing power. Second, we’ll see a much deeper manifestation of the core characteristics of cloud computing, especially with regard to flexible capacity, consistent access, and high portability. Third, I anticipate a lot of activity in machine-to-machine transactions and communications (call it IoT if you like). Fourth: super-resilient applications. Fifth: compute traded as a commodity. And finally, within 5 years, I think IT and the overall business will come together to actually take advantage of these technologies. Read on for more detail.


 

1. A utility and computing grid

In 5 years, large companies will still hang on to their datacenters to run some services. However, with security more robust, I think corporations will make their own computing resources available as much as they consume cloud resources, just like some households generate their own electricity and sell it back to the grid. I think Cisco’s Intercloud concept has an angle on this.

2. Flexible capacity, consistent access, and high portability

A cloud/compute socket, just like an electrical socket: standardized applications and connectors that “plug in” to the grid and are removed just as easily. Virtualization took the first stab at this, encapsulating the OS, data, and applications neatly in a VMX file and VMDKs. Containers are the next stab, and Red Hat has an angle on this with their CloudForms PaaS. Raw compute power becomes more and more of a commodity as portability improves, meaning downward pressure on IaaS prices will remain to some degree (see #5).

3. IoT or machine-to-machine communications/transactions

One machine determines that it needs to acquire more compute power to complete its work. It makes a “deal” to go out and acquire that compute power, uses it, and gives it back to the grid. Or, on the flip side, a machine that knows when it can stand idle and rent its own power. Another angle on this, a virtual machine or application has knowledge of its SLA, and moves to the provider who can deliver on that SLA at the least cost. Love it or hate it, Apple’s Siri has an early angle on this. From what I’ve read about the technology, queries to Siri find their way back to Apple datacenters, not only to obtain answers, but to improve the accuracy of queries for all Siri users.

4. Super-resilient applications

As prices for cloud trend downward and portability improves (see #2 and #5), disaster recovery will take a new shape. Instead of running on a 2-site/2-region DR architecture, applications will run on a 5-, 10-, 20-, or 30-site “DR” architecture, with all nodes being active. Does it matter where your application is running at that point? Potentially, it’s running all over the East Coast, or all over the country. Some AWS services already have an angle on this by being redundant across regions (e.g., S3, Elastic Load Balancing, etc.), not to mention things like DNS on the Internet. I think it will become cost-effective to do this, in general, within 5 years.
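The many-active-sites idea can be put in rough numbers. Assuming each site is independently available 99.9% of the time (an assumed figure, not a quoted SLA, and real site failures are often correlated), the chance that at least one site is up is 1 − (1 − p)ⁿ:

```python
# Rough sketch of why adding active sites changes the DR picture:
# with n independent sites, each available a fraction p of the time,
# the probability that at least one site is up is 1 - (1 - p)**n.
# The 0.999 per-site availability is an assumption for illustration.

def service_availability(per_site: float, sites: int) -> float:
    """Probability that at least one of `sites` independent sites is up."""
    return 1 - (1 - per_site) ** sites

for n in (2, 5, 10, 30):
    print(f"{n:2d} sites -> {service_availability(0.999, n):.12f}")
```

Under these toy assumptions, even the jump from 2 sites to 5 pushes availability from "six nines" territory to a figure that is indistinguishable from 100% for most businesses, which is why all-active multi-site architectures are so attractive once they become affordable.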

5. Compute traded as a commodity, just like crude oil

This might be a stretch in 5 years, but with IaaS becoming more commoditized and portability improving, we’ll see a day when compute power is traded in a commodities market. In the channel, this is already fairly common: IaaS providers are eager to cut favorable deals with resellers who agree to purchase large chunks of infrastructure upfront, only to resell it at a later date.

6. IT and the business coming together

DevOps was the first marriage of two groups that had previously been (oftentimes) at odds. Within 5 years, I think maturity in IT will improve to the point that IT becomes as focused on the business as any other traditional LOB. IT becomes an Innovation Center: focused on the business and behaving proactively. Corporate IT shifts its focus from requirements to possibilities. See my previous posts on the emerging idea of the cloud architect, who will be important in this shift.

 

To sum up… we’re just at the beginning of possibilities in cloud computing.

 

To hear more from John, you can download his eBook, “The Evolution of the Corporate IT Department.”

Photo credit http://bestpaperz.com/ct/8766019-crystal-ball.html