All posts by Journey to the Cloud

Listen. Observe. Question. The Art of Complex IT Project Management

 

By Melanie Haskell, Project Manager, GreenPages Technology Solutions

Managing complex technology projects requires cooperation from multiple resources, spanning different departments and management levels, technology manufacturers, and organizations. Due to the complexity of the modern IT environment, project management in this industry is much more than coordinating phone calls and assigning tasks. The ability to communicate effectively (listening, observing, and questioning) is crucial to positive IT project outcomes.

 

How do I get a networking engineer to provide me a daily status report? How do I explain that disaster recovery between sites depends on bandwidth between those sites?  How do I ensure the project I’m working to deliver is in line with the customer’s expectations?  How do I know if the customer wants to mitigate the risk of BYOD?

There are three simple rules of effective communication when managing complex IT projects:

  • Listen
  • Observe
  • Question

Listen

I recently attended a webinar on the importance of listening, and the presenter mentioned an interesting exercise. Ask a person to say the word “White” three or four times, then ask them what cows drink. Nine times out of ten, the person will say “Milk,” not “Water.” This illustrates what happens when people try to solve rather than listen. If you get ahead of yourself in an IT project, mistakes happen. As an IT Project Manager, my role is to watch for this behavior not just in myself (assuming how a customer will test applications in a VDI pilot, for example) but also in other project stakeholders. Perhaps a CTO wants additional storage up and running by the end of the week, but the IT Director states he does not have the resources to meet that deadline. What is meant by “up and running,” and what “resources” does the IT Director need to meet the request? Is it a people shortage, a bandwidth issue, a manufacturer backorder, rack space? A project manager who listens well can untangle the issue and keep the project on track.

Observe

Effective IT project managers have the ability to quickly gauge stakeholders’ level of technical knowledge, area of expertise, level of responsibility, etc. so they can tailor any message to be clearly heard and effectively understood. But another important skill is the power of observation. Project Managers need to ensure all stakeholders are engaged. I was recently in a meeting where I watched someone subtly tune out another person because they thought that person was discussing a topic that was not in their particular “wheelhouse.” But in the modern IT environment, we cannot function in IT silos any longer. Not only is everything connected from a technology standpoint, but all IT projects are also business projects. Effective project managers use observational skills for better project outcomes by minimizing knowledge gaps and ensuring all stakeholders are engaged.

Question

As an IT Project Manager, I work with many different customer contacts (at varying levels of an organization) daily. If I am working with a lean and nimble IT environment and I need a port opened on the firewall, most likely all I need to do is ask. However, if the project involves a customer environment that is layered (maybe ITIL certified) and has a team of 8 people responsible for network security, and I need that same port opened on their firewall, I need to approach the request very differently (and probably have to wait for a change window). If I am working with the Executive Administrative Assistant of a law firm and ask whether all equipment has been received, racked, and cabled and is ready for the engineer to arrive onsite, I need to provide a deeper level of detail in my question than if I were asking an IT Manager the same question.

By employing strong communication skills—listening, observing, and questioning—IT project managers can ensure successful, effective IT project outcomes.

 

Stay Safe in the Cloud With Two-Factor Authentication

The use of two-factor authentication has been around for years, but its recent addition to cloud services from Google and Dropbox has drawn widespread attention.  The Dropbox offering came just two months after a well-publicized security breach at its online file-sharing service.

Exactly What Is Two-Factor Authentication?

Of course, most online applications require a user name and password in order to log on.  Much has been written about the importance of managing your passwords carefully.  However, simple password protection only goes so far.

Two-factor authentication involves not only something the user knows, such as a password, but also something that only the user has.  An intruder can no longer gain access to the system simply by illicitly obtaining your password.

Authentication Tools

  • ATM Cards:  These are perhaps the most widely used two-factor authentication device.  The user must both insert the card and enter a password in order to access the ATM.
  • Tokens:  The use of tokens has increased substantially in recent years.  Most are time-based tokens: a key-sized plastic device with a screen displays a security code that changes continually.  The user must enter not only their password, but also the security code from the token.  Tokens have been popular with sensitive applications such as online bank and brokerage sites.
  • Smart Cards:  These function similarly to ATM cards, but are used in a wider variety of applications.  Unlike most ATM cards, smart cards have an embedded microprocessor for added security.
  • Smart Phones:  The proliferation of smart phones has provided the perfect impetus to expand two-factor authentication to widely used internet applications in the cloud.  In these cases, users must enter not only a password, but also a security code from their phone or other mobile device.  This code can be sent to a phone by the service provider as an SMS text message or generated on a smartphone using a mobile authenticator app.  Both Google and Dropbox now use this method.
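
The rotating codes produced by a mobile authenticator app are typically time-based one-time passwords (TOTP). As a rough illustration, here is a minimal Python sketch of how such a code can be derived from a shared secret, assuming the common 30-second interval and 6-digit format; the secret shown is a made-up example, and real services may differ in their details.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # current 30-second time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The server performs the same calculation with its copy of the secret and
# compares; knowing the current code proves possession of the phone
# (the "something you have").
print(totp("JBSWY3DPEHPK3PXP"))  # example secret, prints a 6-digit code
```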

Yahoo! Mail and Facebook are also introducing two-factor authentication using smart phones.  However, their methodology only prompts the user to enter the security code if a security breach is suspected or a new device is used.

So What’s Next?

Cloud security is a hot topic, and two-factor authentication is one way to mitigate users’ well-founded concerns.  As a result, development and adoption of two-factor authentication systems is proceeding at a rapid pace, and it should be available for most cloud applications within a few short years.

The shift from token-based authentication to SMS-based authentication is also likely to accelerate along with smart phone use.

Two-factor and even three-factor authentication using biometrics will become more popular.  Fingerprint readers are already quite common on laptop computers.  Use of facial recognition, voice recognition, hand geometry, retina scans, etc. will become more common as the technology develops and the price drops.  The obvious advantage of these biometric systems is that, unlike a card or token, the authentication factor cannot be stolen or lent to a third party to gain access to the system.

As with any security system, two-factor authentication is not 100% secure.  Even token systems have been hacked and there is no doubt that there will be breaches in SMS authentication tools as well.  However, two-factor authentication still provides the best way to stay safe in the cloud and it’s advisable to use it whenever possible.

This post is by Rackspace blogger Thomas Parent. Rackspace Hosting is a service leader in cloud computing, and a founder of OpenStack, an open source cloud operating system. The San Antonio-based company provides Fanatical Support to its customers and partners, across a portfolio of IT services, including Managed Hosting and Cloud Computing.

Cloud Corner Series -The Networking & Storage Challenges Around Clustered Datacenters



www.youtube.com/watch?v=fRl-KDveZQg

In this new episode of Cloud Corner, Director of Solutions Architecture Randy Weis and Solutions Architect Nick Phelps sit down to talk about clustered datacenters from both a networking and storage perspective. They discuss the challenges, provide some expert advice, and talk about what they think will be in store for the future. Check it out and enjoy!


Journey to the Cloud First Year: Top 10 Posts

Journey to the Cloud has now been around for over a year! We thought it would be cool to count down our Top 10 Posts since starting the blog. Let us know in the comment section if you think we missed one of your favorites!

10. Cloud Theory to Cloud Reality: The Importance of Partner Management by Robb Schlosser

In Robb’s one and only post he discusses the importance of partner management on an organization’s journey to the cloud.

9. Going Rogue: Do the Advantages Outweigh the Risks? by John Dixon

John reflects on a Twitter chat he participated in hosted by the Cloud Commons blog. Are all rogue IT projects bad things? Could this type of activity be beneficial? If rogue IT projects could be beneficial, should they be supported or even encouraged?

8. The Journey to the New IT: Four Key Observations by Chris Chesley

In this video blog (accompanied by text), Solutions Architect Chris Chesley discusses the four major transformations he has seen in IT: users, not applications or locations, are now the focus; virtualization is now a commodity; cloud is here; and better technology brings better ways of solving issues.

7. The Private Cloud Strikes Back by Trevor Williamson

When Salesforce.com’s JP Rangaswami made comments dissing the private cloud, Trevor Williamson responded with fire!

6. Thin on Thin Provisioning – Good Idea or Recipe for Disaster? by Chris Ward

Chris Ward discusses best practices of thin on thin provisioning. What is it? How do I use it? Positives vs. Negatives? Recommendations.

5. How Cloud Computing is Like Transforming a ’68 Dodge Dart by Trevor Williamson

In order to break down the many different concepts of cloud and cloud technologies, Trevor compares a traditionally managed datacenter with a 1968 Dodge Dart. Video & Text.

4. Mobile Devices in a Cloud World by Ken Smith

In this post, Ken discusses security of endpoint mobile devices.

3. What Should I Do about Cloud? by John Dixon

Pick your poison… Public, Private, Hybrid, Community, SaaS, IaaS, PaaS… even XaaS (anything as a service!). On-premises, off-premises… or even “on-premise” if you want!

2. How a Cloud Infrastructure Can Save or Make You Money by Trevor Williamson

Everyone is wondering about the ROI of a cloud infrastructure. In this post, Trevor points to where the revenue benefits are found or where costs are typically saved in a cloud infrastructure vs. a traditional infrastructure.

1. Planning for Cloud Infrastructures: Build It and They Will…Not Pay For It? by Trevor Williamson

And at number 1…Trevor discusses the CAPEX and OPEX funding issues that are causing the biggest headaches in the industry!

 

What’d you think of the list?

If you’re looking for additional free resources check out this Private Cloud Preflight Checklist, this VDI Webinar Recording, or this Managed Services Article!

 

RECAP: HP Discover 2012 Event

If you are going to do something, make it matter.  That was the key phrase posted throughout the conference at HP Discover 2012 in Las Vegas a couple of weeks ago.  With some of its new announcements, HP did just that.

One of the biggest announcements, in my opinion, is HP Virtual Connect Direct-Attached Fibre Channel Storage for 3PAR. In a nutshell, it helps reduce your SAN infrastructure by eliminating switches and HBAs: you connect your BladeSystem servers directly to the 3PAR array.  This gives you a single-layer FC storage network.  Since you won’t have a fabric to manage, you can speed up provisioning by as much as 2.5X.  Also, by removing the fabric layer, you can reduce latency by up to 55%.

This allows organizations to reduce costs by eliminating the SAN fabric, cutting both capital expenditure and ongoing operating costs.  It also scales with a “pay as you grow” model, allowing you to purchase only what you need.

Complexity is greatly decreased with the wire-once strategy.  If new servers are added to the Blade Chassis, they simply access the storage through the already connected cabling.

Virtual Connect Manager allows for a single pane of glass approach.  It can be used through a web interface or CLI, for those UNIX lovers.

The new trend in IT is Big Data.  Some of the biggest customer challenges are the velocity and volume of data, the large variety and disparate sources of data, and the complex analytics required to maximize the value of information.  HP introduced Vertica 6, which addresses all of these.

Vertica 6 FlexStore has been expanded to allow access to any data, stored at any location, through any interface.  You can connect to Hadoop File Systems, existing databases, and data warehouses.  You can also access unstructured analysis platforms such as HP/Autonomy IDOL.

It also includes high-performance data analytics for the R statistical tool, natively and in parallel, without R’s in-memory and single-threaded limitations.  Vertica 6 has also expanded its C++ SDK to add secure sandboxing of user-defined code.

Workload Management simplifies the user experience by enabling more diverse workloads.  Some users experienced up to a 40X speed increase on their queries.  Regardless of size, Workload Management balances all system resources to meet SLAs.

Vertica 6 software will run on the HP public cloud.  Web and mobile applications generate a ton of data.  This will allow business intelligence to quickly spot any trends that are developing and act accordingly.

Not to be overlooked are the enhancements made to the core components that are already part of the system.

Over the past few years, there has been a big interest in disk-to-disk backup and deduplication.  HP’s latest solution in this space is the B6200 with StoreOnce Catalyst software.  Backed by over 50 patents, it delivers world-record performance of 100TB/hr backups and 40TB/hr restores, which HP claims is 3X and 5X faster, respectively, than the next leading competitor.

The hardware is scalable.  It starts at 48TB (32TB usable) and can grow to 768TB (512TB usable).  With a typical deduplication ratio of 20X, the system can provide extended data protection for up to 10PB.
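
As a rough sanity check on those numbers, here is a quick back-of-the-envelope calculation.  It assumes the quoted 512TB of usable capacity and the 20X ratio; real-world deduplication ratios vary widely with data type and retention policy.

```python
# Back-of-the-envelope math behind the capacity claim above.
# Assumes the quoted 512 TB usable capacity and a 20:1 dedup ratio;
# actual ratios depend heavily on data type and retention policy.
usable_tb = 512
dedup_ratio = 20
logical_tb = usable_tb * dedup_ratio
print(f"Protected logical capacity: {logical_tb} TB (~{logical_tb / 1024:.0f} PB)")
# -> Protected logical capacity: 10240 TB (~10 PB)
```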

This is a federated backup solution that lets you move data from remote sites to multiple datacenters without having to rehydrate and deduplicate it all over again.  It integrates with HP Data Protector, Symantec NetBackup, and Symantec Backup Exec, giving the administrator one console to manage all deduplication, backup, and disaster recovery operations.

The portfolio also includes smaller units for SMB customers. They take advantage of the same type of technologies allowing companies to meet those pesky backup windows.

As a leading HP Partner, GreenPages can assist you with these or any of the products in the HP portfolio.

By Mark Mychalczuk

Cloud Corner Series – Dissecting Virtualization



www.youtube.com/watch?v=pL29FHWXa3U

 

In this segment of Cloud Corner, we bring on Solutions Architect Chris Chesley to discuss various aspects of virtualization. Chris also gets quizzed on how well he knows his fellow Journey to the Cloud Bloggers. Let us know if you agree or disagree with the points Chris makes. We asked Chris the following questions:

1. If I’m virtualized, am I in the cloud?

2. How virtualized would you recommend organizations become?

3. What is the biggest aspect organizations misunderstand about virtualization?

4. What is the single biggest benefit of virtualization?

5. What does it mean to be 100% virtualized, and what are the benefits?

6. Where should companies who have not virtualized anything start?

Check out Episode 1 and Episode 2 of Cloud Corner!

Guest Post: Cloud Management

 

By Rick Blaisdell, CTO, ConnectEDU

Cloud computing has definitely revolutionized the IT industry and transformed the way IT services are delivered. But finding the best way for an organization to perform common management tasks using remote services over the Internet is not that easy.

Cloud management covers provisioning, managing, and monitoring applications in cloud infrastructures without requiring end-user knowledge of the physical location of the systems that deliver the services. Monitoring cloud applications and activity requires cloud management tools to ensure that resources are meeting SLAs, working optimally, and not adversely affecting the systems and users that leverage these services.

With appropriate cloud management solutions, users can now manage multiple operating systems on the same dedicated server, or move virtual servers to a shared server, all from within the same cloud management solution.  Some cloud companies offer tools to manage this entire process, while others provide the solution through a combination of tools and managed services.

The three core service models of the cloud environment, Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS), now offer solid options for managing cloud computing, but the management tools need to be as flexible and scalable as an organization’s cloud computing strategy. With this new paradigm of computing, cloud management has to:

  • continue to make the cloud easier to use;
  • provide security policies for the cloud environment;
  • allow safe cloud operations and ease migrations;
  • provide financial controls and tracking;
  • provide auditing and reporting for compliance.

Numerous tasks and tools are necessary for cloud management. A successful cloud management strategy includes performance monitoring (response times, latency, uptime, and so on); security and compliance auditing and management; and the initiation, supervision, and management of disaster recovery.
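
To make the performance-monitoring piece concrete, here is a minimal sketch of an automated uptime and response-time check against an SLA threshold. The endpoint URL and the 500 ms target are illustrative placeholders, not values from the post; a real cloud management tool would feed these results into alerting, reporting, and chargeback.

```python
import time
import urllib.request

SLA_MAX_LATENCY_S = 0.5                              # hypothetical response-time target
ENDPOINTS = ["https://app.example.com/health"]       # placeholder health-check URL

def check(url: str) -> dict:
    """Measure availability and response time for one endpoint."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            up = resp.status == 200
    except OSError:
        up = False
    latency = time.monotonic() - start
    return {"url": url, "up": up, "latency_s": round(latency, 3),
            "within_sla": up and latency <= SLA_MAX_LATENCY_S}

for endpoint in ENDPOINTS:
    print(check(endpoint))   # a real tool would alert and report on SLA misses
```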

So why is it so important for an organization to implement a cloud management strategy? A cloud management strategy that fits the cloud computing resources a company uses delivers IT services to the business faster, reduces capital and operating costs, automates chargeback and reporting for resource usage, and allows IT departments to monitor their service-level requirements.

 

 

This post originally appeared on http://www.rickscloud.com/cloud-management/