All entries by Guest Author

Project Management and the Cloud

A Guest Post by Joel Parkinson, a writer for projectmanager.com

In the world of information technology, the “cloud” has paved the way for a new method of delivering and managing services over the Internet. In a cloud environment, computing takes place on the Web rather than in the software installed on your desktop. Cloud services are hosted on servers in a “data center”, which is usually staffed and managed by people who specialize in technology management. What does the cloud mean for project management? Here’s an overview of what cloud project management is.

What Cloud Computing Means to Project Managers

Project management is the set of activities and processes carried out to execute and complete a task that one party has outsourced to another. It raises the probability of a project’s success through the efficient use and management of resources. So what does cloud computing mean to project managers? According to PM veterans, cloud computing offers a greener, more sustainable project management environment, lowers costs, eliminates unnecessary software and hardware, improves scalability, and eases information-sharing between team managers and staff, customers and executive management.

Benefits of Cloud Project Management

In a project management environment, the cloud speeds up the whole process. Because cloud services are available at any time, on any day, they help a project team accelerate execution and deliver better results and outputs. With the cloud, project managers and staff can also monitor progress easily and act without delay, since information is delivered in real time. Let’s look at the other benefits of the cloud for project managers.

Improved Resource Management

The cloud’s centralized nature allows for improved utilization, allocation and release of resources, with status updates and real-time information helping to optimize how resources are used. The cloud also helps contain the cost of resource use, whether the resource is a machine, capital or people.

Enhanced Integration Management

With the cloud, different processes and methods are integrated and combined into a collaborative approach to performing projects. Cloud-based software can also aid in mapping and monitoring different processes, improving overall project management efficiency.

Overall, the cloud platform reduces gridlock and smooths the project management process. It makes the whole project team more productive and efficient in the quality of service delivered to the customer, and it enhances the organization’s revenues.

But does the cloud project management model mean a more carefree, less costly environment? It makes the whole process less costly, but not carefree. Despite the perks the cloud provides, everything still needs to be tested and monitored, every member of the project team must still do the work once the project is deployed, and each of them must still be fully supported by project managers and clients. The cloud is perhaps the biggest innovation in the IT industry because it optimizes the utilization of resources within an enterprise.

Alcatel-Lucent, GigaSpaces Partner for Delivery of Carrier Cloud PaaS

Guest post by Adi Paz, Executive VP, Marketing and Business Development, GigaSpaces

The GigaSpaces Cloudify solution enables the on-boarding of applications onto any cloud. For several months now, GigaSpaces has been working with Alcatel-Lucent (ALU) on the use of Cloudify in a carrier cloud service environment. Together with Alcatel-Lucent’s CloudBand™ solution, Cloudify is a fundamental building block in the technological backbone of ALU’s carrier-grade Platform-as-a-Service (CPaaS).

Dor Skuler, Vice President & General Manager of the CloudBand Business Unit at Alcatel-Lucent, has said that, “Offering CPaaS as part of the CloudBand solution enables service providers to make a smooth migration to the carrier cloud and quickly deploy value-added services with improved quality and scalability, without the need for dedicated equipment.”

This new class of carrier cloud services brings the benefits of the cloud to the carrier environment without sacrificing the security, reliability or quality of applications. CPaaS enables the on-boarding of mission-critical applications on a massive scale, including both legacy and new carrier cloud services, and its integration with carrier networks is a factor in meeting the requirements of many customers’ Service Level Agreements (SLAs).

Unlike regular cloud environments, where an application needs to handle multi-zone deployments explicitly, CPaaS lets application workload and availability be handled through a policy-driven approach. The policy describes the desired application SLA, while the carrier CPaaS maps the deployment of application resources onto the cloud to best meet latency, load or availability requirements.
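
The post does not show what such a policy looks like, so the following is only a hypothetical sketch of the kind of SLA-driven placement policy being described; every field name here is invented for illustration and is not the actual CloudBand/CPaaS format.

```python
# Hypothetical sketch of an SLA-driven placement policy; all field names are
# invented for illustration and do not reflect the real CloudBand/CPaaS format.
sla_policy = {
    "application": "voicemail-service",
    "availability": "99.999%",        # five-nines target
    "max_latency_ms": 20,             # keep instances close to subscribers
    "min_instances": 3,
    "spread_across_sites": True,      # multi-site deployment for resilience
}

def eligible_sites(policy, candidate_sites):
    """Toy placement step: keep only sites that satisfy the latency bound."""
    return [s for s in candidate_sites if s["latency_ms"] <= policy["max_latency_ms"]]

# The platform, not the application, decides where the workload lands.
sites = [{"name": "paris", "latency_ms": 12}, {"name": "dallas", "latency_ms": 95}]
print(eligible_sites(sla_policy, sites))   # -> only the Paris site qualifies
```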

Additionally, the integration will enable the creation of network-aware CPaaS services, simplified on-boarding to ALU’s CloudBand platform, multi-site application deployment and simpler management of latency- and location-sensitive applications. The ability to comply with five-nines reliability, security and disaster recovery requirements gives enterprises peace of mind when on-boarding mission-critical applications to the carrier network.

The Cloudify Approach

Cloudify manages applications at the process level and, as such, uses the same underlying architecture for any application, regardless of the language or technology stack the application is built on. That said, working at the process level alone is often not enough, because not all processes behave the same: databases, for example, behave quite differently from web containers and load balancers. To still get in-depth knowledge about a managed application’s processes, Cloudify uses a recipe-based approach. A recipe describes the elements that are specific to an individual process, such as its configuration, its dependencies on other processes, the specific key performance indicators that tell whether the process’s behavior is aligned with its SLA, and so on.
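
Cloudify’s actual recipes use their own DSL, which this post does not reproduce; purely as a sketch of the kind of information a recipe captures (per-process configuration, dependencies and SLA-related KPIs), a Python-style stand-in might look like this, with every name below hypothetical.

```python
# Illustration only: a Python-style stand-in for what a Cloudify recipe
# describes. The real recipe DSL and its field names are different.
mysql_recipe = {
    "name": "mysql",
    "config": {"port": 3306, "data_dir": "/var/lib/mysql"},
    "depends_on": [],                                 # starts before the web tier
    "kpis": {"replication_lag_seconds": {"max": 2}},  # SLA threshold for this process
}

web_recipe = {
    "name": "web-container",
    "config": {"port": 8080},
    "depends_on": ["mysql"],                          # wait for the database first
    "kpis": {"requests_per_second": {"max": 2000}},
}

def start_order(recipes):
    """Toy dependency resolution: start processes whose dependencies are already up."""
    ordered, started = [], set()
    while len(ordered) < len(recipes):
        for r in recipes:
            if r["name"] not in started and all(d in started for d in r["depends_on"]):
                ordered.append(r["name"])
                started.add(r["name"])
    return ordered

print(start_order([web_recipe, mysql_recipe]))   # -> ['mysql', 'web-container']
```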

Working at the process level makes it possible to plug into a large variety of infrastructures, whether they happen to be public, private or bare-metal environments. Cloudify uses an abstraction layer known as the Cloud Driver that interfaces with the cloud infrastructure to provide on-demand compute resources for running applications.
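
The Cloud Driver itself is part of Cloudify and its interface is not shown in this post, so the sketch below only illustrates the general shape of such an abstraction layer; the class and method names are hypothetical, not Cloudify’s API.

```python
# Rough sketch of a "cloud driver" style abstraction layer; names are
# illustrative, not Cloudify's actual API.
from abc import ABC, abstractmethod

class CloudDriver(ABC):
    """Lets the orchestrator request compute resources without knowing
    which infrastructure (public, private or bare-metal) is underneath."""

    @abstractmethod
    def start_machine(self, template: str) -> str:
        """Provision a machine from a named template and return its ID."""

    @abstractmethod
    def stop_machine(self, machine_id: str) -> None:
        """Release a machine back to the provider."""

class StaticHostsDriver(CloudDriver):
    """A bare-metal driver: 'provisioning' hands out hosts from a fixed pool."""

    def __init__(self, hosts):
        self.free = list(hosts)

    def start_machine(self, template: str) -> str:
        return self.free.pop()            # no API call needed, hosts already exist

    def stop_machine(self, machine_id: str) -> None:
        self.free.append(machine_id)
```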

The Cloudify process can be implemented on individual clouds from HP, Microsoft, IBM, CloudStack and others, or in the carrier network infrastructure of a company like Alcatel-Lucent.

Adi Paz is responsible for developing and communicating GigaSpaces’ strategy, and managing the company’s go-to-market activities and strategic alliances.

 

High Street and Main Street 2013: Business Failure or Rejuvenation?

Guest Post by Pontus Noren, director and co-founder, Cloudreach.

Since Woolworths stores disappeared from the physical high street and main street in January 2009, bricks-and-mortar retailers have been falling apart. More than 27,000 people were put out of work when its 800 stores closed, consigning a century of trading to the history books. An alarming number of traditional big names have sunk since: already this year we have seen Jessops, HMV and Blockbuster Video enter administration or bankruptcy.

It is upsetting to see these names disappear from view, but do not believe the headlines. The high street is not dying; it is changing.

Most recently, Blockbuster Video encountered trouble because people were ditching the traditional movie rental model in exchange for internet streaming services. Blockbuster’s model involved leaving the comfort of your sofa, walking to a video rental store, then scouring the shelves for something you wanted to watch. You’d even face a fine if it was not returned on time. In contrast, the likes of LOVEFiLM and NETFLIX charge a monthly subscription fee and allow members to browse extensive video libraries online before streaming an unlimited amount of content.

Being successful in business is all about changing the game: looking beyond what is already out there and offering something new. This shift in the key players of the movie rental market has been facilitated by cloud computing technology. The emergence of the cloud makes it much easier for businesses to grow rapidly, as you only pay for the server capacity you use with the likes of Amazon Web Services.

That ability to scale up and down quickly contrasts with the traditional IT model, where businesses purchase physical servers and maintain them in-house.

When technology changes, it can have a radical effect on an industry, altering the way things are delivered and consumed. However, the level of spend in the economy stays broadly the same, so although these shops are closing, the economy shouldn’t suffer. The general public will always have a certain amount of money to spend; they just spend it in different ways depending on trends and what’s available, for example spending three pounds on a coffee from Costa rather than a DVD from HMV. That has been reflected in the phoenix Woolworths business. Shop Direct acquired the brand, and its new Woolworths website now offers half a million products. This new trading status reflects the change that has taken place: where people once browsed shelves of goods in shops, they now browse the web for a bargain. People are voting with their virtual feet, and it is obvious that everything is heading online.

Not only is online more convenient, and often cheaper, but people can also have richer interactions with brands online and benefit from items tailored to their individual specifications, something that is difficult for high street retailers to do well. The term Web 3.0 is being coined at the moment, with streaming and personalisation coming to the fore more than ever before.

Web 3.0 and cloud have the potential to form a strong partnership. This force has already transformed the greetings card industry, with the likes of Funky Pigeon and Moonpig using the power of cloud to produce and deliver completely personalised greetings cards. Traditional market leader Clintons Cards closed half of its stores after entering administration in December, having taken a big hit from the success of its online competitors. Online retailing has advanced from being able to offer cheaper products to ones that are also completely tailored to customers’ wishes.

The bricks and mortar high street of the future will be filled with outlets, boutiques, restaurants and coffee shops, which all inspire physical interactions – service-based offerings will be prevalent. However, the most successful businesses will have a solid online strategy supported by cloud technology to deliver a personalised, richer experience for the customer and scalable operations to meet demand. For example, retailers should look at the likes of grab-and-go food outlet Eat, which plans store portfolio growth using cloud.

The cloud changes everything. Retailers must make the most of the tools and technologies at their disposal or they risk falling behind their competitors – or worse, risk being the next big name to hit the headlines for the wrong reasons.

Pontus Noren is director and co-founder, Cloudreach.

Five IT Security Predictions for 2013

Guest Post by Rick Dakin, CEO and co-founder of Coalfire, an independent IT GRC auditor

Last year was a very active year in the cybersecurity world. The Secretary of Defense announced that the threat level has escalated to the point where protection of cyber assets used for critical infrastructure is vital. Banks and payment processors came under direct, targeted attack, from denial-of-service campaigns as well as next-generation worms.

What might 2013 have in store? Some predictions:

1. The migration to mobile computing will accelerate, and features of mobile operating systems will come to be recognized as vulnerabilities by the IT security industry.

Look out for Windows 95-level security on iOS, Android 4 and even Windows 8 as we continue to connect to our bank and investment accounts – as well as other important personal and professional data – on smartphones and tablets.

As of today, there is no way to secure an unsecured mobile operating system (OS). Some risks can be mitigated, but many vulnerabilities remain. This lack of mobile device and mobile network security will drive protection to the data level. Expect to see a wide range of data and communication encryption solutions before you see a secure mobile OS.

The lack of security, combined with the ever-growing adoption of smartphones and tablets for increasingly sensitive data access, will result in a systemic loss for some unlucky merchant, bank or service provider in 2013. Coalfire predicts more than 1 million users will be impacted and the loss will be more than $10 million.

2. Government will lead the way in the enterprise migration to “secure” cloud computing.

No entity has more to gain by migrating to the inherent efficiencies of cloud computing than our federal government. Since many agencies are still operating on 1990s-era infrastructure, the payback for adopting shared applications in shared hosting facilities with shared services will be too compelling to delay any longer, especially with ever-increasing pressure to reduce spending.

As a result, Coalfire believes the fledgling FedRAMP program will continue to gain momentum and we will see more than 50 enterprise applications hosted in secure federal clouds by the end of 2013. Additionally, commercial cloud adoption will have to play catch-up to the new benchmark that the government is setting for cloud security and compliance. It is expected that more cloud consumers will want increased visibility into the security and compliance posture of commercially available clouds.

3. Lawyers will discover a new revenue source – suing negligent companies over data breaches.

Plaintiffs’ attorneys will drive companies to break up the cozy connection between compliance and security: it will no longer be acceptable to obtain an IT audit or assessment from the same company that manages an organization’s security programs. The risk of being found negligent or legally liable in any area of digital security will drive the need for independent assessment.

The expansion of the definition of cyber negligence and the range of monetary damages will become clearer as class-action lawsuits are filed against organizations that experience data breaches.

4. Critical Infrastructure Protection (CIP) will replace the Payment Card Industry (PCI) standard as the white-hot tip of the compliance security sword.

Banks, payment processors and other financial institutions are becoming much more mature in their ability to protect critical systems and sensitive data.  However, critical infrastructure organizations like electric utilities, water distribution and transportation remain softer targets for international terrorists.

As the front lines of terrorist activities shift to the virtual world, national security analysts are already seeing a dramatic uptick in surveillance on those systems. Expect a serious cyber attack on critical infrastructure in 2013 that will dramatically change the national debate from one of avoidance of cyber controls to one of significantly increased regulatory oversight.

5. Security technology will start to streamline compliance management.

Finally, the cost of IT compliance will start to drop for the more mature industries such as healthcare, banking, payment processing and government. Continuous monitoring and reporting systems will be deployed to collect compliance evidence more efficiently, and auditors will be able to complete assessments more thoroughly and effectively, with less time on site and less time spent organizing evidence to validate controls.

Since the cost of noncompliance will increase, organizations will demand and get more routine methods to validate compliance between annual assessment reports.

Rick Dakin is CEO and co-founder of Coalfire, an independent information technology Governance, Risk and Compliance (IT GRC) firm that provides IT audit, risk assessment and compliance management solutions. Founded in 2001, Coalfire has offices in Dallas, Denver, Los Angeles, New York, San Francisco, Seattle and Washington, D.C., and completes thousands of projects annually in retail, financial services, healthcare, government and utilities. Coalfire’s solutions are adapted to requirements under emerging data privacy legislation, the PCI DSS, GLBA, FFIEC, HIPAA/HITECH, HITRUST, NERC CIP, Sarbanes-Oxley, FISMA and FedRAMP.

To Cloud, or Not: Getting Started

Guest Post by Gina Smith

Many small business owners are still apprehensive about utilizing cloud options. While it can be a big step, there are significant long-term benefits to utilizing this expanding innovation, including:

  • Enhanced Security – Cloud providers go to great lengths to protect client data, often implementing security protocols which are much more advanced than those on most “hard” networks.
  • Emergency Backup – No need to worry in the event of a fire, earthquake, flood, storm or other natural disaster. Your data and files are safe and being backed up in the “cloud”.
  • Remote Access – You and your employees can gain access to company data at any time from anywhere in the world.
  • Easily Upgrade or Replace Computers – Quickly and painlessly replace obsolete or faulty computers by connecting the new machine(s) and remotely accessing and/or transferring any data needed directly from the cloud!

Once a business decides to take that step into the “cloud”, many get “stuck” trying to figure out which options will work best for their needs. Amazon is considered by many to be a pioneer in the world of so-called “remote computing” services. And now, Internet giant Google has thrown its hat into the ring, launching its “Google Cloud” platform earlier this year.

Amazon AWS (Amazon Web Services)

Amazon was one of the first companies to develop a remote access/cloud computing product catering to the general public, and it still offers the most extensive options for both users and developers. The Amazon Elastic Compute Cloud (EC2) is attractive to many companies because it offers “pay-as-you-go” pricing with no upfront expenses or long-term commitments. Amazon Simple Storage Service (S3) is also very flexible, offering storage options in different regions around the world. Some companies choose to store their data in a lower-priced region to reduce storage costs, or in a region different from where the company is located for disaster recovery purposes. Amazon still offers the most versatile services and options. Some claim the system can be difficult to learn initially, but it is fairly easy to get around once you get the hang of it.
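
As a present-day sketch of picking a storage region programmatically (shown with the boto3 Python SDK, which postdates this article, so treat the details as an illustration rather than part of the original post), the bucket name and region below are placeholders:

```python
# Sketch: create an S3 bucket in a specific region, e.g. a lower-priced one or
# one far from head office for disaster recovery. Names are placeholders.
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")
s3.create_bucket(
    Bucket="example-backup-bucket",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)
```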

Google Cloud Services

There is no doubt that Google has made a permanent mark in history. The Internet giant has revolutionized our lives and made a significant impact on modern society. The company’s launch of its Google Cloud Platform got people who had previously discounted the cloud to begin seriously considering it again. Why? Well, it’s simple. Google has already developed applications that people are comfortable and familiar with. This, of course, makes the entire prospect of cloud conversion and eventual immersion much less intimidating. Google’s cloud platform is still in its early stages and does not yet offer quite the flexibility and options of Amazon AWS. Its data centers are secure and well managed, and its interface and applications are fairly easy to learn and navigate.


While this article offers a good general overview of each system, it is always advisable to conduct your own research to determine which provider will best suit your needs. Both Amazon AWS and Google Cloud provide reliable, secure, cost-saving options for businesses. Also consider utilizing companies specializing in cloud management and backup, such as www.spanning.com. And, as your business grows and your cloud use increases, don’t forget that Cloudyn can use its Cloud Intelligence and other advanced tools to analyze your usage. They can be a tremendous asset in helping you manage and optimize your cloud costs.

Gina Smith writes freelance articles for magazines, online outlets and publications. Smith covers the latest topics in the business, golf, tourism, technology and entertainment industries.

Euro Data Centre Viewpoint: 2013 to be a Year of Growth, Uncertainty

Guest Post by Roger Keenan, Managing Director of City Lifeline

The data centre industry forms part of the global economy and, as such, it is subject to the same macro-economic trends as every other industry.  For 2013, those continue to be dominated by uncertainty and fear.  The gorilla in the room, of course, is the on-going problem in the Eurozone.  This time last year, many commentators predicted that this would come to a head in 2012, with either the central monetary authorities accepting fiscal union and central control across the Eurozone, or the Eurozone starting to break up.  In the event, neither happened; the situation remains unresolved and will continue to drive uncertainty in 2013.

One major uncertainty has been resolved with a convincing win for Barack Obama in the US presidential elections and the removal of the possibility of a lurch to the right.  However, the “fiscal cliff” remains and will cause a massive contraction in the US economy, and hence the world economy, if it goes ahead at the end of 2012.  For the UK, predictions are that interest rates will stay low for the next two to three years as the banks continue to rebuild their strengths at the expense of everyone else.

So the macro-economic environment within which the data centre industry operates is likely to stay uncertain and fearful in 2013.  Companies have massive cash reserves, but they choose to continue to build them rather than invest.  Decision-making cycles in 2013 are likely to be as they are now: slow.  Companies will not invest in new projects unless they have the confidence that their customers will buy, and their customers think the same, and so the cycle goes round.

At a more specific industry level, the on-going trend towards commoditisation of infrastructure is likely to continue.  Whereas data centres five years ago were specific and unique, new entrants to the market have made data centre capacity more available than it was and driven up technical standards.  Older facilities have upgraded to match new builds, which ultimately benefits the industry and its customers.  The new builds and rebuilds are of varying quality and veracity: some are excellent, but others claim tier levels and other standards which are simply not true, or claim to be in central London whilst actually being somewhere else – perhaps following the example of London Southend Airport.  Even in a more commoditised market, quality, connectivity, accessibility and service still stand out, and well-run established data centres will always be first choice for informed customers.

The next part of the consolidation process is probably networks; new entrants are coming into a market where prices continue to fall at a dizzying rate.  There is no end of small new entrants to the marketplace, some of which will succeed and some of which will fall by the wayside.  At the larger end, consolidation continues.  In City Lifeline’s central London data centre alone, Abovenet has become Zayo (and consequently moved from the very top of everyone’s list to the very bottom, possibly not causing joy in Abovenet’s marketing department), Cable and Wireless/Thus has become part of Vodafone, PacketExchange has become part of GT-T and Global Crossing has become part of Level 3.

Data Centre Infrastructure Management (DCIM) systems may establish themselves more in 2013.  DCIM was predicted to have a massive impact, with Gartner stating publicly in 2010 that penetration would reach 60% by 2014.  In the event, penetration at the end of 2012 is only 1%.  DCIM is hard and laborious to implement, but it offers serious benefits to larger organisations in managing their physical assets, power, space and cooling, and it can quickly repay its investment by answering the basic question “how many servers can I have for the capacity I am paying for?”  DCIM deserves more success than it has had to date, and perhaps 2013 will be the year it takes off.

Power densities will continue to increase in 2013.  Five years ago, many racks drew 2kW (8 amps).  Now 8-amp racks are becoming unusual and 16-amp racks are the norm.  Five years ago 7kW racks (about 30 amps) were unusual; now they are common, and 20kW racks are starting to appear.  The trend to higher and higher performance and power densities will continue.
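
As a quick check of that arithmetic (assuming a single-phase 240V feed, which is my assumption rather than something stated in the article):

```python
# Sanity check of the rack figures above, assuming a single-phase 240V supply
# (an assumption; the article only gives rounded amp figures).
VOLTS = 240

def kw_to_amps(kw: float) -> float:
    """Current drawn by a rack at the assumed supply voltage."""
    return kw * 1000 / VOLTS

for kw in (2, 7, 20):
    print(f"{kw} kW ~ {kw_to_amps(kw):.0f} A")   # 2 kW ~ 8 A, 7 kW ~ 29 A, 20 kW ~ 83 A
```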

The data centre industry continues to grow, driven by the move to cloud.  By the end of 2013, an estimated 23% of all data centre space will be in commercial colocation operations.  The leading market segments are likely to be Telecoms and Media, with 24%; Healthcare and Education, with 21%; and the Public Sector, also with 21%.  In-house data centre capacity is likely to continue to decrease and the commercial colocation market to grow, in spite of the uncertain macro-economic environment.

 Roger Keenan joined City Lifeline, a leading carrier neutral colocation data centre in Central London, as managing director in 2005.  His main responsibilities are to oversee the management of all business and marketing strategies and profitability. Prior to City Lifeline, Roger was general manager at Trafficmaster plc, where he fully established Trafficmaster’s German operations and successfully managed the $30 million acquisition of Teletrac Inc in California, becoming its first post-acquisition Chief Executive.

Four Things You Need to Know About PCI Compliance in the Cloud

By Andrew Hay, Chief Evangelist, CloudPassage

Andrew Hay is the Chief Evangelist at CloudPassage, Inc., where he is lead advocate for its SaaS server security product portfolio. Prior to joining CloudPassage, Andrew was a Senior Security Analyst for 451 Research, where he provided technology vendors, private equity firms, venture capitalists and end users with strategic advisory services.

Anyone who’s done it will tell you that implementing controls that will pass a PCI audit is challenging enough in a traditional data center where everything is under your complete control. Cloud-based application and server hosting makes this even more complex. Cloud teams often hit a wall when it’s time to select and deploy PCI security controls for cloud server environments. Quite simply, the approaches we’ve come to rely on just don’t work in highly dynamic, less-controlled cloud environments. Things were much easier when all computing resources were behind the firewall with layers of network-deployed security controls between critical internal resources and the bad guys on the outside.

Addressing PCI DSS in cloud environments isn’t insurmountable, and there are ways to address some of the key challenges when operating a PCI DSS in-scope server in a cloud environment. The first step towards embracing cloud computing, however, is admitting (or in some cases learning) that your existing tools might not be capable of getting the job done.

Traditional security strategies were created at a time when cloud infrastructures did not exist and the only use of public, multi-tenant infrastructure was data communication via the Internet. Multi-tenant (and even some single-tenant) cloud hosting environments introduce many nuances, such as dynamic IP addressing of servers, cloud bursting, rapid deployment and equally rapid server decommissioning, that the vast majority of security tools cannot handle.

First Takeaway: The tools that you have relied upon for addressing PCI-related concerns might not be built to handle the nuances of cloud environments.

The technical nature of cloud-hosting environments makes them more difficult to secure. A technique sometimes called “cloud-bursting” can be used to increase available compute power extremely rapidly by cloning virtual servers, typically within seconds to minutes. That’s certainly not enough time for manual security configuration or review.

Second Takeaway: Ensure that your chosen tools can be built into your cloud instance images to ensure security is part of the provisioning process.
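
One way to make that concrete is a first-boot check baked into the server image that refuses to bring an instance into service until its security tooling is running. The sketch below is illustrative only; "example-security-agent" is a placeholder, not a real package, and the systemd-based check assumes a Linux image.

```python
# Illustrative first-boot hook baked into a server image: the instance only
# registers for traffic once its security services are confirmed running.
# "example-security-agent" is a placeholder name, not a real package.
import subprocess
import sys

REQUIRED_SERVICES = ["example-security-agent", "auditd"]

def service_active(name: str) -> bool:
    """True if systemd reports the service as active (Linux image assumed)."""
    return subprocess.run(["systemctl", "is-active", "--quiet", name]).returncode == 0

def main() -> int:
    missing = [s for s in REQUIRED_SERVICES if not service_active(s)]
    if missing:
        print(f"Refusing to register instance; services not running: {missing}")
        return 1   # fail provisioning rather than serve traffic unprotected
    print("Security baseline verified; instance may join the pool.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```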

While highly beneficial, high-speed scalability also means high-speed growth of vulnerabilities and attackable surface area. Using poorly secured images for cloud-bursting or failing to automate security in the stack means a growing threat of server compromise and nasty compliance problems during audits.

Third Takeaway: Vulnerabilities should be addressed prior to bursting or cloning your cloud servers and changes should be closely monitored to limit the expansion of your attackable surface area.

Traditional firewall technologies present another challenge in cloud environments. Network address assignment is far more dynamic in clouds, especially in public clouds. There is rarely a guarantee that your server will spin up with the same IP address every time. Current host-based firewalls can usually handle changes of this nature but what about firewall policies defined with specific source and destination IP addresses? How will you accurately keep track of cloud server assets or administer network access controls when IP addresses can change to an arbitrary address within a massive IP address space?

Fourth Takeaway: Ensure that your chosen tools can handle the dynamic nature of cloud environments without disrupting operations or administrative access.
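
One pattern that copes with changing addresses, shown here with AWS security groups via boto3 purely as an illustration of the principle (the group IDs are placeholders), is to write rules against groups or tags instead of fixed IPs, so the rule still applies when an instance spins up with a new address:

```python
# Illustration: an ingress rule that references another security group rather
# than a hard-coded IP address, so it keeps working as instance IPs change.
# Group IDs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",                 # database tier group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": "sg-0fedcba9876543210"}],  # app tier group
    }],
)
```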

The auditing and assessment of deployed servers is an addressable challenge presented by cloud architectures. Deploying tools purpose-built for dynamic public, private and hybrid cloud environments will also ensure that your security scales alongside your cloud server deployments. Also, if you think of cloud servers as semi-static entities deployed on a dynamic architecture, you will be better prepared to help educate internal stakeholders, partners and assessors on the aforementioned cloud nuances – and how your organization has implemented safeguards to ensure adherence to PCI-DSS.

 


Woz is Worried About “Everything Going to the Cloud” — the Real Issue is Giving Up Control

Guest Post By Nati Shalom, CTO and Founder of GigaSpaces

In a recent article, Steve Wozniak, who co-founded Apple with the late Steve Jobs, predicted “horrible problems” in the coming years as cloud-based computing takes hold. 

“I really worry about everything going to the cloud,” he said. “I think it’s going to be horrendous. I think there are going to be a lot of horrible problems in the next five years. … With the cloud, you don’t own anything. You already signed it away.”

When I first read the title, I thought Wozniak sounded like Larry Ellison two years ago, when Ellison pitched that the cloud was hype, before making a 180-degree turn and acknowledging that Oracle wanted to be a cloud vendor too.

Reading it more carefully, I realized the framing of the topic is misleading. Wozniak actually touches on something that I hear more often as the cloud hype cycle moves from the Peak of Inflated Expectations into the Trough of Disillusionment.

Wozniak echoes an important lesson that, in my opinion, is a major part of the reason many of the companies that moved to the cloud have experienced outages during the past months. I addressed several of these aspects in a recent blog post: Lessons from the Heroku/Amazon Outage.

When we move our operations to the cloud, we often assume that we’re outsourcing our data center operation completely, including our disaster recovery procedures. The truth is that when we move to the cloud we’re only outsourcing the infrastructure, not our operations, and the responsibility for how we use that infrastructure remains ours.

Choosing better tradeoffs between productivity and control

For companies today, the main reason we chose to move to the cloud in the first place was to gain better agility and productivity. But in starting this cloud journey, we found that we had to give up some measure of control to achieve that agility and productivity.

The good news is that as the industry matures there are more choices that provide better tradeoffs between productivity and control:

  • Open-source clouds such as OpenStack and CloudStack
  • Private cloud offerings
  • DevOps and automation tools such as Chef and Puppet
  • Open-source PaaS such as Cloudify, OpenShift and Cloud Foundry
  • Combined DevOps and PaaS such as Cloudify

As businesses look at cloud strategy today, there is no longer a need to trade control for productivity. With technologies like Cloudify, businesses can get the best of both worlds.


Examining the G-Cloud Initiative – How the UK Public Sector is moving to the Cloud

Guest Post by Ben Jones

Ben Jones is a tech writer, interested in how technology helps businesses. He’s been assisting businesses in setting up cloud-based IT services around the south of England.

There’s a cloud on the horizon of Whitehall. But this isn’t a prediction of stormy times ahead. No, this is the G-Cloud, and it’s being heralded by some as the government’s biggest ever IT breakthrough.

In years gone by, the government has been accused of paying too much for IT contracts, many of which were won by a small number of suppliers. The G-Cloud initiative aims to change this. An online system called CloudStore is part of the government’s plans to slash IT costs by £200 million per year. So how is this going to be achieved? The target is to move half of the government’s IT spending to cloud computing services, and CloudStore, also dubbed the government’s app store, is the key.

It was first announced as a government strategy almost 18 months ago, in March 2011, with the specific aim of making IT services for the public sector easier and cheaper. This means ditching expensive bespoke IT services with lengthy contracts and replacing them with more choice, both in suppliers and, as a result, prices. It’s a radical change from the historic approach of both the government and the public sector. Furthermore, cloud computing has the potential to be a global governmental strategy, with the American government already having its own version in place. A look at the figures gives a clear indication why, with some government departments reporting a drop in the cost of IT services of as much as 90 per cent. Following the first CloudStore catalogue launch in mid-February, some 5,000 pages were viewed in the first two hours, and in the first ten weeks contracts worth £500,000 were signed. In this first procurement, around 257 suppliers offering approximately 1,700 services were signed to the first G-Cloud CloudStore.

It’s the government’s attempt to bring competitiveness to its suppliers, encouraging a wider selection and promoting flexibility in procurement, thus allowing more choice for the public sector. What’s interesting is the mix of small and medium-sized businesses, with over half of the suppliers signed to the first CloudStore being SMEs. They include web hosting company Memset, whose managing director Kate Craig-Wood has backed the G-Cloud services, saying they offer value for money for the taxpayer.

This new initiative heralds a new era for the British government and the wider public sector, and it’s hoped the new IT system will put paid to the government’s history of ill-advised and mismanaged IT projects. That’s not to say there haven’t been concerns over the G-Cloud initiative. Some key concerns relate to how it will be rolled out to public sector workers across the UK, with some employees having fears over security as well as a lack of understanding. However, these didn’t stop the second round of G-Cloud procurement in May 2012, with the total procurement value available soaring to £100 million. This time, the framework will run for 12 months rather than the six of the first iteration. The year-long contract will then become the standard, although it has been reported that this could be extended to 24 months in certain cases.


The ‘Curse of Knowledge’: An IT Consultant’s Tips to Avoid Stifling Innovation

Guest Post by Bob Deasy, CIO, Lead I.T. Consulting

Bob Deasy is CIO of Lead I.T. Consulting, which provides focused IT solutions and business strategy consulting in the Portland area.

The phrase “curse of knowledge” first appeared in a 1989 paper titled “The Curse of Knowledge in Economic Settings: An Experimental Analysis,” which introduced the concept that “better informed agents are unable to ignore private information even when it is in their best interests to do so; more information is not always better.” While most of us assume that experts are the best people to turn to for new ideas, the truth is that experts are often less able to innovate than greenhorns. For instance, if your IT consultant thinks along exactly the same lines as you, it’s difficult to find new ways of doing things.

Although this concept is counterintuitive at first, it makes sense upon consideration of the knowledge-building process. Every field has its own lingo and agreed-upon principles. These guidelines help organize and canonize information that would otherwise be difficult to remember. To gain entry into upper academic echelons and posh corner offices, a person must learn how to follow the appropriate industry rules. IT consultants, for instance, often have a set of IT management rules, such as the ITIL guidelines, practically ingrained in their brains, so they may not see areas that are best served by alternative approaches.

The more you know, the harder it is to get out of the box of agreed-upon industry rules that you’ve built around yourself. The mind of an expert can easily settle into a certain pattern or rut, simply because “that’s the way we’ve always done it.” When entire technology consulting firms are operating from the same handbook, it’s difficult to achieve true innovation. Intel co-founder Andrew S. Grove put it this way: “When everybody knows that something is so, it means that nobody knows nothin’.” As we get to know a topic better, it is harder for us to see it in creative, new ways. Understanding the “rules” of knowledge limits our ability to bend or break them.

Sophisticated but ultimately useless software is one example of how the curse of knowledge thwarts IT innovation. Engineers, in their insulated community, can’t help but design software for other engineers. Too often, the product of their efforts is packaged well and marketed widely but ultimately impractical or downright useless for the average company.

Brothers Chip and Dan Heath explore how to evade the curse of knowledge in their book Made to Stick: Why Some Ideas Take Hold and Others Come Unstuck. Below, we explore a few of these suggestions from an IT management perspective. IT consultants and managers can cultivate innovation by following these tips:

1. Build a team with a variety of skills.

Steve Jobs took this approach to heart when he created the Pixar building, which was designed to force accountants, animators and all other niche experts to interact in the building’s sole set of bathrooms. As Jobs said, “Technology alone is not enough – it’s technology married with liberal arts, married with the humanities, that yields us the results that make our hearts sing.” When an IT consultant, CIO or other IT management guru is forced to work with complete novices, new ways of thinking naturally open up. Total beginners will likely have unique knowledge in other areas that can be applied to IT management in unique, groundbreaking ways.

2. Avoid jargon; seek teaching opportunities.

Explaining the basics can help experts think about their understanding in a new light, fostering innovation. In her book The Innovation Killer, Cynthia Barton Rabe tells of a colleague at Eveready who came to the flashlight business with no preconceived notions of what did and did not work. At that time, all Eveready flashlights were red and utilitarian. Drawing on her years of experience in marketing and packaging at Ralston Purina, this flashlight newbie overhauled the Eveready line to include pink, green and baby blue torches – colors that would be more likely to attract female shoppers. Thus, the floundering flashlight business was revived.

Rabe concludes that such “zero gravity thinkers,” as she calls them, fuel innovation by asking very basic questions that force experts to step back into a beginner’s mindset. Because going back to the basics can seem like backtracking to those who are very familiar with specialized knowledge, it’s not unusual for frustration to run high when zero gravity thinkers are on the scene. However, if a team can work through this irritation, innovation soon follows.

3. Hire “Renaissance thinker” consultants.

Ms. Rabe concedes that outside parties, such as IT consultants, can serve as zero gravity thinkers, assuming they have a broad range of knowledge. If your IT consultant’s only employment has been through technology consulting firms, he or she will not be as likely to innovate. In contrast, an IT consultant who came to the field as a second career will be able to see wholly new approaches.