Managing Resources in the Cloud: How to Control Shadow IT & Enable Business Agility

 

In this video, GreenPages CTO Chris Ward discusses the importance of gaining visibility into Shadow IT and why IT departments need to offer their users the same agility that public cloud offerings like Amazon provide.

 

http://www.youtube.com/watch?v=AELrS51sYFY

 

 

If you would like to hear more from Chris, download his on-demand webinar, “What’s Missing in Today’s Hybrid Cloud Management – Leveraging Cloud Brokerage”

You can also download this ebook to learn more about the evolution of the corporate IT department & changes you need to make to avoid being left behind.

 

 

 

How the Data Science of Email is Solving Today’s Business Problems

There are 182 billion emails sent every day, generating a lot of data about how recipients and ISPs respond. Many marketers take a more-is-better approach to stats, preferring the ability to slice and dice their email lists based on numerous arbitrary statistics. Fundamentally, however, what really matters is whether sending an email to a particular recipient will generate value. Data scientists can design high-level insights such as engagement prediction models and content clusters that allow marketers to cut through the noise and build their campaigns around strong, predictive signals rather than arbitrary statistics.
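
As a rough illustration of the idea, here is a minimal sketch of an engagement prediction model in Python; the features, data, and model choice are hypothetical stand-ins, not SendGrid's actual pipeline.

```python
# Minimal sketch of an engagement prediction model.
# Features, data, and labels are synthetic stand-ins; a real model would be
# trained on per-recipient engagement history (opens, clicks, recency, etc.).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical features: [historical_open_rate, days_since_last_open, sends_per_week]
X = rng.random((1000, 3)) * [1.0, 30.0, 20.0]
# Hypothetical label: did the recipient engage with the last campaign?
y = (X[:, 0] > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")

# Score a new recipient: probability they will engage with the next send.
p_engage = model.predict_proba([[0.62, 3.0, 5.0]])[0, 1]
print(f"predicted engagement probability: {p_engage:.2f}")
```
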
SendGrid sends up to half a billion emails a day for customers such as Pinterest and GitHub. All this email adds up to more text than is produced in the entire Twitterverse. We track events like clicks, opens, and deliveries to help improve deliverability for our customers, adding up to over 50 billion useful events every month. While SendGrid's data covers only about 2% of the world's non-spam email activity, it gives SendGrid a unique snapshot of email activity spanning senders, recipients, and inbox providers like Gmail and Yahoo. To cope with data at this scale, SendGrid has designed and implemented custom data structures that store tens of billions of items in memory on a single commodity machine.
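
SendGrid has not published those data structures, but the general technique of trading a little accuracy for a large memory saving can be illustrated with a minimal Bloom filter sketch (illustrative only):

```python
# Minimal Bloom filter sketch: a compact, probabilistic set that answers
# "have we seen this item?" with no false negatives and a tunable false
# positive rate. Illustrative only; not SendGrid's actual data structure.
import hashlib

class BloomFilter:
    def __init__(self, num_bits: int, num_hashes: int):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item: str):
        # Derive several independent bit positions from one item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

bf = BloomFilter(num_bits=10_000_000, num_hashes=7)  # roughly 1.25 MB of bits
bf.add("recipient@example.com:open")
print("recipient@example.com:open" in bf)   # True
print("other@example.com:click" in bf)      # almost certainly False
```
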

read more

Should Cloud Be Part of Your Backup and Disaster Recovery Plan?

The cloud has enabled a fast, agile data recovery process that is more efficient than the former practice of restoring data from physical drives. How does this impact Backup & Recovery, Disaster Recovery, and Business Continuity initiatives?
Cloud backup is an approach to data storage and backup that allows users to store a copy of their data on an offsite server, accessible via the network. The network that hosts the server may be private or public, and is often managed by a third-party service provider. Providing cloud-based data recovery services has therefore become a flourishing market, with the service provider charging users for server access, storage space, bandwidth, and so on.
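
As a minimal sketch of the workflow, assuming an S3-compatible object store accessed via boto3 (the bucket name and file paths below are placeholders, and credentials are assumed to be configured in the environment):

```python
# Minimal sketch: copying a local backup to offsite object storage.
# Uses AWS S3 via boto3 as one common example; bucket name and file
# paths are placeholders.
import boto3

s3 = boto3.client("s3")

# Upload the nightly backup archive to an offsite bucket.
s3.upload_file(
    Filename="/backups/db-2014-08-01.tar.gz",
    Bucket="example-offsite-backups",
    Key="nightly/db-2014-08-01.tar.gz",
)

# Restore: download the copy back to local disk when needed.
s3.download_file(
    Bucket="example-offsite-backups",
    Key="nightly/db-2014-08-01.tar.gz",
    Filename="/restore/db-2014-08-01.tar.gz",
)
```
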

read more

Web Host Industry Review “Media Sponsor” of Cloud Expo Silicon Valley

SYS-CON Events announced today that the Web Host Industry Review has been named “Media Sponsor” of SYS-CON’s 15th International Cloud Expo®, which will take place on November 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Since 2000, the Web Host Industry Review has made a name for itself as the foremost authority on the Web hosting industry, providing reliable, insightful, and comprehensive news, reviews, and resources to the hosting community. TheWHIR Blogs provides a community of expert industry perspectives. The Web Host Industry Review Magazine also offers a business-minded, issue-driven perspective of interest to executives and decision-makers. WHIR TV offers on-demand video interviews and features with the key people and events of the web hosting industry. WHIR Events brings together like-minded hosting industry professionals and decision-makers in local communities. TheWHIR is an iNET Interactive property.

read more

Cloud computing: An important component of the physical infrastructure for smart cities

By Victor M. Larios

Today half of the world's population lives in urban areas, and cities are growing their infrastructures and services to keep up. Traditionally, city governments have had separate departments to oversee metropolitan services for citizens; however, these departments do not fully communicate their plans and actions, running their services as independent entities. As a city grows, duplicated effort and wasted resources emerge. In developing a smart city infrastructure, it is necessary to think of cities as complex systems, with departments as subsystems sharing all resources and assets.

For example, a typical department of transportation models traffic patterns in order to plan new roads or arrange streets for efficient mobility. In a systemic approach, the city's streets are a shared resource: the education department adds traffic at peak times according to school schedules; the sanitation department influences traffic with slow-moving garbage collection vehicles; and the environmental department estimates pollution levels from the traffic density identified by the transportation department. The health department could likewise use that information, along with weather conditions, to increase its pharmaceutical stock in anticipation of pollution spikes, storms, or natural disasters.
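
To make the systemic view concrete, here is a toy sketch of departments acting as subsystems over one shared event stream; the department names, topics, and numbers are hypothetical illustrations rather than part of any actual city design.

```python
# Toy sketch of departments as subsystems sharing one resource model.
# All department and event names are hypothetical illustrations.
from collections import defaultdict

class CityEventBus:
    """A minimal publish/subscribe bus shared by all city departments."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

bus = CityEventBus()

# The environmental department estimates pollution from traffic density
# reported by the transportation department.
bus.subscribe("traffic", lambda e: print(
    f"environment: estimated pollution index {e['density'] * 0.8:.1f}"))

# The health department also watches traffic to anticipate demand.
bus.subscribe("traffic", lambda e: print(
    f"health: adjusting pharmaceutical stock for density {e['density']}"))

# The transportation department publishes a shared observation once.
bus.publish("traffic", {"street": "Av. Juárez", "density": 42})
```
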

In this context, it is fundamental for cities undergoing a smartification process to consolidate their infrastructure according to the basic principles of service design: modularity, exportability, interoperability, extensibility, and scalability.

Cloud computing technologies offer a good solution for cities to consolidate their physical infrastructure. Cloud technologies provide different levels of service, such as IaaS (infrastructure as a service), PaaS (platform as a service), and SaaS (software as a service), delivering efficiency, on-demand quality of service, and greener infrastructure.

In 2013 the IEEE launched a new and ambitious educational program for the development of smart cities, with the goal of identifying and sharing best practices to support cities in their smartification process. Guadalajara, Mexico was the first city selected for this IEEE initiative, due in part to the city government's decision to renew the downtown and build a Digital Creative City (Guadalajara Ciudad Creativa Digital, or GCCD). The master plan, designed by Prof. Carlo Ratti and Prof. Dennis Frenchman of MIT together with several consultancy groups, proposes to transform the city without losing its traditions, identity, and architecture.

During the kickoff workshop in Guadalajara in October 2013 local IEEE volunteers defined a strategy for six working groups to tackle different layers of the Smart City: 1) Physical Infrastructure, 2) Internet of Things, 3) Open Data Framework, 4) Analytics and Visualization, 5) Metrics for Smart Cities, and 6) Education for Smart Cities.

According to the original GCCD master plan, the city environment would have a device sensor network and a set of cloud services. Infrastructure requirements include an optical fiber backbone network and data center facilities for urban informatics. To enable new kinds of interaction between citizens, information, and services, the urban informatics are built on private cloud services organized around the concept of an Urban Operating System (UOS). The UOS is a complex-event manager that applies data analytics to the sensor network in order to optimize and forecast citizens' use of city resources and services.
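
The UOS itself is not publicly specified, but the flavour of complex-event processing it describes can be sketched as follows; the sensors, thresholds, and readings here are entirely hypothetical.

```python
# Rough illustration of the kind of complex-event processing a UOS might
# perform: detect a composite condition (sustained high traffic AND high
# pollution) from separate sensor streams. Entirely hypothetical.
from collections import deque

class SlidingAverage:
    """Rolling average over the last `window` sensor readings."""
    def __init__(self, window: int):
        self.values = deque(maxlen=window)

    def update(self, value: float) -> float:
        self.values.append(value)
        return sum(self.values) / len(self.values)

traffic_avg = SlidingAverage(window=5)
pollution_avg = SlidingAverage(window=5)

# Simulated readings from two independent sensor networks.
readings = [(60, 30), (75, 42), (82, 51), (90, 66), (95, 71)]

for traffic, pollution in readings:
    t = traffic_avg.update(traffic)
    p = pollution_avg.update(pollution)
    # Composite event: both rolling averages cross their thresholds.
    if t > 80 and p > 50:
        print(f"UOS event: reroute traffic (avg traffic={t:.0f}, pollution={p:.0f})")
```
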


One of the first proposals is a private cloud architecture for the city of Guadalajara. A two-year cloud technology roadmap includes a local talent program to develop cloud computing skills and encourage innovative solutions to the challenges ahead.

One of the projects in progress at GCCD is the construction of the digital creative accelerator complex, a smart building that will host small and medium-sized businesses in the creative industries, along with an innovation center with living labs for IoT, smart cities, and other areas of research interest to the municipality. Besides this new smart building complex, a smart building renewal project called the “Ingenium Campus” will be the city's vector of knowledge.

It comprises an incubator to support local start-ups in the digital creative and smart cities fields, a media arts magnet middle school, and the Ingenium Institute, a joint effort between Guadalajara's universities and the GCCD companies for talent engagement, education, and entrepreneurship development.

Hence, the first services in the private cloud will be related to environmental, social and economic aspects of the city.

The challenges foreseen for the IEEE Guadalajara pilot and cloud initiative include finding a cost-effective strategy to support the private cloud and ensuring security for cloud users, which is paramount given the mixed environment of government, citizens, and companies sharing the private cloud. Additionally, the city must adapt public policies to capture the benefits of a consolidated infrastructure built on the proposed private cloud and concepts such as the UOS. With Guadalajara as its first smart city, the IEEE Smart Cities Initiative is working to build a consortium of cities that share their experiences and best practices.

About the author

Victor M. Larios received his PhD and DEA in Computer Science from the Technological University of Compiègne, France, and a BA in Electronics Engineering from ITESO University in Guadalajara, Mexico. He works at the University of Guadalajara (UDG), where he holds a full professor-researcher position in the Department of Information Systems and directs the Smart Cities Innovation Center at the CUCEA campus. Dr. Larios founded the UDG PhD in Information Technologies in 2007 and has led projects in Guadalajara connecting academia, government, and high-tech companies such as IBM, Intel, and HP, focusing his research on distributed systems, parallel computing, data analytics and visualization, serious games, and smart cities.

Four critical areas for G-Cloud in the next 12 months

This month saw the publication of the business plan for the Government Digital Service (GDS), which lays out the focus of its work up until this time next year.

The scope of its work pipeline is impressive in its ambition and potential impact.

By digitising the 25 most important government services by its 2015 deadline, the GDS estimates it will save £979m each year.

Meanwhile, it estimates that if the current rate of spend through its G-Cloud programme continues, it will realise annual savings of around £200m by March 2015.

G-Cloud was subsumed into the GDS last year, and it is good to see not only that it has retained its focus on driving the cloud agenda in government, but also the extent to which, as a relatively small part of the GDS, it is able to show a compelling commercial reason for government to continue to put cloud at the heart of what it does.

But of course, there is always more that can be done. As we head out of the first year of Cloud First, this year is a critical one in driving the cloud agenda in the public sector. With that in mind here are four areas I think Tony Singleton and his team should focus on:

1)  Continue to provide vocal leadership around the role of G-Cloud – G-Cloud is maturing in terms of supplier numbers and spend, but what is still missing are the higher-value transactions that are vital if it is to be seen as the definitive framework for cloud services.

As larger contracts come up for procurement in the year ahead, it is vital that the G-Cloud team acts as a vocal cheerleader within government to ensure departments are aware of and use the framework. This will be particularly important as CloudStore gets incorporated into the Digital Marketplace later this year.

2)  A better balanced set of arguments around the benefits of cloud – cost cutting has been top of the Coalition's agenda and, as such, has ended up at the heart of the rationale for cloud adoption. The focus on costs is important, but it risks creating a one-dimensional perception of a solution whose real benefit is its ability to transform and simplify the way the public sector operates and delivers services.

What is needed, then, is a more balanced set of arguments around the case for cloud adoption, so buyers have a better understanding of the business benefits beyond simple cost savings.

3)  Reducing barriers to buying – if cheerleading and knowledge building are important, so too is the job of identifying and addressing the issues that stop public sector buyers from engaging with cloud computing. Chief among these is how G-Cloud fits into an organisation's approach to procurement.

Building awareness of G-Cloud outside of central government, and in local government specifically, is also important. I know the G-Cloud team is starting to take action in all of these areas, but it is vital that they benefit from its long-term attention.

4)  Encourage government departments to talk about their success – here at Eduserv we have delivered some great work in many parts of the public sector, from central and local government to other agencies. We're proud of our success, but it can be incredibly hard to get clients to talk publicly about what they have achieved.

Case studies play a critical role in educating the sector, reducing perceptions of risk and providing a reference point for organisations to plan their own projects. We’d love to see more encouragement and support for government departments to talk about what they have done well and to perhaps make success stories available on CloudStore.

This, of course, is just a starting point and provides some top-line areas for action. But if we make an impact in these areas, I am confident that in twelve months' time we will be able to look back on a year when Cloud First really made an impact on the public sector.

Can bare metal beat virtualised public cloud for NoSQL database performance? Internap says yes

Internet content delivery provider Internap has today released a series of benchmark results showing better big data performance from its bare metal offering than from the Amazon and Rackspace public clouds.

The numbers, which were crunched by performance aggregator Cloud Spectator, aimed to quantify performance differences when operating in-memory NoSQL databases in virtualised public cloud and automated bare metal cloud environments, and found that bare metal outperformed both Amazon and Rackspace across three different workloads in throughput and latency.

In the Inserting Data workload, Internap outperformed Amazon by 51% and Rackspace five times over in throughput, with comfortably lower latency as well. In the Balanced workload (50% read, 50% update), Internap beat Amazon by a similar margin (50%) and outperformed Rackspace by a factor of 2.7.

The Internap automated bare metal cloud also comfortably beat Amazon (by 61%) and Rackspace (by 2.5 times) in the Read Heavy workload, which, you'll be unsurprised to hear, is 95% read and 5% update. In the vast majority of the tests, Amazon beat Rackspace to the punch for second place.

The tests were conducted by selecting random records from realistic scenarios; in the Balanced workload, the scenario was an application that tracked user activity on e-commerce sites and personalised ads based on that behaviour.
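
The report's test harness is not reproduced here, but a YCSB-style mixed workload of this kind can be sketched as follows, with a plain Python dict standing in for the in-memory NoSQL store and the 50/50 read/update mix mirroring the Balanced workload:

```python
# Sketch of a YCSB-style mixed workload measuring throughput and latency.
# A plain dict stands in for the in-memory NoSQL store; the 50/50
# read/update mix mirrors the Balanced workload described above.
import random
import time

store = {f"user{i}": {"last_page": "/home"} for i in range(100_000)}
keys = list(store)

OPS = 200_000
latencies = []
start = time.perf_counter()

for _ in range(OPS):
    key = random.choice(keys)
    t0 = time.perf_counter()
    if random.random() < 0.5:          # 50% reads
        _ = store[key]
    else:                              # 50% updates
        store[key]["last_page"] = "/checkout"
    latencies.append(time.perf_counter() - t0)

elapsed = time.perf_counter() - start
latencies.sort()
print(f"throughput: {OPS / elapsed:,.0f} ops/sec")
print(f"p95 latency: {latencies[int(0.95 * OPS)] * 1e6:.1f} µs")
```
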

The results, the researchers argued, ‘confirmed the prediction’ that bare metal would outperform cloud servers under these particular conditions, but they added a note of caution.

“Compared with the large differential between Internap and Amazon for throughput speeds, Amazon instance throughput performed close to Rackspace’s cloud servers,” the researchers wrote. “Latency is also relatively similar between the Amazon and Rackspace servers.”

The report added: “Perhaps this is an indicator that at the larger instance sizes, there is a certain cap that performance reaches, whether due to throttling or physical constraints of the hardware.”

The researchers picked Internap's bare metal offering – well, they had to really; it's a bit like a kid bringing a football to the park and not being picked for the game – alongside AWS for its “dominant position in the market and wide variety of instance types”, and Rackspace for its strong showing in previous tests, notably its high disk performance.

The report also duly noted related work, including a report from Altoros Systems that examines the methodology of benchmark testing on NoSQL databases.

It's not the most common option, but some cloudy firms do have a bare metal option – SoftLayer, which recently celebrated the first anniversary of its acquisition by IBM, being one of them.

You can find the full report here (registration required).

TMCnet Named “Media Sponsor” of Cloud Expo 2014 Silicon Valley

SYS-CON Events announced today that TMCnet has been named “Media Sponsor” of SYS-CON’s 15th International Cloud Expo®, which will take place on November 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
Technology Marketing Corporation (TMC) is the world's leading business-to-business and integrated marketing media company, serving niche markets within the communications and technology industries.

read more

O’Reilly Media Named “Media Sponsor” of Cloud Expo Silicon Valley

SYS-CON Events announced today that O’Reilly Media has been named “Media Sponsor” of SYS-CON’s 15th International Cloud Expo®, which will take place on November 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
O’Reilly Media spreads the knowledge of innovators through its books, online services, magazines, and conferences. Since 1978, O’Reilly Media has been a chronicler and catalyst of cutting-edge development, homing in on the technology trends that really matter and spurring their adoption by amplifying “faint signals” from the alpha geeks who are creating the future. An active participant in the technology community, the company has a long history of advocacy, meme-making, and evangelism.

read more