Samsung unveils cloud gaming service

Samsung have revealed that they will offer a cloud-based gaming service built into their flagship 7000 series HDTVs.

This is a big step forward for an idea that has been around for quite a while now. The back-end infrastructure will be handled by Gaikai, which will provide cloud-based games from a number of high-profile publishers, including Electronic Arts.

Samsung have also said that the cloud service will support wired and wireless gamepad controllers from several manufacturers, so PlayStation and Xbox owners should have no trouble using their existing controllers with Samsung’s cloud gaming service.

The cloud service is currently entering beta testing. Mouse and keyboard support is being considered, although it will not be available when the service first launches.

Ethan Rasiel, Samsung’s public relations director, has explained that only 2012 models of the 7000 series TVs …

Cloud Computing: The Biggest Big Data at Cloud Expo New York

Have you ever stopped to ask the question, what is the biggest Big Data? If you haven’t, it’s probably because you don’t imagine that it could have much to do with your day-to-day business concerns. But, in a real sense, you are already intimately involved in the ultimate Big Data.
In his session at the 10th International Cloud Expo, Sandy Steier, CEO & Co-Founder of 1010data, will describe the largest possible dataset and show that, if you are involved with any data, you are automatically involved in the biggest data. He will also discuss what this implies for the future of data analysis.

read more

Cloud Computing: The SI’s CIO Steve DeLuca to Speak June 13 at Cloud Expo

The SI Organization, Inc.’s (the SI) Chief Information Officer Steve DeLuca will speak at the 10th International Cloud Expo June 13 at the Javits Center in New York City.
His session, “Transitioning a Full Enterprise to Cloud in Just 10 Months: Lessons Learned,” will focus on the successful deployment of the SI’s cloud-based infrastructure.
The lessons learned from the SI’s internal deployment will serve other organizations as they embark on similar transformations of their IT environments. The SI team achieved a highly accessible, virtualized and elastic environment utilizing server virtualization, desktop virtualization, user profile virtualization, application virtualization, and an intelligent storage network. These elements, leveraging COTS integration, allowed the team to rapidly deploy robust enterprise services from scratch in just 10 months, while maturing the environment efficiently over time and ensuring the continuity of operations for the SI’s workforce as well as customers’ critical national security missions.
The SI is a leading provider of full life cycle, mission-focused systems engineering and integration capabilities to the U.S. Intelligence Community, Department of Defense and other agencies. Its scalable systems engineering platform for modeling, simulation and analysis helps customers baseline requirements, optimize resources and manage risk. The company has 40 years of experience successfully delivering complex, system-of-systems technology solutions. The SI employs more than 2,100 people, with major locations in Chantilly, Va.; Denver; Laurel, Md.; Los Angeles; and Valley Forge, Pa. For more information, visit www.thesiorg.com.

read more

Translating a Vision for IT Amid a “Severe Storm Watch”

IT departments adopt technology along a spectrum, from a directive issued by the CIO to a “rogue IT” suggestion or project from an individual user. The former is top-down adoption, while the latter is adoption from the bottom up. Oftentimes there is confusion somewhere in the middle, resulting in a smorgasbord of tools at one end and a grand, ambitious strategy at the other. This article suggests a framework for implementing a vision through strategy, policy, process, and ultimately tools.

Vision for IT -> Strategies -> Policies -> Processes -> Procedures -> Tools and Automation

Revenue Generating Activities -> Business Process -> IT Services

As a solutions architect and consultant, I’ve met with many clients in the past few years. In conversations with everyone from director-level staff to engineers and support staff in the trenches, it’s clear that IT has taken on a language of its own. Every organization has its own acronyms, sure. But buzzwords and marketing hype strangle the English language inside the datacenter. Consider the range of experience present in many shops, and it is easy to imagine the confusion. The seasoned, senior executive talks about driving standards and reducing spend for datacenter floor space, and the excited young intern responds with telecommuting, tweets, and cloud computing, all in a proof-of-concept that is already in progress. What the…? Who’s right?

It occurred to me a while ago that there is a “severe storm watch” for IT. According to the National Weather Service, a “watch” is issued when conditions are favorable for [some type of weather chaos]. Well, in IT, more than in other departments, one can make these observations:

  • Generationally-diverse workforce
  • Diverse backgrounds of workers
  • Highly variable experience of workers
  • Rapidly changing products and offerings
  • High complexity of subject matter and decisions

My colleague, Geoff Smith, recently posted a five-part series (The Taxonomy of IT) describing the operations of IT departments. In the series, Geoff points out that IT departments take on different shapes and behaviors based on a number of factors. The series presents a thoughtful classification of IT departments and how they develop, with a framework borrowed from biology. This post presents a somewhat more tactical suggestion on how IT departments can deal with strategy and technology adoption.

Yet Another Framework

A quick search on Google shows a load of articles on Business and IT Alignment. There’s even a Wikipedia article on the topic. I hear the term all the time, and I hate it. It suggests that “IT” simply does the bidding of “The Business,” whatever that may be. I prefer to think in terms of a Business and IT Partnership. But anyway, let’s begin with a partnership within IT departments. Starting with tools, do you know the value proposition of all of the tools in your environment? For that matter, do you know about all of the tools in your environment?

A single Vision for IT should first translate into one or more Strategies. I’m thinking of a Vision statement for IT that looks something like the following:

“Acme IT exists as a competitive, prime provider of information technology services to enable Acme Company to generate revenue by developing, marketing, and delivering its products and services to its customers. Acme IT stays competitive by providing Acme Company with relevant services that are delivered with the speed, quality and reliability that the company expects. Acme IT also acts as a technology thought leader for the company, proactively providing services that help Acme Company increase revenue, reduce costs, attract new customers, and improve brand image.”

Wow, that’s quite a vision for an IT department. How would a CIO begin to deliver on a vision like that? Just start using VMware, and you’re all set! Not quite! Installing VMware might come all the way at the end of the chain shown above, under Tools and Automation.

First, we need one or more Strategies. One valid Strategy may indeed be to leverage virtualization to improve time to market for IT services, and reduce infrastructure costs by reducing the number of devices in the datacenter. Great ideas, but a couple of Policies might be needed to implement this strategy.

One Policy might be that all application development must take place on virtual servers. Another might mandate that all new servers be assessed as virtualization candidates before physical equipment is purchased.

Processes then flow from Policies. Since I have a policy that mandates that new development should happen on a virtual infrastructure, eventually I should be able to make a good estimate of the infrastructure needed for my development efforts. My Capacity Management process could then requisition and deploy some amount of infrastructure in the datacenter before it is requested by a developer. You’ll notice that this process, Capacity Management, enables a virtualization policy for developers, and neatly links up with my strategy to improve time to market for IT services (through reduced application development time). Eventually, we could trace this process back to our single Vision for IT.
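
As a rough illustration of what that Capacity Management estimate might look like, here is a minimal sketch in Python. The request counts, the 20% headroom, and the quarterly planning horizon are all assumptions invented for the example, not figures from this article.

    # Minimal sketch (illustrative numbers only): estimate how many virtual servers
    # the Capacity Management process should pre-provision, based on the volume of
    # developer requests seen under the "develop on virtual infrastructure" policy.

    from statistics import mean

    # Hypothetical history of developer VM requests per month
    monthly_vm_requests = [42, 38, 51, 47, 55, 49]

    avg_monthly_demand = mean(monthly_vm_requests)
    headroom = 0.20      # assumed 20% buffer so requests never wait on capacity
    months_ahead = 3     # assumed planning horizon of one quarter

    vms_to_preprovision = round(avg_monthly_demand * (1 + headroom) * months_ahead)
    print(f"Pre-provision roughly {vms_to_preprovision} VMs for the next {months_ahead} months")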

But we’re not done! Processes need to be implemented by Procedures. In order to implement a capacity management process properly, I need to estimate demand from my customers. My customers will be application developers if we’re talking about the policy that developers must use virtualized equipment. Most enterprises have some sort of way to handle this, so we’d want to look at the procedure that developer customers use to request resources. To enable all of this, the request and the measurement of demand, I may want to implement some sort of Tool, like a service catalog or a request portal. That’s the end of the chain – the Tool.

Following the discussion back up to Vision, we can see how the selection of a tool is justified by following the chain back to procedure, process, policy, strategy, and ultimately vision.
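
To make that traceability concrete, here is a minimal sketch in Python of how the chain could be modeled as a simple data structure. The specific entries (the virtualization strategy, the developer policy, Capacity Management, the request portal) are illustrative examples drawn from the discussion above, not a prescribed taxonomy.

    # Minimal sketch: model the Vision -> Strategy -> Policy -> Process ->
    # Procedure -> Tool chain so any tool can be traced back to the vision it supports.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Element:
        level: str                        # e.g. "Vision", "Strategy", ..., "Tool"
        name: str
        parent: Optional["Element"] = None

    vision    = Element("Vision",    "Competitive, prime provider of IT services")
    strategy  = Element("Strategy",  "Leverage virtualization to improve time to market", vision)
    policy    = Element("Policy",    "All new development happens on virtual servers", strategy)
    process_  = Element("Process",   "Capacity Management", policy)
    procedure = Element("Procedure", "Developer resource request procedure", process_)
    tool      = Element("Tool",      "Service catalog / request portal", procedure)

    def trace_to_vision(element: Element) -> List[str]:
        """Walk up the chain and return the justification path for an element."""
        path: List[str] = []
        current: Optional[Element] = element
        while current is not None:
            path.append(f"{current.level}: {current.name}")
            current = current.parent
        return path

    print(" <- ".join(trace_to_vision(tool)))

A tool that cannot be attached to any parent in a structure like this is a signal of exactly the kind of tool proliferation and strategy disconnect discussed below.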

This framework provides a simple form of alignment that offers IT departments a number of advantages. One significant advantage is that it gives everyone in the IT department a common language for understanding the reasoning behind the design of a particular process, the need for a particular procedure, or the selection of one tool over another.

In a future blog post, I’ll cover the various other advantages of using this framework.

Food for Thought

  1. Do you see a proliferation of tools and a corresponding disconnect with strategy in your department?
  2. Who sets the vision and strategy for IT in your department?
  3. Is your IT department using a similar framework to rationalize tools?
  4. Do your IT policies link to processes and procedures?
  5. Can you measure compliance to your IT policies?

Cloud Computing: The Enterprise API Management Platform at Cloud Expo NY

In his session at the 10th International Cloud Expo, Corey Scobie, Chief Technology Officer of SOA Software, will discuss how businesses are harnessing the power of APIs to reach new customers and markets. He will walk the audience through the growth and evolution of the API, why effective API management is important, and how the game changes when companies expose business applications to the outside world. Topics include:

  • A brief history of the API
  • How to use APIs to make money, save money, and build brand
  • “Appification” and the innovation model of the open API
  • API management nuts and bolts, and best practices
  • Why Enterprise APIs are important
  • Some great examples of companies doing it right

read more

Cloud Computing: Start Spreading the News… Cloud Expo, New York

Cloud Expo 2012 is almost here. This promises to be an incredible event, with thousands of attendees and over 100 speakers. As previously mentioned, I’m privileged to be presenting on Making Hybrid Cloud Safe & Reliable. I’m particularly excited that I’ll be introducing attendees to the new concept of API-Aware Traffic Management. It will also be great to be back in New York City!
I recently read Daniel Kahneman’s book Thinking, Fast and Slow, a fascinating study of how the human mind works. With the new capabilities offered by big data and Cloud computing — the dual themes for next week’s event — and the increasing personalization of technology through Mobile devices, I think we have an opportunity to make our digital systems more human in their processing. What does that mean? Well, more intuitive in user experience, more lateral through caching of unstructured data, and more adaptive to changing conditions. API-Aware Traffic Management certainly reflects this potential.

read more

Enterprise Cloud Security – Comprehensive Security Approach

Cloud security has been one of the top challenges reported by organizations that want to migrate to the Cloud. The concern is that the organization’s data may now be stored externally, which can pose greater risks to data integrity and compliance. Even though the data may sit in the Cloud provider’s space, any compromise still puts the organization at risk. The Cloud can introduce new security risks that need to be addressed; however, there are specific ways to manage those risks, leverage the benefits the Cloud has to offer, and ensure secure solutions across the Enterprise.

As part of Cloud vendor selection, it is important to verify that the vendor has a solid business presence and financial stability; if the vendor goes out of business, the organization’s data must remain secure and must not be lost. The vendor should also provide secure service management capabilities for provisioning, updates and auditing. Prior to moving to the Cloud, an assessment of data sensitivity and compliance requirements should be one of the initial steps. Subsequently, specific vulnerabilities of the Cloud solution should be identified, documented and addressed. From an Enterprise Security perspective, policies, tools and controls should be developed for protection.

There are many ways in which the security risks can be mitigated. One is to make sure that providers have the audits and certifications needed to ensure the security of the data. The location of the data is another common concern; if the data must remain in a specific region, that requirement should be incorporated into the service level agreements with the vendor. Security controls at every level should be documented and addressed as part of the certification activities. For the Government, FedRAMP is a program that supports secure cloud computing and provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services.

read more