I’ve been doing a lot of work with Windows Azure Mobile Services (WAMS). It’s a brilliant technology that allows you to stand up powerful OData compliant services to support your Windows 8 Store Apps, Windows Phone 8 Apps, and even iOS apps in just a few minutes. It’s hard to oversell the sheer awesomeness of this stuff.
I’m currently working on a bunch of code that will shortly become a sample project highlighting both WAMS and Windows 8 Apps (look for a project called “FamilyPig” coming soon). In the process of building that, I ran into a couple of questions – one of which I’ll cover here, and give some guidance to people who might be running into a similar question.
One of the cool features that makes WAMS super easy to work with is the concept of “dynamic schema”. In a nutshell, that means that if you have an existing table, and you throw a Plain Old CLR Object (POCO) at it using the InsertAsync method (of the IMobileServiceTable interface), WAMS is smart enough to look at the object coming in, and make sure that the underlying Windows Azure SQL Database table has all the columns it needs to store the record (assuming that “dynamic schema” is enabled on the mobile service). If a column does not exist, it gets created on the fly. Very, very cool. Note that what’s actually happening under the covers is that your POCO is being converted to a JSON object for transmission over the wire, and WAMS is pulling apart that JSON object to look at the columns.
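To make the mechanism concrete, here’s a toy sketch of the idea in Python. This is emphatically not the real WAMS implementation (the table structure and function here are invented for illustration) — it just simulates what “dynamic schema” does conceptually: inspect the keys of the incoming JSON record and create any missing columns before inserting the row.

```python
# Toy illustration only (not the actual WAMS code): simulate "dynamic
# schema" by adding any columns present in the incoming record but
# missing from the table, then inserting the row.

def insert_with_dynamic_schema(table, record):
    """table is a dict: {'columns': set of names, 'rows': list of dicts}."""
    for column in record:
        if column not in table["columns"]:
            # Column doesn't exist yet: create it on the fly and
            # backfill existing rows with None (i.e. SQL NULL).
            table["columns"].add(column)
            for row in table["rows"]:
                row[column] = None
    table["rows"].append(dict(record))

pigs = {"columns": {"id", "name"}, "rows": [{"id": 1, "name": "Wilbur"}]}

# The incoming record carries a "weight" field the table has never seen.
insert_with_dynamic_schema(pigs, {"id": 2, "name": "Babe", "weight": 42})

print(sorted(pigs["columns"]))  # → ['id', 'name', 'weight']
```

In the real service, the same inspection happens server-side against the JSON payload that InsertAsync sends over the wire, and the column is added to the Windows Azure SQL Database table itself.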
Monthly Archives: January 2013
Houston, We Have Cloud
The data centers of the future may look more like NASA ground control – governance inside, resources out
One theme has remained consistent throughout the evolution of cloud thus far – enterprise IT wants to retain control of both its data and access to it.
This is not an unreasonable demand. After all, it is enterprise IT – and its leadership – that will pay the price should customer data leak or regulations not be complied with. Despite the growing view that cloud security is a joint, shared responsibility between customer and provider, it is enterprise IT that must put into place the mechanisms for both controlling and proving control over data and access, not cloud providers or integrators. The provider can offer services designed to provide that control, but it is not the one that must implement the policies or report on their effectiveness.
Cloud Conversations: Gaining Cloud Confidence | Part 1
This is the first of a two-part industry trends and perspectives series looking at how to learn from cloud outages (read part II here).
In case you missed it, there were some public cloud outages during the recent Christmas 2012 holiday season. One incident involved Microsoft Xbox, whose users were impacted (view the Microsoft Azure status dashboard here), and the other was another Amazon Web Services (AWS) incident. Microsoft and AWS are not alone; most if not all cloud services have had some type of incident and have gone on to improve from those outages. Google has had issues with different applications and services, including some in December 2012 along with a Gmail incident that received coverage back in 2011.
Does mobility and the cloud equal total compatibility?
Over the last fifteen or so years, we have seen computing make the transition from fixed and immovable desktops and servers, through chunky underpowered laptops, through less chunky but more powerful laptops using Wi-Fi connectivity, through to smartphones, 3G, and tablets.
We are now nearing a state of complete computing mobility. More recently, we have seen cloud computing grow exponentially with cloud hosting and remote data storage services such as Dropbox taking centre stage.
It is predicted that in 2013, there will be significant merging between the cloud and mobile computing.
The cloud and mobility have already merged in certain markets, however, albeit under a different guise. Where ‘mobility’ has meant the access and input of data in any place and at any time, whether in business or public, it follows that there should be a convergence of such mobility with the extensive array of cloud services.
There …
Amazon Web Services starts to assert itself
By Laurent Lachal, Senior Analyst, Ovum Software
At the end of November 2012, Amazon Web Services (AWS) held its first partner and customer conference in Las Vegas. Dubbed AWS “re: Invent”, the event was a success. It enabled AWS to assert itself as a large and influential player in the IT marketplace.
However, the event also highlighted some of AWS’s traditional weaknesses alongside its strengths. These include poor communications and an inability to put forward messages adapted to the needs of enterprise executives.
A growing footprint but needs more transparency
AWS shares as little information about itself as it can get away with. As the infrastructure-as-a-service (IaaS) market matures, it becomes increasingly awkward for the organisation to be so closely guarded.
Nevertheless, the conference itself, in its size, energy, and quality of networking opportunities, was a good reminder of the growing influence that AWS wields in the IT …
Step-by-Step: Build a SharePoint 2013 Lab in the Cloud on Windows Azure
Now that SharePoint Server 2013 has been released, I frequently get asked about ways in which a SharePoint 2013 lab environment can be easily built for studying, testing and/or performing a proof-of-concept. You could certainly build this lab environment on your own hardware, but due to the level of SharePoint 2013 hardware requirements, a lot of us may not have sufficient spare hardware to implement an on-premise lab environment.
This makes a great scenario for leveraging our Windows Azure FREE 90-day Trial Offer to build a free lab environment for SharePoint 2013 in the cloud. Using the process outlined in this article, you’ll be able to build a basic functional farm environment for SharePoint 2013 that will be accessible for approximately 105 hours of compute usage each month at no cost to you under the 90-day Trial Offer.
2013 – The Year of the (Trusted) Cloud
Every year has threatened to be “the year of the Cloud”, but my main prediction for 2013 IT industry developments is that yes, this is the year the cloud cements its role as the most significant of technology disruptions.
It’s hard to define the Cloud in such a delineated manner: it has been underway as a technology trend for many years, it spans a wide spectrum of continually evolving categories, and customer adoption has already been underway to varying extents for many organisations.
So it’s already been the Cloud for quite some time, which makes it a bit late to declare this the Year Of.
Software Sales in the Cloud
Businesses live and die by their ability to sell. Despite common belief, the key to successful selling in the cloud doesn’t rely on the technology infrastructure behind the scenes — the real key is in billing.
In essence, a cloud solution is an agreement between the vendor and the user. The ability to monetize that agreement is the heart of how to sell in the cloud. The more customers using your solution, the more revenue you create. To ensure a steady and predictable revenue stream, the back-end systems need to be able to support it.
The temptation with any sales process is to sell the sizzle; however, in a cloud environment, function is what’s important. Simplicity is key – from account creation to managing payment processes, billing, and user management, making these processes as simple as possible will make signing up new customers much easier.
Are You Stuck in a Cloud?
Ultimately cloud computing will be driven by customer needs for control and enabled by a new generation of hybrid cloud solutions.
Today public and private cloud operating models offer islands of IT operating efficiency that can be highly desirable for applications and services specifically designed for the cloud or which are already virtualized. Yet for the vast majority of multi-tier applications the cloud is still a pipe dream of promises and expectations. The problem isn’t with the cloud service providers per se, but rather with the vast gulf of manual processes required to migrate apps (and their services) into, and operate them across, data centers, colocation facilities and various public clouds.
Hybrid Cloud Is the Future
While most pundits have fallen in love with public cloud, behind the scenes a variety of technologies and vendors are evolving that will deliver the ultimate clouds: hybrid clouds.
Amazon and Google have done a remarkable job promoting public cloud, as VMware, Cisco and OpenStack have done similarly with private cloud. Yet at the Gartner Data Center Conference in December 2012, the dominant theme was neither public cloud nor private cloud, but rather hybrid cloud. The hybrid cloud is a form of cloud computing whereby applications and services can run seamlessly across multiple clouds, colocation facilities and data centers, as a single hybrid cloud.
Before you dismiss hybrid cloud as another marketing twist on cloudwashing, consider its roots and evolution from the physical data center into an elastic, software-defined data center (as VMware calls it). We started talking about a new, elastic architecture back in 2010 from the context of the necessary evolution of networks to support elastic and dynamic virtual infrastructures, including clouds.