Internet of Things (IoT) – A Technical Primer | @ThingsExpo #IoT #M2M #API #Wearables

We are rapidly moving to a brave new world of interconnected smart homes, cars, offices and factories known as the Internet of Things (IoT). Sensors and monitoring devices will touch every part of our lives. Let’s take a closer look at the Internet of Things.
The Internet of Things is a worldwide network of objects and devices connected to the Internet. Built from electronics, sensors, software and more, these objects can be monitored and controlled remotely via apps and programs.
Because they can be accessed via the Internet, these devices create a tremendous opportunity to integrate computers and the physical world. They will improve our lives by making things more efficient and accurate, providing economic benefits derived from more effective use of resources.


New players ally to G-Cloud 7 amid accusations of anti-cloud behaviour

A number of new service providers have announced their participation in the latest iteration of the UK’s government computing services framework, G-Cloud 7. Among the new suppliers pledging to meet the conditions of the latest framework were Fordway, Acuity, Company 85, RedCentric and Komodo Digital.

However, critics have argued that the Crown Commercial Service (CCS) has introduced distinctly uncloud-like rules, with newly introduced limits that could hinder buyers from expanding their use of cloud services.

Under the new rules in G-Cloud 7, users will be forced to re-tender via G-Cloud if they intend to buy additional capacity or services costing more than 20% of their original contract’s value – so a buyer with a £100,000 contract would have to re-tender once add-ons exceeded £20,000. This, according to industry body EuroCloud UK, goes against the defining principle of cloud computing: scalability.

“It deters buyers from using the G-Cloud framework, because it actively discourages the pay-per-use principle,” said Neil Bacon, MD of Global Introductions and a member of EuroCloud’s G-Cloud working group. Worse still, he said, it will prevent buyers from achieving the economies of scale that motivated their buying decision in the first place.

Several G-Cloud providers, including EduServ and Skyscape, outlined their concerns about the move in writing to the Cabinet Office. However, Surrey-based service provider Fordway has committed to the new system, launching its Cloud Intermediation Service (CIS) on G-Cloud 7.

The new service helps clients assess, plan, transform and migrate their infrastructure partly or completely to public cloud. It promises agile project management, bundling together the resources that clients will need to support their in-house team at each stage of the transition.

Fordway claims its relationships with public cloud providers such as Amazon Web Services, Microsoft and Google allow it to act as a single point of contact for managing a transition, irrespective of the target platforms.

Fordway’s approach should also shield clients from unexpected fluctuations in capacity demand, according to MD Richard Blanford.

“Most IT teams will only migrate their systems to cloud once, and it’s a big step. For the sake of their organisation and their own careers it needs to be planned and delivered successfully, on time and within budget, without any surprises,” said Blanford.

ERP uptake set to boom as risk diminishes – research

A new survey provides potentially exciting news of lucrative opportunities for cloud service providers. Nearly a third of all enterprise resource planning (ERP) systems in the world will attempt the migration to the cloud in the next two years, if a study commissioned by Hitachi Solutions Europe is accurate.

With rare cloud migration skills at a premium, the mass migration could prove lucrative for experts in this area, according to Hitachi.

The managers of ERP systems have been the most reluctant of all IT users to move to the cloud, according to Tim Rowe, Director of Hitachi Solutions Europe. The importance of this foundation system and its inherent complexity have dissuaded companies from taking any risks. However, as perception of the benefits of cloud computing spreads, the pressure to move is beginning to outweigh the resistance to change, said Rowe. The complexity of ERP systems, once a sales blocker, is now set to become a massive source of margin, according to Hitachi.

“Now we are starting to see a shift as the benefits start to outweigh the perceived risks,” said Rowe.

The survey found that though 31% of organisations have moved all or part of their ERP systems to the cloud, or are in the process of doing so, that leaves a healthy 69% who still keep ERP in house. However, 44% of the survey group of 315 senior finance professionals said they would contemplate moving to the cloud in the next two years. If that is an accurate representation of global sentiment, then around 30% of all ERP systems (44% of the 69% still in house) will begin an expensive migration to the cloud in the next two years.

Among companies with 500 employees there was just as much enthusiasm for taking a cloud-based approach to ERP: 27% of this demographic said they have moved all or part of their ERP to the cloud, or are in the process of doing so – roughly in line with the overall average of 31%.

Enterprises researching feasibility through peer reviews will be encouraged by the feedback from earlier adopters, according to Hitachi: its study found that 80% of the surveyed finance professionals rated their experience of cloud-based ERP as excellent or good.

The main blockers to cloud-based ERP projects will be data security and privacy risk (ranked as the top concern by 38% of respondents) and dependency on a third-party provider, nominated as the top fear by 35% of respondents.

AWS launches EC2 Dedicated Hosts feature to identify specific servers used

Amazon Web Services (AWS) has launched a new service for the nervous server hugger: it gives users knowledge of the exact server that will be running their machines and also includes management features to prevent licensing costs escalating.

The new EC2 Dedicated Hosts service was created by AWS in reaction to the sense of unease that users experience when they never really know where their virtual machines (VMs) are running.

Announcing the new service on the company blog, AWS chief evangelist Jeff Barr said the four main areas of improvement are licensing savings, compliance, usage tracking and better control over instances (i.e. virtual machines).

The Dedicated Hosts (DH) service will allow users to port their existing server-based licenses for Windows Server, SQL Server, SUSE Linux Enterprise Server and other products to the cloud. A feature of DH is the ability to see the number of sockets and physical cores available before investing in software licenses, which improves customers’ chances of not overpaying. Similarly, the Track Usage feature will help users monitor and manage their hardware and software inventory more thriftily: by using AWS Config to track the history of instances started and stopped on each of their Dedicated Hosts, customers can verify usage against their licensing metrics, Barr says.

Another management improvement comes from the Control Instance Placement feature, which promises ‘fine-grained control’ over the placement of EC2 instances on each Dedicated Host.

The provision of a physical server may be the most welcome addition for the many cloud buyers dogged by doubts over compliance and regulatory requirements. “You can allocate Dedicated Hosts and use them to run applications on hardware that is fully dedicated to your use,” says Barr.
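As a rough illustration of that allocate-then-run workflow, here is a minimal sketch using the boto3 SDK; the region, availability zone and AMI ID are placeholders, and the instance type must match the host’s configuration:

```python
# Illustrative sketch: allocate a Dedicated Host, then pin an EC2
# instance to it. Region, AZ and AMI ID below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Allocate one Dedicated Host for a given instance family in one AZ.
host = ec2.allocate_hosts(
    AvailabilityZone="us-east-1a",
    InstanceType="m4.large",
    Quantity=1,
)
host_id = host["HostIds"][0]

# Launch an instance on that specific host via the Placement block;
# the instance type must match what the host was allocated for.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="m4.large",
    MinCount=1,
    MaxCount=1,
    Placement={"Tenancy": "host", "HostId": host_id},
)
```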

The service will help enterprises that have complicated portfolios of software licenses where prices are calculated on the numbers of CPU cores or sockets. However, Dedicated Hosts can only run in tandem with AWS’ Virtual Private Cloud (VPC) service and can’t work with its Auto Scaling tool yet.

The IoT in Palo Alto: connecting America’s digital city

Palo Alto is not your average city. Established by the founder of Stanford University, it is the soil from which Google, Facebook, Pinterest and PayPal (to name a few) have sprung. Indeed, Palo Alto has probably done more to transform human life in the last quarter century than any other city. So, when we think of how the Internet of Things is going to affect life in the coming decades, we can be reasonably sure where much of the expected disruption will originate.

All of which makes Palo Alto a great place to host the first IoT Data Analytics & Visualization event (February 9 – 11, 2016). Fittingly, the event is set to be kicked off by Dr. Jonathan Reichental, the city’s Chief Information Officer. Reichental is the man entrusted with the hefty task of ensuring the city is as digital, smart and technologically up to date as befits a place that has been called home by the likes of Steve Jobs, Mark Zuckerberg, Larry Page and Sergey Brin.

Thus far, Reichental’s tenure has been a great success. In 2013, Palo Alto was credited with being the number one digital city in the US, and has made the top five year upon year – in fact, it so happens that, following our long and intriguing telephone interview, Reichental is looking forward to a small celebration to mark its latest nationwide ranking.

BCN: Jonathan, you’ve been Palo Alto’s CIO now for four years. What’s changed most during that time span?

Dr Jonathan Reichental: I think the first new area of substance would be open government. I recognise open government’s been a phenomenon for some time, but over the course of the last four years, it has become a mainstream topic that city and government data should be easily available to the people. That it should be machine readable, and that an API should be made available to anyone that wants the data. That we have a richer democracy by being open and available.

We’re still at the beginning, however. I have heard that there are approximately 90,000 public agencies in the US alone. And every day and week I hear about a new federal agency or state or city of significance saying, ‘you can now go to our data portal and freely access the data of the city or the public agency.’ The shift is happening but it’s got some way to go.

Has this been a purely technical shift, or have attitudes had to evolve as well?

I think if you kind of look at something like cloud, cloud computing and cloud as a capability for government – back when I started ‘cloud’ was a dirty word. Many government leaders and government technology leaders just weren’t open to the option of putting major systems off-premise. That has begun to shift quite positively.

I was one of the first to say that cloud computing is a gift to government. Cloud eliminates the need to have all the maintenance that goes with keeping systems current and keeping them backed up and having disaster recovery. I’ve been a very strong proponent of that.

Then there’s social media – government has fully embraced that now, having been reluctant early on. Mobile is beginning to emerge, though it’s still very nascent. Here in Palo Alto we’re trying to make all services that make sense accessible via smartphone. I call it ‘city in a box.’ Basically, by bringing up an app on the smartphone you should be able to interact with government – get a pet license, pay a parking fee, pay your electricity bill: everything should really be right there on the smartphone, and you shouldn’t need to go to City Hall for many things any more.

The last thing I’d say is there has been an uptake in community participation in government. Part of it is it’s more accessible today, and part of it is there’s more ways to do so, but I think we’re beginning also to see the fruits of the millennial generation – the democratic shift in people wanting to have more of a voice and a say in their communities. We’re seeing much more in what is traditionally called civic engagement. But ‘much more’ is still not a lot. We need to have a revolution in this space for there to be significant change to the way cities operate and communities are effective.

Palo Alto is hosting the IoT Data Analytics & Visualization event in February. How have you innovated in this area as a city?

One of the things we did with data is make it easily available. Now we’re seeing a community of people in the city and beyond, building solutions for communities. One example of that is a product called Civic Insight. This app consumes the permit data we make available and enables users to type in an address and find out what’s going on in their neighbourhood with regard to construction and related matters.

That’s a clear example of where we didn’t build the thing – we just made the data available and someone else built it. There’s an economic benefit to this. It creates jobs and innovation – we’ve seen that time and time again. We saw a company build a business around Palo Alto releasing our budget information. Today they are called OpenGov, and they sell the solution to over 500 cities in America, making it easy for communities to understand where their taxpayer dollars are being spent. That was born and created in Palo Alto because of what we did making our data available.

Now we get to today, and the Internet of Things. We’re still – like a lot of folks, especially in the government context – defining this. It can be as broad or as narrow as you want. There’s definitely a recognition that when infrastructure systems can begin to share data with each other, we can get better outcomes.

The Internet of Things is obviously quite an elastic concept, but are there areas you can point to where the IoT is already very much a reality in Palo Alto?

The clearest example I can give of that today is our traffic signal system here in the city. A year-and-a-half ago, we had a completely analogue system, not connected to anything other than a central computer, which would have created a schedule for the traffic signals. Today, we have a completely IP based traffic system, which means it’s basically a data network. So we have enormous new capability.

For example, we can have schedules that are very dynamic. When schools are being let out, the signals can run one schedule; at night they can run another – it can be very granular. Next you can start to have traffic signals communicate with each other. If there is a long strip of road and, five signals down, there is some congestion, all the other traffic signals can dynamically change to try and make the flow better.

It goes even further than this. Now we can start to take that data – recording, for example, the frequency and volume of vehicles, as well as weather, and other ambient characteristics of the environment – and we can start to send this to the car companies. Here at Palo Alto, almost every car company has their innovation lab. Whether it’s Ford, General Motors, Volkswagen, BMW, Google (who are getting into the car business now) – they’re all here and they all want our data. They’re like: ‘this is interesting, give us an API, we’ll consume it into our data centres and then we’ll push into cars so maybe they can make better decisions.’
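To make that concrete, a consumer of such a feed might look something like the sketch below; the endpoint, parameters and field names are hypothetical, not Palo Alto’s actual API:

```python
# Hypothetical consumer of a city traffic-signal data feed.
# The URL, query parameters and response fields are illustrative only.
import requests

FEED_URL = "https://api.example-city.gov/traffic/signals"  # placeholder

def congested_signals(min_vehicles=100):
    """Return per-signal readings whose recent vehicle count exceeds a threshold."""
    resp = requests.get(FEED_URL, params={"window_minutes": 15}, timeout=10)
    resp.raise_for_status()
    readings = resp.json()  # assumed shape: a list of per-signal dicts
    return [r for r in readings if r.get("vehicle_count", 0) >= min_vehicles]

if __name__ == "__main__":
    for reading in congested_signals():
        print(reading["signal_id"], reading["vehicle_count"])
```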

You have the Internet of Things, you’ve got traffic signals, cloud analytics solutions, APIs, and cars as computers and processors. We’re starting to connect all these related items in a way we’ve never done before. We’re going to follow the results.

What’s the overriding ambition would you say?

We’re on this journey to create a smart city vision. We don’t really have one today. It’s not a product or a service, it’s a framework. And within that framework we will have a series of initiatives that focus on things that are important to us. Transportation is really important to us here in Palo Alto. Energy and resources are really important: we’re going to start to put sensors on important flows of water so we can see the amount of consumption at certain times but also be really smart about leak detection, potentially using little sensors connected to pipes throughout the city. We’re also really focused on the environment. We have a chief sustainability officer who is putting together a multi-decade strategy around what PA needs to do to be part of the solution around climate change.

That’s also going to be a lot about sensors, about collecting data, about informing people and creating positive behaviours. Public safety is another key area. Being able to respond intelligently to crimes, terrorism or natural disasters. A series of sensors again sending information back to some sort of decision system that can help both people and machines make decisions around certain types of behaviours.

How do you expect this whole IoT ecosystem to develop over the next decade?

Bill Gates has a really good saying on this: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  It’s something that’s informed me in my thinking. I think things are going to move faster and in more surprising ways in the next ten years for sure: to the extent that it’s very hard to anticipate where things are headed.

We’re disrupting the taxi business overnight, the hotel business, the food business. Things are happening at lightning speed. I don’t know if we have a good sense of where it’s all headed. Massive disruption across all domains, across work, play, healthcare, every sort of part of our lives.

It’s clear that – I can say this – ten years from now won’t be the same as today. I think we’ve yet to see the full potential of smart phones – I think they are probably the most central part of this ongoing transformation.

I think we’re going to connect many more things than we’re saying right now. I don’t know what the number will be: I hear five billion, twenty billion in the next five years. It’s going to be more than that. It’s going to become really easy to connect. We’ll stick a little communication device on anything. Whether it’s your key, your wallet, your shoes: everything’s going to be connected.

Palo Alto and the IoT Data Analytics & Visualization event look like a great matchup. What are you looking forward to about taking part?

It’s clearly a developing area and so this is the time when you want to be acquiring knowledge, networking with some of the big thinkers and innovators in the space. I’m pleased to be part of it from that perspective. Also from the perspective of my own personal learning and the ability to network with great people and add to the body of knowledge that’s developing. I’m going to be kicking it off as the CIO for the city.

API Design Using Behavior Driven Development By @JustinRohrman | @CloudExpo #API #Cloud

Test-driven development (TDD) has been around for a while now. Behavior-driven development (BDD), a comparatively recent methodology, emerged from the practice of TDD and could reasonably be called a narrower application of TDD.
The TDD process allows a developer to use a failing unit test to express a shortcoming of the system. The next step is to modify the production code to get the failing test to pass without making existing tests fail. BDD more or less takes this same concept and adds the idea that the tests should be written in easy-to-understand language describing the problem domain, and that tests should express user acceptance criteria.
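To illustrate (this example is ours, not the article’s), here is a minimal sketch of a BDD-style acceptance test using Python’s behave library; the feature wording, fake API and step names are all hypothetical:

```python
# steps/order_steps.py – step definitions for a hypothetical feature file:
#
#   Feature: Order lookup API
#     Scenario: Fetch an existing order
#       Given an order with id 42 exists
#       When the client requests GET /orders/42
#       Then the response status is 200
#
from behave import given, when, then

FAKE_DB = {}  # stands in for the real persistence layer

class FakeApi:
    """Stand-in for the API under test."""
    def get(self, path):
        order_id = int(path.rsplit("/", 1)[-1])
        if order_id in FAKE_DB:
            return {"status": 200, "body": FAKE_DB[order_id]}
        return {"status": 404, "body": None}

@given("an order with id {order_id:d} exists")
def order_exists(context, order_id):
    FAKE_DB[order_id] = {"id": order_id}

@when("the client requests GET /orders/{order_id:d}")
def request_order(context, order_id):
    context.response = FakeApi().get(f"/orders/{order_id}")

@then("the response status is {status:d}")
def check_status(context, status):
    assert context.response["status"] == status
```

The Gherkin scenario states the acceptance criterion in problem-domain language, while the step definitions bind each phrase to executable code – the same red/green cycle as TDD, but readable by non-programmers.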


Microsoft Blog: The cloud for any app and every developer

The below is an excerpt from a recent post on the Microsoft Azure blog by Nicole Herskowitz.

At Microsoft, our vision for Azure is to enable every developer to be able to create, deploy and manage any application in the cloud, regardless of the tools, technology, architecture or platform they prefer. We continue to innovate in delivering services on Microsoft Azure, often in close partnership with leading innovators across many technologies, to ensure open source and third party offerings have first-class support on Azure. Today we’re announcing new technologies and capabilities that advance our mission to make Azure the preferred cloud for any app and every developer — from back-end cloud services to higher level platform services, to the development process itself.

For building highly scalable back-end services in the cloud many developers are turning to microservice architectures. The independent nature of these microservices offers superior application lifecycle management, performance at scale, 24×7 availability and cost efficiency compared with traditional monolithic architectures for service based apps. Today, we’re announcing the public preview of Azure Service Fabric, Microsoft’s platform for developing and operating microservice-based applications. Service Fabric also brings new innovations to microservice development with support for reliable, stateful services for low-latency partitioned data access at scale, and the Actor programming model which drastically simplifies building high-scale microservice applications.

We’ve already seen strong interest in Service Fabric with over 300 customers and partners already building on the platform during the private preview. With the availability of public preview in Azure, you can now explore the scale-out potential of Service Fabric combined with dedicated Visual Studio tooling. Today, Service Fabric is available on Azure and will extend to Windows Server, Linux and other cloud providers next year providing application portability and hybrid scenarios. To get started, download the SDK, check out our getting started videos and documentation and deploy your application to a cluster live in Azure.

For developers who want to build powerful, enterprise grade web and mobile apps that connect to data in the cloud or on-premises, Azure App Service is a highly productive platform for building scalable apps in .NET, NodeJS, PHP, Python or Java as well as engaging mobile apps for iOS, Android and Windows. Azure App Service is one of our most popular Azure services used by more than 60% of customers to host over 700,000 apps. Building on this success, today we announced new capabilities in Azure App Service including:

  • Single sign-on using EasyAuth across all app types making authentication easy, everywhere
  • Code-free interface and data design for rapid development of data-driven Node.js apps
  • API app innovations extended to all app types, eliminating the need for an API gateway


To read the entire post, click here.


Interested in learning about common migration problems with Microsoft Office 365? Download our latest on-demand webinar.


Datacentreplus MD Mashukul Hoque on carving a niche for SME customers


Meet Datacentreplus. The company, based in Manchester, UK, is aiming to fill a gap in the North West data centre market by serving the small to medium-sized business customer.

“We believe colocation and dedicated server hosting should be simple,” a missive on the company’s website reads. “We’re getting back to basics and offering no-nonsense, straightforward colocation and dedicated servers, with a focus on quick delivery and great customer service that does not stop the moment you pay your bills.”

Mashukul Hoque is the founder and managing director of Datacentreplus. His view is that the larger data centre players in the North West are not doing enough to keep their customers happy.

“The North West data centre landscape is the biggest outside London and the South East,” he tells CloudTech. “Currently, it is dominated by large players who are primarily interested in acquiring the larger customers, and their business model is geared towards that. This leaves a large gap where the small to medium customer is not very well served,” he adds.

This move was born, to some extent, out of necessity. Hoque is also managing director of Manchester-based software company Sandyx, and his “relatively small, but vital” data centre requirements were either refused point blank or accepted only with a plethora of caveats. “There appeared to be little or no flexibility on offer,” he explains. “By adding in annual costs, set-up fees and network engineering support fees, I felt as though my business was simply being discredited and pushed away.”

The North West, and the Greater Manchester area in particular, is ripe for technological transformation. According to the Tech Nation report released earlier this year, Manchester has the greatest volume of digital employment after inner London and the Bristol and Bath area.

Hoque has no plans to move, although he says Datacentreplus is already looking at a second, larger site. “Our intention is to remain in the North West and actually probably just in Manchester,” he explains. “We’re looking forward to providing some of the infrastructure that will enable lots of tech startups to incubate in Manchester.”

The company officially soft launched last year as Hoque anticipated some ‘technical glitches’. Yet he notes: “Many of our customers have come from the established data centres and we have been genuinely surprised at the lack of effort made by these data centres to retain customers. The data centre is now 20% full without any serious marketing effort on our part, so we’re pleased that we’ve made unexpectedly good progress.”

Not everything has gone exactly to plan, however. Hoque blames, among other things, the ‘very slow pace’ of getting any kind of infrastructure upgrade out of utilities and telecoms providers. “The key challenge we have experienced so far is one centred round skill sets – there is a real lack of UK-based engineers who have networking and other essential skills,” he adds.

Another key issue Hoque envisages Datacentreplus will face, alongside many other data centre providers, is sustainability – minimising power usage and reducing the environmental impact of data centres. A research report published earlier this month by Emerson Network Power highlighted a similar issue, although technology is advancing; as Chris Molloy, a distinguished engineer at IBM, notes, as IT equipment becomes more resilient, data centres will either generate less heat or be able to tolerate much higher temperatures.

For the time being however, having launched its e-commerce site last week, Datacentreplus is aiming to educate small businesses on the cloud – and go similarly stratospheric in its North West hub.

4 Quick Ways to Access the Windows Start Menu in Coherence Mode

Guest blog by Sylvester Sebastian Nino, Parallels Support Team. Most of the people who use Parallels Desktop enjoy seamless access to Mac and Windows programs and files. The best way to blend the two OSes even more is by using the Coherence view mode to run your Windows virtual machine. One of my coworkers has […]

The post 4 Quick Ways to Access the Windows Start Menu in Coherence Mode appeared first on Parallels Blog.

Tech to Buy this Black Friday

I love the holidays. I love the lights, the hot toddies, giving fun gifts to people, and maybe* something for myself while I’m at it. (*In other words, this is the shopping list for myself!) Here’s some choice tech to keep your eyes peeled for while you’re navigating the Black Friday shopping deals this year. […]

The post Tech to Buy this Black Friday appeared first on Parallels Blog.