Why culture change is essential to fulfil the promise of digital-first public services

This month the CBI published a major report outlining its vision for the future of public services.

For anyone in local government, it is well worth a read. Not only does it stress the critical importance of getting our public services online, but it also points out that while we are making progress, there is much more that needs to be done.

There are three clear areas here for local government to focus on.

The first is to do a better job of understanding and meeting citizens’ expectations of the way services are delivered. While 77% of people go online to find or use services every day, only 41% interact with public authorities in this way. Councils are already falling short of expectations.

The second area for focus is what appears to be a lack of digital literacy among public sector leaders. The CBI’s figures show some 75% of council leaders think their councils embrace technology to improve local services. Ask the public if that is the case and only 29% share that view. Without greater digital literacy among local government managers and leaders, the sector is unlikely to be able to accelerate the delivery of digital services in the future.

The last issue comes from the way the public sector builds digital public services. Here, it is clear that the old ways of procuring services and managing projects aren’t necessarily fit for today’s digital purposes. To reshape local government, we need to take a fresh approach to planning and strategy.

So how can local government leaders address these issues?

A first step is for leaders in local government to ensure they and their teams retain a clear understanding of what digital by default really means for organisational strategy.

A strategy focused simply on using technology to cut costs will not be sufficient to deliver digital public services in the long term. Digital by default is about putting technology at the heart of the way councils plan and do business, something which demands deeper change than simply ‘lifting and shifting’ existing services online.

A second area to focus on is in bridging the digital skills deficit at all levels of local government.

At the top of organisations, councils need to make sure their leaders have a full appreciation of the role technology must play. This means giving the chief information officer and technology experts a greater role and influence in the planning and delivery of services so that digital is ‘baked in’ to the decision making process from beginning to end.

More broadly, organisations need to build knowledge among the managers and teams who deliver services so they understand what the digital-first agenda means for them.

Investing in knowledge and skills will allow a more ‘digital-centric’ culture to evolve. To support this culture, it’s important to remember that transformation of public services, and of the organisations that deliver them, cannot happen without investment in technology.

Here, finance directors, chief executives and those leading service areas or directorates should look at the kind of investments that the likes of Bristol City Council and Windsor and Maidenhead are making to transform and reinvent the way their organisations work, in order to understand the crucial role technology plays in service transformation.

There is no doubt that delivering digital public services will stretch and challenge local government in the coming years. But if there is one thing we have learnt during the past four years of budget cuts, it is that local government is a sector with a track record of rising to and overcoming the big challenges when it matters.

Amazon offering unlimited cloud storage for $5 a year in Black Friday tie-in

Amazon is offering customers a one-year unlimited cloud storage deal for $5 to tie in with its Black Friday promotions.

The deal, which would normally cost users $59.99, may be of most interest to Microsoft customers, after the company recently shuttered its unlimited OneDrive plans. In that instance, a few spoiled it for the many; a company blog post explained “a small number of users backed up numerous PCs and stored entire movie collections and DVR recordings…in some instances exceed[ing] 75 TB per user, or 14,000 times the average.” Microsoft has since decided to cap each OneDrive account at 1 TB.

No such problems for Amazon users; as the company explains, “there’s no limit to how many files you can upload, and we’ll never change or reduce the resolution of your images.” For those primarily uploading photos to their storage – a key problem for many of those who complained about OneDrive – Amazon has an unlimited photo storage plan at $11.99 per year, with the carrot of a free trial. For those who do take up the $5 offer, the price for unlimited storage reverts to $59.99 after a year.

Evidently, this is just a short-term customer acquisition play from Amazon to tie in with Black Friday – if one is so inclined, the official Black Friday page has a plethora of deals – but longer term, Microsoft’s decision to halt its unlimited OneDrive accounts should not be seen as an admission of failure. That, at least, was the view of Brian Taptich, CEO of developer-focused cloud storage provider Bitcasa, whose company made a similar decision in 2014 when it found businesses were taking advantage of its plan.

“I suspect Microsoft just learned that, however theoretical models may have supported the efficacy of offering unlimited storage of a fixed – and low – fee, in the almost entirely frictionless world of data transfer, the first users who show up to the all you can eat buffet break the model with their unimaginable volumes of data,” he said.

Staying Compliant in the Cloud Without a Cybersecurity Attorney By @BThies | @CloudExpo #Cloud

Cybersecurity is a complex field, and with laws varying across states and countries, keeping cloud usage compliant can become a real headache for enterprise security decision-makers. As regulations continue to lag behind the rapid pace of technological advancements, many IT security professionals turn to the expertise of cybersecurity lawyers, who not only understand the ambiguities of the law, but are also able to secure and protect their employers’ interests in the case of a breach. Cybersecurity attorneys are not necessary, however, for everyday operations. While they play an important role in dealing with specific crises, it is possible for a company’s security officials to cope with most situations on their own.

read more

Clouds across Europe powered by wood, water and nuclear fission

Two differing approaches to powering the cloud have been unveiled this week.

In northern Russia, a new datacentre facility in Udomlya will use nuclear fission to generate the 80 MW needed to power the 10,000 racks that support the cloud. Meanwhile, Luxembourg-based colocation provider LuxConnect is to power its new Tier IV datacentre in Bettembourg with a wood burner.

The two datacentres illustrate the differing approaches to powering the cloud. According to LuxConnect business development manager Claude Demuth, it is becoming increasingly important for service providers that use datacentre facilities to host their cloud services to demonstrate that their electricity comes from a sustainable source.

Until recently, LuxConnect met this commitment by purchasing credits for power generated by water-driven turbines in Norway. While the power used in its datacentre is not the very same power fed into the grid in Norway, the credits can be exchanged for a local source of power, and LuxConnect was still credited as a user of sustainable power. However, the Luxembourg government suggested that the new facility should use local renewable energy from biomass.

In response, LuxConnect has built its own plant to burn waste wood from pallets, timbers and old furniture. The released energy is converted into electricity that will run the new datacentre’s power and cooling. The biomass-burning plant has been built across the road from the datacentre and connects to it via underground pipes.

Meanwhile in Russia, according to news agency Telecom Daily, nuclear power operator Rosenergoatom, which runs ten nuclear power plants with 33 reactors, is to supply the Udomlya facility. According to reports, it has offered Facebook and Google space on the upcoming campus, in order to help the American companies comply with new data residency laws.

Samsung unveils 128GB DDR4 memory modules for datacentres

Samsung Electronics says it is mass producing memory modules for datacentre and enterprise servers that could turbocharge cloud services.

It has published details, in a blog post, of double data rate 4 (DDR4) memory in 128-gigabyte (GB) modules. These, when installed in enterprise servers and datacentres, could significantly speed up processing in cloud computing applications, slashing response times, boosting productivity and raising the quality of service.

The new modules use ‘through silicon via’ (TSV), an advanced chip packaging technology that vertically connects DRAM chip dies using electrodes that penetrate the micron-thick dies through microscopic holes. Samsung first used the technique when it introduced its 3D TSV DDR4 DRAM (64GB) in 2014. TSV is used again in this new registered dual in-line memory module (RDIMM) which, claims Samsung, opens the door for ultra-high capacity memory at the enterprise level.

The 128GB TSV DDR4 RDIMM comprises a total of 144 DDR4 chips, arranged into 36 4GB DRAM packages, each containing four 20-nanometer (nm)-based 8-gigabit (Gb) chips assembled with TSV packaging technology.
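
Those figures tally: four 8Gb dies make a 4GB package, and 36 packages make 144GB of raw DRAM, which comes out at the advertised 128GB once ECC is set aside – assuming the standard registered-module arrangement of 64 data bits in every 72, which is our assumption rather than something Samsung’s announcement spells out. A quick sketch of the arithmetic:

```python
# Sanity check of the 128GB TSV RDIMM capacity figures. The ECC split
# (64 data bits in every 72-bit word) is a standard RDIMM convention
# and an assumption here; the chip counts come from the article.
chips_per_package = 4    # four 8Gb dies stacked per TSV package
gbits_per_chip = 8
packages = 36

package_gb = chips_per_package * gbits_per_chip / 8   # 8 bits per byte
raw_gb = packages * package_gb                        # total raw DRAM
data_gb = raw_gb * 64 / 72                            # ECC bits set aside

print(f"per package: {package_gb:.0f} GB")   # -> 4 GB
print(f"raw module:  {raw_gb:.0f} GB")       # -> 144 GB
print(f"usable data: {data_gb:.0f} GB")      # -> 128 GB, as advertised
```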

Unlike conventional chip packages, which interconnect die stacks with wire bonding, the TSV packages are interconnected vertically by electrodes passing through hundreds of microscopic holes. This creates a massive improvement in signal transmission speeds. In addition, Samsung’s 128GB TSV DDR4 module has a special data buffer function that improves module performance and lowers power consumption.

As a result, servers can reach 2,400 megabits per second (Mbps) – roughly twice their normal speed – at half the power usage. Samsung says it is now accelerating production of TSV technology and ramping up output of its 20nm 8Gb DRAM chips to improve manufacturing productivity.

“We will continue to expand our technical cooperation with global leaders in servers, consumer electronics and emerging markets,” said Joo Sun Choi, executive vice president of Memory Sales and Marketing at Samsung Electronics.

New Microsoft Trust Center aims to offer stability in shifting cloud industry

Microsoft has aggregated all the information about its cloud services into a single point of reference, in an attempt to clarify and simplify the increasingly ethereal nature of the cloud.

The announcement, made in a blog post on the company’s web site, comes in the same week that one of the new incarnations of HP, Hewlett Packard Enterprise (HPE), repositioned itself as a reseller of Microsoft’s Azure public cloud services.

With the cloud reshaping both the IT industry and the companies it serves, the software company turned cloud service vendor has moved to clarify the picture for enterprise buyers.

The new Microsoft Trust Center aims to unify all the different strands of its enterprise cloud services, as confused customers began to clamour for a single version of the truth instead of having to choose between multiple overlapping Microsoft resources. In the new scheme, the Microsoft Trust Center will be a consistent source of information about all its enterprise cloud services, such as Microsoft Azure, Microsoft Dynamics CRM Online, Microsoft Intune and Microsoft Office 365.

The Microsoft blog post says the Trust Center will be built on four foundations: security, privacy and control, compliance, and transparency. To this end, it will show cloud buyers how Microsoft’s cloud services observe international and regional standards, how they approach privacy and data protection, and what security features and functions they offer.

On Tuesday it was announced that HPE is to become a Microsoft Azure reseller partner, with HPE in return becoming a preferred cloud services provider when Microsoft customers need help. The arrangement, revealed by HPE CEO Meg Whitman in a quarterly analyst call, illustrates how the IT industry is being reshaped around the new hybridisation of computing services. It means HPE can sell its own hardware and cloud computing software to companies for the private, ‘on-premise’ part of the private-public combination, while the public cloud is provided by Microsoft’s Azure computing service.

Transparency, according to the Microsoft Trust Center blog, is to be one of the foundations of its cloud services.

How Silicon Valley is disrupting space

We tend to think of the Space Industry as quintessentially cutting edge. As such it feels awfully strange to hear somebody compare it to the pre-Uber taxi industry – nowadays the definition of an ecosystem ripe for seismic technological disruption.

Yet comparing the two is exactly what Sean Casey (Founder and Managing Director of the Silicon Valley Space Centre) does during a phone conversation ahead of his appearance at February’s IoT Data Analytics & Visualization event in Palo Alto.

“With all Silicon Valley things there’s kind of a standard formula that involves disruptive technologies and large markets. Uber’s that way. Airbnb is the same,” says Casey. “Space is dominated by a bunch of large companies, making big profits from the government and not really interested in disrupting their business. The way they’re launching their rockets today is the same way they’ve been doing it over the last forty years. The reliability has increased, but the price hasn’t come down.”

Nowadays, however, a satellite needn’t cost hundreds of millions of dollars. On the contrary, costs have even come down to as little as $150,000. Talk about economising! “Rather than spending hundreds of millions of dollars on individual satellites, we can fly hundreds of satellites at a greatly reduced cost and mitigate the risk of a single failure,” Casey says. In addition, he explains that these satellites have tremendous imaging and communications capabilities – technology leveraged from a very everyday source. “The amount of processing power that you can fly in a very small satellite comes from a tremendous processing power that we all have in our cell phones.”
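
To put rough numbers on that claim – taking $300m as a purely illustrative stand-in for ‘hundreds of millions’, since only the $150,000 figure comes from Casey – the economics look something like this:

```python
# Back-of-envelope constellation economics. Only the $150,000 small-sat
# cost is from the article; the conventional satellite price is an
# illustrative assumption standing in for "hundreds of millions".
small_sat_cost = 150_000
constellation_size = 100            # "hundreds of satellites"
big_sat_cost = 300_000_000          # assumed illustrative figure

constellation_cost = small_sat_cost * constellation_size
print(f"100 small satellites: ${constellation_cost:,}")        # $15,000,000
print(f"one conventional satellite: ${big_sat_cost:,}")
print(f"cost ratio: {big_sat_cost / constellation_cost:.0f}x")  # 20x
# The fleet also degrades gracefully: losing one unit costs 1% of the
# capability, not all of it.
```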

Entrepreneur Elon Musk was one of the first to look at this scenario, founding SpaceX. “Maybe he was bringing some new technology to the table,” says Casey, “but he’s basically just restructured his business to make launch costs cheaper.”

However, due perhaps in part to the historical proximity of the US government and the Space Industry, regulatory opposition to newcomers has been particularly strident. It is a fact that clearly irritates Casey.

“Elon Musk has had to fight regulatory obstructions put up by people in Washington that said we want to keep you out of the business – I mean, how un-American is that? We’re supposed to be a capitalist country that embraces new opportunity and change. Get a grip! That stuff is temporary, it’s not long term. The satellite industry is often reluctant to fly new technologies because they don’t think they can sell that approach to their government customers.”

Lower prices, meanwhile, open the door to new customers and new use cases – often moving hand-in-hand with developments in analytics. This brings us to perhaps the most interesting aspect of a very interesting discussion. On the one hand, a number of immediately feasible use cases come to Casey’s mind – analysing the flow of hospital visits to anticipate epidemics, for example, not to mention a host of economic usages, such as recording and analysing shipping, resources, harvests and more…

On the other hand, while these satellites will certainly offer clients a privileged vantage point from which to view and analyse the world (we don’t refer to the ‘bird’s eye view’ for nothing), precisely what will be discovered and put to use up there in the coming years remains vague – albeit in a tantalising sort of way.

“It’s one of those things that, if you’ve never looked at it. If you’ve never had that data before, you kind of don’t know what you’re going to find. After this is all played out, you’ll see that this was either a really big step forward or it was kind of a bust and really didn’t amount to anything. It’s sort of like asking the guys at Twitter to show that their company’s going to be as big as it became after they’d done their Series A Financing – because that’s where these satellite companies are. Most of them are Series A, some of them are Series B – SpaceX is a lot further on.”

One thing that looks certain is that Silicon Valley is eyeing up space as its Final Frontier. From OneWeb and O3b founder Greg Wyler’s aspiration to connect the planet, to Google’s acquisition of Skybox and Monsanto’s acquisition of Climate Corp – plus a growing number of smaller investments in space-focussed start-ups, not to mention the aforementioned SpaceX and Amazon’s more overt investment in rocket science, capitalism is coming to the cosmos.

Sean Casey will be appearing at IoT Data Analytics & Visualization (February 9 – 11, 2016, Crowne Plaza Palo Alto).

Exploding Open Source — CIO Lessons from the Edge By @ABridgwater | @CloudExpo #Cloud #BigData

The combined notions of open source and the ‘community contribution’ model of collaborative software application development are, of course, not new. The history of open source can be traced back to early software exchanges between universities, driven by academic principles of knowledge sharing, in the 1960s. Some decades afterwards (on August 25, 1991), Finnish computer science student Linus Torvalds announced Linux… and the rest is history.

Secure Authentication for Public Wi-Fi | @CloudExpo #IoT #BigData

Wi-Fi has become a necessity of the digital age, and like everything, everyone loves it even more when it is free. Whether it’s used to access a presentation at a new client meeting, to host a video conference call, or to edit and email important documents, public Wi-Fi means nearly anywhere can become an office. Couple this with the fact that there are as many mobile devices on the planet as there are people, and businesses now have the most flexible and tech-saturated workforce in history. However, public Wi-Fi networks are by their very nature a hotbed for silent cyber attacks, as a business’s sophisticated security systems have no effect there to protect users.

Public cloud growing at pace – but hidden issues remain

Public cloud adoption will continue to grow at a rapid pace, but a lack of understanding will inhibit adoption in some organisations, according to the latest research from hybrid IT monitoring provider ScienceLogic.

The research, which polled more than 1,600 IT professionals, found 62% of organisations are already using at least one public cloud, while more than four in five (83%) expect public cloud spend to increase in the next 12 months. Of the main public cloud providers, Amazon Web Services (58%), not altogether surprisingly, is the most popular among survey respondents, followed by Microsoft Azure (43%) and Google Cloud Platform (13%).

Yet the research found organisations lack advanced visibility, monitoring and infrastructure control in their public cloud environments. Some 82% of those polled said they were unable to ensure optimum performance, health, and availability of their public cloud workloads, while almost half (46%) said they did not know how to, or simply do not, proactively monitor their workloads.
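
Proactive monitoring need not be elaborate, either. As a minimal sketch of what it can look like on AWS – the survey’s most popular provider – assuming boto3 is installed and credentials are configured (the instance ID and 80% threshold are hypothetical examples):

```python
# Minimal proactive-monitoring sketch for a public cloud workload on AWS.
# Assumes boto3 is installed and credentials are configured; the instance
# ID and the 80% alert threshold are hypothetical examples.
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

# Fetch the last hour of average CPU utilisation for one EC2 instance.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,                  # one datapoint every five minutes
    Statistics=["Average"],
)

# Flag datapoints that breach the threshold before users feel the pain.
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    if point["Average"] > 80.0:
        print(f"{point['Timestamp']}: CPU at {point['Average']:.1f}%")
```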

This, of course, could spell danger – and for many organisations, it already has. Half of the firms polled said they have experienced at least one complete network outage in the past 12 months, with more than a quarter (27%) reporting more than two hours of downtime per event. On average, organisations lose about $3.9 million annually – approximately $12,000 per minute of downtime – as a result of network outages.
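
Taken together, those two figures imply a concrete amount of downtime – roughly 325 minutes, or about five and a half hours a year – as a quick cross-check of the survey’s own numbers shows:

```python
# Cross-check of the survey's outage economics: $3.9m lost per year at
# roughly $12,000 per minute implies about 5.4 hours of outage annually.
annual_loss = 3_900_000      # average annual outage cost, USD
cost_per_minute = 12_000     # average cost of downtime, USD/minute

minutes = annual_loss / cost_per_minute
print(f"implied downtime: {minutes:.0f} minutes (~{minutes / 60:.1f} hours) a year")
# -> 325 minutes, about 5.4 hours
```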

“As hybrid IT and multicloud usage becomes mainstream for organisations, so does the need to simplify workload visibility and management for IT teams,” said Dave Link, ScienceLogic CEO. “Without this deep visibility of dependencies, organisations risk losing millions per year due to network outages that could have been prevented or shortened with the use of monitoring tools.”

According to figures from Synergy Research back in September, the public cloud continues to make significant inroads into the overall IT market, generating almost £13 billion in quarterly revenues for IT firms.