Look closer to home for the biggest cloud security issue, SME execs told

Picture credit: iStockPhoto

The biggest threat to cloud security is a company’s own employees unintentionally exposing data, according to the latest research figures from CloudEntr.

The study, which gathered 438 survey responses from industries including financial services and manufacturing, found that three quarters (75%) of smaller businesses are most worried about their own workforce when it comes to securing data in the cloud. Larger IT firms were more concerned about hackers using employee credentials to get their hands on data.

Not surprisingly, regulated institutions were more concerned about cloud compliance than non-regulated ones, but 75% also said their biggest tool for becoming more secure was employee education.

Employee education is not the only issue; shadow IT also continues to be a problem. 29% of those polled said they had no plans to use the cloud in their organisations, yet within that group, nearly half of IT pros said they knew of employees who were using it anyway.

The vast majority (89%) of IT pros questioned said they were concerned about cloud security, and 63% rated security as more important than convenience in a cloud solution.

There have been various vulnerabilities and outages in recent days, from the Docker vulnerability recorded earlier this week to Microsoft Azure’s downtime from a bug which slipped through the testing process. Dejan Lukan, writing for CloudTech earlier this week, cited data breaches and data loss as some of the most serious threats to organisations, along with a lack of understanding.

“Enterprises are adopting cloud services in everyday operations, but it’s often the case they don’t really understand what they are getting into,” he wrote.

“When moving to the cloud there are different aspects we need to address. If the [cloud service provider] doesn’t provide additional backup of the data, but the customer expects it, who will be responsible when the hard drive fails? The customer will blame the CSP, but in reality it’s the customer’s fault, since they didn’t familiarise themselves enough with the cloud service operations.”

The IT pros surveyed by CloudEntr are making it their priority to change this: 89% of respondents who have been impacted by security breaches said they plan primarily to educate their employees in the next year.

What do you make of this data? Are you worried about what your workforce is doing in the cloud?

Docker vulnerability exposed, users urged to upgrade for cloud security

Picture credit: iStockPhoto

Docker, the Linux container technology for run-anywhere apps, has a major vulnerability in all but the latest version of its software which can enable a malicious image to write files to arbitrary paths on the host.

The vulnerability, described as ‘critical’ in severity, was first spotted by Red Hat security researcher Florian Weimer and independent researcher Tõnis Tiigi, with Docker crediting both in a security advisory.

“The Docker engine, up to and including version 1.3.1, was vulnerable to extracting files to arbitrary paths on the host during ‘Docker pull’ and ‘Docker load’ operations,” it reads. “This was caused by symlink and hardlink traversals present in Docker’s image extraction.

“This vulnerability could be leveraged to perform remote code execution and privilege escalation,” it added.

The advisory noted there was no workaround for the issue, and urged users to upgrade to the latest iteration.

This wasn’t the only bug in the system, either. A second issue, affecting versions 1.3.0 and 1.3.1, allows a malicious image creator to modify the default run profile of containers; this too has been fixed in the current version.
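To illustrate the class of flaw the advisory describes – not Docker’s actual patch – here is a minimal Python sketch of the kind of check an image extractor needs: every entry’s resolved destination, and the target of any symlink or hardlink, must stay inside the extraction directory. The function and archive names are hypothetical.

```python
import os
import tarfile

def safe_extract(archive_path, dest):
    """Extract a tar archive, rejecting entries that would escape dest
    via '..' components, absolute paths, or symlink/hardlink targets."""
    dest = os.path.realpath(dest)
    with tarfile.open(archive_path) as tar:
        for member in tar.getmembers():
            # Where would this entry actually land on disk?
            target = os.path.realpath(os.path.join(dest, member.name))
            if not (target == dest or target.startswith(dest + os.sep)):
                raise ValueError("blocked path traversal: %s" % member.name)
            if member.issym() or member.islnk():
                # Symlink targets resolve relative to the entry's directory;
                # hardlink targets resolve relative to the archive root.
                base = os.path.dirname(target) if member.issym() else dest
                link = os.path.realpath(os.path.join(base, member.linkname))
                if not (link == dest or link.startswith(dest + os.sep)):
                    raise ValueError("blocked link escape: %s" % member.name)
        tar.extractall(dest)
```

Because any link member pointing outside the destination is rejected outright, later entries cannot be routed through a previously planted symlink.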

The problem is compounded by the fact that the vast majority of major cloud computing providers have partnered with Docker in order to package sleek, secure applications on its platform. Microsoft announced its deal in October, with Google, Amazon Web Services and Rackspace also on board.

It’s easy to see why these vendors are buddying up: because Docker leverages the host’s operating system, there is no overhead or difficulty in spinning up virtual machines when shipping an application in its container. But as with a lot of nascent products hitting the zeitgeist, it’s best not to get carried away with an untested system when security scare stories are just around the corner.

Users are urged to upgrade to version 1.3.2 as soon as they can.

‘UP Your OPs Game’ By @DTzar | @DevOpsSummit [#DevOps @MS_ITPro]

Want to enable self-service provisioning of application environments that mirror production, in minutes?
Can you automatically provide rich data with code-level detail back to the developers when issues occur in production?
In his demo-heavy session at DevOps Summit, David Tesar, Microsoft Technical Evangelist for Microsoft Azure and DevOps, will discuss how to accomplish this and more using technologies such as Microsoft Azure, Visual Studio Online, and Application Insights.


Monetize This! Internet of Things By @MetraTech | @ThingsExpo [#IoT]

The Internet of Things will greatly expand the opportunities for data collection and the new business models driven by that data. In her session at @ThingsExpo, Esmeralda Swartz, CMO of MetraTech, discussed how, for this to be effective, you not only need infrastructure and operational models capable of utilizing this new phenomenon; increasingly, service providers will also need to convince a skeptical public to participate.
Get ready to show them the money!


How DevOps can improve reliability when deploying with AWS

Picture credit: iStockPhoto

Reliability, in the cloud technology era, can be defined as the likelihood that a system will provide uninterrupted, fault-free service, as expected, within the constraints of the environment in which it operates. According to the Information Technology Laboratory (ITL) bulletin on cloud computing published by the National Institute of Standards and Technology (NIST), reliability in cloud environments is composed of the following:

  • The hardware and software services offered by the cloud service provider
  • The provider’s and consumers’ personnel
  • The connectivity to the services

In other words, reliability depends primarily on connectivity, on the availability of services when needed, and on the personnel managing and using the system. The focus of this article is how DevOps can improve reliability when deploying in the cloud, particularly in Amazon Web Services (AWS) environments.

DevOps can, in fact, help improve the reliability of a system or software platform within the AWS cloud environment. On the connectivity side, AWS offers the Route 53 DNS routing service, the Direct Connect service for dedicated connections, and Elastic IP addresses. For assurance of availability, AWS offers Multi Availability Zone configurations, Elastic Load Balancing, and secure backup storage with Amazon S3.
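As a rough illustration of one of those connectivity pieces, the sketch below uses the boto3 Python SDK to allocate an Elastic IP address and bind it to an instance; the region and instance ID are placeholders, not values from the article.

```python
import boto3

# Hypothetical region and instance ID, for illustration only.
ec2 = boto3.client("ec2", region_name="us-east-1")

# Allocate an Elastic IP in the VPC scope...
allocation = ec2.allocate_address(Domain="vpc")

# ...and bind it to an instance, so the public address survives
# instance replacement instead of changing with each new VM.
ec2.associate_address(
    InstanceId="i-0123456789abcdef0",
    AllocationId=allocation["AllocationId"],
)
print("Elastic IP:", allocation["PublicIp"])
```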

DevOps tooling can be used to configure, deploy and manage these solutions, and the personnel issues around reliability are practically non-existent thanks to the automation brought about by AWS DevOps offerings such as OpsWorks.

OpsWorks makes it possible to maintain the relationships between deployed resources persistently, meaning that an IP address elastically allocated to a resource (for example, an EC2 instance) is still maintained when the instance is brought back online, even after a period of inactivity. Not only does OpsWorks allow an organization to configure the instance itself; it also permits on-demand configuration of the software at any point in the software’s lifecycle, taking advantage of built-in and custom Chef recipes.
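A minimal sketch of that kind of configuration through the OpsWorks API might look like the following (again via boto3; the stack ID, layer name and recipe names are hypothetical):

```python
import boto3

ow = boto3.client("opsworks", region_name="us-east-1")

# A custom layer whose instances are configured by Chef recipes
# at each lifecycle event, with auto-healing switched on.
layer = ow.create_layer(
    StackId="STACK_ID",                  # assumed pre-existing stack
    Type="custom",
    Name="app-servers",
    Shortname="app",
    EnableAutoHealing=True,              # replace failed instances
    CustomRecipes={
        "Setup": ["myapp::dependencies"],
        "Configure": ["myapp::configure"],
        "Deploy": ["myapp::deploy"],
    },
)
print("Layer:", layer["LayerId"])
```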

Deployment of the application and the infrastructure is seamless with OpsWorks: applications can be configured and pointed at a repository from which the source is retrieved, built according to the configuration templates, and deployed with minimal to no human interaction.
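Continuing the sketch, registering an app against a source repository and triggering a deployment could look like this (the repository URL and IDs are placeholders):

```python
import boto3

ow = boto3.client("opsworks", region_name="us-east-1")

# Register an app whose source lives in a (hypothetical) git repository.
app = ow.create_app(
    StackId="STACK_ID",
    Name="web-app",
    Type="other",
    AppSource={
        "Type": "git",
        "Url": "https://example.com/acme/web-app.git",
        "Revision": "master",
    },
)

# Deploy: OpsWorks pulls the revision and runs the deploy-stage
# recipes on every instance in the stack, no manual steps required.
ow.create_deployment(
    StackId="STACK_ID",
    AppId=app["AppId"],
    Command={"Name": "deploy"},
)
```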

Additionally, the auto-healing and automatic scaling features of AWS are perhaps what give AWS deployments their greatest reliability: when an instance fails, OpsWorks can automatically replace it with a new one, and can scale up or down as demand for the resource fluctuates. This helps ensure uninterrupted, failure-free availability of the services offered to connected consumers.
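Load-based scaling thresholds for a layer can be set in the same vein; the values below are illustrative only:

```python
import boto3

ow = boto3.client("opsworks", region_name="us-east-1")

# Hypothetical thresholds: add an instance when average CPU across
# the layer stays above 80% for 5 minutes; remove one when it stays
# below 20% for 10 minutes.
ow.set_load_based_auto_scaling(
    LayerId="LAYER_ID",                  # hypothetical layer ID
    Enable=True,
    UpScaling={"InstanceCount": 1, "CpuThreshold": 80.0,
               "ThresholdsWaitTime": 5},
    DownScaling={"InstanceCount": 1, "CpuThreshold": 20.0,
                 "ThresholdsWaitTime": 10},
)
```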

So, with a DevOps approach to deploying cloud apps in an AWS environment, it may very well be that while the shift to cloud computing tests the reliability of the technology, in reality it is not something one needs to be overly concerned about. Try the DevOps approach to deployment and see for yourself how reliable it can be.


Software Licensing in the Cloud By @FlexeraSoftware | @CloudExpo [#Cloud]

The cloud has made many things simpler, allowing us to achieve economies of scale, eliminating the software installation process and enabling access to information from virtually anywhere. The list of benefits goes on, but contrary to popular belief, moving to the cloud will not make software licensing – and the challenges that go along with it – disappear. Instead, the cloud introduces a new set of software delivery models and licensing models that can make license management a challenge for the unprepared.
Cloud technologies need to be carefully weighed by organizations considering new delivery models such as Software as a Service (SaaS) and Infrastructure as a Service (IaaS). Closer examination of the licensing rules tied to the type of cloud under consideration (public, private or hybrid) is also essential; each has its own set of license management ramifications.


Securing Data in the Cloud with @VerizonAnne | @CloudExpo [#Cloud @VZCloud]

“Verizon offers public cloud, virtual private cloud as well as private cloud on-premises – many different alternatives. Verizon’s deep knowledge in applications and the fact that we are responsible for applications that make call outs to other systems. Those systems and those resources may not be in Verizon Cloud, we understand at the end of the day it’s going to be federated,” explained Anne Plese, Senior Consultant, Cloud Product Marketing at Verizon Enterprise, in this SYS-CON.tv interview at 15th Cloud Expo, held Nov 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.


Moving to the cloud is “so easy your mother could do it”, exec claims

Picture credit: iStockPhoto

Danvers Baillieu, chief operating officer at VPN provider hidemyass, says he can’t understand companies that are reluctant to move to the cloud, because it’s so easy to set up that “your mother could do it.”

Baillieu spoke to CloudTech after research from Reconnix earlier this month found the majority of UK businesses weren’t ready to move to the cloud, with a lack of in-house skills cited as the main barrier.

“I assume nobody is saying they can’t move over to Gmail apps because it’s so easy to set up that your mother could do it,” he says. “The whole point of a lot of those services [is] it requires no IT skills.

“I think actually the resistance from the IT department comes because they see themselves being done out of a job when half of these things get implemented,” he adds.

Baillieu asserts his company’s data has been safer in a remote data centre since hidemyass migrated, even though he understands people are ‘always reticent about innovation’.

“We got to the point where we needed a development server,” he explains. “We bought a server and stuck it in our office, and we found it less convenient to do that.

“One of the things we talked about when we had this server in our office [was] what happens if our office burns down, what happens if our office gets broken into, and all of these physical things?” he adds. “All of those concerns were front of mind, and when we moved it to a data centre that does all of this stuff really well, we were much happier about it.”

A more pressing recent concern is where data resides for EU customers. After Microsoft was forced by a court order to hand over to the US government a customer email stored in its Dublin data centre, it wasn’t unreasonable to expect alarm bells to ring among customers – and the cloud vendors have responded.

Having previously been a lawyer, Baillieu wryly notes that there are ‘always people who want to tell you why something can’t be done’. Yet he adds: “The big providers have obviously had pushback from their customers on these sorts of issues, and they adapt.

“Once it’s up, it’s a lot easier than having to worry about racks overheating in your office.”

‘Cloud Consumption’ with @Solgenia_Corp | @CloudExpo [#Cloud]

“Cloud consumption is something we envision at Solgenia. That is trying to let the cloud spread to the user as a consumption, as utility computing. We want to allow the people to just pay for what they use, not a subscription model,” explained Ermanno Bonifazi, CEO & Founder of Solgenia, in this SYS-CON.tv interview at Cloud Expo, held Nov 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.


G20: Mexico Must Improve Its ICT

Mexico could do much better in its national commitment to ICT, according to our research at the Tau Institute. As a G20 nation, Mexico’s President Enrique Pena Nieto was recently at the G20 Summit in Brisbane, Australia, where a comprehensive strategy plan was presented to all delegates. The plan discussed job growth and several areas of reform. It did not directly address ICT, other than a mention of fledgling telco reform.

Believing as we do that economic and societal development over the long-term are a function of a commitment to a strong ICT environment, we encourage leaders in all sectors and strata of Mexican society to redouble their efforts in this area.

Yet that sentiment may seem irretrievably glib given the difficulties facing Mexico, which must be viewed through the lens of the country’s difficult history and relationship with the United States.

Can’t Avoid This
We avoid political discussions; for an American to plunge into the pool of political discussion about the US and Mexico may seem to be very dangerous and stupid. But the elephant in the room is too large to avoid when it comes to discussing the US and Mexico. So here goes…

Over the course of my career, I remember Mexican Presidents Lopez Portillo, de la Madrid, Salinas de Gortari, Zedillo, the very consequential Vicente Fox, and Calderon (as well as his constitutional battle with Lopez Obrador) quite well. I know less about current leader Pena Nieto, and that is perhaps how he wants it. He seems to be a bit of a cipher to all.

But I don’t recall any great moments between the leaders of Mexico and the United States. The major highlight probably came when Presidents Clinton and Salinas de Gortari edged toward their respective political centers to agree to NAFTA. But two decades after its ratification, NAFTA has had only a modest effect in helping the economies of the US and Mexico grow.

Meanwhile, the flow of undocumented immigrants and drugs has continued. At 12 million, there are now twice as many illegal immigrants from Mexico in the US as there were then; there are narco wars that define Mexico in the perception of many people; and there is a drug abuse situation in the United States that has hardly abated.

Mexico is struggling with legislation to increase competition within its telco sector, which is currently Dominated with a capital “D” by Mexican billionaire Carlos Slim. I can’t imagine this will be easily accomplished. Slim seems determined to create an image for himself that makes the big, dominant American telcos seem like purveyors of free chocolate and kittens in comparison.

Talking Past One Another
The immigration debate, such as it is in the United States, presents a classic example of people talking right past one another, with no intent to engage in a serious discussion. It also seems as if Mexico’s government at the highest levels has little incentive to jump into this debate; the steady flow of impoverished people into the US relieves it of responsibility for these people and augments the flow of capital into the country to the tune of more than $20 billion per year – far more, for example, than tourism generates for the country.

Furthermore, Mexico remains a nation unto itself. Its leaders have never seemed to aspire to a leadership role within Latin America, perhaps due in no small part to the country’s geographical separation from South America. Mexico is a North American nation, with its own unique history and revolution(s) from its colonial past.

The Crux
Which leads us to the chafing point between Mexico and the US – the Texas Annexation, the Mexican-American War, and the Treaty of Guadalupe Hidalgo, all from the 1840s. Mexico lost more than half of its territory during that time, although to be fair, a large percentage of that was the very thinly settled territory that became California and other Western states after becoming part of the US.

Mexico has passive-aggressively tried to reclaim its lost lands to the present day, and the US has passive-aggressively ignored the issue. The informal push known as the Reconquista is dismissed as a fringe movement by commentators on both sides of the border, yet the topic lurks in the shadows while never being formally addressed by the respective nations’ leaders.

President Obama’s recently announced effort to make sense of illegal immigration from Mexico is only the latest of such presidential efforts, dating back to the original bracero agreements between the US and Mexico in 1942.

I’ve seen no response from Mexico’s government, which is preoccupied at the present moment with a telenovela involving murdered students, a house owned by the First Lady, and ongoing accusations of corruption at all levels of government. Of course, the US government is preoccupied with a few domestic and global issues as well. The US-Mexico relationship is not at the top of the list for either country.

How to Proceed
So how do technology companies proceed? There is already significant foreign direct investment in Mexico, more than $100 billion per year. NAFTA has improved supply chains and other corporate efficiencies across the border.

And there is certainly plenty of potential for continued economic improvement, as a middle class continues to grow despite obstacles in Mexico. If the telco nut can be cracked, we would expect overall Internet access and bandwidth to improve dramatically throughout Mexico.

Mexico ranks 73rd overall in our rankings, out of 103 nations surveyed, behind Brazil and Peru, behind South Africa, and behind the Philippines, for example. It does poorly in its income tier as well, finishing 14th among 19 nations, just ahead of Argentina. In our “Goldilocks Index,” which measures how hot countries’ technology factors are running, Mexico runs too cold. It’s on a par with Paraguay in this area.

Yet given its overall wealth, Mexico remains only a middling challenge to improve overall, despite its large population. It’s on a par with Turkey in this category; not great, but achieving significant progress in Mexico’s ICT infrastructure does not rank among the great challenges of the world. Places such as India, Pakistan, Indonesia, the Philippines, South Africa, and Nigeria offer far steeper challenges.

This is what our numbers say; they do not account for the elephant in the room. And as I write this, the non-dialogue between the two nations continues. There are efforts in the NGO space and within academia to improve the dialogue. Our research at the Tau Institute aims to do the same, not only between Mexico and the US, but among the nations of the world. But until political leaders at the highest levels of Mexico and the United States take such a dialogue seriously, nothing of consequence can happen, in our opinion.

As always, our overall results are meant to start conversations, not offer quick solutions. We welcome the opportunity to dive into all of our numbers and develop more substantive analyses of Mexico’s present and future place along the ICT continuum.
