SAP’s Bernd provides an overview of the myriad IoT devices that will be arriving on our doorsteps in the coming months and years. Some have already made their presence felt, and more are on the threshold. As far as business value is concerned, it appears to be a deluge. It is anticipated that billions of devices will be interconnected, with lasting consequences for our private lives. From the list of IoT devices collected in this article there is but one conclusion: this is not just the IoT but the IoE (the Internet of Everything).
Monthly Archives: December 2014
Memory Optimization By @SproutCore | @DevOpsSummit [#DevOps]
The next release candidate for 1.11.0 will be out very shortly, but I thought it best to post a brief update on the past week’s work as this week saw a concentrated effort on core optimization.
First, we took another look at the use of arguments lists throughout the framework and found several more occurrences of the `arguments` object being accessed inefficiently. Depending on the browser, accessing `arguments` in a way that causes it to be allocated can be up to 80% slower, so it’s really good to have these all fixed.
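As a hypothetical illustration (not SproutCore’s actual code), the kind of pattern involved looks like this: letting `arguments` escape to another function forces the engine to materialize a real arguments object on every call, while rest parameters (or an indexed copy) avoid that allocation entirely.

```typescript
// Leaky pattern (illustrative): passing `arguments` to another function
// forces the engine to allocate a real arguments object, which can
// deoptimize the call site in many engines.
function sumLeaky(..._nums: number[]): number {
  // `arguments` escapes here, triggering allocation.
  return Array.prototype.slice
    .call(arguments)
    .reduce((total: number, n: number) => total + n, 0);
}

// Faster pattern: rest parameters plus an indexed loop keep `arguments`
// from ever being materialized.
function sumFast(...nums: number[]): number {
  let total = 0;
  for (let i = 0; i < nums.length; i++) total += nums[i];
  return total;
}
```

Both functions compute the same result; the difference only shows up in engine-level allocation behavior, which is why profiling-driven sweeps like the one described above are needed to find the slow variants.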
Should You Move SharePoint Data to the Cloud? By @ProfessionalAdv | @CloudExpo [#Cloud]
With the advent of cloud computing, Microsoft SharePoint had graduated to the cloud, but take-up was slow – according to research firm IDC, only around 4% of offices at the time were using virtual applications to manage productivity.
So should your business stick with managing its SharePoint data in-house or move to the cloud?
The cryptic cloud: Can cloud encryption operate effectively right now?
Encryption in the digital world is akin to a safe in the physical world. Data is locked away and can only be seen by those who have the correct key. Among other things, encryption is what provides an assurance of confidentiality in data security and it is fast gaining ground in the cloud. But is encrypted data therefore more secure? Not if your keys are transferred in the clear, duplicated or mismanaged.
Data that goes to the public cloud is usually transferred securely, and files are not kept on public web servers, so the obvious security measures are in place. But once data reaches the storage server, it is beyond the user’s reach or control. It may or may not be stored encrypted; it may or may not be readable by the service administrator; it may or may not be delivered to influential third parties, such as governments or associated agencies. It may be compromised if the servers are broken into, or accessed if the servers are physically seized.
Key holder is king
Any reasonable storage provider that offers encryption will store each customer’s data encrypted with its own unique encryption key. The deal is that the provider holds the data and the user holds the key – normally derived from the account’s password using a key-derivation algorithm (PBKDF2 is a good example).
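A minimal sketch of that derivation, using Node’s built-in `crypto` module (the iteration count, salt size, and password here are illustrative choices, not any particular provider’s parameters):

```typescript
import { pbkdf2Sync, randomBytes } from "crypto";

// Derive a 256-bit encryption key from an account password with PBKDF2.
// The salt is stored alongside the ciphertext; it is not secret, it just
// ensures two accounts with the same password get different keys.
function deriveKey(password: string, salt: Buffer): Buffer {
  return pbkdf2Sync(password, salt, 100_000, 32, "sha256");
}

const salt = randomBytes(16);
const key = deriveKey("correct horse battery staple", salt);
```

The same password and salt always yield the same key, which is what lets the user re-derive it at each login without the key itself ever being stored.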
But here’s where things get tricky: usually encryption of stored data occurs on the provider’s servers, which means that during the process of encryption they must hold the encryption key. We, as the customer, have no choice but to trust the service provider.
In this encryption model, the provider will at certain moments hold its users’ encryption keys, which it requires in order to encrypt and decrypt information. The user holds the key and the provider holds the encrypted data; whenever users need to access their pool of data, they must lend the key to the provider in a ‘security by proxy’ model. This happens simply by logging in to the service.
Online storage providers should seek to offer assurances until encryption comes of age
Web applications and compiled programs are no different. The password travels within a point-to-point encrypted tunnel (usually SSL/TLS), which means it exists in memory in plain text at both the client and the server. The key has been copied.
So if the provider has other, less clear, less honest intentions, or is coerced, it can keep copies of encryption keys and decrypt users’ data. Customers cannot do anything to mitigate this without additional mechanisms. For instance, a TrueCrypt volume would solve the problem, although this solution has a number of downsides and a significant loss of flexibility, not to mention potential integrity issues, particularly when attempting to share files in the TrueCrypt volume. If such tools are not used, the simple truth is that data held by online storage providers, encrypted or not, is held purely on trust.
Data security vs trust
Issues of trust aside, a discussion needs to take place about the merits of data security versus the implications of security by proxy.
If a user requires data security within online storage, then they should not use a security by proxy model. They must use a method that ensures the provider never has sufficient information to decrypt, or facilitate decryption of, stored customer data. Can an encryption model be implemented in which the provider never holds sufficient information in its systems to decrypt that data? Yes: there are protocols and methods for doing this, and no doubt many more solutions are in the pipeline.
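One such method, sketched below as a hypothetical Node-style client (the function names and parameter choices are invented for illustration), is to encrypt locally before upload, so only ciphertext plus the non-secret salt and IV ever reach the provider:

```typescript
import {
  createCipheriv,
  createDecipheriv,
  pbkdf2Sync,
  randomBytes,
} from "crypto";

// Illustrative client-side encryption: the key is derived locally from the
// password and never leaves the machine; the provider stores only ciphertext.
function encryptForUpload(plaintext: Buffer, password: string) {
  const salt = randomBytes(16);
  const iv = randomBytes(12);
  const key = pbkdf2Sync(password, salt, 100_000, 32, "sha256");
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext), cipher.final()]);
  // salt, iv, and authTag are not secret and can be stored with the blob.
  return { salt, iv, authTag: cipher.getAuthTag(), ciphertext };
}

function decryptAfterDownload(
  blob: { salt: Buffer; iv: Buffer; authTag: Buffer; ciphertext: Buffer },
  password: string
): Buffer {
  const key = pbkdf2Sync(password, blob.salt, 100_000, 32, "sha256");
  const decipher = createDecipheriv("aes-256-gcm", key, blob.iv);
  decipher.setAuthTag(blob.authTag); // GCM verifies integrity on final()
  return Buffer.concat([decipher.update(blob.ciphertext), decipher.final()]);
}
```

Under this scheme the provider holds an opaque blob: without the password, neither it nor anyone who compels it can recover the plaintext. The trade-off, as noted above, is flexibility – sharing and server-side features become much harder.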
But as of today, no mainstream service offers such a model, as many providers are required to retain access so they can surrender information to the authorities.
Current cloud storage services that provide encryption, or add-on encryption services, follow the ‘security by proxy’ model. In all such cases, users get no assurance that the provider cannot access their data: the provider can, with little or no effort, read its customers’ data if it wants to.
It’s for this reason that data held in remote storage should always be assessed by the user to determine its value should it be compromised. (If the celebrities whose naked selfies were leaked online had paused for thought in this respect, the Celebgate fiasco could have been prevented.) In the meantime, online storage providers should seek to offer assurances until encryption comes of age.
China to assess trustworthiness of cloud providers
Cloud computing providers in China are to be assessed for their trustworthiness according to a report – which could mean bad news for foreign CSPs.
According to a report from China Daily, the new government system will be similar to the FedRAMP accreditation used in the US. Only firms that pass the assessment in full will be allowed to take part in government projects, according to Zuo Xiaodong, vice president of the China Information Security Research Institute.
“The basic idea of the security rating mechanism is to find trustworthy hardware, software and service providers to ensure that the government has total control of the entire ecosystem,” he said.
The overall effect, the report argues, could make it more difficult for overseas cloud vendors to break into the Chinese market. They will be allowed to take part in the assessment, but may be required to provide further data for security reasons. According to sources cited in the report, China may want overseas IT providers out of the government procurement market by 2020.
The China Daily report quotes Wang Zhengfu, chief operating officer at server provider Sugon, who describes the development as a “golden opportunity” to take on foreign companies.
Many are looking to gain a foothold in China, which is expected to be a powerhouse in driving ICT over the coming year. A recent IDC predictions report argued China will have “skyrocketing influence” on the global market, with 43% of industry growth.
Among the foreign service providers in China are Amazon Web Services, which announced its roadmap this time last year; CenturyLink, which opened a Shanghai data centre in October; and Interoute, which launched a new virtual data centre in nearby Hong Kong.
A report last year on behalf of the US-China Economic and Security Review Commission discussed potential security concerns should US consumer uptake of Chinese cloud computing services increase. This sounded a warning on both sides about intertwining US and Chinese services – and with this latest development, the push to separate them could be full speed ahead.
FBI Provides An Update on Sony Hack By @BobGourley | @CloudExpo [#Cloud]
By Bob Gourley
The FBI just posted the first official written articulation of why they believe North Korea is linked to the ongoing Sony Hack. As someone who has worked with FBI investigators in the past I have to tell you they do not go public like this unless they have evidence. It is true they have been […]
Is Cloud Security an Oxymoron? By @GiladPN | @CloudExpo [#Cloud]
Cloud computing is increasingly becoming part of the enterprise IT landscape. In fact, a recent cloud security survey conducted by HP reveals that 70 percent of all respondents say their company is using some form of the cloud. The study also found that cloud penetration jumps to 80 percent for enterprise-size organizations. With such dramatic growth figures, it is no wonder that enterprise companies are carefully reviewing the implications of cloud security policies for their cloud data. Here’s another fact from that same survey: while more than half of those using the public cloud are confident that critical and sensitive data can be secured there, 16 percent of all companies in the cloud reported at least one public cloud breach in the past 12 months.
Agile Development for the Database By @Datical | @DevOpsSummit [#DevOps]
Few companies are able to implement Agile development across the organization because a chasm exists between the application development team and the database administrators.
All too often, database development is an afterthought in Agile development. Developers perfect how best to implement a solution in code, but tend to spend too little time on how that solution is represented in the database. This is unfortunate given that, according to independent research conducted by Simon Management Group, 65% of change requests require changes to the application schema. This finding implies that database changes are every bit as important as application code, and should be treated as tier-one artifacts in your release process.
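One way to treat schema changes as tier-one release artifacts, sketched here hypothetically (the file names and version-prefix convention are invented for illustration), is to keep migrations as ordered, versioned files and compute which ones are still pending for a given environment:

```typescript
// Hypothetical sketch: migrations as ordered, versioned artifacts.
// Given every migration in the repo and the set already applied to an
// environment, return the ones that still need to run, in version order.
function pendingMigrations(
  allMigrations: string[],
  applied: string[]
): string[] {
  const done = new Set(applied);
  return [...allMigrations].sort().filter((name) => !done.has(name));
}
```

With the pending set computed per environment, schema changes can move through dev, test, and production in the same pipeline as application code, rather than being applied by hand.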
If You Can’t See It, You Can’t Manage It By @AppDynamics | @DevOpsSummit [#DevOps]
“There was 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created every 2 days, and the pace is increasing…,” – Eric Schmidt, Former CEO, Google. If IT leaders hadn’t already heard Schmidt’s famous quotation, today they are definitely facing the challenge he describes. Gone are […]
The post If you can’t see it, you can’t manage it – ITOA use case #1 written by Marcus Sarmento appeared first on Application Performance Monitoring Blog from AppDynamics.
2015 Will Be the Year of the Platform By @ServiceNow | @CloudExpo [#Cloud]
This past year has continued to see rising enterprise cloud adoption, and with it efforts from tech giants to control the cloud through price wars. Like most things, the path we’ve taken to this point will certainly impact where the road ahead will lead.
If 2014 was the continued battle for cloud infrastructure, 2015 will be the rise of the cloud platform, as enterprises focus on creating apps and workflows that take advantage of the growing platform options across enterprise needs such as HR, financial services and IT. The rise of the platform will bring with it a variety of other changes – all of which stem from more and more CIOs turning to the cloud to deliver substantial innovation and business results. From verticalization to data as a service, we’ll ring in the new year with dramatic changes to the cloud landscape that stand to transform how we view and utilize cloud technologies. Below are five enterprise cloud platform-inspired predictions for the coming year.