For most organizations, the move to hybrid cloud is now a question of when, not if. Fully 82% of enterprises plan to have a hybrid cloud strategy this year, according to Infoholic Research. The worldwide hybrid cloud computing market is expected to grow about 34% annually over the next five years, reaching $241.13 billion by 2022. Companies are embracing hybrid cloud because of the many advantages it offers compared to relying on a single provider for all of their cloud needs. Hybrid offers balance and flexibility. It helps companies achieve a wide array of business goals, including availability, reliability, security and cost-efficiency.
Monthly Archives: July 2017
How to Sponsor @CloudExpo | #DevOps #IoT #AI #DX #DigitalTransformation
21st International Cloud Expo, taking place October 31 – November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday’s debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy. Meanwhile, 94% of enterprises are using some form of XaaS – software, platform, and infrastructure as a service.
Announcing @CloudAcademy “Bronze Sponsor” of @CloudExpo | #AWS #DX #Azure
SYS-CON Events announced today that Cloud Academy has been named “Bronze Sponsor” of 21st International Cloud Expo, which will take place October 31 – November 2, 2017 at the Santa Clara Convention Center in Santa Clara, CA. Cloud Academy is the industry’s most innovative, vendor-neutral cloud technology training platform. Cloud Academy provides continuous learning solutions for individuals and enterprise teams for Amazon Web Services, Microsoft Azure, Google Cloud Platform, and the most popular cloud computing technologies. Get certified, manage the full lifecycle of your cloud-based resources, and build your knowledge base using Cloud Academy’s expert-created content, comprehensive Learning Paths, and innovative Hands-on Labs.
Report argues ‘concerning’ lack of understanding over IaaS shared responsibility models
It is a question almost as old as the concept: who should look after cloud security, the vendor or the customer? A new report from Barracuda Networks argues there is a ‘concerning’ lack of understanding with regard to the shared responsibility model for infrastructure as a service (IaaS) providers.
For Amazon Web Services (AWS) and Microsoft, the two leading IaaS providers, the meaning is clear. Microsoft points out the difference between software as a service (SaaS), platform as a service (PaaS), and IaaS. For IaaS, while the provider looks after physical security and shares demands on host infrastructure and network controls, as Microsoft puts it, the customer is responsible for app level controls, identity and access management, endpoint protection and data classification. AWS describes the vendor and customer as being responsible for security ‘of’ and ‘in’ the cloud respectively.
So why, therefore, did almost two thirds (64%) of the 550 EMEA IT decision makers polled by Barracuda say they believed securing customer data in the public cloud was the vendor’s responsibility? 61% believed the same around applications, 60% for operating systems, while only 57% said service providers control the physical security of infrastructure.
“The lack of clarity regarding organisations’ versus IaaS providers’ cloud security responsibilities creates grey areas that IT decision makers must address if they want to keep key data and systems secure,” the report noted.
Issues regarding public cloud security do seem to have been picked up by the organisations polled, however. 57% of all respondents said they had added additional security to their public cloud, with 37% saying they were planning to. The figure was highest in Belgium and the Netherlands – 70% affirmative – with the UK lowest on just 43%, albeit with 39% planning to invest.
Naturally, the advice from Barracuda was to ‘partner with a vendor agnostic security expert to advise on exactly which pieces of the IaaS puzzle are the customer’s responsibility’. “The bottom line is that organisations are continuing to invest in public cloud projects, but they need a trusted vendor-neutral partner to help them navigate the choppy waters of cybersecurity if they want to minimise risk in the process,” the report stated.
“With sweeping new European data protection regulations landing in May 2018, no organisation can afford to ignore security today.”
You can read a blog post here and download the full report (registration required) here.
Microsoft Could Lay off Thousands of Employees
In a clear sign of reorganization within the company, Microsoft has announced that it will be focusing more intently on its cloud services. The announcement said that Microsoft is reorganizing its global workforce to give more importance to cloud services than mere standalone pieces of software.
It’s not much of a surprise, for several reasons. First, its cloud business is doing amazingly well and has exceeded all expectations. At the same time, revenue from its traditional businesses has slowed. Putting all this together, cloud is clearly the future of the company, so it’s only right that it focuses its resources on its most profitable segments.
Second, reorganization has been taking place at fairly frequent intervals since Satya Nadella took over as CEO in 2014. Through his visionary ideas, he has made Microsoft a key player in the cloud. That shift has also necessitated many structural changes, which Nadella has made to improve the overall efficiency of the organization.
However, this is not good news for many employees who work here. According to a report by The Wall Street Journal, Microsoft plans to cut down thousands of jobs as a part of this latest reorganization. Most of these jobs are likely to be on the sales side.
Though the exact numbers and locations of the layoffs are not yet clear, the news is worrying to some extent, because the economies of the U.S. and other Western countries are only slowly stabilizing after the 2008 financial crisis. Unemployment rates are among the lowest in a decade. Considering these improvements in the economy, a mass layoff at Microsoft would be a spot of bother.
Also, countries like India and China are facing an economic slowdown, so layoffs in those regions could have an additional unpleasant effect.
Microsoft, though, expects no huge impact on its day-to-day business. So, we’ll have to wait and see what impact these layoffs will have on economies, and maybe even on the IT industry as a whole.
But one thing that we can infer is the growing might of cloud. All major companies believe that we’re moving into a new phase of technology that is likely to be dominated by cloud, artificial intelligence and machine learning. In fact, many analysts and economists are already warning about spikes in unemployment that can come when machines start performing many of the mundane jobs currently being done by lower and middle class residents of a country.
This doesn’t mean machines will take over from humans, as in Hollywood movies. Rather, it means the nature of jobs will change. Many lower-end jobs will be replaced by machines and robots, and people will move up the ladder to build those robots, or do other work that drastically improves the overall quality of our lives.
In this sense, there’s much to look forward to, even if Microsoft and others lay off people now.
The post Microsoft Could Lay off Thousands of Employees appeared first on Cloud News Daily.
How to tackle changing cloud security threats: A guide
IT workers face a serious challenge when it comes to file sharing. In one corner is corporate governance which seeks to protect businesses and prevent cyber-attacks. In the opposite corner are end users who want to work more efficiently – collaboratively – by sharing or saving files.
The best way of ending this conflict is to find middle ground. In attempting this resolution, enterprises need to find the right balance between IT security and governance on one side and the needs of employees on the other. To ensure cloud protection when storing or sharing files, businesses need to provide end to end encryption, data residency control, authentication of internal and external collaborators and a good user experience.
The first key aspect is providing end-to-end encryption. Encryption is only workable if it is latency-free, so that performance isn’t adversely affected. The enterprise also needs ownership of the keys in order to implement the encryption successfully.
Although some companies will have a positive view of a service provider managing security keys, as it reduces the burden of managing this function, there are downsides. A third-party provider may be required to hand over data to a government, meaning the customer loses control over the security of its documents.
Is there a halfway option that avoids this loss of control? One model is to ensure that only the owner of the encryption keys can decrypt the keys used on the public service. Under this model, the customer owns the hardware and the keys.
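The key-ownership model described above is, in essence, envelope encryption: each file is encrypted with its own data key, and that data key is wrapped with a master key the customer never hands to the provider. The sketch below illustrates the flow only; XOR with a random pad stands in for a real cipher (AES-GCM or similar in practice), so treat it as a toy diagram in code, not secure cryptography.

```python
# Toy envelope-encryption sketch: the provider stores ciphertext and a
# wrapped data key; only the customer-held master key can unwrap it.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

def encrypt_for_cloud(plaintext: bytes, master_key: bytes):
    data_key = secrets.token_bytes(len(plaintext))          # per-file key
    ciphertext = xor(plaintext, data_key)
    wrapped_key = xor(data_key, master_key[:len(data_key)])  # wrap with master
    # Only ciphertext + wrapped_key leave the premises; master_key stays on-site.
    return ciphertext, wrapped_key

def decrypt_from_cloud(ciphertext: bytes, wrapped_key: bytes, master_key: bytes) -> bytes:
    data_key = xor(wrapped_key, master_key[:len(wrapped_key)])
    return xor(ciphertext, data_key)

master = secrets.token_bytes(64)   # held only by the customer
ct, wk = encrypt_for_cloud(b"quarterly-results.xlsx", master)
print(decrypt_from_cloud(ct, wk, master))  # b'quarterly-results.xlsx'
```

Because the provider sees only the ciphertext and the wrapped key, a government request to the provider cannot yield readable data without the customer’s cooperation.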
Yet another option is to keep the hardware on-site. This means the data and metadata is on site and provides peace-of-mind to organisations for whom security is a major priority.
In summary, the route picked by any organisation is determined by the approach that best suits its business. While some companies may prefer owning the keys due to their size and the flexibility it offers as the business changes, others will be content to hand over control to a third party. The strategy will therefore be decided by the degree of control required and capacity to adapt.
The second aspect is having 100% data residency control, a necessity that no organisation can bypass. As ever more regulations are put in place at national and regional level, data residency has become more important.
The issue is most prominent in Europe, in particular among EU member states, although data residency is a worldwide factor. Many international companies aim to standardise on a single solution, yet conforming to international laws is a requirement for a company with offices in multiple regions. A US company with offices in Europe will need to conform to UK laws as well as those of the EU, and in the US itself, interstate laws may also apply. In Europe, some countries require data to be kept in the country where it was created.
To complicate matters further, different types of data have different requirements, which determine where that data can be hosted and the approach that needs to be taken. An enterprise may require two solutions, or a single one that enables it to comply for all kinds of data.
As regulation change is inevitable and regular, enterprises should own data storage or have control over residency. The agility to adapt to changing regulations can only benefit companies. Regulation change needs to be carefully considered and included in enterprises’ strategic planning, allowing a degree of latitude as circumstances change.
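One common way to keep residency rules adaptable is to express them as a policy table rather than hard-coding them into storage logic, so that a regulation change becomes a data change. The sketch below is hypothetical — the data classes, countries, and region names are invented for illustration:

```python
# Hypothetical residency policy lookup: route each record to a storage
# region based on its data class and country of origin. Updating the
# table, not the code, is what absorbs a regulation change.

RESIDENCY_POLICY = {
    # (data_class, country) -> required storage region
    ("personal", "DE"): "eu-central",   # must stay in-country/in-region
    ("personal", "FR"): "eu-west",
    ("financial", "US"): "us-east",
}
DEFAULT_REGION = "global"

def storage_region(data_class: str, country: str) -> str:
    """Return the region a record must be stored in, or the default."""
    return RESIDENCY_POLICY.get((data_class, country), DEFAULT_REGION)

print(storage_region("personal", "DE"))    # eu-central
print(storage_region("marketing", "US"))   # global
```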
The third aspect is putting in place advanced authentication for internal collaborators. Users leave themselves open to hacking and breaches by reusing the same passwords, or passwords with only minor variations. To minimise the risk of passwords being compromised, two-factor or multi-factor authentication should be used.
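The second factor in most authenticator apps is a one-time code derived from a shared secret, defined by the HOTP algorithm (RFC 4226) and its time-based variant TOTP. A minimal stdlib-only sketch, for illustration rather than production use:

```python
# Minimal HOTP sketch (RFC 4226), the algorithm underlying most
# two-factor authenticator codes. Illustrative only.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """Derive a one-time code from a shared secret and a moving counter."""
    msg = struct.pack(">Q", counter)                    # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vectors for the ASCII secret "12345678901234567890":
secret = b"12345678901234567890"
print(hotp(secret, 0))  # 755224
print(hotp(secret, 1))  # 287082
```

TOTP simply replaces the counter with the current Unix time divided into 30-second steps, which is why a stolen password alone is not enough to log in.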
The fourth aspect is authenticating external collaborators, an area with inherent risks. Sharing data with external partners, suppliers and clients is crucial for business success, so IT needs to play a key role in controlling what is being shared, with whom, and how. IT also needs to know how long data is being shared for, and it must control sharing permissions so access can be stopped when required. There are many examples of how sharing data and access to files can lead to security risks. In one case, participants in a webinar were given continued access to a shared company folder for over five years; during that time the company’s ownership changed, but access to the shared information remained the same.
The reason this factor is of greater importance is the risk of intellectual property being lost to a third party. When working with a third party on projects, sharing data happens frequently. Safeguards need to be in place so that all parties know who has rights to access or share specific information and what the terms and conditions are for that access. IT needs to provide the relevant tools to enable individuals to manage permissions. The security team’s role is to be aware of all the data being shared at any given point.
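The controls described above boil down to two properties every external share should have: an owner-set expiry and the ability to be revoked at any time, with access checks failing closed. A hypothetical sketch (class and field names invented for illustration):

```python
# Hypothetical time-limited external share: access fails closed once the
# expiry passes or IT revokes it -- avoiding the "webinar folder left open
# for five years" scenario described above.
import time

class Share:
    def __init__(self, path: str, grantee: str, ttl_seconds: int):
        self.path = path
        self.grantee = grantee
        self.expires_at = time.time() + ttl_seconds
        self.revoked = False

    def can_access(self, user: str, now=None) -> bool:
        """True only for the named grantee, before expiry, and while not revoked."""
        now = time.time() if now is None else now
        return (not self.revoked) and user == self.grantee and now < self.expires_at

share = Share("/projects/q3-designs", "contractor@example.com",
              ttl_seconds=7 * 24 * 3600)               # one-week grant
print(share.can_access("contractor@example.com"))      # True while valid
share.revoked = True                                   # IT cuts access at any time
print(share.can_access("contractor@example.com"))      # False
```

An audit of all live `Share` records is what gives the security team the complete view of granted permissions that the text calls for.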
When collaboration occurs between internal enterprise users, one is safe in the knowledge that risks are to some extent contained, as the data rests within corporate boundaries. However, in many instances these days, IT must meet the needs of external collaborators for outsourced projects and work with contractors, designers and others.
The bigger challenge for IT is how to ensure confidentiality and data integrity, outside of its control. In order to achieve this, enterprises need to have in place robust policies for collaborators for authentication and have a complete view of permissions granted.
The final aspect is the risk that user-friendly file sharing services pose to an organisation’s confidential and sensitive data. The increase in collaboration, and changes in employee behaviour, can severely impact businesses. Enterprise-controlled secure file sharing must offer attractive advantages so that users will switch from the file sharing methods they currently use.
Enterprise users have the ability to use convenient file sharing services such as Google Drive or Dropbox. These tools allow users to access files anytime, on any device, at any location and make changes in real-time. The challenge for organisations is to implement enterprise file syncing tools and policies before users start using unauthorised solutions. This is only part of the solution, however. Companies need to remember that dealing with file sharing alone, without addressing the entire file data challenge, will lead to problems in the long run. What is required is a solution that creates fully secure workplace collaboration.
Ensuring the user experience of the service is as good as a service such as Google Drive is crucial. If it isn’t, users will simply not want to switch over. Users have an expectation level of file sharing services that must be matched by enterprises so that users can migrate onto more secure platforms.
When creating a file sharing strategy alongside a virtual desktop approach, the user experience has to be better than the laptop experience for it to be successful. For file sharing, the strategy must deliver what users regard as an accepted standard. One way of ensuring this is to offer more capabilities than the consumer services do.
One successful technique is to have a file sharing capability which is faster, provides data protection and backup, at the same time enabling remote office and branch NAS. This will achieve the twin objectives of creating a secure environment for the enterprise and maintaining a high level of user experience.
In summary, the techniques outlined here are essential components in the aim of achieving total security in the cloud. They enable the IT organisation to do its job effectively and ensure stable business continuity for the enterprise.
Six ways cloud ERP is revolutionising how services deliver results
- Cloud ERP is the fastest growing sector of the global ERP market with services-based businesses driving the majority of new revenue growth.
- Legacy Services ERP providers excel at meeting professional & consulting services information needs yet often lack the flexibility and speed to support entirely new services business models.
- Configure-Price-Quote (CPQ) is quickly emerging as a must-have feature in Services-based Cloud ERP suites.
From globally-based telecommunications providers to small & medium businesses (SMBs) launching new subscription-based services, the intensity to innovate has never been stronger. Legacy Services ERP and Cloud ERP vendors are responding differently to the urgent needs their prospects and customers have with new apps and suites that can help launch new business models and ventures.
Services-based Cloud ERP providers are reacting by accelerating improvements to Professional Services Automation (PSA) and Financials, and by questioning whether their existing Human Capital Management (HCM) suite can scale now and in the future. Vertical industry specialization is a must-have in many services businesses as well. Factoring all these customer expectations and requirements, along with real-time responsiveness, into a roadmap deliverable in 12 months or less is daunting. Making good on the promises of ambitious roadmaps that include biannual release cycles is how born-in-the-cloud ERP providers will gain new customers, including winning many away from legacy ERP providers who can’t react as fast.
The following key takeaways are based on ongoing discussions with global telecommunications providers, hosters and business & professional services providers actively evaluating Cloud ERP suites:
Roadmaps that reflect a bi-yearly release cadence complete with user experience upgrades are the new normal for Cloud ERP providers
Capitalizing on the strengths of the Salesforce platform makes this much easier to accomplish than attempting to create entirely new releases every six months based on unique code lines. FinancialForce, Kenandy and Sage have built their Cloud ERP suites on the Salesforce platform specifically for this reason. Of the three, only FinancialForce has provided detailed product roadmaps that specifically call out support for evolving services business models, multiple user interface (UI) refreshes and new features based on customer needs. FinancialForce is also one of the only Cloud ERP providers to publish their Application Programming Interfaces (APIs) already to support their current and next generation user interfaces.
Cloud ERP leaders are collaborators in the creation of new APIs with their cloud platform provider with a focus on analytics, integration and real-time application response
Overcoming the challenges of continually improving platform-based applications and suites needs to start with strong collaboration around API development. FinancialForce’s decision to hire Tod Nielsen, former Executive Vice President, Platform at Salesforce, as its CEO in January of this year reflects how important platform integration and an API-first integration strategy are to competing in the Cloud ERP marketplace today. Look for FinancialForce to have a break-out year in platform and partner integration.
Analytics designed into the platform so customers can create real-time dashboards and support the services opportunity-to-revenue lifecycle
Real-time data is the fuel that gets new service business models off the ground. When a new release of a Cloud ERP app is designed, it has to include real-time Application Programming Interface (API) links to its cloud platform so customers can scale their analytics and reporting to succeed. What’s most important about this from a product standpoint is designing in the scale to flex and support an entire opportunity-to-revenue lifecycle.
Having customer and partner councils involved in key phases of development including roadmap reviews, User Acceptance Testing (UAT) and API beta testing are becoming common
There’s a noticeable difference in Cloud ERP apps and suites that have gone through UAT and API beta testing outside of engineering. Customers find areas where speed and responsiveness can be improved and steps saved in getting workflows done. Beta testing APIs with partners and customers forces them to mature faster and scale further than if they had been tested in isolation, away from the market. FinancialForce in services and IQMS in manufacturing are two ERP providers who are excelling in this area today and their apps and suites show it.
New features added to the roadmap are prioritized by revenue potential for customers first with billing, subscriptions, and pricing being the most urgent
Building Cloud ERP apps and suites on a platform frees up development time to solve challenging, complex customer problems. Billing, subscriptions, and pricing are the frameworks many services businesses rely on to start new business models and fine-tune existing ones. Cloud ERP vendors who prioritize these have a clear view of what matters most to prospects and customers.
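As a concrete example of why billing logic is hard enough to deserve roadmap priority, consider prorating a mid-period plan change — one of the most common subscription-billing operations. The function below is a hypothetical sketch (plan prices and period lengths invented for illustration), not any vendor’s actual billing engine:

```python
# Hypothetical mid-period plan-change proration: credit the unused time on
# the old plan, charge the same window at the new plan's rate.

def prorated_charge(old_price: float, new_price: float,
                    days_used: int, days_in_period: int) -> float:
    """Net amount due when switching plans partway through a billing period."""
    remaining = days_in_period - days_used
    credit = old_price * remaining / days_in_period    # refund unused old plan
    charge = new_price * remaining / days_in_period    # bill remainder at new rate
    return round(charge - credit, 2)

# Upgrading from a $50 to a $90 plan 10 days into a 30-day period:
print(prorated_charge(50.0, 90.0, days_used=10, days_in_period=30))  # 26.67
```

Real billing engines must additionally handle taxes, currencies, usage tiers, and refunds, which is exactly why vendors that get this framework right win services customers.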
Live and build apps by the mantra “own the process, own the market”
Configure-Price-Quote (CPQ) and Quote-to-Cash (QTC) are two selling processes services and manufacturing companies rely on for revenue daily and struggle with. Born-in-the-cloud CPQ and QTC competitors on the Salesforce platform have the fastest moving roadmaps and release cadences of any across the platform’s broad ecosystem. The most innovative Services-focused Cloud ERP providers look to own opportunity-to-revenue with the same depth and expertise as the CPQ and QTC competitors do.
How to Sponsor @ThingsExpo | #DX #IoT #SmartCities #DigitalTransformation
Internet of @ThingsExpo, taking place October 31 – November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 21st Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world.
The Internet of Things (IoT) is the most profound change in personal and enterprise IT since the creation of the Worldwide Web more than 20 years ago.
All major researchers estimate there will be tens of billions of devices – computers, smartphones, tablets, and sensors – connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades.
With major technology companies and startups seriously embracing IoT strategies, now is the perfect time to attend @ThingsExpo in Silicon Valley. Learn what is going on, contribute to the discussions, and ensure that your enterprise is as “IoT-Ready” as it can be!
Global cloud IT infrastructure revenues hit $8 billion in Q117, says IDC
Global cloud IT infrastructure revenues hit $8 billion (£6.2bn) in the first quarter of 2017, going up almost 15% year over year – with Cisco the big winner, according to IDC.
The analyst firm has put out the latest figures on its Worldwide Quarterly Cloud IT Infrastructure Tracker (below), and found Dell and Hewlett Packard Enterprise (HPE) could not be separated at the top, with Cisco behind. The two main players saw their revenues dip compared with this time last year; Dell hit $1.289bn compared to HPE with $1.118bn, with a decrease in revenue of 0.2% and 8.6% respectively, while Cisco, in third with $902 million, saw its revenue go up 8.7%.
It’s worth noting at this juncture that despite the disparity IDC declares a statistical tie if there is a difference of one percent or less between vendor revenues. It’s also worth noting that HPE’s revenues are combined, as of Q216, with the New H3C Group, a venture announced in May last year between HPE and Tsinghua Holdings.
Looking at regional figures, vendor revenue from cloud IT infrastructure sales grew fastest in Canada at 59.1%, followed by Asia Pacific – excluding Japan – at 18.7% and Japan at 15.3%. The US and Western Europe saw growth at 15.1% and 8.9% respectively.
“After a weak performance during 2016, storage purchases for cloud IT environments had a strong rebound in the first quarter, driving overall growth in this segment,” said Natalya Yezhkova, IDC research director for enterprise storage in a statement. “Overall, the first quarter set a strong beginning of the year for the cloud IT infrastructure market.
“With positive dynamics in purchasing activity by hyperscalers across all technology segments we expect a strong year ahead for the fastest growing public cloud segment,” Yezhkova added. “And as end users continue to embrace the benefits of private cloud infrastructures, spending in this segment will also expand.”
According to a missive put out in April by IDC, overall 2016 vendor revenue was $32.6 billion, at a 9.2% climb year on year.
Terark – A Company that Makes Cloud 200x Faster?
With infrastructure and technology in place, the next frontier companies are aiming for is speed. Terark, a Chinese company, believes it has the secret to making cloud databases run 200x faster than they do today.
Is it true?
Apparently yes. Terark has developed algorithms that compress data to help databases run 200x faster than their existing speeds. This translates roughly to one Terark database doing the job of five servers running existing industry standard databases.
The inventor of this algorithm, Lei Peng, is also the CTO of Terark. According to him, existing databases store their data in blocks and each block has an index associated with it. To retrieve data, a search has to be made through the indices and the corresponding block has to be retrieved. To do this, these blocks of data have to be compressed and decompressed, and this puts a huge workload on existing servers.
Terark’s algorithm addresses this limitation by using a method called a Nested Succinct Trie, which can index 100% of the data. For comparison, current systems index only about one percent of the data.
This way, blocks don’t have to be compressed, decompressed and retrieved; instead, the data can be read directly. Compression still happens, but at a global level, so query speeds are much faster than before.
An analogy for this method is a library. Let’s say a library has different sections, such as art and gardening, and each section has hundreds of books. Each of these books has its own index, most likely on the first page.
When you want a book, the librarian will direct you to the appropriate section, but you’ll have to go through each book’s contents to find the information you want. That’s the existing system.
With Terark, the index pages of all the books are stored in a single database, which means, you can simply search through the index to find the book you want. It’s almost like putting your entire library on Google and searching through it, according to Remy Trichard, the VP of Terark.
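The difference the analogy describes — every key directly addressable, rather than one index entry per block — can be sketched with an ordinary trie. This toy uses a plain dict-of-dicts rather than a succinct (bit-packed) structure, so it illustrates the lookup idea, not Terark’s memory savings:

```python
# Toy trie that indexes every key, illustrating the "index 100% of the
# data" idea: lookups walk the key directly, with no block decompression
# on the read path. Keys and values are invented for illustration.

def trie_insert(root: dict, key: str, value) -> None:
    node = root
    for ch in key:
        node = node.setdefault(ch, {})
    node["$"] = value          # terminal marker holds the record

def trie_get(root: dict, key: str):
    node = root
    for ch in key:
        if ch not in node:
            return None
        node = node[ch]
    return node.get("$")

index = {}
for k, v in [("art/monet", 1), ("art/degas", 2), ("garden/roses", 3)]:
    trie_insert(index, k, v)

# Every key is directly addressable, like searching one merged index:
print(trie_get(index, "art/degas"))      # 2
print(trie_get(index, "garden/tulips"))  # None
```

A succinct trie encodes the same tree shape in a few bits per node, which is what lets Terark keep an index over all keys while still compressing globally.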
Such an innovative approach has definitely caught the attention of big players in the cloud market. The company has already entered into a $1 million contract with Alibaba Cloud. Under the terms of this contract, Alibaba will give its clients the choice to switch their databases to TerarkDB to get faster search speeds. Though the pricing structure is still not clear, Alibaba claims that its customers can save a great deal of time and money by switching to TerarkDB.
Undoubtedly, this is a big deal for Terark, and inspired by the success of its model, the company is looking to move beyond Chinese shores. It is looking to expand into European and American markets to scout for potential clients and to help them understand this new technology.
That said, the company is not looking to expand its offices beyond China, at least for now. It has only ten employees, but already holds six patents. It’ll be interesting to see how the company moves forward in the coming months.
The post Terark – A Company that Makes Cloud 200x Faster? appeared first on Cloud News Daily.