Maryland is all set to move its services to the cloud

Maryland is leading the way in moving state services to the cloud: it has started building a cloud-based platform to upgrade its customer-facing processes, specifically its human services technology infrastructure.

Maryland is creating a product called the Total Human Services Information Network, dubbed MD THINK. This is a cloud-based repository that will offer integrated data to all the programs and projects handled by the departments of Human Services; Juvenile Services; Health and Mental Hygiene; and Labor, Licensing and Regulation.

The obvious advantage of such a system is that users can get the information they want from a single repository. In the past, data was siloed in different departments, making it difficult for one department to access information held by another. As a result, no department ever had the complete picture of an individual, so decisions could not be made in full context. This impeded the success rate of many of the state’s programs, not to mention the hardships it caused the residents of Maryland.

To overcome this problem, MD THINK came into being. Gov. Larry Hogan and his administration are striving to make the project a reality by investing $14 million from the fiscal year 2017 budget.

According to experts, MD THINK will use a scalable, cloud-based platform that streamlines all of these processes and consolidates them into a single repository for data storage and retrieval. The first phase will focus on children and families, so that social workers can better cater to their needs.

In addition, as part of MD THINK, case workers will be given tablet devices for field work. This will help them record information right away and look up any pertinent information that can help them offer better service.

There are many advantages to such a program. First off, Maryland is likely to save a ton of time and money, as it can streamline its resources and use them more productively. More importantly, it will be able to gather analytics that provide deep insights into who the program beneficiaries are, how the programs benefit them and what more can be done to improve the quality of life of Maryland citizens.

According to Gov. Hogan, this project will transform the way the state of Maryland delivers human services to its residents, and it will finally bring the process of delivering government services into the 21st century.

This cloud-based project, in many ways, reflects the power of technology and how it can be used to make our society a better place to live. Though the benefits of the cloud are well understood, not all government services are ready to move to it. This can be due to a combination of factors, such as existing legacy systems, the cost of migrating existing data and systems, budget constraints, the mindset of policymakers and more.

Hopefully, this step by Maryland paves the way for other states to follow suit and embrace the cloud in a big way.

2017 is the year when IoT gets serious: Here’s what cloud providers need to know

If you attended Mobile World Congress (MWC) last month, you would have heard a lot of buzz about 5G and the Internet of Things (IoT), with vendors like HPE, IBM, VMware and others talking about their role in both. As iland is a VMware vCloud partner, I was particularly interested in the VMware IoT theatre sessions that ran throughout the show, in which VMware talked about its IoT strategy and how it is helping organisations harness IoT devices.

As you would expect, there was a lot on show: everything from IoT-enabled safety jackets to keep people safe at sea or in the mountains, to connected cars showing the latest innovations in vehicle-to-vehicle communications and autonomous driving. Sierra Wireless demonstrated a live LTE CAT-M network for smart water solutions, while Huawei put the spotlight on its connected drone.

Moving away from all the hype of Mobile World Congress, I believe that 2017 will be the year when the focus turns to real deployments and the monetisation of IoT. It is fair to say that in 2016 IoT was still in its infancy in terms of revenue and deployments. Now, however, I think we will start to see real, live systems that increase productivity, improve customer satisfaction and develop new revenue streams.

At MWC we heard that 5G will be the panacea for IoT. However, 5G is still a number of years from being realised in any meaningful way. In the meantime, telcos will have to deliver new IoT models using the alternative technologies available today. Telecom operators’ strategies and business models for generating revenue from IoT will continue to develop through 2017, and I think we will continue to see the battle between NB-IoT and LTE-M play out.

Regulation and standardisation will also come into sharper focus in 2017. On the regulatory front, I have to admit we haven’t seen much come through yet, but I expect to see more government interest as IoT becomes more pervasive in smart cities, the public sector and energy.

Smart cities will lead the charge in IoT deployments. Awareness of what ‘smart city’ means is beginning to capture the attention of residents. They value safety (smart lighting), convenience (transportation, parking) and the potential cost savings that cities can deliver (smart meters, on-demand waste pickup and so on). That said, cities will continue to be strained by the need for money to support the deployment of sensors to gather data and the integration of city-wide systems.

As business investment in cloud technologies continues, IoT moves up the agenda. IoT data tends to be heterogeneous and stored across multiple systems. As such, the market is calling for analytical tools that seamlessly connect to and combine all those cloud-hosted data sources, enabling businesses to explore and visualise any type of data stored anywhere in order to maximise the value of their IoT investment.

Equally, organisations across the world will be deploying flexible business intelligence solutions that allow them to analyse data from multiple sources in varying formats. By joining incongruent IoT data into a single pane of glass, companies get a more holistic view of the business, which means they can more easily identify problems and respond quickly. In other words, we need to extract the data from IoT devices and then figure out what it all means. With the ‘last mile’ of IoT data solved, organisations can start to increase efficiencies and optimise their business offering.
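
To make that ‘last mile’ concrete, here is a minimal sketch of what normalising incongruent IoT data into a single view can look like. It is illustrative only: the two feeds, their field names and their units are assumptions, not any particular vendor’s API.

```python
# Illustrative sketch: two hypothetical IoT feeds with different shapes,
# normalised into one common record format and merged into a single,
# time-ordered view (the "single pane of glass").
from datetime import datetime, timezone

def from_fleet_feed(record: dict) -> dict:
    # Hypothetical telematics feed: epoch seconds, speed as a string.
    return {
        "device_id": record["vehicle_id"],
        "timestamp": datetime.fromtimestamp(record["ts"], tz=timezone.utc),
        "metric": "speed_kmh",
        "value": float(record["speed"]),
    }

def from_meter_feed(row: dict) -> dict:
    # Hypothetical smart-meter export: ISO-8601 timestamps, kWh readings.
    return {
        "device_id": row["meter"],
        "timestamp": datetime.fromisoformat(row["read_at"]),
        "metric": "energy_kwh",
        "value": float(row["kwh"]),
    }

fleet = [{"vehicle_id": "v42", "ts": 1488844800, "speed": "61.5"}]
meters = [{"meter": "m7", "read_at": "2017-03-07T00:05:00+00:00", "kwh": "3.2"}]

unified = sorted(
    [from_fleet_feed(r) for r in fleet] + [from_meter_feed(r) for r in meters],
    key=lambda rec: rec["timestamp"],
)
for rec in unified:
    print(rec["device_id"], rec["timestamp"].isoformat(), rec["metric"], rec["value"])
```

Once every source lands in the same shape, standard analytics and BI tooling can query and visualise the combined stream without caring where each record came from.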

While the possibilities for IoT to improve our world seem endless, concerns over security are very real. As we put more and more critical applications and functions into the realm of IoT, the opportunity for breaches grows. In 2016, the market finally began to take security seriously, largely because of the increase in IoT hacks. Incidents such as the large denial-of-service attack in October (the Mirai botnet attack on DNS provider Dyn) and even the demonstrated potential of a drone injecting a malicious virus via smart lights from outside a building caused great concern throughout the industry. As a result, we saw some solid announcements, such as the Industrial Internet Consortium releasing its security framework. With all the new vulnerable devices now being put into service, hackers will continue to exploit IoT systems. I think we can certainly expect to see more large-scale breaches as hackers look for newly connected devices, especially in the energy and transportation sectors.

Many of iland’s top customers are working in the IoT space, and they look to iland to provide the most secure, top-of-the-line environments to host their applications in the cloud. Here at iland we take this very seriously, and we are constantly improving, upgrading and fortifying our platform to meet the growing needs of the IoT market. This means applying on-premises levels of IT security to cloud workloads: for example, two-factor authentication, role-based access control, encryption and vulnerability scanning, which together form a protective shield for the cloud, scanning all incoming and outgoing data for malicious code regardless of the device being used. Additionally, we have just introduced our Secure Cloud Services, designed to foster strategic, self-sufficient visibility and control of a cloud environment.
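
As a flavour of one of those controls, here is a generic sketch of the time-based one-time password (TOTP) scheme defined in RFC 6238, which commonly underpins two-factor authentication. This is a textbook illustration, not iland’s implementation, and the secret shown is a made-up example.

```python
# Generic RFC 6238 TOTP sketch: derive a 6-digit code from a shared
# secret and the current 30-second time window. Illustration only.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // period)  # time-step number
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # same secret as the user's authenticator app
```

Because both sides derive the code independently from the shared secret and the clock, a stolen password alone is not enough to log in.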

I believe that 2017 will be the ‘Year of IoT’, and at iland we are focused on providing the most secure and compliant environment for our customers so that they can take advantage of the opportunities that IoT presents. Making sure your IT systems are compliant, and having the reporting mechanisms to prove your data is secure, will be critical success factors in leveraging IoT devices.

Editor’s note: If you are interested in finding out more about the iland Secure Cloud™ or iland Secure Cloud Services, please visit here.

Commodore 64 to #DevSecOps | @DevOpsSummit #APM #Agile #AI #DevOps

We all know the story: a farm, a kid, a Commodore 64, and a modem maxing out at 300 bps. A few unexpected phone bills later, and young Ian Allison is figuring out how to game the system so he can keep using his newfound gateway to the world of tech. According to Ian, that is where he began building the foundation of skills for his career in computer security. At the recent All Day DevOps conference, Ian (@iallison), now with Intuit, talked about his history of being “that” security guy. You know, the one who thinks developers don’t care about security or deadlines, and, really, are just plain “stupid.” But, don’t worry, he is enlightened now and realizes that we all have the same goal – everyone wants to build a secure system.

Measuring Browser Support for HTML5

HTML5 is the current standard for content displayed in web browsers. Your ordinary web browsing depends on how well the browser you use supports this standard. In addition, some applications have web-based versions that use features of HTML5 to provide a solution that does not require the installation of any code, so that you can […]

[session] @Ford to Present Secure #IoT at @ThingsExpo NY | #IIoT #M2M #AI

With the introduction of IoT and Smart Living into every aspect of our lives, one question has become relevant: what are the security implications? To answer this, we first have to explore the security models of the technologies on which IoT is founded. In his session at @ThingsExpo, Nevi Kaja, a research engineer at Ford Motor Company, will discuss some of the security challenges of the IoT infrastructure and how they impact Smart Living. The material will be delivered interactively to engage the audience and will consist of three parts.

SUSE is HPE’s Main Linux Provider

SUSE, a major Linux provider, has recently entered into an agreement with HPE under which the two companies will tap into each other’s assets.

Under the terms of this partnership, SUSE will acquire HPE’s cloud assets, including HPE Helion OpenStack and Helion Stackato. Using these assets, SUSE plans to expand its own OpenStack Infrastructure-as-a-Service (IaaS) solution, which, in turn, will accelerate its entry into the Cloud Foundry Platform-as-a-Service (PaaS) market.

According to a release from the company, the OpenStack assets from HPE will be integrated into SUSE’s own OpenStack cloud, helping SUSE bring to market a certified, enterprise-ready solution for clients and customers who use the SUSE ecosystem.

Besides selling these assets, HPE has also named SUSE as its preferred open source partner for Linux, OpenStack and Cloud Foundry solutions.

While it may sound like SUSE is the major beneficiary of this partnership, in reality it’s a win-win for both companies. Under the agreement, HPE will use SUSE’s OpenStack cloud and Cloud Foundry solutions as the foundation for its popular Helion Stackato and Helion OpenStack products. HPE believes that by partnering with SUSE, it can provide best-in-class PaaS solutions that are simple to deploy in its customers’ multi-cloud environments.

From the above terms, it’s clear that HPE and SUSE will hire programmers and do the cloud development work together, but HPE will sell and deploy these services, in addition to providing support for them. Of course, this is not an exclusive partnership, and SUSE remains open to finding other partners in the future.

Also, the two companies have a non-exclusive agreement under which HPE has the right to use SUSE’s OpenStack IaaS and Cloud Foundry PaaS technology in developing its own Stackato and OpenStack platforms.

This agreement reflects the long and complex relationship between the two companies. HPE previously agreed to merge its non-core software assets with Micro Focus, the company that owns SUSE, and SUSE has long worked with HPE on the Linux side. This additional partnership will further cement the relationship in the long run.

So, how is this beneficial to everyone involved?

For SUSE, this partnership marks its evolution from a Linux provider into a full-fledged cloud software company, putting it in a better position to take on competitors like Red Hat and Canonical. In this sense, the partnership signals its strong entry into the cloud, and unlike those two companies, it has partnered with a strong, top-tier computing partner in HPE.

As for HPE, these joint development efforts can greatly cut down the time and resources needed to create applications. In today’s competitive market, getting products to market as quickly as possible is key, and sharing development efforts with SUSE can help HPE significantly speed up the process.

Given the enormous benefits for both companies and for the cloud industry as a whole, this partnership could prove significant for everyone involved.

Google gets Colgate, Verizon, HSBC on board – a customer step up for enterprise cloud

“Cloud is just an incredibly cool place to be working right now – it’s where a lot of the digital revolution for every industry is going on.”

So began Diane Greene, Google’s cloud SVP, launching the company’s Next conference in San Francisco. But hype aside, industry commentators have argued that this was the week in which Google’s enterprise cloud offering came of age. With new customers including Colgate-Palmolive, eBay and Verizon, and existing customers such as Disney and Home Depot seeing better-than-expected returns, they’re not wrong.

To borrow a line from Eric Knorr, writing for InfoWorld, and with the greatest of respect to the following two companies, there is a difference between tech firms such as Evernote and Snapchat – the former announced as big news in September last year, the latter for so long Google’s cloudy poster child – and genuine, real McCoy enterprises. Verizon ranks at #13 on the most recent Fortune 500. Colgate-Palmolive is at #174, with eBay at #300.

The stats were fascinating. Just over three months after moving to Google’s cloud, Colgate saw 90% of its almost 40,000 employees collaborating on Google Drive, with 57,000 hours of video hangouts clocked up in February alone. Verizon, meanwhile, has rolled out G Suite to more than 150,000 employees.

Alin D’Silva, VP and CTO digital workplace at Verizon, said that the trial process allayed any deep-seated trepidation around cloud, while the importance of real-time collaboration and the fact that employees in Verizon companies AOL and Telogis were already there helped seal the deal. “We also realised that we had to skate to where the puck is going,” he told delegates. “As the years pass, the talent that comes into the company is going to expect a product like this.”

Another customer wheeled out onto the stage was HSBC. Darryl West, the bank’s CIO, said the company ‘took the plunge’ into the evolving Hadoop ecosystem three years ago and admitted it was a ‘tough road’ in some places. “Apart from having the $2.4 trillion of assets on the balance sheet, we also have at the core of the company a massive asset in our data, and what’s been happening in the last two to three years is a massive growth in the size of our data assets,” he said.

This translated to a somewhat out of date slide being beamed to attendees. The true volume of data at HSBC today, West explained, was more than 100 petabytes. With great data comes great insight, but only if you can get the right people working on it. “What we need to do as a bank is work with partners and enable us to understand what’s happening with this data, draw out the insights so we can run a better business and create some amazing customer experiences,” said West.

So those are the customer stories – what about the product? This publication has already covered the partnership with Rackspace for managed services, but a deal with SAP was also announced, principally featuring SAP HANA running on Google Cloud Platform (GCP).

The potential of machine learning and artificial intelligence (AI) was a key point during the keynote – Mike White, CTO and SVP at Disney, frequently cited it as key in both retail and character development – and was mentioned again with the SAP partnership. “Google and SAP intend to collaborate on building machine learning features into intelligent applications like conversational apps that guide users through complex workflows and transactions,” Nan Boden, Google Cloud head of global technology partners, wrote in a blog post.

“This is just one example of how the Google Cloud and SAP partnership will enable digital business transformation using the power of machine learning, with deeper technology integrations to come.”

Plenty of other product news was bundled in elsewhere. Among the highlights were greater integrations with Google App Engine to include Node.js, Ruby, Java 8, Python and Go – “bring your code, we’ll handle the rest” – as well as a new service for BigQuery, automating data movement from certain Google applications directly into the analytics warehouse.

GCP will also be opening new regions in California, Montreal and the Netherlands, bringing the total to 17 locations, with the usual benefits of lower latency, increased scalability and greater disaster recovery promised.

You can find out more about the product announcements here and the SAP partnership here.

Access governance and the cloud: Security and organisational insight are the bottom line

How does access governance apply to the cloud? While the cloud has become a standard for many organisations, governing access to cloud solutions has not yet become standard practice.

Access governance helps organisations of all sizes, in every industry, by ensuring that each employee has the correct access to the systems they need to perform their jobs while keeping the company’s data and network secure. It allows organisational leaders to easily manage accounts and access, and it works by setting up a model that defines precisely the access rights for each role in the organisation, for every employee no matter where they are based.
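
As a rough illustration, the sketch below shows what such a role-based model can look like. The role and system names are hypothetical, and a real access governance product would hold the model in a directory or governance database rather than in code.

```python
# Illustrative access model: each role maps to exactly the rights that
# role needs, so every employee's access is derived from the model
# rather than copied from a colleague. All names here are made up.
ACCESS_MODEL = {
    "hr_senior_recruiter": {"email", "hr_system", "shared_drive_hr"},
    "finance_analyst": {"email", "erp", "reporting"},
}

def rights_for_role(role: str) -> set:
    # Unknown roles get no access by default (deny by default).
    return set(ACCESS_MODEL.get(role, set()))

print(sorted(rights_for_role("hr_senior_recruiter")))
# ['email', 'hr_system', 'shared_drive_hr']
```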

To put this in more concrete terms, access rights are created for specific roles in each relevant department. Access rights should be unique to the individual, not copied from another employee with a similar role or job function (this happens a lot in organisations where many employees perform much of the same work, such as in manufacturing and healthcare, but it should be avoided).

Checks and balances in access rights

Access governance means you can correct or populate access rights according to a model that you have established for your departments or teams. Again, individual access rights are important, and an access matrix can prove a valuable tool when determining who needs access to which systems, and for which role. Reconciliation is another way to ensure access rights are correct: it compares how access rights should be set up according to the model with how they actually are, and lets you report on any differences found. In this way, any record or access point that is not accurate can be easily corrected.
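
At its core, reconciliation is a set comparison between the modelled rights and the rights actually found in the target systems. The following minimal sketch, with assumed names throughout, shows that comparison and the resulting difference report; it is an illustration, not any vendor’s implementation.

```python
# Illustrative reconciliation: compare modelled rights against the rights
# actually found in the target systems and report every difference.
def reconcile(modelled: dict, actual: dict) -> dict:
    report = {}
    for person in modelled.keys() | actual.keys():
        should_have = modelled.get(person, set())
        has = actual.get(person, set())
        missing = should_have - has   # granted by the model, absent in reality
        excess = has - should_have    # present in reality, not in the model
        if missing or excess:
            report[person] = {"missing": missing, "excess": excess}
    return report

modelled = {"jdoe": {"email", "hr_system"}}
actual = {"jdoe": {"email", "hr_system", "finance_app"}}  # leftover project right
print(reconcile(modelled, actual))
# jdoe's excess right 'finance_app' is flagged for correction
```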

Attestation is yet another form of checking access and helps verify all information. A report is forwarded to the managers of a department for verification, to ensure all users and their rights are accounted for and that everything in the log is correct. The manager reviews each right and either marks it for deletion or immediate change, or maintains current access. After examining all of the rights, the manager must give final approval for the proposed set of changes to take effect.
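
A small sketch of how those manager decisions might be applied, again with hypothetical names; note that nothing changes until the final approval is given.

```python
# Illustrative attestation: the manager marks each right keep, change or
# delete; the result is applied only after final approval is given.
from enum import Enum

class Decision(Enum):
    KEEP = "keep"
    CHANGE = "change"   # would route to a separate modification workflow
    DELETE = "delete"

def apply_attestation(rights: set, decisions: dict, approved: bool) -> set:
    if not approved:
        return set(rights)  # no sign-off, no changes take effect
    return {r for r in rights if decisions.get(r, Decision.KEEP) is not Decision.DELETE}

current = {"email", "shared_drive_hr", "finance_app"}
review = {"finance_app": Decision.DELETE}  # flagged during the manager's review
print(sorted(apply_attestation(current, review, approved=True)))
# ['email', 'shared_drive_hr'] – the flagged right is gone
```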

Over the course of an employee’s tenure, it is extremely common for that employee to accumulate too many rights, often by acquiring access while working on projects. These rights are frequently never revoked once they have been assigned; access is overlooked or not considered important enough to take away. What if one of your employees has access to a solution that only certain other employees are assigned to use? The access governance concept allows you to provide and monitor access across the entire organisation, from those using in-house solutions to those using cloud resources to access information.

Organisational access can be easily monitored through access governance technology. Here’s why this is important. The typical access process goes a little something like this: a new employee is hired into the human resources department as a senior recruiter and needs accounts and resources created so he or she can begin work. The employee then automatically receives, for example, a Coupa cloud account, PeopleSoft access, access to the department’s shared drive, and an email address. At this point, the employee is ready for work.

For those who employ access governance technology to monitor what is going on in their organisation, that process looks a little like this: rules are created to review the access rights of employees in each manager’s department, and a review is conducted of who has what and why. The same goes for employees who move into new roles or are newly hired. Then, if access is no longer required following the completion of a project or a change in roles, the manager or another departmental leader can tag the access to be revoked and ensure that this happens automatically, right away. This replaces a multi-level manual process with the click of a button: all access for the employee, to a specific system or to all systems, can be revoked. That’s the added value of a security measure.
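
That ‘click of a button’ could look something like the sketch below, in which revoking one tagged right fans out to every connected system. The connectors are stubs standing in for real provisioning APIs, and the system names are illustrative.

```python
# Illustrative rule-driven revocation: one tagged right is revoked for a
# user across every connected system at once. The connectors are stubs;
# a real deployment would call each system's provisioning API instead.
def revoke_everywhere(user: str, right: str, connectors: dict) -> None:
    for system, disable in connectors.items():
        disable(user, right)
        print(f"revoked {right!r} for {user} in {system}")

connectors = {
    "cloud_procurement": lambda user, right: None,  # stand-in for a Coupa-style app
    "erp": lambda user, right: None,                # stand-in for a PeopleSoft-style app
}
revoke_everywhere("jdoe", "project_x_share", connectors)
```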

Why the cloud needs access governance

As more employees make remote locations their work environments, the number of users operating cloud applications grows. Access governance strategies can be employed to secure these applications for employees who are not working in the physical corporate office or organisational facility.

Business leaders have many types of applications to manage, and many roles to provision for, given how teams are created within modern organisations. Employees may be based abroad, working from home, travelling or just working offsite, all of which can affect access governance and how technology is used and accessed in each of these situations.

Organisational leaders who invest in the cloud and build their companies through it may wish to add access governance technology to improve the security of their information while allowing their employees to remain productive wherever they may be. Plus, and this is the bottom line for any security professional, you’ll be able to see who is doing what, when and where, with your information, no matter where they happen to be.