Continuous Intelligence of @SumoLogic | @DevOpsSummit #DevOps #AI #ML #DL

We’ll now explore how new levels of insight and intelligence into what really goes on underneath the covers of modern applications help ensure that apps are built, deployed, and operated properly.


CIOs and the Life Sciences Industry Part 2: Defining the IT Roadmap

This is Part Two of the blog series; click here for a link to Part One.

Step 1: Defining the Core Capabilities

As noted in the last entry, an organization’s core capabilities can be viewed as those things it does particularly well to drive meaningful business results. Examples range from talent management and lean manufacturing to customer care, research, and product design. For pharmaceuticals, specific examples could include pipeline management, study design, regulatory management (submissions, responses, and related matters), and drug discovery.

If you do not already have an organizational capability map, you need to begin by meeting with each business area. From those discussions, you can collaboratively develop a capabilities list for that area.

That list will need to be filtered and sorted into priority order. The output from this, as well as discussions with other areas, will then need to be consolidated into a single list. One example for a fictional manufacturer might look as follows:
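As a minimal sketch of this consolidation step (the capability names and workshop scores below are hypothetical, not from the original example), the per-area lists are flattened and sorted by agreed business impact:

```python
# Minimal sketch: consolidate per-area capability lists into one
# prioritized list. Capability names and scores are hypothetical.
per_area = {
    "Manufacturing": [("Lean production scheduling", 9), ("Batch traceability", 8)],
    "R&D":           [("Study design", 9), ("Pipeline management", 7)],
    "Regulatory":    [("Submission management", 8)],
}

# Flatten to (score, area, capability) tuples, then sort so the
# highest business-impact capabilities come first.
consolidated = sorted(
    ((score, area, capability)
     for area, capabilities in per_area.items()
     for capability, score in capabilities),
    reverse=True,
)

for score, area, capability in consolidated:
    print(f"{score}  {area:<13} {capability}")
```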

For the strategic capabilities, additional detail is required. Continuing the example above, the detail for Manufacturing might appear as follows:

Step 2: Assessing the Gap

At this point, we need to map the capabilities, in this case for Manufacturing, to the IT systems that enable them, and identify any gaps. A simplified view of this mapping looks as follows:
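A minimal sketch of that mapping (the system and capability names are hypothetical); a capability with no enabling system is recorded as a gap:

```python
# Minimal sketch: map Manufacturing capabilities to the IT systems
# that enable them. System and capability names are hypothetical.
capability_to_system = {
    "Lean production scheduling": "MES (shop-floor execution)",
    "Batch traceability":         "ERP quality module",
    "Predictive maintenance":     None,  # no enabling system today: a gap
}

gaps = [cap for cap, system in capability_to_system.items() if system is None]
print("Capability gaps:", gaps)
```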

At the conclusion of this step, you should have a list of the capabilities for each key business area and any associated gaps. This information will be critical as we move into the next steps of the process.

Although it is important to prioritize at each phase, the most important prioritization occurs during the consolidated roll-up for the whole organization. Many factors may drive that process; they will be discussed in the next entry.

In the next entry, we will discuss Steps 3 and 4: the process of consolidation and prioritization across the organization.

If you would like to set up a conversation with Clint, please reach out.

By Clint Gilliam, Virtual CIO, GreenPages Technology Solutions

 

Welcome to an #AI-Defined World | @CloudExpo #AiDI #ArtificialIntelligence

For years, the cloud has been considered the most powerful and disruptive force ever. What is true for IT, however, is hard to argue for the entire enterprise stack. The cloud delivers the necessary toolset to support the digital evolution at a technological level: to purchase, build, and run infrastructure, platforms, and software as a service. Thus, the cloud must be considered the foundation, the non-negotiable requirement. For IT as well as business decision makers, however, the discretionary exercise now is artificial intelligence (AI). Only if they can collect, aggregate, process, and make use of their company’s knowledge, as well as the knowledge of the surrounding environment, in an automated way will they be competitive in the future.


[session] #OpenStack Integration By @Nutanix | @CloudExpo #AI #DevOps

Some people worry that OpenStack is more flash than substance; however, for many customers this could not be further from the truth. No other technology equalizes the playing field between vendors while giving your internal teams better access than ever to infrastructure when they need it. In his session at 20th Cloud Expo, Chris Brown, a Solutions Marketing Manager at Nutanix, will talk through some real-world OpenStack deployments and look into the ways this can benefit customers of all sizes. He will also talk through Nutanix’s OpenStack integrations and show how together these technologies give IT professionals a new way to approach infrastructure in today’s cloud world.


[session] @Nutanix Enterprise Cloud for #DevOps | @CloudExpo #APM #SDN

DevOps is often described as a combination of technology and culture. Without both, DevOps isn’t complete. However, applying the culture to outdated technology is a recipe for disaster; as response times grow and connections between teams are delayed by technology, the culture will die. A Nutanix Enterprise Cloud has many benefits that provide the needed base for a true DevOps paradigm. In his general session at 20th Cloud Expo, Chris Brown, a Solutions Marketing Manager at Nutanix, will explore the ways that Nutanix technologies empower teams to react faster than ever before and connect teams in ways that were either too complex or simply impossible with traditional infrastructures.


Hybrid cloud analytics: Don’t be cloud-washed by the new term on the block

Not surprisingly, with all the momentum in hybrid cloud infrastructure, we’re starting to hear the term “hybrid cloud analytics” pop up in the modern business intelligence (BI) market. However, it’s a term that is being overused and misunderstood as those in the industry seek to align with the latest trend. We see the future value of hybrid cloud computing as helping to empower customers to embrace a cloud strategy of their own versus having it dictated to them by a vendor.

A hybrid cloud environment is defined by the customer. What do I mean by that? I mean that a hybrid cloud solution should not dictate where or which cloud the customer must use with their on-premise installation. Although this point should seem obvious, some large vendors in the space are ignoring this critical point, as they dictate choices based on their (lack of) capabilities.

There’s a lot of confusion about what is possible with hybrid cloud analytics, so I will clarify what hybrid is, and what it’s not, in three parts:

Cloud – a delivery mechanism, not a solution

I’m surprised by the number of market entrants that were born in the cloud and use that as their core differentiation. Cloud computing is a delivery vehicle. Simple visualisations of data via the cloud are not going to drive business value. As the pioneer and leader in the modern BI market, we’ve learned that customers need both a broad and deep analytical approach to better visualise, explore and understand their data. This is important to become more informed, gain new insights and make better decisions to derive real business value through analytics. Having a dumbed-down analytics solution that is delivered via the cloud is just going to keep you behind your competition. Having said that, we do see value in cloud delivery of world-class analytics, which many customers currently deploy on their own private clouds.

Hybrid – a hybrid approach to analytics just makes sense

Today, companies need a choice of deployment options, whether on-premise or in a private cloud leveraging the infrastructure of their choice. They get to choose where they want analytics to run.

However, the truth is that an either-or choice does not truly represent where the vast majority of customers are today in their IT investments, and where they plan to be over time. Most customers that we talk to have both data and applications that run on-premise, behind their firewall, as well as data and applications that both originate and run in the cloud. The world is not black and white; it has many shades of grey. That’s why a true hybrid approach is required to support where customers are today, as well as help them migrate more of their workloads off-premise over time as they so choose. A hybrid cloud approach to analytics is key to enabling a customer’s cloud strategy rather than dictating it. This is why the trend is pointing toward hybrid cloud analytics.

Hybrid cloud analytics: Full centralised control of all data, wherever it resides

The simple definition of hybrid cloud is a computing environment that uses a mix of on-premise, private cloud, and/or public cloud infrastructure to deliver services, with orchestration between the platforms. A hybrid cloud could join multiple clouds together, or join on-premise installations with cloud-based installations. Under that general definition, many vendors will claim “hybrid cloud analytics” in their marketing verbiage. Although being able to publish an analytical application (or sheet, for some) from an on-premise installation to a cloud offering could be valuable, it is not hybrid cloud analytics.

Where the data resides in a true hybrid cloud analytics solution should not matter to the user, who can access it from any device based on their role and security permissions. A properly governed solution allows you to define rules around where data, and/or the analysis on that data, can be stored or run: you can create enforcement rules on where things can and will reside based on the sensitivity and security of each dataset. It should be easy to manage user entitlements and licensing between the platforms. A hybrid cloud analytics solution must allow for bi-directional migration between infrastructure environments and should be managed as one seamless environment across infrastructure boundaries via a single console.
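As a minimal sketch (the sensitivity tiers, environment names, and rules are hypothetical, not any vendor’s actual policy model), the kind of residency rule such a governed solution enforces might look like this:

```python
# Minimal sketch of data-residency rules in a governed hybrid
# analytics platform. Sensitivity tiers and environments are
# hypothetical illustrations.
POLICY = {
    "public":    {"on_premise", "private_cloud", "public_cloud"},
    "internal":  {"on_premise", "private_cloud"},
    "regulated": {"on_premise"},  # must stay behind the firewall
}

def placement_allowed(sensitivity: str, environment: str) -> bool:
    """True if data of this sensitivity may be stored or analyzed there."""
    return environment in POLICY.get(sensitivity, set())

# Regulated data may run on-premise but never in a public cloud.
assert placement_allowed("regulated", "on_premise")
assert not placement_allowed("regulated", "public_cloud")
```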

This is where the future of true hybrid cloud analytics is headed, because these are the considerations IT leaders are weighing to safeguard their data while gaining the flexibility and scalability for more self-service use of data in the cloud.

How cloud is ‘background radiation’ in a record tech M&A 2016

The cloud tech IPO landscape may have struggled a little of late, but the analysts at EY – formerly Ernst & Young – argue that on a global scale, 2016 saw an all-time record high in overall technology activity due to the “massive digital transformation” caused by disruptive technologies.

The overall aggregate 2016 value of $466.6 billion (£373bn) was up 2% over 2015’s previous record, while the Q416 figure of $117.2bn was down 38% year over year.

For the full year of 2016, the consultancy plotted the number of deals in each sector against the average deal value. As expected from a more mature market, cloud and software as a service (SaaS) was miles ahead in terms of volume – between 1,200 and 1,400 deals – but paled in monetary terms compared to connected car and Internet of Things (IoT) technologies.

The report coins the term ‘background radiation’ to describe cloud deals, adding that IoT and artificial intelligence (AI) will continue to fuel high-tech-targeted M&A in 2017.

In terms of specific regions, cloud and SaaS was a factor in more than a quarter (28%) of EMEA deals in Q416, compared with the Americas where it factored into almost 950 deals.

The standout, as the report affirms, was the Oracle-NetSuite acquisition for $9.3bn, announced in July but only completed in November. Microsoft’s $26.2bn outlay for LinkedIn was described by EY as a reflection of “the way social networking is transforming business, the rising role of big data and the potential for both those technologies to transform Microsoft products.” Equinix’s $3.6bn move to snaffle 29 data centres from Verizon in December was also noted.

Earlier this month, the yearly analysis issued by Byron Deeter, partner at Bessemer Venture Partners (BVP), came to a pretty similar conclusion. While IPOs ran comparatively dry in the cloud space, with Twilio the star performer, companies acquired in the public cloud space represented 40% of the $300bn market cap. The top 100 private cloud companies, as noted by Forbes in September, also represented more than $100bn of private enterprise value alone.

 


Adobe Introduces new Cloud-based Digital Signatures

Adobe has transformed the way we read, share and sign digital documents. The latest in this category is cloud-based digital signatures.

Now, Adobe Sign will offer cloud-based digital signatures on any device that uses the Adobe Document Cloud. This move is a major step for Adobe and a significant one for the cloud community as a whole, as there have been many challenges associated with implementing digital signatures.

Digital signatures are very different from e-signatures: they require a detailed verification process that can be time-consuming and effort-intensive. In addition, existing digital signature solutions are fragmented, with most of them being proprietary.
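As a minimal sketch of the cryptographic step that sets digital signatures apart from a simple e-signature (using the third-party Python cryptography package; the key and document here are illustrative, not Adobe’s implementation):

```python
# Minimal sketch: a digital signature binds a document to a private
# key; anyone can verify it with the matching public key. This is
# illustrative, not Adobe's implementation (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.exceptions import InvalidSignature

# In production the key pair is tied to a certificate from a trust
# service provider; here we generate one locally for illustration.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"Mortgage application form, final draft"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)

# Sign: only the holder of the private key can produce this value.
signature = private_key.sign(document, pss, hashes.SHA256())

try:
    # Raises InvalidSignature if the document or signature was altered.
    public_key.verify(signature, document, pss, hashes.SHA256())
    print("Signature valid: document is authentic and unmodified.")
except InvalidSignature:
    print("Signature invalid: document was tampered with.")
```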

To overcome this challenge, Adobe decided to offer cloud-based digital signatures that are open to multiple certificate providers. The obvious advantage of such open standards is that they allow interoperability between solutions and help drive widespread adoption. In this sense, this could mark the beginning of exciting times for the digital signature industry.

Adobe is all set to embrace this trend: a press release from the company says its digital signatures are among the most advanced and secure types of electronic signature available today, and that they can be used to sign important documents such as mortgage applications and healthcare forms.

Besides digital signatures, Adobe has also added a host of other features that are sure to appeal to large teams, including document routing and integration with Microsoft SharePoint to allow for easy signing and tracking.

All the changes are based on recommendations from the Cloud Signature Consortium, a global network of industry experts working together to create standardized specifications for online document sharing and digital signatures. For some time now, this group has been advocating wide and open standards, as they can help build secure digital signature functionality across all devices and applications. Adobe headed this consortium in 2016 and is today leading the way with the world’s first open, cloud-based digital signatures.

Alongside introducing digital signatures, Adobe has also added new functionality to its Adobe Sign technology, so it can now streamline the flow of documents and tasks across different teams, and even the organization as a whole. This way, documents can be better integrated with the digital processes and systems of the organization.

As part of its updates, Adobe has enhanced its mobile app with a new technology called Adobe Sensei, which uses machine learning, artificial intelligence, and deep learning capabilities to make better predictions. The mobile app also makes it easy for users to convert any paper document to a digital version with just a smartphone scan.

On top of these changes, the digital signing process complies with standards such as eIDAS, which must be followed in the EU.

With such sweeping changes, Adobe is all set to take the digital signature industry by storm. It also wants to tap into the growing demand for digital signatures, as workforces are increasingly mobile and prefer to have digital copies of documents.
