Crisis Response Using Cloud Computing | @CloudExpo #ML #IoT #M2M #Cloud #FedRAMP

Cloud computing is more than servers and storage. In a crisis situation it can actually be a lifesaver. BlackBerry, in fact, has just become the first cloud-based crisis communication service to receive a Federal Risk and Authorization Management Program (FedRAMP) authorization from the United States Government for its AtHoc Alert and AtHoc Connect services. If you’re not familiar with FedRAMP, it is a US government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services. The BlackBerry authorization was sponsored by the US Federal Aviation Administration.

read more

[video] Composable Infrastructure with @HTBase | @CloudExpo #IaaS #Cloud

“We focus on composable infrastructure. Composable infrastructure has been named by companies like Gartner as the evolution of the IT infrastructure where everything is now driven by software,” explained Bruno Andrade, CEO and Founder of HTBase, in this SYS-CON.tv interview at 20th Cloud Expo, held June 6-8, 2017, at the Javits Center in New York City, NY.

read more

[video] The Age of Digital Transformation | @CloudExpo @Catchpoint #WebPerf #AI #DX #DevOps

The current age of digital transformation means that IT organizations must adapt their toolset to cover all digital experiences, beyond just the end users’. Today’s businesses can no longer focus solely on the digital interactions they manage with employees or customers; they must now contend with non-traditional factors. Whether it’s the power of brand to make or break a company, the need to monitor across all locations 24/7, or the ability to proactively resolve issues, companies must adapt to the new world.

read more

Do you have the data agility your business needs?

Data is the new battleground. For companies, the situation is clear – their future depends on how quickly and efficiently they can turn data into accurate insights. This challenge has put immense pressure on CIOs to not only manage ever-growing data volumes, sources, and types, but to also support more and more data users as well as new and increasingly complex use cases.

Fortunately, CIOs can look for support in their plight from unprecedented levels of technological innovation. New cloud platforms, new data frameworks such as Apache Hadoop, and real-time data processing are just some of the modern data capabilities at their disposal. However, innovation is occurring so quickly and the changes are so profound that it is impossible for most companies to keep pace, let alone turn these innovations into a competitive advantage.

It’s clear that data infrastructures today can’t be static if they are to keep pace with the data requirements of the business. Today’s competitive environment requires adaptive and scalable infrastructures able to solve today’s challenges and address tomorrow’s needs. After all, the speed with which you process and analyse data may be the difference between winning and losing the next customer. This is significantly more important today than 10 or 15 years ago, when companies made a strategic database choice once and kept running it for a decade or two. Now we see companies updating their data platform choices far more frequently to keep up.

If companies are to thrive in a data-driven economy, they can’t afford to be handcuffed to ‘old’ technologies; they need the flexibility and agility to move at a moment’s notice to the latest market innovations. However, it’s not enough for companies to simply be technology agnostic; they also need to be in a position to re-use data projects, transformations, and routines as they move between platforms and technologies.

How can your company meet the agility imperative? To start, let’s consider the cloud question.

Many clouds and constituencies

In a data-driven enterprise, the needs of everyone – from developers and data analysts to non-technical business users – must be considered when selecting IaaS solutions. For example, application developers who use tools such as Microsoft Visual Studio and .NET will likely have a preference for the integration efficiencies of Microsoft Azure.

Data scientists may want to leverage the Google Cloud Platform for the advanced machine learning capability it supports, while other team members may have a preference for the breadth of the AWS offering.  In a decentralised world where it’s easy to spin up solutions in the cloud, different groups will often make independent decisions that make sense for them. The IT team is then saddled with the task of managing problems in the multi-cloud world they inherited – problems that often grow larger than the initial teams expected.

One way to meet a variety of stakeholders’ needs and embrace the latest technology is to plan a multi-cloud environment by design, creating a modern data architecture that is capable of serving the broadest possible range of users. This approach can safeguard you from vendor lock-in, and far more importantly, ensure you won’t get locked out of leveraging the unique strengths and future innovations of each cloud provider as they continue to evolve at a breakneck pace in the years to come.
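
To make “multi-cloud by design” concrete, here is a minimal sketch in Python (my own illustration, not a reference to any particular product) of a thin storage abstraction: application code depends on a single interface, while provider-specific adapters are chosen by configuration. The class names, config keys, and bucket names are hypothetical; only the underlying SDK calls (boto3 and google-cloud-storage) are standard.

    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        """One interface the application codes against, whatever the cloud."""
        @abstractmethod
        def upload(self, local_path: str, remote_key: str) -> None: ...

    class S3Store(ObjectStore):
        def __init__(self, bucket: str):
            import boto3  # AWS SDK for Python
            self._client = boto3.client("s3")
            self._bucket = bucket

        def upload(self, local_path: str, remote_key: str) -> None:
            self._client.upload_file(local_path, self._bucket, remote_key)

    class GCSStore(ObjectStore):
        def __init__(self, bucket: str):
            from google.cloud import storage  # Google Cloud Storage client
            self._bucket = storage.Client().bucket(bucket)

        def upload(self, local_path: str, remote_key: str) -> None:
            self._bucket.blob(remote_key).upload_from_filename(local_path)

    def store_from_config(cfg: dict) -> ObjectStore:
        # cfg is a hypothetical settings dict, e.g. {"provider": "aws", "bucket": "analytics-landing"}
        return {"aws": S3Store, "gcp": GCSStore}[cfg["provider"]](cfg["bucket"])

Because the rest of the pipeline only ever calls upload(), adding an Azure adapter later, or running two providers side by side for different teams, becomes a configuration change rather than a rewrite.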

Integration approaches for data agility

Integration may once have been considered a tactical tool, but today the right integration solution is an essential and strategic component of a modern data architecture, helping to streamline and maximise data use throughout the business.

Your data integration software choice should not only support data processing “anywhere” (across multi-cloud, on-premises, and hybrid deployments) but also enable you to embrace the latest technology innovations and the growing range of data use cases and users you need to serve.

Hand coding

I said “data integration software” as I simply don’t believe that a modern data architecture can be supported by hand-coded integration alone. While custom code may make sense for targeted, simple projects that don’t require a lot of maintenance, it’s not sustainable for an entire modern data architecture strategy.

Hand coding is simply too time-consuming and expensive, requiring highly paid specialists and carrying heavy ongoing maintenance costs. Moreover, hand-coded projects are tied to the specific platform they were written for, and often even to a particular version of that platform, which locks the solution to that vendor and technology snapshot. In a continually accelerating technology environment, that’s a disastrous strategic choice. Hand coding also requires developers to make every change, which limits the organisation’s ability to serve the varied and evolving needs of a widely distributed group of data consumers. And finally, it can’t leverage metadata to address security, compliance, and re-use.
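
To illustrate the lock-in problem, here is a deliberately simple, hypothetical hand-coded load job in Python. Every platform detail (driver, SQL dialect, file layout, table name) is baked into the script, so moving to a different engine, or even a new major version of the same one, means reopening the code. All file, table, and database names are made up for the example.

    import csv
    import sqlite3  # stand-in for a specific, versioned warehouse driver

    # Connection, schema, and SQL dialect are all hard-wired to one platform.
    conn = sqlite3.connect("warehouse.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS orders (
                        order_id TEXT, customer TEXT, amount REAL, order_date TEXT)""")

    # The file layout is equally hard-wired; a new source means more hand-written code.
    with open("orders_export.csv", newline="") as f:
        rows = [(r["id"], r["customer"], float(r["amount"]), r["date"])
                for r in csv.DictReader(f)]

    # Engine-specific SQL: porting this job elsewhere means rewriting it.
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()

Every new source, target, or schema change has to go back through a developer, which is exactly the bottleneck described above.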

Traditional ETL tools

Traditional ETL tools are an improvement over hand coding, giving you the ability to be platform agnostic, use lower-skilled resources, and reduce maintenance costs. However, the major drawback of traditional ETL tools is that they require proprietary runtime engines, which limit users to the performance, scale, and feature set those engines were initially designed to address.

Almost invariably, they can’t process real-time streaming data, and they can’t leverage the full native processing power and scale of next-generation data platforms, which have enormous amounts of industry-wide investment continually improving their capabilities. After all, it’s not simply about having the flexibility to connect to a range of platforms and technologies – the key is to leverage the best each has to offer. Moreover, proprietary run-time technologies typically require software to be deployed on every node, which dramatically increases deployment and ongoing management complexity.

Importantly, this proprietary software requirement also makes it impossible to take advantage of the spin-up and spin-down abilities of the cloud, which are critical to realising the cloud’s elasticity, agility, and cost-saving benefits. Traditional ETL tools simply can’t keep up with the pace of business or market innovation and therefore prevent, rather than enable, digital business success.

Agile data fabric

What’s required for the digital era is scalable integration software built for modern data environments, users, styles, and workflow – from batch and bulk to IoT data streams and real-time capabilities – in other words, an agile data fabric.

The software should be able to integrate data from the cloud and execute both in the cloud and on-premises.  To serve the increasing business need for greater data agility and adaptability, integration software should be optimised to work natively on all platforms and offer a unified and cohesive set of integration capabilities (i.e. data and application integration, metadata management, governance and data quality).

This will allow organisations to remain platform agnostic, yet be in a position to take full advantage of each platform’s native capabilities (cloud or otherwise) and data technology. All the work executed for one technology should be easily transferable to the next, providing the organisation with economies of skills and scale.
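
The article is deliberately product-agnostic, but the “define once, run anywhere” idea can be illustrated with an engine-portable framework such as Apache Beam, where the same pipeline definition can be handed to different runners. This is only an illustrative sketch; the paths and field positions are placeholders.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run(runner: str = "DirectRunner") -> None:
        # The execution engine is a configuration choice, e.g. "SparkRunner",
        # "FlinkRunner", or "DataflowRunner"; the pipeline itself is unchanged.
        with beam.Pipeline(options=PipelineOptions(runner=runner)) as p:
            (p
             | "Read"   >> beam.io.ReadFromText("gs://example-bucket/events.csv")  # placeholder source
             | "Parse"  >> beam.Map(lambda line: line.split(","))
             | "Filter" >> beam.Filter(lambda rec: rec[2] == "purchase")           # placeholder field
             | "Write"  >> beam.io.WriteToText("gs://example-bucket/purchases"))   # placeholder sink

    if __name__ == "__main__":
        run()  # swap the runner argument to move the same job to another engine

Whatever tooling you choose, the test is the same: can the work you invest on one platform move with you to the next?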

The other critical capability you should look for in an agile data fabric is self-service data management. Moving from a top-down, centrally controlled data management model to one that is fully distributed is the only way to accelerate and scale trustworthy insight across the organisation. If data is to inform decisions for your entire organisation, then IT, data analysts, and line-of-business users all have to be active, tightly coordinated participants in data integration, preparation, analytics, and stewardship. Of course, the move to self-service can result in chaos if not accompanied by appropriate controls, so these capabilities need to be tightly coupled with data governance functions that empower decision makers without putting data at risk or undermining compliance.

The challenge CIOs face today is acute – with rapidly advancing platforms and technology, and more sources to connect and users to support than ever before. Meeting these new and ever-evolving data demands requires that companies create a data infrastructure that is agile enough to keep pace with the market and the needs of the organisation.

Read more: How IaaS cuts time for app deployment and maintenance costs while improving innovation

What’s Going On Between Amazon and Walmart?

Amazon has built a huge empire over the last decade. It started off as an online retailer of books and, since then, has slowly and steadily built out its business. Today, it is the online retailer many people visit when they have to buy anything from pins to phones and everything in between. All this makes Amazon the biggest online retailer in the world.

Now, Amazon is furthering its ambitions by entering the brick-and-mortar world of retail with its recent acquisition of Whole Foods for a whopping $13.7 billion. In addition, it has signed an agreement with Nike to sell its shoes directly on its website.

Amazon even plans to introduce a service called Prime Wardrobe, under which customers can order and try clothes from different brands for a period of seven days before deciding whether to buy. Services like this are likely to change the face of retailing because they are more customer-centric than anything that came before.

And this is the beginning of its clash with Walmart.

Walmart has been the largest retailer in the world and has dominated this market for many decades, since long before computers and the Internet came into being. With the entry of Amazon, there is a clear threat to Walmart, because both companies will compete in the same space over the coming years, at least that’s how it looks now.

As long as Amazon was king of the digital space, Walmart had no problem, because it remained king of the realm of physical shopping. But with Amazon entering that space, that comfort has clearly been breached.

Though Amazon has not shared any plan for what it is going to do with this acquisition or how it will benefit its customers, it has definitely kick-started a retail war.

Walmart, for its part, is putting pressure on Amazon to stay away. It is the way Walmart is applying that pressure, however, that is making the whole process irksome. Instead of a direct clash, Walmart is pushing its vendors to move away from Amazon Web Services and opt for the services of companies like Microsoft and Google. With Walmart leading the way, other companies such as Target are also pressing their respective IT vendors to move their operations off Amazon’s cloud.

Little wonder that Amazon is reacting strongly and is accusing Walmart and other large retailers of bullying.

Is it working?

Apparently not: many vendors are pushing back against Walmart and refusing to abide by its direction to change cloud providers. But that’s not going to last forever, because Walmart can simply outsource its IT operations to a company that stores data on Azure or Google Cloud Platform.

As customers, we can expect the war to get dirtier before it subsides. But eventually, we may be the winners: if Amazon sets up large stores like Walmart’s, intense competition between retailers could deliver the best value for our money.

The post What’s Going On Between Amazon and Walmart? appeared first on Cloud News Daily.

How IaaS cuts time for app deployment and maintenance costs while improving innovation

More than half of respondents in a survey from Oracle say moving to infrastructure as a service (IaaS) has significantly cut their time to deploy new applications and services, while three in five claim it is easier to innovate through it.

The study, conducted alongside Longitude Research and which surveyed more than 1,600 IT professionals across nine countries, also found IaaS had significantly cut ongoing maintenance costs for a majority (54%) of those polled.

Naturally, the more experience organisations have with IaaS, the more confident they are of its success. 56% of experienced users agreed with the statement that IaaS ‘provides world-class availability and uptime’, compared with 49% of established users, 45% of recent adopters, and only 29% of non-adopters. The picture was similar for the statement ‘IaaS provides world-class speed’, with 52% of experienced users agreeing against 25% of non-adopters.

When it came to more negative perceptions surrounding IaaS, the UK came out on top. 57% of respondents grumbled that IaaS was ‘not secure enough for most critical data’, compared to only 39% in Germany, while 55% and 43% respectively were concerned over losing control of on-premises systems.

“When it comes to cloud adoption, there has always been a case of perception lagging behind reality,” said James Stanbridge, vice president of IaaS product management at Oracle. “Cloud is still relatively new to a lot of businesses and some outdated perceptions persist.

“We are now seeing high levels of success and satisfaction from businesses that are saving money, cutting complexity and driving exciting innovation thanks to cloud infrastructure,” Stanbridge added. “Those resisting the move need to challenge the perceptions holding them back because the longer they wait, the further ahead their competitors will pull.”

Oracle’s push towards IaaS will not be a huge surprise given the company has said it is an important focus. Speaking to analysts following last week’s results, in which cloud revenues reached $1.36 billion for the quarter, Larry Ellison said that during the current fiscal year the company expects both its IaaS and PaaS (platform as a service) businesses to ‘accelerate into hyper growth’. SaaS revenue hit $964 million in the most recent quarter, compared with $397 million for PaaS and IaaS combined.

You can read the report here (UK-centric).

High-Level Plan for #DigitalTransformation | @CloudExpo #AI #ML #DX #IoT

The size of competitors and the longevity of their brands are less predictive of future success than the quality and speed of their information logistics systems, and their ability to use those systems as a competitive advantage. More data is being generated today than ever before, and successful companies are investing in business analytics and big data solutions to mine competitive advantages. There is a new sense of urgency today as businesses realize data has a shelf life, and its value diminishes rapidly over time. In an always-connected world where consumers and their needs are transient, timing is everything, and a special type of data is needed – real-time data. In order to capture competitive advantages and contextual relevance before data expires, enterprises must deploy optimized information logistics systems (OILS) that deliver on that potential fast enough to exploit it.

read more

Atomistic and Holistic View of an Enterprise | @CloudExpo #AI #DX #DevOps #DataCenter

The purpose of enterprise architecture (EA) is to enable an enterprise to be designed consciously and deliberately, rather than allowing it to happen randomly and unconsciously. It is worth noting that this implies a certain intended outcome or desired state in mind. EA design is informed by business vision, strategic intent, and insights into the functioning of the enterprise. That intended outcome is usually referred to as the target state, which is defined in terms of the attainment of certain capabilities and the fulfillment of certain milestones.

read more

[session] Your Hybrid Cloud | @CloudExpo @CAinc #DX #Serverless #DevOps

Cloud promises the agility required by today’s digital businesses. As organizations adopt cloud-based infrastructures and services, their IT resources become increasingly dynamic and hybrid in nature. Managing these requires modern IT operations and tools. In his session at 20th Cloud Expo, Raj Sundaram, Senior Principal Product Manager at CA Technologies, will discuss how to modernize your IT operations in order to proactively manage your hybrid cloud and IT environments. He will be sharing best practices around collaboration, monitoring, configuration, and analytics that will help you boost the user experience and optimize utilization of your modern IT infrastructures.

read more

Disruptive Innovations Fuel #DigitalTransformation | @CloudExpo #AI #DX #IoT #M2M

What’s disruptive innovation, and why does it matter to leaders in the C-suite? It’s how the savvy non-conformist will target market opportunities. How does this happen, when established companies seem to have the advantage? Creative software developers can quickly apply new technologies and digital business models to capture untapped demand.

Moreover, the most disruptive new companies will eventually reshape entire industries, swiftly pushing aside the legacy incumbent players — it’s a form of Digital Darwinism. The global networked economy will blossom, thanks to the pervasive Internet, while the adaptive entities will survive and prosper.

Over the next five years, global digital transformation will continue to have a significant impact on the demands and requirements of Internet Protocol (IP) networks, according to key findings from the latest Cisco Visual Networking Index (VNI).

Over the forecast period, global IP traffic is expected to increase three-fold reaching an annual run rate of 3.3 zettabytes by 2021 — that’s up from an annual run rate of 1.2 zettabytes in 2016.

Apps for Next-Generation Internet of Things

According to the Cisco assessment, machine-to-machine (M2M) connections that support Internet of Things (IoT) applications are calculated to be more than half of the total 27.1 billion devices and connections. They will account for five percent of global IP traffic by 2021.

IoT innovations in connected home, connected healthcare, smart cars or transportation and a host of other next-generation M2M services are driving this incremental growth — a 2.4-fold increase from 5.8 billion in 2016 to 13.7 billion by 2021.

With the rise of connected applications, the healthcare vertical will be the fastest-growing industry segment (30 percent CAGR). Connected car and connected city applications will have the second-fastest growth (29 percent CAGR each).

Video Content Will Flood the Public Internet

That said, video will continue to dominate IP traffic and overall Internet traffic growth, representing 80 percent of all Internet traffic by 2021, up from 67 percent in 2016. Globally, there will be nearly 1.9 billion Internet video users (excluding mobile-only) by 2021, up from 1.4 billion in 2016.

The world will reach three trillion Internet video minutes per month by 2021. Emerging mediums, such as live Internet video, will increase 15-fold and reach 13 percent of Internet video traffic by 2021.

“As global digital transformation continues to impact billions of consumers and businesses, the network and security will be essential to support the future of the Internet,” said Yvette Kanouff, SVP and GM of Service Provider Business at Cisco.

Global IP traffic is expected to reach 278 exabytes per month by 2021, up from 96 exabytes per month in 2016, which works out to an annual run rate of 3.3 zettabytes.

Busy hour Internet traffic is increasing faster than average Internet traffic. Busy hour Internet traffic will grow 4.6-fold (35 percent CAGR) from 2016 to 2021, reaching 4.3 Pbps by 2021, compared to average Internet traffic that will grow 3.2-fold (26 percent CAGR) over the same period reaching 717 Tbps by 2021.
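
As a quick sanity check on how these figures fit together (my own back-of-the-envelope arithmetic, not part of the Cisco report): a compound annual growth rate over the five-year forecast window implies total growth of (1 + CAGR)^5, and the “annual run rate” is simply the monthly figure multiplied by twelve.

    # Back-of-the-envelope check of the forecast figures quoted above.
    YEARS = 5  # forecast window: 2016 to 2021

    def fold_growth(cagr: float) -> float:
        """Total growth factor implied by a compound annual growth rate."""
        return (1 + cagr) ** YEARS

    print(round(fold_growth(0.26), 1))  # ~3.2-fold for average traffic at 26% CAGR
    print(round(fold_growth(0.35), 1))  # ~4.5-fold for busy hour traffic at 35% CAGR (quoted as 4.6)

    # Annual run rate = monthly traffic x 12 months.
    monthly_exabytes_2021 = 278
    print(round(monthly_exabytes_2021 * 12 / 1000, 1))  # ~3.3 zettabytes per year

The small gap between 4.5 and the quoted 4.6-fold comes from rounding in the published CAGR.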

Regional IP Traffic Growth by 2021

  • Asia-Pacific: 107.7 exabytes/month, 26 percent CAGR, 3.2-fold growth
  • North America: 85 exabytes/month, 20 percent CAGR, 2.5-fold growth
  • Western Europe: 37.4 exabytes/month, 22 percent CAGR, 2.7-fold growth
  • Central Europe: 17.1 exabytes/month, 22 percent CAGR, 2.75-fold growth
  • Latin America: 12.9 exabytes/month, 21 percent CAGR, 2.6-fold growth
  • ME & Africa: 15.5 exabytes/month, 42 percent CAGR, 5.8-fold growth

Global Business IP Traffic Highlights

Commercial IP traffic will grow at a CAGR of 21 percent from 2016 to 2021. Increased adoption of advanced video communications in the enterprise segment will cause business IP traffic to grow by a factor of 3 between 2016 and 2021.

Business Internet traffic will grow at a faster pace than IP wide area network (WAN) traffic: IP WAN will grow at a CAGR of 10 percent, compared with a CAGR of 20 percent for fixed business Internet and 41 percent for mobile business Internet.

Business IP traffic will grow fastest in North America. Business IP traffic in North America will grow at a CAGR of 23 percent — that’s a faster pace than the global average of 21 percent. In volume, Asia Pacific will have the largest amount of business IP traffic in 2021, at 17 EB per month. North America will be the second at 14 EB per month.

Note: the Cisco VNI Complete Forecast for 2016 to 2021 relies upon independent analyst forecasts and real-world network usage data. Upon this foundation are layered Cisco’s own estimates for global IP traffic and service adoption. A detailed methodology description is included in the complete report.

read more