Cloud market valued at $148bn for past year, growing 25% annually

Operator and vendor revenues across the main cloud services and infrastructure market segments hit $148 billion (£120.5bn) in 2016, growing at 25% annually, according to the latest note from analyst firm Synergy Research.

Infrastructure as a service (IaaS) and platform as a service (PaaS) experienced the highest growth rates at 53%, followed by hosted private cloud infrastructure services, at 35%, and enterprise SaaS, at 34%. Amazon Web Services (AWS) and Microsoft lead the way in IaaS and PaaS, with IBM and Rackspace on top for hosted private cloud.

In the four quarters ending September (Q3) 2016, total spend on hardware and software to build cloud infrastructure exceeded $65bn, according to the researchers. Spend on private cloud accounted for more than half of the overall total, but public cloud spend is growing much more rapidly. The note also argues unified comms as a service (UCaaS) is growing ‘steadily’.

“We tagged 2015 as the year when cloud became mainstream and I’d say that 2016 is the year that cloud started to dominate many IT market segments,” said Jeremy Duke, Synergy Research Group founder and chief analyst in a statement. “Major barriers to cloud adoption are now almost a thing of the past, especially on the public cloud side.

“Cloud technologies are now generating massive revenues for technology vendors and cloud service providers and yet there are still many years of strong growth ahead,” Duke added.

The most recent examination of the cloud infrastructure market by Synergy back in August argued AWS, Microsoft, IBM and Google continue to grow more quickly than their smaller competitors and, between them, own more than half of the global cloud infrastructure service market. 

PCM starts the year with an acquisition

PCM, one of the leading technology providers in North America, has started 2017 with a big acquisition. On Tuesday, it acquired Stratiform, an industry-leading provider of a wide range of cloud services. It paid C$2.1 million based on the closing price of Stratiform’s shares on December 29, 2016, and has agreed to a potential further payout of C$1.75 million over a period of three years. Stratiform’s revenue was C$5.5 million for the fiscal year that ended on July 31, 2016.

This acquisition is expected to give a big fillip to PCM’s hopes of becoming a leading cloud provider in the region. In fact, Stratiform’s cloud-based products and services will lay the foundation for PCM’s cloud-related offerings. Through this acquisition, PCM plans to leverage Stratiform’s cloud expertise to expand its presence in Canada and the United States, and to reach more markets, especially small and medium business (SMB), mid-market, and public sector customers across North America. Stratiform is also a Microsoft Gold Partner, so the acquisition is likely to give PCM a better foothold in the world of Azure, Office 365, and Enterprise Mobility Suite.

Stratiform was founded in 2012 by Jordan Byman and Darren Lloyd, with a clear focus on Microsoft-related technologies. Within a short time, the company became a Microsoft Gold Partner for devices, deployment, cloud services, and connectivity, and a Microsoft Silver Partner for hosting cloud solutions in the small and mid-market business segments. It offers professional consulting, design, and planning services across a wide range of Microsoft solutions and, according to its LinkedIn profile, employs anywhere from 11 to 50 people. All this points to a small company that has had an amazing run in the four years since its founding.

PCM, on the other hand, was founded in 1987 and is headquartered in El Segundo, California. With more than 3,700 employees spread across 45 locations in the United States, Canada, Pakistan, and the Philippines, the company has annual revenue of more than $1.6 billion. Its operations are divided into several subsidiaries covering sales, services, marketing, logistics, and business process outsourcing (BPO).

This acquisition augurs well for PCM, as it can give it a strong foothold in the cloud market. In this sense, Stratiform is a good choice for acquisition, as it brings an established customer base and expertise that PCM can build on. Historically, PCM has expanded its operations and revenue through strategic acquisitions: it was founded as a telemarketing, direct marketing, and print catalog company, and made its foray into other areas by acquiring companies such as PC Wintel, Computability, Wareforce, Data Systems Worldwide (DSW), and En Pointe Technologies and Sales Inc.

As for Stratiform, it has had a dream run, and it can now continue building its expertise with a larger pool of resources, giving the brains behind the company a better chance to reach a global audience.

In all, 2017 started off on a great note for both companies!

Why you can’t let disaster recovery slide off your IT budget in 2017

As we welcome in the New Year, we are already seeing multiple blogs prognosticating 2017 trends, setting priorities and suggesting resolutions. We are also rapidly approaching the 2017 budget cycle. I am sure you will read many articles concerning new plans or resolutions for the coming year, but this one will be about an old resolution: IT disaster recovery (DR).

When disaster strikes, organisations need to be able to recover IT systems as quickly as possible. Not having a disaster recovery plan in place can put the business at risk of high financial costs, reputation loss and even greater risks for its clients, customers and employees. Despite this, each year business continuity gets cut from the budget and companies continue to fail to invest in DR.

Here are five common objections that continue to dominate the disaster recovery budget discussion and why IT leaders need to refute them:

“It’s going to cost a fortune”

Business leaders often assume that disaster recovery is going to break the bank. When thinking about a robust disaster recovery plan, secondary data centres complete with HVAC, as well as second copies of all servers, storage and networks, come to mind. Furthermore, there is a general misconception that these systems sit idle, just waiting for disaster to strike – and all this is before even considering the maintenance costs involved.

However, having a robust disaster recovery plan in place doesn’t have to mean investing in a secondary data centre. Technology has developed massively in the last few years and there are now a number of options that enable organisations to minimise the cost of DR without sacrificing the recoverability of IT systems. Cloud-based disaster recovery, often termed Disaster-Recovery-as-a-Service (DRaaS), enables failover of virtual machines to secure cloud locations. Often billed by VM or by TB of storage, DRaaS provides the flexibility to pay only for what you need, and the on-demand pricing model keeps costs remarkably low. With DRaaS, organisations do not have to sacrifice the ability to fail over in a time of need and also gain the benefits of security and compliance within the cloud platform. In most cases, it is now far more cost effective for organisations to invest in DRaaS than to build and manage a secondary data centre.
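
To make the pay-for-what-you-use point concrete, here is a minimal back-of-the-envelope sketch comparing an illustrative per-VM/per-TB DRaaS bill with the amortised cost of a secondary data centre. Every rate and quantity in it is a hypothetical assumption, not a vendor price.

```python
# Illustrative only: every rate and quantity below is a hypothetical assumption,
# not a real DRaaS price list or data centre budget.

def draas_monthly_cost(vm_count, protected_tb, price_per_vm=25.0, price_per_tb=50.0):
    """On-demand DRaaS bill: pay per protected VM plus per TB of replicated storage."""
    return vm_count * price_per_vm + protected_tb * price_per_tb

def secondary_dc_monthly_cost(capex_total, amortisation_months, monthly_opex):
    """Secondary data centre: amortised build-out cost plus ongoing running costs."""
    return capex_total / amortisation_months + monthly_opex

if __name__ == "__main__":
    draas = draas_monthly_cost(vm_count=120, protected_tb=40)
    dc = secondary_dc_monthly_cost(capex_total=1_500_000, amortisation_months=60,
                                   monthly_opex=20_000)
    print(f"DRaaS (illustrative):           ${draas:,.0f}/month")
    print(f"Secondary data centre (illus.): ${dc:,.0f}/month")
```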

“But I have backup down the hall”

Some businesses may argue they are covered in case of disaster because they have a robust backup system in the form of an on-site server. If you back up each day to this, then surely you do not need DRaaS?

However, backup ‘down the hall’ is not immune from a localised disaster and, should disaster strike, restoring data from backup takes hours, if not days. DRaaS is about minimising downtime. With DRaaS, organisations can restore operations quickly (often in minutes or even seconds) and in a highly automated fashion. It can also be tested in advance, so that if and when an issue does arise the infrastructure can be recovered at the push of a button, because the failover system has been fully tested and proven.
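
The hours-versus-minutes gap is easy to estimate. The sketch below models restore-from-backup time as data volume divided by restore throughput and compares it with a DRaaS failover window of boot plus cutover time; the specific figures are assumptions for illustration, not measured values.

```python
# Back-of-the-envelope recovery-time comparison; all figures are assumptions.

def restore_from_backup_hours(data_tb, restore_throughput_mb_s):
    """Restoring from backup is bounded by how fast the data can be copied back."""
    data_mb = data_tb * 1024 * 1024
    return data_mb / restore_throughput_mb_s / 3600

def draas_failover_minutes(vm_boot_minutes=5, dns_cutover_minutes=5):
    """DRaaS fails over to already-replicated VMs, so no bulk copy is needed."""
    return vm_boot_minutes + dns_cutover_minutes

if __name__ == "__main__":
    print(f"Restore 20 TB at 200 MB/s: ~{restore_from_backup_hours(20, 200):.0f} hours")
    print(f"DRaaS failover:            ~{draas_failover_minutes():.0f} minutes")
```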

The difference between backup and DR is significant, and both can co-exist happily in a secure and compliant business continuity strategy.

“We don’t get bad weather!” 

With headlines focusing on big natural disasters, many believe that if they live in a region with generally good weather, they are exempt from the danger of an outage. This is a false sense of security, however, as the ‘disaster’ in disaster recovery doesn’t just refer to natural disasters caused by weather events.

Outages are increasingly likely to be the result of human error or malicious attacks – just look at the increase in ransomware attacks we’ve seen on businesses over the past year. Organisations are also susceptible to power outages, upgrade problems or bad coding.

Incidents such as these are completely out of an IT team’s control. It is therefore vital that there is a robust disaster recovery plan in place to be able to recover when the inevitable happens.

“We don’t have outages” 

This objection is, for the most part, unrealistic. Generally, people do not like talking about outages. Usually it is not that an organisation does not experience outages; it is more likely that those outages never get fed back to senior leadership.

Whilst some smaller outages may go unnoticed and leave a business moderately unscathed, over the course of a week, a month or a year downtime adds up and ultimately becomes expensive, taking an unplanned bite out of revenue. In addition, downtime can impact reputation, customer loyalty and employee productivity.
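
As a rough illustration of how that adds up, the snippet below annualises a handful of short monthly incidents; the incident rate, duration and revenue-per-minute figures are purely hypothetical.

```python
# How "small" outages compound over a year; every input is an illustrative assumption.

def annual_downtime_cost(incidents_per_month, minutes_per_incident, revenue_per_minute):
    downtime_minutes = incidents_per_month * minutes_per_incident * 12
    return downtime_minutes, downtime_minutes * revenue_per_minute

if __name__ == "__main__":
    minutes, cost = annual_downtime_cost(incidents_per_month=3,
                                         minutes_per_incident=15,
                                         revenue_per_minute=500)
    print(f"{minutes} minutes of downtime a year ≈ ${cost:,.0f} in lost revenue alone")
```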

When it comes to outages, organisations need to be more transparent in their approach: utilise the data on outages, attacks, maintenance windows, and patch and upgrade problems that already exists within the IT department to implement a reliable and effective DR strategy.

“We can handle a little downtime” 

The final objection is ‘we don’t need a robust DR plan because we can deal with a few minutes of downtime’. Businesses may question how much downtime will really impact the business and argue that, since not all of their systems are customer facing, it isn’t the end of the world.

However, downtime can have a very significant impact on revenue. In the last decade, our expectations as consumers and IT end users have changed. We expect everything instantly and business is increasingly conducted online. As a result, people are more sensitive to an interruption in service, and even a few minutes of downtime could have a massive impact on customer loyalty, not to mention bottom-line revenue.

The impact of downtime is tremendous. A 2016 survey conducted by Opinion Matters on behalf of iland showed that, for 69% of respondents, downtime of only minutes would have a highly disruptive or catastrophic business impact. Additionally, in its 2015 Business Continuity Management survey Gartner reported that 72% of firms had had to use their IT disaster recovery plans, and in its 2016 Magic Quadrant for Disaster Recovery as a Service it estimates that the DRaaS market will nearly triple in the next three years, reaching $3.4 billion in revenue by 2019.

A robust disaster recovery strategy is vital to running a successful and secure business. If any of these five objections have influenced your decision to invest in a business continuity plan, it may be time to reconsider. Without an IT disaster recovery plan, you run the risk of incurring serious business losses through outages, hours of downtime, lost data, and negative impact on reputation. Make 2017 the year that DR is put firmly back in the IT budget. 

Better Predictors | @CloudExpo @Schmarzo #BigData #IoT #AI #ML #Analytics

I love the simplicity of the data science concepts as taught by the book “Moneyball.” Everyone wants to jump right into the really meaty, highly technical data science books, but I recommend that my students start with “Moneyball.” The book does a great job of making the power of data science come to life (and the movie doesn’t count, as my wife saw it and “Brad Pitt is so cute!” was her only takeaway…ugh).

A look into Cerevo’s Data Logger

Do you love biking? Have you always been interested in your ride’s statistics, like how many miles you covered, your average speed, calories burned, and more? There are already many apps that give you this information, and the latest to join this market segment is Cerevo’s Ride-1.

Ride-1 is a data logger that comes with advanced technologies and sensors to give you the most accurate information. It is equipped with a nine-axis motion sensor, with three axes each dedicated to geomagnetism, acceleration, and angular velocity. It also comes with a temperature sensor, an air pressure sensor, GPS, and an illuminance sensor, to give you a complete picture of your riding experience.

The best part about this device is that it comes with 8 GB of flash memory that can store up to 400 hours of data. This means you can store all the information from your rides over a long period and analyze it to see whether you’re on track to achieve your fitness and biking goals. You can also automatically upload this data to other devices through the cloud or a Wi-Fi connection, or even share it with other people so they know your exact location. This feature can be particularly useful for those who want to track the rides of their near and dear ones.
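
For a sense of scale, the quick calculation below works out the average logging rate implied by those published figures (8 GB of storage, roughly 400 hours of data); how Cerevo actually splits that budget across the sensors is not stated, so treat it only as an average.

```python
# Average logging rate implied by the published specs (8 GB flash, ~400 hours of data).
FLASH_BYTES = 8 * 1024**3   # 8 GB of on-board flash
HOURS_OF_DATA = 400

bytes_per_hour = FLASH_BYTES / HOURS_OF_DATA
print(f"~{bytes_per_hour / 1024**2:.0f} MB per hour of riding")
print(f"~{bytes_per_hour / 3600 / 1024:.1f} KB per second across all sensors")
```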

In addition to these features, Ride-1 is relatively small and can be attached easily to your bike within minutes, so there is no hassle of a long setup time or cumbersome usage instructions. When you combine it with a smartphone, you can almost use it like a cycle computer, as it gives you information about your posture, speed, and other aspects in real time. This is most helpful for professional athletes and others who ride with a specific goal in mind: you or your coach can analyze all this information to make the changes that’ll get you to your biking goals faster.

Currently, this product is available on Cerevo’s online shop and at select retail and electronics stores. It’s priced at around $210. Though some consider this a little pricey, it’s definitely worth it for the features on offer. The price is also much lower than that of a highly functional cycle computer, and the device can do even more when combined with a cloud service and a smartphone. It is also waterproof, and comes with a rechargeable battery that can power the system for 15 hours after a charge of just three hours.

This product is manufactured by a Japanese company called Cerevo. Founded in 2007 and headquartered in Tokyo, the company specializes in making next-generation networked devices for both consumers and professional users. Cerevo’s unique ideas and designs have made it a popular brand among tech-savvy users, especially those of the Millennial generation.

With such cool features, Cerevo’s Ride-1 is going to be a big hit, and the company is likely to meet its sales target of 10,000 units well before its estimated timeframe of three years.

Phishers pretending to be Apple

In my opinion, one of the most despicable types of computer criminals today is the phisher. As Wikipedia explains: “Phishing is the attempt to obtain sensitive information such as usernames, passwords, and credit card details (and, indirectly, money), often for malicious reasons, by disguising as a trustworthy entity in an electronic communication… Phishing is typically carried […]

The top 10 ways integrating ERP, CRM, and more will transform manufacturing in 2017

Integrating ERP, CRM, and legacy systems leads to greater manufacturing innovation, setting the foundation to move beyond business models that don’t stay in step with customers’ fast-changing needs. Bringing contextual intelligence into manufacturing that centers on customers’ unique, fast-changing requirements is a must-have to keep growing sales profitably. By integrating ERP, CRM, SCM, pricing and legacy systems together, manufacturers can give customers what they want most: accurate, fast responses to their questions and perfect orders delivered.

Integration powers manufacturing innovation

Enabling a faster pace of innovation in manufacturing starts with using systems and process integration as a catalyst for profitable growth. There are myriad ways integration will transform manufacturing in 2017; the top 10 are presented below:

  • Real-time visibility across selling, pricing, product, manufacturing and service improves the speed of customer response and makes planning easier. By integrating legacy SAP ERP systems with CRM, pricing, product catalog, Manufacturing Execution Systems (MES) and service, telling customers the status of their orders in real time becomes possible (a minimal sketch of this kind of status lookup follows this list). Having real-time data on manufacturing operations provides planners with the visibility they need to optimize production schedules, including fine-tuning Material Requirements Planning (MRP). By orchestrating these areas of manufacturing more efficiently, customer satisfaction increases, the potential for upselling and cross-selling improves, and fewer order fulfillment errors translate into higher profits.
  • Making analytics the fuel manufacturing needs to move faster, attaining time-to-market goals and exceeding customer expectations. One of the quickest ways manufacturers are going to use integration to fuel greater growth in 2017 is by using analytics to measure operations from the customer’s perspective first. From quality management to order fulfillment and meeting delivery dates, every manufacturer has the baseline data they need to begin a customer-driven analytics strategy today. Integration is the catalyst that is making this happen. Making quality a company-wide focus begins with real-time integration of quality management and broader IT systems. enosiX has taken a unique approach to real-time integration, streamlining quality inspections and inventory control for beverage equipment manufacturer Bunn.
  • Improving new product success rates by integrating CRM, pricing, product catalog, service, and Product Lifecycle Management (PLM) systems is enabling manufacturers to create new product lines that drive new business models. For consumer electronics and high-tech manufacturers serving B2C (business to consumer) and B2B (business to business) markets, speed and time-to-market are a core part of their business models. Staying in step with the speed of customers’ changing requirements matters more than tracking competitors, however. To do this, manufacturers that capture feedback from service and PLM systems and then put it into context using CRM systems can innovate faster than competitors who track each other instead of customers.
  • Configure-Price-Quote (CPQ) will continue to be one of the most effective strategies manufacturers can use for accelerating sales in 2017, made possible by the real-time integration between ERP, CRM, pricing and manufacturing systems. Winning new customers and closing deals often comes down to being faster than competitors at delivering accurate, complete quotes and proposals. By integrating CRM, ERP, and pricing systems manufacturers can trim days and in some cases weeks and months off of how long it takes to produce a quote or proposal. CPQ will continue to accelerate in 2017, gaining momentum as more manufacturers move beyond their manually-based methods of quoting and opt for more integrated approaches to excelling at this vital selling activity.
  • Industry 4.0’s many advantages, including creating smart factories, are dependent on the real-time integration of traditional IT and manufacturing systems to increase production speed and quality. Ingraining greater contextual intelligence into every phase of manufacturing increases shop-floor visibility. It also makes planning more efficient and customer-driven. The key to revitalizing existing production centers and getting them started on the journey to becoming smart factories is the real-time integration of IT and manufacturing systems.
  • Personalizing pricing strategies by customer persona and segment using real-time integration between CRM, pricing, accounting and finance systems to optimize profitability. Manufacturers doing this today also have propensity models that define which customers are most and least likely to accept up-sell and cross-sell offers. For many manufacturers, this level of pricing precision is possible today with greater systems integration. By having pricing strategies defined by persona and segment, measuring just how much speed and time-to-market matters to each is possible by measuring sales rates of new products and services.
  • IT system security companywide improves with tighter real-time integration as long-standing legacy systems are updated to enable greater connectivity with newer systems. When manufacturers choose to pursue a more focused, urgent strategy of systems integration to improve manufacturing performance, system security often improves companywide. It’s because longstanding legacy systems, often the most vulnerable to unauthorized use, get re-evaluated at the operating system and integration levels. The result is company-wide IT security improves when real-time integration is attained. For manufacturers where 70% or more of their materials and costs are from outside their owned production centers, this is more important in 2017 than ever before.
  • Sensor data generated from the Internet of Things (IoT) combined with advanced analytics is transforming manufacturing today and will accelerate in 2017. Manufacturers with globally-based operations are piloting and using IoT strategies in daily operations today. A few are working with semiconductor manufacturers to design in their specific requirements at the chip level. Having real-time integration in place between ERP, CRM, pricing and services systems provides the scalable, secure foundation to build advanced analytics and IoT platforms that can scale over the long-term.
  • Market leaders in manufacturing are designing in real-time integration to their connected products, enabling new sources of revenue. General Electric’s approach to monitoring jet engines in flight and providing real-time data to aircraft manufacturers including Boeing and airlines globally is an example of how integration is enabling entirely new business models. A global aerospace manufacturer who requested anonymity is working with integrated circuit developers Broadcom, Intel, and Qualcomm to create chipsets that can provide sensor-based data on an entire jet’s health in real-time anywhere in the world, anytime.
  • Greater visibility and speed are coming to supply chains, giving manufacturers the ability to take an accepted quote and turn it into build instructions in real time. Taking a quote, turning it into a bill of materials, scheduling the best possible work teams, and orchestrating parts and materials are all becoming automated from the moment the quote is approved. From a customer’s perspective, all they see is the approved quote and activity starting immediately to provide the products they ordered. With this level of real-time supply chain integration, speed becomes the new normal and customer expectations are met and often exceeded.
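
As a minimal sketch of the first point above, the code below reads an order’s live status from an ERP endpoint and writes it onto the customer’s CRM record. The endpoints, field names and authentication are hypothetical placeholders, not the API of SAP or any specific CRM product.

```python
# Hypothetical sketch: surface real-time ERP order status inside the CRM record.
# The URLs, fields and auth scheme are illustrative placeholders, not a real API.
import requests

ERP_BASE = "https://erp.example.com/api"   # placeholder endpoint
CRM_BASE = "https://crm.example.com/api"   # placeholder endpoint

def get_order_status(order_id: str, token: str) -> dict:
    """Read the live order status (stage, promised date) from the ERP."""
    resp = requests.get(f"{ERP_BASE}/orders/{order_id}",
                        headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    return resp.json()

def push_status_to_crm(account_id: str, status: dict, token: str) -> None:
    """Write the status onto the customer's CRM account so sales sees it in real time."""
    resp = requests.patch(f"{CRM_BASE}/accounts/{account_id}",
                          json={"last_order_stage": status.get("stage"),
                                "promised_ship_date": status.get("promised_ship_date")},
                          headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    status = get_order_status("SO-1042", token="REPLACE-WITH-TOKEN")
    push_status_to_crm("ACME-001", status, token="REPLACE-WITH-TOKEN")
```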

Matson, AWS, and overcoming the cloud lock-in challenge for big data

Shipping giant Matson made news back in November with its major migration to Amazon Web Services (AWS). Matson’s recent announcement to go ‘all in’ with AWS is another example of large enterprises realising the intrinsic value of the public cloud.

But merely getting set up on one of the large cloud providers is now table stakes – everyone has a pilot. True cloud success is another matter.

It took years to complete this migration at Matson, according to its CIO in the news coverage, so it’s not surprising that many companies are wary. In a 2016 RightScale survey, companies ranked “lack of resources/expertise” as their greatest cloud challenge, ahead even of security. While that signifies an evolution in attitudes toward cloud, it also foreshadows new challenges. One important topic Matson’s CIO raised is something all enterprises need to consider when embarking on their cloud journey – cloud provider lock-in.

The perils of cloud lock-in

Concerns about getting ‘locked in’ to a public cloud provider are not just about contracts, pricing or negotiating leverage. CIOs also worry about migration costs, unavoidable provider outages, redundancy and disaster recovery. They hear about innovations from one cloud provider that aren’t available from another, and about shifting performance metrics. Many buyers have unpleasant memories of lock-in scenarios from the mega-vendor and enterprise resource planning (ERP) heyday. In the age of the cloud, we want and expect choices.

That’s why many companies are thinking about how to architect a multi-cloud strategy, across multiple public cloud providers, even if they may not be planning to implement it anytime soon. Risk-averse companies want to hedge their bets, innovators want lots of flexibility and many still aren’t certain which cloud provider is best suited for their needs.

But companies struggle with exactly how to adopt services in the public cloud – especially for big data. Lock-in concerns are particularly worrisome with analytics and data management workloads because it’s not just about vendor or infrastructure lock-in – it’s more than that.

Built-in or lock-in?

Cloud providers increasingly offer their own components for data warehousing and big data, alongside marketplaces of third-party solutions and services. For example, AWS offers Amazon Redshift for data warehousing and Amazon EMR for big data and Hadoop workloads. Similarly, Azure offers Azure SQL Data Warehouse and HDInsight. Compared to DIY, the built-in components can seem like the simple option, but they also represent a strategic step with long-term implications.

That’s because companies often need to make significant investments in custom tooling to integrate existing data flow with these built-in services. This makes it challenging later to move workloads to another cloud provider without duplicating the investment. This is a big deal for big data with a ripple effect on costs, related processes, data sources, tools, etc.

If an organisation is reliant on a cloud provider’s built-in services, it is making a long-term platform bet with high switching costs. And when we’re talking about enterprise data – potentially huge volumes – that’s a much bigger deal than, say, switching smartphone platforms, and even that seems a little painful, doesn’t it?
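
One common way to limit that coupling is to keep data pipelines behind a thin abstraction so the warehouse target can be swapped without rewriting the pipeline itself. The sketch below illustrates the idea with made-up class names; it is not Matson’s architecture or any vendor’s SDK.

```python
# Minimal sketch of isolating pipelines from a specific cloud warehouse.
# Class and method names are illustrative, not any vendor's SDK.
from abc import ABC, abstractmethod

class WarehouseTarget(ABC):
    """Everything the pipeline needs from a warehouse, kept deliberately small."""
    @abstractmethod
    def load(self, table: str, rows: list[dict]) -> None: ...

class RedshiftTarget(WarehouseTarget):
    def load(self, table: str, rows: list[dict]) -> None:
        # In practice: stage the rows to object storage and bulk-load into Redshift.
        print(f"[redshift] loading {len(rows)} rows into {table}")

class AzureWarehouseTarget(WarehouseTarget):
    def load(self, table: str, rows: list[dict]) -> None:
        # In practice: stage the rows to blob storage and bulk-load into the Azure warehouse.
        print(f"[azure] loading {len(rows)} rows into {table}")

def run_pipeline(target: WarehouseTarget) -> None:
    """Pipeline code depends only on the interface, so the provider can change."""
    rows = [{"order_id": "SO-1042", "status": "shipped"}]
    target.load("orders", rows)

if __name__ == "__main__":
    run_pipeline(RedshiftTarget())   # swap in AzureWarehouseTarget() to change clouds
```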

Multi-cloud services emerge

Fortunately, thanks to the rise of multi-cloud managed services, enterprises can now get the most out of the cloud’s capabilities while minimising lock-in challenges. Services are emerging for a variety of cloud functions, including big data. Using services that work across cloud providers is a prime example of a multi-sourcing or dual-sourcing strategy. It also supports high-availability goals, making it easier to move workloads to a different provider or region as needed.

Migrating to the cloud unlocks incredible agility and productivity gains, and multi-cloud managed service providers can make things easier. Don’t get locked out over fears of lock-in.