The Big Data Divide | @CloudExpo #BigData #IoT #IIoT #M2M #InternetOfThings

Within enterprises – and even small businesses – there are two distinct groups with differing goals, needs and objectives when it comes to business intelligence and analytics strategies. Business users are tasked with analyzing Big Data to help their companies make timely and more meaningful decisions, and as such, require immediate access to a wide variety of sources, including structured, semi-structured, unstructured and streaming data. On the other side of the house, IT professionals are tasked with storing and securing massive data stores, as well as ensuring regulatory compliance of corporate information. Therefore, they prefer to make information available on an as-needed basis, rather than creating an environment of open access. So the tug-of-war game begins.

read more

GSX Solutions to Define the Next Generation of Performance Monitoring | @CloudExpo #Cloud

For a company in the IT sector, a 20-year anniversary signifies that we have experienced and endured many radical changes in the industry. We have witnessed the era in which mainframes were challenged by minis, which were then themselves challenged by PCs. Then came the age of software, then something called the Internet, and now another something called the cloud. Every time we thought that nothing more could be invented, another massive change would take place. If we are still here and thriving after 20 years, it is a testament to our ability to realize that things will always be changing, and that we must adapt to new worlds.

read more

Edmunds.com Chooses @AriaSystemsInc to Fuel Online Auto Sales | @CloudExpo #Cloud

Aria Systems has announced that Edmunds.com, the premier destination for car shoppers with 20 million monthly visitors, has chosen Aria’s active monetization platform to help scale its billing operations to meet the skyrocketing demands of its customer base.
Edmunds.com, named one of “The World’s Top 10 Most Innovative Companies of 2015 in Automotive” by Fast Company, selected Aria as the best solution for its current and future business models after an extensive market evaluation. The Aria platform will provide end-to-end monetization and billing capabilities for the Edmunds Program, which gives its dealer partners a powerful online platform to reach millions of in-market car shoppers, close deals faster, and achieve higher customer satisfaction.

read more

Organisations start to address the subtleties of securing ‘as a service’ propositions

(c)iStock.com/BlackJack3D

According to a recent Verizon report, 94% of companies expect more than a quarter of their workloads to be in the cloud within two years. The enterprise is moving to a cloud model of IT service consumption and delivery for a variety of reasons: IT organisations can become more responsive to business requirements by scaling up on demand, operations can be refocused on the company’s core competence (which is typically not building data centres), and long-term costs can be more than 65% less than with traditional IT models.

As we celebrate the tenth anniversary of cloud titan Amazon Web Services, it’s important to reflect on what the market has learned about security in the first decade of cloud computing and what enterprises must consider while beginning their journey to the cloud.

When considering a move to the cloud, it’s vital to remember that there are fundamentally different security models involved when adopting shared software as a service (SaaS) offerings compared to infrastructure as a service (IaaS). While people tend to lump these two classes of services together as public, data ownership, encryption management, and service access are wildly different between these two varieties of cloud services:

While SaaS services are typically easier to consume than IaaS services because you don’t have to manage the application layer, they’re also less secure than the more DIY-oriented approach you get with IaaS deployments. These architectural differences are fundamental to the ways third parties may try to access your data, and several high-profile cases have shown that cloud security is not one-size-fits-all. Three areas in particular need to be addressed.

Encryption

The now-famous FBI vs. Apple case, which provoked a global debate around the merits and drawbacks of end-to-end security, is a good lesson in source-based encryption. The case highlighted how user-generated security keys can create significant barriers to data access, while simultaneously launching one of the most public government-sponsored hacking campaigns of all time.

Data ownership

In the above case, Apple fully complied with the government’s request for any of the San Bernardino shooter’s data that was managed by its iCloud SaaS service – and it could do so because it also owned the customer’s data. Similar cases have surfaced all around the world, where national interests are not as aligned, such as Microsoft vs. the US Government. What is becoming increasingly clear is that unless you’re the service and data owner, a third-party SaaS provider can be leaned upon in ways you may not have prepared for and are less able to protect yourself from.

Threat radius

Aggregating information of or about many organisations within one system has always been a critical weakness of public and private cloud IT systems. There is a direct correlation between the number of users of a public cloud SaaS service and its appeal to hackers: the largest services often promise the richest data payload. Over the last two years there have been countless examples of high-value breaches, from the U.S. Office of Personnel Management to Target, Sony and more.

As a reaction to this surge in security breaches, security managers and CISOs are advising that organisations segregate their data sets, minimise the volume of any single high-value asset, and create secure, encrypted tenants around users at the greatest level of granularity possible. Businesses such as private equity firm The Carlyle Group have been working with CTERA to adopt best practices and ensure security is approached correctly. The firm has worked to encrypt each of its offices with a unique key and explains its approach in a video.

Here are some best practices as you consider moving your data to the cloud.

Own your keys, and generate your keys: Third-party encryption key generation is an additional layer of vulnerability that isn’t appropriate for today’s secure organisations.

Make your cloud private: Virtual private clouds (VPCs) have achieved the level of security that private data centres have always been able to deliver. For this reason, organisations can now go all-in on their cloud agenda without compromising on application or data security.

Compartmentalise your users, divisions and data to minimise the threat radius of any security compromise: Scalable systems that create unique encryption keys and data isolation around cloud tenants can ensure that in the event of a breach, the damage is contained to that user, application or segmented data set.
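To make the compartmentalisation idea concrete, here is a minimal Python sketch of per-tenant key derivation. The master key, tenant names and the HMAC-based derivation are all illustrative assumptions, not any vendor's actual scheme; the point is simply that each tenant gets a unique key, so compromising one key exposes nothing else.

```python
import hashlib
import hmac
import secrets

# Hypothetical master key, generated and held on-premises; it is never
# sent to the cloud provider.
master_key = secrets.token_bytes(32)  # 256 bits

def tenant_key(master: bytes, tenant_id: str) -> bytes:
    """Derive a unique 256-bit key for one tenant via HMAC-SHA256."""
    return hmac.new(master, tenant_id.encode(), hashlib.sha256).digest()

# Each office (tenant) gets its own key, as in the Carlyle Group example.
k_london = tenant_key(master_key, "office-london")
k_paris = tenant_key(master_key, "office-paris")
assert k_london != k_paris  # tenants are cryptographically isolated
```

Because derivation is deterministic, keys can be recreated from the master key on demand rather than stored alongside the data they protect.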

By taking these best practices on board, you can avoid the issues uncovered in recent high-profile cases and address the cloud security challenge.

Overcoming the data integration challenge in hybrid and cloud-based environments


Industry experts estimate that data volumes are doubling every two years. Managing all of this is a challenge for any enterprise, but it is not just the volume of data so much as the variety of data that presents a problem. With SaaS and on-premises applications, machine data, and mobile apps all proliferating, we are seeing the rise of an increasingly complicated value-chain ecosystem. IT leaders need to adopt a portfolio-based approach and combine cloud and on-premises deployment models to sustain competitive advantage. Improving the scale and flexibility of data integration across both environments to deliver a hybrid offering is necessary to provide the right data to the right people at the right time.

The evolution of hybrid integration approaches creates requirements and opportunities for converging application and data integration. The definition of hybrid integration will continue to evolve, but its current trajectory is clearly headed to the cloud.

According to IDC, cloud IT infrastructure spending will grow at a compound annual growth rate (CAGR) of 15.6 percent between now and 2019, at which point it will reach $54.6 billion. In line with this, customers need to advance their hybrid integration strategy to best leverage the cloud. At Talend, we have identified five phases of integration, from the oldest and most mature through to the most bleeding-edge and disruptive. Here we take a brief look at each and show how businesses can optimise their approach as they move from one step to the next.
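As a quick sanity check of what a 15.6% CAGR implies, the compounding is straightforward; the base-year figure below is an assumption chosen purely for illustration and does not come from the IDC report.

```python
# Compound annual growth: value after `years` at rate `cagr`.
def project(base, cagr, years):
    return base * (1 + cagr) ** years

# A hypothetical $30.5bn base compounding at 15.6% over four years
# lands close to the forecast $54.6bn figure.
projected = project(30.5, 0.156, 4)
```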

Phase 1: Replicating SaaS Apps to On-Premise Databases

The first stage in developing a hybrid integration platform is to replicate SaaS applications to on-premises databases. Companies in this stage typically either need analytics on some of the business-critical information contained in their SaaS apps, or they are sending SaaS data to a staging database so that it can be picked up by other on-premises apps.

In order to increase the scalability of existing infrastructure, it’s best to move to a cloud-based data warehouse service within AWS, Azure, or Google Cloud. The scalability of these cloud-based services means organisations don’t need to spend cycles refining and tuning the databases. Additionally, they get all the benefits of utility-based pricing. However, with the myriad of SaaS apps today generating even more data, they may also need to adopt a cloud analytics solution as part of their hybrid integration strategy.
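The replication step itself can be as simple as an idempotent upsert into a staging table. Below is a minimal sketch using SQLite as a stand-in for the staging database; the records and field names are invented for illustration and do not come from any real SaaS API.

```python
import sqlite3

# Hypothetical records, as if already fetched from a SaaS app's API.
saas_records = [
    (1, "Acme Corp", "closed-won"),
    (2, "Globex", "negotiation"),
]

conn = sqlite3.connect(":memory:")  # stands in for the on-premises staging DB
conn.execute(
    "CREATE TABLE IF NOT EXISTS opportunities "
    "(id INTEGER PRIMARY KEY, account TEXT, stage TEXT)"
)

# INSERT OR REPLACE keeps repeated replication runs idempotent: re-syncing
# the same record updates it in place rather than duplicating it.
conn.executemany(
    "INSERT OR REPLACE INTO opportunities (id, account, stage) VALUES (?, ?, ?)",
    saas_records,
)
conn.commit()
```

The same pattern applies unchanged when the target is a cloud data warehouse rather than a local database; only the connection changes.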

Phase 2: Integrating SaaS Apps directly with on-premises apps

Each line of business has its preferred SaaS app: sales departments have Salesforce, marketing has Marketo, HR has Workday, and finance has NetSuite. However, these SaaS apps still need to connect to a back-office, on-premises ERP system.

Due to the complexity of back-office systems, there isn’t yet a widespread SaaS solution that can serve as a replacement for ERP systems such as SAP R/3 and Oracle EBS. Businesses would be best advised not to try to integrate with every single object and table in these back-office systems – but rather to accomplish a few use cases really well so that their business can continue running, while also benefiting from the agility of cloud.
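A "few use cases done well" often reduces to a small, explicit field mapping between the SaaS object and the ERP record, rather than a wholesale integration. The sketch below is a hedged illustration: every field, record and function name is invented, not drawn from Salesforce, SAP or Oracle schemas.

```python
# Hypothetical mapping from a CRM opportunity to an ERP sales-order payload.
# Keeping the map small and explicit is the point: integrate the handful of
# fields the use case needs, not every object and table.
FIELD_MAP = {
    "AccountName": "customer_name",
    "Amount": "order_total",
    "CloseDate": "order_date",
}

def to_erp_order(crm_opportunity: dict) -> dict:
    """Translate the mapped CRM fields into an ERP-shaped record."""
    return {erp: crm_opportunity[crm] for crm, erp in FIELD_MAP.items()}

order = to_erp_order(
    {"AccountName": "Acme", "Amount": 5000, "CloseDate": "2016-04-01"}
)
```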

Phase 3: Hybrid Data Warehousing with the Cloud

Databases and data warehouses on a cloud platform are geared toward supporting data warehouse workloads: low-cost, rapid proof-of-value and ongoing data warehouse solutions. As the volume and variety of data increase, enterprises need a strategy for moving their data from on-premises warehouses to newer, Big Data-friendly cloud resources.

While they take time to decide which Big Data protocols best serve their needs, they can start by creating a Data Lake in the cloud with a service such as Amazon Web Services (AWS) S3 or Microsoft Azure Blob storage. These lakes can relieve the cost pressures imposed by on-premises relational databases and act as a “demo area”, enabling businesses to process information using their Big Data protocol of choice and then transfer it into a cloud-based data warehouse. Once enterprise data is held there, the business can enable self-service with Data Preparation tools capable of organising and cleansing the data prior to analysis in the cloud.
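The data-preparation step mentioned above is essentially a cleansing pass over raw records before they reach the warehouse. Here is a toy sketch of that idea; the field names and values are invented for illustration and are not from any real tool.

```python
# Raw records as they might arrive in the lake: stray whitespace, missing
# values, numbers stored as strings.
raw = [
    {"customer": "  Acme ", "revenue": "1200"},
    {"customer": "Globex", "revenue": None},
]

def cleanse(row: dict) -> dict:
    """Normalise one record: trim names, coerce revenue to a float."""
    return {
        "customer": (row.get("customer") or "").strip(),
        "revenue": float(row["revenue"]) if row.get("revenue") else 0.0,
    }

clean = [cleanse(r) for r in raw]
```

Real data-preparation tools do the same kind of work declaratively and at scale, but the shape of the transformation is the same.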

Phase 4: Real-time Analytics with Streaming Data

Businesses today need insight at their fingertips in real time. In order to prosper from the benefits of real-time analytics, they need an infrastructure to support it. These infrastructure needs may change depending on the use case, whether it be to support weblogs, clickstream data, sensor data or database logs.

As big data analytics and ‘Internet of Things’ (IoT) data processing moves to the cloud, companies require fast, scalable, elastic and secure platforms to transform that data into real-time insight. The combination of Talend Integration Cloud and AWS enables customers to easily integrate, cleanse, analyse, and manage batch and streaming data in the Cloud.
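At its core, real-time analytics over streaming data means computing aggregates incrementally as events arrive, rather than after a batch load. A minimal sliding-window sketch follows; the sensor readings are invented, and a production pipeline would use a streaming engine rather than a plain Python generator.

```python
from collections import deque

def rolling_average(stream, window=3):
    """Yield the mean of the last `window` values after each new reading."""
    buf = deque(maxlen=window)  # old readings fall off automatically
    for value in stream:
        buf.append(value)
        yield sum(buf) / len(buf)

# Hypothetical sensor readings arriving one at a time.
readings = [10.0, 12.0, 11.0, 20.0]
averages = list(rolling_average(readings))
```

Each incoming value updates the window immediately, which is the property that distinguishes streaming aggregation from batch analytics.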

Phase 5: Machine Learning for Optimized App Experiences

In the future, every experience will be delivered as an app through mobile devices. In providing the ability to discover patterns buried within data, machine learning has the potential to make applications more powerful and more responsive. Well-tuned algorithms allow value to be extracted from disparate data sources without the limits of human thinking and analysis. For developers, machine learning offers the promise of applying business critical analytics to any application in order to accomplish everything from improving customer experience to serving up hyper-personalised content.

To make this happen, developers need to:

  • Be “all-in” with the use of Big Data technologies and the latest streaming big data protocols
  • Have large enough data sets for the machine-learning algorithm to recognize patterns
  • Create segment-specific datasets using machine-learning algorithms
  • Ensure that their mobile apps have properly-built APIs to draw upon those datasets and provide the end user with whatever information they are looking for in the correct context
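As a concrete illustration of the segmentation bullet above, here is a toy one-dimensional k-means pass. The spend figures and cluster count are invented for illustration and bear no relation to any real dataset.

```python
def kmeans_1d(values, centroids, iters=10):
    """Cluster 1-D values around the given initial centroids."""
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        clusters = {c: [] for c in centroids}
        for v in values:
            nearest = min(centroids, key=lambda c: abs(c - v))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(vs) / len(vs) if vs else c
                     for c, vs in clusters.items()]
    return sorted(centroids)

# Hypothetical monthly spend per user: two segments emerge, low spenders
# around 6 and high spenders around 99.
monthly_spend = [5, 7, 6, 95, 102, 99]
segments = kmeans_1d(monthly_spend, [0.0, 50.0])
```

A segment-specific dataset is then just the users assigned to each final centroid, which an app's API can serve as hyper-personalised content.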

Making it Happen with iPaaS

In order for companies to reach this level of ‘application nirvana’, they will need to have first achieved or implemented each of the four previous phases of hybrid application integration.

That’s where we see a key role for integration platform-as-a-service (iPaaS), which is defined by analysts at Gartner as ‘a suite of cloud services enabling development, execution and governance of integration flows connecting any combination of on premises and cloud-based processes, services, applications and data within individual or across multiple organisations.’

The right iPaaS solution can help businesses achieve the necessary integration, and even bring in native Spark processing capabilities to drive real-time analytics, enabling them to move through the phases outlined above and ultimately successfully complete stage five.

Written by Ashwin Viswanath, Head of Product Marketing at Talend

IBM acquires Salesforce consulting partner

IBM has announced plans to acquire Salesforce specialist consulting business Bluewolf, which will bolster its Global Business Services Interactive Experience (iX) department.

IBM iX, marketed as a next-generation hybrid consultancy and digital agency, has been bolstering its ranks in recent months, with the Bluewolf deal set to be the fourth acquisition since the turn of the year. In January the business bought US ad agency Resource/Ammirati, February saw the purchase of Berlin-based digital agency Aperto, and earlier this month the acquisition of ecx.io, another digital marketing agency, was announced. While previous deals took IBM iX down the route of digital marketing and advertising, the Bluewolf deal takes the department back down more traditional IBM routes.

While it has not been announced when the deal will be completed, IBM hopes it will provide an edge in the market for medium-sized businesses and enterprise-scale organizations. Bluewolf, which specializes in helping companies integrate Salesforce’s CRM services into their IT systems, is believed to be one of Salesforce’s oldest consulting partners, claiming to have delivered more than 9,500 successful engagements worldwide.

“I’m so proud of Eric (Eric Berridge, Bluewolf CEO), who built Bluewolf from a start-up into a leader in Salesforce services,” said Marc Benioff, chairman and CEO, Salesforce. “The powerful combination of our strategic partners, IBM and Bluewolf, will help clients transform and demonstrate the growing client demand for our Customer Success Platform.”

IBM said the acquisition of Bluewolf would give the Global Business Services division deeper consulting capabilities, as it continues efforts to differentiate the brand in a crowded market place. “There is no question that the consumer-grade experience has emerged as a fundamental element in modern business strategy,” said Bridget van Kralingen, SVP at IBM Global Business Services. “Meeting that expectation defines next-generation differentiation and competitive position, and with Bluewolf, we add expertise to scale that capability to the cloud-based capabilities of Salesforce.”

The series of acquisitions seemingly builds on the trends of more demanding customers and evolving consumer expectations in the digital landscape. An IBM survey found that 81% of C-suite leaders anticipate more digital and virtual engagement by 2020, and 66% anticipate a stronger focus on customers as individuals. It would appear IBM is attempting to get a jump-start on competitors through strategic acquisition, as opposed to organic growth and transformation.

Why colocation is becoming the “nexus” of cloud and enterprise IT

(c)iStock.com/4X-Image

Global colocation revenues are going up and are on track to reach more than $33bn (£23bn) by 2018, according to the latest forecast from 451 Research. Yet the latest figures, argue the analysts, mean the data centre industry is maturing rather than in any trouble.

2015 was the best year on record for deals in the data centre, hosting and managed services sector, according to 451. The two most prominent companies in the colocation space are Digital Realty and Equinix; the former commands the most real estate by square feet (7.8% of share), while the latter, despite having less than half of the footprint, coughs up the most annual revenues (8.1% compared to 5.6% for Digital Realty).

According to the research, the largest global region in terms of operational space is Asia Pacific (40.1%), followed by North America (33.7%) and EMEA (22.1%). By 2018, the overall footprint for global colocation will be 176.5 million square feet, up from 132.4 million today.

“Colocation is quickly becoming the nexus of both cloud and enterprise IT,” said Katie Broderick, 451 research director. “The colocation market is serving as ‘data centre arms dealer’ to both enterprises and the cloud. In this process, colocation is often becoming the strategic connection point between the two.”

These figures continue to be one in the eye for those who predicted cloud would kill the data centre. As a Logicworks post explained in February: “As enterprises begin to move ‘easy’ workloads to [providers such as] AWS, they want to move not-ready workloads to a managed environment outside their internal data centres.

“Colocation is rising in popularity precisely because enterprises want cloud,” it adds. “It fits well into a hybrid cloud plan, helps enterprises consolidate data centres, and helps transition people and processes to a shared responsibility model.”

For data centres overall, the leading market continues to be the US. According to the latest figures from Synergy Research, almost half (46%) of major cloud and internet data centre sites are based there, with China second placed at only 6% of coverage.

Intel backs software-defined-infrastructure to bolster position in hybrid cloud market

Intel has backed the growth of software-defined infrastructure to bolster its management and orchestration position in the hybrid cloud market segment.

The company announced the launch of the Xeon processor E5-2600 v4 product family and the SSD DC D3700 and D3600 Series, alongside industry partnerships with VMware and the Cloud Native Computing Foundation. To boost its open-source credentials, Intel will also be collaborating with open-source players CoreOS and Mirantis.

“Enterprises want to benefit from the efficiency and agility of cloud architecture and on their own terms – using the public cloud offerings, deploying their own private cloud, or both,” said Diane Bryant, GM of Intel’s Data Center Group. “The result is pent-up demand for software-defined infrastructure. Intel is investing to mature SDI (software-defined infrastructure) solutions and provide a faster path for businesses of all sizes to reap the benefits of the cloud.”

It would appear Intel is backing the growth of SDI as a means of building its position in the management and orchestration market. As part of the Cloud for All initiative, Intel is investing in others in the industry to accelerate SDI-enabled clouds. A survey from 451 Research also lends weight to Intel’s position, as 67% of enterprises plan to increase spend on SDI over the course of 2016.

The E5-2600 v4 product family also includes Resource Director Technology, which the company claims will help customers move to fully automated SDI-based clouds. The updated product offering provides 20% more cores and cache than the prior generation, which could strengthen its orchestration position, according to the company.

As part of the collaboration with CoreOS and Mirantis, Intel will help merge the two companies’ technologies to create an open-source solution for orchestrating container- and virtual machine-based applications. Alongside the move to differentiate the brand through an SDI product offering, Intel is seemingly joining the charge on open-source propositions, a growing trend throughout the cloud industry.