Verizon Enterprise launches cloud backup product with Actifio

Verizon Enterprise Solutions has launched a new cloud backup service alongside Actifio, aimed at accelerating application development and improving business resiliency.

The new offering, available to customers using a virtualized environment, creates a unified hybrid cloud environment with the aim of making data easier to manage, access and protect. The product will be available to customers in North America in June, and in other regions towards the end of the year.

“The complexity of legacy infrastructure limits the ability of many enterprises to innovate around their data,” said Dan Jablonski, Director of Cloud and IT solutions, Verizon Enterprise Solutions. “We chose Actifio’s class-leading copy data virtualisation technology to power this new offering because it means we can now offer customers a simple, single solution to protect, move and store data in our cloud. Together with Actifio, we’re helping clients to be more agile so they can deliver better experiences to their own customers.”

The new offering is built on Actifio’s technology, which the company claims allows customers to move data back and forth between the customer premises and Verizon’s cloud-based infrastructure, provides self-service instant access to data to speed up deployment, and improves resiliency and availability by protecting data across the full range of conventional protection use cases.

“Data is the lifeblood of business, and it’s essential to have access to the data and applications you need when and where you need them,” said Ash Ashutosh, CEO of Actifio. “This next step in our relationship with Verizon will enable us to provide exactly that to more customers around the world, more easily and efficiently than ever before. We are thrilled to take this step forward with what is becoming one of our most important and valued cloud service provider partnerships.”

Joyent launches Container-Native offerings for public and hybrid cloud platform

Joyent has launched its next generation container-native (G4) and KVM-based (K4) instance package families, which are now available on its Triton-powered public cloud platform.

The company’s cloud platform runs on containers rather than the traditional VMs that the majority of other cloud platforms use, which it claims notably improves efficiency. The software used to run the Triton Cloud service is 100% open source and available for customers to operate in their own private data centres within a hybrid cloud model.

“Workloads are more efficient on Triton Cloud,” said Bill Fine, Vice President Product and Marketing at Joyent. “This is because Triton allows you to run containers natively, without having to pre-provision (and pay for) virtual machine hosts. The result is less waste and more cost savings for you.

“Consider our recent blueprint to run WordPress in containers. A minimum running implementation requires six g4-highcpu-128M instances and costs just over $13 per month. That minimal site may be perfect for a small blog or staging a larger one. Should you need to scale it, you can resize the containers without restarting them or scale horizontally with Docker-compose scale (or another scheduler of your choice).”
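The scaling approach Fine describes can be sketched in a Compose file. The sketch below is hypothetical and not Joyent's actual WordPress blueprint; the service names, images and environment variables are illustrative assumptions. The idea it shows is the one in the quote: a stateless web tier that can be scaled horizontally with the Compose v1 `docker-compose scale` command while the rest of the stack is untouched.

```yaml
# Hypothetical docker-compose.yml sketch (not Joyent's published blueprint).
# A stateless WordPress web tier in front of a single database service.
version: "2"
services:
  wordpress:
    image: wordpress
    environment:
      WORDPRESS_DB_HOST: db
    ports:
      - "80"          # unbound host port, so scaled replicas don't collide
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: example

# Scale the web tier to three containers (the v1 CLI syntax cited in the quote):
#   docker-compose scale wordpress=3
# Newer Compose releases express the same thing as:
#   docker-compose up --scale wordpress=3
```

Only the `wordpress` service is scaled; because it holds no state, replicas can be added or removed without restarting the database.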

Joyent’s value proposition and marketing campaigns are seemingly built on the claim that it is cheaper and more efficient than AWS, as the team appears set on taking the fight to the incumbent industry leader. The company claims a notable price-performance advantage: compared to AWS, Elasticsearch clusters on Triton complete query requests 50% to 70% faster, sharded MongoDB clusters complete tasks 100% to 150% faster, and standard primary/replica Postgres configurations run up to 200% faster.

“The cost of running on Triton is about half the cost of running on AWS,” said Fine. “With enough experimentation and determination you might be able to narrow this cost gap by more efficiently bin-packing your containers into VMs on AWS, but on Triton those efficiencies are built in and the cost and complexity of VM host clustering is removed. Each container just runs (on bare metal) with the resources you specify.”

Dell-EMC Merger Spurs Enterprises to Re-Examine Storage Strategies | @CloudExpo #Cloud

A BriefingsDirect IT market analysis discussion explores customer impacts to the global storage market now that the $67 billion Dell-EMC merger deal appears imminent.
A massive and complex financing apparatus, largely built on private equity debt, undergirds the deal, with privately held Dell taking over the publicly traded EMC and VMware federation. This largest IT vendor deal ever is expected to close sometime between now and October 2016.

Amdocs and Microsoft team up to launch Cloud-Fusion

Amdocs and Microsoft have collaborated to create an enterprise connectivity and applications solution, reports Telecoms.com.

Amdocs Cloud-Fusion will, according to Amdocs, enable the development of cloud service offerings including bandwidth, WAN optimisation and the delivery of committed SLAs. The solution combines Microsoft’s Azure cloud infrastructure and Amdocs’ NFV-ready Network Cloud Service Orchestrator, which also lends it the ability to design and deliver VNFs from any network vendor. The combination therefore allows access to Azure’s business services and third-party Microsoft Azure Marketplace solutions, Amdocs says.

The Cloud-Fusion platform is intended to provide unified management, monitoring, orchestration and assurance, enabling service providers to automate fulfilment and operations of cloud-based services. The hoped-for outcome is an improved customer experience.

“Today 17 percent of all businesses each have more than 1,000 virtual machines supporting a range of business-critical applications that reside in the public cloud, up from 13 percent in 2015,” said Ann Hatchell, head of network marketing at Amdocs. “Service providers can offer a one-stop shop for differentiated hybrid cloud services with service guarantees for enterprise customers, and streamline end-to-end service management across telco and public cloud environments, thereby improving service agility and reducing complexity.”

“Service providers will be able to capture new revenue streams from their business segment customer base by adding cloud services and providing access to Microsoft Azure’s value-added business services and Azure Marketplace’s solutions through secure service provider networks,” said Bob De Haven, GM for Worldwide Communications and Media at Microsoft.

IBM’s Watson takes aim at cybersecurity

IBM has launched a new cloud-based version of the company’s cognitive technology to tackle the rising challenge of cyber security.

The company’s R&D team has recently completed a year-long research program to teach Watson to understand the nuances of security research findings, which can be used to recognize patterns and uncover evidence of hidden cyber-attacks that might otherwise have been missed. IBM plans to move the security version into beta testing later this year.

Watson’s new capabilities address the skills gap within the security job market, as well as the reality that security data is too vast for humans to process unaided. As in other areas of the cloud industry, simple tasks are being automated, allowing employees to concentrate on the more critical areas of the business. IBM claims the average organization deals with 200,000 pieces of security event data per day, with enterprises spending $1.3 million a year dealing with false positives alone.

“Even if the industry was able to fill the estimated 1.5 million open cyber security jobs by 2020, we’d still have a skills crisis in security,” said Marc van Zadelhoff, GM at IBM Security. “The volume and velocity of data in security is one of our greatest challenges in dealing with cybercrime. By leveraging Watson’s ability to bring context to staggering amounts of unstructured data, impossible for people alone to process, we will bring new insights, recommendations, and knowledge to security professionals, bringing greater speed and precision to the most advanced cybersecurity analysts, and providing novice analysts with on-the-job training.”

The new offering is built on Watson’s ability to learn and reason from unstructured data, 80% of which cannot be processed by non-cognitive tools, and IBM claims it will learn from a number of different sources including blogs, articles, videos, reports and alerts. The company believes only 8% of this unstructured data is currently being utilized, making truly comprehensive security almost impossible. Once Watson for Security is released, it will provide customers with insights into emerging threats, as well as recommendations on how to stop them.

To further enhance the offering, the team have also announced eight partnerships with various universities to train Watson on the language of cybersecurity. The universities include California State Polytechnic University, Pomona; Pennsylvania State University; Massachusetts Institute of Technology; New York University; the University of Maryland, Baltimore County (UMBC); the University of New Brunswick; the University of Ottawa and the University of Waterloo.

Real-time AI for Smart Networks – Building an OpenStack ‘Policy Brain’ | @CloudExpo #AI #Cloud

Another of the main foundations of the AT&T Domain 2.0 program is the ‘DCAE’ framework: Data Collection, Analytics and Events. In short, it is AT&T’s big data platform for enabling smart management.
“In the D2 vision, virtualized functions across various layers of functionality are expected to be instantiated in a significantly dynamic manner that requires the ability to provide real-time responses to actionable events from virtualized resources, ECOMP applications, as well as requests from customers, AT&T partners and other providers.

Korea’s KBS TV to Feature @CloudExpo | @ThingsExpo #IoT #Cloud

Korean Broadcasting System (KBS) will feature the upcoming 18th Cloud Expo | @ThingsExpo in a New York news documentary about the “New IT for the Future.” The documentary will cover how big companies are transmitting or adopting the new IT for the future and will be filmed on the expo floor between June 7 and June 9, 2016, at the Javits Center in New York City, New York. KBS has long been a leader in the development of the broadcasting culture of Korea. As the key public service broadcaster of Korea, KBS has undertaken initiatives at technological turning points while providing a communication channel for diverse views.

Uber looks to outsource server infrastructure

(Image Credit: iStockPhoto/Bogdan Kosanovic)

Uber, the ride-sharing giant, is exploring the public cloud, moving beyond the privacy of its own server infrastructure. Sources say it has solicited bids from cloud players such as Amazon, Microsoft and Google for hosting part of its server load.

An Uber deal would also mean significant PR for the winning vendor.

The ride-sharing firm has expanded rapidly and is now active in 69 countries worldwide, and it faces the challenge of keeping its servers and software in top condition to ensure high performance and availability.

To achieve this speed and performance, Uber is looking to set up server facilities as geographically close to its customers as it can. It would be able to expand the global power of its infrastructure by tapping the cloud computing infrastructure of major providers like Google, Microsoft and Amazon.

According to reports, Uber is only looking to shift small parts of its code to the cloud, but the volume of those parts could be massive for a cloud provider, given the scope of Uber’s worldwide operations. This may trigger significant competition among providers, including IBM and others. This news comes as cloud vendors attempt to win over larger businesses.

The report further indicates that Uber’s technology is seemingly vendor-neutral, so it may partner with different cloud providers based on each provider’s data centre strength in various regions, such as Microsoft in India or Alibaba and Baidu in China.

Uber has not yet commented on or confirmed the matter.

Which cloud provider do you think Uber should partner with? Share your thoughts in the comments.

Salesforce plans to launch IoT offering on AWS

Salesforce has announced plans to launch its new IoT offering on AWS facilities, moving away from its traditional play of using its own data centre infrastructure, reports The Wall Street Journal.

The offering, reportedly set to be launched by Salesforce in the next couple of months, is currently available to a select number of customers as the team tests its various features. Salesforce’s IoT Cloud was initially announced last September, enabling customers to personalize the way they sell, service and market to their prospects. As part of the development, Salesforce has partnered with a number of firms including ARM, Etherios, Informatica, PTC ThingWorx and Xively by LogMeIn, to bring the service to market.

“Salesforce is turning the Internet of Things into the Internet of Customers,” said Marc Benioff, CEO of Salesforce at the time. “The IoT Cloud will allow businesses to create real-time 1:1, proactive actions for sales, service, marketing or any other business process, delivering a new kind of customer success.”

Salesforce has traditionally built new services on its own data centre infrastructure, though it would appear to be joining a number of other companies, including Netflix, that utilize the services of AWS alongside in-house options. This is not Salesforce’s first experience of AWS, however, as the company acquired Heroku in 2010, which operated on AWS. Working with AWS also gives Salesforce the flexibility to manage rapid growth should the offering gain significant traction at launch, as adding additional hardware to its own data centres to meet demand could take days or even weeks.

Alongside the IoT announcement, Benioff has taken to Twitter to apologize for a database failure on the NA14 instance, which caused outages lasting more than 12 hours for a number of customers in North America.

The failure occurred after “a successful site switch” of the NA14 instance “to resolve a service disruption that occurred between 00:47 to 02:39 UTC on May 10, 2016 due to a failure in the power distribution in the primary data centre” the company said. Although not confirmed by Salesforce, it would appear a large number of customers throughout North America were impacted by the failure.

New HP Tech Venture Group may lead to HPE overlap

HP has announced the launch of HP Tech Ventures, the new corporate venture arm of the business, which will invest in IoT and artificial intelligence start-ups that could end up competing with HPE.

The team will aim to develop partnerships and identify potential acquisitions within the new era of disruptive technologies. HP Tech Ventures, which will be based out of offices in Palo Alto and Tel Aviv, will be led by Chief Disrupter Andrew Bolwell, initially targeting new technologies in 3D transformation, immersive computing, hyper-mobility, the Internet of Things, artificial intelligence, and smart machines.

Following the split of Hewlett-Packard into two separate organizations, HP took the PC and printer assets, while HPE is now focused on enterprise-orientated technologies. Over the last several months, HPE has made numerous product launches and investments in cloud, machine-learning and IoT technologies, and the overlap with HP Tech Ventures’ targeted technologies (IoT, AI, smart machines, etc.) could potentially turn the once-combined companies into competitors. HPE also has its own venture arm, which has invested in various cloud, big data and security start-ups.

“The next technology revolution is shifting towards strategic markets that speak to HP’s strengths,” said Shane Wall, HP Chief Technology Officer and head of HP Labs. “With our global brand and broad reach into consumer and commercial markets worldwide, HP can help start-ups bring product to market, build their business and scale in the global marketplace as they grow.”

The company has claimed it will be able to offer rapid scale to innovative start-ups through its technology network, as well as its channel and distribution partners. The launch would appear to be one of HP’s strategies to counter the negative impact which declining PC sales are having on its traditional business, entering new markets through potential acquisitions as opposed to organic growth.