All posts by cloudtech

CNCF details advancements in cloud projects

The Cloud Native Computing Foundation (CNCF) held its KubeCon + CloudNativeCon North America conference on 6th December 2017 and shared developments across its open-source cloud efforts.

The CNCF houses the Kubernetes container orchestration system and 13 additional cloud projects that enable organisations to build cloud-native architectures. The event, which drew over 4,000 attendees, introduced new members and shared multiple project updates – including 1.0 releases from the containerd, Jaeger, CoreDNS and Fluentd projects.

Originally formed as a Linux Foundation Collaborative Project in July 2015, the CNCF now includes the likes of Microsoft and AWS among other major cloud providers.

Project updates

Among the project updates was the ‘containerd’ container runtime, originally developed by Docker, which joined the CNCF on 29th March 2017 at the CloudNativeCon + KubeCon Europe event in Berlin, Germany. The containerd 1.0 release was announced at KubeCon North America 2017 and provides a stable base for container engine development.

The Jaeger project also launched its 1.0 release. Jaeger is a distributed tracing system that can be used to help find application performance bottlenecks. It became a CNCF project on 13 September 2017. Chris Aniszczyk, CNCF CTO, said: "As you start building cloud native applications, having proper monitoring and tracing for applications are table stakes. Jaeger now gives cloud developers the ability to use distributed tracing within their stack."
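To give a flavour of what that looks like in practice, here is a minimal tracing sketch using Jaeger’s Python client; the service and span names are invented for illustration, and the configuration assumes a local Jaeger agent as in a typical development setup:

```python
from jaeger_client import Config

# Initialise a tracer that samples every request and reports spans
# to a local Jaeger agent (the default in development setups).
config = Config(
    config={"sampler": {"type": "const", "param": 1}, "logging": True},
    service_name="checkout-service",
)
tracer = config.initialize_tracer()

# Each traced operation becomes a span; its timing data flows to Jaeger,
# where unusually slow spans reveal the bottleneck.
with tracer.start_span("fetch-cart") as span:
    span.set_tag("cart.items", 3)
    # ... call the downstream cart service here ...

tracer.close()
```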

The Fluentd data collector project, originally developed by software firm Treasure Data before joining the CNCF in November 2016, also reached the 1.0 milestone. Aniszczyk said: "Fluentd is already a fairly mature project. The 1.0 milestone shows they have grown their committer and maintainer base to be larger than just a single company, which is important for the long-term health of the project."

Aniszczyk expects as many as 11 new projects to join the CNCF by the end of 2018, and said the foundation was reviewing five projects for admission in December 2017.

“One of the lessons we learned from other foundations is that it is not always healthy to force integration across projects via an aligned release cycle. Our attitude has been to let the market decide whether it makes sense to put different projects together,” explained Aniszczyk.

Aniszczyk also expressed the CNCF’s interest in the emerging market for serverless technologies. He said all the major cloud providers – including Amazon, Google and Microsoft – are working together through the CNCF serverless working group to identify important issues such as function portability. The working group is developing a specification, OpenEvents, intended to standardise how function-based events are described and executed across different serverless platforms.
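The specification was still at an early stage at the time of writing, but the idea is a common, platform-neutral envelope describing an event regardless of which cloud fires it. As a loose illustration only – every field name below is invented for the sketch and not taken from the working group’s draft – such an envelope might look like this:

```python
import json
import uuid
from datetime import datetime, timezone

# Illustrative portable event envelope: the metadata any serverless
# platform would need in order to route the event to a function.
event = {
    "event_id": str(uuid.uuid4()),
    "event_type": "com.example.object.created",   # reverse-DNS style type
    "source": "/storage/bucket-123",              # where the event originated
    "event_time": datetime.now(timezone.utc).isoformat(),
    "data": {"object_key": "reports/q4.pdf", "size_bytes": 48213},
}

# Any platform that understands the common envelope can dispatch it.
print(json.dumps(event, indent=2))
```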

What are your thoughts on the CNCF’s projects and momentum? Let us know in the comments.

Acuity: Cloud biometric solutions will authenticate over 1tn transactions annually by 2022

Acuity expects cloud-based biometric solutions to authenticate more than one trillion transactions annually by 2022

Cloud-based biometric solutions are expected to drive the mobile biometric market to $50.6 billion in annual revenue by 2022, according to Acuity Market Intelligence.

Acuity predicts that more than 5.5 billion biometrically enabled mobile devices will create a global platform supporting one trillion cloud-based biometric transactions yearly by 2022.

Maxine Most, Acuity's Principal and lead analyst, said: “Biometrics have become a mainstream convenience for unlocking smartphones and verifying on-device transactions. But the market is evolving towards a hierarchy of integrated biometric authentication methods that range from simple device-based verification to third-party biometric Cloud, or server-side, solutions. These solutions will replace traditional digital identity schemes and provide more secure and reliable identity assurance on a global scale.”

According to Acuity, annual biometric transaction revenue is expected to grow from $474 million in 2017 to $18 billion in 2022, exceeding a 100% compound annual growth rate (CAGR). Biometric app revenue is expected to grow at a 26% CAGR over the same period, rising from $9.4 billion to $29 billion a year. Acuity predicts mobile biometric transaction volume will reach 1.4 trillion a year in 2022 – more than 70% of it cloud-based – as biometric app downloads exceed 16.7 billion that year.
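For readers who want to check those growth claims, the standard CAGR formula roughly reproduces them (a quick illustrative calculation of my own, not figures from Acuity’s report):

```python
# Compound annual growth rate: (end / start) ** (1 / years) - 1
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Transaction revenue: $474M (2017) -> $18B (2022)
print(f"{cagr(0.474, 18.0, 5):.0%}")   # ~107%, i.e. above a 100% CAGR

# App revenue: $9.4B (2017) -> $29B (2022)
print(f"{cagr(9.4, 29.0, 5):.0%}")     # ~25%, in line with the quoted ~26%
```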

Most said: "Pushing biometrics to 'the edge' of the mobile ecosystem with device-based authentication improves the mobile experience, but with limitations. The big payoff comes from Cloud, or server-side biometrics, that simplify authentication and reduce friction while linking an individual to a device and platform independent Unique Verifiable Identity (UVI)."

Are you surprised at this rate of growth? Share your thoughts in the comments.

Trend Micro and AWS collaborate to boost cloud security

Trend Micro, which has announced an integration with Amazon GuardDuty, is also one of the first companies to sign up for Enterprise Contracts for AWS Marketplace.

Enterprise Contracts for AWS Marketplace provides standardised contract terms across many software vendors, simplifying and speeding up procurement for enterprise customers.

Through the collaboration, Trend Micro also expects to deliver application protection rules as part of the recently announced Amazon Web Services (AWS) Web Application Firewall (WAF) Managed Rules Partner Program.

“AWS Marketplace is simplifying the enterprise software procurement experience to accelerate customer innovation," comments Dave McCann, VP of AWS Marketplace and Catalog Services, Amazon Web Services. "Trend Micro is a valued AWS Marketplace seller that embraces customer feedback to drive their innovation in product and business models. We are delighted to have them as one of the first companies for Enterprise Contract for AWS Marketplace.”

The Trend Micro and Amazon GuardDuty integration lets users draw on security findings from Amazon GuardDuty to make smarter decisions about their Amazon Elastic Compute Cloud (Amazon EC2) and Amazon EC2 Container Service (Amazon ECS) workloads.
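For readers unfamiliar with GuardDuty findings, here is a minimal sketch of pulling them with the AWS SDK for Python (boto3). It shows the raw GuardDuty API rather than Trend Micro’s integration, and assumes a detector is already enabled in the region:

```python
import boto3

# GuardDuty organises findings under a per-region detector.
guardduty = boto3.client("guardduty", region_name="us-east-1")

detector_id = guardduty.list_detectors()["DetectorIds"][0]
finding_ids = guardduty.list_findings(DetectorId=detector_id)["FindingIds"]

if finding_ids:
    findings = guardduty.get_findings(DetectorId=detector_id,
                                      FindingIds=finding_ids[:10])
    for finding in findings["Findings"]:
        # Each finding carries a type, severity and the affected resource,
        # which downstream tools can use to adjust protection for EC2/ECS hosts.
        print(finding["Type"], finding["Severity"])
```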

Kevin Simzer, EVP at Trend Micro, said: “We are proud to provide an extra layer of protection to the innovative applications that AWS builders are deploying on the cloud. Our collaboration with AWS allows us to deliver scalable security that removes friction from procurement, the DevOps lifecycle and day-to-day operations.”

What are your thoughts on the Trend Micro and AWS collaboration? Let us know in the comments.

Amazon adopts open source as competition increases

Amazon has announced support for the open-source technology Kubernetes as competition in the cloud business increases.

The technology, originally developed by a team at Google, has received support from several enterprises including Microsoft, Oracle, and IBM. Among its advantages is the ability to run an application on any public cloud, including Azure and Google Cloud Platform, which makes it easier to migrate Kubernetes-based workloads from one cloud vendor to another.
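That portability comes from the Kubernetes API being identical regardless of who hosts the cluster. As a minimal sketch (assuming the official Python client and a kubeconfig pointing at your cluster), the same few lines work unchanged whether the cluster runs on AWS, Azure, Google Cloud or on-premises:

```python
from kubernetes import client, config

# Loads credentials from ~/.kube/config; the same code works no matter
# which cloud provider the current context points at.
config.load_kube_config()

v1 = client.CoreV1Api()
for node in v1.list_node().items:
    print(node.metadata.name, node.status.node_info.kubelet_version)
```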

At re:Invent, Amazon Web Services’ (AWS) annual conference, the company’s CEO Andy Jassy made the Kubernetes announcement.

According to analysts, Amazon previously offered a comparable container service of its own, but the Google-born technology established itself as the standard for such “container” technologies, leaving AWS little option but to support the open-source project.

Commenting on the AWS situation, Joe Beda, one of the developers of Kubernetes and the CTO of Heptio, said: “This is an example of AWS looking outside of their own world in response to customer need.”

In 2006, AWS pioneered the cloud computing business with a service regarded as a fast and simple way for smaller businesses to obtain affordable yet high-powered computing. The business attracted not only smaller businesses but large enterprises too.

The market has shifted, however. Along with AWS, two of its rivals have also gained share in the worldwide cloud infrastructure market, according to market research firm IDC.

AWS’ share rose to 45.4% in H1/2017 from 43.8% in H1/2015, while rival Google Cloud’s share increased to 3.1% from 1.7% over the same period. The most significant gain belonged to Microsoft Azure, whose share grew to 10.3% in 2017 from 5.6% in 2015.

Amit Agarwal, chief product officer of Datadog, said, “Amazon is still the clear market leader, but the cloud infrastructure market is massive and there’s room for many players.”

What are your thoughts on Amazon's adoption of Kubernetes? Let us know in the comments.

Box introduces framework to apply machine learning to cloud content

Cloud content management company Box has unveiled Box Skills, a framework for applying machine learning tools such as computer vision, video indexing, and sentiment analysis to stored content. Box Skills will enable businesses to re-imagine processes previously considered too impractical or expensive to digitise or automate.

At BoxWorks 2017, Box previewed three initial Box Skills currently in development, leveraging machine learning tools from IBM Watson, Microsoft Azure and Google Cloud to solve common business use cases, including:

Audio Intelligence: Uses audio files to create and index a text transcript that can be easily searched and manipulated in a variety of use cases; powered by IBM Watson technology.

Video Intelligence: Provides transcription, topic detection and people detection, allowing users to quickly find the information they need in a video; powered by Microsoft Cognitive Services.

Image Intelligence: Detects individual objects and concepts in image files, captures text through optical character recognition (OCR), and automatically adds keyword labels to images to easily build metadata on image catalogues; powered by Google Cloud Platform.

David Kenny, Senior Vice President, IBM Watson and Cloud Platform, said: “Box Skills is an extension of our strategic partnership with Box aimed at helping businesses work more efficiently, solve challenges and seize opportunities for innovation.”

Scott Guthrie, EVP of Microsoft’s cloud and enterprise division, said: “Box Skills brings together Box’s cloud content management and Microsoft Azure’s industry-leading AI services to deliver intelligent insights for customers.”

Box also launched Box Skills Kit, a set of developer resources for building custom Box Skills. Box Skills Kit will enable an ecosystem of cloud services providers, independent software vendors, systems integrators, and enterprises themselves to create custom Box Skills implementations.
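Box published the Skills Kit as a Node.js toolkit; the sketch below is a language-agnostic outline of the flow a custom Skill follows – upload event in, external ML call, results written back as metadata – rendered in Python with stubbed, hypothetical helpers rather than the actual Box Skills Kit API:

```python
# Conceptual outline of a custom Box Skill. The helper functions and the
# event shape are hypothetical stand-ins, not the Box Skills Kit API.

def call_ml_service(download_url):
    """Stand-in for a call to a vision/speech/NLP service."""
    return ["invoice", "signature"]

def write_skill_metadata(file_id, data):
    """Stand-in for writing metadata 'cards' back via the Box API."""
    print(f"attach {data} to file {file_id}")

def handle_upload_event(event):
    # A Skill is triggered when a file lands in a watched folder; it
    # enriches the file and writes the results back as searchable metadata.
    file_id = event["source"]["id"]
    labels = call_ml_service(event["source"]["download_url"])
    write_skill_metadata(file_id, {"keywords": labels})

handle_upload_event({"source": {"id": "12345",
                                "download_url": "https://example.com/f/12345"}})
```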

The company also demonstrated its Box Graph technology at BoxWorks 2017. It is an intelligent network of content, relationships, and activity that Box will leverage to power new experiences and services for both individual Box users and enterprises. Box launched the first new experience powered by Box Graph, called Feed. It is a personalised activity feed that curates and surfaces the most relevant updates, insights and content for each Box user.

Jeetu Patel, Chief Product Officer for Box, said: “Box Skills and Box Graph represent a truly practical application of intelligence for the enterprise, ensuring our customers can realise incredible value from every piece of content they have in Box.”

What are your thoughts on Box’s new framework? Let us know in the comments.

IBM develops new programming model to create serverless applications

IBM announced a new capability at the Serverless Conference in New York City on 10th October 2017 to further the development of serverless applications.

The new tool, Composer, comprises a library of patterns that are key to building serverless applications, with inherent serverless characteristics such as auto-scaling and pay-as-you-go pricing built in.

Currently, the programming model is available only for Node.js, but developers can port it to other programming languages such as Python, Swift, and Java.

Rodric Rabbah, principal researcher for IBM Cloud Functions, said: “Rather than forcing people to learn new programming languages, we’d rather bring the model into their favourite programming language.”

The library highlights the key features of serverless and Functions-as-a-Service, as Rabbah explains: “To make functions first class and still allow you to orchestrate the execution of all the functions and the data flow between them automatically, freeing the programmer from having to do that.”

IBM researcher Paul Castro explained that Composer changes the typical process, in which developers building serverless solutions either had to roll their own composition in an ad hoc manner or use a separate service such as AWS Step Functions. Castro said: “Composer is bringing that composition into the development flow you would already use for serverless, and it’s well integrated into what we have in OpenWhisk / IBM Cloud Functions.”
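Composer itself is a Node.js library, but the core pattern it provides – declaring how functions chain together so the platform can run the data flow between them – can be sketched conceptually in a few lines of Python. This illustrates the composition idea only, not Composer’s actual API:

```python
# A toy 'sequence' combinator: each function's output feeds the next,
# which is the kind of orchestration Composer declares for cloud functions.
def sequence(*functions):
    def composition(payload):
        for fn in functions:
            payload = fn(payload)
        return payload
    return composition

def validate(order):
    assert order["amount"] > 0
    return order

def charge(order):
    return {**order, "charged": True}

checkout = sequence(validate, charge)
print(checkout({"amount": 42}))   # {'amount': 42, 'charged': True}
```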

IBM is also introducing the functions shell, a new tool to help with developing, deploying, running, and debugging serverless functions and compositions. For example, a developer can use it to edit code in a text editor instead of a drag-and-drop UI, or to validate compositions with visualisations without switching tools. It also helps developers deploy and invoke compositions using familiar CLI commands.

What are your thoughts on IBM’s new Composer tool? Let us know in the comments.

IDC: Public cloud accounts for just over a third of worldwide IT infrastructure sales

International Data Corporation (IDC) has said that public cloud providers are reshaping the IT market, being responsible for a third of all sales of servers, data storage, and Ethernet switching equipment.

According to the IDC research, public cloud companies crossed a milestone during Q2/2017, accounting for just over a third (33.5%) of worldwide IT infrastructure sales – revenue of $8.7 billion, up 34.1% y/y.

Private cloud generated $3.7 billion in sales during Q2/2017, an increase of almost 10% y/y. Together, public and private cloud IT infrastructure sales have nearly tripled in the last four years, IDC noted.

Demand from traditional, non-cloud customers decreased 3.8% y/y in Q2/2017. Yet the segment remains an important one, generating $13.6 billion in the quarter and representing more than half (52.4%) of the market.
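Those segment figures hang together: the three revenue numbers sum to a market of roughly $26 billion, from which the quoted shares follow (a quick illustrative check of my own, with small rounding differences against IDC’s percentages):

```python
# Q2/2017 IT infrastructure sales, $bn, as quoted above
public, private, traditional = 8.7, 3.7, 13.6
total = public + private + traditional              # ~26.0

print(f"public cloud: {public / total:.1%}")        # ~33.5%
print(f"traditional:  {traditional / total:.1%}")   # ~52.3% (IDC quotes 52.4%)
```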

Demand for storage is particularly high, accounting for over a third of public cloud revenues in Q2 and growing 30.4% y/y. Sales of Ethernet switches and servers increased 26.8% and 24.6% respectively.

Kuba Stolarski, research director at IDC’s Computing Platforms practice, said that Amazon has been a driving force behind the increase in spending. Its rivals, however, are not sitting still.

Stolarski said: “It is important to remember that many of the other hyperscalers – Google, Facebook, Microsoft, Apple, Alibaba, Tencent, and Baidu – are preparing for their own expansions and Skylake/Purley refreshes of their infrastructure.” Skylake and Purley are newer Intel processor platforms for high-performance servers that cloud providers can use to speed up their workloads.

Dell holds a slight lead with 11.8% of the market on cloud IT infrastructure sales of over $1.4 billion, with HPE close behind on an 11.1% share and over $1.3 billion in revenue.

Cisco sits in third position with just over $1 billion in sales and 8.2% of the market, while Huawei, NetApp and Inspur round out the leading vendors. Collectively, ODMs (original design manufacturers) that sell directly to data centre customers beat them all, with $5.4 billion in sales during Q2/2017 and 44% of the market.

Are you surprised at the IDC results? Share your thoughts in the comments.

Google Cloud catches up to AWS with Transfer Appliance

Google Cloud has caught up to AWS with a physical ‘Transfer Appliance’ for moving data from your own local servers into the giant’s cloud.

Amazon already has a comparable offering, called ‘Snowball’: a ruggedised appliance, in 50TB or 80TB capacities, which the company sends to your premises so you can fill it with your data locally before it heads back to your preferred AWS data centre. The idea, of course, is that you benefit from a much quicker transfer without the latency and cost of uploading over a standard WAN (wide area network).

If you have a large amount of data and don’t want to pay through the roof, Google’s solution may benefit you more than Amazon’s. The web giant has gone beyond the capacity of Amazon’s similar offerings with a 100TB/2U basic Transfer Appliance, or an incredible 480TB/4U variant. Both are designed to fit into standard 19” racks.

Google has provided a handy chart of the estimated time differences between a physical and an online transfer.
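To get a feel for the gap such a chart illustrates, here is a back-of-the-envelope calculation (my own illustrative link speeds and utilisation, not Google’s figures):

```python
# Rough online-transfer time for a dataset over a dedicated link.
def transfer_days(terabytes, link_gbps, utilisation=0.8):
    bits = terabytes * 1e12 * 8
    seconds = bits / (link_gbps * 1e9 * utilisation)
    return seconds / 86400

print(f"{transfer_days(100, 1):.0f} days over 1 Gbps")      # ~12 days
print(f"{transfer_days(100, 0.1):.0f} days over 100 Mbps")  # ~116 days
```

Against numbers like these, shipping a loaded appliance that arrives in days is the faster path for bulk migrations.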

“Using a service like Google Transfer Appliance meant I could transfer hundreds of terabytes of data in days, not weeks,” comments Tom Taylor, Head of Engineering at The Mill. “Now we can leverage all that Google Cloud Platform has to offer as we bring narratives to life for our clients.”

As for pricing, the 100TB model is priced at $300, plus shipping via FedEx (approximately $500); the 480TB model is priced at $1,800, plus shipping (approximately $900). Initially, the appliance will only be available in the US.

It’s worth noting, of course, that Amazon still takes the crown if you need to transfer an insane amount of data, with the 100PB (yes, petabytes) truck it calls the Snowmobile. Before the 45-foot ruggedised shipping container – which is pulled by a semi-trailer truck – rolls out to your premises, you will need an initial assessment.

Are you impressed with Google’s Transfer Appliance? Share your thoughts in the comments.

Oracle is hiring 1,000 employees for its fast-growing cloud business

After reporting 58% year-on-year revenue growth in its cloud businesses and garnering $4.6 billion in sales from cloud computing software and hardware, multinational computer technology giant Oracle Corporation is expanding its cloud computing services in Europe, the Middle East and Africa.

Cloud-related products now account for more than 12% of Oracle’s total sales.

Tino Scholman, vice president of Oracle’s cloud computing for the region, said: “Our cloud business is growing at incredible rates, so now is the right time to bring in a new generation of talent.” The company is set to recruit 1,000 employees in Europe, the Middle East and Africa to serve its growing cloud computing business in the region. It is seeking workers with two to six years of experience to fill sales, management, finance, recruitment, marketing and human resources roles.

At present, the Redwood City, California-based company employs about 51,000 staff in the US and 85,000 globally.

Oracle derived 28% of its overall revenue from Europe, the Middle East and Africa in 2016, although sales in the region declined 2% to $10.6 billion as customer preferences shifted from Oracle’s traditional enterprise computing software to cloud-based services.

According to research firm IDC, public-cloud spending is expected to increase 27% year-on-year to reach $82 billion by 2020. Bloomberg Intelligence’s July report states that Oracle’s “cloud infrastructure products are gaining traction and should become a major pillar of growth next year, amid increasing competition from Amazon.”

Demand for cloud-computing services is rising notably, with Amazon, Alphabet’s Google, Microsoft, International Business Machines and others reporting sweeping growth in cloud-computing sales. To capitalise on the cloud push and remain competitive, these companies have been adding data centres across Europe.

Are you impressed with Oracle’s growth? Share your thoughts in the comments.

Opinion: Is the use of public cloud ‘fundamentally disempowering’?

Speaking at the OpenStack Summit in Boston last month, Edward Snowden warned that the use of public cloud providers is ‘fundamentally disempowering’.

As reported by ZDNet, Snowden told the audience – through video conference, of course – that ‘we can’t let people be mindless when they’re building clouds.’ “You give them money, and they provide you with a service, but you are also providing them with more than money. You’re giving up control, influence.”

But what does this mean in terms of keeping vital workloads in the public cloud? Below, industry experts weigh in on the issues:

David Griffiths, VP of EMEA, Tintri

Many have been quick to recognise the benefits delivered by public cloud, but what is also clear are the sacrifices made when it comes to control over data. The public cloud provides agility and the ability to scale, but, as Snowden explains, often at the cost of the freedom to shape an environment to specific workload requirements.

Put simply, pouring money into a third-party infrastructure that companies have no real ownership of doesn’t make good budgetary, business or security sense – especially when the technology exists to provide the same scale and efficiency within their own data centre.

Enterprise cloud, however, provides public cloud-like agility, allowing organisations to benefit from similar applications and services. It also allows for massive scale-out with the security, privacy and governance levels you would expect from a private environment. Alongside predictive analytics, granular-level abstraction and the ability to automate, it ensures organisations know exactly where their data is at any given time.

Gary Watson, founder and VP of technical engagement at Nexsan

One of the main points Snowden addresses is the ability for third-party providers to access encrypted user data. Trusting a third-party provider with data is a step that should be very carefully considered. In today’s digital age, data is the lifeblood of any organisation and it is fundamental that organisations can guarantee control, security and locality. However, there is no doubt that organisations require the flexibility and agility of the cloud, as it promotes a more collaborative way of working, and we are certainly seeing an uptake in cloud-based solutions.

On-premises private cloud solutions are available, which allow organisations the benefits of the cloud while keeping data on site through a privately-owned appliance. Forward thinking organisations that can incorporate the agility and flexibility of the cloud while still being able to maintain control over security and data locality will be in a far better position in the market. In order to do this, it is key that businesses understand their unique data needs and opt for a solution that will enable secure, reliable access.

Jake Madders, director at Hyve Managed Hosting

No one can deny that public cloud is a hugely successful IT innovation that shows no signs of slowing down. While there are numerous benefits to entrusting your data with one of the big public providers, there are also considerable drawbacks when it comes to performance, security and compliance with an unmanaged public cloud. AWS-like auto-scaling cannot identify bottlenecks and over-used resources in the way a Managed Service Provider (MSP) can, nor can an unmanaged public environment provide the same level of security and adherence to regulation, which is especially important with big changes surrounding GDPR about to take place.

Working with an MSP can guarantee optimum service levels across all platforms, taking the best aspects from each, all while offering continual support for businesses looking to make the best of cloud computing.

Paul Mills, group sales director at Six Degrees Group

Edward Snowden’s comments raise some interesting considerations, but adopting public cloud should not be about giving up control.  Public cloud services have an important role to play for organisations of any size and can provide a significant springboard to business transformation when used in the right way.  

However, in choosing any type of cloud service – public, private or hybrid – privacy, governance and regulation need to be at the forefront of the decision-making process to keep you in the driving seat and to ensure the correct services are chosen.  There are many options available and organisations should plan this activity carefully, seeking advice from their trusted technology partners to ensure they find the best solution to meet their needs.