Category archive: Enterprise IT

NTT announces five major additions to Enterprise Cloud

Japan’s NTT Communications (NTT Com) has announced five major improvements to its Enterprise Cloud in a bid to become the chosen vehicle for digital transformation. The new enhanced Enterprise Cloud is immediately available in Japan, and will be rolled out in the UK, Singapore, US, Australia, Hong Kong and Germany later this year.

According to NTT Com, enterprises want to make the difficult crossing from traditional IT to the cloud but need greater support. In response, it has announced that its new Enterprise Cloud offering will make the journey easier. The new improved offerings are described by NTT Com as Hosted Private Cloud for traditional ICT, Multi Tenant Cloud for cloud native ICT, seamless hybrid clouds, free and seamless connections between cloud platforms and a cloud management platform to give full visibility and governance.

NTT Com said enterprises with cloud ambitions face two major challenges: migrating their traditional systems and, once in the cloud, adapting their development and operations practices to fit the new ‘cloud native’ application culture.

NTT Com outlined how each of these five Enterprise Cloud ‘enhancements’ will help its target clients. The Hosted Private Cloud for Traditional ICT now consists of dedicated bare-metal servers with options for multi-hypervisor environments, including VMware vSphere and Microsoft Hyper-V. The logic of the service is to make it easier for companies with traditional ICT to migrate to a hosted private cloud.

The Enterprise-class Multi-tenant Cloud for Cloud-Native ICT is based on an OpenStack architecture, giving customers an industry-standard open API to control the Enterprise Cloud. It comes with Platform-as-a-Service (PaaS) software from Cloud Foundry to provide DevOps efficiencies. The open architecture was needed to address customer concerns about vendor lock-in, says NTT Com.

NTT Com creates the Seamless Hybrid Cloud Environment for clients by configuring all the relevant components (virtual servers, bare-metal servers, firewalls and load balancers) to work alongside their complex on-premises environments.

The promised ‘Free and Seamless Connection between Cloud Platforms’ is made by connecting the Enterprise Cloud with a 10Gbps best-effort closed network, free of charge. In addition, connectivity between Enterprise Cloud platforms and data centres is provided at ‘competitive’ prices globally, said NTT Com.

Finally, the new Cloud Management Platform (CMP) promises full visibility and IT governance by unifying the control of both Enterprise Cloud and third-party providers’ clouds, including Amazon Web Services (AWS) and Microsoft Azure.

Microsoft adds Red Hat Linux, Containers and OneOps options to Azure

Microsoft has launched a trio of initiatives aimed at widening the options for potential clients of its Azure cloud services.

It made the announcements through the Azure Blog, which promised the availability of new Red Hat Enterprise Linux ‘instances’ (i.e. units of computing resources) and a new application lifecycle manager, OneOps, and showcased a preview of an imminent Azure Container Service.

The Red Hat Enterprise Linux instances are available from the Azure Marketplace. According to the blog, 60 percent of the images available are now Linux-based. Microsoft claims its hybrid model can be running ‘in minutes’ with Red Hat Enterprise Linux images available on Azure Marketplace on a Pay-as-you-go model with hourly billing.

Among the eligible products are Red Hat Enterprise Linux, Red Hat JBoss Enterprise Application Server, Red Hat JBoss Enterprise Web Server, Red Hat Gluster Storage and Red Hat OpenShift.

“Both Microsoft and I love Linux,” said Corey Sanders, Azure’s Director of Program Management. The new instances will help cloud users cater for on-demand workloads, development and testing and cloud bursting in a simple, easily quantifiable system, Sanders said. The Red Hat Enterprise Linux 6.7 and 7.2 images are now live in all regions, except China and the US Government.

The imminent Azure Container Service – currently available for preview – will build on previous Docker and Mesosphere initiatives to make it easier to provision clusters of Azure Virtual Machines for containerised applications. The process will be a lot quicker since the machines will have been pre-configured with open source components, Sanders said.

Sanders also disclosed that Microsoft has certified for the Azure Marketplace a group of Linux images created by Bitnami. Meanwhile, Microsoft’s new OneOps offering on Azure, which gives clients the use of an open-source cloud and application lifecycle management platform, is a product of a collaboration with the WalmartLabs team (the IT offshoot of retail giant Walmart).

EMC and VMware launch hyperconverged VxRail appliance

Storage vendor EMC and virtualiser VMware have jointly launched a family of hyper-converged infrastructure appliances (HCIA) for VMware environments. The plug-and-play appliances are meant to simplify infrastructure management in departments experiencing high growth.

The VxRail appliance family combines EMC’s data services and systems management with VMware’s software such as vSphere and Virtual SAN. The intention is to create software defined storage natively integrated with vSphere in a single product family with one point of support. The all-flash VxRail appliances could simplify VMware customer environments and boost performance and capacity in a simple plug and play operation, the vendors claim.

The appliances were jointly engineered to integrate virtualisation, computing, storage and data protection in one system with a single point of support, say the vendors. Since they can be aggregated at great scale, the estate of appliances can grow from supporting two virtual machines (VMs) to thousands of VMs on a ‘pay-as-you-grow’ basis.

Starting prices for small and medium businesses and remote offices are around $60,000, with options to cater for performance-intensive workloads with up to 76TB of flash. The appliances will run EMC’s data services, including replication, backup and cloud tiering, at no additional charge. In addition, RecoverPoint for Virtual Machines, Virtual SAN, vSphere Data Protection and EMC Data Domain are all available.

Meanwhile VCE VxRail Manager will provide hardware awareness with timely notifications about the state of applications, VMs and events. VxRail Appliances can use EMC cloud tiering to extend to more than 20 public clouds such as VMware vCloud Air, Amazon Web Services, Microsoft Azure and Virtustream. These can provide an additional 10TB of on-demand cloud storage per appliance.

“The new appliances put IT organisations on a path to eliminating complexity and collapsing cost structures,” said Chad Sakac, President of the Converged Platforms division of EMC.

According to ESG research on hybrid cloud, 70% of IT respondents plan to invest in HCI in the next 24 months. The new appliance family is due out in Q2 2016.

New IBM z13s mainframe was built with a BIOS for hybrid cloud

IBM has designed its latest mainframe to address the challenges stopping hybrid cloud from becoming the de facto model of enterprise computing. The result has been benchmarked by analysts as the world’s most secure server for enterprise hybrid cloud computing.

The new IBM z13s mainframe, unveiled in February and available from March, is pre-installed with high levels of security and a greater capacity to process security functions, according to the manufacturer. The new levels of security are created by embedding IBM’s newly developed cryptography features into the z13s’s hardware. By running cryptography functions in silicon, the mainframe can run its encryption and decryption processes twice as fast as previous generations of the machine, boosting the speed of information exchange across the cloud, it claimed.

The new mainframe creates the most secure server environment in the world, according to an independent report from researcher Strategy Analytics (2015 Global Server Hardware and Server OS Reliability Survey) quoted by IBM.

The difficulty of encrypting sensitive data across company IT departments, regional offices and the public cloud has become a barrier to adoption of this more efficient model of computing, according to IBM’s senior VP of Systems Tom Rosamilia. In response, the new z13s model has extra tamper-resistant, hardware-accelerated cryptographic coprocessor cards. These have faster processors and more memory, encrypting at twice the speed of previous mid-range systems, which means hybrid clouds can now handle high-volume, cryptographically protected transactions without delay.

The new model uses the Cyber Security Analytics that come as standard within the z Systems range of mainframes, with the addition of IBM Security QRadar security software, which correlates security intelligence from 500 sources to help it spot anomalies and potential threats. This can be used along with the multi-factor authentication built into the z/OS operating system for the mainframe range.

The system also uses IBM’s Security Identity Governance and Intelligence to create policy to govern and audit access, in order to cut internal data loss. Access to application programming interfaces (APIs) and microservices, configurable by IBM integration partners, can be used to shut down any further hybrid computing vulnerabilities according to IBM, which announced the addition of BlackRidge Technology, Forcepoint and RSM Partners to its Ready for IBM Security Intelligence partner programme.

Exponential Docker usage shows container popularity

Adoption of Docker’s containerisation technology has entered a period of explosive growth with its usage numbers nearly doubling in the last three months, according to its latest figures.

An announcement on the company blog reports that Docker has now recorded two billion ‘pulls’ of images. In November 2015 the figure stood at 1.2 billion pulls, and the Docker Hub from which these images are pulled was only launched in March 2013.

Docker’s invention, a self-contained file system that encapsulates all the elements of a server in microcosm (code, runtime, system tools and system libraries), has whetted the appetite of developers in the age of the cloud.

In January 2016, Docker users pulled images nearly 7,000 times per minute, four times the run rate a year earlier. In that one month, Docker handled the equivalent of 15% of its total transactions from the previous three years.
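Those figures are internally consistent, as a back-of-the-envelope check shows. The constants below come from the article itself; the script is purely illustrative arithmetic:

```javascript
// Sanity-check of Docker's January 2016 run rate, using the article's
// figures: ~7,000 pulls per minute and ~2 billion lifetime pulls.
const pullsPerMinute = 7000;
const minutesInJanuary = 31 * 24 * 60;                   // 44,640 minutes
const januaryPulls = pullsPerMinute * minutesInJanuary;  // 312,480,000 pulls
const lifetimePulls = 2e9;
const januaryShare = januaryPulls / lifetimePulls;       // roughly 0.156

console.log(`January 2016: ~${Math.round(januaryPulls / 1e6)}M pulls, ` +
            `${(januaryShare * 100).toFixed(1)}% of the two-billion total`);
```

At that rate, a single month really does account for roughly 15% of all pulls since launch, which matches Docker’s claim.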

The number of ‘pulls’ is significant because each of these transactions indicates that a Docker engine is downloading an image to create containers from it. Development teams use Docker Hub to publish and use containerised software, and to automate its delivery. The two billion pulls now recorded indicate the technology’s popularity, and the near-doubling of that figure in the last three months shows how quickly this variation on virtualisation is being adopted.

There are currently over 400,000 registered users on Docker Hub. “Our users span from the largest corporations, to newly-launched startups, to the individual Docker enthusiast and their number is increasing every day,” wrote Docker spokesman and blog author Mario Ponticello.

Around a fifth of Docker’s two billion pulls come from its 93 ‘Official Repos’, a curated set of images from Docker’s partners including NGINX, Oracle, Node.js and CloudBees. Docker’s security-monitoring service Nautilus maintains the integrity of the Official Repos over time.

“As our ecosystem grows, we’ll be adding single-click deployment and security scanning to the Docker platform,” said Ponticello.

A RightScale study in January 2016 found that 17% of enterprises now have more than 1,000 virtual machines in the public cloud (up 4% in a year), while private clouds are showing an even stronger appetite for virtualisation techniques, with 31% of enterprises running more than 1,000 VMs, up from 22% in 2015.

Google adds to its Cloud Platform as vendors compete with AWS Lambda

Google has added to its public cloud infrastructure for developers, Cloud Platform, with a new service that allows app writers to set up functions that can be triggered in response to events. The new Google Cloud Functions has drawn comparison with the Lambda offering from Amazon Web Services (AWS).

The service was not announced publicly; news filtered out after documentation offering advice to developers began to appear on Google’s web site. According to the briefing notes, Google Cloud Functions is a ‘lightweight, event-based, asynchronous’ computing system that can be used to create small, single-purpose functions that respond to cloud events without the need to manage servers or a runtime environment. Access to the service is available to anyone who fills out a form on the web site.

Google’s answer to AWS Lambda is the latest attempt to catch up with AWS by filling in the omissions in its own service. In September 2015 BCN reported how Google’s Cloud Platform is being sped up by the addition of four new content delivery networks, with CloudFlare, Fastly, Highwinds Network and Level 3 Communications adding to Google’s network of 70 points of presence in 33 countries as part of a new Google CDN Interconnect programme.

Google has also bolstered its cloud offering with new networking, containerisation and price cuts, BCN reported in November 2015. Google has also recruited VMware cofounder Diane Greene to lead all of its cloud businesses, as reported last year.

Google Cloud Functions run as Node.js modules and can be written in JavaScript. A function could be set up to react to, say, events in a user’s Google Cloud Storage, such as the upload of an unwanted type of picture file or title. The service also works with webhooks, which can speed up programming and code maintenance.

The prices for Cloud Functions were not listed, as the service is still in Alpha mode.

Meanwhile a new startup, Iron.io, has raised $11.5 million in venture capital to develop its own answer to Lambda and Cloud Functions. Microsoft is also rumoured to be developing its own version of Cloud Functions for Azure, according to a report in Forbes.

How to Distribute a Company Intranet to Employees

We’ve just launched the new version of Parallels Remote Application Server v15, with amazing new features. To ensure a smooth transition when upgrading to this version, please review the upgrade procedure (based on best practices) in this KB article. This article describes how to use one of Parallels Remote Application Server’s new features to publish an […]

The post How to Distribute a Company Intranet to Employees appeared first on Parallels Blog.

The easiest way to explain the cloud to your boss

Today, approximately 90 per cent of businesses are using at least one cloud application. Yet, only 32 per cent of these companies are running more than a fifth of their applications in the cloud. The obvious conclusion is that many company executives haven’t quite grasped what the cloud can do for them, which is why it is time for IT organisations to take an active role in explaining the cloud to the business.

One of the predominant issues preventing enterprises from realising the benefits of the cloud is their limited understanding of the technology. In simple terms, cloud computing can be defined as a computing environment consisting of pooled IT resources that can be consumed on demand. The ultimate benefit of the approach is that applications can be accessed from any device with an Internet connection.

More commonly, however, executives are interested in hearing the business case for implementing the cloud. Let’s walk through some of the most compelling pro-cloud arguments, with comments from industry experts.

The money argument

“But can we afford it?”

Luckily for you, the numbers are on your side.

As David Goulden, CEO of EMC Infrastructure, explains in a recent interview: “An immediate driver of many implementations is cost reduction. Both McKinsey and EMC analyses have found that enterprises moving to hybrid cloud can reduce their IT operating expense by 24%. That’s a significant number, and in essence can fund the people and process changes that yield the other benefits of hybrid cloud.”

But where do those cost reductions come from? Goulden explains that while lower hardware, software, facilities and telecom costs account for some of the savings, by far the most substantial reductions can be made in OPEX budgets: “The automation of hybrid cloud dramatically reduces the amount of labour needed to deploy new application software, and to monitor, operate, and make adjustments to the infrastructure. Tasks that used to take days are performed in minutes or seconds.”

The agility issue

“But how will it increase our agility?”

When it comes to cloud computing, agility is commonly used to describe the rapid provisioning of computer resources. However, as HyperStratus’ CEO Bernard Golden suggests, the term can be used to refer to two entirely different advantages: IT resource availability and responsiveness to changes in the business.

Furthermore, he argues that although internal IT availability is necessary for success, the ultimate aim of cloud computing efforts should be speeding business innovation to the market: “the ability to surround a physical product or service with supporting applications offers more value to customers and provides competitive advantage to the vendor. And knowing how to take advantage of cloud computing to speed delivery of complementary applications into the marketplace is crucial to win in the future.”

The security concern

“But will our information be safe?”

Short answer: that’s completely up to your cloud. The beauty of a well-designed hybrid cloud is that it allows enterprises to allocate their applications and data between different cloud solutions in a way that brings out the benefits of all and the drawbacks of none.

However, as Tech Republic’s Enterprise Editor Conner Forrest explains in a recent article: “One of the raging debates when it comes to cloud security is the level of security offered by private and public clouds. While a private cloud strategy may initially offer more control over your data and easier compliance to HIPAA standards and PCI, it is not inherently more or less secure. True security has more to do with your overall cloud strategy and how you are using the technology.” Thus, a haphazard mix of public and private doesn’t automatically make a hybrid cloud.

The customer angle

“But how will it benefit our customers?”

More recently, the C-suite has woken up to the reality that cloud applications can help them attract and retain customers. A good example of this comes from the University of North Texas, whose CFO Rama Dhuwaraha explains: “The typical student on campus today has about six different devices that need Internet access for parking services we offer, dining, classroom registration and paying bills online. During enrolment, most of them don’t want to go find a lab and then enrol – they want it at their fingertips. We have to extend those services to them.”

Overall, the value proposition of a customised cloud solution should be pretty clear. However, as Goulden emphasises: “Most companies simply don’t realise how quickly they can implement a hybrid cloud, or how much money and capability they’re leaving on the table until they have one”. Therefore, as IT professionals, it is our responsibility to take this message forward to the business and develop cloud strategies that serve the interest of the enterprise.

 

Written by Rob Bradburn, Senior Web Operations Manager, Digital Insights & Demand, EMC – EMEA Marketing

IBM launches 26 new cloud services for data scientists

IBM is launching 26 new services on its IBM Cloud, which it describes as a ‘sweeping portfolio for data scientists and app developers’. The new offering includes 150 publicly available datasets.

The new initiative aims to help developers build and manage applications, and to help data scientists read events in the cloud more intuitively. The hybrid cloud service spans multiple cloud providers and uses open systems which, IBM says, will create a ready flow of data across different services.

The new cloud offerings will create a self-service for data preparation, migration and integration, IBM claims, with users being provided with tools for advanced data exploration and modelling. The four main pillars of the new service offering come under the headings of Compose Enterprise, Graph, Predictive Analytics and Analytics Exchange.

The IBM Compose Enterprise is a managed platform that aims to help developers build web-scale apps faster by giving them access to resources such as open source databases on their own dedicated cloud servers. Graph is a managed graph database service built on Apache TinkerPop, with a stack of business-ready apps for real-time recommendations, fraud detection, IoT and network analysis use cases. Predictive Analytics promises developers easy self-built machine learning models drawn from a library of predictive apps generally used by data scientists. Analytics Exchange contains the catalogue of 150 publicly available datasets.

Apache TinkerPop and the Gremlin graph traversal language will be the primary interface to IBM’s Graph service. IBM has previously pushed for TinkerPop to join the Apache Software Foundation. In September, BCN reported that IBM is to open a San Francisco facility with resources dedicated to IBM’s new Spark processing technology, as the vendor seeks to get Spark users interested in IBM’s Watson developer cloud.

Data handlers are currently handicapped by having to use disparate systems for data needs, IBM claims. “Our goal is to move data into a one-stop shop,” said Derek Schoettle, General Manager, Analytics Platform and Cloud Data Services.

Pulsant launches cloud comparison service

UK-based data centre and hosting company Pulsant claims it has created a system to help cloud buyers make fewer bad choices over online computing services.

The system aims to cut the confusion for C-level executives who currently struggle to make sense of the array of options, services and misleadingly named products. By automating the process of evaluation, the new Cloud Intelligence decision engine will take the end user closer to the conditions of ‘perfect information’ needed before the full benefits of the market are possible, it claims.

Cloud Intelligence is an interactive tool designed to concentrate the buyer’s mind when comparing different options. One of the user problems it aims to address is matching the offering to the needs of the buyer. In a relatively new market like cloud computing, companies ideally don’t want to trust their research to a conversation with a software salesman, according to Pulsant, which claims due diligence over cloud service choices has become a ‘definite challenge’.

The tool aims to guide decision makers by identifying their starting point. It separates technically savvy buyers, who may be familiar with the cloud landscape, from the non-technical procurement personnel who are increasingly embarking on such research, and guides each type down a customised path to meet their needs. Non-technical users interact with the decision engine in layman’s terms that concentrate on the benefits of the services on offer, while technical decision makers are given more detailed information about configurations and platforms.

Pulsant said it aims to achieve for the B2B market what comparison sites achieve in B2C markets. The new service is necessary because cloud computing is changing software buying behaviours in the corporate world, according to Adam Eaton, sales director at Pulsant.

“Customers are doing a great deal of research before making IT decisions but there’s a lot of clutter to get through. We want to give people the information they’re looking for on the solutions and services they’re interested in, while still interactively engaging with them,” said Eaton.

According to Pulsant, 80% of visitors who start their decision-making journey with the tool pursue it to the end.