Aruba’s SD-Branch hooks SD-WAN, wired and wireless networks together


Adam Shepherd

19 Jun, 2018

Aruba has designed a new software-defined networking (SDN) tool to allow multi-site customers to manage their networking in a simpler and more streamlined way.

The HPE-owned company’s new SD-Branch links SD-WAN, wired and wireless networking infrastructure together, routing them all through Aruba’s new Branch Gateways so they can be managed and controlled through the cloud-based Aruba Central management platform.

In addition, the inclusion of Aruba’s ClearPass policy manager means network policy can be created and enforced remotely and automatically, without administrators having to manually provision equipment or conduct on-site maintenance. For Aruba, the aim is to help businesses cut out inefficiency, speed up deployment and reduce networking complexity.

“First and foremost, this software-defined branch solution and architecture significantly increases IT’s ability to respond in real time to the business’s need to be agile,” Aruba’s Lissa Hollinger said at HPE Discover 2018 yesterday, citing the fact that many customers have 10 to 12 IT staff managing up to 3,000 branches.

“You can imagine how complex that is if you don’t have a centralised way to automate deployment and provisioning and monitoring, so this significantly increases IT’s ability to be agile and to focus on more strategic initiatives as opposed to just keeping the lights on,” she added.

Simple, zero-touch provisioning is another key benefit of the service, and vice-president and general manager of Aruba’s cloud and SD-Branch division, Kishore Seshadri, noted that this is a critical feature for many customers.

“If you own a thousand cafes or a thousand restaurants, and you want to deploy these solutions,” he explained, “previously you could do this across two or three years – now we’re asked to be able to do this in two or three months. You have to assume that there is no technical resource on the ground, there is no IT team on the ground, so it’s just expected that you will ship a device to the location, somebody unpacks it, plugs it in; it just has to work.”

As with any networking technology, security is a critical feature of SD-Branch. Aruba has partnered with network security vendors including Zscaler, Palo Alto Networks and Check Point to offer cloud-based firewall protections, in addition to the Branch Gateway’s built-in firewall and deep packet inspection tools.

The new branch gateway units also offer context awareness, allowing for dynamic traffic optimisation to ensure maximum quality of service for bandwidth-hungry business-critical devices and applications. This also feeds into policy-based routing tools that ensure organisations can specify exactly which services they want to prioritise.

SD-Branch is hardware-agnostic, in that customers do not necessarily need to deploy Aruba’s switches or access points in order to make use of it – although the company claimed that customers may be limited by the features offered by third-party vendors.

In order to deploy the new package, customers will need to be subscribed to Aruba Central, with a headend gateway in their datacentre to manage traffic and a branch gateway unit in each physical location. Prices start at $1,495 each for the physical gateway hardware, as well as $450 in subscription fees per gateway per year.

HPE invests $4 billion in edge computing


Adam Shepherd

19 Jun, 2018

HPE is set to spend $4 billion on edge computing over the next four years, underlining the company’s strategic shift away from its traditional datacentre roots.

Speaking at the company’s annual conference, HPE Discover, CEO Antonio Neri yesterday revealed that his company would invest heavily to support the collection, processing and analysis of data outside of datacentre or cloud environments. This investment will be focused on research and development in the pursuit of new products and services in areas including automation, AI, edge computing and security.

“The edge is where we interact with our customers. That’s what the edge is all about,” Neri told attendees. “Actually, the edge is anywhere technology gets put into action. And I believe the edge is the next big opportunity for all of us.

“This next revolution requires what we call an ‘edge-to-cloud’ architecture. A world with millions of clouds distributed everywhere – that’s the future as we see it. And HPE is uniquely positioned to drive this next revolution.”

This move is a direct response to the explosion in data that has occurred over the last few years, Neri said. He explained that a large portion of the data generated at the edge is still lost or wasted because businesses lack the capacity to process it, and that the forthcoming development of smart cities, driverless cars and other tech innovations will only increase the amount of data being generated.

“The reality is that two years from now, we are going to generate twice the amount of data we have generated in the entirety of human history,” Neri said, “and that’s an incredible opportunity. Data that actually has the potential value to drive insights and actions across our world. To change our lives and our businesses.”

One example Neri cited was Tottenham Hotspur FC, which is using an ‘edge-to-cloud solution’ delivered by HPE’s PointNext and Aruba divisions to deliver high-speed networking for fans, combined with personalised interactive experiences and new merchandising opportunities.

HPE isn’t the only company setting significant store by edge computing, though; its main rival, Dell Technologies, is also investing in the area through its subsidiary VMware. The company launched a suite of new IoT packages for edge compute use cases at this year’s Mobile World Congress, powered by Dell’s hyper-converged infrastructure.

“We believe the enterprise of the future will be edge-centric, cloud-enabled and data-driven,” Neri finished. “Those that can act with the speed and agility on a continuous stream of insights and knowledge will win. That’s why our strategy is to accelerate your enterprise from edge to cloud, helping connect all of your edges, all your clouds, everywhere.”

HPE launches hybrid cloud-as-a-service offering


Adam Shepherd

20 Jun, 2018

HPE has launched a new consumption-based hybrid cloud-as-a-service offering, designed to help customers manage costs and reduce complexity within their hybrid IT infrastructure deployments.

Offered under the company’s IT-as-a-service umbrella brand GreenLake, HPE GreenLake Hybrid Cloud is a managed service that allows customers to more efficiently consume cloud services and on-premise infrastructure as part of a long-term monthly cost rather than a large upfront investment.

GreenLake Hybrid Cloud customers can have their cloud infrastructure – both public and private – designed, configured and deployed by HPE, and then maintained, supported and optimised on an ongoing basis. The company is utilising technology and capabilities from its PointNext consulting division, as well as its recent acquisitions, Cloud Technology Partners and RedPixie.

Similar to the company’s GreenLake Flex Capacity consumption model, the service uses metering technology from Cloud Cruiser, the cloud monitoring firm HPE acquired. Customers can closely monitor the costs of their cloud services and set limits on spending, scaling up and down as necessary.

Scott Ramsey, vice president of consumption and managed services for HPE PointNext, said this model can save organisations considerable amounts of money compared to traditional infrastructure procurement models.

“We’ve got strong empirical evidence from our 540 [existing GreenLake] customers that your total cost of ownership is in the region of 25% to 30% lower in this type of model,” he told Cloud Pro. “If you’re a customer or a business, and you’re not interested in something that can save you 25% to 30% total cost of ownership, then I’ve got to question what you’re doing, to be honest.”

The service brings with it benefits in a number of areas, according to HPE. Aside from the obvious cost control benefits that come from a consumption-based model, GreenLake Hybrid Cloud also allows businesses to trim additional operation costs by reducing the need to train IT staff in deploying and maintaining cloud infrastructure, the vendor claimed. In addition, HPE said it reduces the burden these tasks place on IT staff, freeing them up to work on projects that can deliver more practical business value.

“This model with HPE GreenLake Hybrid Cloud allows us to take out that heavy lifting that isn’t really about driving innovation in the business, but is about operating the underlying infrastructure,” explained John Treadway, senior vice president of Cloud Technology Partners. “It’s not the most value-adding thing that an IT organisation should be focused on. It should be focused on solutions to drive revenue growth and to provide analytics in business decision-making. Us taking that on allows the clients to actually be faster.”

HPE has also used the set of internal rules that it developed as part of its acquisition of Cloud Technology Partners – which PointNext SVP Ana Pinczuk said covers some 1,000 regulatory and compliance standards – to build compliance management capabilities into GreenLake Hybrid Cloud, which will supposedly allow customers to automate much of the work that goes into ensuring compliance.

The new service supports public cloud deployments on Microsoft Azure and AWS, and private cloud infrastructure via Microsoft Azure Stack and HPE ProLiant for Azure Stack, all of which is managed by HPE OneSphere, the company’s over-arching management layer.

“For some of our customers, their hybrid strategy is really Azure on and off-premise,” said Ric Lewis, senior vice president and general manager for HPE’s cloud division. “But what some customers don’t realise is Microsoft Azure Stack and Microsoft Azure are pretty separate. They run the same kind of code on the same base, but it’s not like you can move things back and forth, and it’s difficult to manage between the two.”

“With GreenLake Hybrid Cloud and some of the management and analytics that we lift from HPE OneSphere, we can help customers stitch those two fairly separate things together, regardless of the fact that they run the same thing; at least they’ll look like they’re part of the same estate and we can make that seamless for customers.”

Customers using HPE hardware for their private cloud deployments can also take advantage of seamless automatic management and provisioning via the company’s OneView on-prem automation product, which can be controlled directly via an integration with OneSphere.

However, those using alternative hardware vendors to power their infrastructure aren’t left out in the cold; OneSphere is vendor-agnostic, meaning that you can use GreenLake Hybrid Cloud to manage your infrastructure regardless of whether you’re using servers from HPE, Dell EMC, Broadberry, Lenovo or anyone else.

Although customers have a wealth of choice in terms of the infrastructure hardware they want to use to run their private clouds, they are more limited when it comes to which clouds they can actually run.

Out of the box, GreenLake Hybrid Cloud only supports AWS and Azure public cloud deployments, and despite Lewis assuring reporters that HPE is actively working on adding support for Cloud28+ partners, Ramsey told Cloud Pro that in the immediate future, the company won’t be adding support for any additional providers.

“We’ve picked the two giants of the industry to work with,” he said, “and both have a lot of strength in the enterprise arena. Over the next 12 months or so, I would say our focus will be going more vertical with those guys, getting stronger value propositions with AWS and Azure.”

“For now, our roadmap is really focused in upon making sure that the AWS and the Azure experience gets better and richer, so we’ll add more features, more functionality, more capability, more tooling into that as we go forward. That’ll be where our primary focus is going to be.”

He did, however, point out that this only applies to GreenLake Hybrid Cloud when viewed as a turnkey solution – if customers come to HPE with specific requirements that can best be met by a local Cloud28+ provider, this will be taken into consideration.

He also stated that the addition of Google Cloud support as standard was “a distinct possibility”, but said that he had no specific plans to include it. “They’re fundamentally still pretty much in the consumer space from a cloud perspective,” he said, “[but] they’re the obvious next big player that we’d want to think about working with.”

Similarly, while Azure Stack is the only private cloud infrastructure that is officially supported out the box, Ramsey confirmed that HPE is happy to support other providers if a customer has specific needs.

“We have the GreenLake solutions – which I always describe as the ‘turnkey’ solutions, where we give it to you and it’s ready to go out the box – but if a customer says to me ‘I want to run an OpenStack private cloud’, we will solution that for them, and we have done that [for some customers].”

10 signs your clients need to deploy Azure

Microsoft Azure is an ever-expanding set of services designed to help organisations meet their business challenges. It is a cloud platform for building, deploying and managing services and applications, allowing you and your clients to meet a wide range of needs within a single platform.

Their on-premise servers are out-of-date

Operating on out-of-date servers is like driving a rusty old car: first you lose a wing mirror, then a door panel, until you’re left in the street clutching nothing more than the steering wheel.

Azure, in contrast, is like a well-oiled machine. It has strong security capabilities and, according to Microsoft, can cut IT management time by as much as 80 percent. With Azure, your clients can operate efficiently without fear of a breakdown.

They still have on-premise backups

How many clients are still backing up their data to an on-premise server? If your answer is more than zero, it’s time to pitch Azure.

Azure Backup automatically backs up data to the cloud, triple-replicating it and then geo-replicating it across multiple data centres, meaning your clients always have access to a copy of their data, no matter what happens.

They’re worried about ransomware

If your clients are still on the fence about Azure, the recent WannaCry ransomware attack should sway them.

Resilient backup capabilities and a state-of-the-art security centre, which features the likes of Azure Threat Detection, mean your clients can easily monitor, identify and shut down threats, and protect their data with backups. Combined with upgrades to Windows 10 and Enterprise Mobility and Security, you can offer your clients greater peace of mind.

They’ve already adopted Office 365

Some 60 million businesses already use Office 365, and some of them are probably your customers. Yet most are failing to take advantage of the complete Azure package.

As a service provider, it’s your role to educate customers on everything they’re missing out on. Clients with Office 365 should take to Azure with ease, especially when they find out that it’s a fully compliant, secure and affordable software solution, offering far more than an online word processor.

They have centralised applications but flexible working practices

BYOD and flexible working practices are on the rise, and Azure is the key to transparent device management, whether devices are on-premise or off-site.

Azure makes it easier to support mobile and flexible working: anyone with an internet connection can use cloud-based services. Combined with Office 365 and the Enterprise Mobility Suite, your customers can hit peak productivity regardless of their location.

They’re struggling to manage a development environment

The business benefits of Azure extend further than high security capabilities and cloud backups. Azure allows clients to design, develop, test and deliver custom applications easily.

The architecture behind Azure gives your clients everything they need to manage a complete development and test environment straight from the cloud.

They’re suffering compliance issues

As of May this year, all businesses that handle EU residents’ personal data must comply with the General Data Protection Regulation (GDPR).

Microsoft has always sought to help businesses remain compliant, such as creating UK-based data centres, and it built Azure with GDPR in mind. Azure’s services are all in line with new regulations, giving your clients the freedom and confidence to operate.

They’re suffering from lengthy deployment times

Thanks to Azure, businesses can deploy applications easily from any location, cutting deployment time by more than 50 percent, according to Microsoft, and giving your clients complete control over which apps get installed on which devices.

They’re struggling to scale

If your clients are still using legacy systems, they’re probably suffering from scalability issues. Azure uses auto-scaling, meaning that clients can quickly scale to handle load increases, letting Microsoft take care of the infrastructure.
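By way of a hedged illustration (all resource group and resource names below are hypothetical, not drawn from the article), an autoscale profile for an Azure virtual machine scale set can be configured from the Azure CLI, keeping the deployment between two and ten instances and scaling out when CPU load rises:

```shell
# Create an autoscale profile for a (hypothetical) VM scale set,
# with a floor of 2 instances and a ceiling of 10
az monitor autoscale create \
  --resource-group demo-rg \
  --resource demo-vmss \
  --resource-type Microsoft.Compute/virtualMachineScaleSets \
  --name demo-autoscale \
  --min-count 2 --max-count 10 --count 2

# Add a rule: scale out by one instance when average CPU
# over the last 5 minutes exceeds 70%
az monitor autoscale rule create \
  --resource-group demo-rg \
  --autoscale-name demo-autoscale \
  --condition "Percentage CPU > 70 avg 5m" \
  --scale out 1
```

With a setting like this in place, Microsoft manages the scaling events itself, which is the "letting Microsoft take care of the infrastructure" point above.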

They’re paying too much for infrastructure

Legacy on-premise infrastructure is costly. If your clients are still paying for data storage they’re not using, they need to move to Azure.

Azure is a pay-as-you-go service with no upfront costs or termination fees, meaning your customers only pay for what they use, and never any more.

Hosting on a cloud platform like Azure does come with its own costs, and migration can be a complex process. Before deploying Azure, you need to make sure your client can cover these costs and understands the process.


Kubernetes skills demand continues to soar – but are organisations dropping the ball on security?

If you have Kubernetes skills then you will almost certainly be in demand from employers, as a new survey from CyberArk has found that IT jobs with the container orchestration tool in the title have soared year on year. But beware the security risks when getting involved.

According to the company, which has crunched data from IT Jobs Watch, roles involving Kubernetes have broken into the top 250 most popular IT vacancies, having been around the 1000 mark this time last year. The most likely job title for potential applicants is either DevOps engineer (40%) or developer (23%).

Regular readers of this publication will be more than aware of the initiatives taking place within the industry over the past year. The leading cloud providers are getting on board; Amazon Web Services (AWS) and Microsoft both made their managed Kubernetes services generally available this month, while back in March Kubernetes itself ‘graduated’ from its arbiter, the Cloud Native Computing Foundation (CNCF), recognising the technology’s maturity.

Those with product to shift are eating their own dog food, making their own internal processes container-based. IBM, as John Considine, general manager of cloud infrastructure services, told CloudTech earlier this year, and Google, as Diane Greene told Cisco Live attendees last week, are but two examples. Alongside this, customers are putting containers at the forefront of their buying decisions; GoDaddy said as much when it announced it would be going all-in on AWS.

Yet with so many organisations going in at the deep end, there is a danger of getting into trouble when swimming against the tide.

In a report published this week (pdf), security firm Lacework identified 21,169 publicly facing container orchestration platform dashboards, 300 of which were completely open. Whether Weight Watchers’ Kubernetes admin console, which researchers from Kromtech Security found earlier this month to be completely accessible without password protection, was among them, we will of course never know. Another widely publicised story concerned Tesla; back in February, research from RedLock found hackers had been running crypto mining scripts on unsecured Kubernetes instances owned by the electric car firm.

“During our research we learned that there are a lot of different ways to manage your containers, and that they are all incredibly flexible and powerful,” the Lacework report notes. “With each one you essentially have the keys to the castle from deployment, discovery, deletion, and manageability.

“We suggest that if you are a security professional and you don’t know you are running a container orchestration system, you should definitely find out ASAP.”

CyberArk offers a similar message of concern. “There is a very real danger that the rush to achieve IT and business advantages will outpace awareness of the security risks,” said Josh Kirkwood, CyberArk DevOps security lead. “If privileged accounts in Kubernetes are left unmanaged, and attackers get inside the control panel, they could gain control of an organisation’s entire IT infrastructure.

“Many organisations simply task the same DevOps hires – often with no security experience – to protect these new Kubernetes environments, in addition to the numerous other responsibilities they have to deliver,” added Kirkwood. “That’s no longer sufficient, and security teams need to get more closely involved to support the platform.”

According to the Lacework report, if you’re running Kubernetes you need to build a pod security policy, configure your pods to run read-only file systems, and restrict privilege escalation. More general container advice doubles up as good security practice: multi-factor authentication at all times, SSL for all servers, and valid certificates with proper expiration and enforcement policies.
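As a minimal sketch of the per-pod hardening the Lacework report describes (the pod and image names here are illustrative, not taken from the report), a Kubernetes pod spec can enforce a read-only root filesystem and block privilege escalation through its securityContext:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: hardened-app          # illustrative name
spec:
  containers:
  - name: app
    image: nginx:1.25         # illustrative image
    securityContext:
      readOnlyRootFilesystem: true       # pod runs on a read-only file system
      allowPrivilegeEscalation: false    # restricts privilege escalation
      runAsNonRoot: true                 # refuse to start as root
```

A cluster-wide pod security policy can then require these settings for every pod, rather than relying on each team to set them individually.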

So is it time to take a step back? If you have Kubernetes skills then you’re in a good place – but get some security smarts alongside it and you’ll be in an even better one.

Microsoft’s work with ICE sparks backlash among tech community


Joe Curtis

19 Jun, 2018

Microsoft is under pressure from the tech community to terminate a contract it holds with US Immigration and Customs Enforcement (ICE), in light of the department’s family separation policy.

A blog post published in January has resurfaced in the last couple of days, detailing how Microsoft “is proud to support” ICE’s IT innovation aims through its Azure Government platform.

A protest is taking place later today outside Microsoft’s DC office about the issue, and employees are being encouraged to speak out online.

Erica Baker, a senior engineering manager at Patreon, said on Twitter: “Friends who work for @Microsoft, FIGHT THIS. Make the biggest noise imaginable about it.

“Don’t fall for the ‘all companies take government contracts’ spin. Your company has THIS contract and is *proud* of it.”

A spokesperson for Microsoft said it’s not working with ICE or US Customs and Border Protection “on any projects related to separating children from their families”, and doesn’t believe its Azure services are supporting such projects.

“As a company, Microsoft is dismayed by the forcible separation of children from their families at the border,” the spokesperson added.

“We urge the administration to change its policy and Congress to pass legislation ensuring children are no longer separated from their families.”

Its original blog post, written by Tom Keane, general manager at Microsoft, read: “ICE’s decision to accelerate IT modernization using Azure Government will help them innovate faster while reducing the burden of legacy IT.

“The agency is currently implementing transformative technologies for homeland security and public safety, and we’re proud to support this work with our mission-critical cloud.”

This section was reportedly removed from the blog post by an employee in response to the social media backlash, but was later reinstated.


AWS to create 1,000 new tech jobs in Ireland


Bobby Hellard

19 Jun, 2018

Amazon Web Services (AWS) has pledged to create 1,000 technology jobs in Ireland over the next two years after opening a new office in Dublin yesterday.

The new 170,000 square foot building will offer highly skilled roles at AWS for software developers, network developers, data centre and systems engineers, and security and big data specialists.

There will also be tech job openings at the company’s other sites in north county Dublin, Blanchardstown, and Tallaght, as AWS looks to utilise the talent in the country.

“Today, we have more than 2,500 Amazon employees in Ireland supporting customers from Ireland and around the world,” said Mike Beary, AWS Ireland country manager.

“There is an abundance of talent in Ireland which helped us to exceed our talent growth targets ahead of schedule. Ireland is a great place to do business, the country’s creative culture and diverse pool of technical skills make it an ideal location for our rapidly expanding business.”

AWS has a long history with Ireland, having first set up an office in the country in September 2004. Today, more than 1,000 of AWS’s Ireland employees are engaged in data centre operations.

In addition to creating new jobs, AWS has collaborated with the Institute of Technology Tallaght (IT Tallaght) to fund a bursary programme for 20 students to study to become data centre technicians.

“Amazon has a long history of success in Ireland and today’s announcement is a testament to Ireland’s highly-skilled, diverse workforce,” Martin Shanahan, CEO of IDA Ireland, a government body designed to encourage foreign private investment in the country, added.

“Tech talent and investment are fundamental to our country’s continued growth, and companies like Amazon are bringing even more energy, vision, innovation and good jobs to Ireland. We are proud to support these companies, who invest in our talent and the future of our economy, and create new opportunities for the country to succeed.”


