Citrix wants users to log in using facial recognition


Keumars Afifi-Sabet

9 May, 2018

Citrix products are set to support multi-factor authentication and facial recognition, instead of just relying on the age-old username and password combination.

Speaking at Citrix Synergy 2018 in California, senior company executives explained the company will support alternative ways of confirming users’ identities in a bid to boost security and improve the user experience.

Addressing reporters at a press Q&A following the keynote address, alongside CEO David Henshall, chief product officer PJ Hough said Citrix was adding “full support for multi-factor authentication in the platform and the Workspace”.

“We are broadly deployed in a number of industries including healthcare, where log-in techniques such as badge swipe, etc, are already dominant as the log-in mechanisms, so the workspace will support all of those as capabilities,” he said.

But he conceded that widespread implementation of such alternatives is rather “future-oriented”, since many devices lack the necessary hardware – even though, he noted, everything announced in the keynote address was either available now or would be within the next 90 days.

“One of the reasons why we can’t actually broadly deliver facial recognition technology is not because we haven’t implemented the software part of it, it’s that the devices aren’t broadly deployed that are necessary to support it; whether it’s encryption capabilities or high-quality facial recognition that doesn’t get fooled by holding up a picture of me in front of it.”

While the iPhone X does boast this capability, its status as an expensive, high-end smartphone means it has likely not found much adoption within businesses.

He added: “We continue to evolve those technologies, but we are part of a broader ecosystem and we need the ecosystem partners here to continue to invest. I think part of the opportunity for us is we want to inspire the ecosystem.”

CTO Christian Reilly demonstrated a login using facial recognition during his keynote address, in which he said “usernames and passwords are not great security”. 

Expanding on the vulnerability of passwords, Sridhar Mullapudi, VP of product management for Workspace services, told IT Pro that the sheer ubiquity of passwords makes security incidents more likely, and that they were “an old way of doing things”. Plenty of recent data breaches have involved credential re-use, where, for example, a hack of LinkedIn password data has forced other companies to reset their own customers’ login details.

“As part of the Citrix Workspace we are building, identity and access management is a key part,” he said. “As part of that we have solutions; we built multi-factor authentication, and that could be facial recognition or thumbprint, or any other factor that you use to log-in, like two-factor authentication (2FA) – that’s built into the Workspace itself.”

He added: “If users have to remember passwords, and create passwords across multiple applications, it is not the most secure way – because users are not the best in … making passwords across everything, so you want to be able to provide them with a secure single sign-on across the applications.”

Citrix’s decision to strengthen security comes as part of a wider industry movement away from passwords. Microsoft ditched conventional passwords altogether in a test run for Windows 10 S earlier this year, opting instead for alternative options such as facial recognition, fingerprint, and FIDO keys.


Citrix puts Synergy emphasis on user experience and security


Keumars Afifi-Sabet

9 May, 2018

Citrix CEO David Henshall wants his company to focus on three key areas: unifying the portfolio, accelerating to the cloud, and expanding into new technologies.

In his opening keynote address at Synergy 2018 in California, Henshall outlined his future plans for the virtualisation firm, centred around the theme of “people-centric computing” where technology delivers everything users need in a simple and accessible way.

Key to Henshall’s vision is the ‘universal workspace’ – embodied in the Workspace app, Citrix’s latest innovation – which aims to reduce complexity and raise productivity by enabling universal access to apps.

Also integral to delivering its wider vision is the rollout of Citrix Analytics, a security platform initially announced last year, with chief product officer PJ Hough walking the audience through its potential.

“This is not another dashboard, this is not another alerting system; this is an autonomous closed-loop security platform that will deliver more productivity and security to your organisations,” Hough claimed.

Using a mass of data points, Citrix Analytics builds profiles for individual users, allowing an autonomous machine learning-powered system to analyse potential security risks in real-time, without causing any disruption to day-to-day workflows or productivity.

“We’ve essentially distributed risk out of the enterprise; we’re pushing it to the user, we’re pushing it to the device, we’re pushing it to the network and beyond,” Henshall said, adding that traditional cybersecurity defences are struggling to keep pace with the range of modern threats.

“[Old tactics are] too cumbersome. They’re too expensive. Cyber threats as we know them are only going to get more complex as we move forward, and only going to get more sophisticated, so that’s why we’re very much focused on a security model that we think is future-proof. The challenge, of course, is balancing these competing needs – the need for security with the potential impact on productivity,” he continued.

Announcing the “broad availability” of the Citrix Analytics platform, Hough said everything announced during the keynote is available either now, or will be within 90 days.

Speaking at a press Q&A following the keynote address, Hough went into more detail around building a better user experience for Citrix customers.

“Having spent a lot of time working on productivity software before I joined Citrix, I understand the value of reducing clicks, of reducing confusion for users, and really having people have a consistent and seamless experience across all their devices and platforms,” said Hough.

The keynote address also saw the firm make a slew of additional announcements tying into the idea of taking a holistic approach toward transforming the digital workspace.

These included an SD-WAN service for MSPs, an Intelligent Traffic Management tool based on its recent Cedexis acquisition and – although the address was generally light on the Internet of Things (IoT) – Alexa for Business.

Poachers targeted using innovative tech


Clare Hopping

9 May, 2018

Dimension Data and Cisco have teamed up to expand their joint Connected Conservation project – a scheme designed to help protect elephants and rhinos from poachers – into Zambia, Kenya and Mozambique.

Rather than attaching trackers or sensors to the animals themselves, the solution tracks human activity in the countries’ game reserves: thermal cameras mounted on radio masts transmit data back to operatives, CCTV analytics monitor fishermen and boats on the lake, and outdoor Wi-Fi allows data to be shared in real time.

“Many organisations have committed to protecting animals through various reactive initiatives, such as dehorning, or inserting sensors in the horn and under the subcutaneous layer of skin. However, the problem with reactive initiatives is that by the time the reserve rangers reach the animal, it has been killed and the rhino horn or elephant tusks have been hacked off,” said Bruce Watson, Dimension Data Group Executive.

“With the Connected Conservation model, the technology is designed to proactively protect the land against humans. The animals are not touched, and are left to roam freely while a ‘layered’ effect of sophisticated technology, people and gadgets protect them.”

A control room is also being built for Zambia’s special marine unit with the sole responsibility of monitoring the data being fed back to it from the various source points. The Zambian local authorities will also work with local fishermen to hand out fishing permits, making it a more regulated industry than it currently is.

“More than ever before, technology has given us the ability to change the world – not tomorrow, not someday, but now,” added Karen Walker, Cisco senior vice president and chief marketing officer.

“We’re dedicated to making a difference by connecting the world and protecting the oldest and most vulnerable animals with some of the newest connectivity technology.”

Red Hat teams up with IBM, Microsoft to streamline hybrid cloud app development


Dale Walker

9 May, 2018

Open source giant Red Hat has announced a series of industry partnerships that aim to make it easier for companies to develop container-based applications.

The first of these is a strategic deal with IBM that will see the companies combine their portfolios to offer new hybrid cloud services to their customers.

The agreement, announced at Red Hat’s annual Summit this week, means it’s now possible for customers of both companies to build and deploy applications using IBM’s Cloud service supported by Red Hat’s OpenShift Container Platform. IBM’s WebSphere, DB2 and MQ software products will now be repackaged as certified containers on OpenShift.

It’s yet another deal struck in an ongoing partnership between the companies, after a recent commitment by IBM to re-engineer its portfolio of software products to run in the increasingly popular container deployment model.

A similar deal announced at the Summit will also see the creation of the industry’s first jointly managed container platform using Red Hat’s OpenShift software on Microsoft’s Azure environment, which includes access to Azure SQL DB and Azure Machine Learning. This builds upon a previous commitment signed in 2015 to bring more Red Hat products to Microsoft’s Azure platforms.

The aim of both agreements is to provide businesses with greater mobility when it comes to application deployment. An alternative to virtual machines, containers provide a means of bundling an application with all its software dependencies into a single package, bypassing the problem of incompatible environments when moving applications to different stages of testing or deployment.
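As a rough illustration of the bundling described above (the application, file names, and base image here are hypothetical, not taken from either company's products), a typical container definition packages the code and its dependencies into one image that runs the same way at every stage:

```dockerfile
# Hypothetical sketch: bundle a small Python service and all of its
# dependencies into a single image. The resulting container behaves
# identically on a developer laptop, in a test cluster, or in
# production, sidestepping "works on my machine" environment drift.
FROM python:3.11-slim
WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and declare how to start it
COPY . .
CMD ["python", "app.py"]
```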

IBM’s shift to containerisation will see its Cloud Private and Cloud Private for Data platforms, as well as a number of middleware products, become Red Hat-certified containers.

A joint consultancy unit will be set up linking both IBM Garage and Red Hat Consulting, which will support those customers either wishing to test out the combined service or looking to move their existing application investments to a hybrid model.

Arvind Krishna, senior vice president of IBM Hybrid Cloud, said that the move would provide “more choice and flexibility” to customers looking to move towards containerised applications.

“Our common vision for hybrid cloud using container architectures allows millions of enterprises – from banks, to airlines, to government organizations – to access leading technology from both companies without having to choose between public and private cloud,” Krishna said.

By combining services, Red Hat customers will now be able to exploit well established cloud-based artificial intelligence, IoT, and blockchain tools provided by IBM.

As for the Microsoft partnership, Red Hat claims customers will be provided with a consistent experience throughout the development lifecycle of an application, including support for OpenShift on Microsoft’s on-premise platform Azure Stack, through to deployment in a hybrid cloud.

Visual Studio subscribers will also get Red Hat Linux credits for the first time, allowing developers to work from a single platform regardless of the open source framework they choose.

Scott Guthrie, executive vice president of Microsoft’s cloud and enterprise group, said: “Today, we’re combining both companies’ leadership in Kubernetes, hybrid cloud and enterprise operating systems to simplify the complex process of container management, with an industry-first solution on Azure.”

In a separate but related announcement this week, IBM also said its PowerAI platform, a suite of deep learning frameworks, will also be available through Red Hat Enterprise Linux.

Those organisations with eligible subscriptions can access their Red Hat OpenShift Container accounts on IBM’s Cloud platform using the Red Hat Cloud Access tool.

The joint Azure and OpenShift service is currently in a preview state, and will eventually be rolled out on a region by region basis.


Are resellers ready for the race to multi-cloud?

If anyone was in any doubt that multi-cloud was heading for the mainstream, recent research by 451 will have quashed it. Of the 800 businesses across the globe that responded to the latest Voice of the Enterprise: Cloud, Hosting and Managed Services, Budgets and Outlook survey, 69% said they planned to adopt a multi-cloud strategy by 2019. What’s more, the cloud computing-as-a-service market is expected to double to $53.3 billion by 2021, according to 451’s Market Monitor. So it’s no surprise that the number of vendors and technologies in the space is growing rapidly.

The good news for businesses is that this presents the chance to mix and match cloud services to maximise effectiveness, efficiency and costs. However, this may seem like a new and complex marketplace, which means it can be a tough challenge navigating your way through it. And therein lies a major opportunity for resellers: to help companies make sense of multi-cloud and secure the best solution for their needs. The question is whether resellers themselves are up for the challenge.

As a reseller, it’s vital to offer the best possible service to the end user. From a multi-cloud perspective, that means delivering vendor diversity to help companies reduce risk, save costs and maximise performance.

The right mix of suppliers, for example, can ensure that critical systems are running 100% of the time and reduce the risk of a business being hit by either a breakdown in a vendor relationship or a company going under.

Furthermore, while the hyperscalers do have similar base service offerings, they have also developed unique capabilities and services that solve very specific problems. And despite these vendors taking a big piece of the multi-cloud pie, a growing number of rising stars are now adding to the mix. This makes it critical for end users to understand the strengths and weaknesses of each player and the most appropriate applications for each.

Broadening the spread of vendors also makes good commercial sense as it will help to increase competition in the marketplace, driving down prices while pushing up quality and choice.

Resellers can help businesses to be vendor diverse in several key ways:

Providing key knowledge and being authoritative

This means staying up to date with accreditations, plus taking advantage of training courses and education sessions.

Being neutral

Give clients what they need, not what you want to sell them. This demands being vendor agnostic, understanding the client requirements, working with relevant partners and being open about the pros and cons.

Understanding the market

Dedicate time and effort to getting to know the commercial landscape thoroughly, so you can deliver the optimum vendor mix.

Offering a single point of contact

Simplify and personalise the client relationship by designating a single member of the team to oversee everything from billing to vendor management, who can also translate any industry jargon into plain English.

Simplify the process

Work with the right partners and utilise best-of-breed tools to make the discovery, planning, sourcing, and execution involved with implementing cloud strategies simple.

Multi-cloud presents resellers with a clear opportunity to differentiate themselves in a crowded channel marketplace if they are prepared to adjust their approach where necessary, as outlined above, and build key partner relationships. Legacy resellers continue to struggle, partly because of the pace of change, but also from continued demand from management and investors to chase perpetual licensing and infrastructure business. But to make the most of the multi-cloud future, it’s critical for resellers to be transparent and neutral and not bow to the pressure to optimise margins and push certain suppliers or partners.

It will also be important to manage partnerships carefully and effectively with multiple cloud service providers, which will be a big challenge. This is where a partner with existing cross-vendor relationships and products that run across multiple cloud offerings can prove invaluable. 

Finally, resellers should strongly consider specialisation by searching out a key multi-cloud niche vertical market to make their own. Rather than simply claiming to be a “cloud specialist” like the majority of other resellers, make it easy for the end client to differentiate you from the crowd that offer “vanilla” services which are simply resold or provided wholesale. If you have a vertical focus, look at building out a proposition specific to that market. This also helps many vendors, such as AWS, which has a strong focus on key verticals, particularly Life Sciences and more recently Finance.

Once you’ve identified your niche, compile strong relevant case studies to show your expertise and put together a marketing strategy, incorporating your vendors and partners where you can to strengthen your offer and add value to your relationship.

Making the most of multi-cloud will not be without its challenges to resellers, but if you can meet the growing demands of business for vendor diversity and find the right niche, the long-term rewards will be well worth the effort.

Read more: How resellers can make a difference in enabling organisations' cloud transformations

It’s time to build a multi-cloud strategy to make the best of falling public cloud prices

This year will see a marked increase in competition for public cloud dollars – and not just from incumbents like Amazon and Google. As demand for the cloud grows, big companies from China are making significant moves to expand their global reach – with Alibaba in particular moving aggressively into Europe and the United States.

The public cloud spend of a typical Fortune 500 company can quickly escalate to eight or nine figures a year. Snap alone is paying hundreds of millions of dollars a year for Google Cloud.


Winning one of these big customers means big money, and the competition for marquee contracts will only heat up this year. With so much competition, there are going to be opportunities for huge savings – for companies of all sizes.

While the biggest winners will be those enterprises that have invested heavily in the cloud, IT organizations need to make sure that they implement the right cloud infrastructure and technologies to make the most of falling costs.

One key solution is multi-cloud.

Vendor lock-in will be cloud issue #1 for the enterprise

No decision maker worth their salt is going to want to embrace a single cloud platform to the exclusion of all others. While some organizations might lock in a really good contract, it won’t be with any of the big 3 unless they cut prices dramatically. But these forced price reductions will only further the cycle of cost cutting and drive enterprises to keep their options open.

As a result, the level of anxiety for decision makers at big organizations around vendor lock-in will continue to rise – already vendor lock-in has replaced security as the #1 cloud concern.

While most organizations will remain on a single cloud this year, they will be actively seeking out options to avoid being trapped on that cloud. More and more enterprises will be moving away from expensive and limiting proprietary cloud storage technologies developed by Amazon, Google and Microsoft and embracing open source software solutions. As a result, it will be the beginning of a bad set of years for the highest price, stickiest services that are being offered by cloud service providers.

What’s at risk for enterprises considering proprietary solutions? Take Snap, a company that uses App Engine, a Google platform. App Engine is 10x more expensive for Snap than other solutions, but the company is stuck on the platform because moving away would mean rebuilding. That would require a huge investment in engineering resources and risk mass instability of its platform – which could drive away users. You’ll see fewer and fewer enterprises falling into that trap in the future, which is why we are seeing so much interest in multi-cloud strategies. Conversely, it will be a good year for tools that make it easier for enterprises to avoid vendor lock-in – like Kubernetes and Docker.

The best path forward to multi-cloud

Most large enterprises have instituted mandates for a multi-cloud strategy. SMBs would do well to plan for one, even if it will not be a practical reality in the near term. There is a non-trivial cost to building for multi-cloud deployments, because they require a layer of abstraction between a company's IT footprint and the underlying cloud vendor's APIs. However, not building from the start for multi-cloud makes the eventual transition increasingly difficult, as each additional vendor-specific hook is utilized directly by deployed services. The good news is that the necessary layer of abstraction is rapidly evolving via open source and commercially-supported offerings, including Kubernetes, Docker, DC/OS, and Cloud Foundry.
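To make the abstraction concrete: a workload described against the Kubernetes API is portable across any conformant cluster, whichever cloud hosts it. The manifest below is a hedged, minimal sketch (the service name, labels, image path, and port are all illustrative), showing that nothing in it references a specific cloud vendor:

```yaml
# Hypothetical sketch: this Deployment targets the Kubernetes API
# (apps/v1), not any one provider's API, so the same manifest can be
# applied unchanged on AKS, GKE, EKS, or an on-premise cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: billing-service        # illustrative name
spec:
  replicas: 3                  # run three identical pods
  selector:
    matchLabels:
      app: billing
  template:
    metadata:
      labels:
        app: billing
    spec:
      containers:
      - name: billing
        image: registry.example.com/billing:1.4   # any OCI registry
        ports:
        - containerPort: 8080
```

The vendor-specific details (node types, load balancers, storage classes) are bound at the cluster level, which is precisely the layer of abstraction the strategy above depends on.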


Implementing a multi-cloud strategy requires first and foremost that IT leaders select cloud-neutral technologies. Proprietary services and software that are part of a public cloud's PaaS offering create significant lock-in. Credible, compatible alternatives often don't exist, and worse, migration paths are incomplete and poorly supported; it's not in a cloud vendor's interests to provide an easy off ramp. For two arresting examples, look no further than Dropbox's struggles to replace their usage of AWS S3 or the aforementioned Snap and their ongoing battle with Google App Engine's explosive cost structure.

Cloud price wars will heat up this year. Putting all of an enterprise's eggs into a single cloud vendor's basket invites risks resulting from the vendor's potential systemic security and/or operational shortcomings. A multi-cloud strategy will give enterprises the flexibility to migrate between cloud vendors – and take advantage of falling prices.

Microsoft wants to make Azure your AI destination


Rene Millman

8 May, 2018

Microsoft yesterday set out to open up its cloud platform to developers hoping to build AI applications, revealing a host of machine learning initiatives at its annual Build conference.

Redmond wants its Azure cloud to be the backbone of developers’ AI innovations, with CEO Satya Nadella saying: “The era of the intelligent cloud and intelligent edge is upon us. These advancements create incredible developer opportunity and also come with a responsibility to ensure the technology we build is trusted and benefits all.”

Project Kinect for Azure is a package of sensors, including Microsoft’s next-generation depth camera, with onboard compute designed to allow local devices to benefit from AI capabilities.

Meanwhile, Microsoft’s Speech Devices SDK aims to enable developers to build a variety of voice-enabled scenarios like drive-through ordering systems, in-car or in-home assistants, smart speakers, and other digital assistants.

The tech giant also previewed its Project Brainwave, an architecture for deep neural net processing that Nadella said “will make Azure the fastest cloud for AI”. This is now available on Azure and for edge computing, and is fully integrated with Azure Machine Learning with support for Intel FPGA hardware and ResNet50-based neural networks.

The firm is also open sourcing the Azure IoT Edge Runtime, allowing customers to modify, debug and have more transparency and control over edge applications.

Azure IoT Edge also runs Custom Vision technology, a new service that Microsoft says enables devices such as drones and industrial equipment to work without cloud connectivity. This is the first Azure Cognitive Service to support edge deployment, with more coming to Azure IoT Edge over the next several months, according to the vendor.

“With over 30 cognitive APIs we enable scenarios such as text to speech, speech to text, and speech recognition, and our Cognitive Services are the only AI services that let you custom-train these AI capabilities across all your scenarios,” said Nadella.

Also announced was a new SDK for Windows 10 PCs in partnership with drone company DJI. Using Azure cloud, the SDK brings real-time data transfer capabilities to nearly 700 million Windows 10 devices. As part of the commercial partnership, DJI and Microsoft will co-develop tools leveraging Azure IoT Edge and Microsoft’s AI services to make drones available for agriculture, construction, public safety and other verticals.

Nadella also announced the company’s new AI for Accessibility, a $25 million, five-year programme aimed at using AI to help more than one billion people around the world who have disabilities.

The programme consists of grants, technology investments and expertise, and will also incorporate AI for Accessibility innovations into Microsoft Cloud services. Microsoft said that the initiative is similar to its previous AI for Earth scheme.

Intelligent apps

Microsoft also handed developers the ability to introduce more customisation for Microsoft 365 applications, so organisations can tailor them to their needs.

Developers whose businesses use the Microsoft 365 bundle of Office 365, Windows 10 and Enterprise Mobility and Security can now benefit from wider integrations in collaboration app Microsoft Teams, and even publish custom apps on the Teams app store.

There is also deeper SharePoint integration within Microsoft Teams to enable people to pin a SharePoint page directly into channels to promote deeper collaboration. Developers can use modern script-based frameworks like React within their projects to add more pieces that can be organised within SharePoint pages.

The vendor also revealed new Azure Machine Learning and JavaScript custom functions that let developers and organisations create their own additions to the Excel catalogue of formulas.
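As a hedged sketch of what such a custom function might look like (the function name, namespace, and logic below are invented for illustration; in a real add-in the function carries a @customfunction JSDoc tag and is registered through the add-in's manifest rather than run standalone):

```javascript
/**
 * Hypothetical custom function a user could invoke from a cell as
 * =CONTOSO.WEIGHTEDAVG(A1:A3, B1:B3). The @customfunction tag is how
 * Office's tooling discovers functions to add to the formula catalogue.
 * @customfunction
 * @param {number[]} values  The values to average.
 * @param {number[]} weights The weight applied to each value.
 * @returns {number} The weighted average of the values.
 */
function WEIGHTEDAVG(values, weights) {
  let total = 0;
  let weightSum = 0;
  for (let i = 0; i < values.length; i++) {
    total += values[i] * weights[i];
    weightSum += weights[i];
  }
  return total / weightSum;
}
```

Once registered, the function appears alongside Excel's built-in formulas, which is the "additions to the Excel catalogue" the announcement describes.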

HCI Meets Big Data

The next BriefingsDirect developer productivity insights interview explores how a South African insurance innovator has built a modern hyperconverged infrastructure (HCI) IT environment that replicates databases so fast that developers can test and re-test to their hearts’ content. We’ll now learn how King Price in Pretoria also gained data efficiencies and heightened disaster recovery benefits from their expanding HCI-enabled architecture. Here to help explore the myriad benefits of a data transfer intensive environment is Jacobus Steyn, Operations Manager at King Price in Pretoria, South Africa. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.


Parallels Access 4.0 Released, Adds Support for iPhone X

The latest version of Parallels Access® adds a much-requested feature: support for iPhone® X, including new technologies in iPhone X such as Face ID®. Parallels Access, which gives users a convenient and natural way to control their desktop applications from their tablet or phone, now has full support for the higher resolution screen of the […]


Top 200 DX Sponsors of 2017

DXWorldEXPO LLC announced today the Top 200 Digital Transformation Companies that sponsored, exhibited, and presented at CloudEXPO | DXWorldEXPO 2017. The list was published in alphabetical order. DXWorldEXPO LLC, the producer of the world’s most influential technology conferences and trade shows, has also announced today the conference tracks for CloudEXPO | DXWorldEXPO 2018 New York.

DXWorldEXPO New York 2018, colocated with CloudEXPO New York 2018, will be held November 11-13, 2018, in New York City.
