All posts by Keumars Afifi-Sabet

Google links US and Europe clouds with transatlantic subsea cable


Keumars Afifi-Sabet

18 Jul, 2018

Google is about to embark on building a massive subsea cable spanning the Atlantic Ocean – from the French coast to Virginia Beach in the United States.

Named ‘Dunant’ after the Nobel Peace Prize winner Henri Dunant, and claimed to be the first private transatlantic subsea cable, the latest addition to Google’s infrastructure network aims to add high-bandwidth capacity and create highly secure cloud connections between the US and Europe.

Google claims the new connection – which will support the growth of Google Cloud – will also serve its business customers by guaranteeing a degree of connectivity that will help them plan for the future.

Explaining the project in a blog post, Google’s strategic negotiator, Jayne Stowell, said the decision to build the cable privately, as opposed to purchasing capacity from an existing cable provider or building it through a consortium of partners, took several factors into account, including latency, capacity and guaranteed bandwidth for the lifetime of the cable.
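Latency on a route like this is ultimately bounded by the speed of light in glass, which is why route length matters so much. As a back-of-the-envelope sketch – assuming a route length of roughly 6,600km, since Google has not published Dunant’s exact figure – the propagation delay works out as follows:

```python
# Rough propagation delay for a transatlantic fibre route.
SPEED_OF_LIGHT_KM_S = 299_792        # in a vacuum
FIBRE_REFRACTIVE_INDEX = 1.468       # typical single-mode fibre
ROUTE_KM = 6_600                     # assumed route length, not Google's published figure

signal_speed_km_s = SPEED_OF_LIGHT_KM_S / FIBRE_REFRACTIVE_INDEX  # ~204,000 km/s
one_way_ms = ROUTE_KM / signal_speed_km_s * 1_000
print(f"one way: {one_way_ms:.1f} ms, round trip: {2 * one_way_ms:.1f} ms")
# one way: ~32 ms, round trip: ~65 ms
```

Anything a private cable saves in route length or equipment hops feeds directly into that figure.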

Dunant follows Google’s plans to build another massive private cable spanning 10,000km between Los Angeles, California and Chile, dubbed Curie, one of three cables comprising a $30 billion push to expand its cloud network across the Nordics, Asia and the US.

Both Curie and Dunant originated in the success of relatively short pilot cables, dubbed Alpha and Beta as a nod to their software development process.

“Our investments in both private and consortium cables meet the same objectives: helping people and businesses take advantage of all the cloud has to offer,” Stowell said.

“We’ll continue to look for more ways to improve and expand our network, and will share more on this work in the coming months.”

Google’s efforts to build a transatlantic cable follow the completion of a joint project by fellow tech giants Microsoft and Facebook in September last year, named Marea, which connected Spain with the east coast of the US.

That cable stretches approximately 6,600km and weighs 4.65 million kg – or, as Microsoft put it at the time, the equivalent of 34 blue whales.

Picture: Virginia Beach, US/Credit: Shutterstock

Dropbox plans SMR deployment to transform its Magic Pocket infrastructure


Keumars Afifi-Sabet

14 Jun, 2018

Dropbox has announced plans to deploy shingled magnetic recording (SMR) technology on a massive scale in a bid to transform its in-house cloud infrastructure.

The file hosting platform said deploying SMR drives on its custom-built Magic Pocket infrastructure at exabyte scale will increase storage density, reduce its data centre footprint and lead to significant cost savings, without sacrificing performance.

Dropbox says it is the first company to deploy SMR hard drive technology on such a scale. 

“Creating our own storage infrastructure was a huge technological challenge, but it’s already paid dividends for our customers and our business,” said Quentin Clark, Dropbox’s senior vice president of engineering, product, and design.

“As more teams adopt Dropbox, SMR technology will help us scale our infrastructure by providing greater flexibility, efficiency, and cost savings. We’re also excited to make this technology open-source so other companies can benefit from it.”

SMR, a hard drive technology that increases density by layering, or ‘shingling’, tracks on a disk’s platters over one another, will be deployed on a quarter of the Magic Pocket infrastructure by 2019, according to Dropbox, which plans to open source the test software created in the process over the coming months.

Magic Pocket is the name of Dropbox’s custom-built infrastructure project that was rolled out after the file sharing company decided to migrate away from Amazon Web Services (AWS). The company initially built a prototype as a proof of concept in 2013, before managing to serve 90% of its data from in-house infrastructure in October 2015.

In what Dropbox describes as a “significant undertaking”, SMR technology was chosen for its ability to expand disk capacity from 8TB to 14TB while maintaining performance and reliability. Drives were sourced from third parties before the company designed a bespoke hardware ecosystem around them, creating new software along the way to ensure compatibility with the Magic Pocket architecture.
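The headline gain is easy to make concrete. A minimal sketch of the density and cost arithmetic – only the 8TB and 14TB capacities come from Dropbox; the drive prices below are hypothetical placeholders:

```python
# Density and $/GB comparison for conventional (PMR) versus shingled (SMR) drives.
PMR_TB, SMR_TB = 8, 14                # capacities cited by Dropbox
PMR_PRICE, SMR_PRICE = 250.0, 350.0   # hypothetical drive prices, USD

density_gain = SMR_TB / PMR_TB                  # 1.75x capacity per drive slot
pmr_per_gb = PMR_PRICE / (PMR_TB * 1_000)       # $/GB, conventional
smr_per_gb = SMR_PRICE / (SMR_TB * 1_000)       # $/GB, shingled

print(f"density gain: {density_gain:.2f}x")
print(f"$/GB: PMR {pmr_per_gb:.4f} vs SMR {smr_per_gb:.4f}")
# Even with an assumed price premium, the denser drive wins on $/GB.
```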

“SMR HDDs offer greater bit density and better cost structure ($/GB), decreasing the total cost of ownership on denser hardware,” the Magic Pocket and hardware engineering teams explained. “Our goal is to build the highest-density storage servers, and SMR currently provides the highest capacity, ahead of the traditional storage alternative, PMR [perpendicular magnetic recording].

“This new storage design now gives us the ability to work with future iterations of disk technologies. In the very immediate future we plan to focus on density designs and more efficient ways to handle large traffic volumes.

“With the total number of drives pushing the physical limit of this form factor, our designs have to take into consideration potential failures from having that much data on a system, while improving the efficacy of compute on the system.”

Towards the end of the year, the file hosting service says, its infrastructure will span 29 facilities across 12 countries, with Dropbox projecting major cost savings and storage density gains if the SMR deployment is deemed a success.

CEBIT 2018: Huawei launches hybrid cloud offering on Azure Stack


Keumars Afifi-Sabet

12 Jun, 2018

Huawei has launched a hybrid cloud service built for Azure Stack, Microsoft’s offering that brings Azure into customers’ datacentres as a private cloud.

Built on Huawei’s FusionServer V5 servers and CloudEngine switches, the tool will, Huawei said, allow enterprises to pursue digital transformation projects by bringing Azure cloud services to on-premise sites with little or no connectivity, such as aircraft or oil rigs.

Huawei is one of many firms working with Microsoft on services for Azure Stack, but speaking at CEBIT 2018, Microsoft partner director for Azure Stack, Vijay Tewari, described Microsoft’s relationship with Huawei in particular as deep and strong.

“In terms of working with partners, the amount of time that Huawei [took] to launch the product was the shortest time it took as compared to any other partner, so we have a very strong engineering relationship with [president of server product line] Qiu Long and others at Huawei,” he said.

Huawei believes it is pivotal to pair its infrastructure with partners’ applications as it designs technology for use in smart cities, the cloud, and networking.

The Chinese networking giant likened digital transformation to a “symphony” as it promoted partnerships with a range of companies including Microsoft and UK-based Purple Wi-Fi, to the latter of which it is offering networking infrastructure so the Wi-Fi platform can extend the range of analytics tools it offers customers.

Purple Wi-Fi will be able to offer customers more detailed tracking information for consumers, with a view to boosting shopping experiences.

The company also outlined how it plans to use partnerships with local companies to take projects to a global scale, with president of Huawei western Europe, Vincent Pang, explaining how a number of small-scale initiatives in Paris and London have helped the company win business elsewhere in the world.

“We want to build a local road here, we want to work with our local partners, we want to have more innovation to create end-to-end best practice here in Europe – but it’s not only for the local innovations, but how we can use these for the global market, and global vertical transformations,” he said.

Pang explained how a smart water project in Paris paved the way for expansion into Shanghai, while a smart logistics project with London’s DHL helped the company win a business case for a car manufacturer in China.

Huawei’s attempt to position itself as a leading player in the smart city scene came with the launch of the ‘Rhine Cloud’, a smart city and public services cloud platform, expanding on an initial memorandum of understanding signed earlier this year.

The new framework agreement extends the commitment to building a smart city platform in Duisburg, Germany, to serve as a model the company hopes to export to the rest of western Europe.

Huawei’s first smart city digital platform includes five resource coordination capabilities – IoT, big data, a geographic information system (GIS) map, video cloud, and converged communications – which combine to share basic resources with partners and facilitate the development of applications.

Martin Murrack, director of digitisation for Duisburg, outlined some of the benefits citizens should expect from the smart city collaboration with Huawei, including free Wi-Fi access and innovations in education, and unveiled the first Rhine Cloud-based SaaS platform, developed by NavVis, which digitises indoor environments.

Experts urge caution over NHS promises to secure its data in the cloud


Keumars Afifi-Sabet

7 Jun, 2018

Migrating from legacy infrastructure to the cloud is a mammoth task for any organisation, yet it’s particularly daunting for the National Health Service (NHS) – one of the UK’s most critical public services and its largest employer.

NHS Digital, the organisation underpinning the health service’s digital transformation, issued guidance in January outlining how Trusts should approach migration – marking the first time an authority has greenlit the use of public cloud in the health service.

Previously, cloud providers would approach Trusts to offer their services only to get knocked back, with many claiming NHS Digital prohibited the use of public cloud services. NHS Digital would then have to clarify to vendors individually, when subsequently approached with complaints, that there was no explicit ban on the use of cloud.

Because of this, NHS Digital commissioned a working group on the safe use of cloud with the National Cyber Security Centre (NCSC) in 2016, with policy at the time dictating any data stored overseas couldn’t contain sensitive information. The working group was set up to investigate how the policy could be changed to make use of cloud benefits, with a framework eventually agreed with ministers the following year.

“What the guidance is providing is a mechanism for organisations to do their own risk assessment as to whether or not cloud services can be used for their service or requirements,” Michael Flintoff, NHS Digital associate director for platforms and infrastructure, tells Cloud Pro.

“The main thing for me is clarity,” he adds, as Trusts have indicated a desire for guidance around data risk modelling and accessing services, something which traditionally hasn’t been there.

Sam Smith, coordinator at campaign group medConfidential, agrees the guidance is needed in principle: “What it does – and the reason for the guidance – is it means hospitals can no longer say [to vendors] ‘NHS Digital told us not to’,” he explains, adding that NHS Digital had grown weary of repeatedly clearing up the misunderstanding on an individual basis.

But Smith does not believe the guidance will make a difference and, despite the clarity it provides, sees it as NHS Digital having “dumped [responsibility] back on the hospitals”, criticising the organisation for failing to introduce a set of minimum standards for the products available.

NHS ‘can be, and will be attacked’

Cloud transformation has been central to the digital transformation strategy of many organisations, including swathes of the public sector, with the government issuing its own cloud-first guidance last year.

NHS Digital says the benefits extend beyond cost-saving to being able to develop, test and deploy services quicker, without large initial capital expenditure, as well as a better scope for data interoperability.

Flintoff said a cloud-first strategy has led to greater efficiencies in his own “heavily technical, development-orientated” organisation, which is undergoing transformation on a service-by-service basis and adopting pay-as-you-go services, such as SQL-as-a-service, to ensure best value.

But for an organisation as large and fragmented as the NHS, an IT project of this scale poses huge challenges, with the health service keen to put a string of past failed efforts behind it – most notably the care.data scheme, abandoned in 2016.

Security, meanwhile, is an equally pressing concern. NHS Digital assured Trusts they could safely locate health data, including patient records, in the public cloud, but a string of reports have underlined the security risks. For instance, 100GB of secret NSA data was found exposed on a misconfigured Amazon Web Services (AWS) S3 bucket in late 2017, among a host of other high-profile leaks including those at FedEx and Accenture.
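Leaks of this kind generally stem from overly permissive access settings rather than sophisticated attacks. As a minimal, hypothetical sketch – assuming the boto3 library and credentials for the account in question – an audit for publicly readable buckets could look like this:

```python
import boto3

# ACL group URIs that expose a bucket to everyone, or to any AWS account holder
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

s3 = boto3.client("s3")
for bucket in s3.list_buckets()["Buckets"]:
    acl = s3.get_bucket_acl(Bucket=bucket["Name"])
    for grant in acl["Grants"]:
        if grant["Grantee"].get("URI") in PUBLIC_GROUPS:
            print(f"{bucket['Name']}: public grant ({grant['Permission']})")
```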

“The days of blithely assuming that an IT system can be made totally secure are gone,” says Dr Paul Miller, senior analyst at Forrester.

“It is far more realistic to assume that any IT system can be attacked and will be attacked, and the emphasis should, therefore, be on detecting those attacks, defending against the vast majority of those attacks, and mitigating the impact of any attack that does get past the initial set of defences.”

medConfidential’s Smith adds that while the chances of suffering a breach aren’t necessarily higher in the cloud, the consequences of any breaches are likely to be far more severe in nature if they occur on public cloud infrastructure as opposed to within the NHS firewall – particularly if Trusts simply opt for the cheapest option.

In light of these “additional challenges”, the NHS is deploying tools like privileged access management and two-factor authentication to bolster security.
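The two-factor codes involved typically come from a time-based one-time password scheme. A minimal illustrative sketch of the standard TOTP algorithm (RFC 6238) – not NHS Digital’s actual implementation – using only Python’s standard library:

```python
import base64, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Current time-based one-time password for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period          # 30-second time step
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret; prints a six-digit code
```

The server and the user’s device share the secret and derive the same short-lived code, so a stolen password alone is not enough to log in.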

“The fundamentals of what we’ve done with IT for the last 20 years haven’t changed,” says NHS Digital’s Flintoff. “They don’t change because we go to the cloud, we might just have to approach them differently because we do that.”

Third-party access to patient data

For others, meanwhile, one of the biggest concerns centres on privacy, and whether third-party organisations, such as large tech companies, may have access to sensitive patient data once it’s put into the cloud.

“These people are here to make money, if not today, tomorrow, and you need to understand very clearly what their business model is if you are going to be part of that,” said Javier Ruiz Diaz, a director at Open Rights Group. “For the NHS it would be highly irresponsible not to be asking those questions right now.”

He cites arrangements with DeepMind in which hospitals, most notably the Royal Free in London, were slammed for granting Google access to patient data without consent.

Provided additional concerns are dealt with – ensuring Trusts do not become locked in to one provider, and retaining ownership of public data – he believes the benefits could be substantial.

But these benefits, which include “better ICT, more innovative, and faster development”, can only be reaped on the condition it’s “done properly and it’s not done with short-sightedness” or as a money-saving exercise.

“If you say you need to do cloud just because you’re saving money – that in itself is very, very short-sighted, because there are costs,” adds Ruiz. “You are bringing risks, so to just say you are going to save money is a false economy, because in the long term you are going to lose a lot more.”

‘It makes no sense to train a new army of experts’

Research by McAfee, meanwhile, found a quarter of organisations cited a lack of staff with the skills to manage security for cloud applications as a key challenge, with only 24% reporting no skills shortage at all. Significantly, 40% of IT leaders reported a skills shortage was slowing their organisation’s cloud adoption.

The NHS is suffering a staffing shortage in clinical areas, let alone in the technical skills required to maintain an IT project of this scale. NHS Digital itself only has “18 to 20 deeply technically skilled people”, according to a House of Commons report into the WannaCry attack that crippled the health system last year.

The report highlighted the struggle faced by NHS organisations trying to recruit and retain skilled cyber security staff in the midst of a national shortage, in a landscape where the private sector can pay far more than the health service to attract talent.

“Teams within the NHS already have skills and experience in the relevant areas, but there aren’t enough of them,” says Dr Miller. “It doesn’t make sense for the NHS to train or hire a new army of cloud experts for the migration: there are systems integrators, consultancies, and other partners ready and willing to provide those services to the NHS. But, equally, the NHS should not outsource the whole problem to a third party.”

He called for an increase in the size and funding of internal IT and digital capabilities, given the NHS needs people with the skills and expertise to understand the migration, and to continue asking the right questions about what its partners propose. Above all, he argues, the transition must be designed, shaped, and led by NHS employees.

“The work must also be done within the broader umbrella of a digital strategy for the NHS. This isn’t just about moving from one server to another, or upgrading one application to its cloud-based equivalent,” he adds.

“The real opportunity here is to think about what a digital NHS should look like, how a digital NHS can help NHS staff be more efficient, informed, and empowered, and how a digital NHS can improve interaction with and care for patients.”

Image: Shutterstock

Local government slow to adopt cloud services, research shows


Keumars Afifi-Sabet

24 May, 2018

The majority of local authorities across the UK are yet to embrace cloud services to handle citizen data, citing concerns over data fragmentation and funding for infrastructure.

In fact, 80% of councils are still using on-premise infrastructure, either in isolation or in conjunction with a cloud service, to access and manage citizen data, according to a Freedom of Information (FoI) request.

When the government published a ‘Cloud First’ policy in 2013 calling for the public sector to consider cloud services, it highlighted the public cloud as its preferred model. Despite this, the FoI data collected by virtualisation firm Citrix shows that private cloud is the most favoured model, used by 30% of councils surveyed, followed by a hybrid model used by a quarter of respondents, with only 7.5% using the public cloud.

Citrix sent FoI requests to 80 councils and received 40 responses. Asked what approximate proportion of their applications and data are stored in the cloud, the majority – 77.5% – said they stored up to a quarter of these assets in the cloud, while 5% stored everything on-premise. No council surveyed stored more than 75% of its data and applications in the cloud.

Moreover, data fragmentation remains a major concern, with 70% of IT teams ‘not confident’ the authority they work for has a ‘single view’ of citizen data – that is, a single database entry per individual, with access to their full service history.
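In practice, a ‘single view’ simply means every service interaction hangs off one identity record rather than being scattered across departmental databases. A toy sketch, with entirely hypothetical identifiers and fields:

```python
from collections import defaultdict

# Hypothetical service records, keyed on a made-up citizen identifier
raw_records = [
    {"citizen_id": "C-001", "service": "council tax", "year": 2017},
    {"citizen_id": "C-002", "service": "parking", "year": 2018},
    {"citizen_id": "C-001", "service": "housing", "year": 2018},
]

# Fold every interaction under a single entry per citizen
single_view = defaultdict(list)
for record in raw_records:
    single_view[record["citizen_id"]].append(record["service"])

print(dict(single_view))
# {'C-001': ['council tax', 'housing'], 'C-002': ['parking']}
```

The hard part for councils is not the fold itself but agreeing on one reliable identifier across fragmented systems.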

“Local councils today are under enormous pressure from both central government and British citizens to deliver better services, at lower costs,” said Darren Fields, Citrix managing director in the UK & Ireland.

“With local authorities facing an overall funding gap of £5.8 billion by the end of the decade, councils are always on the lookout for innovative, cost-effective technology to help deliver efficient citizen services, whilst also improving productivity for staff and reducing costs.”

Despite 75% of local authorities considering investing in cloud infrastructure in the next 12 months, only 7.5% plan to downsize their physical IT estate – by getting rid of on-premise servers or physical hardware, for instance. Just over a third (35%) have no plans to do so.

However, an academic paper published last year warned that councils across the UK shouldn’t be rushed into migrating their applications and data. The report found that despite the political pressure on local government to move to the cloud, there were few examples of best practice.

Each of the three councils’ cloud deployments examined in the report was “well implemented and well supported by cloud supplier staff”, the researchers said, and had a string of positives, but all the case studies examined showed a need to “develop an appropriate full and growing cloud strategy aligned to business strategy, together with an internal support programme to manage demand”.

“These factors require planning, managing and monitoring to ensure the best use, value and benefit is obtained from the investment in the technology to help ensure efficient, effective and successful IT,” the report’s authors said.

Fields added: “The cloud has the potential to transform public services, yet many local authorities are held back by legacy IT systems – making it a demanding and challenging exercise to consolidate and transition data and applications to the cloud.”

“However, the cloud will inevitably become integral to service delivery – solutions are typically more cost effective, scalable, secure and flexible – and are likely to become an indispensable asset for local authorities looking to deliver first-class services to residents across the UK.”

Citrix quietly ditches Xen and NetScaler brands days after Synergy 2018


Keumars Afifi-Sabet

16 May, 2018

Citrix has ditched its Xen and NetScaler product names in a major rebrand only days following the end of its annual Synergy conference.

The cloud-centric company, specialising in workplace digitisation, is rebranding products such as XenApp, XenDesktop and XenMobile as Citrix Virtual Apps, Citrix Virtual Desktops and Citrix Endpoint Management respectively.

Also as part of the rebrand, ShareFile will be named Citrix Content Collaboration, and XenServer will become Citrix Hypervisor.

These new products, which comprise a unified Workspace line, also include Citrix’s February acquisition Cedexis, which – after aligning to the “unification plan” – will be named Citrix Intelligent Traffic Management.

Citrix’s new Networking line, meanwhile, will see its NetScaler brand phased out entirely with existing products largely retaining their previous identities under a ‘Citrix’ handle; for example, NetScaler ADC will be known as Citrix ADC.

Only days after wrapping up its annual Synergy conference, this year hosted in Anaheim, California, Citrix quietly rolled out this major rebrand in a new product guide and name unification chart for partners.

“Throughout 2018, you will see exciting changes as we unify our product portfolio,” its new guidance said. “As we make it easier to use Citrix products, we’re also making it easier to understand the value of our solutions with new names.”

“Unifying the experience allows us to simplify our offerings, which requires name changes. Once we’ve transitioned to the new solutions and names, this effort will make it easier for you to understand the benefits of Citrix as a business and technology partner.”

Citrix added that changes to its website, support documentation and user interface will be introduced over the next year as part of a transition process, with businesses and partners given support to make any appropriate changes.

The rebrand emerges days after Citrix unveiled a host of new products at Citrix Synergy 2018, including Citrix Analytics, a machine learning-powered security tool, and the new Workspace App.

CEO David Henshall introduced the Workspace App as “one way to organise, access and open all of your files, regardless of whether they’re on your hard drive, on your network drive, on cloud or anywhere in between”.

The company was keen to push a unified message and product line, shifting its focus from infrastructure towards building a better user experience, with the rebrand feeding into Citrix’s desire to provide a more holistic and unified approach to transforming the digital workspace.

Speaking at a press Q&A following the keynote address, chief product officer PJ Hough went into more detail around building a better user experience for Citrix customers.

“Having spent a lot of time working on productivity software before I joined Citrix, I understand the value of reducing clicks, of reducing confusion for users, and really having people have a consistent and seamless experience across all their devices and platforms,” he said.

Although references to its Xen and NetScaler brands were kept to a minimum during the company’s opening keynote address, products such as XenDesktop and XenApp were often featured in breakaway sessions on the conference floor. But there was no indication that Citrix was planning to remove these brands from its line.

Cloud Pro approached Citrix for comment but its US representatives were not available at the time of writing.

View from the airport: Citrix Synergy 2018


Keumars Afifi-Sabet

14 May, 2018

In the early hours of last Tuesday morning, shortly before the opening keynote of Citrix Synergy 2018, we experienced a magnitude 4.5 earthquake along the San Andreas fault zone – rattling swathes of Southern California, from San Diego to Santa Clarita.

Taking to the stage later that morning, Citrix CEO David Henshall outlined his company’s new theory of ‘people-centric computing’ – essentially an aim to provide a dramatically better user experience – and experts believe the vendor might now have what it takes to send some seismic waves of its own through the industry.

“For the first time in a while, the company has all the strategic, product, and organisational building blocks in place to execute in the marketplace,” Andrew Hewitt, Forrester’s analyst for infrastructure and operations, told Cloud Pro. “Today’s leading organisations are constantly looking for a way to simplify and improve employee experience with technology, and Citrix is well-positioned to do that.”

The overarching message the company was keen to push was its vision of the ‘future of work’. At first glance this seems an impressive statement, but all of its flagship rollouts at Citrix Synergy had existed, or been teased, in some form as far back as 2014. The Workspace App, for instance, serves as the final manifestation of a vision for a unified digital workspace that the company has outlined in many iterations over the years.

Hewitt said Citrix needs to focus on its messaging here, though, differentiating its service from rivals’. He said: “For one, it needs to better delineate how its product is different from VMware’s Workspace One product, such as its inclusion of ShareFile and its strong support for MAM-only deployments.”

He added: “Citrix will need to clearly define why products like Secure Mail and Secure Hub offer a better experience than the native solutions if they want to keep moving in that direction.”

Security also featured prominently with the rollout of Citrix Analytics, a machine learning-powered tool that aims to build profiles for individual users to mitigate the threat of human error – a cyber threat echoed by many figures over the course of the week, including former secretary of state Condoleezza Rice.

Hewitt noted that the product is very focused on security use cases, but that “there are doubts on how valuable those analytics will be across varied vendor ecosystems that many customers have today”.

The conference, overall, was well received, with customers and partners generally pleased with what Citrix had delivered. But for all the emphasis the company put on the ‘future of work’, Synergy 2018 was ironically lacking the sense of futurescaping I had been expecting.

Seeing some kind of projection for how the future of work may look in, say, five years’ time, would have complemented an all-in-all successful conference, as Citrix positions itself strongly in the market.

UK insurer Beazley adopts a digital transformation doctrine


Keumars Afifi-Sabet

11 May, 2018

Specialist insurer Beazley has been using Citrix products for a number of years, celebrating the culmination of a journey that has led it to feature among the finalists at Citrix’s Innovation Awards this year.

The winner, WAGA, was crowned by CEO David Henshall during the keynote address at the Anaheim Convention Centre, California, this week – but simply making the shortlist represents another big step in Beazley’s journey to streamline its business.

“At the moment we’re undertaking a huge activity-based working programme change,” said Dale Steggles, Beazley’s architect and leader of end-user technologies. “That’s changing the physical location, people, soft skills, as well as technology – trying to provide huge mobility to the workforce globally.”

Citrix has underpinned a lot of that activity, according to Steggles, with the use of its products XenDesktop and XenApp, for instance, but also the likes of ShareFile and SecureMail.

Founded in 1986, Beazley grew out of its London offices to expand in the United States in 2005 – and has subsequently opened offices in Paris, Munich, and a host of other locations including the Middle East.

Speaking with Cloud Pro, Steggles was joined by Mark Moerdyk, Beazley’s head of IT infrastructure and engineering, who outlined how his team made several strong strategic decisions in the way they expanded.

“One of them was saying actually that we didn’t want to decentralise our infrastructure, and so we opted for Citrix to actually build out our US offices and enable our business across the US, while managing to consolidate our infrastructure into two data centres,” he explained.

On the day-to-day benefits the business has accrued, particularly in recent history, Moerdyk added: “It’s the enablement of the business that’s been the most important element.

“We were actually reimagining and redesigning our physical office spaces to actually support the way people want to work, we take feedback from our users, and they’re consistently saying they don’t have the right environments to do their jobs, so we’ve actually embarked on a programme where we’re changing all of that.”

He noted that this increased mobility, which includes hot-desking and remote working, has driven productivity, leading to a “10 to 20% productivity gain just by enabling a different way of working”.

Steggles went on to outline how, technically speaking, Beazley has transformed the infrastructure outlay that laid the foundations for the firm’s greater mobility.

“We’ve been able to centralise 1,800 endpoints – what used to be 1,800 physical devices inside the perimeter, laptop and desktop devices – so that 80% of our users are now on a hosted shared desktop; managing one image, one workflow, one deployment mechanism – across four data centres globally.

“Previously, whether it’s product updates or whether it’s patching, all of that environment we had to touch on individual endpoints; now we just touch it once.

“Update a master image, and it’s done for 80% of your organisation. It’s then really helped us focus on business drive – around innovation rather than just keeping the lights on.”
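The arithmetic behind that claim is straightforward. A toy comparison of the two update models – the 1,800 endpoints and 80% hosted-desktop share come from Steggles; the ‘operations’ framing is illustrative:

```python
# Per-endpoint patching versus updating one master image.
ENDPOINTS = 1_800
SHARED_DESKTOP_SHARE = 0.80   # users on the centrally hosted shared desktop

per_endpoint_ops = ENDPOINTS  # before: touch every physical device
master_image_ops = 1 + round(ENDPOINTS * (1 - SHARED_DESKTOP_SHARE))
# after: one master-image update covers the 80%; the rest still need direct touches

print(f"before: {per_endpoint_ops} update operations per rollout")
print(f"after:  {master_image_ops} update operations per rollout")
```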

Beazley’s challenges, meanwhile, have historically arisen in maintaining collaboration between different productivity apps and services, and in reducing the layers of complexity created over years of developing IT infrastructure.

“Today we have four or five applications you’d need to deploy to a mobile device to be productive – reducing that down to one is a huge leap forward,” said Steggles, pinning his hopes on Citrix’s Workspace App materialising.

“One space, one environment; I’d go there to get any activities – to share, collaborate, connect with content – I think it’s a really good place to be.”

But the linchpin of Citrix’s role in Beazley’s setup has been the way it gives smaller organisations access to the equivalent of the expensive infrastructure conventionally available only to larger enterprises.

“For us it’s about taking that challenge and working with our partners to drive efficiencies where we can and try to push the business – whether, for example, it’s standing up an office in Barcelona over a VPN – over an internet connection – because the business wants to incubate it, wants to see whether it’s viable to put an office together in Barcelona with minimal investment.

“Previously, we’ve been unable to do that without massive infrastructure outlay; now we just ship a device, give an access key to the user, and they log in from anywhere in Barcelona.”

This is a point echoed by Citrix’s Sridhar Mullapudi, vice president of product management for Workspace services. Speaking to Cloud Pro, Mullapudi said one of the key benefits of its technology, for SMBs in particular, was on the infrastructure side.

“For SMBs what cloud usually does is it brings the economics and time to value enterprises used to have. So before, enterprises used to have a lot of budget, so they could actually implement these large IT projects and infrastructure investments which SMBs can’t do,” said Mullapudi.

“With cloud-based solutions, like what you’ve seen today, Workspace, and things like that, it just makes it easier for SMBs to consume and get the same power that was probably historically reserved for enterprises but now is available through cloud.”

Moneyball author Michael Lewis calls for better use of big data at Citrix Synergy 2018


Keumars Afifi-Sabet

11 May, 2018

Bestselling author and former Wall Street trader Michael Lewis called for data analytics to be far better harnessed by large organisations and governments in a Q&A session at Citrix Synergy.

In an event marking the final day of Citrix’s annual Synergy conference, this year hosted in Anaheim, California, the author of Moneyball, and The Big Short, discussed how technology has affected the financial sector, how data has changed sport, and the role AI plays in human decision-making.

He cited data belonging to the US government – “the biggest company on the planet” – as an example of how data analytics could be better harnessed not only to gain insights, but to develop smarter public policy.

“The federal government is a trove of enormous databases that people have only scratched the surface of,” he said, adding that we are only starting to see this data being put to proper use.

“The only reason we know about the opioid crisis is the national institute for health data on distribution of prescription drugs – which was made available to the public under the Obama administration in a way it was accessible.”

He told the audience “we wouldn’t even know” about the burgeoning opioid crisis affecting parts of the United States if not for analysts at ProPublica, a nonprofit newsroom based in New York, taking and analysing this data.

Lewis also spoke about how technology had fundamentally changed the financial sector and how incentives to make short-term profits led to the financial crisis – issues at the heart of The Big Short and Flash Boys.

He explained how traders began to learn that their physical distance and location had an effect on how fast they received information about markets, and how high-frequency traders at BATS Global Markets built faster networks to the other 12 exchanges scattered around New Jersey so they could detect big sell orders and trade ahead of rivals.

“What happened is, the stock market, instead of being about fundamental investment decisions, starts to become just about speed; how fast can you get from one exchange to the other, or how fast you can assemble a picture of the markets that is more accurate.

“If you can see prices before everybody else – it’s the ultimate form of insider trading.”

Lewis explained how traders began building faster networks to connect markets together, by trying to lay the straightest fibre-optic lines possible – a process that involved blowing up mountains, and hundreds of millions of dollars of investment, to gain only a few milliseconds of advantage, adding “there’s a kind of madness to it”.
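The milliseconds at stake follow directly from light’s speed in fibre: straightening a route saves time in proportion to the distance cut. A rough sketch, with hypothetical distances in the range Lewis describes:

```python
# Time saved by a straighter fibre route between two trading venues.
SPEED_IN_FIBRE_KM_S = 299_792 / 1.468     # ~204,000 km/s in typical fibre

def one_way_ms(route_km: float) -> float:
    return route_km / SPEED_IN_FIBRE_KM_S * 1_000

old_route_km, straight_route_km = 1_600, 1_330   # hypothetical winding vs straightened path
saving_ms = one_way_ms(old_route_km) - one_way_ms(straight_route_km)
print(f"advantage per one-way trip: {saving_ms:.2f} ms")   # ~1.3 ms
```

A little over a millisecond, at a cost of hundreds of millions of dollars – hence the ‘madness’.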

“There’s a point where the speed – it’s not actually adding anything to the economic efficiency – it’s all about gaming the market; all about finding out what people are doing and getting ahead, or assembling a picture of the market that’s a millisecond faster than the market itself,” he said.

The other side of the equation, he noted, was about slowing down the New York Stock Exchange, or Nasdaq, as another way of widening the gap: “The incentives are to widen the gap of time between when an ordinary investor gets a piece of information, and when a high-frequency trader gets a piece of information.”

“This is a malign use of technology; I don’t think this adds anything to the wealth of the society. It does make a high percentage of high-frequency traders very rich.”

Q&A: Citrix’s privacy chief Peter Lefkowitz talks GDPR compliance at Synergy 2018


Keumars Afifi-Sabet

10 May, 2018

With GDPR set to come into force later this month, organisations of all sizes are racing to comply with a new set of tougher data protection laws.

Cloud Pro caught up with Citrix’s chief privacy and digital risk officer Peter Lefkowitz at Citrix Synergy 2018, hosted in Anaheim, California, to discuss what the new legislation means for organisations, how it changes the way businesses approach privacy, and how Citrix itself has changed in light of imminent GDPR enforcement.

“We’re just at that moment – we’re 16 days out – so I’m spending a lot of time on it, but it’s not just internal system compliance, it’s looking at our products – what data do they collect, what are our retention rules, how do we promote ourselves to customers?” he said.

Citrix’s bid to comply with GDPR, according to Lefkowitz, has included updating all global contracts, putting out a new data privacy addendum and standard contractual clauses – pushing those to “77,000 of our active customers in April” – and new terms for all its channel partners and suppliers.

The cloud-centric company has also asked its suppliers to sign up to new privacy terms, and fill out a questionnaire, so Citrix knows “who their security contacts are, where we go in the event of an incident, who to contact, and that sort of thing”.

On how GDPR has changed the way Citrix operates internally, Lefkowitz said: “By virtue of the fact the GDPR is so focused on accountability, on all of these controls, and on transparency, it has raised privacy awareness and security design awareness to a higher level, so we now have some of the members of our executive leadership team who want regular updates on these topics.

“It has raised that discussion up against how we design our products, how we manage our services, what we do on the back-end.”

Lefkowitz’s comments chime with chief product officer PJ Hough’s assertion that Citrix is not only GDPR-ready itself, but has made efforts to ensure wider compliance among its associates in the industry.

“For all of our existing commercial products we have gone through GDPR review already, and we have actually not just complied ourselves, we’re actively engaged with many of our large European and global customers to help them become GDPR compliant in their entire deployment,” he said at a press Q&A following the opening keynote address.

“So I would say as we bring more of our products online we will be compliant with the regulations in all the markets in which we serve.”

CEO David Henshall added, in the same session, that regulatory compliance is “woven into how we think about the company – how we think about delivering cloud services – it’s just part of the fabric”.

Lefkowitz went on to describe how Citrix is helping its partners and customers through an array of blogs, white papers, schematics and other materials published online, outlining its approach to GDPR and data protection more generally.

“We’ve done training internally, for our support organisations, for our sales force, for our legal department, for a lot of people that touch customers and touch suppliers, so people are aware of what the key issues are. The goal is to really be as transparent as possible – and to make it as easy as possible for our customers to use these products,” he said.

Turning to the legislation itself, Citrix’s in-house privacy expert explained that the benefits of GDPR include forcing organisations to adopt healthier data protection practices, though he also warned of some unintended consequences.

“Raising these topics, making those operation controls more of a requirement, has taken a lot of effort from every company,” Lefkowitz said.

“But if you know where your data is, where your systems are, how they’re managed, you regularly check them and update them, I think the companies that take GDPR seriously are overall going to have a better framework for security control for all of their data – particularly for sensitive personal data.”

Organisations, however, should be wary of the impact of the ePrivacy Regulation, according to Lefkowitz – a separate regulation governing electronic privacy and marketing that sits alongside GDPR and is in the process of being rewritten.

“Nobody knows where it’s going to land,” he explained, adding: “We’ve all been doing this big effort around marketing systems and marketing controls around GDPR, and then probably next year we’re going to be hit with an entirely new regulation.”

Lefkowitz also warned that a number of areas under the regulation have been left open for individual member states to pass their own laws, or enforce in their own way, going against the main purpose of GDPR: unifying data protection regulations across the continent and the wider world.

“A worry is that once the regulation is in effect and countries start seeing new technologies, new instances, new breaches, we may see countries splintering a bit on some very important topics,” he explained.

He outlined a hypothetical scenario of a company heading to a lead regulator in one country, presenting its system and its controls and gaining clearance, only for another regulator in another country to pull the company up on the same issues, as a point of great concern.

On how penalties will be enforced, Citrix’s privacy chief said the legislation brings punishment under GDPR into line with existing laws, with the whole notion of fines of 2% and 4% of annual revenue based on competition and antitrust law.
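Concretely, the regulation sets two tiers of maximum fine: 2% of global annual turnover or €10 million for the lower tier, and 4% or €20 million for the upper tier, whichever is higher in each case. A quick illustration with a hypothetical turnover:

```python
# GDPR's two fine tiers: 2%/EUR 10m (lower) and 4%/EUR 20m (upper), whichever is higher.
def max_gdpr_fine(annual_turnover_eur: float, upper_tier: bool) -> float:
    rate, floor = (0.04, 20_000_000) if upper_tier else (0.02, 10_000_000)
    return max(rate * annual_turnover_eur, floor)

turnover = 2_000_000_000  # hypothetical company with EUR 2bn annual turnover
print(f"lower tier cap: EUR {max_gdpr_fine(turnover, upper_tier=False):,.0f}")  # 40,000,000
print(f"upper tier cap: EUR {max_gdpr_fine(turnover, upper_tier=True):,.0f}")   # 80,000,000
```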

He explained there will be two prongs to the regulatory approach based on the severity of non-compliance.

Outlining the first, he said: “Some of the regulators have already spoken publicly about this – they’ve hired more staff – so on 28 May, they’re going to go out and really look for basic stuff that hasn’t been done.” This will include situations in which an organisation doesn’t have a privacy policy, or there is evidence it isn’t giving somebody access rights.

“Tranche two is going to be when the really, really, really, really bad stuff happens – the breach that has a horrendous impact, that easily could have been avoided; the company that is selling lists of sensitive information and not following up on controls – we’ve heard a little something about that recently – those I think the regulators will take very seriously,” explained Lefkowitz.

“Time will tell whether the fines will be similar to what we’ve seen under competition law. I can’t make a guess at that; just the fact that the regulators will have that in their back pocket I think will make a significant difference in compliance.”