A Website That Works in Microsoft Edge, but Not in Safari or Internet Explorer

Among my many character flaws, I am an unapologetic font addict. One of my favorite T-shirts reads: Whoever dies with the most fonts, wins. Figure 1 shows my font menu in Microsoft Word on my home iMac®, so you can see that I’m well on my way to winning. Because of this addiction, I was […]

The post A Website That Works in Microsoft Edge, but Not in Safari or Internet Explorer appeared first on Parallels Blog.

Gartner: Cloud giants’ dominance poses challenges for users


Clare Hopping

12 Apr, 2018

Gartner predicts the top 10 cloud providers will account for 70% of IaaS revenues in the next three years, with the likes of AWS, Microsoft, Google and Rackspace dominating the leaderboard more than ever.

These top 10 firms took 50% of the market in 2017, according to Gartner’s Forecast Analysis: Public Cloud Services, Worldwide, 4Q17 Update, and Gartner suggested that the smaller players stand little chance against the big boys, with the larger firms in danger of getting “unchecked influence” over users as a result.

“The increasing dominance of the hyperscale IaaS providers creates both enormous opportunities and challenges for end users and other market participants,” said Sid Nag, research director at Gartner.

“While it enables efficiencies and cost benefits, organizations need to be cautious about IaaS providers potentially gaining unchecked influence over customers and the market,” Nag explained.

He added that organisations will demand more from IaaS providers, particularly around the ease with which they can switch between multiple clouds, rather than focusing on one supplier. They will look to form alliances with the vendors that allow for multicloud agreements, rather than those that penalise users that have more than one supplier.

As more businesses realise the benefits of the cloud, they are becoming more demanding across the board, Nag added. This has already become apparent in the SaaS market – the largest cloud segment, with revenues expected to grow 22% this year to $73.6 billion – where firms want tools that specifically align with their business objectives, rather than one-size-fits-all offerings.

“In many areas, SaaS has become the preferred delivery model,” said Nag. “Now SaaS users are increasingly demanding more purpose-built offerings engineered to deliver specific business outcomes.”

Elsewhere IaaS is expected to grow by a third to hit $40.8 billion in revenue in 2018, and PaaS to reach $15 billion.

In the PaaS sector, database platform as a service (dbPaaS) is the fastest-growing segment, with hyperscale cloud providers seizing the opportunity to diversify their services.

“Although these large vendors have different strengths, and customers generally feel comfortable that they will be able to meet their current and future needs, other dbPaaS offerings may be good choices for organizations looking to avoid lock-in,” Nag added.

Picture: Shutterstock

How enterprise IT investment is being driven by C-level strategy

In the evolving global networked economy, every type of company essentially becomes a technology-oriented firm – in one form or another. That's fueling the strategic investment in IT infrastructure and services. Currency market changes are another key factor.

As a result, worldwide IT spending is projected to total $3.7 trillion in 2018 — that's an increase of 6.2 percent from 2017, according to the latest market study by Gartner. Senior executives and line of business leaders continue to drive many of the strategic IT procurement decisions.

IT infrastructure market development

"Although global IT spending is forecast to grow 6.2 percent this year, the declining U.S. dollar has caused currency tailwinds, which are the main reason for this strong growth," said John-David Lovelock, vice president at Gartner.

This is the highest annual growth rate that Gartner has forecast since 2007, and could signal a new cycle of IT growth. Even so, worldwide IT spending is growing at expected levels, in line with anticipated global economic growth.

Through 2018 and 2019, the U.S. dollar is expected to trend stronger while enduring tremendous volatility due to the uncertain political environment, the North American Free Trade Agreement renegotiation and the potential for an innovation trade-war with China.

Enterprise software spending is forecast to experience the highest growth in 2018 with an 11.1 percent increase. The software industry is expected to capitalize on the evolution of digital business. Application software spending will continue to rise through 2019, and infrastructure software will also continue to grow — bolstered by IT modernization initiatives.

Even with a strong end to 2017, worldwide spending on traditional data centre systems is forecast to grow 3.7 percent in 2018 – but that's down from 6.3 percent growth in 2017. The longer-term outlook continues to have challenges, particularly for the data storage segment. Blame lies in the advance of public cloud computing adoption.

The strength at the end of 2017 was primarily driven by the component shortage for semiconductor memory, and prices have increased at a greater rate than previously expected. According to the Gartner assessment, the shortages will likely continue throughout the year with the supply not expected to ease until the end of the year.

Outlook for end-user device investment

Worldwide spending for devices – i.e. personal computers, media tablets and smartphones – is forecast to grow in 2018, reaching $706 billion, an increase of 6.6 percent from 2017. The device market continues to see dual dynamics. Some users are holding back from buying, and those that are buying are doing so, on average, at higher price points.

As a result, end-user spending will increase faster than units through 2022. However, total end-user spending and unit shipments are expected to be lower compared with previous forecasts, as demand for ultra-mobile premium devices, ultra-mobile utility devices and basic mobile phones is expected to be slow.

Pushing cloud AI closer to the edge


Keri Allan

12 Apr, 2018

Cloud-based AI services continue to grow in popularity, thanks to their low cost, easy-to-use integration and potential to create complex services.

In the words of Daniel Hulme, senior research associate at UCL, “cloud-based solutions are cheaper, more flexible and more secure” than anything else on the market.

By 2020 it’s believed that as many as 60% of personal technology device vendors will be using third-party AI cloud services to enhance the features they offer in their products. However, we’re also likely to see a significant growth of cloud-based AI services in the business sector.

One of the biggest drivers of this has been the proliferation of virtual personal assistants (VPAs) in the consumer space, made popular by the development of smart speakers by the likes of Amazon and Google.

Users have quickly adopted the technology into their everyday lives, and businesses were quick to realise the potential locked away in these devices, particularly when it comes to delivering new products.

Drivers of cloud-based AI services

Amazon’s Alexa was the first personal assistant to achieve mass-market appeal

“It’s a confluence of factors,” says Philip Carnelley, AVP Enterprise Software Group at analyst firm IDC. “There is no doubt the consumer experience of using Alexa, Siri and Google Now has helped familiarise businesses with the power of AI.

“But there is also a lot of publicity around AI achievements, like DeepMind’s game-winning efforts – AlphaGo winning against the Go champion, for example – or Microsoft’s breakthrough efforts in speech recognition.”

He adds that improvements to the underlying platforms, such as the greater availability of infrastructure-as-a-service (IaaS) and new developments in graphical processing units, are making the whole package more cost-effective.

Yet, it’s important to remember that despite there being so much activity in the sector, the technology is still in its infancy.

“AI is still very much a developing market,” says Alan Priestley, research director for technology and service partners at Gartner. “We’re in the very early stages. People are currently building and training AI models, or algorithms, to attempt to do what the human brain does, which is analyse natural content.”

The likes of Google, Amazon and Facebook are leading this early development precisely because they have so much untapped data at their disposal, he adds.

The role of the cloud

Vendors have helped drive AI concepts thanks to open source code

The cloud has become an integral part of this development, primarily because of the vast computing resources at a company’s disposal.

“The hyper-scale vendors have all invested heavily in this and are building application programming interfaces (APIs) to enable themselves – and others – to use services in the cloud that leverage AI capabilities,” says Priestley.

“By virtue of their huge amount of captive compute resource, data and software skill set, [these vendors have been] instrumental in turning some of the AI concepts into reality.”

This includes the development of a host of open source tools that the wider community is using today, including TensorFlow and MXNet, and the large vendors’ services are frequently utilised when training AI models.

According to IDC, businesses are already seeing the value of deploying these cloud-based AI solutions. Although less than 10% of European companies use AI in operational systems today, three times that amount are currently experimenting with, piloting or planning AI usage – whether that be to improve sales and marketing, planning and scheduling, or general efficiency.

Benefits to business

Chatbots were an early AI hit within many businesses

“Businesses are seeing early implementations that show how AI-driven solutions, like chatbots, can improve the customer experience and thereby grow businesses – so others want to follow suit,” says Carnelley.

“Unsurprisingly, companies offering AI products and services are growing fast,” he points out.

Indeed, chatbots were one of the earliest AI-powered features to break into the enterprise sphere, and interest looks set to continue.

According to a report published this month by IT company Spiceworks, within the next 12 months, 40% of large businesses expect to implement one or more intelligent assistants or AI chatbots on company-owned devices. They will be joined by 25% of mid-sized companies and 27% of small businesses.

However, organisations are also looking more widely at the many ways AI solutions could help them.

The insurance industry, in particular, is looking at how AI can be used to help predict credit scores and how someone may respond to a premium.

“This is not just making a decision but interpreting the data,” says Priestley. “A lot of this wasn’t originally in digital form, but completed by hand. This has been scanned and stored but until recently it was impossible for computer systems to utilise this information. Now, with AI, technology can extract this data and use it to inform decisions.”

Another example he highlights is the medical sector, which is deploying AI-powered systems to help improve the process of capturing and analysing patient data.

“At the moment, MRI and CT scans are interpreted by a human, but there’s a lot of work underway to apply AI algorithms that improve the interpretation of these images, and the resulting diagnosis,” says Priestley.

Moving to the edge

Self-driving cars will need latency-free analytics

Given the sheer amount of computational power on hand, the development of AI services is almost exclusively taking place in the cloud but, looking forward, experts believe that many will, at least partially, move to the edge.

The latency associated with the cloud will soon become a problem, especially as more devices require intelligent services that are capable of analysing data and delivering information in real time.

“If I’m in a self-driving car it cannot wait to contact the cloud before making a decision on what to do,” says Priestley. “A lot of inferencing will take place in the cloud, but an increasingly large amount of AI deployment will take place in edge devices.

“They’ll still have a cloud connection, but the workload will be distributed between the two, with much of the initial work done at the edge. When the device itself can’t make a decision, it will connect to the ‘higher authority’ – in the form of the cloud – to look at the information and help it make a decision.”
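The edge-first, cloud-fallback pattern Priestley describes can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's API: `edge_infer` and `cloud_infer` stand in for a small on-device model and a larger cloud-hosted one, and the confidence threshold is arbitrary.

```python
# Hypothetical sketch of edge-first inference with a cloud fallback.
# edge_infer / cloud_infer are illustrative stand-ins, not real APIs.

CONFIDENCE_THRESHOLD = 0.90  # arbitrary cut-off for this example

def edge_infer(sample):
    """Small on-device model: fast, but less certain on hard inputs."""
    # Toy rule: values near 0 or 1 are "easy"; mid-range values are "hard".
    confidence = abs(sample - 0.5) * 2
    label = "positive" if sample >= 0.5 else "negative"
    return label, confidence

def cloud_infer(sample):
    """Larger cloud-hosted model: slower round-trip, higher accuracy."""
    return ("positive" if sample >= 0.5 else "negative"), 0.99

def classify(sample):
    """Decide locally when confident; defer to the 'higher authority' otherwise."""
    label, confidence = edge_infer(sample)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label, "edge"
    label, _ = cloud_infer(sample)  # escalate the hard case to the cloud
    return label, "cloud"

print(classify(0.98))  # easy input: handled at the edge
print(classify(0.55))  # ambiguous input: escalated to the cloud
```

The threshold is where the workload split happens: raising it sends more traffic to the cloud, lowering it keeps more decisions on-device at the cost of accuracy.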

Essentially, organisations will use the cloud for what it’s good at – scale, training, developing APIs and storing data. Yet it’s clear that the era of cloud-only AI is coming to an end.

Image: Shutterstock

HPE acquires RedPixie to add Azure skills to its cloud consulting arm

Hewlett Packard Enterprise (HPE) has announced the acquisition of London-based RedPixie to further bolster its cloud consulting expertise.

RedPixie’s vision, in the company’s own words, is ‘to go beyond technology, building and managing Azure hybrid solutions for clients in financial services.’ The company was founded in 2010 and comprises a team of business consultants, cloud architects and data scientists.

The acquisition will fall under the remit of HPE Pointnext, the company’s services business. “With this acquisition, we will continue to expand our comprehensive hybrid IT portfolio and will be even better positioned to help our customers build new digital experiences and drive better business outcomes now and into the future,” wrote Ana Pinczuk, SVP and GM of HPE Pointnext.

With the acquisition HPE now has both of the leading cloud infrastructure providers covered. In September the company bought Cloud Technology Partners (CTP), whose focus is more on the Amazon Web Services (AWS) side.

For HPE, the proposition is clear: hybrid IT and multi-cloud is increasingly the order of the day. According to figures last year from 451 Research, around half of AWS users were also using Azure and vice versa.

“The reality today is that enterprises face a hybrid IT world,” wrote Pinczuk. “Some workloads are best suited to the public cloud, some should live in a private cloud environment and others need to stay in traditional on-premises infrastructure. Finding the right mix will enable businesses to analyse data quickly, efficiently manage workloads and ultimately accelerate business outcomes by driving new business models, creating new customer and employee experiences, and improving operational performance.”

Financial terms of the acquisition were not disclosed.

Dubai Airports shifts to Box in a move towards full digitalisation


Gabriella Buckner

11 Apr, 2018

Dubai Airports has moved its employees over to the cloud content management platform Box.

Over 2,000 employees are now using Box, which has helped the airport spend less time servicing file servers and has increased collaboration between employees and external partners. Dubai Airports owns and manages the operation and development of both of Dubai’s airports – Dubai International (DXB) and Dubai World Central (DWC).

Using Box, Dubai Airports employees can securely access, edit and share information from any device. Through Box Governance, Box Zones and Box KeySafe, they’re also more easily able to comply with EU data protection standards, which was the company’s initial reason for partnering with Box.

The platform also integrates with Microsoft Office 365 and Okta, and Dubai Airports has centralised internal project management, digital assets and business-critical records on Box.

Employees can also access and edit content from any device, providing mobility anywhere on the airport campuses, greater Dubai metro and internationally.

Abdulrahman Al Hosani, vice president of Dubai Airport’s Infrastructure and Operations, said that around 88.2 million passengers travelled through Dubai Airports in 2017 and the company was looking to use technology to create a “smoother experience for those customers.

“With Box, we spend less time servicing file servers and support desk tickets, and can focus on what we specialise in: providing a premier passenger experience, baggage processing and airfield management,” he said.

He added that the decision to select Box was primarily for security and regulatory purposes. “Box KeySafe, Box Governance, and Box Zones allow us to comply with EU regulations on data protection,” said Al Hosani. “Now with Box, not only do we have greater control of our content, it has simplified access to information, resulting in a significant reduction in associated time and costs.”

The migration to Box, which began in 2016, is only the latest part of Dubai Airports’ journey to becoming fully digital. It has sought to update its image and services through other digital partners like Clipatize, which has created social media content, developed methods for measuring customer experience and helped institute new workplace values.

Finding the right Agile formula: Making CI, CT and CD work together

In today’s world of Agile, we know that development teams must offer products and services to end users on their terms, on their choice of devices, and at their convenience – creating and differentiating features that work perfectly regardless of how much time they have to build and release the new software. At the end of the day, this combination of velocity and quality can be make or break for brands, but achieving it is no easy feat.

We believe that teams which are trying to mature their DevOps practices but facing hurdles should consider combining the ‘Three Cs’ – continuous integration, continuous testing and continuous delivery – in their workflows. By automating all release activities, teams can assure high-quality deliverables at each stage of the DevOps pipeline, raising confidence in a flawless release time after time.

Although they serve slightly different objectives, the ‘Three Cs’ can actually integrate to assist teams in meeting their primary goals: speed and quality. But, determining how to combine these three methods into the right formula is crucial, and for many that’s the challenging bit.

Defining the methods

Before we consider how these practices can work together, we should address some confusion still in the market – and take a quick look at what each of these methods really means.

Continuous integration: The most dominant of the ‘Three Cs’ is continuous integration (CI), a necessary approach for any Agile team. CI requires developers to integrate code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early.

By integrating regularly, teams can detect errors quickly, and locate them more easily. Simply, it ensures bugs are caught earlier in the development cycle, which makes them less expensive to fix – and maintains a consistent quality.
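In practice, the “verified by an automated build” step is simply a script the CI server runs on every check-in: build first, then test, and fail fast. A minimal sketch in Python follows; the step commands are placeholders for a project’s real build and test tooling.

```python
# Minimal sketch of a per-commit CI gate: build, then test, fail fast.
# The step commands are placeholders for real build/test tooling.
import subprocess
import sys

def run_step(name, command):
    """Run one pipeline step; report and stop the build on failure."""
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        print(f"[CI] step '{name}' FAILED")
        return False
    print(f"[CI] step '{name}' passed")
    return True

def ci_pipeline(steps):
    """Execute steps in order; any failure breaks the build early."""
    for name, command in steps:
        if not run_step(name, command):
            return 1  # non-zero status marks the check-in as broken
    return 0

if __name__ == "__main__":
    steps = [
        ("build", [sys.executable, "-c", "print('compiling...')"]),
        ("unit tests", [sys.executable, "-c", "assert 1 + 1 == 2"]),
    ]
    status = ci_pipeline(steps)
    print(f"[CI] pipeline exit status: {status}")
```

Because the gate runs on every check-in, a broken build is tied to a single small change rather than a week of merged work, which is what makes the bug cheap to locate and fix.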

Continuous delivery: Continuous delivery (CD) is the practice of streamlining and automating all the processes leading up to deployment. This involves many steps, such as validating the quality of the build in the previous environment (e.g. the dev environment) and promoting it to staging. Done manually, these steps can take significant effort and time; using cloud technologies and proper orchestration, they can be automated.

Teams should also ensure they have a monitoring dashboard for their production environment in place, so they can eliminate performance bottlenecks and respond quickly to issues. This completes an efficient CD process.
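The validate-then-promote flow described above can be sketched as a simple loop over environments. The environment names and the quality gate below are invented for illustration; a real pipeline would run smoke and integration tests at each stage rather than check a flag.

```python
# Hypothetical sketch of automated promotion through environments.
# Environment names and the validation check are invented for illustration.

ENVIRONMENTS = ["dev", "staging", "production"]

def validate(build, environment):
    """Stand-in quality gate: a real pipeline would run smoke/integration
    tests against the build in this environment."""
    return build.get("tests_passed", False)

def promote(build):
    """Push the build through each environment in order,
    stopping at the first failed quality gate."""
    reached = []
    for env in ENVIRONMENTS:
        if not validate(build, env):
            print(f"Build {build['id']} blocked before {env}")
            break
        reached.append(env)
        print(f"Build {build['id']} promoted to {env}")
    return reached

promote({"id": "1.4.2", "tests_passed": True})
```

The point of automating this loop is that “promote to staging” becomes a repeatable, auditable step instead of a manual checklist, which is where most of the effort and time savings come from.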

Continuous testing: Continuous testing (CT), also referred to as continuous quality, is the practice of embedding and automating test activities into every commit. Without it, fixing a bug in code written years ago is a slow process: developers must first remind themselves which code it was, undo anything written on top of the original code, and then re-test the new code. Testing that takes place on every commit – as well as every few hours, nightly and weekly – not only increases confidence in the application’s quality, it also drives team efficiency.
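Concretely, CT means the same automated checks ride along with every commit. A small illustration using Python’s standard `unittest` module follows; the `apply_discount` function is a hypothetical piece of application code, not from any real product.

```python
# Sketch of continuous testing: these checks run on every commit (via the
# CI hook), so a regression is caught while the change is still fresh.
import unittest

def apply_discount(price, percent):
    """Hypothetical application code under test."""
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    """Executed per commit, not just before a release."""

    def test_standard_discount(self):
        self.assertEqual(apply_discount(100.0, 20), 80.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

if __name__ == "__main__":
    # exit=False lets a CI wrapper inspect results instead of terminating.
    unittest.main(exit=False)
```

If a later commit changes `apply_discount` and breaks either expectation, the very next automated run flags it, so the developer never has to reconstruct years-old context to find the fault.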

Working together

These three processes are often viewed as distinct, separate entities, fighting for the top spot in the DevOps pipeline. However, we’d argue that the ‘Three Cs’ are each important to the success of the others throughout the delivery cycle, and it’s only by incorporating the CI/CD/CT trifecta that teams will be able to achieve the velocity and quality they need.

But with distinct personalities and separate jobs to do, how can DevOps teams bring these functions together and ensure a smooth running, fully integrated, team?

First, it’s important to understand how each function plays into the others. For example, for CI to be successful, tests need to be stable and to run continuously in a stable environment – which in turn ensures highly reliable test results for each build execution. The same goes for CT: an engine that can trigger tests automatically requires robust test code, a robust test lab and, most importantly, cultural and team synchronisation. When both CI and CT are working for you, you are very close to a working CD.

We also believe that there are three key characteristics that teams must have to make the trifecta of CI, CT and CD work together. The first is communication. Communication between team members is vital, and this is something that CI enables. CI allows teams to be Agile by keeping everyone on the same page: even after leaving a project or moving on to a different step in the process, team members can easily reintegrate when they return, without having to start over from page one.

The second key characteristic is trust – CD alleviates any unknowns by automating and streamlining all the processes leading up to deployment, such as validating the quality of the build in the previous environment and promoting to staging. And third is honesty. If teams leverage CT when developing apps in different environments and with different criteria, it will prevent larger issues from happening once the app is in the sprint, or live, keeping developers honest about the status of their code.

Conclusion

So, we can clearly see the benefits of adopting the ‘Three Cs’. By establishing open lines of communication, and ensuring teams know how each function plays into the others, DevOps teams can establish a seamless development workflow. And, of course, it’s crucial to incorporate CI, CD and CT throughout the entire SDLC to keep deployments moving – and to achieve the velocity of release and application quality that ultimately keeps consumers happy.

Connecting Your iPhone to Your Windows 10 VM

The Windows 10 Spring Creators Update was released on April 10, 2018, and it has a number of new features that Mac users will care about. This is the first in a short series about these new Windows 10 features: Connecting your iPhone to your Windows 10 VM. Microsoft has long been accused of copying […]

The post Connecting Your iPhone to Your Windows 10 VM appeared first on Parallels Blog.

HPE acquires RedPixie to encourage enterprise workload migrations


Clare Hopping

11 Apr, 2018

HPE has announced the acquisition of cloud consulting and migration firm RedPixie, which it will absorb into its existing advisory and professional services business, which falls within the HPE Pointnext services division.

The purchase means HPE will be able to offer a greater range of cloud consulting, application development and migration services to its hybrid, private, managed and public cloud customers, helping them transition to the cloud.

“At HPE Pointnext, we always begin by understanding our customers’ digital transformation ambitions and organizing ourselves around their desired outcomes,” Ana Pinczuk, HPE Pointnext’s global leader said in a blog post.

“With this acquisition, we will continue to expand our comprehensive hybrid IT portfolio and will be even better positioned to help our customers build new digital experiences and drive better business outcomes now and into the future.”

HPE has already bought Cloud Technology Partners to service customers using AWS, while this newest acquisition ensures it has the Microsoft Azure base covered, Microsoft’s cloud being RedPixie’s main business. Although it has yet to announce a third acquisition to cover the Google slice, it seems likely the company is on the lookout for someone to service those customers.

“Some workloads are best suited to the public cloud, some should live in a private cloud environment and others need to stay in traditional on-premises infrastructure,” Pinczuk said. “Finding the right mix will enable businesses to analyze data quickly, efficiently manage workloads and ultimately accelerate business outcomes by driving new business models, creating new customer and employee experiences, and improving operational performance.”

HPE scrapped its own public cloud service a few years ago, so it makes sense to ensure it can keep those using alternative clouds happy. Neither company has revealed what will happen to RedPixie’s employees or its customers, although it’s likely the latter will also be absorbed into HPE’s business.

Google achieves 100% renewable energy target – becoming first public cloud to do so

Google has touted itself as the first public cloud provider to run all its clouds on renewable energy.

The company, which says it is the largest corporate purchaser of renewable energy in the world – almost three times as much as Amazon and Microsoft, its primary cloud rivals – has been working to attain this goal for the best part of a decade.

With the belief that 2017 would be the year the ‘road to 100%’ would be completed, the company ramped up its efforts. In 2016, Google’s operational projects covered almost three fifths (57%) of the energy it used from global utilities. With the addition of a record number of new contracts for wind and solar developments still under construction, the company was able to surpass the 100% total.

“Over the course of 2017, across the globe, for every kilowatt hour of electricity we consumed, we purchased a kilowatt hour of renewable energy from a wind or solar farm that was built specifically for Google,” wrote Urs Hölzle, Google’s senior vice president of technical infrastructure, in a blog post. “This makes us the first public cloud, and company of our size, to have achieved this feat.”

Hölzle added that plans will only escalate in future months and years with new data centre and office openings. In February, Google CEO Sundar Pichai outlined expansion plans for Google’s data centres in the US, having attended the groundbreaking for the outlet in Clarksville/Montgomery County in Tennessee. Pichai noted the importance of renewable energy generation in data centre building and maintenance.

“People often discuss ‘the cloud’ as if it’s built out of air – but it’s actually made up of buildings, machinery, and people who construct and manage it all,” Pichai wrote at the time. “Today we employ an estimated 1,900 people directly on our data centre campuses. We’ve created thousands of construction jobs – both for our data centres themselves, and for renewable energy generation.

“Our renewable energy purchasing commitments to date will result in energy infrastructure investments of more than $3.5 billion globally – about two thirds of that in the United States,” Pichai added.