Parallels Support: Sound troubleshooting and management

Guest author: Parallels support staff member Paul Christopher Nathaniel. A few details about audio in Parallels Desktop for Mac. Sound good? Like any other modern computer, Mac computers have audio recording and playback devices (such as microphones and speakers); many of us use external headphones as well. As people install and start playing […]


A Decade of Innovation with Parallels Desktop

A Decade of Innovation We are honored that for 10 years and counting Parallels Desktop for Mac has enabled more than five million consumers, students, professionals and businesses to seamlessly run Windows on their Mac – it has truly been a decade of innovation. We would like to specifically thank our development team at Parallels, […]


Twitter acquires machine learning start-up Magic Pony

Twitter has stepped up its efforts in the machine learning arena after announcing the acquisition of visual processing technology company Magic Pony.

While the company claims machine learning is central to the brand’s capabilities, it has been relatively quiet in the market segment compared to industry heavyweights such as IBM, Google and Microsoft. This is the third acquisition Twitter has made in the area, reportedly in the range of $150 million, following the purchase of Whetlab last year and Mad Bits in 2014. Google, by comparison, acquired Jetpac, Dark Blue Labs and Vision Factory, as well as spending a reported $500 million on DeepMind, all in 2014.

“Machine learning is increasingly at the core of everything we build at Twitter,” said Jack Dorsey, Twitter CEO. “Magic Pony’s machine learning technology will help us build strength into our deep learning teams with world-class talent, so Twitter can continue to be the best place to see what’s happening and why it matters, first. We value deep learning research to help make our world better, and we will keep doing our part to share our work and learnings with the community.”

The acquisition follows Twitter’s announcement last week that advertisers will now be able to use emoji keyword targeting for Twitter Ads. Although a simple proposition in the first instance, the new feature opens up the opportunity for machine learning-enhanced advertising solutions.

Magic Pony, which was founded in 2014 and currently has 11 employees, was acquired to bolster the visual experiences that are delivered across Twitter apps. The team will link up with Twitter Cortex, the in-house machine learning department, to improve image processing expertise.

The technology itself uses convolutional neural networks to upscale an image. Starting from the information in a low-resolution picture, it reconstructs a larger, more detailed image by inferring the detail it expects to see. Much as a human can imagine the rest of a car after seeing only the door, the technology learns from previous examples and applies those lessons to new images.
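Magic Pony has not published details of its models, but the approach it describes resembles CNN-based super-resolution. Below is a minimal, illustrative sketch in PyTorch of an SRCNN-style network (not Magic Pony’s actual architecture): the image is first enlarged with plain interpolation, then a small convolutional network predicts the missing fine detail.

```python
# Minimal SRCNN-style super-resolution sketch (illustrative only; not Magic Pony's model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Feature extraction, non-linear mapping, reconstruction layers.
        self.extract = nn.Conv2d(3, 64, kernel_size=9, padding=4)
        self.map = nn.Conv2d(64, 32, kernel_size=1)
        self.reconstruct = nn.Conv2d(32, 3, kernel_size=5, padding=2)

    def forward(self, low_res, scale=2):
        # Upscale with bicubic interpolation, then let the CNN fill in the fine detail.
        x = F.interpolate(low_res, scale_factor=scale, mode="bicubic", align_corners=False)
        x = F.relu(self.extract(x))
        x = F.relu(self.map(x))
        return self.reconstruct(x)

# Usage: a random 3-channel 32x32 "image" upscaled to 64x64.
model = TinySRCNN()
low_res = torch.rand(1, 3, 32, 32)
high_res = model(low_res)
print(high_res.shape)  # torch.Size([1, 3, 64, 64])
```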

Magic Pony was initially backed by investment from Octopus Ventures, which has seemingly developed a specialty in spotting promising AI start-ups. Prior to Magic Pony’s acquisition by Twitter, Octopus Ventures invested in Evi, which was acquired by Amazon in 2012, and in SwiftKey, which was acquired by Microsoft this year.

“Today marks a great day for the Magic Pony team,” said Luke Hakes, Investment Director at Octopus Ventures. “We’re proud to have believed in the concept early on and to then have had the privilege of joining their journey. The technology Magic Pony has developed is revolutionary and pushes the boundaries of what is possible with AI in the video space.

“The UK continues to grow as the ‘go-to’ place for companies looking to build best in breed AI technology – Octopus has been fortunate to work with the founders of three companies in this space that have gone on to be acquired, with Evi and Amazon, SwiftKey and Microsoft, and now Magic Pony and Twitter. We are excited for the Magic Pony team, but also to take what we have learnt on the last three journeys and help the next generation of entrepreneurs lead the way in the on-going AI revolution.”

Demystifying the three myths of cloud database

The cloud is here and it’s here to stay. The cost savings, flexibility and added agility alone mean that cloud is a force to be reckoned with.

However, many businesses are struggling to figure out exactly how to get the most out of the cloud; particularly when choosing what infrastructure elements to leave on-premises and which to migrate to the cloud. A recent SolarWinds survey found that only 42 per cent of businesses will have half or more of their organisations’ total IT infrastructure in the cloud within the next three to five years. Furthermore, seven per cent say their organisation has not yet migrated any infrastructure at all to the cloud, though many of these plan to once they have considered what to transfer and how to do it.

One of the more controversial moves when it comes to migrating infrastructure to the cloud is the database. Hesitancy in making the shift to the cloud is clear, with nearly three quarters (73%) of organisations stating they have yet to do so – but why is this?

The database is often seen as the most critical piece of IT infrastructure when it comes to performance, and it lies at the heart of most applications, meaning changes are perceived as risky. If a move, or a change in the way the database operates, has a negative effect, the ripple could impact the entire business, for example through the loss of important data.

While on some level this fear is justifiable, there are certainly a few reasons which could be defined as myths, or misconceptions, rather than reality:

Myth 1: Need high performance and availability? The cloud is not a suitable fit.

Several years ago, in the early days of the cloud, the ‘one size fits all’ approach may have been fact. However, with the natural maturation of the technology, we are now at a point where databases in the cloud can meet the needs of even the most demanding applications.

The reality of today’s cloud storage systems is that there are very powerful database services available in the cloud, many based on SSD drives offering up to 48,000 IOPS and 800MBps throughput per instance. And while outages in the cloud were a common annoyance two to three years ago, the availability today’s cloud providers offer often exceeds what most on-premises systems are able to deliver. Today’s cloud provider SLAs, combined with the ease of setting up replicas and standby systems and the durability of the data stored, are often able to deliver better results.

This is not to say that the database administrator (DBA) is free of responsibility. While the cloud provider will take care of some of the heavy lifting involved in configuration and administration tasks, the DBA is still responsible for overall performance. The DBA therefore still needs to pay close attention to resource contention, bottlenecks, query tuning, execution plans and so on – some of which may require new performance analysis tools.
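As a hypothetical illustration of the kind of lightweight monitoring that stays in the DBA’s hands after a move to a managed cloud database, the sketch below pulls the most time-consuming queries from PostgreSQL’s pg_stat_statements view. The connection string is a placeholder, the extension must be enabled on the instance, and the column names assume PostgreSQL 13 or later (older versions use total_time/mean_time).

```python
# Hypothetical sketch: list the top time-consuming queries on a managed PostgreSQL instance.
# Assumes the pg_stat_statements extension is enabled and psycopg2 is installed.
# Column names follow PostgreSQL 13+; older versions expose total_time/mean_time instead.
import psycopg2

DSN = "host=mydb.example.cloud dbname=app user=dba password=secret"  # placeholder

TOP_QUERIES_SQL = """
    SELECT query, calls, total_exec_time, mean_exec_time
    FROM pg_stat_statements
    ORDER BY total_exec_time DESC
    LIMIT 10;
"""

def report_slow_queries():
    with psycopg2.connect(DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(TOP_QUERIES_SQL)
            for query, calls, total_ms, mean_ms in cur.fetchall():
                print(f"{calls:>8} calls  {total_ms:>12.1f} ms total  "
                      f"{mean_ms:>8.1f} ms avg  {query[:60]}")

if __name__ == "__main__":
    report_slow_queries()
```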

Myth 2: The cloud is not secure.

Even though security should always be a concern, just because you can stroll into a server room and physically see the server racks doesn’t necessarily mean they are more secure than the cloud. In fact, there have been many more high-profile security breaches involving on-premises systems than public cloud.

The truth is the cloud can be extremely secure; you just need a plan. When using a cloud provider, security is not entirely their responsibility. Instead, it needs to be thought of as a shared job – they provide reasonably secure systems, and you are responsible for secure architecture and processes.

You need to be very clear about the risks, the corporate security regulations which must be complied with and the compliance certifications that must be achieved. By developing a thorough understanding of your cloud provider’s security model, you will be able to implement proper encryption, key management, access control, patching, log analysis and so on to complement what the cloud provider offers and take advantage of their security capabilities. With this collaborative approach to security and an in-depth understanding of one another, you can ensure that your data is as safe as, if not safer than, if it were on physical server racks down the hall.
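As one small, hypothetical example of the customer’s side of that shared responsibility, the sketch below encrypts a database dump with a symmetric key before it ever leaves your environment, using Python’s cryptography library. The file names are placeholders, and key storage and rotation (for example via a key management service) are deliberately out of scope.

```python
# Hypothetical sketch: encrypt a backup file client-side before uploading it to cloud storage.
# Requires the 'cryptography' package. Key management (KMS, rotation, access control) is out of scope.
from cryptography.fernet import Fernet

def encrypt_backup(plain_path: str, encrypted_path: str, key: bytes) -> None:
    fernet = Fernet(key)
    with open(plain_path, "rb") as src:
        ciphertext = fernet.encrypt(src.read())
    with open(encrypted_path, "wb") as dst:
        dst.write(ciphertext)

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, fetch this from your key management service
    encrypt_backup("nightly_dump.sql", "nightly_dump.sql.enc", key)
    print("Backup encrypted before leaving the local environment.")
```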

Myth 3: If I use cloud I will have no control of my database.

This is another half-truth. Although migrating your database to the cloud does hand over some of the day-to-day maintenance control to your provider, when it comes to performance your control won’t and shouldn’t be any less.

As mentioned above, an essential step to ensure that you remain in control of your database is to understand your cloud provider’s service details. You need to understand their SLAs, review their recommended architecture, stay on top of new services and capabilities, and be very aware of scheduled maintenance which may impact your jobs. It is also important to take into account data transfer and latency for backups and to keep all your databases in sync, especially if your database-dependent applications need to integrate with one another and are not in the same cloud deployment.

Finally, keep a copy of your data with a different vendor in a different location. If you take an active role in managing backup and recovery, you will be less likely to lose important data in the unlikely event of a vendor failure or outage. The truth is that most cloud providers offer plenty of options, giving you the level of control you need for each workload.
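A hypothetical sketch of that advice: the snippet below uses boto3 to push the same encrypted backup to two S3-compatible object stores run by different vendors. The bucket names, endpoint and credentials are placeholders.

```python
# Hypothetical sketch: copy the same backup to two different vendors' object stores.
# boto3 talks to AWS S3 by default and to other S3-compatible providers via endpoint_url.
import boto3

backup_file = "nightly_dump.sql.enc"

# Primary copy: AWS S3 (credentials resolved from the environment / AWS config).
aws_s3 = boto3.client("s3")
aws_s3.upload_file(backup_file, "example-primary-backups", backup_file)

# Secondary copy: a different vendor's S3-compatible storage in another location.
other_s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.other-vendor.example",  # placeholder endpoint
    aws_access_key_id="OTHER_VENDOR_KEY",                  # placeholder credentials
    aws_secret_access_key="OTHER_VENDOR_SECRET",
)
other_s3.upload_file(backup_file, "example-secondary-backups", backup_file)

print("Backup stored with two independent vendors.")
```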

Conclusion

The decision to migrate a database to the cloud is not an easy one, nor should it be. Many things need to be taken into account and the benefits and drawbacks need to be weighed up. However, given the tools available and the maturity of the cloud market today, deciding not to explore cloud as an option for your database could be short-sighted.

Written by Gerardo Dada, Head Geek at SolarWinds

Research shows greater uptake in virtualised environments for regulated industries

(c)iStock.com/4x-image

According to the latest research from virtualisation technology provider HyTrust, the software defined data centre (SDDC) is moving further towards the mainstream, with many regulated industries moving around half of their critical workloads over to SDDC and virtualisation.

The study, snappily titled ‘Industry Experience: the 2016 State of the Cloud and Software Defined Data Centre in Real World Environments’, surveyed more than 500 executives across a wide variety of company sizes, and found relatively high uptake of virtualisation and public cloud across the board. Grouped into three buckets, financial services, banking and insurance had moved just under half (47%) of tier one workloads, while healthcare, biotech and pharma, as well as technology firms, both scored 55%.

For test and development servers, tech firms are on a much surer footing, with almost three quarters (72%) of workloads making the move, compared to financial (49%) and healthcare (52%). Network again saw a disparity between finance (51%) and technology companies (68%), while storage showed a closer gap (finance 53%, healthcare 61%). In each case, more than half of the workloads are being trusted to the public cloud.

There was an interesting point of contention with regard to the various platforms used; 29% of those in healthcare used AWS, while half (50%) in manufacturing opted for Azure. HyTrust insists that there is still one eye on security for large scale migrations. Between a quarter and a third of businesses encrypt the entire workload as part of their data security requirements for virtualising private cloud workloads, with disparities between business consulting and management (39%) and retail (24%). 40% of firms in energy and utilities encrypt personally identifiable information (PII) only.

As a result, while the numbers look impressive, there still has to be some concern. “Without much fanfare, this critical technology advance has become woven into the basic fabric of businesses large and small. The potential of virtualisation and the cloud was always undeniable, but there was genuine concern over security and scepticism regarding the processes required,” said Eric Chiu, president of HyTrust.

“What we find in this research is that the challenges are being overcome, and every kind of function in every kind of industry is being migrated,” he added. “There are some holdouts, to be sure, but they’re now the exception, and we’re betting they won’t stay that way for long.”

Previous research from HyTrust back in April found that two thirds of US respondents expected increased adoption in the SDDC over the coming year.

[video] Static vs Dynamic Cloud By @NewRelic | @CloudExpo #Cloud

In his general session at 18th Cloud Expo, Lee Atchison, Principal Cloud Architect and Advocate at New Relic, discussed cloud as a ‘better data center’ and how it adds new capacity (faster) and improves application availability (redundancy).
The cloud is a ‘Dynamic Tool for Dynamic Apps’ and resource allocation is an integral part of your application architecture, so use only the resources you need and allocate/de-allocate resources on the fly.
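As a hypothetical illustration of allocating and de-allocating resources on the fly, the sketch below nudges an AWS Auto Scaling group’s desired capacity up or down with boto3. The group name and load thresholds are placeholders, and any cloud provider’s equivalent API would serve the same purpose.

```python
# Hypothetical sketch: scale an Auto Scaling group based on a simple load signal.
# The group name and load thresholds are placeholders, not recommendations.
import boto3

GROUP_NAME = "example-web-tier"

def adjust_capacity(current_load: float) -> None:
    autoscaling = boto3.client("autoscaling")
    group = autoscaling.describe_auto_scaling_groups(
        AutoScalingGroupNames=[GROUP_NAME]
    )["AutoScalingGroups"][0]
    desired = group["DesiredCapacity"]

    if current_load > 0.8:          # busy: allocate another instance
        desired += 1
    elif current_load < 0.2 and desired > group["MinSize"]:  # quiet: de-allocate one
        desired -= 1

    autoscaling.set_desired_capacity(
        AutoScalingGroupName=GROUP_NAME,
        DesiredCapacity=desired,
        HonorCooldown=True,
    )
    print(f"Desired capacity for {GROUP_NAME} set to {desired}")

# Example usage with a made-up load figure:
# adjust_capacity(0.9)
```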


Connected home will be operated by Apple and Google

Research from Gartner has claimed that 25% of households in developed economies will utilise the services of digital assistants, such as Apple’s Siri or Google Assistant, on smartphones as the primary means of interacting with the connected home.

The user experience is an area which has been prioritized by numerous tech giants, including those in the consumer world, as the process of normalizing the connected world moves forward. Although IoT as a concept has been generally accepted by industry, efforts to take the technology into the wider consumer ecosystem are underway.

Connecting all IoT applications under a digital assistant could be a means of removing the complexity of managing the connected home, playing to the consumer drive for simplicity and efficiency. The digital assistant also presents an entry point for artificial intelligence, as appliances and systems in the home can be optimized using information available over the internet. Energy consumption, for example, could be reduced as the digital assistant adjusts a thermostat’s settings depending on current weather conditions.
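A purely hypothetical sketch of that idea follows: a helper that nudges a thermostat setpoint based on the outdoor forecast. The weather lookup and the Thermostat class are stand-ins for whatever APIs a real assistant would use.

```python
# Purely hypothetical sketch: adjust a thermostat setpoint using the outdoor forecast.
# get_outdoor_forecast() and Thermostat are stand-ins for real weather and smart-home APIs.

def get_outdoor_forecast() -> float:
    """Placeholder: return the forecast outdoor temperature in Celsius."""
    return 24.0

class Thermostat:
    """Placeholder for a connected thermostat exposed through a home API."""
    def __init__(self, setpoint_c: float = 21.0):
        self.setpoint_c = setpoint_c

    def set_target(self, temp_c: float) -> None:
        self.setpoint_c = temp_c
        print(f"Thermostat target set to {temp_c:.1f}°C")

def optimise_heating(thermostat: Thermostat) -> None:
    outdoor = get_outdoor_forecast()
    if outdoor >= 20.0:
        # Mild day: let the house coast and save energy.
        thermostat.set_target(19.0)
    else:
        thermostat.set_target(21.0)

optimise_heating(Thermostat())
```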

“In the not-too-distant future, users will no longer have to contend with multiple apps; instead, they will literally talk to digital personal assistants such as Apple’s Siri, Amazon’s Alexa or Google Assistant,” said Mark O’Neill, Research Director at Gartner. “Some of these personal assistants are cloud-based and already beginning to leverage smart machine technology.”

The process of normalizing IoT in the consumer world will ultimately create a number of new opportunities for the tech giants, as the technology could offer a gateway into the home for a number of other verticals. Banks and insurance companies, for example, could offer advice to customers on how to save money on bills, should they have access to the data generated in the connected home.

“APIs are the key to interoperating with new digital interfaces and a well-managed API program is a key success factor for organizations that are interested in reaching consumers in their connected homes,” said O’Neill. “In the emerging programmable home, it is no longer best to spend time and money on developing individual apps. Instead, divert resources to APIs, which are the way to embrace the postapp world.”
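As a hypothetical illustration of the kind of API such a programmable home might expose, here is a tiny Flask endpoint that reports and updates a device’s state. The routes, device names and payloads are invented for the example, and a real home platform would add authentication and persistence.

```python
# Hypothetical sketch of a connected-home device API (routes and payloads are invented).
# Requires Flask 2.x: pip install flask
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-in for real device state.
devices = {"living-room-thermostat": {"type": "thermostat", "target_c": 21.0}}

@app.get("/devices/<device_id>")
def get_device(device_id):
    device = devices.get(device_id)
    if device is None:
        return jsonify(error="unknown device"), 404
    return jsonify(device)

@app.put("/devices/<device_id>")
def update_device(device_id):
    device = devices.get(device_id)
    if device is None:
        return jsonify(error="unknown device"), 404
    device.update(request.get_json(force=True))
    return jsonify(device)

if __name__ == "__main__":
    app.run(port=8080)
```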

25% of New Yorkers have no broadband access

Research from the Wireless Broadband Alliance has highlighted that broadband connectivity is no longer a challenge reserved for rural areas, as 57% of the world’s urban population is currently unconnected, reports Telecoms.com.

Initiatives to increase the number of people with a consistent connection to the internet have predominantly focused on rural communities, though the report demonstrated there are still a number of advanced western cities with higher numbers of unconnected people than might be expected. The research showed New York and Los Angeles currently have 27% and 25% of their populations, respectively, classed in the “Urban Broadband Unconnected” category. Shanghai was another city where the percentage of unconnected urban citizens seems high, at 42%.

While New York, Los Angeles and Shanghai could be seen as technologically advanced cities, the seemingly high number of unconnected citizens could be attributed to the diversity in wealth and affluence. The report claims the numbers could be driven simply by broadband not being available in certain neighbourhoods, but also by the price of broadband being unaffordable. While this would almost certainly be considered a ‘first-world problem’, there could be a potential impact on other areas of society, for example politics, as more communication moves online, in particular to social media.

The CIA World Fact Book lists the USA as one of the world’s most affluent countries, with GDP per capita of $55,800, which makes the statistics from two of its leading cities perhaps more surprising, though it does provide context for the high percentages in other nations. Lagos and Karachi were two of the cities with the highest numbers of unconnected urban citizens, at 88% and 86% respectively, though their countries’ GDP per capita is listed at $6,100 and $5,000, and both are in countries that have typically been associated with political unrest.

“There is a clear divide between the digital haves and the digital have-nots,” said Shrikant Shenwai, CEO of the Wireless Broadband Alliance. “And while this divide generally mirrors socioeconomic trends around the world, there are surprisingly high levels of urban unconnected citizens in major cities.

“World Wi-Fi Day (June 20) is an opportunity to recognize the contributions being made to help connect the unconnected around the globe, whether they be in major cities or rural communities.”

The report evaluated 18 of the world’s leading cities, including Tokyo, Dusseldorf, New Delhi, Johannesburg and London, which was listed as the world’s most connected city, as only 8% of its population is currently unconnected. Europe was the most connected continent, with the lowest level of unconnected citizens at 17%, while in Asia Pacific 68% of urban citizens were unconnected.

Analysing the analysis: Gartner’s 2015 CRM market share report

(c)iStock.com/jxfzsy

Top line findings

  • Worldwide customer relationship management (CRM) software totaled $26.3B in 2015, up 12.3% from $23.4B in 2014.
  • SaaS revenue grew 27% year over year, more than double overall CRM market growth in 2015.
  • Asia/Pacific grew the fastest of all regions globally, increasing 9% in 2015, closely followed by greater China with 18.4% growth.

These and many other insights into the current state of the global CRM market are from Gartner’s Market Share Analysis: Customer Relationship Management Software, Worldwide, 2015 (PDF, client access) published earlier this month.  The top five CRM vendors accounted for 45% of the total market in 2015. Salesforce dominated in 2015, with a 21.1% annual growth rate and absolute growth of over $902M in CRM revenue, more than the next ten providers combined.

Gartner found that Salesforce leads in revenue in the sales and customer service and support (CSS) segments of CRM, and is now third in revenue in the marketing segment. Gartner doesn’t address how analytics are fundamentally redefining CRM today, which is an area nearly every C-level and revenue team leader I’ve spoken with this year is prioritising for investment. The following graphic and table compare 2015 worldwide CRM market shares.

CRM Market Share 2015


Adobe, Microsoft, and Salesforce are growing faster than the market

Adobe grew the fastest between 2014 and 2015, increasing worldwide sales 26.9%. Salesforce continues to grow well above the worldwide CRM market average, increasing sales 21.1%. Microsoft increased sales 20% in the last year.  The worldwide CRM market grew 12.3% between 2014 and 2015.

Spending by vendor 2015

Analytics, machine learning, and artificial intelligence are the future of CRM

Advanced analytics, machine learning and artificial intelligence (AI) will revolutionize CRM in the next three years. Look to the five market leaders in 2015 to invest heavily in these areas with the goal of building patent portfolios and increasing the amount of intellectual property they own. Cloud-based analytics platforms offer the scale, speed of deployment, agility, and ability to rapidly prototype analytics workflows that support the next generation of CRM workflows. My recent post on SelectHub, Selecting The Best Cloud Analytics Platform: Trends To Watch In 2016, provides insights into how companies with investments in CRM systems are making decisions on cloud platforms today.

Based on insights gained from discussions with senior management teams, I’ve put together an Intelligent Cloud Maturity Model that underscores why scalability of a cloud-based analytics platform is a must-have for any company.


Source: Gartner Says Customer Relationship Management Software Market Grew 12.3 Percent

Vidyo Joins the Alliance for Open Media | @CloudExpo @Vidyo #Cloud

Vidyo, Inc., has joined the Alliance for Open Media. The Alliance for Open Media is a non-profit organization working to define and develop media technologies that address the need for an open standard for video compression and delivery over the web.
As a member of the Alliance, Vidyo will collaborate with industry leaders in pursuit of an open and royalty-free AOMedia Video codec, AV1. Vidyo’s contributions to the organization will bring to bear its long history of expertise in codec technology and the process of developing standard codecs, with emphasis on interactive real-time communications. The Alliance has made its AV1 codec source code publicly available as an open source project, and the group welcomes contributions from the broader developer community.
