Happy Birthday to Visual Studio 2017

Happy 20th Birthday, Visual Studio! As Microsoft Visual Studio celebrates its 20th year, developers are rejoicing as Visual Studio 2017 is released! Over the past two decades, Visual Studio has grown from a J++ and InterDev development environment into a powerhouse productivity suite. The new release of Visual Studio 2017 brings Visual Basic, Visual […]

The post Happy Birthday to Visual Studio 2017 appeared first on Parallels Blog.

What Is Virtual Desktop Infrastructure | @CloudExpo #VDI #Cloud #DataCenter

Imagine not having to carry around a laptop or sit in a cubicle to access your work desktop applications. Virtual desktop infrastructure (VDI) appeals to many different constituencies because it combines the benefits of anywhere access with desktop support improvements.
Employees typically use a wide range of devices, from laptops and desktops to tablets and smartphones. The diversity of these devices and the sheer number of them in the workplace can overwhelm IT and strain your resources.

read more

IBM and Salesforce come together

IBM and Salesforce have come together in a significant partnership that could change the face of cloud computing and artificial intelligence (AI). The two tech giants have entered into an agreement that will integrate their respective AI platforms, Watson and Einstein.

Beyond their AI platforms, the two companies also plan to align some of their software services and components. In addition, IBM has offered to deploy Salesforce Service Cloud internally, as a sign of goodwill for what is expected to be a lasting partnership. Specifically, both companies plan to tap into each other’s machine learning capabilities to deliver more in-depth knowledge about customers to their end clients. Furthermore, Watson’s APIs will be introduced to Einstein, Salesforce’s CRM-enhancing AI platform, allowing Einstein to make the most of IBM’s work in cognitive computing.

The agreement also includes adding IBM’s Weather Company assets to Lightning, Salesforce’s app development platform. Through this, the two companies plan to make weather data a potent tool for predicting customer behavior, and even for driving customer preferences. Also, by the end of March, IBM’s Application Integration Suite will be able to feed data from third-party sources into Salesforce CRM.

This partnership is obviously significant for both companies: it can help IBM turn around its fortunes while helping Salesforce meet its ambitious growth plans. But beyond the companies themselves, it could have a big positive impact on the industry as a whole.

Imagine what happens when two of the world’s best AI systems come together. The combination could enable a new class of applications, provide far deeper insights, and support much of what the tech industry wants to build. With the IoT and wearables industries blooming, this move could also alter the fortunes of companies in both areas. The existing clients of the two companies stand to benefit most: deeper insight into their customers’ behavior can, in turn, help them devise better strategies to boost sales and revenue.

Both companies expect 2017 to be the year AI hits the world on a large scale, and they want to be in a position to drive that industry. IBM expects almost a billion people to be touched by Watson in one way or another, and with Salesforce joining hands, the possibilities multiply. The CEOs of IBM and Salesforce are positive about the partnership and believe this is the beginning of a long and exciting journey together.

If you’re wondering why these two companies, it’s because they’ve been associated with each other for years. Though IBM is based in Armonk, New York, and Salesforce in San Francisco, California, the two companies have long worked together. In March 2016, IBM acquired Bluewolf, one of the oldest Salesforce implementation consultancies. Since then, there have been many interactions between the two companies, and this, of course, is the big step that can take their partnership to new heights.

The post IBM and Salesforce come together appeared first on Cloud News Daily.

Quantum Computing Delivered from the Cloud | @CloudExpo @IBMcloud #AI #Cloud #Analytics

IBM Cloud is now providing developers with the infrastructure and a portal to a 5-qubit quantum computer. This equips them to build interfaces between classical computers and IBM’s quantum platform. Quantum computers make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data, and they differ fundamentally from binary digital electronic computers based on transistors. Whereas conventional digital computing requires that data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits (qubits), which can be in superpositions of states.
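The superposition idea can be illustrated with a tiny state-vector simulation. The sketch below is generic Python, not IBM’s actual quantum API: a single qubit starts in the |0⟩ state, a Hadamard gate puts it into an equal superposition, and repeated measurement yields each outcome roughly half the time.

```python
import math
import random

# A single-qubit state is a pair of complex amplitudes (a, b) for |0> and |1>,
# normalised so that |a|^2 + |b|^2 == 1.
def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the state: return 0 with probability |a|^2, else 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

zero = (1 + 0j, 0 + 0j)   # behaves like a classical bit set to 0
plus = hadamard(zero)     # equal superposition of |0> and |1>

counts = [0, 0]
for _ in range(10_000):
    counts[measure(plus)] += 1
# Unlike a deterministic bit, each outcome occurs roughly 50% of the time.
```

A real qubit differs from this classical simulation in that n qubits require 2^n amplitudes to simulate, which is exactly why quantum hardware is interesting.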

read more

Parallels will be at East Midlands Apple Admins Third Meetup

Our sales engineer, Pat Begbie, will be at the East Midlands Apple Admins Third Meetup at the University of Leicester tomorrow, March 8th, 2017, presenting our Parallels Mac Management for Microsoft SCCM solution. Third meet-up of the Apple Admins Date: Wed, March 8, 2017 Time: 7:00 PM – 9:30 PM GMT Place: Centre for Medicine, University of Leicester, United Kingdom […]

The post Parallels will be at East Midlands Apple Admins Third Meetup appeared first on Parallels Blog.

Cloud becoming ‘dominant vehicle’ for business analytics, says new report

A new study from Informatica and Deloitte has found that cloud is “well on the way” to being the dominant vehicle for business analytics.

The report, which was conducted at the end of last year alongside Enterprise Management Associates (EMA) and consulted more than 400 global business and technology leaders, found that cloud was either a key or important part of the analytics strategy for 91% of respondents.

In addition, self-service was considered a key component with cloud-ready organisations more likely to go down that route for data management and governance; 94% of those polled said governed self-service was important for their organisations’ analytics implementations.

The benefits of using cloud-based analytics are stark, according to the report; 84% of respondents said business user agility went up through governed self-service for data integration, while similar numbers agreed for data mapping (83%), data modelling (82%) and data governance (77%).

Cost reduction was naturally the primary financial driver for cloud analytics, but the study showed it was the key technical driver as well. Interestingly, while security and compliance remained the key barrier to adoption – cited by 40% of those polled – security fears lessened among companies further down the road of cloud implementation.

“The key takeaways from our research is that cloud adoption is expanding quickly as companies find success with their first cloud analytics implementations and move to create more mature environments and drive broader adoption,” said Lyndsay Wise, EMA research director in a statement.

“As these environments become more mature and robust, analytics users are demanding access to their data in ways that make it fast and easy to interact with,” Wise added. “In this regard, governed self-service access is the great enabler of the upswell of analytic insight that companies need to stay competitive.”

You can find out more about the report here (registration required).

If you can’t take your lab to the cloud – bring your cloud to the lab

The pace of change today dictates that almost every business must move fast and compete, or face disruption. With technology playing such a central role regardless of the type of business, there is considerable focus on adopting software as an organisational lubricant, and on various cloud-based models that facilitate operational simplicity and ease of consumption.

Over the past decade, considerable organisational energy has been spent on data center optimisation and automation, as well as on laying the foundation for public, private and hybrid clouds. DevOps and BizOps practices have also taken root. What is notable is that the majority of industry focus and investment has gone into securing and optimising production workloads; IT labs and pre-production environments have taken a back seat.

Labs and pre-production environments are like the kitchen in a five-star restaurant. The dining areas are fancy, modern and welcoming, but a visit to the cooking area can be a different experience altogether, with blood, sweat and chaos all orchestrated to get meals out. With some focus, this can certainly change. The productivity of the chefs, the cleanliness of the environment, the smoothness of the workflow and the ability to quickly replicate orders can all increase the restaurant’s revenue and reputation, while also helping it optimise costs and deliver a superior customer experience.

Today, the majority of IT labs are akin to that five-star kitchen. They are very functional, but not necessarily modernised to deal with the requirements of high-performance IT. Getting access to development and test environments can be both complicated and time-consuming, especially when the environment is complex, and infrastructure readiness can take weeks, sometimes months. All of this slows productivity, decreases efficiency, increases costs and affects innovation velocity, which has a direct bearing on time to market.

While several dev/test groups utilise AWS, Azure or Google Cloud Platform, this is not a suitable option for every type of deployment, and at times the prevalence of shadow IT causes governance and compliance issues.

So, what can be done to modernise these labs? While production workloads and some dev/test workloads embrace private and public clouds, can the same principles be applied to a lab environment?

Conceptually, a cloud is a set of shared infrastructure and resources that allows for on-demand consumption, self-service and as-a-service based models in a multi-tenant architecture and potentially at scale. Can these principles of cloud be applied to lab environments to make them “cloud-ready”? If the lab cannot be taken to the cloud, can the cloud be brought to the lab?

What would this entail? Can the lab be created on-demand? Would it allow efficient utilisation of resources? Would it allow for self-service? Could it have operational simplicity and good governance mechanisms? Will it increase productivity? Would it accelerate innovation velocity?

If the answer is yes to all or most of the above, then it is possible to deliver the lab as-a-service (LaaS). The lab can be transformed into a cloud.

LaaS solutions bring the core benefits of the cloud – self-service, multi-tenancy, automation and scalability – to on-premises labs and pre-production data centers, turning labs into self-service private clouds. Users can rapidly model blueprints that configure lab infrastructure and then publish those blueprints to a web-based catalog for one-click, on-demand deployment as live cloud sandboxes.
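The blueprint-to-catalog-to-sandbox workflow can be sketched in a few lines. The class and method names below are hypothetical, not any vendor’s real API; the point is the pattern: model an environment once as a blueprint, publish it, then deploy live copies on demand.

```python
# Hypothetical sketch of a LaaS blueprint catalog -- illustrative only,
# not Quali's or any other vendor's actual API.
class Blueprint:
    """A reusable model of a lab environment and the resources it needs."""
    def __init__(self, name, resources):
        self.name = name
        self.resources = resources  # e.g. ["switch", "traffic-gen", "dut"]

class Catalog:
    """A web-style catalog: publish once, deploy on demand."""
    def __init__(self):
        self._published = {}

    def publish(self, blueprint):
        self._published[blueprint.name] = blueprint

    def deploy(self, name):
        """One-click deployment: returns a live 'sandbox' for the blueprint."""
        bp = self._published[name]
        return {
            "blueprint": bp.name,
            "provisioned": list(bp.resources),
            "status": "live",
        }

catalog = Catalog()
catalog.publish(Blueprint("perf-test-env", ["switch", "traffic-gen", "dut"]))
sandbox = catalog.deploy("perf-test-env")  # every deploy yields a fresh copy
```

The design choice worth noting is the separation of modelling (done once, by an expert) from deployment (done many times, by anyone), which is what makes the catalog self-service.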

Companies with massive lab operations—from R&D to QA—can now harness the power of cloud sandboxes to become faster and more agile in a marketplace that prizes efficiency. The benefits can be exponential.

Not unlike their non-technological namesake, cloud sandboxes serve as a personal “playground” for automating the DevOps process. And when packaged as lab-as-a-service (LaaS), these sandboxes are pivotal in transforming labs into personal, private clouds, giving them a common infrastructure that is available every step of the way.

The results can be dramatic. By moving to self-service and automated equipment access, labs can increase efficiency by over 100%. There are other benefits, too: labs can buy less equipment, reduce setup and teardown time for infrastructure and, by eliminating configuration issues, improve test success rates by 20–50%.

Benefits to managers and engineers

The move to LaaS gives managers complete visibility and control over all of the resources in the lab. By collecting business-intelligence data about sandbox and infrastructure use over time, lab managers can identify equipment that’s no longer in use or equipment that could be powered down between reservations.
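The idle-equipment analysis described above amounts to comparing reserved hours against a reporting window. A minimal sketch, with an invented data model (device names and the 10% threshold are illustrative, not from the report):

```python
from datetime import datetime, timedelta

def utilisation(reservations, window_start, window_end):
    """Fraction of the window during which a device was reserved.
    reservations: list of (start, end) datetime pairs for one device."""
    total = timedelta()
    for start, end in reservations:
        overlap = min(end, window_end) - max(start, window_start)
        if overlap > timedelta():
            total += overlap
    return total / (window_end - window_start)

# Flag devices below a utilisation threshold as candidates for power-down
# between reservations or for decommissioning.
window = (datetime(2017, 2, 1), datetime(2017, 3, 1))
logs = {
    "router-42": [(datetime(2017, 2, 3), datetime(2017, 2, 10))],  # 7 of 28 days
    "chassis-7": [],                                               # never reserved
}
idle = [device for device, reservations in logs.items()
        if utilisation(reservations, *window) < 0.10]
# idle -> ["chassis-7"]
```

In practice the reservation log would come from the sandbox software’s own business-intelligence data rather than a hand-built dictionary.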

Engineers and developers also benefit from a switch to cloud sandboxes. They can design blueprints that meet their exact requirements and let sandbox software find and reserve the equipment needed for any blueprint on the fly. Plus, the sandbox is protected from outside interference or accidental reconfiguration. They can also eliminate hoarding, which costs even small data centers millions of dollars per year.

Top technology solution provider World Wide Technology used Quali’s LaaS cloud sandbox software to turn its Advanced Technology Center into a self-service cloud, transforming its business operations from highly logistics-driven to highly cloud-driven, with some impressive results.

LaaS cloud sandboxes are deployable in the private, public and hybrid clouds. Additional key features include visual-based environment modeling, web-based self-service blueprint catalog, intelligent automation and provisioning, remote lab access and consolidation, multi-tenancy and resource optimisation, power control, and reporting and analytics.

As the speed of innovation grows, IT and dev/test labs can either hinder innovation velocity or empower it. Converted into “cloud” environments, they can serve as powerful catalysts for organisational velocity.

Picture credit: Quali

How to install and use a beta macOS in a virtual machine

A Parallels Desktop virtual machine (VM) is an ideal environment for using a beta release of an operating system. By its very nature, a beta OS will contain bugs, unfinished features, and other ‘gotchas’ that might play havoc with your documents or your work. A VM isolates those issues in a sandbox-like environment so that any […]

The post How to install and use a beta macOS in a virtual machine appeared first on Parallels Blog.