Free Systems in Parallels Desktop

  Our latest release of Parallels Desktop® for Mac includes free systems available to users at no additional cost. These free systems can be set up with a few clicks. Application developers, beta application testers, and software engineers use these systems to achieve a safe virtual environment that […]

The post Free Systems in Parallels Desktop appeared first on Parallels Blog.

[slides] #Docker Container Pipeline | @DevOpsSummit @KontenaInc #Serverless

Docker containers have brought great opportunities to shorten the deployment process through continuous integration and the delivery of applications and microservices. This applies equally to enterprise data centers and the cloud. In his session at 20th Cloud Expo, Jari Kolehmainen, founder and CTO of Kontena, discussed the solutions and benefits of a deeply integrated deployment pipeline using technologies such as container management platforms, Docker containers, and the drone.io CI tool. He also demonstrated deployment of a CI/CD pipeline using container management and showed how to deploy a containerized application through a continuous delivery pipeline.
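As a rough sketch of what such a pipeline can look like (this is not Kontena's or the speaker's actual configuration), a drone.io pipeline is typically defined in a YAML file that builds and tests the code, packages it as a Docker image, and then hands the image off to the container management platform. The build environment, registry and deployment hook below are placeholders:

```yaml
# Hypothetical .drone.yml sketch: build, package and trigger a deployment.
kind: pipeline
type: docker
name: default

steps:
  - name: test
    image: golang:1.21                     # build/test environment is an assumption
    commands:
      - go build ./...
      - go test ./...

  - name: publish
    image: plugins/docker                  # Drone's Docker plugin builds and pushes the image
    settings:
      repo: registry.example.com/myapp     # placeholder registry and repository
      tags:
        - latest
      username:
        from_secret: docker_username
      password:
        from_secret: docker_password

  - name: deploy
    image: curlimages/curl
    environment:
      DEPLOY_HOOK:
        from_secret: deploy_hook           # placeholder webhook into the container platform
    commands:
      - curl -fsS -X POST "$DEPLOY_HOOK"
    when:
      branch:
        - main
```

Each step runs in its own container, so the pipeline itself is containerized end to end; the actual rollout mechanics depend on the container management platform sitting behind the deployment hook.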

CAST Software to Exhibit at @CloudExpo | @OnQuality #DevOps #Analytics #DX #SmartCities

SYS-CON Events announced today that CAST Software will exhibit at SYS-CON’s 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. CAST was founded more than 25 years ago to make the invisible visible. Built around the idea that even the best analytics on the market still leave blind spots for technical teams looking to deliver better software and prevent outages, CAST provides the software intelligence that matters most.

[session] The Modern Software Factory | @DevOpsSummit @CAinc @AAkela #Agile #DX #DevOps

Translating agile methodology into real-world best practices within the modern software factory has driven widespread DevOps adoption, yet much work remains to expand workflows and tooling across the enterprise. As models evolve from pockets of experimentation into wholesale organizational reinvention, practitioners find themselves challenged to incorporate the culture and architecture necessary to support DevOps at scale.

[slides] Satellite Storage for FinTech Security | @CloudExpo #AI #ML #DX #FinTech

For financial firms, the cloud is going to increasingly become a crucial part of dealing with customers over the next five years and beyond, particularly with the growing use and acceptance of virtual currencies. There are new data storage paradigms on the horizon that will deliver secure solutions for storing and moving sensitive financial data around the world without touching terrestrial networks.
In his session at 20th Cloud Expo, Cliff Beek, President of Cloud Constellation Corporation, discussed new best practices to bypass the internet and the many applications for this technology, including the financial sector, which is notoriously vulnerable to attack.

[slides] Leveraging AI for Cloud Management | @CloudExpo @ZeroStackInc #AI #ML #Cloud

Businesses and business units of all sizes can benefit from cloud computing, but many don’t want the cost, performance and security concerns of public cloud nor the complexity of building their own private clouds. Today, some cloud vendors are using artificial intelligence (AI) to simplify cloud deployment and management. In his session at 20th Cloud Expo, Ajay Gulati, Co-founder and CEO of ZeroStack, discussed how AI can simplify cloud operations. He covered the following topics: why cloud management is a barrier to adoption and the role of AI in cloud deployment.

Skytap secures $45m in funding round led by Goldman Sachs

Skytap, a Seattle-based public cloud provider, has announced it has raised $45 million (£35.2m) to push forward its product development and market expansion.

The round was led by Goldman Sachs, with Hillel Moerman, a managing director who co-heads the investment firm’s private capital investing (PCI) group, joining Skytap’s board of directors as a result.

Skytap offers an alternative to the public cloud mindset of ‘forcing customers to rewrite their traditional applications before migrating’, as the company puts it. In other words, organisations that use Skytap’s cloud can migrate their core apps unchanged, and then modernise things from there. “These applications can then be evolved by combining new cloud services with traditional components to form hybrid applications that maximise existing investments, while accelerating innovation,” the company explains.

Skytap claims its total sales more than tripled year over year in the most recent quarter, and that it won Fortune 500 clients in healthcare, retail, financial services and media in the process.

The company appeared for the first time in the most recent Gartner Magic Quadrant for cloud infrastructure as a service (IaaS), placed as a niche player. Despite the clear lead of Amazon Web Services (AWS), followed by Microsoft – the only two companies in the leaders’ quadrant – Goldman Sachs, among others, must believe there is room for other players in the market.

“We believe that Skytap is well-positioned to address the large and growing demand for helping enterprises modernise business-critical applications in order to harness the opportunity and full value of the public cloud,” said Moerman. “This untapped market is exciting, and we are pleased to support Skytap during this time of growth.”

The data centre of tomorrow: More power, fewer humans

Moore’s Law has become the golden rule by which processing power has increased and, in fact, observed trends suggest the doubling happens on more of an 18-month cycle.

Today, there is more processing power in a single handheld device than NASA had at its disposal during the moon landings. So, as this increase continues, where does that leave our data centres in the future?

There are many considerations: not least the structure of the data centre itself, but also the purpose of our data usage.

In addition, we will need to consider flexibility, cost and speed – and how we cope with all three. Then there are the regulations around energy and cross-border applications and, of course, the technology itself and disaster recovery.

It appears to be a minefield – but is it one that we’ll adapt to in the same way our foreseers were predicting 50 years ago?

One fear is that processing power will outstrip the capabilities of the human brain – but I’m happy to leave that notion to Hollywood for the time being and just examine how the future will look and how it may impact us, our clients and our clients’ data.

What’s next?

The days of being tied to a single networking vendor or technology are coming to an end as open standards allow the separation of network hardware and software – enabling the development of products that are optimised for data centres.

For example, these optimised designs allow for greater airflow through server components, which keeps them cooler. This will be a huge boon going forward as we look to go totally green, since energy consumption will be vastly reduced. But more on this later.

The Open Compute Project is already championing the cause of making hardware more efficient, flexible and scalable. And, if Facebook believes it to be worth signing up for, then I can definitely see it has legs. Facebook recognised the need to rethink its infrastructure to cope with the unprecedented, huge demands of users. We should all follow suit.

Speed will be another key driver as we look at what’s next on the horizon. Interconnection between systems is increasingly dependent upon fibre rather than copper cabling – this is also true for our core telecoms business. Even 40G and 100G links carried over multiple multimode fibre lanes will give way to individual singlemode fibre channels as the cost of transceivers continues to fall.

What’s driving change?

One school of thought might suggest it’s people and demand that’s driving change – but I believe it’s the technology itself that’s creating the demand – in particular the Internet of Things.

The number of devices that can be controlled by users and our clients, from our cars to our home energy, has grown slowly over the past 10 years but will rise considerably over the next 10 – I’m convinced of it.

Millennials are using multiple devices nowadays; we don’t just see one device per household or employee. And, as readily available devices designed to connect to the internet continue to emerge, this too will impact upon data centres of the future and the need to meet more complex demands.

What’s different?

Cloud and hybrid solutions are with us now, and will evolve as we do. They will remain integral to managing our data but I see SaaS apps in the cloud in the future along with virtual desktop infrastructures (VDI) as companies strive to achieve reliable networking anywhere and everywhere.

Cloud PBX is also set to revolutionise how we communicate anywhere and at any time and will change the face of voice and data infrastructure. This is now a multifaceted business as communications advancements have moved on a long way from two people talking together at either end of a wire. Two-way communication in a business environment is being replaced by multi-sited conferencing.

What’s green?

Great strides have been made to ensure new data centres are green. As we move forward, this needs to remain integral without compromising speed, cost or efficiency. We could also see cyber attacks on grid and telecoms networks that would impact data centres.

Again, a key feature of the Open Compute Project is to minimise the environmental impact of data centre infrastructure.

The initial design stage does away with unnecessary features and components that would waste manufacturing resources and operating energy, and even reduces transportation energy use thanks to lower weight.

Data centres using these design principles can claim extremely low Power Usage Effectiveness (PUE), a measure of the data centre’s efficiency or wastefulness. Operators are now competing to see how close to 1.00 they can get – a figure that would represent zero wasted energy and a colossal benefit for all stakeholders.
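For reference, PUE is defined as the ratio of the total energy a facility draws to the energy actually delivered to the IT equipment:

\[
\mathrm{PUE} = \frac{\text{total facility energy}}{\text{IT equipment energy}}
\]

So a hypothetical facility drawing 1.2 MW in total to run 1 MW of IT load has a PUE of 1.2, with the remaining 0.2 MW going to cooling, power conversion and other overhead; a PUE of exactly 1.00 would mean every watt reaches the IT equipment.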

What’s the end game?

Within a human-occupied building environment, copper structured cabling will continue to play an important role – providing connectivity for the increasing number of networked building control systems such as access control systems, digital CCTV cameras, building management systems and modern LED lighting panels.

Even wireless devices need wired access points to work – and these are connected to the network and a power source by the humble RJ45 plug and a length of Cat6 or, more commonly, Cat6a cable.

Within data centres, connectivity will become almost exclusively via fibre as speeds increase and costs reduce.

Finally – and I know I said I’d leave it to Hollywood – this all assumes a human-occupied building. My fear is that, in direct opposition to Moore’s Law and its exponential rise in transistor counts, we’ll see very few humans in such environments.

Parallels Desktop 13 System Requirements

Parallels Desktop® for Mac empowers millions of users to go beyond the limitations of hardware to achieve their end goals. Interested in getting started with running Windows, Linux, and other popular OSes on your Mac without restarting? Check out the list below to see if your machine’s hardware specifications and desired operating system are supported. […]

The post Parallels Desktop 13 System Requirements appeared first on Parallels Blog.