Working Remotely this Holiday Season? Don’t Forget to Download Parallels Access

Are you coming down with end-of-year senioritis, antsy to get out of the office for a few days to disconnect from work and connect with family and friends? I know I am. I also know that the second I check my phone when I’m home for the holidays, there will be […]

The post Working Remotely this Holiday Season? Don’t Forget to Download Parallels Access appeared first on Parallels Blog.

NetApp confirms $870m SolidFire acquisition – but don’t think this is all about flash


Cloud storage and data management firm NetApp has announced the acquisition of all-flash storage system provider SolidFire for $870 million (£586m) in cash.

The all-flash offering across three key market segments – traditional enterprise infrastructure buyers, application owners and next-generation infrastructure buyers – is a key reason for the purchase, NetApp argues.

“This acquisition will benefit current and future customers looking to gain the benefits of webscale cloud providers for their own data centres,” said NetApp CEO George Kurian. “We look forward to extending NetApp’s flash leadership with the SolidFire team, products and partner ecosystem, and to accelerating flash adoption through NetApp’s large partner and customer base.”

On the other side, SolidFire CEO and founder Dave Wright argues the acquisition represents a “tremendous opportunity to double down on our mission”, while noting that putting all-flash storage as a single market “obscures the true landscape of the data centre today.”

“All primary storage is moving to flash – fast,” he wrote in a company blog post. “The all-flash array market is not a sign of a niche market – it is displacing spinning disk at an exponential rate, with no sign of slowing down.

“There is no doubt that the IT industry is in the midst of a transformation the likes of which hasn’t been seen in decades,” he added. “Customers navigating this transformation need vendors with solutions that span the full range of data centre environments and application use cases.”

In October last year, SolidFire raised $82m in series D funding, bringing its total capital raised to $150m. Jay Prassl, SolidFire VP of marketing, explained to this publication at the time that the Colorado-based firm did not want to be in the position of being “forced to go public”, citing the case of Violin Memory and its less-than-stellar IPO in 2013.

Not everyone has the same opinion of the deal, however. Even though fellow enterprise flash storage provider Pure Storage went public in October – and Wright tweeted congratulations to his ‘friends’ at the company – the Mountain View firm decided to stick the knife in.

“This acquisition shows very clearly that NetApp is still scrambling to piece together a viable flash strategy,” Pure VP products Matt Kixmoeller wrote. He added: “Although we hold the folks at SolidFire in very high regard, we are confident our technology trumps. Our product lead will only grow as SolidFire is slowed down by the coming years of integration, in-fighting, and confusion that come via acquisition.”

This reporter is reminded of January 2014, when enterprise mobility management provider AirWatch was acquired by VMware. Citrix, a rival of VMware’s in desktop virtualisation and end user computing (EUC) among other areas, published a blog post in response describing the latter’s vision for EUC as “laughable on many counts.” The post was later pulled.

The deal is expected to close in the fourth quarter of NetApp’s 2016 fiscal year, with Wright leading the SolidFire product line within NetApp’s product operations.

The white space between hand-crafted and automated approaches to the cloud


The journey to the cloud will be faster for some than others. For startups, the decision is easy: they are “cloud first”. They have low overheads and can test products and services in new markets with ease. This model is one of the main reasons that the likes of Uber and Airbnb have succeeded where the dot-com companies of the late 90s failed (that, and consumer Internet access and the rise of smartphones).

For CIOs of larger organisations, it’s a different picture. They need to move from a complex, heterogeneous IT infrastructure to a highly automated – and, ultimately, highly scalable – distributed platform.

On the journey to homogeneous cloud, where the entire software stack is provided by one vendor, the provision of managed services remains essential. And to ensure a smooth and successful journey, enterprises need a service provider that can bridge the gap between the heterogeneous and homogeneous. In specific terms that means a bridge between: the network and the data centre; the IT infrastructure library (ITIL) and DevOps; infrastructure and professional services; and finally, local and global services.

This transformation in IT services is being driven by a need to match IT capabilities to an increasingly demanding business. CEOs are beginning to see the connection between digital investments and business objectives. According to a recent PwC survey, 86% of CEOs say a clear vision of how digital technologies can help achieve competitive advantage is key to the success of digital investments.

Businesses demand a lot from technology. They expect 100 per cent connectivity and application uptime. Their customers likewise expect availability of services around the clock – and flock to social media to let the world know if they can’t get them. Businesses expect IT to support the mobile revolution, to help employees become more productive and to ultimately help the company become more profitable. It’s a big ask. 

At the same time, there’s increasing pressure on the technology function. Cost containment, shorter implementation cycles and (the dreaded) regulatory compliance are all issues for the CIO to contend with ‒ not to mention finding motivated, skilled and loyal staff to help them with these challenges or the myriad of other problems that go with their day-to-day jobs.

It’s easy to see the attraction of moving infrastructure and applications to the cloud: flexibility and choice. Those two attributes deliver business agility that’s every bit as important as the potential cost savings. This contrasts with the traditional service value proposition, where complex processes are managed by a third party.

This 20-year-old traditional service model can be compared to a restaurant where the chef cooks the meal in front of the guests. Sometimes that is exactly what you want: to watch the preparation step by step. Knowing every single stage of the process might matter to you from an architecture or compliance perspective, or simply for your own awareness.

There’s nothing wrong with this model in principle, but in many instances it has been superseded by the modern era of cloud and software as a service (SaaS) – models that deliver at the press of a button. Yet between the hand-crafted approach at one end and the highly automated cloud and SaaS model at the other, there’s a lot of white space.

This white space is where many companies are today. The situation is like an 80s robot serving drinks: if it feels like an antiquated robot and looks like an antiquated robot, chances are you have a dud robot on your hands. You’re not really getting the truly automated user experience you expected. You may be able to glean that there is some virtualisation and automation in the core of the service, but you’re missing out on a good user experience, or getting something that’s merely bolted together.

Modernising these old service approaches requires a lot of thinking, but at the end of the day, automation always wins over headcount. Autonomic management systems can also ensure that there is full integration between the workflow and platform as a service (PaaS) framework with security/attack detection, application scheduler and dynamic resource provisioning algorithms.

Eventually that automated user experience will be able to take those edge cases that we typically reserve for human intervention and deliver something much more sophisticated. Some of the features of autonomic cloud computing are available today for organisations that have completed – or at least partly completed – their journey to the cloud.

The cloud era has been with us for a while but the momentum is gathering pace. IT managers now trust and understand the benefits. With more and more demand on digitisation and the monetisation of IT services, companies need – and now expect – an agile IT service that can finally match the demands it faces.

Top Node.js Metrics to Watch | @ThingsExpo #APM #IoT #Nodejs #BigData

Monitoring Node.js applications has special challenges. The dynamic nature of the language provides many “opportunities” for developers to produce memory leaks, and a single function blocking the event queue can have a huge impact on overall application performance. Parallel execution of jobs is done with multiple worker processes using the “cluster” functionality to take full advantage of multi-core CPUs – but the master and worker processes belong to a single application, which means they should be monitored together. Let’s take a deep look at the top metrics in Node.js applications to get a better understanding of why they are so important to monitor.

read more

The Future of Data Security By @Kevin_Jackson | @CloudExpo #Cloud

The Dell Fellows program recognizes engineers for their outstanding and sustained technical achievements, engineering contributions and advancement of the industry. Fellows are also seen as top innovators who have distinguished themselves through ingenuity, intellectual curiosity and inventiveness in the delivery of technology solutions. For these reasons and more, I couldn’t pass up the opportunity to speak with Timothy G. Brown, Executive Director of Security and Dell Fellow. During our broad-ranging discussion, Tim shared with me his exciting view of security in the not-too-distant future.

read more

Teradata: Embrace the Power of PaaS By @Kevin_Jackson | @CloudExpo #Cloud

Platform-as-a-Service (PaaS) has always been the unappreciated sibling of the cloud computing service model trio. Existing in the dark shadow of the most widely adopted Software-as-a-Service (SaaS) and foundationally powerful Infrastructure-as-a-Service (IaaS), the third service model is often misunderstood and widely ignored.
PaaS provides a platform allowing customers to develop, run, and manage web applications without the complexity of building and maintaining the infrastructure. Its unique power is associated with developing and deploying applications.

read more

Royal Mail bags couriering SaaS specialist NetDespatch

The UK’s Royal Mail has bought cloud-based parcel management system NetDespatch in a bid to expand its range of services and global reach.

NetDespatch will operate as an independent standalone subsidiary so it can continue to service existing clients and is free to offer services to Royal Mail competitors in future. NetDespatch directors Matthew Robertson and Matthew Clark will remain in charge of operations and all existing client terms and conditions will remain unchanged.

The service provider helps carriers (such as its new parent company Royal Mail) to manage the transport of parcels for 130,000 business customers in 100 countries across the world. The NetDespatch parcel management system is a software as a service (SaaS) cloud system that carriers use to track the movements of parcels. It has grown in popularity because it makes it easier to integrate ecommerce websites, sales order processing and warehouse systems at the point of despatch. It also simplifies printing shipping labels, customs documents and manifests, and automatically pre-advises carriers of incoming parcels.

The aim is to make a relatively complex process simple, make logistics more efficient and save money for everyone in the supply chain from retailer to carrier to consumer, said Matthew Robertson, NetDespatch’s Commercial Director. “E-commerce is exploding in the run up to Christmas and we expect to continue to steam ahead in 2016 and beyond,” he said.

NetDespatch’s cloud software has made the integration of the Royal Mail’s systems with its customers’ complex IT estates a lot quicker, according to Nick Landon, Managing Director of Royal Mail Parcels. “This acquisition will support our parcels business with new and innovative software solutions,” he said. The fee for the transaction was not disclosed.

Microsoft acquires Metanautix with Quest for intelligent cloud

Microsoft has bought Californian start-up Metanautix for an undisclosed fee in a bid to improve the flow of analytics data as part of its ‘intelligent cloud’ strategy.

The Palo Alto vendor was launched by Theo Vassilakis and Toli Lerios in 2014 with $7 million. The Google and Facebook veterans had impressed venture capitalists with their plans for more penetrative analysis of disparate data. The strategy was to integrate the data supply chains of enterprises by building a data computing engine, Quest, that created scalable SQL access to any data.

Modern corporations aspire to data-driven strategies but have far too much information to deal with, according to Metanautix. With so many sources of data, only a fraction can be analysed, often because too many information silos are impervious to query tools.

Metanautix uses SQL, the most popular query language, to interrogate sources as diverse as data warehouses, open source databases, business systems and in-house/on-premises systems. The upshot is that all data is equally accessible, whether it’s from Salesforce or SQL Server, Teradata or MongoDB.

“As someone who has led complex, large scale data warehousing projects myself, I am excited about building the intelligent cloud and helping to realize the full value of data,” said Joseph Sirosh, corporate VP of Microsoft’s Data Group, announcing the takeover on the company website.

Metanautix’s technology, which promises to connect to all data regardless of type, size or location, will no longer be available as a branded product or service. Microsoft will initially integrate it into its SQL Server and Cortana Analytics systems, with details of integration with the rest of Microsoft’s service portfolio to be announced in the coming months, Sirosh said.

The blog posting from Metanautix CEO Theo Vassilakis hinted at further developments. “We look forward to being part of Microsoft’s important efforts with Azure and SQL Server to give enterprise customers a unified view of all of their data across cloud and on-premises systems,” he said.

Monetization of Everything | @ThingsExpo @IanKhanLive #IoT #BigData

“IoT is going to be a huge industry with a lot of value for end users, for industries, for consumers, for manufacturers. How can we use cloud to effectively manage IoT applications?” stated Ian Khan, Innovation & Marketing Manager at Solgeniakhela, in this SYS-CON.tv interview at @ThingsExpo, held November 3-5, 2015, at the Santa Clara Convention Center in Santa Clara, CA.

read more