How to Fail Enterprise DevOps Miserably By @Plutora | @DevOpsSummit [#DevOps]

The term DevOps is not well defined, and you’d be hard pressed to get the same definition of “DevOps” from everyone you ask in your enterprise. Developers in your organization may equate DevOps with a specific approach to software builds and the use of popular tools such as Chef, Puppet, Jenkins, Git, and Docker. IT management might see DevOps as a continuation of existing processes with an emphasis on faster time to market and lightweight release procedures.

Without a common definition of the term, you’ll have teams arguing over what is DevOps and what is not. If your software releases still involve a change management tool such as BMC’s Remedy, is it really DevOps? If it takes an hour to deploy a QA build, is it really DevOps? The reality of Enterprise DevOps is that every organization’s answers to these questions will vary. Enterprise DevOps is a compromise between self-service, rapid-fire agility and the ability to manage the risks that accompany mission-critical builds.


From Windows 1 to Windows 10 in 30 Years

Featured images courtesy of Microsoft. Believe it or not, it’s been 30 years since Microsoft introduced Windows, arguably the most popular operating system in the world. Since its debut in 1985, a lot has changed in the world of the PC—but a lot has also stayed the same. See for yourself by clicking through our […]


Will @Avaya Redefine Software Defined Networking? By @Entuity | @CloudExpo [#Cloud #SDN]

One question that springs to mind is why invest in embedded physical SDN vSwitches that sit in-line between each and every appliance and their corresponding access ports? Why not simply apply the appropriate SDN configuration to the access switch on a per-port basis (assuming the access switch is SDN-capable, of which there are a growing number)? Given that SDN (if present) is much more likely to be found in data centres and core networks than out at the access layer and remote branch offices, ONA would allow SDN policies to be pushed out further than existing SDN deployments allow.
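
If an SDN-capable access switch exposes a management API, per-port policy could in principle be applied with a couple of calls. The sketch below is purely hypothetical: the controller URL, endpoint, and policy fields are invented for illustration and do not correspond to any real controller's API.

```python
import requests

# Hypothetical controller endpoint and policy schema, for illustration only.
CONTROLLER = "https://sdn-controller.example.com/api/v1"

def apply_port_policy(switch_id: str, port: int, vlan: int, qos_profile: str) -> None:
    """Push an SDN policy to one access port instead of inserting an in-line vSwitch."""
    policy = {
        "switch": switch_id,
        "port": port,
        "vlan": vlan,            # place the port in the right virtual segment
        "qos": qos_profile,      # apply the traffic profile for that service
    }
    resp = requests.post(f"{CONTROLLER}/port-policies", json=policy, timeout=10)
    resp.raise_for_status()      # surface configuration failures immediately

# Example: tag access port 24 for the voice VLAN with a voice QoS profile.
apply_port_policy("access-sw-07", port=24, vlan=210, qos_profile="voice")
```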


Five Steps to DevOps Success By @AAkela | @DevOpsSummit [@AppDynamics #DevOps]

DevOps is an approach that improves collaboration between Dev and Ops teams to enable fast delivery of applications and ensure an impeccable end-user experience. BizDevOps takes the concept of DevOps to a new level by bringing business context and insights to day-to-day DevOps activities. BizDevOps ensures that Dev and Ops focus on what matters to the business, and it introduces the Biz persona (line-of-business manager, product manager) as a key stakeholder in the process.

IDC recently published research on these two topics in its report, “DevOps and the Cost of Downtime: Fortune 1000 Best Practice Metrics Quantified,” by IDC Vice President Stephen Elliot. In addition to the eye-opening cost of application downtime and the increasing momentum behind DevOps, the report highlighted that “improved customer experience” is the most expected business outcome from DevOps practices.


Is cheap and unlimited cloud storage bad news?

(c)iStock.com/Henrik5000

By Simon Langlois

Everyone can agree: storage quotas for cloud services are increasing very quickly. And this means only one thing: cloud storage is getting dirt-cheap. This is nothing new, as organisations have been getting more and more gigabytes (GB) for their buck in recent months. Some of the industry’s biggest cloud players, like Amazon, Google and Microsoft, have been engaging in a war on storage. Who will include the biggest quota at the lowest price? And who will cut prices to lows we’ve never seen before?

This trend is here to stay. Rewind a little and storage used to be an important part of bundled offers: the more GB you wanted, the more vendors charged you. That is still true, on some levels. Now, storage itself has little value. You want 1 TB of cloud storage per user? No problem! Combine it with some useful apps that make doing business easier and you’ll save a lot of money. Vendors are focusing on delivering value through unique suites and bundles of services.

But is all this data storage at never-before-seen prices really a privilege? If you ask me, it’s more of a burden. I’d even challenge the unlimited-GB giveaway the big players are now marketing. Why?

Loosened IT controls

A good side of this price war is that IT departments can now evolve into cloud brokers for internal needs. They don’t need to worry about capacity management anymore (or at least a lot less). They don’t have to enforce restrictive application-use governance to maintain an acceptable level of growth and minimise storage costs. They can focus on bringing value to different business units by finding, delivering, and integrating the right cloud services for them. But this, unfortunately, leads to a different problem.

Irrelevant and unstructured storage

Human beings will be human beings. Give them 1 TB with no guidelines and they’ll use it to store random content. They’ll fill up that storage and come back for more before you know it. I might be exaggerating here, but hear me out. If there are no guidelines and storage is cheap, people will end up storing personal data such as music, pictures, videos – you name it. And even if they only store work-related documentation, they’ll pile up low-value documents and files that are on hold, copies, drafts, or outdated. After just a couple of years of service usage and employment, it can easily add up to 200 GB per user.

If you’re lucky, this content will all be structured. But life isn’t that simple, is it? Interestingly enough, Gartner claims that 80% of data held by organisations is unstructured. Not only do people handle a lot of documentation, they waste time looking for it! So what does irrelevant and unstructured storage lead to?

A high level of complexity when it comes to moving/retaining unnecessary data

Having a lot of storage is great, at first. But then the moment comes when you have to clean, retain, or move data. Now you’re in for complexity. And in IT, complexity equals hefty costs. To prove the point, here’s an example in the current context.

Faced with the recent price drop in the oil industry, Company XYZ had to lay off 1,000 employees. Let’s say these employees had been using OneDrive for a couple of years. Combined, their accounts add up to a surprising 200 TB of unstructured data. This leaves the company three options.

  • Let the data be for a couple of years. Common legal compliance requires holding business data for seven years. Since storage is cheap, this might not seem like an issue. But why did Company XYZ lay off 1,000 employees again? To reduce expenses. Maintaining the data of 1,000 inactive users for seven years, even at a low monthly fee per user, results in a hefty bill for the company (a rough calculation after this list shows the scale).
  • Migrate the data to a cheaper cloud storage solution, or maybe to an unused on-premises server. This will lower the monthly cost of maintaining the users’ data. Unfortunately, big cloud vendors like Microsoft and Google don’t provide a free tool to easily complete the migration. This means investing in a migration tool, and since data migration is complex, such third-party tools come at high prices. Plus, the hundreds of hours spent by employees (or hired professional services) will add to the expense. It might not be cheaper than option 1 after all.
  • Delete everything. The ultimate option. Although it is the most cost-effective solution, it is not recommended, and legal departments rarely allow it.
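
As a back-of-the-envelope illustration of option 1, here is the arithmetic in Python. The per-gigabyte rate is an assumption chosen to represent “cheap” cloud storage, not a quoted vendor price.

```python
# Rough cost of retaining the data of 1,000 laid-off users for seven years.
USERS = 1_000
GB_PER_USER = 200              # per-user estimate from the example above
RETENTION_YEARS = 7            # common legal retention period
RATE_PER_GB_MONTH = 0.02       # assumed "cheap" storage rate, USD per GB-month

total_gb = USERS * GB_PER_USER                   # 200,000 GB, i.e. 200 TB
monthly_cost = total_gb * RATE_PER_GB_MONTH      # $4,000 per month
total_cost = monthly_cost * 12 * RETENTION_YEARS

print(f"{total_gb:,} GB -> ${monthly_cost:,.0f}/month, "
      f"${total_cost:,.0f} over {RETENTION_YEARS} years")
# 200,000 GB -> $4,000/month, $336,000 over 7 years
```

Even at pennies per gigabyte, data nobody uses turns into a six-figure line item.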

So what happens when you’re stuck with all this unnecessary data?

Well-planned, implemented and maintained data governance has great value

The moral of the story? As appealing as it may be for IT departments to relax the governance (direction, control, and measurement) of data stored in the cloud, it’s a no-go. It leads to high expenses; unnecessary, irrelevant, and disorganised content; and lost productivity as employees search for documents. This is especially true when cloud storage is unlimited. Which brings us back to the question: is access to unlimited cloud storage really a privilege?

Unlimited storage is a poisoned gift

It may take a while before organisations realise this, but unlimited storage is not as great a gift as it seems. The minute an organisation has to migrate its content somewhere else – after an acquisition or layoffs, for instance – large amounts of data become a burden. It’s better to monitor a smaller quota that can be increased if the need arises. What do you think?
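
To make the “smaller quota, increased on demand” idea concrete, here is a minimal sketch in Python. The per-user usage figures are hypothetical stand-ins for whatever reporting your storage vendor actually provides.

```python
# Flag users approaching a modest default quota so IT can review and raise it
# deliberately, rather than handing everyone unlimited space up front.
DEFAULT_QUOTA_GB = 100.0
ALERT_THRESHOLD = 0.8   # review a user once they pass 80% of their quota

def users_to_review(usage_by_user: dict[str, float],
                    quota_gb: float = DEFAULT_QUOTA_GB) -> list[str]:
    return [user for user, used_gb in usage_by_user.items()
            if used_gb / quota_gb >= ALERT_THRESHOLD]

# Hypothetical figures, as if pulled from a vendor usage report (GB per user).
usage = {"alice": 42.0, "bob": 95.5, "carol": 88.1}
print(users_to_review(usage))   # ['bob', 'carol'] -> candidates for a quota bump
```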


CenturyLink: Why complexity, not the cloud vendor, locks you in

(c)iStock.com/RinoCDZ

It continues to be an extremely busy time for CenturyLink Cloud. Amid a flurry of news and acquisitions, one move stood out: the buyout of disaster recovery as a service software provider DataGardens.

As this publication has previously examined, cloud disaster recovery is certainly one of the more popular buzzwords in the IT industry right now, yet a lack of clarity still pervades. There are degrees of severity, from a short outage to a full-blown DDoS attack, and vendors take different strategies; Verizon’s decision to implement a planned outage in January was roundly panned by the industry.

David Shacochis is VP of cloud platform at CenturyLink. Speaking to CloudTech at Cloud Expo Europe, he describes disaster recovery as a service (DRaaS) as “a good headline”, but argues it misses the point.

“It’s certainly an important conversation starter, but really what it’s all about is workload portability,” he explains. “If you have workload portability and flexibility, and the ability to move workloads around and keep them – if you have that flexibility then that’s a risk mitigation, and that risk mitigation is ultimately what disaster recovery and continuity of operations is all about.”

Shacochis continues: “Disaster recovery, more and more in the cloud age, is about architecture and design, to mitigate against the need for disaster recovery. DR as a service is a great way to start a conversation, [but] I think we’re increasingly getting to the point where disaster recovery declarations are not really what the industry needs to hear.”

The proof is in the pudding, Shacochis argues: the new wave of applications, redesigned to be resilient in the age of cloud, is what people need to hear about instead of hard-luck stories.

“A lot of the state management, session management and consistency architectures that modern application architects are designing for are starting to remediate the need for a lot of that,” he says. “Every graduating class of computer scientists starts to buy in more and more to that architecture, that certain way of designing and way of thinking.”

As Marc Andreessen once wrote, software is eating the world. Nowhere is that more appropriate than in cloud disaster recovery. Whereas once the DR strategy was storing a pile of kit in a data centre somewhere, software is disrupting it completely. “It’s very easy to take a copy of a cloud application and copy it to another cloud provider and keep it there as your hot standby,” Shacochis notes.
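
One minimal way to picture that “copy and keep as hot standby” idea, assuming the application is packaged as a container image, is to mirror the image into a second provider’s registry. The registry hostnames below are invented for illustration.

```python
import subprocess

# Mirror the production image to a standby provider's registry so a warm copy
# is always ready to run elsewhere. Hostnames are illustrative assumptions.
SOURCE = "registry.primary-cloud.example.com/shop/web:1.4.2"
STANDBY = "registry.standby-cloud.example.com/shop/web:1.4.2"

for cmd in (
    ["docker", "pull", SOURCE],           # fetch the current production image
    ["docker", "tag", SOURCE, STANDBY],   # re-point it at the standby registry
    ["docker", "push", STANDBY],          # copy it to the second provider
):
    subprocess.run(cmd, check=True)       # fail loudly if any step breaks
```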

Naturally, prevention is better than cure; it’s important to have disaster recovery implementation in place, but it’s better if you don’t have to resort to it. Hence the importance of the cloud exit strategy.

For CenturyLink, whose acquisition of DataGardens was as much about hiring the talent involved as about the product, building services that are easy to migrate in and out of is “fundamental” to what they design. The oft-reported concern of cloud vendor lock-in is, according to Shacochis, misplaced. It’s not vendors that provide the lock-in – it’s complexity.

“You can not be in a cloud,” he says. “I’ve seen colocation cages and environments that just make your heart cry. It’s just a tangled mess. They would simply have to rebuild it somewhere else in order to ever let go of that environment.

“We particularly think that cloud computing is fundamentally a lock-in free environment if done right,” he continues. “I think there are some cloud environments and some cloud platforms that are getting so clever and innovative, like what Amazon’s doing with Lambda or some of their proprietary modular services.”

The key, as Shacochis states, is workload portability. Expect more to come from DataGardens once its team is settled in, but disaster recovery and vendor lock-in concerns are certainly changing – and CenturyLink hopes to be on the right path with its vision.

Dropbox Android SDK vulnerability revealed, cloud storage provider praised for response

(c)iStock.com/funky-data

A major vulnerability in the Dropbox SDK for Android has been revealed by IBM Security, whereby attackers can connect applications on mobile devices to a Dropbox account controlled by the attacker.

The vulnerability has since been fixed, with IBM praising Dropbox for its response: the company acknowledged receipt of the disclosure within six minutes, confirmed the vulnerability within the day, and issued a patch within four days.

It was one of the quickest response times IBM Security had ever seen, which “undoubtedly shows the company’s commitment to security,” according to an IBM post. Compare that with the flaw in Moonpig’s API, which sat unexamined for 17 months before security researcher Paul Price, exasperated, went public.

The context here is not just Dropbox’s own customers, but other apps as well. According to AppBrain, 0.31% of all applications use the Dropbox SDK, with the figure rising to 1.4% among the top 500 apps. Microsoft Office Mobile, for example, utilises the Dropbox SDK, and with over 10 million downloads it potentially puts a lot of people at risk.

Of the 41 apps using the Dropbox SDK that IBM examined in its initial research, 76% were vulnerable to the attack. Dropbox leverages the OAuth protocol, which doesn’t disclose user credentials, and its SDK generates a cryptographic nonce which is saved locally and can’t be guessed by attackers. However, the CVE-2014-8889 vulnerability lets attackers insert an arbitrary access token into the Dropbox SDK, bypassing the nonce protection altogether.
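
To see why injecting a token sidesteps the nonce, here is an illustrative Python sketch of nonce-protected authentication. This is not Dropbox’s actual code; it only models the general pattern the article describes.

```python
import secrets

_store: dict[str, str] = {}   # stand-in for the app's local storage

def start_auth() -> str:
    """Begin the flow: generate an unguessable nonce and save it locally."""
    nonce = secrets.token_hex(16)
    _store["auth_nonce"] = nonce
    return nonce              # sent along with the outgoing auth request

def finish_auth(returned_nonce: str, access_token: str) -> None:
    """Accept a token only if the response echoes our saved nonce."""
    if returned_nonce != _store.get("auth_nonce"):
        raise ValueError("nonce mismatch: response not tied to our request")
    _store["access_token"] = access_token

# The reported flaw: an attacker who can write an arbitrary access token into
# the SDK's local storage never passes through finish_auth(), so the nonce
# check is bypassed altogether.
```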

The vulnerability has since been fixed in the Dropbox SDK for Android v1.62. IBM warns developers who use the Android Dropbox SDK to upgrade their version, as well as advising users to ‘remain diligent’ and apply mobile app updates to patch any vulnerabilities.


Announcing @AristaNetworks to Exhibit at @DevOpsSummit NY [#DevOps]

SYS-CON Events announced today that Arista Networks will exhibit at SYS-CON’s DevOps Summit 2015 New York, which will take place June 9-11, 2015, at the Javits Center in New York City, NY.
Arista Networks was founded to deliver software-driven cloud networking solutions for large data center and computing environments. Arista’s award-winning 10/40/100GbE switches redefine scalability, robustness, and price-performance, with over 3,000 customers and more than three million cloud networking ports deployed worldwide.


What’s the Buzz with PaaS?

I was talking to a friend the other day about Cloud Foundry and whether there was any buzz surrounding it. Cloud Foundry in particular and Platform-as-a-Service (PaaS) in general have now been around for a few years, a considerable amount of time in the Web Era.

Recent buzz was generated with the formal establishment of the Cloud Foundry Foundation last year. Collateral buzz was generated by CoreOS with its late-year Rocket container announcement, which took aim at CFF member Docker and its expanding containers strategy.

That said, “buzz” may be too strong a word to associate with the unglamorous tasks of designing and deploying software applications and services into the cloud. Yet companies that are integrating PaaS into the way they do things should not be afraid to commend themselves a little for being pioneers in what are still the early days of the PaaS phenomenon.

PaaS generated less than $4 billion in global revenue in 2012, according to IDC. The research firm predicts annual growth rates on the order of 50%, with revenues reaching $14 billion by 2017.

This number would still be less than 1% of the global total enterprise IT spend of more than $2 trillion. But it would be a very important 1%. PaaS provides that critical link between infrastructure (processing power, storage, and networking) and the software applications and services that run on it.

Time’s wastin’
In a world in which flexible, fluid technology stacks are emerging, competing, and mixing with one another, there is a sense that there is no time to develop and deploy new software. Long development cycles are replaced by the perpetual beta, a DevOps culture is emerging (difficult as it may be), and a mobile-equipped populace not only needs its information right now, it needs the latest, coolest experience right now.

So, we circle back to PaaS. Within a typical enterprise shop of a few dozen or few hundred developers, a PaaS such as Cloud Foundry will be of interest to about 70% of the staff, according to a conversation I had with an exec at a Cloud Foundry company. Not everyone will use it directly, of course, but the majority of the team needs to know what it is doing and why. In smaller shops, a PaaS will be of high interest to everyone.

There may not be a big buzz per se for the PaaS itself, but there should be for the projects it enables. I’ve already seen notable stories about Cloud Foundry being used to improve products and services in major industries such as healthcare, agriculture, and entertainment. It has also transformed the entire business and delivery model for a major publishing company.

Companies buzzing with success include AT&T, Monsanto, the LDS Church, Philips, Axel Springer, and Warner Music. Each of these projects had a measurable outcome and success criteria.

So, I do believe there is some buzz about Cloud Foundry, or at least about the projects it is helping bring to life.


Nearline: Google’s Low-Cost Cloud Storage Service For Cold Data

Google is launching a new cloud storage service that is expected to change how companies of any size view online storage. Google Cloud Storage Nearline allows businesses to store data they or their customers do not often need, at a low cost of $0.01 per gigabyte per month at rest.


Unlike other cloud storage services, where retrieving data can take a very long time, Google promises that data on Nearline will be available in only three seconds. According to Tom Kershaw, director of product management for the Cloud Platform team, Google believes the gap between the cost of online and offline storage must shrink.
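
At the quoted at-rest price, the monthly bill is easy to estimate. A quick check in Python, using 1 TB = 1,000 GB for simplicity:

```python
NEARLINE_RATE = 0.01   # USD per GB per month, as quoted for data at rest

def monthly_cost(tb_stored: float) -> float:
    """Estimated monthly Nearline bill for a given amount of cold data."""
    return tb_stored * 1_000 * NEARLINE_RATE

print(f"50 TB of cold data: ${monthly_cost(50):,.0f}/month")   # $500/month
```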


Businesses may need or want to keep all of their records for as long as possible. Once records have been moved offline, however, it becomes difficult to find a desired record quickly. Google is hoping to blur the line between cold storage and online storage so that businesses do not have to delete their files or move them to more complicated storage locations.


The low cost of this storage service, which is competitive with Amazon’s Glacier, comes from the fact that Google hosts all of its data on a single system, regardless of location. This is unusual for a cloud storage service: historically, providers have built two separate systems, and the hardest part of offline storage is transferring data between them.


Nearline uses the same system as the rest of Google’s storage products, including the same encryption and security features, and it shares APIs with the standard storage service. Early adopters are expected to use the service primarily for photo, video and document storage.
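
Because Nearline shares its API with standard Cloud Storage, switching a bucket to cold storage mostly comes down to its storage class. As a rough sketch, using the Python client library (which postdates this article; bucket and file names are invented):

```python
from google.cloud import storage   # pip install google-cloud-storage

client = storage.Client()

# Create a bucket whose default storage class is NEARLINE.
bucket = client.bucket("example-cold-archive")
bucket.storage_class = "NEARLINE"
client.create_bucket(bucket, location="us-central1")

# Reads and writes then use exactly the same calls as standard storage.
blob = bucket.blob("records/2014-archive.tar.gz")
blob.upload_from_filename("2014-archive.tar.gz")
```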


In order to reach a broader market, Google has partnered with a number of storage companies, most notably Iron Mountain. This partnership will allow users to send in their hard drives and have them securely uploaded onto Nearline.

