Tag Archives: automation

Adam Bateson, Umbraco: Open source as a secret weapon

TechForge recently spoke with Adam Bateson, VP of Sales at Umbraco, about the benefits of hyper-automation and digital workforces, and the importance of sustainability. Could you tell us a little bit about the company you work for? Umbraco is the largest Microsoft .NET open source CMS. We are a Danish software company that’s been… Read more »


Canonical releases low-touch private cloud MicroCloud

Canonical has announced the general availability of MicroCloud, a low-touch, open source cloud solution. MicroCloud is part of Canonical’s growing cloud infrastructure portfolio. It is purpose-built for scalable clusters and edge deployments for all types of enterprises. It is designed with simplicity, security and automation in mind, minimising the time and effort to both deploy… Read more »


Basil Faruqui, BMC: Perfecting cloud strategies, and getting the most out of automation

Could you tell us a little about what BMC does and your role within the company? BMC delivers industry-leading software solutions for IT automation, orchestration, operations, and service management to help organisations free up time and space as they continue to drive their digital initiatives forward. We work with thousands of customers and partners around… Read more »


Lenovo ushers in new era of edge automation at scale

Lenovo has unveiled the next generation of ThinkEdge remote automation and orchestration with the introduction of new software solutions to accelerate the deployment of edge solutions. Lenovo’s new Lenovo Open Cloud Automation (LOC-A) 2.6 software delivers secure automated setup, enabling customers to complete global edge deployments for any number of locations in a matter of… Read more »


Chef boosts application IQ with Habitat launch

Chef has launched a new open source project called Habitat, which it claims introduces a new approach for application automation.

The team claims Habitat is a unique piece of software that frees applications from dependency on a company’s infrastructure. When applications are wrapped in Habitat, the runtime environment is no longer the focus and does not constrain the application itself. Because of this, the company claims, applications can run across numerous environments, such as containers, PaaS, cloud infrastructure and on-premises data centres, and also have the intelligence to self-organize and self-configure.

“We must free the application from its dependency on infrastructure to truly achieve the promise of DevOps,” said Adam Jacob, CTO at Chef. “There is so much open source software to be written in the world and we’re very excited to release Habitat into the wild. We believe application-centric automation can give modern development teams what they really want — to build new apps, not muck around in the plumbing.”

Chef, founded only in 2008, would generally be considered a challenger to the technology industry’s giants, though the company has made positive strides in recent years by specializing in DevOps and containers, two of the more prominent growth areas. Although both areas feature heavily in marketing campaigns and conference presentations, real-world adoption has proved more difficult.

The Habitat product is built on the premise that infrastructure has traditionally dictated the design of an application. Chef claims that by making the application and its automation the unit of deployment, developers can focus on business value and on planning features that will make their products stand out, rather than on the constraints of infrastructure and particular runtime environments.

“The launch of Habitat is a significant moment for both Chef and the entire DevOps community in the UK and EMEA,” said Joe Pynadath, GM of EMEA for Chef Software. “It marks our next evolution and will provide an absolutely transformative paradigm shift to how our community and customers can approach application management and automation. An approach that puts the application first and makes them independent of their underlying infrastructure. I am extremely excited to see the positive impact that our Chef community and customers throughout Europe will gain from this revolutionary technology.”

Is the Cloud Right for You?

I recently presented a session entitled “Is the Cloud Right for You?” with Randy Weis and wanted to provide a recap of what I covered in the presentation. In this video, I discuss some of the advantages of the cloud, including access to enterprise-class hardware that you might not normally be able to afford, load balancers, multiple data centers, redundancy, automation and more. I also cover some of the risks associated with the cloud. Enjoy, and as always, reach out with any questions!

 

Download eBook: The Evolution of the Corporate IT Department

 

By Chris Chesley, Solutions Architect

Amazon buys ClusterK to reduce AWS deployment costs

Amazon has acquired ClusterK, which offers software that optimises deployments on AWS spot instances

Amazon has acquired ClusterK, a provider of software that optimises deployment on AWS spot instances for cost and availability.

Amazon confirmed the acquisition to BCN but declined to offer any details about how the technology would be integrated into AWS, or about the financial terms of the deal.

One of the challenges with EC2 spot instances is that cost and availability can vary dramatically depending on overall demand.

At the same time, when these instances are used for long-running jobs (say, batch jobs against large databases) and those jobs are interrupted, the instances can disappear from under you unless failover to reserved instances, or a similar technique, is in place.

Those are some of the things ClusterK aims to solve. It offers an orchestration and scaling service that uses the AWS spot market in conjunction with on-demand or reserved instances to optimise workload deployments for cost and availability – an automated way of keeping workload cost and availability in check (the company claims it can reduce cloud costs by up to 90 per cent).
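
To make the spot-versus-on-demand trade-off concrete, here is a minimal sketch of the kind of fallback logic such a service automates, assuming boto3 with placeholder AMI, instance type and price values; it is an illustration only, not ClusterK’s actual implementation.

```python
# Illustrative sketch only -- not ClusterK's implementation.
# Assumes boto3 credentials are configured; the AMI ID, instance type
# and max spot price below are placeholder values.
import boto3
from botocore.exceptions import ClientError

ec2 = boto3.client("ec2", region_name="us-east-1")

BASE_PARAMS = {
    "ImageId": "ami-0123456789abcdef0",  # placeholder AMI
    "InstanceType": "m5.large",
    "MinCount": 1,
    "MaxCount": 1,
}

def launch_with_spot_fallback(max_spot_price="0.05"):
    """Try a cheap spot instance first; fall back to on-demand
    if the spot request cannot be fulfilled."""
    try:
        return ec2.run_instances(
            **BASE_PARAMS,
            InstanceMarketOptions={
                "MarketType": "spot",
                "SpotOptions": {"MaxPrice": max_spot_price},
            },
        )
    except ClientError as err:
        print(f"Spot launch failed ({err.response['Error']['Code']}); "
              "falling back to on-demand")
        return ec2.run_instances(**BASE_PARAMS)
```

A real scheduler layers monitoring and interruption handling on top of this, so that work is checkpointed or re-queued when a spot instance is reclaimed.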

While it’s not clear exactly how Amazon intends to integrate the technology, it is clear the company is keen to do whatever it takes to keep the price of its services falling, which is where ClusterK could certainly add value. When disclosing its cloud revenues for the first time last week, the company said it has dropped the prices of its services about 50 times since AWS launched ten years ago.

The 2013 Tech Industry – A Year in Review

By Chris Ward, CTO, LogicsOne

As 2013 comes to a close and we begin to look forward to what 2014 will bring, I wanted to take a few minutes to reflect back on the past year. We’ve been talking a lot about that evil word ‘cloud’ for the past three to four years, but this year put a couple of other terms up in lights, including Software Defined X (datacenter, networking, storage, etc.) and Big Data. Like ‘cloud,’ these newer terms can easily mean different things to different people, but put in simple terms, in my opinion, there are some generic definitions which apply in almost all cases. Software Defined X is essentially the concept of taking any ties to specific vendor hardware out of the equation and providing a central, vendor-agnostic point of configuration, except of course for the vendor providing the Software Defined solution :) . I define Big Data simply as the ability to find a very specific and small needle of data in an incredibly large haystack within a reasonably short amount of time. I see both of these technologies becoming more widely adopted in short order, with Big Data technologies already well on the way.

As for our friend ‘the cloud,’ 2013 did see a good amount of growth in consumption of cloud services, specifically in the areas of Software as a Service (SaaS) and Infrastructure as a Service (IaaS). IT has adopted a ‘virtualization first’ strategy over the past three to four years when it comes to bringing any new workloads into the datacenter. I anticipate we’ll see a ‘SaaS first’ approach adopted in short order, if it isn’t out there already. However, I can’t necessarily say the same on the IaaS side as far as ‘IaaS first’ goes. While IaaS is a great solution for elastic computing, I still see most usage confined to application development or very large scale-out applications (think Netflix). The mass adoption of IaaS for simply forklifting existing workloads out of the private datacenter and into the public cloud simply hasn’t happened. Why? My opinion is that, for traditional applications, neither the cost model nor the operational model makes sense yet.

In relation to ‘cloud,’ I did see a lot of adoption of advanced automation, orchestration, and management tools, and thus an uptick in ‘private clouds.’ There are some fantastic tools now available, both commercial and open source, and I absolutely expect this adoption trend to continue, especially in the enterprise space. Datacenters with a vast amount of change occurring, whether in production or test/dev, can greatly benefit from these solutions. However, this comes with a word of caution: just because you can doesn’t mean you should. I say this because I have seen several instances where customers have wanted to automate literally everything in their environments. While that may sound good on the surface, I don’t believe it’s always the right thing to do. There are still times when a human touch remains the best way to go.

As always, there were some big-time announcements from major players in the industry. Here are some posts we did with news and update summaries from VMworld, VMware Partner Exchange, EMC World, Cisco Live and Citrix Synergy. Here’s an additional video from September where Lou Rossi, our VP of Technical Services, explains some new Cisco product announcements. We also hosted a webinar (which you can download here) about VMware’s Horizon Suite, as well as a webinar on our own Cloud Management as a Service offering.

The past few years have seen various predictions about the unsustainability of Moore’s Law, which states that processors will double in computing power every 18 to 24 months, and 2013 was no exception. The latest prediction is that by 2020 we’ll reach the 7nm mark, at which point the exponential scaling Moore’s Law describes will stall. The interesting part is that this prediction is not based on technical limitations but rather economic ones, in that getting below the 7nm mark will be extremely expensive from a manufacturing perspective. And, hey, 64K of RAM is all anyone will ever need, right? :)

Probably the biggest news of 2013 was the revelation that the National Security Agency (NSA) had undertaken a massive surveillance program and seemed to be capturing every packet of data coming into or out of the US across the Internet. I won’t get into any political discussion here, but suffice it to say this is probably the largest example of ‘big data’ that currently exists. It also has large potential ramifications for public cloud adoption: security and data integrity have been two of the major roadblocks to adoption, so it certainly doesn’t help that customers may now be concerned about the NSA eavesdropping on everything going on inside public datacenters. It is estimated that public cloud providers may lose as much as $22-35B over the next three years as customers slow adoption as a result. The only good news, at least for now, is that it’s very doubtful the NSA or anyone else on the planet has the means to actually mine anywhere close to 100% of the data being captured. Like anything else, however, it’s probably only a matter of time.

What do you think the biggest news/advancements of 2013 were?  I would be interested in your thoughts as well.

Register for our upcoming webinar on December 19th to learn how you can free up your IT team to work on more strategic projects (while cutting costs!).

 

 

Why Automate? What to Automate? How to Automate?

By John Dixon, Consulting Architect

Automation is extremely beneficial to organizations. However, questions often come up around why to automate, what to automate, and how to automate.

Why automate?

There are several key benefits surrounding automation. They include:

  • Time savings
  • Employees can be retrained to focus on other (hopefully more strategic) tasks
  • Fewer errors, because human intervention is removed
  • Easier troubleshooting and support when everything is deployed the same way

What to automate?

Organizations should always start with the voice of the customer (VoC). IT departments need to factor in what the end user wants and expects in order to improve their experience. If you can’t trace something you’re automating back to an improved customer experience, that’s usually a good warning sign that you should not be automating it. In addition, you need to be able to trace how automation has benefited the organization; the benefit should always be measurable, and always financial.

What are companies automating?

Request management is the hot one, because it is a major component of cloud computing. This includes service catalogues and self-service portals. Providing a self-service portal, sending the request for approval based on the dollar amount requested, and fulfilling the order through one or more systems is something that is commonly automated today. My advice here is to automate tasks through a general-purpose orchestrator tool (such as CA Process Automation or similar tools) so that automated jobs can be managed from a single console, instead of stitching together disparate systems that call each other in a “rat’s nest” of automation. The general-purpose orchestrator also allows for easier troubleshooting when an automated task does not complete successfully.
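
As a rough illustration of the approval-routing logic an orchestrator might encode for such a request, here is a minimal sketch; the dollar threshold, request fields and fulfilment step are hypothetical, and in a real tool this logic would live in the workflow designer rather than in standalone code.

```python
# Minimal sketch of dollar-amount-based request routing.
# The threshold, fields and fulfil() hook are hypothetical examples.
from dataclasses import dataclass

APPROVAL_THRESHOLD = 5_000  # requests above this amount need manager sign-off

@dataclass
class ServiceRequest:
    requester: str
    item: str
    cost: float

def notify_approver(req: ServiceRequest) -> str:
    # Placeholder: a real workflow would pause here until approval arrives.
    return f"Approval requested for {req.item} (${req.cost:,.2f}) by {req.requester}"

def fulfil(req: ServiceRequest) -> str:
    # Placeholder: a real orchestrator would call the provisioning system(s).
    return f"Fulfilled {req.item} for {req.requester}"

def route_request(req: ServiceRequest) -> str:
    if req.cost > APPROVAL_THRESHOLD:
        return notify_approver(req)   # large requests go to an approver
    return fulfil(req)                # small requests are auto-fulfilled

print(route_request(ServiceRequest("jdoe", "8 vCPU / 32 GB VM", 1_200.0)))
```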

How to automate?

There are some things to consider when sitting down to automate a task, or even determining the best things to automate. Here are a few key points:

  1. Start with the VoC, or Voice of the Customer, and work backwards to identify the systems needed to automate a particular task. For example, maybe the customer is the Human Resources department, and they want to automate the onboarding of a new employee. The workflow may have to set up user accounts, order a new cell phone, order a new laptop, and schedule the new employee on their manager’s calendar for their first day of work. Map out the systems required to accomplish this, and integrate those systems, and no more. You may find that some parts of the procedure are already automated; perhaps your phone provider already has an interface to programmatically request new equipment. Take full advantage of these components.
  2. Don’t automate things that you can’t trace back to a benefit for the organization. Just because you can automate something doesn’t mean that you should. Again, use the voice of the customer and user stories here. A common user story is structured as follows:
    1. “As a [role],
    2. I want to [get something done]
    3. So that I can [benefit in the following way]”
  3. Start small and work upwards to automating more and more complex tasks. Remember the HR onboarding procedure in point #1? I wouldn’t suggest beginning your automation journey there. Pick out one thing to automate from a larger story and get it working properly. Maybe you begin by automating the scheduling of an appointment in Outlook or your calendaring system, or creating a user in Active Directory. Those pieces become components in the HR onboarding story, and perhaps in other stories as well (see the sketch after this list).
  4. Use a general purpose orchestrator instead of stitching together different systems. As in point #3, using an orchestrator will allow you to build reusable components that are useful to automate different tasks. A general purpose orchestrator also allows for easier troubleshooting when things go wrong, tracking of automation jobs in the environment, and more advanced conditional logic. Troubleshooting automation any other way can be very difficult.
  5. You’ll need someone with software development experience. Some automation packages claim that even non-developers can build robust automation with “no coding required.” In some cases, that may be true. However, the experience a developer brings to the table is an absolute must-have when automating complex tasks like the HR onboarding example in point #1.
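
To illustrate points #3 and #4, here is a hypothetical sketch of small, reusable components composed into the HR onboarding flow; the function names and systems are placeholders I have invented for illustration, not any particular product’s API.

```python
# Hypothetical sketch: small reusable components composed into an
# HR-onboarding flow, in the way an orchestrator might sequence them.
# Each function is a placeholder, not a real system API.

def create_directory_account(name: str) -> str:
    # Placeholder for an Active Directory / LDAP provisioning step.
    return name.lower().replace(" ", ".")

def order_equipment(username: str, item: str) -> str:
    # Placeholder for a procurement-system request.
    return f"Ordered {item} for {username}"

def schedule_first_day(username: str, manager: str) -> str:
    # Placeholder for a calendaring-system booking.
    return f"Scheduled day-one meeting: {username} with {manager}"

def onboard_employee(name: str, manager: str) -> list:
    """Compose the reusable steps into one onboarding workflow."""
    username = create_directory_account(name)
    return [
        order_equipment(username, "laptop"),
        order_equipment(username, "cell phone"),
        schedule_first_day(username, manager),
    ]

for step in onboard_employee("Jane Smith", "pat.lee"):
    print(step)
```

The point of the composition is that each small component (directory account, equipment order, calendar booking) can be tested on its own and reused in other workflows, which is exactly what a general-purpose orchestrator makes easier to manage and troubleshoot.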

 

What has your organization automated? How have the results been?