Category archive: Interviews

Sarah Armstrong-Smith, Microsoft: From the Millennium Bug

Sarah Armstrong-Smith has built a career on risk management, resilience, and staying ahead of evolving cyber threats. As a leading cybersecurity speaker and chief security adviser at Microsoft Europe, she has spent more than two decades helping businesses navigate digital transformation while strengthening their security posture. We spoke with Sarah to explore the biggest cybersecurity […]

The post Sarah Armstrong-Smith, Microsoft: From the Millennium Bug appeared first on Cloud Computing News.

International Women’s Day: Women in technology in 2025 on breaking the bias

A recent PwC study found that a career in the tech industry was a first choice for just 3% of female students, and that women make up just 22% of AI professionals. With the latest wave of companies scaling back on diversity, equity and inclusion (DEI) initiatives, women in tech find themselves at a crossroads. […]

The post International Women’s Day: Women in technology in 2025 on breaking the bias appeared first on Cloud Computing News.

Madoc Batters, Warner Hotels: The biggest challenge lies in effecting change

Warner Hotels, originally founded in 1932 as Warner Holiday Camp, is a hospitality company boasting numerous picturesque country and coastal properties dotted all around the UK – including North Wales, Somerset, Herefordshire, Berkshire, North Yorkshire, Nottinghamshire, Isle of Wight, Suffolk, Hampshire and Warwickshire. For the past four years, the company’s head of cloud and IT […]

The post Madoc Batters, Warner Hotels: The biggest challenge lies in effecting change appeared first on Cloud Computing News.

Rackspace CTO: no-one is bigger than the software revolution

Rackspace CTO John Engates speaking at Rackspace: Solve 2016

While cloud computing has been normalized to a degree, the industry is now focusing on the broader benefits which can be derived from the technology as a whole. For the majority of companies evaluating cloud technologies, reducing CAPEX and OPEX simply isn’t a strong enough business case anymore.

This is certainly the view of Rackspace CTO John Engates. We’re starting to see the beginning of a new trend which will define the future of a vast number of organizations: the ability and desire to experiment. Those who can experiment with new technology, and are prepared to fail, will succeed. And those who don’t, won’t.

Although there are savings to be made through the transition to a cloud environment, early adopters are now looking beyond them. Cloud will underpin the growth and success of the new wave of next-generation technologies, whether it is virtual reality, artificial intelligence or autonomous vehicles. The early adopters are already defining how these technologies will take their business to the next level; the risk for the rest is how far they will be left behind if they don’t get up to speed quickly.

“Cloud as a technology is just about hitting the mainstream now,” said Engates. “Everything pre-2015 has been early adopters, but for mass markets it was business as usual.

“The main problem is that the majority of these companies are two or three steps away from the cloud. The cloud is not about saving money, but freeing up your developers so they can experiment with new technologies, learn new languages and take the company forward. If you’re not thinking about these technologies now, how far behind are you? And you’re probably going to be in a very difficult position in a couple of years.”

Blockbuster is a classic example. Blockbuster and Netflix were in a similar position pre-digitalization; most people now forget that Netflix initially rose to fame by delivering DVDs to its customers through the post. Fast forward to the digital era: Netflix evolved and created its current market position, one which a number of major players are now trying to emulate, and Blockbuster no longer exists.

For Engates, this example highlights the importance of experimentation. Netflix was a company which allowed its developers to play with new technologies and methods of delivery, whereas Blockbuster attempted to hold onto the traditional model. It will be the same for other verticals in the coming years: those who embrace the new digital era, adapt their models and allow their developers freedom to innovate will remain competitive; those who don’t will take the same route as Blockbuster.

“The successful companies of the future will be software companies,” said Engates. “They may not sell software but they will use it as a means to define their business and be creative in the marketplace. The likes of Google, Facebook, Uber and Netflix are all software companies. They aren’t people companies or infrastructure companies, they are software. If you want to compete with these companies you need to get better at creating the software experience.”

Nike and Under Armour are two more companies highlighted by Engates. While both are lifestyle and sportswear brands, both have had to create a digital experience to meet the demands of customers. A few years ago industry giants such as Nike and Under Armour were too big to be bothered by such trends, but the cloud computing era has levelled the playing field. No-one is bigger than the software revolution.

“I think that companies have to enable some of their organization to be innovative and to be creative,” said Engates. “Most of IT has been behind the wall; they haven’t been innovators, they’ve been keeping the lights on. It wasn’t about transforming the company into something new and different; that was the job of product development or marketing. But today, inventing the new it-thing means you have to have a digital component, to connect with your users through their mobile devices.”

Mobile devices are now redefining business and consumer attitudes. For the most part this is how the consumer connects with new companies; it’s almost exclusively digital, and if your company is not embracing this, Engates thinks it won’t be too long before you’re not relevant.

But will companies take those risks? “Not all of them will,” said Engates. “Not every company will make that leap. The ones that don’t will be left behind. Now even banks are starting to do this as you’re starting to see more automated investing and digital advisors. Why would you need to go to the branch if you can do it over the phone?”

For innovation to occur within an organization, the conditions have to be right. In the majority of large scale organizations, innovation is very difficult to achieve. There are too many risks, too much red tape and too much politics. The notion that a new idea might not succeed, or reap short term benefits, scares the board and stakeholders, which in turn will inhibit innovation. It’s a difficult loop to get out of, and for a number of larger, stodgy organizations, it will be immensely difficult.

“The reason cloud is so important is because to innovate you need to be using the most modern tools, for example data science, continuous integration and containers,” said Engates. “You need APIs to interact with; you don’t want to wait six weeks for a server. You want to experiment and do things quickly. If you want to do analytics, you need storage and compute power; you need to have the cloud.

“A lot of the people who want to work on these projects have a lot of options. There are a lot of smaller companies who have these conditions to be innovative, so they attract these developers. Companies have to adapt to them, not force them to adapt to the company. Decision makers need to change their organization to have the modern environment for these developers to work in, to be innovative and to make the company competitive in the digital era.”

Virtustream enters European cloud market place

When looking at the European cloud market ecosystem, most people would be forgiven for not looking much past the three largest holders of market share: AWS, Microsoft Azure and Google. But there are alternatives, despite them being less well known. This is the challenge facing Virtustream MD Simon Walsh.

Although Virtustream has been in operation as a company since 2009, the team consider themselves in start-up mode, positioning themselves to pounce on the European market over the coming months. The company was acquired by EMC last year and formed the tech giant’s managed cloud services business, a position which could be seen as enviable by other cloud companies aiming to topple the top three’s firm grasp on the cloud market.

“EMC is the largest storage company in the world,” said Walsh. “And we’re aiming to leverage that position. We’re taking a rifle shot approach to building business in the European markets, but in parallel we’ve partnered with EMC because they own us, and we’ve partnered with Dell because they own EMC. With these relationships, we have access to multiple routes to market, and we plan on leveraging the recognition of companies like EMC and Dell to scale Virtustream rapidly.”

Virtustream is currently one of six EMC Federation companies (EMC2, Pivotal, RSA, VCE, VMware and Virtustream), and will continue to be an independent brand following the introduction of Dell Technologies, and the subsequent sub-brands, in the coming months. While the brand is relatively unknown in the EMEA and APJ markets, this is not the case in North America, where it holds a number of certifications for federal government contracts and serves a number of enterprise customers.

Growth in the European market will firstly utilize the certifications Virtustream has in the US market to provide credibility for public sector organizations in Europe, and secondly, leverage customers who are already bought into the EMC and Dell portfolios.

The new EMC/Dell execs are seemingly learning from Microsoft’s rise to the top of the market segment rather than from AWS’: becoming an end-to-end enterprise IT vendor (like Microsoft) as opposed to a specialist public cloud company (like AWS). While AWS is widely recognised as the cloud leader worldwide, a recent study from JP Morgan will give the new EMC/Dell execs confidence.

The research found that 46.9% of the CIOs surveyed named Microsoft as the most critical and indispensable vendor to the future of their IT environment, whereas AWS accounted for only 13%. The all-encompassing portfolio offered by Microsoft (cloud, desktop, server, database etc.) was more appealing than AWS’ offering.

Virtustream Managing Director Simon Walsh

Virtustream can offer cloud capabilities across the board, from cloud native to traditional systems of record, and the team have now connected the cloud storage options directly to three EMC platforms in software. The team are betting that trust in the EMC brand, straightforward upgrades and simple integration between the offerings of all federation businesses will give customers a portfolio which can’t be matched in the industry.

“EMC is globally 30% of the storage market, if we go to the installed base and connect these customers to our storage cloud we should be able to scale pretty quickly (on growth of the Virtustream business),” said Walsh. “We may not be number one in the cloud storage market, but I can see us being in the top three for market share within the near future.”

One area the team are targeting in particular is core business applications. Most enterprise organizations could be perceived to have a reluctance, even a paranoia, about running core business applications in the cloud, though Virtustream believes it has an offering which can counter this trend.

“Yes, I see paranoia, however it is geographically different,” said Walsh. “Europe is behind. Europe is still clinging onto building it themselves or outsourcing. There’s nothing wrong with outsourcing, but in the Americas they are much bolder at adopting cloud.

“Most people have used cloud to date for dev and test or they’ve used it for web front end or scale out systems of engagement, hardly anybody actually has an application which they run their business on in the cloud. It’s either in their own data centre which they run themselves or they’ve outsourced, and they have someone doing application management services. We have a hybrid solution which can change all this.”

The team have combined public cloud attributes of agility and tiered payment, and the outsourcing attributes of a core app, with an SLA and a performance guarantee, to create a hybrid proposition which they claim is unique for the industry. Walsh and his team now face the task of convincing European decision makers there is a feasible option to run core business applications in the cloud.

“The entire world has not shifted to cloud native,” said Walsh. “There is still a substantial amount of money spent by corporations around the world on running core business applications. We have a proposition which can run cloud native but can also run core business applications in the cloud, on demand and on consumption. No-one else in the industry does that. We can run all the systems on the same billing engine, the same cloud management tools and the same application management tools, which gives us a differentiator in the market.”

Data, data, data. The importance of backing up data

More often than not, when browsing the internet each morning you’ll soon discover that this morning is “Talk Like a Pirate Day”, or “Hug a Vegetarian Day”, or something equally humorous. Today’s awareness day, by contrast, holds some genuine use for the world as a whole.

World Backup Day encourages consumers to back up their family photos, home videos, documents and emails, on more than one device. The World Backup Day website lists numerous ways in which a consumer’s data or documents can be lost, however this day is also very applicable to the world of enterprise IT.

“The rapid increase in the amount of data that consumers and organisations store is one of the biggest challenges facing the backup industry,” says Giri Fox, Director of Technical Services at Rackspace. “Organisations aren’t always sure what data they should be keeping, so to make sure they don’t discard any important data they sometimes end up keeping everything which adds to this swell of data.

“For many companies, a simple backup tool is no longer enough to make sure all these company assets are safe and available, they need support in keeping up with the sheer scale of data and to fix problems when a valuable file or database goes missing.”

The volume of data being utilized (and in some cases not utilized) has grown astronomically, but to a certain degree security and employee behaviour have not kept pace with this growth. Cyber criminals always seem to be one step ahead of enterprises when attempting to access data, but what is more worrying is the trend of employee indifference to IT security.

A recent survey highlighted employee negligence and indifference to IT policy as one of the most significant inhibitors to cloud security, with only 35% of respondents saying that they use passwords at work.

Giri Fox, Director of Technical Services at Rackspace

“Over recent years, organisations have become far more aware of the importance of backing up their data and we’ve noticed the impact here at Rackspace, where we currently back up 120 PB per month globally,” adds Fox. “One of the main challenges for us is that businesses don’t just want to back up more data than ever before, they want it done quicker than ever before.

“Also, the process of doing so has become more complex than it used to be because companies are more conscious than ever of the compliance regulations they have to adhere to. Fortunately, with the development of deduplication techniques, we are now able to back up unique sections of data rather than duplicating large pools continuously, which has sped up the backup process.”
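The deduplication Fox describes can be sketched in a few lines: split the data into chunks, hash each chunk, and store only chunks that have not been seen before, so repeated backups of largely unchanged data consume little extra space. This is a minimal illustration of the block-level idea, not Rackspace’s actual implementation; all names here are hypothetical.

```python
import hashlib

def dedup_store(data: bytes, chunk_size: int, store: dict) -> list:
    """Split data into fixed-size chunks and keep only unseen ones.

    `store` maps chunk hash -> chunk bytes and persists across backups.
    Returns the list of hashes needed to reconstruct the data.
    """
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # only unique chunks consume space
            store[digest] = chunk
        recipe.append(digest)
    return recipe

def restore(recipe: list, store: dict) -> bytes:
    """Rebuild the original bytes from a recipe of chunk hashes."""
    return b"".join(store[d] for d in recipe)

store = {}
first = dedup_store(b"A" * 4096 + b"B" * 4096, 4096, store)
second = dedup_store(b"A" * 4096 + b"C" * 4096, 4096, store)  # reuses the "A" chunk
print(len(store))  # 3 unique chunks stored instead of 4
```

Two backups sharing a chunk store only three unique blocks between them; at petabyte scale, that saving is what makes the faster backups Fox mentions possible.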

Outside of employee indifference to cloud security, changes to EU-US data protection policy have highlighted the significance of data-backup and prevention of data loss. An instance of data loss could be crippling for an organization, whether it is financial penalties or the loss of information which could prove to be invaluable in the future.

“Initiatives like World Backup Day are a great way of highlighting the importance of backing up in an age where, as Ann Winblad put it, ‘data is the new oil’,” comments Fox.

In a world where data can be seen as one of the most important commodities for any business, the value of securing, backing up and encrypting data cannot be overstated. That said, the majority of the working world (outside of the IT department) does not appreciate the value of security, mostly not out of malice but because they don’t know any better.

“In the post-Edward Snowden era we’re also seeing just how seriously companies are thinking about encryption. Many companies now want to make sure their backed up data is no longer just encrypted when it goes outside the four walls of a data centre, but inside it as well,” says Fox.

G-Cloud – why being certified matters

It might surprise you to know that more than £900m worth of sales have now taken place via the G-Cloud platform since its launch. The Government initiated the G-Cloud program in 2012 to deliver computing based capability (from fundamental resources such as storage and processing to full-fledged applications) using cloud and it has been hugely successful, providing benefits to both customers and suppliers alike.

The G-Cloud framework is offered via the Digital Marketplace and is provided by The Crown Commercial Service (CCS), an organisation working to save money for the public sector and the taxpayer. The CCS acts on behalf of the Crown to drive savings for the taxpayer and improve the quality of commercial and procurement activity. The CCS’ procurement services can be used by central government departments and organisations across the public sector, including local government, health, education, not-for-profit and devolved administrations.

G-Cloud approves framework agreements with a number of service providers and lists those services on a publicly accessible portal known as the Digital Marketplace. This way, public sector organisations can approach the services listed on the Digital Marketplace without needing to go through a full tender process.

G-Cloud has substantial benefits both for providers and for customers looking to buy services. For vendors the benefit is clear – being awarded official supplier status for G-Cloud demonstrates that the company has met the standards laid out in the G-Cloud framework and is compliant with them. Furthermore, it opens up exciting new opportunities to supply the UK public sector while helping it reduce costs. Likewise, it brings recognition to the brand and further emphasises the company’s position as a reputable provider of digital services.

Where public sector organisations are concerned, G-Cloud gives quick and easy access to a roster of approved and certified suppliers that have been rigorously assessed, cutting down on the time to research and find such vendors in the marketplace. This provides companies with a head start in finding the cloud services that will best address their business and technical needs.

I am proud to say that iland was awarded a place on the G-Cloud framework agreement for supplying Infrastructure-as-a-Service (IaaS) and Disaster-Recovery-as-a-Service (DRaaS) at the end of last year. We deliver flexible, cost-effective and secure Infrastructure-as-a-Service solutions from data centres in London and Manchester, including Enterprise Cloud Services with Advanced Security and Compliance, Disaster-Recovery-as-a-Service and Cloud Backup.

So if you are looking to source a cloud provider, I would recommend that you start your search with those that have been awarded a place on the G-Cloud framework agreement. It is important to then work with prospective providers to ensure their platform, service level agreements, native management tools and support teams can deliver the solutions that best address your business goals as well as your security and compliance requirements. Ask questions up front. Ensure the provider gives you full transparency into your cloud environment. Get a demonstration. You will then be well on your way to capitalizing on the promises of cloud.

Written by Monica Brink, EMEA Marketing Director, iland

IBM: “The level of innovation is being accelerated”

Dr. Angel Diaz joined the research division of IBM in the late nineties, where he helped co-author many of the web standards we enjoy today. Nowadays, he’s responsible for all of IBM’s cloud and mobile technology, as well as architecture for its ambient cloud. Here, ahead of his appearance at Container World (February 16 – 18, Santa Clara Convention Center, CA) later this month, BCN caught up with him to find out more about the tech giant’s evolving cloud strategy.

BCN: How would you compare your early days at IBM, working with the likes of Tim Berners-Lee, with the present?

Dr. Angel Diaz: Back then, the industry was focused on developing web standards for a very academic purpose, in particular the sharing of technical information. IBM had a strategy around accelerating adoption and increasing skill. This resulted in a democratization of technology, by getting developers to work together in open source and standards. If you fast forward to where we are now with cloud, mobile, data, analytics and cognitive, you see a clear evolution of open source.

The aperture of open source development and ecosystems has grown to include users and is now grounded on solid open governance and meritocracy models. What we have built is an open cloud architecture, starting with an open IaaS based on OpenStack, an open PaaS with Cloud Foundry and an open container model with the Open Container Initiative and the Cloud Native Computing Foundation. When you combine an open cloud architecture with open APIs defined by the Open API Initiative, applications break free. I have always said that no application is an island – these technologies make it so.

What’s the ongoing strategy at IBM, and where do containers come into it?

It’s very much hybrid cloud. We’ve been leveraging containers to help deliver hybrid applications and accelerate development through devOps, so that people can transform and improve their business processes. This is very similar to what we did in the early days of the web – better business processes mean better business. At the end of the day, the individual benefits. Applications can be tailored to the way we like to work, and the way that we like to behave.

A lot of people in the container space, say, wow, containers have been around a long time, why are we all interested in this now? Well, it’s gotten easier to use, and open communities have rallied around it, and it provides a very nice way of marrying concepts of operations and service oriented architecture, which the industry missed in the 2000s.

What does all this innovation ultimately mean for the ‘real world’?

It’s not an exact analogy, but if we remember the impact of HTML, JavaScript – they allowed almost anyone to become a webmaster. That led to the Internet explosion. If you look at where we are now, what we’re doing with cloud: that stack of books you need to go buy has been reduced, the concept count of things you need to know to develop an application, the level of sophistication of what you need to know in order to build an application, scale an application, secure an application, is being reduced.

So what does that do? It increases participation in the business process, in what you end up delivering. Whether it’s human facing or whether it’s an internal business process, it reduces that friction and it allows you to move faster. What’s starting to happen is the level of innovation is being accelerated.

And how do containers fit into this process? 

Previously there was this strict line: you develop software and then operate it and make tweaks, but you never really fundamentally changed the architecture of the application. Because of the ability to quickly stand up containers, to quickly iterate, etc., people are changing their architectures because of operations and getting better operations because of it. That’s where the microservices notion comes in.

And you’ll be talking at Container World. What message are you bringing to the event?

My goal is to help people take a step back and understand the moment we’re in, because sometimes we all forget that. Whether you’re struggling with security in a Linux kernel or trying to define a micro service, you can forget what it is you’re trying to accomplish.

We are in a very special moment: it’s about the digital disruption that’s occurring, and the container technology we’re building here allows much quicker iteration on the business process. That’s one dimension. The second is what IBM is doing, not just in our own implementation of containers but in the open source world, to help democratize the technology, so that the level of skill and the number of people who build on this grow.

AWS – we view open source as a companion

In one of the last installments of our series marking the upcoming Container World (February 16 – 18, Santa Clara Convention Center, CA, USA), BCN talks to Deepak Singh, General Manager of Amazon EC2 Container Service, AWS.

Business Cloud News: First of all – how much of the container hype is justified would you say?

Deepak Singh: Over the last 2-3 years, starting with the launch of Docker in March 2013, we have seen a number of AWS customers adopt containers for their applications. While many customers are still early in their journey, we have seen AWS customers such as Linden Labs, Remind, Yelp, Segment, and Gilt Group all adopt Docker for production applications. In particular, we are seeing enterprise customers actively investigating Docker as they start re-architecting their applications to be less monolithic.

How is the evolution of containers influencing the cloud ecosystem?

Containers are helping people move faster towards architectures that are ideal for the AWS cloud. For example, one of the common patterns we have seen with customers using Docker is to adopt a microservices architecture. This is especially true for our enterprise customers who see Docker as a way to bring more applications onto AWS.

What opportunities does this open up to AWS?

For us, it all comes down to customer choice. When our customers ask us for a capability, we listen. They come to us because they want something the Amazon way: easy to use, easy to scale, lower cost, and where they don’t have to worry about the infrastructure running behind it.

As mentioned, many of our customers are adopting containers and they expect AWS to support them. Over the past few years we have launched a number of services and features to make it easier for customers to run Docker-based applications. These include Docker support in AWS Elastic Beanstalk and the Amazon EC2 Container Service (ECS). We also have a variety of certified partners that support Docker and AWS and integrate with various AWS services, including ECS.

What does the phenomenon of open source mean to AWS? Is it a threat or a friend?

We view open source as a companion to AWS’s business model. We use open source and have built most AWS services on top of open source technology. AWS supports a number of open source applications, either directly or through partners. Examples of open source solutions available as AWS services include Amazon RDS (which supports MySQL, Postgres, and MariaDB), Amazon Elastic MapReduce (EMR), and Amazon EC2 Container Service (ECS). We are also an active member of the open source community. The Amazon ECS agent is available under an Apache 2.0 license, and we accept pull requests and allow our customers to fork our agent as well. AWS contributes code to Docker (e.g. CloudWatch logs driver), and was a founder member of the Open Container Initiative, which is a community effort to develop specifications for container runtimes.

As we see customers asking for services based on various open source technologies, we’ll keep adding those services.

You’ll be appearing at Container World this February. What do you think the biggest discussions will be about?

We expect customers will be interested in learning how they can run container-based applications in production, the most popular use cases, and hear about the latest innovations in this space.

Cloud academy: Rudy Rigot and his new Holberton School

Business Cloud News talks to Container World (February 16 – 18, 2016 Santa Clara Convention Center, USA) keynote Rudy Rigot about his new software college, which opens today.

Business Cloud News: Rudy, first of all – can you introduce yourself and tell us about your new Holberton School?

Rudy Rigot: Sure! I’ve been working in tech for the past 10 years, mostly in web-related stuff. Lately, I’ve worked at Apple as a full-stack software engineer for their localization department, which I left this year to found Holberton School.

Holberton School is a 2-year community-driven and project-oriented school, training software engineers for the real world. No classes, just real-world hands-on projects designed to optimize their learning, in close contact with volunteer mentors who all work for small companies or large ones like Google, Facebook, Apple, … One of the other two co-founders is Julien Barbier, formerly the Head of Community, Marketing and Growth at Docker.

Our first batch of students started last week!

What are some of the challenges you’ve had to anticipate?

Since we’re a project-oriented school, students are mostly graded on the code they turn in, which they push to GitHub. Some of this code is graded automatically, so we needed to be able to run each student’s code (or each team’s code) automatically, in a fair and equal way.

We needed to get information on the “what” (what is returned in the console), but also on the “how”: how long does the code take to run? How much resource is being consumed? What is the return code? Also, since Holberton students are trained on a wide variety of languages, how do you ensure you can grade a Ruby project, and later a C project, and later a JavaScript project, with the same host while minimizing issues?

Finally, we had to make sure that a student can commit code that is as malicious as they want; we can’t have a human check it before running it, and it should only break their program, not the whole host.
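The measurement side of such a runner can be sketched with Python’s `subprocess` module: execute the submission with a hard timeout and capture stdout, the return code and wall-clock time. This is only an illustrative sketch, not HolbertonCloud itself; in particular, the isolation Rigot describes comes from running inside a container, which is omitted here, and the command line is a placeholder.

```python
import subprocess
import time

def grade_run(cmd: list, timeout_s: float = 5.0) -> dict:
    """Run a submission and record the 'what' (stdout, return code)
    and the 'how' (wall-clock seconds, whether it timed out)."""
    start = time.monotonic()
    try:
        proc = subprocess.run(
            cmd, capture_output=True, text=True, timeout=timeout_s
        )
        return {
            "stdout": proc.stdout,
            "returncode": proc.returncode,
            "seconds": time.monotonic() - start,
            "timed_out": False,
        }
    except subprocess.TimeoutExpired:
        # An infinite loop kills only this run, not the grading host.
        return {"stdout": "", "returncode": None,
                "seconds": timeout_s, "timed_out": True}

result = grade_run(["python3", "-c", "print('hello')"])
print(result["returncode"], result["stdout"].strip())
```

Swapping the command for `ruby`, `gcc`-compiled binaries or `node` is what makes the per-language Docker images attractive: the runner stays the same while the image supplies the toolchain.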

So how on earth do you negotiate all these?

Our project-oriented training concept is new in the United States, but it’s been successful for decades in Europe. The European schools, which built their programs before containers became mainstream, typically run the code directly on a host system that has all of the software they need installed, and then simply chroot before running the student’s code. This didn’t solve all of the problems, while containers did, in a very elegant way; so we took the container road!

HolbertonCloud is the solution we built to that end. It fetches a student’s code on command, then runs it based on a Dockerfile and a series of tests, and finally returns information about how that went. The information is then used to compute a score.
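The final scoring step can be as simple as a function over the per-test results. The weighting and the time penalty below are purely hypothetical, chosen to illustrate the idea rather than to reproduce HolbertonCloud’s actual algorithm:

```python
def compute_score(tests, time_budget_s=2.0):
    """Combine per-test results into a 0-100 score.

    `tests` is a list of (passed, seconds) tuples; passing runs that
    exceed the (hypothetical) time budget lose part of their credit.
    """
    if not tests:
        return 0.0
    total = 0.0
    for passed, seconds in tests:
        if not passed:
            continue
        credit = 1.0
        if seconds > time_budget_s:   # penalise slow solutions
            credit *= time_budget_s / seconds
        total += credit
    return round(100.0 * total / len(tests), 1)

print(compute_score([(True, 0.5), (True, 4.0), (False, 0.1)]))  # 50.0
```

Keeping the scoring as a pure function over run results is what lets the infrastructure stay trivial: the Docker side only has to produce those (passed, seconds) observations.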

What’s amazing about it is that by using Docker, building the infrastructure has been trivial; the hard part has been about writing the tests, the scoring algorithm … basically the things that we actively want to be focused on!

So you’ve made use of containers. How much disruption do you expect their development to engender over the coming years?

Since I’m personally more on the “dev” end of devops, I see how striking it is that containers restore focus on actual development for my peers. So I’m mostly excited by the innovation that software engineers will be focusing on instead of the issues that containers are taking care of for them.

Of course, it will be very hard to measure which of those innovations were able to exist because containers are involved; but they will be innovations touching virtually every corner of the tech industry, so that’s really exciting!

What effect do you think containers are going to have on the delivery of enterprise IT?

I think one takeaway from the very specific HolbertonCloud use case is that cases where code can be run trivially in production are getting rare, and one needs guarantees that only containers can bring efficiently.

Also, a lot of modern architectures fulfil needs with systems that are made of more and more micro-services, since we now have enough hindsight to see the positive outcomes for their resilience. Each micro-service may have different requirements and may therefore be best built with a different technology, so managing a growing set of software configurations is increasingly relevant. Considering the positive outcomes, this trend will only keep growing, making the need for containers keep growing as well.

You’re delivering a keynote at Container World. What’s the main motivation for attending?

I’m tremendously excited by the stellar line-up! We’re all going to get amazing insight from many different and relevant perspectives, that’s going to be very enlightening!

The very existence of Container World is exciting too: it’s remarkable how far containers have come over the span of just a few years.

Click here to learn more about Container World (February 16 – 18, 2016 Santa Clara Convention Center, USA)