Faster deployment, easier updates, more secure operation, fewer resources: Containers make life easier for developers and lower the costs of cloud resources. Nevertheless, many companies are still relying on classic virtual machines (VMs). In an interview, cloud expert Nils Magnus reveals why this is changing and what developers should pay attention to when using containers.
Containers have been around for years in the IT world. Is the topic even still relevant?
Nils Magnus: Granted, the idea of packaging applications in containers and running them independently of the software and hardware environment is not new. The first implementations, some of which are still in use today, can be traced back to 1979, and related ideas were sketched out even earlier. Although these proposals were well known in specialist circles, it took the computing and, above all, the storage capacities of our generation to implement the container idea across the board. Today this is no longer a problem.
Containers became really practical and feasible for the vast majority of developers in 2013, when the company Docker, Inc. launched its product of the same name. Docker is now the de facto standard for containers. In 2015, Google completed the concept with its Kubernetes project, releasing management software for entire clusters of containers under an open source license. In tandem, the two products deal with much of what had irritated system administrators for years: the choice of hardware manufacturer, storage drivers, network addressing, peculiarities of the operating system and ultimately even the cloud in which a Kubernetes cluster runs its applications. This works very well for those who don't want to rely on just one provider, i.e. who take a multi-cloud approach.
How relevant are containers for multi-cloud structures?
Nils Magnus: Very relevant. With containers, for example, I can achieve a high degree of resilience across several cloud regions. Or, when it comes to the costs for virtual machines, I can decide on a daily basis in which region a container cluster should run. Or I can spontaneously move complete clusters to another region if laws or regulations suddenly change.
Does this mean that cloud containers make companies independent of individual providers?
Nils Magnus: Only under certain conditions. Even when using containers, one risk remains: companies may be tempted by convenient services that are only available from one particular provider. Those who build on such proprietary functions can no longer simply move their container clusters, because after the move the applications won't run without these special functions.
The Open Telekom Cloud’s Cloud Container Engine is just such a feature: It makes container handling more convenient.
Nils Magnus: That’s true, but the Open Telekom Cloud’s Cloud Container Engine (CCE) doesn’t make the user dependent in any way. It simplifies the administration of Docker containers and reduces the effort required to set up and maintain a cluster. The result, however, is an unadulterated Kubernetes cluster that can be used with the many standard tools available. Without the CCE, DevOps employees have to install Docker individually on each virtual machine and then install and connect a range of services. This may sound like little effort at first, but it adds up quickly, especially if the cluster is to grow larger – for example, to cope with the onslaught of the Christmas business – or shrink again to save costs after the festive hustle and bustle. Although it’s possible to set up entire clusters manually, it can be very laborious; even an experienced developer may need several days, depending on the size and complexity. This is where the Cloud Container Engine comes into play, as it allows developers to assemble Kubernetes clusters within ten minutes using the web console and a few sliders.
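Because the result is standard Kubernetes, the usual manifests work unchanged. As a sketch – the application name, image and replica count are illustrative, not taken from the interview – a minimal Deployment that runs on a CCE cluster just as on any other conformant cluster might look like this:

```yaml
# Minimal Kubernetes Deployment manifest; applies unchanged to any
# conformant cluster, including one assembled via the CCE console.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webshop            # illustrative name
spec:
  replicas: 3              # three identical container instances
  selector:
    matchLabels:
      app: webshop
  template:
    metadata:
      labels:
        app: webshop
    spec:
      containers:
      - name: webshop
        image: nginx:1.25  # any Docker image; nginx used as a stand-in
        ports:
        - containerPort: 80
```

Applied with `kubectl apply -f deployment.yaml`, the same file works identically whether the cluster was set up by hand or assembled in ten minutes via the CCE.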
How much does the Open Telekom Cloud’s Cloud Container Engine service cost?
Nils Magnus: The added value through simplified installation and maintenance means there’s no added cost. We provide the engineering outlay, while customers only pay for the servers on which their workload runs.
If containers offer so many advantages, why hasn't everyone been using this virtualization technology for longer?
Nils Magnus: In my opinion, the majority of developers, users and companies will be using this technology in the foreseeable future. However, containers also require a certain amount of rethinking, the necessary expertise and the time to really engage with them. At the moment, many companies are still facing other challenges relating to digitalization – many are only just introducing cloud solutions. Occasionally you hear that the cloud is just the same server in a different location, so you can transfer your IT workloads there one-to-one. This "lift and shift" approach works, but its benefits are limited: it simply doesn't exploit the possibilities of the cloud – what we call "cloud native."
When Docker and Kubernetes are used properly, applications can be scaled as needed, updated much faster, and the entire development process can be automated more fully. This allows developers to focus on the really exciting questions: their applications, which generate real added value that can be quantified in cash. Since this duo is available on practically every cloud and for every server, multi-cloud structures can be established with little effort. So if you don't want to tie yourself unconditionally to a single manufacturer, make sure you use Kubernetes and Docker as your container technology. We offer this option with the Cloud Container Engine on our Open Telekom Cloud. It combines conformity with the standard with the convenience of a managed service. This is the first step into the multi-cloud.
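The on-demand scaling mentioned here – growing for the Christmas rush, shrinking again afterwards – can be automated with a standard Kubernetes HorizontalPodAutoscaler. A sketch; the names and thresholds are invented for illustration:

```yaml
# Standard HorizontalPodAutoscaler (autoscaling/v2); works on any
# conformant Kubernetes cluster, which keeps the multi-cloud door open.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: webshop-hpa        # illustrative name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: webshop          # the Deployment to scale (illustrative)
  minReplicas: 2           # baseline outside peak season
  maxReplicas: 20          # ceiling for the seasonal rush
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70   # add pods above 70% average CPU load
```

Kubernetes then adds or removes container instances automatically, so capacity – and cost – follows the actual load rather than a fixed server count.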
What exactly is a container?
Containers solve a fundamental problem in software development:
Applications depend on operating systems, versions and software libraries to function properly. Users of desktop PCs know this from Windows programs: For example, if you want to install a game, you need a suitable Windows version and certain drivers for the graphics or sound – if you don't have them, the software won't work.
It gets particularly tricky when two different games require different versions of the same library. What to do? Either the user gives up one of the two games, or buys a second computer and installs the other version of the required software there. That allows both applications to be used, but it's very expensive – an unsatisfactory solution that makes little sense.
Containers: Everything software needs under one roof
The situation is similar in the development of enterprise software. Containers solve this dilemma by encapsulating each of the competing applications: a container holds everything the software needs to function – including all the libraries. The advantage is that several processes can now share the same server, where previously each application needed its own. Containers help to avoid version conflicts, since each container only contains the parts its application needs. A classic server, on the other hand, is a complex all-round machine and therefore much more difficult to maintain.
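This "everything under one roof" idea is visible in a Dockerfile, the recipe from which a Docker image is built. A minimal sketch – the application layout and file names are invented for illustration:

```dockerfile
# The base image pins the operating-system userland and Python version
FROM python:3.11-slim

WORKDIR /app

# The container carries its own library versions, so a second container
# on the same host can use conflicting ones without any clash.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself into the image
COPY . .

# One application process per container
CMD ["python", "app.py"]
```

Two images built from Dockerfiles like this, each with a different `requirements.txt`, run side by side on the same server – exactly the version conflict that previously forced a second machine.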
This means that containers make applications less resource-intensive, more flexible and faster: while a virtual machine takes a few minutes to boot, containers are ready to go in a fraction of that time, because they don't boot an operating system of their own – they share the kernel of the host they run on.