Artificial intelligence: How GPUs increase speed [Video]
CPU, FPGA, GPU: In the video, Max Guhl explains the role that GPUs can play in AI deployments.
In this article you will read about:
- why GPUs are often well suited to AI applications,
- which application areas they specialize in, and
- why GPUs are experiencing a renaissance in the context of AI applications.
AI is being used in more and more practical applications, and acceptance of artificial intelligence is growing in parallel. People in Germany are less skeptical about AI applications than they were three years ago, according to a recent survey by the industry association Bitkom. More than two thirds of respondents (68 percent) now see AI primarily as an opportunity, while 29 percent see it primarily as a threat. In 2017, proponents and skeptics were still evenly split.
Algorithms are considered artificially intelligent if they enable machines to make decisions on their own. They learn this primarily from data: the algorithms recognize patterns and regularities in large volumes of data and use them to decide how a process should continue. In other words, they draw on existing examples and experience to optimize their course of action.
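To make the idea of learning from examples concrete, here is a minimal sketch, assuming scikit-learn is installed; the sensor readings and labels are purely hypothetical values invented for illustration. The model extracts a pattern from labelled examples and then applies it to a new, unseen case.

```python
# Minimal sketch of "learning from examples" (assumes scikit-learn is installed).
# The data below is entirely hypothetical and exists only for illustration.
from sklearn.tree import DecisionTreeClassifier

# Known examples: one feature (e.g. a sensor reading) and a yes/no label.
X = [[10], [20], [35], [50], [70], [90]]
y = [0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier()
model.fit(X, y)                  # the algorithm recognizes the pattern in the examples

print(model.predict([[60]]))     # ...and applies it to a new, unseen value
```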
Data volumes in AI applications are currently growing so fast that the demands on the infrastructure keep rising. Data has to be collected and analyzed in the shortest possible time so that concrete decisions can be made, in automated driving, for example. Because GPUs work in parallel and are bandwidth-optimized, they are particularly well suited to computationally intensive workloads, especially in machine learning, deep learning, and visualization.
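As a rough illustration of that parallelism, here is a minimal sketch, assuming PyTorch and a CUDA-capable GPU are available: the same large matrix multiplication, the core operation behind most deep-learning workloads, is timed once on the CPU and once on the GPU.

```python
# Minimal timing sketch (assumes PyTorch and a CUDA-capable GPU).
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On the CPU, a handful of cores work through the multiplication.
start = time.time()
a @ b
cpu_seconds = time.time() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()   # move the data to GPU memory
    torch.cuda.synchronize()            # wait until the transfer is done
    start = time.time()
    a_gpu @ b_gpu                       # thousands of GPU cores compute in parallel
    torch.cuda.synchronize()            # wait until the kernel has finished
    gpu_seconds = time.time() - start
    print(f"CPU: {cpu_seconds:.2f} s, GPU: {gpu_seconds:.2f} s")
```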
A GPU's architecture is designed for parallel processing, a characteristic that matters greatly for AI algorithms, especially neural networks. Service providers increasingly offer GPU performance as a service from the cloud. The big advantage: companies use and pay for computing power only when they actually need it.
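The following sketch, again assuming PyTorch, shows how naturally neural networks map onto this architecture: moving a model to the GPU with `.to(device)` is enough for the matrix operations behind each layer to run there in parallel. The network, the batch size, and the data are hypothetical placeholders.

```python
# Minimal sketch of one training step on the GPU (assumes PyTorch).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Small, made-up fully connected network; .to(device) places its weights on the GPU.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
).to(device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Dummy batch standing in for real training data (hypothetical shapes and values).
inputs = torch.randn(256, 128, device=device)
labels = torch.randint(0, 10, (256,), device=device)

# One training step: forward pass, loss, backward pass, weight update;
# every tensor operation is dispatched to the device and runs in parallel there.
optimizer.zero_grad()
loss = loss_fn(model(inputs), labels)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```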
In the new video, Max Guhl from the A.I.-Team at T-Systems takes a look behind the scenes: why are GPUs so powerful, and which AI application areas are they particularly well suited to?
Do you have questions?
We answer your questions about testing, booking, and use, free of charge and tailored to your needs. Give it a try! Hotline: 24 hours a day, 7 days a week
0800 33 04477 from Germany / 00800 33 04 47 70 from abroad