
A flying start into artificial intelligence

by Redaktion
The cloud offers suitable options for artificial intelligence – depending on company requirements and strategy
 

In this article, you will learn:

• what options the cloud offers for developing artificial intelligence,
• how pre-trained foundation models can support you in the process,
• and how you can access a large language model via GitHub.

Generative AI (GenAI) was on everyone's lips in 2023 – and it sparked the imagination of many, especially non-experts. Many companies fantasize about all kinds of application scenarios for artificial intelligence (AI), painting a science-fiction world in which super-smart assistants make decisions largely autonomously and work side by side with humans.

Current use cases for AI – productivity-enhancing, powerful, but unspectacular  

Many current use cases are actually not all that spectacular. Nonetheless, they bring a considerable increase in productivity by relieving employees of tedious tasks that tie up productive time: summarizing documents or entire document collections, recapping meetings, searching for relevant information in the depths of company filing systems, pre-filling forms, or generating program code. And companies can already leverage this efficiency potential today – even without extensive in-house AI expertise.

Getting into AI – even without AI know-how?

According to PAC analysts, a lack of expertise is currently the main hurdle on the path into the world of AI. But even companies that have the relevant expertise face challenges such as: where do we get the necessary infrastructure resources to develop and train our own models? This is not the only area where the cloud can make a decisive contribution: it provides developers with urgently needed resources such as GPUs at short notice and in line with demand. Training AI models is one of the most sensible use cases for the cloud.
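
To make this concrete, the following minimal sketch shows how a GPU-backed compute instance could be provisioned programmatically on an OpenStack-based cloud such as the Open Telekom Cloud. It assumes the openstacksdk Python client; the image, flavor, network, and key pair names are placeholders chosen for illustration, not values taken from this article.

import openstack

# Credentials are read from clouds.yaml or OS_* environment variables.
conn = openstack.connect(cloud="otc")

# Look up an OS image, a GPU flavor, and a network (all names are placeholders).
image = conn.compute.find_image("Standard_Ubuntu_22.04_latest")
flavor = conn.compute.find_flavor("p2v.2xlarge.8")
network = conn.network.find_network("ai-training-net")

# Launch the instance and wait until it is active.
server = conn.compute.create_server(
    name="llm-training-node",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
    key_name="my-keypair",
)
server = conn.compute.wait_for_server(server)
print(server.status)  # ACTIVE once the GPU node is ready for training jobs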

Designing an AI strategy to suit the company

The starting point for the use of AI therefore varies from company to company, and the boundaries between the possible approaches are blurred. There is now a comprehensive portfolio of AI offerings that companies can use, particularly foundation models such as large language models (LLMs).

SaaS offerings provide standardized services that can be adapted to the specifics of each company, while IaaS offerings are aimed at in-house developers who want to minimize the effort involved in managing infrastructure resources. Somewhere in between sit PaaS services that offer frameworks for AI development, pre-trained foundation models, or specific AI solution modules.

Quick AI solution with cloud resources

Together with its partners, the Open Telekom Cloud has illustrated approaches to AI at various events. The focus was on how users can quickly get a working AI solution up and running with the help of existing (open-source) services. Infrastructure resources from the cloud always play a role here – but they remain hidden “under the hood”. What is often overlooked: not only does developing a foundation model require large GPU resources; operating an AI service also needs a powerful, GPU-backed basis.

Infrastructure as a service for custom AI models

“The AI strategy must fit the company. The services obtained from a cloud should be planned accordingly,” explains Holger Schultheiß, Chief Product Owner of the Open Telekom Cloud. With the IaaS approach, only infrastructure resources are obtained, e.g., to develop or fine-tune your own highly specialized AI model. However, this is only worthwhile if it delivers distinct added value and sufficient suitable data is available. In any case, advice from an experienced partner is important for planning and implementation.

This also applies to a less differentiating “85% solution”. “As a rule, you don't need a specially trained LLM for this, but a well-developed system around an (existing) LLM,” explains Dr. Tim Delbrügger, AI lead at iits Consulting. “We usually clarify whether this is sufficient in initial workshops, where the customer's needs, the data situation, and technical implementation options are evaluated.” “The greatest added value is usually generated through the rapid, direct integration of pre-trained AI into existing business processes,” summarizes the AI expert.

Software as a service – the fastest way for standard scenarios

The SaaS approach is the opposite. It is aimed at pragmatists who want to quickly utilize the potential of existing standard solutions for their company. One example is an AI-supported enterprise search (such as the one from amberSearch). It is essentially based on two components: a pre-trained LLM with a basic ability to chat is linked to RAG (retrieval-augmented generation) functionality. When the LLM “chats” with a user, it also accesses internal company data and thus allows a conversation in the specific company context – without the internal data being incorporated into the model itself. Such a service, e.g., for finding information within the company, can be implemented in just a few hours.
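
The following minimal sketch illustrates this pattern in general terms, not the amberSearch product itself: relevant internal documents are retrieved for a question and handed to the LLM as context. The sample documents, the keyword-overlap scoring, and the prompt layout are simplified placeholders; a production setup would use vector embeddings and a real LLM endpoint.

# Minimal retrieval-augmented generation (RAG) flow: retrieve the most relevant
# internal documents for a question, then pass them to an LLM as context.
internal_docs = {
    "vacation-policy.txt": "Employees are entitled to 30 days of paid vacation per year.",
    "travel-guideline.txt": "Business trips must be approved by the team lead in advance.",
    "it-security.txt": "Passwords must be rotated every 90 days and stored in the vault.",
}

def retrieve(question: str, docs: dict, top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    q_terms = set(question.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda item: len(q_terms & set(item[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:top_k]]

def build_prompt(question: str, context: list[str]) -> str:
    """Combine the retrieved company context and the user question into one prompt."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        "Answer the question using only the company context below.\n"
        f"Context:\n{context_block}\n\nQuestion: {question}\nAnswer:"
    )

question = "How many vacation days do employees get?"
prompt = build_prompt(question, retrieve(question, internal_docs))
print(prompt)  # In practice, this prompt is sent to the pre-trained LLM endpoint.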

Platform as a service – efficient use of AI building blocks

“PaaS offerings are the all-weather tire of AI,” says Schultheiß. Based on pre-developed and pre-trained building blocks (e.g., LLMs), customized AI solutions can be developed that are specifically adapted to a company's requirements, e.g., for confidentiality, without any compromises. In addition, seamless integration into business processes is possible by connecting various third-party systems, while an LLM hub gives developers access to foundation models that can be easily addressed via API and added to infrastructures via Terraform.
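
As a rough illustration of what “addressed via API” can look like, the sketch below sends a chat request to an LLM endpoint. It assumes an OpenAI-compatible chat completions interface; the URL, model name, and token are placeholders for illustration, not documented Open Telekom Cloud interfaces.

import os
import requests

# Placeholder endpoint and model name; a real LLM hub deployment provides its own.
LLM_HUB_URL = "https://llm-hub.example.internal/v1/chat/completions"
API_TOKEN = os.environ["LLM_HUB_TOKEN"]

response = requests.post(
    LLM_HUB_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "model": "example-llm",
        "messages": [
            {"role": "system", "content": "You are an assistant for internal company data."},
            {"role": "user", "content": "Summarize yesterday's incident report."},
        ],
        "temperature": 0.2,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])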

“By using our ready-made Terraform modules, a company can get started very easily, select an LLM and, for example, talk to a chatbot with its own data in its own infrastructure,” summarizes Delbrügger, AI architect.

Use foundation models cleverly – and confidently

“Ultimately,” says Schultheiß, “everything currently revolves around the correct use of existing foundation models. They can be further trained, fine-tuned, or enriched via RAG. There are various deployment models geared towards companies' security requirements. For example, if a company wants to develop code using an LLM, a private installation or access to a hardened version in a sovereign cloud such as the Open Telekom Cloud is worthwhile. This ensures that the data used remains in-house and that the generated results remain the legally secure intellectual property of the user.”
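
One common way to fine-tune such a foundation model within your own infrastructure is parameter-efficient fine-tuning with LoRA adapters. The sketch below uses the Hugging Face transformers and peft libraries as an example toolchain, which the article itself does not prescribe, and a small public model as a stand-in for the foundation model a company would actually choose.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

# Example base model; in practice this would be the chosen foundation model.
BASE_MODEL = "distilgpt2"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)  # needed later to prepare training data
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# LoRA attaches small trainable adapter matrices to the frozen base model, so
# fine-tuning on company data needs far less GPU memory than full retraining.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    target_modules=["c_attn"],  # attention projection in GPT-2-style models
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights is trainable

# The adapted model would now be trained on in-house data (e.g. with transformers.Trainer)
# and served from a private installation so data and results stay inside the company.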

