
0800 3304477 24 hours a day, seven days a week

Write an E-mail 

Book now and claim starting credit of EUR 250

AI Foundation Services

Large Language Models (LLMs) on the Open Telekom Cloud
 

Use LLMs!

Artificial intelligence (AI) has revolutionized the business world. AI technology offers efficient solutions for pressing business challenges, massively increasing employee productivity and enabling the automation of many processes. Many companies are already using AI for enterprise knowledge management and enterprise search, for internet research on specific topics, for customer support on service hotlines, for training, for evaluations, and more.

 
 

The use of LLMs with RAG shortens the time to a finished AI service by 60 to 70 percent.

AI Foundation Services allow the cost-effective use of LLMs.

 
 
 

GenAI and RAG – the fast track to your goal

Generative AI (GenAI) has made a name for itself, especially since 2022. With GenAI, ready-made AI modules can be used quickly and easily in specific business solutions – without the need for extensive training. The secret behind GenAI is Large Language Models (LLMs), which have already undergone comprehensive AI training and only need to be adapted to a specific business context or individual application scenario.  

This can be easily achieved using a RAG approach. RAG stands for Retrieval Augmented Generation. Internal data sources are linked to the trained LLM, from which the AI receives the specific knowledge it needs for its application. There is a wealth of such LLMs on the market that are optimized for specific scenarios. Some are proprietary, some are open source. 
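The retrieve-then-generate idea behind RAG can be sketched in a few lines. This toy example uses naive keyword overlap instead of a real vector database, purely to illustrate the pattern; it is not the Open Telekom Cloud RAG service itself.

```python
# Minimal sketch of the RAG pattern: retrieve relevant internal context
# first, then hand it to the LLM together with the question. The retriever
# here is a toy keyword scorer; production setups use a vector database.

def retrieve(question: str, documents: list[str], top_k: int = 1) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(question: str, documents: list[str]) -> str:
    """Augment the user question with the retrieved internal knowledge."""
    context = "\n".join(retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "Travel expenses must be filed within 30 days.",
    "The cafeteria opens at 11:30.",
]
prompt = build_prompt("When must travel expenses be filed?", docs)
print(prompt)
```

The prompt sent to the LLM now contains only the company-internal snippet that matches the question, which is exactly how RAG injects specific knowledge into a generic, pre-trained model.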

 
 

Want to quickly develop your own AI service with LLMs?

The Open Telekom Cloud offers you access to various LLMs, a RAG service, and the pre-configured T-Systems Smart Chat as part of the AI Foundation Services. The services are hosted and operated on the Open Telekom Cloud, so you do not need to plan your own GPU resources for their operation. All you need is an Open Telekom Cloud account and API access.

[Infographic: AI Foundation Services – develop your own AI services]
 

LLM Serving Service API

With the LLM Serving Service, we offer you the option of using an LLM in a shared (more cost-effective) variant or in a “private”/dedicated variant reserved for your company. The LLMs offered vary. The models currently available are Mistral AI (Mistral, Nemo), Meta Llama (3.1 70B, Code Llama 2) and GPT-4 (OpenAI). Other LLMs such as GPT-4o, Claude 3, Gemini 1.5 Pro and further Mistral variants are available on request. This makes you independent of any single LLM provider and lets you try out different LLMs to identify the most suitable (most cost-effective and most capable) one for your use case.
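The page refers to an OpenAI-compatible API. As an illustration only, here is a minimal sketch of the request body such an endpoint expects; the base URL and model identifier below are placeholders, not official Open Telekom Cloud values — the actual ones come with your API key.

```python
import json

# Sketch of a chat completion request against an OpenAI-compatible endpoint.
# BASE_URL and the model name are placeholders for illustration only.
BASE_URL = "https://llm-serving.example.com/v1"  # placeholder endpoint

def build_chat_request(model: str, question: str) -> dict:
    """Assemble the JSON body for POST {BASE_URL}/chat/completions."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": question},
        ],
        "temperature": 0.2,
    }

body = build_chat_request("mistral-nemo", "Summarize our vacation policy.")
print(json.dumps(body, indent=2))
```

Because the request shape follows the OpenAI standard, existing OpenAI client libraries can be pointed at the serving endpoint simply by overriding their base URL, which is what makes switching between the offered models straightforward.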

Smart Chat API

The RAG service is also easily available via API. Connect your internal data sources to the LLM, choose from over 50 retrieval settings and generate a vector database that runs in your company's own (private) instance on the Open Telekom Cloud. The RAG service supports various data formats (docx, pdf, xlsx, etc.) and can also extract data from diagrams. The RAG approach offers an additional level of security: your data stays with you.
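Before a vector database can be generated, documents are typically split into overlapping chunks so that each retrieval hit keeps its surrounding context. The sketch below illustrates this preprocessing step; the chunk size and overlap are illustrative values, not Smart Chat API defaults.

```python
# Sketch of preparing a document for vector-database ingestion: split text
# into overlapping character chunks. Sizes here are illustrative only.

def chunk_text(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    """Split text into chunks of `size` characters, overlapping by `overlap`."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "RAG links internal data sources to a trained LLM so the model can ground its answers."
chunks = chunk_text(doc)
print(chunks)
```

Each chunk would then be embedded and stored; the overlap ensures that a sentence cut at a chunk boundary is still fully retrievable from at least one chunk.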

 

T-Systems Smart Chat

Want to try out AI even faster? Maybe our ready-to-use T-Systems Smart Chat is right for you. Based on the LLM Serving Service and the T-Systems Smart Chat API, T-Systems Smart Chat offers a browser-based chat interface that allows you to ask questions about your documents in natural language. 

Our experts can adapt the chat to your specific application scenario as part of a project. With your confidential documents and information as a knowledge base, your specific AI assistant is created, e.g., for research in document pools, automated summaries, comparative analyses of documents, or automatic text creation and optimization.

Coming soon: Fine Tuning API

Optimize your AI models quickly and efficiently with our LLM fine-tuning service. The service enables you to customize open-source models such as Llama 3.1 and Mistral to your individual requirements. Using our API, you can perform fine tuning with LoRA- or DPO/RLHF-based methods: upload your own training data and the fine tuning then runs automatically. Because the API follows the OpenAI standard, it integrates seamlessly into existing systems and workflows. In addition, the trained models can be deployed flexibly on a shared or private instance, depending on what best suits your requirements. Experience how easily and quickly you can adapt and optimize your AI models.
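Training data for chat-style fine tuning is commonly supplied as JSONL in the OpenAI message format the service references. A sketch of that formatting step follows; the field names match the OpenAI standard, but the actual upload mechanics of the Fine Tuning API may differ.

```python
import json

# Sketch: format supervised (prompt, answer) pairs as JSONL training data
# in the OpenAI chat fine-tuning style, one JSON object per line.

def to_jsonl(pairs: list[tuple[str, str]]) -> str:
    """Serialize each user/assistant exchange as one JSONL line."""
    lines = []
    for prompt, answer in pairs:
        lines.append(json.dumps({
            "messages": [
                {"role": "user", "content": prompt},
                {"role": "assistant", "content": answer},
            ]
        }))
    return "\n".join(lines)

data = to_jsonl([("What is RAG?", "Retrieval Augmented Generation.")])
print(data)
```

A file in this shape is what LoRA-style supervised fine tuning consumes; DPO-based tuning additionally needs preferred/rejected answer pairs per prompt.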

 

Get in touch with us, get your OpenAI-compatible API key and set up your own AI service.

Contact us now
 
 

More sovereignty is hardly possible

By operating and developing your AI service on the comprehensively certified, European Open Telekom Cloud, you meet all current regulatory requirements. In particular, the Open Telekom Cloud may also process social data and can be used by persons subject to professional secrecy. The RAG approach with a private instance gives you full control over your (sensitive) data. A higher degree of sovereignty in the use of AI is hard to achieve in the public cloud.

 
 

How do you get your API key?

Do you want to get started with AI, LLMs and RAG? Use the contact form to receive your API keys. Our Marketplace will also launch in a few weeks; you will then be able to obtain API keys there via self-service as well.

 

Contact us now and receive your API key


 


Free expert hotline

Our certified cloud experts provide you with personal service free of charge.

 0800 3304477 (from Germany)

 +800 33044770 (from abroad)

 24 hours a day, seven days a week

Write an E-Mail

Our customer service is available free of charge via e-mail.
