In this article you can read:
- which business challenges amberSearch solves with AI,
- why cloud and AI are a perfect combination and
- how amberSearch solved the sovereignty requirements of its customers.
With the LLM Serving Service, amberSearch offers companies with sensitive data a package that enables the sovereign use of artificial intelligence.
“Just where did I put that information I really need right now?” That’s a typical everyday situation for many office workers. Searching for enterprise information robs them of nearly a half-hour of working time on average – every day. In such scenarios, artificial intelligence can truly contribute to higher employee productivity.
This is where amber Tech GmbH comes in, with its amberSearch service: “It’s our mission to break up the status quo of outdated B2B software solutions and give our customers the added value of ultramodern artificial intelligence (AI) and an intuitive user experience in our enterprise search products,” explains Philipp Reißel, co-founder and CEO of the AI specialist, which is headquartered in Aachen, Germany. amberSearch makes AI accessible even for companies that don’t have giant budgets or immense IT resources. A 30-person team at the company is now working on making internal information accessible quickly and easily. “We offer out-of-the-box solutions that are fast and easy to roll out, but still generate major added value for companies,” adds CRO (Chief Revenue Officer) Bastian Maiworm.
The business concept of using artificial intelligence for enterprise search – searching within a company’s internal resources – is winning people over: The name of the service has become synonymous with the company’s name. The AI provider has built a large customer base across various industries and company sizes. Its customers include large companies like Schüßler-Plan, Zentis, DB Regio, and Landmarken AG, as well as SMEs in the mechanical engineering, biotech, and financial fields. Pharmaceutical and financial companies face stringent compliance and security requirements; such companies are very careful to make sure that their internal data remains internal. The key phrase here is “data sovereignty”.
“Setting up our own infrastructure management and hosting was out of the question – we wanted to concentrate on our customer projects and the evolution of our AI solutions, and avoid unnecessary fixed costs,” says Maiworm. With this in mind, amber Tech quickly decided to go with the cloud. “The Open Telekom Cloud is a reliable platform that gives us the freedom to run our business. It scales with us and saves us from having to procure and run our own resources.” This also includes demands for GPU resources, to train and run their AI services. At the same time, however, the company is facing increased demands for data sovereignty: AI users, particularly financial firms, want to ensure that amberSearch has no way to access their data, even theoretically.
With the Open Telekom Cloud and the LLM serving service from the AI Foundation product family, we’ve found an elegant answer to the strict requirements our customers face – especially in regulated industries – about data sovereignty when using our AI services.
Bastian Maiworm, Co-founder, amber Tech GmbH
“The Open Telekom Cloud gives us a simple, elegant solution to this as well,” explains Bastian Maiworm. In 2024, Deutsche Telekom launched the LLM serving service in the AI Foundation product family, a pool of ready-to-use large language models (LLMs) and embedding models that can be integrated seamlessly with customer AI applications through an API key. In this approach, leading open-source models from Meta or Mistral are hosted in the Open Telekom Cloud, while closed-source models from OpenAI, Google, or Anthropic are provided through third-party platforms. Each customer manages their API keys in a portal, where users are also administered and token usage can be displayed. “This is very convenient for users of the Open Telekom Cloud like ourselves. We can use these services as burst capacity through our own installation.” In other words: Normally, amberSearch serves customers from within its own installation. In response to specific inquiries or high loads, this installation is extended seamlessly through the Open Telekom Cloud – and not only at the level of simple infrastructure resources (IaaS). When can this become necessary?
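The API-key integration described above can be sketched as a minimal OpenAI-compatible chat request. Note that the endpoint URL, model name, and key below are illustrative placeholders, not the actual values of the LLM serving service – the real ones come from the customer portal and the AI Foundation documentation.

```python
import json
import urllib.request

# Placeholder values -- the real endpoint, model name, and key are
# assumptions here; take them from the customer portal instead.
API_BASE = "https://llm-serving.example.com/v1"
API_KEY = "your-api-key-from-the-portal"

def build_chat_request(prompt: str, model: str = "example-open-source-llm"):
    """Build an OpenAI-compatible chat completion request,
    authenticated with the API key managed in the portal."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# Sending the request would be:
#   with urllib.request.urlopen(build_chat_request("...")) as resp:
#       answer = json.load(resp)
req = build_chat_request("Where is the latest travel expense policy?")
print(req.full_url)
```

Because the interface is a plain HTTPS API with bearer-token authentication, an application like amberSearch can switch between its own installation and the burst capacity in the Open Telekom Cloud by changing only the base URL and key.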
To understand, remember that amberSearch consists of two components. The original component is the model for searching and rating enterprise information, developed by amber Tech. The second part acts as the user interface: An LLM processes the queries and formulates an answer from the search results.
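The interaction of the two components can be illustrated with a toy retrieval-augmented sketch; the function names, the naive keyword ranking, and the sample documents are purely illustrative and not amber Tech's actual implementation.

```python
# Conceptual sketch of the two-component flow: component 1 retrieves
# and ranks enterprise documents, component 2 hands the hits to an
# LLM as context. All names and data here are illustrative.

DOCUMENTS = [
    "Travel expenses are reimbursed within 30 days of submission.",
    "The VPN client must be updated every quarter.",
    "New hires receive their hardware on the first working day.",
]

def search(query: str, docs=DOCUMENTS, top_k: int = 2):
    """Component 1: rank documents by naive keyword overlap
    (a stand-in for the real search and rating model)."""
    terms = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: -len(terms & set(d.lower().split())))
    return ranked[:top_k]

def build_llm_prompt(query: str, hits):
    """Component 2: the LLM receives the hits as context and
    formulates the answer from the search results."""
    context = "\n".join(f"- {h}" for h in hits)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

hits = search("When are travel expenses reimbursed?")
print(build_llm_prompt("When are travel expenses reimbursed?", hits))
```

The separation matters for sovereignty: the LLM in the second step can be served from the customer's own Open Telekom Cloud environment rather than from amberSearch's installation.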
“The LLM serving service from Deutsche Telekom gives us access to the LLM pool, which is staged directly in the Open Telekom Cloud. When our customers have increased demands for security, they get the corresponding LLM services directly from the Open Telekom Cloud – with which they have already concluded an agreement for commissioned data processing. As a result, even amberSearch has no theoretical way to access the data.” This means amberSearch users can satisfy even the highest requirements for data sovereignty.
amberSearch is an example of a perfect interaction between cloud computing and AI. The company is building and developing its AI business model with the LLM serving service based on the Open Telekom Cloud, fully scalable and with adaptive costs. The Open Telekom Cloud delivers another benefit as well: “The LLM serving service means we can also meet our customers’ demands for high levels of security and sovereignty for their sensitive data.” In addition, amber Tech can experiment with different LLMs to ensure that its AI services are always of high quality. Last but not least, amberSearch benefits – beyond the cloud and use-based LLMs – from Telekom’s customer and partner network.
AI Foundational Services allow the cost-effective use of LLMs on the Open Telekom Cloud. The use of LLMs with RAG shortens the time to a finished AI service by 60 to 70 percent.