More-With-Less is developing an open-source framework for efficiently and cost-effectively adapting large-scale language models for SME-specific applications.
The customer
The challenge
Project duration: 01.04.2023 - 31.03.2026
In recent years, a number of breakthrough results have been achieved with large-scale AI language models; models such as OpenAI's GPT or Google's PaLM are just the tip of the iceberg.
However, European technology developers and users of these technologies, especially small and medium-sized enterprises (SMEs), are increasingly falling behind in this development. The infrastructure, computing power and expertise required to deploy large AI language models are in many cases either not sufficiently available or uneconomical to operate.
The research project aims to develop a framework for efficiently adapting large language models to SME-specific applications. The methods developed are intended to be easy to apply, largely without the need for special AI expertise or large training datasets.
Solution
Results & effects
Merantix Momentum contributes significantly to the success of More-With-Less with its domain expertise in natural language processing and large-scale language models. Particular importance is attached to developing learning methods that require only small amounts of training data and low computational cost.
To this end, numerous fine-tuning and meta-learning techniques will be investigated, such as adapters or few-shot learning methods. The technical basis will be an open-source library that can be used to quickly and easily adapt existing language models to specific use cases.
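To illustrate the adapter idea, the following sketch uses the openly available Hugging Face peft library to attach LoRA adapters to a pre-trained model. It is not the project's own library; the base model name and hyperparameters are placeholder assumptions chosen for the example.

```python
# Illustrative sketch only: parameter-efficient fine-tuning with LoRA adapters
# via the Hugging Face "peft" and "transformers" libraries. Model name and
# hyperparameters are placeholder assumptions, not the project's framework.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_model_name = "bert-base-multilingual-cased"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForSequenceClassification.from_pretrained(base_model_name, num_labels=2)

# Attach small trainable adapter matrices; the base model's weights stay frozen,
# so only a small fraction of the parameters is updated during fine-tuning.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8,              # rank of the low-rank update matrices
    lora_alpha=16,    # scaling factor for the adapter updates
    lora_dropout=0.1,
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well below 1% of all parameters
```

Because only the adapter weights are trained, such a setup can be fine-tuned on modest hardware and with comparatively small task-specific datasets, which is the efficiency argument behind the project.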
Closely related to this is the development of an example framework for the efficient deployment of a complete AI solution. Finally, the approach will be demonstrated on a selected use case: document matching for companies in the legal tech sector.
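To make the use case concrete, the sketch below shows one common way to approach document matching with pre-trained sentence embeddings. It is a hypothetical illustration under assumed model and example texts, not the matching system developed in the project.

```python
# Illustrative sketch only: embedding-based document matching as one possible
# realisation of the legal-tech use case. Model name and texts are placeholders.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")  # assumed model

query_document = "Draft service agreement covering data processing obligations."
candidate_documents = [
    "Template contract for IT services including a data processing annex.",
    "Employment contract for a part-time software developer.",
    "Non-disclosure agreement for an early-stage technology partnership.",
]

# Encode all documents into dense vectors and rank candidates by cosine similarity.
query_embedding = model.encode(query_document, convert_to_tensor=True)
candidate_embeddings = model.encode(candidate_documents, convert_to_tensor=True)
scores = util.cos_sim(query_embedding, candidate_embeddings)[0]

for doc, score in sorted(zip(candidate_documents, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {doc}")
```

In such a setup, the pre-trained encoder could later be adapted to legal-domain language with the parameter-efficient methods sketched above, requiring only a small amount of labelled matching data.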