
General Information

LLM Hosting

 

Important

Please note that the service is currently in the beta phase, and access conditions may change in the future.

On our HPC infrastructure, we host several stand-alone open-weight Large Language Models (LLMs). The models can be accessed through both an API and a web interface, providing a data-sovereign alternative to commercial providers such as OpenAI/ChatGPT and Meta.

This service enables users to process sensitive data securely, with our IT Center ensuring compliance with data protection regulations. We prioritize privacy: neither prompts nor any other content submitted in requests is stored.

 

Who is allowed to use LLMs?

Currently, members of RWTH Aachen University and users from WestAI can request access to the LLMs. For more information, please refer to the section on Accessing LLMs.

 

How can one access the LLMs?

You can access the LLMs either via the web interface or via the API.
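As an illustration only, the sketch below shows what a chat request against such an API could look like from Python, assuming an OpenAI-compatible chat-completions endpoint. The endpoint URL, model name, and authentication scheme are placeholders, not values taken from this service; the actual connection details are provided with your access.

    import os
    import requests

    # Placeholder values -- replace with the endpoint, model name, and
    # API key you receive when access to the service is granted.
    API_URL = "https://llm.example.org/v1/chat/completions"
    API_KEY = os.environ["LLM_API_KEY"]

    payload = {
        "model": "example-open-weight-model",
        "messages": [
            {"role": "user", "content": "Summarize the benefits of data-sovereign LLM hosting."}
        ],
    }

    # Send the request with a bearer token; this assumes the API follows
    # the common OpenAI-style chat-completions schema.
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        timeout=60,
    )
    response.raise_for_status()

    # Print the model's reply from the first returned choice.
    print(response.json()["choices"][0]["message"]["content"])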

 

Quota Limitations

At the moment, we have not set any limits on requests or tokens, so that we can better evaluate demand and utilization.

However, should we encounter bottlenecks or a significant increase in demand or user activity, we reserve the right to impose appropriate limits.