
Accessing LLMs

This page describes how you can access LLMs.


Access for API usage

RWTH Aachen University is a member of the German AI Service Center WestAI. To request access to our services and self-hosted Large Language Models (LLMs) through the API, please contact us at contact@westai.de. Ensure that you include "Access to LLMs at RWTH Aachen University" in the subject line of your email. We will generate an API key for you and send it via email.

API compatibility and endpoint

All our models are accessible through an OpenAI-compatible API under:
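
As a minimal sketch, the snippet below shows how an OpenAI-compatible endpoint of this kind can be queried with the openai Python package. The base URL, API key, and model name are placeholders, not the actual values of our service; substitute the endpoint URL and the API key you received from us.

    # Minimal sketch using the openai Python package (>= 1.0) against an
    # OpenAI-compatible endpoint. The base URL, API key, and model name are
    # placeholders; replace them with the values provided to you.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://<your-endpoint>/v1",  # placeholder endpoint URL
        api_key="<your-api-key>",               # key received via email
    )

    response = client.chat.completions.create(
        model="<model-name>",  # one of the hosted models
        messages=[{"role": "user", "content": "Hello, who are you?"}],
    )
    print(response.choices[0].message.content)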

Access to Graphical User Interfaces (GUIs)

During our current beta phase, access to graphical interfaces is subject to the following conditions and limitations:

  • Members of RWTH Aachen University can use KI:connect (RWTHgpt) and select one of the models marked as [Experimental]. In the future, we aim to make KI:connect and these models available to external users as well.
  • Other users can request access to the LLM-Lab, which is hosted by Fraunhofer, another member of WestAI. This platform offers a user interface for accessing some of our self-hosted models and provides options to compare various models and settings.
  • Additionally, there are several graphical chat interfaces and applications that can be installed on your local device or workstation, such as Open WebUI (our recommended option), AnythingLLM, or Jan. These applications can be configured to use our LLM resources via the API; a short sketch for listing the available model names is given after this list. Step-by-step instructions on how to install and configure Open WebUI with Docker Desktop can be found in the following guide.
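
Because the endpoint is OpenAI-compatible, the currently hosted models can usually be discovered programmatically as well, for example to find the model name to enter in Open WebUI or another client. The following sketch assumes the standard models route is exposed; the base URL and API key are placeholders as above.

    # Sketch: list the model IDs offered by the OpenAI-compatible API,
    # assuming the standard models endpoint is exposed. Placeholders as above.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://<your-endpoint>/v1",  # placeholder endpoint URL
        api_key="<your-api-key>",
    )

    for model in client.models.list():
        print(model.id)  # use one of these IDs as the model name in your client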


Last modified: 17.10.2025


This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Germany License.