- In the context of RWTHgpt, tokens are the unit in which the underlying AI models measure the length of text (both user-submitted questions and system-generated answers). Tokens are frequent character strings that occur in a text; the split into tokens is complex and does not map uniformly onto words or characters.
- Detailed information on how the number of tokens is calculated, and the background, can be found here:
- OpenAI itself offers a calculator that can be used to determine the number of tokens in sentences or texts and see how they are split:
- Tokenizer
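If you only need a rough idea of how long a text is in tokens, a common rule of thumb for English text is about four characters per token. The sketch below is a heuristic estimate only, not the real tokenizer: actual OpenAI models use a byte-pair-encoding tokenizer (as shown by the Tokenizer page linked above), and the function name here is a hypothetical helper, not part of any RWTHgpt or OpenAI API.

```python
def estimate_tokens(text: str) -> int:
    """Rough token-count estimate for English text.

    Heuristic only: real tokenizers split text into frequent
    character strings (byte-pair encoding), not fixed-size chunks.
    A common rule of thumb is ~4 characters per token.
    """
    return max(1, round(len(text) / 4))


# Example: estimate the token length of a short prompt.
prompt = "How do tokens work in RWTHgpt?"
print(estimate_tokens(prompt))
```

For exact counts, use the linked Tokenizer page; the heuristic can be off noticeably for code, non-English text, or text with many rare words.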
Please note that this page is subject to OpenAI's general data protection terms. Do not enter any personal data (including third-party data) there.
last changed on 07/11/2024