Tokenization
Tokenization converts a text input into a sequence of numbers that an AI model can process. The mapping from each individual chunk of text (a token) to its number is determined by the tokenization algorithm.
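As a minimal sketch of this idea, the following hypothetical example assigns each unique whitespace-separated word an integer ID. Real tokenizers (e.g. BPE-based ones) use learned subword vocabularies, but the token-to-number mapping works the same way in principle:

```python
def build_vocab(corpus):
    # Assign each unique whitespace-separated token an integer ID.
    vocab = {}
    for text in corpus:
        for token in text.split():
            if token not in vocab:
                vocab[token] = len(vocab)
    return vocab

def encode(text, vocab):
    # Convert text into the sequence of numbers the model consumes.
    return [vocab[token] for token in text.split()]

vocab = build_vocab(["the cat sat", "the dog sat"])
print(encode("the dog sat", vocab))  # → [0, 3, 2]
```

Here "the" was seen first and gets ID 0, "dog" was the fourth unique token and gets ID 3, and so on; the same token always maps to the same number.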