What is the process of breaking text into individual units called tokens?
Lemmatization
Stemming
Tokenization
Normalization
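The correct choice is Tokenization. A minimal sketch of a regex-based tokenizer in Python (the `tokenize` helper and its pattern are illustrative assumptions, not a specific library's API):

```python
import re

def tokenize(text):
    # Match runs of word characters, or single punctuation marks,
    # so "text!" becomes two tokens: "text" and "!"
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Tokenization splits text!"))
# → ['Tokenization', 'splits', 'text', '!']
```

By contrast, stemming and lemmatization reduce tokens to base forms (e.g. "running" → "run"), and normalization standardizes text (e.g. lowercasing); all three operate on tokens that tokenization has already produced.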
