What are Tokens?
Tokens are the units of text that OpenAI's models use to process natural language. Tokens vary in length, ranging from individual characters to entire words or phrases. For example, the sentence "I love using Indeemo!" consists of seven tokens: ["I", "love", "using", "Ind", "eem", "o", "!"].
Essentially, a token is a chunk of text: the AI breaks words and phrases into these chunks so it can analyse them.
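The idea above can be sketched in code. The toy tokenizer below greedily matches the longest known chunk at each position in the text. It is only an illustration: real tokenizers (such as OpenAI's BPE-based ones) learn their vocabulary from large amounts of data, whereas the vocabulary here is hand-picked just to reproduce the example sentence.

```python
# Toy greedy subword tokenizer -- an illustration of the concept only,
# NOT OpenAI's actual tokenizer. The vocabulary is hand-picked so that
# the example sentence splits into the same seven tokens shown above.
def tokenize(text, vocab):
    """Split text by greedily matching the longest vocab entry at each position."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match first, shrinking down to one character.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in vocab:
                tokens.append(piece)
                i += length
                break
        else:
            # Character not in the vocabulary: emit it as its own token.
            tokens.append(text[i])
            i += 1
    return tokens

# Hypothetical mini-vocabulary covering the example sentence.
vocab = {"I", "love", "using", "Ind", "eem", "o", "!", " "}

tokens = [t for t in tokenize("I love using Indeemo!", vocab) if t != " "]
print(tokens)  # ['I', 'love', 'using', 'Ind', 'eem', 'o', '!']
```

Note that the unfamiliar word "Indeemo" is not in the vocabulary as a whole word, so it gets broken into the smaller pieces "Ind", "eem", and "o" -- this is exactly how real tokenizers handle rare or novel words.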