Will OpenAI release a tokenizer with vocab size > 150k in 2023?
Ṁ6,510 · resolved Jan 1

Resolved NO
- GPT-2 used r50k_base (vocab size ≈ 50k)
- GPT-3 used r50k_base (vocab size ≈ 50k)
- GPT-3.5 used cl100k_base (vocab size ≈ 100k)
- GPT-4 used cl100k_base (vocab size ≈ 100k)
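The question turns on whether any OpenAI encoding crossed the 150k mark during 2023. A minimal sketch of that check, using the vocab sizes reported by the tiktoken library for its public encodings (the exact counts, which include special tokens, are assumptions here):

```python
# Vocab sizes of OpenAI's public tiktoken encodings as of end-2023,
# including special tokens (assumed from tiktoken's reported n_vocab).
VOCAB_SIZES = {
    "r50k_base": 50_257,     # GPT-2, GPT-3
    "p50k_base": 50_281,     # Codex-era models
    "cl100k_base": 100_277,  # GPT-3.5, GPT-4
}

THRESHOLD = 150_000

# Collect any encoding exceeding the market's 150k threshold.
over = {name: size for name, size in VOCAB_SIZES.items() if size > THRESHOLD}
print(over)  # -> {}
```

Since no encoding available in 2023 exceeded the threshold, the empty result is consistent with the market resolving NO.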
Top traders

| # | Name | Total profit |
|---|------|--------------|
| 1 |      | Ṁ133 |
| 2 |      | Ṁ52 |
| 3 |      | Ṁ38 |
| 4 |      | Ṁ36 |
| 5 |      | Ṁ17 |
Related questions
- Will OpenAI release a tokenizer with vocab size > 150k by end of 2024? (42% chance)
- Will OpenAI release a tokenizer with more than 210000 tokens before 2026? (24% chance)
- Will the next major LLM by OpenAI use a new tokenizer? (77% chance)
- Will OpenAI reveal thinking tokens by the end of June 2025? (6% chance)
- Will a flagship (>60T training bytes) open-weights LLM from Meta which doesn't use a tokenizer be released in 2025? (29% chance)
- Will OpenAI release next-generation models with varying capabilities and sizes? (64% chance)
- OpenAI to release model weights by EOY? (88% chance)
- Will OpenAI fold in 2025? (3% chance)
- Will OpenAI release a version of Voice Engine by the end of 2024? (81% chance)
- Will OpenAI have a new name by the end of 2025? (8% chance)