msp26
(Sorry if I ramble I've spent some time on token counting recently and have some opinions on it)

It's neat that it's client-side only, but it's quite lacking in features compared to https://tiktokenizer.vercel.app/ or even a small Python script I wrote.

Feedback (feature parity):

-Identification and separation of individual tokens. Useful for prompting and adjusting logit biases.

-Optional chat formatting

-It's not very responsive while typing; perhaps you could debounce the token calculation a little

Further suggestions (I don't know how simple you want to keep it):

-Counting tokens for function-calling JSON definitions

helpful resources:

https://gist.github.com/CGamesPlay/dd4f108f27e2eec145eedf5c7...

https://hmarr.com/blog/counting-openai-tokens/

nyto
Yeah, this is pretty much solved in every ecosystem but JavaScript. It made me not use OpenAI's chat streaming because guessing the token count became too hard. I'd rather show a loading indicator and have precise measurements.
quickthrower2
For those wanting to do this programmatically: pip install tiktoken