Main
Use it before feeding a repo into an agent: token counts show which folders (like `dist/`) will blow up your context budget. Prefer `--json` for agent pipelines so the agent can plan what to read within a fixed token limit. If token counts vary between models, use `--model`/`--encoding` to compare costs across tokenizers.
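tokenu's JSON schema isn't reproduced here, so the shape below is an assumption: suppose `tokenu --json` emits per-directory token counts as a name-to-count map. An agent could then plan what to read against a fixed token budget along these lines:

```python
import json

# Assumed shape of `tokenu --json` output: a map of directory -> token count.
# The real schema may differ; adapt the parsing to what tokenu actually emits.
SAMPLE = '{"src": 12400, "dist": 98000, "docs": 3100}'

def plan_reads(raw_json: str, budget: int) -> tuple[list[str], list[str]]:
    """Split directories into (fits, skip) so the kept set stays under budget."""
    counts = json.loads(raw_json)
    fits, skip, used = [], [], 0
    # Take cheap directories first so the budget covers as many as possible.
    for name, tokens in sorted(counts.items(), key=lambda kv: kv[1]):
        if used + tokens <= budget:
            fits.append(name)
            used += tokens
        else:
            skip.append(name)
    return fits, skip

fits, skip = plan_reads(SAMPLE, budget=20000)
print(fits)  # → ['docs', 'src']
print(skip)  # → ['dist']  (exclude or summarize, per the note above)
```

The greedy smallest-first order is just one policy; an agent could equally rank by relevance once it knows the per-directory costs.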
Source-backed notes
- README shows a no-install quick start: `npx tokenu .`
- README documents JSON output for agent consumption: `tokenu --json`.
- README lists encodings (`o200k_base`, `cl100k_base`, etc.) and flags like `--max-depth` and `--exclude`.
FAQ
- Does tokenu reduce token usage automatically? No: the README says it measures token cost so you can decide what to exclude or summarize.
- Can an agent use tokenu programmatically? Yes: use `tokenu --json` so the agent can parse per-directory token counts.
- Is it an estimate or real tokenization? The README says it uses actual tokenization via a tokenizer library, not heuristics.
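For the programmatic case, a minimal sketch of shelling out to tokenu and parsing the result. Only `npx tokenu` and `--json` come from the README; the wrapper name, the assumption that the JSON arrives as a single object on stdout, and the separate parse helper are all illustrative:

```python
import json
import subprocess

def parse_tokenu_json(raw: str) -> dict:
    # Assumption: `tokenu --json` prints one JSON object on stdout.
    return json.loads(raw)

def repo_token_counts(path: str = ".") -> dict:
    # Hypothetical wrapper: run tokenu via npx (as in the README quick start)
    # and hand the parsed counts to the agent.
    proc = subprocess.run(
        ["npx", "tokenu", path, "--json"],
        capture_output=True, text=True, check=True,
    )
    return parse_tokenu_json(proc.stdout)
```

Keeping the parse step separate from the subprocess call makes it easy to swap in a corrected schema once the real output shape is known.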