"Token is a metric that can quantify the information volume contained in a project. The total number of tokens needed to encode all assets of a project is a measure of its scale."
Extended reading: for how context window limits create the need for token-based measurement, see AI Statelessness and Context Window; for how token scale affects tool selection, see Tools and Context Selection: Why AI IDEs Sell "Context Selection Capability".
1. Why Use Tokens to Measure Project Scale
In traditional software engineering, we usually measure project scale by:
- Lines of code (LoC)
- Number of files
- Number of functional modules
- Team size
But in the AI-Native era, these metrics all have limitations:
- Lines of code: AI-generated code might be very "verbose"—more lines don't mean higher complexity
- Number of files: AI tends to generate many small files—file count doesn't reflect real information density
- Number of functional modules: Module boundaries become more blurred in the AI era—count doesn't mean much
- Team size: AI dramatically increases individual productivity—headcount no longer tracks project scale
Tokens, as a new quantitative metric, offer unique advantages:
Directly Reflects AI's 'Understanding Cost'
How many tokens AI needs to fully understand a project directly determines:
- Whether AI can process the entire project in one session
- How many rounds of AI relay are needed to complete a requirement
- The complexity of context selection strategies
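The first two points can be turned into a quick back-of-the-envelope check. The sketch below is illustrative only: the 200K-token window and 20K-token overhead are hypothetical defaults, and the ~4-characters-per-token rule is a rough heuristic (a real tokenizer such as tiktoken gives exact counts):

```python
import math

# Rough heuristic: ~4 characters per token for English text and code.
# Swap in a real tokenizer for exact counts.
CHARS_PER_TOKEN = 4

def estimate_tokens(text: str) -> int:
    """Estimate token count from character length."""
    return math.ceil(len(text) / CHARS_PER_TOKEN)

def sessions_needed(project_tokens: int, window_tokens: int = 200_000,
                    overhead_tokens: int = 20_000) -> int:
    """Number of context windows (rounds of AI relay) a project requires.

    overhead_tokens reserves room for instructions and model output;
    both defaults are illustrative assumptions, not fixed limits.
    """
    usable = window_tokens - overhead_tokens
    return math.ceil(project_tokens / usable)

# A 1M-token project does not fit in one 200K window:
print(sessions_needed(1_000_000))  # → 6
```

If `sessions_needed` returns 1, the whole project fits in one session and no context selection strategy is needed; anything larger forces either relay rounds or selective context.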
Unified Measurement of All Assets
Tokens can uniformly measure:
- Code
- Documentation
- Test cases
- Configuration files
- Even meeting recordings and design drafts (if converted to text)
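Because every text asset can be tokenized, a project-wide tally is straightforward. This is a minimal sketch using only the standard library: the extension-to-category map is illustrative, and the 4-characters-per-token heuristic stands in for a real tokenizer:

```python
from collections import Counter
from pathlib import Path

# Illustrative mapping of file extensions to asset categories.
CATEGORIES = {
    ".py": "code", ".js": "code", ".ts": "code",
    ".md": "documentation", ".rst": "documentation",
    ".yaml": "configuration", ".yml": "configuration",
    ".json": "configuration", ".toml": "configuration",
}

def project_token_profile(root: str) -> Counter:
    """Tally estimated tokens per asset category under root."""
    totals = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix in CATEGORIES:
            try:
                text = path.read_text(encoding="utf-8")
            except (UnicodeDecodeError, OSError):
                continue  # skip binary or unreadable files
            # Rough heuristic: ~4 characters per token.
            totals[CATEGORIES[path.suffix]] += len(text) // 4
    return totals
```

Summing the counter's values gives a single number for the whole project, which can then be compared against the 1M/10M boundaries discussed below.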
Directly Tied to AI Costs
Token consumption = AI usage cost. This metric directly relates to:
- Whether the team can continuously use AI
- Which tools/models are suitable for the current project scale
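The token-to-cost relationship is simple multiplication. The prices below are placeholders, not real quotes; actual per-token pricing varies by provider and model, so check current price sheets:

```python
# Hypothetical pricing in USD per 1M tokens (input vs. output
# tokens are usually priced differently).
PRICE_PER_M_INPUT = 3.00
PRICE_PER_M_OUTPUT = 15.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one AI pass over the given token volumes."""
    return (input_tokens / 1_000_000 * PRICE_PER_M_INPUT
            + output_tokens / 1_000_000 * PRICE_PER_M_OUTPUT)

# Reading a 1M-token project once while generating 50K tokens of output:
print(round(estimate_cost(1_000_000, 50_000), 2))  # → 3.75
```

Multiplying this per-pass cost by the number of passes a workflow requires shows quickly whether a given tool or model is economical at the project's scale.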
2. The 1M Token and 10M Token Boundaries
Based on practical experience, there are two key numerical boundaries:
- 1M Token: All content produced by a very small AI-Native team in 15–30 days is roughly at this scale
- 10M Token: Within this scale, AI-Native solutions can bring at least
