Apr 5, 2025
Value creation, value capture and value accrual
Learn how crypto protocols capture value, defend it, and scale it. This guide breaks down value creation, tokenomic leverage, user incentives, and defensibility, based on 750+ audited token models.
Most protocols aren’t failing because they create no value.
They fail because they can’t capture it, or worse, they try to capture too much, too early, and lose defensibility in the process.
This article outlines the importance of token value flow: a framework we use at Tokenomics.com to understand how different mechanisms create (and retain) economic power within a system.
And more importantly, we’ll show why defensibility is your cap. It’s the real limit on how much value you get to keep.
Value Capture vs Value Accrual
Before we get into models, let’s get one thing straight:
Value capture is protocol-level.
Value accrual is token-level.
Value Capture is how much economic activity the system retains. Fees, margins, protocol-owned assets—this is what stays inside the machine.
Value Accrual is how much of that captured value flows back to the token. Buybacks, burns, staking rewards, governance power: this is what gives the token actual relevance.
We separate them in every audit. Because if the protocol captures value but the token doesn’t accrue it, the token becomes irrelevant.

Example of Value Flow
BNB (Binance & BNB Chain) is a rare example of a model that gets the balance right:
• Value Creation: Access to the world’s largest centralized exchange (Binance) and a fast, low-cost L1 (BNB Chain).
• Value Capture: Binance and BNB Chain charge trading, withdrawal, and gas fees, all denominated in BNB.
• Value Accrual: Captured revenue is used for quarterly BNB buybacks and burns. BNB also grants trading fee discounts, which reinforces holding demand.
BNB isn’t valuable just because it exists. It’s valuable because the protocol captures real revenue and deliberately routes it back into the token.
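The buyback-and-burn loop described above can be sketched as a toy calculation. All figures here (buyback budget, token price, supply) are invented for illustration and are not Binance’s actual numbers:

```python
# Toy sketch of a quarterly buyback-and-burn flow, loosely modeled on the
# BNB mechanism described above. All figures are hypothetical.

def quarterly_burn(circulating_supply: float, buyback_budget_usd: float,
                   token_price_usd: float) -> float:
    """Return the circulating supply after buying and burning tokens."""
    tokens_burned = buyback_budget_usd / token_price_usd
    return circulating_supply - tokens_burned

# Hypothetical: $600M of captured revenue, $600 token price, 150M supply.
supply = quarterly_burn(150_000_000, 600_000_000, 600.0)
print(f"Supply after burn: {supply:,.0f}")  # 149,000,000
```

The point of the sketch: accrual only happens because captured revenue is deliberately routed into the burn, shrinking supply for every remaining holder.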
Tokenomic Leverage
Not all value capture is created equal.
It’s not just whether you retain value; it’s how efficiently you do it.
Tokenomic leverage is the ratio between value captured and value created.
• A protocol with high leverage retains a large portion of the value it generates
• A low-leverage protocol creates value that immediately leaks out to users, arbitrage, or competitors
Think of it like operating margin for decentralized systems.
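The “operating margin” analogy can be made concrete with a minimal sketch. The fee and volume figures below are hypothetical:

```python
# Minimal sketch: tokenomic leverage as the ratio of value captured
# to value created, analogous to operating margin. Numbers are made up.

def tokenomic_leverage(value_captured: float, value_created: float) -> float:
    """Ratio of value a protocol retains to the value it generates."""
    if value_created <= 0:
        raise ValueError("value_created must be positive")
    return value_captured / value_created

# Example: a protocol generating $10M of value for users
# while retaining $300k in fees.
leverage = tokenomic_leverage(300_000, 10_000_000)
print(f"Tokenomic leverage: {leverage:.1%}")  # 3.0%
```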
High tokenomic leverage sounds good on paper.
But in practice, it only works if your protocol is defensible.
Because if your system captures too much, and your moat is weak, someone will fork the code, cut the fees, and siphon your users.
In essence, the more value a protocol captures, the less defensible it is.
Utility and value capture have tradeoffs because of their interdependence.
The maximum value you can capture is the total value you create. As more value is captured, less is left for users and consumers.
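That ceiling can be written as a simple accounting identity. A toy sketch, with made-up dollar figures:

```python
# Toy accounting identity: capture cannot exceed creation, and whatever
# is captured is no longer left for users. Figures are hypothetical.

def value_left_for_users(value_created: float, value_captured: float) -> float:
    """Value remaining for users after the protocol takes its share."""
    if value_captured > value_created:
        raise ValueError("cannot capture more value than is created")
    return value_created - value_captured

# $1M of value created, $250k captured by the protocol.
print(value_left_for_users(1_000_000, 250_000))  # 750000
```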
Defensibility
Defensibility is the competitive advantage of a protocol. It’s what keeps consumers from going to your competitors.
In short, moats.
It describes how much value can be sustainably captured before comparable utility is offered by a competitor. The more defensible a protocol, the more value it can capture, and the more overtly it can do so.
Maximizing defensibility results in a monopoly.
In Peter Thiel’s famous talk at Stanford, he describes why you want to have a monopoly in your industry — monopolies can capture nearly all of the value they create.
It’s difficult to have a monopoly in crypto, as it’s hard to build defensibility in open-source software.
Still, there are various ways to build defensibility, and the more defensibility you have, the more value you can capture.
There are many reasons to build defensibility into your crypto protocol, and five concrete moats for doing so.
These are:
• Network effect
• Lindy effect
• Documentation
• Brand
• Gas efficiency

Defensibility is one of the most important aspects of designing a protocol.
Building defensibly plays a part in the optimization between utility and value capture.
If you don’t design defensibly using these moats, your protocol will be copied by a competitor who captures less value, undercutting you and winning your users.
Case Study: SushiSwap (Captured Too Much, Too Fast)
SushiSwap started strong. It forked Uniswap, added a token, and gave LPs a cut of trading fees. It was a bold move, and it worked.
Liquidity flooded in. The protocol was capturing value while creating it.
But then they pushed too far.
They kept increasing the share of revenue sent to token holders.
They tied emissions to every new product.
They tried to turn SUSHI into the center of everything.
And it backfired.
Users didn’t want complexity. They wanted low fees and simple trading.
Developers forked the code, cut the fees, and offered better UX.
SUSHI holders lost conviction, and participation dropped.
The token was still capturing value—but the protocol stopped growing.
No moat. No alignment. No momentum.
Sushi didn’t fail because it captured nothing.
It failed because it captured more than it could defend.
Tokenomic Leverage in Practice
At Tokenomics.com, we define tokenomic leverage as the ratio of value captured to value created.
It’s how you measure whether your system is just spinning activity, or actually retaining economic power.
• High tokenomic leverage means you’re keeping more of what you create
• Low tokenomic leverage means the value leaks out, usually to users, arbitrage, or protocol inefficiencies
Higher tokenomic leverage, in a single mechanism or in the protocol overall, is analogous to higher profit margins in a traditional business.
Examples:
• A DEX with fees routed to staking contracts has higher tokenomic leverage than one with zero-fee trading
• A protocol that collects protocol-owned liquidity retains more than one reliant on mercenary LPs
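The first example above can be sketched numerically: a DEX that routes part of its trading fees to staking contracts retains value each period, while a zero-fee DEX retains none. The volume, fee rate, and split below are hypothetical:

```python
# Sketch of the fee-routing example: value captured per period when a
# share of trading fees goes to the staking contract. Numbers are made up.

def fees_to_stakers(volume_usd: float, fee_rate: float,
                    staker_share: float) -> float:
    """Value captured per period via fees routed to stakers."""
    return volume_usd * fee_rate * staker_share

# $50M monthly volume, 0.3% trading fee, half of fees routed to stakers.
captured = fees_to_stakers(50_000_000, 0.003, 0.5)
print(f"Captured for stakers: ${captured:,.0f}")  # $75,000
```

A zero-fee competitor would compute `fees_to_stakers(volume, 0.0, share)` and capture nothing, whatever its volume: activity without retention.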
But tokenomic leverage alone isn’t enough.
If you over-optimize for capture, you build a closed loop, extracting more from users than they’re willing to give.
That’s where user leverage comes in.

User Leverage
If tokenomic leverage is how much value the system can retain,
user leverage is how much influence users have over that system.
The most sustainable protocols don’t just extract, they give users a way to move the needle.
User leverage can take different forms:
• Governance power (e.g. Curve vote escrow)
• Liquidity control (Uniswap LPs choosing exposure)
• Usage-based influence (gas fee rebates, tiered access)
• Arbitrage positioning (MEV-aware strategies)
It’s not about generosity. It’s about alignment.
When users feel like they can benefit and influence outcomes, they stick around.
When they can’t, or when their input is ignored, they farm the system, dump the token, and move on.
If users have too much power and the protocol captures nothing, you end up with a public good no one can afford to maintain.
If the protocol captures too much and users have no power, they leave, because there’s nothing in it for them.
Strong tokenomics give both sides skin in the game.
Most protocols optimize for tokenomic leverage (how much value they can extract per unit of utility).
But the best ones also consider user leverage (how much control a user has over the value they’re helping to create).
To reiterate the difference:
• Tokenomic Leverage measures how much value stays inside the system.
Protocols can dial this up with fees, burns, buybacks, and taxes.
• User Leverage measures how much users can influence outcomes.
This can be through delegation, LP positioning, governance, MEV strategies, or usage volume.
SAVE THIS SEED IN YOUR BRAIN → If tokenomic leverage gets too high and user leverage drops too low, users leave.
No one wants to play a game they can’t win.
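The balance described above can be sketched as a toy heuristic. The linear gap rule and the threshold are invented for illustration; this is not a validated behavioral model:

```python
# Toy heuristic, not a validated model: users tend to stay only while
# their influence (user leverage) keeps pace with what the protocol
# extracts (tokenomic leverage). The gap threshold is invented.

def users_likely_to_stay(tokenomic_leverage: float, user_leverage: float,
                         max_extraction_gap: float = 0.5) -> bool:
    """True if extraction doesn't outpace user influence by too much."""
    return (tokenomic_leverage - user_leverage) <= max_extraction_gap

print(users_likely_to_stay(0.8, 0.1))  # False: high extraction, little say
print(users_likely_to_stay(0.3, 0.4))  # True: users keep skin in the game
```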
Design Tradeoffs in Value Capture
You can’t have it all.
Every tokenomics model makes tradeoffs between:
• Capturing value vs encouraging usage
• Defending the protocol vs inviting contributors
• Rewarding holders vs growing adoption
Protocols that try to do everything at once usually fail at all of it.
Here’s the reality:
Every mechanism that captures value reduces utility somewhere else.
Every time you take more for the system, the user gets less. Fees, taxes, slippage, lockups — these aren’t invisible. Users feel them.
Raise your capture too early, and adoption stalls.
Delay capture too long, and there’s nothing left to retain.
The most resilient protocols make hard choices up front:
• Capture less early on, build defensibility, and layer in monetization later (Uniswap, Ethereum)
• Capture aggressively from day one, but offer high user leverage to balance it (Curve, GMX)
Balanced Tokenomics Designs Have These Four Things
Here’s the pattern we see in sustainable designs:
1 • Utility is clear and measurable
Users understand what the product does, and why it matters.
2 • Value capture is structured, not accidental
Fees, supply sinks, and protocol revenue are engineered in, not hoped for.
3 • Accrual to the token is justified
There’s a real reason to hold, stake, or use the token (beyond speculation).
4 • Defensibility isn’t ignored
There’s a moat. Something that makes this protocol hard to fork, hard to displace, or hard to commoditize.
And underneath it all: a healthy balance of tokenomic leverage and user leverage.
If only one side wins, no one does.
It’s not about one perfect model.
It’s about sequencing. Timing. Tradeoffs.
Conclusions
Value creation is what gets people in.
Value capture is what keeps the system running.
Defensibility is what lets you do both.
When we audit token models, we’re not just looking for clever mechanics.
We’re asking:
Where is value being created, and who actually keeps it?
Is the token necessary to the system, or just stapled on for fundraising?
Do the incentives reward contribution, or extraction?
Is this design intentional, or just borrowed from the last project that pumped?
To reiterate: There is no such thing as a perfect tokenomics model, only balanced models.
Tokenomics is a multidisciplinary field that blends hard sciences (math, physics), soft sciences (psychology, sociology, economics), and applied sciences (systems engineering).
Since human behavior plays a key role, tokenomics rarely deals in absolutes. There are no universally right or wrong answers, only trade-offs that optimize for specific objectives within unique constraints.
While quantitative techniques such as statistical analysis, user segmentation, agent-based modeling, and Monte Carlo simulations provide valuable insights, they cannot dictate a singular “correct” tokenomics design.
What does help is taking a data-driven approach, especially when it comes to core economic parameters.
The good thing?
Our audits are backed by a database of over 2,500 token launches across different sectors: DePIN, RWA, AI, L1 and L2 blockchain solutions, and more.
Want us to audit your tokenomics model?
About the Author
Founder of Tokenomics.com
With over 750 tokenomics models audited and a dataset of 2,500+ projects, we’ve developed the most structured and data-backed framework for tokenomics analysis in the industry.
Previously managing partner at a web3 venture fund (exit in 2021).
Since then, I’ve personally advised 80+ projects across DeFi, DePIN, RWA, and infrastructure.
