flowchart LR
A[Research Framework] --> B[Analytical Prototype]
B --> C[Comparative Dashboard Product]
C --> D[API & Monitoring Layer]
D --> E[Institutional / Protocol Integration]
E --> F[Selective Oracle-Compatible Outputs]
```
This staged view is important because it prevents the project from overreaching too early. Not every layer should be built at once. The integrity of the platform depends on sequence.
## Stage One: Research Foundation
### Objective
The objective of the first stage is to establish the analytical core of LSDx as a serious research framework. This is the stage in which the conceptual model becomes explicit, the factor architecture is stabilised, and the first comparative logic is formalised.
This stage includes:
* defining the canonical token representation,
* finalising the fair value decomposition,
* building the first risk factor set,
* defining adjusted-yield logic,
* and documenting use-case-specific interpretation rules.
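As an illustration of what the canonical token representation and the fair value decomposition might eventually look like in code, consider the sketch below. Every field name, the fee model, and the liquidity-discount term are assumptions made for this example, not committed design:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CanonicalLSD:
    """Hypothetical canonical representation of one LSD token."""
    symbol: str            # e.g. "stETH"
    chain: str             # e.g. "ethereum"
    exchange_rate: float   # underlying units per derivative token
    base_staking_apr: float
    protocol_fee: float    # fraction of staking rewards taken as fee

def net_yield(token: CanonicalLSD) -> float:
    """Staking yield net of protocol fees (illustrative decomposition)."""
    return token.base_staking_apr * (1.0 - token.protocol_fee)

def fair_value_estimate(token: CanonicalLSD,
                        liquidity_discount: float = 0.0) -> float:
    """Fair value per derivative token: exchange-rate value reduced by
    an assumed fractional liquidity discount."""
    return token.exchange_rate * (1.0 - liquidity_discount)
```

Under this decomposition, a token with a 3.5% base APR and a 10% protocol fee would carry a net yield of 3.15%.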
### What success looks like
Success in this stage does not mean full product readiness. It means that the methodology is coherent enough that a technically literate reader or researcher can understand:
* what is being measured,
* how outputs are derived,
* what assumptions are used,
* and where the limitations lie.
The current whitepaper itself belongs substantially to this stage, but the real completion of Stage One would also require:
* formal appendix material,
* initial calibration notes,
* token-specific mapping logic,
* and internal consistency across all scoring layers.
### Why this stage matters
Without a strong research foundation, later stages become fragile. Product design can compensate for poor presentation. It cannot compensate for conceptual confusion.
## Stage Two: Analytical Prototype
### Objective
The second stage is to turn the research framework into an internal analytical prototype. This is the first stage at which LSDx behaves like a system rather than only a document.
The prototype should be capable of:
* ingesting selected token data,
* normalising a defined LSD universe,
* computing factor-level outputs,
* estimating illustrative fair value ranges,
* generating adjusted-yield comparisons,
* and producing structured comparative views.
At this stage, the emphasis should remain on correctness and transparency rather than scale.
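The prototype behaviour described above could be sketched as a small deterministic pipeline. Everything in this sketch — the stage names, the canonical schema, and the placeholder risk factor — is hypothetical rather than part of the framework's specification:

```python
from typing import Callable

def normalise(records: list[dict]) -> list[dict]:
    """Map heterogeneous raw records onto one canonical schema."""
    return [{"symbol": r["symbol"].upper(),
             "apr": float(r.get("apr", 0.0)),
             "price": float(r["price"])} for r in records]

def compute_adjusted_yield(records: list[dict]) -> list[dict]:
    """Attach a risk-adjusted yield: APR scaled by a placeholder risk
    factor (the real framework would use its full factor set)."""
    for r in records:
        r["adjusted_yield"] = r["apr"] * r.get("risk_factor", 0.9)
    return records

def run_pipeline(records: list[dict],
                 stages: list[Callable[[list[dict]], list[dict]]]) -> list[dict]:
    """Apply each stage in order; determinism follows from the fixed
    stage sequence and per-record transforms."""
    for stage in stages:
        records = stage(records)
    return records
```

Keeping each stage a pure list-to-list transform makes outputs repeatable and individually reviewable, which is exactly the correctness-over-scale emphasis of this stage.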
### Scope of the prototype
A sensible early prototype would focus on a narrow but meaningful universe, for example a selected set of major ETH-based LSDs. This is strategically useful because:
* the token class is economically important,
* the comparative problem is already real,
* and the limited scope allows deeper methodological focus.
The prototype should include:
* a raw-data ingestion process,
* a canonical representation layer,
* a deterministic feature engine,
* a first scoring and valuation engine,
* and a simple research-facing output interface.
### What success looks like
Success at this stage means that LSDx can produce repeatable outputs for a limited token set and that those outputs can be reviewed critically by the builder, researcher, or selected expert users.
The main questions at this stage are:
* Do the outputs make economic sense?
* Are factor interactions reasonable?
* Are scores explainable?
* Are obvious contradictions or unstable behaviours visible?
This stage is still closer to research infrastructure than public product.
## Stage Three: Comparative Dashboard Product
### Objective
The third stage is to expose the analytical engine through a user-facing comparative product. This is the point at which LSDx begins to look like a platform rather than an internal tool.
The dashboard should allow users to:
* compare LSDs side by side,
* inspect fair value versus market price,
* review factor-level risk decomposition,
* observe adjusted-yield outputs,
* and view regime or monitoring states.
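The premium/discount view rests on simple arithmetic. A minimal sketch, where the sign convention (positive for premium, negative for discount) is an assumption of this example:

```python
def premium_discount(market_price: float, fair_value: float) -> float:
    """Signed deviation of market price from fair value.
    Positive means premium, negative means discount."""
    if fair_value <= 0:
        raise ValueError("fair value must be positive")
    return (market_price - fair_value) / fair_value
```

A token trading at 0.99 against a fair value estimate of 1.00 would show a 1% discount under this convention.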
### Why this stage matters
A dashboard is not the core of LSDx, but it is strategically important because it provides visibility, usability, and product discipline. It forces the framework to become interpretable. It also reveals whether the analytical outputs are actually useful to real users or merely satisfying in theory.
### Features of the first product version
A first dashboard version should remain focused. It should not attempt to be a full DeFi super-terminal. A strong first release might include:
* selected token universe,
* token comparison pages,
* current and historical premium/discount views,
* fair value range display,
* adjusted-yield comparison,
* component-level risk scorecards,
* liquidity and exit diagnostics,
* and regime state with confidence annotation.
At this stage, the delivery goal is not maximal breadth. It is clarity and analytical credibility.
### What success looks like
Success means that a user can open LSDx and quickly understand:
* why two tokens differ,
* why a token looks attractive or weak,
* and how the framework arrived at its conclusion.
A platform that only displays numbers without explanation would fail this objective.
## Stage Four: Monitoring, Alerts, and API Layer
### Objective
Once the analytical engine and dashboard layer are stable, the next stage is to make LSDx operationally useful through monitoring and programmatic delivery.
This stage introduces:
* alert logic,
* historical analytical state tracking,
* and structured APIs.
The platform moves from comparative snapshot analysis toward continuous intelligence.
### Monitoring functions
A monitoring layer may include:
* widening fair value deviation alerts,
* liquidity deterioration alerts,
* factor-score drop alerts,
* regime transition alerts,
* and coverage-confidence degradation alerts.
This makes LSDx useful not only for selection decisions, but also for ongoing oversight.
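As a hedged sketch, the alert list above could be expressed as threshold rules over a token's current analytical state. The state keys and every threshold here are placeholders, not calibrated values from the framework:

```python
def evaluate_alerts(state: dict) -> list[str]:
    """Return the names of triggered alerts for one token's state.
    All thresholds are illustrative placeholders."""
    alerts = []
    if abs(state.get("fair_value_deviation", 0.0)) > 0.02:
        alerts.append("fair_value_deviation")
    if state.get("liquidity_score", 1.0) < 0.5:
        alerts.append("liquidity_deterioration")
    if state.get("regime") != state.get("previous_regime"):
        alerts.append("regime_transition")
    if state.get("coverage_confidence", 1.0) < 0.6:
        alerts.append("coverage_confidence_degradation")
    return alerts
```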
### API functions
An API layer allows downstream systems to consume outputs such as:
* token metadata,
* factor decompositions,
* fair value estimates,
* suitability scores,
* and monitoring states.
This stage is particularly important for:
* research teams,
* treasury tools,
* risk dashboards,
* and strategy systems.
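As an illustration of what a structured API response might contain, the following sketch assembles one token's analytical state as a JSON document. The field names and the version tag are assumptions:

```python
import json

def build_token_payload(symbol, fair_value, market_price, scores,
                        monitoring_state, methodology_version="0.1.0"):
    """Assemble one token's analytical state as a JSON document.
    Field set is illustrative, not a specified schema."""
    return json.dumps({
        "symbol": symbol,
        "methodology_version": methodology_version,
        "fair_value": fair_value,
        "market_price": market_price,
        "factor_scores": scores,
        "monitoring_state": monitoring_state,
    }, sort_keys=True)
```

Carrying the methodology version inside every payload is one way to keep downstream consumers version-aware, as the next subsection argues they must be.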
### Why this stage should not come first
The platform should not begin with an API-first mentality unless the analytical core is already stable. Exposing poorly governed signals at scale would be premature. The API layer should emerge once the outputs are sufficiently disciplined and version-aware.
## Stage Five: Institutional and Protocol Integration
### Objective
The fifth stage is the beginning of deeper external integration. At this point, LSDx may move beyond being a public comparative platform and become a specialised intelligence layer for professional users or protocols.
This stage may include:
* tailored treasury analytics,
* governance reporting support,
* collateral-policy research support,
* custom suitability configuration,
* and integration into third-party DeFi infrastructure.
### Why this stage is strategically valuable
This stage is where LSDx begins to demonstrate its broader relevance as infrastructure rather than content. If the analytical outputs are credible and interpretable, they can become useful in settings such as:
* DAO treasury policy,
* protocol collateral evaluation,
* structured strategy research,
* and institutional digital-asset oversight.
### Requirements for this stage
To be credible in more professional contexts, LSDx will need stronger discipline in several dimensions:
* methodology versioning,
* documentation,
* change management,
* historical traceability,
* and possibly user-specific configuration.
The core challenge here is governance. Professional use does not only require good outputs. It requires trustworthy operational behaviour.
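One illustrative way to make versioning and traceability concrete — with an entirely assumed field set — is to stamp every output with its methodology version and treat outputs from different versions as non-comparable:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class VersionedOutput:
    """Illustrative versioned, timestamped analytical output record."""
    token: str
    value: float
    methodology_version: str
    computed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def is_comparable(a: VersionedOutput, b: VersionedOutput) -> bool:
    """Outputs are only directly comparable when produced under the
    same methodology version."""
    return a.methodology_version == b.methodology_version
```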
## Stage Six: Selective Oracle-Compatible Outputs
### Objective
The final stage in the roadmap is not full automation of all analytics on-chain. That would be neither realistic nor desirable in the short term. The more credible goal is selective extraction of a small subset of robust, bounded, and interpretable outputs that may later become oracle-compatible.
Examples might include:
* conservative fair value reference bands,
* bounded risk classes,
* simplified liquidity health flags,
* or delayed monitoring states.
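The spirit of this stage can be illustrated by collapsing a rich internal score into a small bounded class, and a fair value estimate into a conservative band. The band width, class count, and band edges below are placeholders:

```python
def risk_class(score: float) -> int:
    """Map a continuous risk score in [0, 1] to a bounded class 0-3
    (0 = lowest risk). Out-of-range inputs are clamped."""
    score = min(max(score, 0.0), 1.0)
    if score < 0.25:
        return 0
    if score < 0.5:
        return 1
    if score < 0.75:
        return 2
    return 3

def fair_value_band(fair_value: float, half_width: float = 0.01):
    """Conservative reference band around a fair value estimate."""
    return (fair_value * (1 - half_width), fair_value * (1 + half_width))
```

Bounded, coarse outputs like these discard detail deliberately: what survives the simplification is precisely what is robust enough for machine-level dependence.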
### Why this stage should remain cautious
Not every useful analytical signal is suitable for direct on-chain publication. Some signals depend on judgement, changing methodology, or data inputs too rich for trust-minimised automation. Others may be too noisy or too reflexive.
For this reason, oracle-compatible extension should be treated as a later-stage, highly selective development path. The question is not whether an output is interesting, but whether it is robust enough for machine-level dependence.
### What success looks like
Success at this stage does not mean placing the entire LSDx framework on-chain. It means identifying a narrow subset of outputs that can safely and responsibly serve automated systems without compromising the integrity of the broader analytical framework.
## Product Maturity by Stage
The stages above can be summarised in terms of product maturity.
| Stage | Primary identity | Main deliverable |
| -------------------------------------- | --------------------------------------- | ------------------------------------------------- |
| Research foundation | Methodological framework | Whitepaper, factor architecture, analytical logic |
| Analytical prototype | Internal quant system | Limited-universe model outputs and comparisons |
| Comparative dashboard product | Public-facing analytical tool | Interactive LSD comparison and interpretation |
| Monitoring and API layer | Operational intelligence platform | Alerts, structured endpoints, historical state |
| Institutional and protocol integration | Infrastructure service layer | Embedded analytics for treasuries and protocols |
| Selective oracle-compatible outputs | Conservative machine-readable extension | Limited robust signals for automated use |
This summary is important because it shows that the platform evolves by increasing both analytical maturity and delivery sophistication.
## Research Agenda Alongside Product Development
The roadmap should not treat research as something completed at the beginning and then abandoned. LSDx is a domain in which product development and research development should proceed together.
Several research tracks should continue alongside implementation.
### Fair value refinement
As more data and better observations become available, the fair value framework should be refined. This may include:
* better liquidity discount estimation,
* richer convenience-premium modelling,
* horizon-sensitive valuation logic,
* and improved uncertainty bounds.
### Stress-liquidity modelling
This is one of the most important open research tracks. Better ways of estimating effective liquidity under stress would significantly strengthen the value of LSDx for collateral and treasury use cases.
### Structural-risk methodology
The treatment of governance concentration, dependency graphs, upgradeability, and hybrid structural factors should mature over time. This area is likely to remain partly research-driven.
### Multi-chain expansion
Once the core ETH-based or narrow-universe prototype is stable, the framework can extend to other chains and more complex derivative structures. This should be done cautiously.
### Regime and scenario logic
The monitoring system may also evolve toward richer regime classification, event-aware scoring, and scenario-sensitive interpretation.
In other words, the roadmap should preserve a living research programme inside the platform.
## Practical Prioritisation
A roadmap is only useful if it implies prioritisation. Not everything should be built at once.
The most sensible practical order is likely:
1. finalise the analytical whitepaper and internal logic,
2. build a limited-universe prototype,
3. validate outputs through repeated internal comparison,
4. expose a focused comparative dashboard,
5. add monitoring and API capability,
6. explore professional integrations,
7. evaluate whether any subset deserves oracle-style treatment.
This sequencing matters because it preserves credibility. It allows the system to become useful early without pretending premature maturity.
The following figure summarises this development path.
```mermaid
flowchart TD
A[Finalise Research Logic] --> B[Build Limited Prototype]
B --> C[Validate and Refine Outputs]
C --> D[Launch Comparative Dashboard]
D --> E[Add Monitoring & API]
E --> F[Support Professional Integrations]
F --> G[Assess Selective Oracle Outputs]
```
# 10 Roadmap and Future Development
## 10.1 Why a Roadmap Is Necessary
A whitepaper that defines a strong analytical framework but gives no credible implementation path remains incomplete. The purpose of this chapter is therefore to explain how LSDx can evolve from a structured research concept into a robust analytical product and, potentially, into a broader infrastructure layer for decentralised finance.
The role of the roadmap is not to promise speed for its own sake. It is to define a disciplined sequence of development. This matters because LSDx is not a trivial dashboard project. It combines valuation logic, multi-source data integration, liquidity diagnostics, structural risk interpretation, use-case-specific scoring, and potentially machine-readable outputs for external systems. Such a platform must be built in stages.
A staged roadmap is also intellectually important. Different layers of the LSDx vision have different maturity requirements. Some are suitable for early implementation. Others require richer data, stronger validation, or more operational discipline before they become credible.
For this reason, the roadmap should distinguish clearly between:
- research development,
- product development,
- infrastructure development,
- and longer-term ecosystem integration.
The central principle is simple: build the analytical core first, validate its outputs, expose it in usable form, and only later extend toward deeper automation or on-chain integration.
## 10.2 Development Philosophy
The roadmap of LSDx should follow several design principles.
### 10.2.1 Analytical depth before interface breadth
The first goal is not to appear everywhere. The first goal is to ensure that the analytical logic is coherent, reproducible, and defensible. A platform with weak underlying methodology but polished interface design would damage long-term credibility.
### 10.2.2 Modularity over premature complexity
The architecture should evolve in layers. Each stage should leave behind a usable and understandable system. This is better than attempting to solve every future feature in the first build.
### 10.2.3 Transparency over black-box sophistication
Where choices must be made between more interpretable outputs and more opaque optimisation, the early roadmap should prefer interpretability. In a field where trust is fragile, transparent structure is an asset.
### 10.2.4 Validation before automation
Signals that may eventually feed APIs, alerts, or even oracle-like outputs should first be tested in a research and product setting. The system should earn the right to become more automated.
### 10.2.5 Research and product should remain connected
LSDx is strongest when the research layer and the implementation layer reinforce each other. The roadmap should preserve this connection rather than splitting theory and product into disconnected tracks.
## 10.3 Strategic Development Stages
The long-term development of LSDx can be viewed as a progression through several stages.
The emphasis here is not speed for its own sake. It is disciplined expansion.
## 10.4 Long-Term Strategic Direction
If LSDx succeeds in the stages above, its long-term role could become larger than a token comparison tool. It could become a reusable analytical layer for staking-linked instruments more broadly.
In the long run, several strategic directions may emerge:
- broader support for multiple proof-of-stake ecosystems,
- support for wrapped or recursively integrated derivatives,
- treasury and collateral policy tooling,
- research-grade reporting interfaces,
- and a trusted external analytics layer for protocol designers and allocators.
This should not be overclaimed too early. But it is important to recognise that the core logic of LSDx is extensible. The deeper opportunity is not merely to rank today’s LSDs. It is to define a disciplined analytical language for yield-bearing staking derivatives as a financial class.
## 10.5 What the First Real Version Should Be
To keep the roadmap realistic, it is useful to state clearly what the first meaningful version of LSDx should aim to be.
The first real version should not try to be everything. It should be:
- focused on a limited, important token universe,
- methodologically clear,
- visibly interpretable,
- analytically reproducible,
- and useful enough that a serious DeFi user can make better decisions with it than without it.
That alone would already be a meaningful achievement.
The first version should therefore prioritise:
- token comparison,
- fair value context,
- adjusted-yield interpretation,
- component risk scoring,
- liquidity and exit diagnostics,
- and basic monitoring state.
This is a realistic and strategically strong first product identity.
## 10.6 Closing Remarks
This chapter has outlined a staged roadmap for LSDx from research framework to analytical product, from product to operational intelligence layer, and from there toward more advanced integration possibilities.
The roadmap is deliberately structured and conservative. It recognises that analytical credibility must precede broad automation. It places methodological clarity before interface expansion. It treats research and implementation as connected rather than separate. And it reserves oracle-style extension for a later stage where only a small subset of outputs may be mature enough for machine-level consumption.
This staged path is one of the strengths of the project. It shows that LSDx is not merely an abstract analytical concept, but a system that can be built progressively and responsibly.
With this roadmap in place, the paper reaches a natural conclusion. The remaining task is to close the document in a way that reinforces the strategic thesis of LSDx: that Liquid Staking Derivatives deserve a more disciplined analytical layer, and that such a layer can become an important piece of the evolving DeFi stack.