Author(s): @David Dryjanski
Problem Statement
Many Storage Providers (SPs) do not accept any retrieval requests; there is no on-chain incentive for them to serve retrievals, nor is there any punishment for failing to provide them.
Summary
Golden Retriever is a retrieval rewards program that evaluates retrieval performance on the Filecoin network and pays out rewards to the top-performing Storage Providers.
- The MVP will leverage data captured through the Autoretrieve bridge run by Bedrock, minimizing the engineering effort required to launch and go to market.
- Once retrieval data is captured more widely (e.g., via the Validation bot), we can extend the program to data programs (e.g., Slingshot, Discover)
Links
Autoretrieve database
Autoretrieve dashboard
Goals & KPIs
Main goals
- Test the hypothesis that SP retrieval behavior can be altered by providing transparency into their retrieval quality or the promise of a reward
- KPI: Gather data that allows us to confirm or reject the hypothesis and to learn about SP behavior
- Incentivize SPs to serve retrievals reliably
- KPI 1: Overall retrieval DSR in Autoretrieve (see the sketch below)
- KPI 2: Number of SPs that qualify as “good retrieval providers”
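As a minimal sketch of how KPI 1 could be measured, assuming DSR here means the per-SP retrieval success rate (successful retrievals divided by attempts) derived from Autoretrieve request logs. The RetrievalEvent shape below is hypothetical, not the actual Autoretrieve schema.

```go
// Minimal sketch: per-SP retrieval success rate (DSR) from a list of
// retrieval attempts. RetrievalEvent is a hypothetical stand-in for
// the actual Autoretrieve schema.
package main

import "fmt"

type RetrievalEvent struct {
	SPID    string // storage provider ID, e.g. "f01234"
	Success bool   // whether the retrieval completed successfully
}

// dsrBySP returns successful/total retrievals for each SP seen in events.
func dsrBySP(events []RetrievalEvent) map[string]float64 {
	total := map[string]int{}
	succeeded := map[string]int{}
	for _, e := range events {
		total[e.SPID]++
		if e.Success {
			succeeded[e.SPID]++
		}
	}
	dsr := map[string]float64{}
	for sp, n := range total {
		dsr[sp] = float64(succeeded[sp]) / float64(n)
	}
	return dsr
}

func main() {
	events := []RetrievalEvent{
		{SPID: "f01234", Success: true},
		{SPID: "f01234", Success: false},
		{SPID: "f05678", Success: true},
	}
	fmt.Println(dsrBySP(events)) // map[f01234:0.5 f05678:1]
}
```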
Additional Goals
- Build a reliable DSR monitoring tool to objectively evaluate the state of the network.
- KPIs: SP coverage (% of SPs), dashboard uptime (reliability), deals per week (scale)
- Understand what amount of reward is needed to change SP behavior.
- KPIs: Cost Per Retrieval Statistics
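One way to read this KPI, as a hedged sketch: cost per retrieval over a period is the total reward paid out divided by the successful retrievals it bought. The definition, names, and units below are assumptions, not the program's final formula.

```go
// Hypothetical illustration of the Cost Per Retrieval KPI: total reward
// paid out over an evaluation period divided by the successful
// retrievals served in that period.
package main

import "fmt"

func costPerRetrieval(totalRewardFIL float64, successfulRetrievals int) float64 {
	if successfulRetrievals == 0 {
		return 0 // no retrievals served; report 0 rather than dividing by zero
	}
	return totalRewardFIL / float64(successfulRetrievals)
}

func main() {
	// e.g. 1,000 FIL paid out for 250,000 successful retrievals in a week
	fmt.Printf("%.4f FIL per retrieval\n", costPerRetrieval(1000, 250000))
}
```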
MVP Program (using Autoretrieve)
High-level flow
- The rewards program will evaluate the retrieval performance of a set of SPs
- For every evaluation period (1 week):
- Calculate metrics for all SPs
- Identify a pool of “qualifying” SPs that pass all Retrieval Success Criteria, then rank and select winners (see the sketch after this list):
- Identify the top 3 by number of successful retrievals served
- Run a random lottery to determine 5 additional winners
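A minimal Go sketch of one evaluation round, under stated assumptions: the metric shape, the qualifying threshold, and drawing the lottery from the qualifying pool excluding the top 3 are placeholders for the program's actual Retrieval Success Criteria and lottery rules, which are not specified here.

```go
// Sketch of one weekly evaluation round: filter to qualifying SPs,
// rank by successful retrievals served, take the top 3, then run a
// random lottery over the rest of the pool for 5 more winners.
package main

import (
	"fmt"
	"math/rand"
	"sort"
)

type SPMetrics struct {
	SPID                 string
	SuccessfulRetrievals int
	DSR                  float64
}

// qualifies stands in for the program's Retrieval Success Criteria.
func qualifies(m SPMetrics) bool {
	return m.DSR >= 0.9 // hypothetical threshold
}

// pickWinners returns the top 3 qualifying SPs by successful retrievals
// served, plus up to 5 lottery winners drawn from the remaining pool.
func pickWinners(all []SPMetrics, rng *rand.Rand) (top, lottery []SPMetrics) {
	var pool []SPMetrics
	for _, m := range all {
		if qualifies(m) {
			pool = append(pool, m)
		}
	}
	// Rank by number of successful retrievals served, descending.
	sort.Slice(pool, func(i, j int) bool {
		return pool[i].SuccessfulRetrievals > pool[j].SuccessfulRetrievals
	})
	n := 3
	if n > len(pool) {
		n = len(pool)
	}
	top = pool[:n]
	// Random lottery over the rest of the qualifying pool.
	rest := append([]SPMetrics(nil), pool[n:]...)
	rng.Shuffle(len(rest), func(i, j int) { rest[i], rest[j] = rest[j], rest[i] })
	k := 5
	if k > len(rest) {
		k = len(rest)
	}
	return top, rest[:k]
}

func main() {
	rng := rand.New(rand.NewSource(1)) // deterministic seed for the example
	sps := []SPMetrics{
		{"f01111", 1200, 0.97}, {"f02222", 900, 0.95},
		{"f03333", 700, 0.92}, {"f04444", 400, 0.80},
	}
	top, lottery := pickWinners(sps, rng)
	fmt.Println("top:", top, "lottery:", lottery)
}
```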