Metrics Storage
- Validation bot storing metrics in:
- web3.storage
- Postgres on AWS
- Amount of data:
- a few hundred thousand test results
- < 100 MB
- Data format:
- web3.storage provides no queryable interface
- Observer stores results in the Postgres database
- Metrics currently planned for collection (see the record sketch after this list):
- Download speed: the average, plus raw bytes downloaded for each second
- Latency - TTFB (time to first byte)
- Unsealing required (Graphsync only) - how will the validation bot determine that?
- Index ingestion
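A minimal sketch of what one stored test record could look like, assuming the Observer writes rows of this shape to Postgres while raw results are archived to web3.storage. The type, field, and tag names are illustrative assumptions, not the validation bot's actual schema.

```go
// Illustrative only: a possible shape for a single retrieval test result.
package metrics

import "time"

// RetrievalResult captures the metrics listed above for one test run.
type RetrievalResult struct {
	ProviderID      string        `json:"provider_id"`      // SP miner / peer ID
	Protocol        string        `json:"protocol"`         // "graphsync", "http", or "bitswap"
	StartedAt       time.Time     `json:"started_at"`       // when the test began
	TTFB            time.Duration `json:"ttfb"`             // latency: time to first byte
	AvgSpeedBps     float64       `json:"avg_speed_bps"`    // average download speed
	BytesPerSecond  []int64       `json:"bytes_per_second"` // raw bytes downloaded each second
	UnsealingNeeded *bool         `json:"unsealing_needed"` // Graphsync only; nil if undetermined
	IndexIngested   bool          `json:"index_ingested"`   // index ingestion check
	Error           string        `json:"error,omitempty"`  // empty on success
}
```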
Timelines / deliverables
- Delivered & in testing
- Graphsync retrieval
- Index providing
- Uptime
- Test on Evergreen SPs (currently around 80)
- Next up
- Code quality
- Scalability of workers
- Later in November
- HTTP retrievals (see the TTFB probe sketch after this list)
- Need to enforce that SPs are running Boost
- Plan to upgrade the test miner to Boost with the next release
- Can use Bitswap
- Bitswap in Boost - in test phase; not yet fully released
- For Validation Bot - can use Sophia miner
- What is the priority / timeline for the data retrieval test (vs. ping, traceroute, and index)?
- Additional nodes
- will allow us to decentralize the effort
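For the HTTP retrieval test mentioned above, a hedged sketch of how TTFB and average download speed could be measured against an SP's Boost HTTP endpoint. The URL path, port, and CID are assumptions for illustration, not the validation bot's actual probe.

```go
// Sketch: measure TTFB and average speed for one HTTP retrieval attempt.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/http/httptrace"
	"time"
)

func probeHTTPRetrieval(baseURL, payloadCID string) error {
	var start, firstByte time.Time

	req, err := http.NewRequest(http.MethodGet, baseURL+"/ipfs/"+payloadCID, nil)
	if err != nil {
		return err
	}
	// httptrace records when the first response byte arrives (TTFB).
	trace := &httptrace.ClientTrace{
		GotFirstResponseByte: func() { firstByte = time.Now() },
	}
	req = req.WithContext(httptrace.WithClientTrace(req.Context(), trace))

	start = time.Now()
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	// Drain the body to count total bytes transferred.
	n, err := io.Copy(io.Discard, resp.Body)
	if err != nil {
		return err
	}
	elapsed := time.Since(start)

	fmt.Printf("ttfb=%v avg_speed=%.0f B/s bytes=%d status=%d\n",
		firstByte.Sub(start), float64(n)/elapsed.Seconds(), n, resp.StatusCode)
	return nil
}

func main() {
	// Hypothetical SP endpoint and CID, for illustration only.
	_ = probeHTTPRetrieval("http://sp.example.com:7777", "bafybeigdyrexamplecid")
}
```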
Using metrics
- Anyone can access data if they know the peer ID
- Infrastructure to stop sending deals to violating SPs:
- Part of integration with Slightness 3
- Goal is to launch the integration this year
- How do we ensure they won’t operate under a different miner ID?
- Are we building a dashboard as part of this effort? (see the query sketch below)
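A hedged sketch of how the deal engine or a dashboard could pull violating SPs out of the Observer's Postgres database. The connection string, table name `retrieval_results`, its columns, and the failure threshold are assumptions, not the actual schema.

```go
// Sketch: list providers with repeated failed retrievals in the last 24 hours.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // Postgres driver
)

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/validation?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	rows, err := db.Query(`
		SELECT provider_id, COUNT(*) AS failures
		FROM retrieval_results
		WHERE error <> '' AND started_at > NOW() - INTERVAL '24 hours'
		GROUP BY provider_id
		HAVING COUNT(*) > 5
		ORDER BY failures DESC`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()

	for rows.Next() {
		var provider string
		var failures int
		if err := rows.Scan(&provider, &failures); err != nil {
			log.Fatal(err)
		}
		fmt.Printf("%s: %d failed retrievals\n", provider, failures)
	}
	if err := rows.Err(); err != nil {
		log.Fatal(err)
	}
}
```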
Pando dependencies