Subnet 13 Validating

Introduction

The Validator is responsible for validating the data delivered by Miners and scoring them according to the Incentive Mechanism.

Validators run multiple concurrent threads:

| Component | Purpose |
| --- | --- |
| Main Loop | Evaluation cycles, weight setting |
| Miner Evaluator | Score calculation and validation |
| On-Demand Processor | Data collection job validation |
| API Server | External query interface (FastAPI/uvicorn) |
| W&B Logger | Metrics logging (rotates every 3 hours) |
| Metagraph Syncer | Network state updates |

System Requirements

Validators require at least 32 GB of RAM but do not require a GPU. We recommend a decent CPU (4+ cores) and sufficient network bandwidth to handle protocol traffic. You must have Python >= 3.10.
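One way to sanity-check the machine before setup:

```bash
# Python should report 3.10 or newer; free -h shows total RAM (32 GB minimum).
python3 --version
free -h
```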

Getting Started

Prerequisites

  1. Data Universe supports Twitter and Reddit scraping via Apify, so you will need to set up your Apify API token. Validators default to using the recommended Reddit account for reliability, but this can be changed by editing the PREFERRED_SCRAPERS map in validator.py locally. Data Universe also supports YouTube scraping via the official YouTube API.

  2. Clone the repo (see the commands after this list).

  3. Install the requirements. From your virtual environment, run the commands shown below.
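Steps 2 and 3 typically amount to the following; the repository URL is an assumption based on the project name, so use the address from the official docs if it differs:

```bash
# Clone the Data Universe repository (URL assumed) and enter the folder.
git clone https://github.com/macrocosm-os/data-universe.git
cd data-universe

# From your activated virtual environment, install the Python dependencies.
python -m pip install -r requirements.txt
```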

With auto-updates

Running the validator with auto-updates starts a process called net13-vali-updater. This process periodically checks for a new git commit on the current branch. When one is found, it runs a pip install for the latest packages and restarts the validator process (whose name is given by the --pm2_name flag).
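The exact start command is documented in the repository; based on the process name and the --pm2_name flag above, it is typically a pm2 invocation along these lines (the script path, process names, and wallet names here are placeholders to verify against the repo):

```bash
# Start the auto-updater under pm2; it supervises and updates the validator process.
# Script path and names are illustrative -- confirm them against the repository docs.
pm2 start --name net13-vali-updater --interpreter python scripts/start_validator.py -- \
  --pm2_name net13-vali \
  --wallet.name your-coldkey \
  --wallet.hotkey your-hotkey
```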

Without auto-updates

If you'd prefer to manage your own validator updates...

From the data-universe folder:
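A direct launch under pm2 typically looks like the following; the entry point path and wallet names are assumptions, so adjust them to your setup:

```bash
# Run the validator directly under pm2, without the auto-updater.
pm2 start python --name net13-vali -- ./neurons/validator.py \
  --wallet.name your-coldkey \
  --wallet.hotkey your-hotkey
```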

Configuring the Validator

Flags

The Validator offers several flags to customize its behavior.

You can view the full set of flags by passing the help flag to the validator entry point.
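For example, assuming the ./neurons/validator.py entry point used in the commands above:

```bash
# Print the full list of supported command-line flags.
python ./neurons/validator.py --help
```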

.env

Your validator .env should look like the example below after setup for all data sources.
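This is a sketch: the variable names follow the usual conventions for Apify, Reddit, and YouTube credentials, but confirm the exact names against the linked docs and substitute your own values:

```bash
# Apify (Twitter and Reddit scraping via Apify)
APIFY_API_TOKEN="your_apify_api_token"

# Reddit account credentials (for the recommended Reddit scraper)
REDDIT_CLIENT_ID="your_reddit_client_id"
REDDIT_CLIENT_SECRET="your_reddit_client_secret"
REDDIT_USERNAME="your_reddit_username"
REDDIT_PASSWORD="your_reddit_password"

# YouTube Data API
YOUTUBE_API_KEY="your_youtube_api_key"
```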

Please see the docs on Apify, Reddit, and YouTube for more information on the environment variables above.
