Subnet 13 - Data Universe
Bittensor absorbs the most data
Subnet 13 is Bittensor’s decentralized data layer, focused on the collection and distribution of fresh, desirable data.
Its incentive mechanism rewards miners for gathering desirable content, scored on the following factors:
The source of the data
The specific categories of data within that source
The age of the data
The uniqueness of the data
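As a rough illustration of how those factors might combine, here is a toy scoring function. The weights, half-life, and function names below are invented for this sketch; they are not subnet 13's actual incentive code.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical weights for illustration only; the real incentive
# mechanism lives in the Data Universe subnet codebase.
SOURCE_WEIGHTS = {"reddit": 0.6, "x": 0.4}          # per-source weight (made up)
CATEGORY_WEIGHTS = {"crypto": 1.0, "sports": 0.7}   # per-category weight (made up)

def desirability(source: str, category: str, scraped_at: datetime,
                 is_duplicate: bool, half_life_hours: float = 24.0) -> float:
    """Score one scraped item: source weight x category weight x freshness.

    Duplicates score zero (uniqueness), and value decays exponentially
    with age so that stale data earns less.
    """
    if is_duplicate:
        return 0.0
    age_hours = (datetime.now(timezone.utc) - scraped_at).total_seconds() / 3600
    freshness = 0.5 ** (age_hours / half_life_hours)  # halves every half_life_hours
    return (SOURCE_WEIGHTS.get(source, 0.0)
            * CATEGORY_WEIGHTS.get(category, 0.5)
            * freshness)
```

For example, a just-scraped Reddit post in a high-value category scores near its full source weight, while the same post scraped two days ago scores a quarter of that.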
Currently, miners are incentivized to scrape data from platforms like Reddit and X (Twitter), keeping the focus on rapidly shifting trends, because in today’s world, stale data just doesn’t cut it.
In fast-moving markets, up-to-date data is critical. Data provided by subnet 13 allows businesses to:
Track brand sentiment and market shifts in real time
Make data-driven decisions based on the latest insights
Refine strategy and stay competitive with fresh intelligence
As machine learning becomes more commoditized, the quality of training data will become a key differentiator. SN13 aims to make data one of Bittensor’s most valuable commodities.
Subnet 13's decentralized design allows data to be distributed across miners and queried by validators, showcasing Bittensor's scalability. Macrocosmos is expanding data sources, developing a queryable API (Endpoints), and is currently the largest open source data provider on HuggingFace with over 17 billion scraped posts and comments (as of January 2025). By providing the raw material for pre-training and inference, subnet 13 is poised to supercharge the next stage of AI model development within the Bittensor ecosystem.
The Gravity subnet (SN13) specializes in decentralized web scraping and data collection across platforms like Reddit and X.
Using the GravityClient in the Macrocosmos SDK, you can easily launch scraping tasks and get structured datasets built by miners.
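The exact GravityClient API surface isn't shown here, so the sketch below uses a stdlib-only stand-in to illustrate the typical task lifecycle (create a scraping task, check its status, fetch the resulting rows). All class and method names are hypothetical, not the real SDK's.

```python
from dataclasses import dataclass, field

@dataclass
class FakeGravityClient:
    """Stand-in for the SDK client: stores tasks in memory instead of
    calling the Macrocosmos API. Method names are illustrative only."""
    api_key: str
    _tasks: dict = field(default_factory=dict)

    def create_task(self, source: str, keywords: list[str]) -> str:
        """Register a scraping task; in the real subnet, miners pick it up."""
        task_id = f"task-{len(self._tasks) + 1}"
        self._tasks[task_id] = {
            "source": source,
            "status": "completed",  # pretend miners finished instantly
            "rows": [{"text": f"post about {k}"} for k in keywords],
        }
        return task_id

    def get_status(self, task_id: str) -> str:
        return self._tasks[task_id]["status"]

    def fetch_dataset(self, task_id: str) -> list[dict]:
        """Return the structured rows miners produced for this task."""
        return self._tasks[task_id]["rows"]

client = FakeGravityClient(api_key="YOUR_API_KEY")  # placeholder key
tid = client.create_task(source="reddit", keywords=["bittensor", "tao"])
if client.get_status(tid) == "completed":
    rows = client.fetch_dataset(tid)
```

Against the real SDK you would poll the status until miners complete the task, then download the dataset; consult the Macrocosmos SDK documentation for the actual method names and parameters.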
For more details about subnet 13’s R&D work, take a look at our Substack articles:
Related resources
Alongside Gravity is our second product launched on SN13, whose purpose is to perform analytics on the collected data.
Subnet 13’s dashboard not only provides up-to-date metrics on the total number of scraped posts, the total number of datasets, and scraping speed, but also statistics on dataset content, such as the number of posts scraped for the most popular topics. The dashboard interface allows filtering by content age and daily scraping metrics, while offering a table-formatted UX.
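The per-topic statistics described above can be approximated in a few lines of stdlib Python. The record layout here (a `topic` and a `scraped_at` field per post) is an assumption for illustration, not the dashboard's actual schema.

```python
from datetime import datetime, timedelta, timezone
from collections import Counter

def topic_counts(posts: list[dict], max_age_days: int) -> Counter:
    """Count posts per topic, keeping only posts newer than max_age_days.

    Mirrors the dashboard's content-age filter combined with its
    most-popular-topics statistic (schema assumed for this sketch).
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return Counter(p["topic"] for p in posts if p["scraped_at"] >= cutoff)

# Example: three posts, one of which is too old to pass a 7-day filter.
now = datetime.now(timezone.utc)
posts = [
    {"topic": "bittensor", "scraped_at": now - timedelta(days=1)},
    {"topic": "bittensor", "scraped_at": now - timedelta(days=10)},
    {"topic": "ai", "scraped_at": now - timedelta(hours=2)},
]
counts = topic_counts(posts, max_age_days=7)
```

`Counter.most_common()` then yields the topics in popularity order, the same ranking the dashboard surfaces.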
To get started, you’ll need an API key. Contact @victork_1 or @victor_ck0 and we’ll get you set up quickly 🚀.