
Running Folding API Server



The Folding API provides protein folding capabilities through the Bittensor network. This API allows users to submit protein sequences for folding, query the status of folding jobs, and retrieve results. The service is designed to work with NVIDIA GPUs for optimal performance in protein folding computations.

Requirements

To ensure reliable and efficient molecular dynamics (MD) simulations, make sure your environment meets the following requirements:

  • GPU: NVIDIA RTX 4090 (recommended). Required for protein folding workloads due to its high number of CUDA cores and strong performance with molecular dynamics tasks. See our reproducibility guidelines for more details on why this specific GPU is recommended.

  • Python: Version 3.11 (for compatibility with Bittensor and OpenMM)

  • Conda: For managing Python environments

  • Poetry: For managing project dependencies

Installation

1. Clone the Repository

git clone https://github.com/macrocosm-os/mainframe.git folding
cd folding

2. Install Conda

If you don't have Conda installed, run the following commands:

mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm ~/miniconda3/miniconda.sh

# Initialize conda
source ~/miniconda3/bin/activate
conda init --all

3. Create and Activate Conda Environment

# Create a new environment with Python 3.11
conda create --name folding python=3.11

# Activate the environment
conda activate folding

4. Install Dependencies

# Install project dependencies using poetry
poetry install --with api

# Install the folding_api package in development mode
pip install -e .

Running the API

# Mainnet (netuid 25 on the finney network)
python3.11 main.py --netuid 25 --subtensor.network finney --wallet.name your_wallet_name --wallet.hotkey your_hotkey --gjp-address 167.99.209.27:4001

# Testnet (netuid 141 on the test network)
python3.11 main.py --netuid 141 --subtensor.network test --wallet.name your_wallet_name --wallet.hotkey your_hotkey --gjp-address 167.99.209.27:4001

Replace your_wallet_name and your_hotkey with your Bittensor wallet credentials.

Global Job Pool (GJP)

The API queries the Global Job Pool (GJP) using SQL for job information:

  • /job_pool endpoint executes SQL queries against the GJP database

  • Results are parsed and transformed into appropriate response models

  • Allows filtering by status, job IDs, and PDB ID search
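
As a rough illustration, the sketch below calls the /job_pool endpoint from Python with the requests library. The HTTP method, query-parameter names (status, pdb_id), auth header, and response shape are assumptions made for illustration only; check the Endpoints page for the authoritative schema.

import requests

API_URL = "http://localhost:8029"   # your running Folding API server
API_KEY = "your_api_key"            # copied from api_keys.json

# Hypothetical query: filter the Global Job Pool by status and PDB ID.
response = requests.get(
    f"{API_URL}/job_pool",
    headers={"X-API-Key": API_KEY},  # header name is an assumption; mirror the Swagger "Authorize" dialog
    params={"status": "active", "pdb_id": "1ubq"},
    timeout=30,
)
response.raise_for_status()

for job in response.json():
    print(job)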

The Global Job Pool also allows validators to distribute jobs and miners to fetch work. The following describes how miners interact with the GJP:

  1. When your miner starts, it automatically connects to the read node

  2. It maintains a local snapshot of the GJP in the /db directory

  3. You can use /scripts/query_rqlite.py to examine and analyze data from the job pool

The interaction between miners and the GJP is facilitated by the FoldingMiner class, which handles job fetching, preparation, and execution.
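
If you want to poke at the local snapshot without the helper script, the sketch below lists its tables with Python's built-in sqlite3 module. It assumes the snapshot in the /db directory is a SQLite-compatible file; the file name used here is a hypothetical placeholder, so adjust it to whatever your miner actually writes.

import sqlite3

# Hypothetical path to the local GJP snapshot; check your miner's /db directory for the real file name.
DB_PATH = "db/folding.db"

conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()

# List the tables in the snapshot to see what job data is available.
cursor.execute("SELECT name FROM sqlite_master WHERE type='table'")
for (table_name,) in cursor.fetchall():
    print(table_name)

conn.close()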

API Usage

1. Accessing the API Documentation

Once the API is running, you can access:

  • API server: http://0.0.0.0:8029 (listens on all interfaces by default)

  • Interactive API documentation (Swagger UI): http://localhost:8029/docs (replace localhost with your server's address if you are accessing it remotely)

2. Authentication

  1. Locate the api_keys.json file in the folding folder

  2. Copy your API key

  3. In the Swagger UI (/docs), click "Authorize" and enter your API key

  4. You can now make authenticated requests to the API
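
Outside of Swagger, you can pass the same key from your own code. The sketch below reads a key from api_keys.json and attaches it to a request; the file's JSON layout and the X-API-Key header name are assumptions, so mirror whatever the Swagger "Authorize" dialog expects.

import json
import requests

# Read a key from api_keys.json in the repository; the structure assumed here
# (a JSON object keyed by API key) is hypothetical - open the file to confirm.
with open("api_keys.json") as f:
    api_key = next(iter(json.load(f)))

# Attach the key to a request against the running server.
response = requests.get(
    "http://localhost:8029/job_pool",
    headers={"X-API-Key": api_key},
    timeout=30,
)
print(response.status_code)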

Additional Notes

  • The API requires an NVIDIA GPU with CUDA support for MD simulations

  • Regular system updates and proper CUDA configuration are essential

  • Monitor system resources during folding operations

  • Keep your API key secure and never share it publicly
