Router Node Setup



WIP - THIS IS EARLY DRAFT

Overview

The Router Node acts as a Web2-compatible RESTful API endpoint, enabling seamless integration of existing Web2 applications into the Cortensor network. It exposes OpenAI-compatible APIs, allowing developers to hot-swap AI inference functionality into their applications without modifying their core infrastructure.

While this Router Node is privately hosted, it mirrors the behavior of a public gateway by bridging external requests with Cortensor’s internal session flow.

For Web3 applications and smart contracts, direct interaction with the Session and Session Queue modules is supported, bypassing the Router Node to operate in a fully decentralized and trustless manner.

This setup empowers developers to serve both traditional and decentralized clients while participating in Cortensor’s distributed AI inference network.

Note: The Router Node setup follows the same process as a standard Cortensor node with additional configuration for API access.


Prerequisites

Before starting, ensure the following:

  • You’ve followed the Cortensor Node Setup Guide to install cortensord and IPFS.

  • Your environment is properly configured with required dependencies and keys.

  • You are running a compatible system (Linux, macOS, or Windows).


Installation Steps

1. Install cortensord and IPFS

Install the following components, using the installation instructions in the Cortensor Node Setup Guide:

  • cortensord (the Cortensor daemon)

  • IPFS (InterPlanetary File System)

2. Generate Node Keys

Use the key generation process described in the node setup documentation to generate necessary identity and signing keys.


Configuration

3. Update Environment File (.env)

Ensure the following variables are present and configured in your .env file:

# Enable API
API_ENABLE=1

# Generate a unique API key
API_KEY=f18a7432-4d1e-47a9-a352-81145275809a

# Set your API port
API_PORT=5010
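A quick sanity check can confirm these variables made it into your environment file. This is a minimal sketch: the `CORTENSOR_ENV` override is a convenience introduced here, not something the daemon reads — cortensord takes the file path on its command line.

```shell
# Sketch: verify the API variables are present in your .env file.
# CORTENSOR_ENV is a convenience override for this snippet only; the
# daemon itself reads the path you pass on the command line.
ENV_FILE="${CORTENSOR_ENV:-$HOME/.cortensor/.env}"
for var in API_ENABLE API_KEY API_PORT; do
  if grep -q "^${var}=" "$ENV_FILE"; then
    echo "ok: ${var} is set"
  else
    echo "missing: ${var}" >&2
  fi
done
```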

4. Generate a New API Key

You can generate a secure API key using the following command:

cortensord ~/.cortensor/.env tool gen-api-key

Copy and paste the generated key into your .env under API_KEY.
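If you prefer to script this step rather than editing by hand, something like the following works. It assumes the generated key is the last line the tool prints — check the actual output of gen-api-key before relying on that.

```shell
# Sketch: write a freshly generated key into .env without manual editing.
# Assumes the key is the last line gen-api-key prints; verify first.
# (GNU sed shown; on macOS use `sed -i ''` instead of `sed -i`.)
NEW_KEY="$(cortensord ~/.cortensor/.env tool gen-api-key | tail -n 1)"
sed -i "s|^API_KEY=.*|API_KEY=${NEW_KEY}|" ~/.cortensor/.env
grep "^API_KEY=" ~/.cortensor/.env   # confirm the key was written
```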


Launching the Router Node

5. Start the Router Node

Use the following command to start your router node:

cortensord ~/.cortensor/.env routerv1

Upon startup, your router node will:

  • Register itself to handle session routing

  • Open API access on the configured port (default: 5010)

  • Begin communication with miners over WebSocket

  • Accept and relay inference tasks from clients
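Once started, a simple connectivity check confirms the API port is open. This sketch only tests that something is listening on the configured port (default 5010); it does not exercise any specific endpoint.

```shell
# Sketch: confirm the router's API port is accepting connections.
# PORT defaults to 5010 (the API_PORT value configured earlier).
PORT="${API_PORT:-5010}"
if curl -s -o /dev/null "http://localhost:${PORT}/"; then
  echo "router API reachable on port ${PORT}"
else
  echo "router API not reachable on port ${PORT}" >&2
fi
```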


Post-Setup: API Access

Once the router node is running, you can:

  • Use the Web2 REST API to create sessions and submit inference tasks

  • Monitor inference data as it streams between your node and miners

  • Integrate Cortensor AI functionality into applications via SDKs or custom integrations
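As a starting point, an OpenAI-style request against your own router node might look like the sketch below. The /v1/completions path and the payload fields are assumptions based on the OpenAI-compatible API claim above — confirm the exact endpoints and request schema in the API Reference.

```shell
# Sketch: submit an inference request to your router node.
# Path and payload are assumed from OpenAI compatibility; verify
# against the API Reference before using in production.
curl -s "http://localhost:${API_PORT:-5010}/v1/completions" \
  -H "Authorization: Bearer ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello from Cortensor", "stream": false}'
```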


Notes

  • The router node does not perform inference—it coordinates task flow between users and miners.

  • Your node must remain online and responsive to maintain API availability.

  • In future releases, additional features such as task prioritization, caching, and rate limits may be configurable.


By hosting your own router node, you gain private access to Cortensor’s decentralized AI inference capabilities, with full control over task submission, request routing, and session monitoring.

Follow the Cortensor Node Setup Guide for full installation instructions.

For more details on API endpoints and usage, see the 📘 API Reference.