Router Node Setup
WIP - THIS IS EARLY DRAFT
Overview
The Router Node acts as a Web2-compatible RESTful API endpoint, enabling seamless integration of existing Web2 applications into the Cortensor network. It provides OpenAI-compatible APIs, allowing developers to drop in AI inference functionality without modifying their core infrastructure.
While this Router Node is privately hosted, it mirrors the behavior of a public gateway by bridging external requests with Cortensor’s internal session flow.
For Web3 applications and smart contracts, direct interaction with the Session and Session Queue modules is supported, bypassing the Router Node to operate in a fully decentralized and trustless manner.
This setup empowers developers to serve both traditional and decentralized clients while participating in Cortensor’s distributed AI inference network.
Note: The Router Node setup follows the same process as a standard Cortensor node with additional configuration for API access.
Prerequisites
Before starting, ensure the following:
You’ve followed the Cortensor Node Setup Guide to install cortensord and IPFS.
Your environment is properly configured with required dependencies and keys.
You are running a compatible system (Linux, macOS, or Windows).
Installation Steps
1. Install cortensord and IPFS
Follow the installation instructions to install:
cortensord (Cortensor daemon)
IPFS (InterPlanetary File System)
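Before moving on, it can help to confirm that both tools are reachable from your shell; this quick check only assumes that cortensord is on your PATH and uses the standard ipfs version command:
# Confirm the cortensord binary is on the PATH
which cortensord
# Print the installed IPFS version
ipfs version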
2. Generate Node Keys
Use the key generation process described in the node setup documentation to generate necessary identity and signing keys.
Configuration
3. Update Environment File (.env)
Ensure the following variables are present and configured in your .env file:
# Enable API
API_ENABLE=1
# Generate a unique API key
API_KEY=f18a7432-4d1e-47a9-a352-81145275809a
# Set your API port
API_PORT=5010
# Router External IP and Port for Miner Communication
# Used for external access to the router
ROUTER_EXTERNAL_IP="192.168.250.221"
ROUTER_EXTERNAL_PORT="9001"
# Router REST Bind IP and Port for Client Communication
# Reverse proxy to this IP and port
ROUTER_REST_BIND_IP="127.0.0.1"
ROUTER_REST_BIND_PORT="5010"
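To double-check these settings after editing, you can list the router- and API-related variables with a quick grep (assuming the default environment file at ~/.cortensor/.env):
# Show the API and router settings currently in the environment file
grep -E '^(API_ENABLE|API_KEY|API_PORT|ROUTER_)' ~/.cortensor/.env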
4. Generate a New API Key
You can generate a secure API key using the following command:
cortensord ~/.cortensor/.env tool gen-api-key
Copy and paste the generated key into your .env file under API_KEY.
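If you prefer to update the file from the shell rather than pasting by hand, the following sketch works under the assumption that gen-api-key prints only the key to standard output; if it prints anything else, copy the key manually as described above:
# Capture a freshly generated key (assumes the key is the command's only output)
NEW_KEY=$(cortensord ~/.cortensor/.env tool gen-api-key)
# Replace the existing API_KEY line in the environment file (GNU sed; on macOS use sed -i '')
sed -i "s/^API_KEY=.*/API_KEY=${NEW_KEY}/" ~/.cortensor/.env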
Launching the Router Node
5. Create a Session from the Dashboard
Before starting your Router Node, create a session via the Cortensor Dashboard: 🔗 https://dashboard-alpha.cortensor.network/session
6. Start the Router Node
Use the following command to start your router node:
cortensord ~/.cortensor/.env routerv1
Upon startup, your router node will:
Register itself to handle session routing
Open API access on the configured port (default: 5010)
Begin communication with miners over WebSocket
Accept and relay inference tasks from clients
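Once the process is up, you can confirm that the REST API is bound to the configured address and port (5010 in the example above); this check uses standard Linux tooling and does not rely on any Cortensor-specific endpoint:
# Verify the router REST port is listening on 127.0.0.1:5010
ss -ltn | grep ':5010'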
Post-Setup: API Access
Once the router node is running, you can:
Use the Web2 REST API to create sessions and submit inference tasks
Monitor inference data as it streams between your node and miners
Integrate Cortensor AI functionality into applications via SDKs or custom integrations
For more details on API endpoints and usage, visit: 📘 API Reference
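As a quick illustration of client access, the sketch below sends a chat-style request through the router. It assumes an OpenAI-compatible /v1/chat/completions path, Bearer authentication with your API_KEY, and the REST bind address from the configuration above; confirm the exact paths and payload fields in the API Reference before relying on them.
# Hypothetical request against the router's OpenAI-compatible API (verify the path in the API Reference)
curl http://127.0.0.1:5010/v1/chat/completions \
  -H "Authorization: Bearer f18a7432-4d1e-47a9-a352-81145275809a" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello from a Web2 client"}]}'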
Notes
The router node does not perform inference—it coordinates task flow between users and miners.
Your node must remain online and responsive to maintain API availability.
In future releases, additional features such as task prioritization, caching, and rate limits may be configurable.
By hosting your own router node, you gain private access to Cortensor’s decentralized AI inference capabilities, with full control over task submission, request routing, and session monitoring.