Technical Architecture

Cortensor's architecture is meticulously crafted to provide a robust, scalable, and secure platform for decentralized AI inference. Our technical implementation integrates cutting-edge blockchain technology with advanced AI capabilities, establishing a unique ecosystem for AI computation and orchestration. Here is an overview of the key components and functionalities that define the Cortensor architecture:

Key Components

  1. Multi-Layer Blockchain Architecture:

    • Layer 1 (L1): The foundational layer, providing base security and consensus.

    • Layer 2 (L2): Dedicated to AI orchestration and task management.

    • Layer 3 (L3): Focuses on privacy-preserving computations and supports customized chains.

  2. Proof of Useful Work (PoUW):

    • A novel consensus mechanism that combines network security with practical AI tasks.

    • Implements a two-level system for node evaluation and capability assessment.

  3. Decentralized AI Inference:

    • Utilizes a distributed network of nodes to perform AI computations.

    • Supports various hardware types, including CPUs and GPUs, ensuring inclusivity and adaptability.

  4. Intelligent Routing System:

    • Router nodes are employed for optimal task allocation.

    • Dynamic matching of inference requests to node capabilities ensures efficient task execution.

  5. Multi-Layered Validation:

    • Guard/validation nodes verify the results of AI tasks.

    • A reputation system ensures high-quality outputs by evaluating node performance.

  6. Universal Accessibility:

    • Compatible with both Web2 (REST API) and Web3 (SDK) environments (see the request sketch after this list).

    • Integrates seamlessly with popular AI frameworks and models.

  7. Privacy and Security Features:

    • Provides optional encrypted data transmission and storage.

    • L3 chains offer permissioned access and enhanced privacy for sensitive computations.
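
To ground the Web2 path mentioned in item 6, here is a minimal sketch of submitting an inference prompt over REST. The endpoint URL, payload fields, and API-key header are hypothetical placeholders rather than Cortensor's published API; see the REST API page under the Integration Guide for the actual interface.

```python
# Minimal sketch of a Web2-style inference request.
# The endpoint, payload fields, and auth header are illustrative placeholders,
# not Cortensor's published REST API.
import requests

ROUTER_URL = "https://router.example.com/v1/completions"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

payload = {
    "session_id": "session-123",  # session created beforehand with a token deposit
    "prompt": "Summarize the benefits of decentralized AI inference.",
    "max_tokens": 128,
}

resp = requests.post(
    ROUTER_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```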

Core Functionalities

  • AI Inference Task Distribution and Execution: Efficiently distributes and executes AI inference tasks across the network.

  • Node Capability Assessment and Ranking: Periodically assesses and ranks nodes based on their capabilities and performance (a small scoring sketch follows this list).

  • Token-Based Incentive System: Rewards nodes for their contributions to AI inference and task execution.

  • AI Model Marketplace: Facilitates sharing, monetizing, and accessing a variety of AI models.

  • Synthetic Data Generation: Supports the generation of synthetic data for various applications, enhancing data diversity and availability.
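
To make the capability assessment and ranking idea concrete, the sketch below scores nodes from periodic performance measurements and orders them. The metrics and weights are illustrative assumptions, not Cortensor's actual scoring formula.

```python
# Illustrative sketch of periodic node ranking; the metrics and weights are
# assumptions for demonstration, not Cortensor's actual scoring formula.
from dataclasses import dataclass

@dataclass
class NodeStats:
    node_id: str
    tokens_per_second: float   # measured inference throughput
    success_rate: float        # share of tasks validated as correct (0..1)
    uptime: float              # availability over the assessment window (0..1)

def score(n: NodeStats) -> float:
    # Weighted blend of correctness, availability, and (capped) throughput.
    return 0.5 * n.success_rate + 0.3 * n.uptime + 0.2 * min(n.tokens_per_second / 100.0, 1.0)

def rank(nodes: list[NodeStats]) -> list[NodeStats]:
    return sorted(nodes, key=score, reverse=True)

nodes = [
    NodeStats("miner-a", tokens_per_second=40.0, success_rate=0.99, uptime=0.97),
    NodeStats("miner-b", tokens_per_second=90.0, success_rate=0.90, uptime=0.99),
]
for n in rank(nodes):
    print(n.node_id, round(score(n), 3))
```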

Roles

Cortensor's network consists of various node types, each playing a specific role in the ecosystem:

  • Router Nodes: Act as intermediaries between users and miners, ensuring secure communication and optimal task allocation.

  • Miner Nodes: Perform AI inference tasks on hardware ranging from low-end devices to high-end GPUs, contributing computing power to the network.

  • Clients/Users: Initiate sessions, submit prompts, and receive AI inference results through the network.

  • Oracle/Master Guard Nodes: Maintain block time consistency, validate tasks, and ensure network reliability.

Network & Flow

The network's flow is designed to facilitate seamless interaction between different nodes and ensure efficient task execution:

  • Session Creation: Users create sessions by depositing tokens, with the deposit amount calculated in terms of LLM tokens.

  • Task Routing: Router nodes handle the incoming prompts, verify payments, and route tasks to the appropriate miner nodes.

  • Task Execution: Miner nodes perform the assigned tasks and submit results securely.

  • Validation: Results are validated by other miner nodes or validation nodes to ensure accuracy and reliability.
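
The following sketch walks through the same flow end to end: session creation, routing, execution, and validation. Every function here is a simplified stand-in for the roles described above, not an actual Cortensor interface.

```python
# End-to-end flow sketch: session creation -> routing -> execution -> validation.
# All names are simplified stand-ins for the roles described above.
import random

def create_session(deposit_llm_tokens: int) -> dict:
    # 1. Session creation: the user deposits tokens, sized in terms of LLM tokens.
    return {"id": "session-1", "budget": deposit_llm_tokens}

def route_task(session: dict, prompt: str, miners: list[str]) -> str:
    # 2. Task routing: the router verifies the deposit covers the prompt
    #    and assigns a miner (picked at random here for simplicity).
    if session["budget"] < len(prompt.split()):
        raise ValueError("insufficient deposit for this prompt")
    return random.choice(miners)

def run_inference(miner: str, prompt: str) -> str:
    # 3. Task execution: the assigned miner produces a result.
    return f"[{miner}] answer to: {prompt}"

def validate(result: str, validators: list[str]) -> bool:
    # 4. Validation: other nodes check the result; here every validator approves.
    return all(True for _ in validators)

session = create_session(deposit_llm_tokens=1000)
prompt = "Explain decentralized inference."
miner = route_task(session, prompt, ["miner-a", "miner-b"])
result = run_inference(miner, prompt)
print(result if validate(result, ["val-a", "val-b"]) else "rejected")
```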

Coordination & Orchestration

Effective coordination and orchestration are crucial for maintaining the network's performance and reliability:

  • Job Scheduling: Router nodes act as job schedulers, allocating tasks based on node capabilities and user requirements.

  • Dynamic Matching: Ensures that tasks are matched to the most suitable nodes, optimizing resource utilization and task completion times.
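
A minimal sketch of dynamic matching, assuming each task advertises its requirements and each node advertises its capabilities; the fields used here (VRAM, model name, current load) are illustrative, not Cortensor's actual scheduling criteria.

```python
# Dynamic matching sketch: pick the least-loaded node that satisfies a task's
# requirements. The capability fields are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    node_id: str
    vram_gb: int
    models: set[str]
    load: float  # 0.0 (idle) .. 1.0 (saturated)

@dataclass
class Task:
    model: str
    min_vram_gb: int

def match(task: Task, nodes: list[Node]) -> Optional[Node]:
    eligible = [n for n in nodes if task.model in n.models and n.vram_gb >= task.min_vram_gb]
    # Prefer the least-loaded eligible node to balance utilization and latency.
    return min(eligible, key=lambda n: n.load, default=None)

nodes = [
    Node("gpu-1", vram_gb=24, models={"llama-3-8b"}, load=0.7),
    Node("gpu-2", vram_gb=16, models={"llama-3-8b", "mistral-7b"}, load=0.2),
    Node("cpu-1", vram_gb=0,  models={"mistral-7b"}, load=0.1),
]
print(match(Task(model="llama-3-8b", min_vram_gb=12), nodes))
```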

AI Inference

Open Source Models

  • Utilizes open-source AI models to provide a wide range of inferencing capabilities.

  • Ensures compatibility and integration with various AI frameworks.
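
As a hedged illustration of serving an open-source model, the snippet below runs local text generation with Hugging Face Transformers. The specific model is only an example of an openly licensed LLM; it is not a statement about which models or frameworks Cortensor ships or requires.

```python
# Sketch of local inference with an open-source model via Hugging Face Transformers.
# The model name is an example of the kind of open model a miner might serve;
# this is not a Cortensor-specific integration.
from transformers import pipeline

generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")
output = generator("Decentralized AI inference means", max_new_tokens=50)
print(output[0]["generated_text"])
```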

Performance and Scalability

  • Designed to handle a large number of concurrent tasks.

  • Scales efficiently with the addition of more nodes, ensuring robust performance even under high demand.

Consensus & Validation

  • Proof of Inference: A consensus mechanism that ensures tasks are completed correctly and efficiently.

  • Validation Process: Involves multiple nodes in verifying results to ensure accuracy and reliability.
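
To illustrate how redundant results might be cross-checked, the sketch below compares outputs from several nodes by embedding similarity and keeps the majority-consistent answers. The character-frequency "embedding" and the threshold are placeholders; the actual mechanism is Proof of Inference / PoUW, described in the Consensus & Validation subsections.

```python
# Illustrative cross-check of redundant inference results: outputs whose embeddings
# sit close to the majority are treated as agreeing. The character-frequency
# "embedding" and threshold are stand-ins for real model embeddings and PoI rules.
import math

def embed(text: str) -> list[float]:
    # Placeholder embedding: normalized character-frequency vector.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

def agreeing_results(results: list[str], threshold: float = 0.95) -> list[str]:
    embs = [embed(r) for r in results]
    agreed = []
    for i, r in enumerate(results):
        # Count how many other results this one is close to.
        close = sum(1 for j, e in enumerate(embs) if i != j and cosine(embs[i], e) >= threshold)
        if close >= len(results) // 2:
            agreed.append(r)
    return agreed

print(agreeing_results([
    "Paris is the capital of France.",
    "The capital of France is Paris.",
    "I refuse to answer.",
]))
```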

Data Management

  • Data Privacy: Ensures data is handled securely and privately, with optional encryption.

  • Data Storage: Utilizes decentralized storage solutions to maintain data integrity and accessibility.
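
Since the Data Management subsections mention IPFS integration, here is a minimal sketch of storing a result payload on a local IPFS node over its HTTP RPC API. The local daemon address and the assumption that results are persisted this way are illustrative.

```python
# Minimal sketch: store an inference result on a local IPFS node via its HTTP RPC API.
# Assumes a Kubo/IPFS daemon listening on the default port 5001; persisting results
# this way is an illustration of the decentralized-storage design, not a mandate.
import json
import requests

result = {"session_id": "session-1", "prompt": "...", "output": "..."}

resp = requests.post(
    "http://127.0.0.1:5001/api/v0/add",
    files={"file": ("result.json", json.dumps(result))},
    timeout=30,
)
resp.raise_for_status()
cid = resp.json()["Hash"]  # content identifier used to retrieve the payload later
print("stored at CID:", cid)
```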

Security & Privacy

  • Encrypted Transmission: Supports encrypted communication across the network.

  • Permissioned Access: L3 chains provide enhanced privacy for sensitive computations.
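
As an illustration of optional payload encryption, the sketch below encrypts a prompt before it leaves the client using Fernet from the `cryptography` package. The choice of cipher and the key handling are assumptions for demonstration, not Cortensor's specified scheme.

```python
# Illustration of encrypting a prompt payload before transmission.
# Fernet (symmetric encryption from the `cryptography` package) is used as an
# example cipher; Cortensor's actual scheme and key exchange may differ.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, negotiated or derived per session
cipher = Fernet(key)

plaintext = b'{"prompt": "confidential business query"}'
ciphertext = cipher.encrypt(plaintext)   # safe to send over the network
restored = cipher.decrypt(ciphertext)    # only holders of the key can read it

assert restored == plaintext
print(ciphertext[:32], b"...")
```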

Type of Services

  • AI Inference: Provides real-time AI inferencing capabilities.

  • Synthetic Data Generation: Supports the generation of synthetic data for various applications.

  • AI Model Marketplace: A platform for sharing and monetizing AI models.

Community & Ecosystem

Contributing to Cortensor

  • Encourages developers and AI enthusiasts to contribute to the network.

  • Provides incentives and rewards for valuable contributions.

Incentives & Rewards

  • Token-Based Rewards: Nodes are rewarded with tokens for their contributions to the network.

  • Staking: Nodes stake tokens to participate in the network, ensuring commitment and reliability.

Governance & Compliance

  • Decentralized Governance: Ensures the network is governed by its community, promoting transparency and fairness.

  • Compliance: Adheres to relevant regulations and standards.

Tokenomics

  • Token Distribution: Tokens are distributed based on contributions and participation.

  • Utility: Tokens are used for various transactions within the network, including paying for services and rewarding nodes.

This comprehensive overview of Cortensor's technical architecture outlines the foundational elements that make it a pioneering platform for decentralized AI inference. The detailed components and functionalities ensure that Cortensor remains robust, scalable, and secure, fostering a collaborative and inclusive ecosystem for AI innovation.
