
Core Concepts

Cortensor is built on a set of core concepts that together provide a robust and scalable framework for decentralized AI. These concepts keep the platform efficient and secure while fostering innovation and community collaboration.

Key Concepts

  1. Decentralized AI Inference

    • Distributed Computing: Utilizing a global network of nodes to perform AI inference tasks, reducing dependency on centralized servers and increasing resilience and flexibility (a minimal sketch follows this list).

    • Scalability: Ensuring that AI inference can scale efficiently with the growing demands of users and applications.
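
To make the distributed-computing idea concrete, the sketch below shows the basic pattern: an inference request is handed to one of many independent nodes rather than to a single central server. The node pool, `InferenceTask`, and the round-robin selection are hypothetical illustrations only, not Cortensor's actual routing logic; a real deployment would select nodes by capability, reputation, and availability.

```python
# Illustrative sketch only: a toy dispatcher that spreads inference tasks
# across a pool of independent nodes. Names and selection logic are
# assumptions, not Cortensor's implementation.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class InferenceTask:
    prompt: str
    model: str  # e.g. an open-source LLM identifier

@dataclass
class Node:
    node_id: str
    run: Callable[[InferenceTask], str]  # node-local inference entry point

class ToyDispatcher:
    """Round-robin distribution of tasks over a global pool of nodes."""

    def __init__(self, nodes: List[Node]):
        self.nodes = nodes
        self._next = 0

    def submit(self, task: InferenceTask) -> str:
        # Pick the next node in the pool; losing any single node only
        # removes one worker, not the whole service.
        node = self.nodes[self._next % len(self.nodes)]
        self._next += 1
        return node.run(task)
```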

  2. Community-Powered Network

    • Collaborative Ecosystem: Encouraging contributions from a diverse community of developers, researchers, and users.

    • Incentive Structures: Implementing reward mechanisms to incentivize participation and innovation within the network.

  3. Multi-Layer Blockchain Architecture

    • Security and Transparency: Using blockchain technology to secure transactions and ensure transparency within the network.

    • Decentralized Governance: Allowing the community to participate in decision-making processes, ensuring a fair and democratic platform.

  4. Incentive Structure

    • Token-Based Rewards: Utilizing $CORTENSOR tokens to reward contributions and participation.

    • Fair Compensation: Ensuring that developers, validators, and users are fairly compensated for their efforts and resources.

  5. Universal AI Accessibility

    • Open-Source Models: Providing access to a variety of open-source AI models, allowing for customization and unrestricted use.

    • Lowering Barriers to Entry: Making advanced AI technologies accessible to a broader audience, promoting widespread adoption and innovation.

  6. Gamification and Quality Control / PoUW

    • Gamified Node Evaluation: Cortensor employs a gamified approach to evaluate and categorize inference nodes. Nodes participate in "games" in which they generate blocks using LLMs, and other nodes evaluate and verify the results in a predictable, deterministic manner (a simplified sketch follows this list).

    • Dynamic Capability Assessment: This process allows Cortensor to continuously assess each node's capabilities, ensuring precise matching of inference tasks to appropriate nodes.

    • Supply-Side Quality Control: The gamified system ensures continuous quality assessment and categorization of nodes, maintaining a high standard of service and capability.

    • Incentivized Participation: Node operators are incentivized to participate in these games, improving their performance and contributing to a competitive and high-quality network.

    • Service Quality Assurance: By classifying nodes based on their capabilities, Cortensor ensures a high quality of service and reliability for end-users and services.

    • NOTE: PoUW is designed to be extensible to additional forms of useful work (see Synthetic Data Generation below).
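
To ground the gamified evaluation described above, here is a simplified, hypothetical round: every miner answers a shared challenge, peers cross-check each answer deterministically, and running scores feed a capability tier. The scoring rule, majority-vote verification, and tier thresholds are illustrative assumptions, not the PoUW protocol itself.

```python
# Toy illustration of a gamified evaluation round. The scoring rule, tier
# thresholds, and majority-vote check are assumptions for clarity, not
# Cortensor's PoUW specification.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class MinerNode:
    node_id: str
    generate: Callable[[str], str]  # LLM-backed answer to a challenge
    score: float = 0.0

def verify(answer: str, peer_answers: List[str]) -> bool:
    """Deterministic check: does the answer agree with a majority of peers?

    A real network could instead replay the inference or compare embeddings.
    """
    agreements = sum(1 for peer in peer_answers if peer == answer)
    return agreements * 2 > len(peer_answers)

def play_round(challenge: str, miners: List[MinerNode]) -> Dict[str, str]:
    """One evaluation game: miners answer, peers cross-check, scores update."""
    answers = {m.node_id: m.generate(challenge) for m in miners}
    for miner in miners:
        peers = [a for nid, a in answers.items() if nid != miner.node_id]
        if verify(answers[miner.node_id], peers):
            miner.score += 1.0   # reward agreement with the verified majority
        else:
            miner.score -= 0.5   # penalize divergent or failed answers
    return answers

def classify(miner: MinerNode) -> str:
    """Map a running score onto a capability tier (illustrative thresholds)."""
    if miner.score >= 10:
        return "high"
    return "standard" if miner.score >= 0 else "probation"
```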

  7. Synthetic Data Generation (Byproduct of PoUW)

    • Synthetic Data as a Byproduct: The gamification process generates significant amounts of question-answer data, which can be valuable for training and improving AI models, providing an additional benefit to the Cortensor ecosystem (an illustrative record format is sketched below).

    • NOTE: A future goal is to use PoUW as an on-demand synthetic data generation framework.
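
As an illustration of that byproduct, the snippet below shows one plausible shape for a question-answer record harvested from a verified evaluation round. The field names and example values are assumptions made for illustration, not a Cortensor data format.

```python
# Hypothetical shape of a synthetic question-answer record produced as a
# byproduct of an evaluation round; field names and values are illustrative.
from dataclasses import dataclass

@dataclass
class SyntheticQARecord:
    challenge: str        # prompt posed to the miner nodes
    answer: str           # output that passed peer verification
    model: str            # open-source model that produced the answer
    verifier_count: int   # how many peers agreed with the answer
    round_id: int         # evaluation round the record came from

# Example: one record that could later be folded into a fine-tuning dataset.
record = SyntheticQARecord(
    challenge="Summarize the benefits of decentralized inference.",
    answer="It removes single points of failure and broadens access to AI.",
    model="llama-3-8b",
    verifier_count=7,
    round_id=1042,
)
```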

These core concepts form the backbone of Cortensor, providing the structure and principles that guide the platform's development and operation. By understanding these key elements, you will gain a deeper appreciation of how Cortensor is designed to democratize AI and foster a collaborative, innovative community.


Disclaimer: This page and the associated documents are currently a work in progress. The information provided may not be up to date and is subject to change at any time.
