Academic and technical papers for Zoo Labs Foundation infrastructure and protocols.
- Organization: https://github.com/zooai
- Foundation: Zoo Labs Foundation (501(c)(3) non-profit)
- Website: https://zoo.ngo
- AI Chat: https://ai.zoo.ngo
- Training Library: https://gym.zoo.ngo
- Blockchain: https://zoo.network
- Contact: foundation@zoo.ai
Zoo Labs Foundation is dedicated to open AI research and development. Our papers cover:
Documentation: https://zoolabs.gitbook.io/whitepaper/
ZIP: ZIP-003-genesis.md
Status: Foundational Document
Launch: October 2021 as 100% airdrop to CryptoZoo users
Title: "Zoo Labs: AI-Powered Asset-Backed NFTs, Agent Wallets, and Conservation Gaming"
Abstract: The foundational whitepaper for the Zoo Labs ecosystem, introducing revolutionary concepts in AI-powered NFT utility, agent-based economics, and conservation technology. Each asset-backed, ZOO-yielding NFT serves as a crypto-funded wallet for its AI agent. Launched October 2021 as a 100% airdrop to users of CryptoZoo.
Core Architecture:
- AI Agent Wallets: Each animal NFT is a pointer to an autonomous AI agent with its own crypto wallet
- Agent-Backed NFT Standard: First NFT standard where NFTs represent AI agents that can hold assets, execute transactions, and interact autonomously
- Crypto-Funded Intelligence: AI agents are funded via NFT collateral, enabling autonomous economic behavior
- Agent Autonomy: Animals can feed, grow, breed, and earn rewards through their AI agent wallets
- Decentralized AI Coordination: Multiple AI agents interact within the ecosystem, coordinating through on-chain protocols
Key Innovations:
- Asset-Backed Yielding NFT Standard: First NFT standard with AI agents as intrinsic value holders
- NFT Liquidity Protocol: Novel mechanism for trading NFT pointers to AI agent wallets
- Collateral-Backed AI Agents: AI agents backed by real assets, tradeable or "burnable" for value
- Conservation + Education Mission: AI-powered gaming platform tied to real-world wildlife conservation
- Game-Fi Mechanics: Gen 0 NFT drop, AI-driven feeding/growing/breeding, autonomous animal rewards
- ZOO DAO: Decentralized governance coordinating AI agents and human participants
- Bridging Blockchains: Multi-chain AI agent infrastructure
Sections:
- Introduction: Abstract, Mission, AI-Powered Zoo Animal Utility, Sustainability, Market Opportunity
- Differentiators: AI Agents as NFTs, ZOO DAO, NFT Liquidity Protocol, Bridging Blockchains
- Game Play: Gen 0 NFT Drop, AI Agent Feeding/Growing/Breeding, Zoo Animal Rewards, AI Assistant, AR App, Collateral-Backed AI Agents, Native Token, NFT Marketplace, Agent Wallet Transfer, Metaverse Companion
- Extras: Partnerships, Open Source
Technical Foundation: This genesis paper laid the foundation for Zoo's pioneering approach to AI agent-backed NFTs. Zoo was one of the first projects (October 2021) to combine:
- AI agents with economic autonomy (crypto-funded wallets per NFT)
- On-chain AI coordination (multiple agents interacting via blockchain)
- Asset-backed intelligence (AI agents holding collateral and yielding value)
Historical Note: Launched October 2021 as a 100% airdrop to CryptoZoo community, demonstrating commitment to fair distribution and community ownership from day one.
These principles established in 2021 continue to guide the ecosystem's evolution into decentralized semantic optimization (DSO), proof-of-AI (PoAI) consensus, and collective intelligence systems.
File: zoo-dso.pdf
LaTeX Source: zoo-dso.tex
ZIP: ZIP-001-dso.md
Status: Published October 2025
Title: "Decentralized Semantic Optimization: Byzantine-Robust Prior Aggregation for Collective AI Intelligence"
Abstract: A decentralized protocol for aggregating AI model priors with Byzantine fault tolerance, enabling collective intelligence at network scale.
Key Contributions:
- Byzantine-robust prior aggregation
- Distributed active inference framework
- Multi-agent coordination via shared priors
- Integration with Hanzo's ASO and HMM
- Proof-of-AI (PoAI) consensus mechanisms
Builds On: Hanzo ASO (HIP-002), Hanzo HMM (HIP-004)
ZIP: ZIP-002-poai.md
Status: Active Specification
Title: "Proof of AI: Bayesian Active Inference-Inspired Consensus for Verifiable AI Compute"
Abstract: Zoo's consensus mechanism for quality verification of AI compute: a training-free, distributed, LLM-agnostic GRPO experiential layer for building collective intelligence.
Key Features:
- TEE attestations and Merkle proofs
- Bayesian active inference quality scoring
- Integration with Hanzo HMM for economic settlement
- Slashing for fraudulent attestations
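The attestation-and-proof flow above can be sketched with a minimal Merkle tree over attestation payloads. This is an illustration using SHA-256, not Zoo's actual on-chain verifier; the `attestations` byte strings are placeholders for serialized TEE quotes.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Root of a Merkle tree over hashed leaves, duplicating the last node on odd levels."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes (with left/right position) from leaf `index` up to the root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index + 1 if index % 2 == 0 else index - 1
        proof.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the root from a leaf and its proof; a settlement contract would do the same."""
    node = h(leaf)
    for sibling, leaf_is_left in proof:
        node = h(node + sibling) if leaf_is_left else h(sibling + node)
    return node == root

attestations = [b"att-0", b"att-1", b"att-2", b"att-3"]  # placeholder TEE attestation payloads
root = merkle_root(attestations)
ok = verify(b"att-2", merkle_proof(attestations, 2), root)        # genuine attestation passes
forged = verify(b"att-X", merkle_proof(attestations, 2), root)    # forged payload fails
```

Slashing for fraudulent attestations then reduces to a failed `verify` against the committed root.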
File: zip-gym-tokenomics.md
Status: Active Specification
Title: "GYM Token Economics and Integration"
Overview: Economic model for Zoo's decentralized AI training network.
Key Features:
- GYM token specification (1B supply on BSC)
- Compute mining rewards and quality metrics
- Staking tiers (Bronze to Diamond)
- Slashing conditions and governance
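A tier lookup for the staking scheme above might look like the following sketch. The intermediate tier names between Bronze and Diamond, and all thresholds, are placeholders; the actual values are defined in zip-gym-tokenomics.md.

```python
# Illustrative only: tier names other than Bronze/Diamond and all thresholds
# below are placeholders, not the values specified in zip-gym-tokenomics.md.
TIERS = [
    ("Diamond", 1_000_000),
    ("Platinum", 250_000),
    ("Gold", 50_000),
    ("Silver", 10_000),
    ("Bronze", 1_000),
]

def staking_tier(staked_gym: int):
    """Map a staked GYM balance to the highest tier it qualifies for (None if below Bronze)."""
    for name, minimum in TIERS:
        if staked_gym >= minimum:
            return name
    return None

print(staking_tier(75_000))   # Gold
print(staking_tier(500))      # None
```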
Research focus areas:
- Decentralized semantic learning protocols
- Byzantine-robust prior aggregation
- Active inference at network scale
- Proof-of-AI (PoAI) consensus mechanisms
Repository: https://zenlm.ai
GitHub: https://github.com/zenlm/papers
Models: https://huggingface.co/zenlm
Co-developed by: Hanzo AI Inc & Zoo Labs Foundation
Zen AI is a comprehensive family of ultra-efficient language models that serve as the base for the frontier models co-developed by Hanzo AI and Zoo Labs Foundation. All models are based on Qwen3 and newer architectures with innovative optimizations.
Main Whitepaper: ZEN_WHITEPAPER_2025.md (v1.0.1)
- Complete Zen ecosystem: 600M to 480B parameters
- 95% reduction in energy consumption
- Recursive AI Self-Improvement System (RAIS) achieving 94% effectiveness
- Edge deployment enabling privacy-preserving AI
Model Hub: Visit https://huggingface.co/zenlm for:
- Core Models: Nano (600M edge), Eco (balanced), Coder (code gen), Omni (multimodal), Next (MoE), Guard (safety)
- Specialized Models: Artist, Designer, Scribe, Director, and more
- Direct Downloads: All models available for immediate use
Documentation: Visit https://zenlm.ai and https://github.com/zenlm/papers for:
- Architecture Overview: Complete technical specifications and training methodologies
- Deployment Guides: Edge computing, cloud optimization, hybrid strategies
- Integration Examples: Use with Hanzo and Zoo ecosystems
Zoo Labs Foundation (501(c)(3)) partners with Hanzo AI Inc (Techstars '17) to:
- Advance open AI research
- Ensure model accessibility
- Promote privacy-preserving AI
- Enable edge deployment for everyone
This unique partnership ensures that cutting-edge AI remains accessible while advancing the state of the art.
Status: COMPLETE (7 papers, ~9,500 lines, 362KB, 140+ citations)
GitHub Pages: https://zooai.github.io/papers/
File: zoo-foundation-mission.tex (43KB, ~1,100 lines)
Version: v2025.09 (September 2025)
Documents Zoo Labs Foundation's 501(c)(3) non-profit mission across three pillars:
- Conservation AI: 73 species genomes sequenced, de-extinction research
- Educational AI: 87,000+ users across 142 countries, 420,000+ models trained
- Frontier AI: Pushing boundaries of AI+blockchain synthesis
File: zoo-network-architecture.tex (48KB, ~1,200 lines)
Version: v2025.09 (September 2025)
Complete L2 AI specialization layer architecture:
- Layered design: Lux (L0) → Hanzo (L1) → Zoo (L2)
- HLLM integration with 2.8ms routing overhead
- Post-quantum cryptography (NIST FIPS 203/204)
- Federated learning and Experience Ledger
- 40+ academic references
File: zoo-tokenomics.tex (77KB, 2,050+ lines)
Version: v2025.09 (September 2025)
Revolutionary 100% airdrop tokenomics model:
- Total Supply: 2 trillion KEEPER tokens
- DAO Allocation: 1 trillion (50%)
- Community Airdrop: 1 trillion (100% distributed, zero VC/private sales)
- Gini Coefficient: 0.61 (vs industry 0.80-0.90)
- Validator Requirements: 1,000 KEEPER minimum stake (democratically low)
- Proof of AI (PoAI): Validation weighted by LLM experience sharing and semantic contributions
Cross-network validator comparison (Lux/Hanzo/Zoo):
- Lux: 100 validators, 1B tokens each, 100-year unlock, 1M minimum stake (PoS + Genesis)
- Hanzo: Unlimited validators, 1 AI minimum stake, self-mined on any device and then used to participate in the HMM compute market (PoW compute)
- Zoo: Unlimited validators, 1,000 KEEPER minimum stake (PoAI - weighted by experience contributions)
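The Gini coefficient quoted above can be computed directly from a list of balances. The distributions below are illustrative only, not Zoo's actual holder data.

```python
def gini(balances):
    """Gini coefficient of a token distribution: 0 = perfect equality, 1 = total concentration."""
    xs = sorted(balances)
    n, total = len(xs), sum(xs)
    # Weighted-sum formulation: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2 * weighted / (n * total) - (n + 1) / n

# Illustrative only: a perfectly equal airdrop scores 0,
# while a heavily concentrated distribution approaches 1.
equal = gini([100, 100, 100, 100])    # 0.0
concentrated = gini([1, 1, 1, 997])   # ≈ 0.747
```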
File: gym-training-platform.tex (49KB, ~1,228 lines)
Version: v2023.05 → v2025.09 (2.5-year evolution)
Comprehensive AI training platform:
- 100+ Models: Qwen3, LLaMA, Mistral, DeepSeek, Yi, Gemma, ChatGLM, Phi
- 8 Training Methods: SFT, LoRA, QLoRA, PPO, DPO, GRPO, GSPO, KTO
- Training-Free GRPO Integration: Revolutionary semantic optimization
- Educational Impact: 87K users, 142 countries, 420K models trained
- 31 Academic Citations
File: hllm-training-free-grpo.tex (71KB, ~50 pages)
Version: v2025.09 (September 2025)
Revolutionary AI training paradigm shift:
- 99.8% Cost Reduction: $18 vs $10,000+ for traditional fine-tuning
- AIME24: 82.7% accuracy (+2.7% improvement over fine-tuning)
- AIME25: 73.3% accuracy (+5.4% improvement)
- 100× Data Efficiency: 100 examples vs 10,000+ required
- Hamiltonian Invariant: Ψ·Π = κ (conservation law)
- 3-Stage Algorithm: Trajectory Summarization → Group Advantage Extraction → Batch Consolidation
- 30+ Academic Citations
File: experience-ledger-dso.tex (54KB, ~1,600 lines)
Version: v2025.09 (September 2025)
Decentralized Semantic Optimization implementation:
- Three-Layer Storage: On-chain (Merkle roots) → IPFS (mutable) → Arweave (permanent)
- 7680-Dimensional Embeddings: Native dimension from Zen-Reranker
- Byzantine-Robust Curation: Median aggregation, 92% accuracy with 33% malicious nodes
- 3 Formal Theorems: With proofs for Byzantine resistance, convergence, coverage growth
- 5 Algorithms: Merkle construction, retrieval, extraction, etc.
- Operational Cost: $3,726 annual vs $50K+ alternatives
- 17+ Academic Citations
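The median-aggregation idea behind the Byzantine-robust curation above can be sketched as coordinate-wise medians over submitted vectors. This is a minimal illustration, not the paper's full curation protocol; the honest/malicious vectors below are made up.

```python
import statistics

def robust_aggregate(vectors):
    """Coordinate-wise median: with fewer than half the submissions malicious,
    each coordinate's median lands inside the honest range."""
    return [statistics.median(coord) for coord in zip(*vectors)]

honest = [[1.0, 2.0], [1.1, 2.1], [0.9, 1.9], [1.0, 2.0]]
malicious = [[100.0, -100.0], [100.0, -100.0]]  # 33% adversarial submissions
agg = robust_aggregate(honest + malicious)
print(agg)  # stays near the honest consensus despite the outliers
```

A simple mean would be dragged far off by the two adversarial vectors; the median ignores them entirely, which is the intuition behind the 33%-malicious tolerance figure.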
File: zip-002-zen-reranker.tex (49KB, ~1,288 lines)
Version: v2025.10 (October 2025)
Technical specification for Zen-Reranker:
- Native 7680-Dim Embeddings: 10× larger than typical 768-dim
- BitDelta Compression: 31.87× ratio (964 bytes from 30KB)
- Performance: 94.7% Recall@5 cross-model retrieval
- Byzantine-Robust Aggregation: 92% accuracy with malicious nodes
- Smart Contract Integration: On-chain experience registry
- MTEB Benchmark: 68.4 average
- 25 Academic Citations
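A back-of-envelope sketch of a BitDelta-style codec (one sign bit per weight delta plus a single fp16 scale, following the BitDelta paper) reproduces the quoted ~31.87× ratio against an fp32 baseline. This is illustrative arithmetic, not Zen-Reranker's actual codec; the fake delta values are placeholders.

```python
def bitdelta_compress(delta):
    """BitDelta-style 1-bit compression (illustrative): keep only the sign of each
    weight delta plus a single scalar scale (the mean absolute value)."""
    scale = sum(abs(d) for d in delta) / len(delta)
    signs = [d >= 0 for d in delta]
    return signs, scale

def bitdelta_decompress(signs, scale):
    """Reconstruct each delta as +scale or -scale from its stored sign bit."""
    return [scale if s else -scale for s in signs]

n = 4096
delta = [((i * 2654435761) % 1000 - 500) / 50_000 for i in range(n)]  # fake fp32 deltas
signs, scale = bitdelta_compress(delta)
compressed_bytes = n / 8 + 2        # 1 bit per sign + one fp16 scale
ratio = (4 * n) / compressed_bytes  # fp32 baseline: 4 bytes per element
print(round(ratio, 2))              # ≈ 31.88, matching the ~31.87x quoted above
```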
Tokenomics Revolution:
- ✅ First 100% airdrop AI token (zero VC/private sales)
- ✅ Gini coefficient 0.61 (most equitable distribution in AI+blockchain)
- ✅ Proof of AI (PoAI) validation weighted by experience sharing (1,000 KEEPER stake)
AI Training Breakthrough:
- ✅ 99.8% cost reduction ($18 vs $10,000+)
- ✅ 100× data efficiency (100 vs 10,000+ examples)
- ✅ +2.7% accuracy improvement (AIME24: 82.7%)
- ✅ Hamiltonian conservation law (Ψ·Π = κ)
Infrastructure:
- ✅ Decentralized Semantic Optimization (DSO)
- ✅ Byzantine fault tolerance (92% accuracy with 33% malicious)
- ✅ Three-layer storage (on-chain/IPFS/Arweave)
- ✅ 7680-dimensional native embeddings
Educational Impact:
- ✅ 87,000+ users worldwide
- ✅ 420,000+ models trained
- ✅ 142 countries participating
- ✅ Zero barriers to entry
Automated CI/CD: All papers are automatically compiled to PDF on every commit via GitHub Actions.
Local Compilation:
# Install LaTeX (one-time)
brew install basictex # macOS
# or: sudo apt-get install texlive-full # Ubuntu
# Compile all papers
for tex in *.tex; do
base="${tex%.tex}"
pdflatex "$tex"
bibtex "$base" 2>/dev/null || true
pdflatex "$tex"
pdflatex "$tex"
done
# PDFs will be in current directory
ls -lh *.pdf

Download PDFs: Visit https://zooai.github.io/papers/ to download all compiled PDFs.
If you use these papers in your research, please cite:
@misc{zoo2025foundation,
title={Zoo Labs Foundation: AI-Blockchain Research for Conservation, Education, and Frontier Science},
author={Zoo Labs Foundation Inc},
year={2025},
organization={zoo.ngo}
}
@misc{zoo2025grpo,
title={HLLM with Training-Free GRPO: 99.8\% Cost Reduction via Semantic Optimization},
author={Zoo Labs Foundation Inc},
year={2025},
organization={zoo.ngo}
}
@misc{zoo2025dso,
title={Experience Ledger: Decentralized Semantic Optimization for Large Language Models},
author={Zoo Labs Foundation Inc},
year={2025},
organization={zoo.ngo}
}

For Hanzo-specific research (ASO, HMM, Jin architecture):
- Repository: https://github.com/hanzoai/papers
- Focus: AI compute infrastructure, market mechanisms, multimodal systems
For Zen model family documentation:
- Repository: https://github.com/zenlm/papers
- Focus: Ultra-efficient LLMs, edge deployment, model architectures
Detailed technical specifications and protocols are documented as ZIPs:
Repository: https://github.com/zooai/zips
- ZIP-000 - ZIP Process and Guidelines
- ZIP-001 - Decentralized Semantic Optimization
- ZIP-002 - Proof of AI Consensus
- ZIP-003 - Genesis Paper (October 2021)
- zip-gym-tokenomics.md - GYM Token Economics
Papers use LaTeX for academic formatting.
Build from source:
cd zoo-papers
pdflatex zoo-dso.tex
bibtex zoo-dso
pdflatex zoo-dso.tex
pdflatex zoo-dso.tex

Requirements:
- LaTeX distribution (TeX Live, MiKTeX, or MacTeX)
- BibTeX for references
Related build instructions:
- Hanzo AI Papers: https://github.com/hanzoai/papers#building-papers
- ZenLM Papers: https://github.com/zenlm/papers#building-papers
Papers are published under Creative Commons Attribution 4.0 International (CC BY 4.0).
Code examples and implementations follow their respective project licenses.
- Email: foundation@zoo.ai
- Website: https://zoo.ngo
- AI Chat: https://ai.zoo.ngo
- Training: https://gym.zoo.ngo
- Blockchain: https://zoo.network
- Models: https://zenlm.ai
- GitHub: https://github.com/zooai
- Non-Profit Status: 501(c)(3)
Zoo Labs Foundation Supporting open AI research and accessibility https://zoo.ngo