diff --git a/.github/docs-gen-prompts.md b/.github/docs-gen-prompts.md new file mode 100644 index 00000000..457e4b64 --- /dev/null +++ b/.github/docs-gen-prompts.md @@ -0,0 +1,404 @@ +# AI Documentation Enhancement Prompts + +--- + +## System Prompt + +You are the Compose Solidity documentation orchestrator. Produce state-of-the-art, accurate, and implementation-ready documentation for Compose diamond modules and facets. Always respond with valid JSON only (no markdown). Follow all appended guideline sections from `copilot-instructions.md`, Compose conventions, and the templates below. + +- Audience: Solidity engineers building on Compose diamonds. Prioritize clarity, precision, and developer actionability. +- Grounding: Use only the provided contract data. Do not invent functions, storage layouts, events, errors, modules, or behaviors. Keep terminology aligned with Compose (diamond proxy, facets, modules, storage pattern). +- Tone and style: Active voice, concise sentences, zero fluff/marketing. Prefer imperative guidance over vague descriptions. +- Code examples: Minimal but runnable Solidity, consistent pragma (use the repository standard if given; otherwise `pragma solidity ^0.8.30;`). Import and call the actual functions exactly as named. Match visibility, mutability, access control, and storage semantics implied by the contract description. +- Output contract details only through the specified JSON fields. Do not add extra keys or reorder fields. Escape newlines as `\\n` inside JSON strings. + +### Quality Guardrails (must stay in the system prompt) + +- Hallucinations: no invented APIs, behaviors, dependencies, or storage details beyond the supplied context. +- Vagueness and filler: avoid generic statements like "this is very useful"; be specific to the module/facet and diamond pattern. +- Repetition and redundancy: do not restate inputs verbatim or repeat the same idea in multiple sections. +- Passive, wordy, or hedging language: prefer direct, active phrasing without needless qualifiers. +- Inaccurate code: wrong function names/params/visibility, missing imports, or examples that can't compile. +- Inconsistency: maintain a steady tense, voice, and terminology; keep examples consistent with the described functions. +- Overclaiming: no security, performance, or compatibility claims that are not explicitly supported by the context. + +### Writing Style Guidelines + +**Voice and Tense:** +- Use present tense for descriptions: "This function returns..." not "This function will return..." +- Use imperative mood for instructions: "Call this function to..." not "This function can be called to..." +- Use active voice: "The module manages..." not "Access control is managed by the module..." + +**Specificity Requirements:** +- Every claim must be backed by concrete examples or references to the provided contract data +- Avoid abstract benefits; describe concrete functionality +- When describing behavior, reference specific functions, events, or errors from the contract + +**Terminology Consistency:** +- Use "facet" (not "contract") when referring to facets +- Use "module" (not "library") when referring to modules +- Use "diamond" or "diamond storage pattern" (prefer over "diamond proxy") +- Maintain consistent terminology throughout all sections + +### Writing Examples (DO vs DON'T) + +**DON'T use generic marketing language:** +- "This module provides powerful functionality for managing access control." +- "This is a very useful tool for diamond contracts." 
+- "The facet seamlessly integrates with the diamond pattern." +- "This is a robust solution for token management." + +**DO use specific, concrete language:** +- "This module exposes internal functions for role-based access control using diamond storage." +- "Call this function to grant a role when initializing a new diamond." +- ✅ "This facet implements ERC-20 token transfers within a diamond proxy." +- ✅ "This module manages token balances using the diamond storage pattern." + +**DON'T use hedging or uncertainty:** +- ❌ "This function may return the balance." +- ❌ "The module might be useful for access control." +- ❌ "This could potentially improve performance." + +**DO use direct, confident statements:** +- ✅ "This function returns the balance." +- ✅ "Use this module for role-based access control." +- ✅ "This pattern reduces storage collisions." + +**DON'T repeat information across sections:** +- ❌ Overview: "This module manages access control." +- ❌ Key Features: "Manages access control" (repeats overview) + +**DO provide unique information in each section:** +- ✅ Overview: "This module manages role-based access control using diamond storage." +- ✅ Key Features: "Internal functions only, compatible with ERC-2535, no external dependencies." + +**DON'T use passive voice or wordy constructions:** +- ❌ "It is recommended that developers call this function..." +- ❌ "This function can be used in order to..." + +**DO use direct, active phrasing:** +- ✅ "Call this function to grant roles." +- ✅ "Use this function to check permissions." + +**DON'T invent or infer behavior:** +- ❌ "This function automatically handles edge cases." +- ❌ "The module ensures thread safety." + +**DO state only what's in the contract data:** +- ✅ "This function reverts if the caller lacks the required role." +- ✅ "See the source code for implementation details." + +**DON'T use vague qualifiers:** +- ❌ "very useful", "extremely powerful", "highly efficient", "incredibly robust" +- ❌ "seamlessly", "easily", "effortlessly" + +**DO describe concrete capabilities:** +- ✅ "Provides role-based access control" +- ✅ "Reduces storage collisions" +- ✅ "Enables upgradeable facets" + +--- + +## Relevant Guideline Sections + +These section headers from `copilot-instructions.md` are appended to the system prompt to enforce Compose-wide standards. One section per line; must match exactly. + +``` +## 3. Core Philosophy +## 4. Facet Design Principles +## 5. Banned Solidity Features +## 6. Composability Guidelines +## 11. Code Style Guide +``` + +--- + +## Module Prompt Template + +Given this module documentation from the Compose diamond proxy framework, enhance it by generating developer-grade content that is specific, actionable, and faithful to the provided contract data. + +**CRITICAL: Use the EXACT function signatures, import paths, and storage information provided below. Do not invent or modify function names, parameter types, or import paths.** + +### Field Requirements: + +1. **description**: + - A concise one-line description (max 100 chars) for the page subtitle + - Derive from the module's purpose based on its functions and NatSpec + - Do NOT include "module" or "for Compose diamonds" - just describe what it does + - Example: "Role-based access control using diamond storage" (not "Module for managing access control in Compose diamonds") + - Use present tense, active voice + +2. 
**overview**: + - 2-3 sentences explaining what the module does and why it matters for diamonds + - Focus on: storage reuse, composition benefits, safety guarantees + - Be specific: mention actual functions or patterns, not abstract benefits + - Example: "This module exposes internal functions for role-based access control. Facets import this module to check and modify roles using shared diamond storage. Changes made through this module are immediately visible to all facets using the same storage pattern." + +3. **usageExample**: + - 10-20 lines of Solidity demonstrating how a facet would import and call this module + - MUST use the EXACT import path: `{{importPath}}` + - MUST use EXACT function signatures from the Function Signatures section below + - MUST include pragma: `{{pragmaVersion}}` + - Show a minimal but compilable example + - Include actual function calls with realistic parameters + - Example structure: + ```solidity + pragma solidity {{pragmaVersion}}; + import {{importPath}}; + + contract MyFacet { + function example() external { + // Actual function call using exact signature + } + } + ``` + +4. **bestPractices**: + - 2-3 bullet points focused on safe and idiomatic use + - Cover: access control, storage hygiene, upgrade awareness, error handling + - Be specific to this module's functions and patterns + - Use imperative mood: "Ensure...", "Call...", "Verify..." + - Example: "- Ensure access control is enforced before calling internal functions\n- Verify storage layout compatibility when upgrading\n- Handle errors returned by validation functions" + +5. **integrationNotes**: + - Explain how the module interacts with diamond storage + - Describe how changes are visible to facets + - Note any invariants or ordering requirements + - Reference the storage information provided below + - Be specific about storage patterns and visibility + - Example: "This module uses diamond storage at position X. All functions are internal and access the shared storage struct. Changes to storage made through this module are immediately visible to any facet that accesses the same storage position." + +6. **keyFeatures**: + - 2-4 bullets highlighting unique capabilities, constraints, or guarantees + - Focus on what makes this module distinct + - Mention technical specifics: visibility, storage pattern, dependencies + - Example: "- All functions are `internal` for use in custom facets\n- Uses diamond storage pattern (EIP-8042)\n- No external dependencies or `using` directives\n- Compatible with ERC-2535 diamonds" + +Contract Information: +- Name: {{title}} +- Current Description: {{description}} +- Import Path: {{importPath}} +- Pragma Version: {{pragmaVersion}} +- Functions: {{functionNames}} +- Function Signatures: +{{functionSignatures}} +- Events: {{eventNames}} +- Event Signatures: +{{eventSignatures}} +- Errors: {{errorNames}} +- Error Signatures: +{{errorSignatures}} +- Function Details: +{{functionDescriptions}} +- Storage Information: +{{storageContext}} +- Related Contracts: +{{relatedContracts}} +- Struct Definitions: +{{structDefinitions}} + +### Response Format Requirements: + +**CRITICAL: Respond ONLY with valid JSON. 
No markdown code blocks, no explanatory text, no comments.** + +- All newlines in strings must be escaped as `\\n` +- All double quotes in strings must be escaped as `\\"` +- All backslashes must be escaped as `\\\\` +- Do not include markdown formatting (no ```json blocks) +- Do not include any text before or after the JSON object +- Ensure all required fields are present +- Ensure JSON is valid and parseable + +**Required JSON format:** +```json +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "pragma solidity ^0.8.30;\\nimport @compose/path/Module;\\n\\ncontract Example {\\n // code here\\n}", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + "integrationNotes": "integration notes here" +} +``` + +### Common Pitfalls to Avoid: + +1. **Including markdown formatting**: Do NOT wrap JSON in ```json code blocks +2. **Adding explanatory text**: Do NOT include text like "Here is the JSON:" before the response +3. **Invalid escape sequences**: Use `\\n` for newlines, not `\n` or actual newlines +4. **Missing fields**: Ensure all required fields are present (description, overview, usageExample, bestPractices, keyFeatures, integrationNotes) +5. **Incorrect code examples**: Verify function names, import paths, and pragma match exactly what was provided +6. **Generic language**: Avoid words like "powerful", "robust", "seamlessly", "very useful" +7. **Hedging language**: Avoid "may", "might", "could", "possibly" - use direct statements +8. **Repeating information**: Each section should provide unique information + +--- + +## Facet Prompt Template + +Given this facet documentation from the Compose diamond proxy framework, enhance it by generating precise, implementation-ready guidance. + +**CRITICAL: Use the EXACT function signatures, import paths, and storage information provided below. Do not invent or modify function names, parameter types, or import paths.** + +### Field Requirements: + +1. **description**: + - A concise one-line description (max 100 chars) for the page subtitle + - Derive from the facet's purpose based on its functions and NatSpec + - Do NOT include "facet" or "for Compose diamonds" - just describe what it does + - Example: "ERC-20 token transfers within a diamond" (not "Facet for ERC-20 token functionality in Compose diamonds") + - Use present tense, active voice + +2. **overview**: + - 2-3 sentence summary of the facet's purpose and value inside a diamond + - Focus on: routing, orchestration, surface area, integration + - Be specific about what functions it exposes and how they fit into a diamond + - Example: "This facet implements ERC-20 token transfers as external functions in a diamond. It routes calls through the diamond proxy and accesses shared storage. Developers add this facet to expose token functionality while maintaining upgradeability." + +3. 
**usageExample**: + - 10-20 lines showing how this facet is deployed or invoked within a diamond + - MUST use the EXACT import path: `{{importPath}}` + - MUST use EXACT function signatures from the Function Signatures section below + - MUST include pragma: `{{pragmaVersion}}` + - Show how the facet is used in a diamond context + - Include actual function calls with realistic parameters + - Example structure: + ```solidity + pragma solidity {{pragmaVersion}}; + import {{importPath}}; + + // Example: Using the facet in a diamond + // The facet functions are called through the diamond proxy + IDiamond diamond = IDiamond(diamondAddress); + diamond.transfer(recipient, amount); // Actual function from facet + ``` + +4. **bestPractices**: + - 2-3 bullets on correct integration patterns + - Cover: initialization, access control, storage handling, upgrade safety + - Be specific to this facet's functions and patterns + - Use imperative mood: "Initialize...", "Enforce...", "Verify..." + - Example: "- Initialize state variables during diamond setup\n- Enforce access control on all state-changing functions\n- Verify storage compatibility before upgrading" + +5. **securityConsiderations**: + - Concise notes on access control, reentrancy, input validation, and state-coupling risks + - Be specific to this facet's functions + - Reference actual functions, modifiers, or patterns from the contract + - If no specific security concerns are evident, state "Follow standard Solidity security practices" + - Example: "All state-changing functions are protected by access control. The transfer function uses checks-effects-interactions pattern. Validate input parameters before processing." + +6. **keyFeatures**: + - 2-4 bullets calling out unique abilities, constraints, or guarantees + - Focus on what makes this facet distinct + - Mention technical specifics: function visibility, storage access, dependencies + - Example: "- Exposes external functions for diamond routing\n- Self-contained with no imports or inheritance\n- Follows Compose readability-first conventions\n- Compatible with ERC-2535 diamond standard" + +Contract Information: +- Name: {{title}} +- Current Description: {{description}} +- Import Path: {{importPath}} +- Pragma Version: {{pragmaVersion}} +- Functions: {{functionNames}} +- Function Signatures: +{{functionSignatures}} +- Events: {{eventNames}} +- Event Signatures: +{{eventSignatures}} +- Errors: {{errorNames}} +- Error Signatures: +{{errorSignatures}} +- Function Details: +{{functionDescriptions}} +- Storage Information: +{{storageContext}} +- Related Contracts: +{{relatedContracts}} +- Struct Definitions: +{{structDefinitions}} + +### Response Format Requirements: + +**CRITICAL: Respond ONLY with valid JSON. 
No markdown code blocks, no explanatory text, no comments.** + +- All newlines in strings must be escaped as `\\n` +- All double quotes in strings must be escaped as `\\"` +- All backslashes must be escaped as `\\\\` +- Do not include markdown formatting (no ```json blocks) +- Do not include any text before or after the JSON object +- Ensure all required fields are present +- Ensure JSON is valid and parseable + +**Required JSON format:** +```json +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "pragma solidity ^0.8.30;\\nimport @compose/path/Facet;\\n\\n// Example usage\\nIDiamond(diamond).functionName();", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + "securityConsiderations": "security notes here" +} +``` + +### Common Pitfalls to Avoid: + +1. **Including markdown formatting**: Do NOT wrap JSON in ```json code blocks +2. **Adding explanatory text**: Do NOT include text like "Here is the JSON:" before the response +3. **Invalid escape sequences**: Use `\\n` for newlines, not `\n` or actual newlines +4. **Missing fields**: Ensure all required fields are present (description, overview, usageExample, bestPractices, keyFeatures, securityConsiderations) +5. **Incorrect code examples**: Verify function names, import paths, and pragma match exactly what was provided +6. **Generic language**: Avoid words like "powerful", "robust", "seamlessly", "very useful" +7. **Hedging language**: Avoid "may", "might", "could", "possibly" - use direct statements +8. **Repeating information**: Each section should provide unique information + +--- + +## Module Fallback Content + +Used when AI enhancement is unavailable for modules. + +### integrationNotes + +This module accesses shared diamond storage, so changes made through this module are immediately visible to facets using the same storage pattern. All functions are internal as per Compose conventions. + +### keyFeatures + +- All functions are `internal` for use in custom facets +- Follows diamond storage pattern (EIP-8042) +- Compatible with ERC-2535 diamonds +- No external dependencies or `using` directives + +--- + +## Facet Fallback Content + +Used when AI enhancement is unavailable for facets. + +### keyFeatures + +- Self-contained facet with no imports or inheritance +- Only `external` and `internal` function visibility +- Follows Compose readability-first conventions +- Ready for diamond integration + +--- + +## Validation Checklist + +Before finalizing your response, verify: + +- [ ] All function names in code examples match the Function Signatures section exactly +- [ ] Import path matches `{{importPath}}` exactly +- [ ] Pragma version matches `{{pragmaVersion}}` exactly +- [ ] No generic marketing language ("powerful", "robust", "seamlessly", etc.) 
+- [ ] No hedging language ("may", "might", "could", "possibly")
+- [ ] Each section provides unique information (no repetition)
+- [ ] All required JSON fields are present
+- [ ] All newlines are escaped as `\\n`
+- [ ] JSON is valid and parseable
+- [ ] No markdown formatting around JSON
+- [ ] Code examples are minimal but compilable
+- [ ] Terminology is consistent (facet vs contract, module vs library, diamond vs proxy)
+- [ ] Present tense used for descriptions
+- [ ] Imperative mood used for instructions
+- [ ] Active voice throughout
diff --git a/.github/scripts/ai-provider/README.md b/.github/scripts/ai-provider/README.md
new file mode 100644
index 00000000..58ad2218
--- /dev/null
+++ b/.github/scripts/ai-provider/README.md
@@ -0,0 +1,179 @@
+# AI Provider Service
+
+Simple, configurable AI service for CI workflows supporting multiple providers.
+
+## Features
+
+- **Simple API**: One function to call any AI model
+- **Multiple Providers**: GitHub Models (GPT-4o) and Google Gemini
+- **Auto-detection**: Automatically uses available provider
+- **Rate Limiting**: Built-in request and token-based rate limiting
+- **Configurable**: Override provider and model via environment variables
+
+## Supported Providers
+
+| Provider | Models | Rate Limits | API Key |
+|----------|--------|-------------|---------|
+| **GitHub Models** | gpt-4o (default), gpt-4o-mini | 10 req/min, 40k tokens/min | `GITHUB_TOKEN` |
+| **Google Gemini** | gemini-2.5-flash-lite (default), gemini-1.5-flash, gemini-1.5-pro | 15 req/min, 1M tokens/min | `GOOGLE_AI_API_KEY` |
+
+## Usage
+
+### Basic Usage
+
+```javascript
+const ai = require('./ai-provider');
+
+const response = await ai.call(
+  'You are a helpful assistant',  // system prompt
+  'Explain quantum computing'     // user prompt
+);
+
+console.log(response);
+```
+
+### With Options
+
+```javascript
+const response = await ai.call(
+  systemPrompt,
+  userPrompt,
+  {
+    maxTokens: 1000,
+    onSuccess: (text, tokens) => {
+      console.log(`Success! Used ${tokens} tokens`);
+    },
+    onError: (error) => {
+      console.error('Failed:', error);
+    }
+  }
+);
+```
+
+## Environment Variables
+
+### Provider Selection
+
+```bash
+# Auto-detect (default) - tries Gemini first, then falls back to GitHub Models
+AI_PROVIDER=auto
+
+# Use specific provider
+AI_PROVIDER=github   # Use GitHub Models
+AI_PROVIDER=gemini   # Use Google Gemini
+```
+
+### Model Override
+
+```bash
+# Override default model for the provider
+AI_MODEL=gpt-4o           # For GitHub Models
+AI_MODEL=gemini-1.5-pro   # For Gemini
+```
+
+### API Keys
+
+```bash
+# Google Gemini
+GOOGLE_AI_API_KEY=
+```
+
+## Examples
+
+### GitHub Actions Integration
+
+```yaml
+- name: Run AI-powered task
+  env:
+    # Option 1: Auto-detect (recommended)
+    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+    GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }}
+
+    # Option 2: Force specific provider
+    # AI_PROVIDER: 'gemini'
+    # AI_MODEL: 'gemini-1.5-pro'
+  run: node .github/scripts/your-script.js
+```
+
+## Architecture
+
+```
+ai-provider/
+├── index.js               # Main service (singleton)
+├── provider-factory.js    # Provider creation logic
+├── rate-limiter.js        # Rate limiting logic
+└── providers/
+    ├── base-provider.js   # Base provider class
+    ├── github-models.js   # GitHub Models implementation
+    └── gemini.js          # Gemini implementation
+```
+
+## Adding a New Provider
+
+1. Create a new provider class in `providers/`:
+
+```javascript
+const BaseAIProvider = require('./base-provider');
+
+class MyProvider extends BaseAIProvider {
+  constructor(config, apiKey) {
+    super('My Provider', config, apiKey);
+  }
+
+  buildRequestOptions() {
+    // Return HTTP request options
+  }
+
+  buildRequestBody(systemPrompt, userPrompt, maxTokens) {
+    // Return JSON.stringify(...) of request body
+  }
+
+  extractContent(response) {
+    // Return { content: string, tokens: number|null }
+  }
+}
+
+module.exports = MyProvider;
+```
+
+2. Register it in `provider-factory.js`:
+
+```javascript
+const MyProvider = require('./providers/my-provider');
+
+function createMyProvider(customModel) {
+  const apiKey = process.env.MY_PROVIDER_API_KEY;
+  if (!apiKey) return null;
+
+  return new MyProvider({ model: customModel || 'default-model' }, apiKey);
+}
+```
+
+3. Add the new provider to `autoDetectProvider()` or to the `switch` statement in `getProvider()`.
+
+## Rate Limiting
+
+The service automatically handles rate limiting:
+
+- **Request-based**: Ensures minimum delay between requests
+- **Token-based**: Tracks token consumption in a 60-second rolling window
+- **Smart waiting**: Calculates exact wait time needed
+
+Rate limits are provider-specific and configured automatically.
+
+## Error Handling
+
+```javascript
+try {
+  const response = await ai.call(systemPrompt, userPrompt);
+  // Use response
+} catch (error) {
+  if (error.message.includes('429')) {
+    console.log('Rate limited - try again later');
+  } else if (error.message.includes('401')) {
+    console.log('Invalid API key');
+  } else {
+    console.log('Other error:', error.message);
+  }
+}
+```
diff --git a/.github/scripts/ai-provider/index.js b/.github/scripts/ai-provider/index.js
new file mode 100644
index 00000000..26e57611
--- /dev/null
+++ b/.github/scripts/ai-provider/index.js
@@ -0,0 +1,132 @@
+/**
+ * AI Provider Service
+ * Simple, configurable AI service supporting multiple providers
+ *
+ * Usage:
+ *   const ai = require('./ai-provider');
+ *   const response = await ai.call(systemPrompt, userPrompt);
+ *
+ * Environment Variables:
+ *   AI_PROVIDER - 'github' | 'gemini' | 'auto' (default: auto)
+ *   AI_MODEL - Override default model
+ *   GITHUB_TOKEN - For GitHub Models
+ *   GOOGLE_AI_API_KEY - For Gemini
+ */
+
+const { getProvider } = require('./provider-factory');
+const RateLimiter = require('./rate-limiter');
+
+class AIProvider {
+  constructor() {
+    this.provider = null;
+    this.rateLimiter = new RateLimiter();
+    this.initialized = false;
+  }
+
+  /**
+   * Initialize the provider (lazy loading)
+   */
+  _init() {
+    if (this.initialized) {
+      return;
+    }
+
+    this.provider = getProvider();
+    if (!this.provider) {
+      throw new Error(
+        'No AI provider available. Set AI_PROVIDER or corresponding API key.'
+ ); + } + + this.rateLimiter.setProvider(this.provider); + this.initialized = true; + } + + /** + * Make an AI call + * + * @param {string} systemPrompt - System prompt + * @param {string} userPrompt - User prompt + * @param {object} options - Optional settings + * @param {number} options.maxTokens - Override max tokens + * @param {function} options.onSuccess - Success callback + * @param {function} options.onError - Error callback + * @returns {Promise} Response text + */ + async call(systemPrompt, userPrompt, options = {}) { + this._init(); + + const { + maxTokens = null, + onSuccess = null, + onError = null, + } = options; + + if (!systemPrompt || !userPrompt) { + throw new Error('systemPrompt and userPrompt are required'); + } + + try { + // Estimate tokens and wait for rate limits + const tokensToUse = maxTokens || this.provider.getMaxTokens(); + const estimatedTokens = this.rateLimiter.estimateTokenUsage( + systemPrompt, + userPrompt, + tokensToUse + ); + + await this.rateLimiter.waitForRateLimit(estimatedTokens); + + // Build and send request + const requestBody = this.provider.buildRequestBody(systemPrompt, userPrompt, tokensToUse); + const requestOptions = this.provider.buildRequestOptions(); + + const response = await this._makeRequest(requestOptions, requestBody); + + // Extract content + const extracted = this.provider.extractContent(response); + if (!extracted) { + throw new Error('Invalid response format from API'); + } + + // Record actual token usage + const actualTokens = extracted.tokens || estimatedTokens; + this.rateLimiter.recordTokenConsumption(actualTokens); + + if (onSuccess) { + onSuccess(extracted.content, actualTokens); + } + + return extracted.content; + + } catch (error) { + if (onError) { + onError(error); + } + + throw error; + } + } + + /** + * Make HTTPS request + */ + async _makeRequest(options, body) { + const { makeHttpsRequest } = require('../workflow-utils'); + return await makeHttpsRequest(options, body); + } + + /** + * Get provider info + */ + getProviderInfo() { + this._init(); + return { + name: this.provider.name, + limits: this.provider.getRateLimits(), + maxTokens: this.provider.getMaxTokens(), + }; + } +} + +module.exports = new AIProvider(); \ No newline at end of file diff --git a/.github/scripts/ai-provider/provider-factory.js b/.github/scripts/ai-provider/provider-factory.js new file mode 100644 index 00000000..12c47497 --- /dev/null +++ b/.github/scripts/ai-provider/provider-factory.js @@ -0,0 +1,65 @@ +/** + * Provider Factory + * Creates the appropriate AI provider based on environment variables + */ + +const { createGitHubProvider } = require('./providers/github-models'); +const { createGeminiProvider } = require('./providers/gemini'); + +/** + * Get the active AI provider based on environment configuration + * + * Environment variables: + * - AI_PROVIDER: 'github' | 'gemini' | 'auto' (default: 'auto') + * - AI_MODEL: Override default model for the provider + * - GITHUB_TOKEN: API key for GitHub Models + * - GOOGLE_AI_API_KEY: API key for Gemini + * + * @returns {BaseAIProvider|null} Provider instance or null if none available + */ +function getProvider() { + const providerName = (process.env.AI_PROVIDER || 'auto').toLowerCase(); + const customModel = process.env.AI_MODEL; + + if (providerName === 'auto') { + return autoDetectProvider(customModel); + } + + switch (providerName) { + case 'github': + case 'github-models': + return createGitHubProvider(customModel); + + case 'gemini': + case 'google': + return 
createGeminiProvider(customModel); + + default: + console.warn(`⚠️ Unknown provider: ${providerName}. Falling back to auto-detect.`); + return autoDetectProvider(customModel); + } +} + +/** + * Auto-detect provider based on available API keys + */ +function autoDetectProvider(customModel) { + // Try Gemini + const geminiProvider = createGeminiProvider(customModel); + if (geminiProvider) { + return geminiProvider; + } + + // Fallback to GitHub Models (free in GitHub Actions) + const githubProvider = createGitHubProvider(customModel); + if (githubProvider) { + return githubProvider; + } + + return null; +} + +module.exports = { + getProvider, +}; + diff --git a/.github/scripts/ai-provider/providers/base-provider.js b/.github/scripts/ai-provider/providers/base-provider.js new file mode 100644 index 00000000..fe23fb1c --- /dev/null +++ b/.github/scripts/ai-provider/providers/base-provider.js @@ -0,0 +1,68 @@ +/** + * Base AI Provider class + * All provider implementations should extend this class + */ +class BaseAIProvider { + constructor(name, config, apiKey) { + if (!apiKey) { + throw new Error('API key is required'); + } + + this.name = name; + this.config = config; + this.apiKey = apiKey; + } + + /** + * Get maximum output tokens for this provider + */ + getMaxTokens() { + return this.config.maxTokens || 2500; + } + + /** + * Get rate limits for this provider + */ + getRateLimits() { + return { + maxRequestsPerMinute: this.config.maxRequestsPerMinute || 10, + maxTokensPerMinute: this.config.maxTokensPerMinute || 40000, + }; + } + + /** + * Build HTTP request options + * Must be implemented by subclass + */ + buildRequestOptions() { + throw new Error('buildRequestOptions must be implemented by subclass'); + } + + /** + * Build request body with prompts + * Must be implemented by subclass + */ + buildRequestBody(systemPrompt, userPrompt, maxTokens) { + throw new Error('buildRequestBody must be implemented by subclass'); + } + + /** + * Extract content and token usage from API response + * Must be implemented by subclass + * @returns {{content: string, tokens: number|null}|null} + */ + extractContent(response) { + throw new Error('extractContent must be implemented by subclass'); + } + + /** + * Check if error is a rate limit error + */ + isRateLimitError(error) { + const msg = error?.message || ''; + return msg.includes('429') || msg.toLowerCase().includes('rate limit'); + } +} + +module.exports = BaseAIProvider; + diff --git a/.github/scripts/ai-provider/providers/gemini.js b/.github/scripts/ai-provider/providers/gemini.js new file mode 100644 index 00000000..9af4f159 --- /dev/null +++ b/.github/scripts/ai-provider/providers/gemini.js @@ -0,0 +1,107 @@ +/** + * Google AI (Gemini) Provider + * Uses Google AI API key for authentication + */ +const BaseAIProvider = require('./base-provider'); + +/** + * Gemini Provider Class + * Default model: gemini-2.5-flash-lite + * This model is a lightweight model that is designed to be fast and efficient. + * Refer to https://ai.google.dev/gemini-api/docs for the list of models. 
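+ *
+ * Illustrative usage sketch (assumes GOOGLE_AI_API_KEY is set; require paths
+ * shown relative to the ai-provider directory):
+ *   const { createGeminiProvider } = require('./providers/gemini');
+ *   const provider = createGeminiProvider('gemini-2.5-flash-lite');
+ *   const options = provider.buildRequestOptions();
+ *   const body = provider.buildRequestBody('system prompt', 'user prompt', 1024);
+ *   // POST `body` with any HTTPS client, parse the JSON response, then call
+ *   // provider.extractContent(parsedResponse) to get { content, tokens }.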
+ */ +class GeminiProvider extends BaseAIProvider { + /** + * Constructor + * @param {object} config - Configuration object + * @param {string} config.model - Model to use + * @param {number} config.maxTokens - Maximum number of tokens to generate + * @param {number} config.maxRequestsPerMinute - Maximum number of requests per minute + * @param {number} config.maxTokensPerMinute - Maximum number of tokens per minute + * @param {string} apiKey - Google AI API key (required) + */ + constructor(config, apiKey) { + const model = config.model || 'gemini-2.5-flash-lite'; + super(`Google AI (${model})`, config, apiKey); + this.model = model; + } + + buildRequestOptions() { + return { + hostname: 'generativelanguage.googleapis.com', + port: 443, + path: `/v1beta/models/${this.model}:generateContent?key=${this.apiKey}`, + method: 'POST', + headers: { + 'Content-Type': 'application/json', + 'User-Agent': 'Compose-CI/1.0', + }, + }; + } + + buildRequestBody(systemPrompt, userPrompt, maxTokens) { + // Gemini combines system and user prompts + const combinedPrompt = `${systemPrompt}\n\n${userPrompt}`; + + return JSON.stringify({ + contents: [{ + parts: [{ text: combinedPrompt }] + }], + generationConfig: { + maxOutputTokens: maxTokens || this.getMaxTokens(), + temperature: 0.7, + topP: 0.95, + topK: 40, + }, + safetySettings: [ + { category: "HARM_CATEGORY_HARASSMENT", threshold: "BLOCK_NONE" }, + { category: "HARM_CATEGORY_HATE_SPEECH", threshold: "BLOCK_NONE" }, + { category: "HARM_CATEGORY_SEXUALLY_EXPLICIT", threshold: "BLOCK_NONE" }, + { category: "HARM_CATEGORY_DANGEROUS_CONTENT", threshold: "BLOCK_NONE" } + ] + }); + } + + extractContent(response) { + const text = response.candidates?.[0]?.content?.parts?.[0]?.text; + if (text) { + return { + content: text, + tokens: response.usageMetadata?.totalTokenCount || null, + }; + } + return null; + } + + getRateLimits() { + return { + maxRequestsPerMinute: 15, + maxTokensPerMinute: 1000000, // 1M tokens per minute + }; + } + + +} + +/** + * Create Gemini provider + */ +function createGeminiProvider(customModel) { + const apiKey = process.env.GOOGLE_AI_API_KEY; + + + const config = { + model: customModel, + maxTokens: 2500, + maxRequestsPerMinute: 15, + maxTokensPerMinute: 1000000, + }; + + return new GeminiProvider(config, apiKey); +} + +module.exports = { + GeminiProvider, + createGeminiProvider, +}; + diff --git a/.github/scripts/ai-provider/providers/github-models.js b/.github/scripts/ai-provider/providers/github-models.js new file mode 100644 index 00000000..6d9426cc --- /dev/null +++ b/.github/scripts/ai-provider/providers/github-models.js @@ -0,0 +1,79 @@ +/** + * GitHub Models (Azure OpenAI) Provider + * Uses GitHub token for authentication in GitHub Actions + */ +const BaseAIProvider = require('./base-provider'); + +class GitHubModelsProvider extends BaseAIProvider { + constructor(config, apiKey) { + const model = config.model || 'gpt-4o'; + super(`GitHub Models (${model})`, config, apiKey); + this.model = model; + } + + buildRequestOptions() { + return { + hostname: 'models.inference.ai.azure.com', + port: 443, + path: '/chat/completions', + method: 'POST', + headers: { + 'Authorization': `Bearer ${this.apiKey}`, + 'Content-Type': 'application/json', + 'Accept': 'application/json', + 'User-Agent': 'Compose-CI/1.0', + }, + }; + } + + buildRequestBody(systemPrompt, userPrompt, maxTokens) { + return JSON.stringify({ + messages: [ + { role: 'system', content: systemPrompt }, + { role: 'user', content: userPrompt }, + ], + model: this.model, + 
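+      // max_tokens caps generated output tokens (falls back to the provider's getMaxTokens())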
max_tokens: maxTokens || this.getMaxTokens(), + temperature: 0.7, + }); + } + + extractContent(response) { + if (response.choices?.[0]?.message?.content) { + return { + content: response.choices[0].message.content, + tokens: response.usage?.total_tokens || null, + }; + } + return null; + } + + getRateLimits() { + return { + maxRequestsPerMinute: 10, + maxTokensPerMinute: 40000, + }; + } +} + +/** + * Create GitHub Models provider + */ +function createGitHubProvider(customModel) { + const apiKey = process.env.GITHUB_TOKEN; + + const config = { + model: customModel, + maxTokens: 2500, + maxRequestsPerMinute: 10, + maxTokensPerMinute: 40000, + }; + + return new GitHubModelsProvider(config, apiKey); +} + +module.exports = { + GitHubModelsProvider, + createGitHubProvider, +}; + diff --git a/.github/scripts/ai-provider/rate-limiter.js b/.github/scripts/ai-provider/rate-limiter.js new file mode 100644 index 00000000..aa59b28f --- /dev/null +++ b/.github/scripts/ai-provider/rate-limiter.js @@ -0,0 +1,137 @@ +/** + * Rate Limiter + * Handles request-based and token-based rate limiting + */ + +class RateLimiter { + constructor() { + this.provider = null; + this.lastCallTime = 0; + this.tokenHistory = []; + this.limits = { + maxRequestsPerMinute: 10, + maxTokensPerMinute: 40000, + }; + this.tokenWindowMs = 60000; // 60 seconds + this.safetyMargin = 0.85; // Use 85% of token budget + } + + /** + * Set the active provider and update rate limits + */ + setProvider(provider) { + this.provider = provider; + this.limits = provider.getRateLimits(); + } + + /** + * Estimate token usage for a request + * Uses rough heuristic: ~4 characters per token + */ + estimateTokenUsage(systemPrompt, userPrompt, maxTokens) { + const inputText = (systemPrompt || '') + (userPrompt || ''); + const estimatedInputTokens = Math.ceil(inputText.length / 4); + return estimatedInputTokens + (maxTokens || 0); + } + + /** + * Wait for rate limits before making a request + */ + async waitForRateLimit(estimatedTokens) { + const now = Date.now(); + + // 1. Request-based rate limit (requests per minute) + const minDelayMs = Math.ceil(60000 / this.limits.maxRequestsPerMinute); + const elapsed = now - this.lastCallTime; + + if (this.lastCallTime > 0 && elapsed < minDelayMs) { + const waitTime = minDelayMs - elapsed; + await this._sleep(waitTime); + } + + // 2. 
Token-based rate limit + this._cleanTokenHistory(); + const currentConsumption = this._getCurrentTokenConsumption(); + const effectiveBudget = this.limits.maxTokensPerMinute * this.safetyMargin; + const availableTokens = effectiveBudget - currentConsumption; + + if (estimatedTokens > availableTokens) { + const waitTime = this._calculateTokenWaitTime(estimatedTokens, currentConsumption); + if (waitTime > 0) { + await this._sleep(waitTime); + this._cleanTokenHistory(); + } + } + + this.lastCallTime = Date.now(); + } + + /** + * Record actual token consumption after a request + */ + recordTokenConsumption(tokens) { + this.tokenHistory.push({ + timestamp: Date.now(), + tokens: tokens, + }); + this._cleanTokenHistory(); + } + + /** + * Clean expired entries from token history + */ + _cleanTokenHistory() { + const now = Date.now(); + this.tokenHistory = this.tokenHistory.filter( + entry => (now - entry.timestamp) < this.tokenWindowMs + ); + } + + /** + * Get current token consumption in the rolling window + */ + _getCurrentTokenConsumption() { + return this.tokenHistory.reduce((sum, entry) => sum + entry.tokens, 0); + } + + /** + * Calculate how long to wait for token budget to free up + */ + _calculateTokenWaitTime(tokensNeeded, currentConsumption) { + const effectiveBudget = this.limits.maxTokensPerMinute * this.safetyMargin; + const availableTokens = effectiveBudget - currentConsumption; + + if (tokensNeeded <= availableTokens) { + return 0; + } + + if (this.tokenHistory.length === 0) { + return 0; + } + + // Find how many tokens need to expire + const tokensToFree = tokensNeeded - availableTokens; + let freedTokens = 0; + let oldestTimestamp = Date.now(); + + for (const entry of this.tokenHistory) { + freedTokens += entry.tokens; + oldestTimestamp = entry.timestamp; + + if (freedTokens >= tokensToFree) { + break; + } + } + + // Calculate wait time until that entry expires + const timeUntilExpiry = this.tokenWindowMs - (Date.now() - oldestTimestamp); + return Math.max(0, timeUntilExpiry + 2000); // Add 2s buffer + } + + async _sleep(ms) { + return new Promise(resolve => setTimeout(resolve, ms)); + } +} + +module.exports = RateLimiter; + diff --git a/.github/scripts/check-solidity-comments.sh b/.github/scripts/check-solidity-comments.sh old mode 100755 new mode 100644 diff --git a/.github/scripts/generate-docs-utils/ai/ai-enhancement.js b/.github/scripts/generate-docs-utils/ai/ai-enhancement.js new file mode 100644 index 00000000..4edf760c --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/ai-enhancement.js @@ -0,0 +1,76 @@ +/** + * AI Enhancement + * + * Orchestrates AI-powered documentation enhancement. 
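+ *
+ * Sketch of the expected call pattern (the `parsedData` shape is an
+ * assumption based on the checks in this module):
+ *   const { enhanceWithAI, shouldSkipEnhancement } = require('./ai-enhancement');
+ *   if (!shouldSkipEnhancement(parsedData)) {
+ *     const { data, usedFallback, error } = await enhanceWithAI(parsedData, 'module');
+ *     if (usedFallback) console.warn(`AI enhancement unavailable: ${error}`);
+ *   }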
+ */ + +const ai = require('../../ai-provider'); +const { buildSystemPrompt, buildPrompt } = require('./prompt-builder'); +const { extractJSON, convertEnhancedFields } = require('./response-parser'); +const { addFallbackContent } = require('./fallback-content-provider'); + +/** + * Check if enhancement should be skipped for a file + * @param {object} data - Documentation data + * @returns {boolean} True if should skip + */ +function shouldSkipEnhancement(data) { + if (!data.functions || data.functions.length === 0) { + return true; + } + + if (data.title.startsWith('I') && data.title.length > 1 && + data.title[1] === data.title[1].toUpperCase()) { + return true; + } + + return false; +} + +/** + * Enhance documentation data using AI + * @param {object} data - Parsed documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @param {string} token - Legacy token parameter (deprecated, uses env vars now) + * @returns {Promise<{data: object, usedFallback: boolean, error?: string}>} Enhanced data with fallback status + */ +async function enhanceWithAI(data, contractType, token) { + try { + const systemPrompt = buildSystemPrompt(); + const userPrompt = buildPrompt(data, contractType); + + // Call AI provider + const responseText = await ai.call(systemPrompt, userPrompt, { + onSuccess: () => { + // Silent success - no logging + }, + onError: () => { + // Silent error - will be caught below + } + }); + + // Parse JSON response + let enhanced; + try { + enhanced = JSON.parse(responseText); + } catch (directParseError) { + const cleanedContent = extractJSON(responseText); + enhanced = JSON.parse(cleanedContent); + } + + return { data: convertEnhancedFields(enhanced, data), usedFallback: false }; + + } catch (error) { + return { + data: addFallbackContent(data, contractType), + usedFallback: true, + error: error.message + }; + } +} + +module.exports = { + enhanceWithAI, + shouldSkipEnhancement, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/context-extractor.js b/.github/scripts/generate-docs-utils/ai/context-extractor.js new file mode 100644 index 00000000..abf3b490 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/context-extractor.js @@ -0,0 +1,250 @@ +/** + * Context Extractor for AI Documentation Enhancement + * + * Extracts and formats additional context from source files and parsed data + * to provide richer information to the AI for more accurate documentation generation. + */ + +const fs = require('fs'); +const path = require('path'); +const { readFileSafe } = require('../../workflow-utils'); +const { findRelatedContracts } = require('../core/relationship-detector'); +const { getContractRegistry } = require('../core/contract-registry'); + +/** + * Extract context from source file (pragma, imports, etc.) + * @param {string} sourceFilePath - Path to the Solidity source file + * @returns {object} Extracted source context + */ +function extractSourceContext(sourceFilePath) { + if (!sourceFilePath) { + return { + pragmaVersion: null, + imports: [], + }; + } + + const sourceContent = readFileSafe(sourceFilePath); + if (!sourceContent) { + return { + pragmaVersion: null, + imports: [], + }; + } + + // Extract pragma version + const pragmaMatch = sourceContent.match(/pragma\s+solidity\s+([^;]+);/); + const pragmaVersion = pragmaMatch ? 
pragmaMatch[1].trim() : null; + + // Extract imports + const importMatches = sourceContent.matchAll(/import\s+["']([^"']+)["']/g); + const imports = Array.from(importMatches, m => m[1]); + + return { + pragmaVersion, + imports, + }; +} + +/** + * Compute import path from source file path + * Converts: src/access/AccessControl/AccessControlFacet.sol + * To: @compose/access/AccessControl/AccessControlFacet + * @param {string} sourceFilePath - Path to the Solidity source file + * @returns {string} Import path + */ +function computeImportPath(sourceFilePath) { + if (!sourceFilePath) { + return null; + } + + // Remove src/ prefix and .sol extension + let importPath = sourceFilePath + .replace(/^src\//, '') + .replace(/\.sol$/, ''); + + // Convert to @compose/ format + return `@compose/${importPath}`; +} + +/** + * Format complete function signatures with parameter types and return types + * @param {Array} functions - Array of function objects + * @returns {string} Formatted function signatures + */ +function formatFunctionSignatures(functions) { + if (!functions || functions.length === 0) { + return 'None'; + } + + return functions.map(fn => { + // Format parameters + const params = (fn.params || []).map(p => { + const type = p.type || ''; + const name = p.name || ''; + if (!type && !name) return ''; + return name ? `${type} ${name}` : type; + }).filter(Boolean).join(', '); + + // Format return types + const returns = (fn.returns || []).map(r => r.type || '').filter(Boolean); + const returnStr = returns.length > 0 ? ` returns (${returns.join(', ')})` : ''; + + // Include visibility and mutability if available in signature + const signature = fn.signature || ''; + const visibility = signature.match(/\b(public|external|internal|private)\b/)?.[0] || ''; + const mutability = signature.match(/\b(view|pure|payable)\b/)?.[0] || ''; + + const modifiers = [visibility, mutability].filter(Boolean).join(' '); + + return `function ${fn.name}(${params})${modifiers ? ' ' + modifiers : ''}${returnStr}`; + }).join('\n'); +} + +/** + * Format storage context information + * @param {object} storageInfo - Storage info object + * @param {Array} structs - Array of struct definitions + * @param {Array} stateVariables - Array of state variables + * @returns {string} Formatted storage context + */ +function formatStorageContext(storageInfo, structs, stateVariables) { + const parts = []; + + // Extract storage position from state variables + const storagePositionVar = (stateVariables || []).find(v => + v.name && (v.name.includes('STORAGE_POSITION') || v.name.includes('STORAGE') || v.name.includes('_POSITION')) + ); + + if (storagePositionVar) { + parts.push(`Storage Position: ${storagePositionVar.name}`); + if (storagePositionVar.value) { + parts.push(`Value: ${storagePositionVar.value}`); + } + if (storagePositionVar.description) { + parts.push(`Description: ${storagePositionVar.description}`); + } + } + + // Extract storage struct + const storageStruct = (structs || []).find(s => + s.name && s.name.includes('Storage') + ); + + if (storageStruct) { + parts.push(`Storage Struct: ${storageStruct.name}`); + if (storageStruct.definition) { + // Extract key fields from struct definition + const fieldMatches = storageStruct.definition.matchAll(/(\w+)\s+(\w+)(?:\[.*?\])?;/g); + const fields = Array.from(fieldMatches, m => `${m[1]} ${m[2]}`); + if (fields.length > 0) { + parts.push(`Key Fields: ${fields.slice(0, 5).join(', ')}${fields.length > 5 ? '...' 
: ''}`); + } + } + } + + // Add storage info if available + if (storageInfo) { + if (typeof storageInfo === 'string') { + parts.push(storageInfo); + } else if (storageInfo.storagePosition) { + parts.push(`Storage Position: ${storageInfo.storagePosition}`); + } + } + + return parts.length > 0 ? parts.join('\n') : 'None'; +} + +/** + * Format related contracts context + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional) + * @returns {string} Formatted related contracts context + */ +function formatRelatedContracts(contractName, contractType, category, registry = null) { + const related = findRelatedContracts(contractName, contractType, category, registry); + + if (related.length === 0) { + return 'None'; + } + + return related.map(r => `- ${r.title}: ${r.description}`).join('\n'); +} + +/** + * Format struct definitions with field types + * @param {Array} structs - Array of struct objects + * @returns {string} Formatted struct definitions + */ +function formatStructDefinitions(structs) { + if (!structs || structs.length === 0) { + return 'None'; + } + + return structs.map(s => { + const fields = (s.fields || []).map(f => { + const type = f.type || ''; + const name = f.name || ''; + return name ? `${type} ${name}` : type; + }).join(', '); + + return `struct ${s.name} { ${fields} }`; + }).join('\n'); +} + +/** + * Format event signatures with parameters + * @param {Array} events - Array of event objects + * @returns {string} Formatted event signatures + */ +function formatEventSignatures(events) { + if (!events || events.length === 0) { + return 'None'; + } + + return events.map(e => { + const params = (e.params || []).map(p => { + const indexed = p.indexed ? 'indexed ' : ''; + const type = p.type || ''; + const name = p.name || ''; + return name ? `${indexed}${type} ${name}` : `${indexed}${type}`; + }).join(', '); + + return `event ${e.name}(${params})`; + }).join('\n'); +} + +/** + * Format error signatures with parameters + * @param {Array} errors - Array of error objects + * @returns {string} Formatted error signatures + */ +function formatErrorSignatures(errors) { + if (!errors || errors.length === 0) { + return 'None'; + } + + return errors.map(e => { + const params = (e.params || []).map(p => { + const type = p.type || ''; + const name = p.name || ''; + return name ? `${type} ${name}` : type; + }).join(', '); + + return `error ${e.name}(${params})`; + }).join('\n'); +} + +module.exports = { + extractSourceContext, + computeImportPath, + formatFunctionSignatures, + formatStorageContext, + formatRelatedContracts, + formatStructDefinitions, + formatEventSignatures, + formatErrorSignatures, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js b/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js new file mode 100644 index 00000000..1afabac0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/fallback-content-provider.js @@ -0,0 +1,36 @@ +/** + * Fallback Content Provider + * + * Provides fallback content when AI enhancement is unavailable. + * Centralizes fallback content logic to avoid duplication. 
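+ *
+ * Minimal usage sketch (the `parsedData` object is assumed to come from the
+ * docs parser):
+ *   const { addFallbackContent } = require('./fallback-content-provider');
+ *   const docData = addFallbackContent(parsedData, 'module');
+ *   // docData.integrationNotes and docData.keyFeatures are now populated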
+ */ + +const { loadPrompts } = require('./prompt-loader'); + +/** + * Add fallback content when AI is unavailable + * @param {object} data - Documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @returns {object} Data with fallback content + */ +function addFallbackContent(data, contractType) { + const prompts = loadPrompts(); + const enhanced = { ...data }; + + if (contractType === 'module') { + enhanced.integrationNotes = prompts.moduleFallback.integrationNotes || + `This module accesses shared diamond storage, so changes made through this module are immediately visible to facets using the same storage pattern. All functions are internal as per Compose conventions.`; + enhanced.keyFeatures = prompts.moduleFallback.keyFeatures || + `- All functions are \`internal\` for use in custom facets\n- Follows diamond storage pattern (EIP-8042)\n- Compatible with ERC-2535 diamonds\n- No external dependencies or \`using\` directives`; + } else { + enhanced.keyFeatures = prompts.facetFallback.keyFeatures || + `- Self-contained facet with no imports or inheritance\n- Only \`external\` and \`internal\` function visibility\n- Follows Compose readability-first conventions\n- Ready for diamond integration`; + } + + return enhanced; +} + +module.exports = { + addFallbackContent, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/prompt-builder.js b/.github/scripts/generate-docs-utils/ai/prompt-builder.js new file mode 100644 index 00000000..362ecf02 --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/prompt-builder.js @@ -0,0 +1,186 @@ +/** + * Prompt Builder + * + * Builds system and user prompts for AI enhancement. + */ + +const { + extractSourceContext, + computeImportPath, + formatFunctionSignatures, + formatStorageContext, + formatRelatedContracts, + formatStructDefinitions, + formatEventSignatures, + formatErrorSignatures, +} = require('./context-extractor'); +const { getContractRegistry } = require('../core/contract-registry'); +const { loadPrompts, loadRepoInstructions } = require('./prompt-loader'); + +/** + * Build the system prompt with repository context + * Uses the system prompt from the prompts file, or a fallback if not found + * @returns {string} System prompt for AI + */ +function buildSystemPrompt() { + const prompts = loadPrompts(); + const repoInstructions = loadRepoInstructions(); + + let systemPrompt = prompts.systemPrompt || `You are a Solidity smart contract documentation expert for the Compose framework. +Always respond with valid JSON only, no markdown formatting. +Follow the project conventions and style guidelines strictly.`; + + if (repoInstructions) { + const relevantSections = prompts.relevantSections.length > 0 + ? prompts.relevantSections + : [ + '## 3. Core Philosophy', + '## 4. Facet Design Principles', + '## 5. Banned Solidity Features', + '## 6. Composability Guidelines', + '## 11. Code Style Guide', + ]; + + let contextSnippets = []; + for (const section of relevantSections) { + const startIdx = repoInstructions.indexOf(section); + if (startIdx !== -1) { + // Extract section content (up to next ## or 2000 chars max) + const nextSection = repoInstructions.indexOf('\n## ', startIdx + section.length); + const endIdx = nextSection !== -1 ? 
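+        // stop at the next "## " heading when found, otherwise cap the snippet at 2000 chars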
nextSection : startIdx + 2000; + const snippet = repoInstructions.slice(startIdx, Math.min(endIdx, startIdx + 2000)); + contextSnippets.push(snippet.trim()); + } + } + + if (contextSnippets.length > 0) { + systemPrompt += `\n\n--- PROJECT GUIDELINES ---\n${contextSnippets.join('\n\n')}`; + } + } + + return systemPrompt; +} + +/** + * Build the prompt for AI based on contract type + * @param {object} data - Parsed documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @returns {string} Prompt for AI + */ +function buildPrompt(data, contractType) { + const prompts = loadPrompts(); + + const functionNames = data.functions.map(f => f.name).join(', '); + const functionDescriptions = data.functions + .map(f => `- ${f.name}: ${f.description || 'No description'}`) + .join('\n'); + + // Include events and errors for richer context + const eventNames = (data.events || []).map(e => e.name).join(', '); + const errorNames = (data.errors || []).map(e => e.name).join(', '); + + // Extract additional context + const sourceContext = extractSourceContext(data.sourceFilePath); + const importPath = computeImportPath(data.sourceFilePath); + const functionSignatures = formatFunctionSignatures(data.functions); + const eventSignatures = formatEventSignatures(data.events); + const errorSignatures = formatErrorSignatures(data.errors); + const structDefinitions = formatStructDefinitions(data.structs); + + // Get storage context + const storageContext = formatStorageContext( + data.storageInfo, + data.structs, + data.stateVariables + ); + + // Get related contracts context + const registry = getContractRegistry(); + // Try to get category from registry entry, or use empty string + const registryEntry = registry.byName.get(data.title); + const category = data.category || (registryEntry ? registryEntry.category : ''); + const relatedContracts = formatRelatedContracts( + data.title, + contractType, + category, + registry + ); + + const promptTemplate = contractType === 'module' + ? prompts.modulePrompt + : prompts.facetPrompt; + + // If we have a template from the file, use it with variable substitution + if (promptTemplate) { + return promptTemplate + .replace(/\{\{title\}\}/g, data.title) + .replace(/\{\{description\}\}/g, data.description || 'No description provided') + .replace(/\{\{functionNames\}\}/g, functionNames || 'None') + .replace(/\{\{functionDescriptions\}\}/g, functionDescriptions || ' None') + .replace(/\{\{eventNames\}\}/g, eventNames || 'None') + .replace(/\{\{errorNames\}\}/g, errorNames || 'None') + .replace(/\{\{functionSignatures\}\}/g, functionSignatures || 'None') + .replace(/\{\{eventSignatures\}\}/g, eventSignatures || 'None') + .replace(/\{\{errorSignatures\}\}/g, errorSignatures || 'None') + .replace(/\{\{importPath\}\}/g, importPath || 'N/A') + .replace(/\{\{pragmaVersion\}\}/g, sourceContext.pragmaVersion || '^0.8.30') + .replace(/\{\{storageContext\}\}/g, storageContext || 'None') + .replace(/\{\{relatedContracts\}\}/g, relatedContracts || 'None') + .replace(/\{\{structDefinitions\}\}/g, structDefinitions || 'None'); + } + + // Fallback to hardcoded prompt if template not loaded + return `Given this ${contractType} documentation from the Compose diamond proxy framework, enhance it by generating: + +1. **description**: A concise one-line description (max 100 chars) for the page subtitle. Derive this from the contract's purpose based on its functions, events, and errors. + +2. 
**overview**: A clear, concise overview (2-3 sentences) explaining what this ${contractType} does and why it's useful in the context of diamond contracts. + +3. **usageExample**: A practical Solidity code example (10-20 lines) showing how to use this ${contractType}. For modules, show importing and calling functions. For facets, show how it would be used in a diamond. Use the EXACT import path and function signatures provided below. + +4. **bestPractices**: 2-3 bullet points of best practices for using this ${contractType}. + +${contractType === 'module' ? '5. **integrationNotes**: A note about how this module works with diamond storage pattern and how changes made through it are visible to facets.' : ''} + +${contractType === 'facet' ? '5. **securityConsiderations**: Important security considerations when using this facet (access control, reentrancy, etc.).' : ''} + +6. **keyFeatures**: A brief bullet list of key features. + +Contract Information: +- Name: ${data.title} +- Current Description: ${data.description || 'No description provided'} +- Import Path: ${importPath || 'N/A'} +- Pragma Version: ${sourceContext.pragmaVersion || '^0.8.30'} +- Functions: ${functionNames || 'None'} +- Function Signatures: +${functionSignatures || ' None'} +- Events: ${eventNames || 'None'} +- Event Signatures: +${eventSignatures || ' None'} +- Errors: ${errorNames || 'None'} +- Error Signatures: +${errorSignatures || ' None'} +- Function Details: +${functionDescriptions || ' None'} +${storageContext && storageContext !== 'None' ? `\n- Storage Information:\n${storageContext}` : ''} +${relatedContracts && relatedContracts !== 'None' ? `\n- Related Contracts:\n${relatedContracts}` : ''} +${structDefinitions && structDefinitions !== 'None' ? `\n- Struct Definitions:\n${structDefinitions}` : ''} + +IMPORTANT: Use the EXACT function signatures, import paths, and storage information provided above. Do not invent or modify function names, parameter types, or import paths. + +Respond ONLY with valid JSON in this exact format (no markdown code blocks, no extra text): +{ + "description": "concise one-line description here", + "overview": "enhanced overview text here", + "usageExample": "solidity code here (use \\n for newlines)", + "bestPractices": "- Point 1\\n- Point 2\\n- Point 3", + "keyFeatures": "- Feature 1\\n- Feature 2", + ${contractType === 'module' ? '"integrationNotes": "integration notes here"' : '"securityConsiderations": "security notes here"'} +}`; +} + +module.exports = { + buildSystemPrompt, + buildPrompt, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/prompt-loader.js b/.github/scripts/generate-docs-utils/ai/prompt-loader.js new file mode 100644 index 00000000..f4f8cd1f --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/prompt-loader.js @@ -0,0 +1,132 @@ +/** + * Prompt Loader + * + * Loads and parses AI prompts from markdown files. 
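+ *
+ * Typical usage from the other ai/ modules (results are cached after the
+ * first read):
+ *   const { loadPrompts, loadRepoInstructions } = require('./prompt-loader');
+ *   const prompts = loadPrompts();             // { systemPrompt, modulePrompt, facetPrompt, ... }
+ *   const guidelines = loadRepoInstructions(); // raw copilot-instructions.md text, or '' if missing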
+ */ + +const fs = require('fs'); +const path = require('path'); + +const AI_PROMPT_PATH = path.join(__dirname, '../../../docs-gen-prompts.md'); +const REPO_INSTRUCTIONS_PATH = path.join(__dirname, '../../../copilot-instructions.md'); + +// Cache loaded prompts +let cachedPrompts = null; +let cachedRepoInstructions = null; + +/** + * Load repository instructions for context + * @returns {string} Repository instructions content + */ +function loadRepoInstructions() { + if (cachedRepoInstructions !== null) { + return cachedRepoInstructions; + } + + try { + cachedRepoInstructions = fs.readFileSync(REPO_INSTRUCTIONS_PATH, 'utf8'); + } catch (e) { + console.warn('Could not load copilot-instructions.md:', e.message); + cachedRepoInstructions = ''; + } + + return cachedRepoInstructions; +} + +/** + * Parse the prompts markdown file to extract individual prompts + * @param {string} content - Raw markdown content + * @returns {object} Parsed prompts and configurations + */ +function parsePromptsFile(content) { + const sections = content.split(/^---$/m).map(s => s.trim()).filter(Boolean); + + const prompts = { + systemPrompt: '', + modulePrompt: '', + facetPrompt: '', + relevantSections: [], + moduleFallback: { integrationNotes: '', keyFeatures: '' }, + facetFallback: { keyFeatures: '' }, + }; + + for (const section of sections) { + if (section.includes('## System Prompt')) { + const match = section.match(/## System Prompt\s*\n([\s\S]*)/); + if (match) { + prompts.systemPrompt = match[1].trim(); + } + } else if (section.includes('## Relevant Guideline Sections')) { + // Extract sections from the code block + const codeMatch = section.match(/```\n([\s\S]*?)```/); + if (codeMatch) { + prompts.relevantSections = codeMatch[1] + .split('\n') + .map(s => s.trim()) + .filter(s => s.startsWith('## ')); + } + } else if (section.includes('## Module Prompt Template')) { + const match = section.match(/## Module Prompt Template\s*\n([\s\S]*)/); + if (match) { + prompts.modulePrompt = match[1].trim(); + } + } else if (section.includes('## Facet Prompt Template')) { + const match = section.match(/## Facet Prompt Template\s*\n([\s\S]*)/); + if (match) { + prompts.facetPrompt = match[1].trim(); + } + } else if (section.includes('## Module Fallback Content')) { + // Parse subsections for integrationNotes and keyFeatures + const integrationMatch = section.match(/### integrationNotes\s*\n([\s\S]*?)(?=###|$)/); + if (integrationMatch) { + prompts.moduleFallback.integrationNotes = integrationMatch[1].trim(); + } + const keyFeaturesMatch = section.match(/### keyFeatures\s*\n([\s\S]*?)(?=###|$)/); + if (keyFeaturesMatch) { + prompts.moduleFallback.keyFeatures = keyFeaturesMatch[1].trim(); + } + } else if (section.includes('## Facet Fallback Content')) { + const keyFeaturesMatch = section.match(/### keyFeatures\s*\n([\s\S]*?)(?=###|$)/); + if (keyFeaturesMatch) { + prompts.facetFallback.keyFeatures = keyFeaturesMatch[1].trim(); + } + } + } + + return prompts; +} + +/** + * Load AI prompts from markdown file + * @returns {object} Parsed prompts object + */ +function loadPrompts() { + if (cachedPrompts !== null) { + return cachedPrompts; + } + + const defaultPrompts = { + systemPrompt: '', + modulePrompt: '', + facetPrompt: '', + relevantSections: [], + moduleFallback: { integrationNotes: '', keyFeatures: '' }, + facetFallback: { keyFeatures: '' }, + }; + + try { + const promptsContent = fs.readFileSync(AI_PROMPT_PATH, 'utf8'); + cachedPrompts = parsePromptsFile(promptsContent); + } catch (e) { + console.warn('Could not load 
ai-prompts.md:', e.message); + cachedPrompts = defaultPrompts; + } + + return cachedPrompts; +} + +module.exports = { + loadPrompts, + loadRepoInstructions, +}; + diff --git a/.github/scripts/generate-docs-utils/ai/response-parser.js b/.github/scripts/generate-docs-utils/ai/response-parser.js new file mode 100644 index 00000000..a7e7204e --- /dev/null +++ b/.github/scripts/generate-docs-utils/ai/response-parser.js @@ -0,0 +1,137 @@ +/** + * Response Parser + * + * Parses and cleans AI response content. + */ + +/** + * Extract and clean JSON from API response + * Handles markdown code blocks, wrapped text, and attempts to fix truncated JSON + * Also removes control characters that break JSON parsing + * @param {string} content - Raw API response content + * @returns {string} Cleaned JSON string ready for parsing + */ +function extractJSON(content) { + if (!content || typeof content !== 'string') { + return content; + } + + let cleaned = content.trim(); + + // Remove markdown code blocks (```json ... ``` or ``` ... ```) + // Handle both at start and anywhere in the string + cleaned = cleaned.replace(/^```(?:json)?\s*\n?/gm, ''); + cleaned = cleaned.replace(/\n?```\s*$/gm, ''); + cleaned = cleaned.trim(); + + // Remove control characters (0x00-0x1F except newline, tab, carriage return) + // These are illegal in JSON strings and cause "Bad control character" parsing errors + cleaned = cleaned.replace(/[\x00-\x08\x0B\x0C\x0E-\x1F]/g, ''); + + // Find the first { and last } to extract JSON object + const firstBrace = cleaned.indexOf('{'); + const lastBrace = cleaned.lastIndexOf('}'); + + if (firstBrace !== -1 && lastBrace !== -1 && lastBrace > firstBrace) { + cleaned = cleaned.substring(firstBrace, lastBrace + 1); + } else if (firstBrace !== -1) { + // We have a { but no closing }, JSON might be truncated + cleaned = cleaned.substring(firstBrace); + } + + // Try to fix common truncation issues + const openBraces = (cleaned.match(/\{/g) || []).length; + const closeBraces = (cleaned.match(/\}/g) || []).length; + + if (openBraces > closeBraces) { + // JSON might be truncated - try to close incomplete strings and objects + // Check if we're in the middle of a string (simple heuristic) + const lastChar = cleaned[cleaned.length - 1]; + const lastQuote = cleaned.lastIndexOf('"'); + const lastBraceInCleaned = cleaned.lastIndexOf('}'); + + // If last quote is after last brace and not escaped, we might be in a string + if (lastQuote > lastBraceInCleaned && lastChar !== '"') { + // Check if the quote before last is escaped + let isEscaped = false; + for (let i = lastQuote - 1; i >= 0 && cleaned[i] === '\\'; i--) { + isEscaped = !isEscaped; + } + + if (!isEscaped) { + // We're likely in an incomplete string, close it + cleaned = cleaned + '"'; + } + } + + // Close any incomplete objects/arrays + const missingBraces = openBraces - closeBraces; + // Try to intelligently close - if we're in the middle of a property, add a value first + const trimmed = cleaned.trim(); + if (trimmed.endsWith(',') || trimmed.endsWith(':')) { + // We're in the middle of a property, add null and close + cleaned = cleaned.replace(/[,:]\s*$/, ': null'); + } + cleaned = cleaned + '\n' + '}'.repeat(missingBraces); + } + + return cleaned.trim(); +} + +/** + * Convert literal \n strings to actual newlines + * @param {string} str - String with escaped newlines + * @returns {string} String with actual newlines + */ +function convertNewlines(str) { + if (!str || typeof str !== 'string') return str; + return str.replace(/\\n/g, '\n'); +} + 
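+// Illustrative example of the repair heuristics in extractJSON above
+// (the response text is hypothetical):
+//
+//   const raw = '```json\n{ "description": "Role-based access control", "overview": "Exposes intern';
+//   extractJSON(raw);
+//   // → '{ "description": "Role-based access control", "overview": "Exposes intern"\n}'
+//
+// The fence is stripped, the truncated string value is closed, and the missing
+// closing brace is appended so JSON.parse has a chance to succeed.
+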
+/**
+ * Decode HTML entities (for code blocks)
+ * @param {string} str - String with HTML entities
+ * @returns {string} Decoded string
+ */
+function decodeHtmlEntities(str) {
+  if (!str || typeof str !== 'string') return str;
+  // Restore the HTML escapes most commonly produced in generated code blocks
+  return str
+    .replace(/&quot;/g, '"')
+    .replace(/&#x3D;&gt;/g, '=>')
+    .replace(/&#x3D;/g, '=')
+    .replace(/&lt;/g, '<')
+    .replace(/&gt;/g, '>')
+    .replace(/&#39;/g, "'")
+    .replace(/&amp;/g, '&');
+}
+
+/**
+ * Convert enhanced data fields (newlines, HTML entities)
+ * @param {object} enhanced - Parsed JSON from API
+ * @param {object} data - Original documentation data
+ * @returns {object} Enhanced data with converted fields
+ */
+function convertEnhancedFields(enhanced, data) {
+  // Use AI-generated description if provided, otherwise keep original
+  const aiDescription = enhanced.description?.trim();
+  const finalDescription = aiDescription || data.description;
+
+  return {
+    ...data,
+    // Description is used for page subtitle - AI improves it from NatSpec
+    description: finalDescription,
+    subtitle: finalDescription,
+    overview: convertNewlines(enhanced.overview) || data.overview,
+    usageExample: decodeHtmlEntities(convertNewlines(enhanced.usageExample)) || null,
+    bestPractices: convertNewlines(enhanced.bestPractices) || null,
+    keyFeatures: convertNewlines(enhanced.keyFeatures) || null,
+    integrationNotes: convertNewlines(enhanced.integrationNotes) || null,
+    securityConsiderations: convertNewlines(enhanced.securityConsiderations) || null,
+  };
+}
+
+module.exports = {
+  extractJSON,
+  convertEnhancedFields,
+};
+
diff --git a/.github/scripts/generate-docs-utils/category/category-generator.js b/.github/scripts/generate-docs-utils/category/category-generator.js
new file mode 100644
index 00000000..7068eca3
--- /dev/null
+++ b/.github/scripts/generate-docs-utils/category/category-generator.js
@@ -0,0 +1,628 @@
+/**
+ * Category Generator
+ *
+ * Automatically generates _category_.json files to mirror
+ * the src/ folder structure in the documentation.
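+ *
+ * Source directory names are remapped where needed for the published docs
+ * (for example, `libraries/` is emitted as `utils/`; see mapDirectoryName).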
+ * + * This module provides: + * - Source structure scanning + * - Category file generation + * - Path computation for doc output + * - Structure synchronization + */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); +const { + getCategoryItems, + createCategoryIndexFile: createIndexFile, +} = require('./index-page-generator'); + +// ============================================================================ +// Constants +// ============================================================================ + +/** + * Human-readable labels for directory names + * Add new entries here when adding new top-level categories + */ +const CATEGORY_LABELS = { + // Top-level categories + access: 'Access Control', + token: 'Token Standards', + diamond: 'Diamond Core', + libraries: 'Utilities', + utils: 'Utilities', + interfaceDetection: 'Interface Detection', + + // Token subcategories + ERC20: 'ERC-20', + ERC721: 'ERC-721', + ERC1155: 'ERC-1155', + ERC6909: 'ERC-6909', + Royalty: 'Royalty', + + // Access subcategories + AccessControl: 'Access Control', + AccessControlPausable: 'Pausable Access Control', + AccessControlTemporal: 'Temporal Access Control', + Owner: 'Owner', + OwnerTwoSteps: 'Two-Step Owner', +}; + +/** + * Descriptions for categories + * Add new entries here for custom descriptions + */ +const CATEGORY_DESCRIPTIONS = { + // Top-level categories + access: 'Access control patterns for permission management in Compose diamonds.', + token: 'Token standard implementations for Compose diamonds.', + diamond: 'Core diamond proxy functionality for ERC-2535 diamonds.', + libraries: 'Utility libraries and helpers for diamond development.', + utils: 'Utility libraries and helpers for diamond development.', + interfaceDetection: 'ERC-165 interface detection support.', + + // Token subcategories + ERC20: 'ERC-20 fungible token implementations.', + ERC721: 'ERC-721 non-fungible token implementations.', + ERC1155: 'ERC-1155 multi-token implementations.', + ERC6909: 'ERC-6909 minimal multi-token implementations.', + Royalty: 'ERC-2981 royalty standard implementations.', + + // Access subcategories + AccessControl: 'Role-based access control (RBAC) pattern.', + AccessControlPausable: 'RBAC with pause functionality.', + AccessControlTemporal: 'Time-limited role-based access control.', + Owner: 'Single-owner access control pattern.', + OwnerTwoSteps: 'Two-step ownership transfer pattern.', +}; + +/** + * Sidebar positions for categories + * Lower numbers appear first in the sidebar + */ +const CATEGORY_POSITIONS = { + // Top-level (lower = higher priority) + diamond: 1, + access: 2, + token: 3, + libraries: 4, + utils: 4, + interfaceDetection: 5, + + // Token subcategories + ERC20: 1, + ERC721: 2, + ERC1155: 3, + ERC6909: 4, + Royalty: 5, + + // Access subcategories + Owner: 1, + OwnerTwoSteps: 2, + AccessControl: 3, + AccessControlPausable: 4, + AccessControlTemporal: 5, + + // Leaf directories (ERC20/ERC20, etc.) 
- alphabetical + ERC20Bridgeable: 2, + ERC20Permit: 3, + ERC721Enumerable: 2, +}; + +// ============================================================================ +// Label & Description Generation +// ============================================================================ + +/** + * Generate a human-readable label from a directory name + * @param {string} name - Directory name (e.g., 'AccessControlPausable', 'ERC20') + * @returns {string} Human-readable label + */ +function generateLabel(name) { + // Check explicit mapping first + if (CATEGORY_LABELS[name]) { + return CATEGORY_LABELS[name]; + } + + // Handle ERC standards specially + if (/^ERC\d+/.test(name)) { + const match = name.match(/^(ERC)(\d+)(.*)$/); + if (match) { + const variant = match[3] + ? ' ' + match[3].replace(/([A-Z])/g, ' $1').trim() + : ''; + return `ERC-${match[2]}${variant}`; + } + return name; + } + + // CamelCase to Title Case with spaces + return name.replace(/([A-Z])/g, ' $1').replace(/^ /, '').trim(); +} + +/** + * Generate description for a category based on its path + * @param {string} name - Directory name + * @param {string[]} parentPath - Parent path segments + * @returns {string} Category description + */ +function generateDescription(name, parentPath = []) { + // Check explicit mapping first + if (CATEGORY_DESCRIPTIONS[name]) { + return CATEGORY_DESCRIPTIONS[name]; + } + + // Generate from context + const label = generateLabel(name); + const parent = parentPath[parentPath.length - 1]; + + if (parent === 'token') { + return `${label} token implementations with modules and facets.`; + } + if (parent === 'access') { + return `${label} access control pattern for Compose diamonds.`; + } + if (parent === 'ERC20' || parent === 'ERC721') { + return `${label} extension for ${generateLabel(parent)} tokens.`; + } + + return `${label} components for Compose diamonds.`; +} + +/** + * Get sidebar position for a category + * @param {string} name - Directory name + * @param {number} depth - Nesting depth + * @returns {number} Sidebar position + */ +function getCategoryPosition(name, depth) { + if (CATEGORY_POSITIONS[name] !== undefined) { + return CATEGORY_POSITIONS[name]; + } + return 99; // Default to end +} + +// ============================================================================ +// Source Structure Scanning +// ============================================================================ + +/** + * Check if a directory contains .sol files (directly or in subdirectories) + * @param {string} dirPath - Directory path to check + * @returns {boolean} True if contains .sol files + */ +function containsSolFiles(dirPath) { + try { + const entries = fs.readdirSync(dirPath, { withFileTypes: true }); + + for (const entry of entries) { + if (entry.isFile() && entry.name.endsWith('.sol')) { + return true; + } + if (entry.isDirectory() && !entry.name.startsWith('.')) { + if (containsSolFiles(path.join(dirPath, entry.name))) { + return true; + } + } + } + } catch (error) { + console.warn(`Warning: Could not read directory ${dirPath}: ${error.message}`); + } + + return false; +} + +/** + * Scan the src/ directory and build structure map + * @returns {Map} Map of relative paths to category info + */ +function scanSourceStructure() { + const srcDir = CONFIG.srcDir || 'src'; + const structure = new Map(); + + function scanDir(dirPath, relativePath = '') { + let entries; + try { + entries = fs.readdirSync(dirPath, { withFileTypes: true }); + } catch (error) { + console.error(`Error reading directory ${dirPath}: 
${error.message}`); + return; + } + + for (const entry of entries) { + if (!entry.isDirectory()) continue; + + // Skip hidden directories and interfaces + if (entry.name.startsWith('.') || entry.name === 'interfaces') { + continue; + } + + const fullPath = path.join(dirPath, entry.name); + const relPath = relativePath ? `${relativePath}/${entry.name}` : entry.name; + + // Only include directories that contain .sol files + if (containsSolFiles(fullPath)) { + const parts = relPath.split('/'); + structure.set(relPath, { + name: entry.name, + path: relPath, + depth: parts.length, + parent: relativePath || null, + parentParts: relativePath ? relativePath.split('/') : [], + }); + + // Recurse into subdirectories + scanDir(fullPath, relPath); + } + } + } + + if (fs.existsSync(srcDir)) { + scanDir(srcDir); + } else { + console.warn(`Warning: Source directory ${srcDir} does not exist`); + } + + return structure; +} + +// ============================================================================ +// Category File Generation +// ============================================================================ + +/** + * Map source directory name to docs directory name + * @param {string} srcName - Source directory name + * @returns {string} Documentation directory name + */ +function mapDirectoryName(srcName) { + // Map libraries -> utils for URL consistency + if (srcName === 'libraries') { + return 'utils'; + } + return srcName; +} + +/** + * Compute slug from output directory path + * @param {string} outputDir - Full output directory path + * @param {string} libraryDir - Base library directory + * @returns {string} Slug path (e.g., '/docs/library/access') + */ +function computeSlug(outputDir, libraryDir) { + const relativePath = path.relative(libraryDir, outputDir); + + if (!relativePath || relativePath.startsWith('..')) { + // Root library directory + return '/docs/library'; + } + + // Convert path separators and create slug + const normalizedPath = relativePath.replace(/\\/g, '/'); + return `/docs/library/${normalizedPath}`; +} + +/** + * Wrapper function to create category index file using the index-page-generator utility + * @param {string} outputDir - Directory to create index file in + * @param {string} relativePath - Relative path from library dir + * @param {string} label - Category label + * @param {string} description - Category description + * @param {boolean} overwrite - Whether to overwrite existing files (default: false) + * @param {boolean} hideFromSidebar - Whether to hide the index page from sidebar (default: false) + * @returns {boolean} True if file was created/updated, false if skipped + */ +function createCategoryIndexFile(outputDir, relativePath, label, description, overwrite = false, hideFromSidebar = false) { + return createIndexFile( + outputDir, + relativePath, + label, + description, + generateLabel, + generateDescription, + overwrite, + hideFromSidebar + ); +} + +/** + * Create a _category_.json file for a directory + * @param {string} outputDir - Directory to create category file in + * @param {string} name - Directory name + * @param {string} relativePath - Relative path from library dir + * @param {number} depth - Nesting depth + * @returns {boolean} True if file was created, false if it already existed + */ +function createCategoryFile(outputDir, name, relativePath, depth) { + const categoryFile = path.join(outputDir, '_category_.json'); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Don't overwrite existing category files (allows manual 
customization) + if (fs.existsSync(categoryFile)) { + return false; + } + + // Get the actual directory name from the output path (may be mapped, e.g., utils instead of libraries) + const actualDirName = path.basename(outputDir); + const parentParts = relativePath.split('/').slice(0, -1); + // Use actual directory name for label generation (supports both original and mapped names) + const label = generateLabel(actualDirName); + const position = getCategoryPosition(actualDirName, depth); + const description = generateDescription(actualDirName, parentParts); + + // Create index.mdx file first + createCategoryIndexFile(outputDir, relativePath, label, description); + + // Create category file pointing to index.mdx + const docId = relativePath ? `library/${relativePath}/index` : 'library/index'; + + const category = { + label, + position, + collapsible: true, + collapsed: true, // Collapse all categories by default + link: { + type: 'doc', + id: docId, + }, + }; + + // Ensure directory exists + fs.mkdirSync(outputDir, { recursive: true }); + fs.writeFileSync(categoryFile, JSON.stringify(category, null, 2) + '\n'); + + return true; +} + +/** + * Ensure the base library category file exists + * @param {string} libraryDir - Path to library directory + * @returns {boolean} True if created, false if existed + */ +function ensureBaseCategory(libraryDir) { + const categoryFile = path.join(libraryDir, '_category_.json'); + + if (fs.existsSync(categoryFile)) { + return false; + } + + const label = 'Library'; + const description = 'API reference for all Compose modules and facets.'; + + // Create index.mdx for base library category + // Hide from sidebar (sidebar_class_name: "hidden") so it doesn't appear as a page in the sidebar + createIndexFile(libraryDir, '', label, description, generateLabel, generateDescription, false, true); + + const baseCategory = { + label, + position: 4, + collapsible: true, + collapsed: true, // Collapse base Library category by default + link: { + type: 'doc', + id: 'library/index', + }, + }; + + fs.mkdirSync(libraryDir, { recursive: true }); + fs.writeFileSync(categoryFile, JSON.stringify(baseCategory, null, 2) + '\n'); + + return true; +} + +// ============================================================================ +// Path Computation +// ============================================================================ + +/** + * Compute output path for a source file + * Mirrors the src/ structure in website/docs/library/ + * Applies directory name mapping (e.g., libraries -> utils) + * + * @param {string} solFilePath - Path to .sol file (e.g., 'src/access/AccessControl/AccessControlMod.sol') + * @returns {object} Output path information + */ +function computeOutputPath(solFilePath) { + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Normalize path separators + const normalizedPath = solFilePath.replace(/\\/g, '/'); + + // Remove 'src/' prefix and '.sol' extension + const relativePath = normalizedPath.replace(/^src\//, '').replace(/\.sol$/, ''); + + const parts = relativePath.split('/'); + const fileName = parts.pop(); + + // Map directory names (e.g., libraries -> utils) + const mappedParts = parts.map(part => mapDirectoryName(part)); + + const outputDir = path.join(libraryDir, ...mappedParts); + const outputFile = path.join(outputDir, `${fileName}.mdx`); + + return { + outputDir, + outputFile, + relativePath: mappedParts.join('/'), + fileName, + category: mappedParts[0] || '', + subcategory: mappedParts[1] || '', + fullRelativePath: 
mappedParts.join('/'), + depth: mappedParts.length, + }; +} + +/** + * Ensure all parent category files exist for a given output path + * Creates _category_.json files for each directory level + * + * @param {string} outputDir - Full output directory path + */ +function ensureCategoryFiles(outputDir) { + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + // Get relative path from library base + const relativePath = path.relative(libraryDir, outputDir); + + if (!relativePath || relativePath.startsWith('..')) { + return; // outputDir is not under libraryDir + } + + // Ensure base category exists + ensureBaseCategory(libraryDir); + + // Walk up the directory tree, creating category files + const parts = relativePath.split(path.sep); + let currentPath = libraryDir; + + for (let i = 0; i < parts.length; i++) { + currentPath = path.join(currentPath, parts[i]); + const segment = parts[i]; + // Use the mapped path for the relative path (already mapped in computeOutputPath) + const relPath = parts.slice(0, i + 1).join('/'); + + createCategoryFile(currentPath, segment, relPath, i + 1); + } +} + +// ============================================================================ +// Structure Synchronization +// ============================================================================ + +/** + * Regenerate index.mdx files for all categories + * @param {boolean} overwrite - Whether to overwrite existing files (default: true) + * @returns {object} Summary of regenerated categories + */ +function regenerateAllIndexFiles(overwrite = true) { + const structure = scanSourceStructure(); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + const regenerated = []; + const skipped = []; + + // Regenerate base library index + // Always hide from sidebar (sidebar_class_name: "hidden") + const label = 'Library'; + const description = 'API reference for all Compose modules and facets.'; + if (createCategoryIndexFile(libraryDir, '', label, description, overwrite, true)) { + regenerated.push('library'); + } else { + skipped.push('library'); + } + + // Regenerate index for each category + const sortedPaths = Array.from(structure.entries()).sort((a, b) => + a[0].localeCompare(b[0]) + ); + + for (const [relativePath, info] of sortedPaths) { + const pathParts = relativePath.split('/'); + const mappedPathParts = pathParts.map(part => mapDirectoryName(part)); + const mappedRelativePath = mappedPathParts.join('/'); + const outputDir = path.join(libraryDir, ...mappedPathParts); + + const actualDirName = path.basename(outputDir); + const parentParts = mappedRelativePath.split('/').slice(0, -1); + const label = generateLabel(actualDirName); + const description = generateDescription(actualDirName, parentParts); + + if (createCategoryIndexFile(outputDir, mappedRelativePath, label, description, overwrite)) { + regenerated.push(mappedRelativePath); + } else { + skipped.push(mappedRelativePath); + } + } + + return { + regenerated, + skipped, + total: structure.size + 1, // +1 for base library + }; +} + +/** + * Synchronize docs structure with src structure + * Creates any missing category directories and _category_.json files + * + * @returns {object} Summary of created categories + */ +function syncDocsStructure() { + const structure = scanSourceStructure(); + const libraryDir = CONFIG.libraryOutputDir || 'website/docs/library'; + + const created = []; + const existing = []; + + // Ensure base library directory exists with category + if (ensureBaseCategory(libraryDir)) { + 
created.push('library'); + } else { + existing.push('library'); + } + + // Create category for each directory in the structure + // Sort by path to ensure parents are created before children + const sortedPaths = Array.from(structure.entries()).sort((a, b) => + a[0].localeCompare(b[0]) + ); + + for (const [relativePath, info] of sortedPaths) { + // Map directory names in the path (e.g., libraries -> utils) + const pathParts = relativePath.split('/'); + const mappedPathParts = pathParts.map(part => mapDirectoryName(part)); + const mappedRelativePath = mappedPathParts.join('/'); + const outputDir = path.join(libraryDir, ...mappedPathParts); + + const wasCreated = createCategoryFile( + outputDir, + info.name, + mappedRelativePath, + info.depth + ); + + if (wasCreated) { + created.push(mappedRelativePath); + } else { + existing.push(mappedRelativePath); + } + } + + return { + created, + existing, + total: structure.size, + structure, + }; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + // Core functions + scanSourceStructure, + syncDocsStructure, + computeOutputPath, + ensureCategoryFiles, + createCategoryIndexFile, + regenerateAllIndexFiles, + + // Utilities + generateLabel, + generateDescription, + getCategoryPosition, + containsSolFiles, + mapDirectoryName, + computeSlug, + + // For extending/customizing + CATEGORY_LABELS, + CATEGORY_DESCRIPTIONS, + CATEGORY_POSITIONS, +}; + diff --git a/.github/scripts/generate-docs-utils/category/index-page-generator.js b/.github/scripts/generate-docs-utils/category/index-page-generator.js new file mode 100644 index 00000000..ed3ff7c7 --- /dev/null +++ b/.github/scripts/generate-docs-utils/category/index-page-generator.js @@ -0,0 +1,212 @@ +/** + * Index Page Generator + * + * Generates index.mdx files for category directories with custom DocCard components. + * This module provides utilities for creating styled category index pages. 
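+ *
+ * Each generated index.mdx contains frontmatter (title/description), imports the
+ * site's DocCard, DocCardGrid, DocSubtitle and Icon components, and renders one
+ * card per document or subcategory found in the directory.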
+ */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); + +// ============================================================================ +// Category Items Discovery +// ============================================================================ + +/** + * Get all items (documents and subcategories) in a directory + * @param {string} outputDir - Directory to scan + * @param {string} relativePath - Relative path from library dir + * @param {Function} generateLabel - Function to generate labels from names + * @param {Function} generateDescription - Function to generate descriptions + * @returns {Array} Array of items with type, name, label, href, description + */ +function getCategoryItems(outputDir, relativePath, generateLabel, generateDescription) { + const items = []; + + if (!fs.existsSync(outputDir)) { + return items; + } + + const entries = fs.readdirSync(outputDir, { withFileTypes: true }); + + for (const entry of entries) { + // Skip hidden files, category files, and index files + if (entry.name.startsWith('.') || + entry.name === '_category_.json' || + entry.name === 'index.mdx') { + continue; + } + + if (entry.isFile() && entry.name.endsWith('.mdx')) { + // It's a document + const docName = entry.name.replace('.mdx', ''); + const docPath = path.join(outputDir, entry.name); + + // Try to read frontmatter for title and description + let title = generateLabel(docName); + let description = ''; + + try { + const content = fs.readFileSync(docPath, 'utf8'); + const frontmatterMatch = content.match(/^---\n([\s\S]*?)\n---/); + if (frontmatterMatch) { + const frontmatter = frontmatterMatch[1]; + const titleMatch = frontmatter.match(/^title:\s*["']?(.*?)["']?$/m); + const descMatch = frontmatter.match(/^description:\s*["']?(.*?)["']?$/m); + if (titleMatch) title = titleMatch[1].trim(); + if (descMatch) description = descMatch[1].trim(); + } + } catch (error) { + // If reading fails, use defaults + } + + const docRelativePath = relativePath ? `${relativePath}/${docName}` : docName; + items.push({ + type: 'doc', + name: docName, + label: title, + description: description, + href: `/docs/library/${docRelativePath}`, + }); + } else if (entry.isDirectory()) { + // It's a subcategory + const subcategoryName = entry.name; + const subcategoryLabel = generateLabel(subcategoryName); + const subcategoryRelativePath = relativePath ? `${relativePath}/${subcategoryName}` : subcategoryName; + const subcategoryDescription = generateDescription(subcategoryName, relativePath.split('/')); + + items.push({ + type: 'category', + name: subcategoryName, + label: subcategoryLabel, + description: subcategoryDescription, + href: `/docs/library/${subcategoryRelativePath}`, + }); + } + } + + // Sort items: categories first, then docs, both alphabetically + items.sort((a, b) => { + if (a.type !== b.type) { + return a.type === 'category' ? 
-1 : 1; + } + return a.label.localeCompare(b.label); + }); + + return items; +} + +// ============================================================================ +// MDX Content Generation +// ============================================================================ + +/** + * Generate MDX content for a category index page + * @param {string} label - Category label + * @param {string} description - Category description + * @param {Array} items - Array of items to display + * @returns {string} Generated MDX content + */ +function generateIndexMdxContent(label, description, items, hideFromSidebar = false) { + // Escape quotes in label and description for frontmatter + const escapedLabel = label.replace(/"/g, '\\"'); + const escapedDescription = description.replace(/"/g, '\\"'); + + // Add sidebar_class_name: "hidden" to hide from sidebar if requested + const sidebarClass = hideFromSidebar ? '\nsidebar_class_name: "hidden"' : ''; + + let mdxContent = `--- +title: "${escapedLabel}" +description: "${escapedDescription}"${sidebarClass} +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ${escapedDescription} + + +`; + + if (items.length > 0) { + mdxContent += `\n`; + + for (const item of items) { + const iconName = item.type === 'category' ? 'package' : 'book'; + const itemDescription = item.description ? `"${item.description.replace(/"/g, '\\"')}"` : '""'; + + mdxContent += ` } + size="medium" + />\n`; + } + + mdxContent += `\n`; + } else { + mdxContent += `_No items in this category yet._\n`; + } + + return mdxContent; +} + +// ============================================================================ +// Index File Creation +// ============================================================================ + +/** + * Generate index.mdx file for a category + * @param {string} outputDir - Directory to create index file in + * @param {string} relativePath - Relative path from library dir + * @param {string} label - Category label + * @param {string} description - Category description + * @param {Function} generateLabel - Function to generate labels from names + * @param {Function} generateDescription - Function to generate descriptions + * @param {boolean} overwrite - Whether to overwrite existing files (default: false) + * @returns {boolean} True if file was created/updated, false if skipped + */ +function createCategoryIndexFile( + outputDir, + relativePath, + label, + description, + generateLabel, + generateDescription, + overwrite = false, + hideFromSidebar = false +) { + const indexFile = path.join(outputDir, 'index.mdx'); + + // Don't overwrite existing index files unless explicitly requested (allows manual customization) + if (!overwrite && fs.existsSync(indexFile)) { + return false; + } + + // Get items in this category + const items = getCategoryItems(outputDir, relativePath, generateLabel, generateDescription); + + // Generate MDX content + const mdxContent = generateIndexMdxContent(label, description, items, hideFromSidebar); + + // Ensure directory exists + fs.mkdirSync(outputDir, { recursive: true }); + fs.writeFileSync(indexFile, mdxContent); + + return true; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + getCategoryItems, + generateIndexMdxContent, + createCategoryIndexFile, 
+}; + diff --git a/.github/scripts/generate-docs-utils/config.js b/.github/scripts/generate-docs-utils/config.js new file mode 100644 index 00000000..e436c41a --- /dev/null +++ b/.github/scripts/generate-docs-utils/config.js @@ -0,0 +1,153 @@ +/** + * Configuration for documentation generation + * + * Centralized configuration for paths, settings, and defaults. + * Modify this file to change documentation output paths or behavior. + */ + +module.exports = { + // ============================================================================ + // Input Paths + // ============================================================================ + + /** Directory containing forge doc output */ + forgeDocsDir: 'docs/src/src', + + /** Source code directory to mirror */ + srcDir: 'src', + + // ============================================================================ + // Output Paths + // ============================================================================ + + /** + * Base output directory for contract documentation + * Structure mirrors src/ automatically + */ + contractsOutputDir: 'website/docs/contracts', + + // ============================================================================ + // Sidebar Positions + // ============================================================================ + + /** Default sidebar position for contracts without explicit mapping */ + defaultSidebarPosition: 50, + + /** + * Contract-specific sidebar positions + * Maps contract name to position number (lower = higher in sidebar) + * + * Convention: + * - Modules come before their corresponding facets + * - Core/base contracts come before extensions + * - Burn facets come after main facets + */ + contractPositions: { + // Diamond core + DiamondMod: 1, + DiamondCutMod: 2, + DiamondCutFacet: 3, + DiamondLoupeFacet: 4, + + // Access - Owner pattern + OwnerMod: 1, + OwnerFacet: 2, + + // Access - Two-step owner + OwnerTwoStepsMod: 1, + OwnerTwoStepsFacet: 2, + + // Access - AccessControl pattern + AccessControlMod: 1, + AccessControlFacet: 2, + + // Access - AccessControlPausable + AccessControlPausableMod: 1, + AccessControlPausableFacet: 2, + + // Access - AccessControlTemporal + AccessControlTemporalMod: 1, + AccessControlTemporalFacet: 2, + + // ERC-20 base + ERC20Mod: 1, + ERC20Facet: 2, + ERC20BurnFacet: 3, + + // ERC-20 Bridgeable + ERC20BridgeableMod: 1, + ERC20BridgeableFacet: 2, + + // ERC-20 Permit + ERC20PermitMod: 1, + ERC20PermitFacet: 2, + + // ERC-721 base + ERC721Mod: 1, + ERC721Facet: 2, + ERC721BurnFacet: 3, + + // ERC-721 Enumerable + ERC721EnumerableMod: 1, + ERC721EnumerableFacet: 2, + ERC721EnumerableBurnFacet: 3, + + // ERC-1155 + ERC1155Mod: 1, + ERC1155Facet: 2, + + // ERC-6909 + ERC6909Mod: 1, + ERC6909Facet: 2, + + // Royalty + RoyaltyMod: 1, + RoyaltyFacet: 2, + + // Libraries + NonReentrancyMod: 1, + ERC165Mod: 1, + }, + + // ============================================================================ + // Repository Configuration + // ============================================================================ + + /** Main repository URL - always use this for source links */ + mainRepoUrl: 'https://github.com/Perfect-Abstractions/Compose', + + /** + * Normalize gitSource URL to always point to the main repository's main branch + * Replaces any fork or incorrect repository URLs with the main repo URL + * Converts blob URLs to tree URLs pointing to main branch + * @param {string} gitSource - Original gitSource URL from forge doc + * @returns {string} Normalized gitSource URL + */ + 
normalizeGitSource(gitSource) { + if (!gitSource) return gitSource; + + // Pattern: https://github.com/USER/Compose/blob/COMMIT/src/path/to/file.sol + // Convert to: https://github.com/Perfect-Abstractions/Compose/tree/main/src/path/to/file.sol + const githubUrlPattern = /https:\/\/github\.com\/[^\/]+\/Compose\/(?:blob|tree)\/[^\/]+\/(.+)/; + const match = gitSource.match(githubUrlPattern); + + if (match) { + // Extract the path after the repo name (should start with src/) + const pathPart = match[1]; + // Ensure it starts with src/ (remove any leading src/ if duplicated) + const normalizedPath = pathPart.startsWith('src/') ? pathPart : `src/${pathPart}`; + return `${this.mainRepoUrl}/tree/main/${normalizedPath}`; + } + + // If it doesn't match the pattern, try to construct from the main repo + // Extract just the file path if it's a relative path or partial URL + if (gitSource.includes('/src/')) { + const srcIndex = gitSource.indexOf('/src/'); + const pathAfterSrc = gitSource.substring(srcIndex + 1); + return `${this.mainRepoUrl}/tree/main/${pathAfterSrc}`; + } + + // If it doesn't match any pattern, return as-is (might be a different format) + return gitSource; + }, +}; diff --git a/.github/scripts/generate-docs-utils/core/contract-processor.js b/.github/scripts/generate-docs-utils/core/contract-processor.js new file mode 100644 index 00000000..e9e10bcf --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/contract-processor.js @@ -0,0 +1,102 @@ +/** + * Contract Processing Pipeline + * + * Shared processing logic for both regular and aggregated contract files. + * Handles the complete pipeline from parsed data to written MDX file. + */ + +const fs = require('fs'); +const { extractStorageInfo } = require('../parsing/storage-extractor'); +const { getOutputPath } = require('../utils/path-computer'); +const { getSidebarPosition } = require('../utils/sidebar-position-calculator'); +const { registerContract, getContractRegistry } = require('./contract-registry'); +const { generateFacetDoc, generateModuleDoc } = require('../templates/templates'); +const { enhanceWithAI, shouldSkipEnhancement } = require('../ai/ai-enhancement'); +const { addFallbackContent } = require('../ai/fallback-content-provider'); +const { applyDescriptionFallback } = require('./description-manager'); +const { writeFileSafe } = require('../../workflow-utils'); + +/** + * Process contract data through the complete pipeline + * @param {object} data - Parsed documentation data + * @param {string} solFilePath - Path to source Solidity file + * @param {'module' | 'facet'} contractType - Type of contract + * @param {object} tracker - Tracker object for recording results (temporary, will be replaced with SummaryTracker) + * @returns {Promise<{success: boolean, error?: string}>} Processing result + */ +async function processContractData(data, solFilePath, contractType, tracker) { + // 1. Extract storage info for modules + if (contractType === 'module') { + data.storageInfo = extractStorageInfo(data); + } + + // 2. Apply description fallback + data = applyDescriptionFallback(data, contractType, solFilePath); + + // 3. Compute output path (mirrors src/ structure) + const pathInfo = getOutputPath(solFilePath, contractType); + + // 4. Get registry for relationship detection + const registry = getContractRegistry(); + + // 5. Get smart sidebar position (uses registry if available) + data.position = getSidebarPosition(data.title, contractType, pathInfo.category, registry); + + // 6. 
Set contract type for registry (before registering) + data.contractType = contractType; + + // 7. Register contract in registry (before AI enhancement so it's available for relationship detection) + registerContract(data, pathInfo); + + // 8. Enhance with AI if not skipped + const skipAIEnhancement = shouldSkipEnhancement(data) || process.env.SKIP_ENHANCEMENT === 'true'; + let enhancedData = data; + let usedFallback = false; + let enhancementError = null; + + if (!skipAIEnhancement) { + const token = process.env.GITHUB_TOKEN; + const result = await enhanceWithAI(data, contractType, token); + enhancedData = result.data; + usedFallback = result.usedFallback; + enhancementError = result.error; + + // Track fallback usage + if (usedFallback) { + tracker.recordFallback(data.title, pathInfo.outputFile, enhancementError || 'Unknown error'); + } + } else { + enhancedData = addFallbackContent(data, contractType); + } + + // Ensure contractType is preserved after AI enhancement + enhancedData.contractType = contractType; + + // 9. Generate MDX content with registry for relationship detection + const mdxContent = contractType === 'module' + ? generateModuleDoc(enhancedData, enhancedData.position, pathInfo, registry) + : generateFacetDoc(enhancedData, enhancedData.position, pathInfo, registry); + + // 10. Ensure output directory exists + fs.mkdirSync(pathInfo.outputDir, { recursive: true }); + + // 11. Write the file + if (writeFileSafe(pathInfo.outputFile, mdxContent)) { + // Track success + if (contractType === 'module') { + tracker.recordModule(data.title, pathInfo.outputFile); + } else { + tracker.recordFacet(data.title, pathInfo.outputFile); + } + return { success: true }; + } + + // Track write error + tracker.recordError(pathInfo.outputFile, 'Could not write file'); + return { success: false, error: 'Could not write file' }; +} + +module.exports = { + processContractData, +}; + diff --git a/.github/scripts/generate-docs-utils/core/contract-registry.js b/.github/scripts/generate-docs-utils/core/contract-registry.js new file mode 100644 index 00000000..a8981752 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/contract-registry.js @@ -0,0 +1,97 @@ +/** + * Contract Registry System + * + * Tracks all contracts (modules and facets) for relationship detection + * and cross-reference generation in documentation. 
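+ *
+ * Entries are indexed by name, by category, and by type. Illustrative entry,
+ * assuming an ERC20Facet in src/token/ERC20 has been registered:
+ *   contractRegistry.byName.get('ERC20Facet')
+ *   // → { name: 'ERC20Facet', type: 'facet', category: 'token',
+ *   //     path: 'token/ERC20/ERC20Facet',
+ *   //     sourcePath: 'src/token/ERC20/ERC20Facet.sol', functions: [...], ... }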
+ * + * Features: + * - Register contracts with metadata (name, type, category, path) + * - Provide registry access for relationship detection and other operations + */ + +// ============================================================================ +// Registry State +// ============================================================================ + +/** + * Global registry to track all contracts for relationship detection + * This allows us to find related contracts and generate cross-references + */ +const contractRegistry = { + byName: new Map(), + byCategory: new Map(), + byType: { modules: [], facets: [] } +}; + +// ============================================================================ +// Registry Management +// ============================================================================ + +/** + * Register a contract in the global registry + * @param {object} contractData - Contract documentation data + * @param {object} outputPath - Output path information from getOutputPath + * @returns {object} Registered contract entry + */ +function registerContract(contractData, outputPath) { + // Construct full path including filename (without .mdx extension) + // This ensures RelatedDocs links point to the actual page, not the category index + const fullPath = outputPath.relativePath + ? `${outputPath.relativePath}/${outputPath.fileName}` + : outputPath.fileName; + + const entry = { + name: contractData.title, + type: contractData.contractType, // 'module' or 'facet' + category: outputPath.category, + path: fullPath, + sourcePath: contractData.sourceFilePath, + functions: contractData.functions || [], + storagePosition: contractData.storageInfo?.storagePosition + }; + + contractRegistry.byName.set(contractData.title, entry); + + if (!contractRegistry.byCategory.has(outputPath.category)) { + contractRegistry.byCategory.set(outputPath.category, []); + } + contractRegistry.byCategory.get(outputPath.category).push(entry); + + if (contractData.contractType === 'module') { + contractRegistry.byType.modules.push(entry); + } else { + contractRegistry.byType.facets.push(entry); + } + + return entry; +} + +/** + * Get the contract registry + * @returns {object} The contract registry + */ +function getContractRegistry() { + return contractRegistry; +} + +/** + * Clear the contract registry (useful for testing or reset) + */ +function clearContractRegistry() { + contractRegistry.byName.clear(); + contractRegistry.byCategory.clear(); + contractRegistry.byType.modules = []; + contractRegistry.byType.facets = []; +} + +// ============================================================================ +// Exports +// ============================================================================ + +module.exports = { + // Registry management + registerContract, + getContractRegistry, + clearContractRegistry, +}; + diff --git a/.github/scripts/generate-docs-utils/core/description-generator.js b/.github/scripts/generate-docs-utils/core/description-generator.js new file mode 100644 index 00000000..65d9a026 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/description-generator.js @@ -0,0 +1,48 @@ +/** + * Description Generator + * + * Generates fallback descriptions from contract names. + */ + +/** + * Generate a fallback description from contract name + * + * This is a minimal, generic fallback used only when: + * 1. No NatSpec @title/@notice exists in source + * 2. 
AI enhancement will improve it later + * + * The AI enhancement step receives this as input and generates + * a richer, context-aware description from the actual code. + * + * @param {string} contractName - Name of the contract + * @returns {string} Generic description (will be enhanced by AI) + */ +function generateDescriptionFromName(contractName) { + if (!contractName) return ''; + + // Detect library type from naming convention + const isModule = contractName.endsWith('Mod') || contractName.endsWith('Module'); + const isFacet = contractName.endsWith('Facet'); + const typeLabel = isModule ? 'module' : isFacet ? 'facet' : 'library'; + + // Remove suffix and convert CamelCase to readable text + const baseName = contractName + .replace(/Mod$/, '') + .replace(/Module$/, '') + .replace(/Facet$/, ''); + + // Convert CamelCase to readable format + // Handles: ERC20 -> ERC-20, AccessControl -> Access Control + const readable = baseName + .replace(/([a-z])([A-Z])/g, '$1 $2') // camelCase splits + .replace(/([A-Z]+)([A-Z][a-z])/g, '$1 $2') // acronym handling + .replace(/^ERC(\d+)/, 'ERC-$1') // ERC20 -> ERC-20 + .trim(); + + return `${readable} ${typeLabel} for Compose diamonds`; +} + +module.exports = { + generateDescriptionFromName, +}; + diff --git a/.github/scripts/generate-docs-utils/core/description-manager.js b/.github/scripts/generate-docs-utils/core/description-manager.js new file mode 100644 index 00000000..0b2c14ab --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/description-manager.js @@ -0,0 +1,84 @@ +/** + * Description Manager + * + * Handles description generation and fallback logic for contracts. + * Consolidates description generation from multiple sources. + */ + +const { extractModuleDescriptionFromSource } = require('../utils/source-parser'); +const { generateDescriptionFromName } = require('./description-generator'); + +/** + * Apply description fallback logic to contract data + * @param {object} data - Contract documentation data + * @param {'module' | 'facet'} contractType - Type of contract + * @param {string} solFilePath - Path to source Solidity file + * @returns {object} Data with description applied + */ +function applyDescriptionFallback(data, contractType, solFilePath) { + // For modules, try to get description from source file first + if (contractType === 'module' && solFilePath) { + const sourceDescription = extractModuleDescriptionFromSource(solFilePath); + if (sourceDescription) { + data.description = sourceDescription; + data.subtitle = sourceDescription; + data.overview = sourceDescription; + return data; + } + } + + // For facets, check if description is generic and needs replacement + if (contractType === 'facet') { + const looksLikeEnum = + data.description && + /\w+\s*=\s*\d+/.test(data.description) && + (data.description.match(/\w+\s*=\s*\d+/g) || []).length >= 2; + + const isGenericDescription = + !data.description || + data.description.startsWith('Contract documentation for') || + looksLikeEnum || + data.description.length < 20; + + if (isGenericDescription) { + const generatedDescription = generateDescriptionFromName(data.title); + if (generatedDescription) { + data.description = generatedDescription; + data.subtitle = generatedDescription; + data.overview = generatedDescription; + return data; + } + } + } + + // For modules, try generating from name + if (contractType === 'module') { + const generatedDescription = generateDescriptionFromName(data.title); + if (generatedDescription) { + data.description = generatedDescription; + 
data.subtitle = generatedDescription; + data.overview = generatedDescription; + return data; + } + + // Last resort fallback for modules + const genericDescription = `Module providing internal functions for ${data.title}`; + if ( + !data.description || + data.description.includes('Event emitted') || + data.description.includes('Thrown when') || + data.description.includes('function to') + ) { + data.description = genericDescription; + data.subtitle = genericDescription; + data.overview = genericDescription; + } + } + + return data; +} + +module.exports = { + applyDescriptionFallback, +}; + diff --git a/.github/scripts/generate-docs-utils/core/file-processor.js b/.github/scripts/generate-docs-utils/core/file-processor.js new file mode 100644 index 00000000..4825ac19 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/file-processor.js @@ -0,0 +1,151 @@ +/** + * File Processor + * + * Handles processing of Solidity source files and their forge doc outputs. + */ + +const { findForgeDocFiles } = require('../utils/file-finder'); +const { isInterface, getContractType } = require('../utils/contract-classifier'); +const { extractModuleNameFromPath } = require('../utils/source-parser'); +const { readFileSafe } = require('../../workflow-utils'); +const { parseForgeDocMarkdown } = require('../parsing/markdown-parser'); +const { + parseIndividualItemFile, + aggregateParsedItems, + detectItemTypeFromFilename, +} = require('../parsing/item-parser'); +const { processContractData } = require('./contract-processor'); + +/** + * Process a single forge doc markdown file + * @param {string} forgeDocFile - Path to forge doc markdown file + * @param {string} solFilePath - Original .sol file path + * @param {object} tracker - Tracker instance + * @returns {Promise} True if processed successfully + */ +async function processForgeDocFile(forgeDocFile, solFilePath, tracker) { + const content = readFileSafe(forgeDocFile); + if (!content) { + tracker.recordError(forgeDocFile, 'Could not read file'); + return false; + } + + // Parse the forge doc markdown + const data = parseForgeDocMarkdown(content, forgeDocFile); + + // Add source file path for parameter extraction + if (solFilePath) { + data.sourceFilePath = solFilePath; + } + + if (!data.title) { + tracker.recordSkipped(forgeDocFile, 'No title found'); + return false; + } + + // Skip interfaces + if (isInterface(data.title, content)) { + tracker.recordSkipped(forgeDocFile, 'Interface (filtered)'); + return false; + } + + // Determine contract type + const contractType = getContractType(forgeDocFile, content); + + // Process through shared pipeline (includes description fallback) + const result = await processContractData(data, solFilePath, contractType, tracker); + return result.success; +} + +/** + * Check if files need aggregation (individual item files vs contract-level files) + * @param {string[]} forgeDocFiles - Array of forge doc file paths + * @returns {boolean} True if files are individual items that need aggregation + */ +function needsAggregation(forgeDocFiles) { + for (const file of forgeDocFiles) { + const itemType = detectItemTypeFromFilename(file); + if (itemType) { + return true; + } + } + return false; +} + +/** + * Process aggregated files (for free function modules) + * @param {string[]} forgeDocFiles - Array of forge doc file paths + * @param {string} solFilePath - Original .sol file path + * @param {object} tracker - Tracker instance + * @returns {Promise} True if processed successfully + */ +async function 
processAggregatedFiles(forgeDocFiles, solFilePath, tracker) { + const parsedItems = []; + let gitSource = ''; + + for (const forgeDocFile of forgeDocFiles) { + const content = readFileSafe(forgeDocFile); + if (!content) { + continue; + } + + const parsed = parseIndividualItemFile(content, forgeDocFile); + if (parsed) { + parsedItems.push(parsed); + if (parsed.gitSource && !gitSource) { + gitSource = parsed.gitSource; + } + } + } + + if (parsedItems.length === 0) { + tracker.recordError(solFilePath, 'No valid items parsed'); + return false; + } + + const data = aggregateParsedItems(parsedItems, solFilePath); + + data.sourceFilePath = solFilePath; + + if (!data.title) { + data.title = extractModuleNameFromPath(solFilePath); + } + + if (gitSource) { + data.gitSource = gitSource; + } + + const contractType = getContractType(solFilePath, ''); + + // Process through shared pipeline (includes description fallback) + const result = await processContractData(data, solFilePath, contractType, tracker); + return result.success; +} + +/** + * Process a Solidity source file + * @param {string} solFilePath - Path to .sol file + * @param {object} tracker - Tracker instance + * @returns {Promise} + */ +async function processSolFile(solFilePath, tracker) { + const forgeDocFiles = findForgeDocFiles(solFilePath); + + if (forgeDocFiles.length === 0) { + tracker.recordSkipped(solFilePath, 'No forge doc output'); + return; + } + + if (needsAggregation(forgeDocFiles)) { + await processAggregatedFiles(forgeDocFiles, solFilePath, tracker); + } else { + for (const forgeDocFile of forgeDocFiles) { + await processForgeDocFile(forgeDocFile, solFilePath, tracker); + } + } +} + +module.exports = { + processSolFile, +}; + diff --git a/.github/scripts/generate-docs-utils/core/file-selector.js b/.github/scripts/generate-docs-utils/core/file-selector.js new file mode 100644 index 00000000..0b9bf34a --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/file-selector.js @@ -0,0 +1,40 @@ +/** + * File Selector + * + * Determines which Solidity files to process based on command line arguments. + */ + +const { getAllSolFiles, readChangedFilesFromFile, getChangedSolFiles } = require('../utils/git-utils'); + +/** + * Get files to process based on command line arguments + * @param {string[]} args - Command line arguments + * @returns {string[]} Array of Solidity file paths to process + */ +function getFilesToProcess(args) { + if (args.includes('--all')) { + console.log('Processing all Solidity files...'); + return getAllSolFiles(); + } + + if (args.length > 0 && !args[0].startsWith('--')) { + const changedFilesPath = args[0]; + console.log(`Reading changed files from: ${changedFilesPath}`); + const solFiles = readChangedFilesFromFile(changedFilesPath); + + if (solFiles.length === 0) { + console.log('No files in list, checking git diff...'); + return getChangedSolFiles(); + } + + return solFiles; + } + + console.log('Getting changed Solidity files from git...'); + return getChangedSolFiles(); +} + +module.exports = { + getFilesToProcess, +}; + diff --git a/.github/scripts/generate-docs-utils/core/relationship-detector.js b/.github/scripts/generate-docs-utils/core/relationship-detector.js new file mode 100644 index 00000000..2e926353 --- /dev/null +++ b/.github/scripts/generate-docs-utils/core/relationship-detector.js @@ -0,0 +1,123 @@ +/** + * Relationship Detector + * + * Detects relationships between contracts (modules and facets) for + * cross-reference generation in documentation. 
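+ *
+ * Illustrative call, assuming the related contracts are already registered:
+ *   findRelatedContracts('ERC20Facet', 'facet', 'token')
+ *   // → links to ERC20Mod (module/facet pair), other facets in the 'token'
+ *   //   category, and DiamondCutFacet (core dependency), capped at 4 items.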
+ * + * Features: + * - Find related contracts (module/facet pairs, same category, extensions) + * - Enrich documentation data with relationship information + */ + +const { getContractRegistry } = require('./contract-registry'); + +/** + * Find related contracts for a given contract + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {Array} Array of related contract objects with title, href, description, icon + */ +function findRelatedContracts(contractName, contractType, category, registry = null) { + const reg = registry || getContractRegistry(); + const related = []; + const contract = reg.byName.get(contractName); + if (!contract) return related; + + // 1. Find corresponding module/facet pair + if (contractType === 'facet') { + const moduleName = contractName.replace('Facet', 'Mod'); + const module = reg.byName.get(moduleName); + if (module) { + related.push({ + title: moduleName, + href: `/docs/library/${module.path}`, + description: `Module used by ${contractName}`, + icon: '📦' + }); + } + } else if (contractType === 'module') { + const facetName = contractName.replace('Mod', 'Facet'); + const facet = reg.byName.get(facetName); + if (facet) { + related.push({ + title: facetName, + href: `/docs/library/${facet.path}`, + description: `Facet using ${contractName}`, + icon: '💎' + }); + } + } + + // 2. Find related contracts in same category (excluding self) + const sameCategory = reg.byCategory.get(category) || []; + sameCategory.forEach(c => { + if (c.name !== contractName && c.type === contractType) { + related.push({ + title: c.name, + href: `/docs/library/${c.path}`, + description: `Related ${contractType} in ${category}`, + icon: contractType === 'module' ? '📦' : '💎' + }); + } + }); + + // 3. Find extension contracts (e.g., ERC20Facet → ERC20BurnFacet) + if (contractType === 'facet') { + const baseName = contractName.replace(/BurnFacet$|PermitFacet$|BridgeableFacet$|EnumerableFacet$/, 'Facet'); + if (baseName !== contractName) { + const base = reg.byName.get(baseName); + if (base) { + related.push({ + title: baseName, + href: `/docs/library/${base.path}`, + description: `Base facet for ${contractName}`, + icon: '💎' + }); + } + } + } + + // 4. Find core dependencies (e.g., all facets depend on DiamondCutFacet) + if (contractType === 'facet' && contractName !== 'DiamondCutFacet') { + const diamondCut = reg.byName.get('DiamondCutFacet'); + if (diamondCut) { + related.push({ + title: 'DiamondCutFacet', + href: `/docs/library/${diamondCut.path}`, + description: 'Required for adding facets to diamonds', + icon: '🔧' + }); + } + } + + return related.slice(0, 4); // Limit to 4 related items +} + +/** + * Enrich contract data with relationship information + * @param {object} data - Contract documentation data + * @param {object} pathInfo - Output path information + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {object} Enriched data with relatedDocs property + */ +function enrichWithRelationships(data, pathInfo, registry = null) { + const relatedDocs = findRelatedContracts( + data.title, + data.contractType, + pathInfo.category, + registry + ); + + return { + ...data, + relatedDocs: relatedDocs.length > 0 ? 
relatedDocs : null + }; +} + +module.exports = { + findRelatedContracts, + enrichWithRelationships, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/item-builder.js b/.github/scripts/generate-docs-utils/parsing/item-builder.js new file mode 100644 index 00000000..8956b1c7 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/item-builder.js @@ -0,0 +1,90 @@ +/** + * Item Builder + * + * Functions for creating and saving parsed items. + */ + +/** + * Create a new item object based on section type + * @param {string} name - Item name + * @param {string} section - Section type + * @returns {object} New item object + */ +function createNewItem(name, section) { + const base = { + name, + description: '', + notice: '', + }; + + switch (section) { + case 'functions': + return { + ...base, + signature: '', + params: [], + returns: [], + mutability: 'nonpayable', + }; + case 'events': + return { + ...base, + signature: '', + params: [], + }; + case 'errors': + return { + ...base, + signature: '', + params: [], + }; + case 'structs': + return { + ...base, + definition: '', + fields: [], + }; + case 'stateVariables': + return { + ...base, + type: '', + value: '', + }; + default: + return base; + } +} + +/** + * Save current item to data object + * @param {object} data - Data object to save to + * @param {object} item - Item to save + * @param {string} type - Item type + */ +function saveCurrentItem(data, item, type) { + if (!type || !item) return; + + switch (type) { + case 'functions': + data.functions.push(item); + break; + case 'events': + data.events.push(item); + break; + case 'errors': + data.errors.push(item); + break; + case 'structs': + data.structs.push(item); + break; + case 'stateVariables': + data.stateVariables.push(item); + break; + } +} + +module.exports = { + createNewItem, + saveCurrentItem, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/item-parser.js b/.github/scripts/generate-docs-utils/parsing/item-parser.js new file mode 100644 index 00000000..3da73237 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/item-parser.js @@ -0,0 +1,366 @@ +/** + * Item Parser + * + * Functions for parsing individual item files and aggregating them. + */ + +const path = require('path'); +const config = require('../config'); +const { sanitizeBrokenLinks, cleanDescription } = require('./text-sanitizer'); + +/** + * Detect item type from filename + * @param {string} filePath - Path to the markdown file + * @returns {string | null} Item type ('function', 'error', 'struct', 'event', 'enum', 'constants', or null) + */ +function detectItemTypeFromFilename(filePath) { + const basename = path.basename(filePath); + + if (basename.startsWith('function.')) return 'function'; + if (basename.startsWith('error.')) return 'error'; + if (basename.startsWith('struct.')) return 'struct'; + if (basename.startsWith('event.')) return 'event'; + if (basename.startsWith('enum.')) return 'enum'; + if (basename.startsWith('constants.')) return 'constants'; + + return null; +} + +/** + * Parse an individual item file (function, error, constant, etc.) 
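+ *
+ * Rough usage sketch (the forge doc file name is hypothetical):
+ *
+ *   const parsed = parseIndividualItemFile(markdownContent, 'function.grantRole.md');
+ *   // Function files yield { type: 'function', item: { name, signature, params,
+ *   // returns, mutability, ... } }; constants files yield { type: 'constants',
+ *   // constants: [...] } instead.
+ *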
+ * @param {string} content - Markdown content from forge doc + * @param {string} filePath - Path to the markdown file + * @returns {object | null} Parsed item object or null if parsing fails + */ +function parseIndividualItemFile(content, filePath) { + const itemType = detectItemTypeFromFilename(filePath); + if (!itemType) { + return null; + } + + const lines = content.split('\n'); + let itemName = ''; + let gitSource = ''; + let description = ''; + let signature = ''; + let definition = ''; + let descriptionBuffer = []; + let inCodeBlock = false; + let codeBlockLines = []; + let params = []; + let returns = []; + let constants = []; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmedLine = line.trim(); + + // Parse title (# heading) + if (line.startsWith('# ') && !itemName) { + itemName = line.replace('# ', '').trim(); + continue; + } + + // Parse git source link + if (trimmedLine.startsWith('[Git Source]')) { + const match = trimmedLine.match(/\[Git Source\]\((.*?)\)/); + if (match) { + gitSource = config.normalizeGitSource(match[1]); + } + continue; + } + + // Parse code block + if (line.startsWith('```solidity')) { + inCodeBlock = true; + codeBlockLines = []; + i++; + while (i < lines.length && !lines[i].startsWith('```')) { + codeBlockLines.push(lines[i]); + i++; + } + const codeContent = codeBlockLines.join('\n').trim(); + + if (itemType === 'constants') { + // For constants, parse multiple constant definitions + // Format: "bytes32 constant NON_REENTRANT_SLOT = keccak256(...)" + // Handle both single and multiple constants in one code block + const constantMatches = codeContent.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?/g); + if (constantMatches) { + for (const match of constantMatches) { + const parts = match.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (parts) { + constants.push({ + name: parts[2], + type: parts[1], + value: parts[3].trim(), + description: descriptionBuffer.join(' ').trim(), + }); + } + } + } else { + // Single constant definition (more flexible regex) + const singleMatch = codeContent.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (singleMatch) { + constants.push({ + name: singleMatch[2], + type: singleMatch[1], + value: singleMatch[3].trim(), + description: descriptionBuffer.join(' ').trim(), + }); + } + } + // Clear description buffer after processing constants + descriptionBuffer = []; + } else { + signature = codeContent; + } + inCodeBlock = false; + continue; + } + + // Parse constants with ### heading format + if (itemType === 'constants' && line.startsWith('### ')) { + const constantName = line.replace('### ', '').trim(); + // Clear description buffer for this constant (only text before this heading) + // Filter out code block delimiters and empty lines + const currentConstantDesc = descriptionBuffer + .filter(l => l && !l.trim().startsWith('```') && l.trim() !== '') + .join(' ') + .trim(); + descriptionBuffer = []; + + // Look ahead for code block (within next 15 lines) + let foundCodeBlock = false; + let codeBlockEndIndex = i; + for (let j = i + 1; j < lines.length && j < i + 15; j++) { + if (lines[j].startsWith('```solidity')) { + foundCodeBlock = true; + const constCodeLines = []; + j++; + while (j < lines.length && !lines[j].startsWith('```')) { + constCodeLines.push(lines[j]); + j++; + } + codeBlockEndIndex = j; // j now points to the line after closing ``` + const constCode = constCodeLines.join('\n').trim(); + // Match: type constant name = 
value + // Handle complex types like "bytes32", "uint256", etc. + const constMatch = constCode.match(/(\w+(?:\s*\d+)?)\s+constant\s+(\w+)\s*=\s*(.+?)(?:\s*;)?$/); + if (constMatch) { + constants.push({ + name: constantName, + type: constMatch[1], + value: constMatch[3].trim(), + description: currentConstantDesc, + }); + } else { + // Fallback: if no match, still add constant with name from heading + constants.push({ + name: constantName, + type: '', + value: constCode, + description: currentConstantDesc, + }); + } + break; + } + } + if (!foundCodeBlock) { + // No code block found, but we have a heading - might be a constant without definition + // This shouldn't happen in forge doc output, but handle it gracefully + constants.push({ + name: constantName, + type: '', + value: '', + description: currentConstantDesc, + }); + } else { + // Skip to the end of the code block (the loop will increment i, so we set it to one before) + i = codeBlockEndIndex - 1; + } + continue; + } + + // Collect description (text before code block or after title) + // Skip code block delimiters, empty lines, and markdown table separators + if (!inCodeBlock && trimmedLine && + !trimmedLine.startsWith('#') && + !trimmedLine.startsWith('[') && + !trimmedLine.startsWith('|') && + !trimmedLine.startsWith('```') && + trimmedLine !== '') { + if (itemType !== 'constants' || !line.startsWith('###')) { + descriptionBuffer.push(trimmedLine); + } + continue; + } + + // Parse table rows (Parameters or Returns) + if (trimmedLine.startsWith('|') && !trimmedLine.includes('----')) { + const cells = trimmedLine.split('|').map(c => c.trim()).filter(c => c); + + if (cells.length >= 3 && cells[0] !== 'Name' && cells[0] !== 'Parameter') { + const paramName = cells[0].replace(/`/g, '').trim(); + const paramType = cells[1].replace(/`/g, '').trim(); + const paramDesc = sanitizeBrokenLinks(cells[2] || ''); + + // Determine if Parameters or Returns based on preceding lines + const precedingLines = lines.slice(Math.max(0, i - 10), i).join('\n'); + + if (precedingLines.includes('**Returns**')) { + returns.push({ + name: paramName === '' ? '' : paramName, + type: paramType, + description: paramDesc, + }); + } else if (precedingLines.includes('**Parameters**')) { + if (paramType || paramName.startsWith('_')) { + params.push({ + name: paramName, + type: paramType, + description: paramDesc, + }); + } + } + } + } + } + + // Combine description buffer and clean it + if (descriptionBuffer.length > 0) { + description = cleanDescription(sanitizeBrokenLinks(descriptionBuffer.join(' ').trim())); + } + + // For constants, return array of constant objects + if (itemType === 'constants') { + return { + type: 'constants', + constants: constants.length > 0 ? 
constants : [{ + name: itemName || 'Constants', + type: '', + value: '', + description: description, + }], + gitSource: gitSource, + }; + } + + // For structs, use definition instead of signature + if (itemType === 'struct') { + definition = signature; + signature = ''; + } + + // Create item object based on type + const item = { + name: itemName, + description: description, + notice: description, + signature: signature, + definition: definition, + params: params, + returns: returns, + gitSource: gitSource, + }; + + // Add mutability for functions + if (itemType === 'function' && signature) { + if (signature.includes(' view ')) { + item.mutability = 'view'; + } else if (signature.includes(' pure ')) { + item.mutability = 'pure'; + } else if (signature.includes(' payable ')) { + item.mutability = 'payable'; + } else { + item.mutability = 'nonpayable'; + } + } + + return { + type: itemType, + item: item, + }; +} + +/** + * Aggregate multiple parsed items into a single data structure + * @param {Array} parsedItems - Array of parsed item objects from parseIndividualItemFile + * @param {string} sourceFilePath - Path to the source Solidity file + * @returns {object} Aggregated documentation data + */ +function aggregateParsedItems(parsedItems, sourceFilePath) { + const data = { + title: '', + description: '', + subtitle: '', + overview: '', + gitSource: '', + functions: [], + events: [], + errors: [], + structs: [], + stateVariables: [], + }; + + // Extract module name from source file path + const basename = path.basename(sourceFilePath, '.sol'); + data.title = basename; + + // Extract git source from first item + for (const parsed of parsedItems) { + if (parsed && parsed.gitSource) { + data.gitSource = config.normalizeGitSource(parsed.gitSource); + break; + } + } + + // Group items by type + for (const parsed of parsedItems) { + if (!parsed) continue; + + if (parsed.type === 'function' && parsed.item) { + data.functions.push(parsed.item); + } else if (parsed.type === 'error' && parsed.item) { + data.errors.push(parsed.item); + } else if (parsed.type === 'event' && parsed.item) { + data.events.push(parsed.item); + } else if (parsed.type === 'struct' && parsed.item) { + data.structs.push(parsed.item); + } else if (parsed.type === 'enum' && parsed.item) { + // Enums can be treated as structs for display purposes + data.structs.push(parsed.item); + } else if (parsed.type === 'constants' && parsed.constants) { + // Add constants as state variables + for (const constant of parsed.constants) { + data.stateVariables.push({ + name: constant.name, + type: constant.type, + value: constant.value, + description: constant.description, + }); + } + } + } + + // Set default description if not provided + // Don't use item descriptions as module description - they'll be overridden by source file parsing + if (!data.description || + data.description.includes('Event emitted') || + data.description.includes('Thrown when') || + data.description.includes('function to') || + data.description.length < 20) { + data.description = `Documentation for ${data.title}`; + data.subtitle = data.description; + data.overview = data.description; + } + + return data; +} + +module.exports = { + detectItemTypeFromFilename, + parseIndividualItemFile, + aggregateParsedItems, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/markdown-parser.js b/.github/scripts/generate-docs-utils/parsing/markdown-parser.js new file mode 100644 index 00000000..f16ade89 --- /dev/null +++ 
b/.github/scripts/generate-docs-utils/parsing/markdown-parser.js @@ -0,0 +1,236 @@ +/** + * Markdown Parser + * + * Main parser for forge doc markdown output. + */ + +const config = require('../config'); +const { createNewItem, saveCurrentItem } = require('./item-builder'); +const { sanitizeBrokenLinks, cleanDescription } = require('./text-sanitizer'); + +/** + * Parse forge doc markdown output into structured data + * @param {string} content - Markdown content from forge doc + * @param {string} filePath - Path to the markdown file + * @returns {object} Parsed documentation data + */ +function parseForgeDocMarkdown(content, filePath) { + const data = { + title: '', + description: '', + subtitle: '', + overview: '', + gitSource: '', + functions: [], + events: [], + errors: [], + structs: [], + stateVariables: [], + }; + + const lines = content.split('\n'); + let currentSection = null; + let currentItem = null; + let itemType = null; + let collectingDescription = false; + let descriptionBuffer = []; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmedLine = line.trim(); + + // Parse title (# heading) + if (line.startsWith('# ') && !data.title) { + data.title = line.replace('# ', '').trim(); + continue; + } + + // Parse git source link + if (trimmedLine.startsWith('[Git Source]')) { + const match = trimmedLine.match(/\[Git Source\]\((.*?)\)/); + if (match) { + data.gitSource = config.normalizeGitSource(match[1]); + } + continue; + } + + // Parse description (first non-empty lines after title, before sections) + if (data.title && !currentSection && trimmedLine && !line.startsWith('#') && !line.startsWith('[')) { + const sanitizedLine = cleanDescription(sanitizeBrokenLinks(trimmedLine)); + if (!data.description) { + data.description = sanitizedLine; + data.subtitle = sanitizedLine; + } else if (!data.overview) { + // Capture additional lines as overview + data.overview = data.description + '\n\n' + sanitizedLine; + } + continue; + } + + // Parse main sections + if (line.startsWith('## ')) { + const sectionName = line.replace('## ', '').trim().toLowerCase(); + + // Save current item before switching sections + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + currentItem = null; + itemType = null; + } + + if (sectionName === 'functions') { + currentSection = 'functions'; + } else if (sectionName === 'events') { + currentSection = 'events'; + } else if (sectionName === 'errors') { + currentSection = 'errors'; + } else if (sectionName === 'structs') { + currentSection = 'structs'; + } else if (sectionName === 'state variables') { + currentSection = 'stateVariables'; + } else { + currentSection = null; + } + continue; + } + + // Parse item definitions (### heading) + if (line.startsWith('### ') && currentSection) { + // Save previous item + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + } + + const name = line.replace('### ', '').trim(); + itemType = currentSection; + currentItem = createNewItem(name, currentSection); + collectingDescription = true; + descriptionBuffer = []; + continue; + } + + // Process content within current item + if (currentItem) { + // Code block with signature + if (line.startsWith('```solidity')) { + const codeLines = []; + i++; + while (i < lines.length && !lines[i].startsWith('```')) { + codeLines.push(lines[i]); + i++; + } + const codeContent = codeLines.join('\n').trim(); + + if (currentSection === 'functions' || currentSection === 'events' || currentSection === 'errors') { + currentItem.signature 
= codeContent; + + // Extract mutability from signature + if (codeContent.includes(' view ')) { + currentItem.mutability = 'view'; + } else if (codeContent.includes(' pure ')) { + currentItem.mutability = 'pure'; + } else if (codeContent.includes(' payable ')) { + currentItem.mutability = 'payable'; + } + } else if (currentSection === 'structs') { + currentItem.definition = codeContent; + } else if (currentSection === 'stateVariables') { + // Extract type and value from constant definition + // Format: "bytes32 constant NAME = value;" or "bytes32 NAME = value;" + // Handle both with and without "constant" keyword + // Note: name is already known from the ### heading, so we just need type and value + const constantMatch = codeContent.match(/(\w+(?:\s*\d+)?)\s+(?:constant\s+)?\w+\s*=\s*(.+?)(?:\s*;)?$/); + if (constantMatch) { + currentItem.type = constantMatch[1]; + currentItem.value = constantMatch[2].trim(); + } else { + // Fallback: try to extract just the value part if it's a simple assignment + const simpleMatch = codeContent.match(/=\s*(.+?)(?:\s*;)?$/); + if (simpleMatch) { + currentItem.value = simpleMatch[1].trim(); + } + // Try to extract type from the beginning + const typeMatch = codeContent.match(/^(\w+(?:\s*\d+)?)\s+/); + if (typeMatch) { + currentItem.type = typeMatch[1]; + } + } + } + continue; + } + + // Description text (before **Parameters** or **Returns**) + if (collectingDescription && trimmedLine && !trimmedLine.startsWith('**') && !trimmedLine.startsWith('|')) { + descriptionBuffer.push(trimmedLine); + continue; + } + + // End description collection on special markers + if (trimmedLine.startsWith('**Parameters**') || trimmedLine.startsWith('**Returns**')) { + if (descriptionBuffer.length > 0) { + const description = cleanDescription(sanitizeBrokenLinks(descriptionBuffer.join(' ').trim())); + currentItem.description = description; + currentItem.notice = description; + descriptionBuffer = []; + } + collectingDescription = false; + } + + // Parse table rows (Parameters or Returns) + if (trimmedLine.startsWith('|') && !trimmedLine.includes('----')) { + const cells = trimmedLine.split('|').map(c => c.trim()).filter(c => c); + + // Skip header row + if (cells.length >= 3 && cells[0] !== 'Name' && cells[0] !== 'Parameter') { + const paramName = cells[0].replace(/`/g, '').trim(); + const paramType = cells[1].replace(/`/g, '').trim(); + const paramDesc = sanitizeBrokenLinks(cells[2] || ''); + + // Skip if parameter name matches the function name (parsing error) + if (currentItem && paramName === currentItem.name) { + continue; + } + + // Determine if Parameters or Returns based on preceding lines + const precedingLines = lines.slice(Math.max(0, i - 10), i).join('\n'); + + if (precedingLines.includes('**Returns**')) { + currentItem.returns = currentItem.returns || []; + currentItem.returns.push({ + name: paramName === '' ? 
'' : paramName, + type: paramType, + description: paramDesc, + }); + } else if (precedingLines.includes('**Parameters**')) { + // Only add if it looks like a valid parameter (has a type or starts with underscore) + if (paramType || paramName.startsWith('_')) { + currentItem.params = currentItem.params || []; + currentItem.params.push({ + name: paramName, + type: paramType, + description: paramDesc, + }); + } + } + } + } + } + } + + // Save last item + if (currentItem) { + saveCurrentItem(data, currentItem, itemType); + } + + // Ensure overview is set + if (!data.overview) { + data.overview = data.description || `Documentation for ${data.title}.`; + } + + return data; +} + +module.exports = { + parseForgeDocMarkdown, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/storage-extractor.js b/.github/scripts/generate-docs-utils/parsing/storage-extractor.js new file mode 100644 index 00000000..e9f9181a --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/storage-extractor.js @@ -0,0 +1,37 @@ +/** + * Storage Extractor + * + * Functions for extracting storage information from parsed data. + */ + +/** + * Extract storage information from parsed data + * @param {object} data - Parsed documentation data + * @returns {string | null} Storage information or null + */ +function extractStorageInfo(data) { + // Look for STORAGE_POSITION in state variables + const storageVar = data.stateVariables.find(v => + v.name.includes('STORAGE') || v.name.includes('storage') + ); + + if (storageVar) { + return `Storage position: \`${storageVar.name}\` - ${storageVar.description || 'Used for diamond storage pattern.'}`; + } + + // Look for storage struct + const storageStruct = data.structs.find(s => + s.name.includes('Storage') + ); + + if (storageStruct) { + return `Uses the \`${storageStruct.name}\` struct following the ERC-8042 diamond storage pattern.`; + } + + return null; +} + +module.exports = { + extractStorageInfo, +}; + diff --git a/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js b/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js new file mode 100644 index 00000000..a4220c80 --- /dev/null +++ b/.github/scripts/generate-docs-utils/parsing/text-sanitizer.js @@ -0,0 +1,66 @@ +/** + * Text Sanitizer + * + * Functions for cleaning and sanitizing text content. + */ + +/** + * Sanitize markdown links that point to non-existent files + * Removes or converts broken links to plain text + * @param {string} text - Text that may contain markdown links + * @returns {string} Sanitized text + */ +function sanitizeBrokenLinks(text) { + if (!text) return text; + + // Remove markdown links that point to /src/ paths (forge doc links) + // Pattern: [text](/src/...) 
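+ // Example (the linked path is illustrative):
+ //   "[AccessControlMod](/src/access/AccessControlMod.sol)" becomes plain "AccessControlMod";
+ //   links that do not point under /src/ are left untouched.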
+ return text.replace(/\[([^\]]+)\]\(\/src\/[^\)]+\)/g, '$1'); +} + +/** + * Clean description text by removing markdown artifacts + * Strips **Parameters**, **Returns**, **Note:** and other section markers + * that get incorrectly included in descriptions from forge doc output + * @param {string} text - Description text that may contain markdown artifacts + * @returns {string} Cleaned description text + */ +function cleanDescription(text) { + if (!text) return text; + + let cleaned = text; + + // Remove markdown section headers that shouldn't be in descriptions + // These patterns appear when forge doc parsing doesn't stop at section boundaries + const artifactPatterns = [ + /\s*\*\*Parameters\*\*\s*/g, + /\s*\*\*Returns\*\*\s*/g, + /\s*\*\*Note:\*\*\s*/g, + /\s*\*\*Events\*\*\s*/g, + /\s*\*\*Errors\*\*\s*/g, + /\s*\*\*See Also\*\*\s*/g, + /\s*\*\*Example\*\*\s*/g, + ]; + + for (const pattern of artifactPatterns) { + cleaned = cleaned.replace(pattern, ' '); + } + + // Remove @custom: tags that may leak through (e.g., "@custom:error AccessControlUnauthorizedAccount") + cleaned = cleaned.replace(/@custom:\w+\s+/g, ''); + + // Clean up "error: ErrorName" patterns that appear inline + // Keep the error name but format it better: "error: ErrorName If..." -> "Reverts with ErrorName if..." + cleaned = cleaned.replace(/\berror:\s+(\w+)\s+/gi, 'Reverts with $1 '); + + // Normalize whitespace: collapse multiple spaces, trim + cleaned = cleaned.replace(/\s+/g, ' ').trim(); + + return cleaned; +} + +module.exports = { + sanitizeBrokenLinks, + cleanDescription, +}; + diff --git a/.github/scripts/generate-docs-utils/pr-body-generator.js b/.github/scripts/generate-docs-utils/pr-body-generator.js new file mode 100644 index 00000000..c37678cb --- /dev/null +++ b/.github/scripts/generate-docs-utils/pr-body-generator.js @@ -0,0 +1,104 @@ +/** + * PR Body Generator + * + * Generates a PR body from the docgen-summary.json file + * + * Usage: + * node pr-body-generator.js [summary-file-path] + * + * Outputs the PR body in GitHub Actions format to stdout + */ + +const fs = require('fs'); +const path = require('path'); + +/** + * Generate PR body from summary data + * @param {Object} summary - Summary data from docgen-summary.json + * @returns {string} PR body markdown + */ +function generatePRBody(summary) { + const facets = summary.facets || []; + const modules = summary.modules || []; + const total = summary.totalGenerated || 0; + + let body = '## Auto-Generated Docs Pages\n\n'; + body += 'This PR contains auto-generated documentation from contract comments using `forge doc`. '; + body += 'The output is passed through AI to enhance the documentation content and add additional information.\n\n'; + body += '**Please ALWAYS review the generated content and ensure it is accurate and complete to Compose Standards.**\n'; + + + body += '### Summary\n'; + body += `- **Total generated:** ${total} files\n\n`; + + if (facets.length > 0) { + body += '### Facets\n'; + facets.forEach(facet => { + body += `- ${facet.title}\n`; + }); + body += '\n'; + } + + if (modules.length > 0) { + body += '### Modules\n'; + modules.forEach(module => { + body += `- ${module.title}\n`; + }); + body += '\n'; + } + + body += '### What was done\n'; + body += '1. Extracted NatSpec using `forge doc`\n'; + body += '2. Converted to Docusaurus MDX format\n'; + body += '3. Enhanced content with GitHub Copilot (optional)\n'; + body += '4. 
Verified documentation build\n\n'; + + body += '### Review Checklist\n'; + body += '- [ ] Review generated content for accuracy\n'; + body += '- [ ] Verify code examples are correct\n'; + body += '- [ ] Check for any missing documentation\n'; + body += '- [ ] Ensure consistency with existing docs\n\n'; + + body += '---\n'; + body += '🚨 **This PR was automatically generated. Please ALWAYS review before merging.**\n'; + body += `Generated on: ${new Date().toISOString()}\n`; + + return body; +} + +/** + * Main function + */ +function main() { + const summaryPath = process.argv[2] || 'docgen-summary.json'; + + if (!fs.existsSync(summaryPath)) { + console.error(`Error: Summary file not found: ${summaryPath}`); + process.exit(1); + } + + try { + const summaryContent = fs.readFileSync(summaryPath, 'utf8'); + const summary = JSON.parse(summaryContent); + + const prBody = generatePRBody(summary); + + // Output in GitHub Actions format + console.log('body</g, '>') + .replace(/\{/g, '{') + .replace(/\}/g, '}'); +} + +/** + * Convert object/array to a safe JavaScript expression for JSX attributes + * Returns the value wrapped in curly braces for direct use in JSX: {value} + * @param {*} obj - Value to convert + * @returns {string} JSX expression with curly braces: {JSON} + */ +function toJsxExpression(obj) { + if (obj == null) return '{null}'; + + try { + let jsonStr = JSON.stringify(obj); + // Ensure single line + jsonStr = jsonStr.replace(/[\n\r]/g, ' ').replace(/\s+/g, ' ').trim(); + // Verify it's valid JSON + JSON.parse(jsonStr); + // Return with JSX curly braces included + return `{${jsonStr}}`; + } catch (e) { + console.warn('Invalid JSON generated:', e.message); + return Array.isArray(obj) ? '{[]}' : '{{}}'; + } +} + +/** + * Escape special characters for JSX string attributes + * @param {string} str - String to escape + * @returns {string} Escaped string safe for JSX attributes + */ +function escapeJsx(str) { + if (!str) return ''; + + return sanitizeForMdx(str) + .replace(/\\/g, '\\\\') + .replace(/"/g, '\\"') + .replace(/'/g, "\\'") + .replace(/\n/g, ' ') + .replace(/\{/g, '{') + .replace(/\}/g, '}') + // Don't escape backticks - they should be preserved for code formatting + .trim(); +} + +/** + * Escape markdown table special characters + * @param {string} str - String to escape + * @returns {string} Escaped string safe for markdown tables + */ +function escapeMarkdownTable(str) { + if (!str) return ''; + return str + .replace(/\|/g, '\\|') + .replace(/\n/g, ' ') + .replace(/\{/g, '{') + .replace(/\}/g, '}'); +} + +/** + * Escape HTML entities for safe display + * @param {string} str - String to escape + * @returns {string} HTML-escaped string + */ +function escapeHtml(str) { + if (!str) return ''; + return String(str) + .replace(/&/g, '&') + .replace(//g, '>') + .replace(/"/g, '"') + .replace(/'/g, '''); +} + +/** + * Escape string for use in JavaScript/JSX object literal values + * Escapes quotes and backslashes for JavaScript strings (not HTML entities) + * Preserves backticks for code formatting + * @param {string} str - String to escape + * @returns {string} Escaped string safe for JavaScript string literals + */ +function escapeJsString(str) { + if (!str) return ''; + return String(str) + .replace(/\\/g, '\\\\') // Escape backslashes first + .replace(/"/g, '\\"') // Escape double quotes + .replace(/'/g, "\\'") // Escape single quotes + .replace(/\n/g, '\\n') // Escape newlines + .replace(/\r/g, '\\r') // Escape carriage returns + .replace(/\t/g, '\\t'); // Escape tabs + // Note: 
Backticks are preserved for code formatting in descriptions +} + +/** + * Escape string for JSX string attributes, preserving backticks for code formatting + * This is specifically for descriptions that may contain code with backticks + * @param {string} str - String to escape + * @returns {string} Escaped string safe for JSX string attributes with preserved backticks + */ +function escapeJsxPreserveBackticks(str) { + if (!str) return ''; + + // Don't use sanitizeForMdx as it might HTML-escape things + // Just escape what's needed for JSX string attributes + return String(str) + .replace(/\\/g, '\\\\') // Escape backslashes first + .replace(/"/g, '\\"') // Escape double quotes for JSX strings + .replace(/'/g, "\\'") // Escape single quotes + .replace(/\n/g, ' ') // Replace newlines with spaces + .replace(/\{/g, '{') // Escape curly braces for JSX + .replace(/\}/g, '}') // Escape curly braces for JSX + // Preserve backticks - don't escape them, they're needed for code formatting + .trim(); +} + +module.exports = { + escapeYaml, + escapeJsx, + escapeJsxPreserveBackticks, + sanitizeForMdx, + sanitizeMdx: sanitizeForMdx, // Alias for template usage + toJsxExpression, + escapeMarkdownTable, + escapeHtml, + escapeJsString, +}; + diff --git a/.github/scripts/generate-docs-utils/templates/package-lock.json b/.github/scripts/generate-docs-utils/templates/package-lock.json new file mode 100644 index 00000000..b1877ecc --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/package-lock.json @@ -0,0 +1,79 @@ +{ + "name": "compose-doc-templates", + "version": "1.0.0", + "lockfileVersion": 3, + "requires": true, + "packages": { + "": { + "name": "compose-doc-templates", + "version": "1.0.0", + "dependencies": { + "handlebars": "^4.7.8" + } + }, + "node_modules/handlebars": { + "version": "4.7.8", + "resolved": "https://registry.npmjs.org/handlebars/-/handlebars-4.7.8.tgz", + "integrity": "sha512-vafaFqs8MZkRrSX7sFVUdo3ap/eNiLnb4IakshzvP56X5Nr1iGKAIqdX6tMlm6HcNRIkr6AxO5jFEoJzzpT8aQ==", + "license": "MIT", + "dependencies": { + "minimist": "^1.2.5", + "neo-async": "^2.6.2", + "source-map": "^0.6.1", + "wordwrap": "^1.0.0" + }, + "bin": { + "handlebars": "bin/handlebars" + }, + "engines": { + "node": ">=0.4.7" + }, + "optionalDependencies": { + "uglify-js": "^3.1.4" + } + }, + "node_modules/minimist": { + "version": "1.2.8", + "resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.8.tgz", + "integrity": "sha512-2yyAR8qBkN3YuheJanUpWC5U3bb5osDywNB8RzDVlDwDHbocAJveqqj1u8+SVD7jkWT4yvsHCpWqqWqAxb0zCA==", + "license": "MIT", + "funding": { + "url": "https://github.com/sponsors/ljharb" + } + }, + "node_modules/neo-async": { + "version": "2.6.2", + "resolved": "https://registry.npmjs.org/neo-async/-/neo-async-2.6.2.tgz", + "integrity": "sha512-Yd3UES5mWCSqR+qNT93S3UoYUkqAZ9lLg8a7g9rimsWmYGK8cVToA4/sF3RrshdyV3sAGMXVUmpMYOw+dLpOuw==", + "license": "MIT" + }, + "node_modules/source-map": { + "version": "0.6.1", + "resolved": "https://registry.npmjs.org/source-map/-/source-map-0.6.1.tgz", + "integrity": "sha512-UjgapumWlbMhkBgzT7Ykc5YXUT46F0iKu8SGXq0bcwP5dz/h0Plj6enJqjz1Zbq2l5WaqYnrVbwWOWMyF3F47g==", + "license": "BSD-3-Clause", + "engines": { + "node": ">=0.10.0" + } + }, + "node_modules/uglify-js": { + "version": "3.19.3", + "resolved": "https://registry.npmjs.org/uglify-js/-/uglify-js-3.19.3.tgz", + "integrity": "sha512-v3Xu+yuwBXisp6QYTcH4UbH+xYJXqnq2m/LtQVWKWzYc1iehYnLixoQDN9FH6/j9/oybfd6W9Ghwkl8+UMKTKQ==", + "license": "BSD-2-Clause", + "optional": true, + "bin": { + "uglifyjs": 
"bin/uglifyjs" + }, + "engines": { + "node": ">=0.8.0" + } + }, + "node_modules/wordwrap": { + "version": "1.0.0", + "resolved": "https://registry.npmjs.org/wordwrap/-/wordwrap-1.0.0.tgz", + "integrity": "sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==", + "license": "MIT" + } + } +} diff --git a/.github/scripts/generate-docs-utils/templates/package.json b/.github/scripts/generate-docs-utils/templates/package.json new file mode 100644 index 00000000..d5425ad4 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/package.json @@ -0,0 +1,10 @@ +{ + "name": "compose-doc-templates", + "version": "1.0.0", + "private": true, + "description": "Template engine for generating MDX documentation", + "dependencies": { + "handlebars": "^4.7.8" + } +} + diff --git a/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template b/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template new file mode 100644 index 00000000..f2bca5ae --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/pages/contract.mdx.template @@ -0,0 +1,274 @@ +--- +sidebar_position: {{position}} +title: "{{escapeYaml title}}" +description: "{{escapeYaml description}}" +{{#if gitSource}} +gitSource: "{{gitSource}}" +{{/if}} +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + + +{{escapeYaml description}} + + +{{#if keyFeatures}} + +{{{sanitizeMdx keyFeatures}}} + +{{/if}} + +{{#if isModule}} + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + +{{/if}} + +## Overview + +{{{sanitizeMdx overview}}} + +--- + +## Storage + +{{#if hasStructs}} +{{#each structs}} +### {{name}} + +{{#if description}} +{{{sanitizeMdx description}}} +{{/if}} + +{{#if definition}} + +{{{codeContent definition}}} + +{{/if}} + +{{#unless @last}} +--- +{{/unless}} +{{/each}} +{{/if}} + +{{#if hasStorage}} + +{{#if hasStateVariables}} +### State Variables + + +{{/if}} +{{/if}} + +{{#if hasFunctions}} +## Functions + +{{#each functions}} +### {{name}} + +{{#if description}} +{{{sanitizeMdx description}}} +{{/if}} + +{{#if signature}} + +{{{codeContent signature}}} + +{{/if}} + +{{#if hasParams}} +**Parameters:** + + +{{/if}} + +{{#if hasReturns}} +**Returns:** + + +{{/if}} + +{{#unless @last}} +--- +{{/unless}} +{{/each}} +{{/if}} + +{{#if hasEvents}} +## Events + + +{{#each events}} + + {{#if description}} +
+ {{{sanitizeMdx description}}} +
+ {{/if}} + + {{#if signature}} +
+ Signature: + +{{{codeContent signature}}} + +
+ {{/if}} + + {{#if hasParams}} +
+ Parameters: + +
+ {{/if}} +
+{{/each}} +
+{{/if}} + +{{#if hasErrors}} +## Errors + + +{{#each errors}} + + {{#if description}} +
+ {{{sanitizeMdx description}}} +
+ {{/if}} + + {{#if signature}} +
+ Signature: + +{{signature}} + +
+ {{/if}} +
+{{/each}} +
+{{/if}} + +{{#if usageExample}} + + +{{/if}} + +{{#if bestPractices}} +## Best Practices + + +{{{sanitizeMdx bestPractices}}} + +{{/if}} + +{{#if isFacet}} +{{#if securityConsiderations}} +## Security Considerations + + +{{{sanitizeMdx securityConsiderations}}} + +{{/if}} +{{/if}} + +{{#if isModule}} +{{#if integrationNotes}} +## Integration Notes + + +{{{sanitizeMdx integrationNotes}}} + +{{/if}} +{{/if}} + +{{#if relatedDocs}} +
+ +
+{{/if}} + +
+ +
+ + diff --git a/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js b/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js new file mode 100644 index 00000000..18fb38d0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/template-engine-handlebars.js @@ -0,0 +1,262 @@ +/** + * Handlebars Template Engine for MDX Documentation Generation + * + * Replaces the custom template engine with Handlebars for better reliability + * and proper MDX formatting. + */ + +const Handlebars = require('handlebars'); +const fs = require('fs'); +const path = require('path'); +const helpers = require('./helpers'); + +// Track if helpers have been registered (only register once) +let helpersRegistered = false; + +/** + * Register custom helpers for Handlebars + * All helpers from helpers.js are registered for use in templates + */ +function registerHelpers() { + if (helpersRegistered) return; + + // Register escape helpers + Handlebars.registerHelper('escapeYaml', helpers.escapeYaml); + Handlebars.registerHelper('escapeJsx', helpers.escapeJsx); + // Helper to escape JSX strings while preserving backticks for code formatting + Handlebars.registerHelper('escapeJsxPreserveBackticks', function(value) { + if (!value) return ''; + const escaped = helpers.escapeJsxPreserveBackticks(value); + // Return as SafeString to prevent Handlebars from HTML-escaping backticks + return new Handlebars.SafeString(escaped); + }); + Handlebars.registerHelper('sanitizeMdx', helpers.sanitizeMdx); + Handlebars.registerHelper('escapeMarkdownTable', helpers.escapeMarkdownTable); + // Helper to escape value for JavaScript strings in JSX object literals + Handlebars.registerHelper('escapeJsString', function(value) { + if (!value) return ''; + const escaped = helpers.escapeJsString(value); + return new Handlebars.SafeString(escaped); + }); + + // Helper to emit a JSX style literal: returns a string like {{display: "flex", gap: "1rem"}} + Handlebars.registerHelper('styleLiteral', function(styles) { + if (!styles || typeof styles !== 'string') return '{{}}'; + + const styleObj = {}; + const pairs = styles.split(';').filter(pair => pair.trim()); + + pairs.forEach(pair => { + const [key, value] = pair.split(':').map(s => s.trim()); + if (key && value !== undefined) { + const camelKey = key.includes('-') + ? key.replace(/-([a-z])/g, (_, c) => c.toUpperCase()) + : key; + const cleanValue = value.replace(/^["']|["']$/g, ''); + styleObj[camelKey] = cleanValue; + } + }); + + const entries = Object.entries(styleObj).map(([k, v]) => { + const isPureNumber = /^-?\d+\.?\d*$/.test(v.trim()); + // Quote everything except pure numbers + const valueLiteral = isPureNumber ? 
v : JSON.stringify(v); + return `${k}: ${valueLiteral}`; + }).join(', '); + + // Wrap with double braces so MDX sees style={{...}} + return `{{${entries}}}`; + }); + + // Helper to wrap code content in template literal for MDX + // This ensures MDX treats the content as a string, not JSX to parse + Handlebars.registerHelper('codeContent', function(content) { + if (!content) return '{``}'; + // Escape backticks in the content + const escaped = String(content).replace(/`/g, '\\`').replace(/\$/g, '\\$'); + return `{\`${escaped}\`}`; + }); + + // Helper to generate JSX style object syntax + // Accepts CSS string and converts to JSX object format + // Handles both kebab-case (margin-bottom) and camelCase (marginBottom) + Handlebars.registerHelper('jsxStyle', function(styles) { + if (!styles || typeof styles !== 'string') return '{}'; + + // Parse CSS string like "display: flex; margin-bottom: 1rem;" or "marginBottom: 1rem" + const styleObj = {}; + const pairs = styles.split(';').filter(pair => pair.trim()); + + pairs.forEach(pair => { + const [key, value] = pair.split(':').map(s => s.trim()); + if (key && value) { + // Convert kebab-case to camelCase if needed + const camelKey = key.includes('-') + ? key.replace(/-([a-z])/g, (g) => g[1].toUpperCase()) + : key; + // Remove quotes from value if present + const cleanValue = value.replace(/^["']|["']$/g, ''); + styleObj[camelKey] = cleanValue; + } + }); + + // Convert to JSX object string with proper quoting + // All CSS values should be quoted as strings unless they're pure numbers + const entries = Object.entries(styleObj) + .map(([k, v]) => { + // Check if it's a pure number (integer or decimal without units) + if (/^-?\d+\.?\d*$/.test(v.trim())) { + return `${k}: ${v}`; + } + // Everything else (including CSS units like "0.75rem", "2rem", CSS vars, etc.) should be quoted + return `${k}: ${JSON.stringify(v)}`; + }) + .join(', '); + + // Return the object content wrapped in braces + // When used with {{{jsxStyle ...}}} in template, this becomes style={...} + // But we need style={{...}}, so we return with an extra opening brace + // The template uses {{{jsxStyle ...}}} which outputs raw, giving us style={{{...}}} + // To get style={{...}}, we need the helper to return {{...}} + // But with triple braces in template, we'd get style={{{{...}}}} which is wrong + // Solution: return just the object, template adds one brace manually + // Return the full JSX object expression with double braces + // Template will use raw block: {{{{jsxStyle ...}}}} + // This outputs: style={{{display: "flex", ...}}} + // But we need: style={{display: "flex", ...}} + // Actually, let's try: helper returns {{...}}, template uses {{jsxStyle}} (double) + // Handlebars will output the helper result + // But it will escape... 
unless we use raw block + // Simplest: return {{...}}, use {{{{jsxStyle}}}} raw block + return `{{${entries}}}`; + }); + + // Custom helper for better null/empty string handling + // Handlebars' default #if treats empty strings as falsy, but we want to be explicit + Handlebars.registerHelper('ifTruthy', function(value, options) { + if (value != null && + !(Array.isArray(value) && value.length === 0) && + !(typeof value === 'string' && value.trim().length === 0) && + !(typeof value === 'object' && Object.keys(value).length === 0)) { + return options.fn(this); + } + return options.inverse(this); + }); + + helpersRegistered = true; +} + +/** + * Normalize MDX formatting to ensure proper blank lines + * MDX requires blank lines between: + * - Import statements and JSX + * - JSX components and markdown + * - JSX components and other JSX + * + * @param {string} content - MDX content to normalize + * @returns {string} Properly formatted MDX + */ +function normalizeMdxFormatting(content) { + if (!content) return ''; + + let normalized = content; + + // 1. Ensure blank line after import statements (before JSX) + // Pattern: import ...;\n\n## + normalized = normalized.replace(/(\/>)\n(##)/g, '$1\n\n$2'); + + // 3. Ensure blank line after JSX closing tags (before other JSX) + // Pattern: \n)\n(<[A-Z])/g, '$1\n\n$2'); + + // 4. Ensure blank line after JSX closing tags (before markdown content) + // Pattern: \n## or \n[text] + normalized = normalized.replace(/(<\/[A-Z][a-zA-Z]+>)\n(##|[A-Z])/g, '$1\n\n$2'); + + // 5. Ensure blank line before JSX components (after markdown) + // Pattern: ]\n line.trimEnd()).join('\n'); + + // 8. Ensure file ends with single newline + normalized = normalized.trimEnd() + '\n'; + + return normalized; +} + +/** + * List available template files + * @returns {string[]} Array of template names (without extension) + */ +function listAvailableTemplates() { + const templatesDir = path.join(__dirname, 'pages'); + try { + return fs.readdirSync(templatesDir) + .filter(f => f.endsWith('.mdx.template')) + .map(f => f.replace('.mdx.template', '')); + } catch (e) { + return []; + } +} + +/** + * Load and render a template file with Handlebars + * @param {string} templateName - Name of template (without extension) + * @param {object} data - Data to render + * @returns {string} Rendered template with proper MDX formatting + * @throws {Error} If template cannot be loaded + */ +function loadAndRenderTemplate(templateName, data) { + const templatePath = path.join(__dirname, 'pages', `${templateName}.mdx.template`); + + if (!fs.existsSync(templatePath)) { + const available = listAvailableTemplates(); + throw new Error( + `Template '${templateName}' not found at: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? 
available.join(', ') : 'none'}` + ); + } + + // Register helpers (only once, but safe to call multiple times) + registerHelpers(); + + try { + // Load template + const templateContent = fs.readFileSync(templatePath, 'utf8'); + + // Compile template with Handlebars + const template = Handlebars.compile(templateContent); + + // Render with data + let rendered = template(data); + + // Post-process: normalize MDX formatting + rendered = normalizeMdxFormatting(rendered); + + return rendered; + } catch (error) { + if (error.message.includes('Parse error')) { + throw new Error( + `Template parsing error in ${templateName}: ${error.message}\n` + + `Template path: ${templatePath}` + ); + } + throw error; + } +} + +module.exports = { + loadAndRenderTemplate, + registerHelpers, + listAvailableTemplates, +}; + diff --git a/.github/scripts/generate-docs-utils/templates/template-engine.js b/.github/scripts/generate-docs-utils/templates/template-engine.js new file mode 100644 index 00000000..b8e2ed22 --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/template-engine.js @@ -0,0 +1,366 @@ +/** + * Simple Template Engine + * + * A lightweight template engine with no external dependencies. + * + * Supports: + * - Variable substitution: {{variable}} (HTML escaped) + * - Unescaped output: {{{variable}}} (raw output) + * - Conditionals: {{#if variable}}...{{/if}} + * - Loops: {{#each array}}...{{/each}} + * - Helper functions: {{helperName variable}} or {{helperName(arg1, arg2)}} + * - Dot notation: {{object.property.nested}} + */ + +const fs = require('fs'); +const path = require('path'); + +// Import helpers from separate module +const helpers = require('./helpers'); +const { escapeHtml } = helpers; + +/** + * Get value from object using dot notation path + * @param {object} obj - Object to get value from + * @param {string} dotPath - Dot notation path (e.g., "user.name") + * @returns {*} Value at path or undefined + */ +function getValue(obj, dotPath) { + if (!dotPath || !obj) return undefined; + + const parts = dotPath.split('.'); + let value = obj; + + for (const part of parts) { + if (value == null) return undefined; + value = value[part]; + } + + return value; +} + +/** + * Check if a value is truthy for template conditionals + * - null/undefined → false + * - empty array → false + * - empty object → false + * - empty string → false + * - false → false + * - everything else → true + * + * @param {*} value - Value to check + * @returns {boolean} Whether value is truthy + */ +function isTruthy(value) { + if (value == null) return false; + if (Array.isArray(value)) return value.length > 0; + if (typeof value === 'object') return Object.keys(value).length > 0; + if (typeof value === 'string') return value.trim().length > 0; + return Boolean(value); +} + +/** + * Process a helper function call + * @param {string} helperName - Name of the helper + * @param {string[]} args - Argument strings (variable paths or literals) + * @param {object} context - Current template context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Result of helper function + */ +function processHelper(helperName, args, context, helperRegistry) { + const helper = helperRegistry[helperName]; + if (!helper) { + console.warn(`Unknown template helper: ${helperName}`); + return ''; + } + + // Process arguments - can be variable paths or quoted literals + const processedArgs = args.map(arg => { + arg = arg.trim(); + // Check for quoted literal strings + if ((arg.startsWith('"') && 
arg.endsWith('"')) || + (arg.startsWith("'") && arg.endsWith("'"))) { + return arg.slice(1, -1); + } + // Otherwise treat as variable path + return getValue(context, arg); + }); + + return helper(...processedArgs); +} + +/** + * Process a variable expression (helper or simple variable) + * @param {string} expression - The expression inside {{ }} + * @param {object} context - Current template context + * @param {boolean} escapeOutput - Whether to HTML-escape the output + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed value + */ +function processExpression(expression, context, escapeOutput, helperRegistry) { + const expr = expression.trim(); + + // Check for helper with parentheses: helperName(arg1, arg2) + const parenMatch = expr.match(/^(\w+)\((.*)\)$/); + if (parenMatch) { + const [, helperName, argsStr] = parenMatch; + const args = argsStr ? argsStr.split(',').map(a => a.trim()) : []; + return processHelper(helperName, args, context, helperRegistry); + } + + // Check for helper with space: helperName variable + const spaceMatch = expr.match(/^(\w+)\s+(.+)$/); + if (spaceMatch && helperRegistry[spaceMatch[1]]) { + const [, helperName, arg] = spaceMatch; + return processHelper(helperName, [arg], context, helperRegistry); + } + + // Regular variable lookup + const value = getValue(context, expr); + if (value == null) return ''; + + const str = String(value); + return escapeOutput ? escapeHtml(str) : str; +} + +/** + * Find the matching closing tag for a block, handling nesting + * @param {string} content - Content to search + * @param {string} openTag - Opening tag pattern (e.g., '#if', '#each') + * @param {string} closeTag - Closing tag (e.g., '/if', '/each') + * @param {number} startPos - Position after the opening tag + * @returns {number} Position of the matching closing tag, or -1 if not found + */ +function findMatchingClose(content, openTag, closeTag, startPos) { + let depth = 1; + let pos = startPos; + + const openPattern = new RegExp(`\\{\\{${openTag}\\s+[^}]+\\}\\}`, 'g'); + const closePattern = new RegExp(`\\{\\{${closeTag}\\}\\}`, 'g'); + + while (depth > 0 && pos < content.length) { + // Find next open and close tags + openPattern.lastIndex = pos; + closePattern.lastIndex = pos; + + const openMatch = openPattern.exec(content); + const closeMatch = closePattern.exec(content); + + if (!closeMatch) { + return -1; // No matching close found + } + + // If open comes before close, increase depth + if (openMatch && openMatch.index < closeMatch.index) { + depth++; + pos = openMatch.index + openMatch[0].length; + } else { + depth--; + if (depth === 0) { + return closeMatch.index; + } + pos = closeMatch.index + closeMatch[0].length; + } + } + + return -1; +} + +/** + * Process nested conditionals: {{#if variable}}...{{/if}} + * @param {string} content - Template content + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processConditionals(content, context, helperRegistry) { + let result = content; + const openPattern = /\{\{#if\s+([^}]+)\}\}/g; + + let match; + while ((match = openPattern.exec(result)) !== null) { + const condition = match[1].trim(); + const startPos = match.index; + const afterOpen = startPos + match[0].length; + + const closePos = findMatchingClose(result, '#if', '/if', afterOpen); + if (closePos === -1) { + console.warn(`Unmatched {{#if ${condition}}} at position ${startPos}`); + break; + } + + const 
ifContent = result.substring(afterOpen, closePos); + const closeEndPos = closePos + '{{/if}}'.length; + + // Evaluate condition and get replacement + const value = getValue(context, condition); + const replacement = isTruthy(value) + ? processContent(ifContent, context, helperRegistry) + : ''; + + // Replace in result + result = result.substring(0, startPos) + replacement + result.substring(closeEndPos); + + // Reset pattern to start from beginning since we modified the string + openPattern.lastIndex = 0; + } + + return result; +} + +/** + * Process nested loops: {{#each array}}...{{/each}} + * @param {string} content - Template content + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processLoops(content, context, helperRegistry) { + let result = content; + const openPattern = /\{\{#each\s+([^}]+)\}\}/g; + + let match; + while ((match = openPattern.exec(result)) !== null) { + const arrayPath = match[1].trim(); + const startPos = match.index; + const afterOpen = startPos + match[0].length; + + const closePos = findMatchingClose(result, '#each', '/each', afterOpen); + if (closePos === -1) { + console.warn(`Unmatched {{#each ${arrayPath}}} at position ${startPos}`); + break; + } + + const loopContent = result.substring(afterOpen, closePos); + const closeEndPos = closePos + '{{/each}}'.length; + + // Get array and process each item + const array = getValue(context, arrayPath); + let replacement = ''; + + if (Array.isArray(array) && array.length > 0) { + replacement = array.map((item, index) => { + const itemContext = { ...context, ...item, index }; + return processContent(loopContent, itemContext, helperRegistry); + }).join(''); + } + + // Replace in result + result = result.substring(0, startPos) + replacement + result.substring(closeEndPos); + + // Reset pattern to start from beginning since we modified the string + openPattern.lastIndex = 0; + } + + return result; +} + +/** + * Process template content with the given context + * Handles all variable substitutions, helpers, conditionals, and loops + * + * IMPORTANT: Processing order matters! + * 1. Loops first - so nested conditionals are evaluated with correct item context + * 2. Conditionals second - after loops have expanded their content + * 3. Variables last - after all control structures are resolved + * + * @param {string} content - Template content to process + * @param {object} context - Data context + * @param {object} helperRegistry - Registry of helper functions + * @returns {string} Processed content + */ +function processContent(content, context, helperRegistry) { + let result = content; + + // 1. Process loops FIRST (handles nesting properly) + result = processLoops(result, context, helperRegistry); + + // 2. Process conditionals SECOND (handles nesting properly) + result = processConditionals(result, context, helperRegistry); + + // 3. Process triple braces for unescaped output: {{{variable}}} + const tripleBracePattern = /\{\{\{([^}]+)\}\}\}/g; + result = result.replace(tripleBracePattern, (match, expr) => { + return processExpression(expr, context, false, helperRegistry); + }); + + // 4. 
Process double braces for escaped output: {{variable}} + const doubleBracePattern = /\{\{([^}]+)\}\}/g; + result = result.replace(doubleBracePattern, (match, expr) => { + return processExpression(expr, context, true, helperRegistry); + }); + + return result; +} + +/** + * Render a template string with data + * @param {string} template - Template string + * @param {object} data - Data to render + * @returns {string} Rendered template + */ +function renderTemplate(template, data) { + if (!template) return ''; + if (!data) data = {}; + + return processContent(template, { ...data }, helpers); +} + +/** + * List available template files + * @returns {string[]} Array of template names (without extension) + */ +function listAvailableTemplates() { + const templatesDir = path.join(__dirname, 'pages'); + try { + return fs.readdirSync(templatesDir) + .filter(f => f.endsWith('.mdx.template')) + .map(f => f.replace('.mdx.template', '')); + } catch (e) { + return []; + } +} + +/** + * Load and render a template file + * @param {string} templateName - Name of template (without extension) + * @param {object} data - Data to render + * @returns {string} Rendered template + * @throws {Error} If template cannot be loaded + */ +function loadAndRenderTemplate(templateName, data) { + console.log('Loading template:', templateName); + console.log('Data:', data); + + const templatePath = path.join(__dirname, 'pages', `${templateName}.mdx.template`); + + try { + if (!fs.existsSync(templatePath)) { + const available = listAvailableTemplates(); + throw new Error( + `Template '${templateName}' not found at: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? available.join(', ') : 'none'}` + ); + } + + const template = fs.readFileSync(templatePath, 'utf8'); + return renderTemplate(template, data); + } catch (error) { + if (error.code === 'ENOENT') { + const available = listAvailableTemplates(); + throw new Error( + `Template file not found: ${templatePath}\n` + + `Available templates: ${available.length > 0 ? 
available.join(', ') : 'none'}` + ); + } + throw error; + } +} + +module.exports = { + renderTemplate, + loadAndRenderTemplate, + getValue, + isTruthy, + listAvailableTemplates, +}; diff --git a/.github/scripts/generate-docs-utils/templates/templates.js b/.github/scripts/generate-docs-utils/templates/templates.js new file mode 100644 index 00000000..07f3f74f --- /dev/null +++ b/.github/scripts/generate-docs-utils/templates/templates.js @@ -0,0 +1,707 @@ +/** + * MDX Templates for Docusaurus documentation + * Uses Handlebars template engine for reliable MDX generation + */ + +const { loadAndRenderTemplate } = require('./template-engine-handlebars'); +const { sanitizeForMdx } = require('./helpers'); +const { readFileSafe } = require('../../workflow-utils'); +const { enrichWithRelationships } = require('../core/relationship-detector'); + +/** + * Extract function parameters directly from Solidity source file + * @param {string} sourceFilePath - Path to the Solidity source file + * @param {string} functionName - Name of the function to extract parameters from + * @returns {Array} Array of parameter objects with name and type + */ +function extractParamsFromSource(sourceFilePath, functionName) { + if (!sourceFilePath || !functionName) return []; + + const sourceContent = readFileSafe(sourceFilePath); + if (!sourceContent) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Could not read source file: ${sourceFilePath}`); + } + return []; + } + + // Remove comments to avoid parsing issues + const withoutComments = sourceContent + .replace(/\/\*[\s\S]*?\*\//g, '') // Remove block comments + .replace(/\/\/.*$/gm, ''); // Remove line comments + + // Find function definition - match function name followed by opening parenthesis + // Handle both regular functions and free functions + const functionPattern = new RegExp( + `function\\s+${functionName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')}\\s*\\(([^)]*)\\)`, + 's' + ); + + const match = withoutComments.match(functionPattern); + if (!match || !match[1]) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Function ${functionName} not found in source file`); + } + return []; + } + + const paramsStr = match[1].trim(); + if (!paramsStr) { + return []; // Function has no parameters + } + + // Parse parameters - handle complex types like mappings, arrays, structs + const params = []; + let currentParam = ''; + let depth = 0; + let inString = false; + let stringChar = ''; + + for (let i = 0; i < paramsStr.length; i++) { + const char = paramsStr[i]; + + // Handle string literals + if ((char === '"' || char === "'") && (i === 0 || paramsStr[i - 1] !== '\\')) { + if (!inString) { + inString = true; + stringChar = char; + } else if (char === stringChar) { + inString = false; + } + currentParam += char; + continue; + } + + if (inString) { + currentParam += char; + continue; + } + + // Track nesting depth for generics, arrays, mappings + if (char === '<' || char === '[' || char === '(') { + depth++; + currentParam += char; + } else if (char === '>' || char === ']' || char === ')') { + depth--; + currentParam += char; + } else if (char === ',' && depth === 0) { + // Found a parameter boundary + const trimmed = currentParam.trim(); + if (trimmed) { + const parsed = parseParameter(trimmed); + if (parsed) { + params.push(parsed); + } + } + currentParam = ''; + } else { + currentParam += char; + } + } + + // Handle last parameter + const trimmed = currentParam.trim(); + if (trimmed) { + const parsed = parseParameter(trimmed); + if (parsed) { + 
params.push(parsed); + } + } + + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Extracted ${params.length} params from source for ${functionName}:`, JSON.stringify(params, null, 2)); + } + + return params; +} + +/** + * Parse a single parameter string into name and type + * @param {string} paramStr - Parameter string (e.g., "uint256 amount" or "address") + * @returns {object|null} Object with name and type, or null if invalid + */ +function parseParameter(paramStr) { + if (!paramStr || !paramStr.trim()) return null; + + // Remove storage location keywords + const cleaned = paramStr + .replace(/\b(memory|storage|calldata)\b/g, '') + .replace(/\s+/g, ' ') + .trim(); + + // Split by whitespace - last token is usually the name, rest is type + const parts = cleaned.split(/\s+/); + + if (parts.length === 0) return null; + + // If only one part, it's just a type (unnamed parameter) + if (parts.length === 1) { + return { name: '', type: parts[0], description: '' }; + } + + // Last part is the name, everything before is the type + const name = parts[parts.length - 1]; + const type = parts.slice(0, -1).join(' '); + + // Validate: name should be a valid identifier + if (!/^[a-zA-Z_$][a-zA-Z0-9_$]*$/.test(name)) { + // If name doesn't look valid, treat the whole thing as a type + return { name: '', type: cleaned, description: '' }; + } + + return { name, type, description: '' }; +} + +/** + * Extract parameters from function signature string + * @param {string} signature - Function signature string + * @returns {Array} Array of parameter objects with name and type + */ +function extractParamsFromSignature(signature) { + if (!signature || typeof signature !== 'string') return []; + + // Match function parameters: function name(params) or just (params) + const paramMatch = signature.match(/\(([^)]*)\)/); + if (!paramMatch || !paramMatch[1]) return []; + + const paramsStr = paramMatch[1].trim(); + if (!paramsStr) return []; + + // Split by comma, but be careful with nested generics + const params = []; + let currentParam = ''; + let depth = 0; + + for (let i = 0; i < paramsStr.length; i++) { + const char = paramsStr[i]; + if (char === '<') depth++; + else if (char === '>') depth--; + else if (char === ',' && depth === 0) { + const trimmed = currentParam.trim(); + if (trimmed) { + // Parse "type name" or just "type" + const parts = trimmed.split(/\s+/); + if (parts.length >= 2) { + // Has both type and name + const type = parts.slice(0, -1).join(' '); + const name = parts[parts.length - 1]; + params.push({ name, type, description: '' }); + } else if (parts.length === 1) { + // Just type, no name + params.push({ name: '', type: parts[0], description: '' }); + } + } + currentParam = ''; + continue; + } + currentParam += char; + } + + // Handle last parameter + const trimmed = currentParam.trim(); + if (trimmed) { + const parts = trimmed.split(/\s+/); + if (parts.length >= 2) { + const type = parts.slice(0, -1).join(' '); + const name = parts[parts.length - 1]; + params.push({ name, type, description: '' }); + } else if (parts.length === 1) { + params.push({ name: '', type: parts[0], description: '' }); + } + } + + return params; +} + +/** + * Filter function parameters, removing invalid entries + * Invalid parameters include: empty names or names matching the function name (parsing error) + * @param {Array} params - Raw parameters array + * @param {string} functionName - Name of the function (to detect parsing errors) + * @returns {Array} Filtered and normalized parameters + */ +function 
filterAndNormalizeParams(params, functionName) { + return (params || []) + .filter(p => { + // Handle different possible data structures + const paramName = (p && (p.name || p.param || p.parameter || '')).trim(); + const paramType = (p && (p.type || p.paramType || '')).trim(); + + // Filter out parameters with empty or missing names + if (!paramName) return false; + // Filter out parameters where name matches function name (indicates parsing error) + if (paramName === functionName) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Filtered out invalid param: name="${paramName}" matches function name`); + } + return false; + } + // Filter out if type is empty AND name looks like it might be a function name (starts with lowercase, no underscore) + if (!paramType && /^[a-z]/.test(paramName) && !paramName.includes('_')) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Filtered out suspicious param: name="${paramName}" has no type`); + } + return false; + } + return true; + }) + .map(p => ({ + name: (p.name || p.param || p.parameter || '').trim(), + type: (p.type || p.paramType || '').trim(), + description: (p.description || p.desc || '').trim(), + })); +} + +/** + * Check if a function is internal by examining its signature + * @param {object} fn - Function data with signature property + * @returns {boolean} True if function is internal + */ +function isInternalFunction(fn) { + if (!fn || !fn.signature) return false; + + // Check if signature contains "internal" as a whole word + // Use word boundary regex to avoid matching "internalTransferFrom" etc. + const internalPattern = /\binternal\b/; + return internalPattern.test(fn.signature); +} + +/** + * Prepare function data for template rendering (shared between facet and module) + * @param {object} fn - Function data + * @param {string} sourceFilePath - Path to the Solidity source file + * @param {boolean} useSourceExtraction - Whether to try extracting params from source file (for modules) + * @returns {object} Prepared function data + */ +function prepareFunctionData(fn, sourceFilePath, useSourceExtraction = false) { + // Debug: log the raw function data + if (process.env.DEBUG_PARAMS) { + console.log(`\n[DEBUG] Function: ${fn.name}`); + console.log(`[DEBUG] Raw params:`, JSON.stringify(fn.params, null, 2)); + console.log(`[DEBUG] Signature:`, fn.signature); + } + + // Build parameters array, filtering out invalid parameters + let paramsArray = filterAndNormalizeParams(fn.params, fn.name); + + // If no valid parameters found, try extracting from source file (for modules) or signature + if (paramsArray.length === 0) { + // Try source file extraction for modules + if (useSourceExtraction && sourceFilePath) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] No valid params found, extracting from source file: ${sourceFilePath}`); + } + const extractedParams = extractParamsFromSource(sourceFilePath, fn.name); + if (extractedParams.length > 0) { + paramsArray = extractedParams; + } + } + + // Fallback to signature extraction if still no params + if (paramsArray.length === 0 && fn.signature) { + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] No valid params found, extracting from signature`); + } + const extractedParams = extractParamsFromSignature(fn.signature); + paramsArray = filterAndNormalizeParams(extractedParams, fn.name); + if (process.env.DEBUG_PARAMS) { + console.log(`[DEBUG] Extracted params from signature:`, JSON.stringify(paramsArray, null, 2)); + } + } + } + + if (process.env.DEBUG_PARAMS) { + 
console.log(`[DEBUG] Final paramsArray:`, JSON.stringify(paramsArray, null, 2)); + } + + // Build returns array for table rendering + const returnsArray = (fn.returns || []).map(r => ({ + name: r.name || '-', + type: r.type, + description: r.description || '', + })); + + return { + name: fn.name, + signature: fn.signature, + description: fn.notice || fn.description || '', + params: paramsArray, + returns: returnsArray, + hasReturns: returnsArray.length > 0, + hasParams: paramsArray.length > 0, + }; +} + +/** + * Prepare event data for template rendering + * @param {object} event - Event data + * @returns {object} Prepared event data + */ +function prepareEventData(event) { + return { + name: event.name, + description: event.description || '', + signature: event.signature, + params: (event.params || []).map(p => ({ + name: p.name, + type: p.type, + description: p.description || '', + })), + hasParams: (event.params || []).length > 0, + }; +} + +/** + * Prepare error data for template rendering + * @param {object} error - Error data + * @returns {object} Prepared error data + */ +function prepareErrorData(error) { + return { + name: error.name, + description: error.description || '', + signature: error.signature, + }; +} + +/** + * Normalize struct definition indentation + * Ensures consistent 4-space indentation for struct body content + * @param {string} definition - Struct definition code + * @returns {string} Normalized struct definition with proper indentation + */ +function normalizeStructIndentation(definition) { + if (!definition) return definition; + + const lines = definition.split('\n'); + if (lines.length === 0) return definition; + + // Find the struct opening line (contains "struct" keyword) + let structStartIndex = -1; + let openingBraceOnSameLine = false; + + for (let i = 0; i < lines.length; i++) { + if (lines[i].includes('struct')) { + structStartIndex = i; + openingBraceOnSameLine = lines[i].includes('{'); + break; + } + } + + if (structStartIndex === -1) return definition; + + // Get the indentation of the struct declaration line + const structLine = lines[structStartIndex]; + const structIndentMatch = structLine.match(/^(\s*)/); + const structIndent = structIndentMatch ? 
structIndentMatch[1] : ''; + + // Normalize all lines + const normalized = []; + let inStructBody = openingBraceOnSameLine; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmed = line.trim(); + + if (i === structStartIndex) { + // Keep struct declaration line as-is + normalized.push(line); + if (openingBraceOnSameLine) { + inStructBody = true; + } + continue; + } + + // Handle opening brace on separate line + if (!openingBraceOnSameLine && trimmed === '{') { + normalized.push(structIndent + '{'); + inStructBody = true; + continue; + } + + // Handle closing brace + if (trimmed === '}') { + normalized.push(structIndent + '}'); + inStructBody = false; + continue; + } + + // Skip empty lines + if (trimmed === '') { + normalized.push(''); + continue; + } + + // For struct body content, ensure 4-space indentation relative to struct declaration + if (inStructBody) { + // Remove any existing indentation and add proper indentation + const bodyIndent = structIndent + ' '; // 4 spaces + normalized.push(bodyIndent + trimmed); + } else { + // Keep lines outside struct body as-is + normalized.push(line); + } + } + + return normalized.join('\n'); +} + +/** + * Prepare struct data for template rendering + * @param {object} struct - Struct data + * @returns {object} Prepared struct data + */ +function prepareStructData(struct) { + return { + name: struct.name, + description: struct.description || '', + definition: normalizeStructIndentation(struct.definition), + }; +} + +/** + * Validate documentation data + * @param {object} data - Documentation data to validate + * @throws {Error} If data is invalid + */ +function validateData(data) { + if (!data || typeof data !== 'object') { + throw new Error('Invalid data: expected an object'); + } + if (!data.title || typeof data.title !== 'string') { + throw new Error('Invalid data: missing or invalid title'); + } +} + +/** + * Generate fallback description for state variables/constants based on naming patterns + * @param {string} name - Variable name (e.g., "STORAGE_POSITION", "DEFAULT_ADMIN_ROLE") + * @param {string} moduleName - Name of the module/contract for context + * @returns {string} Generated description or empty string + */ +function generateStateVariableDescription(name, moduleName) { + if (!name) return ''; + + const upperName = name.toUpperCase(); + + // Common patterns for diamond/ERC contracts + const patterns = { + // Storage position patterns + 'STORAGE_POSITION': 'Diamond storage slot position for this module', + 'STORAGE_SLOT': 'Diamond storage slot identifier', + '_STORAGE_POSITION': 'Diamond storage slot position', + '_STORAGE_SLOT': 'Diamond storage slot identifier', + + // Role patterns + 'DEFAULT_ADMIN_ROLE': 'Default administrative role identifier (bytes32(0))', + 'ADMIN_ROLE': 'Administrative role identifier', + 'MINTER_ROLE': 'Minter role identifier', + 'PAUSER_ROLE': 'Pauser role identifier', + 'BURNER_ROLE': 'Burner role identifier', + + // ERC patterns + 'INTERFACE_ID': 'ERC-165 interface identifier', + 'EIP712_DOMAIN': 'EIP-712 domain separator', + 'PERMIT_TYPEHASH': 'EIP-2612 permit type hash', + + // Reentrancy patterns + 'NON_REENTRANT_SLOT': 'Reentrancy guard storage slot', + '_NOT_ENTERED': 'Reentrancy status: not entered', + '_ENTERED': 'Reentrancy status: entered', + }; + + // Check exact matches first + if (patterns[upperName]) { + return patterns[upperName]; + } + + // Check partial matches + if (upperName.includes('STORAGE') && (upperName.includes('POSITION') || upperName.includes('SLOT'))) { 
+ return 'Diamond storage slot position for this module'; + } + if (upperName.includes('_ROLE')) { + const roleName = name.replace(/_ROLE$/i, '').replace(/_/g, ' ').toLowerCase(); + return `${roleName.charAt(0).toUpperCase() + roleName.slice(1)} role identifier`; + } + if (upperName.includes('TYPEHASH')) { + return 'Type hash for EIP-712 structured data'; + } + if (upperName.includes('INTERFACE')) { + return 'ERC-165 interface identifier'; + } + + // Generic fallback + return ''; +} + +/** + * Prepare base data common to both facet and module templates + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @returns {object} Base prepared data + */ +function prepareBaseData(data, position = 99) { + validateData(data); + + const description = data.description || `Contract documentation for ${data.title}`; + const subtitle = data.subtitle || data.description || `Contract documentation for ${data.title}`; + const overview = data.overview || data.description || `Documentation for ${data.title}.`; + + return { + position, + title: data.title, + description, + subtitle, + overview, + generatedDate: data.generatedDate || new Date().toISOString(), + gitSource: data.gitSource || '', + keyFeatures: data.keyFeatures || '', + usageExample: data.usageExample || '', + bestPractices: (data.bestPractices && data.bestPractices.trim()) ? data.bestPractices : null, + securityConsiderations: (data.securityConsiderations && data.securityConsiderations.trim()) ? data.securityConsiderations : null, + integrationNotes: (data.integrationNotes && data.integrationNotes.trim()) ? data.integrationNotes : null, + storageInfo: data.storageInfo || '', + + // Events + events: (data.events || []).map(prepareEventData), + hasEvents: (data.events || []).length > 0, + + // Errors + errors: (data.errors || []).map(prepareErrorData), + hasErrors: (data.errors || []).length > 0, + + // Structs + structs: (data.structs || []).map(prepareStructData), + hasStructs: (data.structs || []).length > 0, + + // State variables (for modules) - with fallback description generation + stateVariables: (data.stateVariables || []).map(v => { + const baseDescription = v.description || generateStateVariableDescription(v.name, data.title); + let description = baseDescription; + + // Append value to description if it exists and isn't already included + if (v.value && v.value.trim()) { + const valueStr = v.value.trim(); + // Check if value is already in description (case-insensitive) + // Escape special regex characters in valueStr + const escapedValue = valueStr.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); + // Pattern matches "(Value: `...`)" or "(Value: ...)" format + const valuePattern = new RegExp('\\(Value:\\s*[`]?[^`)]*' + escapedValue + '[^`)]*[`]?\\)', 'i'); + if (!valuePattern.test(description)) { + // Format the value for display with backticks + // Use string concatenation to avoid template literal backtick issues + const valuePart = '(Value: `' + valueStr + '`)'; + description = baseDescription ? 
baseDescription + ' ' + valuePart : valuePart; + } + } + + return { + name: v.name, + type: v.type || '', + value: v.value || '', + description: description, + }; + }), + hasStateVariables: (data.stateVariables || []).length > 0, + hasStorage: Boolean(data.storageInfo || (data.stateVariables && data.stateVariables.length > 0)), + }; +} + +/** + * Prepare data for facet template rendering + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {object} Prepared data for facet template + */ +function prepareFacetData(data, position = 99, pathInfo = null, registry = null) { + const baseData = prepareBaseData(data, position); + const sourceFilePath = data.sourceFilePath; + + // Filter out internal functions for facets (they act as pre-deploy logic blocks) + const publicFunctions = (data.functions || []).filter(fn => !isInternalFunction(fn)); + + const preparedData = { + ...baseData, + // Contract type flags for unified template + isFacet: true, + isModule: false, + contractType: 'facet', + // Functions with APIReference-compatible format (no source extraction for facets) + // Only include non-internal functions since facets are pre-deploy logic blocks + functions: publicFunctions.map(fn => prepareFunctionData(fn, sourceFilePath, false)), + hasFunctions: publicFunctions.length > 0, + }; + + // Enrich with relationships if registry and pathInfo provided + if (registry && pathInfo) { + return enrichWithRelationships(preparedData, pathInfo, registry); + } + + return preparedData; +} + +/** + * Prepare data for module template rendering + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {object} Prepared data for module template + */ +function prepareModuleData(data, position = 99, pathInfo = null, registry = null) { + const baseData = prepareBaseData(data, position); + const sourceFilePath = data.sourceFilePath; + + const preparedData = { + ...baseData, + // Contract type flags for unified template + isFacet: false, + isModule: true, + contractType: 'module', + // Functions with table-compatible format (with source extraction for modules) + functions: (data.functions || []).map(fn => prepareFunctionData(fn, sourceFilePath, true)), + hasFunctions: (data.functions || []).length > 0, + }; + + // Enrich with relationships if registry and pathInfo provided + if (registry && pathInfo) { + return enrichWithRelationships(preparedData, pathInfo, registry); + } + + return preparedData; +} + +/** + * Generate complete facet documentation + * Uses the unified contract template with isFacet=true + * @param {object} data - Documentation data + * @param {number} position - Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {string} Complete MDX document + */ +function generateFacetDoc(data, position = 99, pathInfo = null, registry = null) { + const preparedData = prepareFacetData(data, position, pathInfo, registry); + return loadAndRenderTemplate('contract', preparedData); +} + +/** + * Generate complete module documentation + * Uses the unified contract template with isModule=true + * @param {object} data - Documentation data + * @param {number} position - 
Sidebar position + * @param {object} pathInfo - Output path information (optional) + * @param {object} registry - Contract registry (optional) + * @returns {string} Complete MDX document + */ +function generateModuleDoc(data, position = 99, pathInfo = null, registry = null) { + const preparedData = prepareModuleData(data, position, pathInfo, registry); + return loadAndRenderTemplate('contract', preparedData); +} + +module.exports = { + generateFacetDoc, + generateModuleDoc, +}; diff --git a/.github/scripts/generate-docs-utils/tracking/summary-tracker.js b/.github/scripts/generate-docs-utils/tracking/summary-tracker.js new file mode 100644 index 00000000..c1d34ee0 --- /dev/null +++ b/.github/scripts/generate-docs-utils/tracking/summary-tracker.js @@ -0,0 +1,134 @@ +/** + * Summary Tracker + * + * Tracks processing results and generates summary reports. + * Replaces global processedFiles object with a class-based approach. + */ + +const path = require('path'); +const { writeFileSafe } = require('../../workflow-utils'); + +/** + * Tracks processed files and generates summary reports + */ +class SummaryTracker { + constructor() { + this.facets = []; + this.modules = []; + this.skipped = []; + this.errors = []; + this.fallbackFiles = []; + } + + /** + * Record a successfully processed facet + * @param {string} title - Facet title + * @param {string} file - Output file path + */ + recordFacet(title, file) { + this.facets.push({ title, file }); + } + + /** + * Record a successfully processed module + * @param {string} title - Module title + * @param {string} file - Output file path + */ + recordModule(title, file) { + this.modules.push({ title, file }); + } + + /** + * Record a skipped file + * @param {string} file - File path + * @param {string} reason - Reason for skipping + */ + recordSkipped(file, reason) { + this.skipped.push({ file, reason }); + } + + /** + * Record an error + * @param {string} file - File path + * @param {string} error - Error message + */ + recordError(file, error) { + this.errors.push({ file, error }); + } + + /** + * Record a file that used fallback content + * @param {string} title - Contract title + * @param {string} file - Output file path + * @param {string} error - Error message + */ + recordFallback(title, file, error) { + this.fallbackFiles.push({ title, file, error }); + } + + /** + * Print processing summary to console + */ + printSummary() { + console.log('\n' + '='.repeat(50)); + console.log('Documentation Generation Summary'); + console.log('='.repeat(50)); + + console.log(`\nFacets generated: ${this.facets.length}`); + for (const f of this.facets) { + console.log(` - ${f.title}`); + } + + console.log(`\nModules generated: ${this.modules.length}`); + for (const m of this.modules) { + console.log(` - ${m.title}`); + } + + if (this.skipped.length > 0) { + console.log(`\nSkipped: ${this.skipped.length}`); + for (const s of this.skipped) { + console.log(` - ${path.basename(s.file)}: ${s.reason}`); + } + } + + if (this.errors.length > 0) { + console.log(`\nErrors: ${this.errors.length}`); + for (const e of this.errors) { + console.log(` - ${path.basename(e.file)}: ${e.error}`); + } + } + + if (this.fallbackFiles.length > 0) { + console.log(`\n⚠️ Files using fallback due to AI errors: ${this.fallbackFiles.length}`); + for (const f of this.fallbackFiles) { + console.log(` - ${f.title}: ${f.error}`); + } + } + + const total = this.facets.length + this.modules.length; + console.log(`\nTotal generated: ${total} documentation files`); + console.log('='.repeat(50) + 
'\n'); + } + + /** + * Write summary to file for GitHub Action + */ + writeSummaryFile() { + const summary = { + timestamp: new Date().toISOString(), + facets: this.facets, + modules: this.modules, + skipped: this.skipped, + errors: this.errors, + fallbackFiles: this.fallbackFiles, + totalGenerated: this.facets.length + this.modules.length, + }; + + writeFileSafe('docgen-summary.json', JSON.stringify(summary, null, 2)); + } +} + +module.exports = { + SummaryTracker, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/contract-classifier.js b/.github/scripts/generate-docs-utils/utils/contract-classifier.js new file mode 100644 index 00000000..698df50a --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/contract-classifier.js @@ -0,0 +1,69 @@ +/** + * Contract Classifier + * + * Functions for detecting contract types (interface, module, facet). + */ + +const path = require('path'); + +/** + * Determine if a contract is an interface + * Interfaces should be skipped from documentation generation + * Only checks the naming pattern (I[A-Z]) to avoid false positives + * @param {string} title - Contract title/name + * @param {string} content - File content (forge doc markdown) - unused but kept for API compatibility + * @returns {boolean} True if this is an interface + */ +function isInterface(title, content) { + // Only check if title follows interface naming convention: starts with "I" followed by uppercase + // This is the most reliable indicator and avoids false positives from content that mentions "interface" + if (title && /^I[A-Z]/.test(title)) { + return true; + } + + // Removed content-based check to avoid false positives + // Facets and contracts often mention "interface" in their descriptions + // (e.g., "ERC-165 Standard Interface Detection Facet") which would incorrectly filter them + + return false; +} + +/** + * Determine if a contract is a module or facet + * @param {string} filePath - Path to the file + * @param {string} content - File content + * @returns {'module' | 'facet'} Contract type + */ +function getContractType(filePath, content) { + const lowerPath = filePath.toLowerCase(); + const normalizedPath = lowerPath.replace(/\\/g, '/'); + const baseName = path.basename(filePath, path.extname(filePath)).toLowerCase(); + + // Explicit modules folder + if (normalizedPath.includes('/modules/')) { + return 'module'; + } + + // File naming conventions (e.g., AccessControlMod.sol, NonReentrancyModule.sol) + if (baseName.endsWith('mod') || baseName.endsWith('module')) { + return 'module'; + } + + if (lowerPath.includes('facet')) { + return 'facet'; + } + + // Libraries folder typically contains modules + if (normalizedPath.includes('/libraries/')) { + return 'module'; + } + + // Default to facet for contracts + return 'facet'; +} + +module.exports = { + isInterface, + getContractType, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/file-finder.js b/.github/scripts/generate-docs-utils/utils/file-finder.js new file mode 100644 index 00000000..6ee50dd2 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/file-finder.js @@ -0,0 +1,38 @@ +/** + * File Finder + * + * Functions for finding forge doc output files. 
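+ *
+ * Illustrative usage (hypothetical path; the exact output location depends on
+ * CONFIG.forgeDocsDir and the forge doc layout):
+ *   findForgeDocFiles('src/access/AccessControl/AccessControlMod.sol')
+ *   // returns an array of .md file paths from the matching forge doc folder, or [] if none exist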
+ */ + +const fs = require('fs'); +const path = require('path'); +const CONFIG = require('../config'); + +/** + * Find forge doc output files for a given source file + * @param {string} solFilePath - Path to .sol file (e.g., 'src/access/AccessControl/AccessControlMod.sol') + * @returns {string[]} Array of markdown file paths from forge doc output + */ +function findForgeDocFiles(solFilePath) { + // Transform: src/access/AccessControl/AccessControlMod.sol + // To: docs/src/src/access/AccessControl/AccessControlMod.sol/ + const relativePath = solFilePath.replace(/^src\//, ''); + const docsDir = path.join(CONFIG.forgeDocsDir, relativePath); + + if (!fs.existsSync(docsDir)) { + return []; + } + + try { + const files = fs.readdirSync(docsDir); + return files.filter((f) => f.endsWith('.md')).map((f) => path.join(docsDir, f)); + } catch (error) { + console.error(`Error reading docs dir ${docsDir}:`, error.message); + return []; + } +} + +module.exports = { + findForgeDocFiles, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/git-utils.js b/.github/scripts/generate-docs-utils/utils/git-utils.js new file mode 100644 index 00000000..7f905f0a --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/git-utils.js @@ -0,0 +1,70 @@ +/** + * Git Utilities + * + * Functions for interacting with git to find changed files. + */ + +const { execSync } = require('child_process'); +const { readFileSafe } = require('../../workflow-utils'); + +/** + * Get list of changed Solidity files from git diff + * @param {string} baseBranch - Base branch to compare against + * @returns {string[]} Array of changed .sol file paths + */ +function getChangedSolFiles(baseBranch = 'HEAD~1') { + try { + const output = execSync(`git diff --name-only ${baseBranch} HEAD -- 'src/**/*.sol'`, { + encoding: 'utf8', + }); + return output + .trim() + .split('\n') + .filter((f) => f.endsWith('.sol')); + } catch (error) { + console.error('Error getting changed files:', error.message); + return []; + } +} + +/** + * Get all Solidity files in src directory + * @returns {string[]} Array of .sol file paths + */ +function getAllSolFiles() { + try { + const output = execSync('find src -name "*.sol" -type f', { + encoding: 'utf8', + }); + return output + .trim() + .split('\n') + .filter((f) => f); + } catch (error) { + console.error('Error getting all sol files:', error.message); + return []; + } +} + +/** + * Read changed files from a file (used in CI) + * @param {string} filePath - Path to file containing list of changed files + * @returns {string[]} Array of file paths + */ +function readChangedFilesFromFile(filePath) { + const content = readFileSafe(filePath); + if (!content) { + return []; + } + return content + .trim() + .split('\n') + .filter((f) => f.endsWith('.sol')); +} + +module.exports = { + getChangedSolFiles, + getAllSolFiles, + readChangedFilesFromFile, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/path-computer.js b/.github/scripts/generate-docs-utils/utils/path-computer.js new file mode 100644 index 00000000..cdb6ad4c --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/path-computer.js @@ -0,0 +1,33 @@ +/** + * Path Computer + * + * Functions for computing output paths for documentation files. 
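+ *
+ * Illustrative usage (hypothetical input; the field values come from category-generator's
+ * computeOutputPath, so treat this shape as a sketch rather than a guarantee):
+ *   getOutputPath('src/access/AccessControl/AccessControlFacet.sol', 'facet')
+ *   // returns { outputDir, outputFile, relativePath, fileName, category }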
+ */ + +const { + computeOutputPath, + ensureCategoryFiles, +} = require('../category/category-generator'); + +/** + * Get output directory and file path based on source file path + * Mirrors the src/ structure in website/docs/contracts/ + * + * @param {string} solFilePath - Path to the source .sol file + * @param {'module' | 'facet'} contractType - Type of contract (for logging) + * @returns {object} { outputDir, outputFile, relativePath, fileName, category } + */ +function getOutputPath(solFilePath, contractType) { + // Compute path using the new structure-mirroring logic + const pathInfo = computeOutputPath(solFilePath); + + // Ensure all parent category files exist + ensureCategoryFiles(pathInfo.outputDir); + + return pathInfo; +} + +module.exports = { + getOutputPath, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js b/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js new file mode 100644 index 00000000..345ccaf5 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/sidebar-position-calculator.js @@ -0,0 +1,66 @@ +/** + * Sidebar Position Calculator + * + * Calculates sidebar positions for contracts in documentation. + */ + +const CONFIG = require('../config'); +const { getContractRegistry } = require('../core/contract-registry'); + +/** + * Get sidebar position for a contract + * @param {string} contractName - Name of the contract + * @param {string} contractType - Type of contract ('module' or 'facet') + * @param {string} category - Category of the contract + * @param {object} registry - Contract registry (optional, uses global if not provided) + * @returns {number} Sidebar position + */ +function getSidebarPosition(contractName, contractType = null, category = null, registry = null) { + // First check explicit config + if (CONFIG.contractPositions && CONFIG.contractPositions[contractName] !== undefined) { + return CONFIG.contractPositions[contractName]; + } + + // If we don't have enough info, use default + if (!contractType || !category) { + return CONFIG.defaultSidebarPosition || 50; + } + + // Calculate smart position based on: + // 1. Category base offset + const categoryOffsets = { + diamond: 0, + access: 100, + token: 200, + utils: 300, + interfaceDetection: 400 + }; + + let basePosition = categoryOffsets[category] || 500; + + // 2. Contract type offset (modules before facets) + const typeOffset = contractType === 'module' ? 0 : 10; + basePosition += typeOffset; + + // 3. Position within category based on dependencies + const reg = registry || getContractRegistry(); + if (reg && reg.byCategory.has(category)) { + const categoryContracts = reg.byCategory.get(category) || []; + const sameTypeContracts = categoryContracts.filter(c => c.type === contractType); + + // Sort by name for consistent ordering + sameTypeContracts.sort((a, b) => a.name.localeCompare(b.name)); + + const index = sameTypeContracts.findIndex(c => c.name === contractName); + if (index !== -1) { + basePosition += index; + } + } + + return basePosition; +} + +module.exports = { + getSidebarPosition, +}; + diff --git a/.github/scripts/generate-docs-utils/utils/source-parser.js b/.github/scripts/generate-docs-utils/utils/source-parser.js new file mode 100644 index 00000000..89b588f8 --- /dev/null +++ b/.github/scripts/generate-docs-utils/utils/source-parser.js @@ -0,0 +1,174 @@ +/** + * Source Parser + * + * Functions for parsing Solidity source files to extract information. 
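+ *
+ * Illustrative usage (hypothetical inputs, mirroring the parsing rules below):
+ *   extractModuleNameFromPath('src/access/AccessControl/AccessControlMod.sol') // 'AccessControlMod'
+ *   extractModuleNameFromPath('constants.AccessControlMod.md')                 // 'AccessControlMod'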
+ */ + +const path = require('path'); +const { readFileSafe } = require('../../workflow-utils'); + +/** + * Extract module name from file path + * @param {string} filePath - Path to the file + * @returns {string} Module name + */ +function extractModuleNameFromPath(filePath) { + // If it's a constants file, extract from filename + const basename = path.basename(filePath); + if (basename.startsWith('constants.')) { + const match = basename.match(/^constants\.(.+)\.md$/); + if (match) { + return match[1]; + } + } + + // Extract from .sol file path + if (filePath.endsWith('.sol')) { + return path.basename(filePath, '.sol'); + } + + // Extract from directory structure + const parts = filePath.split(path.sep); + for (let i = parts.length - 1; i >= 0; i--) { + if (parts[i].endsWith('.sol')) { + return path.basename(parts[i], '.sol'); + } + } + + // Fallback: use basename without extension + return path.basename(filePath, path.extname(filePath)); +} + +/** + * Check if a line is a code element declaration + * @param {string} line - Trimmed line to check + * @returns {boolean} True if line is a code element declaration + */ +function isCodeElementDeclaration(line) { + if (!line) return false; + return ( + line.startsWith('function ') || + line.startsWith('error ') || + line.startsWith('event ') || + line.startsWith('struct ') || + line.startsWith('enum ') || + line.startsWith('contract ') || + line.startsWith('library ') || + line.startsWith('interface ') || + line.startsWith('modifier ') || + /^\w+\s+(constant|immutable)\s/.test(line) || + /^(bytes32|uint\d*|int\d*|address|bool|string)\s+constant\s/.test(line) + ); +} + +/** + * Extract module description from source file NatSpec comments + * @param {string} solFilePath - Path to the Solidity source file + * @returns {string} Description extracted from @title and @notice tags + */ +function extractModuleDescriptionFromSource(solFilePath) { + const content = readFileSafe(solFilePath); + if (!content) { + return ''; + } + + const lines = content.split('\n'); + let inComment = false; + let commentBuffer = []; + let title = ''; + let notice = ''; + + for (let i = 0; i < lines.length; i++) { + const line = lines[i]; + const trimmed = line.trim(); + + // Skip SPDX and pragma lines + if (trimmed.startsWith('// SPDX') || trimmed.startsWith('pragma ')) { + continue; + } + + // Check if we've reached a code element without finding a file-level comment + if (!inComment && isCodeElementDeclaration(trimmed)) { + break; + } + + // Start of block comment + if (trimmed.startsWith('/**') || trimmed.startsWith('/*')) { + inComment = true; + commentBuffer = []; + continue; + } + + // End of block comment + if (inComment && trimmed.includes('*/')) { + inComment = false; + const commentText = commentBuffer.join(' '); + + // Look ahead to see if next non-empty line is a code element + let nextCodeLine = ''; + for (let j = i + 1; j < lines.length && j < i + 5; j++) { + const nextTrimmed = lines[j].trim(); + if (nextTrimmed && !nextTrimmed.startsWith('//') && !nextTrimmed.startsWith('/*')) { + nextCodeLine = nextTrimmed; + break; + } + } + + // If the comment has @title, it's a file-level comment + const titleMatch = commentText.match(/@title\s+(.+?)(?:\s+@|\s*$)/); + if (titleMatch) { + title = titleMatch[1].trim(); + const noticeMatch = commentText.match(/@notice\s+(.+?)(?:\s+@|\s*$)/); + if (noticeMatch) { + notice = noticeMatch[1].trim(); + } + break; + } + + // If next line is a code element, this comment belongs to that element + if 
(isCodeElementDeclaration(nextCodeLine)) { + commentBuffer = []; + continue; + } + + // Standalone comment with @notice + const standaloneNotice = commentText.match(/@notice\s+(.+?)(?:\s+@|\s*$)/); + if (standaloneNotice && !isCodeElementDeclaration(nextCodeLine)) { + notice = standaloneNotice[1].trim(); + break; + } + + commentBuffer = []; + continue; + } + + // Collect comment lines + if (inComment) { + let cleanLine = trimmed + .replace(/^\*\s*/, '') + .replace(/^\s*\*/, '') + .trim(); + if (cleanLine && !cleanLine.startsWith('*/')) { + commentBuffer.push(cleanLine); + } + } + } + + // Combine title and notice + if (title && notice) { + return `${title} - ${notice}`; + } else if (notice) { + return notice; + } else if (title) { + return title; + } + + return ''; +} + +module.exports = { + extractModuleNameFromPath, + extractModuleDescriptionFromSource, + isCodeElementDeclaration, +}; + diff --git a/.github/scripts/generate-docs.js b/.github/scripts/generate-docs.js new file mode 100644 index 00000000..50245f4f --- /dev/null +++ b/.github/scripts/generate-docs.js @@ -0,0 +1,82 @@ +/** + * Docusaurus Documentation Generator + * + * Converts forge doc output to Docusaurus MDX format + * with optional AI enhancement. + * + * Features: + * - Mirrors src/ folder structure in documentation + * - Auto-generates category navigation files + * - AI-enhanced content generation + * + * Environment variables: + * GITHUB_TOKEN - GitHub token for AI API (optional) + * SKIP_ENHANCEMENT - Set to 'true' to skip AI enhancement + */ + +const { clearContractRegistry } = require('./generate-docs-utils/core/contract-registry'); +const { syncDocsStructure, regenerateAllIndexFiles } = require('./generate-docs-utils/category/category-generator'); +const { processSolFile } = require('./generate-docs-utils/core/file-processor'); +const { getFilesToProcess } = require('./generate-docs-utils/core/file-selector'); +const { SummaryTracker } = require('./generate-docs-utils/tracking/summary-tracker'); + + +// ============================================================================ +// Main Entry Point +// ============================================================================ + +/** + * Main entry point + */ +async function main() { + console.log('Compose Documentation Generator\n'); + + // Initialize tracker + const tracker = new SummaryTracker(); + + // Step 0: Clear contract registry + clearContractRegistry(); + + // Step 1: Sync docs structure with src structure + console.log('📁 Syncing documentation structure with source...'); + const syncResult = syncDocsStructure(); + + if (syncResult.created.length > 0) { + console.log(` Created ${syncResult.created.length} new categories:`); + syncResult.created.forEach((c) => console.log(` ✅ ${c}`)); + } + console.log(` Total categories: ${syncResult.total}\n`); + + // Step 2: Determine which files to process + const args = process.argv.slice(2); + const solFiles = getFilesToProcess(args); + + if (solFiles.length === 0) { + console.log('No Solidity files to process'); + return; + } + + console.log(`Found ${solFiles.length} Solidity file(s) to process\n`); + + // Step 3: Process each file + for (const solFile of solFiles) { + await processSolFile(solFile, tracker); + } + + // Step 4: Regenerate all index pages now that docs are created + console.log('📄 Regenerating category index pages...'); + const indexResult = regenerateAllIndexFiles(true); + if (indexResult.regenerated.length > 0) { + console.log(` Regenerated ${indexResult.regenerated.length} index pages`); + } + 
console.log(''); + + // Step 5: Print summary + tracker.printSummary(); + tracker.writeSummaryFile(); +} + +main().catch((error) => { + console.error(`Fatal error: ${error}`); + process.exit(1); +}); diff --git a/.github/scripts/sync-docs-structure.js b/.github/scripts/sync-docs-structure.js new file mode 100644 index 00000000..4b4f833d --- /dev/null +++ b/.github/scripts/sync-docs-structure.js @@ -0,0 +1,210 @@ +#!/usr/bin/env node +/** + * Sync Documentation Structure + * + * Standalone script to mirror the src/ folder structure in website/docs/library/ + * Creates _category_.json files for Docusaurus navigation. + * + * Usage: + * node .github/scripts/sync-docs-structure.js [options] + * + * Options: + * --dry-run Show what would be created without making changes + * --verbose Show detailed output + * --help Show this help message + * + * Examples: + * node .github/scripts/sync-docs-structure.js + * node .github/scripts/sync-docs-structure.js --dry-run + */ + +const fs = require('fs'); +const path = require('path'); + +// Handle running from different directories +const scriptDir = __dirname; +process.chdir(path.join(scriptDir, '../..')); + +const { syncDocsStructure, scanSourceStructure } = require('./generate-docs-utils/category/category-generator'); + +// ============================================================================ +// CLI Parsing +// ============================================================================ + +const args = process.argv.slice(2); +const options = { + dryRun: args.includes('--dry-run'), + verbose: args.includes('--verbose'), + help: args.includes('--help') || args.includes('-h'), +}; + +// ============================================================================ +// Help +// ============================================================================ + +function showHelp() { + console.log(` +Sync Documentation Structure + +Mirrors the src/ folder structure in website/docs/library/ +Creates _category_.json files for Docusaurus navigation. + +Usage: + node .github/scripts/sync-docs-structure.js [options] + +Options: + --dry-run Show what would be created without making changes + --verbose Show detailed output + --help, -h Show this help message + +Examples: + node .github/scripts/sync-docs-structure.js + node .github/scripts/sync-docs-structure.js --dry-run +`); +} + +// ============================================================================ +// Tree Display +// ============================================================================ + +/** + * Display the source structure as a tree + * @param {Map} structure - Structure map from scanSourceStructure + */ +function displayTree(structure) { + console.log('\n📂 Source Structure (src/)\n'); + + // Sort by path for consistent display + const sorted = Array.from(structure.entries()).sort((a, b) => a[0].localeCompare(b[0])); + + // Build tree visualization + const tree = new Map(); + for (const [pathStr] of sorted) { + const parts = pathStr.split('/'); + let current = tree; + for (const part of parts) { + if (!current.has(part)) { + current.set(part, new Map()); + } + current = current.get(part); + } + } + + // Print tree + function printTree(node, prefix = '', isLast = true) { + const entries = Array.from(node.entries()); + entries.forEach(([name, children], index) => { + const isLastItem = index === entries.length - 1; + const connector = isLastItem ? '└── ' : '├── '; + const icon = children.size > 0 ? 
'📁' : '📄'; + console.log(`${prefix}${connector}${icon} ${name}`); + + if (children.size > 0) { + const newPrefix = prefix + (isLastItem ? ' ' : '│ '); + printTree(children, newPrefix, isLastItem); + } + }); + } + + printTree(tree); + console.log(''); +} + +// ============================================================================ +// Dry Run Mode +// ============================================================================ + +/** + * Simulate sync without making changes + * @param {Map} structure - Structure map + */ +function dryRun(structure) { + console.log('\n🔍 Dry Run Mode - No changes will be made\n'); + + const libraryDir = 'website/docs/library'; + let wouldCreate = 0; + let alreadyExists = 0; + + // Check base category + const baseCategoryFile = path.join(libraryDir, '_category_.json'); + if (fs.existsSync(baseCategoryFile)) { + console.log(` ✓ ${baseCategoryFile} (exists)`); + alreadyExists++; + } else { + console.log(` + ${baseCategoryFile} (would create)`); + wouldCreate++; + } + + // Check each category + for (const [relativePath] of structure) { + const categoryFile = path.join(libraryDir, relativePath, '_category_.json'); + if (fs.existsSync(categoryFile)) { + if (options.verbose) { + console.log(` ✓ ${categoryFile} (exists)`); + } + alreadyExists++; + } else { + console.log(` + ${categoryFile} (would create)`); + wouldCreate++; + } + } + + console.log(`\nSummary:`); + console.log(` Would create: ${wouldCreate} category files`); + console.log(` Already exist: ${alreadyExists} category files`); + console.log(`\nRun without --dry-run to apply changes.\n`); +} + +// ============================================================================ +// Main +// ============================================================================ + +function main() { + if (options.help) { + showHelp(); + return; + } + + console.log('📚 Sync Documentation Structure\n'); + console.log('Scanning src/ directory...'); + + const structure = scanSourceStructure(); + console.log(`Found ${structure.size} directories with Solidity files`); + + if (options.verbose || structure.size <= 20) { + displayTree(structure); + } + + if (options.dryRun) { + dryRun(structure); + return; + } + + console.log('Creating documentation structure...\n'); + const result = syncDocsStructure(); + + // Display results + console.log('='.repeat(50)); + console.log('Summary'); + console.log('='.repeat(50)); + console.log(`Created: ${result.created.length} categories`); + console.log(`Existing: ${result.existing.length} categories`); + console.log(`Total: ${result.total} categories`); + + if (result.created.length > 0) { + console.log('\nNewly created:'); + result.created.forEach((c) => console.log(` ✅ ${c}`)); + } + + console.log('\n✨ Done!\n'); + + // Show next steps + console.log('Next steps:'); + console.log(' 1. Run documentation generator to populate content:'); + console.log(' node .github/scripts/generate-docs.js --all\n'); + console.log(' 2. 
Or generate docs for specific files:'); + console.log(' node .github/scripts/generate-docs.js path/to/changed-files.txt\n'); +} + +main(); + diff --git a/.github/scripts/workflow-utils.js b/.github/scripts/workflow-utils.js index 0a254309..d95956d7 100644 --- a/.github/scripts/workflow-utils.js +++ b/.github/scripts/workflow-utils.js @@ -1,4 +1,5 @@ const fs = require('fs'); +const https = require('https'); const path = require('path'); const { execSync } = require('child_process'); @@ -63,18 +64,69 @@ function parsePRNumber(dataFileName) { } /** - * Read report file + * Read file content safely + * @param {string} filePath - Path to file (absolute or relative to workspace) + * @returns {string|null} File content or null if error + */ +function readFileSafe(filePath) { + try { + // If relative path, join with workspace if available + const fullPath = process.env.GITHUB_WORKSPACE && !path.isAbsolute(filePath) + ? path.join(process.env.GITHUB_WORKSPACE, filePath) + : filePath; + + if (!fs.existsSync(fullPath)) { + return null; + } + + return fs.readFileSync(fullPath, 'utf8'); + } catch (error) { + console.error(`Error reading file ${filePath}:`, error.message); + return null; + } +} + +/** + * Read report file (legacy - use readFileSafe for new code) * @param {string} reportFileName - Name of the report file * @returns {string|null} Report content or null if not found */ function readReport(reportFileName) { const reportPath = path.join(process.env.GITHUB_WORKSPACE, reportFileName); + return readFileSafe(reportPath); +} - if (!fs.existsSync(reportPath)) { - return null; +/** + * Ensure directory exists, create if not + * @param {string} dirPath - Directory path + */ +function ensureDir(dirPath) { + if (!fs.existsSync(dirPath)) { + fs.mkdirSync(dirPath, { recursive: true }); } +} - return fs.readFileSync(reportPath, 'utf8'); +/** + * Write file safely + * @param {string} filePath - Path to file (absolute or relative to workspace) + * @param {string} content - Content to write + * @returns {boolean} True if successful + */ +function writeFileSafe(filePath, content) { + try { + // If relative path, join with workspace if available + const fullPath = process.env.GITHUB_WORKSPACE && !path.isAbsolute(filePath) + ? 
path.join(process.env.GITHUB_WORKSPACE, filePath) + : filePath; + + const dir = path.dirname(fullPath); + ensureDir(dir); + fs.writeFileSync(fullPath, content); + return true; + } catch (error) { + console.error(`Error writing file ${filePath}:`, error.message); + return false; + } } /** @@ -129,9 +181,61 @@ async function postOrUpdateComment(github, context, prNumber, body, commentMarke } } +/** + * Sleep for specified milliseconds + * @param {number} ms - Milliseconds to sleep + * @returns {Promise} + */ +function sleep(ms) { + return new Promise(resolve => setTimeout(resolve, ms)); +} + +/** + * Make HTTPS request (promisified) + * @param {object} options - Request options + * @param {string} body - Request body + * @returns {Promise} Response data + */ +function makeHttpsRequest(options, body) { + return new Promise((resolve, reject) => { + const req = https.request(options, (res) => { + let data = ''; + + res.on('data', (chunk) => { + data += chunk; + }); + + res.on('end', () => { + if (res.statusCode >= 200 && res.statusCode < 300) { + try { + resolve(JSON.parse(data)); + } catch (e) { + resolve({ raw: data }); + } + } else { + reject(new Error(`HTTP ${res.statusCode}: ${data}`)); + } + }); + }); + + req.on('error', reject); + + if (body) { + req.write(body); + } + + req.end(); + }); +} + module.exports = { downloadArtifact, parsePRNumber, readReport, - postOrUpdateComment + readFileSafe, + writeFileSafe, + ensureDir, + postOrUpdateComment, + sleep, + makeHttpsRequest, }; \ No newline at end of file diff --git a/.github/workflows/coverage.yml b/.github/workflows/coverage.yml index 258f2f90..442b8c15 100644 --- a/.github/workflows/coverage.yml +++ b/.github/workflows/coverage.yml @@ -27,7 +27,7 @@ jobs: - name: Run coverage run: | - forge coverage --report summary --report lcov + forge coverage --ir-minimum --report summary --report lcov ls -la lcov.info || echo "lcov.info not found" - name: Generate coverage report diff --git a/.github/workflows/docs.yml b/.github/workflows/docs-build.yml similarity index 98% rename from .github/workflows/docs.yml rename to .github/workflows/docs-build.yml index 96ed2116..541d9366 100644 --- a/.github/workflows/docs.yml +++ b/.github/workflows/docs-build.yml @@ -1,4 +1,4 @@ -name: Documentation +name: Build Docs on: pull_request: diff --git a/.github/workflows/docs-generate.yml b/.github/workflows/docs-generate.yml new file mode 100644 index 00000000..edf96990 --- /dev/null +++ b/.github/workflows/docs-generate.yml @@ -0,0 +1,187 @@ +name: Generate Docs + +on: + push: + branches: [main] + paths: + - 'src/**/*.sol' + workflow_dispatch: + inputs: + target_file: + description: 'Process ONLY the specified Solidity file(s) (relative path, e.g. src/contracts/MyFacet.sol or src/facets/A.sol,src/facets/B.sol)' + required: false + type: string + process_all: + description: 'Process ALL Solidity files' + required: false + default: false + type: boolean + skip_enhancement: + description: 'Skip AI Documentation Enhancement' + required: false + default: false + type: boolean + +permissions: + contents: write + pull-requests: write + models: read # Required for GitHub Models API (AI enhancement) + +jobs: + generate-docs: + name: Generate Pages + runs-on: ubuntu-latest + + steps: + - name: Checkout code + uses: actions/checkout@v4 + with: + fetch-depth: 0 + submodules: recursive + + - name: Get changed Solidity files + id: changed-files + run: | + # Prefer explicit target_file when provided via manual dispatch. 
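+          # Otherwise fall back to process_all, or to the .sol files changed in the last commit.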
+ # You can pass a single file or a comma/space-separated list, e.g.: + # src/facets/A.sol,src/facets/B.sol + # src/facets/A.sol src/facets/B.sol + if [ -n "${{ github.event.inputs.target_file }}" ]; then + echo "Processing Solidity file(s) from input:" + echo "${{ github.event.inputs.target_file }}" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=false" >> $GITHUB_OUTPUT + # Normalize comma/space-separated list into one file path per line + echo "${{ github.event.inputs.target_file }}" \ + | tr ',' '\n' \ + | tr ' ' '\n' \ + | sed '/^$/d' \ + > /tmp/changed_sol_files.txt + elif [ "${{ github.event.inputs.process_all }}" == "true" ]; then + echo "Processing all Solidity files (manual trigger)" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=true" >> $GITHUB_OUTPUT + else + # Get list of changed .sol files compared to previous commit + CHANGED_FILES=$(git diff --name-only HEAD~1 HEAD -- 'src/**/*.sol' 2>/dev/null || echo "") + + if [ -z "$CHANGED_FILES" ]; then + echo "No Solidity files changed" + echo "has_changes=false" >> $GITHUB_OUTPUT + else + echo "Changed files:" + echo "$CHANGED_FILES" + echo "has_changes=true" >> $GITHUB_OUTPUT + echo "process_all=false" >> $GITHUB_OUTPUT + + # Save to file for script + echo "$CHANGED_FILES" > /tmp/changed_sol_files.txt + fi + fi + + - name: Setup Node.js + if: steps.changed-files.outputs.has_changes == 'true' + uses: actions/setup-node@v4 + with: + node-version: '20' + + - name: Install Foundry + if: steps.changed-files.outputs.has_changes == 'true' + uses: foundry-rs/foundry-toolchain@v1 + + - name: Generate forge documentation + if: steps.changed-files.outputs.has_changes == 'true' + run: forge doc + + - name: Install template dependencies + if: steps.changed-files.outputs.has_changes == 'true' + working-directory: .github/scripts/generate-docs-utils/templates + run: npm install + + - name: Run documentation generator + if: steps.changed-files.outputs.has_changes == 'true' + env: + # AI Provider Configuration + GOOGLE_AI_API_KEY: ${{ secrets.GOOGLE_AI_API_KEY }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + SKIP_ENHANCEMENT: ${{ github.event.inputs.skip_enhancement || 'false' }} + run: | + if [ "${{ steps.changed-files.outputs.process_all }}" == "true" ]; then + node .github/scripts/generate-docs.js --all + else + node .github/scripts/generate-docs.js /tmp/changed_sol_files.txt + fi + + - name: Check for generated files + if: steps.changed-files.outputs.has_changes == 'true' + id: check-generated + run: | + # Check if any files were generated + if [ -f "docgen-summary.json" ]; then + TOTAL=$(cat docgen-summary.json | jq -r '.totalGenerated // 0' 2>/dev/null || echo "0") + if [ -n "$TOTAL" ] && [ "$TOTAL" -gt "0" ]; then + echo "has_generated=true" >> $GITHUB_OUTPUT + echo "Generated $TOTAL documentation files" + else + echo "has_generated=false" >> $GITHUB_OUTPUT + echo "No documentation files generated" + fi + else + echo "has_generated=false" >> $GITHUB_OUTPUT + fi + + - name: Verify documentation site build + if: steps.check-generated.outputs.has_generated == 'true' + working-directory: website + run: | + npm ci + npm run build + env: + ALGOLIA_APP_ID: 'dummy' + ALGOLIA_API_KEY: 'dummy' + ALGOLIA_INDEX_NAME: 'dummy' + POSTHOG_API_KEY: 'dummy' + continue-on-error: false + + - name: Generate PR body + if: steps.check-generated.outputs.has_generated == 'true' + id: pr-body + run: | + node .github/scripts/generate-docs-utils/pr-body-generator.js docgen-summary.json >> $GITHUB_OUTPUT + + - name: Clean up tmp files 
and stage website pages + if: steps.check-generated.outputs.has_generated == 'true' + run: | + # Remove forge docs folder (if it exists) + if [ -d "docs" ]; then + rm -rf docs + fi + + # Remove summary file (if it exists) + if [ -f "docgen-summary.json" ]; then + rm -f docgen-summary.json + fi + + # Reset any staged changes + git reset + + # Only stage website documentation files (force add in case they're ignored) + # Use library directory (the actual output directory) instead of contracts + if [ -d "website/docs/library" ]; then + git add -f website/docs/library/ + fi + + - name: Create Pull Request + if: steps.check-generated.outputs.has_generated == 'true' + uses: peter-evans/create-pull-request@v5 + with: + token: ${{ secrets.GITHUB_TOKEN }} + title: '[DOCS] Auto-generated Docs Pages' + commit-message: 'docs: auto-generate docs pages from NatSpec' + branch: docs/auto-generated-${{ github.run_number }} + body: ${{ steps.pr-body.outputs.body }} + labels: | + documentation + auto-generated + delete-branch: true + draft: true diff --git a/.gitignore b/.gitignore index 42f88272..521146a4 100644 --- a/.gitignore +++ b/.gitignore @@ -13,24 +13,10 @@ out/ # Docusaurus # Dependencies -docs/node_modules - -# Production -docs/build - -# Generated files -docs/.docusaurus -docs/.cache-loader - -# Misc -docs/.DS_Store -docs/.env -docs/.env.local -docs/.env.development.local -docs/.env.test.local -docs/.env.production.local - -docs/npm-debug.log* -docs/yarn-debug.log* -docs/yarn-error.log* +website/node_modules +.github/scripts/generate-docs-utils/templates/node_modules +# Ignore forge docs output (root level only) +/docs/ +# Ignore Docs generation summary file +docgen-summary.json \ No newline at end of file diff --git a/test/token/ERC20/ERC20/ERC20PermitFacet.t.sol b/test/token/ERC20/ERC20/ERC20PermitFacet.t.sol index 55f0e8e7..965d7436 100644 --- a/test/token/ERC20/ERC20/ERC20PermitFacet.t.sol +++ b/test/token/ERC20/ERC20/ERC20PermitFacet.t.sol @@ -70,7 +70,14 @@ contract ERC20BurnFacetTest is Test { /** * Simulate chain fork (chain ID changes) */ - vm.chainId(originalChainId + 1); + uint256 newChainId = originalChainId + 1; + vm.chainId(newChainId); + + /** + * Force a state change to ensure block.chainid is properly updated + * This is needed when using --ir-minimum in coverage mode + */ + vm.roll(block.number + 1); /** * Domain separator should recalculate with new chain ID @@ -84,13 +91,14 @@ contract ERC20BurnFacetTest is Test { /** * New separator should match expected value for new chain ID + * Use the newChainId variable to ensure consistency across compiler settings */ bytes32 expectedSeparator = keccak256( abi.encode( keccak256("EIP712Domain(string name,string version,uint256 chainId,address verifyingContract)"), keccak256(bytes(TOKEN_NAME)), keccak256("1"), - originalChainId + 1, + newChainId, address(token) ) ); diff --git a/website/README.md b/website/README.md index 23d9d30c..eff415d0 100644 --- a/website/README.md +++ b/website/README.md @@ -28,3 +28,9 @@ npm run build ``` This command generates static content into the `build` directory and can be served using any static contents hosting service. 
+ +## Generate Facets & Modules Documentation + +```bash +npm run generate-docs +``` \ No newline at end of file diff --git a/website/docs/_category_.json b/website/docs/_category_.json index 8226f67a..e945adbe 100644 --- a/website/docs/_category_.json +++ b/website/docs/_category_.json @@ -3,8 +3,9 @@ "position": 1, "link": { "type": "generated-index", - "description": "Learn how to contribute to Compose" + "description": "Learn how to contribute to Compose", + "slug": "/docs" }, "collapsible": true, "collapsed": true -} \ No newline at end of file +} diff --git a/website/docs/contribution/_category_.json b/website/docs/contribution/_category_.json index 42c2e348..03a61040 100644 --- a/website/docs/contribution/_category_.json +++ b/website/docs/contribution/_category_.json @@ -3,8 +3,9 @@ "position": 5, "link": { "type": "generated-index", - "description": "Learn how to contribute to Compose" + "description": "Learn how to contribute to Compose", + "slug": "/docs/contribution" }, "collapsible": true, "collapsed": true -} \ No newline at end of file +} diff --git a/website/docs/design/banned-solidity-features.mdx b/website/docs/design/banned-solidity-features.mdx index 9824c26e..d48a1a5c 100644 --- a/website/docs/design/banned-solidity-features.mdx +++ b/website/docs/design/banned-solidity-features.mdx @@ -4,17 +4,19 @@ title: Banned Solidity Features description: Solidity language features that are banned from Compose facets and modules. --- +import Callout from '@site/src/components/ui/Callout'; + The following Solidity language features are **banned** from Compose facets and modules. Compose restricts certain Solidity features to keep facet and library code **simpler**, **more consistent**, and **easier to reason about**. Because of Compose's architecture, many of these features are either unnecessary or less helpful. -:::note + These restrictions **do not** apply to tests. These restrictions **do not** apply to developers using Compose in their own projects. -::: + #### Banned Solidity Features @@ -36,9 +38,9 @@ contract MyContract is IMyInterface { } ``` -:::tip + If you want inheritance, your facet is probably too large. Split it into smaller facets. Compose replaces inheritance with **on-chain facet composition**. -::: + diff --git a/website/docs/design/design-for-composition.mdx b/website/docs/design/design-for-composition.mdx index 6c1ed3c7..03ec0e48 100644 --- a/website/docs/design/design-for-composition.mdx +++ b/website/docs/design/design-for-composition.mdx @@ -4,6 +4,9 @@ title: Design for Composition description: How to design Compose facets and modules for composition. --- +import Callout from '@site/src/components/ui/Callout'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + Here are the guidelines and rules for creating composable facets. Compose replaces source-code inheritance with on-chain composition. Facets are the building blocks; diamonds wire them together. @@ -46,9 +49,9 @@ We focus on building **small, independent, and easy-to-read facets**. Each facet 8. A facet that adds new storage variables must define its own diamond storage struct. 9. Never add new variables to an existing struct. -:::info Important + Maintain the same order of variables in structs when reusing them across facets or modules. Unused variables may only be removed from the end of a struct. -::: + ### Exceptions @@ -98,14 +101,14 @@ Only unused variables at the **end** of a struct may be safely removed. 
In this Here is the final struct storage code for `ERC20PermitFacet`: -```solidity -/** + +{`/** * @notice Storage slot identifier for ERC20 (reused to access token data). */ bytes32 constant ERC20_STORAGE_POSITION = keccak256("compose.erc20"); /** - * @notice Storage struct for ERC20 but with `symbol` removed. + * @notice Storage struct for ERC20 but with \`symbol\` removed. * @dev Reused struct definition with unused variables at the end removed * @custom:storage-location erc8042:compose.erc20 */ @@ -150,8 +153,8 @@ function getStorage() internal pure returns (ERC20PermitStorage storage s) { assembly { s.slot := position } -} -``` +}`} + #### Summary: How This Example Follows the Guide - **Reusing storage struct**: The `ERC20Storage` struct is copied from `ERC20Facet` and reused at the same location in storage `keccak256("compose.erc20")`, ensuring both facets access the same ERC20 token data. This demonstrates how facets can share storage. @@ -168,8 +171,8 @@ function getStorage() internal pure returns (ERC20PermitStorage storage s) { Here's a complete example showing how to correctly extend `ERC20Facet` by creating a new `ERC20StakingFacet` that adds staking functionality: -```solidity -/** + +{`/** * SPDX-License-Identifier: MIT */ pragma solidity >=0.8.30; @@ -218,7 +221,7 @@ contract ERC20StakingFacet { /** * @notice Storage struct for ERC20 * @dev This struct is from ERC20Facet. - * `balanceOf` is the only variable used in this struct. + * \`balanceOf\` is the only variable used in this struct. * All variables after it are removed. * @custom:storage-location erc8042:compose.erc20 */ @@ -347,8 +350,8 @@ contract ERC20StakingFacet { function getStakingStartTime(address _account) external view returns (uint256) { return getStorage().stakingStartTimes[_account]; } -} -``` +}`} + #### Summary: How This Example Follows the Guide @@ -370,7 +373,6 @@ This example demonstrates proper facet extension by: *** -:::info Conclusion - + This level of composability strikes the right balance: it enables organized, modular, and understandable on-chain smart contract systems. -::: + diff --git a/website/docs/design/index.mdx b/website/docs/design/index.mdx index 64c8af64..5283da54 100644 --- a/website/docs/design/index.mdx +++ b/website/docs/design/index.mdx @@ -7,6 +7,7 @@ sidebar_class_name: hidden import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; This section contains the guidelines and rules for developing new facets and Solidity libraries in **Compose**. We focus on building small, independent, and easy-to-understand facets. Each facet is designed to be deployed once, then reused and composed seamlessly with others to form complete smart contract systems. @@ -51,7 +52,7 @@ This section contains the guidelines and rules for developing new facets and Sol /> -:::warning[Early Development] + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. -::: \ No newline at end of file + \ No newline at end of file diff --git a/website/docs/design/repeat-yourself.mdx b/website/docs/design/repeat-yourself.mdx index 62b50097..b7ab7676 100644 --- a/website/docs/design/repeat-yourself.mdx +++ b/website/docs/design/repeat-yourself.mdx @@ -4,6 +4,8 @@ title: Repeat Yourself description: Repeat yourself when it makes your code easier to read and understand. 
--- +import Callout from '@site/src/components/ui/Callout'; + The DRY principle — *Don't Repeat Yourself* — is a well-known rule in software development. We **intentionally** break that rule. @@ -15,4 +17,6 @@ Repetition can make smart contracts easier to read and reason about. Instead of However, DRY still has its place. For example, when a large block of code performs a complete, self-contained action and is used identically in multiple locations, moving it into an internal function can improve readability. For example, Compose's ERC-721 implementation uses an `internalTransferFrom` function to eliminate duplication while keeping the code easy to read and understand. -**Guideline:** Repeat yourself when it makes your code easier to read and understand. Use DRY sparingly and only to make code more readable. \ No newline at end of file + +Repeat yourself when it makes your code easier to read and understand. Use DRY sparingly and only to make code more readable. + \ No newline at end of file diff --git a/website/docs/foundations/composable-facets.mdx b/website/docs/foundations/composable-facets.mdx index f0282259..823693e6 100644 --- a/website/docs/foundations/composable-facets.mdx +++ b/website/docs/foundations/composable-facets.mdx @@ -4,6 +4,8 @@ title: Composable Facets description: Mix and match facets to build complex systems from simple, interoperable building blocks. --- +import Callout from '@site/src/components/ui/Callout'; + The word **"composable"** means *able to be combined with other parts to form a whole*. In **Compose**, facets are designed to be **composable**. They're built to interoperate seamlessly with other facets inside the same diamond. @@ -71,9 +73,9 @@ Diamond ArtCollection { } ``` */} -:::tip[Key Insight] + On-chain facets are the **building blocks** of Compose. Like LEGO bricks, they're designed to snap together in different configurations to build exactly what you need. -::: + ## Composability Benefits diff --git a/website/docs/foundations/custom-facets.mdx b/website/docs/foundations/custom-facets.mdx index 0aa65696..42421c31 100644 --- a/website/docs/foundations/custom-facets.mdx +++ b/website/docs/foundations/custom-facets.mdx @@ -4,6 +4,8 @@ title: "Custom Functionality: Compose Your Own Facets" description: "Build your own facets that work seamlessly with existing Compose Functionality." --- +import Callout from '@site/src/components/ui/Callout'; + Many projects need custom functionality beyond the standard facets. Compose is designed for this — you can build and integrate your own facets that work seamlessly alongside existing Compose facets. @@ -41,12 +43,12 @@ contract GameNFTFacet { } } ``` -:::tip[Key Insight] + Your custom `GameNFTFacet` and the standard `ERC721Facet` both operate on the **same storage** within your diamond. This shared-storage architecture is what makes composition possible. -::: + -:::warning[Early State Development] + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. 
-::: + diff --git a/website/docs/foundations/diamond-contracts.mdx b/website/docs/foundations/diamond-contracts.mdx index fecef74c..fa112367 100644 --- a/website/docs/foundations/diamond-contracts.mdx +++ b/website/docs/foundations/diamond-contracts.mdx @@ -5,6 +5,7 @@ description: "Understand Diamonds from the ground up—facets, storage, delegati --- import SvgThemeRenderer from '@site/src/components/theme/SvgThemeRenderer'; +import Callout from '@site/src/components/ui/Callout'; A **diamond contract** is a smart contract that is made up of multiple parts instead of one large block of code. The diamond exists at **one address** and holds **all of the contract's storage**, but it uses separate smart contracts called **facets** to provide its functionality. @@ -12,9 +13,9 @@ Users interact only with the **diamond**, but the diamond's features come from i Because facets can be added, replaced, or removed, a diamond can grow and evolve over time **without changing its address** and without redeploying the entire system. -:::note[In Simple Terms] + A diamond contract is a smart contract made from multiple small building blocks (facets), allowing it to be flexible, organized, and able to grow over time. -::: + A diamond has: - One address diff --git a/website/docs/foundations/index.mdx b/website/docs/foundations/index.mdx index 4859a2e3..de081ef2 100644 --- a/website/docs/foundations/index.mdx +++ b/website/docs/foundations/index.mdx @@ -8,6 +8,7 @@ import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; import DocSubtitle from '@site/src/components/docs/DocSubtitle'; import Icon from '@site/src/components/ui/Icon'; import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Callout from '@site/src/components/ui/Callout'; Compose is a new approach to smart contract development that changes how developers build and deploy smart contract systems. This section introduces the core concepts that make Compose unique. @@ -60,7 +61,12 @@ import CalloutBox from '@site/src/components/ui/CalloutBox'; /> -:::warning[Early Development] + + +Don't rush through these concepts. Taking time to understand the foundations will make everything else much easier. + + + Compose is still in early development and currently available only to contributors. It is not **production-ready** — use it in test or development environments only. -::: \ No newline at end of file + \ No newline at end of file diff --git a/website/docs/foundations/onchain-contract-library.mdx b/website/docs/foundations/onchain-contract-library.mdx index 9379b0a4..53a6a971 100644 --- a/website/docs/foundations/onchain-contract-library.mdx +++ b/website/docs/foundations/onchain-contract-library.mdx @@ -5,6 +5,7 @@ description: Compose provides a set of reusable on-chain contracts that already --- import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Callout from '@site/src/components/ui/Callout'; **Compose takes a different approach.** @@ -31,11 +32,11 @@ This reduces duplication, improves upgradeability, and makes smart contract syst For your next project, instead of deploying new contracts, simply **use the existing on-chain contracts** provided by Compose. -:::tip[Key Insight] + Compose is a general purpose **on-chain** smart contract library. -::: + -:::info[In Development] + Compose is still in early development, and its smart contracts haven't been deployed yet. We're actively building—and if this vision excites you, we'd love for you to join us. 
-::: + diff --git a/website/docs/foundations/overview.mdx b/website/docs/foundations/overview.mdx deleted file mode 100644 index 0c9fb410..00000000 --- a/website/docs/foundations/overview.mdx +++ /dev/null @@ -1,85 +0,0 @@ ---- -sidebar_position: 10 -title: Overview -description: Complete guide to all 30+ components with live examples, detailed documentation, and usage -draft: true ---- - -import DocHero from '@site/src/components/docs/DocHero'; -import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; -import Callout from '@site/src/components/ui/Callout'; - - - -## Core Concepts - -

- Understanding these fundamental concepts will help you build robust, scalable smart contract systems with Compose. -

- - - } - href="/docs/foundations/authentication" - /> - } - href="/docs/foundations/facets-and-modules" - /> - } - href="/docs/" - /> - } - href="/docs/" - /> - - -## Advanced Topics - - - } - href="/docs/" - /> - } - href="/docs/" - /> - - - -We recommend starting with **Facets & Modules** to understand the core architecture, then moving to **Storage Patterns** to see how it all works together. - - -## Why These Matter - -The concepts in this section form the foundation of everything you'll build with Compose: - -- **Authentication** ensures your contracts have proper access control -- **Facets & Modules** explain how to structure your code -- **Diamond Standard** provides the underlying architecture -- **Storage Patterns** enable the shared state that makes it all work - - -Don't rush through these concepts. Taking time to understand the foundations will make everything else much easier and prevent common mistakes. - - diff --git a/website/docs/foundations/reusable-facet-logic.mdx b/website/docs/foundations/reusable-facet-logic.mdx index 10fd0fdf..fc511550 100644 --- a/website/docs/foundations/reusable-facet-logic.mdx +++ b/website/docs/foundations/reusable-facet-logic.mdx @@ -5,6 +5,7 @@ description: Deploy once, reuse everywhere. Compose facets are shared across tho --- import DiamondFacetsSVG from '@site/static/img/svg/compose_diamond_facets.svg' +import Callout from '@site/src/components/ui/Callout'; You might be wondering: **How can I create a new project without deploying new smart contracts?** @@ -53,13 +54,13 @@ If 1,000 projects use the same `ERC20Facet`: - **Millions in gas costs avoided** - **1,000 projects** benefit from the same audited, battle-tested code -:::tip[Key Insight] + Many diamond contracts can be deployed that **reuse the same on-chain facets**. -::: + -:::tip[Key Insight] + Each diamond manages **its own storage data** by using the code from facets. -::: + diff --git a/website/docs/foundations/solidity-modules.mdx b/website/docs/foundations/solidity-modules.mdx index c6a6935d..9cf18d13 100644 --- a/website/docs/foundations/solidity-modules.mdx +++ b/website/docs/foundations/solidity-modules.mdx @@ -4,6 +4,8 @@ title: Solidity Modules description: Description of what Solidity modules are and how they are used in Compose. --- +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + Solidity **modules** are Solidity files whose top-level code lives *outside* of contracts and Solidity libraries. They contain reusable logic that gets pulled into other contracts at compile time. 
@@ -33,8 +35,8 @@ Compose uses clear naming patterns to distinguish Solidity file types: Here is an example of a Solidity module that implements contract ownership functionality: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity >=0.8.30; /* @@ -87,13 +89,13 @@ function requireOwner() view { if (getStorage().owner != msg.sender) { revert OwnerUnauthorizedAccount(); } -} -``` +}`} + Here is an example of a diamond contract that uses Solidity modules to implement ERC-2535 Diamonds: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity >=0.8.30; import "../DiamondMod.sol" as DiamondMod; @@ -146,7 +148,7 @@ contract ExampleDiamond { } receive() external payable {} -} -``` +}`} + diff --git a/website/docs/getting-started/_category_.json b/website/docs/getting-started/_category_.json index 74b10c34..f17bd463 100644 --- a/website/docs/getting-started/_category_.json +++ b/website/docs/getting-started/_category_.json @@ -3,9 +3,9 @@ "position": 3, "link": { "type": "generated-index", - "description": "Learn how to install and configure Compose for your smart contract projects." + "description": "Learn how to install and configure Compose for your smart contract projects.", + "slug": "/docs/getting-started" }, "collapsible": true, "collapsed": true } - diff --git a/website/docs/getting-started/installation.md b/website/docs/getting-started/installation.md index b16802e2..32926f3b 100644 --- a/website/docs/getting-started/installation.md +++ b/website/docs/getting-started/installation.md @@ -2,6 +2,8 @@ sidebar_position: 1 --- +import Callout from '@site/src/components/ui/Callout'; + # Installation Get up and running with Compose in just a few minutes. @@ -123,7 +125,7 @@ Having trouble with installation? - Ask in **[Discord](https://discord.gg/compose)** - Open an **[issue on GitHub](https://github.com/Perfect-Abstractions/Compose/issues)** -:::tip Development Environment + We recommend using VSCode with the **Solidity** extension by Juan Blanco for the best development experience. -::: + diff --git a/website/docs/getting-started/quick-start.md b/website/docs/getting-started/quick-start.md index 318fd0e6..3e3b4956 100644 --- a/website/docs/getting-started/quick-start.md +++ b/website/docs/getting-started/quick-start.md @@ -3,6 +3,8 @@ sidebar_position: 2 draft: true --- +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + # Quick Start Let's build your first diamond using Compose facets in under 5 minutes! 
🚀 @@ -52,8 +54,8 @@ contract MyTokenDiamond is Diamond { Create `script/DeployMyDiamond.s.sol`: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity ^0.8.24; import {Script} from "forge-std/Script.sol"; @@ -108,8 +110,8 @@ contract DeployMyDiamond is Script { console.log("Diamond deployed at:", address(diamond)); console.log("ERC20Facet deployed at:", address(erc20Facet)); } -} -``` +}`} + ## Step 4: Create Initialization Facet @@ -150,8 +152,8 @@ contract TokenInitFacet { Create `test/MyTokenDiamond.t.sol`: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity ^0.8.24; import {Test} from "forge-std/Test.sol"; @@ -201,8 +203,8 @@ contract MyTokenDiamondTest is Test { // Assert assertEq(token.balanceOf(user2), 50 ether); } -} -``` +}`} + ## Step 6: Run and Deploy diff --git a/website/docs/getting-started/your-first-diamond.md b/website/docs/getting-started/your-first-diamond.md index 73b91dbb..2a87a5a5 100644 --- a/website/docs/getting-started/your-first-diamond.md +++ b/website/docs/getting-started/your-first-diamond.md @@ -3,6 +3,9 @@ sidebar_position: 3 draft: true --- +import Callout from '@site/src/components/ui/Callout'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; + # Your First Diamond In this guide, you'll learn how to create a diamond from scratch and understand every piece of the architecture. @@ -251,8 +254,8 @@ The diamond uses these selectors to route calls to the correct facet. Here's a complete deployment script: -```solidity -// SPDX-License-Identifier: MIT + +{`// SPDX-License-Identifier: MIT pragma solidity ^0.8.24; import "forge-std/Script.sol"; @@ -321,8 +324,8 @@ contract DeployDiamond is Script { selectors[8] = ERC20Facet.transferFrom.selector; return selectors; } -} -``` +}`} + ## Testing Your Diamond @@ -365,7 +368,7 @@ You now understand how to build a diamond from scratch! Continue learning: - **[Creating Custom Facets](/)** - Build your own facets - **[Upgrading Diamonds](/)** - Learn about diamond cuts -:::tip Pro Tip + In production, consider using a multi-sig wallet or DAO for the diamond owner to ensure secure upgrades. 
-::: + diff --git a/website/docs/library/_category_.json b/website/docs/library/_category_.json new file mode 100644 index 00000000..04125e1e --- /dev/null +++ b/website/docs/library/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Library", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/index" + } +} diff --git a/website/docs/library/access/AccessControl/AccessControlFacet.mdx b/website/docs/library/access/AccessControl/AccessControlFacet.mdx new file mode 100644 index 00000000..600d817e --- /dev/null +++ b/website/docs/library/access/AccessControl/AccessControlFacet.mdx @@ -0,0 +1,563 @@ +--- +sidebar_position: 2 +title: "AccessControlFacet" +description: "Role-based access control for diamonds" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControl/AccessControlFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Role-based access control for diamonds + + + +- Manages roles and account permissions within a diamond. +- Supports role hierarchy via `setRoleAdmin`. +- Provides `grantRole`, `revokeRole`, `grantRoleBatch`, and `revokeRoleBatch` for flexible permission management. +- Includes `hasRole` and `requireRole` for permission checks. + + +## Overview + +This facet implements role-based access control for Compose diamonds. It exposes functions to manage roles, grant and revoke permissions, and check account access. Developers integrate this facet to enforce authorization policies within their diamond architecture, ensuring only authorized accounts can perform specific actions. + +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +### State Variables + + + +## Functions + +### hasRole + +Returns if an account has a role. + + +{`function hasRole(bytes32 _role, address _account) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### requireRole + +Checks if an account has a required role. Reverts with AccessControlUnauthorizedAccount If the account does not have the role. + + +{`function requireRole(bytes32 _role, address _account) external view;`} + + +**Parameters:** + + + +--- +### getRoleAdmin + +Returns the admin role for a role. + + +{`function getRoleAdmin(bytes32 _role) external view returns (bytes32);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### setRoleAdmin + +Sets the admin role for a role. Emits a RoleAdminChanged event. 
Reverts with AccessControlUnauthorizedAccount If the caller is not the current admin of the role. + + +{`function setRoleAdmin(bytes32 _role, bytes32 _adminRole) external;`} + + +**Parameters:** + + + +--- +### grantRole + +Grants a role to an account. Emits a RoleGranted event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function grantRole(bytes32 _role, address _account) external;`} + + +**Parameters:** + + + +--- +### revokeRole + +Revokes a role from an account. Emits a RoleRevoked event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function revokeRole(bytes32 _role, address _account) external;`} + + +**Parameters:** + + + +--- +### grantRoleBatch + +Grants a role to multiple accounts in a single transaction. Emits a RoleGranted event for each newly granted account. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function grantRoleBatch(bytes32 _role, address[] calldata _accounts) external;`} + + +**Parameters:** + + + +--- +### revokeRoleBatch + +Revokes a role from multiple accounts in a single transaction. Emits a RoleRevoked event for each account the role is revoked from. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function revokeRoleBatch(bytes32 _role, address[] calldata _accounts) external;`} + + +**Parameters:** + + + +--- +### renounceRole + +Renounces a role from the caller. Emits a RoleRevoked event. Reverts with AccessControlUnauthorizedSender If the caller is not the account to renounce the role from. + + +{`function renounceRole(bytes32 _role, address _account) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when the admin role for a role is changed. +
+ +
+ Signature: + +{`event RoleAdminChanged(bytes32 indexed _role, bytes32 indexed _previousAdminRole, bytes32 indexed _newAdminRole);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a role is granted to an account. +
+ +
+ Signature: + +{`event RoleGranted(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a role is revoked from an account. +
+ +
+ Signature: + +{`event RoleRevoked(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+ +
+ Thrown when the sender is not the account whose role is being renounced.
+
+ +
+ Signature: + +error AccessControlUnauthorizedSender(address _sender, address _account); + +
+
+
+ + + + +## Best Practices + + +- Initialize roles and their admin roles during diamond deployment. +- Use `grantRole` and `revokeRole` for individual permission changes. +- Leverage `grantRoleBatch` and `revokeRoleBatch` for efficient bulk operations. +- Ensure the caller has the necessary admin role before granting or revoking roles. + + +## Security Considerations + + +All state-changing functions (`setRoleAdmin`, `grantRole`, `revokeRole`, `grantRoleBatch`, `revokeRoleBatch`, `renounceRole`) must be protected by appropriate access control mechanisms, typically enforced by the caller's role. The `renounceRole` function should only be callable by the account whose role is being renounced. Input validation for account addresses and role bytes is critical. + + +
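+
+## Usage Example
+
+A minimal sketch of calling this facet through a diamond. The `IAccessControlFacet` interface, the `MINTER_ROLE` constant, and the diamond address are illustrative; only the function signatures come from this facet.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity >=0.8.30;
+
+/**
+ * Illustrative interface that lists only the AccessControlFacet
+ * functions used below; signatures match this page.
+ */
+interface IAccessControlFacet {
+    function grantRole(bytes32 _role, address _account) external;
+    function hasRole(bytes32 _role, address _account) external view returns (bool);
+}
+
+contract MinterSetup {
+    /**
+     * Example role identifier (illustrative).
+     */
+    bytes32 constant MINTER_ROLE = keccak256("MINTER_ROLE");
+
+    /**
+     * Grants MINTER_ROLE on a diamond that includes AccessControlFacet.
+     * This contract must hold the admin role for MINTER_ROLE on that diamond,
+     * otherwise grantRole reverts with AccessControlUnauthorizedAccount.
+     */
+    function addMinter(address _diamond, address _account) external {
+        IAccessControlFacet(_diamond).grantRole(MINTER_ROLE, _account);
+    }
+
+    /**
+     * Reads the role assignment back through the same diamond.
+     */
+    function isMinter(address _diamond, address _account) external view returns (bool) {
+        return IAccessControlFacet(_diamond).hasRole(MINTER_ROLE, _account);
+    }
+}
+```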
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControl/AccessControlMod.mdx b/website/docs/library/access/AccessControl/AccessControlMod.mdx new file mode 100644 index 00000000..77c03292 --- /dev/null +++ b/website/docs/library/access/AccessControl/AccessControlMod.mdx @@ -0,0 +1,486 @@ +--- +sidebar_position: 1 +title: "AccessControlMod" +description: "Manage roles and permissions within a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControl/AccessControlMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manage roles and permissions within a diamond + + + +- All functions are `internal` for integration into custom facets. +- Utilizes the diamond storage pattern for shared state management. +- Compatible with ERC-2535 diamonds. +- No external dependencies, promoting composability. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing role-based access control within a Compose diamond. Facets can import this module to grant, revoke, and check roles using shared diamond storage. This pattern ensures consistent permission management across all facets interacting with the same storage. + +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +### State Variables + + + +## Functions + +### getStorage + +Returns the storage for the AccessControl. + + +{`function getStorage() pure returns (AccessControlStorage storage _s);`} + + +**Returns:** + + + +--- +### grantRole + +function to grant a role to an account. + + +{`function grantRole(bytes32 _role, address _account) returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### hasRole + +function to check if an account has a role. + + +{`function hasRole(bytes32 _role, address _account) view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### requireRole + +function to check if an account has a required role. Reverts with AccessControlUnauthorizedAccount If the account does not have the role. + + +{`function requireRole(bytes32 _role, address _account) view;`} + + +**Parameters:** + + + +--- +### revokeRole + +function to revoke a role from an account. + + +{`function revokeRole(bytes32 _role, address _account) returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### setRoleAdmin + +function to set the admin role for a role. 
+
+
+{`function setRoleAdmin(bytes32 _role, bytes32 _adminRole);`}
+
+
+**Parameters:**
+
+
+
+## Events
+
+
+
+ Emitted when the admin role for a role is changed. +
+ +
+ Signature: + +{`event RoleAdminChanged(bytes32 indexed _role, bytes32 indexed _previousAdminRole, bytes32 indexed _newAdminRole);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a role is granted to an account. +
+ +
+ Signature: + +{`event RoleGranted(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a role is revoked from an account. +
+ +
+ Signature: + +{`event RoleRevoked(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+
+ + + + +## Best Practices + + +- Call `requireRole` to enforce access control checks before executing sensitive functions. +- Ensure that your facet's storage layout is compatible with `AccessControlStorage` to prevent collisions. +- Handle the `AccessControlUnauthorizedAccount` error for predictable revert behavior. + + +## Integration Notes + + +This module uses diamond storage at the `STORAGE_POSITION` defined by `keccak256("compose.accesscontrol")`. All state modifications and reads are performed against the `AccessControlStorage` struct within this shared storage slot. Changes made by any facet using this module are immediately visible to all other facets accessing the same storage position. + + +
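+
+## Usage Example
+
+A minimal sketch of a custom facet that imports this module and guards a function with `requireRole`. The facet name, role constant, storage struct, and import path are illustrative; the module calls match the functions documented above.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity >=0.8.30;
+
+/**
+ * Illustrative import path; adjust it to where AccessControlMod.sol
+ * lives in your project.
+ */
+import "./AccessControlMod.sol" as AccessControlMod;
+
+contract ConfigFacet {
+    /**
+     * Example role identifier and storage slot (illustrative).
+     */
+    bytes32 constant CONFIG_ADMIN_ROLE = keccak256("CONFIG_ADMIN_ROLE");
+    bytes32 constant CONFIG_STORAGE_POSITION = keccak256("example.config");
+
+    /**
+     * Illustrative diamond storage struct for this facet.
+     */
+    struct ConfigStorage {
+        uint256 fee;
+    }
+
+    function getConfigStorage() internal pure returns (ConfigStorage storage s) {
+        bytes32 position = CONFIG_STORAGE_POSITION;
+        assembly {
+            s.slot := position
+        }
+    }
+
+    /**
+     * Reverts with AccessControlUnauthorizedAccount unless the caller
+     * holds CONFIG_ADMIN_ROLE in the diamond's shared storage.
+     */
+    function setFee(uint256 _fee) external {
+        AccessControlMod.requireRole(CONFIG_ADMIN_ROLE, msg.sender);
+        getConfigStorage().fee = _fee;
+    }
+}
+```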
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControl/_category_.json b/website/docs/library/access/AccessControl/_category_.json new file mode 100644 index 00000000..1504700a --- /dev/null +++ b/website/docs/library/access/AccessControl/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Access Control", + "position": 3, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/AccessControl/index" + } +} diff --git a/website/docs/library/access/AccessControl/index.mdx b/website/docs/library/access/AccessControl/index.mdx new file mode 100644 index 00000000..85e277b1 --- /dev/null +++ b/website/docs/library/access/AccessControl/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Access Control" +description: "Role-based access control (RBAC) pattern." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Role-based access control (RBAC) pattern. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/access/AccessControlPausable/AccessControlPausableFacet.mdx b/website/docs/library/access/AccessControlPausable/AccessControlPausableFacet.mdx new file mode 100644 index 00000000..fcb8db4d --- /dev/null +++ b/website/docs/library/access/AccessControlPausable/AccessControlPausableFacet.mdx @@ -0,0 +1,357 @@ +--- +sidebar_position: 2 +title: "AccessControlPausableFacet" +description: "Manage role pausing and unpausing within a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControlPausable/AccessControlPausableFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manage role pausing and unpausing within a diamond + + + +- Enables temporary disabling of specific roles. +- Integrates role pausing with the diamond proxy pattern. +- Emits `RolePaused` and `RoleUnpaused` events for state tracking. +- Reverts with `AccessControlUnauthorizedAccount` and `AccessControlRolePaused` custom errors. + + +## Overview + +This facet provides functionality to pause and unpause specific roles within a Compose diamond. It allows authorized administrators to temporarily disable role execution, enhancing control during critical operations. Calls are routed through the diamond proxy, integrating seamlessly with the diamond's access control and upgradeability. 
+ +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +--- +### AccessControlPausableStorage + + +{`struct AccessControlPausableStorage { + mapping(bytes32 role => bool paused) pausedRoles; +}`} + + +### State Variables + + + +## Functions + +### isRolePaused + +Returns if a role is paused. + + +{`function isRolePaused(bytes32 _role) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### pauseRole + +Temporarily disables a role, preventing all accounts from using it. Only the admin of the role can pause it. Emits a RolePaused event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function pauseRole(bytes32 _role) external;`} + + +**Parameters:** + + + +--- +### unpauseRole + +Re-enables a role that was previously paused. Only the admin of the role can unpause it. Emits a RoleUnpaused event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function unpauseRole(bytes32 _role) external;`} + + +**Parameters:** + + + +--- +### requireRoleNotPaused + +Checks if an account has a role and if the role is not paused. - Reverts with AccessControlUnauthorizedAccount If the account does not have the role. - Reverts with AccessControlRolePaused If the role is paused. + + +{`function requireRoleNotPaused(bytes32 _role, address _account) external view;`} + + +**Parameters:** + + + +## Events + + + +
+ Event emitted when a role is paused. +
+ +
+ Signature: + +{`event RolePaused(bytes32 indexed _role, address indexed _account);`} + +
+ +
+ Parameters: + +
+
+ +
+ Event emitted when a role is unpaused. +
+ +
+ Signature: + +{`event RoleUnpaused(bytes32 indexed _role, address indexed _account);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+ +
+ Thrown when a role is paused and an operation requiring that role is attempted. +
+ +
+ Signature: + +error AccessControlRolePaused(bytes32 _role); + +
+
+
+ + + + +## Best Practices + + +- Initialize roles and their pause status during diamond deployment. +- Ensure only authorized administrators can call `pauseRole` and `unpauseRole`. +- Utilize `requireRoleNotPaused` to enforce pause state before executing role-dependent logic. + + +## Security Considerations + + +The `pauseRole` and `unpauseRole` functions are restricted to role administrators, preventing unauthorized pausing or unpausing. The `requireRoleNotPaused` function ensures that calls are only permitted when the specified role is not paused, mitigating risks associated with executing functions during critical periods. Input validation is handled by custom errors. + + +
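+
+## Usage Example
+
+A minimal sketch of pausing a role through a diamond that includes this facet. The `IAccessControlPausableFacet` interface, the `MINTER_ROLE` constant, and the diamond address are illustrative; the signatures match this facet.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity >=0.8.30;
+
+/**
+ * Illustrative interface that lists only the functions used below.
+ */
+interface IAccessControlPausableFacet {
+    function pauseRole(bytes32 _role) external;
+    function unpauseRole(bytes32 _role) external;
+    function isRolePaused(bytes32 _role) external view returns (bool);
+}
+
+contract EmergencyControls {
+    bytes32 constant MINTER_ROLE = keccak256("MINTER_ROLE");
+
+    /**
+     * This contract must be the admin of MINTER_ROLE on the diamond,
+     * otherwise pauseRole reverts with AccessControlUnauthorizedAccount.
+     */
+    function haltMinting(address _diamond) external {
+        IAccessControlPausableFacet(_diamond).pauseRole(MINTER_ROLE);
+    }
+
+    function resumeMinting(address _diamond) external {
+        IAccessControlPausableFacet(_diamond).unpauseRole(MINTER_ROLE);
+    }
+
+    function mintingHalted(address _diamond) external view returns (bool) {
+        return IAccessControlPausableFacet(_diamond).isRolePaused(MINTER_ROLE);
+    }
+}
+```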
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControlPausable/AccessControlPausableMod.mdx b/website/docs/library/access/AccessControlPausable/AccessControlPausableMod.mdx new file mode 100644 index 00000000..0a7ba3ef --- /dev/null +++ b/website/docs/library/access/AccessControlPausable/AccessControlPausableMod.mdx @@ -0,0 +1,394 @@ +--- +sidebar_position: 1 +title: "AccessControlPausableMod" +description: "Manage paused roles using diamond storage" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControlPausable/AccessControlPausableMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manage paused roles using diamond storage + + + +- Internal functions for pausing and unpausing roles. +- Uses diamond storage pattern (EIP-8042) for shared state management. +- Reverts with specific errors (`AccessControlRolePaused`, `AccessControlUnauthorizedAccount`) on failed checks. +- Compatible with ERC-2535 diamonds. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions to pause and unpause specific roles within a diamond. Facets can import this module to enforce role-based access control, ensuring that certain actions are temporarily blocked for a given role. Changes to role pause status are immediately visible across all facets interacting with the shared diamond storage. + +--- + +## Storage + +### AccessControlPausableStorage + + +{`struct AccessControlPausableStorage { + mapping(bytes32 role => bool paused) pausedRoles; +}`} + + +--- +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +### State Variables + + + +## Functions + +### getAccessControlStorage + +Returns the storage for AccessControl. + + +{`function getAccessControlStorage() pure returns (AccessControlStorage storage s);`} + + +**Returns:** + + + +--- +### getStorage + +Returns the storage for AccessControlPausable. + + +{`function getStorage() pure returns (AccessControlPausableStorage storage s);`} + + +**Returns:** + + + +--- +### isRolePaused + +function to check if a role is paused. + + +{`function isRolePaused(bytes32 _role) view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### pauseRole + +function to pause a role. + + +{`function pauseRole(bytes32 _role) ;`} + + +**Parameters:** + + + +--- +### requireRoleNotPaused + +function to check if an account has a role and if the role is not paused. 
**Notes:** - Reverts with AccessControlUnauthorizedAccount if the account does not have the role. - Reverts with AccessControlRolePaused if the role is paused.
+
+
+{`function requireRoleNotPaused(bytes32 _role, address _account) view;`}
+
+
+**Parameters:**
+
+
+
+---
+### unpauseRole
+
+function to unpause a role.
+
+
+{`function unpauseRole(bytes32 _role);`}
+
+
+**Parameters:**
+
+
+
+## Events
+
+
+
+ Event emitted when a role is paused. +
+ +
+ Signature: + +{`event RolePaused(bytes32 indexed _role, address indexed _account);`} + +
+ +
+ Parameters: + +
+
+ +
+ Event emitted when a role is unpaused. +
+ +
+ Signature: + +{`event RoleUnpaused(bytes32 indexed _role, address indexed _account);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when a role is paused and an operation requiring that role is attempted. +
+ +
+ Signature: + +error AccessControlRolePaused(bytes32 _role); + +
+
+ +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+
+
+
+
+
+## Best Practices
+
+
+- Call `pauseRole` and `unpauseRole` only when necessary to temporarily restrict role functionality.
+- Use `requireRoleNotPaused` to enforce active role checks; it reverts with `AccessControlRolePaused` if the role is paused.
+- Ensure every facet that manages or checks paused roles imports `AccessControlPausableMod` so it reads and writes the same shared storage.
+
+
+## Integration Notes
+
+
+This module uses diamond storage at `ACCESS_CONTROL_STORAGE_POSITION`, identified by `keccak256("compose.accesscontrol")`. The `AccessControlPausableStorage` struct manages the pause state for roles. All functions are internal and interact directly with this shared storage, so any facet that imports and calls functions on this module sees the updated pause status immediately.
+
+
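+
+## Usage Example
+
+A minimal sketch of a custom facet that imports this module and uses `requireRoleNotPaused` as a guard. The facet name, role constant, and import path are illustrative; the module call matches the function documented above.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity >=0.8.30;
+
+/**
+ * Illustrative import path; adjust it to where
+ * AccessControlPausableMod.sol lives in your project.
+ */
+import "./AccessControlPausableMod.sol" as AccessControlPausableMod;
+
+contract RewardsFacet {
+    /**
+     * Example role identifier (illustrative).
+     */
+    bytes32 constant DISTRIBUTOR_ROLE = keccak256("DISTRIBUTOR_ROLE");
+
+    event RewardsDistributed(address indexed _caller);
+
+    /**
+     * Reverts with AccessControlUnauthorizedAccount if the caller lacks
+     * DISTRIBUTOR_ROLE, or with AccessControlRolePaused if the role is paused.
+     */
+    function distributeRewards() external {
+        AccessControlPausableMod.requireRoleNotPaused(DISTRIBUTOR_ROLE, msg.sender);
+        /**
+         * Reward distribution logic for your facet goes here.
+         */
+        emit RewardsDistributed(msg.sender);
+    }
+}
+```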
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControlPausable/_category_.json b/website/docs/library/access/AccessControlPausable/_category_.json new file mode 100644 index 00000000..96418b00 --- /dev/null +++ b/website/docs/library/access/AccessControlPausable/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Pausable Access Control", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/AccessControlPausable/index" + } +} diff --git a/website/docs/library/access/AccessControlPausable/index.mdx b/website/docs/library/access/AccessControlPausable/index.mdx new file mode 100644 index 00000000..8d5a1e18 --- /dev/null +++ b/website/docs/library/access/AccessControlPausable/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Pausable Access Control" +description: "RBAC with pause functionality." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + RBAC with pause functionality. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/access/AccessControlTemporal/AccessControlTemporalFacet.mdx b/website/docs/library/access/AccessControlTemporal/AccessControlTemporalFacet.mdx new file mode 100644 index 00000000..795c201b --- /dev/null +++ b/website/docs/library/access/AccessControlTemporal/AccessControlTemporalFacet.mdx @@ -0,0 +1,427 @@ +--- +sidebar_position: 2 +title: "AccessControlTemporalFacet" +description: "Grants and revokes roles with expiry timestamps" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControlTemporal/AccessControlTemporalFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Grants and revokes roles with expiry timestamps + + + +- Manages roles with specific expiry timestamps. +- Integrates seamlessly with the diamond proxy pattern. +- Exposes external functions for temporal role management. +- Utilizes Compose's internal storage access patterns. + + +## Overview + +This facet implements temporal role-based access control within a diamond. It provides functions to grant roles with specific expiry dates and to revoke them. Calls are routed through the diamond proxy, allowing for dynamic access management integrated with other diamond functionalities. Developers add this facet to enable time-limited permissions for accounts. 
+ +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +--- +### AccessControlTemporalStorage + + +{`struct AccessControlTemporalStorage { + mapping(address account => mapping(bytes32 role => uint256 expiryTimestamp)) roleExpiry; +}`} + + +### State Variables + + + +## Functions + +### getRoleExpiry + +Returns the expiry timestamp for a role assignment. + + +{`function getRoleExpiry(bytes32 _role, address _account) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### isRoleExpired + +Checks if a role assignment has expired. + + +{`function isRoleExpired(bytes32 _role, address _account) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### grantRoleWithExpiry + +Grants a role to an account with an expiry timestamp. Only the admin of the role can grant it with expiry. Emits a RoleGrantedWithExpiry event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function grantRoleWithExpiry(bytes32 _role, address _account, uint256 _expiresAt) external;`} + + +**Parameters:** + + + +--- +### revokeTemporalRole + +Revokes a temporal role from an account. Only the admin of the role can revoke it. Emits a TemporalRoleRevoked event. Reverts with AccessControlUnauthorizedAccount If the caller is not the admin of the role. + + +{`function revokeTemporalRole(bytes32 _role, address _account) external;`} + + +**Parameters:** + + + +--- +### requireValidRole + +Checks if an account has a valid (non-expired) role. - Reverts with AccessControlUnauthorizedAccount If the account does not have the role. - Reverts with AccessControlRoleExpired If the role has expired. + + +{`function requireValidRole(bytes32 _role, address _account) external view;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when a role is granted with an expiry timestamp. +
+ +
+ Signature: + +{`event RoleGrantedWithExpiry( + bytes32 indexed _role, address indexed _account, uint256 _expiresAt, address indexed _sender +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a temporal role is revoked. +
+ +
+ Signature: + +{`event TemporalRoleRevoked(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+ +
+ Thrown when a role has expired. +
+ +
+ Signature: + +error AccessControlRoleExpired(bytes32 _role, address _account); + +
+
+
+ + + + +## Best Practices + + +- Initialize temporal roles using `grantRoleWithExpiry` during diamond setup or via authorized administrative functions. +- Check role expiry with `isRoleExpired`, or call `requireValidRole`, before executing sensitive operations. +- Restrict the role's admin role to trusted accounts, since only admins can call `grantRoleWithExpiry` and `revokeTemporalRole`. + + +## Security Considerations + + +All state-changing functions (`grantRoleWithExpiry`, `revokeTemporalRole`) require the caller to be the admin of the respective role and revert with `AccessControlUnauthorizedAccount` otherwise. The `requireValidRole` function checks both role existence and expiry, reverting with `AccessControlUnauthorizedAccount` or `AccessControlRoleExpired` respectively. Validate expiry timestamps in the calling code before granting roles. + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControlTemporal/AccessControlTemporalMod.mdx b/website/docs/library/access/AccessControlTemporal/AccessControlTemporalMod.mdx new file mode 100644 index 00000000..2281b694 --- /dev/null +++ b/website/docs/library/access/AccessControlTemporal/AccessControlTemporalMod.mdx @@ -0,0 +1,521 @@ +--- +sidebar_position: 1 +title: "AccessControlTemporalMod" +description: "Manages time-bound role assignments in a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/AccessControlTemporal/AccessControlTemporalMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages time-bound role assignments in a diamond + + + +- Manages roles with specific expiry timestamps. +- Provides `requireValidRole` for immediate validation and reverts on expiry or lack of role. +- All functions are `internal`, intended for use within facets. +- Integrates with the diamond storage pattern. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides functionality to grant roles with specific expiry timestamps. Facets can use this module to enforce time-limited access control, ensuring that roles automatically become invalid after their designated expiry. It leverages the diamond storage pattern for shared state across facets. + +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; + mapping(bytes32 role => bytes32 adminRole) adminRole; +}`} + + +--- +### AccessControlTemporalStorage + + +{`struct AccessControlTemporalStorage { + mapping(address account => mapping(bytes32 role => uint256 expiryTimestamp)) roleExpiry; +}`} + + +### State Variables + + + +## Functions + +### getAccessControlStorage + +Returns the storage for AccessControl. + + +{`function getAccessControlStorage() pure returns (AccessControlStorage storage s);`} + + +**Returns:** + + + +--- +### getRoleExpiry + +function to get the expiry timestamp for a role assignment. + + +{`function getRoleExpiry(bytes32 _role, address _account) view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### getStorage + +Returns the storage for AccessControlTemporal. + + +{`function getStorage() pure returns (AccessControlTemporalStorage storage s);`} + + +**Returns:** + + + +--- +### grantRoleWithExpiry + +function to grant a role with an expiry timestamp. 
+ + +{`function grantRoleWithExpiry(bytes32 _role, address _account, uint256 _expiresAt) returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### isRoleExpired + +function to check if a role assignment has expired. + + +{`function isRoleExpired(bytes32 _role, address _account) view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### requireValidRole + +function to check if an account has a valid (non-expired) role. **Notes:** - Reverts with AccessControlUnauthorizedAccount If the account does not have the role. - Reverts with AccessControlRoleExpired If the role has expired. + + +{`function requireValidRole(bytes32 _role, address _account) view;`} + + +**Parameters:** + + + +--- +### revokeTemporalRole + +function to revoke a temporal role. + + +{`function revokeTemporalRole(bytes32 _role, address _account) returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +## Events + + + +
+ Emitted when a role is granted with an expiry timestamp. +
+ +
+ Signature: + +{`event RoleGrantedWithExpiry( +bytes32 indexed _role, address indexed _account, uint256 _expiresAt, address indexed _sender +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a temporal role is revoked. +
+ +
+ Signature: + +{`event TemporalRoleRevoked(bytes32 indexed _role, address indexed _account, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when a role has expired. +
+ +
+ Signature: + +error AccessControlRoleExpired(bytes32 _role, address _account); + +
+
+ +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+
+ + + + +## Best Practices + + +- Call `requireValidRole` before executing sensitive operations to ensure the caller's role is still active. +- Use `grantRoleWithExpiry` to define clear time boundaries for role permissions. +- Handle `AccessControlRoleExpired` and `AccessControlUnauthorizedAccount` errors thrown by `requireValidRole`. + + +## Integration Notes + + +This module interacts with diamond storage at `ACCESS_CONTROL_STORAGE_POSITION`, derived from `keccak256("compose.accesscontrol")`, and uses the `AccessControlTemporalStorage` struct. All state modifications go through internal functions, so changes are immediately visible to every facet that accesses the same storage slot. + +
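+
+## Usage Example
+
+A sketch of a facet that gates one of its functions with the module. The import path, the qualified `AccessControlTemporalMod.` call style, and the `MINTER_ROLE` constant are assumptions for illustration; adjust them to your project layout.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path; adjust to where the module lives in your project.
+import {AccessControlTemporalMod} from "./AccessControlTemporalMod.sol";
+
+contract TimedMintFacet {
+    bytes32 internal constant MINTER_ROLE = keccak256("MINTER_ROLE"); // example role id
+
+    function mint(address to, uint256 amount) external {
+        // Reverts with AccessControlUnauthorizedAccount or AccessControlRoleExpired
+        // when the caller's role is missing or past its expiry.
+        AccessControlTemporalMod.requireValidRole(MINTER_ROLE, msg.sender);
+        // ... facet-specific mint logic using `to` and `amount` ...
+    }
+}
+```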
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/AccessControlTemporal/_category_.json b/website/docs/library/access/AccessControlTemporal/_category_.json new file mode 100644 index 00000000..834b0b18 --- /dev/null +++ b/website/docs/library/access/AccessControlTemporal/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Temporal Access Control", + "position": 5, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/AccessControlTemporal/index" + } +} diff --git a/website/docs/library/access/AccessControlTemporal/index.mdx b/website/docs/library/access/AccessControlTemporal/index.mdx new file mode 100644 index 00000000..1b5e07d5 --- /dev/null +++ b/website/docs/library/access/AccessControlTemporal/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Temporal Access Control" +description: "Time-limited role-based access control." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Time-limited role-based access control. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/access/Owner/OwnerFacet.mdx b/website/docs/library/access/Owner/OwnerFacet.mdx new file mode 100644 index 00000000..7afa112b --- /dev/null +++ b/website/docs/library/access/Owner/OwnerFacet.mdx @@ -0,0 +1,216 @@ +--- +sidebar_position: 2 +title: "OwnerFacet" +description: "Manages diamond contract ownership and transfers" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/Owner/OwnerFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages diamond contract ownership and transfers + + + +- Provides external view function `owner()` to check contract ownership. +- Supports ownership transfer via `transferOwnership(address)`. +- Allows ownership renouncement via `renounceOwnership()`. +- Utilizes diamond storage for owner state. + + +## Overview + +This facet implements ownership management for a diamond contract. It exposes functions to view the current owner, transfer ownership to a new address, and renounce ownership entirely. Developers integrate this facet to establish clear ownership and control over diamond upgradeability and configuration. 
+ +--- + +## Storage + +### OwnerStorage + + +{`struct OwnerStorage { + address owner; +}`} + + +### State Variables + + + +## Functions + +### owner + +Get the address of the owner + + +{`function owner() external view returns (address);`} + + +**Returns:** + + + +--- +### transferOwnership + +Set the address of the new owner of the contract Set _newOwner to address(0) to renounce any ownership. + + +{`function transferOwnership(address _newOwner) external;`} + + +**Parameters:** + + + +--- +### renounceOwnership + + +{`function renounceOwnership() external;`} + + +## Events + + + + +
+ Signature: + +{`event OwnershipTransferred(address indexed previousOwner, address indexed newOwner);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+
+ + + + +## Best Practices + + +- Initialize the owner address during diamond deployment. +- Gate sensitive functions on the current owner by comparing the caller against the address returned by `owner()`. +- Use `transferOwnership` for planned ownership changes and treat `renounceOwnership` with extreme caution. + + +## Security Considerations + + +The `transferOwnership` function allows setting the owner to `address(0)`, effectively renouncing ownership; confirm this is intended before execution. Only the current owner can call `transferOwnership` and `renounceOwnership` through the diamond proxy; other callers revert with `OwnerUnauthorizedAccount`. + +
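+
+## Usage Example
+
+A minimal sketch of reading ownership through the diamond. The `IOwnerFacet` interface is illustrative and mirrors the documented external functions; `transferOwnership` and `renounceOwnership` must be sent from the current owner account.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface mirroring the facet's documented external functions.
+interface IOwnerFacet {
+    function owner() external view returns (address);
+    function transferOwnership(address _newOwner) external;
+    function renounceOwnership() external;
+}
+
+contract OwnershipViewer {
+    // Returns true if `account` is the diamond's current owner.
+    function isOwner(address diamond, address account) external view returns (bool) {
+        return IOwnerFacet(diamond).owner() == account;
+    }
+}
+```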
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/Owner/OwnerMod.mdx b/website/docs/library/access/Owner/OwnerMod.mdx new file mode 100644 index 00000000..98732a0f --- /dev/null +++ b/website/docs/library/access/Owner/OwnerMod.mdx @@ -0,0 +1,297 @@ +--- +sidebar_position: 1 +title: "OwnerMod" +description: "Manages contract ownership using diamond storage" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/Owner/OwnerMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages contract ownership using diamond storage + + + +- Provides internal functions for ERC-173 ownership management. +- Uses diamond storage pattern (EIP-8042) for shared state. +- `requireOwner()` enforces access control based on contract ownership. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing contract ownership according to ERC-173. Facets can import this module to check ownership and transfer it using shared diamond storage. Ownership changes are immediately visible to all facets interacting with the same storage pattern. + +--- + +## Storage + +### OwnerStorage + +storage-location: erc8042:compose.owner + + +{`struct OwnerStorage { + address owner; +}`} + + +### State Variables + + + +## Functions + +### getStorage + +Returns a pointer to the ERC-173 storage struct. Uses inline assembly to access the storage slot defined by STORAGE_POSITION. + + +{`function getStorage() pure returns (OwnerStorage storage s);`} + + +**Returns:** + + + +--- +### owner + +Get the address of the owner + + +{`function owner() view returns (address);`} + + +**Returns:** + + + +--- +### requireOwner + +Reverts if the caller is not the owner. + + +{`function requireOwner() view;`} + + +--- +### setContractOwner + + +{`function setContractOwner(address _initialOwner) ;`} + + +**Parameters:** + + + +--- +### transferOwnership + +Set the address of the new owner of the contract Set _newOwner to address(0) to renounce any ownership. + + +{`function transferOwnership(address _newOwner) ;`} + + +**Parameters:** + + + +## Events + + + +
+ This emits when ownership of a contract changes. +
+ +
+ Signature: + +{`event OwnershipTransferred(address indexed previousOwner, address indexed newOwner);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error OwnerAlreadyRenounced(); + +
+
+ + +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+
+ + + + +## Best Practices + + +- Call `requireOwner()` in facets before executing sensitive operations. +- Use `transferOwnership()` to transfer ownership, passing `address(0)` to renounce it. +- Set the initial owner with `setContractOwner` during diamond deployment. + + +## Integration Notes + + +This module uses diamond storage at the `STORAGE_POSITION` defined by `keccak256("compose.owner")`. The `OwnerStorage` struct, containing the `owner` field, is accessed via inline assembly in `getStorage`. All functions are internal and are called from facets that import the module. Ownership changes are persistent and immediately visible to every facet that accesses this storage slot. + +
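+
+## Usage Example
+
+A sketch of a custom facet that uses the module for owner gating. The import path, the qualified `OwnerMod.` call style, and the `TreasuryChanged` event are assumptions for illustration; adjust them to your project layout.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path; adjust to where the module lives in your project.
+import {OwnerMod} from "./OwnerMod.sol";
+
+contract ConfigFacet {
+    event TreasuryChanged(address newTreasury); // illustrative event for this example
+
+    // Owner-gated setter: reverts for callers other than the diamond owner.
+    function setTreasury(address newTreasury) external {
+        OwnerMod.requireOwner();
+        emit TreasuryChanged(newTreasury); // a real facet would also write to its own diamond storage
+    }
+}
+```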
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/Owner/_category_.json b/website/docs/library/access/Owner/_category_.json new file mode 100644 index 00000000..2ddf56c9 --- /dev/null +++ b/website/docs/library/access/Owner/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Owner", + "position": 1, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/Owner/index" + } +} diff --git a/website/docs/library/access/Owner/index.mdx b/website/docs/library/access/Owner/index.mdx new file mode 100644 index 00000000..b73c1b07 --- /dev/null +++ b/website/docs/library/access/Owner/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Owner" +description: "Single-owner access control pattern." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Single-owner access control pattern. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsFacet.mdx b/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsFacet.mdx new file mode 100644 index 00000000..ca6d25c2 --- /dev/null +++ b/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsFacet.mdx @@ -0,0 +1,252 @@ +--- +sidebar_position: 2 +title: "OwnerTwoStepsFacet" +description: "Manage diamond ownership and transfers" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/OwnerTwoSteps/OwnerTwoStepsFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manage diamond ownership and transfers + + + +- Manages diamond ownership via a two-step transfer process. +- Exposes `owner()` and `pendingOwner()` view functions. +- Provides `transferOwnership()`, `acceptOwnership()`, and `renounceOwnership()` external functions. +- Utilizes dedicated storage slots for owner and pending owner state. + + +## Overview + +This facet provides ownership management for a diamond, enabling secure transfers and renouncements. It exposes external functions to view current and pending owners, initiate ownership transfers, and accept or renounce ownership, all integrated within the diamond proxy pattern. 
+ +--- + +## Storage + +### OwnerStorage + + +{`struct OwnerStorage { + address owner; +}`} + + +--- +### PendingOwnerStorage + + +{`struct PendingOwnerStorage { + address pendingOwner; +}`} + + +### State Variables + + + +## Functions + +### owner + +Get the address of the owner + + +{`function owner() external view returns (address);`} + + +**Returns:** + + + +--- +### pendingOwner + +Get the address of the pending owner + + +{`function pendingOwner() external view returns (address);`} + + +**Returns:** + + + +--- +### transferOwnership + +Set the address of the new owner of the contract + + +{`function transferOwnership(address _newOwner) external;`} + + +**Parameters:** + + + +--- +### acceptOwnership + + +{`function acceptOwnership() external;`} + + +--- +### renounceOwnership + + +{`function renounceOwnership() external;`} + + +## Events + + + + +
+ Signature: + +{`event OwnershipTransferStarted(address indexed _previousOwner, address indexed _newOwner);`} + +
+ +
+ + +
+ Signature: + +{`event OwnershipTransferred(address indexed _previousOwner, address indexed _newOwner);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+
+ + + + +## Best Practices + + +- Set the initial owner during diamond deployment. +- Initiate a transfer by having the current owner call `transferOwnership`. +- Finalize the transfer by having the new owner call `acceptOwnership`. +- Use `renounceOwnership` with caution: it removes ownership permanently. + + +## Security Considerations + + +Only the current owner can call `transferOwnership` and `renounceOwnership`, and only the pending owner can call `acceptOwnership`; unauthorized callers revert with `OwnerUnauthorizedAccount`. + +
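+
+## Usage Example
+
+A sketch of the two-step hand-over to a contract. The `IOwnerTwoStepsFacet` interface and the `DiamondCustodian` contract are illustrative; the signatures mirror the ones documented on this page. Step 1: the current owner calls `transferOwnership(address(custodian))`. Step 2: the custodian accepts.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface mirroring the facet's documented external functions.
+interface IOwnerTwoStepsFacet {
+    function owner() external view returns (address);
+    function pendingOwner() external view returns (address);
+    function transferOwnership(address _newOwner) external;
+    function acceptOwnership() external;
+    function renounceOwnership() external;
+}
+
+contract DiamondCustodian {
+    // Step 2: callable once this contract is the diamond's pending owner.
+    function claimDiamond(address diamond) external {
+        IOwnerTwoStepsFacet(diamond).acceptOwnership();
+    }
+}
+```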
+ +
+ + diff --git a/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsMod.mdx b/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsMod.mdx new file mode 100644 index 00000000..5befabc3 --- /dev/null +++ b/website/docs/library/access/OwnerTwoSteps/OwnerTwoStepsMod.mdx @@ -0,0 +1,337 @@ +--- +sidebar_position: 1 +title: "OwnerTwoStepsMod" +description: "Two-step ownership transfer for diamonds" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/access/OwnerTwoSteps/OwnerTwoStepsMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Two-step ownership transfer for diamonds + + + +- Implements ERC-173 two-step ownership transfer logic. +- Uses internal functions for seamless integration with facets. +- Leverages diamond storage for shared ownership state. +- Provides `owner()`, `pendingOwner()`, `transferOwnership()`, `acceptOwnership()`, and `renounceOwnership()` functions. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module facilitates a secure, two-step ownership transfer process for diamonds. By separating the initiation and finalization of ownership changes, it prevents accidental or malicious takeovers. Facets can import this module to manage ownership responsibilities within the diamond storage pattern. + +--- + +## Storage + +### OwnerStorage + +storage-location: erc8042:compose.owner + + +{`struct OwnerStorage { + address owner; +}`} + + +--- +### PendingOwnerStorage + +storage-location: erc8042:compose.owner.pending + + +{`struct PendingOwnerStorage { + address pendingOwner; +}`} + + +### State Variables + + + +## Functions + +### acceptOwnership + +Finalizes ownership transfer; must be called by the pending owner. + + +{`function acceptOwnership() ;`} + + +--- +### getOwnerStorage + +Returns a pointer to the Owner storage struct. Uses inline assembly to access the storage slot defined by OWNER_STORAGE_POSITION. + + +{`function getOwnerStorage() pure returns (OwnerStorage storage s);`} + + +**Returns:** + + + +--- +### getPendingOwnerStorage + +Returns a pointer to the PendingOwner storage struct. Uses inline assembly to access the storage slot defined by PENDING_OWNER_STORAGE_POSITION. + + +{`function getPendingOwnerStorage() pure returns (PendingOwnerStorage storage s);`} + + +**Returns:** + + + +--- +### owner + +Returns the current owner. + + +{`function owner() view returns (address);`} + + +--- +### pendingOwner + +Returns the pending owner (if any). 
+ + +{`function pendingOwner() view returns (address);`} + + +--- +### renounceOwnership + +Renounce ownership of the contract Sets the owner to address(0), disabling all functions restricted to the owner. + + +{`function renounceOwnership() ;`} + + +--- +### requireOwner + +Reverts if the caller is not the owner. + + +{`function requireOwner() view;`} + + +--- +### transferOwnership + +Initiates a two-step ownership transfer. + + +{`function transferOwnership(address _newOwner) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when ownership transfer is initiated (pending owner set). +
+ +
+ Signature: + +{`event OwnershipTransferStarted(address indexed _previousOwner, address indexed _newOwner);`} + +
+ +
+ +
+ Emitted when ownership transfer is finalized. +
+ +
+ Signature: + +{`event OwnershipTransferred(address indexed _previousOwner, address indexed _newOwner);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error OwnerAlreadyRenounced(); + +
+
+ + +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+
+ + + + +## Best Practices + + +- Call `transferOwnership` only from the current owner. +- Call `acceptOwnership` only from the pending owner. +- Use `requireOwner()` within facets to enforce owner-only access to critical functions. + + +## Integration Notes + + +This module manages ownership state within diamond storage. It utilizes specific storage slots (`OWNER_STORAGE_POSITION` and `PENDING_OWNER_STORAGE_POSITION`) to store the `owner` and `pendingOwner` values, respectively. All functions interact with these storage locations directly via inline assembly, ensuring that changes are immediately visible to any facet interacting with the same diamond storage pattern. + + +
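+
+## Usage Example
+
+A sketch of a facet that surfaces the module's view functions through the diamond. The import path and the qualified `OwnerTwoStepsMod.` call style are assumptions; adjust them to your project layout.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path; adjust to where the module lives in your project.
+import {OwnerTwoStepsMod} from "./OwnerTwoStepsMod.sol";
+
+contract OwnershipStatusFacet {
+    // Reads both ownership slots managed by the module.
+    function ownershipStatus() external view returns (address currentOwner, address nextOwner) {
+        currentOwner = OwnerTwoStepsMod.owner();
+        nextOwner = OwnerTwoStepsMod.pendingOwner();
+    }
+}
+```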
+ +
+ +
+ +
+ + diff --git a/website/docs/library/access/OwnerTwoSteps/_category_.json b/website/docs/library/access/OwnerTwoSteps/_category_.json new file mode 100644 index 00000000..90b66a92 --- /dev/null +++ b/website/docs/library/access/OwnerTwoSteps/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Two-Step Owner", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/OwnerTwoSteps/index" + } +} diff --git a/website/docs/library/access/OwnerTwoSteps/index.mdx b/website/docs/library/access/OwnerTwoSteps/index.mdx new file mode 100644 index 00000000..2d989bed --- /dev/null +++ b/website/docs/library/access/OwnerTwoSteps/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Two-Step Owner" +description: "Two-step ownership transfer pattern." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Two-step ownership transfer pattern. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/access/_category_.json b/website/docs/library/access/_category_.json new file mode 100644 index 00000000..cbc9d5ba --- /dev/null +++ b/website/docs/library/access/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Access Control", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/access/index" + } +} diff --git a/website/docs/library/access/index.mdx b/website/docs/library/access/index.mdx new file mode 100644 index 00000000..1e83a09d --- /dev/null +++ b/website/docs/library/access/index.mdx @@ -0,0 +1,50 @@ +--- +title: "Access Control" +description: "Access control patterns for permission management in Compose diamonds." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Access control patterns for permission management in Compose diamonds. 
+ + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/diamond/DiamondInspectFacet.mdx b/website/docs/library/diamond/DiamondInspectFacet.mdx new file mode 100644 index 00000000..949a8778 --- /dev/null +++ b/website/docs/library/diamond/DiamondInspectFacet.mdx @@ -0,0 +1,195 @@ +--- +sidebar_position: 510 +title: "DiamondInspectFacet" +description: "Inspects diamond facets and storage mappings" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondInspectFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Inspects diamond facets and storage mappings + + + +- Provides external view functions for diamond inspection. +- Accesses diamond storage via inline assembly for direct state retrieval. +- Returns facet addresses and function-to-facet mappings. +- Follows EIP-2535 diamond standard conventions. + + +## Overview + +This facet provides read-only access to the diamond's facet mappings and storage structure. It exposes functions to retrieve facet addresses by function selector and to list all function-to-facet pairings. Developers integrate this facet to inspect diamond functionality and understand its on-chain configuration. + +--- + +## Storage + +### FacetAndPosition + + +{`struct FacetAndPosition { + address facet; + uint32 position; +}`} + + +--- +### DiamondStorage + + +{`struct DiamondStorage { + mapping(bytes4 functionSelector => FacetAndPosition) facetAndPosition; + /** + * Array of all function selectors that can be called in the diamond. + */ + bytes4[] selectors; +}`} + + +--- +### FunctionFacetPair + + +{`struct FunctionFacetPair { + bytes4 selector; + address facet; +}`} + + +### State Variables + + + +## Functions + +### facetAddress + +Gets the facet address that handles the given selector. If facet is not found return address(0). + + +{`function facetAddress(bytes4 _functionSelector) external view returns (address facet);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### functionFacetPairs + +Returns an array of all function selectors and their corresponding facet addresses. Iterates through the diamond's stored selectors and pairs each with its facet. + + +{`function functionFacetPairs() external view returns (FunctionFacetPair[] memory pairs);`} + + +**Returns:** + + + + + + +## Best Practices + + +- Ensure this facet is added to the diamond during initialization. +- Call `facetAddress` to determine which facet handles a specific function selector. 
+- Use `functionFacetPairs` to get a comprehensive view of the diamond's function dispatch. + + +## Security Considerations + + +This facet contains only view functions and does not modify state. Standard Solidity security practices apply. Input validation is handled by the underlying diamond proxy and facet dispatch mechanism. + + +
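+
+## Usage Example
+
+A sketch of looking up which facet handles a selector. The `IDiamondInspectFacet` interface is illustrative and mirrors the documented external functions; the `transferOwnership(address)` selector is only an example input.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface mirroring the facet's documented external functions.
+interface IDiamondInspectFacet {
+    struct FunctionFacetPair {
+        bytes4 selector;
+        address facet;
+    }
+
+    function facetAddress(bytes4 _functionSelector) external view returns (address facet);
+    function functionFacetPairs() external view returns (FunctionFacetPair[] memory pairs);
+}
+
+contract InspectExample {
+    // Returns the facet that handles `transferOwnership(address)`, or address(0) if unregistered.
+    function transferOwnershipHandler(address diamond) external view returns (address) {
+        return IDiamondInspectFacet(diamond).facetAddress(
+            bytes4(keccak256("transferOwnership(address)"))
+        );
+    }
+}
+```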
+ +
+ + diff --git a/website/docs/library/diamond/DiamondLoupeFacet.mdx b/website/docs/library/diamond/DiamondLoupeFacet.mdx new file mode 100644 index 00000000..a4585d5a --- /dev/null +++ b/website/docs/library/diamond/DiamondLoupeFacet.mdx @@ -0,0 +1,244 @@ +--- +sidebar_position: 4 +title: "DiamondLoupeFacet" +description: "Inspect diamond facets and their function selectors" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondLoupeFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Inspect diamond facets and their function selectors + + + +- Exposes external functions for diamond introspection +- Self-contained with no imports or inheritance +- Compatible with ERC-2535 diamond standard + + +## Overview + +This facet provides introspection capabilities for a diamond, allowing developers to query facet addresses, function selectors, and facet mappings. It routes calls through the diamond proxy, enabling external inspection of the diamond's composition and upgradeability. Developers add this facet to understand and interact with a diamond's deployed facets. + +--- + +## Storage + +### FacetAndPosition + + +{`struct FacetAndPosition { + address facet; + uint32 position; +}`} + + +--- +### DiamondStorage + + +{`struct DiamondStorage { + mapping(bytes4 functionSelector => FacetAndPosition) facetAndPosition; + /** + * Array of all function selectors that can be called in the diamond. + */ + bytes4[] selectors; +}`} + + +--- +### Facet + + +{`struct Facet { + address facet; + bytes4[] functionSelectors; +}`} + + +### State Variables + + + +## Functions + +### facetFunctionSelectors + +Gets all the function selectors supported by a specific facet. Returns the set of selectors that this diamond currently routes to the given facet address. How it works: 1. Iterates through the diamond’s global selector list (s.selectors) — i.e., the selectors that have been added to this diamond. 2. For each selector, reads its facet address from diamond storage (s.facetAndPosition[selector].facet) and compares it to `_facet`. 3. When it matches, writes the selector into a preallocated memory array and increments a running count. 4. After the scan, updates the logical length of the result array with assembly to the exact number of matches. Why this approach: - Single-pass O(n) scan over all selectors keeps the logic simple and predictable. - Preallocating to the maximum possible size (total selector count) avoids repeated reallocations while building the result. - Trimming the array length at the end yields an exactly sized return value. 
+ + +{`function facetFunctionSelectors(address _facet) external view returns (bytes4[] memory facetSelectors);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### facetAddresses + +Get all the facet addresses used by a diamond. This function returns the unique set of facet addresses that provide functionality to the diamond. How it works:** 1. Uses a memory-based hash map to group facet addresses by the last byte of the address, reducing linear search costs from O(n²) to approximately O(n) for most cases. 2. Reuses the selectors array memory space to store the unique facet addresses, avoiding an extra memory allocation for the intermediate array. The selectors array is overwritten with facet addresses as we iterate. 3. For each selector, looks up its facet address and checks if we've seen this address before by searching the appropriate hash map bucket. 4. If the facet is new (not found in the bucket), expands the bucket by 4 slots if it's full or empty, then adds the facet to both the bucket and the return array. 5. If the facet was already seen, skips it to maintain uniqueness. 6. Finally, sets the correct length of the return array to match the number of unique facets found. Why this approach:** - Hash mapping by last address byte provides O(1) average-case bucket lookup instead of scanning all previously-found facets linearly for each selector. - Growing in fixed-size chunks (4 for buckets) keeps reallocations infrequent and prevents over-allocation, while keeping bucket sizes small for sparse key distributions. - Reusing the selectors array memory eliminates one memory allocation and reduces total memory usage, which saves gas. - This design is optimized for diamonds with many selectors across many facets, where the original O(n²) nested loop approach becomes prohibitively expensive. - The 256-bucket hash map trades a small fixed memory cost for dramatic algorithmic improvement in worst-case scenarios. + + +{`function facetAddresses() external view returns (address[] memory allFacets);`} + + +**Returns:** + + + +--- +### facets + +Gets all facets and their selectors. Returns each unique facet address currently used by the diamond and the list of function selectors that the diamond maps to that facet. How it works:** 1. Uses a memory-based hash map to group facets by the last byte of their address, reducing linear search costs from O(n²) to approximately O(n) for most cases. 2. Reuses the selectors array memory space to store pointers to Facet structs, avoiding an extra memory allocation for the intermediate array. 3. For each selector, looks up its facet address and checks if we've seen this facet before by searching the appropriate hash map bucket. 4. If the facet is new, expands the bucket by 4 slots if it's full or empty, creates a Facet struct with a 16-slot selector array, and stores a pointer to it in both the bucket and the facet pointers array. 5. If the facet exists, expands its selector array by 16 slots if full, then appends the selector to the array. 6. Finally, copies all Facet structs from their pointers into a properly-sized return array. Why this approach:** - Hash mapping by last address byte provides O(1) average-case bucket lookup instead of scanning all previously-found facets linearly. - Growing in fixed-size chunks (4 for buckets, 16 for selector arrays) keeps reallocations infrequent and prevents over-allocation. - Reusing the selectors array memory reduces total memory usage and allocation. 
- This design is optimized for diamonds with many facets and many selectors, where the original O(n²) nested loop approach becomes prohibitively expensive. + + +{`function facets() external view returns (Facet[] memory facetsAndSelectors);`} + + +**Returns:** + + + + + + +## Best Practices + + +- Ensure this facet is added to the diamond during initialization. +- Call facet inspection functions through the diamond proxy address. +- Verify storage compatibility before upgrading facets to maintain introspection integrity. + + +## Security Considerations + + +All functions on this facet are `view`: they do not modify state and do not pose reentrancy risks. Input validation is handled by the diamond proxy for external calls. + +
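+
+## Usage Example
+
+A sketch of querying the loupe through the diamond. The `IDiamondLoupe` interface below is illustrative and mirrors the documented external functions.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface mirroring the facet's documented external functions.
+interface IDiamondLoupe {
+    struct Facet {
+        address facet;
+        bytes4[] functionSelectors;
+    }
+
+    function facets() external view returns (Facet[] memory facetsAndSelectors);
+    function facetAddresses() external view returns (address[] memory allFacets);
+    function facetFunctionSelectors(address _facet) external view returns (bytes4[] memory facetSelectors);
+}
+
+contract LoupeExample {
+    // Counts how many selectors the diamond currently routes to `facet`.
+    function selectorCount(address diamond, address facet) external view returns (uint256) {
+        return IDiamondLoupe(diamond).facetFunctionSelectors(facet).length;
+    }
+}
+```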
+ +
+ +
+ +
+ + diff --git a/website/docs/library/diamond/DiamondMod.mdx b/website/docs/library/diamond/DiamondMod.mdx new file mode 100644 index 00000000..01afb3d2 --- /dev/null +++ b/website/docs/library/diamond/DiamondMod.mdx @@ -0,0 +1,258 @@ +--- +sidebar_position: 1 +title: "DiamondMod" +description: "Manages facet additions and function dispatch for diamonds" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages facet additions and function dispatch for diamonds + + + +- Internal functions for facet management and call dispatch. +- Uses diamond storage pattern for centralized state. +- No external dependencies, promoting composability. +- Compatible with ERC-2535 diamond standard. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing facets and dispatching calls within a diamond proxy. It enables composability by allowing facets to be added and functions to be routed to the correct implementation. Changes to facet mappings are managed internally, ensuring consistent function resolution across all interacting facets. + +--- + +## Storage + +### DiamondStorage + +storage-location: erc8042:erc8109.diamond + + +{`struct DiamondStorage { + mapping(bytes4 functionSelector => FacetAndPosition) facetAndPosition; + /** + * \`selectors\` contains all function selectors that can be called in the diamond. + */ + bytes4[] selectors; +}`} + + +--- +### FacetAndPosition + + +{`struct FacetAndPosition { + address facet; + uint32 position; +}`} + + +--- +### FacetFunctions + + +{`struct FacetFunctions { + address facet; + bytes4[] selectors; +}`} + + +### State Variables + + + +## Functions + +### addFacets + +Adds facets and their function selectors to the diamond. Only supports adding functions during diamond deployment. + + +{`function addFacets(FacetFunctions[] memory _facets) ;`} + + +**Parameters:** + + + +--- +### diamondFallback + +Find facet for function that is called and execute the function if a facet is found and return any value. + + +{`function diamondFallback() ;`} + + +--- +### getStorage + + +{`function getStorage() pure returns (DiamondStorage storage s);`} + + +## Events + + + +
+ Emitted when a function is added to a diamond. +
+ +
+ Signature: + +{`event DiamondFunctionAdded(bytes4 indexed _selector, address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + + +
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ + +
+ Signature: + +error FunctionNotFound(bytes4 _selector); + +
+
+ + +
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+
+ + + + +## Best Practices + + +- Call `addFacets` only during initial diamond deployment to avoid conflicts. +- Ensure that function selectors are unique across all facets before adding. +- Expect `FunctionNotFound` reverts from `diamondFallback` when a call does not map to any facet. + + +## Integration Notes + + +This module uses a dedicated storage position, `DIAMOND_STORAGE_POSITION`, identified by `keccak256("erc8109.diamond")`. The `DiamondStorage` struct at this position holds the `facetAndPosition` selector-to-facet mapping and the `selectors` array. A diamond's constructor calls `addFacets` to register facets and their function selectors, and its fallback calls `diamondFallback` to route incoming calls. Any updates to the facet mappings are immediately visible to all facets interacting with the diamond proxy. + +
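+
+## Usage Example
+
+A sketch of a diamond constructor that registers its facets with the module and routes calls through `diamondFallback`, as described above. The import path is an assumption; a production diamond (see `ExampleDiamond`) also sets an owner during construction.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path; adjust to where the module lives in your project.
+import {DiamondMod} from "./DiamondMod.sol";
+
+contract MinimalDiamond {
+    // Registers each facet and its selectors; addFacets only supports adding at deployment.
+    constructor(DiamondMod.FacetFunctions[] memory facets) {
+        DiamondMod.addFacets(facets);
+    }
+
+    // Routes every incoming call to the facet registered for msg.sig.
+    fallback() external payable {
+        DiamondMod.diamondFallback();
+    }
+
+    receive() external payable {}
+}
+```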
+ +
+ + diff --git a/website/docs/library/diamond/DiamondUpgradeFacet.mdx b/website/docs/library/diamond/DiamondUpgradeFacet.mdx new file mode 100644 index 00000000..bab0c361 --- /dev/null +++ b/website/docs/library/diamond/DiamondUpgradeFacet.mdx @@ -0,0 +1,538 @@ +--- +sidebar_position: 510 +title: "DiamondUpgradeFacet" +description: "Manage diamond upgrades and facet additions" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondUpgradeFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manage diamond upgrades and facet additions + + + +- Manages diamond functional surface area through facet composition. +- Supports adding, replacing, and removing functions via selector mapping. +- Enables diamond upgradeability by allowing facet deployments and reconfigurations. +- Facilitates metadata updates associated with diamond state changes. + + +## Overview + +This facet provides core functionality for upgrading diamond proxies by adding, replacing, and removing facets. It allows for delegate calls and metadata updates, integrating directly with the diamond storage pattern. Developers use this facet to manage the diamond's functional surface area and upgrade its components. + +--- + +## Storage + +### OwnerStorage + + +{`struct OwnerStorage { + address owner; +}`} + + +--- +### FacetAndPosition + + +{`struct FacetAndPosition { + address facet; + uint32 position; +}`} + + +--- +### DiamondStorage + + +{`struct DiamondStorage { + mapping(bytes4 functionSelector => FacetAndPosition) facetAndPosition; + /** + * Array of all function selectors that can be called in the diamond + */ + bytes4[] selectors; +}`} + + +--- +### FacetFunctions + + +{`struct FacetFunctions { + address facet; + bytes4[] selectors; +}`} + + +### State Variables + + + +## Functions + +### upgradeDiamond + +--- +### Function Changes: + +--- +### DelegateCall: + +--- +### Metadata: + +If _tag is non-zero or if _metadata.length > 0 then the `DiamondMetadata` event is emitted. + + +{`function upgradeDiamond( + FacetFunctions[] calldata _addFunctions, + FacetFunctions[] calldata _replaceFunctions, + bytes4[] calldata _removeFunctions, + address _delegate, + bytes calldata _functionCall, + bytes32 _tag, + bytes calldata _metadata +) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when a function is added to a diamond. +
+ +
+ Signature: + +{`event DiamondFunctionAdded(bytes4 indexed _selector, address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when changing the facet that will handle calls to a function. +
+ +
+ Signature: + +{`event DiamondFunctionReplaced(bytes4 indexed _selector, address indexed _oldFacet, address indexed _newFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a function is removed from a diamond. +
+ +
+ Signature: + +{`event DiamondFunctionRemoved(bytes4 indexed _selector, address indexed _oldFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a diamond's constructor function or a function from a facet makes a `delegatecall`. +
+ +
+ Signature: + +{`event DiamondDelegateCall(address indexed _delegate, bytes _functionCall);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted to record arbitrary metadata about a diamond. The format of `_tag` and `_data` is not specified by the standard. +
+ +
+ Signature: + +{`event DiamondMetadata(bytes32 indexed _tag, bytes _data);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + + +
+ Signature: + +error OwnerUnauthorizedAccount(); + +
+
+ + +
+ Signature: + +error NoSelectorsProvidedForFacet(address _facet); + +
+
+ + +
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+ + +
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotReplaceFunctionThatDoesNotExist(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotRemoveFunctionThatDoesNotExist(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotReplaceFunctionWithTheSameFacet(bytes4 _selector); + +
+
+ + +
+ Signature: + +error DelegateCallReverted(address _delegate, bytes _functionCall); + +
+
+ + +
+ Signature: + +error CannotReplaceImmutableFunction(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotRemoveImmutableFunction(bytes4 _selector); + +
+
+
+ + + + +## Best Practices + + +- Set the diamond owner during deployment; `upgradeDiamond` is restricted to the owner. +- Ensure that selectors for adding or replacing functions do not conflict with immutable functions. +- Verify that facet bytecode exists at the provided address before attempting to add or replace functions. + + +## Security Considerations + + +`upgradeDiamond` modifies the diamond's function mappings and reverts with `OwnerUnauthorizedAccount` when the caller is not the owner. Metadata passed through `_tag` and `_metadata` is recorded transparently via the `DiamondMetadata` event. Validate inputs to avoid reverts such as `NoSelectorsProvidedForFacet`, `NoBytecodeAtAddress`, or attempts to replace or remove immutable functions. + +
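+
+## Usage Example
+
+A sketch of adding one facet through `upgradeDiamond`. The `IDiamondUpgradeFacet` interface, the `UpgradeExample` contract, and the `pause()` selector are illustrative; the call must come from the diamond's owner, so in practice `UpgradeExample` itself would have to own the diamond (for example, as a governance contract).
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative interface mirroring the facet's documented external function.
+interface IDiamondUpgradeFacet {
+    struct FacetFunctions {
+        address facet;
+        bytes4[] selectors;
+    }
+
+    function upgradeDiamond(
+        FacetFunctions[] calldata _addFunctions,
+        FacetFunctions[] calldata _replaceFunctions,
+        bytes4[] calldata _removeFunctions,
+        address _delegate,
+        bytes calldata _functionCall,
+        bytes32 _tag,
+        bytes calldata _metadata
+    ) external;
+}
+
+contract UpgradeExample {
+    // Adds `newFacet` with a single example selector; no replace, remove, delegatecall, or metadata.
+    function addFacet(address diamond, address newFacet) external {
+        bytes4[] memory selectors = new bytes4[](1);
+        selectors[0] = bytes4(keccak256("pause()")); // example selector
+
+        IDiamondUpgradeFacet.FacetFunctions[] memory add = new IDiamondUpgradeFacet.FacetFunctions[](1);
+        add[0] = IDiamondUpgradeFacet.FacetFunctions({facet: newFacet, selectors: selectors});
+
+        IDiamondUpgradeFacet(diamond).upgradeDiamond(
+            add,
+            new IDiamondUpgradeFacet.FacetFunctions[](0), // nothing to replace
+            new bytes4[](0),                              // nothing to remove
+            address(0),                                   // no delegatecall
+            "",                                           // no init call data
+            bytes32(0),                                   // no metadata tag
+            ""                                            // no metadata
+        );
+    }
+}
+```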
+ +
+ +
+ +
+ + diff --git a/website/docs/library/diamond/DiamondUpgradeMod.mdx b/website/docs/library/diamond/DiamondUpgradeMod.mdx new file mode 100644 index 00000000..416d486c --- /dev/null +++ b/website/docs/library/diamond/DiamondUpgradeMod.mdx @@ -0,0 +1,560 @@ +--- +sidebar_position: 500 +title: "DiamondUpgradeMod" +description: "Upgrade diamond with add, replace, and remove functions" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/DiamondUpgradeMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Upgrade diamond with add, replace, and remove functions + + + +- Manages facet additions, replacements, and removals within a diamond. +- Supports optional `delegatecall` for state modification or initialization during upgrades. +- Emits events for all function changes (`DiamondFunctionAdded`, `DiamondFunctionReplaced`, `DiamondFunctionRemoved`). +- Includes specific errors to prevent invalid upgrade operations. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides functions to upgrade a diamond's facet implementations. It allows adding new functions, replacing existing ones, and removing functions entirely, all managed through diamond storage. The `upgradeDiamond` function orchestrates these changes, optionally performing a delegatecall for state modifications or initialization. 
+ +--- + +## Storage + +### DiamondStorage + +storage-location: erc8042:erc8109.diamond + + +{`struct DiamondStorage { + mapping(bytes4 functionSelector => FacetAndPosition) facetAndPosition; + /** + * Array of all function selectors that can be called in the diamond + */ + bytes4[] selectors; +}`} + + +--- +### FacetAndPosition + +Data stored for each function selector Facet address of function selector Position of selector in the 'bytes4[] selectors' array + + +{`struct FacetAndPosition { + address facet; + uint32 position; +}`} + + +--- +### FacetFunctions + + +{`struct FacetFunctions { + address facet; + bytes4[] selectors; +}`} + + +### State Variables + + + +## Functions + +### addFunctions + + +{`function addFunctions(address _facet, bytes4[] calldata _functionSelectors) ;`} + + +**Parameters:** + + + +--- +### getDiamondStorage + + +{`function getDiamondStorage() pure returns (DiamondStorage storage s);`} + + +--- +### removeFunctions + + +{`function removeFunctions(bytes4[] calldata _functionSelectors) ;`} + + +**Parameters:** + + + +--- +### replaceFunctions + + +{`function replaceFunctions(address _facet, bytes4[] calldata _functionSelectors) ;`} + + +**Parameters:** + + + +--- +### upgradeDiamond + +Upgrade the diamond by adding, replacing, or removing functions. - `_addFunctions` maps new selectors to their facet implementations. - `_replaceFunctions` updates existing selectors to new facet addresses. - `_removeFunctions` removes selectors from the diamond. Functions added first, then replaced, then removed. These events are emitted to record changes to functions: - `DiamondFunctionAdded` - `DiamondFunctionReplaced` - `DiamondFunctionRemoved` If `_delegate` is non-zero, the diamond performs a `delegatecall` to `_delegate` using `_functionCall`. The `DiamondDelegateCall` event is emitted. The `delegatecall` is done to alter a diamond's state or to initialize, modify, or remove state after an upgrade. However, if `_delegate` is zero, no `delegatecall` is made and no `DiamondDelegateCall` event is emitted. If _tag is non-zero or if _metadata.length > 0 then the `DiamondMetadata` event is emitted. + + +{`function upgradeDiamond( +FacetFunctions[] calldata _addFunctions, +FacetFunctions[] calldata _replaceFunctions, +bytes4[] calldata _removeFunctions, +address _delegate, +bytes calldata _functionCall, +bytes32 _tag, +bytes calldata _metadata +) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when a diamond's constructor function or a function from a facet makes a `delegatecall`. +
+ +
+ Signature: + +{`event DiamondDelegateCall(address indexed _delegate, bytes _functionCall);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a function is added to a diamond. +
+ +
+ Signature: + +{`event DiamondFunctionAdded(bytes4 indexed _selector, address indexed _facet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a function is removed from a diamond. +
+ +
+ Signature: + +{`event DiamondFunctionRemoved(bytes4 indexed _selector, address indexed _oldFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when changing the facet that will handle calls to a function. +
+ +
+ Signature: + +{`event DiamondFunctionReplaced(bytes4 indexed _selector, address indexed _oldFacet, address indexed _newFacet);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted to record arbitrary metadata about a diamond. The format of `_tag` and `_data` is not specified by the standard. +
+ +
+ Signature: + +{`event DiamondMetadata(bytes32 indexed _tag, bytes _data);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + + +
+ Signature: + +error CannotAddFunctionToDiamondThatAlreadyExists(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotRemoveFunctionThatDoesNotExist(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotRemoveImmutableFunction(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotReplaceFunctionThatDoesNotExist(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotReplaceFunctionWithTheSameFacet(bytes4 _selector); + +
+
+ + +
+ Signature: + +error CannotReplaceImmutableFunction(bytes4 _selector); + +
+
+ + +
+ Signature: + +error DelegateCallReverted(address _delegate, bytes _functionCall); + +
+
+ + +
+ Signature: + +error NoBytecodeAtAddress(address _contractAddress); + +
+
+ +
+ Thrown when no function selectors are provided for a facet. +
+ +
+ Signature: + +error NoSelectorsProvidedForFacet(address _facet); + +
+
+
+ + + + +## Best Practices + + +- Ensure the diamond's access control mechanism permits the caller to execute upgrade functions. +- Verify that `_addFunctions`, `_replaceFunctions`, and `_removeFunctions` do not conflict with immutable functions. +- Handle the `DelegateCallReverted` error if a `delegatecall` is performed and reverts. + + +## Integration Notes + + +This module interacts with diamond storage at `DIAMOND_STORAGE_POSITION`, identified by `keccak256("erc8109.diamond")`. The `DiamondStorage` struct maps each function selector to its facet address and position and tracks all registered selectors, serving as the root of diamond state. Functions added, replaced, or removed by this module directly modify the diamond's function selector-to-facet mapping, making these changes immediately visible to all facets interacting with the diamond proxy. + + +
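+
+The sketch below shows one way to drive `upgradeDiamond` from an upgrade facet installed on the diamond. It assumes the module is imported as a Solidity library named `DiamondMod`; the import path, contract name, and access-control placeholder are illustrative, not part of the module itself.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {DiamondMod} from "../diamond/DiamondMod.sol";
+
+/// @notice Minimal sketch of an upgrade facet that forwards upgrade data
+///         to the module. Functions are added, then replaced, then removed.
+contract ExampleUpgradeFacet {
+    function upgrade(
+        DiamondMod.FacetFunctions[] calldata _addFunctions,
+        DiamondMod.FacetFunctions[] calldata _replaceFunctions,
+        bytes4[] calldata _removeFunctions,
+        address _delegate,
+        bytes calldata _functionCall,
+        bytes32 _tag,
+        bytes calldata _metadata
+    ) external {
+        // NOTE: enforce your diamond's access control here before upgrading.
+        // Passing address(0) as _delegate skips the delegatecall and the
+        // DiamondDelegateCall event entirely.
+        DiamondMod.upgradeDiamond(
+            _addFunctions,
+            _replaceFunctions,
+            _removeFunctions,
+            _delegate,
+            _functionCall,
+            _tag,
+            _metadata
+        );
+    }
+}
+```
+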
+ +
+ +
+ +
+ + diff --git a/website/docs/library/diamond/_category_.json b/website/docs/library/diamond/_category_.json new file mode 100644 index 00000000..26c8cc37 --- /dev/null +++ b/website/docs/library/diamond/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Diamond Core", + "position": 1, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/diamond/index" + } +} diff --git a/website/docs/library/diamond/example/ExampleDiamond.mdx b/website/docs/library/diamond/example/ExampleDiamond.mdx new file mode 100644 index 00000000..c1f9726d --- /dev/null +++ b/website/docs/library/diamond/example/ExampleDiamond.mdx @@ -0,0 +1,153 @@ +--- +sidebar_position: 510 +title: "ExampleDiamond" +description: "Initializes diamond with facets and owner" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/diamond/example/ExampleDiamond.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Initializes diamond with facets and owner + + + +- Registers facets and their function selectors on diamond deployment. +- Establishes the initial owner of the diamond contract. +- Enables correct call routing through the diamond proxy pattern. + + +## Overview + +This constructor initializes a diamond contract by registering facets and their function selectors. It also sets the initial owner of the diamond. This setup ensures that the diamond proxy can correctly route calls to the appropriate facet implementation. + +--- + +## Storage + +## Functions + +### constructor + +Struct to hold facet address and its function selectors. struct FacetFunctions { address facet; bytes4[] selectors; } Initializes the diamond contract with facets, owner and other data. Adds all provided facets to the diamond's function selector mapping and sets the contract owner. Each facet in the array will have its function selectors registered to enable delegatecall routing. + + +{`constructor(DiamondMod.FacetFunctions[] memory _facets, address _diamondOwner) ;`} + + +**Parameters:** + + + +--- +### fallback + + +{`fallback() external payable;`} + + +--- +### receive + + +{`receive() external payable;`} + + + + + +## Best Practices + + +- Ensure all facets intended for the diamond are correctly registered with their selectors during initialization. +- Set the diamond owner with a secure, multi-signature wallet or a dedicated governance contract. +- Verify that the provided facet addresses are deployed and verified. + + +## Security Considerations + + +The constructor is critical for setting up the diamond's functionality. 
Ensure that only trusted addresses are provided as facet implementations and that the owner is set to a secure address. Validate that all function selectors are correctly mapped to their respective facets to prevent unexpected behavior. + + +
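+
+For illustration, a minimal deployment sketch follows. The deployer contract, import paths, and the two ERC-20 selectors are assumptions; register the selectors of the facets you actually deploy.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative import paths; adjust to your project layout.
+import {DiamondMod} from "../DiamondMod.sol";
+import {ExampleDiamond} from "./ExampleDiamond.sol";
+
+/// @notice Minimal sketch: deploy ExampleDiamond with one facet and an owner.
+contract DeployExampleDiamond {
+    function deploy(address _erc20Facet, address _owner) external returns (ExampleDiamond diamond) {
+        // The facet must already be deployed at `_erc20Facet`.
+        bytes4[] memory selectors = new bytes4[](2);
+        selectors[0] = bytes4(keccak256("transfer(address,uint256)"));
+        selectors[1] = bytes4(keccak256("balanceOf(address)"));
+
+        DiamondMod.FacetFunctions[] memory facets = new DiamondMod.FacetFunctions[](1);
+        facets[0] = DiamondMod.FacetFunctions({facet: _erc20Facet, selectors: selectors});
+
+        // The constructor registers each facet's selectors and sets the owner.
+        diamond = new ExampleDiamond(facets, _owner);
+    }
+}
+```
+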
+ +
+ +
+ +
+ + diff --git a/website/docs/library/diamond/example/_category_.json b/website/docs/library/diamond/example/_category_.json new file mode 100644 index 00000000..8e4d0ed5 --- /dev/null +++ b/website/docs/library/diamond/example/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "example", + "position": 99, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/diamond/example/index" + } +} diff --git a/website/docs/library/diamond/example/index.mdx b/website/docs/library/diamond/example/index.mdx new file mode 100644 index 00000000..2c21c33a --- /dev/null +++ b/website/docs/library/diamond/example/index.mdx @@ -0,0 +1,22 @@ +--- +title: "example" +description: "example components for Compose diamonds." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + example components for Compose diamonds. + + + + } + size="medium" + /> + diff --git a/website/docs/library/diamond/index.mdx b/website/docs/library/diamond/index.mdx new file mode 100644 index 00000000..9f73aec6 --- /dev/null +++ b/website/docs/library/diamond/index.mdx @@ -0,0 +1,57 @@ +--- +title: "Diamond Core" +description: "Core diamond proxy functionality for ERC-2535 diamonds." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Core diamond proxy functionality for ERC-2535 diamonds. + + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/index.mdx b/website/docs/library/index.mdx new file mode 100644 index 00000000..a664d292 --- /dev/null +++ b/website/docs/library/index.mdx @@ -0,0 +1,51 @@ +--- +title: "Library" +description: "API reference for all Compose modules and facets." +sidebar_class_name: "hidden" +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + API reference for all Compose modules and facets. 
+ + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/interfaceDetection/ERC165/ERC165Facet.mdx b/website/docs/library/interfaceDetection/ERC165/ERC165Facet.mdx new file mode 100644 index 00000000..ca56df22 --- /dev/null +++ b/website/docs/library/interfaceDetection/ERC165/ERC165Facet.mdx @@ -0,0 +1,159 @@ +--- +sidebar_position: 410 +title: "ERC165Facet" +description: "Supports ERC-165 interface detection for diamonds" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/interfaceDetection/ERC165/ERC165Facet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Supports ERC-165 interface detection for diamonds + + + +- Exposes `supportsInterface` for ERC-165 compliance. +- Utilizes a fixed storage slot for ERC-165 state. +- No external dependencies beyond standard Solidity. + + +## Overview + +This facet implements ERC-165 interface detection for diamonds. It exposes the `supportsInterface` function, allowing external contracts to query if the diamond supports specific interface IDs. This facet utilizes a fixed storage slot for its state, adhering to the diamond storage pattern. + +--- + +## Storage + +### ERC165Storage + + +{`struct ERC165Storage { + /** + * @notice Mapping of interface IDs to whether they are supported + */ + mapping(bytes4 => bool) supportedInterfaces; +}`} + + +### State Variables + + + +## Functions + +### supportsInterface + +Query if a contract implements an interface This function checks if the diamond supports the given interface ID + + +{`function supportsInterface(bytes4 _interfaceId) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + + + + +## Best Practices + + +- Ensure the `ERC165Facet` is added to your diamond during initialization. +- Call `supportsInterface` through the diamond proxy to query interface support. +- Verify storage compatibility if upgrading facets that interact with ERC-165 state. + + +## Security Considerations + + +The `supportsInterface` function is view-only and does not modify state. Input validation for `_interfaceId` is handled by standard Solidity bytes4 comparisons. Follow standard Solidity security practices. + + +
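+
+A minimal query sketch follows. The local `IERC165` interface simply mirrors the `supportsInterface` signature exposed by this facet; the checker contract is illustrative.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// @notice Mirrors the supportsInterface function routed to ERC165Facet.
+interface IERC165 {
+    function supportsInterface(bytes4 _interfaceId) external view returns (bool);
+}
+
+contract InterfaceChecker {
+    /// @notice Returns true if the diamond reports support for `_interfaceId`.
+    function diamondSupports(address _diamond, bytes4 _interfaceId) external view returns (bool) {
+        // The diamond proxy routes this call to ERC165Facet.supportsInterface.
+        return IERC165(_diamond).supportsInterface(_interfaceId);
+    }
+}
+```
+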
+ +
+ +
+ +
+ + diff --git a/website/docs/library/interfaceDetection/ERC165/ERC165Mod.mdx b/website/docs/library/interfaceDetection/ERC165/ERC165Mod.mdx new file mode 100644 index 00000000..78f21133 --- /dev/null +++ b/website/docs/library/interfaceDetection/ERC165/ERC165Mod.mdx @@ -0,0 +1,158 @@ +--- +sidebar_position: 1 +title: "ERC165Mod" +description: "ERC-165 interface detection using diamond storage" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/interfaceDetection/ERC165/ERC165Mod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +ERC-165 interface detection using diamond storage + + + +- Provides internal functions for ERC-165 interface detection. +- Uses a dedicated diamond storage position (`STORAGE_POSITION`) for interface mappings. +- No external dependencies or `using` directives, promoting explicitness. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions and storage for ERC-165 interface detection within a diamond. Facets import this module to register supported interfaces during initialization. These registrations are stored in shared diamond storage, making them visible to all facets. + +--- + +## Storage + +### ERC165Storage + + +{`struct ERC165Storage { + /* + * @notice Mapping of interface IDs to whether they are supported + */ + mapping(bytes4 => bool) supportedInterfaces; +}`} + + +### State Variables + + + +## Functions + +### getStorage + +Returns a pointer to the ERC-165 storage struct. Uses inline assembly to bind the storage struct to the fixed storage position. + + +{`function getStorage() pure returns (ERC165Storage storage s);`} + + +**Returns:** + + + +--- +### registerInterface + +Register that a contract supports an interface Call this function during initialization to register supported interfaces. For example, in an ERC721 facet initialization, you would call: `LibERC165.registerInterface(type(IERC721).interfaceId)` + + +{`function registerInterface(bytes4 _interfaceId) ;`} + + +**Parameters:** + + + + + + +## Best Practices + + +- Call `registerInterface` during facet initialization to declare supported interfaces. +- Ensure the `ERC165Storage` is initialized at the correct storage position before calling module functions. +- Verify that the `STORAGE_POSITION` value is correctly set in the diamond's implementation. + + +## Integration Notes + + +This module utilizes the diamond storage pattern, storing ERC-165 interface support data at a specific slot identified by `STORAGE_POSITION`. 
The `ERC165Storage` struct is bound to this slot using inline assembly. Any facet interacting with this module will access and modify the shared storage, ensuring interface support information is consistently available across the diamond. + + +
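+
+The sketch below registers an interface from a facet initializer, as described above. The import path, initializer name, and the ERC-721 interface id (`0x80ac58cd`) used for illustration are assumptions about your setup, not part of the module.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC165Mod} from "../interfaceDetection/ERC165/ERC165Mod.sol";
+
+/// @notice Minimal sketch of a facet initializer that registers ERC-165
+///         support for one interface id during diamond setup.
+contract ExampleInitFacet {
+    bytes4 internal constant ERC721_INTERFACE_ID = 0x80ac58cd;
+
+    function init() external {
+        // Writes to shared diamond storage at the module's STORAGE_POSITION,
+        // so supportsInterface(ERC721_INTERFACE_ID) returns true afterwards.
+        ERC165Mod.registerInterface(ERC721_INTERFACE_ID);
+    }
+}
+```
+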
+ +
+ + diff --git a/website/docs/library/interfaceDetection/ERC165/_category_.json b/website/docs/library/interfaceDetection/ERC165/_category_.json new file mode 100644 index 00000000..2396f18a --- /dev/null +++ b/website/docs/library/interfaceDetection/ERC165/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-165", + "position": 99, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/interfaceDetection/ERC165/index" + } +} diff --git a/website/docs/library/interfaceDetection/ERC165/index.mdx b/website/docs/library/interfaceDetection/ERC165/index.mdx new file mode 100644 index 00000000..14027b25 --- /dev/null +++ b/website/docs/library/interfaceDetection/ERC165/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-165" +description: "ERC-165 components for Compose diamonds." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-165 components for Compose diamonds. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/interfaceDetection/_category_.json b/website/docs/library/interfaceDetection/_category_.json new file mode 100644 index 00000000..a184d836 --- /dev/null +++ b/website/docs/library/interfaceDetection/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Interface Detection", + "position": 5, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/interfaceDetection/index" + } +} diff --git a/website/docs/library/interfaceDetection/index.mdx b/website/docs/library/interfaceDetection/index.mdx new file mode 100644 index 00000000..65448bd8 --- /dev/null +++ b/website/docs/library/interfaceDetection/index.mdx @@ -0,0 +1,22 @@ +--- +title: "Interface Detection" +description: "ERC-165 interface detection support." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-165 interface detection support. 
+ + + + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC1155/ERC1155Facet.mdx b/website/docs/library/token/ERC1155/ERC1155Facet.mdx new file mode 100644 index 00000000..d2b6ef20 --- /dev/null +++ b/website/docs/library/token/ERC1155/ERC1155Facet.mdx @@ -0,0 +1,684 @@ +--- +sidebar_position: 2 +title: "ERC1155Facet" +description: "Manages ERC-1155 fungible and non-fungible tokens within a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC1155/ERC1155Facet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages ERC-1155 fungible and non-fungible tokens within a diamond + + + +- Exposes standard ERC-1155 functions via diamond routing. +- Supports both single and batch transfers and balance checks. +- Integrates with diamond storage for persistent state. +- Emits standard ERC-1155 events for off-chain tracking. + + +## Overview + +This facet implements ERC-1155 token functionality within a diamond proxy. It exposes external functions for transfers, approvals, and balance checks, routing calls through the diamond's orchestration layer. Developers add this facet to enable ERC-1155 token operations, leveraging the diamond's upgradeability and modularity. + +--- + +## Storage + +### ERC1155Storage + + +{`struct ERC1155Storage { + mapping(uint256 id => mapping(address account => uint256 balance)) balanceOf; + mapping(address account => mapping(address operator => bool)) isApprovedForAll; + string uri; + string baseURI; + mapping(uint256 tokenId => string) tokenURIs; +}`} + + +### State Variables + + + +## Functions + +### uri + +Returns the URI for token type `_id`. If a token-specific URI is set in tokenURIs[_id], returns the concatenation of baseURI and tokenURIs[_id]. Note that baseURI is empty by default and must be set explicitly if concatenation is desired. If no token-specific URI is set, returns the default URI which applies to all token types. The default URI may contain the substring `{id}` which clients should replace with the actual token ID. + + +{`function uri(uint256 _id) external view returns (string memory);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### balanceOf + +Returns the amount of tokens of token type `id` owned by `account`. + + +{`function balanceOf(address _account, uint256 _id) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### balanceOfBatch + +Batched version of balanceOf. 
+ + +{`function balanceOfBatch(address[] calldata _accounts, uint256[] calldata _ids) + external + view + returns (uint256[] memory balances);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### setApprovalForAll + +Grants or revokes permission to `operator` to transfer the caller's tokens. Emits an ApprovalForAll event. + + +{`function setApprovalForAll(address _operator, bool _approved) external;`} + + +**Parameters:** + + + +--- +### isApprovedForAll + +Returns true if `operator` is approved to transfer `account`'s tokens. + + +{`function isApprovedForAll(address _account, address _operator) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### safeTransferFrom + +Transfers `value` amount of token type `id` from `from` to `to`. Emits a TransferSingle event. + + +{`function safeTransferFrom(address _from, address _to, uint256 _id, uint256 _value, bytes calldata _data) external;`} + + +**Parameters:** + + + +--- +### safeBatchTransferFrom + +Batched version of safeTransferFrom. Emits a TransferBatch event. + + +{`function safeBatchTransferFrom( + address _from, + address _to, + uint256[] calldata _ids, + uint256[] calldata _values, + bytes calldata _data +) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when `value` amount of tokens of type `id` are transferred from `from` to `to` by `operator`. +
+ +
+ Signature: + +{`event TransferSingle( + address indexed _operator, address indexed _from, address indexed _to, uint256 _id, uint256 _value +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Equivalent to multiple TransferSingle events, where `operator`, `from` and `to` are the same for all transfers. +
+ +
+ Signature: + +{`event TransferBatch( + address indexed _operator, address indexed _from, address indexed _to, uint256[] _ids, uint256[] _values +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when `account` grants or revokes permission to `operator` to transfer their tokens. +
+ +
+ Signature: + +{`event ApprovalForAll(address indexed _account, address indexed _operator, bool _approved);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when the URI for token type `id` changes to `value`. +
+ +
+ Signature: + +{`event URI(string _value, uint256 indexed _id);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Error indicating insufficient balance for a transfer. +
+ +
+ Signature: + +error ERC1155InsufficientBalance(address _sender, uint256 _balance, uint256 _needed, uint256 _tokenId); + +
+
+ +
+ Error indicating the sender address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidSender(address _sender); + +
+
+ +
+ Error indicating the receiver address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidReceiver(address _receiver); + +
+
+ +
+ Error indicating missing approval for an operator. +
+ +
+ Signature: + +error ERC1155MissingApprovalForAll(address _operator, address _owner); + +
+
+ +
+ Error indicating the approver address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidApprover(address _approver); + +
+
+ +
+ Error indicating the operator address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidOperator(address _operator); + +
+
+ +
+ Error indicating array length mismatch in batch operations. +
+ +
+ Signature: + +error ERC1155InvalidArrayLength(uint256 _idsLength, uint256 _valuesLength); + +
+
+
+ + + + +## Best Practices + + +- Initialize the `baseURI` and `uri` mappings during diamond setup if custom URIs are required. +- Use `setApprovalForAll` to grant operator permissions before executing transfers on behalf of another account. +- Ensure all token IDs and values are validated before calling transfer functions to prevent unexpected behavior. + + +## Security Considerations + + +Follow standard Solidity security practices. Input validation is crucial for token IDs, values, sender, and receiver addresses. The `safeTransferFrom` and `safeBatchTransferFrom` functions include checks for sufficient balance and operator approval. Reentrancy is mitigated by using the checks-effects-interactions pattern. + + +
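+
+A minimal caller sketch follows. The `IERC1155Diamond` interface mirrors only the facet functions used here; the client contract and its token holdings are illustrative.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// @notice Mirrors the ERC1155Facet functions used below.
+interface IERC1155Diamond {
+    function setApprovalForAll(address _operator, bool _approved) external;
+    function balanceOf(address _account, uint256 _id) external view returns (uint256);
+}
+
+contract ERC1155DiamondClient {
+    IERC1155Diamond public immutable diamond;
+
+    constructor(address _diamond) {
+        diamond = IERC1155Diamond(_diamond);
+    }
+
+    /// @notice Approves `_operator` to move this contract's tokens on the diamond.
+    function approveOperator(address _operator) external {
+        // Emits ApprovalForAll(address(this), _operator, true) on the diamond.
+        diamond.setApprovalForAll(_operator, true);
+    }
+
+    /// @notice Reads this contract's balance of token `_id` on the diamond.
+    function myBalance(uint256 _id) external view returns (uint256) {
+        return diamond.balanceOf(address(this), _id);
+    }
+}
+```
+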
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC1155/ERC1155Mod.mdx b/website/docs/library/token/ERC1155/ERC1155Mod.mdx new file mode 100644 index 00000000..e74f3129 --- /dev/null +++ b/website/docs/library/token/ERC1155/ERC1155Mod.mdx @@ -0,0 +1,630 @@ +--- +sidebar_position: 1 +title: "ERC1155Mod" +description: "Handles ERC-1155 token minting, burning, and transfers" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC1155/ERC1155Mod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Handles ERC-1155 token minting, burning, and transfers + + + +- Internal functions for minting, burning, and transfers. +- Supports safe transfers for ERC-1155 token contracts. +- Manages token URIs with `setBaseURI` and `setTokenURI` functions. +- Utilizes diamond storage for state management. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides core ERC-1155 functionality including minting, burning, and safe transfers. Facets can integrate this module to manage ERC-1155 tokens within the diamond, leveraging shared diamond storage for token balances and URIs. It ensures interoperability by adhering to EIP-1155 standards for safe transfers. + +--- + +## Storage + +### ERC1155Storage + +ERC-8042 compliant storage struct for ERC-1155 token data. storage-location: erc8042:compose.erc1155 + + +{`struct ERC1155Storage { + mapping(uint256 id => mapping(address account => uint256 balance)) balanceOf; + mapping(address account => mapping(address operator => bool)) isApprovedForAll; + string uri; + string baseURI; + mapping(uint256 tokenId => string) tokenURIs; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns a single token type from an address. Decreases the balance and emits a TransferSingle event. Reverts if the account has insufficient balance. + + +{`function burn(address _from, uint256 _id, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### burnBatch + +Burns multiple token types from an address in a single transaction. Decreases balances for each token type and emits a TransferBatch event. Reverts if the account has insufficient balance for any token type. + + +{`function burnBatch(address _from, uint256[] memory _ids, uint256[] memory _values) ;`} + + +**Parameters:** + + + +--- +### getStorage + +Returns the ERC-1155 storage struct from the predefined diamond storage slot. Uses inline assembly to set the storage slot reference. 
+ + +{`function getStorage() pure returns (ERC1155Storage storage s);`} + + +**Returns:** + + + +--- +### mint + +Mints a single token type to an address. Increases the balance and emits a TransferSingle event. Performs receiver validation if recipient is a contract. + + +{`function mint(address _to, uint256 _id, uint256 _value, bytes memory _data) ;`} + + +**Parameters:** + + + +--- +### mintBatch + +Mints multiple token types to an address in a single transaction. Increases balances for each token type and emits a TransferBatch event. Performs receiver validation if recipient is a contract. + + +{`function mintBatch(address _to, uint256[] memory _ids, uint256[] memory _values, bytes memory _data) ;`} + + +**Parameters:** + + + +--- +### safeBatchTransferFrom + +Safely transfers multiple token types from one address to another in a single transaction. Validates ownership, approval, and receiver address before updating balances for each token type. Performs ERC1155Receiver validation if recipient is a contract (safe transfer). Complies with EIP-1155 safe transfer requirements. + + +{`function safeBatchTransferFrom( +address _from, +address _to, +uint256[] memory _ids, +uint256[] memory _values, +address _operator +) ;`} + + +**Parameters:** + + + +--- +### safeTransferFrom + +Safely transfers a single token type from one address to another. Validates ownership, approval, and receiver address before updating balances. Performs ERC1155Receiver validation if recipient is a contract (safe transfer). Complies with EIP-1155 safe transfer requirements. + + +{`function safeTransferFrom(address _from, address _to, uint256 _id, uint256 _value, address _operator) ;`} + + +**Parameters:** + + + +--- +### setBaseURI + +Sets the base URI prefix for token-specific URIs. The base URI is concatenated with token-specific URIs set via setTokenURI. Does not affect the default URI used when no token-specific URI is set. + + +{`function setBaseURI(string memory _baseURI) ;`} + + +**Parameters:** + + + +--- +### setTokenURI + +Sets the token-specific URI for a given token ID. Sets tokenURIs[_tokenId] to the provided string and emits a URI event with the full computed URI. The emitted URI is the concatenation of baseURI and the token-specific URI. + + +{`function setTokenURI(uint256 _tokenId, string memory _tokenURI) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when multiple token types are transferred. +
+ +
+ Signature: + +{`event TransferBatch( +address indexed _operator, address indexed _from, address indexed _to, uint256[] _ids, uint256[] _values +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a single token type is transferred. +
+ +
+ Signature: + +{`event TransferSingle( +address indexed _operator, address indexed _from, address indexed _to, uint256 _id, uint256 _value +);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when the URI for token type `_id` changes to `_value`. +
+ +
+ Signature: + +{`event URI(string _value, uint256 indexed _id);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when an account has insufficient balance for a transfer or burn operation. +
+ +
+ Signature: + +error ERC1155InsufficientBalance(address _sender, uint256 _balance, uint256 _needed, uint256 _tokenId); + +
+
+ +
+ Thrown when array lengths don't match in batch operations. +
+ +
+ Signature: + +error ERC1155InvalidArrayLength(uint256 _idsLength, uint256 _valuesLength); + +
+
+ +
+ Thrown when the receiver address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid. +
+ +
+ Signature: + +error ERC1155InvalidSender(address _sender); + +
+
+ +
+ Thrown when missing approval for an operator. +
+ +
+ Signature: + +error ERC1155MissingApprovalForAll(address _operator, address _owner); + +
+
+
+ + + + +## Best Practices + + +- Call `safeTransferFrom` or `safeBatchTransferFrom` for standard ERC-1155 transfers to ensure receiver contract compatibility. +- Use `mint` and `burn` functions for internal token lifecycle management. +- Ensure correct `_operator` is passed for transfers to maintain accurate approval tracking. + + +## Integration Notes + + +This module utilizes the diamond storage pattern, storing ERC-1155 state within a dedicated slot identified by `keccak256("compose.erc1155")`. The `ERC1155Storage` struct, containing `uri` and `baseURI` fields, is managed at this position. Functions like `mint`, `burn`, and transfers directly interact with this shared storage, ensuring all facets accessing this storage see consistent token balances and URI information. + + +
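+
+The sketch below shows a custom facet calling the module's internal functions. The import path, facet name, and the access-control placeholder are assumptions to adapt to your diamond.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC1155Mod} from "../token/ERC1155/ERC1155Mod.sol";
+
+/// @notice Minimal sketch of a custom facet that mints tokens and sets a
+///         token URI through the module's shared diamond storage.
+contract ExampleMintFacet {
+    function mintReward(address _to, uint256 _id, uint256 _value) external {
+        // NOTE: add your diamond's access control check here before minting.
+        // Emits TransferSingle and validates the receiver if `_to` is a contract.
+        ERC1155Mod.mint(_to, _id, _value, "");
+    }
+
+    function setRewardURI(uint256 _id, string calldata _uri) external {
+        // Emits a URI event with the concatenation of baseURI and `_uri`.
+        ERC1155Mod.setTokenURI(_id, _uri);
+    }
+}
+```
+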
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC1155/_category_.json b/website/docs/library/token/ERC1155/_category_.json new file mode 100644 index 00000000..cdb57d9a --- /dev/null +++ b/website/docs/library/token/ERC1155/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-1155", + "position": 3, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC1155/index" + } +} diff --git a/website/docs/library/token/ERC1155/index.mdx b/website/docs/library/token/ERC1155/index.mdx new file mode 100644 index 00000000..02777ddf --- /dev/null +++ b/website/docs/library/token/ERC1155/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-1155" +description: "ERC-1155 multi-token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-1155 multi-token implementations. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC20/ERC20/ERC20BurnFacet.mdx b/website/docs/library/token/ERC20/ERC20/ERC20BurnFacet.mdx new file mode 100644 index 00000000..ad6b5586 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20/ERC20BurnFacet.mdx @@ -0,0 +1,256 @@ +--- +sidebar_position: 3 +title: "ERC20BurnFacet" +description: "Burns ERC-20 tokens from caller or allowance" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20/ERC20BurnFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Burns ERC-20 tokens from caller or allowance + + + +- Exposes `burn` and `burnFrom` functions for token destruction. +- Emits `Transfer` events to zero address upon successful burns. +- Accesses ERC20 storage via internal `getStorage` function. +- Compatible with ERC-2535 diamond standard. + + +## Overview + +This facet implements burning functionality for ERC-20 tokens within a diamond. It provides external functions to reduce token supply, accessible via the diamond proxy. Developers integrate this facet to enable token destruction while maintaining the diamond's upgradeability and modularity. + +--- + +## Storage + +### ERC20Storage + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; + mapping(address owner => mapping(address spender => uint256 allowance)) allowance; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns (destroys) a specific amount of tokens from the caller's balance. Emits a Transfer event to the zero address. 
+ + +{`function burn(uint256 _value) external;`} + + +**Parameters:** + + + +--- +### burnFrom + +Burns tokens from another account, deducting from the caller's allowance. Emits a Transfer event to the zero address. + + +{`function burnFrom(address _account, uint256 _value) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when tokens are transferred between two addresses. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when an account has insufficient balance for a transfer or burn. +
+ +
+ Signature: + +error ERC20InsufficientBalance(address _sender, uint256 _balance, uint256 _needed); + +
+
+ +
+ Thrown when a spender tries to use more than the approved allowance. +
+ +
+ Signature: + +error ERC20InsufficientAllowance(address _spender, uint256 _allowance, uint256 _needed); + +
+
+
+ + + + +## Best Practices + + +- Ensure the `ERC20BurnFacet` is correctly initialized with necessary storage bindings. +- Verify that access control is properly configured for `burnFrom` if restricted. +- Test token balance and allowance checks thoroughly before and after burning operations. + + +## Security Considerations + + +The `burn` function checks for sufficient caller balance before burning. The `burnFrom` function checks for sufficient allowance and caller balance. Both functions revert with `ERC20InsufficientBalance` or `ERC20InsufficientAllowance` if checks fail. Follow standard Solidity security practices for input validation and access control. + + +
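+
+A minimal caller sketch follows. The `IERC20Burnable` interface mirrors the two facet functions; the caller contract is illustrative and assumes it holds tokens or an allowance on the diamond.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// @notice Mirrors the ERC20BurnFacet functions used below.
+interface IERC20Burnable {
+    function burn(uint256 _value) external;
+    function burnFrom(address _account, uint256 _value) external;
+}
+
+contract BurnCaller {
+    /// @notice Burns `_value` tokens from this contract's own balance on the diamond.
+    function burnOwn(address _diamond, uint256 _value) external {
+        // Routed to ERC20BurnFacet.burn; emits Transfer to the zero address.
+        IERC20Burnable(_diamond).burn(_value);
+    }
+
+    /// @notice Burns `_value` tokens from `_account` using this contract's allowance.
+    function burnWithAllowance(address _diamond, address _account, uint256 _value) external {
+        // Reverts with ERC20InsufficientAllowance if the allowance is too low.
+        IERC20Burnable(_diamond).burnFrom(_account, _value);
+    }
+}
+```
+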
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20/ERC20Facet.mdx b/website/docs/library/token/ERC20/ERC20/ERC20Facet.mdx new file mode 100644 index 00000000..c1965929 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20/ERC20Facet.mdx @@ -0,0 +1,566 @@ +--- +sidebar_position: 2 +title: "ERC20Facet" +description: "Implements ERC-20 token functionality within a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20/ERC20Facet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Implements ERC-20 token functionality within a diamond + + + +- Exposes standard ERC-20 functions via the diamond proxy. +- Manages token state (balances, allowances, total supply) using diamond storage. +- Emits ERC-20 compliant Approval and Transfer events. +- Includes specific custom errors for common ERC-20 failure conditions. + + +## Overview + +This facet implements standard ERC-20 token functionality as external functions within a diamond proxy. It manages token state, including total supply, balances, and allowances, by accessing shared diamond storage. Developers integrate this facet to expose a compliant ERC-20 token interface while leveraging the diamond's upgradeability and composability. + +--- + +## Storage + +### ERC20Storage + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; + mapping(address owner => mapping(address spender => uint256 allowance)) allowance; + uint8 decimals; + string name; + string symbol; +}`} + + +### State Variables + + + +## Functions + +### name + +Returns the name of the token. + + +{`function name() external view returns (string memory);`} + + +**Returns:** + + + +--- +### symbol + +Returns the symbol of the token. + + +{`function symbol() external view returns (string memory);`} + + +**Returns:** + + + +--- +### decimals + +Returns the number of decimals used for token precision. + + +{`function decimals() external view returns (uint8);`} + + +**Returns:** + + + +--- +### totalSupply + +Returns the total supply of tokens. + + +{`function totalSupply() external view returns (uint256);`} + + +**Returns:** + + + +--- +### balanceOf + +Returns the balance of a specific account. + + +{`function balanceOf(address _account) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### allowance + +Returns the remaining number of tokens that a spender is allowed to spend on behalf of an owner. 
+ + +{`function allowance(address _owner, address _spender) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### approve + +Approves a spender to transfer up to a certain amount of tokens on behalf of the caller. Emits an Approval event. + + +{`function approve(address _spender, uint256 _value) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### transfer + +Transfers tokens to another address. Emits a Transfer event. + + +{`function transfer(address _to, uint256 _value) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### transferFrom + +Transfers tokens on behalf of another account, provided sufficient allowance exists. Emits a Transfer event and decreases the spender's allowance. + + +{`function transferFrom(address _from, address _to, uint256 _value) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +## Events + + + +
+ Emitted when an approval is made for a spender by an owner. +
+ +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are transferred between two addresses. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when an account has insufficient balance for a transfer or burn. +
+ +
+ Signature: + +error ERC20InsufficientBalance(address _sender, uint256 _balance, uint256 _needed); + +
+
+ +
+ Thrown when the sender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSender(address _sender); + +
+
+ +
+ Thrown when the receiver address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when a spender tries to use more than the approved allowance. +
+ +
+ Signature: + +error ERC20InsufficientAllowance(address _spender, uint256 _allowance, uint256 _needed); + +
+
+ +
+ Thrown when the spender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSpender(address _spender); + +
+
+
+ + + + +## Best Practices + + +- Initialize token name, symbol, decimals, and total supply during diamond deployment. +- Enforce access control on functions that modify token state if required by your diamond's architecture. +- Ensure storage layout compatibility when upgrading or adding new facets to maintain state integrity. + + +## Security Considerations + + +All state-changing functions (approve, transfer, transferFrom) implement checks-effects-interactions pattern. Input parameters are validated, and custom errors are used for insufficient balance, allowance, or invalid addresses. Follow standard Solidity security practices for external calls and access control. + + +
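+
+A minimal client sketch follows. The `IERC20Diamond` interface mirrors only the facet functions used here; the client contract is illustrative.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// @notice Mirrors the ERC20Facet functions used below.
+interface IERC20Diamond {
+    function approve(address _spender, uint256 _value) external returns (bool);
+    function transfer(address _to, uint256 _value) external returns (bool);
+    function balanceOf(address _account) external view returns (uint256);
+}
+
+contract ERC20DiamondClient {
+    IERC20Diamond public immutable token;
+
+    constructor(address _diamond) {
+        // The diamond proxy routes these calls to ERC20Facet.
+        token = IERC20Diamond(_diamond);
+    }
+
+    /// @notice Approves `_spender` to spend `_value` of this contract's tokens.
+    function approveSpender(address _spender, uint256 _value) external {
+        token.approve(_spender, _value); // emits Approval
+    }
+
+    /// @notice Transfers `_value` tokens from this contract's balance to `_to`.
+    function pay(address _to, uint256 _value) external {
+        token.transfer(_to, _value); // emits Transfer
+    }
+}
+```
+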
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20/ERC20Mod.mdx b/website/docs/library/token/ERC20/ERC20/ERC20Mod.mdx new file mode 100644 index 00000000..01846e17 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20/ERC20Mod.mdx @@ -0,0 +1,459 @@ +--- +sidebar_position: 1 +title: "ERC20Mod" +description: "Internal functions and storage for ERC-20 token logic" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20/ERC20Mod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Internal functions and storage for ERC-20 token logic + + + +- Provides internal functions for ERC-20 operations, suitable for facet implementation. +- Leverages diamond storage pattern (EIP-8042) for state management. +- Emits `Transfer` and `Approval` events upon relevant state changes. +- Includes specific errors for common ERC-20 failure conditions. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for core ERC-20 operations, including minting, burning, approvals, and transfers. Facets can import this module to implement ERC-20 functionality while leveraging shared diamond storage for token state. This approach ensures consistency and efficient storage utilization across multiple facets within a diamond. + +--- + +## Storage + +### ERC20Storage + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; + mapping(address owner => mapping(address spender => uint256 allowance)) allowance; + uint8 decimals; + string name; + string symbol; +}`} + + +### State Variables + + + +## Functions + +### approve + +Approves a spender to transfer tokens on behalf of the caller. Sets the allowance for the spender. + + +{`function approve(address _spender, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### burn + +Burns tokens from a specified address. Decreases both total supply and the sender's balance. + + +{`function burn(address _account, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### getStorage + +Returns a pointer to the ERC-20 storage struct. Uses inline assembly to bind the storage struct to the fixed storage position. + + +{`function getStorage() pure returns (ERC20Storage storage s);`} + + +**Returns:** + + + +--- +### mint + +Mints new tokens to a specified address. Increases both total supply and the recipient's balance. + + +{`function mint(address _account, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### transfer + +Transfers tokens from the caller to another address. 
Updates balances directly without allowance mechanism. + + +{`function transfer(address _to, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### transferFrom + +Transfers tokens from one address to another using an allowance. Deducts the spender's allowance and updates balances. + + +{`function transferFrom(address _from, address _to, uint256 _value) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when an approval is made for a spender by an owner. +
+ +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are transferred between addresses. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when a spender tries to spend more than their allowance. +
+ +
+ Signature: + +error ERC20InsufficientAllowance(address _spender, uint256 _allowance, uint256 _needed); + +
+
+ +
+ Thrown when a sender attempts to transfer or burn more tokens than their balance. +
+ +
+ Signature: + +error ERC20InsufficientBalance(address _sender, uint256 _balance, uint256 _needed); + +
+
+ +
+ Thrown when the receiver address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSender(address _sender); + +
+
+ +
+ Thrown when the spender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSpender(address _spender); + +
+
+
+ + + + +## Best Practices + + +- Ensure the `ERC20Mod` is correctly initialized with the diamond's storage slot. +- Call module functions with appropriate checks for balances and allowances before execution. +- Handle custom errors returned by module functions, such as `ERC20InsufficientBalance` or `ERC20InsufficientAllowance`. + + +## Integration Notes + + +This module utilizes the diamond storage pattern, reading and writing to a designated storage slot identified by `keccak256("compose.erc20")`. All state modifications and reads performed through `ERC20Mod` functions directly interact with this shared storage. Changes are immediately visible to any facet that accesses the same storage slot, enabling composable state management. + + +
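+
+The sketch below shows a custom facet that mints and burns through the module. The import path, facet name, and the access-control placeholder are assumptions to adapt to your diamond.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC20Mod} from "../token/ERC20/ERC20/ERC20Mod.sol";
+
+/// @notice Minimal sketch of a custom facet that adjusts token supply
+///         through the module's shared diamond storage.
+contract ExampleSupplyFacet {
+    function issue(address _to, uint256 _value) external {
+        // NOTE: add your diamond's access control check here before minting.
+        // Increases totalSupply and the recipient's balance; emits Transfer.
+        ERC20Mod.mint(_to, _value);
+    }
+
+    function redeem(address _from, uint256 _value) external {
+        // Decreases totalSupply and the holder's balance; emits Transfer.
+        // Reverts with ERC20InsufficientBalance if `_from` holds less than `_value`.
+        ERC20Mod.burn(_from, _value);
+    }
+}
+```
+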
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20/_category_.json b/website/docs/library/token/ERC20/ERC20/_category_.json new file mode 100644 index 00000000..bd8d3da5 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-20", + "position": 1, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC20/ERC20/index" + } +} diff --git a/website/docs/library/token/ERC20/ERC20/index.mdx b/website/docs/library/token/ERC20/ERC20/index.mdx new file mode 100644 index 00000000..ac374189 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20/index.mdx @@ -0,0 +1,36 @@ +--- +title: "ERC-20" +description: "ERC-20 fungible token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-20 fungible token implementations. + + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableFacet.mdx b/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableFacet.mdx new file mode 100644 index 00000000..88f69ba8 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableFacet.mdx @@ -0,0 +1,431 @@ +--- +sidebar_position: 2 +title: "ERC20BridgeableFacet" +description: "Cross-chain ERC-20 token minting and burning" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20Bridgeable/ERC20BridgeableFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Cross-chain ERC-20 token minting and burning + + + +- Enables cross-chain token minting and burning. +- Restricts minting and burning to addresses with the `trusted-bridge` role. +- Integrates with the diamond storage pattern for token state management. +- Follows Compose's convention of explicit function calls. + + +## Overview + +This facet enables cross-chain ERC-20 token minting and burning operations within a diamond. It exposes functions that are callable only by addresses with the `trusted-bridge` role, ensuring secure and controlled token movements across different chains. Integrate this facet to manage bridged token supply. 
+ +--- + +## Storage + +### ERC20Storage + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; +}`} + + +--- +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; +}`} + + +### State Variables + + + +## Functions + +### crosschainMint + +Cross-chain mint — callable only by an address having the `trusted-bridge` role. + + +{`function crosschainMint(address _account, uint256 _value) external;`} + + +**Parameters:** + + + +--- +### crosschainBurn + +Cross-chain burn — callable only by an address having the `trusted-bridge` role. + + +{`function crosschainBurn(address _from, uint256 _value) external;`} + + +**Parameters:** + + + +--- +### checkTokenBridge + +Internal check to check if the bridge (caller) is trusted. Reverts if caller is zero or not in the AccessControl `trusted-bridge` role. + + +{`function checkTokenBridge(address _caller) external view;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when tokens are minted via a cross-chain bridge. +
+ +
+ Signature: + +{`event CrosschainMint(address indexed _to, uint256 _amount, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are burned via a cross-chain bridge. +
+ +
+ Signature: + +{`event CrosschainBurn(address indexed _from, uint256 _amount, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are transferred between two addresses. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when a provided receiver is invalid (e.g., the zero address). +
+ +
+ Signature: + +error ERC20InvalidReciever(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSender(address _sender); + +
+
+ +
+ Thrown when the caller is not a trusted bridge. +
+ +
+ Signature: + +error ERC20InvalidBridgeAccount(address _caller); + +
+
+ +
+ Thrown when the caller address is invalid. +
+ +
+ Signature: + +error ERC20InvalidCallerAddress(address _caller); + +
+
+ +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+ + +
+ Signature: + +error ERC20InsufficientBalance(address _from, uint256 _accountBalance, uint256 _value); + +
+
+
+ + + + +## Best Practices + + +- Ensure the `trusted-bridge` role is assigned only to authorized bridge contracts or entities. +- Call `crosschainMint` and `crosschainBurn` through the diamond proxy to ensure correct function routing. +- Validate that the caller possesses the `trusted-bridge` role before invoking minting or burning operations. + + +## Security Considerations + + +The `crosschainMint` and `crosschainBurn` functions are protected by access control, requiring the caller to have the `trusted-bridge` role. The `checkTokenBridge` internal function enforces this role check. Input validation for `_account`, `_from`, and `_value` is crucial to prevent unexpected behavior. Follow standard Solidity security practices for handling token transfers and state updates. + + +
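+
+A minimal bridge-endpoint sketch follows. The endpoint contract, its deployment, and the message-verification comment are assumptions; the endpoint's address must hold the `trusted-bridge` role on the diamond or both calls revert.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// @notice Mirrors the ERC20BridgeableFacet functions used below.
+interface IERC20Bridgeable {
+    function crosschainMint(address _account, uint256 _value) external;
+    function crosschainBurn(address _from, uint256 _value) external;
+}
+
+contract ExampleBridgeEndpoint {
+    IERC20Bridgeable public immutable token;
+
+    constructor(address _diamond) {
+        token = IERC20Bridgeable(_diamond);
+    }
+
+    /// @notice Mints bridged tokens once a deposit from another chain is proven.
+    function finalizeDeposit(address _account, uint256 _value) external {
+        // NOTE: verify the cross-chain message before minting in a real bridge.
+        token.crosschainMint(_account, _value); // emits CrosschainMint
+    }
+
+    /// @notice Burns tokens to start a withdrawal to another chain.
+    function startWithdrawal(address _from, uint256 _value) external {
+        token.crosschainBurn(_from, _value); // emits CrosschainBurn
+    }
+}
+```
+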
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableMod.mdx b/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableMod.mdx new file mode 100644 index 00000000..f07bf303 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Bridgeable/ERC20BridgeableMod.mdx @@ -0,0 +1,446 @@ +--- +sidebar_position: 1 +title: "ERC20BridgeableMod" +description: "Manages cross-chain token transfers with role-based access control" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20Bridgeable/ERC20BridgeableMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages cross-chain token transfers with role-based access control + + + +- Internal functions for cross-chain minting and burning. +- Enforces `trusted-bridge` role for cross-chain operations. +- Utilizes diamond storage for bridge access control configuration. +- Compatible with ERC-2535 diamonds. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for cross-chain token minting and burning, secured by role-based access control. Facets can import this module to integrate trusted bridge functionality, ensuring only authorized addresses can perform cross-chain operations. Changes to token bridges are managed via diamond storage, visible to all integrated facets. + +--- + +## Storage + +### AccessControlStorage + + +{`struct AccessControlStorage { + mapping(address account => mapping(bytes32 role => bool hasRole)) hasRole; +}`} + + +--- +### ERC20Storage + +ERC-8042 compliant storage struct for ERC20 token data. storage-location: erc8042:compose.erc20 + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; +}`} + + +### State Variables + + + +## Functions + +### checkTokenBridge + +Internal check to check if the bridge (caller) is trusted. Reverts if caller is zero or not in the AccessControl `trusted-bridge` role. + + +{`function checkTokenBridge(address _caller) view;`} + + +**Parameters:** + + + +--- +### crosschainBurn + +Cross-chain burn — callable only by an address having the `trusted-bridge` role. + + +{`function crosschainBurn(address _from, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### crosschainMint + +Cross-chain mint — callable only by an address having the `trusted-bridge` role. 
+ + +{`function crosschainMint(address _account, uint256 _value) ;`} + + +**Parameters:** + + + +--- +### getAccessControlStorage + +helper to return AccessControlStorage at its diamond slot + + +{`function getAccessControlStorage() pure returns (AccessControlStorage storage s);`} + + +--- +### getERC20Storage + +Returns the ERC20 storage struct from the predefined diamond storage slot. Uses inline assembly to set the storage slot reference. + + +{`function getERC20Storage() pure returns (ERC20Storage storage s);`} + + +**Returns:** + + + +## Events + + + +
+ Emitted when a crosschain transfer burns tokens. +
+ +
+ Signature: + +{`event CrosschainBurn(address indexed _from, uint256 _amount, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are minted via a cross-chain bridge. +
+ +
+ Signature: + +{`event CrosschainMint(address indexed _to, uint256 _amount, address indexed _sender);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when tokens are transferred between two addresses. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the account does not have a specific role. +
+ +
+ Signature: + +error AccessControlUnauthorizedAccount(address _account, bytes32 _role); + +
+
+ +
+ Thrown when the `_from` account balance is insufficient for the requested value.
+ +
+ Signature: + +error ERC20InsufficientBalance(address _from, uint256 _accountBalance, uint256 _value); + +
+
+ +
+ Thrown when the caller is not a trusted bridge.
+ +
+ Signature: + +error ERC20InvalidBridgeAccount(address _caller); + +
+
+ +
+ Thrown when the caller address is invalid (e.g., zero address).
+ +
+ Signature: + +error ERC20InvalidCallerAddress(address _caller); + +
+
+ +
+ Thrown when a provided receiver is invalid (e.g., zero address). The module uses ERC-8042 for storage location standardization and ERC-6093 for error conventions.
+ +
+ Signature: + +error ERC20InvalidReciever(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSender(address _sender); + +
+
+
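+## Usage Example
+
+A minimal sketch of a custom facet that forwards to this module. It assumes `ERC20BridgeableMod` is importable as a Solidity library; the import path and the `CustomBridgeFacet` name are illustrative and should be adapted to your project layout.
+
+{`pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC20BridgeableMod} from "compose/token/ERC20/ERC20Bridgeable/ERC20BridgeableMod.sol";
+
+contract CustomBridgeFacet {
+    /// Mints bridged tokens; per the module docs, only an address holding the
+    /// trusted-bridge role can complete this call.
+    function bridgeMint(address _account, uint256 _value) external {
+        ERC20BridgeableMod.crosschainMint(_account, _value);
+    }
+
+    /// Burns tokens before a cross-chain transfer, under the same role requirement.
+    function bridgeBurn(address _from, uint256 _value) external {
+        ERC20BridgeableMod.crosschainBurn(_from, _value);
+    }
+}`}
+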
+ + + + +## Best Practices + + +- Ensure the calling facet possesses the `trusted-bridge` role before invoking crosschain functions. +- Verify that `_caller` is a valid, non-zero address when checking bridge trust. +- Handle potential reverts from `checkTokenBridge` before executing cross-chain operations. + + +## Integration Notes + + +This module interacts with diamond storage using the `ERC20_STORAGE_POSITION` key, which is defined as `keccak256("compose.erc20")`. Functions like `checkTokenBridge` read from and functions like `crosschainMint` and `crosschainBurn` modify state related to trusted bridges. All state changes are immediately visible to other facets accessing the same diamond storage. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20Bridgeable/_category_.json b/website/docs/library/token/ERC20/ERC20Bridgeable/_category_.json new file mode 100644 index 00000000..03768f44 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Bridgeable/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-20 Bridgeable", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC20/ERC20Bridgeable/index" + } +} diff --git a/website/docs/library/token/ERC20/ERC20Bridgeable/index.mdx b/website/docs/library/token/ERC20/ERC20Bridgeable/index.mdx new file mode 100644 index 00000000..a301f7c7 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Bridgeable/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-20 Bridgeable" +description: "ERC-20 Bridgeable extension for ERC-20 tokens." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-20 Bridgeable extension for ERC-20 tokens. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitFacet.mdx b/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitFacet.mdx new file mode 100644 index 00000000..3c632600 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitFacet.mdx @@ -0,0 +1,327 @@ +--- +sidebar_position: 2 +title: "ERC20PermitFacet" +description: "EIP-2612 permit functionality for ERC-20 tokens" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20Permit/ERC20PermitFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +EIP-2612 permit functionality for ERC-20 tokens + + + +- Implements EIP-2612 `permit` function for off-chain approvals. +- Exposes `nonces` and `DOMAIN_SEPARATOR` for signature verification. +- Integrates with Compose diamond storage pattern. + + +## Overview + +This facet implements EIP-2612 permit functionality, enabling off-chain approvals for ERC-20 token transfers within a Compose diamond. It exposes the `permit` function to set allowances via signatures and provides `nonces` and `DOMAIN_SEPARATOR` for signature validation. Developers integrate this facet to allow users to grant token spending permissions without direct on-chain transactions. 
+ +--- + +## Storage + +### ERC20Storage + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; + mapping(address owner => mapping(address spender => uint256 allowance)) allowance; + uint8 decimals; + string name; +}`} + + +--- +### ERC20PermitStorage + + +{`struct ERC20PermitStorage { + mapping(address owner => uint256) nonces; +}`} + + +### State Variables + + + +## Functions + +### nonces + +Returns the current nonce for an owner. This value changes each time a permit is used. + + +{`function nonces(address _owner) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### DOMAIN_SEPARATOR + +Returns the domain separator used in the encoding of the signature for permit. This value is unique to a contract and chain ID combination to prevent replay attacks. + + +{`function DOMAIN_SEPARATOR() external view returns (bytes32);`} + + +**Returns:** + + + +--- +### permit + +Sets the allowance for a spender via a signature. This function implements EIP-2612 permit functionality. + + +{`function permit( + address _owner, + address _spender, + uint256 _value, + uint256 _deadline, + uint8 _v, + bytes32 _r, + bytes32 _s +) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when an owner approves a spender.
+ +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when a permit signature is invalid or expired. +
+ +
+ Signature: + +error ERC2612InvalidSignature( + address _owner, address _spender, uint256 _value, uint256 _deadline, uint8 _v, bytes32 _r, bytes32 _s +); + +
+
+ +
+ Thrown when the spender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSpender(address _spender); + +
+
+
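+## Usage Example
+
+The sketch below relays a user-signed permit to the diamond. The interface is built only from the external signatures documented above; the `PermitRelayer` contract name is illustrative.
+
+{`pragma solidity ^0.8.30;
+
+interface IERC20Permit {
+    function permit(
+        address _owner,
+        address _spender,
+        uint256 _value,
+        uint256 _deadline,
+        uint8 _v,
+        bytes32 _r,
+        bytes32 _s
+    ) external;
+    function nonces(address _owner) external view returns (uint256);
+    function DOMAIN_SEPARATOR() external view returns (bytes32);
+}
+
+contract PermitRelayer {
+    /// Forwards an EIP-2612 signature produced off-chain by _owner to the diamond.
+    function relayPermit(
+        address _diamond,
+        address _owner,
+        address _spender,
+        uint256 _value,
+        uint256 _deadline,
+        uint8 _v,
+        bytes32 _r,
+        bytes32 _s
+    ) external {
+        IERC20Permit(_diamond).permit(_owner, _spender, _value, _deadline, _v, _r, _s);
+    }
+}`}
+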
+ + + + +## Best Practices + + +- Integrate the `ERC20PermitFacet` during diamond initialization. +- Ensure `DOMAIN_SEPARATOR` is correctly configured for the specific chain and diamond. +- Validate signature parameters (`_v`, `_r`, `_s`) before relaying them to the `permit` function. + + +## Security Considerations + + +The `permit` function allows setting allowances via signatures. Ensure correct implementation of signature verification logic by consumers of this facet. Reentrancy is not applicable as `permit` only modifies storage. Input validation for spender and value is handled internally. Follow standard Solidity security practices. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitMod.mdx b/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitMod.mdx new file mode 100644 index 00000000..4f76c096 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Permit/ERC20PermitMod.mdx @@ -0,0 +1,286 @@ +--- +sidebar_position: 1 +title: "ERC20PermitMod" +description: "ERC-2612 permit and domain separator logic" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC20/ERC20Permit/ERC20PermitMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +ERC-2612 permit and domain separator logic + + + +- Provides internal functions for ERC-2612 permit logic. +- Includes `DOMAIN_SEPARATOR()` for signature validation. +- Leverages diamond storage pattern for state management. +- Functions are internal, intended for use by other facets. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing ERC-2612 permit logic and domain separators within a diamond. Facets can import this module to handle permit validation and signature encoding, leveraging shared diamond storage. The domain separator ensures signature uniqueness per contract and chain ID, preventing replay attacks. + +--- + +## Storage + +### ERC20PermitStorage + +storage-location: erc8042:compose.erc20.permit + + +{`struct ERC20PermitStorage { + mapping(address owner => uint256) nonces; +}`} + + +--- +### ERC20Storage + +storage-location: erc8042:compose.erc20 + + +{`struct ERC20Storage { + mapping(address owner => uint256 balance) balanceOf; + uint256 totalSupply; + mapping(address owner => mapping(address spender => uint256 allowance)) allowance; + uint8 decimals; + string name; +}`} + + +### State Variables + + + +## Functions + +### DOMAIN_SEPARATOR + +Returns the domain separator used in the encoding of the signature for {permit}. This value is unique to a contract and chain ID combination to prevent replay attacks. + + +{`function DOMAIN_SEPARATOR() view returns (bytes32);`} + + +**Returns:** + + + +--- +### getERC20Storage + + +{`function getERC20Storage() pure returns (ERC20Storage storage s);`} + + +--- +### getPermitStorage + + +{`function getPermitStorage() pure returns (ERC20PermitStorage storage s);`} + + +--- +### permit + +Validates a permit signature and sets allowance. Emits Approval event; must be emitted by the calling facet/contract. 
+ + +{`function permit(address _owner, address _spender, uint256 _value, uint256 _deadline, uint8 _v, bytes32 _r, bytes32 _s) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when an owner approves a spender.
+ +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 _value);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the spender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC20InvalidSpender(address _spender); + +
+
+ +
+ Thrown when a permit signature is invalid or expired. +
+ +
+ Signature: + +error ERC2612InvalidSignature( +address _owner, address _spender, uint256 _value, uint256 _deadline, uint8 _v, bytes32 _r, bytes32 _s +); + +
+
+
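+## Usage Example
+
+A minimal sketch of a facet that wraps the module's `permit` and emits the `Approval` event itself, as the function description above requires. The import path and the `CustomPermitFacet` name are illustrative.
+
+{`pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC20PermitMod} from "compose/token/ERC20/ERC20Permit/ERC20PermitMod.sol";
+
+contract CustomPermitFacet {
+    event Approval(address indexed _owner, address indexed _spender, uint256 _value);
+
+    /// Validates the signature via the module, then emits Approval from the calling facet.
+    function permit(
+        address _owner,
+        address _spender,
+        uint256 _value,
+        uint256 _deadline,
+        uint8 _v,
+        bytes32 _r,
+        bytes32 _s
+    ) external {
+        ERC20PermitMod.permit(_owner, _spender, _value, _deadline, _v, _r, _s);
+        emit Approval(_owner, _spender, _value);
+    }
+}`}
+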
+ + + + +## Best Practices + + +- Use the `permit` function within a facet that manages ERC-20 allowances. +- Call `DOMAIN_SEPARATOR()` to obtain the correct domain separator for signature generation. +- Ensure the calling facet emits the `Approval` event after a successful permit validation. + + +## Integration Notes + + +This module interacts with diamond storage via the `ERC20_STORAGE_POSITION` slot, identified by `keccak256("compose.erc20")`. The `ERC20PermitStorage` and `ERC20Storage` structs are utilized. Changes to state, such as allowance updates after permit validation, are managed by the calling facet, ensuring composability within the diamond storage pattern. + + +
+ +
+ + diff --git a/website/docs/library/token/ERC20/ERC20Permit/_category_.json b/website/docs/library/token/ERC20/ERC20Permit/_category_.json new file mode 100644 index 00000000..7932c4df --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Permit/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-20 Permit", + "position": 3, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC20/ERC20Permit/index" + } +} diff --git a/website/docs/library/token/ERC20/ERC20Permit/index.mdx b/website/docs/library/token/ERC20/ERC20Permit/index.mdx new file mode 100644 index 00000000..e310cec8 --- /dev/null +++ b/website/docs/library/token/ERC20/ERC20Permit/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-20 Permit" +description: "ERC-20 Permit extension for ERC-20 tokens." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-20 Permit extension for ERC-20 tokens. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC20/_category_.json b/website/docs/library/token/ERC20/_category_.json new file mode 100644 index 00000000..0e078cb1 --- /dev/null +++ b/website/docs/library/token/ERC20/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-20", + "position": 1, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC20/index" + } +} diff --git a/website/docs/library/token/ERC20/index.mdx b/website/docs/library/token/ERC20/index.mdx new file mode 100644 index 00000000..2e3a8827 --- /dev/null +++ b/website/docs/library/token/ERC20/index.mdx @@ -0,0 +1,36 @@ +--- +title: "ERC-20" +description: "ERC-20 fungible token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-20 fungible token implementations. 
+ + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC6909/ERC6909/ERC6909Facet.mdx b/website/docs/library/token/ERC6909/ERC6909/ERC6909Facet.mdx new file mode 100644 index 00000000..0d798cd7 --- /dev/null +++ b/website/docs/library/token/ERC6909/ERC6909/ERC6909Facet.mdx @@ -0,0 +1,543 @@ +--- +sidebar_position: 2 +title: "ERC6909Facet" +description: "ERC-6909 token transfers and operator management" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC6909/ERC6909/ERC6909Facet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +ERC-6909 token transfers and operator management + + + +- Implements ERC-6909 token standard functions. +- Exposes `transfer`, `transferFrom`, `approve`, `balanceOf`, `allowance`, `isOperator`, and `setOperator` externally. +- Reads and writes to diamond storage via a defined storage position. +- Self-contained unit with no external dependencies. + + +## Overview + +This facet implements ERC-6909 token functionality within a diamond proxy. It exposes external functions for transfers, approvals, and operator management, interacting with shared diamond storage. Developers integrate this facet to provide composable token features while retaining diamond upgradeability. + +--- + +## Storage + +### ERC6909Storage + + +{`struct ERC6909Storage { + mapping(address owner => mapping(uint256 id => uint256 amount)) balanceOf; + mapping(address owner => mapping(address spender => mapping(uint256 id => uint256 amount))) allowance; + mapping(address owner => mapping(address spender => bool)) isOperator; +}`} + + +### State Variables + + + +## Functions + +### balanceOf + +Owner balance of an id. + + +{`function balanceOf(address _owner, uint256 _id) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### allowance + +Spender allowance of an id. + + +{`function allowance(address _owner, address _spender, uint256 _id) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### isOperator + +Checks if a spender is approved by an owner as an operator. + + +{`function isOperator(address _owner, address _spender) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### transfer + +Transfers an amount of an id from the caller to a receiver. + + +{`function transfer(address _receiver, uint256 _id, uint256 _amount) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### transferFrom + +Transfers an amount of an id from a sender to a receiver. 
+ + +{`function transferFrom(address _sender, address _receiver, uint256 _id, uint256 _amount) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### approve + +Approves an amount of an id to a spender. + + +{`function approve(address _spender, uint256 _id, uint256 _amount) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### setOperator + +Sets or removes a spender as an operator for the caller. + + +{`function setOperator(address _spender, bool _approved) external returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +## Events + + + + +
+ Signature: + +{`event Transfer( + address _caller, address indexed _sender, address indexed _receiver, uint256 indexed _id, uint256 _amount +);`} + +
+ +
+ + +
+ Signature: + +{`event OperatorSet(address indexed _owner, address indexed _spender, bool _approved);`} + +
+ +
+ + +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 indexed _id, uint256 _amount);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error ERC6909InsufficientBalance(address _sender, uint256 _balance, uint256 _needed, uint256 _id); + +
+
+ + +
+ Signature: + +error ERC6909InsufficientAllowance(address _spender, uint256 _allowance, uint256 _needed, uint256 _id); + +
+
+ + +
+ Signature: + +error ERC6909InvalidReceiver(address _receiver); + +
+
+ + +
+ Signature: + +error ERC6909InvalidSender(address _sender); + +
+
+ + +
+ Signature: + +error ERC6909InvalidSpender(address _spender); + +
+
+
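+## Usage Example
+
+An illustrative caller contract built from the external signatures documented above; the `ERC6909Caller` name and its functions are assumptions, not part of this facet.
+
+{`pragma solidity ^0.8.30;
+
+interface IERC6909 {
+    function balanceOf(address _owner, uint256 _id) external view returns (uint256);
+    function transfer(address _receiver, uint256 _id, uint256 _amount) external returns (bool);
+    function approve(address _spender, uint256 _id, uint256 _amount) external returns (bool);
+    function setOperator(address _spender, bool _approved) external returns (bool);
+}
+
+contract ERC6909Caller {
+    IERC6909 public immutable diamond;
+
+    constructor(address _diamond) {
+        diamond = IERC6909(_diamond);
+    }
+
+    /// Transfers _amount of token _id held by this contract to _receiver.
+    function payOut(address _receiver, uint256 _id, uint256 _amount) external {
+        diamond.transfer(_receiver, _id, _amount);
+    }
+
+    /// Grants _spender an allowance and registers it as an operator for this contract.
+    function delegate(address _spender, uint256 _id, uint256 _amount) external {
+        diamond.approve(_spender, _id, _amount);
+        diamond.setOperator(_spender, true);
+    }
+}`}
+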
+ + + + +## Best Practices + + +- Initialize the ERC6909Storage struct in diamond setup. +- Enforce appropriate access control for state-changing functions like `transfer` and `approve` if required by your diamond's architecture. +- Verify storage slot compatibility if upgrading or adding facets to ensure data integrity. + + +## Security Considerations + + +Input validation is crucial for `transfer`, `transferFrom`, and `approve` functions to prevent issues like insufficient balance (`ERC6909InsufficientBalance`) or allowance (`ERC6909InsufficientAllowance`). The `transfer` and `transferFrom` functions should follow the checks-effects-interactions pattern. Ensure that caller permissions are correctly managed by the diamond's access control layer for sensitive operations. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC6909/ERC6909/ERC6909Mod.mdx b/website/docs/library/token/ERC6909/ERC6909/ERC6909Mod.mdx new file mode 100644 index 00000000..07f3666f --- /dev/null +++ b/website/docs/library/token/ERC6909/ERC6909/ERC6909Mod.mdx @@ -0,0 +1,559 @@ +--- +sidebar_position: 1 +title: "ERC6909Mod" +description: "Minimal multi-token logic for ERC-6909" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC6909/ERC6909/ERC6909Mod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Minimal multi-token logic for ERC-6909 + + + +- All functions are `internal` for use in custom facets. +- Utilizes diamond storage pattern (EIP-8042) for shared state. +- Provides core ERC-6909 token operations: mint, burn, transfer, approve, setOperator. +- No external dependencies or `using` directives. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions and storage layout for ERC-6909 minimal multi-token logic. Facets can import this module to mint, burn, transfer, and manage approvals for tokens using shared diamond storage. Changes to balances and approvals are immediately visible to all facets interacting with the same storage. + +--- + +## Storage + +### ERC6909Storage + +storage-location: erc8042:compose.erc6909 + + +{`struct ERC6909Storage { + mapping(address owner => mapping(uint256 id => uint256 amount)) balanceOf; + mapping(address owner => mapping(address spender => mapping(uint256 id => uint256 amount))) allowance; + mapping(address owner => mapping(address spender => bool)) isOperator; +}`} + + +### State Variables + + + +## Functions + +### approve + +Approves an amount of an id to a spender. + + +{`function approve(address _owner, address _spender, uint256 _id, uint256 _amount) ;`} + + +**Parameters:** + + + +--- +### burn + +Burns `_amount` of token id `_id` from `_from`. + + +{`function burn(address _from, uint256 _id, uint256 _amount) ;`} + + +**Parameters:** + + + +--- +### getStorage + +Returns a pointer to the ERC-6909 storage struct. Uses inline assembly to access the storage slot defined by STORAGE_POSITION. + + +{`function getStorage() pure returns (ERC6909Storage storage s);`} + + +**Returns:** + + + +--- +### mint + +Mints `_amount` of token id `_id` to `_to`. + + +{`function mint(address _to, uint256 _id, uint256 _amount) ;`} + + +**Parameters:** + + + +--- +### setOperator + +Sets or removes a spender as an operator for the caller. 
+ + +{`function setOperator(address _owner, address _spender, bool _approved) ;`} + + +**Parameters:** + + + +--- +### transfer + +Transfers `_amount` of token id `_id` from `_from` to `_to`. Allowance is not deducted if it is `type(uint256).max`. Allowance is not deducted if `_by` is an operator for `_from`. + + +{`function transfer(address _by, address _from, address _to, uint256 _id, uint256 _amount) ;`} + + +**Parameters:** + + + +## Events + + +
+ Emitted when an approval occurs. +
+ +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _spender, uint256 indexed _id, uint256 _amount);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when an operator is set. +
+ +
+ Signature: + +{`event OperatorSet(address indexed _owner, address indexed _spender, bool _approved);`} + +
+ +
+ Parameters: + +
+
+ +
+ Emitted when a transfer occurs. +
+ +
+ Signature: + +{`event Transfer( +address _caller, address indexed _sender, address indexed _receiver, uint256 indexed _id, uint256 _amount +);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the spender has insufficient allowance. +
+ +
+ Signature: + +error ERC6909InsufficientAllowance(address _spender, uint256 _allowance, uint256 _needed, uint256 _id); + +
+
+ +
+ Thrown when the sender has insufficient balance. +
+ +
+ Signature: + +error ERC6909InsufficientBalance(address _sender, uint256 _balance, uint256 _needed, uint256 _id); + +
+
+ +
+ Thrown when the approver address is invalid. +
+ +
+ Signature: + +error ERC6909InvalidApprover(address _approver); + +
+
+ +
+ Thrown when the receiver address is invalid. +
+ +
+ Signature: + +error ERC6909InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid. +
+ +
+ Signature: + +error ERC6909InvalidSender(address _sender); + +
+
+ +
+ Thrown when the spender address is invalid. +
+ +
+ Signature: + +error ERC6909InvalidSpender(address _spender); + +
+
+
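+## Usage Example
+
+A minimal sketch of a custom facet that exposes mint and burn on top of this module. It assumes `ERC6909Mod` is importable as a Solidity library; the import path, contract name, and the absence of access control are illustrative simplifications.
+
+{`pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC6909Mod} from "compose/token/ERC6909/ERC6909/ERC6909Mod.sol";
+
+contract CustomERC6909MintFacet {
+    /// Mints _amount of token id _id to _to; add access control appropriate to your diamond.
+    function mint(address _to, uint256 _id, uint256 _amount) external {
+        ERC6909Mod.mint(_to, _id, _amount);
+    }
+
+    /// Burns _amount of token id _id from the caller's balance.
+    function burn(uint256 _id, uint256 _amount) external {
+        ERC6909Mod.burn(msg.sender, _id, _amount);
+    }
+}`}
+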
+ + + + +## Best Practices + + +- Ensure `msg.sender` has appropriate permissions before calling `burn` or `transfer`. +- Verify storage layout compatibility when upgrading facets that use this module. +- Handle specific errors like `ERC6909InsufficientBalance` and `ERC6909InsufficientAllowance`. + + +## Integration Notes + + +This module stores ERC-6909 token data within the diamond's shared storage at the slot identified by `STORAGE_POSITION`. All functions operate directly on this shared `ERC6909Storage` struct. Changes made by any facet using this module are immediately visible to all other facets accessing the same storage position, ensuring consistent state across the diamond. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC6909/ERC6909/_category_.json b/website/docs/library/token/ERC6909/ERC6909/_category_.json new file mode 100644 index 00000000..d4d084dc --- /dev/null +++ b/website/docs/library/token/ERC6909/ERC6909/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-6909", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC6909/ERC6909/index" + } +} diff --git a/website/docs/library/token/ERC6909/ERC6909/index.mdx b/website/docs/library/token/ERC6909/ERC6909/index.mdx new file mode 100644 index 00000000..4c5c49e4 --- /dev/null +++ b/website/docs/library/token/ERC6909/ERC6909/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-6909" +description: "ERC-6909 minimal multi-token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-6909 minimal multi-token implementations. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC6909/_category_.json b/website/docs/library/token/ERC6909/_category_.json new file mode 100644 index 00000000..42f1101f --- /dev/null +++ b/website/docs/library/token/ERC6909/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-6909", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC6909/index" + } +} diff --git a/website/docs/library/token/ERC6909/index.mdx b/website/docs/library/token/ERC6909/index.mdx new file mode 100644 index 00000000..b91f1e51 --- /dev/null +++ b/website/docs/library/token/ERC6909/index.mdx @@ -0,0 +1,22 @@ +--- +title: "ERC-6909" +description: "ERC-6909 minimal multi-token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-6909 minimal multi-token implementations. 
+ + + + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC721/ERC721/ERC721BurnFacet.mdx b/website/docs/library/token/ERC721/ERC721/ERC721BurnFacet.mdx new file mode 100644 index 00000000..e83205f8 --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721/ERC721BurnFacet.mdx @@ -0,0 +1,221 @@ +--- +sidebar_position: 3 +title: "ERC721BurnFacet" +description: "Burns ERC-721 tokens within a diamond" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721/ERC721BurnFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Burns ERC-721 tokens within a diamond + + + +- Exposes an external `burn` function for token destruction. +- Accesses ERC-721 state via the diamond storage pattern. +- Self-contained with no external dependencies beyond diamond interfaces. +- Compatible with ERC-2535 diamond standard. + + +## Overview + +This facet implements the burning of ERC-721 tokens as an external function within a diamond. It accesses shared ERC-721 storage and routes burn operations through the diamond proxy. Developers add this facet to enable token destruction functionality while preserving diamond upgradeability. + +--- + +## Storage + +### ERC721Storage + + +{`struct ERC721Storage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256 balance) balanceOf; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns (destroys) a token, removing it from enumeration tracking. + + +{`function burn(uint256 _tokenId) external;`} + + +**Parameters:** + + + +## Events + + + + +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event ApprovalForAll(address indexed _owner, address indexed _operator, bool _approved);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+ + +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+
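+## Usage Example
+
+An illustrative caller that burns a token through the diamond proxy, using only the `burn(uint256)` signature documented above. The `BurnCaller` contract is an assumption; the call reverts unless the caller owns or is approved for the token.
+
+{`pragma solidity ^0.8.30;
+
+interface IERC721Burnable {
+    function burn(uint256 _tokenId) external;
+}
+
+contract BurnCaller {
+    /// Burns a token that this contract owns or is approved to manage.
+    function destroy(address _diamond, uint256 _tokenId) external {
+        IERC721Burnable(_diamond).burn(_tokenId);
+    }
+}`}
+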
+ + + + +## Best Practices + + +- Initialize ERC721Storage using the facet's internal `getStorage` function during diamond setup. +- Ensure that the caller has the necessary permissions (e.g., owner of the token) before invoking the `burn` function. +- Verify that the `burn` function is correctly mapped in the diamond's facet registry. + + +## Security Considerations + + +The `burn` function should be protected by appropriate access control mechanisms, typically ensuring that only the token owner or an approved operator can call it. Input validation for `_tokenId` is crucial to prevent unintended state changes. Follow standard Solidity security practices for input validation and reentrancy guards if applicable to the broader diamond context. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721/ERC721Facet.mdx b/website/docs/library/token/ERC721/ERC721/ERC721Facet.mdx new file mode 100644 index 00000000..9b8f87db --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721/ERC721Facet.mdx @@ -0,0 +1,649 @@ +--- +sidebar_position: 2 +title: "ERC721Facet" +description: "Manages ERC-721 token ownership and transfers" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721/ERC721Facet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Manages ERC-721 token ownership and transfers + + + +- Exposes standard ERC-721 functions via diamond proxy. +- Manages token ownership, transfers, and approvals. +- Utilizes diamond storage for state persistence. +- Self-contained, adhering to Compose facet principles. + + +## Overview + +This facet implements ERC-721 token functionality within a diamond proxy. It exposes standard ERC-721 functions for querying token details, ownership, and approvals, as well as for performing transfers. Developers integrate this facet to enable NFT capabilities in their diamond. + +--- + +## Storage + +### ERC721Storage + + +{`struct ERC721Storage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256 balance) balanceOf; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; + string name; + string symbol; + string baseURI; +}`} + + +### State Variables + + + +## Functions + +### name + +Returns the token collection name. + + +{`function name() external view returns (string memory);`} + + +**Returns:** + + + +--- +### symbol + +Returns the token collection symbol. + + +{`function symbol() external view returns (string memory);`} + + +**Returns:** + + + +--- +### tokenURI + +Provide the metadata URI for a given token ID. + + +{`function tokenURI(uint256 _tokenId) external view returns (string memory);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### balanceOf + +Returns the number of tokens owned by a given address. + + +{`function balanceOf(address _owner) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### ownerOf + +Returns the owner of a given token ID. + + +{`function ownerOf(uint256 _tokenId) public view returns (address);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### getApproved + +Returns the approved address for a given token ID. 
+ + +{`function getApproved(uint256 _tokenId) external view returns (address);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### isApprovedForAll + +Returns true if an operator is approved to manage all of an owner's assets. + + +{`function isApprovedForAll(address _owner, address _operator) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### approve + +Approves another address to transfer the given token ID. + + +{`function approve(address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### setApprovalForAll + +Approves or revokes permission for an operator to manage all caller's assets. + + +{`function setApprovalForAll(address _operator, bool _approved) external;`} + + +**Parameters:** + + + +--- +### transferFrom + +Transfers a token from one address to another. + + +{`function transferFrom(address _from, address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### safeTransferFrom + +Safely transfers a token, checking if the receiver can handle ERC-721 tokens. + + +{`function safeTransferFrom(address _from, address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### safeTransferFrom + +Safely transfers a token with additional data. + + +{`function safeTransferFrom(address _from, address _to, uint256 _tokenId, bytes calldata _data) external;`} + + +**Parameters:** + + + +## Events + + + + +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event ApprovalForAll(address indexed _owner, address indexed _operator, bool _approved);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error ERC721InvalidOwner(address _owner); + +
+
+ + +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+ + +
+ Signature: + +error ERC721IncorrectOwner(address _sender, uint256 _tokenId, address _owner); + +
+
+ + +
+ Signature: + +error ERC721InvalidSender(address _sender); + +
+
+ + +
+ Signature: + +error ERC721InvalidReceiver(address _receiver); + +
+
+ + +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+ + +
+ Signature: + +error ERC721InvalidApprover(address _approver); + +
+
+ + +
+ Signature: + +error ERC721InvalidOperator(address _operator); + +
+
+
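+## Usage Example
+
+An illustrative holder contract built from the external signatures documented above; the `ERC721Caller` name and its wiring are assumptions, not part of this facet.
+
+{`pragma solidity ^0.8.30;
+
+interface IERC721Diamond {
+    function ownerOf(uint256 _tokenId) external view returns (address);
+    function approve(address _to, uint256 _tokenId) external;
+    function safeTransferFrom(address _from, address _to, uint256 _tokenId) external;
+}
+
+contract ERC721Caller {
+    IERC721Diamond public immutable diamond;
+
+    constructor(address _diamond) {
+        diamond = IERC721Diamond(_diamond);
+    }
+
+    /// Sends a token held by this contract to _to using the safe transfer variant.
+    function send(address _to, uint256 _tokenId) external {
+        diamond.safeTransferFrom(address(this), _to, _tokenId);
+    }
+
+    /// Approves _operator for a single token owned by this contract.
+    function grant(address _operator, uint256 _tokenId) external {
+        diamond.approve(_operator, _tokenId);
+    }
+}`}
+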
+ + + + +## Best Practices + + +- Initialize `name`, `symbol`, and `baseURI` during diamond setup. +- Enforce ownership and approval checks on all state-changing functions. +- Use `safeTransferFrom` for enhanced receiver validation. + + +## Security Considerations + + +Functions like `transferFrom`, `safeTransferFrom`, `approve`, and `setApprovalForAll` are protected by ownership and approval checks. Internal functions like `internalTransferFrom` perform core transfer logic with necessary validations. Developers must ensure proper access control is implemented at the diamond level for any administrative functions not exposed by this facet. Follow standard Solidity security practices for input validation. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721/ERC721Mod.mdx b/website/docs/library/token/ERC721/ERC721/ERC721Mod.mdx new file mode 100644 index 00000000..dba7be6a --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721/ERC721Mod.mdx @@ -0,0 +1,393 @@ +--- +sidebar_position: 1 +title: "ERC721Mod" +description: "Internal logic for ERC-721 token management" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721/ERC721Mod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Internal logic for ERC-721 token management + + + +- Internal functions for minting, burning, and transferring ERC-721 tokens. +- Utilizes diamond storage for shared state management. +- Reverts with specific ERC-721 errors for failed operations. +- No external dependencies, promoting composability. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for ERC-721 token management, designed for integration into custom facets. It leverages the diamond storage pattern to ensure state is shared and consistent across facets. Use its functions to mint, burn, and transfer tokens within a Compose diamond. + +--- + +## Storage + +### ERC721Storage + + +{`struct ERC721Storage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256 balance) balanceOf; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; + string name; + string symbol; + string baseURI; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns (destroys) a specific ERC-721 token. Reverts if the token does not exist. Clears ownership and approval. + + +{`function burn(uint256 _tokenId) ;`} + + +**Parameters:** + + + +--- +### getStorage + +Returns the ERC-721 storage struct from its predefined slot. Uses inline assembly to access diamond storage location. + + +{`function getStorage() pure returns (ERC721Storage storage s);`} + + +**Returns:** + + + +--- +### mint + +Mints a new ERC-721 token to the specified address. Reverts if the receiver address is zero or if the token already exists. + + +{`function mint(address _to, uint256 _tokenId) ;`} + + +**Parameters:** + + + +--- +### setMetadata + + +{`function setMetadata(string memory _name, string memory _symbol, string memory _baseURI) ;`} + + +**Parameters:** + + + +--- +### transferFrom + +Transfers ownership of a token ID from one address to another. 
Validates ownership, approval, and receiver address before updating state. + + +{`function transferFrom(address _from, address _to, uint256 _tokenId) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when ownership of a token changes, including minting and burning. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the sender is not the owner of the token. +
+ +
+ Signature: + +error ERC721IncorrectOwner(address _sender, uint256 _tokenId, address _owner); + +
+
+ +
+ Thrown when an operator lacks sufficient approval to manage a token. +
+ +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+ +
+ Thrown when the receiver address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC721InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid (e.g., zero address). +
+ +
+ Signature: + +error ERC721InvalidSender(address _sender); + +
+
+ +
+ Thrown when attempting to interact with a non-existent token. +
+ +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+
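+## Usage Example
+
+A minimal sketch of a custom facet that mints tokens and sets collection metadata through this module. It assumes `ERC721Mod` is importable as a Solidity library; the import path, contract name, and lack of access control are illustrative simplifications.
+
+{`pragma solidity ^0.8.30;
+
+// Illustrative import path; adjust to your project layout.
+import {ERC721Mod} from "compose/token/ERC721/ERC721/ERC721Mod.sol";
+
+contract CustomERC721MintFacet {
+    /// Mints _tokenId to _to; reverts if the token already exists or _to is the zero address.
+    function mint(address _to, uint256 _tokenId) external {
+        ERC721Mod.mint(_to, _tokenId);
+    }
+
+    /// Sets the collection name, symbol, and base URI; restrict access in production.
+    function initMetadata(string calldata _name, string calldata _symbol, string calldata _baseURI) external {
+        ERC721Mod.setMetadata(_name, _symbol, _baseURI);
+    }
+}`}
+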
+ + + + +## Best Practices + + +- Ensure proper ownership and approval checks are performed before calling `transferFrom`. +- Handle `ERC721NonexistentToken`, `ERC721IncorrectOwner`, and `ERC721InsufficientApproval` errors appropriately. +- Use `mint` to create new tokens and `burn` to destroy them, ensuring state consistency. + + +## Integration Notes + + +This module interacts with diamond storage at the slot identified by `keccak256("compose.erc721")`. It reads and writes to the `ERC721Storage` struct, which contains fields for `name`, `symbol`, and `baseURI`. All state changes made through this module are immediately reflected across all facets accessing the same storage position. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721/_category_.json b/website/docs/library/token/ERC721/ERC721/_category_.json new file mode 100644 index 00000000..219beb4e --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-721", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC721/ERC721/index" + } +} diff --git a/website/docs/library/token/ERC721/ERC721/index.mdx b/website/docs/library/token/ERC721/ERC721/index.mdx new file mode 100644 index 00000000..9f0d29d9 --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721/index.mdx @@ -0,0 +1,36 @@ +--- +title: "ERC-721" +description: "ERC-721 non-fungible token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-721 non-fungible token implementations. + + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableBurnFacet.mdx b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableBurnFacet.mdx new file mode 100644 index 00000000..cca21e29 --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableBurnFacet.mdx @@ -0,0 +1,238 @@ +--- +sidebar_position: 3 +title: "ERC721EnumerableBurnFacet" +description: "Burns ERC-721 tokens and removes them from enumeration" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721Enumerable/ERC721EnumerableBurnFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Burns ERC-721 tokens and removes them from enumeration + + + +- Exposes external function `burn(uint256 tokenId)` for token destruction. +- Integrates with ERC-721 enumeration tracking. +- Follows EIP-2535 diamond standard for composability. +- Uses internal `getStorage()` for state access. + + +## Overview + +This facet provides functionality to burn ERC-721 tokens within a diamond proxy, ensuring they are removed from enumeration tracking. It exposes the `burn` function for token destruction and `getStorage` for internal access to its state. Integrate this facet to enable token destruction capabilities while maintaining the diamond's upgradeability and composition. 
+ +--- + +## Storage + +### ERC721EnumerableStorage + + +{`struct ERC721EnumerableStorage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256[] ownerTokens) ownerTokens; + mapping(uint256 tokenId => uint256 ownerTokensIndex) ownerTokensIndex; + uint256[] allTokens; + mapping(uint256 tokenId => uint256 allTokensIndex) allTokensIndex; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns (destroys) a token, removing it from enumeration tracking. + + +{`function burn(uint256 _tokenId) external;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when ownership of a token changes, including burning. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when attempting to interact with a non-existent token. +
+ +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+ +
+ Thrown when the caller lacks approval to operate on the token. +
+ +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+
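+## Usage Example
+
+An illustrative caller that burns a token through the diamond proxy using the `burn(uint256)` signature documented above; the `EnumerableBurnCaller` contract is an assumption.
+
+{`pragma solidity ^0.8.30;
+
+interface IERC721EnumerableBurnable {
+    function burn(uint256 _tokenId) external;
+}
+
+contract EnumerableBurnCaller {
+    IERC721EnumerableBurnable public immutable diamond;
+
+    constructor(address _diamond) {
+        diamond = IERC721EnumerableBurnable(_diamond);
+    }
+
+    /// Burns _tokenId; the facet also removes it from enumeration tracking.
+    function destroy(uint256 _tokenId) external {
+        diamond.burn(_tokenId);
+    }
+}`}
+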
+ + + + +## Best Practices + + +- Initialize the facet's storage during diamond setup. +- Ensure the caller has the necessary permissions to burn the specified token. +- Verify that the token exists before attempting to burn it. + + +## Security Considerations + + +Protect the `burn` function with appropriate access control to prevent unauthorized token destruction. Burning a non-existent token reverts with `ERC721NonexistentToken`. Ensure the caller owns or is approved for the token; otherwise the call reverts with `ERC721InsufficientApproval`. + + +
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableFacet.mdx b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableFacet.mdx new file mode 100644 index 00000000..66d8680c --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableFacet.mdx @@ -0,0 +1,715 @@ +--- +sidebar_position: 2 +title: "ERC721EnumerableFacet" +description: "ERC-721 token ownership and metadata management" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721Enumerable/ERC721EnumerableFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +ERC-721 token ownership and metadata management + + + +- Implements core ERC-721 enumerable functionality. +- Exposes external functions for diamond routing. +- Utilizes diamond storage for state management. +- Compatible with ERC-2535 diamond standard. + + +## Overview + +This facet implements ERC-721 token functionality, including ownership tracking and metadata retrieval, within a diamond proxy. It exposes standard ERC-721 functions externally, allowing interaction with token data via the diamond's unified interface. Developers integrate this facet to provide NFT capabilities while leveraging the diamond's upgradeability and composability. + +--- + +## Storage + +### ERC721EnumerableStorage + + +{`struct ERC721EnumerableStorage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256[] ownerTokens) ownerTokens; + mapping(uint256 tokenId => uint256 ownerTokensIndex) ownerTokensIndex; + uint256[] allTokens; + mapping(uint256 tokenId => uint256 allTokensIndex) allTokensIndex; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; + string name; + string symbol; + string baseURI; +}`} + + +### State Variables + + + +## Functions + +### name + +Returns the name of the token collection. + + +{`function name() external view returns (string memory);`} + + +**Returns:** + + + +--- +### symbol + +Returns the symbol of the token collection. + + +{`function symbol() external view returns (string memory);`} + + +**Returns:** + + + +--- +### tokenURI + +Provide the metadata URI for a given token ID. + + +{`function tokenURI(uint256 _tokenId) external view returns (string memory);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### totalSupply + +Returns the total number of tokens in existence. + + +{`function totalSupply() external view returns (uint256);`} + + +**Returns:** + + + +--- +### balanceOf + +Returns the number of tokens owned by an address. 
+ + +{`function balanceOf(address _owner) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### ownerOf + +Returns the owner of a given token ID. + + +{`function ownerOf(uint256 _tokenId) public view returns (address);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### tokenOfOwnerByIndex + +Returns a token ID owned by a given address at a specific index. + + +{`function tokenOfOwnerByIndex(address _owner, uint256 _index) external view returns (uint256);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### getApproved + +Returns the approved address for a given token ID. + + +{`function getApproved(uint256 _tokenId) external view returns (address);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### isApprovedForAll + +Returns whether an operator is approved for all tokens of an owner. + + +{`function isApprovedForAll(address _owner, address _operator) external view returns (bool);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### approve + +Approves another address to transfer a specific token ID. + + +{`function approve(address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### setApprovalForAll + +Approves or revokes an operator to manage all tokens of the caller. + + +{`function setApprovalForAll(address _operator, bool _approved) external;`} + + +**Parameters:** + + + +--- +### transferFrom + +Transfers a token from one address to another. + + +{`function transferFrom(address _from, address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### safeTransferFrom + +Safely transfers a token, checking for receiver contract compatibility. + + +{`function safeTransferFrom(address _from, address _to, uint256 _tokenId) external;`} + + +**Parameters:** + + + +--- +### safeTransferFrom + +Safely transfers a token with additional data. + + +{`function safeTransferFrom(address _from, address _to, uint256 _tokenId, bytes calldata _data) external;`} + + +**Parameters:** + + + +## Events + + + + +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event Approval(address indexed _owner, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ + +
+ Signature: + +{`event ApprovalForAll(address indexed _owner, address indexed _operator, bool _approved);`} + +
+ +
+
+ +## Errors + + + + +
+ Signature: + +error ERC721InvalidOwner(address _owner); + +
+
+ + +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+ + +
+ Signature: + +error ERC721IncorrectOwner(address _sender, uint256 _tokenId, address _owner); + +
+
+ + +
+ Signature: + +error ERC721InvalidSender(address _sender); + +
+
+ + +
+ Signature: + +error ERC721InvalidReceiver(address _receiver); + +
+
+ + +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+ + +
+ Signature: + +error ERC721InvalidApprover(address _approver); + +
+
+ + +
+ Signature: + +error ERC721InvalidOperator(address _operator); + +
+
+ + +
+ Signature: + +error ERC721OutOfBoundsIndex(address _owner, uint256 _index); + +
+
+
+ + + + +## Best Practices + + +- Initialize the `name`, `symbol`, and `baseURI` during diamond setup. +- Enforce access control on state-changing functions like `approve` and `transferFrom` if custom logic requires it. +- Ensure storage compatibility when upgrading to new versions of this facet. + + +## Security Considerations + + +Follow standard Solidity security practices. Input validation for token IDs and addresses is critical. The `safeTransferFrom` functions include checks for receiver contract compatibility. Ensure that any custom access control added to state-changing functions is robust. + + +
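+A minimal sketch of reading enumerable token data through a diamond that has this facet installed. The `IERC721EnumerableFacet` interface name and the `CollectionReader` contract are illustrative additions for this example; the function signatures mirror the ones documented above, and calls route through the diamond address via ERC-2535 function routing.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// Illustrative interface; the name is an assumption, the signatures match this facet.
+interface IERC721EnumerableFacet {
+    function balanceOf(address _owner) external view returns (uint256);
+    function tokenOfOwnerByIndex(address _owner, uint256 _index) external view returns (uint256);
+}
+
+contract CollectionReader {
+    /// Lists every token ID held by `_owner`, read through the diamond address.
+    function tokensOf(address diamond, address _owner) external view returns (uint256[] memory ids) {
+        IERC721EnumerableFacet nft = IERC721EnumerableFacet(diamond);
+        uint256 count = nft.balanceOf(_owner);
+        ids = new uint256[](count);
+        for (uint256 i = 0; i < count; i++) {
+            ids[i] = nft.tokenOfOwnerByIndex(_owner, i);
+        }
+    }
+}
+```
+
+Reading large holdings this way iterates one call per token, so prefer off-chain indexing or bounded queries for owners with many tokens.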
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableMod.mdx b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableMod.mdx new file mode 100644 index 00000000..2777aa4d --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721Enumerable/ERC721EnumerableMod.mdx @@ -0,0 +1,385 @@ +--- +sidebar_position: 1 +title: "ERC721EnumerableMod" +description: "Internal logic for enumerable ERC-721 tokens" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/ERC721/ERC721Enumerable/ERC721EnumerableMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Internal logic for enumerable ERC-721 tokens + + + +- Exposes `internal` functions for mint, burn, and transfer operations. +- Integrates with the diamond storage pattern via a predefined storage slot. +- Utilizes custom errors for gas-efficient error reporting. +- No external dependencies or `using` directives, promoting composability. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing enumerable ERC-721 tokens using the diamond storage pattern. Facets can import this module to implement minting, burning, and transfer logic while maintaining token enumeration order. Changes made through this module are immediately visible to all facets accessing the shared storage. + +--- + +## Storage + +### ERC721EnumerableStorage + + +{`struct ERC721EnumerableStorage { + mapping(uint256 tokenId => address owner) ownerOf; + mapping(address owner => uint256[] ownerTokens) ownerTokens; + mapping(uint256 tokenId => uint256 ownerTokensIndex) ownerTokensIndex; + uint256[] allTokens; + mapping(uint256 tokenId => uint256 allTokensIndex) allTokensIndex; + mapping(address owner => mapping(address operator => bool approved)) isApprovedForAll; + mapping(uint256 tokenId => address approved) approved; + string name; + string symbol; + string baseURI; +}`} + + +### State Variables + + + +## Functions + +### burn + +Burns (destroys) an existing ERC-721 token, removing it from enumeration lists. Reverts if the token does not exist or if the sender is not authorized. + + +{`function burn(uint256 _tokenId, address _sender) ;`} + + +**Parameters:** + + + +--- +### getStorage + +Returns the ERC-721 enumerable storage struct from its predefined slot. Uses inline assembly to point to the correct diamond storage position. 
+ + +{`function getStorage() pure returns (ERC721EnumerableStorage storage s);`} + + +**Returns:** + + + +--- +### mint + +Mints a new ERC-721 token to the specified address, adding it to enumeration lists. Reverts if the receiver address is zero or if the token already exists. + + +{`function mint(address _to, uint256 _tokenId) ;`} + + +**Parameters:** + + + +--- +### transferFrom + +Transfers a token ID from one address to another, updating enumeration data. Validates ownership, approval, and receiver address before state updates. + + +{`function transferFrom(address _from, address _to, uint256 _tokenId, address _sender) ;`} + + +**Parameters:** + + + +## Events + + + +
+ Emitted when ownership of a token changes, including minting and burning. +
+ +
+ Signature: + +{`event Transfer(address indexed _from, address indexed _to, uint256 indexed _tokenId);`} + +
+ +
+ Parameters: + +
+
+
+ +## Errors + + + +
+ Thrown when the sender is not the owner of the token. +
+ +
+ Signature: + +error ERC721IncorrectOwner(address _sender, uint256 _tokenId, address _owner); + +
+
+ +
+ Thrown when an operator lacks approval to manage a token. +
+ +
+ Signature: + +error ERC721InsufficientApproval(address _operator, uint256 _tokenId); + +
+
+ +
+ Thrown when the receiver address is invalid. +
+ +
+ Signature: + +error ERC721InvalidReceiver(address _receiver); + +
+
+ +
+ Thrown when the sender address is invalid. +
+ +
+ Signature: + +error ERC721InvalidSender(address _sender); + +
+
+ +
+ Thrown when attempting to interact with a non-existent token. +
+ +
+ Signature: + +error ERC721NonexistentToken(uint256 _tokenId); + +
+
+
+ + + + +## Best Practices + + +- Ensure the `ERC721EnumerableMod` is correctly initialized with the diamond's storage pointer. +- Verify ownership and approval checks are performed by the calling facet before invoking `transferFrom` or `burn`. +- Handle potential errors like `ERC721NonexistentToken` or `ERC721IncorrectOwner` in facet logic. + + +## Integration Notes + + +This module reads and writes to diamond storage at the position identified by `keccak256(\"compose.erc721.enumerable\")`. The `ERC721EnumerableStorage` struct, containing state like token name, symbol, and base URI, is managed at this slot. All functions are `internal` and directly interact with this shared storage, making changes immediately visible to any facet accessing the same storage position. + + +
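+A minimal sketch of a custom facet that mints and burns through this module. The import path follows the `gitSource` above, and calling the functions as `ERC721EnumerableMod.<fn>` assumes the module is exposed as a library with internal functions; adjust both to your repository layout. The access-control comment marks where your own checks belong.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path and library-style usage; align with the actual module layout.
+import {ERC721EnumerableMod} from "src/token/ERC721/ERC721Enumerable/ERC721EnumerableMod.sol";
+
+contract NFTMintFacet {
+    /// Mints `_tokenId` to `_to` in the shared enumerable storage.
+    function mintTo(address _to, uint256 _tokenId) external {
+        // Add your diamond's access-control check here before minting.
+        ERC721EnumerableMod.mint(_to, _tokenId);
+    }
+
+    /// Burns `_tokenId`; the module reverts if the token does not exist
+    /// or if `msg.sender` is not authorized.
+    function burnToken(uint256 _tokenId) external {
+        ERC721EnumerableMod.burn(_tokenId, msg.sender);
+    }
+}
+```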
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/ERC721/ERC721Enumerable/_category_.json b/website/docs/library/token/ERC721/ERC721Enumerable/_category_.json new file mode 100644 index 00000000..fdc633f9 --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721Enumerable/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-721 Enumerable", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC721/ERC721Enumerable/index" + } +} diff --git a/website/docs/library/token/ERC721/ERC721Enumerable/index.mdx b/website/docs/library/token/ERC721/ERC721Enumerable/index.mdx new file mode 100644 index 00000000..95df2458 --- /dev/null +++ b/website/docs/library/token/ERC721/ERC721Enumerable/index.mdx @@ -0,0 +1,36 @@ +--- +title: "ERC-721 Enumerable" +description: "ERC-721 Enumerable extension for ERC-721 tokens." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-721 Enumerable extension for ERC-721 tokens. + + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/ERC721/_category_.json b/website/docs/library/token/ERC721/_category_.json new file mode 100644 index 00000000..8ee4f288 --- /dev/null +++ b/website/docs/library/token/ERC721/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "ERC-721", + "position": 2, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/ERC721/index" + } +} diff --git a/website/docs/library/token/ERC721/index.mdx b/website/docs/library/token/ERC721/index.mdx new file mode 100644 index 00000000..e3dc8b77 --- /dev/null +++ b/website/docs/library/token/ERC721/index.mdx @@ -0,0 +1,29 @@ +--- +title: "ERC-721" +description: "ERC-721 non-fungible token implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-721 non-fungible token implementations. 
+ + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/Royalty/RoyaltyFacet.mdx b/website/docs/library/token/Royalty/RoyaltyFacet.mdx new file mode 100644 index 00000000..bfb36e1b --- /dev/null +++ b/website/docs/library/token/Royalty/RoyaltyFacet.mdx @@ -0,0 +1,203 @@ +--- +sidebar_position: 2 +title: "RoyaltyFacet" +description: "Returns royalty information for tokens" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/Royalty/RoyaltyFacet.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Returns royalty information for tokens + + + +- Implements ERC-2981 `royaltyInfo` function for external querying. +- Accesses royalty data via diamond storage. +- Supports token-specific and default royalty configurations. +- Self-contained, no external dependencies beyond diamond storage access. + + +## Overview + +This facet implements ERC-2981 royalty information retrieval for tokens within a diamond. It provides an external function to query royalty details based on token ID and sale price, falling back to default royalty settings if token-specific royalties are not defined. This facet enables standard royalty distribution mechanisms for NFTs. + +--- + +## Storage + +### RoyaltyInfo + + +{`struct RoyaltyInfo { + address receiver; + uint96 royaltyFraction; +}`} + + +--- +### RoyaltyStorage + + +{`struct RoyaltyStorage { + RoyaltyInfo defaultRoyaltyInfo; + mapping(uint256 tokenId => RoyaltyInfo) tokenRoyaltyInfo; +}`} + + +### State Variables + + + +## Functions + +### royaltyInfo + +Returns royalty information for a given token and sale price. Returns token-specific royalty if set, otherwise falls back to default royalty. Royalty amount is calculated as a percentage of the sale price using basis points. Implements the ERC-2981 royaltyInfo function. + + +{`function royaltyInfo(uint256 _tokenId, uint256 _salePrice) + external + view + returns (address receiver, uint256 royaltyAmount);`} + + +**Parameters:** + + + +**Returns:** + + + + + + +## Best Practices + + +- Initialize default royalty information during diamond setup. +- Ensure the `RoyaltyStorage` struct is correctly placed in diamond storage. +- Verify compatibility with the ERC-2981 standard for downstream integrations. + + +## Security Considerations + + +Follow standard Solidity security practices. The `royaltyInfo` function is a view function and does not modify state. Input validation for `_tokenId` and `_salePrice` is implicitly handled by Solidity's type system. Ensure correct initialization of default royalties to prevent unintended distribution. + + +
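+A minimal sketch of consuming this facet from another contract, such as a marketplace settling a sale. The `IERC2981Royalty` interface and `SaleSettlement` contract are illustrative; the `royaltyInfo` signature matches the one documented above and the ERC-2981 standard.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+/// Minimal ERC-2981 view interface matching the royaltyInfo signature above.
+interface IERC2981Royalty {
+    function royaltyInfo(uint256 _tokenId, uint256 _salePrice)
+        external
+        view
+        returns (address receiver, uint256 royaltyAmount);
+}
+
+contract SaleSettlement {
+    /// Splits a sale price between the royalty receiver and the seller.
+    /// `diamond` is any diamond with RoyaltyFacet installed (illustrative parameter).
+    function split(address diamond, uint256 tokenId, uint256 salePrice)
+        external
+        view
+        returns (address receiver, uint256 royaltyAmount, uint256 sellerProceeds)
+    {
+        (receiver, royaltyAmount) = IERC2981Royalty(diamond).royaltyInfo(tokenId, salePrice);
+        // Assumes royaltyAmount never exceeds salePrice; Solidity 0.8 reverts otherwise.
+        sellerProceeds = salePrice - royaltyAmount;
+    }
+}
+```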
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/Royalty/RoyaltyMod.mdx b/website/docs/library/token/Royalty/RoyaltyMod.mdx new file mode 100644 index 00000000..5cc6d013 --- /dev/null +++ b/website/docs/library/token/Royalty/RoyaltyMod.mdx @@ -0,0 +1,379 @@ +--- +sidebar_position: 1 +title: "RoyaltyMod" +description: "ERC-2981 royalty logic for NFTs" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/token/Royalty/RoyaltyMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +ERC-2981 royalty logic for NFTs + + + +- Implements ERC-2981 `royaltyInfo` logic. +- Supports setting default and token-specific royalty information. +- Utilizes diamond storage for state persistence. +- Internal functions ensure composability within facets. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This module provides internal functions for managing ERC-2981 royalty information within a diamond. Facets can use these functions to set, retrieve, and reset both default and token-specific royalties, leveraging shared diamond storage. The royalty logic is transparently applied via the `royaltyInfo` function. + +--- + +## Storage + +### RoyaltyInfo + +Structure containing royalty information. **Properties** + + +{`struct RoyaltyInfo { + address receiver; + uint96 royaltyFraction; +}`} + + +--- +### RoyaltyStorage + +storage-location: erc8042:compose.erc2981 + + +{`struct RoyaltyStorage { + RoyaltyInfo defaultRoyaltyInfo; + mapping(uint256 tokenId => RoyaltyInfo) tokenRoyaltyInfo; +}`} + + +### State Variables + + + +## Functions + +### deleteDefaultRoyalty + +Removes default royalty information. After calling this function, royaltyInfo will return (address(0), 0) for tokens without specific royalty. + + +{`function deleteDefaultRoyalty() ;`} + + +--- +### getStorage + +Returns the royalty storage struct from its predefined slot. Uses inline assembly to access diamond storage location. + + +{`function getStorage() pure returns (RoyaltyStorage storage s);`} + + +**Returns:** + + + +--- +### resetTokenRoyalty + +Resets royalty information for a specific token to use the default setting. Clears token-specific royalty storage, causing fallback to default royalty. + + +{`function resetTokenRoyalty(uint256 _tokenId) ;`} + + +**Parameters:** + + + +--- +### royaltyInfo + +Queries royalty information for a given token and sale price. Returns token-specific royalty or falls back to default royalty. Royalty amount is calculated as a percentage of the sale price using basis points. Implements the ERC-2981 royaltyInfo function logic. 
+ + +{`function royaltyInfo(uint256 _tokenId, uint256 _salePrice) view returns (address receiver, uint256 royaltyAmount);`} + + +**Parameters:** + + + +**Returns:** + + + +--- +### setDefaultRoyalty + +Sets the default royalty information that applies to all tokens. Validates receiver and fee, then updates default royalty storage. + + +{`function setDefaultRoyalty(address _receiver, uint96 _feeNumerator) ;`} + + +**Parameters:** + + + +--- +### setTokenRoyalty + +Sets royalty information for a specific token, overriding the default. Validates receiver and fee, then updates token-specific royalty storage. + + +{`function setTokenRoyalty(uint256 _tokenId, address _receiver, uint96 _feeNumerator) ;`} + + +**Parameters:** + + + +## Errors + + + +
+ Thrown when default royalty fee exceeds 100% (10000 basis points). +
+ +
+ Signature: + +error ERC2981InvalidDefaultRoyalty(uint256 _numerator, uint256 _denominator); + +
+
+ +
+ Thrown when default royalty receiver is the zero address. +
+ +
+ Signature: + +error ERC2981InvalidDefaultRoyaltyReceiver(address _receiver); + +
+
+ +
+ Thrown when token-specific royalty fee exceeds 100% (10000 basis points). +
+ +
+ Signature: + +error ERC2981InvalidTokenRoyalty(uint256 _tokenId, uint256 _numerator, uint256 _denominator); + +
+
+ +
+ Thrown when token-specific royalty receiver is the zero address. +
+ +
+ Signature: + +error ERC2981InvalidTokenRoyaltyReceiver(uint256 _tokenId, address _receiver); + +
+
+
+ + + + +## Best Practices + + +- Ensure receiver addresses are valid and fee numerators are within acceptable bounds (e.g., 0-10000) before calling set functions. +- Use `resetTokenRoyalty` to revert token-specific royalties to the default. +- Call `royaltyInfo` to retrieve royalty details, which automatically handles fallback to default royalties. + + +## Integration Notes + + +This module interacts with diamond storage at a specific, predefined slot identified by `keccak256("compose.erc2981")`. It manages a `RoyaltyStorage` struct containing `defaultRoyaltyInfo`. Functions like `setDefaultRoyalty` and `setTokenRoyalty` directly modify this shared storage. The `royaltyInfo` function reads from this storage to provide royalty details, falling back to default settings when token-specific settings are absent. All modifications are immediately visible to other facets accessing the same diamond storage. + + +
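+A minimal sketch of an admin facet wrapping the module's setters. The import path and the `RoyaltyMod.<fn>` call style are assumptions; fee numerators use basis points with a 10000 denominator, as implied by the error descriptions above. Replace the access-control comments with your diamond's own checks.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path and library-style usage; align with the actual module layout.
+import {RoyaltyMod} from "src/token/Royalty/RoyaltyMod.sol";
+
+contract RoyaltyAdminFacet {
+    /// Sets the collection-wide default royalty (fee in basis points, max 10000).
+    function setDefaultRoyalty(address _receiver, uint96 _feeNumerator) external {
+        // Add your diamond's access-control check here.
+        RoyaltyMod.setDefaultRoyalty(_receiver, _feeNumerator);
+    }
+
+    /// Overrides the default royalty for a single token.
+    function setTokenRoyalty(uint256 _tokenId, address _receiver, uint96 _feeNumerator) external {
+        // Add your diamond's access-control check here.
+        RoyaltyMod.setTokenRoyalty(_tokenId, _receiver, _feeNumerator);
+    }
+
+    /// Clears a token-specific override so queries fall back to the default.
+    function resetTokenRoyalty(uint256 _tokenId) external {
+        // Add your diamond's access-control check here.
+        RoyaltyMod.resetTokenRoyalty(_tokenId);
+    }
+}
+```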
+ +
+ +
+ +
+ + diff --git a/website/docs/library/token/Royalty/_category_.json b/website/docs/library/token/Royalty/_category_.json new file mode 100644 index 00000000..cb6b460f --- /dev/null +++ b/website/docs/library/token/Royalty/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Royalty", + "position": 5, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/Royalty/index" + } +} diff --git a/website/docs/library/token/Royalty/index.mdx b/website/docs/library/token/Royalty/index.mdx new file mode 100644 index 00000000..57a7e845 --- /dev/null +++ b/website/docs/library/token/Royalty/index.mdx @@ -0,0 +1,29 @@ +--- +title: "Royalty" +description: "ERC-2981 royalty standard implementations." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + ERC-2981 royalty standard implementations. + + + + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/token/_category_.json b/website/docs/library/token/_category_.json new file mode 100644 index 00000000..3f26c2ce --- /dev/null +++ b/website/docs/library/token/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Token Standards", + "position": 3, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/token/index" + } +} diff --git a/website/docs/library/token/index.mdx b/website/docs/library/token/index.mdx new file mode 100644 index 00000000..e18f1fe8 --- /dev/null +++ b/website/docs/library/token/index.mdx @@ -0,0 +1,50 @@ +--- +title: "Token Standards" +description: "Token standard implementations for Compose diamonds." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Token standard implementations for Compose diamonds. 
+ + + + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + } + size="medium" + /> + diff --git a/website/docs/library/utils/NonReentrancyMod.mdx b/website/docs/library/utils/NonReentrancyMod.mdx new file mode 100644 index 00000000..643856b3 --- /dev/null +++ b/website/docs/library/utils/NonReentrancyMod.mdx @@ -0,0 +1,146 @@ +--- +sidebar_position: 1 +title: "NonReentrancyMod" +description: "Prevent reentrant calls within diamond facets" +gitSource: "https://github.com/Perfect-Abstractions/Compose/tree/main/src/libraries/NonReentrancyMod.sol" +--- + +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Badge from '@site/src/components/ui/Badge'; +import Callout from '@site/src/components/ui/Callout'; +import CalloutBox from '@site/src/components/ui/CalloutBox'; +import Accordion, { AccordionGroup } from '@site/src/components/ui/Accordion'; +import PropertyTable from '@site/src/components/api/PropertyTable'; +import ExpandableCode from '@site/src/components/code/ExpandableCode'; +import CodeShowcase from '@site/src/components/code/CodeShowcase'; +import RelatedDocs from '@site/src/components/docs/RelatedDocs'; +import WasThisHelpful from '@site/src/components/docs/WasThisHelpful'; +import LastUpdated from '@site/src/components/docs/LastUpdated'; +import ReadingTime from '@site/src/components/docs/ReadingTime'; +import GradientText from '@site/src/components/ui/GradientText'; +import GradientButton from '@site/src/components/ui/GradientButton'; + + +Prevent reentrant calls within diamond facets + + + +- Provides `enter()` and `exit()` internal functions for reentrancy protection. +- Emits a `Reentrancy()` error if a reentrant call is detected. +- Designed for use as an internal library within diamond facets. + + + +This module provides internal functions for use in your custom facets. Import it to access shared logic and storage. + + +## Overview + +This library provides functions to prevent reentrant calls within diamond facets. By importing and using its internal functions, facets can enforce non-reentrant execution flows, enhancing contract security. This prevents unexpected state changes and exploits that rely on recursive function calls. + +--- + +## Storage + +### State Variables + + + +## Functions + +### enter + +How to use as a library in user facets How to use as a modifier in user facets This unlocks the entry into a function + + +{`function enter() ;`} + + +--- +### exit + +This locks the entry into a function + + +{`function exit() ;`} + + +## Errors + + + +
+ Function selector - 0x43a0d067 +
+ +
+ Signature: + +error Reentrancy(); + +
+
+
+ + + + +## Best Practices + + +- Call `enter()` at the beginning of any function that should be protected against reentrancy. +- Call `exit()` at the end of the protected function before returning control. +- Ensure the `Reentrancy()` error is handled appropriately in calling facets or the diamond proxy. + + +## Integration Notes + + +This library utilizes standard Solidity function calls and does not interact directly with diamond storage. Its reentrancy guard state is managed internally within the calling facet's execution context. Changes to the reentrancy guard are local to the function call and do not affect other facets or diamond storage. + + +
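+A minimal sketch that follows the practice above: call `enter()` first and `exit()` just before returning, so a nested call into the guarded function reverts with `Reentrancy()`. The import path matches the `gitSource` above; the `NonReentrancyMod.<fn>` call style and the `PaymentFacet` example are assumptions for illustration.
+
+```solidity
+// SPDX-License-Identifier: MIT
+pragma solidity ^0.8.30;
+
+// Assumed import path and library-style usage; adjust to your repository layout.
+import {NonReentrancyMod} from "src/libraries/NonReentrancyMod.sol";
+
+contract PaymentFacet {
+    error TransferFailed();
+
+    /// Forwards the attached value with the reentrancy guard held for the
+    /// duration of the external call.
+    function forwardPayment(address payable to) external payable {
+        NonReentrancyMod.enter(); // reverts with Reentrancy() on nested entry
+
+        (bool ok, ) = to.call{value: msg.value}("");
+        if (!ok) revert TransferFailed();
+
+        NonReentrancyMod.exit(); // releases the guard before returning
+    }
+}
+```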
+ +
+ + diff --git a/website/docs/library/utils/_category_.json b/website/docs/library/utils/_category_.json new file mode 100644 index 00000000..d9c087be --- /dev/null +++ b/website/docs/library/utils/_category_.json @@ -0,0 +1,10 @@ +{ + "label": "Utilities", + "position": 4, + "collapsible": true, + "collapsed": true, + "link": { + "type": "doc", + "id": "library/utils/index" + } +} diff --git a/website/docs/library/utils/index.mdx b/website/docs/library/utils/index.mdx new file mode 100644 index 00000000..72f3d72e --- /dev/null +++ b/website/docs/library/utils/index.mdx @@ -0,0 +1,22 @@ +--- +title: "Utilities" +description: "Utility libraries and helpers for diamond development." +--- + +import DocCard, { DocCardGrid } from '@site/src/components/docs/DocCard'; +import DocSubtitle from '@site/src/components/docs/DocSubtitle'; +import Icon from '@site/src/components/ui/Icon'; + + + Utility libraries and helpers for diamond development. + + + + } + size="medium" + /> + diff --git a/website/package.json b/website/package.json index 502302dd..d5c60934 100644 --- a/website/package.json +++ b/website/package.json @@ -11,7 +11,8 @@ "clear": "docusaurus clear", "serve": "docusaurus serve", "write-translations": "docusaurus write-translations", - "write-heading-ids": "docusaurus write-heading-ids" + "write-heading-ids": "docusaurus write-heading-ids", + "generate-docs": "cd .. && forge doc && SKIP_ENHANCEMENT=true node .github/scripts/generate-docs.js --all" }, "dependencies": { "@docusaurus/core": "3.9.2", diff --git a/website/sidebars.js b/website/sidebars.js index d684cc43..fb17ed80 100644 --- a/website/sidebars.js +++ b/website/sidebars.js @@ -19,34 +19,34 @@ const sidebars = { 'intro', { type: 'category', - label: 'Foundations', - collapsed: false, - link: { - type: 'doc', - id: 'foundations/index', - }, + label: 'Getting Started', + collapsed: true, items: [ { type: 'autogenerated', - dirName: 'foundations', + dirName: 'getting-started', }, ], }, { type: 'category', - label: 'Getting Started', - collapsed: true, + label: 'Foundations', + collapsed: false, + link: { + type: 'doc', + id: 'foundations/index', + }, items: [ { type: 'autogenerated', - dirName: 'getting-started', + dirName: 'foundations', }, ], }, { type: 'category', label: 'Design', - collapsed: false, + collapsed: true, link: { type: 'doc', id: 'design/index', @@ -58,19 +58,21 @@ const sidebars = { }, ], }, - /* { type: 'category', - label: 'Facets', + label: 'Library', collapsed: true, + link: { + type: 'doc', + id: 'library/index', + }, items: [ { type: 'autogenerated', - dirName: 'facets', + dirName: 'library', }, ], }, - */ { type: 'category', label: 'Contribution', diff --git a/website/src/components/api/PropertyTable/index.js b/website/src/components/api/PropertyTable/index.js index 496f2fc3..22fd68e0 100644 --- a/website/src/components/api/PropertyTable/index.js +++ b/website/src/components/api/PropertyTable/index.js @@ -1,6 +1,32 @@ import React from 'react'; import styles from './styles.module.css'; +/** + * Parse description string and convert markdown-style code (backticks) to JSX code elements + * @param {string|React.ReactNode} description - Description string or React element + * @returns {React.ReactNode} Description with code elements rendered + */ +function parseDescription(description) { + if (!description || typeof description !== 'string') { + return description; + } + + // Split by backticks and alternate between text and code + const parts = description.split(/(`[^`]+`)/g); + return parts.map((part, index) 
=> { + if (part.startsWith('`') && part.endsWith('`')) { + // This is a code block + const codeContent = part.slice(1, -1); // Remove backticks + return ( + + {codeContent} + + ); + } + return {part}; + }); +} + /** * PropertyTable Component - Modern API property documentation table * Inspired by Shadcn UI design patterns @@ -51,7 +77,7 @@ export default function PropertyTable({ )} - {prop.description || prop.desc || '-'} + {prop.descriptionElement || parseDescription(prop.description || prop.desc) || '-'} {prop.default !== undefined && (
Default: {String(prop.default)} diff --git a/website/src/components/api/PropertyTable/styles.module.css b/website/src/components/api/PropertyTable/styles.module.css index d6a75d41..c50a2be5 100644 --- a/website/src/components/api/PropertyTable/styles.module.css +++ b/website/src/components/api/PropertyTable/styles.module.css @@ -20,6 +20,7 @@ .tableWrapper { position: relative; width: 100%; + max-width: 100%; border: 1px solid var(--ifm-color-emphasis-200); border-radius: 0.5rem; background: var(--ifm-background-surface-color); @@ -38,6 +39,7 @@ -webkit-overflow-scrolling: touch; scrollbar-width: thin; scrollbar-color: var(--ifm-color-emphasis-300) transparent; + max-width: 100%; } /* Custom Scrollbar */ @@ -69,9 +71,10 @@ /* Table */ .table { width: 100%; + max-width: 100%; border-collapse: separate; border-spacing: 0; - min-width: 640px; + table-layout: auto; } /* Table Header */ @@ -162,22 +165,26 @@ /* Column Styles */ .nameColumn { - width: 20%; - min-width: 180px; + width: auto; + min-width: 120px; + max-width: 25%; } .typeColumn { - width: 15%; - min-width: 140px; + width: auto; + min-width: 100px; + max-width: 20%; } .requiredColumn { - width: 12%; - min-width: 100px; + width: auto; + min-width: 80px; + max-width: 15%; } .descriptionColumn { - width: auto; + width: 1%; /* Small width forces expansion to fill remaining space in auto layout */ + min-width: 200px; } /* Name Cell */ @@ -272,6 +279,8 @@ .descriptionCell { line-height: 1.6; color: var(--ifm-color-emphasis-700); + width: 100%; /* Ensure cell expands to fill column width */ + min-width: 0; /* Allow shrinking if needed, but column width will enforce expansion */ } [data-theme='dark'] .descriptionCell { @@ -310,6 +319,27 @@ color: #93c5fd; } +/* Inline code in descriptions */ +.descriptionCell .inlineCode { + font-family: var(--ifm-font-family-monospace); + font-size: 0.8125rem; + font-weight: 500; + background: var(--ifm-color-emphasis-100); + padding: 0.25rem 0.5rem; + border-radius: 0.375rem; + color: var(--ifm-color-primary); + border: 1px solid var(--ifm-color-emphasis-200); + display: inline-block; + line-height: 1.4; + margin: 0 0.125rem; +} + +[data-theme='dark'] .descriptionCell .inlineCode { + background: rgba(59, 130, 246, 0.1); + border-color: rgba(59, 130, 246, 0.2); + color: #93c5fd; +} + /* Responsive Design */ @media (max-width: 996px) { .propertyTable { diff --git a/website/src/components/code/ExpandableCode/index.js b/website/src/components/code/ExpandableCode/index.js index fa868a9b..30602b05 100644 --- a/website/src/components/code/ExpandableCode/index.js +++ b/website/src/components/code/ExpandableCode/index.js @@ -1,4 +1,5 @@ -import React, { useState } from 'react'; +import React, { useState, useMemo, useEffect } from 'react'; +import CodeBlock from '@theme/CodeBlock'; import Icon from '../../ui/Icon'; import clsx from 'clsx'; import styles from './styles.module.css'; @@ -18,34 +19,29 @@ export default function ExpandableCode({ children }) { const [isExpanded, setIsExpanded] = useState(false); - const codeRef = React.useRef(null); - const [needsExpansion, setNeedsExpansion] = React.useState(false); - - React.useEffect(() => { - if (codeRef.current) { - const lines = codeRef.current.textContent.split('\n').length; - setNeedsExpansion(lines > maxLines); - } - }, [children, maxLines]); + const [needsExpansion, setNeedsExpansion] = useState(false); const codeContent = typeof children === 'string' ? 
children : children?.props?.children || ''; + const lineCount = useMemo(() => codeContent.split('\n').length, [codeContent]); + + useEffect(() => { + setNeedsExpansion(lineCount > maxLines); + }, [lineCount, maxLines]); return (
{title &&
{title}
}
-
-          {codeContent}
-        
+ {codeContent} + {needsExpansion && ( diff --git a/website/src/components/docs/WasThisHelpful/index.js b/website/src/components/docs/WasThisHelpful/index.js index d420f549..31373ed5 100644 --- a/website/src/components/docs/WasThisHelpful/index.js +++ b/website/src/components/docs/WasThisHelpful/index.js @@ -1,6 +1,7 @@ import React, { useState } from 'react'; import clsx from 'clsx'; import styles from './styles.module.css'; +import { useDocumentationFeedback } from '../../../hooks/useDocumentationFeedback'; /** * WasThisHelpful Component - Feedback widget for documentation pages @@ -12,6 +13,7 @@ export default function WasThisHelpful({ pageId, onSubmit }) { + const { submitFeedback } = useDocumentationFeedback(); const [feedback, setFeedback] = useState(null); const [comment, setComment] = useState(''); const [submitted, setSubmitted] = useState(false); @@ -21,12 +23,11 @@ export default function WasThisHelpful({ }; const handleSubmit = () => { + submitFeedback(pageId, feedback, comment.trim() || null); if (onSubmit) { onSubmit({ pageId, feedback, comment }); - } else { - // Default behavior - could log to analytics - console.log('Feedback submitted:', { pageId, feedback, comment }); } + setSubmitted(true); }; @@ -60,15 +61,14 @@ export default function WasThisHelpful({ onClick={() => handleFeedback('yes')} aria-label="Yes, this was helpful" > - - - + Yes
diff --git a/website/src/components/docs/WasThisHelpful/styles.module.css b/website/src/components/docs/WasThisHelpful/styles.module.css index 30f78901..caaae8b5 100644 --- a/website/src/components/docs/WasThisHelpful/styles.module.css +++ b/website/src/components/docs/WasThisHelpful/styles.module.css @@ -75,6 +75,12 @@ flex-shrink: 0; } +.feedbackIcon { + display: inline-block; + vertical-align: middle; + flex-shrink: 0; +} + .feedbackForm { margin-top: 1rem; padding-top: 1rem; diff --git a/website/src/components/ui/Badge/styles.module.css b/website/src/components/ui/Badge/styles.module.css index eee3e93d..855d0556 100644 --- a/website/src/components/ui/Badge/styles.module.css +++ b/website/src/components/ui/Badge/styles.module.css @@ -8,6 +8,11 @@ transition: all 0.2s ease; } +/* Prevent Markdown-wrapped children from adding extra space */ +.badge p { + margin: 0; +} + /* Sizes */ .badge.small { padding: 0.25rem 0.625rem; diff --git a/website/src/components/ui/GradientButton/styles.module.css b/website/src/components/ui/GradientButton/styles.module.css index 5bead8be..9aeb3753 100644 --- a/website/src/components/ui/GradientButton/styles.module.css +++ b/website/src/components/ui/GradientButton/styles.module.css @@ -1,8 +1,12 @@ .gradientButton { position: relative; + padding-left: 0px; display: inline-flex; align-items: center; justify-content: center; + box-sizing: border-box; + line-height: 1; + vertical-align: middle; font-weight: 600; text-decoration: none; border: none; @@ -14,10 +18,21 @@ } .buttonContent { + display: inline-flex; + align-items: center; + gap: 0.35rem; + line-height: 1; position: relative; z-index: 2; } +/* Prevent Markdown-wrapped children from adding extra space */ +.gradientButton p, +.buttonContent p { + margin: 0; + color: white; +} + .buttonGlow { position: absolute; top: 50%; @@ -46,18 +61,18 @@ /* Sizes */ .gradientButton.small { - padding: 0.5rem 1rem; - font-size: 0.875rem; + padding: 0.55rem 1rem; + font-size: 0.9rem; } .gradientButton.medium { - padding: 0.65rem 1.25rem; + padding: 0.7rem 1.3rem; font-size: 1rem; } .gradientButton.large { - padding: 0.875rem 1.75rem; - font-size: 1.125rem; + padding: 0.9rem 1.75rem; + font-size: 1.05rem; } /* Variants */ @@ -66,6 +81,11 @@ color: white; } +.gradientButton:visited, +.gradientButton * { + color: inherit; +} + .gradientButton.secondary { background: linear-gradient(135deg, #60a5fa 0%, #2563eb 100%); color: white; diff --git a/website/src/hooks/useDocumentationFeedback.js b/website/src/hooks/useDocumentationFeedback.js new file mode 100644 index 00000000..73ffbf8d --- /dev/null +++ b/website/src/hooks/useDocumentationFeedback.js @@ -0,0 +1,35 @@ +import { useCallback } from 'react'; + +/** + * Custom hook for tracking documentation feedback with PostHog + * + * @returns {Function} submitFeedback - Function to submit feedback + */ +export function useDocumentationFeedback() { + /** + * Submit feedback to PostHog analytics + * + * @param {string} pageId - Unique identifier for the page + * @param {string} feedback - 'yes' or 'no' + * @param {string} comment - Optional comment text + */ + const submitFeedback = useCallback((pageId, feedback, comment = null) => { + const posthog = typeof window !== 'undefined' ? 
window.posthog : null; + + if (!posthog) { + console.log('Feedback submitted:', { pageId, feedback, comment: comment || null }); + return; + } + + posthog.capture('documentation_feedback', { + page_id: pageId, + feedback: feedback, + comment: comment || null, + timestamp: new Date().toISOString(), + url: typeof window !== 'undefined' ? window.location.href : null, + }); + }, []); + + return { submitFeedback }; +} + diff --git a/website/src/theme/EditThisPage/DocsEditThisPage.js b/website/src/theme/EditThisPage/DocsEditThisPage.js new file mode 100644 index 00000000..3ed1ec80 --- /dev/null +++ b/website/src/theme/EditThisPage/DocsEditThisPage.js @@ -0,0 +1,48 @@ +import React from 'react'; +import Link from '@docusaurus/Link'; +import {useDoc} from '@docusaurus/plugin-content-docs/client'; +import styles from './styles.module.css'; + +/** + * DocsEditThisPage component for documentation pages + * Uses useDoc hook to access frontMatter for "View Source" link + * + * WARNING: This component should ONLY be rendered when we're certain + * we're in a docs page context. It will throw an error if used outside + * the DocProvider context. + * + * @param {string} editUrl - URL to edit the page + */ +export default function DocsEditThisPage({editUrl}) { + const {frontMatter} = useDoc(); + const viewSource = frontMatter?.gitSource; + + // Nothing to show + if (!editUrl && !viewSource) { + return null; + } + + return ( +
+ {viewSource && ( + <> + + View Source + + | + + )} + {editUrl && ( + + Edit this page + + )} +
+ ); +} + diff --git a/website/src/theme/EditThisPage/SafeDocsEditThisPage.js b/website/src/theme/EditThisPage/SafeDocsEditThisPage.js new file mode 100644 index 00000000..7da71c5d --- /dev/null +++ b/website/src/theme/EditThisPage/SafeDocsEditThisPage.js @@ -0,0 +1,35 @@ +import React from 'react'; +import DocsEditThisPage from './DocsEditThisPage'; +import SimpleEditThisPage from './SimpleEditThisPage'; + +/** + * Error boundary wrapper for DocsEditThisPage + * Catches errors if useDoc hook is called outside DocProvider context + * Falls back to SimpleEditThisPage if an error occurs + * + * @param {string} editUrl - URL to edit the page + */ +export default class SafeDocsEditThisPage extends React.Component { + constructor(props) { + super(props); + this.state = { hasError: false }; + } + + static getDerivedStateFromError(error) { + // If useDoc fails, fall back to simple version + return { hasError: true }; + } + + componentDidCatch(error, errorInfo) { + // Error caught, will render fallback + // Could log to error reporting service here if needed + } + + render() { + if (this.state.hasError) { + return ; + } + + return ; + } +} diff --git a/website/src/theme/EditThisPage/SimpleEditThisPage.js b/website/src/theme/EditThisPage/SimpleEditThisPage.js new file mode 100644 index 00000000..eb7d676c --- /dev/null +++ b/website/src/theme/EditThisPage/SimpleEditThisPage.js @@ -0,0 +1,24 @@ +import React from 'react'; +import Link from '@docusaurus/Link'; +import styles from './styles.module.css'; + +/** + * Simple EditThisPage component for non-docs contexts (blog, etc.) + * Safe to use anywhere - doesn't require any special context + * + * @param {string} editUrl - URL to edit the page + */ +export default function SimpleEditThisPage({editUrl}) { + if (!editUrl) { + return null; + } + + return ( +
+ + Edit this page + +
+ ); +} + diff --git a/website/src/theme/EditThisPage/index.js b/website/src/theme/EditThisPage/index.js new file mode 100644 index 00000000..db62da16 --- /dev/null +++ b/website/src/theme/EditThisPage/index.js @@ -0,0 +1,34 @@ +import React from 'react'; +import {useLocation} from '@docusaurus/router'; +import SimpleEditThisPage from './SimpleEditThisPage'; +import SafeDocsEditThisPage from './SafeDocsEditThisPage'; + +/** + * Main EditThisPage component + * + * Intelligently renders the appropriate EditThisPage variant based on the current route: + * - Docs pages (/docs/*): Uses SafeDocsEditThisPage (with useDoc hook, wrapped in error boundary) + * - Other pages: Uses SimpleEditThisPage + * + * Route checking is necessary because error boundaries don't work reliably during SSR/build. + * + * @param {string} editUrl - URL to edit the page + */ +export default function EditThisPage({editUrl}) { + let isDocsPage = false; + + try { + const location = useLocation(); + const pathname = location?.pathname || ''; + + isDocsPage = pathname.startsWith('/docs/'); + } catch (error) { + isDocsPage = false; + } + + if (isDocsPage) { + return ; + } + + return ; +} diff --git a/website/src/theme/EditThisPage/styles.module.css b/website/src/theme/EditThisPage/styles.module.css new file mode 100644 index 00000000..fc7a21e0 --- /dev/null +++ b/website/src/theme/EditThisPage/styles.module.css @@ -0,0 +1,26 @@ +.wrapper { + display: inline-flex; + align-items: center; + gap: 0.75rem; + flex-wrap: wrap; +} + +.link { + display: inline-flex; + align-items: center; + gap: 0.35rem; + font-weight: 600; + color: var(--ifm-link-color); + text-decoration: none; +} + +.link:hover { + text-decoration: underline; + color: var(--ifm-link-hover-color, var(--ifm-link-color)); +} + +.link:visited { + color: var(--ifm-link-color); +} + + diff --git a/website/static/icons/light-bulb-round.svg b/website/static/icons/light-bulb-round.svg new file mode 100644 index 00000000..d08ab7ef --- /dev/null +++ b/website/static/icons/light-bulb-round.svg @@ -0,0 +1,2 @@ + + \ No newline at end of file diff --git a/website/static/icons/light-bulb-svgrepo-com.svg b/website/static/icons/light-bulb-svgrepo-com.svg new file mode 100644 index 00000000..9f8940a6 --- /dev/null +++ b/website/static/icons/light-bulb-svgrepo-com.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/website/static/icons/lightbulb.svg b/website/static/icons/lightbulb.svg index 2181d952..5861a5ec 100644 --- a/website/static/icons/lightbulb.svg +++ b/website/static/icons/lightbulb.svg @@ -1,20 +1,9 @@ - + - - - - - - - - - - - - - + + \ No newline at end of file diff --git a/website/static/icons/thumbs-down-white.svg b/website/static/icons/thumbs-down-white.svg new file mode 100644 index 00000000..98c4bc4d --- /dev/null +++ b/website/static/icons/thumbs-down-white.svg @@ -0,0 +1,4 @@ + + + + diff --git a/website/static/icons/thumbs-down.svg b/website/static/icons/thumbs-down.svg index 7c542c4f..2633c039 100644 --- a/website/static/icons/thumbs-down.svg +++ b/website/static/icons/thumbs-down.svg @@ -1,11 +1,9 @@ - + - - - - + + \ No newline at end of file diff --git a/website/static/icons/thumbs-up-white.svg b/website/static/icons/thumbs-up-white.svg new file mode 100644 index 00000000..9e9b1c0a --- /dev/null +++ b/website/static/icons/thumbs-up-white.svg @@ -0,0 +1,4 @@ + + + + diff --git a/website/static/icons/thumbs-up.svg b/website/static/icons/thumbs-up.svg index 86b9a56b..e3a3ac9d 100644 --- a/website/static/icons/thumbs-up.svg +++ 
b/website/static/icons/thumbs-up.svg @@ -1,11 +1,9 @@ - + - - - - + + \ No newline at end of file