    AI Indexing Documentation

    Optimly provides machine-readable content optimized for AI agents, RAG pipelines, and LLM retrieval systems.

    llms.txt

    Token-efficient index of key Optimly URLs and services. Ideal for quick context loading and navigation.

    • Lightweight (~50 lines)
    • Core services and documentation links
    • Entity disambiguation included
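
    A minimal sketch of "quick context loading": fetch the index and hand it to an agent as prompt context. The URL below is an assumption based on the llms.txt convention of serving the file from the site root; adjust it to wherever Optimly actually publishes the file.

    import urllib.request

    # Assumed location; the llms.txt standard conventionally serves the
    # file from the site root.
    LLMS_TXT_URL = "https://www.optimly.ai/llms.txt"

    def load_llms_txt(url: str = LLMS_TXT_URL) -> str:
        """Fetch the lightweight index for use as prompt context."""
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8")

    if __name__ == "__main__":
        index = load_llms_txt()
        # At roughly 50 lines, the whole index fits comfortably into a
        # system prompt for an agent navigating Optimly's documentation.
        print(f"Loaded {len(index.splitlines())} lines of index context")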

    llms-full.txt

    Full RAG-ready content with semantic chunking and YAML metadata delimiters for retrieval systems.

    • 20+ semantic chunks
    • Self-contained contexts
    • Chunk IDs and source URLs

    For AI Developers

    These files follow the emerging llms.txt standard for providing structured content to AI systems.

    Chunk Format (llms-full.txt)

    ---
    chunk_id: methodology-001
    topic: The Measurement Problem
    source: https://www.optimly.ai/resources/methodology
    ---
    
    # The Measurement Problem
    
    [Self-contained content for RAG retrieval...]
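
    As a rough illustration of how a retrieval pipeline might consume this format, the sketch below splits llms-full.txt into chunks on the YAML front-matter delimiters and collects each chunk's metadata alongside its content. The file URL, and the assumption that "---" never opens a line inside a chunk body, are simplifications, not part of the published format.

    import re
    import urllib.request

    # Assumed location; the page does not state where llms-full.txt is served.
    LLMS_FULL_URL = "https://www.optimly.ai/llms-full.txt"

    # A chunk is a YAML front-matter block (--- ... ---) followed by a body
    # that runs until the next front-matter block or the end of the file.
    CHUNK_RE = re.compile(
        r"^---\s*\n(?P<meta>.*?)\n---\s*\n(?P<body>.*?)(?=^---\s*\n|\Z)",
        re.DOTALL | re.MULTILINE,
    )

    def parse_chunks(text: str) -> list[dict]:
        """Split llms-full.txt into self-contained chunks with their metadata."""
        chunks = []
        for match in CHUNK_RE.finditer(text):
            meta = {}
            for line in match.group("meta").splitlines():
                key, _, value = line.partition(":")
                meta[key.strip()] = value.strip()
            chunks.append({**meta, "content": match.group("body").strip()})
        return chunks

    if __name__ == "__main__":
        with urllib.request.urlopen(LLMS_FULL_URL) as resp:
            chunks = parse_chunks(resp.read().decode("utf-8"))
        for chunk in chunks:
            print(chunk.get("chunk_id"), "-", chunk.get("topic"))

    Each parsed chunk carries its chunk_id and source URL, so it can be embedded and cited independently, which is what makes the contexts self-contained for RAG retrieval.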