API Reference

Overview

The project is a codebase documentation generation tool that uses large language models (LLMs) to automatically produce documentation from source code analysis. It is organized into several cooperating subsystems: a networking and protocol layer (built from protocol, client, http, network, openai, anthropic, and provider) that abstracts provider-specific API interactions; an extraction pipeline (extract and schema) that parses code into structured representations; an agent module that drives an LLM-based exploration loop using tool calls; and a generate module that renders the final documentation output. Configuration is managed by the config module, while support provides shared utilities for file I/O, hashing, and logging.
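The provider abstraction described above can be sketched as a common interface behind which each backend hides its API details. This is an illustrative sketch only: the class and function names (`Provider`, `Completion`, `OpenAIProvider`, `AnthropicProvider`, `generate_docs`) are hypothetical and not taken from the codebase, which only names the modules themselves.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Hypothetical types for illustration; the real protocol/provider modules
# may structure this differently.

@dataclass
class Completion:
    text: str

class Provider(ABC):
    """Common interface that provider-specific backends implement."""
    @abstractmethod
    def complete(self, prompt: str) -> Completion: ...

class OpenAIProvider(Provider):
    def complete(self, prompt: str) -> Completion:
        # A real backend would issue a request through the http/network layer.
        return Completion(text=f"[openai] {prompt}")

class AnthropicProvider(Provider):
    def complete(self, prompt: str) -> Completion:
        return Completion(text=f"[anthropic] {prompt}")

def generate_docs(provider: Provider, prompt: str) -> str:
    # Callers depend only on the Provider interface, never on a concrete backend.
    return provider.complete(prompt).text
```

The design choice this illustrates is that the agent and generate modules can remain ignorant of which LLM backend is in use; swapping providers is a one-line change at the call site.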

Readers should view the project as a pipeline where source code is first analyzed and represented in a schema, an LLM agent then iteratively inspects the codebase, and the results are assembled into documentation pages. The networking modules handle the asynchronous communication with various LLM backends, while the agent orchestrates the conversation and caching. This modular design allows each subsystem to be developed and tested independently while contributing to the overall goal of automated documentation generation.

Modules

Namespaces

Module Dependency Diagram