Aztec is an automated code-review service powered by an LLM running inside a container. It integrates directly with GitHub pull requests and posts AI-generated review comments based on the diff.
Aztec uses a GitHub Actions workflow that:
- Spins up the Aztec container (running an LLM such as `codellama:7b`); this can be sped up by hosting your own LLM server
- Computes the diff between the PR branch and `main`
- Lets the PR description serve as a flexible prompt, so users can vary what the LLM is asked and what it outputs
- Sends the diff plus the PR description to the LLM
- Posts the generated review as a PR comment
When a pull request is opened, the workflow:
- Fetches the diff between the PR branch and `main`
- Uses the PR description as the prompt that tells the LLM how to review the code
- Sends the diff (plus the PR body as context) to the Aztec container via `curl`
- Captures the model output and stores it as `review.txt`
- Publishes the review directly as a comment on the pull request using GitHub's API
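The steps above can be sketched in shell. The endpoint path, variable names, and file layout here are assumptions based on a standard Ollama-compatible API, not Aztec's exact interface:

```shell
#!/bin/sh
# Hypothetical sketch of the per-PR steps; placeholders stand in for real values.
MODEL="codellama:7b"
AZTEC_URL="http://localhost:11434"    # Ollama-compatible API (port 11434)

# 1. Fetch the diff between the PR branch and main. In the workflow this
#    would be: DIFF=$(git diff origin/main...HEAD)
DIFF="example diff"                   # placeholder so the sketch runs standalone
PR_BODY="Please focus on error handling."

# 2. Build the request body; the PR body is prepended as the review prompt.
#    (A real workflow should JSON-escape both values, e.g. with jq.)
cat > request.json <<EOF
{"model": "$MODEL",
 "prompt": "$PR_BODY\n\nReview this diff:\n$DIFF",
 "stream": false}
EOF

# 3. Send it to the Aztec container and capture the output (not run here):
#    curl -s "$AZTEC_URL/api/generate" -d @request.json \
#      | jq -r '.response' > review.txt

# 4. Publish review.txt as a PR comment (not run here):
#    gh pr comment "$PR_NUMBER" --body-file review.txt
```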
This allows fully automated LLM-powered reviews without needing to run external cloud services.
To enable Aztec in your repository:
- Add a GitHub Actions workflow file (e.g. `.github/workflows/aztec.yml`)
- Configure the Aztec container as a service
- Provide the required credentials (see below)
- Ensure your model is available in the container; by default the workflow uses `codellama:7b`
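The setup above can be sketched as a single workflow file. The job name, step layout, and secret wiring here are illustrative assumptions, not the canonical Aztec workflow:

```yaml
# .github/workflows/aztec.yml (illustrative sketch, not the canonical file)
name: Aztec Code Review
on: [pull_request]

jobs:
  review:
    runs-on: ubuntu-latest
    services:
      aztec:
        image: ghcr.io/montcao/aztec:latest
        credentials:
          username: ${{ github.actor }}
          password: ${{ secrets.REG_TOKEN }}
        ports:
          - 11434:11434
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0   # full history so the PR branch can be diffed against main
      - name: Generate and post review
        env:
          GH_PAT: ${{ secrets.GH_PAT }}
        run: |
          # Compute the diff; subsequent commands would send it to the
          # container on localhost:11434 and post the result as a PR comment.
          git diff origin/main...HEAD > pr.diff
```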
The container image defaults to `ghcr.io/montcao/aztec:latest`.
This image runs an Ollama-compatible API on port 11434.
Two credentials are required:

`REG_TOKEN` — a GitHub Container Registry token used to authenticate the pull of the Aztec container image. Store it as a GitHub Actions secret named `REG_TOKEN`.

`GH_PAT` — a personal access token used to write PR comments. It must be created with at least the `repo` scope. For example, create a token named "Aztec Code Review" and add it to repository secrets as `GH_PAT`.
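To illustrate how `GH_PAT` is used, here is a minimal sketch of posting `review.txt` as a comment through GitHub's REST API; `REPO`, `PR_NUMBER`, and the sample review text are placeholders:

```shell
#!/bin/sh
# Hypothetical sketch; REPO and PR_NUMBER are placeholders.
REPO="owner/repo"
PR_NUMBER=1

printf '%s' "Looks good overall; consider adding tests." > review.txt

# Wrap the review in the JSON body GitHub expects.
# (A real workflow should JSON-escape the text, e.g. with jq.)
REVIEW=$(cat review.txt)
printf '{"body": "%s"}' "$REVIEW" > comment.json

# Post the comment (requires GH_PAT with the repo scope; not run here):
#   curl -s -X POST \
#     -H "Authorization: Bearer $GH_PAT" \
#     -H "Accept: application/vnd.github+json" \
#     "https://api.github.com/repos/$REPO/issues/$PR_NUMBER/comments" \
#     -d @comment.json
```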
- The workflow supports incremental updates: every time the PR changes, Aztec reruns and posts a fresh comment.
- You can swap `codellama:7b` for any model supported by your Aztec container.
- The PR body is included as context for the LLM (e.g., requirements, intent, flags, explanations).
- The repository's `Dockerfile` defines what `ghcr.io/montcao/aztec:latest` contains.
- The PR description lets you prompt the LLM for specific comments or fields.