# Promptfire

## The Problem
When working with AI assistants on your Obsidian vault, you constantly re-explain your folder structure, link style, tag conventions, and frontmatter requirements. This is repetitive, error-prone, and wastes tokens.
## The Solution
Promptfire stores your vault's conventions in a dedicated folder and copies everything to the clipboard with one hotkey. Output adapts automatically to different LLMs (Claude, GPT-4, Gemini, etc.), and a built-in history tracks every context you generate.
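The conventions folder is just ordinary notes. As a purely illustrative sketch (the path, file name, and rules below are invented, not a layout the plugin requires), a conventions note might look like this:

```markdown
<!-- Promptfire/Conventions.md (illustrative path, not enforced by the plugin) -->
## Linking
- Link notes with [[wikilinks]]; avoid bare URLs.
## Tags
- Use nested tags such as #status/draft and #topic/research.
## Frontmatter
- Every project note carries `status` and `owner` properties.
```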
## Key Features
- Multi-LLM output targets with per-model token limits, formats (XML, Markdown, Plain), and truncation strategies
- Prompt templates with placeholders and conditionals for reusable workflows
- Additional context sources from freetext, external files, or shell commands
- Frontmatter presets to configure context per-note via `ai-context` YAML (see the sketch after this list)
- Granular section selection to include only the headings you need
- Context history with diff, search, and one-click restore
- Context generator that walks you through setting up vault conventions
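To give a feel for the frontmatter presets, here is a minimal sketch of a per-note `ai-context` block; the nested keys (`target`, `sections`, `include-linked`) are assumptions made for illustration, so consult the Frontmatter Presets documentation for the fields the plugin actually supports:

```markdown
---
ai-context:
  target: claude          # use this output target for the note (illustrative key)
  sections:               # copy only these headings (illustrative key)
    - "## Architecture"
    - "## Conventions"
  include-linked: false   # skip linked notes for this one (illustrative key)
---

Note body continues as usual.
```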
See the full documentation for details on each feature.
## Quick Start
1. Generate context files
   Ctrl+P > "Promptfire: Generate context files"
2. Add output targets
   Settings > Promptfire > Output Targets > "Add Built-in Targets"
3. Copy & paste
   Ctrl+P > "Promptfire: Copy context to clipboard"
   Paste into Claude, ChatGPT, or any AI assistant.
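What lands on the clipboard depends on the output target chosen in step 2. As a rough, purely illustrative sketch (the headings and items here are invented, not the plugin's fixed output), a Markdown-format target might produce something like:

```markdown
# Vault context
## Folder structure
- Projects/ holds one note per active project
- Reference/ holds evergreen notes and sources
## Conventions
- Link with [[wikilinks]]; tag notes with #status/* and #topic/*
- Project notes carry `status` and `owner` frontmatter properties
```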
## Documentation
| Topic | Description |
|---|---|
| Getting Started | Installation, first run, commands |
| Output Targets | Multi-LLM formats and truncation |
| Templates | Prompt templates, placeholders, conditionals |
| Context Sources | Freetext, file, and shell sources |
| Frontmatter Presets | Per-note ai-context YAML configuration |
| Section Selection | Granular heading and block selection |
| History | Context history, diff, and restore |
| Settings | Full settings reference |
| Advanced | Truncation strategies, priority order, XML format |
## Contributing
Contributions welcome! Fork the repo, create a feature branch, run `npm run build` to verify, and submit a pull request. Report bugs and request features via Codeberg Issues.
## License
MIT License - see LICENSE for details.