
local-inference

Here are 35 public repositories matching this topic...

A modern desktop application (Rust + Tauri v2 + Svelte 5 + Candle (HF)) for chatting with AI models entirely locally on your computer. No subscriptions, no data sent over the internet: just you and your personal AI assistant.

  • Updated Feb 25, 2026
  • Rust

An overfitted Stable Diffusion prompt engine with severe "aesthetic snobbery" that forcibly transforms mundane ideas into professional-grade rendering instructions with extreme physical detail.

  • Updated Jan 19, 2026
  • Python

An agentic, zero-shot document intelligence engine that sees, understands, and extracts from any PDF: no training, no hallucinations. Just define your fields and get trusted, structured outputs with confidence scores. Deployed locally and built for the enterprise.

  • Updated Mar 2, 2026
  • Python
