Official application repository for LettuceAI
Overview • Install • Development • Android • iOS • Contributing
This repository contains the LettuceAI application. LettuceAI is a privacy-first, cross‑platform roleplay and storytelling app built with Tauri v2, React, and TypeScript. It runs locally, keeps data on‑device, and lets users bring their own API keys and models.
- Bun 1.1+ (includes Node.js compatibility): https://bun.sh/
- Rust 1.70+ and Cargo
- Android SDK (optional, for Android builds)
- Xcode + iOS SDK (optional, for iOS builds, macOS only)
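The required tools above can be sanity-checked before cloning. A minimal POSIX shell sketch; it only verifies the tools are on `PATH` and does not check version numbers or the optional SDKs:

```shell
# Check that the required build tools resolve (names from the
# prerequisites list above; Android/iOS SDK checks are omitted
# since those are optional).
missing=0
for tool in bun rustc cargo; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
    missing=$((missing + 1))
  fi
done
echo "$missing required tool(s) missing"
```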
```bash
# Clone the repository
git clone https://github.com/LettuceAI/mobile-app.git
cd mobile-app

# Install dependencies
bun install

# Desktop (Tauri)
bun run tauri dev
bun run tauri build
bun run tauri:build:macos

# Desktop with NVIDIA CUDA llama.cpp acceleration
bun run tauri dev --features llama-gpu-cuda
bun run tauri build --features llama-gpu-cuda

# Desktop with NVIDIA CUDA llama.cpp acceleration (auto-detect local GPU arch)
bun run tauri:dev:cuda:auto
bun run tauri:build:cuda:auto

# Desktop with Vulkan llama.cpp acceleration (AMD/Intel/NVIDIA, driver-dependent)
bun run tauri dev --features llama-gpu-vulkan
bun run tauri build --features llama-gpu-vulkan

# Desktop with Metal llama.cpp acceleration (Apple Silicon/Intel Macs, macOS only)
bun run tauri:dev:metal
bun run tauri:build:metal

# Android
bun run tauri android dev
bun run tauri android build

# Quality
bunx tsc --noEmit
bun run check
```

## Android

- Install Android Studio and set up the SDK
- Ensure `ANDROID_SDK_ROOT` is set in your environment
- Add the platform tools to your `PATH` (example: `export PATH=$ANDROID_SDK_ROOT/platform-tools:$PATH`)
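Put together, the environment setup might look like the following. The SDK path is only an example (a common Android Studio default on Linux); substitute your own install location:

```shell
# Example environment setup for Android builds; the SDK path below is
# an assumption, not something this repository configures for you.
export ANDROID_SDK_ROOT="$HOME/Android/Sdk"
export PATH="$ANDROID_SDK_ROOT/platform-tools:$PATH"

# adb resolves from platform-tools once the SDK is actually installed:
# adb --version
```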
```bash
# Run on Android emulator
bun run tauri android dev

# Build Android APK
bun run tauri android build
```

## iOS

- Install Xcode from the App Store
- Install the Xcode command-line tools: `xcode-select --install`
- Install CocoaPods: `sudo gem install cocoapods` (or via Homebrew)
- Provide ONNX Runtime for iOS with CoreML support:
  - Build or download an iOS-compatible ONNX Runtime package that includes the CoreML execution provider (EP)
  - Set `ORT_LIB_LOCATION` to the directory containing the ONNX Runtime libraries before building
- Initialize the iOS project files:

```bash
export ORT_LIB_LOCATION=/absolute/path/to/onnxruntime/ios/libs
bun run tauri ios init

# Run on iOS simulator/device (from macOS)
bun run tauri ios dev

# Build iOS app
bun run tauri ios build
```

For `llama-gpu-cuda`, install the NVIDIA CUDA toolkit and driver on the build machine.
For `llama-gpu-metal`, build on macOS with the Xcode command-line tools installed.
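A rough way to choose among the GPU feature flags above, based on what the build machine exposes. The detection heuristic (`nvcc` on `PATH`, `uname` for macOS) is an assumption for illustration, not something the build scripts themselves do:

```shell
# Heuristic selection among the llama.cpp GPU features listed above.
if command -v nvcc >/dev/null 2>&1; then
  feature="llama-gpu-cuda"    # CUDA toolkit present
elif [ "$(uname -s)" = "Darwin" ]; then
  feature="llama-gpu-metal"   # macOS: use the tauri:build:metal script
else
  feature="llama-gpu-vulkan"  # fallback; requires a working Vulkan driver
fi
echo "suggested feature: $feature"
```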
Build a native macOS app bundle and DMG installer on macOS:
```bash
bun run tauri:build:macos
```

The build script auto-downloads a compatible ONNX Runtime dylib for macOS into `src-tauri/onnxruntime` (unless `ORT_LIB_LOCATION` is explicitly set) and bundles it into the app resources.

Artifacts are generated under:

- `src-tauri/target/release/bundle/macos/*.app`
- `src-tauri/target/release/bundle/dmg/*.dmg`
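To build against a locally provided ONNX Runtime instead of the auto-downloaded dylib, set the variable before invoking the build. The directory path is an example; it must contain the macOS ONNX Runtime libraries:

```shell
# Skip the automatic ONNX Runtime download by pointing ORT_LIB_LOCATION
# at a directory you supply (example path, not a repository default).
export ORT_LIB_LOCATION="$HOME/libs/onnxruntime-osx"
echo "building against: $ORT_LIB_LOCATION"
# then: bun run tauri:build:macos
```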
We welcome contributions.
- Fork the repo
- Create a feature branch: `git checkout -b feature/my-change`
- Follow TypeScript and React best practices
- Test your changes
- Commit with clear, conventional messages
- Push and open a PR
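The branch-and-commit steps can be sketched end to end. This demo runs in a throwaway repository; the branch name, file, and commit message are examples, and the fork/push/PR steps are omitted:

```shell
# Demonstrate the contribution flow (minus fork and push) in a temp repo.
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git checkout -q -b feature/my-change   # feature branch
echo "demo change" > notes.txt
git add notes.txt
git -c user.email=you@example.com -c user.name=you \
    commit -q -m "docs: add demo notes"  # conventional commit message
branch=$(git rev-parse --abbrev-ref HEAD)
echo "ready to push branch: $branch"
```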
GNU Affero General Public License v3.0 — see LICENSE
Privacy-first • Local-first • Open Source