⚠️ Project Paused — I got bored and discovered LM Studio, which already does everything I was building here. Check it out instead!
Development Paused

HugBrowse

Your Local-First AI Platform

Browse, download, and run Hugging Face models — right on your machine.
Too heavy for local? Offload to the cloud with one click.

🦀 Built with Rust 🔒 Local-First, Cloud-Optional ⚡ GPU Accelerated 📜 MIT Licensed

Everything you need to run AI locally

From model discovery to local inference, HugBrowse handles the entire workflow so you can focus on what matters.

🔍

Smart Model Search

Browse and filter thousands of Hugging Face models by task, library, parameters, and popularity — all from a beautiful native interface.

🖥️

Hardware-Aware

Auto-detects your GPU, CPU, and RAM, then classifies your hardware tier to recommend models you can actually run.
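The tier idea can be pictured as a simple mapping from detected resources to a coarse class plus a model-size budget. The sketch below is illustrative only: the tier names, VRAM thresholds, and the gigabytes-per-billion-parameters estimate are assumptions, not HugBrowse's actual classification logic.

```rust
// Illustrative sketch: thresholds and constants are assumptions,
// not HugBrowse's real classifier.

#[derive(Debug, PartialEq)]
enum Tier {
    CpuOnly, // no discrete GPU: small quantized models on CPU
    Entry,   // up to 6 GB VRAM
    Mid,     // up to 12 GB VRAM
    High,    // more than 12 GB VRAM
}

fn classify_tier(vram_gb: u64) -> Tier {
    match vram_gb {
        0 => Tier::CpuOnly,
        1..=6 => Tier::Entry,
        7..=12 => Tier::Mid,
        _ => Tier::High,
    }
}

/// Rough upper bound on model size (billions of parameters),
/// assuming ~0.7 GB per billion parameters at 4-bit quantization
/// and leaving half of system RAM free when falling back to CPU.
fn max_params_b(ram_gb: u64, vram_gb: u64) -> u64 {
    let budget_gb = if vram_gb > 0 { vram_gb } else { ram_gb / 2 };
    (budget_gb as f64 / 0.7) as u64
}

fn main() {
    let (ram, vram) = (32, 8); // e.g. 32 GB RAM, 8 GB VRAM
    println!("{:?}, up to ~{}B params", classify_tier(vram), max_params_b(ram, vram));
}
```

A machine with 8 GB of VRAM would land in the mid tier here and be pointed at quantized models of roughly 11B parameters or less.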

📦

Download Manager

Pause, resume, and cancel downloads with SHA-256 integrity verification. Never worry about corrupted model files again.
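The bookkeeping behind pause/resume and verification can be sketched in a few lines: a resumed download asks the server for the remaining bytes via an HTTP `Range` header, and the finished file's SHA-256 digest is compared against the expected one. This is a minimal sketch of that bookkeeping only; the actual HTTP client and hashing (e.g. via the `reqwest` and `sha2` crates) are omitted, and the function names are mine.

```rust
/// Range header for resuming: ask for everything from the first
/// missing byte onward. None means start from scratch.
fn resume_header(bytes_on_disk: u64) -> Option<String> {
    if bytes_on_disk == 0 {
        None
    } else {
        Some(format!("bytes={}-", bytes_on_disk))
    }
}

/// Integrity check: compare the locally computed SHA-256 digest
/// against the expected one, ignoring hex case.
fn digest_matches(expected_hex: &str, actual_hex: &str) -> bool {
    expected_hex.eq_ignore_ascii_case(actual_hex)
}

fn main() {
    // A 1 GiB partial download resumes at byte 1073741824.
    println!("{:?}", resume_header(1 << 30));
}
```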

💬

Local Chat

Chat with models running entirely on your hardware. Markdown rendering, streaming responses, and full conversation history.

📊

Resource Monitor

Real-time CPU, RAM, GPU, and VRAM tracking. Know exactly what your system is doing and how much headroom you have.

☁️

Cloud Offload

PC too weak? Deploy models to Hugging Face Inference Endpoints or connect any OpenAI-compatible API server: a VPS, Azure, or your own machine. Switch backends in one click.
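Because both local and remote servers speak the same OpenAI-compatible protocol, the one-click switch can be pictured as swapping a base URL while the request shape stays fixed. A hypothetical sketch (the type and method names are mine, not HugBrowse's):

```rust
/// A chat backend is essentially a base URL; local and remote
/// servers expose the same OpenAI-compatible route.
enum Backend {
    Local { port: u16 },         // llama-server on this machine
    Remote { base_url: String }, // Inference Endpoint, VPS, Azure, ...
}

impl Backend {
    fn chat_url(&self) -> String {
        match self {
            Backend::Local { port } => {
                format!("http://127.0.0.1:{port}/v1/chat/completions")
            }
            Backend::Remote { base_url } => {
                format!("{}/v1/chat/completions", base_url.trim_end_matches('/'))
            }
        }
    }
}

fn main() {
    let local = Backend::Local { port: 8080 };
    let cloud = Backend::Remote { base_url: "https://example.com/".into() };
    println!("{}\n{}", local.chat_url(), cloud.chat_url());
}
```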

🏪

Marketplace

Community-driven plugins, extensions, and model configurations. Extend HugBrowse with one-click installs from the marketplace.

🔄

Auto-Updates

Always on the latest version. HugBrowse checks for updates automatically and installs them seamlessly — no manual downloads needed.

See it in action

A clean, native desktop experience designed to make AI accessible to everyone.

HugBrowse Model Browser showing trending Hugging Face models
Model Browser
HugBrowse Resource Monitor showing CPU, RAM, GPU usage
Resource Monitor
HugBrowse Recommendations showing models matched to your hardware
Recommendations
HugBrowse Settings panel
Settings

Built on modern foundations

Three powerful technologies come together to deliver a fast, secure, and beautiful experience.

🔧

Tauri

v2

Lightweight native shell with security-first IPC, auto-updater, system tray, and deep-link support. Tiny binary, low memory footprint.

⚛️

React

v19

Modern component architecture with TypeScript, TanStack Router, and a responsive design system for the entire UI layer.

🦀

Rust

Backend

System-level performance for hardware detection, download management, file hashing, and llama-server sidecar orchestration.

HugBrowse uses a sidecar architecture: the Rust backend handles system operations while llama-server runs as an independent process for GPU-accelerated inference with CUDA, Metal, and Vulkan support. When you run models locally, no data leaves your machine.
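The sidecar split can be sketched with `std::process`: instead of linking inference code into the app, the backend builds and supervises a llama-server child process. The flags shown (`--model`, `--port`, `--n-gpu-layers`) are real llama-server options, but the wiring below is an illustration, not HugBrowse's actual orchestration code, and the model filename is a made-up example.

```rust
use std::process::Command;

/// Build the sidecar command; the caller spawns and supervises it.
fn llama_server_cmd(model_path: &str, port: u16, gpu_layers: u32) -> Command {
    let mut cmd = Command::new("llama-server");
    cmd.arg("--model").arg(model_path)
        .arg("--port").arg(port.to_string())
        .arg("--n-gpu-layers").arg(gpu_layers.to_string());
    cmd
}

fn main() {
    let cmd = llama_server_cmd("models/example.Q4_K_M.gguf", 8080, 99);
    let args: Vec<String> = cmd
        .get_args()
        .map(|a| a.to_string_lossy().into_owned())
        .collect();
    println!("llama-server {}", args.join(" "));
    // Calling cmd.spawn() would launch it as an independent child
    // process that the UI talks to over HTTP.
}
```

Keeping inference in a separate process means a crashed or hung model server can be restarted without taking the app down.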

Ready to run AI locally?

Download HugBrowse and start exploring thousands of models — run locally or offload to the cloud. Auto-updates keep you current.