Run LLM Agents Locally on Your Machine

Privacy-first AI assistant platform. Your data stays on your computer.

Version 0.1.0 (Early Access)

$ drakyn start

Starting inference server...

Starting MCP server...

✓ Drakyn Desktop ready

$ _

Why Drakyn Desktop?

🔒

Privacy First

All processing happens locally. Your conversations and data never leave your machine.

⚡

Fast Inference

Powered by vLLM for high-performance model inference on CPU or GPU.

🔧

Extensible Tools

Built on the Model Context Protocol (MCP) for easy tool integration.
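Clients that support MCP typically register tool servers through a small configuration file. As an illustration only, here is a hypothetical sketch following the convention used by other MCP clients — the `mcpServers` key and its placement are assumptions, not documented Drakyn settings (the filesystem server package itself is the MCP reference implementation):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/dir"
      ]
    }
  }
}
```

Each entry names a command the client launches and communicates with over MCP's stdio transport, exposing that server's tools to the agent.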

💻

Native Experience

Beautiful desktop app for macOS, Windows, and Linux.

🤖

Multiple Agents

Configure and run different AI agents for different tasks.

📦

Model Management

Download, load, and switch between models with ease.

Download Drakyn Desktop

Choose your platform and start running AI agents locally

macOS

macOS 11 or later (Apple Silicon)

View Releases

Windows

Windows 10 or later

View Releases

Linux

Ubuntu 20.04+ / Debian-based

View Releases

Installation

Click "View Releases" above to download the latest version for your platform. Installation instructions are provided on the releases page.

Open Source & Community Driven

Drakyn Desktop is open source and welcomes contributions from the community.

View on GitHub