Native software for people who care about performance

I build desktop apps and graphics engines the old way: compiled code, no web views, no Electron, no 500MB runtime for a text editor.

Indie developer focused on macOS. Background in graphics programming. Currently shipping an LLM client and working on a graphics engine. Everything built on real rendering backends, not browser tech pretending to be native.

Products

Bots

Native LLM Client

A local-first LLM client that doesn't choke on long conversations. Built on a game engine, not Electron. Scrolls 10k messages at 3ms per frame. Your data stays on your machine, in a SQLite database you control.

$20 once. No subscription.

Graphics Engine

Coming Soon

A real-time rendering engine built from scratch. Vulkan backend. Same UI framework powering Bots. More details when there's something to show.

In development

Devlog

Notes on building native software, graphics programming, and shipping products.

Why am I doing this?

The origin story: from Casey Muratori's game engine streams, to forking raddbg, to building a standalone collection of libraries and shipping products.

View all posts →

Bots — Native LLM Client

Long chats. Real search. Yours forever.

Bots screenshot
Join the Beta

Features

  • Instant UI — Built on a game engine. Scrolling 10k messages feels like 10.
  • Handles long conversations — Web UIs choke on long chats. This doesn't.
  • Real search — Searches your actual messages, not just titles.
  • Works offline — Your history is always there. No connection required.
  • Switch providers anytime — Any OpenAI-compatible API. Claude down? Send to Cerebras.
  • Keyboard-driven — Hotkeys for everything. Mouse optional.
  • Export to Markdown — Your conversations, in files you control.
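"Any OpenAI-compatible API" means providers share one chat-completions request shape, so switching is just a different base URL and key. A minimal sketch of that shape — the endpoint URL and model name here are illustrative, not Bots' actual code:

```python
def chat_request(base_url: str, model: str, messages: list[dict]) -> tuple[str, dict]:
    """Build an OpenAI-compatible chat completion request.

    Any provider exposing this shape works; only the base URL,
    model name, and API key change between providers.
    """
    url = f"{base_url.rstrip('/')}/chat/completions"
    payload = {
        "model": model,
        "messages": messages,  # [{"role": "user", "content": "..."}, ...]
    }
    return url, payload

# Illustrative endpoint: swapping providers is just a different base URL.
url, payload = chat_request(
    "https://api.cerebras.ai/v1",
    "llama-3.3-70b",
    [{"role": "user", "content": "Hello"}],
)
print(url)  # https://api.cerebras.ai/v1/chat/completions
```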

macOS now. Linux and Windows coming soon.

Why?

Web UIs break on long conversations. Search only finds auto-generated titles. History disappears when you're offline. One provider goes down and you're stuck.

This is a native macOS app that handles all of it. Not Electron — built on a fork of raddbg's UI layer, it renders at 3ms per frame. Your conversations live in a local SQLite database you own and control. Bring your own API keys — any OpenAI-compatible endpoint works.
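Because the history is plain SQLite, it stays queryable without the app. A sketch under an assumed schema (the app's actual table layout may differ):

```python
import sqlite3

# Assumed schema for illustration only; Bots' real tables may differ.
conn = sqlite3.connect(":memory:")  # in practice: the app's .db file on disk
conn.execute("CREATE TABLE messages (id INTEGER PRIMARY KEY, role TEXT, content TEXT)")
conn.executemany(
    "INSERT INTO messages (role, content) VALUES (?, ?)",
    [("user", "What is Vulkan?"), ("assistant", "A graphics API.")],
)

# Search your own data with any SQLite client -- no app required.
rows = conn.execute(
    "SELECT role, content FROM messages WHERE content LIKE ?", ("%Vulkan%",)
).fetchall()
print(rows)  # [('user', 'What is Vulkan?')]
```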

Local-first. No cloud. No account. No subscription — $20 once, updates included.

Requires macOS MACOS_VERSION+. Bring your own API keys. No screen reader support — game engine UI trade-off.

Demo

Setup walkthrough and feature overview.

Bots Demo

About

I'm Dima Afanasyev. By day I work as a developer; on the side I'm building this and saving up for a physics degree (optics) in Germany. The plan: save enough to live on while studying, since I can't do a bachelor's and work full-time at the same time. Every sale here goes into that fund.

I'm into graphics programming — the engine underneath this app started as a fork of raddbg, and I plan to keep working on it long-term. I've got a graphics demo in the works using the same engine. The physics degree is partly about getting the applied math foundation I want.

So if you buy this, you're helping fund one developer's education. And you get a fast, local LLM client out of it.