“CraftGPT” is a fully in-game computer that runs a ~5.1M-parameter small language model inside Minecraft, built from roughly 439 million blocks of redstone circuitry. It’s a glorious engineering stunt—and a great teaching tool.
The builder trained a tiny model (≈5.09M parameters) on a simple English dialogue dataset, then implemented the tokenizer and the matrix operations in redstone logic. It can hold a chat, but the limitations are the point: a single response can take hours even on a high-tick server, and output quality is bounded by the model’s small size and short context window. Still, as substrate-agnostic compute theater, it’s superb.
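To make the scale of the task concrete, here is an illustrative sketch (not CraftGPT’s actual code, and the vocabulary is invented) of the two primitives any LLM substrate must provide: token lookup and matrix-vector products. Each generated token requires on the order of twice-the-parameter-count multiply-adds, which is why redstone-speed arithmetic turns one reply into hours.

```python
# Toy word-level tokenizer over a hypothetical tiny vocabulary.
VOCAB = {"<unk>": 0, "hello": 1, "how": 2, "are": 3, "you": 4}

def tokenize(text):
    """Map whitespace-split words to integer ids; unknown words map to 0."""
    return [VOCAB.get(w, 0) for w in text.lower().split()]

def matvec(matrix, vector):
    """Row-by-row dot product: the operation a redstone ALU must repeat
    millions of times per generated token."""
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]

ids = tokenize("Hello how are you")
print(ids)  # [1, 2, 3, 4]

# A 3x4 weight matrix applied to a 4-dim "embedding" vector.
W = [[1, 0, 0, 0],
     [0, 1, 1, 0],
     [2, 0, 0, 1]]
x = [1.0, 2.0, 3.0, 4.0]
print(matvec(W, x))  # [1.0, 5.0, 6.0]
```

On a GPU these dot products are batched and parallelized; in redstone, each multiply-add costs many game ticks, so the same arithmetic runs many orders of magnitude slower.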
To put this in context, see our take on NPUs’ real value and our guide to running LLMs on a PC. If you want a practical sandbox, start with lightweight models on CPU/GPU and graduate to quantized 7B–13B tiers before you worry about redstone.