Anapoly Notebook | Digital Garden
A local LLM
Status: 🔸 Seed → ✅ Growing → 🔸 Well-formed → 🔸 Fruitful → 🔸 Retired
*Transparency label: AI-heavy*
With an AI installed locally rather than in the cloud, our data remains private and we retain full control over our information. A cost-effective way to achieve this with reasonable performance is to install the AI on a powerful, consumer-grade computer. This needs a strong CPU (such as an AMD Ryzen 9 or Intel Core i9), a high-performance NVIDIA GPU (such as an RTX 4090 or 3090), ample RAM (32–64 GB), and fast NVMe storage.
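To see why a GPU in this class is needed, a rough rule of thumb relates a model's parameter count and quantisation level to the memory it occupies. The figures below are back-of-envelope assumptions (roughly 1 GB per billion parameters at 8 bits, plus about 20% overhead for caches and buffers), not vendor specifications:

```python
# Rule-of-thumb VRAM estimate for running a quantised LLM locally.
# The overhead factor is an assumption, not a measured value.

def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead_factor: float = 1.2) -> float:
    """Approximate VRAM needed: weights plus ~20% for KV cache and buffers."""
    weight_gb = params_billions * bits_per_weight / 8  # 1B params at 8 bits ≈ 1 GB
    return round(weight_gb * overhead_factor, 1)

# A 7B model at 4-bit quantisation fits easily on a 24 GB RTX 3090/4090:
print(estimate_vram_gb(7, 4))    # ≈ 4.2 GB
# A 70B model at 4-bit exceeds any single consumer GPU:
print(estimate_vram_gb(70, 4))   # ≈ 42.0 GB
```

By this estimate, the 24 GB of an RTX 3090 or 4090 comfortably runs quantised models up to roughly the 30B-parameter class, which is why that tier of card keeps appearing in local-LLM hardware advice.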
A high-end laptop such as a MacBook Pro M4 Max could be used if we need portability. Alternatively, a desktop machine similar to the latest gaming computers can meet the requirement at lower cost. On a desktop we also have the option of installing Linux (Ubuntu LTS) to provide a stable and efficient operating system well suited to running large language models.
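On an Ubuntu machine, one straightforward route to a working local LLM is Ollama, whose published install script and model commands are sketched below; the choice of Ollama and of the `llama3` model are illustrative assumptions, not the only option:

```shell
# One possible setup on Ubuntu LTS, using Ollama as the local LLM runtime.
# Review any install script before piping it to a shell.
curl -fsSL https://ollama.com/install.sh | sh

# Download a model (llama3 is just an example; pick one sized to your GPU).
ollama pull llama3

# Chat with it entirely on-device -- no data leaves the machine.
ollama run llama3
```

Other runtimes such as llama.cpp or LM Studio follow a similar pattern: install, download a quantised model, then run it locally.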
An AI running locally could be accessed over an internal network, or remotely through a VPN for secure access over the internet. In either case, we would apply additional security measures appropriate to the sensitivity of our information.
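As one concrete sketch of secure remote access, an SSH tunnel can forward the LLM's local API port without exposing it to the internet. The hostname `llm-host` and user `me` are placeholders, and the port and endpoint assume an Ollama server with its default settings:

```shell
# From a remote machine: forward Ollama's default port (11434) over SSH,
# so the API is only reachable through the encrypted tunnel.
ssh -N -L 11434:localhost:11434 me@llm-host &

# The local model now answers on localhost as if it were running here.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

A VPN such as WireGuard achieves the same end for whole-network access; the tunnel approach simply keeps the exposed surface to a single forwarded port.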
This deep-dive conversation produced by NotebookLM provides a helpful overview of the technical issues involved in running an LLM locally.