$120 Raspberry Pi 5 Can Run 14-Billion-Parameter LLM Models … Slowly
It is possible to load and run 14-billion-parameter LLM models on a Raspberry Pi 5 with 16 GB of memory ($120). However, they are slow, generating about 0.6 tokens per second; a 13-billion-parameter model runs at 1.36 tokens per second. Improved firmware with better SDRAM timing improved these results.
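As a sketch of how such a test might be reproduced, one common approach is to run a quantized model locally with Ollama on the Pi's 64-bit OS. The specific model tag (`deepseek-r1:14b`) is an assumption for illustration; the article does not specify which 14B model or runtime was used, and any 14B model at 4-bit quantization needs roughly 9-10 GB of RAM, hence the 16 GB board.

```shell
# Hypothetical reproduction sketch, NOT the article's exact setup.
# Install Ollama on the Raspberry Pi 5 (64-bit Raspberry Pi OS assumed):
curl -fsSL https://ollama.com/install.sh | sh

# Pull a quantized 14B model (model tag is an illustrative assumption;
# a 4-bit quant of a 14B model occupies roughly 9-10 GB of RAM):
ollama pull deepseek-r1:14b

# Run a prompt; --verbose prints timing stats including tokens/second,
# which is how the ~0.6 tok/s figure could be measured:
ollama run deepseek-r1:14b --verbose "Explain SDRAM timing in one paragraph."
```

On CPU-only hardware like the Pi 5, generation speed is dominated by memory bandwidth, which is why the firmware's improved SDRAM timings translate directly into higher tokens per second.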
Link :
https://www.nextbigfuture.com/2025/01/120-raspberry-pi5-can-run-14-billion-parameter-llm-models-slowly.html