Quoting an earlier post:

"The answer is me... the smallest model that gave useful coding answers... My brain outdoes all the Artificial Idiot coding from what I see in examples online. The model would have to be big and specific to something I have not practiced. Can you run a big model on a cluster of Raspberry Pi computers?"

I've not tried it, but there's Distributed Llama, which can run on a cluster of Raspberry Pis.
https://github.com/b4rtaz/distributed-llama
Here are some performance metrics for the 8-billion-parameter distillation of DeepSeek-R1.
https://github.com/b4rtaz/distributed-l ... ssions/162
The scaling looks good, but I don't know how the overall speed compares to an x86 computer.
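As a rough sanity check on the "big model on a cluster" question, here is a back-of-envelope memory-fit sketch. The quantization size (~0.5 bytes per parameter for 4-bit weights) and the 20% overhead factor are my assumptions, not figures from the thread or the repo; actual requirements depend on the quantization scheme and context length.

```python
# Back-of-envelope: can an N-node Raspberry Pi cluster hold a model's weights?
# Assumptions (mine, not from the thread): ~0.5 bytes/parameter at 4-bit
# quantization, plus ~20% overhead for KV cache and working buffers.

def fits_on_cluster(params_billions, nodes, ram_per_node_gb,
                    bytes_per_param=0.5, overhead=1.2):
    """Return True if the quantized weights (plus overhead) fit in the
    cluster's combined RAM, assuming an even split across nodes."""
    total_gb = params_billions * bytes_per_param * overhead
    return total_gb <= nodes * ram_per_node_gb

# The 8B DeepSeek-R1 distillation on four 8 GB Pis: 8 * 0.5 * 1.2 = 4.8 GB
print(fits_on_cluster(8, 4, 8))    # True
# A 70B model on the same four Pis: 42 GB needed vs 32 GB available
print(fits_on_cluster(70, 4, 8))   # False
```

This only checks capacity, not speed; as the scaling discussion linked above suggests, throughput also depends on how much inter-node traffic the tensor split generates over the Pis' network links.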
Statistics: Posted by ejolson — Mon Feb 17, 2025 6:03 am