Raspberry Pi has received an AI power-up.
It's FOSS (itsfoss@mastodon.social), Thursday, 20-Jun-2024 13:50:19 JST:
Nazo (nazokiyoubinbou@mastodon.social), Thursday, 20-Jun-2024 14:00:00 JST:
@itsfoss I imagine that even with an NPU, using an RPi (even a 5) for LLM tasks is going to be pretty severely limited. RAM is of course one of the biggest constraints, and note that the advertised "TOPS" figure means tera-operations per second, a raw compute number, not tokens per second, so it says little about real token throughput; prompt processing alone is going to take ages.
Can't imagine running anything much heavier than a 3B model on there, and even that would likely be limited. 3B isn't very good either. (You need at least 7B to get decent output, and even that's pushing it.)
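The RAM concern above can be put into rough numbers. A minimal back-of-envelope sketch, assuming weights dominate memory use (real runtimes like llama.cpp add KV-cache and runtime overhead on top, so these are lower bounds):

```python
# Rough estimate of RAM needed just to hold quantized model weights.
# Lower bound only: an actual inference runtime also needs memory
# for the KV cache, activations, and its own overhead.

def weight_ram_gib(params_billion: float, bits_per_weight: float) -> float:
    """Approximate GiB required for the weights alone."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

if __name__ == "__main__":
    for size in (3, 7, 13):
        for bits in (4, 8):
            print(f"{size}B @ {bits}-bit ≈ {weight_ram_gib(size, bits):.1f} GiB")
```

By this estimate a 4-bit 7B model needs roughly 3.3 GiB for weights, so it fits in an 8 GB Pi 5's RAM; the bottleneck is compute and memory bandwidth, which is why prompt processing is the slow part.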