✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 11:47:12 JST: It's going to be an AMD GPU, but I checked the compatibility and luckily ROCm supports everything I need it to
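(A minimal sketch of the kind of compatibility check meant here, assuming a ROCm build of PyTorch is installed; the version strings below are illustrative:)

    # Verify that a ROCm build of PyTorch can actually see the GPU.
    # Assumes a ROCm wheel of PyTorch is installed (its version string carries a "+rocm" suffix).
    import torch

    print(torch.__version__)          # e.g. "2.0.1+rocm5.4.2" on a ROCm build
    print(torch.version.hip)          # HIP version string on ROCm builds, None on CUDA builds
    print(torch.cuda.is_available())  # ROCm is exposed through PyTorch's CUDA API surface
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))

-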
✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 11:47:47 JST: Just a slightly annoying setup in exchange for much more VRAM, which seems like a reasonable tradeoff. Also Wayland :DDD
-
✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 12:46:32 JST: @Biendeo@m.biendeo.com isn't Intel's AI stuff even less well supported?
-
Biendeo (biendeo@m.biendeo.com)'s status on Monday, 21-Aug-2023 12:46:36 JST: Would an A770 16GB work in your case? They’re silly cheap here in Australia and the only pitfall is whether it’s compatible with the games/software you’re running.
-
✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 12:52:49 JST: @Biendeo@m.biendeo.com Intel seems more committed to working on AI stuff for consumer GPUs, but at the same time AMD is tried and tested
-
✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 12:58:34 JST: @Biendeo@m.biendeo.com that's really good for the price, but I think I'll wait until the next time I get a new GPU
-
Biendeo (biendeo@m.biendeo.com)'s status on Monday, 21-Aug-2023 12:58:35 JST: Fair call, I was seeing recent articles about their gains in Stable Diffusion. I’m not sure how widespread support for OpenVINO is compared to ROCm though, and I’ve got no personal experience with either. https://www.tomshardware.com/news/stable-diffusion-for-intel-optimizations
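(For context, the OpenVINO path discussed in that article is typically reached through Hugging Face's optimum-intel package; a minimal sketch, assuming that package and the stock SD 1.5 checkpoint purely as an example:)

    # Stable Diffusion via OpenVINO on Intel hardware, using optimum-intel.
    # Assumes: pip install "optimum[openvino]" diffusers
    from optimum.intel import OVStableDiffusionPipeline

    model_id = "runwayml/stable-diffusion-v1-5"  # example checkpoint, not from the thread
    pipe = OVStableDiffusionPipeline.from_pretrained(model_id, export=True)  # convert to OpenVINO IR
    image = pipe("a photo of an astronaut riding a horse").images[0]
    image.save("astronaut.png")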
-
✡️ ((fuggy)) (fuggy@skippers-bin.com)'s status on Monday, 21-Aug-2023 19:09:15 JST: @Biendeo@m.biendeo.com actually I'm just going to get two of these and figure out the cooling later; with that I should even be able to run LLaMA 2 70B locally
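(Rough arithmetic behind the 70B claim; the bit widths below are illustrative assumptions, not figures from the thread:)

    # Back-of-the-envelope VRAM needed just for the LLaMA 2 70B weights at various bit widths.
    params = 70e9
    for bits in (16, 8, 4):
        gib = params * bits / 8 / 2**30
        print(f"{bits}-bit weights: ~{gib:.0f} GiB")
    # Prints roughly 130 GiB (16-bit), 65 GiB (8-bit), 33 GiB (4-bit), before KV cache
    # and activations, so the two cards' combined VRAM has to sit comfortably above the
    # weight size of whichever quantization is used.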
-