-
OHNONONONONONO THE AI-BROS ARE STEALING OUR LAST SOULFUL KINO STYLE
-
@crunklord420 I toyed with the idea of making a CBoyardee/PilotRedSun style lora, but ultimately I know next to nothing about the whole training process, plus I lack the hardware most of the time.
-
@mint I managed to buy the bottom of the market when I got a used 3090. They went up like 25% after the 4090 sanctions.
I'm seriously considering buying a 5090 at launch assuming it has 32GB+ VRAM. 40GB would definitely be a buy. It bums me out so much the 40 series held back the VRAM a generation. 4090 should have 32GB, 5090 should be 40GB, but it'll probably just be 32GB.
The point is, waiting has never resulted in anything good. You may as well just buy the start of the cycle.
-
@crunklord420 @mint I've got the 24GB 3090 and don't care to upgrade right now. It would be nice to have more context in an LLM, and I wouldn't mind 40GB of VRAM, but it's not worth paying money for IMHO. It does images fine.