floofloof@lemmy.ca to Technology@lemmy.world · English · 2 days ago
Motherboard sales are now collapsing amid unprecedented shortages fueled by AI (www.tomshardware.com)
boonhet@sopuli.xyz · 1 day ago
Yup, you want memory accessible to the GPU for local AI. AMD Strix Point and Mac devices are popular options. A CPU can run LLMs, but very slowly. I've got 32 GB of RAM and 8 GB of VRAM, and it's borderline useless for models that don't fit in the VRAM.
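For anyone wondering whether a given model will fit in their VRAM, here's a back-of-envelope sketch. The helper names and the fixed 1.5 GB overhead for KV cache and activations are my own assumptions (real overhead grows with context length), but the core arithmetic — parameter count times bits per weight, divided by 8 to get bytes — is the standard rough estimate:

```python
def model_vram_gb(params_billion: float, bits_per_weight: float,
                  overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate for a quantized LLM.

    Weights: params (in billions) * bits per weight / 8 gives GB of weights.
    overhead_gb is an assumed flat budget for KV cache and activations;
    real usage varies with context length and batch size.
    """
    weight_gb = params_billion * bits_per_weight / 8
    return weight_gb + overhead_gb


def fits_in_vram(params_billion: float, bits_per_weight: float,
                 vram_gb: float) -> bool:
    """True if the estimated footprint fits in the given VRAM."""
    return model_vram_gb(params_billion, bits_per_weight) <= vram_gb


# A 7B model at 4-bit quantization: 7 * 4 / 8 + 1.5 = 5.0 GB -> fits in 8 GB.
print(fits_in_vram(7, 4, 8))
# A 13B model at 8-bit: 13 + 1.5 = 14.5 GB -> spills out of 8 GB, so layers
# get offloaded to system RAM and inference slows to a crawl.
print(fits_in_vram(13, 8, 8))
```

This is why 8 GB of VRAM feels borderline: 7B-class models at 4-bit fit comfortably, but anything much bigger spills into system RAM.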