
The RTX 5090 is about as good as it gets for home use. Its inference speeds are extremely fast.

The limiting factor is the 5090's 32GB of VRAM, and Nvidia intentionally makes getting past that barrier extremely painful: they want companies to buy their $20,000 GPUs to run inference for larger models.
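As a back-of-the-envelope sketch of why 32GB is a hard ceiling (the function and numbers below are my own illustration, not from the comment), the weights alone of a 70B-parameter model at 4-bit quantization already overflow the card:

```python
# Rough VRAM estimate for model weights only.
# Real usage is higher: KV cache and activations add overhead on top.
def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1024**3

# 70B parameters at 4 bits per weight:
print(round(weight_vram_gb(70, 4), 1), "GB")  # ~32.6 GB, just over the 5090's 32 GB
```

A 32B-class model at 4 bits (~15 GB of weights) fits comfortably, which is roughly where single-5090 home setups end up.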


