christianqchung on Sept 8, 2024 | on: Serving AI from the Basement – 192GB of VRAM Setup
Pretty sure it'll work where any 70b model would, but it's probably not noticeably better than Llama 3.1 70b if the reports I'm reading now are correct.[1]
[1]
https://x.com/JJitsev/status/1832758733866222011