Not really, given that it doesn't increase the amount of RAM compared to the old 4080 Super. If you want to do 'modern' AI on a (relative) budget you should be looking at a 4090 or 5090. This seems to be the card targeted most squarely at gamers.
I heard Nvidia is gimping consumer-grade cards so they aren't good at LLM training. Is this true? And if so, are they gimped only for training, or also for running LLMs?
I'd guess the limited amount of VRAM is also a way of holding the cards back.
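To see why VRAM is the natural lever here, a rough back-of-envelope sketch: just holding a model's weights in fp16 takes about 2 bytes per parameter, before you even count KV cache, activations, or training-time optimizer state. The function and the example model sizes below are illustrative assumptions, not anything from the thread.

```python
# Back-of-envelope VRAM estimate for holding LLM weights alone.
# Ignores KV cache, activations, and optimizer state, which can
# multiply the requirement several times over during training.

def weights_vram_gb(num_params_billion: float, bytes_per_param: int = 2) -> float:
    """GB needed just for the weights (fp16 default: 2 bytes/param)."""
    # billions of params * bytes per param = gigabytes
    return num_params_billion * bytes_per_param

# A 7B model in fp16 needs ~14 GB -- already close to the 16 GB on a
# 4080 Super -- while a 70B model needs ~140 GB, far beyond any single
# consumer card.
print(weights_vram_gb(7))   # 14.0
print(weights_vram_gb(70))  # 140.0
```

So even modest extra VRAM on a consumer SKU meaningfully changes which models it can run, which is exactly the segmentation pressure being described.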
Many Nvidia "gaming" SKUs are already at the point where memory is the most likely bottleneck even for gaming, and adding more would make them noticeably better products for the consumer at a small cost increase.
So I'd say there's good evidence that something beyond cost and value to the gaming use case explains why they don't offer higher-memory SKUs, and avoiding cannibalization of "professional"-priced AI SKUs is the obvious candidate.
I doubt anyone outside Nvidia knows for sure, but it's a pretty strong indication.