So is spam if you only need a summarized version of the financial difficulties of princes in Nigeria. Or the kinds of people doctors can’t stand.
Your jumping-off point is a cliff into a pile of leaves. It looks correct and comfy, but it will hurt your butt if you take it for granted. You’re telling people to jump and saying “it’ll get better eventually, just keep jumping and ignore the pain!”
Nope, spam is specifically unwanted. Also, I'm saying "jump in the leaves, it's fun if you don't try leaping in from a mile up" and you're saying "NO LEAVES KILL PEOPLE ALL THE TIME" lol.
What if I don’t want to use LLMs, but people keep telling me it’s fine and I should? Isn’t that spam?
What if my reason isn’t “I like typing code” but instead “we don’t need more Googles doing Google things and abusing privacy”?
Then, to me personally, the whole thing is spam.
Whatever the reasons for and against LLMs may be, they don’t change the fact that the primary use case for generated content has been to scam and spam people.
That scam can be as simple as getting people to pour their private personal info into an LLM, or it can be ads or a generated lie. Regardless, it’s unwanted by a lot of people. And the history of that tech, and the attempts to normalize it, are founded in spam techniques.
Even the gratuitous search for AGI is spammed at us as these companies take taxpayer money and build out infrastructure that’s actually available to 0% of the public for use.
Like, I mentally discredit anyone who cites ChatGPT as a source.
Unfortunately, due to the generation/verification ratio, spam and misinformation are indeed the lowest-hanging fruit. It's so much easier to generate LLM output than to verify it, which is probably why Google held back the Transformer architecture.