Many said (some still say) that Wikipedia is not a replacement for traditional encyclopedias with articles written by domain experts, such as Britannica.
Many more scoffed at that, saying those people were just stuck in their old ways and unable to adjust to the obviously superior new thing.
Is that you? AI applications are different from Wikipedia and are better in some ways: Coverage is much greater - you can get a detailed article on almost any topic. And if you have questions after reading a Wikipedia article, Wikipedia can't help you; the AI software can answer them. Also, it's a bit easier to find the information you want.
Personally, I'm with the first group, at the top of this comment. And now truth, accuracy, and epistemology, and public interest in those things, take another major hit in the post-truth era.
Right, and where are all those LLMs without the billions upon billions of lines of text written by humans? A not insignificant number coming from Wikipedia?
Also, LLMs don't produce truth. They don't have a concept of it. Or lies, for that matter. If you are using LLMs to study something you know nothing about, the information provided by them is as good as useless if you don't verify it with external sources written by a person. Wikipedia isn't perfect, nothing is, but I trust their model a shitload more than an LLM.
> where are all those LLMs without the billions upon billions of lines of text written by humans? A not insignificant number coming from Wikipedia?
Where is Wikipedia without all the learning and information from other sources, many of which it put out of business?
> Also, LLMs don't produce truth. They don't have a concept of it. Or lies, for that matter. If you are using LLMs to study something you know nothing about, the information provided by them is as good as useless if you don't verify it with external sources written by a person. Wikipedia isn't perfect, nothing is, but I trust their model a shitload more than an LLM.
Wikipedia produces consensus that correlates with truth to some degree. LLMs produce statistical output, which in a way is an automated consensus of the LLM's input, and that also correlates with truth to some degree - and the correlation is hardly zero.
I agree that information has no value if you don't know its accuracy; it's always a sticking point for me. IMHO Wikipedia has the same problem: I have no idea how accurate it is without verifying it with an external source (and when I've done that, I've often been disappointed).
Has anyone researched the relative accuracy of Wikipedia and LLMs?
The comment about Wikipedia supposedly putting companies out of business is so goofy I'm not even gonna comment on it. I'm surprised you'd bother trying to make a point there.
The difference is humans have a concept of truth; humans have intent. A person, taking an aggregation of their research, expertise, and experience to produce an article, is (presumably) trying to produce something factual. Other humans then come along, with similar intent, and verify it. Studies in the past have shown Wikipedia's accuracy rate is roughly on par with traditional encyclopedias, and more importantly, sources are clearly documented, making validation and further research fairly easy. And if something isn't sourced, I know immediately it's more suspect.
LLMs have no concept of truth; they have no "intent". They just slap words down based on statistics. It is admittedly very impressive how good they are at doing that, but they don't produce truth in any meaningful way; it's more a byproduct. On top of that, all their sources get smashed together, making it much more difficult to verify the validity of any given claim. They're also unpredictable, so the exact same prompt could produce truth one time and a hallucination another (a situation I have run into when it comes to engineering tasks). And worst of all, not only will an LLM be wrong, but it will be confidently and persuasively wrong.
> The comment about Wikipedia supposedly putting companies out of business is so goofy I'm not even gonna comment on it.
I've learned that when people don't have any merits to argue, they turn to ridicule. Right back at you buddy.
> The difference is humans have a concept of truth, humans have intent. A person, taking an aggregation of their research, expertise, and experience to produce an article is (presumably) trying to produce something factual.
It's pretty naive to think that humans have intent and motivation for the truth and nothing else. Just look around you in the world - most communication disregards the truth either carelessly or incidentally (because people are motivated to believe or claim something else) or intentionally (lots of that).
> LLMs have no concept of truth, they have no "intent".
My calculator app has no intent or concept of truth, but outputs truth pretty reliably.
To think that I'm saying all humans intend to produce truth you'd have to intentionally misread my comment. Wikipedia obviously has to deal with bad actors and vandalism, and they have processes in place for that. My point is that the intent matters.
Calculators aren't a useful analogy for LLMs. They produce a deterministic output based on a (relatively) narrow range of inputs. The calculations to produce those outputs follow very rigid and well defined rules.
LLMs by their very nature are non-deterministic, and the inputs/outputs are far more complicated.
It's not an insult when it's true. I don't think you've made one comment that actually added something useful. I did my best to reply to what was there, but you didn't give me anything to work with. Your last comment was so unrelated there was nowhere left to go.
If you have something actually relevant to say, you're welcome to say it.
Here's an opportunity to talk about listening, epistemology, and human intercourse:
> It's not an insult when it's true.
It's not slander, but it's certainly an insult. If you tell someone they are fat and ugly, it's an insult regardless of its truth and you shouldn't say it, ever. There's never a good reason for personal insults.
> it's true
> you're welcome to
This assumes your perspective is the truth. That is the case for nobody in the world; in fact, I also have a perspective that I'm confident in, as do many others. Your statements also assume that, perhaps as the arbiter of truth, you have some authorization or power to enforce it. Again, that's nobody's business.
We're in a world of peers, generally speaking, and none of us know who is right. We need strategies to navigate that world, not the one where truth is given to you.
> you didn't give me anything to work with
When I feel like you do, it's a signal I need to listen better - the other person probably does have something to say and I'm missing it. It's possible we're talking past each other, but that's never a reason for insults.
(human intercourse)
Note that the signal is that I need to do something, not the other person. That's not because I'm 'wrong' or 'right' - those are mostly unknowable and irrelevant because 1) We're in a world of peers, generally speaking, and none of us know who is right. Also, 2) I'm the only one I can control and am responsible for, and ...
3) Respecting other people is always more important. That's a strategy for, and wisdom in, a world of uncertainty (as described), as opposed to a world of certainty. Also, it's a strategy for social creatures in social groups - it keeps groups strong and functioning. Finally, it's strategy for both loving and respecting yourself - you deserve it. You're better than insults, I'm sure; and I sometimes say the wrong thing, but I'm better than that too.
> Where is Wikipedia without all the learning and information from other sources, many of which it put out of business?
Which businesses did Wikipedia put out of business? You will frequently see a 5k word article used for a couple of sentences in a Wikipedia page, with the entire Wikipedia page itself being smaller than one paper it cites for one small corner of said page. When I’m researching events, I frequently go to Wikipedia to find sources as search engines have a drastically larger recentism bias.
> Has anyone researched the relative accuracy of Wikipedia and LLMs
No comparative research on this specific topic has been conducted afaik, and most comparative research is aging (likely to Wikipedia's own detriment; the general consensus is that Wikipedia's reliability has increased over time). However, at the time those studies were published, the consensus seems to be that Wikipedia is generally only slightly less reliable than peers in a given field (i.e., textbooks or Britannica), although Wikipedia is often less in depth. The most frequently cited study is a 2005 comparison in Nature, which found 4 major errors in both Wikipedia and Britannica, and 130 minor errors in Britannica versus 160 in Wikipedia. All studies are documented on Wikipedia itself; see [[Reliability of Wikipedia]]. LLMs… do not have this same reputation.
> Which businesses did Wikipedia put out of business?
Just as a start, other sources of reference, including encyclopedias, dictionaries, websites, etc. For example, I'm sure it impacts McGraw-Hill's AccessScience, which likely you've never heard of.
> This is documented on Wikipedia itself
Maybe there's a little bias there? Would Wikipedia accept Wikipedia's analysis of its own reliability as a valid source?
I've heard that claim, but having no knowledge of the accuracy of any particular article, it's not worth very much to me.
> LLMs… do not have this same reputation.
They don't with you, but many people obviously use them that way. Also, reputation does not correlate strongly with reality.
> Just as a start, other sources of reference, including encyclopedias, dictionaries
This just seems like healthy competition. I thought we were talking about a situation where Wikipedia’s use of other encyclopedias is an instrument of their demise.
> Maybe there's a little bias there
Paradoxically, I suspect you’d be pleasantly surprised about how tough this article is on itself. A lot of attention is given to bias in this case.
> Would Wikipedia accept Wikipedia's analysis of its own reliability as a valid source?
First, it is not Wikipedia’s own analysis. Editors should not present their own conclusions from research, just what each paper says. See [[WP:SYNTH]]. Second, generally Wikipedia discourages anyone citing it as it is not a stable source of information. Much better is to use the sources the article itself conveniently cites inline. As a general policy citing any encyclopedia is discouraged.
> having no knowledge of the accuracy of any particular article, it's not worth very much to me.
Wikipedia does have internal metrics grading the quality of an article; see [[WP:ASSESS]]. In general though, even entirely discounting the Wikipedia component of the Britannica comparison, based on Britannica's own failures it seems wise to verify each and every claim in an encyclopedia, which Wikipedia does an excellent job of helping you do.
> They don't with you, but many people obviously use them that way. Also, reputation does not correlate strongly with reality
OpenAI's own benchmarks show much higher hallucination rates than any study on Wikipedia. Wikipedia itself is quite close to a ban on LLMs for reliability issues. If you ask literally any layman "has ChatGPT ever been wrong for you?" they will say yes, either in that moment or after only a little prompting. It is much harder to elicit such a response regarding Wikipedia, in my experience.
Correct. The amount of Wikipedia pages at any given moment with active vandalism is vanishingly small. The only time I have ever stumbled upon vandalism is as part of my work as a volunteer there actively looking for such cases. Looking at my feed of possibly problematic changes at the moment, about 3 entries are appearing per minute with the most recent revert being just 2 entries ago. It is significantly worse while school is in session in my experience, but vandalism very rarely lasts long. Talking to people, people frequently confess to vandalising wikipedia at one point or another. When I ask them "how long did it survive" they tell me answers ranging between "a few moments" to "5 minutes." So to answer your question, I believe it is unlikely the average person has seen vandalism on the site barring those looking at their own shit.
>> Just as a start, other sources of reference, including encyclopedias, dictionaries
> This just seems like healthy competition. I thought we were talking about a situation where Wikipedia’s use of other encyclopedias is an instrument of their demise.
Somewhere above, someone complained that LLMs were harming Wikipedia, a source of its information. My point is that Wikipedia did the same to others.
> > Which businesses did Wikipedia put out of business?
> Just as a start, other sources of reference, including encyclopedias, dictionaries, websites, etc. For example, I'm sure it impacts McGraw-Hill's AccessScience, which likely you've never heard of.
Your “for example” in response to a question about what businesses Wikipedia put out of business is a business that is...still in business?
This is pedantic, but I think it's important. We can say that a truth and a lie are exclusive: when we lie, we know the truth; we just make up something else because it fits our agenda better. Post-truth implies that the distinction doesn't matter anymore. Nothing is true, everything is true. No statement can be evaluated outside the intent in which it's delivered. That's a much more substantial shift than just presenting a set of lies as the truth. A post-truth society is almost compelled to devalue science or any other pursuit of knowledge. By making all the voices selfish and mundane, it explicitly rejects beauty and accomplishment.
Whenever I can, I try to think of modern events from the point of view of archaeologists digging through the layers, at some point in the distant future.
Given that perspective, my thought was: "Hey Bob, look at these morons, they called easily proven lies 'post-truth!' Can you believe that? In a civilization based on science, with AI, nuclear, and biological weapons?! No wonder they died out right after this. How did they not see this coming? Anyway, what's for lunch?"
It indicates that it's a follow-on to postmodernism. To a significant degree the post-truth era is built on a reactionary attack on postmodernism - you can see it on HN, where many people reflexively attack like a mob anything they perceive is postmodern. You can see it in so many people who will accept lies and disaster over postmodernism.
And post-truth is a postmodern term - ironic, ridiculing, makes you think, has some energy to it. How absurd to be literally *post-truth*.
> era of lies
That's a post-postmodern term. No irony or wit; a term of despair. :)
>It indicates that it's a follow-on to postmodernism.
How? it's just postmodernism itself.
There is no truth because everything is relative. There is no singular, objective truth, facts are intrinsically bound to their context, hence post-truth.
> How? it's just postmodernism itself. / There is no truth because everything is relative. There is no singular, objective truth, facts are intrinsically bound to their context, hence post-truth.
That's not postmodernism but a caricature of it by its critics.
Postmodernism cares deeply about truth. It is highly skeptical of power, bias, perception, etc. and provides tools to mitigate these risks to truth.
Post-truth cares very little about truth and is especially non-skeptical, imho.
> Postmodernism cares deeply about truth. It is highly skeptical of power, bias, perception, etc. and provides tools to mitigate these risks to truth.
I think someone told you this and you believed it, but the track record of postmodernists in academia tells a different story.
I once saw a history lecture from a dean of the history department who had written a book on WWII in the Philippines. This is one of the most heavily studied and well documented periods in human history. There are dozens of books and hundreds of papers on this topic.
My father-in-law, who was born there and lived through the war there, read the book. He called it historical fiction. The book cherry-picked facts badly, ignored events which countered the narrative of the book and in general made every effort to promote an entirely counter-factual narrative of events. In addition, the author had no real expertise in military or naval warfare and didn't seem to understand even the basics about how wars are fought.
When things like this happen on a regular basis, it is hard to say that postmodernism cares about objective truth in any way. In fact, they seem to actively dislike reality. This isn't a caricature, this is from the horse's mouth. If you want people to have a different perception of postmodernism, make events like this have some sort of penalty, because he's still the dean many years later.
> I think someone told you this and you believed it
Obviously you don't care about the truth but just fabricate bullshit about other people that's convenient to you. I know the topic well; you clearly don't.
Your father-in-law is one source; he no doubt has his experience and memory, which are important to him. If you look at the sources of a scholarly history book, you'll see thousands of people - that's how history is done, by researching primary sources like your father-in-law. Of course not everyone will agree; people will have widely varying perspectives (which is something that postmodernism helps you understand and deal with).
These cheap shots at me, scholarship, and postmodernism - which has its flaws, but not this stuff - are an insult to everyone's intelligence, including yours.
It's hilarious to read American history and read in every book how everyone who had something the US wanted, or went counter to US interests accidentally happened to have been evil in some way so the US could conveniently show up out of selfless desire just to right the wrongs and end up with the geopolitical result that it wanted in the first place.
People with decades long scholarly careers write this shit (some even having the highest credentials), and people eat it up.
History as written by (especially US) historians is just racist boomer fanfiction (pushed as propaganda to enforce the national myth) for your own country. This is especially true of the US.
And their errors are not subtle by any means. When they write something so wrong that every man on the street with even cursory familiarity with the subject would reject it as not even wrong - so far from the truth - it's clear to the observer that the one making these claims doesn't have even basic familiarity with the subject.
You've countered this argument with an appeal to authority (his word against thousands of researchers) - how fortunate was a person like Galileo, who could just make people look into a telescope and show his numerous and highly distinguished opponents that they were wrong. Unfortunately, no such thing can exist for history. The next best thing I could recommend to US scholars is to have their work reviewed by top and highly respected local scholars for obvious errors - not for biases in the overarching narrative, but for basic shit and continuity errors that the common man on the street would laugh at.
How could this be? Do I believe Americans to be especially dumb? Just like Big Tobacco pushing studies on the health benefits of smoking for pregnant mothers, and Coca-Cola delegitimizing the view that sugar is bad, US historians have a vested interest in propping up US imperialism - or are acting as reactionaries saying everything the US (or white people) has ever done was purely evil. What you have is a partisan shouting match (also called activism), which is the exact opposite of scholarly work.
Caricature? Foucault talked about "regimes of truth", determining that context and society are what determine truth. Baudrillard straight up said "The simulacrum is true", putting it over objective fact. Derrida's whole deal was about how language constructs, rather than describes, reality.
If all of this sounds ridiculous, that's another matter; if I actually wanted to cast shade upon them, then gee, I'd just quote their stances on sex with minors.
First, I mean no personal insults. Let's discuss ideas.
As someone who grew up being de-programmed from Soviet propaganda by my parents every time I came home, starting from pre-school, I cannot even begin to communicate how allergic I am to this discourse. "There is no truth" is some grade-A bullshit to me. What's next? Maybe Stalin wasn't Hitler's military ally to start WWII? Maybe we live in a simulation? What's the point of anything? 1+1=3!
I could just be dumb, but my theoretical view from 30,000 feet, or 30,000 years in the future can be read here:
But “era of lies” doesn’t sound nice because nobody wants to be a liar… so “post-truth” sounds better: “I'm telling the truth. Almost. But I'm not lying.”
The inevitable AI investment bubble popping is going to be a big deal, but I believe that the larger bubble of political lies popping is much more interesting to think about.
What is that going to look like? How does one hedge against that eventuality?
Interesting reference. I had not heard of him, or that book from 2013. 10B is now projected to be our peak, and at 7B we have so much spare food that we burn it as automotive fuel. And now... we realize that population "collapse" is much more likely to happen than endless growth.
Obviously I have not read the book, but do you think it holds up in 2025?
Many books on the subject are very much "doomsayers" or "preppers" style.
The Roman Empire has been crumbling for 400 years, so it's likely that we won't experience the collapse of society as described in most history books either - life is too short for that. Unless a black swan comes along…
To answer your question, I like to come back to the book because it's written in the style of Dan Brown :) - short, punchy chapters. And it still makes sense (to me).
Worldwide population peaked sometime between 2015 and 2020 at somewhere between 7.2 and 7.4B. So my take on the book is that it isn't very good.
Also, biofuels based on most crops other than sugar cane, in addition to not being very helpful in the fight against AGW, triggered large price spikes and political turmoil in a dozen different countries at once. Perhaps you heard of this event, we call it the Arab Spring. We are still dealing with the fallout of it.
Someone made a similar argument to me recently about AI, talking about how programmers used to have stacks of reference books at their desks and how many of the old guard had to be given training to convince them to use Google as a reference.
I guess this argument was supposed to convince me to stop being such a luddite and accept the inevitable future, but really, in an increasingly post-truth world, it made me want to go and get myself a stack of reference books.