whimsicalism | 12 months ago | on: How has DeepSeek improved the Transformer architec...
Hallucinations decrease with scale and reasoning; the model just gets better and stops making stuff up.
littlestymaar | 12 months ago
o1 still hallucinates badly though.
Jerrrry | 12 months ago
False: facts only need to be seen once, and one mis-step in reasoning and your CoT is derailed.
whimsicalism | 12 months ago
> one mis-step in reasoning and your CoT is derailed.
tell me you've never seen reasoning traces without telling me