Hacker News | new | past | comments | ask | show | jobs | submit | d3m0t3p's comments

Same, Firefox iOS


The model is fine-tuned for chat behavior, so the style might be due to:
- fine-tuning
- more stylised text in the corpus; English evolved a lot in the last century.


Diverged as well as standardized. I did some research into "out of pocket" and how it differs in meaning between UK English (paying from one's own funds) and American English (uncontactable), and I recall 1908 being the current thinking as to when the divergence happened: the 1908 short story by O. Henry titled "Buried Treasure."


Is that really the only thing you managed to remember?


Because the ML ecosystem is more mature on the Nvidia side. Software-wise, the CUDA platform is more advanced, and it will be hard for AMD to catch up. It is good to see competition though.


But the article shows that the Nvidia ecosystem isn't that mature either on the DGX Spark with ARM64. I wonder if Nvidia is still ahead for such use cases, all things considered.


On the DGX Spark, yes. On ARM64, Nvidia has been shipping drivers for years now. The rest of the Linux ecosystem is going to be the problem, most distros and projects don't have anywhere near the incentive Nvidia does to treat ARM like a first-class citizen.


In my own studies, software engineering was mostly about structuring code and coding patterns such as Visitor, Singleton, etc., i.e. how to create a maintainable codebase.


My software engineering course was about the software development life cycle, different business methodologies like agile and waterfall, and working in a group.

It was very helpful. I would have appreciated “how to create a maintainable codebase” as well though. “Singleton” was not a part of my vocabulary until 3 years into my career :/


> “Singleton” was not a part of my vocabulary until 3 years into my career :/

If you are a more old-school style programmer, you simply use the older term "global variable". :-)
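The resemblance is easy to demonstrate; a minimal Python sketch (the `Config` class is purely illustrative):

```python
class Config:
    """Classic singleton: every instantiation yields the same shared object."""
    _instance = None

    def __new__(cls):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Config()
b = Config()
assert a is b          # one shared instance...

CONFIG = Config()      # ...which behaves much like this old-school global
```

Either way you end up with a single mutable object visible from everywhere, which is exactly why the pattern inherits most of the global variable's downsides.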


Looking back, I wish it had never been necessary to memorize all those design patterns just to get work done! All OOP has been is a huge distraction, and mostly BS. This is me looking back across 30 years of work, so don't just downvote because you love OOP; try thinking about what I'm really saying here. OOP was, to me, an enormous bend in the river that eventually got pinched off and became a horseshoe lake, destined to dry up and become just a scar on the software engineering landscape. It feels like it was all a big waste of time and someone's money-making scheme, tbh.


Would you have some literature on that?


There's a ton, but it's pretty scattered. Yurii Nesterov is a big name, for example.


This sounds a lot like what the Muon / Shampoo optimizers do.
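As I understand it, Muon's core step orthogonalizes the momentum-smoothed update with a Newton-Schulz iteration before applying it (Shampoo gets a related effect via preconditioning). A rough numpy sketch of the classic cubic Newton-Schulz iteration; Muon itself uses a tuned quintic variant, and the function name and step count here are my own illustration:

```python
import numpy as np

def newton_schulz_orthogonalize(G, steps=30):
    """Approximate the nearest orthogonal matrix to G (its polar factor)."""
    # Normalize so all singular values fall in the convergence region (0, sqrt(3)).
    X = G / (np.linalg.norm(G) + 1e-7)
    for _ in range(steps):
        # Cubic iteration: pushes every singular value toward 1.
        X = 1.5 * X - 0.5 * X @ X.T @ X
    return X

rng = np.random.default_rng(0)
G = rng.standard_normal((4, 4))          # stand-in for a gradient/momentum matrix
O = newton_schulz_orthogonalize(G)       # O @ O.T is approximately the identity
```

The appeal over an exact SVD is that the iteration is just matmuls, so it runs fast on GPUs and in low precision.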


Interesting to see that they enforce retroactive opt-out for data collection. I wonder how they do that: what if the model has already been trained on your data by the time you opt out?


You can batch only if you have distinct chats running in parallel.
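That is, independent requests can share one forward pass, while the tokens of a single chat are sequential. A toy numpy sketch, where a single matmul stands in for a real model's decode step (`decode_step` and the sizes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))   # stand-in for the model's weights

def decode_step(hidden):
    # hidden: (batch, 64). One matmul advances every chat in the batch by one token.
    return hidden @ W

# 20 distinct chats, each with its own state, advance together in one call:
chats = rng.standard_normal((20, 64))
batched = decode_step(chats)

# Same result as running each chat alone, but with far better hardware utilization:
assert all(np.allclose(batched[i], decode_step(chats[i:i+1])[0]) for i in range(20))
```

Within a single chat, step t+1 needs step t's output token, so one conversation cannot be parallelized this way.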


> > if I want to run 20 concurrent processes, assuming I need 1k tokens/second throughput (on each)


Nice to see a master's thesis highlighted on the research group's page.

