Hacker News | netbioserror's comments

Precisely what I was going to say. As domain specificity increases, LLM output quality rapidly decreases.

This is neat and I wish C3 well. But using Nim has shown me the light on maybe the most important innovation I've seen in a native-compiled systems language: Everything, even heap-allocated data, having value semantics by default.

In Nim, strings and seqs exist on the heap, but are managed by simple value-semantic wrappers on the stack, where the pointer's lifetime is easy to statically analyze. Moves and destroys can be automatic by default. All string ops return string, there are no special derivative types. Seq ops return seq, there are no special derivative types. Do you pay the price of the occasional copy? Yes. But there are opt-in trapdoors to allocate RC- or manually-managed strings and seqs. Otherwise, the default mode of interacting with heap data is an absolute breeze.

For the life of me, I don't know why other languages haven't leaned harder into such a transformative feature.


NOTE: I'm a fan of value semantics, mostly devil's advocate here.

Those implicit copies have downsides that make them a bad fit in many situations.

Swift doesn't enforce value semantics, but most types in the standard library follow them (even dictionaries and such), and those types go out of their way to use copy-on-write to avoid unnecessary copying as much as possible. Even with that optimization there are too many implicit copies! (It could be argued copy-on-write makes it worse, since it's harder to predict when the copies happen.)

Implicit copies of very large data structures are almost always unwanted, effectively a bug, and having the compiler check for them (as in Rust, or a C++ type without a copy constructor) helps detect those bugs. It's not all that dissimilar from null checking: null checks require lots of extra annoying machinery, but they prevent so many bugs that they're worth doing.

So you have to have a plan on how to avoid unnecessary copying. "Move-only" types is one way, but then the question is which types do you make move-only? Copying a small vector is usually fine, but a huge one probably not. You have to make the decision for each heap-allocated type if you want it move-only or implicitly copyable (with the caveats above) which is not trivial. You can also add "view" types like slices, but now you need to worry about tracking lifetimes.

For these new C-alternative languages, implicit heap copies are a big no-no. They have very few implicit calls: there are no destructors, and allocators are explicit. Implicit copies could be supported with a default temp allocator that follows a stack discipline, but then you are imposing a specific structure on the temp allocator.

It's not something that can just be added to any language.


And so the size of your data structures matters. I'm processing lots of data frames, but each represents a few dozen kilobytes, and in the worst case a large composite of data might add up to a couple dozen megabytes. It's running on a server with tons of processing power and memory to spare. I could force my worst-case copying scenario in parallel on every core, and our bottleneck would still be the database hits before it all starts.

It's a tradeoff I'm more than willing to accept if it means the processing semantics are basically straight out of the textbook, with no extra memory-semantic noise. That textbook clarity is worth more to my company's business than saving the server a couple hundred milliseconds on a 1-second process that doesn't have the request volume to justify the savings.


It's not just the size of the data but also the number of copies. Consider a huge tree structure: even if each node is small, doing an individual malloc-style allocation for each of millions of nodes would cause a huge performance hit.

Obviously for your use case it's not a problem but other use cases are a different story. Games in particular are very sensitive to performance spikes. Even a naive tracing GC would do better than hitting such an implicit copy every few frames.


Great! We can start trading with each other in stock notes. I can't wait to buy my next round of groceries with a 0.1 SPY note! The ones I don't use will pay dividends!

you're joking, but this is basically what it means to be retired...

If only there were a digital asset with a fixed supply to prevent inflation, where nobody could control who spends what (to avoid unjust debanking), which was highly divisible so you could spend large or small amounts, and which, being digital, could be sent very rapidly across long distances using the magic of the Internet. And if only its governance model weren't subject to the corruption seen in governments and private banks alike.

I bet that thing would be a pretty useful monetary tool, even if it were attacked, as one might expect, by all of the governments and banks around the world trying to cling to the power they hold by virtue of having captured the ability to print money and use it when it is most valuable, fresh off the press.


Positive downstream effect: the way software is built will need to be rethought and improved to wring efficiency out of stagnating hardware. Think of how staggering the step from the start of a console generation to the end used to be. Native-compiled languages have made bounding leaps that might be worth pursuing again.

Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.

E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.

If that happens, more and more of their functionality will come to rely on low datacenter latencies, making use on desktops less viable.

Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from growing an even larger dependency graph?


I’d feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.

This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.

The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.

I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.


> companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can't afford tech at inflated prices

These big companies are competing with each other, and they're willing and able to spend much more for compute/RAM than we are.

> I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.

A few ideas:

* Use/develop/optimise local tooling

* Pool resources with friends/communities towards shared compute.

I hope prices drop before dev tools all move to the cloud.

It's not all bad news: as tooling and builds move to the cloud, they'll become available to those who have so far been unable or unwilling to pay for a fast computer that sits mostly idle.

This is a loss of autonomy for those who were able to afford such machines though.


Some Soviet humor will help you understand the true course of events:

A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”


I have some hope for transpiling to become more commonplace. What would happen if you could write in Python, but trivially transpile to C++ and back?

You've described Nim, Chicken, and Jank. Partially what I meant by "leaps made by native-compiled languages".

Proton is a single build target, and it's just the Windows build target.

Valve maintains a 'Steam Runtime', which is similar to a Docker container, to ensure it's easy to develop games that run on many distributions.

Exactly, this argument wasn’t a good one 10 years ago and it definitely isn’t one now.

The problem is kernel-level cheats; you can't defend against those from pure userland.

Soon: the problem is DMA-based cheats; you can't defend against those from the kernel.

Oh, those are already here. It's why Battlefield requires Secure Boot turned on: so it can use the IOMMU to (sort of) protect the game.

I was going to say. This is pretty easily achievable with a Typst template and script, which can even parse YAML.


Obvious is good. Optimization can come later. Cleverness is for when you are out of options.

The programming landscape 30+ years ago, with its severely constrained resources, strongly biased our idea of "good software" in favor of cleverness. I think we can say we know better now, having myself been responsible for picking up someone else's clever code.


> severely constrained resources

Energy is a resource. Mobile computing devices demonstrate this constraint already. I predict that what is old will become new again.


Do we? I feel the layers of abstraction are quite extensive now. They are anything but simple.


(Good) Abstraction is there to hide complexity. I don't think it's controversial to say that software has become extremely complex. You need to support more spoken languages, more backends, more complex devices, etc.


The most complex thing to support is people's resumes. If carpenters were incentivized like software devs are, we'd quickly start seeing multi-story garden sheds in reinforced concrete, because every carpenter's dream job at Bunkers Inc. pays 10x more.


"You don't get your best performances by trying harder" is just another way of saying that our talents come so naturally that they don't feel like work.

Does that mean that if you're trying, you're fighting a losing uphill battle against something you'll never excel at? I think many skills are learned and must be earned with discipline. But the culture places excessive weight on excelling in specific fields that most people simply can't brute-force. Hence the prevalence of chemical assistance at the highest ends of productivity, intellectual competition, and athletics.

We probably need to place more emphasis on doing things that come naturally to us. Emphasis on doing. But also enjoy downtime and not-doing occasionally.


I treat extensions like they're all capable of privileged local code execution. My selection is very vetted and very small.


The only extensions I have installed are Dark Reader and uBlock Origin. It would be nice if I could disable auto-updating for them somehow and run locally pinned versions...


Get the source code and manually pack your own unsigned web extensions.


Add-ons Manager -> (click the add-on in question) -> change "Allow automatic updates" to "Off"

(for firefox/derivatives anyways...)


Same here, uBlock Origin and EFF's Privacy Badger are the only extensions I trust enough to install.


Ditto, plus 1Password / Bitwarden.


So then what use is any other approach than simply letting it happen? Words are just that. If violence is out, then the only other approach is escalating the trade war and Chinese isolation, at great cost.


If your only plan is invading China, you don't have a plan.


Precisely, and I'm saying there is no other good plan. If the cost of defending democratic values worldwide is starting WW3, we simply can't defend democratic values worldwide anymore.

