Hacker News | amonith's comments

So only if you're directly embedding the thing. That's a somewhat rare use case, and it shouldn't be banned almost anywhere...


The thing is, if you want AI output to be heavily directed, which is probably the case here, I can imagine that thousands of random takes had to be made to make the damn thing follow the director's imagination. If you don't care too much about the output you can make these very quickly, yeah.


Same but with 1 kid and different websites (including HN, which is equally bad!). Actively fighting it, though. Slowly removing all social media accounts; now I just need to figure out how to block stuff permanently on my phone. On desktop I did it by changing my hosts file to point everything to 127.0.0.1. I need to figure out how to do this on mobile too, without an additional network device that would disrupt things for my wife.
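For reference, the desktop hosts-file approach is just a few extra lines in the system hosts file (/etc/hosts on Linux/macOS, C:\Windows\System32\drivers\etc\hosts on Windows). The hostnames below are only examples, not a list from the comment:

```
# Example hosts-file blocklist: resolve distracting sites to localhost
127.0.0.1  news.ycombinator.com
127.0.0.1  reddit.com
127.0.0.1  www.reddit.com
```

One caveat: browsers cache DNS, so a restart (or flushing the DNS cache) may be needed before the block takes effect.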


I'd also add "there are almost no developers using it on the job market" to the list of reasons why some technologies are no longer fit for purpose. It's a major one. It's sort of tied to the ecosystem (no devs means not many things get maintained or created).


I do think that holds more water than just "It's old".

However, pretty much any dev I would hire can get to grips with an older technology pretty quickly. Where it does get dicey is when a good dev just refuses to work with it. When devs hold that opinion, I think it typically means one of those other reasons is behind their refusal.


>one of those other reasons is behind their refusal.

I mean, one of those reasons, "when I leave this job, will I be able to get another one?", is a huge, deeply life-affecting one.

If you want me to work on COBOL from 1988, then you've limited my job prospects to a very few employers in the country at a very specific pay range. If I instead tell you to eat a fat one and go with $language_du_jour, the number of potential employers and the salary range are much, much larger.


Why does working in one technology prevent you from getting a job in another one?


Haven't you seen job offers where X years of experience in XYZ is a must? That's most of them. I've never gotten a job without it, actually. You've got to get that experience somewhere.

I know devs like to say they would hire anyone, but they're not the ones hiring. At best you get to interview people already prefiltered by HR, which looks for keywords in CVs.


My personal anecdotal experience is considerably different. I've worked multiple places where I had to learn the stack on the job. Up to and including the language at least once.

I've never found it too difficult to get hired even when the requirements don't list something I've done already.


You’ve been lucky then, or have excellent credentials, or some combination of the two. Your anecdote doesn’t invalidate their point about it being a valid reason a dev would want to gain experience in a current and modern technology over an archaic one.


At least in the current market there is a lot less leeway for this


That invisible hand exists and has always existed: it's the market. Nobody will arrest you, but the enjoyable work simply slowly disappears. Unless we're talking hobby scenarios, but nobody cares about that.


> That invisible hand exists and had always existed, it's the market.

I have never heard a client say "Man, glad you used React". Literally nobody cares what framework you use to build your site. Nobody.

If you didn't know any better you'd think all software developers are chained in a basement where they have absolutely no power to do anything but build React sites.


You sound like a freelancer or something. Every single company I interviewed with in the last couple of years as a full stack dev *required* experience in React/Vue/Angular 2+. With old-school JS/HTML/CSS you wouldn't even pass CV screening. The best you could get with that is some WordPress gig for peanuts.


This comment is pretty off-base.

1.) Those WordPress gigs can make your React gig look like indentured servitude

2.) The company you’re applying to isn’t the client.

3.) “freelancer or something”, like you’re spitting it out? Yes - some of us aren’t handcuffed to mediocrity by 200-1000 person orgs. As the kids say, “Don’t hate.”


I'm not hating, might be a language barrier, I'm from Poland, sorry.

1. Definitely not in my country. The average pay of a WordPress/PHP dev is half that of a modern full-stack dev, and the clients are terrible, because it's just websites for small businesses. Modern full-stack devs mostly don't create websites, but highly interactive B2B apps.

2. It absolutely is my client. I optimize for their happiness, not their customers'. I have no relationship with the customers; some don't even know who I am.

3. I worked as a contractor for a couple of years and I'm not missing the stress and unstable pay. Especially now with a kid on board. Many contracts were actually "hey we need a React/[insert other tech] guy for our current project, wanna join?", not "we have an idea and we don't know how to do it" kind of thing. The latter are super rare and even more stressful, because they come from "non-technical startup founders" often with little money.

Keep in mind that I'm in EU, so the benefits of permanent employment make a huge difference.


Random thing spotted in the article:

> "Wish there was a windows laptop I could buy that is good"

What does that even mean? There are Macs, there are Chromebooks, and there are just laptops. Wth is a "Windows laptop"? Is there a good "Linux laptop"?

Just a nit :P


I use it for autocomplete, e.g. ./f<tab> and enter. If I don't, the terminal literally hangs for a split second and gives me a lot of useless suggestions. I rarely type full words.


I think by default after fresh install it suggests the "old" layout akin to Office 2000, but you can just select "tabbed ribbon" and then it really isn't half bad.


You know we are living in crazy times when people actually actively ask for the ribbon interface instead of making fun of Microsoft for it. It's one of the worst things ever conceived in UI design.


Both have their issues, but having 50 uncategorized icons (I just looked up a default LibreOffice UI screenshot and counted...) is something only a power user can love. They can keep their classic UI as an option.

A categorized ribbon is an improvement for most people. Especially for newer generations, who simply can't benefit from shared conventions with other software anyway.


I just looked up the difference and I don't really feel a strong pull towards either style? Why are you so anti ribbon?


The default layout is similar to Office 2016.


Not the parent commenter, but why would you assume that he meant LLMs specifically? I'm one of the "tech people not interested in AI" and I mean everything around AI/ML. I just like writing OG code man. I like architecture, algorithms, writing "feeling good" code. Like carpenters who just like to work with wood I like to work with code.


Yes, same feeling about ML really. Whether you are working with classic ML or LLMs, it's all about trial and error without predictable results, which just feels like sloppy (pun unintended) engineering by programmers' standards.


But this just doesn't correspond to reality. Most interesting algorithms in optimization etc. are metaheuristics, since precise solutions are either proven to be unobtainable or we simply haven't found them yet. In the meantime, we get excellent results with "close-enough" solutions. Yes, the pedantic part of my soul may suffer, and we will always strive towards better and better solutions, but I think we accepted over a century ago that approximate solutions are extremely valuable.
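As a toy illustration of the "close-enough" point (my own sketch, not anything from the thread): stochastic hill climbing offers no optimality proof, yet lands near the optimum of a simple objective almost immediately.

```python
import random

def hill_climb(f, x0, step=0.1, iters=2000, seed=0):
    """Minimal stochastic hill climbing: accept any random move that
    improves f. Returns an approximate minimizer: 'close enough',
    with no guarantee of optimality."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)  # random neighbor
        fc = f(cand)
        if fc < fx:  # greedy acceptance
            x, fx = cand, fc
    return x, fx

# Toy objective with a known minimum at x = 3 (value 0)
best_x, best_f = hill_climb(lambda x: (x - 3) ** 2, x0=0.0)
```

Metaheuristics like this (and their smarter cousins: simulated annealing, genetic algorithms) trade proofs for practicality, which is exactly the trade-off described above.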


I see my instructions for the LLM as still being code, just in human language rather than a particular programming language. I still have to specify the algorithm, and I still have to be specific: the fuzzier my instructions, the more likely it is that I end up having to correct the LLM afterwards.

There is so much framework stuff. When I started coding I could mostly concentrate on the algorithm; now I have to do so much framework stuff that telling the LLM really only the actual algorithm, minus all the overhead, feels like much more "programming" than today's programming with the many, many layers of "stuff" layered around what I actually want to do.

I find it a bit ironic, though, that our tool for escaping excessive complexity is an even more complex tool. Then again, programming in large, longer-running projects already had plenty of elements that reminded me of how evolution works in biology, already leading to hard or even impossible to comprehend systems (like https://news.ycombinator.com/item?id=18442637), so the new direction is not that big of a surprise. We'll end up more like biology and medicine some day, with probabilistic methods and less direct knowledge and understanding of the ever more complex systems, and evolution of those systems based on "survival" (it does what it is supposed to most of the time, we can work around the bugs, there is no way to debug in detail; what doesn't work is thrown away, what passes the tests is released).

Small systems that are truly "engineered" and thought through will remain valuable, but increasingly complex systems go the route shown by these new tools.

I see this development as part of a path towards being able to create and deal with ever more complex systems, not (or only partially) as a replacement for how we create current ones. That AI (and what will develop out of it) can be used to create current systems too is a (for some, or many, nice) side effect, but I see the main benefit in the start of a new method for dealing with ever more complexity.

I only ever see single-person or single-team, short-term experiences of LLM use for development. Obviously, since it is so new. But one important task of the tooling will only partially be to help that one person, or even team, produce something that can be released. Much more important will be the long term, like that decades-long software dev process they ended up with in my link above, where the many developers passing through over time could still extend it and fix issues years later. Right now this is solved in ways that are far from fun, with developers staying on those teams only long enough, or H1Bs who have little choice. If it could be done in a higher-level way, with whatever "AI for software dev" turns into over the next few decades, it could help immensely.


> There is so much framework stuff, when I started coding I could mostly concentrate on the algorithm, now I have to do so much framework stuff, I feel like telling the LLM really only the actual algorithm, minus all the overhead, is much more "programming" than today's programming with the many many layers of "stuff" layered around what I actually want to do.

I was wondering about this a lot. While it's a truism that generalities are always useful whereas the specifics get deprecated with time, I was trying to dig deeper into why certain specifics age quickly whereas others seem to last.

I came up with the following:

* A good design that allows extending or building on top of it (UNIX, Kubernetes, HTML)

* Not being owned by a single company, no matter how big (negative examples: Silverlight, Flash, Delphi)

* Doing one thing, and being excellent at it (HAproxy)

* Just being good at what needs to be done in a given epoch, gaining universality, building ecosystem, and just flowing with it (Apache, Python)

Most things in the JS ecosystem are quite short-lived dead ends, so if I were a frontend engineer I might consider some shortcuts with LLMs, because what's the point of learning something that might not even exist a year from now? OTOH, it would be a bad architectural decision to use stuff that you can't be sure will still be supported 5 years from now, so...


I predict the useful activity of writing LLM boilerplate will have a far shorter shelf-life than the activity of writing code has had.


I don't expect the current specific products and the way you use them to endure. This is the very first iteration of something truly better, and there is still a very long way to go. Let's see what we have twenty years from now; in the meantime the current products clearly still find their customers.

No, I'm talking about core principles.

You just can't go on being incredibly specific. We already tried other approaches; "4th gen" languages were a thing in the 90s already, for example. I think the current, more statistical NN approach is more promising. Completely deterministic computing is harder to scale: either you introduce problems like those seen in my example link over time, or it becomes non-deterministic and horrible to debug, because the bigger the system gets, the more other things dominate.

Again, this won't replace the smaller software we write today; this is for larger, ever longer-lasting and more complex systems, approaching bio-complexity. There is just no way to debug something huge line by line, and the benefits of modularization (and separation of the parts into components that are easier to handle) will be undermined by long-term development following changing goals.

Just look at the difference in complexity between software from a mere forty, or even twenty, years ago and now. Back then most software was very young, and code size was measured in low megabytes. Systems have exploded in size, scale and complexity, and new stuff added over time is less likely to be added cleanly. Stuff will be "hacked on" somehow and released when it passes the tests well enough, just like in my example link (which was about a 1990s DB system), and it will only get worse.

We need very different tools; trying to do this with our current source code and debugging methods is already a nightmare (again, see that link and the work description). We might be better off embracing fuzzier statistical and NN methods. We can still write smaller components in today's more deterministic ways.


One must naturally make assumptions when responding to something that is poorly defined or communicated. That's just how it is. That's an issue for the original poster, not the responder.

The term AI is strongly linked with LLMs/GenAI these days, so it's a quite reasonable assumption.

As for code/architecture/infrastructure, I like those things too. You do have to shape your communications to the audience you are talking to, though. A lot of the products have eliminated the demand for such jobs. It's a false elimination, so there will be an overcorrection later in a whipsaw, but by that time I'll have changed careers because the jobs weren't there. I'm an architect with 10+ years of experience, and I haven't had a single job offer in 2 years, with tens of thousands of submissions in that time.

If there is no economic opportunity, you have to go where the jobs are. When executives play stupid games based on monopoly to drive wages down, they win stupid prizes.

Somewhere around 2 years is the maximum time-frame before you get brain drain in these specialized fields, and when that happens those people stop contributing to the public parts of the sector entirely. They take their expertise and use it for themselves only, because that is the only value it can provide; there's no winning when the economy becomes delusional and divorced from reality.

You have AI eliminating demand for specialized labor that takes at least 5 years of experience to perform competently; AI flooding the communication space with jammed speech (for hiring, through a mechanism similar to RNA interference); and professional certificate providers retiring all benefits and the long-lasting certificates that prove competency, on top of the coordinated layoffs by big tech in the same period. That eliminates the certificate path as a viable option for the competent but not university-accredited.

You've got a dead industry. It's dead, but it doesn't know it yet. Such is the problem with chaotic whipsaws and cascading failures that occur on a lag: by the time the problem is recognized, it will be too late to correct (because of hysteresis).

Such aggregate stupidity in collapsing the labor pool is why there is a silent collapse going on in the industry, and why so many people cannot find work.

The level of work that can be expected in such places now, given such ill will from the industry, is abysmal.

Given such fierce loss and arbitrarily enforced competition, who in their right mind would actually design resilient infrastructure properly, knowing it will chug away for years without issue after they lay you off with no intent towards maintenance (making money all that time)?

A time is fast approaching where you won't find the people competent enough to know how to do the job right, at any price.


> They are just way too distracting, I don't understand why people like them.

Simply put, not all people get distracted so easily... It may be a sign of mild ADHD.

