Who remembers sprite sheets? Does that give my age away?
I did an optimization pass for a client once where I got rid of a ton of the sprites but didn't have the energy to redo it all, so it just had huge sections that were blank.
I’ve encountered a heap of group 2. They’re the ones sick of learning new things, for whatever reason.
I think it's easy to dismiss that group, but the truth is there was a lot of flux in our industry in the last decade before AI, and I would say almost none of it was beneficial in any way whatsoever.
If I had more time I could write an essay arguing that the 2010s in software development were the rise of complexity for complexity's sake: it didn't make solving real-world problems any easier, it often massively increased the cost of software development (and worse, the drudgery), and little was actually achieved.
The thought leaders were big companies who faced problems almost no-one else did, but everyone copied them.
Which led to an unpleasant coding environment where you felt like a hamster spinning in a wheel: constantly having to learn the new hotness just to do what you could already do, or else you were a dinosaur.
Right now I can throw a wireframe at an AI and poof, it's done: React, Angular, or whatever who-gives-a-flying-sock JavaScript framework is next, it's there. Have you switched from webpack to vite to bun? Poof, the AI couldn't care less; I can use whatever stupid acronym command-line tool you've decided is flavour of the month. Need to write some Lovecraftian-inspired yaml document for whatever dumbass deploy hotness is trending this week? The AI has done it, and I didn't have to spend 3 months debugging whatever stupid format some tit at Netflix or Amazon or Google or Meta came up with because they literally had nothing better to do with their life, or bang my head against the wall when it falls over every 3 weeks while management insists k8s is the only way to deploy things.
That in itself feels like second-system syndrome but instead of playing out over a single software project it’s the large-scale version playing out over the entire industry.
When you move to the enterprise layer, suddenly you get the opposite problem: you have a low number of "users", but you often need a load of CPU-intensive or DB-intensive processing to happen quickly.
One company I worked for had their system built by, ummmm, not the greatest engineers and were literally running out of time in the day to run their program.
Every client was scheduled over 24 hours, and they'd got to running the program for 22 hours per day and were desperately trying to fix it before they ran out of "time". They couldn't run it in parallel because part of the selling point of the program was that it amalgamated data from all the clients.
Without seeing more this seems like it could be solved by not recomputing the entire history to add on data. Depends what kind of math you are doing however.
Some sort of checkpoint system could likely save significant IO.
What am I missing that requires you to recompute all data every day?
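Roughly what I mean by a checkpoint (a totally hypothetical sketch, since I haven't seen their code; the file name and state shape are made up): persist a running aggregate and fold in only the new day's data, instead of replaying the whole history.

```python
import json
import os

CHECKPOINT = "daily_totals.json"  # hypothetical checkpoint file

def load_checkpoint():
    # Resume from the last saved state, or start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"processed_days": 0, "running_total": 0.0}

def process_day(new_records, state):
    # Fold today's records into the running aggregate: the work is
    # O(new data), not O(entire history).
    state["running_total"] += sum(new_records)
    state["processed_days"] += 1
    return state

def save_checkpoint(state):
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)
```

Obviously only works if the calculation can be expressed as a fold over incoming data, which is exactly the question.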
It was receiving huge volumes of data from each financial client, and matching it all up to try and find certain things, and matching it with existing historical data. Not ads or online tracking. So the loop was adding this data and recalculating everything. It had to be done sequentially; I can't remember the exact reason, but it was a good one.
I was only there a few months, as they were so dysfunctional I jumped to another job offer I'd received. We were having endless sprint meetings to "plan" this work when all it needed was someone experienced like me refactoring it for a couple of days. There were a lot of junior devs with senior developer titles, as everyone invariably got promoted every year. The funny thing about the sprint cards was that all the tasks I put up a 1 for, the other developers put up a 10 for, and all the ones I put up a 10 for, they put up a 1 for. That's what happens when you let junior devs have a say: no comprehension of what's hard and what isn't.
Before I went I did point out the multiple pretty obvious O(n²) loops they had in the main calculation loop the results of which could easily be cached but I don't know if it went whoosh over their heads.
I'm pretty certain that if they'd just let me get on with it instead of holding up sprint cards in meetings, they'd have been down to doing the whole lot in a half-hour run a day, even in the short time I was there.
In my experience the first run of optimizing something like that usually doesn't take long and has huge benefits.
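To illustrate the caching point (hypothetical toy code, nothing like their actual system): if the per-pair work inside an O(n²) loop is pure, memoising it turns repeated recomputation into lookups.

```python
from functools import lru_cache

# Hypothetical stand-in for an expensive per-pair calculation
# inside an O(n^2) loop; it's pure, so it's safe to cache.
@lru_cache(maxsize=None)
def pair_score(a, b):
    return (a * b) % 97  # imagine something genuinely costly here

def total_score(items):
    # The O(n^2) loop still runs, but normalising the pair order means
    # (x, y) and (y, x) hit the same cache entry, and values repeated
    # across runs cost nothing to recompute.
    return sum(pair_score(min(x, y), max(x, y))
               for x in items for y in items if x != y)
```

Half the calls become cache hits from the symmetry alone; if the same pairs recur day after day, nearly all of them do.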
> It doesn't matter if Bob can be normal. There was no point to him being paid to be on the program.
Yeah, I'm surprised at the number of people who read the article, came away thinking the program was designed to churn deliverables, and then conclude that it doesn't matter if Bob can only function with an AI holding his hand, because he can still deliver.
That isn't the output of the program; the output is an Alice. That's the point of the program. They don't want the results generated by Alice, they want the final Alice.
And then you realize that most of science is unnecessary. As TFA points out, it doesn't matter if the age of the universe is 13.77 or 13.79 billion years. So if you ban AI in science, you just produce more scientists who can solve problems that don't matter. So what?
I don't think it works like that, from my recollection of the uni courses I did 20 years ago.
Even a small advantage like 1% will quickly propagate through a population, because the advantage compounds over 1,000s of generations.
The fact that this disease defence CAN be turned on means some people must at some point have had a genetic mutation that turned it on.
As the GP pointed out, it must therefore be a net negative from an evolutionary standpoint.
I also suspect the cost is caloric consumption, as someone else said, so it might be OK.
However, there are plausible other explanations. For example, there are medical conditions that result from an overly aggressive immune system, and it could instead be reducing the chance of those occurring.
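To put a rough number on that 1% figure (my own toy model using the standard textbook selection update, not anything from the article):

```python
# Toy haploid selection model: carriers of a variant reproduce at rate
# (1 + s) relative to non-carriers. With s = 0.01 (a "1% advantage"),
# the variant's frequency p updates each generation as:
def next_freq(p, s=0.01):
    return p * (1 + s) / (1 + p * s)

def generations_until(p0, target, s=0.01):
    # Count generations for the variant to spread from p0 to target.
    p, gens = p0, 0
    while p < target:
        p = next_freq(p, s)
        gens += 1
    return gens
```

Under these assumptions a variant starting at 1% of the population passes 50% in roughly 460 generations, which on evolutionary timescales really is "quickly".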
The problem is the implication that "if evolution did not do it, there must be a reason", because 1) it makes evolution look like an engineer evaluating trade-offs, which it is not, and 2) it treats the current state of affairs as the final "product", which it is not. For example, flowers did not exist in the Cretaceous, so somebody looking at what evolution had done up to then would say "if evolution did not invent flowers, then we'd better not do it". But of course that's absurd.
Also, as I said, evolution is not a process towards a goal. There are 8 billion people in the world, which shows Homo sapiens is quite fit for its environment, so the pressure to evolve further features is quite low.
I'm really sorry, but you're really misunderstanding how evolution works.
Worth reading something like the Selfish Gene if you want to understand it a bit better.
There are always reproductive pressures and there are always genetic variations.
Modern civilization and medicine have simply changed what the pressures are.
As an example, if a genetic variation occurred tomorrow which gave resistance to spermicide, within 100 generations that variant would probably be quite successful and prevalent in the human population.
I know about reproductive pressure and I've read The Selfish Gene. What you say is correct, but it does not explain the "if evolution did not do it, better not do it" attitude of the original comment, which I think is wrong for many reasons, as I've written.
I would say you are both right. With two competing variables (always-on disease defence vs calorie consumption), and with infectious disease and malnutrition being the main causes of death before procreation in pre-modern times, I would expect some equilibrium to be reached. And we have not had much time to evolve since caloric scarcity became a solved problem for large swaths of the western world.
If in the future we could trade a few hundred extra calories per day for a great immune system (without auto-immune side effects) we would have found a nice cheat code!
Thinking about your point- I bet we do not know if some people have it on or not. It feels like something that would have to be specifically investigated.
Unless you go through the code with a fine-tooth comb, you're not even aware of what trade-offs the AI has made for you.
We've all just seen the Claude Code source code. 4k class files. Weird try/catches. Weird trade-offs. Basic bugs people have been begging to fix left untouched.
Yes, there's a revolution happening. Yes, it makes you more productive.
But stop huffing the kool-aid and be realistic. If you think you're still deciding about the trade-offs, I can tell you with sincerity that you should go try and refactor some of the code you're producing and see what trade-offs the AI is ACTUALLY making.
Until you actually work with the code again, it's ridiculously easy to miss the trade-offs the AI is making while it's churning out its code.
I know this because we've got some heavy AI users on our team who often throw the AI code straight into the repo without properly checking it. Worse, in code review it looks right, but then when something goes wrong, you ask "why did they make that decision?". And then you notice there's a very AI-looking comment next to the code. And it clicks.
They didn't make that decision, they didn't choose between the trade-offs, the AI did.
I've seen weird timezone decisions, weird sorting, insane error-catching theatre, and changes to parts of the code it shouldn't have even looked at, let alone changed. In the FE sphere it's got no clue how to use useEffect or useMemo, it litters every div with tons of unnecessary CSS, and it can't split up code for shit. In the backend world it's insanely bad at following prior art on things like what the primary key field is, what the usual sorting priority is, and how it's supposed to use existing user contexts.
And the number of times it uses archaic code, from versions of the language 5-10 years ago, is really frustrating. At least with TypeScript + C#. With C#, if you see anything that doesn't use file-scoped namespaces or primary constructors, it's a dead giveaway that it was written with AI.
I feel this is the key - three years ago everyone on HN would be able to define "technical debt" and how it was bad and they hated it but had to live with it.
We've now built a machine that produces something that can't even be called "technical debt" anymore, perhaps "technical usury" or something, and we're all supposed to love it.
Most coders know that the support and maintenance of code will far outlast and outweigh the effort required to build it.
It's only 510k LoC; at ~100 lines of code a day, this code base would take 23 engineers a year to write. That's with 220 working days a year, somewhere civilized.
And I'm sure we all know that when working on a greenfield project you can produce a lot more LoC per day than maintaining a legacy one.
Given that vibe code is significantly more verbose, you're probably talking about ~15 engineers' worth of code?
I know that's all silly numbers, but this is just attempting to give people some context here, this isn't a massive code base. I've not read a lot of it, so maybe it's better than the verbose code I see Claude put out sometimes.
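The arithmetic behind my silly numbers, for anyone checking (the 1.5x verbosity factor is my guess, not a measured figure):

```python
loc = 510_000          # reported size of the code base
loc_per_day = 100      # rough sustained output per engineer
days_per_year = 220    # working days a year, somewhere civilized

engineer_years = loc / (loc_per_day * days_per_year)  # ~23

# Assuming vibe-coded output is ~1.5x more verbose than hand-written
# code (my guess), the equivalent hand-written effort shrinks:
equivalent_years = (loc / 1.5) / (loc_per_day * days_per_year)  # ~15
```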
> It's only 510k LoC, at ~100 lines of code a day for a year, this code base would take 23 engineers a year to write.
Correction: a code base of 500kLoC would take 23 engineers a year to write. There is no indication that the functionality needed in a TUI app that does what this app does needs 500kLoC.
The previous poster was making out that if people had written it, the code base would be a mess within a year.
This is a two-pizza-team-sized project, so it's not one where code quality would inevitably spiral out of control due to communication problems.
A single senior architect COULD have kept the code quality under control.
> I did an optimization pass for a client once where I got rid of a ton of the sprites but didn't have the energy to redo it all, so it just had huge sections that were blank.
Super snappy loading afterwards though.