Be Aware of the Makefile Effect
blog.yossarian.net
411 points by thunderbong 2 days ago
"A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system."
– John Gall (1975) Systemantics: How Systems Really Work and How They Fail
It's why I'm always very skeptical of new languages and frameworks. They often look great on a PowerPoint slide, but it's not clear how they'll look on something complex and long-lasting.
They usually pick up warts added for some special case, and that's a sign that there will be infinitely many more.
There's a fine line between "applying experience" and "designing a whole new system around one pet peeve". But it's a crucial distinction.
With that attitude how would the presently accepted languages/frameworks have come about?
Probably slower and with more respect for existing tech.
But hey, now we have npm, so who cares anymore? :-)
Most languages are much older than we think. But early adoption is key to getting to that point where we "trust it". D isn't that much younger than C and its variants, and it's older than C#. But it never quite got the adoption needed to push its development to the level of C#.
C (K&R) : 1972 => 53 years ago
C++ : 1985 => 40 years ago
D : 2001 => 23 years ago
Also, https://www.bell-labs.com/usr/dmr/www/chist.html
So D is 30 years younger than C, so I'd disagree with "isn't that much younger".
D was really a reaction to C++, not C, so it is with C++ that it should be compared. The C like subset of D (BetterC) is much more recent.
I was thinking more of ANSI C. But fair enough. I hope the core point still rings through: these are all languages old enough to drink.
Disrespect is part of progress; respectful humans are liable to be blind to flaws, just as disregard for what has come before is part of youthful creativity.
I can't agree with that take. Criticism is a part of progress. You can be a critic but still be respectful.
Disrespect is simply to belittle and look down upon. I don't see many situations where such an attitude leads to progress.
> "designing a whole new system around one pet peeve"
BAHHAHAH! So…you mean React. If I hear the word hook as if it alone can solve complexity in web dev one more time I’ll…eh, I’ll do nothing actually. But my point still stands. React solves asynchronous event driven behavior well, but that’s all. Everything else in React projects is, well, everything else.
Great quote, commenting just to 'bookmark' it.
I have an alternate theory: about 10% of developers can actually start something from scratch because they truly understand how things work (not that they always do it, but they could if needed). Another 40% can get the daily job done by copying and pasting code from local sources, Stack Overflow, GitHub, or an LLM—while kinda knowing what’s going on. That leaves 50% who don’t really know much beyond a few LeetCode puzzles and have no real grasp of what they’re copying and pasting.
Given that distribution, I’d guess that well over 50% of Makefiles are just random chunks of copied and pasted code that kinda work. If they’re lifted from something that already works, job done—next ticket.
I’m not blaming the tools themselves. Makefiles are well-known and not too verbose for smaller projects. They can be a bad choice for a 10,000-file monster—though I’ve seen some cleanly written Makefiles even for huge projects. Personally, it wouldn’t be my first choice. That said, I like Makefiles and have been using them on and off for at least 30 years.
> That leaves 50% who don’t really know much beyond a few LeetCode puzzles and have no real grasp of what they’re copying and pasting.
Small nuance: I think people often don't know because they don't have the time to figure it out. There are only so many battles you can fight during a day. If I'm a C++ programmer working on a ticket, how many layers of the stack should I know? Should I know what the CPU registers are called? And what should an AI researcher who works entirely in Jupyter know? I completely encourage anyone to learn as much about the tools and the stack as possible, but there is only so much time.
If you spend 80% of your time (and mental energy) applying the knowledge you already have and 20% learning new things, you will very quickly be able to win more battles per day than someone who spends 1% of their time learning new things.
Specifically for the examples at hand:
- at 20%, you will be able to write a Makefile from scratch within the first day of picking up the manual, rather than after two or three weeks if you only invest 1% (a minimal sketch of such a Makefile follows after this list).
- if you don't know what the CPU registers are, the debugger won't be able to tell you why your C++ program dumped core; when it can, you can typically resolve the ticket in a few minutes (because most segfaults are stupid problems that are easy to fix once you see what the problem is, though the memorable ones are much hairier). Without knowing how to use the disassembly in the debugger, you're often stuck debugging by printf or even binary search, incrementally tweaking the program until it stops crashing, incurring a dog-slow C++ build after every tweak. As often as not, a fix thus empirically derived will merely conceal the symptom of the bug, so you end up fixing it two or three times, taking several hours each time.
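For the first bullet, here is a minimal sketch of what "from scratch" might look like for a small C++ project; the file and target names are made up for illustration, not taken from anything in the thread:

    # Hypothetical project: main.cpp and util.cpp build into ./app.
    CXX      ?= g++
    CXXFLAGS ?= -Wall -Wextra -g -O2
    OBJS     := main.o util.o

    app: $(OBJS)
        $(CXX) $(CXXFLAGS) -o $@ $(OBJS)

    %.o: %.cpp
        $(CXX) $(CXXFLAGS) -c -o $@ $<

    .PHONY: clean
    clean:
        rm -f app $(OBJS)

    # Note: recipe lines must begin with a literal tab character.

Nothing in it needs to be cargo-culted from an older project, which is the point.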
Sometimes the source-level debugger works well enough that you can just print out C++-level variable values, but often it doesn't, especially in release builds. And for performance regression tickets, reading disassembly is even more valuable.
(In C#, managed C++, or Python, the story is of course different. Until the Python interpreter is segfaulting.)
How long does it take to learn enough assembly to use the debugger effectively on C and C++ programs? Tens of hours, I think, not hundreds. At 20% you get there after a few dozen day-long debugging sessions, maybe a month or two. At 1% you may take years.
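To make that concrete, this is roughly the kind of session being described. The binary and core file names are invented; the gdb commands themselves are the standard ones:

    $ gdb ./myprog core      # open the crashed binary with its core dump
    (gdb) bt                 # backtrace: which function faulted
    (gdb) frame 0            # select the faulting frame
    (gdb) info registers     # register contents at the moment of the fault
    (gdb) x/i $pc            # the exact instruction that faulted
    (gdb) disassemble        # surrounding machine code, handy in release builds

If the faulting instruction turns out to be dereferencing a register that holds zero, you usually know which pointer was null even when the source-level view is uncooperative.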
What's disturbing is how many programmers never get there. What's wrong with them? I don't understand it.
You make it sound easy, but I think it's hard to know where to invest your learning time. For example, I could put some energy into getting better at shell scripting but realistically I don't write enough of it that it'll stick so for me I don't think it'd be a good use of time.
Perhaps in learning more shell scripting I have a breakthrough and realise I can do lots of things I couldn't before and overnight can do 10% more, but again it's not obvious in advance that this will happen.
I agree. And there's no infallible algorithm. I think there are some good heuristics, though:
- invest more of your time in learning more about the things you are currently finding useful than in things that sound like they could potentially be useful
- invest more of your time in learning skills that have been useful for a long time (C, Make) than in skills of more recent vintage (MobX, Kubernetes), because of the Lindy Effect
- invest more of your time in skills that are broadly applicable (algorithms, software design, Python, JavaScript) rather than narrowly applicable (PARI/GP, Interactive Brokers)
- invest your time in learning to use free software (FreeCAD, Godot, Postgres) rather than proprietary software (SolidWorks, Unity, Oracle), because sooner or later you will lose access to the proprietary stuff.
- be willing to try things that may not turn out to be useful, and give them up if they don't
- spend some time every day thinking about what you've been doing. Pull up a level and put it in perspective
I agree. An additional perspective that I have found useful came from a presentation I saw by one of the pragmatic programmers.
They suggested thinking about investing in skills like financial investments. That is, investments run on a spectrum from low risk, low return to high risk, high return.
Low risk investments will almost always pay out, but the return is usually modest. Their example: C#
High risk investments often fail to return anything, but sometimes will yield large returns. Their example: learning a foreign language.
Some key ideas I took away:
- Diversify.
- Focus on low risk to stay gainfully employed.
- Put some effort into high risk, but keep expectations safe.
- Your mix may vary based on your appetite for risk.
> invest your time in learning to use free software (FreeCAD, Godot, Postgres) rather than proprietary software (SolidWorks, Unity, Oracle), because sooner or later you will lose access to the proprietary stuff.
I think you have a solid point with Postgres vs. Oracle, and I haven't followed game dev in a while, but your FreeCAD recommendation is so far from industry standard that I don't think it's good advice.
If you need to touch CAD design in a professional setting, learn SolidWorks or OnShape. They’re what every MechE I’ve ever worked with knows and uses, and they integrate product lifecycle aspects that FreeCAD does not.
One simple approach is that the second, or at least third, time you deal with something, you invest time to learn it decently well. Then each time you come back to it, go a bit deeper.
This algorithm makes you learn the things you'll need quite well without having to understand and/or predict the future.
If it’s a tool you use every day, it’s worth understanding on a deeper level. I’ve used the shell probably every day in my professional career, and knowing how to script has saved me and my team countless hours of tedious effort with super simple one liners.
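As one illustration of the kind of one-liner meant here (the log file name and its format are hypothetical):

    # count which URLs returned 500s in a web server access log, most frequent first
    awk '$9 == 500 {print $7}' access.log | sort | uniq -c | sort -rn | head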
The other thing that’s worth learning is that if you can find tools that everybody uses regularly, but nobody understands, then try to understand those, you can bring enormous value to your team/org.
That's an insightful comment, but there is a whole universe of programmers who never have to work directly in C/C++ and are productive in safe languages that usually can't segfault. Admittedly we are a little jealous of those elite bitcrashers who unlock the unbridled power of the computer with C++… but yeah, a lot of day jobs pay the bills with C#, JavaScript, or Python, and the people doing them are considered programmers by the rest of the industry.
Yeah, I write most things in Python or JavaScript because it's much more practical.
Both have strong limits for writing complex code. TypeScript is one attempt at an answer, because bad as JavaScript is for large programs, the web forces it on us. I prefer a million lines of C++ to 100k lines of Python, but if 5k lines of Python will do the job, C++ is way too much overhead. (Rust likely plays better than C++ for large problems started from scratch, but most large problems have existing answers, and throwing something else in would be hard.)
This is the 40% that OP mentioned. But there's a proportion of people/engineers who are just clueless and incapable of understanding code. I don't know the proportion, so I can't comment on the 50% number, but they definitely exist.
If you never worked with them, you should count yourself lucky.
We can't really call the field engineering if this is the standard. A fundamental understanding of what one's code actually makes the machine do is necessary to write quality code, regardless of how high up the abstraction stack it sits.
Steam engines predate the understanding of not just the crystalline structure of steel but even the basics of thermodynamics by quite a few decades.
I don't consider that an equal comparison. Obviously an engineer can never be omniscient and know things nobody else knows either. They can, and should, have an understanding of what they work with based on available state of the art, though.
If the steam engine was invented after those discoveries about steel, I would certainly hope it would be factored into the design (and perhaps used to make those early steam engines less prone to exploding).
Yes, and they're far less efficient and require far more maintenance than an equivalent electric or even diesel engine, where equivalent power is even possible.
Steam engines currently power most of the world's electrical grid. The main reason for this is that, completely contrary to what you said, they are more efficient and more reliable than diesel engines. (Electric motors of course are not a heat engine at all and so are not comparable.)
Steam engines used to be very inefficient, in part because the underlying thermodynamic principles were not understood, but also because learning to build safe ones (largely a question of metallurgy) took a long time. Does that mean that designing them before those principles were known was "not engineering"? That seems like obvious nonsense to me.
Steam engines are thoroughly obsolete in the developed world where there are natural gas pipeline networks.
People quit building coal-burning power plants in North America at the same time they quit building nuclear power plants, and for the same reason. The power density difference between gas turbines and steam turbines is enough that the capital cost difference is huge. It would be hard to afford steam turbines if the heat was free.
Granted, people have been building pulverized-coal power plants in places like China, where the more efficient gas plants would have to run on super-expensive LNG. They thought in the 1970s it might be cheaper to gasify coal and burn it in a gas turbine, but it's one of those technologies that "just doesn't work".
Nuclear isn't going to be affordable unless they can perfect something like
https://www.powermag.com/what-are-supercritical-co2-power-cy...
If you count the cost of the steam turbine plus the steam generators plus the civil works to enclose those, nuclear just can't be competitive.
There is some truth in what you say. Though steam engines still power most of the power grid (especially in the "developed world") their capital costs are indeed too high to be economically competitive.
However, there are also some errors.
In 02022 24% of total US electrical power generation capacity was combined-cycle gas turbines (CCGT), https://www.eia.gov/todayinenergy/detail.php?id=54539 which run the exhaust from a gas turbine through a boiler to run a steam turbine, thus increasing the efficiency by 50–60%. So in fact a lot of gas turbines are installed together with a comparable-capacity steam turbine, even today.
Syngas is not a technology that "just doesn't work". It's been in wide use for over two centuries, though its use declined precipitously in the 20th century with the advent of those natural-gas pipeline networks. The efficiency of the process has improved by an order of magnitude since the old gasworks you see the ruins of in many industrial cities. As you say, though, that isn't enough to make IGCC plants economically competitive.
The thing that makes steam engines economically uncompetitive today is renewable energy. Specifically, the precipitous drop in the price of solar power plants, especially PV modules, which are down to €0.10 per peak watt except in the US, about 15% of their cost ten years ago. This combines with rapidly dropping prices for batteries and for power electronics to undercut even the capex of thermal power generation rather badly, even (as you say) if the heat was free, whereas typically the fuel is actually about half the cost. I don't really understand what the prospects are for dramatically cheaper steam turbines, but given that the technology is over a century old, it seems likely that its cost will continue to improve only slowly.
Yeah, and people are talking about renewables as if the storage is free. Or people quote case 17 out of
https://www.eia.gov/analysis/studies/powerplants/capitalcost...
as if 1.5 hours of storage was going to cut it. I've been looking for a detailed analysis of what the generation + storage + transmission costs of a reliable renewable grid is that's less than 20 years old covering a whole year and I haven't seen one yet.