What's with the avalanche of new things to learn - has it always been that bad?

cracauer@

Anybody else feeling that the amount of new things to learn has gotten out of control?

C++ dumps a big bucket of complicated features on you every 3 years. I am not even completely C++20-ized, C++23 is already out and C++26 is next year. I feel that learning C++'s new features well takes more time than there is between editions. Then there is of course all the machine learning. I did some supervised ML, including training, but my understanding of all the new terms being casually thrown around today is frighteningly low. So self-hosting DeepSeek it is. Then there is all the containerization and the orchestration thereof - and FreeBSD is now right in the mix with everything released in the last 12 months. That goes from virtualization, such as qemu internals, all the way to Ansible. Let's not even talk about GPU programming, which is on my wishlist but buried pretty deep. Then there are actually practical things like more DTrace and eBPF, and the kernel security features I have been focusing on. New languages like Rust and Go. That's just my mix, I am sure others have more.

Has it always been that bad? Maybe I have selective memory?
 
Yes. It might be due to all the talk about computers and programming nowadays that brings in a bunch of new people. A lot of these new people might be trying to create new things without regard for the basics and fundamentals.
 
The thing is, there is an avalanche of new things. Knowledge grows exponentially. But you don't have to learn it all. There are more scientific publications in some single years than there were from -inf to 1950; you can't keep up. No way. Learn the things you need to learn. The rest is hobby.

And I stopped following the C++ circus around C++11 or so. I know what I need, wake me when there is more I need to know.
 
Anybody else feeling that the amount of new things to learn has gotten out of control?
Yes, I do. Maybe I'm just getting old?

About 25 years ago, I was a C++ expert. I had a copy of the ANSI standard printed out in a binder, and I was our group's official language lawyer, because I knew where in the binder to find each language issue. I also knew Java, to the point that I put corrections into Java books. By coincidence, I ran into the author of one of the O'Reilly books about Posix (their kid and my kid happened to play in the same high school band), and it turned out our level of knowledge about Posix standards was comparable. So I was pretty much an expert.

Fast forward 25 years. I got hired by a big FAANG company that programs in C++17 (and later C++20). I had spent the intervening time working in an old-fashioned company programming in C++ of the ~1998 level. And I discovered that I didn't even know how to read and understand modern C++ code, much less write it. OK, I spent a few weeks' worth of evenings on training classes, and at least I was able to read it. I even checked out a dedicated textbook about "C++ move semantics" from the library. Yes, such a book exists! It taught me that C++ has become so ridiculously complicated that I do not WISH TO program in it, even during the few months in which I had the required skills. Was the problem that I got old and stupid? Perhaps, but the complexity of C++ has also increased by a ridiculous amount.
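For anyone who, like me back then, has not kept up: here is a minimal, hedged sketch of the kind of thing that book spends hundreds of pages on. The class name Buffer and the function make_buffer are made up purely for illustration.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical resource-owning class, purely for illustration.
struct Buffer {
    std::vector<char> data;

    explicit Buffer(std::size_t n) : data(n) {}

    // Copying duplicates the underlying storage (potentially expensive).
    Buffer(const Buffer&) = default;

    // Moving steals the other object's storage, leaving it empty but valid.
    Buffer(Buffer&& other) noexcept : data(std::move(other.data)) {}
};

Buffer make_buffer() { return Buffer(1 << 20); }  // moved (or elided), never deep-copied

int main() {
    Buffer a = make_buffer();   // no copy, thanks to move semantics / copy elision
    Buffer b = std::move(a);    // explicit move: 'a' is now "valid but unspecified",
                                // one of the subtleties the book spends chapters on
}
```

That the difference between those two lines in main() justifies an entire textbook is exactly the kind of complexity I mean.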

Then I got thrown into a briar patch of modern Java code. I thought I knew Java: I programmed in it intensely from about 1994 to 2000. Turns out the stuff I knew was correct, but completely unidiomatic. For some reason, the common Java style took the "Gang of Four" patterns book and inhaled the whole thing (there's a joke about Bill Clinton in there). While technically the code I had to read and fix was still Java, it was really all about patterns. So now I was faced with forgetting everything I knew about style and re-learning it.

In my opinion, part of the problem is that my knowledge (and to a lesser extent yours, you are much younger) is simply dated and has not been refreshed. That's because the day has only so many hours, and I can't spend a few hours every day on C++ or Java programming just to stay "up to date"; instead I need to do my day job and get paid. But as you point out, part of the problem is that the languages themselves have become much more complex: both the language definition (C++ is the perfect example here) and the style (Java is my example).

What is the cause of this? That's a tough one. Part of it is that C was a really crappy language, which barely worked for systems (kernel) development, but at least it was better than the only alternative at the time: assembly. C++ was a pus-covered band-aid put on top of C to make programming large software projects more rational (that's a dirty joke about Grady Booch, the person who has done more damage to software engineering than any other, perhaps excluding RMS). But it didn't fix the basic problems of C, which are memory management, string management, and memory management (there is an old joke about Alzheimer's in that sentence). As a matter of fact, by allowing us to write more "sensible code" on a weak foundation, it made software development harder: we were now using C++ for million-line projects, but the basic language was designed for 10K-line projects (such as the Unix kernel, or awk, or Postgres). And to deal with those hyper-large systems (remember when Mentor or Cadence tried to compile 10M-line C++ artifacts!), we had to add all that complexity to C++ to make it barely work. And we've been kicking that dead whale down the beach ever since, and every 2-3 years it gets even bigger and deader.
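To make those "basic problems" concrete, here is a small, hedged illustration of the C-level footguns I mean, written as C++ that any compiler still happily accepts; the buffer sizes and names are invented for the example.

```cpp
#include <cstdlib>
#include <cstring>

int main() {
    // String management: nothing stops the classic overflow.
    char name[8];
    std::strcpy(name, "definitely longer than eight bytes");  // undefined behavior

    // Memory management (and memory management): manual lifetimes, easy to get wrong.
    char* p = static_cast<char*>(std::malloc(64));
    std::free(p);
    std::free(p);  // double free: also undefined behavior
    return 0;
}
```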

Then came Java, and the idea of fixing it all. Started out nice. Turned into a huge political/corporate/economic nightmare (remember Scott McNealy suggesting that killing Bill Gates would be a good idea, since the inheritance tax on his riches would pay for the federal deficit). And again, language and libraries and style grew out of control.

Now add to that the problem that the whole concept of an "OS" has failed. The idea behind an OS was to virtualize a real computer and allow multiple users and multiple processes to share it. Several people can log in, and each of them can run several programs at once. I remember using computers where that was NOT possible, so having an OS (like VMS, Unix, or MVS/TSO) was a real step forward. But it turns out we utterly failed at it. Instead of having an OS, we had many. We needed VMs, so we could run multiple different OSes. What happened to the idea of portable software, or "compile once, run anywhere"? And then we discovered that even if everyone wants to run the same OS on the hardware, the version control problem is unsolvable. So we gave up on portable software and version interoperability, and instead went to Docker / Flatpak / Kubernetes / Jails / and so on: every program comes with a complete copy of the OS and all libraries.

You are probably old enough to remember Rob Pike's talk "Systems Software Research is Irrelevant". Not only was he right, we are dancing on the grave of systems in production. In the old days, we took a 100-line C program, compiled it, and ran it. If we were good (and I was!), we wrote it such that it would compile on any POSIX-compliant system. Today we wrap that 100-line C program into a multi-gigabyte MPM, which requires Rabbit and Rapid to build (I'm giving you clues about which environment I worked in), and then we schedule that MPM to be run by Borg (which was designed by 10 PhDs with degrees in econometrics, and figuring out the scheduling algorithm requires one to have that PhD). I thought this was the stupidest thing ever ... until my colleagues taught me that a 10-liner in Python (an interpreted language that doesn't require compilation, nor libraries!) also needs to be turned into a multi-gigabyte MPM and scheduled through Borg to actually run. No, I'm not allowed to do "./quick_test.py" from the command line.

So explain to me again: Why do I need to learn everything about Docker and Kubernetes, to run 100 lines of C or 10 lines of Python?

-----------------------

Sorry about this post. It's an example of "old man yells at clouds".
 
ralphbsz I'm currently working on a project where some folks did embedded realtime Java with enterprise methods. Wouldn't be surprised to find a factory for the ProblemFactory...
 
What is the cause of this? That's a tough one.
One can observe similar things in all tech industries. One may decide whether it's intentional ('conspiracy theory') or just thoughtless - or, as I see it, a mix of both:
Almost all companies are led not by tech people (engineers) but by business people who mostly lack, and not seldom ignore, a basic understanding of core technical principles. And it's not only the large companies that want to push their individual needs into standards. The tenor of every company is also always to gain advantages for itself and disadvantages for its competitors, naturally. They don't want to build a better world (whatever crap they put on their websites), they want to maximize their revenue. Additionally, growing markets need more regulation.
Over time things get larger, even bloated, more complicated, more complex.
At a certain point you need teams of many people to deal with something that looks trivial at first glance: 'wtf do you need three teams of a dozen engineers each, plus consultants, to work on something I could do alone on a weekend?'
Result: small companies or one-man shows at some point cannot keep up anymore. But companies large enough not to care about employing a few dozen more people can.

But to end with a joke (I really enjoyed yours, ralphbsz; the Alzheimer's one was the best 😂):
From all I read, and as far as I understand it, it doesn't matter anymore how complicated programming becomes,
'cause AI does it for you :cool:
 
AI does it for you
I mentioned this a couple of times elsewhere. Yesterday, again, I tried to get AI to solve a programming problem for fun. It provided a wrong answer six times and finally admitted it had to give up.

I asked it a few questions about settings on FreeBSD. It insisted on me using Bash. When I told it Bash wasn't native to FreeBSD, it apologized and rewrote the code, but on another question it continued to use Bash.

A setting on my phone seems to have disappeared or moved. 10 times I asked AI to find it and 10 times it failed before it gave up.
 
About over-complexity: I think it's the result of people's inability to deeply understand the things they are working on; they just add more and more layers on top of other layers, unable to think of the system as a whole, unable to delete overkill or bloated parts.

This is the end state of the black-box approach: we don't care about how things work, and we use them without any mastery. This is not only about computers; all of modern society is affected, everywhere, sadly.
 
About over-complexity: I think it's the result of people's inability to deeply understand the things they are working on; they just add more and more layers on top of other layers, unable to think of the system as a whole, unable to delete overkill or bloated parts.

Yeah. You have to be really careful adding features to programming languages, because you can never delete them later. I will have to give the Golang folks some credit here - by today's standards they display refreshing restraint.

ETA: and that is for something that comes out of Google. There might be hope for humanity.
 
ETA: and that is for something that comes out of Google. There might be hope for humanity.
You know who is behind that project. He is not the everything-and-the-kitchen-sink type.

Reading up on some new languages, I'd sooner learn Go than catch up with C++.

And as always for that topic, a reminder.
 
You know who is behind that project. He is not the everything-and-the-kitchen-sink type.

Reading up on some new languages, I'd sooner learn Go than catch up with C++.

And as always for that topic, a reminder.

There's also compile, link, and startup time, aka the time it takes to go from a 1-line change to actively running it for debugging. If that takes more than 7 seconds (*), your brain context is mostly gone and you have to rebuild major parts of it. A lot of languages violate this now, not to mention build tools.

My take on it:

(*) 7 seconds is a commonly used number, YMMV, but not too much.
 
Touching some files of llvm and rebuilding takes bloody ages. Something is seriously wrong in this universe.
 
Anybody else feeling that the amount of new things to learn has gotten out of control?

C++ dumps a big bucket of complicated features on you every 3 years.
You are generally right about new things. Too many people are now working in the "PC world" and creating new things. But especially for C++ the solution is very simple: ignore it, at least the new versions. I have done that and have no plans to try to use all the new "features" of C++ after C++03. It seems Linus Torvalds thinks the same way (in case I'm nobody whose advice is worth following).
 
We can add... too many people reinventing the wheel continuously. And not only in the computer world, in many other fields as well.
There is a bunch of people/consultants/companies selling smoke in the form of overly complex systems full of new ways to do old things (with the corresponding new language or terms)... or selling solutions to problems they invented or caused themselves.

So... yes, I feel the same way.
 
It depends.

There's an avalanche of content and you must be extremely selective and put hard limits on everything. Social media, Youtube, Netflix, etc.

Ansible is not something you need to master. With just 10% of knowledge you can get 90% of what you want.

So-called "AI" is not that interesting to me. Perhaps the only interesting bit is the security part on how to jailbreak LLM's.

Remember that this is all about having fun. That's why I'll never learn Rust. I'm not a masochist.
 
I understand the feeling, but I think you have to differentiate. First of all, on the theoretical level I don't see that much progress. There are only a few fundamentally new methods in ML, and distributed systems still work the same as they did 20 years ago. What I'm personally missing is some info channel that provides a high-level perspective instead of all the noise.

One big change in the industry is the economics of things. There is a shift to services, much more is built on top of existing things, and you rarely see anything actually replacing the old - DevOps tooling is built to work around all the deficiencies of traditional network protocols, webservers, and operating systems, without improving the underlying technologies. And GPUs are extensions to existing hardware, not entirely new architectures. All this means a growing surface of specialized things, often based on inadequate abstractions, and a lot of complication for you to learn.

And C++ ... has always had too many possibilities. You decide which style and guidelines to follow in your project, but you have to choose and be selective. They are currently trying to tackle all the shortcomings they previously built into the language; let's hope they also find some way to deprecate stuff...
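As a small illustration of the "too many possibilities" point, here is a hedged sketch of several coexisting ways to do the same trivial thing; the function names are invented, and none of the variants is wrong, which is exactly why a project has to pick a style.

```cpp
#include <numeric>
#include <vector>

// C++98 style: explicit iterators.
int sum_cxx98(const std::vector<int>& v) {
    int s = 0;
    for (std::vector<int>::const_iterator it = v.begin(); it != v.end(); ++it)
        s += *it;
    return s;
}

// C++11 style: range-based for loop.
int sum_range_for(const std::vector<int>& v) {
    int s = 0;
    for (int x : v) s += x;
    return s;
}

// "Use the algorithms" style.
int sum_accumulate(const std::vector<int>& v) {
    return std::accumulate(v.begin(), v.end(), 0);
}

// C++20 ranges add yet another spelling, and C++23 adds std::ranges::fold_left.
```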
 
So-called "AI" is not that interesting to me. Perhaps the only interesting bit is the security part on how to jailbreak LLM's.

I really appreciate how you phrase this. Machine learning seems to be so much of what we are really talking about.

For example, searching datasets for star movements, and the various materials science experiments that can be run looking at alloying.

What bugs me is my father poking around at this AI stuff and doing nothing. Spending cycles doing nothing. Words that mean nothing.
Not looking for new compounds to heal the world. At least SETI or Prime95 had laudable goals, regardless of how stupid I found them.

How people spend their time on the computer is up to them. I might say something, though.

My boss is 3D printing a PVC-tubing-tracked old-time steam train replica for his xmas light display for next year.
What a goofy use of a computer.
He has been hitting me up for wireless control opinions.
I told him we could put a battery-powered FreeBSD wireless access point on a Pi with a web interface to power down the loco.
He is planning on charging batteries daily anyway for loco power. I said ditch the windshield wiper motor and use robot motors.
More control and tons of choices.

Of course there is the $35 wireless battery disconnect with fobs, but we have to work a computer in there somehow.
 
I really appreciate how you phrase this. Machine learning seems to be so much of what we are really talking about.

I avoid stochastic tools. I prefer determinism. The same output for the same input unless I explicitly ask for randomness.

That's not to say that LLMs lack use cases. I'll use one some day to detect photos in my backups that need to be rotated, etc. But I wouldn't trust a cloud service with my private data.
 