What's with the avalanche of new things to learn - has it always been that bad?

I'm in complete agreement here.
About the only thing good I can say about them is:
They can make you stop and think about data and the relationships between different objects and types, rather than just about the algorithms.

Go and Rust look interesting, but I need a "real project" to actually work on in order to understand them. Doing tutorials and "hello, world" programs is good enough to learn the fundamentals of how to build the executable, but it doesn't get you into the nitty-gritty stuff.
Try some project that would benefit from concurrency and parallelism and you'll love channels, goroutines and the rest of the primitives in Go.
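If you want a taste before committing to a real project, here's a minimal sketch of a goroutine/channel worker pool; the squaring "work" and the worker count are just illustrative:

Code:
package main

import (
	"fmt"
	"sync"
)

func main() {
	jobs := make(chan int)
	results := make(chan int)
	var wg sync.WaitGroup

	// Three workers, each a goroutine, all pulling from the same channel.
	for w := 0; w < 3; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for n := range jobs { // receive until jobs is closed
				results <- n * n
			}
		}()
	}

	// Close results once every worker is done, so the loop below ends.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Feed the workers, then signal that no more work is coming.
	go func() {
		for n := 1; n <= 5; n++ {
			jobs <- n
		}
		close(jobs)
	}()

	for r := range results {
		fmt.Println(r)
	}
}

Swap the n * n for real work and the same handful of lines spreads it across every core you have.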
 
To answer the original post:

Yeah, it's always been this bad. Just imagine having a handle on AMD vs Intel CPU chipsets back in the 1990s, being able to talk about the differences between the two, and then realizing that Motorola, DEC, Sun SPARC and Texas Instruments were also major players back then. Does anyone remember the bewildering variety of architectures that NetBSD was ported to?

Nowadays, we have the ARM architecture, Snapdragon-based laptops, Exynos chips (from Samsung) in phones, and plenty of variety even if you ignore Apple's M series completely. A wave of new chips and architectures to learn about and lose yourself in! Not to mention you can do a lot of compute tasks straight on the GPU, which is a whole 'nother animal.

The AMD vs Intel CPU war is still on, but with battery life emerging as a priority that competes with computing power and device weight, all against the backdrop of per-device price and architecture flexibility, I'm thinking that in the next 15 years we're gonna have the next wave of new architectures, features, and tradeoffs to learn.

At some point, you gotta ask yourself: do you HAVE to learn all this new stuff, or is it time to get more pragmatic about prioritizing what you want to learn, what you have the time/ability to learn, and how useful that newfound knowledge will be for you?
 
I've been around for RISC and the rest of what you mention. I didn't perceive it as much of a burden; the only slight pain was all the different variants of Unix. A couple of #ifdefs while programming, and different admin methods.
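For comparison, the newer languages mostly moved that trick out of the preprocessor. A minimal sketch of how Go handles the same per-variant split (the FreeBSD-vs-generic branch is just illustrative; build tags like //go:build freebsd do the same selection at compile time, one file per platform):

Code:
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// The moral equivalent of "#ifdef __FreeBSD__ ... #else ... #endif",
	// decided at run time here via the runtime package.
	switch runtime.GOOS {
	case "freebsd":
		fmt.Println("taking the FreeBSD-specific path")
	default:
		fmt.Println("taking the generic Unix path")
	}
}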

Now, on the other hand, you have GPUs, which are programmed in a fundamentally different way. You also have SIMD instructions, which are used in libc and are written in assembly.

At some point, you gotta ask yourself: Do you HAVE to learn all this new stuff, or maybe it's time to get more pragmatic about prioritizing what you want to learn, what you have the time/ability to learn, and how useful will that newfound knowledge be for you?

Much of what I mentioned seems required by the job market, unless you want to be a full-stack programmer (in which case there are a bazillion web frameworks).
 
"Full Stack" means we want a front-end developer, a back-end developer, a web designer and a project manager - and we if we pay you all 1/4 of what the janitor gets, can you do it in 1/4 of the time?
Yeah, but it boils down to "a front-end developer who thinks they can do back-end because Node."
 
And with every other language, when I did try to use it, I always had a voice in the back of my mind saying, "But I can already do this in C!"
This is pretty much the same idea John Carmack expressed when he went on his semi-recent AI coding retreat.

In hindsight, I should have just gone full retro and done everything in ANSI C. I do have plenty of days where, like many older programmers, I think “Maybe C++ isn’t as much of a net positive as we assume...”. There is still much that I like, but it isn’t a hardship for me to build small projects in plain C.

As for the main topic: has it always been this bad? I don't think things have changed; most developers never need to take a serious look at a new language. What we do have, especially from some of the newer / less mature languages, is noise. And *lots* of it. If you just ignore it, you can get back to life and doing stuff :)
 
Just for interest.
https://ellow.io/highest-paying-programming-languages-in-the-us/

There are lots of different versions of this list; ZDNet, Dice, Indeed, TIOBE... take your pick.
Supposedly the highest-paying programming languages in the US in 2024. I think the UK/Europe are likely to be similar.

Briefly:-
1. Solidity (So Solid Crew what??? - showing my ignorance and how desperately uncool I am)
2. erlang
3. clojure
4. scala
5. go (hmm, worth having a look at go then)
6. c#
7. perl (wtf? But, but, python! I thought perl was supposed to be dead...again...)
8. ruby (really? In 2024?)
9. java
10. rust (I suppose some poor bastard has to use it)

Surprise number 0: no python in the list!
Surprise number 1: no C / C++ in the list!!!! WHAT???
Surprise number 2: no SQL in the list!
Surprise number 3: no VB in the list!

Hmm, maybe that last one isn't such a big surprise.
No node.js either.

There are probably some specialised languages used in crypto, finance/hedge funds, etc. that pay very well too, but they're hyper-specialised with a tiny number of actual jobs.
 

Well, first of all those studies are difficult. It might be entirely bogus.

But apart from that, different languages are used for different things in different fields, and fields pay differently. Common Lisp probably showed a very high average after Google bought ITA Software and inherited a large chunk of the world's community of Lisp programmers working on QPX and QRES.

C++ not being in the list is indeed surprising. Just the salaries at high-speed stock trading companies should raise the average quite a bit. Or maybe I have a distorted view of this because I'm too close to New York.

node.js and Python I can see ranking low; there are just too many programmers working on random projects. But Python is the language for feeding AI systems and for the people who prepare the training data, so the earning potential is there - if you happen to know a lot about machine learning. Again, different fields pay differently.
 
Yes, it needs taking with a pinch of salt. I thought it was quite an interesting list though. I hadn't heard of Solidity before. I'm still getting the feeling that Go is worth taking a look at.
 
Yeah, it looks like interesting rumors. Is anyone gonna boast in here that they actually found a job using that info?

Just repeating info you come across while browsing the Internet - yeah, that does need to be taken with a grain of salt. After all, I only learned about Solidity for the first time just now... a quick look told me that while it looks like a fun thing to learn, the salaries look too good to be true. It may well take a lot of time, effort, and computing power to become competent enough that somebody is willing to pay the advertised salary.

It's gonna be more like what Crivens mentioned in post #58... if you ever get there in terms of competence. It may well turn out to be what Java was in 2000-2010, and what virtualization was in 2010-2020. The trick is to be good NOW, be able to sell snake oil using sexy vocabulary, and then become entrenched in the job and impossible to fire. Not that different from the legal framework of how marriage is supposed to work - same end result!
 
Hmmm... my suspicions are raised that someone is trying to publicise Solidity with that article. Maybe cracauer@ is right in thinking it could be bogus. Oh well, back to the drawing board. Some of the other languages listed are interesting though. Another one that comes to mind that would have made that list in days of yore is SAP... perhaps that doesn't count as a pure programming language though.

There was a guy in my old team who left for a new job in the Netherlands doing Clojure programming; he told me they made him a very good offer. It probably depends as much on the particular job and the person as on the programming language used, though.
 
5. go (hmm, worth having a look at go then)
Go is definitely worth learning, even just for personal stuff. The language is only a little more complicated than C to learn, offers garbage collection and fast builds for any platform you'd want, and the built-in linter helps enforce good practices. I wouldn't even consider using C or C++ for from-scratch projects any more unless tiny executables or maximum performance were critical: Go executables are small enough and fast enough for most uses, and its other advantages outweigh that for most general-purpose work. The cool kids all use Rust, though.
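A minimal sketch of those claims; the program is a toy, and the commands in the comments are the standard toolchain invocations:

Code:
// Cross-build for another platform straight from any host, e.g.:
//   GOOS=freebsd GOARCH=arm64 go build
// Run the bundled checker over the module with:
//   go vet ./...
package main

import "fmt"

func main() {
	// The slice is garbage-collected: no malloc/free bookkeeping,
	// which is a big part of what makes small tools pleasant in Go.
	squares := make([]int, 0, 10)
	for i := 1; i <= 10; i++ {
		squares = append(squares, i*i)
	}
	fmt.Println(squares)
}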
 
It's not just the language that needs maturing, but the members of this cult.
ccammack said:
"The cool kids all use Rust though."

I hereby claim copyright to the slogan "I suppose some poor bastard has to use it" (c) in relation to being forced to work with the Rust programming language, under all relevant intellectual property law, notwithstanding prior art hitherto in all world-wide jurisdictions, although I will consider granting permission to use the slogan upon receipt of either a suitably excellent cup of coffee or a suitably vast sum of bitcoin, the payment to be decided by me at the time of granting usage permission...

T-shirts, mugs, mouse mats, keyboards, cricket bats and clubs to beat your PC with and other merch bearing the slogan will be available real soon now... :)
 

It's not just the language that needs maturing, but the members of this cult.
Heh, Hector Martin is that guy with the cartoon-character alter ego, right? Asahi Lina or something. He is very much an exhibitionist. I wonder how many on that mailing list cringe when he opens his mouth.

These Rust "abstractions" (bindings) seem to be absorbing all of the developers' time. It has been over a year already and they are still developing them! Just wait until maintenance needs to happen; they are going to be very fragile.

Hopefully some of the skilled developers from Linux will move over to FreeBSD as an end result :)
 
To kind of reiterate what others have mentioned, because I do think about this often...
There is an influx of young people who are creative, and since all the "necessary stuff" has already been done, they often create "solutions in search of a problem" simply to make a name for themselves. And since the world really isn't smart enough to adopt "what makes the best sense", we end up with 18,000 ways to do what we've already been doing for years... but the new way is "way so cool".

A college prof said to me many, many years ago: all the programs have already been written. Everything now is just a permutation of what's already been done.
 
A lesson I learned far too late in life: everything requires maintenance. Your health, relationships, skill set, living space, environment, vehicle - everything. Without regular upkeep things fall apart; if you are rich enough you can replace some things, but not most.

How is this relevant here? If you want to play in the C++ field, you've got to update your knowledge & skill of C++. If you have to use containers and Kubernetes, you'd better keep those skills up to date. And so on.

Unless you are in a position to change things (& have the necessary energy to keep at it), it is no use complaining. One option is to try to simplify your life - to change the playing field - so that you have to maintain fewer, and hopefully more enjoyable, things. It is true that change is speeding up and there are more and more complicated (and badly designed) systems, so simplifying is harder, but not impossible. An additional issue is that as we get older, at least some of us get less tolerant of newfangled changes that don't add to our quality of life!
 