The book for C++ 1.0 was a lot thinner than the books for the following versions too! I shudder to think how many thousands of pages it must be now...
Try some project that would benefit from concurrency and parallelism and you'll love Channels, Goroutines and the rest of the primitives in Golang.

I'm in complete agreement here.
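For anyone who hasn't tried it yet, here's a minimal sketch of the sort of thing that sells people on channels and goroutines - a toy worker pool. Everything in it (the `jobs`/`results` channel names, the worker count, the squaring "work") is made up purely for illustration:

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	jobs := make(chan int)    // work items to process
	results := make(chan int) // squared results coming back
	const nWorkers = 4        // arbitrary worker count for the sketch

	var wg sync.WaitGroup
	for w := 0; w < nWorkers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs { // each worker drains the shared channel
				results <- j * j
			}
		}()
	}

	// feed ten jobs, then close the channel so the workers can exit
	go func() {
		for i := 1; i <= 10; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	// close results once every worker has finished
	go func() {
		wg.Wait()
		close(results)
	}()

	for r := range results {
		fmt.Println(r)
	}
}
```

The same shape in C or C++ means threads, mutexes and condition variables by hand, which is exactly the contrast people are pointing at.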
About the only good thing I can say about them is:
They can make you stop and think about data and the relationships between different objects and types, rather than just about the algorithms.
Go and Rust look interesting, but for me I need a "real project" to actually work on to understand them. Just doing tutorials and "hello, world" programs is good enough to get the fundamentals of how to build the executable, but doesn't go deep enough to really get into the nitty-gritty stuff.
No. One needs to look past the hype to see what's really going on. Make sure the hype isn't about things you don't need, and that it isn't just an ad for things you are told you want.

Do you HAVE to learn all this new stuff...
To answer the original post:
Yeah, it's always been this bad. Just imagine having a handle on AMD vs Intel CPU chipsets back in the 2010s, and being able to talk about the differences between the two, and then realizing that there's Motorola, DEC, Sun SPARC and Texas Instruments who were also major players back then. Does anyone remember the bewildering variety of architectures that NetBSD was ported to?
Nowadays, we have ARM architecture, Snapdragon-based laptops, Exynos (from Samsung) chips in phones, and plenty of variety even if Apple's M series gets completely ignored. A wave of new chips and architectures to learn about and lose yourself in! Not to mention you can do a lot of the compute tasks straight on the GPU, which is a whole 'nother animal.
The AMD vs Intel CPU war is still on, but with battery life emerging as a priority that competes with computing power and device weight - all on the back of price per device and architecture flexibility - this has me thinking that in the next 15 years we're gonna have the next wave of new architectures, features, and tradeoffs to learn.
At some point, you gotta ask yourself: Do you HAVE to learn all this new stuff, or maybe it's time to get more pragmatic about prioritizing what you want to learn, what you have the time/ability to learn, and how useful that newfound knowledge will be for you?
Thanks for the tip. I'll look into that.

I used to think that, but I ended up not liking how it handled low level data types when used with classes.
Yeah, but it boils down to "a front-end developer that thinks they can do back-end because Node."

"Full Stack" means we want a front-end developer, a back-end developer, a web designer and a project manager - and if we pay you 1/4 of what the janitor gets, can you do it in 1/4 of the time?
This is pretty much the same idea that John Carmack gave when he went on his semi-recent AI coding retreat.

And every other language, when I did try to use it, I always had a voice in the back of my mind saying, "But I can already do this in C!"
In hindsight, I should have just gone full retro and done everything in ANSI C. I do have plenty of days where, like many older programmers, I think “Maybe C++ isn’t as much of a net positive as we assume...”. There is still much that I like, but it isn’t a hardship for me to build small projects in plain C.
So how many are now wondering, "Is Doc John Carmack?"!!!

This is pretty much the same idea that John Carmack gave
I've got a mate who keeps nagging me about how I need to learn memory safety. I tell him to use strncpy and ignore him.

It's not just the language that needs maturing, but the members of this cult.
Just for interest.
Top 10 highest-paying programming languages in the US - ellow.io
https://ellow.io/highest-paying-programming-languages-in-the-us/
There are lots of different versions of this list; zdnet, dice, indeed, tiobe... take your pick.
Supposedly the highest-paying programming languages in the US in 2024.
Briefly:-
1. Solidity (so solid crew what??? - showing my ignorance and how desperately uncool I am)
2. Erlang
3. Clojure
4. Scala
5. Go (hmm, worth having a look at Go then)
6. C#
7. Perl (wtf? But, but, Python! I thought Perl was supposed to be dead...)
8. Ruby (really? In 2024?)
9. Java
10. Rust
Surprise number 0: no python in the list!
Surprise number 1: no C / C++ in the list!!!! WHAT???
Surprise number 2: no SQL in the list!
Surprise number 3: no VB in the list!
Hmm, maybe that last one isn't such a big surprise.
No node.js either.
5. Go (hmm, worth having a look at Go then)

Go is definitely worth learning, even just for personal stuff. The language is only a little more complicated than C to learn, it offers garbage collection and fast builds for any platform you'd want, and the built-in linter helps enforce good practices. I wouldn't even consider using C or C++ for from-scratch projects any more unless tiny executables or maximum performance were critical. Go executables are small enough and fast enough for most uses, and its other advantages outweigh that for most general-purpose work. The cool kids all use Rust though.
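For what it's worth, the "fast builds for any platform" point is mostly about Go's built-in cross-compilation. A rough sketch, assuming you just want one binary per target (the file name and the GOOS/GOARCH values below are only examples):

```go
// hello.go - trivial, just to make the cross-compilation point concrete.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// Prints the OS/architecture the binary was compiled for.
	fmt.Printf("built for %s/%s\n", runtime.GOOS, runtime.GOARCH)
}

// Building for other platforms is just environment variables, no separate
// cross-toolchains needed for pure-Go code:
//   GOOS=linux   GOARCH=arm64 go build hello.go
//   GOOS=windows GOARCH=amd64 go build hello.go
```

The "built-in linter" mentioned above is presumably the standard gofmt/go vet tooling that ships with the toolchain.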
ccammack said: "It's not just the language that needs maturing, but the members of this cult."
Heh, Hector Martin is that guy with the cartoon character alter-ego, right? Asahi Lina or something. He is very much an exhibitionist. I wonder how many on that mailing list cringe when he opens his mouth.
And "what have you programmed with it and which company? Where did you get your experience?"The trick is to be good NOW