I actually find the popularity of macOS an interesting symptom. Macs are hard to fix, the hardware is difficult to procure in bulk for an enterprise, and yet they seem to hold a strong market share even after all this.
They need hardly any fixing. For the most part, the hardware quality is much better than other laptops (exceptions discussed below); the software quality will be discussed below too. Quantity purchasing is trivial, both for education and for enterprise (I've personally bought MacBook Airs by the dozen for education, and my employers buy them by the tens of thousands). Setting up internal upgrade / management servers is trivial, because it is engineered by Apple.
The hardware exceptions are occasional glitches: MacBook Airs had a string of problems with CPU and GPU chips coming loose from the motherboard, and the current generation of MacBook Pros has terrible keyboard reliability problems.
Obviously Apple's targeting of students was a classic ploy to stay relevant in their later lives, and it seems to be working.
I've worked for a company where many people used Macs, but most employees were of an age where they had NO computers in school, and used punched-card mainframes in college.
I did try to get hold of one about 10 years ago (obviously they were old and rare even then!). I am fairly certain that if IBM did another batch (even running the same ancient AIX), they would sell like hotcakes.
Doing another batch would be impossible today; one would have to engineer all the chips from scratch. It would actually be faster to emulate the PowerPC instruction set on an Intel or Arm machine.
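To give a flavor of what "emulating the instruction set" means, here is a toy fetch/decode/execute loop in Python. The field layout and the two opcodes are simplified stand-ins rather than a faithful PowerPC decoder (no sign extension, no extended opcodes), and a real emulator such as QEMU does dynamic binary translation instead of interpreting one word at a time; this is only meant to illustrate the idea.

    # Toy ISA interpreter: simplified, NOT a faithful PowerPC decoder.
    REGS = [0] * 32                      # 32 general-purpose registers

    def decode(word):
        """Split a 32-bit word into a primary opcode and register/immediate fields."""
        opcode = (word >> 26) & 0x3F
        rd     = (word >> 21) & 0x1F
        ra     = (word >> 16) & 0x1F
        imm    = word & 0xFFFF
        return opcode, rd, ra, imm

    def step(word):
        opcode, rd, ra, imm = decode(word)
        if opcode == 14:                 # "addi"-like: rd = ra + immediate
            REGS[rd] = (REGS[ra] + imm) & 0xFFFFFFFF
        elif opcode == 31:               # "add"-like: rd = ra + rb
            rb = (word >> 11) & 0x1F
            REGS[rd] = (REGS[ra] + REGS[rb]) & 0xFFFFFFFF
        else:
            raise NotImplementedError(f"opcode {opcode}")

    # Hand-assembled toy program: r1 = 5; r2 = 7; r3 = r1 + r2
    program = [
        (14 << 26) | (1 << 21) | (0 << 16) | 5,
        (14 << 26) | (2 << 21) | (0 << 16) | 7,
        (31 << 26) | (3 << 21) | (1 << 16) | (2 << 11),
    ]
    for insn in program:
        step(insn)
    print(REGS[3])                       # -> 12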
And I doubt the "selling like hotcakes": Lenovo recently did a "T20 memorial edition laptop" with the 7-row keyboard; they lost their shirt on it. Hundreds and hundreds of enthusiasts bought them, but to make a laptop model profitable, you need to sell hundreds of thousands.
It's quite simple: the TCO of Apple hardware is much, much lower than that of Windows computers for a company. People using macOS are happier and more productive, and you need far less support staff for them than for Windows users.
Exactly. Macs have far fewer problems and need much less support. And that's the real cost, which dwarfs the hardware and software purchase cost. At one former employer, the central IT organization was charging departments $1800/month for every employee, to pay for access to networking, e-mail, internal tools, printing, and support; for that charge you also got one laptop or desktop computer. As you can see from the price, the ~$1500 every 2-3 years for the laptop hardware or the $50 for the Windows license is negligible by comparison.
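To put numbers on "negligible": amortizing the quoted figures per month (the 2.5-year refresh is my assumed midpoint of "every 2-3 years"), the hardware comes to roughly 3% of the monthly IT charge and the Windows license to well under 1%:

    # Back-of-the-envelope, using the figures quoted above.
    it_charge_per_month = 1800                 # central IT charge per employee per month
    laptop_price        = 1500                 # laptop cost per refresh
    refresh_years       = 2.5                  # assumed midpoint of "every 2-3 years"
    windows_license     = 50                   # one-time OS license

    laptop_per_month  = laptop_price / (refresh_years * 12)     # $50/month
    license_per_month = windows_license / (refresh_years * 12)  # ~$1.67/month

    print(f"hardware: {100 * laptop_per_month  / it_charge_per_month:.1f}% of IT charge")
    print(f"license:  {100 * license_per_month / it_charge_per_month:.2f}% of IT charge")
    # hardware: 2.8% of IT charge
    # license:  0.09% of IT charge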
By the way, if you don't believe that the TCO of a large fleet of Apple hardware is lower than that of a large fleet of Windows computers: this is not my imagination, nor my opinion. This is the result of IBM having rolled out over 200,000 Macs globally since 2015, and having done a thorough analysis of that hardware including all costs.
I was at IBM from 2000 onwards. Initially I used a Windows laptop (for a little while, I also had a Linux desktop, until I switched that to Windows). In about 2009, I switched to an (IBM-supported and -provided) Mac laptop. While corporate IT was pushing Linux laptops for a while, they back-pedaled after the support costs exploded, and made the Mac the preferred platform: simply because of higher productivity (Macs weren't down often) and lower support cost.
IBM manages to support those over 200,000 Macs with just 7 employees.
That number is a misinterpretation. I would believe that the central IT planning/managing/programming department for Macs might be 7 people. But the typical deskside and network support staff is more like 1-2% of all employees, meaning about 4,000 support staff at IBM's size.
(Talking about Linux ...)
Yes, because they know how to do that!
That is also an important observation. If you take Linux laptops and issue them to software engineers, you will have very few support problems, because the users will fix things themselves. Now take those same Linux laptops and issue them to project managers, budget analysts, mathematicians, and analytical chemists (all highly educated and intelligent people), and you have a disaster on your hands, because they are not computer people who dig into source code when something isn't perfect. I literally had a colleague who would run a modified Linux kernel on his laptop to optimize something, because he could.
That's cool. I would never even have believed that IBM would roll out Macs!
Not only did they roll out Macs very early (I think starting in 2004 or '05), they also resold the used employee Macs to their employees at a steep discount. My home Mac is a 2008 MBP, which I bought used from IBM (my employer) in 2010 or 2011. It still works, and I still use it for personal stuff (alas, without a functioning battery).
Sadly, I am just seeing a lot of guys who "know how to do that" being constrained by dumb internal policies, or by a single (perceived!) "must-have" app that the higher-ups have arbitrarily chosen due to a cooler colour scheme.
Not due to a cooler color scheme. Not due to them wanting "a virus downloader". But due to very sensible efficiency constraints. For example: it makes perfect sense for a whole corporation (whether it is 10 people or half a million) to use a single, coherent e-mail system, which implies everyone using the same mail client (we call that an MUA in software land). From a compatibility and support standpoint, it is the only sensible solution. Once in a while this leads to disasters (IBM's current troubles with Notes are an example), but most of the time it is great and necessary.
I'd go so far as to say that almost all the technical staff is on Mac or Linux. The reason for this is that our deployment environment is Linux. There's just too big of an impedance mismatch with Windows to develop on it and deploy on Linux.
I have been using Windows and Mac as my laptop/desktop machines for just about forever. For the last ~20 years, my deployment environment has always been Linux or a commercial Unix (AIX, HP-UX). As a software engineer, what you run on your laptop has nothing to do with where you deploy, because you don't actually compile and link on your laptop: that would be insane, it is way underpowered. The actual work gets done on big iron: mainframe-like machines, clusters, and the cloud, all machines that are very large, highly efficient, and somewhere in a data center. Today, it makes very little difference whether that data center is in the basement of my office building or on the opposite corner of the continent. As a matter of fact, from either my home or my office it would take several days by car to reach my main login machine at work.
Except for amateurs, for at least a decade software development has not been happening on the machine your keyboard and screen are attached to.
Note that this applies to software engineering. Other professions still run on local machines: graphics, video and audio production, electrical engineering EDA (like CAD tools). This stuff is only very slowly moving off-desk, because of the high-bandwidth user-interface needs. For software engineering, with something like emacs/vi, Eclipse, and web-based bug tracking and source control tools, it makes much more sense to run in a data center. One of the advantages: when I say the equivalent of "make", I have anywhere between dozens and thousands of machines at my disposal.
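As a concrete sketch of that workflow (edit locally, build in the data center): the hostname, path, and job count below are made up for illustration, and real setups typically use a distributed build farm or a cloud build service rather than a single box, but the shape is the same.

    # Sketch: edit locally, build remotely. Hostname and paths are hypothetical.
    import subprocess

    BUILD_HOST = "buildfarm.example.com"       # hypothetical remote build machine
    REMOTE_DIR = "~/src/myproject"             # hypothetical checkout on that machine

    def remote_build(target="all", jobs=64):
        """Sync the local working tree to the build host, then run make there."""
        # Push local edits to the remote working copy.
        subprocess.run(
            ["rsync", "-az", "--delete", "./", f"{BUILD_HOST}:{REMOTE_DIR}/"],
            check=True,
        )
        # Kick off a parallel build on the big machine; output streams back here.
        subprocess.run(
            ["ssh", BUILD_HOST, f"cd {REMOTE_DIR} && make -j{jobs} {target}"],
            check=True,
        )

    if __name__ == "__main__":
        remote_build()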
And even they are a tiny fraction of the hordes of Android and iOS users, which is why I ponder the wisdom of wading into this thread again. We're discussing pre-installing operating systems on desktops like it's 2008. That ship has sailed, folks. The desktop is just not all that important anymore. Embrace the long tail.
From a total computing throughput point of view, that is totally true. Somewhere I saw a statistic that about 80% of all computer usage, as measured by web traffic, now comes from two OSes: Android and iOS. But for user-interface intensive tasks (such as engineering), the laptop is still the preferred mode.