Cutting Through the AI Hype

An imaginary prog-rock album cover whose theme is AI

There is so much hyperbole about Large Language Models (LLMs) in the media right now that I’m finding it overwhelming, and I’m someone who works in the field of AI! From claims that AI will put people out of jobs to claims that it will take over and enslave the human race, it’s difficult to know where to start. Some say AI should be regulated now; others are happy to let the "free market"1 take its course. It’s not easy to navigate, especially when many of the people with strong opinions have their own agendas. This post is an attempt to briefly answer many of the questions I’ve asked myself over the past few weeks.

What’s Changed and Why Now?

What’s changed is that in recent years the neural networks that power many of the previous generation of speech-to-text and language classification models have gotten a lot better. Not because they evolved by themselves, but because engineers and mathematicians made them better. In the case of text generation, this means GPT-3.5 and 4 are uncannily good at predicting the next set of words for a given prompt. Text-to-image models like Midjourney and Stable Diffusion can now predict a set of pixels resembling a photo-realistic image from a text prompt. In the case of OpenAI and ChatGPT, there has been some genuinely mind-blowing innovation. I’m more sceptical of Microsoft, who after decades of being seen as a laggard behind Google and Apple in every industry they enter, seem a little too keen to plug OpenAI into all of their products. Still, there is a big PR push from both companies, and Microsoft is a big investor in OpenAI. What’s new, therefore, is a breakthrough in the mathematical models which link training samples to prediction. Given the amount of money invested, it’s now time to monetise, secure more investment and/or research funding.
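To illustrate what “predicting the next word” means at its very crudest, here is a toy sketch in Python. The tiny corpus and the bigram-counting approach are mine, purely for illustration; GPT-class models are vastly more sophisticated, but the underlying task is the same: predict the next token given what came before.

```python
from collections import Counter, defaultdict

# A toy "language model": count which word follows which in a tiny
# corpus, then predict the most frequent successor.
corpus = "the cat sat on the mat the cat ate the fish".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict_next(word):
    # Return the most common word seen after `word` in the corpus.
    return successors[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat" -- it follows "the" most often
```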

Will It Turn Against Humans and Take Over the World?

A common refrain among many AI sceptics is to draw an analogy between humans and the other animals on Earth. We have used our intelligence to dominate all other species and decimate the planet; wouldn’t a more intelligent being do the same to humans? It’s a convincing argument, but I’m not sure that the likes of Stable Diffusion and ChatGPT or their successors should worry us. In fact, I think the way we see these statistical models today will be akin to how some people in the 1950s saw the “electronic brains” that we now call computers: mysterious and magical. Frightening. There are two reasons I am not worried.

Firstly, I cannot see how the ability to predict the next word in a sentence, or the pixels that make a convincing image, equates to intelligence in the human sense, or anything beyond human intelligence. Even if we assume that the breakthroughs in language prediction will also be possible in other areas of life (politics, art, engineering), it’s still simply providing predictions from a given input. What I can see, and think we’ve already witnessed to a certain extent, is how AI might disrupt our society. The way in which the algorithms that run social media have damaged our democracy, by convincing vast swathes of the population on both the left and the right to believe in nonsense science and conspiracy theories, is an obvious example. But there is no drive, agency or consciousness behind the AI.

The second reason is that history, and even our current political climate, tells us that intelligence is not something we should necessarily fear. Take a few examples of individuals who’ve caused damage to humankind in the past: Trump, Putin, Hitler, Pinochet. I could go on. None of them are known for their raw intelligence. Charisma and ruthlessness, perhaps. They managed to co-opt other intelligent people to work on their behalf. The most intelligent human beings, from Einstein, Galileo and Lovelace to Lennon and McCartney, are not the ones we need to worry about. I’m far more concerned about humans with average intelligence, a Napoleon complex and access to nuclear weapons than I am about superintelligent AI.

Will We All Be Out of Jobs?

Unfortunately I am not as optimistic on this one. I can easily see jobs like copywriting and graphic design being disrupted at the junior level. We’ve already seen an NYT Bestseller with AI-generated artwork on its cover. We’re all told not to judge a book by its cover, but let’s face it, we all do. In this case the AI-generated image was listed in a stock image library alongside human-generated images, so I doubt the choice was conscious. It was also heavily modified by a human. You could ask: if the author cared so little about the cover of their book that they were happy for a stock image to be used, rather than commissioning an artist themselves, is there really any loss? They didn’t want to pay much for the images in the first place. But somewhere, someone would have been paid something to make that image, and it hasn’t happened this time. A junior designer would need to work on small jobs like this in order to gain the experience and skills to move up in their profession. How will they do that now? The problem I see is that with AI taking the low-end, basic work away from these professions, how does a human work their way up? On the other hand, the pocket calculator allowed mathematicians, accountants and engineers to focus on bigger problems and have the gruntwork done for them. Anyone gifted at maths can still find a well-paid job, despite the fact we’ve had powerful calculators in our pockets for 50 years. Perhaps the same will be true of writing and graphic artistry. The bottom line is, these large models can only generate what is akin to a statistical average of what is already on the Internet. If it were received wisdom that the world was flat, GPT-4 would reliably tell us that the world was flat. GPT-4 has no mechanism to generate anything else, and nor will its successors.

What about other jobs? Programming is often cited as being at risk from GPT, because it can generate code. Yet I have yet to see it generate anything I couldn’t have found by searching on Google and taking the first result on StackOverflow. When I asked it to write an application to convert files from one format to another, it told me it didn’t know enough about their implementation, despite both formats being publicly documented on the Internet. Impressive that it “knew” what it didn’t “know” (I wish more developers were like that!). The jury is out, but I am not as worried about software developers all losing their jobs, because I know just how difficult software development can be, even for humans with 30+ years of experience. Implementing a variation on a well-known algorithm in the abstract is one thing, but integrating it into existing business domains, data structures, user interfaces and architectures is something else entirely.

In Summary

I find it hard to get excited about the likes of GPT and Midjourney because I find them emblematic of a tech industry that has lost its way. If someone had suggested 20 years ago that we should build a mathematical model that takes crawled web pages as an input and autocompletes text and images as an output, and that it could likely spread misinformation and put journalism and other important professions at risk, we would have collectively responded with a resounding “Nope.” While I am intrigued that we may have stumbled across a deeper mathematical theory of language, if only by chance, I am not yet excited by the potential utility of such models. Beyond that, I am disappointed that in a world where many people still struggle to feed themselves, that is heading for a climate disaster, and where many populist leaders are coming to power, technology - something that was always a cause for optimism when I was growing up in the 90s/2000s - looks likely to make things worse, not better.

Notes

  1. The free market, of course, doesn’t exist, and is generally a euphemism for ‘what I want the rules to be in order to suit me’.

In Praise of Apple Music (Yes, the Mac App)

Apple Music has a poor reputation these days. I’m not talking about the music streaming service here, nor the iOS apps that share the same name. I’m talking about the 22-year-old application formerly known as iTunes. You see, it’s the same application really, only with a new name and a fresh coat of paint. Squint a bit, or rather go to the Songs view and turn on Column Browser, and you will see it for what it really is: iTunes with a slightly different icon and all of the video and podcast bloat removed.

Having been around since January 2001, in many places it really shows when compared to more modern applications that come bundled with macOS, such as the nine-year-old Photos app. But unlike Apple Music for iOS, Apple Music for Mac retains many of its power-user features and can be used to manage a local music library effectively, if that’s something you still want to do.

Apple Music retains most of the features from the heyday of iTunes

Yes, it does have many annoying bugs and could certainly do with some investment in the QA department. But thankfully it still has many ‘power user’ features that I admire, and I’m grateful for this. Apple Music is still able, believe it or not, to rip CDs to MP3 or AAC, and will still fetch CD metadata from the venerable Gracenote CDDB service. This may sound arcane, but some music is only available on CD. (As a more general point, if you’re limiting your music discovery to what’s available on streaming services, you should definitely expand your horizons. Even the old iTunes Store has a much wider range, and streaming - especially Spotify - is notoriously bad for artists.) Once ‘ripped’, the songs are uploaded to iTunes Match and are available on all of my devices, including my Apple Watch.

Apple Music also allows me to manually select which albums I want to keep locally, unlike the Photos app. I can also see the status of both uploads and downloads, again unlike Photos. I can manually remove the local copies of songs stored in the cloud in order to free up space on my Mac; yet again, such a basic function is not provided by Photos. In addition, I can easily back up my music collection using the Finder, and the data will travel across different filesystems (no special, Mac-only bundle files).

Best of all, Apple Music mostly manages to seamlessly meld my offline library, tracks synchronised via iTunes Match, and songs added from Apple’s music subscription service into one library. It even allows me to create a “Smart Playlist” which can catalogue songs based on whether they are rented through Apple Music or fully owned, as well as countless other criteria such as Genre, Artist, Play Count (which, incidentally, includes plays from all of my other devices), File Size, File Type, Date Added, Last Played, Composer, Year Released, and even how many times I’ve skipped a song. Yes, I could create a playlist of my most skipped songs. How’s that for a desert island disc?

So while it’s not perfect, I’m glad Apple haven’t released some half-arsed rewrite of Apple Music focused on flogging their subscription. iTunes lives on for now, if not in name, then in spirit.


Sorting by Date Is Also an Algorithm

It’s a common refrain in tech circles when discussing Twitter to talk about “The Algorithm”. Take this Lifewire article: How to Turn off the Twitter Timeline Algorithm, or this one by TechCrunch: Twitter makes algorithmic timeline default on iOS.

In the beginning, the Twitter timeline was a simple list of posts by people you follow, sorted by date descending. In 2016, Twitter made a new timeline the default that wasn’t so transparent. It was seemingly designed to optimise for engagement. All of a sudden, people were seeing posts from people they didn’t follow, or posts from days ago that for some reason had been boosted by the algorithm.

The point is, both the classic timeline and the “algorithmic” timeline are in fact algorithmic. Indeed, the first algorithm taught in computer science is often a sorting algorithm: Bubble Sort. Sorting by date is still an algorithm! In the case of Twitter, or its upcoming rival Mastodon, sorting by date may be preferable, but there are possible downsides too. Prioritising something because it happens to have been posted recently is a form of recency bias, after all. The key seems to be transparency.
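The “classic” timeline really is just a sort. A toy sketch in Python (the post records here are made up for illustration):

```python
from datetime import datetime

# Hypothetical post records: (author, text, timestamp)
posts = [
    ("alice", "Good morning!", datetime(2022, 3, 1, 8, 0)),
    ("bob", "Lunch time", datetime(2022, 3, 1, 12, 30)),
    ("carol", "Late night thoughts", datetime(2022, 2, 28, 23, 45)),
]

# The "classic" timeline: newest first. This one line is still an
# algorithm -- it just happens to be a transparent one.
classic_timeline = sorted(posts, key=lambda p: p[2], reverse=True)

for author, text, ts in classic_timeline:
    print(ts.isoformat(), author, text)
```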

So enjoy your favourite algorithm, and remember, they’re not all bad.


Microsoft OneDrive Causing Sleepless Nights (on My Mac)

My M1 Mac mini has been sleeping dreadfully recently. Perhaps it’s the thought of one day having macOS Ventura installed on it, or the knowledge that its younger M2 brethren beats it in both performance and power efficiency. Or perhaps, it was something else altogether.

I could tell it wasn’t sleeping because the attached USB storage would endlessly spin up and down during the day; the Mac mini sits under my desk while I work all day on my laptop. One time while it was supposedly asleep and idle, I went to plug a device into it, and could feel the fan blowing hot air out of the back. It wasn’t sleeping at all.

Thanks to an app called Sleep Aid, I was able to verify that my Mac was indeed struggling to get to sleep. Even when I put it to sleep manually, it would quickly wake up and not go back to sleep again. Sleep Aid helpfully lists the processes that were active during time awake. In my case, the most active process was something called “com.apple.FileProvider.cache-delete.push”.

Armed with this information, I did what any self-proclaimed nerd would do: I Googled it. To my surprise, there were barely any hits. Some posts relating to Android, one on the Apple Developer forum about a bug with the File Provider API, but it didn’t seem related. I was stumped. Sam Rowlands, the developer behind Sleep Aid, reached out to me on Twitter and offered to check that Sleep Aid itself wasn’t misreporting what I was seeing. It was not.

Eventually I stumbled across a WWDC video on the File Provider API, and decided to watch it. It dawned on me that this is the API Apple makes for services such as Dropbox and Microsoft OneDrive to use. For years, Dropbox and OneDrive have been using various unofficial hacks in order to integrate with the Finder. In 2019, Apple released the File Provider API and gradually the third party cloud sync providers have been moving to it ever since.

Apart from Apple’s own iCloud Drive, I only had Microsoft OneDrive installed. I promptly deleted the OneDrive app and since then, my Mac has not had any bouts of insomnia due to com.apple.FileProvider.cache-delete.push. Problem solved.

My guess is that because I had OneDrive set not to run on system startup, and I had the OneDrive sync directory on an external hard drive, somehow the push notifications that Apple sends to indicate a file has changed were getting stuck. Perhaps OneDrive is supposed to say “I’ve got the file, you can go back to sleep” but wasn’t able to do so.

My Mac still spends a lot of the time it is supposed to be sleeping awake, but this seems to be common in M1 Macs. Thankfully now it does go to sleep for around 40% of the time instead of 0%, helping avoid needlessly wasting energy.

So if you’ve got a Mac that won’t sleep, I recommend grabbing a copy of Sleep Aid to help find the culprit.

Title image generated by Stable Diffusion using the prompt “a man sitting at a laptop falling asleep in a dark room at night, Cinematic” (with some edits).


My First Gadget: The Oregon Scientific AM-080C 34KB

https://www.youtube.com/watch?v=PUN5aKdxC3Q


Amazon’s Echo Into the Void

From Business Insider:

The vast majority of Worldwide Digital’s losses were tied to Amazon’s Alexa and other devices, a person familiar with the division told Insider. The loss was by far the largest among all of Amazon’s business units and slightly double the losses from its still nascent physical stores and grocery business.

I recall back in the summer of 2017 meeting with some senior marketing executives who worked for a multinational fashion and beauty company. We were there to talk about AI, and voice was central to the discussion. It seemed inevitable to most people in the room that within a few years, voice would be an important “touch point” for consumers wanting to interact with their brands. I felt slightly more cautious, though it was easy to get wrapped up in the hype. I always imagine the simple task of ordering a meal in a restaurant. It’s far easier to peruse a menu with your eyes than to have the waiter read you a list of what’s available while you try to remember it all and make a decision in a reasonable amount of time. In general I subscribe to the view that AI is an accelerator for human-like skills and interactions. It can speed up and automate tasks that humans do, but if those tasks don’t already make for a great experience, then AI by itself won’t make them better, unless speed and accuracy are the cause of the poor experience.

Alexa suffers from this “restaurant problem”. While modern Natural Language Understanding capabilities are very good, they haven’t progressed at the rate it seemed they would back in 2017. This makes Alexa great for simple commands like setting timers and playing music, but useless for anything more substantial.

A common misconception with systems such as Alexa and Apple’s Siri is that they generate their answers using AI. They don’t. Generative AI systems do (see GPT-3 and ChatGPT), but those cannot be trusted to provide accurate answers, and because they are trained by crawling the internet, they are unable to generate answers that require knowledge of recent or future events. ChatGPT won’t be able to tell you the weather tomorrow, and it won’t be able to tell you what time your local supermarket opens.

Instead, systems like Alexa and Siri use a form of text classification. After the sound waves from your voice are converted into symbols (letters and numbers), and those symbols are converted into words, the sentence you uttered is classified into one or more intents. The intent that scores the highest probability from the machine learning model is the one Alexa will presume was your actual intent. That’s why when I recently asked Alexa “At what temperature should I hang washing outside?” it thought I was asking for a weather forecast. Someone at Amazon has to have created that intent and fed the ML model with example utterances for it to be able to detect it. These systems cannot understand intents they haven’t been trained on.

Once the assistant knows your intent, the next task is to extract any parameters from your utterance. Examples would be the date and location in the phrase “Will it rain in Newport next week?”. Once your voice assistant knows your intent and any parameters, it will then perform some kind of logic based on that intent. This is where the AI and machine learning typically stops. If the intent was asking for the weather, the next step would be to query a weather API. If it was to send a message to someone, it would be to start whichever process your device uses to send messages. Of course, the weather API itself may use AI or machine learning to predict the weather, but that is totally separate, and no different to a weather presenter telling you the same forecast on TV.

This approach is extraordinarily useful for many things: most chatbots and voice assistants work like this. For people who can’t see, or find it difficult to use a touchscreen or mouse, they provide invaluable ways to interact with computing devices.
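To make the intent-classification pipeline concrete, here is a deliberately simplified sketch in Python. The intents, keyword lists and location gazetteer are all made up for illustration; a real assistant uses a trained ML classifier, not keyword counting, but the shape is the same: score intents, pick the winner, extract parameters.

```python
# Toy intent classifier: score each known intent by counting keyword
# matches, pick the highest-scoring one, then extract any parameters.
INTENT_KEYWORDS = {
    "get_weather": ["weather", "rain", "forecast", "temperature"],
    "set_timer": ["timer", "minutes", "countdown"],
    "play_music": ["play", "song", "music"],
}

KNOWN_LOCATIONS = ["newport", "london", "cardiff"]  # made-up gazetteer

def classify(utterance):
    words = [w.strip("?.,") for w in utterance.lower().split()]
    scores = {
        intent: sum(w in kws for w in words)
        for intent, kws in INTENT_KEYWORDS.items()
    }
    # Highest-scoring intent wins, as described above.
    return max(scores, key=scores.get)

def extract_location(utterance):
    for w in utterance.lower().split():
        if w.strip("?.,") in KNOWN_LOCATIONS:
            return w.strip("?.,")
    return None

print(classify("Will it rain in Newport next week?"))        # get_weather
print(extract_location("Will it rain in Newport next week?"))  # newport
# The washing-line question also trips the weather intent, because
# "temperature" is one of its example keywords:
print(classify("At what temperature should I hang washing outside?"))
```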

I use Siri all the time to set reminders and timers and to control my lights. What Alexa and Siri are not so good at is deep and meaningful conversation. This is where Amazon’s hope that Alexa might one day be a shopping destination seems to fall short. A device centred around a conversational user experience will hit a wall, due both to current technical limitations and to the fact that for many people, speaking is less efficient than using a smartphone when they need to both receive and provide information to complete a task. The fact that Amazon seemingly has no way to monetise Alexa means the experience has been gradually getting worse. Now when I ask it the weather, it responds with the forecast - great - but then immediately starts telling me I can order groceries from it as well. Ads like this are infuriating and a sign of desperation from Amazon.

So were we foolish to think the future of human-computer interaction will be voice? No. In the long term, when devices are advanced enough to provide human-level, meaningful conversation, there is no doubt in my mind that voice will be one of the primary user interfaces we use, for some tasks at least. When I can ask Alexa to order the precise groceries I want, with the confidence that it will work and that the device is capable of asking me to confirm anything it’s unsure about, then maybe I can see it working. But I still can’t help thinking that humans like to see as well as hear things, especially when it comes to making choices. Voice is great for issuing commands and receiving quick updates, but if your voice assistant talks for more than about 20 seconds, it’s usually quicker to glance down at a screen and see a textual or graphical representation.

I think the future is bright for voice assistants like Siri because they complement alternative user interfaces and are part of a deep ecosystem, and so can integrate with health, home automation, contacts and other information users have provided. Voice-based AI is also making large strides in call centres. Unless Amazon changes tack, however, the Amazon Echo, with its limited ecosystem, will remain a glorified clock radio for a while longer.


How a Battery Replacement Left My Apple Watch Obsolete

£91 a year for a watch?

I’ve been using an Apple Watch for over 7 years now, and in that time I’ve owned the original Series 0, the Series 2, and the Series 4. Recently I decided it was time to upgrade to the latest and greatest Series 8. Each time I’ve upgraded, I’ve given the previous model to a family member, and so my girlfriend has been wearing my old Series 2 for the past 4 years. Having been originally purchased in 2016, 6 years of daily use was most definitely taking its toll on the Series 2’s battery. My partner likes to cycle and work out, which hammers the battery even more than typical use. She was therefore keen to take on my old Series 4, which still has decent battery life. That left us with her otherwise fully functional 6-year-old Apple Watch Series 2, which apart from the knackered battery still worked absolutely fine and even paired with the latest version of iOS, version 16. I really wanted to give this to someone else in our family, and so decided to pay Apple £85 for a battery replacement.

Before booking the appointment with the Apple Store, I unpaired the watch from my girlfriend’s phone and paired it with my iPhone 12 mini running iOS 16. This was so the watch would show up in my account on the Apple support web page. With my appointment booked, I walked into the Apple Store with both my Series 8 and the Series 2 paired to my phone to get the battery replaced. It was lucky I did pair the Series 2 to my phone, as the guy at the Genius Bar needed to run some diagnostics on the watch to verify that the battery was indeed knackered, and I guess to rule out any other faults or damage that they might get blamed for if the device were to come back faulty. With the diagnostic check passed, and my battery confirmed to be a dud, he sent it off for a replacement. Just over a week later, I received an email telling me my device was ready to be collected.

In actual fact, my device had been replaced. The box contained a factory-fresh (or possibly refurbished) Apple Watch Series 2. Fair enough, I thought. I can imagine how replacing the battery in such a tiny device that also has to remain waterproof might not be worth the hassle for Apple. I paid my £85 and drove home. Before handing the shiny new but also old watch to the next relative, I wanted to try and pair it with my phone again, just to check everything was working as expected. This is where it all started to go downhill. I would initiate the pairing process and get to the point where the Watch would insist on being updated, only for it to tell me it could not connect to the update server and to check my internet connection. I suspected this to be a case of lazy error handling on the part of the Apple developer who wrote this code, as my internet connection was fully functioning. Still, I tried on 4G just to make sure, and to my complete lack of surprise, it made no difference.

After some time Googling the error message, I figured out what the issue was. The replacement watch came loaded with an old version of watchOS which is not compatible with iOS 15 or iOS 16. What I needed to do was find an old iPhone running an older OS (my guess: iOS 14), upgrade the Watch to watchOS 6 (the last version supported on the S2), and then it would pair successfully with my iOS 16 device, as it had done only a week earlier. I tried on an iOS 15 device, just in case, but it made no difference. I didn’t have any older phones, at least none so old that they couldn’t run iOS 15, and nor did anybody I know. No worries, I thought: I’ll go back to the Apple Store. Surely they have some kind of Mac app that can simulate any iPhone and load any watchOS version onto a device, as deemed appropriate by a qualified expert?

My hopes were dashed when the geniuses in the Apple Store were equally as confused as I had been at first. They didn’t have any older iPhones, and so the only hope would be to send the Watch back to the repair depot. They did have sympathy for the fact I’d paid for a new battery and now the device couldn’t be used, and so suggested I upgrade the S2 to the brand new Apple Watch SE, offering me a generous 50% discount off the price of a new one. I hadn’t planned on treating this other family member that much - but even after factoring in the £85 spent on a new battery, it seemed like such a good deal. I took them up on the offer, but the gentleman who served me said I could always pop back in on a weekday with the Series 2 and they’d still be able to send it back off to be upgraded, so it wouldn’t go to waste. I walked away with a new SE and an obsolete S2 and got on with my life, and our family member was extremely chuffed with the early Christmas present.

A few weeks later, after seeing this stainless steel Apple Watch that originally cost £549 back in 2016 staring at me from the shelf every day, I thought maybe I should get Apple to fix it. It was still within its 90-day repair warranty, and the guy at the Apple Store had suggested it could be done. I have plenty of relatives with iPhones who would love a free Apple Watch. At £549, lasting almost exactly 6 years, it cost £91/year to own. That’s actually quite a lot for a watch. I can’t imagine being satisfied with any other watch or piece of jewellery costing that much and lasting only 6 years. So I dropped it back into the Apple Store, and a week later it was replaced yet again. I went to collect it, and this time tried to run the pairing process in store. The pairing process failed with exactly the same error message as before. The problem hadn’t been resolved, despite the service notes explicitly stating “customer wants to pair this with his iOS 16 device”. I didn’t push the issue with the staff - after all, they did look after me by offering the discounted SE. That was, in fact, their “solution” to the problem.

This brand new Series 2 was destined to sit on a shelf for ever more. What a waste.

The point of this post is not to complain about Apple or the staff in the Apple Store. The store staff did a good job turning around a bad situation. The point is to raise awareness of how utterly disposable Apple Watches are. I can understand that there must be a limit to how long companies like Apple provide updates for their devices, but a phone without updates will still work for a few years. The Apple Watch, however, won’t even pair to a modern phone, or even to the original phone I had when I bought the Series 2 (an iPhone SE, which if kept up to date would be on iOS 15).

So if you’re thinking of buying one of the more expensive, premium watches with premium materials, maybe think again.


Apple Battery Replacement Costs Compared

Every year we marvel at the latest product releases from tech companies, with Apple often leading the way. What is often not said, however, is that that shiny new device with even more battery life than the previous model will quite likely be a shadow of its former self in little over 2-3 years. This is not a conspiracy, and there is nothing evil going on. Batteries, like brake pads on a car, are consumables that we should expect to have to replace.

Unlike brakes on a car, the cost of replacing a battery can be quite an expense relative to the original price of the device. In many cases, the price of the original (or an equivalent) device may have gone down by the time the battery needs replacing, further narrowing the gap. Why is this bad? The smaller the gap between the cost of replacing a battery and the cost of a new device, the more likely someone is to say, “My battery sucks, I’ll just buy a new one”. This is great if you’re in the market for selling new devices, but it’s not so great for the environment.

The chart above shows the price of a new device compared to the cost of replacing the battery, as quoted by Apple on their UK web site on the 15th of September 2022. Where there are upgrade options available (such as storage or RAM), I took the price of the base model. For AirPods, Apple quotes a price per AirPod. I was generous and only multiplied this by 2, my intuition being that customers are less likely to replace the battery in their AirPods case but will need to replace the batteries in the buds themselves.

As you can see, the percentage ranges from 10% for a MacBook Air M2 to a whopping 61% for the AirPods (3rd generation). For some reason it’s cheaper to replace the battery in a set of AirPods Pro than in the cheaper AirPods. One might argue that this is simply the price we pay for cheaper products in the first place. Obviously, the relative cost of a battery for a £2,000 maxed-out MacBook Pro will be a lot less than for a £259 Apple Watch. The problem is that the higher the cost of a replacement battery relative to simply replacing the product, the more likely consumers are to opt for the latter. Don’t get me wrong, there are many other reasons consumers upgrade their devices, but poor battery life surely contributes greatly to that feeling that a device is old and needs replacing.

Perhaps Apple could do more to nudge consumers into making fewer unnecessary purchases by reducing the cost of replacement batteries?

Chart Data

Device               Device Cost   Battery Cost   %
iPhone 14 Pro        £1,099        £105           10%
iPhone 14            £849          £105           12%
iPhone SE 2          £449          £49            11%
iPad Pro 11”         £749          £99            13%
iPad Air             £569          £99            17%
iPad                 £319          £99            31%
Apple Watch Ultra    £849          £105           12%
Apple Watch 8        £419          £85            20%
Apple Watch SE       £259          £85            33%
AirPods Pro*         £249          £90            36%
AirPods*             £179          £110           61%
MacBook Air M2       £1,249        £129           10%
MacBook Air M1       £999          £129           13%
MacBook Pro 14       £1,899        £199           10%

* Battery price quoted per AirPod; figures assume both buds are replaced.
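The percentage column is just the battery cost divided by the device cost, rounded to the nearest whole percent. A few rows of the table, reproduced in a short Python sketch:

```python
# Reproduce the battery-cost percentages for a few rows of the table.
# Prices in GBP, as quoted by Apple UK on 15 September 2022.
devices = {
    "iPhone 14 Pro": (1099, 105),
    "iPad": (319, 99),
    "AirPods": (179, 110),  # per-AirPod price doubled, per the note above
    "MacBook Air M2": (1249, 129),
}

for name, (device_cost, battery_cost) in devices.items():
    pct = round(100 * battery_cost / device_cost)
    print(f"{name}: {pct}%")
```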


System Preferences: Then and Now

After 21 years, the original macOS System Preferences is being retired. The next version of macOS, “Ventura”, will have an all-new iOS-style preferences application named “System Settings”. I decided to go all the way back to 2001 and look at the original System Preferences (or “System Prefs” as it was called in the Menu Bar) in Mac OS X 10.0 “Cheetah” to see which of the original 21 preference panes made it through all 21 years, and how they ended up in their final incarnation under macOS 12 “Monterey”.

All Preferences

Then Now

Classic

This one went away in Mac OS X 10.5 Leopard.

ColorSync

Since replaced with an app

Then Now

Date & Time

Then Now

Date & Time > Network Time

Then Now

Date & Time > Time Zone

Then Now

Date & Time

Then Now

Desktop

Not technically a System Preference pane, but it quickly became one in 10.1 so I’ve included it here :)

Then Now

Displays

Then Now

Dock

Then Now

Energy Saver (some options missing due to being in a VM)

Then Now

Energy Saver on a Portable

Energy Saver on a Portable (if anyone knows how to configure UTM to make Mac OS X 10.0 think it’s running on a laptop, let me know and I’ll add a comparison for that too)

General

Then Now

Language & Region > Time

Then Now

Internet

Then Now

Keyboard

Then Now

Login

Then Now

Login

Then Now

Mouse

Then Now

Network

Then Now

Quicktime

Then Now

Screen Saver

Then Now

Screen Saver > Hot Corners

Then Now

Sharing

Then Now

Software Update

Then Now

Sound

Then Now

Speech

Then Now

Startup Disk

Then Now

Users

Then Now

In sum

So, after 21 years it’s fair to say that while System Preferences has evolved and become more complex, a lot has also stayed the same. Teleport someone from 2001 to 2022 and, while you might need to explain the concepts of cloud storage, Wi-Fi, Bluetooth, and Touch ID, they would probably do just fine working out the basics of macOS. (With the major exception of those who require accessibility options, which wouldn’t be introduced until 10.1 “Puma”.) Since 2001, many more preference panes have been added. On a clean macOS Monterey installation, I counted 30 preference panes. This increases to 32 if you are signed in to Family Sharing and iCloud. That’s an increase of about half a preference pane every year. I can therefore see why Apple might want to move to a more scalable system. That said, part of me will miss the familiarity of using System Preferences, the same application I have used since 2001 at the dawn of the OS X era.


iOS 16 Put Notifications on Notice

The recently announced iOS 16 will feature a new-look Lock Screen. One of the striking features is how notifications are now far less prominent than they have been ever since their arrival in iPhone OS 3.0 back in 2009.

The change is a signal that many users want to control what they see and when, rather than have application developers decide for them. In 2009 it was quite novel to know instantly when someone liked your Facebook post. After 13 years the novelty has worn off. In fact, many notifications are just superfluous noise that doesn’t enrich our lives at all.

While there will always be a need for some kinds of notification, my sense is that there will be a shift towards notifications being “off by default” in future iOS releases - much in the same way the Windows system tray ended up filled with every possible kind of icon, and Microsoft eventually just hid them all by default.