Apple Vision Pro…blematic

Image credit: Apple (https://www.apple.com/ca/newsroom/2023/06/introducing-apple-vision-pro/)

Three and a half months after the release of the Apple Vision Pro, I am finding it difficult to get on board with Apple's vision for the future of computing. Admittedly, I haven't tried the Vision Pro, as it isn't yet available in the UK, and I'm sure I would be momentarily wowed by the experience. But I just can't get excited about what I see as a deeply dystopian vision of the future.

Let’s start with the vision itself. We spend too much time on screens as it is. Endless doomscrolling and the inability to switch off are widely believed to be behind rising anxiety and depression. Apple, to their credit, have attempted to remedy this with features like Screen Time, which allows limits to be put in place. The very concept of the Vision Pro goes against this progress. It's a screen strapped to your face. Right now, the discomfort of a hot and heavy device acts as a deterrent against using it too much. My worry is that, as technology progresses, the Apple Vision Pro will, much like the smartphone, get lighter and more comfortable — more addictive. If the headset follows the path of the smartphone, then it will eventually be something we want to wear as much as possible.

We live in a world of filter bubbles. If you use Twitter (sorry, X), then your view of reality can be vastly different from that of your next-door neighbour who also uses Twitter, because you each like and follow totally different posts and sets of people. Smaller social networks don’t need an algorithm to do this, as they typically only attract people of a certain political leaning in the first place. The end result is the same. Filter bubbles are an obvious problem for social cohesion, and while they are not limited to social networks (hello, Fox News), I can’t help thinking that when literally everything is viewed through a screen — yes, your entire reality — the problem will only get worse.

If you've seen Black Mirror, then I don’t need to explain why, but let me anyway. Imagine an app for a future revision of the Vision Pro (or its rivals) that can be worn all day. The app allows users to employ Twitter-style blocking, so the faces of people they would rather not see are blurred out. Could someone with right-wing views use an app to blur out TV screens showing CNN or the front page of the New York Times, allowing them to walk down the high street without being bothered by reality, or by what they consider dissenting opinions?

In 2020, as California was struck by awful wildfires, people noticed something strange: their smartphones were automatically adjusting the apocalyptic red-tinted sky to look more “normal.” The image of humans using VR goggles to deny climate change, or to insert virtual green spaces into an otherwise industrial wasteland, is just terrifying.

In short, I am deeply worried about a future in which our entire reality is seen through the filter of a screen. The Apple Vision Pro can’t do any of this right now, and I doubt Apple would ever allow it to. Yet as the Twitter/Musk debacle shows us, the people who run companies can change, and their policies with them. Not to mention that if the Apple Vision Pro were a success, rival products would naturally follow, and they may not be as restrictive.

All of what I’ve said so far is admittedly far-fetched, and perhaps assumes the worst of humanity. So let me focus on the Apple Vision Pro as the product it is today. Apart from being deeply antisocial, the main problem I see is that it is an iPad strapped to your face, priced as if it were a fully-featured MacBook strapped to your face. The iPad, and tablets in general, are great content-consumption devices, but the limitations inherent in mobile operating systems make them poor substitutes for a laptop. Apple and Samsung have taken steps to address this, including making the iPad more like a laptop with support for trackpads, mice, and keyboards. Yet the apps themselves are more often than not baby versions of the same applications available for desktop operating systems, if an app is available at all. There are exceptions — Logic and Final Cut Pro. Overall, it seems a combination of iOS (sorry, iPadOS) limitations, Apple’s greedy business practices, and the poor ergonomics of tablet devices has kept the iPad from taking off in a productivity context, outside some niche verticals.

By designing the Apple Vision Pro’s software stack in the image of the iPad, Apple have left the device severely limited. Had it been based on macOS, with the ability to run Mac software, it would be a completely different story. I understand that technical limitations likely prohibit this: the accuracy of the Vision Pro’s gesture recognition simply isn’t good enough to comfortably drive Mac apps designed for use with a mouse.

Which brings me to my final criticism. One of the most touted features of the Vision Pro is the ability to VNC into a desktop Mac and control it remotely. The fact that this was pushed as a feature is the most damning indictment of the Vision Pro’s capabilities as a computing device. A proper computing device worth its $3,500+ price tag wouldn’t need the ability to remotely connect to your other $2,000+ laptop — it would be as capable itself. On various forums I’ve seen happy Vision Pro customers tout the ability to work on a plane for hours on end with both their MacBook and Apple Vision Pro. Here we go again with the dystopian future: can someone not take a flight for a few hours without needing to be productive? For goodness' sake, read a book, listen to a podcast, or just look out of the window!

So, while the nerd in me is ready to be wowed by the Apple Vision Pro, I find myself despairing as I think about a potential future where we are always plugged in, online, and unable to escape work. I can’t help but think this isn’t what technology was supposed to be about. I have to ask myself: would the figure from Apple’s famous 1984 commercial, who throws a sledgehammer at the big screen, be wearing an Apple Vision Pro? Does the screen itself now represent Apple’s latest Vision?


Microsoft: This Feels Desperate Now

Andrew Cunningham at Ars Technica has written a nearly 3,000-word article explaining how to remove advertisements and other annoyances from Windows 11 and Microsoft Edge. Microsoft are not alone in degrading the quality of their software in order to sell more services; Apple have been doing this a lot recently too. With Microsoft, though, it feels increasingly desperate. Its Copilot feature acts like a thin wrapper around the OpenAI API, no doubt created so that Microsoft can be seen to be fully riding the LLM bandwagon. Of course, the resulting stock price bump from this perception was probably a factor here as well. Its garish icon is placed on the taskbar by default, yet there is no intuitive way to remove it. You'd think you would be able to simply right-click and unpin it like any other icon, right? Of course not. Instead, you have to dig around in the system settings app. Either their UX designers are bad at their jobs, or, more likely, a manager stepped in and said: "Actually, let's enhance the friction in removing the icon; my forthcoming bonus hinges on maximising user engagement with it. (evil laugh)"

The same is true in Microsoft Edge. Any setting that Microsoft would rather you didn't change is unbelievably difficult to find. Want to remove the sidebar, or change the default search engine from Bing? One might think, perhaps naively, that altering something as basic as the default search engine shouldn't require a degree in computer science. Yet here I am, resorting to the wisdom of Google search to uncover the elusive setting (I did try asking Microsoft Copilot, but alas, it came back with something completely unrelated). Edge is also desperate for me to make it the default browser. Instead of cutting to the chase, it dances around like a slick sales rep, coyly offering me "Microsoft's Recommended Settings" as if I'm choosing to enhance my computer's security, when really it's more about padding Microsoft's bottom line.

I suppose we should be grateful to Microsoft for at least offering settings for these options - there's no reason they have to. Microsoft's strategy seems to be a delicate balancing act: making things difficult for the masses while maintaining some appeal for the more tech-savvy, who would switch to another vendor if these options didn't exist at all.

Anyway, it's a desperate look. Please Microsoft, show some class!


Fixing Focus Modes on iOS

When Apple introduced Focus Modes and Focus Filters, I hoped it might finally mean the ability to cleanly separate work and personal life within iOS. Unfortunately, the system is flawed.

Let’s briefly recap how it works. Focus Modes are an extension of “Do Not Disturb”, which has been part of iOS since around 2011 (if my memory is correct), and allow precise control of notifications for different scenarios. Apple allows users to create various Focus Modes for different activities. For each Focus Mode, you can control which people and which apps are allowed to send notifications. Apple ships iOS with several default modes such as “Sleep”, “Work”, and “Reading”. For example, I might turn off Slack notifications in the “Personal” Focus Mode, and then schedule that mode to activate in the evenings.

Focus Filters, on the other hand, control which data is available inside apps and are linked to a particular Focus Mode. For example, for a to-do list app that allows you to create different lists, you could use Focus Filters to hide your shopping list when the “Work” Focus Mode is active, and hide your work tasks while your “Personal” Focus Mode is active.
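For the curious, here is roughly what adopting Focus Filters looks like from the developer's side, as a minimal sketch using Apple's AppIntents framework. The `TaskListFocusFilter` and `FilterSettings` names are hypothetical stand-ins for whatever a real to-do app would use.

```swift
import AppIntents

// Hypothetical app-side settings store, just for this sketch.
final class FilterSettings {
    static let shared = FilterSettings()
    var showWorkTasks = true
}

// A minimal Focus Filter: when a user attaches this to a Focus Mode,
// the system calls perform() as that mode activates or deactivates.
struct TaskListFocusFilter: SetFocusFilterIntent {
    static var title: LocalizedStringResource = "Set visible task lists"

    @Parameter(title: "Show work tasks", default: true)
    var showWorkTasks: Bool

    // How the configured filter is summarised in the Settings UI.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "Task lists",
                              subtitle: showWorkTasks ? "Work tasks visible" : "Work tasks hidden")
    }

    func perform() async throws -> some IntentResult {
        // Persist the choice so the app can hide or show work tasks.
        FilterSettings.shared.showWorkTasks = showWorkTasks
        return .result()
    }
}
```

The system invokes the intent when the linked Focus Mode changes; it is then up to the app itself to actually hide the filtered content.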

The problem with Apple’s implementation is that it always requires a particular Focus Mode to be active. There is no way to apply Focus Filters or control notifications when no Focus Mode is active. Additionally, Focus Filters are literally filters: they can only reduce content that is already enabled. In my use case, I want to turn off my work email and calendars by default but have them enabled during work hours. I want to disable Slack notifications all the time, except during work hours. This is simply not possible with iOS and represents a fundamental flaw in the design. I can, of course, disable Slack notifications fully, but then I can’t enable them for a particular Focus Mode. I could also painstakingly configure Focus Modes to consume contiguous time blocks throughout the day, but this is error-prone, and there are times when a Focus Mode gets switched off for some reason and I’m inundated with notifications I don’t want. The Apple Watch is also unaware of Focus Filters and will happily show calendar appointments despite the same appointments being filtered out on my phone.

I’m not quite sure how Apple missed the obvious on this one. Perhaps they are stuck dealing with technical debt and the current system is a compromise based on what is technically achievable. Perhaps they genuinely think the current system is adequate. Perhaps I’m the outlier. Either way, I hope they address this issue. Currently, iOS still makes it far too difficult to separate work and personal content.


Why Is My Apple Watch Not Charging to 100%?

Apple have recently introduced a new feature called "Optimized Charge Limit", which aims to lengthen the lifespan of the battery embedded within an Apple Watch.

From Apple's support pages:

With watchOS 10, this feature is available on Apple Watch SE, Apple Watch Series 6 and later, and Apple Watch Ultra and later. Optimized Charge Limit learns from your daily usage to determine when to charge to an optimized limit and when to allow a full charge. Optimized Charge Limit is on by default when you set up your Apple Watch.


The feature builds on the existing optimised-charging behaviour found on the Apple Watch and most other Apple devices. Now, though, if the watch notices that you typically use only, say, 30% of the battery's capacity on a daily basis, yet still charge it every night, then instead of charging to 100% it might stop short at, say, 70% instead. Lithium-ion batteries don't like to be charged above 80% or discharged below 20%; doing so frequently will shorten their lifespan, making a dreaded replacement necessary.
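Apple haven't published how the limit is chosen, but the behaviour above suggests the rough shape of the heuristic. Here's a minimal sketch under that assumption; the function name, thresholds, and buffers are entirely my own guesses.

```swift
// Purely illustrative; Apple's actual algorithm is not public.
// Idea: charge just enough to cover a typical day while keeping the
// battery away from its stressful extremes (below ~20%, above ~80%).
func optimizedChargeLimit(typicalDailyUsage: Double) -> Double {
    let lowWaterMark = 0.2   // don't let the day end below ~20%
    let safetyBuffer = 0.2   // margin for heavier-than-usual days
    let limit = typicalDailyUsage + lowWaterMark + safetyBuffer
    return min(max(limit, 0.5), 1.0)  // clamp to a sensible range
}

// A watch that typically uses 30% of its battery per day would
// stop charging at around 70%.
print(optimizedChargeLimit(typicalDailyUsage: 0.3))  // ~0.7
```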

Hopefully the algorithm employed by Apple is smart enough to work out your routine, so that if it notices you usually go for a long run on a Tuesday morning, it will give you a full charge the night before.

This is a much-needed feature, as the tiny batteries in Apple Watches are more prone to the effects of ageing than those in larger devices. Their small capacity means even a slight degradation has a notable effect, and it also means users are likely to complete a greater number of charge cycles over a shorter time than on, say, an iPhone or MacBook.

That said, the feature is perhaps not visible enough. Various technical support forums and subreddits have been flooded with people who think their watch is malfunctioning. As is often the case with Apple, you are probably best off simply not looking at the battery meter and just using the watch. If you do notice the battery is not fully charged but think you might need it to be, the support document details how to temporarily disable the feature (note: you need to follow the instructions while the watch is connected to power).


Now in the Fediverse

WordPress have enabled support for ActivityPub, so you can now follow this blog on platforms such as Mastodon by searching for imarc.co.uk@imarc.co.uk - let me know if it works by replying to this post.


Microsoft to Discontinue Visual Studio for Mac

In a disappointing move, Microsoft have discontinued Visual Studio for Mac. This comes not long after they rewrote the user interface to be faster and more fluid using native frameworks. Given the depressing state of Windows these days, the lack of a full-featured Mac IDE does not bode well for the long-term future of .NET and other related technologies. For many years, Windows has seemed stuck in limbo, adding pointless features like Dark Mode while still offering nowhere near the number of productivity features Mac users take for granted. I had considered VS for Mac somewhat of a lifeboat for .NET developers wanting to jump ship from Microsoft's burning platform.


watchOS 10: Peaks and Troughs

Another year, and another update to the venerable operating system that runs on millions of watches is here. This time, though, it’s a mixed bag. It’s easy for us technology enthusiasts to become complacent about the fact that we get new features released each year for ‘free’. There is, though, an unwritten contract between Apple and its user base that these free updates won’t regress the product that was originally purchased. Unfortunately, that’s not the case with watchOS 10. While there are some fantastic new features, the overall quality appears to have dropped significantly.

Let’s start with the good. The new design is a welcome change. Having dropped support for the original 42/38mm watches, Apple have been able to flex their design muscles and rethink many of the core apps. Stalwarts like Activity (which, incidentally, has not been renamed to Fitness like its iOS sibling - a plus in my view) now look much more modern, though there is still no way to see a map of a run or cycle on the watch itself - surprising, considering the Apple Watch has shipped with a Maps app since day one. The Workout app remains largely the same, but now asks for confirmation when ending a workout. After eight years of not requiring one, during which I recorded workouts nearly every day, this will definitely take some getting used to. The Weather app looks great, taking inspiration from its iOS counterpart. Overall, Apple have done a fantastic job with the new design.

My favourite new feature is widgets. It’s now possible to scroll the Digital Crown or swipe up from the main watch face and see a list of customisable widgets. This neatly solves the problem of watch faces being severely limited on space, and the fact that many of the nicest-looking watch faces don’t have many complication slots. It’s essentially the Siri watch face, available all of the time with a simple gesture. In fact, I’m surprised the Siri face is still available, as it seems unnecessary now. The downside is that it requires two hands to operate, and so it’s not as convenient as a standard complication would be.

At long last, we have some decent new watch faces. For the past few years the new offerings have been hit and miss. It’s as if Apple sends all its junior designers to the watch face division on day one. Well, those junior designers have finally graduated. The new Solar Analogue and Palette faces are beautiful, and sit well alongside some of the original classics such as Solar and Astronomy.

An iPhone showing live metrics during a cycle workout.

Another useful new feature is that when recording a cycling workout, your phone can be used to display live metrics from the workout. File this under “totally obvious”, because in hindsight it is. This is Apple at its best. Trying to look at a watch when cycling isn’t easy, and can often be downright unsafe! Now, thanks to the seamless integration between watchOS and iOS, an Apple Watch Series 4 purchased in 2018 can be a full-on cycle computer at no extra cost.

But I said it wasn’t all good. The biggest problem has been battery life. With watchOS 9 on my Series 8, I would typically end the day at 50% on days when I did a short workout (30 minutes); with watchOS 10, it’s been down to 20%. “That’s OK,” I hear you say, “as long as you get through the day, that’s all that matters, right?” Wrong. It’s not acceptable for an update to regress the original capabilities of the hardware. If it was possible to get two full days from the watch before, then it should be possible with the new update (battery ageing and the expected reduction in capacity notwithstanding). When I cycled around the Isle of Wight earlier this year, I took it for granted that the watch could deal with a day of cycling. I often run for multiple hours at a time. I want to know the watch can cope with what I paid for it to do in the first place. With watchOS 10, I’m not so confident any more. I’m sure it’s a bug, and it will be fixed in due time, but the fact that Apple didn’t spot and fix it before release is extremely disappointing. I’ve been able to mitigate the issue somewhat by removing features such as the Weather complication and disabling background refresh for all apps. Still, I wonder if the QA department has been focusing too much on the Vision Pro lately instead of the watch!

Playing media now seems unnecessarily complicated. Before, it was clear whether you were playing a song or podcast on the watch or from your phone. Now, when I navigate to the “Downloaded” section and choose a podcast, it plays on my phone! I’m not sure in what world Apple thought someone would want the watch to play audio from their phone by default. The only way to stop it doing this was to walk out of Bluetooth range of my phone and then try playing the podcast again. A common scenario for me is selecting a podcast before a run. A task that took a couple of taps is now going to take a lot of fiddling with Bluetooth settings. I hope this is another bug that will be fixed; if it isn’t a bug, then it’s an astonishingly poor reading of how customers actually use their devices, and another example of poor quality and lack of testing.

If Apple could fix the rough parts, I would recommend upgrading to watchOS 10. Right now, I wish I’d held off. I suspect that after a couple of months, Apple will iron out the rough edges and fix the bugs, so if you’ve not yet upgraded, I’d recommend waiting. If you have an Apple Watch Ultra but rarely use it for ultramarathons, then you might be OK with the reduced battery life. This time last year, watchOS was still the newest of Apple’s computing platforms. Now, though, there’s a new kid on the block: visionOS. It will be interesting to see how much attention the Apple Watch gets going forward. I’m not particularly enthused about the prospect of wearing an M1 Mac on my face, and at the current price, I won’t be trying the Vision Pro for many years. The Apple Watch is still the computer that is with you all of the time, and the only computer that directly senses your body. For me at least, it’s still the most exciting of Apple’s platforms.


Cutting Through the AI Hype

An imaginary prog-rock album cover whose theme is AI

There is so much hyperbole about Large Language Models (LLMs) in the media right now that I'm finding it overwhelming, and I'm someone who works in the field of AI! From claims that AI will put people out of jobs to claims that it will take over and enslave the human race, it's difficult to know where to start. Some say AI should be regulated now; others are happy to let the "free market"1 take its course. It's not easy to navigate, especially when many of the people with strong opinions have their own agendas. This post is really an attempt to briefly answer many of the questions I've asked myself over the past few weeks.

What’s Changed and Why Now?

What's changed is that in recent years the neural networks powering many of the previous generation of speech-to-text and language-classification models have gotten a lot better. Not because they evolved by themselves, but because engineers and mathematicians made them better. In the case of text generation, this means GPT-3.5 and GPT-4 are uncannily good at predicting the next set of words for a given prompt. Text-to-image models like Midjourney and Stable Diffusion are now able to predict a set of pixels resembling a photo-realistic image from a text prompt. In the case of OpenAI and ChatGPT, there has been some genuinely mind-blowing innovation. I'm more sceptical of Microsoft, who, after decades of being seen as a laggard behind Google and Apple in every industry they enter, seem a little too keen to plug OpenAI into all of their products. Still, there is a big PR push from both companies, and Microsoft is a big investor in OpenAI. What's new, therefore, is a breakthrough in the mathematical models which link training samples to predictions. Given the amount of money invested, it's now time to monetise and to secure more investment and/or research funding.
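To demystify "predicting the next set of words" a little, here is a deliberately crude toy. Real LLMs use transformer networks trained on vast corpora rather than word-pair counts, but the task has the same shape: given what came before, emit the most likely continuation. The counts below are made up for illustration.

```swift
// Toy illustration only: how often each word was seen to follow
// another in some (imaginary) training text.
let bigramCounts: [String: [String: Int]] = [
    "the":   ["world": 7, "cat": 3],
    "world": ["is": 9, "was": 1],
]

// Pick the most frequently seen follower of the previous word.
func predictNextWord(after word: String) -> String? {
    bigramCounts[word]?.max { $0.value < $1.value }?.key
}

print(predictNextWord(after: "the") ?? "?")   // "world"
print(predictNextWord(after: "world") ?? "?") // "is"
```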

Will It Turn Against Humans and Take Over the World?

A common refrain among many sceptics of AI is to draw an analogy between humans and the other animals on Earth. We have used our intelligence to dominate all other species and decimate the planet; wouldn't a more intelligent being do the same to humans? It's a convincing argument, but I'm not sure that the likes of Stable Diffusion and ChatGPT, or their successors, should worry us. In fact, I think the way we see these statistical models today will be akin to how some people in the 1950s saw the "electronic brains" that we now call computers. Mysterious and magical. Frightening.

There are two reasons I am not worried. Firstly, I cannot see how the ability to predict the next word in a sentence, or the pixels that make up a convincing image, equates to intelligence in the human sense, or anything beyond human intelligence. Even if we assume that the breakthroughs in language prediction will also be possible in other areas of life (politics, art, engineering), it's still simply producing predictions from a given input. What I can see (and think we've already seen to a certain extent) is how AI might disrupt our society. The way the algorithms that run social media have damaged our democracy, by convincing vast swathes of the population on both the left and the right to believe in nonsense science and conspiracy theories, is an obvious example. But there is no drive, agency or consciousness behind the AI.

The second reason is that history, and even our current political climate, tells us that intelligence is not something we should necessarily fear. Take a few examples of individuals who've caused damage to humankind in the past: Trump, Putin, Hitler, Pinochet. I could go on. None of them are known for their raw intelligence. Charisma and ruthlessness, perhaps. They managed to co-opt other intelligent people to work on their behalf. The most intelligent human beings, from Einstein, Galileo and Lovelace to Lennon and McCartney, are not the ones we need to worry about. I'm far more concerned about humans of average intelligence with a Napoleon complex and access to nuclear weapons than I am about super-intelligent AI.

Will We All Be Out of Jobs?

Unfortunately, I am not as optimistic on this one. I can easily see jobs like copywriting and graphic design being disrupted at the junior level. We've already seen an NYT bestseller with AI-generated artwork on its cover. We're all told not to judge a book by its cover but, let's face it, we all do. In this case the AI-generated image was listed in a stock image library alongside human-generated images, so I doubt the choice was conscious. It was also heavily modified by a human. You have to ask: if the author cared so little about the cover of their book that they were happy for a stock image to be used, rather than commissioning an artist themselves, is there really any loss? They didn't want to pay much for the images in the first place. But somewhere, someone would have been paid something to make that image, and this time they weren't. A junior designer needs to work on small jobs like this in order to gain the experience and skills to move up in their profession. How will they do that now? The problem I see is that with AI taking the low-end, basic work away from these professions, how does a human work their way up?

On the other hand, the pocket calculator allowed mathematicians, accountants and engineers to focus on bigger problems and have the grunt work done for them. Anyone gifted at maths can still find a well-paid job, despite the fact we've had powerful calculators in our pockets for 50 years. Perhaps the same will be true of writing and graphic artistry. The bottom line is that these large models can only generate what amounts to a statistical average of what is already on the Internet. If it were received wisdom that the world was flat, GPT-4 would reliably tell us that the world was flat. GPT-4 has no mechanism to generate anything else, and nor will its successors.

What about other jobs? Programming is often cited as being at risk from GPT, because it can generate code. Yet I have yet to see it generate anything I couldn't have found by searching on Google and opening the first StackOverflow result. When I asked it to write an application to convert files from one format to another, it told me it didn't know enough about their implementation, despite both formats being publicly documented on the Internet. Impressive that it "knew" what it didn't "know" (I wish more developers were like that!). The jury is out, but I am not that worried about software developers all losing their jobs, because I know just how difficult software development can be, even for humans with 30+ years of experience. Implementing a variation on a well-known algorithm in the abstract is one thing, but integrating it into existing business domains, data structures, user interfaces and architectures is something else entirely.

In Summary

I find it hard to get excited about the likes of GPT and Midjourney because I find them emblematic of a tech industry that has lost its way. If someone had suggested 20 years ago that we should build a mathematical model that takes crawled web pages as input and autocompletes text and images as output, and that it would likely spread misinformation and put journalism and other important professions at risk, we would have collectively responded with a resounding "Nope." While I am intrigued that we may have stumbled across a deeper mathematical theory of language, if only by chance, I am not yet excited by the potential utility of such models. Beyond that, I am disappointed that in a world where many people still struggle to feed themselves, that is heading for a climate disaster, and where populist leaders are coming to power, technology - something that was always a cause for optimism when I was growing up in the 90s and 2000s - looks likely to make things worse, not better.

Notes

  1. The free market, of course, doesn't exist, and is generally a euphemism for 'what I want the rules to be in order to suit me'.


In Praise of Apple Music (Yes, the Mac App)

Apple Music has a poor reputation these days. I’m not talking about the music streaming service here, nor the iOS apps that share the same name. I’m talking about the 22-year-old application formerly known as iTunes. You see, it’s the same application really, only with a new name and a fresh coat of paint. Squint a bit (or rather, go to the Songs view and turn on the Column Browser) and you will see it for what it really is: iTunes with a slightly different icon and all of the video and podcast bloat removed.

Having been around since January 2001, it really shows its age in places. Yet unlike Apple Music for iOS, and unlike more modern applications bundled with macOS such as the nine-year-old Photos app, Apple Music for Mac retains many of its power-user features and can be used to manage a local music library effectively, if that’s something you still want to do.

Apple Music retains most of the features from the heyday of iTunes

Yes, it does have many annoying bugs and could certainly do with some investment from the QA department. But thankfully it still has many ‘power user’ features that I admire, and I’m grateful for this. Apple Music can still, believe it or not, rip CDs to MP3 or AAC, and will still fetch CD metadata from the venerable Gracenote CDDB service. This may sound arcane, but some music is only available on CD. (As a more general point, if you’re limiting your music discovery to what’s available on streaming services, you should definitely expand your horizons. Even the old iTunes Store has a much wider range, and streaming - especially Spotify - is notoriously bad for artists.) Once ‘ripped’, the songs are uploaded to iTunes Match and become available on all of my devices, including my Apple Watch.

Apple Music also allows me to manually select which albums I want to keep locally, unlike the Photos app. I can also see the status of both uploads and downloads, again unlike Photos. I can even manually remove the local copies of songs stored in the cloud in order to free up space on my Mac; yet again, such a basic function is not provided by Photos. In addition, I can easily back up my music collection using the Finder, and the data will travel across different filesystems (no special, Mac-only bundle files).

Best of all, Apple Music manages to meld my offline library, tracks synchronised via iTunes Match, and songs added from Apple's music subscription service into one mostly seamless library. It even allows me to create a "Smart Playlist" which can catalogue songs based on whether they are rented through Apple Music or fully owned, as well as countless other criteria such as Genre, Artist, Play Count (which, incidentally, includes plays from all of my other devices), File Size, File Type, Date Added, Last Played, Composer, Year Released, and even how many times I've skipped a song. Yes, I could create a playlist of my most skipped songs. How's that for a desert island disc?
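A Smart Playlist rule is essentially a predicate evaluated over the whole library. This isn't Apple's code, of course; it's just a sketch of what my hypothetical "most skipped songs" playlist boils down to, with a made-up `Song` type.

```swift
// Not Apple's implementation; a sketch of a Smart Playlist rule.
struct Song {
    let title: String
    let genre: String
    let playCount: Int
    let skipCount: Int
    let ownedOutright: Bool  // purchased or ripped, rather than streamed
}

// The "most skipped songs" playlist: songs skipped more often than
// they were played, most skipped first.
func mostSkipped(in library: [Song]) -> [Song] {
    library
        .filter { $0.skipCount > $0.playCount }
        .sorted { $0.skipCount > $1.skipCount }
}
```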

So while it's not perfect, I'm glad Apple haven't released some half-arsed rewrite of Apple Music focused on flogging their subscription. iTunes lives on for now, if not in name, then in spirit.


Sorting by Date Is Also an Algorithm

It's a common refrain in tech circles when discussing Twitter to talk about "The Algorithm". Take this Lifewire article: "How to Turn off the Twitter Timeline Algorithm", or this one by TechCrunch: "Twitter makes algorithmic timeline default on iOS".

In the beginning, the Twitter timeline was a simple list of posts by the people you follow, sorted by date descending. In 2016, Twitter introduced a new timeline that wasn't so transparent. It was seemingly designed to optimise for engagement. All of a sudden, people were seeing posts from people they didn't follow, or posts from days ago that had for some reason been boosted by the algorithm.

The point is, the classic timeline and the "algorithmic" timeline are both, in fact, algorithmic. The first algorithm taught in computer science is often Bubble Sort; sorting a list of posts by date is still an algorithm! In the case of Twitter, or its up-and-coming rival Mastodon, sorting by date may be preferable, but there are possible downsides too. Prioritising something because it happens to have been posted recently is a form of recency bias, after all. The key seems to be transparency.
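To make the point concrete, here is the "classic" reverse-chronological timeline written out as the textbook bubble sort mentioned above (the `Post` type is hypothetical, and any sorting algorithm would do; this one is just the classic teaching example):

```swift
import Foundation

struct Post {
    let author: String
    let date: Date
}

// The "classic" timeline as a bubble sort: repeatedly swap adjacent
// posts until everything is ordered newest-first.
func classicTimeline(_ posts: [Post]) -> [Post] {
    var sorted = posts
    guard sorted.count > 1 else { return sorted }
    for pass in 0..<sorted.count {
        for i in 0..<(sorted.count - pass - 1) where sorted[i].date < sorted[i + 1].date {
            sorted.swapAt(i, i + 1)
        }
    }
    return sorted
}
```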

So enjoy your favourite algorithm, and remember, they're not all bad.