I started writing this blog post months ago, with the intention of finishing it in time for Apple’s next iPhone event, which, as it happens, is now only a few days away. In this post, I want to reflect on the experience of upgrading to Apple’s latest and greatest iPhone (at the time of writing) and how doing so, along with iOS 14, has changed my phone-based workflows.
Let’s get into it now, because there isn’t a lot of time!
iPhone 12 Pro Max
My last phone was an 8 Plus, which in turn followed on from a 5S. The longer the gap between models, the more has changed and the steeper the learning curve. In this case, the jump in features between the 12 Pro Max and the 8 Plus is pretty big, so how do these two models compare with each other?
FaceID vs TouchID
Apple couldn’t have predicted the COVID pandemic when they first implemented FaceID. On the face of it (pun intended), FaceID was an improvement over TouchID when it made its debut with the iPhone X way back in 2017.
Personally, I was wary of this new technology, which is one of the reasons I upgraded to an 8 Plus that year and stuck with the proven TouchID. However, the technology evolved to become pretty reliable, so when I looked to upgrade I didn’t have much choice but to jump to FaceID.
Having lived in Montreal for the best part of the last decade – a city which spends around one third of the year in bitterly cold weather – one annoyance with TouchID is that it doesn’t work with gloves! In this scenario, FaceID is a godsend for quick access and seemed like a big quality-of-life improvement over what I’d become accustomed to.
But now we’re all wearing masks, and because of this FaceID becomes next to useless. Worse still, FaceID sometimes half-recognises a face partway through typing in my passcode, causing the keypad to briefly vanish before returning once the hardware realises it can’t recognise a face after all. This means having to re-type the passcode from scratch.
This is most frustrating when wanting to quickly pay for something with PayPass when there’s an impatient line forming behind you. As someone who takes digital security seriously, I use an alphanumeric passcode which is cumbersome to type.
The same annoyance holds true for long train journeys or flights, where wearing a mask means continually having to enter your passcode.
Apple have gone part-way towards alleviating this issue by allowing the Apple Watch to act as a biometric proxy, bypassing the need for both FaceID and typing in a password when the device is being worn. However, this doesn’t help those of us who aren’t Apple Watch users.
My hope is that Apple adopt the iPad’s TouchID in the next iPhone generation, as that approach doesn’t interfere with the iPhone’s current button-free front display.
Overall, I can’t be too annoyed at Apple for FaceID’s shortcomings when wearing a mask. They couldn’t have predicted the pandemic any more than I could have done. But the fact remains the technology isn’t really suited to a world of mask wearing.
Bye-bye 3D Touch, Hello Haptic Touch
3D Touch was certainly an odd feature which confused a lot of people. The idea was that iPhones with supporting hardware could interpret how hard you were pressing on the screen, expanding the touch paradigm to enable more features.
For example, if you pressed normally on a home screen app it would launch as usual, but if you pressed hard on it, a widget and menu would pop up, letting you “peek” into the app or jump directly to one of its features via a shortcut.
3D Touch was sprinkled throughout iOS, but with little consistency. This is probably what caused the feature to fail, as not only was there no way to know what force-pressing on something would trigger, but it was also possible to trigger the feature by accident.
Personally, I loved this feature. One of my favourite tricks was to force-press on the on-screen keyboard, which would then let you move the cursor around like a mouse. If you force-pressed again without taking your finger off, you could easily highlight text for copying. As the iPhone’s text highlighting leaves much to be desired, this was a feature which became ingrained into my muscle memory.
Apple eventually abandoned 3D Touch, replacing it with Haptic Touch: a software rather than hardware version, where you simply hold down on the screen to get a similar effect. However, the keyboard cursor navigation has lost its awesome text-highlighting trick, and Haptic Touch is a pale shadow of its precursor – although better than losing the feature altogether.
Replacing That Old Muscle Memory
Another thing which changed back in the happy pre-COVID times of 2017 was the removal of the home button, replaced with the various swiping gestures which arrived with the iPhone X and went on to become the new standard for iPhones.
Making the sudden leap from an 8 Plus to a 12 Pro Max meant I was very late to the swiping party. This involved un-learning years of home-button-based muscle memory in favour of the new way of doing things.
Frustrating at first, I’ve now built up the new reflexes required to navigate a modern iPhone, but in all honesty I still miss the unambiguous nature of the home button. As a reminder, here are some of the more common actions:
| Action | Home Button | Modern Way |
|---|---|---|
| Go home | Single click | Large swipe up |
| App switcher | Double click | Medium swipe up |
| Control Center | Swipe up from bottom | Swipe down from top-right |
| Notification Center | Swipe down from top | Swipe down from top |
| Reachability | Double tap | Swipe down from bottom of screen |
One of the unique features to the 8 Plus (and the 6 Plus and 7 Plus before it) was the ability to use it completely in landscape mode – like an iPad.
I suspect I was one of the few people who liked to do this, but there was something cool about having the home screen in landscape mode, saving you from changing the screen’s orientation each time you switched from a landscape app back to the home screen. It was a nice feature, but with the arrival of widgets it would only work if you kept a home screen exclusively of apps, as adding a widget would force the layout to revert to portrait.
The introduction of “the notch” means less space to show icons along the status bar. The positive is that you finally don’t have to keep looking at your carrier’s logo, an aesthetic which has irritated me for as long as I’ve owned an iPhone.
The negative is that you lose some useful information, such as the location indicator, whether your VPN is enabled, and – most annoyingly – whether your Bluetooth headphones are connected.
Granted, you can swipe down into Control Center to get this information, but I find it a small annoyance nonetheless when headphones can be tricky about connecting.
I think I can sum this up by stating that MagSafe is the most wonderful addition to the new iPhones. Since I bought a MagSafe case and a desk stand, I haven’t looked back.
So, which do I prefer overall? There’s no arguing that the 12 Pro Max is light years ahead of the 8 Plus in terms of features, but I do miss the simplicity of the home-button iPhones. In a perfect world, Apple would bring back the home button, and I wouldn’t mind sacrificing a small amount of screen real estate if they did. I’m keeping my fingers crossed that in a few years we’ll see a home button embedded under the screen, and we can all have the best of both worlds.
Current Home Screen
Someone (I think it may have been Myke Hurley) said home screens shouldn’t contain apps, they should contain actions. When unlocking your phone, one should be intentional about what one wants to do. iOS 14 helps with this by allowing you to place, in addition to apps:
- Shortcuts – These perform a specific action
- Widgets – These give information at a glance, and often negate the need to dive into an app
Using these three types of objects – apps, Shortcuts and widgets – turns home screens from mere app launchers into powerful dashboards. Let’s take a look at mine.
In the top left we have what I think of as the core apps:
For me, Clock and Calendar act as really small widgets rather than app launchers, because at a glance they show me the time and the current date respectively. The Phone app icon shows a badge when I have missed calls, and Due shows a badge for overdue reminders. These four icons alone tell me an awful lot of useful information.
Another subtlety is I don’t actually use Calendar as my main scheduling app, as I much prefer Fantastical. However, Fantastical doesn’t have a dynamic app icon showing the date, so my workaround is to use a Shortcuts automation which simply opens Fantastical whenever I launch the Calendar app.
Moving into the top right quadrant we have a stack of widgets:
- Fantastical – Shows me my next events
- Widget Wizard – Shows me upcoming anniversaries and important days
- Carrot Weather – Today’s forecast
- Carrot Weather – Weekly forecast
- Carrot Weather – Weather map
Again, this stack gives me useful information at a glance, without the need to jump into an app. Setting it as a “Smart Stack” in theory means it’ll have a better shot at predicting which widget I want to see when unlocking my phone.
On the left of the center, there’s another stack of widgets. This time the focus is on media:
- WidgetPod – An all-in-one widget for both Apple Music and Spotify
- Castro – For podcasts
- Dark Noise – For background ambience
- WidgetPack – Powerful Shortcuts-based widget, but I use it as a photo frame
On the right-hand side, we have some more icons:
- A folder of Shortcuts & apps
The folder contains items I access semi-regularly.
Although the icons are themed consistently and most are Shortcuts, there are a few apps in there too:
The “iPhone” Shortcut pops up a menu with some quick links I access frequently which relate to management of the phone itself.
At the bottom is a stack of regularly used Shortcuts and quick app launchers.
Finally, down on the Dock, I have Messages, Safari and Things, along with a personal Shortcut I built called “New Seed” which intelligently adds an item to Things.
Over the course of the past year, I played around with various home screen widget setups, but in the end I settled on just two home screens, with the second screen being three Siri App Suggestions widgets. As each widget is 4×2, stacking three on top of each other gives you a full home screen of apps.
I’ve found Siri App Suggestions to be exceptionally good at surfacing all the apps I tend to want access to at a given moment in time, and it’s clever enough to not show any duplicates across the three individual widgets.
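The duplicate-free behaviour can be modelled as a simple algorithm. This is just an illustrative sketch of how I imagine the stack gets filled (not Apple’s actual implementation): take a ranked list of suggested apps and fill three 4×2 widgets, eight slots each, skipping any app that has already been placed.

```python
def fill_widgets(ranked_apps, widgets=3, slots=8):
    """Split a ranked suggestion list into `widgets` panels of `slots` apps,
    skipping duplicates so no app appears twice across the stack."""
    seen, panels, current = set(), [], []
    for app in ranked_apps:
        if app in seen:
            continue  # already placed in an earlier panel
        seen.add(app)
        current.append(app)
        if len(current) == slots:
            panels.append(current)
            current = []
        if len(panels) == widgets:
            break  # all three widgets are full
    if current and len(panels) < widgets:
        panels.append(current)  # last, partially filled panel
    return panels
```

Since three full 4×2 widgets hold 24 slots, the second screen ends up showing 24 distinct apps.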
iOS 14.3 also introduced a Shortcuts action I’ve been wanting for a very long time: The ability to set wallpapers.
Combining this with Shortcuts automations gives my phone a day and night theme.
- Day Mode:
- Each day, I get a new wallpaper on my home screen chosen sequentially from a directory of wallpapers in iCloud.
- The lock screen wallpaper rotates weekly.
- Night Mode:
- The home screen uses one of the Club Macstories backgrounds designed by Silvia Gatta.
- The lock screen wallpaper rotates daily, with the night wallpapers themed to show off the phone’s OLED true-black abilities.
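The selection logic behind these rotations boils down to a simple calculation. The real thing is a Shortcuts automation, but here’s a minimal Python sketch of the idea, assuming a sorted list of wallpaper filenames (the filenames are made up for illustration):

```python
from datetime import date

def pick_wallpaper(wallpapers, today, period_days=1):
    """Pick a wallpaper sequentially, advancing every `period_days` days.

    `wallpapers` is a sorted list of filenames (e.g. from an iCloud folder);
    the index wraps around so the rotation repeats once the list runs out.
    """
    periods_elapsed = today.toordinal() // period_days
    return wallpapers[periods_elapsed % len(wallpapers)]

papers = ["dawn.png", "forest.png", "ocean.png"]  # hypothetical filenames

# Home screen: a new wallpaper every day.
daily = pick_wallpaper(papers, date(2021, 8, 30), period_days=1)

# Lock screen: rotates weekly instead.
weekly = pick_wallpaper(papers, date(2021, 8, 30), period_days=7)
```

The same function covers both schedules; only `period_days` changes between the daily home screen rotation and the weekly lock screen rotation.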
Maybe it’s silly waxing lyrical about a wallpaper, but I absolutely love this dark mode background. I like the way it frames the icons and widgets to make the screen look more like a dashboard, and the central background colour blends perfectly with the backgrounds of the Shortcuts and Batteries widgets, so each element appears to float in a space unconstrained by its usual box.
One annoyance is that Shortcuts won’t let me change the appearance between light and dark modes via automations unless the phone is unlocked. So, rather than having an all-in-one Shortcut to achieve this, I have to sync the Shortcuts schedule with the appearance schedule.
Another annoyance is iOS 14.6 somewhat broke how the Shortcuts wallpaper action operates. Thankfully, it was possible to fix this with a custom Shortcut.
I’ve also made a Shortcut to manually toggle themes, as demonstrated below:
This is probably the longest I’ve gone without tweaking my setup, mainly due to the introduction of widgets in iOS 14 and finally being able to find an ideal layout on my first screen with everything I need whilst also being aesthetically pleasing.
Going forward, I want to experiment with context-aware home screens once iOS 15 arrives, and we’ll discuss this more later on.
Back Tap Quick Search
Back Tap is an accessibility feature which lets you trigger an action with either a double or triple tap on the back of the phone. The kinds of actions possible range from something as simple as mimicking a swipe up to return to the home screen, to running a Shortcut.
I currently have a triple tap assigned to launch Spotlight from whichever app I happen to be in at the time (otherwise I’d have to go back to the home screen and then drag down). In my experience, the double tap is more likely to trigger by accident, which can be annoying, but the triple tap is fine.
Intentional Siri
As a long-time iOS user, I remember first meeting Siri when it was a standalone app. When Apple purchased Siri and baked it into iOS 5, it remained a bit of a novelty with limited usefulness. It’s no secret these days that Siri tends to be unreliable compared to Alexa and Google Assistant, and in the past its flakiness put me off using it at all.
But I’ve discovered that if you limit Siri to the specific things you know it’s good at, it can be a time-saving tool. It’s what I refer to as “Intentional Siri”.
For example, asking Siri “Remind me to…” is in fact a pretty powerful action when combined with other apps, something I only realised when recently listening to the Automators podcast. In itself, asking Siri to remind you of something just adds a new reminder to your default Reminders list. That doesn’t sound very interesting until you consider that various apps plug into the Reminders app, using it a bit like an API sitting atop a database. These apps collect items according to certain criteria:
- If I add an item with no time or location context, Things will pick up the item and add it to its inbox for later processing. For example: “Hey Siri, remind me to organise my bookshelf”. These are the kinds of items which need to be scheduled, so the Things inbox is the ideal destination for them so I can look at them during my next weekly review.
- If I add an item and give it a time, Due will pick it up and (very persistently!) remind me when the deadline arrives. For example: “Hey Siri, remind me to empty the dishwasher in an hour from now”.
- If I add an item and give it a location, the Reminders app will handle the notification as usual. For example: “Hey Siri, remind me to water the plants when I get home”.
What you’ll notice from the above is that the two main apps I use for task management and reminders (Things and Due) are intelligent enough to ignore reminders which don’t have the right context, and collect the ones which do. So the “smart” element here ironically isn’t Siri itself – Siri is a dumb pipe which collects anything I throw at it – it’s the apps which know what to pick up.
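The routing behaviour can be sketched as a simple dispatcher. To be clear, this is an illustrative model of how the three apps filter the shared Reminders list, not any real API; the field names are stand-ins:

```python
def route_reminder(item):
    """Decide which app should handle a reminder, based on its context.

    `item` is a dict like {"title": ..., "due_time": ..., "location": ...}.
    Each app watches the shared Reminders list and picks up only the
    items matching its own criteria.
    """
    if item.get("location"):   # location context -> native Reminders notifies
        return "Reminders"
    if item.get("due_time"):   # time context -> Due nags until it's done
        return "Due"
    return "Things inbox"      # no context -> triage at the weekly review

# "Hey Siri, remind me to organise my bookshelf"
route_reminder({"title": "organise my bookshelf"})  # -> "Things inbox"
```

Each branch corresponds to one of the examples above: bare items land in the Things inbox, timed items go to Due, and located items stay with Reminders.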
Another area where Siri excels is launching Shortcuts. There’s no ambiguity in your intentions, and it’s 100% reliable. For example, if I have a Shortcut called “Dim my Desk Lamp”, invocation is simply a matter of saying “Hey Siri, dim my desk lamp”. Siri will also present any menus in the Shortcut, so you can make decisions with your voice as the Shortcut runs.
There are definitely two types of voice assistant users. The majority just want to ask a question in natural language and have the assistant establish context and interpret their intention; sadly, this is where Siri has always fallen short. The second, less typical, type is the power user who wants to trigger specific workflows via their assistant, and it’s in this less common usage where Siri has nailed it.
Using iOS with Keyboard and Mouse
By this point, it’s common knowledge that an iPad can be used with a keyboard and trackpad, making it a desirable laptop replacement for some. What’s less well known is that you can take the same approach with an iPhone, albeit with some caveats.
Why would you want to do this? Attaching a keyboard and mouse to the iPhone seems like trying to force a square peg through a round hole, and there are some bugs which make the experience less than ideal.
Personally, I’ve tried this a few times when sat in a cafe waiting for a bus on a rainy day. It’s feasible, but when you consider the need to carry around a keyboard, mouse and stand, one has to question whether it wouldn’t be easier just to pack a laptop instead. Perhaps it’s useful if you want to do some focussed work without the distractions of a laptop, but I’d also argue the iPhone is much more of a distraction machine than a laptop anyway.
It’s great to see Apple support this functionality on the iPhone, but it isn’t polished and is best left as an accessibility feature.
iOS 15, macOS Monterey and the Future of Shortcuts
As a heavy Shortcuts user, I can’t overstate how excited I am to hear that Shortcuts is finally coming to the Mac. This is going to be a huge efficiency improvement, as users will no longer need to duplicate workflows on each device using different technologies.
For example, it’s really annoying to have a Shortcuts-based workflow working perfectly on the iPhone, only to have to re-create it in an app like Alfred on the Mac.
Excitement aside, based on Apple’s past OS releases I’m very cynical that day one will be stable and bug-free. Therefore, as excited as I am, I’m going to try and hold off until at least the .1 release unless the reviews indicate otherwise. And based on what I’ve been hearing about even the latest beta, I’m not very hopeful.
Most likely once I do upgrade there’ll be a significant overhaul of the setups discussed here, not least because of “Focus”, the new feature which allows you to dynamically change home screen layouts based on criteria such as time of day or location.
(As a side note, I plan to update my Shortcuts collection once I’m established on iOS 15. The decision to keep them on GitHub made sense when Shortcuts could be shared as files, but less so when Apple changed its mind and decided they could only be shared as links. I hear iOS 15 will once again allow file sharing, so my public collection is long overdue an update later this year.)
So, there we have it. More stuff coming on this blog over the coming months, so please stay tuned!