Mapping Exoplanet Traversal for fun (although not for profit) using exotic.py on Big Sur.

The problem with knowing people who are very smart is that occasionally you get dragged into their smart-person shenanigans. Dumb people are more fun in this way; their idea of a good time is to go set off all the illegal fireworks they found in a trash bag next to the freeway inside the nearest dumpster or abandoned refrigerator, which is generally a good time and requires little from the average bystander except a willingness to bring their own fire extinguisher and an ability to watch out for the cops. Smart people, however, will drag you into Intellectual Pursuits, and it’s thus that I’ve ended up banging my head obsessively against trying to get Exotic installed on Big Sur.

"What is Exotic", I hear you cry? Well, it's a clumsy acronym for EXOplanet Transit Interpretation Code, which, in its simplest terms (which are the only terms that I can reliably understand), means that it's a Python package that you use to reduce photometric data of transiting exoplanets into light curves, thus retrieving transit epochs and planetary radii.

Just a simple chart measuring Relative Flux. Which I completely understand. Honest.

Okay, you got me. I copied that last bit from the Exotic GitHub page. I am not a data scientist; I am merely a well-coiffed ape with a dream and what Liam Neeson would doubtless describe as a set of Very Particular Skills – none of which involve understanding what the hell a "transit epoch" might consist of.

Still, Better Minds Than Mine assure me that Exotic is a Python package that examines data from massive telescopes and interprets that data to let you know if there might be a planet zipping around a distant star, and who am I to challenge this claim? How does it do this arcane task? I don’t know. Hell, I don’t think anyone really knows – this kind of thing, as far as I’m concerned, is essentially synonymous with witchcraft. Which has some tonal resonance for me, because getting the thing installed on a Mac running Big Sur required a certain amount of supernatural wrangling all of its own.

For one thing, it requires a long list of Python dependencies (software components required by a system to be in place before you install The Thing You Really Wanted To Install) to get the thing installed, and most of those dependencies have their own lists of dependencies, and after a while it starts to feel like there’s dependencies all the way down. Some of those dependencies install without error or comment, and others fail in the most dramatic way possible – mostly by spitting out 2,700 lines of alarming red text where the precise shade of red has been carefully engineered to provide maximum guilt and terror (red on black is also hard to read, but I’m going to stick with the guilt/terror thing as that means I don’t have to confront my fading eyesight. Having bad ears is quite enough, thank you).

Apple helpfully updates Python with every OS release, which has the unfortunate effect that they’re continuously moving goalposts without telling anyone – so a package that might work just fine with one release might fail utterly after a seemingly trivial point upgrade. Fortunately, you can get hold of a lot of old versions of Python here. Which is what I did – downloading a fresh copy of Python 3.8.6. This, unfortunately, is the non-M1 version of Python and requires Rosetta 2 to be installed on your M1 Mac, but there’s regrettably not much that can be done about that.†

You can, it should be noted, install as many versions of Python as there are (ha ha) stars in the sky. The problem with that approach is that you'll possibly need to switch between them using something like pyenv, but in this case it wasn't important to do a lot of mucking around with keeping options open vis-a-vis Python versions; the prime directive was to make sure that, as far as the Mac was concerned, when someone typed python it was interpreted as the freshly-installed Python 3.8.6, and not the included-with-the-computer-and-hard-to-remove Python 3.9.2.

This was fairly simple to accomplish with a command to write an alias into the .zshrc file:

echo "alias python=/usr/local/bin/python3.8" >> ~/.zshrc

What this does is inject the alias python=/usr/local/bin/python3.8 command into the file that macOS uses to configure the zsh shell, so that whenever you type "python" into the terminal your Mac sort of clears its throat and discreetly directs whatever comes next toward the version of Python that you tell it to instead of the one that it would customarily choose right out of the box.
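To check that the alias actually stuck, you can reload the file in your current Terminal session and ask Python to identify itself – if all's well it'll report 3.8.6 rather than whatever the system would otherwise hand you:

source ~/.zshrc

python --version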

Next, there are three Python packages that should be installed before attempting to install Exotic:

pip3 install wheel

pip3 install importlib_metadata

pip3 install --upgrade keyrings.alt

With those in place, you’ll need to run the Install Certificates command that you should find in the Python 3.8 folder in your Applications folder (otherwise you’ll run into all kinds of SSL/TLS errors. Which are Bad Things).
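Incidentally, if you'd rather not go rummaging through the Applications folder, that same certificate installer can be kicked off from the Terminal – assuming the stock python.org install location, it lives at the path below:

open "/Applications/Python 3.8/Install Certificates.command"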

Finally, you can install exotic:

pip3 install exotic

Note: You may have to run this command twice as I’ve seen it error out on installing one package with a missing dependency that is fixed by a second install.

All things being equal, you’ll be able to type exotic in the Terminal, and see something like this:

† – Way back at the beginning of this article I mentioned that we installed a non-Apple Silicon version of Python (3.8.6), which means that if you're using an M1 Mac then that M1 Mac will be running Python through emulation – in effect, pretending to be an Intel-based computer. This, it should be noted, means that it's running a lot slower than it would be capable of doing with software written for the ARM-based chip in the M1 (later versions of Python are available that are optimized for Apple Silicon, but none of them seem to work with Exotic). This sounds bad – and in reality it's not perfect, but here's some anecdotal feel-good data for you.

The ingenious boffins who came up with exotic also make some test data available so that you can try the thing out to make sure it works. Running that test project on a 2017 i7 MacBook Pro took twenty-two minutes to process and render the test data, accompanied by a chorus of whirring fans and a truly alarming spike in how hot the computer got.

My 12-core Xeon Mac Pro with a bucket of RAM and fast drives took a decently respectable fifteen minutes to chew through the same data and spit out a fancy-looking graph, and because it's largely comprised of fans (and pretty noisy as a baseline) it didn't get noticeably hot or bothered by this procedure.

A bottom-of-the-line, base spec M1 MacBook Air? Five minutes. It took five minutes, and the thing didn’t so much as get mildly warm. Running on an older version of Python while under emulation. Five minutes.

This is… well. Blimey.

Using multiple versions of Python on macOS using pyenv and a series of ugly hacks.

macOS (at least, as of Big Sur 11.2.1) ships with Python 2.7.16 by default. This seems curious when you consider that Python 2.7 was deprecated and no longer supported after December 2020, but there are Reasons for shipping the newest and shiniest macOS operating system with a (much) older version of Python.

For one thing, Python 3 isn't backward compatible, and Python 2 has vast library support that might be critical to your workflow, and not having those libraries available seems like a recipe for disaster. For another thing, you invoke Python 2 with the python command, and Python 3 with the python3 command – and while this seems less obviously an issue it's actually a much bigger problem than you'd expect. After all, how many pieces of code or scripts are out there, undocumented, starting with #!/usr/bin/python? I'm betting the answer to that question is, well, a lot.

So, what to do if you want to be able to run both Python 2.7 and Python 3 on a single install of macOS? Well, there’s a two-pronged way of doing it by utilizing the homebrew pyenv package (which is ingenious and well-supported) along with a little .zshrc hack (which is something I came up with and therefore almost certainly deeply problematic).

You can find pyenv here, and the simple installer for it linked here. In the simplest terms, what pyenv does is intercept any request that's made for python commands and then slide those requests over to the selected version of python. It's extremely clever, and additionally that rare and unusual creature: a GitHub project that's both well-maintained and well-documented. You'll need to install the Xcode Command-Line tools with an xcode-select --install command, but once that's done you can run the installer linked above and it shouldn't cause you much trouble.
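For the record, those two steps boil down to something like this (the second line being the pyenv-installer one-liner – check the pyenv-installer page for the current incantation before pasting it blindly):

xcode-select --install

curl https://pyenv.run | bash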

Once installed, there are a couple of additional required steps. Firstly, you’ll need to make a backup of the .zshrc file in your home directory with ditto ~/.zshrc ~/.zshrcbackup (if you don’t already have a .zshrc file then feel free to skip this part.)

Next, make another copy of your .zshrc file: ditto ~/.zshrc ~/.zshrc2 (This may seem needlessly duplicative, but it’ll come in handy in the ugly hack part of the procedure later on).

Finally, edit your .zshrc file and add these lines onto the end:

export PATH="/Users/daveb/.pyenv/bin:$PATH" (Note: substitute your short username for daveb unless your name is also Dave B.)

eval "$(pyenv init -)"

eval "$(pyenv virtualenv-init -)"

Once you’ve done all the above, quit Terminal, open it again, then install your Python version of choice by issuing the pyenv command, invoking the install option, and then adding the version of Python you want to install like so:

pyenv install 3.9.1

…and that's it. If you want to install other versions of Python then you can run the pyenv install command again with the relevant build, and if there's one particular build that you want to set as the default – say, Python 3.9.1 – then you can make that so with the pyenv global 3.9.1 command. A more complete and in-depth list of commands can be found here.
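For reference, the handful of pyenv commands I find myself reaching for most often:

pyenv versions (lists every version pyenv knows about, with an asterisk next to the one currently in use)

pyenv global 3.9.1 (sets the default Python for every new shell)

pyenv local 3.8.6 (sets the Python for the current directory only, via a .python-version file)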

So, now that all the pretty, elegant stuff is out of the way, it's time to move on to the ugly hack. The problem I've found with pyenv is that it's great for installing versions of Python that are up-to-date, but it really doesn't seem to want to toggle back to the version that Big Sur ships with, and there doesn't seem to be a convenient off switch for the thing. Further, trying to install that old version (2.7.16) doesn't work because Python 2.7.16 is ancient and wicked and pyenv refuses to allow something so arcane to be installed on your machine (despite the fact that it was on there already).

To get around that, I added this to my .zprofile:

alias py='mv ~/.zshrc ~/.zshrctemp; mv ~/.zshrc2 ~/.zshrc; mv ~/.zshrctemp ~/.zshrc2'

Ugh. Just look at that thing.

Like I said, it's ugly, but when you type the alias py into a terminal window then what happens next is that it takes your existing .zshrc file, renames it to .zshrctemp, then takes the .zshrc2 file and renames it .zshrc, and then finally takes the .zshrctemp file and renames it .zshrc2.

In effect, it takes the .zshrc2 file (the one that isn’t set up to enable pyenv) and switches it out with your .zshrc file (the one that is set up to enable pyenv). When .zshrc2 becomes .zshrc, pyenv suddenly no longer works, and your computer defaults to the version of python that it shipped with (in my case 2.7.16) because it suddenly doesn’t know any better. And because the command just switches those two files around, issuing it again turns the tables and re-enables pyenv.

Using Custom Audiograms on iOS (Or: “Where’s Izzy?”).

Normally I like to make these amusing little bon mots tightly focussed on macOS and iOS integrations, tips and tricks, but this one is… a little different.

Instead of talking about macOS, I’d like to talk about Izzy Stradlin. If you can’t immediately picture Izzy (maybe you weren’t paying attention to the rock charts in the late eighties), then he’s this guy:

Still not ringing any bells? Okay, I can see that. That’s fair – sad, and makes me feel very, very old, but fair. You might know him as this guy:

I mean, the album cover is pretty clear on this one. Come on.

Izzy Stradlin was the rhythm guitarist, sometime vocalist, and a primary song-writer for Guns N' Roses – a band that I loved with a rare and utter abandon back in 1988. This is not something I feel I have to defend as a foolish whim of youth; "Appetite for Destruction" is a perfect album, and if anyone were to tell me otherwise then that person would no longer be my friend and there is an additional, statistically decent chance that we would – in due course – resort to fisticuffs. There may even be a donnybrook.

Later Guns N' Roses got pretty wobbly (quality and production-wise), but this album remains flawless because it was recorded by brilliant engineers who had the wisdom to embrace a stripped-down approach; to wit, put Axl Rose in the middle where he could growl and scream and wail at the world, have Slash on lead guitars on the right channel and Izzy playing his distinctive, borderline unnecessarily-choppy and bonkers-aggressive rhythm on the left channel.

Izzy left the band in the early nineties and went on to make a lot of solo records that I justifiably rave about because they are universally excellent, but one day in 2010 he disappeared from my life.

Okay, that’s a little dramatic. He didn’t disappear – as far as I know he’s living somewhere outside Ojai doing… whatever semi-retired rock stars do, and good luck to him. No, I mean that about the time that he put out his last solo record and steadily retreated from the world he also steadily retreated from my left ear, and has remained gone for the last decade.


I am forty-seven years old. There are things that you don’t imagine about being forty-seven that sort of creep up on you without noticing. My back never really completely healed after falling off a horse (my own damn fault) about eighteen years ago. My left knee made a sort of “popping” sound one day while hiking back circa 2014, and at the time I looked at it with a quizzical expression and thought “Huh. Well. That can’t be good.” (Spoiler: it wasn’t.)

Bits of you slow down and stop working and generally you degrade in both small and large ways, and while this is something you entirely expect (because you have no illusions about the predations of time and are not an idiot in that regard), it’s one thing in the abstract to understand that you’re not going to be springing out of bed in fighting form in your middle-aged years and another to wake up one day with the dim realization that this is more difficult than it used to be, that dim realization calcified in your mind as much as in your calcified joints.

And really, I’m not complaining. These are table stakes for seeing another day – they are the price we pay for the stuff we did thirty years ago, and I think most people are happy to pay it if that means that you got to drink and dance and spend your weekends headbanging like a maniac to “Welcome to The Jungle” in a lot of now-long-defunct rock clubs. These were good times – and I’d do them all over again except that I now look like my Dad, can’t pull off the long-hair look, and mosh-pits are probably all COVID hot zones these days. Some things are better left in the past; you just have to embrace the moments gone by and let them go. Take what they give you, and move on with your life.

Unfortunately, what those experiences mostly gave me was hearing damage, which brings me on to Mimi Hearing Test.

Mimi is a free download from the App Store, and in conjunction with the Accessibility options in iOS it can mitigate some of your possible hearing loss – at least, for phone calls and audio played through the iPhone. It creates a custom audiogram that you can point Accessibility at, and in turn Accessibility will run audio through that audiogram so that frequencies that you may be lacking are boosted. The overall effect of this is that – at least in my personal experience – I can now hear Izzy Stradlin again in my left ear for the first time since late 2010.

Here's how to set it up. First, download the app from the iOS App Store and install it on your phone. Then open it. I doubt these are controversial steps. On opening the app you'll see the following:

Find a quiet place in your abode to sit while you run the thing. It's going to test your hearing, so try and ensure that there's as little ambient or environmental noise as possible. I sat in a dark closet at the furthest end of the house from everyone else. This worked well, but I have no knowledge of the particulars of your domestic environment so choose your own spot. No, you can't use my closet. That would be weird. Hit "Continue", and you'll be prompted to connect and select your headphones. Mimi seems to be designed with certain headphones in mind (both in and over ear), but as one of the pre-ordained options is for Apple AirPods (and they're my ear sticks of choice) I opted for that.

Dial the volume on your phone to 50%:

And then begin the test:

This took me a couple of tries – fortunately you can re-do a bad test. The idea is that you tap the "I can hear it" button whenever you hear a sound in each ear, and then release the button as soon as that sound goes away. Each ear is tested with a range of frequencies for a couple of minutes, and this quickly becomes annoying when the sounds of your own breathing conflict with the almost-inaudible sonic whines you're trying to pay attention to. Once you're done you'll get the results, which in my case look something like this:

This wasn’t what I’d call something I was happy about, but it tracked with what I knew already; any audio I’d play with any headphones was always substantially quieter in my left ear, and there are only so many pairs of headphones you can go through before you have to admit that they can’t all be bad on one side and that you, in fact, are bad on one side. So, while it’s jarring to have confirmation of your own failing senses, we’re better off moving on to the fixing-it bit, right? Right.

To do that, open up the Settings app on your iPhone, tap Accessibility, and then scroll down to tap Audio/Visual:

Toggle on Headphone Accommodations:

…and then hit “Continue”. All things being equal you’ll see an option for Audiograms that you can select and use:

Finally, you’re presented with the Recommended Settings screen, which is really just an excuse for the phone to show off how good (or bad) the custom setup you’ve made is compared to the standard. Make sure you’ve selected “Custom” and then “Use Audiogram”:

The difference is jarring. But in a good way. Each time you run the Mimi app you can create a new Audiogram – and what I've shown above is a bit of a cheat because it's actually my third attempt (I fat-fingered the first one, and the second was affected by a plane flying overhead – the resulting audiogram made everything sound hopelessly distorted) – so you may want to tinker a little to get to the optimum experience.

Having the hearing of a thirty-seven-year-old doesn't sound like a game-changer unless you're a forty-seven-year-old. Although, come to mention it, the only way that I'd know for sure how effective this is would be to go and dig further back through my music catalog and break out some of the stuff I was listening to in the early 2000s – and that's a potential chamber of musical abominations that I'm largely glad I've left behind. For now I'm just happy to have Izzy Stradlin back where he belongs, buzzing away in my left ear like a bizarre, insanely, gleefully infectious reminder that the past isn't necessarily always as far away as we imagine it to be.

Making Big Sur a little faster.

The new M1 Macs are things of wonder and delight. Cool, inexpensive and fast. Staggeringly, ludicrously fast. Everyone should have one. In fact, if you’re reading this and you don’t own one then you should probably close this window (I mean, bookmark it first) and go and buy one and then come back and we’ll continue.

You did that? Good. Congratulations! I don’t think I know anyone who is disappointed with the whole Apple Silicon experience except for my friend who, for the sake of argument, we’ll call David. Because this is in fact his name.

My friend bought an M1 Mac mini on my recommendation to do some development work on, and was immediately frustrated that many of the things that he’d want to use don’t work properly (at least, not yet). Cocoapods, Docker, the native version of Homebrew – these are all things that are actively incoming. And that’s fine – these things might be a month or two out from being fully and totally supported and reliably working. He understands that; what seems to bother him is that despite all the purported speed increases, the Mac mini feels… slow.

The problem, I suspect, is Big Sur. Whether you're running the slowest, oldest, least-supported i3 iMac or a $55k Mac Pro there are some things that remain true across the board. Speed is all about perception, and Big Sur has some tangible roadblocks put in place in the name of User Experience – places where Apple has tinkered with the balance between speed and aesthetics.

One example is the wretched rollover delay in the Finder. Here, for example, is what I see when I roll over the (stupidly hidden) spot to reveal the proxy icon for the current folder:

(Pardon the odd artifacts at the top of the image. gifsicle is a fickle beast.)

Ignoring the fact that there's no earthly reason why this icon should be hidden in the first place, there's the matter of the delay before it pops into being. It's a little more than a second, no matter what computer you're using. A little more than a second may not sound like a lot, but if you're spending the day – purely for the sake of argument – doing a lot of filing of documents and general end-of-year/spring cleaning on your Apple IT Consulting business then you're going to end up rolling over proxy icons a lot, and that grain of sand between your toes is going to grate.

Fortunately there’s a simple way of fixing it by adjusting the default rollover delay via the Terminal, thus:

defaults write com.apple.Finder NSToolbarTitleViewRolloverDelay -float 0; killall Finder

This tweaks the delay down to 0 seconds, which ends up looking like this:

Bonkers!
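And if you decide that you actually preferred the stately pause, deleting the override puts things back the way Apple intended:

defaults delete com.apple.Finder NSToolbarTitleViewRolloverDelay; killall Finder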

Still, that’s not enough. There are other delays built into macOS that (depending on your taste) you might want to decrease or dispose of altogether. My personal favorites are reducing the initial delay when pressing a key and then also reducing the key repeat rate. Thankfully, these can both be demonstrated with a single animated gif and tweaked with a couple of simple Terminal commands.

The “before” gif:

This isn’t terrible, but if you’d rather have something more responsive then you can try the following Terminal command:

defaults write NSGlobalDomain InitialKeyRepeat -int 12; defaults write NSGlobalDomain KeyRepeat -int 1

Which – after you log out and in again – gets you this:

I feel the need – the need for… a lot of k’s being typed and deleted.

These are, it has to be said, small hacks. Trifles. Almost inconsequential in the grander scheme of things, and definitely fringe cases where your mileage may vary. Not everyone, after all, is happy with a version of the world where hitting the delete key for a fraction longer than they normally would can result in vast swaths of text being instantly deleted. But on the other hand, they do point to a greater issue; that the way that you work with your computer is wholly subject to the decisions made on your behalf by other people. Apple's fit and finish on their operating systems is… well, if not without reproach (because that's a bold claim) at least the result of rather more careful thought and review than the fit and finish of most of their competitors.
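For what it's worth, if the hair-trigger key repeat turns out not to be your cup of tea, deleting the two overrides (and logging out and back in again) restores the stock behavior:

defaults delete NSGlobalDomain InitialKeyRepeat; defaults delete NSGlobalDomain KeyRepeat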

Still, having some flexibility and the ability to customize the way your Mac operates isn’t a repudiation of Apple’s work. If anything, it condones it. Or, at least, makes it a little faster.

Quitting Zoom on a Mac (or: Zoomkiller™).

Okay. This isn’t some philosophical nonsense about how to tear yourself away from the screen, or how it’s important to compartmentalize your life, or how we’re all engendering negative self-imagery because we’re looking at pictures of ourselves all day or anything of that nature. This, in a very literal sense, is about how to Quit Zoom.

Or more precisely, how to get Zoom to quit. I’m sure that I’m not alone in my frustrations in this department; you’re done with your Zoom call and everyone is saying their farewells, so you hit the “Leave” button in the bottom right hand corner of the Zoom window, thus:

Buh-bye!

and… don’t leave, because you have to click “Leave Meeting” a second time:

I SAID BUH-BYE DAMMIT WHY ARE YOU STILL HERE

I mean, sure, the world is full of horrors right now, but while we're all carrying rocks on our backs with names like "insurrection" and "pandemic" and "looming economic apocalypse" it's the little things that seem to get me down the most. The flecks of grit in one's proverbial sock, and as I spend a lot of every day on Zoom this is the one that chafes me the most. So I decided to do something about it, and that something is a little script/application that I call… Zoomkiller.

Okay, the name is a work in progress. I’m workshopping a few alternatives. I should probably also come up with an icon while I’m at it, because right now it looks like this:

Behold.

The reason it looks like this is because it is, in fact, an AppleScript application. I could have written the thing in Swift (and may actually go that route at some point), but AppleScript (while increasingly archaic) is pretty great for knocking together very simple tools to do very simple jobs.

Telling AppleScript to quit an open application is easy – you just tell it to activate the application and then use System Events to feed it the appropriate keystroke, like so:

If Zoom was something that played nice and quit right away with one simple Command-Q keystroke then that’d be all that was required (and, more to the point, something simple enough that Zoomkiller wouldn’t be required at all), but unfortunately it’s not that simple. When you try and quit Zoom, you get that pesky second “Leave Meeting” button that pops up on the screen – fortunately that can be killed with AppleScript and System Events again:

This does the same thing as the first script, but then additionally tells System Events to go look at the front-most window and click the first button (which is, in this case, “Leave Meeting”). The next step is to save the thing as an AppleScript application by choosing “Export” from the “File” menu and selecting the appropriate options as seen below:

Et voila! There are just a couple more steps to get this thing to run properly. Firstly, you'll need to tell your Mac that it's okay for Zoomkiller to control your computer (i.e., that it's allowed to use System Events to send keystrokes to Zoom to tell it to quit).

First, open the Security and Privacy pane in System Preferences:


Click on the padlock in the bottom-left corner to unlock the prefpane (you’ll need to enter your computer password), then click the “+” icon and navigate to where you put your Zoomkiller application and click “Open”.

Make sure the box next to “Zoomkiller” is checked…

…and that’s about it. I’ve dragged Zoomkiller into my Dock so that at the end of each call I can just tap on the icon – because the application is saved as run-only and because it doesn’t stay open after running it just neatly quits Zoom and then quits itself without any further required input.

PS: Copy/pasteable code below. Enjoy(?)

-- Bring Zoom to the front and send it Command-Q
activate application "zoom.us"
tell application "System Events" to keystroke "q" using command down

-- Zoom then throws up its "Leave Meeting" confirmation, so click the first button in the frontmost window
tell application "System Events"
	tell front window of (first application process whose frontmost is true)
		click button 1
	end tell
end tell
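If you'd rather skip the whole exporting-an-app business, the same script can also be run straight from the Terminal with osascript – just point it at wherever you saved the script (the path below is purely an example), bearing in mind that it'll then be Terminal rather than Zoomkiller that needs the accessibility blessing in System Preferences:

osascript ~/Scripts/zoomkiller.applescript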

Modeling Threats (or, Helen Keller vs. Russian Hackers).

As I’ve made abundantly clear to a lot of people over the years, we should all have been paying more attention to Helen Keller.

Okay, maybe I should clarify that somewhat.

If you were to start talking about Helen Keller to the proverbial person-in-the-street then there are certain touchstones of knowledge that you'll see come into play. Some people will have no idea who Helen Keller is – which I get, because she's a much bigger deal in the USA than anywhere else – so for those folks I'd mention the whole deaf-and-blind thing (which is what most people customarily jump to), as well as the whole Socialism thing (which is an association that significantly fewer people make), but those are kind of table stakes. They're showy and textbook inspirational/surprising, but differently-abled socialists aren't exactly unknown.

No, the thing I’d really draw attention to was her incisive grasp of the nuances of late twentieth and early twenty-first century Information Security, which were as gimlet-sharp as they were eerily predictive – the latter being quite a feat considering that she died in 1968.

What I’m referring to – of course – is this quote:

“Security is mostly a superstition. It does not exist in nature, nor do the children of men as a whole experience it. Avoiding danger is no safer in the long run than outright exposure.”

Helen Keller – “The Open Door”

Now, I’ve used this quote in a lot of talks in a lot of hotel conference rooms near a lot of airports over the years, and once this current plague is over I hope to use it in a lot more, because it’s something that’s absolutely worth absorbing and I like the weird little swag bags you sometimes get when you’re a speaker at a conference because I never seem to have enough pens and novelty iPhone chargers. Pursuing absolute security is like blundering into your nearest National Park, blindly hoping to bump into a Unicorn; no matter what your intentions you’re going to end up cold and tired and wet and disappointed.

Security doesn’t exist. It’s a mental and conceptual model that we’ve created so that we can sleep at night, and nothing more. You are not safe from lightning strikes on clear summer days. You can be as cautious and careful as possible and be rigorous in your use of PPE and distancing and still get COVID-19. A meteor could crash through your house while you sleep. Terrible, unexplained, fatal things happen to people all over the world on a daily basis; sure, sometimes the odds are fantastically slim, but you’re still playing a game with those odds.

It’s usually after making that point that I bring up the next slide in the deck, which looks a little something like this:

When we talk about “security” what we’re really talking about is “the mitigation of risk.”

This is an unpleasant truth, and when I address it in front of the aforementioned crowds in the aforementioned hotel conference rooms I can usually see the audience do one of two things; dutifully nod and go back to screwing around on Facebook (which is what most conference attendees – whose presence is mandated by their bosses – do anyway), or actually start to pay attention (which, as a person standing on a stage who spent two hours the night before rehearsing in the bathroom mirror, is something I heartily approve of).

This, in an admittedly roundabout fashion, brings us around to this story. If you're disinclined to go and read that link then I'll lay out the broad strokes thus: during the current imbroglio that is the SolarWinds investigations, another security firm (CrowdStrike) reported that Russian hackers had used compromised access to the vendor that sold it Microsoft Office 365 licenses in an attempt to harvest emails that – because of the nature of CrowdStrike as a security company – would probably have contained privileged information.

Apple people traditionally like to throw shade at PC people, and as an Apple person I hate being lumped in with that crowd. Talking trash about a company just because you think that its products are inferior to the products of the company that you prefer doesn’t make you right, or sophisticated, or some arbiter of taste. It means that you have an opinion – which is fine – and that your opinion is something that you can’t keep to yourself – which isn’t, and which in turn makes you an asshole. I don’t want to court controversy here, but I’d venture that not being an asshole is a low bar that everyone should really try and clear, or at least strive to.

So, with that in mind you have to step back and consider this story with a little distance. Sure, this sounds bad – and it is bad – but fair’s fair; CrowdStrike have been forthcoming about the attempted breach, and while it’s fun to sling mud and hand out blame there’s really no fault in their actions – nor is there any in the actions of Microsoft (without more information on the nature of the breach on the vendor’s side there’s little value in making accusations and throwing accountability around, but my hunch is that if we’re hearing about all of this then they’ve probably done the smart thing and been transparent about the issue too.)

The problem here was not Microsoft, nor the vendor, nor CrowdStrike. Giving all of them the benefit of the doubt they may have acted perfectly. No, the problem is that if the model you’ve created to ensure your organizational security isn’t correct then no matter how well that model is implemented it’s always going to be subject to compromise. Keller’s maxim is universal. Security is superstition.

This article isn’t really about the Microsofts and CrowdStrikes of the world; I can’t speak to that scale of company because I’m an independent IT consultant in a coastal SoCal town, and because I rarely actually bump into that kind of setup. Amazon and Raytheon and the handful of larger enterprises that have facilities around here aren’t my clients, because they’re dealing with issues of size and complexity that entail a full-time staff of in-house dedicated IT support. I’ve been that in-house guy before, and I’m very happy that those clients aren’t in my base (because I like to take weekends off and I like to sleep nights and because being on call 24/7/365 is exhausting). But there are lessons to be taken here that can be applied to smaller-scale organizations. So:

It’s your data.

Passwords, certificates, login credentials – they’re your data. They don’t belong to anyone else, and they shouldn’t be given to anyone else. Not third-party vendors, not indiscriminately handed out to employees. Not even given to IT consultants.

I don’t keep passwords, because it’s a terrible business practice. Leaving aside the blatantly horrifying liability issues, I firmly believe that clients have the right to fire their IT consultants (and vice versa). I’m fortunate in that I’ve only been fired by one client, and in that case it was less of a firing than a mutual parting of ways (after all, if you’re moving from an all-macOS Server infrastructure to an all-Windows Server infrastructure then there’s relatively little point in keeping the Apple consultant around when the Windows consultant is right there in the mix). On the other hand, I’ve fired a handful of clients over the years, and having both parties able to walk away amicably and secure in the knowledge that nobody owes anybody anything makes that process easier.

Good IT consulting outfits don’t retain your data. If you forget your password and call your IT person, and they can look it up for you in their records then you should fire that IT person immediately and then change all of your passwords – and I mean all of them. The offending IT person can be of the stoutest character and unimpeachable ethical standards, but if they have your data then they’re a threat because if a third party can get to their data then that third party also owns yours. There’s little point investing in locks and alarm systems if the person who maintains the locks and alarms leaves your keys and codes lying around their office for their cleaning staff to see.

Your data doesn’t just live on your computer.

It's not just about the services and credentials that you use inside your organization; it's also about the services and credentials that reside outside your organization. Some of the most critical things that affect your ability to do business are some of the most-often overlooked – a prime example is DNS.

DNS – and this is an immensely stripped down explanation so don’t shoot me – is the mechanism by which the internet knows where computers and servers actually are and what they do. DNS servers tell the world where your website is, and where your email server resides. Unless you’re hosting your own DNS server (which is thankfully a rarer occurrence these days) then your DNS host has the power to – deliberately or not – completely cut off and isolate your organization from the internet.

This sounds like a worst-case scenario, but I’ve seen this a lot more than I’d like; organizations that let third-parties administer their DNS without giving any control to the organization. If I had a nickel for every time I’ve asked a client if they have any documentation or information on where their DNS is hosted and then had nothing in return but a blank, panicked stare then I’d have… I don’t know. A lot of nickels.

And again, that's understandable. The structural mechanics of How The Internet Works are a conceptual handful, and there's no practical need for most people to stay on top of that as a matter of course. But there is a need to have that information on hand when it's needed.
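If you're not sure where to even start looking, a couple of quick Terminal commands will at least tell you who's answering DNS queries for your domain and who it's registered through (swap your own domain in for example.com, obviously):

dig +short NS example.com

whois example.com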

Have a secure repository for your data.

Yes, yes, I know; "Security is mostly a superstition" and so forth. That's a given, but the rest of the Keller quote – the part that I don't generally like to include in the talks at the conferences in the hotels near the airports – runs as follows: "Life is either a daring adventure, or nothing." It's easy to take that as a carefree expression of the vital need to embrace a zest for life, but I look at it as something more chilling. "Daring adventures" are a hell of a lot better than doing nothing, after all. It's better to have a carefully thought-through and protected repository for your data than it is to write it down in a book and put it on your desk, or throw everything in a Filemaker database or spreadsheet on your server marked "Passwords".

(Note: Those are actual examples of things I’ve moved actual people away from doing.)

Take some time and find the right tool for documentation. Something cloud-based would be good; better yet, something with a lot of redundancy and good encryption options. I like IT Glue, but that’s just a personal preference. If you’re at an appropriate scale then look into having something written for you by a decent web/database person – there are options to explore in this space. Just don’t blindly either put everything into one bucket that lives on your computer (which can be stolen/damaged/hacked/just decide to die one day) or equally blindly go and throw it all onto a Google Workspace or Office 365 document.

Know where your keys are.

I don’t mean “keys” in the PKI sense (well, okay, maybe I do, but that’s not where I’m going with this) – I mean the keys to the things that run your business. I’ve already mentioned DNS, but there’s also Domain Registration. Do you know where your domain is registered? Whose account was used for the registration? When it expires? What about organizational Apple IDs used to administer Apple Business Manager, or APNS? What about software licenses? How are you tracking that data? Whose account was used to purchase those, and from what vendor?

That’s a lot of questions – I apologize. But not much; it’s a regrettable truth that when your job often involves going into organizations experiencing systemic trouble then you tend to only see the worst case scenarios, and in those kinds of cases it’s not uncommon to discover that the absolute critical piece of information or credential is locked behind a defunct email address, or originally set up sans documentation by a former employee, or more often than not just missing in action without a trace or a clue.

There’s nothing that can’t be fixed (well, very little that can’t be fixed), but some fixes are well-documented and quickly squared away because there’s a clear chain of information, and other fixes can take literally days of complete downtime and mountains of billable hours. Don’t get me wrong; I enjoy billable hours – I just don’t particularly enjoy writing them for reasons that could have easily been averted.

Make sure that you’re being diligent in how you implement products and services, and that there are established procedures for how those are accessed and serviced. Apple recommends having a specific Apple ID for organizations just for administering Volume Purchasing/MDM, but I’d go further and suggest setting up a specific administrative account that’s used as the contact for everything else – web, DNS, registration, licensing, the whole nine yards. Not an account that’s regularly used by an individual, either – an account that’s purely reserved for that specific purpose and that alone, with critical notifications forwarded to people inside the organization that need to see them.

What Helen Keller got wrong.

To be perfectly fair, there’s not a lot to say here. The only thing I’d throw into the ring would be that – philosophically at least – there’s little value in accepting the “Security as superstition” maxim at face value. Yes, the broad strokes are accurate, but while the idea of safety is something that we’ve constructed with our meaty, inefficient animal brains we’ve also managed to create systems that are more capable of dealing in absolutes. Nobody is going to start declaring Public Key Infrastructure as the greatest invention since fire/the wheel/sliced bread etc, but the fact remains that we live in a world where danger is starting to run on diminishing returns. You can narrow the risks – slice them into thinner swathes than ever before – because now we have better, more finite, stronger tools that we can use to protect ourselves.

These are – as has become abundantly clear over the last twelve months or so – Unprecedented Times. While we’re trotting out tired platitudes I’ll throw “the world is getting smaller” into the ring, because that and the unprecedented-times bit tie in pretty neatly; when we’re able to communicate faster and more completely then our connections contract. They become less nuanced, more immediate, and far, far more polarizing – creating systems so vast that simple fixes are less likely to be attended to and more likely to be overlooked or misunderstood.

Earlier I wrote about Security as an abstract mental model – and I think that’s an important way to consider it. Models are – to my way of thinking – the primary way that we’re able to containerize the outside world and build frames of reference and connections that adequately map our personal constructs of our personal worlds to the reality we actually live in. Both people and organizations exist and integrate with each other by creating and maintaining those models of the world, and with rapid change those are models that have to be updated and checked and refitted on a continual basis – and this applies whether you’re considering correct personal pronoun usage or assessing organizational network weaknesses. The only ways to stay relevant are to be continuously reactive and adaptive in updating and maintaining those models, and attacks like the SolarWinds incident point to bad actors being similarly more determined and focussed.

At the end of the day, the responsibility for your data lies with you and you alone. It’s an uncomfortable truth (after all, it’s much more fun to blame someone else when everything goes horribly wrong) so selecting the right tools and approaches to try to protect that data is something best done carefully and with considered understanding. Your model is never going to be perfect, but the sooner you can accept and internalize that then the sooner you can adopt a critical approach to remedying potential threats.

YouTube-dl and the RIAA

This is, believe it or not, the busiest part of the year if you’re an IT consultant. There are excellent reasons for this; businesses are generally keen to close out budgets, or make purchases and roll out products prior to the end of the tax year, or even just feel the universal urge to do whatever it takes to tie a neat knot around the year and go into the next one loaded for bear. The practical upshot of that is that I haven’t written anything for this thing for a while because I’ve been far too busy running around and actually working (which is exactly the kind of problem you want to have if you’re me).

Still, I have a running list of things I wanted to touch on because I think they’re interesting and because this blog is as much as anything else a resource that I can go back and look at when I need to remember some piece of syntax that gets shoved out of my aging cranium to make space for something more critical. One of those things concerns my first love (at least in a professional sense), which is homebrew.

(Okay, honorary shout-out to Synology as my other first love. It’s the weirdest, nerdiest form of polygamy.)

Homebrew is fabulous. I like to support it financially when I can because it’s a simple and effective way of putting together tools that enable me to do my job. I mean, sure, there are other ways of building programs and tools that are perfectly functional, but if IT consultants are plumbers then homebrew is an organization that just hands out wrenches for free. You can’t beat that value, and you’d be a fool to try. Still, now and again open-source tools run afoul of the rest of the world, and it can be jarring to reach for a wrench only to find that – against all expectation – it’s not where you left it.

I'm referring in this case to YouTube-dl, and the recent debacle over its equally recent removal and reinstatement on GitHub. You can read that link, or I can outline the rough lines of the story, which is pretty simple. YouTube-dl is a tool that you can feed a streaming video URL to, and which will then process that URL to extract the video and audio feeds and download them to your computer. Despite the name it's a tool that works on, well, basically every service that streams non-DRM-encoded video, and while it's (fairly predictably) used to scrape video content from the internet that providers may not want scraped, it also has a raft of legitimate and fair use applications. I work with educators who'll use it to pull academically-licensed video down to machines for presentations and research purposes, for example.

Still, the problematic use cases of the tool (notably the bit about being capable of illegally downloading and saving copyrighted music and video) ran afoul of the RIAA, who complained to GitHub that they were hosting a tool that was in contravention of copyright law, and in turn GitHub pulled the tool and left a lot of proverbial plumbers without proverbial wrenches.

Fortunately, all parties saw sense and restored Youtube-dl within days, but during the outage I had to do some very specific poking around about how to find and build the tool from non-GitHub resources, and in turn how to use it to manually select video and audio formats, which turned out to be rather interesting.

As anyone who’s uploaded video to YouTube will tell you, it’s not simply a process of lobbing it a file and then going and having a cup of tea while it puts all the ones and zeroes onto a webpage for your delectation and delight. That’d be convenient, but there’s more to it; YouTube is accessed by all kinds of client computers and devices over all kinds of connections, and as such it likes to have a lot of different versions of those video and audio files to serve out to those computers and devices. After all, if I’m watching a video on my iPhone on an LTE connection and the only video they can send my iPhone is the same full-resolution 4k file they’d send to my desktop wired to fiber then I’d stand in the rain watching the thing spool for a very long time while I asked myself whether I really needed to spend thirty minutes waiting to watch a cat video, and whether I should have considered my life choices with greater attention to time-management and the ownership and use of an umbrella.

Fortunately, YouTube-dl makes it easy to look at all the different versions of audio and video streams associated with a YouTube video, and then allows you to pick and choose which versions you’d like to download. Let’s start with a classic educational staple – Dr. Richard P. Astley’s rendition of the immortal classic “I Shall Never Give You Up.”

The YouTube URL for this is: https://youtu.be/dQw4w9WgXcQ

If you copy that URL and paste it into YouTube-dl while invoking the -F option then you'll get this impressive-looking list:
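The command in question, for the record, looks like this:

youtube-dl -F https://youtu.be/dQw4w9WgXcQ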

From looking at that list, we can see that there are four audio-only formats, seventeen video-only formats, and one combo option right at the very end. YouTube takes this menu, looks at your connection and the device you're using, and then selects the best options from the list, but we can use YouTube-dl to make our own choices by invoking the -f option. Let's say we want to download the highest-possible quality video file (the 1080p mp4 option – number 137 on the list) and the worst quality audio file (the 49k webm option – number 249 on the list). To download that using YouTube-dl you'd use the command:

youtube-dl -f 137+249 https://youtu.be/dQw4w9WgXcQ

The output will look something like this:

…et voila – you now have a copy of the file in the home folder on your computer.

All talk of plumbers and nonsense aside, I can kind of see the concern about the use of tools like Youtube-dl. It is, after all, a tool that can be used for legitimate and illegitimate purposes alike, and how it's wielded is largely a matter of personal judgment and policy. On a practical level, I suspect that the prospect of it being used wholesale as a mainstream piracy tool is limited by the fact that unless you want to go spelunking into video and audio formats a default invocation of the command will feed you a pretty good copy of a video that's really no better than just viewing the content for free on the internet. Further, if you're of a mind to go digging around and trying to pull down higher-quality files then you'll still usually end up with a sub-perfect product quality-wise, as well as something that will probably eat up a lot of storage space – in short, there are easier and better ways of getting to content than adopting this as a default part of your arsenal…

How Not To Go Insane In A Warehouse (or: replacing code signatures for fun and profit)

I spent most of last weekend in a warehouse in Carpinteria, spouting an ever more specific series of salty oaths and curses.

This isn’t – just so we’re on the same page here – the way that I normally like to spend my weekends. It’s terribly important to maintain a healthy work/life balance (particularly in These Trying Times), so keeping work and personal matters separate is important and a flagpole of mental health, and it’s vital to stay grounded and in touch with the people who are most important to you.

This is by way of saying that when I’m issuing salty oaths and curses on most weekends they are chiefly directed at my family, who are quick and open about returning them in kind.

Still, now and again the nature of honest toil involves going and working on a weekend, which is fine. A lot of substantive IT work gets done at hours when it’s less likely to cause massive disruption. Like most IT consultants, I’m no stranger to walking into a client office at 5pm and walking out at 8am the next morning. Or decamping to an onsite location for a weekend, for that matter. This is the nature of the gig; you can’t make fundamental changes to infrastructure while said infrastructure is being actively… infrastructed. It’s like repairing a car engine while the thing is hauling down the freeway. It can be done, but it’s not going to end well, there are going to be enormously destructive crashes that cost everyone a lot of money and time, and someone’s probably going to end up in the hospital.

So, last weekend should have been pretty straightforward. The migration from the client’s ancient and ailing Mac mini server to a nice, shiny new Synology NAS had been completed without incident – chiefly because Synology makes a solid, well-designed product – and all that was left to do was to install a remote access application on each Mac desktop so that the client could use their cloud-based accounting package. It was a simple matter of installing some applications, doing a little light configuration, then being home in time to sink a couple of cocktails replete in the general glow of a Job Well Done.

Except that, no, it wasn't a simple matter. The remote access application flatly refused to launch on about half the desktops for no discernible reason whatsoever. Same hardware, same exact operating system and patches, but while about half of them worked perfectly, the other half not only refused to launch but refused to even bounce in the Dock.

This is unusual. Well-written applications either run just fine or give you some kind of polite-if-terse indication of why they fail to do so. They don't as a matter of course just sit there, unresponsive, glowering at you from the Dock while you wrack your brain and try and work out what's wrong. A perusal of Console.app showed an error message, thus:

Termination Reason: Namespace CODESIGNING, Code 0x1

…which is the kind of thing that makes your blood run cold once you figure out what it means. Essentially, the program won’t run because the OS has decided that it either isn’t signed (see last week’s article on Gatekeeper) or because its signature is invalid. Downloading a fresh copy of the app from the Mac App Store made no difference, which pointed me in the direction of the OS thinking that the signature was invalid because anything you download from the App Store is, by the nature of the transaction, signed.

So, how to fix?

My first thought was that maybe – somehow – Gatekeeper on those Macs was at fault. Other downloaded apps worked just fine, though, which rather scuppered that theory. My second thought was that maybe there was some issue with the app being flagged as damaged by the Macs, so I tried manually removing the quarantine attribute from the apps using xattr, like so:

sudo xattr -rd com.apple.quarantine /Applications/Microsoft\ Remote\ Desktop.app

(Spoiler – the app was Microsoft Remote Desktop).

Finally, I stumbled across the codesign command (installed as part of Xcode Command-Line tools). I’d run into it before while tinkering around with homebrew, and on reading the man page found that it had options for removing, altering, and replacing existing code signatures. Downloading the Xcode Command-Line tools can be done from the Terminal.app like so:

sudo xcode-select --install

The first move was to remove the existing code signature:

sudo codesign --force --deep --remove-signature /Applications/Microsoft\ Remote\ Desktop.app

Next, now that the existing signature has been removed, we can re-sign the app (using the --force flag to actually replace the existing signature and --deep flag to ensure that any sub-hosted code signatures are also replaced) by issuing the following command:

sudo codesign --force --deep --sign - /Applications/Microsoft\ Remote\ Desktop.app
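If you'd like a little reassurance that the new ad-hoc signature actually took before relaunching the app, codesign will also happily check its own handiwork:

codesign --verify --deep --verbose=2 /Applications/Microsoft\ Remote\ Desktop.app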

Thankfully, this worked like a charm, allowing all parties to return to their regularly scheduled weekend drinking. I mean families. Right? Right.

Let’s talk about GateKeeper

This week has been a quiet news week, which is probably a good thing. What with the election shenanigans raging to and fro I've sort of peered at the news with a cautious, jaundiced eye and been pleased that the default recommended behavior has not – for once – been to actively recoil. When we're living in a world where my news feed is sending me stories about byzantine security measures in macOS and not doubling up on every species and varietal of The Current Apocalypse then I'm prone to taking the win. It's the little things, etc.

Still, some little things are – depending on where your priorities lie – big things. I refer of course to the minor brouhaha about macOS and GateKeeper – the former being a hugely popular and recently updated operating system (you may have heard of it) and the latter being Apple’s ingenious quarantining mechanism designed to keep nasty things from happening to your Mac. The current controversy kicked off with an article from November 12th which pointed out that with the advent of Big Sur/macOS 10.16/macOS 11 Apple was constantly collecting a lot of information about what programs you were opening, where you were when you opened them, what the time and date was, and what computer you were opening them on. Further, it noted that as a partner in PRISM Apple was essentially turning all this data over to The Powers That Be in order that Big Brother can track your every movement.

This, I’m sure we can all agree, sounds Bad. But – as in so much of life – an ounce or two of perspective can often throw things into a different light.

First of all, what the heck is Gatekeeper?

Good Question.

Thanks!

Gatekeeper’s ancestor was a system that Apple put in place back in 2007 which eventually evolved into a two-part mechanism designed to make sure that anything you download and install on your Mac isn’t riddled with malware. Initially it was a pretty basic tool; applications downloaded to your computer were quarantined until you explicitly gave permission to open them for the first time, and provided you knew what you were doing (or were at least prepared to say that you knew what you were doing) the presumption was that nothing was apt to go awry. A year or two later Apple upgraded the system so that Mac OS X would check downloaded applications for known malware threats, and then the whole thing was spruced up again with Mac OS X Lion to incorporate signed apps.

And it’s this mechanism – the checking for signed apps – that’s really the crux of the recent concern. In a nutshell, here’s how the process works.

  1. A developer – let’s call him Dave – wants to write a macOS application. He signs up for an Apple Developer account, goes and bangs out his masterpiece in Xcode, and signs it with a certificate denoting his Developer ID.
  2. A customer – let’s call him Bob – purchases this amazing application. When he runs it, his Mac looks at the application, notes the certificate that Dave signed it with, then sends an inquiry to Apple to make sure that the certificate is legitimate and belongs to an actual Apple-approved developer. Said inquiry takes the form of a hash identifying Dave’s certificate rather than the application itself.
  3. The OCSP (Online Certificate Status Protocol) responder at Apple looks at the hash it’s been sent, notes that yes, the certificate is still in good standing, and then tells Bob’s Mac that the application is okay to run. (There’s a sketch of how to reproduce this check by hand just after this list.)
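If you want to poke at that check yourself, you can knock together a rough approximation with the stock codesign and openssl tools. To be clear, this is just a sketch and not what the OS literally does under the hood, and it assumes the app’s leaf (developer) certificate lands in codesign0 with its issuing intermediate in codesign1, which can vary, but it shows the moving parts:

# dump the app’s certificate chain into the current directory (codesign0, codesign1, …)
# substitute whichever signed app you’re curious about
codesign -d --extract-certificates /Applications/Some.app

# convert the leaf certificate and its issuer from DER to PEM
openssl x509 -inform DER -in codesign0 -out leaf.pem
openssl x509 -inform DER -in codesign1 -out issuer.pem

# the leaf certificate carries the address of the OCSP responder to ask
openssl x509 -in leaf.pem -noout -ocsp_uri

# ask that responder whether the certificate is still in good standing
openssl ocsp -issuer issuer.pem -cert leaf.pem -url "$(openssl x509 -in leaf.pem -noout -ocsp_uri)" -noverify

A “good” in the response simply means the certificate hasn’t been revoked, which is all Apple’s responder is really being asked.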

This system is not without its flaws, but they tend to be of the obfuscatory variety rather than the destructive sort. The worst of the bunch is that occasionally a developer certificate will expire, so when the application is launched the hash pushed to OCSP is refused, and the application frustratingly won’t open. Fortunately, renewing a developer certificate is a relatively simple process.

There’s also been some alarm about the fact that these checks are sent over plain, unencrypted http instead of https, although there’s a chicken-and-egg problem lurking in that complaint: if the OCSP lookup itself ran over https, then validating the certificate protecting that https connection would call for another certificate check, and eventually it’s certificates all the way down, which would at least give all the elephants something to look at.

Still, the idea that your computer is constantly sending a stream of information about what applications you’re running out to The World™ sans encryption isn’t a great look. So much so that Apple has published an updated document on the subject, thus.

It’s comforting to read that kind of thing, but one should also trust and verify. Thankfully, a lot of that kind of heavy lifting is done by better and wiser minds than mine; for example, Jacopo Jannone, who published an article that did a fascinating deep dive into the OCSP process. I’d encourage anyone who’s remotely interested in looking under the hood of their computer to follow his process. I mean, I know that I did; using Wireshark to capture an OCSP request for CodeRunner.app, I was able to pull the serial number of the developer certificate the app was signed with and match it against what was being sent to OCSP. I also noted that once that request went out the first time the app was opened after a reboot, no further requests were sent, even after repeatedly opening and closing the app.
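If you’d like to watch that traffic for yourself, Wireshark’s command-line sibling tshark understands OCSP too. Something along these lines (the en0 interface name is a guess on my part; yours may differ) will show the requests as they go out while you launch an app:

sudo tshark -i en0 -f "tcp port 80" -Y ocsp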

So, a storm in a teacup, then. Apple isn’t tracking your every move via application opening and closing (or if they are then they’re doing a shockingly inefficient and terribly-implemented job of it). There’s still a temptation to disable your Mac’s ability to go talk to OCSP, but that’s a temptation to be metered or avoided. Gatekeeper might seem like some authoritarian mechanism, but it’s a vast improvement on the absence of any kind of check or balance. In a world without rational, transparent security (even the kind that leaves an uncertain taste in your mouth), it’s all too easy to end up with a fully open sandbox where applications can run unmetered and unchecked, and send a lot more information out than the time and date you anonymously open a browser…

Apple Silicon for the Pro market?

Well, today Apple pulled the wraps off their new toys in a manner that surprised almost nobody at all. We got new portables and a new Mac mini (which hadn’t been talked about a great deal by anyone, but seemed a shoo-in on the grounds that the Apple Silicon Dev kit was… also a Mac mini). And these are all great products, and will do very well because they’ll do what they do very well.

What they won’t do very well? Not much, but I can think of one glaring problem if you work in design or video, or do a lot of CAD work – and it’s not really Apple’s fault. What’s the problem? I’ll give you a hint in the form of the accessories available for the new MacBook Pro:

What’s missing here?

Too oblique? Okay, that’s understandable – if you’re looking at a forest and don’t notice that it’s missing a tree, then that’s not on you. Here, I’ll make it easier by showing you the accessories available for the older, Intel-based MacBook Pro:

Ruh-roh.

Now I am – and this is no surprise to anyone who knows me – not what you’d call a world expert on chip design, but it’s pretty clear to me that by putting the entire system (CPU, cache, Neural Engine, fabric, GPU and even the DRAM) into a single package, you’re somewhat boxed into the idea that you’re stuck with integrated graphics. And if that’s the case, then said system-on-a-chip is – by its nature – not going to have any mechanism to go and talk to discrete graphics, whether that’s a graphics card or an eGPU. It’s counter to the design of the thing.

Still, no eGPU support isn’t entirely surprising when you consider the nature of the beast(s). These are, after all, not Pro machines. Yes, yes, I know: the MacBook Pro has “Pro” in the name and is used by professionals, but the 13-inch model isn’t historically renowned as the hard-hitting graphical powerhouse of the line. And, to be fair, the M1’s octo-core GPU generates some very decent numbers – from peering uneasily at screen grabs and doing some back-of-the-napkin math it looks like the thing’ll churn out roughly half the performance of a Radeon RX 580, which while admittedly a long way from the top of the heap isn’t exactly chump change, either.
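(For what it’s worth, the napkin math isn’t hard to reproduce: the figures commonly thrown around are roughly 2.6 teraflops of FP32 compute for the M1’s eight-core GPU against roughly 6.2 teraflops for the RX 580, and 2.6 ÷ 6.2 ≈ 0.42, so “about half” if you squint and accept that raw teraflops are a blunt instrument for comparing GPUs.)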

I don’t mean to dump on these new machines. They’re really, really great products (and I’ve already ordered a couple of Airs for my kids). As a first run, it’s extremely impressive that Apple’s managed to come up with machines that are bound to make Intel go a little pale and wobbly-footed, but it’s also true that having machines this powerful at the low end of the range generates some interesting questions about the rest of the product line. Benchmarks have yet to appear, but based on the claimed speed increases over the older, Intel-based versions of the MacBook Air, MacBook Pro and Mac mini, it rather looks like those computers will cheerfully stomp all over the iMac and iMac Pro in raw performance, and even give the Mac Pro a bit of a turn.

Except when it comes to tasks and pro use-cases that lean heavily on GPU compute, that is, which raises two questions (both of which I have actually been asked this morning):

Are the rest of Apple’s desktop products suddenly lame ducks?

If you’re, say, the manager of a small publishing company with a limited budget, what reason is there to go and buy four new iMacs? After all, there are going to be new, M1-based iMacs coming out at some point.

Are Apple’s Pro computers ever going to be good again?

Following on from the last question – is it even possible for Apple to make a chip that can compete with some of the higher-end graphics cards? After all, the established GPU makers have years of experience and deep benches of R&D expertise behind them, and even assuming that it’s possible to compete, why would you want to buy a pro machine with non-upgradeable graphics?

I won’t lie; this was an awkward conversation. But I’ll put down what I said after a couple of minutes of thought. Maybe – just maybe – we’re thinking about what a Pro machine is, and coming up with some answers that are informed by what we’ve been conditioned to believe instead of thinking flexibly. Maybe we’re looking at it all wrong.

Graphics cards are awful devices. No, really; finicky, phenomenally expensive, prone to failure and oft laid low by software problems (not to mention hot and noisy and wildly, wildly power-hungry). One of the rumors about the new Mac line has been about a supposed new Mac Pro – much smaller – and the feedback I’ve read has solidly fallen into discussions about how there’ll be no room for expandability, adding extra cards and storage and so forth. Maybe we’re looking at this kind of problem the same way that people looked at the first cars and sniffed, derisively, pointing out that there was no place to attach the horses to the front of the thing.

I don’t think we’re going to see Macs (and by association, a lot of the PC market) using discrete graphics in the future. Yes, there are people who upgrade their pro machines with new-and-improved hardware as time goes by, but I’ve worked with those clients for the thick end of two decades, and the vast, vast majority of them? When they’re ready for an upgraded graphics card, they look at the budget, look at the depreciation schedule, and just buy a new computer.

There’s a reason that Apple rolled out Apple Silicon the way that it has: consumer/prosumer machines first (because the M1’s secret weapon is its absurdly low power footprint), and then, later on, follow-up products with significantly more GPU cores. After all, if a Mac mini with eight GPU cores can come within punching distance of a decent graphics card, what can a twelve-core part do? Or a sixteen-core? Or a thirty-two-core?

My money says that we’ll see an Apple Silicon iMac within the year, with graphical performance that’ll jump up and down all over the current iMac range. In the meantime, though, I think there’ll be a lot of difficult decisions to make about sticking with Intel-based Macs…