A Story Half-Told

Back in November I wrote about how the introduction of the first M1 Macs put extra pressure on Apple to differentiate the iPad Pro from the recently upgraded iPad Air. The Airs were colorful, performant, and supported all the new accessories. Meanwhile, the new M1 MacBook Air was light, blazing fast, and could run iOS apps. What would compel people to buy an iPad Pro?

In that post, I listed 10 things I thought Apple could do to make its pro tablet stand out in the line-up. This week, Apple addressed one and a half of them, and set the stage for a few more. It upgraded the port to Thunderbolt/USB 4 (but didn’t add an additional port like I hoped), added 5G, and gave the iPad Pro the very same M1 chip that powers its new Macs, making it more than capable of running things like Xcode, Final Cut, Logic, etc. The port could potentially point toward things like better external display support and fast Time Machine backups.

Disappointingly, the iPad Pro presentation lacked the colorful, whimsical joy of the new iMac introduction (though I was definitely impressed by the production quality of the M1 chip heist). Apple has doubled down on iPad Pros being Serious Business, which is just too bad, because literally everyone I know would love an iPad Pro in some other color than gray. In fact, I find myself in a strange position—the new iPad Air made me excited for the iPad Pro, which in turn disappointed me enough to make me hopeful that the next iPad Air will be released in some even more vivid colors. Apple has become a company of a thousand SKUs…you can’t tell me they can’t give us some more gosh darn hues. But, I digress.

Once upon a time, Apple made an outrageously powerful, desktop-class tablet, with artificially limited software and I/O.

…and then what?

Well, we have to wait until June 7 to find out. Or do we? The iPad’s future is just as wrapped up in the current anti-trust hullabaloo as it is in iPadOS 15. Will developers be allowed greater freedom to innovate without being fearful of App Review? Will Apple finally shift its focus to eliminating actual multi-million dollar scams and fraud instead of nitpicking honest developers who desire to follow the spirit of the law, if not the letter (which is usually pretty vague to begin with)?

If Apple is willing to give App Review a complete overhaul and also manages to release at least one first party “pro” app for iPadOS this June, I think the iPad Pro’s story will take a happy turn indeed. For now, however, it remains a half-told tale of wasted potential—a sleek, expensive “what if?”

How to Set Up Core Data and CloudKit When You Haven’t the Faintest Clue What You’re Doing

Note: This was posted before WWDC 2021, so if major changes were made to Core Data + CloudKit, they aren’t reflected here. This right here is just pure dumpster fire all the way down. Also if you’re an Apple engineer…I’m sorry.

When Apple introduced changes to Core Data + CloudKit integration in 2019, they sold developers on a dead-simple API: add iCloud sync to your Core Data app with “as little as one line of code.” That one line, of course, is simply changing NSPersistentContainer to NSPersistentCloudKitContainer and enabling a few capabilities in the project settings. Boom, done! And in fact, Apple’s “Core Data –> Host in CloudKit” SwiftUI project template does those things for you, so you’re good to go, right?
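
For the record, that one-line change looks something like this (I’m using “YarnBuddy” as the model name here because that’s my app; yours will differ):

// Before: plain local persistence
// let container = NSPersistentContainer(name: "YarnBuddy")

// After: the "one line" that opts you into CloudKit syncing
let container = NSPersistentCloudKitContainer(name: "YarnBuddy")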

Turns out, if you want to sync Core Data-backed data between devices and have those changes reflected in your UI in a timely manner, you have some more work to do. To figure out what that work is, you can’t look at Apple’s Core Data templates. You have to look at their sample code.

My SwiftUI app was created before Apple even added a SwiftUI + Core Data project template, so I created a class called “CoreDataStack” that has a shared instance. If you use the template, that becomes a struct called “PersistenceController.” I’m sure the struct is SwiftUI-ier, but the class from Apple’s sample code (which does not use SwiftUI) makes more sense to my brain, so I went with that.
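
In case it helps, here’s a rough sketch of that class’s skeleton (simplified, and hedged accordingly: the “context” convenience property is just my shorthand for the view context, and everything from the steps below gets wired up inside that lazy initializer):

import CoreData
import Combine

class CoreDataStack {
    static let shared = CoreDataStack()

    /// Convenience accessor for the view context.
    var context: NSManagedObjectContext { persistentContainer.viewContext }

    lazy var persistentContainer: NSPersistentCloudKitContainer = {
        let container = NSPersistentCloudKitContainer(name: "YarnBuddy")
        // Step 1: set the store description options here.
        container.loadPersistentStores { _, error in
            if let error = error {
                fatalError("###\(#function): Failed to load persistent stores: \(error)")
            }
        }
        // Steps 2 and 3: configure the view context and subscribe to
        // remote change notifications here, then return the container.
        return container
    }()
}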

Step 1: Make your container lazy and set some important options

In Apple’s sample code, you’ll notice that within the persistent container’s lazy initialization, two options are set on the container’s description. Include these.

guard let description = container.persistentStoreDescriptions.first else {
    fatalError("###\(#function): Failed to retrieve a persistent store description.")
}
// Turn on persistent history tracking so changes can be replayed and merged later.
description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
// Ask Core Data to post .NSPersistentStoreRemoteChange notifications when the store changes.
description.setOption(true as NSNumber, forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)

If you don’t set the first option, you’ll regret it. Look, I don’t even really understand what it does; I just know that somewhere down the line, you’ll find some dumb way to break sync, and when you finally get it working again, you’ll find that only managed objects created after this key was set will sync properly. Every object created before NSPersistentHistoryTrackingKey was set will stubbornly refuse to sync unless you modify it and re-save it, which is a giant pain in the derrière. I mean, at least that’s what my…uh…friend told me.

The second option is the first step toward receiving notifications when magic cloud stuff happens. You’ll subscribe to that .NSPersistentStoreRemoteChange notification later, but for now, just make sure the option is set.

Step 2: Stir in some of this stuff that I have a super weak grasp of

After your container loads its persistent stores, but before you return the container itself, these lines are also important:

// On conflict, prefer the in-memory object's values, property by property.
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
// Tag this app's transactions so they can be filtered out of the history later.
container.viewContext.transactionAuthor = appTransactionAuthorName
// Automatically merge saves from background contexts into the view context.
container.viewContext.automaticallyMergesChangesFromParent = true
do {
    // Pin the view context to the current generation of the store.
    try container.viewContext.setQueryGenerationFrom(.current)
} catch {
    assertionFailure("###\(#function): Failed to pin viewContext to the current generation: \(error)")
}

There are several merge policies; this one resolves conflicts property by property, with in-memory changes trumping the store’s version. You can read about the others in the docs.

Again, I barely understand this stuff. For my purposes, I set “appTransactionAuthorName” to the name of my app’s container, which was simply “YarnBuddy.” From what I kinda understand, setting the transaction author here allows me to later filter for changes that weren’t created by my app on this particular device and act on them.
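
For reference, that’s just a string constant declared somewhere the stack can see it:

// Tags this app's transactions so we can filter them out of the history later.
let appTransactionAuthorName = "YarnBuddy"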

Now, I’ve always had “automaticallyMergesChangesFromParent” set to true, but what I didn’t realize is that it doesn’t just refresh your view hierarchy immediately when a change occurs. Maybe it should, but for me, it doesn’t. That’s where the remote change notification comes in.

Step 3: Dip your toes into Combine for a hot second and subscribe to notifications

I put this code right before “return container.”

NotificationCenter.default
    .publisher(for: .NSPersistentStoreRemoteChange)
    .sink { self.processRemoteStoreChange($0) }
    .store(in: &subscriptions)

And somewhere within the class I have declared this variable:

private var subscriptions: Set<AnyCancellable> = []

Make sure you import Combine at the top. I know extremely little about Combine at this point. It’s number one on my list of things to learn, and I plan to start with John Sundell’s “Discover Combine” materials.

We’ll get into what my “processRemoteStoreChange” function does in a minute.

Step 4: Just copy over these blessed code snippets from the sample code

Copy the following from CoreDataStack.swift in Apple’s sample code:

  • the initializer
  • the lastHistoryToken variable
  • the tokenFile variable
  • the historyQueue variable

Also copy over the NSPersistentContainer extension in “CoreData+Convenience.swift.”

Also, my “processRemoteStoreChange” function is identical to the sample code’s “storeRemoteChange” function.
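
If you’d rather not go digging, its shape is roughly this (a sketch based on the sample code; “historyQueue” is the serial OperationQueue you copied in the list above):

// Fires whenever a .NSPersistentStoreRemoteChange notification arrives.
// The serial queue keeps transactions processing in order, off the main thread.
func processRemoteStoreChange(_ notification: Notification) {
    historyQueue.addOperation {
        self.processPersistentHistory()
    }
}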

Step 5: Merge new changes into the context

I modified Apple’s “processPersistentHistory” function to look like this:

func processPersistentHistory() {
    let backgroundContext = persistentContainer.newBackgroundContext()
    backgroundContext.performAndWait {

        // Fetch history received from outside the app since the last token
        let historyFetchRequest = NSPersistentHistoryTransaction.fetchRequest!
        historyFetchRequest.predicate = NSPredicate(format: "author != %@", appTransactionAuthorName)
        let request = NSPersistentHistoryChangeRequest.fetchHistory(after: lastHistoryToken)
        request.fetchRequest = historyFetchRequest

        let result = (try? backgroundContext.execute(request)) as? NSPersistentHistoryResult
        guard let transactions = result?.result as? [NSPersistentHistoryTransaction],
              !transactions.isEmpty
            else { return }

        print("transactions = \(transactions)")
        self.mergeChanges(from: transactions)

        // Update the history token using the last transaction.
        lastHistoryToken = transactions.last!.token
    }
}

The “mergeChanges” function looks like this:

private func mergeChanges(from transactions: [NSPersistentHistoryTransaction]) {
    context.perform {
        transactions.forEach { [weak self] transaction in
            guard let self = self, let userInfo = transaction.objectIDNotification().userInfo else { return }
            NSManagedObjectContext.mergeChanges(fromRemoteContextSave: userInfo, into: [self.context])
        }
    }
}

Most of that code was pulled from Stack Overflow. Apple’s code has a bunch of deduplication logic in it that frankly, I’m not emotionally ready to process, so I skipped it.

I’ve seen a few folks say that merging changes like this shouldn’t be necessary. However—and maybe it’s some sort of weird placebo effect—it seemed like changes from my watch synced much more quickly to my phone. Not instantaneous, but a handful of seconds instead of requiring me to sometimes force quit and restart the app (or switch tabs or something) to see changes.

Step 6: Never forget to deploy your updated schema from the CloudKit Dashboard

Honestly, it’s not that I forgot to do this, it’s that I failed to make sure it actually happened. CloudKit threw some weird error at me and told me my development and production schemas were the same, when they were in fact extremely different. I never double-checked, and chaos ensued! Don’t be like me: make sure your schema is deployed.

After launch, remember that you still have to do this every time you change your Core Data model, before you release your update to testers or the App Store. If your production CloudKit schema doesn’t properly correspond to your production Core Data model, syncing is going to break in all kinds of terrifying ways.
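
One aside that might save you some pain during development: NSPersistentCloudKitContainer has an initializeCloudKitSchema(options:) method that mirrors your Core Data model into the CloudKit development environment, so the Dashboard actually has your latest schema to deploy. A hedged sketch (call it after loading your persistent stores, and never ship it; it’s a development-time tool):

#if DEBUG
do {
    // Pushes the Core Data model to the CloudKit development schema.
    try container.initializeCloudKitSchema(options: [])
} catch {
    print("###\(#function): Failed to initialize CloudKit schema: \(error)")
}
#endif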

Conclusion

I know I probably could have saved myself a lot of frustration if I’d forked over some money for a Ray Wenderlich membership or some other paid tutorials/books related to Core Data. I’m also guessing there’s a much easier way to set everything up so that cloud changes are reflected near-instantaneously. But y’all, I’ve combed the free internet for weeks and this is the best I could come up with. Maybe it’ll help you too.

Thoughts on Apple Glasses

Rumors about Apple glasses have been swirling around for years now, and they have never once interested me. Why would I want to wear a computer on my face? Didn’t I get LASIK eye surgery to avoid ever again having to clumsily clean a pair of lenses with the corner of my t-shirt and my hot breath? And what about folks that stare at a computer screen all day for their job anyway…are they going to stare at that screen through a pair of Apple glasses? Or do you only wear them in certain situations…in which case, what’s the point?

Whenever I read an article about Apple’s foray into AR headsets, something about it just doesn’t feel right. No one ever makes it sound like Apple is trying to make a product for the mass market, yet I strongly believe they will only release this product if it has mass market (or potentially future mass market) appeal. That means it has to follow the same trajectory as the Apple Watch: come in many different styles, and be something you wear all day long and charge at night. Perhaps it will even rely on a companion iPhone app until the components get small enough, and then be made independent.

It’s one thing to have something a bit odd-looking strapped around your wrist, or sticking out of your ears. It’s an entirely different thing to have something goofy smack dab in the middle of your face. These glasses are going to have to be sleek af. They’re going to have to look good on a wide range of face shapes, sizes, and skin tones, and appeal to a wide range of personalities. They’re going to have to make people who don’t currently wear glasses want to wear glasses.

But Becky, why try to appeal to everyone when they could just create some sweet futuristic sci-fi specs for influencers and nerds? Because in my heart of hearts, I believe the narrative behind Apple’s AR glasses is going to be the same as the later iterations of the watch: Health. Wellness. Accessibility. The Human Experience.

Honestly, I think these devices will be revolutionary for people who are blind, colorblind, or have low vision. Not necessarily because assistive devices for these groups don’t already exist, but because Apple will do it better, and sell it cheaper. Imagine having Apple Glasses paired with some AirPods, discreetly giving you an audio description of whatever you’re looking at. Personally, I’d love to have a pair of glasses that could help me see better when driving at night (although, who knows if we’ll be allowed to drive while wearing these things. Maybe they’ll have a do-not-disturb-while-driving mode?).

Maybe users with hearing loss could enable live captions, à la Clips. Or on the flip side, hearing folks could use the glasses to recognize sign language. Suddenly, all text responds to dynamic type. Sounds like sirens give visual cues, while sights like signage give audio cues.

Tech reporters love to go on about all the “hidden” accessibility features in iOS that are actually great for the masses, and I think Apple Glasses are going to be a whole lot of that.

People walking or running outdoors could see mile/km markers in the environment around them, and maybe even fireworks in the sky when they reach their goal. Maybe when you’re hunting around for that lost AirTag, a big giant 3D arrow appears over it when you’re close. That might sound silly, but it also kind of sounds like Apple, doesn’t it?

All of this is to say: I think the reason I haven’t been interested in all of the Apple Glasses talk is because the focus seems to be on games, notifications, and maps, which to me are the least interesting and least imaginative features of this supposedly futuristic device. There is no “killer app” because the entire purpose of the device is not just to replicate iPhone functionality, but rather to fundamentally improve the human experience in a deep and meaningful way. It won’t get there in version 1.0, but I now find myself excited about the possibilities. This isn’t about giving us a new screen. It’s about freeing us from screens, from distractions, and bringing us together again.

Let’s be honest: Apple isn’t going to make a “Dear Apple,”-style commercial about how Apple Glasses impacted people’s lives by allowing them to play Minecraft on their bathroom floor or get notifications about the newest AppleTV+ shows directly in front of their pupils. It’s going to be about how two neighbors who speak different languages are able to communicate face-to-face with real-time translations hovering nearby. It’s going to be about people with disabilities having an improved experience in the world and greater overall quality of life. It’s going to be what it’s always been about (besides profit, of course): people. All of the fun-albeit-gimmicky 3D AR stuff is just a cherry on top.

5 Years of App-Making

I realized recently that it’s been nearly five and a half years since I released my first app onto the App Store. Like, holy smokes, where did that time go?

I’ve been feeling somewhat reflective these last few weeks (and I haven’t even watched Soul yet!) as I’ve taken time off from programming to focus on getting ready for Christmas, which is something I genuinely love doing. I ran across this tweet early in my preparations:

…and honestly, that’s exactly what I’m about at Christmastime: being that mom that makes things magical. The day after Christmas, my kids asked how many days were left until next Christmas, so I think I succeeded?

Anyway, back to five and a half years. Although I released my first app on June 10, 2015, I didn’t actually start making any money until the following June when I released LiveRotate, my app for rotating Live Photos (remember when editing Live Photos wasn’t possible without turning them into stills? lol). From June 2016 to today, I’ve made approximately $12,600 in profit from the App Store.

It’s simultaneously a lot and a little. It’s a lot for most developers. It’s a little for the developers I follow on Twitter. Early on, someone asked me how I would define success for myself as an indie developer. I remember stressing that my apps were just side projects (they are) and that I’d be happy if my revenue could cover the cost of my personal device upgrades (it has). At the time, I think I forgot to say something about how I wanted to make things that improved people’s lives, or just made them smile. In that way, I’ve also succeeded, and hearing from happy customers has been incredibly rewarding.

Most of the proceeds from the past four years have come from Snapthread. I owe its success to the very nice media coverage it got, and I owe that media coverage to my following on Twitter, and I owe that following to Brent Simmons, who cared enough to compile a list of women in the iOS community (thanks again, Brent!). Revenue from Snapthread has diminished considerably in the past year or so, as has my enthusiasm for struggling with AVFoundation. Scribblet and YarnBuddy were just what I needed this year, both in terms of challenge and inspiration.

I set a goal a few months ago of reaching 20 annual subscribers to YarnBuddy Pro by the end of the year. I’m happy to report that as of today, my subscriber count stands at 52 (plus an additional 16 who redeemed a promo code for the first year).

And here we are, at the doorstep of 2021. Convention says that it’s time to set new goals, but I’m just not feeling it. Time stood still in March, and yet somehow things began to happen at an increasingly frenzied pace. There’s a really excellent episode of Mr. Robot in the final season that basically happens in real time, with no spoken dialogue save for one line at the beginning and one at the end. It is ridiculously intense; I felt like I was holding my breath the entire time. This year kinda felt like that too. I hope 2021 feels like a nice, long exhale.

Of course, learning goals are another story. I’d really like to deepen my design skills this coming year as well as my understanding of SwiftUI and Combine. It’s amazing to realize what I’ve learned since starting my development journey, starting with Objective-C in early 2014, then quickly pivoting to Swift 1.0, playing with SpriteKit, AVFoundation, PhotoKit and PencilKit, and now writing apps using SwiftUI. It’s been a wild ride, and I’m so thankful for all of you that have helped me along the way (including many who I’ve never interacted with, but whose blog posts and Stack Overflow answers have literally kept me going).

Do you have indie business goals for 2021? How about learning goals? I’d love to hear them! I wish you all happiness and good health this coming year. 💚

Your Move, iPad

Hear that? It’s the sound of Mac fans. No, not your shiny new M1 Mac’s fans—chances are, you’ll never hear those—but rather, the sound of excitement rippling through the Mac community. This is something big. Really big. Now, I’m only 33, but someday when I go full fuddy-duddy I will speak of this: the great Intel/Apple Silicon transition. The beginning of a new era at Apple.

All that sounds dramatic, of course, but it’s interesting to trace all of the different paths that led us to this point. The A-Series chips, the introduction of Metal, rapid machine learning gains, the gradually degrading repairability scores as components became more integrated, the Secure Enclave, a new super fast emulation layer, new unified memory architecture, and 5nm process… years and years of work have now come to fruition with the first Apple Silicon chips for Mac. And our minds are blown.

Suddenly, we’re handed a thin, entry-level fanless laptop that performs better than almost every other Mac computer out there, and a low-end MacBook Pro and Mac Mini that make current Mac Pro owners sweat and clutch their wheels. So many questions abound. What new hardware designs will these gains make possible? What on earth does Apple have in store for its high-end Macs? Will anyone else even be able to compete? It’s an exciting time to be a Mac lover, but, surprise: this post isn’t really about the Mac. It’s about the iPad.

There’s no question that Apple has struggled to craft a cohesive, compelling narrative for the iPad. For a long time, there seemed to be a distinct lack of product vision. Everyone likes to speculate over what role Steve Jobs ultimately intended the iPad to have in people’s lives, but not only is that pointless, it’s also irrelevant. We don’t need Steve to tell us what the iPad is good for. We know what it’s good for, and we can easily imagine what it could be good for, if only Apple would set it free.

Just as Apple left us with great expectations for its Pro Mac line-up, the latest iPad Air also raises the bar in new and interesting ways. The Air served as sort of an appetizer for the new M1 chips, while also receiving a generous trickle-down of features from the iPad Pro, including USB-C and support for the latest keyboard and Pencil accessories. There have been rumors of new mini-LED displays for the next-gen iPad Pros, but it’s going to take a lot more than new display tech to set the Pros apart.

Francisco Tolmasky (@tolmasky) recently tweeted:

“A sad but inescapable conclusion from the impressive launch of the M1 is just how much Apple squandered the potential of the iPad. The iPad has had amazing performance for awhile, so why is the M1 a game changer? Because it’s finally in a machine we can actually do things on.”

Francisco is right: Power and performance aren’t the bottleneck for iPad, and haven’t been for some time. So if raw power isn’t enough, and new display tech isn’t enough, where does the iPad go from here? Will it be abandoned once more, lagging behind the Mac in terms of innovation, or will Apple continue to debut its latest tech in this form factor? Is it headed toward functional parity with the Mac or will it always be hamstrung by Apple’s strict App Store policies and seemingly inconsistent investment in iPadOS?

It’s clear that Apple wants the iPad Pro to be a device that a wide variety of professionals can use to get work done. And since so many people use web apps for their work, the introduction of “desktop” Safari for iPad was an important step toward that goal. The Magic Keyboard, with its built-in trackpad, was another.

Here are ten more steps I believe Apple could and should take to help nudge the iPad into this exciting next era of computing.

  1. Give the iPad Pro another port. Two USB 4.0 ports would be lovely.
  2. Adopt a landscape-first mindset. Rotate the Apple logo on the back and move the front-facing camera to the edge beneath the Apple Pencil charger to better reflect how most people actually use their iPad Pros.
  3. Introduce Gatekeeper and app notarization for iOS. The process of side-loading apps should not be as simple as downloading them from the App Store. Bury it in Settings, make it slightly convoluted, whatever: just have an officially-sanctioned way of doing it.
  4. Ruthlessly purge the App Store Guidelines of anything that prevents the iPad from serving as a development machine. Every kind of development from web to games should be possible on an iPad. And speaking of games—emulators should be allowed, too.
  5. Release a suite of professional first-party apps at premium prices. If someone can edit 4K videos in Final Cut on their M1 MacBook Air, they should be able to edit 4K videos in Final Cut on their iPad Pro. I refuse to believe that these pro apps can’t be re-imagined and optimized for a touch experience. If Apple leads the way in developing premium software for iPad, others will follow.
  6. Make it possible to write, release, and install plug-ins (if appropriate) for the aforementioned first party apps.
  7. Bring App Library to the iPad and allow widgets to be positioned anywhere on the Home Screen. This isn’t groundbreaking, it just annoys the heck out of me.
  8. Release a new keyboard + trackpad case accessory that allows the iPad to be used in tablet mode without removing it from the case.
  9. Introduce Time Machine backups for iPadOS.
  10. 5G, ofc.

In the end, fostering a vibrant community of iPad app developers can only benefit the Mac (and vice versa).

It’s simple: people love their iPads. They love them so much they wish they could do even more with them. The new M1 Macs should give iPad fans reason to be excited; now that we’ve seen hints of what future Macs can be, it’s time for the iPad to reassert itself—to remind us once again who it’s for, and what makes it special.

In other words: Your move, iPad.

My Apple Silicon Dilemma

About 3 years ago I wrote a post about the very real possibility that my current 2016 13-inch MacBook Pro with Touch Bar could in fact be my last Mac. I paid ~$2600 for it at the time, and two years later purchased a 12.9″ iPad Pro that I absolutely fell in love with. My reasoning then was that this MacBook would likely work just fine through at least 2021, and by that time I could do all of my work on an iPad Pro.

Now, it doesn’t look like Xcode for iPad is coming anytime soon. And while my MacBook Pro does still work fine, there’s a “Service Recommended” warning for its battery, and its infamous butterfly keyboard has lasted so long without breaking that it actually makes me a bit nervous.

As I watched Apple’s “One More Thing” event on Tuesday, I found myself really drawn to the new M1 MacBook Air. Light, portable, quiet, fast. No new design, no fun new colors, but plenty of beefy benchmarks. There’s a part of me that feels like if I want to be a member of the developer community, I should always be interested in the most powerful “pro” machine I can get. With rumors of new 14″ and 16″ MacBook Pros with better displays and even more impressive capabilities, I should just wait, right?

And yet. As I reread my post about gravitating toward iPad, I realized that I really do use my MacBook Pro almost exclusively for running Xcode (and occasionally for messing with the CSS in my WordPress template). When it comes to blogging, or photo editing, or designing icons and graphics in Affinity Designer: I prefer my iPad Pro.

If I trade in my current MacBook Pro, I could get a new MacBook Air with 16GB of RAM and 1TB of storage for around $1150. If we subtract that from the price I paid for my computer 4 years ago, that leaves $1449 I could spend on an arguably more compelling new iPad Pro someday, AND I’d have a faster, lighter laptop with better battery life than the one I have now (albeit with fewer ports).

Additionally, when I think of all of the things “missing” from the new Apple Silicon MacBooks: things like mini-LED displays, FaceID, touch screens, ProMotion, 5G, etc… I mean, that’s basically describing a next-gen iPad Pro, just without macOS.

The last piece of the puzzle is the Touch Bar: the new MacBook Air doesn’t have one. I don’t hate the Touch Bar like many folks do, but I don’t use it too terribly much. I like having emoji handy, but the new Air has a function button for opening up the emoji picker, so I don’t really think I’d miss it.

So then. Since I already tend to be an enthusiastic early adopter (Swift and SwiftUI 1.0, baby!), I think I’ve talked myself into ordering a new MacBook Air, and perhaps upgrading my iPad Pro at some point (maybe next year, maybe 2022). I’m going to wait a bit longer for some reviews to come out before I pull the trigger, but writing this post helped me think through my current situation and decide what I want my future tech setup to be. How about you? Are you going to order a new MacBook or Mac mini, or wait for the higher-end Macs?

With over a day left until the Apple event, I find myself in an unfamiliar bind: I’m out of predictions podcast episodes to listen to. 🙈 #mbnov

The only thing puzzling me at the moment is why I am still awake, seeing as we’re not likely to hear any significant updates until tomorrow night, maybe even Monday. Must just be nervous energy. #mbnov

Mac Event Predictions

I comfort-ordered a white HomePod mini this morning. When I’m feeling stressed, I tend to buy things and eat a lot of cheese, and hoo-boy that 18-count mega box of Kraft mac and cheese from Costco is emptying at an alarming rate. Lucky for me and my cheese stash, another opportunity to therapeutically buy stuff is right around the corner, with Apple’s “One More Thing” event just 3 days and 20 hours away, according to my Scriptable widget.

I will admit to not knowing much about computer processors, and what features/outcomes certain types of processors can enable. That’s why I’m the perfect person to write a hilariously specific prediction post for next week’s event. For this, I’m going to pretend I have an inside source (I don’t) and make some confident statements about what to expect. Why am I doing this? Because my kids are napping and I literally cannot concentrate on anything else I am supposed to be doing right now.

Okay, here goes.

MacBook Air

Apple is going to announce a brand new 13-inch MacBook Air that runs on a variant of the A14 chip. This Air will be the same thickness as previous models with the same tapered design, but will be fanless, with FaceID but no TouchBar. It’ll have two USB-C ports, a headphone jack, and come in 5 different colors. It’ll boast up to 13 hours of battery life.

MacBook Pros

Apple is going to announce new 14-inch and 16-inch MacBook Pro models. These will run on a different chip (maybe M14?) and won’t be fanless. They’ll have a TouchBar, FaceID, four USB-C ports, a headphone jack, speaker and microphone improvements, higher resolution mini-LED displays, and up to 12 hours of battery life. Their GPU performance will be bananas-good. They’ll come in the usual colors, along with one wildcard…my guess is black or blue.

All 3 laptop models will gain an upgraded 1080p HD camera and the FaceID sensor will mean support for Animoji/Memoji in video chats. I’m torn on whether or not they’ll bring back the light-up Apple logo. Maybe I’ll make this bold prediction: the Airs will have Apple’s classic rainbow logo on the back, but it will not light up. The MacBook Pros will have a white Apple logo that does light up.

Other Stuff

How about those AirTags, amiright? Actually, I think if Apple adds anything to this event, it will either be headphones or a Mac mini. There will undoubtedly be some demos of iOS and legacy apps running on Apple Silicon and Big Sur, including a game demo.

So, there you have it: my expert predictions for the Apple Silicon Mac event. Honestly, the only thing I really want out of this event is MacBooks in different colors, and I probably won’t get it. A girl can dream though, right?

I listened to John McCain’s 2008 concession speech today and was moved to tears by his display of integrity, humility, maturity, and kindness. Such a stark contrast to you-know-who, who appears to be stooping to new lows I hadn’t even considered possible. #mbnov

As Apple’s “One More Thing” Mac event draws near, I find myself dreaming about Mac laptops in colors other than gray/silver. Black, rose gold, tangerine, cosmic purple, pumpkin spice; I literally do not care as long as they’re not gray. MacBook Airs in the original six colors would be so rad. #mbnov

It is an astonishingly beautiful day today. 82 degrees in Nebraska in November…I’ll take it! I’m outside with my family right now, soaking in the sunshine and hoping for better days. ☀️ #mbnov

Finding it nearly impossible to concentrate on coding right now, so I decided to learn how to play chess. Why chess? My husband watched The Queen’s Gambit and I happened to catch the final episode. Seems like a fun game to help take my mind off of current events! #mbnov

The prospect of a dark, dreary winter used to depress me; this year, I find myself oddly looking forward to it. Something about the juxtaposition of our warm interior lights and the frosty blue darkness out the window…it’s a whole vibe that I’m very much into right now. #mbnov