A Few Thoughts on the Eve of WWDC

Here we are again, on the eve of another WWDC, feeling…weird. Excited. Ambivalent? Curious. Did I mention excited? But also kinda annoyed. And some of you? Some of you are downright mad.

There’s a cloud hovering over Apple Park again, and it’s not just the pandemic. It’s bruised developer relations. It’s alleged anti-trust violations. It’s App Store scammy-ness. It’s the weight of a million different expectations and quibbles, from “make the iPad more like the Mac” to “let the iPad be an iPad,” from pro hardware announcements to satisfy developers, to hints of an augmented reality revolution to satisfy those hungry and excited for the post-staring-at-screens era.

And then there’s the shareholders. Can’t forget the shareholders.

Caught in the middle of it all, then, are the lovely Apple employees we know (or are lightly acquainted with) and love. They show us their work with such deliberation and care, such passion and delight. They made our iMacs colorful again. They made it possible to control an Apple Watch with one hand. They work on Notes, on tvOS, on Safari, on SwiftUI, on hundreds of teams that make things millions of people rely on. And if any of them are reading this: I appreciate you, and I hope you have an awesome WWDC week. I can’t wait to see what you’ve been working on. There’s always something announced at WWDC that just blows me away, and I know this year will be no different.

Speaking of different, Apple’s slogan used to be “Think Different.” Apple does many things differently, such as its environmental initiatives, focus on health and accessibility, and emphasis on privacy. But Apple is a big company and big companies naturally become stubborn, entrenched in tradition, and difficult to steer in different directions.

Unfortunately for Apple, the winds of change are blowing, have been for a long time, and are reaching gale force. In various regions of the United States, the coronavirus’s progress has been stymied and the phrase “back to normal” is bandied about as if it’s a sure thing, as if “normal” is something we have managed to recover, rather than something new being slowly born from the ashes of a horrible year.

While some have learned absolutely nothing from this experience, others are finding a renewed understanding of what’s most important to them. A country obsessed with work is toying with the idea that the way we do and view work might not always be the best way. And amidst all of this, an absolute reckoning involving the way we treat one another, and the way our entire society is structured to, consciously or unconsciously, treat some worse than others.

There is palpable anger toward so many in authority, whether in government, or at companies like Apple, for a failure to listen and a refusal to even consider change.

I’ve said this before, but I believe one of the single most important leadership qualities is humility, which by definition requires listening. If Apple executives listen to their employees and developers, decide their requests are not in line with the company’s core values, and say as much, that is one thing, because at least it’s honest. If, however, their requests or ideas align with the company’s values, but clash with its traditions or shareholder expectations (or simply aggravate the executives’ hubris) and they dig in their heels and tighten their grips, they are rightly deserving of criticism and, dare I say, scorn. And I think they’ll find, as the winds of change continue to blow, that they’ll eventually be caught in a storm they can’t escape, driven along on a course they did not chart for themselves.

It’s not about giving in to every little demand being lobbed at them. It’s about collecting information, determining what the right thing to do is, and doing it the Apple Way. When Apple does that and does it right, the results are fantastic.

Let’s hope we see some of that Apple shine through this week.


Adding a Gradient to Large Title Text in SwiftUI or UIKit

When I first released YarnBuddy, it had an orange-to-pink gradient as the navigation bar background color. The colors reminded me of a sunset, which I liked, but the overall effect was a little too heavy and could easily clash with the user’s own project photos. I wondered if I could do something a little more subtle and put the gradient inside the navigation bar title text itself.

The good news is that SwiftUI makes it trivially easy to create gradients and mask them in a variety of ways. The bad news is that SwiftUI can’t do much of anything when it comes to customizing the navigation bar. Maybe that will change in the next version of SwiftUI, to be announced at WWDC in a little over two weeks…maybe it won’t. For now, we can use the good ol’ UIKit Appearance APIs to accomplish our goal.
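For comparison, here's a minimal sketch of what that gradient masking looks like for an ordinary in-view text (this works fine anywhere in your view hierarchy; it just can't reach the navigation bar title):

```swift
import SwiftUI

// A minimal sketch: a linear gradient masked by bold, rounded title text.
// This approach works for regular views, but not for the navigation bar
// title, which is why the UIKit appearance APIs are needed.
struct GradientTitle: View {
    var body: some View {
        LinearGradient(
            gradient: Gradient(colors: [.blue, .purple]),
            startPoint: .leading,
            endPoint: .trailing
        )
        .mask(
            Text("My Awesome App")
                .font(.system(size: 40, weight: .bold, design: .rounded))
        )
        .frame(height: 48)
    }
}
```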

What we’re going to do is create a UIColor from a pattern image. The image will be generated using our gradient colors, and sized based on the height of the navigation bar and the width of the longest title we expect to display. All of this will happen in an extension to UINavigationController, in which we’ll override viewDidLoad().

The first thing we’ll need is a function to create an image from our gradient.

func getImageFrom(gradientLayer: CAGradientLayer) -> UIImage? {
    var gradientImage: UIImage?
    UIGraphicsBeginImageContext(gradientLayer.frame.size)
    if let context = UIGraphicsGetCurrentContext() {
        gradientLayer.render(in: context)
        gradientImage = UIGraphicsGetImageFromCurrentImageContext()?.resizableImage(withCapInsets: UIEdgeInsets.zero, resizingMode: .stretch)
    }
    UIGraphicsEndImageContext()
    return gradientImage
}

Since the function isn’t guaranteed to return an image, we’ll assign a default color for large title text. I chose UIColor.label, since I knew it would automatically adjust between dark mode and light mode. I also really like the “rounded” font design so I added it to my large title font descriptor; you can go with the default or serif options or a completely different font if it suits your app. Here is an example of what you can do in your UINavigationController extension:

extension UINavigationController {
    override open func viewDidLoad() {
        super.viewDidLoad()
        var gradientColor = UIColor.label
        let blue = UIColor.systemBlue
        let purple = UIColor.systemPurple
        let largeTitleFont = UIFont.systemFont(ofSize: 40.0, weight: .bold)
        let longestTitle = "My Awesome App"
        let size = longestTitle.size(withAttributes: [.font: largeTitleFont])
        let gradient = CAGradientLayer()
        let bounds = CGRect(origin: navigationBar.bounds.origin,
                            size: CGSize(width: size.width, height: navigationBar.bounds.height))
        gradient.frame = bounds
        gradient.colors = [blue.cgColor, purple.cgColor]
        gradient.startPoint = CGPoint(x: 0, y: 0)
        gradient.endPoint = CGPoint(x: 1, y: 0)
        if let image = getImageFrom(gradientLayer: gradient) {
            gradientColor = UIColor(patternImage: image)
        }
        let scrollEdgeAppearance = UINavigationBarAppearance()
        if let largeTitleDescriptor = largeTitleFont.fontDescriptor.withDesign(.rounded) {
            scrollEdgeAppearance.largeTitleTextAttributes = [.font: UIFont(descriptor: largeTitleDescriptor, size: 0), .foregroundColor: gradientColor]
        }
        navigationBar.scrollEdgeAppearance = scrollEdgeAppearance
    }
}

Setting the x-value of the gradient’s start point to 0 and the end point to 1 creates a horizontal gradient. You can create a vertical gradient by changing the y-value instead. You’ll see that if our getImageFrom(gradientLayer:) function returns an image, we’ll use that to create a UIColor that we can use when assigning text attributes to our instance of UINavigationBarAppearance.

You’ll see I’m only setting the navigation bar’s scroll edge appearance—that’s because it covers the only navigation bar state where large title text appears. However, in YarnBuddy, I also set the “standard appearance” and “compact appearance” to use colors that match the user’s selected theme. If you’re wondering why I’m not using a gradient in all cases, it’s because it doesn’t look very good with small font sizes and makes the text way less readable.
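If you want to set those other states too, a rough sketch might look like this, added in the same viewDidLoad() override; the color here is a placeholder rather than YarnBuddy's actual theme value:

```swift
// Sketch: give the standard and compact bar states a plain (non-gradient)
// title color, since the gradient only reads well at large-title sizes.
// "themeColor" is a placeholder for whatever matches your app's theme.
let themeColor = UIColor.label

let standardAppearance = UINavigationBarAppearance()
standardAppearance.configureWithDefaultBackground()
standardAppearance.titleTextAttributes = [.foregroundColor: themeColor]

navigationBar.standardAppearance = standardAppearance
navigationBar.compactAppearance = standardAppearance
```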

I made a playground using the above code and SwiftUI so that you can fiddle around with colors and fonts.

What’s New in YarnBuddy

My backlog of blog posts I want to write is getting a bit ridiculous at this point. One of them is a follow-up to YarnBuddy’s “App of the Day” feature back in March, complete with screenshots and stats and the whole story of how that feature came to be. I’m not quite ready to put all of that together yet, so today, I just want to note some of the things I’ve been working on lately in YarnBuddy (from a more technical standpoint).


I’ve released 6 updates for YarnBuddy so far this year (with another one waiting for review), kicking the year off with a major design refresh that introduced the ability to change the app’s theme. I did this by creating a Theme struct that has a number of semantic colors such as “primaryAccent,” “secondaryAccent,” “headerText,” “rowBackground” etc. Next, I set up an AppTheme enum, with each case being the name of a theme. The enum has a variable called “colors” that returns a Theme struct for each case. The result is that I can do things like .background(settings.theme.colors.primaryBackground) and it just works!
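A minimal sketch of that setup, with placeholder colors and only a few of the themes (the real Theme struct has more semantic colors, and the real themes have their own palettes):

```swift
import SwiftUI

// Sketch of the theming approach described above. The color values and
// the set of themes shown here are illustrative placeholders.
struct Theme {
    let primaryAccent: Color
    let secondaryAccent: Color
    let headerText: Color
    let rowBackground: Color
    let primaryBackground: Color
}

enum AppTheme: String, CaseIterable {
    case system, light, dark, supernova

    var colors: Theme {
        switch self {
        case .system, .light:
            return Theme(primaryAccent: .pink,
                         secondaryAccent: .orange,
                         headerText: .primary,
                         rowBackground: Color(white: 0.95),
                         primaryBackground: .white)
        case .dark, .supernova:
            return Theme(primaryAccent: .purple,
                         secondaryAccent: .blue,
                         headerText: .white,
                         rowBackground: Color(white: 0.15),
                         primaryBackground: .black)
        }
    }
}
```

Usage then looks just like the post describes: `.background(settings.theme.colors.primaryBackground)`.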

However, there are some UI elements in SwiftUI that are notoriously difficult to customize, such as the background of the list view that a Picker pushes onto the stack. I realized that some of my themes could be considered “light,” while others would be more at home with dark mode defaults. In other words, a bright white background would be super jarring in my Supernova theme. So, I decided to override the user’s preferred mode based on the selected theme.

In my SettingsStore class, which manages a number of UserDefaults keys, I added the following:

var theme: AppTheme {
    get {
        AppTheme(rawValue: defaults.string(forKey: DefaultKeys.appTheme) ?? AppTheme.system.rawValue) ?? AppTheme.system
    }
    set {
        defaults.set(newValue.rawValue, forKey: DefaultKeys.appTheme)
        switch newValue {
        case AppTheme.system:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .unspecified
        case AppTheme.dark, .midnight, .supernova:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .dark
            UIApplication.shared.statusBarStyle = .lightContent
        case AppTheme.light, .grapefruit, .creamsicle, .seaside:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .light
            UIApplication.shared.statusBarStyle = .darkContent
        default:
            UIApplication.shared.windows.first?.overrideUserInterfaceStyle = .unspecified
        }
    }
}

Those status bar text color overrides are deprecated but I can’t figure out how I would call their replacement in SwiftUI. For now, it works!

Data Export

The next major version, 1.5, gave users the ability to export a zip archive containing all of their photos, pattern PDFs with annotations, and metadata (in plain text files). I wanted to give users some method for getting their data out of the app as soon as possible, even if it wouldn’t be importable. Now that I’ve shipped it, I’ve begun slowly working on a true backup solution using Codable.

To export data using SwiftUI, I created a struct conforming to FileDocument and did all of the data gathering in the fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper method. I used SwiftUI’s .fileExporter modifier to trigger the export. YarnBuddy uses CoreData, so to make things a little easier I created an “exportString” variable in each NSManagedObject subclass that prints a nicely-formatted string of all metadata associated with the class. That way, I just needed to loop through all of the user’s projects and yarn, map the exportStrings to their own array and append them to the appropriate plain text files.
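A rough sketch of that shape, with one entity type shown; "Project" and its "exportString" property are from the post, while the document name and file layout here are illustrative (and the real export wraps everything in a zip archive along with photos and PDFs):

```swift
import SwiftUI
import CoreData
import UniformTypeIdentifiers

// Sketch of the export document described above: gather each managed
// object's pre-formatted "exportString" and write it into a plain-text
// file inside a directory FileWrapper.
struct YBExportDocument: FileDocument {
    static var readableContentTypes: [UTType] { [.folder] }

    init() { }
    init(configuration: ReadConfiguration) throws { }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        let context = CoreDataStack.shared.context

        // Map each project's exportString into one combined text blob.
        let request: NSFetchRequest<Project> = Project.fetchRequest()
        let projects = (try? context.fetch(request)) ?? []
        let projectText = projects.map { $0.exportString }.joined(separator: "\n\n")

        let projectsFile = FileWrapper(
            regularFileWithContents: projectText.data(using: .utf8) ?? Data())
        return FileWrapper(directoryWithFileWrappers: ["Projects.txt": projectsFile])
    }
}
```

The export itself is then triggered from a view with the `.fileExporter` modifier, as mentioned above.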

Time Tracking

Version 1.6 introduced time tracking for Pro users. Time tracking can be fun and useful on its own, but what I’m hoping to do is lay the groundwork for some fun “end of the year summary”-type features. For avid knitters and crocheters, it would be neat to know how many projects you completed that year, the average and total time spent on them, etc. Someday I’d also like to have a little “share card” creation studio where users could choose stats and photos to share on social media.

Time tracking is also a great feature to integrate with widgets, Shortcuts, and the watch app, so there will be plenty of low-hanging fruit for me to pick throughout the summer!

Yarn Remaining, User Guide, and Stash Export

Version 1.6.1, when approved, will be a big point update: it includes a new user guide, the ability to export your yarn stash as a CSV file, and new estimates of yarn remaining.

Previously, when a user linked a project to a yarn in their yarn stash, the amount of yarn would not automatically be subtracted from the total stashed amount. The reason for that was logistics: “In progress” projects weren’t guaranteed to be finished, projects could list yarn quantities in terms of the number of skeins (yarn balls), grams, or ounces, and the total amount of yarn in the user’s stash could be shown in terms of skeins, grams, ounces, yards, or meters. In other words, there were potentially a lot of unit conversions required, and in some cases those conversions would only work if the user supplied the length and net weight per yarn ball (which aren’t required fields).

Now, YarnBuddy will attempt to calculate an estimated amount of yarn remaining based on whatever pieces of information it has, using projects marked as finished. If the estimate is clearly wrong, users can override it with a custom value. Hopefully it’s a good compromise, and users will be happy with it.
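Purely as an illustration, the kind of unit conversion involved might look something like this; all of the names and rules here are hypothetical, not YarnBuddy's actual code:

```swift
import Foundation

// Hypothetical sketch of the "yarn remaining" estimate described above.
// The conversions only work when length and net weight per skein are
// known, which matches the caveat in the post.
struct YarnBallInfo {
    let yardsPerSkein: Double
    let gramsPerSkein: Double
}

// A finished project can record its yarn usage in different units.
enum UsedAmount {
    case skeins(Double)
    case grams(Double)
    case ounces(Double)
}

// Convert any usage record to yards using the per-skein info.
func yardsUsed(_ amount: UsedAmount, ballInfo: YarnBallInfo) -> Double {
    switch amount {
    case .skeins(let count):
        return count * ballInfo.yardsPerSkein
    case .grams(let grams):
        return grams / ballInfo.gramsPerSkein * ballInfo.yardsPerSkein
    case .ounces(let ounces):
        return ounces * 28.3495 / ballInfo.gramsPerSkein * ballInfo.yardsPerSkein
    }
}

// Subtract all finished projects' usage from the stashed total.
func estimatedYardsRemaining(stashedYards: Double,
                             finishedProjects: [UsedAmount],
                             ballInfo: YarnBallInfo) -> Double {
    let used = finishedProjects.reduce(0) { $0 + yardsUsed($1, ballInfo: ballInfo) }
    return max(0, stashedYards - used)
}
```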

I don’t get a ton of support email for YarnBuddy, but I wanted to have a place where users could go to get answers to basic questions and even just explore what the app has to offer. So, I wrote a little user guide in plain ol’ HTML using Textastic on my iPad and included it in the app.

YarnBuddy User Guide screenshot

Finally, users can now export their yarn stash as a CSV file. This turned out to be trivially easy, and you can see my entire implementation of it below. If you’re horrified by my lack of error handling, please know that the rest of the app is even worse; it will absolutely give you the heebie-jeebies…just a total wasteland filled with roving packs of feral errors running amok and destroying anything in their path.

struct YBStashCSV: FileDocument {

    let managedObjectContext = CoreDataStack.shared.context

    static var readableContentTypes: [UTType] { [UTType.commaSeparatedText] }
    static var writableContentTypes: [UTType] { [UTType.commaSeparatedText] }

    static let yarnWeights = ["Lace (0), Light Fingering, 1-3 ply",
                              "Super Fine (1), Fingering, 4 ply",
                              "Fine (2), Sport, 5 ply",
                              "Light (3), DK, 8 ply",
                              "Medium (4), Worsted/Aran, 10 ply",
                              "Bulky (5), Chunky, 12 ply",
                              "Super Bulky (6), Roving",
                              "Jumbo (7)"]

    init() { }

    init(configuration: ReadConfiguration) throws { }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        let csvText = createCSV()
        return FileWrapper(regularFileWithContents: csvText.data(using: .utf8) ?? Data())
    }

    func createCSV() -> String {
        let fetchRequest = NSFetchRequest<Yarn>(entityName: "Yarn")
        fetchRequest.resultType = .managedObjectResultType

        var stashString = ""
        stashString.append(contentsOf: "Name,Colorway,Color Family,Dye Lot,Quantity,Remaining,Weight,Length,Net Weight,Fiber Type,Purchase Location,Date Added,Notes\n")

        do {
            let allYarn = try managedObjectContext.fetch(fetchRequest)
            let stash = allYarn.filter({ $0.isStashed })

            for yarn in stash {
                var notesString = ""
                for note in yarn.notesArray {
                    // build notesString from each note (body omitted here)
                }
                stashString.append("\"\(yarn.wrappedName)\",\"\(yarn.wrappedColorName)\",\"\(yarn.wrappedColorFamily)\",\"\(yarn.dyeLot)\",\"\(yarn.quantityValue) \(yarn.wrappedQuantityUnit)\",\"\(yarn.customRemainingString.isEmpty ? yarn.estimatedQuantityRemainingString : yarn.customRemainingString)\",\"\(Self.yarnWeights[Int(yarn.weight)])\",\"\(yarn.lengthString)\",\"\(yarn.weightString)\",\"\(yarn.wrappedFiberType)\",\"\(yarn.wrappedPurchaseLocation)\",\"\(yarn.dateAdded ?? Date())\",\"\(notesString)\"\n")
            }
        } catch let error {
            print("Failed to fetch yarn stash: \(error)")
        }
        return stashString
    }
}

Wishes for WWDC 2021

‘Tis the season for WWDC rumors, predictions, and wishlists! Spring is off to a shaky start here in Nebraska; we’ve had a series of colder, windy days that seem more at home in March than May, and the forecast for the coming weeks is much more “April showers” than “May flowers.” However, the foliage is definitely greening, the birds are still singing, and, best of all, we have kittens on our farm again (somewhere between 3 and 8…one of the two litters was moved to an undisclosed location by the momma cat).

My husband and I are fully vaccinated, nearby parks, zoos, and orchards are starting to plan fun outdoor activities, and things are generally looking up around here. Of course, I know that’s not true for everyone; I just want to express my gratitude for the circumstances I’m in, and I hope things are looking up for you as well.

There are things to celebrate even in hard times, and WWDC is when we all get to celebrate Apple engineers’ hard work over the past year. I really enjoy reading wishlists and predictions, so this year I’ve compiled a WWDC 2021 Community Wishlist. You’re welcome to contribute, just submit a pull request (or send me a note on Twitter and I’ll add it for you).

Here are a few things I’m hoping to see next month:

SwiftUI
  • Native search bar & pull-to-refresh
  • Ability for views to become/resign first responder, and to identify the current first responder
  • For List and Form to either drop their UIKit backing or be much more customizable (cell selection style and behavior, list background colors, etc. without using appearance APIs)
  • Iron out all of the NavigationView/NavigationLink bugs. It seems like there’s some regression in this area with every single update. Also, native navigation bar customizations that don’t require the appearance API (background color, text color, etc.)
  • A way to change the status bar style (light/dark) at runtime using the SwiftUI app life cycle
  • Some sort of “bottom sheet” view that can be pulled up and expanded
  • Accessory views for TextFields/TextViews
  • Inactive/destructive states in Context Menus
  • Context menu preview-providers (for showing a custom preview on long-press/right-click)
  • SwiftUI version of UIVisualEffectView
  • Native support for the share sheet

iPadOS
  • A first-party code editor for iPadOS that supports SwiftUI, UIKit, live previews, Swift Package Manager, light debugging tools, and the ability to archive and submit builds to App Store Connect
  • Another Home Screen overhaul for iPad, allowing widgets to be moved anywhere
  • An iPad feature similar to App Library but that is actually more like LaunchPad
  • The ability to back up to an external drive (i.e. Time Machine)

Everything Else

  • The ability to see and configure Smart Mailboxes in Mail for iOS
  • A method for adding third-party wallpapers that won’t clutter up your Photo Library and also supports light/dark mode
  • TestFlight for Mac
  • Some degree of widget interactivity. I would love to be able to easily start/stop timers, check to-do items, increment a counter, etc.
  • Third-party Apple Watch faces (😂)
  • For Apple to chill out and allow apps like Riley Testut’s Delta emulator to be installed on iOS devices in some sanctioned way (remember, emulators are not illegal)
  • For Apple to chill out and let developers accept payments via some approved processors (i.e. Stripe)
  • Improved TestFlight beta review times (or, just ditch the whole review process in its current form)
  • Subscription cancellation API for developers.

I’m sure I’ll think of more things in the coming weeks. Don’t forget to check out the WWDC 2021 Community Wishlist!

A Story Half-Told

Back in November I wrote about how the introduction of the first M1 Macs put extra pressure on Apple to differentiate the iPad Pro from the recently upgraded iPad Air. The Airs were colorful, performant, and supported all the new accessories. Meanwhile, the new M1 MacBook Air was light, blazing fast, and could run iOS apps. What would compel people to buy an iPad Pro?

In that post, I listed 10 things I thought Apple could do to make its pro tablet stand out in the line-up. This week, Apple addressed one and a half of them, and set the stage for a few more. It upgraded the port to Thunderbolt/USB 4 (but didn’t add an additional port like I hoped), added 5G, and gave the iPad Pro the very same M1 chip that powers its new Macs, making it more than capable of running things like Xcode, Final Cut, Logic, etc. The port could potentially point toward things like better external display support and fast Time Machine backups.

Disappointingly, the iPad Pro presentation lacked the colorful, whimsical joy of the new iMac introduction (though I was definitely impressed by the production quality of the M1 chip heist). Apple has doubled down on iPad Pros being Serious Business, which is just too bad, because literally everyone I know would love an iPad Pro in some other color than gray. In fact, I find myself in a strange position—the new iPad Air made me excited for the iPad Pro, which in turn disappointed me enough to make me hopeful that the next iPad Air will be released in some even more vivid colors. Apple has become a company of a thousand SKUs…you can’t tell me they can’t give us some more gosh darn hues. But, I digress.

Once upon a time, Apple made an outrageously powerful, desktop-class tablet, with artificially limited software and I/O.

…and then what?

Well, we have to wait until June 7 to find out. Or do we? The iPad’s future is just as wrapped up in the current anti-trust hullabaloo as it is in iPadOS 15. Will developers be allowed greater freedom to innovate without being fearful of App Review? Will Apple finally shift its focus to eliminating actual multi-million dollar scams and fraud instead of nitpicking honest developers who desire to follow the spirit of the law, if not the letter (which is usually pretty vague to begin with)?

If Apple is willing to give App Review a complete overhaul and also manages to release at least one first party “pro” app for iPadOS this June, I think the iPad Pro’s story will take a happy turn indeed. For now, however, it remains a half-told tale of wasted potential—a sleek, expensive “what if?”

How to Set Up Core Data and CloudKit When You Haven’t the Faintest Clue What You’re Doing

Note: This was posted before WWDC 2021, so if major changes were made to Core Data + CloudKit, they aren’t reflected here. This right here is just pure dumpster fire all the way down. Also if you’re an Apple engineer…I’m sorry.

When Apple introduced changes to Core Data + CloudKit integration in 2019, they sold developers on a dead-simple API: add iCloud sync to your Core Data app with “as little as one line of code.” That one line, of course, is simply changing NSPersistentContainer to NSPersistentCloudKitContainer and enabling a few capabilities in the project settings. Boom, done! And in fact, Apple’s “Core Data –> Host in CloudKit” SwiftUI project template does those things for you, so you’re good to go, right?
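For reference, the change itself (using YarnBuddy's container name) really is this small:

```swift
import CoreData

// Before: a local-only persistent store.
// let container = NSPersistentContainer(name: "YarnBuddy")

// After: the same store, now mirrored to a CloudKit private database
// (assuming the iCloud and background-notification capabilities are
// enabled in the project settings, as the post notes).
let container = NSPersistentCloudKitContainer(name: "YarnBuddy")
```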

Turns out, if you want to sync Core Data-backed data between devices and have those changes reflected in your UI in a timely manner, you have some more work to do. To figure out what that work is, you can’t look at Apple’s Core Data templates. You have to look at their sample code.

My SwiftUI app was created before Apple even added a SwiftUI + Core Data project template, so I created a class called “CoreDataStack” that has a shared instance. If you use the template, that becomes a struct called “PersistenceController.” I’m sure the struct is SwiftUI-ier, but the class from Apple’s sample code (which does not use SwiftUI) makes more sense to my brain, so I went with that.

Step 1: Make your container lazy and set some important options

In Apple’s sample code, you’ll notice that within the persistent container’s lazy initialization, two options are set on the container’s description. Include these.

guard let description = container.persistentStoreDescriptions.first else {
    fatalError("###\(#function): Failed to retrieve a persistent store description.")
}
description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
description.setOption(true as NSNumber, forKey: NSPersistentStoreRemoteChangeNotificationPostOptionKey)

If you don’t set the first option, you’ll regret it. Look, I don’t even really understand what it does, I just know that somewhere down the line, you’ll find some dumb way to break sync and then when you finally get it working again, you’ll find that only managed objects created after this key was set will sync properly. Every object created before the NSPersistentHistoryTrackingKey was set will stubbornly refuse to sync unless you modify it and re-save it, which is a giant pain in the derrière. I mean, at least that’s what my…uh…friend told me.

The second option is the first step toward receiving notifications when magic cloud stuff happens. You’ll subscribe to that NSPersistentStoreRemoteChangeNotification later, but for now, just make sure that option is set.

Step 2: Stir in some of this stuff that I have a super weak grasp of

After your container loads its persistent stores, but before you return the container itself, these lines are also important:

container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.transactionAuthor = appTransactionAuthorName
container.viewContext.automaticallyMergesChangesFromParent = true
do {
    try container.viewContext.setQueryGenerationFrom(.current)
} catch {
    assertionFailure("###\(#function): Failed to pin viewContext to the current generation: \(error)")
}
There are several merge policies, and you can read about them in the docs.

Again, I barely understand this stuff. For my purposes, I set “appTransactionAuthorName” to the name of my app’s container, which was simply “YarnBuddy.” From what I kinda understand, setting the transaction author here allows me to later filter for changes that weren’t created by my app on this particular device and act on them.

Now, I’ve always had “automaticallyMergesChangesFromParent” set to true, but what I didn’t realize is that it doesn’t just refresh your view hierarchy immediately when a change occurs. Maybe it should, but for me, it doesn’t. That’s where the remote change notification comes in.

Step 3: Dip your toes into Combine for a hot second and subscribe to notifications

I put this code right before “return container.”

NotificationCenter.default
    .publisher(for: .NSPersistentStoreRemoteChange)
    .sink { [weak self] notification in
        self?.processRemoteStoreChange(notification)
    }
    .store(in: &subscriptions)

And somewhere within the class I have declared this variable:

private var subscriptions: Set<AnyCancellable> = []

Make sure you import Combine at the top. I know extremely little about Combine at this point. It’s number one on my list of things to learn, and I plan to start with John Sundell’s “Discover Combine” materials.

We’ll get into what my “processRemoteStoreChange” function does in a minute.

Step 4: Just copy over these blessed code snippets from the sample code

Copy the following from CoreDataStack.swift in Apple’s sample code:

  • the initializer
  • lastHistoryToken variable
  • tokenFile variable
  • historyQueue variable

Also copy over the NSPersistentContainer extension in “CoreData+Convenience.swift.”

Also, my “processRemoteStoreChange” function is identical to the sample code’s “storeRemoteChange” function.

Step 5: Merge new changes into the context

I modified Apple’s “processPersistentHistory” function to look like this:

func processPersistentHistory() {
    let backgroundContext = persistentContainer.newBackgroundContext()
    backgroundContext.performAndWait {

        // Fetch history received from outside the app since the last token
        let historyFetchRequest = NSPersistentHistoryTransaction.fetchRequest!
        historyFetchRequest.predicate = NSPredicate(format: "author != %@", appTransactionAuthorName)
        let request = NSPersistentHistoryChangeRequest.fetchHistory(after: lastHistoryToken)
        request.fetchRequest = historyFetchRequest

        let result = (try? backgroundContext.execute(request)) as? NSPersistentHistoryResult
        guard let transactions = result?.result as? [NSPersistentHistoryTransaction],
              !transactions.isEmpty
            else { return }

        print("transactions = \(transactions)")
        self.mergeChanges(from: transactions)

        // Update the history token using the last transaction.
        lastHistoryToken = transactions.last!.token
    }
}

The “mergeChanges” function looks like this:

private func mergeChanges(from transactions: [NSPersistentHistoryTransaction]) {
    context.perform {
        transactions.forEach { [weak self] transaction in
            guard let self = self, let userInfo = transaction.objectIDNotification().userInfo else { return }
            NSManagedObjectContext.mergeChanges(fromRemoteContextSave: userInfo, into: [self.context])
        }
    }
}

Most of that code was pulled from Stack Overflow. Apple’s code has a bunch of deduplication logic in it that frankly, I’m not emotionally ready to process, so I skipped it.

I’ve seen a few folks say that merging changes like this shouldn’t be necessary. However—and maybe it’s some sort of weird placebo effect—it seemed like changes from my watch synced much more quickly to my phone. Not instantaneous, but a handful of seconds instead of requiring me to sometimes force quit and restart the app (or switch tabs or something) to see changes.

Step 6: Never forget to deploy your updated schema from the CloudKit Dashboard

Honestly, it’s not that I forgot to do this, it’s that I failed to make sure it actually happened. CloudKit threw some weird error at me and told me my development and production schemas were the same, when they were in fact extremely different. I never double-checked, and chaos ensued! Don’t be like me: make sure your schema is deployed.

After launch, remember that you still have to do this every time you change your Core Data model, before you release your update to testers or the App Store. If your production CloudKit schema doesn’t properly correspond to your production Core Data model, syncing is going to break in all kinds of terrifying ways.


I know I probably could have saved myself a lot of frustration if I’d forked over some money for a Ray Wenderlich membership or some other paid tutorials/books related to Core Data. I’m also guessing there’s a much easier way to set everything up so that cloud changes are reflected near-instantaneously. But y’all, I’ve combed the free internet for weeks and this is the best I could come up with. Maybe it’ll help you too.

Thoughts on Apple Glasses

Rumors about Apple glasses have been swirling around for years now, and they have never once interested me. Why would I want to wear a computer on my face? Didn’t I get LASIK eye surgery to avoid ever again having to clumsily clean a pair of lenses with the corner of my t-shirt and my hot breath? And what about folks that stare at a computer screen all day for their job anyway…are they going to stare at that screen through a pair of Apple glasses? Or do you only wear them in certain situations…in which case, what’s the point?

Whenever I read an article about Apple’s foray into AR headsets, something about it just doesn’t feel right. No one ever makes it sound like Apple is trying to make a product for the mass market, when I strongly believe that they will only release this product if it has mass market (or potentially future mass market) appeal. That means it has to follow the same trajectory as the Apple Watch: it’ll come in many different styles, and it’ll be something you wear all day long and charge at night. Perhaps it will even rely on a companion iPhone app until the components get small enough, then be made independent.

It’s one thing to have something a bit odd-looking strapped around your wrist, or sticking out of your ears. It’s an entirely different thing to have something goofy smack dab in the middle of your face. These glasses are going to have to be sleek af. They’re going to have to look good on a wide range of face shapes, sizes, and skin tones, and appeal to a wide range of personalities. They’re going to have to make people who don’t currently wear glasses want to wear glasses.

But Becky, why try to appeal to everyone when they could just create some sweet futuristic sci-fi specs for influencers and nerds? Because in my heart of hearts, I believe the narrative behind Apple’s AR glasses is going to be the same as the later iterations of the watch: Health. Wellness. Accessibility. The Human Experience.

Honestly, I think these devices will be revolutionary for people who are blind, colorblind, or have low vision. Not necessarily because assistive devices for these groups don’t already exist, but because Apple will do it better, and sell it cheaper. Imagine having Apple Glasses paired with some AirPods, discreetly giving you an audio description of whatever you’re looking at. Personally, I’d love to have a pair of glasses that could help me see better when driving at night (although, who knows if we’ll be allowed to drive while wearing these things. Maybe they’ll have a do-not-disturb-while-driving mode?).

Maybe users with hearing loss could enable live captions, à la Clips. Or on the flip side, hearing folks could use the glasses to recognize sign language. Suddenly, all text responds to dynamic type. Sounds like sirens give visual cues, while sights like signage give audio cues.

Tech reporters love to go on about all the “hidden” accessibility features in iOS that are actually great for the masses, and I think Apple Glasses are going to be a whole lot of that.

People walking or running outdoors could see mile/km markers in the environment around them, and maybe even fireworks in the sky when they reach their goal. Maybe when you’re hunting around for that lost AirTag, a big giant 3D arrow appears over it when you’re close. That might sound silly, but it also kind of sounds like Apple, doesn’t it?

All of this is to say: I think the reason I haven’t been interested in all of the Apple Glasses talk is because the focus seems to be on games, notifications, and maps, which to me are the least interesting and least imaginative features of this supposedly futuristic device. There is no “killer app” because the entire purpose of the device is not just to replicate iPhone functionality, but rather to fundamentally improve the human experience in a deep and meaningful way. It won’t get there in version 1.0, but I now find myself excited about the possibilities. This isn’t about giving us a new screen. It’s about freeing us from screens, from distractions, and bringing us together again.

Let’s be honest: Apple isn’t going to make a “Dear Apple,”-style commercial about how Apple Glasses impacted people’s lives by allowing them to play Minecraft on their bathroom floor or get notifications about the newest AppleTV+ shows directly in front of their pupils. It’s going to be about how two neighbors who speak different languages are able to communicate face-to-face with real-time translations hovering nearby. It’s going to be about people with disabilities having an improved experience in the world and greater overall quality of life. It’s going to be what it’s always been about (besides profit, of course): people. All of the fun-albeit-gimmicky 3D AR stuff is just a cherry on top.

5 Years of App-Making

I realized recently that it’s been nearly five and a half years since I released my first app onto the App Store. Like, holy smokes, where did that time go?

I’ve been feeling somewhat reflective these last few weeks (and I haven’t even watched Soul yet!) as I’ve taken time off from programming to focus on getting ready for Christmas, which is something I genuinely love doing. I ran across this tweet early in my preparations:

…and honestly, that’s exactly what I’m about at Christmastime: being that mom that makes things magical. The day after Christmas, my kids asked how many days were left until next Christmas, so I think I succeeded?

Anyway, back to five and a half years. Although I released my first app on June 10, 2015, I didn’t actually start making any money until the following June when I released LiveRotate, my app for rotating Live Photos (remember when editing Live Photos wasn’t possible without turning them into stills? lol). From June 2016 to today, I’ve made approximately $12,600 in profit from the App Store.

It’s simultaneously a lot and a little. It’s a lot for most developers. It’s a little for the developers I follow on Twitter. Early on, someone asked me how I would define success for myself as an indie developer. I remember stressing that my apps were just side projects (they are) and that I’d be happy if my revenue could cover the cost of my personal device upgrades (it has). At the time, I think I forgot to say something about how I wanted to make things that improved people’s lives, or just made them smile. In that way, I’ve also succeeded, and hearing from happy customers has been incredibly rewarding.

Most of the proceeds from the past four years have come from Snapthread. I owe its success to the very nice media coverage it got, and I owe that media coverage to my following on Twitter, and I owe that following to Brent Simmons, who cared enough to compile a list of women in the iOS community (thanks again, Brent!). Revenue from Snapthread has diminished considerably in the past year or so, as has my enthusiasm for struggling with AVFoundation. Scribblet and YarnBuddy were just what I needed this year, both in terms of challenge and inspiration.

I set a goal a few months ago of reaching 20 annual subscribers to YarnBuddy Pro by the end of the year. I’m happy to report that as of today, my subscriber count stands at 52 (plus an additional 16 who redeemed a promo code for the first year).

And here we are, at the doorstep of 2021. Convention says that it’s time to set new goals, but I’m just not feeling it. Time stood still in March, and yet somehow things began to happen at an increasingly frenzied pace. There’s a really excellent episode of Mr. Robot in the final season that basically happens in real time, with no spoken dialogue save for one line at the beginning and one at the end. It is ridiculously intense; I felt like I was holding my breath the entire time. This year kinda felt like that too. I hope 2021 feels like a nice, long exhale.

Of course, learning goals are another story. I’d really like to deepen my design skills this coming year as well as my understanding of SwiftUI and Combine. It’s amazing to realize what I’ve learned since starting my development journey, starting with Objective-C in early 2014, then quickly pivoting to Swift 1.0, playing with SpriteKit, AVFoundation, PhotoKit and PencilKit, and now writing apps using SwiftUI. It’s been a wild ride, and I’m so thankful for all of you that have helped me along the way (including many who I’ve never interacted with, but whose blog posts and Stack Overflow answers have literally kept me going).

Do you have indie business goals for 2021? How about learning goals? I’d love to hear them! I wish you all happiness and good health this coming year. 💚

Your Move, iPad

Hear that? It’s the sound of Mac fans. No, not your shiny new M1 Mac’s fans—chances are, you’ll never hear those—but rather, the sound of excitement rippling through the Mac community. This is something big. Really big. Now, I’m only 33, but someday when I go full fuddy-duddy I will speak of this: the great Intel/Apple Silicon transition. The beginning of a new era at Apple.

All that sounds dramatic, of course, but it’s interesting to trace all of the different paths that led us to this point. The A-Series chips, the introduction of Metal, rapid machine learning gains, the gradually degrading repairability scores as components became more integrated, the Secure Enclave, a new super fast emulation layer, a new unified memory architecture, and a 5nm process… years and years of work have now come to fruition with the first Apple Silicon chips for Mac. And our minds are blown.

Suddenly, we’re handed a thin, entry-level fanless laptop that performs better than almost every other Mac computer out there, and a low-end MacBook Pro and Mac Mini that make current Mac Pro owners sweat and clutch their wheels. So many questions abound. What new hardware designs will these gains make possible? What on earth does Apple have in store for its high-end Macs? Will anyone else even be able to compete? It’s an exciting time to be a Mac lover, but, surprise: this post isn’t really about the Mac. It’s about the iPad.

There’s no question that Apple has struggled to craft a cohesive, compelling narrative for the iPad. For a long time, there seemed to be a distinct lack of product vision. Everyone likes to speculate over what role Steve Jobs ultimately intended the iPad to have in people’s lives, but not only is that pointless, it’s also irrelevant. We don’t need Steve to tell us what the iPad is good for. We know what it’s good for, and we can easily imagine what it could be good for, if only Apple would set it free.

Just as Apple left us with great expectations for its Pro Mac line-up, the latest iPad Air also raises the bar in new and interesting ways. The Air served as sort of an appetizer for the new M1 chips, while also receiving a generous trickle-down of features from the iPad Pro, including USB-C and support for the latest keyboard and Pencil accessories. There have been rumors of new mini-LED displays for the next-gen iPad Pros, but it’s going to take a lot more than new display tech to set the Pros apart.

Francisco Tolmasky (@tolmasky) recently tweeted:

“A sad but inescapable conclusion from the impressive launch of the M1 is just how much Apple squandered the potential of the iPad. The iPad has had amazing performance for awhile, so why is the M1 a game changer? Because it’s finally in a machine we can actually do things on.”

Francisco is right: Power and performance aren’t the bottleneck for iPad, and haven’t been for some time. So if raw power isn’t enough, and new display tech isn’t enough, where does the iPad go from here? Will it be abandoned once more, lagging behind the Mac in terms of innovation, or will Apple continue to debut its latest tech in this form factor? Is it headed toward functional parity with the Mac or will it always be hamstrung by Apple’s strict App Store policies and seemingly inconsistent investment in iPadOS?

It’s clear that Apple wants the iPad Pro to be a device that a wide variety of professionals can use to get work done. And since so many people use web apps for their work, the introduction of “desktop” Safari for iPad was an important step toward that goal. The Magic Keyboard, with its built-in trackpad, was another step.

Here are ten more steps I believe Apple could and should take to help nudge the iPad into this exciting next era of computing.

  1. Give the iPad Pro another port. Two USB 4.0 ports would be lovely.
  2. Adopt a landscape-first mindset. Rotate the Apple logo on the back and move the iPad’s front-facing camera to the side beneath the Apple Pencil charger to better reflect how most people actually use their iPad Pros.
  3. Introduce Gatekeeper and app notarization for iOS. The process of side-loading apps should not be as simple as downloading them from the App Store. Bury it in Settings, make it slightly convoluted, whatever: just have an officially-sanctioned way of doing it.
  4. Ruthlessly purge the App Store Guidelines of anything that prevents the iPad from serving as a development machine. Every kind of development from web to games should be possible on an iPad. And speaking of games—emulators should be allowed, too.
  5. Release a suite of professional first-party apps at premium prices. If someone can edit 4K videos in Final Cut on their M1 MacBook Air, they should be able to edit 4K videos in Final Cut on their iPad Pro. I refuse to believe that these pro apps can’t be re-imagined and optimized for a touch experience. If Apple leads the way in developing premium software for iPad, others will follow.
  6. Make it possible to write, release, and install plug-ins (if appropriate) for the aforementioned first party apps.
  7. Bring App Library to the iPad and allow widgets to be positioned anywhere on the Home Screen. This isn’t groundbreaking, it just annoys the heck out of me.
  8. Release a new keyboard + trackpad case accessory that allows the iPad to be used in tablet mode without removing it from the case.
  9. Introduce Time Machine backups for iPadOS.
  10. 5G, ofc.

In the end, fostering a vibrant community of iPad app developers can only stand to benefit the Mac (and vice-versa).

It’s simple: people love their iPads. They love them so much they wish they could do even more with them. The new M1 Macs should give iPad fans reason to be excited; now that we’ve seen hints of what future Macs can be, it’s time for the iPad to reassert itself—to remind us once again who it’s for, and what makes it special.

In other words: Your move, iPad.

My Apple Silicon Dilemma

About 3 years ago I wrote a post about the very real possibility that my current 2016 13-inch MacBook Pro with Touch Bar could in fact be my last Mac. I paid ~$2600 for it at the time, and two years later purchased a 12.9″ iPad Pro that I absolutely fell in love with. My reasoning then was that this MacBook would likely work just fine through at least 2021, and by that time I could do all of my work on an iPad Pro.

Now, it doesn’t look like Xcode for iPad is coming anytime soon. And while my MacBook Pro does still work fine, there’s a “Service Recommended” warning for its battery, and its infamous butterfly keyboard has lasted so long without breaking that it actually makes me a bit nervous.

As I watched Apple’s “One More Thing” event on Tuesday, I found myself really drawn to the new M1 MacBook Air. Light, portable, quiet, fast. No new design, no fun new colors, but plenty of beefy benchmarks. There’s a part of me that feels like if I want to be a member of the developer community, I should always be interested in the most powerful “pro” machine I can get. With rumors of new 14″ and 16″ MacBook Pros with better displays and even more impressive capabilities, I should just wait, right?

And yet. As I reread my post about gravitating toward iPad, I realized that I really do use my MacBook Pro almost exclusively for running Xcode (and occasionally for messing with the CSS in my WordPress template). When it comes to blogging, or photo editing, or designing icons and graphics in Affinity Designer: I prefer my iPad Pro.

If I trade in my current MacBook Pro, I could get a new MacBook Air with 16GB of RAM and a 1TB SSD for around $1150. If we subtract that from the price I paid for my computer 4 years ago, that leaves $1449 I could spend on an arguably more compelling new iPad Pro someday AND I’d also have a faster, lighter laptop with better battery life than the one I have now (albeit with fewer ports).

Additionally, when I think of all of the things “missing” from the new Apple Silicon MacBooks: things like mini-LED displays, FaceID, touch screens, ProMotion, 5G, etc… I mean, that’s basically describing a next-gen iPad Pro, just without macOS.

The last piece of the puzzle is the Touch Bar: the new MacBook Air doesn’t have one. I don’t hate the Touch Bar like many folks do, but I don’t use it too terribly much. I like having emoji handy, but the new Air has a function button for opening up the emoji picker, so I don’t really think I’d miss it.

So then. Since I already tend to be an enthusiastic early adopter (Swift and SwiftUI 1.0, baby!), I think I’ve talked myself into ordering a new MacBook Air, and perhaps upgrading my iPad Pro at some point (maybe next year, maybe 2022). I’m going to wait a bit longer for some reviews to come out before I pull the trigger, but writing this post helped me think through my current situation and decide what I want my future tech setup to be. How about you? Are you going to order a new MacBook or Mac mini, or wait for the higher-end Macs?

With over a day left until the Apple event, I find myself in an unfamiliar bind: I’m out of predictions podcast episodes to listen to. 🙈 #mbnov

The only thing puzzling me at the moment is why I am still awake, seeing as we’re not likely to hear any significant updates until tomorrow night, maybe even Monday. Must just be nervous energy. #mbnov

Mac Event Predictions

I comfort-ordered a white HomePod mini this morning. When I’m feeling stressed, I tend to buy things and eat a lot of cheese, and hoo-boy that 18-count mega box of Kraft mac and cheese from Costco is emptying at an alarming rate. Lucky for me and my cheese stash, another opportunity to therapeutically buy stuff is right around the corner, with Apple’s “One More Thing” event just 3 days and 20 hours away, according to my Scriptable widget.

I will admit to not knowing much about computer processors, and what features/outcomes certain types of processors can enable. That’s why I’m the perfect person to write a hilariously specific prediction post for next week’s event. For this, I’m going to pretend I have an inside source (I don’t) and make some confident statements about what to expect. Why am I doing this? Because my kids are napping and I literally cannot concentrate on anything else I am supposed to be doing right now.

Okay, here goes.

MacBook Air

Apple is going to announce a brand new 13-inch MacBook Air that runs on a variant of the A14 chip. This Air will be the same thickness as previous models with the same tapered design, but will be fanless, with FaceID but no TouchBar. It’ll have two USB-C ports, a headphone jack, and come in 5 different colors. It’ll boast up to 13 hours of battery life.

MacBook Pros

Apple is going to announce new 14-inch and 16-inch MacBook Pro models. These will run on a different chip (maybe M14?) and won’t be fanless. They’ll have a TouchBar, FaceID, four USB-C ports, a headphone jack, speaker and microphone improvements, higher resolution mini-LED displays, and up to 12 hours of battery life. Their GPU performance will be bananas-good. They’ll come in the usual colors, along with one wildcard…my guess is black or blue.

All 3 laptop models will gain an upgraded 1080p HD camera and the FaceID sensor will mean support for Animoji/Memoji in video chats. I’m torn on whether or not they’ll bring back the light-up Apple logo. Maybe I’ll make this bold prediction: the Airs will have Apple’s classic rainbow logo on the back, but it will not light up. The MacBook Pros will have a white Apple logo that does light up.

Other Stuff

How about those AirTags, amiright? Actually, I think if Apple adds anything to this event, it will either be headphones or a Mac mini. There will undoubtedly be some demos of iOS and legacy apps running on Apple Silicon and Big Sur, including a game demo.

So, there you have it: my expert predictions for the Apple Silicon Mac event. Honestly, the only thing I really want out of this event is MacBooks in different colors, and I probably won’t get it. A girl can dream though, right?

I listened to John McCain’s 2008 concession speech today and was moved to tears by his display of integrity, humility, maturity, and kindness. Such a stark contrast to you-know-who, who appears to be stooping to new lows I hadn’t even considered possible. #mbnov