SnapThread 1.1 with Live Photo Support

I have a confession to make: I released SnapThread too early. I thought it was a good MVP (minimum viable product), but I was wrong. It was a little too buggy, and didn’t have a real “killer” feature.

The good news is, SnapThread 1.1 is now available, and it’s what I should have waited to release in the first place. I’m hoping that with the addition of Live Photo support, SnapThread can now be people’s go-to app for quick, easy portrait video compilations.

New features:

  • Live Photo support! SnapThread strips the 3-second videos from your Live Photos and lets you stitch them together.
  • The app now presents the native iOS share sheet upon successful export.
  • New duration limits. Filter your photo library by videos under 10, 15, 30, or 60 seconds.

Improvements/Bug fixes:

  • Faster exporting, plus a fix for a bug where exporting would fail when merging more than 16 videos at once.
  • Better progress reporting. Downloading lots of videos from iCloud can take a while, and now the app more accurately reflects the download’s current progress.
  • Smart aspect ratio: since Live Photos are 3:4 and regular videos are 9:16, SnapThread chooses the final video’s aspect ratio based on what you have more of. So if you have 45 Live Photos and 1 video you want to merge, the final video will be 3:4. Likewise, if you have lots of widescreen videos and only a few Live Photos, the final video will be 9:16. (You should see the tangled if-else statements that determine each clip’s scaling and translation values…it’s terrifying.)
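The rule itself is easy to sketch, even if the scaling math isn’t. Something like the following (the names and pixel dimensions are mine, not SnapThread’s actual code):

```swift
import CoreGraphics

// Hypothetical sketch of the "smart aspect ratio" rule: whichever
// media type the user selected more of determines the output size.
func finalRenderSize(livePhotoCount: Int, videoCount: Int) -> CGSize {
    if livePhotoCount > videoCount {
        return CGSize(width: 1080, height: 1440)  // 3:4 (Live Photos)
    } else {
        return CGSize(width: 1080, height: 1920)  // 9:16 (portrait video)
    }
}
```

The terrifying part is what comes after: computing each clip’s scale and translation so it fills (or letterboxes into) whichever render size wins.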

Since “the proof of the pudding is in the eating,” here’s a video of Charlie’s day at the pumpkin patch that I stitched together from Live Photos (and one video):

Future plans/ideas for the app:

  • Photo library album picker (so you can find clips more easily)
  • Scaling improvements and fixes (I’m sure there are some edge cases I missed!)
  • Ability to mute individual clips?
  • Ability to add a very simple title to beginning of video?
  • New localizations
  • Accessibility improvements
  • iPad version??

SnapThread Now Available

SnapThread icon

My new app, SnapThread, is now available on the App Store! SnapThread is a simple, no-frills utility for merging portrait videos from apps like Snapchat and Instagram Stories.

I am hoping to add support for Live Photos in the next version. In the future, I may also add the ability to include a title for a few seconds at the beginning of the video, or add a single background music track. However, I don’t want to complicate the app too much, so those features aren’t guaranteed to make the cut.

Let me know if you run into any trouble (errors and such) using the app, and if you like it, please consider leaving a rating or review. Thanks!

Please, Don’t Write About My App

Please, don’t write about my app. It’s not that good. In fact, it probably crashes sometimes. Also, I don’t really know what I’m doing.

Please, don’t tell your friends about my app. They probably won’t like it. I mean, the art assets aren’t good enough. It’s too simple. And it only really appeals to a tiny niche anyway.

Thus goes my inner monologue every time I prepare to ship an app. It’s not because I’m humble: trust me, I’m not. It’s just…fear of failure, I guess?

For indie developers, marketing is especially important. You gotta get the word out about your stuff. You gotta build your audience, refine your #brand, hack all dat growth, and so on and so forth. It feels gross. It isn’t, though—at least not for the most part. I struggle with it though, as I’m sure many of you do as well.

Look, my app isn’t special. It’s not Apple Design Award material. Does that mean it shouldn’t exist? No, it doesn’t mean that at all. I created something that’s useful to me, and now I’m going to share it with others. If they don’t find it useful, that’s fine. But if I want to give it its best chance at success, it’s still my job to tell its story.

But if you do…

Look, if you’re going to write about my app, say this: it’s a simple utility for merging short, portrait videos. It’s called SnapThread. It’s currently waiting for review.

It’s for parents who have a bajillion Snapchat videos of their kids with Marilyn Monroe hair or with a hot dog dancing on their head or what-have-you and all they want to do is create a sweet supercut of them all. No fancy filters or overlays or stickers or cropping. No dumb letter-boxing that forces it into landscape. Just stitch ’em all together and get on with your day.

It’s for travelers who have 30 Instagram Stories videos from their trip to Disney World spread over several days, and want to mash them all into one movie.

It was created by a mom who wanted to visualize how her son has grown.

SnapThread does what it says on the tin: it lets you select portrait videos from your photo library that are 15 seconds or less, re-arrange them to your liking, merge them together, and save them to your photo library. Sometimes it takes a long time. Sometimes the videos have to be downloaded from iCloud. Sometimes their rotation has to be fixed before the merge can finish.

This isn’t an app for your home screen. This is an app you throw in your “Photo/Video Editing” folder and use once in a while when you need it. It’s like “Clips,” but simpler.

SnapThread will be out soon. In the meantime, you can check out this page about it (the App Store link doesn’t work yet obviously).

Tell your friends! Or don’t, maybe. I don’t know.

Too Many AVPlayers?

Wow, I can’t believe it’s nearly September! For me that means I’m 1) popping allergy pills like a maniac because UGH RAGWEED, 2) getting really excited for the September Apple event, and 3) scrambling to finish up a random side-project app before iOS 11 hits the mainstream.

A couple nights ago I ran into a strange bug with my app, which uses AVFoundation to merge videos. Sometimes I would be able to export the final video with AVAssetExportSession and save it to my photo library, and sometimes it would randomly fail with the following error:

AVFoundationErrorDomain Code=-11839 "Cannot Decode" and NSLocalizedFailureReason=The decoder required for this media is busy., NSLocalizedRecoverySuggestion=Stop any other actions that decode media and try again., NSLocalizedDescription=Cannot Decode

In my app, every time a new video clip is added to the list of videos to be merged, I re-generate a preview of the final merged video. After merging the video, I set up an AVPlayer with the AVComposition like so:
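The original snippet didn’t survive here, but the setup would have looked roughly like this (a reconstruction, with my own placeholder names, not the exact code):

```swift
import AVFoundation

// Rough reconstruction of the original setup: a brand-new AVPlayer is
// created every time the preview is regenerated.
final class PreviewController {
    var player: AVPlayer?

    func setUpPlayer(with composition: AVComposition) {
        let playerItem = AVPlayerItem(asset: composition)
        player = AVPlayer(playerItem: playerItem)
    }

    func tearDownPlayer() {
        player = nil  // as it turns out, this doesn't free the render pipeline
    }
}
```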

I always made sure to set the AVPlayer to nil before re-generating the preview, so I couldn’t figure out why there would be any other “actions that decode media.” A trip to Stack Overflow revealed a possible platform limitation on the number of video “render pipelines” shared between apps on the device. It turns out that setting the AVPlayer to nil does not free up a playback pipeline and that it is actually the association of a playerItem with a player that creates the pipeline in the first place. Since developers don’t seem to have any control over when these resources are released, I knew I’d have to figure out another solution.

In the end, I decided to initialize the view controller’s AVPlayer right off the bat with its playerItem set to nil. Then I changed my setup function like so:
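A sketch of the revised approach (again with placeholder names): one long-lived AVPlayer whose item gets swapped, so no new playback pipeline is claimed for each preview.

```swift
import AVFoundation

// Revised setup: the player is created once with a nil item, and each
// new preview just replaces the current item on the existing player.
final class PreviewController {
    let player = AVPlayer(playerItem: nil)

    func setUpPlayer(with composition: AVComposition) {
        let playerItem = AVPlayerItem(asset: composition)
        player.replaceCurrentItem(with: playerItem)
    }
}
```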

Replacing the player’s playerItem instead of initializing it with a new playerItem each time (even though the player was previously set to nil) seems to prevent that weird “cannot decode” error (so far, at least). I’d like to know more about this error and why exactly it occurs, just out of nerdy curiosity.

Anyway, I hope this helps somebody out!

Data Persistence Dilemma

Sometimes, in order to solve a problem, I have to think through it out loud (or in this case, in writing).

Here’s the sitch: I’m making an app for knitters and crocheters. They need to be able to manage projects (i.e. “Baby blanket,” “Socks for mom,” etc.). In addition to a bunch of metadata about each project, users should be able to add photos and designate one or more images or PDF files as the project’s pattern. A PDF or image of the pattern isn’t required, but including one will allow users to enter a split view where they can view the pattern and operate a row counter at the same time.

A project shouldn’t necessarily “own” its pattern. In other words, a pattern can have multiple projects associated with it (say you want to make the same baby blanket for multiple babies), so as to avoid the needless duplication of the pattern file. A pattern can exist without a project and a project can exist without a pattern, but when linked, the project becomes the child of the pattern.

My user base includes people who may not always have an internet connection. Therefore, all data needs to be stored locally. However, those who do have an internet connection are going to want iCloud sync between devices.

I like Core Data. If I were to set this up in Core Data, without any consideration of iCloud syncing, I’d create Project and Pattern entities, store images and PDFs in the file system, and call it a day.

iCloud syncing is where things get murky for me. Core Data + iCloud is deprecated, and I don’t want to use it. Not only that, I don’t know what to do with the PDFs and images. Storing them as BLOBs in Core Data seems like a bad idea. I understand how to save them to the file system but don’t understand how to sync them via iCloud and also have a reference to them in Core Data. Do I use iCloud Document Storage for them? Do I zip them up somehow (NSFileWrapper??) and use UIDocument? How do I store a reference to them in Core Data (just the file name of the UIDocument, since the file URL is variable?). If users will be adding photos and PDFs at different times, do I use one UIDocument subclass for photos and one for PDFs or do I use a single document and update it with the added information? You can tell I obviously have no idea how this works, and a multitude of Google searches has yet to clear it up for me.

As for the rest of the information in Core Data, I’m thinking of trying to sync it with CloudKit using something like the open source version of Ensembles or Seam3.

I guess I’m not sure if I’m on the right track and would welcome any advice/feedback. I’d really like to stay away from non-Apple services (like Realm) for the time being. Comments are open!


Indies: when you decide to run with a new app idea, what sort of prep do you do before you ever fire up Xcode? Prototype? Design sketches?

WWDC 2017: Designing for Everyone

If I had to guess at an unofficial theme for this year’s WWDC, it would be “Designing for Everyone.” In addition to being the name of an actual session (806: Design for Everyone), the phrase captures the main thrust of what many of the other sessions were about as well. More than one presenter mentioned that while accessibility wasn’t a primary focus when re-designing some of Apple’s first party apps for iOS 10, it became a priority for iOS 11. One presenter (I can’t remember who) admitted that her team still had plenty of work to do on the accessibility front.

The session 802: Essential Design Principles reminded me of the best book I read in grad school: Donald Norman’s The Design of Everyday Things. Apple Design Evangelism Manager Mike Stern outlined many of the same principles that Norman discussed in his book—things like natural mapping and affordances.

In Chapter 2 of The Design of Everyday Things, there’s a little box with questions that designers should ask about their design:

How easily can one:

  • determine the function of the device [or in our case, an app or feature]?
  • tell what actions are possible?
  • tell if the system is in the desired state?
  • determine mapping from intention to physical movement?
  • determine mapping from system state to interpretation?
  • perform the action?
  • tell what state the system is in?

(Norman, D. A. (2013). The design of everyday things. New York: Basic Books, a member of the Perseus Books Group.)

The questions are based on seven stages that humans go through when taking an action. When people use our apps, they have a goal in mind that they want to accomplish. They should be able to figure out what actions are available, which ones will lead them to their goal, and how to execute said action(s). Ideally, they should be able to do this without any additional walkthroughs, on-boarding, or tooltips.

They should also be able to accurately predict and evaluate the outcome of their actions. This can be achieved using things like error/success messages, animations, button state changes, view transitions, etc.

An app’s design should clearly communicate what you can do with it. Can you toggle a switch? Press a button? Scroll up and down? All of those things are called “affordances”—things the app allows you to do. 3D Touch is tricky because one of the affordances of a flat piece of glass is NOT that you should be able to exert pressure downward “into” it. Rather, glass, as a material, affords swiping and tapping by its very smooth, flat nature.

AirPods are another example of hidden affordances. Can you tell by their design that you can tap on them to execute additional functions? No. That’s traditionally considered bad design.

State changes are also important. What state is your app in? Is it syncing, or loading something? If you are using a tabbed interface, is it clear which tab the user is currently viewing? These questions are integral to making sure your design is accessible for everyone.

I’m planning to spend the next few weeks updating Nebraska93’s design to be compatible with Dynamic Type. A few days ago I tweeted a video of my successfully re-designed table view cells and I’m really surprised by the number of favorites and retweets it got. A few people remarked that they didn’t know Accessibility Inspector could adjust Dynamic Type settings on the fly.

My advice to fellow devs is twofold. First, open up Accessibility Inspector and see how your app looks to people who have low vision. Offhand, I can think of three people I know in real life who use the larger text size settings, and I suspect it’s far more common than you might think. Second, read The Design of Everyday Things. It’ll make you really mad at a lot of doors and household appliances, but it’s worth it!


iOS 11 has already had two major effects on my iPad usage: 1) I feel delighted whenever I interact with it and 2) I want to use it more.

Announcing Nebraska93

Nebraska93 Icon

Two days ago I quietly released Nebraska93, a county license plate game for anyone traveling through Nebraska.

I’m really happy with the way it turned out and thought I’d share some fun things about it:

  • I drew the app icon, which is based on the state’s 1940 license plate design.
  • I also created all of the icons and artwork in the app and gathered all of the interesting facts for each county.
  • Nebraska93 was “in review” for only 12 minutes.
  • The photo behind the Nebraska map on the app’s main screen was taken from my yard.
  • The app uses AdMob to serve ads. If it really takes off (I mean really takes off), I’d definitely entertain the idea of rolling my own ads for local Nebraska businesses.
  • I used third party libraries to create confetti and display in-app notifications. I also modified Marco’s IPInsetLabel for the “Did You Know?” facts on each county page.
  • The basic county data (population, established date, county seat, etc.) is loaded into Core Data on the app’s first launch. The interesting facts about each county are in a separate plist so that I can easily add more as time goes on.
  • The dynamic Nebraska map, circular progress indicator, and corn icon were all made in PaintCode.
  • The “Discoveries” images were drawn with Affinity Designer and involved a lot of tracing with vector tools (except for the Walgren Lake Monster, which I made up myself).
  • If Nebraska schools showed interest in purchasing the app, I’d remove ads, switch to paid-upfront, and then release an ad-supported “lite” version.
  • One of my favorite things about the app is the little informational view that pops up when you tap an unlocked Discovery.

Discovery modal view

I think it would really be fun to collaborate with local elementary schools. All fourth graders learn about Nebraska history, so students could research the counties and send me their own fun facts which I could feature with their first name and/or school.

The next things on my to-do list are to make it available for iPad, improve accessibility, and re-take the app’s store screenshots without the ad banner (whoops).

Heartfelt thanks to all of you who have downloaded Nebraska93 so far (especially those who don’t live in Nebraska, lol)! If you have any questions about how anything in the app works, let me know. I always like it if my work can help other beginners in some way. And if you’re a beginner, let me just encourage you to explore app ideas that might benefit your local community. It’s a great way to gain experience and give back at the same time!


Currently preparing to submit my Nebraska app for review later tonight. This whole project took about five months longer than it needed to, but I’m proud of the finished product and hope people in my state will have fun with it!

April 21st – Progress Update

A couple months ago I wrote about a little local project I’ve been working on: a license plate game for Nebraska residents/tourists (okay, Nebraska doesn’t really have tourists). Things are slowly coming together with only a few, albeit time-consuming, things left on my to-do list. One of those things involves researching my state’s history and geography, and another involves designing some achievement-related assets and a system for unlocking them.

The UI still needs work, obviously (especially the information density). The “Discoveries” section will be home to special “points of interest” which will be little badges that are automatically unlocked when a person discovers certain counties. Each point of interest will display a little story or historical nugget when tapped. The app is supposed to be educational and fun, and while I don’t think it’ll make much money (if any), I want it to be well-designed and look nice in my app portfolio.

As I mentioned in a previous post, there are 93 counties in Nebraska. I decided to request an App Store rating when a person spots 62 of them. I figure if they stick with it that long, they must like it at least a little! There’s also a link to rate the app in the modal Settings view.
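Since iOS 10.3, this kind of gated prompt is typically done with SKStoreReviewController; a minimal sketch, assuming that’s the mechanism (the counter name is made up, and the system, not the app, ultimately decides whether the prompt appears):

```swift
import StoreKit

// Hypothetical sketch: ask for a rating once the player has spotted
// 62 of Nebraska's 93 counties. requestReview() is only a request —
// iOS may or may not actually show the prompt.
func countyWasSpotted(totalSpotted: Int) {
    if totalSpotted == 62 {
        SKStoreReviewController.requestReview()
    }
}
```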

So yeah! That’s where I’m at. I’m hoping it won’t take much more than a month for me to finish and ship this baby. 🤞

Apple’s Identity and the New Mac Pro

The Mac Pro was dead, or so many of us thought. Or so Apple thought, apparently, because rumor has it the decision to revive it was made fairly recently.

The implications of the Mac Pro’s presumed death were worrisome and left us with many questions. Did Apple care to serve a higher-needs market? If not, why? What machines were Apple’s own engineers using? What did this mean for the future of macOS and Macs in general?

Apple, being Apple, naturally sought to regain control of its narrative and did so with unprecedented transparency and humility. But I can’t help but wonder whether Apple itself fully realizes the implications of its decision to double down on pro hardware. This wasn’t just a product decision with effects on staffing, component sourcing, and profit margins; it was a decision about the company’s identity. What is our core mission? Who is our audience? Answering those questions (and making sure every employee knows the answer to those questions) is like Running a Company 101. And yet Apple seemed to be confused.

Depending on when the initial decision to sunset the Mac Pro was made, it seems like a lot of this could have been avoided if Apple had utilized its own mission statement. Up until early June 2015, the company still ended every press release with “Apple designs Macs, the best personal computers in the world…” Now filter “Should we kill our high end personal computer?” through that and the answer is an emphatic “Nope.”

Setting all of that aside, I hope Apple realizes that new hardware should only be the beginning. After all, for the most part, pros seem to want a Big Boring Box of Raw Power—a flexible cheese grater of the future. I’m sure Apple will find a way to make it look a little sexier than that, but the fact remains that software needs to be the differentiator between the new Mac Pro and a souped-up Windows machine.

In other words, renewing a commitment to professionals involves more than just designing the perfect computer for professionals. It means designing an OS and a software ecosystem for professionals. For example, it’s not enough for Adobe to do cool stuff with the Touch Bar, or for the iPad Pro to act as a Wacom-like tablet input for the Mac. Adobe’s (and other high end software-makers’) products need to integrate with macOS in a way that makes them better and easier to use on a Mac than Windows. It’s Apple’s job to make that possible and to provide incentive for companies to put resources into it. 

So, Apple. Expand your Mac software teams. Fix the Mac App Store. Make sure some of that Workflow love makes it to the Mac. Focus on improving iCloud’s stability and features, because collaboration and remote work are the future and because RAW photographs aren’t getting any smaller (and your storage tiers are not friendly to even amateur photographers). 

Look, you may not be Apple Computer anymore but you have reaffirmed that you are Apple, a company that makes personal computers (among other things), and that your audience is everybody, and that you want to be the best. So do that. 


Larder Blog Interview

Larder Blog Interview

The only interview I’ve ever given was in middle school, to my local newspaper, after winning a spelling bee (ironically, the reporter spelled my last name wrong). So I was super excited when Belle Beth Cooper asked me if she could interview me for Larder’s “Making it” series. In the interview I talk a bit about my background and the advice that I would give beginning iOS developers today.