Sploot the Corgi Stickers!

I made a sticker pack!

It features a corgi named Sploot (a “sploot” is a common corgi position where both hind legs are splayed out, as illustrated by this helpful Buzzfeed article). There are currently a dozen stickers in the pack and I plan to add more throughout the year.

Sploot the Corgi

I want to talk a little bit about Sploot’s launch.

At some point last Monday night, the iMessage App Store unexpectedly went live and sticker packs became available for purchase for anyone running the iOS 10 beta. I rushed to the store and checked all the featured lists and…nada. Apparently Sploot, like many other apps, was still “Pending an Apple Release.” I admit, I was bummed not to be featured (doesn’t everyone love corgis?!). Looking through those lists, it seems I would have had better luck creating something that was purposefully bad (see: failmoji) than attempting to create something good and falling short of the mark.

Anyway, when my stickers were finally ready for sale the next day, I sent out a couple tweets about them. Meanwhile, other sticker packs were getting featured in round-ups on all my favorite Apple sites. My mistake? Not inviting bloggers to beta-test my sticker pack. Sure, they might not have listed it anyway, but it would have been worth a try! I’ve often talked about how important promotion is; apparently, I can’t take my own advice.

My next mistake became obvious to me as I was spamming my husband and best friend with stickers: my stickers were too big. I mistakenly thought that everyone would use the maximum sticker size of 206×206 points. However, it looks like most stickers are around 130×130 points. So, I resized mine and submitted an update (which was approved yesterday). I also changed the name of the pack from “Sploot” to “Sploot the Corgi” and switched it from the “Emoji & Expressions” category to “Animals and Nature.” I’m hoping these changes will make it easier to find.

In the last week I sold about 80 sticker packs—which is fine, but I was definitely hoping for better!

I think the best thing I can do now is to get working on an update with some seasonal stickers and continue hoping for a feature. And hey, if you’re in need of some cute corgi stickers, you can download Sploot the Corgi from the App Store!

Submitting Stickers Through iTunes Connect

As I sat down to submit my corgi-themed sticker pack last night, I realized I had no clue what to do. Apple’s instructions for submitting standalone sticker packs are actually spread across two separate guides (iMessage App Submissions | Sticker Submissions), which added to my confusion.

So, in case you’re as confused as I was, here are some answers to questions you may have.

What size screenshots do I need to prepare?

You need to prepare two sizes: iPhone 6s Plus (or any of the 5.5″ devices) and 12.9″ iPad Pro.

Where do I upload the screenshots?

You need to add them in two places: at the top of the iTunes Connect record under “App Preview and Screenshots” and farther down under “iMessage App.”

Do I need to add a 1024×1024 square app icon even though Apple doesn’t list it in its Human Interface Guidelines or icon template?

Apparently. There’s a paragraph at the bottom of the iMessage App Submission guide that makes me think it might be possible to view sticker apps on the iOS App Store if you’re not running iOS 10:

if users are on an operating system lower than iOS 10, the link will open the product page on the App Store for iPhone and iPad, and users can download your app from there.

Of course, that might just apply to full iOS apps that include sticker extensions. Who knows? If you’re using the Photoshop template provided by Apple, double-click the “icon” layer under the 58×58 size. It will open as a 1536×1536 square version…you can size that down to 1024×1024 and hit “Save As” (so you won’t mess with the template) to easily fulfill this requirement.

What should I include in the screenshots?

I was a little unsure about this, but decided to go with one shot of my sticker pack in expanded view, and one shot of a sample conversation using my stickers. I took all of my screenshots in the Simulator because I don’t have test devices. I used Photoshop to add profile pictures for my message participants using royalty-free stock photos from Pexels.

What category should I put my sticker pack in?

Why, the Stickers category, of course! From there, you can choose a subcategory or two.

Can I use “stickers” and “iMessage” in my keywords?

Yep! There are some caveats though, as described in the iMessage App Submission guide.

Removing the Headphone Jack: Pros & Cons

I know, I know. This topic has been done to death. However, as we’re two days out from Apple’s “See you on the 7th” event, I thought it’d be fun to list my personal pros and cons of eliminating the headphone jack. 

Cons

  • Without a cable, baby will have one less thing to yank on while nursing. How will he occupy his hands? What will he coat with his drool?!
  • I won’t be able to practice identifying nautical knots several times a day after retrieving my earbuds from my pocket.
  • After I drop my phone in water for the 19th time, I won’t experience the simple joy of digging a single grain of rice out of the headphone jack.

Pros

  • I love remembering to charge things!
  • I’ll no longer be spoiling my ears with high quality audio. Take that, you pompous flaps of cartilage.
  • My phone will finally be thin enough to mince garlic on the go.
  • I love having to triple-check whether my headphones are paired so that sudden noise doesn’t wake Charlie. Aw, who am I kidding? I love it when he wakes up from his naps early!
  • Don’t tell anyone, but I dream of buying dongles. Especially when they’re $29.99. Especially when I’ll probably lose them.
  • Wireless stuff is so so so reliable always. And easy to use in old cars too.

Wow. After reviewing my pros and cons list, I’ve determined that I actually don’t care what happens to the headphone jack. If Apple thinks this will propel us into a perfect future filled with pure, unadulterated Slabs of Glass, then who are we to stand in the way?

On Stickers

As I seek to understand more about the popularity of stickers in messaging apps (hint: they’re more than just big emoji!), I thought I’d share some of the interesting articles I’ve come across.

Sticker Culture

  • Stickers: From Japanese craze to global mobile messaging phenomenon by Jon Russell (TNW)

    Despite success in Asia, it appears likely that the appeal of stickers is different in Western markets, where Romanic alphabets are better supported on smartphones and there is less of an emoji/cartoon culture.

  • Why is every messaging app under the sun trying to sell stickers to you? by Jon Russell (TNW)

    Stickers are a frictionless way to monetize a service. By that I mean that they do not immediately disrupt the user experience by serving adverts, forcing video plays or using other forced ‘interactions’ that might serve to draw revenue from sponsors. Stickers are not intrusive and can keep an app priced free.

  • The Elements of Stickers by Connie Chan (Andreessen Horowitz)

    The “trading” element, however, is less about statically collecting and more about dynamically custom-curating one’s personal collection of stickers. These collections also signal one’s “sticker street cred” in Asian messaging apps — you can always tell a newbie or non-tech savvy user by their use of the stock stickers only.


Key Takeaways

  • Designers only need to submit @3x versions of stickers (max size: 618×618 px, 500 KB)
  • PNG files are preferred (even for animated stickers)
  • Pay attention to transparency because your stickers can overlap message bubbles and images in conversations 
  • If you are making stickers that feature a single character, name the sticker pack after the character (or “CharacterName & Friends”)
  • If you want to appeal to Asian users, a quick Google image search of the word “kawaii” wouldn’t hurt
  • Most sticker packs seem to have at least a dozen stickers

Even though I’m not the greatest artist, I’m hoping to have a sticker pack ready for September!

Voicemail Transcription in iOS 10 Beta

I don’t answer my phone much these days. It seems I’m always either holding my sleeping baby, dealing with some sort of unfortunate situation involving his clothes, or creeping around the house in ninja-like silence while he snoozes in his bassinet. There’s no room for noise in my life right now—not when my chances of getting a decent night’s sleep are on the line!

As such, one iOS 10 feature that I haven’t heard much about but that I’ve found very useful is voicemail transcription. For instance, I missed two calls from a friend today. Thanks to voicemail transcription, I found out that the first call was urgent: something had gone wrong in the online class that I helped him set up and he needed me to take a look at it.

The second call came just as Charlie was settling down for his all-important afternoon nap in my arms. This time, I saw that my friend just wanted to tell me a story and that I could call him back when I got a chance.

In both cases, glancing at the transcription was way more convenient than holding the phone to my ear and potentially waking Charlie, who I’m convinced could hear a butterfly flap its wings in Africa.

Another great thing about this feature is that it gives you the ability to provide quick and easy feedback on the quality of the transcription. Below the transcribed paragraph it says something like “How useful was this?” and you can select either “Useful” or “Not useful.” Dead simple, right? I’m sure that prompt will go away when the final version is released but for now I’m grateful for its existence.

It made me wish that Apple would add that same kind of feedback mechanism to all of its AI “suggestions,” even if only in the beta releases. Whether it be suggested apps, contacts, calendar items or locations, I should be given the opportunity to report on their usefulness/relevance. Otherwise, how does Apple get better at this? How do they know where they need to improve? Heck, how do they even know what they’re doing well?

Quick unobtrusive feedback prompts are a great “opt in” way of figuring out the answers to those questions.

Thoughts on Screen Time for Kids

In episode 176 of ATP, Casey, John and Marco discussed their thoughts on “screen time” for kids and whether or not parents should limit the amount of time their kids spend in front of screens of any kind. It caused me to reminisce about my own childhood as well as my [few] experiences with babies and screens, so I thought I’d share those memories here. (They might be kinda boring, so the tl;dr version is: I think screens are A-OK!)

A Childhood of Screens

You might know me as the stay-at-home mom who codes on the farm, but I actually grew up in Bristol, Connecticut. I have very fond memories of our house there; most of them involve me playing outside in our gigantic backyard: finding salamanders under rocks, riding around in my little motorized Jeep, trying to start fires with a magnifying glass…you know, kid stuff.

However, my mom worked a late shift so there was a period of time in the afternoon before my dad got home that I spent with a babysitter—a lovely, middle-aged French Canadian woman named Leona. “Ona” and I watched a LOT of television. I always joke that I learned how to spell by watching Wheel of Fortune. We watched the usual slew of early evening sitcoms (the one I remember most clearly is Roseanne) and I watched my favorite movie, Homeward Bound, at least a million times.

When I was 7, my family moved to Ohio and several things happened: 1) I started at a new school in the middle of second grade, 2) My parents bought me a TV for my room, and 3) I got a Super Nintendo. I had a hard time making friends in Ohio, so I spent lots of time playing Donkey Kong Country, poring over Nintendo Power magazines, and watching TV. Every night I’d fall asleep watching Nick at Nite, through which I became familiar with many of the shows of my parents’ time: I Love Lucy, The Jeffersons, Mary Tyler Moore, All in the Family, Bewitched, Happy Days, and more. In many ways, I think those shows helped me understand how adults related to one another, as well as develop a sense of humor and empathy.

Still, I spent plenty of time not looking at screens. Ohio used to be underwater once upon a time, and my parents and I would visit parks where you could dig for fossils of sea creatures. I also became interested in model trains, so we’d visit train museums and displays around the state.

Two years later, when I was 9, we moved to Nebraska. I spent hours on the family computer, playing with virtual pets, learning HTML, and discovering a vast world of pirated content. I’m pretty sure I played Pokémon on an emulator before my mom bought me my first Game Boy. Like many kids, I was so addicted to Pokémon that I’d sit in the back seat of the car after dark, struggling to play the game by the light of passing streetlights.

Let me tell ya, the late 80s/early 90s were a weird time to grow up because everything was changing so fast and nobody knew what they were doing. As a kid, I sort of straddled the divide between pre-Internet and always-connected—between one screen (the TV) and ubiquitous screens. Since computers and gaming systems were new, and cool, and fun, my parents didn’t think twice about letting me play with them. And now, after all those hours of unrestricted screen time, here I am: a relatively well-functioning human being.

Screens + Babies

Charlie's selfie

Charlie lined this shot up himself!

Charlie is three and a half months old now. He’s very interested in our phones and likes looking at himself via the front-facing camera. There’s a period of time during the day when he refuses to sleep, but is also too cranky/sleepy to play with anything. During that time, he sits on my husband’s lap and watches Fast N’ Loud, a show about restoring old cars. It’s all just a bunch of blurry blobs to him, yet he’s fascinated by the movement and the bright colors of the hot rods. When the episode is over, he’s usually ready to eat and finally take a nap.

There’s a little baby girl at our church that I watched a few times in the church nursery. She was too shy to interact with the other kids and so I just held her on my lap the whole time while she watched them play. Suddenly she noticed my Apple Watch and was transfixed by the honeycomb screen. At a little over a year old, she figured out that if she moved her finger over the surface of the watch, the little app icons would move. That interaction paradigm of touching a screen is incredibly easy for babies to get the hang of. It opens up a world of learning to them that can serve as a good supplement to those all-important hands-on activities.

The Future

I’m not worried about managing screen time with Charlie. In the same way my generation remembers cassettes, record players, rotary phones, and finding the answers to our questions at the public library, our kids may remember smartphones and tablets and 5K displays. In other words, the children who grew up with nothing but screens may very well be the ones who lead us into a future without them (or with fewer of them).

Our kids may be the ones who bring augmented reality to the mainstream. They may laugh at the thought of us staring at our phones all day. They may very well straddle a new divide: between ubiquitous flat pieces of glass and…well, whatever’s next. Heck, in some ways, that screen time might be essential in helping them figure out what should be next.

Sherlocked?

Well my friends, WWDC has come and gone and I, like many of you, am now deeply engrossed in the plethora of new videos, sample projects, and API diffs that Apple has posted.

Whether you were actually there or experienced the fun from the comfort of your home, you may have noticed one fateful phrase wedged amongst the many words on the Developer APIs slide: “Live Photo Editing.” And if you didn’t see it there, you may have read about it on the “What’s New in Photos” pop-up in the iOS 10 beta:

What's New in Photos

Screenshot by Casey Liss

So yeah, with iOS 10 you can now rotate, crop, and otherwise adjust Live Photos right in the Photos app—which is awesome and just as it should be!

I hesitate to say that I was sherlocked (which, incidentally, keeps auto-correcting to “shellacked”). In order for an app to be sherlocked, I think there has to be some uncertainty involved. In other words, it wasn’t inevitable that Apple would build f.lux-like capabilities into iOS. Nor was it inevitable that Maps would gain the ability to locate your parked car, or that Photos would auto-generate compilations and call them Memories (there is an app by that name with similar functionality). However, I do believe it was inevitable that Apple would expand Live Photo-editing capabilities…the question was just “when?”

Now we know the answer.

And that’s OK. I learned so much building LiveRotate, and even sold a few copies! From its release on June 7 to today (June 28), LiveRotate was downloaded 304 times. A few people requested refunds, which was expected. I think the app can still provide value to the general public this summer, though when September rolls around I’ll likely remove it from sale.

Overall, I’m very happy with how well it sold, and am feeling more confident than ever about my ability to build apps!

LiveRotate stats

So what’s next for me? Well, I have two ideas for Messages apps: one sticker pack (depending on my drawing abilities) and one app that lets users build their own stickers. I’m also in the process of updating my Bible verse app for watchOS 3. After that, it’s back to Corgi Corral and then onward to some other app ideas that are floating around in my noggin (wow, does anyone use that word anymore?).

Best of luck to all of you with your summer projects! And for those tinkering with the idea of making an app: there’s no better time to get started!

The Making of LiveRotate

I thought it might benefit other beginners if I wrote up an overview of how I went about building LiveRotate. (Spoiler alert: there was a lot of Googling involved!)

Starting the Project

When I began, I didn’t have the foggiest idea how PhotoKit worked, and I had all but forgotten how to use collection views, which help you display things in a grid. So, I turned to Apple to see if they had a sample project for the Photos framework and luckily, they do. It has even been updated to “illustrate the use of LivePhoto APIs.” Right on!

I then translated almost the entire thing, line by line, into Swift. I’m not joking. I needed the code for the collection view, for displaying a Live Photo with a badge, and for caching thumbnails as you scroll, and that was honestly the bulk of the project (if anybody needs any of that code in Swift, just let me know!). As I translated the code, I learned what each piece did, so that I wouldn’t just be blindly copying things without building up my understanding.

Handling Rotation

Deciding how to rotate the photos was confusing at first because there are two ways to do it: I could set a rotation flag that determines how the photo is displayed on a device (a flag that may not be respected by all programs/devices), or I could “physically” rotate the bits using some kind of transform. The latter seemed like the right way to go, so I set about learning two new frameworks: Core Graphics for the JPEG part of the Live Photo and AVFoundation for the QuickTime movie part.

Rotating Photos

There are three image-related classes in iOS: UIImage, CGImage, and CIImage. For a beginner, that was SUPER CONFUSING (and still sort of is). Some more searching led me to a category for rotating CIImages by 90 degrees. The Swift equivalent of an Objective-C category is an extension. So, I translated that code as follows:

extension CIImage {

    func imageRotatedByRadians(radians: CGFloat, imageOrientation: UIImageOrientation) -> CIImage {
        let finalRadians = -radians
        var image = self

        // Rotate the image around its origin.
        let rotation = CGAffineTransformMakeRotation(finalRadians)
        let transformFilter = CIFilter(name: "CIAffineTransform")!
        transformFilter.setValue(image, forKey: "inputImage")
        transformFilter.setValue(NSValue(CGAffineTransform: rotation), forKey: "inputTransform")
        image = transformFilter.valueForKey("outputImage") as! CIImage

        // Rotation can move the image out of the frame, so translate it
        // back until its extent starts at (0, 0).
        let extent: CGRect = image.extent
        let translation = CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y)
        transformFilter.setValue(image, forKey: "inputImage")
        transformFilter.setValue(NSValue(CGAffineTransform: translation), forKey: "inputTransform")
        image = transformFilter.valueForKey("outputImage") as! CIImage

        return image
    }
}

Here’s an overview of the photo rotation steps:

  1. Request the photo data using PHAssetResourceManager
  2. Create a CIImage from the data and use the extension to rotate it
  3. Add appropriate metadata (more on this later), convert the resulting image to a JPEG and save it to a temporary location

Rotating Videos

Rotating the video portion of the Live Photo turned out to be much, much trickier. This Technical Q&A from Apple describes which methods actually rotate the buffers and which only set a rotation flag. In order to rotate the video, I needed to use an AVAssetExportSession and apply a transform.

There are 4 orientations that a photo or video may be captured in. I made this convenience method to take the video’s original transform and return information about it.

func orientationFromTransform(t: CGAffineTransform) -> (orientation: String, isPortrait: Bool) {
    var assetOrientation = "Up"
    var isPortrait = false
    if t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0 {
        assetOrientation = "Right"
        isPortrait = true
    } else if t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0 {
        assetOrientation = "Left"
        isPortrait = true
    } else if t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0 {
        assetOrientation = "Up"
    } else if t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0 {
        assetOrientation = "Down"
    }
    return (assetOrientation, isPortrait)
}

Each of those 4 orientations could then be potentially rotated 3 different ways: 90 degrees, -90 degrees, and 180 degrees. When you rotate the video, you rotate it around its origin point, which can potentially move the video out of the frame. Therefore you have to apply a translation to get it back to where it’s supposed to be. Derek Lucas (@derekplucas) got me started by creating a Playground that rotated videos on the Mac. I took his translation values and had to tweak them, via trial and error, to get it to work on iOS. Here’s just a small sample of what that hot mess looks like:

var adjustY = videoSize.width
var adjustX = CGFloat(0)

if radians == CGFloat(-M_PI_2) {
    if orientation == "Right" || orientation == "Up" {
        adjustX = videoSize.height
        adjustY = 0
    } else if orientation == "Left" {
        adjustX = videoSize.width
        adjustY = -1 * videoSize.width / 4
    } else {
        adjustX = videoSize.width
        adjustY = -1 * videoSize.height / 4
    }
}
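
To see why translation is needed at all, it helps to track where a frame’s corners land after a rotation about the origin. This is a standalone sketch (the helper name is mine, not from the app) that computes the translation required to bring a rotated width×height frame back so it starts at the origin:

```swift
import Foundation

// Hypothetical helper: rotate the four corners of a width×height frame
// by `radians` around the origin, then return the translation that moves
// the rotated frame's bounding box back to (0, 0).
func translationToReframe(width: Double, height: Double, radians: Double) -> (tx: Double, ty: Double) {
    let corners: [(x: Double, y: Double)] = [(0, 0), (width, 0), (0, height), (width, height)]
    let rotated = corners.map { corner -> (x: Double, y: Double) in
        (x: corner.x * cos(radians) - corner.y * sin(radians),
         y: corner.x * sin(radians) + corner.y * cos(radians))
    }
    // Negating the minimum coordinates re-anchors the frame at the origin.
    let tx = -rotated.map { $0.x }.min()!
    let ty = -rotated.map { $0.y }.min()!
    return (tx: tx, ty: ty)
}

// A -90° rotation of a 1920×1080 landscape frame drops it below the x-axis,
// so it must be translated up by the frame's width (here, 1920).
let t = translationToReframe(width: 1920, height: 1080, radians: -Double.pi / 2)
```

That result lines up with the default at the top of the snippet above, where adjustY starts out as videoSize.width; the per-orientation tweaks are presumably needed because the asset’s preferred transform already encodes a rotation of its own.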

Once rotated, I saved the video to a temporary file.

Live Photo Metadata

You can’t just throw any two photos and videos together and make a Live Photo without doing a little extra work. I found this project by genadyo on GitHub that shows what sort of metadata must be written into the photo and video files in order for them to be paired up correctly.

Basically, you have to do 5 things:

  1. Create an identifier of some kind, assign it to the key kFigAppleMakerNote_AssetIdentifier (which is “17”) in a new dictionary and set that dictionary as the kCGImagePropertyMakerAppleDictionary for your JPEG file.
  2. Create an AVMetadataItem where the key is “com.apple.quicktime.content.identifier” and the value is the identifier you created in the first step.
  3. Create an AVMetadataItem where the key is “com.apple.quicktime.still-image-time” and the value is 0. For some reason, this is required in order for iOS to recognize it as a true Live Photo.
  4. Use AVAssetWriter to re-save the video you made using AVAssetExportSession, this time writing in the appropriate metadata. Of course, if you aren’t rotating the video, you could just use AVAssetWriter from start to finish.
  5. Save both the photo and the video to Photos like so (where “fileURLs” is an array containing the two temporary URLs for the photo and video):
     PHPhotoLibrary.sharedPhotoLibrary().performChanges({
         let request = PHAssetCreationRequest.creationRequestForAsset()
         request.addResourceWithType(PHAssetResourceType.Photo, fileURL: fileURLs.first! as NSURL, options: photoOptions)
         request.addResourceWithType(PHAssetResourceType.PairedVideo, fileURL: fileURLs.last! as NSURL, options: videoOptions)
     }, completionHandler: nil)
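
Stripped of the framework plumbing, the pairing comes down to one shared identifier written in two places. Here’s a minimal sketch of just those values (variable names are mine; the real code wraps the first dictionary into the JPEG via kCGImagePropertyMakerAppleDictionary and the second pair into AVMetadataItems):

```swift
import Foundation

// The shared identifier that ties the JPEG and the QuickTime movie together.
let assetIdentifier = UUID().uuidString

// Step 1: stored in the JPEG's Apple maker-note dictionary.
// "17" is the raw value of kFigAppleMakerNote_AssetIdentifier.
let makerAppleNote: [String: Any] = ["17": assetIdentifier]

// Steps 2–3: stored in the movie's metadata under the QuickTime key space.
let movieMetadata: [String: Any] = [
    "com.apple.quicktime.content.identifier": assetIdentifier,
    "com.apple.quicktime.still-image-time": 0,
]
```

If the identifiers in the two files don’t match (or the still-image-time entry is missing), Photos treats them as an ordinary photo plus an ordinary video instead of a Live Photo.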

Conclusion

I started LiveRotate on April 27 and finished it on June 6, so it took just a little over a month to make. I’ve had some good suggestions for improvements to the app and hope to implement those soon. For now, though, my brain can finally break free from “obsessive coding” mode and focus on important things like catching up on household chores and cooking some real food!

Edit: 4:35 pm CDT

I forgot to add that I created the app’s icon in Photoshop CS6, and translated the app into German, Spanish, Italian and Russian via a hilarious process of changing my phone’s language, opening up apps that had the words/phrases I needed, and screenshotting them. I know—I’m a dang thief!

LiveRotate

This post could’ve easily been titled “I made an app with a two month old baby glued to me, AMA.” Of course, if it weren’t for Charlie, I wouldn’t have gotten the idea for the app in the first place!

It started with a giraffe.

At the ripe old age of two months, Charlie enjoys things like smiling, staring at ceiling fans, getting his outfit changed (that one seems unusual), and of course, conversing with stuffed animals. By “conversing” I mean “looking intently, grinning, and occasionally yelling at.” One day I snapped a bunch of pics of him speaking with his giraffe pal and this happened:

Pics of Charlie and giraffe pal incorrectly rotated

Whoops. I wasn’t paying attention and was tilting my phone in such a way that it thought I was holding it in portrait rather than landscape. When you attempt to edit a Live Photo beyond simply auto-correcting it, you get this message:

Editing will turn off Live Photo

At this point, I was kinda sad because the Live Photos were cute but weren’t captured as I intended. A few days later, I decided to do something about it.

Programming is fun!

I really enjoy writing code, especially when it requires me to learn a lot of new things. However, while I’m still really excited to finish my game, Corgi Corral, it’s officially on hold for two reasons:

  1. I don’t have the creative energy for it. Taking care of a baby who doesn’t sleep through the night has sapped me of the mental resources I need to make stuff like art and music. Someday!
  2. I’m interested to see what changes to GameplayKit and SpriteKit Apple will show off at WWDC. Maybe I’ll get some new ideas or will be able to improve my code in some way.

Even though I’m taking a break from Corgi Corral, I still have that itch to make something. So, I decided to dive head first into the Photos framework and create an app that rotates Live Photos.

It actually took me several weeks to get something working due to my lack of experience with both Core Image and AVFoundation (not to mention the fussy way that Live Photos are constructed). I’ll write more about the process of building the app in another post, but needless to say there were many headaches involved!

However, I still had a blast doing it. For the first time ever, I truly feel like a real app developer. Sure, my Bible verse app was fun to make, but there are a zillion Bible verse apps on the App Store. I haven’t found an app yet that can rotate Live Photos. Maybe one exists, maybe not, but I finally feel like I was able to identify a unique problem and build my own solution. It’s a powerful feeling!

Shipping things is fun!

Charlie & his giraffe

Confession: I love filling in all the blanks in iTunes Connect. The screenshots, the app preview video, the description…there’s something really satisfying about seeing my app’s profile come together. I even had fun making the screenshots, using Dave Verwer’s SimulatorStatusMagic to ensure the status bars looked nice and clean.

I don’t really know if there’s a market for this app. I mostly built it for myself, so I could enjoy my pictures of Charlie. Having it on the App Store is just the cherry on top. Still, it sure is fun to ship something—to be able to point to something and say “I made that.”

LiveRotate icon

The app is called LiveRotate (I decided to adopt _David Smith’s straightforward approach to naming apps) and it costs $0.99. Any money I happen to make will go towards purchasing the new MacBook Pro I’ve been dreaming of for the past two years!

First Apps Roundup [Updated June 3, 2016]

Last week I challenged iOS developers to share anecdotes and screenshots of their first apps and was delighted when a few people responded. These are exactly the kind of responses I was looking for!

Marius Constantinescu:

My first contact with iOS development was during my BSc. studies. We were a bunch of students and a passionate teaching assistant, and we were learning iOS development on our own, outside of the university curriculum, following Paul Hegarty’s Stanford CS193P course on iTunes U.

I think it’s really cool that Marius and his fellow students taught themselves by following along with the Stanford course. I also love the screenshot from his first (albeit unreleased) iOS game!

Rob Poulter:

Development is easy. We know (or quickly learn) the constraints of the development environment and platform, and the rest is about research and experimentation. Not having done a lot with UIKit before, learning the API was the most challenging part of this. I wish I’d tracked just how much time I spent on StackOverflow vs XCode.

I also wish I’d timed my Stack Overflow visits. I’m sure it’s been at least a couple of days.

Alistair Phillips:

After reading Becky Hansmeyer’s post I decided to dig up some screenshots of the first version of My Opal to see just how far things have come. My Opal was released not long after iOS 7 arrived so I started off with the clean/sparse look but about 8 months later it gained a little more personality.

The screenshots Alistair posted are downright inspiring.

I’ll continue to update this post if more people respond!

Update: April 18th, 4:00 p.m. CST

Curtis Herbert:

One conversation I keenly remember having with my partner on this venture was about the app’s worth. After nights and weekends spread out over a few months we had 1.0 in review for the store. We were talking next steps over lunch and he brought up his idea to approach the big-wigs in the server monitoring space about selling the app to them. For “easily $50,000+” as it existed today, he asserted.

I love that little anecdote! In a nutshell: app pricing is hard, man. As such, Curtis notes how important it is to have a marketing plan and to manage your expectations.

Isis sold 6 copies over its lifetime. I’m pretty sure one of those was my mom, trying to encourage me.

This made me laugh. I’m pretty sure the only people who bought my first app (a Bible verse app for Apple Watch) at $0.99 were my friends and family as well, probably out of pity. Once I lowered it to free, the downloads picked up considerably! Anyway, go read Curtis’s post because he included some nice old-school iOS screenshots as well as reflections on his code organization and implementation.

Update: April 20th, 5:10 p.m. CDT

Yono Mittlefehldt:

In March 2010, I scrapped everything and rewrote the entire app from scratch. Even I, a novice iPhoneOS programmer, could see how bad it was. And it was really bad. Since it was just a side project and not a business, I had the luxury to do so. And I learned a ton in the process.
Then, in April 2010, I rewrote the entire app. Again. Seriously. It was still terrible.

Another awesome “first app” post! I’m starting to understand more and more that in order to make something great, you first have to be willing to make something terrible. And then figure out why it’s terrible, and try again. Lather, rinse, repeat. This is so hard for picky people like me, who want everything to be perfect the first time! Random, but: Yono’s post also makes me want to learn Hebrew. He’s got a cool language-learning app for kids called Gus on the Go, which I plan on introducing to Charlie when he’s old enough!

Update: June 3rd, 9:15 a.m. CDT

Cesare Rocchi:

Beginning of May 2014. I get out of the hospital after a sleepless night. A few hours before my daughter was born. Happiness and concern both having a party somewhere in my body. I stood up and took a walk to think a bit. I made the error of checking my email. As I skim I end up on App.net State of Union. Bottom line: I spent my spare time building on an API that was going to slowly die. The first reaction was cursing. The second was realizing the crazy twist of fate: I was experiencing the joy of birth and the sorrow of death at the same time.

Cesare recounts the story behind what would have been his fourth app—a slick-looking App.net client—if it had ever been released. He encourages new developers to experiment with new APIs but also to be cautious with them and give them time to mature.

We All Have to Start Somewhere

I only know one person in the 3-dimensional world (aka “real life”) who develops iOS apps. We went to college together but were a few class years apart, so we never really got to know each other that well. However, I had the opportunity to talk with him a bit last week and came away from the conversation feeling encouraged, so I thought I’d share the reasons for that here.

I would consider my friend (I’m not sure he’d want me to publish his name, so I won’t) to be a pretty successful programmer. He was the sole developer contracted to build the iOS app Lens Distortions, which has done really, really well in the Photography category. You can read a fairly recent review of it on Fracture’s blog. Although my friend didn’t invent the photo filters featured in the app, I think he did a good job with the UI and the overall user experience.

What encouraged me was when I brought up one of his first projects, a matching game for iOS released in 2009. One of the things I loved about the game was that all of the artwork was literally drawn with colored pencils by his mother, who is one of my dearest friends. The game was adorable and quaint, with unique and challenging mechanics. But when my husband asked how much money the game made, my friend laughed. “Probably about 50 bucks,” he said. “Sometimes I think about going back and making it work again.” (The game is no longer for sale and is broken on most devices.)

I joked that he should rewrite the whole thing in Swift.

Reminiscing about one of my friend’s first apps reminded me that we all have to start somewhere. I’m currently where he began: creating my first game, with far-from-polished graphics and little hope of making more than a few bucks. But it’s all part of the adventure, right? Maybe in 7 years I’ll be where he is now…maybe I won’t. All I know is that these things take time.

Now here’s a challenge for you: If you want to encourage and inspire newbies like me, write a blog post about your first app. Share some screenshots from that good ol’ version 1.0 and talk about what was good and not-so-good about your first effort. Everyone says things like “yeah, the first apps I built when I was learning to program were terrible.” But like…pics or it didn’t happen, amiright? And if you already wrote about it a while ago: I’d love it if you’d share the link.

(Note: this post was also inspired by Marco Arment’s recent interview with Computerphile, where he reiterates that it took him about a decade to build his audience and create the success he enjoys today.)

Unprepared

I was prepared to have a baby.

I read the Mayo Clinic Guide to a Healthy Pregnancy and What to Expect the First Year. In the last few months of my pregnancy, I’d wager that over 80% of my Google searches began with either “newborn” or “breastfeeding.” And there were a lot of Google searches.

I decorated his room while my husband assembled his crib and changing table. We practiced swaddling stuffed animals and learned all the ways to avoid sudden infant death syndrome. After my baby showers, we organized his clothes, bottles, toys, and various accessories. We were prepared.

I was prepared to go through labor, to have a C-section, to breastfeed an infant, to care for him and bathe him and put him to sleep. I was prepared (mentally) to be up at all odd hours of the night, to live in a state of endless exhaustion, and to be generally overwhelmed with my new responsibilities as a mother.

I was unprepared for how much I would love him.

Charlie at 2 weeks 3 days

He’s my world, to the point where I look forward to spending time with him during those godawful hours of the night. And even though he consumes nearly all of my time and energy, I’m more motivated than ever to finish my little iOS game because I want to make him proud someday.

Like: Yeah, kid. Mommy makes iOS apps. How freaking cool is that?

Meet Charlie!

Last Monday I mentioned how impatient I was to give birth to my son. I published that post around 2 p.m., after having had mild contractions since 4 in the morning. By 5 p.m., I was convinced that something was really happening and by 6, we were on our way to the hospital!

Introducing Charles Maxwell Hansmeyer: 7 pounds, 1 ounce and 21 inches long (though we think that’s inaccurate…probably more like 19 inches)

Charlie Hansmeyer

Naturally, I was hoping for a smooth birth. Unfortunately, Charlie’s grand entrance into this world was anything but easy! As I said, we checked into the hospital around 6 p.m. on March 14. Labor was slow going, and I wasn’t dilated enough to receive an epidural until just after midnight. The epidural worked wonders until about 5 a.m. when suddenly, it didn’t. At that point I was fully dilated and the nurses assured me that my baby boy would be there after about an hour or so of pushing.

Something was wrong though, and the pain was unbearable. The anesthesiologist came back and gave me some extra happy juice (as I called it in my head). This kept me from yelling bloody murder every two minutes but had the unfortunate side effect of making me completely numb from the waist down (usually, you want to be able to feel just enough to push correctly).

Eventually, we determined that Charlie was oddly positioned: he was both posterior and asynclitic. That means that the back of his head was against my back, and that his head was tilted toward one shoulder. My doctor attempted to reposition him several times, but he kept sliding back. Finally, at around noon on Tuesday, March 15, we made the call: it was time for a C-section.

Me & Charlie

By 1:17 p.m., I could hear my little guy crying. I remember tears of relief streaming down my face. Then I remember them holding him near me so I could kiss his little head, after which I promptly fell asleep.

We recovered in the hospital for the rest of the week and I’m happy to say that we’re all doing great! For the past few days my husband and I have pretty much just sat around and stared at him, amazed at how adorable and perfect he is. I’m hoping to get back to work on Corgi Corral soon, but in the meantime I’ve enjoyed reading everyone’s kind responses regarding Charlie. Here’s one last picture of him, all dressed for yesterday’s Apple event:

Charlie in iPood onesie

The Waiting Game

Patience has never been one of my strong suits. As such, it’s no surprise that as I wait for my son to make his debut into this world, I’m finding it more and more difficult to concentrate on coding. I have a good, solid to-do list set up for Corgi Corral so that I won’t forget where I left off, but basically, I’m just ready to have this baby. (Seriously, any time now would be excellent. Today would be great!)

It's been 84 years

And speaking of waiting: I’ve been waiting to buy a new MacBook Pro for what feels like a very, very long time (how’s that for a segue? lol). I originally assumed that new Skylake MBPs would be announced this month; however, most rumor sites now seem to agree that they won’t be unveiled until WWDC in June. Now I’m even hearing rumors that while the 13″ MBP will be ready for release in June, the 15″ might not be available until September. September? I really, really don’t want to wait until September.

Honestly, it’s made me re-evaluate whether or not a 15″ MacBook Pro is what I even want or need.

The Xcode Problem

One thing we’ve surely learned from the endless “Mac vs. iPad” debate is that everyone uses (and passionately defends) the tool that works best for them. On a recent episode of the Accidental Tech Podcast, the hosts briefly discussed the possibility of Apple releasing a version of Xcode for the iPad (which seems entirely plausible).

For some reason, their discussion prompted me to search Google for the “best laptop for iOS development.” There were a number of question-and-answer threads that went something like this: “Can I use an 11″ MacBook Air for iOS development?” followed by a resounding chorus of “Get a 15″ MacBook Pro. You’re going to want more screen space. Get a Mac mini and a large display. Get a Mac Pro.”

The iPad Pro has a 12.9″ display. It’s certainly possible that Apple could find a clever way to redesign Xcode’s interface for the iPad in order to make screen size less of a pain point. But what about the simulator? And instruments? Would they run side-by-side with Xcode somehow? The logistics are baffling to me. But I digress.

The point I’m trying to make is this: some of the folks who imply they could switch to the iPad full time if only it had Xcode probably aren’t currently willing to do their work on a 12″ MacBook (and not just because of the keyboard and speed).

On the other hand, I was able to find one person who loves their 12″ MacBook for development: Rob Rhyne. Back in August, Rob wrote a very interesting post about how he sets up his little MacBook to use Xcode and Photoshop. It’s a lot of hoops to jump through (lots of hiding panels and adjusting font sizes), but no more hoops than Federico Viticci has to navigate in order to get work done on his iPad. As I read Rob’s post, I got inspired. Maybe I don’t need the classic developer workhorse after all.

overview_hero_hero

Spring is in the Air

I want something just a tad bigger than the 12″ MacBook. The rumors of new MacBook Airs intrigue me—especially the possibility of a 15″ Air. Presumably, these new Airs (or whatever they’ll be called) will be released before September. Presumably they’ll also have retina displays, a decent speed boost, and good battery life. They’ll probably be fine for photo editing, code compiling, and playing a few games and they’ll be cheaper and lighter than a MacBook Pro to boot.

My philosophy has always been to buy the most powerful laptop I can afford and use it for as long as possible. But times are changing, and maybe I need to refresh my thinking as well. After all, I’m not a professional photographer, designer, videographer, or musician (though I like to dabble in all of those things). I’m not even a professional developer. If I want to run graphics-heavy games, I have a gaming PC for that. I rarely hook up peripherals to my laptop. Honestly, I can’t think of a single reason that I would need a MacBook Pro over a refreshed MacBook Air.

And that’s…freeing. If, like me, you’ve been waiting ages for new MacBook Pros, I’d encourage you to spend some time thinking about what you really want and need. Maybe, like me, you’ll come to a different conclusion. :)

Helpful iOS/Mac Developer Resources

If you’re new to iOS/Mac development (or even if you’re not), you might find these resources useful!

  1. Open Radar – a community bug report site where you can view and submit copies of bug reports that have already been sent to Apple
  2. QuickRadar – I just learned about this one from one of Craig Hockenberry’s tweets. QuickRadar is a free, open source Mac app that lives in your menu bar and allows you to quickly submit bug reports to Apple (and also Open Radar) without using Apple’s web interface.
  3. WWDC app for OS X – a really nice Mac app for viewing WWDC videos. You can also view session slides and read the transcripts. On the GitHub page, right below where it says “WWDC app for OS X,” choose “Click here to download the latest release” to download the app.
  4. Dash for OS X and iOS – Dash has a lot of neat features, but probably most useful is the ability to browse API docs offline. You can download the docsets for iOS, Swift, OS X, watchOS, tvOS, and over 150 other languages and APIs.
  5. Bjango App Icon Templates – if you’re going to make your own app icons, using a template can save you a lot of time.