I’ve found a bug in your software! Now what?

As software developers, we sometimes have to deal with bugs plaguing the apps we’ve built. It’s not uncommon for a client, customer, co-worker or complete stranger to tell us about a bug, a glitch…basically, something that is broken in our app.

As helpful as this is, the conversation usually goes a little like this:

App user: “Hey man, what’s wrong with your app? It’s not working!”

Developer: Blank stare… “What’s not working?”

The thing is, we write thousands and thousands of lines of code to make your app work. Unless we know exactly what the issue is, it’s difficult to figure out what exactly is going wrong. It’s like finding one missing comma in a novel without knowing what page and line it should be on.

So, how do you prevent your developer from giving you a blank stare? Upgrade your bug report from blank stare to a gold star bug report.

The blank stare bug report: It’s not working!

Tell your developer what you actually expected to happen. Believe it or not, we don’t always know. And sometimes the issue is not a bug so much as an interaction that fails to meet user expectations. The solution might be to tweak a design rather than digging into code to find out what is broken.

The mediocre bug report: I tried to log into the app, and it didn’t work!

Tell your developer what actually happened. In most tasks that an app performs, there are multiple steps and multiple points of failure. Help your developer isolate exactly what didn’t work.

The helpful bug report: I tried to log in, but I could not type in the username text field.

Provide as much detail as you can about how to reproduce the issue. This prevents the developer from trying to reproduce it the wrong way, which wastes valuable time and effort.

The very helpful bug report: After logging out of the app, I tried to log in again, but I could not type in the username text field.

Be sure to tell your developer how often this happens. This means, after you find a bug, try to reproduce it a couple of times.

The bronze star bug report: After logging out of the app, I tried to log in again, but I could not type in the username text field. This only seems to happen about half of the time.

Sometimes, doing the above will provide additional details about the bug in question.

The silver star bug report: After logging out of the app, I tried to log in again, but I could not type in the username text field. This only seems to happen if I go to the home screen and back into the app between logging in and logging out, but in that case it happens every time.

Screenshots and video are always appreciated, especially if the bug is visual in nature. It can also be helpful to show your developer the bug in person. Regardless, documentation is always good, especially if the developer is unable to fix the bug immediately.

The gold star bug report: The silver star report + screenshots or video

There you have it. Simple steps to go from a blank stare to a gold star bug report.

Your developers may not necessarily be happy to find out that their code doesn’t work, but they will certainly be happy to have a detailed bug report to help them isolate the problem in a timely manner.

Stephen Gazzard, Robot

Stephen Gazzard is one of our talented robots. He’s worked on some of our award-winning apps, including Spy vs Spy; released eight of his own games to the App Store, cracking the US top 100 with Castle Conflict; and dreams of the day his life observations will finally be published.


Welcome to our office



At Robots and Pencils’ head office, you’ll notice something a little different when you come in. Our digital signage may seem “ordinary” but when you walk past it, you may feel as though you’re being watched…

“Mona Lisa” is a subtly fun way to welcome visitors to our office. It uses new technology to take what you’d normally expect, digital signage in a tech office, and turn it into something a little different. For me, Mona Lisa was an experiment that started with a lot of different approaches, and it took some time to narrow them down to the one that worked best for our situation.



I’ll start off with the scenario: We wanted to install a display in our office entrance to greet visitors when they arrived for their meetings. We wanted it to look like a piece of art hanging on the wall, but with a twist: the eyes would follow your movement throughout the room. You’ve probably seen a similar effect before, but without any fancy technology. We had a few different options available to pull this off on a television screen, and they fall into two categories: image analysis and depth mapping.

The image analysis method uses algorithms to make a good guess as to where a face is in a picture. You’ve probably noticed that your iPhone will use this to find faces in a photo and focus on them. This method is simpler because it only requires a regular camera, but the algorithms only work well when the lighting and contrast of the image are ideal. The depth mapping method is best known from the Xbox Kinect hardware. Here a special camera tracks how far away thousands of points in the room are, and builds a sort of topographic map from that. This method can be more accurate than plain image analysis because it can see in a third dimension, but it requires special hardware at an extra cost.
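To make the depth mapping idea concrete, here’s a minimal sketch (in Python, with made-up numbers — the real camera streams its own format) of how a grid of per-pixel distances can locate the nearest object, which is a good first guess at where a person is standing:

```python
# A depth camera yields a grid of per-pixel distances (millimetres here;
# the values below are illustrative). The closest valid reading is a
# simple first approximation of where a viewer is standing.

depth_map = [
    [4000, 4000, 3900, 4000],
    [4000, 1200, 1150, 3900],   # a person roughly 1.2 m away
    [4000, 1250, 1180, 4000],
    [4100, 4000, 4000, 4000],
]

def nearest_point(depth):
    """Return (row, col, distance_mm) of the closest valid reading.

    Readings of 0 are treated as invalid (no depth data for that pixel).
    """
    best = None
    for r, row in enumerate(depth):
        for c, d in enumerate(row):
            if d > 0 and (best is None or d < best[2]):
                best = (r, c, d)
    return best

# nearest_point(depth_map) -> (1, 2, 1150)
```

A real implementation would cluster nearby readings into blobs rather than trusting a single pixel, but the principle is the same.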


There are two different ways we could implement image analysis. There’s the popular OpenCV library, often used for these sorts of projects, but we tried the built-in frameworks that Apple makes available on OS X and iOS. This turned out to be the easiest of the approaches I tried. Given some guesses at average face size, it was even possible to estimate the distance from the face to the camera.
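That distance estimate comes from simple similar-triangles geometry: a face of known real-world width appears smaller in the image the farther away it is. Here’s a sketch of the calculation in Python — the face width and focal length constants are illustrative guesses, not values from the actual project:

```python
# Pinhole-camera distance estimate:
#   distance = real_width * focal_length_px / detected_width_px
# Both constants below are assumptions for illustration.

AVERAGE_FACE_WIDTH_CM = 15.0   # assumed average adult face width
FOCAL_LENGTH_PX = 800.0        # assumed camera focal length, in pixels

def estimate_distance_cm(detected_face_width_px):
    """Rough distance from camera to face, in centimetres."""
    return AVERAGE_FACE_WIDTH_CM * FOCAL_LENGTH_PX / detected_face_width_px

# A face detected 120 px wide would be roughly a metre away:
# estimate_distance_cm(120) -> 100.0
```

Because real face widths vary, this only ever gives an approximation — which was fine for deciding roughly where a viewer was standing.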

I’ll note that although OS X often has some more advanced features, iOS has definitely gotten the love here. It has some APIs that OS X doesn’t have that would have been helpful, but thankfully it wasn’t that hard to port from one to the other.

The drawbacks of image analysis became instantly apparent as soon as I started testing it away from my desk. Anytime the room was backlit (meaning light from a fixture or window was shining towards the camera lens) it proved much less likely to find a face, and this wouldn’t work at all in the well-lit entrance to our office.

This forced me to try out the second method.


We ordered a Kinect camera and I got started figuring out how to get it hooked up to the Mac. There’s no official SDK for the Mac (only for Windows), however lots of smart people have stepped in to fill this void with OpenNI and CocoaOpenNI. It took some fiddling to get everything up and running, but once I started getting information into my app it was smooth sailing.

The best part of depth mapping is that it works so much better in over-lit or backlit rooms. It also provides more accurate information about the distance from the camera than the calculations I could make with image analysis.



You might be wondering why this project was called Mona Lisa to begin with. We originally used an image of Da Vinci’s Mona Lisa, but after having some demos up on the wall for a while we realized that the art theme didn’t fit with our plans to display more information.

I started out just moving the two eye images in front of the larger image to simulate the tracking, but it wasn’t quite convincing enough. I decided to try it out with the SceneKit framework on OS X in order to render 3D eyes that would rotate instead! The effect turned out to be very subtle, but it adds to the realism, and I think that detail is important to making the illusion convincing.
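The rotation itself is a small bit of trigonometry: given where the viewer is standing relative to the screen, the eye’s yaw angle is just the arctangent of the sideways offset over the distance. Here’s a sketch in Python (the project used SceneKit on OS X; the function and coordinate names here are hypothetical):

```python
import math

def eye_yaw(viewer_x_cm, viewer_z_cm):
    """Yaw angle, in radians, that turns an eye at the origin toward a
    viewer standing viewer_x_cm to the right and viewer_z_cm in front
    of the screen. atan2 handles the viewer being on either side."""
    return math.atan2(viewer_x_cm, viewer_z_cm)

# A viewer standing directly in front needs no rotation:
# eye_yaw(0, 200) -> 0.0
# A viewer one metre to the right at one metre out is about 45 degrees:
# math.degrees(eye_yaw(100, 100)) is roughly 45.0
```

In a 3D scene you’d apply the same idea for pitch using the viewer’s height, then set the eye nodes’ rotation each frame as new tracking data arrives.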



Ultimately Mona Lisa is about welcoming our guests, so it can also show visitors’ names, meeting times and rooms when they arrive. It’s a nice little touch. In the future we’re looking to tie the visitor information to our office meeting calendar, and to explore other ways to showcase the technology we know so well.


- Brandon Evans, Robot.




@PhilKomarny #5 Influencer from Interop!

Phil was invited to speak at Interop this year, and being the Twitter guru he is, he was named a top 5 influencer by Onalytica! (For those of you who don’t know, Interop is a huge IT conference held every year. It attracts visitors from all over the world, keen to share their stories and cool new products.) Stay tuned – you never know where the intrepid @PhilKomarny will show up next!


Robots and Pencils Win an Anvil

On Friday, April 4th, Robots and Pencils brought home the Calgary-based ad award, The Anvil, in its newest category: Best Mobile Game or Application. The Anvil was awarded for the super fun game Sasquatch, available on the App Store here. Robots and Pencils worked with the great team at NotBadU to create Sasquatch, the location-hunting photo game.


Click here to see the winning entry and more about Ad Rodeo.


Representing Robots and Pencils were Sandra Mills, left, and Vicki Sloot, right – both Pencils.

Sandra Mills, Left, and Vicki Sloot, Right, with the Anvil in hand



Robots and Pencils Expands to U.S.


In case you missed it, Robots and Pencils’ expansion to the US hit the papers a few weeks back. We’re terribly excited to have Phil Komarny on board as our chief executive and have been pushing hard into the US market. An office is being established in Austin, Texas, with another to follow in London, England, within the next year.

Check out the full Calgary Herald article here