These are a | few of my | fav- or- ite | things…

March 28th, 2013

If you’ve ever done usability tests using mobile devices you know that it can get very complicated very quickly: sleds, goosenecks, document cameras, converters, physical restraints (well, maybe not physical restraints, but “Don’t move the device beyond this point” markers), and a lot more.

Since I wrote a book about testing (Rocket Surgery Made Easy), people often ask me about the best way to do mobile testing. I think I’ve finally figured out what’s what, and I’m going to write a few blog posts about it. To begin with, though, I thought I’d mention two tools that seem to be well-kept secrets (i.e., I’m always surprised how many people haven’t heard about them).

They’re useful for solving two problems:

  • How you record mobile tests, and
  • How you display them to observers.

(They’re both for iOS devices, not Android or Windows Phone.)

First, recording.

As near as I can tell, Steve Jobs must have been scared by multitasking when he was a kid, with the result that it’s nearly impossible to do more than one thing at a time on your iPhone or iPad–or iWatch, presumably. (Exception: You can listen to music and do one other thing. Steve was apparently fond of music.)

The upshot is that there’s never been much prospect for running a screen recorder in the background under iOS. And screen recorders (e.g., Camtasia et al.) are like mothers’ milk to us usability folks. I had despaired of ever finding one, until recently.

First, I came across UX Recorder (www.uxrecorder.com). And then someone (OK, it was Dave Greenlees. Thanks, Dave.) alerted me to Magitest (www.magitest.com).

They have a lot of similarities. They both record what’s on the screen, the user’s face (via the front-facing camera), and think-aloud audio (via the microphone). They both save the recordings to the camera roll when you’re done. And they both superimpose some representation of the user’s gestures (taps, swipes, pinches, etc.) on the recording.
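
(For the technically curious: the standard way to watch every gesture in an iOS app is to intercept touch events at the window level, before they’re delivered to the rest of the app. Here’s a minimal Swift sketch of that general technique. It’s my guess at the kind of thing these tools do under the hood, not their actual code, and the logging is purely illustrative.)

    import UIKit

    // Every touch event in an app funnels through its window's
    // sendEvent(_:), so subclassing UIWindow lets you observe (or draw)
    // every tap and swipe without changing the rest of the app's code.
    final class TouchLoggingWindow: UIWindow {
        override func sendEvent(_ event: UIEvent) {
            if let touches = event.allTouches {
                for touch in touches {
                    let point = touch.location(in: self)
                    print("touch phase \(touch.phase.rawValue) at \(point)")
                }
            }
            super.sendEvent(event) // then deliver the event as usual
        }
    }

A real recorder would presumably draw tap and swipe markers at those points on top of the captured video rather than print them, but the interception point is the same.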

Both have free versions that let you do short test recordings. The full Magitest costs $24.99, and UX Recorder has a pay-per-recording plan, or $59.99 for unlimited use.

The truth is, given the limitations imposed by Apple, they’re both pretty remarkable. But they’re not perfect.

They both seem to slow things down. (Typing can be painful, for instance.) And the recordings can take a very long time to encode and save. (All of this is from limited use/experimentation, so anyone with *actual* knowledge, please chime in with corrections.)

They both can only record things that happen in a web browser, so you can’t use them to test apps. But Magitest has an SDK module that you can add to your native-app project code which will let you make app-specific recordings.
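
(I haven’t looked inside Magitest’s SDK, so here’s only a generic sketch of the integration pattern such a module implies, with made-up names: you compile the recorder into a test-only build behind a flag, so none of it ships in your App Store version. MagitestRecorder and USABILITY_TEST_BUILD are hypothetical stand-ins, not the actual API.)

    import UIKit

    @main
    class AppDelegate: UIResponder, UIApplicationDelegate {
        var window: UIWindow?

        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions
                             launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            // USABILITY_TEST_BUILD would be a custom compilation condition
            // defined only in your test scheme, so the recorder never ships.
            #if USABILITY_TEST_BUILD
            // Hypothetical entry point -- check the SDK's docs for the real one.
            // MagitestRecorder.start()
            print("usability-test build: session recording enabled")
            #endif
            return true
        }
    }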

BTW, there *was* another app called Display Recorder in the App Store briefly that *didn’t* play by the no-multitasking rules, and I was lucky enough to grab a copy before Apple banned it. It allowed you to record *anything*, including other apps, seemed not to slow things down, and saved its files remarkably fast. The bad news is that it doesn’t exist anymore, at least not without jailbreaking. It seems to live on in the Cydia store, though, if you’re the kind of person who’s OK with ripping the “Do not remove under penalty of law” labels off pillows and mattresses.

Second, displaying.

The truth is, while recordings are nice to have, I’m much more concerned about getting a bunch of people in a room and having them observe the tests live, so displaying what’s happening on a big screen is crucial.

There are two basic approaches: screen sharing (mirroring what’s on the screen) and camera views (what the user sees, including his or her own hands).

I’ll go into the pros and cons of both in another post, but I can tell you one thing: If all you want to do is mirror the screen to people in another room, you should at least consider AirPlay–a feature built into iOS, OS X, and Apple TVs for streaming audio and video.

It’s this simple:

  1. Connect your iDevice (iPhone 4S or later, iPad 2 or later, or iPad mini) and a Mac or PC to the same WiFi network.
  2. Install Reflector ($12.99 from www.reflectorapp.com) on the Mac or PC.
  3. Turn on AirPlay mirroring on the iDevice.

Bingo: Reflector mirrors your iDevice screen on the Mac or PC. (I have to say the effect is oddly striking. It always feels a little bit like magic to me.)

From there, you can run a screen recorder (e.g., Camtasia) to make a recording, and run screen sharing software (e.g., GoToMeeting) to send the image to the observation room. (Reflector doesn’t transmit audio from the iDevice microphone, so you’ll have to hook up a mic to the Mac or PC.)

I’ve used this to project the demo tests that I do whenever I give talks, and it works remarkably well.

As I mentioned at the beginning, I’m going to write another post or two about what I’ve finally concluded about mobile testing. (I have to say even I was surprised by my conclusions.)

Categories: Usability
  1. March 29th, 2013 at 16:09

    Good. Someone should figure this all out and let the rest of us know what to do. Thanks.

  2. Wayne Pau
    March 29th, 2013 at 17:02

    Steve,

    The one thing we’ve found is that with haptic devices there are three limitations:

    #1 – You don’t have a mouse cursor. That makes it much harder to watch how a user finds their way to a control, or how much they move around visually, in an app or mobile web page.

    #2 – Some mobile apps overload their gestures, so the same action can be triggered by different inputs. For example, you could do a bezel swipe, a double tap, or a long tap to get to the same screen. When you only do screen capture, sometimes you can’t tell what triggered it, only the result. You have to infer the triggering user action.

    #3 – Sometimes a user attempts a gesture and fails, so you can’t tell whether it was an accidental action or the user intentionally doing a gesture because he or she is confused by the UI. It’s much harder to mix up gesture-like actions using a mouse on a desktop. Long tap vs. double tap vs. single tap is a common example of this.

    What we’re trying to do is get a visual frame that is slightly larger than just the mobile device screen so we can see the physical finger interaction. We can see if it’s a thumb or an index finger, right or left hand, etc. We can see if the user “tried” to do a bezel swipe or was trying to do a drag. We can see if a button is hard to reach or in a spot that is easy to access one-handed, etc.

    Obviously this makes things a lot more difficult, especially with the reflections off the glass screens, but it’s something we’re trying to work out.

    Hope that helps…
    w.

  3. Steve Krug
    March 31st, 2013 at 21:27

    @Wayne Pau

    All great points, Wayne; exactly the things I’ve been pondering lately, especially the difference that absence-of-cursor makes. And also the fact that capturing the gestures well is nothing to sneeze at, given considerations like glare, focus, zoom, intruding on the user’s experience, etc.

    What’s your typical setup? Are you using an overhead camera, a document camera, or a sled-and-gooseneck to do your captures?

    Steve

  4. April 1st, 2013 at 18:24

    Hi Steve,

    Glad you found Magitest and hope that you find it useful in your mobile testing.

    We’re also big fans of Reflector and agree that it can be used in conjunction with Magitest to send the video of the session to a remote/observation computer when you’re testing in house.

    That said, we believe that one of the biggest benefits of Magitest is that it lets you get out of the lab and hit the streets. This way, you can do repeated short, sharp tests as you iterate on your designs. Hopefully this changes the way people plan, recruit for, and conduct mobile usability testing. Armed with a bag of chocolate bars or iTunes vouchers, for example, a researcher could go out into the field and conduct a bunch of tests in a short space of time, with immediate, actionable results.

    Let us know if you have any other questions/thoughts/requests.

  5. Steve Krug
    July 16th, 2013 at 00:47

    Someone just told me about another bit of SDK code you can put in an iPhone app you’re building so your testers can capture screen recordings while they use it.

    http://www.capturerecord.com/

    It’s not something that will pass App Store vetting, so you’d only use it in a version you created specifically for usability testing.

    Here’s how the authors describe it:

    • Capture screen activity.
    • Record the tester’s face (using the device’s front-facing camera).
    • Record the tester’s voice (using the device’s microphone).
    • Record user input and touch events.
    • Export as a single video to your camera roll.
    • Useful for basic usability testing.

    There’s a demo recording video here: http://www.capturerecord.com/mov/crdemo_orig.mov
