
These are a | few of my | fav- or- ite | things…

March 28th, 2013

If you’ve ever done usability tests using mobile devices you know that it can get very complicated very quickly: sleds, goosenecks, document cameras, converters, physical restraints (well, maybe not physical restraints, but “Don’t move the device beyond this point” markers), and a lot more.

Since I wrote a book about testing (Rocket Surgery Made Easy), people often ask me about the best way to do mobile testing. I think I’ve finally figured out what’s what, and I’m going to write a few blog posts about it. To begin with, though, I thought I’d mention two tools that seem to be well-kept secrets (i.e., I’m always surprised how many people haven’t heard of them).

They’re useful for solving two problems:

  • How to record mobile tests, and
  • How to display them to observers.

(They’re both for iOS devices, not Android or Windows Phone.)

First, recording.

As near as I can tell, Steve Jobs must have been scared by multitasking when he was a kid, with the result that it’s nearly impossible to do more than one thing at a time on your iPhone or iPad–or iWatch, presumably. (Exception: You can listen to music and do one other thing. Steve was apparently fond of music.)

The upshot is that there’s never been much prospect of running a screen recorder in the background under iOS. And screen recorders (e.g., Camtasia) are like mothers’ milk to us usability folks. I had despaired of ever finding one, until recently.

First, I came across UX Recorder (www.uxrecorder.com). And then someone (OK, it was Dave Greenlees. Thanks, Dave.) alerted me to Magitest (www.magitest.com).

They have a lot of similarities. They both record what’s on the screen, the user’s face (via the front-facing camera), and think-aloud audio (via the microphone). They both save the recordings to the camera roll when you’re done. And they both superimpose some representation of the user’s gestures (taps, swipes, pinches, etc.) on the recording.

Both have free versions that let you do short test recordings. The full Magitest costs $24.99, and UX Recorder has a pay-per-recording plan, or $59.99 for unlimited use.

The truth is, given the limitations imposed by Apple, they’re both pretty remarkable. But they’re not perfect.

They both seem to slow things down. (Typing can be painful, for instance.) And the recordings can take a very long time to encode and save. (All of this is from limited use/experimentation, so anyone with *actual* knowledge, please chime in with corrections.)

They can both only record things that happen in a web browser, so you can’t use them to test native apps. But Magitest has an SDK module that you can add to your native-app project code, which will let you make app-specific recordings.

BTW, there *was* briefly another app in the App Store called Display Recorder that *didn’t* play by the no-multitasking rules, and I was lucky enough to grab a copy before Apple banned it. It allowed you to record *anything*, including other apps, didn’t seem to slow things down, and saved its files remarkably fast. The bad news is that it doesn’t exist anymore, at least not without jailbreaking. It seems to live on in the Cydia store, though, if you’re the kind of person who’s OK with ripping the “Do not remove under penalty of law” labels off pillows and mattresses.

Second, displaying.

The truth is, while recordings are nice to have, I’m much more concerned about getting a bunch of people in a room and having them observe the tests live, so displaying what’s happening on a big screen is crucial.

There are two basic approaches: screen sharing (mirroring what’s on the screen) and camera views (what the user sees, including his or her own hands).

I’ll go into the pros and cons of both in another post, but I can tell you one thing: If all you want to do is mirror the screen to people in another room, you should at least consider AirPlay–a feature built into iOS, Mac OS, and Apple TVs for streaming audio and video.

It’s this simple:

  1. Connect your iDevice (iPhone 4S or higher, iPad 2 or higher, iPad Mini) and a Mac or PC to the same WiFi network.
  2. Install Reflector ($12.99 from www.reflectorapp.com) on the Mac or PC.
  3. Turn on AirPlay mirroring on the iDevice.

Bingo: Reflector mirrors your iDevice screen on the Mac or PC. (I have to say the effect is oddly striking. It always feels a little bit like magic to me.)

From there, you can run a screen recorder (e.g., Camtasia) to make a recording, and run screen sharing software (e.g., GoToMeeting) to send the image to the observation room. (Reflector doesn’t transmit audio from the iDevice microphone, so you’ll have to hook up a mic to the Mac or PC.)

I’ve used this to project the demo tests that I do whenever I give talks, and it works remarkably well.

As I mentioned at the beginning, I’m going to write another post or two about what I’ve finally concluded about mobile testing. (I have to say even I was surprised by my conclusions.)
