Mobile Musings: Have you been a respondent yet?
Scott Weinberg, Tabla Mobile LLC
Immediate Past President, MN / Upper Midwest MRA Chapter
I’ve been noticing how few market researchers and advertisers have participated in even a single mobile research study. Specifically, I’m referring to an app-based experience, usually involving some form of geo-validation and multimedia data capture. I’m not referring to opening a URL on your phone and taking a survey, any survey.
Rather, I’m referring to an actual ‘mobile research’ experience, the kind where you’re notified walking into a movie theatre, Best Buy, Target, grocery store, gas station, etc. Alternately, you may be pre-screened and invited to participate, e.g. an out-of-home ‘assignment.’ I’m curious about this because of the (profoundly?) unique and different respondent experience these studies entail. Let me give you a few examples.
I took an in-store study, or attempted to, inside a Super Target. (I’m not affiliated with this supplier; I have several survey research apps running on my phone, and I never stray far from an electric outlet.) Essentially, the assignment entailed taking one photo of each of 11 variations of a food product and responding to a few questions on each. Tedious, but not difficult. When I uploaded the first pic, my phone timed out and went into lock mode (set at one minute). I tried three times. I was on an iPhone back then, where photo file sizes range from 1–2 MB depending on the detail (Androids are similar). This isn’t an issue on home wi-fi or a similar network, but inside a big box store, over your cellular carrier, photo uploads (or any uploads) can be a pickle.
So what did I do? I calculated that even if I could get the upload to work, I was looking at a boring, repetitive survey of 15+ minutes while standing in that food aisle. Not much intrigue there. I wondered how many others around the country were having this same frustrating experience. So I decided to try an experiment of my own: I took 11 random product photos outside the survey (just using my camera) and exited the survey, which told me I had an hour to finish from when I started. I drove home and resumed the survey on my home wi-fi. At the first upload sequence, I uploaded one of the pics at random, in about two seconds. Answered those questions. Went to the next sequence. Rinse and repeat. Finished in a few minutes. My experiment was to determine whether this particular app had any kind of lockout or detection protocols for what I was doing. This supplier is a major player, one of the largest out there. The survey submitted fine, and my incentive showed up after a few days.
I’ve also noticed recently that Target stores offer free wi-fi. You need to actively accept their terms and log in to connect, i.e. it’s not an ‘auto-connect.’ I wonder how many people actually do this? Or how many suppliers tell their potential respondents there is free onsite wi-fi and to connect to it? I’ve never seen messaging to this effect in a mobile study; have you?
Another example, this time as a project manager rather than a respondent. On a time-sensitive, DMA-specific mobile study, a phone-recruit-to-survey-app approach was in effect. Ergo, many of the respondents were ‘first-timers’ to this kind of study. I’m rather keen on these audiences, actually, as they bypass the conditioned (i.e. self-selection-biased) ‘panel people’ who comprise the bulk of all primary online research (and a small but growing portion of mobile respondents). During this study it became apparent that live tech support was needed (and by live I mean immediate, while they were in-aisle). I began emailing my phone number to the potential respondents, and my phone quickly started ringing with confused respondents. They weren’t doing anything wrong; the app was working fine, the survey was loading fine, they were just unsure how it all works. Happily, however, they were motivated to participate (a healthy incentive didn’t hurt).
So, what are the lessons here? First, suppliers approach signal-strength issues differently: some use offline versions of the app experience (data are uploaded later); others minimize the amount of data uploaded through design. Ask what your options are. Second, when the sampling universe is small, e.g. with specific DMAs, age groups and such, and each potential response is therefore critical, it’s wise to plan for tech support in advance and have live people ready and on-call to answer questions or take feedback. A confused user may not return to the study if they can’t access the content correctly.
Most importantly, experiencing activities like these makes more of an impact than reading about them; I always encourage interested parties to experience this methodology as a respondent. It doesn’t matter whether you’re new to the mobile research space or versed in various fieldwork methods; the technology is rapidly changing, and our assumptions about how we should interact are best learned empirically.