UK medics use a range of devices to complete online surveys, with almost 10% now using a smartphone. Taken together with tablets and iPads, almost one third complete on a mobile device.
Looking at data from our most recent online surveys (Table 1), GPs were most likely to have completed using a desktop PC and monitor; secondary care physicians were more likely to have used a laptop; and nurses – often out and about visiting patients – used a variety of devices, with over a third completing on a tablet or iPad.
So what? Should we care that respondents use different devices?
Well, in terms of survey experience it doesn’t seem to make a significant difference. Referencing more surveys authored and administered by us in 2015 (base = 1,770), mean scores on our 11-point ‘survey experience’ scale were only slightly lower for those completing by smartphone (6.9) than for the other three devices (7.1 to 7.3). In other words, how much respondents liked the survey did not seem to be adversely affected by smartphone completion.
However, smartphone survey respondents do take much longer to complete: 60% longer on average than those using a desktop or laptop, with tablet and iPad users falling in between. But that is not the whole story. Crucially, when we analysed actual duration against perceived duration, we found that smartphone respondents were significantly more likely than respondents on other devices to believe they had not spent as long as they actually had.
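The actual-versus-perceived comparison can be sketched as a per-device misperception gap. The records and field names below are invented for illustration only; they are not the survey data discussed above.

```python
from statistics import mean

# Hypothetical, illustrative records: (device, actual_minutes, perceived_minutes).
# These numbers are made up for the sketch, not drawn from the real dataset.
responses = [
    ("desktop",    15, 15), ("desktop",    14, 14),
    ("laptop",     15, 16), ("laptop",     16, 15),
    ("tablet",     19, 17), ("tablet",     20, 18),
    ("smartphone", 24, 16), ("smartphone", 25, 17),
]

def misperception_by_device(records):
    """Mean (perceived - actual) duration per device; negative = underestimate."""
    by_device = {}
    for device, actual, perceived in records:
        by_device.setdefault(device, []).append(perceived - actual)
    return {device: mean(gaps) for device, gaps in by_device.items()}

print(misperception_by_device(responses))
```

With real response data the same shape of analysis would show whether the smartphone gap is reliably more negative than the others.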
Although the resolution of today’s smartphone screens (measured by total number of pixels) is often higher than that of desktop monitors from ten years ago, resolution across all types of computer display has improved in step. So when we look at Table 2, we see that smartphone respondents overwhelmingly have comparatively low resolutions, which – even with the familiar smartphone ‘pinch and zoom’ gesture – reduces their ability to see detailed and/or busy screen layouts.
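The practical constraint here is the CSS-pixel width the layout actually receives, not the hardware pixel count on the spec sheet. A minimal sketch of the “does a busy grid fit?” question, where the 90px minimum column width is an assumed threshold for illustration rather than any industry standard:

```python
def grid_fits(viewport_css_px: int, columns: int, min_col_px: int = 90) -> bool:
    """Rough check: can each column of a grid get a readable width?

    viewport_css_px is the CSS-pixel width the layout actually gets,
    which on phones is far smaller than the hardware pixel count.
    min_col_px = 90 is an assumed readability threshold for this sketch.
    """
    return viewport_css_px // columns >= min_col_px

# A 7-point rating scale plus a label column = 8 columns:
print(grid_fits(1280, 8))  # desktop-class viewport
print(grid_fits(375, 8))   # a common smartphone CSS viewport width
```

Even generous pinch-and-zoom does not change the arithmetic: zooming restores legibility for one part of the grid at the cost of seeing the rest.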
We all know how convenient smartphones are, which may help explain why experience scores are not significantly lower. The way we use smartphones to multi-task (flitting between messaging, email, apps, widgets and so on), and the consequent fragmentation of our attention, may help explain why respondents misjudge how long a survey took. Perhaps the task did not seem so heavy when interspersed with all the other interactivity they had in hand?
That said, we also know that smartphone respondents are relatively more likely to drop out of surveys once they have clicked the link. Whilst this is an area that needs more work across the industry in terms of removing unnecessary obstacles, I speculate that it is also a function of the fragmented, multi-tasking nature of smartphone use – i.e. our surveys absolutely must make a good first impression if they are to compete with everything else the user could be doing on the same device!
A word of warning to clients entrusting online surveys to agencies and fieldwork companies: beware talk of “mobile optimization”.
Not because it is a bad thing in itself, but because it may only describe a specific on/off ‘toggle’ in survey software that is meant to improve survey rendering on mobile devices. In our view, turning this on has many more drawbacks than advantages, so we don’t use it. For example, it typically deconstructs grid questions automatically and asks them instead as a series of single-choice questions – which is quite possibly exactly what the researcher doesn’t want if the purpose is to assess things comparatively! (see here for a related post)
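What that toggle does to a grid can be sketched in a few lines. The data structure and field names below are hypothetical, invented for illustration rather than taken from any real survey platform:

```python
# Hypothetical representation of a survey grid question; the structure
# and field names are invented for this sketch, not from a real platform.
grid = {
    "text": "How would you rate each treatment?",
    "rows": ["Treatment A", "Treatment B", "Treatment C"],
    "scale": ["Poor", "Fair", "Good", "Excellent"],
}

def deconstruct(grid_question):
    """Mimic the 'mobile optimization' toggle: emit one single-choice
    question per grid row, losing the side-by-side comparison."""
    return [
        {"text": f"{grid_question['text']} ({row})",
         "options": grid_question["scale"]}
        for row in grid_question["rows"]
    ]

for question in deconstruct(grid):
    print(question["text"])
```

Each row becomes a standalone question, so respondents can no longer see their answers for Treatments A, B and C side by side – which is precisely the comparative judgement a grid is designed to elicit.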
So – yes – we should care about which devices our medical market research respondents are using. Research buyers should quiz agencies and fieldwork companies on these issues, and suppliers should undertake ongoing analysis of their own response data to create more tailored, and better, online research experiences – whatever the device.