Published: 1/30/2025
Keith Phillips
Senior Director, Research & Data Sciences, Dynata
For several years, Dynata has been partnering with the MRS (Market Research Society) and other leading industry players* to investigate mobile usage in online research. Our research findings come from two distinct data sets. We analyze the devices used to take surveys by combining a year’s worth of traffic from all four participating companies.
This year we analyzed traffic data from 2023 that was aggregated from 13 countries (Australia, Brazil, Canada, China, Germany, France, India, Indonesia, Japan, The Netherlands, UK, USA and South Africa). In addition to our traffic results, we tackle one topic each year with a custom research project via an annual survey. In previous years, we have explored survey taker satisfaction with their mobile experience and the impact poor design can have on the data collected. This year's survey (fielded from September to October 2024) explored demographic, psychographic, and behavioral differences between survey takers using mobile devices and those using desktop/laptop devices for online surveys.
Mobile usage for online survey taking continues to increase
In 2023, mobile was the preferred device choice for online survey taking in every age segment under 60. Overall, mobile accounted for 59% of survey starts, up from 54% in 2022, and this year-over-year growth shows no signs of slowing. The inclusion of mobile is important from both a practical and methodological standpoint.
Device users are demographically different
The most obvious difference between device users is a demographic one, particularly the correlation between device usage and age. Looking at the combined 2023 traffic, we observe that 77% of survey attempts from 18–24-year-olds were started on a mobile device. Only the 60+ age cohort was less likely to attempt a survey on a mobile device (44%) than on a desktop/laptop (53%).
There is a strong personal preference for device usage
The next significant difference between mobile and desktop/laptop survey takers emerged from our custom research, where we controlled for age and gender across devices. We found notable differences in device preference for various online activities, including browsing social media, online shopping, watching videos, online banking, booking tickets for events, filling out government forms, and taking surveys.
Perhaps unsurprisingly, the device being used during our survey interview always over-indexed as the preferred device for these online activities. However, the size of the difference was unexpected. Outside of survey taking, the largest gap was for online shopping: 72% of mobile survey takers preferred shopping on their mobile device, compared with 41% of desktop/laptop survey takers. Conversely, 71% of desktop/laptop survey takers preferred online shopping from a desktop/laptop, compared with 39% of mobile survey takers.
This stark contrast highlights how differently these two groups prefer to use devices for online activities. Survey takers could select multiple devices, so these results indicate a strong preference rather than a slight inclination toward one device.
Mobile survey takers and desktop/laptop survey takers were similar enough that strict device quotas were not necessary.
In our survey, we explored a range of topics, including data privacy, brand loyalty, brand awareness, gaming habits, social media usage, and setting work-life boundaries, to name a few. Many of these topics showed no difference between device users, and where statistically significant differences did appear, they were generally small. None of these topics can explain or account for the drastic difference in device preference.
The similarities between survey takers across devices give us confidence that data consistency can be maintained simply by relying on the natural fallout of device usage, suggesting that strict quotas based on device type are unnecessary in most cases. However, the strong preference for specific devices, even after controlling for age, points to an inherent difference between users of different devices.
Therefore, creating a device-agnostic survey for online research is not just a matter of improved feasibility or inclusivity; it remains a vital part of pursuing representativity among online survey takers.
Including survey takers from all device types is an important part of achieving representativity.
Perhaps an individual’s history with a device is the best predictor of device preference, or perhaps device preference is somewhat intrinsic. Either way, the difference in preference does exist, and as a result an online sample that excludes certain device users should be called into question. After all, we would not want a representative sample to consist only of people who prefer vanilla over chocolate. Although such a sample might not affect the topic at hand, it is difficult to predict when and how it would alter our findings.
Watch the webinar recording: Mobile Optimisation Research | Market Research Society