English Abstract
In order to improve coverage and response rates and to reduce survey costs, mixed modes have become common in large-scale panel surveys. If the assignment of survey mode is not random, respondents’ answers in a mixed-mode survey may be subject to two kinds of bias. One is measurement bias (i.e., mode effects) evoked by the modes themselves; the other is sample selection bias, which results from respondents’ non-random assignment to different modes. Disentangling these two biases is a crucial challenge in estimating mode effects. This study adopts propensity score matching, an analytical method that can deal with non-random sample assignment, to examine mode effects on response behaviors in a panel survey with a mixed-mode design of face-to-face and self-administered online modes. The outcome variables analyzed in this study include overall item nonresponse, “refusal” and “don’t know” answers, and two response styles in balanced attitude scales, namely the acquiescent and extreme response styles. The data come from the 2018 wave of the Panel Study of Family Dynamics, in which sample members were pre-assigned to the face-to-face or self-administered online mode based on whether they had provided an email address and completed an online questionnaire previously sent with a festival greeting card. Because the numbers of completed questionnaires differ greatly between the face-to-face and self-administered online modes, this study uses two matching methods, radius matching and kernel matching, to improve the estimates of measurement effects. Both methods are based on an oversampling strategy and matching with replacement. The results of the two matching methods indicate that the probabilities of item nonresponse and “don’t know” answers in the self-administered online mode were significantly higher than those in the face-to-face mode, consistent with previous studies.
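The two-step procedure sketched above, estimating propensity scores with a logistic model and then matching treated units to controls with replacement, can be illustrated in miniature. This is a hedged sketch only, not the study’s actual implementation: the function names, the Gaussian kernel, the bandwidth value, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def estimate_propensity(X, t, lr=0.1, iters=500):
    """Fit a simple logistic regression by gradient ascent (illustrative,
    not a production estimator) and return predicted propensity scores."""
    Xb = np.c_[np.ones(len(X)), X]          # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))   # current predicted probabilities
        w += lr * Xb.T @ (t - p) / len(t)   # gradient of the log-likelihood
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def kernel_matching_att(y, t, ps, bandwidth=0.05):
    """Kernel matching with replacement: each treated unit (e.g., an
    online-mode respondent) is compared with a kernel-weighted average
    of all control units, weighted by propensity-score distance."""
    treated, control = t == 1, t == 0
    effects = []
    for p_i, y_i in zip(ps[treated], y[treated]):
        # Gaussian kernel weights; controls close in propensity count more
        w = np.exp(-((ps[control] - p_i) / bandwidth) ** 2 / 2.0)
        if w.sum() > 0:  # skip treated units with no control support
            effects.append(y_i - np.average(y[control], weights=w))
    # average treatment effect on the treated (ATT)
    return float(np.mean(effects))
```

On synthetic data with a known treatment effect and a confounder driving selection, the matched estimate recovers the effect far better than a naive mean difference would, which is the point of matching in the mixed-mode setting: the naive comparison mixes the mode effect with selection.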
Also consistent with previous studies, respondents who completed the questionnaires through face-to-face interviews were significantly more likely to give acquiescent responses to the balanced attitudinal scales than those who filled out the online questionnaires themselves. In contrast to previous studies, however, our findings indicate that respondents who completed the self-administered online questionnaires were more likely than face-to-face interviewees to give extreme responses to items on the balanced attitudinal scales. Another finding worth mentioning is that this study did not find significant mode effects for “refusal” answers. This study contributes to research on mode effects, to applications of propensity score matching, and to survey practice. Our findings suggest that, to mitigate mode effects, a mixed-mode survey combining face-to-face and self-administered modes should adopt the same design for “don’t know” and “refusal” options across modes, and respondents in the face-to-face mode should be allowed to enter answers to questions with social desirability concerns by themselves. Despite these academic and practical contributions, this study has its limitations. One is that the response behaviors examined are confined to overall item nonresponse, “don’t know” and “refusal” answers, and styles of responses to attitude scales. In addition to extending the investigation of mode effects to a broader range of survey questions, future research should expand the application of propensity score matching in order to disentangle mode effects from selection effects in mixed-mode surveys. Future directions include, but are not confined to, the selection of covariates for the logistic model used to predict propensity scores, methods of imputing missing values, and other matching strategies.