Please use this identifier to cite or link to this item:
https://hdl.handle.net/10620/19131
Longitudinal Study: HILDA
Title: Wave 21 Data Quality
Authors: Watson, Nicole; Nesa, Mossamet Kamrun; Summerfield, Michelle
Institution: Melbourne Institute
Publication Date: Nov-2022
Series: Discussion Paper No 1/22
Abstract: The fieldwork for wave 21 of the Household, Income and Labour Dynamics in Australia (HILDA) Survey, conducted between July 2021 and March 2022, was affected by the various lockdowns and restrictions in place due to the COVID-19 pandemic. A previous HILDA discussion paper (Watson, Jin and Summerfield, 2021) detailed the changes made to the wave 20 fieldwork procedures in the first year of the pandemic and examined their impact on data quality. For the most part, data quality in wave 20 was very similar to previous waves. There was a relatively small reduction in the re-interview rate in wave 20 and a small improvement in the amount of missingness in the Self-Completion Questionnaire (SCQ). More concerning was the reduction in response rates for new entrants and children turning 15. This paper focuses on data quality for wave 21. While it was initially planned to return to face-to-face interviewing this wave, these plans were changed with the escalation of COVID-19 cases in New South Wales, Victoria and the Australian Capital Territory just prior to fieldwork commencing. Many of the fieldwork changes implemented in wave 20 were therefore continued in wave 21. Chief among these was interviewing primarily by telephone rather than face-to-face. As a result, the following features were also retained: i) showcards (in the form of an A5-sized booklet) were mailed to all households and made available online; ii) the SCQ was made available online as well as in hardcopy format; and iii) the incentive offered was $40 per completed individual interview and $20 for a completed SCQ. Further, experienced interviewers were trained via live webinars. New and less experienced interviewers were trained face-to-face in Queensland, Western Australia and South Australia, and via live webinars in the remaining states. All interviewers had self-paced learning to complete prior to group training, and interviewer training was extended by an additional two weeks to allow for smaller webinar groups. The aspects of data quality examined in this paper include response rates, response timing, missing data, use of multi-item response options, responses chosen for questions with long response lists, straight-lining, rounding, and the length of responses provided at open-ended questions. These data quality measures for wave 21 are typically compared to those for waves 18 to 20.
URL: https://melbourneinstitute.unimelb.edu.au/__data/assets/pdf_file/0009/4368105/hdps122.pdf
Research collection: Reports and technical papers
Appears in Collections: Technical Papers