
Stories From Wave 22: Survey Execution Q&A

CMIE presented Stories from Wave 22 on June 10, 2021. The presentation was followed by a live question and answer session. The audience asked over 150 questions, most of which were answered during the session. We have compiled those questions into separate Q&A pieces. This one answers the questions that were asked about Survey Execution.

Q: Is there documentation available online?

A: Yes, all of our documentation is available online in the How We Do It section of our website. This includes detailed documentation on survey design, survey execution, and the questions and indicators in CPHS, as well as other information. You can find a list of all of our indicators here. There is no “questionnaire” as such, since the survey is administered electronically through an app. However, you can see each screen used in the app for data collection here. You will require a CMIE user ID and password to access some of this material. It is free to create one. With it, you can read all of our documentation and download sample data without having to be a subscriber. If you have any trouble creating a CMIE user ID, we will be happy to help.

Q: Will the face-to-face interview fraction reduce for the 23rd wave?

A: The share of face-to-face interviews depends entirely upon the lockdowns. As the lockdowns ease, the share of face-to-face interviews will rise quickly.

Q: How do you verify Household Income?

A: We cannot ‘verify’ incomes in the sense of seeing tax returns or bank statements. Instead, we quiz the households thoroughly to motivate them to recall all their incomes and report them honestly to us. We also have several validation rules that ensure the responses are not completely out of line with reality. The questionnaire is fairly large, so it is difficult for respondents to fabricate a set of responses that remains internally consistent.
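
To illustrate the idea of such validation rules (this is a hedged sketch, not CMIE's actual rules; the field names and tolerance are hypothetical), a consistency check might compare a household's reported total income with the sum of member-level incomes and flag large discrepancies:

    # Illustrative sketch only: a consistency check of the kind described above.
    # The field names (member_incomes, reported_total_income) and the tolerance
    # are hypothetical, not actual CPHS fields or rules.

    def validate_household_income(household, tolerance=0.10):
        """Flag a household if its reported total income deviates from the
        sum of member-level incomes by more than the given tolerance."""
        member_sum = sum(household.get("member_incomes", []))
        reported_total = household.get("reported_total_income", 0)
        if reported_total <= 0:
            return ["total income missing or non-positive"]
        issues = []
        deviation = abs(reported_total - member_sum) / reported_total
        if deviation > tolerance:
            issues.append(
                f"member incomes sum to {member_sum}, "
                f"but the reported total is {reported_total}"
            )
        return issues

    # This household would be flagged because the parts do not add up to the total.
    print(validate_household_income(
        {"member_incomes": [15000, 8000], "reported_total_income": 40000}
    ))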

Q: Were the response rates in Wave 22 lower in pockets with high Covid-19 cases?

A: The short answer is yes, but we do not have accurate numbers available. We tried hard to reach these regions telephonically, so the skew is unlikely to be significant. Still, this question is worth answering more systematically.

Q: At what point does the possibility that non-response is driven by unobservable characteristics become a worry? The possibility that states/districts/households that opt out of the survey are inherently different from the ones that don’t could affect causal inferences and/or analysis. Is there a way to combat this?

A: We combat selective non-response by ensuring a large sample and smooth, consistent survey execution. As you can see from the balance statistics, we do a good job of keeping the composition of respondents stable.

Q: How were the response rate and balance in the survey maintained when many workers migrated back to their villages?

A: The response rate fell, and the migrants are partly responsible for that. This fall in the response rate could not be helped. What we have ensured is that, even as the response rate fell, the balance of the respondents did not shift too much. This was possible because the execution was unbiased and well spread out geographically.

Q: You are reporting relatively poor response among households earning less than Rs. 72,000 per annum and also among just-literate households. It is known that this group constitutes a large proportion. Is the representative balance still maintained?

A: Our aim in the survey is not to capture a fixed proportion of any income or education group. Rather, our survey design throws a wide net geographically and requires a large sample for each geographic unit. That is, the survey design requires a certain geographic and rural/urban distribution; it does not require evenness of distribution among income or education groups. Despite that, we still find relatively good balance across groups that have decent representation in the data. For groups that account for less than 3 per cent of the sample, we do see some heightened non-response. However, this affects a very small group of households and reflects survey execution under challenging conditions.
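
As a hedged illustration of what checking this kind of balance can look like (the group labels, counts, and threshold below are made-up figures, not CPHS estimates), one could compare each group's share of the sample with its response rate and note when a group is too small to read much into:

    # Illustrative sketch only: comparing response rates across income groups to
    # see whether non-response is concentrated in small groups. All figures here
    # are made up and are not CPHS estimates.

    groups = {
        # group label: (sampled households, responding households)
        "< Rs. 72,000 p.a.":        (450, 260),
        "Rs. 72,000 - 2 lakh p.a.": (9000, 7650),
        "> Rs. 2 lakh p.a.":        (6800, 5780),
    }

    total_sampled = sum(sampled for sampled, _ in groups.values())

    for label, (sampled, responded) in groups.items():
        share = sampled / total_sampled
        response_rate = responded / sampled
        note = "  <- under 3% of the sample, interpret with care" if share < 0.03 else ""
        print(f"{label:28s} share {share:5.1%}  response rate {response_rate:5.1%}{note}")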

Q: What could have triggered the drop in Intentions To Purchase (ITP) for durables in JA21? Is it design-led or sentiment-led?

A: It is sentiment-led. We have tried to ensure that the survey design does not affect the results.

Q: How many interviewers were employed? How did you ensure that there are no individual biases in interviews?

A: We employ over 200 interviewers who conduct interviews face-to-face. They are managed by over 100 personnel distributed across India. We try to eliminate ‘interviewer effects’ to the extent possible. This was a problem that we faced in the early years of the survey, which led to a massive overhaul of the sample in some states like Gujarat. Today, our interviewers are extensively trained and the data they collect is verified in real-time (usually the same day as data collection) by their immediate supervisors. Additionally, we use the ‘para-data’ on each interview, which includes information on the interviewers themselves, to monitor results as they come in and check for any bias algorithmically.
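
As a rough, hypothetical sketch of how para-data can be used to screen for interviewer effects (this is not CMIE's actual monitoring system; the interviewer IDs, numbers, and threshold are purely illustrative), one could compare each interviewer's average collected value on an indicator against the mean of their peers and flag large deviations for review:

    # Illustrative sketch only, not CMIE's monitoring system: flag interviewers
    # whose average collected value on an indicator deviates sharply from the
    # mean of their peers. All numbers and the threshold are hypothetical.
    from statistics import mean

    # interviewer id -> values collected for one indicator (made-up para-data)
    collected = {
        "int_01": [0.62, 0.58, 0.65, 0.60],
        "int_02": [0.59, 0.61, 0.57, 0.63],
        "int_03": [0.91, 0.88, 0.93, 0.90],  # deviates from the other two
    }

    interviewer_means = {i: mean(vals) for i, vals in collected.items()}

    for interviewer, m in interviewer_means.items():
        peers = [v for i, v in interviewer_means.items() if i != interviewer]
        peer_mean = mean(peers)
        if abs(m - peer_mean) > 0.20:  # arbitrary review threshold
            print(f"{interviewer}: mean {m:.2f} vs peer mean {peer_mean:.2f} -> review")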
