The Healthy Minds Study provides a detailed picture of mental health and related issues in college student populations. Schools typically use their data for some combination of the following purposes: to identify needs and priorities, benchmark against peer institutions, evaluate programs and policies, plan services and programs, and advocate for resources.


The Healthy Minds Study is designed to protect the privacy and confidentiality of participants. HMS is approved by the Health Sciences and Behavioral Sciences Institutional Review Board at the University of Michigan. To further protect respondent privacy, the study is covered by a Certificate of Confidentiality from the National Institutes of Health.

SAMPLING

Each participating school provides the HMS team with a randomly selected sample of currently enrolled students over the age of 18. Large schools typically provide a random sample of 4,000 students, while smaller schools typically provide a sample of all students. Schools with graduate students typically include both undergraduate and graduate students in the sample.

DATA COLLECTION

HMS is a web-based survey. Students are invited and reminded to participate via emails, which are timed to avoid, whenever possible, the first two weeks of the term, the last week of the term, and any major holidays. The data collection protocol begins with an email invitation; non-responders are then contacted by up to three email reminders, spaced 2-4 days apart. Reminders are sent only to those who have not yet completed the survey. Each communication contains a URL that students use to access the survey.

NON-RESPONSE ANALYSIS

A potential concern in any survey study is that respondents will not be fully representative of the population from which they are drawn. In HMS, we can be confident that those invited to take the survey are representative of the full student population, because they are randomly selected from the full list of currently enrolled students. However, it is still possible that those who actually complete the survey differ in important ways from those who do not. The overall participation rate for the 2016-2017 study was 23%, which raises the question of whether the 23% who participated differ in important ways from the 77% who did not. We address this issue by constructing non-response weights using administrative data on full student populations. Most of the 54 schools in the 2016-2017 HMS were able to provide administrative data on all randomly selected students.
The analysis of these administrative data, separated from any identifying information, was approved in the IRB application at the University of Michigan and at each participating school. We used the following variables, when available, to estimate which types of students were more or less likely to respond: gender, race/ethnicity, academic level, and grade point average. We used these variables to estimate the response propensity of each type of student (based on multivariate logistic regressions), and then assigned response propensity weights to each student who completed the survey. The less likely a type of student was to complete the survey, the larger the weight they received in the analysis, such that the weighted estimates are representative of the full student population in terms of the administrative variables available for each institution. Finally, note that these sample weights give equal aggregate weight to each school in the national estimates. An alternative would have been to assign weights in proportion to school size, but we decided that we did not want our overall national estimates to be dominated by schools in our sample with very large enrollments.
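The weighting logic described above can be sketched in a few lines of code. The sketch below is purely illustrative, not the study's actual implementation: the population, the response model, the feature coding (intercept, female, undergraduate), and the plain gradient-descent fit are all invented assumptions standing in for the real administrative data and multivariate logistic regressions. It shows the core idea: estimate each respondent's response propensity, weight each respondent by the inverse of that propensity, and check that the weighted respondent sample better matches the population composition than the unweighted one.

```python
import math
import random

random.seed(0)

def simulate(n=2000):
    """Hypothetical student population. Each entry is (features, responded),
    with features = (intercept, female, undergrad). The response model below
    is an invented example of differential non-response by group."""
    pop = []
    for _ in range(n):
        female = 1.0 if random.random() < 0.5 else 0.0
        undergrad = 1.0 if random.random() < 0.7 else 0.0
        logit = -1.5 + 0.8 * female + 0.4 * undergrad  # assumed true propensity
        p_true = 1.0 / (1.0 + math.exp(-logit))
        pop.append(((1.0, female, undergrad), random.random() < p_true))
    return pop

def fit_logistic(data, lr=1.0, epochs=300):
    """Fit a logistic regression by plain batch gradient descent."""
    w = [0.0, 0.0, 0.0]
    n = len(data)
    for _ in range(epochs):
        grad = [0.0, 0.0, 0.0]
        for x, y in data:
            z = sum(wj * xj for wj, xj in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))
            for j in range(3):
                grad[j] += (p - y) * x[j]
        for j in range(3):
            w[j] -= lr * grad[j] / n
    return w

pop = simulate()
coef = fit_logistic([(x, 1.0 if r else 0.0) for x, r in pop])

# Inverse-propensity weights for respondents only: the less likely a type
# of student was to complete the survey, the larger their weight.
respondents = []
for x, responded in pop:
    if responded:
        z = sum(cj * xj for cj, xj in zip(coef, x))
        p_hat = 1.0 / (1.0 + math.exp(-z))
        respondents.append((x, 1.0 / p_hat))

# Compare the share of women in the population, among raw respondents,
# and among weighted respondents.
pop_female = sum(x[1] for x, _ in pop) / len(pop)
raw_female = sum(x[1] for x, _ in respondents) / len(respondents)
total_w = sum(w for _, w in respondents)
wtd_female = sum(x[1] * w for x, w in respondents) / total_w

print(f"population share female:      {pop_female:.3f}")
print(f"unweighted respondent share:  {raw_female:.3f}")
print(f"weighted respondent share:    {wtd_female:.3f}")
```

In this sketch the unweighted respondent pool over-represents the groups that were more likely to respond, and the inverse-propensity weights pull the weighted composition back toward the population's. The study's final per-school rescaling (giving each school equal aggregate weight in national estimates) would be one additional normalization step applied to these weights within each school.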

