As Capture Labs conducted year-end analysis on Capture Behavioral Engagement (CBE), one of our primary goals was to understand and describe the ways different student populations visit our partner schools’ websites. In one analysis, we looked at two CBE partner institutions to explore whether students of color (SOC) visit sites more often, or view more pages per visit.
When students were grouped into two bins, SOC and Non-SOC, we found that Non-SOC students visited, on average, 40 percent more often than self-identified SOC. They also viewed, on average, 2.4 more pages per visit than self-identified SOC. Both results were statistically significant.
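A difference-in-means comparison like this one can be sketched with Welch's two-sample t-test. The visit counts below are hypothetical placeholders for illustration only, not CBE data, and in practice the significance call would come from comparing the t-statistic against a t-distribution:

```python
import statistics

def welch_t(a, b):
    """Welch's t-statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variance (n-1)
    standard_error = (var_a / len(a) + var_b / len(b)) ** 0.5
    return (mean_a - mean_b) / standard_error

# Hypothetical unique-visit counts per student (placeholders, not CBE data)
non_soc = [2, 3, 2, 4, 3, 2, 3, 4, 2, 3]
soc = [1, 2, 2, 2, 3, 2, 1, 3, 2, 2]

# Percent difference in mean visits between the two bins
pct_more = (statistics.mean(non_soc) / statistics.mean(soc) - 1) * 100  # 40.0

# A |t| well above ~2 suggests the gap is unlikely to be noise at these sample sizes
t = welch_t(non_soc, soc)
```

A production analysis would more likely use `scipy.stats.ttest_ind(..., equal_var=False)` to get a p-value directly; the hand-rolled version above just makes the arithmetic visible.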
When we break the results down by specific ethnicity, more complex patterns emerge, particularly in the difference between self-reported African-American and White students. For example, in the table below, we see that Black students made, on average, 2.2 unique visits to institutional sites, while White students made 1.6.
Descriptive Statistics by Ethnicity
As we know from past research, repeated visits to a site are positively correlated with application. The table below shows students who advanced to the applicant stage and beyond.
Descriptive Statistics by Ethnicity (>= Applicants)
When sorted in descending order by unique visits per visitor, Black students (and students who did not report an ethnicity) have more unique visits, on average, than White students do.
This finding provides some evidence that students of color might interact with institution websites in different ways than majority populations. If this proves to be a generalizable result (see limitations below), we can illustrate to partner institutions how their websites can and should be a venue that is disproportionately used by specific groups of students. Targeting messages and dynamic content through CBE can be more effective for communicating with SOC — specifically Black students — with the end goal of increasing student diversity.
As always, we should be cautious about interpreting results. First, the assignment of specific ethnicities can be difficult, and in some cases arbitrary.
Second, results from these two institutions might not generalize, owing either to their specific student populations or to interventions we did not control for. An email sent specifically to SOC, for example, might have an impact that is not visible here.
Finally, beware of Simpson’s paradox, in which group-level findings weaken, disappear, or even reverse when the data are disaggregated into sub-groups.
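A minimal numeric illustration of the paradox, using made-up application counts (not CBE data): one group can lead within every sub-group yet trail in the aggregate, purely because of how the sub-groups are sized.

```python
# Hypothetical (applied, visited) counts per sub-group -- illustration only, not CBE data.
group_a = {"subgroup_1": (8, 10), "subgroup_2": (20, 100)}
group_b = {"subgroup_1": (70, 100), "subgroup_2": (1, 10)}

def rate(applied, total):
    """Application rate: fraction of visitors who applied."""
    return applied / total

# Group A leads within every sub-group (0.8 > 0.7 and 0.2 > 0.1)...
assert all(rate(*group_a[k]) > rate(*group_b[k]) for k in group_a)

# ...but Group B leads overall, because its visitors are concentrated
# in the high-rate sub-group while Group A's sit in the low-rate one.
a_total = rate(sum(x for x, _ in group_a.values()), sum(n for _, n in group_a.values()))
b_total = rate(sum(x for x, _ in group_b.values()), sum(n for _, n in group_b.values()))
assert b_total > a_total  # 71/110 vs 28/110
```

This is why a finding like "SOC visit more often" should be re-checked within each institution and each ethnicity bin before being treated as a general pattern.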
By Brad Weiner, Ph.D., Director of Data Science, Capture Higher Ed