Purpose of the Survey
This short survey was designed to gather information about our students’ backgrounds and finances that can be used to update our local knowledge and to respond to accrediting agencies and to external surveys from publishers.
The idea of a short, quick survey stemmed from past experiences with survey administrations that were perceived as so intrusive that more than half the faculty declined to participate. Those refusals required extensive re-sampling and compromised the attempt to obtain a stratified random sample. In sharp contrast, during the administration of this survey only one faculty member declined, although a few others asked that we sample a different section. This cooperation was primarily due to the estimated time needed to complete this survey (3 minutes) compared with past in-class surveys (30-45 minutes). Thus, in terms of faculty cooperation, the survey was a huge success and demonstrated that we now have a mechanism for gaining useful information about our students in a minimally intrusive manner.
Strengths and Weaknesses of the Sample (sample-population comparisons available separately)
The Quick Survey is a clear example of the challenge of acquiring a sample good enough to support useful estimates of the campus population. A great deal of effort went into implementing random sampling strategies as far as was practical, but the realities of administering such a survey made compromises unavoidable. Three sampling strategies were implemented: in-class, on-line, and email. Both the in-class and on-line samples were drawn from a population (sampling frame) of all lecture and seminar classes (C1-C6) with official enrollments of at least 10 students. These stratified cluster samples closely matched the known population parameters (for all lecture and seminar courses) by course level, day/evening status, and college. The exclusion of some modalities (e.g., supervision, independent study) and of courses with low enrollments means that not all courses had a chance of being selected. Furthermore, a few courses had to be replaced because the faculty preferred to have a different section participate.
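The frame construction and stratified cluster draw described above can be sketched as follows. This is a minimal illustration only: the section records, field names, and stratum allocation are hypothetical placeholders, not the actual sampling frame or procedure used.

```python
import random

# Illustrative "section" records standing in for the real sampling frame.
# Each class section is a cluster; strata are course level, day/evening,
# and college. All data below are made up for demonstration.
sections = [
    {"id": "ENG101-01", "level": "lower", "time": "day",     "college": "A", "enrollment": 25},
    {"id": "ENG101-02", "level": "lower", "time": "evening", "college": "A", "enrollment": 8},
    {"id": "BIO320-01", "level": "upper", "time": "day",     "college": "B", "enrollment": 30},
    {"id": "HIS210-01", "level": "lower", "time": "day",     "college": "C", "enrollment": 18},
]

def build_frame(sections, min_enrollment=10):
    """Frame: lecture/seminar sections with official enrollment >= 10."""
    return [s for s in sections if s["enrollment"] >= min_enrollment]

def stratified_cluster_sample(frame, per_stratum=1, seed=2007):
    """Draw whole sections (clusters) at random within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for s in frame:
        key = (s["level"], s["time"], s["college"])
        strata.setdefault(key, []).append(s)
    sample = []
    for _, members in sorted(strata.items()):
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

frame = build_frame(sections)
sample = stratified_cluster_sample(frame)
```

Note that, as in the survey itself, low-enrollment sections (here, the 8-student section) never enter the frame, so they have no chance of selection.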
The in-class survey was distributed by the instructor of record. For their convenience, instructors were given a window of a few weeks to administer it. It is unknown how many faculty administered it during class time (potentially yielding a high response rate) or had students take it home and return it to class (potentially yielding a lower response rate). Furthermore, not all students who were officially enrolled would have been in attendance. The on-line survey was distributed via Blackboard, and the instructor of record was asked to encourage students to log in and participate. Finally, the email survey was administered to a random sample of all students (regardless of courses taken) who were not already enrolled in a class constituting one of the other samples; this group included students who took only courses other than lectures and seminars. The response rates for the three administrations differed substantially by method (in-class: 46%, on-line: 19%, email: 12%). The number of non-participants further compromises the goal of obtaining a truly random final sample, since it is impossible to know the degree to which the non-participants had characteristics that could skew the results.
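The response-rate calculation behind those figures is simply respondents divided by students sampled, per mode. The counts below are illustrative placeholders; only the resulting rates (46%, 19%, 12%) come from this report.

```python
# Response rate = respondents / eligible sampled students, per mode.
# Counts are made-up examples chosen to reproduce the reported rates.
def response_rate(respondents, sampled):
    return respondents / sampled

modes = {
    "in-class": (460, 1000),
    "on-line":  (190, 1000),
    "email":    (120, 1000),
}
rates = {m: response_rate(r, n) for m, (r, n) in modes.items()}

# Overall rate for the combined sample (weighted by students sampled).
overall = sum(r for r, _ in modes.values()) / sum(n for _, n in modes.values())
```

The overall rate is weighted by the number of students sampled in each mode, so it is not a simple average of the three percentages.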
What we can say about the sample is that its demographics represent a healthy cross-section of our enrollment by class level, college of major, ethnicity, sex, full-time/part-time status, and age group. There was some oversampling of post-baccalaureate students, since many of these students also took undergraduate courses. How closely the three samples fit the total population parameters, based on the fall 2007 census data, sometimes depended on the method of administration. For the most part, these results showed that the samples (especially when combined) reasonably represent the campus’s population demographics. Although it would be unreasonable to draw absolute inferences from this sample to the population, the fact that the sample demographics are very similar to those of the population gives us some confidence that the results generated from these data are a useful estimate of our students’ backgrounds and finances in terms of hours of employment, sources of finances, number of dependents, obstacles to achieving educational goals, and first-generation status. These results can be cross-tabulated by a variety of demographics such as class level, age, sex, ethnicity, and college of major.
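One standard way to quantify how closely a sample's demographic mix fits known census proportions is a chi-square goodness-of-fit statistic. The sketch below uses invented class-level proportions and counts purely for illustration; it is not the comparison actually performed in this study.

```python
# Hypothetical check of a sample's class-level mix against census
# proportions. All numbers are illustrative, not fall 2007 data.
census = {"freshman": 0.30, "sophomore": 0.25, "junior": 0.22, "senior": 0.23}
sample_counts = {"freshman": 150, "sophomore": 120, "junior": 115, "senior": 115}

n = sum(sample_counts.values())

# Chi-square statistic: sum of (observed - expected)^2 / expected,
# where expected = sample size * census proportion for each category.
chi_sq = sum(
    (sample_counts[k] - n * census[k]) ** 2 / (n * census[k])
    for k in census
)
# A statistic that is small relative to the critical value for
# len(census) - 1 degrees of freedom indicates close agreement.
```

In practice a library routine (e.g., scipy.stats.chisquare) would also return a p-value, but the statistic itself shows the idea: small deviations between observed counts and census-based expected counts yield a small value.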
Two additional administrations of the survey were conducted: one for nursing students and the other for UNV 101 (Freshmen Experience) courses. These surveys were administered well after those described above, and students were instructed not to fill out a survey if they had already done so in another class. These were separate studies, and their results will not be commingled with those from the combined in-class, on-line, and email samples.