From the silent era to the present, film productions have shaped the way the public views campus life. Collaborations between universities and Hollywood entities have disseminated influential ideas about race, gender, class, and sexual difference. Even more directly, Hollywood has drawn writers, actors, and other talent from the ranks of professors and students while also promoting the industry in classrooms, curricula, and film studies programs. In addition to founding film schools, university...