Social and Political Dimensions of Civic Engagement: The Impact of Compulsory Community Service

Authors: Ailsa Henderson, Steven D. Brown and Mark Pancer

Published in February 2012 in Politics and Policy.

Abstract: In 1999, the Canadian province of Ontario joined a number of other jurisdictions in requiring its high school students to complete volunteer service before graduating. The primary objective of this program, and others like it around the world, was to address declining civic engagement within society. Using a quasi-experimental design, we explore the impact of mandatory volunteering on its stated aims. Our findings suggest that volunteering in high school has positive impacts on the political dimensions of a student’s subsequent civic engagement, measured here as political involvement, political activism, political interest, and political efficacy. However, those impacts are largely conditional on two features of the volunteering experience: sustained commitment to one placement and a positive experience as evaluated by the student. High school community service seems to be unrelated to social dimensions of civic engagement, measured here as involvement in a variety of social, cultural, and religious organizations.

Making Scholarly Data Public?

Andrew Gelman from the Monkey Cage writes:

“The answer is clear to me: by making your data available, you are making it more likely that others will replicate your results, continue the directions of your research, cite you, etc. Fame and fortune await.”

Yet not all political scientists make their data available:

“If it’s so good to do, why isn’t everybody doing it?

Let’s set aside the cheaters and the insecure people, those scholars who are worried that if someone else gets their hands on their data, they will come to different conclusions. And let’s set aside those researchers who are so clueless that they honestly seem to think that their particular analysis is the last word on the subject ….

I can think of two reasons we (those of us who would actually like our research to be reproduced) don’t routinely share data and code:

1. Effort. This for me is the biggie. As Aven notes, there are social benefits to making data and code available, and as I note above, there are direct personal benefits as well. But these benefits are all medium-to-long-term and they pale beside the short-term costs of getting my act together to put the data in a convenient place. In fact, when I do actually organize my data, it’s often motivated by a desire to make my life easier when handling repeated requests.

2. Rules. The default is for data and code not to be released. Often there are silly IRB rules or commercial restrictions on data. In other cases it seems like too much effort to find out. Again, though, it can be good self-interest to make data available. For example, in our wildly-popular (not yet but eventually, I hope) mrp package in R, we use CCES data, not Annenberg or Pew, for our examples. Why? Because the people at CCES were cool about it. Not only do they release their (old) data for free, they don’t mind us reposting it. Ultimately CCES benefits from this. The freer the data, the easier it will be for people to do analyses, cite CCES, suggest improvements to CCES, etc.”

You can read his full post here.