- Author(s): Jean Baldwin Grossman, Christianne Lind, Cheryl Hayes, Jennifer McMaken, and Andrew Gersick
- Publisher(s): Public/Private Ventures and The Finance Project
Research Approach
The goal of this study was to clarify the costs of high-quality out-of-school time (OST) programs. The sampling strategy was designed to select programs whose operational practices and components scientific research has shown to be associated with quality. Working in six cities—Boston, Charlotte, Chicago, Denver, New York, and Seattle—the study team solicited recommendations of highly regarded OST programs from key informants.
This request yielded an initial pool of more than 600 programs, which were categorized according to a typology of relevant program characteristics: age group, location (school- or community-based), operator, program content, and operation schedule. The goal was a relatively even distribution of programs in each city that together represented the full range of relevant OST characteristics.
To narrow the pool and ensure that the sample included programs with quality characteristics, the study team used three criteria: two research-validated structural “markers” of high-quality OST programming (staff/youth ratios and participation rates) and a maturity measure (years of operation).
The study team categorized the more than 600 programs into 36 program types defined by combinations of the characteristics listed above (such as school-year, school-based, community-run, academically focused programs for younger children). Then, within each city, they randomly picked programs from each cell and attempted to interview the executive directors to confirm the programs’ characteristics, apply the selection criteria, and collect information about an array of other quality attributes. These included, for example, a clear organizational mission; small group sizes; adequate space and materials; formal orientation, training, and performance reviews for staff; regular staff meetings; and formal feedback from participating youth and parents.
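The typology and the cell-by-cell random selection can be sketched as follows. The characteristic values below are illustrative stand-ins (the report's exact coding is not reproduced here), chosen only so that their combinations yield the study's 36 program types.

```python
import itertools
import random

# Illustrative characteristic values (hypothetical, not the study's exact
# coding); their combinations form the sampling typology.
AGE_GROUPS = ["younger children", "older youth"]
LOCATIONS = ["school-based", "community-based"]
CONTENTS = ["academic", "enrichment", "multi-purpose"]
SCHEDULES = ["school-year", "summer", "year-round"]

# Each combination of characteristics defines one sampling cell.
CELLS = list(itertools.product(AGE_GROUPS, LOCATIONS, CONTENTS, SCHEDULES))
# 2 * 2 * 3 * 3 = 36 program types, matching the study's count.

def draw_candidate(programs, cell, rng=None):
    """Randomly pick one program of the given type to screen next."""
    rng = rng or random.Random()
    candidates = [p for p in programs if p["type"] == cell]
    return rng.choice(candidates) if candidates else None
```

Selection within a cell continues until a screened program passes the quality criteria, at which point the team moves on to the cost survey described below.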
Once a program of a desired type passed the screening criteria, the study team asked the executive director or designated staff to complete a cost survey, continuing until enough qualified programs of different types had done so. Of the 494 programs contacted, 215 met the three criteria listed above. The study team attempted cost interviews with 196 of them (resource constraints precluded contacting the remaining 19), and 111 completed the survey.
The final sample thus comprised 111 programs with sufficient capacity to complete the detailed surveys and follow-up interviews through which the study team collected the relevant cost data. Programs whose directors were unable or unwilling to provide information on all elements of the data collection protocol were eliminated from the final sample.
Cost data were initially collected through detailed surveys that program directors completed by hand or by phone. To ensure that the information provided in the survey was as complete and accurate as possible, the study team conducted follow-up phone interviews with key staff (usually the executive director and/or financial manager) from all of the programs in the sample. Particular attention was given to verifying cost data, probing for hidden costs (especially those related to in-kind contributions) and double-checking staff salaries and hours.
This information was then compared with information in program budgets and annual reports. Wherever possible, the study team obtained documentation to support the valuation of goods and services received as in-kind contributions. For each program, the study team captured the full cost of operation: the sum of out-of-pocket cash expenditures and the value of in-kind contributions, including donated space. Earlier OST cost studies omitted space because none combined school-based and community-based programs, two settings in which space costs are likely to differ substantially.
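The full-cost definition above reduces to simple arithmetic. The sketch below uses hypothetical dollar figures and category names to show how cash spending and valued in-kind contributions (donated space included) combine into a program's full cost of operation.

```python
def full_cost(cash_expenditures, in_kind_values):
    """Full cost of operation: out-of-pocket cash expenditures plus the
    estimated dollar value of each in-kind contribution."""
    return cash_expenditures + sum(in_kind_values.values())

# Hypothetical program: $250,000 in cash outlays, plus donated space and
# volunteer time valued from supporting documentation.
cost = full_cost(250_000, {"donated space": 40_000, "volunteer staff": 15_000})
# cost == 305_000
```

Treating in-kind values as a dictionary keeps each contribution's valuation auditable against the documentation the study team collected.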
The cost data collected were made comparable through cost-of-living adjustments. By detailing the programs’ wide-ranging costs, this study highlights questions and considerations that are critical to decision-makers in their efforts to build and sustain quality OST programs for children and youth in their communities.
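A cost-of-living adjustment of the kind described can be sketched as deflating each reported cost by a city index. The index values below are hypothetical placeholders, not the figures the study used.

```python
# Hypothetical cost-of-living indices (national average = 1.00); the
# study's actual adjustment factors are not reproduced here.
COL_INDEX = {
    "Boston": 1.32, "Charlotte": 0.95, "Chicago": 1.10,
    "Denver": 1.05, "New York": 1.40, "Seattle": 1.15,
}

def comparable_cost(reported_cost, city):
    """Deflate a reported cost by the city's index so per-program costs
    can be compared across the six study cities."""
    return reported_cost / COL_INDEX[city]
```

Under this scheme, a dollar of reported cost in a high-index city counts for less after adjustment, so identical adjusted costs reflect comparable real resource use.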