Building a Stronger Principalship, Vol. 5
  • Author(s): Brenda J. Turnbull, Leslie M. Anderson, Derek L. Riley, Jaclyn R. MacFarlane, and Daniel K. Aladjem
  • Publisher(s): Policy Studies Associates, Inc.
  • Page Count: 64 pages

Research Approach

This report is based on an analysis of data collected by the evaluation team. Data sources include interviews, focus groups, surveys, observation, and document review.  

More specifically, the data sources are as follows:

  • Semi-structured interviews with administrators in district central offices and in outside programs that partnered with the districts (e.g., in universities), conducted during annual site visits in spring 2012-15.
  • Focus groups with novice principals and assistant principals (i.e., those in their first, second, and third years in their position) in spring 2013-15.
  • Surveys of novice principals and assistant principals in spring 2013-15.
  • Documents including the districts’ proposals, work plans, and progress reports for the foundation.
  • Observation of and participation in cross-site meetings from 2011 through 2015, including observation of presentations and panel discussions by district leaders.

Site-visit interviews were arranged by the project director in each district, responding to specifications from the evaluation team. In each year researchers requested interviews with the project director, the superintendent (or, in New York City, another high-level official in the central office); other members of the executive team such as the directors of human capital, curriculum and instruction, and data systems; and central-office staff and partner-program leaders who were, collectively, knowledgeable about standards, preparation programs, hiring and placement, supervision, evaluation, and support for principals. Where two or three people worked closely together on a particular function, the team typically conducted a joint interview with them. In some cases, project directors arranged one or more larger group interviews (e.g., the eight principal supervisors in Hillsborough County were interviewed in two groups in 2014, as were the eight principal coaches in that district). Numbers of interviewees are shown in Exhibit 1 on page 6 of the report.

The semi-structured interview protocols included not only factual questions about all components but also probes for the respondents’ perceptions of what district practices appeared to be serving the intended purposes, what practices appeared to need improvement, and what changes, if any, were under consideration. Respondents were asked more detailed follow-up questions about their particular areas of responsibility.

Focus groups with novice principals and assistant principals addressed their experiences and perceptions related to standards, preparation, hiring, evaluation, and support. Project directors identified the participants for these focus groups. More complete data were gathered from novice principals and assistant principals in web-based surveys administered annually in spring 2013-15. For principals, all of those in their first three years in the position were surveyed in all districts. For assistant principals, all in their first three years in the position were surveyed in five of the districts; a random sample was surveyed in New York City. The surveys addressed these novice school leaders’ perceptions and experiences related to each pipeline component, along with a few questions about their backgrounds and career plans. Surveys were revised from year to year to provide more data on the focal topics for upcoming study reports; thus the 2013 surveys had a few extra questions on preparation experiences, and the 2015 surveys had a few extra questions on principal evaluation and support.

Response rates generally exceeded 80 percent in five of the districts and ranged from 31 to 54 percent in New York City across the years. Appendix A of the report provides information on the response rates and on the researchers’ procedure for checking for non-response bias in the New York City principal data. (The New York response rate was lowest, 17 percent, among assistant principals in 2013, and researchers have excluded those respondents from their analysis.)

Data Analysis

For this study of initiative implementation across six districts, the district is the primary unit of analysis. Therefore, where this report presents a single frequency drawn from responses across all districts, the analysis gives equal weight to each district. If the raw survey responses were simply pooled across districts, New York City, which had the largest number of novice principals, would be over-represented and the other districts under-represented. Because of these substantial cross-district differences in the numbers of novice principals, researchers applied post-stratification weights, as described in Appendix A, so that each district would be equally represented in overall analyses.
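
As a minimal sketch of how such district-level weighting can be carried out (written in Python with pandas; the district names, column names, and weighting formula below are illustrative assumptions rather than the procedure documented in Appendix A), each respondent can be given a weight inversely proportional to the number of respondents in his or her district, so that every district contributes the same total weight to a pooled frequency:

    import pandas as pd

    # Hypothetical respondent-level survey data: one row per respondent.
    responses = pd.DataFrame({
        "district": ["New York City", "New York City", "New York City",
                     "Denver", "Denver", "Hillsborough County"],
        "agrees": [1, 0, 1, 1, 0, 1],
    })

    n_districts = responses["district"].nunique()
    district_sizes = responses["district"].value_counts()

    # Each district carries 1/n_districts of the total weight, split evenly
    # among its respondents, so larger districts do not dominate pooled results.
    responses["weight"] = responses["district"].map(
        lambda d: (1.0 / n_districts) / district_sizes[d]
    )

    # Pooled, district-weighted share of respondents agreeing with an item.
    weighted_share = (responses["agrees"] * responses["weight"]).sum()
    print(f"District-weighted share agreeing: {weighted_share:.2f}")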

The survey data presented in this report were analyzed for change over time in two main ways:

  1. by comparing cross-sections of the principal and assistant principal responses gathered in 2013, 2014, and 2015; and

  2. by comparing cohorts of principals and assistant principals who started their position at five different time points. 

The first three cohorts of principals and assistant principals were surveyed in 2013, and one new cohort of first-year principals and assistant principals was surveyed in each of the two subsequent years, for a total of five mutually exclusive cohorts. Thus, for several analyses in this report, researchers compare the first two cohorts of principals, who started their position in 2010-11 or 2011-12, with the last two cohorts, who started their position in 2013-14 or 2014-15. The aim in combining pairs of cohorts was to maximize the distance in time between the groups being compared while boosting the number of cases in each group. See Appendix A for a more detailed description of cohort assignments and analyses.
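
A minimal sketch of this cohort grouping, again in Python with pandas, is shown below; the start years follow the text, but the field names and the illustrative outcome variable are assumptions, not the study’s actual analysis files:

    import pandas as pd

    # Hypothetical respondent records: first year in the position plus an
    # illustrative survey rating to compare across cohorts.
    respondents = pd.DataFrame({
        "start_year": ["2010-11", "2011-12", "2012-13",
                       "2013-14", "2014-15"] * 2,
        "rating": [3, 4, 3, 5, 4, 2, 4, 3, 4, 5],
    })

    def cohort_group(start_year: str) -> str:
        """Collapse the five mutually exclusive cohorts into combined groups."""
        if start_year in ("2010-11", "2011-12"):
            return "early (first two cohorts)"
        if start_year in ("2013-14", "2014-15"):
            return "late (last two cohorts)"
        return "middle (2012-13)"

    respondents["cohort_group"] = respondents["start_year"].map(cohort_group)

    # Compare the combined early and late cohorts on the illustrative rating.
    print(respondents.groupby("cohort_group")["rating"].mean())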

Qualitative analysis was iterative. The evaluation team coded interview transcripts and notes according to the components and supporting mechanisms of the Principal Pipeline Initiative. Consistent with the purpose of the implementation study, the analysis for each component emphasized what had been done, what had helped or hindered those steps, what appeared to be working well, what challenges had emerged, and what district leaders wanted to do next. Multiple iterations of analysis identified and refined the specific themes, descriptions, and analyses presented in this report. Drafts were reviewed for factual accuracy by the team members who had visited each site and revised as necessary. The districts’ progress reports to The Wallace Foundation were a supplementary source for detailed factual descriptions of policies enacted; similarly, the initial proposals were a source for facts about policies and practices in place before the initiative. Finally, project directors in the districts conducted a fact-check of this report’s text prior to publication.
