Research in Action: A Practical Guide

Winter 2018

By Mariel Triggs

Nearly a decade ago, an administrator at Lick-Wilmerding High School (LWHS) projected a chart of grade point averages by race at a faculty meeting. The bars indicating the academic achievement of white and Asian students loomed over those of their black and Latino counterparts. The silence was long and eventually broken by faculty conversation ranging from shock to disgust to complex theoretical sociological explanations about why these inequities exist, even at good places like LWHS. For many, it was too big and too difficult an issue to grapple with. 

Despite the difficulty, LWHS continued to examine our lopsided outcomes. As at many other educational institutions, our white and Asian students were outperforming our Latino and black students academically, with significantly higher grade point averages and greater representation in higher-tracked mathematics courses. Meanwhile, Latino and black students were referred to our Student Support Services (SSS) team or placed on academic probation at much higher rates than their peers. 

The school instituted policies to address these issues: mandatory cultural competency training, improved support services, and a requirement that every faculty member create their own equity goals. LWHS wanted to understand the racial student achievement gap so that the institution could take more effective steps to mitigate it. To fulfill LWHS’s mission to serve students from all walks of life, we embarked on a schoolwide effort to alleviate our racial achievement gap through analysis of existing student performance data, development of new metrics of student experience, and qualitative transition studies. Institutional research would play a pivotal role.

In 2015, the E.E. Ford Foundation granted LWHS funding for the newly created position of institutional researcher. The principal goal of the grant was to support the creation of better ways to monitor the growth of students in order to inform and advise decision-makers. To do this, LWHS had to rethink its existing metrics and methods of data collection, implement collaborative analysis, and partner with like-minded institutions. 
 

Step One: Mining Current Data

The first step toward tackling these tough issues in a more data-informed manner was to conduct a data inventory. At LWHS, we had several major data sources to consider: admissions files, the learning management system, the college counseling platform, business office ledgers, and various student life records. 

It took nearly a year to develop a system to align these data sources. Variables, the vocabulary of data, were inconsistent across our platforms. Seemingly innocuous tasks—such as the categorization of types of feeder middle schools—led to conversations across teams about the important characteristics of prior student classroom experience, which were hidden in our coding. Previously, we had classified middle schools as independent, parochial, and public, but because this did not recognize the variety of experiences that public middle schools provided, we further disaggregated the public school category into suburban public, urban public, and charter schools. 
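The disaggregation described above can be sketched in code. This is a minimal, hypothetical example of recoding a broad “public” category into finer types; the column names, category labels, and sample rows are illustrative, not LWHS’s actual coding scheme.

```python
import pandas as pd

# Illustrative student records; fields and values are hypothetical.
students = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "middle_school_type": ["public", "parochial", "public", "independent"],
    "locale": ["urban", None, "suburban", None],
    "is_charter": [True, False, False, False],
})

def disaggregate(row):
    """Split the broad 'public' category into charter / suburban public / urban public."""
    if row["middle_school_type"] != "public":
        return row["middle_school_type"]
    if row["is_charter"]:
        return "charter"
    return f"{row['locale']} public"

# Keep the old coding and add the finer-grained version alongside it,
# so earlier analyses remain reproducible.
students["ms_type_v2"] = students.apply(disaggregate, axis=1)
```

Keeping both the original and the disaggregated variable lets older reports still run while new analyses use the richer categories.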

After reanalyzing our academic achievement gap data through this new lens, we found that the type of middle school was as strong a predictor of academic success as race, with the greatest disparity present during freshman year and shrinking by junior year. The academic transition into LWHS seemed to be most difficult for students who had attended charter schools. While we continued our efforts to address possible causes of the racial achievement gap, such as evidence-based interventions to minimize unconscious bias and stereotype threat, we now had another entry point to explore.

Step Two: Identifying Causes Through Qualitative Research 

Our quantitative research exposed the inequities but not why they existed. Program leaders offered theories and anecdotes that supported the results and were ready to solve the perceived problems before we fully understood the issues. Our school is dynamic and agile, and it was uncomfortable to not act immediately when we cared so much. As the institution evolved into one committed to making more data-informed decisions, we realized now was the time for qualitative research.

There is no perfect instrument to measure feelings and perceptions. Plus, sampling and data collection methods allow for bias and other errors, which can distort results. In the book Teachers Investigate Their Work: An Introduction to Action Research Across the Professions, Herbert Altrichter articulates a remedy for this in which he identifies triangulating data as a means to “give a more detailed and balanced picture of the situation.” 

LWHS instituted a rule of threes for research: at least three unique sources of data, at least three different data-gathering methods, and at least three data researchers or evaluators. For the Charter School Transition Study, LWHS formed a team that included a member from the admission office, a member from the student life office, and the institutional researcher. The team held a focus group with former charter school students, interviewed current students, and spoke with counselors from middle school charters, who explained their culture, daily schedules, and grading. From this research, two distinct narratives emerged.

Students from structured charter schools had a very regimented schedule, constant early academic intervention from adults, and a highly structured approach to learning. They knew how to study but did not know how to manage their unscheduled time during the school day at LWHS. 

Students from progressive charter schools tended to have more unstructured class environments, adult interventions centered mostly on behavior, and less rigorous academic expectations. They were more independent but did not know how to study. 

In both groups, most students were black or Latino and, at LWHS, were in the numeric minority for the first time. Stereotype threat was also at play.

To serve students from charter schools effectively, LWHS needed to undergo changes that involved nearly everyone on campus. As the institutional researcher, I presented the research methods and the distinct narratives that emerged from the study, and suggested evidence-based interventions tailored to the types of decisions each group made. For example, faculty were presented with academic mindset research and effective pedagogical interventions. 

Another decision-making group is our Mastering Educational Tools for Achievement (META) Program, which works with six to eight freshmen whom we identify as needing extra help transitioning into our high school. META was presented with its students’ highs and lows during freshman year, mapped by subject, along with their GPA progression across their high school careers. Once teams were informed of the issues, their mechanisms, and proven solutions, they brainstormed next steps: adjusting the freshman rotation curriculum to introduce students more intentionally to the school’s support systems, targeting learning-strategies instruction to specific students in appropriate subjects, and continuing support through the sophomore year. However, we needed more information to see whether these efforts were working, and the existing end-of-year course evaluations would not suffice.
 

Step Three: Making Data Analysis Easy 

Since so many teams were chipping away at the same problem, we needed to coordinate indicators of progress. The struggle to create a common language revealed our blind spots and offered opportunities to unify our understanding. We created a shared “master codes” spreadsheet in which variables are defined and checked collectively. Each major data source has a point person with two responsibilities: updating exports from their database to a shared spreadsheet and reviewing the master codes.
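A point person’s review step can be automated in part. The sketch below shows one way a master-codes registry might be used to flag export values that drift from the agreed vocabulary; the field names, allowed codes, and sample rows are hypothetical, not LWHS’s actual master codes.

```python
# Hypothetical master codes: each field maps to its agreed set of values.
MASTER_CODES = {
    "ms_type": {"independent", "parochial", "suburban public",
                "urban public", "charter"},
    "grade_level": {"9", "10", "11", "12"},
}

def validate_export(rows):
    """Return (row_index, field, bad_value) for every value not in the master codes."""
    errors = []
    for i, row in enumerate(rows):
        for field, allowed in MASTER_CODES.items():
            value = row.get(field)
            if value is not None and value not in allowed:
                errors.append((i, field, value))
    return errors

export = [
    {"ms_type": "charter", "grade_level": "9"},
    {"ms_type": "Public", "grade_level": "9"},  # inconsistent coding gets flagged
]
issues = validate_export(export)
```

Running a check like this on each upload catches coding drift (capitalization, retired categories, typos) before it distorts a cross-team analysis.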

Through data blends of these shared Google Sheets, we created interactive dashboards, which we dubbed “flashlight tools.” (Schools that use Microsoft OneDrive can accomplish similar results with Excel and Power BI.) The dashboards allow us to compare student data across demographic backgrounds, giving us a holistic view of the institution. With a click, administrators can see academic achievement measures in various disciplines over time by level of financial aid, race, gender, or type of middle school. We can easily monitor student achievement, measured through GPA or activities, sliced by student demographics or course data. However, traditional measures of student achievement were not sufficient on their own; we also wanted to measure mediating variables, such as students’ perceptions of LWHS and their mindsets.
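The “data blend” behind a flashlight tool is essentially a join followed by an aggregation. This is a minimal sketch, assuming a pandas-based workflow and invented sample data; the actual tools described above are built on blended Google Sheets.

```python
import pandas as pd

# Two hypothetical exports from different point people.
demographics = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "ms_type": ["charter", "parochial", "charter", "independent"],
})
grades = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "gpa": [3.1, 3.6, 2.9, 3.8],
})

# Blend the sources on the shared key, then summarize by group,
# mirroring what a dashboard slice does interactively.
blend = demographics.merge(grades, on="student_id")
summary = blend.groupby("ms_type")["gpa"].mean().round(2)
```

A dashboard layer then renders such summaries interactively, with the grouping column (financial aid level, race, gender, middle-school type) chosen by the viewer.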

 

Step Four: Monitoring Indicators Through the Student Experience Survey 

Twenty years of research in the field of education have shown that student evaluations of teachers are problematic: respondents’ gender bias and grade expectations tend to color the evaluation more than the actual quality of instruction does. For example, a University of California, Berkeley, study published last year in ScienceOpen Research showed that biases against female instructors are large and significant, even on seemingly objective measures, such as how long it takes to grade assignments. 

Previously at LWHS, some students completed course feedback forms at the end of the school year, which not only prevented teachers from making timely adjustments to curriculum and instruction but also deterred students from sharing thoughtful answers, since they knew their efforts would not benefit their own learning. Teachers tended to be highly critical of themselves, fixating on a few bad reviews. Also, each feedback form was designed by the individual teacher, and the lack of uniformity meant that we could not examine the overall health of the school.

Last year, we replaced the course feedback forms with standardized Student Experience Surveys (SES), distributed about seven weeks into each semester. Most questions were replaced with reliable, validated psychometric items that measure students’ perceptions of the class environment’s impact on their learning, their sense of belonging, and their confidence in their ability to learn the material. Students are no longer asked to be pedagogical gurus or to rate their teachers. Instead, they are asked to share their own experiences—something they are experts in. When triangulated with assessments of student work and teachers’ experience with the students, these results can inform whether interventions need to be adjusted in order to be effective. Some immediate issues came to light: anxiety levels among female students in mathematics classes were significantly higher than among male students, urban charter students felt they did not belong, and freshman parochial students’ ratings dipped in nearly all categories from quarter one to quarter three.
 

Step Five: Developing a Shared Understanding Through Data-based Inquiries

Flashlight tools based on the results of the SES are the foundation of LWHS’s data-based inquiries (DBI). Our DBI process resembles the See/Think/Wonder routine from Harvard’s Project Zero, which helps us avoid jumping to conclusions or proposing solutions before a problem is thoroughly understood. To construct a shared understanding of what the data say, departments explore the flashlight tools with a research question in mind, first identifying what the graphs show (see) and what the data mean (think), and then making low-level inferences about why that may be (wonder). The goal of the DBI is to develop a deeper, more actionable research question to drive next steps. Teachers also receive personalized SES flashlight tools for their own classes.

Nearly everyone on campus is a decision-maker influencing the student experience, and now everyone has metrics to reveal the growth edges for LWHS. Teachers in particular shape how welcomed, challenged, and supported our students are. This year, Jennifer Selvin, an English teacher; Tamisha Williams, dean of adult equity and inclusion; and Randy Barnett, assistant head of school, redesigned in-house professional development that makes each teacher a researcher. Now, every teacher chooses a research question regarding student belonging based on the SES, qualitative transition studies, and the summer reading assignment (this year it was Blindspot by Mahzarin Banaji). Throughout the year, teachers gather and analyze data, design interventions, and then check to see if their interventions worked, just as the administration modeled.
 

Step Six: Finding Proven Interventions Through Partnerships

Today, teachers and administrators at LWHS are piloting interventions to ease the transition for incoming ninth-graders at the school. Some of these initiatives are based on vetted tools, and many are of our own design. Developing logic models that connect theories of change to key mediating and observable indicators leads to a more cohesive vision of success, higher levels of buy-in, and effective plan implementation. Currently, the math department requires incoming students and teachers to take an online class at Stanford about how to learn and teach mathematics by developing growth mindsets. Our Learning Services Center (LSC) administers the Learning and Study Strategies Inventory (LASSI) to freshmen to help students understand their current study strengths and weaknesses. META has adjusted its program to engage more with its students’ families.

Luckily, we are not alone in these endeavors. Other like-minded schools have made changes to their programs to address similar issues. While collaboration would be ideal, coordination can be difficult due to budget constraints, the need to protect student data, and disparate evaluation methods between institutions. However, outside associations deal with these issues by coordinating partnerships between institutions.

In fact, Jen de Forest, associate director of the California Association of Independent Schools (CAIS), is coordinating one such partnership in which LWHS is participating. By pooling schools together, de Forest has made us a big enough “client” that cutting-edge vendors will adapt their products to our needs. For example, Panorama Ed is a company that helps school districts monitor student learning. With CAIS’s coordination, it will survey our students on their social-emotional learning (SEL) competencies and their perceptions of the SEL support systems at LWHS and 35 other schools. The aggregate data lets us see trends beyond an individual school’s small samples, assess the effects of different types of interventions, benchmark ourselves against our peer schools, and draw meaningful conclusions from the results. 

We are also partnering with Stanford University’s research group Project for Education Research that Scales (PERTS). It provides free evidence-based professional development, monitoring tools, and searchable databases for proven student intervention methods. Currently we have two groups piloting PERTS’ Engagement Project. We are testing an online survey and a professional learning community action research protocol, which are designed to provide teachers with weekly information to identify causes of student disengagement, along with appropriate interventions proven to improve the learning conditions of the class.

 

What’s Ahead

The evolution of the LWHS culture takes time. Helping a culture shift to value data analysis takes training. Coordinating evidence-based interventions requires decision-making systems to adjust. We have made mistakes along the way, learned from them, and capitalized on opportunities such as accreditation to increase the community’s statistical literacy and research skills. We still have an achievement gap, but our coordinated efforts are just now taking hold. We now monitor our students’ academic indicators through the lenses of race, gender, region, type of middle school, previous family academic attainment, learning accommodations, transitional program participation, and socioeconomic status. Preliminary results are promising, and the cycle continues. Constant reflection, high expectations, and actions informed by the latest educational research mean our work is never done.
Mariel Triggs

Mariel Triggs is the institutional researcher at Lick-Wilmerding High School in San Francisco.