Research Agenda
Four Questions, One Thread
My research investigates structural features of introductory CS courses that predict whether students seek help, persist, and develop belonging — with a focus on community college students. The connecting thread is structural equity: the claim, grounded in Seymour and Hunter's work, that departure from STEM correlates with teaching quality and institutional design, not academic ability.
I'm building toward a research program that combines qualitative methods (interview studies grounded in Seymour & Hunter), quantitative learning analytics, and computational tool-building. These are early-stage questions — they need data, methods, and collaboration to answer well. The empirical projects designed to address them are described on the projects page.
Target venues: SIGCSE · ICER · EDM · LAK · Learning @ Scale
Question 1
Help-Seeking and Course Structure
What observable features of introductory CS course design — assignment framing, office hours culture, peer collaboration norms — predict whether students seek help when stuck? Do these predictors differ across demographic groups, and can they be measured from LMS and discussion forum data?
Methods: Learning analytics · LMS log analysis · Qualitative coding (Seymour & Hunter taxonomy)
→ Addressed by Project P1: HelpMap
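To make "measured from LMS data" concrete, here is a minimal sketch of one candidate feature: hours from a student's first assignment open to their first help request, with non-posters surfacing as a possible suppression signal. The event names and log shape are illustrative assumptions, not a real LMS schema or P1's actual pipeline:

```python
from datetime import datetime

# Hypothetical LMS event log rows: (student_id, event_type, ISO timestamp).
# The event_type values are assumptions for illustration only.
events = [
    ("s1", "assignment_open", "2026-01-10T09:00"),
    ("s1", "forum_post_help", "2026-01-12T20:30"),
    ("s2", "assignment_open", "2026-01-10T09:05"),
]

def hours_to_first_help(events):
    """Per student: hours from first assignment open to first help post.

    Students who open assignments but never ask for help map to None,
    a candidate help-seeking-suppression signal worth inspecting.
    """
    opened, asked = {}, {}
    for sid, kind, ts in events:
        t = datetime.fromisoformat(ts)
        if kind == "assignment_open":
            opened.setdefault(sid, t)  # keep earliest open per student
        elif kind == "forum_post_help":
            asked.setdefault(sid, t)   # keep earliest help post
    return {
        sid: (asked[sid] - t0).total_seconds() / 3600 if sid in asked else None
        for sid, t0 in opened.items()
    }
```

On the toy log above, s1 took about 59.5 hours to ask and s2 never asked. A real analysis would condition such features on demographic group and course design variables before interpreting them.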
Question 2
Curriculum Structure and Student Departure
Can a graph-based representation of CS curriculum structure reveal which dependency patterns create avoidable confusion bottlenecks? Do community college CS curricula show structural features that research university curricula don't — and do those features predict DFW rates?
Methods: Curriculum graph analysis · Institutional data · NLP analysis of syllabi
→ Addressed by Project P4: CurriculumGraph
Question 3
Replicating Seymour & Hunter at Community Colleges
Do the departure reasons documented in Talking About Leaving Revisited — teaching quality, weed-out culture, belonging, help-seeking suppression — replicate at community colleges? Are there CC-specific departure pathways not captured in the original taxonomy?
Methods: Semi-structured interviews · Qualitative coding · Grounded theory · IRB study
→ Addressed by Project P3: Why They Left
Question 4
Validating Computational Tools for Instructional Auditing
Can NLP-based analysis of course materials be validated as a reliable proxy for expert judgment about structural features associated with poor student outcomes — specifically features that suppress help-seeking or create what I'm tentatively calling "motivational debt"? This is fundamentally a validation question: can automated analysis match expert annotation with sufficient reliability (Cohen's κ ≥ 0.65) to be useful to instructors?
Methods: NLP · Annotation studies · Tool validation · Instructional design literature
→ Addressed by Project P2: SyllabusAudit
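To ground the κ ≥ 0.65 threshold: Cohen's κ corrects raw agreement between two annotators for the agreement expected by chance. A minimal sketch on toy labels (not project data), assuming two annotators coding the same items with categorical labels:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' labels over the same items."""
    assert len(a) == len(b) and a
    n = len(a)
    # Observed agreement: fraction of items labeled identically.
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: probability both pick the same label independently,
    # estimated from each annotator's label frequencies.
    ca, cb = Counter(a), Counter(b)
    p_chance = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (p_observed - p_chance) / (1 - p_chance)
```

For example, if two annotators code four syllabus sentences as "s" (suppressive) or "n" (neutral) and disagree on one, raw agreement is 0.75 but κ is only 0.5, below the 0.65 bar; κ is deliberately harsher than percent agreement.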
Key Ideas and Influences
The constructs below draw on existing literature and, in some cases, propose extensions that need empirical validation. I list them here as intellectual commitments that shape the research questions above, not as established theory.
Structural departure (Seymour & Hewitt, 1997; Seymour & Hunter, 2019). The landmark finding that students who leave STEM are not academically weaker than those who stay — departure correlates with teaching quality, institutional culture, and help-seeking suppression. This is the empirical foundation for the entire research program. The open question is whether these findings, established at research universities, replicate at community colleges (Q3).
The necessity principle (Harel, 1998; applied by Anderson). Students should experience intellectual need before receiving the tool that answers it. In Harel's formulation for mathematics education: "if math is the medicine, what is the headache?" I'm interested in whether violations of this principle — what I'm tentatively calling "pedagogical debt" by analogy with technical debt — are measurable from course materials and predictive of student outcomes (Q4).
Belonging as structural feature (Walton & Brady; Margolis & Fisher). Sense of belonging in CS is not purely interpersonal — it can be signaled (or suppressed) by course design choices visible in materials alone. The question is whether these signals can be reliably coded and validated against student survey data (P5).
Help-seeking suppression. The behavioral pattern in which students who need help do not seek it, which Seymour & Hunter predict is structurally caused. I'm interested in whether this can be operationalized as a measurable feature in LMS data (Q1).
Curriculum graph analysis. Representing curriculum as a directed graph of learning objectives with typed dependencies (conceptual, procedural, motivational, social). The empirical question is whether structural properties of these graphs — density, longest paths, bottleneck nodes — predict student outcomes. This extends existing prerequisite-chain research by adding dependency types (Q2).
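The graph properties named above can be sketched in a few lines. The toy curriculum below is invented for illustration; the dependency types, metric choices, and threshold for "bottleneck" are assumptions, not P4's actual model:

```python
from collections import defaultdict

# Hypothetical toy curriculum: (prerequisite, dependent, dependency type).
edges = [
    ("variables", "loops", "conceptual"),
    ("loops", "arrays", "conceptual"),
    ("loops", "functions", "procedural"),
    ("arrays", "sorting", "procedural"),
    ("functions", "sorting", "conceptual"),
]

def longest_chain(edges):
    """Number of objectives on the longest dependency path (DAG assumed)."""
    adj = defaultdict(list)
    for pre, dep, _ in edges:
        adj[pre].append(dep)
    memo = {}
    def depth(node):
        if node not in memo:
            memo[node] = 1 + max((depth(m) for m in adj[node]), default=0)
        return memo[node]
    nodes = {n for e in edges for n in e[:2]}
    return max(depth(n) for n in nodes)

def fan_in(edges):
    """Incoming dependency count per objective: high fan-in nodes are
    candidate confusion bottlenecks, since several chains converge there."""
    counts = defaultdict(int)
    for _, dep, _ in edges:
        counts[dep] += 1
    return dict(counts)
```

In this toy graph the longest chain runs variables → loops → functions → sorting (four objectives), and "sorting" has the highest fan-in. The typed third field is carried on each edge so that, in a fuller analysis, motivational and social dependencies could be weighted differently from conceptual ones.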
Roadmap
Now — Spring 2026
IRB Protocol + Corpus Construction
Submit IRB for P3 interview study. Continue collecting syllabi for P2 corpus (47/100+ collected). Develop annotation schema. PhD program applications.
Summer 2026
P3: Departure Interviews
Conduct 20–30 semi-structured interviews with students who left STEM at Foothill. Code against S&H taxonomy. Begin grounded theory analysis.
Fall 2026
P2: Annotation Study + P1: LMS Analysis
Recruit expert annotators for P2. Begin NLP classifier training. In parallel, analyze LMS data for P1 help-seeking feature extraction.
Spring 2027
First Submissions
Submit P3 (interview study) to ICER 2027. Submit P2 (SyllabusAudit) to Learning @ Scale 2027. Pilot P4 instructor annotation study.
PhD Program
Dissertation Research
Integrate P1–P5 into a coherent dissertation on structural predictors of help-seeking and STEM departure at community colleges.
Research and Teaching as One Practice
These research questions did not arise from a literature review. They arose from watching students leave — from financial aid offices, counseling appointments, tutoring sessions, and learning communities where I saw the same structural patterns from every institutional vantage point. The research formalizes what I observed. The teaching practice attempts to fix it.
Every course I design is a potential research site: the Build a Computer from Scratch project creates a natural laboratory for studying help-seeking behavior (Q1), the effects of physical computing on belonging (Q3), and whether constructionist curriculum design measurably reduces the structural departure patterns that Seymour and Hunter documented (Q3). The build journal entries are qualitative data. The milestone completion patterns are quantitative data. The three-track system is a testable belonging intervention (P5). The research and the teaching are the same activity, observed from different angles.
See the full curriculum site for the enacted version of this research agenda.
Last updated: March 2026