Six courses built on one conviction: students learn CS best when they build real things, earn abstractions through struggle, and understand that every algorithm encodes a set of values about the world. No exams. No required textbooks. Every concept earned by building.
Most introductory CS courses follow the same pattern: explain a concept, demonstrate syntax, assign practice problems, assess on recall. Students who arrive with prior exposure tend to succeed. Students who arrive without it fall behind quickly and conclude they are not "a CS person." The course then calls this selection, when in fact it is structure.
This failure is especially concentrated at community colleges, where students arrive with more diverse prior experience, more financial and time constraints, and less inherited access to the social capital that makes traditional CS culture feel familiar. The typical response — remediation, prerequisite chains, stricter placement — treats the wrong variable.
This curriculum takes a different position: the difficulty students experience in introductory CS is not uniformly about ability. Most of it is caused by courses that teach abstractions before problems, syntax before meaning, and efficiency before understanding. The fix is not remediation. It is a different kind of course — one that begins with a problem the student wants to solve, introduces tools only when they’re needed, and assesses understanding through what students build rather than what they recall.
Three pedagogical principles run through every course in this curriculum. They are not buzzwords — each one is a concrete design decision that shows up in how units are sequenced, how projects are scoped, and how student work is assessed. Each principle traces to a specific body of learning science research.
Every major concept is introduced through a problem that requires it — before the formal tool is offered. This is Guershon Harel’s intellectual need principle: students must experience the inadequacy of what they currently have before a new tool will feel like a solution rather than a fact to memorize. A hash table isn’t defined on day one. Students first build a slow lookup system, hit the wall of its inefficiency, and then the hash table arrives as the answer to a question they already have.
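A minimal Python sketch of that arc (names and data invented for illustration, not taken from the course materials): the linear scan that hits the wall, and the dict (Python's built-in hash table) that answers it.

```python
def linear_lookup(records, name):
    """O(n) scan: every query walks the whole list."""
    for record_name, phone in records:
        if record_name == name:
            return phone
    return None

def build_index(records):
    """One O(n) pass builds a hash table; lookups are then O(1) expected."""
    return {name: phone for name, phone in records}

records = [("Ada", "555-0100"), ("Grace", "555-0101"), ("Alan", "555-0102")]
index = build_index(records)

print(linear_lookup(records, "Grace"))  # slow path: walks the list
print(index["Grace"])                   # fast path: hashes the key
```

The pedagogical point is the ordering: students feel the cost of `linear_lookup` on a large dataset before the dict is offered as a solution.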
Students implement every major data structure and algorithm from scratch before using the library version. This is not hazing — it is the difference between knowing that sklearn has a fit() method and understanding what fitting means. Ko’s read-before-write research shows that students who trace implementations before writing them develop significantly stronger debugging ability and conceptual transfer. Every abstraction has an interior; students earn the right to ignore it by first understanding it.
The social implications of algorithmic systems are not a lecture added at the end of the semester — they are woven into the technical content from the start. When students build a word frequency analyzer, they ask whose vocabulary the default dictionary centers. When they study PageRank, they ask what gets amplified and what gets buried. This is not politics inserted into CS. It is the recognition that every algorithm encodes the values of the people who built it — and understanding those values is part of understanding the algorithm.
Each course is a complete 18-week experience: a central project arc, three entry tracks for students at different levels, portfolio-based assessment, and a public exhibition at semester end. No exams. No required textbook purchases. All course materials free and open access.
Pathway diagram: CS courses read left to right · the engineering track branches at Math 2B / ENGR 11 · all courses are accessible at Track I with no prior programming
AI is not magic — it is math, history, and human choice. We build from first principles: probability, search, neural networks, language models. Then we ask who built these systems, for whom, and what they’re encoding about the world.
ML algorithms are not neutral mathematical facts — they are choices about what to optimize, whose data counts, who bears the error. We derive every algorithm before using it, implement from scratch before touching any library.
Every data structure is an argument about the world. Every algorithm is a strategy, a tradeoff, a value judgment. We implement every structure before using the library version. No LeetCode grind culture — deep projects that transfer.
PageRank. GPS. The phone in your pocket. Three technologies derived from first principles. You’ll understand why they work — not just that they work. Includes the signature project: Build a Computer from Scratch — a 20-week team build of an 8-bit breadboard computer bridging five STEM disciplines.
Six fundamental problems. Twenty-nine lessons. Build circuits, collect data, verify models. Derive before you compute — the ALAF approach. Math made useful the moment it’s introduced.
Model real circuits. Understand why 0.1 + 0.2 ≠ 0.3. MATLAB the way engineers use it — from binary representation up to IEEE 754 floating-point. No prior programming required.
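The course works in MATLAB; the same IEEE 754 behavior can be sketched in Python (the phenomenon is identical in any language using double-precision floats):

```python
import math
from decimal import Decimal

# 0.1 and 0.2 are each stored as the nearest representable binary
# fraction, so their sum misses 0.3 by one unit in the last place.
total = 0.1 + 0.2
print(total)          # 0.30000000000000004
print(total == 0.3)   # False

# Decimal reveals the exact binary value actually stored for 0.1:
print(Decimal(0.1))

# The engineering fix: compare within a tolerance, never with ==.
print(math.isclose(total, 0.3))   # True
```

Understanding why `==` fails here is exactly the binary-representation-to-IEEE-754 arc the course builds.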
Every course in this curriculum is taught at three simultaneous depth levels. These are not ability groups, remediation tiers, or ceiling categories — they are depth choices. The same concept is taught to everyone. What varies is whether a student builds it, proves it, or extends it. Track I is a complete, serious course outcome — not a consolation.
Students develop genuine working fluency with the core concept. They build a complete, functional implementation and can explain what it does and why. No prior experience required at Track I — by design.
Students implement the concept rigorously, handle edge cases, and extend it to a novel context. They move between their own implementation and a library version and explain the tradeoffs in writing.
Students engage with the formal mathematical structure of the concept, read related research, and contribute something original — a proof, an optimization, a novel application, or a written critical analysis.
These are not homework assignments with a twist. Each signature project is the central learning experience of its unit — designed so that students encounter the core concept as a genuine problem to solve, not a technique to apply. Every project is public: presented at a semester-end exhibition open to the campus community.
Teams of students build a working, programmable 8-bit computer on breadboards from logic gates — and in the process encounter the physics, linear algebra, differential equations, and chemistry that make it possible. Inspired by Ben Eater. Grounded in seven bodies of learning science research. Designed for cross-STEM transfer: every module includes an explicit “STEM Bridge Moment” where the instructor names the connection to the student’s next math, physics, or chemistry course.
Students build a functional contacts application — adding, searching, sorting — starting with a flat list. As the dataset grows, they measure slowdown empirically, identify the bottleneck, and redesign. The project runs four weeks, and students implement and profile at least three structural approaches before semester end.
Students implement linear regression, logistic regression, and a single-layer neural network entirely in NumPy — no sklearn allowed until the implementation is complete. The final submission includes both versions plus a written explanation of what the library conceals and why that matters.
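A hedged miniature of that exercise on synthetic data (every name and value below is invented for illustration): linear regression fit by gradient descent in plain NumPy, no sklearn.

```python
import numpy as np

# Synthetic data: y = X @ true_w + true_b plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w, true_b = np.array([2.0, -1.0, 0.5]), 0.3
y = X @ true_w + true_b + 0.01 * rng.normal(size=200)

# Gradient descent on mean squared error — the part a library conceals.
w, b, lr = np.zeros(3), 0.0, 0.1
for _ in range(500):
    err = X @ w + b - y              # residuals, shape (200,)
    w -= lr * (X.T @ err) / len(y)   # gradient of MSE w.r.t. w
    b -= lr * err.mean()             # gradient of MSE w.r.t. b

print(w.round(2), round(b, 2))  # recovers roughly true_w and true_b
```

Writing the update rule by hand is what makes "fitting" a mechanism rather than a method name.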
Students choose a real, deployed AI product — a hiring tool, content moderator, facial recognition API, or recommendation system — design a structured test protocol, run it, measure performance across demographic groups, and produce a findings report structured like a research paper. Directly adapted from Buolamwini’s Gender Shades methodology.
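The core measurement behind such an audit can be sketched in a few lines of Python. All data below is invented, and a real audit follows a structured protocol rather than a toy list; the point is only the shape of the computation: accuracy disaggregated by group.

```python
from collections import defaultdict

# Hypothetical audit results: (group, predicted_label, true_label)
results = [
    ("group_a", 1, 1), ("group_a", 1, 1), ("group_a", 0, 1), ("group_a", 1, 1),
    ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 1), ("group_b", 0, 1),
]

correct, total = defaultdict(int), defaultdict(int)
for group, pred, truth in results:
    total[group] += 1
    correct[group] += (pred == truth)

accuracy = {g: correct[g] / total[g] for g in total}
for group in sorted(accuracy):
    print(group, accuracy[group])
# A large gap between groups is the audit's headline finding.
```

Aggregate accuracy would hide exactly the disparity this disaggregation exposes — the central insight of the Gender Shades methodology.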
Students derive the PageRank algorithm from the random surfer model, implement it on a graph of their own construction, then run it on a real web crawl. They modify the damping factor and adjacency structure and observe how rankings change — which leads directly to the question of who decides what matters on the internet.
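As a rough illustration of what students build, a power-iteration sketch on an invented four-page graph (the link structure and damping value are illustrative, not course materials):

```python
import numpy as np

# Hypothetical 4-page web: key j lists the pages that page j links to.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n, d = 4, 0.85   # d is the damping factor from the random-surfer model

# Column-stochastic transition matrix: M[i, j] = P(surfer moves j -> i)
M = np.zeros((n, n))
for j, outs in links.items():
    for i in outs:
        M[i, j] = 1 / len(outs)

# Power iteration: r = d*M*r + (1-d)/n, repeated until it settles.
r = np.full(n, 1 / n)
for _ in range(100):
    r = d * (M @ r) + (1 - d) / n

print(r.round(3))  # page 2, which everything links to, ranks highest
```

Changing `d` or an entry of `links` and rerunning is precisely the experiment that surfaces the course's question: who decides what matters?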
Every capstone project must include a written explanation addressed to a non-technical family member: what does your program do, what problem does it solve, and why should someone care? If you cannot explain it to someone who doesn’t speak code, you don’t fully understand it yet. Communication is assessed alongside the code itself.
Every course ends with a public exhibition, not a final exam. Students present their portfolio of work to an audience of peers, instructors, and community members. Students write a self-evaluation against the course learning goals and propose their final grade with evidence from their portfolio.
Andrew Ng’s landmark Coursera specializations, rebuilt for community college students. Same rigorous concepts, three entry tracks per course, problem-first sequencing, and every project connected to a real community need. No exams. No required textbooks.
Original course content by Andrew Ng (Coursera / deeplearning.ai). These are community college curriculum adaptations — restructured for three-track delivery, portfolio assessment, and equity-centered project design. Not affiliated with or endorsed by deeplearning.ai.
Understanding AI is a civic skill. Three tracks from no-prereq literacy to building with APIs. Community audit projects. Portfolio-based.
Derive before you compute. Implement before you import. Audit before you deploy. Six projects on real community data including bias audits.
Six modules from neurons to transformers. Implement backpropagation from scratch. Mandatory bias audit module. Capstone serves a real community need.
Every image is a choice. Every output reflects training data. Use gen AI critically, build with it skillfully, and audit its biases rigorously.
These courses are not built on personal teaching preference. Each design decision traces to a body of research or a theoretical framework. The thinkers below are not name-dropped — their ideas show up in specific assignments, assessment structures, and sequencing decisions across every course in this curriculum.
This curriculum is developed by Henry Fan, a CS instructor at a California community college working at the intersection of computer science education, equitable pedagogy, and learning science research. The courses on this site represent a multi-year project to redesign introductory CS at the community college level — building courses that are technically rigorous, project-driven, and designed from the start to serve students who have historically been pushed out of CS.
The pedagogical foundation of this work is built on the mentorship and teaching philosophy of Jeff Anderson (Foothill College), whose commitment to antiracist learning science, ungrading, and the principle that every classroom decision should map back to research in cognitive science has shaped how every course on this site is designed, assessed, and taught. His five anti-racist, research-based, learner-centered learning objectives are the invisible architecture of this entire curriculum.
The curriculum is connected to ongoing research into help-seeking behavior in introductory CS courses, curriculum dependency structures, and the experiences of students who leave STEM at community colleges. That work lives in the CS Education Research Portfolio.
All course materials are free and open access. Instructors who want to adapt any of these courses for their own institutions are encouraged to do so. For questions, collaboration, or to share what you build with these materials, reach out directly.