California’s proposed AI professor ban — Senate Bill 928, introduced to the state legislature in 2026 — would require that all instructors at California State University be human beings, making California the first U.S. state to explicitly prohibit AI systems from serving as university professors. The bill would protect CSU employees from what its sponsor describes as "the encroachment of artificial intelligence" into faculty roles, and it answers a question most institutions have been content to leave unasked.
What California’s AI Professor Ban Would Actually Prohibit
The legislation would amend the California Education Code to specify that academic instruction within the CSU system must be delivered by human instructors. It would not ban AI as a classroom supplement — students and professors could continue using AI tools — but it draws a legal line between AI as teaching assistant and AI as instructor of record.
That distinction matters because it is already being blurred. Some fully online courses route students through AI-driven curriculum platforms with a human instructor formally listed but minimally involved. SB 928 would close that gap by statute, at least within CSU’s 23 campuses serving approximately 460,000 students.
The California Faculty Association (CFA), representing roughly 29,000 CSU faculty members, has advocated for explicit protections as AI adoption in higher education has accelerated. This bill is, in large part, their ask formalized in legislation.
Why Professors Think This Is Necessary Now
Budget pressure is the engine behind the urgency. CSU requested approximately $825 million in state general fund support for fiscal year 2025-26; the Governor’s January budget offered substantially less, leaving campuses under pressure to find efficiencies. AI-delivered instruction at a fraction of faculty cost is an obvious calculation administrators would be negligent not to model.
Several universities have already run the experiment. Georgia Tech deployed an AI teaching assistant named "Jill Watson" — built on IBM Watson — that answered more than 10,000 student questions in a single semester before students realized they were talking to software. Arizona State University has deployed AI tutors across introductory courses. Carnegie Learning’s MATHia platform handles adaptive math instruction through AI at institutions nationwide.
None of these deployments has formally replaced a human instructor of record. The trajectory is clear regardless.
The AI Tools Already Operating Inside CSU
AI is not hypothetical in California classrooms. Khan Academy’s Khanmigo AI tutor has been piloted at multiple California institutions. AI-powered grading, proctoring, and writing assistance tools are standard across many CSU courses. Some ed-tech vendors have moved to AI-generated video lectures delivered through AI avatar platforms capable of simulating instructor presence at scale with no human involvement in delivery.
The line SB 928 draws is not between AI and no AI. It is between AI-as-tool and AI-as-teacher. A professor using an AI grading assistant to evaluate 200 essays remains the instructor of record. A platform that delivers lectures, assesses students, and provides feedback without meaningful human involvement would not qualify under the bill.
For ed-tech companies building AI-delivered curricula, this is a structural constraint. Platforms built as instructor replacements would be locked out of CSU contracts; platforms structured as faculty-augmentation tools would remain welcome. That distinction will reshape how vendors pitch California institutions going forward.
California’s Three AI Bills: Toys, Therapy, and Teaching
SB 928 sits alongside two other California AI bills filed this session that reveal a consistent legislative philosophy about where AI does and does not belong in human development.
SB 867 would ban AI companion chatbots embedded in children’s toys — a direct response to products that build ongoing emotional relationships with children through AI-driven conversation. The concern: children cannot meaningfully consent to an AI relationship and may not understand they are talking to software, creating psychological dependencies with no human accountability on the other end.
SB 903 would regulate AI transcription and documentation tools in mental health therapy settings. It does not ban AI from the therapy room but establishes consent and disclosure requirements, acknowledging that clinical environments involve sensitive data and power dynamics that unsupervised AI deployment handles poorly.
Three bills, three domains: education, childhood emotional development, mental health therapy. California legislators are drawing lines around spaces they consider too consequential to leave to market-driven AI adoption. The market, left alone, had not drawn those lines itself.
The Anti-Innovation Objection
The counterargument deserves a direct statement: AI instruction, done well, can deliver personalized, patient, always-available teaching that human instructors at scale cannot match. Duolingo’s AI-driven language learning platform has reached over 110 million monthly active users, according to the company’s 2024 earnings reports. Khan Academy has demonstrated measurable learning gains from AI tutoring in underserved school districts where qualified teachers are scarce.
If an AI system demonstrably improves learning outcomes for CSU’s 460,000 students — many from lower-income backgrounds who benefit most from scalable, flexible access — and SB 928 prevents that deployment, the bill’s cost is real and human. The pattern of AI entering domains once considered categorically human suggests that treating human instruction as inherently superior will require ongoing defense with evidence, not just principle.
That said, the bill is not banning AI from education. It is banning AI from replacing the person responsible for it. That is a narrower intervention than critics tend to describe.
The Humans First Reflex in Higher Education
The bill aligns with a broader cultural and political response to AI labor displacement. The Humans First movement, which has gained traction across multiple labor sectors, argues that certain roles carry human value independent of efficiency — that being taught by a person who can be challenged, questioned, and held accountable is categorically different from interacting with software, regardless of output quality.
That argument has particular force in higher education. Universities are not purely credential factories. Intellectual mentorship, debate, and social formation happen between people. Whether AI can replicate those outcomes is genuinely contested. Whether it should be given the chance in exchange for cost savings is a political question, and California has given its answer — at least for CSU, at least for now.
MegaOne AI tracks 139+ AI tools across 17 categories, and education platforms have shown some of the highest rates of new entrants over the past 18 months. Legislative response was, in retrospect, predictable.
What Happens Next
SB 928 faces committee hearings in spring 2026. Ed-tech industry groups will push to narrow the definition of "instructor" — vendors whose products blur the line between tool and teacher have the most to lose. Expect amendment pressure to limit scope to fully automated delivery systems and exempt AI tutoring embedded in human-led courses.
If the bill passes in current form, it sets a precedent every state legislature with active AI-in-education debates will examine. California houses the largest public university system in the country, and its policy decisions export. SB 867 and SB 903 signal this is a legislative session with a coherent theory about AI’s appropriate role in human development — not a series of unrelated reactions.
Watch the CSU administration’s committee testimony. If officials confirm no plans existed to replace instructors with AI, the bill is preemptive insurance. If they resist or qualify their answers, the concern is operational and SB 928 is the narrowest possible response to a problem already in motion.
Either way, California has decided that "professor" requires a pulse — and that leaving the alternative to market forces is not a risk the state is willing to take.