What makes a college course popular or unpopular? I’ve long been interested in courses for non-science majors that satisfy “general education” requirements, their aim being to foster overall scientific literacy and to convey an understanding of topics that are important to society. I often teach such courses at the University of Oregon, for example a biophysics-for-non-scientists course and one on renewable energy. Last term I again taught The Physics of Energy and the Environment, a course for non-science majors that I’ve written about before (for example, this).
Here’s the enrollment in Physics of Energy and the Environment for the past 15 years. (See Methods for how I constructed the plot.) The datapoints with the circles are the terms in which I taught the course.

You’ll notice enormous fluctuations, with enrollment ranging from about 40 to 140. Last term’s enrollment was among the lowest. I wondered why.
Here’s enrollment data for The Physics of Light and Color, usually a popular course. Last term was particularly low: fewer than 50 students, when enrollment is usually over 150.

Are there “general education” Physics courses with more students, and in which enrollment last term was high? Yes: Essentials of Physics. Note the scale, 300 students last term:

These were the three general education Physics courses offered in Winter 2026. Even before the term started, I was paying attention to the enrollment, tensely checking to see if my course would cross the 20-student threshold to avoid cancellation. Here’s the graph, starting a week after enrollment opened:

300, by the way, is the maximum allowed for Essentials of Physics. The ceilings for Energy and the Environment and Light and Color were 76 and 218 respectively, indicated by dashed lines above.
What if we look at all Physics general education courses for the past 15 years?

There’s a spaghetti of lines, but it’s clear that something is unusual in recent terms.
What sets the Essentials of Physics course apart? Why is it so popular? The content is “Physics 101” for non-science-majors, i.e. not a particular theme of social or humanistic interest.
While you’re formulating a guess, I’ll note that I’ve often heard great things about the Physics of Light, Color, and Vision course.
Though I’m biased, I’ll note that students also seem fond of Physics of Energy and the Environment. I’ve had enthusiastic students tell me, sometimes even years later, that they like the course. Plus, it has a lot of real-world relevance, and we like to think our students care about this.
From this past term’s student evaluations:
“The relevance of this course content can’t be overstated. This course clearly connects to real world examples and helps explain world phenomenons.”
and
“He [i.e. me] also is very good at including active learning in his lectures by making students think first before directly stating answers.” (The relevance of this will be clear in a moment.)
I’ve posted all the student evaluations here, so you can verify that I’m not cherry-picking a few cheerful kids from an otherwise angry mob.
I have yet to hear praise of Essentials of Physics, though I haven’t specifically investigated. (We don’t have access to other courses’ evaluations.)
Modalities and the Ethics of Instruction
As you’ve likely guessed, what’s different about Essentials of Physics in Winter 2026 (and Winter 2025) is that it’s an online, asynchronous course. This means that there’s no in-person interaction; lectures are recorded. Most importantly, students submit all work online. In principle there could be proctored in-person exams at a testing center, but this doesn’t exist for this course, or for most UO online courses. The other two courses, Light … and Energy and the Environment, like nearly all of our other Physics courses, are in person.
The University of Oregon is a residential university whose public relations stress “live” interactions, student experiences, topical courses, and so on. University of Oregon students, therefore, are presumably not enrolling from far away, nor enrolling with the aim of taking classes in their pajamas. The interactions enabled by actually having a room full of students, especially when incorporating active learning methods that stimulate engagement and allow a back-and-forth of questions and answers, are effective ways to enhance learning. Plus, they’re fun.
Apparently all this does not diminish the appeal (or temptation?) to students of online courses.
Obviously, one can’t think about online courses in 2026 without thinking about artificial intelligence. (This has been true since at least 2024, though in 2024 one could perhaps be unaware of AI without being professionally negligent.) Even in high-level undergraduate classes, there is nothing one can assign that can’t be answered perfectly by AI; in a general education course, perfect AI-delivered answers are trivial to obtain. One consequence we are all seeing is the evaporation of any correlation between homework scores and (in-person) exam scores: the former are generally perfect, while the latter are increasingly bimodal, with a large fraction showing stunningly low levels of understanding.
The concern is not simply academic dishonesty, though addressing this is essential to avoiding the devaluation of higher education. Perhaps more sadly, we’re seeing students use AI as a crutch for their understanding. It’s easy to ask any modern LLM to answer and then explain a homework question, read that explanation, and think this substitutes for thinking about the question and constructing the solution oneself. Students then bypass the actual process of learning, and without meaningful assessments (like quizzes or exams), they delude themselves about their skills.
Is the immediate filling of the 300-student Essentials of Physics really a consequence of it being online? As an additional datapoint, note The Physics Behind the Internet in the graph above. Having hovered between about 20 and 100 students, it surged to 150 two years ago, and to 300 this term. What’s new about Physics Behind the Internet? Two years ago it became an online asynchronous course (ceiling 154 students in 2024, 300 now).
It is possible, I should add, to create a meaningful, rigorous asynchronous online course. As noted above, one can have human-proctored exams, though UO doesn’t have the capacity to do this for large courses. One can schedule online video chats for presentation and assessment (oral exams or quizzes); one of my colleagues in Biology does this — it is effective. This won’t scale to classes larger than 20 or maybe 30; certainly not 300.
It may seem that I’m calling all online courses pedagogical disasters. There are, as mentioned, ways to structure them well. (Doing so requires more work than an in-person course, I think!) And, of course, there are motivated and self-aware students who will learn very well from such courses, as they would from others. However, for a 300-person general education course with no independent assessment or validation, there’s no way to take the course seriously, or to be proud to offer it. We may as well just tell students to send a check in return for an “A”, and spare everyone 10 weeks of pretending. There would be considerable student demand for this, just as there is currently considerable demand for online asynchronous courses.
At a faculty meeting, I asked our department to stop permitting online assessments, which would effectively stop our teaching online asynchronous courses. There was some agreement and some concern with details, but not enough enthusiasm to move forward. I lacked the energy to push the issue vigorously enough, especially because there’s a structural problem with “unilaterally” taking such a step:
The resources of a department, such as my Physics department, are tied to the number of students it teaches. (This connection doesn’t need to exist, but it’s understandable: even more than most public universities, the University of Oregon depends on student tuition, so an administrative insistence that departments carry their weight makes sense.) My analysis above suggests that our online courses are siphoning students from our other general-education courses, so canceling the online offerings would send students back to in-person courses like Energy and the Environment, which I would argue would be an educational improvement. However, it would likely also send students to online courses in other departments. Should we hurt our own income, which helps us accomplish our many worthwhile goals, to uphold a general principle about educational validity? I’d argue yes, but I can see that this isn’t an obvious choice.
What we need to solve this dilemma is a university-wide policy about online education that is honest and forthright about what learning looks like in 2026, that considers actual teaching goals and student experiences, and that has teeth. So far, we lack such a policy. UO is not unique; this is a common problem.
On the plus side, my many conversations about AI and teaching with faculty at many institutions, and with students, show a universal agreement that online, un-proctored assessment is meaningless and that universities need to think clearly about what they’re doing. (Students, by the way, are some of the strongest voices against AI-enabled cheating and its facilitation by clueless professors and administrators.) At some point, this will have to translate into changes in how we run universities. The institutions that do this quickly and well may survive more easily than those that don’t.
Methods
Data on course enrollment over time at the University of Oregon isn’t readily available, at least for those of us without administrative superpowers. However, all our course schedules are available online, so it’s possible to get a web page for every course offered by a given department (like Physics) in a given term and save it as an HTML file. Reading this by eye is easy. Writing code to read the HTML is hard; the table structure isn’t simple. This is a completely uninteresting programming task and is, therefore, ideal for current AI tools! (Without this, I would not have bothered with the analysis.) I therefore downloaded the HTML files, asked Claude (Sonnet 4.6) to convert them all to more comprehensible CSVs, and then asked it to write code to extract information from the CSVs. I then read the code, made a few changes, and ran it. This works well.
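For the curious, the HTML-to-CSV step can be sketched in plain Python. The table layout and course numbers below are invented for illustration; the real schedule pages are considerably messier, which is exactly why I handed them to an LLM rather than writing the parser by hand.

```python
# Minimal sketch: pull table cells out of a schedule-like HTML page and
# write them to CSV. Uses only the standard library. The HTML fragment
# and course numbers here are hypothetical, not the actual UO pages.
import csv
import io
from html.parser import HTMLParser


class TableExtractor(HTMLParser):
    """Collect the text of each <td> cell, grouped by <tr> row."""

    def __init__(self):
        super().__init__()
        self.rows = []        # list of rows, each a list of cell strings
        self._row = None      # cells of the row currently being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())


# Hypothetical fragment in the spirit of a schedule page.
html = """
<table>
  <tr><td>PHYS 1xx</td><td>Energy and the Environment</td><td>42</td></tr>
  <tr><td>PHYS 1yy</td><td>Light and Color</td><td>48</td></tr>
</table>
"""

parser = TableExtractor()
parser.feed(html)

# Write the extracted rows as CSV (to a string here; a file in practice).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["course", "title", "enrolled"])
writer.writerows(parser.rows)
print(buf.getvalue())
```

A real version would loop over the saved HTML files and cope with nested tables and merged cells, but the shape of the task is the same: flatten markup into rows, then analyze the rows.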
I don’t use AI to write prose, and I’m witnessing the disastrous results of students offloading learning to AI, but writing routine and boring code is an ideal task for modern artificial intelligence. There’s a lot to think about with all these developments.
Today’s illustration…
I painted a whale to use in a public talk I gave in January. My wife noted that I’ve had two whale paintings on the blog before, in 2013!
— Raghuveer Parthasarathy, April 12, 2025



