MOOC Platforms, Surveillance, and Control
The institutional impact of MOOCs.
By Paul-Olivier Dehaye
In the mid-1980s, while she was a professor at the Harvard Business School, Shoshana Zuboff formulated three laws about the implications of information technology:
1. Everything that can be automated will be automated.
2. Everything that can be informated will be informated.
3. Every digital application that can be used for surveillance and control will be used for surveillance and control.
Informating is the process that translates descriptions and measurements of activities, events, and objects into information. Zuboff’s original analysis was applied to a variety of workplaces, but its scope has naturally expanded with the spread of information technology to the spaces where we live, shop, exercise, relax, heal, and, now, teach and learn.
The first public promise of broadcast-style massive open online courses, or MOOCs, has always been to reduce costs by automating the classroom experience as much as possible: large participation numbers are possible because content can be delivered online 24-7, to the great convenience of students, and because thousands of tests can be scored automatically, to the great convenience of professors. This automation has been critical in the evolution of MOOCs, and its impact on pedagogy has been much discussed. Less frequently discussed, however, are the informating, the surveillance, and the control entailed in MOOCs. Indeed, these online courses are constructed as part of a bigger assemblage, a MOOC platform, whose logic is secreted away in venture-capital funding rounds, term sheets, quarterly reports, partnership contracts, partner conferences, and support tickets.
In more recent work, Zuboff cleverly deconstructs the informating step by recycling observations of Hal Varian, chief economist at Google. Varian identifies four different benefits of computer-mediated transactions:
1. data collection and analysis
2. personalization and customization
3. constant monitoring enabling new forms of contracts
4. constant experimentation
These four facets frame the following analysis of the systemic and societal impacts of MOOCs.
Data Collection and Analysis
One of the operating strategies of web platforms is to represent symbolically as many events and interactions as possible. For teaching and learning platforms, this strategy renders subjective events into commodities, actively traded by professors, universities, and the platforms themselves.
The stated promise associated with data collection is to improve education and gain insights to inform our on-campus teaching. But this one-way extraction of data will unfortunately occur at the expense of a vast erosion and redistribution of privacy rights: in exchange for access to the data they contributed, eager universities have turned a blind eye to third-party uses of those data. They have also partly shielded themselves from liability by relying on start-ups that will simply nullify the privacy laws they don’t like. This is particularly problematic in Europe, where international agreements fail to ensure adequate protection for data transferred across the Atlantic but limit what can be exchanged between universities in the same country afterward.
The data collection has other goals as well: it allows platforms to gather new types of information about the online education market. Netflix can tell precisely when a TV series hooks its viewers, and MOOC platforms can say the same for individual courses and students. They use metrics that have been defined by computer engineers with little understanding of cognitive processes and no form of accountability.
MOOC platforms can track the topic searches of students to assess the demand for particular courses or topics. They use this knowledge in competitive bidding processes for new courses and specializations, sometimes offering loans to universities to develop the content, which gives the platform more control over the production process. Eager to get a first-mover advantage, universities often embark on these projects despite some initial reluctance. The bidding process is opaque and easily manipulated by the central player, the platform.
Personalization and Customization
Personalization and adaptive learning are often touted as a (future) advantage of MOOC education. They have drawbacks, however. The primary aim of MOOC platforms is to offer the best user experience, which, in heated academic debates, can sometimes result in censorship. When a professor ends up being the lone dissenter, his or her account can easily be suspended without explanation.
Even more troubling, this personalized experience sometimes evolves into more refined forms of censorship. Through the practice of “shadow banning,” some individuals’ forum posts are not delivered to others—without providing notice to the original posters that their comments are being blocked. This practice is fairly common in online communities but has no place in a classroom.
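The mechanics of a shadow ban are simple to sketch. The following is a minimal illustration, not any platform's actual code (all names and data structures are hypothetical): the banned author's posts are stored normally and remain visible to the author, but are silently dropped from every other reader's view.

```python
# Minimal sketch of "shadow banning": the banned user still sees their own
# posts, so they receive no signal that anyone else's view is filtered.

def visible_posts(posts, shadow_banned, viewer):
    """Return the forum posts that `viewer` is allowed to see."""
    return [
        p for p in posts
        if p["author"] == viewer           # authors always see their own posts
        or p["author"] not in shadow_banned
    ]

posts = [
    {"author": "alice", "text": "I disagree with the grading scheme."},
    {"author": "bob",   "text": "Week 3 videos are up."},
]
banned = {"alice"}

# Alice sees both posts, so she never learns she is banned...
assert [p["author"] for p in visible_posts(posts, banned, "alice")] == ["alice", "bob"]
# ...while every other student sees only Bob's post.
assert [p["author"] for p in visible_posts(posts, banned, "carol")] == ["bob"]
```

The asymmetry is the point: the filter operates at read time, per viewer, so the censored party has no way to detect it from their own session.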
In the context of an online course, such tactics can destroy creativity and breed a culture of compliance and anticipatory conformity among faculty members: professors are encouraged to fit their content into patterns that can then be delivered at scale by the platforms in ways that can be (paradoxically) personalized and monetized for each student. Professor comments are centralized into a partner portal, which only contributes to this groupthink through insidious self-censorship and vain self-presentation.
Constant Monitoring and New Forms of Contracts
Varian boasts that the constant monitoring of web platforms enables new forms of contracts: examples include online advertisers paid per click and Amazon authors paid per Kindle page-turn.
Already, MOOC providers reward content producers (universities and professors) through a precise assessment of the revenue they generate. Eventually, this system might be tied to actual hours of video watched or some other inaccurate proxy for the quality of the teaching done.
Varian also looks favorably upon the new online businesses that thrive thanks to new means of verification embedded in our devices. In some MOOCs, students are required to send pictures of their identification cards to certify their identities. In the more extreme verification systems, keystroke dynamics are also used to identify users. These mechanisms, which are outside of the control of universities, raise profound questions about consent and privacy that cannot be adequately resolved through “terms of service” agreements.
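Keystroke dynamics can be sketched in a few lines. Real systems use richer features (key dwell times, digraph latencies) and trained classifiers; the toy version below, with hypothetical function names and thresholds, only shows the shape of the idea: the rhythm of a typed phrase is compared against a profile enrolled earlier.

```python
# Hypothetical sketch of keystroke-dynamics verification: compare the
# inter-key timing profile of a typed phrase against an enrolled profile.

def timing_profile(timestamps):
    """Inter-key intervals (in ms) for one typing sample."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def matches(enrolled, sample, tolerance_ms=40):
    """Accept if every interval is within `tolerance_ms` of the enrolled one."""
    prof = timing_profile(sample)
    return len(prof) == len(enrolled) and all(
        abs(e - s) <= tolerance_ms for e, s in zip(enrolled, prof)
    )

enrolled = timing_profile([0, 120, 260, 390])     # intervals: 120, 140, 130
assert matches(enrolled, [0, 130, 275, 400])      # similar rhythm: accepted
assert not matches(enrolled, [0, 300, 340, 700])  # different rhythm: rejected
```

Note what such a system must retain to work at all: a behavioral fingerprint of each student, collected continuously during assessment, which is exactly the kind of data that outlives any single course.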
Perhaps most notable among the newly possible contracts are those that involve certifications and, ultimately, integration with the employment market. Currently, students exchange money for cryptographically signed certificates, but this system does not guarantee that certifications will be recognized. Some classes of certifications have already been retired as meaningless or technologically degraded. In the future, the introduction of digital rights management schemes might mean that students will never fully own their degrees or might be required to pay as technology is upgraded. In fact, after around one year of operations, Coursera started to anticipate such a monetization strategy by slipping an extra clause into its (leaked) partnership contracts—without providing any disclosure to students. Such practices are particularly worrisome in the context of today’s workplace: while optional “upskilling” in a stable work environment is desirable, in our fragmented labor market it seems more likely that individuals will be constantly expected to showcase their earned certificates in order to land their next gig.
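The signed-certificate mechanism can be sketched as follows. A real platform would use an asymmetric signature scheme (for example Ed25519), so that third parties can verify a certificate without holding the signing key; the standard-library HMAC below is a simplified symmetric stand-in to keep the example dependency-free, and all field names and the key are hypothetical.

```python
import hmac, hashlib, json

# Simplified sketch of issuing and verifying a signed course certificate.
# Production systems use asymmetric signatures; HMAC stands in here.

PLATFORM_KEY = b"demo-signing-key"   # held by the platform, not the student

def issue(record):
    payload = json.dumps(record, sort_keys=True).encode()
    sig = hmac.new(PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "signature": sig}

def verify(cert):
    payload = json.dumps(cert["record"], sort_keys=True).encode()
    expected = hmac.new(PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert["signature"])

cert = issue({"student": "alice", "course": "Algebra I", "grade": "pass"})
assert verify(cert)
cert["record"]["grade"] = "fail"     # any tampering breaks the signature
assert not verify(cert)
```

The sketch also shows where the power lies: whoever controls the signing key decides what verifies. Retiring a class of certificates, or demanding payment again after a format upgrade, is therefore a unilateral decision by the platform.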
Constant Experimentation
Universities are also participating in MOOCs for their own research purposes. Undoubtedly, the vast collection of data gathered through MOOCs will lead to many new quantitative studies, sometimes confirming prior smaller-scale or qualitative ones and almost always discovering statistically but not practically significant effects. The numbers will be staggering and unprecedented, but not necessarily informative from a pedagogical standpoint. In addition, a lot of the experimentation now revolves around increasing student retention. MOOC platforms seek, for instance, to optimize the scheduling and content of their periodic e-mail reminders. It is unclear what effect, if any, this nudging could have on students.
Universities are stepping into an ethical quagmire. Students are not informed of the legal and technical frameworks driving the experimental designs of courses, and scientific research conducted by academics will undoubtedly be confused with product research conducted by the MOOC platforms themselves. Some more active interventions will be especially questionable. Already, some MOOC platforms are experimenting with pricing discrimination, with the goal of maximizing revenue. This discrimination is currently based on geography, but could socioeconomic data collected through the voluntary “research” surveys that students take also be used to determine pricing? What about data submitted for financial aid applications? Either way, such tactics betray a profound disloyalty to students.
Experimentation can also be used as an opaque shield for activities that exploit universities and professors. In a world where a platform hosts thousands of courses, a recommendation engine is crucial to drive traffic. Just as Google can steer users to its own products, so can platforms favor courses that maximize their own returns.
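How a recommendation engine can quietly favor the platform's own returns is easy to illustrate. The sketch below is hypothetical (no platform publishes its ranking function): the score blends predicted relevance to the student with the platform's revenue share per enrollment, and nothing in the output reveals the bias term.

```python
# Hypothetical sketch of a course ranking tilted toward platform margin.

def rank(courses, platform_bias=0.0):
    """Sort courses by relevance, optionally tilted toward revenue share."""
    return sorted(
        courses,
        key=lambda c: c["relevance"] + platform_bias * c["revenue_share"],
        reverse=True,
    )

courses = [
    {"title": "Partner U: Statistics",  "relevance": 0.90, "revenue_share": 0.3},
    {"title": "In-house: Data Science", "relevance": 0.80, "revenue_share": 0.9},
]

# With no bias, the most relevant course wins...
assert rank(courses)[0]["title"] == "Partner U: Statistics"
# ...but a modest bias flips the ranking toward the platform's own margin.
assert rank(courses, platform_bias=0.3)[0]["title"] == "In-house: Data Science"
```

From the outside, both rankings look like plausible relevance orderings, which is precisely what makes the manipulation opaque to the universities supplying the content.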
The concerns, however, go much further. The logic of data accumulation, which Zuboff calls surveillance capitalism, naturally tends to favor dominant players, because they get exhaustive knowledge of the unregulated markets they create and completely subsume. The platforms have successfully played on existing university rankings and a misplaced sense of competition among universities to lure them in. Once they enter agreements with MOOC platforms, the Varian principles will take over, and universities will end up competing to the exclusive benefit of their host platforms.
Zuboff has more recently christened the end state of this logic Big Other: a ubiquitous, distributed, networked, and novel institutional regime of individual corporate actors, each dominating and commodifying a separate facet of global digital life. MOOCs constitute just one of those facets for many individuals worldwide, and it is certainly reasonable to expect consolidation in the industry. Already, a very few MOOC platforms dominate the higher education market, and it becomes harder and harder to imagine new players entering.
So, what happens when the dominant MOOC platform actually “wins” and enmeshes academia as part of Big Other? Will professors, reduced to mere content producers, all working under nondisclosure agreements, live in fear of being booted off the platform if they speak up, just like authors on Amazon? Will university administrators, in this pyramid of power and control, live in fear of damaging their privileged partnerships with a monopolistic player, just as they do now with academic publishers? Will this external player hasten the demise of shared governance in academia?
The asymmetries of knowledge and power between Internet giants and citizens are already large, and they seem to grow worse every month. We are rapidly institutionalizing new facts, igniting new network effects, and facilitating the influx of even more predatory capital.
As a mathematician who has seen mathematical models misused again and again—in the financial markets, for instance—I am inherently skeptical of big-data claims. I think, like many others, that machine learning helps perpetuate, legitimize, and cloak discrimination.
Should universities not be more careful in approaching MOOC platforms, for the benefit of their students? Should we not work against the centralization of our online presence? Should we not aim to preserve our independence from the Internet giants? In the end, we risk being collectively complicit in an unconditional intellectual surrender to venture capital–funded educational disruption—indeed, we risk directly contributing to the numbing effects of constant and indiscriminate surveillance. As professors, we should always insist that education remain emancipating and should resist the coercive logic of surveillance capitalism.
Paul-Olivier Dehaye is a former assistant professor of mathematics at the University of Zurich. His e-mail address is paulolivier.