Paper Summaries
25_Fall_261

November 12, 2025 | 3 minute read

Maintaining the reversibility of foldings: Making the ethics (politics) of information technology visible

by Lucas D. Introna

Text Exploration

In this text, the author argues that the common view of technology positions it as a technical means to achieving social ends, and this is both wrong and dangerous. Instead, he states that there is a morality embedded in technology, and that the creators of technology must disclose the morality of their creations.

To make this argument, the author begins by describing the common view of technology as a thing outside of people: something separate that can be applied to help achieve goals or meet social ends. Technology, it is claimed, is designed as a tool, and technological products are the outcome of a rational approach to problem solving. Technology is neutral; it is the people using the technology who produce good or bad outcomes. This view “tends to allocate all agency to human beings.”

The author argues that the premise of a rational, neutral creation of technology is flawed, because technologists and designers make choices as they create new products—and those choices are driven by various moral and ethical stances, which are embedded in the outcome. As a result, the social and technical are intertwined from the start. They are folded together intimately, and this “treats technology as material culture that is not neutral but the very condition of our way of being.” The creation of technology is political and argumentative, and the author argues that we should analyze and critique it in the same way as we might with any other political agenda.

Examining technology to understand the ethical stances built into it is difficult, because technology is “opaque.” Experiencing or using the technology does not make clear what is happening (as in the case of facial recognition built into security cameras), and so it cannot be reasonably judged. Code itself is ambiguous and hard to understand. The author describes this as the “silent” nature of technology. Two case studies are used to show this opaque, silent folding of political perspective and technology; one examines search engines, while the other discusses anti-plagiarism software in schools.

Search engines, the author explains, include, exclude, and prioritize content by design. Users don’t know how this works (and arguably, neither do researchers, since the prioritization is proprietary), and the very act of using the tool reinforces its bias: results that get selected are treated as more popular, which pushes them even further toward the top of future result lists. “Increasingly,” the author explains, “we have a web for the majority at the expense of the minority.”
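To make the feedback loop concrete, here is a minimal toy simulation of the rich-get-richer dynamic the author describes. It is only an illustrative sketch under assumed parameters (a fixed click-decay rate, equal starting popularity); it is not how any real search engine ranks results.

```python
import random

random.seed(0)

NUM_RESULTS = 10
NUM_QUERIES = 5000

# Every result starts with the same popularity score.
popularity = [1.0] * NUM_RESULTS

def click_probability(rank):
    """Users overwhelmingly click near the top of the list (assumed decay)."""
    return 0.6 ** rank

for _ in range(NUM_QUERIES):
    # Rank results by current popularity (descending).
    ranking = sorted(range(NUM_RESULTS), key=lambda r: popularity[r], reverse=True)
    for rank, result in enumerate(ranking):
        if random.random() < click_probability(rank):
            # Each click feeds back into the popularity used for the next ranking.
            popularity[result] += 1.0

# After many queries, a handful of results dominate the top of the list.
for result in sorted(range(NUM_RESULTS), key=lambda r: popularity[r], reverse=True):
    print(f"result {result}: popularity {popularity[result]:.0f}")
```

Running this, the results that happened to sit near the top early on accumulate nearly all of the clicks, while the rest stagnate: a small-scale version of the “web for the majority at the expense of the minority.”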

Plagiarism detection systems used in schools are claimed to add objectivity and fairness to grading, so that teachers identify plagiarism more consistently. The author disagrees, arguing that the tool instead targets non-native speakers disproportionately: they write in a different way, one that is more likely to trigger false positives. The author suggests that not only is the tool biased, but the teachers using it are unaware that it may be functioning in ways they do not intend.
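The sketch below shows how this kind of false positive can arise with a naive text-overlap heuristic: flag a submission when too many of its word trigrams also appear in a reference corpus. This is purely an assumed, simplified stand-in; it is not how any commercial plagiarism detector actually works, and the threshold is arbitrary.

```python
def trigrams(text):
    """Return the set of consecutive three-word sequences in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def overlap_ratio(submission, corpus_texts):
    """Fraction of the submission's trigrams that also appear in the corpus."""
    sub = trigrams(submission)
    if not sub:
        return 0.0
    corpus = set().union(*(trigrams(t) for t in corpus_texts))
    return len(sub & corpus) / len(sub)

corpus = [
    "it is widely known that the internet has changed modern society",
    "in conclusion the results show that the method is effective",
]

# A writer leaning on common stock phrases can look "similar" to the
# corpus even though nothing was copied.
submission = "it is widely known that social media has changed modern life"

score = overlap_ratio(submission, corpus)
print(f"overlap: {score:.0%}")
if score > 0.3:  # arbitrary threshold, purely illustrative
    print("flagged as possible plagiarism (a false positive here)")
```

Formulaic but original prose, of the kind often produced by writers working in a second language, shares many stock phrases with existing texts and so scores high on this kind of overlap measure without any copying having occurred.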

To combat the negative impact these technologies may have on users, the author argues that their creators “have a moral obligation to disclose [the politics and ethics] on an ongoing basis.” The author names this transparent design, which calls for making a technology’s operation clear to “ordinary uninformed users” so they can understand the intentions, functions, and details of how the product is impacting their lives. As an example, the author states that a camera that gives the user a great deal of control over shutter speed, aperture settings, and so on is a better camera than a disposable one; the disposable camera is convenient, but “constitutes you as ignorant.”

The author concludes that we must moralize technology, and that to do this it must be made transparent and kept open to ongoing review, so that its foldings can be unfolded.