Academics
25_Fall_261

December 4, 2025 | 9 minute read

This is my last class assignment for Social Computing, an Analytic Review/Critical Analysis that spans multiple papers (previously: one / two / three).

An analysis of three texts:

The Social Construction of Artefacts: A Response to Pinch and Bijker

By Stewart Russell
Published in: Social Studies of Science, Vol. 16, No. 2 (May, 1986), pp. 331-346
DOI: 10.1177/0306312786016002008

Do Artifacts Have Politics?

By Langdon Winner
Published in: Daedalus, Vol. 109, No. 1 (Winter, 1980), pp. 121-136
DOI: 10.4324/9781315259697-21

Marika Pfefferkorn: Oral Histories of Surveillance

By Marika Pfefferkorn, interviewed by Lexi Spencer-Notabartolo
Published in: Our Data Bodies, December 14, 2020

Science and technology researcher Stewart Russell, political theorist Langdon Winner, and community advocate Marika Pfefferkorn all offer ways to understand how technologies emerge, take shape, and exert force within social and political worlds. Each challenges the assumption that technologies are neutral tools, and each proposes that a technology cannot be understood without examining the sociopolitical context in which it came to exist, and the sociopolitical context in which it is deployed.

In The Social Construction of Artefacts: A Response to Pinch and Bijker (Stewart Russell, "The Social Construction of Artefacts: A Response to Pinch and Bijker," Social Studies of Science 16, no. 2, May 1986: 331–46), Stewart Russell argues that science and technology are fundamentally different, and that the social processes behind the two therefore cannot be viewed as unified. Instead of attempting to develop a single, uniform model that treats technology and science as one social process, he argues that we already have a model suitable for understanding technology: a Marxist form of social analysis.

Russell’s argument is a critique of how Pinch and Bijker see technology as closely intertwined with science (Trevor J. Pinch and Wiebe E. Bijker, "The Social Construction of Facts and Artefacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other," Social Studies of Science 14, no. 3, 1984: 399–441). This unified model, Russell explains, is deterministic in its focus on the use of technology, as if any given technology is inevitable. The model, he claims, ignores how a technology comes to be in the first place, which is entirely a social activity of selection, prioritization, inclusion, and exclusion. The whole process, and not just the outcome, must be examined to understand who decided a given technology should exist at all, and what was deemed less important, or not important at all, in making that decision.

Russell argues that technology is not neutral, and that to study it as if it were neutral introduces two flaws. First, an attempt at impartiality ignores the content of the technology itself, and that content is significant; second, technology has an asymmetrical impact on the social interests of different groups.

Russell then further asserts that a view of technological change must always consider those involved in the larger surround of a technology. He critiques Pinch and Bijker’s offhand reference to the "social group," which he views as lacking the richness needed to form a meaningful understanding of a technology’s impact and of those who influenced it: what social? What group? A vague gesture toward people does not account for a nuanced understanding of real impact on anyone, "be it a developer, adopter, operator, consumer, sufferer of side effects, or whatever" (Russell, "The Social Construction of Artefacts"). It is not enough to simply address the impact on these groups, either; Russell argues that an explanation of technology must show "not only what different social groups think about an artifact, but also what they are able to do about it" (Russell, "The Social Construction of Artefacts").

Russell concludes by suggesting that there is an existing model that serves well for exploring technological change within a culture. A Marxist paradigm, traditionally focused on the impact of introducing technology into a workplace, can be readily reappropriated to explore the impact of introducing technology into society at large. The elements of a traditional Marxist framework, such as classes, economic relations, power, and interests, can be used as evaluative lenses, as can the politics of the state (as well as the more localized interests of corporations). A Marxist framework is one that considers a system of forces often working in conflict with one another, and Russell argues that to understand a technology means understanding the systems in which it exists.

In Do Artifacts Have Politics?, Langdon Winner describes a way to approach the kind of understanding that Russell demands. He begins by offering a form of interpretation that examines the specific features of a technology or technologically rich artifact. He calls this an examination of "Technical Arrangements as Forms of Order" (Langdon Winner, "Do Artifacts Have Politics?" Daedalus 109, no. 1, Winter 1980: 121–36): the more flexible forms of technical capability require people to shape them, and those people likely have a political perspective and agenda. Winner offers the racist agenda of Robert Moses to describe how a technology of transport was arranged in such a way as to prevent black people from accessing public areas. Similarly, management at a manufacturing plant selected machinery purposefully to displace union workers. The technology itself might be seen as benign, but its selection and arrangement are purposeful, and in both examples they serve a political agenda rather than a utilitarian optimization.

Winner then provides a second form of interpretation, which can be used to examine "Inherently Political Technologies" (Winner, "Do Artifacts Have Politics?"); this is a way to consider and interpret technology and politics by asking whether some technologies are inherently political, and political primarily in support of an autocratic form of control. Plato, Winner explains, answered these questions, though he focused less on the issue of technology and more on the issue of democracy, deeming it "a practical necessity that a ship at sea have one captain, and an unquestioningly obedient crew." If the various technologies of sea voyage, logistics, product transportation, and so on are assembled into the broader "technology of sailing in a ship," that technology has an implicit need for a non-equitable model of leadership (Winner, "Do Artifacts Have Politics?"). Winner explains that Engels, too, recognized the social relations required in large-scale systems like factories and railways, and argued that simply on the basis of the complexity and scale of these technologies, some form of oversight-through-power is required for sustained operation. The most vivid example offered is the atomic bomb, which clearly could not be created, implemented, managed, or maintained if all participants involved were considered equal.

Winner discounts the idea, which he feels is commonly argued, that a technology itself (the literal and physical aspects of it, such as a blade or a tube) is infused with political meaning. He also discounts the simplicity of the common response to this claim: that the technology is irrelevant compared to the social and economic systems in which it is embedded. The first he views as "just plain wrong," while the second is problematic because it "suggests that technical things do not matter at all." Winner indicates that there are alternative perspectives on how technology and politics are intertwined, with technology understood as "all of modern practical artifice" and politics as "arrangements of power and authority" (Winner, "Do Artifacts Have Politics?").

It is overly simplistic to claim that technology is apolitical, or to claim that it is embedded with agenda and naturally leads to inequity. Instead, Winner concludes that a considered attempt to understand how technologies become non-neutral means looking at how a technology's features are arranged, and at how some technologies are, by necessity, political and argument-provoking. It is likely the intertwining of these two facets that marks the technological intrusions most likely to cause dissent and controversy.

While both Russell and Winner include examples in their work, they argue largely from a theoretical perspective. In Marika Pfefferkorn: Oral Histories of Surveillance, Marika Pfefferkorn, interviewed by Lexi Spencer-Notabartolo, offers a more intimate example of both of their theories come to life. Pfefferkorn’s example comes from her experience working in Minnesota to create the Coalition to Stop the Cradle to Prison Algorithm, an effort to eliminate predictive analytics and data tracking in lower socioeconomic, predominantly black and brown educational environments.

Pfefferkorn describes how a joint powers agreement was developed in Ramsey County that would merge data between the city of St. Paul and the St. Paul Public Schools. This agreement was made to create "an early warning system through predictive analytics to identify a risk factor for a student" (Lexi Spencer-Notabartolo, "Marika Pfefferkorn: Oral Histories of Surveillance," Our Data Bodies, December 14, 2020). Pfefferkorn calls this a "deficit-based approach," one that works towards correction rather than empowerment and growth; this particular approach, she argues, was one that would create a school-to-prison pipeline.

Pfefferkorn argues that it was not only the content of the agreement that was lacking; the entire process leading up to the agreement was flawed. First, it was based on a presumption that data and algorithmic predictive analytics were good and desired by the community; when questioned, leaders in support of the agreement pushed back on criticism by claiming critics simply did not support innovation. The agreement was also based on a presumption that any community involvement in the discussion of the technology was informed, and this was false: residents did not understand the technology, and even those supporting the technology did not really understand it, either. "We found out that the unanimous decision to approve this JPA, most of the folks that signed off on it did not understand what they were signing off on," Pfefferkorn explains; "they did not know what predictive analytics was" (Spencer-Notabartolo, "Marika Pfefferkorn").

Pragmatically, the agreement had no plan for ongoing funding. The initial efforts were provided for free by a technology provider, but when asked, the advocates for the agreement were unable to identify a budget to support the program after the trial.

Pfefferkorn calls out a strange irony in the experience: those in support of the program were often on the "same side" as the residents. They were Democrats, in theory advocates for equal rights and equal educational experiences. Yet they were so enamored with the promise of innovation and the magic of an "algorithm" that the agreement had inertia, and until the Cradle to Prison group raised awareness of the program's potential risks, they were unable to see beyond the promise of technology as empowerment.

Pfefferkorn ends by describing the ongoing political context in which discussions like this occur. It has become important to question what is behind an invitation to give a presentation or to engage in a discussion about technology, because technological “advancement” is wrapped up in competing interests and perspectives; even such an invitation comes with the burden of agenda.

These three texts—The Social Construction of Artefacts: A Response to Pinch and Bijker, Do Artifacts Have Politics?, and Marika Pfefferkorn: Oral Histories of Surveillance—all argue that technology is shaped by, or burdened with, political agenda. Russell establishes the need to carefully examine the social and political forces surrounding technological development. Winner demonstrates how these forces become embedded in artifacts and systems, shaping forms of authority and creating or reinforcing inequalities. Pfefferkorn shows what this looks like in practice. Taken together, Russell, Winner, and Pfefferkorn argue that technologies must always be examined with attention to those in power who deploy them, and to the political and power structures that the technologies, and those advocating for them, reinforce; most importantly, technologies should be evaluated by the real consequences they generate for the people who have to live with them.