
July 12, 2025 | 8 minute read
Investigating the Homogenization of Web Design: A Mixed-Methods Approach
by Sam Goree, Bardia Doosti, David J. Crandall, Norman Makoto Su
What I read
In this article, the authors conduct a rigorous and primarily computational analysis of the changes in color and block-style layout on websites over the last 17 years. They conclude that there has been a dramatic increase in consolidation and commoditization of visual design, and hypothesize that the change is due largely to the rise of mobile design and the creation of reusable templates and frameworks used to speed up development.
First, the authors indicate that there is more to a website than functionality and usability; aesthetics plays a dramatic role. They note that writers in the popular press have repeatedly asked why so many websites have started to look the same, and that question was the impetus for the study: have websites actually, objectively, started to look the same, and if so, why?
The authors then walk through a brief history of design research related to the web. They describe how looking historically at artifacts like websites is considered a valuable way of exploring cultural phenomena, but that preserving websites is technically difficult; online behavioral and technical trends, by contrast, have been easier to study historically. Several researchers have examined web aesthetics at particular moments in time, and more recent work has used convolutional neural networks to analyze design, which provides a more gestalt-driven analysis of a visual artifact.
Next, the authors explain their method. The main data used in the analysis is a set of static screenshots of web pages, along with their rendered source code. The images were gathered from the Internet Archive. They limited the dataset to the most popular sites, using a stock index of the largest publicly traded companies as a proxy for the sites most likely to be heavily trafficked, and supplemented it with Alexa rankings and Webby Award nominations. This produced over 227,000 images of over 10,000 websites. Screenshots were captured at 15-day increments for many of the sites, and at yearly increments for the supplementary sites. They used a trained model to extract rich data about the gathered sites, and then elected to focus on color and spatial layout as the basis for quantifying similarity. They supplemented this computational work with qualitative interviews with 11 web designers.
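The paper’s actual similarity metrics are more involved than I can do justice to here, but to make the color side concrete: a minimal sketch of the idea might reduce each screenshot to a coarse color histogram and compare two pages by histogram distance. The function names and binning choices below are mine, not the authors’:

```python
# A minimal sketch (mine, not the authors' pipeline): reduce each
# screenshot to a coarse RGB color histogram, then measure how far
# apart two pages are in color terms.
import numpy as np
from PIL import Image

def color_histogram(path, bins=8):
    """Coarse, normalized RGB histogram of a screenshot."""
    img = np.asarray(Image.open(path).convert("RGB"))
    hist, _ = np.histogramdd(
        img.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    hist = hist.flatten()
    return hist / hist.sum()

def color_distance(path_a, path_b):
    """L1 distance between two pages' color histograms (0.0 = identical)."""
    return float(np.abs(color_histogram(path_a) - color_histogram(path_b)).sum())
```

The paper’s color analysis is richer than this (and its layout measure is a separate thing entirely), but the shape of the computation is the same: turn each page into a vector, then compare vectors.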
The authors then describe their findings. The primary finding is that sites became less homogeneous from 2003 to 2007, and then dramatically more homogeneous afterward. This held for both color and layout similarity; layout distance (the amount of difference between sites) declined 44% from 2010 to 2019. The major conclusion is that “websites have homogenized since 2007, and that layout in particular has seen a significant decrease in diversity.” The analysis behind this conclusion involved upwards of 2 million comparisons, built on, among other things, a pairwise approach.
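To make the pairwise approach concrete: given a year’s worth of screenshots and a distance function like the color sketch above, the “diversity” of that year is roughly the average pairwise distance within it, and when that average falls, designs are converging. A toy version (again mine, not the paper’s):

```python
from itertools import combinations

def mean_pairwise_distance(paths, distance):
    """Average distance over all unordered pairs of screenshots;
    lower values mean the set is more visually homogeneous."""
    pairs = list(combinations(paths, 2))
    return sum(distance(a, b) for a, b in pairs) / len(pairs)

# Hypothetical usage, with the color_distance sketch above and
# invented per-year screenshot lists:
# diversity_2007 = mean_pairwise_distance(screenshots_2007, color_distance)
# diversity_2019 = mean_pairwise_distance(screenshots_2019, color_distance)
```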
They then point to historical technological changes that may have driven this shift, focusing on two periods: the first around the release of the iPhone, when mobile browsing, small-screen use, and responsive design took off; the second around 2013, with the introduction of content management templates and pre-packaged libraries and frameworks, followed by the consolidation of a small set of libraries (such as Bootstrap and jQuery UI) around 2017. They note that greater bandwidth enabled large-scale image backgrounds at that point, and that the old constraints of “web-safe colors” fell away as screens and browsers improved. It’s likely that SEO also drove these changes, as best practices for ranking higher emerged, and those practices often dictated what was displayed on the screen and the order in which it was displayed.
The authors then discuss the meaning and implication of their findings.
First, they indicate that “the use of a relatively small number of frameworks and libraries has expanded significantly, and that the use of similar libraries strongly correlates with visual similarity—suggesting that the rise of these tools may be contributing significantly to the homogenization that we observed.” They note that much of this growth was driven by the adoption of jQuery and Modernizr, both of which were bundled with Microsoft’s ASP.NET MVC framework.
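Since the authors also collected rendered source code, it’s easy to imagine how library use could be tied to individual pages. A crude detector (my toy illustration, not their method) might just scan the HTML for references to well-known libraries, then check whether pages that share libraries are also closer under a visual distance like the one sketched earlier:

```python
import re

# A few well-known libraries to look for (illustrative, not the
# authors' full list).
KNOWN_LIBRARIES = ("jquery", "modernizr", "bootstrap")

def detect_libraries(html):
    """Return the set of known libraries referenced by script/link URLs."""
    urls = re.findall(r'(?:src|href)=["\']([^"\']+)["\']', html, re.IGNORECASE)
    return {lib for url in urls for lib in KNOWN_LIBRARIES if lib in url.lower()}
```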
As they consider the future implications of the work, they note, first, that the changes may “limit the perceived repertoire of possible and legitimate designs that future website designers draw from, constraining the creativity and innovation of future websites.” The precedent set during this period will likely shape future design norms, as designers come to see these tools as the appropriate way to work. Additionally, they observe that designers once used “many informal, flexible representations of websites (e.g., sketches) for wireframing. These unconstrained, sometimes low-fidelity representations afford usual and unusual designs equally, limited only by the designer’s imagination.” They anticipate that this will change, because platforms like WordPress offer quicker on-screen layout.
They conclude that the challenge confronting the community of web designers and technologists is “to see whether we can or should create an online design landscape that is diverse—without giving up the advantages that maturity and standardization provide users and creators.”
What I learned and what I think
This is the first paper I’ve come across that has studied, in any way at all, the impact of systems on the creativity of designers working in a digital environment. It’s starting to be a little clearer why I am having such a hard time finding any existing work in this space: the technique they used was extraordinarily involved, and I’m reading into the article that it kept growing and growing in scope as they thought through their goal in more depth. For this type of study to happen, one would need an overlap of researchers who care about design, are technically capable of developing the various analysis methods, and have enough passion, time, money, and PhD students to work through this; clearly there is one such team, but I would doubt there are many more.
I’m wavering back and forth between a) a deep appreciation for the rigor and a strong belief in the findings, and b) real concerns about the study itself, primarily in how design was conceived and operationalized.
I grew up with the web, and experienced all of the trends they point out; my experiences (and probably my portfolio, if I still actually had any of it) mirror the data they gathered. Here’s my brief nostalgia. The beginning was literally the wild west, because the original doctype standards had so little in them; we pushed the browsers to do whatever they could, and it was all hacky and shitty. I remember building the entire barcraze site, in 1999, with images in tables. We did DHTML, and all of the vendor-specific hacks around it. CSS became a thing, along with all the same stupid hacks. Then mobile browsing, responsive design, and packaged libraries; I remember that discovering jQuery, and the UI library that came with it, blew my mind. And on and on, till now.
Their two major jump points, around mobile and frameworks, make sense.
Their hypothesized implications have played out exactly as they predicted. This work was from 2018 (published in 2021), and since then there’s no sketching on paper; the tool (Figma) constrains all of the work; Canva and Webflow and all of the other sites have templated everything; and it’s all become normalized, except for all of us old people saying “get off our lawn.”
My problem with the study is probably a result of trying to build an experiment, any experiment, to study design. I appreciate that this wasn’t actually as positivist as many of the other “let’s measure creativity” papers, and I’m pretty sure the reductive definition of design was a pure necessity of executing a manageable study. But the work reduces design to color and layout, and layout is really, really roughly considered. Of course, visual design is these things, and it isn’t these things, because it’s all of the things. And I’m willing to bet that if I asked the authors (maybe I will) how they feel about focusing on these two data points, they would recognize the limitations. I also have no better suggestion for how to run a rigorous automated analysis in a more cohesive way. But the eleven qualitative interviews, which leaned toward technical history, probably could have been broader in scope, aimed more at designers than at what looks like technologists, and much more concerned with a broad evaluation of the whole over the parts.
Also, I love that this paper exists, and that CHI actually thought it was worth publishing. Maybe that’s a sign of how the conference has changed since I last actually leaned into it, or maybe this was embraced because of its use of trained neural nets and the automated nature of the analysis. Either way, there it is, which is a great win for design in a traditionally engineering-only culture.
Download Investigating the Homogenization of Web Design: A Mixed-Methods Approach, by Sam Goree, Bardia Doosti, David J. Crandall, Norman Makoto Su. If you are the author or publisher and don't want your paper shared, please contact me and I will remove it.