This is (yet another) reflection about the ways that ‘Web 2.0’, social media, and the mass adoption of smartphones have changed society. But first, here is the introductory, prerequisite stanza: I began blogging in 2004. MySpace was one year old. Facebook was exclusively for Harvard students. I had a cool PalmPilot and used Windows XP.
For a moment, please indulge my nostalgia. We didn’t think of the ‘tweetability’ of our headlines back then. We did not worry about whether our titles struck the balance between opening a ‘curiosity gap’ and being denounced as ‘clickbait’. We did not feel obliged to find eye-catching ‘featured images’ for our blog posts. The ambition for likes, shares, and retweets did not exist — simply because likes, shares, and retweets had not been invented yet. Words like ‘viral’ and ‘trending’ had not yet even become associated with digital content.
Back in 2004, the only metrics available were page hits (we called our visitors ‘guests’) and perhaps newsletter subscribers. (RSS analytics had only just been invented.) Primarily, your readership grew only if the quality of your writing convinced your ‘guests’ to bookmark your site and return on a regular basis. ‘Networking’ was accomplished when blogs linked and referred to one another, as well as by commenters who highlighted parallel or opposed theses elsewhere.
Fundamentally, the same ingredients remain today: there are still writers and readers. However, a number of new incentives — now baked into the workflow of the social/searchable web — have added significant layers between writers and readers. To varying degrees, these incentives change the substance and style of language itself. (Insert obligatory remark here about the influence of the telegraph and printing press on the evolution of language — and perhaps a shoutout to Plato’s dread of writing itself as the demise of memory — with an appreciative nod to the cyclical nature of all such observations.)
The first layer of incentives lurking in the subtext of these words is the social algorithm. You see, I desperately want you to ‘Like’ or star this post: every time you click on that heart or ‘thumbs up’ icon, you are signalling how likely it is that other people will click on it too. This kind of predictive work is precisely the purpose of social algorithms, and as far as the owner of the algorithm is concerned, your clicks are invaluable data points for discerning what to put in other people’s newsfeeds.
However, this gets a little messy. As a writer, my goal is to share ideas with people. Like most writers, I want the biggest return on my investment of time: that is, I want to share my ideas with as many people as possible. Your supportive click on the social algorithm is the principal mechanism for getting these words in front of more eyeballs. Now that I know your response to this article determines the probability of others reading it, consider the incentive I have for getting you to click on that little icon…
Every second you read these words is a battle. I, the author, am competing with petabytes of other eye-catching, interest-grabbing bids for your attention. You will abandon this piece the moment you suspect that the return no longer justifies the opportunity cost of reading it. Therefore, I really need you to click that Like button as soon as possible. Consequently, I am highly incentivized to appeal to your likely beliefs and presuppositions. If you are going to support my covert algorithmic agenda here, I am predisposed to reinforce your current ideas, or at least to make you feel psychologically sophisticated for your penchant for robust analysis. Either way, same end goal: you darn well better ‘Like’ this, and the sooner you click that shiny little button the better.
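For the technically curious, the feedback loop described above can be sketched in a few lines of purely illustrative Python. Every function name, weight, and threshold here is invented for the sake of the example; no real platform’s algorithm is implied:

```python
# Toy sketch of the loop: early 'Likes' raise a post's predicted
# engagement, which raises its feed rank, which exposes it to more
# readers — who supply more 'Likes'. All numbers are arbitrary.

def predicted_engagement(likes: int, impressions: int, prior: float = 0.05) -> float:
    """Estimate the probability that the next reader clicks 'Like'.

    A smoothed click-through rate: with few impressions the prior
    dominates; with many impressions the observed rate takes over.
    """
    smoothing = 20  # pseudo-impressions granted to the prior (arbitrary)
    return (likes + prior * smoothing) / (impressions + smoothing)

def feed_rank(posts: list[dict]) -> list[dict]:
    """Order posts by predicted engagement, highest first."""
    return sorted(
        posts,
        key=lambda p: predicted_engagement(p["likes"], p["impressions"]),
        reverse=True,
    )

posts = [
    {"title": "Essay on algorithms", "likes": 3, "impressions": 40},
    {"title": "Cat video", "likes": 90, "impressions": 1000},
    {"title": "New post", "likes": 0, "impressions": 2},
]
ranking = feed_rank(posts)  # the cat video wins the top of the feed
```

The point of the sketch is the incentive it exposes: because early clicks feed straight back into rank, the writer’s rational move is to chase your ‘Like’ as early in the piece as possible.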
Concurrently, while I am busily appealing to your subconscious to give the social algorithm a pretty gold star on my behalf, there are other algorithms I am seeking to appease at the same time: namely, search engines. We all know that one of the greatest utilities of the internet rests in its searchability. Most inquiries begin with questions. Therefore, I am not only writing to you, the reader, I am also writing to the algorithm that is going to rank where this article comes up in future searches. Strange, isn’t it? In order to write for human readers — or, I should say, in order to find human readers — I first feel compelled to write for a computational analysis by some software.
Will this article win the search engine game? Probably not. For an experiment, I submitted all the paragraphs above to a search engine optimization (SEO) tool. So far, this article fails most SEO tests miserably: I apparently do not use the article keywords in the right places. The readability score is low. There are no subheadings or sections. The title is too short. The meta description does not score well against the body text. There’s a whole list of changes I should make to give this article a better chance at being ranked higher by search engines.
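To make the checklist above concrete, here is a toy sketch of the kinds of tests such a tool runs: keyword placement, title length, subheadings, and a crude readability proxy. The rules and thresholds are invented for illustration and do not reproduce any real SEO tool’s scoring:

```python
# Illustrative-only SEO checks, loosely mirroring the failures listed
# above. Thresholds are arbitrary; no real tool's logic is claimed.

def seo_report(title: str, body: str, keyword: str) -> dict[str, bool]:
    paragraphs = [p for p in body.split("\n\n") if p.strip()]
    words = body.split()
    sentences = [s for s in body.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    avg_sentence_len = len(words) / max(len(sentences), 1)
    return {
        "keyword_in_title": keyword.lower() in title.lower(),
        "keyword_in_first_paragraph": bool(paragraphs)
        and keyword.lower() in paragraphs[0].lower(),
        "title_long_enough": len(title) >= 30,        # arbitrary threshold
        "has_subheadings": "## " in body,             # assumes Markdown headings
        "readable_sentences": avg_sentence_len <= 20, # crude readability proxy
    }

report = seo_report(
    title="Algorithms",
    body="Writers now write for algorithms first and readers second.",
    keyword="algorithms",
)
```

Run against this very article, a checker like this would flag exactly the complaints above: a too-short title, no subheadings, and keywords missing from the ‘right’ places.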
As someone interested in history and writing, I find the algorithm-induced pressures of writing in today’s world fascinating. No matter what topic is addressed, algorithms have turned the internet into a giant race for ‘Likes’ and search rank. The armchair economist in me is simply left to wonder how this will go on to impact writing itself, both stylistically and linguistically.
Writers, remember: the more we play the algorithmic game, the more the algorithmic game plays us. (All hail the great Algorithm in the Cloud!)
Algorithm-oriented content is becoming ubiquitous. No matter what you read or what topics you search for, a growing percentage of online material is designed from the ground up for the acquisition of ‘Likes’ and the courtship of search engines. Food, politics, current affairs, cats, academia — everything. Whatever you are interested in, there is an army of people writing about it with strategic intent to leverage the algorithmic landscape for their advantage.
I do not want to argue that algorithms themselves are evil. All I am saying is that they create a set of incentives that can substantively influence the work of a writer. This is nothing new. Seeking to win the approval of an editor or publisher equally incentivizes certain considerations of form and format. The point is not that writers should — or ever could — work in a vacuum. The historical lesson is simply that it is critical to stay cognizant and vigilant about how such pressures frame our ideas in the first place.
‘Content is king’, says the old maxim. But when the crown is won by an algorithm, to what extent can we say that the competition for the throne was meritocratic? Algorithms do not merely mediate what we see and read online — they arbitrate it, they select it. Therefore, one of the central tenets of digital literacy must be realizing that algorithms are not neutral: they have implicit agendas, functions, and benefactors.
As much as we might like to think that ‘content is king’, increasingly we are living under the reign of the algorithm. We should not fear algorithms: they are simply computer programs operated by the human beings who set their rules. But we should damn well get smart about them and start studying, with a detached, critical eye, the deeper impacts they have on our social discourse.
Inasmuch as we appreciate the inherent biases of publishers, editors, and media corporations, we must equally acknowledge the inherent agendas of those who determine the substance of our newsfeeds.