Writing versus Posting?

Surely writing has been in flux and evolution since our earliest etchings, and the advent of the internet has only ushered in yet another transformative epoch to the practice. So how does the post-to-share structure of social media change the act of stringing words together? I am wondering: is there a difference between ‘posting’ and ‘writing’ online? Obviously posting text involves writing, but how does the broadcast-this-now proposition of the internet shape the act of writing itself?

Here’s a question to frame the proposition: are you writing or posting?

The distinction may not be as subtle as it seems. Or maybe I’m splitting hairs. Either way, since leaving social media I find myself thinking much more about the act of writing as something distinguishable from the act of posting. I am ‘publishing’ here on my blog, yes, but this intuitively feels much different than submitting these words to Facebook or Twitter to distribute on my behalf.

Perhaps the difference between posting and writing is this: when you post something to Facebook, you inherently hope to find an audience; you wish the algorithm and potential recipients to ‘engage’ with the creation. By contrast, when you write a book or a blog, you write for readers — people who have already made some intentional decision to interact with you and your ideas.

Posting words with the intent to find an audience for them versus writing something for an audience are two distinct activities, I hypothesize. Granted, maybe the difference is all in my head. What do you think? Would you describe a difference between posting and writing?

I am not convinced

I am not convinced that ‘online communities’ will be defined as ‘communities’ indefinitely: it is quite possible some future generation might rebel against pixel-based approximations of human interaction as the sham of their parents’ age.

Is social media robbing us of our dearest hopes and dreams in life?

James Williams’ talk, Are Digital Technologies Making Politics Impossible?, is a must-listen. Riffing on the story of Diogenes and Alexander (with an interpretive lens drawn from Peter Sloterdijk) and Herbert Simon’s definition of an attention economy, Williams posits that there is a massive discrepancy between the design of digital technology and what we, the users, genuinely want for our lives.

What does technology want? It wants more clicks, more time on site, higher conversion rates, etc. It wants your attention — as much of it as it can take. And it wants to hold your attention for as long as it can. Your attention is the prize that Facebook wants to win. And keep.

What do we want? Well, presumably our dearest hopes and dreams for our lives go far beyond spending another 20 minutes on Facebook, Twitter, Pinterest, Instagram, etc. When you ask most of us what we want, we talk about more time with family, causes we care about, books we want to read (or write), traveling, adventures, experiences, personal achievements, and so on. We don’t tend to define another 20 minutes on social media as a step towards our ideal lives. But we check our phones again, anyway. Hence the disconnect: technology is designed to hook us into behaviours and activities we don’t want. Williams suggests that it is time to consider the very real possibility that the “industrial-scale persuasion” complex baiting you into your newsfeed might not actually have your personal, human hopes and ambitions in mind.

What if every minute with your newsfeed takes a minute away from what you really want?

Williams says we need to think beyond the daily bait and switch that leads us to ‘accidentally’ getting sucked into Facebook for another 20 minutes. What is ultimately at stake here? Digital media distracts from our personal goals and pursuits in life. 20 minutes at a time, this industrial persuasion apparatus steals attention away from us — attention that might otherwise be invested in activities we feel truly matter. Ergo, the real crisis here is not that we just lost 20 minutes to some mindless activity: the issue is that technology is intentionally displacing us from our own lives with “epistemological distractions” that divert us from the goals and activities we sincerely and deeply care about.

So, should we blame the end user? Williams, a former Google employee himself, disagrees. Every day, millions of dollars and the brightest minds in the world are invested in figuring out more effective ways to circumvent our will-power. Surely the answer to this dilemma is not, “Just have more will-power!” These infinitely scrolling newsfeeds are designed to drug us into submission — like “informational slot machines” whose sole purpose is sticking more ads in front of us along the way. Another 20 minutes. We are not only distracted from what we want in life by the promise of another dopamine hit, but we are also distracted from recognizing the opportunity cost of the addiction itself.

As far as Williams is concerned, we need to see “technology design as the ground of first political struggle” moving ahead. The present ‘solutions’ offered by digital technology are not working for us, but corporate interests have successfully usurped our imaginations when it comes to what can happen in 20 minutes of conscious existence on the planet.

Williams, therefore, declares that it is time for collective action to “assert and defend our freedom of attention.”

In an attention economy, your freedom of attention is your freedom. If you do not have the freedom to focus your mind on the things you truly care about, do you really have any freedom at all?

(Interested in getting together with some real live human beings — in an actual room — to critically analyze the prevalence, ubiquity, and power of social media in our lives? Come to Should We Quit Social Media? on Monday, October 16, 2017, 7pm at Central Library, to join the conversation.)

Post-Truth. Alternative Facts. Fake News. Blimey!

To what extent did ‘truth’ and ‘fact’ ever exist in politics and broadcast media before? How do the algorithms of social media fit into an evolving definition of propaganda today? Is society more ideologically ‘polarized’ than it has been in the past — and what would be the benchmark to measure this? How can accusations of practicing ‘post-truth politics’ and broadcasting ‘fake news’ be abused as politically rhetorical devices in their own right?

It boils down to a timeless question: what is truth and why does it matter?

Tim Blackmore is a Professor in the Faculty of Information & Media Studies at Western University. He has researched and written at length about war, war technology, propaganda and popular culture. His book, War X, focuses on the way humans understand the world of industrial warfare. Tim is especially interested in understanding how we use images and media to make war look attractive to ourselves as societies.

Related Blog Posts

On the Folly of Personal Analytics

Personal analytics

Remember how early personal web pages often included a ‘guest book’ for visitors to sign? Remember how bloggers used to place little ‘hit counters’ at the bottom of their sites? In one form or another, web analytics have been around for about as long as the internet itself.

For as long as we have been posting digital content for the world to see, we have also wondered, “How many people are actually seeing this?”

Today, measuring and quantifying online activity is a full-blown science and business. Virtually every website tracks where visitors are from, what they click, and how long they stay on each page. To ‘surf the web’ is to sail an ocean of conversion rates, key performance indicators, and statistical analysis. Even signing up for a free blog comes with the opportunity to drill down into visitor data.

Once upon a time, this site was a number-crunching machine, too. I would feel the euphoria of watching my stats ‘soar’ upon publishing a particularly ‘popular’ post. And I would sit in the despair of an attention vacuum when other posts ‘didn’t connect’ with readers.

However, one day I had an epiphany: the impetus and passion to share ideas negates the importance of tracking analytics. I admit this proposition is equal parts unsexy and counter-intuitive, yes. But please allow me to unpack this idea.

I am driven to write in this space because I am compelled to articulate thoughts, ideas, and perspectives that have shifted my perception. For example, take the recent piece on Water Protectors: this article captures a moment of time in my personal evolution where my relationship with the world was significantly recalibrated. I want you — and anyone — to read Water Protectors because I think it represents a genuinely noteworthy set of observations and arguments. At very least, these ideas are important to me because they have changed me as a person.

But what good does it do for me to know how many ‘page hits’ Water Protectors has? If only one other person was ever going to read the article, would I have still written it anyway? Definitely. If an idea is worth sharing with the world, it must be no less important to share it with an individual. (After all, what are human conversations?)

Or consider this: let’s imagine that absolutely no one read Water Protectors. Would I decline to write about the next important lesson I learned simply because no one read my article about water? Of course not. The analytics for earlier blog posts had nothing to do with my compulsion for writing about water. In the same way, the number of page views for the water essay has nothing to do with my motivation for writing this piece.

The value of sharing an idea for the sake of the idea itself cannot be conflated with how many times the page is viewed.

To the extent that writers care principally about sharing ideas, I think we care less about web analytics. We write because ideas need to be written, not because we believe ‘page views’ have some intrinsic value. The zeitgeist of our times says that the metric for the value of an idea is synonymous with its popularity, shareability, and clickrate. We say it is a lie. If we measure the investment of our writing solely by its analytical return, we are no longer truth-seekers, philosophers, and concept architects — but perhaps something more akin to digital populists; submissive copywriters following the dictates of our algorithm overlord.

This is how I came to the decision to uninstall all the statistics and analytics tracking modules from this site. I arrived at a point where I didn’t even want to know how many people read this space because this knowledge neither has nor should have any useful input in determining what I will write about next. It is a distraction. A diversion. I don’t want to know how many people subscribe to my newsletter or podcast, either, for precisely the same reason. I write what I believe merits sharing. And even if no one ever read this post, the lack of audience has no bearing whatsoever on whether or not I will write again tomorrow.

The poverty of personal analytics is that we buy into the corporately sponsored dogma of our age: everything we do and think is meaningless unless other people ‘like’ it, too.

To Code is Human

Yesterday I learned how to do a media query in CSS. When I learn bits of code I am always taken aback by how the complexity of computer languages is surpassed only by their creativity. Coding, even if it appears like gibberish, is 100% human. We made this stuff up. And, like verbal language, digital script gives us the capacity to imagine ideas that were impossible to conceive before we wrote the platform to dream on.
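For readers who haven’t met one before, here is a minimal sketch of what a media query looks like (the selector and breakpoint are illustrative, not the exact query I wrote):

```css
/* Apply these styles only when the viewport is 600px wide or narrower */
@media screen and (max-width: 600px) {
  .content {
    display: block; /* collapse to a single column on small screens */
    width: 100%;
  }
}
```

In plain terms: the stylesheet asks the browser a question about the device it is running on, and answers with different rules accordingly. A conversation between human intention and machine context, written in a language we invented.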

Upon the Altar of the Internet

Curious phenomenon, the Internet continues to be.

We did not use audio cassette recorders to share our opinions with each other about the polyester plastic industry. Nor did we send each other fax updates with our predictions about the future of bitmap scanning technologies. With the possible exception of ham radio enthusiasts and Bulletin Board System users (who were arguably precursive practitioners of the present data age), I cannot think of another communication technology that has conceived so many pundits and prophets of itself. The Internet hosts a self-perpetuating chorus of commentary about the Internet. (You know, sites like this blog, for instance.)

However, we, the self-described cognoscenti of cyberspace, are grounded in no fewer self-validating biases than the most subjective of herbalists. The already fuzzy line between culture and cult becomes indistinguishable in a world that hails tech stars and CEOs as messianic figures. Whether one sits in an Alexandrian temple to debate the merits of Apollos, or whether one blogs about tactics for monetizing social networks, one cannot help but recognize that humanity is animated by a voracious curiosity about the products of its own imagination.

Like any other religion, the Internet is a temple we are building to ourselves: it provides us tangible metrics to validate ourselves individually (at the micro) and collectively (at the macro). It offers us a way to see ourselves as individuals in relationship to the whole. It is the contextual structure for community. It has priests, prophets, heretics, saviours, monastics, and ordained officiants. It has denominations, splinter groups, schisms, power struggles, politics, orders, orthodoxies, and a canon of mythological architects, archetypes, and heroes.

The Internet is a shrine to all the brilliance and ridiculousness that is Homo sapiens. It is potentially the most religious invention we have concocted to date.

If the Internet has taught me one lesson, it is this…

If the Internet has taught me one lesson, it is this: virtually every “original” idea I have was already had, likely a long time ago, by someone much smarter than I, who understood the counterpoints and implications far better than I do.

I have learned that the worth of an idea rests not in being its originator, discoverer, or author. Nor is the value of an idea some inherent property related to its degree of originality.

In the end, what matters about ideas is not only their capacity to shape the way I view the world, but even more how they shape my actions, choices, and behaviours in the world.

When an idea acutely changes the way I live my life, does it really matter who had the idea first?

Perhaps our obsession with originality is more clearly a reflection of our individualism.

Every day I find myself more consciously aware that I need to hear, discover, and absorb the ideas of others… even people who have lived thousands of years ago.

The Internet tantalized the ego of my individualism, which sought to be original and unique, and turned it on its head to teach me a (repeated) lesson about humility.

Internet and Inequality

Kevin Drum wrote an article last year in Mother Jones wherein he argued that

…the internet makes dumb people dumber and smart people smarter. If you don’t know how to use it, or don’t have the background to ask the right questions, you’ll end up with a head full of nonsense. (Drum, 2012)

On the flip side, Ryan Avent suggests that the Internet may be an equalizer of cognitive fortitude:

The more I rely on the same cloud brain that’s available to anyone else, the less the strengths or weaknesses of my meat brain may matter. (Avent, 2012)

As we make the Internet, to what extent does the Internet make us? Are we creating the network in our own image… or is the network transforming us into its likeness? If we are going to speculate on the cognitive-equity consequences of the Internet, the question of reciprocal causation is paramount.

Has the Internet broadened the gap between the smart and the dumb? Has it increased our overall level of cognition by equalizing and democratizing access to information? As with most human technologies, the answer seems to be: depends on the user.

As the famous little epigram goes,

Great minds discuss ideas, average minds discuss events, and little minds discuss people. (Mouat, 1953)

The technical contribution of the Internet is the capacity to make all types of discussions broader and more accessible. The Internet is a digital amplification of human nature. Its transformative influence on the cognitive landscape of society is inseparable from the agendas and cognition of those who leverage it.

Blogging and Trusting

We often underestimate the trust that is associated with blogging. When I subscribe to a blog I am virtually inviting someone else (perhaps a rather talkative person) to make an apartment adjacent to the living room of my mind. Every time I check a blog or browse through RSS channels I am performing a wager of trust, banking on the author’s resolve to share things worth sharing.

Every post, every update, every act of “publishing” online is an act of trust-building or trust-breaking.

As with any form of human expression, there are times when the interests and intents of the consumer simply do not align with those of the creator. In this case, it may not be an issue of “trust broken” as much as it is a flirtatious literary partnership that was just never meant to get serious. And this is fine. In fact, we’d be better off if we were more promiscuous readers from time to time. (There’s nothing wrong with a one-post stand; a fiery one-off that is so intense you can’t remember the author’s name in the morning.)

Regardless of the metaphor we choose for our readers, it falls upon the authors, I believe, to make the first moves towards a committed relationship. There are some topics that are simply not good for discussion on a first date — especially things like running commentaries on minutiae void of consequence. Readers, like eligible suitors, tend to be put off by desperate, insecure ramblings that amount to little more than pleas for attention.

In summation: blog for others as you would have others blog for you.