
Can’t Unsee

While recently scanning Tweets, my eyes halted for a second on a close-up of someone smiling. He didn’t particularly stand out from what I usually see on social media. But as I continued scrolling, I started feeling uneasy. Did this come from the positive photo or from the tens of other posts I viewed along with his? None could have caught my eye for more than a second. I was bothered that I couldn’t tell what bothered me.


I’d had this feeling before, and it seemed specific to social media. Thinking a bit more, I realized it was also specific to positive posts! Negative Tweets produced more tangible reactions, whereas neutral ones really didn’t seem to affect me at all. Angst produced by disturbing images, for example, made perfect sense, but how to explain discomfort at scrolling through positive posts? Deep psychology? Lack of self-control? Was I at the mercy of apparently well-intentioned posters?


I don’t know a catch-all term for how our behaviors affect other people. Perhaps no such term exists for the good reason that acts and their effects are too varied for a useful one-word moniker. And when it comes to social media[1], searching for the magic term is largely moot, since – and this is my point in this essay – we are not well equipped to realize we are affected, nor to deal with it once we pinpoint the source.


In my view, our behaviors and emotions are mismatched to the accelerating information universe in which we find ourselves. So, although my uneasy feeling at a happy expression may very well have been the same as that of a member of a hunter-gatherer group 10,000+ years ago, it is troubling that I should feel it toward a perfect stranger projected in a small corner of my computer screen.


Let’s look at the phenomenon of mismatch from different angles, starting with the raw material: information.


Relativity or relatively?

Information is all around and in us, but only manifests when interpreted[2]. Cells interpret information within and from other cells; plants interpret information from soil chemistry, microbes, and gases such as O2 and CO2; we interpret information from books, art, sport, other people, and social media. I could go on ad infinitum, but even then, this would be just the tiniest tip of the information iceberg, especially for humans, who not only thrive on information but also generate excessive amounts of it – most of which is never interpreted.


Simplifying greatly, information has two main facets: in the moment and across time. First, the moment: what goes on around us and what we generate internally. The former includes the living and physical worlds, both of which interact in shaping information. Think of technology, think of the climate, think of picturesque landscapes. Like external sources of information, internal ones could appear to work in isolation, but (with the possible exception of creative imagination…and even then!) information generated from within us is ultimately influenced from outside.


Second, independent of information in and around us, individual experiences accumulate throughout a life. That is, a given person is exposed to an information level X_Present, which tends to be greater than X_Past, especially as the gap between present and past becomes large. As of today, my sum total of X over time period Y is greater than yours over time period Z, if Y > Z. Simple, straightforward, and it suggests cumulative information is a double whammy: more per unit time and more time over which to have experienced it.
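To make the double whammy slightly more concrete – a minimal sketch, on the sole assumption that the rate r(t) at which information arrives never decreases – let X be the information accumulated over an exposure window of length Y ending at the present T:

\[
X(Y) \;=\; \int_{T-Y}^{T} r(t)\,dt, \qquad r'(t) \ge 0,
\]

so X(Y) ≥ X(Z) whenever Y ≥ Z. The non-decreasing rate is the ‘more per unit time’; the longer window is the ‘more time to have experienced it’.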


Each of us experiences ‘more’ in our own subjective way. Mine dates to the 1960s, when, compared to today, information came from a minuscule set of sources. In the pre-internet, pre-email, pre-cell-phone world, our sources were TV[3], radio, books, newsprint and magazines, and word from people we met, corresponded with or phoned. That was it! We had less information available and greater control over content. Getting in-depth information meant reading books or newsprint. Books meant going to a bookstore or library, or having a book lent by a friend. Newspapers could be delivered daily to your doorstep.


I was a big fan of the New York Times, and flitting my eyes over those ink-delible pages was a pleasure. The total population of candidate memes encountered was a tithe of what it is today. Or perhaps the total information was comparable, but it came in fewer sources, and one either ignored it or was simply better able to filter the unwanted then than now. But the reality is that even if we perceive information as unchanging through time, this is an artefact of how we continuously mentally adapt to the reality of increasing information.


Adaptation to information growth, and in particular to broadcast information – that is, information coming from a single source, with little or no possibility of reply – has its limits. To simplify, information comes in chunks and bits. We can react to chunks, since there are relatively few of them and they potentially have huge consequences. At the opposite end, infinitesimal bits tend to escape our notice, making them a challenge to filter. Bits in isolation present few challenges, yet can be more harmful than chunks should they arrive as unfettered streams[4]. Was the Twitter smile the bit that broke my back?


Cultural change

We sometimes don’t realize the extent to which technology has come to dominate our information universe. Stored and newly generated information is just waiting to be consumed. The availability, quantity and speed of information continually pick up through time, or, rather, tick down as we go back in time. Going back, way back in time reveals something else about information: how we interface with it. Humans have gone from being largely seekers to mostly wanderers.


A seeker wants to know what is happening in their surroundings and relies on direct observation and close social interactions. News from distant groups might as well be from the Moon. Our hunter-gatherer ancestors were largely seekers. As knowledge and technology grew, seeking kept pace by evolving to access this knowledge and employ technological tools. However, it’s a misconception that a nomadic lifestyle is only about finding the next meal. To the extent that present-day hunter-gatherers are representative of our distant past, we would have had comparable if not more inactivity and leisure time thousands of years ago than did the first agriculturalists – or than we do today! More time today, but not always dedicated to resting or fun.


What happened? The incessant growth of technology and connectedness has created time from nowhere – fleeting seconds or mesmerized minutes – to wander, gathering marginally relevant or useless stuff. Tweets are easy, letter-headed letters are hard, and email is somewhere in between. True, sometimes we do seek, but for well-to-do people, this is a far cry from quests for life’s necessities. The time from nowhere that we spend on our cell phones, email or the Internet, however, is not really leisure. What it is is hard to say[5].


Hasn’t escaped attention

So why are we so easily duped into getting that fix?


Part of the answer is that we cannot easily escape our biology. We have evolved to record and recall. This means that what we experience, even in glints, can’t always (if ever) be dispelled. Record-recall would have been useful for our ancestor seekers, who employed indicators and clues to interact socially, find resources and avoid dangers. The quantity of information encountered would have been enormous due to the diversity of situations and the many (sometimes unexpected) contexts of each. This is nevertheless a tiny fraction of the infocapts we encounter in today’s world. So, nowadays our recorders are turned on to help us survive and reproduce, but they are registering what is largely non-Darwinian crap. Indeed, that information from single encounters with perfect strangers affects us at all equates to pathological non-Darwinian crap.


Beyond apparent maladaptation in Darwin Award winners and other extremes, it’s not unreasonable to think that social media produces far more frequent, unintended feelings. For example, some studies show that the use of social media correlates with depression. Add to this arguments that social media has negative effects on broad-scale society, and that it generates a range of less severe negative emotions and feelings in individuals.


In a 2021 article published in Trends in Cognitive Sciences, Ethan Kross and colleagues review the accumulating literature on how social media affects emotions. Although experimental evidence points to marginal negative effects on well-being, the authors argue that differential vulnerability and usage contexts generate person-to-person variation, and that surveys and experiments not accounting for this can mask moderate or severe impacts on certain users. The authors discuss and distinguish the many passive and active psychological facets of motivating and reacting to media, and the fact that we tend to notice and report negative consequences more often than positive ones.


Social media is not as grave as tobacco, but you get the point.


State of mind

The Internet has become a self-service buffet. Just think of the freedom we have to say and see what we want. This is good as far as it goes, since human freedom by definition has limited limits. But freedom and spontaneity can be an explosive mixture in social interactions, generating self-regret and incredulity in others. And whereas we have some control over what we say, this is not quite so for what we see – unless, of course, we either avoid social media altogether or do engage but devote effort to blocking or muting accounts and only following select people or things. General filtering is not a good solution because it is error-prone: well before reaching the malarkey threshold you’ve already seen one or more ludicrous, time-wasting or annoying posts.


Filtering what we see and not stumbling in what we say would appear best accomplished through some form of self-control and, specifically, vigilance. Vigilance lies somewhere between prevention – don’t use social media, or don’t follow certain accounts – and reaction – mute or remove what you find bothersome. Vigilance is prepping your mind to censor posts before they reach your visual cortex. Needless to say, this is problematic, since filtering is time- and energy-consuming…and vigilance relinquishes freedom.


But all is not lost: we filter without even realizing it! Without filters our brains would be bombarded with information – functions would frazzle. Yet there’s a gulf between natural, autonomous filtering and active, hands-on sorting. Are we really appreciably losing freedom when we actively filter? There’s nuance here. Vigilant filtering reduces freedom because of time that could have been spent doing other things, but increases the freedom to intently experience fun stuff.


A useful metaphor for vigilance is a focusable pair of glasses. Just turn the dial and unwanted information goes opaque. But given that natural filtration lets unwanted stuff through anyway[6], don’t count on manual filtration being perfect. There are many reasons, foremost that the informational environment is complex: emitters themselves are diverse and variable. What you don’t want to see today is produced 100% of the time by Jake, 50% of the time by Judy, and 10% of the time by John[7]. What you don’t want to see tomorrow may be different, as may be what each of the three Js posts.
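To see how imperfect this gets, here is a minimal sketch in Python – the three Js and their rates are just the hypothetical figures above – of a static, account-level mute rule committing both kinds of errors:

```python
import random

# Hypothetical per-account rates at which posts are unwanted *today*
# (the three Js from the example above); the rates drift from day to day.
unwanted_rate = {"Jake": 1.0, "Judy": 0.5, "John": 0.1}

def block_list(rates, threshold=0.5):
    """A static, account-level rule: mute anyone at or above the threshold."""
    return {name for name, r in rates.items() if r >= threshold}

def scroll(rates, posts_per_account=20, threshold=0.5):
    """Count the two failure modes of the static rule over one day's feed."""
    muted = block_list(rates, threshold)
    seen_unwanted = missed_wanted = 0
    for name, r in rates.items():
        for _ in range(posts_per_account):
            unwanted = random.random() < r  # this particular post is malarkey
            if name in muted and not unwanted:
                missed_wanted += 1          # a fine post silently lost
            elif name not in muted and unwanted:
                seen_unwanted += 1          # malarkey got through anyway
    return seen_unwanted, missed_wanted

print(scroll(unwanted_rate))  # typically ~(2, 10): both error types, every day
```

Lower the threshold and more of Judy’s fine posts are lost; raise it and more malarkey gets through – and tomorrow the rates will have moved anyway.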



Unremembered

Vigilance is fine as long as posts trigger your filter when they should. You either avoid – flit your eyes away or scroll faster – or defy: stay on the post and defuse it. More insidiously, posts that perhaps should trigger your filter don’t, and the experience obliquely rears its head sometime later.


This raises the question of how and why memories are formed and forgotten. To oversimplify, memory divides into short and long term. Short and long term reside in largely distinct parts of the brain, and the signaling pathways involved in their formation and recall differ as well. The point is that regardless of how we slice and dice memory, it’s ultimately constrained. Constraint does not mean the brain has a hard limit beyond which memories either don’t enter or, if they do, replace others, but rather that (with rare exceptions) our minds are not bottomless pits. Memories need to be constructed, stored, maintained and retrieved. All of these capacities are limited and imperfect…and ever more so as we age.


That memory formation and retrieval are never optimal means we record too much, fail to record, and mis-record. It stands to reason that memory components each have a probability distribution about some kind of optimum, and that some of these memory dimensions are interdependent. Straying from the optimum has costs; for example, recording too much over too little time means lower play-back fidelity. On the other hand, too little recording could mean, in the extreme, paying with one’s life, for example from recurring encounters with a wild animal or a hostile human group. Recurring encounters would have selected for memory in our distant ancestors. This is key to understanding memory in today’s world, whether the responses to information are appropriate or pathological.


Darwin

Information behaviors, including social interactions, are potentially under Darwinian selection. That is, the stuff of interior and exterior environments influences heritable behavioral traits, which, in turn, impact survival and reproduction. This is not to say that each and every thing you see and do has an evolutionary explanation – indeed, probably few do. Rather, information gathering and processing behaviors were selected in our ancestral environments, with the caveat that they were adapted to have some degree of flexibility as well[8]. This explains why we (usually) don’t experience a total meltdown when we look at our Twitter feed.


Again, as above, the behaviors we observe today do not necessarily serve the same functions, nor are they under the same degree of selection, as when they evolved. Given social and behavioral flexibility, this is hard to prove. We usually seek such evidence either through archeology or through the observation of present-day hunter-gatherer groups. We find that interdependence was and is key to the survival and prosperity of human groups. Not meeting social norms can result in a range of responses, from a disciplinary facial expression to physical punishment. Positive attitudes may have served to reinforce mutual dependence by rallying groups to obtain scarce resources or fend off competing groups. Positivity could also have fostered finer within-group social structure, for example building mutual trust or generating power hierarchies.


Our minds have not fundamentally changed between the past and now, but the environmental and social contexts are very different. Whereas my adult livelihood would have depended on my clanmates 10,000 years ago, I can live a largely solitary, long life today. This greater individualism and lesser interdependence is reflected in how Western culture has depended less and less on groupiness since the mid-20th century. In Bowling Alone, Robert Putnam speculates that home entertainment – television in particular – was largely responsible for this effect. Since Putnam’s book, what would appear to be a return to social groups is – with the notable exception of gaming – in fact nothing of the kind. It’s not groups (it’s networks), not in-person (it’s virtual), and not enduring (it’s ephemeral).


The message is: don’t confuse group with network. Today’s social networks not only don’t resemble old-school in-person gatherings; if anything, they bring Bowling Alone individualism from the living room to the global arena.


Mismatch

The clear conclusion is that behaviors elicited by social media are mismatched to the in-person contexts in which they evolved. Mismatch does not mean we get sick or die by simply scrolling Tweets! Rather, our responses to virtual social interactions are outside their evolutionary context. It should come as little surprise that reactions to a Tweet could indeed approximate something like what we would have done in groups thousands of years ago. The point is that behaviors and emotions produced in today’s world are not under the same selection regime, since performance measures such as retweets and new followers (both proxies of a sort for Darwinian fitness) contrast with, or are simply not the same as, Darwinian fitness in our ancestors[9].


Mismatch in social media can take many forms, the most tangible being wasting time that could be better spent doing other things or, as is the theme of this essay, being unexpectedly impacted by the nature of what we see[10]. Either way, we do not necessarily forget all the drivel in social media. True, Twitter posts do not spontaneously pop into my head days after seeing them. But if I were to go back and re-scroll ‘forgotten’ ones, I’d surely be dismayed at how many I remember.


Future shock now

No single entity can possibly access, let alone process, even an infinitesimal fraction of humanity’s stored information. Yet information is what we perceive as progress. Progress means more information, which, in turn, supercharges progress. These feedback and feedforward loops manifest as an accelerating pace of life.


In his 1970 book Future Shock, Alvin Toffler develops the idea that humans cannot psychologically keep pace with the progress we as a society create[11]. The vast majority of humans contribute little if anything to progress, but rather are its unassuming psychological victims. No doubt a barrage of ever-new things, and the inevitability of their ephemerality, does not sit well with a sense of security and well-being. No wonder social media – as the epitome of ephemerality – can influence mental health.


Future Shock is a dystopian take on the mismatch framework. Writers and theorists often deflect mixed outlooks to the negative, and they provide little in the way of a firm scientific basis for their claims. Mismatch, by contrast, has a scientific basis in the academic realms of ecology and evolution. Even if it is an oversimplification to apply biological principles to the human mind and society, there is consensus that psychology can be mismatched to present-day contexts. Unlike Future Shock, however, mismatch need not generate negativities – it could just as well be that a new psychological context (e.g., spirituality, meditation) creates positive feelings.


All of the above may come across as too Darwinian. Well, I’ll add a layer. Our behavioral compass is adapted to our ancestral physical and social environments, but it is also adaptable to swings in environment and indeed to new environmental dimensions, especially technologies. This compass takes the form of self-control. Perhaps paradoxically, for control to work, it can’t be overly boxed in with constraints. That there’s an evolutionary basis here is suggested by flexibility and control decreasing through life.


Alive?

As a first approximation, vigilance is selfish: we act to improve our own experience. But if many social media users are vigilant and use platforms less, this puts pressure on platform designers to adapt capabilities – to keep the adepts onboard, to bring more people in, and to make things such that adepts themselves bring others in. Platform adaptation to user mores could draw in broad swaths of the population, but could also become more personalized, intelligently collecting and analyzing personal information and then reacting to keep you interested.


The obvious frightening question, then, is whether social media has agency like a thinking individual or, even scarier, like a thinking network. Although there’s some truth to social media agency, platforms are nowhere near the intelligence or autonomy of human individuals. Rather, although massive platforms such as Twitter, Instagram or Facebook could appear as unitary, localized entities, they are, in fact, virtual, multi-purposed, distributed networks. This does not preclude their behavior being akin to that of a biological species, but what we see in real time is better described as a multi-level, open terrain where users can wander through endless menus, click widgets, etc. This said, something akin to Darwinian evolution can occur, but (at least at present) it requires explicit platform intervention by human programmers. Thus, social media evolution is not of the biological Blind Watchmaker type, but rather is directed at ameliorating shortcomings and seizing opportunities.


Just coevolution, silly

I believe the to-and-fro between users and platforms qualifies as evolution or, more accurately, coevolution. Social media coevolution is understudied, and we simply don’t know what these dynamics resemble, in large part because existing platforms adapt to environmental conditions (cell phones, user behaviors, etc.) and new platform ‘species’ frequently emerge – the latter not by biological speciation, but through a process akin to technological innovation.


The naive coevolutionary expectation is that emitters (social media platforms, posters) evolve to better penetrate minds in ways that benefit them, whereas viewers continually evolve better search and control capacities. Social media platforms have the upper hand, since they are fighting (so to speak) for their survival: on the offensive and dedicated to keeping viewers. Viewers, on the other hand, are less dependent on platforms than the reverse. Thus, viewers are less assiduous about filtering than platforms are about corralling. Somewhat in between are posters, who need both platforms and viewers to thrive. But posters and viewers are usually the same individuals! To the extent that strategies evolve, does being both a poster and a viewer generate internal conflict?


Agency v2.0

Beyond the impossibility of social media ever being regulated to the point that users decide to leave, its largely free, encouraging nature makes messing around a real concern. Messing around evidently includes what you and I would view as unacceptable and what (or so we hope) social media platforms are able to identify and censor. But most of the freedom distribution (at least in academia) is harmless: usually positive or occasionally a bit snarky. This is not what most viewers would see as surreptitiously messing with our minds, but then again, if a mind is influenced and the cause not apparent, how can one know?


So, what to do? Platform regulation (except for extremes, such as bad-for-the-commons hate groups, …) is out of the question since, if for no other reason, we can’t cancel happy posters. More local filters on our devices are a possibility, but these risk making mistakes, missing interesting posts, and failing to adapt to our changing preferences.


Regulating the information commons to create a world of pleasantness-for-all is therefore unattainable. And even were it to happen, it would create more problems (freedom being #1) than it would solve. The fact that many perfectly well-intentioned posts can cause despair reinforces the conclusion that viewers themselves need to take more control. ‘Well-intentioned’, however, should be nuanced: viewing someone happy who does not stop boasting might not feel particularly good if you have nothing to be happy about. As a corollary to the ‘unseeing of what bothers’, would you not find it odd if the vast majority of reactions to your good news came from people who do not know you personally or professionally? And even if they do know you, isn’t it strange that their ❤️ or 👍 are seen both by you and by any onlooker?


Self-control unleashed

This essay has explored how the ever-changing information landscape, as manifested in social media, can generate subliminal effects on feelings and emotions. I have underscored scientifically based explanations for mismatches, and the practical idea that enhancing self-control through vigilance and filtering can limit some of the negative effects of using social media. These ideas extend well beyond the social media universe to include broadcast-type media (television, podcasts) and events in everyday life (social norms).


The evident problem with self-control as a panacea is intentional or inadvertent self-censorship on socially important issues. I therefore stress the importance of self-control in managing self-control.




Footnotes

[1] Evidently there are a variety of platforms, some of which we use for general interactions, others for personal relationships, and yet others for professional reasons. So when I glance over my university emails and see a name or subject line, I can’t unsee it (although sometimes I wish I could!). It has to be opened and, once seen, dealt with now or later.

[2] A radical oversimplification! A given object can yield information at different levels, from atoms to larger-scale structure(s). The information yielded also depends on the environment (e.g., wavelengths of light reflecting off the object) and on the observer (aspects and level(s) observed, quality and quantity of input data, processing...mood or condition...). Thus, information can be intrinsic to the object, but more usually involves some degree of environmental interaction and interpretation by the observer.

[3] I recall eight channels in Los Angeles: 2 (CBS), 4 (NBC), 5 (KTLA), 7 (ABC), 9 (KCAL), 11 (KTTV), 13 (KCOP), 28 (PBS).

[4] I’ve previously put forward a theory that modern cultural change is largely due to tolerated increments in information, akin to memes. If these increments are endogenized at intermediate rates, then the meme population (as it were) will achieve a dynamic equilibrium of constant turnover: the culture evolves but stays intact. However, if the force of broadcasting new memes is too great, then the culture may ‘melt down’ in the sense that it loses its basic elements and becomes dominated by the continual influx of change (what I refer to as ‘deviations’). This idea of negligible changes is not written in stone. The information environment is not 100% perfect tasty bite-sized packets. Rather, information parameters can be inadvertently or intentionally targeted to the consumer. Anything from micro-steps to great leaps in content can target those who passively or willfully consume such memes. https://royalsocietypublishing.org/doi/10.1098/rsbl.2004.0172
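For intuition only – emphatically not the model in the linked paper, just a toy sketch with invented parameters – imagine that newcomer memes can be endogenized into the culture only up to a fixed capacity per time step:

```python
def meme_turnover(influx, endog_rate=0.2, forget=0.3, capacity=10.0,
                  culture_loss=0.02, steps=2000):
    """Toy dynamics: `new` are un-endogenized memes, `culture` the endogenized pool.

    Newcomers are absorbed into the culture at a capped rate; all parameters
    are hypothetical, chosen only to show the two regimes.
    """
    new, culture = 0.0, 100.0
    for _ in range(steps):
        absorbed = min(endog_rate * new, capacity)  # limited absorption per step
        new += influx - absorbed - forget * new     # arrivals minus losses
        culture += absorbed - culture_loss * culture
    return round(new, 1), round(culture, 1)

print(meme_turnover(influx=5))     # ~(10, 100): culture intact, steady turnover
print(meme_turnover(influx=1000))  # ~(3300, 500): influx dwarfs the core
```

Below the absorption capacity, the culture persists amid constant turnover; far above it, the un-endogenized influx swamps the core – the ‘meltdown’.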

[5] Laurie Santos has aptly called this ‘time confetti’.

[6] And should, since stuff needs to be evaluated (if only for a split second) before being categorized as useful or binned as trash.

[7] This facet is sometimes ignored as an issue in the social media ecosphere. See https://www.sciencemag.org/careers/2018/11/social-media-survival-guide-scientists and https://www.sciencemagazinedigital.org/sciencemagazine/11_february_2022_Main/MobilePagedArticle.action?articleId=1765504&app=false#articleId1765504

[8] Of course, I have not forgotten that social trait evolution is largely cultural, that is, evolution can and does occur without selection on genetic variants.

[9] A low-cost act demonstrating generalized reciprocity would have been selected in closely knit, mutually dependent hunter-gatherer groups, but is meaningless in many present-day contexts. For example, opening a door for a perfect stranger only makes sense, in these terms, if you had a good chance of meeting that stranger at some later time. Door-opening is nevertheless maintained as a modern social norm: breaking it could incur the high cost of an altercation, compared with the low cost of simply opening the door.

[10] Another facet of maladaptation is the effect of social media on how we access and interpret information. Being bombarded with information that constantly tests our emotions could lead to insensitivity and to pathologies such as anxiety. Again, as is the theme of this essay, the gradual onset and subtlety of these effects could mean they escape our notice completely.

[11] A glimpse of what was to become Future Shock can be found in Toffler’s 1965 essay The Future as a Way of Life https://mgorbis.medium.com/the-future-as-a-way-of-life-4bc314ec97de. Although Toffler lived until 2016, I somehow doubt he ever had a Facebook, Twitter or Instagram account.

