How Everyone Lost Their Mind… Except You

Max Stossel
Nov 1, 2018

Have you noticed that everything seems like it’s falling apart? That the world is crumbling in exactly the way you thought it would? Have you felt recently like you’re the only one who gets it, and everyone else just doesn’t understand? Like the world has been going crazy and everyone else is getting sucked into conspiracy theories, or fooled by mainstream media, or deceived by Trump’s media machine, or too caught up in identity politics and tribalism to have their own opinions, or brainwashed by the alt-right or the extreme left? Doesn’t it feel like everyone except for you, and maybe your tribe, has lost their freaking minds?!

I’d like to help explain why.

There’s been a series of subtle and not-so-subtle changes to the way we consume and share information that have thrown a significant wrinkle into our reality.

Before we dive in, I’m not saying that all social media is evil. Of course it’s been an incredible tool for many people in a variety of ways and circumstances, but many of the ways it’s deeply harming society are challenging to articulate. In this article I’ll give one of them a try.

A couple of years ago, before Facebook realized that an algorithm with trillions of data points whose sole purpose was getting people to spend more time on its platform was not such a good idea, every time you clicked that blue icon there was a supercomputer playing chess against your brain. It was trying to show you the perfect next piece of content to keep you on the site longer. Even if it sometimes made what seemed like a bad move, and you thought, “Hey, I don’t care about this,” it was playing the long game. It learned from our behavior and the behavior of millions of others like us, so over time it kept us coming back for more. The house always wins, especially when there are a thousand engineers updating the house every day to make it better and better at playing you.

[Image from AlphaChess]

You might remember a simpler time, when Facebook was much more dog and baby photos than it was news. Then at a certain point (I think around 2014) Facebook started heavily emphasizing news on the platform, and the algorithm had to answer the question: which of the infinite possible articles should we show each Facebook user? After lots and lots of testing, algorithms optimizing for engagement tend to make a similar discovery. They find that a very particular type of article rises to the top. Zeynep Tufekci explains this in her TED talk better than anyone I’ve seen. Quote below:

So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube.

And unfortunately it’s not just YouTube. Facebook, Twitter, this entire system that has revolutionized the way we consume and share information has a natural tilt towards extremism, and not just any extremism. It pushes us towards the extremes of whatever ideas we are personally most likely to believe: whatever we are most susceptible to, shared by our friends and the people we trust most.

What does the world look like when two billion people are pushed towards the extreme versions of their own thoughts? Towards whatever extreme rabbit hole each brain is most likely to fall down? It looks something like the world we’re in today.

The rabbit hole my brain is most susceptible to is that people are losing their sense of reality and that social media is breaking the world beyond repair. For you it might be that climate change has us hopelessly screwed and evil corporations hate the environment (bear with me). It might be that Trump has destroyed America; it might be that the left has lost all sense of reason and tolerance. The algorithms don’t care. They care only about enticing us to stay on the site longer. Tilting our view of reality and messing with our emotional states were just externalities of what they discovered to be the most effective means of keeping us scrolling.

To be clear, I’m NOT saying that any of the above scenarios aren’t problems! What I am saying is that whichever of those problems an individual’s brain is most primed to believe, over the past couple of years that person has been systematically shuffled towards a more extreme version of that belief.

One of the most challenging ideas to wrestle with here is that this is not just happening to some abstract “them.” It’s not just those other people who don’t get it, who are completely lost, who would be fine if only they could see the light you see. It’s happening to me. It’s happening to you. It’s happening to Mark Zuckerberg and Donald Trump. It’s happening to all of us.

This is the closest I could get to a mirror.

Even if you turned everything off and decided to get all your information from TV, the people on TV are deeply ingrained in this social media system, often using it to stay on the pulse, so they’re being pushed to the extremes. Even if you avoid screens altogether, you probably know a few other people and talk with them sometimes. If you’re talking to other humans, chances are they’re being affected by this system, and no one influences us more than our immediate social circles, online and off.

After a couple of years of two billion people being systematically pushed to the extreme versions of their ideas, what seemed like a sudden shift in election results made a lot of people start paying attention to this issue. Brexit and Trump’s election seemed incredibly shocking to many, including employees of mainstream news outlets and of social media companies, as the individuals at these companies were not immune to being pushed in their own directions. I don’t think social media is the sole reason for these events by any stretch of the imagination, but I do think that if we zoom out far enough, a global pattern appears to be emerging.

I’m personally not a fan of this trend, but it appears to be happening in the UK, the US, most recently Brazil, and other countries experiencing different versions of the same thing. When you take a society, put a smartphone with social media into everyone’s hand, and then hold elections, the result seems to be a push for more authoritarian, strongman political leaders. If those leaders win, a shocked opposition spreads every possible piece of negative information about those leaders and their supporters. Some of this information is true and some isn’t, but the information that isn’t true, or is exaggerated, is what reaches those supporters, who in turn conclude that the people who hate the leader they voted for have lost their minds.

I have a hunch that suddenly being able to see all the most personally terrifying problems in the world without much agency to actually do anything about them contributes strongly to this trend. As Mark Manson puts it:

We become only exposed to the most extreme negative aspects of certain groups of people, giving us a skewed view of how other people in the world really think, act, and live. When we are exposed to police, we only see the worst 0.1% of police. When we are exposed to poor African Americans, we’re only exposed to the worst 0.1% of poor African Americans. When we’re exposed to Muslim immigrants, we only hear about the worst 0.1% of Muslim immigrants. When we are exposed to chauvinist, shitty white men, we’re only exposed to the worst 0.1%, and when we’re exposed to angry and entitled social justice warriors, we’re only exposed to the worst 0.1%.

As a result, it feels as though everyone is an angry fucking extremist and is full of hate and violence and the world is coming undone thread by thread, when the truth is that most of the population occupies a silent middle ground and is actually probably not in so much disagreement with one another.

We demonize each other. We judge groups of people by their weakest and most depraved members. And to protect ourselves from the overreaching judgments of others, we consolidate into our own clans and tribes, we take refuge in our own precious identity politics and we buy more and more into a worldview that is disconnected from cold data and hard facts.

Unfortunately, when we believe the world is falling apart and focus on the worst 0.1% for long enough, we start to actually become divided. We, the people of more extreme ideas, each CERTAIN of our worldview because it’s been so systematically confirmed over the past few years, scroll through “news” feeds lingering longer on anything that shows further evidence of what we believe or makes us outraged. When we linger longer online, the algorithms take notice, and we get more content that pushes us in that direction. On TV and in conversation we naturally seek out this confirmation ourselves. Either way, we arrive at an unprecedented certainty that our suspicions about the state of the world were right all along.
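The feedback loop above can be sketched as a toy simulation. Everything in it is invented for illustration (the dwell-time model, the “sweet spot” offset, the taste-drift rate) and is not any platform’s actual algorithm; it just shows how a ranker that greedily maximizes predicted engagement, paired with a user whose taste drifts toward whatever they linger on, ratchets the served content toward the extreme end of the catalog.

```python
# Illustrative toy only: invented numbers, not any real platform's algorithm.

def predicted_dwell(item_intensity: float, user_taste: float) -> float:
    """Predicted dwell time peaks for items slightly MORE intense than the user's taste."""
    sweet_spot = min(user_taste + 0.1, 1.0)
    return 1.0 - abs(item_intensity - sweet_spot)

def serve_feed(user_taste: float, rounds: int) -> list:
    """Greedy engagement ranker: each round, serve the item with the highest predicted dwell."""
    catalog = [i / 100 for i in range(101)]  # item "intensities" from 0.00 (mild) to 1.00 (extreme)
    served = []
    for _ in range(rounds):
        best = max(catalog, key=lambda item: predicted_dwell(item, user_taste))
        served.append(best)
        # Lingering on the item drags the user's taste toward what was consumed.
        user_taste = 0.5 * user_taste + 0.5 * best
    return served

history = serve_feed(user_taste=0.2, rounds=20)
print(f"first item served: {history[0]:.2f}, last item served: {history[-1]:.2f}")
# → first item served: 0.30, last item served: 1.00
```

Starting from a mild taste of 0.2, the greedy ranker walks this hypothetical user all the way to the most extreme items in the catalog within a couple dozen rounds. Neither party ever chose extremism; it falls out of maximizing dwell time one step at a time.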

I think this is where we’re at. A generation of people completely certain that we see the world the way it truly is, and that no one gets it but us. Unfortunately, there are also third parties working to exacerbate these issues.

Bots Throwing A Wrench In Everything

Today it’s almost impossible to separate the internet from breaking news. We have this expectation of being able to know about something the moment it happens. The problem is that accurate information often takes time. We want to know what’s going on right away, but how often do we actually get a holistic picture of what happened while following a breaking story? Often it feels more like entertainment than news. Google and Facebook have gotten much better at removing problems after they’ve been discovered, but both platforms have a deeply troubling immediacy problem.

The moment a breaking news event happens (for example, this week’s shooting in Pittsburgh), Google, Facebook, and Twitter are extremely susceptible to false or extreme information rising to the top in the immediate reaction. It goes something like this:

  1. Newsworthy Event Happens
  2. Foreign or domestic actors with specific agendas surface stories blaming the event on a particular group
  3. Bots amplify these stories and make it look like people actually believe them
  4. People start believing them because they have so much engagement or are trending
  5. We sink deeper into our rabbit holes of sadness about the state of the world and frustration with the other side

The platforms are stuck playing Whack-a-Mole with these issues, and by the time they’ve removed the harmful posts, the damage has been done. Bots drove 23 percent of the initial Twitter activity related to the Tree of Life shooting.

Platforms have a long way to go to improve these systems, but at least they’re now working on some of the societal-scale issues. Facebook, YouTube, Twitter, etc. keep their algorithms in a black box, so it’s impossible to say for certain, but in today’s social media climate I actually think Facebook’s algorithm is less tilted towards extremism than it once was (though that’s kind of like being less on fire). YouTube continues to be a recommendation nightmare, and Twitter has become more and more like the Facebook of 2014. But these platforms are at least aware that they have immense responsibility.

They’re aware that more time on site isn’t actually good for people’s mental health. They’re aware they’re accidentally causing deaths in countries like Myanmar and India, where their engineers don’t speak the language. They’re aware they’re accidentally making children watch scarring videos, and they’re trying to squash problems as they find them. They are caught in an endless and horrifying game of Whack-a-Mole.

I believe they need to be more willing to stray from their core products to make any real progress*, but regardless of the changes these platforms may or may not make, if we don’t get present to the impact years of this system have had on our worldview, we can’t actually undo the damage. If we start seeing better and less extreme information, we won’t believe it, because it will look wrong, or in denial, or completely disconnected from reality, or like __________ has lost their freaking minds!

I know that saying “hey, your worldview is inaccurate because you’ve been manipulated by this system” isn’t going to immediately make you reevaluate your beliefs. I’m aware of the irony that this is the extreme belief I’ve been led down the rabbit hole of, and the irony that the systems distributing this message are targeting it to the people most likely to naturally agree with, or be enraged by, this point of view. But I’ve also been trying desperately to keep my eyes on the road as I’ve watched all this going on, and I think I’ve done a decent, albeit flawed, job.

I hope seeing my perspective on all this will make you a little more open-minded towards the people you think are crazy. Maybe it will make you a little less certain that your doomsday is really the one to be worried about. Maybe it will stop you from sharing that outrage-inducing article and de-outrage the internet just a little bit. Maybe it will lead you to examine which rabbit holes you’re vulnerable to falling down. Maybe it will provide a bit more space for action in the areas of your life you have control over. If any of that happens, then the last few hours I’ve spent glued to my screen will have been worthwhile.


Thought Starters on A Better Way

*What does a social network look like that actually enhances our social lives? Maybe one that helps us spend more time with the people we’ve indicated we care about. Maybe one that guides us to experiences we might spend less time on but later rate as satisfying or meaningful. I wouldn’t pay for any of the social networks as they exist now, but I’d absolutely pay for something like that: something that uses the wealth of data it has on me to help me achieve goals that are actually my own.

Joe Edelman, who helped coin the term “Time Well Spent” (since co-opted by Zuckerberg and others), believes the answer is more about values than goals. He believes the solution to many of these issues lies in designing social systems that help people live by their values. If we think about the consumption and sharing of news through that lens, it would likely look a lot different. What is the purpose of news? What is news’ role in a “social” network? There’s a societal pressure to “know what’s going on,” but in a completely connected world, it seems clear we need to get more specific than that. What news helps benefit your life? What news helps you benefit the lives of others?

If someone is trying to live by their value of generosity, what would it look like for their news to be tailored to that value? It probably wouldn’t look anything like a feed at all. It might be a system that brings information at opportune moments to help them acquire resources so that they have enough to give away. Similarly, it might surface content that helps them find new opportunities or environments around them in which to act generously. If someone were trying to live by their value of emotional courage, perhaps they’d be served stories of people modeling that behavior. Joe’s Human Systems Curriculum teaches designers to think this way. I highly recommend the course.

Imagine a world where platforms that distribute information or news are not just competing for our attention, but are competing for how effectively they help us live by our values. That’s a world I think we could all be proud of and a world I’d like to live in.

Much in this article is not of my own discovery. Thanks to Joe Edelman, Zeynep Tufekci, Tristan Harris, Mark Manson, Tobias Rose-Stockwell, and so many others who dedicate their time and attention to better understanding, articulating, and solving these issues.



Max Stossel

Award-winning poet + filmmaker. Head of Education at The Center for Humane Technology.