If you’re not a fan of body parts or biology, yes I agree, brains can be gross… But this wrinkly little sponge can do some amazing things when used in the right way.
I personally have never been one to admire others for their beauty, money, experiences, or status, but when it comes to the way someone thinks… I begin to “fangirl” like crazy!
For me, one of the most amazing moments to watch is when a beautiful brain is able to string together different ideas and explain those connections to a “regular” audience. These moments are hard to come by, but when they happen it’s like fireworks for my brain!
This week’s research sprint was inspired by a book I’m reading (Skin in the Game) and an author I’m researching (Nassim Taleb).
Nassim is one of many brains I admire due to the way he thinks and the ideas he contemplates.
Below is a short list of brains I admire, but this list is always changing and growing, similar to my understanding of the world… Something I think we should all be a little more open to (i.e. changing our perspectives).
- Nassim Taleb, Chamath Palihapitiya, Naval Ravikant, Richard Feynman, Michelle Thaller, Eric Weinstein, Jason Fried, Scott Adams, Marc Andreessen, Nick Bostrom, Elon Musk, & Bryan Johnson
And yes… I’m completely aware that this list is male-heavy, so thanks to this little exercise I’ll start working on finding more women whose brains I admire. If you have any suggestions feel free to shoot them over (here).
Asking the right question
This admiration for the way people think started me off on a “wandering journey” trying to figure out what they’re doing that I’m not… The danger (or beauty) of going on these “wandering journeys” is that they sometimes result in dead ends, but the flip side to wasting hours of your life Googling the “wrong” questions is that you’ll eventually uncover the “right” ones. That was exactly the case here.
At the beginning of this week, I started off researching rational and scientific thought, hoping to understand more about this mental process. My hope was that if I could better understand how to approach problems and opportunities more scientifically, then I would be one baby step closer to the brains I admire.
Good news! I know more about these ideas. Bad news! After a few hours of reading, I found out that I was asking the “wrong” question. But that’s life… Ha!
I realized that scientific thinking sits under a larger umbrella of rational or analytical thought, which sits under an even larger umbrella of mental models (closely related to what researchers call heuristics)… And that’s what we’re going to explore today, but before that, I want to share some interesting nuggets of information about our brains and the world we live in.
Lazy by default
It’s amazing how we’re naturally wired to be so lazy as primates, but still accomplish some amazing things (e.g. moon landing, internet, fast food, genome editing, and selfies). If you’re not aware, most of the body, and especially the brain, defaults to lazy (i.e. do whatever is easy) when given the option… But this laziness exists for a good reason. Back in the day, we primates needed to conserve all of our energy for survival, but luckily things have changed and now we’re able to focus on sending people to Mars (or taking selfies).
The mental work of learning something new can be a painful and long process, but it’s possible to enjoy the pain and shorten the process.
Why thinking is hard…
The process of thinking can be broken into two systems…
System 1 (we’ll name Drew):
- Defining “Drew” → This is the brain’s fast, automatic, & intuitive approach. This is also where your “mental shortcuts” and heuristics live.
- Examples: Driving a car on an empty road… reading text on a billboard… or… solving 2+2
System 2 (we’ll name Gun):
- Defining “Gun” → This is the mind’s slower, analytical mode, where reasoning dominates.
- Examples: digging into your memory to recognize a sound… parking in a tight parking space… or… solving 17 × 24
What most people don’t understand about these two systems is that they work together… Once you’ve gone through the pain of training “Gun” (i.e. System 2), he passes this learning on to “Drew” (i.e. System 1). Once “Drew” has this skill or knowledge, he’s able to pull it up MUCH more easily when making decisions.
The trick here is to figure out ways to build up your pain tolerance for using “Gun” (i.e. System 2) and passing those learnings on to “Drew” (i.e. System 1), so you create a war chest of skills and knowledge that can easily be called upon. More on this a bit later…
… And why it’s ok to use shortcuts
Now, most people see the use of System 1 as a lazy shortcut and believe we humans should be more logical, more of the time. This is wishful thinking. As I mentioned before, we humans are biologically built to conserve energy, but more importantly, the assumption itself is wrong. And that leads me to my next point…
There’s a group of thinkers (Nassim Taleb and Gerd Gigerenzer are two popular ones) who argue that taking mental shortcuts (i.e. heuristics) is not only OK but actually better than trying to rationalize your way through life. This argument was born out of an idea called “Bounded Rationality”, which Herbert Simon made famous. The term sounds complicated because most academic things do, but it’s pretty simple… Basically, we humans are limited (i.e. bounded) in how rational we can be by our brain’s capacity for analytical thought and the time we have to make decisions.
This idea of being limited by our brain and time sparked a simple, but convincing argument from this group of thinkers.
We, humans, tend to run into two types of problems and each problem has its own little world.
World 1 (i.e. Certain) – This is when we have “perfect” knowledge of all the future scenarios of the world, their consequences, and the probability of each scenario happening.
- Example: This world could be explained through the casino. Depending on how you prefer to lose your money (i.e. the game you play), you could calculate all the possible numbers your dice could land on, the consequences of each roll, and the probability of getting different combinations… But… In the end, you’ll realize you’re most likely going to lose.
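Because World 1 is fully enumerable, the dice example above can actually be worked out exactly. Here’s a minimal sketch (my own illustration, not from the text) that lists every possible roll of two dice and the probability of each total:

```python
from collections import Counter
from fractions import Fraction

# Enumerate every possible outcome of rolling two six-sided dice.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
totals = Counter(a + b for a, b in outcomes)

# In a "certain" world we can list every scenario and its exact probability.
probabilities = {total: Fraction(count, 36) for total, count in totals.items()}

print(probabilities[7])  # 7 is the most likely total: 6/36 = 1/6
```

That exhaustive list is exactly what “perfect knowledge” means here, and it’s also why the house can price every game so that, on average, you lose.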
World 2 (i.e. Uncertain) – This is when we have no “perfect” knowledge of all the future scenarios of the world, their consequences, or the probability of each scenario happening. This is the world we live in most of the time…
- Example: There is an endless list of examples for this world, but we’ll go with something obvious… Like deciding to change careers. It doesn’t matter if you’re changing roles inside the same company, switching industries, or going out into the world on your own as an entrepreneur, there are too many unknowns for you to rationalize your way through it. This is where “mental shortcuts” come into play, with a little bit of logical thinking sprinkled throughout.
The majority of our lives are based in World 2 (i.e. Uncertain), and in this world mental shortcuts (i.e. heuristics or System 1) are almost always the more effective way of thinking.
A spiderweb of brains
After understanding that we humans use “mental shortcuts” most of the time, and that these “mental shortcuts” are developed through training “System 2” (i.e. slow and rational thinking), I immediately had my next question…
If I’m aware of more mental models, will I be able to train “System 2” a bit faster?
In walks Shane Parrish and his gang of “mental model” enthusiasts (e.g. Gabriel Weinberg, Charlie Munger, and James Clear).
Here’s a short explanation from Shane on what mental models are and how they’re used…
“Mental models are how we understand the world. Not only do they shape what we think and how we understand but they shape the connections and opportunities that we see. Mental models are how we simplify complexity, why we consider some things more relevant than others, and how we reason.
A mental model is simply a representation of how something works. We cannot keep all of the details of the world in our brains, so we use models to simplify the complex into understandable and organizable chunks.”
The interesting thing about mental models is that once you know the name of a model and its basic definition, you start to notice it everywhere. This mental illusion is called the “frequency illusion” (also known as “frequency bias”). For example, think about your younger self and how you really wanted a specific type of shoe, toy, or model car, and all of a sudden you began to see it everywhere… That’s the frequency illusion in action.
The more models you notice in yourself and the brains you admire, the easier it becomes to begin mapping them out. This “spiderweb” of models is something you can build over time, and it will hopefully help with training “System 2” a bit faster. At least that’s what I’m hoping for… remember, we’re on this journey together! Haha
So let’s take a look at a few mental models.
Three mental models I use the most
- Pareto’s Principle (i.e. 80/20): The principle states that a small share of causes (roughly 20%) often produces a disproportionately large share of effects (roughly 80%). This is a mental model I find myself using a lot, and I constantly ask myself a similar question… “What’s the most effective use of my time and energy?”. Remember, “effective” doesn’t always mean productive, so going for a walk, relaxing with a loved one, or taking a break with YouTube is completely acceptable. HAHA!
- Narrative Instinct: Human beings have been called “the storytelling animal” because of our instinct to create and seek meaning in narrative. This is a mental model I’ve found extremely useful in my life when trying to get complex ideas across to friends, family, and co-workers.
- Survivorship Bias: This happens when we assume that success tells the whole story and we don’t emphasize past failures. Think about how we sometimes worship those college dropout billionaires (e.g. Bill Gates, Mark Zuckerberg, Steve Jobs, etc.), without considering all the other people who dropped out and didn’t land so gracefully… Or that right now it seems “cool” to promote entrepreneurship, without emphasizing the depression, failures, and solitude that come with it. This is a mental model I lean on anytime someone asks me about success… It’s mainly timing and luck, that’s it.
Three mental models I’m interested in learning more about
- Second-Order Thinking: This is thinking farther ahead and more holistically. It’s when you can say to yourself, “If I do A, then there’s a chance B, C, and D will happen as a consequence”. It requires us to consider not only our actions (A) and their immediate consequences (B), but the “domino effect” of those actions as well (C and D).
- Sunk Cost Fallacy: This is when you’ve “sunk” money, time, energy, identity, or anything else into something, and your decisions going forward are affected by that previous commitment. Here’s a personal example: I spent the past 8+ months dedicating every spare moment to studying theoretical computer science, web development, and data science, but once I realized that the daily life of a programmer wasn’t for me, I felt the urge to pivot again. This fallacy prevented me from moving forward for almost an entire month… Afterward, I realized this time and energy wasn’t a waste, but a necessity to figure out where I wanted to head next.
- Inversion: This model is a mind-bending one. When faced with a difficult problem, most of us see it from front to back, working our way through the problem to the solution… Inversion flips that around, so you’re looking at the end and working backward. Charlie Munger did a good job summarizing this in a simple quote… “All I want to know is where I’m going to die, so I’ll never go there.”
I hope this more abstract “wandering journey” of learning was interesting and useful!
Consume widely (e.g. technology, biology, physics, psychology, etc.), but more importantly, be conscious of the mental models the brains you admire are using.