Everyone who wants to take big, significant action in the world understands that intellectual rigour is important.
You need evidence, you need data, and you need people who can think well and manage their emotions so they can act in a clear, unbiased fashion.
We’ll call this cluster of skills intellectual rigour.
It runs the world, and being really good at it makes you a more effective world-improver.
We have role models who inspire members of the community to constantly get better at it (e.g. Eliezer, Scott, Nate Soares, gwern).
The academic/EA/rationalist communities are very good at attracting people who, just like their founders, understand the value of intellectual rigour and are committed to increasing their mastery over it.
They say that we need to take responsibility for our flawed, bias-prone human nature and improve our thinking, so that we can take effective action that is aligned with outcomes we want to achieve.
Without intellectual rigour, we’re prone to fall for our cognitive biases, generate wrong conclusions, and over-rely on intuitions and emotions to make decisions. Intellectual rigour also scales well and is teachable.
This is why some of our smartest and wealthiest people are investing in or hiring people who display a lot of intellectual rigour, creating an intellectual elite that is now shaping the culture of impactful action and world improvement.
But although we’re increasing the world’s capacity for intellectual rigour, the outcomes don’t suggest that we’re moving towards an optimal state of impact.
It’s common to hear people say that the science community is broken and that the culture of academic communities rewards the wrong incentives.
EAs and rationalists initiated a lot of important, intellectually rigorous projects in the world (MIRI, OpenPhil, FTX, LessWrong) but I would argue that they struggle to bring about significant change on the systems level.
They are largely operating within the system rather than creating new systems that could give us confidence against x-risks. Implicit evidence for this is the amount of urgency and doom felt in the community right now (people don’t feel like we’re making real progress against x-risks), or the amount of criticism the EA community receives over burnout.
It often seems to me that there is an unwholesomeness, an imbalance, in those communities, and it bugs me because I believe in their values and agree we need to do the best we can to prevent civilisational collapse.
What is culturally missing?
My take is that we are failing to acknowledge that leveraging the natural skills and ‘technologies’ that come with our human nature requires just as much rigour as countering the flaws in our thinking.
We all have a prefrontal cortex and, with it, a latent ability to carry out complex calculations, predictions and large-scale planning.
But we would all agree that merely having this machinery is not enough to use it effectively – we need to actually train it. That’s why we agree that people should go to school, learn to tell good evidence from bad, and train critical thinking, and why we’re willing to spend money and time on rationality training (e.g. CFAR).
In the same way, we also have powerful human skills – intuition, empathy, interoception, emotional signalling – but we take them for granted or dismiss them far more readily.
We kind of just expect to be good at them because we are human. But just as with intellect, we can develop a culture of practice around these skills and improve them, so that they work for us, not against us.
I’d like to call this space of human skills another kind of rigour – relational rigour – one that complements and balances the intellectual kind.
Relational means focused on building, nurturing and growing high-quality relationships: to oneself, to others, to non-humans and the general world around us. We do not often think about it explicitly, but we are constantly in relation to everything around us.
And if we do not have the tools to examine those relationships and improve them, we might end up working on something that seems important but doing it in the wrong way, because we’re not relating to it in a healthy way.
And that has consequences for how effective we can be.
I think that noticing this, and having the tools to relate to things well, can significantly improve our experience of life, and therefore our ability to take effective action in the world.
We currently have many unproductive ways of relating to this type of rigour:
- some people outright reject these skills as useful (usually, the more intellectually rigorous the person, the truer this is)
- some people repress them to achieve their goals and “be rational”
- some people take on structured paths to explore them and re-integrate them into their life (e.g. doing therapy, taking a sabbatical, going on retreats)
- some people (especially those who feel wronged by or disappointed in the Western world and intellectualism) make these skills their new god, becoming spiritual gurus who teach others how to free their minds and bodies from the constraints of Western culture (the New Age corner of social media)
None of these is a way of relating analogous to what scientists and rationalists did with intellectual rigour: systematising it, creating tools to learn it, building communities that help people gain mastery, and creating organisations that spread the skill at scale.
We need a new culture that will help us reliably develop and scale relational rigour, so that it can complement our existing intellectual rigour in our quest to improve the world. Some open questions:
- How can we measure progress in relational rigour?
- What are the highest-leverage tools for building relational rigour? (for individuals, for communities, for orgs)
- What’s the right balance of relational and intellectual rigour? How do we know when to use which?
- Are there other kinds of rigour?
- How can we get people who value and already have a lot of intellectual rigour interested in learning relational rigour? How do we present it so that it does not feel like a threat to their intellectual rigour?
If this resonates and you’d like to chat more, reach out at firstname.lastname@example.org