Everyone who wants to take significant action in the world understands that intellectual rigor is important.
You need evidence, you need data, and you need people who can think well and manage their emotions so they can act clearly and without bias.
That is intellectual rigor.
It runs the world, and being really good at it makes you a more effective world-improver.
The academic/technology/EA/rationalist communities are very good at attracting people who understand the value of intellectual rigor and are committed to deepening their mastery of it.
The core case for intellectual rigor goes as follows:
- as humans, we are rational beings, but our ability to use our reason can be blocked by a slew of cognitive biases or unresolved emotions
- being reasonable and rational is a good thing that we want more of to make good decisions
- therefore, it is our responsibility to develop ways of being and thinking that recognize these blockers and counteract them effectively.
Without intellectual rigor, we’re prone to falling for our cognitive biases, drawing wrong conclusions, and over-relying on intuitions and emotions to make decisions. Intellectual rigor also scales well and is teachable.
This is why some of our smartest and wealthiest people are investing in or hiring people who display a lot of intellectual rigor, creating an intellectual elite that is now shaping the culture of impactful action and world improvement.
But although we’re increasing the world’s capacity for intellectual rigor, I am not convinced that the optimal course of action is always to apply rationality and reason to things.
There are important signals in our emotions, experiences, and desires that seem irrational, and by not knowing how to interface with them (or by being outright scared of them), we end up making decisions only on the basis of what we can understand or grasp.
It’s common to hear people say that the science community is broken and that the culture of academic communities is shaped by the wrong incentives.
It seems to me that as a researcher, you can either do research on what gets funded, or do research on what really interests you and face a lot of risk and uncertainty as an independent researcher. There are even stronger incentives to drop research results that do not fit expectations (or that would be outright uncomfortable for some people or organizations).
Similarly, it seems to me that EAs and rationalists have initiated a lot of important, intellectually rigorous projects in the world (GiveWell, MIRI, OpenPhil, LessWrong), but I would argue that they struggle to bring about significant change at the systems level.
They largely operate within the system rather than creating new systems that could give us confidence against x-risks. One piece of implicit evidence for this is how much urgency and doom is felt in the community right now: people don’t feel like we’re making real progress against x-risks. Another is the amount of criticism the EA community gets over burnout.
There is some unwholesomeness and a lack of balance in those communities, and it bugs me because I believe in their values and agree that we need to do the best we can to prevent civilizational collapse.
What is culturally missing?
My take is that we are failing to acknowledge that, just as we need rigor to counter the flaws in our thinking, we equally need rigor to leverage the natural skills and ‘technologies’ that come from our human nature (psychotechnologies is a term for these that’s gaining popularity; if you’re intrigued, a great place to start is Eliot’s and Daniel’s podcast Psychotechnologies Live).
We all have a prefrontal cortex and a latent ability to carry out complex calculations, predictions, and large-scale planning.
But we would all agree that merely having it is not enough to use it effectively; we need to actually put work into training this machinery.
That’s why we all agree it’s good to attend school, know how to tell good evidence from bad, train critical thinking, and even spend money and time on rationality training (e.g. CFAR, Farnam Street, any good executive coaching, etc.).
But equally, we have powerful human skills, including intuition, empathy, interoception, and emotional signaling, yet we take them for granted or dismiss them much more readily.
We kind of just expect to be good at them because we are human.
But just like with the intellect, we can develop a culture of practices around these skills and improve them to actually make them work for us, not against us.
I’d like to call this space of human skills another kind of rigor: relational rigor, one that complements and balances the intellectual kind.
Relational means focused on building, nurturing, and growing high-quality relationships: to oneself, to others, to non-humans, and to the world around us at large. We do not often think about it explicitly, but we are constantly in relation to everything around us.
And if we lack the tools to examine those relationships and improve them, we might end up doing something that seems important but doing it in the wrong way, because we’re not relating to it in a healthy way.
And that has consequences for how effective we can be.
I think that noticing this, and having the tools to relate to things well, can significantly improve our experience of life and therefore our ability to take effective action in the world.
We currently have many unproductive ways of relating to these skills:
- some people outright dismiss them as useless (usually, the more intellectually rigorous the person, the truer this is)
- some people repress them to achieve their goals and “be rational”
- some people take on structured paths to explore them and re-integrate them into their life (e.g. doing therapy, taking a sabbatical, going on retreats)
- some people (especially those who feel wronged by or disappointed in the Western world and intellectualism) make themselves into a new god, becoming a spiritual goddess/god who teaches others how to free their minds and bodies from the constraints of our Western culture (the New Age stuff on social media)
None of these is a way of relating analogous to what scientists and rationalists did with intellectual rigor: systematizing it, creating tools to learn it, building communities that help people gain mastery, and creating organizations that spread the skill at large scale.
We need a new culture that will help us reliably work on and scale relational rigor, so that it can complement our existing intellectual rigor in our quest to take action to improve the world.
Further questions I’m exploring:
- How can we measure progress in relational rigor?
- What are the highest-leverage tools for building relational rigor? (for individuals, for communities, for organizations)
- What’s the right balance of relational and intellectual rigor? How do we know when to use which?
- How should someone concerned with relational rigor navigate the mainstream intellectual and productivity culture? How do we play the game in between?
- How can we make people who value and have a lot of intellectual rigor interested in learning relational rigor?
- How do we present it so that it does not feel like a threat to their intellectual rigor?
If this resonates and you’d like to chat more, reach out at sasobanska@gmail.com.