Language changes. It’s right that it does.
But not yay.
Not for me. I like language as it is. You know, as I learned it. Properly. According to the rules as taught and drilled, tested and modelled by my redoubtable, walking-round-the-classroom-peering-over-your-shoulder middle school English teacher, Miss Martin. Never mind where her rules originated. And never mind how many of her rule-makers were rule-breakers defying their own teacher’s rules. Like Shakespeare.
That’s where I go wrong. Shakespeare. I love him. I revere him. Anyone who doesn’t isn’t breathing. But you can’t just say, ‘All language rule-breakers are wrong except Shakespeare.’ You can’t say that after Shakespeare no one gets to make up 1,700 words or turn nouns into verbs or invent prefixes. Either it is morally ok to mutilate language usage, or it isn’t. So I guess it is.
But just for a second, take that word: ‘usage’. ‘Usage’ should be used to refer only to language, not to things like monthly electricity consumption. The correct word for those things is ‘use’. Not ‘usage’. So you shouldn’t say ‘Your bill shows your electricity usage.’ It’s wrong. You should say, ‘Your bill shows your electricity use’. But try telling that to FirstEnergy, whose passion for anything starts and stops at kilowatts. Or to your friends who wish you’d get, yes, a life.
And then there are the words that mean their opposite, like ‘bad’ for ‘good’, or ‘cool’ for ‘hot’, or ‘hot’ for ‘hot’ as in ‘cool’.
And what about making a word mean something unrelated to what it has meant for centuries? Like ‘neat’. ‘Neat’ used to mean ‘tidy’. Then overnight it started meaning ‘super’, about anything at all. Even sermons.
A person could be ‘neat’, too, even a slob who never made his bed or straightened a single drawer in his whole life.
Worst, there are shockers like: ‘none of them are here’, or ‘to happily accept’, or ‘who do you love?’, or ‘immune from’, or ‘different to’, or ‘bored of’, or ‘amount of things’, or ‘I love you doing that’, or ‘I only want the best’, and masses more.
Or take super shockers like: ‘Me and her are going’ or the great combo: ‘Her and I went with he and the dog’. (I actually heard that once.)
These infractions are 100% not ok with me. For me the beauty of life is the life of language. And language has rules. I want people to obey them because they make sense. They are wonderfully logical. But today’s butchering of the rules enters the speed-of-light-digital vat of proliferation, and instantly wrong usage goes ‘viral’, as in people become illiterate overnight. There should be a word for this massacre. Like ‘linguisticide’. Shakespeare would love that word even if he didn’t agree with it.
Actually, I do it, too. But that’s different. When I do it, it’s a creative act. It is a way to say the perfect thing for the context. It is not ignorance. That’s it. That’s the difference. If you break the rules knowingly in order to be dazzling, it is ok. What is not ok is ignorance of the rules. Or flouting them because you hated your childhood.
And to me it is criminally not ok when ignorance becomes the norm just because a huge population of ignorant others joins in. (Like that: some would say, ‘a huge population of ignorant others join in’.) For me this is serial slovenliness, which, if I remember correctly, is one of the Seven Deadly Sins, as chosen by God.
God, if you ask me, also chose the rules of punctuation, sentence construction, word definition, tense, case, subject/verb agreement and all the other inward, invisible things going on in an outward and visible sentence. They are holy.
They’re not. I know that. And anyway, I am already defeated because the thing I worship even more than Shakespeare is the Oxford English Dictionary (especially in its massive 12-volume version – now 20) which I ‘lived inside’ in the celestial Denison Library at Scripps.
And that’s a problem because the OED allows change. Unlike me, it keeps an open mind about what real people are doing with language. And when it sees sufficient usage of a formerly wrong or made-up word or construction, the OED blesses it. It takes its sweet time with this (except with ‘to Google’, which took only nine years), but ultimately it says wrong is now right.
So I guess I have to, too. I have to stop minding where language is going. It doesn’t matter. What matters is that we use language comprehensibly, coherently and expressively, with a bit of flair. Or maybe a lot.
Therefore, maybe the mistakes I denounce are not mistakes. Maybe they are just life evolving, inexorably, beautifully. Even when a perfectly good noun becomes a perfectly awful verb like ‘transition’, or ‘flip chart’, or ‘handbag’, or ‘action’, or ‘friend’, or ‘coffee’, or, or, or.
This noun-to-verb crime is called ‘denominalisation’, which has today denominalised itself into ‘verbify’. I can hardly even write that. I’m with Benjamin Franklin, who wrote to Noah Webster that denominalisation is “awkward and abominable.”
I’m sure Franklin would have said the same about nouns that verbify by growing an ‘ise’. I thought I would never get over the ‘ising’ (I just did that) of ‘priority’. But I did. I now think ‘prioritise’ is lovely.
On the other hand, if I’d been Franklin’s friend, I probably would have said the same about the now common verbified nouns: to chair, to bed, to seed, to man, to house, to paper, to divorce, to dress, to train, to voice ….
This is all just too hard.
But maybe, maybe, I could imagine watching a Shakespeare play in 1606 and hearing twenty words I’d never heard before because he had just made them up. Like: ‘bandit’. Can you imagine hearing ‘bandit’ for the first time? Or ‘critic’. Or ‘dauntless’. Or even ‘lonely’. Imagine there not being those words in the world, and suddenly there they are! Maybe it is inherently exciting. Maybe there is nothing wrong with it at all.
But I will never ever, ever agree to love ‘like’ when it means absolutely nothing while attaching itself like an annelid worm to the opening of a sentence following the also meaningless ‘It’s’ and ‘you know’: ‘It’s like, you know, really great.’
Or when it spray-shoots its way around a sentence, making it sputter: ‘I am like going to my like friend’s house for like an overnight.’ Dear God.
Ditto the ‘up inflection’ at the end of sentences. Turn me loose on people who do that and, well, don’t. And when ‘like’ and the ‘up inflection’ join forces, I become a danger to society. In my quiet way.
While I’m at it, I am watching warily the gathering tsunami of ‘So’. Six years ago one of my colleagues began an off-the-cuff speech with ‘So’. I kind of liked it. It seemed to honour the previous speaker. It seemed respectful.
But suddenly (and I mean suddenly, not just linguistic geologic suddenly), ‘So’, meaning nothing at all, was launching 25% of sentences. So (here meaningfully meaning ‘therefore’), good luck if you want people to start speaking without saying ‘So’ first, meaninglessly.
Sure, you can argue that ‘So’ simply replaces ‘Well’. But ‘Well’ is also meaningless.
To my pledge:
I will follow in the hallowed footsteps of Shakespeare and the Oxford English Dictionary and let language be whatever it’s going to be.
Bottom line, I will keep my outrage to myself.
One more tiny thing. Sentence Fragments. I used to hate them. I preached that every stand-alone sequence of words had to have a subject (noun) and predicate (verb). Fragments were wrong wrong wrong.
Then fragments showed up in some of the modern ‘greats’. (Of course, Joyce had been flinging fragments around since 1930. But who could miss subjects or predicates while drowning in a wash of run-ons?) By 1990 the ‘greats’ were fragmenting sentences everywhere.
Then one day I noticed that I liked it. I admitted how pacey those fragments could make a sentence. How emphatic. (Like that.)
So now I do fragments guilt-free. Almost. When I handed in the manuscript of my latest book, I ran back under the duvet, hands over my ears. But nothing happened. Fragments are fine, apparently. So I’ll have to be, too.
Regardless, this is the last you’ll hear about it from me.
I solemnly pledge.
Photo by NASA
When you get down to it, death, as a concept, is completely fascinating.
(And in case you are superstitious, you can relax because to think about death is not to bring it on. Some people are scared of that. So they won’t think about it. Really. It sounds stupid, I know. And it is. But I was. And then I got over it. Pretty much. So now I’m good. I know absolutely that death is caused by lots of things, but not by thinking about it as a concept. And if I die immediately after writing this, it will be just a classic case of disentangling ‘post hoc’ from ‘propter hoc’, one of my favourite pastimes. Of course, I won’t be here to enjoy that. Anyway, check with me next week.)
Death is like that: just bringing it up makes people weird. And to discuss it at length, well, it’s hardly worth the ‘um’ look on people’s faces and the high-level finessing they do to morph the conversation into anything else, even taxes or skin disease. Anything.
And that’s odd if you think about it. Given how ho-hum death really is, how arrestingly ordinary, how wholesome, how absolutely-for-certain it is to happen to every single one of us, you just can’t account for people’s freeze when you raise the subject. They might smile in order to look game, but those ‘oh, please, no’ smiles fool no one.
Dying clearly is the right and proper thing to do. So we really could be fine about it. The way we are about living. Conceptually speaking.
But we are entirely not fine about it. In fact, death completely creeps us out. Dying is the thing not to do for as long as possible. And even forever if you are a cryonicist. (I knew one of them. They really are creepy.)
Now you’d think death would be easier to consider if you’re over 70. For one thing, death is not that far off, especially if, like me, your genetic pool is full of ancient people (except for the smokers, who don’t count here but were, of course, lovely in every other way). But no, over 70, 80, 90, even at 101 you can still hate the whole idea of death.
And that’s a shame because death is right there to be considered. It is everywhere hovering, knocking, peering, pacing, jumping up and down at the end of the drive. But we ignore it. Or we shoo it away. We rake the leaves, bury the dog, step on the ants, power wash the wasp holes, pad the subsidence, and look at the stars as if they, and not just their light, are for sure still there. Hardly for a moment do we stop, sit down and just think about all this dying. All this ending.
Probably like you, I was raised to believe that when creatures die, their spirit leaves the carcass to continue life in a non-physical form. We can believe that even dogs, rabbits and dragonflies have spirits. (Not so much snakes, tarantulas, or pond algae.) And I’ve been through at least four phases of belief/nonbelief about the afterlife. Recently I was drifting.
And then I came across these two facts:
1 Everything dies, even the universe.
2 Life cannot exist without death.
When I really think about those two ideas, I fall into a kind of rupturing rapture. I feel almost reverent thinking about the end. Of everything.
Since then I’ve stopped being afraid of the whole idea of ‘over’. I can almost love it. The way you can love an equation that is mostly Greek letters extending the length of a Feynman blackboard. (Not that I could ever produce such a string of wonder. But I am sure I would love it if I could.)
Here is my understanding of those two staggering facts:
Everything dies, even the universe.
Death is popular. Everything does it. Not just us. If it’s alive, it dies. If it’s not alive, it still dies because a) science is undecided about what life is, and so it may well be alive, and b) it ceases to be in its present form, and that is as good as dying. In fact, that is dead. Ceasing to be in your present form is definitely dead. Same for humans. We cease to be in our present form. We become dead.
And again, I appreciate that this physical deadness which is observable does not disprove a non-physical aliveness that is not observable. But that belief in an unobservable ‘non-physical’ possible reality stops us from contemplating the observable physical actual reality which is the absolute end of being. And deeply contemplating that reality is where the challenge and reward lie. At least for me. Thinking about the known reality wholeheartedly is so worth setting down the tools of belief for just a minute.
In detail, what we know is that we are atoms that exist briefly in one form – us. Then when we die, our atoms disperse (stop being us) to the stars, hanging out with other atoms until it’s time to configure into something different. (Life starts and ends in the stars. I love that.)
So, what we know, not just believe, is that everything is physical and everything dies. Even the universe.
And that’s the shocker. That’s the song. That was what got me: Even the universe dies. Even the universe. Everything.
Some astrophysicists say it like this: the universe will ‘turn in on itself’, collapsing into a ‘singularity’ which is how it began. That would be pretty dead if you ask me. Even collapsing back to the odd quark would be as good as dead. You can’t have a great conversation with a quark.
And whether its death turns out to be the Big Freeze, the Big Crunch, the Big Rip, the Big Bounce, or the Big Slurp (that’s my favourite), the universe will die.
Some do say it will take the universe 100 trillion years or more to do this collapsing, which I admit is not tomorrow. But still it is out there waiting to happen. And that means that, yes, I die but so does the universe. Eventually there is nothing at all. I feel liberated by that.
But maybe you don’t. Maybe, as I mentioned, you believe that when the body is dead, the person isn’t. I understand this. I sense the presence of dead people I love, too, every day.
But I increasingly feel that my ‘sense’ of them is my fully alive, inextinguishable memory of them. And that seems as magnificent to me now as any imagined non-physical existence of them did before. They no longer have to ‘exist’ to exist for me.
Why, though, I’ve wondered, was this so hard for me? Endings, most likely. Humans are pathetic at endings. My physician helped me with this when he said, ‘Nancy, old age is not a disease, and dying is not a failure.’ I thought they were.
He is right. Humans won’t allow the absolute ending of something we love, human or universe. But I think we should.
Life cannot exist without death.
I didn’t know that. I had noticed, obviously, that life doesn’t exist without death. But I hadn't realised that it can’t.
It can’t because we breathe oxygen that turns food into energy, keeping our bodies alive. And that same process spins off rogue oxygen molecules (free radicals) that kill some cells faster than they can reproduce. Eventually these ‘wandering’ oxygen molecules destroy our organs.
Seeing that the process that keeps us alive also slowly kills us, I no longer separate living from dying because both are happening at once. We are the living. We are the dying. Both.
The makers of life are the destroyers of life. Life cannot exist without death.
That changed the whole issue for me. It meant that I had to be not just accepting of death, but grateful for it. Grateful for this, nature’s most exquisite symbiosis.
Soon I started giving thanks for death.
Now I cherish the ending because it creates the beginning that becomes the being.
BeginningBeingEnding. (Is this, I wonder, the real trinity?)
With a measure of awe I now honour the sheer poetry of life’s built-in finishing.
Cultivating a Love of Scrutiny
Consider this from James Clear:
Rather than trying to be right, assume you are wrong and try to be less wrong.
Trying to be right tends to devolve into protecting your beliefs. Trying to be less wrong tends to prompt more questions and intellectual humility.
‘Try to be less wrong’. Who ever does that? For one thing, it is a big job. It goes against what we are rewarded for. In school, for example, the point is to get things right. You usually don’t get an A+ by being a bit less wrong.
And in the world of work, being wrong is a disaster. It can get you fired. Being right can get you promoted until you become one of the higher-ups who are often the most wrong and steeped in how right the wrong is. Organisations don’t recruit on the basis of loving to discover how wrong we were. Imagine seeing this in the ‘want ads’:
We are seeking a person who has no investment in being right. They need to be someone who understands that humans get things wrong most of the time, someone who will strive to get things a bit less wrong each day.
Wouldn’t that be refreshing?
Or how about this for a reward path:
People who have the most knowledge and are the best at implementing it will be rewarded only when they demonstrate joy at discovering where they were wrong, and excitement at figuring out what would work better, and then mastering that, eager to discover down the road what was wrong with it, too.
That would be amazing. And that would be at least a nod to reality. Because that is what the real world of knowledge and practice actually is. Mostly we are wrong about what we know, and mostly our actions, therefore, are flawed. Our real job, then, should be to be thrilled to discover the flaws or gaps in the ideas and their action, and to figure out how to improve them, how to be ‘less wrong’ about them. But you’d never know that from the confetti-dropping hullaballoo rewarders make over being right. And the contortions people get into, making the wrong seem right. Power, it seems, punishes scrutiny.
So, how about we stop right now? How about we decide to be thrilled to discover error in what we espouse and do? I know that is tricky. Especially when we adore what we espouse and do, and most especially when what we adore is a body of personal thought and practice.
Adore vs Love
A word on this distinction. I use ‘adore’ here on purpose. We can love an idea and not slip into adoring it. But when we think an idea is so beautiful and workable it should be left alone and not scrutinised, we have slipped into adoration. Adoration is dangerous. Especially, again, when the object of worship is a body of personal thought and practice.
Thought vs Product
And a moment on that distinction. On the one hand, adoring (i.e. not examining) operational or commercial ideas and action associated with, say, a product, or service, or business team, or radio programme, or fashion, is only somewhat dangerous because those ideas and actions are about things outside ourselves. They are objective, practical. So if they are flawed, we will know soon enough because they will not work in the ‘marketplace’. We will get feedback from ‘users’, fast. Of course we can ignore or reinterpret the feedback and stick with the idea/action, but down the road that is likely to cost us.
On the other hand, adoring (not examining) ideas that we hold subjectively and practise personally is chill-makingly dangerous. Usually we ourselves and the people who think just like us and do just what we do are our main feedback system for the idea. And the feedback usually is, ‘This is wonderful!’ And because the ideas are subjective and the practice is personal, and because we adore both, we can’t see the gain from being open to their flaws. In fact, we can easily regard scrutiny of them as something to be feared and fought and denied entry. For certain we rarely know joy at the prospect of seeing where our ideas are wrong or incomplete.
This sliding from love into adoration may well be, as Trisha Lord suggests, a consequence of our having begun to identify with the idea. We begin to adore the idea when it has begun to be part of who we think we are. So deciding to notice flaws in the idea can feel to us like the decision to notice flaws in ourselves, in our very being. This can feel too threatening, and so we quickly develop defensiveness towards any scrutiny of the idea (i.e., ourselves). We fast become wary of, and then resistant to, scrutiny of the idea.
Maryse Barak, reflecting on this reluctance to notice flaws, said: ‘Scrutiny of an idea I love demands my courage, my intent to discover, my understanding that things may fall apart, and that when they do – I don’t. I can keep thinking and letting new information allow new levels of coherence’.
When an idea we love falls apart, we don’t. Yes. Our ideas live in us, but they do not comprise us. (Monica Schüldt, conversation)
Thinking about this reluctance to scrutinise, I would speculate that there are roughly three camps of ‘thought-and-personal-practice’ enthusiasts. When asked how they feel at the prospect that the ideas they espouse might be wrong, they might have one of three answers:
I love it every time I see something new we had missed, or something wrong with what had seemed right. I love figuring out what would work better. I look forward to those moments of discovery.
I am happy when changes to the thought and practice are made, but I feel frightened when I am staring an error or new feature in the face. I do not keep my eyes open eagerly for those moments of discovery.
I wish we would stop trying to improve this and just get on with espousing and practising it. These ideas and practice are so good, they don’t need to be better. I don’t enjoy scrutinising them. Practising and teaching and spreading them are what give me joy.
Clearly most idea-adorers and practitioners are in the third camp. But even idea lovers in the second camp only sit on the shores of scrutiny.
And yet – and this is what matters so much to me – only in scrutinising the body of thought and practice we love, only by noticing and discarding inferior features, and discovering new vigorous ones, can we protect its viability. Only when we ruthlessly examine it ongoingly does its best form have a chance to emerge and endure. Only then can it gradually make a down-the-road, pervasive, positive difference in the world. Without this joy in finding the flaws and gaps, and correcting them, we almost ensure its relegation to the shelves of ‘interesting’ but easily sideline-able notions.
We also need caution. As we adopt scrutiny as an essential feature of honouring the ideas and practice we love, and as we then build changes into them, we need to join Monica Schüldt in scrutinising that very act of change. She would ask: ‘How can we be sure that what we think are improvements to the body of thought really are improvements? How do we ensure we do not replace what was actually better with something new which we have put on a pedestal only because it is new?’
Enter group size.
Now things get serious.
It is hard enough and important enough to ‘want to notice where we are wrong’ when we gather in small groups (probably six or fewer) to celebrate, learn more about and confirm the efficacy of a set of ideas and personal practice. But when we gather in large groups, the scrutiny of those ideas and practice can become nearly impossible. Large groups in my experience almost always discourage scrutiny. And the larger the group the less inclined its members can be to prove themselves wrong.
Here’s one possible reason why. If we hold dear an idea, and if that idea has, as Lord suggests, become part of our personal identity, we can feel a need for it to be completely right. So the more people who gather to say it is right, the more we can feel confident that it is right. The more people who think so, the more likely it is so. And then before you know it, ‘self’ identity has become ‘us’ identity. Soon, as Claire Andrews said, ‘we have begun to care more about “us”, than we do even about the original idea. The very idea of “us”, whenever that emerges, can discourage scrutiny of the idea that drew us together originally because “us” suggests a “them” and encourages barriers and harder structures - real or imagined. So there's a problem not only with how the group sees itself and its “group idea” but also with how it relates to the wider world.’
Add this ‘the-us-has-to-be-right’ phenomenon to our having been practically bludgeoned each time we pointed out flaws in a revered idea growing up, and you can almost guarantee that espousers of a loved idea will meet in large groups to espouse it further, and soon to adore it, certainly not to question it.
There is a feeling aspect to all of this too, an insidious one. By collectively embracing the idea and practice, we not only get to feel right, we also get to feel good. Looking around and seeing how many people there are who share our support of this idea and, therefore, how right the idea must be and, therefore, how good we must be, too, for advocating it makes us feel really really good. And when we all feel really really good as a group, that really really good feeling can make us think that meeting in larger and larger groups to promote the idea is an actual need that should be filled. So we proceed to create larger and larger groups of espousers.
What’s so bad about feeling really really good in large groups?
Andrews again: ‘Large group euphoria keeps people from scrutinising the ideas that make them euphoric’. In large groups the temptation to coddle, bow to and just not see the dicey bits of the ideas and practice we adore is fierce. We seem to be systematically socialised against scrutiny.
But without that scrutiny the body of thought and practice eventually stagnates, codifies, stops breathing. The larger the group, the easier the kill.
Some go even further and argue that scrutinising in large groups goes against human nature. Ruth McCarthy’s take on this view is particularly cogent, I think. I would paraphrase her this way:
When it comes to groups, human nature makes us prefer belonging to scrutinising. Of course, in cellular evolution Nature does produce random new ‘ideas’ (mutations), and the errors in those ‘ideas’ keep the organism from reproducing, so only the ‘good in the idea’ survives long term. In that way Nature does love to find errors and dispose of them. And therefore, at the cellular level, scrutiny can be said to be natural.
But in humans this ‘biology of scrutinising’ conflicts with our 'psychology of belonging’. Observably the natural drive to belong in groups usually overrides the natural need to scrutinise.
Therefore, we are never going to agree to develop in ourselves a love of scrutiny, especially in large groups, because that would seem to threaten our belonging.
So we choose to belong. Not to scrutinise. Just look around.
When we do, we see this ‘need to belong’ overriding the ‘need to scrutinise’, often horrifyingly. We see it in political rallies of all sorts, ordinary ones, and dramatic ones like Nuremberg 1933 and Washington 2021. We see it in churches, too, especially megachurches. We see it in personal growth movements. We see it also, of course, in cults from Jim Jones to QAnon.
Can you imagine a speaker at such a large gathering saying, ‘We are almost certainly wrong about a lot of this. We are here today to discover where.’
It just won’t happen because the point of the large gathering is implicitly to adore this ‘now’ version of the idea, not to question it for the sake of its future. And that’s the concomitant problem: we don’t usually think in ‘futures’, in ‘long runs’. We think in ‘nows’ and ‘next weeks’. Maybe ‘next quarters’. But ‘long runs’ are so, I don’t know, boring? Ungraspable? Un-now?
And yet, ‘long runs’ are all that an emerging good idea and practice can count on. It is the long run of an ever-scrutinised good idea that allows it to change society. Democracy, universal education and freedom of speech seem to me to make the case for scrutiny-as-friend.
So, if I were an emergent idea and practice, and I wanted to make a long-term positive difference in society, I would want to be questioned ruthlessly, improved and questioned again. And I would run like mad from large groups of adorers.
I would be not quite as concerned about small groups. Although humans can still be nervous in small groups, there is usually less face for us to lose by being different there, by questioning things, by looking for oddities, by saying, ‘that’s funny,’ and wanting to look harder to see the aberration. Small groups are usually less intimidating, less punishing. And so less ‘sameifying’. They are less euphoria-making. In a small group a loved idea stands a better chance of being developed, tested and persistently freed of its flaws.
In considering this challenge, Laura Williams pointed to the words of Steve Carroll: ‘To know a thinking environment [an environment of scrutiny] is to experience the frightening freedom of breaking from the herd.’ She went on to ask, ‘How do we ensure we can do this when we are in large groups? Can we break from the herd and still be in the herd? Does the herd move with us?’ Interesting questions.
So if it is the integrity and longevity of a body of personal thought and practice we want, we probably want to think a hundred times before loving it into adoration, and before fanning that adoration in large groups.
In the face of this challenge I would argue that our ‘biology of scrutiny’ is every bit as robust as our ‘psychology of belonging’. We just need to summon it and give it a workout. It will soon lift a tonne. In doing that we will know joy, true joy, the joy of embracing an idea by stepping back from it and seeing it, by rejoicing in the discovery of its flaws, or its formerly invisible features, that suddenly change our understanding of it. This joy in noticing the need for change is unmitigated joy because it is honest. It is eyes wide open. It is humans awake.
Also, joy in ‘that’s funny!’ moments sits beautifully with joy in ‘that works!’ moments. In fact, I think we can be truly joyous only if we love both being wrong and being right. People who know joy only in being right worry they might be wrong; people who know joy only in being wrong worry they might be right. People, on the other hand, who take joy in both don’t have to worry. They can just walk forward with a lilt, keeping a lookout.
So I would like to see our world feel really really good about being really really good at scrutiny. And especially at the scrutiny of the very ideas we are afraid to scrutinise, the ones we love.
And I would like for the world to start every group gathered around a loved body of thought and practice with the injunction to scrutinise it as we go. Monica Schüldt again said it well:
‘What do we do when we meet? We question.’
I long for us all to wait impatiently for the moments when we spot the anomaly or gap, face it and say, ‘Hooray, we were wrong!’