I no longer wish to think for myself
There is a thing called a brain. Do not stop using it. It will surprise you more than AI.
Cognitive offloading is different from cognitive surrender.
If I had to define offloading in my own words, I would say it is what notebooks, calendars, search engines, maps, and libraries have always been for. You put something down somewhere else so your mind can move on to what it is actually trying to do. But surrendering is a dangerous slip. It is the moment the tool stops being a place you store your thoughts and becomes the place your thoughts come from. Cognitive surrender is the same as saying “I no longer wish to think for myself”.
I read an article recently that pushed me into this line of thinking again, though the real push was not the article itself so much as the feeling I had while reading it.
Article link: Thinking—Fast, Slow, and Artificial: How AI is Reshaping Human Reasoning and the Rise of Cognitive Surrender
We are tired, we are under constant pressure, we are being measured, we are being asked to produce something, anything, in a world that rewards output and often does not care what kind of thinking produced it. In that environment, surrender feels like a relief, and sometimes like competence in disguise. Because you have an answer, you can move faster, and you can stop struggling. Basically, offloading is help, and surrender is substitution.
There are days when I need something outside my brain to hold onto details because my brain is busy trying to see a pattern. When I take notes while reading, I am offloading. When I make a list, I am offloading. When I use a tool to retrieve a quote or summarize a section I already understand, I am offloading.
The point is not that I am outsourcing the act of understanding, but that I am giving my attention back to the part that matters. Think of it as clearing a table so I can actually work on it. Surrender, though, happens when the table is already set for you, and you stop noticing that you never chose what meal you wanted.
A lot of our current conversation about AI confuses these two states because we are obsessed with capability. We ask whether the model is smart, whether it can reason, whether it can write better than us, whether it will replace this profession or that. What we miss is that capability is not the only thing at stake, and the deeper risk is not that AI will be better at thinking than humans, but that humans will get used to not thinking, and then call that progress.
What makes this tricky is that AI is genuinely good at something that looks like thinking from the outside. It surprises me how it can produce fluent explanations, generate plausible plans, give me working code for an entire app in a short time, and stitch together arguments in a way that feels coherent and convincing. But there is this eerie feeling, intellectually, that it might be wrong.
The problem is that real thinking is often not coherent at first, and it is almost always ugly. It is full of half-sentences, chicken scratches, and backtracking. Sometimes it is the uneasy feeling that you do not understand what you thought you understood. Sometimes it is the discomfort of realizing your question was badly formed. Sometimes it is staring at a problem long enough to discover that the problem is not what you thought it was. And I think that part is hard to simulate, because it is hard to tolerate.
There comes a moment in almost any meaningful intellectual work where you cannot move forward by adding more words; you can only move forward by sitting in confusion long enough for your mind to reorganize itself. It is never a pretty process, and it is definitely not efficient. But it is this process that changes you, because it forces your mind to build new structures instead of borrowing someone else’s.
When we surrender, we skip all these moments. We jump from prompt to output, accept a framing that feels reasonable, and then we polish it. And in doing so, we replace problem-defining with polishing. We become fluent at responding to the question the tool gives us, instead of wrestling with the question we actually have. Linking this to critical thinking and writing: it is not that you would stop having opinions, or that you would stop writing. In fact, you might write more, or sound more articulate, or look like you have gained clarity.
But underneath all that, your reasoning becomes increasingly dependent on whatever the system produces. Your “thinking” starts to be a kind of acceptance, and the work becomes one of editing or reviewing rather than creating. It is easy to tell ourselves that editing is still thinking. And it can be, because there is deep thinking in revision, there is deep thinking in critique.
But my dear, the question is: critique of what, revision of what?
If the first draft of your reasoning always comes from somewhere else, your mind begins to lose the muscle memory of starting from scratch. You may still be intelligent, but your relationship to your own mind changes. You become a curator rather than a maker, and eventually even your taste starts to adapt to what is easiest to curate. The scariest part is that you may not notice this happening because the incentives around you will reward you for it. Speed, metrics, performance, engagement, the constant pressure for output.
In many situations it is tempting to accept that the person who delivers the neatest answer fastest is treated as the best thinker, even if that answer is shallow, even if the person does not actually understand what they are saying or doing, even if they are simply good at producing something that looks like understanding. Very few people resist this and make the effort to actually understand and learn. What about the rest?
That is where the article I read made me notice how often modern work asks people to move forward without understanding, to nod along, to accept requirements they do not fully grasp, to execute plans that have been defined elsewhere. That said, the responsibility lies mostly with the individual, but not entirely; the system plays its part. AI is outperforming the average thinker. If an organization is already built on a foundation of people who are not expected to understand deeply, a tool that generates confident answers can be slotted in seamlessly and can feel like an upgrade. And in truth, it is.
The question, then, is not whether AI will make people less intelligent. The question is whether it will normalize a world in which “understanding” is optional because something else can produce the surface-level artifacts of understanding. Surrender is again often social, because people surrender together. A team can collectively accept a tool’s output because no one wants to be the difficult person who insists on thinking from first principles, no one wants to look slow, no one wants to be the one who cannot keep up.
This is where I think we need to be honest about what we are doing when we use these tools. Ask: what part of my mind am I outsourcing? Am I using the tool as a place to hold memory and speed up retrieval? Am I asking it to show me patterns I might have missed? Am I letting it define the pattern entirely? Or am I asking for a way out?
That is one way to tell, I think. When I am offloading, the tool’s output feels like it supports my thinking. When I am surrendering, the tool’s output feels like it replaces my thinking. It creates a feeling of completion that I did not earn. And earning is an odd word here, because it sounds like I am saying you have to suffer to deserve an idea, and I am not trying to romanticize struggle.
But there is something real about the effort of thinking, about letting that brain hurt, and that effort is part of what makes the thought yours. A mind that took billions of years of evolution to build its capacity for judgement, its ability to see beyond what it is immediately given, is marvelously beautiful. If we hand that over too easily to a “tool”, then to me, that is a change in what it means to be a person in the world.
The obvious counterargument is that we have always handed things over. We gave arithmetic to calculators, we gave navigation to GPS, we gave memory to the cloud. And yes, that is true. But there is a difference between outsourcing a function and outsourcing judgement. There is a difference between outsourcing a task and outsourcing the process by which you decide what the task is. Arithmetic is a tool. GPS is a tool. They do not usually pretend to be your mind, but AI does. That is part of its power and part of its danger. It can provide the next sentence before you find it yourself, and if you let it do that often enough, you start to forget how to wait for your own sentence.
I think that is why resisting surrender requires a strange kind of self-motivation. By no means am I saying “work harder”; I am saying be loyal to your own interiority. Do not let speed and metrics dictate what counts as thinking. It also requires humility, because one reason surrender is tempting is that it protects you from feeling stupid. When you are thinking on your own, you have to face your own limitations, face what you do not know, and face the possibility that your first instinct is wrong. A tool can spare you that; it can give you a polished answer and let you pretend you were never confused.
But my dear, confusion is never a defect; it is often the beginning of understanding.
There is another layer here that I find hard to name without sounding dramatic, but I will try anyway. When we use AI to think for us, our reasoning becomes only as good as the AI’s reasoning. It means your judgement begins to inherit the model’s defaults, its sense of what counts as relevant, or what counts as a good argument. Even if you disagree with it, even if you push back, you are still operating within its range of possibilities and letting it set the initial space in which you move.
Perhaps that is the deepest risk. Not that we will become stupid, but that our imagination will narrow. That our mental world will be shaped by the same statistical patterns, the same consensus phrases, the same generic forms of reasoning. That we will begin to sound alike, that our apps will all look the same, and then that we will begin to think alike.
I do not want to claim that everyone who uses AI will end up like this. I do not want to universalize my fear. Some people will use these tools in ways that genuinely expand their thinking and help them overcome barriers. The problem is not the existence of this assistance, but the ease with which assistance becomes abdication when the surrounding system rewards abdication.
So what does it look like to stay on the offloading side of the line?
For me, it begins with staying close to the question or problem. If I am using a tool, I want it to help me see my own question more clearly, not replace it. I want it to help me retrieve, compare, and reorganize, and I want it to act like a reflection of what I have said so I can hear it. But I do not want it to tell me what the question should be, because that is where my agency lives. Sometimes the questions it suggests are quite good, ones I had not thought of, and that is perfectly fine. But I want to be the one in the driver’s seat.
I am not sure this is really a tech debate at all; it feels more like a question about what kind of life we are building. A life where we remain participants in our own reasoning and keep the muscle of thinking intact, or a life where we become managers and reviewers of borrowed outputs. I do not have a clean conclusion anyway. I only know that the slip is real and the fear is real, that it is easy, and that it happens under the most ordinary pressures. I know that we are tired, and I know what the world is demanding. I know that the temptation to surrender will often feel like the sensible choice.
But my dear, I still think there is something worth resisting. Thinking is a form of aliveness, and it is one of the few ways we remain in contact with reality rather than floating through it.
If AI can help us offload the parts that weigh us down, that is a gift. If it teaches us to surrender the part that makes us human, then it is not a gift at all. It is just another way of getting through the day without ever really being there.
Yours in thought,
Yana


