Timothy Crouch


impersonal knowledge


Why you should not use ChatGPT, large language models, or other “artificial intelligence” (falsely so-called) tools in your research or work, for any of the synthetic tasks (summaries of data or information, etc.) for which they are proposed as helpful time-savers:

  1. The process of pattern recognition and synthetic integration is the basis of how human beings come to know and understand the world.
  2. This process is an inextricably bodily process in humans. (This is true of human cognition in general: the whole of the human body, not just the brain, is involved in every act of thought — and in fact other bodies are involved, too, because thought is an intersubjective process. But I digress.)
  3. ChatGPT and similar tools are, however, definitionally disembodied. Even if they are in fact “just pattern-recognition machines” (dubious), by virtue of being disembodied their pattern “recognition” is not the same as the real thing in humans.
  4. In fact, insofar as ChatGPT exists in the physical world at all, it has a very different sort of embodiment, a non-organic sort antithetical to the human one.
  5. Therefore, ChatGPT and its kin cannot be trusted to faithfully simulate human knowing, and if the mechanism cannot be trusted, neither can the results.
  6. Additionally, by using such a tool, a human being forgoes the opportunity to practice and experience such knowing, kneecapping his or her capacity to learn from the experience.

In a nutshell: the promise of LLMs is “impersonal knowledge” — but no such thing exists. Relying on it is thus, in a meaningful sense, worse than nothing.