Tom Ritchford
Nov 25, 2021


You can’t just emit the words “straw man” in response to a complete argument and expect us to understand which part of it you’re referring to!

Did you respond to the wrong article, maybe?

Let’s summarize.

I claimed that GPT-3 specifically does not have intention, wants or desires.

You claimed I was making a “common mistake”, and your argument appears to be that internal states are entirely opaque to everyone else, so we can’t say it isn’t happening.

My response was two-fold:

  1. “You can’t say it isn’t happening” is insufficient to prove an extremely strong claim, specifically, “The text-prediction neural net GPT-3 has intention, desires and wants.”
  2. In humans and even animals like dogs, we can interrogate their internal state in different ways, and they spontaneously reveal their inner state without being interrogated.

Neither of these is a straw man. They’re extremely reasonable and general objections.
