The Symbol Grounding Problem

Before we get into today’s post, some news:

  • RESURRECTIONS is up for pre-order on Kindle! (If you’re not in the US, it may take a few days before it propagates to your country’s version of Amazon; it took a day before it showed up for me in Canada, for instance.) If you’ve been looking forward to this book, please feel free to pre-order as soon as possible; the timing really does help. If you prefer a physical copy, I’ll let you know when those become available as well.
  • I don’t remember if I announced this earlier, but I’ve been appointed to the SFWA Emerging Technologies Committee, where we’ll do some projects and organizing related to the effects of new technologies – including, but not limited to, generative AI – on speculative fiction authors. It will be a lot of behind-the-scenes stuff that I mostly won’t be able to discuss publicly, but I’m super excited.

And now, some cognitive science!


Every year – and now more than ever – I tell my first-year students about the symbol grounding problem.

It goes like this:

In the orthodox view of cognitive science, information processing consists of representations and procedures. Something – in your mind, in another animal’s mind, or on a computer – creates representations that refer to the world, and uses them to carry out procedures – whether it’s as simple as adding two numbers, or as complex as writing a novel. A representation can be fuzzy and diffuse (as in the brain, where our knowledge is stored as patterns of connection) but without some form of representation, there is no thought.*
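If it helps to see the idea concretely, here’s a minimal sketch in Python – my own illustration, not anything from the cognitive science literature, with made-up names throughout. A representation is a piece of data that stands in for something in the world; a procedure is an operation carried out over that data:

    # A representation: data that stands in for something in the world.
    # (Illustrative names only -- nothing here is from the post.)
    glass_a = {"substance": "H2O", "volume_ml": 250}
    glass_b = {"substance": "H2O", "volume_ml": 100}

    # A procedure: an operation over representations -- here, something
    # as simple as adding two numbers.
    def combined_volume(a, b):
        """Sum the volumes of two liquid representations."""
        return a["volume_ml"] + b["volume_ml"]

    print(combined_volume(glass_a, glass_b))  # 350

The dictionaries refer to glasses of water; the function manipulates the representations, never the water itself.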

All the representations in a human mind share some important properties.

First: all representations are symbolic.

To be symbolic means that a representation is not the same as the thing it represents. (We call the thing itself a referent.) The map is not the territory. As I like to tell my students, words like “water” are representations, but I can’t drink a glass of the word “water.” Whether a representation is a word, a picture, or something more abstract, it refers to something other than itself.
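In the same toy Python as above, the point looks like this – everything below operates on the symbol, and none of it touches the referent:

    # The map is not the territory: the string "water" is a symbol,
    # not the liquid it refers to.
    word = "water"

    print(len(word))     # 5 -- the symbol has five letters; water doesn't
    print(word.upper())  # "WATER" -- transforming the symbol leaves real water untouched
    # Nothing we can do to this string will quench anyone's thirst.

Properties of the symbol (its length, its letters) are not properties of the thing it stands for.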

Second: all representations are grounded.

(Read the full post on Substack)