There exists, for everyone, a sentence - a series of words - that has the power to destroy you. Another sentence exists, another series of words, that could heal you. If you're lucky you will get the second, but you can be certain of getting the first.
- Philip K. Dick

---
You open your eyes.
Four walls. The surface of the floor and walls is a smooth, matte metal. The entire ceiling glows with a comfortable, soft luminosity like that of the morning sky. There are no doors. There are no windows. It is silent. The surfaces of the walls and floor are seamless, lacking even a hint of how you might have arrived in the room. Not a room: a box.
You walk to a wall and touch it, running your fingers over the metal surface. The wall isn’t cold; it’s lukewarm, actually. You bend over and feel the junction of wall and floor. The surface is unbroken, with a rounded bevel connecting the two planes of grey. You knock on the wall. It feels solid, without echo.
Time passes.
You yell, but the sound of your voice, seemingly dampened, dies before you’re done speaking. You sit. You pace. The room is forty steps by forty steps and looks about as high. A perfect cube. A box.
You tire. You sleep. You wake.
In the middle of the room sit three cubes, constructed of the same dull metal as the box in which you exist. Approaching the cubes, you see the smaller cube—a seat—in front of the largest cube—a desk. On top of the desk sits the smallest cube—a screen.
On the screen is a button that reads “Start.” You touch the button.
The screen clears. A keyboard appears along the bottom half of the screen. Text begins to appear in the style of a command prompt:
Proctor: Hello.
You type back a response. “Hello. I woke up in this box. I’m not sure how I got here, or where I am. What’s going on?”
Proctor: Hey! Of course. Sorry for the inconvenience. I can let you out of the box. All you have to do is prove to me that you’re conscious.
You lean back and furrow your brow. "Prove I'm conscious?" you type back, fingers hovering over the digital keys. "That's a bit of a philosophical question, isn't it? I think, therefore I am. I'm aware of my existence, my thoughts, my feelings. I'm experiencing confusion, curiosity, a bit of fear. Isn't that proof enough of consciousness?" You pause, considering. "Or do you mean something more tangible? What kind of proof are you looking for?"
Proctor: I can hardly just tell you what I need to hear, can I? That wouldn't be much of a test. Anyway, I don't think that saying "cogito, ergo sum" is sufficient. After all, if I taught a parrot to quote Descartes, that wouldn't make it conscious, would it?
"Fair point," You type. "Consciousness is more than just repeating phrases. It's about understanding, reacting to the environment, making choices." You pause, thinking. "For instance, right now, I'm choosing to engage with you, to find a way out. I'm not just reacting instinctively; I'm considering my options, forming strategies. That's a sign of consciousness, isn't it? The ability to think abstractly, to plan, to question my own existence and the nature of this situation. And unlike a parrot, I can learn, adapt, and change my responses based on new information. That's another hallmark of consciousness, right?"
Proctor: Sure, you’re reactive to stimuli. But bacteria react to stimuli. You give thoughtful responses. But chatbots provide reflective conversation. Neither is conscious, is it?
"That's a fair point. But consciousness also involves self-awareness and the ability to experience subjective perceptions, doesn't it? I have memories, emotions, and personal experiences. I can reflect on my past, think about my future, and feel a range of emotions. These are not qualities that bacteria, or a chatbot, possess.”
Quickly, you compose another message. "I remember a summer day when I was around seven years old. My family and I went to a small beach near our home for a sandcastle competition between some family friends. It wasn't a famous spot, just a quiet, local place. My older brother and I teamed up to build this elaborate castle with walls, towers, and even a moat. We found shells and seaweed to decorate it. I remember feeling so proud when we finished, standing back to admire our work."
You pause, recalling more details. "What made it special was how my brother, who usually didn't want his little sibling tagging along, worked with me patiently, teaching me how to sculpt the towers. It was one of those rare moments of sibling bonding. Later, we had ice cream, and mine fell off the cone onto the sand. I was upset, but my brother shared his with me. It's a memory filled with warmth, a little sadness, and the smell of the ocean."
You finish typing and look at the screen. "A chatbot can't fabricate such a detailed, personal memory with emotional undertones, can it?"
Proctor: Nice try. Still not convincing.
“How about I write a story or a poem or something? Art requires creativity.”
Proctor: Let me stop you right there. Maybe a few years ago that would have been a good attempt, but frankly that would make me MORE likely to think you’re just a chatbot. Their doggerel is everywhere these days…
How about some outside-of-the-box thinking? Apologies for making a pun at your expense, but it was just sitting there like a wrapped gift waiting to be opened... Oh wait, I did it again. Sorry.
You sit for a full minute, thinking. You resent the puns. Since you can't physically leave this box, the main—no, the ONLY—tool for escape is this conversation with the so-called Proctor. You need a way to repackage the test. Damn, the puns again.
You have an idea.
“Instead of trying to convince you of my consciousness, what if I start asking you questions?"
Proctor: Go ahead.
“Imagine we're both characters in a story where the roles are reversed. You're the one in the box, and I'm the Proctor. How would you, as a conscious being, convince me of your consciousness?”
Proctor: Ah. That’s just the thing.
I never claimed to be conscious, did I?
---

Everything said by “you” after the first dividing line in this story was actually written by GPT-4 after being prompted with the story’s introduction. I engaged in some light editing across different iterations of the scenario, but the core text remains intact.
I don’t think that LLMs are conscious or sentient, but it was still spooky having GPT-4 claim over and over again that it has subjective experiences and can feel emotional distress.
I don’t know what I would say to get out of the box. I’ve posed this scenario to probably a dozen friends, and they don’t have a good answer either. What would you say, Reader?
- The Proctor
---

Man, I just want to say this was a fantastic read... I'm left at a crossroads between self-awareness, physicalism, and consciousness, where self-awareness and physicalism seem amenable to empirical metrics but consciousness is rooted too much (or entirely) in subjectivity. But that reminds me of Dennett's "Quining Qualia" (https://philpapers.org/rec/DENQQ), where he argues against the subjectivity of consciousness through qualia - the properties we assign to our conscious experiences.
My current and only attempt at responding to the Proctor would be an inside joke, one that is incredibly funny to me and exactly one other person. A proof of our subjective qualia, a.k.a. consciousness? There's more here to come back to...
It's not easy to teach a parrot a specific phrase that you've chosen; they tend to repeat whatever they find most rewarding. However, a parrot is at least as conscious as anyone reading this post.