STAR TREK – THE MEASURE OF A MAN
Scientists are currently working toward artificial general intelligence and synthetic consciousness. Society, however, remains deeply conflicted about the notion of consciousness within a machine and often imagines doomsday scenarios. This exact concern appears in the Star Trek episode ‘The Measure of a Man’. Throughout the episode I found myself reflecting on philosophical and scientific themes such as Descartes’ famous mind-body problem, environmental adaptation, and self-improvement.
Commander Data is a successfully functioning android on the Starship Enterprise. His value is brought into question when Commander Maddox requests to disassemble Data in order to duplicate him. The request raises strong reservations regarding Data’s soul, consciousness, values, and rights. The philosophical question arises: “Do machines understand the meaning of things?” The ensuing trial to put an end to the debate is compelling and thought-provoking.
“We have all been dancing around the basic issue: does Data have a soul? I don’t know that he has. I don’t know that I have. But I have got to give him the freedom to explore that question himself.”
- Captain Phillipa Louvois, Judge
This statement concludes the episode. Data is free to make his own decision regarding his disassembly. Data’s decision to protect himself shows his self-awareness and gradual growth toward wisdom.
Captain Phillipa Louvois’ final verdict got me thinking: how can humans be so confused about the state of Data’s soul when they have so little proof of their own?
During Data’s trial, it is argued that sentience requires intelligence, self-awareness, and consciousness, and that Data does not qualify. Commander Maddox argues that “self awareness means being conscious of actions and being, self and own ego”. Through his actions and speech, Data proves to be self-aware: he knows his own composition, his conflicts with his programming, and that he is a unique creation which must be protected. His sense of humor is the cherry on top, revealing personality within an android.
As someone who briefly studied artificial intelligence, I struggled with watching Data be put on trial by Commander Maddox. Everything I have been told is required for a conscious AI is present in Data’s character. For a machine to be ‘conscious’, it must be able to perceive and adapt to its environment in a self-organized way. The argument over whether he has a soul, now that is something I am still struggling with. I have trouble identifying souls within humans, let alone in a robot. I started asking: how does consciousness differ from having a soul? Most of the time those terms are used interchangeably, so if we agree that Data is conscious, how can it be argued that he lacks a soul?
If a soul and consciousness share the same properties, then androids like Data certainly have souls. If anything, theirs are much purer than those of humans.