[GUEST POST] Jadah McCoy on The Curious Case of Singularity

Jadah McCoy lives in Nashville, TN, and works as a legal coordinator. When not babysitting attorneys, she can be found juicing her brain for creative ideas or fantasizing about her next trip out of the country (or about Tom Hiddleston as Loki – it’s always a toss-up when she fantasizes). Her new novel is called Artificial. You can find out more at her website and by following her on Twitter as @theQueryFaerie.

Humanity And The Curious Case Of Singularity

by Jadah McCoy

Humanity | noun | hu·man·i·ty | \hyü-ˈma-nə-tē, yü-\:

  1. the quality or state of being human;
  2. the quality or state of being kind to other people or to animals;
  3. all people.

What does it truly mean to be human, to have humanity?

It’s a highly debated quality, sought among other species, and even sometimes among our own species. In a way, isn’t humanity truly just self-awareness, sentience, intelligence, a sense of empathy for other beings?

A recent article reported that India has declared dolphins to be “non-human persons.” We call it “humanity” because human beings seem to be of the opinion that no other creature can feel as we do, think as we do, communicate as we do.

Sure, maybe dolphins don’t have the Grammys, and maybe they don’t debate the effects of eating organic fish versus pollution-tainted fish, but does that make them any less intelligent? Just because your dog barks instead of speaking English, does that make you understand its expressions any less? Just because an android is made of metal and code, does that make the emotions it experiences any less real?

So then, if something looks, speaks, acts, and feels like a human being, and the only discernible difference is the material it is made from, where is the line of humanity drawn?

Within a couple of decades, we’ve gone from flip phones to iPhones, from dial-up to wi-fi, from virtually no connection to everyone, everywhere, being connected all the time. Technology is expanding at a rapid rate, and how is no longer the question; it’s when.

Take Siri, for example, the joyful voice in your phone who tells you jokes and calls you funny names. Helpful, right? Could this piece of software one day decide she’s tired of repeating the same “the past, present, and future walk into a bar; it was tense” joke and go all I, Robot on us? Or consider the Jolly Roger Telephone Co., created by Roger Anderson: a sassy piece of software that converses with telemarketers. Maybe one day Jolly decides she’s tired of humans trying to sell her objects her intangible interface can’t use.

It’s no secret that some of the world’s most intelligent people fear the creation of AI. Stephen Hawking is quoted as saying, “The development of full AI would spell the end of the human race.” And Elon Musk has claimed AI could be more dangerous than nuclear weapons.

Artificial intelligence is something modern media can’t seem to get enough of either. With movies like Prometheus, Blade Runner, Uncanny, A.I., Ex Machina, Her, Chappie, and I, Robot, as well as books such as Suzanne van Rooyen’s I Heart Robot, Freak of Nature by Julia Crane, and my own novel, Artificial, it’s safe to say that people are curious about the singularity, and rightfully so.

One thing most, if not all, of these have in common is the exploration of human emotion and the sense of self among these AIs.

In Blade Runner, Rachael doesn’t appear to be aware that she is a Replicant and not a human woman. Her memories have been implanted, and she’s unable to tell the difference between what’s real and what isn’t. In A.I., in a moment of thoughtfulness unlike any a robot should have, Gigolo Joe tells David, an android created to live perpetually as a child, that humans hate AI for their longevity, because they were created too fast, too smart, and will be the only things left when the world ends.

In Uncanny, a young journalist is brought into a scientist’s workshop to do a piece on the android he has created. She falls for the scientist and is perturbed by his creation, which seems not to understand social boundaries. Only at the end of the film does the journalist realize the “scientist” is actually the android, while what she had assumed to be the AI is actually human.

And in my own book, Artificial, androids who experience emotion are deemed “Glitches” for showing such a human trait. Glitches are hunted and killed for what normal androids consider a weakness.

So, what if, like in Uncanny, you truly can’t tell the difference between human and not human? What if the illusion is so convincing, even the AI doesn’t know they’re different, like in Blade Runner?

What happens when the experiment begins to think for itself? What happens when the creation can choose to be good (Samantha in Her) or evil (David in Prometheus), helpful or monstrous? How could anyone not find that nuance enthralling?

My conclusion is that humans would be very entitled indeed to think that no other creature shows intelligence, empathy, or kindness. I know of several Homo sapiens who don’t even exhibit that much. And more than that, the existence of violence, of a dominating will, would be even further evidence of an AI’s humanity.

For what species on this planet has ever been more destructive, more hateful, more callous than human beings?

Should AI be created, and should the singularity take place, I believe the result would be something very much capable of thought and feeling, something very human in all senses of the word.
