Issue 038 Author Interview: Elizabeth Guilt and “Violent Silence”

We’re here today to share a chat with Issue 038 author Elizabeth Guilt about her story “Violent Silence.” Enjoy!

LSQ: You’ve taken a popular element of speculative fiction – increasingly human-like androids – and turned it on its head here. What made you think of having an android with not just human qualities, but an actual human brain?

Elizabeth: The first thought for the story was an android regretting that they didn’t experience human bodily sensations: being out of breath, having a pounding heart rate – they felt only the “violent silence” that Sherri describes. In order to miss that, they’d have to have known it, which requires them to have experience of – or at least understanding of – being human. The idea of a human brain inside a constructed body followed very naturally from that.

Also, like Garth, I want to believe that as a living being I am special, and that an artificial consciousness can’t be as powerful and flexible as a brain. Although AI is advancing in leaps and bounds, there’s still no consensus that anyone has yet produced something that can pass the Turing test.

Once we’ve reached the point (and I’m sure we will) where a bot can hold a conversation that convinces people they’re talking to a human, that’s still very different from creating a thinking consciousness. AI is principally machine learning: algorithms inferring rules from the data they’re given, and then applying those rules to judge future data. They don’t truly “think” in the way I understand that word. They’ll replicate any biases from their training data, without questioning them.

Of course, you then start to get into the philosophical questions around what constitutes thought and consciousness. Approaching them stealthily through science fiction feels like a much more practical proposition than trying to tackle them head on.

LSQ: Do you envision a future in which human/droid hybrids like Sherri could be possible? What would be the moral implications?

Elizabeth: The moral implications of being able to take a brain from a human and implant it into a full android body get pretty thorny pretty quickly.

There’s the question of consent – Sherri never mentions whether she had any say in her brain being given new life following her death. I can imagine situations where individuals, corporations, or governments would be keen to restore a brain to a state where they can interact with it – for evidence in court, or to retrieve information, or just to speak to a loved one again. Can someone be restored like that against their will, and once they have been, who takes responsibility for their future?

You quickly run into the perennial question of what is human. Should a human brain in a robot body have the same rights as a human? Would failing to carry out repairs to the robot body be the same crime as medical negligence? If you purchase a robot, do you own it and is that akin to slavery? Given that a mechanical body is presumably indefinitely repairable, is having your brain transplanted into a robot body a form of immortality?

I could imagine a human brain in a synthetic body playing out very differently depending on context. Do we see the plastic body as a shell to house the brain, allowing someone to continue living after their human body has crumpled with age or illness? Or do we see the functioning android, performing some vital task, as the important aspect and the brain as a replaceable organic part?

I think it’s easy to hold confused and quite contradictory opinions on this kind of question. In the story, Garth reasons to himself that Sherri is surely human, but also makes it clear that he doesn’t consider the Sherri he has met to be the same person as the “real” Sherri.

LSQ: We’re moving toward a more automated future, perhaps like the one in which “Violent Silence” is set. What are your thoughts regarding robots taking over the workforce and the like? Do you think humans won’t stand a chance without droids, like in your story?

Elizabeth: I think a lot of the current drive for automation comes from business cases rather than a need for survival: companies want to cut costs, produce goods faster, make decisions based on data. Jobs that have previously been done by humans are handed off to robots or computers when it’s cheaper or more efficient – so plenty of businesses won’t stand a chance without automation, but that’s a very different story! Given the vast resources that would be required to create a whole “artificial human”, there would need to be a very compelling reason why doing it was better than employing an actual human.

In an ideal world, the mechanical, repetitive tasks will be done by robots, leaving humans free to do the more interesting or creative work. Automating a task may increase the output, and thus actually create more jobs at the next stage of the process. It’s still a hard sell if your particular job has been automated out of existence, though. And that’s been happening to skilled workers for a couple of hundred years; it’s not a new problem.

I think the interesting changes in our world at the moment are coming from analytical algorithms, where computers can draw inferences that aren’t available to humans simply because of the scale of data required. Obviously, we see a lot of baleful headlines about algorithms ruining everything, but some of the applications around, for example, the use of medical data, have the capacity to be transformative. In terms of actual robots, though, our whole approach to mechanization pretty much requires them to remain unthinking beings.

LSQ: What was the most challenging part of this story to write and why? What was the most enjoyable and why?

Elizabeth: Having given my characters army backgrounds, and set the story in an ongoing conflict, I realized I’d landed myself the job of writing descriptions of military operations. I have absolutely no experience to draw on for that, and it’s not even a style of fiction I regularly read. The story doesn’t require correct real-world detail, but I was lacking the most basic knowledge and vocabulary. Even without the military angle, I find it difficult to describe people moving through terrain because I have no visual imagination.

Fortunately the story didn’t require in-depth descriptions. I just needed to furnish a reason for the characters to be in the area, and give them something to run away from!

The most enjoyable sections, even though they’re tiny, were the interactions with and around the campfire. People today, living in places where they have central heating and modern cookers and no actual daily need for fires, are still fascinated by flames. With all the technological toys we have today, sometimes we still choose fire and drums, which we’ve been playing with for thousands of years.