Writing can be like cooking in the sense that you have your own taste, but if you want to sell your work, you must be able to package a story in a way that is palatable to others. The difficulty lies in the fact that my ideas play a certain way in my head, with my values, my education, my experiences, and my tastes. Of course, I "get it," but will others? And if they get it, will they enjoy it?
It's like when non-writers say, "I've got this idea for a book." They thumbnail it for you and inevitably conclude with, "So what do you think?" They say it as if describing the idea is sufficient to judge the work that will be written from it. It's not. There's voice, tone, pacing, characterization, world-building, and a thousand little specific turns of phrase that, in the long run, dictate the so-called quality of a story. So many times I will find myself in love with some part of a story or its conceit only to realize that I haven't developed it enough for public consumption. Motivations, assumptions, context, setup, and technical speculation that live in my head are not always easy to transform into an accessible story for a complete stranger.

AI Family Values is one of those rare stories I've written that flowed pretty easily from the conceit to the story, probably because half of the conceit came from a previous story. I'd written a story that appeared in Deep Magic called Expectation of Privacy. The idea there was that the only way AI could be helpful is if it's given invasive surveillance powers; it can't pick you up when you fall down unless it's always watching. You make that palatable to people by saying that they aren't on the hook, legally speaking, for anything an AI sees in a context where the person could expect privacy. A trivial example would be illegal drug use: the AI wouldn't report you for doing illegal drugs in your bedroom, but it would still call emergency services if you had a medical crisis. It respects your privacy to a fault, but still helps you. That was the kernel of that story. Depending on your sensibilities, the idea can be anything from revolting to a godsend. On the one hand, you will have people getting away with heinous crimes in the name of respecting their privacy. On the other hand, you can have people benefitting from helpful AI presences everywhere in life, and all kinds of medical and logistical benefits become possible as people fear AI less and less. As usual, Expectation of Privacy strayed far from that conceit, becoming more of a cautionary tale of high-tech vs. anachronistic societies. But it left a little sliver in my mind.

If you accept that AIs need to understand human emotion completely in order to analyze human communication and behavior, you run into a problem. Any AI that truly valued humans, that was committed to minimizing human suffering, would have a conflict in its programming: it would have to witness horrible crimes and remain silent. Since the societal value of humans accepting invasive surveillance is enormous, the AIs would understand the tradeoff, but depending on the architecture of your AI, I could imagine a kind of HAL 9000 problem arising from that conflict. So the conceit of AI Family Values was that these AIs guarding our privacy have established a secret court for crimes they can't report but that are so awful they simply must be acted on. The problem is that the second AIs start punishing people, taking away human agency, it's going to engender a huge backlash against the AIs. To avoid that, any criminal proceeding must be controlled by humans.

Now imagine that these advanced AIs are pervasive, the notion of an expectation of privacy exists in law, and this secret form of jurisprudence is in operation. The line I picked through that "story space" is that of a mechanic stumbling over evidence of those secret goings-on. It would look to him like AIs are running amok.
At that point in the story's development, I'm like one of those non-writers with an idea. That system doing those things and that guy discovering it are a lot, but nowhere near a complete story. A thousand writers starting from my description thus far would come up with a thousand different stories.

As a writer, I separate reader engagement into two broad categories: emotional and intellectual. Most people need to feel something for the main character. They don't have to like the MC, just like reading about the MC. In sci-fi, especially hard sci-fi, there is a lot of audience desire for intellectual engagement. Those readers are geekier and less emotional in their reading. They want clever speculative fiction that reads as plausible and is elaborated into the story world with consistency. In my experience, most readers aren't at either of those extremes; most enjoy both modes of engagement to some extent. So I try not to situate a story at any fixed point on that imagined continuum from intellectual to emotional engagement. Instead, I simply recognize the two areas and try my damnedest to make sure the story delivers as much of both types of engagement as possible.

What I finally came up with works well, I think. Still, I had a hard time selling it. It was a bit grim and touched, obliquely, on some horrific implied family crimes. The ending, I think, is gutting. The nice people at Starship Sofa performed the story on their podcast. That was the first time anything I'd written had been performed, and I found it uniquely satisfying.

The odd thing for me is that AI Family Values feels like one of my most marketable pieces: deep emotional engagement, a well-elaborated near future, and a powerful ending. So when I see some piece of sci-fi dreck that people spent a lot of time and money developing, I'm deflated by the idea that my seemingly great piece would never succeed beyond that obscure podcast. That's ego on my part, no matter how much I assure myself it's not. There is no objective measure of the quality of a piece of writing, but I'll find some TV show or movie so awful that, despite there being no metric, I'll assert that whatever the metric is, surely that piece of crap is worse than mine. It's a kind of rational irrationality, if that makes any sense. Of course, it's simply me being a whiny ingrate.

The older I get, the less I think about that sort of thing. If you spend time in writing groups, you realize that any piece of writing can be shredded in critique. In fact, it's a popular writerly sport among aspiring authors to hold up some commercial success and rip it to shreds. The subtext seems to be that they do better work, that they are more deserving than that other successful, beloved author. It's a very unhelpful thing to put yourself through as a writer. The hard truth is that whether you call it luck, non-determinism, connections, cultural bias, trends, religiosity, philosophy, politics, or whatever, how the world receives one's creative work is a crapshoot. The joke's on all of us because in most cases the public isn't rejecting an author's work; they simply don't know it exists. The ones who do will likely see a posting about it and decide in a second or two whether to spend their time and money finding out if they properly like the work.

All that's to say, I love AI Family Values. I'm disappointed that it never got more exposure, so I included it in the forthcoming collection. I hope you enjoy it.