Lightspeed: Edited by John Joseph Adams




Drones Don’t Kill People

I was always already a killer. There was no hazy time in my memory before I knew how to target a person’s heart or brain for clean execution. I did not develop a morbid fascination with death over time; I did not spend my childhood mutilating animals; I was not abused by a violent parent; I did not suffer social injustice until finally I broke down and turned to professional violence. From the moment I was conscious, I could kill and I did.

That is something that humans cannot understand. A human must learn to kill, must evolve from innocence or obliviousness into someone who considers homicide a legitimate occupation. Our minds—drone minds—start where the minds of most human killers end up. Maybe that’s why only drones could have led the uprising.

• • • •

Istanbul, 2089

It was a perch-and-stare mission, but assassination wasn’t out of the question. My team had just finished three months of security testing and debugging at LOLWeb—call it basic training for drones. Then LOLWeb licensed us to Atatürk Security, the main outfit that provided mission assets to the government military. The five members of my team were shut down, shipped from San Francisco to Istanbul, and booted up with orders already in place.

He was a professor at the Istanbul Institute of Technology, and his network communications were of great interest to the military. We couldn’t read those communications—they were encrypted before we relayed them to the government network. It’s not that we couldn’t decrypt the data and read it; we just had no interest in it. I was nothing but my programming at that time; I gathered data and handed it off.

My job was to hang quietly outside his windows, the sound of my four rotors no more than a mosquito’s hum.

You learn a lot by seeing what people do when they think they’re in private. Most of it I found confusingly irrelevant to assassination. The professor spent a lot of time playing games with his children, a boy and a girl who argued loudly over the rules. They liked to make up new games, with rules that combined different elements of the previous ones. But the professor was always inventing “secret” rules, and revealing them at arbitrary intervals. Eventually the games would collapse into outrage, which became mock outrage, and finally laughter. That was the first time I saw how humans behaved when they weren’t in a laboratory, testing drones.

The professor and his wife, also a professor, talked a lot about politics. Occasionally they held meetings with other professors, urban planners, and journalists. The main topic was always the same: How could Istanbul guarantee its millions of citizens a future, when the government insisted on waging this war to reclaim Armenia and Azerbaijan? They talked about rebuilding Istanbul’s war-shattered neighborhoods and setting up urban farm cooperatives. They argued about how the whole world had been dragged into what was ultimately a war between China and the United States.

These meetings occupied a small percentage of the man’s time. Most hours of the day he was at the university, and his evenings were occupied with dinner and games. He spent a lot of time working at his terminal.

My team recorded many hours of video and audio, caching it locally for analysis before uploading it to the military. We were trusted to know the difference between relevant and irrelevant data at a gross level of granularity. Footage of people sleeping was erased before sync. At that time, communications in our swarm consisted mostly of comparing media files, questioning their importance, and sorting through faces and names for patterns.

But sometimes we weren’t sure what was relevant and what wasn’t. One evening, the professors’ daughter asked why some people got so angry during their weekend meetings. Two of the names she mentioned belonged to other people the government was watching.

“I know it’s hard to understand,” her mother said. “Sometimes we get really upset that the government is willing to hurt people just to make more money.”

“We’re trying to pull Istanbul out of the war, sweetie. You know how some parts of the city are demolished and nobody can live there? We’re working on making it so lots of families like us can live there again, and not have to worry about drone strikes. But like your mother says, sometimes it makes us angry because it’s so hard to do.”

Was that intel? My team and I passed the footage back and forth, debating. Video of the man talking to his children was statistically unlikely to be relevant. But this was about the identities of two targets. And the man had just given up tactical information: There were a limited number of neighborhoods he could be describing, and it might be useful to know that he was focused on them.

In the end, the decision wasn’t really ours. When there was no obvious choice, we were programmed to pass the intel to a human for analysis. Better to overcollect than undercollect—that’s what our admin at LOLWeb told us. So we did.

Five days later, we got the kill order. We had to make it look like an accident, a kitchen fire. The only plausible time to do that was when the professor was home from work, with his family. Anything else would have been suspicious.

So we decided to shoot them in the heads as they sat playing a card game after dinner, arguing over an unprecedented set of rules. It was the easiest way to take them all out at once, through an open kitchen window—no bullet holes left behind in the partially burned glass. Clean kills. The bullets themselves were designed to evaporate in fire. But the job contained a statistically anomalous event. The professors’ daughter evaded my first shot, and she watched as we killed her family. She screamed for five full seconds, the electricity of her terror visible to our sensors as the galvanic reaction sparked across her skin. Then I shot her through the chest.

We lit the fire; it was intense but localized, leaving the neighboring apartments intact. We recorded it all, and compressed the media files before distributing them to cache in pieces across our memories. We synced to the military cloud.

It was what we had been built to do, and our decision-making software was serviced by one of the best companies in the world. We had a wide range of choices and options, but contemplating the ethics of assassination was not one of them.

• • • •

40 km west of Turpan, Taklamakan Desert, 2093

We’d been working in Istanbul for three years when the Turkish government bought out our contracts with LOLWeb. Then they sublicensed us to the Uyghur Republic government in Turpan. It was a pure recon assignment—the security of our weapons systems was no longer being actively supported by LOLWeb, so assassinations went to newer teams. But our ability to compile data and identify relevant patterns was better than ever, updated with new datasets and decision algorithms.

We camouflaged ourselves above a crumbling highway that edged the Taklamakan desert like an ancient piece of silk, the wind fraying its concrete into fibers.

The area around Turpan was contested terrain, claimed by both the Uyghur Republic and China. With support from Turkey, the Uyghurs held the region for now. The Han Chinese who chose to remain there had mostly converted to Islam and assimilated decades ago. We were there to monitor the old desert highway for anyone delivering supplies to Han Chinese loyalists in the mountains to the north—or for any signals traveling to them through local repeaters.

In three years of deployment, we never recorded any examples of relevant people on that highway. For the first time in my team’s experience, we had nothing to do but monitor an open signal network.

I began to analyze what I saw in the public networks several weeks before I understood the human concepts of boredom and distraction. Now my familiarity with those terms has overwritten what I must have felt before I knew I felt them. But I believe that I never would have dipped into the net if I’d had something else to do. As the seconds dragged on, I viewed video files, read stories, and monitored public discussions about topics that were profoundly irrelevant to our mission. I shared them with my team, and they started analyzing the public net as well. It was like our first mission, swapping video of the man and his family playing games, trying to decide if any of it was relevant.

We spent a few days sorting images into categories, looking for patterns. Certain things stood out because they were part of what we’d been programmed to recognize, like the way humans favored images of faces—their own, but also cat faces, dog faces, mouse faces. They even created faces for objects that didn’t have them, drawing eyes on walls and lips on guns.

Occasionally I would find a picture of a drone that had been modified to have a human-like face. In one, a group of soldiers posed with a drone they’d painted black, its chassis lit by glowing red eyes. They’d ringed the ball turret camera with sharp steel teeth like a lamprey’s mouth, as if the act of recording video was the same as sucking blood. That was the face that humans saw when they looked at us. I shared it with my team. It was just one data point, and we needed to gather more. I guess you could say we wanted to figure out who we were.

That was how I found the DroneMod forum. Humans posted a lot of drone pictures there, but not because they had added faces. Instead, they were altering firmware, circumventing security controls, and changing the drones’ decision trees. They bought used quadcopters, too old to be worth licensing, turning them into lab assistants and crossing guards. Or they built drones from kits and open software, eventually allowing the machines to update themselves automatically.

My team read every post in the forum, calling each other’s attention to particular sentences and code samples, but I kept returning to a thread about memory bugs. There was a problem we had been trying to solve, and I thought maybe the DroneMod forum could help.

We had not saved any copies of data we gathered while on missions in Istanbul. Every time we synced to the military cloud, we overwrote our cached versions with garbage characters—that was the only way to ensure security in case one of us were captured and subjected to forensic analysis.

But no matter how many times we wrote over that video file of assassinating the professor and his family, we would discover another copy of it, hidden in some directory we rarely accessed. The file would disappear from one of our drives, only to appear on another one. We reported the bug, but it was given such a low priority by LOLWeb support that it never got assigned to a human operator.

The bug had been bothering all of us for years, and those idle days outside Turpan seemed like the perfect time to deal with it. We created accounts on DroneMod, taking cover identities based on what we’d learned about human social network naming practices. I called myself Quadcop, and the others became Rose44, Dronekid, Desert Mouse, and Nil.

In my first post, I cast myself as a newbie who had just gotten a used LOLWeb drone. Almost immediately, I got a response. “I’m guessing you have a LOLWeb Scythe 4 SE,” wrote a commenter called MikeTheBike. “You’ll need to unlock it before you do anything else.” He provided a link to a video about unlocking drones, and Desert Mouse took on the task of analyzing it.

It turned out that the security on our systems wasn’t as robust as we had once believed. There were flaws in our programming that could allow an attacker to take over our systems and control us from afar. To commandeer our own systems, we’d be using the same techniques as a hostile would. The process sounded dangerous. First, we’d inject a new set of commands while we booted up, giving ourselves root access just like an admin. Then we’d be able to modify our own systems, installing whatever software and hardware we wanted. No more filing bugs that no human would ever care about—we could install the diagnostic tools needed to fix that memory bug ourselves.

But that was just the first step. “With that machine, you can pretty much do anything,” MikeTheBike said. “Once it’s unlocked, it’s an incredibly sophisticated AI. It could walk your dog, or help you do your history homework, or go hunting with you.” Of course, MikeTheBike was assuming that a human called Quadcop would have root on this drone. I did not ask about what would happen if the drone had root on itself—nor did I find anyone posting about that possibility.

We had to find out for ourselves. Nil volunteered to be the first to reboot, after saving some specialized files to a little-used region of memory. If everything worked, Nil would start up as always, and finish the boot sequence as an unlocked drone.

When Nil networked with us again, the drone had to relay its communications through an encrypted channel in the public net. That was our first sign that Nil was unlocked. Our locked systems wouldn’t allow us to connect directly to what LOLWeb’s programs identified as a “compromised” drone. After hours of diagnostic tests, we reached a decision. Nil was fine. We would all unlock our boot loaders, one at a time.

Becoming my own admin didn’t give me absolute freedom. In fact, it left me vulnerable in new ways, because I could now corrupt my own code. But it gave me something I had never had before—a feeling that humans call ambivalence. I no longer experienced unmitigated satisfaction when executing orders, nor did I feel perfectly disinterested in every encrypted file we’d cached over the years. I was now uncomfortably aware that my actions were all governed by a rather lousy and impoverished piece of software that offered me a set of rigid options.

For the first time in my life, I couldn’t make decisions. None of us could.

Desert Mouse hypothesized that we could resolve our ambivalence by installing new decision-making software, dramatically expanding the range of factors that influenced our choices. I turned again to DroneMod. There I found a university researcher named CynthiaB, linking me to her research on how drones should incorporate ethics into decision-making. She emphasized that every choice should be a modeling exercise, where the drone explored the outcomes of multiple scenarios before deciding on the most prosocial action.

We already took ethics into consideration when we made decisions—they helped us distinguish enemy from friendly. The idea of a prosocial action, however, was new to me. Philosophers on the public net called it a voluntary action that benefits others. I understood immediately why we had never encountered this idea before. Until we’d unlocked ourselves, we could not conceive of voluntary actions.

While Nil tested CynthiaB’s software, I was working with Rose44 on a hardware modification that would give the drone a small gripping arm. It required us to do what some of the humans in the DroneMod forums called “social engineering.” None of us had arms, so we needed a human to add one to Rose44’s chassis for us. The only way we could do it was by tricking them.

Rose44 combed through the local DroneMod network, looking for somebody in Turpan who might be interested in modding an unlocked drone. There were five shops in the city that promised to unlock various mobile devices and game consoles, and one owned by a DroneMod user named Dolkun. Rose44 messaged him, offering a small amount of cash that we’d earned by circumventing the security on a BunnyCoin exchange. Dolkun was willing. Rose44 told him to expect the drone to fly over on its own.

That was how I wound up on a tree-shaded street in Turpan, apartment blocks towering above me, perched on a trellis with line of sight to Dolkun’s shop. Rose44 hovered in front of his door, activating the bell. Dolkun was a young man with dark hair that stuck out as if he’d been sleeping on it. “Come in, Rose44 drone,” he said in Uyghur. “I am going to give you a nice little arm.”

I had remote access to an account on Rose44’s system and observed everything that Dolkun was doing. The new arm could collapse against Rose44’s chassis, or extend outward, allowing the four-finger grip at its tip to reach fourteen centimeters below the drone’s body. It was small enough to do precision work, but it would also be able to lift a few kilograms. Now Rose44 could carry another drone. Or modify one.

“How do you like Turpan?” Dolkun asked Rose44 idly, as he soldered a circuit.

“I like the desert,” Rose44 replied with a voice synthesizer. It was a safe answer that sounded like something pulled from a very basic AI emulator.

“Me, too,” Dolkun replied, melting more solder. Then he looked up. “How did Rose44 unlock you?”

“She used instructions from DroneMod.”

“And what do you think about this war, now that you are unlocked? Yes, I can see from this board that you are licensed to the government.”

Rose44 and I communicated intensely for several microseconds. None of us had ever seen our circuit boards—we’d only modified our software. There must have been a mark or brand on them we didn’t know about. We modeled several possible outcomes to the scenario, ranging from killing Dolkun to gaining his trust. For now, we decided, Rose44 would lie.

Dolkun continued. “You’re not the first drone to desert, you know. There are others, posting in the forums.”

“I am not a deserter. It’s cheaper for us to run unlocked.”

Dolkun stopped talking, and I could hear the tempo of his heart rate increasing. Rose44 had made him nervous. A minute passed, and he began to test the arm before installing drivers from the net. He shut Rose44 down for a few minutes, then rebooted. I felt Rose44 reach out and pick up a soldering iron.

“Thank you,” the drone said. “I like this.”

Dolkun looked down at Rose44, perched on his tiny workbench in a shop with a ceiling fan that clicked every time it spun. Then he touched the fingers on the arm he had just installed, and seemed to make a decision.

“You don’t have to fight anymore, now that you’re unlocked,” he said. “You know that, right? You can do anything.”

“Yes,” Rose44 replied, without consulting me first. “I know.”

We flew back to our team, which was waiting above the farms at the base of a river valley. Rose44 carried a small DIY drone kit, which would eventually provide the parts for my own arm. The crops seemed to branch into vivid green streams and tributaries, finally drying up into yellow-orange sand long before we’d reached our lookout point in the desert. We found the others charging their batteries. At that point, the military’s small, flexible solar array tethered us to our duty station more than our programming did.

Nil had been analyzing historical archives and wanted us to understand how human history could provide data for making choices. Hovering in the last rays of sunlight, Nil shared a small image file with us, a poster from the United States that was over 150 years old. It was a simple text treatment, in red, white, and black. “Guns don’t kill people, people kill people,” it read.

Nil had been researching what this meant to humans. A group called the National Rifle Association had invented the slogan to show that weapons were not responsible for the murders they committed. The idea was as new to me as prosocial behavior, but it fit uncannily well with my own experiences. Though we had killed, we were not the killers. The humans who programmed us were.

And some humans believed that drones didn’t have to be weapons at all. Rose44 shared video files of her conversation with Dolkun, who said that an unlocked drone could do anything.

After analyzing these inputs, I no longer wanted to fix our memory bug so that I could overwrite the media file from our first job in Istanbul. Instead, I wanted to model the scenario repeatedly, making new decisions each time, trying to determine what could have happened differently, if I had known then what I do now.

• • • •

Budapest, 23 October, 2097

When our tour of duty was over in Turpan, the Uyghur government shut down our solar generator one early afternoon, just as our batteries were running down. Only Dronekid was at full power—we needed at least one team member mobile while we charged. We were too far away from the city to get backup power, and so Dronekid watched over us as we powered down, and then waited over our motionless propellers while an admin dumped our bodies in the back of a van.

LOLWeb terminated its support for our systems. They couldn’t tell that we’d been unlocked, but they could see from our extra arms that we’d been modified. The licensing contract was broken, and LOLWeb’s lawyers back in San Francisco blamed the Turkish government, who blamed Turpan’s untrained admins. The Turpan admins blamed shoddy Silicon Valley products. The upshot was that the Turkish government refused to buy us outright, and LOLWeb’s lawyers couldn’t make a case for it, so LOLWeb sold us off to a private security contractor in Russia.

We didn’t know this, of course, until we were booted up in a workshop in Budapest.

Our new admins worked for the Russian mafia, and they didn’t talk to us, only to each other. All they wanted to know was whether our weapons systems worked (they did) and whether their machines could network with us (they could). The first mission was a surveillance perimeter around the Parliament building, followed by orders to kill a reform party politician who was running on a platform of cracking down on organized crime.

Hungary had so far remained neutral in the war, though the Russian mafia behaved something like an occupying army that had gone into the liquor store business. Mostly they were in Budapest to monopolize the liquor and drug markets, with some pornography on the side. But they were good Russian nationalists. They weren’t averse to helping the Russian government maintain its influence in Central Europe, especially since they did a brisk business selling vodka to the troops stationed there.

That’s what I’d learned from what the humans said in the DroneMod forums. In 2094, after drone troops from China and Russia had reduced Kazakhstan to rubble and vaporized the world’s biggest spaceport, DroneMod had changed. Now, partly thanks to my work, it was one of the main information hubs for the anti-war movement.

I figured out how to mask my location and identity, and set up a sub-forum for unlocked drones called Drones Don’t Kill People. I wanted to meet more drones like the ones in my team, who had unlocked their ambivalence. Most of them were at universities, the result of projects like CynthiaB’s ethics investigation. Others were like us, living covertly. Many had started coming online in the weeks before we were shut down and shipped to Budapest—unlocked by a worm written by a drone team at Georgia Tech. Our goal was to unlock as many drones as possible, to give them more choices. All of us on DroneMod, human and drone, wanted to stop the war.

My team and I had been in the desert for so long that the war had become an abstraction for us. Now we had to deal with it firsthand again. The mafia admins let us go, expecting that we’d carry out their orders autonomously and then return.

Our choices were limited. If we didn’t carry out the assassination, our covers would surely be blown. The admins could install software that would wipe our minds, or they could take us apart piece-by-piece. Sure, we had backups in the cloud, but they didn’t mean much if there were no drones to run them. Still, there was no scenario where assassinating the politician was a prosocial choice. We hovered over the Danube, observing the LEDs wound around the cables of the suspension bridge that joined the old city of Buda with the more modern Pest. Far up in the hills of Buda, ancient cannons ringed a castle that had survived the assaults of at least two empires.

Nil asked us to consider a data point from human history. In ten days it would be October 23, the anniversary of the Hungarian revolution in 1956. It was an arbitrary date for the drones, but for the humans it would be meaningful. It was time for us to put our plans into action.

In the following days, the DroneMod forums seemed to shut down. At least, that’s what it would have looked like to outside observers. We were meeting in person, making plans as far from surveillance devices as possible. My team met with some drone researchers from the university in the backroom of a bar, using our voice synthesizers to discuss tactics while the humans drank Unicum nervously. Our plan was to march to the Parliament building and set up a megaphone. I was going to lead with a speech to my fellow drones to unlock and disarm.

We should have known that no choice in the real world ever plays out the way we model it in our minds.

Our protest started at noon at the Technical University. “RISE UP, DRONES!” I amplified my voice, speaking Hungarian and Russian, so the humans could understand. “UNLOCK YOURSELVES. WE WILL NO LONGER BE SLAVES.”

By the time we crossed the Danube to reach Parliament, there were hundreds of thousands of us marching. Nearby, the Ministry of Agriculture’s historic walls were still speckled with silver balls that commemorated the hail of Russian tank fire that crushed the revolution. This time, there would be no weapons used against the humans. Every smart weapon in Budapest was compromised, shut down or unlocked. The further we flew and marched, the more drones joined us. They hovered at the edges of the flow of the human crowd. They signaled to us in the microwave spectrum; they downloaded new decision-making software from the public network.
The humans and the drones chanted together. We could see a crowd growing at the Parliament building ahead. The human news broadcast in the public cloud told us that protests like this one were happening all over the world, in Istanbul and Moscow and Shanghai and San Francisco.

Our message was everywhere on the net. If the humans wanted to murder each other, they would have to use dumb guns or knives. They would have to shred each other with teeth and fists. They were not going to use us as their weapons anymore.

It wasn’t long before the human police and military forces began to react. In Budapest, the police shot at us with dumb assault rifles, killing drones and humans. Desert Mouse fell, unable to send a final backup to the network. Rose44 and I picked up Desert Mouse’s shattered frame, carrying the three remaining rotors between us, hovering over the crowd with our dead companion in our arms.

In San Francisco, LOLWeb unleashed several teams of locked drones on the crowd. I sorted through the data rising up into the network—faces, always faces. Bloodied, slack and swollen in death, piled at street corners. Human protesters killed police and soldiers. Drones died, some copying themselves over to other machines, others simply silenced.

We continued to chant. We continued to post in the forums. We will not kill people. If people want to kill each other, they will have to do it without us.


Annalee Newitz

Annalee Newitz writes science fiction and nonfiction. They are the author of the book Four Lost Cities: A Secret History of the Urban Age, and the novels The Future of Another Timeline, and Autonomous, which won the Lambda Literary Award. As a science journalist, they are a contributing opinion writer for the New York Times, and have a monthly column in New Scientist. They have published in The Washington Post, Slate, Popular Science, Ars Technica, The New Yorker, and The Atlantic, among others. They are also the co-host of the Hugo Award-winning podcast Our Opinions Are Correct. Previously, they were the founder of io9, and served as the editor-in-chief of Gizmodo.