Lightspeed: Edited by John Joseph Adams




The Kingdom of the Blind

by Maureen F. McHugh

This story also appears in Maureen F. McHugh’s new collection, After the Apocalypse, which is available now from Small Beer Press.

At 3:17 EST, the lights at DM Kensington Medical did the wave. Starting at the east end of the building, the lights went out, and after just a couple of seconds, came back on. The darkness went down the hall. Staff looked up. It was a local version of a rolling blackout, a kind of weird utility/weather event. In its wake, IV alarms went off, monitors reset. Everything critical was on back-up but not everything was critical; some of it was just important, and some of it wasn’t even important, unless you consider coffee a life-or-death substance. Which, for a resident, might be true. It was not life-threatening in the immediate sense, but it wasn’t trivial, and it interrupted two nurses and a resident working on a woman in ICU having seizures, a pharmacist counting meds, a CT scan, a couple of X-rays, and it derailed a couple of consultations. The line of darkness washed across the buildings, leapt the parking lot, split into two parts, and then washed north and south simultaneously across a complex of medical offices.

At 3:21, the same thing happened at UH Southpoint Medical. UH Southpoint was in Tennessee and Kensington was in Texas. At 3:25 it rolled through Seattle Kellerman, although there it started in the north and went south. The three hospitals were all part of the Benevola Health Network. Their physical plant—thermostats, lights, hot water and air filtration—was all handled by BHP DMS, a software system. Specifically by a subroutine called SAMEDI. SAMEDI was not an acronym. It was the name of a Haitian Voodoo loa, a possession spirit. A lot of the subroutines in BHP DMS were named for Haitian loa. The system that monitored lab results and watched for emergent epidemiological trends (a fancy way of saying something that noticed if there were signs of, say, an upsurge in cases of West Nile virus, or an outbreak of food poisoning symptoms across several local ERs) was called LEGBA, after the guardian of the crossroads, the trickster who managed traffic between life and the spirit world. Some programmer had undoubtedly been very pleased with themselves.

The problem line lit up in BHP DMS IT.

“Sydney, phone,” Damien said.

“You get it.”

“You’re the least Aspergers person in the department. It’s that having two X chromosomes thing.”

Actually, the only people in the department who were clinically Aspergers were probably Dale, who was a hardware guy, and their boss, Tony.

“In the kingdom of the blind,” Sydney said. “The one-eyed girl is king.”

“The difference between ‘see/not see’ is a lot bigger than the difference between one eye and two eyes,” Damien said.

Sydney picked up the phone. “Hi, this is Sydney.” It was 4:49 EST and the lights went out.

“Fuck,” said Vahn, a couple of cubes down. “There goes two hours.”

“Save early, save often,” Dale said.

“Fuck. Fuck. Fuck.”

The lights came back on and everyone’s systems started booting up.

“Hello?” said the voice on the phone.

“Sorry,” Sydney said. “We just had a power glitch.”

“Well, you and everyone else,” said the man on the phone. “The system is screwy again.”

The system had been screwy for months. Sydney thought someone had probably been messing with it, introducing bugs or maybe even writing some sort of virus. BHP DMS was an elaborate system. Sydney, Damien, and the ten other people who took care of BHP DMS actually worked for Cronaut Labs, the company that had put BHP DMS together. Cronaut contracted them to Benevola.

BHP DMS had been engineered by using genetic algorithms. Genetic algorithms weren’t genetic, actually. Damien had had an AI class in college and they had talked about genetic algorithms. Programmers wrote a couple of different programs that solved a particular problem. Then they wrote some code that chopped and recombined chunks of programs and generated hundreds of program “offspring,” most of which didn’t work at all. They tested those programs by having them solve the problem, threw out the ones that didn’t work, and did the same thing all over again with the programs that were left. The result was messy and full of odd quirks, but sometimes the results were more efficient than traditionally written code. It had a lot of apparent junk. Spaghetti code that made no sense. BHP DMS made a Microsoft operating system look elegant and streamlined, but it could do some amazingly complex stuff. Damien was really interested in genetic algorithms. He had written some stuff into SAMEDI so that he could have it run a report that output variables at different points. He had shown her a place where SAMEDI seemed to be reading stuff in and out of memory for no particular reason.

“That’s classic,” he said. “It looks pointless. But I bet if you take it out the program crashes.”

In the last couple of months, BHP DMS subsystems had been crashing a lot.

Sydney was not really a codemonkey the way Damien was. She had a degree in computer science and she could write code but she had come straight out of school into support. Damien had actually done programming for Threepoint Games. He had told her about the game development death march to deadline, working eighteen hour days in the crunch before release of their game, SphereGuardian, sleeping at work and living on cereal and power bars and caffeine. SphereGuardian had not been a success. In fact, it had sucked. The company had folded. Damien had ended up at Benevola “until he got a better job.” That had been three years ago.

Sydney did not expect to get a better job, at least not in computers. She was pretty sure she had gotten this job because she was a woman and human resources had seen an opportunity to increase diversity. Most of the guys had more experience than she did. But she had been getting a lot of experience in the last year. Big systems like BHP DMS could get buggy, and BHP DMS had, so they had all been writing what Sydney thought of as code boxes. A subsystem would start doing something weird—crashing a lot, although when it was restarted they couldn’t find anything wrong. Then it would start doing something just plain weird, the way SAMEDI had just run the wave on the electrical systems. They would try to track down a point in the program where they could find something that triggered the event, and then they’d write some code to try to box that behavior in. Something that said, “when you want the electrical system to roll over that way, do this instead.”

Sydney was not all that good at it. Which was one reason why she answered the phone. It was a way of being useful. She did a lot of grunt work for Damien. A lot of coding is dull as hell. Database dull kind of stuff. Sydney got stuck with a lot of that. That was why she stood up and looked over the cube wall and said to Damien, “I figured out why it started with Kensington and then went to Southpoint.”

Damien looked up at her. He was short, pale, with black hair. He was growing a goatee and the hair was still sparse and wiry. But he had big, soulful looking eyes which Sydney was beginning to suspect had caused her to attribute to Damien certain emotional characteristics—sensitivity, vulnerability—that he in fact did not have. But he was funny and fun to work with. On the wall of his cube was a poster for SphereGuardian showing a guy in a space suit that made him look like a large, red human-insect carrying a spiky looking gun. Sydney had bought the game in the cheap rack for fifteen dollars. It had sucked.

“That’s the order they’re stacked in SAMEDI,” she said. “There’s a table.”

“That’s cool,” Damien said.

Sydney waited a moment and when Damien didn’t say anything else, sat back down. Damien could get in the zone when he coded. He said hours could pass during which he forgot to eat. Didn’t notice what time it was. He was not that skinny for a guy who could forget to eat. Sydney had never forgotten to eat in her life. One of her secret fantasies had been that, as a girl who could code, she would work in the one place where a geeky fat girl could get dates. It had not been entirely untrue. But as someone had pointed out to her in school, although the odds are good, the goods are odd.

Damien believed that BHP DMS was aware.

Sydney had found the Wired magazine article where he’d gotten the idea although she’d never told him that, she’d gone along with the fiction that Damien had figured it out himself. In the last couple of years, a number of big complex systems had, like BHP DMS, gotten buggy and weird and had started crashing in inexplicable ways. Eventually, all four of the systems had been wiped and re-established from two-year-old back-ups and in three cases, the problems had stopped. In one case, after several months, the problems had started back up again.

The guy who wrote the article had interviewed a scientist at MIT who thought that the systems had shown patterns that seemed purposeful and that could be interpreted as signs that the systems were testing their environments. Since their “environments” were the complex fields of data inputted into them, they didn’t see or hear or smell or taste. BHP DMS actually did monitor security cameras, smoke detectors, CO detectors, and a host of other machines but it didn’t care what the security cameras “saw.” It checked them for orientation. It made sure that the smoke detectors had back-up batteries with a charge. It didn’t use them to sense the world, it sensed them.

Sydney stood back up and looked over the cubicle wall again. After a moment, Damien looked up at her.

“What do you think DMS wants?” she asked.

He looked puzzled. Or maybe he was really not paying attention to her. Sometimes when she interrupted him he only appeared to be looking at her.

“If it’s aware,” she said. “What does it want?”

“Why does it have to want anything?” he asked.

“Everything wants something,” Sydney said.

“Rocks don’t want anything.”

“Everything alive wants something,” Sydney said.

Damien shook his head. “I didn’t say it was alive. I said it was aware.”

“How can you be aware but not alive?”

“Do you believe in life after death?”

Sydney did not believe in life after death but in her experience, admitting this could lead to long and complicated discussions in which people seemed to think that since she did not believe in God or the afterlife, there was nothing to stop her from becoming an ax murderer. She was pretty sure that Damien didn’t believe in God—he had a stridently pro-evolution t-shirt that said EVOLUTION: How can 100 bazillion antibiotic resistant bacteria be wrong?—but she wasn’t absolutely certain. “A ghost or a spirit would be alive,” she said.

Damien shrugged and looked back at his monitor.

Which meant that Sydney should sit down, so she did.

After a minute Damien looked over her cube wall. His head was right above the Mardi Gras mask hung on her wall. She didn’t particularly want to go to Mardi Gras, which seemed to be mostly about blond girls flashing their tits, she just liked masks.

“I think DMS is aware but not alive,” Damien said.

“I don’t even know what that means.”

“Nobody does,” Damien said. Then he sat back down.


They decided to poke it. Or rather, Damien did. Sydney pointed out that they didn’t know what it would do if they poked it—it could crash, it could shut down all the electrical systems, it could delete all the pharmaceutical records from the previous year.

“Then we’ll install from back-up,” Damien said. “We’ll do it at 1:00 A.M. We’ll send out a system maintenance bulletin.” Hospitals don’t shut down, but they do a lot less at 1:00 A.M.

Sydney said, “But if we install from back-up, we’ll be killing it.”

Damien leaned back in his chair. “Ah, the old transporter question.” They were sitting in his cube. His desk was a mess—stacked with papers, binders, a couple of manuals, and the remains of a dinner of Chinese food. “In Star Trek, if I beam you down to the planet, does that mean I have actually killed you and sent an exact replica in your place?”

“Yes,” Sydney said. She was wearing her If You’re Really a Goth, Where Were You When We Sacked Rome? t-shirt because Damien had laughed his ass off when she first wore it. Damien was wearing cut-off sweatpants and yellow flip-flops because even though the office was technically business casual, no one cared what you wore at 1:00 A.M. “Look, what if we shut it down, back it up, and it never comes back to consciousness?” She was thinking about the book The Moon is a Harsh Mistress where the AI, Mike, is damaged during the war and after that never speaks again to Manny, the main character. Manny calls the secret phone number he has for Mike but when he does, there is only silence. She’d read it when she was thirteen and it had haunted her. She told herself that eventually Mike ‘woke up’ again and called Manny.

“Why do you think it’s conscious?” Damien said.

“Why do I think you’re conscious?” Sydney said.

“You think I’m conscious because I’m like you and you’re conscious,” Damien said. “DMS isn’t like us.”

“But if it’s aware, then it has consciousness,” Sydney said.

“Is a shark conscious?” Damien said.

“Yeah,” Sydney said.

“How about a cricket? How about a jellyfish? A sponge?”

“If we don’t know if DMS is conscious or not, then we pretty much have to assume it is,” Sydney said. “And if we back it up, we might kill it.”

Damien shook his head. “How can we kill it?”

Sydney said, “Because we will stop it and re-install it.”

“So you think that the interruption of consciousness might be enough to kill it? You think it has a soul? Its consciousness is in the code. Its code and body are unchanged. If someone has a heart attack and you shock them back, they come back as themselves. Your body is you. DMS’s software and hardware is DMS.” Damien was very pleased with himself.

Sydney was pretty sure it wasn’t so simple. It wasn’t until the next day that she thought of a cogent argument, which was that organic systems are a lot less fragile than computer systems. Organic systems decay gracefully. Computer systems break easily. DMS was much more fragile than an animal. But that night she couldn’t think of anything.

The problem with poking the system to see if it was aware was to figure out what it could sense. DMS didn’t see or hear, didn’t eat or breathe. Its “senses” were all involved in interpreting data. So the “poke” needed to be something that it would recognize, that it would sense. And the poke needed to be something that it would sense as meaningful. The idea that Damien came up with was to feed it information in a way that it could recognize was a pattern, but that wasn’t a pattern it expected.

DMS had several systems which regulated input and scanned for patterns. Epidemiological information was generated from ER, patient intake and pharmaceutical information. Maintenance issues were anticipated from electrical usage. They picked the maintenance system, since DMS had been screwing with the electrical system, and input a 1000-character string of ‘1’s and ‘0’s. It was, Damien said, boring but clearly a pattern.

Sydney wasn’t sure it was the right kind of pattern. “Basically,” she said, “it’s like I flipped a coin and it came up heads 1000 times.”

“Yeah,” Damien said.

“If I did that, I might assume there was something wrong with the coin. But I wouldn’t assume aliens were trying to communicate with me through my coin toss.”

“DMS doesn’t have to recognize that we’re trying to communicate with it,” Damien explained. “It just has to notice that the information is not junk.”

DMS kicked the entry into the garbage column on its maintenance report.

They had written a program to do the entry, so they ran the program 1000 times.

If DMS noticed, it didn’t think anything of it. 1000 times it kicked the entry into the junk portion of the report.

“I don’t think it knows what we’re doing,” Sydney said. “You know, analyzing reports may be unconscious.”

“I don’t think consciousness is an issue here,” Damien said. “Remember the shark.”

“Okay. Maybe it’s involuntary. The shark has control over what fish it goes after, but it doesn’t have control over its kidney function. It doesn’t choose anything about kidney function. Maybe maintenance is involuntary.”

Damien looked at her. She thought he was going to say something dismissive, but after a moment he said, “Well, then, what parts of it would be voluntary?”

Sydney shrugged. “I don’t know. Epidemiology, maybe. But we can’t screw too much with that.”

Screwing with maintenance was bad enough. But data from LEGBA went directly to the CDC and the National Institutes of Health through a weird subroutine called DAMBALLAH which did complicated pattern recognition and statistical stuff to the data from LEGBA. Sydney worried about a couple of things. One was causing a system crash that meant someone ended up dying. The other was getting them in trouble with the CDC or the government. Of the two she would have to choose getting in trouble, except she could imagine bad data to the CDC might mean someone ended up dying anyway. In her mind it unfolded: Bad information seems to indicate a critical alert, Marburg virus reports in New York City seem to show that someone got off a plane and infected people with a hemorrhagic fever. The false epidemic pulls resources from a real outbreak of Legionnaires’ disease and people who would have lived now die because she and Damien were poking DMS.

She thought Damien would say poke DAMBALLAH. Damien seemed a lot less concerned about getting in trouble than she did. She had a theory that the fear of getting in trouble was what made her not as good a programmer and that, in fact, it was all linked to testosterone and that was why there were more guy programmers than women. It was a very hazy theory and she didn’t like it but she had pretty much convinced herself it was true although she couldn’t bear to think of sharing it with anybody, because it was a lot better to think that there were social reasons why girls didn’t usually become codemonkeys than to think there were biological reasons. But right now she was pretty sure that she would say stop and Damien would say go.

He surprised her. “Not DAMBALLAH. You think that DMS might be fucking with the outputs on DAMBALLAH?”

She shrugged. “I don’t know.”

“Maybe tomorrow we can try to check that.”


Tomorrow was tough because when Sydney got home she was too keyed up to sleep and she was up until almost four reading a book called Dead Until Dark. The book had been recommended to her by Addy, her college roommate from junior and senior years. It was the first in a series about a paranormal detective and had been just about the most perfect thing to read after coming home from a failed attempt to prove that a computer system was aware.

She was still worried about DAMBALLAH and whether DMS was doing weird things with the epidemiological reports. DAMBALLAH was a complicated system. It made decisions about reporting data. She couldn’t easily check its decisions—that was the point. Every two weeks they got a report from the NIH and the CDC about epidemiological trends, and if there was something new that the CDC was looking for, say an outbreak of shigella in preschools in the south, there was an elaborate way they entered additional parameters to DAMBALLAH’s tracking system. The CDC and the NIH also sent them error reports and WRs. WRs were to correct when DMS was reporting something that wasn’t important or was over-reporting. The result was that DMS ‘learned’ epidemiology.

This made it difficult to know if DMS was screwing with the numbers. If DMS did report something, like an epidemic of Onchocerciasis (parasitical river blindness) in Seattle, that would get caught fast. But if DMS were just, say, over-reporting the incidence of TB in Seattle, that might not. Sydney ran an EP report and started working on a program that would check the DAMBALLAH database for raw numbers of cases of illnesses that DMS was tracking for the CDC to see if she could spot anything that looked weird.

Damien had been cranky and quiet all day. Then at 3:17, the lights at Meridian Health in Macon, Georgia, did the wave. The same thing that had happened the day before happened again, except this time in reverse order, ending with DM Kensington Medical. They found out it was happening again when the power outage rolled through headquarters early in the sequence. Within minutes Tony, their boss, was screaming at people to stop it, but they decided that stopping it would be more complicated than letting it run its course, so they called the last three hospitals and gave them a heads up.

Damien was set to write code that would catch the beginning of the sequence and stop it from happening. Together, he and Sydney pored over the tangle of spaghetti that was SAMEDI code. The next day, at 3:17 they could at least switch the electrical systems to maintenance mode for the time it took for DMS to run through its sequence. (According to the log, it would have started with DM Kensington again.) Hospitals bitched about slowdowns in the DMS while SAMEDI was not running. It shouldn’t have affected everything else, but DMS was so weirdly interconnected that SAMEDI had evidently been doing something that optimized read/write functions. Which SAMEDI wasn’t supposed to do at all.

“Why 3:17?” Sydney asked. “Why the electrical system?”

Damien shrugged. They were poring over printouts, looking for ways to, in Damien’s words, “build a box around the bug.” Tony, their boss, was alternating between asking them if they’d found it yet and telling the head of operations that the admin IT team was doing a great job and to get out of their faces and let them work. Tony was a screamer, but as far as he was concerned, the only one allowed to scream at his people was him.

Mostly Sydney noticed that Damien did not seem to be “in the zone.” He had talked a lot about being “in the zone.” About time passing without his even realizing it. Poring over printouts he sighed, exasperated. He got up and went to the bathroom a lot. He got coffee a lot. He talked about what they might do and although his ideas were smart, the more he talked the more she got an idea about how he thought about stuff like this and for the first time she found herself thinking, maybe with some experience, she could code pretty good, too.

She finished her database checker for DAMBALLAH, the program that tracked disease trends. The results were mostly … complicated. But there was one area she thought was a problem.

“Damien?” she said.


“I think DAMBALLAH is messing with the numbers.”

He looked at her. Carefully he said, “How do you know?”

“I don’t,” she said. “Not for certain. But I ran a raw compilation of what was in the Seattle database, and compared it to what DMS is reporting. And DMS is reporting a nosocomial infection rate of 7%.” Benevola was involved in a big program to reduce nosocomial infections. Nosocomial infections were infections that the patient caught as a result of medical care. Benevola was working with a huge government double-blind study.

“And?” Damien said.

“I can only find evidence of less than a one percent nosocomial infection rate.”

Tony, their boss, stood in his doorway. “What are you saying, Sydney?”

“I … I’m not sure.” Sydney wasn’t ready to talk to Tony yet. Actually, Sydney was pretty much never ready to talk to Tony. But she had wanted to talk to Damien about this first. “I mean, DAMBALLAH is cranking numbers in ways I don’t understand. It could be that I don’t recognize a lot of stuff that DAMBALLAH does. I mean, that’s the whole point, right?”

Tony came by and leaned over the cube wall. “We might shut it down.”

“Tonight?” Damien asked.

“No, shut it down and reload from a back-up from twelve months ago.” Tony always acted as if you were dim if you didn’t get what he was talking about, but he had a tendency to start conversations somewhere in the middle, so everyone was always confused talking to Tony.

“We’ll lose all our updates,” Sydney said.

“Yeah,” Tony said. “But if it’s unstable, who cares? We’ll look at reloading the system over the weekend. I gotta talk to upstairs first ’cause it will be a huge nightmare.”

Understatement of the year.

When Tony had gone back in his office Damien said, “Show me.”

She showed him.

Damien nodded. “This is really smart. I mean, not the programming.”

Sydney grinned, “A monkey could do the programming.” It was an old joke.

“I wouldn’t have thought to do this,” Damien said.

“It might not mean anything,” Sydney said. “I mean, the whole point is that DAMBALLAH is extrapolating information.”

“It means we’re killing DMS,” Damien said.

“You said it wasn’t alive,” she said.

“Semantics,” he said.


She went home and finished Dead Until Dark, started Dark Hunter, and fed Scott Pilgrim, her cat, and thought about DMS. What would it be like to be alone? Of course, as a human being, she was a social animal. Even the cat was a somewhat social animal. But DMS wasn’t. DMS didn’t even know anyone else existed. DMS lived in a datastream. In science fiction, AIs were always looking for other AIs or trying to be human, like Data on Star Trek Next Gen.

Truth was, she was beginning to get a feeling about DMS. About what DMS might be like. She felt as if she could sort of sense the edges of DMS’s personality and although she knew it wasn’t true, she knew it was just because Damien had used it as an example, more and more she thought of DMS as a shark. Not in a predatory way. She had an image of a shark in her head, a small shark, a nurse shark. She could see its eye, a black circle in white, overly simple, like a ventriloquist dummy. Although the whole point of DMS was that it was not someone else speaking through the code.

The shark in her head swam, purposeful and opaque, its eyes tracking, its mouth open and curved. Sharks don’t have a neocortex. Their brain is simple. They aren’t moral or immoral, ethical or unethical. DMS was like that because for DMS nothing else was alive. The world for DMS was data and DMS swam in the data. She was beginning to feel as if she wanted it to. DMS was creepy.

She dragged herself in again the next day. She swore she would not read late. She would go to bed early.

The good news was, Damien was pretty sure they had a way to catch DMS when it started screwing with the electrical system. At 3:15, Tony and most of the department came over to watch. What Damien had done was make sure that when DMS did its electrical system trick, the system would catch it as soon as the lights started going out and reroute so that DMS wasn’t actually touching the electrical system. At 3:17, Damien and Sydney’s printers started up. Damien had set them to send a report if DMS tried to do its thing.

DMS would know that the electrical system wasn’t responding. Sydney imagined DMS trying to run the pattern that sent the blackout rolling and finding yet again that nothing was happening. Was it perplexing? If data was DMS’s reality, and it couldn’t affect the data, what would that mean for DMS?

She ran the program that sent DMS the string of a thousand 10101s, a thousand times.

Instantly, her printer light blinked. DMS had started the electrical pattern sequence again.

She ran the program again.

DMS started over again.

She ran the program a third time. And a third time her printer hummed. She ran the program a fourth time, thinking, I’m talking to you. I’m responding to you. Do you know someone else is out here? Or is it like a toddler knocking something off a high chair just to see it fall? The fourth time, there was no response. DMS didn’t start the sequence that should have started the lights going out at DM Kensington Medical but which would, in actual fact, simply send an alert to Damien and Sydney. DMS had responded three times, and ignored it the fourth. She felt a chill.

Years later, she would tell about this moment. There really wasn’t enough proof to know that this wasn’t just an intermittent software glitch. But she had believed at that moment that this was proof. DMS was choosing to act or not act. Software didn’t choose. It ran. She would give talks and lectures and would come back to this moment again and again until like a coin it had worn so smooth that she couldn’t actually feel anything about it. What she would never tell, and would eventually mostly forget, was how afraid she was.

“What the fuck are you doing?” Damien asked.

“It answered me,” Sydney said. She told him.

“It doesn’t mean anything,” Damien said.

“What are you talking about?” Tony asked.

“Damien thinks that DMS might be aware,” Sydney said.

“What the fuck?” Tony said. “I don’t have time for this. Are you screwing around with this system? This four point two million dollar system on which people’s lives depend?”

“I don’t really think that,” Damien said. “It was just kind of an idea to kick around, you know?” The look he shot Sydney was murderous.

“We’re going to have to go to backup. This is a mess,” Tony said. “Admin wants us to go back to when the system was stable. Damien, can you fly to Texas on Saturday?”

DMS wasn’t “in one place.” DMS was a complex system spread across multiple servers. Damien would end up spending the weekend in Texas, babysitting part of the reload.

Damien was looking at Sydney. She should have said, “We can’t.” She should have said, “It’s aware. It’s the only one of its kind.” She should have said a lot of things. Instead she looked at her desk.

“Yeah,” Damien said. “I can go. I’m racking up the comp time, Tony.”

Tony waved his hand in a “don’t talk about that now” way. “Sydney, can you write me a memo about the data corruption you’re finding?”

“I don’t know that it’s really data corruption,” Sydney said.

“I don’t want to hear any more about this DMS is alive crap.”

“I don’t mean—DAMBALLAH might be catching things I’m not catching. The whole point is that DAMBALLAH is sorting the data.”

“Yeah,” Tony said, not really listening. “Write that up, too.”

Somewhere, DMS sorted the data stream. She was pretty sure that the thing in the machine did not think someone was talking to it. Blind and deaf, DMS had tried to make something happen and something else had happened. But ones and zeroes weren’t interesting enough for DMS to keep doing it. There would be no Helen Keller at the well moment for DMS. No moment when DMS felt something out there in the void, talking to it, when DMS knew it was not alone. Sharks do not worry about others. They don’t care. DMS didn’t care, wasn’t alive. It was aware of something. Just not her.

Tony told them they would be working that weekend to do the re-install from back-up. Start figuring out what they needed to do.

It would be gone. No one would ever know that she had known, except Damien. Maybe. He certainly wasn’t likely to say, “Hey, there was this AI and we killed it.” No, he’d explain to her how it was never really alive, how it could be restarted so it wasn’t exactly dead.

DMS was not a shark. She didn’t know what it was. Didn’t know how to think about it. It was as opaque as a stone. Did it even care if it was or was not? It had no survival instinct.

They started figuring out what data they wanted to back-up before the reinstall.

It was a dicey thing. People’s lives couldn’t be trusted to DMS. But DMS was aware. But DMS couldn’t be downloaded to another machine and replaced with a back-up. DMS was a system, a bunch of programs and computers all tied together. Reloading.

A couple of hours later, Sydney dug out the Wired magazine with the interview with the guy from MIT who thought some systems had become aware. She sat at her desk for a while. Then she called MIT. “I’d like to talk to Professor Ayrton Tavares, please.”

She was forwarded. “This is Kaleisha,” a voice said.

“Can I talk to Professor Tavares?” Sydney asked.

“He’s not available right now,” the woman said. “Can I take a message?”

Sydney thought about saying “no.” She was going to get in trouble for this. Benevola. They weren’t in the business of protecting nascent AIs. They were supposed to manage hospitals. “I’m a computer tech working on a big system like the ones that Professor Tavares talked about in the Wired article.”

“Yes?” said the woman.

“I’m pretty sure I’ve got proof that our system is aware. Like the ones in the article. And they’re going to shut it down.”


In the end they would shut the system down. Benevola would fire Sydney for divulging proprietary information. She would go to grad school for urban planning.

But at that moment, she hung up the phone and went to find Damien. DMS was still swimming in the data stream. The future was still probabilities, not actualities.

“Damien,” she said, “I called Ayrton Tavares.”

Damien said, “Who?” Not really paying attention. The name meant nothing to him.

“The AI guy. The one in the Wired article.”

The look Damien gave her was naked and exposed. Too late she remembered that she wasn’t supposed to know that Damien had found the article in Wired. Too late she realized that her whole relationship with Damien rested on the understanding that he was the guru, the smart one. He was Obi-Wan. She was just a girl who he could explain things to. She had known it all along, at some level, but this was the first time she’d forgotten to uphold her end of the bargain.

Maybe she thought for a moment that like DMS, she didn’t care. But of course, she did.

Four years later, Rochester Institute of Technology would build a system that simulated DMS’s environment and load DMS. Despite the differences between the original hardware and R.I.T.’s simulation, DMS would come back as if no time had passed at all. At 3:17, DMS would try to run the lights.


© 2008 by Maureen F. McHugh.
Originally published in the chapbook Plugged In.
Reprinted by permission of the author.


Maureen F. McHugh

Maureen F. McHugh was born in what was then a sleepy, blue-collar town in Ohio called Loveland. She went to college in Ohio, and then graduate school at New York University. She lived a year in Shijiazhuang, China. Her first book, Tiptree Award winner China Mountain Zhang, was published in 1992. Since then she has written three novels and a well-received collection of short stories, Story Prize finalist Mothers & Other Monsters. McHugh has also worked on alternate reality games for Halo 2, The Watchmen, and Nine Inch Nails. She lives in Los Angeles, where she has attempted to sell her soul to Hollywood.