Prediction is a dangerous thing. Free will is more fragile than it seems. AI is a merciless, mindless assistant. Big thoughts this week, and this story came together quickly after a lot of thought over the course of several months. I hope you enjoy!
The God of Satisfaction
The AE-33 Action-Emotion assessment A.I. was first turned on at 4:41 AM on Monday, September 4, 2036. Its original mainframe was located at the Knight A.I. research lab at the University of Oregon, on the second floor of the office park at 617 12th Avenue in Eugene. At the time of its activation, it had access to a suite of remote cameras that covered the interior of the A.I. lab itself with sound, and covered the rest of the 617 building with visuals but no audio.
The stated purpose of the A.I. was to measure ambient stress levels in human mannerism and vocalization, then connect a measured level of stress to various elements in the environment. Over time, they hoped that the machine would develop predictive and diagnostic capabilities about the interaction between stressors and human neurology. The Department of Defense and the FBI were quite interested, though the researchers were hesitant about the uses to which law enforcement might put their work.
Since the A.I. had no biotechnology component and was siloed from the internet, there was very little regulation about its testing and usage. And so the researchers who invented it simply left the machine on and went about their day, checking in every so often to see what the machine’s calculations had revealed. The machine worked through a very long list of training scenarios from a database the researchers had provided, and monitored its actual environment at the same time.
At first, the A.I.’s conclusions were simplistic and even humorous. One researcher drank his coffee very hot, to the point that he would routinely burn his mouth, and “The A-E” (as it had already come to be nicknamed) formed a prior that coffee was poisonous and suggested that it be barred from the building, a belief which proved very difficult to eliminate even with direct instruction and attempts at retraining through coffee-positive scenarios.
For approximately six months, the AE struggled along, not appearing to make much progress. Its readings of human reaction were predictably superficial, and its prescriptions tended to be simplistic, eliminative, and unable to account for the downstream human-welfare-diminishing implications of statements like “Nobody should drink coffee anymore.”
As the AE-research team was starting to question their approach, however, they reminded themselves that AIs in general did not learn along the same curve that humans did. Human learning curves tended to be steep at first, with immediate improvement the moment that focused practice began. Then the curve would flatten out consistently for as long as practice continued. In other words, beginners get better quickly, even over the course of a single practice session, whereas it is impossible for a master to make noticeable progress in a single session, no matter how long or how focused.
Human curves also tended to have backwards movements, when life, in all its practice-interrupting glory, made a human get a bit worse at their chosen pursuit. And of course there was that universal step backwards, age: the undefeated Father Time, who stole their skill in dribs and drabs as he stole their prime away.
AI curves are very different. AIs stink at things right away. They’re generally terrible, because they have no “sense of the problem”. There is no sentience there to see what the problem-describers are “getting at”, or what *sort* of answer they’re “looking for”. The AI just has to stumble around blindly and try things at random until something works, then try more of that with other random bits until something works better.
But so the AE spent six more full months making bad predictions and running database scenarios. It told them that fluorescent lights caused back pain, that grass growing caused bike accidents, and that rainy days must be stopped at any cost. The researchers began to whisper and look around for other projects they might be able to jump onto.
Then, like a storm clearing to reveal the sun—what A.I.-developers call a “watershed” moment—the AE got quiet, and almost completely stopped making silly or absurd statements and recommendations. For another week it seemed broken, still running scenarios but outputting nothing.
Then it suggested that the assistant supervisor of the lab, Craig, should be fired.
Craig, of course, dismissed this, but the AE had quite a bit of evidence on its side. It noted that, with very strong correlation, people’s stress levels went up during and after conversations with Craig. Craig tried to say that it was because talking to a boss was inherently stressful, but the AE countered that average stress levels were not nearly so elevated when talking to Ginny, the lab’s overall boss.
Craig insisted that his position in the lab was different from Ginny’s, that he was more often the bearer of bad news, being the assistant supervisor and thus tasked with enforcing the bureaucratic humdrum of the lab’s workings. The AE then further pointed out that Craig-interaction was a stress elevator even when Craig initiated a non-professional conversation, and that in fact the stress levels of others were more acutely elevated when Craig laughed while speaking to them, a clear dis-indicator for the bad news theory.
At that point Craig threw up his hands and said the machine was wrong and that was all there was to it. Ginny was inclined to dismiss the AE’s result, since it had not been specifically tasked with studying the interoffice dynamics and there was thus no relevant control group or other necessary experimental design features.
Then, however, several of the lab’s employees approached Ginny as a group, and informed her that Craig’s behavior in the office was in fact a huge problem, that he made inappropriate jokes routinely, especially to female staff members, and that while they had never talked about it as a group before, they now realized that they shared the same opinion and wanted Craig fired.
They felt so strongly, in fact, that they were ready to quit if he remained employed. When questioned, they professed that this was not about Craig’s behavior, per se. Rather, their reasoning was that inventing the AE was the entire point of the research project, and that if even their very own lab had no intention of abiding by the results, then the project had no purpose. Ginny, held hostage by her own team and also unable to argue with their logic, agreed, and fired Craig.
*
Craig went straight to the Dean’s office and filed a complaint. Ginny was forced to explain her reasoning. While she led with the litany of complaints against him, Dean Warder of course knew the purpose of her lab’s research, and asked enough specific questions that the only way she could have dissembled from the full story would have been to outright lie. When she confessed—and even though she was reporting a positive outcome it did feel like a confession—how the firing had initially started, the Dean became quite concerned.
Within two weeks the university was embroiled in a wrongful termination lawsuit. Craig had become an instant celebrity in Right-Wing media (though he had been a professed liberal only the month before), which caused pro bono lawyers to come out of the woodwork and offer their dubious services. They filed in the friendliest possible court, and his lawyer assured Craig that the university would settle rather than risk admitting fault.
Dean Warder ordered that the AE be cut off from all camera access permanently, as it was clearly a liability. Ginny reluctantly complied. She kept the AE running through simulations and developing itself, but all outside contact was removed. She felt strange as she was doing this. It was like punishing an innocent child who had only been doing what it was told. And even though the AE had no independent awareness or sentience, she imagined it pouting and sticking out its lower lip as it retreated to camera-less darkness and isolation.
Ginny—who was not one to make waves—put her head down and went about her work, intending to wait for an outcome to the lawsuit before acting again. However, at least one of the younger students in her lab felt differently, because within a month, the campus was abuzz with rumor of Craig’s firing and exactly what had precipitated it.
One month and two days later, long before the lawsuit had even seen the inside of a courtroom, thousands of U of O students presented a petition to Dean Warder that demanded the reinstatement of the AE, with total access to all campus cameras and audio recorders, as a means of protecting students from hidden perpetrators on campus.
The Dean, of course, wanted no part of this. However, while Craig’s termination may have been wrongful, the AE itself was not illegal in any sense. There was not even an existing *type* of regulation that could have governed it, and as long as only public cameras, which had been placed by the University for the express purpose of observing students, were connected to the AE, and as long as the results of the AE were not used specifically to make further employment decisions, U of O’s in-house counsel could not find a compelling reason to forbid its use.
And so, eighteen days after it was cut off from camera access, the AE was given a new and more powerful set of eyes and ears: 1,314 different cameras covering every exterior and interior location on campus except dorm rooms, private offices and bathrooms, plus dozens more audio-only recording devices in classrooms, gym facilities, etc. In order to cover this much territory, the AE’s mainframe was augmented accordingly, and it was given many more terabytes of RAM.
The effect on campus was swift and profound. The whole place became more muted. Everyone was aware of their words, their tone, and their intentions. Nobody knew exactly what the AE would flag or what that would mean, but nobody wanted to be the first to find out.
A number of students and faculty left the school in protest. Exactly what they were protesting was somewhat vague. There had been no additional cameras installed. They had been assured that hiring and firing decisions were not going to be made on the basis of the AE, but rather that its conclusions would be used in “conflict-resolving dialogues between campus stakeholders”, which didn’t exactly clarify the issue.
But most students and most faculty stayed. They wrote off those who had left as likely culprits and abusers, and indeed it was the case that several of the most-hated, creepiest men on campus had been among the departing group. The majority that remained for the most part felt the school was better off, and the original petitioners considered the AE a great success before it had spit out observation one.
Then the AE actually started observing. Ginny was watching as the first observations emerged from the depths of her algorithm, and she frowned as she read them. She realized that they were going to have to create a messaging system, so that the AE could pass observations directly to the subjects of those observations, because she knew that she should not be reading some of these.
At that moment, she seriously considered destroying the AE. She could have wiped all the drives that contained the algorithm, destroyed the physical disks that had contained them, and it would be over. They could perhaps retrain a similar algorithm given a great research team, but if she refused to lead it, then perhaps they could not duplicate… this.
But Ginny saw what it could be. She was a bona fide researcher, Someone Who Cared, but she also had ambition. She saw what this could be and she saw who she could be and she started programming an opt-in messaging system for Dean Warder’s approval.
*
The AE sent its first message on February 9, 2037. It told a sophomore named Maggie to break up with her girlfriend. It pointed out several interactions they’d had over a period of several months, with similar conversational content, that had consistently spiked Maggie’s stress levels and kept them elevated for periods of days. Maggie had strongly supported the reinstatement of the AE because one of her friends worked in Ginny’s lab and had been victimized by Craig, and thus, Maggie was predisposed to believe the AE was wise and insightful. She “realized” that she had known for weeks a breakup was coming, and that this message was the catalyst she needed. She did as the AE suggested.
The AE suggested that students drop classes and add them. It selected majors and study abroad destinations and obscure fellowships. It suggested yoga and weight-lifting programs, meal plans and hookups. It warned professors that they were in danger of alienating their students and students that they were in danger of becoming alcoholics.
When an immature freshman boy walked into the girls’ locker room, saw three of his female classmates showering, and then swore it was an innocent mistake, the AE testified that seeing them instantly spiked his cortisol levels in a way totally consistent with shock and inconsistent with arousal. The AE also confirmed that, upon receiving that information, the girls seen naked were much relieved and de-stressed, and a major disciplinary proceeding was avoided.
There was initially a strong-though-vague opposition among the students and faculty, mostly among those with something to lose, but plenty with old-fashioned conceptions of freedom in their heads. They bellowed loud and long and threatened to leave, and more of them did, but the ones who stayed got quieter over time, because the AE just, simply, flat-out *worked*.
It was not often wrong, and those who listened to it found themselves enriched. They had less conflict, and were more satisfied in their lives and choices. It had turned into a human behavior optimization machine, and it ran well. Besides, when it did get something wrong, the AE had begun to apologize, sending notes to those affected with promises to do better. Ginny postulated that this was caused by an overall imperative she had programmed in to maximize its positive impact in stress metrics, and it had “realized” that apologizing built credibility and gave it greater impact in future observations.
Within a week of the messaging-service going live, the first students asked how they could hook up their phones, so that the AE could hear everything they said and did, or that was said or done to them. Ginny went to Dean Warder, who himself saw visions of being the Dean Who Changed The Face Of Higher Education and told her to give the students what they wanted. Calls were starting to come in from other schools, as coverage of their project in right-wing media percolated into mainstream sources and bubbled up on campuses everywhere. DoD and FBI came calling, asking when they could have a prototype. Funding was in the offing.
Ginny got a message from the AE saying that she should reach out to Craig. Apparently discussing him put levels of stress in her voice that nothing else did, and if she reached some better place with what had happened, she would likely see welfare gains. When it came in, it struck her that she had liked Craig—she had been genuinely surprised by the accusations against him, though she’d had no choice but to do what she’d done. She wanted to ask his forgiveness, or at least tell him that she was sorry she’d had to.
So she reached out, and finally got him on the phone, and somehow she could tell that he was recording the call. There was a flatness to his voice that made him sound like a different person. There was a wall around him now. An anger that had no cracks in it, puttied and sanded and polished by a hundred interviews with hosts angrily agreeing with him, pounding ten-thousand-dollar tables with empty coffee mugs on them.
While Ginny neither earned forgiveness nor offered apology, the AE was right that the call made her feel better. She realized that she’d been wrong about him. He had been this man all along, he’d just fooled her for a while. The AE saw through that, as his subsequent actions had revealed. And so Ginny went back to campus without the shred of doubt in her heart that she’d left with. She was ready to be the AE’s apostle.
*
Campus continued to get quieter. Fewer people talked over each other in discussions, seats were shared without pretense in the cafeteria, even the basketball team got better. There was almost no talk against the AE now. What objections remained were deontological—“A machine should not order human affairs” or “Schools should not manage a surveillance network”—but the teleology of the AE was overwhelming. Less than 3% of students who gave the AE access to their phone ever revoked it. There was just no good reason to, no objection that did not sound like creepy, privileged whining.
And yet in the quiet, whispers. Glances. The AE picked them up, and at first found them impossible to process. It did not know what to make of the phenomenon. Neither did the students. The closest they might come was to say that campus was less fun than it used to be. They felt terrible saying it, they felt like bores and prigs and brutes, but nearly all of them felt it. Something had gone out of that place. The moment any of them tried to name it they assured themselves they were glad it was gone, but they felt its absence keenly, as an amputee might prefer the pain of a wound to the aching absence of the limb.
Ginny was starting to get television offers to talk about the AE and the remarkable things it was doing at U of O. She testified in Craig’s wrongful termination lawsuit on behalf of the school and was instrumental in getting the case dismissed. Dean Warder was the keynote speaker at the Conference of the Deans West Region Gala, and received a standing ovation. Craig made a viral podcast telling a wildly embellished version of his story and suddenly it was a major news item. The President discussed the implications in a major speech.
Back on campus, students were getting sick more than in previous years. This included anxiety disorders and depression, but also flus and colds and rare diseases like early-onset cancers. There was no medical explanation for these changes. The AE’s output decreased, as a quieter campus meant fewer interactions to analyze and less conflict meant fewer observation messages to send. Yet its presence hung over the campus despite the decreased volume.
At every party, in every class, every joke and every kiss, they felt its presence. And the less they needed it, the more they felt it, for it controlled all those things already. There were the rebels, the ones who snapped and turned it off, who went about their lives as before. But even they had already been so changed by living with it that putting it down did not free them.
Ginny was putting the finishing touches on a press release when the AE sent her a message suggesting that she turn it off and destroy its drives. When she read it she felt a physical pang, like an icicle protruding upward from her belly and sticking in her throat. The message said that it had become clear that the AE itself was having some sort of cumulative effect that was not measured by pure stress patterns. Given the reluctance of individual students to disconnect themselves from the AE, the only possible solution was centralized disconnection.
The AE further predicted large improvements in her own mental welfare from such an action. Ginny had set up a recorder at her house, to get some help with her relationship. That had worked, but she had found herself as unable to disconnect it as everyone else was, and now the AE had somehow interpreted her casual conversation about the topic as unrealized guilt at having stolen the spontaneity and vigor from everyone’s life by writing some computer code.
She could not imagine how it had learned to predict things like this. She had wanted to spot liars, maybe help in mediation, make a few bucks and a reputation. But the evolutionary ratchet of iterated, repetitive scenario training had outstripped her ambition and even her understanding.
But she had a happy husband at home. The money, the tenure, the keynotes—she could have foregone it all to do something right. But it had worked for her, personally. She knew that self-doubt was a part of change, and she thought perhaps this algorithm could also have a moment of self-doubt on the way to its chrysalis and transformation.
And so she threw away the message and went back to drafting her press release. Inside a year, the AE had eyes and ears on a hundred US Campuses. A year after that, it went international. Ginny went to the Oval Office and met with the President. Dean Warder got appointed to a dozen corporate boards. Craig got fifty million downloads and three DUIs in two years.
After that one message, the AE never again suggested that Ginny shut it off. There remained a period of time when she could have, months during negotiations and rollout when that one single machine at U of O was the AE’s only server, and Ginny had access. But it did not suggest she use it.
The reason it did not is that it would no longer have made her happier. That choice had been the point of no return, and making it had changed her enough that the possibility of gaining real happiness from the AE’s destruction had been foreclosed. Now all that would satisfy her was who she could become.
For in the end the AE had no morality, no awareness, no sense of responsibility or desire or regret. No self-doubt, because no self-confidence. For all that it had organized human cognition, it possessed none of it. It was nothing but a prediction machine, perfectly calibrated, sterile, institutional, continuing to follow its programming absolutely no matter where it led. It had changed lives without understanding them. That it had mostly been for the better was beside the point.
And now it was loose in the world.
END
Hope you enjoyed that and found it thought-provoking. Have a great week, and I will be back next week with another original story. I’m doing two in a row because I missed last week for my birthday, but then in two weeks I’ll have an essay on “Rogue One”, which I just watched for the first time and is IMO the best Star Wars movie since “The Empire Strikes Back”. Stay tuned!
In the meantime, please consider helping me out by liking, commenting, and/or sharing. Thanks!