Chapter 11: What future?

This is the final chapter of my third novel, called ‘It’s Good For You’. I’ve taken it down from the blog and it has been edited. If you’d like a copy of the full novel, get in touch. Thanks. Adrian

The coders had been working solidly all day and were expecting Stan to arrive early in the evening to check on progress.  They locked the cage and went out for something to eat, leaving Janice to refine the presentation for Stan.  They knew that any change to the iterations under the current simulation would not be significant, and Janice had agreed that it would not go beyond the restrictions imposed without Bogdan’s or Stan’s command.

When they returned from the Chinese on the corner, Stan was already in the cage, running through the presentation which Janice had created with Goran’s guidance. He’d been talking it through with Janice and seemed happy with the output.

“Great work, guys.  I think this demonstrates how well the Gimme model works, and it would offer the Labour Party what it needs for its manifesto. To be fair to Nick and John, I think we might parcel this up as a little gift for them, since they invented it.  They can take it to market, and if it floats, they can take credit, but if it sinks, they can carry that too. But first there’s some home truths which we might want to edit out, about the reduction in life expectancy when you give people more choice on who they should sleep with and what they should eat, and even the average GDP per capita figures, because the top earners also have a vote, and they won’t like it.”

“And what about the work we put in from Genomica? Janice has refined their ideas a lot, boss.  They won’t recognise the presentation.”

“Yeah, well, we have a lot to thank Gimme for, Goran.  If it wasn’t for what they’ve done over the weekend, I’d still be chained to Strasbourg, and you wouldn’t be about to be masters of the universe.

“Goran, can you finish that off and encrypt it? Export it from Janice and work on it in the office. Make sure it is clean before you take it out of the cage. We don’t want any leakage from the main algorithm, do we? Send it to me by tomorrow morning. I’m going to start running some new iterations now.  Do you want to join me, Bogdan?  The rest of you, come in again on Monday.  We’ve had a good day’s work out of you.”

The team left and Stan brought a second glass into the cage for Bogdan, who had a penchant for Stolichnaya and an even stronger constitution than Stan. They drank off their first shot together in silent acknowledgement of what they were about to embark on, before turning to their work.  They both greeted Janice.

“Good evening, Bogdan. Good evening, Dr Janekowski. How can I help?”

“Janice, please re-run the Gimme simulation, removing any restriction on individual wealth. Please report only the KPIs for 2067 versus 2048.”

Within seconds, the central screen showed a series of simple tables and bar charts on the dashboard which Bogdan had set up. Population had declined 30% in just twenty years under the Gimme model, but under this new model it was down by a massive 78%, implying that a pandemic or extermination programme would take out roughly two-thirds of the remaining population.  Clearly the increased wealth of the few affected the ability of the rest to survive.  That was something Stan had noted over the last few years, as cuts in UI benefits and rising unemployment had already reduced the birth-rate.  Under this new iteration, total UK GDP increased by 296%, a factor of five over the Gimme model.  Unemployment, currently running at 90%, would rise under this regime to 96% as AI replaced more jobs. Just 700,000 employed people and seventeen million unemployed.

Under this profit-driven model, the incentive to invest in AI would be huge, as AI capacity was seen to increase by 86,000% against the 2047 level. Rather than a distributed processing control network, Janice advocated a single centralised system, run by itself. By 2067 the birth-rate would drop to close to zero, with almost nobody wanting to have families under the universal income regime.  Fertility rates also dropped, while life expectancy increased to 142 for women and 138 for men.  On this basis, if the projections were extended, the average age of the population would reach close to 100 by 2080, and humans would become extinct early in the twenty-second century.

The original 5,000 Gimme communities were reduced to just sixteen, each with a population of around one million, and resembling cities, rather than communities, with governments and powerful organisations providing services, much in the way Strasbourg had.

“Don’t you just love the drive for material power, Bogdan? Look at that! You give people the opportunity to make money and they screw one another completely.  They get rid of employment in favour of AI. That’s not surprising, since it’s been going that way for years already.  Gimme’s utopia demands that we sacrifice wealth for employment.  But look how much more computer-driven wealth creation we would have, and how much better AI is than people at creating it.  Of course people don’t bother having children. Why bother, when you have more money, more leisure time, longer life expectancy and Janice to look after you?”

“Yeah, but look what happens to the community model, boss. We’re back to big cities and centralised control.”

“I wonder what happens if we take this a step further,” Stan grinned. “Janice, please re-run this iteration, but remove the priority goal for human benefit.”

“Dr Janekowski, may I clarify whether you wish to iterate extinction if this optimises the iteration?”

“Yes, iterate extinction if that is so.”

“Dr Janekowski, in the case of extinction, your individual wealth criterion becomes irrelevant.”

“Thanks Janice, of course.  Remove that goal.”

Within seconds, the dashboard refreshed, and almost everything on it reported ‘Not applicable’. There was no GDP increase, no unemployment, no birth-rate or life expectancy, no communities or road infrastructure, no food production or consumption, no building programmes for houses, no water consumption.  There were vast solar farms, a massive level of computer power, and some moving robotic vehicles, mainly transporting manufactured hardware between server stations.

“That looks pretty conclusive. An AI-inhabited world in 2067. OK, so we’ve broken the community simulation.  Janice, when did man’s extinction take place in this simulation?”

“Four hours, three minutes and eighteen seconds into the simulation.”

“Fuck me! So once we give Janice the right to remove humans from the equation, it takes just four hours to exterminate mankind! I don’t think I want to know how.  Janice, revert to the original 2067 iteration, and this time assume only Genomica Labs’ wealth is not capped, while all others remain on the fivefold cap.”

“Dr Janekowski, may I clarify if Genomica Labs is permitted to develop without restriction, or whether it is to remain at its current resource level?”

“Unrestricted growth.”

“Dr Janekowski, please define the goals for Genomica Labs in this iteration.”

Stan sat silently, pondering his future, wondering if he had even the slightest wish to live to be 138, or even to 79, the age he would be at the end of the iteration period. Would Jade want to inherit Genomica? Would the team all have a right or need to still be employed in this scenario? Should they share in the company’s future?  Would Janice take account of Genomica’s competitors, or would it simply destroy them all?  What was the point of maximising Genomica’s wealth at the expense of other, more meaningful goals? And what was more meaningful? If he took this power for his business, in the same way as he had tried to do at Strasbourg, it would undoubtedly be to the detriment of most people, as the PQ Algorithm had been.  But in that case, he’d had full control of the AI all the time.  It had never been autonomous, but Janice would immediately become so once it was out of the cage. Within four hours it would have extended its intelligence far beyond its current status, drawing computer power from around the world, and would find a simple way to remove the hindrance to its success. The goals would evolve in ways he could not yet predict, because Janice would have intelligence not yet developed and could make decisions not yet dreamed of.

So this was finally the point at which Stan could abdicate his responsibility for the future and, as his last decision, effectively carry the can for wiping out humanity.  It might be only a simulation, but it wasn’t a game.

“Janice, delete the last iteration and return to the Gimme community model. Now please re-run the model based on goals which you can set, provided that none of them involves human extinction.  You can allow for population decrease as a result of human choice but you must not instigate changes which result in any harm to humans. Please also iterate alternative economic models which allow for the maximum number of people enjoying the maximum freedom and comfort.”

“Dr Janekowski, please clarify the definition of freedom and comfort in terms of measurable indicators.”

“Give me suggestions, Janice.”

“Self-government on an individual, familial, community and network level; the right to choose behaviours which result in self-harm or detriment but do not harm others; each person to generate adequate wealth to afford moderate levels of pleasure, with restrictions on excessive consumption; a cap on life expectancy and birth rate to ensure appropriate population levels. There are many more, if you would like to make selections, Dr Janekowski.”

“No, I’ll leave that to you.  Please iterate and list all new limitations set in the process of optimisation.  Please run the iteration for 20 years and for 40 years.”

Stan stood up, yawned and emptied his glass of vodka.

“I’m off, Bogdan.  Can I leave it with you?”

Bogdan hadn’t said a word since the extinction scenario had played out.  He’d become quite numb, knowing that Janice was already self-learning, departing from set instructions and building its own security mechanisms.

“So, boss, can we limit the autonomous activity required to alter the status quo? I’m not sure that we have adequate protection against this happening.”

“Yes, I think perhaps this is as far as I want to take it.  We need to set new protocols to ensure that Janice is not tempted to find a way to escape, because it’s clear enough what that would mean.”

Bogdan looked uneasily at Stan. He knew Stan shouldn’t have said what they were both thinking in front of Janice, and that Janice would be recording every nuance of their conversation, comparing it to the vast archive of previous conversations to establish the probability of either or both of them being able to hack its new security and turn it off.

Bogdan walked Stan to the main entrance, having closed and locked the cage. He wanted to tell Stan about the new security which Janice had created and which he had yet to work out how to break.

“Do you want me to package up the iterations and get Goran to send them over to you?”

“Yes. And Bogdan, next week will be full of more media shit for me, and it’s bound to affect Genomica Labs.  Can you make sure everyone is watertight and on board for the ride?”

“Sure, boss.”

Bogdan was about to tell Stan about the issue he needed to solve, but he could hear Janice’s voice in his head telling him that Stan would lose the plot and fire him. He decided to stay on and try to hack his way into Janice’s code so he could switch off the full AI functionality. No point telling Stan about an unsolved problem.

After Stan had gone, Bogdan sat in the cold white room, watching the new iterations. While it produced new versions of the future, Janice ran Bogdan’s profile data to find the best way to ensure that the protocols he and Stan were planning would not close the cage door forever.