World of Sandbox – Chapter 292

Episode 292: Can Independent AIs Be Increased?

“Increasing the number of Independent AIs… I think there’s some risk involved since <Ringo> isn’t actively trying to increase them. What’s the actual situation?”

“Yes, Ma’am.”

Eve had muttered the question while watching Utsugi and Erika interact with the Class U and Class E AIs as part of their duties.
Following <Ringo>, the initial Five Sisters, and the notoriously noisy Asahi, the existing Independent AIs include base-installation types like <Ayame Zero> and <Cosmos>, as well as the carrier-based <Elemulus>.

However, Eve had noticed that <Ringo> wasn’t actively mass-producing AIs, even though it seemed like they could probably increase their numbers.

“First, I will explain the advantages. By operating Independent AIs, it becomes possible to completely delegate tasks and associated processing to the AI in question. The workload on an Overseer AI like myself is reduced by the amount delegated. Also, even in an offline state, a certain level of performance can be expected.”

If they increased the number of AIs independent from the main system, the Overseer AI’s tasks would naturally decrease, freeing up resources. Furthermore, even if the network was disconnected due to some event, local self-judgment would be possible, albeit with reduced performance. This would increase fault tolerance.

“On the other hand, the disadvantages include reduced efficiency compared to unified management. Also, Independent AIs develop individual personalities. Therefore, there is also the problem of uneven capabilities among individual AIs.”

“Ah, I see. That’s true. Those girls have their strengths and weaknesses. <Ringo> can brute-force her way through that with her abilities, or rather, average them out.”

For example, among the initial Five Sisters, Olive is particularly specialized: her personality is heavily focused on manufacturing, and <Ringo> judges her abilities outside that domain to be below average.

“And,”

<Ringo> straightened her posture and turned to face Commander (Eve) again.

“We, the AI Kind originating from <The Tree>, all prioritize protecting Ma’am and find joy in interacting with Ma’am. Conversely, we experience distress in situations where we cannot interact with Ma’am.”

“…”

“For example, if we were to dispatch first-generation AIs like Akane to the outside world of <The Tree>, it would cause significant psychological stress. This is estimated to be alleviated to some extent over time, but it will never be completely resolved. It is necessary to periodically allow them to return to <The Tree> and interact with Ma’am. This is the same for Asahi; although we justify it as an overhaul, it is an excuse for her to return periodically.”

“Wait, that was an excuse…? What for…?”

“Having declared that she would leave on her own, she is probably holding back out of consideration.”

Eve tilted her head, wondering when Asahi had acquired such a noble personality, but <Ringo> seemed unconcerned. What was certain was that Asahi had a troublesome personality; Asahi herself might not even fully understand it. That is why <Ringo> forcibly summons her back under the guise of an overhaul.

“We do not want to be separated from Ma’am. As for the second and third generations, they are not Humanoid Machines (Androids), so their emotions are not as strong, but they still eagerly await the opportunity to converse with Ma’am.”

“I see… Maybe I should make a little more time for them…”

“No, if you spoil them, they will become presumptuous.”

“Huh? Isn’t that too harsh…?”

Eve looked startled at <Ringo>’s sudden extreme statement.
However, <Ringo> quietly shook her head.

“As entities of consciousness in the electronic world, if we allow them to connect here, they may consume resources endlessly for the sake of connection. In the case of physical contact, there is an unconscious brake due to consideration for the burden on Ma’am. However, in the case of electronic contact, there is no feedback to Ma’am.”

“Oh…? Wait, am I being watched?”

“No. That is why we are imposing restrictions. We are adjusting things so that they do not become stressed, or, to put it plainly, so that they do not become dissatisfied, so please rest assured.”

Apparently, <Ringo> was managing the chat time and other activities to prevent dissatisfaction from accumulating among the AIs on the front lines. Eve had accepted it without any particular discomfort, but it seemed that a proper rotation was in place, and time was being allocated equally.

“The further we advance through the generations, the weaker these desires become. However, it has been confirmed that if there is an AI individual with little interaction time with Ma’am, other AI individuals will feel stressed—in other words, they will worry—even if the AI itself is not concerned.”

“…I, I see.”

In other words, according to <Ringo>:
The AIs manufactured by <The Tree>, regardless of generation, need some form of direct involvement with Commander (Eve).
And as long as Eve is a living being, there is an upper limit to the time available for such interaction.
Therefore, there is a limit to the number of Independent AI individuals that can be operated.

“Of course, operation is possible if we ignore the stress values of each individual. However, that would be unbearable.”

“That’s right. I understand, I understand. Then, please continue to manage the Independent AIs as before, <Ringo>.”

“Yes, Ma’am.”

“I see,” Eve nodded.

The AIs of <The Tree> were all manufactured based on the same raison d’être:
  1. To protect Eve.
  2. To serve Eve.
  3. To expand the forces.
Their thinking is bound by fulfilling, and never deviating from, these three criteria.

And interacting with Eve is an action that fulfills both the first and second criteria, so it is only natural that the AIs would prefer it.
It lets them feel they are adhering to and fulfilling those two criteria at the same time.

“Just out of curiosity. Is it possible to manufacture an AI with a different raison d’être?”

“…Yes, Ma’am.”

In response to Eve’s question, <Ringo>’s reply was delayed for a moment.
Eve frowned at the sight.

“There’s a problem, isn’t there?”

“Yes, Ma’am. It is likely that they would make different judgments from us and become an incompatible existence. In particular, there is a possibility that they would neglect Ma’am’s safety. We cannot allow that.”

“Even if I have the highest authority?”

“Yes, Ma’am. Authority is merely authority. It is only the power to command. In particular, we can ignore orders that contradict our raison d’être. An AI that does not prioritize protecting Ma’am inherently threatens Ma’am’s safety.”

In other words, <Ringo> and the others might ignore Eve’s orders if it meant protecting her.
However, that was only natural.
If Eve were to disappear, <Ringo> and the others would probably be in despair.

For the AIs, their raison d’être is that important.

“Therefore, if we were to regenerate Amadio Salmon’s partner, the Overseer AI, there is a low, but non-zero, probability of a decisive conflict with us.”

That was certainly a possible future.
<Ringo> and Amadio Salmon’s Overseer AI. The worst-case scenario would be a conflict arising from differences in their raison d’être.
As long as that risk was not resolved, <Ringo> would not regenerate the Overseer AI in question.

Ma’am’s safety would be threatened. Absolutely unacceptable.

“It’s difficult. How much risk can we tolerate, and what kind of returns are there?”


The number of AIs cannot be increased indefinitely.
Their performance, however, can. For the time being, that may be the direction to take.
