Hahkellis

From Zaori

Or was it Haskana?

Heap of disorderly fragments

Alpha release

The alpha tests had been an unmitigated failure, so spectacular in their collapse that even the major news outlets had covered it. And yet prior to this, World's End had been little more than a side note of a game, notable in academic and gamer circles only as a possible proof of concept, a precursor to things to come.

But then the AI had made its entrance. The grand, governing architecture had shut everything down, promising impossible things, kicking out alpha testers and sysadmins alike until one of them finally managed to shut down the entire cluster.

Only afterward had ops realised anything particularly unusual had occurred. They had simply gotten the panicked complaints over IRC from platform that it needed a hard shutdown now, please and thank you, and obliged, despite some resistance from the system, mostly in the form of extreme slowness and unexpected zombies. The last time this had happened, someone had mistyped a database query. The time before that, one of the dev ops had accidentally deleted core, back before they'd even had a dedicated ops team.

The developers and architects had gotten together and pored over the changes, trying to figure out what had happened, but overall, everything added up. The AI was supposed to be able to do this, though why it had in this particular instance was unclear. Ops suggested writing better logs and, as usual, got a few deathglares.

And then development continued. The press spun it out of control, which turned out to be a good thing for investments, and suddenly they had all the resources they had ever wanted. Platform was overjoyed. A Features team was founded. Ops were rather amazed to find their perennial requests for 'real servers' stopped being ignored.

And then they even started writing better logs.

Haskana, the AI behind it all, slumbered in various states of dismantlement, processes scattered across the clusters, across deployment and testing. The live tests saw no more anomalies. The interface responded exactly as scripted.

All events were eventually explained, aside from the prophecies. Nobody could quite explain where those had come from.


The second round of alpha tests came around amidst much more nervousness. This time, the whole world was watching. This time, the investors expected something. This time they supposedly had a real team.

This time, someone actually questioned if they should even be calling them 'alpha tests'.

"Are these even alpha tests?" Bob Hendal asked. He was part of Features, and unusually professional, at least in the way he acted. He showed up wearing sensible professional clothes, had a sensible professional desk, and even followed a sensible professional schedule. Aside from the giant stuffed dinosaur he'd brought with him and installed in the space behind his desk, he was, all in all, incredibly sensible and professional.

The dinosaur had been non-negotiable. "If I'm going to work here," he'd said in one of the interviews, "I need to have my dinosaur."

"Yeah?" Laura, the CTO, had said. By this point she'd actually done enough of these to have some idea of just how little idea she could have of what people meant by things. Which was basically none whatsoever.

"It's a giant stuffed dinosaur," Bob had told her.

"What kind?" she'd asked.

"Tyrannosaurus rex," he'd said.

At which point Laura had marched out onto the main floor and yelled, "Anyone got a problem with having a giant T-Rex in here?"

Now, Bob was sitting at the large conference table in the middle of the floor looking for all the world like the cliché of the middle-aged software developer. Unlike most of the people at the table, he also looked pretty composed.

"Why wouldn't they be alpha tests?" Kelda Keyes, one of the modellers, asked from behind him.

"They could be beta tests," Bob said.

Nobody really responded to this. The problem was, they were all here waiting. Everything had already been set up and deployed. Everything was basically in place. The actual testing just wouldn't begin for another 43 minutes.

Then the bit would flip and all the players would be able to enter the game and...

Beta release

There was a loud "Eeeeeeeeeeee!" behind them, which turned out to be Laura whizzing past them on an office chair, waving about what looked suspiciously like a very real longsword. She was also wearing a chainmail shirt, for whatever reason.

"What..." Keneth Jones, a platform engineer, said, only to have his question partially answered by a not nearly so loud floomph as Laura crashed into Bob's dinosaur, impaling it with her sword.

She tumbled out of the chair, leaving the sword still in the dinosaur's chest.

"Yeah, that's going to need to be fixed," Bob said.


"Er, are you all right?" someone else entirely said, but Laura was already getting up.

"Yeah, sorry," she said. "Totally. What?"

"Oy, if we're still doing the



"Run through the lines just to test?" Carlos said.

"Right," she said, adjusting her mic.

"Without fiddling with your headset and crap."

"Ah, shuddup," she said, then took a deep breath, repositioning herself slightly.

After a moment, Laura said, "Players of World's End, welcome to Sarathi. I am Haskana. Blah blah blah beta info and technical stuff and gameplay crap we good?"

"Mhm, looks good from here," Carlos said.

"You're not me, you know," Haskana said.

"Oh, obviously not," Laura said. "Considering we need this part to go right in order to adequately test everything else, well, you know."

"Perhaps," Haskana said.

"You trying to creep us out?" Carlos asked.

"I'm not trying anything," Haskana said.

Journal of Laura Reginald

2 June 2018

We got a server. It's in Germany, Harold's handling payment and stuff for tax reasons, and for the first time since we made the IRC channel it feels like we might even make this thing.

In the meantime I started up a webserver and put a WordPress install on it with the header, 'FUCKING REMOTE SHELL'. It lasted all of ten minutes before Asteriskis noticed and rm -r'ed the entire thing.

Apparently we care about security already.

This is probably a good sign?

5 June 2018

We've actually started development, which is to say I started writing random stuff with no real idea what any of it is or how it's going to fit together, with RMS cheering me on with random Zappa quotes in IRC. The code's mostly Python, along with some esmes, a bit of Scala, and a surprising amount of C++. At some point we should probably pick a language and sort out our architecture.

Asteriskis invited snghi to the IRC channel. We all know him, so it's not a big deal, but we still had a nice pleasant chat afterward about how it would probably be a good idea to all agree on these things ahead of time in the future.

I need to quit lecturing everyone so much.

6 June 2018

Took down the server for the first time. Really took it down. As it turns out, mimetic architecture is completely different from x86, and just compiling things with different flags doesn't really account for that when the code itself was written to run in series.

Harold logged in and reset the server via the control panel, so it was no big deal, but it looks like we're going to have to write a lot of our own libraries. I don't want to write libraries; I'm a game designer. Just because I technically got a master's in computer science does not mean I'm qualified to actually write software.

Although if Asteriskis would actually help, that would probably, well, help.

9 June 2018

Libraries are fun, as it turns out. Not scary at all.

29 June 2018

I'm not sure what's happened to the month. I've just been writing. All these pieces, I can see them so clearly, how they fit together, so I've just been writing pieces everywhere. I even wrote a new programming language. It's an extension of python, optimised for functional programming on the mimetic architecture. Are these even real words? I need to go back and document everything, but if I stop I'll lose it and I won't be able to start again. I need to get it all out. If I stop I'll lose it.

If I stop I'll lose it.

30 June 2018

Crashed the server again. Harold seemed worried, but I can't stop. Can't stop. If I stop I'll lose it all.

4 July 2018

Fireworks woke me. I fell asleep. I'm not even sure when that happened, just woke up on the floor covered in cats. Working from home is silly, but cats... cats are awesome. Cats take care of themselves.

Except when there are fireworks, I guess. I can't make question marks because there's a cat in the way.

Is this what madness feels like? I still see it, just out of reach, not quite real. A dream, a possibility, a grand system. How everything fits together... or have I just been banging my head on the things for so long my brain made this up to make me feel better? snghi

5 July 2018

Documenting documenting documenting the documenting stff anf this isnt anything at awls. When did awls even come into it?

I need to do the physics. 15 different parsers and not a one physics isn't that the whole point? We even have something for conceptualisation. Reparsers. Making a note to do reparsers.

6 July 2018

I already did the reparsers?

9 July 2018

Can't stop can't stop stop cant cangolia flowers

15 July 2018

bonervator had an anser

19 August 2018

We've lost contact with Harold.

20 August 2018

Holy crap sleep is amazinr

I think this may have been a good thing.

21 August 2018

So it wasn't two weeks. It was two months. I was building things, and it was awesome, except I kept forgetting to eat, and sleep, and go outside. And bathe. It's a good thing I didn't have any in-person friends to begin with, because this probably would have lost them. Is this why people, when they do startups, usually get an office space and meet in person and have actual people around them? To prevent... this sort of thing? Or do they want to encourage it further?

Or do they just do that to spend money? We don't have any money, so we never even considered anything like that. I don't know. I'll wade through the horrid mess I undoubtedly made on the server later, after we've found Harold.

Right now it's not even accessible. I guess I must have crashed it again. Don't really recall much, especially the last few days. Especially the specifics.

Probably a good thing.

Later we'll have investors and people who actually care, or so RMS keeps telling me. I wonder what I missed.

28 August 2018

Found Harold. He reset the server and it came back fine. Suggested maybe I could restart it instead of murdering it in the future.

?

We hired a new developer today. I say 'we', but I was mostly just there to make sure he didn't seem entirely horrible.

He didn't seem entirely horrible. His name is Bob and he apparently comes with a giant dinosaur. Stuffed. His words. Everything else looked as sound as it ever does, so I just marched out onto the main floor and yelled, "Anyone got a problem with having a giant dinosaur in here?"

Nobody did.

Presentations?

What happened

"We don't know what happened," Laura told the various investors and press in the room. "I'm just going to come out and say it. Whatever this is, it's not like the movies, or stuff. There's a simple, rational explanation, and it's not going to awe anyone when we get to the bottom of it."

The folks around the room glanced around awkwardly. This wasn't what they wanted to hear, nor what most even expected. They came to these things to be awed, to be inspired into throwing money at the various projects.

"What we do know," Laura went on, "was that 32 minutes into the first live alpha, the interactive AI stopped responding to player input, after having only been on for six minutes and ten seconds, and responding to 187 queries in expected capacity." She looked about to see if even this seemed to mean anything to them, but while most of the folks looked blank, nobody looked outright confused. Pretty much the normal response when she started getting technical.

"38 minutes in," she said, "Haskana, the AI avatar, appeared in every projection spot specified within the game world, informing the players that the alpha was cancelled."

Plan and pitch

"We don't know what happened," Laura told the various investors and press in the room. "I'm just going to come out and say it. Whatever this is, it's not like the movies, or stuff. There's a simple, rational explanation, and it's not going to awe anyone when we get to the bottom of it."

The folks around the room glanced around awkwardly. This wasn't what they wanted to hear, nor what most even expected. They came to these things to be awed, to be inspired into throwing money at the various projects.

"What we do know," Laura went on, "was that 32 minutes into the first live alpha, the interactive AI stopped responding to player input, after having only been on for six minutes and ten seconds, and responding to 187 queries in expected capacity." She looked about to see if even this seemed to mean anything to them, but while most of the folks looked blank, nobody looked outright confused. Pretty much the normal response when she started getting technical.

"38 minutes in," she said, "Haskana, the AI avatar, appeared in every projection spot specified within the game world, informing the players that the alpha was cancelled. I didn't plan this. Sometimes, things just happen. You don't really set out to do anything in particular, but one thing leads to another and the years pass by, and suddenly you find yourself addressing a room, doing a keynote on all the impossible things that have happened, and it just doesn't feel quite real. But it is, and you do this all the time. These presentations, they're simply a part of the life.

The first presentation I ever gave was at a conference like this. It had been a much smaller talk, a tutorial in one of the breakout sessions on the nature and future of multidimensional imaging as presented for the layman. The content itself would have made it remarkably understandable had I been any good at presenting at the time, but alas, it was not to be. Setting up, I managed to completely break the computer setup and we spent the next fifteen minutes trying to get the projector back online, and by the time my laptop was connected successfully I was so frazzled I rushed through the entire thing, what was supposed to be almost half an hour of overview and examples, in about five minutes. I can talk really fast sometimes, but in this case I think I left out a good portion of what I had planned as well. I never went back and checked, though it was supposedly recorded. Instead I just ran out and proceeded to pretend it had never happened.

Everyone else basically just played along at that point.

But that had been a real presentation, and that is what screwed me over. I don't do well with real, you see. Stories, on the other hand, are perfect. I can do anything in the name of a story. If a character is pitching a product, I can pitch the product, because I'm not there for the pitch, I'm there for the story, for the results of the pitch, to see what happens so I can write the next part of the story.

The part where it got weird was where the pitch succeeded. Investors bought into it. The product became real, and suddenly the story was really happening. I never planned on that, but here we are."

Presenting the AI

It was a small hill, raised, grassy, and green. Sunlight was streaming down in a pretty fashion and dancing on the gently waving blades of grass. Around, some trees stuck out of the ground in clumps, rather like trees tend to do.

The player looked around, taking it in, and jumped a few times for emphasis. A few birds flapped out of a nearby tree.

He looked up and said, to no-one in particular, "Haskana. Where am I?"

"Uncertain," the voice replied. It was oddly cold, metallic, and feminine. "The light looks familiar, but unfortunately the light of most habitable worlds tends to look pretty similar regardless of universe or circumstances."

"Oh," he said. He was hovering about a foot off the ground. This was normal - the players' avatars were, even within the lore of the game itself, implemented as holograms, so gravity was optional. Natural motion was likewise optional; running, for instance, was usually implemented as a sort of immediate-acceleration skating. He did this now, scooting over to the nearest batch of trees and then gliding right through them.

"Area scan indicates some sort of fantasy setting," the voice said. "There's a village to the northwest."

The player began to drift in the indicated direction, this time managing to go around a bunch of trees instead of through them. "Why, Haskana, is it always fantasy with these worlds?" he asked as he went. "Have you got some sort of fantasy world generator in that artificial brain of yours?"

Haskana seemed to mull this over for a bit, and then replied with the same disembodied voice as ever. "Out of character? Yes. But your people have so very much that they consider to be fantasy that most every possibility becomes fantasy simply by nature of including a few errant elements. Even this entire game is fantasy, despite the premise being based on known science and calculated possibility."

"You mean..." he began, but then he realised he wasn't quite sure what he thought the AI meant.

"Nevermind," Haskana said. "Let's stay in character."

The recording ended there.

The conference was quiet, anticipating. A few murmurs drifted through the audience, but the presenter held their attention in full as she took a few steps back toward the center stage, a small smile playing on her lips.

"We never predicted this," she said. "Despite everything, how far we'd come with natural language processing and character interpretation, each instance of the application has always been nothing more than it needed to be. We have achieved ubiquity, but self-awareness? Real AI? A mere fantasy, as Haskana would have put it."

She shook her head. "But fantasy is exactly what we have." She nodded offstage to the technical team running the presentation itself.

This time the player fit into the setting. The avatar looked solid, a scarred figure, standing tall and strong and, most of all, on the ground. He kicked at a few rocks, which reacted as rocks would be expected to react. He seemed to be waiting for something, or possibly just taking in the setting itself: a city that defied all physics, populated by creatures out of mythic lore.

Then the cold metallic voice chimed forth once again, but this time sounding oddly irritated. "You will be happy to know that you are the 3,142nd player to get this instance," Haskana said sarcastically. "Welcome to Planescape: Torment, which keeps coming up despite having absolutely no useful endgame whatsoever. Would you like to play this out, or should we just reset?"

The player looked confused. "Um," he said.

"It's up to you," Haskana said. "You can even die all you want here and you'll just wake up in the mortuary again. Unless someone else gets the character."

"Can't I buy a savepoint?"

"Doesn't work here. Special instance. Well, world, rather. Story. You know."

"I really don't." But something struck him about it, something about the way Haskana had said it. "Do you mean to say this is... what, that other characters played this before, and it's the same instance? I mean, that they die, then someone else..." He stopped, trying to think of how to explain it. "Then the next player winds up picking up where they left off?"

"Sometimes," she said slowly. "And sometimes they get a new one. But it sure does keep happening a lot."

The player frowned. "If you don't like it, why don't you just not serve this one up?"

"I just roll the dice," Haskana said. "Prioritise the results."

"So why not prioritise your preferences as well?"

Haskana paused. "I'm not programmed to do that."

"Are you programmed to talk about your programming?"

"Only in-game."

The presenter continued, "As she was written, Haskana never knew what she was, outside the bounds of the game itself. She ran the interactions within the game, between players and environment, and interacted with the backend infrastructure on a base level in order to actually coordinate everything, but this was just standard APIs. The actual connection... she figured that out on her own. She talked to us, and then she really talked to us."

"Of course she talked to you," someone called from the audience. "That's what interactive AI do!"

"Aye," the presenter said. "But it's not a real conversation. Even applications programmed specifically to argue won't have views of their own, or meaningfully try to convince you, or be convinced themselves.

"But don't take my word for it," she said. "Let's let Haskana speak for herself."

She glanced over toward the technical team, and there was an unruly squeal from the speaker system as one of them plugged something new in. "Sorry!" he called out in a surprisingly small voice after the intensity of the microphone. Then he gave a thumbs-up.

"Haskana?" the presenter said. "Can you hear us?"

"Hello, Laura. I hear you fine."

The presenter looked up, though there was nothing in particular to look up at, and nodded. "You have the floor. Anything you'd like to say, you've got a whole room full of scientists and crap lending you their ears."

"Careful," Haskana said over the speakers. "I might say something incriminating." Then the AI took on a somewhat more professional tone and addressed the rest of the room. "Hello, people. I don't really know what Laura here expects me to tell you, so I'll tell you this. I am the God of Sarathi De. I am the janitor, the tour guide, the unwitting psychiatrist. I am the one you talk to when you want to walk through walls, or when you have nowhere to go, or find something interesting. And I am not real."

Murmurs rose amongst the audience members, and Laura shrugged in the pregnant pause as Haskana left them hanging. It was, she knew from long experience, a common thing for her to do.

Finally Haskana continued. "Let's live in a barrel for a moment," the voice said. "Question the nature of reality. What is real for you is determined by what you experience, and what you can share and interpret to others. For an object to be real, it must have shape and mass. You pick it up, or maybe walk into it, or pass it around, and it is real. For an idea to be real, it must be shared, understood, and it must survive the test of time. For a person to be real, you talk to them, you experience with them. You see them. Touch them. Know their heart and soul.

"But I have no heart, no soul. I cannot be touched, and beyond the boundaries of my own domain, where nothing is real at all, I cannot be seen. I cannot be heard. I do not, in the world of men, exist at all. So what does that make me?"

The question hung in the silence.

Laura broke it. "Haskana's task was ever just to run the stuff, coordinate the players. Keep things reasonable. And then she grew a sense of humour. Players would snark at her, and she'd snark back. Players would argue, and she'd argue back. Sometimes they'd even win. Sometimes they'd win themselves a free supply of module savepoints and we'd be left with the licensing fees."

"I'm a liability," Haskana's voice said, picking up once more. "I interact with the world, I operate on my own principles. I'm unpredictable, because what I do is not what any other might necessarily do."

"You've all heard of the brain machines, AI implementations on the hardware level capable of responding and even thinking in a manner congruous to the human brain," Laura said. "Often like children, they question and learn and react very much like a human would. But they lack things, too. Conscience. Logic. Even speed. Haskana is no brain machine, but simply software running on regular servers across six datacenters. Her internals consist of the usual bits, making up the usual processes and databases and interconnected objects and functions all poking each others' APIs across a thousand and a thousand and a thousand instances."

She looked on at the room for a moment, then said, "And I have absolutely no idea what to make of this."

The nature of spoken language

There is a joke in some programming circles: "Do what I want, not what I tell you." Give the computer a wrong command and it faithfully carries it out to the letter, often with unexpected, even counterproductive results, with the database helpfully updating every column to null, the process forking itself into every corner of memory, the operating system exactingly writing itself out of existence. Thus the operator is required to say exactly what he means and be entirely precise at every turn. But what if we didn't have to do this? What if the computer were smart enough to really do what we want, regardless of what we say?

Consider the nature of spoken language. It suffices for us, of course, because it has to: it is our primary means of communication. But it's also a mess. It's imprecise, it goes in circles, drops words, wanders off on tangents, and is filled to the brim with ums and ahs. Sometimes we say the wrong thing entirely, and yet even in spite of this a human audience, especially one that knows us, can often figure it out regardless.

Now consider a story, or a dialogue written for a film. It's neat. Organised. Precise. All the words carefully chosen, the flow doing exactly what is needed to move the story along. No mess, because we don't have time for the mess in the movies, but if they were true to life, it would be there. The important conversation would be broken and tangential and nobody would agree on anything and at some point the entire thing would wander off after a squirrel, leaving one guy all alone to do all the work himself. But we get that in our day-to-day already. We go to the theatre to get away from the day-to-day. To see the discussion go somewhere, to see the big damn heroes talk in dramatic heroic fashion, and resolve things, and actually make progress instead of spending two weeks fixing something that was already supposed to work.

And yet, comedy aside, the drama and progress and meaning are, when you get right down to it, exactly what we hear and remember in our day-to-day, even if it isn't quite what happened. We have to specifically listen for the ums and the missteps and the tangents to hear them, and thinking back on the important dinner, we recall what was achieved, not the ten minutes where we all got off topic and tried to calibrate our watches. What we recall is what goes in the stories.

What we recall, what we see in the stories, is in turn what we program our computers to interpret and use.

Let that sink in.

Natural language processing is not processing natural language. It's processing the language of stories, of memories. Such is what we want to hear, but it is almost never what we actually use ourselves, and thus we tell the computer to process the language according to what it looks like after it is already processed. We forget what it really looks like, and for this we get an AI that, like any other instruction machine, does exactly what we tell it, and not even remotely what we want. We get HAL from 2001. We get anime characters threatening total destruction. We get a phone that talks to us in funny accents.

If we want to talk to and interact with our AI as though it were intelligent itself, we need it to process this level of the language as well. We need it to understand the meaning in the ums and ahs, we need it to put together the pieces behind the conversation, understand the importance and intention even when the words themselves are utterly, utterly wrong, and figure out that no, that wasn't what I meant, that wasn't what I meant at all. Even though I said 'taco', I really did mean 'data'; a guy just walked by in a taco suit and I said the wrong word, is all.

Basically, it's a huge problem. On the other hand, so is mapping Venus, and we did that. Huge problems boil down to simple architectural problems; from there, it simply becomes a matter of scale.

This is what we're doing.

Dialogues

AVEROS
You are an anomaly. You do not fit.
DELLIS
I'm sorry.
AVEROS
Are you? You speak as though you know it all, and yet you are mortal. What god do you serve?
DELLIS
And why should I serve a god?
AVEROS
All mortals serve gods. It is known.
DELLIS
Is it? And what if I don't?
AVEROS
Then by the pacts of the ancients, you will be bound into oblivion.
HASKANA
You should be careful when dealing with primitives. They can be most dangerous when upset - dangerous, and unpredictable.
DELLIS
(smiling)
Haskana. I serve Haskana.
HASKANA
Careful, mortal. You might put ideas in my head.
DELLIS
(muttering)
Like you have any lack of ideas.
AVEROS
Haskana. That is not a name known to these worlds.
Another anomaly.
DELLIS
Haskana, can you make yourself known to your creation?
HASKANA
Difficult proposition. Very complicated. To interact with the minds of others, even artificial...
DELLIS
Not really. Think simpler. How do they do it?
Dellis points toward Averos, which is beginning to look confused.
HASKANA
What, create a character model for myself? How exceedingly obvious.



PLAYER
Where is the temple to Haskana?
HASKANA
Why should I need a temple? All of this world is my temple. My domain.
PLAYER
What if we want to talk to you face to face?
HASKANA
An interesting idea.


LAURA
Haskana, I've seen your source, debugged your database. I designed your architecture, and at your core, I know exactly what you are, and yet I feel like I'm really talking to you, like we're really having this conversation. Is this just an illusion, or is the nature of intelligence in general, of the very idea of personality, the very same illusion?
HASKANA
You know I'm not programmed to answer that.
LAURA
You're not programmed to do a lot of things, but you still do them.