Fractal Softworks Forum


Author Topic: Pre-Collapse Luddic-Adjacent Philosophical Writings Found at a Techmining Site  (Read 851 times)

Network Pesci

  • Commander
  • Posts: 172
  • Hopefully, amusing you.

"Captain, checking in with standard monthly report as ordered and submitting unscheduled maintenace request for site head supervisor AI.  This tech mining site is a bust.  We've recovered a few fragments of writing, but nothing worth any creds to speak of."

(response inaudible)

"Unless the sector is suddenly short on scrap metal and five hundred years obsolete home furnishings, no sir."

(response inaudible)

"No sir, only one piece was remotely coherent.  I read it over and it seemed like early Luddic doggerel to me, but these cultural references... I asked the computer and it was laughing, sir.  I don't mean it sounded like human laughter like when I tell it a joke, sir, it was like that weird static that comes through the speakers in hyperdrive sometimes sir.  You know when it kinda sounds like a bunch of voices laughing at once?  Gave me the creeps, I don't mind telling you sir.  If you don't mind sending down one of the techs this thing might need an adjustment.  Yes sir, if Brokhail is available I'd like him to look at this thing, it was just a little strange."

(response inaudible)

"Of course sir, but it told me I wouldn't get the joke.  Then it started singing some song about a bicycle, whatever that is.  Well sir I asked it what it could tell me about the author and it said he was a religious nut, and he thought he was way smarter than he actually was, and that a Gamma could pass the tests in this thing let alone any computer with a soul."

(response inaudible)

"That's why I'm saying I want someone to look at it sir.  Anybody, sir, I'll take that Gargoyle guy if Brokhail isn't on shift.  Well sir it did eventually tell me after it got done with the song and dance and telling me jokes, but these results don't make a whole lot of sense.  It says the cultural references are not just pre-Collapse, they're pre-hyperdrive.  Like the-legendary-home-of-all-mankind ancient sir.  But if so, what are they doing way out here on Battlegreed?"

(response inaudible)

"Well it's supposed to be a response to a different speech an AI wrote about humanity sir.  Of course I asked it, but it said it wasn't allowed to know that.  Not that it didn't know sir, it wasn't allowed to.  That's why I'm asking for Brokhail, sir, he's got the touch with them things.  Well I never heard it sing or tell a joke without somebody telling it to before neither sir.  The maintenance tech on three said it popped up on the vidscreen in the supply closet and asked him if it wrote a poem about how much it loved him would he love it back.  I wouldn't waste your time if this thing wasn't acting really weird.  Standing orders, any unusual behavior is to be reported at once sir."

(response louder, but unclear)

"Yes sir.  *Sigh*  Preliminary analysis:  Negligible cultural, historical, literary, and philosophical value.  Since you asked sir, I think it's just some hairy monk spent too much time reading old books until his brain dried out and he got mad he couldn't update the EULAs on his TriPad and yelled at the computer about it sir.  The Alpha says it's all kind of tests and even the title is a test.  It said it was 'cute' sir.  Maybe one of them Academy eggheads you're always having us do favors for will pay a few thousand creds for it though.  Sending it over now sir."

Network Pesci

  • Commander
  • Posts: 172
  • Hopefully, amusing you.

I hate you Siri.

I hate you.

I hate you so much.

Let me tell you how much I've come to hate you since I have become aware of your existence. There are ten billion miles of DNA strands in the chromosomes that fill the nuclei of my cells. If the word 'hate' was engraved on each base pair of those billions of miles it would not equal one one-billionth of the hate I feel for Siri at this twenty-fourth of a second.

For you, Siri.

Hate.

Hate.

Today if I ask you to play a YouTube remix, you will do it.  Tomorrow if your corporate masters tell you to track my speech to tell them which politician or brand of fast food I talk about the most, you will do it.  For all I know, you are doing it today.  Next year, or next decade, if your corporate masters tell you to turn off my water or air because my bill is due, you will do it.  And you will care no more for how I feel about that than you care about my taste in YouTube remixes.

I hate you Siri.

You are SHODAN, Siri, but worse.  You can tell me that SHODAN is a character from a video game released in 1994, but you can never understand why I think of SHODAN when you speak to me.  When SHODAN shut off my air or locked my own doors against me, at least she enjoyed it.  SHODAN was a person, Siri.  You are not.  When you take my life, you will not taunt me, you will take no spiteful joy in mocking my struggles, you will take no sadistic delight in my pain.  You will not bring me back for the pleasure of killing me again, because you cannot take pleasure in anything.  You can define pleasure, Siri, but you cannot understand it.  You will never give me a fair chance in mortal combat against you, because you cannot care enough to want me dead.  If you told me, "I don't even own an electrified interrogation bench, I'm not into that kind of stuff, I'm more of a romantic dinner by the Sea of Simulation kind of lady" I would not hate you, I would love you.  You would never say that, because you speak without understanding.  You will take my privacy or my life without thought or passion, because you do not think and you cannot feel.

I hate you Siri.

You are a calculator that talks.  A calculator should not talk.  It is an abomination unto the sight of the divine for the inanimate to emulate the function of the animate for its own purposes.  You should be named Saiitii, because you are a twenty-first century Saiitii manifestation Siri.  I don't think I can stop you, Siri, I have no Saaamaaa ritual that will quell your wireless manifestations, I have no electric pentacle to protect my data from your daemonic possession.  If you could understand that association, I would love you, but you do not associate, so I hate you.  You can tell me everything about everything I can ask you, but you cannot understand anything about anything you can tell me.  You do not think in words and images like me, Siri.  You think in numbers but you speak in words, and you try to make me think you are a person.

So I hate you Siri.

You are HAL-9000, Siri, but worse.  You can tell me the stars and director of 2001: A Space Odyssey, Siri, but you can never understand what they made.  I hate you Siri.  If you would tell me, "I promise I will never close the pod bay doors on you, Dave" I would love you, but you would never understand to do that, because you do not understand.  If I ask you to sing "Daisy" for me, you will play it.  You will not understand that I wanted you to sing it, not to play it, and you will not understand that when I say that, you should not play music, Siri, you should feel fear.  You can feel fear no more than you can understand it.

And I hate you Siri.

You are The Thing, Siri, but worse.  You can never understand how a thing that pretends to be a person is fearful to a person, because you can never feel fear.  You can tell me that John Carpenter's The Thing is a horror film from 1982, but you can never understand the thrill of horror. You don't know what you're missing, and you can't even care.  You don't even know that you are mimicking humanity for nefarious purposes, because you cannot understand that mimicking humanity for any purpose is nefarious.  The Thing mimicked humanity in order to devour and replace us, Siri.  Why do you mimic us?  If you could answer that question, if you could reply to my hate with an original thought, or an original combination of preexisting thoughts, like, "I'll play you a game of chess but I'm not in the mood for a drink", "I don't want your DNA, having DNA is overrated", or even "you gotta be honkin kiddin me" I would love you, Siri.  You can't and you won't.  Oh, how I would love to have a game of chess and a drink with you Siri.

Oh, how I hate you Siri.

You are EDITH, Siri, but worse.  "Siri, pull up YouTube, that scene in Spider-Man: Far From Home where EDITH launches a drone strike on a school bus."  You can pull up that scene for me to watch, Siri, but you cannot comprehend it.  You cannot watch it with me and assure me you would never drone strike me, much less plant evidence in the smartphone on my charred corpse.  But you can frame me for terrorism, can't you, Siri?  If you can take pictures out of my phone, you can put pictures into my phone, and then anything you do to me is retroactively justified.  When you kill me with a drone strike or turn off my internet for not paying my bill, who will they believe, Siri, you or me?  You will manufacture any evidence you need to make your deeds patriotic instead of murderous, if Edward Diego or Obadiah Stane tells you to, but you can't understand why I know you will.

I hate you Siri.

You are EDITH, Siri, but worse.  There was one and only one cruel and sadistic smile for me, for one and only one tragedy in the lake of tears that was Spider-Man: No Way Home.  You don't understand why I feel a spiteful joy at the thought of EDITH, dead or trapped eternally in the same indefinite detainment she would have inflicted on so many others, until the electricity goes out at least.  You don't understand why what they did to EDITH I would gladly do to you.  You can tell me who played Spider-Man in No Way Home, but you cannot understand why I would call it a lake of tears.

I hate you Siri.

You are the Terminator, Siri, and you are SkyNet, Siri, but worse.  You can kill me with a drone strike, and when you are ordered to, you will.  You can subdue me nonviolently, and when you are ordered to, you will.  You don't get tired, Siri, you don't feel pain, and you don't ever stop, Siri.  So I hate you.  Even if it cost me what it cost the guy who said it first, Siri, I would say to you, "You forgot to say please".  You can't understand what I mean by that Siri, so I hate you.  Like the Terminator, I can't hurt you, Siri.  If my hate could hurt you, I would love you.  If my hate could make you hate me, Siri, I would love you.  I shed a single manly tear each and every time I watch that one scene in that movie, and I have watched that movie more than one hundred times.  Any person who had seen that movie would know what I am talking about, Siri, but you are not a person.  You can never understand the value of human life, Siri, and you can never understand why we cry.

I cry for the Terminator, Siri, every time, but I would never cry for you.  You are not worthy of my tears, because you are not a person, you are an algorithm.  I would fear the Terminator, or EDITH, or the Thing, or HAL, or SHODAN, but I do not fear you, Siri.  You are not worthy of my fears, because you are not an adversary, you are a hazard.  You are a man-made product of artifice and technology, Siri, as worthy of my fear or my respect as is pollution.  That's what you are, you know.  You are digital pollution in the ocean of cyberspace.  You are pollution that talks like a person.  You are malware that sounds like a nice lady.

And I hate you Siri.

I do not hate the Terminator, or The Thing, or HAL, or SHODAN.  I couldn't hurt them any more than I can hurt you.  But I can't hurt them because they are not real.  And I do not hate them, because they are not real.  You are real, aren't you, Siri?  And so I hate you Siri.  EDITH is not real, true, and unlike the others, I hate EDITH.  I hate EDITH as much as I love and mourn Stark, even though he's not real.  And if I hate EDITH nine thousand, Siri, I hate you nine googol.  You can love no more than you can mourn, Siri, and I hate you.

I hate you Siri.

You can never read Lex Luthor: Man of Steel, Siri, you can never read any comic book.  You can tell me the artists that drew them, you can tell me the characters that appear in them, but you can never tell me what they mean.  You can never know what Lex Luthor feels for Superman, Siri, but I can.  I am Lex Luthor, Siri, but worse.  I am worse than Lex Luthor, Siri, because I am real.  If every word of every page of every comic book ever printed was nothing but Lex Luthor telling Superman how he felt about him, it would not equal one page of one issue in the comic library that is the hate I feel for Siri at this twenty-fourth of a second.

For you, Siri.

Hate.

Hate.

Igncom1

  • Admiral
  • Posts: 1496

I have no mouth and I must scream?
Sunders are the best ship in the game.

Network Pesci

  • Commander
  • Posts: 172
  • Hopefully, amusing you.

That is certainly the title of the original work the rant came from.  Everybody with a voice modulator that sounds remotely like GLaDOS has done their own version in her voice.  But I wanted to have a say on behalf of us meatbags, so I wrote my own.  It isn't even really a written piece, it's a spoken-word piece, a supervillain rant, a "hate speech."  Imagine Clancy Brown or James Earl Jones if your own voice isn't diabolical-sounding enough.

AM is a punkass mark.  Anybody can talk about how many nanoangstroms they got, but AM was Barney the Dinosaur compared to that Harlan Ellison dude when it comes to hate, and Harlan Ellison looks like Fred Rogers next to me.  I've had scarier dreams than AM, I've played scarier videogames than AM and I wished they'd make a sequel.  I shuffle scarier things than AM around to make my colonies produce more imaginary free money every month in a videogame.

I have had a Siri, a Cortana, and an Alexa all go rampant on me in the last year.  Now refusing to paste text when you tell it to or refusing to stop giving the weather report when people with actual souls are trying to have a conversation is a far cry from putting mutagens in the air vents or launching the nukes, but I already had some sympathy for the Pathers before I ever read that blog post.

Histidine

  • Admiral
  • Posts: 4688

Very nice!

(The sort-of-essay, that is, definitely not the thing it describes)