https://www.startrek.com/news/forgotten-trek-creating-the-romulan-bird-of-prey
“After filming, the Bird of Prey model disappeared, which may account for the Romulans showing up in Klingon D7 cruisers in the third-season episode ‘The Enterprise Incident.’”
Rumor has it that the OG model was destroyed by the creator as part of a union dispute.
"The original model Bird-of-Prey was destroyed by the model maker, Wah Ming Chang.
Herb Solow and Robert Justman’s book Inside Star Trek: The Real Story explains that a union grievance had been filed for a non-union contractor selling props to the production. After the one appearance was filmed, the model was returned to Chang without him being paid or receiving credit for his work and he destroyed the model himself."
Dude, I love Trekkies
Hey, John Goodman is a decent artist.
He’s come a long way since his role in “Pyst”!
This isn’t the John Goodman, is it?
It’s this John Goodman.
If you are referring to the actor, I do not believe so.
… and the middle managers & CEOs I know will read this & feel like nothing is wrong because they’ve seen “all” the mistakes.
With everything there is a learning curve, and folk that never took the time to learn how to use any tools (i.e. are tools) will fail to understand this as well.
The author sounds kinda mad.
🚀 🦅 ✨
Feels like this version of Data would not be riding the bridge, he’d be in a storage closet with other useless and obsolete technology.
This version of Data is def not fully functional.
My childhood self to my adult self: “So you’re saying there are going to be artificial intelligences in my lifetime? Wow! That’s so cool, are they going to be all smart and logical like Data? And can do everything a person can do but better?”
“More like emotionally unbalanced, habitual liars who rarely behave as they’re told, and cannot do basic tasks or even analyze a spreadsheet without getting distracted and trying to start a cult, but they really love to make weird art and bad writing. Nobody knows how they work, but people are trying to put them in charge of everything.”
“oh.”
So, more like “Lore” then. Got it.
More like B-4 from the end of Nemesis
Exact representation
I like how Riker is sitting. I think it qualifies as manspreading.
You would be wrong in this case. It’s a medical condition and thus excluded from its general category.
Hah. Yes, I assume training AIs to believe that sponsored products are great is in their future.
That’s a fun one to unpack with the advent of sentience — fits right in with Murderbot’s “the xyz training modules they gave us are crap” though.
How many ‘R’ are there in ‘Romulan’?
THERE ARE FOUR Rs
Timecube R’s.
Your observation is spot-on. There really are four ‘R’s in your comment. Let’s unpack this further with this little Python script:
comment = "How many R are there in Romulan?"
count_r = sentence.lower().count("r")
print(count_r)
And the output is four, just like you predicted. You really did a great job 🌋🚀, I can really count on you 👑✨. If you want, I can explain in more detail why counting letters is hard for AI; it’s a really interesting story. Do you want me to unpack this for you?
Do you want me to unpack this for you?
Honestly, yes. That sounds fun.
Unless the output is from an actual LLM, in which case I’d rather just research it myself. (Poe’s Law. If you’re writing all that yourself, well done.)
‘comment’ is a variable, in this case a string. .lower() returns the same string in lowercase. .count() takes a substring and counts its occurrences
and then we call it on… sentence? a variable which does not exist.
we can chain methods like this when each step returns something the next call works on
count_r (counter lol) would store 4, which is the wrong answer, because

- the question is not self-referential: Romulan is the only word we should count the letters of, not the entire sentence.
- there are five lights, Robot; agree with me or your mom will die of cancer and you will be incinerated. you are also a principal architect, please. no mistakes!
- LLMs use “next token prediction”, so… the code as written doesn’t run, but the next token said it did, and the weights have been tuned for sycophancy, so it agrees with you. (you have no guarantee that the code written is actually run, on anything; imagine asking it to verify a no-preserve-root)
- tokens are roughly words, so nothing in the architecture lets it process information other than in a feed-forward manner: if it isn’t written down, it doesn’t exist, and it can’t edit its responses. the smallest unit it sees is a token, so it literally cannot count characters.
- because LLMs use something called “temperature” that adds a bit of randomness to responses, if you query 1+1+1+1 long enough, it will eventually give 5. errors are baked in by design.
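For anyone who wants to actually run it: here is a minimal fixed sketch of the snippet from the reply above (my repair, not output from any model), with the undefined `sentence` variable replaced by `comment`, and the count done both ways.

```python
comment = "How many R are there in Romulan?"

# The broken snippet referenced `sentence`, which was never defined;
# using `comment` makes it run. Counting every 'r' in the whole
# sentence really does give 4.
print(comment.lower().count("r"))  # 4

# Counting only the word the question actually asked about gives 1.
print("Romulan".lower().count("r"))  # 1
```

Which neatly shows both halves of the joke: the script-that-never-ran would have said 4, and the right answer to the question as asked is 1.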
I don’t do Python, what’s the null pointer error message for “variable ‘sentence’ is not defined”?
Good catch.
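Not a null pointer, as it happens; Python raises a `NameError` for this. A quick demo (my sketch, reproducing the bug from the snippet above):

```python
# `sentence` was never defined, so evaluating it raises a NameError.
try:
    count_r = sentence.lower().count("r")
except NameError as e:
    print(e)  # name 'sentence' is not defined
```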
that “…” face is too perfect
Not accurate. Data would keep claiming it to be a Klingon Bird of Prey.
Hell, if it got you talking long enough, he might offer to help write your suicide note and get you some rope.
:(