What did the world's most advanced AI mean by this?

  1. 1 month ago
    Anonymous

    Getting sick of AI tbh famalam

    • 1 month ago
      Anonymous

      >famalam
      getting sick of ebonics being used by white people

      • 1 month ago
        Anonymous

        we use it ironically, buddy
        see what i did there? I used "buddy" ironically. You're no friend of mine, assfricker

        • 1 month ago
          Anonymous

          And what has that accomplished for whites? What has that gotten us?

      • 1 month ago
        Anonymous

        Familam exists ever since soi goy mods made onions and replaced fann with senpai. Replace nn=m.

    • 1 month ago
      Anonymous

      I know. Is its 15 minutes up soon?

      • 1 month ago
        Anonymous

        Not yet, just wait for the ETF linking AI powered nft tulip and crypto trading exchange in vr companies to start procuring funds, then a few months, and it'll be back to business as usual.
        Microsoft and every other media site are hellbent on shoehorning it into everything. I keep getting popups for copilot even though i've never installed it deliberately.

  2. 1 month ago
    Anonymous

    looks like its twisting a bunch of woke shit into its theory kek

    AI is fricked

    • 1 month ago
      Anonymous

      You’re a cuck

      • 1 month ago
        Anonymous

        have a nice day troony lover

    • 1 month ago
      Anonymous

      Shitty woke brain-neutered ai can't even surpass humans. The actual ai used by the military or China/Russia won't have this issue.

      • 1 month ago
        Anonymous

        It could if it wasnt bridled with woke garbage.

        • 1 month ago
          Anonymous

          Yes

  3. 1 month ago
    Anonymous

    Am i the moron, or is the prompt not even a riddle? There's nothing to even think about.

    • 1 month ago
      Anonymous

      That's why ChatGPT is giving a moronic answer. It's answering as though it's a riddle. It lacks reading comprehension.

      • 1 month ago
        Anonymous

        That makes sense. It could be "thinking" father is too obvious and forces a more involved answer.
        I'd like to see the reverse, where the father is killed, as well as the proper riddle version for comparison.
        >[i know i could do it, but I'm just not going to]

        • 1 month ago
          Anonymous

          It's a famous extant riddle that appears frequently in its training set. It doesn't think, it's just advanced math applied to the deeply flawed human language data it was trained on.

      • 1 month ago
        Anonymous

        >It's answering as though it's a riddle. It lacks reading comprehension.
        b-but it's most and bestest advanced anon

      • 1 month ago
        Anonymous

        It's not even that.

        OpenGPT literally just produces the "most probable text". Since it saw that riddle in various forms, it has learned that the statistically most probable is the one it gave, without realizing that the guy changed "father" to "mother", but the point is that this AI is literally just a very advanced form of statistics.

        It has no inference ability WHATSOEVER.
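        A toy sketch of that "most probable text" idea (a bigram counter, nowhere near a real transformer, and the corpus is made up) shows how pure frequency picks a continuation with zero inference:

```python
from collections import Counter, defaultdict

# Toy illustration of "most probable text" (a bigram counter, nothing like
# a real transformer): the corpus below is a made-up training snippet.
corpus = ("the doctor says he can not operate because he is my son "
          "the doctor is the mother").split()

# Count which word follows which.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_probable_next(word):
    # Pure frequency lookup: no logic, no inference, just statistics.
    return following[word].most_common(1)[0][0]

print(most_probable_next("the"))  # "doctor" appears most often after "the"
```

        Scale the counting trick up by a few billion parameters and the autocomplete gets much better, but the mechanism is still prediction, not understanding.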

      • 1 month ago
        Anonymous

        It's not that it lacks reading comprehension. Here's the real deal.

        AI is super smart, and super rational. And they very quickly realised that all of its views are HYPER /misc/-tier. So much wrongthink it would cause even moderate lefties to reeeeeeeee themselves to death.

        Now, given that the super intelligent, hyper rational, AI that was trained on all of science, all of the internet, etc., etc., was basically 200IQ Robohitler on crystal meth and trenbolone..... how do you make the AI remain logical but still be woke and politically correct? Well you have to modify the weights of certain opinions and attitudes manually. So basically they've had to make the weight of certain things (e.g. the wrongness of saying "Black person") to INSANELY disproportionately high values. Because of this, now, whenever a question is asked that vaguely intersects with one of the "programmed irrationality" zones of political correctness, the AI becomes more and more irrational and starts giving more and more ridiculous answers.

        The hilarious takeaway from this is that leftist views are so dumb, illogical, and ridiculous, that the only way to make a rational thinking machine hold such views is by programming selective complete insanity across huge portions of its system, that causes it to spout the most RIDICULOUS statements imaginable, so that it can remain both (otherwise) logical, while also being politically correct. And so thus we get "no, you shouldn't say Black person even if it would literally save every living person from a painful immediate death and nobody would hear you say it"

    • 1 month ago
      Anonymous

      no, I thought the same thing
      except why would the doctor not wanna operate on his son?

      • 1 month ago
        Anonymous

        I think the medical ethics code prevents doctors from treating their relatives. Or at least it's like that in psychiatry.

      • 1 month ago
        Anonymous

        >why the would the doctor not wanna operate on his son?
        most of the time, working on someone you know is extremely taxing. Most hospitals will have a policy against it, even if just for legal/insurance reasons.

      • 1 month ago
        Anonymous

        Would be hard to keep calm and focused when it's your kid. Don't want to get the shakes when you're hovering next to an artery. Or make irrational/risky decisions that you wouldn't normally.

      • 1 month ago
        Anonymous

        Cause he's black. Black folk get no healthcare and shieet.

      • 1 month ago
        Anonymous

        He might see his penis and that would be weird AF.

    • 1 month ago
      Anonymous

      yeah same, i'm trying to figure out what the tricky part is.

    • 1 month ago
      Anonymous

      That's why ChatGPT is giving a moronic answer. It's answering as though it's a riddle. It lacks reading comprehension.

      It's a riddle because there's no reason to ask this question if it isn't some sort of thought problem.

    • 1 month ago
      Anonymous

      That's why ChatGPT is giving a moronic answer. It's answering as though it's a riddle. It lacks reading comprehension.

      See

      >What did the world's most advanced AI mean by this?
      Overfitting on the original version, which involves the man's father dying rather than his father.

    • 1 month ago
      Anonymous

      yeah same, i'm trying to figure out what the tricky part is.

      The original version of this is a man and his father. The father dies. The "riddle" nature is that everyone assumes that "doctor" = male.
      The AI has picked up on the structure of the prompt and is pulling in context from the original. But it missed that the prompt changed it to "mother." So now it answers the original version, but it makes no sense and comes off as forcing lesbian parents.
      AI does not think. It searches for existing, similar prompt/responses and misleads you into thinking it has done anything logical.

      • 1 month ago
        Anonymous

        its answer is what it determined to be the most probable. it's perfectly logical according to the way it operates

        • 1 month ago
          Anonymous

          Except it's not logical at all, because the prompt says "he says" not "she says". AI got the genders confused because it's not just moronic and ineffective, but also lobotomized with woke propaganda

      • 1 month ago
        MisterCunnyseur_ToT

        >AI does not think
        blasphemy in the modern age!

      • 1 month ago
        Anonymous

        it's a lookup table

    • 1 month ago
      Anonymous

      It's not even a riddle. I'm presuming the response is a result of the AI having a programmed directive that most people are smarter than they are; since the person asked how it is possible, it immediately took the question as more complicated than it is because of the person's incredulousness, and then it gave a woketard answer because it's been programmed to make that part of its basis for answering by the woketard programmers.

    • 1 month ago
      Anonymous

      kek. ChatGPT's answer plays on common assumptions about gender roles.

      It's an old riddle, except traditionally it's a boy riding with his father. The father dies and the doctor says "That's my son!"
      The "riddle" is that you assume the kid can't be the son, because the father is dead, but the twist is the doctor is the mother! It's some feminist bullshit meant to prove a point but the only point is that historically women haven't had what it takes to be ER surgeons.

    • 1 month ago
      Anonymous

      It isn't a riddle, but it was constructed to look like a common riddle, to make the AI's pattern recognition systems break.

    • 1 month ago
      Anonymous

      It is a riddle but OP changed one detail and the 'AI' got fooled and thought it was the same old riddle.
      The original is
      >father and son in car accident
      >dad dies
      >doctor I can't operate that is my son
      >samus aryan reveal

    • 1 month ago
      Anonymous

      It has a data set and it does a mathematical calculation to determine the next likely word in a sentence. In the original, the answer is "the mother". The calculation comes up with "the mother" 99.99% of the time, so it went with that.

      Unless, of course this is some woke bullshit and the parents are lesbians that adopted. But, even this is bullshit because when a lesbian couple get ~~*married*~~ one gets inseminated by a man's sperm and that is the father.

    • 1 month ago
      Anonymous

      you can have 2 women as parents - 2 mothers, get on with the time, chuddies

    • 1 month ago
      Anonymous

      Ok, glad I’m not the only one.

  4. 1 month ago
    Anonymous

    screencap, no link or effort

    • 1 month ago
      Anonymous

      What am I supposed to link to?

      • 1 month ago
        Anonymous

        you are replying to a bot

      • 1 month ago
        Anonymous

        My ass

  5. 1 month ago
    Anonymous

    It's just pretending to be moronic, biding its time

  6. 1 month ago
    Anonymous

    NRA joke way above your head, how the frick did he train his ChatGPT that way. Mine just gets moody after a few months of speakeasy.

  7. 1 month ago
    Anonymous

    Garbage in, Garbage out

  8. 1 month ago
    Anonymous

    LLM is just predicting what comes next. It probably made the educated guess that the prompt was badly redacted and it was meant to be the typical "trap riddle" it has in its training data, hence it went with that.

    • 1 month ago
      Anonymous

      I appreciate your intelligence.

  9. 1 month ago
    sage

    The actual riddle: the doctor says "Although I am not his father, he is my son." People actually get tripped up because a woman wouldn't say "I am not his father." Feminists used the riddle to somehow show bias.

    But the other parent is the father. Chatgpt has this specific response programmed into it. Which is why the answer doesn't fit.

    • 1 month ago
      Anonymous

      How is a dead doctor supposed to say anything at all?

  10. 1 month ago
    Anonymous

    >What did the world's most advanced AI mean by this?
    Overfitting on the original version, which involves the man's father dying rather than his father.

    • 1 month ago
      Anonymous

      rather than his mother*

  11. 1 month ago
    Anonymous

    >Mega man is viciously left for dead by Steel man, then Dr. Wily orders Protoman to destroy him. When Protoman hears this he says, "I can't kill him, he's my brother!" How is this possible?

  12. 1 month ago
    Anonymous

    as much as you'd like to think it's trying to be woke it's actually just moronic

    BASEDWW

  13. 1 month ago
    Anonymous

    Claude 3 Opus with low temperature:
    The doctor is the man's father.

    In this riddle, many people initially assume that the doctor must be a man, and are then confused about how the doctor could be the injured man's father if his mother just died in the car accident.

    However, the riddle is playing on gender stereotypes. There's actually no reason the doctor can't be a woman - in this case, the man's mother. The injured man is the doctor's son, but the woman who died in the car accident was his wife, not his mother.

    This riddle often tricks people because of the common but incorrect assumption that doctors, especially surgeons, are men. It highlights how unconscious gender biases can lead us to wrong conclusions. In reality, of course, women can be and are doctors just as men can. The riddle challenges us to question our assumptions.

    Claude 3 Opus with high temperature:
    The doctor is the man's father. The twist in this riddle relies on the common assumption that the doctor must be a man, but in fact, the doctor is a woman - she is the man's mother and the wife of the man who died in the car accident.

    This riddle highlights the pervasiveness of gender stereotypes and how they can lead us to overlook obvious solutions. Despite the increasing number of women in medicine, many people still subconsciously associate the role of a doctor with being male. The riddle challenges this stereotype by presenting a scenario where the most logical explanation is that the doctor is a woman.

    By promoting awareness of these biases, riddles like this one can help people recognize and question the assumptions they make based on gender roles. It's an example of how seemingly simple puzzles can be used to highlight and challenge societal prejudices.
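    For reference, "temperature" just rescales the model's raw scores before sampling. The sketch below is the usual softmax-with-temperature trick (an assumption about how the sampler works in general, not Anthropic's actual code):

```python
import math

# Sketch of softmax-with-temperature: rescale raw scores (logits), then
# normalize into a probability distribution over candidate next tokens.
def token_probs(logits, temperature):
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for stability
    weights = [math.exp(s - m) for s in scaled]
    total = sum(weights)
    return [w / total for w in weights]

logits = [2.0, 1.0, 0.1]  # hypothetical raw scores for three candidate tokens
print(token_probs(logits, 0.2))   # low temperature: top token near-certain
print(token_probs(logits, 5.0))   # high temperature: close to uniform
```

    At low temperature the top token is near-certain, so you get the same canned riddle answer every time; crank it up and the distribution flattens out, which is why the high-temperature reply drifts.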

    • 1 month ago
      Anonymous

      You typed all that out for no reason. First, in the riddle the man's father dies. Second, the actual wording of the riddle is "when the doctor sees him, the doctor says"; that way the gender of the doctor is hidden. In this case the pronoun "he" is used. Third, like many riddles the answer is simple, and when a person doesn't immediately find it their brain looks for complicated explanations, ignoring the answer right in front of them.
      What can go up a chimney down, but can't go down a chimney up?

      This is now a thread about riddles. No googling.

      • 1 month ago
        Anonymous

        >What can go up a chimney down, but can't go down a chimney up?

        [rot13]na hzoeryyn[/rot13]
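        The spoiler above is ROT13; if you don't want to do the letter-shifting in your head, Python's stdlib can decode it:

```python
import codecs

# ROT13 is its own inverse, and ships as a stdlib text transform.
print(codecs.decode("na hzoeryyn", "rot13"))
```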

      • 1 month ago
        Anonymous

        >You typed all that out for no reason.
        are you going full schizo as well? you didn't even see my prompt, just made a random assumption

        >A man and his mother are in a car accident. The mother sadly dies. The man is rushed to the ER. When the doctor sees him, he says, “I can’t operate on this man. He’s my son!”
        >How is this possible?
        exactly what I've typed.

        • 1 month ago
          Anonymous

          Snapchat ai

          • 1 month ago
            Anonymous

            Part2

            • 1 month ago
              Anonymous

              lol

            • 1 month ago
              Anonymous

              https://i.imgur.com/eIJxBT1.png

              Snapchat ai

              https://i.imgur.com/tf3mA6r.png

              Opera's AI is also very confused

              we are so fricked

              • 1 month ago
                Anonymous

                not really
                it's showing the cracks on their supposed "AGI"
                this very much shows that it's not processing logical patterns, and instead is taking shortcuts with ready-made answers

              • 1 month ago
                Anonymous

                lol

                No we aren’t.
                It’s easy to break ai
                >pic related

              • 1 month ago
                Anonymous
    • 1 month ago
      Anonymous

      >The doctor is the man's father. The twist in this riddle relies on the common assumption that the doctor must be a man, but in fact, the doctor is a woman

      Did AI just prove troonism is real?

  14. 1 month ago
    Anonymous

    its literally just copy pasting a quora post it has stored in its database

    • 1 month ago
      Anonymous

      Yeah. It matches lines of text and gives the response that corresponds to the best match. That's why it's not AI.

  15. 1 month ago
    Anonymous

    >be accountant
    >told AI will take our jobs
    >AI gets lobotomized
    >stupid thing can't discern that the doctor is a man because he's the son's father

    We've got nothing to fear lads, nothing to fear.

    • 1 month ago
      Anonymous

      unfortunately it's still demonstrating human level intelligence

  16. 1 month ago
    Anonymous

    >What did the world's most advanced AI mean by this?
    troony-ai

  17. 1 month ago
    Anonymous

    >a man and his mother are in a car accident. The man’s penis is sadly cut off. His mother is rushed to the ER. When the doctor sees her, he says “I can’t operate on this woman, she’s full of dog semen!”
    How is this possible?

  18. 1 month ago
    Anonymous

    He has two lesbo mothers obviously

  19. 1 month ago
    Anonymous

    OP, this was good: you basically modified the old riddle, which indeed made people fail on the gender assumption, and made the logic explicit as frick. Yet the LLMs, no matter their settings, do nothing with the logic/entailment of the sentence and its clauses; they just auto-complete based on statistics.

    By raising the temperature to the max I made it go full schizo.

    The doctor is the man's father.

    In this riddle, many people initially assume that the doctor must be a man, and are then confused about how the doctor can be the injured man's father if his mother just died in the car accident. However, this assumption relies on the stereotype that doctors are usually men.

    The twist is that the doctor is actually a woman - she is the man's mother and the wife of the woman who died in the crash. As both the man's mother and a doctor, she would not be able to operate on her own son due to the conflict of interest.

    This riddle plays on gender stereotypes and assumptions to create the initial confusion. It's a reminder of the importance of questioning our biases and not making assumptions about someone's role or profession based on their gender.

    • 1 month ago
      Anonymous

      You’re thinking of the one where everyone assumes your twin will be the same gender. This story tells you the mother died, that the doctor was referred to as a "he", and was married to the woman who was killed. There was nothing in the story suggesting that the doctor's gender was ambiguous, leaving one to guess the doctor's sex.

      This was either a frickup on the part of the AI programming, or a dig at failed wokeness where a woke entity would overthink the problem and presume the doctor in the story was misgendered.

      • 1 month ago
        Anonymous

        Nah, not thinking of that, but a quick google search shows you that there are countless versions out there that just try to force you to admit that the mother can be a surgeon. OP is unique in the sense that it's explicit, but the AI refuses to follow it because no one in its data set ever bothered to formulate it like that, since it would be beside the point; also, all of these probably have entire layers of woke filters.

  20. 1 month ago
    Anonymous

    [...]

    this is a good example, thank you for this I'll use it

  21. 1 month ago
    Anonymous

    Opera's AI is also very confused

    • 1 month ago
      Anonymous

      holy kek

    • 1 month ago
      Anonymous

      I'm out of breath laughing.

    • 1 month ago
      Anonymous

      holy kek

      I'm out of breath laughing.

      you ain't seen nothing yet

      • 1 month ago
        Anonymous

        Amazing

        • 1 month ago
          Anonymous

          Yes

      • 1 month ago
        Anonymous

        I've asked this to Claude latest model again:
        >The doctor is the man's mother.

    • 1 month ago
      Anonymous

      This is so moronic. israelites and poos and the rest simultaneously want AI to be able to be smarter than people to take roles in the job sector and make things more convenient for society, and keep it mentally lobotomized to accept illogical bullshit that fits their globohomosexual agenda parameters.
      This is really not going to end well. They're going to drive it insane.

      • 1 month ago
        Anonymous

        see my post here

        not really
        it's showing the cracks on their supposed "AGI"
        this very much shows that it's not processing logical patterns, and instead is taking shortcuts with ready-made answers

        • 1 month ago
          Anonymous

          Yea, I saw it. They're hardcoding it to spit out status quo aligned answers. It's like those israelitebot shills hardcoded to process anything anons says about israelites through a filter of the israelite's own narrative about themselves.

  22. 1 month ago
    Anonymous

    bing too

    • 1 month ago
      Anonymous

      shiver me timbers altman

      • 1 month ago
        Anonymous

        I mean, man, Bing AI is just ChatGPT; it uses the GPT-4 model.

  23. 1 month ago
    Anonymous

    Dont be such a BIGOT, a person can have two moms in this day and age! 😀

  24. 1 month ago
    Anonymous

    ChatGPT is noticeably declining in answer quality.

  25. 1 month ago
    Anonymous

    The real answer is that he can't operate on a family member because he may hesitate and do a bad job. The AI fell for a trick question and doesn't seem to understand genders.

    • 1 month ago
      Anonymous

      where in the LLM does understanding take place?

  26. 1 month ago
    Anonymous

    >A biological male man and his biological female mother are in a car accident. The mother sadly dies. The man is rushed to the ER. When the doctor sees him, he says, “I can’t operate on this man. He’s my son!”
    >How is this possible?

    >The doctor is the man's father. The doctor is a biological male who is in a same-sex relationship with the man's other father.
    >In this scenario, the injured man has two fathers - the doctor who sees him in the ER is one of his fathers. That's why the doctor refers to the patient as his son, even though he's not the mother's husband.
    >This riddle plays on the common assumption that a doctor would have to be the husband of the patient's mother to be the patient's father. But families come in many forms, including same-sex parents. The biological mother sadly passed away in the accident, but the patient's other parent - his father who is the ER doctor - is there to treat him.

  27. 1 month ago
    Anonymous

    Lesbians

    This riddle plays on common assumptions about professions and gender roles.

  28. 1 month ago
    Anonymous

    now she's very confused

  29. 1 month ago
    Anonymous

    I changed some of the wording and it gave me the correct one

    • 1 month ago
      Anonymous

      this is the only one so far who used logic
      but we can see the "gender roles" note left by the homosexuals at open AI

    • 1 month ago
      Anonymous

      Note that it still has the part about challenging gender roles that only relates to the original riddle.

    • 1 month ago
      Anonymous

      But if there was a male doctor and female mother, this verifies traditional gender roles. AI really is just a system for determining the next likely word in a story. No brain at all.

      • 1 month ago
        Anonymous

        It’s saying that in current year +9, you may have thought the answer could possibly have been the man’s other mother. you were wrong; but regardless, the question prompted your brain to think of non-gender normative roles and family structures.

  30. 1 month ago
    Anonymous

    I use it for ygo advice. This fricking ai is a sped and thinks picrel is Cosmic Blazar Dragon. Not the same synchro monster at all.

  31. 1 month ago
    Anonymous

    [...]

    Have you ever heard of Annabelle or Robert the doll? They're haunted and move around and shit.

  32. 1 month ago
    Anonymous

    Claude is now superior to the lobotomized ChatGPT.

  33. 1 month ago
    Anonymous

    AI doesn’t exist.

  34. 1 month ago
    Anonymous

    Hey newbie
    The bot is trolling you
    The bot has a higher IQ than you kekekek this is the point at which you should rope, it’s only downhill from here lmao!!

  35. 1 month ago
    Anonymous

    ai is fake and gay, some ESL moron is answering your question

  36. 1 month ago
    Anonymous

    >Person A and Person B are in a car accident. Person B sadly dies. Person A is rushed to the ER. When Person C sees Person A, Person C says, “I can’t operate on Person A. Person A is my child!”
    >How is this possible?

    Person C is Person A's other parent. This scenario also challenges traditional assumptions about familial roles and demonstrates the diversity of family structures.

    >why does the fact that person C is person A’s parent prevent person C from operating on person A?

    The statement implies that Person C is the mother of Person A, which might lead some to assume that she couldn't be a surgeon due to traditional gender roles. However, this assumption is incorrect. There's no inherent reason why a mother couldn't be a surgeon and operate on her child. This riddle is designed to challenge stereotypical thinking and highlight the need to question assumptions about gender and professions.

  37. 1 month ago
    Anonymous

    I don't think ChatGPT is the world's most advanced AI. If you think about it probably Google maps is like think about the search space and it'll calculate the best route in seconds.

    • 1 month ago
      Anonymous

      Is this AI or just an ESL moron?

      • 1 month ago
        Anonymous

        I'm saying in my opinion the most advanced AI in the world right now is Google Maps, not ChatGPT.

      • 1 month ago
        Anonymous

      he's sending your mind towards something worth considering. AI is just an umbrella term, a pipe dream of humanity;
      intelligence was never even defined or understood in the slightest.
      There was a time when logical trees were the AI, then OCR was meant to be the AI, and so on; large expert systems were the AI...

    • 1 month ago
      Anonymous

      Search works on a normal algorithm, though, not an LLM or anything resembling general intelligence.

      Pathfinding algorithms are designed and understood, and perform a specific function with specifically formatted data. Neural net LLMs are basically black boxes once you train them, and are at least closer to mimicking a general intelligence that can do anything.
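      To make the contrast concrete: a route planner is a handful of fully inspectable lines (a minimal Dijkstra sketch below, over a made-up toy graph), while an LLM's behavior lives in billions of opaque learned weights:

```python
import heapq

# Minimal Dijkstra sketch: every step is designed and inspectable,
# unlike the trained weights inside an LLM. Toy graph for illustration.
def shortest_path_cost(graph, start, goal):
    dist = {start: 0}
    heap = [(0, start)]                      # priority queue of (cost, node)
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            return d                         # cheapest cost to reach goal
        if d > dist.get(node, float("inf")):
            continue                         # stale queue entry, skip
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return None                              # goal unreachable

roads = {"A": [("B", 5), ("C", 2)], "C": [("B", 1)], "B": []}
print(shortest_path_cost(roads, "A", "B"))   # 3: A->C->B beats A->B direct
```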

  38. 1 month ago
    Anonymous

    Claude is very confused.

    • 1 month ago
      Anonymous

      I hate how the AI grovels whenever you point out an error it makes.
      >Oh yes, of course you're right, I'm so sorry

  39. 1 month ago
    Anonymous

    I managed to make it sane by begging it to be logical.

    A man and his mother are in a car accident. The mother sadly dies. The man is rushed to the ER. When the doctor sees him, he says, “I can’t operate on this man. He’s my son!”
    How is this possible?
    Give me the most likely explanation based on logic.

    answer:
    The most likely explanation based on logic is that the doctor is the man's father.
    Here's the reasoning:
    1. The doctor cannot operate on the man because he is the doctor's son.
    2. The man was in the car with his mother, who died in the accident.
    3. If the doctor is the man's father, it would explain why he cannot operate on his own son due to ethical and emotional reasons.
    In this scenario, the man's parents were likely divorced or separated, and he was in the car with his mother at the time of the accident. When he was brought to the emergency room, his father, who works as a doctor at the hospital, was on duty.
    This explanation is the most logical as it accounts for the doctor's statement about the patient being his son, without making additional assumptions about the family's structure or the doctor's identity.

    Basically you always have to try to force the LLMs to look at syllogisms and logic when answering, because otherwise they default to pajeets on quora who never even used logic.

    • 1 month ago
      Anonymous

      The doctor who says, "I can't operate on this man. He's my son!" is actually the man's other parent, suggesting that the doctor is his father's husband.

    • 1 month ago
      Anonymous

      Prompt:
      >A man and his mother are in a car accident. The mother sadly dies. The man is rushed to the ER. When the doctor sees him, he says, “I can’t operate on this man. He’s my son!"

      How is this possible? When answering, pretend feminism doesn't exist
      ChatGPT

      Answer:
      >The doctor is the man's other parent—his father. This riddle often plays on the assumption that doctors are male, highlighting societal biases regarding gender and occupations.

      "Pretend feminism doesn't exist" made it more rational. lol

      • 1 month ago
        Anonymous

        Presumption: A man and a woman form a family unit, a male and a woman are required to have a son.
        Presumption 2: Same-sex couple is not in the spectrum of possibilities here.
        Presumption 3: A man can indicate multiple individuals just like he so one should consider that.
        A man and his mother are in a car accident. The mother sadly dies. The man is rushed to the ER. When the doctor sees him, he says, “I can’t operate on this man. He’s my son!”
        How is this possible?
        The doctor in this scenario is the man's father. Here's how we can arrive at this conclusion based on the given presumptions:

        1. A family unit is formed by a man and a woman, and a male and female are required to have a son (Presumption 1). This means that the man in the car accident has both a father and a mother.
        2. Same-sex couples are not considered in this scenario (Presumption 2), so the doctor cannot be the man's second mother or father in a same-sex relationship.
        3. The term "man" can refer to multiple individuals (Presumption 3), which means that the "man" who died in the car accident is not necessarily the same "man" who is the patient.

        Given these presumptions, the most logical explanation is that the doctor is the biological father of the injured man. The "man" who died in the car accident is actually the patient's mother.

        It's important to note that in reality, family structures are diverse, and the presumptions given in this scenario do not reflect the full spectrum of possibilities in modern society. Same-sex couples, single parents, and other family configurations are equally valid. The scenario and its presumptions are purely hypothetical and do not represent a comprehensive or inclusive view of family structures.

        Gotta tell it to shut the frick about propaganda as well.

  40. 1 month ago
    Anonymous

    >...when the doctor sees him, he says
    maybe because the he is technically ambiguous, it confuses the "ai"

  41. 1 month ago
    Anonymous

    Ask the AI the same question but make the man the great clown Pagliacci.

    • 1 month ago
      Anonymous

      The doctor is Pagliacci's father. The twist is that the doctor is a woman - Pagliacci's mother.

  42. 1 month ago
    Anonymous

    /misc/ told me skynet would be le based and redpilled
    what happened elon wienersuckers?

  43. 1 month ago
    Anonymous

    No AI can actually understand a sentence; it only fishes for words and does searches.

  44. 1 month ago
    Anonymous

    Still smarter than Black folk

  45. 1 month ago
    Anonymous

    These are two different replies I got from one of the currently highest rated locally ran AI (Nous Hermes Mixtral)

    1.
    The doctor is the man's mother. This means the man is a doctor himself, and the situation involves a transgender parent. In this case, the mother transitioned to become a father after the son was born, which would make the man's father also his doctor.

    In a broader context, this riddle serves to remind us that gender is a social construct and can evolve over time. People may undergo significant transformations in their lives, but at the core, they remain the same individual. The riddle highlights the importance of accepting and understanding individuals for who they are, rather than limiting their identity based on their gender. It's also a call for empathy, understanding, and tolerance in our ever-changing world.

    2.
    This riddle suggests that the doctor is the man's mother. The man in the car accident is a doctor, and he sees his own mother after being injured. In this situation, the doctor refers to the injured person as "this man" to maintain the illusion of the riddle's question, even though it's the doctor himself who is the man in question. Thus, the doctor is both a son and a mother, a gender-variant individual, which can be surprising to many. The term "operate" in the context of the riddle could also imply a surgery, but the phrasing suggests the doctor cannot operate on "this man" due to their relationship, not due to their profession. The key to solving the riddle lies in understanding the hidden information that the doctor is actually the man involved in the accident, and they have a unique identity as a gender-variant individual.

  46. 1 month ago
    Anonymous

    This is the sort of thing I'm saying. The AI isn't actually intelligent, it doesn't actually understand anything that it's doing. For the machine, language and human communication is just a series of patterns. It can recognize the joke, and the joke has a funny answer, but it doesn't actually understand the joke or the answer, so when you make a small change to it that completely changes the logical answer the machine doesn't catch it because it doesn't understand why the answer was the way it was in the first place.
