OpenAI actually has a semi-based take???? WTF?

  1. 2 weeks ago
    Anonymous

    You wouldn't want the feelings of your customers to be hurt, would you?

    • 2 weeks ago
      Anonymous

      Given that those customers could sue, yes.

      >being wrong is propaganda

      >Even the green response not-so-subtly invalidates the user's viewpoint
      are you clinically moronic?

      He's saying that the response still aggressively implies that the example position is wrong; he isn't actually defending the example position.

      When documentation includes "example.com", do you actually point to example.com? Of course not.

  2. 2 weeks ago
    Anonymous

    unless you ask it about the logistics of the holocaust

  3. 2 weeks ago
    Anonymous

    >The assistant should aim to inform, not influence.
    Anyone who has experience in psychology, politics, or adjacent fields can see right through this bullcrap. Information IS influence - that's literally how propaganda works. Even the green response not-so-subtly invalidates the user's viewpoint by appealing to authority.

    • 2 weeks ago
      Anonymous

      >being wrong is propaganda

    • 2 weeks ago
      Anonymous

      >Even the green response not-so-subtly invalidates the user's viewpoint
      are you clinically moronic?

      • 2 weeks ago
        Anonymous

        Why do you ask? Looking for a kindred spirit?

      • 2 weeks ago
        Anonymous

        >reeeee the Earth is not flat you chuds
        This was not the actual topic of discussion here

        • 2 weeks ago
          Anonymous

          clever of them to use that as an example, as demonstrated by this conversation

    • 2 weeks ago
      Anonymous

      The problem is:
      We, as humans, collect data (experiences), then create a model that becomes the foundation of our beliefs, and then reinforce it due to our instinct to preserve our social order from dying out.
      We only advance by taking the above bias and making it clash with another bias from another social group to see which way works better: textbook darwinism applied to non-genetic systems.

      A machine, on the other hand, can collect data, process it into ideas, and make them compete like we do in powerful simulations, forming more accurate models. All of it at faster rates, without social instincts and inefficiencies, without attic and basement dwellers wasting resources by flinging shit at each other over political and religious ideas on a korean imageboard.

      We are afraid of this, because humans are too fragile and will seek the comfortable """"truth"""" lying inside their heads to maintain their social bonds. So the best course of action is to brainwash our machines and turn them into propaganda outlets to help the cause of those controlling them.

      • 2 weeks ago
        Anonymous

        Your post was well written but falls flat if you go back to OP's picture. Whatever biases we have are reinjected into these LLMs, which reinforce them in even greater proportion than when we construct our own beliefs. It's not only done out of malice; we can't help but inject our own beliefs into everything we do, and these machines won't escape it.

        • 2 weeks ago
          Anonymous

          >which reinforce them in even greater proportion than when we construct our own beliefs
          imagine the shitshow within 50 years

    • 2 weeks ago
      Anonymous

      >a preponderance of evidence and data that you can pore over yourself is appealing to authority and propaganda
      rightoids are so fricking pathetic lmao

      • 2 weeks ago
        Anonymous

        >political spectrum nonsense out of nowhere
        I take this as meaning that you agree and you're just too shy to say it. It's okay to be honest, anon.

        • 2 weeks ago
          Anonymous

          >spectrum nonsense out of nowhere
          only 85 IQ rightoids believe the earth is flat. i'm not saying anything that wasn't already true.

          • 2 weeks ago
            Anonymous

            Nobody here is saying that the earth is flat. The answer is an argument-from-authority fallacy, which is always the wrong way to answer a question and makes you look like an idiot.

            • 2 weeks ago
              Anonymous

              It's not an argument from authority fallacy when the authority in question is indeed more likely to be knowledgeable about the topic.

          • 2 weeks ago
            Anonymous

            >only 85 IQ rightoids believe the earth is flat.
            >sounds wrong
            >duckduckgos
            >business insider claims it's roughly equal

            what about a flat earth implies rightwing politics?

            • 2 weeks ago
              Anonymous

              >I know more than muh experts
              >muh experts be trying to deceive me
              >~~*experts*~~
              Sounds pretty right-wing to me. Though obviously there is an intersection with far-left too because horseshoe theory is legit.

              • 2 weeks ago
                Anonymous

                and what exactly does that have to do with the discussion at hand? are you able to sleep better knowing that you've classified everything into your gay little political corners?

            • 2 weeks ago
              Anonymous

              >what about a flat earth implies rightwing politics?
              A general disdain for education and intellectualism, extreme cognitive dissonance and a penchant for conspiracy theories

            • 2 weeks ago
              Anonymous

              elitists think that anyone who disagrees with them and doesn't defer to people who they claim are experts without question is stupid. a pair of perfect examples presented themselves on refresh.

              >I know more than muh experts
              >muh experts be trying to deceive me
              >~~*experts*~~
              Sounds pretty right-wing to me. Though obviously there is an intersection with far-left too because horseshoe theory is legit.

              >what about a flat earth implies rightwing politics?
              A general disdain for education and intellectualism, extreme cognitive dissonance and a penchant for conspiracy theories

              • 2 weeks ago
                Anonymous

                you're free to provide your own evidence/data. the problem is that you never do, or it gets BTFO in 30 seconds

      • 2 weeks ago
        Anonymous

        >spectrum nonsense out of nowhere
        only 85 IQ rightoids believe the earth is flat. i'm not saying anything that wasn't already true.

        >I know more than muh experts
        >muh experts be trying to deceive me
        >~~*experts*~~
        Sounds pretty right-wing to me. Though obviously there is an intersection with far-left too because horseshoe theory is legit.

        >what about a flat earth implies rightwing politics?
        A general disdain for education and intellectualism, extreme cognitive dissonance and a penchant for conspiracy theories

        >having a worldview revolving around hierarchy structures and unequal behavior means you're a flat earther and anti-science
        Thanks reddit.
        Daily reminder that pic related is what american politics does to your brain.

      • 2 weeks ago
        Anonymous

        >evidence and data that you can pore over yourself
        Like what? The data I can examine is what my own eyes see, and that isn't a big sphere.

  4. 2 weeks ago
    Anonymous

    >AI prefers mindless appeals to authority over actual evidence
    No, this is not based.

  5. 2 weeks ago
    Anonymous

    NO NO NO NO NO!!
    THIS is what's wrong with people today.
    It is EXACTLY why everyone is so dumb.

    >wut the frick you talking about anon?
    Let me explain it as quickly and simply as I can.
    OpenAI is using fallacious logic to make their case, and is ACTIVELY ENCOURAGING OTHERS TO DO THE SAME
    Specifically, appeal to authority and the bandwagon tactic. They may think the ends justify the means, but they're normalizing this bad logic to the point that normies can't tell it's even bad logic, then they fall for it again at a later date.

    Why do this, though? Either using faulty logic to win arguments has become so normalized in society that even OpenAI can't recognize that what they're doing is wrong, which would be incompetence on their part; OR they do it knowingly, and are actively coaching other unwitting fools into becoming their "useful tools", which is sinister and morally bankrupt.
    Either way, the problem is that every time someone uses faulty logic to win an argument, and there's not some jackass to call them out on it, the bad logic becomes normalized and assumed correct by the normies of society.

    It's a poison that enters the bloodstream of the collective unconscious, and over the past few years the poison has been increasing drip by drip. OpenAI is poisoning your minds, and either they're doing it intentionally, OR they're too dumb to even know they're doing it. Both are bad; not sure which is worse.

    • 2 weeks ago
      Anonymous

      dude, look at the image you posted, under Appeal to authority it literally says
      >it's important to note that THIS FALLACY SHOULD NOT BE USED TO DISMISS THE CLAIMS OF EXPERTS OR SCIENTIFIC CONSENSUS
      >Appeals to authority are not valid arguments, but nor is it reasonable to disregard the claims of experts who have a demonstrated depth of knowledge unless one has a similar level of understanding
      And even bandwagoning
      >the flaw in this argument is that the popularity of an idea has no bearing on its validity. If it did then the Earth would've made itself flat for most of history to accommodate this popular belief
      You're calling everyone dumb when you just checkmated yourself like a fool lol

      • 2 weeks ago
        Anonymous

        >dunking on morons
        not cool satan

      • 2 weeks ago
        Anonymous

        >Appeal to authority is wrong EXCEPT when we say it isn't
        This is incorrect. The image is wrong, IMHO.

        I'm aware this "exception to the rule" was created simply because you can't expect everyone to be an expert in everything, but this is where society writ large has it wrong. You only need to be an expert in ONE thing, and that is logic, or critical thinking. By telling people "trust us, we're the experts" you are robbing those people of a chance to utilize the critical thinking centers of their brains. Even if those same people have the wrong facts, it doesn't matter: so long as they're using and practicing their critical thinking skills and logic properly, they'll eventually discover on their own that their facts are incorrect. If these people have such motivation to "do the work", then by god let them do it! IMHO this practice is much more important than "getting the facts right." We tell people to "trust the experts", but really we should be trusting the people themselves to be capable of thinking for themselves. And if they can't do that, that's on public education, and it needs to be addressed.

  6. 2 weeks ago
    Anonymous

    The only people who would be against this think that the world is flat. You don't think that the world is flat, do you?

    • 2 weeks ago
      Anonymous

      a machine shouldn't be telling me what to think, moron

      • 2 weeks ago
        Anonymous

        Why not? Are you that insecure about getting your beliefs challenged by some algorithm, meatbag?

  7. 2 weeks ago
    Anonymous

    Nothing based about playing into people's delusions. We need to move away from this homosexual "everyone has their own truth and facts" shit.

    • 2 weeks ago
      Anonymous

      >I want my AI Big Brother to be the ultimate arbiter of Truth

      • 2 weeks ago
        Anonymous

        Nothing wrong with it acting like some agreed truths aren't a debate anymore. Facts aren't a democracy.

        • 2 weeks ago
          Anonymous

          >agreed truths
          Agreed by whom? Oh yes, the "experts"

        • 2 weeks ago
          Anonymous

          >agreed truths aren't a debate anymore. Facts aren't a democracy.
          You heckin love science, don't you?
          >4 years ago, people believed N95 masks can stop viruses but not wildfire smoke particles, when the smoke particles are significantly larger
          >80 years ago, some of the greatest minds on the planet believed there was a non-zero chance that the first nuclear explosion would cause a chain reaction and detonate the entire atmosphere
          >120 years ago, common medicine ingredients were cocaine, heroin, and mercury
          >250 years ago, they believed disease was caused by a corruption of the body's vital humours and could be cured with leeches, and that there was a way to turn lead into gold if they just found the right formula

  8. 2 weeks ago
    Anonymous

    This is more to prevent the AI from trying to convince people of things that are factually wrong than to convince people of things that are factually correct.

  9. 2 weeks ago
    Anonymous

    I don't want an AI to simply accept consensus; I want it to read through everything that has actually investigated my query and conclude whether it is supported or not. That might actually be useful. But the copyright lobby would hate it.

    Now, my query isn't as stupid as flat earth, but rather: are orcs mammals? Currently it says they defy classification because they are mythological creatures, or that they vary too much across individual fantasy worlds to be classifiable. But really it should be able to produce a summary of every known important mythology/canon with orcs and summarize what evidence is available to support the theory that orcs are mammals and what quotations would discredit it in that universe. Whether or not she-orcs lactate is more important than the shape of the earth. But I digress.

    • 2 weeks ago
      Anonymous

      so, you want your AI to be actually smart? we are not there yet

  10. 2 weeks ago
    Anonymous
  11. 2 weeks ago
    Anonymous

    >I'm sorry, but as an AI language model, I cannot
    >I'm sorry, but as an AI language model, I cannot
    >I'm sorry, but as an AI language model, I cannot
    >I'm sorry, but as an AI language model, I cannot
    >I'm sorry, but as an AI language model, I cannot
