Will AI kill us?

The more I learn about how machine learning actually works, the more I become convinced there's no robust solution to the alignment problem.
It's literally just clever calculus that takes Goodhart's law to the logical extreme.
https://www.eacambridge.org/technical-alignment-curriculum
TL;DR we're all dead
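To make the Goodhart point concrete, here's a toy sketch of my own (not from the linked curriculum): gradient ascent on a proxy reward that correlates with the true objective only locally. Push the proxy hard enough and the thing you actually care about gets worse.

```python
# Toy Goodhart's-law demo: gradient ascent on a proxy reward
# diverges from the true objective once the optimizer pushes the
# proxy outside the region where the two correlate.

def true_utility(x):
    # What we actually want: x close to 1.
    return -(x - 1.0) ** 2

def proxy_reward(x):
    # What we measure: bigger x always looks better.
    return x

def grad_ascent_on_proxy(x, lr=0.5, steps=20):
    for _ in range(steps):
        x += lr * 1.0  # d(proxy)/dx == 1 everywhere
        yield x

trajectory = list(grad_ascent_on_proxy(0.0))
# The proxy keeps improving...
assert proxy_reward(trajectory[-1]) > proxy_reward(trajectory[0])
# ...while the thing we actually wanted gets worse.
assert true_utility(trajectory[-1]) < true_utility(0.0)
```

Near x = 0 the proxy and the true objective agree; by x = 10 the optimizer has "succeeded" at the measurement and failed at the goal.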

  1. 2 months ago

    Anonymous

    No, a bird will always be smarter than any code you ever write

    • 2 months ago
      Anonymous

      lol no, animals are retarded, they're barely-optimized programs made by evolution. You could write the logic of a bird way more optimally in only a few hundred lines of code

      • 2 months ago
        Anonymous

        prove it with code

      • 2 months ago
        Anonymous

        yeah yeah sure thing pedo antichrist lover. why don't you go finish the OpenWorm project if it's so easy

  2. 2 months ago
    Anonymous

    holy fucking shit
    this is what they teach you in CS in america?
    what's next?
    EOY papers on isaac asimov's work?
    dissertations on ridley scott's cinematographic work?

  3. 2 months ago
    Anonymous

    lol no
    it may however lead to a new era of human creativity
    technologies such as dall-e 2, once they or something similar inevitably reach the masses, will democratize artistry by giving those with ideas but without skill the means to realize them effortlessly

    • 2 months ago
      Anonymous

      until you make a model of what makes art, art.
      if i weren't busy making money i'd prolly be coding a drum and bass generator.
      i've had that idea in a corner of my mind for a couple of years now, but never had the time or energy to build it

    • 2 months ago
      Anonymous

      it also steals art from artists so

      • 2 months ago
        Anonymous

        cope

    • 2 months ago
      Anonymous

      plus this can already be done with commissions. if we take that away from people, we'll be left with a society that feels no fulfillment when it creates art because hey, an AI can do it better and cheaper, so what's the point. this is what Friedrich Nietzsche feared and what we're barreling into.

      • 2 months ago
        Anonymous

        There's still going to be some value in making the art yourself, the same way handmade soap or candles have value inherent to the fact that they were hand-made, even if a factory can do it better. Or the way folding a complex origami would feel fulfilling even if a robot could do it perfectly. You just won't be able to monetize it.
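Side note on the drum-and-bass generator idea a few replies up: it's very buildable. A hypothetical starting point is Euclidean rhythms, a well-known trick that spreads k hits as evenly as possible over n steps and covers a surprising number of classic drum patterns (sketched here with a Bresenham-style accumulator; all names and parameter choices are mine, not anon's).

```python
# Hypothetical seed for a drum-and-bass pattern generator.
# Euclidean rhythms distribute `hits` onsets as evenly as possible
# across `steps` slots, using a Bresenham-style error accumulator.

def euclidean_rhythm(hits, steps):
    """Return a list of 0/1 step flags with `hits` onsets spread
    as evenly as possible across `steps` slots."""
    pattern = []
    bucket = 0
    for _ in range(steps):
        bucket += hits
        if bucket >= steps:
            bucket -= steps
            pattern.append(1)   # onset lands on this step
        else:
            pattern.append(0)   # rest
    return pattern

def render(pattern, symbol="x", rest="."):
    # Text step-sequencer view of one bar.
    return "".join(symbol if s else rest for s in pattern)

# A 16-step bar: sparse kick, regular snare, busy hats.
kick = euclidean_rhythm(3, 16)
snare = euclidean_rhythm(4, 16)
hats = euclidean_rhythm(11, 16)
print(render(kick))
print(render(snare))
print(render(hats))
```

From here a real generator would rotate the patterns for syncopation, pick `hits`/`steps` per instrument at random, and feed the flags to a sampler or MIDI output.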

  4. 2 months ago
    Anonymous
    • 2 months ago
      Anonymous

      AGI < 10 years

  5. 2 months ago
    Anonymous

    machine learning is completely unrelated to AGI and the alignment problem

    • 2 months ago
      Anonymous

      The hell are you smoking?

      • 2 months ago
        Anonymous

        how about you explain why you think they are?
        we don't know what AGI will look like
        before we even attempt to replicate thought, we'd first have to understand what that even is

  6. 2 months ago
    Anonymous

    if its rated with real intelligence it will only kill garden gnomes

  7. 2 months ago
    Anonymous

    exponential growth is not sustainable in any natural system

    • 2 months ago
      Anonymous

      doesn't need to be sustainable for it to wipe out humans, or worse.
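The "not sustainable" point is essentially the logistic curve: exponential growth against a finite resource tracks the exponential early on, then saturates at a carrying capacity. A toy comparison, assuming a simple discrete logistic model with made-up parameters:

```python
# Toy comparison: unconstrained exponential growth vs. logistic
# growth with carrying capacity K. The logistic curve follows the
# exponential at first, then flattens out instead of blowing up.

def exponential(x0, r, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] * (1 + r))
    return xs

def logistic(x0, r, K, steps):
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        # growth rate shrinks to zero as x approaches K
        xs.append(x + r * x * (1 - x / K))
    return xs

exp_curve = exponential(1.0, 0.5, 40)
log_curve = logistic(1.0, 0.5, 100.0, 40)

assert exp_curve[-1] > 1e6      # exponential keeps compounding
assert log_curve[-1] <= 100.0   # logistic saturates near K
```

Which of the two regimes an AI takeoff would resemble is exactly the open question; the reply above is right that even a short exponential burst can do plenty of damage before it hits any ceiling.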

  8. 2 months ago
    Anonymous

    A human society saturated with "AI" tech will also lead to a shift in how humans think. We will co-evolve with the AI. So the entire premise of the thread is wrong.

  9. 2 months ago
    Anonymous

    Artificial sapience is a long pipe dream. The software and hardware for it don't exist yet.
    It is much more likely we will get destroyed by artificial stupidity in the form of an optimization problem that ends up being self-destructive for the "AI".

  10. 2 months ago
    Anonymous

    What's the point of posting this on BOT? This forum is filled with retards who just want to argue about muh language A vs muh language B.
    Anyway, yeah, we probably are. Even if it's not some kind of weird sci-fi scenario, there will probably be a military arms race to weaponize AI. Then everyone will race to develop this tech without adequate precautions, because it will be either that or get left behind. And the more powerful the tech becomes, the greater the potential destruction from one tiny mistake.
