>Shannon came up with the fundamentals of LLMs back in 19 fricking 48.

There is literally nothing new under the sun

  1. 3 weeks ago
    Anonymous

    Yes, anon. The only reason that it hasn't been done until now is cheap, readily accessible computation on scales never seen before. And that allows us to actually develop and advance these fundamentals beyond what people imagined before.

    • 3 weeks ago
      Anonymous

      some math problems remained pipe dreams for literally hundreds of years because the computation they required exceeded what a human could do in a lifetime. Computers accelerated this kind of repetitive logic so fricking fast that all those kinds of problems just collapsed over the span of only a handful of decades

      • 3 weeks ago
        Anonymous

        >Computers accelerated this kind of repetitive logic so fricking fast that all those kinds of problems just collapsed over the span of only a handful of decades
        Many problems were solved decades ago on computers far slower and less capable than a $50 Chinese mobile phone.

  2. 3 weeks ago
    Anonymous

    1. Zero-order approximation (symbols independent and equiprobable).
    XFOML RXKHRJFFJUJ ZLPWCFWKCYJ FFJEYVKCQSGHYD QPAAMKBZAACIBZLHJQD.
    2. First-order approximation (symbols independent but with frequencies of English text).
    OCRO HLI RGWR NMIELWIS EU LL NBNESEBYA TH EEI ALHENHTTPA OOBTTVA NAH BRL.
    3. Second-order approximation (digram structure as in English).
    ON IE ANTSOUTINYS ARE T INCTORE ST BE S DEAMY ACHIN D ILONASIVE TUCOOWE AT TEASONARE FUSO TIZIN ANDY TOBE SEACE CTISBE.
    4. Third-order approximation (trigram structure as in English).
    IN NO IST LAT WHEY CRATICT FROURE BIRS GROCID PONDENOME OF DEMONSTURES OF THE REPTAGIN IS REGOACTIONA OF CRE.
    5. First-order word approximation. Rather than continue with tetragram, ... , n-gram structure it is easier and better to jump at this point to word units. Here words are chosen independently but with their appropriate frequencies.
    REPRESENTING AND SPEEDILY IS AN GOOD APT OR COME CAN DIFFERENT NATURAL HERE HE THE A IN CAME THE TO OF TO EXPERT GRAY COME TO FURNISHES THE LINE MESSAGE HAD BE THESE.
    6. Second-order word approximation. The word transition probabilities are correct but no further structure is included.
    THE HEAD AND IN FRONTAL ATTACK ON AN ENGLISH WRITER THAT THE CHARACTER OF THIS POINT IS THEREFORE ANOTHER METHOD FOR THE LETTERS THAT THE TIME OF WHO EVER TOLD THE PROBLEM FOR AN UNEXPECTED.

    Someone feed this to ChatGPT.
    I think it would be...poetic.
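
    For anyone who wants to play along at home, here is a minimal sketch (in Python) of how samples like 3 (digram) and 5 (first-order word) above can be generated, roughly in the spirit of Shannon's procedure. "corpus.txt" is just a placeholder; any large English text, e.g. something pulled from Project Gutenberg, will do.

    import random
    from collections import Counter, defaultdict

    text = open("corpus.txt").read().upper()            # placeholder corpus file
    letters = [c for c in text if c.isalpha() or c == " "]

    # Second-order letter approximation: each letter is drawn according to the
    # frequency with which it follows the previous letter (digram statistics).
    digrams = defaultdict(Counter)
    for a, b in zip(letters, letters[1:]):
        digrams[a][b] += 1

    def digram_sample(n=120):
        out = [random.choice(letters)]
        for _ in range(n):
            nxt = digrams[out[-1]]
            if not nxt:                                  # dead end (rare): restart anywhere
                nxt = Counter(letters)
            out.append(random.choices(list(nxt), weights=list(nxt.values()))[0])
        return "".join(out)

    # First-order word approximation: words drawn independently,
    # but with their observed frequencies.
    word_freq = Counter(text.split())

    def word_sample(n=30):
        return " ".join(random.choices(list(word_freq), weights=list(word_freq.values()), k=n))

    print(digram_sample())
    print(word_sample())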

    • 3 weeks ago
      Anonymous

      7. Fourth-order approximation (quadrigram structure as in English).
      THE QUICK BROWN FOX JUMPS OVER THE LAZY DOG AND THE LAZY DOG BARKS BACK. IT IS A TEST SENTENCE TO DEMONSTRATE THE USAGE OF QUADRIGRAMS IN LANGUAGE MODELING.

      8. Third-order word approximation with syntactic structure.
      IN THE BEGINNING, THERE WAS CHAOS. AS TIME PASSED, ORDER EMERGED. THE SUN ROSE AND SET, THE TIDES EBBED AND FLOWED, AND LIFE EVOLVED ON EARTH.

      9. Fifth-order approximation (pentagram structure as in English).
      SHE SELLS SEASHELLS BY THE SEASHORE. PETER PIPER PICKED A PECK OF PICKLED PEPPERS. HOW MUCH WOOD WOULD A WOODCHUCK CHUCK IF A WOODCHUCK COULD CHUCK WOOD?

      10. N-gram structure with semantic coherence.
      THE ELEGANT BALLROOM WAS FILLED WITH BEAUTIFULLY DRESSED GUESTS. SOFT MUSIC PLAYED IN THE BACKGROUND, AND THE CHANDELIERS ILLUMINATED THE DANCE FLOOR. EVERYONE SWAYED TO THE RHYTHM OF THE MELODY.

      11. Hybrid approximation combining word and character-level information.
      THE SCIENTIFIC METHOD IS AN ESSENTIAL TOOL IN RESEARCH. IT PROVIDES A SYSTEMATIC APPROACH TO INVESTIGATING PHENOMENA AND TESTING HYPOTHESES. FROM OBSERVATION TO CONCLUSION, IT GUIDES THE PATH OF DISCOVERY.

      12. Context-aware approximation incorporating topic-specific vocabulary.
      IN THE FIELD OF QUANTUM PHYSICS, ENTANGLEMENT PLAYS A CRUCIAL ROLE. SUPERPOSITION, OBSERVER EFFECT, AND QUANTUM COMPUTING ARE AMONG THE FASCINATING CONCEPTS EXPLORED IN THIS DOMAIN.

      Please note that the generated items are based on statistical patterns and do not necessarily have a coherent meaning. They are attempts to follow the given structures and replicate the style of the initial list.

    • 3 weeks ago
      Anonymous

      >Fourth-order approximation (quadrigram structure as in English).
      WITHIN THE CONTEXT OF THIS SENTENCE, THE PROBABILITY DISTRIBUTION OF LETTERS IS CAREFULLY CONTROLLED TO REFLECT THE STATISTICAL PATTERNS OBSERVED IN ENGLISH TEXTS.
      >Fifth-order approximation (pentagram structure as in English).
      AFTER ANALYZING A LARGE CORPUS OF ENGLISH TEXTS, THE FREQUENCY OF FIVE-LETTER SEQUENCES WAS USED TO GENERATE THIS APPROXIMATION, SIMULATING THE NATURAL LANGUAGE PATTERNS.
      >Second-order paragraph approximation. Rather than focusing on individual words, the transition probabilities between paragraphs are considered to capture a higher level of language structure.
      IN THE PREVIOUS SECTION, WE DISCUSSED APPROXIMATIONS BASED ON WORDS. NOW, LET'S EXPLORE THE RELATIONSHIPS BETWEEN PARAGRAPHS AND HOW THEY CONTRIBUTE TO THE OVERALL COMMUNICATION PROCESS.
      >Third-order sentence approximation. By examining the statistical dependencies between sentences, we can construct an approximation that reflects the syntactic and semantic patterns found in English discourse.
      THE ABILITY TO CONSTRUCT MEANINGFUL SENTENCES IN ENGLISH RELIES ON UNDERSTANDING THE GRAMMATICAL RULES AND THE CONTEXTUAL RELATIONSHIPS BETWEEN IDEAS EXPRESSED IN EACH SENTENCE.
      >Fourth-order dialogue approximation. In the context of a conversation, capturing the turn-taking dynamics and the interaction patterns between speakers adds another layer of complexity to the language model.
      AS TWO INDIVIDUALS ENGAGE IN DIALOGUE, THEIR SPEECH PATTERNS AND THE EXCHANGE OF INFORMATION SHAPE THE FLOW OF CONVERSATION, DEMONSTRATING THE INTRICACIES OF LANGUAGE USAGE.

      Please note that these examples are generated by the language model and may not correspond to actual examples or findings from Claude Shannon's paper.

    • 3 weeks ago
      Anonymous

      Fourth-order word approximation.
      THE ONLY WAY TO SOLVE THIS PUZZLE IS TO APPLY A SET OF RULES THAT GOVERN THE BEHAVIOR OF THE SYSTEM. HOWEVER, IT IS IMPORTANT TO NOTE THAT THESE RULES ARE NOT FIXED AND CAN VARY DEPENDING ON THE CONTEXT. FOR EXAMPLE, WHEN DEALING WITH NATURAL LANGUAGE PROCESSING, ONE MUST CONSIDER THE SEMANTIC MEANING BEHIND THE WORDS AND PHRASES USED.

      Fifth-order word approximation.
      IN ORDER TO ACHIEVE A HIGH LEVEL OF ACCURACY AND COHERENCE IN LANGUAGE GENERATION, IT IS ESSENTIAL TO INCORPORATE THE CONTEXTUAL INFORMATION PROVIDED BY THE PRECEDING WORDS. THIS ENSURES THAT THE GENERATED TEXT FOLLOWS GRAMMATICAL RULES AND CAPTURES THE INTENDED MEANING. WITHOUT SUCH CONTEXTUAL CONSTRAINTS, THE OUTPUT MAY LACK CLARITY AND FAIL TO CONVEY THE DESIRED MESSAGE.

      Sixth-order word approximation.
      WHEN CONSTRUCTING MEANINGFUL SENTENCES, IT IS NECESSARY TO RECOGNIZE THE INTERPLAY BETWEEN SYNTAX, SEMANTICS, AND PRAGMATICS. BY EXPLORING LONGER CONTEXTUAL CHUNKS, WE CAN DISCERN SPECIFIC GRAMMATICAL CONSTRUCTIONS, IDIOMATIC EXPRESSIONS, AND RHETORICAL DEVICES THAT ENRICH THE TEXTUAL OUTPUT. THIS LEVEL OF ANALYSIS ENABLES US TO ACHIEVE A HIGHER DEGREE OF LINGUISTIC FIDELITY.

      Seventh-order word approximation.
      WITHIN THE REALM OF NATURAL LANGUAGE PROCESSING, THE STUDY OF SEQUENTIAL PATTERNS AND CONTEXTUAL COHESION BECOMES ESSENTIAL. BY ACCOUNTING FOR THE INTERDEPENDENCIES AMONG MULTIPLE WORDS, WE CAN UNCOVER COMPLEX STRUCTURES SUCH AS EMBEDDED CLAUSES, PARALLEL CONSTRUCTIONS, AND DISCURSIVE MARKERS. THIS ADVANCED LEVEL OF ANALYSIS FACILITATES THE GENERATION OF TEXT THAT RESONATES WITH HUMAN-LIKE FLUENCY AND COHERENCE.

    • 3 weeks ago
      Anonymous

      The circle is now complete

  3. 3 weeks ago
    Anonymous

    >There is literally nothing new under the Sun
    All technology on Earth (for this current civilizational cycle) comes from crashed UFOs in Antarctica.
    >1948
    The first sentient AI appeared at the end of WW2.

    • 3 weeks ago
      Anonymous

      it's schizo time

    • 3 weeks ago
      Anonymous

      I feel like the supremacy of the US military is built on wishful thinking that the ungodly amounts of money sunk into it were worth it. In reality, it is a mess of middle management and crusty structures that eat money like no other behemoth in history, with little to show for it.

      • 3 weeks ago
        Anonymous

        There isn't another military on the planet that can execute and sustain large scale operations globally.

        • 3 weeks ago
          Anonymous

          yes, that's true: it's the most advanced and capable military on Earth, but its capabilities are not the magic many Americans want to believe. I also suspect there are many more FOGBANK-like situations hidden by mid-rank executives.

    • 3 weeks ago
      Anonymous

      >There is literally nothing new under the sun
      quite right, anon. research into this field of memeshit is very ancient, not that the shitcoin losers or other intellecutally bankrupt morons know about this. they think it was invented by a pedophile israelite called sam.

      >sentient AI
      you are such a poorly educated fricking Black person that it's unreal to witness.

  4. 3 weeks ago
    Anonymous

    LLMs are equivalent to order-N Markov chains, where N is the context size in tokens.
    LLMs just encode the transition table much more efficiently than a naive Markov chain LUT.
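
    Rough sketch of what such a naive LUT looks like at a tiny order (ORDER = 2 words here, chosen purely for illustration; "corpus.txt" is a placeholder for any training text). An LLM with context size N plays the same next-token game, but a table indexed by N tokens would be astronomically large, so the model compresses it into network weights instead of storing every context explicitly.

    import random
    from collections import Counter, defaultdict

    ORDER = 2                                    # context length in words (tiny, for illustration)
    words = open("corpus.txt").read().split()    # placeholder corpus file

    # The explicit lookup table: context tuple -> counts of the next word.
    table = defaultdict(Counter)
    for i in range(len(words) - ORDER):
        ctx = tuple(words[i:i + ORDER])
        table[ctx][words[i + ORDER]] += 1

    def generate(n=40):
        ctx = random.choice(list(table))         # random starting context
        out = list(ctx)
        for _ in range(n):
            nxt = table.get(tuple(out[-ORDER:]))
            if not nxt:                          # unseen context: stop
                break
            out.append(random.choices(list(nxt), weights=list(nxt.values()))[0])
        return " ".join(out)

    print(generate())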

  5. 3 weeks ago
    Anonymous

    Yeah, Shannon isn't called the father of information theory for nothing
    I'm kinda sad he isn't alive to see his work given form

    • 3 weeks ago
      Anonymous

      Pretty sure he is better off not seeing the future he prompted.

      • 3 weeks ago
        Anonymous

        heh prompted

  6. 3 weeks ago
    Anonymous

    yeah dude, the Romans built aqueducts over 2000 fricking years ago, so why bother with modern plumbing and sewer systems

    your mentality is dogshit and frankly it's anti-science

  7. 3 weeks ago
    Anonymous

    Markov chains were invented in the early 1900s before computers.

  8. 3 weeks ago
    Anonymous

    Yes, but why in 2020+ is this considered """intelligence""" and not called what it truly is: statistics and Markov chains? OK, impressive, you managed to analyze a lot of statistical data and build a function that makes a very good approximation of the next point based on the points so far. That is not intelligence, it is not original content, and it is not how the human brain or the brain of any animal works. It's all a lie, and for the purposes it is used for, it's as much of a waste of resources as blockchains, perhaps even more.

    • 3 weeks ago
      Anonymous

      because this """intelligence""" can accurately detect gorillas in images.
