AI Math Teacher?

Is ChatGPT good at teaching you college level math? I'm not talking about cheating or letting it straight up do all of your homework. I mean actually helping you learn concepts and explaining them in moron-friendly ways.

  1. 6 months ago
    Anonymous

    No.
    Read a book.

    • 6 months ago
      Anonymous

      this, textbooks are unironically 1000x better than professors.

    • 6 months ago
      Anonymous

      >this, textbooks are unironically 1000x better than professors.

      i feel like agreeing, because books have so much potential
      but how do you manage to read books that are hard for you, without progressing at like one page every week?
      i have read "Neural Networks: A Systematic Introduction" up to chapter 7 and it's really a wonderful book, but it took so fricking long to read, and even worse, i feel like i don't remember some cool shit i've read there (which i'd love to remember), like how human neurons work

    • 6 months ago
      Anonymous

      Yes use GPT4 and double check the answers on Google. Don't listen to these boomers

      >this, textbooks are unironically 1000x better than professors.

      I wonder how they found their way here

  2. 6 months ago
    Anonymous

    Why don't you try it and report back to us your results?

  3. 6 months ago
    Anonymous

    it can't even pivot a matrix
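    For anyone unfamiliar, "pivoting" a matrix here means the row-swap step of Gaussian elimination. A minimal sketch in plain Python (function name and example matrix are illustrative):

```python
def partial_pivot(rows, col):
    """Swap rows so the entry with the largest absolute value in `col`
    (at or below the diagonal) becomes the pivot -- the row-swap step
    of Gaussian elimination with partial pivoting."""
    rows = [list(r) for r in rows]  # copy so the input isn't mutated
    best = max(range(col, len(rows)), key=lambda i: abs(rows[i][col]))
    rows[col], rows[best] = rows[best], rows[col]
    return rows

A = [[0.0, 2.0], [4.0, 1.0]]
print(partial_pivot(A, 0))  # the row starting with 4.0 moves to the top
```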

  4. 6 months ago
    Anonymous

    I'm using a local model to teach myself linear algebra. I imagine ChatGPT can easily tutor you, but smaller models more than likely could as well.

    I use a textbook in conjunction with it. Models know markdown formatting, so I'm able to just copy examples and formulas from the text and the model can understand them.

    It's basically like having a tutor with infinite patience. The chatbot can expand examples step by step, and can get as granular and dumbed-down as I need.

    From what I've read, LLMs are less reliable for higher level mathematics. But so far, for 200-level course stuff, I'm having a blast!

    Here's a pic I took from a while back.

    • 6 months ago
      Anonymous

      need

    • 6 months ago
      Anonymous

      What model is this? I haven't tried it that much but my ERP model of choice is not great at this sort of stuff.

      • 6 months ago
        Sage

        Euryale 1.4 70b. I use 5.0 bpw exl2.
        >I haven't tried it that much but my ERP model of choice is not great at this sort of stuff.
        That's surprising. I've actually found that ERP models tend to also perform well in other tasks.
        Anyway, I think even a 7b might work. I think what helps is if you use unicode math characters and formatting, as Llama is trained with those symbols, and those tokens seem to push the model towards its mathematics datasets.
        Notice I use the proper symbol for elements in my example pic.
        See here for a reference:
        https://en.m.wikipedia.org/wiki/Mathematical_operators_and_symbols_in_Unicode
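        To make the symbol swap concrete, here's a toy before/after (the prompt text itself is illustrative): write the actual unicode symbols (∀, ∈, ℝ, ≤) instead of their ASCII stand-ins.

```python
# ASCII stand-ins vs. the real unicode math symbols the anon recommends
ascii_prompt = "Prove: for all x in R, if x <= 0 then x^2 >= 0"
math_prompt = "Prove: ∀x ∈ ℝ, if x ≤ 0 then x² ≥ 0"

# both are ordinary UTF-8 strings; the second tokenizes onto the
# symbols the model actually saw in its math training data
print(math_prompt)
```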

        • 6 months ago
          Sage

          Oh, and I use <sup></sup> and <sub></sub> instead of ^ for exponents. Similarly use other unicode chars like ℝ and ℂ or ≤ and ≥ and even 𝑥 and 𝑦 instead of the substitutes you would normally use.
          It's a bit more tedious but it definitely works.
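          If typing the tags by hand gets old, a tiny converter can do it; this sketch (function name is made up) rewrites caret exponents and underscore subscripts into the <sup>/<sub> tags described above:

```python
import re

def tag_exponents(s):
    """Rewrite x^2 -> x<sup>2</sup> and a_1 -> a<sub>1</sub>."""
    s = re.sub(r"\^(\w+)", r"<sup>\1</sup>", s)
    s = re.sub(r"_(\w+)", r"<sub>\1</sub>", s)
    return s

print(tag_exponents("x^2 + a_1"))  # x<sup>2</sup> + a<sub>1</sub>
```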

        • 6 months ago
          Anonymous

          Thanks. I think I actually already had that one downloaded, but I got lazy trying to decide on optimal vram/ram load splitting. What's your hardware setup? And yes I realize now the symbol thing is probably a big part of my issues. Kinda similar to how if you ask it for an answer that involves a list, it usually gives you a sucky comma-separated text dump, but if you specifically ask for bullet points it does a much better job. Also don't listen to this homosexual

          >have some shame

          the sexy older professor is great.

          • 6 months ago
            Anonymous

            I've got 64gb ram and 60gb vram across 4 cards: 3 3060s and a 3090. All used. I use those 1x to 16x riser cards that use USB cables and an external PSU to power whatever isn't hooked up directly to my motherboard.

            • 6 months ago
              Anonymous

              Forgot my model setup:
              Exllama2 through text-gen-webui. 8bit cache. 8192 context. Gpu split 20,10,11,11 seems to consistently work for me. I'm on Linux Mint, cinnamon, 1080 monitor.

            • 6 months ago
              Anonymous

              >Forgot my model setup:
              >Exllama2 through text-gen-webui. 8bit cache. 8192 context. Gpu split 20,10,11,11 seems to consistently work for me. I'm on Linux Mint, cinnamon, 1080 monitor.

              Impressive. Thanks for all the tips anon. I've got 64 ram + 24 vram so I can run 70Bs (albeit slowly). I'll do some experimenting tonight.

              • 6 months ago
                Anonymous

                Np anon, but like I said I don't see why a 34b or lower couldn't suit your educational needs fine. Isn't that Yi model supposed to be great at math? That way you could run a model entirely in your vram. Maybe just stay away from really small quants.

              • 6 months ago
                Anonymous

                True. I wanted to see about replicating your results and then scaling down for faster responses. I did kinda the same for ERP shit when I had way worse specs and could only run 7B in vram. Haven't heard of that Yi model. I'll look for it. remmlewd was the recommended one the last time I dug through /lmg/.

              • 6 months ago
                Anonymous

                Ah. A fellow intellectual coomer. A coomisseur, if you will.
                Well here's a link to the character card:
                https://chub.ai/characters/Opti902/eliza-9d3ee6dc

                And yeah, Yi is gonna be terrible at the personality aspect. Euryale is the most "ethically and morally unconstrained" 70b model I've used. 13b's aren't too bad, though, but they don't have the same "depth" to the roleplay aspect.

                Are you aware of Mixtral? It can run on KoboldCPP and there are GGUF quants for it. Check the /lmg/ OP for a how-to. I tried it and it seems very promising. I just can't take the slow llama.cpp prompt processing, so I'm waiting for exl2 support.

              • 6 months ago
                Anonymous

                Incredibly based character card. I'll try out Mixtral and Yi. I've been using koboldcpp since it's the first loader I got working.

    • 6 months ago
      Anonymous

      have some shame

  5. 6 months ago
    Anonymous

    speaking from experience, absolutely not.

  6. 6 months ago
    Anonymous

    Get a textbook and ask chatgpt all your dumb questions.
    I learnt the very basics of music theory that way.

  7. 6 months ago
    Anonymous

    Yes

  8. 6 months ago
    Anonymous

    This relates to what I've been trying to do. Bump for any interesting insights.

  9. 6 months ago
    Anonymous

    use it and find out
