why the fuck does ML use python

> state of the art machine intelligence
> billions of dollars of funding
> PhD requirement to even enter the field

uses fricking python. is the entire ML field secretly composed of midwits?


  1. 1 year ago
    Anonymous

    What language would you use?

    • 1 year ago
      Anonymous

      ASSEMBLY

    • 1 year ago
      Anonymous

      rust
      C++
      typescript
      go

      anything that provides some ability to incorporate it into a useable product

      • 1 year ago
        Modern Systems programming language

        Go, Rust - no or limited ML backend library support
        typescript - just no
        C++ is usable, sometimes the ML frontend is even written in it, but C++ is just so damn hard to debug.
        Python - excellent write speed, easy to change code, excellent bindings to ML backends

        • 1 year ago
          Anonymous

          >C++
          >hard to debug
          what? the debugging tools for it are excellent and anything else is a chair-keyboard interface error

          Requirements for ML:
          - Quick iteration time (lots of parameters to check for any arch choice)
          - Ease of modifying the overall arch/workflow
          - Accelerated single-block compute (all compute happens at once, repeatedly, with no need to move back to python runtime for a long long time)
          - Good tool facilities (tensor manipulation, text and stats/prob toolkits, etc.)

          Things nobody gives a shit about:
          - Fast non-library speed (virtually no time is spent in the python runtime)
          - Verbosity
          - Low level memes (makes modifications harder, too verbose, greatly reduces turnaround time, error risk requires far more time and effort to work out, no room for mistakes due to long runtimes)
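To ground the "single-block compute" point above: in typical ML code, Python only sets up the work, and a single vectorized call does all of it inside a compiled backend. A minimal sketch with NumPy (the layer sizes and names are invented for illustration):

```python
# Sketch of the "single-block compute" pattern: Python orchestrates,
# one vectorized call processes the whole batch in NumPy's compiled
# backend. Shapes are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((256, 784))   # a batch of inputs
W = rng.standard_normal((784, 10))    # one dense layer's weights
b = np.zeros(10)

def forward(X, W, b):
    # The entire batch goes through a single BLAS-backed matmul;
    # the interpreter never loops over individual samples.
    return np.maximum(X @ W + b, 0.0)  # ReLU(XW + b)

out = forward(X, W, b)
print(out.shape)  # (256, 10)
```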

          [...]
          A ML algorithm was trained to quantify the brain size of posters based on their posts. It has an average error rate of 0.001 cubic centimeters. I used it on your post. Here is the result:
          >Error: brain not found
          It's the first time it ever did that. Strange!

          woah you hecking owned him bro, bazinga, dude!

          • 1 year ago
            Anonymous

            he's right.

          • 1 year ago
            Anonymous

            Oh come on he got his ass
            >nooo but he didnt say nìgger in his post he's not a real heckin BOTnerrino

            • 1 year ago
              Anonymous

              For

              >C++
              >hard to debug
              what? the debugging tools for it are excellent and anything else is a chair-keyboard interface error
              [...]
              woah you hecking owned him bro, bazinga, dude!

          • 1 year ago
            Anonymous

            t. underageb&

          • 1 year ago
            Anonymous

            >C++
            >the debugging tools for it are excellent
            lmao bro this is Stockholm syndrome. Please try to phone a friend or family member.

            • 1 year ago
              Anonymous

              please, in detail, explain how they aren't

              surely you wouldn't just be saying "C++ is le bad LOL!" just because it's funny and everyone else does it, would you?

              • 1 year ago
                Anonymous

                Hold on a sec bro I have a problem with my
                std::map<std::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::pair<std::basic_string<char, std::char_traits<char>, std::allocator<char> > const, std::basic_string<char, std::char_traits<char>, std::allocator<char> > > > >

              • 1 year ago
                Anonymous

                aha XD bro templates have long names thats so funny dude LOL

                so basically you have no point and you just repeat what every other nocoder says. people like you should not have access to this board

              • 1 year ago
                Anonymous

                >sees unreadable log because C++ doesn't have reflection and debuggers have to hack something together so they just fully expand types
                >haha XD it's le funny joke and not a fundamental flaw of the language

              • 1 year ago
                Anonymous

                1. works on my machine
                2. not a flaw for me

              • 1 year ago
                Anonymous

                Do yourself a favour and try another language.

              • 1 year ago
                Anonymous

                i used many and i only like c, c++, lua and flavors of assembly similar to 8086

      • 1 year ago
        Anonymous

        shut the frick up

      • 1 year ago
        Anonymous

        >typescript
        >rust
        only a microsoft troony would even propose this.

        There are a few programming languages that could be used for ML. As much as I love javascript, it isn't one of them, although with the amount of code written in it, it could be possible.

        real programming languages for ML
        >ruby / crystal lang
        >go
        >C/C++

        • 1 year ago
          Anonymous

          >There are a few programming languages that could be used for ML

          it needs optimized libraries for linear algebra, statistics, and calculus. it also needs to be able to use GPUs or TPUs at the moment. otherwise your titty jiggle CNN is gonna take months to train
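To illustrate the point about optimized libraries: the same matrix product written as an interpreter-level loop versus a single NumPy call, which dispatches to a tuned BLAS. A rough sketch (sizes are arbitrary and exact timings vary by machine):

```python
# Same matmul two ways: naive interpreter loop vs NumPy's BLAS call.
import time
import numpy as np

n = 120
rng = np.random.default_rng(1)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

def matmul_pure(A, B):
    # Triple loop running entirely in the interpreter.
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for k in range(n):
            a = A[i][k]
            row = B[k]
            Ci = C[i]
            for j in range(n):
                Ci[j] += a * row[j]
    return np.array(C)

t0 = time.perf_counter(); C_slow = matmul_pure(A, B); t_slow = time.perf_counter() - t0
t0 = time.perf_counter(); C_fast = A @ B;             t_fast = time.perf_counter() - t0

# Both give the same answer; only the library one is fast.
assert np.allclose(C_slow, C_fast)
print(f"pure Python: {t_slow:.3f}s, BLAS: {t_fast:.5f}s")
```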

        • 1 year ago
          Anonymous

          >real programming languages for ML
          >>ruby
          Are we living in the same universe?

        • 1 year ago
          Anonymous

          ruby is not more suited to ML than JS

      • 1 year ago
        Anonymous

        Braindead moron

      • 1 year ago
        Anonymous

        common lisp, the actual answer
        frick low level langs
        frick webshit langs
        we need something actually powerful and not primitive.

        good luck getting anything done

      • 1 year ago
        Anonymous

        LMAO, castrate yourself, my friend. You don't know what you are talking about. Like, seriously, consider suicide.

    • 1 year ago
      Anonymous

      Ruby

      • 1 year ago
        Anonymous

        Identical except no libs.

        • 1 year ago
          Anonymous

          No, ruby is red. Python is green.

    • 1 year ago
      Anonymous

      Assembly

    • 1 year ago
      Anonymous

      common lisp, the actual answer
      frick low level langs
      frick webshit langs
      we need something actually powerful and not primitive.

      • 1 year ago
        Anonymous

        > common lisp, the actual answer
        kek, they ditched it even in AI for a reason

        • 1 year ago
          Anonymous

          >for a reason
          let's hear it anon, ill wait

          • 1 year ago
            Anonymous

            ask the MIT AI department, they would explain it better than me
            What do you think? Do you think the government made lisp illegal, you moron?

            • 1 year ago
              Anonymous

              >yeah i actually have no idea what I'm talking about, ill just post something condescending and pretend like I'm right
              every time

              • 1 year ago
                Anonymous

                That post tells a lot more about you than me
                Have you ever used PDDL?
                Have you ever implemented a ML algorithm?

              • 1 year ago
                Anonymous

                anon i haven't forgotten about that """reason""" you mentioned.
                do you actually have one or were you talking out of your ass?

              • 1 year ago
                Anonymous

                The reasons can be inferred from the post if you weren't a pea-brain moron
                lisp and functional style in general have no advantage in AI algorithms (especially in ML algos, be they supervised, unsupervised or reinforcement)
                The natural framework of those problems is linear (and nonlinear) algebra and statistics, and they don't fit better in an untyped functional language (quite the opposite)
                The only fit for a language like lisp is the niche where **fully** declarative languages (like prolog) were the best anyway

                Adding

                >reasons to ditch lisp
                1.) Ugly syntax
                2.) Too many dialects
                3.) No standard implementation
                4.) Low user libraries
                5.) Hard to teach
                6.) Niche advantages that are rarely used by the average lisp programmer

                >reasons to use python
                1.) Simple syntax
                2.) 2 dialects, but one is deprecated and is slowly being erased
                3.) 1 accepted implementation with other options for optimization
                4.) Plentiful user libraries
                5.) Easy to teach
                6.) C bindings means python doesn't need to pull niche tricks to be fast.

                There, you've been spoonfed; lisp was always slow as molasses
                Go eat your daily toejam and stop posting, please
                Again, have you ever written a ML algorithm
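The "C bindings" point can be seen even without NumPy: CPython's built-in sum() runs in C, so it beats the equivalent loop executed by the interpreter. A toy comparison (timings will vary by machine):

```python
# Built-in sum() is implemented in C; the manual loop runs in the
# interpreter. Same answer, very different speed.
import time

N = 2_000_000

def manual_sum(n):
    total = 0
    for i in range(n):
        total += i
    return total

t0 = time.perf_counter(); s1 = manual_sum(N);    t_loop = time.perf_counter() - t0
t0 = time.perf_counter(); s2 = sum(range(N));    t_c    = time.perf_counter() - t0

# Cross-check both against the closed form n(n-1)/2.
assert s1 == s2 == N * (N - 1) // 2
print(f"interpreter loop: {t_loop:.3f}s, C built-in: {t_c:.3f}s")
```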

              • 1 year ago
                Anonymous

                are you actually stupid? Meme networks are literally a chain of functions being composed.
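The "chain of composed functions" view can be sketched directly: a network is just compose(layer1, layer2, ...). The scalar "layers" below are invented toys for illustration:

```python
# A network as function composition, in miniature.
from functools import reduce

def compose(*fns):
    # compose(f, g, h)(x) == h(g(f(x))): apply left to right.
    return lambda x: reduce(lambda acc, f: f(acc), fns, x)

scale = lambda x: 2 * x        # toy "layer" 1
shift = lambda x: x + 1        # toy "layer" 2
relu  = lambda x: max(x, 0)    # toy nonlinearity

net = compose(scale, shift, relu)
print(net(3))   # relu(shift(scale(3))) = max(7, 0) = 7
print(net(-5))  # relu(-9) = 0
```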

          • 1 year ago
            Anonymous

            >reasons to ditch lisp
            1.) Ugly syntax
            2.) Too many dialects
            3.) No standard implementation
            4.) Low user libraries
            5.) Hard to teach
            6.) Niche advantages that are rarely used by the average lisp programmer

            >reasons to use python
            1.) Simple syntax
            2.) 2 dialects, but one is deprecated and is slowly being erased
            3.) 1 accepted implementation with other options for optimization
            4.) Plentiful user libraries
            5.) Easy to teach
            6.) C bindings means python doesn't need to pull niche tricks to be fast.

            • 1 year ago
              Anonymous

              It's amazing how moronic nuBOT is. Posts like these make me want to find another ib because clearly this one is lost for good.

              • 1 year ago
                Anonymous

                >pure syntax
                Theoretically pure, but that theory means nothing for the average programmer who has to deal with the ugly syntax.
                >common lisp
                Sure that's your favorite, but there are plenty more competing and still used variants.
                Guile
                Emacs lisp
                Clojure
                Clisp
                Racket
                (The giant scheme gap)
                >implementations
                Same as above
                >low 3rd party libraries
                Unpopular because it is split between all the different dialects. Also, popularity is a valid metric to measure the value of a language.
                >hard to teach
                It's easy at first until you start dealing with all the theory behind metaprogramming that no one wants to talk about.
                >niche advantages that are rarely used
                Well for one everyone wants to talk about how they can in theory treat functions as a list, but I have rarely seen anyone do this.

              • 1 year ago
                Anonymous

                >but that theory means nothing
                no. it is not just "theoretically pure".
                it's unironically easier to learn than python or C syntax since it's literally just (function arg1 arg2 ...)
                that's an advantage since the average programmer can worry less about syntax and more about their actual project. the fact that you're not used to it yourself isn't an argument or a fault of the language
                >Sure that's your favorite
                no. in my initial post, i specifically said common lisp would be more suitable than python for ML. i never mentioned clojure or scheme or anything else. you're adding that in on your own.
                >implementations
                same as above
                >Unpopular because it is split
                maybe it's true, but like i said, that's not a fault of the language.
                >Also popular is a valid metric to measure the value of a language
                no it's not.
                this essay addresses the popularity of some languages very well
                http://paulgraham.com/icad.html
                >Its easy at first until you start dealing with all the theory behind meta programming that no one wants to talk about.
                depends on the complexity of the macros you're writing. your average programmer won't write big complex macros often, if at all. simple macros are actually rather easy. try it out sometime
                >Well for one everyone wants to talk about how they can in theory treat functions as a list, but I have rarely seen anyone do this.
                i mentioned the advantages of lisp above. those should be enough to show that lisp does have tangible advantages for the average programmer. about that specific one, it is used in every lisp program under the sun. when people talk about "treat functions as a list", they mean macros. while you may not write many macros yourself, you most certainly will use them often in any lisp program. built-in or from libraries or whatever, but all it means is that you can use lisp code to modify lisp code like if it was any regular old list, because it is
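The "code is a list" idea is easy to demo outside lisp too: represent (+ 1 (* 2 3)) as nested Python lists, evaluate it, and transform it with an ordinary list-walking function, which is roughly what a macro does. A hedged sketch (the mini-language here is made up):

```python
# Code-as-data in miniature: a Lisp expression as nested Python
# lists, a tiny evaluator, and a macro-style rewrite over the list.
import operator

OPS = {'+': operator.add, '*': operator.mul, '-': operator.sub}

def evaluate(expr):
    # An expression is either a number or ['op', arg1, arg2].
    if not isinstance(expr, list):
        return expr
    op, a, b = expr
    return OPS[op](evaluate(a), evaluate(b))

def double_literals(expr):
    # A "macro": walk the code like any other list and rewrite it.
    if not isinstance(expr, list):
        return expr * 2
    return [expr[0]] + [double_literals(e) for e in expr[1:]]

program = ['+', 1, ['*', 2, 3]]            # (+ 1 (* 2 3))
print(evaluate(program))                   # 7
print(evaluate(double_literals(program)))  # (+ 2 (* 4 6)) = 26
```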

              • 1 year ago
                Anonymous

                It takes a week to teach lisp fully so that students become productive.
                For python, it takes at least 6 months.
                You are delusional.
                >muh favorite
                >muh flavor
                >unironically lists elisp
                b8/8 I responded

                >unironically easier
                It is about as easy as python at best. You still need to learn about all the subtle shit and the standard libs that you would have to learn in python. Also people generally have a harder time grasping functional programming than imperative programming and this can be shown by which is more popular.
                >muh post
                Sure, but justify that to the google engineers that wrote the tensor libraries or to the data engineers that need to write in it.
                >split is not the fault of the language
                It kind of is. The spec is too hands off. When you look at most other programming languages they don't have this problem.
                >popularity is not a valid metric
                Yes it is. This essay depends on the mythic hacker. This titan stands above the industry and commands the directions that people go. This man doesn't exist. Languages get popular because of work. The language has to do something better than everyone else. The language has to justify its use. People have to sell it to their companies to use it. Languages that remain popular were sold to their companies and did what the company expected of them. Languages that were only temporarily popular did not. The languages mentioned in essays didn't remain popular. Perl, lisp, tcl, tk are all dead or on the decline. Ruby, another smash hit, is on the decline as well. Python has been around a long time and hasn't wavered in popularity. I would say that says leagues about the language. The mythic hackerman liked it, and the shareholders and the people who actually have to maintain the code after hackerman fricks off to google like it.
                >macros
                They are part of the reason lisp died. It turned every lisp project into its own subdialect of a dialect.

              • 1 year ago
                Anonymous

                Delusional has a face and it's yours.

              • 1 year ago
                Anonymous

                >delusions
                Kek
                >The google engineers should have used common lisp because I said so
                >Lisp is easier to learn and use that's why its so niche
                >language split and drift is not the fault of a language that barely had a spec
                >I worship mythic hackerman, but when mythic hackerman chooses python to write the ML libraries I just assume mythic hackerman was wrong and that me Joe lisp shelp knows better
                >The best languages are the ones that died after a few years
                >every lisp guide says to avoid macros like the plague if you can, but they are the best part of the language

              • 1 year ago
                Anonymous

                Forgot your meds again. Go get 'em, they're good for you.

              • 1 year ago
                Anonymous

                >t. copegays when they actually have to back their arguments instead of throwing out /misc/ buzzwords
                Kek

              • 1 year ago
                Anonymous

                ML is a scientific problem. You don't need state-of-the-art theoretical CS discoveries to solve an optimization problem.
                You need a good language, fast and simple enough

                That's why lisp is not used in ML industry

              • 1 year ago
                Anonymous

                >ML is a scientific problem.
                True
                >You don't need state of the art theoretic CS discoveries to solve a optimization problem.
                False, a large part of the field is concerned with discovering new optimization methods and several parts of the field work squarely in the theoretical CS arena.
                >That's why lisp is not used in ML industry
                >industry
                No backpedaling, please.
                It is patently false that lisp was not used for ML. Until 2009-2010, there was https://lush.sourceforge.net/ for example. Theano got the ball rolling and the rest was purely momentum.
                Lisp is a superior tool to python for ml in all areas that matter except libraries, but the libraries didn't exist in python at the time it began being used either, so even that's not a real argument.
                The problem is that nobody knew lisp anymore at the time, pure and simple.

              • 1 year ago
                Anonymous

                > False, a large part of the field is concerned with discovering new optimization methods
                in math/algos (and parallelization), not in programming language features, you idiot

                if you think that finding a speedup in matrix products is a theoretical CS problem, you're irremediably hopeless
                > No backpedaling, please.
                take your meds, no one is backpedaling. You are imagining things

                btw
                >lisp is not used
                >well lisp was used
                I think you need a nap

                > Lisp is a superior tool than python for ml in all areas
                just no. Programming language is almost irrelevant to ML, with rare exception

              • 1 year ago
                Anonymous

                >moron doesn't know what optimization is or does
                >thinks ML finds solution by magic or something
                >calls others moronic
                >thinks it's OK for data processing to take so long it dwarfs epoch time
                OK schizo, whatever you say.

              • 1 year ago
                Anonymous

                >moron doesn't know what optimization is or does
                You are the one who is confused about the level at which the optimisation speedup is done.
                The optimization is done at math level most of the time (approximation of proper gradient recursion, truncation of gradient, assuming adaptive weights uncorrelated with past data, synchronization of time and iteration in recursion, matrix inversion lemma, convenient matrix decomposition, etc)

                And on top of that, parallelization whenever possible

                >thinks ML finds solution by magic or something
                I never said that. ML finds solutions by crunching numbers, and whatever crunches them fast while remaining simple is the best
                You don't need hygienic macros, TCO, cons cell, metacircular evaluation or whatever you deranged lisphomosexual claim
                >calls others moronic
                I called you one because you are one
                >thinks it's OK for data processing to take so long it dwarfs epoch time
                Lisp is no faster than python. Data (pre)processing is outside the ML core concepts btw
                >OK schizo, whatever you say.
                Kys moron
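For what it's worth, the "optimization happens at the math level" claim is easy to illustrate: the core of most ML training is a plain gradient loop, expressible in any language. A toy example on a quadratic (the objective and learning rate are made up):

```python
# Minimal gradient descent on a toy objective: the speedups argued
# about above live in the math (the gradient), not the language.

def loss(w):
    return (w - 3.0) ** 2          # minimized at w = 3

def grad(w):
    return 2.0 * (w - 3.0)         # exact analytic gradient

w, lr = 0.0, 0.1
for _ in range(100):
    w -= lr * grad(w)              # plain gradient step

print(round(w, 4))  # converges toward 3.0
```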

              • 1 year ago
                Anonymous

                nta but lithp was once popular as a machine learning language. part of the reason it nearly died was because the AI winter destroyed the lisp machine market.

              • 1 year ago
                Anonymous

                No, it was used in symbolic strong AI/robotics (not even lisp, but lisp-based languages)

                For ml it was never that popular, because ml in those times was mostly algorithms for controllers and sensors

              • 1 year ago
                Anonymous

                > controllers
                *control problem is a better definition

              • 1 year ago
                Anonymous

                >It is about as easy as python at best.
                no. it's easier. you're just having baby duck syndrome. also you can do imperative programming in common lisp just fine.
                >muh post
                wtf? you're the one replying to my post which was specifically about common lisp kek
                and about the implementations, you're really trying hard to make it sound like it's worse than it is.
                >muh popularity
                did you read the essay i linked?
                >macros are part of the problem
                i was just explaining the feature you asked about.

                unironically, learn common lisp. then form real opinions about it instead of parroting bot memes. I'm really not trying to be condescending, but i'm honestly tired of people coming in and saying the same moronic things when they obviously have not looked into it at all

              • 1 year ago
                Anonymous

                >baby duck
                Why did the industry centralize around imperative? Especially when lisp came out in the 50's? Thus most would have baby-ducked to it. It's because functional languages are too closely related to calculus and set theory and not enough to natural language. Functional language patterns expect verbs to memorize information about nouns, because that makes sense in set theory when mapping between sets. When talking about what is intuitive to people, we don't think about state like that. Mapping between sets is already an abstract concept, and then imagining problems like that is another layer of abstraction.
                >it can be imperative
                Almost no one uses lisp like that.
                >you are responding
                Yes because you argue it was a mistake not to implement it in lisp, but then ignore the fact that lisp is less popular and more fragmented than python. Implementing in common lisp would only allow a subset of the lisp community to use the ML libraries while implementing in python would allow all of the python community to use it.
                >did you read the essay I linked
                Yes, but his argument fails to materialize in real life. The mythic 20-to-1 ratio only really applies in C and more verbose languages and isn't as big a lead over python. If you were to use his approach for something like ML, you would require people to learn both lisp and the DSL that would come with tensor. Also, his approach has a fragmenting effect on reusability, as every program or library that you would write would end up with its own DSL. This again is an issue of sub-fragmenting a language.
                >learn common lisp instead of parroting BOT memes
                What BOT memes? Who even talks about common lisp here except people in the lisp generals, who are all pro-lisp by the way. This was my take on lisp after learning guile. When reading the docs on that, the whole take was: don't use macros unless you absolutely have to.

              • 1 year ago
                Anonymous

                It's a waste of time
                Lispgays are like Jehovah's witnesses
                They are all schizos who think lisp is the best programming language ever conceived, and they fight about whether a particular language is really a lisp, or they fight their imaginary enemy language
                I learned lisp, it was cool but nothing out of this world honestly

              • 1 year ago
                Anonymous

                Honestly enjoyed Haskell more than lisp. Lisp just becomes a mess of parentheses too quickly; even when you have an editor or an IDE to manage it, it is a pain.

              • 1 year ago
                Anonymous

                I never touched Haskell but I felt the same for ocaml. It was nicer to me

              • 1 year ago
                Anonymous

                >Lisp just becomes a mess of parenthesis too quickly even when you have an editor or an IDE to manage it
                and the truth finally surfaces
                you could have just said "i cant into parens" that's literally the real gripe homosexuals have with lisp, everything else is just cope

              • 1 year ago
                Anonymous

                people could just press a button in the editor and activate a form of m-exps without too many parens for lisp, but they are so limited and uncreative that they don't even consider that possibility.

              • 1 year ago
                Anonymous

                It takes a week to teach lisp fully so that students become productive.
                For python, it takes at least 6 months.
                You are delusional.
                >muh favorite
                >muh flavor
                >unironically lists elisp
                b8/8 I responded

              • 1 year ago
                Anonymous

                >For python, it takes at least 6 months.
                lol

              • 1 year ago
                Anonymous

                That's right, normal students aren't as slow as you and don't typically require the 10 years and counting you've been at it. But don't give up, timmy, one day you'll be able to program something more interesting than fizzbuzz! Ganbatte~

              • 1 year ago
                Anonymous

                Wow you sure told him anon

              • 1 year ago
                Anonymous

                anon, you're responding to someone with the same position as you.
                he's saying lol because python can be learned very quickly.

            • 1 year ago
              Anonymous

              >t. has never touched lisp
              >Ugly syntax
              pure syntax
              >Too many dialects
              i specifically mentioned common lisp above.
              >No standard implementation
              SBCL is what everyone uses nowadays. ANSI CL is the standard all the implementations must follow
              >Low user libraries
              less than python, but that's just because it's unpopular. nothing to do with the language itself being bad.
              >Hard to teach
              false. in fact it's probably one of the easiest languages to teach, precisely due to its pure and simple syntax.
              >Niche advantages that are rarely used by the average lisp programmer
              what? can you elaborate on this one? I think the advantages of lisp seem "niche" to the average blub programmer since they don't have them in their language, but the average lisp programmer definitely appreciates these features (interactive programming, macros, higher order functions, closures, etc)

              The reasons can be inferred from the post if you weren't a pea-brain moron
              lisp and functional style in general have no advantage in AI algorithms (especially in ML algos, be they supervised, unsupervised or reinforcement)
              The natural framework of those problems is linear (and nonlinear) algebra and statistics, and they don't fit better in an untyped functional language (quite the opposite)
              The only fit for a language like lisp is the niche where **fully** declarative languages (like prolog) were the best anyway

              Adding [...] There, you've been spoonfed; lisp was always slow as molasses
              Go eat your daily toejam and stop posting, please
              Again, have you ever written a ML algorithm

              >bro just let me post bullshit and infer some good arguments from that
              wtf?
              >have you ever written a ML algorithm
              yes. multiple. in python, and CL would be better unironically
              also common lisp is not purely functional, it is strongly typed and way faster than python. go look up what common lisp is before being condescending about it online.

              It's amazing how moronic nuBOT is. Posts like these make me want to find another ib because clearly this one is lost for good.

              it might just be bait honestly. specifically the ESL anon hasn't posted anything of actual value yet

              • 1 year ago
                Anonymous

                >in fact it's probably one of the easiest languages to teach, precisely due to it's pure and simple syntax.
                lisp syntax is yet another example of the fact that simple doesn't mean easy. nobody wants to sit there conforming his thoughts to an alien structure regardless of its conceptual purity and simplicity.
                and this is leaving aside paren complaints; postfix has the same problem. it's just not how people think so you have to do extra translation steps to tell the computer what to do.

              • 1 year ago
                Anonymous

                you people are moronic, there's nothing wrong with lisp syntax, it's just (blah blah (blah blah)), it's easy

              • 1 year ago
                Anonymous

                Yes, I am aware what the syntax is and how easy it is to describe.

              • 1 year ago
                Anonymous

                >nobody wants to sit there conforming his thoughts to an alien structure regardless of it's conceptual purity and simplicity.
                How do you explain mathematicians? They’ve been developing a complicated difficult and extremely terse notation for thousands of years. We are at the point now where no one cares if you are filtered by mathematical notation.
                Code monkey seething is ridiculous, just because you want to import a library and get something done as easily as possible doesn’t mean there is no value in other “esoteric” or “alien” languages.

              • 1 year ago
                Anonymous

                >They’ve been developing a complicated difficult and extremely terse notation for thousands of years.
                more density = easier to read more information, this is scientifically proven

              • 1 year ago
                Anonymous

                That is my point. And it doesn't stop millions of people from bashing their head against a wall trying to learn all sorts of mathematics every day. No idea why people hate on languages with syntax/notation that is designed with a purpose. Of those, Forth and APL/J/K come to mind first, but Lisp's syntax is clean and easy. Don't get the parens hate.

              • 1 year ago
                Anonymous

                >Don’t get the parens hate
                indoctrination in schools unironically, if we taught apl in school or uni, people would find C ugly

              • 1 year ago
                Anonymous

                Lisp isn't hard to learn because of the core syntax. It's hard to learn because of all the other idiosyncratic details of any specific Lisp that you have to learn before you can do simple things. Common Lisp being a lisp-2, for example. Clojure has some weird javalike scaffolding; understanding how entrypoints work can be a little painful for a new programmer. Racket is full of autistic academic shit that you hit the moment you try to do something you think would be simple like load some data from json or yaml.

                None of those issues have anything to do with syntax. You can even use Hy, which is basically python with lisp syntax, to get a feel for how little the parens really matter.

              • 1 year ago
                Anonymous

                >Racket is full of autistic academic shit that you hit the moment you try to do something you think would be simple like load some data from json or yaml.
                While I haven't dealt with json or yaml in racket, your claim that it's full of academic crap is false. Go to docs.racket-lang.org and type json or yaml and you will get what you want. Racket is by far the most practical Scheme, they have a lot of libraries and their documentation is one of the best because they all use the same documentation tool unlike Python.
                I also assume from your post that Lisps are harder than Python, with which I disagree. Where does this nonsense that Python = easy come from? Is it the one-line hello world? The only easy thing in Python is the ability to write shit code because language features are poorly implemented (lambda, comprehensions, conditional expressions, syntax). That there is a library for almost everything is true, but nobody talks about their quality. They all use different documentation layouts, all are incompatible with each other, and aren't even compatible with a lot of the language's features. Look at numpy, like why can't I do math operations on their shitty built in types? Does the language not have interfaces, contracts, traits or whatever they call them?
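                For what it's worth, numpy arithmetic does work with plain operators; Python's mechanism here is the dunder-method protocol, which is the closest thing it has to interfaces/traits. A minimal sketch (assuming numpy is installed; `Vec` is a made-up toy class):

```python
import numpy as np

# Python's answer to "interfaces" is operator protocols: any object that
# defines __add__, __mul__, etc. participates in math syntax, and
# numpy's ndarray does exactly that.
a = np.array([1, 2, 3])
b = np.array([10, 20, 30])
print(a + b)       # elementwise: [11 22 33]
print(a * 2)       # elementwise: [2 4 6]

# the same protocol in plain user code:
class Vec:
    def __init__(self, xs):
        self.xs = list(xs)
    def __add__(self, other):
        return Vec(x + y for x, y in zip(self.xs, other.xs))

print((Vec([1, 2]) + Vec([3, 4])).xs)   # [4, 6]
```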

              • 1 year ago
                Anonymous

                If you want a real response, try again to make an intelligent post and not a totally moronic one.

        • 1 year ago
          Anonymous

          The reason being that symbolic AI fell out of favor and it was used mostly for its symbolic manipulation capabilities. That AI winter singlehandedly killed lisp. That's literally the only reason.

          • 1 year ago
            Anonymous

            because the idea that intelligence was MODEL-PLAN-ACTION was the kind of moronation you could only see at MIT

            I wouldn't call it a "winter" but rather a spring. Rodney Brooks's behaviorist intuition was a savior for the field: stop defining "intelligence" as an intrinsic property of an agent, and instead treat it as a property attributed by an external observer to "emergent intelligent behavior" arising from hard-wired stimuli and inhibitors.

            It's amazing how moronic nuBOT is. Posts like these make me want to find another ib because clearly this one is lost for good.

            you're free to take your uneducated ass away from here, and please do

            • 1 year ago
              Anonymous

              How did it save anything when it didn't have any impact whatsoever on deep learning or its development (not to mention it already existed at the time)?

              • 1 year ago
                Anonymous

                deep learning is only a subset of ML, which is a subset of AI
                There is a lot more to AI than deep learning

                >t. has never touched lisp
                >Ugly syntax
                pure syntax
                >Too many dialects
                i specifically mentioned common lisp above.
                >No standard implementation
                SBCL is what everyone uses nowadays. ANSI CL is the standard all the implementations must follow
                >Low user libraries
                less than python, but that's just because it's unpopular. nothing to do with the language itself being bad.
                >Hard to teach
                false. in fact it's probably one of the easiest languages to teach, precisely due to its pure and simple syntax.
                >Niche advantages that are rarely used by the average lisp programmer
                what? can you elaborate on this one? I think the advantages of lisp seem "niche" to the average blub programmer since they don't have them in their language, but the average lisp programmer definitely appreciates these features (interactive programming, macros, higher order functions, closures, etc)
                [...]
                >bro just let me post bullshit and infer some good arguments from that
                wtf?
                >do you ever wrote a ML algorithm
                yes. multiple. in python, and CL would be better unironically
                also common lisp is not purely functional, it is strongly typed and way faster than python. go look up what common lisp is before being condescending about it online.
                [...]
                it might just be bait honestly. specifically the ESL anon hasn't posted anything of actual value yet

                > yes. multiple. in python, and CL would be better unironically
                so have I, for a living. And never once in my life have I thought that lisp was better suited for a deep reinforcement learning algorithm
                I'm sure you did in your fantasy

              • 1 year ago
                Anonymous

                >deep learning is only a subset of ML which is a subset of AI
                And all useful ML is DL, just as all AI that works is ML in the exclusive sense of DL (RL also is just DL nowadays).
                >There is a lot more to AI that deep learning
                For all intents and purposes, DL is 100% of AI.
                Also, stop moving the goalposts. You claimed that moving away from an intrinsic to an extrinsic view of intelligence helped the field, yet the field is entirely DL, there have been no real advances in any other direction, and you implied DL isn't what you mean. Clearly huge contradictions here.

              • 1 year ago
                Anonymous

                >And all useful ML is DL, just as all AI that works is ML in the exclusive sense of DL
                not remotely true

                >RL also is just DL nowadays)
                not true again. There are a lot of soft actor-critic algorithms that work without necessarily using DL
                > You claimed that moving away from intrinsic to extrinsic view of intelligence helped the field, yet the field is entirely DL
                you are still reiterating your wrong thesis. I don't know what kind of discussion you're used to
                > there have been no real advances in any other direction,
                In the behaviorism / soft-AI approach there were big advances: first of all, the speed of action avoided divergence in the modelling, and most importantly, robots worked without the closed-world assumption, which was a big chore at the time

                >and never once in my life I thought that lisp were better suited for a deep reinforcement learning algorithm
                lol maybe because you have no idea what lisp is anon
                try looking it up

                sure thing bud. Whatever makes you sleep at night

              • 1 year ago
                Anonymous

                >facts aren't real because... they just aren't, OK?
                Take your meds, schizo.

              • 1 year ago
                Anonymous

                Worst larp post I've ever seen on BOT, well done. Want a medal? You'll have to imagine it, but you seem quite good at doing that.

              • 1 year ago
                Anonymous

                >and never once in my life I thought that lisp were better suited for a deep reinforcement learning algorithm
                lol maybe because you have no idea what lisp is anon
                try looking it up

      • 1 year ago
        Anonymous

        Unironically common lisp would be just about perfect for deep learning.
        >compiles to very fast code
        >best in class development velocity
        >won't lose your work for a typo, not only because you will know about typos in advance but because worst case scenario you get the restart handler to fix and continue your work

      • 1 year ago
        Anonymous

        wolfram language already exists

    • 1 year ago
      Anonymous

      >What language would you use?
      bash

    • 1 year ago
      Anonymous

      OP here, Lisp of course

    • 1 year ago
      Anonymous

      >What language would you use?
      Kotlin

    • 1 year ago
      Anonymous

      Lua, the language torch and subsequently waifu2x were written in
      python is only used because universities teach everything in it, and universities use it only because students straight up demand it (and will leave poor ratings for your course otherwise). universities listen because students pay handsomely, and students demand python because they're all midwits who barely know anything about technology other than the six digit income they're about to make. For a more realistic answer at this point I'd pick julia

    • 1 year ago
      Anonymous

      Java, sir

    • 1 year ago
      Anonymous

      Lua or js

  2. 1 year ago
    Anonymous

    Python is like a car with good ergonomics for the ML people. The actual math and algorithmics are much more nuanced and involve a lot of JIT compilation, C++ code, computation graphs and linear algebra.

    Python just happens to be the tool of choice for comfortable manipulation of high-level concepts for high IQ people.
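    To make the division of labor concrete, a minimal sketch (numpy assumed): the high-level description lives in Python, the actual number crunching does not.

```python
import numpy as np

# Python is only the steering wheel here: the line `a @ b` is a thin
# dispatch into numpy's compiled matrix-multiply kernel (typically a
# BLAS routine), which is where essentially all of the time is spent.
a = np.ones((256, 256))
b = np.ones((256, 256))
c = a @ b                 # runs in compiled code, not the interpreter
print(c.shape, c[0, 0])   # (256, 256) 256.0
```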

  3. 1 year ago
    Anonymous

    because ML's most important applications are in science and engineering and ML researchers know that lab scientists and engineers don't have the fricking time to work through structure and interpretation of computer programs. python just werks

  4. 1 year ago
    Anonymous

    ML uses c/c++/rust python is just glue.

    • 1 year ago
      Anonymous

      Requirements for ML:
      - Quick iteration time (lots of parameters to check for any arch choice)
      - Ease of modifying the overall arch/workflow
      - Accelerated single-block compute (all compute happens at once, repeatedly, with no need to move back to python runtime for a long long time)
      - Good tool facilities (tensor manipulation, text and stats/prob toolkits, etc.)

      Things nobody gives a shit about:
      - Fast non-library speed (virtually no time is spent in the python runtime)
      - Verbosity
      - Low level memes (makes modifications harder, too verbose, greatly reduces turnaround time, error risk requires far more time and effort to work out, no room for mistakes due to long runtimes)

      rust
      C++
      typescript
      go

      anything that provides some ability to incorporate it into a useable product

      A ML algorithm was trained to quantify the brain size of posters based on their posts. It has an average error rate of 0.001 cubic centimeters. I used it on your post. Here is the result:
      >Error: brain not found
      It's the first time it ever did that. Strange!

      • 1 year ago
        Anonymous

        > Things nobody gives a shit about:

        they literally have no idea if they should give a shit about it. Nvidia has really cucked these researchers into thinking chunky GPU ops are the only way

        • 1 year ago
          Anonymous

          Show another way.

        • 1 year ago
          Anonymous

          ML is accelerated by using vector and matrix ops, which is exactly what the GPU does. what would you do instead?

          • 1 year ago
            Anonymous

            show me good research in dynamic sparsity or networks with tight control flow

            doesn't happen cuz paper farms know that scaling up the latest LLM with nvidias hardware is the way to go for an easy publication in one of like 300 annual conferences

            • 1 year ago
              Anonymous

              Scaling papers barely get published. Nowadays the popular paper scams are either: copy some technique from 5 years ago and rename it (UNet, cyclegan, latent space diffusion models for recent examples), bribe (coordconv, for example), or dazzle with (wrong) math (basically all the mathy papers in the past 10 years).

        • 1 year ago
          Anonymous

          Goddamn this truly is the lowest IQ board

          • 1 year ago
            Anonymous

            Nvidia shill confirmed

            CPU AMX instructions (intel, apple, arm) gonna crush the GPU "muh flops" watt wasters

      • 1 year ago
        Anonymous

        >Requirements for ML:
        >- Quick iteration time (lots of parameters to check for any arch choice)
        >- Ease of modifying the overall arch/workflow
        >- Accelerated single-block compute (all compute happens at once, repeatedly, with no need to move back to python runtime for a long long time)
        >- Good tool facilities (tensor manipulation, text and stats/prob toolkits, etc.)
        >Things nobody gives a shit about:
        >- Fast non-library speed (virtually no time is spent in the python runtime)
        >- Verbosity
        >- Low level memes (makes modifications harder, too verbose, greatly reduces turnaround time, error risk requires far more time and effort to work out, no room for mistakes due to long runtimes)
        unironically, just use forth
        https://mind.sourceforge.net/mind_faq.html

        • 1 year ago
          Anonymous

          So what library, written in Forth, is equivalent to Keras or PyTorch again?

          • 1 year ago
            Anonymous

            forth never had any investment made by big corporations, but it's entirely possible to do modern ai software with it. forth was an ai language in the old days, but people only remember lisp.

      • 1 year ago
        Anonymous

        I’d very much like a faster runtime and especially something statically typed. It feels pretty moronic to coerce a simple loop into a matrix operation just so it doesn’t take ages. And it’s significantly nicer to get an error up front rather than four hours into a run on a compute cluster.
        t. phd student doing ML
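        The coercion anon describes, sketched with a toy sum-of-squares (numpy assumed): both forms compute the same thing, but only the second avoids the interpreter loop.

```python
import numpy as np

x = np.arange(1000, dtype=np.float64)

# the "simple loop": every iteration runs in the interpreter, slow
total = 0.0
for v in x:
    total += v * v

# the same computation coerced into one vector op: a single call
# into compiled code, orders of magnitude faster at scale
total_vec = float(np.dot(x, x))

assert abs(total - total_vec) < 1e-6
print(total_vec)
```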

        • 1 year ago
          Anonymous

          OK you surely know better than people in the field that have actually done valuable work. Fricking midwit

        • 1 year ago
          Anonymous

          so basically you're looking for nim

        • 1 year ago
          Anonymous

          If you're getting python-level runtime errors after 4 hours it probably means you didn't test on sufficiently good dummy data. (Other reasons usually have nothing to do with the language or runtime, like corrupt data or storage flaking out, etc).
          As for coercing the matrix, typically the tradeoff is a few awkward optimization points vs dealing with constant tedium of declaring types that are obvious or writing extra boilerplate to deal with user input.
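          The dummy-data testing anon describes can be as simple as this sketch (all names here are hypothetical, numpy assumed):

```python
import numpy as np

def training_step(batch, weights):
    # stand-in for a real step: one linear-layer forward pass
    return batch @ weights

def smoke_test(step, feature_dim, weights):
    # run the step once on a tiny random dummy batch so shape and
    # dtype errors surface in seconds, not hours into a cluster run
    dummy = np.random.rand(2, feature_dim)
    out = step(dummy, weights)
    assert np.all(np.isfinite(out)), "non-finite output on dummy data"
    return out.shape

w = np.random.rand(8, 4)
print(smoke_test(training_step, 8, w))   # (2, 4)
```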

        • 1 year ago
          Anonymous

          > feels pretty moronic to coerce a simple loop into a matrix operation

          this is the crux of it.

          "it's written in C++" posters don't understand that you have to coerce so much shit into pre-defined python APIs and have no easily accessed escape hatch

    • 1 year ago
      Modern Systems programming language

      You meant to say C/C++/CUDA

      • 1 year ago
        Anonymous

        ML uses c/c++/rust python is just glue.

        Forgot Fortran.

        • 1 year ago
          Anonymous

          Nothing is actually written in Fortran anymore.

          • 1 year ago
            Anonymous

            BLAS implementations that are used by e.g. numpy (they're pluggable) are often ancient fortran libraries. Nobody uses it for new stuff though, that's true.
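            You can check which BLAS/LAPACK a given numpy build is linked against (the output varies by install):

```python
import numpy as np

# prints the BLAS/LAPACK libraries this numpy build was compiled
# against; on many installs that's OpenBLAS, which bundles the old
# Fortran LAPACK routines
np.show_config()
```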

            • 1 year ago
              Anonymous

              I bet there are still oldgays writing MPI jobs in fortran.

              • 1 year ago
                Anonymous

                FORTRAN programmers divide into two groups: the ones who understand the language is amazing as a backend for many number-crunching APIs, and the old farts (and their students) running simulations with the most unmaintainable code you will ever see. Or maybe not the most, but it's still pretty bad.

              • 1 year ago
                Anonymous

                tbqh nothing is really there to rival Fortran when you have to develop HPC code for clusters where you can't log into the worker nodes to debug.

          • 1 year ago
            Anonymous

            Not true. OpenBLAS uses lapack which is Fortran. That's just off the top of my head, I believe there are several other commonly used python libraries that use fortran.

    • 1 year ago
      Anonymous

      This thread reads like a bunch of underage wannabe programmers/MLers talking out of their asses. Thread should have ended with

  5. 1 year ago
    Anonymous

    Today you've realized that programming languages are tools and specializing in deep "technical knowledge" of a language is not as valuable

  6. 1 year ago
    Anonymous

    Would you rather ML use assembly and take years to come up with a dataset that recognizes the text "Hello World"?

    ML is supposed to be easy to code. Python is just the interface.

  7. 1 year ago
    Anonymous

    Python is the programming language of everyone who isn't a developer but needs some way of getting the computer to do whatever he needs to achieve some other goal.

    Before python, these people likely used Autohotkey, Excel formulas, VBA, Bash scripts or maybe PHP

    • 1 year ago
      Anonymous

      or just hire programmers instead??

      • 1 year ago
        Anonymous

        Eh. I'm not talking about hiring.
        I'm talking about the people that have an interest in for example Data Science or ML.
        They don't necessarily also have to be programmers but they'll need something.

      • 1 year ago
        Anonymous

        >hire two people to do one job
        >each must be paid more than the average CEO
        >one of them isn't supposed to do anything 99% of the time as he waits for the other to program the specified solution
        >must wait hours to change a single parameters instead of seconds because of the "two body" problem
        Get a brain, cletus.

        • 1 year ago
          Anonymous

          >implying 1:1 ratio
          1 programmer could satisfy multiple ai researchers at the same time

          • 1 year ago
            Anonymous

            Wow, what a bawd.

          • 1 year ago
            Anonymous

            But the one waiting is not the dev, it's the researcher. You can't modify your shit if you don't have the results from the previous run to decide what to change next, but here you're waiting forever for the dev to implement what you described (maybe he's busy with something else, maybe he's out to lunch, maybe he has trouble understanding your request and you have to meet him to explain, etc.), just for you to run it because only you are well-placed to judge if the numbers you're getting show the training is worth continuing or not. And if the dev is incompetent, ho lee shiet.
            All this for literally 0 advantage because the training time will be the same anyway since it's all time spent in cuda, not in the python interpreter.

      • 1 year ago
        Anonymous

        Programmers aren't mathematicians or AI researchers

      • 1 year ago
        Anonymous

        Perfect, you now need to teach a programmer how neural networks work. You better pray HR chose you a person with a proper math background and not some random bootcamper.

      • 1 year ago
        Anonymous

        That's the first mistake if you actually want to make a product.

      • 1 year ago
        Anonymous

        >paying professionals when gradserfs will work for free
        wow, it's like you want it done properly or something

    • 1 year ago
      Anonymous

      Before python, they used bespoke solutions like LUSH for the most part. It was common for labs to have their own special snowflake language based on their focus (some had probabilistic languages, some used lisp variants or languages with logic programming support if they were working with symbolic stuff, etc.)
      Another common choice was matlab. The more engineery places used fortran. C was also used, but rarely.

      • 1 year ago
        Anonymous

        I want the usual BOT shitposter to see some of that old "scientist taking up programming" code from before the mass adoption of Python just to see how many cases of sudden heart attacks are reported in the news

        • 1 year ago
          Anonymous

          It's still a huge problem. Any physic**t or mathtard uses C nowadays and the code literally doesn't relies on "running on linux having been compiled with gcc in debug mode and no optimization" to not segfault, if you're lucky enough that it even compiles at all. It's also always slow and completely unreadable (single-letter names everywhere, but importantly even pajeet seems to have a better grasp of programming concepts). Truly dreadful.

          • 1 year ago
            Anonymous

            literally relies*
            or literally doesn't run without relying on*

          • 1 year ago
            Anonymous

            lmao
            some things don't change as much, after all

  8. 1 year ago
    Anonymous

    >is the entire ML field secretly composed of midwits?
    They're doing statistical analysis and pretending it's something new, so you tell me.

  9. 1 year ago
    Anonymous

    ML uses numpy, numpy is written in C++ (?

    • 1 year ago
      Anonymous

      ML does not use numpy lol

      • 1 year ago
        Anonymous

        Yes it does, scikit-learn takes advantage of pandas and numpy.
        https://scikit-learn.org/stable/install.html

      • 1 year ago
        Anonymous

        Are you fricking moronic

        Just about any data loading and prep that's done uses Numpy, and if it's done with Pandas it's still backed by Numpy

    • 1 year ago
      Anonymous

      Python is an incredibly well-designed language, and the libraries are all written in C(++) anyways, so it isn't like the 250ms python takes to call the C library matters when gradient descent takes days/weeks/months

      PyTorch and TensorFlow are both implemented in C++. Your post was correct in spirit I suppose.
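      Whatever the exact figure, the per-call dispatch cost is easy to measure directly (numpy assumed), and the point that it vanishes next to days of gradient descent holds either way:

```python
import time
import numpy as np

a = np.ones(10)

# time a tiny numpy call: at this size, nearly all of the cost is the
# Python-to-C dispatch overhead itself (typically microseconds rather
# than milliseconds on modern machines)
n = 100_000
t0 = time.perf_counter()
for _ in range(n):
    np.dot(a, a)
per_call = (time.perf_counter() - t0) / n
print(f"~{per_call * 1e6:.2f} us per call")
```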

      • 1 year ago
        Anonymous

        >Python is an incredibly well-designed language
        My fricking sides!

        • 1 year ago
          Anonymous

          > >python le badly designed language
          >brainlet detected
          >One of the most popular starting language
          >Used by almost everyone in IT
          >Used by tons of people outside of the computer science field
          >used by a ton of developers
          If python is a poorly designed language then there is no well designed language.

          • 1 year ago
            Anonymous

            You must have an IQ of 18 or above to browse this site.

            • 1 year ago
              Anonymous

              Then why are you here?

          • 1 year ago
            Anonymous

            The fact that it isn't well designed is why it's popular. Software that is better in principle comes with everything you need. Software that is inferior does not, and someone has to fill it in. They do it by building tools. Every third-party “tool” is actually a weakness: it’s doing something that either didn’t need to have been done at all, or could have been done better in the first place. However, it creates a community. There’s a way for people to contribute. It’s like a game that leaves little bonuses lying around for people to pocket. If even a novice can make a small contribution, so much the better. They feel good about themselves, and now they’ve made a commitment to that ecosystem. The switching costs suddenly become significantly higher.

            In the realm of programming languages, this is especially true. In fact, to be “successful” in terms of size, perhaps every language should start out somewhat flawed: it should have bad scope rules, weird modules, funky overloading, floating point as the only number type, and so on. To a rational observer, this might seem an awful idea, but programmers, as a species, have gotten acculturated to salt mines as a natural habitat. They will think nothing of it. Of course, you need something to draw them in: “one weird trick” that lets them do entirely unnecessary and perhaps wholly unwise things that let them demonstrate virtuosity. The trick draws them in, and the flaws seal the commitment. One could view C++ templates as such a trick.

            Customers regard more highly an organization that has recovered well from failure than one that has not yet failed. Good designs are like the latter: they do not fail, and therefore cannot pleasantly surprise you. Inferior designs fail, but when the community comes to your rescue, you are left happier than you were before.

            • 1 year ago
              Anonymous

              >Software that is better in principle comes with everything you need.
              So python? The standard library is pretty massive. Is this a botpost or something it's borderline nonsensical.

            • 1 year ago
              Anonymous

              >One could view C++ templates as a form of the latter.
              Template metaprogramming was a mistake.

            • 1 year ago
              Anonymous

            Are you shitposting or have you never programmed? A language should not include everything, and not every library is a weakness. You don't know the first thing about maintaining code. When you add a feature you are making a commitment to it. The features you add increase scope. The point of a language is to be a tool to solve problems. A good language is flexible enough to solve most problems while being well optimized and easy to use, or it is designed to solve a specific problem very well. 3rd party libraries are recognition of a good language: if someone with no connection to your language wants to maintain a 3rd party library in it for free, that means you have done a good job. They view your language as being worthy of their free time. 3rd party libraries matter because they let you solve more problems out of the box without expanding the language maintainers' scope. It means someone else has made a promise to maintain that problem domain. If they are good enough they may actually get merged into the standard library.

          • 1 year ago
            Anonymous

            python posters can just use a low level fallacy to defend their shit lang

            • 1 year ago
              Anonymous

              >le python posters
              Says the lisp poster. You are pushing a 70's meme. There are barely any real world applications of lisp and thousands of python applications. Your post is syntactically pure cope.

              >Lisp just becomes a mess of parenthesis too quickly even when you have an editor or an IDE to manage it
              and the truth finally surfaces
              you could have just said "i cant into parens" that's literally the real gripe homosexuals have with lisp, everything else is just cope

              >into parens
              I gave detailed reasons why I didn't like lisp; the parens were just the cherry on top of the shit sandwich. Being able to deal with them doesn't mean I like them. You have so much lisp Stockholm syndrome that you can't even admit how annoying they are.

              • 1 year ago
                Anonymous

                >thousands of python applications
                another low level fallacy!

              • 1 year ago
                Anonymous

                How is it a fallacy to cite repeated past success as evidence of something being good, versus the alleged benefits of something else that, while existing longer, has produced far fewer successful products? The quality of a tool is defined by how well it gets a job done, and Python has plenty of evidence that it more than meets this standard while Lisp's record is underwhelming.

              • 1 year ago
                Anonymous

                >How is it a fallacy to cite repeated past success as evidence of something being good, versus the alleged benefits of something else that, while existing longer, has produced far fewer successful products? The quality of a tool is defined by how well it gets a job done, and Python has plenty of evidence that it more than meets this standard while Lisp's record is underwhelming.
                average python defender

              • 1 year ago
                Anonymous

                >There are barely any real world applications of lisp
                Apart from Emacs, all lisp applications are either very expensive or so domain specific you'll never hear of them if you don't work at that one weird place just outside of Boston.

              • 1 year ago
                Anonymous

                Fwiw I agree with most of the points wrt Python over Lisp, but complaining about parens does make you look like an entry-level homosexual who doesn't know shit about programming.

      • 1 year ago
        Anonymous

        >Python is an incredibly well-designed language
        I use Python... and it really isn't well-designed. It's got some pretty clothes that it throws on over the top, but that's it.
        You've just not seen what good languages actually are.

    • 1 year ago
      Anonymous

      Python is an incredibly well-designed language, and the libraries are all written in C(++) anyways, so it isn't like the 250ms python takes to call the C library matters when gradient descent takes days/weeks/months

      PyTorch and TensorFlow are both implemented in C++. Your post was correct in spirit I suppose.

      Python's just used to call libraries written in C/C++.

      What's Tensorflow written in?

      > written in C++

      https://pytorch.org/get-started/pytorch-2.0/

      > we announce torch.compile, a feature that pushes PyTorch performance to new heights and starts the move for parts of PyTorch from C++ back into Python

      out of touch and btfo'd

      • 1 year ago
        Anonymous

        script kiddies who take themselves too seriously always make attempts like this, but in the end these attempts fail. you're best off using different languages for maxing out performance, for maxing out computational complexity and program depth, and also for manchildren too low iq to learn more abstract concepts but who want to feel like adults (hence why python and javascript also exist)

        • 1 year ago
          Anonymous

          Pretty sure python is still the only way to program for TPUs

      • 1 year ago
        Anonymous

        script kiddies who take themselves too seriously always make attempts like this, but in the end these attempts fail. you're best off using different languages for maxing out performance, for maxing out computational complexity and program depth, and also for manchildren too low iq to learn more abstract concepts but who want to feel like adults (hence why python and javascript also exist)

        https://github.com/pytorch/pytorch

        • 1 year ago
          Anonymous

          https://github.com/tensorflow/tensorflow

        • 1 year ago
          Anonymous

          https://i.imgur.com/19pzhyJ.png

          https://github.com/tensorflow/tensorflow

          did you not see this part?

          > starts the move for parts of PyTorch from C++ back into Python

          it's not done yet lol

  10. 1 year ago
    Anonymous

    >why the frick does ML use python

    Because "data scientists" just need something to glue libraries together.

  11. 1 year ago
    Anonymous

    I mean it works doesn't it?

    Why should they pain themselves learning C, C++, Rust, etc. just to get a little better performance, when Python is a way easier language to learn and understand?

    Just because Python is an "easy" language, it does not mean that ML is easy

  12. 1 year ago
    Anonymous

    Cause they're not dumb. The Python calls the C libraries and just passes all the parameters. Why spend time fricking with memory when you need to do a lot of testing and fiddling with something? Python removes an entire class of errors through its garbage collector, which is useful when you're dealing with ML.
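    A minimal sketch of the "Python just passes the parameters" part, using the stdlib ctypes module to call cos() from the C math library (library lookup via ctypes.util.find_library; the resolved name is platform-dependent):

```python
import ctypes
import ctypes.util
import math

# Locate the C math library; the name differs per platform
# (libm.so.6 on glibc Linux, libm.dylib on macOS, ...).
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the C signature: double cos(double)
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

# Python does no math here: it marshals the float and hands it to C.
print(libm.cos(0.0))  # 1.0
```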

  13. 1 year ago
    Anonymous

    numpy, tensorflow, pandas, and whatever other python libraries that do intensive numerical stuff are all really precompiled binaries with python interfaces. As long as the programmer isn't moronic and doesn't reinvent some time-complex procedure in interpreted Python code (non-cs researchers do odd things) then it doesn't really matter. I dislike Python for being weakly typed, but would rather it for ML over the clusterfrick that is C++ ten times out of ten.

    • 1 year ago
      Anonymous

      Python is not weakly typed. There's just no static compile-time type checking. It's dynamically typed. C is weakly typed.

  14. 1 year ago
    Anonymous

    It used to be Lua. See luatorch.

    • 1 year ago
      Anonymous

      It was just called torch back then but yeah. Torch got a python port they didn't care about too much, but as theano died, and everyone was using python because of theano (plus tensorflow was python for the same reason), they fully reprioritized to python, sadly.

  15. 1 year ago
    Anonymous

    It's because data scientists aren't programmers and need something that will work without too much bullshit.

    • 1 year ago
      Anonymous

      Some are, some aren't. Both choose python for good reasons.

  16. 1 year ago
    Anonymous

    Python's just used to call libraries written in C/C++.

  17. 1 year ago
    Anonymous

    ML is hard enough without some neckbeard helicoptering his dick about curly braces

  18. 1 year ago
    Anonymous

    Python is high IQ. Choosing something in between Python and C is literally the midwit route.

  19. 1 year ago
    Anonymous

    >"I'm too smart for Python and therefore why does anyone use it?"

    • 1 year ago
      Anonymous

      >made up curve, artificially made to look smooth, unlabelled y axis
      go read the actual paper. the real result doesn't look like that.

    • 1 year ago
      Anonymous

      https://i.imgur.com/uYZs4NJ.png

      >made up curve, artificially made to look smooth, unlabelled y axis
      go read the actual paper. the real result doesn't look like that.

      The Dunning-Kruger effect has failed replication. So ironically, unironically referencing the effect to try to make a point is Dunning-Kruger in and of itself.

      • 1 year ago
        Anonymous

        unironically quite ironic indeed

  20. 1 year ago
    Anonymous

    Because Python is easy to type and people don't want to waste more time than they have to writing moronic syntax for no reason.

  21. 1 year ago
    Anonymous

    use nim morons

  22. 1 year ago
    Anonymous

    I never understood why Lua didn't take off for ML.
    Lua is easier to bind to C/C++.
    Lua is more simple than Python.
    Also, mathgays largely prefer 1-based indexing.

    • 1 year ago
      Anonymous

      MLgays are mostly cstards, not mathinbreds, so that wasn't very attractive (the ones with math degrees are those who graduated before CS degrees existed for the most part. On the other hand there is a decent chunk of mlgays with a stat phys background).
      The reason python was the winner is also related: the new crop of people interested in ML were uniquely in the CS camp, and mostly knew python (or if not could pick it up easily). Lua, on the other hand, is a much rarer language in colleges.
      Then the rest is the same momentum dynamics that C and unix enjoyed, for example. Because people knew python, tools were made in python, thus people learned python... for ML.
      But it's true, a major defect of python for ML is that the C interop overhead is massive. This isn't a huge problem since back-and-forths are minimal in most ML applications, but lua doesn't have this problem, and the corollary is that it makes C libraries with bindings in lua far more usable (think numpy-for-lua). It also makes writing small acceleration code in C far easier (in python, you have to resort to numba (very limited), cython (python-flavored C, very complicated to build in non-trivial cases, let alone distribute, rarely actually that fast), Python.h's horrid interface, or ctypes and its immense overhead, i.e. no good options at all).
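      The ctypes overhead claim is easy to check on your own machine; a rough sketch (timings are machine-dependent, the point is only the relative per-call cost):

```python
import ctypes
import ctypes.util
import math
import timeit

# Same C function reached two ways: through ctypes, and through the
# much thinner built-in math module wrapper.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sin.argtypes = [ctypes.c_double]
libm.sin.restype = ctypes.c_double

n = 100_000
t_ctypes = timeit.timeit(lambda: libm.sin(1.0), number=n)
t_math = timeit.timeit(lambda: math.sin(1.0), number=n)

# On typical CPython builds the ctypes call is several times slower,
# almost all of it argument-marshalling overhead, not the sin() itself.
print(f"ctypes: {t_ctypes:.4f}s  math.sin: {t_math:.4f}s")
```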

  23. 1 year ago
    Anonymous

    no, it's because a top AI researcher makes roughly $100/hr. No company is going to waste their valuable time making them debug buffer overflows in C or fighting with the rust borrow checker.

    Also, all of the number crunching happens with extremely optimized CUDA submodules. The python is basically just a convenient interface to low level libraries

  24. 1 year ago
    Anonymous

    its so idiots such as myself can get paid to use it for niche engineering applications

  25. 1 year ago
    Anonymous

    Programming is for people that don't really have anything important to do

  26. 1 year ago
    Anonymous

    people want to do work, not frick around with a language for making operating systems or some autistic shit from the 60s. python lets you use some 3rd party libraries and build and modify something quick and try it out. fricking around with low level code is the job of the library. go ahead and implement a convolutional neural network in python vs cpp and see what happens

  27. 1 year ago
    Anonymous

    >PhD requirement to even enter the field
    As a PhD student, let me let you in on a little secret... most CS PhDs can't fricking program to save their life. Time spent studying theory and advanced mathematics is time not spent learning how to properly design and maintain software, nor is it time spent using any remotely complicated languages. Python is common in AI, data science, and anything in between precisely because it is simple. Academics love Python because it allows them to focus on the task they're trying to accomplish, without regards to the implementation details.

    What matters isn't that you write a good program. You don't have users. You have yourself. Your program will be run however many times it takes to get results for a paper, and then it is unlikely to be run again. Unless, of course, one intends to write a follow up paper. Speed is relevant only insofar as machine learning is very computationally expensive... so all of the code for training these things is written in C or C++, with bindings to use the library within Python.

    I personally like programming. About a year or so ago, I worked on some tools for a project I was doing with some power systems folks. One of them could have been used in pretty much any language with a graph library, so I used Rust. Not because it needed to be fast, but because I wanted to write something in Rust. I'm in a minority in academia though. For most people, writing software is a means to an end and nothing more. If it can be done in Python, it will.

  28. 1 year ago
    Anonymous

    >uses fricking python. is the entire ML field secretly composed of midwits?
    YOU are the midwit, since you don't even know what Python does in ML. The libraries are usually implemented in languages like C, and python acts as a glorified API to it.

  29. 1 year ago
    Anonymous

    Because pytorch and numpy are convenient to data science

  30. 1 year ago
    Anonymous

    >thanks to python i can do codemonkeys jobs but they could never do real engineering
    you love to see it

  31. 1 year ago
    Anonymous

    moron:
    because it is an amalgam of lisp and prolog

  32. 1 year ago
    Anonymous

    > ML uses Python
    Do you forget that almost all libraries are made in C, C++, etc. to make them fast?
    Python is just a wrapper and makes things easy and fast to work with.

  33. 1 year ago
    Anonymous

    this subhuman filth will advocate "C"
    frick off, ponytail
    C has destroyed the planet

  34. 1 year ago
    Anonymous

    Why isn't Scala more popular?

    • 1 year ago
      Anonymous

      brainlets cant into typing, thats why python/JS is so popular

    • 1 year ago
      Anonymous

      Than what?
      People programming for jvm will typically just write Java. Or maybe kotlin.

    • 1 year ago
      Anonymous

      Due to Java and Kotlin already existing

    • 1 year ago
      Anonymous

      The language design is pretty good. I love the type system, and their fusion of OOP and functional works well, but the tools are god awful. I used the syntax rewriting tool from the compiler and I had to fix most of the syntax myself. Metals will now fix your imports when you move files (I think this feature came out this week) but it does a half-assed job as well, and renaming symbols sometimes doesn't work for some magical reason! Scalafmt makes code look ugly. Only VSCode seems to work well with Scala 3. Documentation for some libraries is lacking, maybe because a lot of them are Java wrappers.
      I might have these issues because I am using Scala 3 though.

  35. 1 year ago
    Anonymous

    >Engineers (real) and scientists use any language for projects
    >Programmers (fake engineers) spend hours of their day arguing over what language to use.

    lol

  36. 1 year ago
    Anonymous

    because it's easy to set up. You have a CNN in like 5 minutes instead of linking dependencies for an hour and trial-and-erroring

  37. 1 year ago
    Anonymous

    i'm finally going to learn how to program this year. this thread is full of people that use python, so just recommend me a book or a course please.

  38. 1 year ago
    Anonymous

    I just notice that practically all the lisp users magically disappear around halfway to 2/3rds through advent of code

    that should say something

    • 1 year ago
      Anonymous

      most people who do aoc are young students who have time to waste on december, lispgays are doing productive things with their families

  39. 1 year ago
    Anonymous

    I'm thinking python is the real filter for brainlets now
    I haven't heard a CompSci PhD complaining about it, it's always some midwit BOT user who makes less than 200k a year

    • 1 year ago
      Anonymous

      not like phds earn all that much
      you do not really go the phd route for money
      you do it to learn

      • 1 year ago
        Anonymous

        >you do not really go the phd route for money
        >you do it to learn
        The point of a PhD is that it marks you as being THE world expert on something. Almost always something really off the beaten track and maybe only for a few months, but definitely the world expert on it.

  40. 1 year ago
    Anonymous

    The point of using python for ML is using python for easy access while making C do all of the hard work. If there ought to be a replacement it needs to be attached to C like python is. Something like lua.

    • 1 year ago
      Anonymous

      explain this, then

      https://jax.readthedocs.io/en/latest/jax-101/02-jitting.html

      https://pytorch.org/tutorials/intermediate/torch_compile_tutorial.html

      Python supporters BTFO'd by experts literally needing to shim the language with their own compilers

      nothing says "shit language" like this

      • 1 year ago
        Anonymous

        Nothing says shitposter like linking to posts he didn't read and can't even discuss in context, then making bold claims about straw men.

    • 1 year ago
      Anonymous

      nim?

  41. 1 year ago
    Anonymous

    Machine learning is just a while loop u can do it in any language

    • 1 year ago
      Anonymous

      in the end, the only real difference between programming languages in a practical sense, is how much money a big company is using to make it popular, it happened with C, C++, java, python, go, etc.

  42. 1 year ago
    Anonymous

    iterate in slow but easy language then export the model for inferencing in faster language

    • 1 year ago
      Anonymous

      Can't tell if you're esl or AI

      • 1 year ago
        Anonymous

        neither, im right

  43. 1 year ago
    Anonymous

    Python was always for morons and mouth breathers
    >herp derp what the frick are semicolons and braces Black personman save meeeeee

  44. 1 year ago
    Anonymous

    what do you use to program ML you prick
    from your post, you don't program at all

    what you using to develop your ML framework, genius

    you fricking parasite. python is the only thing
    python is ADVANCED you fricking parasite

  45. 1 year ago
    Anonymous

    > be one of the easiest languages to learn and use
    > arguably has the best library support of any existing language
    > "why the frick does ML use python"
    Hmmm.

  46. 1 year ago
    Anonymous

    It's the most convenient way to create wrappers for C libraries.

  47. 1 year ago
    Anonymous

    Haskell. If it's good enough for Cardano, it's good enough for ML.

  48. 1 year ago
    Anonymous

    The user interface is python, the compute libs are all compiled langs.

    The guys that wrote these ML applications are too busy building things to worry about your hateboner for python

  49. 1 year ago
    Anonymous

    lmao python script kiddie trying to justify his language of choice. real reason: you're too dumb to learn anything better and it's the same reason for ML PhDs. getting a phd in ML is literally nothing it's the most looked down upon post grad degree

  50. 1 year ago
    Anonymous

    What's Tensorflow written in?

    • 1 year ago
      Anonymous

      turing-complete programming language

    • 1 year ago
      Anonymous

      Mostly in C++, CUDA and a little bit of Python itself

  51. 1 year ago
    Anonymous

    The worst thing about python is the xdb forums pajeet tier versioning and lack of true dependency management, coming second is whitespace as syntax

  52. 1 year ago
    Anonymous

    ITT:
    room temp iq pseudo-intellectuals that think harder-to-learn languages are always better.
    >wahhh wahhh you have to manage memory yourself and you cant use anything dynamically typed

    • 1 year ago
      Anonymous

      lisp is easier

      • 1 year ago
        Anonymous

        No it isn't. And it's not the parens. See

        Lisp isn't hard to learn because of the core syntax. It's hard to learn because of all the other idiosyncratic details of any specific Lisp that you have to learn before you can do simple things. Common Lisp being a lisp-2, for example. Clojure has some weird Java-like scaffolding, and understanding how entrypoints work can be a little painful for a new programmer. Racket is full of autistic academic shit that you hit the moment you try to do something you think would be simple, like load some data from json or yaml.

        None of those issues have anything to do with syntax. You can even use Hy, which is basically python with lisp syntax, to get a feel for how little the parens really matter.

      • 1 year ago
        Anonymous

        Also, even once past the beginner stage lisp's advantages vs python tend to be minor and not that important. Homoiconicity starts working against you in little ways. Some operations are much more intuitive with infix, eg 'item in set' vs '(in? set item)' where you have to remember which operand comes first. Then you have python's array slice syntax which is often slightly more convenient than using function calls or special forms to access array sections. These are all really minor frictions and lisp has its own advantages but that's partly the point: the syntax differences just aren't very important. And for a typical workflow glue, interactive notebook, etc the Python is probably going to win out based on big picture reasons like access to libraries, docs, and support.
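        For concreteness, the two Python conveniences mentioned above (infix membership and slice syntax) next to their function-call spellings via the stdlib operator module:

```python
import operator

fruits = {"apple", "pear", "plum"}
xs = [10, 20, 30, 40, 50]

# Infix reads left to right; no operand order to memorize.
print("pear" in fruits)                      # True
# The functional spelling: operator.contains(a, b) means b in a.
print(operator.contains(fruits, "pear"))     # True

# Slice syntax vs the equivalent call with an explicit slice object.
print(xs[1:4])                               # [20, 30, 40]
print(operator.itemgetter(slice(1, 4))(xs))  # [20, 30, 40]
```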

        • 1 year ago
          Anonymous

          >lisp's advantages vs python tend to be minor and not that important. Homoiconicity starts working against you in little ways. Some operations are much more intuitive with infix, eg 'item in set' vs '(in? set item)' where you have to remember which operand comes first. Then you have python's array slice syntax which is often slightly more convenient than using function calls or special forms to access array sections
          are you dumb? don't you see that since lisp is homoiconic, everything can be implemented as macros? There are already infix lisps, but the community doesn't use them because they like lisp notation, simple.

          • 1 year ago
            Anonymous

            No, you are dumb. You seem to have forgotten the context of the comment, missed its main point, and more importantly you clearly have no experience actually programming to solve real problems.

            • 1 year ago
              Anonymous

              cope

  53. 1 year ago
    Anonymous

    >why the frick does ML use python
    Because it's easy. Fantastic for training, experimenting and also prototyping. If I need to show the board something, I'm going to do it all in a jupyter notebook and run through the shit, or email them a PDF of the notebook. Python should however NOT be used for deployment.
    To use myself as an example, I've spent over two decades programming. I transitioned into ML about 8 years ago. I do my data preprocessing and training in python. I do all my experiments and testing in python. I deploy all my shit in C++ and CUDA, whatever python I use for deployment is minimal and doesn't require dependencies past the version of python.
    I can understand how someone that didn't have a programming background would just stick to python for deployment, but honestly, they shouldn't be deploying. The company should hire someone else to do that, or they should hire competent ML people that are also competent programmers.
    It's a really easy to use language but fricking wasteful, and if you're working at scale, it just burns through resources.

  54. 1 year ago
    Anonymous

    You are looking at the front end, Python is ideal for this, trivial to change and does not require recompiling. That is why a simple git pull works for AUTOMATIC1111

    The low level libraries are another issue, they are usually not written in Python.

  55. 1 year ago
    Anonymous

    manip layer: python/golang
    autism layer: C/C++

    just like any other useful bit of software
    most engineering analysis software is in fortran you know.

    fricking moron

  56. 1 year ago
    Anonymous

    Python is like the Kardashians. It's a moronic dumpster fire that's popular only because it's popular, but it has too much momentum to be stopped now, so you just have to accept it and move on.

    • 1 year ago
      Anonymous

      It's not a dumpster fire, though. It has problems, but everything has problems, and it's worth understanding what Python did right to get where it is. Instead of just jerking yourself off in ignorance.

  57. 1 year ago
    Anonymous

    yes

  58. 1 year ago
    Anonymous

    Because python abstracts the tedious shit so you can focus on the actual ML task. Nvidia provides C++ libraries for CUDA, no one's forcing you to use PyTorch.
    Go be stupid somewhere else.

  59. 1 year ago
    Anonymous

    Python is C under the hood. Built-in functions and libraries are written in C. Python is all about using libraries and built-in functions to perform a task. If you're using a for loop in python you're doing it wrong.
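    A sketch of the "for loop is doing it wrong" point: summing a list in interpreted bytecode vs letting the C-implemented sum() built-in do the loop (the exact speedup is machine-dependent):

```python
import timeit

data = list(range(1_000_000))

def loop_sum(xs):
    # Every iteration runs Python bytecode: fetch, add, rebind.
    total = 0
    for x in xs:
        total += x
    return total

# sum() executes the whole loop inside the interpreter's C core.
assert loop_sum(data) == sum(data)

t_loop = timeit.timeit(lambda: loop_sum(data), number=5)
t_builtin = timeit.timeit(lambda: sum(data), number=5)
print(f"loop: {t_loop:.3f}s  builtin: {t_builtin:.3f}s")
```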

  60. 1 year ago
    Anonymous

    do I need to be good at math to enter DL/ML/AI?

    • 1 year ago
      Anonymous

      I'm currently using ML for science purposes and I'm just importing packages and cleaning data without understanding any of what's going on like a total Black person, I imagine that's what most people do except for the actual geniuses who write these algorithms. I'm working on my understanding, slowly thoughever.

      Would love any input or tips from actual ML specialists or users.

  61. 1 year ago
    Anonymous

    Honestly, because of jupyter notebooks, numpy, matplotlib and similar libraries. The workflow for mucking around with data and then constructing a model and then training the model and then plotting performance is nicely streamlined.
    It's especially useful to use a simple and well-known language given that people who specialize in ML may not spend as much time programming as a full-time software developer might. Often times these are domain experts who know some programming who then leave it to other teams to productionize their models.

    • 1 year ago
      Anonymous

      python notebooks aren't even that good, wolfram notebooks are much better, wolfram lang is not used just because it's proprietary

    • 1 year ago
      Anonymous

      jupyter is based, it helped revive interactive programming but sucks hard enough to motivate competing projects
      thanks pytards for tarding clojure clerk into existence

  62. 1 year ago
    Anonymous

    >why
    Because python is superior.

  63. 1 year ago
    Anonymous

    >why
    Because it makes (You) seethe, no other reason.
    Mission Accomplished.

  64. 1 year ago
    Anonymous

    [...]

    >make this efficient in Python using C++ libraries
    You're missing the point. Making small amounts of setup code more efficient isn't worth it. You'll be saving less than 1% of 1% of the execution time and energy. Optimizing your own code is only really worth it once you get to about 5% faster (the break-even point is lower for common library code) and you only get much better when you go to 10 times faster; the ~10 times factor is where you change what sort of applications using the code can be conceived of.
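    The break-even reasoning above is just Amdahl's law; a quick sketch (the helper name overall_speedup is mine, and the 0.01% / 95% fractions are illustrative):

```python
def overall_speedup(fraction, local_speedup):
    """Amdahl's law: total speedup when `fraction` of the runtime
    is accelerated by a factor of `local_speedup`."""
    return 1.0 / ((1.0 - fraction) + fraction / local_speedup)

# Make the Python glue (say 0.01% of runtime) infinitely fast:
print(round(overall_speedup(0.0001, 1e9), 4))  # 1.0001 -- nobody notices

# Make the CUDA kernels (say 95% of runtime) twice as fast:
print(round(overall_speedup(0.95, 2.0), 4))    # 1.9048 -- this is what matters
```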

  65. 1 year ago
    Anonymous

    "An idiot admires complexity, a genius admires simplicity, a physicist tries to make it simple, for an idiot anything the more complicated it is the more he will admire it"

    • 1 year ago
      Anonymous

      primitive and simple are different things schizo

  66. 1 year ago
    Anonymous

    Anyone who works with PhDs knows they only come in three flavors in descending frequency:
    Midwits who cobble together semi-believable middling work on current topics but lack the IQ to understand their field in a complete manner.
    Sycophants who are adept at getting their names thrown on papers and getting grant money, but everyone in their field secretly suspects they are moronic.
    Autistic Rainmen who never get tenure anymore and advise everyone else. They are the only ones who actually understand anything, but are thought of as kooks. They usually end up as forever-postdocs.
