I know the typical answer is “no, because all the libs are in Python”… but I’m kind of baffled that more porting isn’t going on, especially to Go, given that Go, like Python, is stupid easy to learn and yet much faster to run. Truly not trying to start a flame war or anything. I’m just a bigger fan of Go than Python, and I was thinking that coming into 2024, especially with all the huge money in AI now, we’d see a LOT more movement toward Go’s much faster runtime, while keeping code that’s largely as easy, if not easier, to write and maintain. I’m not sure about Rust… it may run a little faster than Go, but the language is much more difficult to learn and use. Still, it has been growing in popularity, so I was curious whether it’s a potential option.

There are some Go libs I’ve found, but the few I have come across seem to be three, four or more years old. I was hoping there would be things like PyTorch and the like converted to Go.

I was even curious, with the power of GPT-4 or DeepSeek Coder or similar, how hard it would be to run conversions from Python libraries to Go. Is anyone working on that, or is it pretty much impossible to do?

  • the_quark@alien.topB · 1 year ago

    The runtime of your code basically doesn’t matter. You hand it off to a GPU for all the hard calculations, and even in a fast environment, your code is going to spend 99% of its execution time waiting to get stuff back from the GPU.

    I’m sure Python’s I/O polling sleeps are just as efficient as Go’s.
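    To make that concrete, here’s a minimal PyTorch sketch (assuming a CUDA GPU is available; the sizes are arbitrary). The matmul call returns almost instantly because Python only queues a kernel; the real time shows up when you wait for the GPU to finish:

    ```python
    import time
    import torch

    # Two large matrices living on the GPU (assumes PyTorch with CUDA support).
    a = torch.randn(8192, 8192, device="cuda")
    b = torch.randn(8192, 8192, device="cuda")

    t0 = time.perf_counter()
    c = a @ b                    # Python just enqueues a GPU kernel and returns
    t1 = time.perf_counter()
    torch.cuda.synchronize()     # block until the GPU actually finishes the work
    t2 = time.perf_counter()

    print(f"Python-side launch: {(t1 - t0) * 1e3:.3f} ms")
    print(f"Waiting on the GPU: {(t2 - t1) * 1e3:.3f} ms")
    ```

    Swap the host language and the second number barely moves.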

    • Dry-Vermicelli-682@alien.topOPB · 1 year ago

      Interesting. I was thinking more about the code that is used to train models. It seems the ability to run a model is pretty well covered with the likes of llama.cpp and such, so I’m not sure it makes much sense in that area. But I assume the team at OpenAI spent a lot of time writing code that is used for the training aspect? It can’t just be a couple lines of code that read in some vector data, train the model, then write it out. There must be a ton of logic/etc. for that as well?

      But then again, maybe that doesn’t need to be fast either. I don’t know.

      • the_quark@alien.topB · 1 year ago

        The model training is also GPU-intensive. If you read about model training costs, they talk about things like “millions of GPU-hours.”

        As I understand the process (note: I am a software developer, but I do not professionally work on AI), you’re feeding lots of example text into the GPU during the training process. The hard work is curating and creating those examples, but that’s largely either human-intensive or you’re using an LLM to help you with it, which is…GPU-intensive.

        Optimizing the non-GPU code just isn’t much of a win. I know all the cool kids hate Python because it’s dynamically typed and not compiled, but it’s just not much of a bottleneck in this space.
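        To give a sense of what that non-GPU code looks like, here’s a bare-bones PyTorch training loop (a sketch with a toy model and random stand-in data, not anyone’s real training code). The Python is a handful of lines of glue; the forward pass, backward pass, and optimizer step all run as GPU kernels:

        ```python
        import torch
        import torch.nn as nn

        device = "cuda" if torch.cuda.is_available() else "cpu"

        # Toy model and optimizer; real training code differs mostly in scale.
        model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024)).to(device)
        optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
        loss_fn = nn.MSELoss()

        for step in range(100):
            x = torch.randn(64, 1024, device=device)   # stand-in for a real batch
            y = torch.randn(64, 1024, device=device)
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)   # forward pass: GPU kernels
            loss.backward()               # backward pass: GPU kernels
            optimizer.step()              # parameter update: GPU kernels
        ```

        Rewriting that loop in Go or Rust wouldn’t change where the time goes.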

        • Dry-Vermicelli-682@alien.topOPB · 1 year ago

          Yah… the thing is… do I have to learn very fancy, advanced Python to do this, or can I use simpler Python that then makes use of, as you said, the more optimized libraries? I am wondering how much time it’s going to take to figure out Python well enough to be of use, and hence I was thinking Go and Rust might work well since I know those well enough.

          If it’s just calling APIs, even to a locally running model, I can do that in Go just fine. If it’s writing advanced AI code in Python, then I likely have to spend months or longer learning the language to use it well enough for that sort of work, in which case I am not sure I am up to the task. I am terrible with math/algos, so not sure how much of all that I am going to have to somehow “master” to be a decent-to-good AI engineer.

          I think the question is: am I a developer using AI (via APIs or CLI tools) in some capacity, or am I writing the AI code that will be used for training, etc.? I don’t know which path to go down, but I’d lean towards using models and API calls over having to become a deep AI expert.
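          For the “just calling APIs” path, it really is only JSON over HTTP in either language. For example, a minimal Python sketch against a llama.cpp-style local server (the port, endpoint, and model name here are assumptions; adjust for your setup) would look like this, and a Go version would be about the same length:

          ```python
          import json
          import urllib.request

          # Assumes a local OpenAI-compatible server (e.g. llama.cpp's llama-server)
          # listening on localhost:8080; "local-model" is a placeholder name.
          payload = {
              "model": "local-model",
              "messages": [{"role": "user", "content": "Summarize why Python dominates ML."}],
          }
          req = urllib.request.Request(
              "http://localhost:8080/v1/chat/completions",
              data=json.dumps(payload).encode("utf-8"),
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(req) as resp:
              reply = json.load(resp)

          print(reply["choices"][0]["message"]["content"])
          ```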

          • the_quark@alien.topB · 1 year ago

            My sense of it is that most training is still just using the APIs to talk to the GPU, and the art is more in the assembly of the training set than in optimizing those APIs. There are serious researchers working on improving AI, figuring out how to get data in and out of the GPU faster in a way that doesn’t hurt later quality. But that’s not a code-optimization problem; it’s much more of an “understanding the math” kind of problem, and whether you then use Python or Go to tell the GPU to execute the slightly different math isn’t much of a concern.

            I think it’s a lot like data science. Getting the good clean data to work with is actually the hard part. For training, getting great training sets is the hard part.

            If you just wish to write code that uses AI, or train models for that purpose, the current Python toolkit is more than sufficient, especially given how quickly everything is moving right now; we might have totally different architectures in three years, and Python will be quicker for that R&D iteration than Go is.
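            As a sense of scale, actually using an existing model with that toolkit is only a few lines (a sketch assuming the Hugging Face transformers package is installed; the model name is just an example):

            ```python
            from transformers import pipeline

            # Downloads a small model on first run and generates a continuation.
            generator = pipeline("text-generation", model="gpt2")
            print(generator("Go versus Python for ML is", max_new_tokens=30)[0]["generated_text"])
            ```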

            Finally, on your personal thing - I’ve been coding for 39 years. I’ve worked in BASIC, Assembly, C, C++, Perl, Python, and Go (and 37 other languages here and there). Go to Python isn’t going to be a difficult jump. Especially now that you can…use an AI to help you if you’re at all confused how to turn a Go concept into a Python one.