It seems to me that the next big boom for cloud computing will be offering to train and host models that understand the unique business domains of the customers they serve.

Are the smart corporations already training local LLMs to understand and answer questions about their business, or is this space too new to accommodate them?

I feel like some of you may be missing a huge business opportunity. You may not realize the value of what you have already researched.

  • InterstitialLove@alien.topB · 1 year ago

    The state-of-the-art on training and architecture is likely to improve over the next year alone, certainly over the next 2 or 3. It’s also reasonable to expect cheaper hardware for running LLMs, since all the chip makers are working on it.

    If you don’t need a local LLM now but think it might save money in the long run, it probably makes sense to wait and build one once we’re better at it

    Collating training data in the meantime probably makes sense: record as much as you can, encourage employees to document more, and so on. That data will be useful even in the absence of AI, and with improving AI technology it is likely to become more valuable every year. It also takes time to produce that data, and no one else can do it for you
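
    A minimal sketch of what "collating training data" could look like in practice: sweep internal documents into a single JSONL corpus with provenance metadata. The function name, file extensions, and record schema here are illustrative assumptions, not any standard format; the point is that raw text plus a source field stays useful no matter what fine-tuning pipeline you eventually adopt.

    ```python
    import json
    from pathlib import Path

    def collate_corpus(root: str, out_path: str, extensions=(".txt", ".md")) -> int:
        """Walk `root` recursively and write one JSON record per document
        to a JSONL file. Returns the number of documents collected.

        Schema is a deliberate assumption: {"source": ..., "text": ...}.
        Keeping provenance alongside raw text lets you re-shape the corpus
        later for whatever training format tomorrow's tooling expects.
        """
        count = 0
        with open(out_path, "w", encoding="utf-8") as out:
            for path in sorted(Path(root).rglob("*")):
                if path.is_file() and path.suffix.lower() in extensions:
                    record = {
                        "source": str(path.relative_to(root)),
                        "text": path.read_text(encoding="utf-8", errors="replace"),
                    }
                    out.write(json.dumps(record, ensure_ascii=False) + "\n")
                    count += 1
        return count
    ```

    Even a simple dump like this beats nothing: the expensive part is capturing the knowledge at all, and converting formats later is cheap by comparison.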