• Most-Trainer-8876@alien.top
    1 year ago

    wtf? Really? I mean, I kinda suspected that too, because of the way GPT-3.5 compares to Falcon 180B. Even though Falcon has far more parameters, GPT-3.5 still works way better than it. I credited all of that to the data used to train the model. I believe it’s not just more parameters but more high-quality data that will make AI models improve proportionally in quality and performance.
    Can’t believe ChatGPT is just 20B, I always thought it was a 175B model. What about actual 175B+ models then? Are they going to be AGI? lol.
    If this is true, it means all the open-source models were trained cheaply and are nothing compared to what OpenAI did.