• k0setes@alien.top

    I have tested several small 7B models for speaking Polish, and it seems to me that openchat_3.5.Q4_K_S.gguf is currently probably the best.
    Of course, this was not a large-scale study, so it is not necessarily 100% true ;)
    And I look forward to the final release 👍
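
    For anyone who wants to run a similar informal check, here is a minimal sketch using llama-cpp-python to load the quantized GGUF file and prompt it in Polish. Only the file name comes from the comment above; the path, context size, and prompt are placeholder assumptions.

    ```python
    from llama_cpp import Llama

    # Assumption: the GGUF file sits in the current directory; adjust the path as needed.
    llm = Llama(model_path="openchat_3.5.Q4_K_S.gguf", n_ctx=2048)

    # A simple Polish prompt ("Briefly describe the history of Kraków.") to eyeball fluency.
    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Opisz krótko historię Krakowa."}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])
    ```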

    • Significant_Focus134@alien.top (OP)

      Thanks! For the record, that version is very under-trained. Today I started training on a much bigger dataset (50k entries) built mostly from Wikipedia.
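
      The comment does not say how the 50k Wikipedia entries were collected; purely as an illustration, a sketch like the following (using the Hugging Face datasets library and an assumed Polish Wikipedia dump config) could produce a comparable corpus.

      ```python
      from datasets import load_dataset

      # Assumption: the "wikimedia/wikipedia" dump with a Polish config is used;
      # the snapshot name and the 50k sample size are placeholders.
      wiki = load_dataset("wikimedia/wikipedia", "20231101.pl", split="train")

      # Shuffle and keep 50k articles, then dump them to JSONL for training.
      sample = wiki.shuffle(seed=42).select(range(50_000))
      sample.to_json("polish_wiki_50k.jsonl")
      ```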