Amgadoz@alien.topB to LocalLLaMAEnglish · 2 years ago

dolphin-2.2-yi-34b released

34 comments

Eric Hartford, the author of dolphin models, released dolphin-2.2-yi-34b.

This is one of the earliest community finetunes of Yi-34B.

Yi-34B was developed by a Chinese company, which claims SOTA performance on par with GPT-3.5.

HF: https://huggingface.co/ehartford/dolphin-2_2-yi-34b

Announcement: https://x.com/erhartford/status/1723940171991663088?s=20
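For anyone trying the model locally: the dolphin finetunes use the ChatML prompt template (check the model card on HF to confirm the exact format). A minimal sketch of building a prompt, with a hypothetical helper function:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-style prompt, the template used by the dolphin finetunes.

    This is an illustrative helper, not part of any library; verify the
    template against the model card before relying on it.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    "You are Dolphin, a helpful AI assistant.",
    "What is the capital of France?",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open so the model generates the assistant turn.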

  • WolframRavenwolf@alien.topB · 2 years ago

    I’m still working on the updated 70B comparisons/tests, but right now, the top three models are still the same as in the first part of my Huge LLM Comparison/Test: 39 models tested (7B-70B + ChatGPT/GPT-4): lzlv_70B, SynthIA-70B-v1.5, chronos007-70B. Followed by dolphin-2_2-yi-34b.

    • Healthy_Cry_4861@alien.topB · 2 years ago

      SynthIA-70B-v1.5 seems to have the same 2K context length as SynthIA-70B-v1.2, not the 4K context length of SynthIA-70B-v1.2b.

      • WolframRavenwolf@alien.topB · 2 years ago

        You’re right: when I load the GGUF, KoboldCpp says “n_ctx_train: 2048”. Could that be an erroneous display? I’ve always used v1.5 with 4K context, did all my tests that way, and it’s done very well. If the native context really is 2K, it might perform even better there, but 2K just doesn’t cut it anymore.
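        For reference, running a model past its trained context is usually done with linear RoPE scaling. A quick sketch of how the scale factor relates train and target context (a hypothetical helper; the assumption that KoboldCpp's --ropeconfig takes this linear scale as its first argument should be checked against its docs):

```python
def linear_rope_scale(n_ctx_train: int, n_ctx_target: int) -> float:
    """Linear RoPE scale for running a model beyond its trained context.

    With linear scaling, token positions are compressed by this factor,
    so scale = trained context / target context. Returns 1.0 when no
    stretching is needed. Illustrative only, not a KoboldCpp API.
    """
    if n_ctx_target <= n_ctx_train:
        return 1.0  # at or below native context: no scaling
    return n_ctx_train / n_ctx_target

# 2K-trained model used at 4K context -> scale 0.5
print(linear_rope_scale(2048, 4096))
```

So if n_ctx_train really is 2048 and you run at 4K, you are implicitly stretching positions by a factor of 2, which is why quality can differ from native context.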

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.
