obvithrowaway34434@alien.topB to LocalLLaMA · English · 2 years ago · 27 comments

New Microsoft codediffusion paper suggests GPT-3.5 Turbo is only 20B, good news for open source models?
Wondering what everyone thinks in case this is true. It seems they’re already beating all open source models, including Llama-2 70B. Is this all due to data quality? Will Mistral be able to beat it next year?

Edit: Link to the paper -> https://arxiv.org/abs/2310.17680

https://preview.redd.it/kdk6fwr7vbxb1.png?width=605&format=png&auto=webp&s=21ac9936581d1376815d53e07e5b0adb739c3b06

  • herota@alien.topB · English · 2 years ago

    Can someone explain what exactly #p is?

LocalLLaMA

Community to discuss Llama, the family of large language models created by Meta AI.