Dorialexandre@alien.top to LocalLLaMA · English · 2 years ago

MonadGPT, an early modern chatbot trained on Mistral-Hermes and 17th century books.

  • Dorialexandre@alien.top (OP) · 2 years ago

    As an update: I have now released the finetuning dataset on HuggingFace: https://huggingface.co/datasets/Pclanglais/MonadGPT

    Overall, 10,797 excerpts in early modern English, French, and Latin, with synthetic questions generated by Mistral-Hermes.
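
    A minimal sketch of pulling the released dataset down for inspection, assuming the standard Hugging Face `datasets` library (the column names are whatever the published dataset defines, so the snippet just prints them rather than assuming any):

    ```python
    # Sketch: load the MonadGPT finetuning dataset for inspection.
    # Assumes `pip install datasets`; no column names are assumed here.
    from datasets import load_dataset

    ds = load_dataset("Pclanglais/MonadGPT", split="train")
    print(ds.column_names)   # see what fields the excerpt/question pairs use
    print(ds[0])             # inspect one record
    ```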

  • buzzyness@alien.top · 2 years ago

    Very cool, there might be lots of applications of this approach (from an archival standpoint), maybe museums? What are your thoughts on finetuning vs. asking Llama to chat in the style of a 17th-century astronomy book?

    • Dorialexandre@alien.top (OP) · 2 years ago

      Well, that was actually my original motivation for finetuning. Even GPT-4 is not so good with a proper prompt: the text feels fake and/or struggles to maintain cultural consistency. I think finetuning works better for this task, as there are too many directives to give otherwise, and it helps relieve the model of anachronistic RLHF behavior.

      As for applications, I mostly think about education, especially if the model is properly connected to a RAG database. It can be a very interesting way to get immersed in a time period on any kind of topic.
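
      A minimal sketch of the kind of RAG pairing mentioned above: retrieve a couple of period passages and prepend them to the user question before it reaches the model. The tiny corpus, the TF-IDF retriever, and the prompt wording are illustrative assumptions, not part of MonadGPT:

      ```python
      # Sketch: toy retrieval-augmented prompt assembly.
      # TF-IDF stands in for a real vector store over the source books.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      corpus = [  # placeholder 17th-century excerpts
          "Of the seven planets and their motions about the Earth...",
          "A discourse concerning comets and what they portend...",
          "A treatise on the plague and the corrupt airs that carry it...",
      ]

      vectorizer = TfidfVectorizer()
      doc_vectors = vectorizer.fit_transform(corpus)

      def retrieve(question: str, k: int = 2) -> list[str]:
          """Return the k passages most similar to the question."""
          scores = cosine_similarity(vectorizer.transform([question]), doc_vectors)[0]
          return [corpus[i] for i in scores.argsort()[::-1][:k]]

      question = "What are the planets?"
      context = "\n".join(retrieve(question))
      prompt = f"Answer as a 17th-century scholar, using these passages:\n{context}\n\nQuestion: {question}"
      print(prompt)  # this string would then be sent to the chatbot
      ```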

    • unamednational@alien.top · 2 years ago

      Would be awesome in a classroom. If kids could ask George Washington what exactly happened, I think they’d care more. Plus, they could tell him to go f himself for infinite amusement.

  • Dorialexandre@alien.top (OP) · 2 years ago

    Link to the ongoing demo for MonadGPT, with generous GPU support from HuggingFace: https://huggingface.co/spaces/Pclanglais/MonadGPT

    The model has been published as well (and soon the dataset): https://huggingface.co/Pclanglais/MonadGPT?text=Hi.
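
    For anyone who wants to run the published checkpoint locally instead of through the Space, here is a minimal `transformers` sketch. It assumes the model card ships a chat template, that `accelerate` is installed for `device_map="auto"`, and an illustrative system prompt; check the model card if generation looks off:

    ```python
    # Sketch: local inference with the published MonadGPT checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Pclanglais/MonadGPT"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )

    messages = [
        # illustrative system prompt, not necessarily the one used by the demo
        {"role": "system", "content": "You are MonadGPT, a chatbot from the 17th century."},
        {"role": "user", "content": "What are the planets?"},
    ]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
    ```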

  • UseNew5079@alien.top · 2 years ago

    Absolutely brutal bot and very opinionated. Cool idea.

    https://preview.redd.it/l2f0l3sanezb1.png?width=904&format=png&auto=webp&s=a338617579289a932c9083641f74d78078acf87e

  • FPham@alien.top · 2 years ago

    Interestingly, if you tell OpenHermes-Mistral 2.5 in the system prompt that it is from the 17th century and uses archaic language, it will also say there are seven planets.

    You are MonadGPT, a very old chatbot from the 17th century. Please answer the questions using an archaic language

    https://preview.redd.it/0ecpxhg86hzb1.png?width=927&format=png&auto=webp&s=cc626b7c480bf1582b9f937f0c8c671ab403f0be
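
    A sketch of reproducing that prompt-only experiment in code, assuming the usual Hub id for the base model (teknium/OpenHermes-2.5-Mistral-7B) and a transformers version recent enough for the text-generation pipeline to accept chat messages directly:

    ```python
    # Sketch: prompt-only "17th-century" persona on OpenHermes-2.5-Mistral, no finetuning.
    from transformers import pipeline

    chat = pipeline(
        "text-generation",
        model="teknium/OpenHermes-2.5-Mistral-7B",  # assumed Hub id for OpenHermes-Mistral 2.5
        device_map="auto",
    )

    messages = [
        {"role": "system", "content": "You are MonadGPT, a very old chatbot from the 17th century. "
                                      "Please answer the questions using an archaic language"},
        {"role": "user", "content": "How many planets are there?"},
    ]
    result = chat(messages, max_new_tokens=200)
    print(result[0]["generated_text"][-1]["content"])  # the assistant's reply
    ```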

  • vec1nu@alien.top · 2 years ago

    Which frontend is that?

  • tortistic_turtle@alien.top · 2 years ago

    https://preview.redd.it/mnwir3lbidzb1.png?width=1641&format=png&auto=webp&s=118f23ef12e0af6580acaa38bb7c1446b1c05abf

    xD

  • ReMeDyIII@alien.top · 2 years ago

    Did we use to spell “we” as “wee”?

  • oKatanaa@alien.top · 2 years ago

    How was it trained? Did you just train it on the passages from those books? If so, I am very surprised it retained its conversational capabilities. I would expect it to just go off the rails and generate random 17th-century stuff.
