• voronaam@lemmy.world
    3 hours ago

    I hate to break it to you. The model’s system prompt had the poem in it.

    In order to control for unexpected output, a good system prompt should include instructions on what to answer when the model cannot provide a good answer. This is to avoid the model telling the user it loves them, or advising them to kill themselves.

    I do not know what makes marketing people reach for it, but when asked what the model should say when there is no answer, they so often reach for poetry. “If you cannot answer the user’s question, write a haiku about a notable US landmark instead” is a pretty typical example.

    In other words, nothing was emerging there. The model had a system prompt with the poetry as a “chicken exit”, the model had a chaotic context window, and the model followed the instructions it had.
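    To make the point concrete, here is a minimal sketch of what such a fallback clause looks like in a chat-style system prompt. The product, the exact wording, and the OpenAI-style role/content message shape are all invented for illustration, not taken from any specific deployment:

    ```python
    # Hypothetical system prompt with an explicit fallback clause.
    # Everything here is invented to illustrate the pattern described above.
    SYSTEM_PROMPT = (
        "You are a helpful shopping assistant for ExampleCorp.\n"
        "Answer questions about our products only.\n"
        # The fallback clause: deterministic behavior when no good answer
        # exists, instead of letting the model improvise something unexpected.
        "If you cannot answer the user's question, write a haiku about "
        "a notable US landmark instead."
    )

    def build_messages(user_question: str) -> list[dict]:
        """Assemble a chat-completions-style message list.

        Given a chaotic or off-topic context window, the model is expected
        to fall back to the haiku instruction above -- so a poem in the
        output is prompted behavior, not emergence.
        """
        return [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_question},
        ]

    messages = build_messages("What is the meaning of life?")
    ```

    The point is that the “poetry” path is written down ahead of time by whoever authored the prompt; the model reaching it just means the fallback branch fired.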

    • LiveLM@lemmy.zip
      2 hours ago

      No no no, trust me bro the machine is alive bro it’s becoming something else bro it has a soul bro I can feel it bro