Bonus issue:

This one is a little bit less obvious

  • coherent_domain@infosec.pub
    6 hours ago

    My conspiracy theory is that early LLMs had a hard time figuring out the logical relations between sentences, and hence did not generate good transitions between them.

    I think the bullet points might be manually tuned up by the developers rather than inherently present in the model, because we don’t tend to see bullet points that much in normal human communication.