
  • 0 Posts
  • 5 Comments
Joined 7 months ago
Cake day: May 14th, 2024




  • By default, you assume that the people around you are at least capable of caring about what you have to say. I wonder what would happen if you took that assumption away.

    Let’s say the latest flu virus has a side effect that disables that feature in a significant number of infected individuals. Suddenly millions of people are literally unable to care about other people. That would make casual conversations a bit of a gamble, because you can’t really be sure whether you’re talking to a normal person or not. Maybe people wouldn’t want to take that gamble at all. What if that forced social norms to change, so that human interactions no longer came with this assumption pre-installed?

    As a side note, that kind of virus would probably also send humanity back to the Stone Age. Being motivated to work together, care about others, and act selflessly is a fundamental part of human civilization.


  • It might also help if the LLM remembered what you discussed earlier (see the sketch at the end of this comment).

    However, you’ve also touched upon an interesting topic. When you’re talking to another human, you can’t really be sure how much they really care. If you know the person well, you can usually tell, but if it’s someone you just met, it’s much harder. Who knows, you could be talking to a psychopath who is just looking for creative ways to exploit you. Maybe that person is completely devoid of actual empathy but manages to put on a very convincing facade regardless. You won’t know for sure until you feel a dagger between your ribs, so to speak.

    With modern LLMs, you can see through the smoke and mirrors pretty quickly, but with some humans it can take a few months before they involuntarily expose themselves. When LLMs get more advanced, they should be about as convincing as a human with psychopathy or some similar condition.

    What a human or an LLM actually knows about your topic of interest is not that important. What counts is the ability to display emotion. It doesn’t matter whether that emotion is genuine or not; your perception of it does.
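    As a concrete aside on the “remembering” part: a chat model is stateless per request, so any memory of earlier discussion has to come from the client re-sending the conversation history with every turn. Below is a minimal sketch of that pattern, assuming the OpenAI Python SDK; the model name and example messages are purely illustrative, not anything from this thread.

    ```python
    # Minimal sketch: the LLM "remembers" earlier discussion only because the
    # client re-sends the accumulated conversation history on every request.
    # Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
    # in the environment; the model name is illustrative.
    from openai import OpenAI

    client = OpenAI()
    history = [{"role": "system", "content": "You are a helpful assistant."}]

    def chat(user_message: str) -> str:
        """Send one user turn, keeping the full history so context is retained."""
        history.append({"role": "user", "content": user_message})
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            messages=history,     # the whole history is the model's only "memory"
        )
        reply = response.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        return reply

    print(chat("My name is Alice and I like hiking."))
    print(chat("What do I like doing?"))  # answered only because history was re-sent
    ```

    Drop the running history list and the second question falls flat, which is roughly the forgetting the comment above alludes to.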