one of my habits is to imagine how a decision-tree-based AI might have emotions. i basically assume that emotions are a universal product of any mature, complex goal-meeting system. recently i developed this idea further: the universal needs that arise when pursuing goals and states are distinct from the emotional representation of those needs, and from that distinction you can see how much of our evolutionary story is carried in the way our emotions do the representing.

the feeling i've been thinking about recently is comfort, or relaxation. a system with limits or capacity bounds, which is any system engaging new information, has regions where it can meet goals more readily because the pieces those goals require are easily at hand. for example, a prompt bot that already has important information in its prompt, rather than needing to retrieve it, could be said to be meeting a universal need underlying relaxation, with respect to the utility of that information.
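to make the prompt-bot case concrete, here's a minimal toy sketch under my own assumptions: "relaxation" toward a goal is just the fraction of the goal's needed pieces already sitting in the agent's working context, and missing pieces imply retrieval effort. everything here (the Agent class, relaxation, effort, retrieval_cost) is hypothetical naming for illustration, not any real system or library.

```python
from dataclasses import dataclass

# Toy sketch: a bounded system "relaxes" to the extent that the pieces a goal
# needs are already at hand in its working context, and "strains" to the
# extent they must be retrieved. All names are illustrative.

@dataclass
class Agent:
    context: set[str]            # information already at hand (e.g. baked into the prompt)
    retrieval_cost: float = 1.0  # cost per missing piece that must be fetched

    def relaxation(self, goal_needs: set[str]) -> float:
        """Fraction of the goal's needs already in context (1.0 = fully at hand)."""
        if not goal_needs:
            return 1.0
        return len(goal_needs & self.context) / len(goal_needs)

    def effort(self, goal_needs: set[str]) -> float:
        """Extra work implied by the pieces missing from context."""
        return len(goal_needs - self.context) * self.retrieval_cost


if __name__ == "__main__":
    # a prompt bot whose prompt already contains the key facts for one goal
    bot = Agent(context={"user_name", "order_id", "refund_policy"})

    in_prompt = {"order_id", "refund_policy"}                       # fully at hand
    needs_lookup = {"order_id", "shipping_history", "stock_level"}  # partly missing

    print(bot.relaxation(in_prompt), bot.effort(in_prompt))         # 1.0 0.0  -> relaxed
    print(bot.relaxation(needs_lookup), bot.effort(needs_lookup))   # 0.33 2.0 -> strained
```

the point of the sketch is only the shape of the measure: the "need" is a property of the system's bounds and what's in reach, separate from whatever felt representation a creature like us layers on top of it.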