
The reason for a mediation robot is that automated influence campaigns are breaking up our communities, and we have an instinct to step in and resolve conflict when this happens. That gets the mediators targeted. In very oppressed situations, mediation can also give people false hope, which lets them drop their guard. Alternatively, people become paranoid and lose trust in each other needlessly. A robot that can both support mediation and itself mediate could protect people who come under attack simply for trying to keep their community from falling apart. Of course it would be airgapped, easy to power off, and so on, to deter compromise, but for dissociated software developers like me it would just be something nice to build the workings of. The approach is much more in line with ordinary business practice than with novel algorithms. It would simply be a generative language model chatbot tuned around building communication and human capacity.
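As a very rough sketch of what "tuned around building communication" could look like in practice, here is a minimal chat loop with a mediation-oriented system prompt. Everything in it is an assumption rather than a design: the prompt wording, the helper names, and especially `generate_reply`, which is only a stub standing in for a locally hosted, offline model on the airgapped machine.

```python
# Minimal sketch of the mediation chatbot idea above. All names and the
# prompt wording are illustrative assumptions, not a working design.

SYSTEM_PROMPT = (
    "You are a community mediator. Help participants hear each other, "
    "restate grievances neutrally, and suggest small concrete steps that "
    "rebuild trust. Never take sides; never speculate about hidden motives."
)

def generate_reply(history: list[dict]) -> str:
    """Stand-in for a locally hosted generative model.
    Assumption: the real thing would call an offline model on the
    airgapped machine, never a network service."""
    return "(model reply would appear here)"

def mediate() -> None:
    # Keep the whole conversation so the model sees each side's words in context.
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    while True:
        said = input("participant> ").strip()
        if not said:
            break
        history.append({"role": "user", "content": said})
        reply = generate_reply(history)
        history.append({"role": "assistant", "content": reply})
        print("mediator>", reply)

if __name__ == "__main__":
    mediate()
```

The point of the loop is only that the robot listens to everyone, keeps the shared context, and answers in a voice tuned toward de-escalation; the interesting work would be in the tuning and the prompt, not the plumbing.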