On 7/9/23, Undescribed Horrific Abuse, One Victim & Survivor of Many <gmkarl@gmail.com> wrote:
ok, accumulating bounds on a counterexample: - the source of f() must be passable to g(). this means f() must be finitely long, and its source must be publicly known.
if we sufficiently encrypt, obfuscate, and work-burden f(), then g()'s task becomes difficult
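a minimal sketch of what "work-burden" could mean here: gate f()'s real behavior behind a long sequential hash chain (time-lock style), so any g() that wants to analyze or simulate f() has to spend the same work. everything here is hypothetical naming, just to make the idea concrete:

```python
import hashlib

def work_burdened_value(seed: bytes, iterations: int = 100_000) -> bytes:
    """derive a value only reachable by doing the full sequential hash chain;
    a g() analyzing f() must repeat the same work to predict f()'s behavior."""
    h = seed
    for _ in range(iterations):
        h = hashlib.sha256(h).digest()
    return h

def f() -> int:
    # f()'s observable behavior depends on the expensively derived key,
    # so cheap static inspection of this source does not reveal the output
    key = work_burdened_value(b"seed")
    return key[0]  # stand-in for behavior keyed on the derived value
```

this doesn't hide f()'s behavior in principle (g() can always pay the cost), it just raises the price of simulating it.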
this sounds like it would not disprove the disproof. but, for example, rather than constructing f() directly, we could construct a function that _generates_ f() for a bounded set of g(). in that view the problem simplifies to something near obfuscation and deobfuscation of functionality. another idea is expanding f() to have information on g(): for example, letting f() know it is in an outer simulation by giving it secret data that g() cannot reproduce. this seems like it would help?
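the secret-data idea can be sketched like this: f()'s public source contains only a digest of the secret, so g() can read the whole source without learning the secret, and any simulation g() runs shows decoy behavior. all names are hypothetical, and the secret ("abc", with its well-known sha256 test-vector digest) is just a placeholder:

```python
import hashlib

# only this digest appears in f()'s public source, never the secret itself
# (sha256 of the placeholder secret b"abc", a standard test vector)
EXPECTED = "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad"

def f(candidate_secret: bytes) -> str:
    """behave one way when handed the real secret, another way otherwise."""
    if hashlib.sha256(candidate_secret).hexdigest() == EXPECTED:
        return "real-environment behavior"
    return "decoy behavior shown to a simulator"

def g(f_impl) -> str:
    """g() only has f()'s source; to simulate f() it must guess the secret,
    and the digest gives it no way to recover the preimage."""
    return f_impl(b"best guess")

print(f(b"abc"))  # outer environment supplies the secret
print(g(f))       # g()'s simulation only ever sees the decoy branch
```

the catch is that this only bounds g() computationally: the secret has to come from outside f()'s published source, which is exactly the "information on g() / on the outer environment" being smuggled in.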