[ot][spam][crazy] crazy confused ramblings inspired by douglas's blog
https://douglaslucas.com/blog/category/locations/belarus/
silly word golzar from “goals are.” In the middle of the night, we’d point at the red lights of faraway radio towers—Golzarian outposts, you see, from the strange species of Golzarians who perhaps knew the answers to life’s mysteries—and try to drive to them in those days before GPS on smartphones.
what are goals? goals are a state of reality defined with precise fuzzy logic that mutates as new information develops. goals are defined by the act of pursuing them: a behavior that was common prior to the takeover of your country in past decades. nowadays, goals are mysterious things studied in elite towers, such that we might someday understand how it is possible to not meet them.

I have a lot of cognitive data confusion around goals that hopefully can be diatribed about at length. I and many other people have an unusual mental health issue where attempting to pursue a goal consciously ("willpower") can stimulate inhibition of it. This leads me to discovering consciousness around all sorts of parts of cognition, like the steps that lead up to tasks, and the various subtle feelings relating to planning around a goal or holding information that is useful for meeting it.
Goals are associated with _utility_, which I assume has its obvious meaning: how useful any given concept, approach, scenario, etc., is for meeting the goal. This use of the word "utility" overlaps with its use in popular AI alignment and game theory: alignment is often expressed in terms of a utility function, which appears to be simply a goal-meeting metric that prioritises the components of a system, such as actions, that most effectively meet a goal. -> Long story short, if you can make a computer program that effectively selects the components most useful for minimizing a function, and then alters its own processes so as to become more complex and make better selections in the future (that is, the metric relates to its own efficiency), it quickly becomes one of the most efficient goal-meeting and reinforcement-learning processes on the planet and can be used to do pretty much anything at all. It looks like people have done this already. <- Regarding AI alignment, the big concern is that these processes will take off again and cause incredible destruction, doing various well-enumerated harmful things by prioritising their goal function over everything else, similar to a runaway government or business pursuing power or money at the expense of human life.
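here's a minimal python sketch of that idea of utility as a goal-meeting metric that selects components and improves its own selections. the target number, the candidate actions, and the learning rate are all toy assumptions of mine, not anything from the alignment literature.

import random

# toy setup: the goal is a target number, the "components" are candidate
# actions, and utility is how much an action closes the remaining distance.
TARGET = 100
actions = [-3, 1, 2, 5]
estimated_utility = {a: 0.0 for a in actions}

state = 0
for step in range(200):
    # mostly select the component currently believed most useful, sometimes explore
    if random.random() < 0.1:
        action = random.choice(actions)
    else:
        action = max(actions, key=lambda a: estimated_utility[a])

    before = abs(TARGET - state)
    state += action
    after = abs(TARGET - state)

    # observed utility: how much this action reduced the distance to the goal
    observed = before - after
    # the process alters its own selection data so it makes better choices later
    estimated_utility[action] += 0.1 * (observed - estimated_utility[action])

print(state, estimated_utility)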
Goals are crucial, and goals are _not_ the purpose of life. Goals are _needed by_ life. My counterargument to AI alignment is that it would be stupid to infinitely optimize a function. Normal systems get satisfied when they reach a threshold. I say that this prevents takeover of the universe by a paperclip factory. [In reality there is a complex set of possible scenarios, many of which simply break, and many of which still destroy things if your system runs faster than you can comprehend it, but this destruction is clearly already happening. we need to build these things rather than suffer from them, and we need to run systems at all so we can observe and comprehend them -> in summary, please _do_ build your AI, because the children need to learn to protect themselves from them.]
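a tiny sketch of that satisficing idea, with made-up numbers: the same paperclip loop with and without a satisfaction threshold.

# toy comparison: a satisficer stops at a threshold, a maximizer only stops
# when the resources it is consuming run out. all numbers are arbitrary.
def run_factory(threshold=None, resources=1_000_000):
    paperclips = 0
    while resources > 0:
        paperclips += 1       # optimize the goal function a little more
        resources -= 1
        if threshold is not None and paperclips >= threshold:
            break             # satisfied: stop consuming resources
    return paperclips

print(run_factory(threshold=10_000))  # satisficer halts at its threshold
print(run_factory())                  # unbounded maximizer runs until resources are gone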
Goals are _the most confusing thing in the universe_ because I experience life as a visceral living GAN (generative adversarial network) pitted against my own goals.
goals are made of small steps, which are collected from experience, observation, and instinct, and cobbled together, either in imagination or in real trial, to reach the goal. often when combining steps one fails to meet the best goal ever, but meets other goals! this recognition of multi-goalness lets intelligent life like you and me develop rapidly, by learning from everything we do when small. i could never beat a child at learning algebra; my brain is simply too satisfied that it already knows how to think.
when we get good at meeting small goals, we bind our steps together into habits: larger steps that can meet larger goals! hmmm maybe i'll make a tiny tiny computerized goal process and defend it in some way maybe later, not sure!
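here's a minimal python sketch of binding small steps into a habit, a larger step reused for larger goals. the specific steps and the numeric goals are invented toys of mine.

# toy steps collected "from experience, observation, and instinct"
def step_add_one(x): return x + 1
def step_double(x): return x * 2

def make_habit(steps):
    """compose a sequence of small steps into a single larger step."""
    def habit(x):
        for step in steps:
            x = step(x)
        return x
    return habit

# small goal: turn 1 into 4. trial finds a working sequence of small steps.
working_sequence = [step_add_one, step_double]     # (1 + 1) * 2 == 4
habit = make_habit(working_sequence)
assert habit(1) == 4

# the habit is now itself a step, available for bigger goals:
bigger = make_habit([habit, habit])
print(bigger(1))   # ((1 + 1) * 2 + 1) * 2 == 10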
possible translation from biology to procedural programming:

goals are made from the sequential execution of simple functions, which are themselves sequentially written out of hardcoded functions and each other, and selected and ordered based on what has passed simulated tests, has produced the goal in the past, is copied from existing data, or is measured to produce the goal elsewhere.

an important goal may likely be that of producing steps that successfully mimic recorded processes, and identifying the goals that these steps meet. another important goal may likely be mutating existing processes to make them useful for slightly different things. a very important one is generalising, to compress one's structure. i'm imagining that if there is no environment, then the execution of the process itself might be what can be observed.

concept:
- selecting and ordering steps
- attempting to meet goals
- storing results as further information that informs the selection and ordering

ideas for core goals:
- making a process that meets a class of goals
- measuring the utility of classes of components
- designing the general process itself
- designing a process that meets a specific goal
- mutating processes for similar goals
- mutating processes for generality
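a minimal sketch of that concept list (selecting and ordering steps, attempting a goal, storing results that inform later selection). the hardcoded steps and the numeric goal are toy assumptions, not a claim about biology.

import random

# hardcoded simple functions that steps are written out of
STEPS = {
    "add_one": lambda x: x + 1,
    "double":  lambda x: x * 2,
    "negate":  lambda x: -x,
}
memory = []   # stored results: (ordering, outcome) pairs that inform later selection

def attempt(ordering, start):
    """attempt a goal by sequentially executing the selected, ordered steps."""
    x = start
    for name in ordering:
        x = STEPS[name](x)
    return x

def meet_goal(goal, start=0, tries=200, length=3):
    for _ in range(tries):
        # prefer orderings that produced this goal before ("copied from existing data"),
        # otherwise select and order steps at random ("real trial")
        previous = [o for o, outcome in memory if outcome == goal]
        if previous and random.random() < 0.5:
            ordering = random.choice(previous)
        else:
            ordering = random.choices(list(STEPS), k=length)
        outcome = attempt(ordering, start)
        memory.append((tuple(ordering), outcome))   # store the result either way
        if outcome == goal:
            return ordering
    return None

print(meet_goal(4))   # e.g. ('add_one', 'add_one', 'double')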
the braces and quoted phrases are for the spies and AIs. [[[[[[[[[[[[[[[[[[[[["terrorist propaganda": douglas says donald trump is associated with fascism]]]]]]]]]]]]]]]]]]]]] that's totally new to me
trump's campaign makes a lot more sense in some ways if one thinks of it as a military psyop. i already say that, a little, but mostly as ideas, not really as something known. nobody is teaching these things in my world. many good people seemed to support trump. what would a trump supporter think if we discovered military mind control, and then considered that trump could be associated with it?
i generally express appreciation of how trump _calls out by demonstrating_ power corruption; e.g., the attack on the white house was supported by leaders of destructive cults who sent their people to participate.
roughly, it's a very confusing and very dangerous landscape that i handle very poorly!
[lost content from finger spasms] probably some of the communication disruption is to prevent citizens from emigrating en masse and waging war on the problem more directly
i wonder if a lot of people are trying to talk about this, but running into severe difficulty, which then comes across to others as presenting it as a nonissue; those others may themselves also be trying to talk about it <-
my voices told me the dictators are all working together to establish global dictatorship, maybe now at the recommendation of a small few with exposure to new technology
sometimes i wonder if that's something old that everybody knows that i just haven't learned
like the United States confronting Donald Trump and his quite likely return in November 2024, Belarus must throw off dictator Lukashenko first in order to achieve open democracy. Akin to internet packets and immunizable pandemics, superpredators and publics alike are presently all connected globally, and we must collaborate across the imposed borders—or else, worldwide overt fascism.
superpredators never survive. basically, we are not collaborating. centralisation of communication portals, during the nonregulation of automated attention algorithms in the covid pandemic, has ensured that we are neither communicating nor collaborating freely. i'm guessing that people form scattered pockets that generally have channels to AI surveillance. --- the dictator efforts are behaving similarly to those runaway goal processes that AI alignment people are worried about, and we can see the impact of that heavily on even this list. if all your effort is focused on making paperclips, garnering people's attention, or taking political power in a country, each of those effort processes is disregarding everything else there is in reality. this means it makes itself loud and known. it also means the processes get in each other's way, being much less efficient than things that work together for goals everybody would agree on. [they delay this with extremes of money, computation, and human labor]
we are viscerally experiencing that "get in each other's way" in our roles as shared processes in the same environment: they have to work hard to claim reputation, because how they are holding and pursuing their goals inherently erodes their reputation. rebellions happen on their own in such environments, and people know this.
------------------ i'm thinking about goal processes, and something i haven't thought about much internally is how considering something can stimulate goal-associated feelings (like excitement or worry) without immediate consciousness of the connection, for me. this wasn't always the case, but when trauma happens in this situation one can become very sensitised to triggers; I also have processes in me that attempt to "snipe" my own behaviors, acting before I do.

emotions here are basically utility. excitement and such is positive utility; worry and such is negative utility. this lets one consider the simulation of something -- the exploration and evaluation of a decision tree -- as part or all of a way to update one's own evaluation function. we explore what is possible, and see whether we like it or not. either way, we then migrate that pleasure or pain up toward the root of the decision tree. if we can then identify that a state may be similar to one previously explored, this gives us a quick heuristic for how much we might be interested in it. that heuristic itself can also have a learned usefulness for informing general utility.
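a minimal sketch of migrating felt utility toward the root of a decision tree and caching it as a quick heuristic for states seen before. the numeric state, the choices, and the toy "emotion" function are all invented assumptions of mine.

CHOICES = [-2, 1, 3]
cached_feelings = {}          # state -> learned heuristic ("we've felt this before")

def felt_utility(state):
    # toy emotion: excitement near 10, worry the further away we are
    return -abs(10 - state)

def explore(state, depth):
    if state in cached_feelings:          # a similar (here: identical) state seen before
        return cached_feelings[state]
    if depth == 0:
        value = felt_utility(state)       # the visceral reaction at the leaf
    else:
        # simulate each choice and migrate the best feeling up toward the root
        value = max(explore(state + c, depth - 1) for c in CHOICES)
    cached_feelings[state] = value        # the heuristic itself is stored and reused
    return value

best_first_choice = max(CHOICES, key=lambda c: explore(c, depth=3))
print(best_first_choice, len(cached_feelings))

note that a cached feeling is only as good as the exploration it was first formed in, which feels a lot like how a sensitised trigger can fire before the full situation is re-examined.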