--- general observations ---
my snark regarding algorithms was misplaced and also wrongly conveyed, due to both ignorance and error- if not an attempt to even out the ability of some to mediate issues in this technically computable approach, while others are stuck inside equations that do not work for their basic reality, and then operate within those limits, which can align with the ideology of quantification. in some sense algorithms have more defining power than language alone, because of their connection to number and to other connected programs and models and frameworks that are operational within shared concepts. yet the gap between these and the world can be immense, and likewise unaccounted for- and thus being able to place ideas into algorithms, inside scientific and technological and cultural systems, has within it a motivational aspect that correlates with power and, in that, determinism: the sense that interacting in this realm is closer to reality, versus creating the substitute reality of the model. and in this, a coldness or hardness in thinking that may presume existence on these terms, for self or others, when it is not actually this simple or easy to obtain accurately- yet it is possible to believe it is, when reduced to a private viewpoint, especially when shared and protected within a technocratic environment.
also wrong: speaking of algorithms without differentiating between those that exist inside computers and the electronic circuits of machinery, and the rules and routines outside this context- the human step-by-step processing of awareness and interactions. these are inherent as an approach, involving first questions, trial and error, and the building up of empirical heuristic responses and strategies. (perhaps Marvin Minsky here).
in this way, the zeitgeist of algos (~pain) could split between the thinking and reasoning of humans, that of machinery, and perhaps a natural algorithmic computation that occurs within life, as different from that which needs to be programmed to 'think' - and then questioning the rules for approaching this, especially in a relativistic mindset that can never escape pseudo-truth while also not acknowledging its existence, due to the 'impossibility' of absolute truth, which belief instead replaces. in that, a truistic substitution occurs: 'knowing' that truth cannot be known itself becomes this unknowable truth. in the same way, 'you cannot prove something true, you can only falsify' is self-contradictory, as it relies on this unprovable truth to validate falsification as truth. a limit to allowable perception and understanding then frames thinking, etc.
(so, this to acknowledge i have the jerk gene, yet overall it hopefully balances out with the benny hill gene also present in better moments.)
i think there are assumptions that have been made prematurely about the nature of language, also in terms of its linguistics, and about mathematics, which have established too small a boundary around consideration of these situations, which then become the realm of code, programs, and computers. and this is connected to the way people think before the machines are developed, which can also involve errors and bugs in consciousness, reasoning, thinking, and relations that format the technical approach in a given worldview.
i do not believe there is an accurate "model" of code (language/mathematics) to develop from within. i believe there is a tremendous gap in approach, based on flawed and unchecked ideological assumptions about how people actually think, know, and believe, that is not accurately translated into technical systems of computation- and the ambiguities these allow and introduce and sustain are today the realm of exploits against humans.
this is to propose that both language and mathematics are ungrounded by default, their modeling never having been removed of falsity or error, and that this relates to the normalization of linear sign and symbol systems that can be recombined endlessly yet never error-corrected in any instance against actual grounded truth from all perspectives (even scientific data, when limits are placed on its evaluation and it is thusly made political).
it is proposed that this involves the issue of space-time and movement within it, as language goes from point X to point Y, and that in this, somehow, time makes evaluation of its content a temporary evaluation, lost to an endless repetition and recreation of observation and perspective. 'truth' is repeatedly being recreated over and over again, and copied again and again- copies of copies- without regard to the original situation referenced, which has become detached and replaced by this very signage. and thus the signs become the false reality, and processing can be removed of its earthly connection, disembodied, made into a pure abstraction, where the binary 'sense' is freed of the limits that tie it to shared perception of truth, yet represents this as the ideal. and the partially shared view becomes warped, onesided, and does not accurately connect to the world; instead it replaces it, and enforces this view as a situation of power that can determine what is true, real, 'good'.
and there are people programming this, writing code for it, online and off, including in their brains and beliefs, who totally rely on this framework to evaluate themselves and others, though in an ungrounded condition that must limit the proof and make it unfalsifiable- themselves god, infallible. and thus some people should have no healthcare, no food; the system itself is perfect and there are only immoral people who should suffer for failing in these terms to sustain themselves, against natural instinct and basic thinking skills, and who must instead conform and comply as if animals, else be trained or used as natural resources as part of a slave-based ecosystem.
inaccurate modeling of truth in language, to the depth of linguistics, and in mathematics, allows this condition to be normalized and made real within the 'empire of signs' that has become operational in this way. it is as if to presuppose the SIGN was the first truth, versus what it is referencing, and in this gap the distortions and false views take hold of consciousness, as 'nothingness' becomes the cosmic centre of existence, and not shared being.
the earthly meridians then are aligned with structures of falsity, and institutions and individuals so connected correspond to its movement and orientations as part of the motor- the mechanism of civilization as a machine- as peoples' work combines into a larger series of forces, and it is this momentum that is captured, extrapolated, and extended into the automated machinery working against populations likewise. at its core, a once-partial truth that has become detached from its external accounting and is essentially a protected realm of lies at the core of false civilization.
so there are Supermen at the helm of this global ship of fools who truly believe, in an egotistical way, the mandate of such a corrupt enterprise, as if born of natural moral virtue and their superior station over others, at the cost of humanity, life, nature, principled development. and that is where the critique arrives from: the illusion of this condition in its higher moral plane, versus a lower, debased existence which is ignored because it is not accounted for, and even more- relied upon to keep the system functioning within these, again, protected parameters. a boundary and limit to manage and maintain-- of *territory* aligned with ideology. and this relates to maps and mapping, to what can be allowed to exist and what cannot, so as to retain control over the illusion and its 'processing' in the given terms; else gears move slightly out of alignment, noises appear, and soon enough pressures build and the mechanism begins to break apart. the more this loss of alignment occurs, the less the larger mechanism can function in its totality, gears even working against one another or grinding into the shared structure needed for their support. in this way, the threshold condition, and the risk of not being able to contain truth within the designed and relied-upon parameters. margins of error, in a context where errors are structuralized- thus margins of truth, narrowed and limited to allow the false construct its operability and governance. the role of the censor or directive signaling in this regard.
and this is sustained, it is proposed, by sign-based communications in time, and the space it establishes is a vacuum environment that fills in with this dynamic, due to a natural ungroundedness of observation within these parameters. what if- in contrast- SIGNs were only best used for twitter-like short statements and not vastly long accounts, if only due to the impossibility of sustaining accurate truth, from a single perspective, of all that arose before or must be recreated from that view as 'new code'?
what if the very issue is the inability to attain panoptic accountability of POVs in serial, linear language, and thus there is no error-correction via shared empirical evaluation of all aspects of a situation from N observers- past, present, and future- to correct the modeling.
what if the very conception of this situation of reality is trapped within a false concept of language from the start, and a self-limiting observation that seeks to determine what can be true given its usefulness in the present, yet over time this has led to a false view ever farther from the truth.
what if 'code' and 'programming' are not inherently, nor need be, linear or serial in terms of computation and reasoning, and instead could be parallel, as models of circuitry, where signs are a subsystem of concepts and a way of referencing categories- such as metadata- yet could autoconfigure into a map or pathway to convey meaning. thus to tell a story with detail XYZ, instead of needing to recreate the context for this, the detail could be situated within the relevant model, in the dimensions it exists in, and evaluated that way without recreating the data; and it could be viewed at whatever degree of fidelity with truth an observer is capable of. a child may observe to a limited level, while a historian or anthropologist or psychologist to another. why must the world and its truth be recreated in every instance of communication to say something 'minor' that is embedded within the shared framework, when this very framework is not yet secured?
and in this way, from the 1 and 0 of [truth], into [logic], then [concepts] and [patterns] that are grounded in this sequence, building up and out from this stable foundation based on N observers and *hypotheses* that can be viewed from every angle, tested against reality, and refined- part of a shared process of awareness and learning and transferring insight, instead of seeking to cordon it off, privatize and limit it, including all human knowledge, making it dependent on money and shared sets (class) in order to be allowed to exist. and by default this whole viewpoint cannot be allowed to exist, by fiat of dogma, because it would break the ideology that such truth is even possible-- because it is *truly believed* not to exist!
in this way, instead of the sign, the circuit.
the circuit is what connects the sign to its accounting in truth, via logic. it maps from a realm of communication back to what is referred to, which can be a [model] that is a conceptualization of an idea, including its weighted probability given all the various dimensional factors, and these in a timeless static mode unless put into motion. thus a common referent that links from information back to originating truth and its accounting, and this for each and every single viewpoint as it relates to reality in all that is known, considered, and unknowable. that would indicate the actual limits and boundaries of perception as combined, especially if AI computers were of this same approach, complementing and working with and for human awareness, versus against it, in a onesided gameover scenario, as today.
code itself, in this modular construct, could involve patterns and how they are established, yet inherently the variables would ground towards the most accurate understanding of truth given the combined human awareness, with adjunct computational resources to sustain and extend this further via their processing and reasoning and insight. the concepts of code and programs then would inherently require grounding for their operation, and this as it relates to security- the securing of truth, though in a contingent realm of paradox, where a minor truth may exist within a larger falsity and thus also must be accounted for as truth, recovered. the process this involves: thresholds for data or observations, underlying methodology, and barriers to exploiting this system via lesser truth or lies. in this sense a protected domain could exist at a certain level, or as a subsystem that would allow only certain access to sensitive data- another realm that future crypto could apply to, in addition to those who are outside this model or stand against humanity in terms of their agenda. a liar or mimic would get nowhere. childlike awareness the first defense, in that disregarding truth would no longer be a viable tactic for deceivers.
in this way: point, line, plane, and constellations and circuitry, prior to the naming of the concepts as SIGNs, and then the combination or sequence of signs occurring in a pure realm, where a small assortment together, in their absolute purity, could replace volumes of books in perspective- the viewpoint explored as an open perspective from any angle onto the connected data, moved through and around, questioned and tested, with a fractal read/write capacity inherent in this approach. to escape the page and enter the dimensional screen in depth, endless, as the model may link to others in a larger context; and this is the room needed to begin to explore ideas, not single letters formed into words, each following another, trying to say something that remains beyond the truth it references and relies upon yet cannot call upon, nor stop this movement to process (T). in this way, 'logical reasoning' stops the sentence and the word, and requires computation outside this framework: to break words back into concepts and beliefs, map these back into models of truth, test them against the observational data, and judge their merit, as they tend towards greater truth or rely upon greater falsity as perspectives. and the truth is what must be the guide and compass, and falsity not allowed nor given the same weight or capacity as a choice that allows desired movements- and the moral treason of this, especially as normalized, made routine, etc.
for instance, consider [drugs] in the non-pharmaceutical context, which can range from positive to very negative effects, as this could be mediated as a shared situation, and perhaps involve ritualistic, responsible access that could be part of cultural learning and connected to self-development, while removed of the exploitation or human suffering and violence brought about by these same dynamics in the existing system. within given parameters it would be possible to take what is true about them and deal with it, to allow access to this truth while not denying other truth; and if that could be resolved, it could exist in balance for humans, managed in a healthy and responsible way. again, as ritual or some cultural structure that could allow exploration of mind-altering drugs as a particular aspect of development, for some people, if viable for them, while others may not want it yet would not be negatively affected if it were balanced, respectful, and within certain limits or parameters. say a highly conceptual person who wants to further explore their mind is guided on this path via established structures, and supported in this, yet also connected to the ground and a world of others, as this may lead to new insights or discoveries or understanding or awareness- yet in a context that also involves addiction, danger, mental health, and issues such as abuse that may fracture the psyche of a fragile person, etc. so to account for all these dimensions in an individual circuit and evaluate them, and in such a way, over repeated instances throughout society, perhaps it can be an optional path for some and not others, yet retain overall cohesion, versus setting society to function against itself, losing the greater purpose and forgetting the obligation involved in shared earthly existence.
thus if there were [models] of situations, whatever they may be, over time and repeated iterations these models would be refined and tested against and made increasingly accurate, grounded in truth, removed of error. and differing perspectives could then be evaluated in terms of this modeling, and 'truth' referenced with regard to all facts and 'theories' (hypotheses), such that what is true in each view is validated in its truth as part of this process, versus discarded by a winning perspective, as it is today. and thus everything that is true, in its truth, maps back to 1, and all that is false, in its falsity, maps back to 0. though this is contingent, and all that is hypothetical or theoretical could also be maintained and accessed again should its previous interpretation have been limited; thus scaffolding of unsupported views could emerge out of a realm of unknowns, or of what was falsified from a given perspective, yet in being reframed or gaining new evidence, could challenge the existing models once again, perhaps to change or reconfigure or partially dismantle them in whatever inaccuracy exists.
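the 1/0 mapping with retained hypotheses can be put into a minimal sketch. all names here (Claim, weigh) and the naive evidence rule are invented for illustration- not a proposal for the actual mechanism, only its shape: claims ground to 1 or 0 against accumulated observations, and mixed or new evidence re-opens them as hypotheses rather than discarding them.

```python
# sketch: truth maps to 1, falsity to 0, hypotheses retained (names invented)

TRUE, FALSE, HYPOTHESIS = 1, 0, None

class Claim:
    def __init__(self, text):
        self.text = text
        self.state = HYPOTHESIS        # unknowns begin as retained hypotheses
        self.evidence = []             # observations from N observers, all kept

    def weigh(self, observer, supports):
        # fold in one observation; the state is always revisable
        self.evidence.append((observer, supports))
        supporting = sum(1 for _, s in self.evidence if s)
        against = len(self.evidence) - supporting
        if against == 0 and supporting > 0:
            self.state = TRUE          # maps back to 1, in its truth
        elif supporting == 0 and against > 0:
            self.state = FALSE         # maps back to 0, in its falsity
        else:
            self.state = HYPOTHESIS    # mixed: neither crowned nor discarded

c = Claim("the model predicts the observed data")
c.weigh("observer A", supports=True)
c.weigh("observer B", supports=True)
assert c.state == TRUE
c.weigh("observer C", supports=False)  # new evidence re-opens the question
assert c.state == HYPOTHESIS
```

note the deliberate asymmetry with a winner-takes-all evaluation: no observation is ever deleted, so a falsified view can later challenge the model again, as described above.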
in this way, versus the use of SIGNs to communicate one SIGN after ANOTHER- needing to rationalize these as a perspective in a long-running framework, like a unique number sequence or super-long prime number- instead:
[concept] [concept] [concept] [concept]
as these may be nested or connected together in various non-linear and/or parallel bit set configurations. the above sequence, in its 'content', could hold the information of an entire book, for instance.
[internet] [cyberspace] [infrastructure]
and these concepts could go back to the quasi-algorithms that model them as ideas, their conceptualization in depth and breadth- as molecules, yet also with nested sets of shared structures and scaffolding...
[electromagnetism] [processing] [data]
and thus different frameworks could develop or be referenced in a context, and a perspective could move into and out of these frameworks as structural description, where the /concept/ itself is modeled in its truth- there is no warping or distortion or error involved in its mapping or accounting, and if there is, it can be observed and error-corrected via debate and panoptic review that defers to truth, firstly.
[code] [programming] [crypto]
and thus you could go further into a given context, or start in one context and move into another, or someone could evaluate particular dimensions of an event or idea- inspect and analyze various facets of the whole from any angle, including establishing new perspectives if valid. thus the issue of boundaries or limits or thresholds of ideas, and the conceptualization and conveyance of ideas, could remain open to interpretation, given facts and reasoning for a viewpoint. and this would be a right and an obligation.
and thus if someone says there is no inherent reason to consider crypto only in a machine context- say it involves human perception and relations and communication in an ordinary day-to-day context- this could be referenced in the shared model and allowed to exist as a hypothesis, a potential in whatever truth it may hold, and could be explored versus censored as an idea within an educational context that involves these aspects, versus fitting the exploration only into a finite, limited approach, sans people in their inherent crypto-capacity. in other words, if a person has code and programs (algos) as part of their daily existence, perhaps some of that also involves a cryptographic aspect, as it may relate to anthropological and sociological or other issues such as heritage or demographics, that has value in this same context and could and should be explored in its truth. yet in universities today it would likely be disembodied and held outside 'computer science' cryptography, when that domain could likely benefit most from challenging ideological conceptions of what crypto is, how it readily functions in the day-to-day, and what is possible, given shared awareness. and thus the requirement of accountable empirical truth, to enable this interdisciplinary transfer and support of knowledge across 'boundaries'.
it likely cannot exist as an approach until truth is actually secured and a new approach to language (and mathematics) developed as circuitry, without falling to the weaknesses of existing language to overpower involved truth.
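the sign-as-pointer idea above can be sketched with an invented toy model (the concepts dict and expand function are illustrative only): a short sequence of concept tags references shared models, and an observer unfolds them to whatever fidelity they can take in, rather than the world being re-serialized in every message.

```python
# sketch: signs as pointers into shared concept models (toy data, invented)

concepts = {
    "internet":       {"part_of": ["infrastructure"], "medium": ["cyberspace"]},
    "cyberspace":     {"substrate": ["electromagnetism", "processing", "data"]},
    "infrastructure": {"includes": ["internet"]},
    "electromagnetism": {}, "processing": {}, "data": {},
}

def expand(tag, depth):
    """unfold one concept tag into the shared model, to a chosen fidelity."""
    if depth == 0:
        return tag                     # the child's view: the sign alone
    node = concepts.get(tag, {})
    return {tag: {rel: [expand(t, depth - 1) for t in targets]
                  for rel, targets in node.items()}}

# the same sign read at two fidelities, without re-serializing the world:
shallow = expand("internet", 1)
deep = expand("internet", 2)
assert shallow == {"internet": {"part_of": ["infrastructure"],
                                "medium": ["cyberspace"]}}
assert "substrate" in deep["internet"]["medium"][0]["cyberspace"]
```

the sequence `[internet] [cyberspace] [infrastructure]` then carries only tags; the depth comes from the shared model each tag points into, observer by observer.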
--- random re/writes ---
i missed it- should have written about stitches and code, yet did not enter into sewing itself. hand sewing seems most closely connected to coding, at least in some aspects: someone did the detail work at some point, which may or may not be autogenerated into other patterning via computation or algorithms; and like patchwork quilts, taking sections of patterns and placing them and stitching them together into a larger whole could in some sense correlate with a program or development approach. programmatic patchwork, or quilting of data. there is potentially much more to this, yet my limited knowledge prevents exploration. "stitches in time" as code may go through routines or be processed; programs nesting various data modeling and how it is woven together, or involving many broken stitches as processed; and likewise its relation to the network or webwork in terms of weaving, and then of spiderwoman and such symbolism, or the three sisters.
so too, the idea of a discontinuous cipher that is pieced together like a quilt. also, the conceptual aspect of QR-code, a perhaps datagram-like characteristic embedded in its aesthetic, as this could relate to abstract if not asymmetrical quilts that can have highly advanced aesthetics likewise, as futuristic as technology, and potentially, in a realm of electronic fabrics, could carry diagnostic or data functions as platform and infrastructure. so again a question of boundaries for interpretation of where ideas function, and their potential migration across boundaries or within territories.
zero-knowledge crypto in terms of proof and puzzle pieces; may it involve in some sense having a puzzle minus one missing matching piece, or else a single puzzle piece minus the puzzle. (puzzle minus single piece; single piece - puzzle)
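the puzzle-piece image loosely matches how a Merkle tree behaves- offered here as a hedged illustration, not zero-knowledge proper: the root digest is something like the puzzle minus its pieces, and one leaf plus its authentication path is a single piece minus the puzzle; the two verify against each other without the whole being revealed. all function names are invented for this sketch.

```python
# toy illustration: root digest ~ "puzzle minus its pieces";
# leaf + path ~ "a single piece minus the puzzle" (an analogy, not real ZK)

import hashlib

def h(b):
    return hashlib.sha256(b).digest()

def merkle_root(pieces):
    level = [h(p) for p in pieces]             # assumes a power-of-two count
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_path(pieces, idx):
    # the sibling digests needed to refit one piece into the whole
    level = [h(p) for p in pieces]
    proof = []
    while len(level) > 1:
        sib = idx ^ 1
        proof.append((level[sib], sib < idx))  # (digest, sibling-is-on-left)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return proof

def fits(piece, proof, root):
    node = h(piece)
    for sib, on_left in proof:
        node = h(sib + node) if on_left else h(node + sib)
    return node == root

pieces = [b"piece0", b"piece1", b"piece2", b"piece3"]
root = merkle_root(pieces)                     # published: puzzle minus pieces
proof = merkle_path(pieces, 2)                 # held: one piece's relation
assert fits(b"piece2", proof, root)            # the piece matches, sight unseen
assert not fits(b"forgery", proof, root)       # a mimic gets nowhere
```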
then, also another tangent- the thought of 'binary truth' that is only partial, perhaps minor, that when its larger condition is falsified, could undergo the equivalent of magnetic reversal (of earth's magnetic pole directions), such that the context for observation flips to the view opposite those who are onesided and warped in their observations and self-serving belief. if corollary, then the very foundation of their entire observation would vanish, and the truth they are left with would rely upon a structure they have opposed, in a worldview they cannot 'reasonably' function within. in this way, as with technology, where a pole reversal could change compass directions overnight and confuse and disorient wildlife, so too binarists.
--- ungrounded numbers ---
...more on the issue of quantification and SIGNs as variables. what seems to be involved is the unaddressed issue of /superposition/ that influences how an observation is grounded or ungrounded, given its interpretation and accounting, or lack thereof. and thus, suddenly equating [sign] = 40,020 is also a question of its variability as a sign, its accuracy- for if it has multiple interpretations, this numbering also is fuzzy or ambiguous, while supposedly defined and concrete as a number, as if the calculation or the number were mapped to a solid error-checked reality, when instead the number itself could *symbolize* or emblemize this, yet only as a mirage or belief and not as an actual grounded condition- thus it could be virtual, or a false perspective, that the sign of the number then represents.
in other words, mathematics, as it involves number, can also be ungrounded and involve error-reliant calculations, where assumptions as to a shared view, or references to shared facts or conditions, may rely on limits or partial truth that does not account for other aspects beyond or within a boundary- yet it seeks to, and defines a territory by this "objective" numbering that actually may exist in subjective and ungrounded terms (A=B) by default, whereby the language of mathematics is itself not empirically situated, else could function in some ways as literature, perhaps as ~theorized physics.
writing code and programs in terms of number does not make them inherently "objective" (A=A) or more accurate in terms of accounting for reality, yet it may involve inherent rationalization and reductionism and deterministic frameworks that could naively assume so, or involve the ideological conceit that equates this with a condition of actively mediating grounded truth. the difference is that this is sign-based interaction, at the level of and within the context of signage and its calculations, which have become and are removed from a larger accounting for truth that is outside their boundary and territory- sort of like a Java sandbox situation that replaces the world with a representational version that no longer references truth beyond its borders, and thus becomes this truth itself, albeit ungrounded, in error. all language is proposed to share in this conundrum of its serialization and linearization: everything moving in a direction yet never converging into a single central point; every observation moving away from every other, even when referencing another; each bounded within particular and unique frameworks that recreate the world again and again and again, yet adding errors and relying upon distortions and lesser and lesser views to maintain given beliefs, against the surrounding evidence. and thus the primacy of uncritical belief, and the utility of mind- and brain-washing and pills and drugs for making populations compliant, malleable to the false viewpoints.
here is the issue with coding and programming and mathematics, numbers: the limit or boundary of consideration can add up differently, yet this may also be hidden from the computation at another level of processing...
5 + 2 = 7
consider the above calculation of two numbers that equal a third. each could be considered 'finite' and unambiguous as a number, 5=5 &c.
now here is where it gets weird, strange. consider that each number could be evaluated as a [sign] that exists as a variable in an equation...
[x] + [y] = [z]
for mathematics to be subjective, there needs to be a difference between this sign as it is believed to exist and as it actually exists. and thus a /word/ or /concept/ could be input, such as /work/, and that could have an inherent ambiguity involved, and involve estimation and approximation by default of its superposition of meaning- the potential paths of grounding...
now imagine that /number/ also could have ambiguity, where A=A could move towards A=B even though the SIGN could appear the same, within a certain context or limits or parameters or a boundary or threshold condition...
[5] + [2] = [7]
what is trying to be established is that each [variable] has a potential for ~fuzziness, in terms of calculation or processing, depending on the rules for how it is evaluated. that is, a de facto limit or boundary or parameter that may be assumed to exist may be a faulty assumption, in error, and the resulting computation that occurs may be ~variable...
[5.1] + [2.8] = [7.9]
if the above parameters were operational yet hidden, the [variables] could all appear the same (5 + 2 = 7) when removing the decimal placeholder and extra numerical detail, and thus the computation could appear to equate with the initial example, even though this extra data exists, invisible or left unconsidered in a given model- due to limits or boundaries, and the ability of an observer to account for that level of detail.
in this way, if the resulting calculation is concurrently rounded-off in parallel, the same calculation would be as follows:
[5.1] + [2.8] = [8]
further, if the initial variables were slightly different still, yet another result could occur in this seemingly contained ~"objective" framework...
[5.7] + [2.9] = [8.6]
or:
[5.7] + [2.9] = [9]
what this seeks to convey is that if a boundary were to exist within the structural analysis of the code that allowed a floating decimal, yet this was hidden, that extra data could be carried in parallel with an equation that could be calculated across a range of outputs, due to its variability as evaluated, via limits or boundaries or threshold conditions. not only as a calculation of number, though- also as a likely basis for security exploits, in that this is how estimation and approximation of variables in pseudo-truth inherently function: that extra-missing part is never accounted for, and it can be the majority of a viewpoint when dealing with words themselves.
in this way, the output could vary, given boundaries:
[5] + [2] = [7|8|9]
that is, if there is some approximation going on, it could be hidden or unaccounted for in each [variable] yet still used for computation or hidden analysis- such that only 5 is accounted for (not 5.1), due to a limit or ambiguous framework, yet other computations could be enabled to occur.
in this way: [~x] + [~y] = [~z]
insofar as boundaries may be exploited, in terms of number, this can transform the assumption of A=A mathematics into A=B mathematics by default of this missing detail or its shifting, especially if due to confusion or the incapacity to account for what is occurring (perhaps chaos, even, if designed this way). thus ~complexity, paradox, ambiguity, as these relate to and potentially exploit conditions of partial truth.
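the [7|8|9] range above can be reproduced directly as a minimal sketch (the displayed helper is an invented stand-in for whatever boundary drops the decimal detail): truncating before the addition, versus carrying the hidden detail and rounding after, yields different results from what appears to be the same visible equation.

```python
# sketch: the same visible [5] + [2] ranging over 7, 8, 9 (helper invented)

import math

def displayed(x):
    return math.trunc(x)     # the observed sign: decimal detail dropped

# truncated before adding: the "objective" 5 + 2 = 7
assert displayed(5.1) + displayed(2.8) == 7
assert displayed(5.7) + displayed(2.9) == 7   # same visible equation

# full hidden detail, rounded after adding: 8 and 9 from the "same" inputs
assert round(5.1 + 2.8) == 8
assert round(5.7 + 2.9) == 9

# the exact sums, carried in parallel with what is displayed
for x, y in [(5.1, 2.8), (5.7, 2.9)]:
    print(f"shown: {displayed(x)} + {displayed(y)} = {displayed(x) + displayed(y)}"
          f"   exact: {x + y:.1f}   rounded: {round(x + y)}")
```

where the truncation or rounding happens relative to the boundary, not the arithmetic itself, is what moves the output across the range.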
this and, in all ways, the use of named variables as SIGNs to denote or connect these to numeral values, as if grounded by default. yet it does not simply start with A=A correspondence, though it is ideologically believed to. instead it is fallacy, yet used to govern the world via false perspective.
--- other ---
regarding the HIOX signaling display for bit sets: it would be possible to also connect such display signage with a sensor that could affect and/or influence the sign, updating it via a stream of variability- perhaps much like airport or train station or stock market signage that tallies data anew, or towards a weather station diagnostic recording various parameters, though abstracted. thus sensor data could push through the data matrices and potentially influence or function as a random number generator or, if signs are somehow linked together, a random event generator that could have implicit nervous-system connections between chaotic data structures in remote locations- as if entangled, changing, shifting, yet in some dynamics or dimensions stable, or able to queue in and out of this state- so that the signage itself could remain [variable] as a meta bit set, and thus perhaps like rolling code, or some waterfall, going through this into another hidden realm: needing to get past it to gain access, or such randomness or variability functioning as a boundary or refresh state that can recombine or alter its structure given input or keys, etc. in this way, tallying chaos. providing context for other crypto structure.
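a speculative sketch of the rolling-code reading of this, with all names invented (roll, sign_state) and a seeded pseudo-random stream standing in for chaotic sensor data: each reading is folded into a hash chain, so two remote signs fed the same stream agree, while any divergence in the stream changes the visible state- the variability itself functioning as a boundary.

```python
# speculative sketch: sensor stream folded into a hash chain as a rolling sign
# (roll, sign_state invented; seeded PRNG stands in for chaotic sensor data)

import hashlib
import random

def roll(state, reading):
    # fold one sensor reading into the running state
    return hashlib.sha256(state + reading.to_bytes(4, "big")).digest()

def sign_state(seed, readings):
    state = seed
    for r in readings:
        state = roll(state, r)
    return state.hex()[:16]            # the visible 16-hex "bit set" on the sign

rng = random.Random(7)                 # stand-in for a chaotic sensor
stream = [rng.randrange(2**32) for _ in range(5)]

a = sign_state(b"HIOX", stream)
b = sign_state(b"HIOX", stream)
assert a == b                          # same stream: remote signs "entangled"
assert a != sign_state(b"HIOX", stream[:-1])   # one reading off: boundary holds
```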
--- note on list culture ---
(who knew cypherpunks was an underground comedy venue...)
Häxan, The Seventh Seal, Mars
⚖