This class blew my mind https://www.coursera.org/course/crypto Helped understand how much logic is subverted. -lee On Saturday, September 14, 2013, brian carroll wrote:
// apologies- yet another attempt at error correction of a previous statement and clarification.
quote: Subvert logical reasoning, disallow it, including freedom of speech and thinking and censoring and controlling what facts are allowed - controlling POV - then the opposite occurs: A|B ---> B, such that T => pT
This is the code exploit of the Binary Crypto Regime, where B(0)=A and pT(0)>T(1)
--- clarification on computational approach ---
I made an error in describing processing in the context of infinities, stating that one infinity to the next is evaluated in a serial approach, when instead this would be nonlinear and require massively parallel processing, along with serial evaluations in a looping evaluative heuristic - testing against a hypothesis or model, 'running code' or 'living code' as it were, versus a static one-time interaction of data and algorithms; more like a situation of intelligent life in a bounded context, as if a data aquarium or code planetarium.
the reason for this parallelization relates to considering all the combined permutations in terms of probabilities. Thus [x][y][z] as variables are not necessarily about an algorithm that reveals a structure moving from x->y->z by some mathematical relation - or so is my naive guess - such that if you use a conventional crypto approach, something may be revealed between them that matches an equation pattern, providing order within the chaos of variability, until legible, intelligible.
In other words, instead of XYZ being a serial number that could extend linearly onward toward infinity - that is, [xyz]...[∞], where that "string" is a horizontal number or code, one that relates naturally to binary on-off processing in a highly efficient manner (processing and computational speed, largest prime numbers as context) - the assumption in another crypto framework is that the same situation could be happening 'vertically', like a slot machine running to bounded infinities (largest primes or not) within each variable. Thus [xyz] may have no discernible linear structure, no overall equation that makes sense of the resulting 'horizontal' string. such that:
[x][y][z] => [n¹][n²][n³]...[nⁿ]
whereby [n] is variable and could be anything - a number, a function, a calculation, null, its own computation. In this way, each variable could tend towards infinity or its own structuring within the string, whose length is not so much the issue as the difficulty in resolving its total structure, especially linearly. [n¹²³] would not be decipherable by running algorithms across its horizontal string; instead it would require solving for each variable, or grouped variables, in the string, e.g. [n¹][n²⁻³][n⁴⁻⁹]
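To make the keyspace arithmetic behind this concrete, a minimal sketch (the function name and the example domain sizes are my own, purely illustrative): when each position in the string draws from its own domain rather than a fixed two-symbol alphabet, the total keyspace is the product of the per-position domain sizes.

```python
from math import prod

def keyspace(domain_sizes):
    """Number of distinct strings when position i can take
    domain_sizes[i] different values."""
    return prod(domain_sizes)

# three ordinary binary positions: 2*2*2
print(keyspace([2, 2, 2]))  # 8

# mixed grouped domains, loosely echoing [n1][n2-3][n4-9]:
# one bit, one letter-pair (26*26), six decimal digits (10**6)
print(keyspace([2, 26 * 26, 10 ** 6]))  # 1352000000
```

The point is only that the product, not the string length, governs the difficulty: a short string over wide domains dwarfs a long binary one.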
Thus, while i have no actual idea of how crypto and binary code relate in terms of encryption and decryption methods, it is assumed the conventional approach remains serial and directly interrogates the serial string to reveal its structure, across various formats of the code (programming, encoding, and other data formatting schemes). A binary string would be an example: 10100011001010001010101010010101, whereby to solve for [xyz...n] would involve finding the overall linear structure that provides such linear organization - say, assuming it is encrypted code, *)S)*S*))SA&*S&**S()S*S)aAUIHNL*0 - and the assumption would remain that each variable is related to the next in some 'coherence', so that solving one part or layer may reveal another, such as HSKSLLILHSILALSWLWLSDUI. Thus the string [xyz] is made intelligible by this coherence within the linear string, across that massive horizontality (very large streams of data that contain data and programs and messaging).
Whereas for a paradoxical logic approach, each variable could itself be 'many' in place of a single bit- or the boundary of the single bit could be [N] and move towards a bounded infinity, a mathematical function, or other calculation in that same location.
disclaimer, stating the obvious: i have no idea what this is in terms of applied cryptography - there is a tremendous gap between these statements and actual code - though to me the approach is much more accurate as "thinking code" that involves human processing via logical reasoning, parallel and serial processing. In that view, the very idea of a string of code could also function as signal in noise, or even absolute truth, in terms of messaging. and so it is obvious 'binary thinking' is not like everyday evaluation, in the sense that there is grey area in mediating events, a pause before decision - yet within this pause, bounded infinities of hypotheses can be queried (referencing previous instances in stored or external memories) that then influence the tallying of the response, which most likely will be weighted between 1 and 0, unless purely ideological.
Thus- *conceptually*- to consider "code" in this human context, of a living breathing idea grounded in empirical truth in a shared human viewpoint, to be shared as information via exchange: it is more grayscale than 10110101010100001, in terms of language and how thinking functions - more about looping and weighting of variables than having a *single correct result*, when there can be several overlapping or contrasting interpretations *at the same time*. So imagine if each bit of the binary string were instead a variable on a 0-9 scale weighting the evaluation, such that: 10927238292340246.
This moves the [binary] string into a fragmented string of variables, more like analog computation: [1][0][9]...[6]. In this way it is to consider the 'bit' as an N-variable - and what if it were the alphabet instead of numbers, 26 possible letters for each bit: USOWHLSELNSQAHBVY
the issue being that, like a slot machine, those [N] variable bits could tally up any of 26 potential letters, or 36 with digits added, or more still with lower case and punctuation or symbols - and suddenly a small 'string' of data could involve huge interrelational structures that may or may not be related across the horizontal span, depending on how it is constructed via algorithms and conceptual formatting. Maybe this is already how transformed code is achieved: taking a certain-sized content, variable [x], and transmuting its entirety into a string or stream of obfuscated data that must be 'worked at' to decrypt or translate into use.
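As a rough numerical sketch of that inflation (the 17-character length matches the example strings above; the symbol counts are standard ones, with 95 printable ASCII characters as my own added case):

```python
# keyspace growth for a fixed-length string as the per-position
# alphabet widens: binary, decimal digits, letters, alphanumerics,
# and the ~95 printable ASCII characters
length = 17  # same length as the example strings above

for name, symbols in [("binary", 2), ("digits 0-9", 10),
                      ("letters A-Z", 26), ("alphanumeric", 36),
                      ("printable ASCII", 95)]:
    print(f"{name:>16}: {symbols ** length:.3e} possible strings")
```

Each widening of the alphabet multiplies the keyspace exponentially in the string length, which is the 'sudden hugeness' described above.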
The seeming difference would be computational: how this relation exists in processing, in terms of hardware and software, though also thinking, programming. Because what if there is a limit to these transmutations, forced into a binary 1/0, that bounds these infinities to only a certain algorithmic space - or, even computationally, such numbers cannot be adequately computed and thus *do not exist* as calculations within machines and software approaches, crypto perhaps especially, when the security they would provide would be unfathomable in terms of existing brute-force calculations of 'linear' patterns.
my speculation is from an unknowing of applied cryptography and computer programming, yet a knowing of logical reasoning and empirical thinking, awareness, and how the two are ideologically at odds in their basic assumptions. thus, within my condition of 'illiteracy', there is an attempt to share an idea (pT) about a shared situation from an outsider vantage with those of highest literacy in applied code - yet code that, to my observation, rests on a flawed idea and on false, inaccurate assumptions, in particular the primacy of binarism for security, when this nonlinear/multilinear computation (parallel & serial) would easily defeat it.
such that it is not about strings but instead parallel sets: [x|x|x|x]...[n]
as the [variable] - yet this may not be coherent for a horizontal algorithm to solve; it may not have 'rationality' across, from one digit to the next, revealing its hidden structure. instead, randomness would be inherent rather than woven into the code; it would be more a matter of revealing information out of noise structures than putting information into noise that is bounded and can be shaped into structure. in this way also, noise could have structure yet not lead to decryption - it may be a false corridor within the ever-expanding maze.
it is that the [N] variables are each in superposition - not static, finite, and absolute by default, but instead 'truly variable', unbounded to a certain extent (infinities within infinities across infinities via nested sets).
the conceit or test of the heresy would be a 256-'bit' quantum computer that solves AES-256 - though if it were a binary string this could even be trivial, versus, say, [N]-bit, which seemingly could take *forever* to evaluate via running, looping code evaluation and a shared empirical model that develops alongside, out of, and through the technology as a 'thinking machine' - which, the more it is like the human brain, the more likely the messaging could be made sensible via existing concepts and structures to test against, evaluating patterns and looking for correlations. in that context, a three-bit [N]-variable string of code could probably defeat all computing power today, especially if large expanses were allowed - numbers, letters, symbols - it would be unsolvable, potentially, extremely probably. Largest primes would be a minor detail, another variable seemingly, in such a context, due to its potential for incoherence and complexity.
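To put 'forever' in rough numbers, a back-of-envelope sketch - the guess rate of 10**12 per second is purely my assumption for scale, and the 'bounded infinity' of 10**30 values per position is an arbitrary stand-in:

```python
SECONDS_PER_YEAR = 31_557_600  # Julian year

def years_to_exhaust(keyspace, guesses_per_second=10 ** 12):
    """Years to try every key at the assumed guess rate."""
    return keyspace / guesses_per_second / SECONDS_PER_YEAR

# an ordinary 256-bit binary key: ~3.7e57 years, already beyond reach
print(f"{years_to_exhaust(2 ** 256):.2e}")

# a three-position [N]-variable string, each position ranging over a
# 'bounded infinity' of 10**30 values: keyspace (10**30)**3 = 10**90
print(f"{years_to_exhaust((10 ** 30) ** 3):.2e}")
```

Under these assumed figures, three wide-domain positions yield a larger brute-force horizon than the 256-bit binary key, which is the gist of the claim above - though nothing here says such a scheme is constructible or sound.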
likewise, this [N]-bit approach for random number generators - yet why not random outside of 1/0, as a noise field, generating strings via a two-[N]-variable string: just let it run and tap that, with or without structure - would it even matter. in other words: take any two ideas, any two signs or symbols or colors or whatever, relate them, tally, and extend this as a process. that is proto-language in a nutshell, the crazy nut cracked open - yet beyond the insanity of my own incapacity to communicate and flaws in understanding, there is something about this approach, basically observation, that has *coherence* absent in a binary approach and serial algorithms - because that is not how people think or communicate; it is N-dimensional, geometrical, looping. and processors and code and software at present cannot model this, allow for it. and that formats reasoning, perception, what options are available to share ideas and evaluate them. we are stuck in binary because it is enshrined both in technology and in institutions - it is the dead static code of shared ideological non-thinking that is pushing decision-making and actions towards its deterministic end game, a onesided machine-based value system, devoid of life, nature, and humanity, except insofar as they profit its own continuing automated development and further extension.
so the gap between my illiterate views and the actuality of implemented security code by those literate is one aspect; another is my literacy in thinking code and the illiteracy of thinking within foundational technology and its infrastructure - and the result of this, which requires a world like it is and relies on bad code and ideas to allow for it. thus an audit or accounting of the situation: an attempt to get across the idea that there is a model of dumb, unintelligent code at the base of this situation, an approach so flawed as to be the basis for tyranny. it ties into 'ideology' across platforms, from individuals and groups of people to software/hardware and bureaucratic systems, and in that 'combined state' of a false-perspective empire, the kernel is corrupt and the whole thing invalid - including at the constitutional level, itself ignored by binary default, the epic loophole of relativistic frameworks allowing the fiction and its virtuality to replace shared logical reasoning, because truth and logic can simply be ignored, 'privatized'. enclaves can then rule over others as if a caste system, via technology and ideological assumptions that function as religion: technologists as priests, gods of this technocratic utopia, the peasants not having the understanding to operate in such a realm, as guaranteed by the originating lie and tradeoff that allows all of this to continue - that absolute truth is an everyday condition and you get to choose what to believe, as if a right or protected mode of operation, no matter how many others must suffer for it, to sustain the illusion and shared delusion.
the cloud, here in the corrupted model, a state filing cabinet, digital bureau for the bureaucracy - citizens organizing info into others' invisible folder structures, volunteering the data via handover, designed into the technology itself as a marketing and communications strategy. the sieve of private data is equivalent to entire populations seeking out pickpockets to hand over their contents, incentivized as it is. and so 'security' is as if a kind of institutional transparency in relation to a corrupted, failed, rogue state that can read and see everything you are doing, encrypted or not - every computer a dumb terminal to the state mainframe, rebranded and rebadged, hidden, 'anonymous'.
--- more of this insane ungrounded viewpoint ---
it was mentioned that a three-variable 'string' [x|y|z] would be differently approached if parallel versus serial, in that each bit of a binary string could be N-variable in a parallel approach - or so it is assumed possible, as with probabilities and slot machines, or basic everyday observation of events and what enters and exits consciousness given context. and while not knowing the depth of this in terms of cryptography, completely out of my depth, it would seem the concept of keyspace could relate to how such a 'paradoxical string' could exist, given the boundary for determining what N could be for [x], [y], [z]. For instance, if it were binary ones and zeroes, the probabilities could be run as the 2³ = 8 possible combinations: 111, 100, 110, 101, 010, 011, 001, 000.
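Those eight combinations can be generated rather than hand-listed; a minimal sketch of walking this small keyspace exhaustively with the standard library:

```python
from itertools import product

# every 3-position binary string, in lexicographic order
combos = ["".join(bits) for bits in product("01", repeat=3)]
print(combos)
# ['000', '001', '010', '011', '100', '101', '110', '111']
print(len(combos))  # 8 == 2**3
```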
And within that, perhaps there is meaning. Yet if 'the probabilities' are changed via [N]-bit variables, a single variable could go all the way to infinity, and thus BIG-BANG inflate via probabilities into a huge keyspace - perhaps unpacking structures this way that reference others already developed, as if infrastructure being revealed that connects with others elsewhere via wormhole-like connectivity, then closing down upon successful messaging: encrypting and decrypting via few variables, via inherent yet hidden structural relations within these combinations, which could be infinities related to infinities - and then the issue of how to find them, or what to look for. Black box, yet even more so: RNG as model for signal, not noise, thus tending toward a psychic Random Event Generator, as if the innate sense of animals before catastrophe, cosmic faults and folds.
the idea or difference is paradox - essentially *superposition* of the bit as [N]-variable, no longer finite and static, potentially active and transformative, diagnostic even, in a sensor sense of the analogue as queued circuit. What if alignment occurs in the string under certain conditions and not others; what if it tunes in and structures are revealed, decrypted, yet out of tune it vanishes - the code collapses or changes, as with temperature-sensing colors, and the variables change, mask into background, returning to mystery. It does not seem that computers today can even adequately allow for infinity in a single bit of this, versus a larger parallel string - and what might that mean about thinking, too: nothing more than finite discrete thoughts, one decision to the next unconnected, unless largest prime - say, a rogue US terror-state pwns earth as if master discourse, shared POV, even though ungrounded - this the dumbed-down, unintelligent, lowliest shared viewpoint of situations in their depth, instead made shallow, sold as daily headline? the CODE makes it so, in brains and machinery and bureaucracy. binary is the enforced and corrupted 'shared state', conceptually and ideologically, yet it is a false belief.
the issue then of shared and unshared identity, belonging or not belonging to this 'master/slave' thinking...
shared ID <-----> unshared ID
And how this relates to default interpretations, the quickest route for 'feedback' and determining events based on perspective... are you binary or paradoxical?
Can you make sense of your own consciousness, or must you take on false consciousness to function in society and go about decision-making in its frameworks - taking on its value systems, which fragment a person from their own 'true' self, taking over, reformatting, and reprogramming a life to serve the machine agenda over and against 'shared humanity'? -- now an unshared identity, via private relativistic ideology. sell out your ancestors and neighbors for a place in the machine...
Quickest route to thinking- *binary*, of course, processor speed as if SUPERSMART! --- "look- i can decide things and determine things irrespective of their actual truth, and it works for me and others; everyone else is just lazy!" Like water flowing downhill, 'logical reasoning' turned into a Price is Right PLINKO game: quick and easy 'automated reasoning' via the path of least resistance, aided and abetted by binary ideology, creating a friction-free virtual universe, mind detached from body by likewise flawed historical beliefs, enabling this madness its onesided platform. the trope of largest prime, 'uncorrected ideological perspective', the trophy award for the most stupid, greedy, and ignorant. an entire society and civilization built around rewarding those whose activities align with this, against human conscience and its needs, which is then viewed as the enemy.
--- major social dynamic ---
ideologically there is a differentiation in terms of the process of reasoning, how information is parsed...
intelligent <-----> smart
also, how shared identity may differ between empirical and relativistic models of truth...
truth <-----> partial truth
and the difference in conceptualization, reliance on how frameworks are constructed, tested...
ideas <-----> facts
and this directly relates to issues of observation and cybernetics (looping circuitry)...
fallible observer <-----> infallible observer
error-correction <-----> no error correction
In this way 'inflated' or 'bubble' views can rely on warping, skew, distortion for their truth, which is verified by conforming to a false or inaccurate model reliant on a limited, *protected*, or SECURED version of pseudo-truth (pT), as if it were shared empirical reality (T) removed of error - because it is believed to be, via ideology.
grounded empiricism <-----> ungrounded relativism
In this way a 'private worldview' can replace 'the public' view as if a shared domain, and become the basis for one-sided 'reasoning' depending on authoritative beliefs, where facts can be chosen to fit the model, others discarded, to uphold the perimeter, basically privatizing perspective to a finite inaccurate view as the ex