cypherpunks
https://www.muckrock.com/foi/united-states-of-america-10/vupen-contracts-wi…
https://muckrock.s3.amazonaws.com/foia_files/9-11-13_MR6593_RES.pdf
FOIA by @ramdac
Not a massive surprise, but interesting to see it in writing. Looks like
this is for the standard subscription, not custom development. Although,
the cost is redacted. Chaouki Bekrar is also mentioned by name.
—————————————
Rich Jones
OpenWatch <https://openwatch.net> is a global citizen news network.
Download OpenWatch for iOS <https://itunes.apple.com/us/app/openwatch-social-muckraking/id642680756?ls=…> and for Android <https://play.google.com/store/apps/details?id=org.ale.openwatch&hl=en>!
disclaimer: if doing this is illegal then the alphabet is illegal and
'ideas' themselves have been criminalized by the terror state
--- (continuing...) ---
hint for previous code: _ |v JL 3 p\ w Z 7r : 2 6
perhaps should have made the last character (6) this instead: O J
the idea then of a digraph substitution: JL = inverted T, etc.
Thus, for instance, if someone has a perceptual bias and is only reading
single characters as single units, the digraph, trigraph, etc. approach
would be beyond their threshold of observation, beyond their perceptual
model; such a bias or limitation in interpretation can allow other
information to exist beyond how it is perceived by a given observer, unless
they figure it out. so it would involve rules, or learning how to determine
which schemes are active or not, multiple or none
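the digraph idea above can be sketched in a few lines. this is only an
illustrative substitution table (the symbols and pairings are assumed, not
the actual scheme from the hint): a reader scanning one character at a time
never sees the two-character units.

```python
# Hypothetical digraph substitution: pairs of characters map to single
# symbols, so character-by-character reading misses the units entirely.
DIGRAPH_TABLE = {
    "JL": "⊥",   # "inverted T", per the hint above (assumed mapping)
    "7r": "≈",   # illustrative entries, not the author's actual scheme
    "|v": "ψ",
}

def encode_digraphs(text: str) -> str:
    """Greedily replace known two-character units; pass the rest through."""
    out, i = [], 0
    while i < len(text):
        pair = text[i:i + 2]
        if pair in DIGRAPH_TABLE:
            out.append(DIGRAPH_TABLE[pair])
            i += 2
        else:
            out.append(text[i])
            i += 1
    return "".join(out)

print(encode_digraphs("JL 7r x"))   # ⊥ ≈ x
```

the same table read in reverse would decode, provided the receiver knows
which scheme is active.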
further, the idea of a search of the original string as an anagram or other
approach, such that the breakage of units into fragments or components or
elements can defeat brute-force meaning unless checking all variables,
which would tend towards infinity, via probability. thus the secret is in
some sense the noise: the more it is looked for, the more possibilities
arise and fill in the void with additional meaning that may or may not be
active -- whether intended or not, which is the spooky aspect of this, if
systems or data are somehow entangled, whether as information and-or
objects; seemingly /metaphysical keys/ even.
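the combinatorial point can be made concrete: even a short string has an
enormous anagram space, so any search that must check every rearrangement
tends toward the infeasible. a minimal sketch:

```python
from math import factorial
from itertools import permutations

# Anagram space of a 15-character string with all-distinct characters:
n = 15
print(factorial(n))   # 1307674368000 orderings, ~1.3 trillion

# For a tiny example the space is still easy to enumerate by hand:
small = "abc"
print(sorted("".join(p) for p in permutations(small)))
```

with repeated characters the count shrinks by the usual multinomial factor,
but the growth is still factorial in the length.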
so a hypothetical example would be putting the original 15 characters into
a structural search in which the original concept cannot be located in the
analytical results (in ordinary terms, the 'word' does not exist within the
letters/numbers themselves). in this way, while it was the basis for
creating the 'string' (bit set), it may not be active in its
interpretation, or its activeness may exist in the permutated results and
thus not require a decode by the receiver: already having the key, they
know the shared reference, yet the structure exists as infrastructure. it
may or may not have relevance, though it could be mapped into that
structure as if a conceptual root structure with calculative or geometric
aspects -- meaning there could be many simultaneous structures, some
related and others not, that may or may not be activated in particular
signaling, perhaps similar to neurons and synaptic structures in the brain
that fire together, yet why they do so remains a mystery due to the limits
of observation and the lack of an accurate model for analysis at the level
of activity being observed. thus phrenology or scrying in an ungrounded
sense provides security by enhancing warped, skewed, biased, limited
interpretations, erecting walls and false rationales via "relativistic
frameworks" in a realm of halls of mirrors within halls of mirrors:
distortion autogenerated by unknowing. in some sense 'signal' can be
masked as noise also, shell games within the parallelism of infinity and
the computational limits of parsing those dynamics accurately, which could
easily exist beyond technological means. thus also in the realm of
literacy, 'reading the signs' or abstract markings can become the limit
between those intelligent and those smart yet illiterate and not in
service to the conceptual foundation involved; their perceptual frameworks
can be used to construct false perspectives that become labyrinths to
control future options, while creating the illusion of being in a 'shared
awareness' or (partial) framework whereby pseudo-truth, with its reliance
on error, can allow extra-dynamics to exist while also coexisting within
'some truth' of the shared condition. perhaps this is zoning within a
threat-containment model, or a subversive-takedown approach, establishing
booby-traps and trap-doors and pyramid defenses
so, a letter or number could be de|con-structed and words or concepts or
meaning could also be tokenized by a substitute system of relation which is
what symbolism also allows, a type of exchange existing with a symbol and
its network of conceptual structures aligned with it, an infrastructure of
shared meaning and dynamics that also has myth and magic and spiritual
dimensions as a potential, these tending towards darkness and light yet
also accounted for in a shared universe (U), this richness and depth then
perhaps closest to the forces involved and their dynamics, versus its
caricaturization that presents its substance mainly in cartoon terms, a
popular culture framework as if yet another commoditized ideology, thus the
assumption of surface-relations versus actual depth and core functioning
--- sidenote: after the expansion ---
it should be added that after a 'string' is permutated, say it goes from 15
characters into a million character / word combination space, that is in a
precisely bounded model that defines the perimeters and parameters of its
expansion and analysis frameworks (if not via infrastructure standards),
that it would seem this is a one-way calculation and as impossible as it
may be to find something inside the expanded set combinations, it would
appear even more impossible (exponentially) to try to put it back into the
original pandoric gift package, the originating /bit set/ even if knowing
the rules and trying to apply them in reverse, to go from 'many' words to
only a 'key string' as exampled here (said: bit set now, as a description
of the set relations between characters, not linear string if it is
"calculated" and worked-out in its probabilistic interrelations).
in other words, could you go from one million characters and various word
combinations back to the original 15 characters? my presumption is no; it
would be even more highly variable in reverse if computed this way, in
that substitutions could occur at some point that introduce extra (noise)
data, such as S=5, which starts to shift meaning out of the original
context. so many of these shifts could occur, 'recenterings' as part of
analysis, that if not having them in their entirety and evaluating a
massive set, even if having the majority correct, that small area of error
could continually expand as noise, veer interpretations off course, and
bound the backwards analysis- which could be useful if having a code that
generates something yet cannot be worked back, yet still is linked to it
or exists within the unpackaged expansion as an entangled conceptual
order, which again may or may not be activated- say like streetlights
turning on at certain hours, or animal migrations under certain
conditions- the data may appear momentarily or in a given context and
then be absent the next
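one way to see why the reverse computation blows up: if even a few
characters admit homoglyph substitutions (S=5, O=0, and so on), the number
of candidate originals multiplies with each ambiguous position. a small
sketch, with an assumed substitution table:

```python
from itertools import product

# Assumed homoglyph table: each character may stand in for several
# possible originals (illustrative, not exhaustive).
AMBIGUOUS = {"5": "5S", "0": "0O", "1": "1IL", "8": "8B"}

def candidate_originals(text: str):
    """Enumerate every string the input could have been before substitution."""
    choices = [AMBIGUOUS.get(c, c) for c in text]
    return ["".join(p) for p in product(*choices)]

cands = candidate_originals("B0551")
print(len(cands))   # 2 * 2 * 2 * 3 = 24 candidates for 5 characters
```

each additional ambiguous position multiplies the candidate set, so the
backwards analysis is bounded by exactly the expanding noise described
above.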
so in this way a 'noisefield' could be inherent in the expansion and
messages could be hidden within it, perhaps even texts or documentation,
that would be reliant on the originating bit set yet disconnected from it
if not available to analyze. yet it may also not carry any signal or
secrets when analysed, such that it is feasibly arbitrary, one of many
frameworks, and unless knowing what the perspective is, it could be a
false corridor and lead to *answers* that are inaccurate or 'counter' to
the actual view yet expand into another parallel stage-set universe (U2)
based on error-reliant pseudo-truth believed to be accurate yet moving
further into falsity (pT-> A=B). in this way, paradox is the limit of
logical reasoning and 'shared observation': if identity is unshared,
fiction can be made out of what can be shared facts in a limited model
based on other rationales, thus exploiting bias and other false
assumptions and their structures
perhaps a way of conveying this is those with a human POV versus antihuman
perspective and how consciousness would be correlated differently both in
observations and interrelations, especially with doubling or imposters
for instance someone may perceive a human as 'racist' for an observation
because of a potential interpretation that is a particular framework thus
bounded reasoning to that viewpoint structure, whereby: humans {ethnicity}
in terms of sets, different skin tones and cultures could be secondary to
being a human being and thus 'human' would be the superset...
human {ethnicity} or
human {race}
This versus the opposite, determining interpretations locally as if the
main basis for primary awareness:
race {humanity} or
race {ethnicity}
In other words, subset {set} if not: subset {superset}, can take on the
interpretative 'universal view' (U) and it may have 'some truth' in a
certain relative framework yet it may not be the larger or overriding truth
of the observation, and this ambiguity needs to be taken into account. say
with my previous instance of a 'song here song there, here a song there a
song, everywhere a song-song': it may be viewed not as a common rhyme but
instead as a device to channel bias about 'yellow people', as if that were
the intent. note also that several such devices were used, not indicating
this viewpoint.
it is to say that multiple readings can co-exist, and there can be some
partial truth (whatever truth exists is true) yet it may not be the total
or overriding truth or even generative truth and instead exist as anomaly
or entangled instance that arises out of noise, as with a crystal ball that
takes in outside patterns and merges them into a common spheric surface.
i think when i wrote that i noticed the potential ambiguity yet the idea
was not that and decided to keep typing versus try to edit and invent
another device for an ongoing theme (about endless communication in terms
of 'linear string' assumptions as if automatically meaningful versus hollow
and how a lot of people can be singing and saying *nothing* in terms of how
it adds up empirically, versus fragmented relativisms which are basically
noise allowing the cultural degradation to continue under that mirage of
depth and hand-holding feel good hedonism, as if it is all that simple).
though counter to this, it is true that certain cultures have names that
align with 'song song' and thus it could be associated with that, even
though that was not the interpretive framework and it could have meaning,
whether or not true or accurate in the specific context, what is true is
true. and thus if a military strategy and its associated ideology involves
emigrating vast populations into other areas in order to subvert them, and
then holds up a politically correct boundary saying it is discrimination to
speak about what is going on in those terms- it probably is going to cause
collisions in discourses like these, where such dynamics are mediated
outside the "official silence" and politics of the everyday societies.
again, humans and antihumans would be my response to this, and humans are
in every culture and activity. and so they are in the superset and not
viewed as 'subset'- though an antihuman ideology that tries to stop such
analysis of human awareness via censoring debate or activities then is
seeking to overpower and bias interpretation via unshared identity that is
privileged and beyond error-checking or correction, thus requiring and
demanding a safe zone of *secure interpretation* that prevents accounting
for these dynamics via peer pressure, etc. essentially:
antihuman {humanity}
as this is institutionalized, effectively:
subset {superset}
if you give any credence to the higher cosmic order involved in this, such
that the earth today is being governed by 'the lower', to which the higher
functioning have had to adapt to allow the takeover and takedown to occur.
Most excellent in terms of these cross- and intercultural dynamics are the
work and ideas of Edward T. Hall, especially in terms of language and
cultural systems, how signs are interpreted differently, yet also in this
variance the richness of truth beyond the limited observations of a given
view. consider the ~50 words for snow in Eskimo languages and so on, as it
relates to this same bit set issue of where someone is interpreting the
data from.
another possible way of conveying the approximate idea:
(humans {ethnicity}) <--> (antihumans {race})
what matters is truth, who serves it and to what extent it is recognized or
subverted for another agenda than the shared set, its values, goals, and
principles- and this aligns also with coherence and its limits in ideology
versus ideas, rigid limits to allowed thought versus freedom and liberty
and serve to life or its annihilation by rules to prop up false universes,
and the larger context for this being cosmic accountability and the role of
crypto and both sides of these issues- that complex /ground/ of operations
that could be binary or paradoxical, actually grounded or only virtually...
so this gets to a basic idea of 'code' -- what do you trust? -- do you
trust some cryptographic standard, do you trust a person or system or
machine, do you trust 'ideas' without having parsed them completely and
thus rely on 'unknown knowns' determined by others, perhaps of unshared
identity and thus exploits-
and to me it has become self-evident that only TRUTH can be trusted, that
it is the basis for trust, and in combination with logical reasoning allows
this trust to be evaluated and accounted for, and anything less is
potentially an issue of 'trusting signs' in its place, mimicry or facades
or ungrounded connections or relations, such that 'signs of friendship' can
substitute for actual friendship, and other types of programmatic
techniques whereby truth is not shared as the basis for interactions, and
this occurs within 'shallow language' and 'wrong assumptions that are
carried onward' within the scaffolding of viewpoints, /errors/ normalized,
that devalue and degrade truth- yet become structures for *TRUST* within
computing and social systems, again the Binary Crypto Regime as ideology
the peak of this institutionalized untrustworthiness made into religion.
Faith in crooked code, crooked administration, crooked relations, crooked
ideas, and accounting in truth and via logical reasoning nowhere found
outside the *protected limits*, thus allowing the antihuman agenda to
proceed friction-free in fallen environments that serve this 'mastery'.
if you trust in people, it is within their truth, the truth of a person not
their falsity or pseudo-truth (unless mistaken) -- it is a critical
distinction and of wide-ranging consequence. so too, ideas- the truth of
ideas versus their approximations and warping and skew and distortion and
biasing- to defend and uphold that is a corruption, a compromise that is a
degradation and deteriorates whatever truth exists- thus, compromised code
as if secure also compromises the person wrongly believing its integrity
without being able to evaluate it in truth themselves or relying on others
who are untrusted or of unshared identity to guarantee ideas through trust
of technology or 'trust me' models- which if 'shared awareness' is not the
same or exploited, leads to exploits. whole OS platforms built upon this
and similarly exploited, though in a realm of double/triple schemes, etc.
so FAITH in TRUTH and in the truth of people and in the truth of ideas,
yet accounting for this truth, understanding it, being able to grapple
with it, versus making and building assumption upon unreliable assumption
that could rest upon a threshold exploit (unknown unknowns) that bounds
interpretation; and yet another could be decrypting everything from the
start because it is built that way, just not understood by the users. thus
'trust the algorithm' without dealing with 1:1 truth allows the A:B
exploit its tyrannical freedom to overlord and invisible-hand events
one-sided. for
instance, if time travel there is a high likelihood the present day would
be governed from the future context and its administration over the
present and taking this into account would be one approach and ignoring it
another and thus 'standing lies' could be seen as part of that SCIFI
framework in that actions do not add up in normal terms in day to day
policy and why is it that situations are being run off the cliff regularly
in parallel if not some 'beyond mainstream' policy decisions making sense
of shared madness
so if the truth of ideas is not dealt with, and the truth of people and the
truth of machines and technology, and the truth of code itself, and not
using actual *logic* beyond the binary evaluation, how much truth is there
really in these systems and relations by default of ignoring the reality,
and how much of the surreality that exists is based on the shared FALSE
framework that allows the oppression and oppressors their advantage... thus
the role of the logic bomb in taking down the kernel of the corrupt belief
that sustains the lie, and the code that stands in for truth as its sign
yet ungrounded, an entire standing empire built upon a too-simple ideology-
now that is a security nightmare waiting to happen, for those on the inside
and with vested interest in seeing that it and their lifestyle continues,
say versus feedback from the oppressed and exploited... payback time.
--- more on bit set dynamics ---
here are some additional examples of various approaches; you could use
numbers in place of letters or vice versa, perhaps co-existing, and thus
flip and unflip sets...
A B X
1 2 24
Such that A is the first letter of the alphabet, B is second, and X is
twenty-fourth. it could be taken further and sent back into letters by a
second pass:
1 2 2 4
A B B D
Optionally, from these same numbers it could instead be reformatted in that
the original 1 2 24 could become:
12 24
L X
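the re-tokenization above is exactly why such a scheme is ambiguous: the
digits of '1 2 24' can regroup as 1|2|2|4 (A B B D) or 12|24 (L X), among
other parsings. a small sketch that counts every valid regrouping of a
digit string into letters 1-26:

```python
def letter_parsings(digits: str):
    """All ways to split a digit string into numbers 1-26 (A-Z)."""
    if not digits:
        return [""]
    results = []
    for width in (1, 2):
        head, tail = digits[:width], digits[width:]
        # Valid group: full width, no leading zero, in the range 1-26.
        if len(head) == width and head[0] != "0" and 1 <= int(head) <= 26:
            letter = chr(ord("A") + int(head) - 1)
            results += [letter + rest for rest in letter_parsings(tail)]
    return results

print(letter_parsings("1224"))   # ['ABBD', 'ABX', 'AVD', 'LBD', 'LX']
```

so the same numeric bit set carries five co-existing letter readings, and
which one is 'active' depends entirely on the grouping rule shared between
sender and receiver.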
Another approach could involve flipping the bit set characters prior to or
after being expanded into massive constellations of data of the
interiority, and thus layers of overlapping interpretation could exist
from the start, which could be part of the symbolic algorithm as it were:
h i o x / x o ! y
there is a potentiality involved that could result in different expansions
and keyspaces and so it is an issue of what these are and how they could be
determined or shared- perhaps in the bit sets themselves or other keys or
it could be arbitrary and truly crystal ball 'seers' in given instances,
altered states that may align or be shared or unshared yet deliver info
also: this could occur without software in a massive distributed model yet
with software tools it could be something more in-line with crypto systems
of today and massive computation that probably could find ways to encode
and embed programs or documents via such techniques versus SMS equivalent
peer networking. say for instance the following bit set were evaluated:
U SI 20 1LL 2Z O X LS
Perhaps it expands into a thousand-character document, and that 'X' symbol
were notation that it should be twisted 45° and read at an angle, dropping
out certain symbols as noise while others would be retained by symmetry or
by having multiple readings (say: N S Z in terms of rotation &
reflection). In terms of the mirroring of language, reading upside down is
a useful skill, as is being able to hold a /superposition/ of
alphanumerics and those substitutions in mind, parsed in real-time
evaluation- perhaps relevant, perhaps not. yet a computer system could
likely mine such data very effectively given rules, and decipher or
encipher and puzzle data back into jack-in-the-boxes that could be reused
as structure, or so it is hypothesized
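the symmetry filter is easy to mechanize. a sketch using a small assumed
table of glyphs that survive a 180° rotation (the table is illustrative
and partial, not a standard):

```python
# Glyphs that map to a valid glyph under 180° rotation -- an assumed,
# illustrative table (strobogrammatic characters), not a complete one.
ROTATABLE = {"N": "N", "S": "S", "Z": "Z", "O": "O", "X": "X",
             "H": "H", "I": "I", "8": "8", "6": "9", "9": "6"}

def rotate_180(text: str):
    """Rotate a string half a turn; None if any glyph has no rotated form."""
    out = []
    for c in reversed(text):
        if c not in ROTATABLE:
            return None   # this glyph would drop out as noise
        out.append(ROTATABLE[c])
    return "".join(out)

print(rotate_180("NOSIX"))   # 'XISON' -- survives the rotation
print(rotate_180("69"))      # '69'    -- a strobogrammatic pair
```

a mining pass could then retain only the substrings for which such a
rotation (or reflection) yields another valid reading.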
continuing with approaches, there is a certain interesting correlation
between certain letters and logic notation, such as p q b d, which in
their relations demonstrate mirroring; this is part of the structure of
numbers and letters that can indicate function or have distinct features,
as if puzzle pieces that can and cannot fit together in given parameters
of co-alignment
p q p d
b d b q
The ambiguity of this, yet also other options...
b = |> <| = d etc.
perhaps even |> / as a trigraph as 'twist b, thus q'. yet any of this could
also be compressed or have multiple meanings depending on perspective and
the framework/s for evaluation.
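the b/d and p/q relations can be modeled as a horizontal mirror. a sketch
assuming a small mirror-pair table (partial, for illustration):

```python
# Horizontal mirror pairs among lowercase letters (assumed, partial table).
MIRROR = {"b": "d", "d": "b", "p": "q", "q": "p",
          "o": "o", "v": "v", "w": "w", "x": "x"}

def mirror(text: str):
    """Reflect a string left-right; None if a character has no mirror form."""
    out = []
    for c in reversed(text):
        if c not in MIRROR:
            return None
        out.append(MIRROR[c])
    return "".join(out)

print(mirror("pq"))    # 'pq'  -- reads the same in a mirror
print(mirror("bow"))   # 'wod'
```

vertical flips (p/b, q/d) would be a second table of the same shape, and
composing the two gives the 180° rotation discussed earlier.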
the corollary to this is potentially sequencing of data structures though
again -- not linearly. multilinearly, nonlinearly, as permutative sets, the
total ecology of combined and related dynamics, CERN-level and weather
forecast modeling intensities... and still there may be infinite unknowns
even within the structural data if not having the keys to unlock it
another foundational aspect of this is character overlap and subtraction
and addition via their combined state; it is very difficult to convey
without images because it can be high resolution in a standardized context
(16-segment display, square UNION JACK symbol of the ancient mastercode,
HIOX), and it is logic-dependent in certain instances to figure out...
first hint:
b 8 d
The numeral '8' essentially can be a character in /superposition/ whereby
while the 7-segment is the seeming source for hexadecimal letters, it has a
potential substitution of: A, B, C, D, E, F, H, J, L, O, P, S, U.
Thus the words: bad, bed, bod, bud, in addition to other acronyms could be
seen as a potential, held within the shared overlapping structure of 8. it
is thus a *wildcard* bit, in terms of alphanumerics. The HIOX or union jack
symbol is 26 letters + 10 numbers for a single bit, by comparison. so
imagine the set evaluation of: b88d, in terms of including the hidden 13
other letters and how this may influence anagrams or other permutation
combinations.
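a sketch of the '8'-as-wildcard idea: expand each '8' into the 7-segment
letter set quoted above and keep the candidates that appear in a word
list. the dictionary here is a toy stand-in, assumed for illustration:

```python
from itertools import product

# Letters whose 7-segment renderings fit inside the fully-lit '8'
# (the substitution set quoted above).
EIGHT = "ABCDEFHJLOPSU"
TOY_WORDS = {"BAD", "BED", "BOD", "BUD"}   # toy dictionary, assumed

def expand_wildcards(pattern: str):
    """Treat each '8' as a wildcard over EIGHT; yield all expansions."""
    choices = [EIGHT if c == "8" else c for c in pattern.upper()]
    return ("".join(p) for p in product(*choices))

hits = sorted(w for w in expand_wildcards("b8d") if w in TOY_WORDS)
print(hits)   # ['BAD', 'BED', 'BOD', 'BUD']
```

with the full HIOX symbol standing for all 26 letters plus 10 digits per
position, the candidate space of something like b88d grows accordingly
(13 × 13 middle combinations in the 7-segment case alone).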
In a similar way, you could have a letter 'm' and break it into two letters
'n' or invert them, 'vv' which could be mirroring 'm' or 'w'. or turned
sideways and becoming 3 or E, that is, M or W. or c & c in a given view,
depending on the construct. consider another approach to overlap:
3
E
= 8
And further, into the larger alphabet...
R
L
= 8
This approach would include retaining the overlapping elements, thus
additive; whereas if subtractive- combining the number zero (O) sans null
slash (0) and the capital letter i (I), assumed to have full upper and
lower bars at the top and bottom of the vertical, would equate to an
always-equal symbol, oriented vertically. what is dissimilar is retained.
this abstraction is in part required by the tools, unable to communicate
the most basic information for lack of standard formatting and basic
display of 'code' as it actually exists and is used. example of
subtractive:
R
L
= 3
Perhaps this aspect of superposition is closer to that of /spin/ for
certain character features, whereas for a large capacity symbol such as '8'
or HIOX it is a much more vast range of set potentials to unpack via these
interrelational dynamics.
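the additive case can be checked mechanically on a plain 7-segment display
(segments labeled a-g in the usual convention): the union of '3' and 'E'
lights every segment, i.e. '8', while the subtractive reading keeps only
what is dissimilar. a sketch with standard 7-segment encodings:

```python
# Standard 7-segment encodings (segments a-g) for a few characters:
# a=top, b=upper-right, c=lower-right, d=bottom, e=lower-left,
# f=upper-left, g=middle.
SEG = {
    "3": set("abcdg"),
    "E": set("adefg"),
    "8": set("abcdefg"),
}

additive = SEG["3"] | SEG["E"]       # union: retain all lit segments
subtractive = SEG["3"] ^ SEG["E"]    # symmetric difference: keep dissimilar

print(additive == SEG["8"])          # True: 3 + E = 8
print(sorted(subtractive))           # ['b', 'c', 'e', 'f']
```

the 16-segment (HIOX) cases in the R/L examples work the same way, only
over a larger segment set.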
--- messing with HASH ---
it is still unknown if the parallel structure of a bit set as the string is
comprehended, such that [xyz] could function as a string yet also co-exist
in parallelism as [x|y|z] via set recombinations and expansions. [x] could
be anything, say 'the internet', [y] could be pi/1.666 and [z] letters A-H.
likewise, [x] could be symbol 1, say a purple dinosaur, [y] could be '8'
and [z] could be symbol 2. the slot machine really is vastly variable and
could involve multiply turning bounded infinities that tally in custom
circuits (outerlimits of enigma), and the 'string' could also be infinite,
such that both the bits and bit set could move towards infinity, such that:
[infinity^1][infinity^2][infinity^3]...[infinity^N]
how could any machine possibly rationalize all of that? impossible. and
what if most of it is *noise*. it leads to interesting conditions for ideas
and algorithms, new territories perhaps, places without any maps or
infrastructure within existing _known technology. randomness, yessir.
like the Moholy-Nagy painting (ref.typo error) in a previous post, there
could be calculative dimensions involved in ordinary language, through its
transmutation and reimagining in its 'other variability'. imagine colliding
[hash 1] vs. [hash 2] and arriving at some kind of structuring... what if
it is the same 'bit set' that is collided with itself, yet transformed
from bit set A to bit set A'. thus A x A' = new bit set
u x w h ==> y m x u == ? ? ? ?
Thus it would matter if it is additive or subtractive for what the new bit
set would be and what would be expanded in terms of its interiority.
so what if the english alphabet is 26 letters, A-M/N-Z, and you run half
of these against the other- what kind of structures result? and what if
you take half of the alphabet and mirror it above itself, then
add/subtract or flip, or recombine in various ways, or in two-letter sets,
adjacent and non-adjacent-- suddenly it is off the map, unexplored
symbolic territory
perhaps, and that is seemingly something else entirely in terms of meaning;
and for the naive like myself, how might this relate to various encoding
schemes that use character sets and symbols for data transmission, are they
substitutable or could they even function as a deep data code (signage
going into and through data instead of only across as linear strings), and
what kind of access/entry and egress and circulation may this allow, open
up or prevent in terms of traditional data models and also processing. it
would seem in some way related to a compass with magnetic north and true
north, though that could be widely variant and easily polarity could be
lost and someone could be heading south or 'up' or 'down' without knowing
it, in flatland terms of bounded interdimensionality
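running A-M against N-Z is, in one reading, just the ROT13 pairing: each
letter in the first half maps to its partner thirteen places on, and the
mapping is its own inverse. a minimal sketch:

```python
import codecs

# Pair each letter of A-M with its partner in N-Z (the ROT13 involution).
pairs = [(chr(ord("A") + i), chr(ord("N") + i)) for i in range(13)]
print(pairs[:3])   # [('A', 'N'), ('B', 'O'), ('C', 'P')]

# The same pairing used as a cipher: applying it twice is the identity.
word = "cypherpunk"
once = codecs.encode(word, "rot13")
print(once)                            # 'plcurechax'
print(codecs.encode(once, "rot13"))    # 'cypherpunk'
```

the mirror-above-itself and adjacent-pair variants mentioned above would
be different pairings of the same 26 symbols, each generating its own
structure when run against a text.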
also, language translation, western european languages, umlauts and other
details related to extra-textual cues or clues perhaps
venezia & venice
what would the different keyspaces expand into if only some of the
structure is shared, yet the idea is essentially identical, such that the
signage maps onto the same concept, empirically, via translation-- is it an
issue of distortion, cultural frameworks of language that format structures
and how might these be bridged or is it a limitation to the model, thus can
bit set dynamics occur only in A:A frameworks or can A1:A2 also function to
a threshold limit, until A:B is reached, or may it also function at A=B,
insofar as the structure of A is retained within the structure of B:
A <---> B{A}
and further, consider if the 'translation' was a particle collision and
thus the probing of molecular structuring shared and unshared between the
two concepts, attaining higher resolution of their shared dynamics via
interrogating their relations, grounding and mapping and testing of
assumptions into a common empirical framework- as if language is the basis
for conceptual infrastructure
--- note on correlations ---
having written about this elsewhere and forgotten a specific clarification,
pattern recognition itself is about this sign=sign matching, yet assumes
that the match itself equals truth and this is not the case, it is a faulty
assumption and an ideology that the truth of language is in the sign itself
and not what it is referencing. thus an apple would be the word apple
versus an apple itself and the 'idea' of what an apple is, its potential
superposition of meaning. this is the binary plague of ideas everywhere
evident in technology-- the potential of technology now making toys for
adolescent adults based on limited viewpoints, goals, and shared purpose.
mathematics begins with patterns. understanding and deciphering them.
yet it is this recognition that must occur in an A=A framework for it to
be actually true (1), and thus the refinement of ideas, removing falsity
from pseudo-truth, and using hypotheses prior to declaring theories, as is
the empirical methodology. and so TRUTH is foundational to mathematics,
knowing what is true from what is false, and relating this to an A=A and
A=B evaluative context, which is Plato/Aristotle and others of course.
and yet if a mathematician arrives at an equation, or a person proposes a
universal idea, it is not just that they have found a "pattern" that seems
to fit, it is that it must correlate with other structures and patterns in
their known integrity, that is "truth". so there is a relation in truth for
the evaluation of truth. (thus: binary assumption of truth extends all the
way to all subsequent assumptions based upon it).
so testing hypotheses and ideas and equations and finding flaws is the
vital process and obligation of error-correction and refinement and the
changing of the model or hypothesis or pattern conceptualization so as to
better align with what is known to be true, verifiably so (1=1), versus
relying on something known false and relying on its structure (1=0) for
further such 'ideas' and 'equations' as an error-based structure and
viewpoint, which can get religious even, and is hugely invested in the
ideological, where questions have long ago been answered and unasked since
even when facts and evidence refute the claims. thus limits, boundaries
allow even faulty pattern-recognition to persist by keeping out
error-correcting observations. and 'empirical' observation is thus limited
to only those facts and views that support the claims, ideas, equations.
this is why logic is so critical to determining and accounting for truth
and evaluating 'structures' that supposedly carry its /momentums/. the
trusswork of ideology that may be corroded to such a degree to collapse yet
carry the weight of a society, these areas of weakness offlimits for any
interpretation, protected by retaliatory violence to allow virtualized
viewpoint to continue, while endangering all who rely upon the error-rate.
the 'binary logic' that is falsely absolute yet unaccounted for can thus
"exist" in a bubble civilization yet only through dictatorial tyranny that
is against its evaluation, most especially in logical terms, these also
being the most self-evident and only way to actually describe what is
occurring as it occurs conceptually, structurally. everyone knows, yet the
'reasoning' itself is made off-limits and thus the incapacity to say for
lack of a way of accurately 'recognizing' and accounting for the events.
in this way a [sign] could be interpreted one way by a binarist and another
by an observer of 3-value or N-value paradoxical logic. the pattern that is
encountered may appear identical yet its mapping into structural frameworks
of truth could be entirely different, and more accurate than another.
so it is an issue of intelligence, how these signs are interpreted, and in
a more general sense [patterns]. and thus C.P. Snow and the Two Cultures
allows consideration of the mathematical evaluation versus linguistic, or
'literature' as a basis for this interpretation, leading to numbers and
equations in the former and alphabets and writing in the latter. though
perhaps with mathesis that division related to pattern recognition is of a
previous era, and instead 'literacy' has moved closer whereby concepts of
"programming" (ala Lewis Mumford) have "symbolic processing" as a skill
that present-day populations- the once-future populations- were predicted
to need in order to survive:
symbolic processing ===> [patterns] (signs & symbols)
thus, patterns could involve both mathematics and language, equations and
words, and some overlap could occur or exist as it relates to the nature of
thinking itself-- such that, while 'binary' views may be easily taken on as
a quick route to decision-making, looping probabilistic reasoning via
multiple hypotheses, autonomic and conscious, is how the nervous system
and brain actually work, in a context of navigating absolute truth and
falsity, including constantly refining, adding, and querying data against
contingent models, not simply discarding what is unlike as thus 'false'.
in this way, an approach to thinking that is ungrounded in experience and
natural observation itself can become religious dictate via indoctrinating
people into an inaccurate framework and basis for shared awareness and
relations that are A=B in terms of pattern recognition, customizing POVs to
a warped, skewed pseudotruth that tends to zero, as a compass and guide.
The kernel is wrong, faulty, yet cannot be error corrected, power says,
government enforces, institutions crushing the facts, observers, denying
and censoring and outlawing illegal patterns, universal perspective.
Thus there is truth and falsity where A=A and A=/=B reside, and this is the
basis for establishing logic. that is, A=A is the pattern match, and A=B is
the false match, the errored idea.
T/F <--> logic
And depending on the 'logic' used for evaluation, that will then influence
what is and can be observed, such that:
T/F <--> {binary|paradoxical}
T/F <--> binary --> 1/0
T/F <--> paradox --> 1/N/0
such that, the logic then influences the observed patterning:
T/F <--> logic <--> [patterns]
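the 1/N/0 scheme above resembles existing three-valued logics; a minimal
sketch, assuming Kleene's strong three-valued connectives as one standard
formalization of 'paradoxical' truth values (my choice of encoding, not
anything specified in the text):

```python
# a minimal sketch of 1/N/0 evaluation, assuming Kleene's strong
# three-valued logic: 1 = true, 0 = false, N = unknown/indeterminate.
# the middle value is encoded as 0.5 purely so min/max order the values.

T, N, F = 1, 0.5, 0

def not3(a):
    return 1 - a          # NOT flips true/false, leaves N fixed

def and3(a, b):
    return min(a, b)      # AND takes the 'least true' operand

def or3(a, b):
    return max(a, b)      # OR takes the 'most true' operand

# binary logic falls out as the special case where N never appears:
assert and3(T, F) == F and or3(T, F) == T
# with the middle value, some questions remain open:
assert and3(T, N) == N    # true AND unknown is still unknown
assert or3(F, N) == N     # false OR unknown is still unknown
```

Kleene's tables are only one of several three-valued systems (Lukasiewicz,
for instance, differs on implication); the choice here is illustrative.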
In this situation 'code' could be the patterns that are assumed true and
accurate for sake of being operational in a binary framework, whereas with
a paradoxical evaluation this truth is not presupposed, instead questioned
because it is and always remains contingent- a hypothesis that requires
testing, challenging, verification, validation and continual error-
correction-- else, via entropy itself, the inherent fallibility of partial
observers within limited frameworks would be taken as if truth itself,
absolute, even though inaccurate, faulty, by lack of godlike omniscience.
logical reasoning is what this testing and error correction are all about
yet if it occurs in a binary framework the observational bias continues to
see itself as 'correct', infallible observers because they can equate the
sign they believe is something with the thing itself- apple is the word and
not the larger external reference- and thus 'signage' stands in for and
begins to mediate truth irrespective of its external validation, within a
false empire of signs and symbols-- mathematics and language that becomes
and is detached from reality, its own virtual world of anything-goes...
that is, it may in part connect with truth, yet be assumed 'all true' or
absolute, by selective observation based on power dynamics for reasoning,
which gets into brute force peer pressure to engineer truth via bullying
dynamics, censorship, violence against unshared views, blacklisting,
attacks on equipment and tools, etc, to shut down the other observations
that error check and correct and *falsify* the believers code, who make
their bread on lies and deception, knowingly or not. thus:
truth <--> logic <--> [math/lang] <--> [patterns]
and in this same way, the observers and perceptual framework could exist
in a paradoxical approach, more towards mathesis in terms of higher
literacy, whereas the binary is by comparison illiterate to everything
outside its limited and controlled framework...
truth <--> logic <--> [code] <--> [patterns]
And all of this about 'reality' and issues of shared observation and
identity, shared and unshared, that allows 'truthful' and false patterns to
exist and /represent/ ideas, events, people, accurate or not. mathematics
and language are both reliant on logic and truth, and the verification of
pattern recognition is in A:A and A:B dynamics, which are evaluated and
justified via truth and logic, and a too-simple approach is a basis not
only for structural insecurity, it can be a planned exploit of civilization
and thus as the "language" or discourse continues in confusion by assuming
A=B, many nefarious activities are allowed within that inaccurate context.
true evil, and its justification on the normalization of these dynamics and
the mediocrity that is allowed and allows for it, as a 'consensus view'
that is based in non-truth, only ideologically accurate to a partial view
(pT=>0, whereby B>A).
that is a context for crypto then, because 'secret communication' and
'hidden writing' can occur both within mathematic and linguistic structure
that is in that same context, including an undifferentiated logic, whereby
the binary may exist in a bounded realm that the 3-value or N-value could
likewise inhabit and move in and through and outside of, via its other
dimensionality, yet more accurately account for absolutes (1 and 0) due to
a weighted, more accurate evaluative framework for shared empiricism than a
rough, coarse, inaccurate all-or-nothing approach that is knowingly false
from the start, in that it is unreal, detached from limits of observation
and made into a fiction, and thus fictionalizes everything as its result.
in other words: [patterns] can be in superposition, yet not in a binary
mindset because they cannot be evaluated this way, as /ideas/. instead it
remains shallow and on the surface of the sign, too simple pattern
recognition whereby the belief that A=A is its own confirmation as long as
the pattern matches, the image is believed to be what it is, then it is by
default correct and true (1), due to the sign itself, not its grounding.
truth <--> logic <--> crypto <--> [patterns]
Thus if considering ideas in this context, ideas of patterns, ideas of
crypto, the models of what is going on *must* be accounted for within truth
and within an accurate logical evaluation, and a binary approach (via logic)
is not this-- it is like playing paddy-cake paddy-cake to get the results
and seeing the layer cake as an end in itself, its own verification and
self-validation, if not narcissism-- i wear the badge, therefore i must be.
in other words, as is mentioned many times over a crypto system is not
inherently or necessarily a secure system just because it is cryptographic
though some may relate to [crypto] as a sign that equates with it without
getting into the error checking of ideas, truth of models, and assume or
allow models that cannot be tested and thus, limit/exploit scenarios by
this approach, where crypto functions against the role of crypto in society
and 'the sign' becomes its own antithesis, whereby 'to encrypt' is actually
'to decrypt' in another hidden context-- and that logically, when only
partial truth is active, this is probable, beyond 50% likelihood given that
several false assumptions may be relied upon that the truth is only minor
about what security may exist- say with factoring outside of quantum or
other computing architectures, which become dogmatic, faith-based beliefs
that everyone knows are bounded observations- thus contingent, unless you
have access to the deep underground labs or other galaxies resources.
it is the same situation with architecture and anti-architecture or those
who look human yet have antihuman agendas (bladerunner-esque), in that the
sign of something can be camouflage, a ploy, a substitute or stand-in that
is not the thing itself (mimic) and references another unshared grounding
(say A-> B) versus what is assumed (A-> A'). so, the signage could be
pointing in a given direction, yet going in the opposite, and in this way
instead of modeling truth and aligning with it (A=A) it could be subverting
and moving away from it (B), yet appear static or "said to be moving
toward it via language and reference to other signs" (A=>A' actually B=>B').
in this way, [crypto] can move towards falsity in a binary framework, yet
with paradox this falsity could be multiple, beyond those controlled by the
binary approach and its /controlled signage/, beyond the limited threshold
and its particular interpretation, thus feasibly invisible, incalculable.
this would require reframing thesis-antithesis (and synthesis) in a dueling
context of binary and paradoxical logic, which itself would be the compass
and basis for a kernel, from my naive view, for the core of the dynamics.
another way of saying it: you can have all the code you want, yet if it is
not grounded you're fucked.
--- conceptualization of hidden interiority --
i found these paintings online that to me indicate a conceptual correlation
with visualization of the expanded bit sets in their venn context...
# Ordered Chaos 12 & 14
http://www.mnartists.org/work.do?rid=339132
http://www.mnartists.org/work.do?rid=339132&pageIndex=90
# Tao Of Physics, Quantum Space
http://www.well.com/~hendrix/TaoOfPhysics.html
--- decrypt for feedback ---
_ |v JL 3 p\ w Z 7r : 2 6
I N T E R E S T i N G
John Gilmore assesses NSA disclosures to EFF on encryption,
security, operations, more:
NSA FISA Business Records Offer a Lot to Learn
http://cryptome.org/2013/09/nsa-fisa-business-records.htm
// apologies- yet another attempt at error correction of a previous
statement and clarification.
quote: Subvert logical reasoning, disallow it, including freedom of speech
and thinking and censoring and controlling what facts are allowed -
controlling POV - then the opposite occurs: A|B ---> B, such that T => pT
This is the code exploit of the Binary Crypto Regime, where B(0)=A and
pT(0)>T(1)
--- clarification on computational approach ---
I made an error in description of processing in the context of infinities,
stating one infinity to the next is evaluated as if a serial approach when
instead this would be nonlinear and require massively parallel processing,
along with serial evaluations in a looping evaluative heuristic - testing
against a hypothesis or model, 'running code' or living code' as it were,
versus a static one-time interaction of data and algorithms, instead more
like a situation of intelligent life in a bounded context as if data
aquarium or code planetarium.
the reason for this parallelization relates to considering all the combined
permutations in terms of probabilities, and thus [x][y][z] as variables are
not necessarily, seemingly, about an algorithm that reveals a structure
that helps move from x->y->z by some mathematical structure, or so that is
my naive guess, such that if you use conventional crypto approach #1
something may be revealed between these that matches an equation pattern,
provides order within the chaos of variability, until legible,
intelligible.*
In other words, instead of XYZ being a serial number that could extend
linearly onward toward infinity, that is: [xyz] ...[∞], and that "string"
is a horizontal number or code, that can in particular be related perhaps
to binary on-off processing in a highly efficient manner, processing and
computational speed and largest prime numbers as context-- instead, the
assumption for this same situation in another crypto framework is that it
could be happening 'vertically' like a slot machine that runs to bounded
infinities (largest primes or not), within each variable, and thus [xyz]
may not have a discernible linear structure or overall equation that makes
sense of the resulting 'horizontal' string. such that:
[x][y][z] => [n¹][n²][n³]...[nⁿ]
whereby [n] is variable and could be anything- a number, a function, a
calculation, null, its own computation. And in this way, each variable
could tend towards infinity or its own structuring, within the string,
whose length is not so much the issue as the difficulty in resolving its
total structure, especially linearly, such that [n¹²³] would not be
decipherable running algorithms across its horizontal string and instead
solving for each variable or say grouped variables in the string, eg.
[n¹][n²⁻³][n⁴⁻⁹]
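the 'vertical' reading above- each position in the string ranging over its
own bounded set rather than a fixed {0,1}- can be sketched as follows; the
per-position alphabets are arbitrary illustrations, not anything specified
in the text:

```python
import itertools

# a sketch of the 'slot machine' reading: each position of the string
# has its own alphabet, instead of every position being a binary bit.
positions = [
    "01",                          # [n1]: an ordinary binary bit
    "0123456789",                  # [n2]: a decimal digit
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ",  # [n3]: a letter
]

# the total keyspace is the product of the per-position alphabet sizes,
# not 2**length as it would be for a pure binary string
total = 1
for alphabet in positions:
    total *= len(alphabet)
print(total)  # 2 * 10 * 26 = 520 possible strings

# enumerating the first few members of the space:
for combo in itertools.islice(itertools.product(*positions), 5):
    print("".join(combo))  # 00A, 00B, 00C, 00D, 00E
```

no horizontal algorithm relates one position to the next here; each column
has to be solved against its own alphabet, which is the point of the passage.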
Thus, while i have no actual idea of how crypto and binary code relate in
terms of encryption methods and decryption, it is assumed this approach
remains serial and directly interrogates the serial string to reveal its
structure, across various formats of the code (various programming,
encoding and other data formatting schemes). Thus, a binary string would be
an example, 10100011001010001010101010010101, whereby to solve for
[xyz...n] would involve finding the overall linear structure that provides
for such linear organization, say assuming it is encrypted code
*)S)*S*))SA&*S&**S()S*S)aAUIHNL*0, and therefore the assumption would
remain that each variable is related to the next in some 'coherence' and
solving one part or layer may reveal another, such as
HSKSLLILHSILALSWLWLSDUI and thus the string [xyz] is made intelligible by
this coherence within the linear string, across that massive horizontality
(very large streams of data that contain data and programs and messaging).
Whereas for a paradoxical logic approach, each variable could itself be
'many' in place of a single bit- or the boundary of the single bit could be
[N] and move towards a bounded infinity, a mathematical function, or other
calculation in that same location.
disclaimer, stating the obvious, i have no idea what this is in terms of
applied cryptography, there is tremendous gap between these statements and
actual code, though to me the approach is much more accurate as "thinking
code" that involves human processing via logical reasoning, parallel and
serial processing, and thus the very idea of a string of code in that view
could also function as signal in noise or even absolute truth, in terms of
messaging. and so it is obvious 'binary thinking' is not like everyday
evaluation in the sense that there is grey-area to mediating events, a
pause to decision- yet within this pause, bounded infinities of hypotheses
can be queried (referencing previous instances in stored or external
memories) that then influence the tallying of the response, which most
likely will be weighted between 1 and 0, unless purely ideological.
Thus- *conceptually*- to consider "code" in this human context, of a living
breathing idea that is grounded in empirical truth in a shared human
viewpoint, that is to be shared as information, via exchange, it is more
grayscale than 10110101010100001, in terms of language and how thinking
functions, more about looping and weighting of variables than having a
*single correct result* when there can be several overlapping or
contrasting interpretations *at the same time*. So imagine if the binary
string had each bit that was variable instead in a 0-9 scale of weighting
the evaluation, such that 10927238292340246.
This moves the [binary] string into a fragmented string of variables, more
like analog computation [1][0][9]...[6]. In this way it is to consider the
'bit' as N-variable, and thus what if it were the alphabet instead of
numbers: 26 letters possible for each bit: USOWHLSELNSQAHBVY
the issue being that like a slot machine, those [N] variable bits could
tally up any potential letter of 26, or +10 numbers with alphanumerics, or
add lower case and punctuation or symbols and suddenly a small 'string' of
data could involve huge interrelational structures that may or may not be
related across the horizontal span, depending on how it is constructed via
algorithms and conceptual formatting. Maybe this already is the way
transformed code is achieved with taking a certain sized content- variable
[x], and then transmuting its entirety into a string or stream of
obfuscated data that must be 'worked at' to decrypt or be translated to
make use of.
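the blow-up described above is plain exponential growth in the alphabet
size: for a fixed-length string the keyspace is alphabet_size ** length. a
quick comparison, with the length and alphabet sizes chosen arbitrarily:

```python
# keyspace of an n-position string over alphabets of different sizes:
# 2 (binary bits), 10 (decimal digits), 26 (uppercase letters),
# 95 (printable ASCII). length 16 is an arbitrary example.
length = 16
for name, size in [("binary", 2), ("digits", 10),
                   ("letters", 26), ("printable", 95)]:
    print(f"{name:9s} {size:3d}**{length} = {size ** length}")

# even a modest widening of the per-position alphabet dwarfs the binary
# keyspace: 26**16 is roughly 4.4e22, versus 2**16 = 65536
assert 26 ** 16 > 2 ** 64
```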
The seeming difference would be computationally, how this relation exists
in processing, in terms of hardware and software, though also thinking,
programming. Because what if there is a limit to these transmutations that
is forced into a binary 1/0 and thus bounds these infinities to only
certain algorithmic space, or even computationally, that such numbers
cannot be adequately computed and thus *do not exist* as calculations
within machines and software approaches, crypto perhaps especially, when
the security they would provide would be unfathomable in terms of existing
brute force calculations of 'linear' patterns.
my speculation is of an unknowing of applied cryptography and computer
programming yet knowing of logical reasoning and empirical thinking,
awareness, and how the two are ideologically at odds in approach in terms
of basic assumptions. thus within my condition of 'illiteracy' there is an
attempt to share an idea (pT) about a shared situation from an outsider
vantage, with those of highest literacy of applied code, yet within what to
my observation is a flawed idea and based on false and inaccurate
assumptions, in particular the primacy of binarism for security when this
nonlinear/multilinear computation (parallel & serial) would easily defeat
it.
such that it is not about strings and instead parallel sets: [x|x|x|x]...[n]
as the [variable] yet this may not be coherent in a horizontal algorithm to
solve, it may not have 'rationality' across, from one digit to the next,
revealing its hidden structure. instead, randomness would be inherent
instead of woven into the code, it would be more revealing information out
of noise structures than putting information into noise that is bounded and
can be shaped into structure. in this way also, noise could have structure
yet not lead to decryption, it may be a false corridor within the ever
expanding maze.
it is that [N] variables each are in superposition, not static by default,
finite and absolute, and instead 'truly variable', unbounded to a certain
extent (infinities within infinities across infinities via nested sets).
the conceit or test of the heresy would be a 256-'bit' quantum computer that
solves AES-256, though if it were a binary string this could even be
trivial, versus say [N]-bit, which seemingly could take *forever* to
evaluate, via running, looping code evaluation and a shared empirical model
that develops alongside, out of and through the technology as a 'thinking
machine'-- which, the more it is like the human brain, the more likely the
messaging could be made sensible via existing concepts and structures to
test against, evaluating patterns and looking for correlations. in that
context, a three bit [N]-variable string of code could probably defeat all
computing power today, especially if large expanses were allowed, numbers,
letters, symbols-- it would be unsolvable potentially, extremely probable.
Largest primes would be a minor detail, another variable seemingly in such
a context, due to its potential for incoherence and complexity.
likewise, this [N]-bit approach for random number generators, yet why not
random outside of 1/0 as a noise field, generating strings via a two
[N]-variable string, just let it run and tap that, with or without
structure, would it even matter. in other words: take any two ideas, any
two signs or symbols or colors or whatever, and relate them and tally and
extend this as a process. that is proto-language in a nutshell, this the
crazy nut cracked open yet beyond the insanity of my own incapacity to
communicate and flaws in understanding-- there is something about this
approach and basically observation that has *coherence* that is absent in a
binary approach and serial algorithms-- because that is not how people
think or communicate, it is N-dimensional, geometrical, looping. and
processors and code and software at present cannot model this, allow for
it. and that formats reasoning, perception, what options are available to
share ideas and evaluate them, and we are stuck in binary because it is
enshrined both in technology though also in institutions-- it is the dead
static code of shared ideological non-thinking that is pushing
decision-making and actions towards its deterministic end game, which is a
onesided machine-based value system, devoid of life, nature, and humanity,
except insofar as it profits its own continuing automated development and
further extension.
so the gap between my illiterate views and the actuality of implemented
security code by those literate is one aspect, though another is my
literacy in thinking code and the illiteracy of thinking within
foundational technology, its infrastructure, and the result of this, which
requires a world like it is, and relies on bad code and ideas to allow for
it. thus an audit or accounting of the situation, an attempt to get across
the idea that there is a model of dumb, unintelligent code at the base of
this situation, the approach is so flawed as to be the basis for tyranny,
and it ties into 'ideology' across platforms, individuals and groups of
people to software/hardware and bureaucratic systems, and in that 'combined
state' of a false-perspective empire, the kernel is corrupt and the whole
thing invalid, including at the constitutional level which itself is
ignored, by binary default, the epic loophole of relativistic frameworks
allowing the fiction and its virtuality to replace shared logical
reasoning, because truth and logic can simply be ignored, 'privatized'. and
enclaves can rule over others as if a caste-system via technology and
ideological assumptions that function as religion, technologists as
priests, gods of this technocratic utopia, the peasants not having the
understanding to operate in such a realm, as guaranteed by the originating
lie and tradeoff that allows for all of this to continue. that absolute
truth is an everyday condition and you get to choose what to believe as if
a right or protected mode of operation, no matter how many others must
suffer for it, to sustain the illusion and shared delusion.
the cloud here in the corrupted model a state filing cabinet, digital
bureau for the bureaucracy, citizens organizing info into others' invisible
folder structures, volunteering the data via handover, designed into the
technology itself as a marketing and communications strategy. the sieve of
private data is equivalent to entire populations seeking out pickpockets to
hand over their contents, incentivized as it is. and so 'security' is as if
a kind of institutional transparency in relation to a corrupted, failed,
rogue state that can read and see everything you are doing, whether or not
encrypted, dumb terminals every computer to the state mainframe, rebranded
and rebadged, hidden, 'anonymous'.
--- more of this insane ungrounded viewpoint --
it was mentioned a three variable 'string' [x|y|z] would be differently
approached if parallel versus serial, in that each bit of a binary string
could be N-variable in a parallel approach, or so it is assumed possible,
as with probabilities and slot machines, or basic everyday observation of
events and what enters and exits consciousness given context. and while not
knowing the depth of this in terms of cryptography, completely out of my
depth, it would seem the concept of keyspace could relate to how such a
'paradoxical string' could exist, given the boundary for determining what N
could be for [x], [y], [z]. For instance if it were binary ones and
zeroes, the probabilities could be run and 8 different permutations or
combinations: 111, 100, 110, 101, 010, 011, 001, 000.
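the eight combinations listed above can be enumerated directly, and the
same loop shows how the count inflates once each position ranges over more
than two symbols; the alternative alphabets are arbitrary examples:

```python
import itertools

# all 3-position strings over a binary alphabet: 2**3 = 8 combinations
binary = ["".join(bits) for bits in itertools.product("01", repeat=3)]
print(binary)  # ['000', '001', '010', '011', '100', '101', '110', '111']
assert len(binary) == 8

# widening the per-position alphabet inflates the keyspace:
digits = list(itertools.product("0123456789", repeat=3))
letters = list(itertools.product("ABCDEFGHIJKLMNOPQRSTUVWXYZ", repeat=3))
assert len(digits) == 10 ** 3    # 1000 three-digit strings
assert len(letters) == 26 ** 3   # 17576 three-letter strings
```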
And within that, perhaps there is meaning. Yet if 'the probabilities' are
changed via [N]-bit variables, it could go all the way to infinity for a
single variable, and thus BIG BANG inflate via probabilities into a huge
keyspace, perhaps unpacking structures this way that reference others
already developed, as if infrastructure being revealed that connects with
others elsewhere, via wormhole-like connectivity and then closing down upon
successful messaging, thus encrypting and decrypting via few variables, via
inherent yet hidden structural relations within these combinations, which
could be infinities related to infinities and then the issue of how to find
them or what to look for. Black box yet even more so, RNG as model for
signal, not noise, thus tending toward psychic Random Event Generator as if
innate sense of animals before catastrophe, cosmic faults and folds.
the idea or difference is paradox- essentially *superposition* of the bit
as [N]-variable, no longer finite and static, potentially active and
transformative, diagnostic even in a sensor sense of the analogue as queued
circuit. What if alignment occurs in the string under certain conditions
and not others, what if it tunes in and structures revealed, decrypt, yet
out of tune it vanishes, code collapse or changes as with temperature
sensing colors, and the variables change, mask into background, returning
to mystery. It does not seem that computers today can even adequately allow
for infinity, a single bit of this, versus a larger parallel string- and
what might that mean about thinking, too. nothing more than finite discrete
thoughts, one decision to the next unconnected, unless largest prime, say
rogue US terror-state pwns earth as if master discourse, shared POV, even
though ungrounded- this the dumbed down unintelligent lowliest shared
viewpoint of situations in their depth, instead made shallow, sold as daily
headline? the CODE makes it so, in brains and machinery and bureaucracy.
binary is the enforced and corrupted 'shared state', conceptually and
ideologically, yet it is a false belief.
the issue then of shared and unshared identity, belonging or not belonging
to this 'master/slave' thinking...
shared ID <-----> unshared ID
And how this relates to default interpretations, the quickest route for
'feedback' and determining events based on perspective... are you binary or
paradoxical?
Can you make sense of your own consciousness or must you take on false
consciousness to function in society and go about decision-making in its
frameworks, taking on its value systems yet which fragment a person from
their own 'true' self, taking over and reformatting and reprogramming a
life to serve the machine agenda over and against 'shared humanity' -- now
an unshared identity, via private relativistic ideology. sell out your
ancestors and neighbors for a place in the machine...
Quickest route to thinking- *binary* of course, processor speed as if
SUPERSMART! --- "look- i can decide things and determine things
irrespective of their actual truth, and it works for me and others,
everyone else is just lazy!" Like water flowing downhill, 'logical
reasoning' turned into Price is Right PLINKO game, quick and easy
'automated reasoning' via path of least resistance aided and abetted by
binary ideology, creating friction-free virtual universe, mind detached
from body by also flawed historical beliefs, enabling this madness its
onesided platform. the trope of largest prime 'uncorrected ideological
perspective' the trophy award for the most stupid, greedy, and ignorant. an
entire society and civilization built around rewarding those whose
activities align with this, against human conscience and its needs, that
then is viewed as the enemy.
--- major social dynamic ---
ideologically there is a differentiation in terms of the process of
reasoning, how information is parsed...
intelligent <-----> smart
also, how shared identity may differ between empirical and relativistic
models of truth...
truth <-----> partial truth
and the difference in conceptualization, reliance on how frameworks are
constructed, tested...
ideas <-----> facts
and this directly relates to issues of observation and cybernetics (looping
circuitry)...
fallible observer <-----> infallible observer
error-correction <-----> no error correction
In this way 'inflated' or 'bubble' views can rely on warping, skew,
distortion for their truth which is verified by conforming to a false or
inaccurate model reliant on a limited *protected* or SECURED version of
pseudo-truth (pT), as if shared empirical reality (T) removed of error,
because it is believed to be, via ideology.
grounded empiricism <-----> ungrounded relativism
In this way a 'private worldview' can replace 'the public' view as if a
shared domain, and become the basis for one-sided 'reasoning' depending on
authoritative beliefs, where facts can be chosen to fit the model, others
discarded, to uphold the perimeter, basically privatizing perspective to a
finite inaccurate view as the exploit.
reality (1) <-----> false-perspective (0)
humanity <-----> the state (?)
There is always the possibility that this was planned in advance as a
cosmic setup from the beginning, and therefore the state could be doubled,
paradoxical, existing both in the shared truth of humans and in the lies of
the false viewpoint...
humanity <-----> state (T|F)
In this way the two state solution could be moving those 'false' on the
side of humanity over to the false state, and the true state over to side
with humanity, via reversal of dynamics, trapping the exploiters within
their own rigged game...
human state (1) <-----> rogue state (0)
And thus doubled or backdoor crypto could be vital to this process, itself
constituting the trap while transcending this context of surveillance and
putting it to use for human goals and values, working-through this hell
into the birth of new cosmic civilization, of shared identity, as this
relates to eco|soc|pol-issues, shared set evaluations for money and taxes
and policy and direction slash governance, versus the corrupt code and its
circuits etched into the world and minds today as pain, suffering, terror,
horror, insanity.
// * the following repeated text in case superscript numbers do not
translate, for reference:
In other words, instead of XYZ being a serial number that could extend
linearly onward toward infinity, that is: [xyz] ...[infinity], and that
"string" is a horizontal number or code, that can in particular be related
perhaps to binary on-off processing in a highly efficient manner,
processing and computational speed and largest prime numbers as context--
instead, the assumption for this same situation in another crypto framework
is that it could be happening 'vertically' like a slot machine that runs to
bounded infinities (largest primes or not), within each variable, and thus
[xyz] may not have a discernible linear structure or overall equation that
makes sense of the resulting 'horizontal' string. such that:
[x][y][z] => [n^1][n^2][n^3]...[n^n]
whereby [n] is variable and could be anything- a number, a function, a
calculation, null, its own computation. And in this way, each variable
could tend towards infinity or its own structuring, within the string,
whose length is not so much the issue as the difficulty in resolving its
total structure, especially linearly, such that [n^123] would not be
decipherable running algorithms across its horizontal string and instead
solving for each variable or say grouped variables in the string, eg.
[n^1][n^2-3][n^4-9]
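As a loose illustration of this 'vertical' reading (my own sketch, not an existing scheme or library), each cell of such a string can be modeled as holding an arbitrary payload, so a flat horizontal rendering loses the structure:

```python
# Illustrative model only: each "variable" in the string is a cell that may
# hold anything - a number, a function, null, or a nested run of values -
# echoing [n^1][n^2][n^3]...[n^n] where [n] could be anything.
cells = [
    7,                # a plain number
    lambda n: n * n,  # a calculation
    None,             # null
    [3, 1, 4],        # a bounded run of further values
]

def flatten(cells):
    """The naive 'horizontal' reading: every non-numeric cell collapses
    to an opaque token, so the linear string shows no structure."""
    return "".join(str(c) if isinstance(c, int) else "?" for c in cells)

print(flatten(cells))  # → 7???  each cell must be solved individually
```

Resolving such a string would mean solving each cell (or grouped cells) on its own, rather than running algorithms across the horizontal result.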
May it be true the spies admit defeat. But not likely, more likely
a ruse, for they are highly trained liars and tricksters. Feints are
commonplace to disarm and delude opponents.
Pardons of Manning and Snowden, shutdown investigating
of Assange, and several other intimidating and chilling
operations would be persuasive. Hearings on abuse would
help air the stench. But will not eliminate the rot of secrecy.
These formulaic conciliatory gestures would be cheap discounts
of giant budgets for ancient practices of extorting money from
the public, for, what else, protection, protection against
government defined threats, identified by secret means.
The spying apparatus will re-surge as it has forever, governments
cannot survive without spying on their taxpayers. IRS and global
revenue agencies the main data collectors on citizens and subjects
from birth to death for everyone everywhere, even spies. SSN the
ubiquitous UID. Got a problem with that, go to jail. Tax refusers and
evaders more stigmatized and loathed than terrorists.
Joint Terrorism Task Forces around the world spend more time
chasing and prosecuting voluminous tax cheats than the tiny
number of terrorists. With their contractors this produces handsome
tax revenue, confiscations and fines. Prison populations too.
Spying was invented to catch tax cheats and assure revenue
for government operation and rule, and nothing unites the
world's governments more than that essential transfusion
of money to sustain government beneficiaries, in the US the
three branches, state and local. Military merely cops to
control obedient payment.
In NYC, the racket of Wall Street is to pay NYPD to stay
away from finance through the Police Foundation. The
recently published "Enemies Within," by two AP reporters
recounts how this has come about through the services of
David Cohen, formerly CIA Director of Operations, who
has set up a completely unaccountable spying operation.
Prior to setting this up, Ray Kelly, NYPD chief, and Cohen
worked for Wall Street firms. Mayor Bloomberg happily
endorsing the 1% bribery methodology.
Nobody should believe this will change without a lot more
than hopeful dreaming.
At 08:08 AM 9/14/2013, you wrote:
>On Fri, Sep 13, 2013 at 11:43 AM, John Young <jya(a)pipeline.com> wrote:
> > ... (now even Clapper is applauding
> > the Snowden campaign, which stinks of the fix is in on
> > what to release and when).
>
>MI6 also downplaying.
>
>but if you read between the lines, this is signalling defeat.
>
>they're fucked; the docs will leak, ongoing, in a tight fisted drip of
>insufficient and insufferable dribble... perhaps this is their zen
>state - nothing to do but see how much is burned, and continues to be
>burned, for months and years ahead.
>
>watch for indications of political fights and influence, lobbying,
>persuasion, LOVEINT blackmail?
> cut part or much funding and they've really got a problem!
// attempt at a basic comparative example here...
--- code play ---
first to reference the previous transparent ziplock with letters (ACID) to
provide conceptual framework for what follows... and how the *interpretive*
aspect of the code, its encoding/decoding and encrypting/decrypting, may
likewise relate in some way as a transformative, calculative process (if
not phase-change). not going theory, just abstraction into a realm of
N-dimensions and probability versus linear equation exegesis...
here are some online resources then for examining the letters [A,C,I,D], as
recombinational variables:
#Internet Anagram Server mentioned two solutions:
http://wordsmith.org/anagram/index.html
CAD I, and ACID
#Scrabble Word Finder lists 8 word solutions:
http://www.scrabblefinder.com/solver/
4 letters: acid, cadi, caid
3 letters: aid, cad
2 letters: ad, ai, id
And other scrabble word searches find additional, including 'da' and 'i'
So there are these permutative interactions that can occur within and
between signs (letters and their arrangement, though also symbols) and this
is mapped out to some degree with anagrams, palindromes, ambigrams,
rebuses, kangaroo words (a word within a word), and so on.
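The recombination search itself is easy to sketch. Here is a minimal version, assuming a toy word list built from the solver results quoted above (a real solver would load a full dictionary):

```python
from itertools import permutations

# Toy word list assembled from the Anagram Server / Scrabble Finder
# results quoted above:
WORDS = {"acid", "cadi", "caid", "aid", "cad", "ad", "ai", "id", "da", "i"}

def anagram_words(letters):
    """Every word formable from any ordering of any subset of `letters`."""
    found = set()
    for size in range(1, len(letters) + 1):
        for perm in permutations(letters.lower(), size):
            word = "".join(perm)
            if word in WORDS:
                found.add(word)
    return found

print(sorted(anagram_words("ACID")))  # all ten recombinations of A, C, I, D
```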
#Palindromes, anagrams, and 9 other names for alphabetical antics
http://theweek.com/article/index/244111/palindromes-anagrams-and-9-other-na…
In terms of language, this is a zone of paradox where multiple meanings
could exist in a 'string' that could have more than one interpretation, say
the word 'top' and 'top', one indicating 'the upper part of something' and
the other 'a child's toy' via dictionary reference. yet without context how
can you determine which viewpoint is correct: is there actually a 1:1
meaning or is it one to many? Thus, in some sense, weighted given the
environment it occupies for observation. Thus paradox, though also
superposition, and an aspect of the unknown or mystery within the ordinary
and everyday that in a binary worldview would seek to determine a 1:1
answer and discard the other 'variables', and linearly "progress" in such
computations down a particular _unique path that then is equated with _the
correct path, analytically, even though potentially arbitrary or flawed.
Any such search engines for 'words' can be broken via substitutions, as
mentioned before, and doing so would limit or bound the interpretation to a
smaller set. Thus using a number one instead of capital 'i' causes the
library reference to ignore the variable, discard it from consideration,
and only analyze ACD; returning only 3 sets: cad, ad, da. So it moves from
8 words to 3 words due to illiteracy or incapacity to translate or
substitute, as mentioned previously.
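That 10-to-3 collapse can be demonstrated directly (same toy word list as before; a self-contained sketch, not any solver's actual behavior):

```python
from itertools import permutations

WORDS = {"acid", "cadi", "caid", "aid", "cad", "ad", "ai", "id", "da", "i"}

def findable(letters):
    """Words formable from any ordering of any subset of `letters`."""
    found = set()
    for size in range(1, len(letters) + 1):
        for perm in permutations(letters.lower(), size):
            word = "".join(perm)
            if word in WORDS:
                found.add(word)
    return found

# A solver that only reads letters silently drops the digit '1' standing
# in for 'I', so only A, C, D are searched:
seen = "".join(ch for ch in "AC1D" if ch.isalpha())
print(sorted(findable(seen)))  # the ten words shrink to: ad, cad, da
```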
Yet once again consider the ACID letters, in particular the letter 'I'
again as it may be hypothetically or conceptually connected to the letter D
and result in new letter potentials: P, d, q, b --- likewise the letter 'C'
could be turned sideways into the letters U and n. Note the transgression
of upper and lowercase alphabets, opening up a much wider range of letter
combinations via allowing multiple scales to co-exist in the evaluative
framework. Further, the letters C and I can be combined into a second
letter D.
Thus A,C,I,D, can become [A,C,I,D,D,U,n,p,d,q,b] as a potential string, if
'suspending judgement' and allowing time to configure and reconfigure *in
a probability space* these co-existing parallel potentials. inputting the
string (aciddunpdqb) at the Scrabble Word Finder leads to 110 words, with
six letters the highest word count of eleven total.
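The expansion from ACID to the eleven-character pool can be written as a lookup table. The rotation and overlay rules below are taken from the description above; the table itself is an illustrative guess, not a standard transform:

```python
# Per-glyph look-alike potentials described above (illustrative only):
ROTATIONS = {
    "C": "Un",    # 'C' turned sideways
    "I": "Pdqb",  # 'I' abutted against 'D' in four orientations
}
COMBINES = {("C", "I"): "D"}  # overlaying C and I forms a second D

def expand(s):
    pool = list(s)
    for ch in s:
        pool.extend(ROTATIONS.get(ch, ""))
    for (a, b), merged in COMBINES.items():
        if a in s and b in s:
            pool.append(merged)
    return "".join(pool)

print(expand("ACID"))  # an eleven-character pool, same letters as 'aciddunpdqb'
```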
A question being, is the 'string' ACID really only ACID, as if 1:1 in terms
of code and its meaning, and if 'decrypted' perhaps CAD I would be a
result; and likewise, what if this could range to 110 words, without
considering two or more letter acronyms and abbreviations, and then, each
of these unpacked "variables" could potentially be related to one another
as a structural framework from which to further reveal or construct code,
such as via the subset words: candid, baud, pin. Perhaps keys or a bermuda
triangle zone that submerges or reveals purposefully scuttled content. This
location related to another inside the same hash function: cab, pain, quid;
or yet another: cabin, cupid, id.
Essentially sets related if not collided and run against other sets, one
against many, symmetries or asymmetrical dynamics, spins, planes, fields,
layers, levels, zones of meaning shared and unshared. Perhaps someone could
even have the key and yet get lost in the ever expanding forest of never
ending choices, thus: the labyrinth is constructed by each interaction and
decision and without the correct context, variables in their superposition
could remain hidden in their correct interpretation (A=A)
how could you know without an observer of shared identity and same model of
empirical truth to reference, given N-possibilities for each choice and
evaluation, potentially, as this is bounded by whatever the keyspace may
be: only capital letters, alphanumeric plus symbols, whatever. and
likewise, in terms of computation and equipment, could this same HASH
function be recreated _without identical equipment, due to the floating
point aspect (if it were to still exist) in computing irrational numbers,
given a necessary boundary to stop and round up or down at that limit, thus
forcing alphabets or equations into one hash scenario or another-- and
could enough of this be controlled to even allow two machines to function
identically or may there inherently be noise (chaos) between them, and so
it is an issue of approximation, creating the same structures via unpacking
a superposition ~bit-string and yet having variability still within these
parallel models
It just reminds so much of enigma and patch cords that perhaps the RNG
with patch-cord like custom circuit wiring could tweak machines to a shared
match or 'entangled parallelism' that technology may be limited in allowing
due to limits to standardization at the level of unique processors with
slightly different characteristics, processor temperatures or working
transistors that could somehow affect computation, speed, rounding, at what
boundary the evaluation is bounded. perhaps different bounding could
provide different codes or access: abc^10 opening up one hash and abc^2
opening up another entirely different subset universe in terms of what
appears in its relational structuring. different tech, different hashspace
(?) given how much infinity can be modeled and to what extent -- And thus
what might keys be. what if the starting variable was used multiple times,
such as opened at one level and then reopened at another, and these
compared, or some mathematical computation occurring to parse each against
the other, and using that derivation for the keyspace. The idea of custom
wiring in addition just unpacking then potentially allowing one version of
crypto-signage to be used many times and in many different ways, as if
signpost even or relay or storage device, as if such crypto could be
infrastructure, if not alive like a sensor network in terms of feeding and
routing information through such matrices
it is curious if perhaps this is opposite the idea of the hash, in that it
is an inversion or reverse hash function or something such, due to its
exponentiality of parallelism and superposition versus linearity that
appears standard to the naive outside observer of this.
Just to provide a different example entirely, for consideration- imagine a
two variable string, [v][w], in this superposition context. Despite the
known transmutations such as the letters being flipped and turned, such
that two letters v = w, or when flipped could equate to acronym 'mn', it is
more than a single letter as the variable and instead, say, the signs S and
O that are overlapped for a single variable [v], such that their layering
appears as if the number 8 or letter B. and likewise, a second variable as
letters F and L traced atop one another, resulting in the combined letter
E, much like a Jasper Johns painting or 7 segment or 16 segment display. In
this approach, [B][E] then could, as a 'string' carry these other letter
potentials within it or could no longer be decipherable as a standard
alphabet letter and instead as a structural pattern, and then computation
could occur via breaking down these patterns in that abstraction, mapped to
potential letters and numbers and their recombination. For instance the
letter B is equivalent to number 8 and via 7-segment displays, encompasses
all numbers 0-9 in its geometry. Thus what if [3][E] were suddenly
evaluated in a context of letters [S,O,B,E,L,F] and its 47 words unpacked
from this starting point; say 'fobs' <=> 'lobes', as a subset universe
within which, via another key, may match to a circuit elsewhere or provide
meaning and thus 'shared framework for interpretation' whether intentional
or unintentional. likewise, lesbo and sol, fe, Io, os & so -- leading into
potential transcendent oblivion... or other dimensionality beyond what can
be determined, predicted or controlled even- the RNG as REG, tapping cosmic
circuitry as if interdimensional tendrils of noosphere
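The S-over-O and F-over-L overlays can be checked on a 7-segment model. The encodings below are my own sketch of the idea using the standard a-g segment names, not any display driver's table:

```python
# 7-segment encodings as sets of lit segment names (standard layout:
# a top, b upper-right, c lower-right, d bottom, e lower-left,
# f upper-left, g middle). Illustrative sketch only.
SEG = {
    "S": {"a", "f", "g", "c", "d"},
    "O": {"a", "b", "c", "d", "e", "f"},
    "F": {"a", "f", "g", "e"},
    "L": {"f", "e", "d"},
    "8": {"a", "b", "c", "d", "e", "f", "g"},
    "E": {"a", "f", "g", "e", "d"},
}

def overlay(x, y):
    """Trace one glyph atop another: the union of their lit segments."""
    return SEG[x] | SEG[y]

print(overlay("S", "O") == SEG["8"])  # S over O lights every segment: an 8
print(overlay("F", "L") == SEG["E"])  # F traced over L yields E
```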
--- another aspect ---
in terms of superposition variables, the length of [x] in the string
[x,y,z] could be _anything. the bit as container could hold a bounded
infinity, it could hold a mathematical function, it could hold five numbers
or a hundred million trillion, or could nest or subset other variables in
chained relations, such that x = xz. and thus a larger string could
generate more complex entry code or certain structures, yet it may be
unrelated to solving the interior of the hashing package because its inside
is larger than its outside
symbols especially relevant here, HIOX, fractal codes, ciphers within
signage so that it is beyond ordinary language to begin with, such an
approach could be two variable and encompass a vast universe of relations,
say as a partial HIOX symbol (16 segment) or fractal-based sign of letters
embedded one inside the next, say a Y being letters I and V combined, yet
far beyond this and into a keyword written inside a starting letter, at
scale, and moving downward or across, and having that be a variable that
unpacks certain letters and characters and not others, thus potentially
relating this outside with inside via the starting string, thus providing
context via its superposition options that may be more complex
"structurally", all of this requiring a standard approach that is shared,
grounded in truth and logical reasoning, to allow 'the language as code' to
be useful for shared communication, carrying ideas, or being the idea-
seemingly about providing context, creating the conditions allowing for
such interpretation to exist
also, it should be noted that this could easily involve numbers and so
alphanumerics could exist in a state of superposition between their
mathematical and linguistic meaning, again 'variable' in terms of
interpretive framework; for instance, a number 4 may be missing its
hypotenuse or upper-left bracket -- given character style or typographic
standard -- and thus it may look instead like a sideways letter T, in some
sense. And thus the glyph representing sideways-T could *potentially*
occupy both the realm of the letter T and-or the number 4 in terms of
superposition, given that it is somewhat corrupted or incomplete or has
extra information (such as Q and O similarly).
thus a starting bit-string could have strange symbols that could map into
letters and numbers via transformation (its de|con-struction via
constructions and destructions) of subset relations and dynamics, unpacking
particular characters and relations, creating a particular type of noise
field for other content, perhaps framing it for interpretation. And thus
language itself, tending towards noise if not illegible, may look like
actual hash code to start the string, then unpack its contained information
based on these variables that may normalize within certain parameters and
not others, providing both signal and noise- and perhaps having multiple
keys or no keys and to be accessed or referenced via temporary tunings or
harmonizations, as if emergent data, or a particular correlated state of
shared mind, et cetera
--- example ---
so i could provide a ~bit-string (i do not know the correct word, so that
is a conceptual placeholder until someone who knows what this is defines
the concepts accurately) such as the following, and it could be assumed to
operate and function within these stated dynamics...
[nv+x!]
in other words: [n][v][+][x][!]
And here is what i think is peculiarly interesting- that i could share with
you 'the key' that created this string, for a subsequent parallel hash
function, via superposition, yet it may or may not be relevant to what is
unpacked on the inside unless it were activated, and thus it is like an
on/off switch that may not be in the interpretative context and so can be
routed around if compromised, or used as ruse and trap to set up
alternative interior corridors and pathways to checkmate mimics via false
perspectives.
so, running through the example... potentials: n, c, u, v, n, >, <, +, t,
j, r, x, +, v, y, ., l, i, !
Note duplications are added for combination potentials, especially useful
for acronyms or symbolic 'compressed' meaning, mirroring of palindromes,
etc. likewise, punctuation is allowed for its dual-use language capacity,
the exclamation point rotated into the lowercase letter i, and period
useful perhaps in other ways as well. As with plus sign and the lowercase
t. the issue of archetypal transference alongside transformative mutation,
as if organization, allowing for entropic reversal or coherence gains even
via losses, such is the "calculus" of the signage, as if moving from
concept 1 (speed) to concept 2 (acceleration) yet within a different
conceptual realm, wide ranging if not arbitrary
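Reading the [nv+x!] example back, the per-glyph potentials can be tabulated. The grouping below (which source character yields which look-alikes) is my own guess at how the nineteen-item list above decomposes:

```python
# Each source glyph maps to itself plus rotated/reflected look-alikes;
# the assignments are a guess at the grouping of the list above.
POTENTIALS = {
    "n": "ncuv",
    "v": "n><",
    "+": "+t",      # plus sign as category-crossing lowercase t
    "x": "jrx+vy",
    "!": ".li!",    # exclamation point rotated into lowercase i, plus period
}

def unpack(string):
    """Expand a string into its full pool of look-alike potentials."""
    return [p for ch in string for p in POTENTIALS.get(ch, ch)]

print(unpack("nv+x!"))  # nineteen potentials, duplicates kept for combinations
```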
and thus the issue of *potential*, as with potential energy- *potential
meaning* and potential interpretation, framework, structure, that may exist
momentarily or collapse given shared or unshared identity, referent,
whatnot.
so it would be possible to say 'cat' is what generated the original string,
the upside-down caret (^) removed of the crossbar a hint or clue to the
uppercase A at another scale (uppercase versus lower), and the plus sign
functioning as a category-crossing letter t. and that could be the basis
for generating the hash function, via this string-- and it may or may not
be relevant, it may or may not be used beyond this threshold, it may or may
not be the framework once the interior is unpacked, and this depends on the
keyspace and how many variables will be referenced on the inside, though
could also be on the outside as well, and only [n,v,+] could be used in
their permutations and nested recombinations- meaning different hashes
could be referenced and made active depending either on generating key, as
a partial string (thus secret) or it could be entirely open and yet of an
unshared POV, bounded and limited in observation via threshold of
eavesdropper, and thus their biasing could force only some variables and
not others, which it is proposed may make it nearly impossible to decipher
externally without knowing what is being communicated in a given framework
or context that is only partially observed and evaluated. such that it may
not add up the same, just as unpacking the string with different hardware
capabilities may generate different random interrelations as noise though
also structure, if somehow calculations are not tuned A:A and instead it is
closer to A1=A2, though compensated for. it is as if relativism itself is
the encryption envelope, its warping skew distortion where the data can
hide within and between and the revealed structures *whatever they may be*
could be one-time pad connections or stable grids or force fields spanning
multiple such hash universes that interconnect or feed data one into
another again perhaps like wormholes
further, phonetics could be used for substitutions such that the original
lowercase letter 'c' and its subset [v, ^, c, n] could be replaced by 'k'
such that kat=cat in terms of pronunciation yet has no trace of a c from
which to decipher via a dictionary search, unless phonetic. in this way the
original string [nv+x!] could instead be: [kv+x!] and generate an entirely
different interiority to reference, which may or may not be keyed to 'cat'
in any given instance, unless it were to be, yet may not reveal anything
even if the key is known, given perspective and variable timing and
interpretations. it could be throwaway or only active on occasion or a
divining method such as organizing cards out of a Tarot deck for reading, a
process that establishes interpretation though in a magical context, and
thus perhaps more ritual or procedural yet still an important vital step.
i.e. if everything is in tune, because the arbitrary could cut either way
if not 'literate' and capable of handling this kind of code and
interpretation as it were- it could drive you crazy
of all the myriad possibilities, this is an infinitesimal example and it
would be in the collection of all various techniques and approaches and
evaluation and surveying their coherence and decoherence dynamics and
related functionality that perhaps a new approach to coding could be
developed in this alphanumeric and symbolic parallelism where sign=sign is
essentially outside of the language as it operates in consciousness, unless
brought into a realm of shared truth (1) and shared perspectives for
exchange. thus creating or *revealing* that realm, essentially navigable
infinity and issues of markers, lighthouses, obelisks, waypoints,
ecosystems and ecologies, mazes, traps, that mapping data or mining,
storing, conveying, relaying information within preexisting contexts,
frameworks, could be utilized- made infrastructural, another paradigm
entirely that aligns with consciousness, not the deadzone of the silicon
wasteland as if final destiny of life with the singularity, leading to
death and nothingness as ideal future.
current and currency. a realm for those literate and of shared awareness.
keeping the lies and the liars and their falsehood out of the equations,
trapping them in their own false perspectives and thus bounding the
interaction while the false perspective is dismantled, freezing everything
binary in its place while the world keeps spinning. this, perhaps a
reverse-vertigo effect of the nonsensibility of the perceived world, though
from another point of view. the oppressive ideology extinguished, no longer
capable of sustaining the lies and controlling events from an external
position, losing power of 'shared awareness' by dissimilar or dismantled
structures-- sitting ducks. quack,quack.
It continues to mystify why Greenwald and others crop and
redact documents and slides but show them to staff at
O Globo, Guardian, Der Spiegel, New York Times, ProPublica,
Washington Post and perhaps others yet to be disclosed
with bombshell releases (now even Clapper is applauding
the Snowden campaign, which stinks of the fix is in on
what to release and when).
O Globo videos show glimpses of slides which are then
further redacted or cropped for release as slides alone.
Schneier claims to be working with Greenwald so he is
presumably seeing full views of docs and slides. Yet he
sustains a steady beat of surprise and outrage, almost
as if overly defensive about who knows what.
Greenwald has tweeted that there are legal reasons to
not show full views nor "distribute" documents, instead only
"report" on them. No answer to a tweet to GG about who
set those legal boundaries.
This seems to be a game the Snowden manipulators are
playing with authorities, or at least lawyers are playing
with the gov, to toy with and tease the public by hoarding
documents, maintaining insider privileges of "journalists"
against outsiders, their readers, and experts who could
deconstruct the journo's pallid interpretation.
This is a game played also by secret-hoarding governments
against their citizens, aided and abetted by duplicitous
laws and lawyers.
MITM exploitation is what it is whatever they choose to call
their privilege protection racket.
And not to overlook the singular role of Tor in MITM
exploitation. The same distinctive rhetoric is deployed
by all of them to wave off suspicions as if tradecraft.
So, the washington-london-jerusalem axis of infinite goodness and boundless moral perfection uses the internet to keep close tabs on their subjects. Western cattle is free to use google, facebook, wikipedia and similar technological wonders so that the all-loving state can freely spy on them.
Now, what happens when a country is ruled by an evil power that prevents people from connecting to the 'free' internet? Seems to me that the morally perfect western governments are denied the opportunity of spying on people living in those countries. Say, I don't know, China.
What is the axis of goodness to do now, I wonder...? What about helping all those oppressed people, especially the ones unhappy about their government and so likely to be useful in further spreading western imperialism?
What kind of tools could the US military develop to be able to influence those foreign yellow assets and to collect internet usage information from yellow cattle?
J.
quote: Subvert logical reasoning, disallow it, including freedom of speech
and thinking and censoring and controlling what facts are allowed -
controlling POV - then the opposite occurs: B ---> A, such that pT => T
---
Subvert logical reasoning, disallow it, including freedom of speech and
thinking, censoring and controlling what facts are allowed - controlling
POV - then the opposite occurs: the viewpoint of B 'overtakes' A such that
an error-reliant pseudo-truth (0) = truth (1)
This is critical to establishing the false perspective, basically a hack by
removing logic from reasoning via faith-based binary ideology as a means to
control 'programming' populations, institutions, the state and world itself
via what amounts to submission and servitude to lies and the grand
deception.
Note: this relies on B=A which leads to B>A and pT=T, which leads to pT(0)
> T(1)