// again, disclaimer: observations of a naive observer...

--- on pattern matching ---

there is a significant difference between evaluations that search for recognizable patterns and structures based on relativistic pseudo-truth, and evaluations based on empirical models of truth that have been removed of falsity.

in the former, a pattern match verifies pT=pT in a way that retains the error rate as part of its structural assumption, though the result could likewise be considered true by default of the match. in this way the [sign] itself becomes the truth, as if the pattern itself were true, an accurate model. it is thus possible for an ungrounded viewpoint to be validated via pattern recognition and to *believe* in the legitimacy of the model because it is able to locate, recognize and categorize patterns, which seems to validate the approach. that it works confirms, as an assumption, that what it is doing is true. statistics and mathematical modeling can often validate this as forms of 'objective reasoning' that are themselves likewise ungrounded, as if an approximate algorithm were by default removed of its estimations or need for ambiguity; via binary 'reason' these inaccuracies can be denied and/or discarded rather than used to question the approach itself, the error rate relied upon and becoming structural to sustain the viewpoint, functioning as belief.

   [sign] = [sign]  equated with truth

the point is that the sign could be ungrounded, weakly or inaccurately modeled, and thus a rounding error or observational bias is involved...

   [pT.sign] = [T.sign]  via pattern match

a quick way of saying it is that there is some unaccounted-for subjectivity involved (a=b) that is nonetheless ignored, while A=A is presumed to be the evaluative result. the issue appears to be that the [sign] itself is arbitrary to an extent, and does not require investigation beyond its context in language for the observation to be considered true- seemingly a concept represented as a sign is effectively equated with what it is meant to signify, and therefore may not actually have external truth outside of the language system, which instead functions as if it were this truth. the pattern match is with the [sign] that is the word, not with what the word references, because the word can be viewed as its truth. that would be a worst-case confusion, a lack of rigor to the point that this distinction is not being made, allowing shallow or weak correlations to exist. at the very least 'computers' could do this, though likely many a person could likewise, who does not think it through further or who allows ideological presumption to take hold of observation by default of existing structures, without correcting for errors or ambiguity.
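
a tiny sketch, for illustration only (the function names and the toy 'facts' table are invented): the first check treats a match on the word itself as the truth, pT-style, while the second refuses to settle until the claim is checked against a reference that has had known errors removed, and stays unresolved where no grounding exists.

facts = {"water boils at 100C at sea level": True,    # toy grounded reference
         "water boils at 50C at sea level": False}

def sign_match(claim, pattern):
    # pT-style: the match of the word/pattern is itself taken as the truth
    return pattern in claim

def grounded_match(claim, pattern, reference):
    # T-style: the pattern must match AND the claim must survive the
    # error-removed reference check; ungrounded claims stay unresolved (None)
    if pattern not in claim:
        return False
    return reference.get(claim)

claim = "water boils at 50C at sea level"
print(sign_match(claim, "boils"))             # True  -- [sign] = [sign], taken as true
print(grounded_match(claim, "boils", facts))  # False -- the referent fails the check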

an empirical evaluation in contrast would secure the [sign] in relation to truth first, thus when a pattern is matched, the match does accurately correlate with truth, the concept having been removed of known errors.

  [T.sign] = [T.sign]  via pattern match

although again it must be said this remains contingent and in the realm of the grey area, within the middle N value (1-N-0) of either a 3-value or N-value paradoxical evaluation: tending towards absolute truth yet bounded by worldly limits to only a high percentage of reliability (nine nines), such that the binary 1/0 instead functions as a sliding scale. it may be effectively '1' yet never absolute, always contingent and always returning to a state of perpetual questioning, testing of hypotheses against new evidence and additional structuring. thus a match is still not 'absolute truth' -- instead it is an extreme likelihood, highly weighted toward truth, yet upon further investigation or data it could be overturned as an observation if the model is somehow flawed, and thus error corrected. in this way a failsafe exists in the observation allowing for correction, whereas a binary model (either/or) would not be able to make this change after having decided its state, or if it did it could break the entire worldview; for a paradoxical weighted approach it seems much more likely that a paradigm shift would rearrange or reconfigure the perspective, and that it would be possible to do this within a malleable framework capable of handling ambiguity, unknowing, multiple hypotheses at once. and perhaps what this indicates is that a [sign] for paradoxical pattern matching may exist in superposition, in various different configurations by this probabilistic weighting, and only over time resolve the overlapping or contrasting dynamics- which then would indicate a field of inquiry, versus a binary approach that would choose one interpretation over another while discarding the other's validity within the model, or so it is assumed.

and perhaps this last aspect of superposition is a new take on synthesis as it relates to the paradox of theses and antitheses for a given [sign], that logical reasoning mediates this condition via panoptic observation.
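
a minimal sketch of that sliding scale, assuming arbitrary thresholds and update weights (this is not a formal logic, just an illustration): a single weight drifts between 0 and 1 as evidence arrives and never leaves the contingent middle N unless it becomes extremely heavily supported.

NINE_NINES = 0.999999999   # 'effectively 1', yet still contingent (assumed cutoff)

def update(weight, evidence_supports, strength=0.5):
    # nudge the weight toward 1 on supporting evidence, toward 0 otherwise;
    # the value lives in the middle N of (1-N-0) until very heavily weighted
    return weight + strength * ((1.0 if evidence_supports else 0.0) - weight)

def classify(weight):
    if weight >= NINE_NINES:
        return "1 (effectively true, still contingent)"
    if weight <= 1 - NINE_NINES:
        return "0 (effectively false, still contingent)"
    return "N (grey area, superposed hypotheses)"

w = 0.5                                    # start fully ambiguous
for supports in [True, True, True, False, True]:
    w = update(w, supports)                # error correction built into the loop
print(w, classify(w))                      # weighted toward truth, never absolute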

--- on turning ---

this is speculation though i have a strong intuitive sense that forcing a binary ideological structure into a paradoxical condition would in turn double everything that exists in that system, via the new codebase. (?!)

another way to say it is that the pT.1 of binary observers would exist in a context of truth (T) of empirical observation, and the removal of error would destroy the ideological structure needed to maintain pT.1 as if it were absolute truth, thus forcing it into recognition as a partial view, pseudo, while incapacitating the false ideological structures that prop up the viewpoint, such as invalid or false assumptions. in its minor state, pT.1 can no longer determine shared truth in the larger empirical context for others, and must submit to the larger interpretive framework and shared context, insofar as it is valid and legitimate and removed of errors. in this way the binary observer *must* accept 'other truth' that accounts for the larger totality of truth (T), that is pT.2, pT.3...pT.N, and thus everything that is T minus pT.1, which could be this doubling of truth that must in turn be mediated beyond the binary constraints. if the observer is unwilling or unable to do this, they would be incapacitated in logical reasoning with other observers' insights as a shared model, holding instead to an ideological POV. yet if they accept it, their minor truth by comparison is limited in controlling the interpretation and thus forces a compliance with truth that in effect no longer allows the old presumptions and positions to be retained. if this were a ploy, internal contradictions or a fakeness in the process would likely be noticeable. it probably could not be very well mimicked, and if it were, it could only last so long before tension between competing views, internal and external, caused psychic collapse. it is to say that without 'grounding' in truth, or its actual observation, 'going along with things' in a paradoxical framework without truly believing in the process, without recognizing truth in such a way, could be a self-destructive process if the person remained binarist, and this intolerable conflicting position between logics could force submission due to the madness of having once all-powerful observations instead become minimal at best in a larger framework, if psychologically unable to see beyond the self's POV.

to try to defeat the larger truth the binarist would have to maintain two versions of truth, while being able to externally reason in their biased framework with others, or rely on false frameworks for their evaluations. it should be readily evident and easy to discern this kind of deception because binary rationalization would be the governing OS of the person, even though they may say or indicate otherwise by following, mimicry.

--- questions on random ---

basic electronic circuit with a reverse-biased diode for noise. wondering if the size of the diode has been correlated to noise patterns-- does a larger diode generate more randomness. is there any boundary issue for randomness. it would seem like there would be, for linear algorithms versus parallel sets.

for instance, if it were analogous, imagine an aquarium is a smaller diode and a swimming pool is a much larger diode. and the same effect is going to be used in both to test for randomness of numbers. how would their output be compared and evaluated in the same terms, and is it related in any way to the size or boundary of the diode itself, as to randomness generated.

here is why it seems like it might. if dropping a rock into an aquarium there would be a splash and waves would immediately start as the rock sinks to the bottom, and thus the boundary condition would influence how much this outside interaction affects the inside equilibrium, in that higher waves may form and multiply, if not causing a local splash, and the structure inside the aquarium could be altered by the rock entering its domain.

throwing the same rock into a swimming pool may not have similar effects at the larger scale; it may sink further to the bottom yet not disrupt anything else at that scale, and the waves it makes may be minor compared with the smaller closed environment. whatever influence it may have on the equilibrium would appear to be much less of a disruption or influence.

then consider throwing the same rock into the middle of the ocean, which may have large waves already; it may not sink for a long time compared to the other two environments, it may have negligible effect on wave creation, and it may never affect the outer boundary, essentially 'infinity' in comparison to the aquarium or swimming pool. and thus it may have no discernible effect on the structure that may exist or be considered random, even though it may have some influence, because it is so infinitesimal.

in this way it is to ask if the 'bounds' or region of randomness may be related to what is accessed or output as randomness, also in relation to accessing this state externally or interacting with it, as an influence.

now perhaps this is not an accurate correlation, though i thought i read or heard mention of various approaches to gleaning information from closed if not blackbox systems via external diagnostic evaluations seemingly similar in nature, where a signal may be injected into a realm and read back to learn of its internal functioning, structure or timing that could be used to gain access or subvert its operation.

and in my naive mind i relate that to throwing the rock into the bounded environment and monitoring its trajectory, what is known about it, and using this perhaps in a way like a random number generator.

if the structure of the randomness is discernible, whatever that mystery in the box is (aquarium, etc.) is assumed to be bad for generating numbers, because that structure could be used to compromise the security of cryptographic algorithms.

and so if someone were to evaluate the water (numbers generated) and they could somehow discern a bounded condition that forced patterns somehow, that would compromise the number generation. or, what if a diode could have an electromagnet placed next to it and align force fields somehow that would change the structure of what is generated, yet this may not be detectable on a circuit board or in an unseen encased or protected device.

and while this is foolish to consider from this naive perspective, without any knowledge, and with likely inaccurate assumptions and a faulty framework that does not correlate with the issues -- it is to wonder still if it might have something to do with a linear approach to this computation that requires 'no discernible structure' as an a priori constraint. for instance, what if multiple bit sets queried the diode state for numbers simultaneously and their interaction was randomized atop the diode return values, or these were mapped to 0-9 and not just 0/1 for numbers. or what if it were possible to do this fast enough such that various sized randomized numbers could be input into a running stream, such as 1 to 12 variables stitched one after another with concurrent randomness. or multiple diodes in an array which could be queried both in serial and in parallel and return 'variable' output that may again randomly stream into a string (or a bit string, if each output were to become a set in a running bit superset).
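
a rough sketch of that parallel idea, with simulated sources standing in for real diodes (os entropy here, since no hardware is assumed) and with output digits 0-9 rather than single bits; the modulo step has a small bias, fine for a sketch but not for actual cryptography.

import os, hashlib

def sample_source(n=8):
    # stand-in for reading one reverse-biased diode; here just OS entropy
    return os.urandom(n)

def parallel_digits(num_sources=4, out_digits=16):
    readings = [sample_source() for _ in range(num_sources)]   # 'array of diodes'
    mixed = hashlib.sha256(b"".join(readings)).digest()        # mix the whole set
    return "".join(str(b % 10) for b in mixed[:out_digits])    # 0-9, not just 0/1

# stitch the variable outputs one after another into a running stream
stream = "".join(parallel_digits() for _ in range(3))
print(stream)

the hashing is only one possible way of mixing the set; the point is only that the sources are read concurrently and combined before anything is output.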

if someone could influence the operation of those devices, could they still access the cryptographic secrets of the algorithms or could defenses exist in the construction and access of randomness that separates these systems.

in a parallel approach why must structure be assumed to be a default exploit for knowing the computational structure, if it is actually arbitrary; to me, in terms of bit sets and calculating multiple infinities, this is an issue seemingly equivalent to the rock and the ocean. whatever local structure that rock may encounter, or whatever microscopic wave it makes, does not indicate it will be able to discern the overall structure of the infinite boundary. you could throw a million rocks in and it still may not have any compromising effect on whatever detail or area or structure the computation resides in, in a temporally shifting structure that may or may not be 'on' or accessible in a given interaction- and thus repeated targeting against randomness may not reveal any greater insight into particular messaging in the infinity context, or so it is proposed, if 'vertical' computation is involved.

this fool does not realize how foolish they are to consider such questions so it is funny for me, to neither know nor care how ridiculous this is.

the ragged presumption then is that infinity calculations could function as a type of 'mystery box' that computation and encryption occur within, and that randomness is part of this, yet structure within randomness may not indicate what is or is not encrypted in that particular approach. it would seemingly offer randomness, even if structure exists, because whatever is accessed would be so minor compared to its interior boundary. if you have multiple sets as a starting point and each has a potential for infinite x infinite x infinite possibilities, that involves far greater randomness than a string of binary digits made arbitrary. and it cannot be easily accounted for by an algorithm, to decipher its meaning, if that is indeed its basis for randomness, because the algorithm could be random, as with the output, within certain parameters. anything x anything + anything / anything = context. what computer is capable of figuring that out, prior to accessing the crypto code, and doing it repeatedly in real-time in an ever changing array of numbers and autogenerated code, variables upon variables. it would seem even an issue of forensics would be no good, as it could be arbitrary, non-repeating and repeating structures that may or may not be active or reappear again, themselves shifting or within structures that open up or close or phase change. maybe a part of a structure is part of a number, touches upon it, and yet that is it. if it is random, would it in any way indicate the structure it is related to, or would it be arbitrary and thus like looking for a skull in a sand dune based on a ridge that was formed one day and gone the next, yet not knowing where the skull is.

so while the serial approach seems to seek out noise, the parallel bit set approach appears to exist within the noise as a default condition and may involve a different realm of questions and new assumptions. processing noise, ubiquitous noise, contextless needles. localized skews, uncorrected shotglass scenarios.  potentially 1,000s of permutations of encrypting code -- because it is of an empirical 'many' versus 2 or 3 or 5 layered crypto approaches.

another analogy might be a cloud chamber, wherein a serial string or crypto algorithm may be broken if those fleeting cosmic rays were somehow to momentarily light up and reveal a hidden structure via this interaction. and yet the detachment of multiple sets in a bit string may not readily be recognized as a totality, because it could occupy more noise than the cosmic rays introduce into the system, or may not work back to a solution for the shared framework if it were generated randomly or disconnected from the output in its arbitrary range of meaning- the boundary where signal may exist nested in the structure of noise yet not be readily differentiated as a single structure, or as active, unless those dots are connected or revealed, which encryption could seemingly hide and would require a key to decrypt. as if the entire cloud chamber would need to be decrypted, potentially by brute force, throwing every externality into the interior realm, yet it could expand infinitely and still not reveal what is on its inside. or so that is what a conceptualization of nested sets appears to indicate, when in a noisy, randomly generated environment, the signaling not overt.

a monkey-typewriter situation, any probing potentially revealing meaning. maybe the mystery box has produced shakespeare upon a dictionary search or query, and an elaborate false universe opens up, a portal that instead could be activated and sustained as a false corridor and then be made operational with doubling and turning of those trapped inside the mirror, containing and parallelizing the reality, merging yet bounding its action.

thus, probabilities in context of nested infinities could remain unknowns and unstable. querying the arbitrary data set as randomizer would generate its own internal meaning, may or may not be connected to other frameworks, yet ever changing, irrespective of decrypting interpretation. therefore, a stone thrown into this realm could create its own data yet may not have any structural effect on what already exists as it exists, or it may access some angle or detail of a shared framework yet within another perspective or meaning and thus bounded in another use of the same signage, via not knowing what is activated or not in a given moment.

why is the RNG not of cloud code formation, such that:

  RNG = [N1][N2][N3]...[Nn] => #

  such that: N = 0 -> infinity

  (or any calculative function or sign or computation, randomized)

this would create a noise ocean, versus a string of binary bits in terms of a generating structure (seemingly aquarium, in terms of potentially being able to manipulate the environment to create effects to subvert it).
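
read literally, that notation might be sketched as follows (segment widths and counts are arbitrary choices for illustration, not a proposal): each Ni is a randomly sized random number and the output # is their concatenation, so the generating structure is a shifting set of sets rather than one fixed-width binary stream.

import secrets

def segment():
    width = 1 + secrets.randbelow(12)          # '1 to 12 variables' per segment
    return str(secrets.randbelow(10 ** width)).zfill(width)

def cloud_number(num_segments=None):
    n = num_segments or (3 + secrets.randbelow(6))   # even the set size varies
    return "".join(segment() for _ in range(n))      # [N1][N2][N3]...[Nn] => #

print(cloud_number())    # a digit string of stitched-together segments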

--- cloud formations ---

to me the issue of encountering a recognizable pattern or symbol formed of clouds in the sky provides a context for the issue of bounded infinity and its interpretation by an observer.

if the universe (U) were considered the largest boundary, then of all the clouds that may be referenced, it is only the set in certain encounters that provides this meaningful connection, only a limited portion of the sky at a limited time and duration, and it involves weather patterns, quality of light, types of clouds, and also the particular observational framework that provides meaning or links to the symbolism. thus in the set of all clouds it is only a specific cloud code that has this function, and if it is not determined by the observer, it may even appear arbitrary from their perspective.

thus in cloudspace only some clouds are code like this, and it is a very small portion given all the clouds in the sky, for a particular observer.

   cloudspace (clouds {code})

now it may be possible that the generation of cloud code is not arbitrary, and that this reverse-engineering of individual perspective could deliver a meaningful cloud formation on demand, as if by a script, so an observer may see in a given instance a given symbol that may not be noticed by others or perceived as meaningful, except by the observer in a particular framework.

and thus a forced perspective of a sort could format the sky in such a way and write this symbolism into the environment, via magic seemingly. how it occurs is beyond the boundary, observational limits and threshold of understanding of the observer, and yet there it is, decrypted as it were from the other surrounding clouds; if reliant on a time and a unique perspective (ID) it may not be noticed by others or recognized as having meaning, which may instead have heightened significance in a given moment. and thus the cloud could, as a sign, relay a signal and communicate ideas and information this way.

(the comparison with a realm of interior infinities is that it would be entirely populated by recognizable fragments and 'clouds of meaning', as if a particulate gas held together by various atmospheric charge that can be combined or separated and recombined with other layers, and that universe upon universe of cloud formations could be mapped out, and may be ever changing, including if the patterns involved weather-like flows of data that transform or shift the interior context or keep it destabilized.)

a man in the middle attack for looking up at the sky and instead of a state of nature, the clouds could be formed within a simulation of nature and thus the clouds may be data objects that can be manipulated as [signs] of nature, representing nature, yet not actually nature themselves, only images, substitutions. the Wizard of Oz scenario, earth populated by robotic pigeons. the messages could be transmitted in a false POV and false framework, and thus involve a deception that could lead a person to doom.

'the mystery box' contains both scenarios, potentially: allowing for the open extension into the hidden uncompressed domain that could be developed as an interiority, mapping out infinity structures, and also functioning as a boundary space that is a false perspective of this, in some ways sharing the same structure yet within a different zoning that expands otherwise and is controlled otherwise. it would involve knowing which clouds are real and which are pseudo-real, a split of logics, the difference being how the signs ground into different frameworks yet remain correlated and can establish and sustain entangled dynamics within protected boundaries. or so it is imagined possible, given the nature of labyrinths and perspective.

--- what the hell ---

if i knew anything about electronics i would get a cracked artificial quartz crystal ball, internally fit it with piezo sensor, temperature sensor, photodiode on the outside, put tinfoil around it, get a feverish rotating DJ laser and point it on the inside and output readings from the various sensors into various combined number producing sequences.

if really clever i would ask a nanotechnologist to prototype the lottery-ball randomizer at nanoscale or have micromachines tooled to do something equivalent that could be put on a circuitboard. and yet, i wonder, why not just use the internet as the randomizer, do a multi-set search and randomize the output. or take a photograph, run it through multiple parallel set evaluation and randomize that. that is, the reading or data interaction is arbitrary yet bounded, though the calculation itself could be random, nonlinear, and thus add more variability. and maybe this is already the default, though the photograph would have structure; yet if the computation based on the input is also random, how could it be so readily traced back to crack encryption. maybe it is the number of times it is referenced and an issue of boundaries, or that computers are not able to compute 'infinity' numbers to allow this to occur without constraints, such that speed is all important and thus binary or single streams of random digits, not pieced together set after concatenated set ad infinitum. so maybe it is the serial approach that requires it. if not this, why not have non-binary character sets as the output and use that, something potentially arbitrary or unmappable even, as a system. why must it be a number. why not a glyph or fragment or symbol instead.
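
a naive sketch of that folding-together (the photo path and the glyph alphabet are made up, and the modulo mapping is slightly biased): several arbitrary but bounded inputs, a photograph if present, a clock reading, os noise, are hashed into a stream of glyphs rather than binary digits.

import hashlib, os, time

GLYPHS = "☰☱☲☳☴☵☶☷△▽◇○●"    # any symbol set would do; arbitrary choice

def glyph_stream(photo_path="snapshot.jpg", length=24):
    material = b""
    if os.path.exists(photo_path):
        with open(photo_path, "rb") as f:
            material += f.read()               # structured input, randomized below
    material += str(time.time_ns()).encode()   # timing jitter
    material += os.urandom(32)                 # OS entropy as a floor
    digest = hashlib.sha512(material).digest()
    return "".join(GLYPHS[b % len(GLYPHS)] for b in digest[:length])

print(glyph_stream())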

--- cryptic symbolism ---

the HIOX symbol is a square Union Jack, easy to identify.

as far as i got with research was the Egyptian Sacred Cut for its origins, as well as Plato, the Meno i think it was, where the geometry of the square is first cut diagonally, making a quadrant of the HIOX symbol. http://en.wikipedia.org/wiki/Meno

also important in this is dimensionality, nested platonic solids, whereby a single unit (say a toothpick) can be used to generate five different perfect forms (the elements: fire, air, earth, water, aether or electromagnetism) and they each nest within the other forms. this is an entire philosophy that involves geometry and relations between hierarchical structures. a small amount of polymer clay and toothpicks should allow the forms to be built and nested via experimentation.

though an inversion of the HIOX form exists, or an opposite structure which is the same graphic yet half of it is mirrored, so that the diagonals that radiate outward instead touch the midpoint of each edge. it is as if a female version of the form.

there is also a combined symbol with both the male and female dynamics within it, and from this my contention has always been that data could be written into this structure fractally, as if sentences could be wrapped around it at decreasing scale, as if encrypting down to planck scale. it would lose its legibility yet, like a QR code, could be read by machines, as if a kind of microdot or data signature file. in other words, what would the result be if you took the letter A within such a master symbol, then decided where you would next write the letter B within its matrix, at another level of scale, and onward through the alphabet. what if you took a sentence or a book. how much data might be tied to structures that could be written in the empty space, as if a coastline, if in a decipherable order.
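
a toy sketch only, and not the actual HIOX geometry: each character of a message is written into a sub-cell of the previous character's cell, so the text shrinks inward scale by scale, and a reader who knows the cell-choice rule can walk back out and recover the order.

def write_fractal(message, x=0.0, y=0.0, size=1.0):
    placements = []
    for i, ch in enumerate(message):
        quadrant = i % 4                       # simple, shared traversal rule
        dx = (quadrant % 2) * size / 2
        dy = (quadrant // 2) * size / 2
        x, y, size = x + dx, y + dy, size / 2  # shrink toward 'planck scale'
        placements.append((round(x, 6), round(y, 6), size, ch))
    return placements

for p in write_fractal("HIOX"):
    print(p)   # (x, y, scale, character) -- decodable if the rule is shared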

what if data was written into HIOX and decoded by its inverted symbol. or shifted in between the two, etc. questions, possibilities. cryptic code.

Paris - Eiffel Tower
http://www.pinterest.com/pin/178525572702304954/

attachments: 1.5, 2.0