--- overview ---

when cryptography is removed from a computer-only context, the boundaries within which it can be modeled expand into a wider realm of considerations that otherwise may never be questioned, remaining limited to the computational domain and confined to a given threshold of interpretation. the aim, then, is to exit the ideological constraint of 'knowing' a particular approach, and eventually to reenter the computer context with perhaps a wider range of consideration for what might be possible...

i think one of the most evident assumptions about computer cryptography as it exists is the role of mathematics (if not algebra specifically) in defining the model, if not the paradigm, for computer encryption. this could be a false view on my part, as a naive outside observer of these technological events, though it allows consideration of the involved issues nonetheless, however accurate it may be.

the case has been made that [signs] are used in language and mathematics, and that this is the basis for the code used to program software; it is tied into the equations and algorithms of mathematics, yet also into the software that models a cryptographic process, to encrypt and decrypt information. and so it has been questioned: how are these signs themselves modeled in terms of their truth, or has this not occurred, such that potentially ungrounded 'data' and beliefs are the default, as this relates to the language, observation, and relations that are the basis for this cryptographic exchange.

thus a question of [code] could be removed from consideration of the truth of the symbols and signs used in these interactions: their foundation, whether they are actually accounted for in their truth, grounded or ungrounded, as this relates to issues of security, secrecy, privacy, and so on. and so a default condition seemingly exists where this code itself, the programming, and the software/hardware solutions could be ungrounded, if they operate within frameworks and contexts of partial-truth (pT), versus a model that is empirically grounded, not just in terms of the mathematics as 1=1, but accounting further for 1=truth. this seems to be missing in all such equations, as representations can be detached from their accounting.

and so a bulletproof ideology could exist that mathematics equals strong code, on the basis of a 'belief in mathematics' that tends toward the ideological. yet in this way, functions beyond proof and other dynamics may rely on a theorized capacity and a theorized security framework that is itself the weakness, as pT != T, and this structuralizes a basis for exploitation.


     [code] ---> mathematics 


so it is to question a prevailing condition or potential assumption: that the simple act of representing reality can be equated with reality itself, as a belief system that becomes faith-based, or based on personal trust as a security model. the more esoteric the mathematical equations or code, perhaps the more secure, if rigor were not involved; though that appears opposite to the nature of the discipline and its practitioners and developers, in that a community overviews and oversees development of the crypto, and its security and integrity is also a basis for their own.


     [code] ---> mathematics  == security


so a presumption could exist that the involvement or role of mathematics in cryptography is how it establishes its security. this text calls that fundamental or foundational notion into question: is it really true?

another way of evaluating this condition is that 1=1 would be the basis for establishing security. and mathematically it could represent 'truth' via this correspondence or pattern matching of [signs].

and yet in an ungrounded condition, pattern matching of [sign]=[sign] can equate with 'truth' via its representation, yet not its actuality beyond the signage itself; thereby [1]=[1] could remain variable and exist only in a context of pseudo-truth by default, if the ["sign"] is not accounted for in a shared empirical model. it is just language then, communication at a given surface-level interpretation. the missing dimension is the philosophy that validates the truth it potentially involves. it is not just about signs -- their truth must be accounted for beyond this immediate level of calculation. it involves and requires more consideration and evaluation.
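
to make this gap concrete, a minimal sketch in python (the names Sign, glyph, and referent are invented here for illustration, not drawn from any existing system): two signs can match as patterns while what they stand for diverges, so equality at the level of signage says nothing about equality of what is represented.

    # minimal sketch of the sign/referent gap, assuming an invented
    # Sign structure -- illustrative only, not an existing system
    class Sign:
        def __init__(self, glyph, referent):
            self.glyph = glyph        # the surface pattern, e.g. "1"
            self.referent = referent  # what the sign is taken to stand for

    a = Sign("1", "one apple")
    b = Sign("1", "one planet")

    print(a.glyph == b.glyph)        # True  -- pattern matching: [1]=[1]
    print(a.referent == b.referent)  # False -- the grounding differs

    # sign-level equality held (pT) while equality of what is
    # represented did not (T): pT != T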


     [code] ---> mathematics  != security


in other words, it is to propose that 'truth' is not by default within the signs used to represent a situation, because they can be ungrounded and function as if literature - describing a situation, though with variability by default. it is an issue of relativistic observation and of how the data that is modeled is accounted for, removed of error or reliant upon it; thus the gap between what is real and what is represented is at issue.

it would seem to involve 'pattern matching' as a concept: the basis for the ability to establish 1=1 correlations in terms of number and its processing via symbols and other signs that map into the world and manipulate these relations and frameworks. and thus, as stated, this A=A consideration belongs to the domain of establishing truth via logical reasoning, and so a realm of thinking about concepts and ideas is involved, underneath the establishing of these mathematical models and processes - the ideas involved in these relations, as they become formalized. there is incoherence at this level, as currently the ideology of binarism makes this connection an issue of shared belief in an approach, versus its truth beyond a given boundary.


     [code] ---> mathematics (concepts)


so it is to question what is going on at a more fundamental, foundational level prior to this cryptographic modeling of information: what if the concepts themselves are ungrounded in some way, such that variables or representative signs or equations may not be /empirical/ by default, and could instead exist in domains of relativistic skew, distortion, and bias in hidden ways that could also be exploited or subverted. thus while the [signs] may be operational, are they actually grounded in truth, or in some belief system that equates with truth because it is assumed to exist in the signage and not beyond it - in the manipulations, not in what is referenced.


     [code] ---> mathematics ('concepts')


thus to consider the deeper truth involved in such conceptualization. this relates number to letter in terms of use as signage, as both can function as language systems that are in ways integrated, yet assumed to be differentiated at the level of mathematics versus, say, the writing of a novel or short-story fiction, as this is believed different from novel algorithms, exploratory equations, and theoretical proofs. what if the mathematical viewpoint is ungrounded or relativistic, for instance, or the literature more objective than equations filled with numbers?

ultimately, the mathesis involved appears not to differentiate a model of empirical truth in terms of A=A equivalence from either math or language, in that both could be evaluated in this same context from the beginning - a shared modeling, in other words, potentially. so the alphanumeric code could be integrated at a substructural level with the [signage] that differentiates the mathematic and the linguistic, yet this could also be a false perspective, an inaccurate belief and mistaken assumption - perhaps they are one system.

so where this is going is to consider a particular existing viewpoint and interpretive framework for crypto, especially as defined by computers and peripherals, that establishes a fixed idea about what it is and involves, and yet may involve these hidden boundaries that are warped or biased towards certain interactions or investigations and can disallow others.


     [code] ---> mathematics (algebra)


an example: consider that a given approach to crypto involves algebraic functions as a paradigm, as this relates to computation. this approach then becomes the context for evaluation and mediation of representative [signs], which may also be bounded in their interpretation in this way, due to the delineation between mathematical and linguistic data. the "algebra" may only be conceptualized and believed to function at the unit of [signs] and their manipulation as signs, and not involve algebraic computations within the [signage] itself, potentially, in terms of subsign units.

this is to attempt to convey that a boundary condition could be upheld that views and models language as inviolable in terms of certain existing rules, such that a [word] is viewed as a unit and not considered in its inherent variability, in terms of this same potential algebraic functioning. in that framing, the math is the math and the language is the linguistics, and the math is doing things to the language based on particular established relations and boundaries - about what these relations are and how they are believed to function - based on convention if not ideological views and understanding.

it is very abstract and perhaps inaccurate as stated here, yet it seeks to ask: to what extent is the information viewed as passive, inert, and non-meaningful, as this relates to its transformation (encryption) and reconstitution (decryption)? where is the boundary for this transmutative relation and its dynamics? is it inherently what mathematics does to language, from an outside-in approach, such that mathematics acts upon the [signs]? or might it potentially involve accessing an inherent mathematical structure within language itself, such that a different boundary or relation could allow the language itself to be the basis for the algorithms and equations, or bridge across these in a different, more integrated and meaningful way.
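
to make the 'outside-in' boundary concrete, a minimal sketch (python; a toy XOR cipher used purely for illustration, not a secure or recommended construction): the mathematics acts on the message as opaque byte units, and nothing of the sign's internal structure participates in the computation.

    # outside-in: the math operates on the message as opaque units.
    # toy XOR cipher, for illustration only -- not a secure construction.
    def xor_transform(data: bytes, key: bytes) -> bytes:
        # each byte is treated as an indivisible unit; its internal
        # shape or structure as a sign plays no role in the math
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    key = b"k3y"
    ciphertext = xor_transform(b"attack at dawn", key)
    recovered = xor_transform(ciphertext, key)  # XOR is self-inverse
    assert recovered == b"attack at dawn"

    # the letter 'a' enters and exits this computation as the byte 0x61;
    # whatever geometry 'a' has as a sign never participates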

it makes little sense without visualizing it, yet the email flintworks of this era of devolving infrastructure and tools make it difficult to convey in the given medium, thus limiting what can be easily and accurately shared, and in what ways - forcing the perspective for signage, and thus relationships.


     [code] ---> mathematics (geometry)


likewise, if cryptographic operations involved a geometric modeling of data, this could also apply to how the content of the encryption scheme is then evaluated and processed. and again, an issue of boundaries: how are the [signs] considered in terms of the language or messaging involved? is this an outside operation of geometry that transforms 'information' measured in units of words and sentence structures and their formatting, or may it potentially involve more than this, such that beyond this limit a subsign geometric structure could exist, be connected to, and become a basis for this transformational processing? thus the 'truth' of the signs as these relate across the conventional line separating mathematics and linguistics, in terms of a shared patterning that involves both domains.


     [code]  ==  (mathematic & linguistic)


so if considering the issue of boundaries and representation, and how logic establishes these structures of observation and perception and modeling, perhaps code itself, in its truth, involves a more fluid interaction across these domains than the traditional viewpoint can acknowledge, as this relates to the concepts involved and how they are approached - for instance computation, equations, algorithms, how data is processed and encrypted.

in terms of pattern matching (A=A), this could span a model of code as both a mathematic and linguistic structure, given 3-value and N-value logic. in this way, the [sign] itself need not only have a mathematic operation transforming it from the outside, at an external boundary; the processing could instead occur inside, and consist of its own equations, based upon inherent calculative dimensions of its symbolic or sign-based linguistic structuring (as language). in other words, a [word] could have calculative and computational potential built into it, in terms of its patterning, and yet if the word is not allowed to be evaluated beyond its whole conception, the subsign structuring may be off-limits or inaccessible by default.

this is to include smaller units than the word as sign, down to individual letters, whereby for example the letter [Z] may only be evaluated in terms of its being 'z', and not its components or ~various structural relations with other letters, such as S|Z or N/Z, or with numbers: Z|5 and Z-2. though of course there is more to it than this, because the same structure can be taken apart and evaluated in its individual components: -/_ or > and <, etc
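
as a rough sketch of what such subsign evaluation might look like (python; the stroke inventory below is invented here for illustration, one of many possible decompositions, not any standard): letters are taken apart into components, and relations such as S|Z or Z|5 become computable as shared structure.

    # rough sketch: letters decomposed into stroke components.
    # the component inventory is invented for illustration.
    components = {
        "Z": {"top-bar", "diagonal", "bottom-bar"},
        "S": {"top-curve", "diagonal", "bottom-curve"},
        "N": {"left-stem", "diagonal", "right-stem"},
        "5": {"top-bar", "left-stem", "bottom-curve"},
        "2": {"top-curve", "diagonal", "bottom-bar"},
        "L": {"left-stem", "bottom-bar"},
    }

    def shared(a: str, b: str) -> list:
        # a structural relation between two signs, below the sign level
        return sorted(components[a] & components[b])

    print(shared("Z", "S"))  # ['diagonal']
    print(shared("Z", "2"))  # ['bottom-bar', 'diagonal']
    print(shared("Z", "5"))  # ['top-bar']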


     [code] --->  pattern matching


so the idea is that programming itself is based within code, and within issues of how it is modeled and how it represents the world, and it is to question whether this is actually, truly grounded or based in an ideological belief system. it is assumed there is partial grounding, in some ways, though a realm of error or a gap exists between what is modeled and what exists (pT != T), and this includes the conceptualization of code itself as signage.

likewise, the default boundaries of this code could affect how it is managed and processed, and within what parameters. thus the heavy reliance on mathematics as if the basis for this strength, yet with the same potential as a weakness if it too is ungrounded, or only partially so, in terms of the potential for exploits based on these erroneous notions and beliefs. (A=B)

the cryptographic consideration in this scenario, then, is of how signs are processed and conceived of, as units to be transformed by equations, as if the basis for objectivity, yet without accounting for this in logic itself (T) beyond the level of the [signage], such that pattern matching of signs is believed to be directly equivalent with truth itself - thus 1=1 is truth - while not taking into account what this potentially represents, in its truth.

and perhaps this is the issue with language and observation as a context for the mathematic: how the internal processing of a person is externalized, such that ungrounded views and beliefs can be made structural and equated with [signs] via shared assumptions and viewpoints, which, because they are shared and agreed upon, are themselves believed to be true. binary logic and ideology are what allow this perception as a default condition, yet it can be and likely is ungrounded and based within relativism, automatically - in other words, it occupies a framework of pseudo-truth that is continually expanded via endless viewpoints which, in their interaction with other such views, even as agreed upon and confirmed as shared observation, tend towards nothingness (0) as a perspective instead of towards truth (1).


     [code] ---> (signs/symbols)


thus it is to consider the code in terms of this issue of signage and of boundaries, as it involves interpretation beyond these, to what they are referencing, where their truth can be accounted for in its accuracy as a model or representation. ungrounded relativism has no need of this extra step, and in this way mathematics can freely function as if writing...

thus the vital issue of error-checking and correction of code at the level of the signs used to represent ideas and concepts (mathematics, crypto models), as this exists beyond equations and algorithms and into a realm of ideas, of how truth is evaluated, and of the requirement of this in terms of security.

all of this to establish and allow a conceptualization that follows, one that considers programming and code for cryptography in what may be perceived as an off-limits consideration: that of typography.


--- crypto.typologic ---

in the same way that crypto is conceptualized as related to mathematics, it is also proposed that typography has a connected structural relevance to this crypto~graphic inquiry.


     [crypto] ---> [mathematics]


in other words, within the linguistic context that also establishes and defines approaches to cryptologic systems and their cryptographic conventions, it is to consider the boundaries separating their interactions...


     [crypto] ---> [linguistics]


in other words, what if at the level of representation within code itself there is a boundary or limit or threshold condition, upheld by convention, that is itself arbitrary - a forced perspective, even - and that it could be holding back other options and perspectives for the questioning involved...

for instance, encryption that involves algebraic and geometric operations and functions, as these may be bound to mathematical transformation of signage, yet at a certain bounded condition - outside or upon the sign itself, or at its periphery, versus within it, in terms of its subsign dynamics or subsign meaning.


     [crypto] ---> [mathematics] --> [signage]


this approach is essentially to consider the relation between mathematics and language, in a context of linguistics, whereby a calculus could exist that bridges the distance between what is traditionally viewed as the objective (A=A) and the subjective (A=B), as this corresponds with numbers and letters, here in a context of signs and symbols or various patterning.


     [crypto] ---> [math/linguistics] ---> [signage]


what if, for instance, the context for evaluation of data, pre-encryption, were based in a combined A=A boundary established by *mathesis*, such that the signs evaluated and transformed had this larger dimensionality involved from the initial consideration, versus a bounding of the linguistic within the mathematic, potentially, as a set(subset) relation: mathematic(language).

in this way, equations could be limited, skewed, or bounded by a particular relativistic interpretation that may assume accuracy due to shared views, yet be based upon or rely upon mistaken assumptions while believed true - even while the signs themselves may exist or persist beyond these boundaries and could be accounted for otherwise, yet are not evaluated, due to being off-limits.


     [crypto] ---> [geometry/algebra] ---> [signage]


thus the consideration of signs and collections of signage within crypto communications and messaging could exist in a calculative context, and this could involve both mathematic -and- linguistic computations by default. yet software evaluations may bias a mathematic approach, establishing equations and algorithms in terms of numbers and not letters, due to convention and an inherited mindset about what parameters exist and how computation takes place - at the level of pattern recognition of signs, yet not of the underlying truth these signs map to and reference. in this disconnection lies the potential for a representational short-circuiting between what is represented and calculated and what is actually real and true: ungrounded observation and computation, leading to relations and crypto exchange that are insecure by design, versus a model that is empirically grounded, error-corrected, and constantly under evaluation in terms of its truth, including that of its content, the signs it involves.


     [crypto] ---> [linguistic] ---> [signage]


it is in this conflicted condition that the linguistic evaluation of signs can establish a foundation for truth via the de|con-struction of signs into their more elemental armatures. this evaluation can occur in terms of various structures, such as nouns or verbs, or sentence tree diagrams, or hundreds of other approaches to evaluating how language is structured and how this maps into meaning and its verification of some perceived truth - though this could still be at the level of pattern matching of signs, and not of actual knowledge itself. such that a boundary may exist between mimicry-based AI and intuitive computations based on a comprehensive model of grounded empirical knowledge, due to this gap and approach to computation - say, reliance on binary approaches and constraints that force viewpoint, etc.


     [crypto] ---> linguistic  (algebraic/geometric)


so all of this background text is required to establish a given framework for evaluating a pending alternative conceptualization, one that considers and recontextualizes cryptology within a computational context of linguistics - yet potentially in a territory beyond existing perspective, involving subsign computations that are not mapped into traditional adjective/noun and other existing models, yet can likewise potentially be interconnected with them in various structural entanglements, as patterns collide, form, and mutate based upon relations and dynamics of sign-al processing.

in other words: why not have algebraic and geometric functions and ~various operations within the signage itself, instead of at a protected boundary that limits such computation to a realm of numeracy? why not run an algorithm that transforms or relates subsign units - whether letters or words or sentences or paragraphs, or all of these together (in terms of nested superset-set-subset dynamics) - such that the [signage] is itself transformed, encrypted, versus a secondary wrapper or envelope or container that "represents" this encryption of plain-text interior content.
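
a toy sketch of what this might mean in code (python; the stroke alphabet and keyed shuffle are invented for illustration, and this is in no way a secure cipher): the plaintext is decomposed into subsign stroke units and the transformation permutes that stream of strokes, so the signage itself is what gets scrambled, not an outer container around it.

    # toy sketch: 'encrypting the signage itself' by permuting subsign
    # stroke units. invented stroke alphabet; illustrative, not secure.
    import random

    strokes = {"L": ["|", "_"], "T": ["-", "|"], "I": ["|"],
               "E": ["|", "-", "-", "-"], "V": ["\\", "/"]}

    def to_strokes(text: str) -> list:
        return [s for ch in text for s in strokes[ch]]

    def keyed_order(n: int, key: str) -> list:
        order = list(range(n))
        random.Random(key).shuffle(order)  # key-seeded permutation
        return order

    def encrypt(text: str, key: str) -> list:
        units = to_strokes(text)
        return [units[i] for i in keyed_order(len(units), key)]

    def invert(cipher: list, key: str) -> list:
        units = [None] * len(cipher)
        for out_pos, in_pos in enumerate(keyed_order(len(cipher), key)):
            units[in_pos] = cipher[out_pos]
        # returns the stroke stream; re-chunking strokes back into
        # letters would need an unambiguous stroke grammar
        return units

    c = encrypt("LIT", "secret-key")
    print(c)  # the strokes of the message, reordered: the sign
              # boundary between letters no longer exists here
    assert invert(c, "secret-key") == to_strokes("LIT")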

one approach to this, of a vast and innumerable many, would be to evaluate the typographic condition of code itself, as a basis for what is and what can be ~programmed, in what terms and parameters, based on how tools function and how the cryptologic and cryptographic situation is conceptualized...


     [crypto] ---> [typography] ---> [signage]
 

in other words, the geometry of the signs themselves: letters, as with numbers (though to focus only on the former as the primary example), have within their patterning an implicit structure that graphically relates to other alphanumeric characters. thus the unit of measure, whether an individual letter or its combination into words, can become a basis for evaluating these relational dynamics in terms of shared dimensionality - the shared scaffolding of logical connection that pre-exists other evaluations, or else informs them, and can provide an additional framework to map onto considerations whereby letters and numbers themselves are entangled in their connectedness, their likeness and unlikeness as patterns, and this is inherently ~variable.

such that a letter such as 'y' may relate to the letter 'v' in one context, whereas if rotated it may relate to the letter 'h'. this transformation is inherent in all letters and their combinations. letters alone may have such properties, though so too words, via ambigrams or other evaluations. yet the question goes further than this, into a realm of abstraction that is perhaps approximate to moving the question of typography from an interpretation of fonts and font styles to that of abstract patterning that may no longer be legible as a decipherable language, given the potential to break apart each letter into subsign units - say a capital letter L into its components: | _

and in this way, how might [code] and geometric calculation exist in such a transmutational context of alphanumerics, one that breaks the model of literacy or goes beyond its existing boundary, into other realms of interpretation? such that the ascender and descender, mean line, baseline and median, and arms, spans, bowls, shoulders, counters, and terminals become graphic units that are potentially computational, if they are standardized and aligned. this is what the HIOX model of alphanumerics opens up and allows, yet it could go to a much higher level of resolution, given the details of language and how these sign systems exist across all languages, potentially, mapping into a master symbol that reverse-engineers all language in a single view.
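
a sketch of how the HIOX idea could become computational (python; the segment names and glyph assignments below are a simplified invention for illustration, not the actual HIOX layout): every glyph is drawn as a subset of one shared segment scaffold, so letters become bitmasks, likeness becomes arithmetic, and rotation becomes a permutation of segments.

    # sketch of glyphs as bitmasks over a shared segment scaffold,
    # loosely after the HIOX idea; the layout is a simplified invention.
    SEGS = ["top", "bottom", "left-hi", "left-lo", "right-hi", "right-lo",
            "mid", "stem-hi", "stem-lo",
            "diag-tl", "diag-tr", "diag-bl", "diag-br"]

    def mask(*names):
        return sum(1 << SEGS.index(n) for n in names)

    GLYPHS = {
        "H": mask("left-hi", "left-lo", "right-hi", "right-lo", "mid"),
        "I": mask("top", "bottom", "stem-hi", "stem-lo"),
        "O": mask("top", "bottom", "left-hi", "left-lo",
                  "right-hi", "right-lo"),
        "X": mask("diag-tl", "diag-tr", "diag-bl", "diag-br"),
    }

    def likeness(a, b):
        # shared segments between two glyphs, counted as a number
        return bin(GLYPHS[a] & GLYPHS[b]).count("1")

    print(likeness("H", "O"))  # 4 -- the shared verticals
    print(likeness("H", "X"))  # 0 -- disjoint in this layout

    # a 180-degree rotation is just a permutation of segment indices
    ROT180 = {"top": "bottom", "bottom": "top", "mid": "mid",
              "left-hi": "right-lo", "right-lo": "left-hi",
              "left-lo": "right-hi", "right-hi": "left-lo",
              "stem-hi": "stem-lo", "stem-lo": "stem-hi",
              "diag-tl": "diag-br", "diag-br": "diag-tl",
              "diag-tr": "diag-bl", "diag-bl": "diag-tr"}

    def rotate180(m):
        return sum(1 << SEGS.index(ROT180[SEGS[i]])
                   for i in range(len(SEGS)) if m >> i & 1)

    # H, I, O and X each survive rotation intact on this scaffold
    assert all(rotate180(GLYPHS[g]) == GLYPHS[g] for g in "HIOX")

in such a model, 'y' relating to 'h' under rotation would no longer be an analogy but an equation over shared segments.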

in this way, from [code] to [mastercode], if not from many relativistic codes into a shared framework of a grounded empirical model that is based within and references the same evaluation of (paradoxical) truth in its pattern matching. this is ancient stuff, the ideas involved in the formatting of civilization.


The Orphic Trilogy, Cabaret, GiGi

Π Ω δ