[13] typographic processing

brian carroll electromagnetize@gmail.com
Fri Oct 18 00:22:37 PDT 2013


--- overview ---

the potential for a linguistic component in models of and approaches to
encryption has been introduced, perhaps beyond the normal boundaries of
consideration as this relates to sign-based relations and dynamics.

the context for this perhaps also correlates with a recent article on
Google and its geometric approach to language translation...

How Google Converted Language Translation Into a Problem of Vector Space
Mathematics
http://www.technologyreview.com/view/519581/how-google-converted-language-translation-into-a-problem-of-vector-space-mathematics/


...in that pattern matching has a primary role in establishing structures,
taking note of the algebraic relations of [sign] - [sign] = [sign]
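
to make the arithmetic concrete, a minimal sketch in python of this
[sign] - [sign] = [sign] relation, assuming toy vectors in place of a
trained embedding (the king/man/woman/queen relation from the cited work):

  import numpy as np

  # hypothetical 4-dimensional embeddings, toy values for illustration
  vec = {
      'king':  np.array([0.9, 0.8, 0.1, 0.3]),
      'man':   np.array([0.5, 0.1, 0.0, 0.3]),
      'woman': np.array([0.5, 0.1, 0.9, 0.3]),
      'queen': np.array([0.9, 0.8, 1.0, 0.3]),
  }

  def cos(a, b):
      # cosine similarity between two vectors
      return a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))

  # [king] - [man] + [woman] lands nearest [queen]
  target = vec['king'] - vec['man'] + vec['woman']
  print(max(['man', 'woman', 'queen'], key=lambda w: cos(vec[w], target)))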

again, it does matter how accurate this representative signage is, in
terms of what it is referencing and in what frameworks. and thus pattern
matching that is ungrounded could tend towards a false perspective,
whereas pattern matching that is grounded could tend towards realism. a
boundary condition exists, and like the manual settings on a camera it
could establish how a situation is viewed, given light, f-stop, focus,
distance, etc

the potential exists for a quadratic equation-like review of data sets in
nonlinear contexts, churning through variables upon variables and searching
out equations. this can happen in terms of numbers and their conceptual
frameworks, like physics or other formulae, though it could also exist at a
more abstract level of pattern matching of various subsign components and
elemental structures.
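
a minimal sketch of such churning, assuming an invented data set and a
small invented candidate space of equations:

  # observed data with a hidden rule (y = x*x), and a small candidate
  # space of equations to search through; both invented for illustration
  data = [(1, 1), (2, 4), (3, 9), (4, 16)]

  candidates = {
      'y = x':     lambda x: x,
      'y = 2*x':   lambda x: 2 * x,
      'y = x*x':   lambda x: x * x,
      'y = x + 2': lambda x: x + 2,
  }

  def error(f):
      # sum of squared differences from the observations
      return sum((f(x) - y) ** 2 for x, y in data)

  print(min(candidates, key=lambda name: error(candidates[name])))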

thus, symbolic processing and symbolic computation could take place in such
a scenario if accurately modeled, the variables conceptually understood.
this is about the grounding of signage in its truth, and this
evaluation could exist below the level of the word, letter, or even number
or could exist in terms of images and what signs exist within the images,
in terms of their nested and-or dynamic relations. it is perhaps a future
test of such an approach, can a computer interpret a painting in terms of
its content, based on its perspective, or does it only pattern match what
others perceive about it, and thus mimics observation and knowledge. it
would seem the latter is an ungrounded evaluation of signs, matching the
sign with the sign for its model of truth, whereas the former would be a
grounded evaluation of signs, in an empirical model of truth, bounded by
computer observation in that it may not 'like' or love or hate the artwork,
though through diagnostics could statistically pinpoint its location
amongst various mapped qualities of other such works and ideas, and gauge
its like and unlike characteristics, evaluating against a human viewpoint

and while this mapping or correlation could occur between [signs] that are
computationally processed in this way, via geometry, as with language in
the above example-- the question in this instance is what might be possible
if this geometric evaluation broke the barrier of the sign itself, such
that the geometry of the signage was broken down into its component parts
and evaluated for its dynamic relations, as if the DNA of alphanumeric
characters that compose both math and linguistic units and algorithms
(equations if not words, sentences). further mapping could then take place
beyond the given boundary, and be used computationally and for empirical
structuring. this additional data may not relate to the word itself
because it is only partial, as different languages have their own words
and perspectives; though in identifying these structures, it could perhaps
function as another level of language that universalizes different
characters into meta-characters that, like QR-codes or some fractal
language, exist in a context of an alphanumeric master code, reverse
engineering the context for an original alphanumeric system (which HIOX is
for western civilization)

what if the quadratic pattern matching occurred as an aspect of machine
translation at the level of computation, and perhaps only machine readable
in some layers or levels, that could move into N-dimensional frameworks,
and how this could relate to cryptology and modeling of information in
various ways. it is easy to conjure up What Ifs, though in considering the
impact of transcending given boundaries of data modeling, it does appear to
go beyond the existing conceptualization, into another territory entirely
than what the tools and software and ideology allow to be considered, and
thus interpretation is bounded and limited by a particular fixed viewpoint
or forced perspective that such inquiries directly challenge, as to what
exactly these issues consist of, in terms of their truth

and it is this truth of matching patterns and A=A and A=B evaluation and
differentiation, that subsign indicators or components and elements could
be considered to have computational characteristics that could be a realm
for the application of algebraic, geometric, and various calculus-based
approaches to transforming and relating words, structures, ideas, content,
in the very way that cryptographic equations do for the signs themselves,
when contained inside or within a crypto model, though bound or inviolate
as data itself, which then establishes the "THIS IS SECRET DATA" scenario
that is a strange confession, when everything is meant to avoid the easy
reveal

so these boundaries and the way signs are interpreted affect where the
algebra and geometry and transmutation of data begins and ends, in terms of
what is transformed and how, in what dimensionality, parameters, and
terms.

and thus by opening up the sign itself to consideration, in terms of its
patterning and various structuring, its subsign scaffolding and likewise,
the /superposition/ of various shared and unshared elements within its
particular or unique design, as could be empirically modeled and evaluated,
it would then be possible to establish 'many truths' about the sign that
go beyond just its meaning or its pattern match at the macro-level of the
sign, into its range of interconnections and the frameworks it shares with
other concepts and words, including across various language systems and
character sets, as these may be bridged by geometric or other trusswork

perhaps there are new mathematics involved or that this is mathesis in a
realm of signs and symbols that function as an alchemical model that goes
beyond the normal rules for the molecular, conceptual ~forms into other
esoteric dimensions, wild abstractions or unknowns, seeking out or finding
and testing hypotheses to establish what is possible and potential within
the gap, and the distance between conceptualizations in their otherness

the ruin of description here is detached abstraction that is easy to get
lost in and yet it is also necessary to provide context, which in this
moment involves a chasm or void that would be encountered beyond the given
boundary of alphanumeric computation, into a flatland-like extradimensional
realm that perhaps could not be visualized except by a quantum computer,
due to its massive linearity combined with infinite connectivities that
would likely involve change as a variable, or weather systems for the data
that like leaves on a tree move in response, or become wet when it rains,
or fall away in autumn and regrow in spring. so too conceptualization, as
it may be born and sustained and then parts decay while others survive.
and so to consider starting such an evaluation is to consider a process
closer in kind to nature than to the filesystems and architectures of
computers today, seemingly. in that the data model this
involves would be the foundation for a shared empirical model referenced
and observed and contributed to from multiple perspectives simultaneously
and thus an expansiveness that involves bit sets and their permutation,
though also the components of these [bits] as signs, deconstructing the
elements and considering the myriad connectivities of subsign relations and
dynamics, in their linguistic and mathematical importance, which could be
relevant in terms of encryption, translation, data modeling, for instance


--- typology ---

so there is a theme of typography connected to this symbolic processing of
information via computation, and it is proposed this is a model or
approach more like the one humans rely upon for advanced awareness,
fundamental knowledge, and heuristic reasoning, yet it is unaccounted for
in the technology and tools of today, turning highly literate people into
illiterates in the binary context, for the dissonance that a false
perspective brings in terms of thinking, making it seem irrational in the
limited, warped, biased view.

so typography has letters and numbers that are represented as signs and
this goes from printing era blocks of type and machinery into the present
day era of electronic pre-press and desktop publishing, from dot matrix to
ink jet, laser printers, page layout tools (QuarkXPress), vector and
bitmap graphics software (Illustrator, Photoshop) and specific software
for typography (Fontographer, other) -- and this is not Mathematica
territory in terms of the manipulation of signage as ideas or concepts, it
would seem; instead it upholds a particular boundary or conceptualization
of what type is, what images are. the meaning of fonts and representation
of words and letters and numbers appears to occupy a narrow interpretation
that is mainly visual, about geometry and usability for communication,
enhanced with stylistic refinements, a range of type expressions

and pretty much none of this relates to what i am attempting to evaluate.
none of it accesses these dimensions, except perhaps modeling text within
a 3D program and spinning and mirroring it, yet that is an insane
approach, more of a one-off Kryptos sculpture than a way of exchanging
information. the graphics program 'rotate' and horizontal and vertical
mirroring are another very basic aspect of this, though again it involves
the [sign] or text as an image, to be manipulated at its macroscopic level
and does not allow different dynamics to occur within a given word,
whereby some letters may have one function and others another, as this
involves a larger document of, say, twenty different co-existing
variables, changing the text from one document into another series of
parallel perspectives.

typology is not typography, yet it is a vital concept in this situation
because [signs] are essentially sets, and subsign characteristics are the
variabilities nested within the set, in various subset dynamics, 'pattern
recognition all the way down' to zeros and ones.

typology, in the terms i relate to it, is easy to understand in terms of
infrastructure and building types. a TV station is a type of building that
is like a component on a circuit-board. a nuclear power station. gas
station. and these have various programming that is unique and overlaps
with other buildings, such as a home dwelling or school. telephones for
instance or network connections. and likewise, [signs], [words] as they
consist of substructural patterns, including geometric characteristics
though also correspondence with other letters and numbers, as part of this
overlapping of functionality and meaning: 5 and 2, S and Z. it may be seen
as irrelevant in terms of a discrete letter or number, yet within a shared
context it can bring additional meaning or reveal interconnected patterning
that could be useful or applied in a calculative schema. such that data can
be hidden or revealed via such means, potentially, yet appear as noise
otherwise. or only /fragments/ or parts of the characters may appear yet in
their statistical or probabilistic frequency, may tend towards a particular
viewpoint or interpretation versus another, or could remain ambiguous and
hidden, unless having the missing key to know which tumblers to turn, etc
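
a minimal sketch of how such shared-structure classes could hide data,
taking the 5/S and 2/Z correspondences above as illustrative class
assignments:

  # structural classes whose members overlap in form; the choice of
  # which member to render can carry a hidden bit. class assignments
  # here are assumptions for illustration
  classes = {
      'S': ['S', '5'],
      'Z': ['Z', '2'],
      'O': ['O', '0'],
      'I': ['I', '1'],
  }

  def embed(text, bits):
      # render each substitutable character according to the next bit
      out, i = [], 0
      for ch in text:
          if ch in classes and i < len(bits):
              out.append(classes[ch][bits[i]])
              i += 1
          else:
              out.append(ch)
      return ''.join(out)

  print(embed('SIZZLING SOIL', [1, 0, 1, 0, 1]))  # -> 5I2ZL1NG SOIL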

so while typology is not describing fonts or "type" on a computer display,
it has some relevance to typography in its set-theory evaluation, as this
relates to signs and symbols and transformative relations, their potential
alchemical connections, as this perhaps even moves between stereotype and
archetypes, as concepts are removed of grounded truth and copied over and
over, where this truth is replaced by its sign, increasingly ungrounded,
and thus 'vision' and IMAGE replace and stand-in for understanding, and so
too in this way, pattern matching is removed from its logical foundation
(A=A) where this can be about matching signs with themselves as if truth,
yet this is only pseudo (pT) and when standardized, moves towards falsity
as a shared framework, the false perspective at this layer of evaluation

image versus idea, the disconnect between forms of signage and content,
everything foregrounded and the empirical background removed, thus perhaps
an early cosmos condition, from order into chaos, from one view into many
as the limits of an older viewpoint could not withstand and keep up with
the changes, then only now to realize the consequences of lost modeling,
lost grounding, lost conceptualization, lost reasoning, lost communication,
lost sensibility, lost relations, lost truth, lost principles, everything

thus the situation that now exists is basically opposed to evaluating these
ideas due to proscribed boundaries and fixed interpretations, and thus a
lot of writing is necessary to build up a structural framework to share
this other viewpoint within, to seek to describe the situation, model it to
some extent, and offer an alternative approach into another realm which is
seemingly another territory for language, appearing largely unexplored, as
if a hidden potential of or within language, as if it were to be unlocked
and made nonlinear by default of such restructuring, recontextualization,
as this changes perspective, going from unconnected linear viewpoints in
serial progression to massively connected parallelism of shared views

in some sense the 'type' (non-font version, as in typical or repeated unit)
is to consider the various ecology of the sign-ecosystem in terms of a
breakdown of these various repeated yet also unique structural relations,
across various classes, categories, and groups that map to meaning though
also to geometric and or other principles. for me Plato's mention of this
knowledge of language when mirrored in water is a form of calculus, and
thus figuring out what sets belong to what characteristics would be to then
group or identify them according to type, function, programmatic qualities,
potential or variable meaning -- yet here is where it is like an orbital
cloud... because this involves *potentials* for these set relations, the
dynamics are about superpositions that may be nested within a structure,
not visible immediately, else emerging or hiding upon consecutive changes
and thus, there could be a flurry of potentials in any given bit set, both
in itself as a compact sign, such that [dob] which could be 'date of birth'
could transform into [pod] given particular instructions. and what does
that mean or what does that mutative characteristic make possible? well, in
a crypto context, you could hide instructions that reveal that data, and it
could involve several layers based on multiple or single keys that unlock
such a hidden perspective within a text, even while in plain text mode. yet
it could go much further than this, into questions of universal language
and unification of all data in a single model of grounded empirical truth,
as patterns map back to A=A (1) or remain in a realm of ambiguity (A=B)
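
a minimal sketch of the [dob] to [pod] mutation, assuming small
per-letter transform tables as the hidden instructions (the tables cover
only the letters needed here and the assignment of transforms is an
assumption for illustration):

  # per-position transform tables as the hidden 'instructions'
  rotate_180 = {'d': 'p', 'p': 'd', 'b': 'q', 'q': 'b', 'o': 'o'}
  mirror_v   = {'b': 'd', 'd': 'b', 'p': 'q', 'q': 'p', 'o': 'o'}
  identity   = {}

  def apply(word, instructions):
      # each letter position carries its own transform
      return ''.join(t.get(ch, ch) for ch, t in zip(word, instructions))

  # the key: rotate the first letter, keep the second, mirror the third
  print(apply('dob', [rotate_180, identity, mirror_v]))  # -> pod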

so again, the questions involved seem to be about the unit of measure,
where is the boundary for considering what a pattern is and what parameters
are evaluated, in what frameworks, dimensionality, conceptual terms. a way
of encapsulating a small aspect would be that the entirety of linguistic
conceptualization could be tied to the scaffolding of language, worldwide,
in this way, via particular approaches to abstraction. and other models
could tie into particular concepts, say [economics] branches out to all
other data sets in various ways, and interconnects in that pattern-based
infrastructural conduit, though in its own dimensionality which may or may
not overlap with hypotheses of other signage. and thus like protocols and
layering, perhaps there is a particular layer this is occurring on, and who
knows, perhaps crypto is its own layer within this model of knowledge, such
that its truth is reflected in how it is represented and modeled as an idea
and that this information shapes the perspective from the ground up- so if
the purpose is protecting or securing communications (truth), how might
this fit into the various contexts that are embattled or under siege or
boundaries need to be created or data needs to be hidden, as this can be
evaluated in terms of truth, firstly, and not applied crypto systems that
may rely upon false perspectives and assumptions of unaccounted variability
that may largely undermine or exploit this purpose or outright prevent it

if you have ungrounded observations, ungrounded mathematics, ungrounded
language, ungrounded communications, ungrounded relations, ungrounded
reasoning, ungrounded beliefs, ungrounded ideas - and then expect that some
software and hardware solution and encryption model is going to deal with
all the security exploits inherent in those, due to protecting [signs] in
their inviolable state, it would be perhaps to limit questioning to a much
smaller set of criteria than what the situation actually exists in, and
thus a false sense of security, knowing, secrecy, privacy, that may rely
more on belief if not ideological compliance, to not destroy the model and
allow it to persist, as if this validates the process, than to rigorously
question the frameworks in their depth beyond issues of representation that
exist in biased and institutionalized agendas, driven toward certain ends.
and while most probably find themselves situated in something preexisting,
and few probably have chosen these parameters from the start, it is very
difficult if not impossible to do anything else, given the fixed idea of
what cryptography involves, if the tools themselves are bounded and limit
what can be considered or evaluated- which is a perspective forced also by
indoctrination into an orthodoxy of patterning, "received ideas" that are
foundational yet flawed, and instead of theory and law should be brought
back into a realm of falsification and hypotheses, because perhaps
people are trading in dead and rotten fish, skeletons of previous solutions
that are by now carcasses or firmly established yet potentially ungrounded
in what is actually going on as it exists, versus as it is believed to

maybe the truth of this is that security is not possible, and not allowed
if there were a dictatorial framework, and thus a limit allows a false
sense of security by going along with the charade. maybe it is purposive to
not allow tools to serve ideas - think: the internet has no model for ideas
and yet is chock full of data and information, yet no conceptualization
that builds upon itself beyond the relation between [signs] as if "search"
is the biggest idea to emerge from 21st-century civilization. seriously-
WTF.
exactly how stupid have we become, the Google billboard hiding yahoo-like
organization of categories no one can access to actually find anything via
structural knowledge. how helpful is that for everyone, actually. how
onesided. is there not some symbiotic relation there, or is it parasitic,
asymmetric, a siphoning occurring for a hidden, protected, privatized agenda

what if the unit of measure is not the [word] and instead the [idea], the
concept. where is the internet or software tools to navigate or explore
those dimensions and dynamics. nowhere you say? exactly. it is absent,
missing information, once the core of culture, now relativism, viewpoints
disjoined from every other, sects of ideology, the interdisciplinary yet
without a shared truth, a structural limit to this being able to happen,
the span needing to go beyond private views and skews, binary onesidedness
and back into truth, back to logic, underneath the skein of civilization,
to question what is beneath this scrim the movie image is projected onto

the nature of reality. yet there is nothing documented in culture or in the
academy or educational system about the existence of electromagnetism in
relation to these events. the paradigm is 300 years outdated, from the
first industrial revolution onward, though moreso than that, if taking into
account metaphysics and cultural beliefs prior to this, when unified and
allowed beyond the confines of a 'public methodology' of science that is as
biased as religion once was to external data beyond the given modeling

so basically it is hopeless, there is no way, populations are illiterate,
relations are animalistic now, like dealing with deranged jungle animals in
terms of 'reasoning ideas', the poison of the snake delivered by pills, or
violence of predator via hidden toxins attacking from every angle instead

unless that is, someone, somewhere else out there also relates to truth as
the primary condition for evaluating events. which is what the internet is
all about, connecting people in their various contexts and perspectives and
building a reality up and out of that shared context, as it may or may not
scale beyond a certain point. and thus the collapse of literate email list
culture into the politics of commercialization as this educational venue
is socialized and commoditized and turned into a scene of endless groups
and groupies, even the unpopular now have their own support groups online-
everyone is cool and everyone is a celebrity in their own way. egotastic.
who needs TV cameras when there are webcams and the internet, each person a
star in their own movie even, situation comedies, varied lifestyles mapped
into contrasting frameworks - notice ensuing antics, protesters kill off
opponents, latest rape video online, see cannibalism tomorrow, live! how
much of what is engineered into this state of failure is purposeful for the
underlying agenda and politics of subversion and subservience to lesser
views and lesser ideals, brought about by the incapacitation of reason, its
absence in the culture, public debate, virtues, principles, goals, ideals

where is it. it is not at this superficial level of reading, referencing,
repeating, and regurgitating dumb [signs] as if truth itself. it is not
simply the act of communicating information or sharing and storing data. it
involves conceptualization that is accurate, grounded in a larger truth,
and that becomes the basis for shared awareness. consciousness. a modeling
of reality that at its core is based on verisimilitude, *insight*, and a
way of existing, surviving, being, though growing, developing, moving
through the world in a more optimal and purposeful way. and that involves
getting at the concepts in their truth, accessing it, questioning it, and
then having this be the basis for relations, for actions, governance. thus
what if the technology and tools limit this possibility to such an extent
that it is not even possible to communicate, the tools are purposefully
degraded, the equipment made to break and forced to break by attackers, and
thus the very interaction in this truth is in jeopardy via this condition

'the computers' a forced perspective. the software. the processors. the
hardware. what is made and how it is made and thought about. to varying
degrees, mostly JUNK. mostly inadequate. mostly off-course, limiting. to
such a degree as to force incapacitation by the very use of these tools.
programming and reinforcing a captive mindset, subservient to this agenda.

in this way -- breaking the illusion that is sustained, cracking the mirror
that this establishes in civilization, would remap categories and concepts
beyond their existing connections, short-circuiting and rewiring the way
these interactions can and cannot take place, tearing down boundaries and
erecting others that keep unbelievers and those who do not serve 'truth' in
a different contained realm, for they cannot be reasoned with and rely on
lies and manipulations to force dynamics as they exist, and should not be
allowed to continue this practice, a protected boundary must be established
to differentiate those who serve truth and those who exploit it so as to
serve themselves instead. survival of the fittest, finally grounded by
actual capacity versus conformity, actual accounting for value instead of
standardizing high and low capacity to the same bell curve median scale,
though far lower as mimicry and imitation and acting go, pretending to be,
manipulating signs to appear as something yet not do the actual work, etc

accounting is core to truth. without it, enronomics, in whatever theorized
confection it may appear as, resolving all scenarios in a single relativism
that is binarily N=P complete. everything is resolved, solvable this way,
just believe, follow, indoctrinate, be the religion, the brick in the wall

so what if the bricks are de|con-structed, the wall dismantled. the [signs]
accounted for in their truth, what does it involve- what limits exist to
making this new interpretative context a possibility. and it is proposed it
exists going beyond the normal conception of the alphabet, language in its
existing boundaries, and breaking apart the signage, to get at the concepts
and start talking about them in structural terms, of logic and their truth.
and to do this would require stopping the flow of time in sentences and in
reading and writing, and moving to a nonlinear approach, as previously
mentioned in terms of conceptual models as molecules, a diagramming and
testing of ideas and beliefs as hypotheses.

yet to get there, it is necessary to break the IMAGE of the sign, a ruling
illusion about the inviolability of words as if truth itself, inherently.
to shatter the illusion of perfection, especially within language as if a
perfect, unerring perspective, yet not accounting for error in thought
and framework, which is the core failure, this incapacity to reference
truth removed of falsity. so how to get to empirical truth when these same
SIGNS are by default only pseudo-truth, partially or minimally accessing it
in local distributed contexts, and it would seem that overcoding or
under-coding or various other approaches to language beyond existing
boundaries could open up further interpretation via the inherent
/superposition/ of letters, words, and numbers, such that [sign] could have
a range or cloud of variable meaning, that may be multiple, and could be
written and read this way, potentially, though likely starting off in very
small code-like fragments though building to words or sentences
potentially, another form of language that is based in a grounded
superposition, as if ideograms

so, this is to propose a potential communication capacity exists that could
develop to break the illusion of linear, serial relativism based in binary
ideology, and instead operate within a many-viewed parallel perspective
that has a range of meanings within the [sign] as de|con-structed into
various subsign scaffolding, as it becomes a component of this language.
and from this, so too, hidden communication that could occur by way of a
limited observer where those who do not share the same parameters or logic
would be bounded in what they can interpret, and that this interpretation
would default to an ungrounded or forced rationalization in a biased binary
viewpoint, due to limits upon its meaning, thus perhaps inventing views
that do not exist yet fulfill the limited model. this could be useful in
an illegal surveillance society, if the oppressive side generates false
positives by its mistaken analyses, skewing models, forcing extreme
decision-making out of line with the actual condition, thus increasing
polarization as the side-effect of the unshared evaluation, which can be
very telling and
provide an offset by which to distinguish those skewed and hostile from
those aligned in shared modeling of reality. anxiety of the "irrational"


that was a best attempt at a recap of the issues and a setup for a way into
this situation given what exists in typographic and linguistic structuring
of alphabets as [signage]. it should be insanely obvious i am unknowing of
most of what this involves, beyond the given conceptualization, though the
goal here is to tap into ideas that others know about in their depth that
could break this fixed structural condition wide-open across a wide range
of parameters beyond the few exampled here and there in the text. so too,
there are probably many errors in my own observations, so that should be
part of the evaluation, if assumptions or language or conceptualization is
wrong or missing or inaccurate or false. consider it a working hypothesis,
open to addition, able to be corrected, improved upon, part of a
larger collaborative model, this just an introduction to be expanded upon
if and as others relate to the concepts. these are ideas; it is not that
my view alone is valid or right or correct. what follows and what precedes
are based
on questions and considerations over a long period and this is where it
arrives today, mostly, in terms of making sense of the totality involved.

and thus, from this limited perspective, the way into it, as far as i can
perceive given a particular experience of these issues, is with structural
relations between [signs] themselves at the scale of individual letters


--- importance of ligature ---

a key idea is how to establish geometric relations between signs based on
their visual ordering, which directly involves how they are patterned and
what is like and unlike, over a range of criteria and analyses (phonetic,
anagrams, mirroring, &c.)

i did not know this before, though the ampersand (&) is a ligature, a
combination of letters E and t, from Latin. the dollar sign ($) also..

Typographic ligature
http://en.wikipedia.org/wiki/Typographic_ligature

the way i arrived at the concept of ligature was with a thesis combining
two seemingly disparate concepts, [architecture] and [electricity]. and
strangely found a symbol that had 'ae' - that combined these two letters
into a single character (æ)

Foucault had a book 'the Archaeology of Knowledge' which conceptualized
across a similar gap between concepts [archaeology] and [knowledge] and i
think it was through that reading and a given interpretation that it was
clear a progression existed from A & E as two concepts that relate to their
fusion in a shared structure, which the ligature to me represents. in that
discovery it was realized the concepts of architecture map directly onto
electromagnetic infrastructure, and thus circuitboard city planning is not
just a metaphor, typology of components, nor is building programming, etc.

so this is just a backgrounder on a visual equivalence of the structure
between two letters, the middle horizontal line in a capital A and capital
E, and how they combined into a single character (Æ) via shared structure.
it seems innocuous or without meaning, potentially, unless having meaning.
and thus when 'the æther' or 'archæology' is rendered with or without the
ligature it indicates something about its nature even, perhaps.

so there seem to be a few ligatures for the American English alphabet
that get little or no use except perhaps in specialized domains. while the
ampersand is normalized, and the dollar sign, the æ ligature is not.
perhaps it went out with the era of the classics (texts of western
civilization). to give a sense of the chaos, it is unknown if the
character will reliably render via email software, because there is no
special character input, so simply trying to document the language itself
is bounded and made into a technical feat, to try to access a deeper level
of its operation than the readily available graphic emoticons, including
the steaming pile of shit with animated flies that Google provides in its
place.

so language is debased, shallow, image-focused on recognition of signs
and responding to these patterns. and it is in this warped and twisted
realm that ideas must function, yet when competing with idiocy it does no
good to reference or rely upon truth beyond the shallowness and the
choose-your-own-adventure style of navigating the situation that
relativism best allows.
do your own thing, everyone is legitimate in their viewpoints, etc. and
this ability to write and convey things, ideas, makes it so, apparently.
functionally. willing into being a perspective that can be reinforced and
defaults to true and believable by mere fact of sharing it, making it real
via language, shared viewpoints, even if they are fundamentally ungrounded

so it is incredibly easy to string together letters into words via typing
them out on a computer and sharing them worldwide with others, finding that
others exist elsewhere in a similar dimensionality and establishing those
synaptic and neuronal connections that establish a noospheric condition,
whereby awareness and atmosphere, local and global, synchronize in a new and
different way, via this remote connectivity as it defines, establishes a
shared reality - at the level of perception and communication of viewpoints

and yet the language itself could be superficial in contrast to the ideas
involved, so a lot could be communicated - information exchanged - and yet
very little said. or that the needle of truth remains in the haystack of
false and faulty frameworks, and thus it is about concentrating what this
minor truth is, establishing it, while not having a model of the whole that
this truth exists in, as a shared reality, only the partial details of it,
here to there, still all disconnected for lack of a larger framework and
the capacity to link together ideas beyond their isolated observations

as mentioned, this involves logical accounting for concepts in their truth
and connection of concepts beyond the context of pseudo-truth, which can be
a realm of conflict, a showdown of ideological vested interest and bullying
for authoritarian control over what is true, the valid perspective, etc, so
it is not just an issue of locating truth, it is also often unwanted or not
allowed, censored even, if not about destroying the messengers with ideas,
so to keep the paradigm on its existing downward course, into oblivion

thus to get into the concepts, it is to evaluate [signs] in terms of units
and the letter is one of these first steps. a single letter can have its
own meaning, such that the letter A may signify a mountain peak, to some
people or in a given context. the letter S evaluated as a snake. note how
children's books establish literacy in these same terms, where an image of
an illustrated green snake with googly eyes will be in the shape of an 'S',
though this extends into a realm of advanced cultural literacy as well

so there are lowercase and capital letters, which have their own dynamics
that could be evaluated. and while some concepts are single letter, such as
'I' that represents a first-person perspective of an observing self, most
letters, unless defined as mathematical or scientific variables (a, c, E,
e, f, F, M, W, etc), are instead likely involved in 'boundary crossing'
via their sounds, into a realm of other signage they can still refer to.
for instance
the letter 'b' when sounded out equates with 'be' the two letter word, and
'bee' the three letter word. thus these dynamics could be mapped letter by
letter likewise, in their discrete existence as isolated units, prior to
visible or geometric connectivity though this hidden relation exists as a
structure and likewise could and probably has been mapped out extensively
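
a minimal sketch of this boundary crossing as a lookup, assuming a small
manual homophone table (a fuller version could come from a pronunciation
dictionary):

  # single letters crossing the boundary into the words they sound as;
  # the table is a small manual assumption
  homophones = {
      'b': ['be', 'bee'],
      'c': ['sea', 'see'],
      'i': ['eye', 'aye'],
      'u': ['you', 'ewe'],
      'y': ['why'],
  }

  for letter in 'bcu':
      print(letter, '->', homophones.get(letter, []))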

so moving from single letters of the alphabet into two letter combinations
it is a realm of the ligature, where two letters can become one again by
their recombination, which involves a new character that replaces them

consider the idea that every word in this conveyance has adjacent letters
with a hidden potential to be combined in a compact ligature format, such
that a text could be reduced by a certain percentage yet still feasibly be
readable as the same text, only in a smaller space, and perhaps reading the
information would be more or less efficient, more or less detached from
issues of signage and help or hinder consideration of underlying concepts
in terms of how this could affect pattern matching and interpretation

for example, a word like HEIGHT could potentially have a ligature for [HE]
and [HT] such that [HE]IG[HT] would go from six characters to four. and
thus if still legible could compress or compact words and sentences this
way. so for instance, the approach behind the AE and OE ligatures could
be extended across the entire alphabet of two-letter or more relations.

potential 2 letter ligatures // text version
https://www.dropbox.com/s/c1jhz61ddtvmzff/alphabet2.pdf

(note: graphic version attached for list archive)
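
a minimal sketch in python of the HEIGHT compaction above, assuming a
hypothetical pair inventory (no HE or HT ligature exists as a standard
character):

  # hypothetical two-letter ligature inventory, an assumption
  pairs = {'HE', 'HT', 'AE', 'OE'}

  def compressed_length(word):
      # greedy left-to-right pairing of ligature candidates
      i, n = 0, 0
      while i < len(word):
          i += 2 if word[i:i+2] in pairs else 1
          n += 1
      return n

  print(compressed_length('HEIGHT'))  # six characters render as four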

consider that there is an array of 26 letters whereby A-Z, each letter is
recombined with every other letter, which arrives at 676 combinations that
can be evaluated for potential ligature connections. it need not arrive out
of Olde English or some historical context to be legitimate, or so it is
proposed that this technique has other applications than pronunciation or
cross-language word migrations.

thus looking at the table, the matrix can be navigated via X,Y coordinates
such that (1,1) would be AA and (5,8) would be HE for example.
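
a minimal sketch of that addressing convention, where the second
coordinate names the first letter, consistent with the [15,8] = HO usage
later in this text:

  from string import ascii_uppercase as AZ

  def pair(x, y):
      # (x, y) are 1-indexed positions in A-Z
      return AZ[y - 1] + AZ[x - 1]

  def coords(two):
      # inverse lookup from a letter pair to its table coordinates
      return (AZ.index(two[1]) + 1, AZ.index(two[0]) + 1)

  print(pair(1, 1), pair(5, 8))   # AA HE
  print(coords('HO'))             # (15, 8)
  print(len(AZ) * len(AZ))        # 676 combinations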

the first pair of letters may be difficult to make a ligature of in some
situations due to issues of fonts and spacing with adjacent letters on
either side, though imagine that a software model could have multiple
options for how a ligature exists and is rendered, given its context, and
thus a single character, perhaps somewhat like an M with a horizontal bar,
may in some instance be legible as a double-A ligature and thus if
utilized could compress whatever word it appears in, if any.

(it should be noted that these two-letter combinations are still not in a
full context of communicating language, though they could have a symbolic
or other sign-based relevance, such as acronyms. and thus while not yet in
a normal realm of language evaluation for long-form communication, meaning
can be compacted into these two-letter units, prior to evaluating them as
potential ligature candidates. for instance: GE in terms of mythology and
also the General Electric corporation, larger than many nation-states.
that is a lot of concept in a two-letter compressed word. and thus another
such connected structuring exists in this realm of abbreviation and
acronyms that can be another structure to consider when mapping out such
dynamics)

to return to the table, a second example of HE would easily combine across
the center horizontal and allow a shared edge between the two letters. and
so evaluating candidates or approaches for ligature combinations involves
looking at their structure and considering geometric relations between the
graphic forms - which can be widely variable given existing fonts/type.

thus the preference for the 16-segment LED display, which has segments
that turn on and off for various parts of these letter structures, albeit
not to the degree needed for different fonts, though a matrix LED display
could.
in other words, 16 segment displays are not chained together, do not share
common edges between characters and thus do not allow such ligatures, yet
demonstrate the need for a standardization of structure to enable common
geometric relations to exist, likewise. as if properties that vary given
how they are approached.
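
a minimal sketch of letters as segment sets, where shared structure
becomes set intersection; the segment names and encodings are simplified
assumptions, not a real 16-segment standard:

  # letters as sets of display segments
  segments = {
      'A': {'left', 'right', 'top', 'mid'},
      'E': {'left', 'top', 'mid', 'bottom'},
      'H': {'left', 'right', 'mid'},
      'O': {'left', 'right', 'top', 'bottom'},
  }

  def shared(a, b):
      # the structural overlap two letters could fuse along, as A and
      # E share the middle bar in the AE ligature
      return segments[a] & segments[b]

  print(shared('A', 'E'))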

it should be noted that typical software allowing ligatures appears to be
a copy-paste scenario unless somehow macro-keyed, in that the special
character is referenced, as if looking up a thesaurus or doing a
key-combo, and adjacent letters a and e (ae) do not automatically
translate into the ligature 'æ'. this is a vital gap and involves
usability issues, as the approach or limits define the interaction.

now what if a keyboard existed that had its own fonts and controlled the
output, such that it could take a text like this, and automatically when
writing translate the variable letter combinations into a running ligature
framework via real-time or post processing, thus compressing the data yet
retaining its readability. and what if reading consisted of interpreting
such texts and where an anomaly may exist in legibility, the letters in the
word could be expanded to their full scale, removed from their ligature
connection. further, beyond just 2 letters or 3 letters, what if entire
words could be compacted down this way.
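
a minimal sketch of such post-processing, restricted to the few latin
ligatures that actually exist as characters (æ, œ, ﬁ, ﬂ), with expansion
as the reading path back:

  # only a few ligatures exist as actual characters; the inventory is
  # limited to those
  LIG = {'ae': 'æ', 'oe': 'œ', 'fi': 'ﬁ', 'fl': 'ﬂ'}

  def compact(text):
      # running substitution, as if post-processing the keyboard output
      for pair, lig in LIG.items():
          text = text.replace(pair, lig)
      return text

  def expand(text):
      # the reading path: open a ligature back to full-scale letters
      for pair, lig in LIG.items():
          text = text.replace(lig, pair)
      return text

  s = compact('the aether flows')
  print(s, '->', expand(s))  # 'the æther ﬂows' round-trips back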

what this is to say is that the table above defines the 26x26 A-Z letter
combinations, and yet that is just one level of many. a word such as
'house' could be evaluated in terms of ligature, here capitalized to relate
it to the capital letter combinations...


     [H][O][U][S][E]


in terms of *potential* ligature combinations, there could be two letter
evaluation for every adjacent pair: HO, OU, US, SE.

each of these pairs could have multiple potential versions, based on the
optimal relation to other adjacent letters. in this way, a span across
multiple ligatures could exist, such that [HO] and [OU] combine into a
single fused 3-letter ligature [HOU], while [SE] remains a separate
2-letter ligature:


     [HOU][SE]


and thus, given variability, the letter 'U' could tend towards a
particular optimal state, either to co-exist with the adjacent letter 'S'
or to attempt to combine with it in an optimal, balanced way, based on its
legibility...


     [HO|U|SE]


this is to propose that *superposition* of variables could occur in the
word as a bit set, which may seem meaningless.  though the language itself,
the sign itself, would be calculating its own rendering as a sign, and not
just a passive issue of someone tapping a key for letters H O U S E and
having that be the result- instead, various modeling would be activated and
word combinations would be processed in real time in terms of ligatures and
other structural data based on how the situation is modeled (which could go
beyond visual structure, several examples mentioned earlier).
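
a minimal sketch of this superposition of renderings, enumerating every
resolution of HOUSE into spans of one to three letters (legibility rules
are not modeled here; all spans are assumed available):

  def renderings(word, max_span=3):
      # every resolution of a word into spans of 1..max_span letters
      if not word:
          return [[]]
      out = []
      for n in range(1, min(max_span, len(word)) + 1):
          for rest in renderings(word[n:], max_span):
              out.append([word[:n]] + rest)
      return out

  for r in renderings('HOUSE'):
      print('[' + ']['.join(r) + ']')
  # includes [H][O][U][S][E], [HOU][SE], [H][OU][SE], [HO][U][SE] ...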

so this is to propose that any given word could potentially be mapped into
and out of this interstructural connection, based on its unit of measure as
the letter as a combinable entity. it is not yet about cracking them open,
yet it prepares the framework for doing so, by seeing how language works in
terms of communication techniques using existing software and equipment.


     [15,8|21|5,19]


thus, coordinates can map into these ligature frameworks, and three value
ligatures would have their own chart, which potentially would resolve many
three letter words or [signs] in terms of ligatures, and then potentially
many four letter ligatures could exist, and massive data charts would be
needed to chart these out, though it is entirely possible with computers.

now for instance if the original word used in the example (HOUSE) were to
be evaluated differently, it would have different variability in terms of
how the sign is rendered...


     [H][OU][SE]


in this case, the letter 'H' is neither ligatured nor combined with an
adjacent letter, say due to readability issues, and the other two
ligatures remain separate...


     [8][21,15][5,19]


note the letter 'H' is the eighth letter in the alphabet, as a single
number input. now what if the [OU] and [SE] were suddenly realized to
exist in another chart of four-letter ligatures, and thus the coordinates
for these two ligature resolutions were replaced by another matrix value,
here fictional, equaling


     [H][OUSE]  ...

     [8][3.2.122,0.299.9]


what this suggests is that every letter and word could potentially be
fragmented and reunited in a similar structural framework that operates
well beyond ligatures, into an N-dimensional modeling of word and language
relations in a computational context, whereby the [signs] themselves are
not just inert static entities with no connection to truth, and instead
could map into different variabilities, ligatures or not, as a form of or
approach to reading/writing.


--- other potential views ---

thus, extrapolated into this other abstract variability and the issues of
superposition, a text could be typed that automatically is transposed into
another framework via its software, though this would need to be a model
that is geometrically controlled, within a particular non-arbitrary shared
framework based on a common empirical model that can be developed via many
parallel interactions referencing and improving upon the same rulesets.

in such a way, an interface could exist along with a physical or software
keyboard that could offer optional structuring or [sign] relations, such
that another approach could involve a non-ligature evaluation of all the
available ASCII letters and other special characters in their graphic
potential, and having this be an option for typing a concrete-poetry-like
message akin to SMS abbreviation, though built into the signage, deepening
the meaning, not just compacting it, instead making it multivariate. such
that a word or short communication could convey a prose-like awareness,
and stand in and of itself as a perspective, in place of a paragraph, for
instance, due to its capacity and its depth of meaning. what are the
parameters that could be accessed, what are the existing limits of
parameter sets that *break* upon use or going beyond a given boundary, such
as characters in email, or limits within software, that instead could be
made accessible and reliable in use, for such symbolic communication, as it
may involve transformation of text or other approaches and issues, such as
one-time pads or encryption perhaps if a key infrastructure was involved
and meaning was bounded by interpreter

another consideration is to alter the rules of evaluation, such that the
text could be flipped or twisted within ready-made software made for such
communications, or other such dynamics, like turning everything 90 degrees
that may reveal a hidden pattern or transform certain words and not others
or reveal a portal to gain access to another level of the text or to move
through several as the data is rearranged via a given planned sequence. for
instance, what if the axis of left to right is reversed, right to left, for
the display of information. such that [axis] is variable. or mirroring, or
other pattern dynamics, say structural substitutions or additions whether
graphic or phonetic or acronyms or whatnot.
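
a minimal sketch of such variable [axis] rules, assuming a small mirror
table covering letters that have mirror forms:

  MIRROR = {'b': 'd', 'd': 'b', 'p': 'q', 'q': 'p'}

  def reverse_axis(line):
      # right-to-left display of the same information
      return line[::-1]

  def mirror(line):
      # reversed order plus per-letter mirror, where a form exists
      return ''.join(MIRROR.get(c, c) for c in line[::-1])

  def rotate_90(lines):
      # turn a block of text 90 degrees: columns become rows
      width = max(len(l) for l in lines)
      padded = [l.ljust(width) for l in lines]
      return [''.join(l[x] for l in reversed(padded)) for x in range(width)]

  print(reverse_axis('hidden pattern'))
  print(mirror('pub'))                         # -> 'duq'
  print('\n'.join(rotate_90(['abc', 'def'])))  # -> da / eb / fc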

in ways it may be reminiscent of invisible ink, even, if a strange text
were to arrive that may be formatted beyond a legible threshold due to
unknown or unknowable rules (if bit set permutations to subsign structures)
and thus a threshold condition existed requiring deciphering or decrypting
or translation of the text or abstracted data.

with no such key to unlock the cryptic patterning, an AI device could
search for patterns in the existing structure, looking for clues, of
which there could be infinitely many, given several approaches that appear
likely.
yet which one to investigate may not be determinable, and the number of
choices could limit what kind of analyses are allowed, as if a barrier or
boundary, related to the way data is processed yet also truth modeled. in
other words it is speculated that beyond a certain threshold it could be
too difficult to compute or analyze such formatted communications because
they are equivalent to NOISE and display inherent structure that leads to
automatic meaning that is multiple and spread out in different directions
and thus the originating perspective remains unknown, potentially even to a
surveiller who may observe its decryption yet still not be capable of
comprehending its meaning, in terms of the information that is accessed, or
where the message begins or ends, potentially, depending on the framework
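
a minimal sketch of one of the few measurements such an analyzer could
make without the key, a frequency entropy estimate; note a high value
alone cannot distinguish hidden structure from noise, which is the
barrier described:

  import math
  from collections import Counter

  def entropy(text):
      # shannon entropy of the symbol frequencies, in bits per symbol
      counts = Counter(text)
      total = len(text)
      return -sum((c / total) * math.log2(c / total)
                  for c in counts.values())

  print(entropy('aaaaaaaa'))  # 0.0, fully ordered
  print(entropy('abcdefgh'))  # 3.0, uniform over eight symbols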

if a receiver of the message did have the key to unlock the meaning, it
could potentially exist as if a CSS/XML layer or transparency that appears
over the original text, as if the word-correction underlining in word
processors and text editors, though instead it could highlight sections
and
reference particular models or approaches for this data, to evaluate its
meaning in terms of sign/symbol relations. thus, as if an additional layer
of highlighting upon an underlying page, the message could appear garbled
or abstracted by default, yet with the translucent additional layer,
various structures or patterns could emerge, as if masking out some or
revealing other data. this could also involve nested sets organized by
different rules in combination with layers, such that /sequencing/ data in
a particular step-by-step approach or ruleset, perhaps based on a private
and-or shared public key, could then open up or allow access to a hidden
message nested within a larger ~variable context. what happens if you start
to spin the third page of the fifth chapter of a given book, which may not
be encoded this way, yet could reveal a hint or clue that then references
what is on the page, recontextualizing it, creating a path via one-time pad
that may disappear and be irrelevant both before and after yet not during
that moment, when it is instead decisive and allows certain functioning in
the given parameters. and yet what if the words cannot spin, the software
does not exist or the concepts are not realizable, what if the threshold is
too high or the ideas too controversial, such that it is made illegal to
have such considerations or develop such tools as if *against culture*, in
that the binary ideologues realize the threat to their false perspective
and grip on interpretation, the inanity of SMS becoming deadly serious in
its repercussions as the language is forked into an unreadable realm.
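
a minimal sketch of such a key-unlocked layer, where a shared key selects
which character positions belong to the hidden message, as if a
translucent mask over the page; the key scheme here is an assumption for
illustration, not a secure construction:

  import random

  def mask_positions(key, length, count):
      # derive character positions from the shared key; a toy scheme
      rng = random.Random(key)
      return sorted(rng.sample(range(length), count))

  cover = 'the house on the hill has seven tall windows facing east'
  positions = mask_positions('shared-secret', len(cover), 6)

  # the receiver with the key lifts only the masked characters
  print(''.join(cover[i] for i in positions))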


--- related material ---

numbers 2 example A
https://www.dropbox.com/s/k07n9hrm897ixn6/numbers-2xA.gif

numbers 2 example B
https://www.dropbox.com/s/ep2oxewk1bdfs82/numbers-2xB.gif


taro root bubble tea, phở

© ȍ ®