cypherpunks
October 2013
- 143 participants
- 357 discussions
--- overview ---
the potential for a linguistic component in models of and approaches to
encryption has been introduced, perhaps beyond the normal boundaries of
consideration as this relates to sign-based relations and dynamics.
the context for this perhaps also correlates with a recent article on
Google and its geometric approach to language translation...
How Google Converted Language Translation Into a Problem of Vector Space
Mathematics
http://www.technologyreview.com/view/519581/how-google-converted-language-t…
...in that pattern matching has a primary role in establishing structures,
taking note of the algebraic relations of [sign] - [sign] = [sign]
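the vector-offset idea behind the linked article can be sketched in a few lines. the embeddings below are toy 3-dimensional values i made up for illustration, not a trained model or Google's actual system; only the [sign] - [sign] = [sign] arithmetic is the point.

```python
from math import sqrt

# toy "embeddings" -- illustrative values only, not a trained model
vocab = {
    "king":  (0.9, 0.8, 0.1),
    "queen": (0.9, 0.1, 0.8),
    "man":   (0.5, 0.9, 0.0),
    "woman": (0.5, 0.2, 0.7),
}

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def sub(u, v):
    return tuple(a - b for a, b in zip(u, v))

def cos(u, v):
    """cosine similarity between two vectors"""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def nearest(v, exclude=()):
    """vocabulary word whose vector has highest cosine similarity to v"""
    return max((w for w in vocab if w not in exclude),
               key=lambda w: cos(vocab[w], v))

# [sign] - [sign] + [sign]: apply the man->king offset to woman
offset = add(sub(vocab["king"], vocab["man"]), vocab["woman"])
print(nearest(offset, exclude={"king", "man", "woman"}))  # -> queen
```

with these toy numbers the offset lands exactly on "queen"; in a real model it only lands near it, which is why pattern matching needs grounding at all.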
again, it does matter how accurate this representative signage is, in terms
of what it is referencing and in what frameworks. and thus pattern matching
that is ungrounded could tend towards a false perspective, whereas pattern
matching that is grounded could tend towards realism. a boundary condition
exists, and it could, like the manual settings on a camera, establish how a
situation is viewed, given light, f-stop, focus, and distance, etc
the potential exists for a quadratic equation-like review of data sets in
nonlinear contexts, churning through variables upon variables and searching
out equations. this can happen in terms of numbers and their conceptual
frameworks, like physics or other formulae, though it could also exist at a
more abstract level of pattern matching of various subsign components and
elemental structures.
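the "churning through variables and searching out equations" could be sketched as a brute-force search over a tiny hypothesis space of candidate formulas, keeping whichever best fits the observed data. the candidate set and data here are my own illustrative assumptions, not any particular system:

```python
# observations generated by a hidden law: y = x^2 + 1
data = [(x, x * x + 1) for x in range(1, 6)]

# a tiny hypothesis space of candidate formulas, each with one constant c
candidates = {
    "y = x + c":   lambda x, c: x + c,
    "y = c * x":   lambda x, c: c * x,
    "y = x^2 + c": lambda x, c: x * x + c,
}

def sq_error(f, c):
    """sum of squared residuals of formula f with constant c over the data"""
    return sum((f(x, c) - y) ** 2 for x, y in data)

# churn through every (formula, constant) pair and keep the lowest error
best = min(((name, c) for name in candidates for c in range(-3, 4)),
           key=lambda nc: sq_error(candidates[nc[0]], nc[1]))
print(best)  # -> ('y = x^2 + c', 1)
```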
thus, symbolic processing and symbolic computation could take place in such
a scenario if accurately modeled, the variables conceptually understood. this
is about the grounding of signage in its truth, and this evaluation could
exist below the level of the word, letter, or even number, or could exist in
terms of images and what signs exist within the images,
in terms of their nested and-or dynamic relations. it is perhaps a future
test of such an approach, can a computer interpret a painting in terms of
its content, based on its perspective, or does it only pattern match what
others perceive about it, and thus mimics observation and knowledge. it
would seem the latter is an ungrounded evaluation of signs, matching the
sign with the sign for its model of truth, whereas the former would be a
grounded evaluation of signs, in an empirical model of truth, bounded by
computer observation in that it may not 'like' or love or hate the artwork,
though through diagnostics could statistically pinpoint its location
amongst various mapped qualities of other such works and ideas, and gauge
its like and unlike characteristics, evaluating against a human viewpoint
and while this mapping or correlation could occur between [signs] that are
computationally processed in this way, via geometry, as with language in
the above example-- the question in this instance is what might be possible
if this geometric evaluation broke the barrier of the sign itself, and thus
the geometry of the signage was broken down into its component parts and
evaluated for its dynamic relations, as if DNA of alphanumeric characters
that compose both math and linguistic units and algorithms (equations if
not words, sentences) such that further mapping could take place beyond the
given boundary, and be used computationally and for empirical structuring,
and that this additional data may not relate to the word itself because it
is only partial, as different languages have their own words, perspectives,
though in identifying these structures, could perhaps function as another
level of language that could universalize different characters into
meta-characters that, like QR codes or some fractal language, exist in a
context of an alphanumeric master code, reverse engineering the context for
an original alphanumeric system (which HIOX is for western civilization)
what if the quadratic pattern matching occurred as an aspect of machine
translation at the level of computation, and perhaps only machine readable
in some layers or levels, that could move into N-dimensional frameworks,
and how this could relate to cryptology and modeling of information in
various ways. it is easy to conjure up What Ifs, though in considering the
impact of transcending given boundaries of data modeling, it does appear to
go beyond the existing conceptualization, into another territory entirely
than what the tools and software and ideology allows to be considered, and
thus interpretation is bounded and limited by a particular fixed viewpoint or
forced perspective that such inquiries are in direct challenge of, as to
what exactly these issues involved consist of, in terms of their truth
and it is this truth of matching patterns and A=A and A=B evaluation and
differentiation, that subsign indicators or components and elements could
be considered to have computational characteristics that could be a realm
for the application of algebraic, geometric, and various calculus-based
approaches to transforming and relating words, structures, ideas, content,
in the very way that cryptographic equations do for the signs themselves,
when contained inside or within a crypto model, though bound or inviolate
as data itself, which then establishes the "THIS IS SECRET DATA" scenario
that is a strange confession, when everything is to avoid the easy reveal
so these boundaries and the way signs are interpreted affect where the
algebra and geometry and transmutation of data begins and ends, in terms of
what is transformed and how, in what dimensionality, parameters, and
terms.
and thus by opening up the sign itself to consideration, in terms of its
patterning and various structuring, its subsign scaffolding and likewise,
the /superposition/ of various shared and unshared elements within its
particular or unique design, as could be empirically modeled and evaluated,
it would then be possible to establish 'many truths' about the sign that is
beyond just its meaning or its pattern match at the macro-level of the sign
and into its range of interconnections and the frameworks it shares with
other concepts and words, including across various language systems and
character sets, as these may be bridged by geometric or other trusswork
perhaps there are new mathematics involved or that this is mathesis in a
realm of signs and symbols that function as an alchemical model that goes
beyond the normal rules for the molecular, conceptual ~forms into other
esoteric dimensions, wild abstractions or unknowns, seeking out or finding
and testing hypotheses to establish what is possible and potential within
the gap, and the distance between conceptualizations in their otherness
the ruin of description here is detached abstraction that is easy to get
lost in and yet it is also necessary to provide context, which in this
moment involves a chasm or void that would be encountered beyond the given
boundary of alphanumeric computation, into a flatland-like extradimensional
realm that perhaps could not be visualized except by a quantum computer,
due to its massive linearity combined with infinite connectivities that
would likely involve change as a variable, or weather systems for the data
that like leaves on a tree move in response, or become wet when it rains,
or fall away in autumn and regrow in spring. so too, conceptualization as
it may be born and be sustained and then, parts decay while others survive,
and so to consider starting such an evaluation, as it may involve a process
that is equivalent with nature to a degree more than the filesystems and
architectures of computers today, seemingly. in that the data model this
involves would be the foundation for a shared empirical model referenced
and observed and contributed to from multiple perspectives simultaneously
and thus an expansiveness that involves bit sets and their permutation,
though also the components of these [bits] as signs, deconstructing the
elements and considering the myriad connectivities of subsign relations and
dynamics, in their linguistic and mathematical importance, which could be
relevant in terms of encryption, translation, data modeling, for instance
--- typology ---
so there is a theme of typography connected to this symbolic processing of
information via computation, and it is proposed this is a model or approach
more like that which humans rely upon for advanced awareness, fundamental
knowledge, and heuristic reasoning, yet is unaccounted for in technology
and tools of today, turning highly literate people into illiterates in the
binary context, for the dissonance that a false perspective brings in terms
of thinking, making it seem irrational in the limited, warped biased view.
so typography has letters and numbers that are represented as signs and
this goes from printing era blocks of type and machinery into the present
day era of electronic pre-press and desktop publishing, from dot matrix to
ink jet, laser printers, page layout tools (QuarkXPress), vector and
bitmap graphics software (Illustrator, Photoshop) and specific software for
typography (Fontographer, other) -- and this is not Mathematica territory
in terms of the manipulations of signage, as ideas or concepts, it would
seem, and instead upholds a particular boundary or conceptualization of
what type is, what images are, and the meaning of fonts and representation
of words and letters and numbers appears to occupy a narrow interpretation
that is mainly visual, about its geometry and usability for communication,
and enhancing this with stylistic refinements, a range of type expressions
and pretty much none of this relates to what i am attempting to evaluate.
none of it accesses these dimensions, except perhaps if modeling text
within a 3D program and spinning and mirroring it, yet that is an insane
approach, more of a one-off Kryptos sculpture than a way of exchanging
information. the graphics program 'rotate' and horizontal and vertical
mirroring are another very basic aspect of this, though again it involves
the [sign] or text as an image, to be manipulated at its macroscopic level
and not to allow different dynamics to occur within a given word, whereby
some letters may have one function and others another, as this involves a
larger document of, say, twenty different co-existing variables, changing
the text from one document into another series of parallel perspectives.
typology is not typography, yet it is a vital concept in this situation
because [signs] are essentially sets, and subsign characteristics are the
variabilities nested within the set, in various subset dynamics, 'pattern
recognition all the way down' to zeros and ones.
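the claim that [signs] are essentially sets can be made concrete by treating each letter as a set of subsign stroke elements and comparing those sets. the stroke decompositions below are my own rough assumptions, not a standard inventory; only the set-theory operation is the point:

```python
# rough stroke decompositions of a few letters (my own assumptions)
strokes = {
    "A": {"diag_left", "diag_right", "bar_mid"},
    "E": {"vert_left", "bar_top", "bar_mid", "bar_bottom"},
    "F": {"vert_left", "bar_top", "bar_mid"},
    "O": {"closed_loop"},
}

def jaccard(a, b):
    """shared-structure score: |intersection| / |union| of subsign elements"""
    return len(strokes[a] & strokes[b]) / len(strokes[a] | strokes[b])

# E and F share three of four elements; A and E share only the middle
# horizontal bar
print(jaccard("E", "F"))  # -> 0.75
print(jaccard("A", "E"))  # -> 0.1666...
```

the same subset machinery recurses downward -- strokes into segments, segments into bits -- which is the 'pattern recognition all the way down' in set terms.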
typology, in the terms i relate to it, is easy to understand in terms of
infrastructure and building types. a TV station is a type of building that
is like a component on a circuit-board. a nuclear power station. gas
station. and these have various programming that is unique and overlaps
with other buildings, such as a home dwelling or school. telephones for
instance or network connections. and likewise, [signs], [words] as they
consist of substructural patterns, including geometric characteristics
though also correspondence with other letters and numbers, as part of this
overlapping of functionality and meaning: 5 and 2, S and Z. it may be seen
as irrelevant in terms of a discrete letter or number, yet within a shared
context it can bring additional meaning or reveal interconnected patterning
that could be useful or applied in a calculative schema. such that data can
be hidden or revealed via such means, potentially, yet appear as noise
otherwise. or only /fragments/ or parts of the characters may appear yet in
their statistical or probabilistic frequency, may tend towards a particular
viewpoint or interpretation versus another, or could remain ambiguous and
hidden, unless having the missing key to know which tumblers to turn, etc
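the hide-or-reveal use of overlapping characters can be sketched as a small steganographic scheme: which member of a confusable pair (here assumed to be S/5 and Z/2, per the correspondences above) appears in the text encodes a hidden bit, and without the convention -- the missing key -- the text just looks like noisy typing. illustrative only:

```python
# confusable pairs: index 0 or 1 into the pair carries one hidden bit
PAIRS = {"S": ("S", "5"), "Z": ("Z", "2")}
REVERSE = {glyph: bit for base, forms in PAIRS.items()
           for bit, glyph in enumerate(forms)}

def embed(cover, bits):
    """spend one hidden bit at each confusable position in the cover text"""
    out, bits = [], list(bits)
    for ch in cover:
        if ch in PAIRS and bits:
            out.append(PAIRS[ch][bits.pop(0)])
        else:
            out.append(ch)
    return "".join(out)

def extract(text):
    """recover the hidden bits by reading which glyph was chosen"""
    return [REVERSE[ch] for ch in text if ch in REVERSE]

stego = embed("SIZZLES", [1, 0, 1, 0])
print(stego)           # -> 5IZ2LES
print(extract(stego))  # -> [1, 0, 1, 0]
```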
so while typology is not describing fonts or "type" on a computer display,
it has some relevance to typography in its set-theory evaluation, as this
relates to signs and symbols and transformative relations, their potential
alchemical connections, as this perhaps even moves between stereotype and
archetypes, as concepts are removed of grounded truth and copied over and
over, where this truth is replaced by its sign, increasingly ungrounded,
and thus 'vision' and IMAGE replace and stand-in for understanding, and so
too in this way, pattern matching is removed from its logical foundation
(A=A) where this can be about matching signs with themselves as if truth,
yet this is only pseudo (pT) and when standardized, moves towards falsity
as a shared framework, the false perspective at this layer of evaluation
image versus idea, the disconnect between forms of signage and content,
everything foregrounded and the empirical background removed, thus perhaps
an early cosmos condition, from order into chaos, from one view into many
as the limits of an older viewpoint could not withstand and keep up with
the changes, then only now to realize the consequences of lost modeling,
lost grounding, lost conceptualization, lost reasoning, lost communication,
lost sensibility, lost relations, lost truth, lost principles, everything
thus the situation that now exists is basically opposed to evaluating these
ideas due to proscribed boundaries and fixed interpretations, and thus a
lot of writing is necessary to build up a structural framework to share
this other viewpoint within, to seek to describe the situation, model it to
some extent, and offer an alternative approach into another realm which is
seemingly another territory for language, appearing largely unexplored, as
if a hidden potential of or within language, as if it were to be unlocked
and made nonlinear by default of such restructuring, recontextualization,
as this changes perspective, going from unconnected linear viewpoints in
serial progression to massively connected parallelism of shared views
in some sense the 'type' (non-font version, as in typical or repeated unit)
is to consider the various ecology of the sign-ecosystem in terms of a
breakdown of these various repeated yet also unique structural relations,
across various classes, categories, and groups that map to meaning though
also to geometric and or other principles. for me Plato's mention of this
knowledge of language when mirrored in water is a form of calculus, and
thus figuring out what sets belong to what characteristics would be to then
group or identify them according to type, function, programmatic qualities,
potential or variable meaning -- yet here is where it is like an orbital
cloud... because this involves *potentials* for these set relations, the
dynamics are about superpositions that may be nested within a structure and
not visible immediately else could emerge or hide upon consecutive changes
and thus, there could be a flurry of potentials in any given bit set, both
in itself as a compact sign, such that [dob] which could be 'date of birth'
could transform into [pod] given particular instructions. and what does
that mean or what does that mutative characteristic make possible? well, in
a crypto context, you could hide instructions that reveal that data, and it
could involve several layers based on multiple or single keys that unlock
such a hidden perspective within a text, even while in plain text mode. yet
it could go much further than this, into questions of universal language
and unification of all data in a single model of grounded empirical truth,
as patterns map back to A=A (1) or remain in a realm of ambiguity (A=B)
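the mutative idea can be sketched with a glyph-rotation table. the table below is my own rough visual assumption for a few lowercase letters, and the exact instructions that would turn [dob] into [pod] are left unspecified above, so this only demonstrates the kind of transform involved:

```python
# how each glyph reads after a 180-degree rotation (rough visual assumption)
ROT180 = {"b": "q", "q": "b", "d": "p", "p": "d",
          "n": "u", "u": "n", "o": "o", "s": "s", "x": "x", "z": "z"}

def rotate180(word):
    """read the word upside down: reverse it and rotate each glyph"""
    return "".join(ROT180[c] for c in reversed(word))

print(rotate180("dob"))  # -> qop
print(rotate180("pod"))  # -> pod: invariant, an ambigram under this table
```

applying the transform twice returns the original, so a sequence of such instructions is reversible -- the property a key-unlocked hidden perspective would depend on.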
so again, the questions involved seem to be about the unit of measure,
where is the boundary for considering what a pattern is and what parameters
are evaluated, in what frameworks, dimensionality, conceptual terms. a way
of encapsulating a small aspect would be that the entirety of linguistic
conceptualization could be tied to the scaffolding of language, worldwide,
in this way, via particular approaches to abstraction. and other models
could tie into particular concepts, say [economics] branches out to all
other data sets in various ways, and interconnects in that pattern-based
infrastructural conduit, though in its own dimensionality which may or may
not overlap with hypotheses of other signage. and thus like protocols and
layering, perhaps there is a particular layer this is occurring on, and who
knows, perhaps crypto is its own layer within this model of knowledge, such
that its truth is reflected in how it is represented and modeled as an idea
and that this information shapes the perspective from the ground up- so if
the purpose is protecting or securing communications (truth), how might
this fit into the various contexts that are embattled or under siege or
boundaries need to be created or data needs to be hidden, as this can be
evaluated in terms of truth, firstly, and not applied crypto systems that
may rely upon false perspectives and assumptions of unaccounted variability
that may largely undermine or exploit this purpose or outright prevent it
if you have ungrounded observations, ungrounded mathematics, ungrounded
language, ungrounded communications, ungrounded relations, ungrounded
reasoning, ungrounded beliefs, ungrounded ideas - and then expect that some
software and hardware solution and encryption model is going to deal with
all the security exploits inherent in those, due to protecting [signs] in
their inviolable state, it would be perhaps to limit questioning to a much
smaller set of criteria than what the situation actually exists in, and
thus a false sense of security, knowing, secrecy, privacy, that may rely
more on belief if not ideological compliance, to not destroy the model and
allow it to persist, as if this validates the process, than to rigorously
question the frameworks in their depth beyond issues of representation that
exist in biased and institutionalized agendas, driven toward certain ends.
and while most probably find themselves situated in something preexisting,
and few probably have chosen these parameters from the start, it is very
difficult if not impossible to do anything else, given the fixed idea of
what cryptography involves, if the tools themselves are bounded and limit
what can be considered or evaluated- which is a perspective forced also by
indoctrination into an orthodoxy of patterning, "received ideas" that are
foundational yet flawed, and instead of theory and law should be brought
back into a realm of falsification and hypotheses instead, because perhaps
people are trading in dead and rotten fish, skeletons of previous solutions
that are by now carcasses or firmly established yet potentially ungrounded
in what is actually going on as it exists, versus as it is believed to be
maybe the truth of this is that security is not possible, and not allowed
if there were a dictatorial framework, and thus a limit allows a false
sense of security by going along with the charade. maybe it is purposive to
not allow tools to serve ideas - think: the internet has no model for ideas
and yet is chock full of data and information, yet no conceptualization
that builds upon itself beyond the relation between [signs] as if "search"
is the biggest idea to emerge from 21st-century civilization. seriously- WTF.
exactly how stupid have we become, the Google billboard hiding yahoo-like
organization of categories no one can access to actually find anything via
structural knowledge. how helpful is that for everyone, actually. how
onesided. is there not some symbiotic relation there, or is it parasitic,
asymmetric, a siphoning occurring for a hidden, protected, privatized agenda
what if the unit of measure is not the [word] and instead the [idea], the
concept. where is the internet or software tools to navigate or explore
those dimensions and dynamics. nowhere you say? exactly. it is absent,
missing information, once the core of culture, now relativism, viewpoints
disjoined from every other, sects of ideology, the interdisciplinary yet
without a shared truth, a structural limit to this being able to happen,
the span needing to go beyond private views and skews, binary onesidedness
and back into truth, back to logic, underneath the skein of civilization,
to question what is beneath this scrim the movie image is projected onto
the nature of reality. yet there is nothing documented in culture or in the
academy or educational system about the existence of electromagnetism in
relation to these events. the paradigm is 300 years outdated, from the
first industrial revolution onward, though moreso than that, if taking into
account metaphysics and cultural beliefs prior to this, when unified and
allowed beyond the confines of a 'public methodology' of science that is as
biased as religion once was to external data beyond the given modeling
so basically it is hopeless, there is no way, populations are illiterate,
relations are animalistic now, like dealing with deranged jungle animals in
terms of 'reasoning ideas', the poison of the snake delivered by pills, or
violence of predator via hidden toxins attacking from every angle instead
unless that is, someone, somewhere else out there also relates to truth as
the primary condition for evaluating events. which is what the internet is
all about, connecting people in their various contexts and perspectives and
building a reality up and out of that shared context, as it may or may not
scale beyond a certain point. and thus the collapse of literate email list
culture into the politics of commercialization as this educational venue
is socialized and commoditized and turned into a scene of endless groups
and groupies, even the unpopular now have their own support groups online-
everyone is cool and everyone is a celebrity in their own way. egotastic.
who needs TV cameras when there are webcams and the internet, each person a
star in their own movie even, situation comedies, varied lifestyles mapped
into contrasting frameworks - notice ensuing antics, protesters kill off
opponents, latest rape video online, see cannibalism tomorrow, live! how
much of what is engineered into this state of failure is purposeful for the
underlying agenda and politics of subversion and subservience to lesser
views and lesser ideals, brought about by the incapacitation of reason, its
absence in the culture, public debate, virtues, principles, goals, ideals
where is it. it is not at this superficial level of reading, referencing,
repeating, and regurgitating dumb [signs] as if truth itself. it is not
simply the act of communicating information or sharing and storing data. it
involves conceptualization that is accurate, grounded in a larger truth,
and that becomes the basis for shared awareness. consciousness. a modeling
of reality that at its core is based on verisimilitude, *insight*, and a
way of existing, surviving, being, though growing, developing, moving
through the world in a more optimal and purposeful way. and that involves
getting at the concepts in their truth, accessing it, questioning it, and
then having this be the basis for relations, for actions, governance. thus
what if the technology and tools limit this possibility to such an extent
that it is not even possible to communicate, the tools are purposefully
degraded, the equipment made to break and forced to break by attackers, and
thus the very interaction in this truth is in jeopardy via this condition
'the computers' a forced perspective. the software. the processors. the
hardware. what is made and how it is made and thought about. to varying
degrees, mostly JUNK. mostly inadequate. mostly off-course, limiting. to
such a degree as to force incapacitation by the very use of these tools.
programming and reinforcing a captive mindset, subservient to this agenda.
in this way -- breaking the illusion that is sustained, cracking the mirror
that this establishes in civilization, would remap categories and concepts
beyond their existing connections, short-circuiting and rewiring the way
these interactions can and cannot take place, tearing down boundaries and
erecting others that keep unbelievers and those who do not serve 'truth' in
a different contained realm, for they cannot be reasoned with and rely on
lies and manipulations to force dynamics as they exist, and should not be
allowed to continue this practice, a protected boundary must be established
to differentiate those who serve truth and those who exploit it so as to
serve themselves instead. survival of the fittest, finally grounded by
actual capacity versus conformity, actual accounting for value instead of
standardizing high and low capacity to the same bell curve median scale,
though far lower as mimicry and imitation and acting go, pretending to be,
manipulating signs to appear as something yet not do the actual work, etc
accounting is core to truth. without it, enronomics, in whatever theorized
confection it may appear as, resolving all scenarios in a single relativism
that is binarily N=P complete. everything is resolved, solvable this way,
just believe, follow, indoctrinate, be the religion, the brick in the wall
so what if the bricks are de|con-structed, the wall dismantled. the [signs]
accounted for in their truth, what does it involve- what limits exist to
making this new interpretative context a possibility. and it is proposed it
exists going beyond the normal conception of the alphabet, language in its
existing boundaries, and breaking apart the signage, to get at the concepts
and start talking about them in structural terms, of logic and their truth.
and to do this would require stopping the flow of time in sentences and in
reading and writing, and moving to a nonlinear approach, as previously
mentioned in terms of conceptual models as molecules, a diagramming and
testing of ideas and beliefs as hypotheses.
yet to get there, it is necessary to break the IMAGE of the sign, a ruling
illusion about the inviolability of words as if truth itself, inherently.
to shatter the illusion of perfection, especially within language as if a
perfect perspective if unerring, yet not accounting for error in thought
and framework, which is the core failure, this incapacity to reference
truth removed of falsity. so how to get to empirical truth when these same
SIGNS are by default only pseudo-truth, partially or minimally accessing it
in local distributed contexts, and it would seem that overcoding or
under-coding or various other approaches to language beyond existing
boundaries could open up further interpretation via the inherent
/superposition/ of letters, words, and numbers, such that [sign] could have
a range or cloud of variable meaning, that may be multiple, and could be
written and read this way, potentially, though likely starting off in very
small code-like fragments though building to words or sentences
potentially, another form of language that is based in a grounded
superposition, as if ideograms
so, this is to propose a potential communication capacity exists that could
develop to break the illusion of linear, serial relativism based in binary
ideology, and instead operate within a many-viewed parallel perspective
that has a range of meanings within the [sign] as de|con-structed into
various subsign scaffolding, as it becomes a component of this language.
and from this, so too, hidden communication that could occur by way of a
limited observer where those who do not share the same parameters or logic
would be bounded in what they can interpret, and that this interpretation
would default to an ungrounded or forced rationalization in a biased binary
viewpoint, due to limits upon its meaning, thus perhaps inventing views
that do not exist yet fulfill the limited model. this could be useful in an
illegal surveillance society if the oppressor generates false positives by
their mistaken analyses, skewing models, forcing extreme decision-making
out of line with the actual condition, thus increasing polarization as the
side-effect of the unshared evaluation, which can be very telling and
provide an offset by which to distinguish those skewed and hostile from
those aligned in shared modeling of reality. anxiety of the "irrational"
that was a best attempt at a recap of the issues and a setup for a way into
this situation given what exists in typographic and linguistic structuring
of alphabets as [signage]. it should be insanely obvious i am unknowing of
most of what this involves, beyond the given conceptualization, though the
goal here is to tap into ideas that others know about in their depth that
could break this fixed structural condition wide-open across a wide range
of parameters beyond the few exampled here and there in the text. so too,
there are probably many errors in my own observations, so that should be
part of the evaluation, if assumptions or language or conceptualization is
wrong or missing or inaccurate or false. consider it a working hypothesis,
open to addition, able to be corrected, improved upon, part of a
larger collaborative model, this just an introduction to be expanded upon
if and as others relate to the concepts. these are ideas, not just my view
is valid or right or correct, so what follows and what precedes are based
on questions and considerations over a long period and this is where it
arrives today, mostly, in terms of making sense of the totality involved.
and thus, from this limited perspective, the way into it, as far as i can
perceive given a particular experience of these issues, is with structural
relations between [signs] themselves at the scale of individual letters
--- importance of ligature ---
a key idea is how to establish geometric relations between signs based on
their visual ordering, which directly involves how they are patterned and
what is like and unlike, over a range of criteria and analyses (phonetic,
anagrams, mirroring, &c.)
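that range of criteria and analyses can be run in parallel as one report per pair of signs. the criteria and mirror pairs chosen here are my own illustrative picks, not a fixed canon:

```python
# glyphs that are left-right mirror images of each other (rough assumption)
MIRROR = {"b": "d", "d": "b", "p": "q", "q": "p"}

def relations(a, b):
    """score how two signs are like and unlike over several criteria"""
    return {
        "anagram": sorted(a) == sorted(b),
        "mirror_pair": len(a) == len(b) == 1 and MIRROR.get(a) == b,
        "shared_letters": sorted(set(a) & set(b)),
    }

print(relations("silent", "listen"))  # anagram: True
print(relations("b", "d"))            # mirror_pair: True
```

each criterion is one axis; accumulating more of them (phonetic distance, stroke overlap, &c.) is what would give the like/unlike analysis its range.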
i did not know this before, though the ampersand (&) is a ligature, a
combination of letters E and t, from Latin. the dollar sign ($) also..
Typographic ligature
http://en.wikipedia.org/wiki/Typographic_ligature
the way i arrived at the concept of ligature was with a thesis combining
two seemingly disparate concepts, [architecture] and [electricity]. and
strangely found a symbol that had 'ae' - that combined these two letters
into a single character (æ)
Foucault had a book 'the Archaeology of Knowledge' which conceptualized
across a similar gap between concepts [archaeology] and [knowledge] and i
think it was through that reading and a given interpretation that it was
clear a progression existed from A & E as two concepts that relate to their
fusion in a shared structure, which the ligature to me represents. in that
discovery it was realized the concepts of architecture map directly onto
electromagnetic infrastructure, and thus circuitboard city planning is not
just a metaphor, typology of components, nor is building programming, etc.
so this is just a backgrounder on a visual equivalence of the structure
between two letters, the middle horizontal line in a capital A and capital
E, and how they combined into a single character (Æ) via shared structure.
it seems innocuous or without meaning, potentially, unless having meaning.
and thus when 'the æther' or 'archæology' is rendered with or without the
ligature it indicates something about its nature even, perhaps.
so there seem to be a few ligatures for the American English alphabet
that get little or no use except perhaps in specialized domains. while the
ampersand is normalized, and dollar sign, the æ ligature is not. perhaps it
went out with the era of the classics (texts of western civilization). to
give a sense of the chaos, it is unknown whether the character will reliably
render in email software, and because there is no special character
input, simply trying to document the language itself is bounded and made
into a technical feat, to try to access a deeper level of its operation
than the readily available graphic emoticons, including the steaming pile of
shit with animated flies that Google provides in its place.
so language is debased, shallow, image-focused on recognition of signs and
responding to these patterns. and it is in this warped and twisted realm that
ideas must function, yet when competing with idiocy, it does no good to
reference or rely upon truth beyond the shallowness and the
choose-your-own-adventure style of navigating the situation that relativism
best allows.
do your own thing, everyone is legitimate in their viewpoints, etc. and
this ability to write and convey things, ideas, makes it so, apparently.
functionally. willing into being a perspective that can be reinforced and
defaults to true and believable by mere fact of sharing it, making it real
via language, shared viewpoints, even if they are fundamentally ungrounded
so it is incredibly easy to string together letters into words via typing
them out on a computer and sharing them worldwide with others, finding that
others exist elsewhere in a similar dimensionality and establishing those
synaptic and neuronal connections that establish a noospheric condition,
whereby awareness and atmosphere, local and global, synchronize in a new and
different way, via this remote connectivity as it defines, establishes a
shared reality - at the level of perception and communication of viewpoints
and yet the language itself could be superficial in contrast to the ideas
involved, so a lot could be communicated - information exchanged - and yet
very little said. or that the needle of truth remains in the haystack of
false and faulty frameworks, and thus it is about concentrating what this
minor truth is, establishing it, while not having a model of the whole that
this truth exists in, as a shared reality, only the partial details of it,
here to there, still all disconnected for lack of a larger framework and
the capacity to link together ideas beyond their isolated observations
as mentioned, this involves logical accounting for concepts in their truth
and connection of concepts beyond the context of pseudo-truth, which can be
a realm of conflict, a showdown of ideological vested interest and bullying
for authoritarian control over what is true, the valid perspective, etc, so
it is not just an issue of locating truth, it is also often unwanted or not
allowed, censored even, if not about destroying the messengers with ideas,
so to keep the paradigm on its existing downward course, into oblivion
thus to get into the concepts, it is to evaluate [signs] in terms of units
and the letter is one of these first steps. a single letter can have its
own meaning, such that the letter A may signify a mountain peak, to some
people or in a given context. the letter S evaluated as a snake. note how
children's books establish literacy in these same terms, where an image of
an illustrated green snake with googly eyes will be in the shape of an 'S',
though this extends into a realm of advanced cultural literacy as well
so there are lowercase and capital letters, which have their own dynamics
that could be evaluated. and while some concepts are single letter, such as
'I' that represents a first-person perspective of an observing self, most
letters unless defined as mathematical or scientific variables (a, c, E, e, f,
F, M, W, etc) are instead likely involved in 'boundary crossing' via their
sounds, into a realm of other signage they can still refer to. for instance
the letter 'b' when sounded out equates with 'be' the two letter word, and
'bee' the three letter word. thus these dynamics could be mapped letter by
letter likewise, in their discrete existence as isolated units, prior to
visible or geometric connectivity though this hidden relation exists as a
structure and likewise could and probably has been mapped out extensively
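this letter-by-letter sound mapping could be tabulated directly; a minimal
sketch, where the word lists are my own illustrations rather than any
standard inventory:

```python
# sketch: letters whose spoken name coincides with English words, the
# 'boundary crossing' described above. the word lists are illustrative
# choices, not an authoritative set.
LETTER_HOMOPHONES = {
    "b": ["be", "bee"],
    "c": ["sea", "see"],
    "i": ["eye", "aye"],
    "u": ["you", "ewe"],
    "y": ["why"],
}

def crossings(letter):
    """return the words a single letter can sound out into, if any."""
    return LETTER_HOMOPHONES.get(letter.lower(), [])
```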
so moving from single letters of the alphabet into two letter combinations
it is a realm of the ligature, where two letters can become one again by
their recombination, which involves a new character that replaces them
consider the idea that every word in this conveyance has adjacent letters
with a hidden potential to be combined in a compact ligature format, such
that a text could be reduced by a certain percentage yet still feasibly be
readable as the same text, only in a smaller space, and perhaps reading the
information would be more or less efficient, more or less detached from
issues of signage and help or hinder consideration of underlying concepts
in terms of how this could affect pattern matching and interpretation
for example, a word like HEIGHT could potentially have a ligature for [HE]
and [HT] such that [HE]IG[HT] would go from six characters to four. and
thus if still legible could compress or compact words and sentences this
way. so for instance, the ligatures for AE and OE could serve as examples
across the entire alphabet of two-letter or more relations.
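the HEIGHT example could be sketched as a greedy scan against a hypothetical
inventory of available two-letter ligatures; [HE] and [HT] here are assumed
glyphs taken from the example, not real Unicode characters:

```python
# sketch: greedy left-to-right pairing of a word against a hypothetical
# inventory of two-letter ligatures, reducing the glyph count.
def ligature_plan(word, inventory):
    """split a word into ligature pairs and leftover single letters,
    e.g. HEIGHT with {HE, HT} -> ['HE', 'I', 'G', 'HT'] (4 glyphs)."""
    out, i = [], 0
    while i < len(word):
        pair = word[i:i + 2]
        if pair in inventory:
            out.append(pair)
            i += 2
        else:
            out.append(word[i])
            i += 1
    return out
```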
potential 2 letter ligatures // text version
https://www.dropbox.com/s/c1jhz61ddtvmzff/alphabet2.pdf
(note: graphic version attached for list archive)
consider that there is an array of 26 letters whereby A-Z, each letter is
recombined with every other letter, which arrives at 676 combinations that
can be evaluated for potential ligature connections. a combination need not
arrive out of Olde English or some historical context to be legitimate; it is
proposed that this technique has applications beyond pronunciation or
cross-language word migration.
thus looking at the table, the matrix can be navigated via X,Y coordinates
such that (1,1) would be AA and (5,8) would be HE for example.
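a minimal sketch of navigating the 26x26 table by coordinates, assuming from
the two examples that the y coordinate selects the first letter and x the
second:

```python
import string

ALPHABET = string.ascii_uppercase  # 26 letters -> 26*26 = 676 ordered pairs

def pair_at(x, y):
    """letter pair at matrix position (x, y), 1-indexed; per the examples
    in the text, (1,1) -> AA and (5,8) -> HE."""
    return ALPHABET[y - 1] + ALPHABET[x - 1]

def coords_of(pair):
    """inverse lookup, e.g. HE -> (5, 8)."""
    first, second = pair.upper()
    return (ALPHABET.index(second) + 1, ALPHABET.index(first) + 1)
```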
the first pair of letters may be difficult to make a ligature of in some
situations due to issues of fonts and spacing with adjacent letters on
either side, though imagine that a software model could have multiple options
for how a ligature exists and is rendered, given its context, and thus a
single character perhaps somewhat like an M with a horizontal bar may in
some instance be legible as a double A ligature and thus if utilized could
compress whatever word it appears in, if any.
(it should be noted that these two-letter combinations are still not in a
full context of communicating language though could have a symbolic or
other sign-based relevance, such as acronyms. and thus while not yet in a
normal realm of language evaluation for long form communication, meaning
can be compacted into these two-character units, prior to evaluating them
as potential ligature candidates. for instance: GE in terms of mythology
and also General Electric corporation, larger than many nation-states. that
is a lot of concept in a two-letter compressed word. and thus another such
connected structuring exists in this realm of abbreviation and acronyms
that can be another structure to consider when mapping out such dynamics)
to return to the table, a second example of HE would easily combine across
the center horizontal and allow a shared edge between the two letters. and
so evaluating candidates or approaches for ligature combinations involves
looking at their structure and considering geometric relations between the
graphic forms - which can be widely variable given existing fonts/type.
thus, the preference for the 16 segment LED display that has segments that
turn on and off for various parts of these letter structures, albeit not to
the degree needed for different fonts, though a matrix LED display could.
in other words, 16 segment displays are not chained together, do not share
common edges between characters and thus do not allow such ligatures, yet
demonstrate the need for a standardization of structure to enable common
geometric relations to exist, likewise. as if properties that vary given
how they are approached.
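the shared-edge evaluation could be modeled by treating each letter as a set
of strokes; a sketch where the stroke inventories are my own simplified
encodings, not actual 16 segment display data:

```python
# sketch: compare letter structures as sets of strokes to find shared
# edges a ligature could fuse. stroke names are simplified encodings.
STROKES = {
    "A": {"left-up", "right-up", "mid-horizontal"},
    "E": {"left-vertical", "top", "mid-horizontal", "bottom"},
    "H": {"left-vertical", "right-vertical", "mid-horizontal"},
}

def shared_edges(a, b):
    """strokes two letters have in common, i.e. candidate fusion points."""
    return STROKES[a] & STROKES[b]
```

under this toy encoding A and E share only the middle horizontal, which is
exactly the fusion point the Æ ligature exploits.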
it should be noted that typical software allowing ligatures appears to be a
copy-paste scenario unless somehow macro-keyed. in that it is referenced,
as if looking up a thesaurus or doing a key-combo, to access the special
character, and that the adjacent letters a and e (ae) do
not automatically translate into the ligature 'æ'. this is a vital gap and
involves usability issues as the approach or limits define the interaction.
now what if a keyboard existed that had its own fonts and controlled the
output, such that it could take a text like this, and automatically when
writing translate the variable letter combinations into a running ligature
framework via real-time or post processing, thus compressing the data yet
retaining its readability. and what if reading consisted of interpreting
such texts and where an anomaly may exist in legibility, the letters in the
word could be expanded to their full scale, removed from their ligature
connection. further, beyond just 2 letters or 3 letters, what if entire
words could be compacted down this way.
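the keyboard/post-processing idea could be sketched as a reversible
substitution pass, here limited to the two ligatures Unicode actually
encodes for ae and oe:

```python
# sketch: compress ae/oe into their ligatures as text is written, and
# expand them back when legibility requires the full letters.
FORWARD = {"ae": "\u00e6", "oe": "\u0153"}
BACKWARD = {glyph: pair for pair, glyph in FORWARD.items()}

def compress(text):
    for pair, glyph in FORWARD.items():
        text = text.replace(pair, glyph)
    return text

def expand(text):
    for glyph, pair in BACKWARD.items():
        text = text.replace(glyph, pair)
    return text
```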
what this is to say is that the table above defines the 26x26 A-Z letter
combinations, and yet that is just one level of many. a word such as
'house' could be evaluated in terms of ligature, here capitalized to relate
it to the capital letter combinations...
[H][O][U][S][E]
in terms of *potential* ligature combinations, there could be two letter
evaluation for every adjacent pair: HO, OU, US, SE.
each of these pairs could have multiple potential versions, based on the
optimal relation to adjacent other letters. in this way, a span across
multiple ligatures could exist, such that [HO] and [OU] combine into a
single fused 3 letter ligature, while [US] and [SE] combine into another:
[HOU][SE]
and thus, given variability, the letter 'U' could tend towards a particular
optimal state either to co-exist with the adjacent letter 'S' or to attempt
to combine with it in an optimal, balanced way, based on its legibility...
[HO|U|SE]
this is to propose that *superposition* of variables could occur in the
word as a bit set, which may seem meaningless. though the language itself,
the sign itself, would be calculating its own rendering as a sign, and not
just a passive issue of someone tapping a key for letters H O U S E and
having that be the result- instead, various modeling would be activated and
word combinations would be processed in real time in terms of ligatures and
other structural data based on how the situation is modeled (which could go
beyond visual structure, several examples mentioned earlier).
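the adjacent-pair evaluation and the competing groupings like [HOU][SE]
versus [HO|U|SE] can be enumerated mechanically; a sketch:

```python
# sketch: list the adjacent pairs of a word, and all the ways it could
# be partitioned into candidate ligature groups of 1-3 letters.
def adjacent_pairs(word):
    return [word[i:i + 2] for i in range(len(word) - 1)]

def segmentations(word, max_len=3):
    """every partition of the word into runs of up to max_len letters."""
    if not word:
        return [[]]
    out = []
    for k in range(1, min(max_len, len(word)) + 1):
        for rest in segmentations(word[k:], max_len):
            out.append([word[:k]] + rest)
    return out
```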
so this is to propose that any given word could potentially be mapped into
and out of this interstructural connection, based on its unit of measure as
the letter as a combinable entity. it is not yet about cracking them open,
yet it prepares the framework for doing so, by seeing how language works in
terms of communication techniques using existing software and equipment.
[15,8|21|5,19]
thus, coordinates can map into these ligature frameworks, and three value
ligatures would have their own chart, which potentially would resolve many
three letter words or [signs] in terms of ligatures, and then potentially
many four letter ligatures could exist, and massive data charts would be
needed to chart these out, though it is entirely possible with computers.
now for instance if the original word used in the example (HOUSE) were to
be evaluated differently, it would have different variability in terms of
how the sign is rendered...
[H][OU][SE]
in this case, the letter 'H' is non-ligatured or combined with an adjacent
letter, say due to readability issues, and the other two ligatures remain
separate...
[8][21,15][5,19]
note the letter 'H' is the eighth letter in the alphabet, as a single
number input. now what if the [OU] and [SE] were suddenly realized to
exist in another chart of four letter ligatures, and thus the coordinates
for these two ligature resolutions were replaced by another matrix value,
here fictional, equaling
[H][OUSE] ...
[8][3.2.122,0.299.9]
what this suggests is that every letter and word could potentially be
fragmented and reunited in a similar structural framework that operates
well beyond ligatures, into an N-dimensional modeling of word and language
relations in a computational context, whereby the [signs] themselves are
not just inert static entities with no connection to truth, and instead
could map into different variabilities, ligatures or not, as a form or
approach to reading/writing.
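the coordinate notation used above can be sketched as an encoder over a
chosen segmentation, assuming singles map to one alphabet index and pairs to
(second-letter index, first-letter index), consistent with (5,8) = HE:

```python
# sketch: encode a word segmentation into the coordinate scheme of the
# text, e.g. [H][OU][SE] -> [[8], [21, 15], [5, 19]].
import string

def index(ch):
    return string.ascii_uppercase.index(ch.upper()) + 1

def encode(groups):
    out = []
    for g in groups:
        if len(g) == 1:
            out.append([index(g)])
        else:  # pair: second letter's index first, per the (5,8)=HE examples
            out.append([index(g[1]), index(g[0])])
    return out
```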
--- other potential views ---
thus, extrapolated into this other abstract variability and the issues of
superposition, a text could be typed that automatically is transposed into
another framework via its software, though this would need to be a model
that is geometrically controlled, within a particular non-arbitrary shared
framework based on a common empirical model that can be developed via many
parallel interactions referencing and improving upon the same rulesets.
in such a way, an interface could exist along with a physical or software
keyboard that could offer optional structuring or [sign] relations, such
that another approach could involve a non-ligature evaluation of all the
available ASCII letters and other special characters in their graphic
potential, and having this be an option for typing a concrete-poetry-like
message akin to SMS abbreviation though into the signage, deepening the
meaning, not just compacting it, instead making it multivariate. such that
a word or short communication could convey a prose like awareness, and
stand in and of itself as a perspective, in place of a paragraph, for
instance, due to its capacity and its depth of meaning. what are the
parameters that could be accessed, what are the existing limits of
parameter sets that *break* upon use or going beyond a given boundary, such
as characters in email, or limits within software, that instead could be
made accessible and reliable in use, for such symbolic communication, as it
may involve transformation of text or other approaches and issues, such as
one-time pads or encryption perhaps if a key infrastructure was involved
and meaning was bounded by interpreter
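the one-time pad is the simplest case where meaning is bounded by the holder
of the key; a minimal byte-level sketch:

```python
# sketch: a one-time pad over bytes; whoever holds the pad bounds the
# meaning of the message. XOR is its own inverse, so the same function
# both encrypts and decrypts.
import secrets

def make_pad(length):
    return secrets.token_bytes(length)

def otp(data, pad):
    assert len(pad) >= len(data), "pad must be at least message length"
    return bytes(b ^ p for b, p in zip(data, pad))
```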
another consideration is to alter the rules of evaluation, such that the
text could be flipped or twisted within ready-made software made for such
communications, or other such dynamics, like turning everything 90 degrees
that may reveal a hidden pattern or transform certain words and not others
or reveal a portal to gain access to another level of the text or to move
through several as the data is rearranged via a given planned sequence. for
instance, what if the axis of left to right is reversed, right to left, for
the display of information. such that [axis] is variable. or mirroring, or
other pattern dynamics, say structural substitutions or additions whether
graphic or phonetic or acronyms or whatnot.
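the axis-as-variable idea (mirroring, 90 degree turns) can be sketched over
a block of monospaced text:

```python
# sketch: rule-based transforms of a text block, treating the display
# axis as a variable: mirror right-to-left, or rotate 90 degrees.
def mirror(lines):
    width = max(len(l) for l in lines)
    return [l.ljust(width)[::-1] for l in lines]

def rotate90(lines):
    """rotate the block clockwise: the top row becomes the right column."""
    width = max(len(l) for l in lines)
    padded = [l.ljust(width) for l in lines]
    return ["".join(row[i] for row in reversed(padded)) for i in range(width)]
```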
in ways it may be reminiscent of invisible ink, even, if a strange text
were to arrive that may be formatted beyond a legible threshold due to
unknown or unknowable rules (if bit set permutations to subsign structures)
and thus a threshold condition existed requiring deciphering or decrypting
or translation of the text or abstracted data.
with no such key to unlock the cryptic patterning, an AI device could
search for patterns in the existing structure, looking for clues, of which
there could be infinitely many, given several approaches that appear likely.
yet which one to investigate may not be determinable, and the number of
choices could limit what kind of analyses are allowed, as if a barrier or
boundary, related to the way data is processed yet also truth modeled. in
other words it is speculated that beyond a certain threshold it could be
too difficult to compute or analyze such formatted communications because
they are equivalent to NOISE and display inherent structure that leads to
automatic meaning that is multiple and spread out in different directions
and thus the originating perspective remains unknown, potentially even to a
surveiller who may observe its decryption yet still not be capable of
comprehending its meaning, in terms of the information that is accessed, or
where the message begins or ends, potentially, depending on the framework
if a receiver of the message did have the key to unlock the meaning, it
could potentially exist as if a CSS/XML layer or transparency that appears
over the original text, as if the word-correction underlining in word
processors and text editors, though instead it could highlight sections and
reference particular models or approaches for this data, to evaluate its
meaning in terms of sign/symbol relations. thus, as if an additional layer
of highlighting upon an underlying page, the message could appear garbled
or abstracted by default, yet with the translucent additional layer,
various structures or patterns could emerge, as if masking out some or
revealing other data. this could also involve nested sets organized by
different rules in combination with layers, such that /sequencing/ data in
a particular step-by-step approach or ruleset, perhaps based on a private
and-or shared public key, could then open up or allow access to a hidden
message nested within a larger ~variable context. what happens if you start
to spin the third page of the fifth chapter of a given book, which may not
be encoded this way, yet could reveal a hint or clue that then references
what is on the page, recontextualizing it, creating a path via one-time pad
that may disappear and be irrelevant both before and after yet not during
that moment, when it is instead decisive and allows certain functioning in
the given parameters. and yet what if the words cannot spin, the software
does not exist or the concepts are not realizable, what if the threshold is
too high or the ideas too controversial, such that it is made illegal to
have such considerations or develop such tools as if *against culture*, in
that the binary ideologues realize the threat to their false perspective
and grip on interpretation, the inanity of SMS becoming deadly serious in
its repercussions as the language is forked into an unreadable realm.
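the transparency-layer idea could be sketched as a positional key applied
over an otherwise garbled page; the page and key below are fabricated toy
values for illustration:

```python
# sketch: a key as a transparency over a garbled page; the key is a
# sequence of character positions that reveals the nested message.
def reveal(page, key):
    return "".join(page[i] for i in key)

# toy values: the pad of filler characters hides a five-letter message
page = "xhxxexlxxlxxox"
key = [1, 4, 6, 9, 12]
```

without the key the page reads as noise; with it, the nested message emerges.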
-- related material --
numbers 2 example A
https://www.dropbox.com/s/k07n9hrm897ixn6/numbers-2xA.gif
numbers 2 example B
https://www.dropbox.com/s/ep2oxewk1bdfs82/numbers-2xB.gif
taro root bubble tea, phở
© ȍ ®
----- Forwarded message from Theodore Ts'o <tytso(a)mit.edu> -----
Date: Thu, 17 Oct 2013 17:29:52 -0400
From: Theodore Ts'o <tytso(a)mit.edu>
To: David Mercer <radix42(a)gmail.com>
Cc: Cryptography Mailing List <cryptography(a)metzdowd.com>
Subject: Re: [Cryptography] /dev/random is not robust
Message-ID: <20131017212952.GC14512(a)thunk.org>
User-Agent: Mutt/1.5.21 (2010-09-15)
On Fri, Oct 18, 2013 at 03:43:08AM +0800, David Mercer wrote:
>
> Sometime in the last two months I described the somewhat widespread issue
> at VM hosting/cloud providers of provisioning VM's with the same
> /dev/urandom seed from the image template. firstboot scripts typically only
> get run at image generation, and then the urandom seed is frozen in amber,
> as it were, in the VM image template file. It is a fairly trivial fix to
> re-seed it from /dev/random (one line in the right place).
Yeah, there are some people (including Dustin Kirkland at Canonical)
working on automated provisioning of random seeds from the hypervisor
to the guest kernels.
If you are compiling your own guest kernel, and the hypervisor
supports it, using virtio-rng which allows the guest to use the host
OS's /dev/random to bootstrap its local entropy pool is almost
certainly the Right Thing.
Cheers,
- Ted
_______________________________________________
The cryptography mailing list
cryptography(a)metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
----- Forwarded message from Theodore Ts'o <tytso(a)mit.edu> -----
Date: Thu, 17 Oct 2013 17:26:22 -0400
From: Theodore Ts'o <tytso(a)mit.edu>
To: John Denker <jsd(a)av8n.com>
Cc: Cryptography <cryptography(a)metzdowd.com>
Subject: Re: [Cryptography] /dev/random has issues
Message-ID: <20131017212622.GB14512(a)thunk.org>
User-Agent: Mutt/1.5.21 (2010-09-15)
On Thu, Oct 17, 2013 at 09:12:48AM -0700, John Denker wrote:
> Here is an experiment you can do, if you have a Linux system:
> cat /proc/sys/kernel/random/entropy_avail
>
> I predict that it is likely to be a smallish number, less than 192
> bits, not enough to cut a PGP key. This seems to conflict with
> the stated purpose of having /dev/random, and with the purpose
> of having buffers within the device.
This is a known problem, and I have a patch pending for the next merge
window to address this.
http://git.kernel.org/cgit/linux/kernel/git/tytso/random.git/commit/?h=dev&…
The Chrome browser in particular is a very heavy /dev/urandom user,
and this is causing the problem you describe below:
> On 10/17/2013 06:08 AM, Theodore Ts'o wrote:
> > using a Yarrow-like approach,
>
> I find the current version of /dev/random to be partly yarrow-like
> and partly not. It is yarrow-like in the sense that it performs
> updates in batches, with a substantial minimum batch-size. It
> is non-yarrow-like in that it presents far too much load on the
> upstream source of entropy.
With my recent change, /dev/urandom becomes much more like a
periodically seeded CRNG, where we aren't even pretending to extract a
bit of entropy from the input pool for each bit sent to userspace. If
you want that, then you should use /dev/random.
> A non-exhaustive list of questions and issues -- some quite
> deep and some quite superficial -- can be found at
> http://www.av8n.com/turbid/paper/devrandom.htm
There are some good questions here. Some of them have been recently
addressed, others have not been yet. I don't have time right now to
go through them all in detail, but I will put this on my reading list.
Some quick notes: I have considered the possibility of replacing the
output pools with something that uses AES instead, which would be
especially useful for those architectures which have an AESNI-like
instruction. That's obviously something that would require a lot of
thinking and prototyping before making such a major change, though.
As far as your comments about /proc/sys/kernel/random/entropy_avail
usually being close to zero, I'm currently running an upstream kernel
with the dev branch of the random.git tree merged in, and things are
significantly improved on that score:
% cat /proc/sys/kernel/random/entropy_avail
2847
On a process note, there is a huge amount of interest about
/dev/random that has been demonstrated on this mail thread, and while
some seem to be from people who haven't necessarily looked at the
actual drivers/char/random.c source code, nor are interested in
proposing specific changes, your comments above indicate that you have
done this, and I very much appreciate your thoughts.
Is the cryptography mailing list the best place to be having these
discussions? There is the moderation delay, and I'm also not sure how
eager the moderators are about having the mailing list taken over by
people talking about code patches, etc., on this list. I wonder if we
should create a separate mailing list, perhaps a
linux-random(a)vger.kernel.org, and take the more technical discussions
to that mailing list.
- Ted
P.S. If there are folks who will be at the LISA Conference in
Washington, D.C, I'm hoping to meet with Matthew Green and try to
interest him into doing a detailed look into at the random driver, and
perhaps dragoon some of his students into evaluating entropy sources
on various embedded Linux platforms. If there are other people who
are interested in talking /dev/random while I'm in DC, I've
tentatively blocked off the afternoon of Tuesday, November 5th for
that purpose. Let me know off-line....
----- End forwarded message -----
On Thu, Oct 17, 2013 at 10:03 PM, coderman <coderman(a)gmail.com> wrote:
>...
> 2. strong encryption ...
the practical uses are not just authenticity and privacy, but also
censorship avoidance and other availability improvements; c.f.:
"Lightweight Obfuscated Datagram Protocol (LODP)"
https://lists.torproject.org/pipermail/tor-dev/2013-August/005334.html
"Elligator [...] introduces a new solution: an encoding for points on
a single curve as strings indistinguishable from uniform random
strings."
http://elligator.cr.yp.to/
encryption for privacy of data at rest can almost be considered a
subset of the problem of encryption for privacy of communication as
above.
and all of the above hinge critically upon effective key management
and usability! which are the things you're most likely to screw up
inconspicuously and completely.
Foreign Intelligence Resistant systems [was Re: reasonable return on investment; better investments in security [....]]
by coderman 18 Oct '13
i must amend this prior advice.
in addition to legal protections, educational support, and competitive programs,
also provide:
- direct and unrestricted backbone access to various individuals or
groups who demonstrate competence in either the educational or
competitive realms, in order for them to mount additional attack
strategies against any reach-able target. this access must consist of
both passive taps of backbone traffic as well as injection taps for
raw packet transmission at core rates. this should be available on the
Internet backbone at internet exchanges, private fiber through public
right of way, and core networks of operators of licensed wireless
spectrum.
a side benefit of implementing these reforms would be the de-facto
de-funding of offensive network operations by third parties or
governments. the cost to keep ahead of such a widespread, popular, and
distributed effort would be enormous, and provide continually
decreasing returns.
... getting there is much more complicated of course. *grin*
---
On Sun, Apr 21, 2013 at 1:29 PM, coderman <coderman(a)gmail.com> wrote:
> On Fri, Apr 19, 2013 at 1:26 PM, <paul.szabo(a)sydney.edu.au> wrote:
>> ...
>> > 2012-02-15 - Vulnerability Discovered by VUPEN
>> > 2013-03-06 - Vulnerability Exploited At Pwn2Own 2013 and Reported to Adobe...
>>
>> Is a delay of a year before reporting to the vendor, acceptable?
>
>
> three years or more is better of course! i would not be disappointed
> with a dozen months, however.
> alas external factors (especially when licenses are non-exclusive)
> complicate longevity of weaponized exploits...
>
>
> if you really want to improve security:
> a) remove all criminal and civil liability for "hacking", computer
> trespass, and all related activities performed over data networks;
> establish proactive "shield" legislation to protect and encourage
> unrestricted security research of any subject on any network. extend
> to international agreements for blanket protection in all
> jurisdictions.
> b) establish lock picking, computing, and hacking curriculum in pre
> school through grade school with subsidized access to technical
> resources including mobile, tablet, laptop test equipment, grid/cloud
> computing on-demand, software defined radios with full
> receive/transmit, and gigabit internet service or faster.
> c) organize a program of blue and red teaming challenges for
> educational and public participation at the district, regional, and
> national level cultivating expertise and rewarding it with hacking
> toys, access, and monies.
>
> if implemented, i can guarantee a significant and measurable
> improvement in the security posture of the systems that remain in
> such an environment.
Secure whistleblowing feedback / reporting systems in the content of compartmented information, endpoint security [was: [NSA bitching] [formerly Re: PRISM][]]
by coderman 18 Oct '13
regarding the inability for NSA employees to report ethical violations
in a manner that did not assure retribution:
this is actually a somewhat difficult anonymity / privacy question in
the context of highly compartmented information and operations, where
knowledge of a subset of specific details is sufficient to imply
strong suspicion and scrutiny to a very small number of individuals...
... assuming you don't circumvent the apparently mediocre constraints
to this information in the information systems that contain it. ;)
---
while academically interesting, in all practical terms we should
render this question moot and provide absolute communication
origin[0], destination[1], and content[2] privacy to all network users
in all locations under all circumstances guaranteed by constitutional
law, prosecutorial discretion, and practical realities (read:
implementations resistant to Tailored Access Operations like efforts
(NSA TAO / CNE related programs)
this latter guarantee will require a bit more design, coding and deployment,
fun problems to solve![3]
0.,
1. "peer communication endpoint privacy" - this is a hard problem.
the existing implementations are not usable and insufficiently large
in anonymity set (too few users): zero knowledge high latency mail
like messaging mixes, even if the twitter mixes are pretty cool.
a proper solution would be datagram based, NAT busting, low latency
(read: sufficiently real-time for video and voice), the majority
protocol across the Internet and local intranets and ad-hoc mesh nets
and other networks,
in an implementation that resists all known general purpose (wide
scale) and specialized (highly targeted and/or weaponized bleeding
edge and/or privileged positioned) attacks.
2. strong encryption like: alligator wrapped forward secrecy intended
streams, and equivalent techniques, solve this problem.
clearly there is much work to do in the implementation and protocol
side of crypto integrity. very, very much work...
3. "NSA TAO / CNE related programs" resistance is a very tall bar.
they rolled this out at DEF CON, of course. the soon departing .gov
Alexander rolled into town with some world class shit, no doubt... is
it really going to be 33 years before we can talk about it? for
better or for worse we won't have Snowden to disclose this
(http://cryptome.org/2013/10/26-years-snowden.htm) as he's too classy
to drop dox on specific field operations or highly technical methods
and tools information. hmmm...
Intelligence agency subversions and clandestine, illicit programs; lack of popular outrage [was Re: PRISM]
by coderman 18 Oct '13
On Wed, Oct 2, 2013 at 1:52 AM,
<catsandd0gz.dinosaursandwh0res(a)hushmail.com> wrote:
> Is anyone else super mad?
if you're not mad as hell about PRISM, UPSTREAM, BULLRUN, FLYING PIG,
XKEYSCORE, FOXACID, EgotisticalGiraffe, QUICKANT, QuantumInsert,
FRUGAL SHOT, MOTHMONSTER, MULLENIZE, ERRONEOUSINGENUITY,
FINKDIFFERENT, GREATEXPECTATIONS, VALIDATOR, RAKE, PEDDLE,
PACKETCHEAP, BEACH HEAD, FERRET CANON, PINWALE, MARINA, TRAFFICTHIEF,
REMATION, LACONIC, ENDUE, MANASSAS, DANCINGOASIS, SPINNERET,
MOONLIGHTPATH, ...
and all the other myriad "exceptionally controlled information",
then you're beyond reason and redemption...
... let's not take a show of hands
;P
----
P.S. the new cypherpunks list has dropped cypherpunks(a)al-qaeda.net
for the more benign, powers-that-be-submissive cypherpunks(a)cpunks.org
... perhaps it does get past a few more filters? ...
--- fwd:
Subject: Snowden sets OPSEC record straight
To: cpunks <cypherpunks(a)cpunks.org>
Date: Thu, 17 Oct 2013 21:13:38 -0700
it doesn't get much more definitive than this retort:
"""
[Snowden] felt confident that he had kept the documents secure from
Chinese spies, and that the N.S.A. knew he had done so. His last
target while working as an agency contractor was China...
adding that he had had "access to every target, every active
operation mounted by the N.S.A. against the Chinese. Full lists of
them," he said.
"If that was compromised," he went on, "N.S.A. would have set the
table on fire from slamming it so many times in denouncing the damage
it had caused. Yet N.S.A. has not offered a single example of damage
from the leaks. They haven't said boo about it except "we think,"
"maybe", "have to assume" from anonymous and former officials. Not
"China is going dark." Not "the Chinese military has shut us out."
"""
there is a clear thoughtfulness, moral reasoning, and
conscientiousness repeatedly demonstrated by Snowden in these events.
it is now obvious that history will exonerate him fully.
... the distance between current reactionary retribution and that
future absolution appears considerable, however...
hopefully not too long.
---
http://www.nytimes.com/2013/10/18/world/snowden-says-he-took-no-secret-files-to-russia.html?_r=0&pagewanted=print
October 17, 2013
Snowden Says He Took No Secret Files to Russia
By JAMES RISEN
WASHINGTON - Edward J. Snowden, the former National Security Agency
contractor, said in an extensive interview this month that he did not
take any secret N.S.A. documents with him to Russia when he fled there
in June, assuring that Russian intelligence officials could not get
access to them.
Mr. Snowden said he gave all of the classified documents he had
obtained to journalists he met in Hong Kong, before flying to Moscow,
and did not keep any copies for himself. He did not take the files to
Russia "because it wouldn't serve the public interest," he said.
"What would be the unique value of personally carrying another copy of
the materials onward?" he added.
He also asserted that he was able to protect the documents from
China's spies because he was familiar with that nation's intelligence
abilities, saying that as an N.S.A. contractor he had targeted Chinese
operations and had taught a course on Chinese
cybercounterintelligence.
"There's a zero percent chance the Russians or Chinese have received
any documents," he said.
American intelligence officials have expressed grave concern that the
files might have fallen into the hands of foreign intelligence
services, but Mr. Snowden said he believed that the N.S.A. knew he had
not cooperated with the Russians or the Chinese. He said he was
publicly revealing that he no longer had any agency documents to
explain why he was confident that Russia had not gained access to
them. He had been reluctant to disclose that information previously,
he said, for fear of exposing the journalists to greater scrutiny.
In a wide-ranging interview over several days in the last week, Mr.
Snowden offered detailed responses to accusations that have been
leveled against him by American officials and other critics, provided
new insights into why he became disillusioned with the N.S.A. and
decided to disclose the documents, and talked about the international
debate over surveillance that resulted from the revelations. The
interview took place through encrypted online communications.
Mr. Snowden, 30, has been praised by privacy advocates and assailed by
government officials as a traitor who has caused irreparable harm, and
he is facing charges under the Espionage Act for leaking the N.S.A.
documents to the news media. In the interview, he said he believed he
was a whistle-blower who was acting in the nation's best interests by
revealing information about the N.S.A.'s surveillance dragnet and huge
collections of communications data, including that of Americans.
He argued that he had helped American national security by prompting a
badly needed public debate about the scope of the intelligence effort.
"The secret continuance of these programs represents a far greater
danger than their disclosure," he said. He added that he had been more
concerned that Americans had not been told about the N.S.A.'s reach
than he was about any specific surveillance operation.
"So long as there's broad support amongst a people, it can be argued
there's a level of legitimacy even to the most invasive and morally
wrong program, as it was an informed and willing decision," he said.
"However, programs that are implemented in secret, out of public
oversight, lack that legitimacy, and that's a problem. It also
represents a dangerous normalization of 'governing in the dark,' where
decisions with enormous public impact occur without any public input."
Mr. Snowden said he had never considered defecting while in Hong Kong,
nor in Russia, where he has been permitted to stay for one year. He
said he felt confident that he had kept the documents secure from
Chinese spies, and that the N.S.A. knew he had done so. His last
target while working as an agency contractor was China, he said,
adding that he had had "access to every target, every active
operation mounted by the N.S.A. against the Chinese. Full lists of
them," he said.
"If that was compromised," he went on, "N.S.A. would have set the
table on fire from slamming it so many times in denouncing the damage
it had caused. Yet N.S.A. has not offered a single example of damage
from the leaks. They haven't said boo about it except 'we think,'
'maybe,' 'have to assume' from anonymous and former officials. Not
'China is going dark.' Not 'the Chinese military has shut us out.'"
An N.S.A. spokeswoman did not respond Thursday to a request for
comment on Mr. Snowden's assertions.
Mr. Snowden said his decision to leak N.S.A. documents developed
gradually, dating back at least to his time working as a technician in
the Geneva station of the C.I.A. His experiences there, Mr. Snowden
said, fed his doubts about the intelligence community, while also
convincing him that working through the chain of command would only
lead to retribution.
He disputed an account in The New York Times last week reporting that
a derogatory comment placed in his personnel evaluation while he was
in Geneva was a result of suspicions that he was trying to break into
classified files to which he was not authorized to have access. (The
C.I.A. later took issue with the description of why he had been
reprimanded.) Mr. Snowden said the comment was placed in his file by a
senior manager seeking to punish him for trying to warn the C.I.A.
about a computer vulnerability.
Mr. Snowden said that in 2008 and 2009, he was working in Geneva as a
telecommunications information systems officer, handling everything
from information technology and computer networks to maintenance of
the heating and air-conditioning systems. He began pushing for a
promotion, but got into what he termed a "petty e-mail spat" in which
he questioned a senior manager's judgment.
Several months later, Mr. Snowden said, he was writing his annual
self-evaluation when he discovered flaws in the software of the
C.I.A.'s personnel Web applications that would make them vulnerable to
hacking. He warned his supervisor, he said, but his boss advised him
to drop the matter and not rock the boat. After a technical team also
brushed him off, he said, his boss finally agreed to allow him to test
the system to prove that it was flawed.
He did so by adding some code and text "in a nonmalicious manner" to
his evaluation document that showed that the vulnerability existed, he
said. His immediate supervisor signed off on it and sent it through
the system, but a more senior manager (the man Mr. Snowden had
challenged earlier) was furious and filed a critical comment in Mr.
Snowden's personnel file, he said.
He said he had considered filing a complaint with the C.I.A.'s
inspector general about what he considered to be a reprisal, adding
that he could not recall whether he had done so or a supervisor had
talked him out of it. A C.I.A. spokesman declined to comment on Mr.
Snowden's account of the episode or whether he had filed a complaint.
But the incident, Mr. Snowden said, convinced him that trying to work
through the system would only lead to punishment. He said he knew of
others who suffered reprisals for what they had exposed, including
Thomas A. Drake, who was prosecuted for disclosing N.S.A. contracting
abuses to The Baltimore Sun. (He met with Mr. Snowden in Moscow last
week to present an award to him for his actions.) And he knew other
N.S.A. employees who had gotten into trouble for embarrassing a senior
official in an e-mail chain that included a line, referring to the
Chinese Army, that said, "Is this the P.L.A. or the N.S.A.?"
Mr. Snowden added that inside the spy agency "there's a lot of
dissent, palpable with some, even." But he said that people were kept
in line through "fear and a false image of patriotism," which he
described as "obedience to authority."
He said he believed that if he tried to question the N.S.A.'s
surveillance operations as an insider, his efforts "would have been
buried forever," and he "would have been discredited and ruined." He
said that "the system does not work," adding that "you have to report
wrongdoing to those most responsible for it."
Mr. Snowden said he finally decided to act when he discovered a copy
of a classified 2009 inspector general's report on the N.S.A.'s
warrantless wiretapping program during the Bush administration. He
said he found the document through a "dirty word search," which he
described as an effort by a systems administrator to check a computer
system for things that should not be there in order to delete them and
sanitize the system.
"It was too highly classified to be where it was," he said of the
report. He opened the document to make certain that it did not belong
there, and after he saw what it revealed, curiosity prevailed, he
said.
After reading about the program, which skirted the existing
surveillance laws, he concluded that it had been illegal, he said. "If
the highest officials in government can break the law without fearing
punishment or even any repercussions at all," he said, "secret powers
become tremendously dangerous."
He would not say exactly when he read the report, or discuss the
timing of his subsequent actions to collect N.S.A. documents in order
to leak them. But he said that reading the report helped crystallize
his decision. "You can't read something like that and not realize what
it means for all of these systems we have," he said.
Mr. Snowden said that the impact of his decision to disclose
information about the N.S.A. had been bigger than he had anticipated.
He added that he did not control what the journalists who had the
documents wrote about. He said that he handed over the documents to
them because he wanted his own bias divorced from the decision-making
of publication, and that "technical solutions were in place to ensure
the work of the journalists couldn't be interfered with."
Mr. Snowden declined to provide details about his living conditions in
Moscow, except to say that he was not under Russian government control
and was free to move around.
RISKS-LIST: Risks-Forum Digest Thursday 17 October 2013 Volume 27 : Issue 55
ACM FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS (comp.risks)
Peter G. Neumann, moderator, chmn ACM Committee on Computers and Public Policy
***** See last item for further information, disclaimers, caveats, etc. *****
This issue is archived at <http://www.risks.org> as
<http://catless.ncl.ac.uk/Risks/27.55.html>
The current issue can be found at
<http://www.csl.sri.com/users/risko/risks.txt>
Contents:
GPS map leads to border crossing and shooting (Scott Nicol)
"The shutdown gets real for science and high tech" (Robert X. Cringely
via Gene Wirchenko)
"How federal cronies built -- and botched -- Healthcare.gov"
(Serdar Yegulalp via Gene Wirchenko)
Health care exchange still plagued by problems (Kelly Kennedy via
Monty Solomon)
How applying to college just got a lot harder (David Strom via
Gabe Goldberg)
Food Stamp Debit Cards Failing To Work In 17 States (Monty Solomon)
Majority of Brits fail to back up their important data (Monty Solomon)
"Web sites tracking users using fonts, Belgian researchers find"
(Candice So via Gene Wirchenko)
Smart meter deployments to double market revenue of wireless modules
(Bob Frankston)
"Apple's claim of unbreakable iMessage encryption 'basically lies'"
(Jeremy Kirk via Gene Wirchenko)
Re: "We can't let the Internet become Balkanized" (Sam Steingold)
Re: Founding Fathers (Richard A. O'Keefe)
Abridged info on RISKS (comp.risks)
----------------------------------------------------------------------
Date: Thu, 17 Oct 2013 15:18:58 -0400
From: Scott Nicol <scott.nicol(a)gmail.com>
Subject: GPS map leads to border crossing and shooting
A 16-year old boy from a small town in eastern Ontario stole a car, picked
up his girlfriend and headed east. A few police chases and stolen cars
later they ended up in Sherbrooke Quebec, where they stole another car. Not
far from Sherbrooke is the US border, which they promptly crashed through
and were shot at.
http://www.ottawasun.com/2013/10/15/ontario-runaways-nabbed-in-maine
http://www.ottawasun.com/2013/10/16/wrong-turn-at-border-maine-cops-probe-o…
Speculation as to why the kids entered the US points towards a GPS map
routing. Apparently they were headed for the Maritimes, which are the
eastern-most provinces of Canada. If you go to google maps and ask for a
routing from Sherbrooke, QC to St John, NB, all of the options go through
the US. There is a small yellow banner at the top of the directions that
reads "This route crosses through the United States".
http://goo.gl/maps/n5b0I
On an android phone the warning is in small print with a yellow triangle to
the left of it. This is the same yellow triangle you see when maps warns
about tolls on a route. Once you enter navigation there appears to be no
warning at all.
If you're on the run you probably won't notice the warning regardless. But
even if you aren't on the run, it's easy enough to just click "navigate"
and then any warning disappears.
------------------------------
Date: Tue, 15 Oct 2013 13:33:48 -0700
From: Gene Wirchenko <genew(a)telus.net>
Subject: "The shutdown gets real for science and high tech"
(Robert X. Cringely)
Robert X. Cringely | InfoWorld, 14 Oct 2013
Think the shutdown only hits panda cams and national parks? Hardly --
scientific research will feel impact for years to come
http://www.infoworld.com/t/cringely/the-shutdown-gets-real-science-and-high…
------------------------------
Date: Tue, 15 Oct 2013 13:31:23 -0700
From: Gene Wirchenko <genew(a)telus.net>
Subject: "How federal cronies built -- and botched -- Healthcare.gov"
(Serdar Yegulalp)
Serdar Yegulalp | InfoWorld, 14 Oct 2013
Many contractors for Healthcare.gov site seem to have been picked
based on past government work rather than IT expertise
http://www.infoworld.com/t/e-government/how-federal-cronies-built-and-botch…
------------------------------
Date: Wed, 16 Oct 2013 23:35:41 -0400
From: Monty Solomon <monty(a)roscom.com>
Subject: Health care exchange still plagued by problems (Kelly Kennedy)
Kelly Kennedy, *USA Today*, 16 Oct 2013
http://www.usatoday.com/story/news/nation/2013/10/16/exchanges-two-weeks-in…
Cloud devs: We could have saved buggy HealthCare.gov
Christina Farr, VentureBeat
Oct 14 2013
http://venturebeat.com/2013/10/14/cloud-devs-we-could-have-saved-buggy-heal…
Why healthcare.gov has so many problems
Steven Bellovin, Special to CNN, 15 Oct 2013
http://www.cnn.com/2013/10/14/opinion/bellovin-obamacare-glitches/
------------------------------
Date: Tue, 15 Oct 2013 16:31:09 -0400
From: Gabe Goldberg <gabe(a)gabegold.com>
Subject: How applying to college just got a lot harder (David Strom)
New software version flawed. Imagine!
- - ------ Original Message --------
Date: Tue, 15 Oct 2013 07:43:45 -0500
From: David Strom <david(a)strom.com>
Subject: David Strom's Web Informant: How applying to college just got a lot harder
To: webinformant(a)list.webinformant.tv
Web Informant, 15 Oct 2013
We've all heard the stories about a broken website that was overwhelmed with
visitors and was inadequately tested. But unless you have a high school
senior in your home, you may not have heard about another website besides
the much-flogged HealthCare.gov (that I and many others wrote about). I am
talking about the common application website for college admissions.
About 500 out of the nation's several thousand colleges and universities
support this site, which allows them to eliminate paper student admissions
applications. The idea dates back to when I was applying for college, when a
common paper-based application was put in use. Later it went
digital. Trouble is, the latest version of the common app is seriously
broken and has prevented many kids from applying to the colleges of their
choice. Given the high stakes involved, it is a serious problem.
The best press coverage about the breakdown has been from Nancy Griesemer in
examiner.com <http://examiner.com> where she lists work-arounds for the
students and chronicles the troubles of CommonApp, as it is known, has gone
through since they did a major overhaul this past summer. "The
implementation has been terrible," one college admissions IT director told
me. "Applicants have had difficulties in creating and completing their
application, school officials have had problems in submitting transcripts
and recommendations, and major changes in how the information is delivered
to colleges have happened without sufficient time for schools to adapt and
test their systems. We needed more lead time."
This director isn't alone: many college admissions officers vented their
frustrations at their annual meeting last month in Toronto, where some said
they couldn't get satisfactory answers from the CommonApp staff. There were
lots of things that should have been caught before being implemented. For
example, a payment processor routine that takes two days to send a
confirmation receipt, so many kids are paying multiple times. Or a signature
page that is so well hidden that students didn't find it to sign their
apps. As a result, their apps are never delivered to the college. Or those
all-important student essays turn into gibberish under some circumstances,
due to a faulty text import routine. Supposedly, these issues are being
fixed literally right now. It makes the HealthCare.gov site look like a
well-run place.
The CommonApp processes more than a million applications a year, and is the
only application method for about 300 schools. If you are applying early
decision to one of these, you are in a tough situation as the decision
deadlines are approaching.
Some 50 others are using another online process called the Universal College
App, including most recently Princeton. This process hasn't been plagued
with problems.
It is hard enough for high school seniors to figure out the college game
without having to become unwitting software UI and QC testers. CommonApp
needs to fix its code fast, and be more transparent about its problems in
the future.
Your comments are always welcome:
http://strom.wordpress.com/2013/10/15/college/
[See also
http://www.nytimes.com/2013/10/13/education/online-application-woes-make-st…
Noted by Monty Solomon. PGN]
------------------------------
Date: Wed, 16 Oct 2013 23:32:13 -0400
From: Monty Solomon <monty(a)roscom.com>
Subject: Food Stamp Debit Cards Failing To Work In 17 States
Walmart, Xerox Point Fingers, The Associated Press, 12 Oct 2013
People in Ohio, Michigan and 15 other states found themselves temporarily
unable to use their food stamp debit-style cards on Saturday, after a
routine test of backup systems by vendor Xerox Corp. resulted in a system
failure. Xerox announced late in the evening that access has been restored
for users in the 17 states affected by the outage, hours after the first
problems were reported. ...
http://www.huffingtonpost.com/2013/10/12/food-stamp-debit-cards_n_4090647.h…
Walmart, Xerox Point Fingers After Food Stamp Card Glitch Leads To
Wild Shopping Spree, Reuters, 14 Oct 2013 updated 16 Oct 2013
http://www.huffingtonpost.com/2013/10/15/walmart-xerox_n_4099207.html
[See also
"Food stamp recipients flood Louisiana Wal-Marts after EBT glitch"
Jessica Chasmar, *The Washington Times*, 14 Oct 2013
http://www.washingtontimes.com/news/2013/oct/14/food-stamp-recipients-flood…
Noted by Gene Wirchenko. PGN]
------------------------------
Date: Wed, 16 Oct 2013 23:26:27 -0400
From: Monty Solomon <monty(a)roscom.com>
Subject: Majority of Brits fail to back up their important data
Computer Business Review, 4 Oct 2013
Tons of individuals admitted to not storing an additional copy of digital
files. The majority of individuals in the UK do not back up their data,
leaving themselves vulnerable to loss of important files and digital
photographs. New research commissioned by digital storage firm WD
revealed that many Brits admitted to not storing an additional copy of
digital files, with most of them saying they simply are not concerned or
were unaware of how it could be done. ...
http://www.cbronline.com/news/tech/hardware/storage/majority-of-brits-fail-…
------------------------------
Date: Tue, 15 Oct 2013 13:44:04 -0700
From: Gene Wirchenko <genew(a)telus.net>
Subject: "Web sites tracking users using fonts, Belgian researchers find"
(Candice So)
Candice So, *IT Business*, 11 Oct 2013
Web sites tracking users using fonts, Belgian researchers find
http://www.itbusiness.ca/news/44120/44120
------------------------------
Date: October 16, 2013 at 6:02:53 PM PDT
From: "Bob Frankston" <Bob19-0501(a)bobf.frankston.com>
Subject: Smart meter deployments to double market revenue of wireless modules
[from Dewayne Hendricks via Dave Farber's IP]
I can't help but worry when I read a quote like ``The preference for
wireless [cellular] communication modules over wired technology is also
owed to their incredibly secured network.''
Trusting the cellular network to be secure (whatever that means) is a
problem in itself -- not only are there issues with the cellular protocols
but what happens once the bits get past the towers? Depending on perimeter
security is risky in that there is no protection once there is a breach.
Of course the motivation is clear as the article states -- the cellular
carriers stand to make a lot of money by charging for using their network.
Even if one doesn't depend on cellular there is the cost and complexity of
maintaining a parallel network.
All that protects content are protocols and encryption. There is nothing
magic about RF bits -- any approach that can be used for wireless bits can
be used for bits over IP. Not only would using existing connectivity be far
simpler and provide us with immediate benefits, the protocols would also
offer the potential for users to have access to the data for their own use
such as managing the power usage within their homes.
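Frankston's point that the protection travels with the bits rather
than the medium can be sketched minimally: an HMAC over a meter
reading authenticates it end to end whether it crosses cellular,
Wi-Fi, or plain IP. (Key, meter id, and field names below are
hypothetical; a deployed system would also encrypt, e.g. with an AEAD
cipher, and provision keys per device.)

```python
# End-to-end integrity for a meter reading: the link layer is
# irrelevant once the payload carries its own authentication tag.
import hashlib
import hmac
import json

DEVICE_KEY = b"per-meter shared secret (hypothetical)"

def seal(reading: dict) -> bytes:
    """Prepend a 32-byte SHA-256 HMAC tag to the JSON-encoded reading."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    return tag + payload

def verify(message: bytes) -> dict:
    """Reject any message whose tag does not match; return the reading."""
    tag, payload = message[:32], message[32:]
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered reading or wrong key")
    return json.loads(payload)

msg = seal({"meter_id": "M-1001", "kwh": 42.7, "ts": 1381968000})
```

Nothing here depends on an "incredibly secured network"; a flipped bit
anywhere en route is caught at the receiving end.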
Bob Frankston
Smart meter deployments to double market revenue of wireless modules
By Esme Vos
Oct 16 2013
<http://www.muniwireless.com/2013/10/16/smart-meter-deployments-double-marke…>
An increase in smart meter deployments will see the global market for
wireless communication modules approximately double in value over the
coming years, jumping from $532m in 2012 to $1.3 billion in 2020, at a
compound annual growth rate (CAGR) of 12 percent, according to a new report
from research and consulting firm GlobalData.
The company's latest report states that North America, currently the
dominant player in the market for global wireless communication modules for
smart meters, will be a key driver behind the leap, with its own market
revenue expected to climb steadily from $379m in 2012 to $433.7m in 2020.
Europe will also continue to account for a considerable share of the global
market, thanks to a significant number of pilot-scale projects getting
underway across the region. The uptake of wireless communication modules in
the UK, Denmark and Ireland in particular looks promising, according to
GlobalData, and these countries are predicted to occupy an even larger
share of Europe's wireless smart meter communication market by the end of
2020.
Cellular and Radio Frequency (RF) communication modules are the two key
technologies used in smart meters for two-way data transmission. RF modules
account for an 85 percent share of the North American market, thanks to
their low cost, high bandwidth and efficient performance in industrial
areas.
Ginni Hima Bindu, GlobalData's Analyst covering Smart Grid, says: ``The
preference for wireless communication modules over wired technology is also
owed to their incredibly secured network, and as a result, we expect to see
an increased take-up of wireless technology for smart meter deployments
across North America, the UK and Japan, which will continue to drive the
market over the forecast period.''
However, while the outlook for the wireless communication modules market is
largely positive, a number of challenges remain that may prevent any
further growth in global revenue.
``The problem of coverage is one of the major restraints of the market for
cellular communication modules,'' says Bindu. ``For an indoor electric meter,
GPRS technology provides just 80--85 percent coverage, if the electric
meter, or other grid device, is not moved accordingly.'' ...
Dewayne-Net RSS Feed: <http://dewaynenet.wordpress.com/feed/>
------------------------------
Date: Thu, 17 Oct 2013 14:04:51 -0700
From: Gene Wirchenko <genew(a)telus.net>
Subject: "Apple's claim of unbreakable iMessage encryption 'basically lies'"
(Jeremy Kirk)
Jeremy Kirk, InfoWorld, 17 Oct 2013
A famed iPhone jailbreak software developer says Apple could easily
decrypt iMessages, despite the company's claims
http://www.infoworld.com/d/security/apples-claim-of-unbreakable-imessage-en…
------------------------------
Date: Thu, 17 Oct 2013 14:13:44 -0400
From: Sam Steingold <sds(a)gnu.org>
Subject: Re: "We can't let the Internet become Balkanized" (Sascha Meinrath)
I keep wondering what is wrong with what NSA is doing. They are a spy
agency. They have been created to spy on everyone in the world, whether a
declared enemy or a professed "ally" (alliances do shift, so not spying on
an ally is a liability no nation can afford).
They "subverted the secure Internet protocols by inserting backdoors"? You
mean the Internet servers run on closed-source software? Or pre-compiled
binaries from open-source vendors which NSA compromised? Well, as a
"netizen", I am delighted that those insecure practices will now cease. If
an inept government bureaucracy could do that, I am sure it is being
routinely done by the criminals and terrorists all over the world. So, now
we at least have a chance to see this fixed.
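One modest defence against the tampered-binary scenario described
above is recomputing a downloaded artifact's SHA-256 and comparing it
with the digest the project published, ideally from a signed checksum
file. (A sketch only; the file being hashed here stands in for any
release artifact.)

```python
# Verify a downloaded binary against a published digest by streaming
# it through SHA-256 in chunks, so large files don't fill memory.
import hashlib

def sha256_file(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

A mismatch proves nothing about who tampered, only that the bits you
received are not the bits the vendor signed off on.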
They spied on US citizens, thus violating their "foreign intelligence"
charter? Yeah, this is no good. I would have felt much better if the same
surveillance were conducted by the FBI, not the NSA.
I actually welcome this scandal because it should bring home to people the
fact that we have lost "the expectation of privacy" battle. Yes, we can
legislate away the US government's ability to do surveillance - but how do
you make sure that China/Russia/Iran will not do it?
Sam Steingold (http://sds.podval.org/)
------------------------------
Date: Thu, 17 Oct 2013 18:33:20 +1300
From: "Richard A. O'Keefe" <ok(a)cs.otago.ac.nz>
Subject: Re: Founding Fathers (Robinson, RISKS-27.51)
In Risks 27.51 (http://catless.ncl.ac.uk/Risks/27.51.html#subj2)
Paul Robinson stated or implied that
1. The US is exceptional in having a right to bear arms.
2. (The US founding fathers having been no dummies.)
3. Women habitually went armed in Wyoming.
4. Wyoming was the first state to give women the vote.
5. 2 caused 1, which enabled 3 which caused 4.
Ad 1: The right to bear arms is in the British Bill of Rights, 1689.
And that did not create the right, but reaffirmed it as an
ancient right. It's noteworthy that the Bill of Rights
affirms this as a right of *individual* self-defence.
Ad 2: They certainly weren't.
There are two caveats in the Bill of Rights which the framers
of the second amendment carefully removed.
However, the second amendment is famously difficult to interpret,
and a case can be made that the people whose right to bear arms
was affirmed was those who would have been called on to serve in
the militia, namely (free, non-Amerind) men.
Ad 3: That's an empirical question I have no evidence on.
It's not clear that more women were armed in Wyoming than in
say Arizona, where women didn't get the vote until 1912, or
Texas, where they didn't get it until 1918.
Ad 4: This is certainly false. Women in New Jersey had the right
to vote since 1776. When Wyoming women got the vote, it was
not a state. Women in Pitcairn Island got the vote in 1838,
31 years before women in Wyoming, and they had neither the
protection of the US constitution nor the danger of rattlesnakes.
Ad 5: If women having guns got them the vote, it would be difficult to
understand how women with guns could ever _lose_ the vote. Yet
they did.
New Jersey: women got the right to vote in 1776, did vote from
1787, LOST the vote in 1807.
Utah: women got the vote in 1870, and LOST the vote in 1887.
Territory of Washington: women got the vote in 1883,
and LOST the vote in 1887.
Ohio: women got the vote in 1917 and LOST it later that year.
We would also expect that countries that limited the right to
bear arms would extend the vote to women later. Now the
1918 constitution of the USSR says (Article 2, paragraph 19):
For the purpose of defending the victory of the great
peasants' and workers' revolution, the Russian Socialist
Federated Soviet Republic recognizes the duty of all citizens
of the Republic to come to the defence of their socialist
fatherland, and it therefore introduces universal military
training. The honor of defending the revolution with arms
is accorded only to the workers, and the non-working
elements are charged with the performance of other military duties.
This actually sounds a lot like the 2nd amendment, except for the
restriction to "the workers". However, article 23 makes it clear
that this has nothing to do with defence *from* the state:
Being guided by the interests of the working class as a
whole, the Russian Socialist Federated Soviet Republic
deprives all individuals and groups of rights which could
be utilized by them to the detriment of the socialist revolution.
So you could carry a gun in the army, but not shoot a tax collector.
Yet the USSR gave women the vote before Michigan or Oklahoma or
South Dakota or Texas! Did women in Texas have no guns?
My source for these dates is
http://www.nzhistory.net.nz/politics/womens-suffrage/world-suffrage-timeline
which cites C. Daley and M. Nolan (eds), Suffrage and beyond: international
feminist perspectives, Auckland University Press, Auckland, 1994.
The RISK? The truth is out there, but so is a whole lot of self-serving
wishful thinking. (For example, the Pill had no detectable effect on
birth rates in English-speaking countries, contra the popular mythology.)
------------------------------
Date: Sun, 7 Oct 2012 20:20:16 -0900
From: RISKS-request(a)csl.sri.com
Subject: Abridged info on RISKS (comp.risks)
The ACM RISKS Forum is a MODERATED digest. Its Usenet manifestation is
comp.risks, the feed for which is donated by panix.com as of June 2011.
=> SUBSCRIPTIONS: PLEASE read RISKS as a newsgroup (comp.risks or equivalent)
if possible and convenient for you. The mailman Web interface can
be used directly to subscribe and unsubscribe:
http://lists.csl.sri.com/mailman/listinfo/risks
Alternatively, to subscribe or unsubscribe via e-mail to mailman
your FROM: address, send a message to
risks-request(a)csl.sri.com
containing only the one-word text subscribe or unsubscribe. You may
also specify a different receiving address: subscribe address= ... .
You may short-circuit that process by sending directly to either
risks-subscribe(a)csl.sri.com or risks-unsubscribe(a)csl.sri.com
depending on which action is to be taken.
Subscription and unsubscription requests require that you reply to a
confirmation message sent to the subscribing mail address. Instructions
are included in the confirmation message. Each issue of RISKS that you
receive contains information on how to post, unsubscribe, etc.
=> The complete INFO file (submissions, default disclaimers, archive sites,
copyright policy, etc.) is online.
<http://www.CSL.sri.com/risksinfo.html>
*** Contributors are assumed to have read the full info file for guidelines.
=> .UK users may contact <Lindsay.Marshall(a)newcastle.ac.uk>.
=> SPAM challenge-responses will not be honored. Instead, use an alternative
address from which you NEVER send mail!
=> SUBMISSIONS: to risks(a)CSL.sri.com with meaningful SUBJECT: line.
*** NOTE: Including the string "notsp" at the beginning or end of the subject
*** line will be very helpful in separating real contributions from spam.
*** This attention-string may change, so watch this space now and then.
=> ARCHIVES: ftp://ftp.sri.com/risks for current volume
or ftp://ftp.sri.com/VL/risks for previous VoLume
http://www.risks.org takes you to Lindsay Marshall's searchable archive at
newcastle: http://catless.ncl.ac.uk/Risks/VL.IS.html gets you VoLume, ISsue.
Lindsay has also added to the Newcastle catless site a palmtop version
of the most recent RISKS issue and a WAP version that works for many but
not all telephones: http://catless.ncl.ac.uk/w/r
<http://the.wiretapped.net/security/info/textfiles/risks-digest/> .
==> PGN's comprehensive historical Illustrative Risks summary of one liners:
<http://www.csl.sri.com/illustrative.html> for browsing,
<http://www.csl.sri.com/illustrative.pdf> or .ps for printing
is no longer maintained up-to-date except for recent election problems.
*** NOTE: If a cited URL fails, we do not try to update them. Try
browsing on the keywords in the subject line or cited article leads.
==> Special Offer to Join ACM for readers of the ACM RISKS Forum:
<http://www.acm.org/joinacm1>
------------------------------
End of RISKS-FORUM Digest 27.55
************************
----- Forwarded message from Theodore Ts'o <tytso(a)mit.edu> -----
Date: Thu, 17 Oct 2013 09:08:00 -0400
From: Theodore Ts'o <tytso(a)mit.edu>
To: Adam Back <adam(a)cypherspace.org>
Cc: Jerry Leichter <leichter(a)lrw.com>, Sandy Harris <sandyinchina(a)gmail.com>, Cryptography <cryptography(a)metzdowd.com>
Subject: Re: [Cryptography] /dev/random is not robust
Message-ID: <20131017130800.GE11932(a)thunk.org>
User-Agent: Mutt/1.5.21 (2010-09-15)
On Thu, Oct 17, 2013 at 02:32:57PM +0200, Adam Back wrote:
>
> Yarrow, and the replacement Fortuna try to address this problem by
> accumulating entropy and adding it in bigger lumps..
... and Linux's /dev/random driver does this.
Post July 2012, most of the entropy is gathered via a per-CPU (to
avoid cache line bouncing effects and so it can be lockless) entropy
pool, where we sample the high resolution cycle counter (or whatever
the highest granularity clock / memory refresh control register /
etc. we have access to on the architecture) and the interrupted IP, and
mix that into the per-CPU fast mix pool on every interrupt. We do
*not* use an entropy estimator for this interrupt fast mix pool.
Instead, every 64 interrupts, we transfer entropy from the
fast mix pool to the input pool, and we credit the input pool with a
single bit of entropy. (There is very likely much more than a single
bit of entropy that has gotten accumulated during those 64 interrupts,
but out of an abundance of caution, we're using a very conservative
estimate for administrative concerns.)
In both the pre and post July 2012 designs, using a Yarrow-like
approach, we only transfer entropy from the input pool to the output
pool when there is sufficient entropy estimated to be in the input
pool so that we can do a "catastrophic reseed". The "/dev/random is
not robust paper" assumed that the attacker could control the
interrupt timings such that estimate of entropy in the input pool was
incorrect, and thus the catastrophic reseed aspect of the design could
be bypassed.
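[The two-stage design described above --- unestimated per-interrupt fast
mixing, a transfer to the input pool every 64 interrupts credited with a
single bit, and a Yarrow-like gate that only reseeds once enough estimated
entropy has accumulated --- can be sketched roughly as follows. This is a
toy illustration of the accounting only, not the kernel's actual mixing
code; the SHA-256 mix and the 128-bit reseed threshold are placeholder
assumptions.]

```python
import hashlib

FAST_POOL_INTERRUPTS = 64     # transfer interval, per the post
RESEED_THRESHOLD_BITS = 128   # placeholder "catastrophic reseed" gate

class ToyEntropyPool:
    """Toy model of the post-July-2012 two-stage accumulation."""

    def __init__(self):
        self.fast_pool = b""       # per-CPU fast mix pool (simplified)
        self.interrupts = 0
        self.input_pool = hashlib.sha256()
        self.credited_bits = 0     # deliberately conservative estimate

    def interrupt(self, cycle_counter, interrupted_ip):
        # mix the high-resolution counter and interrupted IP on EVERY
        # interrupt; note there is no per-event entropy estimation here
        sample = (cycle_counter.to_bytes(8, "little")
                  + interrupted_ip.to_bytes(8, "little"))
        self.fast_pool = hashlib.sha256(self.fast_pool + sample).digest()
        self.interrupts += 1
        if self.interrupts % FAST_POOL_INTERRUPTS == 0:
            # every 64 interrupts: fold into the input pool and credit
            # just ONE bit, however much entropy actually arrived
            self.input_pool.update(self.fast_pool)
            self.credited_bits += 1

    def try_reseed(self):
        # Yarrow-like gate: only release seed material once enough
        # estimated entropy exists for a "catastrophic reseed"
        if self.credited_bits < RESEED_THRESHOLD_BITS:
            return None
        self.credited_bits = 0
        return self.input_pool.digest()
```

[The point of the gate is visible in the toy: after 64 * 127 interrupts
only 127 bits have been credited and try_reseed() returns nothing; one
more batch of 64 crosses the threshold and releases seed material all at
once, rather than dribbling it out where an attacker could track it.]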
I've already discussed why I don't believe the assumption that
the attacker could control the interrupt timings to such an extent is
realistic, and analysis of the entropy estimator (as used in the
pre-July 2012 design) showed that in fact, it was quite good. But in
the post July 2012 design, we no longer use an interrupt estimator for
the interrupt fast mix pool. We abandoned it for efficiency concerns,
since we wanted to make the cpu cost on the global interrupt fast
path as low as possible; instead, we traded this off by a
brute force quantity argument --- if we can collect the timing for
every single interrupt we're much better off than collecting it only
for some interrupts, especially when in the old design (which involved
CPU cache line bouncing and potential lock contention) device driver
authors were disabling the entropy collection more often than not.
So in the new design, we aren't using a dynamic entropy estimator ---
instead, we're assuming that after collecting the timings for 64
interrupts, we've collected a single bit of entropy, which is really
a static entropy measure. Could this be spoofed if the attacker has
control of the interrupt timings of the system?
Sure, but if the attacker has that level of control on the system,
then pretty much all generators would be seriously compromised as
well. The only way the paper could show that their proposed generator
was "robust" was based on the assumption that it would be possible for
the attacker to control the entropy inputs in such a way that entropy
estimator would be spoofed, but the attacker might still not know some
of the bits of the entropy inputs.
After all, if the attacker knows all of the bits, then by definition
all generators would be screwed. However, what has not been
demonstrated in the paper is a real life scenario where the attacker
would have that level of control over the entropy inputs --- enough
that entropy estimators would be fooled, but not enough control that
their construction could be considered robust.
Regards,
- Ted
_______________________________________________
The cryptography mailing list
cryptography(a)metzdowd.com
http://www.metzdowd.com/mailman/listinfo/cryptography
----- End forwarded message -----
--
Eugen* Leitl <a href="http://leitl.org">leitl</a> http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://ativel.com http://postbiota.org
AC894EC5: 38A5 5F46 A4FF 59B8 336B 47EE F46E 3489 AC89 4EC5
--- overview ---
when cryptography is removed from a computer-only context, the boundaries
in which it could be modeled to function within would expand into a wider
realm of considerations that otherwise may never be questioned, and thus
exist limited in that computational domain, confined in a given threshold
of interpretation. thus, to exit the ideological constraint of 'knowing' a
particular approach, and then eventually to reenter the computer context
with perhaps a wider range of consideration for what might be possible...
i think one of the most evident assumptions about computer cryptography as
it exists is the role of mathematics (if not algebra specifically) in
defining the model if not paradigm for computer encryption. this could be a
false view due to a naive outside observation of these technological
events, though it allows consideration of the involved issues nonetheless,
however accurate.
the case has been made that [signs] are used in language and mathematics,
and that this is the basis for code that is used to program software, and
it is tied into equations, algorithms of mathematics yet also software that
models a cryptographic process, to encrypt and decrypt information. and so
it has been questioned - how are these signs themselves modeled in terms of
their truth, or has this not occurred and potentially ungrounded 'data' and
beliefs are the default, as this relates to language, observation, and
relations that are the basis for this cryptographic exchange.
thus a question of [code] could be removed from consideration of the truth
of the symbols and signs used in these interactions, their foundation, if
they are actually accounted for in their truth, grounded or ungrounded, as
this relates to issues of security, secrecy, privacy, and so on. and so a
default condition seemingly exists, where this code itself, programming,
and software/hardware solutions could potentially be ungrounded, if they
are operating within frameworks and context of partial-truth (pT), versus a
model that is empirically grounded, not just in terms of the mathematics as
1=1, further on to account for 1=truth. which seems to be missing in all
such equations, as representations can be detached from their accounting.
and so the bulletproof ideology could exist that mathematics equals strong
code, on the basis of 'belief in mathematics' that could tend towards the
ideological. yet in this way, functions beyond proof and other dynamics may
rely on a theorized capacity and theorized security framework that itself
is the weakness, as pT=/T, as this structuralizes a basis for exploitation.
[code] ---> mathematics
so it is to question a prevailing condition or potential assumption that
the simple act of representing reality can be equated with reality itself
as a belief system, that becomes faith based, or based on personal trust
issues as a security model. the more esoteric the mathematic equations or
code, perhaps the more secure, if it were not involving rigor, though that
appears opposite the nature of the discipline or its practitioners and
developers, in that a community overviews and oversees development of the
crypto and its security and integrity is also a basis for their own.
[code] ---> mathematics == security
so a presumption could exist that the involvement or role of mathematics in
cryptography is how it establishes its security. and this text is to call
this fundamental or foundational notion into question. is it really true?
another way of evaluating this condition is that 1=1 would be the basis for
establishing security. and mathematically it could represent 'truth' via
this correspondence or pattern matching of [signs].
and yet in an ungrounded condition, pattern matching of [sign]=[sign] can
equate with 'truth' via its representation, yet not its actuality beyond
the signage itself, thereby [1]=[1] could remain variable and only in a
context of pseudo-truth by default. if ["sign"] is not accounted for in a
shared empirical model. it is just language then, communication at a given
surface-level interpretation. there is the missing dimension of philosophy
that validates the truth it potentially involves. it is not just about
signs -- their truth must be accounted for beyond this immediate level of
calculation. it involves and requires more consideration and evaluation.
[code] ---> mathematics != security
in other words it is to propose that 'truth' is not by default within the
signs used to represent a situation, because they can be ungrounded and
function as if literature - describing a situation, though it involves
variability by default. an issue of relativistic observation and how the
data that is modeled is accounted for, removed of error or reliant upon it,
and thus the gap between what is real and what is represented is at issue.
it would seem to involve 'pattern matching' as a concept, the basis for the
ability to establish 1=1 correlations in terms of number and its processing
via symbols and other signs that map into the world and manipulate these
relations and frameworks. and thus as stated, this A=A consideration is of
the domain of establishing truth via logical reasoning, and thus a realm of
thinking about concepts and ideas is involved, underneath the establishing
of these mathematical models and processes. the ideas involved in these
relations, as they become formalized. there is incoherence at this level,
as currently the ideology of binarism makes this connection an issue of
shared belief in an approach, versus its truth beyond a given boundary.
[code] ---> mathematics (concepts)
so it is to question what is going on at a more fundamental, foundational
level prior to this cryptographic modeling of information, such that What
If the concepts themselves are ungrounded in some way, such that variables
or representative signs or equations may not be by default /empirical/ and
instead could exist in domains of relativistic skew, distortion, and bias
in hidden ways that could also be exploited or subverted. thus while the
[signs] may be operational, are they actually grounded in truth or some
belief system that equates with truth, because it is assumed to exist in
the signage and not beyond it. in the manipulations not what is referenced.
[code] ---> mathematics ('concepts')
thus to consider the deeper truth involved in such conceptualization, and
this relates number to letter in terms of its use as signage, as both of
these can function as language systems that are in ways integrated, yet
assumed to be differentiated at the level of mathematics and, say, writing
a novel or short-story fiction as this is believed different than novel
algorithms and exploratory equations and theoretical proofs. what if the
mathematical viewpoint is ungrounded or relativistic, for instance, or that
the literature could be more objective than equations filled with numbers.
ultimately, the mathesis involved appears not to differentiate a model of
empirical truth in terms of A=A equivalence, from either math or language
in that both could be evaluated in this same context from the beginning. a
shared modeling in other words, potentially. so the alphanumeric code could
be integrated at a substructural level to the [signage] that differentiates
the mathematic and linguistic, yet this could also be a false perspective
or inaccurate belief and mistaken assumption- perhaps they are one system
so where this is going is to consider a particular existing viewpoint and
interpretative framework for crypto, especially as defined by computers and
peripherals, that establishes a fixed idea about what it is and involves
and yet may involve these hidden boundaries that are also warped or biased
towards certain interactions or investigations and can disallow others
[code] ---> mathematics (algebra)
an example is if considering a given approach to crypto involves algebraic
functions as a paradigm as this relates to computation. this approach then
becomes the context for evaluation and mediation of representative [signs]
that may also be bounded in their interpretation in this way, due to the
delineation between mathematical and linguistic data. the "algebra" may
only be conceptualized and believed to function at the unit of [signs] and
their manipulation as signs, and not involve algebraic computations within
the [signage] itself, potentially, in terms of subsign units.
this is to attempt to convey that a boundary condition could be upheld that
views and models language as inviolable in terms of certain existing rules
such that a [word] is viewed as a unit, and not considered in its inherent
variability in terms of this same potential algebraic functioning. in that
the math is the math and the language is the linguistics, and the math is
doing things to the language based on particular established relations and
boundaries about what these relations are and how they are believed to function
based on convention if not ideological views and understanding. it is very
abstract and perhaps inaccurate as stated here, yet seeks to ask- to what
extent is the information viewed passive, inert, and non-meaningful, as
this relates to its transformation (encryption) and reconstitution
(decryption). where is the boundary for this transmutative relation and
dynamics: is it inherently what mathematics does to language, from an
outside-in approach, such that mathematics acts upon the [signs], or might
it potentially involve accessing an inherent mathematical structure within
language itself, and thus a different boundary or relation could allow the
language itself to be the basis for the algorithms and equations, or to
bridge across these in a different, more integrated and meaningful way.
it makes little sense without visualizing it, yet email limitations in this
era of devolving infrastructure and tools make it difficult to
convey in the given medium, thus limiting what can be easily and accurately
shared and in what ways- forcing the perspective for signage, and thus
relationships
[code] ---> mathematics (geometry)
likewise, if cryptographic operations involved a geometric modeling of data
this could also apply to how the content of the encryption scheme then is
evaluated and processed. and again, an issue of boundaries. how are the
[signs] considered in terms of the language or messaging involved. is this
an outside operation of geometry that transforms 'information' which is
measured by units of words and sentence structures and their formatting, or
may it potentially involve more than this, such that beyond this limit, a
subsign geometric structure could exist and be connected to, and become a
basis for this transformational processing. thus the 'truth' of the signs
as these relate across the conventional line separating mathematics and
linguistics, in terms of a shared patterning that involves both domains.
[code] == (mathematic & linguistic)
so if considering the issue of boundaries and representation, and how logic
establishes these structures of observation and perception and modeling,
that perhaps code itself, in its truth, involves a more fluid interaction
in these domains than the traditional viewpoint can acknowledge, as this
relates to the concepts involved and how they are approached. for instance
computation or equations or algorithms, how data is processed, encrypted
in terms of pattern matching (A=A), this could span a model of both code as
a mathematic and linguistic structure, given 3-value and N-value logic. in
this way, the [sign] itself could not only have a mathematic operation that
is transforming it from the outside or external boundary, and instead this
processing could occur inside, and consist of its own equations, based upon
inherent calculative dimensions of its symbolic or sign-based linguistic
structuring (as language). in other words, a [word] could have calculative
and computational potential built-into it, in terms of its patterning and
yet if the word is not allowed to be evaluated beyond its whole conception,
the subsign structuring may be by default off-limits or made inaccessible.
this is to include smaller units than the word as sign, to include even
more basically letters, whereby for example the letter [Z] may only be
evaluated in terms of its being 'z' and not its components or ~various
structural relations with other letters, such as S|Z or N/Z or numbers: Z|5
and Z-2. though of course there is more to it than this, because the same
structure can be taken apart and evaluated in its individual components:
-/_ or > and <, etc
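the kind of subsign relation suggested here (Z alongside S or N, L
decomposed into | and _) can be made concrete with a toy stroke inventory;
the decompositions below are invented for illustration only and are not
any standard encoding of letterforms.

```python
# arbitrary toy stroke inventories per glyph -- these decompositions are
# assumptions made for this sketch, not a standard model of the letters
STROKES = {
    "L": {"|", "_"},
    "T": {"|", "-"},
    "Z": {"-", "/", "_"},
    "S": {"-", "(", "_"},
    "N": {"|", "\\"},
    "V": {"\\", "/"},
    "Y": {"\\", "/", "|"},
}

def relatedness(a, b):
    """jaccard overlap of stroke sets: a crude subsign similarity"""
    sa, sb = STROKES[a], STROKES[b]
    return len(sa & sb) / len(sa | sb)
```

so relatedness("V", "Y") exceeds relatedness("V", "L"), roughly tracking
the visual intuition; rotation relations (y/v/h, Z/N) would require
transforms acting on the strokes themselves, which this sketch omits.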
[code] ---> pattern matching
so the idea is that programming itself is based within code and issues of
how it is modeled and how it represents the world, and it is to question if
this is actually truly grounded or based in an ideological belief system.
and so it is assumed there is partial grounding, in some ways, though a
realm of error or a gap exists between what is modeled and what exists
(pT=/T) and this includes the conceptualization of code itself as signage
likewise, the default boundaries of this code could affect how it is both
managed and processed, within what parameters. and thus the heavy reliance
on mathematics as if the basis for this strength, yet the same potential as
a weakness if it too is ungrounded or only partially so, in terms of the
potential for exploits based on these errored notions and beliefs. (A=B)
the cryptographic consideration in this scenario then, of how signs are
processed and conceived of, as units to be transformed by equations, as if
the basis for objectivity, yet not accounting for this in logic itself (T)
beyond the level of the [signage], such that pattern matching of signs is
believed of direct equivalence with truth itself, thus 1=1 is truth, yet
not taking into account what this potentially represents, in its truth
and perhaps this is the issue with language and observation as a context
for the mathematic, and how internal processing of a person is
externalized and thus ungrounded views and beliefs can be made structural
and equated with [signs] via shared assumptions and viewpoints, that
because they are shared and agreed upon, are themselves believed to be
true. binary logic and ideology are what allows this perception as a
default condition, yet it can be and likely is ungrounded and based within
relativism, automatically, or in other words, occupies a framework in
pseudo-truth that continually is expanded upon via endless viewpoints that
together in their interaction with other such views, even as agreed upon and
confirmed as shared observation, tends towards nothingness (0) as a
perspective instead of towards truth (1)
[code] ---> (signs/symbols)
thus it is to consider the code in terms of this issue of signage and of
boundaries, as it involves interpretation beyond these, to what they are
referencing, where their truth can be accounted for, in its accuracy as a
model or representation. ungrounded relativism has no need of this extra
step, and in this way mathematics can freely function as if writing..
thus the vital issue of error-checking and correction of code at the level
of signs used to represent ideas and concepts (mathematics, crypto models)
as this exists beyond equations and algorithms and into a realm of ideas,
how truth is evaluated, and the requirement of this in terms of security
all of this to establish and allow a conceptualization that follows, that
considers programming and code for cryptography in what may be perceived as
an off-limits consideration- that of typography.
--- crypto.typologic ---
in the same way that crypto is conceptualized to be related to
mathematics, it is also proposed typography has connected structural
relevance to this crypto~graphic inquiry
[crypto] ---> [mathematics]
in other words, in the linguistic context that also establishes and defines
approaches to cryptologic systems and their cryptographic conventions, it
is to consider the boundaries separating their interactions...
[crypto] ---> [linguistics]
in other words, what if at the level of representation within code itself
there is a boundary or limit or threshold condition upheld by convention
that is itself arbitrary, a forced perspective even, and that it could be
holding back other options and perspectives for the questioning involved...
for instance, encryption that involves algebraic and geometric operations
and functions, as these may be bound to mathematical transformation of
signage, yet at a certain bounded condition, outside or upon the sign
itself or at its periphery, versus within it, in terms of its subsign
dynamics or subsign meaning
[crypto] ---> [mathematics] --> [signage]
this approach is essentially to consider the relation between mathematics
and language, in a context of linguistics, whereby a calculus could exist
that bridges the distance between what is traditionally viewed as the
objective (A=A) and the subjective (A=B) as this corresponds with numbers
and letters, here in a context of signs and symbols or various patterning
[crypto] ---> [math/linguistics] ---> [signage]
what if, for instance, the context for evaluation of data, pre-encryption,
was based in a combined A=A boundary established by *mathesis*, such that
the signs evaluated and transformed had this larger dimensionality involved
in the initial consideration, versus bounding of the linguistic within the
mathematic, potentially, as a set(subset) relation: mathematic(language)
in this way, equations could be limited, skewed, or bounded by a particular
relativistic interpretation that may assume accuracy due to shared views
yet be based upon or rely upon mistaken assumptions while believed true,
even while signs themselves may exist or persist beyond these boundaries
and be accounted for otherwise, yet not evaluated due to being off-limits
[crypto] ---> [geometry/algebra] ---> [signage]
thus the consideration of signs and collections of signage within crypto
communications and messaging could exist in a calculative context, yet this
could involve both mathematic -and- linguistic computations, by default,
yet in terms of software evaluations may bias a mathematic approach to
establishing equations and algorithms in terms of numbers and not letters
due to convention and an inherited mindset for what parameters exist and
how computation takes place, at the level of pattern recognition of signs
yet not of the underlying truth these signs map to and reference, and in
this disconnection, the potential for a representational short-circuiting
between what is represented and calculated and what is actually real, true.
and thus ungrounded observation and computation, as this leads to relations
and crypto exchange that is insecure by design, versus a model that is
empirically grounded and error-corrected and constant under evaluation in
terms of its truth, including that of its content, the signs it involves
[crypto] ---> [linguistic] ---> [signage]
it is in this conflicted condition that the linguistic evaluation of signs
can establish a foundation for truth via the de|con-struction of signs into
their more elemental armatures. and this evaluation can occur in terms of
various structures, such as nouns or verbs, or sentence tree diagrams, or
hundreds of other approaches to evaluate how language is structured and how
this maps into meaning and its verification of some perceived truth, though
this could still be at the level of pattern matching of signs, and not of
actual knowledge itself. such that a boundary may exist for mimicry-based
AI versus intuitive computations that are based on a comprehensive model of
grounded empirical knowledge, due to this gap and approach to computation,
say reliance on binary approaches and constraints to force viewpoint, etc
[crypto] ---> linguistic (algebraic/geometric)
so all of this background text is required to establish a given framework
to evaluate a pending alternative conceptualization that considers and
recontextualizes cryptology within a computational context of linguistics,
yet potentially in a territory beyond existing perspective that involves
subsign computations that are not mapped into traditional adjective/noun
and other existing models, yet can likewise potentially be interconnected
with them in various structural entanglements, as patterns collide, form,
and mutate based upon relations and dynamics of sign-al processing.
in other words: why not have algebraic and geometric functions and ~various
operations within the signage itself, instead of at a protected boundary
that limits such computation to a realm of numeracy, for instance. why not
run an algorithm that transforms or relates subsign units,
whether letters or words or sentence or paragraphs or all of these together
(in terms of nested superset-set-subset dynamics), such that the [signage]
is itself transformed, encrypted, versus a secondary wrapper or envelope or
container that "represents" this encryption of plain-text interior content
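a minimal toy contrast between the two boundaries discussed - a wrapper
that treats the message as an opaque container, versus a keyed
transformation applied inside each sign - might look as follows. this is
an illustration of the distinction only, not a secure construction, and it
assumes single-byte (ascii) characters.

```python
def wrapper_encrypt(text, key):
    # conventional boundary: the message is an opaque byte container
    return bytes(b ^ key for b in text.encode("ascii"))

def subsign_transform(text, key):
    # "inside the sign": keyed operation on each character's internal
    # components (here, the two nibbles of its ascii code)
    out = []
    for ch in text:
        hi, lo = ord(ch) >> 4, ord(ch) & 0xF
        hi, lo = (lo + key) & 0xF, (hi + key) & 0xF  # swap + rotate
        out.append(chr((hi << 4) | lo))
    return "".join(out)

def subsign_invert(text, key):
    # undo the swap + rotate, recovering the original components
    out = []
    for ch in text:
        hi, lo = ord(ch) >> 4, ord(ch) & 0xF
        out.append(chr((((lo - key) & 0xF) << 4) | ((hi - key) & 0xF)))
    return "".join(out)
```

the round trip subsign_invert(subsign_transform(m, k), k) == m holds for
any nibble key, which is all the toy is meant to show: the transformation
lives below the unit of the character rather than around the whole text.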
one approach to this, of a vast innumerable many, would be to evaluate the
typographic condition of code itself, as a basis for what is and what can
be ~programmed, in what terms and parameters, based on how tools function
and how the cryptologic and cryptographic situation is conceptualized...
[crypto] ---> [typography] ---> [signage]
in other words the geometry of signs themselves, letters as with numbers
(though to focus on only the former as the primary example) have within
their patterning an implicit structure that graphically relates to other
alphanumeric characters, and thus the unit of measure, whether individual
letter or their combination into words, can become a basis for evaluating
these relational dynamics in terms of shared dimensionality, the shared
scaffolding of logical connection that pre-exists other evaluations or else
informs them and can provide an additional framework to map onto considerations
whereby letters and numbers themselves are entangled in their connectedness
and likeness and unlikeness as patterns, and this is inherently ~variable
such that a letter such as 'y' may relate to the letter 'v' in one context
whereas if rotated may relate to the letter 'h'. this transformation is
inherent in all letters and their combination. such that letters alone may
have properties, though so too words, via ambigrams or other evaluations.
yet the question goes further than this, and into a realm of abstraction
that is perhaps approximate to moving from a question of typography from an
interpretation of fonts and font styles, to that of abstract patterning
that may no longer be legible as a decipherable language, due to the
potential to break apart each letter into subsign units, say a capital
letter L into components: | _
and in this way, how might [code] and geometric calculation exist in such a
transmutational context of alphanumerics that break the model of literacy
or go beyond its existing boundary, into other realms of interpretation.
such that the ascender and descender, mean line, baseline and median, and
arms, spans, bowls, shoulders, counters, and terminals become graphic units
that are potentially computational, if they are standardized and aligned.
and this is what the HIOX model of alphanumerics opens up and allows yet it
could go to a much higher level of resolution given the details of language
and how these sign systems exist across all language, potentially, mapping
into a master symbol that reverse engineers all language in a single view
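one way the 'standardized and aligned' graphic units above could become
computational is a segment encoding: each glyph is a bitmask over the
strokes of a master figure, and relations between glyphs reduce to bitwise
operations. the segment assignments below are invented for this sketch; a
full HIOX-style mapping would require the actual master symbol.

```python
# segment bits of a toy master figure (assignments invented here):
# 1 left vertical, 2 right vertical, 4 top bar, 8 bottom bar,
# 16 middle bar, 32 center vertical, 64 diagonal \, 128 diagonal /
GLYPHS = {
    "H": 1 | 2 | 16,
    "I": 32 | 4 | 8,
    "O": 1 | 2 | 4 | 8,
    "X": 64 | 128,
    "E": 1 | 4 | 8 | 16,
    "Z": 4 | 8 | 128,
}

def shared_segments(a, b):
    """count of strokes two glyphs hold in common"""
    return bin(GLYPHS[a] & GLYPHS[b]).count("1")

# the four name-glyphs overlaid cover every segment of the master figure
MASTER = GLYPHS["H"] | GLYPHS["I"] | GLYPHS["O"] | GLYPHS["X"]
```

in this toy, MASTER lights all eight segments - the sense in which a
single overlaid figure can carry each of its component glyphs - and
shared_segments gives the kind of cross-glyph relation (H and O share
their two verticals, X and O share nothing) that the text gestures at.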
in this way, from [code] to [mastercode] if not many relativistic codes
into a shared framework of a grounded empirical model that is based within
and references the same evaluation of (paradoxical) truth in its pattern
matching. this is ancient stuff, the ideas involved formatting civilization
The Orphic Trilogy, Cabaret, GiGi
Π Ω δ