q: cryptoanalytic processor

brian carroll electromagnetize@gmail.com
Wed Feb 4 16:38:11 PST 2015


2nd observation. a few months ago i realized the close parallel
between what the IoT needs from a new computational paradigm, if it
is to succeed, and the issues of breaking cryptography, wherein the
ideal processing framework could be the one used in cryptanalysis.
and yet i am only imagining how such a processor would likely have to
function in order to successfully approach & crack unknown codes,
which most likely tends towards AI, with human guidance or continuous
interaction and input, revised or new modeling, etc.

strangely, in the mundane realm of the everyday, this seems to relate
to how IoT and home-automation technology exists in a Wi-Fi based
model, at least in the non-integrated, off-the-shelf choices for
consumers, as if everything is plug-n-play via Wi-Fi capability, when
this instead relies upon a continuous broadcast of otherwise
unnecessary radiation (if wired ethernet exists as an option, in
wall) and in general is the least elegant, most pollutive approach to
information exchange, in terms of what is required versus what is
actually needed or useful. in that, to get a refrigerator to link
with the Wi-Fi wireless router and internet connection then requires
a broadband connection that is most suited to audio-visual streams of
information or data, perhaps only useful if appliances and other gear
are either AV-based or covertly audiovisual, where hidden streams of
microphone data or other diagnostics are sent via this broadband,
which otherwise is extreme overkill, say for controlling a Wi-Fi
lightbulb or switch or simple robot.

So the thing is, it seems quite obvious that Wi-Fi as a universal
solution is a dead-end approach, and a failure in terms of strategy:
the kind of small-data sharing required by appliances could be
handled with a less energy-intensive, lower data rate solution, which
tends towards Zigbee [0], built on IEEE 802.15.4, normally used in
distributed sensor networks with limited information exchange (though
plenty to sustain and manage IoT device interactions that are not
multimedia and do not require broadband throughput the way a network
harddrive, NAS, etc. does).
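
(for a rough sense of scale, a sketch in python, with approximate,
order-of-magnitude figures only: a typical sensor or status message is
tens of bytes, which even Zigbee's nominal 250 kbit/s moves in about a
millisecond, so a Wi-Fi link sized for audio-visual streaming sits as
idle overhead nearly all the time.)

    # rough, order-of-magnitude comparison (figures approximate, for scale only)
    payload_bits = 50 * 8            # a ~50-byte sensor/status message
    zigbee_rate = 250_000            # 802.15.4 at 2.4 GHz, nominally ~250 kbit/s
    wifi_rate = 100_000_000          # a nominal ~100 Mbit/s Wi-Fi link

    print(f"zigbee: {payload_bits / zigbee_rate * 1000:.2f} ms on air")
    print(f"wifi:   {payload_bits / wifi_rate * 1000:.4f} ms on air")
    # either link carries the message almost instantly; the point is that
    # wi-fi's capacity (and its always-on radio) is overkill for this load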

Thus it seems quite commonsense to question the 40+ year old model of
data management that Wi-Fi is providing bandwidth to and further
entrenching as a universalized, dumb approach to issues of
interaction and device management, as a group. the entire model is
centered on the PC, its binary processing, its files and folders, and
existing techniques to segment activities into their own sectors and
domains, such that a larger rationale cannot be established beyond
existing ideological constraints: to, say, suddenly have everything
operating together in new terms from bottom-up relations, based on
various game-of-life efficiencies as devices interact, develop
relationships, survive, and optimize in the general model (as if
Spore lifeforms in some sense).

the thing about Zigbee [0], in contrast, or at least it seems, is
that it provides an opportunity for an open framework of relations
between devices (each device akin to a sensor-platform in concept,
perhaps, like the mars Spirit rover or Opportunity rover, or a
refrigerator, hdtv, playstation, lighting, etc). what if certain
parameters involving home automation or device interaction existed
outside the single-device parameters in a business sense, and
likewise industry to industry or sector to sector, such that the
activity of an individual life, currently fragmented across devices
rather than united and combined by the given computational/processing
approach, could be treated differently: a /data potential/ exists
within each device, such that if it were more accurately and openly
modeled in terms of device functionality, devices could perhaps
interact in other parameters and dimensions, in a micro-data sense,
where the parts benefit the whole and together function in meta-ways
that cannot be determined by a single device and instead require a
new system and a new way of considering these interactions.

and from my view it seems the limit to this is the PC model itself as
a hub, because people usually turn their computers off, and the
personal computer is supposedly the one processing this
meta-condition as the hub, via always-on, maximal Wi-Fi radiation
that exists to support these minimal benefits and interactions at the
cost of, well, unnecessary electromagnetic pollution at the least,
even though what is actually going on, or should be going on, with
device interactions does not require broadband data rates at all.

so this leads to the presumption of a computer server in the home
(for me i once thought this would be Apple's Xserve [1]), where a
utility closet or server room would be dedicated to household
computing, people would host their own websites, etc. i did not
realize at that time the larger dynamics and limits involved, which
today are quite evidently neither secure nor effective in terms of
data management, beyond an inherent bureaucratic inefficiency which
would lead to home sysadmins visiting domiciles as if plumbers,
though probably on a weekly basis, to keep the messy works functional
or at least to keep up the appearance that everything is wound up and
functioning correctly, instead of this being an impossible-to-maintain
state given the existing approaches/system.

in other words this is to question how potentially novel information
is handled: a realm of 'unknowns' occurring at a meta-level between
various devices, in which some parameters are accounted for or
modeled effectively and others sit in a gray area of consideration,
yet should not be thrown out and instead analyzed or parsed at the
level of patterns. the PC-hub system today is the opposite of this
approach, requiring a preexisting rationalization for everything to
fit into, flattened down into a realm of binary code which throws out
these unknowns. this is made structural in software solutions that
streamline this kind of ideological processing into compartmentalized
views, most notably the transformation of the computer program into
apps, which in some ways is similar to moving from city planning at a
larger scale (major desktop programs) to individual houses (apps)
that are of endless, repeating yet altered variety, as housing stock:
less comprehensive in totality yet more comprehensive in tiny
details.

of the latter approach, there is the now evident realization that a
majority of apps are based on reinventing the wheel, taking something
and tweaking it and calling it something new or something else, while
the main idea degrades through endless copies of copies. and in this
way also, there is the improbability of moving from this decentered
app-landscape, or 'distributed and disconnected computational
environment', into a city-planning or development model where the
various intricacies are united at a larger scale and made coherent,
if this is not their origin or purpose to begin with. And it seems
the consumer-version of the IoT begins here, in this fragmentation
and ideological adherence: that all that is needed are enough apps
and Wi-Fi and the IoT will just start functioning as if by magic,
with various efficiencies gained simply by plugging in and using the
devices in a backwards, inefficient, and ineffective way that
benefits those making these claims financially, by extending their
models as if shared standards, minus questions of data and the
substance of knowledge in relation to information transmission, and
truth itself as part of this development process.

so it seems obvious then that the computer model is off. not just
off, way way way off, as in fundamentally unrealistic, and those who
are pursuing this personal adventure are involved in folly, have
money to burn, and exist inside bubbles without any accountability
outside self-same ideological tenets and beliefs, drip-fed or
spoon-fed from Silicon Valley weathervanes pointing digital, where
apps apps apps are the best-practice solution for anything and
everything, so long as it makes money for technocrats.

the big issue with the PC and existing computer architecture and
processing appears to be that of predetermination: that the existing
model is correct and is the basis for decision-making, that it cannot
be questioned and instead functions as a given or assumption, and
thus provides the strategy and options to choose from, even if they
are subpar or even backwards and regressive. thus determinism is at
work, as ideology, whereby solutions are based on preexisting answers
and approaches that have worked in the past and are now being further
extended, and yet cannot extend, cannot improve beyond a certain
preexisting limitation. this should be obvious to those who grew up
through the first decades of consumer computing, when Radio Shack
sold its own version of the same, the Tandy computer, which is
essentially the same device in use today, whatever the various
branding is. not much has changed under the hood in the way things
are done, at a consumer-interface level, with how information is
processed as data, in that "ideas" do not exist within computers,
only various rationalizations or solutions for specific tasks or
frameworks that are limited, none of which are comprehensive. the
closest to this would be the OS as a generic space for potential
processing, and yet even within this space of the desktop or tablet,
the larger conceptualization of the IoT cannot fit: it becomes a
file & folder issue, a subprogram placed lower in the hierarchy, in
some significant sense, than activities of lesser not greater
importance.

in other words, the binary-based black-and-white processing is also
in the OS approach, beyond parsing bits, and enters into the way of
managing ideas as data, such that there is no place for knowledge or
the organization of ideas or information at a general, abstract level
within these machines, when what is needed is for each person's life
to be modeled at the scale of a world or universe and not as a city
plan or various combined housing stock by outside developers or app
builders. cosmology is needed, not comixology. and yet it is just not
possible in the given scenario. there is no there there, nor within
the internet front-end, for such interaction.

and so if people and these machines cannot correlate and relate and
become coherent, how can device-to-device interactions be modeled
effectively in their various optional dynamics? it is a question that
exists beyond limited partial rationalizations, where the part cannot
approach or model the whole or substitute for it, and any attempts
lead to folly, indicating a threshold condition has been reached and
that the model is outdated, retrograde, if not fundamentally unreal
for the task encountered.

so it seems quite obvious then that an always-on home computer
infrastructure is needed that could process ambiguous data and make
new discoveries of relations and interactions in an open data model
based on patterns: how does this relate to that, within these
parameters, in these looping sequences and frameworks. parsing at the
symbolic level of data, not as binary bits, and instead in a
different conception of processing and computation itself, where
'variables' are retained and not thrown out, akin to holding onto all
the junk DNA in case it may have relevance or value in heuristic
modeling and reevaluation. the data that is processed exists in a
realm of hypothesis and, critically, -- logical reasoning -- and is
not treated as a rationalization, where the PC awaits its
already-known correct answer, which it correlates with and provides
output for, determined by software developers who are not wise about
everything in the world yet have the weird prospect of modeling
everything as if this is indeed their task, which then turns into a
form of inaccurate translation or representation that is wildly
off-course from how the world actually works: the nature of reality,
social interactions, knowledge, basically everything that is not in a
programmer's cookbook of ready-made solutions, yet on the interior it
appears as if comprehensive modeling and control over environments,
to those ideologically aligned.
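
one way to picture 'retaining variables instead of throwing them out'
is three-valued evaluation, where a condition can come back true,
false, or unknown, and the unknowns are carried along rather than
forced into 0/1; a minimal sketch in python (the names and the
household example are hypothetical, just to illustrate the logic):

    # minimal sketch of three-valued (true / false / unknown) evaluation,
    # where 'unknown' is retained rather than forced into a binary answer.
    from enum import Enum

    class TV(Enum):
        FALSE = 0
        UNKNOWN = 1
        TRUE = 2

    def tv_and(a: TV, b: TV) -> TV:
        # kleene-style conjunction: FALSE dominates, UNKNOWN is preserved
        if a is TV.FALSE or b is TV.FALSE:
            return TV.FALSE
        if a is TV.TRUE and b is TV.TRUE:
            return TV.TRUE
        return TV.UNKNOWN

    # a fact the system has not observed stays UNKNOWN and is carried along
    occupied = TV.UNKNOWN        # is anyone home? not yet determined
    after_sunset = TV.TRUE       # this one is known
    print(tv_and(occupied, after_sunset))   # -> TV.UNKNOWN, retained for later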

the Internet of Things is a joke in the existing scenario, a
boondoggle to pursue and participate in within the existing modeling,
for it requires a new foundation of computing, a new way of
processing data that is not possible within existing equipment,
regimes of computation, and computer architecture. it requires an
understanding of thinking, reasoning, and decision-making beyond bits
and 2-value logic, where making a choice is equated with being right
or wrong. a bit too simplistic for what is going on today, this
absolutist relativism. it requires massive questions about the nature
of computers, infrastructure, systems, feedback, diagnostics, the
entire realm of existing approaches interrogated, perhaps for the
first time, so as to retain the things that are effective and place
them into a new, paradigmatically-different and improved framework.

to give some idea of the challenge: while today it would be possible
to swap out Wi-Fi for Zigbee wireless, to allow minimal data
interactions, in the example of LED lighting, say with Hue lights and
a dual Zigbee/Wi-Fi router to manage them [2][3], this still
potentially relies upon and requires a certain approach to computing
that becomes a limit for how 'shared systems' can relate and within
what particular parameters, based on OS management and, again,
rationalizations within software that require choices or decisions to
be made about functionality. at some point it may be like scripting
the home environment, the triggering of various effects based on
noted cause scenarios (IFTTT), though under the control of the user,
which may be fine for the present day, yet at the meta-level
everything needs to be decided upon.

an example of this information-overload conundrum is easy to notice
with the connected home and interfaces that script conditions for the
automated shades to rise and lights to dim and TV to descend from the
ceiling: so long as those conditions are matched exactly and can be
predetermined accurately, they succeed, though life is so much more
complex than a repeated pattern occurring the same over and over, day
to day, year to year. without that computational leeway ('ambiguous
reasoning'), the system forces fixed patterns onto a schedule or
flow, becomes rigid, prescribes or proscribes solutions, or seeks to
determine events based on inadequate approaches to
processing/evaluation, which then dumb down environments;
efficiencies are lost at the same time technical efficiencies are
gained, in part, and this is due to binary computation and data
modeling in insufficient frameworks and non-AI-based approaches that
cannot grow beyond these limitations or scale or integrate with other
activities that do not share the same dynamics, or if they do, they
offer a false view of the issues involved. as if turning the washing
machine on from the car on the highway is the question and priority,
versus its spewing toxic VOC (volatile organic compound) fumes [4]
into the atmosphere, which should prompt a global policy requiring
scrubbers for all new laundry rooms and machinery, to capture these
neurotoxins prior to their release into the air affecting people,
environments, and wildlife, though this goes unmentioned alongside
global warming and carbon emissions (outside the ideological data
model), yet it is on par with pollution in 18th c. England with
blackened, soot-filled skies from the first industrial revolution
[5], which begat policies to recover nature and civility, versus
ignoring it (as with the unnecessary gross output of Wi-Fi without
its use, and a data rate disproportionate to what the connected
devices actually need, which could instead be turned off, not
radiating just to blast radiation).
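
the gap between exact-match scripting and that 'computational leeway'
can be made concrete with a small contrast in python; the thresholds,
sensor values, and trigger names here are hypothetical, not any
product's API:

    # hypothetical sketch: an exact-match trigger vs one with tolerance,
    # to illustrate why rigid conditions fail when life drifts off-pattern.
    def exact_trigger(time_hhmm: str, lux: int) -> bool:
        # fires only if the day matches the script to the letter
        return time_hhmm == "18:30" and lux == 120

    def tolerant_trigger(time_minutes: int, lux: int) -> bool:
        # fires across a window of 'close enough' conditions
        evening = abs(time_minutes - (18 * 60 + 30)) <= 45
        dim = lux < 200
        return evening and dim

    print(exact_trigger("18:37", 95))            # False: the day drifted off-script
    print(tolerant_trigger(18 * 60 + 37, 95))    # True: the leeway absorbs the drift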

so priorities are backwards, as are goals and strategy, if working
from the needs of truth and ideas outward, versus ideology inward.
and this is, oddly enough, correlated with crypto, or so it seems.
because it would appear an always-on, centralized computer processor
in the home would be needed to parse the always-on IoT micro-data of
various interconnected devices and events queuing and chiming in,
near and far, in a comprehensive data model. and if unknowns are
involved, this server-like device would need to be able to manage a
realm of unknown variables that loop about as questions or
hypotheses, as if lifeforms or partial patterns that have yet to find
their constellation match and thus only partly process or 'compute'
in the model, yet are included or accounted for as such and not
thrown out as data. so an entire life of data (what is in the
refrigerator, what products a person has, their warranties, need for
resupply, various contacts and scheduling, analysis of routine
patterns, tasks, etc) is then being continuously juggled in what
would be a custom framework per person per environment (as
world/cosmos), which could in some sense function as clockwork, or
fuzzy clockwork at best, if not also as circuitry (the circuitry of
self, circuitry of work, circuitry of relations, etc). and so at some
extreme level of coherence all of this combines into a general model
of a life and its functioning, and what is proposed is that this
approach can neither be contained within a 1980s file & folder
framework, nor within binary processing of these events, nor within
an ideological solution provided by a developer of said existing
tools. instead, an always-running process would be required to allow
for this flow of data, this interaction and placement of a person
within it, accurately represented, and would require both AI and
human interaction with the modeling, to help build, establish, and
falsify data-model hypotheses, to optimize these relations. in that
way, the fuzziness of patterns and routines could then inform IoT
person-to-device and device-to-device relations through this larger
comprehensive framework as the context, which is instead absent,
completely missing, because it cannot exist in yesterday's approach.

so the thing is, how do you process an incoming stream of potentially
unknowable data and yet keep it around and relevant? one day i was
thinking about this scenario above, which i call the Data Furnace,
where a server-like machine is in a dwelling and controls the
lighting, media, networking, and computing, and where peripherals tap
into this centralized resource, which syncs all the data (as if a
grandfather clock, perhaps, symbolically), such that the modem and
router disappear into the utility closet and are integrated into the
Data Furnace rack, much like home theater server-systems or actually,
most like modular stereo systems as a conceptual basis for modular
computing of the future, where units can be connected together, say a
UPS backup power module, a fiber modem, and a musical sampler that
provides lightspeed sound data to a musical keyboard interface or
wirelessly to others. the technology is centralized, on the model of
the mainframe and timesharing, where dumb terminals are the most
efficient approach to coherence amongst devices, and information then
lives in one place, not in many places at the same time in slightly
different forms.
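
to make the modular-rack idea slightly more concrete, a sketch of how
such a data furnace might describe its own slot-in modules; everything
here (module names, fields) is hypothetical, in the spirit of
component stereo systems rather than any existing product:

    # hypothetical sketch of a modular 'data furnace' rack manifest: each
    # module is a slot-in unit that registers what it provides to the
    # shared, always-on household core.
    from dataclasses import dataclass

    @dataclass
    class Module:
        name: str
        provides: list[str]
        always_on: bool = True

    rack = [
        Module("ups_backup", ["power"]),
        Module("fiber_modem", ["wan"]),
        Module("zigbee_bridge", ["device_mesh"]),   # lights, sensors, appliances
        Module("media_store", ["av_streams"], always_on=False),
        Module("pattern_engine", ["hypotheses", "scheduling"]),
    ]

    for m in rack:
        status = "always-on" if m.always_on else "on-demand"
        print(f"{m.name:15s} {status:9s} provides: {', '.join(m.provides)}")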

so if the PC of today is based on having everything 'known' in
advance of its processing, and having ready-made solutions available
for how to address situations via predetermined rationalizations,
then what the data furnace approach requires is a way of continual
questioning, an openness to information, a going beyond existing
models and a challenging of its own conceptualization, as well as the
ability to grow, transform, and integrate data and various models and
frameworks at a higher meta-level, as this relates to an individual
though also to larger patterns. this could function equally well as a
data-model and framework for a business, government, organization, or
their combination, in that local patterns may be evaluated en masse
and reveal unique data at another level, useful for planning or
business strategy. it is about intelligence, having machines that
support thinking and themselves provide an accurate worldview that
structurally connects to other 'integral and differential' patterns
and processes.

and so one day, thinking about crypto, it dawned on me that the NSA
and government likely have already developed a solution for this kind
of processing, albeit within a realm of high secrecy. or i would be
willing to bet that such a capacity exists, one that would be
directly transferable as a technique, as the basis for this data
furnace, without risking any of the cryptological issues involved. in
that there appear to be two basic routes to cracking crypto via
computers today, or so it seems. the first would be to have knowledge
of the task, and to know the encryption scheme in advance, so that
when a stream of data is captured and analyzed it is done within an
existing framework or 'known rationalization', much like today's
computers, whereby speed and accuracy then allow a code to be broken
due to advance knowledge of the code or secrets about the code that
allow its compromise via these or other technical methods. thus
ordinary computation that exists in a realm of extreme performance
could crack crypto that is in some sense ordinary or usual, and
billions of trillions of calculations a second could potentially deal
with everything that occupies a given realm of parameters and known
crypto approaches. so mainstream crypto would be this, presumably,
and either already broken or breakable.
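
the first route, knowing the scheme in advance and throwing raw speed
at the key space, can be illustrated with a toy sketch; the cipher
here is a single-byte XOR and the plausibility test is crude, nothing
like real-world cryptography, purely to show the 'known
rationalization' shape of the search:

    # toy illustration of the first route: the scheme is known in advance
    # (a single-byte XOR cipher), so 'cryptanalysis' reduces to brute-force
    # search over the key space plus a plausibility test on the output.
    def xor_decrypt(ciphertext: bytes, key: int) -> bytes:
        return bytes(b ^ key for b in ciphertext)

    def looks_like_text(candidate: bytes) -> bool:
        # crude known-rationalization: printable ascii containing spaces
        return all(32 <= b < 127 for b in candidate) and b' ' in candidate

    def brute_force(ciphertext: bytes):
        # exhaustive search over the entire (tiny) key space
        for key in range(256):
            plaintext = xor_decrypt(ciphertext, key)
            if looks_like_text(plaintext):
                yield key, plaintext

    if __name__ == "__main__":
        secret = bytes(b ^ 0x42 for b in b"meet at the data furnace")
        for key, guess in brute_force(secret):
            print(hex(key), guess)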

though another likely possibility exists for cryptanalytic computers,
which is that the data stream is unknown and unknowable in the given
existing scenarios. so how would a machine approach that situation,
whether what is analyzed is binary data or data that is non-binary,
say patterns or traces within a series of 20 books that are
referenced in an article believed to be used to pass secret data?
entire collections of libraries (of books referencing books) and data
from content experts (papers, lectures, whitepapers) would be needed,
and also mathematical and linguistic analysis ranging from geometry
to typography and translation and etymology, all of which either
could be addressed by a network of such people and distributed
experts, or automated and partially developed in a hypothetical
framework via hybrid computer-and-human modeling, which may or may
not accurately approach the code in question, if it indeed is the
encrypted code.
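
one small, automatable slice of that kind of analysis might be
cross-referencing which works appear together across a corpus, to
surface unusual pairings for human review; a minimal sketch, with the
corpus, titles, and scoring entirely hypothetical:

    # hypothetical sketch: count how often pairs of referenced works co-occur
    # across a small corpus of articles, as one crude input to a larger
    # hypothesis-building process (corpus and data here are placeholders).
    from collections import Counter
    from itertools import combinations

    corpus = {
        "article_a": ["book_01", "book_07", "book_19"],
        "article_b": ["book_07", "book_19", "book_03"],
        "article_c": ["book_02", "book_07"],
    }

    co_occurrence = Counter()
    for references in corpus.values():
        for pair in combinations(sorted(set(references)), 2):
            co_occurrence[pair] += 1

    # unusually frequent pairings become candidate 'traces' for human review
    for pair, count in co_occurrence.most_common(3):
        print(pair, count)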

my presumption is that a computer exists, a processor, that is
capable of taking information in, whatever it is, and analyzing it
via patterns within various frameworks, models, parameters, and
approaches, seeking out structure and relations that establish or
build some meta-conceptualization based on what is available to parse
in whatever dimensions are accounted for and correctly analyzed. in
other words, identifying and managing and combining and
distinguishing and relating patterns in ways that are akin to
reasoning, within hypothetical models that may only partially match
the situation, and then could be further evaluated, expanded, or
refined, though continually tested and retested again, as unanswered
questions that are at most only partially answerable, awaiting or
given more information that either contradicts or expands or
solidifies the modeling (tending towards 1, in terms of probable
accurate interpretation). if such a computational engine could be, as
a defense technology, transplanted into a home environment (and
business, and government) as a new foundation for 'reasoned
computation' that is grounded in truth, and reliant upon it, then the
various ambiguous data streams could begin to automatically build out
and build up their own interrelations specific to given situations,
scenarios, environments, and individual and interacting people,
whereby the rulesets are not prefigured or predetermined and instead
evolve, as data-models, within paradoxical frameworks, where logic is
not just on/off and instead, like sampling, of variable resolution
(N-value logic), though beginning in the gray area (3-value), where
the processing exists in a neutral middle realm of questioning, and
remains there, tending towards 1, away from 0 (false modeling), as
time moves in the connected systems, near and far, as patterns
function as gears and together as clockwork, though as information
tied to matter and energy, not separated from it and held apart,
segmented into other unrelated domains. this integrated parsing of
events and activity is then what distributed sensor networks
potentially involve and operate within, as with the ambient
intelligence of nature as a realm of processing, as patterns develop,
change, transform, die, begin anew.
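
a minimal sketch of what 'beginning in the gray area and tending
towards 1, away from 0' could look like for a single retained
hypothesis, assuming a simple evidence-weighted update (the rule, the
weight, and the example statement are only illustrative):

    # minimal, illustrative sketch: a hypothesis starts in the gray area
    # (0.5, i.e. 'unknown' in a 3-value sense) and is nudged towards 1
    # (supported) or 0 (falsified) as evidence arrives, never thrown out.
    from dataclasses import dataclass

    @dataclass
    class Hypothesis:
        statement: str
        confidence: float = 0.5  # neutral middle / gray area

        def update(self, supports: bool, weight: float = 0.2) -> None:
            # move a fraction of the remaining distance towards 1 or 0
            target = 1.0 if supports else 0.0
            self.confidence += weight * (target - self.confidence)

    h = Hypothesis("the household wakes around 6:30 on weekdays")
    for observation in [True, True, False, True]:
        h.update(observation)
    print(round(h.confidence, 3))  # stays in (0, 1): retained, not discarded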

perhaps those who know how these cryptanalytic processors actually
work could correct the above assumptions, if inaccurate, or provide
more information about the nature of the challenges involved. from my
view such technology is required for the IoT to serve humans, whereas
the existing binary equipment is the sure-fire path to human
enslavement via idiotic ideological technology that seeks to rule and
manage over other lives via conformance to authority-based obedience
to inaccurate models, wrong beliefs, and misplaced authority, in the
guise and under the shield of being cool, trendy, up-to-date, when
this is backwards, mind-numbingly stupid, and beneath the ability to
even reason about, because 'ideas' are not involved beyond private
interests exploiting for profit.


0. http://en.wikipedia.org/wiki/ZigBee
1. http://en.wikipedia.org/wiki/Xserve
2. https://www.pinterest.com/pin/566538828100249725/
3. https://www.pinterest.com/pin/566538828100255612/
4. http://en.wikipedia.org/wiki/Volatile_organic_compound
5. http://www.eh-resources.org/timeline/timeline_industrial.html


