NSA data centre power surges & unknowns...

brian carroll electromagnetize at gmail.com
Fri Oct 11 22:02:42 PDT 2013


thanks to everyone for sharing their views regarding my forward-leaning
speculation about the NSA facility...

two additional observations:

1) if the NSA Utah data centre is not involved in quantum computing in any
way, and instead tracks the larger society in its use of existing tools and
technologies manufactured and standardized in the mainstream, it is still
curious to what extent advanced non-quantum computing technologies could
exist in such a scenario, yet remain secret and off-limits.

this is to consider the basic technology research and scientific advances
that regularly appear on physics news sites: a university or corporate RD&D
lab achieves a new milestone, and the improvement may reach market in the
next 10-20 years. between that frontier of development and the present
there is a gap, an apparently 80s computing paradigm largely unchanged,
perhaps mostly in terms of the OS itself and how data is managed on these
electronic filing-cabinet devices that collectively become the worldwide
junk drawer (aka Internet and WWW), the epitome of "bureaucracy" as
networked furniture.

for instance, what if the facade of a computing server in such an
installation conceals hidden or advanced technology that changes the
equations for how data is processed. 'spintronics' is one such area, and
various approaches to memory are others; built out at scale they could have
larger combined effects, shifting the equations or the methodology and
techniques employed to discern and sift through data.

and thus it is still wondered, in the gap where research is essentially
"bounded" -- inside a secret government context on one side, outside the
public, commercial mainstream on the other -- what may actually be going on
in the inner workings of the surveillance society and its capacity to
process vast amounts of data. that capacity would seemingly require more
'organizational capacity' than brute-force computing alone-- meaning the
capacity to compute "unknowns" in a running, looping equation... which to
me appears to be what serial processors are not good at; they seemingly
must arrive at an answer. and even with parallelism across serial
(non-quantum) processors, from my truly naive, limited, and perhaps
mistaken viewpoint, such a task would best be served by quantum nonlinear
or multilinear processing across vast data sets that may have no known
correlations to start with, correlations that instead must be built up and
tested against models. can equations or algorithms do what the hardware
cannot, given its logic-based, biased processing via digitalism, or might
other transistors and other ways of balancing computation, weighted more
toward analog looping means, exist by which to 'naturally approach' the
situation from the big picture down to local instances, versus modeling
outward from the local, via rationalization.

that is, in conceptual terms: what is the likelihood that coherence could
be decided upon or determined by a software framework alone, versus having
additional capacity in advanced hardware components to assist and allow for
such nonlinear, massively dynamic (ecological) global computation across
all interconnected sets. such data sets could shrink the weather-modeling
example for supercomputing to a minor issue by comparison, given a
potentially wider range of knowns that are relationally unknown in their
effects, pending empirical evaluation, and the massive floating unknown
that may never be computed beyond a vague, nebulous threat-condition: some
threshold that suddenly grounds in the modeling and triggers alerts or
forecasts based on statistics, as this relates to recorded data, live
surveillance, the entirety of it, in situ.

i think what would be impossible about a traditional computing paradigm in
this situation is the incapacity of 'knowing' what to model, given the vast
amounts of data and their instability: limits to what can be known and
understood, versus what is probabilistic and must be reevaluated again and
again, across perhaps tens of thousands of such equations in various
computations, plus the 'meta-algorithm' that may govern the whole
situation-- is it a human judgment that decides how this data is
interpreted, the situation managed as if querying an office computer- or
may 'computational reasoning' exist in the form of AI running models, both
theories and competing hypotheses - in parallel - built of past events and
future statistical graphs, potentially revealing *constellations* or
patterns that begin to match a previous condition or sequence, such that
the computer determines the direction of what is tapped based on its
inferences and its deductive and inductive, ~intuitive reasoning.
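
(a naive sketch of that 'competing hypotheses in parallel' idea, in python;
the hypotheses and the event stream are invented purely for illustration,
not anyone's actual system -- each observation re-weights every hypothesis
at once instead of forcing a single yes/no answer up front:)

  import math

  # each hypothesis assigns a probability to the next observed event
  # ('quiet' or 'anomaly' here, purely for illustration)
  hypotheses = {
      "baseline":       {"quiet": 0.95, "anomaly": 0.05},
      "slow_buildup":   {"quiet": 0.80, "anomaly": 0.20},
      "active_pattern": {"quiet": 0.50, "anomaly": 0.50},
  }
  # start with equal weight on every hypothesis
  log_weight = {name: math.log(1.0 / len(hypotheses)) for name in hypotheses}

  def update(event):
      """bayesian update: every hypothesis is re-scored in parallel."""
      global log_weight
      for name, model in hypotheses.items():
          log_weight[name] += math.log(model[event])
      # renormalize so the weights stay a probability distribution
      total = math.log(sum(math.exp(w) for w in log_weight.values()))
      log_weight = {n: w - total for n, w in log_weight.items()}

  for event in ["quiet", "quiet", "anomaly", "anomaly", "anomaly"]:
      update(event)

  for name, w in sorted(log_weight.items(), key=lambda kv: -kv[1]):
      print(f"{name:>15s}: weight {math.exp(w):.3f}")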

now, i totally believe it is technologically possible to do this-- just not
with today's limited equipment and software paradigm, not a chance. that
paradigm is too heavily biased in both processing and computer architecture
to allow for open-ended questioning. everything must be rationalized in
advance; that is how the logic gates work, built from transistor switches
(PNP and NPN in bipolar logic, or their CMOS equivalents). there is no
'unknown' state within the circuitry, and whatever may be achieved via
software would be unnatural to the hardware, bounded in its capacity,
functioning against the direction of current, as if an aberration. or thus
is the foolish explanation i conjure from the abstract: that it is going
against the grain of electrical current, essentially.

it is like telling someone entirely invested in a binary mindset to
consider a paradoxical situation: for its resolution it must be bounded
within that worldview and made to fit within its sampling (1/0), else the
model or worldview fails. the paradox would need to fit within the bit and
not exist beyond or outside it, else it would not exist or be real.

so how can an entire paradoxical situation, actively parsed, exist in a
running model of 'rationalization' in computational terms, if the charge
itself is biased to an on/off state for any given consideration, when the
majority of considerations likely remain gray-area computations,
empirically unknown except within a sliding-scale, analog-like weighting
that shifts the modeling. perhaps parallel software could "model" this, yet
how effectively, if such data can only be run one way and must be restarted
and recalculated, versus allowing the data to live in ambiguity and
continuously compute itself. a realm too highly constrained by binary
modeling, determined by it, would bound any such computation to only what
the technology allows in its bias, and thus limit how much can be
questioned, asked, and allowed to remain unknown, ambiguous, and computed
that way.
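
(a minimal sketch of that 'sliding scale' alternative, assuming nothing
beyond ordinary python -- a belief value is nudged continuously by incoming
evidence, never forced to 1/0 and never restarted; all the numbers are
made up:)

  import math

  def to_probability(log_odds):
      return 1.0 / (1.0 + math.exp(-log_odds))

  log_odds = 0.0   # 0.0 means maximally ambiguous (probability 0.5)
  DECAY = 0.98     # older evidence slowly fades instead of being recomputed

  # invented evidence stream: positive supports the hypothesis,
  # negative counts against it, magnitude is strength
  evidence_stream = [0.4, -0.1, 0.0, 0.7, 0.2, -0.3, 0.5]

  for strength in evidence_stream:
      log_odds = DECAY * log_odds + strength
      p = to_probability(log_odds)
      label = "unknown" if 0.3 < p < 0.7 else ("likely" if p >= 0.7 else "unlikely")
      print(f"evidence {strength:+.1f} -> belief {p:.2f} ({label})")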

is it not possible that stuxnet-like effects could arise from running a
continuous brute-force calculation against a massively changing real-time
model: that it could essentially burn out the serial processors linked in
parallel via networking, keeping them at peak rates until they overheat, or
at least limit what kinds of data could be processed and at what fidelity.
and could such fidelity with the world of real-time surveillance and state
threat modeling, in its vastness and nth-degree unknowns, really run within
the parameters of petaflops without considering all the other issues that
limit such computation, such as how the processing itself disallows certain
considerations that nonetheless appear solved or actively processed
-somehow-. to me the explanation is that there is a highly advanced system
that is naturally capable, not by brute force or by chess-computer, linear,
one-point perspective, but by utilizing knowledge of nature and how quantum
mechanics relates directly to paradox. in this context it is probable that
only a quantum computer could do this level of computation, integrative and
a basis for threat analysis, albeit perhaps guided by a too-simple binary
interface or outlook or way of managing the situation, due to lack of
previous experience for guidance.

that is: it is likely impossible for existing technology to carry the
computational and data load, in terms of its organization and coherence, as
a computing model of N dimensions or interrelated degrees. it likely does
not involve a simple script or algorithm that rationalizes the entirety of
what is going on; instead, the result would likely emerge as a pattern from
within the churning cauldron of data, well beyond meta-data, which is only
one ingredient among vast many others spanning any relevant archivable,
referenceable 'known' form that could establish a model from any given
perspective. it would thus be evaluated in terms of computer reasoning
versus human reasoning, gauging insight on correlations beyond the human
limit of considering shared sets, versus what such computing power could
compare and consider (and thus, perhaps the AI develops its own algorithms
and has its own working hypotheses- yet this would inherently require
grounding with the hardware, hardware that allows this and does not involve
translation or other lossy middle-management or mediation that skews,
warps, or limits the running iterative computation or nonlinear
computational analysis).

so while a step can be taken back from the assumption of a fielded quantum
computer in a seemingly mundane, gargantuan data centre/warehouse, what is
occurring within the existing surveillance regime does not realistically
appear to be grounded in such a facility either. it does not add up: the
capacity to store records versus the capacity to make judgments based on
such data-- and not only that data, but everything known and modeled about
the issues and the society, its economic, social, and political dimensions,
locally and globally, as it fits into a whole model and local instances.
where might that computation occur? my view is that, in terms of
efficiency, it is impossible for this to be occurring outside a quantum
context.

lacking this capacity would mean constantly fighting the equipment and
being limited in approaching 'questioning' itself, though that is also the
majority situation. it is an entirely different approach from the internet,
desktop, and workstation equipment of today. it is beyond the binary.

there is no middle ground to bet on, either: either the government has a
viable working model of the state in all of its dimensions, or it does not
have this capacity. there is every indication it does have this today, and
yet meta-data is like a binary worldview of the existing situation, too
limited to get at what boundaries are involved in terms of citizens and law
and representation and rules, including for those in government. if it does
involve a realm of meta-computing, and yet computing is the limit of what
can be reasoned, then as with physics and metaphysics, it is in that gap
that reality is lost: another world exists outside the first, and could
even be beyond accountability as a result of plausible deniability.

it is implausible that a non-quantum computer could be modeling state data
in its ubiquity as a running simulation, given existing binary technology.

2) Google and NASA just released a video of their quantum computer, which
is actually an installation of a quantum processor whose 'number crunching'
will apparently help optimize algorithms for the Google Glass blink apps.
a question encountered in this application of the D-Wave quantum chip is
what to do with the technology-- effectively a new kind of calculator, it
seems, given the approach.

A first look inside Google's futuristic quantum lab // video
http://www.theverge.com/2013/10/10/4824026/a-first-look-inside-googles-secretive-quantum-lab

in other words, it is not a full 'quantum computer' that is installed, and
there are no other quantum-specific devices connected to it; it seems the
pins of the chip lead more or less directly to software that runs queries,
all sustained within a sealed, supercold environment to allow it to operate.

so it is more like a micro-controller or integrated circuit than a
multi-use processor, in the sense that it sits outside a motherboard
context or larger connected circuitry, or so it appears from my naive
account.
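
(to make the 'new kind of calculator' point concrete: a machine in the
D-Wave style is handed minimization problems over binary variables, a QUBO.
below is a toy sketch that states such a problem and solves it classically
with simulated annealing, purely to show the shape of the problem -- it
says nothing about the actual hardware or its programming interface, and
the coefficients are invented:)

  import math, random

  # Q[(i, j)] couples binary variables i and j; diagonal entries are biases
  Q = {
      (0, 0): -1.0, (1, 1): -1.0, (2, 2): 2.0,
      (0, 1):  2.0, (1, 2): -1.5, (0, 2): 0.5,
  }
  N = 3

  def energy(x):
      """value of the quadratic form for a 0/1 assignment x."""
      return sum(w * x[i] * x[j] for (i, j), w in Q.items())

  def anneal(steps=5000, t_start=2.0, t_end=0.01):
      x = [random.randint(0, 1) for _ in range(N)]
      current = energy(x)
      best, best_e = list(x), current
      for step in range(steps):
          t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
          i = random.randrange(N)
          x[i] ^= 1                                           # propose one bit flip
          proposed = energy(x)
          if proposed <= current or random.random() < math.exp((current - proposed) / t):
              current = proposed                              # accept the move
              if current < best_e:
                  best, best_e = list(x), current
          else:
              x[i] ^= 1                                       # reject: undo the flip
      return best, best_e

  assignment, value = anneal()
  print("lowest-energy assignment found:", assignment, "energy:", value)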

and so there is a disjunction between what data processing today must be
capable of, in terms of processing, and this 'first steps' approach fielded
in the commercial and scientific realm of Google and NASA. it is as if the
computation of the 'state mind' and the situation in the body of the state
are mismatched, a dissonance between what is said and what is being done--
which invites mistrust if not fear, the deity state, etc.

so the question asked in the video is: what can a quantum processor do,
what is it capable of. i tend to imagine it is precisely this realm of the
paradoxical and the N-dimensional, beginning in the gray area with very
limited models and AI, cosmic-soup-like, allowing this data situation to
computationally bubble, bubble, while on the lookout for toil and trouble.

the random number generator (RNG) itself seems most closely aligned with
the paradigm of quantum mechanics -- as a field of superposition and of
potentiality -- and this is where paradoxical unknowns would emerge as
contingent patterns, in terms of their grounded interconnectivity within
the ecologically and dynamically linked empirical models.
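
(in the spirit of 'random event generator' experiments, a naive sketch that
watches a bit stream and flags windows whose counts drift from the 50/50
expectation, using a plain chi-square statistic; the source here is an
ordinary pseudorandom generator standing in for any RNG, nothing quantum
about it:)

  import random

  def chi_square_bits(bits):
      """chi-square statistic for a window of 0/1 bits against a fair coin."""
      n = len(bits)
      ones = sum(bits)
      zeros = n - ones
      expected = n / 2.0
      return ((ones - expected) ** 2 + (zeros - expected) ** 2) / expected

  WINDOW = 1000
  THRESHOLD = 3.84   # roughly the 95th percentile of chi-square, 1 degree of freedom

  stream = [random.getrandbits(1) for _ in range(20 * WINDOW)]   # stand-in source
  for start in range(0, len(stream), WINDOW):
      stat = chi_square_bits(stream[start:start + WINDOW])
      flag = "  <-- drifting from 50/50" if stat > THRESHOLD else ""
      print(f"window at {start:6d}: chi-square {stat:6.2f}{flag}")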

even more the case, or so it is proposed, are random event generators in
relation to spooky action at a distance. if these purposeful RNGs are
somehow linked and/or grounded, as if quantum-tuned sensor boxes even, then
the extrasensory aspect of birds knowing navigation, or animals sensing
earthquakes via uncanny abilities, could also exist naturally within the
computer hardware model at the physical level, if it were sensitive to
various indicators and 'aware', or capable of making sense of the chaotic
input.

humans on the internet who gauge a larger situation via interactions,
without it being voiced, resemble a distributed sensor network: parallel
computation, grounded locally, with a predictive capacity in that something
may not feel right, or a particular momentum may be sensed, and thus serve
as a meridian or guideline or boundary, lines of force like intuitive
calculation. so too with computer processing whose logic is capable of
assessing existing and potential N-dimensional connections and boundaries
while considering the dynamics involved: hunches, intuitions.

for instance, a software program could be written to look at bird-call data
as it relates to air quality, as that relates to news reports and events.
if you know what you are looking for, it can be rationalized in advance--
find these criteria, and in matching those patterns, x=y.

yet what if the computer were able to take any data and consider any event
from a largely unbounded context, so that it could begin to process the
migration of birds in real time against pollution levels as that relates
to, say, periods of heavy traffic due to sporting events in a given sector.
the algorithm would find such correlations, yet not stop there, and keep
looking and evaluating. and perhaps there are 'infinitely more variables
than equations', so quadratic approaches remain beyond the bounds of a
limited viewpoint or rationalization, where the kind of perspective that
views a situation too narrowly to accurately model it then becomes the sign
of the thing it seeks to understand, the model replacing the entity itself.
(note: this as it relates to conservation issues, politics, saving one
species while jeopardizing another; or technology and wildlife and
ecosystems, wind turbines and bird and bat and eagle deaths, etc.)
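
(a naive sketch of that 'keep looking' correlation mining: every pair of
series is scanned at a few time lags and the strongest matches reported.
the series names and numbers are placeholders i invented, and correlation
of course proves nothing about cause:)

  from itertools import combinations

  def pearson(xs, ys):
      n = len(xs)
      mx, my = sum(xs) / n, sum(ys) / n
      cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
      sx = sum((x - mx) ** 2 for x in xs) ** 0.5
      sy = sum((y - my) ** 2 for y in ys) ** 0.5
      return cov / (sx * sy) if sx and sy else 0.0

  series = {   # invented daily measurements, aligned by day
      "bird_migration_count": [12, 15, 30, 45, 44, 20, 11, 13, 35, 50],
      "pm25_pollution":       [40, 42, 55, 70, 69, 50, 41, 43, 60, 75],
      "stadium_traffic":      [ 5,  6, 20, 35, 33,  8,  5,  6, 25, 40],
  }

  findings = []
  for (name_a, a), (name_b, b) in combinations(series.items(), 2):
      for lag in range(3):                 # compare b shifted 'lag' days later
          xs, ys = a[:len(a) - lag], b[lag:]
          findings.append((abs(pearson(xs, ys)), name_a, name_b, lag))

  # keep looking: report the strongest matches rather than stopping at the first
  for strength, name_a, name_b, lag in sorted(findings, reverse=True)[:3]:
      print(f"|r| = {strength:.2f}   {name_a} vs {name_b} (lag {lag} days)")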

in a binary approach, the 'good enough' model allows extraneous data to be
ignored or kept out of the analysis -- yet this does not work at scale,
because the bias and warping and skew and distortion only increase with
each further reliance upon inaccuracies in the false framework. you could
not have accuracy at the level of the state via such an approach in a
context of ubiquitous information; there would be a total one-sided
madness. and it does not appear that is actually the case. again, beyond
binarism.

thus highly dimensional modeling may begin with inaccurate models requiring
constant analysis and refinement that only a computer could sustain, given
the vast data relations required. for instance: take all sensor data from
street infrastructure regarding pollution and toxins, all weather, traffic,
news, academic research, epistemological models, social and economic data,
anything of any import that can exist in an equation -- essentially a noise
field -- and then allow whatever potential interconnection exists to be
modeled as a possibility, whether activated as the structural scaffold of a
hypothesis or not. through educational, career, tax, health records, and
other indicators -- across all known statistics viewed as an empirical
framework for pattern-matching past, present, and future events -- the aim
would be to see what stands out in a given moment or in a particular
context or perspective, such that, like extrasensory 'knowing' or intuitive
understanding, the model could suddenly achieve a strange coherence in the
data modeled. this could occur from the ground up, from the particulate to
the whole, and entirely or mostly within a reference-based AI computational
approach, yet it would require the capacity of parallel linked quantum
supercomputers to achieve. there would be no time-sharing on a system like
this; it would always be on and running, and like the birth of a universe
it would grow and grow. while it could be accessed, ultimately it would be
best at modeling these chaotic and unknown dynamics itself -- allowed to
run, compute, and reason autonomously so as to consider the dynamics, while
humans could input models and help evaluate models and assumptions. yet
ultimately its accuracy is not within the 1 and 0 of binary digits as
representations of on/off switches; instead it would be a 1 and 0 mapping
directly to truth, the basis for grounding the software model in the
hardware and the physical world as it relates to matter, energy, and
information. it would not inherently involve an extra layer or distancing
from this, seemingly, an additional language and translation.
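
(a minimal sketch of the 'always on, never restarted' aspect: a correlation
between two feeds is kept up to date one observation at a time with running
sums, so the full history never has to be reprocessed as it grows; the
paired readings are invented:)

  class RunningCorrelation:
      """incremental (Welford-style) mean, variance, and covariance sums."""
      def __init__(self):
          self.n = 0
          self.mean_x = self.mean_y = 0.0
          self.m2_x = self.m2_y = self.c_xy = 0.0

      def update(self, x, y):
          self.n += 1
          dx = x - self.mean_x
          self.mean_x += dx / self.n
          dy = y - self.mean_y
          self.mean_y += dy / self.n
          self.m2_x += dx * (x - self.mean_x)
          self.m2_y += dy * (y - self.mean_y)
          self.c_xy += dx * (y - self.mean_y)

      def correlation(self):
          denom = (self.m2_x * self.m2_y) ** 0.5
          return self.c_xy / denom if denom else 0.0

  # hypothetical paired readings arriving forever; the model just keeps absorbing
  stream = [(10, 100), (12, 108), (9, 95), (15, 120), (14, 118), (11, 103)]
  rc = RunningCorrelation()
  for x, y in stream:
      rc.update(x, y)
      print(f"after {rc.n} observations, running correlation = {rc.correlation():.3f}")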

if allowed, the paradoxical processing of gray-area conditions by a quantum
computer installation could -- in accordance with AI and 3-value and
N-value logical reasoning -- achieve this 'holistic' approach, yet with
analog attributes of shifting parameters and sliding-scale analyses. in
this way, unlike the rigidity of binary 2-value computation that does not
allow truth to exist, the assumption of truth within the modeling of
representational signs (programming, the models themselves, beliefs) would
not become the truth by default of that interaction (true belief); instead,
truth would need to be earned by massive reasoning tied into facts and
physics and human knowledge from the beginning, and not some ideological
bubble-universe that has control over the equipment.
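
(a minimal sketch of 3-value reasoning, here using Kleene's strong logic,
where UNKNOWN is a first-class state instead of something forced into 1 or
0; the 'threat'/'reliable' propositions are just illustrative labels:)

  # encode TRUE/UNKNOWN/FALSE as 1.0/0.5/0.0 so min/max/1-x give Kleene logic
  TRUE, UNKNOWN, FALSE = 1.0, 0.5, 0.0

  def k_not(a):    return 1.0 - a
  def k_and(a, b): return min(a, b)
  def k_or(a, b):  return max(a, b)

  def label(v):    return {1.0: "TRUE", 0.5: "UNKNOWN", 0.0: "FALSE"}[v]

  # example: "target is a threat" AND "evidence is reliable"
  for threat in (TRUE, UNKNOWN, FALSE):
      for reliable in (TRUE, UNKNOWN, FALSE):
          verdict = k_and(threat, reliable)
          print(f"threat={label(threat):7s} reliable={label(reliable):7s}"
                f" -> act on it: {label(verdict)}")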

grounded sensing, in other words. if a linear supercomputer can take input
from every weather station and model weather systems worldwide, given what
is known about cloud formation, wind and temperature and humidity, and
various weather systems and events-- what can it do beyond that same,
seemingly *fixed* boundary, which could involve birds or wildlife or sounds
or resonance, or people whose broken bones ache before a storm, or who
immediately know tornado weather minutes ahead of the warning sirens. if
they are outside the model, that is not computed. yet what if that data
could somehow be connected to, and the software and hardware model do not
allow it, because it is a deterministic approach, rationalized within a
given set of parameters, controlling the knowns by leaving out the
unknowns. what if the unknowns are where the discovery of new knowledge
lies. what if the gathered flight of seagulls indicates a thunderstorm two
days out with a degree of certainty before the computer models do, or at
least could be correlated with such observations and tested as empirical
evidence within a larger hypothesis of integrated systems. and what if this
were only one aspect of one question of a given scenario in a given
discipline, of which there are trillions to consider. the boundaries kill
off the ideas; certainties replace and stand in for uncertainty,
controlling what is interpreted and allowed to be considered within the
modeling, which then limits and bounds knowledge to only what can be known
from a given view, its inaccuracies becoming structural, beyond
questioning, dogma. say, a non-electromagnetic view of the universe as it
relates to weather, and to cloud systems as giant charge-based entities
(electric weather).

The Electric Universe / Electrical Weather
http://www.holoscience.com/wp/synopsis/synopsis-9-electrical-weather/
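
(a naive sketch of testing whether such an out-of-model signal -- the
gathered seagulls two days ahead of a storm -- earns its keep: compare the
Brier score of a baseline storm forecast with and without a hypothetical
'gulls gathering' indicator. every number here is invented; it only shows
the shape of the test:)

  def brier(probs, outcomes):
      """mean squared error of probabilistic forecasts; lower is better."""
      return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

  # 1 = a storm actually occurred two days later, 0 = it did not
  outcomes       = [1, 0, 1, 1, 0, 0, 1, 0]
  baseline_probs = [0.6, 0.4, 0.5, 0.7, 0.3, 0.5, 0.4, 0.2]   # model-only forecast
  gulls_observed = [1,   0,   1,   1,   0,   1,   1,   0]     # the extra signal

  # naive blend: nudge the forecast toward storm when the signal fires
  blended = [min(1.0, p + 0.2) if g else max(0.0, p - 0.1)
             for p, g in zip(baseline_probs, gulls_observed)]

  print("baseline Brier score:", round(brier(baseline_probs, outcomes), 3))
  print("blended  Brier score:", round(brier(blended, outcomes), 3))
  # a lower score after blending would be evidence the signal carries information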

what seems most likely, in my naive estimation, is that the quantum
computer -- in its capacity for paradoxical computation and looping
heuristic methods for meta-modeling across multiple interrelated scales
simultaneously -- suits this ecological condition of reality, and that the
dimensional capacity of massive quantum installations linked in parallel
would allow for this evaluation by default, out of a natural affinity for
the physics involved. this goes beyond just the processor itself and into
its grounding at the macro level with material entanglement, whereby sensor
networks that count microns of chemicals could potentially be remotely
entangled in their data sets, so that a trigger on one variable may affect
others at a distance likewise, in terms of latent superposition
potentialities.

in this way the grounded antenna or ground wire, like a sensor connected to
a proposed LED signaling display influencing or becoming an RNG input,
could act as an autonomic nervous system alongside the brain-based
computational reasoning, whereby ~sensing may remain ambiguous yet also be
highly connected in other hidden or missing or gray-area dimensionality
that could, like junk DNA, recombine in a given instance as a pattern or
allow other frameworks to exist. this seems inherent or unique to the
quantum situation, in that the grounding of the computer would also be a
grounding of the data model in terms of its realism: the information
modeled accurately maps into the world and is harmonized with it, and this
diagnostic evaluation occurs if not also in terms of error correction or
tabulation or computational processing. perhaps there is intelligence
within the current itself, in terms of entanglement, and so perhaps the
entangling of computers, in a monitoring sense of the environment or
various "dimensions", would also be involved in how data is effectively
processed.

this, versus having society serve a flawed computer model, subservient to
it, rather than having the ability to question the model and test it
against the truth of all combined data within the given hypotheses, and the
issue of going against nature. the microns of chemicals cannot simply be
ignored: the poison in the air, toxins everywhere, sick and inhuman
approaches as this relates to ethics and morality. essentially, the society
is schizophrenic and allowed to be this way within the binary ideology and
its computer model-- required even, enforced, yet denied by those who
benefit via one-sidedness, inequality, exploitativeness, the epic swindle
for fake immortal gain.

thus it is proposed that the quantum computer as a device would have this
RNG/REG aspect that relates to grounding data, and that this could connect
with sensors or various other inputs (as peripherals, perhaps).

in this way, a quantum computer installation at massive scale could parse
all traffic cams, all weather info, all news and knowledge, and reference
all books and texts in all languages, building up models within these as
frameworks for evaluating issues of concern-- as tools for state management
and governance, or tyranny. and thus the short-circuit if this were amiss,
or something was off about it: sparking loose ends that need to be pulled,
to find out there is binarism at the helm of such computational power and
that its output is skewed toward the ideological, due to boundaries
retained from a given mindset or too-narrow belief system, etc. this could
likely be expected as a result of not knowing the questions to ask when
faced with the physics involved, in their metaphysical dimensions.

forcing the data versus listening to it. forcing answers via biased views
versus allowing questions to exist, coherence discerned through logical
reasoning. this as it relates to private mankind and public humanity, as
referenced to the US Constitution, or to ignoring and replacing it via its
substitution as a hollowed-out digital sign. the state as empty set. any
tyranny is possible.

♬