Fwd: Public user prediction software?
gmkarl at gmail.com
Mon Apr 26 15:10:02 PDT 2021
On Mon, Apr 26, 2021, 12:21 PM coderman <coderman at protonmail.com> wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA512
> hello Karl!
> my replies below clearsigned as usual,
> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
> On Monday, April 26, 2021 6:42 AM, Karl <gmkarl at gmail.com> wrote:
> > ...
> > Couldn't a user provide their own keystrokes, mouse movements,
> > etc and fine-tune a personalised model on their own system?
> this would simply be personalization, not behavioral prediction.
I do not see personal behavior prediction as a contradiction.
> all of the behavioral prediction models are based on statistics;
> you need a sufficient sample size to make inferences from the data.
> i don't know of any methods that work with a sample size of 1.
> if you look at open source projects for behavioral modeling and
> prediction, you see they all use large datasets to build the models.
> E.g. https://github.com/numenta/nupic.
It looks to me like it would work, with some effort.
If that's not clear to you:
- models trained on huge datasets can be fine-tuned on small datasets, and
- a single user has a lot of statistical data: the sample size is not one,
it is every streaming datapoint they produce.
- many users are likely to be interested in trying it out.
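To make the sample-size point concrete, here is a minimal sketch (all names hypothetical, not from any existing project) of a per-user model that learns online from the user's own event stream. Every keystroke transition is a training datapoint, so the effective sample size grows with use rather than being stuck at 1:

```python
# Minimal sketch: a personal behavioral model trained incrementally
# on one user's own event stream (e.g. keystrokes).
from collections import Counter, defaultdict

class PersonalKeyModel:
    """Bigram model over key events, updated online per user."""
    def __init__(self):
        self.counts = defaultdict(Counter)  # prev key -> next-key frequencies
        self.prev = None
        self.n_samples = 0

    def observe(self, key):
        # Each observed transition is one statistical sample.
        if self.prev is not None:
            self.counts[self.prev][key] += 1
            self.n_samples += 1
        self.prev = key

    def predict_next(self):
        # Most frequent successor of the last observed key, if any.
        if self.prev is None or not self.counts[self.prev]:
            return None
        return self.counts[self.prev].most_common(1)[0][0]

model = PersonalKeyModel()
for key in "hello world hello w":   # stand-in for a live keystroke stream
    model.observe(key)
print(model.n_samples)      # -> 18: one sample per transition, not 1 overall
print(model.predict_next()) # -> 'o': most likely key after 'w'
```

A real system would use timing and mouse data too, and could start from a model pretrained on public data before personalizing, but the principle is the same: the stream itself is the dataset.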
> > But wouldn't it be _better_ to have this data out in the public
> > than held privately by marketing and military organisations?
> i am not convinced it would be better: consider being a victim
> of identity theft. should you just post your personal info out
> in the clear, knowing that some are using it already?
> no, that'd just make the problem worse.
I guess this depends on your relationship with marketing and military
organisations vs your relationship with the rest of the world.
If you are a military target among a community of caring academics, it
seems far better to have your data in the clear than privately held.
> the same holds for an open-source, open-data privacy invasion system:
> it's still privacy invasive and detrimental!
> there are some uses of technology that just SHOULD NOT BE.
> i remain open to changing my mind, however :)
Yeah, this is like drugs, guns, or cryptography: things get more violent
when a resource is kept out of the open from those who want it available.
> -----BEGIN PGP SIGNATURE-----
> -----END PGP SIGNATURE-----