Fwd: Public user prediction software?

Karl gmkarl at gmail.com
Mon Apr 26 15:17:45 PDT 2021


On Mon, Apr 26, 2021, 6:10 PM Karl <gmkarl at gmail.com> wrote:

>
>
> On Mon, Apr 26, 2021, 12:21 PM coderman <coderman at protonmail.com> wrote:
>
>> -----BEGIN PGP SIGNED MESSAGE-----
>> Hash: SHA512
>>
>>
>> hello Karl!
>>
>> my replies below clearsigned as usual,
>>
>>
>> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>> On Monday, April 26, 2021 6:42 AM, Karl <gmkarl at gmail.com> wrote:
>> >
>> > ...
>> > Couldn't a user provide their own keystrokes, mouse movements,
>> > etc and fine-tune a personalised model on their own system?
>>
>> this would simply be personalization, not behavioral prediction.
>>
>
> ?  I do not see the contradiction: a personalised model of one user's
> behavior is still behavioral prediction.
>
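> As a rough sketch of what "providing your own keystrokes and mouse
> movements" could look like locally (this assumes the pynput library;
> the log format is just an illustration, not any project's schema):
>
>     import json
>     import time
>     from pynput import keyboard, mouse
>
>     log = open("my_behavior_log.jsonl", "a")
>
>     def on_press(key):
>         # record only the timing, not which key, to keep the log less
>         # sensitive
>         log.write(json.dumps({"t": time.time(), "event": "key"}) + "\n")
>
>     def on_move(x, y):
>         log.write(json.dumps({"t": time.time(), "event": "move",
>                               "x": x, "y": y}) + "\n")
>
>     kb = keyboard.Listener(on_press=on_press)
>     ms = mouse.Listener(on_move=on_move)
>     kb.start()
>     ms.start()
>     kb.join()  # collect events until the keyboard listener is stopped
>
> Every event appended to that file is another datapoint from the single
> user.
>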
>> all of the behavioral prediction models are based on statistics;
>> you need a sufficient sample size to make inferences from the
>> data.
>>
>> i don't know of any methods that work with a sample size of 1.
>>
>> if you look at open source projects for behavioral modeling and
>> prediction, you see they all use large datasets to build the
>> models.
>>
>> E.g. https://github.com/numenta/nupic,
>>  https://github.com/numenta/nupic.core
>
>
> It looks to me like it would work, with some effort.
>
> If that's not clear to you:
> - models trained on huge datasets can be fine-tuned on small datasets and
> work effectively (see the sketch after this list)
> - a single user has a lot of statistical data.  The sample size is not
> one: it is every streaming datapoint they produce.
> - many users are likely to be interested in trying it out.
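>
> A minimal fine-tuning sketch of the first point, assuming PyTorch; the
> model shape, the checkpoint name "pretrained_interaction_model.pt", and
> the placeholder tensors are illustrative assumptions, not an existing
> published model:
>
>     import os
>     import torch
>     import torch.nn as nn
>
>     class NextDelayPredictor(nn.Module):
>         """Predict the time until the user's next input event."""
>         def __init__(self, feature_dim=3, hidden_dim=64):
>             super().__init__()
>             self.rnn = nn.GRU(feature_dim, hidden_dim, batch_first=True)
>             self.head = nn.Linear(hidden_dim, 1)
>
>         def forward(self, x):  # x: (batch, seq_len, feature_dim)
>             out, _ = self.rnn(x)
>             return self.head(out[:, -1])
>
>     model = NextDelayPredictor()
>
>     # step 1: start from weights trained on a large shared dataset,
>     # if one is available
>     if os.path.exists("pretrained_interaction_model.pt"):
>         model.load_state_dict(torch.load("pretrained_interaction_model.pt"))
>
>     # step 2: keep training on the single user's own event stream; the
>     # random tensors stand in for feature windows taken from a locally
>     # captured log
>     personal_windows = torch.randn(256, 20, 3)
>     next_delays = torch.rand(256, 1)
>
>     optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
>     loss_fn = nn.MSELoss()
>
>     for epoch in range(5):
>         optimizer.zero_grad()
>         loss = loss_fn(model(personal_windows), next_delays)
>         loss.backward()
>         optimizer.step()
>
> The point is only that the large shared dataset does the heavy lifting
> once, and the per-user data then only has to nudge the weights.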
>
>> > But wouldn't it be _better_ to have this data out in the public
>> > than held privately by marketing and military organisations?
>>
>> i am not convinced it would be better: consider being a victim
>> of identity theft. should you just post your personal info out
>> in the clear, knowing that some are using it already?
>
>
>> no, that'd just make the problem worse.
>>
>
> I guess this depends on your relationship with marketing and military
> organisations vs your relationship with the rest of the world.
>
> If you are a military target among a community of caring academics, it
> seems far better to have your data in the clear than privately held.
>
>> same with an open source open data privacy invasion system:
>>  it's still privacy invasive and detrimental!
>
>
>> there are some uses of technology that just SHOULD NOT BE.
>>
>> i remain open to changing my mind, however :)
>>
>
> Yeah, this is like drugs, guns, or cryptography: things tend to get more
> violent when the resource is kept out of the clear for those who want it
> there.
>


Do you imagine a way to halt it without publicizing it?


>> best regards,
>>
>> -----BEGIN PGP SIGNATURE-----
>>
>> iNUEAREKAH0WIQRBwSuMMH1+IZiqV4FlqEfnwrk4DAUCYIboW18UgAAAAAAuAChp
>> c3N1ZXItZnByQG5vdGF0aW9ucy5vcGVucGdwLmZpZnRoaG9yc2VtYW4ubmV0NDFD
>> MTJCOEMzMDdEN0UyMTk4QUE1NzgxNjVBODQ3RTdDMkI5MzgwQwAKCRBlqEfnwrk4
>> DKbcAQDIJWz/Rj+ioyaAQYn9DLCO0TrBKqQWUihx+NjRIDB+QQEAhNLwq8LaFRjl
>> UGEvbxtxybfvJE10tHhw/Z+tk8Ls8O0=
>> =D/ZQ
>> -----END PGP SIGNATURE-----
>>
>>

