for those that have not seen this...
http://www.youtube.com/watch?v=CgLkWT246qU

--
Tentacle #99
ecc public key, curve25519 (pcp 0.15):
1l0$WoM5C8z=yeZG7?$]f^Uu8.g>4rf#t^6mfW9(rr910

"Governments are instituted among men, deriving their just powers from the consent of the governed, that whenever any form of government becomes destructive of these ends, it is the right of the people to alter or abolish it, and to institute new government, laying its foundation on such principles, and organizing its powers in such form, as to them shall seem most likely to effect their safety and happiness."

https://github.com/TLINDEN/pcp.git to get pcp (curve25519 CLI)
https://github.com/stef/pbp.git (curve25519 Python-based CLI)
I liked it, but the realistic portions were destroyed by the A.I. 'argument' with the operator.

I have a hard time believing that to be the future, especially when the video itself shows the A.I. making a poor decision and getting the craft destroyed. I can't imagine a military buying weapons that can routinely overrule their chain of command.

Now explain to me that the female voice was actually a higher-ranking secondary operator, and I'd start to see the benefits.

*operator 1: do not want to sacrifice craft for discovery of enemy*
*operator 2: override previous command; potential enemy discovery of greater priority.*

Making a superior officer appear as if they were 'Siri' could possibly reduce confrontation and the feeling of being overruled, and thus increase individual operator happiness, job satisfaction, etc. It could also reduce the chance that an operator takes personal responsibility for their actions.

"That stupid AI fucked up, not me. I KNEW they were innocent!"

I liked it, though. Neat footage.

On Sat 01 Feb 2014 01:16:33 AM PST, gwen hastings wrote:
> for those that have not seen this...
--
http://about.me/sam.gordon

Keep the net free:
Electronic Frontier Foundation: https://supporters.eff.org/donate
Free Software Foundation: https://my.fsf.org/associate/support_freedom/join_fsf
On Saturday, 1 February 2014 06:31:07, Sam Gordon wrote:
> I liked it, but the realistic portions were destroyed by the A.I.
> 'argument' with the operator.
> I have a hard time believing that to be the future, especially when in
> the video itself it shows the A.I. making a poor decision and getting
> the craft destroyed. I can't imagine a military buying weapons that can
> routinely overrule their chain of command
When you put it like that, sure. But put it like it's not "overruling the chain of command" but "correcting operator mistakes in accordance with procedures", and it's a whole different story! The latter creates the appearance that a drone would not be able to make any "wrong" decisions, as all its decisions would be based on procedures written and implemented by "the right people".

What gets hidden in such a scenario is that (obviously):

- procedures are bound to have mistakes themselves (oh, the irony of unintended consequences!);
- people implementing them will make mistakes.

But that will not stop the introduction of such drones, if properly packaged in marketing mumbo-jumbo. Never underestimate the power of new shiny toys for the uniformed (just one letter away from "uninformed", eh?) boys!
> Now explain to me that the female voice was actually a higher ranking
> secondary operator, and i'd start to see the benefits.
> *operator 1 : do not want to sacrifice craft for discovery of enemy*
> *operator 2 : override previous command, potential enemy discovery of
> greater priority.*
> Making a superior officer appear as if they are 'Siri' could possibly
> reduce confrontation and the feeling of being overruled, and thus
> increase individual operator happiness, job satisfaction, etc etc. It
> could also reduce the chances that an operator takes personal
> responsibility for their actions.
> "That stupid AI fucked up, not me. I KNEW they were innocent!"
Consider:
http://en.wikipedia.org/wiki/Firing_squad#Blank_cartridge
http://en.wikipedia.org/wiki/Diffusion_of_responsibility

--
Regards,
rysiek
participants (3):
- gwen hastings
- rysiek
- Sam Gordon