Re: [NOISE] was Re: java security concerns
Did you also send this post to cypherpunks@toad.com?
No I didn't -- wasn't sure if that's where F. Stuart's email originated from. Please feel free to circulate to the universe, along with some further clarification below.

I hope I wasn't too positive in my support of automated program verification. There are real problems. However, with the development of the field called formal methods, computation has been directly linked to mathematical logic, which is a much better understood, and well circumscribed, domain than programming languages per se.

Applications of formal methods in software engineering depend critically on the use of automated theorem provers to provide improved support for the development of safety-critical systems. Potentially catastrophic consequences can derive from the failure of computerized systems upon which human lives rely, such as medical diagnostic systems, air traffic control systems and defence systems (the recent failure of the computerized system controlling the London Ambulance Service provides an example of how serious software failure can be).

Formal methods are used to provide programs with, or prove that programs have, certain properties: a program may be proved to terminate; two programs may be proved equivalent; an inefficient program may be transformed into an equivalent efficient program; a program may be verified to satisfy some specification (i.e. the program is proved to compute the specified function/relation); and a program may be synthesized that satisfies some specification.

Program verification boils down to proving a mathematical conjecture specifying that a given program will, for all inputs of a certain type, generate outputs of a certain type. This is relatively straightforward -- we already have the program P described in the initial conjecture to be proved. Program synthesis, on the other hand, starts with a similar conjecture *except* that P remains an unidentified variable. The task of synthesis (automated or otherwise) is to incrementally identify P as the conjecture proof is unraveled. This requires all kinds of "intelligent", and often intuitive, choices during the proof, and is consequently a difficult process to automate.

Peter

=================================================================
Dr Peter Madden,                       Email: madden@mpi-sb.mpg.de
Max-Planck-Institut fuer Informatik,   Phone: (49) (681) 302-5434
Im Stadtwald, W-66123 Saarbruecken,    Fax:   (49) (681) 302-5401
Germany.
=================================================================
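The two conjectures Madden describes are conventionally written as below; the type names and the spec predicate are generic placeholders, not taken from his post.

    % Verification conjecture: the program P is given in advance
    \forall x : \tau_{in} .\; spec(x, P(x))

    % Synthesis conjecture: P is the unknown
    \forall x : \tau_{in} .\; \exists y : \tau_{out} .\; spec(x, y)

A constructive proof of the synthesis conjecture has to exhibit a witness for y at each x; the function mapping each x to its witness is the synthesized program P, which is why the choices made during the proof are so much harder to automate than simply checking a P that is already in hand.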
Peter Madden (by way of Duncan Frissell) <madden@mpi-sb.mpg.de> writes:
Applications of formal methods in software engineering depend critically on the use of automated theorem provers to provide improved support for the development of safety critical systems. Potentially catastrophic consequences can derive from the failure of computerized systems upon which human lives rely such as medical diagnostic systems, air traffic control systems and defence systems (the recent failure of the computerized system controlling the London Ambulance Service provides an example of how serious software failure can be).
I far prefer trusting robust and failsafe engineering in such situations. Theorem provers can't account for what happens when the one-in-a-billion DRAM corruption occurs, or someone kicks the cable connecting the machine to its disks, or when a nut shoots the sensors, or whatever. Well-built systems fail in a safe manner because of good engineering design -- as an example, the motor controller might die, but the motor won't eat itself no matter what garbage the controller puts out. Such design is needed whether the systems are "formally proven" or not -- and frankly, I can't see formal proofs having much of an impact, given that in the end you are simply shifting the problem to bug-free specifications and still have to worry about failures in the system.

Perry
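Perry's point about failing safely can be made concrete with a small sketch. The class, limits and timeout below are hypothetical, not anyone's actual controller; the idea is a wrapper that drives the output to a safe state when the sensor goes silent or the control law emits garbage, regardless of whether the control law itself was ever proved correct.

    // Minimal sketch (hypothetical names and limits): bound the damage when
    // something upstream fails, rather than trust the control law blindly.
    public class FailSafeMotorController {
        private static final double MAX_DUTY = 0.8;        // assumed hardware limit
        private static final long SENSOR_TIMEOUT_MS = 100; // assumed staleness bound

        private long lastValidSensorMillis;
        private boolean seenValidReading = false;

        /** Returns the duty cycle actually sent to the motor. */
        public double command(double requestedDuty, double sensorValue, long nowMillis) {
            boolean sensorPlausible = !Double.isNaN(sensorValue)
                    && sensorValue >= -1000.0 && sensorValue <= 1000.0;
            if (sensorPlausible) {
                lastValidSensorMillis = nowMillis;
                seenValidReading = true;
            }
            // Fail safe: no valid reading yet, or the last one is too old.
            if (!seenValidReading || nowMillis - lastValidSensorMillis > SENSOR_TIMEOUT_MS) {
                return 0.0; // safe state: motor off
            }
            // Clamp whatever the (possibly buggy) control law asked for.
            if (Double.isNaN(requestedDuty) || requestedDuty < 0.0) {
                return 0.0;
            }
            return Math.min(MAX_DUTY, requestedDuty);
        }

        public static void main(String[] args) {
            FailSafeMotorController c = new FailSafeMotorController();
            System.out.println(c.command(2.5, 42.0, 0));         // 0.8 (request clamped)
            System.out.println(c.command(0.5, Double.NaN, 50));  // 0.5 (last reading still fresh)
            System.out.println(c.command(0.5, Double.NaN, 500)); // 0.0 (sensor stale: safe state)
        }
    }

The wrapper does nothing to prove the control law correct; it only limits the consequences when something upstream fails, which is the kind of engineering Perry is describing.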
Peter Madden (by way of Duncan Frissell) <madden@mpi-sb.mpg.de> writes:
Applications of formal methods in software engineering depend critically on the use of automated theorem provers to provide improved support for the development of safety critical systems. Potentially catastrophic consequences can derive from the failure of computerized systems upon which human lives rely such as medical diagnostic systems, air traffic control systems and defence systems (the recent failure of the computerized system controlling the London Ambulance Service provides an example of how serious software failure can be).
and Perry responds:
I far prefer trusting robust and failsafe engineering in such situations. Theorem provers can't account for what happens when the one-in-a-billion DRAM corruption occurs, or someone kicks the cable connecting the machine to its disks, or when a nut shoots the sensors, or whatever. Well-built systems fail in a safe manner because of good engineering design -- as an example, the motor controller might die, but the motor won't eat itself no matter what garbage the controller puts out. Such design is needed whether the systems are "formally proven" or not -- and frankly, I can't see formal proofs having much of an impact, given that in the end you are simply shifting the problem to bug-free specifications and still have to worry about failures in the system.
Hence comes my response: I think that both formal methods and defense-in-depth are important in building an effective protection program.

- Formal methods are very important for detecting flaws and producing what is commonly called "Fault Intolerance", but

- Defense-in-depth is important because there are no perfect technical defenses. This is commonly called "Fault Tolerance".

The most secure (and often highest quality) systems combine fault intolerance with fault tolerance to produce high-quality parts and a system that continues operating safely even when those high-quality parts fail.

--
-> See: Info-Sec Heaven at URL http://all.net
Management Analytics - 216-686-0090 - PO Box 1480, Hudson, OH 44236
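One way to picture the combination fc describes, as a rough sketch with made-up names: the three replicas stand in for independently checked ("fault-intolerant") implementations of the same specification, and the majority vote with a safe default supplies the fault tolerance when one replica, or the hardware under it, misbehaves.

    import java.util.function.IntUnaryOperator;

    // Illustrative only: triple modular redundancy over three implementations
    // of the same spec, with a safe fallback when no majority exists.
    public class VotingAltitudeFilter {
        private final IntUnaryOperator[] replicas;
        private final int safeDefault;

        // Expects exactly three replicas.
        public VotingAltitudeFilter(int safeDefault, IntUnaryOperator... replicas) {
            this.replicas = replicas;
            this.safeDefault = safeDefault;
        }

        public int compute(int rawReading) {
            int a = replicas[0].applyAsInt(rawReading);
            int b = replicas[1].applyAsInt(rawReading);
            int c = replicas[2].applyAsInt(rawReading);
            if (a == b || a == c) return a; // majority includes a
            if (b == c) return b;           // majority is b and c
            return safeDefault;             // no majority: fall back to a safe value
        }

        public static void main(String[] args) {
            VotingAltitudeFilter f = new VotingAltitudeFilter(
                    0,                    // safe default if all three disagree
                    x -> x / 10,          // replica 1 (checked implementation)
                    x -> x / 10,          // replica 2 (independent implementation)
                    x -> x / 10 + 999);   // replica 3, simulating a corrupted unit
            System.out.println(f.compute(1234)); // prints 123: the faulty replica is outvoted
        }
    }

The careful construction of each replica is the fault-intolerance half; the vote and the safe default are the fault-tolerance half that keeps the system operating sensibly when one of those carefully built parts still fails.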
participants (3)
- fc@all.net
- madden@mpi-sb.mpg.de
- Perry E. Metzger