Hi,

I realize now I didn't state the scaling correctly in my example. Let's
try it again...

Let's say I use an algorithm that is O(n^2). My initial problem set size
is 10, and at that size I use x resources in y amount of time. If I go
to a problem set size of 20, then I use (2^2)x resources and (2^2)y
time, and not x^2 and y^2 as I originally typed.

So, to gauge the difference in run time and resource load between two
problem set sizes a and b, with a complexity of O(f()), z being the run
time for size a, and w being the resource load for size a, we first take
the size ratio:

    d = b/a

The new run time for size b will be:

    f(d)z

And the new resource load for size b will be:

    f(d)w

(Note this ratio trick is exact only for pure power-law complexities
like n^2, where f(b/a) = f(b)/f(a); for something like n log n you'd
want the ratio f(b)/f(a) directly.)
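If it helps, here is that bookkeeping as a quick Python sketch. This is
just my own illustration of the arithmetic above; the function and
variable names are mine, and f stands in for whatever complexity
function applies:

    def scaled(f, a, b, z, w):
        """Estimate run time and resource load at problem size b,
        given run time z and resource load w measured at size a.
        Assumes f is a pure power law, e.g. f(n) = n**2."""
        d = b / a                      # size ratio
        return f(d) * z, f(d) * w      # (new time, new resources)

    # The example above: O(n^2), size 10 -> 20, so both scale by 2^2 = 4.
    f = lambda n: n ** 2
    new_time, new_load = scaled(f, 10, 20, z=5.0, w=100.0)
    print(new_time, new_load)          # 20.0 400.0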