Re: Orthogonal (fwd)

Forwarded message:
Date: Sun, 26 Oct 1997 19:13:12 -0800
From: Kent Crispin <kent@bywater.songbird.com>
Subject: Re: Orthogonal (fwd)
I do believe the use of the term this way was inspired by the notion of a 'basis' in a vector space -- a set of orthogonal vectors that span the space, ideally, unit vectors.
Can you better define the term 'basis'?
This is basic linear algebra:
V a vector space -- the set of all (s1,s2,s3,...,sn), where si is an element of the set of reals.
Actually si (scalars) can be rational, real, or complex. It is also possible to use more general structures such as fields. [1] Note that complex numbers may be used; in other words, a vector can be used to multiply another vector.
A set of vectors {v1,v2,...,vm} in V is
A vector of vectors, and you haven't even really defined vector yet...:(
linearly independent if there is no set of scalars {c1,c2,...,cm} with at least one non-zero element such that sum(ci*vi) == 0
Are you saying:

(c1*v1) + (c2*v2) + ... + (cn*vn) <> 0

where at least one ci is <> 0, is some sort of test for membership in a vector space? What about

(1/c1)*v1 + (-1/c2)*v2 + (0*v3) + ... + (0*vn) = 0 ?

If so, it should be clear that except for the case c = 0 there is always a way to take two (which is clearly more than one) of the scalars and cause the sum to be zero. Now if you have a single non-zero scalar multiplier then it seems reasonable that you can create such a structure that is always <> 0.
A set of vectors S spans
In other words 'are expressible in'?
a vector space V iff every element of V can be expressed as a linear combination of the elements of S.

Need to better define 'linear combination'.
The operations that are applicable to a vector space are: [1]

1. Associative law of addition: (x+y)+z = x+(y+z)
2. Commutative law of addition: x+y = y+x
3. Existence of zero: x+0 = x for all x in V
4. Existence of inverses: x+(-x) = 0
5. Associative law of multiplication: a(bx) = (ab)x
6. Unital law: 1*x = x
7. First distributive law: a(x+y) = (a*x)+(a*y)
8. Second distributive law: (a+b)*x = (a*x)+(b*x)

where x is of the form a_n*x^n + a_(n-1)*x^(n-1) + ... + a_1*x + a_0

This is a remarkably circular definition to present. In clearer wording:

The set of vectors S is expressible in a vector space V iff S is contained in V.

While this may be a requirement or test for membership in a vector space (as used below) it doesn't qualify as a definition.
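As a quick numeric check of the eight laws quoted above, here is a minimal sketch (Python with numpy is assumed; the particular vectors and scalars are arbitrary illustrative choices, not anything from the exchange):

    import numpy as np

    # Arbitrary sample vectors and scalars, purely for illustration.
    x, y, z = np.array([1., 2.]), np.array([3., -1.]), np.array([0., 5.])
    a, b = 2.0, -3.0
    zero = np.zeros(2)

    assert np.allclose((x + y) + z, x + (y + z))    # 1. associative addition
    assert np.allclose(x + y, y + x)                # 2. commutative addition
    assert np.allclose(x + zero, x)                 # 3. existence of zero
    assert np.allclose(x + (-x), zero)              # 4. existence of inverses
    assert np.allclose(a * (b * x), (a * b) * x)    # 5. associative multiplication
    assert np.allclose(1 * x, x)                    # 6. unital law
    assert np.allclose(a * (x + y), a*x + a*y)      # 7. first distributive law
    assert np.allclose((a + b) * x, a*x + b*x)      # 8. second distributive law
    print("all eight laws hold for these samples")

The same checks go through for polynomials (added coefficient-wise and scaled by numbers), which is why polynomials form a vector space too.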
Finally, a basis for V is a linearly independent set of vectors in V that spans V. A space is finite dimensioned if it has a finite set for a basis.

A finite set of what?

The standard basis (or natural basis) for a vector space of dimension n is the set of vectors
(1,0,0,...0)
(0,1,0,...0)
(0,0,1,...0)
In other words, any 'vector' can be expressed as the sum of the multiplication of unit elements by some set of scalars.

Is this definition something you just wrote, or would you be so kind as to give the reference if it isn't? This definition uses 'basis'; it doesn't define it. You can't use a term to define the term -- that's called circular reasoning. My question still stands: can you please better define 'basis'? Or is your claim that a basis is simply a way of stating 'elements of unit magnitude'?

Furthermore, a definition of a vector space has nothing to do with orthogonal measurement systems; it does have to do with polynomials. It just so happens that this sort of geometry shares the same sort of rules; that doesn't make them the same thing. There is nothing in your description, in particular the nifty little (*,*,*,...,*)'s, that implies any sort of measurement system based on line-segment axes that are 90 degrees apart, merely that there is more than one variable (i.e. x,y,z,...,a) involved.

Vector space - a vector space is a set of objects or elements that can be added together and multiplied by numbers (the result being an element of the set), in such a way that the usual rules of calculation hold. [1]

Jim Choate, The Armadillo Group, Austin, TX
ravage@ssz.com, http://www.ssz.com/, 512-451-7087

[1] The VNR Concise Encyclopedia of Mathematics, Gellert, Kustner, Hellwich, Kastner. ISBN 0-442-22646-2. Section 17.3, Vector spaces, p. 362.

On Sun, Oct 26, 1997 at 11:28:09PM -0600, Jim Choate wrote:
Forwarded message:
Date: Sun, 26 Oct 1997 19:13:12 -0800
From: Kent Crispin <kent@bywater.songbird.com>
Subject: Re: Orthogonal (fwd)
I do believe the use of the term this way was inspired by the notion of a 'basis' in a vector space -- a set of orthogonal vectors that span the space, ideally, unit vectors.
Can you better define the term 'basis'?
This is basic linear algebra:
V a vector space -- the set of all (s1,s2,s3,...,sn), where si is an element of the set of reals.
Actually si (scalars) can be rational, real, or complex. It is also possible to use more general structures such as fields. [1]
That's true. You asked for a better definition of "basis". I was trying to do that without writing a textbook, though.
Note that complex numbers may be used; in other words, a vector can be used to multiply another vector.
A set of vectors {v1,v2,...,vm} in V is
A vector of vectors, and you haven't even really defined vector yet...:(
No. A *set* of vectors. And I presumed you already knew what a vector was, and that a slight reminder would be all that was necessary. If you aren't at least somewhat familiar with the ideas of sets and vectors, it would be somewhat difficult for you to follow the rest of what I wrote, I grant.
linearly independent if there is no set of scalars {c1,c2,...,cm} with at least one non-zero element such that sum(ci*vi) == 0
Are you saying:
(c1*v1)+(c2*v2)+...+(cn*vn) <> 0
where at least one ci is <>0, is some sort of test for membership in a vector space?
Yes, though you have to understand that "0" is the zero *vector* -- the sum of vectors is always another vector. I was defining "linear independence", a basic concept needed to understand the notion of a basis.
What about,
(1/c1)*v1 + (-1/c2)*v2 + (0*v3) + ... + (0*vn) = 0 ?

If so, it should be clear that except for the case c = 0 there is always a way to take two (which is clearly more than one) of the scalars and cause the sum to be zero. Now if you have a single non-zero scalar multiplier then it seems reasonable that you can create such a structure that is always <> 0.
Nope. It may be the zero vector vs zero scalar confusion. Consider the two linearly *dependent* vectors in 2-space: (1,0), (2,0), and the set {2,-1} of non-zero scalars: 2*(1,0) + -1*(2,0) = (0,0).

Contrast with the two linearly *independent* vectors (1,0), (0,2). There is *no* set of scalars {c1,c2}, with at least 1 of ci != 0, such that c1*(1,0) + c2*(0,2) == (0,0). This is obvious: c1*(1,0) + c2*(0,2) = (c1,2*c2), which obviously cannot be (0,0) if either c1 or c2 is non-zero.
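For readers who want to check this mechanically, here is a minimal sketch (Python with numpy is assumed; the vectors are exactly the ones in the example above). A rank test distinguishes the dependent pair from the independent one:

    import numpy as np

    # Linearly dependent pair: a non-zero combination reaches the zero vector.
    v1, v2 = np.array([1, 0]), np.array([2, 0])
    print(2*v1 + (-1)*v2)                               # -> [0 0]

    # Linearly independent pair: stacking them gives a full-rank matrix,
    # so no combination with a non-zero scalar can reach the zero vector.
    w1, w2 = np.array([1, 0]), np.array([0, 2])
    print(np.linalg.matrix_rank(np.vstack([w1, w2])))   # -> 2 (full rank)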
A set of vectors S spans
In other words 'are expressible in'?
If you like. I am defining the word "span", however, because that is the common terminology.
a vector space V iff every element of V can be expressed as a linear combination of the elements of S.

Need to better define 'linear combination'.
Sorry. I really thought the notion of a "linear combination" would be common knowledge. You can check your dictionary.
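For the record, a 'linear combination' of vectors is just a sum of scalar multiples of them; a one-line illustration (Python with numpy assumed, the numbers being arbitrary):

    import numpy as np

    # 2*(1,0) + 3*(0,1) is a linear combination of (1,0) and (0,1).
    print(2*np.array([1, 0]) + 3*np.array([0, 1]))   # -> [2 3]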
The operations that are applicable to a vector space are: [1]
1. Associative law of addition: (x+y)+z = x+(y+z)
2. Commutative law of addition: x+y = y+x
3. Existence of zero: x+0 = x for all x in V
4. Existence of inverses: x+(-x) = 0
5. Associative law of multiplication: a(bx) = (ab)x
6. Unital law: 1*x = x
7. First distributive law: a(x+y) = (a*x)+(a*y)
8. Second distributive law: (a+b)*x = (a*x)+(b*x)

where x is of the form a_n*x^n + a_(n-1)*x^(n-1) + ... + a_1*x + a_0
This is a remarkably circular definition to present. In clearer wording:
No, in fact it's not. There is nothing original in the definition -- I'm just parroting what I learned 20 years ago.
The set of vectors S is expressible in a vector space V iff S is contained in V.
That's not what I said, though.
While this may be a requirement or test for membership in a vector space (as used below) it doesn't qualify as a definition.
Finally, a basis for V is a linearly independent set of vectors in V that spans V. A space is finite dimensioned if it has a finite set for a basis.

A finite set of what?
Oh Jeez. Vectors. Note that the last sentence defines "finite dimensioned", not "basis".
The standard basis (or natural basis) for a vector space of dimension n is the set of vectors
(1,0,0,...0)
(0,1,0,...0)
(0,0,1,...0)
In other words, any 'vector' can be expressed as the sum of the multiplication of unit elements by some set of scalars.
Is this definition something you just wrote, or would you be so kind as to give the reference if it isn't?
I paraphrased material from "Calculus of Vector Functions", 3rd edition, by Williamson, Crowell, and Trotter and "Linear Algebra", by Hoffman and Kunze. These are both really old texts, but any book on linear algebra will cover the same material.
This definition uses 'basis'; it doesn't define it. You can't use a term to define the term -- that's called circular reasoning.
Nope. The definition is "a basis for V is a linearly independent set of vectors in V that spans V." I defined "linearly independent" and "span" without using the word "basis", or even the idea of a "basis". It is not a circular definition.
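To see the definition do some work: because a basis is both independent and spanning, every vector has exactly one set of coordinates in it, found by solving a small linear system. A minimal sketch (Python with numpy; the basis and target vector are arbitrary illustrative choices):

    import numpy as np

    # A basis for 2-space that is *not* the standard one, and a target vector.
    b1, b2 = np.array([1., 1.]), np.array([1., -1.])
    target = np.array([5., 3.])

    # Columns of B are the basis vectors; solving B*c = target gives the
    # unique coordinates of target in this basis.
    B = np.column_stack([b1, b2])
    c = np.linalg.solve(B, target)
    print(c)                                         # -> [4. 1.]
    print(np.allclose(c[0]*b1 + c[1]*b2, target))    # -> True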
My question still stands: can you please better define 'basis'? Or is your claim that basis is simply a way of stating 'elements of unit magnitude'?
At this point, it is clear that I should simply refer you to an elementary linear algebra text. I will just repeat that, for those that are familiar with linear algebra, the analogical use of "orthogonal" in language design is pretty intuitive. If you don't have the background you may not have the intuition, and discussion on this list probably isn't the best way to build it.

--
Kent Crispin, kent@songbird.com
"No reason to get excited", the thief he kindly spoke...
PGP fingerprint: B1 8B 72 ED 55 21 5E 44 61 F4 58 0F 72 10 65 55
http://songbird.com/kent/pgp_key.html

At 9:47 AM -0700 10/27/97, Kent Crispin wrote:
At this point, it is clear that I should simply refer you to an elementary linear algebra text. I will just repeat that, for those that are familiar with linear algebra, the analogical use of "orthogonal" in language design is pretty intuitive. If you don't have the background you may not have the intuition, and discussion on this list probably isn't the best way to build it.
I returned home after being away to find half a dozen or more posts arguing about the use of the word "orthogonal," which I used in my essay Saturday.

Like Kent, I don't see any point in arguing basic concepts of linear algebra, vector spaces, correlations, inner products, and how these ideas relate to machine and language functions. This is stuff most of us learned way, way back, and the usage for machines and languages is natural. Arguing about what a "basis" is, or what a "linear combination" is, is just a waste of time.

I will give just _one_ example of orthogonality, though.

Imagine a 747, with various commands or functions to do things like lower the wing flaps, retract the landing gear, dump excess fuel, turn on cabin lights, signal an emergency, etc. Normally, most of these commands are "orthogonal" to each other, in that issuing a command to turn on the cabin lights does not also lower the landing gear. Orthogonal in that there is no "projection" of one vector onto the other. (Viewed in another way, the inner product of the two commands is zero, or nearly zero, meaning there is no correlation, or interaction, between the commands or vectors.)

(If Jim Choate quibbles in an overly literalist way that "commands are not vectors," I don't plan to respond.)

Contrast this orthogonal situation to one where one had these kinds of commands:

Command 353: Lower landing gear, turn on cabin lights, and release emergency braking parachute.

Command 792: Raise landing gear, send out emergency signal, and depressurize cabin.

(The above examples might be a kind of "complex instruction set" for a plane, because some architect decided that Command 353 _might_ conceivably be needed someday. A cleaner, more functionally orthogonal instruction set would be more like a RISC architecture.)

These commands would "interact" with each other. The inner product, or measure of correlation, would be far from zero.

In fact, this 747 example is not original to me. When people discuss software complexity, where small changes can have huge effects elsewhere, one may hear examples like: "When a pilot tells the plane to raise its wing flaps, he doesn't expect the tail section to fall off." Meaning, clean design limits global propagation in various ways, keeps functions fairly simple, and minimizes interactions and side effects.

Thus is the connection between my usage and software.

--Tim May
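To make the inner-product framing concrete, here is a minimal sketch (Python with numpy; encoding each command as a 0/1 vector over the plane's actuators is an illustrative assumption, not anything from the post). Orthogonal commands have a zero dot product; the CISC-style commands overlap:

    import numpy as np

    # Actuators: [landing_gear, cabin_lights, fuel_dump, emergency_signal, parachute]
    lower_gear   = np.array([1, 0, 0, 0, 0])
    cabin_lights = np.array([0, 1, 0, 0, 0])
    command_353  = np.array([1, 1, 0, 0, 1])   # gear + lights + parachute
    command_792  = np.array([1, 0, 0, 1, 0])   # gear + emergency signal

    print(np.dot(lower_gear, cabin_lights))    # -> 0: no interaction, orthogonal
    print(np.dot(command_353, command_792))    # -> 1: the commands interact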
participants (3)
- Jim Choate
- Kent Crispin
- Tim May