
On Sun, Oct 26, 1997 at 11:28:09PM -0600, Jim Choate wrote:
Forwarded message:
Date: Sun, 26 Oct 1997 19:13:12 -0800
From: Kent Crispin <kent@bywater.songbird.com>
Subject: Re: Orthogonal (fwd)
I do believe the use of the term this way was inspired by the notion of a 'basis' in a vector space -- a set of orthogonal vectors that span the space; ideally, unit vectors.
Can you better define the term 'basis'?
This is basic linear algebra:
V a vector space -- the set of all (s1,s2,s3,...,sn), where si is an element of the set of reals.
Actually, the si (scalars) can be rational, real, or complex. More generally, they can be drawn from any field. [1]
That's true. You asked for a better definition of "basis". I was trying to do that without writing a textbook, though.
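For concreteness, a minimal Python sketch of this tuple picture (the helper names here are illustrative, not from any of the texts cited below):

    # A vector in R^3 is just a 3-tuple of scalars; the vector
    # operations act componentwise.
    def vec_add(x, y):
        return tuple(a + b for a, b in zip(x, y))

    def scalar_mul(c, x):
        return tuple(c * a for a in x)

    v = (1.0, 2.0, 3.0)
    w = (4.0, 5.0, 6.0)
    print(vec_add(v, w))        # (5.0, 7.0, 9.0)
    print(scalar_mul(2.0, v))   # (2.0, 4.0, 6.0)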
Note that complex numbers may be used; in other words, a vector can be used to multiply another vector.
A set of vectors {v1,v2,...,vm} in V is
A vector of vectors, and you haven't even really defined vector yet...:(
No. A *set* of vectors. And I presumed you already knew what a vector was, and that a slight reminder would be all that was necessary. If you aren't at least somewhat familiar with the ideas of sets and vectors, it would be somewhat difficult for you to follow the rest of what I wrote, I grant.
linearly independent if there is no set of scalars {c1,c2,...,cm} with at least one non-zero element such that sum(ci*vi) == 0
Are you saying:
(c1*v1)+(c2*v2)+...+(cn*vn) <> 0
where at least one ci is <>0, is some sort of test for membership in a vector space?
Yes, though you have to understand that "0" is the zero *vector* -- the sum of vectors is always another vector. I was defining "linear independence", a basic concept needed to understand the notion of a basis.
What about,
(1/c1 * v1) + (-1/c2 * v2) + (0 * v3) + ... + (0 * vn) = 0
If so, it should be clear that, except for the case c = 0, there is always a way to take two of the scalars (which is clearly more than one) and cause the sum to be zero. And if you have a single non-zero scalar multiplier, then it seems reasonable that you can construct such a sum that is always <> 0.
Nope. It may be the zero vector vs. zero scalar confusion. Consider the two linearly *dependent* vectors in 2-space, (1,0) and (2,0), and the set {2,-1} of non-zero scalars:

    2*(1,0) + -1*(2,0) = (0,0)

Contrast with the two linearly *independent* vectors (1,0) and (0,2). There is *no* set of scalars {c1,c2}, with at least one ci != 0, such that c1*(1,0) + c2*(0,2) == (0,0). This is obvious: c1*(1,0) + c2*(0,2) = (c1, 2*c2), which obviously cannot be (0,0) if either c1 or c2 is non-zero.
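The same check can be done numerically; a sketch using numpy (matrix_rank counts the linearly independent rows of a matrix):

    import numpy as np

    dep = np.array([[1, 0], [2, 0]])   # rows: (1,0) and (2,0)
    ind = np.array([[1, 0], [0, 2]])   # rows: (1,0) and (0,2)

    print(np.linalg.matrix_rank(dep))  # 1 -- dependent: 2*(1,0) - 1*(2,0) = (0,0)
    print(np.linalg.matrix_rank(ind))  # 2 -- independent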
A set of vectors S spans
In other words 'are expressible in'?
If you like. I am defining the word "span", however, because that is the common terminology.
a vector space V iff every element of V can be expressed as a linear combination of the elements of S.
"Linear combination" -- need to better define this one.
Sorry. I really thought the notion of a "linear combination" would be common knowledge. You can check your dictionary.
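To make "span" concrete, a short numpy sketch: expressing an arbitrary vector of 2-space as a linear combination of {(1,0), (0,2)} amounts to solving for the coefficients:

    import numpy as np

    # Columns of S are the spanning vectors (1,0) and (0,2).
    S = np.array([[1.0, 0.0],
                  [0.0, 2.0]])
    target = np.array([3.0, 4.0])

    # Solve S @ c == target for the coefficients c.
    c = np.linalg.solve(S, target)
    print(c)                           # [3. 2.], i.e. 3*(1,0) + 2*(0,2)
    assert np.allclose(S @ c, target)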
The operations that are applicable to a vector space are: [1]
1. Associative law of addition: (x+y)+z = x+(y+z)
2. Commutative law of addition: x+y = y+x
3. Existence of zero: x+0 = x for all x in V
4. Existence of inverses: x+(-x) = 0
5. Associative law of multiplication: a(bx) = (ab)x
6. Unital law: 1*x = x
7. First distributive law: a(x+y) = (a*x)+(a*y)
8. Second distributive law: (a+b)*x = (a*x)+(b*x)
where x is of the form a_n*x^n + a_(n-1)*x^(n-1) + ... + a_1*x + a_0
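These laws are easy to spot-check numerically; a quick sketch on random vectors in R^3 (exact up to floating-point rounding):

    import numpy as np

    x, y, z = np.random.rand(3), np.random.rand(3), np.random.rand(3)
    a, b = 2.0, -3.0

    assert np.allclose((x + y) + z, x + (y + z))  # 1. associative law of addition
    assert np.allclose(x + y, y + x)              # 2. commutative law of addition
    assert np.allclose(a * (b * x), (a * b) * x)  # 5. associative law of multiplication
    assert np.allclose(a * (x + y), a*x + a*y)    # 7. first distributive law
    assert np.allclose((a + b) * x, a*x + b*x)    # 8. second distributive law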
This is a remarkably circular definition to present. In clearer wording:
No, in fact it's not. There is nothing original in the definition -- I'm just parroting what I learned 20 years ago.
The set of vectors S is expressible in a vector space V iff S is contained in V.
That's not what I said, though.
While this may be a requirement or test for membership in a vector space (as used below) it doesn't qualify as a definition.
Finally, a basis for V is a linearly independent set of vectors in V that spans V. A space is finite dimensioned if it has a finite set for a basis.
"Set" of what?
Oh Jeez. Vectors. Note that the last sentence defines "finite dimensioned", not "basis".
The standard basis (or natural basis) for a vector space of dimension n is the set of vectors
(1,0,0,...,0)
(0,1,0,...,0)
(0,0,1,...,0)
...
(0,0,0,...,1)
In other words, any 'vector' can be expressed as a sum of these unit vectors, each multiplied by some scalar.
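A last sketch of that observation: in numpy the standard basis vectors are the rows of the identity matrix, and a vector's coordinates in that basis are just its own components:

    import numpy as np

    e = np.eye(3)                     # rows: (1,0,0), (0,1,0), (0,0,1)
    v = np.array([5.0, -2.0, 7.0])

    # v as a sum of scalar multiples of the standard basis vectors.
    recombined = 5.0*e[0] + (-2.0)*e[1] + 7.0*e[2]
    assert np.allclose(recombined, v)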
Is this definition something you just wrote, or, if it isn't, would you be so kind as to give the reference?
I paraphrased material from "Calculus of Vector Functions", 3rd edition, by Williamson, Crowell, and Trotter and "Linear Algebra", by Hoffman and Kunze. These are both really old texts, but any book on linear algebra will cover the same material.
This definition uses 'basis'; it doesn't define it. You can't use a term to define that same term -- that's called circular reasoning.
Nope. The definition is "a basis for V is a linearly independent set of vectors in V that spans V." I defined "linearly independent" and "span" without using the word "basis", or even the idea of a "basis". It is not a circular definition.
My question still stands: can you please better define 'basis'? Or is your claim that a basis is simply a way of stating 'elements of unit magnitude'?
At this point, it is clear that I should simply refer you to an elementary linear algebra text. I will just repeat that, for those that are familiar with linear algebra, the analogical use of "orthogonal" in language design is pretty intuitive. If you don't have the background you may not have the intuition, and discussion on this list probably isn't the best way to build it.

--
Kent Crispin                             "No reason to get excited",
kent@songbird.com                        the thief he kindly spoke...
PGP fingerprint:  B1 8B 72 ED 55 21 5E 44  61 F4 58 0F 72 10 65 55
http://songbird.com/kent/pgp_key.html