Not always. Any m by n matrix is also a vector. Polynomials are vectors. As are continuous functions.
A vector is an element of a vector space over a field. These are sets equipped with two operations, vector addition and scalar multiplication, that obey some well-known rules: the existence of a zero vector (the identity for vector addition), associativity and commutativity of vector addition, distributivity of scalar multiplication over vector sums, that sort of thing!
These basic properties give rise to more elaborate concepts such as linear independence, spanning sets, and the idea of a basis, though not all vector spaces have a finite basis.
Your polynomial, f(x) = a + bx + cx^2 + dx^3, is an element of the vector space P3(R), the space of polynomials of degree at most 3 over the reals. This space is isomorphic to R^4, and it has a standard basis: {1, x, x^2, x^3}. You can then see that any such f(x) may be written as a linear combination of the basis vectors with real scalars.
As an exercise, you can check that P3(R) satisfies some of the vector space properties yourself (existence of a zero vector, associativity and commutativity of vector addition, distributivity of scalar multiplication over vector sums).
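For concreteness, here's a minimal Python sketch of that exercise, representing an element of P3(R) by its coefficient 4-tuple in the standard basis {1, x, x^2, x^3} (the tuple representation is just one convenient choice):

```python
# p(x) = a + b*x + c*x^2 + d*x^3 is stored as the tuple (a, b, c, d).
def add(p, q):
    """Vector addition in P3(R): componentwise on coefficients."""
    return tuple(pi + qi for pi, qi in zip(p, q))

def scale(c, p):
    """Scalar multiplication in P3(R)."""
    return tuple(c * pi for pi in p)

ZERO = (0.0, 0.0, 0.0, 0.0)      # the zero polynomial

p = (1.0, 2.0, 0.0, -3.0)        # 1 + 2x - 3x^3
q = (0.0, 1.0, 4.0, 0.0)         # x + 4x^2

assert add(p, q) == add(q, p)                                       # commutativity
assert add(p, ZERO) == p                                            # zero vector
assert scale(2.0, add(p, q)) == add(scale(2.0, p), scale(2.0, q))   # distributivity
```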
What happens to elements with powers of x above 3? Say we multiply the example vector above by itself. We would end up with a d^2 x^6 component, which is not part of the P3(R) vector space, right?
Do we need a special multiplication rule to handle powers of x above 3? I've worked with quaternions before, which have "special" multiplication rules defined via i, j, and k.
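(For reference, the quaternion rules are the Hamilton product, generated by i^2 = j^2 = k^2 = ijk = -1. A minimal sketch, assuming a quaternion w + xi + yj + zk is stored as the tuple (w, x, y, z):)

```python
def hamilton(q1, q2):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
assert hamilton(i, j) == k               # i * j = k
assert hamilton(i, i) == (-1, 0, 0, 0)   # i^2 = -1
```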
Multiplication of two vectors is not an operation defined on vector spaces. If you want that, you’re looking at either a structure known as an inner product space or an algebra over a field.
Note that the usual polynomial multiplication is not an inner product (it returns a polynomial, not a scalar), and it doesn't make P3(R) an algebra either, because the space isn't closed under it: products can exceed degree 3. (The space of all polynomials, with no degree bound, is an algebra under ordinary multiplication.)
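To see the closure failure concretely, here's a small illustrative sketch: polynomial multiplication is a convolution of coefficient lists, so the product of two P3(R) elements needs up to 7 coefficients:

```python
def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists (constant term first)."""
    out = [0.0] * (len(p) + len(q) - 1)
    for a, pa in enumerate(p):
        for b, qb in enumerate(q):
            out[a + b] += pa * qb
    return out

f = [1.0, 2.0, 0.0, 5.0]       # 1 + 2x + 5x^3, an element of P3(R)
print(len(poly_mul(f, f)))     # 7 coefficients -> terms up to x^6, outside P3(R)
```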
Wouldn’t an N by M matrix be a tensor? Magnitude and direction only need one entry per DOF.
Every vector is a tensor. Matrices are vectors because m by n matrices form vector spaces. Magnitude and direction have nothing to do with the definition of vectors, which are just elements of vector spaces.
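A quick sketch of that point (using NumPy for the matrix arithmetic): 2 by 2 matrices add and scale componentwise, with the zero matrix as the additive identity, which is all the vector space structure asks for.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

assert np.array_equal(A + B, B + A)                       # commutativity
assert np.array_equal(A + np.zeros((2, 2)), A)            # zero vector
assert np.array_equal(2.0 * (A + B), 2.0 * A + 2.0 * B)   # distributivity
```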
All vectors are tensors but not vice versa. And every page/definition of vector I’ve seen references magnitude and direction, even the vector space page you linked.
It looks like “vector” commonly refers to geometric vectors, which is what most folks in this thread are discussing.
Would N by M vectors be imaginary, where each DOF has real and imaginary components?
Continuous functions on [0,1] are vectors. Magnitude and direction are meaningless in that vector space, usually called C[0,1]. Magnitude and direction are not fundamental properties of vectors.
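Concretely (an illustrative sketch, with the operations defined pointwise, as usual for function spaces):

```python
import math

# Vectors in C[0,1] are continuous functions; addition and scaling are pointwise.
def add(f, g):
    return lambda x: f(x) + g(x)

def scale(c, f):
    return lambda x: c * f(x)

h = add(math.sin, lambda x: x**2)    # sin(x) + x^2, again continuous on [0, 1]
print(h(0.5))                        # evaluating the "vector" at a point
```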
n by m matrices (and the vector spaces to which they belong) are perhaps best thought of like functions and function spaces: not as geometric objects, but as linear transformations (which they are).
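A last sketch along those lines: treat the matrix as a function on vectors and check linearity directly (NumPy used for brevity):

```python
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])
T = lambda v: A @ v                  # the matrix acting as a linear transformation

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
assert np.allclose(T(u + v), T(u) + T(v))    # additivity
assert np.allclose(T(5.0 * u), 5.0 * T(u))   # homogeneity
```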