Basis (linear algebra) in the context of Basis function

⭐ Core Definition: Basis (linear algebra)

In mathematics, a set B of elements of a vector space V is called a basis (pl.: bases) if every element of V can be written in a unique way as a finite linear combination of elements of B. The coefficients of this linear combination are referred to as components or coordinates of the vector with respect to B. The elements of a basis are called basis vectors.

Equivalently, a set B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B. In other words, a basis is a linearly independent spanning set.
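
As a concrete illustration (a minimal numpy sketch; the basis and vector are arbitrary choices, not taken from the text above), the coordinates of a vector with respect to a basis of ℝ² can be found by solving a linear system:

    import numpy as np

    # Columns of B are the basis vectors (a basis of R^2 chosen for illustration).
    B = np.array([[1.0, 1.0],
                  [0.0, 2.0]])
    v = np.array([3.0, 4.0])

    # Solving B @ c = v gives the coordinates of v with respect to this basis.
    c = np.linalg.solve(B, v)
    print(c)      # [1. 2.]  ->  v = 1*(1, 0) + 2*(1, 2)
    print(B @ c)  # reconstructs v, reflecting the uniqueness of the expansion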

Basis (linear algebra) in the context of Determinant

In mathematics, the determinant is a scalar-valued function of the entries of a square matrix. The determinant of a matrix A is commonly denoted det(A), det A, or |A|. Its value characterizes some properties of the matrix and the linear map represented, on a given basis, by the matrix. In particular, the determinant is nonzero if and only if the matrix is invertible and the corresponding linear map is an isomorphism. However, if the determinant is zero, the matrix is referred to as singular, meaning it does not have an inverse.

The determinant is completely determined by the following two properties: the determinant of a product of matrices is the product of their determinants, and the determinant of a triangular matrix is the product of its diagonal entries.
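
Both characterizing properties are easy to verify numerically (a small numpy sketch; the matrices are arbitrary examples, not from the text above):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    B = np.array([[0.0, 1.0],
                  [4.0, 2.0]])

    # Multiplicativity: det(AB) = det(A) * det(B).
    print(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))  # both ≈ -20.0

    # Triangular matrix: the determinant is the product of the diagonal entries.
    T = np.array([[2.0, 5.0, 1.0],
                  [0.0, 3.0, 7.0],
                  [0.0, 0.0, 4.0]])
    print(np.linalg.det(T))  # ≈ 24.0 = 2 * 3 * 4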

Basis (linear algebra) in the context of Scalar (physics)

Scalar quantities, or simply scalars, are physical quantities that can be described by a single pure number (a scalar, typically a real number), accompanied by a unit of measurement, as in "10 cm" (ten centimeters). Examples of scalars are length, mass, charge, volume, and time. Scalars may represent the magnitude of physical quantities, as speed is the magnitude of velocity. Scalars do not represent a direction.

Scalars are unaffected by changes to a vector space basis (i.e., a coordinate rotation) but may be affected by translations (as in relative speed). A change of vector space basis changes the description of a vector in terms of the basis used but does not change the vector itself, and a scalar is untouched by such a change. In classical physics, such as Newtonian mechanics, rotations and reflections preserve scalars, while in relativity Lorentz transformations and space-time translations preserve scalars. The term "scalar" originates in the multiplication of vectors by a unitless scalar, which is a uniform scaling transformation.
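
A quick numerical check of the first point (an illustrative numpy sketch; the angle and vector are arbitrary): rotating the basis changes a vector's components but leaves a scalar such as its length unchanged.

    import numpy as np

    theta = 0.7  # arbitrary rotation angle
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    v = np.array([3.0, 4.0])
    v_rot = R @ v  # the same vector described after a rotation

    print(v, v_rot)                                  # the component descriptions differ...
    print(np.linalg.norm(v), np.linalg.norm(v_rot))  # ...but the length (a scalar) stays 5.0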

Basis (linear algebra) in the context of Orthogonal basis

In mathematics, particularly linear algebra, an orthogonal basis for an inner product space V is a basis for V whose vectors are mutually orthogonal. If the vectors of an orthogonal basis are normalized, the resulting basis is an orthonormal basis.
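
For instance (a minimal numpy sketch; the vectors are chosen for illustration), normalizing the mutually orthogonal vectors (1, 1) and (1, −1) produces an orthonormal basis of ℝ²:

    import numpy as np

    u = np.array([1.0,  1.0])
    w = np.array([1.0, -1.0])
    print(u @ w)  # 0.0: the two vectors are orthogonal

    # Normalizing each vector turns the orthogonal basis into an orthonormal one.
    e1 = u / np.linalg.norm(u)
    e2 = w / np.linalg.norm(w)
    print(e1 @ e1, e2 @ e2, e1 @ e2)  # ≈ 1.0 1.0 0.0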

Basis (linear algebra) in the context of Miller index

Miller indices form a notation system in crystallography for lattice planes in crystal (Bravais) lattices.

In particular, a family of lattice planes of a given (direct) Bravais lattice is determined by three integers h, k, and ℓ, the Miller indices. They are written (hkℓ), and denote the family of (parallel) lattice planes (of the given Bravais lattice) orthogonal to g = hb1 + kb2 + ℓb3, where b1, b2, b3 are the basis or primitive translation vectors of the reciprocal lattice for the given Bravais lattice. (Note that the plane is not always orthogonal to the linear combination of direct or original lattice vectors ha1 + ka2 + ℓa3 because the direct lattice vectors need not be mutually orthogonal.) This is based on the fact that a reciprocal lattice vector g (the vector indicating a reciprocal lattice point from the reciprocal lattice origin) is the wavevector of a plane wave in the Fourier series of a spatial function (e.g., electronic density function) whose periodicity follows the original Bravais lattice, so wavefronts of the plane wave are coincident with parallel lattice planes of the original lattice. Since a measured scattering vector in X-ray crystallography, Δk = k_out − k_in, with k_out as the outgoing (scattered from a crystal lattice) X-ray wavevector and k_in as the incoming (toward the crystal lattice) X-ray wavevector, is equal to a reciprocal lattice vector g as stated by the Laue equations, the measured scattered X-ray peak at each measured scattering vector is marked by Miller indices.
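
As a worked illustration (a numpy sketch under assumed conventions: a simple cubic direct lattice and the physics definition of the reciprocal basis with its 2π factor; none of this is specified in the text above):

    import numpy as np

    # Direct (Bravais) lattice basis: a simple cubic lattice with spacing a.
    a = 2.0
    a1, a2, a3 = a * np.eye(3)

    # Reciprocal basis, physics convention: b1 = 2*pi (a2 x a3) / (a1 . (a2 x a3)), cyclically.
    vol = a1 @ np.cross(a2, a3)
    b1 = 2 * np.pi * np.cross(a2, a3) / vol
    b2 = 2 * np.pi * np.cross(a3, a1) / vol
    b3 = 2 * np.pi * np.cross(a1, a2) / vol

    # The (hkl) family of lattice planes is orthogonal to g = h*b1 + k*b2 + l*b3.
    h, k, l = 1, 1, 0
    g = h * b1 + k * b2 + l * b3
    print(g)  # normal direction of the (110) planes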

Basis (linear algebra) in the context of Zorn's lemma

Zorn's lemma, also known as the Kuratowski–Zorn lemma, is a proposition of set theory. It states that a partially ordered set containing upper bounds for every chain (that is, every totally ordered subset) necessarily contains at least one maximal element.

The lemma was proven (assuming the axiom of choice) by Kazimierz Kuratowski in 1922 and independently by Max Zorn in 1935. It occurs in the proofs of several theorems of crucial importance, for instance the Hahn–Banach theorem in functional analysis, the theorem that every vector space has a basis, Tychonoff's theorem in topology stating that every product of compact spaces is compact, and the theorems in abstract algebra that in a ring with identity every proper ideal is contained in a maximal ideal and that every field has an algebraic closure.

Basis (linear algebra) in the context of Euler angles

The Euler angles are three angles introduced by Leonhard Euler to describe the orientation of a rigid body with respect to a fixed coordinate system.

They can also represent the orientation of a mobile frame of reference in physics or the orientation of a general basis in three-dimensional linear algebra.
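
A common realization (a numpy sketch using the z-y-x "yaw, pitch, roll" convention; the text above fixes no particular convention, so this choice is an assumption) composes three elementary rotations into one orthogonal matrix whose columns form the rotated basis:

    import numpy as np

    def rot_z(t):
        return np.array([[np.cos(t), -np.sin(t), 0.0],
                         [np.sin(t),  np.cos(t), 0.0],
                         [0.0,        0.0,       1.0]])

    def rot_y(t):
        return np.array([[ np.cos(t), 0.0, np.sin(t)],
                         [ 0.0,       1.0, 0.0      ],
                         [-np.sin(t), 0.0, np.cos(t)]])

    def rot_x(t):
        return np.array([[1.0, 0.0,        0.0      ],
                         [0.0, np.cos(t), -np.sin(t)],
                         [0.0, np.sin(t),  np.cos(t)]])

    # Orientation built from three angles (arbitrary example values).
    yaw, pitch, roll = 0.3, -0.5, 1.1
    R = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

    # R is orthogonal: its columns are the rotated orthonormal basis vectors.
    print(np.allclose(R.T @ R, np.eye(3)))  # True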

Basis (linear algebra) in the context of Pseudo-Euclidean space

In mathematics and theoretical physics, a pseudo-Euclidean space of signature (k, n-k) is a finite-dimensional real n-space together with a non-degenerate quadratic form q. Such a quadratic form can, given a suitable choice of basis (e1, …, en), be applied to a vector x = x1e1 + ⋯ + xnen, giving q(x) = x1² + ⋯ + xk² − x(k+1)² − ⋯ − xn², which is called the scalar square of the vector x.

For Euclidean spaces, k = n, implying that the quadratic form is positive-definite. When 0 < k < n, q is an isotropic quadratic form. Note that if 1 ≤ i ≤ k < j ≤ n, then q(ei + ej) = 0, so that ei + ej is a null vector. In a pseudo-Euclidean space with k < n, unlike in a Euclidean space, there exist vectors with negative scalar square.
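
Numerically (a numpy sketch for the arbitrarily chosen signature (1, 3), the Minkowski-like case; not taken from the text above):

    import numpy as np

    # Signature (1, 3): q(x) = x1^2 - x2^2 - x3^2 - x4^2.
    eta = np.diag([1.0, -1.0, -1.0, -1.0])

    def scalar_square(x):
        return x @ eta @ x

    print(scalar_square(np.array([2.0, 1.0, 0.0, 0.0])))  #  3.0 (positive)
    print(scalar_square(np.array([0.0, 1.0, 1.0, 0.0])))  # -2.0 (negative scalar square)
    print(scalar_square(np.array([1.0, 1.0, 0.0, 0.0])))  #  0.0 (the null vector e1 + e2)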

Basis (linear algebra) in the context of Homogeneous polynomial

In mathematics, a homogeneous polynomial, sometimes called a quantic in older texts, is a polynomial whose nonzero terms all have the same degree. For example, x⁵ + 2x³y² + 9xy⁴ is a homogeneous polynomial of degree 5, in two variables; the sum of the exponents in each term is always 5. The polynomial x³ + 3x²y + z⁷ is not homogeneous, because the sum of exponents does not match from term to term. The function defined by a homogeneous polynomial is always a homogeneous function.

An algebraic form, or simply form, is a function defined by a homogeneous polynomial. A binary form is a form in two variables. A form is also a function defined on a vector space, which may be expressed as a homogeneous function of the coordinates over any basis.
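
The defining property f(t·x, t·y) = t⁵·f(x, y) of the degree-5 example above is easy to test numerically (a Python sketch; the sample point and scale factor are arbitrary):

    def f(x, y):
        return x**5 + 2 * x**3 * y**2 + 9 * x * y**4  # homogeneous of degree 5

    x, y, t = 1.3, -0.7, 2.0
    print(f(t * x, t * y))  # scaling both inputs by t...
    print(t**5 * f(x, y))   # ...multiplies the value by t**5 (agrees up to rounding)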

Basis (linear algebra) in the context of Hamel dimension

In mathematics, the dimension of a vector space V is the cardinality (i.e., the number of vectors) of a basis of V over its base field. It is sometimes called Hamel dimension (after Georg Hamel) or algebraic dimension to distinguish it from other types of dimension.

For every vector space there exists a basis, and all bases of a vector space have equal cardinality; as a result, the dimension of a vector space is uniquely defined. We say V is finite-dimensional if the dimension of V is finite, and infinite-dimensional if its dimension is infinite.
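
In the finite-dimensional case this can be computed directly (a numpy sketch; the vectors are made-up examples): the dimension of the span of a set of vectors equals the rank of the matrix stacking them as rows.

    import numpy as np

    # Three vectors in R^3; the third is the sum of the first two.
    vectors = np.array([[1.0, 0.0, 2.0],
                        [0.0, 1.0, 1.0],
                        [1.0, 1.0, 3.0]])

    # The rank is the size of any basis of their span, i.e. its dimension.
    print(np.linalg.matrix_rank(vectors))  # 2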

Basis (linear algebra) in the context of Tensor

In mathematics, a tensor is an algebraic object that describes a multilinear relationship between sets of algebraic objects associated with a vector space. Tensors may map between different objects such as vectors, scalars, and even other tensors. There are many types of tensors, including scalars and vectors (which are the simplest tensors), dual vectors, multilinear maps between vector spaces, and even some operations such as the dot product. Tensors are defined independent of any basis, although they are often referred to by their components in a basis related to a particular coordinate system; those components form an array, which can be thought of as a high-dimensional matrix.

Tensors have become important in physics because they provide a concise mathematical framework for formulating and solving physics problems in areas such as mechanics (stress, elasticity, quantum mechanics, fluid mechanics, moment of inertia, etc.), electrodynamics (electromagnetic tensor, Maxwell tensor, permittivity, magnetic susceptibility, etc.), and general relativity (stress–energy tensor, curvature tensor, etc.). In applications, it is common to study situations in which a different tensor can occur at each point of an object. For example, the stress within an object may vary from one location to another. A family of tensors that vary across space in this way is a tensor field. In some areas, tensor fields are so ubiquitous that they are often simply called "tensors".
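
As a small illustration of how components depend on the basis (a numpy sketch; the component array and change-of-basis matrix are arbitrary, and the rule shown is the standard covariant one for a rank-2 tensor with two lower indices):

    import numpy as np

    T = np.array([[1.0, 2.0],
                  [0.0, 3.0]])  # components of a rank-2 tensor in the old basis

    A = np.array([[1.0, 1.0],
                  [0.0, 1.0]])  # columns give the new basis vectors in the old basis

    # Covariant rule for two lower indices: T'_ij = sum_kl A_ki A_lj T_kl.
    T_new = np.einsum('ki,lj,kl->ij', A, A, T)
    print(T_new)                            # same tensor, different components
    print(np.allclose(T_new, A.T @ T @ A))  # True: matches the matrix form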

Basis (linear algebra) in the context of Orthonormal basis

In mathematics, particularly linear algebra, an orthonormal basis for an inner product space V with finite dimension is a basis for V whose vectors are orthonormal, that is, they are all unit vectors and orthogonal to each other. For example, the standard basis for a Euclidean space ℝⁿ is an orthonormal basis, where the relevant inner product is the dot product of vectors. The image of the standard basis under a rotation or reflection (or any orthogonal transformation) is also orthonormal, and every orthonormal basis for ℝⁿ arises in this fashion. An orthonormal basis can be derived from an orthogonal basis via normalization. The choice of an origin and an orthonormal basis forms a coordinate frame known as an orthonormal frame.

For a general inner product space V, an orthonormal basis can be used to define normalized orthogonal coordinates on V. Under these coordinates, the inner product becomes a dot product of vectors. Thus the presence of an orthonormal basis reduces the study of a finite-dimensional inner product space to the study of ℝⁿ under the dot product. Every finite-dimensional inner product space has an orthonormal basis, which may be obtained from an arbitrary basis using the Gram–Schmidt process.
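
A direct implementation of the Gram–Schmidt process mentioned above (a minimal numpy sketch for real vectors under the standard dot product; the input vectors are assumed linearly independent):

    import numpy as np

    def gram_schmidt(vectors):
        """Turn linearly independent vectors into an orthonormal list."""
        basis = []
        for v in vectors:
            # Remove the projections onto the already-built orthonormal vectors...
            for e in basis:
                v = v - (v @ e) * e
            # ...and normalize what remains.
            basis.append(v / np.linalg.norm(v))
        return np.array(basis)

    Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                      np.array([1.0, 0.0, 1.0])])
    print(np.allclose(Q @ Q.T, np.eye(2)))  # True: the rows are orthonormal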

Basis (linear algebra) in the context of Linearly independent

In linear algebra, a set of vectors is said to be linearly independent if no vector in the set can be written as a linear combination of the other vectors in the set. If such a vector exists, the vectors are said to be linearly dependent. Linear independence is part of the definition of a basis.

A vector space can be of finite dimension or infinite dimension depending on the maximum number of linearly independent vectors. The definition of linear dependence and the ability to determine whether a subset of vectors in a vector space is linearly dependent are central to determining the dimension of a vector space.
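
In the finite-dimensional case the test is mechanical (a numpy sketch with made-up vectors): a finite set of vectors is linearly independent exactly when the rank of the matrix stacking them equals the number of vectors.

    import numpy as np

    def linearly_independent(vectors):
        M = np.array(vectors)
        return np.linalg.matrix_rank(M) == len(vectors)

    print(linearly_independent([[1.0, 0.0], [0.0, 1.0]]))  # True
    print(linearly_independent([[1.0, 2.0], [2.0, 4.0]]))  # False: the second is 2x the first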
