Orthogonal vectors in the context of Complete metric space


⭐ Core Definition: Orthogonal vectors

In mathematics, an inner product space is a real or complex vector space endowed with an operation called an inner product. The inner product of two vectors in the space is a scalar, often denoted with angle brackets, as in ⟨a, b⟩. Inner products allow formal definitions of intuitive geometric notions, such as lengths, angles, and orthogonality (zero inner product) of vectors. Inner product spaces generalize Euclidean vector spaces, in which the inner product is the dot product or scalar product of Cartesian coordinates. Inner product spaces of infinite dimension are widely used in functional analysis. Inner product spaces over the field of complex numbers are sometimes referred to as unitary spaces. The first use of the concept of a vector space with an inner product is due to Giuseppe Peano, in 1898.
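
To make the "zero inner product" definition concrete, here is a minimal Python sketch, assuming the Euclidean dot product on Rⁿ as the inner product; the helper names dot and is_orthogonal are illustrative, not taken from any particular library.

```python
def dot(u, v):
    """Euclidean inner product <u, v> of two real vectors."""
    return sum(ui * vi for ui, vi in zip(u, v))

def is_orthogonal(u, v, tol=1e-12):
    """Vectors are orthogonal exactly when their inner product is zero."""
    return abs(dot(u, v)) <= tol

print(dot((1, 0), (0, 1)))             # 0   -> orthogonal
print(is_orthogonal((1, 2), (2, -1)))  # True:  1*2 + 2*(-1) = 0
print(is_orthogonal((1, 1), (1, 0)))   # False: inner product is 1
```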

An inner product naturally induces an associated norm, ‖x‖ = √⟨x, x⟩; so every inner product space is a normed vector space. If this normed space is also complete (that is, a Banach space), then the inner product space is a Hilbert space. If an inner product space H is not a Hilbert space, it can be extended by completion to a Hilbert space H̄. This means that H is a linear subspace of H̄, the inner product of H is the restriction of that of H̄, and H is dense in H̄ for the topology defined by the norm.
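As a small illustration of the induced norm (again assuming the Euclidean dot product; the function names are invented for this sketch), the snippet below also checks the parallelogram law, ‖u + v‖² + ‖u − v‖² = 2‖u‖² + 2‖v‖², the standard criterion, not mentioned above, for a norm to arise from an inner product:

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def induced_norm(x):
    """Norm induced by the inner product: ||x|| = sqrt(<x, x>)."""
    return math.sqrt(dot(x, x))

u, v = (3.0, 4.0), (1.0, -2.0)
print(induced_norm(u))  # 5.0, the familiar Euclidean length

# Parallelogram law: holds precisely for norms induced by inner products.
s = tuple(a + b for a, b in zip(u, v))
d = tuple(a - b for a, b in zip(u, v))
lhs = induced_norm(s) ** 2 + induced_norm(d) ** 2
rhs = 2 * induced_norm(u) ** 2 + 2 * induced_norm(v) ** 2
print(abs(lhs - rhs) < 1e-9)  # True
```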

Orthogonal vectors in the context of Orthogonality

Orthogonality is a term with various meanings depending on the context.

In mathematics, orthogonality is the generalization of the geometric notion of perpendicularity. Although many authors use the two terms perpendicular and orthogonal interchangeably, the term perpendicular is more specifically used for lines and planes that intersect to form a right angle, whereas orthogonal is used in generalizations, such as orthogonal vectors or orthogonal curves.


Orthogonal vectors in the context of Cosine similarity

In data analysis, cosine similarity is a measure of similarity between two non-zero vectors defined in an inner product space. Cosine similarity is the cosine of the angle between the vectors; that is, it is the dot product of the vectors divided by the product of their lengths. It follows that the cosine similarity does not depend on the magnitudes of the vectors, but only on their angle. The cosine similarity always belongs to the interval [−1, 1]. For example, two proportional vectors have a cosine similarity of +1, two orthogonal vectors have a similarity of 0, and two opposite vectors have a similarity of −1. In some contexts, the component values of the vectors cannot be negative, in which case the cosine similarity is bounded in [0, 1].
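
A minimal sketch of the formula just described, the dot product divided by the product of the lengths (the function name cosine_similarity is illustrative):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two non-zero vectors."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm_u = math.sqrt(sum(ui * ui for ui in u))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    return dot / (norm_u * norm_v)

print(cosine_similarity((1, 2), (2, 4)))    #  1.0 (proportional)
print(cosine_similarity((1, 0), (0, 1)))    #  0.0 (orthogonal)
print(cosine_similarity((1, 2), (-1, -2)))  # -1.0 (opposite)
```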

For example, in information retrieval and text mining, each word is assigned a different coordinate, and a document is represented by the vector counting the occurrences of each word in the document. Cosine similarity then gives a useful measure of how similar two documents are likely to be in terms of their subject matter, independently of the lengths of the documents.
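
Here is a minimal sketch of that bag-of-words scheme; the example documents and the Counter-based word-count representation are invented for illustration:

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Cosine similarity of two word-count vectors stored as Counters."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b)

doc1 = Counter("the cat sat on the mat".split())
doc2 = Counter(("the cat sat on the mat " * 2).split())
doc3 = Counter("stocks fell sharply in early trading".split())

print(cosine_similarity(doc1, doc2))  # 1.0: same subject, twice the length
print(cosine_similarity(doc1, doc3))  # 0.0: no words in common
```

Because word counts cannot be negative, the similarities here fall in [0, 1], matching the bounded case noted above.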
