Interpolation in the context of Basis function

Core Definition: Interpolation

In the mathematical field of numerical analysis, interpolation is a type of estimation: a method of constructing new data points within the range of a discrete set of known data points.

In engineering and science, one often has a number of data points, obtained by sampling or experimentation, which represent the values of a function for a limited number of values of the independent variable. It is often required to interpolate; that is, estimate the value of that function for an intermediate value of the independent variable.
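
As a concrete sketch of that idea, the snippet below estimates an intermediate value by piecewise-linear interpolation using NumPy's np.interp; the sample data are invented for illustration.

```python
import numpy as np

# Known samples of some function, e.g. obtained by experiment (invented values).
x_known = np.array([0.0, 1.0, 2.0, 3.0])
y_known = np.array([0.0, 0.8, 0.9, 0.1])

# Estimate the function at an intermediate value of the independent variable.
x_query = 1.5
y_est = np.interp(x_query, x_known, y_known)  # piecewise-linear interpolation
print(y_est)  # 0.85, halfway between the samples at x = 1 and x = 2
```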

Interpolation in the context of Basis function

In mathematics, a basis function is an element of a particular basis for a function space. Every function in the function space can be represented as a linear combination of basis functions. In finite-dimensional vector spaces this representation is purely algebraic and involves only finitely many basis functions, whereas in infinite-dimensional settings it typically takes the form of an infinite series whose convergence depends on the topology of the space.

In numerical analysis and approximation theory, basis functions are also called blending functions, because of their use in interpolation: In this application, a mixture of the basis functions provides an interpolating function (with the "blend" depending on the evaluation of the basis functions at the data points).
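
The sketch below illustrates this "blending" with Lagrange basis polynomials, one common choice of interpolation basis (the data values are invented): each basis function equals 1 at its own node and 0 at every other node, so their data-weighted mixture passes exactly through the data points.

```python
def lagrange_basis(nodes, j, x):
    """j-th Lagrange basis polynomial for the given nodes, evaluated at x.
    It equals 1 at nodes[j] and 0 at every other node."""
    out = 1.0
    for m, xm in enumerate(nodes):
        if m != j:
            out *= (x - xm) / (nodes[j] - xm)
    return out

nodes = [0.0, 1.0, 2.0]
values = [1.0, 3.0, 2.0]          # data to interpolate (invented)

# The interpolating function "blends" the basis functions, weighted by the
# data values, so it reproduces each (node, value) pair exactly.
x = 1.5
p = sum(v * lagrange_basis(nodes, j, x) for j, v in enumerate(values))
print(p)   # 2.875 for this data; p(0) = 1, p(1) = 3, p(2) = 2 exactly
```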


Interpolation in the context of Kriging

In statistics, originally in geostatistics, kriging or Kriging (/ˈkriːɡɪŋ/), also known as Gaussian process regression, is a method of interpolation based on a Gaussian process governed by prior covariances. Under suitable assumptions on the prior, kriging gives the best linear unbiased prediction (BLUP) at unsampled locations. Interpolating methods based on other criteria, such as smoothness (e.g., smoothing spline), may not yield the BLUP. The method is widely used in the domains of spatial analysis and computer experiments. The technique is also known as Wiener–Kolmogorov prediction, after Norbert Wiener and Andrey Kolmogorov.

The theoretical basis for the method was developed by the French mathematician Georges Matheron in 1960, based on the master's thesis of Danie G. Krige, the pioneering plotter of distance-weighted average gold grades at the Witwatersrand reef complex in South Africa. Krige sought to estimate the most likely distribution of gold based on samples from a few boreholes. The English verb is to krige, and the most common noun is kriging. The word is sometimes capitalized as Kriging in the literature.
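
A minimal sketch of the prediction step follows, assuming the simplest setting: simple kriging with a zero prior mean and a squared-exponential prior covariance, with invented observations. In real geostatistical practice the covariance model is itself estimated from the data, e.g. via a variogram.

```python
import numpy as np

def cov(a, b, length=1.0):
    """Prior covariance: squared-exponential kernel (a modeling assumption)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)

x_obs = np.array([0.0, 1.0, 3.0, 4.0])   # sampled locations (invented)
y_obs = np.array([1.0, 2.0, 0.5, 1.5])   # observed values (invented)
x_new = np.array([2.0])                  # unsampled location to predict

K = cov(x_obs, x_obs)                    # covariances among the observations
k = cov(x_new, x_obs)                    # covariances to the prediction point

# BLUP mean and variance at x_new under the zero-mean prior.
weights = np.linalg.solve(K, k.T)        # kriging weights
y_pred = y_obs @ weights
y_var = cov(x_new, x_new) - k @ weights
print(y_pred, y_var)
```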


Interpolation in the context of Extrapolate

In mathematics, extrapolation is a type of estimation, beyond the original observation range, of the value of a variable on the basis of its relationship with another variable. It is similar to interpolation, which produces estimates between known observations, but extrapolation is subject to greater uncertainty and a higher risk of producing meaningless results.

Extrapolation may also mean extension of a method, assuming similar methods will be applicable. It may likewise apply to human experience, projecting, extending, or expanding known experience into an area not known or previously experienced; in doing so, one makes an assumption about the unknown (for example, a driver may extrapolate road conditions beyond what is currently visible, and these extrapolations may be correct or incorrect). The extrapolation method can be applied in the interior reconstruction problem.
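
The snippet below illustrates why extrapolation is riskier than interpolation: a polynomial fitted to samples of a smooth function behaves well inside the observed range but becomes unreliable outside it (the data and degree are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 20)                    # observation range: [0, 5]
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)

coeffs = np.polyfit(x, y, deg=5)                 # polynomial fit to the observations

print(np.polyval(coeffs, 2.5))   # interpolation: inside [0, 5], close to sin(2.5)
print(np.polyval(coeffs, 8.0))   # extrapolation: outside the range, the
                                 # polynomial's unbounded growth makes the
                                 # estimate unreliable, unlike the bounded sine
```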


Interpolation in the context of Blind spot (vision)

A blind spot, or scotoma, is an obscuration of the visual field. A particular blind spot known as the physiological blind spot, "blind point", or punctum caecum in medical literature, is the place in the visual field that corresponds to the lack of light-detecting photoreceptor cells on the optic disc of the retina, where the optic nerve passes through the optic disc. Because there are no cells to detect light on the optic disc, the corresponding part of the field of vision is invisible. Via processes in the brain, the blind spot is interpolated based on surrounding detail and information from the other eye, so it is not normally perceived.

Although all vertebrates have this blind spot, cephalopod eyes, which are only superficially similar because they evolved independently, do not. In them, the optic nerve approaches the receptors from behind, so it does not create a break in the retina.


Interpolation in the context of Numerical analysis

Numerical analysis is the study of algorithms that use numerical approximation (as opposed to symbolic manipulations) for the problems of mathematical analysis (as distinguished from discrete mathematics). It is the study of numerical methods that attempt to find approximate solutions of problems rather than the exact ones. Numerical analysis finds application in all fields of engineering and the physical sciences, and in the 21st century also the life and social sciences like economics, medicine, business and even the arts. Current growth in computing power has enabled the use of more complex numerical analysis, providing detailed and realistic mathematical models in science and engineering. Examples of numerical analysis include: ordinary differential equations as found in celestial mechanics (predicting the motions of planets, stars and galaxies), numerical linear algebra in data analysis, and stochastic differential equations and Markov chains for simulating living cells in medicine and biology.

Before modern computers, numerical methods often relied on hand interpolation formulas, using data from large printed tables. Since the mid-20th century, computers calculate the required functions instead, but many of the same formulas continue to be used in software algorithms.
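
One such hand formula is Newton's forward-difference formula for equally spaced tables; a sketch follows, using a small invented table of rounded sine values in place of a printed one.

```python
import numpy as np

def newton_forward(x0, h, table, x):
    """Newton's forward-difference interpolation from an equally spaced table,
    the kind of calculation once done by hand with printed tables."""
    diffs = np.array(table, dtype=float)
    s = (x - x0) / h                      # position within the table, in steps
    result, coeff = diffs[0], 1.0
    for k in range(1, len(table)):
        diffs = np.diff(diffs)            # k-th forward differences
        coeff *= (s - (k - 1)) / k        # next factor of the binomial coefficient
        result += coeff * diffs[0]
    return result

# A tiny "printed table": sin(x) at x = 0.1, 0.2, 0.3, 0.4, rounded to 5 places.
table = [0.09983, 0.19867, 0.29552, 0.38942]
print(newton_forward(0.1, 0.1, table, 0.125))   # ~0.12468; sin(0.125) ~ 0.12467
```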


Interpolation in the context of Trigonometric table

In mathematics, tables of trigonometric functions are useful in a number of areas. Before the existence of pocket calculators, trigonometric tables were essential for navigation, science and engineering. The calculation of mathematical tables was an important area of study, which led to the development of the first mechanical computing devices.

Modern computers and pocket calculators now generate trigonometric function values on demand, using special libraries of mathematical code. Often, these libraries use pre-calculated tables internally, and compute the required value by using an appropriate interpolation method. Interpolation of simple look-up tables of trigonometric functions is still used in computer graphics, where only modest accuracy may be required and speed is often paramount.
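
A sketch of that graphics-style approach follows, using a pre-calculated sine table with linear interpolation between adjacent entries; the table size and code are illustrative, not taken from any particular library.

```python
import numpy as np

# Pre-calculated table: one period of sine in 256 steps; the extra entry at
# the end makes the wraparound at the last interval trivial.
N = 256
TABLE = np.sin(2 * np.pi * np.arange(N + 1) / N)

def fast_sin(theta):
    """Table lookup with linear interpolation between adjacent entries."""
    pos = (theta / (2 * np.pi)) % 1.0 * N        # fractional table index in [0, N)
    i = int(pos)
    frac = pos - i
    return TABLE[i] * (1.0 - frac) + TABLE[i + 1] * frac

print(fast_sin(1.0), np.sin(1.0))   # agree to about four decimal places
```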


Interpolation in the context of Geostatistics

Geostatistics is a branch of statistics focusing on spatial or spatiotemporal datasets. Developed originally to predict probability distributions of ore grades for mining operations, it is currently applied in diverse disciplines including petroleum geology, hydrogeology, hydrology, meteorology, oceanography, geochemistry, geometallurgy, geography, forestry, environmental control, landscape ecology, soil science, and agriculture (esp. in precision farming). Geostatistics is applied in varied branches of geography, particularly those involving the spread of diseases (epidemiology), the practice of commerce and military planning (logistics), and the development of efficient spatial networks. Geostatistical algorithms are incorporated in many places, including geographic information systems (GIS).


Interpolation in the context of Curve fitting

Curve fitting is the process of constructing a curve, or mathematical function, that has the best fit to a series of data points, possibly subject to constraints. Curve fitting can involve either interpolation, where an exact fit to the data is required, or smoothing, in which a "smooth" function is constructed that approximately fits the data. A related topic is regression analysis, which focuses more on questions of statistical inference such as how much uncertainty is present in a curve that is fitted to data observed with random errors. Fitted curves can be used as an aid for data visualization, to infer values of a function where no data are available, and to summarize the relationships among two or more variables. Extrapolation refers to the use of a fitted curve beyond the range of the observed data, and is subject to a degree of uncertainty since it may reflect the method used to construct the curve as much as it reflects the observed data.

For linear-algebraic analysis of data, "fitting" usually means trying to find the curve that minimizes the vertical (y-axis) displacement of a point from the curve (e.g., ordinary least squares). However, for graphical and image applications, geometric fitting seeks to provide the best visual fit, which usually means trying to minimize the orthogonal distance to the curve (e.g., total least squares), or to otherwise include both axes of displacement of a point from the curve. Geometric fits are not popular because they usually require non-linear and/or iterative calculations, although they have the advantage of a more aesthetic and geometrically accurate result.
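
The snippet below contrasts the two modes described above, using polynomial fitting on invented data: an exact interpolating fit reproduces every data point, while an ordinary least squares line only approximates them by minimizing vertical displacements.

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 2.1, 2.9, 4.2])      # noisy, roughly linear data (invented)

# Interpolation: a degree-4 polynomial fits 5 points exactly.
exact = np.polyfit(x, y, deg=4)
print(np.polyval(exact, x) - y)               # residuals ~ 0 at every data point

# Smoothing: an ordinary least squares line minimizes the vertical (y-axis)
# displacements and only approximately fits the data.
smooth = np.polyfit(x, y, deg=1)
print(np.polyval(smooth, x) - y)              # small but nonzero residuals
```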


Interpolation in the context of ICC profile

In color management, an ICC profile is a set of data that characterizes a color input or output device, or a color space, according to standards promulgated by the International Color Consortium (ICC). Profiles describe the color attributes of a particular device or viewing requirement by defining a mapping between the device source or target color space and a profile connection space (PCS). This PCS is either CIELAB (L*a*b*) or CIEXYZ. Mappings may be specified using tables, to which interpolation is applied, or through a series of parameters for transformations.
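
A sketch of table-based mapping with interpolation follows; it applies trilinear interpolation inside a deliberately tiny, hypothetical single-channel lookup table (real ICC tables have many more grid points and map three channels).

```python
def trilinear(lut, r, g, b):
    """Trilinear interpolation inside a 2x2x2 unit-cube lookup table:
    interpolate along the blue axis, then green, then red."""
    c00 = lut[0][0][0] * (1 - b) + lut[0][0][1] * b
    c01 = lut[0][1][0] * (1 - b) + lut[0][1][1] * b
    c10 = lut[1][0][0] * (1 - b) + lut[1][0][1] * b
    c11 = lut[1][1][0] * (1 - b) + lut[1][1][1] * b
    c0 = c00 * (1 - g) + c01 * g
    c1 = c10 * (1 - g) + c11 * g
    return c0 * (1 - r) + c1 * r

# Hypothetical table: one output channel at the 8 corners of the RGB cube.
LUT = [[[0.0, 10.0], [20.0, 30.0]],
       [[40.0, 50.0], [60.0, 70.0]]]
print(trilinear(LUT, 0.5, 0.5, 0.5))   # 35.0, the blend of all eight corners
```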

Every device that captures or displays color can be profiled. Some manufacturers provide profiles for their products, and there are several products that allow an end-user to generate their own color profiles, typically through the use of a tristimulus colorimeter or a spectrophotometer (sometimes called a spectrocolorimeter).
