Reflection (physics) in the context of Vitrinite reflectance


⭐ Core Definition: Reflection (physics)

Reflection is the change in direction of a wavefront at an interface between two different media so that the wavefront returns into the medium from which it originated. Common examples include the reflection of light, sound and water waves. The law of reflection says that for specular reflection (for example at a mirror) the angle at which the wave is incident on the surface equals the angle at which it is reflected.

In acoustics, reflection causes echoes and is used in sonar. In geology, it is important in the study of seismic waves. Reflection is observed with surface waves in bodies of water. Reflection is observed with many types of electromagnetic wave, besides visible light. Reflection of VHF and higher frequencies is important for radio transmission and for radar. Even hard X-rays and gamma rays can be reflected at shallow angles with special "grazing" mirrors.
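
As a concrete illustration of the law of reflection stated above, the short Python sketch below (written for this dossier; the function name and vectors are illustrative, not from the source article) mirrors an incident direction about a unit surface normal using r = d - 2(d·n)n, which keeps the reflected angle equal to the incident angle.

    import numpy as np

    def reflect(direction, normal):
        """Reflect an incident direction vector about a unit surface normal."""
        n = normal / np.linalg.norm(normal)                # make sure the normal is a unit vector
        return direction - 2.0 * np.dot(direction, n) * n

    # Incident ray travelling down and to the right, hitting a horizontal mirror (normal points up)
    d = np.array([1.0, -1.0, 0.0])
    print(reflect(d, np.array([0.0, 1.0, 0.0])))           # [1. 1. 0.] -> angle in = angle out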


In this Dossier

Reflection (physics) in the context of Color

Color (or colour in Commonwealth English) is the visual perception produced by the activation of the different types of cone cells in the eye caused by light. Though color is not an inherent property of matter, color perception is related to an object's light absorption, emission, reflection and transmission. For most humans, visible wavelengths of light are the ones perceived in the visible light spectrum, with three types of cone cells (trichromacy). Other animals may have a different number of cone cell types or have eyes sensitive to different wavelengths, such as bees that can distinguish ultraviolet, and thus have a different color sensitivity range. Animal perception of color originates from different light wavelength or spectral sensitivity in cone cell types, which is then processed by the brain.

Colors have perceived properties such as hue, colorfulness, and lightness. Colors can also be additively mixed (mixing light) or subtractively mixed (mixing pigments). Because of metamerism, a mixture of colors in the right proportions may look the same as another stimulus with a different reflection or emission spectrum. For convenience, colors can be organized in a color space, which, when abstracted as a mathematical color model, assigns each region of color a corresponding set of numbers. As such, color spaces are an essential tool for color reproduction in print, photography, computer monitors, and television. Some of the most well-known color models and color spaces are RGB, CMYK, HSL/HSV, CIE Lab, and YCbCr/YUV.
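
As a small, hedged example of working in one of the color spaces listed above, the sketch below uses Python's standard-library colorsys module to convert an RGB triple into HSL-style coordinates (colorsys calls the ordering HLS); the particular color values are arbitrary.

    import colorsys

    # An orange-ish color, given as red/green/blue components in the range 0..1
    r, g, b = 1.0, 0.5, 0.0

    # Convert to hue, lightness and saturation
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    print(f"hue={h:.2f}  lightness={l:.2f}  saturation={s:.2f}")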

View the full Wikipedia page for Color

Reflection (physics) in the context of Image sensor

An image sensor or imager is a device that detects and conveys information used to form an image. It does so by converting the variable attenuation of light waves (as they pass through or reflect off objects) into signals, small bursts of current that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, optical mouse devices, medical imaging equipment, night vision equipment such as thermal imaging devices, radar, sonar, and others. As technology changes, electronic and digital imaging tends to replace chemical and analog imaging.

The two main types of electronic image sensors are the charge-coupled device (CCD) and the active-pixel sensor (CMOS sensor). Both CCD and CMOS sensors are based on metal–oxide–semiconductor (MOS) technology, with CCDs based on MOS capacitors and CMOS sensors based on MOSFET (MOS field-effect transistor) amplifiers. Analog sensors for invisible radiation tend to involve vacuum tubes of various kinds, while digital sensors include flat-panel detectors.
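
As a rough sketch of how an image sensor turns incident light into a digital signal, the toy model below counts photoelectrons, saturates at a full-well limit and quantizes the result; it is a simplification for this dossier, and the quantum-efficiency, full-well and gain figures are made-up illustrative values rather than the behaviour of any real CCD or CMOS part.

    def pixel_count(photons, quantum_efficiency=0.6, full_well=20000, adu_per_electron=0.25):
        """Very simplified pixel model: photons -> photoelectrons -> digital number."""
        electrons = min(photons * quantum_efficiency, full_well)   # collected charge saturates at the full well
        return int(electrons * adu_per_electron)                   # analog-to-digital conversion

    print(pixel_count(5_000))    # a moderately bright pixel
    print(pixel_count(100_000))  # a saturated pixel clips at the full-well limit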

View the full Wikipedia page for Image sensor

Reflection (physics) in the context of Telescope

A telescope is a device used to observe distant objects by their emission, absorption, or reflection of electromagnetic radiation. Originally, it was an optical instrument using lenses, curved mirrors, or a combination of both to observe distant objects – an optical telescope. Nowadays, the word "telescope" refers to a wide range of instruments capable of detecting different regions of the electromagnetic spectrum, and in some cases to other types of detectors.

The first known practical telescopes were refracting telescopes with glass lenses and were invented in the Netherlands at the beginning of the 17th century. They were used for both terrestrial applications and astronomy.

View the full Wikipedia page for Telescope

Reflection (physics) in the context of Biological pigment

A biological pigment, also known simply as a pigment or biochrome, is a substance produced by living organisms that has a color resulting from selective color absorption. Biological pigments include plant pigments and flower pigments. Many biological structures, such as skin, eyes, feathers, fur and hair contain pigments such as melanin in specialized cells called chromatophores. In some species, pigments accrue over very long periods during an individual's lifespan.

Pigment color differs from structural color in that it is the same for all viewing angles, whereas structural color is the result of selective reflection or iridescence, usually because of multilayer structures. For example, butterfly wings typically contain structural color, although many butterflies have cells that contain pigment as well.

View the full Wikipedia page for Biological pigment

Reflection (physics) in the context of Thermography

Infrared thermography (IRT), also known as thermal imaging, is a measurement and imaging technique in which a thermal camera detects infrared radiation originating from the surface of objects. This radiation has two main components: thermal emission from the object's surface, which depends on its temperature and emissivity, and reflected radiation from surrounding sources. When the object is not (fully) opaque, i.e. exhibits nonzero transmissivity at the camera's operating wavelengths, transmitted radiation also contributes to the observed signal. The result is a visible image called a thermogram. Thermal cameras most commonly operate in the long-wave infrared (LWIR) range (7–14 μm); less frequently, systems designed for the mid-wave infrared (MWIR) range (3–5 μm) are used.
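
The emitted and reflected components described above are often combined in a simple radiometric budget. The sketch below is this dossier's own simplification, assuming a grey, opaque surface (so reflectivity = 1 - emissivity) and using the Stefan–Boltzmann law in place of a full camera calibration.

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def observed_exitance(t_surface_k, t_surroundings_k, emissivity):
        """Emitted plus reflected radiant exitance for an opaque grey surface, in W/m^2."""
        emitted = emissivity * SIGMA * t_surface_k ** 4
        reflected = (1.0 - emissivity) * SIGMA * t_surroundings_k ** 4   # surroundings reflected by the surface
        return emitted + reflected

    # The same 320 K surface looks different to the camera depending on its emissivity
    print(observed_exitance(320.0, 293.0, emissivity=0.95))   # painted surface: signal dominated by emission
    print(observed_exitance(320.0, 293.0, emissivity=0.10))   # bare metal: signal dominated by reflection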

Since infrared radiation is emitted by all objects with a temperature above absolute zero according to the black body radiation law, thermography makes it possible to see one's environment with or without visible illumination. The amount of radiation emitted by an object increases with temperature, and thermography allows one to see variations in temperature. When viewed through a thermal imaging camera, warm objects stand out well against cooler backgrounds. For example, humans and other warm-blooded animals become easily visible against their environment in day or night. As a result, thermography is particularly useful to the military and other users of surveillance cameras.

View the full Wikipedia page for Thermography

Reflection (physics) in the context of Digital imaging

Digital imaging or digital image acquisition is the creation of a digital representation of the visual characteristics of an object, such as a physical scene or the interior structure of an object. The term is often assumed to imply or include the processing, compression, storage, printing and display of such images. A key advantage of a digital image, versus an analog image such as a film photograph, is the ability to digitally propagate copies of the original subject indefinitely without any loss of image quality.

Digital imaging can be classified by the type of electromagnetic radiation or other waves whose variable attenuation, as they pass through or reflect off objects, conveys the information that constitutes the image. In all classes of digital imaging, the information is converted by image sensors into digital signals that are processed by a computer and output as a visible-light image. For example, the medium of visible light allows digital photography (including digital videography) with various kinds of digital cameras (including digital video cameras). X-rays allow digital X-ray imaging (digital radiography, fluoroscopy, and CT), and gamma rays allow digital gamma ray imaging (digital scintigraphy, SPECT, and PET). Sound allows ultrasonography (such as medical ultrasonography) and sonar, and radio waves allow radar. Digital imaging lends itself well to image analysis by software, as well as to image editing (including image manipulation).

View the full Wikipedia page for Digital imaging

Reflection (physics) in the context of Prism (optics)

An optical prism is a transparent optical element with flat, polished surfaces that are designed to refract light. At least one surface must be angled—elements with two parallel surfaces are not prisms. The most familiar type of optical prism is the triangular prism, which has a triangular base and rectangular sides. Not all optical prisms are geometric prisms, and not all geometric prisms would count as an optical prism. Prisms can be made from any material that is transparent to the wavelengths for which they are designed. Typical materials include glass, acrylic and fluorite.

A dispersive prism can be used to break white light up into its constituent spectral colors (the colors of the rainbow) to form a spectrum. Other types of prisms can be used to reflect light, or to split light into components with different polarizations.
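
As a hedged sketch of why a dispersive prism spreads white light into a spectrum, the code below applies Snell's law at the entry face of a prism with two slightly different refractive indices; the indices are approximate values typical of crown glass, quoted for illustration only.

    import math

    def refraction_angle(theta_incident_deg, n1, n2):
        """Snell's law: n1 * sin(theta1) = n2 * sin(theta2); returns theta2 in degrees."""
        theta1 = math.radians(theta_incident_deg)
        return math.degrees(math.asin(n1 * math.sin(theta1) / n2))

    # Blue light bends more than red because the glass index is slightly higher for blue
    print(refraction_angle(45.0, n1=1.000, n2=1.514))  # red  (~656 nm), illustrative index
    print(refraction_angle(45.0, n1=1.000, n2=1.523))  # blue (~486 nm), illustrative index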

View the full Wikipedia page for Prism (optics)

Reflection (physics) in the context of Visual appearance

The visual appearance of objects is given by the way in which they reflect and transmit light. The color of objects is determined by the parts of the spectrum of (incident white) light that are reflected or transmitted without being absorbed. Additional appearance attributes are based on the directional distribution of reflected (BRDF) or transmitted light (BTDF) described by attributes like glossy, shiny versus dull, matte, clear, turbid, distinct, etc. Since "visual appearance" is a general concept that includes also various other visual phenomena, such as color, visual texture, visual perception of shape, size, etc., the specific aspects related to how humans see different spatial distributions of light (absorbed, transmitted and reflected, either regularly or diffusely) have been given the name cesia. It marks a difference (but also a relationship) with color, which could be defined as the sensation arising from different spectral compositions or distributions of light.
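
To make the idea of a directional reflectance distribution concrete, here is a minimal shading sketch combining a diffuse (Lambertian) term with a Phong-style specular lobe; it is a standard textbook toy model chosen for this dossier, not how any appearance standard formally defines a BRDF.

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def shade(light_dir, view_dir, normal, diffuse=0.7, specular=0.3, shininess=32):
        """Diffuse plus Phong-style specular reflection of unit-strength light."""
        l, v, n = normalize(light_dir), normalize(view_dir), normalize(normal)
        r = 2.0 * np.dot(n, l) * n - l                          # mirror direction of the incoming light
        diff = diffuse * max(np.dot(n, l), 0.0)                 # matte contribution, independent of view
        spec = specular * max(np.dot(r, v), 0.0) ** shininess   # glossy highlight, strongly directional
        return diff + spec

    n = np.array([0.0, 0.0, 1.0])
    light = np.array([0.0, 1.0, 1.0])
    print(shade(light, np.array([0.0, -1.0, 1.0]), n))  # viewer in the mirror direction: diffuse + highlight
    print(shade(light, np.array([0.0, 1.0, 1.0]), n))   # viewer back toward the light: diffuse only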

View the full Wikipedia page for Visual appearance

Reflection (physics) in the context of Transmission medium

A transmission medium is a system or substance that can mediate the propagation of signals for the purposes of telecommunication. Signals are typically imposed on a wave of some kind suitable for the chosen medium. For example, data can modulate sound, and a transmission medium for sounds may be air, but solids and liquids may also act as the transmission medium. Vacuum or air constitutes a good transmission medium for electromagnetic waves such as light and radio waves. While a material substance is not required for electromagnetic waves to propagate, such waves are usually affected by the transmission medium they pass through, for instance, by absorption or reflection or refraction at the interfaces between media. Technical devices can therefore be employed to transmit or guide waves. Thus, an optical fiber or a copper cable can be used as a transmission medium.

Electromagnetic radiation can be transmitted through an optical medium, such as optical fiber, or through twisted pair wires, coaxial cable, or dielectric-slab waveguides. It may also pass through any physical material that is transparent to the specific wavelength, such as water, air, glass, or concrete. Sound is, by definition, the vibration of matter, so it requires a physical medium for transmission, as do other kinds of mechanical waves and heat energy. Historically, science incorporated various aether theories to explain the transmission medium. However, it is now known that electromagnetic waves do not require a physical transmission medium, and so can travel through the vacuum of free space. Regions of the insulative vacuum can become conductive for electrical conduction through the presence of free electrons, holes, or ions.
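
How strongly a wave is reflected at an interface between two media can be estimated, at normal incidence, from the Fresnel reflectance R = ((n1 - n2) / (n1 + n2))². The sketch below uses commonly quoted approximate refractive indices for water and glass, here only as illustrative numbers.

    def normal_incidence_reflectance(n1, n2):
        """Fraction of intensity reflected at the boundary between two media, at normal incidence."""
        return ((n1 - n2) / (n1 + n2)) ** 2

    print(normal_incidence_reflectance(1.00, 1.33))  # air -> water: about 2% of the light is reflected
    print(normal_incidence_reflectance(1.00, 1.52))  # air -> glass: about 4% is reflected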

View the full Wikipedia page for Transmission medium

Reflection (physics) in the context of Opacity (optics)

Opacity is the measure of impenetrability to electromagnetic or other kinds of radiation, especially visible light. In radiative transfer, it describes the absorption and scattering of radiation in a medium, such as a plasma, dielectric, shielding material, glass, etc. An opaque object is neither transparent (allowing all light to pass through) nor translucent (allowing some light to pass through).

When light strikes an interface between two substances, in general, some may be reflected, some absorbed, some scattered, and the rest transmitted (also see refraction). Reflection can be diffuse, for example light reflecting off a white wall, or specular, for example light reflecting off a mirror. An opaque substance transmits no light, and therefore reflects, scatters, or absorbs all of it. Other categories of visual appearance, related to the perception of regular or diffuse reflection and transmission of light, have been organized under the concept of cesia in an order system with three variables, including opacity, transparency and translucency among the involved aspects. Both mirrors and carbon black are opaque.

Opacity depends on the frequency of the light being considered. For instance, some kinds of glass, while transparent in the visual range, are largely opaque to ultraviolet light. More extreme frequency-dependence is visible in the absorption lines of cold gases. Opacity can be quantified in many ways (see: Mathematical descriptions of opacity).

Different processes can lead to opacity, including absorption, reflection, and scattering.
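
The statement that an opaque substance transmits no light can be written as a simple energy budget: the reflected, absorbed-or-scattered, and transmitted fractions of the incident light sum to one. The sketch below is a minimal illustration with made-up fractions.

    def transmitted_fraction(reflected, absorbed):
        """Whatever is neither reflected nor absorbed (or scattered out) is transmitted."""
        assert 0.0 <= reflected + absorbed <= 1.0
        return 1.0 - reflected - absorbed

    print(transmitted_fraction(reflected=0.04, absorbed=0.01))  # clear glass: most light passes through
    print(transmitted_fraction(reflected=0.05, absorbed=0.95))  # carbon black: opaque, nothing transmitted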

View the full Wikipedia page for Opacity (optics)

Reflection (physics) in the context of Corpuscular theory of light

In optics, the corpuscular theory of light states that light is made up of small discrete particles called "corpuscles" (little particles) which travel in a straight line with a finite velocity and possess impetus. This notion was based on an alternate description of atomism of the time period.

Isaac Newton laid the foundations for this theory through his work in optics. This early conception of the particle theory of light was an early forerunner to the modern understanding of the photon. This theory came to dominate the conceptions of light in the eighteenth century, displacing the previously prominent vibration theories, where light was viewed as "pressure" of the medium between the source and the receiver, first championed by René Descartes, and later in a more refined form by Christiaan Huygens. The theory was partly correct, successfully explaining refraction, reflection, rectilinear propagation and, to a lesser extent, diffraction, but it fell out of favor in the early nineteenth century as the wave theory of light amassed new experimental evidence. The modern understanding of light is the concept of wave-particle duality.

View the full Wikipedia page for Corpuscular theory of light

Reflection (physics) in the context of Huygens–Fresnel principle

The Huygens–Fresnel principle (named after Dutch physicist Christiaan Huygens and French physicist Augustin-Jean Fresnel) states that every point on a wavefront is itself the source of spherical wavelets and that the secondary wavelets emanating from different points mutually interfere. The sum of these spherical wavelets forms a new wavefront. As such, the Huygens–Fresnel principle is a method of analysis applied to problems of luminous wave propagation both in the far-field limit and in near-field diffraction as well as reflection.
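
The principle can be applied quite literally in a numerical sketch: treat every point across an aperture as a wavelet source and superpose the wavelets on a screen. The single-slit example below is this dossier's own illustration; the slit width, wavelength and screen geometry are arbitrary.

    import numpy as np

    wavelength = 500e-9                        # 500 nm green light
    k = 2.0 * np.pi / wavelength
    slit = np.linspace(-50e-6, 50e-6, 2000)    # wavelet sources across a 100 um slit
    screen_z = 1.0                             # screen 1 m away

    for x in np.linspace(-0.05, 0.05, 11):     # positions on the screen
        r = np.sqrt((x - slit) ** 2 + screen_z ** 2)   # distance from each wavelet source
        field = np.sum(np.exp(1j * k * r) / r)         # superpose the spherical wavelets
        print(f"x = {x:+.3f} m   relative intensity = {abs(field) ** 2:.3e}")

The central position shows the strongest intensity, with much weaker diffraction side lobes further out.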

View the full Wikipedia page for Huygens–Fresnel principle

Reflection (physics) in the context of Reflection seismology

Reflection seismology (or seismic reflection) is a method of exploration geophysics that uses the principles of seismology to estimate the properties of the Earth's subsurface from reflected seismic waves. The method requires a controlled seismic source of energy, such as dynamite or Tovex blast, a specialized air gun or a seismic vibrator. Reflection seismology is similar to sonar and echolocation.
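
Two quantities make the method concrete: the depth implied by a reflection's two-way travel time, and the reflection coefficient set by the acoustic impedance contrast across an interface. The sketch below uses textbook formulas with illustrative rock properties chosen for this dossier.

    def depth_from_twt(two_way_time_s, velocity_m_s):
        """Reflector depth from two-way travel time, assuming a constant overburden velocity."""
        return velocity_m_s * two_way_time_s / 2.0

    def reflection_coefficient(rho1, v1, rho2, v2):
        """Normal-incidence reflection coefficient from acoustic impedances Z = rho * v."""
        z1, z2 = rho1 * v1, rho2 * v2
        return (z2 - z1) / (z2 + z1)

    print(depth_from_twt(1.2, 2500.0))                     # 1500 m down at 2500 m/s
    print(reflection_coefficient(2200, 2500, 2400, 3000))  # shale over sandstone (illustrative values)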

View the full Wikipedia page for Reflection seismology

Reflection (physics) in the context of Inverse-square law

In physical science, an inverse-square law is any scientific law stating that the observed "intensity" of a specified physical quantity (being nothing more than the value of the physical quantity) is inversely proportional to the square of the distance from the source of that physical quantity. The fundamental cause for this can be understood as geometric dilution corresponding to point-source radiation into three-dimensional space.

Radar energy expands during both the signal transmission and the reflected return, so the inverse square for both paths means that the radar will receive energy according to the inverse fourth power of the range.
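
A quick numerical check of the statement above: doubling the range cuts a one-way intensity by a factor of four but the two-way radar echo by a factor of sixteen. The functions are this dossier's own toy versions, with the radar constants deliberately dropped.

    import math

    def one_way_intensity(power_w, range_m):
        """Point-source intensity falls off as 1/R^2 (the inverse-square law)."""
        return power_w / (4.0 * math.pi * range_m ** 2)

    def radar_echo(power_w, range_m):
        """Two-way radar return scales as 1/R^4 (spreading out to the target and back)."""
        return power_w / range_m ** 4

    print(one_way_intensity(1.0, 1000.0) / one_way_intensity(1.0, 2000.0))  # 4.0
    print(radar_echo(1.0, 1000.0) / radar_echo(1.0, 2000.0))                # 16.0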

View the full Wikipedia page for Inverse-square law

Reflection (physics) in the context of Solar power satellite

Space-based solar power (SBSP or SSP) is the concept of collecting solar power in outer space with solar power satellites (SPS) and distributing it to Earth. Its advantages include a higher collection of energy due to the lack of reflection and absorption by the atmosphere, the possibility of very little night, and a better ability to orient to face the Sun. Space-based solar power systems convert sunlight to some other form of energy (such as microwaves) which can be transmitted through the atmosphere to receivers on the Earth's surface.

Solar panels on spacecraft have been in use since 1958, when Vanguard I used them to power one of its radio transmitters; however, the term (and acronyms) above are generally used in the context of large-scale transmission of energy for use on Earth.

View the full Wikipedia page for Solar power satellite

Reflection (physics) in the context of Puddle

A puddle is a small accumulation of liquid, usually water, on a surface. It can form either by pooling in a depression on the surface, or by surface tension upon a flat surface. Puddles are often characterized by murky water or mud due to the disturbance and dissolving of surrounding sediment, primarily due to precipitation.

Generally a puddle is shallow enough to walk through, and too small to traverse with a boat or raft. Small wildlife may be attracted to puddles.

View the full Wikipedia page for Puddle

Reflection (physics) in the context of Microscopy

Microscopy is the technical field of using microscopes to view subjects too small to be seen with the naked eye (objects that are not within the resolution range of the normal eye). There are three well-known branches of microscopy: optical, electron, and scanning probe microscopy, along with the emerging field of X-ray microscopy.

Optical microscopy and electron microscopy involve the diffraction, reflection, or refraction of electromagnetic radiation or electron beams interacting with the specimen, and the collection of the scattered radiation or another signal in order to create an image. This process may be carried out by wide-field irradiation of the sample (for example standard light microscopy and transmission electron microscopy) or by scanning a fine beam over the sample (for example confocal laser scanning microscopy and scanning electron microscopy). Scanning probe microscopy involves the interaction of a scanning probe with the surface of the object of interest. The development of microscopy revolutionized biology, gave rise to the field of histology and so remains an essential technique in the life and physical sciences.

X-ray microscopy is three-dimensional and non-destructive, allowing for repeated imaging of the same sample for in situ or 4D studies, and providing the ability to "see inside" the sample being studied before sacrificing it to higher resolution techniques. A 3D X-ray microscope uses the technique of computed tomography (microCT), rotating the sample 360 degrees and reconstructing the images. CT is typically carried out with a flat-panel detector. A 3D X-ray microscope employs a range of objectives, e.g., from 4X to 40X, and can also include a flat-panel detector.
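
For optical microscopy, the resolution mentioned above is commonly estimated with the Abbe diffraction limit, d ≈ λ / (2 NA). The sketch below evaluates it for two typical (illustrative) objectives.

    def abbe_limit_nm(wavelength_nm, numerical_aperture):
        """Smallest resolvable separation for a diffraction-limited light microscope."""
        return wavelength_nm / (2.0 * numerical_aperture)

    print(abbe_limit_nm(550.0, 0.95))  # dry objective:      about 290 nm
    print(abbe_limit_nm(550.0, 1.40))  # oil-immersion lens: about 196 nm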

View the full Wikipedia page for Microscopy

Reflection (physics) in the context of Infra-red (IR) spectroscopy

Infrared spectroscopy (IR spectroscopy or vibrational spectroscopy) is the measurement of the interaction of infrared radiation with matter by absorption, emission, or reflection. It is used to study and identify chemical substances or functional groups in solid, liquid, or gaseous forms. It can be used to characterize new materials or identify and verify known and unknown samples. The method or technique of infrared spectroscopy is conducted with an instrument called an infrared spectrometer (or spectrophotometer) which produces an infrared spectrum. An IR spectrum can be visualized in a graph of infrared light absorbance (or transmittance) on the vertical axis vs. frequency, wavenumber or wavelength on the horizontal axis. Typical units of wavenumber used in IR spectra are reciprocal centimeters, with the symbol cm⁻¹. Units of IR wavelength are commonly given in micrometers (formerly called "microns"), symbol μm, which are related to the wavenumber in a reciprocal way. A common laboratory instrument that uses this technique is a Fourier transform infrared (FTIR) spectrometer. Two-dimensional IR is also possible.

The infrared portion of the electromagnetic spectrum is usually divided into three regions: the near-, mid- and far-infrared, named for their relation to the visible spectrum. The higher-energy near-IR, approximately 14,000–4,000 cm⁻¹ (0.7–2.5 μm wavelength), can excite overtone or combination modes of molecular vibrations. The mid-infrared, approximately 4,000–400 cm⁻¹ (2.5–25 μm), is generally used to study the fundamental vibrations and associated rotational–vibrational structure. The far-infrared, approximately 400–10 cm⁻¹ (25–1,000 μm), has low energy and may be used for rotational spectroscopy and low frequency vibrations. The region from 2–130 cm⁻¹, bordering the microwave region, is considered the terahertz region and may probe intermolecular vibrations. The names and classifications of these subregions are conventions, and are only loosely based on the relative molecular or electromagnetic properties.
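
The reciprocal relationship between wavenumber and wavelength mentioned above reduces to wavenumber [cm⁻¹] = 10,000 / wavelength [μm]. A quick sketch, matching the region boundaries quoted in this section:

    def wavenumber_cm1(wavelength_um):
        """Convert an IR wavelength in micrometres to a wavenumber in reciprocal centimetres."""
        return 10000.0 / wavelength_um

    print(wavenumber_cm1(2.5))   # 4000 cm^-1, the near-/mid-infrared boundary
    print(wavenumber_cm1(25.0))  # 400 cm^-1, the mid-/far-infrared boundary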

View the full Wikipedia page for Infra-red (IR) spectroscopy