Astronomy

Can IR sensitive cameras read signals from Venus's surface emissions?


This question got me thinking about this.

I know that we measure the Earth's surface temperature by satellite (perhaps somewhat inaccurately, but it's done all the same). Using Venus as an example, can we see evidence of Venus's 800-degree surface from space, or is Venus's temperature deduced more from modeling and the Soviet probes that landed there?

All things with heat give off a thermal signature, and that signature isn't a single wavelength but a range of wavelengths. I assume that, by measuring the overall color emitted and/or the peak wavelength, it's a fairly straightforward calculation to work out the temperature.
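That intuition is formalized by Wien's displacement law, which ties the peak emission wavelength of a blackbody directly to its temperature. A minimal sketch (the 3.94-micron input is just an illustrative value chosen to land near Venus's surface temperature):

```python
# Wien's displacement law: lambda_peak = b / T, with b ~ 2.898e-3 m*K.
# Given a measured peak emission wavelength, estimate the blackbody temperature.

WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def temperature_from_peak(wavelength_m: float) -> float:
    """Blackbody temperature (K) implied by the peak emission wavelength (m)."""
    return WIEN_B / wavelength_m

# A ~735 K surface like Venus's peaks near 3.9 microns, deep in the infrared:
print(round(temperature_from_peak(3.94e-6)))  # ~736 K
```

The same relation explains why room-temperature objects (~300 K) peak near 10 microns, well outside the visible range.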

But when you have an atmosphere (and this is the main part of my question), I assume the atmosphere is to some degree transparent, so you'd be seeing, in a sense, heat through a medium and from inside it as well as from its upper surface. But the heat you see diminishes depending on how opaque or transparent that atmosphere is at the specific wavelength.

Are there any good ballpark estimates of how thick (or how dense or pressurized) an atmosphere can be before we lose the temperature signature through it? In the case of Venus, does any of the IR generated by its 800-degree surface make it through its atmosphere, where it would be observable by spacecraft and give a good measurement of its surface temperature? Or was its surface temperature deduced from different types of models and that mistreated Soviet spacecraft that didn't last long once it had landed?

I imagine some atmospheric gases are more transparent than others.

Just to clarify - a specific "yes/no we can detect IR from Venus's surface" answer and a more general answer as to how far we can see heat through an atmosphere are both fine.


Yes. A thermal image from the surface can be detected, even by amateurs.

The ground is heated to about 700 K, and thermal emission at a wavelength of 1 micron is just about observable by skilled amateurs with the right equipment. Mountains, which are cooler by about 30 K, can be distinguished at this wavelength.
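The reason a mere ~30 K difference is visible at 1 micron can be sketched with the Planck law: 1 micron sits far out on the short-wavelength tail of a 700 K blackbody, where radiance is extremely sensitive to temperature. A minimal illustration:

```python
import math

# Planck spectral radiance B(lambda, T) of a blackbody, SI units throughout.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Spectral radiance in W * sr^-1 * m^-3."""
    x = H * C / (wavelength_m * KB * temp_k)
    return (2 * H * C**2 / wavelength_m**5) / math.expm1(x)

# At 1 micron, 700 K plains versus mountains ~30 K cooler:
contrast = planck(1e-6, 700.0) / planck(1e-6, 670.0)
print(round(contrast, 1))  # ~2.5x brighter
```

So the warmer lowlands outshine the cooler highlands by roughly a factor of 2.5 at this wavelength, which is why surface relief shows up in 1-micron imagery.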

From orbit, more detail can be seen. Akatsuki used the 1 micron band to observe the surface in 2015.

At other wavelengths, cloud patterns are revealed. Galileo took images at 2.3 microns during its flyby. They show sulphuric acid clouds at 50 km above the surface (10 km below the visible cloud layer) and at about 270 K, with gaps exposing the lower atmosphere, glowing at 500 K.


Infrared

Infrared (IR), sometimes called infrared light, is electromagnetic radiation (EMR) with wavelengths longer than those of visible light. It is therefore invisible to the human eye. IR is generally understood to encompass wavelengths from the nominal red edge of the visible spectrum around 700 nanometers (frequency 430 THz), to 1 millimeter (300 GHz) [1] (although the longer IR wavelengths are often designated rather as terahertz radiation). Black-body radiation from objects near room temperature is almost all at infrared wavelengths. As a form of electromagnetic radiation, IR propagates energy and momentum, with properties corresponding to both those of a wave and of a particle, the photon.
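Those band edges can be checked with the wavelength-frequency relation f = c / λ; the small mismatch with the nominal 430 THz figure reflects rounding in the conventional value:

```python
C = 2.99792458e8  # speed of light, m/s

def freq_thz(wavelength_m: float) -> float:
    """Frequency in THz corresponding to a wavelength in metres (f = c / lambda)."""
    return C / wavelength_m / 1e12

print(round(freq_thz(700e-9)))   # ~428 THz, the nominal red edge (~430 THz)
print(round(freq_thz(1e-3), 1))  # 0.3 THz, i.e. 300 GHz at 1 mm
```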

Infrared radiation was discovered in 1800 by astronomer Sir William Herschel, who discovered a type of invisible radiation in the spectrum lower in energy than red light, by means of its effect on a thermometer. [2] Slightly more than half of the total energy from the Sun was eventually found to arrive on Earth in the form of infrared. The balance between absorbed and emitted infrared radiation has a critical effect on Earth's climate.

Infrared radiation is emitted or absorbed by molecules when they change their rotational-vibrational movements. It excites vibrational modes in a molecule through a change in the dipole moment, making it a useful frequency range for study of these energy states for molecules of the proper symmetry. Infrared spectroscopy examines absorption and transmission of photons in the infrared range. [3]

Infrared radiation is used in industrial, scientific, military, commercial, and medical applications. Night-vision devices using active near-infrared illumination allow people or animals to be observed without the observer being detected. Infrared astronomy uses sensor-equipped telescopes to penetrate dusty regions of space such as molecular clouds, to detect objects such as planets, and to view highly red-shifted objects from the early days of the universe. [4] Infrared thermal-imaging cameras are used to detect heat loss in insulated systems, to observe changing blood flow in the skin, and to detect the overheating of electrical components. [5]

Military and civilian applications include target acquisition, surveillance, night vision, homing, and tracking. Humans at normal body temperature radiate chiefly at wavelengths around 10 μm (micrometers). Non-military uses include thermal efficiency analysis, environmental monitoring, industrial facility inspections, detection of grow-ops, remote temperature sensing, short-range wireless communication, spectroscopy, and weather forecasting.



As one of the brightest objects in the sky, Venus has been known since prehistoric times, and as such, many ancient cultures recorded observations of the planet. A cylinder seal from the Jemdet Nasr period indicates that the ancient Sumerians already knew that the morning and evening stars were the same celestial object. The Sumerians named the planet after the goddess Inanna, who was known as Ishtar by the later Akkadians and Babylonians. [1] She had a dual role as a goddess of both love and war, thereby representing a deity that presided over birth and death. [2] [3] One of the oldest surviving astronomical documents, from the Babylonian library of Ashurbanipal around 1600 BC, is a 21-year record of the appearances of Venus.

Because the movements of Venus appear to be discontinuous (it disappears due to its proximity to the sun, for many days at a time, and then reappears on the other horizon), some cultures did not immediately recognize Venus as a single entity; instead, they assumed it to be two separate stars, one on each horizon: the morning star and the evening star. The Ancient Egyptians, for example, believed Venus to be two separate bodies and knew the morning star as Tioumoutiri and the evening star as Ouaiti. [4] The Ancient Greeks called the morning star Φωσφόρος , Phosphoros (Latinized Phosphorus), the "Bringer of Light" or Ἐωσφόρος , Eosphoros (Latinized Eosphorus), the "Bringer of Dawn". The evening star they called Hesperos (Latinized Hesperus) ( Ἓσπερος , the "star of the evening"). [5] By Hellenistic times, the ancient Greeks identified it as a single planet, [6] [7] which they named after their goddess of love, Aphrodite (Αφροδίτη) (Phoenician Astarte), [8] a planetary name that is retained in modern Greek. [9] Hesperos became a loanword in Latin as Vesper, and Phosphoros was translated as Lucifer ("Light Bearer").

Venus was considered the most important celestial body observed by the Maya, who called it Chac ek, [10] or Noh Ek', "the Great Star". The Maya monitored the movements of Venus closely and observed it in daytime. The positions of Venus and other planets were thought to influence life on Earth, so the Maya and other ancient Mesoamerican cultures timed wars and other important events based on their observations. In the Dresden Codex, the Maya included an almanac showing Venus's full cycle, in five sets of 584 days each (approximately eight years), after which the patterns repeated (since Venus has a synodic period of 583.92 days). [11] The Maya civilization developed a religious calendar, based in part upon the motions of the planet, and held the motions of Venus to determine the propitious time for events such as war. They also named it Xux Ek', the Wasp Star. The Maya were aware of the planet's synodic period, and could compute it to within a hundredth part of a day. [12]

Because its orbit takes it between the Earth and the Sun, Venus as seen from Earth exhibits visible phases in much the same manner as the Earth's Moon. Galileo Galilei was the first person to observe the phases of Venus in December 1610, an observation which supported Copernicus's then-contentious heliocentric description of the Solar System. He also noted changes in the size of Venus's visible diameter when it was in different phases, suggesting that it was farther from Earth when it was full and nearer when it was a crescent. This observation strongly supported the heliocentric model. Venus (and also Mercury) is not visible from Earth when it is full, since at that time it is at superior conjunction, rising and setting concomitantly with the Sun and hence lost in the Sun's glare.

Venus is brightest when approximately 25% of its disk is illuminated; this typically occurs 37 days both before (in the evening sky) and after (in the morning sky) its inferior conjunction. Its greatest elongations occur approximately 70 days before and after inferior conjunction, at which time it is half full. Between these two intervals Venus is actually visible in broad daylight, if the observer knows specifically where to look for it. The planet's period of retrograde motion is 20 days on either side of the inferior conjunction. In fact, through a telescope Venus at greatest elongation appears less than half full due to Schröter's effect, first noticed in 1793 and shown in 1996 to be due to its thick atmosphere.

On rare occasions, Venus can actually be seen in both the morning (before sunrise) and evening (after sunset) on the same day. This scenario arises when Venus is at its maximum separation from the ecliptic and concomitantly at inferior conjunction; then one hemisphere (Northern or Southern) will be able to see it at both times. This opportunity presented itself most recently for Northern Hemisphere observers within a few days on either side of March 29, 2001, and for those in the Southern Hemisphere, on and around August 19, 1999. These respective events repeat themselves every eight years pursuant to the planet's synodic cycle.

Transits of Venus directly between the Earth and the Sun's visible disc are rare astronomical events. The first such transit to be predicted and observed was the Transit of Venus, 1639, seen and recorded by English astronomers Jeremiah Horrocks and William Crabtree. The observation by Mikhail Lomonosov of the transit of 1761 provided the first evidence that Venus had an atmosphere, and the 19th-century observations of parallax during Venus transits allowed the distance between the Earth and Sun to be accurately calculated for the first time. Transits can only occur either in early June or early December, these being the points at which Venus crosses the ecliptic (the orbital plane of the Earth), and occur in pairs at eight-year intervals, with each such pair more than a century apart. The most recent pair of transits of Venus occurred in 2004 and 2012, while the prior pair occurred in 1874 and 1882.

In the 19th century, many observers stated that Venus had a period of rotation of roughly 24 hours. Italian astronomer Giovanni Schiaparelli was the first to predict a significantly slower rotation, proposing that Venus was tidally locked with the Sun (as he had also proposed for Mercury). While not actually true for either body, this was still a reasonably accurate estimate. The near-resonance between its rotation and its closest approach to Earth helped to create this impression, as Venus always seemed to be facing the same direction when it was in the best location for observations to be made. The rotation rate of Venus was first measured during the 1961 conjunction, observed by radar from a 26 m antenna at Goldstone, California, the Jodrell Bank Radio Observatory in the UK, and the Soviet deep space facility in Yevpatoria, Crimea. Accuracy was refined at each subsequent conjunction, primarily from measurements made from Goldstone and Eupatoria. The fact that rotation was retrograde was not confirmed until 1964.

Before radio observations in the 1960s, many believed that Venus contained a lush, Earth-like environment. This was due to the planet's size and orbital radius, which suggested a fairly Earth-like situation as well as to the thick layer of clouds which prevented the surface from being seen. Among the speculations on Venus were that it had a jungle-like environment or that it had oceans of either petroleum or carbonated water. However, microwave observations by C. Mayer et al., [13] indicated a high-temperature source (600 K). Strangely, millimetre-band observations made by A. D. Kuzmin indicated much lower temperatures. [14] Two competing theories explained the unusual radio spectrum, one suggesting the high temperatures originated in the ionosphere, and another suggesting a hot planetary surface.

In September 2020 a team at Cardiff University announced that observations of Venus using the James Clerk Maxwell Telescope and Atacama Large Millimeter Array in 2017 and 2019 indicated that the Venusian atmosphere contained phosphine (PH3) in concentrations 10,000 times higher than those that could be ascribed to any known non-biological source on Venus. The phosphine was detected at heights of at least 30 miles above the surface of Venus, and was detected primarily at mid-latitudes with none detected at the poles of Venus. This indicates the potential presence of biological organisms on Venus. [15] [16]

After the Moon, Venus was the second object in the Solar System to be explored by radar from the Earth. The first studies were carried out in 1961 at NASA's Goldstone Observatory, part of the Deep Space Network. At successive inferior conjunctions, Venus was observed both by Goldstone and the National Astronomy and Ionosphere Center in Arecibo. The studies carried out were similar to the earlier measurement of transits of the meridian, which had revealed in 1963 that the rotation of Venus was retrograde (it rotates in the opposite direction to that in which it orbits the Sun). The radar observations also allowed astronomers to determine that the rotation period of Venus was 243.1 days, and that its axis of rotation was almost perpendicular to its orbital plane. It was also established that the radius of the planet was 6,052 kilometres (3,761 mi), some 70 kilometres (43 mi) less than the best previous figure obtained with terrestrial telescopes.

Interest in the geological characteristics of Venus was stimulated by the refinement of imaging techniques between 1970 and 1985. Early radar observations suggested merely that the surface of Venus was more compacted than the dusty surface of the Moon. The first radar images taken from the Earth showed very bright (radar-reflective) highlands christened Alpha Regio, Beta Regio, and Maxwell Montes; improvements in radar techniques later achieved an image resolution of 1–2 kilometres.

There have been numerous unmanned missions to Venus. Ten Soviet probes have achieved a soft landing on the surface, with up to 110 minutes of communication from the surface, all without return. Launch windows occur every 19 months.

Early flybys

On February 12, 1961, the Soviet spacecraft Venera 1 was the first flyby probe launched to another planet. An overheated orientation sensor caused it to malfunction, losing contact with Earth before its closest approach to Venus of 100,000 km. However, the probe was first to combine all the necessary features of an interplanetary spacecraft: solar panels, parabolic telemetry antenna, 3-axis stabilization, course-correction engine, and the first launch from parking orbit.

The first successful flyby Venus probe was the American Mariner 2 spacecraft, which flew past Venus in 1962, coming within 35,000 km. A modified Ranger Moon probe, it established that Venus has practically no intrinsic magnetic field and measured the temperature of the planet's atmosphere to be approximately 500 °C (773 K; 932 °F). [17]

The Soviet Union launched the Zond 1 probe to Venus in 1964, but it malfunctioned sometime after its May 16 telemetry session.

During another American flyby in 1967, Mariner 5 measured the strength of Venus's magnetic field. In 1974, Mariner 10 swung by Venus on its way to Mercury and took ultraviolet photographs of the clouds, revealing the extraordinarily high wind speeds in the Venusian atmosphere.

Early landings

On March 1, 1966 the Venera 3 Soviet space probe crash-landed on Venus, becoming the first spacecraft to reach the surface of another planet. Its sister craft Venera 2 had failed due to overheating shortly before completing its flyby mission.

The descent capsule of Venera 4 entered the atmosphere of Venus on October 18, 1967, making it the first probe to return direct measurements from another planet's atmosphere. The capsule measured temperature, pressure, and density, and performed 11 automatic chemical experiments to analyze the atmosphere. It discovered that the atmosphere of Venus was 95% carbon dioxide (CO2), and in combination with radio occultation data from the Mariner 5 probe, showed that surface pressures were far greater than expected (75 to 100 atmospheres).

These results were verified and refined by the Venera 5 and Venera 6 in May 1969. But thus far, none of these missions had reached the surface while still transmitting. Venera 4's battery ran out while still slowly floating through the massive atmosphere, and Venera 5 and 6 were crushed by high pressure 18 km (60,000 ft) above the surface.

The first successful landing on Venus was by Venera 7 on December 15, 1970 — the first successful soft (non-crash) landing on another planet, as well as the first successful transmission of data from another planet’s surface to Earth. [18] [19] Venera 7 remained in contact with Earth for 23 minutes, relaying surface temperatures of 455 °C to 475 °C (855 °F to 885 °F). Venera 8 landed on July 22, 1972. In addition to pressure and temperature profiles, a photometer showed that the clouds of Venus formed a layer ending over 35 kilometres (22 mi) above the surface. A gamma ray spectrometer analyzed the chemical composition of the crust.

Lander/orbiter pairs

Venera 9 and 10

The Soviet probe Venera 9 entered orbit on October 22, 1975, becoming the first artificial satellite of Venus. A battery of cameras and spectrometers returned information about the planet's clouds, ionosphere and magnetosphere, as well as performing bi-static radar measurements of the surface. The 660 kg (1,455 lb) descent vehicle [21] separated from Venera 9 and landed, taking the first pictures of the surface and analyzing the crust with a gamma ray spectrometer and a densitometer. During descent, pressure, temperature and photometric measurements were made, as well as backscattering and multi-angle scattering (nephelometer) measurements of cloud density. It was discovered that the clouds of Venus are formed in three distinct layers. On October 25, Venera 10 arrived and carried out a similar program of study.

Pioneer Venus

In 1978, NASA sent two Pioneer spacecraft to Venus. The Pioneer mission consisted of two components, launched separately: an orbiter and a multiprobe. The Pioneer Venus Multiprobe carried one large and three small atmospheric probes. The large probe was released on November 16, 1978 and the three small probes on November 20. All four probes entered the Venusian atmosphere on December 9, followed by the delivery vehicle. Although not expected to survive the descent through the atmosphere, one probe continued to operate for 45 minutes after reaching the surface. The Pioneer Venus Orbiter was inserted into an elliptical orbit around Venus on December 4, 1978. It carried 17 experiments and operated until the fuel used to maintain its orbit was exhausted and atmospheric entry destroyed the spacecraft in August 1992.

Further Soviet missions

Also in 1978, Venera 11 and Venera 12 flew past Venus, dropping descent vehicles on December 21 and December 25 respectively. The landers carried colour cameras and a soil drill and analyzer, which unfortunately malfunctioned. Each lander made measurements with a nephelometer, mass spectrometer, gas chromatograph, and a cloud-droplet chemical analyzer using X-ray fluorescence that unexpectedly discovered a large proportion of chlorine in the clouds, in addition to sulfur. Strong lightning activity was also detected.

In 1982, the Soviet Venera 13 sent the first colour image of Venus's surface and analysed the X-ray fluorescence of an excavated soil sample. The probe operated for a record 127 minutes on the planet's hostile surface. Also in 1982, the Venera 14 lander detected possible seismic activity in the planet's crust.

In December 1984, during the apparition of Halley's Comet, the Soviet Union launched the two Vega probes to Venus. Vega 1 and Vega 2 encountered Venus in June 1985, each deploying a lander and an instrumented helium balloon. The balloon-borne aerostat probes floated at about 53 km altitude for 46 and 60 hours respectively, traveling about 1/3 of the way around the planet and allowing scientists to study the dynamics of the most active part of Venus's atmosphere. These measured wind speed, temperature, pressure and cloud density. More turbulence and convection activity than expected was discovered, including occasional plunges of 1 to 3 km in downdrafts.

The landing vehicles carried experiments focusing on cloud aerosol composition and structure. Each carried an ultraviolet absorption spectrometer, aerosol particle-size analyzers, and devices for collecting aerosol material and analyzing it with a mass spectrometer, a gas chromatograph, and an X-ray fluorescence spectrometer. The upper two layers of the clouds were found to be sulfuric acid droplets, but the lower layer is probably composed of phosphoric acid solution. The crust of Venus was analyzed with the soil drill experiment and a gamma ray spectrometer. As the landers carried no cameras on board, no images were returned from the surface. They would be the last probes to land on Venus for decades. The Vega spacecraft continued to rendezvous with Halley's Comet nine months later, bringing an additional 14 instruments and cameras for that mission.

The multi-aim Soviet Vesta mission, developed in cooperation with European countries for flights in 1991–1994 but cancelled when the Soviet Union disbanded, included in its original plan the delivery of balloons and a small lander to Venus.

Orbiters

Venera 15 and 16

In October 1983, Venera 15 and Venera 16 entered polar orbits around Venus. The images had a 1–2 kilometre (0.6–1.2 mile) resolution, comparable to those obtained by the best Earth radars. Venera 15 analyzed and mapped the upper atmosphere with an infrared Fourier spectrometer. From November 11, 1983 to July 10, 1984, both satellites mapped the northern third of the planet with synthetic aperture radar. These results provided the first detailed understanding of the surface geology of Venus, including the discovery of massive shield volcanoes and unusual features such as coronae and arachnoids. Venus had no evidence of plate tectonics, unless the northern third of the planet happened to be a single plate. The altimetry data obtained by the Venera missions had a resolution four times better than Pioneer's.

Magellan

On August 10, 1990, the American Magellan probe, named after the explorer Ferdinand Magellan, arrived at its orbit around the planet and started a mission of detailed radar mapping at a frequency of 2.38 GHz. [22] Whereas previous probes had created low-resolution radar maps of continent-sized formations, Magellan mapped 98% of the surface with a resolution of approximately 100 m. The resulting maps were comparable to visible-light photographs of other planets, and are still the most detailed in existence. Magellan greatly improved scientific understanding of the geology of Venus: the probe found no signs of plate tectonics, but the scarcity of impact craters suggested the surface was relatively young, and there were lava channels thousands of kilometers long. After a four-year mission, Magellan, as planned, plunged into the atmosphere on October 11, 1994, and partly vaporized; some sections are thought to have hit the planet's surface.

Venus Express

Venus Express was a mission by the European Space Agency to study the atmosphere and surface characteristics of Venus from orbit. The design was based on ESA's Mars Express and Rosetta missions. The probe's main objective was the long-term observation of the Venusian atmosphere, which it is hoped will also contribute to an understanding of Earth's atmosphere and climate. It also made global maps of Venerean surface temperatures, and attempted to observe signs of life on Earth from a distance.

Venus Express successfully assumed a polar orbit on April 11, 2006. The mission was originally planned to last for two Venusian years (about 500 Earth days), but was extended to the end of 2014 until its propellant was exhausted. Some of the first results emerging from Venus Express include evidence of past oceans, the discovery of a huge double atmospheric vortex at the south pole, and the detection of hydroxyl in the atmosphere.

Akatsuki

Akatsuki was launched on May 20, 2010, by JAXA, and was planned to enter Venusian orbit in December 2010. However, the orbital insertion maneuver failed and the spacecraft was left in heliocentric orbit. It was placed on an alternative elliptical Venerian orbit on 7 December 2015 by firing its attitude control thrusters for 1233 seconds. [23] The probe will image the surface in ultraviolet, infrared, microwaves, and radio, and look for evidence of lightning and volcanism on the planet. Astronomers working on the mission reported detecting a possible gravity wave that occurred on the planet Venus in December 2015. [24]

Recent flybys

Several space probes en route to other destinations have used flybys of Venus to increase their speed via the gravitational slingshot method. These include the Galileo mission to Jupiter and the Cassini–Huygens mission to Saturn (two flybys). Rather curiously, during Cassini's examination of the radio frequency emissions of Venus with its radio and plasma wave science instrument during both the 1998 and 1999 flybys, it reported no high-frequency radio waves (0.125 to 16 MHz), which are commonly associated with lightning. This was in direct opposition to the findings of the Soviet Venera missions 20 years earlier. It was postulated that perhaps if Venus did have lightning, it might be some type of low-frequency electrical activity, because radio signals cannot penetrate the ionosphere at frequencies below about 1 megahertz. At the University of Iowa, Donald Gurnett's examination of Venus's radio emissions by the Galileo spacecraft during its flyby in 1990 was interpreted at the time to be indicative of lightning. However, the Galileo probe was over 60 times further from Venus than Cassini was during its flyby, making its observations substantially less significant. The mystery of whether or not Venus does in fact have lightning in its atmosphere was not solved until 2007, when the scientific journal Nature published a series of papers giving the initial findings of Venus Express. It confirmed the presence of lightning on Venus and that it is more common on Venus than it is on Earth. [25] [26]

MESSENGER passed by Venus twice on its way to Mercury. The first time, it flew by on October 24, 2006, passing 3000 km from Venus. As Earth was on the other side of the Sun, no data was recorded. [27] The second flyby was on July 6, 2007, where the spacecraft passed only 325 km from the cloudtops. [28]



Infrared spectroscopy exploits the fact that molecules absorb frequencies that are characteristic of their structure. These absorptions occur at resonant frequencies, i.e. the frequency of the absorbed radiation matches the vibrational frequency. The energies are affected by the shape of the molecular potential energy surfaces, the masses of the atoms, and the associated vibronic coupling.

In particular, in the Born–Oppenheimer and harmonic approximations, i.e. when the molecular Hamiltonian corresponding to the electronic ground state can be approximated by a harmonic oscillator in the neighborhood of the equilibrium molecular geometry, the resonant frequencies are associated with the normal modes of vibration corresponding to the molecular electronic ground state potential energy surface. The resonant frequencies are also related to the strength of the bond and the masses of the atoms at either end of it. Thus, the frequency of the vibrations is associated with a particular normal mode of motion and a particular bond type.
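In the harmonic-oscillator picture this relationship is ν = (1/2π)√(k/μ), with k the bond force constant and μ the reduced mass. A sketch for carbon monoxide follows; the force constant used (~1902 N/m) is an assumed textbook-style value, not taken from the text above:

```python
import math

AMU = 1.66053907e-27   # kg per atomic mass unit
C_CM = 2.99792458e10   # speed of light, cm/s

def wavenumber_cm(k_n_per_m: float, m1_amu: float, m2_amu: float) -> float:
    """Harmonic fundamental wavenumber (cm^-1) of a diatomic molecule."""
    mu = (m1_amu * m2_amu) / (m1_amu + m2_amu) * AMU  # reduced mass, kg
    nu_hz = math.sqrt(k_n_per_m / mu) / (2 * math.pi)  # vibrational frequency
    return nu_hz / C_CM  # convert Hz to wavenumbers

# CO with an assumed force constant of ~1902 N/m:
print(round(wavenumber_cm(1902.0, 12.0, 16.0)))  # ~2170 cm^-1
```

The result lands near the observed CO fundamental around 2143 cm^-1; heavier atoms or weaker bonds shift the band to lower wavenumber, as the formula implies.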

Number of vibrational modes

In order for a vibrational mode in a sample to be "IR active", it must be associated with changes in the dipole moment. A permanent dipole is not necessary, as the rule requires only a change in dipole moment. [2]

A molecule can vibrate in many ways, and each way is called a vibrational mode. For a molecule with N atoms, a linear molecule has 3N – 5 vibrational modes, whereas a nonlinear molecule has 3N – 6 vibrational modes (also called vibrational degrees of freedom). As an example, H2O, a non-linear molecule, will have 3 × 3 – 6 = 3 degrees of vibrational freedom, or modes.
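The 3N – 5 / 3N – 6 counting rule is simple enough to sketch directly:

```python
def vibrational_modes(n_atoms: int, linear: bool) -> int:
    """Vibrational modes of a molecule: 3N - 5 if linear, 3N - 6 if nonlinear."""
    return 3 * n_atoms - (5 if linear else 6)

print(vibrational_modes(3, linear=False))  # H2O: 3 modes
print(vibrational_modes(2, linear=True))   # a diatomic like N2 or CO: 1 mode
print(vibrational_modes(3, linear=True))   # CO2: 4 modes
```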

Simple diatomic molecules have only one bond and only one vibrational band. If the molecule is symmetrical, e.g. N2, the band is not observed in the IR spectrum, but only in the Raman spectrum. Asymmetrical diatomic molecules, e.g. CO, absorb in the IR spectrum. More complex molecules have many bonds, and their vibrational spectra are correspondingly more complex, i.e. big molecules have many peaks in their IR spectra.

The atoms in a CH2X2 group, commonly found in organic compounds and where X can represent any other atom, can vibrate in nine different ways. Six of these vibrations involve only the CH2 portion: two stretching modes (ν), symmetric (νs) and antisymmetric (νas), and four bending modes: scissoring (δ), rocking (ρ), wagging (ω) and twisting (τ), as shown below. Structures that do not have the two additional X groups attached have fewer modes because some modes are defined by specific relationships to those other attached groups. For example, in water, the rocking, wagging, and twisting modes do not exist because these types of motions of the H atoms represent simple rotation of the whole molecule rather than vibrations within it. In case of more complex molecules, out-of-plane (γ) vibrational modes can be also present. [3]

These figures do not represent the "recoil" of the C atoms, which, though necessarily present to balance the overall movements of the molecule, are much smaller than the movements of the lighter H atoms.

The simplest and most important or fundamental IR bands arise from the excitations of normal modes, the simplest distortions of the molecule, from the ground state with vibrational quantum number v = 0 to the first excited state with vibrational quantum number v = 1. In some cases, overtone bands are observed. An overtone band arises from the absorption of a photon leading to a direct transition from the ground state to the second excited vibrational state (v = 2). Such a band appears at approximately twice the energy of the fundamental band for the same normal mode. Some excitations, so-called combination modes, involve simultaneous excitation of more than one normal mode. The phenomenon of Fermi resonance can arise when two modes are similar in energy; Fermi resonance results in an unexpected shift in the energy and intensity of the bands.

Allowing light to have wave properties, it can be shown based on wave optics and dispersion theory that the Beer-Lambert approximation works well if a material only has weak absorptions over the whole spectral range and/or a density that is below that of a condensed phase. If this is not the case, bands can shift and alter their shape as well as their peak intensity. In such cases the use of the wave optics based approach is advisable. [4]

The infrared spectrum of a sample is recorded by passing a beam of infrared light through the sample. When the frequency of the IR is the same as the vibrational frequency of a bond or collection of bonds, absorption occurs. Examination of the transmitted light reveals how much energy was absorbed at each frequency (or wavelength). This measurement can be achieved by scanning the wavelength range using a monochromator. Alternatively, the entire wavelength range is measured using a Fourier transform instrument and then a transmittance or absorbance spectrum is generated using a dedicated procedure.

This technique is commonly used for analyzing samples with covalent bonds. Simple spectra are obtained from samples with few IR active bonds and high levels of purity. More complex molecular structures lead to more absorption bands and more complex spectra.

Sample preparation

Gas samples

Gaseous samples require a sample cell with a long pathlength to compensate for their diluteness. The pathlength of the sample cell depends on the concentration of the compound of interest. A simple glass tube 5 to 10 cm long, equipped with infrared-transparent windows at both ends, can be used for concentrations down to several hundred ppm. Sample gas concentrations well below ppm can be measured with a White's cell, in which the infrared light is guided with mirrors to travel through the gas. White's cells are available with optical pathlengths from 0.5 m up to a hundred meters.

Liquid samples

Liquid samples can be sandwiched between two plates of a salt (commonly sodium chloride, or common salt, although a number of other salts such as potassium bromide or calcium fluoride are also used). [5] The plates are transparent to the infrared light and do not introduce any lines onto the spectra.

Solid samples

Solid samples can be prepared in a variety of ways. One common method is to crush the sample with an oily mulling agent (usually the mineral oil Nujol). A thin film of the mull is applied onto salt plates and measured. The second method is to grind a quantity of the sample with a specially purified salt (usually potassium bromide) finely (to remove scattering effects from large crystals). This powder mixture is then pressed in a mechanical press to form a translucent pellet through which the beam of the spectrometer can pass. [5] A third technique is the "cast film" technique, which is used mainly for polymeric materials. The sample is first dissolved in a suitable, non-hygroscopic solvent. A drop of this solution is deposited on the surface of a KBr or NaCl cell. The solution is then evaporated to dryness and the film formed on the cell is analysed directly. Care must be taken to ensure that the film is not too thick, otherwise light cannot pass through. This technique is suitable for qualitative analysis. The final method is to use microtomy to cut a thin (20–100 μm) film from a solid sample. This is one of the most important ways of analysing failed plastic products, for example, because the integrity of the solid is preserved.

In photoacoustic spectroscopy the need for sample treatment is minimal. The sample, liquid or solid, is placed into the sample cup which is inserted into the photoacoustic cell which is then sealed for the measurement. The sample may be one solid piece, powder or basically in any form for the measurement. For example, a piece of rock can be inserted into the sample cup and the spectrum measured from it.

Comparing to a reference

It is typical to record a spectrum of both the sample and a "reference". This step controls for a number of variables, e.g. the response of the infrared detector, which may affect the spectrum. The reference measurement makes it possible to eliminate the influence of the instrument.

The appropriate "reference" depends on the measurement and its goal. The simplest reference measurement is to simply remove the sample (replacing it by air). However, sometimes a different reference is more useful. For example, if the sample is a dilute solute dissolved in water in a beaker, then a good reference measurement might be to measure pure water in the same beaker. Then the reference measurement would cancel out not only all the instrumental properties (like what light source is used), but also the light-absorbing and light-reflecting properties of the water and beaker, and the final result would just show the properties of the solute (at least approximately).
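
The cancellation works because the instrument factors divide out when the sample intensity is ratioed against the reference. A minimal sketch of that step (the function name and values are illustrative, not from any particular instrument's software):

```python
import math

def absorbance(i_sample, i_reference):
    """Absorbance at one wavenumber from sample and reference intensities."""
    transmittance = i_sample / i_reference   # source/detector factors cancel here
    return -math.log10(transmittance)

# If the sample transmits 10% of what the reference does, A = 1:
print(absorbance(0.10, 1.00))
```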

A common way to compare to a reference is sequentially: first measure the reference, then replace the reference with the sample and measure the sample. This technique is not perfectly reliable: if the infrared lamp is a bit brighter during the reference measurement and a bit dimmer during the sample measurement, the result will be distorted. More elaborate methods, such as a "two-beam" setup (see figure), can correct for these types of effects to give very accurate results. The standard addition method can be used to statistically cancel these errors.

Nevertheless, among the different absorption-based techniques used for gaseous species detection, cavity ring-down spectroscopy (CRDS) can be used as a calibration-free method. Because CRDS is based on measurements of photon lifetimes (and not the laser intensity), it requires no calibration or comparison with a reference. [6]

FTIR

Fourier transform infrared (FTIR) spectroscopy is a measurement technique that allows one to record infrared spectra. Infrared light is guided through an interferometer and then through the sample (or vice versa). A moving mirror inside the apparatus alters the distribution of infrared light that passes through the interferometer. The signal directly recorded, called an "interferogram", represents light output as a function of mirror position. A data-processing technique called Fourier transform turns this raw data into the desired result (the sample's spectrum): Light output as a function of infrared wavelength (or equivalently, wavenumber). As described above, the sample's spectrum is always compared to a reference.
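
The interferogram-to-spectrum step can be sketched in a few lines for an idealized, single-wavenumber source (the sampling parameters below are illustrative assumptions, not instrument specifications):

```python
import numpy as np

nu = 1500.0                      # source wavenumber, cm^-1 (illustrative)
dx = 1e-4                        # optical path difference step, cm
x = np.arange(8192) * dx         # mirror positions as optical path difference
interferogram = np.cos(2 * np.pi * nu * x)   # ideal monochromatic signal

# Fourier transform of the interferogram recovers the spectrum:
spectrum = np.abs(np.fft.rfft(interferogram))
wavenumbers = np.fft.rfftfreq(x.size, d=dx)  # cycles per cm = cm^-1

recovered = wavenumbers[np.argmax(spectrum)]
print(recovered)   # close to 1500 cm^-1, within the ~1.2 cm^-1 resolution
```

A real instrument measures a superposition of many such cosines at once, which is the origin of the multiplex advantage discussed below.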

An alternate method for acquiring spectra is the "dispersive" or "scanning monochromator" method. In this approach, the sample is irradiated sequentially with various single wavelengths. The dispersive method is more common in UV-Vis spectroscopy, but is less practical in the infrared than the FTIR method. One reason that FTIR is favored is called "Fellgett's advantage" or the "multiplex advantage": The information at all frequencies is collected simultaneously, improving both speed and signal-to-noise ratio. Another is called "Jacquinot's Throughput Advantage": A dispersive measurement requires detecting much lower light levels than an FTIR measurement. [7] There are other advantages, as well as some disadvantages, [7] but virtually all modern infrared spectrometers are FTIR instruments.

Infrared microscopy

Other methods in molecular vibrational spectroscopy

Infrared spectroscopy is not the only method of studying molecular vibrational spectra. Raman spectroscopy involves an inelastic scattering process in which only part of the energy of an incident photon is absorbed by the molecule, and the remaining part is scattered and detected. The energy difference corresponds to absorbed vibrational energy.

The selection rules for infrared and for Raman spectroscopy are different at least for some molecular symmetries, so that the two methods are complementary in that they observe vibrations of different symmetries.

Another method is electron energy loss spectroscopy (EELS), in which the energy absorbed is provided by an inelastically scattered electron rather than a photon. This method is useful for studying vibrations of molecules adsorbed on a solid surface.

Recently, high-resolution EELS (HREELS) has emerged as a technique for performing vibrational spectroscopy in a transmission electron microscope (TEM). [11] In combination with the high spatial resolution of the TEM, unprecedented experiments have been performed, such as nano-scale temperature measurements, [12] [13] mapping of isotopically labeled molecules, [14] mapping of phonon modes in position- and momentum-space, [15] [16] vibrational surface and bulk mode mapping on nanocubes, [17] and investigations of polariton modes in van der Waals crystals. [18] Analysis of vibrational modes that are IR-inactive but appear in inelastic neutron scattering is also possible at high spatial resolution using EELS. [19] Although the spatial resolution of HREELS is very high, the bands are extremely broad compared to other techniques. [11]

Computational infrared microscopy

By using computer simulations and normal mode analysis it is possible to calculate theoretical frequencies of molecules. [20]

IR spectroscopy is often used to identify structures because functional groups give rise to characteristic bands both in terms of intensity and position (frequency). The positions of these bands are summarized in correlation tables as shown below.

Regions

A spectrum is often interpreted as having two regions. [21]

In the functional group region there are one to a few troughs per functional group. [21]

In the fingerprint region there are many troughs which form an intricate pattern which can be used like a fingerprint to determine the compound. [21]

Badger's rule

For many kinds of samples, the assignments are known, i.e. which bond deformation(s) are associated with which frequency. In such cases, further information can be gleaned about the strength of a bond, relying on the empirical guideline called Badger's rule. Originally published by Richard McLean Badger in 1934, [22] this rule states that the strength of a bond (in terms of force constant) correlates with the bond length: an increase in bond strength leads to a corresponding bond shortening, and vice versa.

Infrared spectroscopy is a simple and reliable technique widely used in both organic and inorganic chemistry, in research and industry. It is used in quality control, dynamic measurement, and monitoring applications such as the long-term unattended measurement of CO2 concentrations in greenhouses and growth chambers by infrared gas analyzers.

It is also used in forensic analysis in both criminal and civil cases, for example in identifying polymer degradation. It can be used in determining the blood alcohol content of a suspected drunk driver.

IR-spectroscopy has been successfully used in analysis and identification of pigments in paintings [23] and other art objects [24] such as illuminated manuscripts. [25]

A useful way of analyzing solid samples without the need for cutting samples uses ATR or attenuated total reflectance spectroscopy. Using this approach, samples are pressed against the face of a single crystal. The infrared radiation passes through the crystal and only interacts with the sample at the interface between the two materials.

With advances in computer filtering and manipulation of the results, samples in solution can now be measured accurately (water produces a broad absorbance across the range of interest, which would render the spectra unreadable without this computer treatment).

Some instruments also automatically identify the substance being measured by comparison with a store of thousands of reference spectra.

Infrared spectroscopy is also useful in measuring the degree of polymerization in polymer manufacture. Changes in the character or quantity of a particular bond are assessed by measuring at a specific frequency over time. Modern research instruments can take infrared measurements across the range of interest as frequently as 32 times a second. This can be done whilst simultaneous measurements are made using other techniques. This makes the observations of chemical reactions and processes quicker and more accurate.

Infrared spectroscopy has also been successfully utilized in the field of semiconductor microelectronics: [26] for example, infrared spectroscopy can be applied to semiconductors like silicon, gallium arsenide, gallium nitride, zinc selenide, amorphous silicon, silicon nitride, etc.

Another important application of infrared spectroscopy is in the food industry, where it is used to measure the concentration of various compounds in different food products. [27] [28]

The instruments are now small, and can be transported, even for use in field trials.

Infrared spectroscopy is also used in gas leak detection devices such as the DP-IR and EyeCGAs. [29] These devices detect hydrocarbon gas leaks in the transportation of natural gas and crude oil.

In February 2014, NASA announced a greatly upgraded database, [30] based on IR spectroscopy, for tracking polycyclic aromatic hydrocarbons (PAHs) in the universe. According to scientists, more than 20% of the carbon in the universe may be associated with PAHs, possible starting materials for the formation of life. PAHs seem to have been formed shortly after the Big Bang, are widespread throughout the universe, and are associated with new stars and exoplanets. [31]

Infrared spectroscopy is an important analysis method in the recycling process of household waste plastics, and a convenient stand-off method to sort plastic of different polymers (PET, HDPE, …). [32]

Recent developments include a miniature IR spectrometer that is linked to a cloud-based database and suitable for personal everyday use, [33] and NIR-spectroscopic chips [34] that can be embedded in smartphones and various gadgets.

The different isotopes in a particular species may exhibit different fine details in infrared spectroscopy. For example, the O–O stretching frequency (in reciprocal centimeters) of oxyhemocyanin is experimentally determined to be 832 and 788 cm−1 for ν(16O–16O) and ν(18O–18O), respectively.

By considering the O–O bond as a spring, the frequency of absorbance can be calculated as a wavenumber [= frequency/(speed of light)]:

ν̃ = (1/(2πc))·√(k/μ)

where k is the spring constant for the bond, c is the speed of light, and μ is the reduced mass of the A–B system:

μ = (mA·mB)/(mA + mB)

The reduced masses for 16O–16O and 18O–18O can be approximated as 8 and 9 respectively. Thus

ν̃(16O–16O)/ν̃(18O–18O) = √(9/8) ≈ 1.06,

consistent with the observed ratio of 832/788 ≈ 1.06.
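
The isotope shift can be checked numerically with the same spring (harmonic-oscillator) model, using the approximate reduced masses of 8 and 9:

```python
import math

nu_16 = 832.0                      # observed nu(16O-16O), cm^-1
ratio = math.sqrt(9.0 / 8.0)       # nu_16/nu_18 = sqrt(mu_18/mu_16)
nu_18_predicted = nu_16 / ratio
print(round(nu_18_predicted, 1))   # close to the observed 788 cm^-1
```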

The effect of isotopes, both on the vibration and the decay dynamics, has been found to be stronger than previously thought. In some systems, such as silicon and germanium, the decay of the anti-symmetric stretch mode of interstitial oxygen involves the symmetric stretch mode with a strong isotope dependence. For example, it was shown that for a natural silicon sample, the lifetime of the anti-symmetric vibration is 11.4 ps. When the isotope of one of the silicon atoms is changed to 29Si, the lifetime increases to 19 ps. In a similar manner, when the silicon atom is changed to 30Si, the lifetime becomes 27 ps. [35]

Two-dimensional infrared correlation spectroscopy analysis combines multiple samples of infrared spectra to reveal more complex properties. By extending the spectral information of a perturbed sample, spectral analysis is simplified and resolution is enhanced. The 2D synchronous and 2D asynchronous spectra represent a graphical overview of the spectral changes due to a perturbation (such as a changing concentration or changing temperature) as well as the relationship between the spectral changes at two different wavenumbers.

Nonlinear two-dimensional infrared spectroscopy [36] [37] is the infrared version of correlation spectroscopy. Nonlinear two-dimensional infrared spectroscopy is a technique that has become available with the development of femtosecond infrared laser pulses. In this experiment, first a set of pump pulses is applied to the sample. This is followed by a waiting time during which the system is allowed to relax. The typical waiting time lasts from zero to several picoseconds, and the duration can be controlled with a resolution of tens of femtoseconds. A probe pulse is then applied, resulting in the emission of a signal from the sample. The nonlinear two-dimensional infrared spectrum is a two-dimensional correlation plot of the frequency ω1 that was excited by the initial pump pulses and the frequency ω3 excited by the probe pulse after the waiting time. This allows the observation of coupling between different vibrational modes; because of its extremely fine time resolution, it can be used to monitor molecular dynamics on a picosecond timescale. It is still a largely unexplored technique and is becoming increasingly popular for fundamental research.

As with two-dimensional nuclear magnetic resonance (2DNMR) spectroscopy, this technique spreads the spectrum in two dimensions and allows for the observation of cross peaks that contain information on the coupling between different modes. In contrast to 2DNMR, nonlinear two-dimensional infrared spectroscopy also involves the excitation to overtones. These excitations result in excited state absorption peaks located below the diagonal and cross peaks. In 2DNMR, two distinct techniques, COSY and NOESY, are frequently used. The cross peaks in the first are related to the scalar coupling, while in the latter they are related to the spin transfer between different nuclei. In nonlinear two-dimensional infrared spectroscopy, analogs have been drawn to these 2DNMR techniques. Nonlinear two-dimensional infrared spectroscopy with zero waiting time corresponds to COSY, and nonlinear two-dimensional infrared spectroscopy with finite waiting time allowing vibrational population transfer corresponds to NOESY. The COSY variant of nonlinear two-dimensional infrared spectroscopy has been used for determination of the secondary structure content of proteins. [38]


Galactic Structure and Evolution

III.G X-Ray Emission

As noted earlier, all galaxies emit X-ray radiation from their stellar components—X-ray binaries, stellar chromospheres, young supernova remnants, neutron stars, etc. More massive objects, particularly elliptical galaxies, have recently been found by Forman and Jones with the Einstein X-ray Observatory to have X-ray halos, probably of hot gas. A small class of the most massive elliptical galaxies, which usually reside at the centers of rich clusters of galaxies, also appear to be accreting gas from the surrounding galaxy cluster. This has been seen as cooler X-ray emission centered on the brightest cluster galaxy, which sits in the middle of the hot cluster gas. This phenomenon is called a "cooling flow," and results when the hot cluster gas collapses onto a central massive object and becomes dense enough to cool efficiently. This process is evidenced by strong optical emission lines as well as radio emission. Cooling flows may be sites of low-mass star formation at the centers of galaxy clusters.

Active galactic nuclei, including Seyfert 1 and 2 galaxies (discovered by C. Seyfert in 1943) and quasars, are also usually strong X-ray emitters, although the majority are not strong radio sources. The X-ray emission in these galaxies is also nonthermal and is probably either direct synchrotron emission or synchrotron self-Compton emission.


One of the striking features of Venus's atmosphere is its temporal variability and dynamics, with a chaotic polar vortex, large-scale atmospheric waves, sheared features and variable winds that depend on local time and possibly on orographic features. The aim of this research is to combine data accumulated over several years and obtain a global mean state of the atmosphere, focusing on the global structure of the clouds using the cloud opacity and upper cloud temperatures.

We have first produced global maps using the integrated radiance through the infrared atmospheric windows centred around 1.74 μm and 2.25 μm, that show the spatial variations of the cloud opacity in the lower clouds around 44–48 km altitude and also provide an indirect estimation of the possible particle size. We have also produced similar global maps using the brightness temperatures seen in the thermal region at 3.8 μm and 5.0 μm, which provide direct indication of the temperatures at the top of the clouds around 60–70 km altitude.
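
A brightness temperature of this kind is obtained by inverting the Planck function at the observed wavelength. A minimal sketch (the constants are standard; the 5.0 μm and 240 K values are illustrative, not VIRTIS-M data):

```python
import math

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(T, wl):
    """Blackbody spectral radiance (W m^-2 sr^-1 m^-1) at temperature T (K),
    wavelength wl (m)."""
    a = 2.0 * H * C**2 / wl**5
    return a / math.expm1(H * C / (wl * KB * T))

def brightness_temperature(L, wl):
    """Invert the Planck function: the temperature whose blackbody radiance
    at wavelength wl equals the observed radiance L."""
    a = 2.0 * H * C**2 / wl**5
    return (H * C / (wl * KB)) / math.log1p(a / L)

# Round trip at 5.0 um for a 240 K cloud top:
L = planck_radiance(240.0, 5.0e-6)
print(round(brightness_temperature(L, 5.0e-6), 1))   # 240.0
```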

These maps have been generated using the complete dataset of the Visible and InfraRed Thermal Imaging Spectrometer mapping channel (VIRTIS-M) on board Venus Express, with a wide spatial and long temporal coverage in the period from May 2006 until October 2008.

Our results provide a global view of the cloud opacity, particle size and upper cloud temperatures in both hemispheres, showing the main dynamical regions of the planet. The profiles obtained also provide the detailed dependencies on latitude, local time and longitude, diagnostic of the global circulation flow and dynamics at various altitude layers, from about 44 up to 70 km above the surface.


Cross-regional coupling

7.4 Electric coupling between the ring current and ionosphere

The magnetosphere and ionosphere form a coupled system because they are threaded by the same magnetic field lines. Under the assumption that field lines are perfect conductors, any change in the ionospheric potential will propagate into the magnetosphere and vice versa. The conductive property of field lines allows currents to flow between the ionosphere and magnetosphere (see Fig. 1.2 in this book). These field-aligned currents are essential for the exchange of energy and momentum between these regions. As shown in Section 7.2, the field-aligned current can be derived from the assumption ∇ · J = ∇ · J⊥ + ∇ · J∥ = 0. Vasyliunas (1970) derived the formula for J∥ as:

where ξ = 1 + μ0(P⊥ − P∥)/B². To get the total J∥ into or out of the ionosphere, J∥i, one integrates the right-hand side of Eq. (7.4) along the field line between the two magnetically conjugate ionospheres. The field-aligned current must close in the ionosphere, where Ohm's law is applicable. It can be shown that J∥i and the ionospheric potential, Φ, are related as (Wolf, 1983):

where Σ↔ is the ionospheric conductance tensor and I is the magnetic dip angle. The effect of neutral wind is ignored. Eq. (7.5) shows that, for a given ionospheric conductance and field-aligned current, the electric potential distribution in the ionosphere can be solved for. This potential is then mapped along field lines to the magnetosphere. The resulting electric field controls particle E × B drifts in both the ionosphere and the magnetosphere and, in turn, moderates the pressure distribution in the ring current.

The self-consistent coupling between particle drifts, electric fields and currents in the inner magnetosphere and the ionosphere has been implemented in simulation models. The Rice Convection Model (RCM) was the first large-scale model of this type (Harel et al., 1981; Toffoletto et al., 2003). In the RCM, an isotropic pitch-angle distribution is assumed. Particles are identified by invariant energy (λj), which relates to kinetic energy (Wj) and field-tube volume per unit magnetic flux (V) as λj = Wj · V^(2/3). Wolf (1983) has shown that the Region 2 current, J∥i, can be calculated by:

where ηj is the number of particles of species j per unit magnetic flux. The summation is over all ring current species and energy invariants. Later a similar model, named the Comprehensive Ring Current Model (CRCM), was developed (Fok et al., 2001). The CRCM considers anisotropic pitch angles, and particles are identified by the first and second adiabatic invariants. Fok et al. (2001) showed that Eq. (7.5) also applies to the anisotropic case, in which the summation is over all species and first and second adiabatic invariants.

The electric field created by the ring current pressure gradient and the corresponding Region 2 current may not be aligned with the high-latitude electric field imposed by the solar wind. As a result, the inner magnetosphere can be shielded from the external field (Jaggi and Wolf, 1973; Stern, 1977). The shielding effect can prevent deep particle penetration and shift the ring current pressure eastward in local time (Wolf et al., 2007; Fok et al., 2003). Fig. 7.4 shows the shielding effect on magnetospheric particle dynamics. The magnetic storm of 17–18 March 2013 was simulated using an improved version of the CRCM, named the Comprehensive Inner Magnetosphere-Ionosphere (CIMI) model (Fok et al., 2014). In one of the CIMI simulations the Weimer electric potential (Weimer, 2001) was used (left panels), and in the other a self-consistent electric field considering M-I coupling (Eqs. 7.4–7.5). The top panels show the L-Time plots of 80 keV electron fluxes. The black curves are Dst during the storm. In the plots, L is defined by L = ri/cos²λi, where ri is the ionospheric distance in Earth radii and λi is the magnetic latitude at the ionosphere. It is clearly shown that, for this particular case, with the empirical Weimer electric field, energetic electrons penetrate deeply earthward during the main phase and are trapped at low L's, down to ∼2.5, in the storm recovery. When the self-consistent electric field is applied, the overall fluxes are much lower and the strong-flux region is confined to L > 3. Note that energetic protons and electrons are included in the CIMI calculation and no wave diffusion is considered. To understand why the results from the two runs are so different, the middle panels of Fig. 7.4 depict the convection potential contours from the Weimer model and from the self-consistent calculation at 12 UT on 17 March 2013, the time marked by dashed lines in the top panel.
The Weimer model predicts a generally dawn-to-dusk electric field that is strongest near dusk. The self-consistent field shows an eastward skewing of potential contours near dawn and a Sub-Auroral Polarization Stream (SAPS)-like feature (Foster and Burke, 2002; Foster and Vo, 2002) in the dusk-midnight sector. The lower panels in Fig. 7.4 show the drift paths of perpendicular-pitch-angle electrons with magnetic moment 4.7 × 10^7 keV/T, also at 12 UT on 17 March 2013. The particle energy is 80 keV at the reference point at 2.6 RE and 06 MLT, marked by a red asterisk. The last closed drift path is highlighted in red. As shown in the figure, the closed-path region is smaller in the Weimer field, and the reference point lies at the boundary of open and closed drift paths. With self-consistent convection, the reference point is well inside the closed-path region. In this case, the shielding effect forbids the deep earthward transport of electrons from the plasma sheet to the reference point in the low-L region. A similar conclusion was obtained from a RAM-SCB-E storm simulation reported in Yu et al. (2017): they found the proton plasma sheet inner boundary located farther from the Earth when a self-consistent electric field was used than with the Weimer potential.
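
The L definition used in these plots is simple to evaluate (the example values below are illustrative, not taken from the simulation):

```python
import math

def l_value(r_i, lat_deg):
    """Dipole L-shell parameter, L = r_i / cos^2(lambda_i), with r_i in
    Earth radii and lambda_i the magnetic latitude at the ionosphere."""
    return r_i / math.cos(math.radians(lat_deg)) ** 2

# A field line reaching the ionosphere (r_i ~ 1 RE) at 60 degrees magnetic
# latitude crosses the equator near L = 4:
print(round(l_value(1.0, 60.0), 3))   # 4.0, since cos(60 deg) = 0.5
```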

Figure 7.4. CIMI simulation of the March 17–18, 2013 storm. Top panels: L-Time plot of 80 keV electrons calculated with the Weimer model (left) and the self-consistent electric field (CIMI) model (right). Middle panels: Weimer potentials (left) and CIMI potentials (right) at the equator at 12 UT on March 17, 2013. Bottom panels: drift paths of perpendicular electrons with magnetic moment 4.7 × 10^7 keV/T in the Weimer (left) and CIMI (right) electric fields. Particle energy is 80 keV at 2.6 RE and 6.0 MLT (marked by red dots). The red curves are the last closed drift paths.

In some cases, when the external dawn-dusk field decreases rapidly, the shielding field established earlier can be stronger than the external field. This condition is called over-shielding (Kelley et al., 1979; Wolf et al., 2005). During over-shielding, the electric field in the inner magnetosphere is directed dusk-to-dawn instead of dawn-to-dusk. One of the signatures of over-shielding is a plasmasphere shoulder. Goldstein et al. (2002) discovered a shoulder-shaped bulge at the plasmapause in IMAGE EUV images taken on May 20, 2000, and interpreted it as a result of antisunward drifting of plasma under over-shielding conditions.


Working Papers: Astronomy and Astrophysics Panel Reports (1991)

KEY POINTS

Astronomy makes unexpectedly large contributions to formal and informal science education, given the small number of research astronomers.

Technology transfer and spin-offs from astronomy have important applications in medicine, industry, defense, environmental monitoring, and consumer products.

Mankind's view of its place in the world as a whole is strongly influenced by the results of astronomical research.

Astronomy provides unusually promising opportunities for international cooperation.

Other sciences benefit from synergistic interactions with astronomy.

I. INTRODUCTION

Astronomy and astrophysics could not exist in their present form in this country without firm public support, expressed through the funding of research by federal and other agencies, public and private. The providers of this support can quite reasonably ask what they are getting in return for their money. The primary answer is, of course, scientific knowledge and all that it implies. (Identifying how that knowledge can best be extended in the future was the principal task of the Astronomy and Astrophysics Survey Committee (AASC).) But there are other, less obvious, returns, and the Panel on Astronomy and Astrophysics as National Assets was charged with identifying and documenting these.

It is not the intention to claim that these educational, cultural, and technological spin-offs are the sole, or even the major, justification for astronomical research, but only that they are a real part of the total picture of how science interacts with the rest of society. In addition, because astronomical objects and ideas are relatively appealing to non-scientists, it seems plausible that the subject may be able to play a significant role in the essential task of revitalizing American leadership in science and technology, both by encouraging young people to consider careers in these areas and by promoting scientific awareness among the general public.

The chapter on National Assets in Volume I of the AASC report presents an overview of the synergistic, educational, and cultural contributions of astronomy and astrophysics. This panel report includes a number of additional examples and technical details of a few outstanding ones. Space did not permit including all of the items collected by the panel or complete crediting of the information to the colleagues who provided it, though contributors other than panel members are listed at the end of the chapter.

II. SCIENCE EDUCATION AND LITERACY

The need for a scientifically sophisticated electorate, and how far we are from achieving this, have received enough publicity in recent years to require no further explication here. But of the little science that most people are exposed to, and that they choose to expose themselves to, astronomy forms a surprisingly large part. People trained in astronomy also form part of the general technologically educated manpower pool.

A. Formal Education

1. College-Level Courses

Formal astronomy classes have their largest impact at the non-major undergraduate level. The colleges and universities with astronomy (or physics and astronomy) departments had 1.2 million undergraduates in 1988; 103,300 of them were taking introductory astronomy (Ellis 1988). This means that (integrated over a 4.5-year average curriculum) 35-40 percent of the graduates of these institutions fulfill their science breadth requirements with astronomy, generally as their only exposure to physical science. Astronomers typically make up 5-10 percent of the physical science faculties at these institutions.

There is also considerable demand for astronomy at colleges with no separate department of the subject. Each year the American Astronomical Society receives more than 100 requests for visits by research astronomers to these institutions through its Shapley Program. About 90 requests can be filled. The primary purpose is to talk with classes and student groups, but most visits include a public lecture and meetings with administrators as well (C.R. Tolbert, University of Virginia, personal communication 1990). Textbook sales indicate that a total of 200-250,000 students per year enroll in an astronomy course (M. Zeilik, University of New Mexico, personal communication 1990).

While taking these classes, students both increase their knowledge of the specific subject and change their attitudes toward science in general. A standardized test, administered as part of the planning for Project STAR (Section II.A.2), shows that those who complete an introductory class know about as much astronomy as the average secondary school teacher. Those just starting the class do considerably less well and score at about the same level as elementary school teachers.

Attitudes toward science were probed with an anonymous questionnaire given to undergraduates at Cornell University, University of Maryland, and University of Wisconsin at the end of one-semester courses. Table 1 shows the results. More than 70 percent of the 1260 students polled reported that they thought understanding science was more important than they had at the beginning of the semester. The majority also said that they were more likely to read about science and to vote for pro-science candidates for political office. All but a few percent of the rest reported their views as unchanged (some explicitly volunteering the information that they had been fairly pro-science to begin with).

Most university departments also offer adult education and extension courses in astronomy and report (e.g., from UCLA and Harvard) that these are among the most popular and successful of their offerings.

2. Pre-College Education and Teacher Training

After prolonged near-absence, astronomy is beginning to reappear in elementary and high school curricula. The 1989 National Science Foundation (NSF) grants for astronomy education included two high school student summer programs; one program each for teachers in high schools, two-year colleges, and elementary and middle schools; and three projects to develop teaching materials for middle and high schools. Many other programs are supported by schools, colleges, and research organizations. A representative sampling follows.

The Astronomical Society of the Pacific's "Universe in the Classroom" one-week summer workshop for grade 3-12 educators has had 2500 alumni over the past 12 years. The Society also provides a catalog of educational materials to about 250,000 people world-wide, and its newsletter "Universe in the Classroom" goes to 22,000 teachers, with further reproduction by school districts and planetariums and translation into five foreign languages.

The Space Telescope Science Institute (StScI) currently supplies speakers on request to school classes in its area at a rate of about one per day. Astronomers at nearly every university, lab, and observatory talk to grade and high school classes and clubs on a regular basis.

TABLE 1. CHANGES IN STUDENT ATTITUDES TOWARD SCIENCE DURING ONE-SEMESTER INTRODUCTORY ASTRONOMY COURSES

[Tabulated values not recoverable in this copy. The categories surveyed were the probability of reading about science in the future and the probability of voting for candidates favoring support for scientific research.]

* Number of students expressing this opinion after a one-semester introductory astronomy course.

StScI also participates in (1) a summer workshop for science teachers that is expected to have about 300 participants from across the nation in 1990, (2) enrichment programs for scientifically-interested high school students from under-represented minorities, and (3) production of a 32-part instructional television series for middle schools, with broadcast in Maryland and elsewhere to begin in fall 1990.

SPICA at the Center for Astrophysics is an unusually highly-leveraged project whose participants, secondary school teachers, in turn present workshops for elementary and junior high teachers in their home districts.

The National Radio Astronomy Observatory (NRAO) in cooperation with West Virginia University operates a summer workshop for high school teachers, whose funding for 1990 is being taken over by the Claude Worthington Benedum Foundation from NSF.

The Naval Research Laboratory (NRL) and six other Washington-area research institutions provide opportunities for about 100 high school students a year to get involved in astronomical research. Most go on to careers in science and engineering.

More than 500 Starlab portable planetariums (16-foot inflatable domes from Learning Technologies, Cambridge, Massachusetts) have reached some five million school children (mostly in the earlier grades, and including many inner city and disadvantaged kids).

At the Thacher School Summer Science Program, about 1000 students over the past 30 years have worked on an astronomical research project (determining asteroid orbits from photographs and mastering the necessary associated math and physics). All participants go on to college. About 37 percent of the pre-1985 graduates are now working in science and medicine, and 34 percent in engineering, mathematics, and computer science (including the founder of Lotus Development Corporation).

Haystack Observatory has a similarly-successful summer internship for middle school students and the University of Illinois has one for high school students.

Six inner-city San Antonio schools are pioneering a junior-level year of high school science consisting of astronomy and marine biology as part of Project 2061. The real surprise is that most of the students have chosen to take another year of science as an elective in their senior years.

Project STAR (Science Through its Astronomical Roots), one of the most extensive NSF-funded programs, is being developed at the Center for Astrophysics as a serious, quantitative alternative to high school chemistry and physics.

Many of these projects were initiated within the astronomical community, and all have had some input from researchers. But this is an area where more can and should be done. Specific initiatives are proposed by other panels. It is at the high school level and earlier that science must be made attractive to students, before they decide not to take the necessary mathematics.

B. Informal Education and Scientific Literacy

The activities discussed here have three connections with astronomical and other scientific research. First, virtually all of them have either been initiated by or had significant input from research-oriented astronomers. Second, many lines of anecdotal evidence indicate that informal exposure to astronomy motivates people to take a serious interest in science and technology as potential careers. And, finally, in order for books, television programs, planetarium shows, and other presentations about astronomy to remain as popular as they are, there has to be a continuing stream of exciting new results to present.

1. Television

Cosmos is the most successful public television series in history, seen by about 400 million people in 60 countries. The book version is the best-selling English-language science book ever, and the home video version had 100,000 orders placed for the full 13 episodes before release, an unprecedented number for any kind of videotape. Other astronomical television items include:

Project Universe, a series of 30 half-hour programs which reached its 100th showing in 1989, most broadcasts being on local stations in cooperation with nearby colleges offering credit for the series as a course.

Extensive, widely-watched coverage of the Voyager Neptune encounter, whose audiences included millions of young people in and out of school, and a large number of Pasadena residents and visitors (even some European amateur astronomers who flew in for the occasion), who watched in real time at an auditorium near the Jet Propulsion Laboratory (JPL).

A Galactic Odyssey (funded and produced by Japanese National Television) and The Astronomers' Universe (funded by the Keck Foundation and produced by KCET), which are 6-8 hour series focusing on astronomy and the people who do it, scheduled for 1990-91 broadcasts. One of the stated purposes of the Keck-sponsored series is to motivate pre-college students to consider careers in science and technology.

2. Astronomy in Print

Astronomy is one of the few sciences with its own (profit-making) book club. A volume featured by book clubs will sell in the range of 40,000 copies (e.g., Herbert Friedman's Sun and Earth), while one of the all-time winners, Stephen W. Hawking's A Brief History of Time, has reached the one-million mark and spent two years on the New York Times best seller list. The 1988 New York Times list of ten best non-fiction books included three on astronomy, and the subject is similarly over-represented among the winners of the American Institute of Physics science writing award.

Sales of magazines in 1988 reveal 632,500 regular readers of Scientific American, 95,000 of Sky and Telescope, and 165,000 of Astronomy, indicating that 20-25 percent of the audience for science at this level is specifically an audience for astronomy. Within the broader-based magazines (Discover, Science Digest, Scientific American, and Science 80-86) about 7 percent of the articles over the past decade have dealt with astronomy. In contrast, professional astronomical journals make up 0.5 percent of the 3300 covered by Science Citation Index, and astronomy Ph.D.'s make up about 0.7 percent of the 18,000 awarded each year in the physical, biological, social, health, engineering, and computer sciences (Kidd 1989).

While few papers cover astronomy as regularly as astrology, the subject is over-represented relative to other sciences in newspapers as well as magazines. For instance, 10 years of articles in the New York Times, Wall Street Journal, Washington Post, and LA Times include 325 items on astronomy and space sciences, 360 on physics, and 280 on biology (excluding medicine; J. Cornell, Center for Astrophysics, personal communication 1990).

3. Observatories, Planetariums, and Museums

Of the 9.4 million visitors to the National Air and Space Museum in 1988, 36.2 percent (based on random sampling of departing visitors) found the astronomy and space exhibits more interesting than the aviation ones. About a third of a million visitors saw the planetarium show. On smaller scales:

All but two of the fifty states have observatories or planetariums regularly open to the public.

McDonald, Palomar, and Kitt Peak Observatories report that about 100,000 people per year travel the relatively large distances necessary to visit each of them. McDonald Observatory has been featured in the monthly Texas hotel magazine for tourists.

Griffith Observatory, near Los Angeles, more accessible than the research observatories, hosted 1.7 million people in 1989, as many as the Los Angeles Museum of Art and the J. Paul Getty Museum together (879,000 and 338,800 respectively). The Adler Planetarium in Chicago records about 700,000 visitors per year. The total number of planetariums in the U.S. is about 1000.

4. Radio and Telephone Hot Lines

Stardate, a series of daily five-minute programs produced by the University of Texas, is carried (and paid for) by about 200 radio stations, including some large ones like KNX in Los Angeles and KCBS in San Francisco. It has received a Corporation for Public Broadcasting award for excellence and attracted half a million letters from listeners over the past decade. Its spin-offs include a Spanish-language version, one-minute TV news spots, and part of a CD-ROM computer commercial demonstration disk.

At least 20 astronomical telephone hot lines operate in the US. Most are updated about weekly and feature a mix of local observing information (moon phases, planets, and so on) and research news. A typical one, Starwatch at the University of Minnesota, receives about 30 calls a day (more during Voyager encounters, Halley perihelion, etc.), and portions of its content are carried by a dozen local newspapers. Incoming students at the University sometimes mention that Starwatch was a factor in their choosing the institution and a science major.

5. Amateur Astronomy

Every state in the union has at least one active astronomy club. More than 240 dealers and manufacturers are engaged in the business of providing telescopes, accessories, and software for observers who are not professional astronomers. Some highlights are the following:

Telescope and magazine sales suggest that roughly 200,000 people take some interest in amateur astronomy. Of these, more than 14,000 belong to the main umbrella groups, the Astronomical League and the Western Amateur Astronomers.

The Planetary Society, whose members contribute $25 per year toward the cause of exploration of the planets and the search for extra-terrestrial intelligence, has about 130,000 members. Their publication, The Planetary Report, recently reported results of a random survey by the Public Opinion Laboratory at Northern Illinois University indicating that half or more of adult Americans support the Society's goals.

The American Association of Variable Star Observers provides a bridge between the amateur and professional communities. Of its 1100 members, about half each year provide about 250,000 observations of 3500 stars to a central data depository. Amateur observers of variable stars thus outnumber the professionals (about 100 of whom per year make use of AAVSO data). The total number of contributors over the history of the society is nearly 5000, equal to the current membership of the AAS.

Another important AAVSO contribution is providing data to educators for astronomically-based labs and science projects. Association membership data indicate that participation in amateur astronomy by young people serves to recruit future astronomers as well as scientists, engineers, and programmers in other disciplines.

Amateur astronomers frequently share their interests and expertise with Scout troops, school classes, and other groups of young people.

C. Contributions to the Pool of Scientifically Trained Personnel

About 70 American colleges and universities currently offer degrees in astronomy or closely related fields, awarding about 100 Ph.D.'s per year, 160 B.A.'s and B.S.'s, and 40 terminal M.A.'s. Most of the recipients who do not pursue long-term careers in astronomy do remain part of the manpower pool in science- and technology-intensive fields.

1. Ph.D. Recipients

As indicated in the report of the Panel on Status of the Profession, about half as many astronomers leave the field each year as receive new Ph.D. degrees. Complete samples of 106 doctoral recipients (1952-88) from the California Institute of Technology and 94 (1966-88) from the University of Maryland confirm this. About half are primarily engaged in astronomical research; 20 percent are employed in other sciences and in industry; 7 percent hold teaching or science administration positions; and most of the rest work on hardware or software in support of astronomical or related research.

2. B.A./B.S. Recipients

Of recent astronomy bachelors, a little more than half go directly on to graduate school (a third of them in astronomy) and the others enter the work force directly (Ellis and Mulvey 1988). Complete samples from a few institutions over a longer time period confirm this pattern. The samples include Swarthmore College (28 B.A.'s, 1940-85), California Inst. of Technology (140 B.S.'s, 1956-88), and Williams College (26 B.A.'s, 1974-89). Of these graduates, 35 percent are in astronomy (research, supporting activities, or graduate school); 39 percent are engaged in other sciences or are employed in technologically intensive industries; 11 percent are teaching; and 15 percent are in non-science occupations (including law, photography, writing, and many others).

Undergraduates in astronomy are much more likely than those in most other sciences to engage in significant, publishable research and this may contribute to the high retention rate. If so, there might be a useful example to be followed by other sciences where the ratio of Ph.D.'s to B.A./B.S. degrees is much lower, averaging about 5 percent over all the natural sciences.

III. TECHNOLOGY TRANSFER, SPIN-OFFS, AND THE PRIVATE SECTOR

Astronomy has benefited from technological advances made in many fields in science and engineering, but astronomy also contributes to technological advances in two ways. First, the demands of researchers for devices at the very edge of what is possible have sometimes been the drivers for industrial development whose products were then useful elsewhere. Photographic emulsions are a classic example. Second, ideas, algorithms, devices, processes, materials, and so forth invented within the astronomical community are from time to time modified for use in other areas: the radio astronomy technique of aperture synthesis is such a case. The first four subsections categorize items by the fields in which they are applied rather than the part of astronomy within which they originated. The concluding subsections briefly address some potential areas for future technology transfer and the support of astronomy by the private sector.

A. Medicine

The single largest problem shared by medicine and astronomy is that of imaging things you cannot get to and of reconstructing two- or three-dimensional structures from a number of one- or two-dimensional scans. Astronomers, especially radio astronomers, led the way in solving this problem. Martin Ryle's Nobel Prize cited his development of aperture synthesis, and the solution to image reconstruction pioneered by Bracewell and Riddle (1967) is now used in CAT scanners, magnetic resonance imaging, positron emission tomography, and other medical imaging methods.
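The mathematics the two fields share can be stated compactly as the projection-slice theorem: the one-dimensional Fourier transform of a projection of an object equals a central slice through the object's two-dimensional Fourier transform. A minimal numpy sketch (the image contents and dimensions are arbitrary illustrative choices, not values from this report) verifies the identity for the zero-angle projection:

```python
import numpy as np

# Projection-slice theorem, the identity behind both aperture synthesis
# and CAT-scan reconstruction: the 1D Fourier transform of a projection
# of a 2D object equals a central slice of the object's 2D transform.
rng = np.random.default_rng(0)
image = rng.random((64, 64))               # arbitrary toy "object"

# Zero-angle projection: integrate (sum) the image along the vertical axis.
projection = image.sum(axis=0)

# The 1D transform of the projection ...
slice_from_projection = np.fft.fft(projection)

# ... matches the corresponding central row of the full 2D transform.
central_slice = np.fft.fft2(image)[0, :]

assert np.allclose(slice_from_projection, central_slice)
```

A scanner or interferometer collects many such slices at different angles; reconstruction algorithms such as filtered back-projection then invert the collection to recover the object.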

Specific computer languages and ways of handling large data arrays have also proven transferable from astronomy to medicine. IDL (Interactive Data Language) and IRAF (a very flexible image processing system) are products of optical astronomy. Their medical applications include

Study of the activity and chemistry of neurons in the brain (University of Southern California).

Cardiac angiography and PET scans (University of Michigan).

Magnetic resonance imaging (National Inst. of Health).

Medical imaging and product development (Mallinckrodt Institute of Radiology and Siemens Gammasonics).

X-ray computer tomography (PDA Engineering).

The need for clean environments is another problem common to medicine and astronomy. A version of the positive pressure clean room designed at the University of Wisconsin for work on the OAO-1 satellite is now in many hospitals. NASA's needs for contamination-free environments led to data bases, handbooks, and courses for clean room personnel, as well as air handlers and ''bunny suits" whose commercial versions appear in hospitals and pharmaceutical labs.

A U.S. drug company has teamed up with the Cambridge (U.K.) Automatic Plate Measuring facility to use its expertise in scanning and interpreting images to analyze blood samples from leukemia patients. This permits much more rapid detection of responses to changes in medication and other pharmacological effects than would otherwise be possible.

Radio astronomers have adapted their methods of measuring microwave temperature for non-invasive detection of tumors and other regions of vascular insufficiency. Microwaves have poorer angular resolution than infrared but are more sensitive to deep tissue temperatures. The combination of microwave and infrared thermographic data provides a true-positive detection rate of 96 percent, better than either alone, for breast cancer (Barrett et al. 1978).

Tiny paste-on thermal sensors first designed to keep ultraviolet detectors within their narrow operating temperature range have been adapted for controlling heat lamps in neonatology units.

Finally, the X-raying of people shares with X-ray astronomy the problem of having fewer photons than you would like to work with. Thus the Lixiscope (low intensity x-ray imaging scope), a portable, low-energy X-ray scanner to which NASA holds the patents, is widely used in neonatology, out-patient surgery, diagnosis of sports injuries, and third world clinics. The FDA even used it to search for poisoned capsules during the Tylenol scare a few years ago. A second generation spin-off, the Fluoroscan imaging system, has a variable power X-ray tube source among other improvements and a wider range of applications, including catheter placement.

B. Industry

The two kinds of spin-off (driving development and originating ideas) are illustrated by astronomical interactions with photography and the communications industry.

As early as 1912, C.E.K. Mees (the first research director at Eastman Kodak) initiated research leading to special series of spectroscopic plates to meet astronomical needs. The sensitizing dyes and emulsion-making techniques resulting from this work led to products of wide utility. One example is gold sensitization, which made possible Tri-X and a number of other 400-speed films from Kodak and other manufacturers. These have dominated the professional and amateur high speed film market for a number of years.

Kodak Technical Pan film, whose sharp resolution and fine grain permit enormous enlargements, is used by medical and industrial spectroscopists, industrial photographers, and serious fine-art photographers. It was first developed for solar astronomers interested in recording changes in fine scale surface structure.

Red and infrared-sensitive emulsions, evolved for spectroscopic plates, now penetrate military camouflage, and detect diseased crops and forests. Other applications include dentistry, medical diagnosis, and probing below the surface of paintings for evidence of forgery or pentimentos. Hypersensitization techniques, developed by astronomers during the 1970s, show promise in medical and industrial microscopy and in autoradiography.

Radio astronomy has been a copious source of transferable technology, algorithms, and people interested in applying them, especially in communications. Millitech, whose founders came from the University of Massachusetts radio astronomy group, now builds millimeter wavelength components based on devices used in radio astronomy, for the communications industry. Their products include varactor multipliers, voltage-tunable Gunn oscillators, and cooled GaAs Schottky mixers (Weinreb and Kerr 1983).

Radio astronomers also founded Interferometrics (Vienna, Virginia) which tests and evaluates antennas using holographic methods first reduced to practice by British radio astronomers. A holographic map of a dish surface takes a few hours (versus several days for a mechanical survey) to reveal high and low spots that must be corrected before (for instance) sidebands are low enough to meet FCC standards for satellite communication links.
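The principle can be illustrated in a few lines: the far-field beam pattern of a dish is, to a good approximation, the Fourier transform of the field across its aperture, so transforming a measured beam back yields the aperture phase and hence a map of surface errors. The wavelength, grid size, and bump height in this numpy sketch are illustrative assumptions, not values from the report:

```python
import numpy as np

lam = 0.01                      # assumed test wavelength, metres (30 GHz)
n = 128
# Reflection off a surface spot displaced by delta adds a round-trip
# path-length phase of 4*pi*delta/lambda to the aperture field.
delta = np.zeros((n, n))
delta[40:50, 40:50] = 1e-4      # a 0.1 mm high spot on the dish
aperture = np.exp(1j * 4 * np.pi * delta / lam)

# Far-field (beam) pattern: Fourier transform of the aperture field.
beam = np.fft.fft2(aperture)

# Holography inverts the measured beam and reads off the phase,
# mapping the high and low spots directly.
recovered_surface = np.angle(np.fft.ifft2(beam)) * lam / (4 * np.pi)

assert np.allclose(recovered_surface, delta)
```

A real measurement samples the beam with a reference antenna to obtain both amplitude and phase; the speed advantage over a mechanical survey comes from needing only one such scan and one transform.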

High density recording techniques have come from both NRAO and Haystack Observatory. Digi Data of Maryland is marketing several versions of the NRAO version (which achieves 2.5 Gbyte capacity and 120 kbyte/sec data rate by storing digital data in analog form) for archiving of business data, disk backup, and other applications. The Haystack technique uses a 36 channel, high accuracy narrow-track headstack that can be moved precisely across tape to increase the density of the recorded information by a factor of more than 12, so that a single reel accommodates nearly 6 terabits and can record at a rate in excess of 1 Gbit/sec. Honeywell of Denver is now producing these high-density headstacks as a standard component.

Radio astronomers have been both drivers and developers of low noise amplifiers, including cryogenically-cooled gallium-arsenide field effect transistors (now marketed by Berkshire Technologies, also founded by radio astronomers) and high electron mobility transistors, which may replace masers in some communications amplifiers.

The computer control language FORTH was invented by a professional programmer with a strong interest in astronomy and first applied by him to coordinate telescope operation, data acquisition, and initial reduction for the NRAO 36-foot dish at Kitt Peak. It has grown into a profitable company (Forth, Inc., Manhattan Beach, California) and been modified for a wide range of purposes in manufacturing and service industries. About 20 vendors supply Forth systems for hardware from handheld computers to VAX mainframes. Some computers (most recently the Harris RTX 2000 microprocessor) execute Forth directly. The system is currently used in a rule based ("expert system") automobile engine analyzer at over 20,000 service stations world wide and in a high-accuracy densitometer used by Kodak for quality control in film manufacture. The initial support from NRAO and wide diffusion of Forth through the astronomical community were instrumental in its development into a broadly-applicable system.

Other examples of fruitful technology transfer from astronomy include:

Use of AIPS (a set of image processing programs from radio astronomy) by Boeing to test computer hardware (several vendors, including Convex and International Imaging Systems, advertise that their systems support AIPS).

General Motors' application of IDL to analyzing data on car crashes.

Acquisition of the patents for the first gravitational radiation detectors by Hughes Research Laboratory for use in modified form to sense gravity anomalies associated with underground oil pools.

Use of the IRAF image processing program at AT&T for solid state physics graphics and computer systems analysis.

Cold spot welding techniques that do not distort the underlying metal, developed at University of Wisconsin during construction of OAO-1.

C. Defense

The common technological needs of astronomical observations and of certain defense programs have often resulted in one research community developing techniques or making observations useful to the other. For example, satellite and aerial surveillance have replaced many ground-based intelligence activities. The resulting increased certainty (on both sides) that accurate information will be available has contributed to recent progress in arms reduction. Surveillance requires telescopes with large accurate mirrors, precision optics, and the ability to process numerous imperfect images and extract the maximum possible amount of information. The necessary large mirror technology, adaptive optics, and processing algorithms have all had significant input from techniques developed within astronomy and by people trained as astronomers, from the time of the U2 cameras to the present. Some specific examples which extend across the electromagnetic spectrum:

A recent investigation at Grumman on recognizing rocket plumes for strategic warning purposes made use both of observations of stars and of model stellar atmospheres to discriminate plumes from cosmic objects.

Aperture synthesis radar is the remote descendent of the radio astronomy technique for which Martin Ryle won the Nobel Prize.

Development of the channeltron was supported originally for ultraviolet astronomy, but it has since found its way into various uv military cameras.

Expertise developed in conjunction with the Kuiper Airborne Observatory has provided direct support to several Navy and Air Force airborne infrared sensor development programs.

Star counts and models of stellar spatial distribution are used to assess data rates for spaceborne signal processors and sensors as well as for satellite pointing and calibration.

Astronomers who had been working on X- and gamma-ray detectors at Los Alamos helped build the instruments for the Vela satellite monitors.

Solar blind photon counters were invented for uv astronomy and later adapted to sensing the uv corona around supersonic objects in daylight and for toxic gas detection.

The Air Force Weapons Laboratory at Albuquerque has issued a number of contracts to astronomers to investigate topics like optical imaging of satellites in geosynchronous orbits using 10-30 meter baseline optical interferometry.

The infrared maps of the sky obtained by IRAS met DoD needs for information that the Air Force Geophysics Lab rocket program had been unable to provide.

The techniques being developed by Itek, LBL, and others for stress polishing of off-axis mirror segments for the Keck telescope have potential defense uses.

The early development of thermonuclear weapons made extensive use of astrophysical knowledge of radiative transfer and temperature/density diagnostics. At the present time, 69 American Astronomical Society (AAS) members are employed at Los Alamos and Sandia National Laboratories and another 32 at Lawrence Livermore National Laboratory, most of them at least partly on programmatic work. A background in astrophysics appears to provide flexibility and skills in carrying out approximate calculations based on integrating information from a variety of sources that are a good match to defense laboratory needs.

The presence of Soviet reactors in space has apparently been known to DoD for some time, but astronomical gamma-ray detectors on the Solar Maximum Mission and on a University of California, Riverside balloon-borne experiment made independent discoveries of the phenomenon (Rieger et al. 1989; O'Neill et al. 1989).

Looking ahead, the Navy is supporting neutrino astronomy for its long-term potential for communicating through the earth and for long distances under water. Solving the engineering problems associated with DUMAND (Deep Underwater Muon And Neutrino Detector) should lead to valuable new oceanographic technology as well. Grazing-incidence X-ray optical devices, which have been reduced to practice for solar astronomy, are likely to find future applications in laser weapons.

Another area where astronomical and defense interests overlap is in the need for precise coordinate systems, times, and time intervals, for use in navigation, clock synchronization, guidance, and secure communications as well as in astrophysics. The fundamental time standards are now atomic clocks, not the earth's rotation, but the determination and dissemination of time data for the U.S. are still the responsibility of the U.S. Naval Observatory (USNO). Accurate measurements of the earth's rotation rate are needed to keep civil time in step with astronomical time. This must be done for navigational and other purposes and is accomplished by a network of radio and optical observing stations, maintained by USNO and observatories of many other nations. Very Long Baseline Interferometry between widely separated radio telescopes was the original driver to turn hydrogen maser clocks into rugged, off-the-shelf items, whose main users are now space communications and DoD. In addition, VLBI methods are currently used to synchronize widely separated clocks at the nanosecond level.

The fundamental celestial coordinate system used for navigation is now a radio based one. The locations of the artificial satellites which make up the Global Positioning System and which transmit their own radio signals are in the process of being tied to the positions of quasars and other distant sources. Inertial guidance systems (for missiles and other purposes) require this accurate astronomical coordinate system for their calibration. Accurate optical star positions are used in surveying and in automated star-tracker guidance systems. The tying together of accurate radio and optical coordinate systems is a topic of current intense study. Finally, because satellite orbits are blind to the assorted wobbles of the earth beneath, correct location of terrestrial targets (for environmental and surveillance imaging as well as bombing) requires accurate forecasts of earth orientation. USNO is also responsible for providing and disseminating this information, which comes largely from VLBI observations of quasars, in the U.S.

D. Energy and the Environment

The search for fossil fuels and alternative energy sources has benefited from astronomical spin-offs in several contexts. For instance,

Texaco, Inc. and BP America both use the image processing program IDL for analysis of drilling core samples and other aspects of petroleum research.

SAIC (San Diego) has built solar radiation collectors up to 16 meters in diameter using graphite composite materials first developed in design studies for a proposed orbiting telescope called the LDR (Large Deployable Reflector).

Grazing incidence X-ray optics was reduced to practice for solar astronomy and now finds application in plasma diagnostics for magnetically confined plasma fusion. Detailed knowledge of atomic spectra at high temperatures, gained from study of the solar corona, is also important in this context.

Plasma and magneto-hydrodynamic phenomena, including magnetic reconnection and radiation-driven thermal instabilities, were first explored in solar and space physics environments. They also occur in fusion plasmas (and are deleterious there).

Remote sensing from orbiting satellites is now the method of choice for keeping track of an enormous range of ecologically important factors: the extent of the Arctic ice pack, the moisture content of soil in the Sahel, upper-atmosphere profiles of temperature, density, and trace constituents, sea surface temperatures, and many others. Astronomically-derived image processing algorithms are widely used in these applications. Several of these are mentioned elsewhere. Another with many remote sensing and oceanographic uses is a digital correlation technique for spectral analysis of broadband signals which came out of radio astronomy (Weinreb 1963; Cooper 1976).
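The correlation technique rests on the Wiener-Khinchin theorem: the power spectrum of a signal is the Fourier transform of its autocorrelation, so a digital correlator need only accumulate lagged products of the sampled signal. A minimal numerical sketch of the idea (the 123 Hz tone, sample rate, and noise level are illustrative, not from the source):

```python
import numpy as np

# Synthetic signal: a 123 Hz tone buried in broadband noise (illustrative values).
rng = np.random.default_rng(0)
fs, n = 1000.0, 4096                      # sample rate (Hz), number of samples
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 123.0 * t) + 0.5 * rng.standard_normal(n)

# Step 1: estimate the autocorrelation (via zero-padded FFT here for speed;
# a hardware correlator accumulates the lagged products directly).
X = np.fft.rfft(x, 2 * n)
acf = np.fft.irfft(np.abs(X) ** 2)[:n] / n

# Step 2 (Wiener-Khinchin): the power spectrum is the Fourier transform
# of the autocorrelation.
psd = np.abs(np.fft.rfft(acf))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

peak_hz = freqs[np.argmax(psd[1:]) + 1]   # skip the DC bin
print(f"spectral peak near {peak_hz:.1f} Hz")
```

The recovered spectral peak lands within one frequency bin of the injected 123 Hz tone, illustrating why correlators are a compact way to get broadband spectra from noisy signals.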

Specific radio, microwave, and infrared spectroscopic methods from astronomy have also proven useful in environmental applications from space and ground. Downward looking millimeter wave sounding traces back to work on the atmospheres of Venus and Mars and was validated for the earth by radio astronomers using balloon borne telescopes. The technique is operational on the current Defense Meteorological Satellite Program (DMSP) and will be the primary temperature sensor on the next generation of NOAA satellites in the 1990s.

Millimeter wave technology in space (e.g., Staelin 1981) is sensitive to composition as well as temperature of the atmosphere, including greenhouse gases in low concentrations. Microwave sounders, scheduled for the ATLAS series of spacelab experiments and for the Earth Observation Satellites, were developed by a consortium of American and European radio astronomers and atmospheric scientists.

A particularly timely application of microwave astronomy techniques from the ground is study of chlorine chemistry (relevant to ozone depletion) in the Antarctic. In September 1986, instruments developed by radio astronomers at SUNY, Stony Brook found a hundred times the normal concentration of chlorine oxide at an altitude of 15-20 km in the Antarctic ozone hole. The excess disappeared in October, verifying the role of manmade chlorine compounds in ozone depletion. The detailed chemistry had earlier been tested by the group's measurements of the diurnal variation of chlorine oxide in the middle stratosphere above Mauna Kea. The Antarctic spring cycle of chlorine oxide rise and fall was followed through the 1987 season with better instrumentation yielding the full concentration profile from 16 to 40 km. Monitoring at about five sites around the world over the next 10-20 years is planned as part of the NASA-sponsored Network for the Detection of Stratospheric Change.

E. Everyday Life

Many of us benefit regularly from the machinery used to X-ray luggage in airports, whose design descends from that of the early rocket- and satellite-borne X-ray telescopes. Airport surveillance for drugs and explosives makes use of a particular gas chromatograph design supported by NASA for use on Mars. Some other mundane spin-offs from ground and space-based astronomy include:

A hand-held CCD photometer developed by astronomers at the University of Hawaii for use by policemen checking the transparency of automobile windshields.

A non-invasive probe for contaminants likely to cause structural weakening in historic buildings: it has a neutron source and gamma-ray spectrometer, was first used to analyze lunar soil, and has been tried by astronomers at GSFC in a Colonial Williamsburg smoke house and at St. Mark's Basilica in Venice to look inside the walls behind fragile mosaics.

Software to process two-dimensional images on a personal computer, developed by Michael Norman at the National Center for Supercomputing Applications (Illinois) for his own astronomical purposes and modified for public consumption; about 10,000 copies have been sold.

Use of Forth in the hand-held computers carried by the 40,000 delivery agents of one of the major express mail firms.

Application to industrial and amateur photography of enhancement techniques developed by David Malin for handling astronomical images from large telescopes (Malin 1982, 1990).

F. Looking Ahead

Technology transfer is an ongoing process. For instance, observers are currently driving CCD technology (as they did photography earlier) in the direction of thinning the chips to broaden the range of wavelengths over which they are sensitive. And astronomers are pushing for cryogenic infrared array detectors with very low backgrounds and long integration times, so that they can be used at low light levels. These technologies are likely to prove useful for non-astronomical purposes.

X-ray astronomers have been responsible for the development of bolometers and superconducting devices as non-dispersive spectrometers. The entire energy of the absorbed X-ray is transformed into an electrical signal via phonons, producing a much larger response for a given X-ray energy than in photoelectric detectors. These have potential applications in non-destructive testing and in medicine, where getting the largest possible signal out of the fewest possible X-rays is also important.

Many radio astronomy observatories with millimeter-wave antennas are currently developing SIS (superconductor-insulator-superconductor) mixers for low noise receivers. NRAO is among these and has begun technology exchange with several commercial and government organizations (Hypress, NRL, the National Security Agency, etc.) who are interested in non-astronomical applications. Millimeter-wave astronomers are also working on error-correcting secondary mirrors and lenses. Such error-correcting optics is likely to be part of high-performance communication, surveillance, and other non-astronomical antennas and telescopes of the future.

G. Astronomy and the Private Sector

No other branch of science, except medicine, has had as much support as astronomy from private individuals, industrial firms, and foundations. Two of our great observatories, Lick and McDonald, bear the names of the men whose bequests founded them. Both are now largely maintained by state and local, not federal, funding. Contributions from Rockefeller and Carnegie and the foundations they established have built and helped maintain the Yerkes, Mt. Palomar, Mt. Wilson, and Las Campanas Observatories. More recently, Oscar Meyer provided some much-needed new buildings for Palomar. And money from the Keck Foundation is even now being transformed into a ten-meter telescope that will be the largest American optical observing facility for the next generation.

The motive for this generosity (apart from tax laws) appears to have been the breadth of vision needed to span a nation with railroads or to build up a steel industry, appreciating the breadth of vision needed to span the Universe and build an understanding of it. Other interactions have been of more obvious mutual benefit. Kodak has donated the several thousand 14" × 14" photographic plates needed for the second Palomar Observatory Sky Survey because this use with long exposure times and low light levels provides a critical test of their emulsions. A recent document from the American Institute of Aeronautics and Astronautics (1989) encourages federal support of the Hubble Space Telescope and similar projects because "such cutting edge technology programs stimulate commercial spin-offs of potentially great value to industry and to the nation's economy."

The process of compiling this report revealed that people whose livelihoods in no way depend upon astronomy can nevertheless feel that it is an essential activity. Whenever the AASC received a bit of publicity, they wrote, phoned, and sent photocopies emphasizing that astronomy is needed to attract students into science and technology, to inspire long-range advances (e.g., neutrino communication), and to form part of the human intellectual adventure-much the same points the Panel has identified.

IV. MAN'S PLACE IN THE UNIVERSE

As far back as history records, peoples have attempted to understand how the world got to be the way it is, what the big picture is, and how we fit into it. Anthropologists call the answers (even answers they believe) creation myths. Our modern Western myths have a long history, with input from Greek philosophy, from Judeo-Christian religious ideas, and, at several critical points, from astronomical research.

The Copernican revolution was the most obvious and far reaching of these. The earth ceased to be the unique center of everything and declined to merely one of several planets orbiting the sun. With further celestial study, our sun, in turn, metamorphosed into a typical, undistinguished star, not even at the center of the Galaxy. A third of the way into the 20th century, our Milky Way Galaxy itself had shrunk to a status neither special nor central to anything. In fact, cosmic models incorporating general relativity show that all places in the Universe are equivalent, there being neither any center nor any edges. And in just the last few years, a picture of the very early Universe motivated by theory on the frontier between cosmology and particle physics has made it seem plausible that the Universe-the entire four-dimensional space-time with which we might ever communicate-is only one of many universes, dictionary definitions notwithstanding.

Curiously, other recent astronomical research has pushed our thinking back a little bit in the other direction. The life-bearing earth really is very different from the other nearby planets. Looking down at it from space, we can see our home as a single, small, fragile entity, whose residents all have a common, profound interest in its well being.

Other 19th and 20th century discoveries clarify other aspects of our relationship to the rest of the Universe. The spectra of the sun and stars show absorption and emission lines in just the same patterns that are radiated by common chemical elements when you heat them in the laboratory. Thus celestial objects do not consist of some "quintessence" or substance unique to them. They are made of the same stuff that we are, and even in more or less the same proportions. Apart from helium (which forms no stable compounds), the commonest elements in the stars are the hydrogen, oxygen, and carbon that make up most of our bodies. Close study of spectra of distant galaxies and quasars reveals not only this commonality of composition but also that the constants and laws of physics are the same at distant times and places as they are here and now on earth.

The totality of modern astronomy makes up a major part of our Western creation myth, answering many of the traditional questions about how big, how old, and what came before. The world, or Universe, is large. It is the same in all directions (on large enough scales). It expands and is only three or four times older than the earth itself (but the earth is some 100 million times older than the span of an average human life). We are made of starstuff-chemical elements built up from hydrogen atoms by nuclear reactions in massive stars. And chemical reactions in interstellar gas and in the material that formed the meteorites and comets have produced the same molecules that are the building blocks of living creatures on earth.

The task of clarifying our relationship to the rest of the Universe is an on-going one, with many important questions still incompletely answered. It is, for instance, just becoming meaningful to ask whether the Universe could have been very different from what it is (in size, age, laws of physics, kinds of particles, and so forth) and whether such a different Universe could have life arise in it. On smaller scales, detailed studies of Mars and Venus will play an important part in understanding the early evolution of the earth's atmosphere, oceans, and biosphere, and in determining just how delicate the present state of terrestrial habitability is likely to be.

The "where do we belong" aspect of astronomy seems to be responsible for most of the popular interest in the subject. The potential for technology transfer and for attracting students into the sciences may be good (though not central) reasons for funding astronomical research. But they are not the reasons that people watch Cosmos, buy and build small telescopes, or read books and articles about astronomy. Rather, these people are seeking new answers to the old human questions about the world and our place in it.

An important property of the modern creation myth is that its answers are neither static nor given by fiat. Everything (or nearly everything) within the sciences is subject to change without notice. Our picture of the Universe expands and evolves as our knowledge expands and evolves. A vigorous continuation of this process can help to keep human minds flexible enough to deal with immediate practical problems that now also change on timescales much less than a human lifespan. Practicing astronomers feel a great deal of certainty that, although any given piece of information may turn out to be wrong, the basic process of inquiry is sound and leads to continuously better understanding of the world around us. Confidence that the Universe is neither incomprehensible nor intrinsically hostile is perhaps the most important return astronomers can offer to their fellow citizens.

Astronomy as part of our world view has a less serious side as well. For instance, the phrase "black hole" seems to have entered the everyday vocabularies of Rep. Bill Frenzel (R-Minn.), New York Yankee Don Mattingly, and Supreme Court Justice Sandra Day O'Connor in various contexts (Montgomery Journal, April 26, 1990, p.2). "Astronomical" distances and amounts of money and "zeniths" and "nadirs" of achievement and despair are also common phrases.

Modern astrophysics has not inspired any artistic works comparable with Dante's treatment of medieval cosmology's circles of heaven and hell, but Van Gogh's "Starry Night" reveals a mind not unmoved by the Universe as it is now understood. And each new astronomical discovery-novae, supernovae, neutron stars, black holes, multiple Universes, and many others-has inspired science fiction films, stories, and novels tying these discoveries to possible individual lives.

V. INTERNATIONAL COMPETITION AND COOPERATION IN ASTRONOMY

Science and technology are normally perceived as factors in international competition, both military and economic. That aspect is by no means absent in astronomy. Most of us are really rather proud of the long list of American "firsts" and "bests" (the Apollo Program and Viking Landers, large optical telescopes, Uhuru and COBE, the VLA and the VLBA, and so forth) and pleased by the leading role that the U.S. has played in the International Ultraviolet Explorer (IUE), the Infrared Astronomy Satellite (IRAS), the Hubble Space Telescope, and the establishment of intercontinental networks of VLBI stations.

We believe that it is important for the U.S. to continue to take, and be seen to take, a position of leadership in astronomy, astrophysics, and space exploration. Benefits of remaining at the forefront in research include a strong positive image in the eyes of the nations we interact with, the potential for future spin-offs, and opportunities for fruitful international collaborations. The ability of the U.S. to continue to attract outstanding students and young researchers from abroad is also vital for the continued health of science and engineering here. About one-quarter of the astronomical research community in both senior and entry-level positions is foreign born (Trimble 1988). Among graduate students in engineering and physics, the proportion is roughly one half. Because the results of astronomical research receive a good deal of media attention, leadership in this field can contribute disproportionately to a positive American image abroad.

The world is, however, entering an era in which international cooperation will replace competition, at least so we all devoutly hope and possibly even rationally expect. Among the sciences, astronomy has an unusually long and rich history of internationalism dating back to before the 19th century. One important driver for collaborations is the absolute necessity of observatories spaced over the surface of the earth in order to see the whole sky. In addition, the perceived impracticality of astrophysical research has probably helped it to be seen as a safe area for contact when there were not many others. Beginning in 1887, two American observatories were among the 18 world wide that banded together to produce a map of the entire sky (Carte du Ciel). The International Astronomical Union was the first of the modern international scientific unions organized, in 1920, under the Treaty of Versailles, with the U.S. as one of the founders.

A large fraction of new observing facilities, spacecraft, and research programs are international collaborations. The European Southern Observatory and the Canada-France-Hawaii telescope are self-explanatory. The latter peacefully shares the top of Mauna Kea with a British infrared telescope and several American projects. Construction of a Japanese observatory there is expected to begin in the next decade. Other recent success stories include the following:

The sharing of the International Ultraviolet Explorer observing time between the European Space Agency and NASA (in ratio 1:2) over the past 13 years.

An American (University of California-Berkeley) spectrometer launched by a Japanese rocket to study the cosmic microwave background.

European-built instruments on IRAS and HST, with proportionate sharing of the observing time.

An American instrument on the Soviet Vega 1 and 2 spacecraft that flew past comet Halley. Soviet scientists participated in the Voyager 2 encounter with Neptune.

The 20 percent of the papers published in the Astrophysical Journal in 1990 that had authors headquartered in the US and at least one other country (Abt 1990).

Scientists and administrators in the United States and the Soviet Union have even begun exploring the idea that the first manned mission to Mars should be a joint one.

On a more personal level, most American astrophysicists count at least a few foreign colleagues, often including some from ideologically very diverse countries, among their closest friends, though they may not


Venus Research Focus Areas

Improved understanding of Venus is essential to better appreciate the full range of terrestrial planet origin and evolution present in our own solar system and to interpret observations of new earth-sized planets being discovered around other stars. The prior articles in this issue have each identified key open issues as well as the measurements or approaches that are needed to resolve them. These approaches can be broadly categorized as (1) earth-based observations, (2) laboratory studies, (3) modeling studies, or (4) new spaceflight missions.

Earth-Based Observations

Venus is our closest neighbor and is very well suited to observation from Earth (or from near-Earth observatories in space). These observations are complementary to Venus spacecraft observations in a number of fields, from geology to atmospheric composition and dynamics (Fig. 1).

Fig. 1 Earth-based observations are useful for increasing our understanding of Venus in a wide range of scientific areas. (A) Global average lower cloud cover as determined by Tavenner et al. (2008) using Earth-based observations from the IRTF. (B) Polarized radar image of Hyndla Regio and Zirka Tessera, which can be used to identify the extent of fine-grained deposits (Campbell et al. 2015).

For atmospheric composition, ground-based observatories can provide a broad range of spectral coverage, including regions of the spectrum for which instrumentation has not yet flown on Venus orbital missions. For example, ground-based observations by Allen and Crawford (1984) provided the discovery of near-infrared spectral windows at 1–2.5 μm, which allowed mapping of tropospheric gases on the night-side of Venus; it was then over 20 years before an orbital instrument observing in this spectral range reached Venus. The best spectral resolutions reachable in ground-based facilities are also highly complementary to those achievable in Venus orbit; for example, tropospheric HF (Bézard et al. 1990) and mesospheric ClO (Sander and Clancy 2018) have been measured from the ground but not from orbit. Further atmospheric species may be detected as new facilities become available, thus providing important drivers to develop new orbital instrumentation capabilities.

Spatial mapping from ground-based observations provides viewing geometries different from, and complementary to, those achieved from Venus spacecraft. Encrenaz et al. (2012, 2016) have mapped HDO and SO₂ abundances across the full disk of Venus, and have tracked their variation on timescales ranging from minutes to years. The full-disk measurements of horizontal distribution are complementary to the point measurements and vertical profiles measured from Venus Express (Marcq et al. 2018, this issue).

Ground-based observatories can also provide monitoring over long periods of time, particularly times when no spacecraft are at Venus. The stand-out example of this is long-term monitoring of mesospheric sulphur dioxide: in addition to measurements by Pioneer Venus (1978–1992), Venera 15 (1983–1984), and Venus Express (2006–2014), mesospheric SO₂ was monitored in the UV from sounding rockets and from the Hubble Space Telescope (Esposito et al. 1997); it has also been measured in the thermal infrared (IR), as discussed above, and now also in sub-mm ranges from observatories such as JCMT and ALMA. Continuing these observations in the coming decade, when no Venus missions are planned, will help constrain SO₂ (and, by extension, perhaps volcanic activity) in this period.

High spectral resolution also allows direct measurement of winds through Doppler velocimetry, and detection of trace chemicals. To be scientifically valuable, Doppler velocimetry must achieve accuracies on the order of 10 m/s or better, which requires spectral resolutions of λ/Δλ > 3×10⁷ or higher, depending on viewing geometry. This is now achievable in a range of earth-based telescopes, from sub-millimetre (ALMA, NOEMA, JCMT) to visible-near-IR (HIPWAC, THIS, ESPaDOnS)—all of these have been used to measure winds, at altitudes ranging from 60–110 km, depending on the spectral feature observed (Sanchez-Lavega et al. 2017, this issue). Of particular interest is the 90–120 km altitude region, which marks a transition from the retrograde zonal circulation in the mesosphere to a subsolar-to-antisolar circulation in the thermosphere—there are scarcely any clouds in this rarefied portion of the atmosphere, so the only wind velocities measured in this highly variable region come from ground-based Doppler velocimetry. Further observation campaigns, from observatories offering ever more spatial and spectral resolution, will help to understand this variability.
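The quoted resolving power follows directly from the Doppler relation Δλ/λ = v/c: resolving a 10 m/s line-of-sight velocity requires λ/Δλ of roughly c divided by 10 m/s. A one-line numerical check:

```python
C = 299_792_458.0  # speed of light, m/s

def resolving_power(delta_v_ms: float) -> float:
    """Resolving power lambda/d_lambda needed to detect a Doppler shift of delta_v_ms (m/s)."""
    return C / delta_v_ms

print(f"{resolving_power(10.0):.2e}")  # ~3e7, matching the requirement quoted above
```

The same relation shows why sub-mm heterodyne spectrometers, which routinely reach such resolving powers, dominate this kind of wind measurement.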

Wind fields at cloud level can also be measured from Earth using feature tracking. Single-station observations have tracked meridional profiles of zonal winds, using observation sequences a few hours long; longer durations can be obtained either by co-ordinating observatories distributed in longitude, or by using observatories at polar latitudes. Of particular note is the possibility of telescopes carried by stratospheric balloons at 35 km altitude; these are above most atmospheric turbulence and water vapor, offering observing conditions intermediate in spatial resolution and temporal duration between ground-based and Venus satellite observations (Young et al. 2008), providing a unique dataset to study large-scale atmospheric variability on day- to week-long timescales.
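Feature tracking reduces to finding the displacement that best aligns two images taken a known time apart; dividing that displacement by the time baseline gives a wind vector. A toy sketch using phase correlation on a synthetic cloud field (all numbers are illustrative; real pipelines must also handle sub-pixel shifts, viewing geometry, and noise):

```python
import numpy as np

rng = np.random.default_rng(42)
frame1 = rng.standard_normal((128, 128))          # stand-in for a cloud-contrast image
dy, dx = 3, 7                                     # true displacement in pixels
frame2 = np.roll(frame1, (dy, dx), axis=(0, 1))   # second frame, shifted (circular in this toy case)

# Phase correlation: normalize the cross-power spectrum so its inverse FFT
# is a sharp peak at the displacement.
F1, F2 = np.fft.fft2(frame1), np.fft.fft2(frame2)
cross = F2 * np.conj(F1)
corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12))
iy, ix = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
shift = (int(iy), int(ix))
print("recovered shift:", shift)                  # (3, 7)

# Convert to a wind speed under assumed (illustrative) scales:
km_per_pixel, dt_s = 20.0, 3600.0                 # e.g. 20 km/pixel, 1 h between frames
speed_ms = np.hypot(*shift) * km_per_pixel * 1000.0 / dt_s
```

Phase correlation is only one of several matching schemes used for cloud tracking; plain cross-correlation over image patches is common as well, and the choice mainly trades robustness against peak sharpness.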

While the above observations all discuss the atmosphere, the surface of Venus can be observed from Earth, too. The highest resolution ground-based radar images of Venus, from the Arecibo observatory, reach spatial resolutions of 1–2 km; while this is an order of magnitude poorer than Magellan radar images, the long temporal baseline offered by decades of observation allows a search for temporal changes on these timescales. In addition, ground-based images include polarimetric information not captured by Magellan, allowing constraints on surface properties. For example, polarimetric information has been effectively used to map impact crater ejecta that mantle tessera terrain. This information is important for selecting potential landing sites that are not covered by materials derived from other surface locations (Campbell et al. 2015). Polarimetric information has also been demonstrated to be helpful for identifying deposits of granular material (a few cm in diameter) that have been interpreted as pyroclastic material from explosive volcanic eruptions (Campbell et al. 2017). Future ground-based radars, such as the Square Kilometer Array (SKA), could provide an increase in collecting area of two orders of magnitude compared to current radio telescopes (e.g., Carilli and Rawlings 2004) and may be useful for mapping Venus surface properties and for identifying any changes due to volcanism or tectonism. The SKA is expected to begin preliminary science observations as early as 2020 using a partial array.

Earth-based radar can also be used for monitoring of Venus’ spin. Previous measurements of Venus’ spin rate from ground- and space-based radar have varied by around 1 part in 10⁵, equivalent to uncertainties of about three minutes in Venus’ sidereal day. Some variation in the spin rate is expected as a result of momentum exchange between the planet and its massive atmosphere, as well as due to solar tidal forcing and possible mantle-core interactions (Cottereau et al. 2011; Navarro et al. 2018); measuring variations in the spin state would therefore help constrain these parameters. Radio signals reflected from the surface of Venus exhibit spatial inhomogeneities, or speckles; a cross-correlation between observations of these speckle patterns from different observatories on Earth allows measurement of the instantaneous spin rate of Venus. The accuracy achievable in this spin rate measurement is estimated to be Δλ/λ ∼ 10⁻⁶ for a single-frequency measurement using two receiving stations on Earth, or Δλ/λ ∼ 10⁻⁸ using multiple frequencies and arrays of receivers (Karatekin and Holin 2016). These measurements are logistically demanding, due to the use of multiple observatories, but may lead to valuable constraints on interior structure not achievable from an orbiter.
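The speckle method reduces to estimating the time lag at which the echo records from two stations are maximally correlated; that lag, combined with the known station baseline, fixes the speckle drift speed and hence the instantaneous spin rate. A toy sketch with a synthetic one-dimensional speckle trace (sample rate, lag, and baseline are illustrative, not real observing parameters):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 1000.0, 5000                       # sample rate (Hz), samples per record
trace = rng.standard_normal(n)             # stand-in for the drifting speckle pattern
true_lag = 37                              # samples: station 2 sees the pattern later
station1 = trace
station2 = np.roll(trace, true_lag)        # circular shift keeps the toy case exact

# Cross-correlate the two records and locate the peak lag.
xcorr = np.correlate(station2, station1, mode="full")
lag_samples = int(np.argmax(xcorr)) - (n - 1)
print("recovered lag:", lag_samples)       # 37

# With a known baseline between stations, the lag gives the speckle drift
# speed, which is proportional to the spin rate (illustrative baseline).
baseline_m = 1.0e6                         # 1000 km between receiving stations
drift_speed = baseline_m / (lag_samples / fs)
```

In practice the correlation is done on the complex radar echoes and repeated over many short windows, so the spin rate is tracked as a function of time rather than fitted once.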

“Amateur” astronomical observations—i.e. those outside academic or research institutions—have shown their worth in other fields of planetary observation, most spectacularly in the observation and even video recording of impacts on Jupiter (Sanchez-Lavega et al. 2010): could they be similarly useful at Venus? Amateur observers typically collect images in visible, UV or near-IR wavelengths with telescopes of <0.5 m primary aperture (Barentsen and Koschny 2008). At these wavelengths, almost all light is from the dayside and any contrasts observed at Venus tend to be associated with large-scale cloud features. Imaging of nightside IR emission has been demonstrated using occulting masks to block out light from the dayside of Venus (Mousis et al. 2014). Although this is an impressive achievement with amateur equipment, such maps have poor spatial resolution and have not yet proved useful for scientific analysis. Video capture of the nightside of Venus is novel, well-aligned with the observing equipment used by many amateurs, and potentially scientifically rewarding, as these observations could reveal lightning flashes and/or meteor impacts. However, such emissions are likely to be very faint and difficult to observe in such proximity to the extremely bright dayside of Venus, and may be beyond the reach of amateur observing equipment in the near future. On the other hand, participation by non-professional scientists (sometimes called “citizen scientists”) in the analysis of datasets obtained by Venus spacecraft is feasible. Enthusiastic participation in Mars spacecraft missions, with tools such as JMARS and Midnight Planets, has shown the public appetite for engaging in planetary scientific data analysis. Such public engagement should be harnessed in future Venus missions, particularly those such as high-resolution radar orbiters that will generate vast amounts of high resolution imagery.

Laboratory Studies

Laboratory work is absolutely fundamental to our ability to interpret observational and modeling results. There are still many areas where new lab work is needed in support of both studies of the rocky surface as well as of the atmosphere. In particular, recent results from Venus Express indicating possible emissivity anomalies in highland regions (Gilmore et al. 2017, this issue) have driven the need for new laboratory work to fully characterize complex temperature effects on the spectra of minerals and rocks in the near infrared wavelength range. Laboratory studies are also needed to better understand the unusual origin of surface features that may have formed through limited subduction driven by mantle plumes (Davaille et al. 2017). In addition, much is still unknown about the weathering environment at the Venus surface. To better constrain observations, it is important to calibrate the oxidation rate of basaltic glass that results in a thick (10 μm) coating of hematite. Assumptions that it would take less than 1 million years to form a weathered coating have led to the inferences that unweathered lava flows observed by VIRTIS (as high emissivity anomalies) are “young”. Additional laboratory work is also needed to determine reaction rates that can form sulfates in the Venus surface environment in order to better understand the interactions of the atmosphere with the surface. Finally, changes in radar emissivity as a function of altitude that were observed by Magellan remain unexplained. Additional laboratory work is still needed to identify plausible semiconductor and ferroelectric substances that could cause this effect.

In terms of better understanding chemical reactions and rates within the atmosphere, there is a strong need for laboratory studies that can place better constraints on chemical reactions occurring at a range of altitudes as well as to characterize the physical and chemical properties of the aerosols that make up the thick cloud layer. For example, because sulfur species play such an important role in Venus’ atmospheric chemistry, rate coefficients are needed for all expected sulfur reactions to constrain photochemical modeling efforts. Laboratory studies of cross sections of sulfur species are also required to constrain models and to assess their potential as candidates for the unknown UV absorber (Marcq et al. 2018, this issue). In addition, for cloud layer and lightning studies, assessment of mechanisms for faster production of H₂SO₄ is needed, as well as laboratory studies of reactions involving SO₃, SO₂, SₓO, and negative ions. Finally, laboratory studies of aerosol chemistry and characterization of optical properties of particulate products are also needed. In particular, laboratory studies of sulfuric acid aerosols at high concentrations, including phase behavior, are required to better constrain the microphysical properties of the aerosols under Venus conditions (Titov et al. 2018, this issue).

In support of future remote sensing observations, laboratory measurements are needed to improve understanding of high-temperature, high-pressure spectra at near-infrared wavelengths of H₂O, HDO, and CO₂ (Marcq et al. 2018, this issue). Laboratory studies are also needed to understand the mechanisms that convert SO₃ and OCS to CO, SO₂, and CO₂, the rate coefficients for these reactions, and to identify appropriate wavelengths for observation. In addition, although it is known that both CO₂ and N₂ are each individually supercritical fluids at Venus surface temperatures and pressures, the critical conditions for the mixture have not yet been identified either theoretically or experimentally (Limaye et al. 2017, this issue).
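
The supercritical-fluid point above can be made concrete with a minimal check. The sketch below compares mean Venus surface conditions against standard handbook critical-point constants for CO₂ and N₂; all numerical values are assumed reference figures for illustration, not results from this article.

```python
# Illustrative check (assumed handbook values, not from this article):
# a pure fluid is supercritical when both T > Tc and P > Pc.

T_SURFACE_K = 737.0    # mean Venus surface temperature, K
P_SURFACE_MPA = 9.2    # mean Venus surface pressure, MPa (~92 bar)

CRITICAL_POINTS = {    # gas: (Tc in K, Pc in MPa)
    "CO2": (304.13, 7.377),
    "N2": (126.19, 3.396),
}

def is_supercritical(t_k, p_mpa, tc_k, pc_mpa):
    """True when both temperature and pressure exceed the critical point."""
    return t_k > tc_k and p_mpa > pc_mpa

results = {gas: is_supercritical(T_SURFACE_K, P_SURFACE_MPA, tc, pc)
           for gas, (tc, pc) in CRITICAL_POINTS.items()}
print(results)  # both gases individually exceed their critical points
```

Because each gas individually exceeds its critical point by a wide margin, the open question flagged in the text is the behavior of the actual CO₂–N₂ mixture, for which no such simple per-species test applies.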

Modeling Studies

Similar to laboratory work, there is a very broad range of modeling work that needs to be completed to aid in interpretation of existing observational data and for planning future observations. The recent VEx and Akatsuki missions have provided important observational data that have driven a surge of progress in atmospheric modeling. Key areas that are very well suited to advancement through modeling work are radiative transfer models to better understand the greenhouse effect (e.g., Lee and Richardson 2011; Lebonnois et al. 2015), chemical kinetics models to better understand atmospheric chemistry (e.g., Zhang et al. 2012; Krasnopolsky 2012), cloud micro-physics (e.g., McGouldrick and Toon 2007), and general circulation models (e.g., Lebonnois et al. 2016). In addition, a great deal of progress has been made in recent years in the development of models to better understand planetary interior dynamics and the conditions under which mantle plumes can form (Smrekar and Sotin 2012).

There are several radiative transfer modeling advances needed that will improve understanding of energy balance in Venus’ atmosphere and its influence on global circulation and climate (Limaye et al. 2017, this issue). In particular, the role of large-scale dynamics, chemical reactions, and cloud processes in the Venusian entropy budget still needs to be studied. The treatment of clouds and cloud formation processes in climate models needs development, and one of the most important open issues is the effect of variability of atmospheric properties such as the abundance of radiatively active gases, cloud microphysical and optical properties, and total opacity. In addition, it is critical to understand how the distribution of sources and sinks of radiative energy drives the atmospheric dynamics, and new studies are needed to understand the processes that most strongly influence Venus’ climate. Finally, better understanding of radiative processes will provide insights into the role of radiation in Venus’ atmospheric evolution, including the onset of greenhouse conditions and the loss of water.
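
As a toy illustration of why radiative transfer is so central to the greenhouse problem, a textbook one-dimensional gray two-stream atmosphere in radiative equilibrium satisfies Ts⁴ ≈ Te⁴(1 + 3τ/4). Inverting this relation for Venus’ observed surface temperature gives a rough sense of the enormous infrared optical depth involved. The effective temperature below is an assumed round value, and the gray model is only a pedagogical sketch, not a substitute for the detailed models discussed here.

```python
# Toy gray-atmosphere estimate (standard textbook approximation, not a
# result from this article): Ts**4 = Te**4 * (1 + 3*tau/4). Solving for
# tau shows the large IR optical depth implied by Venus' surface heat.

T_EFFECTIVE_K = 232.0  # Venus effective radiating temperature (high albedo), assumed
T_SURFACE_K = 737.0    # observed mean surface temperature, K

def required_optical_depth(t_surf, t_eff):
    """Invert Ts^4 = Te^4 * (1 + 3*tau/4) for the total gray IR optical depth."""
    return (4.0 / 3.0) * ((t_surf / t_eff) ** 4 - 1.0)

tau = required_optical_depth(T_SURFACE_K, T_EFFECTIVE_K)
print(f"implied gray IR optical depth: ~{tau:.0f}")
```

An optical depth on the order of a hundred, compared with order unity for Earth, is why small changes in radiatively active gases and cloud opacity matter so much in Venus climate models.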

An area where numerical modeling can be particularly effective is atmospheric dynamics (Sanchez-Lavega et al. 2017, this issue). Recent General Circulation Model (GCM) advances (e.g., Lebonnois et al. 2016) are capable of reproducing important features such as the temperature structure, static stability, and zonal winds. However, work is needed to understand the dynamics of key features (e.g., the cold collar, large stationary gravity waves) and whether and how they couple to the super-rotation. In addition, the role of eddy processes is crucial, but likely involves the complex interaction of a variety of different types of eddies, either forced directly by radiative heating and mechanical interactions with the surface or arising through various forms of instability. There is also a need for improved numerical models that are capable of spatially resolving the polar vortex morphology and accurately reproducing its dynamics, as well as the role of subgrid-scale processes in the angular momentum budget, especially small-scale gravity waves. Finally, the robustness of existing GCMs should be confirmed through inter-comparison between several models, with particular focus on the conservation of angular momentum.

Photochemical studies can also benefit greatly from new modeling work (Marcq et al. 2018, this issue). Detailed dynamical and photochemical studies of the Venus middle atmosphere (∼70–110 km) can be used to understand the photochemistry, dynamics, heating, and microphysics that drive the atmosphere at these altitudes. Existing validated models with updated photochemical schemes can be used for this type of study. Models can also be used to address major questions regarding dynamical exchange between the lower and upper atmosphere. Understanding of aerosols can be improved through new microphysical models of sulfuric acid aerosol formation, growth and decomposition (Titov et al. 2018, this issue). In addition, such studies need to be expanded to other species (e.g., elemental sulfur) that may be consistent with observations of unknown absorbers in the UV and other wavelengths. Incorporation of new microphysical models into regional scale and global scale circulation models can then be used to study feedbacks between microphysics, chemistry, and the momentum and energy balance.

New In Situ Observations

Although much can be learned from Earth-based observations, laboratory studies and modeling, there are many unanswered questions that can only be answered by new observations acquired by a space flight mission. Measurements made in situ, within the Venus atmosphere or from the surface, are critical for understanding Venus’ evolution and for placing Venus into context with Earth and Mars. The last in situ Venus mission was Vega in 1985. The Vega balloons, combined with the earlier Pioneer Venus probes (flown in 1978), raised numerous questions about the composition and structure of the atmosphere that remain unresolved. Likewise, although the Venera and Vega landers measured bulk chemistry of the surface in several locations, surface mineralogy has never been measured, leaving many unanswered questions regarding the origin and evolution of the surface. Many of these questions cannot be fully addressed through remote sensing observations from orbit.

One of the fundamental measurements that is needed is the bulk elemental composition and mineralogy of the surface from key locations, especially tesserae (Gilmore et al. 2017, this issue). Tesserae are thought to be older than the regional plains, and as such may retain evidence of an earlier epoch prior to volcanic resurfacing, including evidence of different climate and weathering regimes. Even the regional volcanic plains that are typical of ∼40% of the Venus surface appear to exhibit significant variability as observed by multiple Soviet landers. Although Venera and Vega chemical analyses indicate an overall basaltic composition (Fegley et al. 1997), details of the mineralogy can be used to understand the petrologic origin of the magmas and possibly even address hypotheses related to rates of volcanic resurfacing. There is also a need to characterize the oxidation state in the deepest atmosphere that interacts with the surface. Combined with measurements of surface mineralogy, knowledge of the oxidation state can constrain what minerals are stable at the surface. Likewise, the sulfur chemistry cycle (Zhang et al. 2012) needs to be constrained through measurements of rock mineralogy and atmosphere composition. Seismic measurements would be invaluable for studying tectonic and volcanic activity, and for constraining internal structure. A comprehensive seismological survey will require technically demanding long-lived surface stations, but precursor studies could be carried out using short-lived landers, infrasonic detectors on balloons, or airglow imaging from orbital platforms (Stevenson et al. 2015; Lorenz and Panning 2018).

Substantially more data are needed on the thermal structure of the deep atmosphere (below 40 km). Very limited data exist for this important region (Limaye et al. 2017, this issue), which contains more than 75% of Venus’ atmospheric mass. Only the Vega 2 lander made any reliable measurements of temperature in this region and those data are too sparse to resolve important structural characteristics, such as the extent of the planetary boundary layer. It is vital that the adiabatic lapse rate for the Venus atmosphere be measured accurately for the conditions found from the Venusian surface to the minimum temperature region, found at about 125 km altitude. In situ platforms are required to secure data on thermal structure below about 35 km. Prior missions have also introduced questions that require new measurements to resolve, such as a possible unexplained gradient in N₂ reported by Oyama et al. (1979) and the as yet unidentified species that is absorbing UV in the upper clouds (58–65 km). Again, to constrain dynamics models, one of the most important open issues in this field is the abundance and variability of radiatively active gases as well as radiative heat fluxes. There is also a need to characterize the aerosol population (Titov et al. 2018, this issue), including number density, particle size distribution and optical properties as well as their chemical origin, in order to constrain micro-physical models of Venus’ clouds.
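
For reference, the first-order theoretical value that in situ temperature profiles such as Vega 2’s are compared against is the dry adiabatic lapse rate, Γ = g/cp. The gravity and CO₂ heat-capacity numbers below are rough literature values assumed for illustration, not measurements from this article.

```python
# First-order sketch (assumed constants, not from this article): the dry
# adiabatic lapse rate Gamma = g / cp for Venus' mostly-CO2 atmosphere.

G_VENUS = 8.87   # Venus surface gravity, m s^-2
CP_CO2 = 1130.0  # approx. specific heat of CO2 near 700 K, J kg^-1 K^-1 (assumed)

def dry_adiabatic_lapse_rate_k_per_km(g, cp):
    """Gamma = g / cp, converted from K/m to K/km."""
    return g / cp * 1000.0

gamma = dry_adiabatic_lapse_rate_k_per_km(G_VENUS, CP_CO2)
print(f"dry adiabatic lapse rate: ~{gamma:.1f} K/km")
```

Departures of a measured profile from this ~8 K/km reference are what reveal stability structure such as the planetary boundary layer, which is why accurate deep-atmosphere profiles matter so much.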

Numerical models of atmospheric dynamics have achieved significant success (Sanchez-Lavega et al. 2017, this issue), but many uncertainties remain, especially in the deep atmosphere. Precise wind field retrieval below the upper clouds (surface to 60 km) as a function of location (longitude/latitude) and local time is an essential ingredient for deriving the vertical distribution of angular momentum, momentum transfer, and the origin of the super-rotation. In addition, observations of waves (e.g., gravity, Kelvin, Rossby and tidal waves) below the clouds are needed to better understand their role in the inertial forces associated with super-rotation. While in situ measurements are generally the most direct approach to measuring these parameters, they are limited by the short lifetimes (hours to days) and small spatial coverage inherent to current in situ approaches. As remote sensing techniques improve and in situ lifetimes are extended, there is ultimately a need for more detailed observations of the deep atmosphere that can match the coverage obtained in the middle atmosphere by missions such as Venus Express.

There are also many questions concerning the composition of the atmosphere, and answering them requires measurements that constrain models of chemical cycles, photochemistry, and radiative transfer (Marcq et al. 2018, this issue). In situ measurements are needed of trace gas species in the cloud region extending from 48–65 km. More importantly, measurements of trace gas abundances and possible gradients are sorely needed below the clouds, where only limited observations are available. Abundances and isotopic ratios of noble gases are of significant scientific interest in regard to Venus’ atmospheric evolution in comparison with Earth and Mars. These gases are unique because they are inert. As such, they are fossil indicators of the earliest processes that formed and modified the original atmosphere. The heaviest noble gases, krypton and xenon, are least susceptible to atmospheric escape and are particularly important. Pioneer Venus and Venera measurements of krypton are discrepant by a factor of four, and xenon has never been quantified (Baines et al. 2013). However, because these species lack detectable spectral features, they can only be measured in situ. Measurements of D/H are also needed in order to quantify the volume of water in Venus’ past as well as the timing and processes for water loss. Deep atmospheric measurements of D/H are needed for the bulk atmospheric value as well as to understand vertical profile uncertainties. Improved measurements of other important isotopic ratios, such as ¹³C/¹²C and ¹⁵N/¹⁴N, provide insights into how Venus’ evolutionary history has been similar to or different from Earth’s. To better understand the coupling of the surface and atmosphere, and the degree to which buffering reactions produce trace gas species in the atmosphere, there is a need to measure trace species in the lowermost scale heights.
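
The logic connecting D/H to past water can be sketched with a back-of-envelope bound. Venus’ atmospheric D/H is roughly 150 times the terrestrial (VSMOW) value; if deuterium were perfectly retained while hydrogen escaped, conservation of deuterium would require the past water inventory to have been at least the enrichment factor times the present one. All numbers below are assumed round values for illustration only, and real estimates must model fractionating (non-total) escape of deuterium.

```python
# Back-of-envelope sketch (all values assumed for illustration, not from
# this article). Perfect D retention + H escape implies, by conservation
# of D:  past_water >= enrichment * present_water.

VSMOW_D_H = 1.5576e-4          # terrestrial standard mean ocean water D/H
ENRICHMENT = 150.0             # Venus D/H relative to VSMOW, order of magnitude
PRESENT_WATER_COLUMN_CM = 2.0  # rough present Venus water, cm liquid equivalent (assumed)

venus_d_h = ENRICHMENT * VSMOW_D_H
min_past_water_cm = ENRICHMENT * PRESENT_WATER_COLUMN_CM
print(f"Venus D/H ~ {venus_d_h:.2e}; past water >= ~{min_past_water_cm:.0f} cm")
```

This is why the text stresses precise bulk and vertically resolved D/H measurements: the inferred past water volume scales directly with the measured enrichment, and escape fractionation corrections depend on the vertical profile.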

New Orbital Observations

There are many important new observations that are attainable using remote sensing techniques from Venus orbit. The two most recent Venus missions (VEx and Akatsuki) were primarily focused on understanding atmospheric chemistry and dynamics, and a new mission focused on geologic and geophysical questions is needed. The only historic Venus mission that addressed questions of global surface and interior processes was Magellan (1990–1994), with its 1970s-era synthetic aperture radar, which collected data in one sense of polarization. While Magellan revealed many interesting geologic features on the surface, numerous mysteries remain that require modern instrumentation to bring Venus up to the level of understanding of Mars. For example, Magellan imaging resolutions (100–200 m) are analogous to Viking images of Mars, and Magellan altimetry with ∼10 km posting is of insufficient precision to constrain models at geologic process scales. In addition, models of surface emissivity using near infrared remote sensing data (Gilmore et al. 2017, this issue) require surface topography with much better spatial and vertical resolution than is possible using Magellan data, particularly in regions of substantial topographic variability (e.g., tesserae). To more fully understand the story of tessera formation and evolution, higher resolution imaging and complete coverage of the surface in the near infrared are also needed. To improve knowledge of temporal changes in atmospheric circulation, there is a need to establish the average albedo of Venus more accurately as well as to detect changes over annual and longer time scales (Limaye et al. 2017, this issue). More precise observations of zonal and meridional winds (Sanchez-Lavega et al. 2017, this issue) are needed to answer questions about polar vortex dynamics, the extent of the Hadley cell, and to reduce wind divergence uncertainty in order to better constrain cloud top sources and sinks.

Synoptic imaging spectroscopy with sufficient spectral, spatial, and temporal resolution and sufficient duration to track variations in SO₂ and SO at the cloud tops across the few hours to few weeks time scales would be invaluable for sorting the relative contributions of photochemistry, dynamics and microphysics (Marcq et al. 2018, this issue). Longer term temporal variations in SO₂ abundance at the cloud tops that extend observations made by Pioneer Venus (Esposito 1984), Venera 15 (Zasova et al. 1993), and Venus Express (Marcq et al. 2012), combined with observations of SO₂ variability in the deeper atmosphere, would provide insights into possible mechanisms (e.g., active volcanism, climate variability, or other mechanisms) that drive observed variations. Optical, radio wave, and electric field monitoring are needed to characterize Venus lightning (Titov et al. 2018, this issue).

Orbiting remote sensing platforms are ideally suited to study the upper atmosphere. A dedicated aeronomy and solar wind interaction mission that can provide coverage of the north and south poles, noon and midnight sectors, and terminator regions, during the full solar cycle, would provide necessary data and allow comparisons with Earth and Mars (Futaana et al. 2017, this issue). Precise measurement of the magnetic field and currents in the ionosphere can provide insights into mantle conductivity, which has implications for crustal water content. In addition to dedicated missions, observations by fly-by spacecraft equipped with plasma and field instrumentation can also provide valuable new data (Coradini et al. 2015). Finally, sustained time-series observations of near-infrared molecular emissions in the mesosphere would increase understanding of observed spectral and spatial variability in airglow on the night side (Gerard et al. 2017, this issue).


The Holy Grail: A road map for unlocking the climate record stored within Mars’ polar layered deposits

Isaac B. Smith, …, Matthew Siegler, in Planetary and Space Science, 2020

1.3.1 Orbiters

Spacecraft investigations of the polar regions of Mars began with the Mariner 7 flyby in 1969, when the infrared spectrometer observed CO2 ice related to seasonal processes (Herr and Pimentel, 1969). Imagery of the polar regions began in 1971 with Mariner 9 and continued with the comprehensive coverage by the Viking orbiters (1976–80). In the modern era, imagery at ever increasing spatial resolution has been obtained from the Mars Global Surveyor (MGS), Odyssey, Mars Express, and Mars Reconnaissance Orbiter (MRO) orbiters using the High Resolution Stereo Camera (HRSC), MOC, THEMIS, CTX, and HiRISE instruments. Infrared observations of surface properties began with multi-channel instruments (the Infrared Thermal Mapper (IRTM) on Viking) and advanced to full spectroscopy and compositional measurements with the MGS Thermal Emission Spectrometer (TES) and Mars Odyssey’s THEMIS, and continue with Mars Express’ Observatoire pour la Minéralogie, l’Eau, les Glaces et l’Activité (OMEGA) and MRO’s Compact Reconnaissance Imaging Spectrometer for Mars (CRISM). Infrared spectroscopy began with the Infrared Interferometer Spectrometer (IRIS) instrument on Mariner 9 and continued with MGS’s TES, Mars Express’ Planetary Fourier Spectrometer (PFS), and MRO’s Mars Climate Sounder (MCS) (e.g. Smith, 2008). Gamma ray and neutron spectroscopy on Odyssey has been used to study ice (Feldman et al., 2002), and laser altimetry from the Mars Orbiter Laser Altimeter (MOLA) has been used to estimate the total thickness of the PLD and measure seasonal elevation change with deposition (Smith et al., 2001). Radar measurements to probe the subsurface have been performed with both the MARSIS and SHARAD sounders (Picardi et al., 2004; Seu et al., 2007).

Polar studies have benefitted from several kinds of orbital observations. Optical and infrared instruments have tracked stratigraphy, geomorphology, seasonal geologic processes, inter-annual variability, composition, albedo, and the physical state of H2O and CO2 ice deposits. Spectrometers track grain size and layer properties on the surface (Calvin et al., 2009; Brown et al., 2016) along with the distribution of hydrated materials, altered glasses and basaltic materials, and the origins of gypsum on the north polar dunes (Langevin et al., 2005; Horgan et al., 2009; Horgan and Bell, 2012; Massé et al., 2012). Instruments that measure atmospheric properties tell us about thermal and humidity profiles, cloud formation, and snowfall (Hayne et al., 2014). Temperature sensing instruments tell us about the thermal inertia of ice-cemented material or the PLD (Putzig et al., 2014). Finally, orbital radars have been instrumental in determining the bulk dielectric properties of the PLD and 3-D stratigraphic relationships inaccessible with cameras.