Plot of best available resolution vs wavelength - radio through gamma rays?

What I'm looking for is a graphic that shows in a general way the best available telescope resolution vs wavelength throughout the entire wavelength spectrum. So for example, there might be two very high resolution peaks around

  1. millimeter wavelengths (ALMA)
  2. visible wavelengths (HST and many ground telescopes with Adaptive Optics)

The beautiful image below from this great answer got me thinking. I have reposted it from there.

Image courtesy of Wikipedia user Hunster under the Creative Commons Attribution-Share Alike 3.0 Unported license.

The infrared, ultraviolet, and x-ray images come from the Spitzer Space Telescope, the SWIFT observatory, and the Chandra observatory, respectively.

After a bit of searching, I found this blog page, which has several charts about various observatories, including this one:

Image courtesy of Olaf Frohn under the Creative Commons Attribution-Share Alike 4.0 License.

The majority are space-based, although the radio telescopes are largely land-based. They cover existing and future telescopes, at energies from the gamma-ray spectrum to radio waves. You are correct, too, in assuming that adaptive optics can cause dramatic increases in angular resolution; CHARA and the European Extremely Large Telescope both use adaptive optics, and actually can have better angular resolutions than some space-based telescopes.

I annotated the graph to cover in green the smallest angular resolution at various wavelengths:

Notice that most of the lines in the radio, microwave, and infrared parts of the spectrum are diagonal, with roughly the same slope. This is because those telescopes are diffraction-limited. Radio telescopes are diffraction-limited because the atmosphere has little impact at those wavelengths; for infrared- and visible-wavelength telescopes in space, and for space-based telescopes in general, the diffraction limit is likewise the main constraint.

The diffraction limit is $$d=\frac{\lambda}{2n\sin\theta}$$ where $\lambda$ is the wavelength and $n\sin\theta$ is the numerical aperture. On a log-log plot, such as the one above, we have $$\log d=\log\lambda-\log(2n\sin\theta)$$ and $$\frac{\mathrm{d}\log d}{\mathrm{d}\log\lambda}=1$$ for all telescopes limited by the equation. Thus, telescopes restricted by this limit should be described by a diagonal line with a slope of 1 (-1 on this graph).
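The slope-of-one behavior is easy to check numerically. A minimal sketch (the numerical aperture value here is made up purely for illustration):

```python
import math

# Diffraction limit from the formula above: d = lambda / (2 * NA),
# where NA = n * sin(theta) is the numerical aperture.
def diffraction_limit(wavelength, na):
    return wavelength / (2.0 * na)

NA = 0.5  # hypothetical fixed numerical aperture

# On a log-log plot, log d = log lambda - log(2*NA), so the slope is exactly 1.
lam1, lam2 = 1e-6, 1e-3  # meters: near-infrared to millimeter
slope = (math.log10(diffraction_limit(lam2, NA)) - math.log10(diffraction_limit(lam1, NA))) \
        / (math.log10(lam2) - math.log10(lam1))
print(round(slope, 6))  # 1.0 for any diffraction-limited instrument
```

Because the aperture term is a constant offset in log space, the slope comes out as 1 regardless of the NA chosen.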

Fermi Gamma-ray Space Telescope

The Fermi Gamma-ray Space Telescope (FGST), formerly called the Gamma-ray Large Area Space Telescope (GLAST), is a space observatory being used to perform gamma-ray astronomy observations from low Earth orbit. Its main instrument is the Large Area Telescope (LAT), with which astronomers mostly intend to perform an all-sky survey studying astrophysical and cosmological phenomena such as active galactic nuclei, pulsars, other high-energy sources, and dark matter. Another instrument aboard Fermi, the Gamma-ray Burst Monitor (GBM; formerly GLAST Burst Monitor), is being used to study gamma-ray bursts and solar flares.

Fermi was launched on 11 June 2008 at 16:05 UTC aboard a Delta II 7920-H rocket. The mission is a joint venture of NASA, the United States Department of Energy, and government agencies in France, Germany, Italy, Japan, and Sweden. It succeeded INTEGRAL as the most sensitive gamma-ray telescope in orbit. The project is a recognized CERN experiment (RE7).

What Types of Radiation Are There?

The radiation one typically encounters is one of four types: alpha radiation, beta radiation, gamma radiation, and x radiation. Neutron radiation is also encountered in nuclear power plants and high-altitude flight and emitted from some industrial radioactive sources.

    Alpha Radiation

Alpha radiation is a heavy, very short-range particle and is actually an ejected helium nucleus. Some characteristics of alpha radiation are:

  • Most alpha radiation is not able to penetrate human skin.
  • Alpha-emitting materials can be harmful to humans if the materials are inhaled, swallowed, or absorbed through open wounds.
  • A variety of instruments has been designed to measure alpha radiation. Special training in the use of these instruments is essential for making accurate measurements.
  • A thin-window Geiger-Mueller (GM) probe can detect the presence of alpha radiation.
  • Instruments cannot detect alpha radiation through even a thin layer of water, dust, paper, or other material, because alpha radiation is not penetrating.
  • Alpha radiation travels only a short distance (a few inches) in air, and is not an external hazard.
  • Alpha radiation is not able to penetrate clothing.

    Beta Radiation

Beta radiation is a light, short-range particle and is actually an ejected electron. Some characteristics of beta radiation are:

  • Beta radiation may travel several feet in air and is moderately penetrating.
  • Beta radiation can penetrate human skin to the "germinal layer," where new skin cells are produced. If high levels of beta-emitting contaminants are allowed to remain on the skin for a prolonged period of time, they may cause skin injury.
  • Beta-emitting contaminants may be harmful if deposited internally.
  • Most beta emitters can be detected with a survey instrument and a thin-window GM probe (e.g., "pancake" type). Some beta emitters, however, produce very low-energy, poorly penetrating radiation that may be difficult or impossible to detect. Examples of these difficult-to-detect beta emitters are hydrogen-3 (tritium), carbon-14, and sulfur-35.
  • Clothing provides some protection against beta radiation.

    Gamma Radiation and X Rays

Gamma radiation and x rays are highly penetrating electromagnetic radiation. Some characteristics of these radiations are:

Light and Telescopes ch. 3

The objective lens at the top of the telescope has a large diameter and a long focal length. The eyepiece lens at the bottom of the telescope is smaller and has a short focal length.

Math: A refractor's magnification is calculated by dividing the focal length of the optical tube by the focal length of the eyepiece.[1]
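As a quick sketch of that arithmetic (the focal lengths below are hypothetical examples, not from the notes):

```python
def magnification(focal_length_tube_mm, focal_length_eyepiece_mm):
    """Refractor magnification = optical-tube focal length / eyepiece focal length."""
    return focal_length_tube_mm / focal_length_eyepiece_mm

# e.g. a 900 mm optical tube paired with a 25 mm eyepiece:
print(magnification(900, 25))  # 36x
```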

4 Common Designs: Newtonian, Cassegrain, Nasmyth or Coude, and Prime focus.

Photon energy = Planck's constant x speed of light/ wavelength

c = 3.0 x 10 ^(5) km/s [1.86 x 10^(5) mi/s]
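Putting the two notes above together, a short sketch of the photon-energy formula in SI units (the 500 nm test wavelength is just an example):

```python
H = 6.626e-34   # Planck's constant, J s
C = 3.0e8       # speed of light, m/s (the 3.0 x 10^5 km/s quoted above)
EV = 1.602e-19  # joules per electron volt

def photon_energy_ev(wavelength_m):
    """Photon energy = Planck's constant x speed of light / wavelength."""
    return H * C / wavelength_m / EV

print(photon_energy_ev(500e-9))  # green light: roughly 2.5 eV
```

Shorter wavelengths give proportionally higher photon energies, which is why gamma rays are so much more energetic than radio waves.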

Explanation: When the source of the waves is moving toward the observer, each successive wave crest is emitted from a position closer to the observer than the previous wave. Therefore each wave takes slightly less time to reach the observer than the previous wave. The waves are "bunched together" as the object moves closer.

Blueshift: the colour change of an approaching object.
Redshift: the colour change of an object moving away.

*Wavelength, hence colour, is affected by motion. As objects move towards or away from you, they change colour.

*Doppler measurements reveal that all of the very distant galaxies are moving away from us, meaning that the universe is expanding.
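For speeds well below c, the wave-bunching argument above reduces to the Doppler relation Δλ/λ = v/c. A small sketch with a made-up recession velocity:

```python
C_KM_S = 3.0e5  # speed of light, km/s

def observed_wavelength_nm(rest_nm, v_km_s):
    """Non-relativistic Doppler shift: positive v = receding (redshift)."""
    return rest_nm * (1.0 + v_km_s / C_KM_S)

# Hypothetical galaxy receding at 3000 km/s; H-alpha rest wavelength 656.3 nm.
print(observed_wavelength_nm(656.3, 3000.0))   # shifted redward, ~662.9 nm
print(observed_wavelength_nm(656.3, -3000.0))  # approaching: blueshift, ~649.7 nm
```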


PKS 1441+25 was detected from 2015 April 21 (MJD 57133) to April 28 (MJD 57140) with VERITAS, an array of four imaging atmospheric Cherenkov telescopes located in southern Arizona (Holder 2011). VERITAS imaged gamma-ray induced showers from the source above 80 GeV, enabling the detection of PKS 1441+25 (VER J1443+250) at a position consistent with its radio location and at a significance of 7.7 standard deviations (σ) during the 15.0 hr exposure (2710 ON-source events, 13780 OFF-source events, OFF normalization of 1/6). Using a standard analysis with cuts optimized for low-energy showers (Archambault et al. 2014, and references therein), we measure an average flux of Φ(>80 GeV) = (5.0 ± 0.7) × 10^−11 cm^−2 s^−1 with a photon index Γ_VHE = 5.3 ± 0.5 up to 200 GeV, corresponding to an intrinsic index of 3.4 ± 0.5 after correction for the EBL (Gilmore et al. 2012, "fixed"). The day-by-day lightcurve is compatible with constant emission in that period (χ²/ndf = 7.4/6), and fractional variability F_var < 110% at the 95% confidence level (Vaughan et al. 2003). Subsequent observations in May (MJD 57155–57166, 3.8 hr exposure) showed no significant excess (660 ON-source events, 3770 OFF-source events, OFF normalization of 1/6), resulting in an upper limit of Φ(>80 GeV) < 4.3 × 10^−11 cm^−2 s^−1 at the 99% confidence level. These results have been cross-checked with an independent calibration and analysis. Monte Carlo simulations indicate systematic uncertainties on the VHE energy scale and photon index of 20% and 0.2, respectively. The systematic uncertainty on the flux of this source is estimated to be 60%, including the energy-scale uncertainty discussed in Archambault et al. (2014).
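The quoted significance can be reproduced from the ON/OFF counts given in the text, assuming the standard Li & Ma (1983, eq. 17) estimator, which is the usual choice for Cherenkov-telescope counting analyses (the text does not state explicitly which estimator was used):

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), eq. 17: significance of an ON-source excess
    over alpha-scaled OFF-source background counts."""
    n = n_on + n_off
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / n)
    term_off = n_off * math.log((1 + alpha) * n_off / n)
    return math.sqrt(2.0 * (term_on + term_off))

# 2710 ON events, 13780 OFF events, OFF normalization 1/6 (from the text):
print(round(li_ma_significance(2710, 13780, 1.0 / 6.0), 1))  # 7.7
```

The result matches the 7.7σ reported for the 15.0 hr exposure.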

The LAT pair-conversion telescope onboard the Fermi satellite has surveyed the whole sky in the HE band since 2008 August (Atwood et al. 2009). We analyzed the LAT data using the public science tools v10r0p5 (Pass-8), leaving free the parameters of sources from the 3FGL (Acero et al. 2015) within a region of interest of 10° radius and fixing them for sources 10°–20° away. We reconstruct the spectrum of PKS 1441+25 between 100 MeV and 100 GeV in four-week (MJD 54705–57169, top panel in Figure 1) and two-week (MJD 57001–57169, middle panel) bins assuming a power-law model with a free normalization and photon index (purple points), as well as in one-day bins (pink points) fixing the photon index to its best-fit average value in MJD 57001–57169, Γ_HE = 1.97 ± 0.02, slightly harder than in the 3FGL, 2.13 ± 0.07. The source was in a high state during MJD 57001–57169, with an integrated 100 MeV–100 GeV flux that is one to two orders of magnitude above the 3FGL value, (1.3 ± 0.1) × 10^−8 cm^−2 s^−1. During the period contemporaneous with the VERITAS detection (MJD 57133–57140), the source shows a flux of (34 ± 4) × 10^−8 cm^−2 s^−1 and a hard index of 1.75 ± 0.08. Although a power-law model is used for robustness in the lightcurve determination, the spectrum shows a hint of curvature, with a log parabola preferred over a power law by 3.2σ (see Figure 2). The curvature is resilient to changes in the analysis and the temporal window, and fits in smaller energy ranges confirm the hint.

X-ray observations with NuSTAR and Swift were triggered following the VHE detection. NuSTAR, a hard-X-ray instrument sensitive to 3–79 keV photons (Harrison et al. 2013), observed the source on MJD 57137 for an exposure of 38.2 ks. The data were reduced using the NuSTARDAS software v1.3.1. Swift-XRT (Gehrels et al. 2004) observed PKS 1441+25 between 0.3 and 10 keV in 2010 June (MJD 55359), in 2015 January (MJD 57028 and 57050), in 2015 April (MJD 57127–57138), and in 2015 May (MJD 57155 and 57160). Data taken in photon-counting mode were calibrated and cleaned with xrtpipeline using CALDB 20140120 v.014. ON-source and background events were selected within regions of 20-pixel (~46 arcsec) and 40-pixel radius, respectively. The XRT and NuSTAR spectral analyses were performed with XSPEC v12.8.2, requiring at least 20 counts per bin. The NuSTAR spectrum is matched by a power law with a photon index of 2.30 ± 0.10 and an integrated 3–30 keV flux of (1.25 ± 0.09) × 10^−12 erg cm^−2 s^−1. No intranight variability is detected. Swift-XRT did not significantly detect the source in 2010 June, but the 2015 observations reveal a power-law spectrum with no detectable spectral variability and an average 0.3–10 keV photon index of 2.35 ± 0.24, using an absorbed model with a hydrogen column density of 3.19 × 10^20 cm^−2 (Kalberla et al. 2005). Significant flux variations are detected in the period contemporaneous with VERITAS observations (χ²/ndf = 25.9/3, F_var = 22.6 ± 0.9%), with a flux-halving time of 13.9 ± 1.4 days based on an exponential fit to the data in MJD 57127–57155 (χ²/ndf = 8.6/6).

Simultaneously with the XRT observations, Swift-UVOT (Roming et al. 2005) took photometric snapshots of PKS 1441+25 in six optical-to-ultraviolet filters. Flux densities were extracted using uvotmaghist and circular ON-source and background regions of 5 and 15 arcsec radius, respectively. PKS 1441+25 has been observed in the V-band since 2012 January within the All-Sky Automated Survey for Supernovae (ASAS-SN; Shappee et al. 2014), using the quadruple 14-cm "Brutus" telescope in Hawaii. The fluxes from both experiments in Figure 2 are dereddened using E(B−V) = 0.043, consistent with the column density used for the XRT analysis (Jenkins & Savage 1974). The 0.68-m Catalina Schmidt Telescope (AZ) has also performed long-term unfiltered optical observations of PKS 1441+25 since 2005 within the Catalina Real-time Transient Survey (CRTS; Drake et al. 2009). Observed magnitudes were converted into the V-band using the empirical method described in Drake et al. (2013). The SPOL spectropolarimeter (Schmidt et al. 1992) has monitored the linear optical polarization of PKS 1441+25 in 5000–7000 Å, with observations at the 1.54-m Kuiper Telescope, the 6.5-m MMT, and the Steward Observatory 2.3-m Bok Telescope (AZ). The source shows a high degree of polarization, with values ranging from 37.7 ± 0.1% to 36.2 ± 0.1% between MJD 57133 and MJD 57140.

The OVRO 40-m telescope (Richards et al. 2011) has monitored PKS 1441+25 at 15 GHz since late 2009. A 15 GHz VLBA image obtained by the MOJAVE program (Lister et al. 2009) on 2014 March 30 (MJD 56381) shows a compact core and a bright, linearly polarized jet feature located 1.2 mas downstream, at position angle −68°. Both features have relatively high fractional polarization (~10%) and electric vectors aligned with the jet direction, at an angle of 102°, similar to that measured by SPOL, indicating a well-ordered transverse magnetic field. The fractional polarization level of the core feature is among the highest seen in the MOJAVE program (Lister et al. 2011).

The 2008–2015 observations of PKS 1441+25 shown in Figure 1 reveal a brightening of the source in the radio, optical, and HE bands starting around MJD 56900. A simple Pearson test (see caveats in Max-Moerbeck et al. 2014a) applied to the radio and HE long-term lightcurves shows a correlation coefficient r = 0.75 ± 0.02, differing from zero by 5.4σ based on the r-distribution of shuffled lightcurve points. Similarly, the analysis of the optical and HE lightcurves yields r = 0.89 ± 0.02, differing from zero by 4.8σ. The discrete correlation functions display broad, zero-centered peaks with widths of ~100 days, indicating no significant time lags beyond this time scale. During the period marked by gray dashed lines in Figure 1, observations on daily timescales from optical wavelengths to X-rays reveal fractional flux variations smaller than 25%, compatible with the upper limits set by Fermi-LAT and VERITAS (30% and 110% at the 95% confidence level, respectively). Such flux variations are small with respect to the four orders of magnitude spanned in νFν, enabling the construction of a quasi-contemporaneous spectral energy distribution in Section 3.
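The shuffled-lightcurve Pearson test described above can be sketched as follows (the lightcurves here are synthetic stand-ins, not the actual OVRO or LAT data):

```python
import random

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

random.seed(1)
# Synthetic correlated "radio" and "HE" lightcurves: a shared rising trend plus noise.
radio = [t + random.gauss(0, 2) for t in range(50)]
he = [t + random.gauss(0, 2) for t in range(50)]
r_obs = pearson_r(radio, he)

# Null distribution: shuffling one lightcurve destroys any true correlation,
# so the spread of shuffled r values calibrates the significance of r_obs.
r_null = []
for _ in range(1000):
    s = he[:]
    random.shuffle(s)
    r_null.append(pearson_r(radio, s))
print(r_obs > max(abs(r) for r in r_null))  # the real correlation stands out
```

This shuffle-based null distribution is the same idea as the "r-distribution of shuffled lightcurve points" used to quote the 5.4σ and 4.8σ figures.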

Figure 1. Top: observations of PKS 1441+25 from 2008 to 2015. Middle: observations from 2014 December to 2015 May. Bottom: observations in April and May. The gray dashed lines mark the period considered for the analyses in Sections 3 and 4.

Things You Might’ve Missed

Stuff Made Here built a Robotic Golf Club to improve their scores, proving it don’t mean a thing if you ain’t got that swing.

Kyle Hill has an interesting video on information hazards, Roko’s Basilisk and future blackmail.

Canadian coffee shop chain Tim Hortons is tracking people's locations. If they're doing it, what else is happening on your phone that you don't know about?

This whole story about eBay executives cyberstalking a couple who were critical of eBay in a newsletter is wild.

Rome-based artist Agnes Cecile’s drip painting is the perfect accompaniment to Ólafur Arnalds’ Fyrsta, from his album Living Room Songs. I thought I’d end with a quote from Winnie The Pooh:

“People say nothing is impossible, but I do nothing every day.”

With the Smart Watch, missing links and AMA I feel I’ve pushed myself a little too much. I’m going to be back in 2 weeks with issue 12 but am considering a change of schedule after that. I hope you’ve enjoyed this issue enough to share it with someone who might like it too. If you haven’t subscribed, now’s the time to do it, below.

Is there anything physically unique about the visual part of the EM spectrum?

My understanding of the electromagnetic spectrum is that everything from radio waves to gamma rays are just electromagnetic waves with different wavelengths.

Is there anything that makes the visual spectrum unique beyond just happening to be what our eyes evolved to see? I understand that some animals have the ability to see some ultraviolet light or have infrared detectors, but for the most part we all seem to see the same very narrow part of the spectrum.

Is there any reason we couldn't just as easily see only UV light or see entirely in the IR spectrum? Going further, is there a reason we don't see radio waves (the sun puts these out, right?) Is there something physically unique about that narrow band of the electromagnetic spectrum?

There are a few interesting properties:

The atmosphere is transparent in this range, meaning that light can reach the surface in this band.

The wavelength is short enough that it provides good resolution. Radio does not provide good enough resolution, and many materials that it's useful to see are more or less transparent at radio frequencies.

The sun provides a lot of energy in the visible spectrum, less so in the RF spectrum.

Not relevant. But your answer says 1. 1. 1.

Also we are day time animals. Some nocturnal animals have developed infrared vision because it is most useful when you don't have sunlight.

Don't you mean that the atmosphere is opaque outside of this visible range? Opaque typically means "not transparent".

The atmosphere is not opaque in this range. If it was opaque, no light would reach the surface.

Point 3 is the most significant to me. The spectrum of sunlight peaks in the middle of the visual range.

But why does it stop at UV?

One thing I always wondered was whether materials reflect and absorb different wavelengths as dramatically as they do visible light. If I had a light source of UVB in as narrow a range as visible light, could I paint a picture?

As pointed out, the sun puts out a lot of visible light, and air, water, and many other molecules (e.g. most organic compounds) are transparent to it.

There is a reason for that, which is that the visible spectrum lies in a bit of a 'gap' energy-wise. The infrared region affects molecular rotation and vibration, while the lowest-energy excitations of electrons often happen in the UV range (for your average organic molecule, maybe 200 nm or so). Thermal energy at the temperatures we deal with is all a bit down in the IR (things have to get pretty hot to start glowing, after all), so we don't have a problem with thermal noise. It's mainly transition-metal complexes and certain large organic molecules that have low-energy enough electronic transitions to absorb visible light.

So besides plenty of visible light being around, it's high enough in energy that you don't have thermal noise, but low enough in energy that it's not being absorbed by any and every molecule. So we could evolve chromophore molecules that absorb in specific ranges of the visible spectrum. These work by being very long chains of alternating single-double bonds (conjugated chains). The longer the chain, the lower the energy required to excite a valence electron in the chain. But there's a limit for that somewhere in the red. I.e. we can't gain IR sight just by making longer chromophores.

So when it comes to pythons and other animals that 'see' IR, it's a different thing entirely. Again, since IR corresponds to molecular motion energy-wise, IR and heat aren't really distinguishable. IR 'vision' works through using heat receptor enzymes more like those we have in our skin than to the photoreceptor enzymes in our eyes.

At the opposite end, you can't make the chromophore too short either, because once you get well into the UV the absorbance will overlap with all the other organic molecules in your tissue.

This is neat, but it'd all be moot if we didn't have visible light around. So it's a combination of factors. But basically our color vision has been taken about as far as it can go in terms of how it's evolved. To go far into the IR or UV you have to use very different detection mechanisms, and you run into challenges you don't have with visible light. Evolution could perhaps find an answer (and has, for e.g. snakes) but not 'just as easily' for sure.

Would it be possible to construct some sort of radio wave "camera"? Such that if you had two of these, you could point them at things like radio towers and "see" radio waves in "3d" (maybe with VR goggles)?

Does this suggest that aliens are more likely to have eyes like ours that see visible light?

Biological photo receptors are made of organic molecules.

If the electromagnetic waves are too energetic (UV and shorter wavelengths), the organic molecules can be destroyed.

If the electromagnetic waves are not energetic enough (IR and longer wavelengths) not much happens.

Visible light has enough energy to cause an organic molecule to isomerize.

This may seem like a chicken-and-egg thing, but the chemistry of organic molecules does put some limits on what types of radiation can be detected by organisms with the type of biology we are familiar with.

That's interesting. A common adaptation to cold environments is having biochemical bonds be weaker so that they behave more or less the same at lower temperatures. Do you think if we evolved on a significantly colder world we'd be able to see in IR? (Assuming we could evolve on a colder world.)

No organism can see x-rays or radiowaves. One reason offered is that organisms use EM radiation to move electrons around and shift energy levels, either for senses or for growth. Radio is too low energy. X-rays could do it but there aren't enough of them. So nature fucks around with the energy in the middle.

Yes, it's a sweet spot in relation to the energy levels of bonds in organic molecules.

As others have pointed out there is a lot of light in those wavelengths from our Sun and the atmosphere is transparent at those wavelengths. However, these are mostly just prerequisites, they don't completely explain why the optical spectrum is as favored as it is for seeing.

In order for a wavelength to be useful, it helps to be able to obtain imagery with decent resolution without exorbitant biological "cost". You also want the maximum amount of information, so different biological molecules should have as much contrast and as easily distinguishable spectra as possible. And you want eyes to be easy to make, so at the chosen wavelength it should be fairly easy to make photosensitive molecules that can generate signals used by neurons.

In the infrared range, the light energy corresponds to vibrational and rotational states in organic molecules. This provides a ton of information, but the spectral information for molecules tends to be in clusters of sharp peaks. This complicates the ability to use IR light biologically, because it would take a tremendous biological investment to tease out spectral information to the degree necessary to gain a lot of data from IR spectra. Worse, room-temperature thermal glow is in the infrared range, leaving a very narrow band that can be used for imaging without extraordinary cost. On top of that, it's difficult to couple infrared sensing to biomolecules; some animals have the sense, but it's a very specialized thing.

UV light often has enough energy to break atomic bonds so it can be challenging to build photoreceptors into the UV spectrum.

In the visible light spectrum everything is different. Many molecules exhibit very simple and broad spectra in the visible and near-UV range. This means there can often be high contrasts between different materials in that wavelength window (e.g. rock versus skin versus bark versus leaves, etc.) Additionally, the light is just below the ionization energy of many electrons in bonds or around atoms, so it's not that hard to make molecules that give up electrons when hit by visible light (this is how chlorophyll works, after all), which can be used to send signals. These characteristics also make color vision possible and worthwhile, because just tweaking a photoreceptor molecule a little bit will lead to it having a different spectral response, and because of the spectral characteristics of organic molecules in the visible spectrum that means there will often be important and high contrast information available in the difference in response between those two photoreceptor molecules.

In short, the visible spectrum is a sweet spot where eyes are easier to build (good, reliable photoreceptor molecules, small size) and also effective (access to light that can travel through the atmosphere that is plentiful and encodes high contrast data of the stuff in the immediate environment). Going too far into the IR or UV spectrum makes eyes harder to build and less useful.


There are several nice things about light compared to radar. Maybe the most important for this question is that you can have a diffraction limited beam. This is because the wavelength of light is much smaller than that of radio waves.

The formula for $\theta$, the half angle at which the beam expands, is $$\theta = \frac{\lambda}{\pi w}$$

where $w$ is what is called the beam waist of the beam, and is often the aperture size of the laser, although one can also put the beam waist somewhere in front of the laser depending on the lens design.

This means that the laser beam is much, much more directional than radar. Far from the laser, you can figure out the radius of the spot by taking the angle in radians and multiplying it by the distance. So for the big distances of deep space, that probably determines how many photons hit your target per laser pulse.
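A rough numeric sketch of that spot-size estimate, assuming the Gaussian-beam divergence half-angle θ = λ/(πw) (the wavelength, waist, and lunar distance below are illustrative values, not from the answer):

```python
import math

def divergence_half_angle(wavelength_m, waist_m):
    """Gaussian-beam far-field half-angle: theta = lambda / (pi * w)."""
    return wavelength_m / (math.pi * waist_m)

def spot_radius(wavelength_m, waist_m, distance_m):
    """Far from the laser, spot radius ~ theta * distance."""
    return divergence_half_angle(wavelength_m, waist_m) * distance_m

# 1064 nm laser with a 10 cm beam waist, aimed at the Moon (~3.84e8 m away):
print(spot_radius(1064e-9, 0.10, 3.84e8))  # ~1300 m spot radius
```

Even a diffraction-limited beam spreads to kilometer scale at lunar distance, which is why photon budgets dominate deep-space lidar design.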

Another nice thing about light is that you can have detectors that detect single photons. The amount of energy in a visible photon is a couple of electron volts, whereas a radio-wavelength photon carries a few milli-electron-volts.
So you don't absolutely have to have a lot of light return, especially if you have some other information to tell that these are your photons and not photons from some other source. There are a bunch of games that can be played to discriminate your photons from others, since you know the wavelength, polarization, timing, and other properties of your photons.

An emerging technology is the use of quantum-mechanically entangled photons. When photons are entangled, something that happens to one photon can be observed by measuring the other. This is hard to explain in a couple of sentences, but it can be used for communications and quantum cryptography, and with some creative license it can be applied in a hard-science way to your application of sensing other spacecraft.

As an aside, many of the LEO satellites (and probably others) that are being networked together will likely communicate via light as an alternative to radio waves. This is partly because of spectrum allocation, but light can also offer higher bandwidth and smaller packages instead of large antennas.

In general, for both optical and radio radars and communication systems, it is the size of the transmitter/receiver telescopes or antennas that ultimately determines how much energy hits the target and how much gets received. With radio waves, since the wavelength is longer and the signal typically has more coherence, it is easier to make arrays of antenna dishes and end up with an effective larger aperture. You can do the same thing with light; it is just harder engineering. For radar you can do beam steering by controlling the relative phase between antenna elements; the more elements you have, the more directional you can typically make the beam. You can also do this with light, but it is harder: coherently steering the beam from coupled laser emitters is an active area of research.

Optical systems are typically more expensive than RF systems, but there are many applications in space, including ranging to satellites to measure their position and velocity very accurately, which is being done right now. So it is probably reasonable to extrapolate to deep-space applications.

Three extremely luminous gamma-ray sources discovered in Milky Way's satellite galaxy

Optical image of the Milky Way and a multi-wavelength (optical, Hα) zoom into the Large Magellanic Cloud with superimposed H.E.S.S. sky maps. Credit: Milky Way image: © H.E.S.S. Collaboration, optical: SkyView, A. Mellinger LMC image: © H.E.S.S. Collaboration, Hα: R. Kennicutt, J.E. Gaustad et al. (2001), optical (B-band): G. Bothun

Once again, the High Energy Stereoscopic System, H.E.S.S., has demonstrated its excellent capabilities. In the Large Magellanic Cloud, it discovered the most luminous very high-energy gamma-ray sources known: three objects of different types, namely the most powerful pulsar wind nebula, the most powerful supernova remnant, and a shell 270 light years in diameter blown by multiple stars and supernovae – a so-called superbubble. This is the first time that stellar-type gamma-ray sources have been detected in an external galaxy at these gamma-ray energies. The superbubble represents a new source class in very high-energy gamma rays.

Very high-energy gamma rays are the best tracers of cosmic accelerators such as supernova remnants and pulsar wind nebulae – end-products of massive stars. There, charged particles are accelerated to extreme velocities. When these particles encounter light or gas in and around the cosmic accelerators, they emit gamma rays. Very high-energy gamma rays can be measured on Earth by observing the Cherenkov light emitted from the particle showers produced by incident gamma rays high up in the atmosphere using large telescopes with fast cameras.

The Large Magellanic Cloud (LMC) is a dwarf satellite galaxy of our Milky Way, located about 170,000 light years away and seen nearly face-on. New, massive stars are formed at a high rate in the LMC, and it harbors numerous massive stellar clusters. The LMC's supernova rate relative to its stellar mass is five times that of our Galaxy. The youngest supernova remnant in the local group of galaxies, SN 1987A, is also a member of the LMC. Therefore, the H.E.S.S. scientists dedicated significant observation time to searching for very high-energy gamma rays from this galaxy.

For a total of 210 hours, the High Energy Stereoscopic System (H.E.S.S.) has observed the largest star-forming region within the LMC, the Tarantula Nebula. For the first time in a galaxy outside the Milky Way, individual sources of very high-energy gamma rays could be resolved: three extremely energetic objects of different types.

The so-called superbubble 30 Dor C is the largest known X-ray-emitting shell and appears to have been created by several supernovae and strong stellar winds. Superbubbles are broadly discussed as (complementary or alternative to individual supernova remnants) factories where the galactic cosmic rays are produced. The H.E.S.S. results demonstrate that the bubble is a source of, and filled by, highly energetic particles. The superbubble represents a new class of sources in the very high-energy regime.

Pulsars are highly magnetized, fast rotating neutron stars that emit a wind of ultra-relativistic particles forming a nebula. The most famous one is the Crab Nebula, one of the brightest sources in the high-energy gamma-ray sky. The pulsar PSR J0537−6910 driving the wind nebula N 157B, discovered by the H.E.S.S. telescopes in the LMC, is in many respects a twin of the very powerful Crab pulsar in our own Galaxy. However, its pulsar wind nebula N 157B outshines the Crab Nebula by an order of magnitude in very high-energy gamma rays. The reasons are the lower magnetic field in N 157B and the intense starlight from neighboring star-forming regions, both of which promote the generation of high-energy gamma rays.

The supernova remnant N 132D, known as a bright object in the radio and infrared bands, appears to be one of the oldest – and strongest – supernova remnants still glowing in very high-energy gamma rays. Between 2500 and 6000 years old – an age at which models predict that the supernova explosion front has slowed down and ought no longer to be efficiently accelerating particles – it still outshines the strongest supernova remnants in our Galaxy. The observations confirm suspicions raised by other H.E.S.S. observations that supernova remnants can be much more luminous than previously thought.

Observed at the limits of detectability, and partially overlapping with each other, these new sources challenged the H.E.S.S. scientists. The discoveries were only possible due to the development of advanced methods of interpreting the Cherenkov images captured by the telescopes, improving in particular the precision with which gamma-ray directions can be determined.

"Both the pulsar wind nebula and the supernova remnant, detected in the Large Magellanic Cloud by H.E.S.S., are more energetic than their most powerful relatives in the Milky Way. Obviously, the high star formation rate of the LMC causes it to breed very extreme objects", summarizes Chia Chun Lu, a student who analyzed the LMC data as her thesis project. "Surprisingly, however, the young supernova remnant SN 1987A did not show up, in contrast to theoretical predictions. But we'll continue the search for it," adds her advisor Werner Hofmann, director at the MPI for Nuclear Physics in Heidelberg and for many years H.E.S.S. spokesperson.

Indeed, the new 28 m H.E.S.S. II telescope will boost the performance of the H.E.S.S. telescope system, and in the more distant future the planned Cherenkov Telescope Array (CTA) will provide even deeper and higher-resolution gamma-ray images of the LMC – in the plans for science with CTA, the satellite galaxy is already identified as a "Key Science Project" deserving special attention.

The collaboration: The High Energy Stereoscopic System (H.E.S.S.) team consists of scientists from Germany, France, the United Kingdom, Namibia, South Africa, Ireland, Armenia, Poland, Australia, Austria, the Netherlands and Sweden, supported by their respective funding agencies and institutions.

The instrument: The results were obtained using the High Energy Stereoscopic System (H.E.S.S.) telescopes in Namibia, in South-West Africa. This system of four 13 m diameter telescopes – recently complemented with the huge 28 m H.E.S.S. II telescope – is one of the most sensitive detectors of very high-energy gamma rays. These gamma rays are absorbed in the atmosphere, where each creates a short-lived shower of particles. The H.E.S.S. telescopes detect the faint, short flashes of bluish light that these particles emit (named Cherenkov light, lasting a few billionths of a second), collecting the light with big mirrors that reflect it onto extremely sensitive cameras. Each image gives the position on the sky of a single gamma-ray photon, and the amount of light collected gives the energy of the initial gamma ray. Building up the images photon by photon allows H.E.S.S. to create maps of astronomical objects as they appear in gamma rays.

The H.E.S.S. telescopes have been operating since late 2002; in September 2012 H.E.S.S. celebrated its first decade of operation, by which time the telescopes had recorded 9415 hours of observations and detected 6361 million air-shower events. H.E.S.S. has discovered the majority of the about 150 known cosmic objects emitting very high-energy gamma rays. In 2006 the H.E.S.S. team was awarded the Descartes Prize of the European Commission, and in 2010 the Rossi Prize of the American Astronomical Society. A study performed in 2009 listed H.E.S.S. among the top 10 observatories worldwide.

More information: "The exceptionally powerful TeV γ-ray emitters in the Large Magellanic Cloud", Science, 23 January 2015, Vol. 347, No. 6220, pp. 406–412, DOI: 10.1126/science.1261313

What is the relationship between frequency and distance?

Please can anybody explain how the frequency of a communication signal affects its propagation range?

I thought it was the other way around. Low-frequency signals have a larger wavelength than high-frequency waves, and they travel farther for the same power.

In the case of visible light, waves of shorter wavelength/higher frequency (blue light) travel shorter distances, and those of larger wavelength/lower frequency (red light) travel longer distances.

Isn't that the case with communication signals too? After all, those are all electromagnetic waves.

E = hf
This means energy is directly proportional to frequency.
Low-frequency signals have low photon energy, so (by this reasoning) less distance is traveled, and thus high frequencies can travel a long distance.

h = Planck's constant
f = frequency of the signal

So the higher the frequency, the longer the distance over which you can transfer.

In any case, this is not a good way to look at communication; it depends on what form of data you are trying to transfer.
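For concreteness, the E = hf relation above can be evaluated numerically (a minimal sketch; the `photon_energy` helper and the sample frequencies are illustrative, not from the thread):

```python
# Photon energy E = h * f (Planck's constant h in joule-seconds).
PLANCK_H = 6.626e-34  # J*s

def photon_energy(freq_hz: float) -> float:
    """Return the energy of a single photon at freq_hz, in joules."""
    return PLANCK_H * freq_hz

e_wifi = photon_energy(2.4e9)   # a 2.4 GHz photon: ~1.6e-24 J
e_red = photon_energy(4.3e14)   # a red-light photon: ~2.8e-19 J
```

Note that per-photon energy by itself says little about range; whether a link closes is decided by received power versus the noise floor.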

Thanks all, but is there a mathematical equation that can approximate the maximum distance a signal will travel in free space before it becomes too weak to be received by any antenna, given a specific frequency (say 2.4 GHz)?

I use free space bearing in mind that so many factors affect signal quality in our atmosphere.

Can you share the equation or the name of the equation you found?

The distance a communications signal travels is based on the total link loss available in the system. If the TX power is +40 dBm and the RX sensitivity is -100 dBm, then the total link loss available is 140 dB. In a free-space environment, lower frequencies travel further than higher frequencies (FSPL = K + 20 log d + 20 log f, where K is a constant that depends on the units chosen – 32.44 for d in km and f in MHz – d is distance, and f is frequency).
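That link-budget reasoning can be turned into a rough range estimate with the free-space path-loss formula (a sketch; the constant 32.44 applies when d is in km and f in MHz, and real links also need antenna gains and fade margins):

```python
import math

def fspl_db(d_km: float, f_mhz: float) -> float:
    """Free-space path loss in dB for distance d_km and frequency f_mhz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

def max_range_km(link_budget_db: float, f_mhz: float) -> float:
    """Invert fspl_db: the largest distance whose loss fits the budget."""
    return 10 ** ((link_budget_db - 32.44 - 20 * math.log10(f_mhz)) / 20)

# +40 dBm TX and -100 dBm RX sensitivity give a 140 dB budget; at 2.4 GHz
# this corresponds to roughly 100 km of ideal free-space range.
d = max_range_km(140.0, 2400.0)
```

In practice atmospheric absorption, multipath, and antenna patterns shrink this figure considerably; free space is the best case.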

VHF at 120 MHz is great for long distance, such as ground-to-air comms at an airport, and can reach planes miles away. UHF at 450 MHz is still good for distance but not as far, and it penetrates buildings better than VHF because its wavelength is smaller relative to the size of construction materials.

E = hf
This means energy is directly proportional to frequency.
Low-frequency signals have low photon energy, so (by this reasoning) less distance is traveled, and thus high frequencies can travel a long distance.

If we apply the same power to a high-frequency and a low-frequency wave, the number of photons is inversely proportional to the frequency. Thus the greater number of photons increases the travel range in air at the lower frequency.

But there is a curve: really low frequencies go short distances, middle frequencies get a bonus from ground-wave and atmospheric propagation, and really high frequencies can go out to space.

When you get into really high frequencies, propagation becomes more line-of-sight.

First of all, we need to understand the relation between wavelength and frequency: fλ = c, where f is the frequency, c is the speed of light, and λ is the wavelength; f and λ are inversely related. The EM spectrum runs, in order of decreasing frequency, from gamma rays through X-rays, ultraviolet, visible light, and infrared to microwaves and radio waves. Consider the visible band, by which we all see: red light is visible over the longest distance because its wavelength is the longest of all the visible colours and it is scattered least by the atmosphere. The famous example is the traffic light: red was chosen for the stop signal so that it would be seen from the longest distance, giving people time to stop their vehicles. Keeping this in mind, the wavelength is longest for red and shortest for violet in the visible band, in the order red, orange, yellow, green, blue, violet (leaving out the intermediate colours).

By this pattern, wavelength decreases to the right of the visible band and increases to the left. X-rays have good penetration ability because their frequency is high and their wavelength is short; gamma rays are the most penetrating of all. For example, X-rays penetrate tissue to image bones. At the other extreme, the microwaves and radio waves used in radar penetrate clouds because their wavelengths are far larger than the droplets, so they are barely scattered. So for propagation range in the atmosphere, the waves with the longest wavelength and lowest frequency generally travel the furthest, because they are scattered and absorbed the least.
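The fλ = c relation used above can be checked numerically (a small sketch; the helper name and the sample frequencies are illustrative):

```python
SPEED_OF_LIGHT = 3.0e8  # m/s (approximate)

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in metres from frequency, via f * wavelength = c."""
    return SPEED_OF_LIGHT / freq_hz

wl_wifi = wavelength_m(2.4e9)     # ~0.125 m (2.4 GHz Wi-Fi)
wl_red = wavelength_m(4.3e14)     # ~700 nm, red light
wl_violet = wavelength_m(7.5e14)  # ~400 nm, violet light
```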


Gamma-ray spectroscopy using scintillation detectors is one of the most important measurement methods in nuclear science and its applications. It covers, for example, basic nuclear physics research, environmental studies, nuclear medicine, and, recently, border-monitoring equipment. Scintillation detectors consist of a photodetector and a dense scintillation crystal that absorbs gamma quanta and, as a result, emits light. The number of photons detected or “seen” by a photodetector is the main parameter used to define its performance. These photons “carry information” about the detected gamma radiation: the higher their number, the more detailed the information about the radiation, and the better the accuracy of the measurements [1].

The detection of weak light signals, such as scintillation light, can be achieved by means of several types of photodetectors, but photomultiplier tubes (PMTs) are the most commonly used and have been for over 70 years [1]. The photodetector is also one of the key components defining the final performance of a scintillation detector. The main advantages of photomultipliers are good quantum efficiency (up to 40%), high gain (10³–10⁸), low dark current (dark noise), the ability to detect single photons, and the availability of a wide range of sizes (from several millimeters to tens of centimeters in diameter). However, PMTs also have some drawbacks; for example, they are very sensitive to magnetic fields, require high supply voltages (often above 1000 V), are susceptible to damage from excessive mechanical shock or vibration, and are expensive, because the complicated mechanical structure inside the vacuum container is mostly made manually.

The second most common type of photodetector is the silicon photodiode; these include the PIN photodiode (PD), the avalanche photodiode (APD), and the newest, the Geiger-mode avalanche photodiode (also known as a silicon photomultiplier, SiPM). The main advantage of this type of photodetector is its insensitivity to magnetic fields.

The most important differences between the various photodetectors concern two parameters: the minimum detectable signal and the internal gain (see Table 1).

The most important parameters that characterize all photodetectors used in gamma-ray spectroscopy are as follows: quantum efficiency (or photon detection efficiency), dark noise, linearity (dynamic range), excess noise factor, after-pulses, cross-talk (if it exists), and gain.

The silicon photomultiplier is a relatively new photodetector that combines the benefits of both APDs and PMTs, such as high internal gain (∼10⁵ to 10⁷), insensitivity to magnetic fields, and single-photon detection capability [2], [3]. SiPMs also possess other advantages, such as being able to operate at low bias voltages below 100 V (most of the newest SiPMs operate below 40 V) and having no burn-in phenomenon due to input light saturation. Of course, SiPMs also have some drawbacks; for example, the sensitivity of gain to temperature change (typically from about 2% to 7% per degree centigrade), the limited linearity and dynamic range of detected signals, their small size and therefore limited sensitive surface area, high capacitance, and the presence of effects such as cross-talk and after-pulses.

SiPMs are manufactured by several companies: Hamamatsu Photonics (Japan), SensL (Ireland), Zecotek Photonics (Singapore), FBK (Fondazione Bruno Kessler, Trento, Italy), ST-Microelectronics (Catania, Italy), Amplification Technologies (New Jersey, USA), and Ketek (Munich, Germany), among others. “SiPM” is the name most often used, but in the literature this type of photodetector is also referred to as a multi-pixel photon counter (MPPC), micro-pixel avalanche photodiode (MAPD), multi-pixel Geiger-mode avalanche photodiode (GM-APD or G-APD), solid-state photomultiplier (SSPM), single-photon avalanche diode (SPAD) array, or pixelated Geiger-mode avalanche photon detector (PPD), among others [4].

A SiPM consists of many small avalanche photodiode cells (APD cells) fabricated on a common Si substrate. Currently, commercially available SiPMs can be a single device with a total active area ranging from 0.18 × 0.18 mm² up to 6 × 6 mm², depending on the manufacturer’s technology. Detectors with a larger active area are based on SiPM arrays, which are composed of these single elements arranged in various formats. Available single SiPM devices consist of APD cells at densities ranging from ∼100 to 15,000 cells per mm², each with the same very small area, which varies from 7.5 × 7.5 μm² up to a maximum of 100 × 100 μm².

Each APD cell operates in Geiger mode, which means that the cell is reverse-biased above the electrical breakdown voltage (V_bd). In these conditions the electric field within the depletion region of the APD cell is high enough for free carriers, that is, the electrons and holes (which are the result of light absorption), to produce additional carriers by impact ionization, thus resulting in a self-sustaining avalanche [5]. This Geiger discharge stops due to a drop in voltage below the breakdown value, either passively, using an external resistor connected in series with the diode (R_q, the quenching resistor, typically ranging from about 100 kΩ to several MΩ; see Fig. 1), or actively, using special quenching electronics [6]. After a certain effective recovery time, which typically lasts several tens of nanoseconds, the voltage is restored to the operating value and the cell is ready to detect the next photon [7].

The independently operating APD cells are connected in parallel to the same readout line. Therefore, the combined output signal corresponds to the sum of all APD cells that have fired. The number of these activated cells is the measure of the light flux [8], [9], see Fig. 1.
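Because each cell fires at most once per recovery time, the summed output saturates at high light flux. A common saturation model is N_fired = N_cells · (1 − exp(−PDE · N_photons / N_cells)); the sketch below uses illustrative cell counts and photon detection efficiency (PDE), not values from this paper:

```python
import math

def fired_cells(n_photons: float, n_cells: int, pde: float) -> float:
    """Expected number of fired APD cells when n_photons arrive within
    one recovery time, using the standard saturation model."""
    return n_cells * (1.0 - math.exp(-pde * n_photons / n_cells))

# Illustrative 3600-cell SiPM with 30% photon detection efficiency:
low = fired_cells(100, 3600, 0.30)     # ~30 fired cells: nearly linear
high = fired_cells(20000, 3600, 0.30)  # ~2900, well below the linear 6000
```

For slow scintillators, cells can recover and fire again during the pulse, so real responses deviate from this simple single-shot model.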

In any single cell the avalanche can be triggered not only by photo-generated carriers, but also by carriers that are thermally generated (dark noise), or emitted as a result of phenomena such as after-pulses or cross-talk (discussed in Sections 2.4 Optical cross-talk, 2.5 After-pulses, respectively).

SiPMs have been widely tested in high energy physics [10], [11], [12], neutrino physics [13], and also in commercial applications such as nuclear medicine [14], [15], [16], [17]. The first commercial PET/MR scanner based on SiPMs was introduced by GE Healthcare on August 4, 2014: the first integrated, simultaneous, TOF-capable, whole-body SIGNA PET/MR scanner [17].

At the beginning of the SiPM’s development, the small total active area of the individual detectors (ranging from 1 × 1 mm² to 2 × 2 mm²) limited their potential for use in gamma spectroscopy with scintillators. Currently, the larger total active area of single detectors (3 × 3 mm² and 6 × 6 mm²) and their matrices (ranging from 6 × 6 mm² to 51 × 51 mm² or more) allows for the efficient use of silicon photomultipliers in scintillation detectors for gamma spectroscopy [18].

In the first part of the paper, the different characteristics of SiPMs that affect their use in gamma spectroscopy with scintillators are reviewed. This covers the gain of SiPMs, the photon detection efficiency (PDE), the after-pulses and cross-talk that are responsible for the excess noise factor, dark noise, and the linearity of response. In this respect, the influence of the SiPM’s effective dead time on the linearity of the response is discussed. In the second part, the optimization of SiPM operation to obtain the best energy resolution and linearity of response is presented. Finally, a number of test reports covering combinations of SiPM arrays of different sizes coupled to different scintillators used in gamma spectroscopy are reviewed.

Besides the characterization of the different SiPMs used in scintillation detection and spectroscopy, a discussion is carried out concerning their present and future applications in fields such as instrumentation for homeland security, environmental studies, plasma physics, and others.