Astronomy

Why are there no stars visible in cislunar space?




It's very puzzling that the Moon landing footage has no stars in the background, and the ISS clips have no stars in the background either. I've listened to multiple astronaut interviews about what it looks like up in space, and about half of them speak of the “darkest black space”. I'm sure there is a very good explanation for this.

Is starlight only visible through the medium of Earth's atmosphere, so that once you're in the vacuum of space, where there is no medium, the stars disappear? What's the explanation?

Minute 47-49 stars, press conference all three Apollo 11 astronauts

BBC interview with Neil Armstrong only


Anders's answer is entirely fine, but I'd like to add some extra information. As evidenced by the transcripts, reflected Earth light is quite strong even at this distance:

The earthshine coming through the window is so bright you can read a book by it.

That is, even with the lights turned off, it would probably be tricky to see the stars unless you turned in a way that didn't allow the earthshine through the windows.

However, as the capsule comes into the shadow cast by the Moon (a pure accident - they didn't plan for the approach to go this way), the transcript reads:

Houston, it's been a real change for us. Now we are able to see stars again and recognize constellations for the first time on the trip. It's - the sky is full of stars. Just like the nightside of Earth. But all the way here, we have only been able to see stars occasionally and perhaps through the monocular, but not recognize any star patterns.

So for a few minutes, they did see "the sky full of stars". Other than that, as the same transcript notes, they saw stars only occasionally, and then only singular, bright stars (perhaps also when looking in a way that minimized the brightness from the Earth and Sun), without being able to recognize any star patterns.

The core of Anders's answer is still true, though. Exposure is the main problem here: both cameras and human eyes have a limited dynamic range, and even the brightest stars are far too dim in comparison with the Sun, the Earth (at a distance comparable to the Moon's distance from Earth) and the lunar surface (if you're in sunshine, as the crew was for most of the mission). A modern camera might be able to take an HDR picture that would allow the stars to be visible at the same time as the Earth or the Sun, and it would be quite easy to do if you could occlude the main light sources (the same way we do it when photographing the Sun's corona, etc.). But technically, that would be a "doctored" image - taken at two different exposures and combined in a way that uses different exposures for different parts of the image.


It is a matter of exposure and dynamic range. A sensor, like the one in a camera, can only handle inputs within a certain range of intensities, and much of photographic skill (or smart presets) is about mapping the outside light onto this range so that the details you care about show up rather than being crushed to white or black.

If you take a picture of a brightly lit scene, then in order to make out the details of the bright parts (such as a lunar landscape, the Earth, the ISS, etc.) you have to turn the exposure down, which makes faint objects like the stars too dim to see against the dark sky background. You could try to set the exposure to show the stars instead, but then the landscape and the Earth would be far too bright (and would likely also ruin the picture by causing flaring).
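To make this concrete, here is a minimal numerical sketch of a linear sensor with a fixed full-well capacity; the billion-to-one brightness ratio and the 4096-count sensor range are illustrative assumptions, not measured values.

```python
import numpy as np

# Toy scene: the sunlit lunar surface (or Earth) vs. two faint stars.
# The ~billion-to-one brightness ratio and the 4096-count range are
# illustrative assumptions, not measured values.
scene = np.array([1e9, 1e9, 1.0, 5.0])  # [surface, surface, faint star, brighter star]

def expose(scene, exposure, full_well=4096):
    """Map scene brightness through a linear sensor with limited range.

    `exposure` scales the light reaching the sensor; values above
    `full_well` clip to white, values below ~1 count are lost in black.
    """
    return np.clip(scene * exposure, 0, full_well).round()

# Exposure chosen so the sunlit surface sits safely below clipping:
print(expose(scene, exposure=3e-6))   # [3000. 3000.    0.    0.]  -> the stars vanish
# Exposure long enough to register the stars:
print(expose(scene, exposure=200.0))  # [4096. 4096.  200. 1000.]  -> the surface blows out
```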

One can try to work around it by taking several pictures at different exposure levels and later digitally compositing them together. But this requires a lot of extra work.
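A toy version of that compositing step is sketched below, reusing the two simulated exposures from the previous snippet; a real HDR pipeline would also have to handle alignment, noise and tone-mapping for display, which is where the extra work comes in.

```python
import numpy as np

def composite(short_exp, long_exp, gain, full_well=4096):
    """Naive exposure fusion: keep the short exposure where it recorded
    something, and fill the pixels it lost to black from the long exposure,
    rescaled by the known exposure ratio (`gain`).

    This is exactly the "doctored" image discussed above: no single
    exposure contains all of this information at once.
    """
    fused = short_exp.astype(float)
    dark = short_exp < 1            # lost to black in the short shot
    usable = long_exp < full_well   # and not clipped in the long shot
    fused[dark & usable] = long_exp[dark & usable] / gain
    return fused

short = np.array([3000., 3000., 0., 0.])       # daylight-style exposure
long_ = np.array([4096., 4096., 200., 1000.])  # star-friendly exposure
# gain = ratio of the two exposure settings used above
print(composite(short, long_, gain=200.0 / 3e-6))
# Surface values come from the short frame; the faint stars are recovered
# from the long frame (tiny in linear units - a real HDR pipeline would
# tone-map the result for display).
```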


The reason is that:

  1. To take a photograph under different lighting conditions, you need to use different camera settings to get a useful image.
  2. Cameras (and the human eye) do not have unlimited range in any given set of conditions; that is, they cannot represent objects of every brightness satisfactorily within one single image.

In particular, if one is photographing a subject that is brightly illuminated, one has to use camera settings that greatly limit the amount of light reaching the camera's sensor; otherwise it will be overwhelmed and fail to show useful detail. When photographing a subject that is only dimly illuminated (or, in this case, one emitting what amounts to only a dim light), such as the stars, one needs settings that maximize the amount of light the sensor absorbs, or one will record nothing. These two types of settings are mutually incompatible, and thus it is impossible (with existing camera technology) to capture a very dim and a very bright subject simultaneously in a single (i.e. not composite) photograph and have both of them look sensible.

And the Moon and stars are just such an incompatible pair. The Moon's surface is lit up effectively as brightly as the landscape of Earth in broad daylight. The stars are so dim they can only be seen at night.

In fact, you can demonstrate this right from Earth itself. Here are two photographs I took with my own camera about ten megaseconds or so ago, as of this posting. Both were shot at night, on the same night.

The left-hand photograph was shot with the camera set to daylight settings. Yes, these are the same settings you'd use to shoot a photograph in actual daylight, only being used at night, and the Moon registers loud and clear. That is how bright it is. Since surface brightness is not affected by distance, the Moon effectively amounts to a little piece of sunlit landscape in the sky, from our point of view, just like on a bright, sunny day on Earth. As you can see, the Moon's surface features are cleanly visible and, moreover, it is similar in coloration to your last photograph - as it should be, because that is its actual color. Note the complete absence of stars, exactly as in the NASA images.

In the second photograph, on the right, the camera was set to "bulb" mode to expose the sensor for a long time, and its sensitivity was greatly increased. You can now see the stars, but the Moon looks almost like a second Sun - its surface features completely obliterated, as the sensor has been saturated with photons like a sponge that has soaked up too much water, while bloom contaminates the rest of the image.

The reason you "expect" to see stars is likely that you have watched too many sci-fi movies. Movies depict stars for artistic effect. In reality, such images, taken in a single exposure, are not possible with today's technology, because the brightness factor between the two subjects is on the order of a billion (90 dB). (You could composite the above two images suitably to fake it, but it would be just that - a fake.)
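As a quick sanity check on that figure, here is the quoted ratio converted into decibels and photographic stops; the 10-14 stop single-frame range mentioned in the comment is a ballpark assumption, not a spec for any particular camera.

```python
import math

# Sanity-checking the quoted figure: a brightness factor of about a billion.
ratio = 1e9
print(10 * math.log10(ratio))  # 90.0 dB (power decibels)
print(math.log2(ratio))        # ~29.9 photographic stops
# A single frame from a consumer camera spans very roughly 10-14 stops
# (a ballpark assumption), so both ends of that range cannot fit in one exposure.
```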


Why can't we see stars in the pictures of spacewalking or moonwalking astronauts?

The stars aren't visible because they are too faint. The astronauts in their white spacesuits appear quite bright, so the cameras must use short shutter speeds and high f-numbers (small apertures) to avoid overexposing the pictures. With those camera settings, though, the stars don't show up.

The same thing happens if you try to take a picture of someone under a dark, starry sky. To get the person properly exposed, you have to use a flash or some other light source and set your camera accordingly. When you do that, there is no way to see the stars in the background. To see the stars, you need long exposures and wide-open apertures. But with those settings, the subject of the picture would appear dark and blurry.

Astronauts have taken many photographs of the stars from orbit (and many of them are available on NASA's web site), but, unfortunately, not with spacewalking astronauts in the foreground.


What if There Were No Stars?

Immediate follow-up question: What if there were no life in the universe? The sun, after all, is a star. No stars, no sun, no life. Well, if you stumbled across this starless, lifeless universe, you'd find yourself floating through a frigid expanse of nothingness wishing that you had brought a warmer coat. Decent burritos would be harder to find. Every once in a while a neutrino would blip into or out of existence.

So let's revise the question: What if there were no visible stars? We'll say the sun and planets still exist, but for some reason no extrasolar stars can be seen from Earth. Let's say this is because our solar system is surrounded by a dark nebula. Nebulae are large clouds of dust and hot gas, and usually they're in the process of coalescing to form stars. As such they're very bright, but occasionally a cloud of interstellar dust will be thick and cold enough to block visible light without giving off much light itself.

We'll operate under the assumption that our sun developed normally but that we drew an unlucky galactic poker hand, and our solar system is positioned inside a dark nebula. Just as life was developing on Nebula Earth, the solar system began drifting into a dust cloud, and the stars started to dim. As the dust became thicker over the next few million years, the night sky grew darker and darker until, on the night that the first brave little lungfish wiggled onto land, the sky was almost completely black. Only a few red smudges from the last, brightest stars to shine through the nebula remained in the night sky. By the time humans bothered to look up, all they saw was a moon and the planets in the darkness.

Humans on Nebula Earth are at a technological disadvantage. Throughout our history we've been using the stars for setting up calendars, navigating, knowing when to plant crops and developing science, especially physics. The ability to predict the motion of the stars was a big source of authority for priests in ancient Egypt. Without a divine mandate, priests on Nebula Earth have a harder time persuading anyone to help build the pyramids.

But it would be difficult to predict the broad effects of so many technological limitations. So let's focus on a single aspect: celestial navigation.

Early European sailors on Nebula Earth can cruise around the Mediterranean Sea if they keep the coast in sight. It's fairly easy to tell which direction you're headed with a sundial and a compass, but at night it's nearly impossible to determine your position without the stars for reference. Out of sight of land, sailing gets more dangerous — a single storm scrambles any sense of your position. Travel over the open sea is next to impossible, as any sea voyage that takes more than a day has a margin of error that grows every day as the bearing becomes more and more inaccurate.

With no advanced seafarers, all significant human migrations on Nebula Earth occur over land. Australia, the Americas and Greenland, which were settled by land migrations when sea levels were lower, are inhabited but remain isolated well past the time that they were colonized by Europeans during our history. Other islands that were settled using celestial navigation, such as New Zealand, Iceland and Hawaii, are empty of humans. While they're technically reachable by ship, sailors who stumbled on one of these islands would never be able to find their way back — if they ever made it home.

Without sea migration, Nebula Earth's political landscape is dramatically different from real Earth's. European expansion is hamstrung. Left to themselves, the Aztec and Inca, two of the most technologically advanced societies in the Americas, become the most powerful states in the Western Hemisphere. Across the Atlantic Ocean, countries such as Britain, France and Spain that pursued aggressive colonial expansion during the age of sail are never able to build and maintain colonies far from home. States in India and China, which were colonized or economically dominated by Europeans on real Earth, maintain their independence.

Here's where Nebula Earth starts to get really politically interesting. Without sea navigation, overseas trade is restricted. Small merchant ships cruise the Mediterranean, allowing trade between Europe and the Middle East, but the greatest source of international trade is the Silk Road, a long network of trade routes beginning in Constantinople that stretches across Central Asia to India and China.

On real Earth, caravans moved back and forth along the Silk Road trading silk, precious stones and spices for thousands of years. On Nebula Earth it becomes the most important (and possibly only) major trade route in the world. Any nation that controls a significant portion of the route quickly becomes wealthy, but it's also a target for bandits and vulnerable to conquest from powerful neighbors. And most of the land along the road is barren and difficult to settle, making it hard to hold on to. Just as in our own history, parts of the Silk Road change hands often. Major players over thousands of years are the Greeks, Turks, Han Chinese, Mongols, Persians, Scythians and other nomads of the Central Asian steppes.

As it did in the real world, the route changes hands among historical empires as China and India trade indirectly with a weakened Europe, sometimes through the Islamic world and sometimes through Central Asian horse empires. In eastern Central Asia, the Manchu-Chinese conquer the remains of the Junghar steppe empire, and Russia expands through westward colonization and conquest. Russia and China officially set their borders with each other in treaties signed in 1689 and 1727, each demanding control of international trade in their domain.

Here's where our histories diverge. On real Earth, the Russian-Chinese treaties destroyed the economy of Central Asia. Peripheral countries, seeking to avoid a monopolized trade, found alternative routes, mainly through maritime trade and British colonies in India. Trade along the Silk Road ground to a halt, damaging the economies of both China and Russia [source: Beckwith]. On Nebula Earth, however, this overseas trade isn't an option. For Europeans, there is no trade with the New World to offset the economic damage of the Silk Road closing. There's no littoral zone — on the water, close to shore — trade-route system to reach the East. There are no sugar plantations in the Caribbean, no European-controlled silver mines in the New World and no slave trade across the Atlantic.

Maybe Russia becomes the dominant force in an impoverished Europe. China, free from European incursions, expands its territory east into Japan and south into the islands of the South Pacific to control the spice trade, possibly even colonizing Australia. Meanwhile, India grows richer and more powerful, as the rest of the world attempts to bypass the Russo-Chinese trade monopoly. As Nebula Earth enters the 20th century, western and northern Europe remain cultural and economic backwaters under the dark night sky. There are no world wars, or at least none led by European countries, but Russia, India and China are dominant global powers. In Africa, Somalia and Ethiopia form an increasingly important overland economic and cultural hub between Europe and India. And far across the oceans, an undisturbed North and South America await contact with a new and unpredictable Old World.


There are no stars in space photos? Actually, there are

Here’s a picture below that might give you an idea of what the sky looks like for astronauts and cosmonauts. This was shot at “night” so you can see the stars but keep in mind that, to the human eye, most of the sky around you looks like this at all times. You just usually can’t capture it in photographs.

Moon Landing: Astronaut Buzz Aldrin, lunar module pilot, stands on the surface of the moon near the leg of the lunar module, Eagle, during the Apollo 11 moonwalk. Astronaut Neil Armstrong, mission commander, took this photograph with a 70mm lunar surface camera. While Armstrong and Aldrin descended in the lunar module to explore the Sea of Tranquility, astronaut Michael Collins, command module pilot, remained in lunar orbit with the Command and Service Module, Columbia. This is the actual photograph as exposed on the moon by Armstrong. He held the camera slightly rotated so that the camera frame did not include the top of Aldrin’s portable life support system (“backpack”). A communications antenna mounted on top of the backpack is also cut off in this picture. When the image was released to the public, it was rotated clockwise to restore the astronaut to vertical for a more harmonious composition, and a black area was added above his head to recreate the missing black lunar “sky”. The edited version is the one most commonly reproduced and known to the public, but the original version, above, is the authentic exposure. This image was cataloged by NASA Headquarters of the United States National Aeronautics and Space Administration (NASA) under Photo ID: AS11-40-5903. Image: Wikipedia

Why are there no stars visible in cislunar space? - Astronomy

I have an above average knowledge of physics, astronomy, and science in general, having made these my hobbies in the past. However, when my colleague asked me this question it didn't make sense so I first told him he was mistaken.

However, when looking at any photo of any man-made object (such as satellites, stations, Shuttle etc) taken in space by astronauts, although the foreground object is in sharp focus, the background is devoid of any light (Including pin pricks) at all. I figured that even if the stars were out of focus, there should be some light registering from the black "space" areas in the photo.

Why is this? If I too am now mistaken, can you please show me a non-doctored picture taken in space without a telescopic lens that shows stars and the foreground object?

The pictures of human-made objects in space that you speak of all suffer from one fatal flaw: they lack what astronomers call "integration time". Even in space, stars are very faint. If you use a camera to take a picture of an object in space, then you have to illuminate it using some kind of flash (just like on Earth). The flash is bright enough that the time over which the camera film is being exposed is, as on Earth, only a fraction of a second. This short time is more than sufficient to get a picture of the man-made object that your flash illuminates, but far too short to capture the stars. The fundamental difference between pictures of the stars themselves taken by telescopes and pictures of things in space with stars in the background is the exposure time, or integration time. In fact, astronomers do everything they can to avoid "doctoring" the images they obtain, since this might hide the very science that they are trying to get at.
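To put a rough number on "integration time", here is a hedged photon-budget sketch; the magnitude-zero photon flux and the 1 cm² aperture are order-of-magnitude assumptions chosen only to show how strongly the collected signal scales with exposure time.

```python
import math

def photons_collected(magnitude, aperture_cm2, exposure_s, phi0=1e6):
    """Rough photon budget for a star of the given visual magnitude.

    phi0 ~ 1e6 photons/s/cm^2 for a magnitude-0 star in the visual band is an
    order-of-magnitude assumption; the point is the linear scaling with time.
    """
    flux = phi0 * 10 ** (-0.4 * magnitude)  # magnitude scale: 5 mag = factor 100
    return flux * aperture_cm2 * exposure_s

# A ~1 cm^2 camera aperture looking at a just-visible 6th-magnitude star:
print(round(photons_collected(6, aperture_cm2=1.0, exposure_s=1 / 250)))  # ~16 photons
print(round(photons_collected(6, aperture_cm2=1.0, exposure_s=30)))       # ~119,000 photons
```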

I bet you can see how this works for yourself. The next time you are out on a clear night with some friends, take a picture of them with a starry sky in the background. When you develop the pictures, have a hard look for the stars that you know were there when you took the picture. Just like in space, a flash on Earth that allows you to photograph your friends obscures the stars (the effect should be more pronounced on Earth than it is in space because of our light-scattering atmosphere). To photograph the sky from Earth, you need a long-exposure camera, just like in space.


About the Author

Kristine Spekkens

Kristine studies the dynamics of galaxies and what they can teach us about dark matter in the universe. She got her Ph.D from Cornell in August 2005, was a Jansky post-doctoral fellow at Rutgers University from 2005-2008, and is now a faculty member at the Royal Military College of Canada and at Queen's University.


Can Astronauts See Stars From the Space Station?

I’ve often been asked the question, “Can the astronauts on the Space Station see the stars?” Astronaut Jack Fischer provides an unequivocal answer of “yes!” with a recent post on Twitter of a timelapse he took from the ISS. Fischer captured the arc of the Milky Way in all its glory, saying it “paints the heavens in a thick coat of awesome-sauce!”

Can you see stars from up here? Oh yeah baby! Check out the Milky Way as it spins & paints the heavens in a thick coat of awesome-sauce! pic.twitter.com/MsXeNHPxLF

— Jack Fischer (@Astro2fish) August 16, 2017

But, you might be saying, “how can this be? I thought the astronauts on the Moon couldn’t see any stars, so how can anyone see stars in space?”

John W. Young on the Moon during Apollo 16 mission. Charles M. Duke Jr. took this picture. The LM Orion is on the left. April 21, 1972. Credit: NASA
It is a common misconception that the Apollo astronauts didn't see any stars. While stars don't show up in the pictures from the Apollo missions, that's because the camera exposures were set to allow for good images of the bright sunlit lunar surface, which included astronauts in bright white space suits and shiny spacecraft. Apollo astronauts reported that they could see the brighter stars if they stood in the shadow of the Lunar Module, and they also saw stars while orbiting the far side of the Moon. Al Worden from Apollo 15 has said the sky was “awash with stars” in the view of the far side of the Moon when it was not in daylight.

Just as stargazers on Earth need dark skies to see stars, so do astronauts in space.

The cool thing about being on the ISS is that astronauts experience nighttime 16 times a day (in roughly 45-minute intervals) as they orbit the Earth every 90 minutes, and can have extremely dark skies when they are on the “dark” side of Earth. Here's another recent picture from Fischer where stars can be seen:

Twinkle, twinkle, little star…
Up above the world so high
Like a diamond in the sky… pic.twitter.com/8H7CshyP0p

— Jack Fischer (@Astro2fish) August 13, 2017

For stars to show up in any image, it's all about the exposure settings. For example, if you are outside (on Earth) on a dark night and can see thousands of stars, and you just take your camera or phone camera and snap a quick picture, you'll just get darkness. Earth-bound astrophotographers need long-exposure shots to capture the Milky Way. The same is true for ISS astronauts: if they take long-exposure shots, they can get stunning images like this one:

This long exposure image of the night sky over Earth was taken on August 9, 2015 by a member of the Expedition 44 crew on board the International Space Station. Credit: NASA.

This image, set to capture the bright solar arrays and the rather bright Earth (even though it's in twilight), reveals no stars:

Sometimes you look out the window and it just takes your breath away from how beautiful Earth is. Today is one of those times… #EarthShapes pic.twitter.com/53UqL9BFH1

— Jack Fischer (@Astro2fish) August 2, 2017

In this timelapse of Earth at night, a few stars show up, but again, the main goal here was to have the camera capture the Earth:

Universe Today’s Bob King has a good, detailed explanation of how astronauts on the ISS can see stars on his Astro Bob blog. Astrophysicist Brian Koberlein explains it on his blog, here.

You can check out all the images that NASA astronauts take from the ISS on the “Astronaut Photography of Earth” site, and almost all the ISS astronauts and cosmonauts have social media accounts where they post pictures. Jack Fischer, currently on board, tweets great images and videos frequently here.


What is ‘Exposure’?

The term ‘exposure’, in relation to cameras, refers to the amount of light that the camera’s sensor captures while taking a photograph. Technically speaking, exposure is the amount of light per unit area reaching an image sensor (or photographic film). In simple words, the exposure setting of a camera directly influences how light or dark a photograph captured by the camera will be.

An image with different levels of exposure.

As you might already know, a camera works by allowing light from an object to enter through its lens and then strike its image sensor, which subsequently helps to form an image of the object. Quite clearly, the exposure setting of a camera plays a pivotal role in how bright/dark a photograph turns out to be.

The exposure of a camera is controlled by three attributes (or settings) of the camera: aperture, ISO and shutter speed.
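As a concrete illustration of how aperture and shutter speed combine, here is a small sketch using the standard exposure-value formula EV = log2(N²/t); the "sunny 16" and Milky Way settings are typical assumed values, and ISO is left aside here since it changes the sensor's sensitivity rather than the amount of light collected.

```python
import math

def exposure_value(f_number, shutter_s):
    """EV = log2(N^2 / t): each +1 EV halves the light reaching the sensor."""
    return math.log2(f_number ** 2 / shutter_s)

daylight  = exposure_value(16, 1 / 100)  # ~14.6, the classic "sunny 16" setting
starlight = exposure_value(2.8, 30)      # ~-1.9, a typical Milky Way exposure
print(round(daylight - starlight, 1))    # ~16.6 stops apart: a factor of ~100,000 in light
```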


Why Are There No Green Stars?

If you think about the sheer number of other stars out there, and the range of possible colors our eyes can detect, it seems like there should be green stars, doesn’t it? After all, there’s very clearly a green band in a rainbow. But a rainbow is just a demonstration of all the independent colors that make up the white light we get from our sun, and the sun itself is glowing to produce that light.

Takakkaw Falls, Yoho National Park, British Columbia, Canada. Image credit: Michael Rogers, CC BY-SA 3.0

All stars glow in a particular color depending on how hot they are, but they’re not glowing only in that color in the same way that we see the green light in the rainbow only because it’s been separated from the other colors. Each star glows in a range of colors (corresponding to a range of frequencies of light), but it will produce some frequencies more than others. It turns out that we can predict the exact range of colors produced by a star, along with the frequency it produces the most. This peak frequency is determined entirely by the temperature of the star. We can do this calculation so easily because stars are well described as a black body -- any object which reflects no light and glows according to its own internal heat falls into this class of objects.
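The relation alluded to here is Wien's displacement law, which ties the peak wavelength directly to temperature; a minimal sketch, with the temperatures chosen purely as illustrative examples:

```python
# Wien's displacement law: the peak wavelength of a black body scales as 1/T.
WIEN_B = 2.898e-3  # displacement constant, metre-kelvins

def peak_wavelength_nm(temperature_k):
    return WIEN_B / temperature_k * 1e9

for T in (3500, 5778, 10000, 25000):  # illustrative temperatures (5778 K ~ the Sun)
    print(T, "K ->", round(peak_wavelength_nm(T)), "nm")
# 3500 K  -> ~828 nm (peak in the infrared; the star looks red)
# 5778 K  -> ~501 nm (the Sun: peak in the green, yet it looks white)
# 10000 K -> ~290 nm (peak in the ultraviolet; the star looks blue-white)
# 25000 K -> ~116 nm (far ultraviolet; the star looks blue)
```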

There are a couple of black-body objects on Earth we might be familiar with, and they’re all things we should avoid touching. Molten iron, for instance (and in fact any forged metal), will glow because of the temperature to which it has been heated in the forge. Both red-hot and white-hot metal will burn you badly, but the brighter white the metal is, the hotter it is. We actually use this property for science on Earth - for example, when studying volcanoes. If you’d like to know how hot the lava is without sticking a thermometer in it by hand (potentially dangerous), you can find out the temperature by checking the color of the uncooled lava. The closer to yellow-white the lava is, the hotter it is.

Metal, after being heated in a forge, glows bright yellow, cooling to a darker orange-red. Image credit: Alex Lines, CC BY-SA 2.0

But this doesn’t explain the lack of green stars. We can certainly make things burn green - we’ve managed to come up with green fireworks, for instance, but this burning is not the same as heating something until it glows green. Green fireworks are usually that color because certain salts (usually copper chloride or barium chloride) have been mixed in with the gunpowder. Heating those salts makes them glow at certain frequencies, just like a fluorescent light bulb. Critically, this isn’t a heat-based process.

Green fireworks explode in the night sky, the green color due to the inclusion of barium salts. Image credit: Jerry Daykin, CC BY-SA 2.0

I said earlier that, based 100% on the temperature of the star, I can tell you the peak frequency of light that star produces. By that logic, it’s entirely possible for a star to have a most-produced color that is green. And in fact this is true, and technically such a star could be called a green star. However, if you were to go look at that star, or at any other black-body object heated so that its peak frequency is green, what you would see would be disappointingly un-green. As you heat from the lowest temperatures, you go from a dark red, to orange, to yellow, to white. From white, if you continue heating, you can get to blue, but blue black-body objects tend to be extremely hot, with peak frequencies in the ultraviolet, and what we’re seeing as blue is the cooler tail extending down into the blue end of the visible spectrum.

We’ve totally skipped green! We replaced it with white. Because green is close to the center of the visible range, even if a star is producing a lot of green light, it’s also producing a lot of yellow and blue light, and the mixture appears white to our eyes. If the peak of the light production is off to one side or another, we can spot the shift towards orange or blue, but with green, we see a near perfect white light.
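A short sketch of that argument using Planck's law: for a black body whose spectrum peaks near 530 nm (a hypothetical "green" star of roughly 5,500 K, chosen here as an assumption), the blue, green and red parts of the visible band come out almost equally strong, which is why the eye reads the mixture as white.

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

def planck(wavelength_m, T):
    """Black-body spectral radiance (Planck's law)."""
    x = h * c / (wavelength_m * k * T)
    return (2 * h * c ** 2 / wavelength_m ** 5) / math.expm1(x)

# A star whose spectrum peaks near 530 nm ("green"): T = b / lambda ~ 5470 K.
T = 2.898e-3 / 530e-9
blue, green, red = (planck(w * 1e-9, T) for w in (450, 530, 650))
print(round(blue / green, 2), 1.0, round(red / green, 2))
# -> ~0.93 : 1.0 : 0.91 -- the visible band is nearly flat, so the eye sees white.
```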


These Photos Taken from the Moon Show Lots and Lots of Stars

Photo of a partially-lit Earth captured by the Far Ultraviolet Camera on Apollo 16. Note that stars are visible in the background. (NASA)

One of the favorite allegations by those who continue to be skeptical of the Apollo moon landings is that there are no stars visible in the photographs taken by the astronauts while they were “supposedly” on the Moon. Now while there's a rather short but succinct list of reasons why that's the case (and feel free to review those reasons here), the truth is that there ARE stars visible in photographs taken from the Moon—photographs taken in ultraviolet light during the penultimate Apollo mission in April of 1972.

The image above was captured with the Far Ultraviolet Camera/Spectrograph instrument that was set up on the lunar surface by Apollo 16 astronauts John Young and Charlie Duke on April 21, 1972.

The Far-UV Camera in position with Charlie Duke and the LRV in the background. (NASA: AS16-114-18439)

It was a gold-plated, 3-inch telescope and camera with a cesium iodide cathode and film cartridge, developed by African-American physicist Dr. George Carruthers while working at the U.S. Naval Research Laboratory. The camera was sensitive to light “at wavelengths between 500 and 1600 Angstroms… Emission at these wavelengths comes primarily from very hot stars of spectral classes O, B, and A, with surface temperatures of 10,000° to 50,000° K. For comparison, the temperature at the visible surface of the Sun is about 5800° K or 11,000°F. Stars as faint as magnitude 11, or 100 times fainter than can be seen with the human eye, were recorded.” (Source)
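The "magnitude 11, or 100 times fainter" comparison follows directly from the astronomical magnitude scale, on which five magnitudes correspond to a factor of 100 in brightness; a quick check, assuming the usual naked-eye limit of about magnitude 6:

```python
def brightness_ratio(m_faint, m_bright):
    """Astronomical magnitude scale: 5 magnitudes = a factor of 100 in brightness."""
    return 100 ** ((m_faint - m_bright) / 5)

# Naked-eye limit of about magnitude 6 is the usual assumption:
print(brightness_ratio(11, 6))  # -> 100.0, matching the "100 times fainter" in the quote
```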

Image captured with the Far Ultraviolet Camera/Spectrograph during Apollo 16 (NASA)

The Far Ultraviolet Camera was set up at the start of the first Apollo 16 EVA in the shade of the LM and aimed at planned points of astronomical interest during the course of the mission, and allowed to expose the film at various lengths as needed. At the end of the mission the film was retrieved and brought back to Earth; the camera itself remains on the surface in the Descartes Highlands next to the LM descent stage.

A total of 190 UV mission photographs were taken, along with additional calibration frames.

“Specific planned targets were the geocorona, the Earth’s atmosphere, the solar wind, various nebulae, the Milky Way, galactic clusters and other galactic objects, intergalactic hydrogen, solar bow cloud, the lunar atmosphere, and lunar volcanic gases (if any).” (Source)

Image captured with the Far Ultraviolet Camera/Spectrograph during Apollo 16 (NASA)

So as we can clearly see, stars were captured on camera from the Moon. It just required a long exposure and a steady, tripod-mounted camera aimed upward from a dark location—not one strapped to the chest of an astronaut’s space suit aimed toward the bright surface of the Moon.

And why ultraviolet photography? Simple: it had scientific value. The Earth’s atmosphere blocks much of the UV light that comes in from distant stars; having an observatory on the Moon, even for a brief few days, was worth it for astronomers—especially in the days before orbiting space telescopes like Hubble.

Besides stars—which show up in some of the UV photos as streaks because of the exposure times—the Far Ultraviolet Camera also captured some very cool pictures of Earth with its atmosphere illuminated by airglow as well as rings of aurorae visible around the polar regions.

Image captured with the Far Ultraviolet Camera/Spectrograph during Apollo 16 (NASA)

“The most immediately obvious and spectacular results were really for the Earth observations, because this was the first time that the Earth had been photographed from a distance in ultraviolet light, so that you could see the full extent of the hydrogen atmosphere, the polar auroras and what we call the tropical airglow belt.”
— Dr. George Carruthers

So basically this is yet another nail in the already nail-bristled coffin lid for Apollo hoaxers. No stars in Apollo pictures? Au contraire: Dr. Carruthers’ golden camera captured plenty.

(By the way, the images seen here aren’t readily available online from NASA; they were ordered from Goddard Space Flight Center and decoded and scanned by a private third party who goes by the handle “Apollo 16 UVC S201.” It’s a work in progress, but they are still public domain NASA images.)

See the scanned photos from the Far Ultraviolet Camera here, and check out a slideshow of the current lot of them below.

UPDATE: Here’s a processed version of the above Earth aurora photo, with some color added to enhance contrast.

Earth and aurora captured from the Moon in far-ultraviolet light during Apollo 16. (NASA / Jason Major)

