
Towards the stars and then the shadows

There are a few questions that we get all the time at The Planetary Society. Look up at space at night from a dark location and you can see innumerable stars. Why, then, do photos of things in space not contain stars? How come the black skies of the Moon contain no stars in Chang’e photos? The answer: The stars are there, they’re just too faint to show up.

I can illustrate with an example from everyday life. I’m sure that everyone reading this article has made the mistake of shooting a photo of a loved one standing in front of a brightly lit window. In your photo, all you can see is a silhouette; your subject’s face is a nearly featureless shadow. Their face still exists, of course! It’s just not brightly lit enough to show up in the photo. The same issues that can make your casual snapshots look bad affect space images, too. Let’s talk about three things that affect what details you can see in any photo, whether it’s of a star, a planet, or a person: the sensitivity of the camera, the time your camera had to collect light, and the dynamic range of your camera.

How much light does your camera need to see by?

Fancy cameras can adjust sensitivity by opening and closing the aperture that lets in the light. Human eyes do the same thing, automatically, all the time, by dilating and contracting their pupils. If you’re a sighted person walking from a brightly lit to a dark outdoor area, you won’t see stars in the sky either, at least not right away. As your eyes dial up their sensitivity by opening up your pupils, you slowly notice fainter and fainter stars. Most space cameras actually can’t adjust their aperture in this way. Instead, scientists predict the light levels that a camera will encounter through its mission, and design their instruments to have an aperture that’s an appropriate size for the range of targets they expect to encounter. This can be a challenge if your spacecraft will encounter a wide range of target brightnesses, but you design your camera to work on the intended science targets and don’t worry if it isn’t ideal for any fun extras you may photograph along the way. OSIRIS-REx, whose MapCam was designed to study the colors of a very dark-toned asteroid, couldn’t look at Earth without getting overwhelmed by the brilliant light reflecting off of bright clouds, causing bright artifacts at the top of the resulting image.

What is the dynamic range of your camera?

Is your camera capable of seeing both dimly-lit and well-lit things in the same image? Or does its light-collecting capacity get quickly overwhelmed by brighter things before it’s had time to detect any light from dimmer things? Here is where our eyes generally do much better than our cameras. When I see a friend sitting in front of a window, I can see their face just fine because my eyes are capable of discerning detail in both shadow and sunlight. This is partly because my eyes aren’t still when I look at a scene. My eyes constantly dart about, looking out the window, looking indoors, looking at my friend’s face, each time adjusting focus and aperture. My brain builds up a composite of all this information, making the view in my mind’s eye more detailed than any instantaneous view from my physical eye. Then I take out my camera and take a picture, and it looks terrible.

But wait - modern digital cameras have a trick that mimics what the human eye and brain do. With my phone camera I can turn on a feature called “HDR,” which stands for high dynamic range. When I take a photo in HDR mode, the phone actually takes two photos (one longer-exposure, one shorter) and merges the best-exposed parts of both images, to show me details both outside the window and in my friend’s features.
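That merging step is simple enough to sketch. Below is a minimal, illustrative example in Python with NumPy; it is not any phone’s actual HDR pipeline, and the function name, the weighting scheme, and the toy pixel values are all assumptions made just for this example. It shows the idea described above: each pixel of the final image is drawn mostly from whichever exposure rendered it closest to mid-gray, so window detail comes from the shorter exposure and face detail from the longer one.

    # Minimal sketch of a two-exposure HDR-style merge (illustration only,
    # not a real phone pipeline; names and values here are assumptions).
    import numpy as np

    def merge_exposures(short_exp, long_exp):
        """Blend two grayscale exposures given as float arrays scaled 0..1."""
        def weight(img):
            # "Well-exposedness" weight: highest near mid-gray, near zero at
            # the clipped extremes (0.0 = black shadow, 1.0 = blown out).
            return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

        w_short, w_long = weight(short_exp), weight(long_exp)
        total = w_short + w_long + 1e-8   # avoid division by zero
        return (w_short * short_exp + w_long * long_exp) / total

    # Toy 2x2 scene: column 0 is the bright window, column 1 is the dim face.
    short_exp = np.array([[0.60, 0.05],   # window readable, face nearly black
                          [0.55, 0.02]])
    long_exp = np.array([[1.00, 0.45],    # window blown out, face readable
                         [1.00, 0.40]])
    print(merge_exposures(short_exp, long_exp))  # keeps detail in both regions

Real HDR modes do considerably more than this (aligning the frames, handling motion, tone-mapping the result), but a weighted blend of this kind captures the core idea of keeping the best-exposed parts of each shot.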