Every autumn, when the ginkgo leaves turn golden, I take out my phone to photograph them. But no matter how many angles I try, all I end up with are mediocre photos. Is it that I simply have no talent for photography?

I have run into this situation many times since. I spot a patch of dandelions on the grass, raise my phone, and get a mediocre photo. Peach blossoms open in spring, I raise my phone again, and get another mediocre photo. Why does scenery that looks beautiful to my eyes turn out so plain in a photo? Is the camera to blame?

After thinking it over, I found the root of the problem: what I see is not what the camera records.

Copyrighted image, not licensed for reproduction

01 Why do the camera and the scenery we "see" differ?

When you see a beautiful scene, you open the camera, press the shutter, and the recording is done. In this process the camera's sensor records the scene's information and converts it into a digital signal for storage. The scene the camera "sees" and the photo displayed on the screen are consistent, and there are exact data to compare them against.

Image source: Wikipedia

But the human eye and the camera observe the same scene with very different results, and that is where the problem lies.

To some extent the camera is itself a product of bionics, and its imaging mechanism is roughly the same as that of the human eye. The lens of the eye refracts light, and so does the camera lens; the pupil dilates and contracts to control how much light enters the eye, and the camera's aperture does the same; the photoreceptor cells on the retina convert light signals into electrical signals and send them to the brain, and the image sensor of a modern digital camera likewise converts light into electrical signals and passes them to a processor. The imaging principles are broadly similar, yet the imaging results differ greatly.

02 Binocular vision differs from single-camera imaging

The scenes observed by our two eyes differ slightly (you can feel this by closing one eye and then the other). The brain uses these two views to "reconstruct" a three-dimensional world, whereas the picture taken by a camera is flat.

Try this: close one eye and hold up the index fingers of your left and right hands. You will find it hard to judge their relative positions with only one eye. The same goes for images taken by a camera: the spatial relationships between objects are not obvious in the picture, and the photo feels "flat" when you look at it.

Which pen do you think is closer to the camera? See the answer in the picture below.

Illustration: Salted Fish in the Sea

Technology that addresses this difference already exists, but it is not yet widely used. On the recording side, there are devices that imitate the viewpoints of the human eyes: they usually have two lenses spaced roughly as far apart as a person's eyes, and they record images similar to what the left and right eyes would see. Such devices can be used to shoot 3D movies.

Image source: Wikipedia
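As a rough aside (not from the article): the reason two slightly offset views convey depth can be captured by the standard stereo triangulation relation, depth = baseline × focal length ÷ disparity. The sketch below assumes an eye-like baseline of about 65 mm and made-up focal-length and disparity values purely for illustration.

```python
# Minimal sketch (not from the article): how two horizontally offset views
# encode depth via the standard stereo triangulation relation
#   depth = baseline * focal_length / disparity
# All numbers below are assumed for illustration only.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Distance to a point seen by both 'eyes'/lenses, in metres."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

if __name__ == "__main__":
    baseline = 0.065   # ~65 mm, roughly the spacing between human eyes
    focal = 1000.0     # focal length expressed in pixels (assumed)
    for disparity in (50.0, 20.0, 5.0):   # larger disparity = nearer object
        d = depth_from_disparity(baseline, focal, disparity)
        print(f"disparity {disparity:5.1f} px -> depth {d:5.2f} m")
```

The nearer an object, the larger the disparity between the two views; a single photo discards this cue entirely, which is why relative positions are so hard to judge in it.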
At the display end, however, there is still no portable device that shows stereoscopic images to the naked eye. In the cinema, people have to wear special 3D glasses (built from polarizing lenses) to see a stereoscopic scene, which places special demands on both the projection equipment and the glasses. To view stereoscopic images anywhere in daily life would require "light-field display" technology, which can deliver a separate image to each eye without glasses. The principle is rather involved, so I won't elaborate here; in short, because of cost and equipment size, this kind of display has not yet come into wide use.

03 The lateral range of sharp imaging differs

The human eye can take in a wide range: a single eye's field of view is about 150°, and the two eyes together can reach about 200°. However, the range over which the eye images sharply is tiny, only a few degrees. As you read this article, you can see no more than about ten words clearly at once. Don't believe it? Try it: fix your gaze on one word and notice how many of the surrounding words remain sharp.

At this point someone may ask: then why is there the saying that "one glance can take in ten lines"? It is because the eyeballs move rapidly, sweeping that narrow zone of sharp vision across different parts of the scene.

Cameras are not like this. Objects at the same distance from the camera but in different directions can all be imaged sharply. The cameras on mobile phones generally have fields of view between about 40° and 120°, and the pictures they take are sharp at every angle within that field.

Illustration: Salted Fish in the Sea

From an equipment point of view, rendering every position sharply sounds ideal, but from a photographic point of view it may not be. When we admire a beautiful scene, the zone of sharp vision usually falls on the beautiful part, while the cluttered, secondary elements lie outside it and blur into the background. Our eyes are drawn to the scenery, and everything around it fades and loses presence. But if you snap a pile of photos with your phone at that moment, they often look unremarkable when you browse them later, because the cluttered surroundings are rendered perfectly sharp in the photos and grab attention.

04 The depth range of sharp imaging differs

We have discussed the "lateral" imaging characteristics of cameras and eyes; their characteristics along the depth direction also differ. This depth characteristic is usually called "depth of field": the depth of space within which a camera renders a scene sharply. Suppose you photograph a subject two metres away with your phone, and everything from one metre in front of the subject to four metres behind it is sharp in the photo; then the depth of sharp imaging, the depth of field, is five metres (cameras with different parameters have different depths of field).

The human eye likewise has a limited range of sharp vision. When the eye looks at an object 25 cm away, the depth it can see sharply spans roughly 23.6 to 26.6 cm (a depth of field of about 3 cm; the value varies with the person and the lighting). The eye's depth of field also changes with viewing distance: it is very small for near objects and comparatively large for distant ones. The depth of field of a camera depends not only on the distance to the subject but also on the lens's focal length, aperture and other parameters.

Image source: Wikipedia
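As a rough aside (not from the article): how depth of field depends on focus distance, focal length and aperture can be sketched with the standard thin-lens approximation below. The lens parameters (a 5 mm f/1.8 "phone-like" lens and a 50 mm f/2.8 lens, with assumed circles of confusion) are illustrative guesses, not figures from the article.

```python
# Minimal sketch (not from the article): the standard thin-lens depth-of-field
# approximation, showing how focus distance, focal length and aperture (f-number)
# set the range that renders acceptably sharp. All parameters are assumed
# illustrative values, not measurements of any particular phone or eye.
import math

def depth_of_field(focal_mm, f_number, focus_mm, coc_mm):
    """Return (near_limit_mm, far_limit_mm) of acceptable sharpness."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_mm * (hyperfocal - focal_mm) / (hyperfocal + focus_mm - 2 * focal_mm)
    if focus_mm >= hyperfocal:
        far = math.inf   # everything beyond the near limit stays sharp
    else:
        far = focus_mm * (hyperfocal - focal_mm) / (hyperfocal - focus_mm)
    return near, far

if __name__ == "__main__":
    # Assumed phone-like lens: 5 mm focal length, f/1.8, small-sensor circle of confusion
    near, far = depth_of_field(focal_mm=5, f_number=1.8, focus_mm=2000, coc_mm=0.003)
    print(f"phone-like lens at 2 m: sharp from {near/1000:.2f} m to {far/1000:.2f} m")
    # Assumed 50 mm f/2.8 lens on a larger sensor, focused at the same distance
    near, far = depth_of_field(focal_mm=50, f_number=2.8, focus_mm=2000, coc_mm=0.03)
    print(f"50 mm f/2.8 at 2 m:     sharp from {near/1000:.2f} m to {far/1000:.2f} m")
```

Under these assumed numbers, the short-focal-length, small-sensor lens keeps a far larger range sharp at the same focus distance, which is consistent with the point made next about phone cameras.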
When the observer's eyes and the shooting device have different depths of field, the scene the eye sees and the scene recorded in the photo will differ. These differences can make a photo look mediocre, or they can make it look more striking. The particular construction of a phone camera gives it a relatively large depth of field, so the depth range of sharp imaging is very long. Phone manufacturers therefore often use image processing to imitate the look of a shallow depth of field, a sharp subject against a blurred background, to make photos more attractive.

05 See the world from the camera's "point of view" and take better photos

Today's phones carry three or more camera modules (each with a different focal length, capturing a different range of the scene), and the photos from each have their own character. Choose the camera to suit the scene: use the wide-angle camera (large field of view) for grand vistas, and the telephoto camera (smaller field of view) for local detail.

Illustration: Salted Fish in the Sea

Make the subject stand out within the frame as much as you can, and shoot with a small depth of field (the longer the focal length, the closer you are to the subject, and the larger the aperture, the shallower the depth of field) to suppress distractions in the background.

Once you understand how the camera images the world, taking photos becomes much easier. Finally, I hope that with all this in mind, everyone can take wonderful photos this beautiful autumn!

Produced by | Science Popularization China
Author | Salted Fish in the Sea
Producer | China Science Expo
Submitted by | Computer Network Information Center, Chinese Academy of Sciences
The cover image and the images in this article are from a copyright image library and are not licensed for reproduction.