
Journey through a Photographer's Lens

“We take a handful of sand from the endless landscape of awareness around us and call that handful of sand the world.” — Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance: An Inquiry Into Values

With the advent of high-quality[1] cameras on our smartphones, we now find ourselves spending more time taking pictures of the world around us. I often enjoy taking pictures of nature, wildlife, and cityscapes. However, despite this ever-increasing deluge of pictures, I still found myself lacking basic photography skills. What separates a good photograph from a bad one? Most of us have an intrinsic sense of taste when it comes to distinguishing between photograph quality, but was there some way I could better articulate and understand this? Could a minor time investment help me take better pictures, and perhaps learn to see the world as photographers do?

Luckily, I managed to find a course earlier this year that offered to teach the fundamentals of photography, titled “Learning Photography the Hard Way” (later renamed to “Pragmatic Photography”)[2].

The intent of writing this post is to give a high-level overview of some of the topics we touched on, and reflections on what I learned. I’ll structure this post in three sections, in the order of material we covered. The first part will be on the history and evolution of imaging systems[3]. Next, I’ll provide a short tour through a camera’s internals. I’ll conclude with thoughts and reflections on what I learned. The course also covered topics on lighting, scene composition, and editing, but for brevity, I won’t be covering them in this post. Let’s dive in!

Evolution of Imaging Systems

What do we mean by the word “image”? Colloquially, we use it to refer to a photograph or perhaps something having to do with a camera. However, if we think about its broader meaning, an image can be thought of as anything used to approximately portray a subject. This subject could be a person, a landscape, or even a map! What ties all of these forms together is that the representation exists in the creator’s mind long before its final imprint on a canvas.

Hence, I’d argue that an “imaging system” is anything that helps us better portray our mental representation of the subject. Of course, this is far from a static process. Our best tools have a symbiotic relationship with our thoughts. Our mental representations change under the influence of the tool’s constraints. In the case of a camera, these constraints could be the type of lens or scene lighting conditions. These then influence the range of images we can capture. I view a photographer as an artist who not only works within the bounds set by these constraints but pushes the limits of the medium to paint vivid mental imageries.

This is why I think it’s important to understand the nature of this medium’s constraints, to gain insight into the possible range of expressions it supports[4]. This post attempts to do so by diving into the important components behind modern-day cameras and how each influences the culminating image. What follows is by no means an exhaustive enumeration, but it should provide insight into what the fundamental parts are[5].

Camera Internals


Shutter

The shutter is arguably among the most important camera components[6], since it allows light to pass through to the camera sensor. No light? No photo :)

A simple model I like to imagine is trying to use a soda machine to fill a cup with the “right” amount of soda. We neither want the cup to overflow nor underflow. All we have control over is how long to keep the nozzle open. If we miss our target amount, we have to throw all the soda out and start over. As you can see, it’s somewhat of a balancing act to get the right amount of soda in[7].

Depending on their mechanics, there exist several types of shutters. Here, I’ll cover the two types I encountered during the course, and the interesting side effects they produce.

Focal Plane Shutter

In this variant, the shutter moves across the length of the camera to expose the sensor. The shutter movement can be either vertical or horizontal, depending on the type of camera. In film cameras, for example, the shutter movement is usually horizontal, since the unexposed film is on the left, and the exposed film spools up on the right. This results in a simpler mechanical system compared to vertical shutter movement, since the shutter moves in the same direction as the film winding.

Image illustrating the movement of a focal plane shutter. Image source

A disadvantage of this type of shutter though is that it can distort fast-moving subjects, such as that in the following photograph:

The apparent elongation of subjects due to the mechanics of focal plane shutters. Image source

The apparent elongation of the car and people in the scene is a result of one edge of the film being exposed an appreciable time after the other. This phenomenon can either elongate or shorten the image of the subject depending on the direction of relative motion between the shutter and the subject.

Leaf Shutter

Leaf shutters consist of several thin metal plates that slide over each other to form a circular aperture. The plates retract to expose the camera sensor and then extend back over each other to close the aperture.

Image illustrating the movement of a leaf shutter. Image source

Ideally, the shutter should open instantaneously, stay open for the exposure duration, and then close instantaneously. This assumption is approximately valid for slower shutter speeds but breaks down once we start to approach the limits of maximum shutter speeds. In these situations, the shutter spends a significant portion of the exposure period in an intermediate state between open and closed, which may lead to side effects such as vignetting.

Vignetting caused due to a leaf shutter camera operating at higher shutter speeds. Image source

Sync Speed

For concision, I won’t be diving into a detailed discussion of camera lighting and flash[8]. An interesting concept to think about, though, is how we’d synchronize the timing of the flash and shutter[9]. For scenes with suitable ambient light, we wouldn’t have to use the flash, and hence not have to think about this. But what about in most other situations?

In such situations, a concept called sync speed comes into play: the fastest shutter speed we can use with flash. A shutter speed faster than this will lead to a blackout, as illustrated in the following image:

Image illustrating the effect of blackout due to a shutter speed faster than the sync speed. Image source
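We can make the blackout concrete with a toy model (my own sketch, not from the course; the function name and the 1/250 s curtain travel time are assumptions). In a focal-plane shutter, the second curtain starts closing one exposure time after the first starts opening, and each curtain takes a fixed time to travel across the frame, so the entire frame is uncovered at once only when the exposure is at least as long as the curtain travel:

```python
def fully_open_duration(exposure_s, curtain_travel_s):
    """Time (in seconds) during which the ENTIRE frame is uncovered.

    Toy focal-plane shutter model: the second curtain starts closing
    `exposure_s` after the first curtain starts opening, and each
    curtain takes `curtain_travel_s` to cross the frame. A brief flash
    lights the whole frame only if this duration is non-negative.
    """
    return exposure_s - curtain_travel_s

curtain_travel = 1 / 250  # assumed travel time -> roughly a 1/250 s sync speed

for exposure in (1/125, 1/250, 1/1000):
    ok = fully_open_duration(exposure, curtain_travel) >= 0
    print(f"1/{round(1/exposure)} s:", "full frame lit" if ok else "partial blackout")
```

At 1/1000 s the frame is exposed through a moving slit, so a flash that fires for a fraction of a millisecond lights only the band of the frame the slit happens to be over, producing the blackout shown above.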

In “ancient” times, photographers would have to manually hold the shutter open, fire the flash, and then allow the shutter to close.

In the mid-1940s, add-on flash synchronizers became common, which let the photographer trigger the flashbulb and the shutter release together. Modern-day cameras embody the digital equivalent of this analog mechanism.

Fun fact: the iconic lightsabers from the Star Wars movies were made from these very flash camera handles. Image source


Lenses

It is difficult to do justice to the physical principles of lenses and image formation in a short post such as this. I’ll instead provide a brief overview of concepts relevant to camera lenses that you’re likely to come across, and how they influence taking pictures[10].

[a] pinhole camera [b] simple convex lens [c] compound lens

The lens is the component that helps concentrate light at a fixed point (i.e. the focal point). We can create the simplest type of camera using no lenses at all, colloquially known as a pinhole camera (Figure [a] in the image). If you’ve experimented with a pinhole camera, you’ll recall that it permits creating only a limited range of pictures. This is largely because we have no control over the light path once light enters the camera.

This problem can be partially solved by constructing a camera with a single convex lens (Figure [b] in image), which gives us limited control over the light path. This setup works for simple cameras but has the drawback of being susceptible to chromatic aberrations, such as the appearance of fringes in the image.

A cheetah with fringe views. Image source

Actual camera lenses are compound (Figure [c] in image) which means they consist of a series of convex and concave lenses. This helps us amplify light properties we care about and diminish other effects such as aberrations.

Ultimately, however, there is no “one size fits all” lens. By emphasizing some properties and diminishing others, the lens designer crafts a lens suitable for specific use cases.

However, certain concepts underlie the construction of any camera lens. You would typically encounter these when studying and contrasting different lenses, and they’re what we’ll look at next.


A camera lens diagram with labels for important components and concepts. Image source: https://blender.stackexchange.com/questions/52495/is-it-possible-to-make-an-optically-functional-camera-lens-in-cycles

The camera lens diagram is, in my experience, usually the driest (and most technical) part of understanding how a camera operates. For the following definitions, refer to the lens diagram to get a better intuition for what each concept conveys.

Angle of view (AoV): The visible extent of the scene which can be captured by the camera sensor

Focal length: The distance between the lens and the sensor when the subject is in focus. The focal length has an inverse relation to the angle of view: shorter focal lengths result in a larger angle of view, and vice versa.

Image illustrating the drastic effects that changing the focal length can have on the image. For an even more fascinating effect, check out the original GIF.

Depth of field: The range of distances around the subject that also appears to be in focus. This is an important tool in the photographer’s toolbox since it influences where the viewer’s attention is drawn.

Aperture: The opening that lets light into the lens. Cameras support changing the diameter of this opening (measured by a quantity called f-stops). Similar to the inverse relation between focal length and angle of view, f-stops too have an inverse relation with the aperture size. Smaller f-stop numbers correspond to larger aperture sizes, and vice versa. The size of the aperture directly affects the depth of field.

Image illustrating how changing the depth of field can draw the viewer’s attention to different regions of the image. For the original continuous effect, check out this GIF.

Camera lenses are rated with a maximum aperture size, which indicates how wide the aperture can open. Since the aperture influences how much light can enter the camera, lenses with a wide maximum aperture are usually great for low-light photography.
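The two inverse relations above (focal length vs. angle of view, and f-stop vs. aperture diameter) can be sketched numerically. This is my own illustration, not from the course; the thin-lens angle-of-view formula and the 36 mm full-frame sensor width are assumptions:

```python
import math

def angle_of_view(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal angle of view in degrees (thin-lens approximation)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def aperture_diameter(focal_length_mm, f_stop):
    """Effective aperture diameter in mm; the f-number is focal length / diameter."""
    return focal_length_mm / f_stop

# Shorter focal lengths give a wider angle of view...
print(round(angle_of_view(24), 1))    # → 73.7 degrees (wide angle)
print(round(angle_of_view(200), 1))   # → 10.3 degrees (narrow, telephoto)

# ...and smaller f-stop numbers mean a physically larger opening.
print(round(aperture_diameter(50, 1.8), 1))  # → 27.8 mm at f/1.8
print(round(aperture_diameter(50, 16), 1))   # → 3.1 mm at f/16
```

The same 50 mm lens admits roughly 80 times more light at f/1.8 than at f/16 (the area scales with the diameter squared), which is why “fast” wide-aperture lenses shine in low light.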

Primary Controls

As we’ve seen so far, a photographer has many knobs at their disposal when taking a picture. This may seem overwhelming when starting out (it did for me!). Thankfully, there are three core adjustments worth understanding that cover most use cases. These three adjustments, also known as “the exposure triangle”, refer to the triad of shutter speed, ISO, and aperture size, which work together in harmony (or disharmony) to influence the final picture. So far in this post, we’ve seen shutter speed and aperture size. Let’s understand ISO next, and then see how these principles interact.


ISO

The ISO setting governs how sensitive the camera is to light. I find it useful to understand ISO through the “lens” of the soda machine model we saw earlier in this post. As an extension of the analogy, think of the soda machine vendor being mischievous and contaminating the tank with trace amounts of sand. If we now attempted to fill our cup with soda, we’d receive a “grainy” drink. We could reduce the cup size to reduce the amount of grain, but this would also result in us getting less soda :-(

Is there a solution to this tradeoff between the amount of soda and the amount of grain? A similar phenomenon occurs in cameras when we adjust the sensor sensitivity. An increase does result in registering more photons, but it also increases our susceptibility to background noise. This background noise is always present in nature and is caused by many different factors. The tradeoff can often be resolved by understanding how “the exposure triad” interacts.

The Exposure Triad

Now that we’ve seen the three concepts of shutter speed, ISO, and aperture size, we’re in a position to understand how they interact. I’ll illustrate these concepts through example images; however, if you have ready access to a camera, it’s most beneficial to play around with these knobs yourself[11] to understand how they interact[12].
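Before the examples, here’s a numeric sketch of the interaction (my own illustration, using the standard ISO-adjusted exposure value; the specific settings are arbitrary). Settings with the same EV admit the same total exposure, so a one-stop change on one knob can be offset by a one-stop change on another:

```python
import math

def exposure_value(f_stop, shutter_s, iso=100):
    """ISO-adjusted exposure value: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_stop**2 / shutter_s) - math.log2(iso / 100)

base = exposure_value(8, 1/125)        # f/8, 1/125 s, ISO 100

# Open the aperture one stop (f/8 -> f/5.6) and halve the exposure time:
# the EV is approximately unchanged (nominal f-stops are rounded values).
same = exposure_value(5.6, 1/250)

# Quadrupling the ISO instead makes the sensor need two stops less light.
brighter = exposure_value(8, 1/125, iso=400)

print(round(base, 2), round(same, 2), round(brighter, 2))  # → 12.97 12.94 10.97
```

The first two results differ only in the third significant figure, which is why photographers speak of trading stops between the three knobs; the third shows the two-stop shift bought by the higher ISO (at the cost of more grain).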

Concepts in Practice

Motion Blur

Have you seen images like these in the past?

Motion blur capturing a hummingbird’s 10–15 Hz wing-beat frequency. Image source

Based on what we’ve seen so far, we’re now in a position to combine these concepts and recreate something similar. Since we want blur, we’d decrease the shutter speed to boost the exposure duration. However, the longer exposure also lets more light (and accumulated noise) into the final image, so it makes sense to reduce the ISO.
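As a back-of-the-envelope check (my own sketch; the 12 Hz figure is an assumption within the 10–15 Hz range from the caption above), we can ask how much of a wing-beat cycle fits into one exposure:

```python
def wing_sweeps_per_exposure(wingbeat_hz, shutter_s):
    """Number of full wing-beat cycles that occur while the shutter is open."""
    return wingbeat_hz * shutter_s

WINGBEAT_HZ = 12  # assumed, within the 10-15 Hz range mentioned above

# A slow shutter smears more than a full beat into one frame -> visible blur.
print(round(wing_sweeps_per_exposure(WINGBEAT_HZ, 1/10), 3))    # → 1.2

# A fast shutter catches only a sliver of one beat -> sharp, frozen wings.
print(round(wing_sweeps_per_exposure(WINGBEAT_HZ, 1/2000), 3))  # → 0.006
```

In other words, whether the wings read as a blur or as frozen comes down to how the exposure time compares with the subject’s own timescale.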

Frozen Motion

How about an image like this?

This picture was taken using methods developed by Harold Edgerton (aka “Papa Flash”), a professor of electrical engineering at MIT who pioneered the broader usage of stroboscopes in photography. Image source

Since we want an instantaneous capture (i.e. frozen in time), we should increase the shutter speed. However, this results in less light reaching the sensor during the exposure. To counter this, we can increase the sensor sensitivity (i.e. raise the ISO).

Night Shots

Night shot of CMU’s Pausch Bridge (slight blur due to the lack of a tripod)

In my experience, these shots are tricky to capture. Since we’re working in low light, we have to decrease the shutter speed and increase the aperture size to let more light into the lens. However, this can easily corrupt the image with noise if the ISO setting is too high, or result in a dark image if the ISO is too low relative to the shutter speed and aperture size. The difficulty is compounded if you’re not using a tripod, since the longer exposure time can easily result in a blurry photograph with even the slightest movement.

Image Critique

Armed with this knowledge, we’re also in a better position to critique the following image, which is one I’d taken around the start of the course:

Snapshot of the CMU fence (painted then in remembrance of Kobe)

What is this an image of? Well, back then, I intended it to be an image of the CMU fence, but when I look at it now, I think it’s an image of a fire hydrant. That’s the object that first draws my attention in the foreground, especially with the brighter colors that pop out. Instead, it would’ve been better to either walk closer to the fence (or zoom in) and then slightly reduce the depth of field to blur out the background detail. If I’d also had the lighting under my control, I would’ve liked to bump up the brightness, which I can still do in post-processing.

Controlling Lighting

For concision, I didn’t dive into an explanation of lighting in this post, which is a fascinating topic in itself! For one assignment, I was tasked with controlling the lighting setup without having access to the standard tools a working photographer would have in their studio. I ended up piecing together items you’d normally find lying around the house, such as paper, a plastic bag, a tower lamp, a book, and aluminum foil, to create the following setup:

Hacky setup to control environment lighting conditions.

which resulted in the following image:

A General(ly) Grievous droid

Concluding Thoughts

In trying to cover a vast topic such as photography, I inevitably had to make compromises on which topics to include and which to skip for brevity.

Looking back on this experience, I gained a deeper appreciation not only for what photographers do, but also for how the pursuit of photography can positively influence our lives. When done with intent (instead of passive clicking), photography can make us more mindful of our surroundings. You start to notice the minutest details in your environment, as everything becomes a potential picture in your mind. After this experience, I can relate to anecdotes I’ve heard of photographers spending hours (if not days!) waiting for the right scene to manifest itself before capturing a picture.

I feel the advent of modern digital photography is both a blessing and a curse. A curse in that we are now drowning in a deluge of pictures, most of which are ephemeral and don’t survive beyond a short glimpse. A blessing in that most of us now have access to the basic tools to start refining our photography skills and set out on the path toward creating timeless pieces of art. I’m far from being a photographer, but this brief experience has opened my mind to new ways of seeing and thinking. I’ll end with a quote I thought of[13] during the course:

Photography is an art. A camera is just the medium.

Thanks to Wan Shen Lim for reviewing an initial draft of this.

Have thoughts, feedback, or comments about this post? I’d love to hear about them :)


[1] This was my opinion of smartphone cameras going into the course. My view completely changed after the first week, which we spent dissecting the internals of a camera and understanding its history, evolution, and function from first principles.

[2] I actually liked the original name better, since it better captured the course’s intent: to emphasize the importance of studying cameras from fundamental principles. This allows the practitioner to deeply understand how a camera operates through the features it exposes.

[3] I’m using the term imaging systems here intentionally, to avoid pre-association with any existing concepts you may have of modern-day cameras.

[4] As Marshall McLuhan once said, “The medium is the message”.

[5] And hopefully stoke enough interest to figure out the components not listed here, whether omitted for concision or not yet invented.

[6] Fun fact: The shutter is often associated with the iconic CLICK sound, but it is frequently not the noisiest component of a camera (that’s usually the mirror or the flash). Digital shutters are usually silent, which is why countries such as Japan and South Korea mandate a shutter sound on smartphones to discourage covert photography.

[7] The soda, in this case, is the photons of light impinging on the camera, and the cup is the sensor. This analogy may appear contrived, but it’s surprisingly good as a first-order approximation of what a camera does. The model comes in handy when thinking about concepts such as ISO.

[8] The topic of camera lighting, ambient light, and flash is fascinating! To do it justice would require a complete post in and of itself.

[9] This refers to the complete process of shutter opening, sensor exposure, and shutter closing.

[10] This is an oversimplified diagram, mostly due to my amateur vector graphic drawing skills. In the process of creating it, I gained a renewed appreciation for the work that graphic designers do.

[11] “I hear and I forget. I see and I remember. I do and I understand.” — Confucius

[12] D/SLR cameras offer fine-grained control of these knobs through manual mode. Smartphone camera apps offer similar functionality, but I find they don’t offer the same degree of insight (or fun).

[13] I suspect this quote was influenced by Edsger Dijkstra’s famous quip: “Computer science is no more about computers than astronomy is about telescopes”.

Written August 2, 2020. Send feedback to @bhaprayan.
