Although we call them smartphones, making phone calls is far from the top use for today's highly versatile handsets. Smartphones are built from the ground up for more resource-demanding tasks, such as browsing the web, running apps and games, and, of course, taking photos and videos. As you have surely noticed, most people's primary camera is their smartphone, which is why camera quality is a top priority for the great majority of smartphone buyers. But while many folks demand phones with awesome cameras, few are familiar with a camera's inner workings. So here's something to shed some light on the matter. Today's topic: how your smartphone's digital camera captures colors.

Visual representation of a Bayer filter on an image sensor. Via Wikipedia user Cburnett

In a nutshell, today's cameras have sensors on which a grid of light-sensitive pixels is laid out. Put, let's say, 12 million of these together and you get a 12-megapixel camera. The light that hits these pixels when the shutter button is pressed is processed and turned into a digital image that ends up in your device's gallery.

The interesting part is that the light-sensitive element of a pixel cannot detect color on its own. It can only detect the intensity of the light hitting it. To calculate the RGB (red, green, and blue) color value of each pixel, the camera therefore needs what's called a Bayer filter, which places a red, green, or blue filter on top of each pixel, enabling it to measure the value of that one particular color. Each pixel's two missing color values are then estimated from the values of adjacent pixels that captured those colors. Simple as that.

If you're still confused as to how cameras capture color, just watch Computerphile's video embedded below. It does a great job of explaining the Bayer filter's purpose in easy-to-understand terms.
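The mosaic-then-estimate process described above can be sketched in a few lines of Python. This is a minimal illustration, not how any real camera pipeline is implemented: it assumes an RGGB Bayer layout and uses the simplest possible demosaicing (averaging the nearest neighbors that sampled the missing color); the function names are made up for this example.

```python
import numpy as np

def bayer_pattern(h, w):
    # RGGB layout assumed for illustration: which color
    # (0=R, 1=G, 2=B) each sensor pixel's filter passes.
    pat = np.empty((h, w), dtype=int)
    pat[0::2, 0::2] = 0  # red
    pat[0::2, 1::2] = 1  # green
    pat[1::2, 0::2] = 1  # green
    pat[1::2, 1::2] = 2  # blue
    return pat

def mosaic(rgb):
    # Simulate the sensor: keep only the one channel each
    # pixel's color filter lets through.
    h, w, _ = rgb.shape
    pat = bayer_pattern(h, w)
    raw = np.take_along_axis(rgb, pat[..., None], axis=2)[..., 0]
    return raw, pat

def demosaic(raw, pat):
    # Naive demosaicing: a pixel keeps its own sampled color;
    # each missing color is the average of the 3x3-neighborhood
    # pixels that did sample it.
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            for c in range(3):
                if pat[y, x] == c:
                    out[y, x, c] = raw[y, x]
                else:
                    vals = [raw[ny, nx]
                            for ny in range(max(0, y - 1), min(h, y + 2))
                            for nx in range(max(0, x - 1), min(w, x + 2))
                            if pat[ny, nx] == c]
                    out[y, x, c] = sum(vals) / len(vals)
    return out

# A flat gray image survives mosaic + demosaic unchanged,
# since every neighbor average equals the original value.
rgb = np.full((4, 4, 3), 100.0)
raw, pat = mosaic(rgb)
rec = demosaic(raw, pat)
print(np.allclose(rec, rgb))  # True
```

On real photos this kind of averaging blurs fine detail and can produce color artifacts along sharp edges, which is why actual camera processors use far more sophisticated interpolation.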