Images cannot exist without light. To produce an image, the scene must be illuminated by one or more light sources. In general terms, light sources can be divided into point and area sources. A point light source originates at a single location in three-dimensional space, like a small light bulb, or effectively at infinity, like the sun. Light sources also have a range of intensities and a spectrum of wavelengths, which influences the design of imaging systems. Machine vision, which deals with vision tasks in controlled settings, capitalizes on the principles of lighting: recognizing and understanding the importance of illumination can lead to simpler, easier, and less costly solutions.

In an image, the key factors that affect the color of a pixel are the light sources and the object's surface properties: their emittance and reflectance spectra, and their relative position and orientation. Most computer vision systems assume a local illumination model with no interreflections or scattering. A fully correct illumination model would need to account for interreflections as well, but it is quite difficult to analyze a scene with such a complex model.

Let us look at the basic concepts of light, reflectance, and shading in more detail. In general, the word light refers to the visible spectrum of electromagnetic radiation. Visible light spans wavelengths between 400 and 700 nanometers, a very small fraction of the entire electromagnetic spectrum. The appearance of light sources depends on their spectral power distribution, which shows the relative intensity of light across wavelengths. Different light sources have different power spectra, which influence the design of imaging systems and how objects appear in images. Light is measured using several techniques, and the terminology can be a little overwhelming. For now, we will be focusing on radiometric techniques.
Radiometric measurements at the light source are expressed as radiance (or, photometrically, luminance). If the measurements are made at the object surface, they are expressed as irradiance (or illuminance). Depending on the direction of the light source and on the shape and reflectance properties of the object, different configurations of shadows and shading are observed.

When light hits an object, one of several things can happen: it gets transmitted through, it gets absorbed, or it gets reflected. Let us consider the simplest case of all, mirror reflection. Measured from the surface normal, the angle of incidence is always equal to the angle of reflection. The majority of surfaces in the real world have both a diffuse reflection component and a specular reflection component. Irrespective of the direction of the incident light, diffuse reflection happens in almost every direction.

Now, let us try to formalize this reflection model by considering this figure. We have the incoming ray, the incident ray, defined by angles theta_i and phi_i, and the reflected or outgoing ray, defined by theta_e and phi_e. This formulation makes sense because the reflected ray can be in any direction, and theta and phi capture those directions for us. Given an incoming ray and an outgoing ray, what proportion of the incoming light is reflected along the outgoing ray? The answer is given by the bidirectional reflectance distribution function (BRDF), represented as a function of theta_i, phi_i, theta_e, and phi_e. Here are a few facts about the BRDF. First, the outgoing light energy is less than or equal to the incident light energy. Second, swapping the incident and reflected light directions results in the same reflectance.
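To make those two BRDF facts concrete, here is a minimal sketch in Python; the function name, signature, and albedo parameter are illustrative assumptions, not from the lecture. An ideal Lambertian surface has a constant BRDF, which trivially satisfies the reciprocity property, and the 1/pi factor (a standard normalization, assumed here) keeps the total reflected energy no greater than the incident energy.

```python
import math

def lambertian_brdf(theta_i, phi_i, theta_e, phi_e, albedo=0.5):
    """Constant BRDF of an ideal diffuse (Lambertian) surface.

    The incoming direction (theta_i, phi_i) and outgoing direction
    (theta_e, phi_e) are spherical angles about the surface normal;
    for a Lambertian surface the value does not depend on them.
    The 1/pi factor normalizes so the reflected energy, integrated
    over all outgoing directions, never exceeds albedo <= 1.
    """
    return albedo / math.pi

# Reciprocity: swapping the incident and reflected directions
# leaves the BRDF unchanged.
f_ab = lambertian_brdf(0.3, 1.0, 0.7, 2.0)
f_ba = lambertian_brdf(0.7, 2.0, 0.3, 1.0)
assert abs(f_ab - f_ba) < 1e-12
```

A general (non-Lambertian) BRDF would actually depend on all four angles, but it must still satisfy these same two properties.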
We are going to stick with this convention: a unit vector L pointing in the light source direction, a unit vector N, the surface normal, R, the mirror reflection direction of L about N, and V, the viewing direction. Now, let us see what happens if we consider a mirror surface. The reflected light will equal the incident light provided the viewing direction coincides with the unit vector R.

Now, let us look at the other extreme, a Lambertian surface, which diffuses the incident light in all directions. The microfacets on a Lambertian surface scatter incoming light randomly in all directions. You may be wondering whether the angle of incidence matters when it comes to diffuse reflection. It does matter: the reflected light is maximum when the incident light direction is parallel to the surface normal.

Now, let us formally define the diffuse reflection model, taking the angle of incidence theta into account. In the equation relating outgoing and incoming radiance, observe that the constant kd is the albedo of the surface and N.L is the dot product of the unit vectors N and L, which equals cos theta. The reflectance of a Lambertian surface is therefore kd cos theta, so when theta equals zero, it equals the albedo. Albedo is the proportion of incident light or radiation that is reflected by a surface, typically that of a planet or a moon. A random fact in the middle of the lecture: Earth's albedo is around 0.3. The albedo may vary with the wavelength of the incident light; given that light sources emit a range of wavelengths, this is an important factor to consider.

Now, let us look at specular surfaces, which are not exactly like mirrors but do have a specular component. That specular component is strongest when the viewing direction is close to the unit vector R. And that is not all.
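The kd cos theta relation above can be sketched directly in code. This is a minimal example assuming unit vectors and a single distant light; the function name and parameters are mine, not the lecture's.

```python
import math

def diffuse_radiance(kd, normal, light_dir, light_intensity=1.0):
    """Lambertian shading: L_out = kd * I * max(0, N . L).

    kd is the albedo; normal (N) and light_dir (L) are unit 3-vectors.
    N . L equals cos(theta), and max(0, .) clips points that face
    away from the light, which receive no direct illumination.
    """
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return kd * light_intensity * max(0.0, n_dot_l)

# Light parallel to the normal (theta = 0): reflectance equals the albedo.
print(diffuse_radiance(0.3, (0, 0, 1), (0, 0, 1)))  # -> 0.3
# Light at 60 degrees from the normal: cos(60 deg) = 0.5 of the maximum.
s = math.sqrt(3) / 2
print(diffuse_radiance(0.3, (0, 0, 1), (s, 0, 0.5)))  # -> 0.15
```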
We also have ambient light to take into consideration. So this is the holistic illumination model, which takes ambient light, diffuse reflection, and specular reflection into account. Just to show you how these three components can be separated, this is a visualization of the ambient, diffuse, and specular components, popularly known as the Phong reflection model. Light and reflectance properties can be used to estimate the shape of an object. A clear understanding of the scene illumination and surface properties will let you develop augmented reality applications that are very photorealistic. If you get a chance, check out Light Stage, which is an application of light, reflectance, and shading.
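The three-term model described above can be sketched as follows. The function name, coefficient values, and shininess exponent are illustrative assumptions; the mirror direction R is computed as 2(N.L)N - L, as in the standard Phong formulation.

```python
def phong_shade(normal, light_dir, view_dir, ka=0.1, kd=0.6, ks=0.3,
                shininess=32, light_intensity=1.0, ambient=1.0):
    """Phong reflection model: ambient + diffuse + specular.

    All direction arguments are unit 3-vectors pointing away from the
    surface point. ka, kd, ks weight the ambient, diffuse, and specular
    terms; a larger shininess exponent gives a tighter highlight.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    n_dot_l = max(0.0, dot(normal, light_dir))
    # Mirror reflection of the light direction about the normal:
    # R = 2 (N . L) N - L
    r = tuple(2 * n_dot_l * n - l for n, l in zip(normal, light_dir))
    r_dot_v = max(0.0, dot(r, view_dir))
    return (ka * ambient
            + kd * light_intensity * n_dot_l
            + ks * light_intensity * r_dot_v ** shininess)
```

When the viewer looks straight along R, the specular term reaches its maximum; as V moves away from R, the cosine term raised to the shininess power falls off quickly, producing the familiar compact highlight.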