Objective:
The objective of this assignment is to help you understand texture mapping, bump/normal mapping, and shadow mapping. First, you will learn how to add fine surface detail to the existing lighting system using normal mapping, which enriches the apparent geometry of a surface without adding polygons; then you will see how shadows improve the sense of depth and immersion in the scene.
Summary:
In this assignment, you are provided with a scene containing several spheres placed on a plane. For simplicity, there is only one directional light source. Your first task is to add normal mapping (a form of bump mapping) to the illumination system: you first calculate tangent/bitangent vectors, then, in the shader, construct the TBN coordinate system and move the lighting computation from world space into tangent space. In the second task, the scene is rendered twice. In the first pass, it is rendered from the light’s perspective to produce a depth map, stored as a texture. In the second pass, the scene is rendered as normal. You therefore need to calculate the matrix that transforms the scene from world space into the light’s viewing space (so that the depth map can be generated and used as a texture in the second pass), and then, in the shader, implement a function that decides whether a fragment is in shadow.
Specifics:
2. Model loading and setup. The models used in this assignment are sphere.obj and plane.obj, whose format is more complex than the models in the previous assignment because texture coordinates are involved. For the specification of OBJ files, you can refer to the instructions. In config.txt, rotation matrices and translation vectors are also included to give the objects their initial positions. The scene has only one directional light. You can use the keyboard/mouse to change the position of the light/objects.
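Since these OBJ files carry texture coordinates, each face vertex is a `v/vt/vn` index triplet rather than a bare vertex index. A minimal sketch of parsing one such face line is below; the names `FaceVertex` and `parseFaceLine` are illustrative, not part of the provided code, and this assumes fully specified triplets (real OBJ files may omit `vt` or `vn`):

```cpp
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

// One corner of a face: position, texcoord, and normal indices (0-based).
struct FaceVertex { int v, vt, vn; };

// Parse an OBJ face line of the form "f v/vt/vn v/vt/vn v/vt/vn".
// OBJ indices are 1-based, so we convert to 0-based here.
std::vector<FaceVertex> parseFaceLine(const std::string& line) {
    std::istringstream iss(line);
    std::string tag, tok;
    iss >> tag;                       // consume the leading "f"
    std::vector<FaceVertex> out;
    while (iss >> tok) {
        FaceVertex fv{};
        std::sscanf(tok.c_str(), "%d/%d/%d", &fv.v, &fv.vt, &fv.vn);
        fv.v--; fv.vt--; fv.vn--;     // 1-based -> 0-based
        out.push_back(fv);
    }
    return out;
}
```

The 0-based indices can then be used directly when assembling vertex buffers.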
3. Normal mapping (60%). The normals in a normal map are expressed in a local coordinate system (i.e., tangent space, or TBN space). To ensure correct lighting, you have to construct the TBN basis for each face, transform the relevant lighting vectors into it, and compute the shading there.
● Tangent space (20%). This is a space local to the surface of the model, spanned by the normal, tangent, and bitangent vectors. The normal is already given as the face normal. In the image below, the normal points out of the image, the green arrows are bitangents, and the red arrows are tangents. In the vertex shader, you need to set up the TBN system and then convert the relevant lighting vectors (light position, view vector, and fragment position) to tangent space before passing them to the fragment shader.
● Lighting calculation (20%). The normal map is given as texCubeNorm. There are three places in the fragment shader where you need to compute the normal, the light vector, and the view vector (all in tangent space). If you calculate them correctly, you will see the result below (when you turn on normal-mapping mode by pressing m/M):
Normal mapping turned off vs. turned on:
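The tangent and bitangent for a triangle come from solving the system that relates its edge vectors to its UV deltas: E1 = dU1·T + dV1·B and E2 = dU2·T + dV2·B. A minimal CPU sketch of this computation is below; the `Vec3`/`Vec2`/`computeTB` names are illustrative, and in practice you would orthonormalize the result against the face normal before building the TBN matrix:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 add(Vec3 a, Vec3 b)   { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s){ return {a.x * s, a.y * s, a.z * s}; }

// Tangent/bitangent for one triangle from its positions and UVs.
// Solves  E1 = dU1*T + dV1*B,  E2 = dU2*T + dV2*B  for T and B
// by inverting the 2x2 UV-delta matrix.
void computeTB(Vec3 p0, Vec3 p1, Vec3 p2,
               Vec2 t0, Vec2 t1, Vec2 t2,
               Vec3& T, Vec3& B) {
    Vec3 e1 = sub(p1, p0), e2 = sub(p2, p0);
    float du1 = t1.u - t0.u, dv1 = t1.v - t0.v;
    float du2 = t2.u - t0.u, dv2 = t2.v - t0.v;
    float f = 1.0f / (du1 * dv2 - du2 * dv1);   // determinant of the UV matrix
    T = scale(add(scale(e1,  dv2), scale(e2, -dv1)), f);
    B = scale(add(scale(e1, -du2), scale(e2,  du1)), f);
}
```

In the vertex shader, T, B, and the face normal N form the columns of the TBN matrix; multiplying the lighting vectors by its transpose (the inverse, once orthonormalized) takes them from world space into tangent space.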
4. Shadow Mapping (40%). Shadow mapping is implemented in two passes. In the first pass, to decide what is visible and what is hidden (and thus in shadow), you render the scene from the light’s perspective. Generating this depth map/shadow map requires two very simple shaders (depth_v.glsl and depth_f.glsl, already provided).
● Shadow testing (20%). The shadow map is given as shadowMap in the fragment shader. Your job is to implement the function calculateShadow in the shader. In this function, you first read the closest depth value from the depth map, then compute the depth of the current fragment, and compare the two to decide whether the fragment is in shadow. The parameter light_frag_pos is the fragment position in the light’s space. Feel free to add other arguments to the function if you think they are necessary. Finally, apply the shadow to the diffuse term (shadow should have no effect on the ambient term).
You should see the result below:
● PCF (Bonus 10%): The shadows exhibit jagged, blocky edges, which can be smoothed using percentage-closer filtering (PCF). This technique works by averaging over a 3×3 grid of texture coordinate samples.
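The shadow test and the PCF average can be sketched on the CPU as below. This is an illustrative model of the shader logic, not the provided code: it assumes the light-space position has already been remapped to [0,1] (perspective divide followed by the usual *0.5+0.5), represents the depth map as a plain float array, and uses a small depth bias to reduce shadow acne. In GLSL you would instead sample with `texture(shadowMap, uv).r` and get the map size from `textureSize`:

```cpp
#include <algorithm>

const int W = 4, H = 4;   // toy shadow-map resolution for this sketch

// Read one texel, clamping to the edge (like GL_CLAMP_TO_EDGE).
float sampleDepth(const float* map, int x, int y) {
    x = std::clamp(x, 0, W - 1);
    y = std::clamp(y, 0, H - 1);
    return map[y * W + x];
}

// Shadow test with a 3x3 PCF kernel: returns the fraction of the
// nine neighboring samples for which this fragment is occluded.
float calculateShadow(const float* map, float u, float v,
                      float fragDepth, float bias) {
    int cx = int(u * W), cy = int(v * H);
    float shadow = 0.0f;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            float closest = sampleDepth(map, cx + dx, cy + dy);
            // In shadow if the fragment is farther from the light
            // than the closest occluder recorded in the depth map.
            shadow += (fragDepth - bias > closest) ? 1.0f : 0.0f;
        }
    return shadow / 9.0f;
}
```

Without PCF, the loop collapses to a single center sample returning 0 or 1, which is exactly what produces the jagged edges the bonus asks you to smooth; the 3×3 average yields fractional shadow values along the boundary.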
5. The functions/methods requiring your implementation are marked “TODO”; however, depending on your particular implementation, there may be other places where you need to change code. You are expected to add/modify code as necessary so that the application runs smoothly.
Turn-in:
Don’t wait until the last moment to hand in the assignment!
For grading, the program will be compiled on Linux and run from the terminal, without command-line arguments (Visual Studio is a fallback, so please avoid platform-specific code, e.g., do not #include <windows.h>), and the code will be inspected. If the program does not compile, zero points will be given. If you have more questions, please ask on Piazza!
Good luck!