//-----------------------------------------------------------------------------
// Name: Cerebral_Cortex Direct3D Sample
//
// Copyright (c) 1998-2000 Microsoft Corporation. All rights reserved.
//-----------------------------------------------------------------------------
Description
===========
The Cerebral_Cortex sample demonstrates an environment-mapping technique called
sphere-mapping. Environment-mapping is a technique in which the environment
surrounding a 3D object (lights, surrounding scenery, etc.) is captured in a
texture map, so that the object can show complex lighting effects without
expensive lighting calculations.
Note that not all cards support all features for all the various environment
mapping techniques (such as cubemapping and projected textures). For more
information on environment mapping, cubemapping, and projected textures,
refer to the DirectX SDK documentation.
Path
====
Source: MSSDK\Samples\Multimedia\D3D\EnvMapping\Cerebral_Cortex
Executable: MSSDK\Samples\Multimedia\D3D\Bin
User's Guide
============
The following keys are implemented. The dropdown menus can be used for the
same controls.
<Enter> Starts and stops the scene
<Space> Advances the scene by a small increment
   <F1>         Shows help or available commands
   <F2>         Prompts user to select a new rendering device or display mode
   <Alt+Enter>  Toggles between fullscreen and windowed modes
   <Esc>        Exits the app
Programming Notes
=================
Sphere-mapping uses a texture map, precomputed when the model is authored,
that contains the entire environment as reflected by a chrome sphere. The idea
is to take each vertex, compute its normal, find where that normal matches up
on the chrome sphere, and then assign that texture coordinate to the vertex.
Although the math is not complicated, this still involves computations for
each vertex on every frame. Fortunately, Direct3D has a texture-coordinate
generation feature that can do this for us. The relevant texture-stage state
value is D3DTSS_TCI_CAMERASPACENORMAL, which takes the normal of the vertex
in camera space and pumps it through a texture transform to generate texture
coordinates. We use this and simply set up our texture matrix to do the rest.
In this simple case, the matrix just has to scale and translate the texture
coordinates to get from camera space (-1, +1) to texture space (0, 1).
This sample makes use of common DirectX code (consisting of helper functions,
etc.) that is shared with other samples on the DirectX SDK. All common
headers and source code can be found in the following directory:
Mssdk\Samples\Multimedia\Common