

A real-time flight simulator is a virtual reality program. It presents the user with an environment that mimics, to some degree, the environment of a real airplane. A very expensive flight simulator might rest on a full-motion platform and have a cockpit, controls, and instruments identical to the simulated aircraft's, a realistic view of the scenery, and simulated engine noise. On the other hand, an inexpensive flight simulator running on a PC might display the scenery and instruments on the monitor, and employ the keyboard for the airplane's controls.

The output of the flight simulator is what gives the user the perception that he or she is flying a real aircraft. There are many ways to contribute to the overall experience of flying, including very subtle effects such as cockpit ventilation. This paper covers only the most basic forms of output, namely, drawing the scenery. Reference 15 details some more advanced methods of output.

Wireframe Scenery.

The simplest type of scene is a wireframe scene. A wireframe scene is a scene drawn using only straight line segments. Figure 2 depicts a wireframe scene of Pittsburgh, Pennsylvania, as it might appear from an aircraft.

Figure 2: Wireframe scene of Downtown Pittsburgh, Pennsylvania.

The description of a wireframe scene consists simply of a list of line segments in 3-D space. For example, in a Cartesian coordinate system, a cube could be represented by the following twelve line segments:

(0,0,0)-(1,0,0)   (1,0,0)-(1,1,0)   (1,1,0)-(0,1,0)   (0,1,0)-(0,0,0)
(0,0,1)-(1,0,1)   (1,0,1)-(1,1,1)   (1,1,1)-(0,1,1)   (0,1,1)-(0,0,1)
(0,0,0)-(0,0,1)   (1,0,0)-(1,0,1)   (1,1,0)-(1,1,1)   (0,1,0)-(0,1,1)

Because scenery is (usually) fixed relative to the Earth, the endpoints of the line segments are stored in an Earth-fixed coordinate system.
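A minimal sketch of how such scenery might be stored, assuming a simple array-of-structures layout (the type and variable names here are illustrative, not taken from the simulator's actual code):

```c
/* Each segment stores two endpoints in Earth-fixed Cartesian
   coordinates, matching the cube example above. */
typedef struct { double x, y, z; } Point3;
typedef struct { Point3 a, b; } Segment3;

/* The twelve edges of a unit cube. */
static const Segment3 cube_edges[12] = {
    /* bottom face (z = 0) */
    {{0,0,0},{1,0,0}}, {{1,0,0},{1,1,0}}, {{1,1,0},{0,1,0}}, {{0,1,0},{0,0,0}},
    /* top face (z = 1) */
    {{0,0,1},{1,0,1}}, {{1,0,1},{1,1,1}}, {{1,1,1},{0,1,1}}, {{0,1,1},{0,0,1}},
    /* vertical edges */
    {{0,0,0},{0,0,1}}, {{1,0,0},{1,0,1}}, {{1,1,0},{1,1,1}}, {{0,1,0},{0,1,1}}
};
```

Keeping the endpoints in Earth-fixed coordinates means the scenery database never changes as the aircraft moves; only the transformation applied to it changes each frame.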

To draw the scene, the flight simulator transforms each endpoint in the scene into eye coordinates. The eye coordinate system is a Cartesian coordinate system based on the pilot's view. The eye axes originate approximately at the pilot's eyes. The x-axis points along the pilot's line of sight, the y-axis points to the right as seen by the pilot, and the z-axis points down as seen by the pilot.

After the endpoints have been transformed to eye coordinates, the flight simulator calculates their projections onto the 2-D plane of the screen. Figure 3 illustrates this. The location at which to draw the object on the screen is where the line of sight intersects the plane of the screen. Given the distance from the eyes to the screen (xS), and the eye coordinates of a point (xE, yE, zE), the screen coordinates can be obtained using similar triangles. (The screen coordinate system is the 2-D coordinate system of the physical plane of the screen, originating at the screen's center, with the y-axis pointing to the right and the z-axis pointing down.) Equations 97-98 list the formulas for the projection.

yS = xS yE / xE (97)
zS = xS zE / xE (98)

The projection yields the screen coordinates of the endpoint. The screen coordinates are actual lengths. Thus, if an endpoint's zS-coordinate is -1 inch, and its yS-coordinate is zero, the flight simulator draws the endpoint one inch above the screen's center.
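Equations 97-98 translate directly into code. The sketch below is one possible implementation (the function and parameter names are assumptions, not the paper's own):

```c
#include <stdbool.h>

/* Perspective projection per Equations 97-98: given the eye-to-screen
   distance xs and a point (xE, yE, zE) in eye coordinates, compute
   the screen coordinates (yS, zS). Returns false when the point lies
   at or behind the eye (xE <= 0), where the projection is undefined;
   such points must be handled by clipping instead. */
bool project_point(double xs, double xE, double yE, double zE,
                   double *yS, double *zS)
{
    if (xE <= 0.0)
        return false;        /* behind the viewer: cannot project */
    *yS = xs * yE / xE;      /* Eq. 97 */
    *zS = xs * zE / xE;      /* Eq. 98 */
    return true;
}
```

Note the division by xE: points twice as far along the line of sight project to screen coordinates half as large, which is exactly the similar-triangles relationship of Figure 3.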

Figure 3: Schematic of a projection. The screen coordinate, yS, is determined by similar triangles.

So, to draw the wireframe scene, the simulator transforms the endpoints of every line segment into screen coordinates, and then draws a straight, 2-D line segment on the screen connecting the endpoints of each segment.


There is one problem, however. A point in eye coordinates is not always within the field of view. Projecting such a point would lead to an incorrectly rendered image. Therefore, the simulator must draw only the part of each line segment that lies within the field of view. Before it transforms an endpoint into screen coordinates, the simulator must detect when the endpoint is not visible, and use the endpoint of the visible part of the segment instead, if there is one. This process is called clipping. To draw the scene correctly, the flight simulator must clip every line segment to the visible region.
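As a simplified illustration, consider clipping against just one boundary of the visible region: a near plane a small distance in front of the eye, xE = NEAR, which also guards the division by xE in the projection. The full clipper would repeat this test for each plane bounding the field of view. The names and the NEAR value below are assumptions for the sketch:

```c
#include <stdbool.h>

#define NEAR 0.01  /* near clip plane distance, an arbitrary choice */

typedef struct { double x, y, z; } Vec3;

/* Clip the segment (p, q), given in eye coordinates, against the
   plane x = NEAR. Returns false if the whole segment is hidden;
   otherwise *p and *q are replaced by the visible portion. */
bool clip_near(Vec3 *p, Vec3 *q)
{
    if (p->x < NEAR && q->x < NEAR)
        return false;                 /* entirely invisible */
    if (p->x >= NEAR && q->x >= NEAR)
        return true;                  /* entirely visible */
    /* The segment crosses the plane: find the intersection point. */
    double t = (NEAR - p->x) / (q->x - p->x);
    Vec3 m = { NEAR,
               p->y + t * (q->y - p->y),
               p->z + t * (q->z - p->z) };
    if (p->x < NEAR) *p = m; else *q = m;  /* replace hidden endpoint */
    return true;
}
```

The intersection parameter t comes from linear interpolation along the segment; the same interpolation applies to the y and z coordinates, so the clipped endpoint still lies on the original line.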

Polygon Scenery.

Wireframe scenery is not realistic. It is bland and colorless, and surfaces do not hide objects behind them. Using polygons instead of line segments increases the realism of the scenery. Figure 4 depicts a polygon scene of Penn State.

Figure 4: Polygon scene of Penn State's University Park Campus, State College, Pennsylvania.

Drawing polygons is not much different from drawing line segments. (In fact, one can think of a line segment as a degenerate polygon with two vertices.) A polygon is described by a sequence of 3-D points, which represent its vertices. For example, in Cartesian coordinates, a unit square might be described by the four vertices (0,0,0), (1,0,0), (1,1,0), and (0,1,0).

To draw the polygon, the simulator transforms these points into screen coordinates, and then draws the polygon described by the screen coordinates, filling its interior.

The most important difficulty arising from polygon scenery is how to draw the scene so that distant polygons do not obscure close ones. One algorithm that accomplishes this is Painter's Algorithm, which draws the most distant polygons first, and the closer polygons afterward. Thus, the closer polygons obscure the more distant ones.
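The ordering step of the Painter's Algorithm can be sketched as a sort on the polygons' distances from the eye, farthest first. The Polygon type here is a stand-in with only a depth field; a real scenery polygon would carry its vertices as well, and how the depth of a polygon is measured (e.g. its nearest vertex or its centroid) is a separate design decision:

```c
#include <stdlib.h>

typedef struct { double depth; /* distance from the eye */ } Polygon;

/* Comparator for qsort: order polygons by descending depth, so the
   farthest polygon comes first and is drawn (and overdrawn) first. */
static int farther_first(const void *a, const void *b)
{
    double da = ((const Polygon *)a)->depth;
    double db = ((const Polygon *)b)->depth;
    return (da < db) - (da > db);
}

/* Sort the polygon list into back-to-front drawing order. */
void painters_sort(Polygon *polys, size_t n)
{
    qsort(polys, n, sizeof *polys, farther_first);
}
```

After sorting, the simulator simply draws the polygons in array order; each nearer polygon is filled on top of the farther ones, which produces the correct occlusion without any per-pixel depth test.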


Textured Scenery.

Monocolor polygons, although much better than wireframes, are still not very realistic. Textures can give the scene structure at scales much too small to model efficiently as polygons. Figure 5 depicts a textured polygon scene of Las Vegas, Nevada, taken from Microsoft(R) Flight Simulator 98. In the scene, the windows on the buildings are not polygons, but part of the buildings' surface textures.

Figure 5: Textured polygon scene of Las Vegas, Nevada. (From Microsoft(R) Flight Simulator 98)

Carl Banks