A real-time flight simulator is a virtual reality program. It presents the user with an environment that mimics, to some degree, the environment of a real airplane. A very expensive flight simulator might rest on a full-motion platform and have a cockpit identical to that of the simulated aircraft, with identical controls and instruments, a realistic view of the scenery, and simulated engine noise. At the other extreme, an inexpensive flight simulator running on a PC might display the scenery and instruments on the monitor and employ the keyboard for airplane controls.

The output of the flight simulator is what gives the user the perception that he or she is flying a real aircraft. There are many ways to contribute to the overall experience of flying, including very subtle effects such as cockpit ventilation. This paper covers only the most basic forms of output, namely, drawing the scenery. Reference 15 details some more advanced methods of output.

The description of a wireframe scene consists simply of a list of line
segments in 3-D space. For example, in a Cartesian coordinate system,
a cube could be represented by the following twelve line segments:

(0,0,0)-(1,0,0), (0,0,0)-(0,1,0), (0,0,0)-(0,0,1),
(1,1,0)-(1,0,0), (1,1,0)-(0,1,0), (1,1,0)-(1,1,1),
(1,0,1)-(1,0,0), (1,0,1)-(0,0,1), (1,0,1)-(1,1,1),
(0,1,1)-(0,1,0), (0,1,1)-(0,0,1), (0,1,1)-(1,1,1)

Because scenery is (usually) fixed relative to the Earth, the endpoints of the line segments are stored in an Earth-fixed coordinate system.

To draw the scene, the flight simulator transforms each endpoint in
the scene into eye coordinates. The eye coordinate system is a
Cartesian coordinate system based on the pilot's view. The eye axes
originate approximately at the pilot's eyes. The *x*-axis points
along the pilot's line of sight, the *y*-axis points to the right as
seen by the pilot, and the *z*-axis points down as seen by the pilot.
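
As a sketch, and assuming the rotation from Earth axes to eye axes is already available as a matrix *R* (its construction from the aircraft's attitude is not shown here; the function name `to_eye` is illustrative), the transformation of one endpoint might look like:

```python
import numpy as np

def to_eye(point_earth, eye_pos, R):
    """Transform an Earth-fixed point into eye coordinates.

    point_earth : (3,) Earth-fixed coordinates of the point
    eye_pos     : (3,) Earth-fixed location of the pilot's eyes
    R           : (3, 3) rotation matrix from Earth axes to eye axes
                  (rows are the eye x-, y-, z-axes expressed in Earth axes)
    """
    return R @ (np.asarray(point_earth) - np.asarray(eye_pos))

# Degenerate case for illustration: the eye axes coincide with the
# Earth axes, so the point is unchanged apart from the translation.
R = np.eye(3)
print(to_eye((10.0, 2.0, -1.0), eye_pos=(0.0, 0.0, 0.0), R=R))  # [10.  2. -1.]
```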

After the endpoints have been transformed to eye coordinates, the
flight simulator calculates their projections onto the 2-D plane of
the screen. Figure 3 illustrates this. The
location at which to draw the object on the screen is where the line
of sight intersects the plane of the screen. Given the distance from
the eyes to the screen (*x*_{s}), and eye coordinates of a point
(*x*^{E}, *y*^{E}, *z*^{E}), the screen coordinates can be obtained using similar
triangles. (The screen coordinate system is the 2-D coordinate system
of the physical plane of the screen, originating at the screen's
center, with the *y*-axis pointing to the right, and the *z*-axis
pointing down.) Equations 97-98 list the formulas
for the projection.
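
Assuming Equations 97-98 take the usual similar-triangles form, a minimal sketch of the projection (the function name `project` is illustrative) is:

```python
def project(x_e, y_e, z_e, x_s):
    """Project a point in eye coordinates onto the screen plane by
    similar triangles.  x_s is the distance from the eyes to the screen;
    the point must lie in front of the eyes (x_e > 0).  Returns the
    screen coordinates (y_screen, z_screen), which are actual lengths."""
    y_screen = x_s * y_e / x_e
    z_screen = x_s * z_e / x_e
    return y_screen, z_screen

# A point 10 m ahead, 2 m right, and 1 m below the line of sight,
# viewed on a screen 0.5 m from the eyes:
print(project(10.0, 2.0, 1.0, 0.5))  # (0.1, 0.05)
```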

The projection yields the screen coordinates of the endpoint. The screen coordinates are actual lengths on the physical screen. Thus, if an endpoint's *y*_{s} is 0.1 m, the endpoint lies 0.1 m to the right of the screen's center.

So, to draw the wireframe scene, the simulator transforms the endpoints of every line segment into screen coordinates, and then draws a straight, 2-D line segment on the screen connecting the endpoints of each segment.
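 
The segment loop can be sketched as follows, assuming the endpoints have already been transformed into eye coordinates; the actual 2-D line drawing would be handed off to the graphics library:

```python
def wireframe_to_screen(segments_eye, x_s):
    """Project line segments, given as pairs of (x, y, z) eye-coordinate
    endpoints, into the 2-D screen segments to be drawn.  x_s is the
    eye-to-screen distance."""
    screen_segments = []
    for p0, p1 in segments_eye:
        proj = []
        for x, y, z in (p0, p1):
            proj.append((x_s * y / x, x_s * z / x))  # similar triangles
        screen_segments.append(tuple(proj))
    return screen_segments

# One segment running left to right, 10 m ahead of the pilot:
segs = [((10.0, -1.0, 0.0), (10.0, 1.0, 0.0))]
print(wireframe_to_screen(segs, 0.5))  # [((-0.05, 0.0), (0.05, 0.0))]
```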

Drawing polygons is not much different from drawing line segments.
(In fact, one can think of a line segment as a degenerate polygon with
two vertices.) A polygon is described by a sequence of 3-D points,
which represent the vertices. For example, in Cartesian coordinates,
a square might be described this way:

(0,0,0)-(1,0,0)-(1,1,0)-(0,1,0)

To draw the polygon, the simulator transforms these points into screen coordinates, and then draws the polygon described by the screen coordinates, filling its interior.
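
The vertex projection for a polygon can be sketched the same way, again assuming the vertices are already in eye coordinates and that the fill itself is performed by the graphics library:

```python
def project_polygon(vertices_eye, x_s):
    """Project a polygon's 3-D eye-coordinate vertices into the 2-D
    screen coordinates through which the filled polygon is drawn."""
    return [(x_s * y / x, x_s * z / x) for x, y, z in vertices_eye]

# The square from the text, placed 5 m ahead of the pilot (its
# Earth-to-eye transformation is assumed to have been applied already):
square = [(5.0, 0.0, 0.0), (5.0, 1.0, 0.0), (5.0, 1.0, 1.0), (5.0, 0.0, 1.0)]
print(project_polygon(square, 0.5))
# [(0.0, 0.0), (0.1, 0.0), (0.1, 0.1), (0.0, 0.1)]
```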

The most important difficulty arising from polygon scenery is drawing the scene so that distant polygons do not obscure close ones. One algorithm that accomplishes this is the Painter's Algorithm, which draws the most distant polygons first and the closer polygons afterward. Thus, the closer polygons obscure the more distant ones.
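
The ordering step can be sketched as below, using the mean eye-space *x* of each polygon's vertices as an approximate distance; this is one common (if imperfect) choice of sort key, not one specified here:

```python
def painters_order(polygons_eye):
    """Sort polygons for the Painter's Algorithm: most distant first.

    Each polygon is a list of (x, y, z) eye-coordinate vertices.
    Distance is approximated by the mean eye-space x of the vertices."""
    def depth(poly):
        return sum(v[0] for v in poly) / len(poly)
    return sorted(polygons_eye, key=depth, reverse=True)

near = [(2.0, 0.0, 0.0), (2.0, 1.0, 0.0), (2.0, 1.0, 1.0)]
far  = [(9.0, 0.0, 0.0), (9.0, 1.0, 0.0), (9.0, 1.0, 1.0)]
for poly in painters_order([near, far]):
    print(poly[0][0])  # prints 9.0 then 2.0: far polygons are drawn first
```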