The Future of Aviation

Reprint from the October 2004 issue of Avionics News magazine.

Paul Novacek, M.A.S., MCFI
Former NASA Human Factors Researcher

What does our future hold? A question that has plagued humankind from the start. Some people worry about the future and bet on it; others just let the future happen and are fascinated by what transpires. Then there are the few who create the future: the dreamers and visionaries who share a common goal of making life better for all. They say the best way to predict the future is to create it. Aviation has always been at the forefront of innovation, and the dreamers abound.

And no field within aviation has been more innovative of late than avionics. Spurred by the space race and fueled by the computer revolution, the avionics world is exciting indeed. The near future is easy to predict: large displays will proliferate in panels worldwide, and central computers will control every aspect of the aircraft. But there is still much to accomplish.

The futuristic Super-Headset will feature sunglasses that display a 3-D image of the terrain, traffic, weather and navigation cues. You'll be able to see right through the clouds and aircraft structure.

Display and database technology has spurred all-in-one units such as the Garmin G1000 and Avidyne Entegra for the light aircraft market, and even larger display systems are in the works for the corporate fleet. Graphical FMSs will become the norm in the near future, and simple synthetic display systems will be certified soon. Taxiway guidance and runway incursion systems will show a pilot hazards on the ground. Sensitive electronic sensors will let us see the runway or turbulence through the clouds, and in-flight weather pictures are already beaming into our cockpits. Avionics are changing rapidly, and there's plenty of work for all.

But what's after next? I mean what's really next, and still in the minds of dreamers. Well, plenty. Hollywood has a good handle on the future. There have been plenty of instances of a high government official starting the research and allocating funds for an idea born of a sci-fi movie. But other ideas are simply born of evolving technologies.

Converging Technologies
Computing power and speed have evolved so rapidly that it’s difficult to keep up learning about this stuff, much less trying to implement it. I once heard that if the automotive world made as much progress as the computer industry, we’d have cars that would go Mach 12 and cost about 10 bucks. Speed freaks could only wish.

We're in an age where, if you can imagine it, you can probably do it, given a little time and money. Sometimes those dreams need to wait for technology to catch up. Aviation is strewn with failed attempts because the existing technology could not support them. Case in point: the Northrop Flying Wing. Designed in the 1930s and built as a bomber in the 1940s, it ultimately did not succeed on the technical front because the slightly unstable platform wandered too much during a bomb run. But the Wing was highly efficient and stealthy. The ultimate success of the flying wing, the stealth B-2 Spirit, needed to wait for computers to solve the stability problem. Hence, the technology of the flying wing and the technology of computerized fly-by-wire controls merged in modern times to create a beautiful flying machine. The same holds for the future. There are plenty of far-out ideas that just need increased computing power to make them possible. Synthetic vision is one of those technologies where the concept must wait for off-the-shelf components capable of the task.

Synthetic Vision
NASA is working with the industry to help create Synthetic Vision, a kind of virtual reality display system for cockpits. These new technologies offer pilots a clear, electronic picture of what’s ahead, no matter what the weather or time of day. Super-accurate terrain databases and graphical displays can draw three-dimensional moving scenes that will show pilots exactly what’s outside. The media has reported for many years the coming of Synthetic Vision and there has been some progress with certified units, but true synthetic vision with the resolution necessary to provide accurate depth cues still needs a very large and very fast computer that currently fills the entire baggage area of a small plane. We’re close though, and processing power will soon be capable enough to display high-resolution images at airplane speeds. One of the biggest issues is the accuracy of the electronic terrain database and accurate positioning. Radar and infrared sensors are currently being developed to verify positional accuracy by mapping the actual terrain and comparing it to the database terrain. The displays would not only show terrain, but ground obstacles, air traffic, landing and approach patterns, runway surfaces and other relevant information.

Once these technologies are mature, we may see the death of the instrument rating (OK, you FAA guys, stop throwing things at me now). If we can "see" through the clouds with Synthetic Vision and avoid hitting anybody else, why would we need a special rating to fly through the clouds? Truly VFR in IFR. But that dream is a long way off, mainly because of the infrastructure that will need to accommodate free-flying aircraft. Still, the technology is within our grasp, and we just have to work out the details, which is what our universities, government research houses and avionics manufacturers are currently trying to accomplish.

The Small Aircraft Transportation System (SATS) is a partnership among various organizations, including NASA, the FAA, the U.S. aviation industry, state and local aviation officials and universities. The partners intend to relieve the nation's current problems of highway gridlock and airport delays. At costs equivalent to the highway system's, SATS aims to cut transportation times to more communities by half in ten years and by two-thirds in twenty-five years.

Previous NASA research programs paved the way for SATS. These programs explored the feasibility of advanced technologies for the aviation fleet. Also called enabling technologies, they provide the framework that allows for demonstrations in the SATS program. Some of these technologies include Highway-In-The-Sky (HITS) graphical pilot guidance systems, new turbine engine technology with revolutionary thrust-to-weight ratios, streamlined composite airframe manufacturing techniques, ice protection technology, autoland capability and improved crashworthiness. We don’t need flying cars, just faster, easier to fly airplanes, and improved facilities for reduced turn-around time.

Display Technology
Visual display systems are constantly evolving, sometimes for the better, sometimes for the worse. It started with mechanical displays, which are, for the most part, still very much with us. The boom of the electronic age brought cathode ray tubes (TV screens) to the aviation world in the '70s with EFIS and color radar displays. The next display medium to burst onto the scene was the venerable liquid crystal display (LCD), which has slowly improved over the last 15 years but hasn't quite approached the fidelity, saturation or contrast range of a good CRT.

Designed for laptop computers, the LCD and all its iterations have been only a stop-gap measure until the next display system comes along. The LCD has numerous human-interface problems, but because of its relatively small profile and mass production, it has been used extensively in aircraft displays where the human interface is of secondary concern. An LCD still hasn't come anywhere close to the performance of a good Trinitron picture tube. LCD resolution is limited by its pixelated construction, and the panels are difficult to view in bright light or from an angle. But as a society, we have accepted the LCD as the best available technology for current applications.

The next big breakthrough in the display area will be stereoscopic head-mounted displays. The current crop of head-up displays (HUDs) is terrific for superimposing computer-generated guidance information directly in the pilot's forward view. As cool as these are, it can get much better. One of the big limitations of current HUDs is their inability to show any depth; they depict just a two-dimensional view. A whole new world could open up if the pilot could view 3-D projections. Imagine what could be displayed.

We see the world in three dimensions, which is what enables us to judge distance and avoid walking into walls. Because we have two eyes separated by a few inches, we observe two slightly different images; our brain measures the shift between them to determine distance. In essence, stereoscopic vision. Now, if we place a separate display in front of each eye, we can artificially generate a 3-D image. Remember those View-Masters? Same principle.
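The geometry behind that image shift can be sketched in a few lines. This is a simplified illustration, not any particular avionics implementation; the eye separation and focal length values are assumptions chosen for the example:

```python
# Stereoscopic depth from image shift (disparity) -- a simplified sketch.
# depth = (eye_separation * focal_length) / disparity
# All names and values here are illustrative, not from any real system.

def depth_from_disparity(eye_separation_m, focal_length_px, disparity_px):
    """Estimate distance to an object from the shift between two views."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or behind the viewer")
    return (eye_separation_m * focal_length_px) / disparity_px

# A point that shifts 4 pixels between the two views, with a 0.065 m
# baseline (roughly human eye separation) and an 800 px focal length:
distance = depth_from_disparity(0.065, 800, 4)
print(round(distance, 1))  # 13.0 (meters)
```

Note the inverse relationship: the smaller the shift, the farther the object, which is why depth cues fade for distant terrain.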

This technique is already being used in 3-D IMAX theaters. You wear a visor that electronically modulates what you see out of each eye, masking each eye to match images on the screen and thus fooling your brain into thinking there is depth. Quite a breakthrough, one that could open up endless possibilities in future display systems. Applying this to the piloting task, imagine wearing a personal piece of headgear, or Super-Headset, that combines traditional noise-canceling earphones for sound with tricked-out sunglasses that are actually tiny translucent display systems. Not only do they provide tinting against outside glare, but they can also display terrain, navigation guidance, traffic and a three-dimensional rendition of the weather. As your head moves, the display images move to keep the symbology superimposed over the outside world. You could look down at the floor of the cockpit and see a virtual rendition of the ground. It would be like seeing through metal.

Current weather displays do not give a pilot a graphical presentation of a storm's vertical dimension, leading to vertical-interpretation errors. A 3-D depiction of a thunderstorm will give the pilot a picture that allows either a horizontal or a vertical deviation. Oh, and the Super-Headsets could also automatically tint just the areas of the glasses looking outside the window. No more waiting for your eyes to adjust to a dark cockpit after looking outside.

Why stop at stereoscopic visual displays? Why not also apply this to audio? Your Super-Headset is also driven by a system that can sound an audio alarm selectively in each earpiece. So if the right engine fails, the warning sound is heard only in the right earpiece, directing your attention to the correct engine and saving precious seconds, or preventing the blunder of securing the wrong engine. Just like our vision, our hearing is directional; the phase or time difference heard between our ears lets the brain determine direction. Imagine one of your passengers behind you, on the opposite side of the airplane, starting to talk over the intercom. That voice could be fed to each ear separately and out of phase, so you could tell which direction it was coming from. These stereoscopic and directional audio techniques are already used in military fighters to warn of incoming missiles, greatly adding to the pilot's situational awareness by quickly conveying the direction of the threat.
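The time-difference cue the paragraph describes can be modeled simply. This is a rough sketch of the interaural time difference (ITD), not real headset signal processing, which uses far more sophisticated head-related transfer functions; the head-width constant is an illustrative assumption:

```python
import math

# Interaural time difference (ITD): the extra time sound takes to reach
# the far ear. A simplified geometric model for illustration only.
SPEED_OF_SOUND = 343.0   # m/s at sea level
EAR_SEPARATION = 0.21    # meters, approximate human head width (assumed)

def itd_for_angle(angle_deg):
    """Time difference (seconds) for a source angle_deg off dead center."""
    return EAR_SEPARATION * math.sin(math.radians(angle_deg)) / SPEED_OF_SOUND

def angle_from_itd(itd_s):
    """Invert the model: recover the source angle from a measured ITD."""
    return math.degrees(math.asin(itd_s * SPEED_OF_SOUND / EAR_SEPARATION))

# A voice 30 degrees to the right arrives about 0.3 ms earlier at the
# right ear -- enough for the brain (or a synthesizer) to cue direction.
delay = itd_for_angle(30)
print(round(delay * 1000, 2), "ms")  # 0.31 ms
print(round(angle_from_itd(delay)))  # 30
```

A synthetic directional warning works the model in reverse: pick the angle you want the pilot to perceive, then delay one earpiece's signal by the corresponding ITD.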

To make this Super-Headset possible, a few technologies still need to converge. First, a display must be created that allows you to see through it yet still has high enough contrast to render symbology. It must be small and light and employ an incredibly high-resolution projector. Second, work needs to continue on head-tracking sensors to allow superimposing the symbology onto the real world as you look through the glasses. Third, the computing power must track head movements and keep up with the rapid display changes so the graphics do not lag behind. That computing power already exists, but not yet at a reasonable price.
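The head-tracking step above boils down to redrawing each symbol at its world bearing minus the head's current heading, so the symbol stays pinned to the outside world as the head turns. A minimal 2-D sketch, assuming azimuth only; a real system would track full six-degree-of-freedom head pose at high update rates:

```python
# Keeping symbology "world-fixed" on a head-mounted display: as the head
# turns, redraw each symbol at (world bearing - head heading), wrapped
# into the -180..180 degree range. Illustrative 2-D model only.

def screen_azimuth(target_bearing_deg, head_heading_deg):
    """Degrees from display center at which to draw a world-fixed symbol."""
    return (target_bearing_deg - head_heading_deg + 180) % 360 - 180

# A runway symbol at bearing 090; the pilot's head swings from 070 to 100:
print(screen_azimuth(90, 70))   # 20  -> symbol drawn right of center
print(screen_azimuth(90, 100))  # -10 -> symbol drawn left of center
```

The lag problem the paragraph mentions lives exactly here: this function must be re-evaluated and the frame redrawn faster than the head moves, or the symbology visibly swims against the real world.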

Communications
Research and development is already underway for the next communication system. Called NEXCOM, it is an analog/digital system incorporating the latest technologies in RF communications. The system combines the virtues of voice communication with digital text datalink to accommodate additional sectors and services, reduce logistical costs, provide datalink communications capability, replace expensive-to-maintain VHF and UHF radios, reduce interference and provide security mechanisms. It will take text messaging to the next level.

Digital text communication is already in use with systems such as ACARS and satellite datalink, but NEXCOM will combine many of these systems into a large network available to all aircraft. Frequency hopping across the country will be eliminated because the next channel will be automatically programmed and switched. Routine communication, such as clearances, will use text datalink, eliminating readback errors and congestion. Voice communication will remain an integral part of the system but will generally be used only for non-routine communication.

The NEXCOM system will be slow to implement because of the cost of new radios and control systems for both the ground and air elements. IFR traffic will have the most to gain from a new digital communication system, so expect to see it first implemented with the airliners and corporate iron. When completed, it’s estimated that over 46,000 radios will be installed throughout the FAA system. The NEXCOM system will enhance the FAA's ability to meet expanding air traffic control communication demands.

A demonstration of NEXCOM capability has already been flown on FAA test aircraft using the FAA's prototype ground station at the FAA Technical Center in Atlantic City. All ground station modes were demonstrated during the flight, including urgent downlink request, next-channel uplink, controller override, and digital voice using 2V2D mode. Expect to see the first series of NEXCOM systems available in three to five years. Although the rollout is reminiscent of the Mode S transponders, the ground stations will take much longer to install and will provide added services. In the meantime, there's much to learn and develop with the pilot and controller interfaces.

Flight Control
Flight controls will change drastically, too. We'll still turn the control left to go left, but fly-by-wire controllers will not allow exceedances that place the aircraft in a precarious situation. Current fly-by-wire systems already achieve this in airliners and, to a limited extent, corporate jets, but the technology will definitely trickle down into all aircraft, including small general aviation airplanes. NASA and the industry are working hard toward this goal.

For instance, in the future we'll probably lose the rudder controls. Not the rudder, just the manual control of it. Fly-by-wire controls and computers will keep the turns coordinated during cruise flight. Castering main wheels that automatically center will take care of crosswind landings, and the yoke, or stick, can provide the input to steer the nosewheel during taxi. Combined with a single "GO" lever to control the brakes, the rudder pedals can be eliminated. Yeah, more legroom for us tall ones!

With computer controls for piston and light turbine engines entering the market, using a single power lever to control just the engine is an inefficient use of cockpit resources. The single power lever could also control the various drag devices on the airplane. This single control has the potential to become a true thrust/drag lever, commanding not only engine thrust but also spoilers and flaps. The potential of such a device is far-reaching: it could reduce the training burden and contribute to a reduction in workload.

A “GO” lever will combine all of the thrust functions into a single control. Imagine the throttle, flaps, airbrakes and wheel brakes all controlled by a single, console-mounted lever. This single control programs the flight computer to configure the aircraft to accomplish the given task.
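One way such a lever could work is as a single axis that the flight computer maps to a whole configuration. The following sketch is entirely hypothetical; the lever range, thresholds and settings are invented for illustration and come from no real aircraft:

```python
# A hypothetical mapping from a single "GO" lever position to an aircraft
# configuration. Positions run from -1.0 (full braking) through 0.0 (idle)
# to +1.0 (takeoff power). All thresholds and values are illustrative.

def go_lever_config(position):
    """Return (throttle %, flaps, spoilers, wheel_brakes 0..1) for a setting."""
    if position < 0:
        # Aft travel: engine at idle, braking proportional to deflection
        return (0, "up", "deployed", abs(position))
    if position < 0.3:
        # Low forward settings: taxi power, drag devices stowed
        return (int(position / 0.3 * 20), "up", "stowed", 0.0)
    # High forward settings: climb/takeoff power, flaps set for takeoff
    return (int(20 + (position - 0.3) / 0.7 * 80), "takeoff", "stowed", 0.0)

print(go_lever_config(-0.5))  # (0, 'up', 'deployed', 0.5)
print(go_lever_config(1.0))   # (100, 'takeoff', 'stowed', 0.0)
```

The pilot states an intent (go faster, go slower, stop) through one axis, and the computer resolves it into throttle, flap, spoiler and brake commands, which is the workload reduction the paragraph describes.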

The sidestick controller would also handle nosewheel steering. The computer would smoothly transition from moving the ailerons and rudder while flying, to a blend of flight controls and nosewheel steering for high-speed taxiing, to nosewheel steering and differential braking for slow taxiing, thereby eliminating the need for separate differential braking and pedal nosewheel steering. Additionally, the spoilers could be kept deployed while taxiing, reducing the need to position the flight controls to counteract high winds.

The far aft travel of the GO lever will apply the brakes equally, just like the hand brake in a glider or car. So when maneuvering into a tight parking spot, the pilot turns the aircraft with the sidestick controller and slows the forward movement with the GO lever. A lateral locking movement at the far aft position could set the parking brake.

All of these concepts are within our grasp, but it will take a conscious effort by designers, financiers and regulatory bodies to achieve them.

Stay tuned — the future is exciting!