Minimizing Input-to-Output Latency in Virtual Environment
TBMG-5655
09/01/2009
- Content
A method and apparatus were developed to minimize latency (time delay) in virtual environment (VE) and other discrete-time computer-based systems that must render displays in real time in response to sensor inputs. Latency in such systems is the sum of the finite times required for information processing and communication within and between sensors, software, and displays. Even though the latency intrinsic to each individual hardware, software, and communication component can be minimized (or theoretically eliminated) by speeding up internal computation and transmission, time delays due to the integration of the overall system persist.

These "integration" delays arise when data produced or processed by earlier components or stages in a system pathway sit idle, waiting to be accessed by subsequent components. Such idle times can be sizeable compared with the latency of individual system components, and can also vary in duration because of insufficient synchrony between events in the data path. This development is intended specifically to reduce the magnitude and variability of idle-time delays and thus enable the minimization and stabilization of overall latency in the complete VE (or other computer) system.
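To make the idle-time effect concrete, the following is a minimal sketch (not the developed method itself; all names and timing values are illustrative assumptions). It models a display loop that only samples sensor data at fixed frame boundaries: a sample arriving just after a frame tick sits idle for nearly a full frame period, so the added latency is both large and variable depending on arrival phase. A display triggered directly by data arrival would incur no such idle wait.

```python
# Illustrative sketch of "integration" (idle-time) latency: a consumer
# that samples producer output only at fixed frame boundaries adds a
# variable idle delay, even if every component is individually fast.
# FRAME_PERIOD_MS and the arrival times are assumed example values.

FRAME_PERIOD_MS = 16.0  # hypothetical display refresh period (~60 Hz)

def idle_delay_unsynchronized(arrival_ms: float) -> float:
    """Idle time when data must wait for the next frame boundary."""
    frames_elapsed = int(arrival_ms // FRAME_PERIOD_MS)
    next_frame_ms = (frames_elapsed + 1) * FRAME_PERIOD_MS
    return next_frame_ms - arrival_ms  # time the data sits unused

def idle_delay_synchronized(arrival_ms: float) -> float:
    """Idle time when the display update is triggered by data arrival."""
    return 0.0  # data is consumed as soon as it is produced

if __name__ == "__main__":
    # Samples arriving at different phases of the frame cycle incur
    # widely varying idle delays, from near zero to almost a full frame.
    for t in (1.0, 7.5, 15.9, 20.0):
        print(f"arrival {t:5.1f} ms -> idle "
              f"{idle_delay_unsynchronized(t):4.1f} ms (unsynchronized), "
              f"{idle_delay_synchronized(t):.1f} ms (synchronized)")
```

The sketch shows why the development targets synchrony rather than raw component speed: the unsynchronized idle delay here ranges from 0.1 ms to 15.0 ms for the same pipeline, purely as a function of arrival phase.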
- Citation
- "Minimizing Input-to-Output Latency in Virtual Environment," Mobility Engineering, September 1, 2009.