Open-Source-based Visualization of Flight Waypoint Tracking Using Flight Manipulation System

Myeong-Chul Park a, Hyeon-Gab Shin b, Yong Ho Moon b, Seok-Wun Ha b*

a Dept. of Biomedical Electronics, Songho College, Namsanri, HoengseongEup, HoengseongGun 225-704, Republic of Korea
b, b* Dept. of Informatics, Gyeongsang National University, 900 Gajwadong, Jinju 660-701, Republic of Korea
a africa@songho.ac.kr, b hkshin2@gmail.com, yhmoon5@gnu.ac.kr, b* swha@gnu.ac.kr (* corresponding author)

Abstract

Flight waypoint visualization is widely used to counter the threats posed by low-altitude missions and terrain elevation. Such systems are difficult to implement, however, because of the requirement to store GPS data and a huge amount of geographic information. In this paper, an open-source-based moving map system for economical visualization of flight waypoints is proposed. First, simulated flight path information transferred over UDP from a Flight Manipulation System interlocked with X-Plane is acquired; from this, terrain altitude information is obtained and simultaneously displayed on Google Earth. In the proposed moving map system, flight waypoints are visualized on map data downloaded from a map server by mapping the aircraft's present altitude against the present terrain altitude. A monitoring screen is also provided that marks terrain locations on the projected course where a danger of collision exists, by comparing the terrain altitude with the aircraft's altitude at regular time intervals. The results of this work could serve as an economical tool in the fields of flight waypoint visualization and flight algorithm research.

Keywords: Flight Waypoint, Visualization, Manipulation, Open Source, Moving Map.

1. Introduction
Flight visualization has been widely used to cope more effectively with various threats from the sky and with sudden situational changes [1], [2]. Recently, studies have been proposed on the automatic generation of waypoints that considers terrain altitude information and on waypoint generation that avoids multiple threats using optimization methods [3], [4]. These studies, however, provide only a path or a set of waypoints for the aircraft. To present the generated waypoints to the user, a moving map system must be built on map and terrain information; it is then used as a tool to guide the flight path. Most moving map systems, though, require databases holding vast amounts of geographical and terrain information, which makes them hard to apply in systems intended for games or research. In this paper, we propose a visualization tool that tracks waypoints effectively by building an economical moving map system with the use of various software based on
open sources. Because it is difficult to extract flight data from an aircraft in real flight, the data for visualization is obtained from a Flight Manipulation System designed as an external flight system interlocked with X-Plane. The simulated flight data is transferred from the Flight Manipulation System using the UDP protocol. The position data obtained from the Flight Manipulation System is displayed on Google Earth, and the terrain data corresponding to that position is transferred to the moving map system. Using the altitude and orientation data of the aircraft, an estimated vertical ascending or descending speed rate is calculated, and from it the terrain altitude along the direction of travel is extracted. Finally, all the flight data is displayed on the moving map through visualization modules built on the OpenGL libraries, with the map information transferred in real time from the MapServer of ArcGIS [5]. The proposed system incurs a time delay because the map information is transferred through a Web database, but this does not seriously affect a simulation-level system demanding real-time operation, since the system can be implemented on a multi-core platform.

2. Background

Song, J. and others [6] presented a system for flight path visualization that uses the state transition information of a track without any terrain information [6]. Figure 1 shows this flight path visualization system. The system is limited in that it lacks realism and simply represents the location information of the aircraft, because it merely displays the flight path.

Figure 1. Flight Path Visualization System by Song, J. and others

Park, M. and Hur, H. [1] introduced a visualization system in which the flight information is visualized on Google Earth, but it presents no path information beyond the flight condition and the instrument information. Also, Park, S. and Park, M.
[2] presented a three-dimensional visualization tool for recognizing the flight situation. This tool represents a simple moving path but does not warn of threat factors derived from terrain altitude information. In short, the existing flight information visualization tools provide no data worth referring to during flight beyond a simple visualization of the flight results.
3. Visualization System

Figure 2 shows the overall structure of the proposed waypoint tracking visualization system. The structure is composed of the Flight Manipulation System interlocked with X-Plane, the Visualization and UDP Engines, and the Visualization Module, which includes the Map View on which the waypoint tracking is represented.

Figure 2. Architecture of the Visualization System

In this section we design the Flight Manipulation System, the Visualization/UDP Engines, and the Visualization Module including the Map View.

3.1 Flight Manipulation System

Figure 3 shows a visual view of the Flight Manipulation System. Its role is to generate flight data such as pitch, roll, heading, and speed by operating an external flight control device, to receive flight data such as latitude, longitude, altitude, and vertical speed rate from the interlocked X-Plane, and then to send this information to the Flight Data Collector connected to the Visualization/UDP Engines.

Figure 3. Flight Manipulation System
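The flight data items named above (attitude and speed generated by the control device, plus the position data received back from X-Plane) could be grouped into a single record before being forwarded to the Flight Data Collector. The following is only an illustrative sketch; the type, field, and function names are hypothetical and do not come from the paper.

```c
/* Hypothetical record grouping the flight data named in Section 3.1.
 * (Illustrative only; names are not taken from the paper.) */
typedef struct {
    float pitch, roll, heading;  /* degrees, generated by the control device */
    float speed;                 /* knots */
    float latitude, longitude;   /* degrees, received from X-Plane */
    float altitude;              /* feet */
    float vsi;                   /* vertical speed, feet per minute */
} FlightState;

/* Fill one record as the Flight Manipulation System might before
 * sending it on to the Flight Data Collector. */
static FlightState make_state(float pitch, float roll, float heading,
                              float speed, float lat, float lon,
                              float alt, float vsi) {
    FlightState s = { pitch, roll, heading, speed, lat, lon, alt, vsi };
    return s;
}
```

A collector could forward such a record over UDP unchanged, which is consistent with the fixed-layout packets described in the next section.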
3.2 UDP Engine

The UDP Engine module is designed to receive UDP packets from the Flight Data Collector while running as a UDP server, to analyze the received packets, and to transfer the results to the Visualization Engine module. Figure 4 shows the block diagram of the UDP Engine module.

Figure 4. Diagram of the UDP Engine

A data packet transferred from the Flight Manipulation System is a UDP packet carrying the data corresponding to its packet index number. One UDP packet is composed of at most 8 independent data items, each a 4-byte float. Table 1 defines the items of a UDP packet used for visualization.

Table 1. Items of the UDP Packet for Visualization

Packet No.    Information                 Index
UDP 01        Times                       0
UDP 03        Aircraft Speeds             0
UDP 04        Vertical Speed Indicator    1
UDP 18        Pitch, Roll, Headings       0,1,2
UDP 20        Lat, Lon, Altitude          0,1,2
UDP 37        Engine RPM                  0
UDP 41,42     N1, N2                      0

The packet structure format for the received UDP packet takes the form of Code 1.

Code 1: Structure type for the UDP packet format

typedef struct data_struct {
    int   index;
    float data[8];
} FMS_UDP;
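A received record matching the FMS_UDP layout of Code 1 (a 4-byte index followed by 8 floats, 36 bytes in all) could be unpacked from the raw datagram bytes with a simple copy. This is a sketch, not the paper's actual receiver code; the function name is ours, and it assumes the sender and receiver share the same byte order and float representation.

```c
#include <string.h>

typedef struct data_struct {
    int   index;
    float data[8];
} FMS_UDP;

/* Unpack one record (4-byte index + 8 x 4-byte floats = 36 bytes)
 * from a received UDP buffer. Assumes the host byte order matches
 * the sender's, as is typical when both run on the same platform. */
static FMS_UDP parse_record(const unsigned char *buf) {
    FMS_UDP p;
    memcpy(&p.index, buf, sizeof p.index);
    memcpy(p.data, buf + sizeof p.index, sizeof p.data);
    return p;
}
```

The index field then selects the interpretation from Table 1: for example, an index of 20 would mean the three floats at positions 0, 1, and 2 hold latitude, longitude, and altitude.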
3.3 Visualization Engine

The UDP packet information stored in the packet structure is transferred to the flight waypoint generation module for waypoint visualization, and this module compares the altitudes of the aircraft and the terrain using the map and the vertical speed rate information received from the Map Server. The vertical speed rate, which represents the aircraft's descending or ascending speed, is calculated by the following equation:

Vertical speed rate = (Aircraft speed × 6075.62963 / 60) × tan(pitch)

To visualize the aircraft's past path and its estimated onward path on the basis of the flight waypoints, several consecutive rectangular boxes are drawn using the heading, pitch, and roll data from the UDP Engine. The heading data gives the direction of travel, the pitch data the ascending/descending activity, and the roll data the left/right rotation. The estimated onward path of the aircraft, derived from the vertical speed rate information, is drawn as a blue dotted line in Figure 5.

Figure 5. Representation of the Flight Waypoint Tracking

3.4 Visualization Panel

In addition to the visualization of waypoint tracking, all of the flight information needs to be displayed on the instrument panel so that the pilot can recognize the current flight state. In this study, the basic instruments together with the Map View are designed and implemented using various functions of the open-source OpenGL library. Figure 6 shows the cockpit instrument panel of the simulated aircraft. It is composed of MFD (Multi-Function Display) instruments on the left side and the Map Viewer on the right side, with PFD (Primary Function Display) instruments arranged in the middle. These instruments are modeled on real cockpit instrument images, which enhances the realism.
Figure 6. Integration of Simulation Instruments and Map View

In the instruments, the mapping between flight data and an instrument indicator is performed by mapping to the corresponding area through a rotational coordinate calculation, using the central coordinates as the reference. Because the rotation functions provided by OpenGL affect other image areas, a method is used that maps the corresponding area extracted from the image through a specific rotation coordinate calculation procedure. The coordinate arrangement follows the range mapping of Figure 7.

Figure 7. Texture Mapping for Rotary Coordinate

The following Code 2 is part of the real-time MFD mapping implementation.

Code 2: Implementation of the MFD using rotary coordinates

glBindTexture(GL_TEXTURE_2D, texture[0]);
glBegin(GL_QUADS);
x1 = (sm_x - cen);  y1 = (sm_y - cen);
x2 = x1 * cos(roll) - y1 * sin(roll) + cen;
y2 = x1 * sin(roll) + y1 * cos(roll) + cen - pitch;
glTexCoord2f(x2, y2); glVertex3f(x, y, 0.0f);
x1 = (la_x - cen);  y1 = (sm_y - cen);
x2 = x1 * cos(roll) - y1 * sin(roll) + cen;
y2 = x1 * sin(roll) + y1 * cos(roll) + cen - pitch;
glTexCoord2f(x2, y2); glVertex3f(x + x_s, y, 0.0f);
x1 = (la_x - cen);  y1 = (la_y - cen);
x2 = x1 * cos(roll) - y1 * sin(roll) + cen;
y2 = x1 * sin(roll) + y1 * cos(roll) + cen - pitch;
glTexCoord2f(x2, y2); glVertex3f(x + x_s, y + y_s, 0.0f);
x1 = (sm_x - cen);  y1 = (la_y - cen);
x2 = x1 * cos(roll) - y1 * sin(roll) + cen;
y2 = x1 * sin(roll) + y1 * cos(roll) + cen - pitch;
glTexCoord2f(x2, y2); glVertex3f(x, y + y_s, 0.0f);
glEnd();

The real-time Map View on the right side of Figure 6 has ten zoom levels and displays the map information online. The map information is provided by the ArcGIS World Street Map MapServer, which supplies scales from 1:73,874,399 to 1:144,285. When the up or down arrow key is pressed, the scale changes automatically and the corresponding level is shown on the left side. In the middle part of the map, the longitude and latitude coordinates of the current flight are displayed, and the divided map images, of which there are at most 2,097,152, are arranged on the display view to fit the corresponding longitude and latitude. Code 3 presents part of the code that performs this mapping in real time.

Code 3: Function for texture mapping of the map

void textures_run() {
    int i, j, no = 1;
    /* tile indices of the map image containing the current position:
     * offsets from longitude -180 and latitude +90, in units of
     * zoom_ra degrees per tile */
    x_st = abs((int)(((-180) - lon) / zoom_ra));
    y_st = abs((int)((lat - 90) / zoom_ra));
    for (i = -1; i <= 1; i++) {
        for (j = -1; j <= 1; j++, no++) {
            /* fetch the tile image at this zoom level and grid cell */
            sprintf(str1, "%d/%d/%d.jpg", zoom, y_st + i, x_st + j);
            BuildTexture(str1, texture[no]);
        }
    }
}

As shown in Code 3, a total of 9 map images are loaded from the Web around the center coordinate of the map, and the 8 neighboring images around the center contribute to the
mapping. Because the number of images at the maximum level is 2,097,152, all of the map images cannot be loaded at once. In our approach, when the center coordinate leaves the current image area, a new set of 9 map images is loaded and mapped. In Figure 8(A), if the current longitude and latitude of the aircraft moves to (i+1, j+1), the maps corresponding to locations (i, j), (i, j+1), and (i+1, j) are loaded and arranged on the Map View as in Figure 8(B).

i-1, j-1   i-1, j   i-1, j+1
i,   j-1   i,   j   i,   j+1
i+1, j-1   i+1, j   i+1, j+1

(A) Combinational positions of the 9 map images for mapping  (B) Example of the image arrangement

Figure 8. Arrangement of Map Images

4. Experimental Results

To extract the flight information from the Flight Manipulation System interlocked with X-Plane, a test flight simulation using the flight control input device was performed at an airport. The test flight was carried out online at Yosu airport in the Jeollanam-do district. To visualize the aircraft's landing using the input device of the Flight Manipulation System, the Google Earth system was utilized. Figure 9 shows the landing scenes of the test aircraft at Yosu airport.

(A) Takeoff (0 sec)  (B) After 10 seconds  (C) After 20 seconds  (D) After 35 seconds

Figure 9. Flight Visualization on Google Earth

The interlocked X-Plane is not displayed; all of the flight data is extracted from the Flight Manipulation System and X-Plane. The extracted flight data is sent to the visualization
system for the Map View and flight waypoint tracking, and this data is realized on the Map View. Figure 10 shows the visualized result of the moving map screen and the flight waypoint tracking for the aircraft in the Yosu area of the Jeollanam-do district. In the lower part, a black line presents the aircraft's estimated direction of travel from the flight waypoint tracking, and the blue area represents the terrain altitude information. Along the aircraft's direction of travel, the altitude region where a collision danger exists is drawn with a red line, alerting the pilot to the dangerous state.

Figure 10. Moving Map and Flight Waypoint Tracking

5. Conclusion

In this paper we presented a visualization system for flight waypoint tracking using a Flight Manipulation System interlocked with X-Plane, built on open-source OpenGL. Using this flight waypoint tracking system, aircraft path search and collision avoidance can be presented to the pilot visually. This tool is also more economical than existing tools for providing flight information. The proposed system could serve as a basis for an integrated flight visualization system in the field of avionics.

Acknowledgement

This research was supported by the MKE (The Ministry of Knowledge Economy), Korea, under the ITRC (Information Technology Research Center) support program supervised by the NIPA (National IT Industry Promotion Agency) (NIPA-2010-C1090-1031-0007).

References

[1] Park, M., Hur, H.: Implementation of the Flight Information Visualization System using Google Earth. Journal of The Korea Society of Computer and Information, Vol. 15, No. 10, pp. 79-86 (Oct 2010)
[2] Park, S., Park, M.: 3D Visualization for Flight Situational Awareness using Google Earth. Journal of The Korea Society of Computer and Information, Vol. 15, No. 12, pp. 181-188 (Dec 2010)
[3] Park, J., Park, S., Ryoo, C., Shin, S.: A Study on the Algorithm for Automatic Generation of Optimal Waypoint with Terrain Avoidance. Journal of The Korea Society for Aeronautical and Space Sciences, Vol. 37, No. 11, pp. 1104-1111 (Nov 2009)
[4] Kim, B., Ryoo, C., Bang, H., Chung, E.: Optimal Path Planning for UAVs under Multiple Ground Threats. Journal of The Korea Society for Aeronautical and Space Sciences, Vol. 34, No. 1, pp. 74-80 (Jan 2006)
[5] World Street Map, http://www.arcgis.com
[6] Song, J., Park, T., Kim, J., Choy, Y.: Flight Path Visualization Using State Transition Information of Track. In: Proc. of the 34th KIISE Fall Conference, KIISE Vol. 34(2), pp. 172-177 (2007)

Authors

Myeong-Chul Park received his B.S. degree in Computer Science from Korea National Open University, and his M.S. and Ph.D. degrees in Computer Science from Gyeongsang National University. Since 2007, he has been a professor in the Dept. of Biomedical Electronics, SongHo College, HoengseongGun, Korea. His research interests include Computer Vision, Image Processing, Visualization, Simulators, Parallel Programs and Debugging.

Hyeon-Gab Shin received his B.S. degree in Animal Science from Jinju National University, Korea in 2000 and his M.S. degree in Computer Science from Gyeongsang National University, Korea in 2003. He is currently a Ph.D. candidate in Computer Science at Gyeongsang National University. His research interests include Embedded Software, Avionics Software, Flight Simulators and Computer Vision Systems.

Yong Ho Moon received his B.S., M.S., and Ph.D. degrees in electronics engineering from Pusan National University in 1992, 1994, and 1998, respectively.
From 1998 to 2001, he worked for the Corporate Research and Development Center of Samsung Electronics Co., Ltd. From 2001 to 2002, he was a BK21 assistant professor at the School of Electrical and Computer Engineering at Pusan National University. From 2003 to 2006, he was an assistant professor in the Division of Digital Media Engineering at Pusan University of Foreign Studies. Since 2007, he has been working as an assistant/associate professor in the Department of Informatics at Gyeongsang National University. His research interests include video coding and related VLSI design, image processing, and image communication.

Seok-Wun Ha received his B.S., M.S. and Ph.D. degrees in Electronic Engineering from Pusan National University. Since 1993, he has been a professor in the Dept. of Informatics, Gyeongsang National University, Jinju, Korea. His research interests include Digital Signal Processing, Neural Networks, Image Processing and Computer Vision.