Greetings all (sorry about the long description, but this is a deep subject, at least for me). A friend passed this thread off to me and I thought I should join PilotsOfAmerica and add a few thoughts. My son and I and a small group of pilots and programmers have been working the GA/Experimental HUD challenge for several years. The TronView web site mentioned above is the primary focus point for our HUD software development. We're also part of the FlyOnSpeed AOA team that was the EAA Innovation Award Grand Champion for its effort to reduce LOC (Loss Of Control) accidents; our HUD project is an offshoot of that effort. Our first HUD effort used the Hudly HUD, a roughly $350 unit designed for car use that was really just a small HUD projector with an HDMI video input. This is the same technology the SkyDisplay by MGF used, except they were trying to sell it as a production HUD for small jets; in reality the HUD projector itself was never going to be FAA certified, only the software and installation were FAA approved. The founder of MGF died several years ago, and that pretty much stopped the effort. I talked with the follow-on MGF owner and he offered to sell me the whole project for several million dollars. I guess it's still for sale. The Hudly HUD worked and was fairly cheap, but it had significant limitations, starting with a fairly small screen (low pixel density), and its focus was only about 9 feet in front of you, just inside the distance you can reasonably get used to for a HUD. But our original HUD software worked pretty well on it.
We met the Epic Optix folks at Oshkosh, where they were finalists in the EAA innovation contest along with our FlyOnSpeed team. Their HUD has superb optics (real focus at infinity) and takes an HDMI video input, which made it almost perfect for our HUD computer (a Raspberry Pi micro computer), and we were able to start using it immediately. Their design goal was a unit that could sit on top of an aircraft dash and connect directly to an iPhone or Android device, or to their Raspberry Pi computer connected into the aircraft's digital attitude system. They wanted certified-aircraft owners to be part of their customer base, so the HUD had to be portable and easily installed or removed. As you can see in my RV8 HUD picture below, I made it a permanent install, which you can do in an experimental aircraft. The big limitations of this HUD are its size, which makes it impossible in many experimental aircraft (it barely fits in my RV8, and I have a non-VANS custom canopy), and again its pixel density. We made it work pretty well, but it's not the perfect solution. I also configured a little video camera to fit on this HUD, which, aside from aircraft vibration at times, does a good job.
I was part of the F-35 development program for some 15 years, so I got spoiled early by the F-35 Helmet Mounted Display system, and I actually got to do some flight test with this HMD in our B-737 avionics test bed (called the CATB). I knew from that experience that the ultimate goal for a GA/Experimental aircraft HUD was some type of HMD. An F-35 pilot friend of mine bought a pair of XReal AR glasses, did some airborne testing of them in his private aircraft (just watching iPhone video), got excited, and brought them over to me to connect to our little HUD computer. These glasses use a simple HDMI video input, weigh almost nothing, are comfortable, and worked pretty awesomely; it was instantly obvious that AR glasses technology had advanced to where it could be usable in a GA aircraft. We had to rearrange the graphics output of TronView (the name of our HUD software) to optimize for the 1080p (1920x1080) resolution and fully utilize the screen size of the XReal glasses, but they provide a large, bright, full-color, and very clear HUD picture that certainly exceeded our expectations. The focus of these glasses is about 12 feet in front of you; still not perfect, but definitely in the good-enough category. My F-35 friend agreed the focus was good enough (he has several hundred hours of F-35 HMD use). The F-35 HMD is a helmet: it weighs over 5 pounds, costs a lot of money (~$400K), took a long time to develop, is very complicated, has a large connection to the aircraft, and only has a green graphics display, but it works. I knew that creating anything like it for the GA world would not be easy.
If you go to our site (flyonspeed dot org slash hud) we have a lengthy discussion of HUDs in general: why have a HUD at all, what HUD equipment to buy and where, how to put it together, and how to load our (free) Raspberry Pi 5 software for the Epic Optix and XReal glasses HUDs. We believe the current XReal AR head-mounted glasses work very well with our software and can definitely increase your pilot SA. The directions at our site show you how to download and install the software, then test it with a Raspberry Pi 5 computer on a desktop while playing back actual recorded flight data to see how it all works. As already mentioned, this software does not have head tracking yet, but we are working hard to add that capability; this part is not easy. So we are looking for software-savvy Python programmers to join our team, and also for anyone who wants to help us fly and test the HUD hardware and software integration to help us perfect the product.
We have also redesigned the software so that the HUD display will essentially be a projected 3D window into the world outside the aircraft, with airports, routes, traffic, airspace, and other essential flight information presented to the pilot in a geographically correct 3D display. Doing this in a HUD in a way that increases your situational awareness without also distracting from it will be a double challenge; anyway, that's the direction we are going.
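For anyone curious about the math behind a "projected 3D window": the core step is converting a geographic point (an airport, a traffic target, a route waypoint) into screen coordinates given the aircraft's position and attitude. Here is a minimal sketch in Python; this is my own illustration, not TronView's actual code, and the flat-earth approximation, function names, and 40-degree field of view are all simplifying assumptions.

```python
import math

def geo_to_enu(lat0, lon0, alt0, lat, lon, alt):
    """Flat-earth approximation: target offset from ownship in meters,
    as (east, north, up). Fine for HUD ranges of a few tens of miles."""
    R = 6371000.0  # mean Earth radius, meters
    east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * R
    return east, north, alt - alt0

def project(east, north, up, heading_deg, pitch_deg, roll_deg,
            fov_deg=40.0, width=1920, height=1080):
    """Rotate an ENU offset into the aircraft body frame (yaw, then
    pitch, then roll) and pinhole-project it onto a width x height
    screen. Returns (x, y) in pixels, or None if the point is behind
    the aircraft."""
    h, p, r = (math.radians(a) for a in (heading_deg, pitch_deg, roll_deg))
    # Yaw: resolve the offset along the nose and right-wing directions.
    x1 = east * math.cos(h) - north * math.sin(h)   # toward right wing
    y1 = east * math.sin(h) + north * math.cos(h)   # out the nose
    # Pitch (nose-up positive) about the right-wing axis.
    y2 = y1 * math.cos(p) + up * math.sin(p)
    z2 = -y1 * math.sin(p) + up * math.cos(p)
    if y2 <= 0:
        return None  # behind the aircraft
    # Roll (right-wing-down positive) about the nose axis.
    x3 = x1 * math.cos(r) - z2 * math.sin(r)
    z3 = x1 * math.sin(r) + z2 * math.cos(r)
    # Pinhole projection: focal length in pixels from the horizontal FOV.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    return (width / 2 + f * x3 / y2, height / 2 - f * z3 / y2)
```

A target dead ahead at co-altitude lands at screen center (960, 540), and pitching or banking moves every symbol the opposite way, which is the conformal behavior a 3D HUD display needs.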
Some general information from our GitHub site:
This 3D HUD Version is in Active development! (Jan 2025!)
This software can support aircraft displays as well as a HUD; it's all in the requirements of the pilot/builder.
If you want to help with development or testing, please join our Discord server. We are working on several new features:
- We are currently testing with XReal and Viture AR glasses
- New Editor to make creating or editing your own screen very easy.
- Improving our G3x support
- Show/Hide screen modules based on key commands or other inputs. (event handlers)
- Additional IMU/Gyro support (BNO085, BNO055)
- Looking into live data broadcast to other TronView displays (that would be cool!)
- Moving Map.
- More external sensors (Pressure, Voltage, GPS, etc)
- Touch screen support (New Raspberry Pi Touchscreen v2)
Features include:
- Build custom EFIS, HUD, or AR-glasses screens (or any kind of screen you want).
- Build any kind of UI for external sensors or external data.
- Record and play back flight log data (and fast-forward through playback).
- All screens look and work the same for all supported data inputs.
- All display screen sizes and ratios supported.
- Built-in editor for creating or editing your own screens.
- Text mode.
- Touch screen support.
- 30+ FPS on Raspberry Pi 4/5 (60+ FPS on Mac M1).
- Remote keypad / user input support.
- Display flight data in knots, standard, or metric units, F or C.
- Designed for Raspberry Pi 4/5, but also runs on macOS, Windows, and other Linux systems.
- Show NAV needles for approaches (if NAV data is available).
- Use multiple data input sources: MGL, G3x, Dynon, GRT EIS, iLevil BOM, Stratux, analog CDI via ADS1115, two types of IMUs (BNO055 and BNO085), a generic serial logger, a simple pilot keypad, and/or a pilot joystick.
- The first picture shows what the XReal glasses look like under a Lightspeed headset, along with a very hard-to-take shot of the HUD graphics contrast against a bright blue sky, taken through the glasses' projection.
- The second picture is the Epic Optix HUD with a gun video camera mounted in my RV8 (the little AOA display next to it is from our FlyOnSpeed aural/visual AOA system).
- The third picture shows an IMU (Inertial Measurement Unit) chip I have for current head-tracking testing in a different headset. This chip tells the computer where the pilot's head is looking. The computer also gets attitude information (heading, roll, pitch, yaw) from the aircraft ADAHRS.
- The fourth picture shows some early testing of our head-tracking 3D software, using a Stratux ADS-B traffic unit sending traffic data over WiFi into our Raspberry Pi computer, which then tells me to turn my head and look up so I can identify the aircraft in my 3D visual HUD display. As you can see, the HUD test software tells me (not that I would care if I were actually flying) that this is Southwest Airlines flight 1106 at 14,125 feet, heading 021 degrees, climbing, 24.2 miles away, with the last data update 0 seconds ago. Most of the data in this picture is there for development, because the computer has to track what the airplane is doing (heading, roll, pitch, yaw) and what the HMD (the pilot's head) is doing (roll, pitch, yaw), and add this all together so that when the pilot looks where the 3D glasses tell him/her to look, the actual aircraft will be there. So far we are in the ballpark; it's close but not perfect yet. In our test software every aircraft is currently presented as a B-707; this will of course change, but it's OK for now. The nearest aircraft will also get a box around its visual location, and the aircraft picture will go away. The + in the center represents the pilot's line-of-sight center point.
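To give a sense of the "add it all together" step above: in the simplest wings-level case, the cue for how far to turn your head is just the target's bearing and elevation minus the aircraft heading and the head tracker's yaw/pitch. The toy sketch below is my own illustration, not the TronView implementation (which must compose full 3D rotations to handle banked flight); all names are hypothetical.

```python
import math

def wrap180(deg):
    """Wrap an angle in degrees to the range [-180, 180)."""
    return (deg + 180.0) % 360.0 - 180.0

def look_cue(ac_heading, head_yaw, head_pitch, tgt_bearing, tgt_elevation):
    """Degrees the pilot must turn (right positive) and raise (up
    positive) the head to put the target on the HMD boresight.
    ac_heading    -- aircraft true heading, from the ADAHRS
    head_yaw      -- head yaw relative to the airframe, from the IMU
    head_pitch    -- head pitch relative to the airframe, from the IMU
    tgt_bearing   -- true bearing from ownship to the target
    tgt_elevation -- target elevation angle above the horizon
    """
    yaw_cue = wrap180(tgt_bearing - ac_heading - head_yaw)
    pitch_cue = tgt_elevation - head_pitch
    return yaw_cue, pitch_cue

# Aircraft heading 090, head straight ahead, traffic at bearing 120 and
# 10 degrees above the horizon:
print(look_cue(90.0, 0.0, 0.0, 120.0, 10.0))  # -> (30.0, 10.0)
```

The angle wrap matters: without it, traffic just left of the nose while heading 350 would produce a "turn 340 degrees right" cue instead of "turn 20 degrees left".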