Optics
Ruggedized Infrared Camera
This new technology applies NASA engineering to a FLIR Systems Boson® Model No. 640 to enable a robust IR camera for use in space and other extreme applications. Enhancements to the standard Boson® platform include a ruggedized housing, connector, and interface. The Boson® is a small, uncooled, commercial off-the-shelf (COTS) IR camera based on microbolometer technology that operates in the long-wave infrared (LWIR) portion of the IR spectrum and is available with several lens configurations.

NASA's modifications allow the IR camera to survive launch conditions and improve heat removal for space-based (vacuum) operation. The design includes a custom housing that secures the camera core, along with a lens clamp that maintains a tight lens-core connection under high-vibration launch conditions. The housing also provides additional conductive cooling for the camera components, allowing operation in a vacuum environment. A custom printed circuit board (PCB) in the housing enables a USB connection through a military-standard (MIL-STD) miniaturized locking connector instead of the standard USB Type-C connector. The system maintains the standard USB protocol for easy compatibility and "plug-and-play" operation.
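Because the camera retains the standard USB video protocol, a host computer can treat the ruggedized unit much like an ordinary webcam. The following is a minimal sketch in Python, assuming the camera enumerates as a standard UVC video device and that OpenCV is installed; the device index and resolution request are illustrative:

    import cv2

    # Hypothetical device index; the ruggedized camera is assumed to
    # enumerate as a standard UVC video device over its USB interface.
    CAMERA_INDEX = 0

    cap = cv2.VideoCapture(CAMERA_INDEX)
    if not cap.isOpened():
        raise RuntimeError("IR camera not found on the USB bus")

    # The Boson 640 core produces 640x512 frames; request that size.
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 512)

    ok, frame = cap.read()
    if ok:
        cv2.imwrite("ir_frame.png", frame)  # save one captured frame
    cap.release()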
Materials and Coatings

Atomic Layer Deposition-Enhanced Far-to-Mid Infrared Camera Coating
The ALD-Enhanced Far-to-Mid IR Camera Coating is fabricated by first applying a conductively loaded epoxy binder, ~500 microns thick, onto a conductive metal substrate (e.g., Cu, Al). This layer provides high absorptance and low reflectance at the longest wavelengths of interest and acts as a mechanical buffer that reduces coating stress. Borosilicate glass microspheres are then coated with a thin metal film via ALD, essentially turning the microspheres into resonators. The film is optically thin in the far infrared and approximates a resistive (~200 ohms per square) coating. Light trapped in the borosilicate glass microspheres is reflected back and forth within the glass; at each contact point, the light is attenuated by 50%. A monolayer of these metal-coated microspheres is applied to the epoxy binder and cured, forming a robust mechanical structure that can be grounded to prevent deep dielectric charging by ionizing radiation in space. Once cured, the far-to-mid IR absorber structure can be overcoated with a traditional “black” absorptive paint, ~20 to 50 microns thick, to enhance absorption at short wavelengths, or with a “white” diffusive paint to reject optical radiation. At this thickness and with broad tolerance, the longwave response of the coating is preserved. Tailoring the electromagnetic properties and geometry of the coating layers enables a broadband absorption response while minimizing the mass required per unit area.
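The stated 50% attenuation at each contact point implies a simple geometric decay of the trapped light. A short worked example in Python, taking the per-contact loss figure above at face value:

    # Residual power of light trapped in a metal-coated microsphere,
    # assuming (per the description above) 50% attenuation per contact point.
    per_contact_transmission = 0.5

    for n in range(1, 11):
        residual = per_contact_transmission ** n
        print(f"after {n:2d} contact points: {residual:.4%} of incident power remains")

    # After ~10 contact points, less than 0.1% of the incident power
    # survives, which is why the microsphere layer absorbs so strongly.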
While NASA originally developed the ALD-Enhanced Far-to-Mid IR Camera Coating for the Stratospheric Observatory for Infrared Astronomy (SOFIA) mission, its robustness, absorptive qualities, and optical performance make it a significant addition to IR and terahertz imaging systems. The IR camera coating is at Technology Readiness Level (TRL) 3 (experimental proof-of-concept) and is available for patent licensing.
Aerospace

Vision-based Approach and Landing System (VALS)
The novel Vision-based Approach and Landing System (VALS) provides Advanced Air Mobility (AAM) aircraft with an Alternative Position, Navigation, and Timing (APNT) solution for approach and landing without relying on GPS. VALS operates on multiple images obtained by the aircraft’s video camera as the aircraft performs its descent. Feature detection techniques such as Hough circle and Harris corner detection identify which portions of each image may contain landmark features. These image regions are compared against a stored list of known landmarks to determine which detected features correspond to them. The world coordinates of the best-matched image landmarks are input to a Coplanar Pose from Orthography and Scaling with Iterations (COPOSIT) module, which estimates the camera position relative to the landmark points and thereby the position and orientation of the aircraft. The estimated aircraft position and orientation are fed into an extended Kalman filter to further refine the estimates of aircraft position, velocity, and orientation. Thus, the aircraft’s position, velocity, and orientation are determined without the use of GPS data or signals. Future work includes feeding the vision-based navigation data into the aircraft’s flight control system to facilitate aircraft landing.
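A minimal sketch of this processing chain in Python with OpenCV, using Harris corners for feature detection and OpenCV’s solvePnP pose solver as a stand-in for the COPOSIT module (the landmark coordinates, matched image points, and camera intrinsics below are illustrative placeholders, not mission data):

    import cv2
    import numpy as np

    # Illustrative pinhole camera intrinsics (pixels).
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Hypothetical known landmarks: world coordinates (meters) and the
    # image locations at which they were matched during descent.
    world_landmarks = np.array([[ 0.0,  0.0, 0.0],
                                [30.0,  0.0, 0.0],
                                [30.0, 20.0, 0.0],
                                [ 0.0, 20.0, 0.0]])
    matched_image_points = np.array([[310.0, 250.0],
                                     [410.0, 255.0],
                                     [405.0, 180.0],
                                     [305.0, 175.0]])

    def detect_corner_features(frame):
        """Find candidate landmark features with the Harris corner detector."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        response = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
        ys, xs = np.where(response > 0.01 * response.max())
        return np.column_stack([xs, ys]).astype(np.float64)

    # Synthetic frame with a bright quadrilateral standing in for a runway marking.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    cv2.rectangle(frame, (305, 175), (410, 255), (255, 255, 255), -1)
    candidates = detect_corner_features(frame)
    print(f"{len(candidates)} candidate corner features detected")

    # Pose from the matched 2D-3D correspondences (solvePnP substitutes
    # for the COPOSIT stage described above).
    ok, rvec, tvec = cv2.solvePnP(world_landmarks, matched_image_points, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)
        position = (-R.T @ tvec).ravel()  # aircraft position in world frame
        print("estimated aircraft position (m):", position)

    # A full system would feed successive pose estimates, together with a
    # motion model, into an extended Kalman filter to refine position,
    # velocity, and orientation over time.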