information technology and software
Meta Monitoring System (MMS)
Meta Monitoring System (MMS) was developed as an add-on to NASA Ames' patented Inductive Monitoring System (IMS), which estimates deviation from normal system operations. MMS helps interpret deviation scores and determine whether anomalous behavior is transient or systemic. MMS has two phases: a model-building training phase and a monitoring phase. MMS not only uses deviation scores from nominal data for training but can also make limited use of results from anomalous data. The invention builds two models, one of nominal deviation scores and one of anomalous deviation scores, each consisting of a probability distribution of deviation scores. After the models are built, incoming deviation scores from IMS (or a different monitoring system that produces deviation scores) are passed to the learned models, and the probability of producing the observed deviation scores is calculated for both models. In this fashion, users of MMS can interpret deviation scores from the monitoring system more effectively, reducing false positives and false negatives in anomaly detection.
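The two-model idea can be sketched as follows. This is a minimal illustration, assuming (hypothetically) that each model is a single Gaussian fit to its training deviation scores; the actual probability distributions used by MMS are not specified here, and all scores below are invented.

```python
import math

def fit_gaussian(scores):
    """Fit a simple Gaussian model (mean, std) to training deviation scores."""
    mean = sum(scores) / len(scores)
    var = sum((s - mean) ** 2 for s in scores) / len(scores)
    std = math.sqrt(var) if var > 0 else 1e-9
    return mean, std

def likelihood(score, model):
    """Probability density of a deviation score under a Gaussian model."""
    mean, std = model
    return math.exp(-((score - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# Training phase: one model from nominal scores, one from anomalous scores.
nominal_model = fit_gaussian([0.1, 0.2, 0.15, 0.12, 0.18])
anomalous_model = fit_gaussian([0.9, 1.1, 1.0, 0.95, 1.2])

# Monitoring phase: compare likelihoods for an incoming IMS deviation score.
def classify(score):
    p_nominal = likelihood(score, nominal_model)
    p_anomalous = likelihood(score, anomalous_model)
    return "nominal" if p_nominal > p_anomalous else "anomalous"
```

A score near the nominal training range is far more likely under the nominal model, and vice versa, which is how the comparison reduces both false positives and false negatives.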
Note: Patent license only; no developed software available for licensing
Aerospace
Mitigating Risk in Commercial Aviation Operations
NASA’s newly developed software leverages flight operations data (e.g., SWIM Terminal Data Distribution System (STDDS) information) to predict aviation-related risks, such as unstable flight approaches. To do this, the software ingests the complex, multi-source STDDS data and outputs novel prediction and outcome information.
The software converts the relatively inaccessible SWIM data from its native format, which is not data-science friendly, into a format easily readable by most programs. The converted, model-friendly data are then input into machine learning algorithms to enable risk prediction capabilities. The back-end software sends the machine learning results to the front-end software, which displays them in appropriate user interfaces. These user interfaces can be deployed on different platforms, including mobile phones and desktop computers, and efficiently update models based on changes in the data.
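The format-conversion step can be illustrated with a small sketch that flattens a nested record into a single-level feature row that typical machine learning libraries can consume. The field names below are invented for illustration and do not reflect the actual SWIM/STDDS schema.

```python
def flatten(record, prefix=""):
    """Recursively flatten nested dicts into a single-level dict of features."""
    flat = {}
    for key, value in record.items():
        name = prefix + key
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

# Hypothetical STDDS-like message (field names invented for illustration).
raw = {
    "flight": {"callsign": "ABC123", "phase": "approach"},
    "track": {"altitude_ft": 2400, "groundspeed_kt": 155},
}
row = flatten(raw)
# row now maps dotted feature names ("track.altitude_ft", ...) to scalar values.
```

Rows in this shape can be stacked into a feature matrix and fed directly to most model-training APIs.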
To allow for visualization, the software uses a commercially available mapping API. The data are visualized in several different ways, including a heat map layer that shows the risk score, with higher risk in areas of higher flight density; a polyline layer, which shows flight paths; and markers that can indicate a flight’s location in real time, among other things. The related patent is now available to license. Please note that NASA does not manufacture products itself for commercial sale.
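A rough sketch of preparing these three layer types from a risk-scored track is shown below. The layer shapes mirror what common mapping APIs typically accept (weighted heat map points, a coordinate polyline, a marker position); they are illustrative assumptions, not NASA's actual interface, and the coordinates are invented.

```python
def build_layers(track):
    """Split a risk-scored flight track into data for three map layers.

    `track` is a list of (lat, lon, risk) tuples. The output shapes are
    hypothetical, modeled on typical mapping-API layer inputs.
    """
    heatmap = [{"lat": lat, "lng": lon, "weight": risk} for lat, lon, risk in track]
    polyline = [(lat, lon) for lat, lon, _ in track]
    marker = polyline[-1]  # most recent position approximates real-time location
    return {"heatmap": heatmap, "polyline": polyline, "marker": marker}

layers = build_layers([
    (37.61, -122.38, 0.2),
    (37.63, -122.40, 0.5),
    (37.65, -122.42, 0.8),
])
```

The heat map weights carry the per-point risk score, so denser, higher-risk regions render hotter, while the polyline and marker preserve the path and current position.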
information technology and software
Computer Vision Lends Precision to Robotic Grappling
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of the grapple fixtures relative to the robotic arm’s end effectors. To solve this Perspective-n-Point challenge, the software uses computer vision algorithms to determine alignment solutions between the position of the camera eyepoint and the position of the end effector, since the borescope camera sensors are typically located several centimeters from their respective end effector grasping mechanisms.
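The camera-to-end-effector offset problem can be illustrated with a homogeneous-transform sketch: a fixture pose estimated in the camera frame is re-expressed in the end-effector frame by composing it with the known, fixed camera-to-end-effector transform. The frame names, rotations, and distances below are illustrative assumptions, not values from the flight software.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Known, fixed offset: the borescope camera sits a few centimeters from the
# grasp point (identity rotation and a 5 cm offset chosen for illustration).
T_effector_camera = make_transform(np.eye(3), [0.0, 0.05, 0.0])

# Fixture pose as estimated by the vision solver, expressed in the camera frame
# (fixture 1.2 m straight ahead of the camera, illustrative value).
T_camera_fixture = make_transform(np.eye(3), [0.0, 0.0, 1.2])

# Re-express the fixture pose in the end-effector frame by composition.
T_effector_fixture = T_effector_camera @ T_camera_fixture
fixture_position = T_effector_fixture[:3, 3]  # position seen from the grasp point
```

With both rotations set to identity, the composed translation is simply the sum of the two offsets; with real rotations, the matrix product accounts for the full alignment between eyepoint and grasping mechanism.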
The software includes a machine learning component that uses a trained regional Convolutional Neural Network (r-CNN) to analyze a live camera feed and determine which ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS’s main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations.
Industrial automation and robotics applications that rely on computer vision solutions may find value in this software’s capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit.
This computer vision software is at a technology readiness level (TRL) of 6 (system/subsystem model or prototype demonstration in an operational environment), and the software is now available to license. Please note that NASA does not manufacture products itself for commercial sale.