
PATENT PORTFOLIO
Information Technology and Software
NASA develops information technology and software to support a wide range of activities, including mission planning and operations, data analysis and visualization, and communication and collaboration. These technologies also play a key role in the development of advanced scientific instruments and spacecraft, as well as in the management of NASA's complex organizational and technical systems. By leveraging NASA's advances in IT and software, your business can gain a competitive edge.
Orbital Debris
Space Traffic Management (STM) Architecture
As ever-larger numbers of spacecraft seek to make use of Earth's limited orbital volume in increasingly dense orbital regimes, greater coordination becomes necessary to ensure these spacecraft can operate safely while avoiding physical collisions, radio-frequency interference, and other hazards. While efforts to date have focused on improving Space Situational Awareness (SSA) and enabling operator-to-operator coordination, there is growing recognition that a broader system for Space Traffic Management (STM) is needed. The STM architecture forms the framework for an STM ecosystem, which enables the addition of third parties that can identify and fill niches by providing new, useful services. By making the STM functions available as services, the architecture reduces the amount of expertise that must be available internally within a particular organization, thereby lowering the barriers to operating in space and providing participants with the information necessary to behave responsibly. Operational support for collision avoidance, separation, etc., is managed through a decentralized architecture rather than via a single centralized, government-administered system. The STM system is based on standardized Application Programming Interfaces (APIs) to allow easier interconnection, and on conceptual definitions of roles so that suppliers with different capabilities can more easily add value to the ecosystem. The architecture handles basic functions including registration, discovery, authentication of participants, and auditable tracking of data provenance and integrity, and it is able to integrate data from multiple sources.
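To make the service-oriented design concrete, below is a minimal Python sketch of the kind of registration, discovery, and provenance-tracking functions the architecture standardizes as APIs. All class, method, and service names here are hypothetical illustrations, not the actual STM interface definitions.

```python
# Hypothetical sketch of core STM ecosystem functions: registration,
# discovery, and auditable tracking of data provenance and integrity.
# In the real architecture these are standardized web APIs.
import hashlib
import time
from dataclasses import dataclass

@dataclass
class Participant:
    participant_id: str
    services: list  # e.g., ["conjunction-assessment", "ephemeris"]

@dataclass
class ProvenanceRecord:
    source_id: str
    payload_hash: str
    timestamp: float

class StmRegistry:
    def __init__(self):
        self.participants = {}
        self.audit_log = []

    def register(self, participant: Participant) -> None:
        """Registration: make a participant discoverable to the ecosystem."""
        self.participants[participant.participant_id] = participant

    def discover(self, service: str) -> list:
        """Discovery: find all participants offering a given service."""
        return [p for p in self.participants.values() if service in p.services]

    def record_provenance(self, source_id: str, payload: bytes) -> ProvenanceRecord:
        """Provenance: log a hash-based, auditable record of supplied data."""
        rec = ProvenanceRecord(source_id, hashlib.sha256(payload).hexdigest(), time.time())
        self.audit_log.append(rec)
        return rec

# Usage: a new SSA provider joins, is discovered, and supplies auditable data.
registry = StmRegistry()
registry.register(Participant("ssa-provider-1", ["conjunction-assessment"]))
providers = registry.discover("conjunction-assessment")
registry.record_provenance("ssa-provider-1", b"ephemeris data")
```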
Computer Vision Lends Precision to Robotic Grappling
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of grapple fixtures relative to the robotic arm's end effectors. Because the borescope camera sensors are typically located several centimeters from their respective end effector grasping mechanisms, the software solves this Perspective-n-Point challenge with computer vision algorithms that determine alignment solutions between the camera eyepoint and the position of the end effector. The software includes a machine learning component that uses a trained regional Convolutional Neural Network (r-CNN) to analyze a live camera feed and identify ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS's main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations. Industrial automation and robotics applications that rely on computer vision solutions may find value in this software's capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit. This computer vision software is at technology readiness level (TRL) 6 (system/sub-system model or prototype demonstration in an operational environment) and is now available to license. Please note that NASA does not manufacture products itself for commercial sale.
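As an illustration of the Perspective-n-Point step, the following sketch uses OpenCV's solvePnP to recover a fixture pose from matched 3D/2D points. The fixture geometry, pixel detections, and camera intrinsics are invented for the example; NASA's software pairs this kind of solve with its r-CNN target detector.

```python
# Illustrative Perspective-n-Point (PnP) solve of the kind described above.
import cv2
import numpy as np

# Known 3D positions of grapple-fixture features in the fixture frame (meters).
object_points = np.array([
    [0.0, 0.0, 0.0],
    [0.1, 0.0, 0.0],
    [0.1, 0.1, 0.0],
    [0.0, 0.1, 0.0],
], dtype=np.float64)

# Corresponding 2D detections in the borescope camera image (pixels).
image_points = np.array([
    [320.0, 240.0],
    [400.0, 242.0],
    [398.0, 320.0],
    [322.0, 318.0],
], dtype=np.float64)

# Assumed camera intrinsics (focal lengths and principal point, in pixels).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    # rvec/tvec give the fixture pose in the camera frame; a fixed extrinsic
    # calibration would then map it into the end-effector frame.
    print("rotation (Rodrigues):", rvec.ravel())
    print("translation (m):", tvec.ravel())
```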
The touch screen of the Electronic Flight Bag allows pilots to easily use TASAR.
Traffic Aware Strategic Aircrew Requests (TASAR)
The NASA software application developed under the TASAR project is called the Traffic Aware Planner (TAP). TAP automatically monitors for flight optimization opportunities in the form of lateral and/or vertical trajectory changes. Surveillance data of nearby aircraft, received via ADS-B IN technology, are processed to evaluate and avoid possible conflicts resulting from requested changes in the trajectory. TAP also leverages real-time connectivity to external sources of operational data, where available, relating to winds, weather, restricted airspace, etc., to produce the most acceptable and beneficial trajectory-change solutions available at the time. The software application is designed for installation on low-cost Electronic Flight Bags that provide read-only access to avionics data, and the user interface is also compatible with the popular iPad. FAA certification and operational approval requirements are expected to be minimal for this non-safety-critical flight-efficiency application, reducing implementation cost and accelerating adoption by the airspace user community. TASAR was awarded "2016 NASA Software of the Year."
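A toy version of the conflict screening TAP performs might look like the sketch below: propagate ownship and ADS-B traffic along straight-line tracks and reject any candidate trajectory change whose closest approach violates a separation threshold. The motion model, units, and threshold are illustrative assumptions, not TAP's actual logic.

```python
# Screen a candidate trajectory change against ADS-B traffic by sampling
# the predicted separation over a short look-ahead horizon.
import numpy as np

SEPARATION_NM = 5.0  # assumed lateral separation minimum (nautical miles)

def min_separation(own_pos, own_vel, traffic_pos, traffic_vel, horizon_min=20):
    """Minimum distance (nm) between two straight-line tracks over a horizon."""
    rel_p = np.asarray(traffic_pos) - np.asarray(own_pos)
    rel_v = np.asarray(traffic_vel) - np.asarray(own_vel)
    times = np.linspace(0.0, horizon_min / 60.0, 200)  # hours
    dists = np.linalg.norm(rel_p + np.outer(times, rel_v), axis=1)
    return dists.min()

def conflict_free(candidate_vel, own_pos, adsb_tracks):
    """Accept a candidate velocity only if all traffic stays separated."""
    return all(
        min_separation(own_pos, candidate_vel, pos, vel) >= SEPARATION_NM
        for pos, vel in adsb_tracks
    )

# Ownship at the origin evaluating a direct-to heading; one crossing target.
tracks = [((20.0, 5.0), (-450.0, 0.0))]        # position in nm, velocity in kt
print(conflict_free((450.0, 30.0), (0.0, 0.0), tracks))  # False: too close
```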
Urban Air Mobility
Near-Real Time Verification and Validation of Autonomous Flight Operations
NASA's Extensible Traffic Management (xTM) system allows for distributed management of the airspace, where disparate entities collaborate to maintain a safe and accessible environment. This digital ecosystem relies on a common data generation and transfer framework enabled by well-defined data collection requirements, algorithms, protocols, and Application Programming Interfaces (APIs). The key components in this new paradigm are:
- Data Standardization: Defines the list of data attributes/variables required to inform and safely perform the intended missions and operations.
- Automated Real-Time and/or Post-Flight Data Verification Process: Verifies system criteria, specifications, and data quality requirements using predefined, rule-based, or human-in-the-loop verification.
- Autonomous, Evolving Real-Time and/or Post-Flight Data Validation Process: Validates data integrity, quantity, and quality for audit, oversight, and optimization.
The verification and validation process determines whether an operation's performance, conformance, and compliance are within known variation. The technology can verify thousands of flight operations in near-real time or post flight in the span of a few minutes, depending on networking and computing capacity. In contrast, manual processing would have required hours, if not days, for a team of 2-3 experts to review an individual flight.
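A minimal sketch of the rule-based verification component might look like the following, where each flight record is checked against data-standardization and quality rules and any operation outside known variation is flagged. The field names and thresholds are hypothetical.

```python
# Rule-based verification: check each flight operation record against
# required attributes and allowed value ranges.
def verify_operation(record, rules):
    """Return a list of rule violations for one flight operation record."""
    violations = []
    for field, (lo, hi) in rules.items():
        if field not in record:
            violations.append(f"missing required attribute: {field}")
        elif not (lo <= record[field] <= hi):
            violations.append(f"{field}={record[field]} outside [{lo}, {hi}]")
    return violations

# Hypothetical data-standardization rules for a small-UAS operation.
rules = {
    "altitude_ft": (0, 400),
    "gps_fix_quality": (1, 2),
    "telemetry_rate_hz": (1, 50),
}

flights = [
    {"altitude_ft": 120, "gps_fix_quality": 2, "telemetry_rate_hz": 10},
    {"altitude_ft": 650, "gps_fix_quality": 2},  # too high, missing a field
]

# Batches of thousands of such records can be screened in one pass.
for i, rec in enumerate(flights):
    print(i, verify_operation(rec, rules) or "conforms")
```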
Technology Example
Computational Visual Servo
The innovation improves upon the performance of passive automatic enhancement of digital images. Specifically, the image enhancement process is improved in terms of resulting contrast, lightness, and sharpness over prior-art automatic processing methods. The innovation brings the technique of active measurement and control to bear upon the basic problem of enhancing the digital image by defining absolute measures of visual contrast, lightness, and sharpness, then automatically applying the type and degree of enhancement needed based on automated image analysis. The foundation of the processing scheme is the flow of digital images through a feedback loop whose stages include visual measurement computation and servo-controlled enhancement. The cycle repeats until the servo achieves acceptable scores for the visual measures or decides that it has enhanced the image as much as is possible or advantageous; the servo-control will bypass images that it determines need no enhancement. The system determines experimentally how much sharpening can be applied before encountering detrimental sharpening artifacts, and it likewise issues stop decisions when further contrast or lightness enhancement would produce unacceptable levels of saturation or signal clipping. The invention was developed to provide completely new capabilities for exceeding pilot visual performance by automatically clarifying turbid, low-light, and extremely hazy images for pilot view on heads-up or heads-down displays during critical flight maneuvers.
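The following sketch mimics the feedback-loop structure described above, using RMS contrast as a stand-in visual measure: measure, apply a small enhancement step, and stop on an acceptable score or a clipping-based stop decision. The measure, gain, and thresholds are simplified assumptions, not the invention's actual metrics.

```python
# Servo-style enhancement loop: measure -> enhance -> re-measure,
# stopping on an acceptable score or a saturation/clipping stop decision.
import numpy as np

def rms_contrast(img):
    """Stand-in absolute visual measure: RMS contrast of the image."""
    return img.std()

def enhance_contrast(img, gain=1.1):
    """One servo step: stretch pixel values about the image mean."""
    return np.clip((img - img.mean()) * gain + img.mean(), 0.0, 1.0)

def visual_servo(img, target=0.20, max_iters=20):
    for _ in range(max_iters):
        if rms_contrast(img) >= target:
            break  # acceptable score reached: the servo stops
        stepped = enhance_contrast(img)
        # Stop decision: halt if the step would clip too many pixels.
        if np.mean((stepped <= 0.0) | (stepped >= 1.0)) > 0.05:
            break
        img = stepped
    return img

rng = np.random.default_rng(0)
hazy = 0.5 + 0.05 * rng.standard_normal((64, 64))  # low-contrast "image"
print(rms_contrast(hazy), rms_contrast(visual_servo(hazy)))
```

Note that an image already at or above the target score passes through unchanged, mirroring the servo's bypass behavior for images needing no enhancement.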
Big Data Analysis
Context Based Configuration Management System
Context Based Configuration Management (CBCM) is a hybrid tool-suite that directly supports a dynamic, distributed strategic planning and decision-making environment. The CBCM system marries Decision Map technology with a Commercial Off-the-Shelf (COTS) configuration management workflow (Xerox Docushare) and embedded component models (event models, configuration item models, and feedback models), all on top of a web-based online collaboration technology (e.g., the NASA/Xerox Netmark middleware engine). CBCM drives an enterprise configuration management system with built-in analysis functions that is tightly interoperable with other enterprise management systems (through middleware connections), delivers integrated reports, enables enterprise-wide inputs on decisions/actions/events, and presents context-based, query-driven views of configuration management information. The theory of operation flows from the most senior level of decision making, which creates a master Configuration Decision Map. This map tracks events, configuration objects, and contexts. Additional Configuration Decision Maps can be created independently and/or derived from the master map by successive organizations and/or individuals across the enterprise. The integrated icon-objects have intelligent triggers embedded within them, configurable by users to provide automatic analysis, forecasts, and reports. All information is stored in an object-relational database whose robust query and reporting tools help analyze and support past and current decisions, as well as track the enterprise baseline and potential future vectors.
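The sketch below gives a thumbnail of the Configuration Decision Map idea: nodes that link decisions, contexts, configuration items, and events, with user-configurable triggers that fire automatic reports. Class and field names are illustrative, not CBCM's actual schema.

```python
# Toy Configuration Decision Map node: links a decision to its context,
# configuration items, and events, with trigger callbacks on new events.
from dataclasses import dataclass, field

@dataclass
class ConfigurationItem:
    name: str
    baseline: str

@dataclass
class DecisionNode:
    decision: str
    context: str
    items: list = field(default_factory=list)     # affected configuration items
    events: list = field(default_factory=list)    # tracked events
    triggers: list = field(default_factory=list)  # callables run on new events

    def add_event(self, event: str) -> None:
        self.events.append(event)
        for trigger in self.triggers:
            trigger(self, event)  # e.g., automatic analysis or report

def report_trigger(node, event):
    print(f"[report] {node.decision} ({node.context}): {event}")

# A master map node at the senior decision level; child maps could be
# derived from it by other organizations in the enterprise.
master = DecisionNode("Adopt new avionics baseline", "Program-level review")
master.items.append(ConfigurationItem("flight-software", "v4.2"))
master.triggers.append(report_trigger)
master.add_event("Change request CR-107 approved")
```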
India-Pakistan Border at Night
Integrated Genomic and Proteomic Information Security Protocol
NASA GSFC has developed a security protocol consisting of message exchanges that use message authentication codes and encryption keys derived from the genetic encoding system: proteins, and the processes of transcription and translation by which DNA and RNA are expressed as proteins, are used for key generation. These are used in conjunction with the existing principles of a public key infrastructure and traditional encryption and authentication algorithms and processes. The protocol requires a cryptanalysis infrastructure not available to most attackers. By using the transcription and translation of actual genes (referred to as biogenes) in conjunction with a genomic- and proteomic-based encryption and authentication approach, security is achieved in two simultaneous domains; an attacker would have to breach both domains simultaneously for a successful network attack.
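As an illustration of the transcription/translation analogy only (not the GSFC protocol itself), the sketch below derives HMAC key material from a "biogene" DNA string via the standard genetic code, mixed with a conventionally exchanged secret. The codon subset and key-derivation scheme are assumptions for the example.

```python
# Derive MAC key material from a DNA string by mimicking transcription
# (DNA -> RNA) and translation (RNA -> protein), then hashing the result
# together with a conventional shared secret.
import hashlib
import hmac

CODON_TABLE = {  # a few entries from the standard genetic code
    "AUG": "M", "UUU": "F", "GGC": "G", "GAA": "E",
    "UGC": "C", "AAA": "K", "CCU": "P", "UAA": "*",  # "*" = stop codon
}

def transcribe(dna: str) -> str:
    """DNA -> messenger RNA (replace thymine with uracil)."""
    return dna.replace("T", "U")

def translate(rna: str) -> str:
    """RNA -> protein, reading codons until a stop codon."""
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino = CODON_TABLE[rna[i:i + 3]]
        if amino == "*":
            break
        protein.append(amino)
    return "".join(protein)

def derive_mac_key(biogene_dna: str, shared_secret: bytes) -> bytes:
    """Mix genomic-derived material with a conventional secret (two domains)."""
    protein = translate(transcribe(biogene_dna))
    return hashlib.sha256(protein.encode() + shared_secret).digest()

key = derive_mac_key("ATGTTTGGCGAAAAATAA", b"conventional PKI-exchanged secret")
tag = hmac.new(key, b"message", hashlib.sha256).hexdigest()
print(tag[:16])
```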
The heart of the NASA Center for Climate Simulation (NCCS) is the Discover supercomputer. In 2009, NCCS added more than 8,000 computer processors to Discover, for a total of nearly 15,000 processors.
First Stage Bootloader
The First Stage Bootloader reads software images from flash memory, utilizing redundant copies of the images, and launches the operating system. The bootloader finds a valid copy of the OS image and the RAM filesystem image in flash memory; if no copy is fully valid, it attempts to construct a fully valid image by validating the images in small sections and piecing together validated sections from multiple copies. Periodically throughout this process, the bootloader restarts the watchdog timer. The bootloader first reads a boot table from a default location in flash memory. This boot table describes where to find the OS image and its supporting RAM filesystem image in flash, and is validated using header information and a checksum. If the table is corrupt, the bootloader reads the next copy until it finds a valid table; there can be many copies of the table in flash, and all will be read if necessary. The bootloader then reads the RAM filesystem image into memory and validates its contents. As with the boot table, if one copy of the image is corrupt, it reads the remaining copies until it finds one with a valid header. If it does not find a fully valid copy, it breaks the image down into smaller sections; for each section, it checks each copy until it finds a valid instance of that section and copies it into a new copy of the image. Finally, the bootloader reads and interprets the OS image. If anything in the image is corrupt, it reads the remaining copies until it finds a fully valid copy; if no copy is fully valid, it uses individual valid records from multiple copies to create a fully valid image.
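A simplified model of the section-wise recovery is sketched below: when no flash copy of an image is fully valid, each copy is validated section by section and valid sections from different copies are stitched into one good image. The section size and checksum scheme are illustrative, not the flight implementation.

```python
# Rebuild a valid image from redundant corrupt copies by taking, for each
# section, the first copy whose section checksum matches the expected value.
import hashlib

SECTION_SIZE = 4

def checksum(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def sections(image: bytes):
    return [image[i:i + SECTION_SIZE] for i in range(0, len(image), SECTION_SIZE)]

def reconstruct(copies, good_checksums):
    """Stitch valid sections from multiple copies into one valid image."""
    rebuilt = []
    for idx, expected in enumerate(good_checksums):
        for copy in copies:                    # try each flash copy in turn
            section = sections(copy)[idx]
            if checksum(section) == expected:  # first valid section wins
                rebuilt.append(section)
                break
        else:
            raise IOError(f"no valid copy of section {idx}")
    return b"".join(rebuilt)

# Two copies, each corrupted in a different place: neither is fully valid,
# but together they contain every section intact.
original = b"OS-IMAGE+RAMFS-OK"
good = [checksum(s) for s in sections(original)]
copy_a = original.replace(b"IMAGE", b"IMGGE")
copy_b = original.replace(b"RAMFS", b"RAMFX")
assert reconstruct([copy_a, copy_b], good) == original
```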
Hubble Spies Charming Spiral Galaxy Bursting with Stars
Radiation Hardened 10BASE-T Ethernet Physical Interface
Currently, no radiation-hardened Ethernet interface device/circuit is available commercially. In this Ethernet solution, the portion of the PHY implemented in the FPGA is responsible for meeting the IEEE 802.3 protocol, decoding received packets and link pulses, and encoding data packets for transmission. The decoded payload data is sent to a user interface internal to the FPGA, which sends data for transmission back to the FPGA PHY. The transmit portion is composed of two Analog Devices AD844 op amps with appropriate filtering. The receive portion is composed of a transformer, an Aeroflex Low-Voltage Differential Multi-drop device, and appropriate filtering.
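The FPGA portion of the PHY performs the IEEE 802.3 line coding for 10BASE-T; the toy encoder below shows that Manchester code itself, where each bit is represented by a mid-bit transition (0 as high-to-low, 1 as low-to-high). This is a sketch of the line code only, not the flight design.

```python
# IEEE 802.3 Manchester line coding: each data bit becomes two half-bit
# line levels with a guaranteed mid-bit transition.
def manchester_encode(bits):
    """0 -> (high, low); 1 -> (low, high)."""
    symbols = []
    for b in bits:
        symbols.extend((0, 1) if b else (1, 0))
    return symbols

def manchester_decode(symbols):
    """Recover bits from the direction of each mid-bit transition."""
    return [1 if symbols[i] < symbols[i + 1] else 0
            for i in range(0, len(symbols), 2)]

data = [1, 0, 1, 1, 0]
line = manchester_encode(data)
assert manchester_decode(line) == data
print(line)  # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```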