Robotic Inspection System for Fluid Infrastructures

Instrumentation
Robotic Inspection System for Fluid Infrastructures (MSC-TOPS-70)
Surveys interior volume, interrogates structure integrity, and displays real-time video and sonar
Overview
NASA Johnson Space Center innovators have designed a Robotic Inspection System capable of surveying deep sea structures such as oil platform storage cells/tanks and pipelines to determine the volume of material remaining inside, interrogate structure integrity, and display real-time video and sonar. This inspection device and method could significantly reduce the cost of inspection and, with further development, provide sampling of the structure contents. The technology is an all-in-one inspection device that combines cameras, sonar, and motion sensing instruments with supporting hardware and software. This NASA technology is available for your company to license and develop into a commercial product. NASA does not manufacture products for commercial sale.

The Technology
The Robotic Inspection System improves the inspection of deep sea structures such as offshore storage cells/tanks, pipelines, and other subsea exploration applications. Oil platforms generally consist of pipelines and/or subsea storage cells. These storage cells not only provide a stable base for the platform, they also provide intermediate storage and separation capability for oil. Surveying these structures to examine their contents is often required when the platforms are being decommissioned. The Robotic Inspection System provides a device and method, comprising hardware and software components, for imaging the inside of the cells. The device is able to move through interconnected pipes, even making 90-degree turns, with minimal power. It can also display 3-dimensional range data derived from 2-dimensional information. This inspection method and device could significantly reduce the cost of decommissioning cells. The device can map interior volume, interrogate the integrity of cell fill lines, and display real-time video and sonar, and with future development it could possibly sample sediment or oil.
Shown: Robotic Inspection Orb prototype with forward-facing camera and sonar enclosure
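As a rough, hypothetical illustration of how 2-dimensional sonar returns can be assembled into 3-dimensional range data (the specific MSC-TOPS-70 algorithms are not described here), the sketch below places each (bearing, range) return into a world frame using the device's measured attitude and position; all names and the sensor geometry are assumptions.

```python
# Illustrative sketch only: fusing 2-D sonar returns (bearing, range) with
# attitude and position data into 3-D points. Sensor geometry and variable
# names are assumptions, not the actual MSC-TOPS-70 implementation.
import numpy as np

def sonar_to_points(scans, attitudes, positions):
    """Convert a series of 2-D sonar scans into a 3-D point cloud.

    scans     : list of (bearings, ranges) arrays, one pair per ping, in the sonar plane
    attitudes : list of 3x3 rotation matrices (sensor frame -> world frame), one per ping
    positions : list of 3-vector sensor positions in the world frame, one per ping
    """
    points = []
    for (bearings, ranges), R, p in zip(scans, attitudes, positions):
        # Each return lies in the sonar's scan plane (x forward, y to the side).
        local = np.stack([ranges * np.cos(bearings),
                          ranges * np.sin(bearings),
                          np.zeros_like(ranges)], axis=1)
        # Rotate into the world frame and translate by the sensor position.
        points.append(local @ R.T + p)
    return np.vstack(points)
```

Sweeping the scan plane as the device moves or pitches is what lets flat 2-D returns accumulate into a 3-D map of the cell interior.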
Benefits
  • Sonar display: 3-dimensional range data from 2-dimensional information
  • All-in-one inspection device: cameras, sonar, and motion sensing instruments
  • Minimal moving parts

Applications
  • Sub-sea oil and gas platform structures
  • Deep sea exploration
  • Pipelines at least partially containing a fluid
Technology Details

Instrumentation
MSC-TOPS-70
MSC-25784-1, MSC-25784-2
Patent No. 9,739,411
Similar Results
Method and Associated Apparatus for Capturing, Servicing, and De-Orbiting Earth Satellites Using Robotics
This method begins with the optical seeking and ranging of a target satellite using LiDAR. Upon approach, the tumble rate of the target satellite is measured and matched by the approaching spacecraft. As rendezvous occurs the spacecraft deploys a robotic grappling arm or berthing pins to provide a secure attachment to the satellite. A series of robotic arms perform servicing autonomously, either executing a pre-programmed sequence of instructions or a sequence generated by Artificial Intelligence (AI) logic onboard the robot. Should it become necessary or desirable, a remote operator maintains the ability to abort an instruction or utilize a built-in override to teleoperate the robot.
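The tumble-rate measurement step can be pictured with a generic sketch: successive target orientations estimated from LiDAR returns yield an angular velocity that the approaching spacecraft can then match. The quaternion math below is standard rigid-body kinematics, not the flight software.

```python
# Sketch: estimate a target's tumble rate from two LiDAR-derived orientations.
# Generic rigid-body kinematics; not the actual servicing spacecraft software.
import numpy as np
from scipy.spatial.transform import Rotation as R

def estimate_tumble_rate(q0, q1, dt):
    """Approximate angular velocity (rad/s) from orientations q0 -> q1 over dt seconds."""
    # Relative rotation that takes the earlier attitude to the later one.
    delta = R.from_quat(q1) * R.from_quat(q0).inv()
    # Rotation vector (axis * angle) divided by the time step approximates omega.
    return delta.as_rotvec() / dt

# Example: target rotated 5 degrees about z between pose fixes 2 s apart.
q0 = R.identity().as_quat()
q1 = R.from_euler("z", 5, degrees=True).as_quat()
omega = estimate_tumble_rate(q0, q1, dt=2.0)   # ~0.0436 rad/s about the z axis
```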
Fiber Optic Sensing Technologies
The FOSS technology revolutionizes fiber optic sensing by using its innovative algorithms to calculate a range of useful parameters—any and all of which can be monitored simultaneously and in real time. FOSS also couples these cutting-edge algorithms with a high-speed, low-cost processing platform and interrogator to create a single, robust, stand-alone instrumentation system. The system distributes thousands of sensors in a vast network—much like the human body's nervous system—that provides valuable information.

How It Works
Fiber Bragg grating (FBG) sensors are embedded in an optical fiber at intervals as small as 0.25 inches, which is then attached to or integrated into the structure. An innovative, low-cost, temperature-tuned distributed feedback (DFB) laser with no moving parts interrogates the FBG sensors as they respond to changes in optical wavelength resulting from stress or pressure on the structure, sending the data to a processing system. Unique algorithms correlate optical response to displacement data, calculating the shape and movement of the optical fiber (and, by extension, the structure) in real time, without affecting the structure's intrinsic properties. The system uses these data to calculate and display additional parameters such as 2D and 3D shape/position, temperature, liquid level, stiffness, strength, pressure, stress, and operational loads.

Why It Is Better
FOSS monitors strain, stresses, structural instabilities, temperature distributions, and a plethora of other engineering measurements in real time with a single instrumentation system weighing less than 10 pounds. FOSS can also discern between liquid and gas states in a tank or other container, providing accurate measurements at 0.25-inch intervals. Adaptive spatial resolution features enable faster signal processing and precision measurement only when and where it is needed, saving time and resources. As a result, FOSS lends itself well to long-term bandwidth-limited monitoring of structures that experience few variations but could be vulnerable as anomalies occur (e.g., a bridge stressed by strong wind gusts or an earthquake).

As a single example of the value FOSS can provide, consider oil and gas drilling applications. The FOSS technology could be incorporated into specialized drill heads to sense drill direction as well as temperature and pressure. Because FOSS accurately determines the drill shape, users can position the drill head exactly as needed. Temperature and pressure indicate the health of the drill. This type of strain and temperature monitoring could also be applied to sophisticated industrial bore scope usage in drilling and exploration. For more information about the full portfolio of FOSS technologies, visit https://technology-afrc.ndc.nasa.gov/featurestory/fiber-optic-sensing
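The underlying FBG measurement can be illustrated with a short sketch based on the textbook Bragg-grating relation, in which the fractional wavelength shift is proportional to strain plus a temperature term; the coefficients and function below are generic silica-fiber approximations, not the FOSS algorithms or calibration.

```python
# Sketch of the textbook fiber Bragg grating (FBG) relation used to convert a
# measured wavelength shift into strain. Constants are typical silica-fiber
# values, not FOSS-specific calibration data.
P_E = 0.22           # effective photo-elastic coefficient of silica (approx.)
TEMP_COEFF = 6.7e-6  # combined thermo-optic + expansion coefficient per deg C (approx.)

def strain_from_shift(lambda_bragg_nm, delta_lambda_nm, delta_temp_c=0.0):
    """Return mechanical strain (dimensionless) at one grating.

    delta_lambda / lambda ~= (1 - P_E) * strain + TEMP_COEFF * delta_temp
    """
    fractional_shift = delta_lambda_nm / lambda_bragg_nm
    return (fractional_shift - TEMP_COEFF * delta_temp_c) / (1.0 - P_E)

# Example: a 1.2 pm shift on a 1550 nm grating with no temperature change
# corresponds to roughly 1 microstrain.
eps = strain_from_shift(1550.0, 0.0012)   # ~1.0e-6
```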
ShuttleSCAN 3-D
How It Works
The scanner's operation is based on the principle of laser triangulation. The ShuttleSCAN contains an imaging sensor; two lasers mounted on opposite sides of the imaging sensor; and a customized, on-board processor for processing the data from the imaging sensor. The lasers are oriented at a given angle and surface height based on the size of objects being examined. For inspecting small details, such as defects in space shuttle tiles, a scanner is positioned close to the surface. This creates a small field of view but with very high resolution. For scanning larger objects, such as in a robotic vision application, a scanner can be positioned several feet above the surface. This increases the field of view but results in slightly lower resolution. The laser projects a line on the surface, directly below the imaging sensor. For a perfectly flat surface, this projected line will be straight. As the ShuttleSCAN head moves over the surface, defects or irregularities above and below the surface will cause the line to deviate from perfectly straight. The SPACE processor's proprietary algorithms interpret these deviations in real time and build a representation of the defect that is then transmitted to an attached PC for triangulation and 3-D display or printing. Real-time volume calculation of the defect is a capability unique to the ShuttleSCAN system.

Why It Is Better
The benefits of the ShuttleSCAN 3-D system are unique in the industry. No other 3-D scanner offers its combination of speed, resolution, size, power efficiency, and versatility. In addition, ShuttleSCAN can be used as a wireless instrument, unencumbered by cables. Traditional scanning systems make a tradeoff between resolution and speed. ShuttleSCAN's onboard SPACE processor eliminates this tradeoff. The system scans at speeds greater than 600,000 points per second, with a resolution smaller than 0.001". Results of the scan are available in real time, whereas conventional systems scan over the surface, analyze the scanned data, and display the results long after the scan is complete.
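The triangulation geometry can be summarized with a brief sketch: because the laser strikes the surface at a known angle, a lateral shift of the projected line in the image corresponds to a height change on the surface. The angle, pixel scale, and names below are illustrative assumptions rather than ShuttleSCAN calibration values.

```python
# Sketch of the laser-triangulation relation: a lateral shift of the projected
# laser line in the image maps to a surface height change. The angle and pixel
# scale are illustrative, not ShuttleSCAN calibration values.
import math

def height_from_line_shift(shift_pixels, mm_per_pixel, laser_angle_deg):
    """Surface height change (mm) implied by the line's lateral shift in the image.

    laser_angle_deg is the laser's angle measured from the camera's optical
    axis; a taller surface pushes the line sideways by height * tan(angle).
    """
    shift_mm = shift_pixels * mm_per_pixel
    return shift_mm / math.tan(math.radians(laser_angle_deg))

# Example: a 12-pixel shift at 0.02 mm/pixel with a 30-degree laser angle
# implies a bump roughly 0.42 mm high.
bump_mm = height_from_line_shift(12, 0.02, 30.0)   # ~0.416 mm
```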
Interim, In Situ Additive Manufacturing Inspection
The in situ inspection technology for additive manufacturing combines different types of cameras strategically placed around the part to monitor its properties during construction. The IR cameras collect accurate temperature data to validate thermal math models, while the visual cameras obtain highly detailed data at the exact location of the laser to build accurate, as-built geometric models. Furthermore, certain adopted techniques (e.g., single to grouped pixels comparison to avoid bad/biased pixels) reduce false positive readings. NASA has developed and tested prototypes in both laser-sintered plastic and metal processes. The technology detected errors due to stray powder sparking and material layer lifts. Furthermore, the technology has the potential to detect anomalies in the property profile that are caused by errors due to stress, power density issues, incomplete melting, voids, incomplete fill, and layer lift-up. Three-dimensional models of the printed parts were reconstructed using only the collected data, which demonstrates the success and potential of the technology to provide a deeper understanding of the laser-metal interactions. By monitoring the print, layer by layer, in real-time, users can pause the process and make corrections to the build as needed, reducing material, energy, and time wasted in nonconforming parts.
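A simplified version of the grouped-pixel comparison might look like the sketch below, where each infrared pixel is judged against the median of its neighborhood so that a single bad or biased pixel cannot trigger an alert on its own; the window size and threshold are illustrative assumptions, not NASA's tuned values.

```python
# Sketch of a grouped-pixel comparison for layer-by-layer IR monitoring:
# flag regions whose neighborhood median deviates strongly from the frame's
# typical temperature, so isolated bad/biased pixels do not raise false
# positives. Window size and threshold are illustrative only.
import numpy as np
from scipy.ndimage import median_filter

def flag_thermal_anomalies(ir_frame, window=5, threshold_c=25.0):
    """Return a boolean mask of anomalous regions in an IR temperature frame (deg C)."""
    # Median over a small window suppresses single-pixel outliers.
    smoothed = median_filter(ir_frame, size=window)
    # Compare each smoothed value against the frame's overall median temperature.
    baseline = np.median(smoothed)
    return np.abs(smoothed - baseline) > threshold_c
```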
Computer Vision Lends Precision to Robotic Grappling
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of the grapple fixtures relative to the robotic arm's end effectors. To solve this Perspective-n-Point challenge, the software uses computer vision algorithms to determine alignment solutions between the position of the camera eyepoint and the position of the end effector, since the borescope camera sensors are typically located several centimeters from their respective end effector grasping mechanisms. The software includes a machine learning component that uses a trained regional Convolutional Neural Network (r-CNN) to analyze a live camera feed and determine which ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS's main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations. Industrial automation and robotics applications that rely on computer vision solutions may find value in this software's capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit. This computer vision software is at technology readiness level (TRL) 6 (system/sub-system model or prototype demonstration in an operational environment), and the software is now available to license. Please note that NASA does not manufacture products itself for commercial sale.
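Recovering a fixture's pose from detected image points and its known 3-D geometry is a standard Perspective-n-Point computation; the minimal sketch below uses OpenCV's general-purpose solver with placeholder coordinates and camera parameters, and is not the flight software described above.

```python
# Minimal Perspective-n-Point sketch with OpenCV: recover the pose of a
# grapple fixture from detected 2-D image points and the fixture's known
# 3-D geometry. All coordinates and camera parameters below are placeholders.
import numpy as np
import cv2

# Known 3-D positions of fixture features in the fixture's own frame (meters).
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.1, 0.1, 0.0],
                          [0.0, 0.1, 0.0]], dtype=np.float64)

# Corresponding 2-D detections in the camera image (pixels), e.g. produced by
# a trained detector such as the one described above.
image_points = np.array([[320.0, 240.0],
                         [400.0, 238.0],
                         [402.0, 318.0],
                         [322.0, 320.0]], dtype=np.float64)

# Simple pinhole camera model (focal length and principal point in pixels).
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
# rvec/tvec give the fixture's rotation and translation relative to the camera,
# which an operator display could convert into end-effector-relative alignment cues.
```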