Visual Inspection Posable Invertebrate Robot (VIPIR)

Robotics Automation and Control
Visual Inspection Posable Invertebrate Robot (VIPIR) (GSC-TOPS-360)
An advanced tele-operated spacecraft inspection tool
Overview
On-orbit spacecraft maintenance and repair of components are critical for extending mission lifespans and ensuring operational safety. Traditional inspection methods, like astronaut extravehicular activities (EVAs), are risky and resource-intensive, while fixed cameras offer limited views. To address these challenges, innovators at NASA’s Goddard Space Flight Center developed the Visual Inspection Posable Invertebrate Robot (VIPIR) system, a tele-operated robotic imaging system offering advanced capabilities for visual assessments of space-based assets. VIPIR is a significant leap forward in the realm of on-orbit robotic inspection, evaluation, and servicing systems. Integrating a dexterous, flexible video borescope with advanced articulation mechanisms, VIPIR allows NASA to remotely inspect spacecraft components that would otherwise be difficult or impossible to view, enabling subsequent repairs and maintenance.

The Technology
Initially developed as a close-quarters inspection tool capable of accessing hard-to-reach, tight, or visually restricted areas of satellites, the VIPIR system can remotely inspect otherwise inaccessible locations: behind a sheet of thermal blanketing material, inside a satellite's plumbing, or even deep within the otherwise unreachable crevices of a spacecraft bus. The VIPIR system incorporates several subassemblies that together provide exceptional operational freedom for imaging and inspecting componentry.

VIPIR's Video Borescope Assembly (VBA) is a flexible snake camera capable of multidirectional articulation, making steering and control simple and intuitive for an operator. The VBA includes at least one imaging sensor and lighting for viewing dark, confined spaces, and feeds real-time, high-resolution video back to an operator for live analysis. A reel system extends and retracts the VBA on a spool and includes position indicators for deployment tracking. The Tendon Management System (TMS), not unlike human tendons, uses pulleys and tensioners to articulate the VBA within confined spaces. A seal system keeps the VBA free of contamination.

VIPIR underwent space-based testing on the ISS during the Robotic Refueling Mission Phase 2 (RRM2), a mission designed to showcase and test several advanced NASA robotic satellite servicing technologies. During this mission, VIPIR demonstrated state-of-the-art near- and midrange inspection capabilities. NASA's VIPIR system is available for licensing to industry and may be attractive to companies focused on satellite servicing, on-orbit assembly, and other applications requiring detailed inspection of assets in space.
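The TMS control details are not published, but tendon-driven articulation of a snake camera like the VBA is commonly modeled with a constant-curvature approximation: a tendon routed a small radius off the segment's neutral axis must shorten by r × θ to bend the segment through angle θ, while its antagonist lengthens by the same amount. A minimal sketch of that relation (function names and parameter values are illustrative assumptions, not taken from the VIPIR design):

```python
import math

def tendon_displacement(offset_radius_m: float, bend_angle_rad: float) -> float:
    """Tendon length change for a constant-curvature bend.

    A tendon routed offset_radius_m from the neutral axis of a flexible
    segment must shorten by r * theta to bend the segment through
    bend_angle_rad (a generic continuum-robot model, not NASA's TMS).
    """
    return offset_radius_m * bend_angle_rad

def antagonistic_pair(offset_radius_m: float, bend_angle_rad: float):
    """Displacements (pull, release) for an opposing tendon pair."""
    d = tendon_displacement(offset_radius_m, bend_angle_rad)
    return d, -d

# Example: bend a segment 90 degrees with tendons routed 5 mm off-axis.
pull, release = antagonistic_pair(0.005, math.pi / 2)
```

In a real system the pulleys and tensioners would servo these two displacements together so the tendons stay taut throughout the bend.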
A close-up image of NASA's Visual Inspection Posable Invertebrate Robot (VIPIR). Credit: NASA/Chris Gunn
VIPIR extends its snake-like borescope camera in free space as part of the RRM3 mission. It was later inserted into the RRM3 module's piping system to verify proper cryogen hose placement. Credit: NASA
Benefits
  • Enhanced on-orbit spacecraft inspection capabilities: VIPIR’s compact, flexible, and articulating design enables it to access confined and hard-to-reach areas, providing high-resolution visual assessments of spacecraft components that are not possible with fixed cameras or traditional inspection methods.
  • Repositionable: The VIPIR system is compact and moveable, providing significantly more flexibility and freedom relative to fixed camera systems or manned EVA inspections.
  • Compact and flexible: The small size of the VBA and camera assembly enables VIPIR to view areas that are inaccessible to traditional imaging systems.
  • Safety: VIPIR’s capabilities allow for tele-operated spacecraft inspections in lieu of astronaut EVAs, enhancing safety and enabling more detailed inspections.
  • Real-time information: Onboard cameras and wireless video transmission provide immediate visual feedback, allowing operators to make swift decisions.

Applications
  • Spacecraft inspection and servicing: Initially designed for space applications, VIPIR can be employed to inspect and diagnose issues on spacecraft (e.g., satellites, space stations, etc.), enabling repairs and maintenance that extend their operational lifespan.
  • On-orbit assembly and construction: VIPIR’s capabilities may be valuable for inspecting joints and connections in structures assembled in space, ensuring structural integrity.
  • Aviation and defense: Inspection of engines, bulkheads, fuel tanks, environmental control systems (ECS), and wiring at service stations.
  • Industrial equipment: Inspection of mining, reactor, and farming machinery, where components are large, numerous, and tightly packed, could benefit from the VIPIR system.
Technology Details

Robotics Automation and Control
GSC-TOPS-360
GSC-17175-1
12139279
Similar Results
Method and Associated Apparatus for Capturing, Servicing, and De-Orbiting Earth Satellites Using Robotics
This method begins with the optical seeking and ranging of a target satellite using LiDAR. Upon approach, the tumble rate of the target satellite is measured and matched by the approaching spacecraft. As rendezvous occurs the spacecraft deploys a robotic grappling arm or berthing pins to provide a secure attachment to the satellite. A series of robotic arms perform servicing autonomously, either executing a pre-programmed sequence of instructions or a sequence generated by Artificial Intelligence (AI) logic onboard the robot. Should it become necessary or desirable, a remote operator maintains the ability to abort an instruction or utilize a built-in override to teleoperate the robot.
Computer Vision Lends Precision to Robotic Grappling
The goal of this computer vision software is to take the guesswork out of grapple operations aboard the ISS by providing a robotic arm operator with real-time pose estimation of the grapple fixtures relative to the robotic arm's end effectors. To solve this Perspective-n-Point challenge, the software uses computer vision algorithms to determine alignment solutions between the position of the camera eyepoint and the position of the end effector, as the borescope camera sensors are typically located several centimeters from their respective end effector grasping mechanisms. The software includes a machine learning component that uses a trained region-based Convolutional Neural Network (r-CNN) to analyze a live camera feed and determine which ISS fixture targets a robotic arm operator can interact with on orbit. This feature is intended to increase the grappling operational range of the ISS's main robotic arm from a previous maximum of 0.5 meters for certain target types to greater than 1.5 meters, while significantly reducing computation times for grasping operations. Industrial automation and robotics applications that rely on computer vision solutions may find value in this software's capabilities. A wide range of emerging terrestrial robotic applications outside of controlled environments may also find value in the dynamic object recognition and state determination capabilities of this technology, as successfully demonstrated by NASA on orbit. This computer vision software is at a technology readiness level (TRL) of 6 (system/sub-system model or prototype demonstration in an operational environment), and the software is now available to license. Please note that NASA does not manufacture products itself for commercial sale.
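The pose-estimation pipeline itself is not published, but the geometric intuition behind ranging a fixture of known size from a single camera is the pinhole relation Z = f · W / w: range equals focal length (in pixels) times true target width divided by its apparent width in the image. A toy sketch under that assumption (the focal length and fixture width below are made-up values, not ISS or Nikon parameters):

```python
def range_from_apparent_size(focal_px: float,
                             true_width_m: float,
                             image_width_px: float) -> float:
    """Pinhole-camera range estimate: Z = f * W / w.

    A single-camera ranging heuristic for a target of known physical
    size; full 6-DOF Perspective-n-Point solvers generalize this idea
    to multiple known points on the fixture.
    """
    return focal_px * true_width_m / image_width_px

# Hypothetical example: a 0.2 m fixture spanning 100 px through a
# lens with a 1000 px focal length sits about 2 m from the camera.
z = range_from_apparent_size(1000.0, 0.2, 100.0)
```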
Portable Microscope
The handheld digital microscope features a 3D-printed chassis to house its hardware, firmware, and rechargeable Li-ion battery with built-in power management. It incorporates an internal stainless-steel cage system to enclose and provide mechanical rigidity for the optics and imaging sensor. To reduce the microscope’s size, yet retain high spatial resolution, engineers devised an optical light path that uniquely folds back on itself using high-reflectivity mirrors, thus significantly reducing internal volume. Imaging control and acquisition are performed using a secure web-based graphical user interface accessible via any wireless-enabled device. The microscope serves as its own wireless access point, obviating the need for a pre-existing network. This web interface enables multiple simultaneous connections and facilitates data sharing with clinicians, scientists, or other personnel as needed. Acquired images can be stored locally on the microscope server or on a removable SD card, and data can be securely downloaded to other devices using a range of industry-standard protocols. Although the handheld digital microscope was originally developed for in-flight medical diagnosis in microgravity applications, prototypes were thoroughly ground-tested in a variety of environments to verify accurate resolution of microbial samples for identification and compositional analysis in terrestrial field use. Owing to its portability, other applications demanding rapid results may include research, education, veterinary medicine, military operations, contagion disaster response, telemedicine, and point-of-care medicine.
3D Lidar for Improved Rover Traversal and Imagery
The SQRLi system is made up of three major components: the laser assembly, the mirror assembly, and the electronics and data processing equipment (electronics assembly). The three main systems work together to send and receive the lidar signal, then translate it into a 3D image for navigation and imaging purposes. The rover sensing instrument makes use of a unique fiber optic laser assembly with high, adjustable output that increases the dynamic range (i.e., contrast) of the lidar system. The commercially available mirror setup used in the SQRLi is small, reliable, and has a wide aperture that improves the field-of-view of the lidar while maintaining a small instrument footprint. Lastly, the data processing is done by an in-house designed processor capable of translating the light signal into a high-resolution (sub-millimeter) 3D map. These components of the SQRLi enable successful hazard detection and navigation in visibility-impaired environments. The SQRLi is applicable to planetary and lunar exploration by uncrewed or crewed vehicles and may be adapted for in-space servicing, assembly, and manufacturing purposes. Beyond NASA missions, the new 3D lidar may be used for vehicular navigation in the automotive, defense, or commercial space sectors. The SQRLi is available for patent licensing.
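The in-house signal processor is not described in detail, but pulsed-lidar ranging generally follows R = c · t / 2 (the pulse travels out and back), and each return is placed in a 3D map by converting range plus the mirror's pointing angles into Cartesian coordinates. A minimal sketch of that conversion under a standard spherical-coordinate convention (an assumed geometry, not the SQRLi implementation):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_tof(round_trip_s: float) -> float:
    """Range from a pulsed-lidar round-trip time: R = c * t / 2."""
    return C * round_trip_s / 2.0

def to_cartesian(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Map one (range, azimuth, elevation) return to an (x, y, z) point."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return x, y, z

# Example: a return with ~66.7 ns round-trip time lies roughly 10 m out
# along the boresight (azimuth = elevation = 0).
point = to_cartesian(range_from_tof(66.7e-9), 0.0, 0.0)
```

Sub-millimeter resolution at these ranges implies timing the return to a few picoseconds, which is why the dynamic range of the laser assembly and the processing electronics matter so much.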
Adaptive Camera Assembly
NASA’s adaptive camera assembly possesses a variety of unique and novel features. These features can be divided into two main categories: (1) those that improve “human factors” (e.g., the ability for target users with limited hand, finger, and body mobility to operate the device), and (2) those that enable the camera to survive harsh environments such as that of the Moon. Some key features are described below. NASA’s adaptive camera assembly features an L-shaped handle that the Nikon Z9 camera mounts to via a quick-connect T-slot, enabling tool-less installation and removal. The handle contains a large tactile two-stage button for controlling the camera’s autofocus functionality as well as the shutter. The size and shape of the handle, as well as the location of the buttons, are optimized for use with a gloved hand (e.g., pressurized spacesuit gloves, large gloves for thermal protection, etc.). In addition, the assembly secures the rear LCD screen at an optimal angle for viewing when the camera is held at chest height. It also includes a button for cutting power, allowing a hard power reset following a radiation upset. Two large button plungers are present, which can be used to press the picture review and "F4" buttons of the Nikon Z9 through an integrated blanket system that provides protection from dust and thermal environments. Overall, NASA’s adaptive camera assembly provides a system to render the Nikon Z9 camera (a) easy to use by individuals with limited mobility and finger dexterity or strength, and (b) resilient in extreme environments.