
Tracking Space Debris in Earth’s Orbit With Centimeter Precision Using Efficient Laser Technology

Fighting the perils of space debris: Fraunhofer IOF’s fiber laser technology. Credit: Fraunhofer IOF

Uncontrollable flying objects in orbit are a massive risk for modern space travel and, given our dependence on satellites today, also a risk to the global economy. A research team at the Fraunhofer Institute for Applied Optics and Precision Engineering IOF in Jena, Germany, has now developed a special fiber laser that reliably determines the position and direction of movement of space debris, helping to mitigate these risks.

A short-pulse fiber laser suitable for LIDAR (light detection and ranging) applications for the centimeter-accurate detection of space debris. Credit: Fraunhofer IOF

Space debris is a massive problem in low Earth orbit space flight. Decommissioned or damaged satellites, fragments of space stations and other remnants of space missions pose a daily threat of collision with active satellites and spacecraft. Beyond their immediate destructive force, collisions create thousands of new pieces of debris, which in turn can collide with other objects – a dangerous snowball effect.

Today, the global economy depends to a substantial degree on satellites and their functions – in telecommunications, the transmission of TV signals, navigation, weather forecasting and climate research, for example. The damage or destruction of such satellites through a collision with other orbiting objects or rocket remains can cause immense and lasting harm. The hazardous space debris therefore needs to be reliably tracked and recorded before any salvaging or other counter-measures can be considered. Experts from Fraunhofer IOF in Jena have developed a laser system that is well suited to this task.

Reliable recording of the position and movement of objects in the Earth’s orbit

“With our robust and efficient system, we can reliably and accurately determine an object’s exact position and direction of movement in orbit,” explains Dr. Thomas Schreiber from the fiber lasers group at Fraunhofer IOF. “Laser systems like ours must be exceptionally robust in order to withstand the extreme conditions in space – in particular, the high physical strain on the carrier rocket during launch, where the technology is subjected to very strong vibrations. In low Earth orbit, the high level of radiation exposure, the extreme temperature fluctuations and the limited energy supply are just as great obstacles to overcome.” This necessitated the new development by the Jena research team, since common laser technologies cannot cope with these challenges.

Moreover, the system must be able to analyze space debris over comparatively long distances. For this purpose, the laser pulse is amplified in a glass-fiber-based amplifier before being sent on its kilometers-long journey.

Measurements with thousands of laser pulses per second

“Very short laser pulses, which last only a few billionths of a second, are shot at different positions in space to determine the speed, direction of motion and rotation of the objects,” explains Dr. Oliver de Vries. “With our laser system it is possible to fire thousands of pulses per second. If an object is actually at one of the positions examined, part of the radiation is reflected back to a special scanner, which is directly integrated into the system. Even though the laser beam is very fast, it takes some time for the emitted light to reach the object and return. This so-called ‘time of flight’ can then be converted into a distance and, accordingly, a real 3D coordinate.” The system’s sophisticated sensors, which collect the returning light, can detect even a few billionths of the emitted radiation.
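The distance conversion de Vries describes is straightforward to sketch. The snippet below is an illustration of the time-of-flight principle only, not the institute’s actual processing code; it also shows why centimeter precision demands pulses lasting just billionths of a second:

```python
# Illustrative time-of-flight ranging: convert a measured round-trip time
# into a distance, and show the timing resolution centimeter accuracy needs.

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the target; the light travels out and back, hence the /2."""
    return C * round_trip_seconds / 2.0

# A piece of debris 500 km away returns the pulse after roughly 3.3 ms.
t = 2 * 500_000.0 / C
print(f"round trip: {t * 1e3:.3f} ms")

# One centimeter of range corresponds to only ~67 picoseconds of round-trip
# time, which is why the pulses must last mere billionths of a second.
dt_per_cm = 2 * 0.01 / C
print(f"timing resolution for 1 cm: {dt_per_cm * 1e12:.1f} ps")
```

Converting that single time of flight into a full 3D coordinate then only requires combining the distance with the known pointing direction of the scanner.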

The principle – originally developed by the two Fraunhofer IOF researchers for Jena-Optronik and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt, DLR) – has already been successfully tested during a space transporter’s docking maneuver at the International Space Station (ISS). The laser system had previously been installed in a sensor from the Thuringian aerospace company Jena-Optronik GmbH and was launched in 2016 with the autonomous supply transporter ATV-5. Jena-Optronik’s system also excels in energy efficiency: the fiber laser operates at a total power of less than 10 watts – significantly less than a commercial laptop, for instance.

Source: Phys.org. Read more at: https://phys.org/news/2017-09-tracking-debris-earth-orbit-centimeter.html

S-MAD Drone Can Land and Perch On Vertical Walls Like a Bird


Last week, researchers at the University of Sherbrooke presented the Sherbrooke Multimodal Autonomous Drone (S-MAD) at the Living Machines 2017 conference at Stanford University, where it won the “Best Robotics Paper Award.” According to NewAtlas, the Createk Design Lab and Sherbrooke researchers looked to birds and their last-minute perching instincts and abilities and infused those into their unmanned aerial vehicle.

Not only that, but the researchers went with a fixed-wing approach – again, hewing closer to a bird’s anatomy as opposed to the typical rotor-based drone design. That makes sense, as the Living Machines conference is all about the symbiotic relationship between nature and machinery, rewarding the teams most creative and effective in their designs.

The Sherbrooke researchers reportedly ran thousands of aerodynamic simulations to perfect the design’s flight and perching capabilities before succeeding in building a fixed-wing drone with the thrust and pitch control required to pull the maneuver off. We have previously reported on drones that attempt to both hover and cruise, combining the behaviors of airplanes and conventional rotor drones in one UAV.

That effort was highly impressive, and the S-MAD is yet another addition to the list of drones that look to other machines (or, in this case, animals) to become something more than just another hovering UAV.

In the demonstration video, the motor brakes after detecting an impact, successfully perching the device against a flat surface. This is achieved thanks to the S-MAD’s wall-detection range sensor. According to NewAtlas, the S-MAD flies toward a wall at 7 to 9 meters per second; a laser-based sensor then detects the surface and slows the device down to 1 to 3 meters per second. At that point it begins shifting from a horizontal to a vertical attitude, increasing its thrust to remain tilted upward.

Once the device hits the wall, its suspension helps avoid any damage. Microfiber feet of sorts grip the wall, and the propeller turns off. The S-MAD’s researchers claim this will work on a wide array of surfaces, such as brick, concrete and stucco. The drone can remain perched against the surface until it’s time to fly away, at which point it simply takes off again.
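The perching sequence described above can be sketched as a simple state machine. The speed thresholds come from the article; the state names, transition logic and distance values are our own illustrative assumptions, not the S-MAD flight code:

```python
# Sketch of the S-MAD perching sequence as a state machine driven by
# speed and laser-range readings. State names and thresholds other than
# the article's 7-9 m/s approach and 1-3 m/s contact speeds are invented.
from typing import Optional

def perching_step(state: str, speed: float,
                  wall_distance: Optional[float]) -> str:
    """Advance the perching state machine by one sensor update.

    wall_distance: laser range reading in meters, or None if no wall seen.
    """
    if state == "cruise":
        # Laser range sensor picks up the wall: start braking.
        if wall_distance is not None:
            return "decelerate"
    elif state == "decelerate":
        # Slowed to 1-3 m/s: pitch from horizontal to vertical, thrust up.
        if speed <= 3.0:
            return "pitch_up"
    elif state == "pitch_up":
        # Contact: suspension absorbs impact, microfiber feet grip,
        # propeller turns off.
        if wall_distance is not None and wall_distance < 0.1:
            return "perched"
    return state

# Simulated approach: cruise at 8 m/s, spot the wall, brake, touch down.
s = "cruise"
for speed, dist in [(8.0, None), (8.0, 5.0), (2.5, 2.0), (2.0, 0.05)]:
    s = perching_step(s, speed, dist)
print(s)  # → perched
```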

Pretty impressive stuff, and something that could be put to a wide array of uses. Motherboard says aerial monitoring after an earthquake or assisting building inspections could be primary applications for the S-MAD, and those scenarios make perfect sense.

New Sensor System Helps Track ISIS Explosives


A new use for a bomb-detecting technology: a system that identifies the dangerous opioid fentanyl from a distance using a laser beam, protecting soldiers and law enforcement officers and helping with prosecutions.

Fentanyl — 100 times more potent than morphine — can be dangerous even to touch. Officers are often sickened by fentanyl exposure during busts.

“Farther, faster, safer,” said Ed Dottery, owner of Alakai Defense Systems, describing the advantages of the technology. Dottery said his sensor system is already being used by the US military to protect troops against truck bombs deployed by the Islamic State of Iraq and Syria (ISIS).

It works by shooting a laser beam at an object, then reading the change in the beam when it bounces back. Those changes signal whether there are chemicals used to make explosives.
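Alakai has not published its algorithms, but the matching step such a sensor must perform can be sketched as comparing the spectrum of the returned light against reference signatures of known chemicals. Everything below – the signature values, band counts and threshold – is made up for illustration:

```python
# Illustrative spectral matching: score a measured spectrum against
# reference signatures and report a chemical only above a confidence
# threshold. All numbers here are hypothetical.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two spectra (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical reference signatures: intensity per spectral band.
references = {
    "ammonium_nitrate": [0.1, 0.8, 0.2, 0.05],
    "fentanyl":         [0.6, 0.1, 0.7, 0.3],
}

def best_match(measured, references, threshold=0.95):
    """Name of the best-matching chemical, or None if nothing is confident."""
    name, score = max(
        ((n, cosine_similarity(measured, ref)) for n, ref in references.items()),
        key=lambda item: item[1],
    )
    return name if score >= threshold else None

print(best_match([0.59, 0.11, 0.72, 0.29], references))  # → fentanyl
```

The need for confidence before declaring a match is echoed later in the article, where the company notes it usually fires 100 to 200 laser shots to be sure its algorithms are right.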

According to tampabay.com, a portable version of the system would have great value in law enforcement narcotics and bomb investigations, especially those involving the deadly drug fentanyl, making technology like this important for officer safety.

Dottery said his system could prevent officers from having to don protective gear because they could detect the presence of the drug without getting near it.

Still, while promising, the technology has a long way to go before it can be used by law enforcement to detect fentanyl. The current version is too bulky and too expensive to be practical, one sheriff said.

Alen Tomczak, an Alakai field engineer who was present for the Sheriff’s Office test, said the company is not trying to sell a system to either agency at this point because more research is needed. But the initial result was promising, Tomczak said.

The test, he said, only took one laser shot of the fentanyl. “To make it official, we usually take 100 or 200 shots to make sure our algorithms are right,” he said.

Alakai is looking to collaborate with law enforcement to refine and develop the system, Tomczak said. That includes finding enough fentanyl to conduct further testing.

3DR partners with DJI to launch drone operations platform


Dive Brief:

  • 3D Robotics (3DR) is teaming up with global drone maker DJI to make its Site Scan surveying technology compatible with that company’s drones, opening the software to a wider market, 3DR announced Tuesday.
  • Now, DJI drones can be used to gather data with the Site Scan Field mobile app, which offers features including autonomous flight modes, multi-engine cloud processing, topographic surveys and volumetric calculations.
  • 3DR also announced the launch of its Enterprise Atlas platform, allowing large AEC firms to establish and manage drone operations on the job site and subsequently analyze the data collected from project operations.
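3DR has not disclosed how Site Scan computes its volumetric calculations, but the idea behind such a calculation on drone-survey data can be sketched as summing a gridded surface model over a known cell area. The grid values and cell size below are hypothetical:

```python
# Illustrative volumetric calculation on drone survey output: sum the
# height of each cell of a gridded surface model over a known cell area.

def stockpile_volume(heights_m, cell_area_m2):
    """Volume (m^3) above the base plane for a grid of surveyed heights.

    heights_m: 2D grid of surface heights (meters) relative to the base.
    cell_area_m2: ground area covered by one grid cell (m^2).
    """
    return sum(h for row in heights_m for h in row) * cell_area_m2

# Hypothetical 3x3 patch of a stockpile, 0.25 m^2 per cell.
grid = [
    [0.0, 1.0, 0.0],
    [1.0, 2.0, 1.0],
    [0.0, 1.0, 0.0],
]
print(stockpile_volume(grid, 0.25))  # → 1.5
```

Real photogrammetry pipelines first reconstruct that surface model from overlapping drone photos; the volume step itself is this simple once the grid exists.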

Dive Insight:

The AEC-focused 3DR is growing its foothold in the drone space as the technology takes off in the construction industry. In May, the company scored $53 million in Series D funding from investors including Atlantic Bridge, Autodesk and True Ventures. At the time, the Berkeley, CA–based company said it planned to invest that capital into expanding its flagship product, Site Scan, and marketing it toward construction and engineering firms.

With this latest development, 3DR seems to be making good on that promise.

Since its deployment in March 2016, Site Scan — a collaboration between 3DR, Autodesk and Sony — has, through its single-platform format, attempted to transform how AEC firms capture and process aerial data.

Such practices are becoming more popular across the job site as project teams look for more efficient means of gathering, visualizing and analyzing construction data. The goal, for many, is to get ahead of potential errors, stay in line with project schedules and better estimate and control costs.

Drone use is on the rise, particularly within the construction and infrastructure sectors. According to a 2016 report from the Association for Unmanned Vehicle Systems International, those two sectors are among the biggest drone users across all industries. That pattern is likely to hold, with estimates from PricewaterhouseCoopers putting the market value of drone-powered solutions for infrastructure alone at $45.2 billion.

The rise of new tech like drones on the job site requires new ways to integrate the various data-gathering hardware at work — in the air and on the ground.

The rise of mixed-fleet telematics is spreading from earth-moving equipment to surveying drones, opening up new opportunities for data collection. In 2015, Komatsu announced a partnership with drone company Skycatch to automate and streamline on-site project data transfer.

Top image credit: 3D Robotics

Airborne Laser Spectroscopy System Can Map Atmospheric Gases


An experimental instrument that combines precision laser spectroscopy and a mobile airborne reflector to map atmospheric gas concentrations could lead to safer methods of detecting hazardous emissions and better modeling of Earth’s atmosphere, its inventors report.

“These measurements are important fundamentally for understanding how gases [mix] and spread in the atmosphere,” said Kevin Cossel, a physical chemist at the National Institute of Standards and Technology (NIST) in Boulder, Colo., and technical lead for the project. “In the future, we also hope to be able to use the system to help detect and quantify hazardous gases,” he added.

To measure gases, the instrument uses a sophisticated type of multifrequency light source, known as a laser comb, to shoot eye-safe lasers through a gas plume at 50,000 slightly different wavelengths simultaneously. The laser sits on an adjustable platform coupled to a telescope. A second component, called a retroreflector, which is a kind of mirror that bounces light back exactly in the direction from which it came, makes it possible to redirect light fired through a portion of the atmosphere straight back to the instrument for analysis.
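A much-simplified sketch of how such an open-path absorption measurement recovers a gas concentration uses the Beer–Lambert law, I = I0·exp(−σNL). The NIST instrument’s actual retrieval is far more involved, fitting tens of thousands of comb lines simultaneously; the cross-section, path length and absorption figures below are hypothetical:

```python
# Illustrative Beer-Lambert retrieval: number density of a gas from the
# ratio of emitted to transmitted laser intensity over a known path.
import math

def concentration(I0: float, I: float, sigma: float, path_m: float) -> float:
    """Number density (molecules/cm^3) of the absorbing gas.

    sigma:  absorption cross-section at this wavelength, cm^2/molecule.
    path_m: one-way distance from laser to retroreflector, meters; the
            light traverses the plume twice, out and back.
    """
    path_cm = 2 * path_m * 100.0  # round trip, in cm
    return math.log(I0 / I) / (sigma * path_cm)

# Hypothetical case: 10% of the light absorbed over a 250 m one-way path
# at a wavelength where the target gas has sigma = 1e-20 cm^2.
N = concentration(I0=1.0, I=0.9, sigma=1e-20, path_m=250.0)
print(f"{N:.3e} molecules/cm^3")
```

Measuring this ratio at 50,000 wavelengths at once is what lets the comb system identify which gases are present as well as how much of each.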

Until recently, the team had to position the retroreflector by hand on the far side of whatever volume of gas was to be probed, along a line of sight from the laser. Having to set the mirror by hand limited the system to measuring gas concentrations along a single open-air path, Cossel said.

Now, however, the inventors have taken a step that they say makes laser sensing of gas plumes much more versatile. They have mounted the retroreflector on a small remote-controlled drone called a multicopter, which uses helicopter-like propellers to hover and steer. Using a remote control, the scientists can fly the mirror-carrying drone wherever they wish and make it hover to receive a flash of laser light passing through a gas plume at any angle.

With this new mobility, “we’re excited about showing the capability of doing spatial mapping,” said Nathan Newbury, a NIST physicist and the principal investigator for this project. “There’s a lot of interest if we could go vertical to map up to the [atmospheric] boundary layer,” he added.

Ken Davis, professor of atmospheric and climate science at Pennsylvania State University in University Park, expressed his excitement about the possibility of using this system to map “trace gases from sources that are complex in space and time. The mobile reflectors on the copter make that possible.”

Small DARPA drones fly without human or GPS help


 

Small unmanned aerial vehicles (UAVs) will soon navigate without GPS or human assistance, as demonstrated by several successful test runs last month under the Fast Lightweight Autonomy (FLA) program.

DARPA recently announced the breakthrough, after four days of testing in central Florida. The test runs marked progress toward development of small, quadcopter drones that are able to fly through obstacle-ridden environments without the guidance of a human operator or GPS navigation. Advanced software algorithms and sensors enabled the drones to navigate obstacle courses to find target objects, according to a DARPA press release.

The drones would be able to help troops remotely assess operational areas that are underground or inside buildings where GPS can’t reach, or where the enemy is jamming electronic signals. They would collect full-motion video or still images and deliver them back to operators, according to a DARPA public affairs official.

“Small, low-cost unmanned aircraft rely heavily on tele-operators and GPS, not only for knowing the vehicle’s position precisely, but also for correcting errors in the estimated altitude and velocity of the air vehicle,” explained JC Ledé, Program Manager for FLA at DARPA. “In FLA, the aircraft has to figure all of that out on its own with sufficient accuracy to avoid obstacles and complete its mission.”

Several different approaches and technologies for navigation in GPS-denied environments were demonstrated during the test period.

One approach, described by Dr. Camillo J. Taylor, Professor of Computer and Information Science at the University of Pennsylvania, uses Stereo Visual Inertial Odometry and Light Detection and Ranging (LIDAR) technology.

Stereo Visual Inertial Odometry requires two stereo camera sensors attached to the front of the quadcopter, according to Taylor. A stereo camera has multiple lenses and separate image sensors for each lens, which means that it can use triangulation algorithms to imitate the human eye’s ability to perceive distance and three-dimensionality.

“We prefer to use the stereo approach because it resolves the scale ambiguity,” said Taylor.

“Basically with two ‘eyes’ we can triangulate… and that helps us know how far metrically we’ve gone.”
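The triangulation Taylor describes can be sketched in its simplest rectified-stereo form, where metric depth follows directly from the disparity between the two views. This is an illustration of the geometry, not the FLA flight code, and the camera numbers are hypothetical:

```python
# Illustrative rectified-stereo triangulation: Z = f * B / d, where the
# disparity d is how far a feature shifts between left and right images.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Metric depth of a feature seen by a rectified stereo pair.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras, meters
    disparity_px: horizontal shift of the feature between images, pixels
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# A feature shifted by 20 px, seen by cameras 12 cm apart with f = 400 px,
# lies 2.4 m ahead.
print(depth_from_disparity(focal_px=400.0, baseline_m=0.12, disparity_px=20.0))  # → 2.4
```

Because the baseline B is a known physical length, the depth comes out in meters directly – this is the scale ambiguity that a single camera cannot resolve on its own.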

Inertial measurement navigation can also make use of cold-atom interferometry technology, which uses software algorithms to monitor the acceleration and rotation of a compacted mass of atoms contained within a sensor, according to DARPA.

The quadcopter features a “nodding” LIDAR sensor, so called because it scans back and forth like a nodding head, explained Taylor. The LIDAR system uses a near-infrared laser that emits electromagnetic pulses and times their reflections to calculate the distance and 3D shape of objects in its path, according to a previous Defense Systems report.

Another approach explored at the testing site took inspiration from the way that people give each other directions.

“One of the big challenges we face when we can’t operate with GPS and don’t have any RC link to a base station is that we have very limited knowledge, prior to launch, of what the environment will look like,” said Dr. Andrew Browning, Principal Investigator at Scientific Systems Company/AeroVironment.

To address that issue, one drone prototype pre-programs geographic or object cues into the drone’s software so that it knows, for example, to turn left when it senses a dumpster. Another navigation approach relies on real-time sensors and decision-making, said Dr. Nicholas Roy, Principal Investigator at Draper/Massachusetts Institute of Technology. Eventually, artificial intelligence learning technology may be incorporated as well.

The FLA program that sponsored the autonomous quadcopter projects wants its drones to use lightweight, off-the-shelf technology and fly at speeds of up to 45 mph, according to DARPA’s press release.

“The goal of FLA is to develop advanced algorithms to allow unmanned air or ground vehicles to operate without the guidance of a human tele-operator, GPS, or any datalinks going to or coming from the vehicle,” said Ledé.

The challenge is to develop state-of-the-art software programs that offer a more complex solution than simply increasing the computing power of the technology, as computing hardware adds unwanted weight to the quadcopter. Ideally, the algorithms for the autonomous drone would run on a single computing board similar to that of a smart phone, said Ledé.

Now that this round of testing is complete, small autonomous drones that don’t rely on GPS will continue to be developed in Phase 2 of the program.

  • By Katherine Owens
  • Jul 10, 2017
