Military Wants More Capable Sensor Inputs for UAS

As unmanned aircraft become increasingly common, warfighters continue to press for more imagery from a broader range of high-resolution sensors, putting more pressure on those who design the systems that collect images and send them to analysts.

System developers are responding by deploying different types of sensors and electronics that analyze and compress images before they’re transmitted.

Larger UAS are carrying far more sensors while mission durations increase and distances lengthen, putting pressure on system designers to trim weight and power requirements. At the same time, engineers are providing systems that reduce an analyst’s workload by doing pre-processing that helps analysts pinpoint the areas they want to examine.

That’s being done with a range of techniques, led by a trend toward collecting input from multiple sensors to provide more insight into what’s happening in a target area. For example, one newer technique is to analyze the spectrum of reflected light to determine whether a metal target is an aluminum aircraft or a steel car.

“We’re using hyperspectral imaging. We use a spectrometer to get the signatures of different objects,” said Neil Peterson, business development director at Raytheon Intelligence, Surveillance and Reconnaissance Systems. “If you’re looking for a downed airplane, you can look for its elemental composition, for example. This lets you detect things in very complex environments.”
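
As a rough illustration of the kind of signature matching Peterson describes, the sketch below compares a measured pixel spectrum against a small library of reference signatures using the spectral angle; the band count, reflectance values and library entries are invented for illustration, and operational libraries are far larger.

```python
import numpy as np

def spectral_angle(measured, reference):
    """Return the angle (radians) between two spectra; smaller means a closer match."""
    cos_theta = np.dot(measured, reference) / (
        np.linalg.norm(measured) * np.linalg.norm(reference)
    )
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Hypothetical reference signatures (reflectance per band); real libraries
# hold hundreds of bands measured under controlled conditions.
library = {
    "aluminum": np.array([0.82, 0.80, 0.78, 0.77, 0.75]),
    "steel":    np.array([0.55, 0.54, 0.52, 0.50, 0.49]),
    "foliage":  np.array([0.05, 0.08, 0.45, 0.48, 0.50]),
}

pixel = np.array([0.80, 0.79, 0.77, 0.75, 0.74])  # measured spectrum for one pixel

best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(f"Closest spectral match: {best}")
```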

Users and product developers are also adopting different technologies to overcome the limitations of the cameras that form the basis of many ISR missions. When vehicles can’t get close to their targets, using multiple inputs can bring dramatic improvements in image quality.

“Environmental factors like pollution, humidity and light rain are the biggest challenges,” said David Strong, vice president of marketing at FLIR Government Systems. “That’s driving us to short wave infrared [SWIR], which provides longer range than you can achieve with alternatives. SWIR sees both emitted heat energy and reflected IR light.”

In some instances, a technology such as SWIR can replace alternatives such as thermal imagers. For example, these sensors can track heat-generating targets, including people.

“Blending SWIR and visible imagery makes it easier to follow a person walking in and out of shadows,” Strong said. “SWIR follows the heat signature when the target is lost in shadows.”
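
A minimal sketch of the kind of blending Strong mentions might look like the following, assuming the SWIR and visible frames are already co-registered, same-size arrays scaled to [0, 1]; the blend weight and toy frames are arbitrary.

```python
import numpy as np

def blend_swir_visible(visible, swir, swir_weight=0.4):
    """Weighted blend of co-registered visible and SWIR frames (float arrays in [0, 1])."""
    if visible.shape != swir.shape:
        raise ValueError("Frames must be co-registered and the same size")
    blended = (1.0 - swir_weight) * visible + swir_weight * swir
    return np.clip(blended, 0.0, 1.0)

# Toy 2x2 frames: the visible channel loses the target in shadow (low values),
# while the SWIR channel still carries a usable return.
visible = np.array([[0.9, 0.1], [0.1, 0.8]])
swir = np.array([[0.7, 0.6], [0.6, 0.7]])
print(blend_swir_visible(visible, swir))
```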

As a growing number of sensors provide higher resolution, the volume of data is overloading the channels that send data to ground stations. Design engineers are responding by letting sensors handle more processing, exploitation and dissemination (PED) of data, giving analysts images that are more likely to provide useful information.

“We’re doing more on-board PEDs, with sensors that detect important data instead of flooding the pipe with tons of data,” said Tom Breen, director of strategic planning at Goodrich ISR Systems.

Algorithms that determine which images to send use techniques such as watching for changes deemed to be significant. UAS storage systems still store all imagery, giving analysts the opportunity to review all files if they want to more fully understand the surroundings.
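
One very simple form of the change-watching approach described above is frame differencing: flag a frame for downlink only when enough pixels differ from the previous frame. The thresholds in this sketch are illustrative, not drawn from any fielded system.

```python
import numpy as np

def worth_sending(previous, current, pixel_threshold=0.1, changed_fraction=0.02):
    """Flag a frame for downlink if enough pixels changed versus the prior frame.

    previous, current: grayscale frames as float arrays in [0, 1].
    pixel_threshold:   per-pixel change needed to count as "changed".
    changed_fraction:  fraction of changed pixels needed to trigger transmission.
    """
    diff = np.abs(current - previous)
    fraction = np.mean(diff > pixel_threshold)
    return fraction >= changed_fraction

# All frames are still written to the onboard store; only flagged frames go over the link.
```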

“When you mine the data smartly, you can send five selected images instead of sending 6 GB of data. Still, you need to let the operator expand those JPEGs and look at larger images and related images that might be useful,” said Robert Robinson, senior program manager for airborne reconnaissance systems at Lockheed Martin Information Systems and Global Solutions-Defense.

This data mining can take many forms. For example, the software may search for a white pickup truck or apply other parameters that operators select, reducing the workload for analysts.
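
The sketch below shows one way such operator-selected search parameters might be applied to detection metadata generated onboard; the record fields, classifier outputs and values are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    object_class: str   # e.g. "pickup_truck", "sedan", "person"
    color: str          # dominant color reported by the onboard classifier
    frame_id: int

def matches(detection, criteria):
    """Return True if a detection satisfies every operator-selected criterion."""
    return all(getattr(detection, field) == value for field, value in criteria.items())

detections = [
    Detection("pickup_truck", "white", 101),
    Detection("sedan", "blue", 102),
    Detection("pickup_truck", "red", 103),
]

query = {"object_class": "pickup_truck", "color": "white"}
hits = [d for d in detections if matches(d, query)]
print([d.frame_id for d in hits])   # -> [101]
```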

“Smart sensors are essential,” Peterson said. “Things like automated feature extraction help mitigate the need for more analysts, which is critical because manpower is one of the biggest costs.”

Design engineers are also applying more data compression, adopting commercial technologies that dramatically reduce bandwidth requirements.

“You can’t cost effectively downlink uncompressed video, so we use MPEG-4,” Strong said. “MPEG-4 gives operators 95 percent of the image quality when they’re viewing a 10 Mb/s stream versus 1.5 Gb/s for an uncompressed stream.”
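
Taking Strong’s figures at face value, the saving works out to roughly 150:1; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check on the figures Strong cites (bits per second).
uncompressed_bps = 1.5e9   # ~1.5 Gb/s for an uncompressed stream
compressed_bps = 10e6      # ~10 Mb/s MPEG-4 stream

ratio = uncompressed_bps / compressed_bps
print(f"Compression ratio: {ratio:.0f}:1")          # -> 150:1

# Data volume for one hour of video at each rate, in gigabytes.
seconds = 3600
print(f"Uncompressed: {uncompressed_bps * seconds / 8 / 1e9:.0f} GB/hour")   # ~675 GB
print(f"MPEG-4:       {compressed_bps * seconds / 8 / 1e9:.1f} GB/hour")     # ~4.5 GB
```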

With flight, there are always tradeoffs between payload, distance and duration. Design engineers are doing everything they can to squeeze more high-resolution sensors into UAS without affecting the distance or duration of missions. The ongoing reduction in the size of electronic components plays a huge role in this effort.

“Radar systems that were over 500 pounds now weigh 150 to 300 pounds, and their electronic power requirements have gone from around 6 kilowatts to less than 1 kilowatt,” Robinson said.

While reducing weight has obvious implications for fuel consumption, many developers focus just as much on power consumption. When semiconductor designers squeeze more computing capabilities onto a chip, power consumption and heat generally rise. System designers who reduce these power requirements can gain significant benefits.

“If you can get rid of a generator and a power supply or two, you can really increase the duration of a flight,” Strong said.

Reducing power consumption also reduces heat, which has a direct effect on the lifetime of electronic components. Although it seems simple to keep systems cool at high altitudes, engineers face many problems when they try to cool the high-density modules used in high-resolution radars and cameras.

“You’ve got a lot of cold air at 25,000 feet, but its density is very thin so you’re not getting a lot of air over the chip,” Robinson said. “Some people use fans, but we use liquid cooling to remove heat from hot spots.”
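
Robinson’s point about thin air can be put in rough numbers with the standard-atmosphere density model: at 25,000 feet the air is only about half as dense as at sea level, so a given airflow carries away far less heat. The sketch below uses the usual ISA troposphere approximation.

```python
import math

def isa_density(altitude_m):
    """Approximate air density (kg/m^3) in the ISA troposphere (valid below ~11 km)."""
    T0, P0 = 288.15, 101325.0        # sea-level temperature (K) and pressure (Pa)
    L, g, R = 0.0065, 9.80665, 287.05  # lapse rate, gravity, gas constant for air
    T = T0 - L * altitude_m
    P = P0 * (T / T0) ** (g / (R * L))
    return P / (R * T)

sea_level = isa_density(0.0)
at_25k_ft = isa_density(25000 * 0.3048)
print(f"Sea level: {sea_level:.3f} kg/m^3")
print(f"25,000 ft: {at_25k_ft:.3f} kg/m^3 ({at_25k_ft / sea_level:.0%} of sea level)")
```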

As sensors advance, developers are also doing more with the positioning systems that aim these sensors where analysts want them. Mounts are being improved and motors are getting more precise.

“There are a lot of mechanical components that let you drive the gimbal and keep it isolated from vibration. When you’re flying at 25,000 feet and trying to point at something on an angle, very minor variations can make a huge difference,” Peterson said.

As microprocessors get faster, they can take over most of the control functions and constantly adjust the position of sensors so they stay focused on their targets. At the same time, automated controls remove the need for manual adjustments.

“Aircraft always have pitch and yaw and vibration, so you need a lot of magic in the gimbal so you can point the camera at an air crew in trouble or a specific car and stay focused,” Strong said.

When stable positioning systems are combined with navigation systems that predict where the vehicle will be, electronic controls can handle the task of focusing on specific targets so operators don’t have to manipulate joysticks.
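
As a highly simplified illustration of the pointing problem, the sketch below computes the gimbal azimuth and depression angles needed to hold a camera on a fixed ground point from a known aircraft position, using a flat-earth approximation; real systems also fold in aircraft attitude, lever arms and the predictive navigation described above.

```python
import math

def pointing_angles(ac_north_m, ac_east_m, ac_alt_m, tgt_north_m, tgt_east_m):
    """Return (azimuth_deg, depression_deg) from aircraft to a ground target.

    Flat-earth, local-level frame; azimuth measured clockwise from north,
    depression measured down from horizontal.
    """
    d_north = tgt_north_m - ac_north_m
    d_east = tgt_east_m - ac_east_m
    ground_range = math.hypot(d_north, d_east)
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360.0
    depression = math.degrees(math.atan2(ac_alt_m, ground_range))
    return azimuth, depression

# Aircraft at 25,000 ft (~7,620 m) looking at a target 10 km north, 5 km east.
az, dep = pointing_angles(0.0, 0.0, 7620.0, 10000.0, 5000.0)
print(f"Azimuth {az:.1f} deg, depression {dep:.1f} deg")
```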

Electronic controls are also helping analysts keep track of target areas when they zoom in and out or focus on a different section of the image. Advanced systems also let operators do things like outlining target areas with boxes so they can quickly focus on the area of interest.

Source: Defense Systems
