
Video Script: The Truth About Thermal Sensor Resolution

 

The biggest misunderstanding in thermal imaging today is that the bigger the sensor, the higher the spatial resolution, and therefore the further you can see. This is absolutely not true. It's a misunderstanding that is rife in the industry, and we're here today to set this straight.

Sensor size does not control thermal resolution or how far you can see. Lens size and pixel spacing on the thermal chip are what determine the resolution of the thermal imager. A 384, 640, 1024, and a 1280 sensor will all have the same spatial resolution if they all share a 12-micron pixel spacing and the same lens size.

Larger sensors give you a wider field of view, not a higher resolution. My name is Paul Alisauskas, and I head up the thermal development program here at FYRLYT.

Let's look at the fundamentals. A thermal scope consists of a lens, a sensor, and a display. The lens is a primary driver for field of view. If we have a 12-micron spaced chip with a 50mm lens, a 640 sensor will give a field of view of 8.8 by 7 degrees, while a 384 sensor gives 5.3 by 4 degrees.
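
If you want to check these figures yourself, the field of view follows from simple lens geometry: the physical width of the array (pixel count times pixel pitch) viewed through the focal length of the lens. Here's a minimal sketch in Python; the 512 and 288 vertical pixel counts are our assumption of the usual 640x512 and 384x288 array formats behind the 8.8x7 and 5.3x4 degree figures.

```python
import math

def fov_degrees(pixels: int, pitch_um: float, focal_mm: float) -> float:
    """Full field of view along one axis: 2 * atan(sensor size / (2 * focal length))."""
    sensor_mm = pixels * pitch_um / 1000.0   # physical size of the array along that axis
    return 2 * math.degrees(math.atan(sensor_mm / (2 * focal_mm)))

# 12-micron pitch, 50 mm lens
print(fov_degrees(640, 12, 50), fov_degrees(512, 12, 50))   # ~8.8 x 7.0 degrees
print(fov_degrees(384, 12, 50), fov_degrees(288, 12, 50))   # ~5.3 x 4.0 degrees
```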

You'll notice that the base magnification is higher on a 384 scope than a 640. With a 640 scope, we're effectively projecting the image untouched onto the display. We could project the 384 image at the same base magnification, but it would only fill part of the LCD screen, roughly 60 percent of its width. So we expand the 384 image to fill the screen, by a ratio of roughly 1.7.

This is where the misunderstanding originates. When you look through a 384 in a store, it has already started its digital magnification journey. This is why the images look different at base magnification, and it is what has confused the market.

Spare me a few minutes, and I'll explain how this works so you can work it out for yourselves, challenge the misunderstanding, and potentially save yourself and your friends thousands of dollars.

The spatial resolution of a thermal imager is a simple concept: it is thermal pixels per degree of the field of view. This is simply the pixels of the sensor divided by the resulting field of view.

For a standard 640 sensor with a 12-micron pitch, we have 640 pixels divided by 8.8 degrees, which equals 72.73 pixels per degree. For a 384 sensor with a 12-micron pitch, we have 384 pixels divided by 5.3 degrees, which also equals 72.45 pixels per degree. They are effectively identical.
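
As a sketch of that arithmetic (reusing the same field-of-view geometry as above), pixels per degree is just the pixel count divided by the field of view it produces:

```python
import math

def pixels_per_degree(pixels: int, pitch_um: float, focal_mm: float) -> float:
    """Spatial resolution: sensor pixels divided by the horizontal field of view."""
    fov = 2 * math.degrees(math.atan(pixels * pitch_um / 1000.0 / (2 * focal_mm)))
    return pixels / fov

print(round(pixels_per_degree(640, 12, 50), 2))   # ~72.86 (72.73 using the rounded 8.8 degree figure)
print(round(pixels_per_degree(384, 12, 50), 2))   # ~72.77 (72.45 using the rounded 5.3 degree figure)
```

Whether you use the exact or the rounded field-of-view numbers, the two scopes land within a fraction of a pixel per degree of each other.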

This concept is what's used for thermal detection, recognition, and identification (DRI). The more pixels we have per degree, the further out we can detect, recognize, and identify a target. That is the spatial resolution.

If you look at our data, you'll see a 384, a 640, and even a 1024 sensor (all with 12-micron spaced chips and a 50mm lens) will have the same spatial resolution and indeed the same detection distances. The manufacturers are literally telling you they have the same thermal resolution, but it's misunderstood because of the confusion with base magnification.
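
To make the link between pixels per degree and those published distances concrete, here's a rough sketch in the spirit of the Johnson criteria: a target counts as detected, recognised, or identified once it spans some minimum number of pixels. The 0.5 m critical dimension and the 2/8/13-pixel thresholds are illustrative assumptions of ours, not any manufacturer's exact test conditions.

```python
import math

PPD = 640 / 8.8   # pixels per degree for a 12-micron 640 sensor behind a 50 mm lens

def range_for_pixels(target_m: float, pixels_needed: float, ppd: float) -> float:
    """Distance at which a target of the given critical size spans `pixels_needed` pixels."""
    pixels_per_radian = ppd * 180 / math.pi
    return target_m * pixels_per_radian / pixels_needed

# Illustrative thresholds only; published criteria vary by standard and manufacturer.
for label, needed in [("detect", 2), ("recognise", 8), ("identify", 13)]:
    print(label, round(range_for_pixels(0.5, needed, PPD)), "m")   # ~1042, ~260, ~160 m
```

Whatever thresholds you plug in, the only optical quantity in the formula is pixels per degree, so a 384 and a 640 with the same pitch and lens return exactly the same distances.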

So, what do you really get when you buy a bigger sensor? You're literally buying a wider field of view and a lower base magnification. You are not increasing your detection and recognition distances. You are not increasing resolution.

A very important point to note: no software, no algorithm, and no improvement in thermal sensitivity can make up for a lack of pixels on target. Spatial resolution is King.

In the professional shooting space, most people shoot optically at 9x, 10x, and above. With a 384 sensor, you would set the digital zoom to approximately 2x. With a 640, you would set it to 3x. Either way, that roughly equals a 9-10 power optical scope. If you never plan to shoot below the equivalent of a 5x optical zoom and have no problems with target acquisition, there is no benefit in going to a 640 thermal scope as far as spatial resolution is concerned.

This is going to be a bitter pill for some to swallow. Be ready for a strong reaction when you explain this to people who are uneducated on the subject and can't get their heads around the idea that "big must be better." With thermal scopes, "bigger is better" does hold true for the lens. When it comes to the pixels on the sensor itself, though, smaller is actually better. If we could shrink from 12-micron to 6-micron spacing, we would tighten up the field of view and increase our detection distances. The problem is that pushing pixels closer together increases digital noise, and to do this effectively we'd have to cryogenically cool the chip, which is not practical.
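
To see why a smaller pitch would help, run the same field-of-view formula with the pitch halved. These are purely hypothetical numbers; no such 6-micron, 50mm scope is being claimed here.

```python
import math

def fov_degrees(pixels, pitch_um, focal_mm):
    return 2 * math.degrees(math.atan(pixels * pitch_um / 1000.0 / (2 * focal_mm)))

for pitch in (12, 6):
    fov = fov_degrees(640, pitch, 50)
    print(f"{pitch} micron: {fov:.1f} degrees, {640 / fov:.0f} pixels per degree")
# 12 micron: 8.8 degrees, 73 pixels per degree
# 6 micron:  4.4 degrees, 146 pixels per degree
```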

If you come across any strong objection, point toward the detection distances on these scopes. As long as they're sharing the same pixel spacing and lens size, you're quickly going to see that the spatial resolution is absolutely the same.

You may ask where the 1.7 zoom factor comes from. Divide 640 by 384 and you get roughly 1.67. That's how much we blow the 384 image up to display on the same LCD.

Now, look at the new 1280 sensors. They have pushed their lens sizes up from 50mm to 60mm. The sensor is twice the size, but the thermal resolution has only gone up by 20%, which is simply the ratio of the lens sizes (60 divided by 50). We've gone from 73 thermal pixels per degree to 87.6. That's a 20% improvement, which is not significant given the chip is twice the size and you've generally paid twice as much.
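
Running the same pixels-per-degree calculation for the 1280 format makes the comparison explicit (assuming it keeps the same 12-micron pitch quoted above):

```python
import math

def pixels_per_degree(pixels, pitch_um, focal_mm):
    fov = 2 * math.degrees(math.atan(pixels * pitch_um / 1000.0 / (2 * focal_mm)))
    return pixels / fov

ppd_640 = pixels_per_degree(640, 12, 50)    # ~72.9
ppd_1280 = pixels_per_degree(1280, 12, 60)  # ~87.7
print(round(ppd_1280 / ppd_640, 2))         # ~1.2, the 60/50 lens ratio, not the 1280/640 sensor ratio
```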

We're not anti-larger sensors. We are anti-exploiting the market's ignorance. We don't want the market confused into thinking that when they buy a bigger sensor, they can see further. The best way to improve spatial resolution is with the lens. Going to a 60mm lens absolutely will improve the thermal resolution, but increasing the pixel count on the chip itself doesn't; it just buys a wider field of view.

In closing, some key points to remember:

  1. Pixel spacing on the chip and the lens size are what determine your spatial resolution.

  2. Larger sensors buy you field of view. They do not buy you longer detection and recognition distances or higher spatial resolution.

  3. We can only make a triangle from three thermal dots. No algorithm can turn that into a fluffy bunny rabbit.

  4. If you move from an older 17-micron chip to a 12-micron chip, you are buying a tighter field of view and a correspondingly longer detection distance (a quick comparison follows these points).
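
As a final check on point 4, here's the pitch comparison run with the same formula, assuming a 640 sensor and a 50mm lens on both chips:

```python
import math

def pixels_per_degree(pixels, pitch_um, focal_mm):
    fov = 2 * math.degrees(math.atan(pixels * pitch_um / 1000.0 / (2 * focal_mm)))
    return pixels / fov

print(round(pixels_per_degree(640, 17, 50), 1))   # ~51.5 px/deg on the older 17-micron chip
print(round(pixels_per_degree(640, 12, 50), 1))   # ~72.9 px/deg on the 12-micron chip
```

That roughly 40 percent jump in pixels per degree is where the longer detection distances come from when you drop from 17-micron to 12-micron pitch.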

Thank you for listening. If you have any questions, please go to fyrlyt.com.
