In the Trail Camera Detection Range Shootout, a test subject walks parallel to the test cameras from each direction at 10-foot intervals. This continues out to 120 feet.
In this trail camera comparison, each camera was awarded one point for each full picture (greater than 60% of the test subject's body in frame) captured on each pass. We also recorded empty and partial pictures to compute an efficiency rating for each camera.
Overall Score: The total number of pictures the camera took with the person completely inside the field of view.
Efficiency Rating: The percentage of pictures that captured a valid image (60% or more of the test subject) relative to the total number of images taken (valid + partial + empty).
This rating has no bearing on how we ranked the cameras; it is provided purely for your convenience.
Detection Range: The farthest distance at which the camera detected the test subject during the Trail Camera Detection Range Shootout.
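The scoring above boils down to simple arithmetic. Here is a minimal sketch of how the efficiency rating is computed; the function name and the example counts are hypothetical illustrations, not actual Shootout data.

```python
def efficiency_rating(valid: int, partial: int, empty: int) -> float:
    """Percentage of valid pictures (60% or more of the subject in frame)
    relative to all pictures taken (valid + partial + empty)."""
    total = valid + partial + empty
    return 100.0 * valid / total if total else 0.0

# Example: a camera that took 45 valid, 10 partial, and 5 empty pictures.
# Its Overall Score would be 45 (one point per valid picture), and its
# efficiency rating would be 45 / 60 = 75%.
print(efficiency_rating(45, 10, 5))  # 75.0
```

A camera with a fast recovery time racks up more valid pictures (points) per pass, while a camera that fires on empty or partial frames drags its efficiency rating down without adding to its score.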
Trailcampro Commentary & Notes
We get a lot of questions about the Shootout and how to interpret the results for certain cameras. So in this section, we will try to explain some of the nuances of the tests and add commentary for things that aren't readily apparent.
The #1 thing you need to take into account is that the Detection Shootout is the ultimate test of a camera's ability to detect and record activity. However, this is not the only consideration you should make when buying a camera. This test does not account for picture quality, battery life, video detection, case design, programming, etc. For an overall ranking of cameras that accounts for everything, we urge you to visit our Trail Camera Reviews. This page has all the cameras ranked from the highest overall score to the lowest.
The Shootout has many variables, the most obvious being the ambient temperature on the day of testing (60° F this year). This is why it is imperative that we test the cameras on the same day, at the same time. This gives us an apples-to-apples comparison.
We put 66 cameras on the board this year. We tried to have two of every camera. If we didn't have a second unit of a particular model, we included a camera that uses the same detection circuit.
This year we had multiple trail camera manufacturers present for testing (Bushnell, Moultrie, Spypoint, and Stealth Cam). They were able to observe and go over the settings of their cameras to ensure proper setup for maximum detection. We enjoyed having them and welcome any and all trailcam manufacturers to these tests in the future.
Every year we have folks ask us why certain cameras didn't make it into the Shootout. At the end of the day, we want to have as comprehensive a list as possible. If a certain camera didn't make it into testing, it either wasn't available at the time, didn't work on day of testing, was inadvertently left out (we are human and forget things), was a duplicate of another model being tested, or the company didn't show interest in being in our tests. If you are reading this and are a trailcam manufacturer that wants your cameras in our testing for next year, just email us (email@example.com) or call us (1-800-791-0660) and we will include any models in the test you like.
Quick Hits on the Brands We Tested
Bushnell - The Impulse cameras were both aimed a touch low; however, they still performed extremely well. Overall, the Bushnell cameras were impressive.
Browning - Browning did exactly as we expected. Excellent detection circuits, with good efficiency ratings.
Covert - Covert scored as they have in past tests. Very typical results. They would benefit greatly from faster recovery times so they could get more pictures (points) per pass.
Cuddeback - They have asked that they no longer be included in any of our tests, reviews, or analysis. In fact, due to their request, we are no longer allowed to even mention their name on our site (this includes telephone calls as well). We include them in these basic tests each year because there is consumer demand to know how they stack up against other cameras. At this time, we choose not to offer commentary on how their cameras performed.
Spartan - Most of the Spartan cameras performed how we expected. They would also benefit from faster recovery times.
Moultrie - The Moultrie M8000s did very, very well: very high efficiency ratings, and they detected out past 100 feet. These cameras were set to the 3-photo triggered setting, which is not a burst mode but keeps the camera's PIR hyper-active after each detection.
Reconyx - The Hyperfire 2 scored similarly to last year. It is consistent and scored in the middle of the pack. It would benefit from longer detection range.
Spypoint - Is there such a thing as too fast? For the second year in a row, nobody else can match Spypoint's speed, consistency, and detection range - all while maintaining a solid efficiency rating. They are the clear winner here and it isn't remotely close.
*The Link Evos defaulted to "medium sensitivity." Changing that setting would have required the Spypoint app, but we ran the Shootout in an area with no phone reception. They would have scored closer to the Link-S units had they been set to "high sensitivity."
Stealth Cam - The Stealth Cam models do OK, but their recovery times and lack of long detection range keep them from scoring higher.
Wildgame Innovations - These are very typical results. Slow recovery times and short detection range hamper their scores.
Weather Conditions on Morning of Testing
We started at 0715 on the morning of the tests. The temperature was in the low 60s (Fahrenheit) and the skies were cloudy.