
2017 Trail Camera Shootout

In the Detection Shootout, a test subject walks parallel to the test cameras from each direction at 10-foot intervals, continuing out to 120 feet.

Each camera was awarded a point for each full picture (greater than 60% of the body) of the test subject captured on each pass. We also recorded empty and partial pictures to compute an efficiency rating for each camera.

Overall Score: The total number of pictures the camera took with the person completely inside the field of view.

Efficiency Rating: The percentage of valid images (60% or more of the test subject in frame) out of the total number of images taken (valid + partial + empty).

This rating has no bearing on how we ranked the cameras; it is provided purely for your convenience.

Detection Range: The farthest distance at which the camera detected the test subject during the Detection Shootout.
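To make the scoring concrete, here is a minimal sketch of how the Overall Score and Efficiency Rating relate. The picture counts are hypothetical examples, not actual test results for any camera:

```python
# Hypothetical scoring sketch for one camera in the Detection Shootout.
# All counts below are illustrative, not real test data.

# Pictures taken across all passes, classified by how much of the
# test subject appears in the frame.
valid_pictures = 42    # subject fills >= 60% of the body in frame (1 point each)
partial_pictures = 6   # subject only partially in frame (no points)
empty_pictures = 2     # subject missing from frame entirely (no points)

# Overall Score: one point per valid picture.
overall_score = valid_pictures

# Efficiency Rating: valid pictures as a percentage of all pictures taken.
total_pictures = valid_pictures + partial_pictures + empty_pictures
efficiency_rating = 100 * valid_pictures / total_pictures

print(overall_score)             # 42
print(round(efficiency_rating))  # 84
```

So a camera that fires often but frames the subject poorly takes a lower Efficiency Rating even if its Overall Score is respectable, which is why we report both numbers.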

Trailcampro Commentary & Notes

We get a lot of questions about the Shootout and how to interpret the results for certain cameras. So in this section, we will try to explain some of the nuances of the tests and add commentary for things that aren't readily apparent.

General Notes

  • The #1 thing to keep in mind is that the Detection Shootout is the ultimate test of a camera's ability to detect and record activity, but it is not the only consideration when buying a camera. This test does not account for picture quality, battery life, video detection, case design, programming, etc. For an overall ranking of cameras that accounts for everything, we urge you to visit our Trail Camera Reviews. That page ranks all the cameras from the highest overall score to the lowest.
  • Across the board, cameras detected extremely well this year. In fact, the top-scoring camera had 4 times the overall score of last year's top camera. This is evidence of the R&D these companies have committed to over the last 12 months. 
  • The Shootout has many variables, the most obvious being the ambient temperature on the day of testing. This is why it is imperative that we test the cameras on the same day, at the same time. This gives us an apples-to-apples comparison.
  • We tested 96 cameras this year. We tried to have doubles of every camera. If we didn't have a double of a particular model, we had another camera with the same detection circuit. For instance, we did not have two Moultrie M-40s and two Moultrie M-40is; they share the same detection circuit, so we had one of each. We published the results from the duplicate that scored higher.
  • This year we had multiple trail camera manufacturers present for testing (Bushnell, Moultrie, Primos, Spypoint, and Wildgame Innovations). They were able to observe and go over the settings of their cameras to ensure proper setup for maximum detection. We enjoyed having them and welcome any and all trailcam manufacturers to these tests in the future.
  • Every year we have folks ask us why certain cameras didn't make it into the Shootout. At the end of the day, we want to have as comprehensive a list as possible. If a certain camera didn't make it into testing, it either wasn't available at the time, didn't work on the day of testing (only 1 camera malfunctioned this year), was inadvertently left out (we are human and forget things), was a duplicate of another model being tested, or the company didn't show interest in being in our tests. Several cameras we had in the 2016 Detection Test haven't changed in 2017, so you can view their results from last year. If you are reading this and are a trailcam manufacturer that wants your cameras in our testing for next year, just email or call us (1-800-791-0660) and we will include any models you like in the test.

Quick Hits on the Brands We Tested

Boly - Boly contacted us and wanted their cameras in the Shootout this year. This is a Chinese manufacturing company that makes cameras under several lesser known brand names. Their Fisheye camera detected out a whopping 120 ft. but didn't take many pictures per pass. Still an impressive feat.

Bushnell - Bushnell is still the king of the jungle in this test. The fact that they swept the top 3 spots didn't surprise us, but the Aggressor Low Glow's overall score certainly did. That is a monster number.

Browning - Browning did exactly as we expected. Really good detection circuits, with good efficiency ratings. The question we expect to hear the most is why the Strike Force Pro didn't score higher. The detection zone on that camera is somewhat narrower than the field of view. So it consistently and accurately detected all the way out to 100 ft. (impressive) but didn't allow for more than 2-3 pictures per pass.

Covert - Covert scored as they have in past tests. Very typical results. They would benefit greatly from faster recovery times so they could get more pictures (points) per pass. Their new 12.1 cell cameras did a very good job in the test, ranking well, but their non-cell cameras were near the bottom.

Cuddeback - They have asked that they no longer be included in any of our tests, reviews or analysis. In fact, due to their request, we are no longer allowed to even mention their name on our site (this includes telephone calls as well). We include them in these basic tests each year because there is consumer demand to know how they stack up to other cameras. At this time, we choose to not offer commentary of how their cameras performed.

HCO - Most of the HCO cameras performed how we expected. The disappointing result is the new detection circuit in the Spartan US Cellular, which has very poor detection range. However, their AT&T cameras (same circuit as the Verizon models) still hold the top cellular camera score.

Moultrie - The P-180i utilizes its wide angle to score very well. Impressive unit, to say the least. The big surprise here is the outstanding result of the Moultrie A-30i. Not many $99 cameras are expected to be that fast. 

Primos - All the Primos cameras scored respectably and consistently. Nothing flashy in their results, but their consistency made an impression on us.

Reconyx - The notable standout here is the terrible detection range on the MR-5. We knew it wouldn't go out to 80 ft., but one picture at 40 ft. really limits the setups you can use the MR-5 on. This camera is most useful high on a tree or building facing down to a single spot (door, gate, feeder, etc.). The rest of their cameras show the same detection circuit they have had the last several years. No improvements or downgrades. They would benefit from a longer detection range.

Spypoint - Spypoint continues to be a force in the Shootout. Their cameras showed improved detection range this year, resulting in even higher scores. We know their cameras are scorching fast, but now we know they can detect out 70-80 ft. 

Stealth Cam - The Stealth Cam GXW continues to impress us as a quality cell camera with good detection. The G34 Pro and G45NG are steady and consistent. Not flashy, but very capable.

Wildgame - The results here are deceiving. Yes, they are at the bottom of the list, but they consistently took one picture at every pass out to 60 or 80 ft. (depending on the model). Their cameras have a slow recovery time, but good trigger speeds and solid/consistent detection range. One tweak to their recovery times would boost their scores. With multiple Wildgame reps in attendance, I think you could see that happen at our 2018 Detection Shootout. This is a simple fix that will drastically change their rankings.

Weather Conditions on Morning of Testing

We started at 0730 on the morning of the tests. The temperature was in the 50s (Fahrenheit) and the skies were clear.



Trail Cameras purchased from us come with:

Free Shipping to the Lower 48  |  2-Year Warranty  |  90-Day Returns