
Head-to-Head Review: Best Trail Cameras of 2013

by Jon E. Silks   |  February 11th, 2014

Few technologies have impacted our sport more than trail cameras. Take a minute to think of all the ways these tiny scouts work their way into your life as a bowhunter. I challenge you to turn on the Sportsman Channel and count the minutes it takes until you see or hear someone reference a trail camera; if you get past 30 minutes, it would be shocking.

Have you hunted with an outfitter in the past 10 years? They generally have photo albums full of trail-camera pictures and often use those same photos in brochures, on websites, on Facebook pages and elsewhere. Plus, there is no doubt your chances of success are greater if you are hunting with an outfitter who employs an army of trail cameras, because there is no better way to pattern game. Outfitters are also excellent sources of advice on choosing cameras and using them effectively.

Of course, some people still argue that you should just put in the time and figure things out the old-fashioned way; and to an extent, I agree. Connecting with nature, understanding the lay of the land and being able to read deer sign helps greatly with camera placement. And the cameras will enhance your understanding as you start to see how the pieces of the puzzle fall into place. I encourage collaboration of old-school and new-school technologies; a little of Uncle Ted’s “Mystical Flight of the Arrow” mixed with advanced circuitry, like the high-definition camera that filmed him in the hunt you watched on TV.

Game cameras also fit the lifestyle many of us lead today. We tend to be extremely busy with all the things that fill up a life—family, work, health, taking care of the house, shuttling kids all over creation and the list goes on. Game cameras are like little clones that allow you to be in the woods 24/7, keeping an eye on your hotspots regardless of other demands on your time. When you finally catch a break and head to the field, your time is maximized because you at least have some idea of what’s going on thanks to your cameras. These benefits are amplified for individuals who hunt property far from home.

All of that brings us right back to the actual cameras. What is it that makes one camera better than another? It could be technology, quality or even application. If you have land in another state you only get to once or twice a year, then you would most likely put a premium on battery life as a starting point for camera selection.

If you are watching a lone trail in the middle of the woods, where the animal is not likely to stop for anything, you may want a camera with a fast trigger speed and quick recovery time, which would yield as many pictures as possible. Watching a food plot? Fast trigger speed may not be your focus; instead, extended sensor range could be of prime interest, even if it comes with a slower trigger. Or maybe you are on a budget and need significant adjustability so you can use one camera in several applications.

Whatever the case, the test results (see the gallery below) are meant to help you get further down the road in choosing the type and model camera that best suits your needs. We tested 13 of the year’s top models, evaluating their performance in key areas using a head-to-head format. We performed the following six tests:

Daylight Walkthrough Test
The Daylight Walkthrough Test was performed by passing a subject in front of the cameras at seven specific distances (15, 25, 35, 45, 55, 65 and 75 feet) and at two known speeds. A consistent, measurable speed was achieved using our Bucky Pace Car System, which consists of a rigged-up Renzo decoy, a rope stretched across a small field and a drill motor regulated by a DC power supply and volt meter. The drill motor is attached to a line that pulls Bucky along the rope on a set of pulleys. By controlling the voltage, we were able to control the speed, which we set for two scenarios—a subject walking casually (1.94 fps) and one on a mission (5.94 fps).

Since trail cameras require both heat and motion to trigger their sensors, we had Chad Smith walk alongside Bucky. During the test, each camera had 14 opportunities to capture images (seven distances and two runs—one fast and one slow).

Field of view (FOV) was calculated through two known values—the 25-foot distance to the camera and the 10-foot markers in the picture (our scale). A Staedtler Mars 12-inch triangular engineer’s scale was used to measure the width of the actual picture and the on-screen width of the 10-foot markers. The picture width was adjusted to scale for the FOV.

Detection angle was also measured at 25 feet and was calculated by using the engineer’s scale to measure the distance from the subject to the edge of the picture. This was subtracted from the distance from the picture’s midpoint to its edge. The distance the subject traveled between the trigger and the picture being taken was calculated using our known speed and the unit’s tested trigger speed.
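The measurements above boil down to a scale-and-similar-triangles calculation. As a sketch, here is how the FOV and trigger-travel numbers work out; the on-screen marker widths and the trigger time below are made-up stand-ins for illustration, not values from the test:

```python
import math

# Hypothetical measurements for illustration; actual values varied by camera.
distance_ft = 25.0       # subject-to-camera distance used for the measurement
marker_span_ft = 10.0    # real-world span of the 10-foot markers in frame
marker_width_in = 3.0    # on-screen width of those markers (engineer's scale)
picture_width_in = 9.0   # on-screen width of the full picture

# Scale the picture width to real-world feet, then derive the lens FOV angle.
picture_span_ft = picture_width_in * (marker_span_ft / marker_width_in)
fov_deg = 2 * math.degrees(math.atan((picture_span_ft / 2) / distance_ft))

# Distance traveled between trigger and exposure = subject speed x trigger time.
speed_fps = 5.94         # the "on a mission" pace from the walkthrough test
trigger_time_s = 0.5     # hypothetical tested trigger speed
travel_ft = speed_fps * trigger_time_s

print(f"Picture span at 25 ft: {picture_span_ft:.1f} ft")  # 30.0 ft
print(f"Lens FOV: {fov_deg:.1f} degrees")                  # 61.9 degrees
print(f"Travel after trigger: {travel_ft:.2f} ft")         # 2.97 ft
```

A camera whose detection angle is much narrower than that 62-degree lens FOV would tend to center subjects; one with a wider detection angle would trigger more reliably but produce more edge-of-frame and blank shots.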

The relationship between a camera’s lens FOV and infrared detection angle significantly impacts its performance. Cameras with a detection angle significantly narrower than the lens’ FOV tend to capture images with the subject relatively close to the center of the frame.

However, such units also may not detect animals that move in and out of the periphery of the camera’s FOV. Conversely, cameras with detection angles that are wider than the lens’ FOV don’t center subjects as well and may result in multiple “blank” pictures with no animals. But the upside is that the wide detection angle makes it less likely that an animal will pass within range without triggering the camera.

The Results
The first picture each camera captured was at the slow speed and at a distance of 25 feet. In the case of the Covert, we used the 35-foot slow picture because the subject was out of frame at 25 feet. That gives an overall feel for the quality of the pictures and the field of view.

All 13 cameras in the Daylight Walkthrough Test were capable of detecting motion out to at least 45 feet. When we moved back to 55 feet, the cameras from Primos, Simmons, Stealth Cam and Wildview dropped out. When we moved back to 65 feet, Browning, Moultrie and Wildgame dropped out. And when we moved back to 75 feet, Reconyx dropped out. That left five of the 13 cameras—Bushnell Trophy Cam HD Max, Covert Extreme Black 60, Eyecon Storm, Minox DTC 600 and SpyPoint BF-6—that successfully detected motion at our maximum test range of 75 feet.

This test also gave us the total number of pictures each camera captured during the 14 passes (seven distances, two speeds). The four with the most pictures were: Bushnell Trophy Cam HD Max (50), Reconyx HC600 (50), Moultrie M-880 (18) and Minox DTC 600 (17). Although the ability to capture more pictures does not necessarily make a camera better, such models may be ideal for use along game trails where subjects are quickly entering and exiting the camera’s FOV.

Flash Effectiveness Test
Our Flash Effectiveness Test measured the distance each camera’s flash is able to effectively illuminate a subject for nighttime photos. We set up the cameras one at a time on a moonless night with a series of 3-D targets laid out in front of them and activated each camera two times. We had a stegosaurus at 15 feet, gobbler at 25, bedded ram at 35, fallow deer at 45, antelope at 55, wolf at 65 and mule deer at 75.

Trigger Speed Test
Once again this year, our trigger speed testing was conducted by the experts at TrailCamPro.com, who generously volunteered their time, knowledge and use of their Triggernator machine. TrailCamPro’s technical wizard, Charles, invented the Triggernator and describes it this way:

“The Triggernator,” nicknamed by the staff at TrailCamPro, is a device to accurately measure the trigger time of scouting cameras. Trigger time is the time differential between a target being detected and a picture taken. To accurately measure this time, the Triggernator passes a heated target by the center of the PIR detector of the test camera and at the same time starts a stopwatch displayed on a computer monitor. The test camera will take a picture of the running stopwatch and give us the trigger time. We run the test many times looking for consistency and then average the test results to obtain the average trigger time of that particular game camera.”
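The "run many times, check consistency, then average" step Charles describes can be sketched in a few lines. The sample times below are invented for illustration; the idea is simply to discard any run that disagrees with the rest before averaging:

```python
import statistics

# Hypothetical trigger times (seconds) read off the on-screen stopwatch.
# One run (0.48) is inconsistent with the rest and gets discarded.
samples = [0.21, 0.19, 0.20, 0.22, 0.20, 0.48, 0.19]

median = statistics.median(samples)                           # 0.20
consistent = [t for t in samples if abs(t - median) <= 0.05]  # drops 0.48
avg_trigger = statistics.mean(consistent)

print(f"Average trigger time: {avg_trigger:.3f} s")           # 0.202 s
```

The 0.05-second consistency window is an arbitrary choice for this sketch; the article does not say how TrailCamPro judges consistency.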

Recovery Time Test
Like the Trigger Speed Test, the Recovery Time Test was conducted by the experts at TrailCamPro.com. Recovery time measures how quickly a camera can store the first picture and be ready to capture a second picture.

Nighttime Blur Test
Our Nighttime Blur Test was set up in the same way as our Daylight Walkthrough Test, except it was done at night and each camera was tested individually rather than in a group. This was done to prevent infrared flash interference from camera to camera. Once a camera was set, the motorized retrieval system was used to bring the subject past the lens on the slow speed setting (1.94 fps) at a distance of 25 feet. This was a one-pass test, with only one picture from each camera used for data.

Why do this test? If you have been using trail cameras for a while, you know how frustrating it is to get a picture of what may be a huge animal, only to have blur keep you from deciphering any detail.

Nighttime Trigger Range Test
The Nighttime Trigger Range Test measured the maximum distance at which a camera will trigger on a rapidly moving subject—at night. This was accomplished by someone walking briskly in front of the camera in 10-foot increments until the camera was no longer triggered. Once the distance was outside a camera’s range, we moved back toward the camera in one-foot increments until motion was detected.
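That walk-out-coarse, step-back-fine procedure is essentially a one-sided search. A minimal sketch, assuming a hypothetical `detects(distance_ft)` stand-in for "did the camera trigger at this distance?":

```python
def max_trigger_range(detects, step_out_ft=10, limit_ft=200):
    """Return the farthest distance (ft) at which the camera still triggers."""
    d = step_out_ft
    while d <= limit_ft and detects(d):  # walk away in 10-ft increments
        d += step_out_ft
    # d is now the first mark with no trigger; step back a foot at a time
    while d > 0 and not detects(d):
        d -= 1
    return d

# Example: a camera whose true nighttime range is 57 feet.
print(max_trigger_range(lambda ft: ft <= 57))  # 57
```

This assumes detection is roughly all-or-nothing at a given distance; in the field, borderline distances would warrant repeated passes.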

You may ask why we conducted this test at night. The answer is simple: the red infrared flash LEDs on the cameras would light up, letting us know the camera had detected movement. Models with black (no-glow) LEDs were very hard or impossible to see. For those cameras, we had to remove the data cards and inspect the pictures, if there were any, to determine whether there was detection at that range.

  • Mark

    Where is the rest of the article?

    • eric_conn

      If you look at the gallery (bottom of article), the results from the tests are listed next to each camera. Thanks for providing us your feedback.

  • Lightning

    Exactly, Mark. Sounds like a great test if we could see the results. The first few paragraphs give specifics; then, at flash effectiveness, we get “as you can see in the photos” and no photos. The rest of the article as posted describes testing without giving results. I use mainly Reconyx and Bushnell, and flash range is a big issue for me. I think the Reconyx illumination is more even, less beaming, but that’s anecdotal, no measurements.

    • eric_conn

      If you look at the gallery (bottom of article), the results from the test are listed next to each camera. Thanks for providing us your feedback.

  • VanillaGorilla

    Being that the title says “Best Trail Cameras of 2013” and not best trail camera, I’m assuming that the 13 cameras are what they consider the best, and they are just listing a few of the pros and cons of each. JMO, I could be totally wrong.

  • Wengman

    Where is the battery life test? How convenient not to help you decide..What a joke…

    • eric_conn

      Battery life is listed in the gallery (bottom of article) next to each camera. Thanks for providing feedback.

  • T2

    What about Cuddeback???

    • dinkus

      Cuddeback pulled their products from allowing TrailCamPro to review them. In fact, they said Cuddeback won’t even allow them to say their name, which really steers me away from buying anything they sell. Sounds like they are trying to hide the truth.

  • Joel

    A chart of the specifications and pros/cons of each camera would have been immensely more helpful than a photo gallery with the info about each camera next to it. It’s incredibly difficult to compare units in this fashion.

  • http://www.besttrailcameraguide.com/ trailcamguy

    I think the Moultrie M-880 is a great trail camera for the money.

  • AH

    Are we to assume that the order in which you display the cameras is the order in which they are recommended by you? This appears to be a rather incomplete study (and report) of the items listed. I think I shall have to search for other reviews. Why bother doing this if you are only going to do half a job? By the way, you’d be far better off just giving results (i.e., “such-and-such model detected a whitetail-sized target at XX feet, moving at XX fps, etc.”), rather than long-winded descriptions about how ingenious your testing methodology was. Criteria are important, but to blow on and on about your method is a waste of space when you have actually performed an incomplete test and report, and is just plain silly. I shall move on to find something more credible.

  • Greg Roberts

    Thanks for making a quantitative comparison. I took the effort to put all the information into a spreadsheet. As was said before, it would have been nice if you did that for us. It appears to me that you are comparing apples to oranges: the prices of the trail cameras you compared ranged 5X, from $100 to $550. You spent a lot of care designing and running the tests, so clearly you understand the importance of a well-designed set of tests. It would be much better if you compared comparably priced cameras. That is what is really telling.
