I've only chronographed two loads with my new chronograph, so the issue I describe MAY only be characteristic of those two loads, BUT...
How large does the standard deviation get, as a percentage of average velocity, before you conclude that the load is inconsistent?
For example, if I have a load in .357 Magnum that averages 1260 f/s with a standard deviation of 63 f/s, then that's a 5% standard deviation. I'd continue to look for better but, for a pistol round, at pistol distances, I could LIVE with this. That same percentage of deviation in a 3000 f/s rifle load, on the other hand, seems so large that it would probably render the rifle too inaccurate to be depended on for much of anything beyond shouting distance.
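For anyone who wants to run the same numbers on their own chronograph strings, here's a quick sketch of the arithmetic. The velocities below are made up for illustration, not real load data:

```python
# Compute standard deviation as a percentage of average velocity
# from a string of chronograph readings (f/s). Velocities are hypothetical.
from statistics import mean, stdev

velocities = [1251, 1322, 1198, 1270, 1260]  # e.g. a 5-shot .357 Mag string

avg = mean(velocities)
sd = stdev(velocities)        # sample standard deviation (n - 1 divisor)
sd_pct = 100 * sd / avg       # SD as a percent of the average

print(f"avg = {avg:.0f} f/s, SD = {sd:.0f} f/s ({sd_pct:.1f}% of average)")
```

Note that most chronographs report the sample standard deviation (dividing by n − 1), which is what `statistics.stdev` does; `statistics.pstdev` would give the slightly smaller population figure.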
I'm ALSO aware that low variation of velocity is no guarantee of good accuracy. It's darned difficult, however, to obtain good accuracy without it.
We all HOPE for single-digit S.D.s, but must often settle for low double-digit S.D.s. At what percentage do we decide that a certain load is consistent enough for use, or not?