John Lott Responds on Gun Control Law Study
FYI (reference links in original, copy below):
http://volokh.com/archives/archive_2005_01_07.shtml#1105644864
************************************************************
John Lott Responds to some posts (linked to at the end of
this one) that criticize his work in light of the National
Academy of Sciences report on gun control laws:
Last month, the National Academy of Sciences issued
a 328-page report on gun control laws. The big
news that has been ignored on all the blog sites is
that the academy’s panel couldn’t identify any
benefits of the decades-long effort to reduce crime
and injury by restricting gun ownership. The only
conclusion it could draw was: Let’s study the
question some more.
The panel has left us with two choices: Either
academia and the government have wasted tens of
millions of dollars and countless man-hours on
useless research (and the panel would like us to
spend more in the same worthless pursuit), or the
National Academy is so completely unable to
separate politics from its analyses that it simply
can’t accept the results for what they are.
Based on 253 journal articles, 99 books, 43
government publications, and some of its own
empirical work, the panel couldn’t identify a
single gun control regulation that reduced violent
crime, suicide or accidents.
From the assault weapons ban to the Brady Act to
one-gun-a-month restrictions to gun locks, nothing
worked. (For many of these laws, I was the first person to investigate their effects empirically, and I too was unable to find evidence that they reduced violent crime.)
The study was not the work of gun-control
opponents. The panel was set up during the Clinton
administration, and, of its members whose views on guns were publicly known before their appointments, all but one had favored gun control. Something
that I wrote up about the panel three years ago is
still relevant.
While the panel dealt with a broad range of gun
control issues, only one issue has received
attention on different blogs: right-to-carry laws.
In fact, the panel apparently originated in the desire of some to respond to the debate on that issue, and specifically to my research concluding that allowing law-abiding citizens to carry concealed weapons reduces crime. I
originally overheard Phil Cook and Dan Nagin
discussing the need for a panel to “deal with” me
in the same way that an earlier panel had “dealt with” Isaac Ehrlich’s work showing that the death penalty deterred murder. They agreed, and Nagin
said that he would talk to Al Blumstein about
setting up such a panel. Needless to say, that is
what ended up happening.
1) James Q. Wilson’s highly unusual dissent is very interesting (only two of the last 236 reports issued over the past 10 years have carried a dissent).
Wilson states that all the research provided
“confirmation of the findings that shall-issue laws
drive down the murder rate . . . ” Wilson has served on four of these panels, including the previous panel that attacked Isaac Ehrlich’s work showing that the death penalty represented a deterrent, and he never before thought it necessary to write a dissent.
Wilson said that the panel’s conclusion raises
concerns given that “virtually every reanalysis
done by the committee” confirmed right-to-carry
laws reduced crime. He found the committee’s only
results that didn’t confirm the drop in crime
“quite puzzling.” They accounted for “no control
variables” – nothing on any of the social,
demographic, and public policies that might affect
crime. Furthermore, he didn’t understand how
evidence that was not published in a
peer-reviewed journal would be given such weight.
The non-results are basically due to dropping all the control variables (particularly the arrest rate, which is not defined when the crime rate is zero). When that happens, a lot of observations with zero crime rates enter the sample. The problem with using
OLS when you have all these zero crime rates is
that if a crime rate is already zero, no matter how
good the law is, it can’t lower the crime rate any
further. There is thus a positive bias in these
results. Plassmann’s two papers (his piece in the
Journal of Law and Economics with Nic Tideman and
his paper with Whitley in the Stanford Law Review)
show how you can address this as a count data
problem. Although his research consistently shows statistically significant results that shall-issue laws reduce crime, the National Academy report ignores it.
The panel’s discussion of Duggan’s results focuses on the regressions that have no control variables and that use OLS estimates despite a large number of zero values for the crime rates.
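To make the mechanics concrete, here is a minimal sketch in Python with statsmodels, using simulated county data (all names and numbers are illustrative, not taken from the actual data set or from Plassmann’s specifications). It contrasts OLS on per-capita rates, where observations already at zero cannot register any further decline, with a Poisson count model of the general kind those papers point to (population entering as exposure):

# Illustrative sketch only: simulated data, not the actual county panel.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
pop = rng.integers(1_000, 50_000, size=n)       # county population
law = rng.integers(0, 2, size=n)                # right-to-carry dummy
# Assume the law cuts the expected murder count by roughly 20 percent.
mu = np.exp(-9.0 + np.log(pop) - 0.2 * law)
murders = rng.poisson(mu)                       # many small counties draw zero murders

X = sm.add_constant(law.astype(float))

# OLS on the per-capita rate: a county already at zero cannot record a
# lower rate, so the mass of zeros acts as a floor on the dependent variable.
rate = murders / pop
ols = sm.OLS(rate, X).fit()

# Count-data alternative: Poisson regression with population as the exposure
# models the murder count directly, so zero counts are just low draws
# rather than a floor.
pois = sm.GLM(murders, X, family=sm.families.Poisson(), exposure=pop).fit()

print("OLS law coefficient (rate difference):", ols.params[1])
print("Poisson law coefficient (log effect): ", pois.params[1])  # near -0.2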
2) As an interesting aside, there are a number of
factual mistakes in the NAS report and those
mistakes work against my findings. For example,
Figure 6.1 mistakenly shows an increase in violent crime of 7 percent in year one, when the amount is 5 percent (7 minus 2, where the 2 comes from the trend). (Of course, the overall problem with the hybrid approach is discussed below.) There are drops in crime in Table 6-3 that are statistically significant, but they are not properly marked to indicate that this is so. Even something as trivial as the number of states currently with right-to-carry laws is wrong: it is 36, not 34 (and 37 if Minnesota is included).
3) Last year there was a debate between Ayres and Donohue and me over the use of clustering, but the statements of the NAS panel correspond extremely closely to what was written in my original paper with David Mustard.
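For readers unfamiliar with the term, “clustering” here means cluster-robust standard errors that allow regression errors to be correlated within a state. A minimal illustrative sketch (simulated data, not the actual panel or specification):

# Hypothetical illustration of cluster-robust ("clustered") standard errors.
# The panel below is simulated; names and numbers are placeholders only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
states = np.repeat(np.arange(50), 20)              # 50 states x 20 years
rtc_law = (states % 2 == 0).astype(float)          # half the states adopt the law
state_shock = rng.normal(0, 0.5, 50)[states]       # errors correlated within a state
y = -0.05 * rtc_law + state_shock + rng.normal(0, 0.2, states.size)
panel = pd.DataFrame({"log_rate": y, "rtc_law": rtc_law, "state": states})

model = smf.ols("log_rate ~ rtc_law", data=panel)
naive = model.fit()                                # assumes independent errors
clustered = model.fit(cov_type="cluster",
                      cov_kwds={"groups": panel["state"]})

# Clustering by state typically widens the standard error on the law dummy.
print(naive.bse["rtc_law"], clustered.bse["rtc_law"])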
4) p. 127: “We focus on the conflicting results .
. .” No attempt is made to give readers an idea of
the frequency or importance of unusual results.
Take the results in Table 6-3. For Plassmann and Whitley, the panel doesn’t mention that Plassmann and Whitley say that there are “major problems” with the particular regressions that the panel decides to report and, more importantly, that the effects in those regressions are biased toward zero (see point 1 above). For Moody’s results, the panel shows only two specifications out of all the results that he reports and doesn’t mention that the one weird result he got came from a specification that he had flagged as problematic and that did not control for other factors.
Even with the very selective sample of regressions that they pick, there is not one statistically significant bad effect of right-to-carry laws on murder. There is only one such case for robbery, and it comes from one problematic specification by Ayres and Donohue.
5) Hybrid model. The so-called hybrid model used by Ayres and Donohue finds that the law dummy variable is positive while the trend variable indicates that crime rates decline over time. Plassmann and Whitley do a good job of explaining why the “hybrid” model produces misleading results (looking at the crime rates on a year-by-year basis shows no initial increase in crime), and the panel never discusses their critique. Even so, it would have been useful for the panel to at least say whether the “hybrid” results produced a statistically significant temporary bad effect. The problem with determining statistical significance is that when both the dummy and trend variables are switched on at the same time, we are concerned with the net effect, not just the dummy variable by itself as Ayres and Donohue argue. The answer for all the results in the panel’s Table 6-4 is “no.”
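To make the “net effect” point concrete, here is a minimal sketch (simulated data and illustrative variable names, not the Ayres and Donohue specification) of testing the combined effect of the dummy and the post-law trend at a given year after adoption, rather than the dummy alone:

# Illustrative sketch of the "net effect" test in a hybrid-style specification.
# All data are simulated; variable names are placeholders, not the real data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
years_since_law = np.tile(np.arange(-10, 11), 200)   # event time around adoption
post = (years_since_law >= 0).astype(float)          # law dummy
post_trend = post * years_since_law                  # post-law trend
# Assume a small jump at adoption followed by a steady decline afterwards.
log_rate = 0.03 * post - 0.02 * post_trend + rng.normal(0, 0.1, post.size)
df = pd.DataFrame({"log_rate": log_rate, "post": post, "post_trend": post_trend})

res = smf.ols("log_rate ~ post + post_trend", data=df).fit()

# The dummy alone can look "positive", but the policy-relevant question is
# the net effect once both terms are switched on, e.g. three years in:
net_effect_3yrs = res.t_test("post + 3*post_trend = 0")
print(net_effect_3yrs)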
6) RESET tests. Professor Horowitz’s discussion of the RESET tests seems too strong, since I provided the panel with RESET tests done for a wide range of estimates. Even accepting that the RESET test is appropriate (and no one else on the panel uses this test in their own work), there are many estimates where the results pass this test, and he should thus conclude that those indicate a drop in violent crime.
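For reference, the RESET test is Ramsey’s specification test, which asks whether powers of the fitted values add explanatory power to the regression. A minimal illustrative sketch (simulated data; assumes a recent statsmodels release that includes linear_reset):

# Minimal illustration of a Ramsey RESET specification test on simulated data.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import linear_reset

rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = 1.0 + 0.5 * x + rng.normal(scale=0.3, size=1000)   # correctly specified linear model

res = sm.OLS(y, sm.add_constant(x)).fit()

# Adds squares of the fitted values and tests whether they are jointly
# significant; a large p-value means the specification passes the test.
reset = linear_reset(res, power=2, test_type="fitted", use_f=True)
print(reset)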
7) Using too many control variables. Bartley and
Cohen and I report all possible combinations of the
control variables and show a great deal of
consistency in the results. The only difference
between these and those discussed in the NAS report
is that these regressions included the arrest rate
because of the zero crime rate problem.
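A rough sketch of what reporting “all possible combinations of the control variables” looks like in practice (simulated data and made-up control names, not the actual specifications): run the same regression with every subset of candidate controls and check how stable the law coefficient is.

# Illustrative sketch: re-estimate the law effect under every subset of controls.
# All data and variable names here are simulated placeholders.
from itertools import combinations
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "rtc_law":     rng.integers(0, 2, n).astype(float),
    "arrest_rate": rng.uniform(0.1, 0.9, n),
    "density":     rng.normal(0, 1, n),
    "income":      rng.normal(0, 1, n),
    "poverty":     rng.normal(0, 1, n),
})
df["log_rate"] = -0.05 * df["rtc_law"] - 0.3 * df["arrest_rate"] + rng.normal(0, 0.2, n)

controls = ["arrest_rate", "density", "income", "poverty"]
results = []
for k in range(len(controls) + 1):
    for subset in combinations(controls, k):
        formula = "log_rate ~ rtc_law" + "".join(" + " + c for c in subset)
        fit = smf.ols(formula, data=df).fit()
        results.append((subset, fit.params["rtc_law"], fit.bse["rtc_law"]))

# If the estimated law effect is similar across all 2^4 = 16 specifications,
# the finding is not an artifact of one particular set of controls.
for subset, coef, se in results:
    print(f"{coef:+.4f} (se {se:.4f})  controls: {subset or ('none',)}")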
8) Process. While the NAS is in name an academic
organization, the process was hardly an academic
one. Members of the panel were forbidden to talk
to me about the issues being examined by the panel.
Despite promises to get my input on the panel’s
review as it went forward, that never occurred. In
particular, Charles Wellford promised me that I
would be able to look at the tables and figures in
the report. If I had been involved, I could have
helped catch some of their mistakes. When the
report was finally released to the public, I was
promised that I would get a copy at the beginning
of the presentation and that I would be allowed to
ask questions. I was told that they preferred that
I not attend the presentation, but there would be
no problem with my asking questions. Instead, even
though the presentation ended a half hour earlier
than scheduled because there were supposedly no
more questions, my questions were never asked. (I
had one main question: Professor Wellford mentions
all the research that has been done on
right-to-carry laws, but if he is correct that
right-to-carry laws are just as likely to increase
as decrease crime, can he point to one refereed
journal article that claims to find a bad effect
from the law?) Despite promises to the contrary, I
did not receive a copy of the study until well into
the afternoon and then only after a reporter from
USA Today sent me a copy.
Minor notes: Despite claims to the contrary, I
responded to the Ayres and Donohue study in January
of 2004. (Meanwhile, it goes unnoticed that Ayres and Donohue themselves ignored virtually all of Plassmann and Whitley’s points.)
In commenting on the report, others have raised
additional issues that the NAS study did not find
relevant. As to the claims raised again in these
posts regarding Jim Lindgren’s investigation of
the “phantom survey,” many are apparently unaware
that David Gross, David Mustard, and I have said
that Lindgren has grossly mischaracterized what we
said to him. For comments by Gross and Mustard,
please see statements 3 and 4 in this link.
For a general response to the charges on the survey
and other issues you raise see this link. False
claims have been made with regard to these issues
and the pseudonym.
Claims have also been made by Jim Lindgren regarding the demographic control variables, but he fails to note that it is only in the state-level regressions, not the county-level regressions, that some of the significant results are affected. Given all the combinations of control variables that have been examined, even in that case one wants some theory for why you would selectively include what appears to be a weird combination of demographic controls. I think that Lindgren is a biased observer: he was upset after a critical piece that I published on his work in 2003, and his attacks started shortly after that. Further, his attacks are untrue.
Final comments.
It is hard to look through the NAS panel’s tables
on right-to-carry laws and not find overwhelming
evidence that right-to-carry laws reduce violent
crime. The results that don’t are based upon the
inclusion of zero values noted in point 1 above.
Overall, the panel’s own evidence from the latest
data up through 2000 shows significant benefits and
no costs from these laws.
My impression is that Gary Kleck also has a very similar reaction to the panel’s findings regarding surveys on self-defense.