There are several problems with the FBI's data on police officers killing civilians in America — and Vox has broken down both the data and the problems. But many of the limitations of that data — that police might be careless or manipulative when they categorize an incident, or that it's limited to departments that bother to report crimes at all — also apply to crime data in general.
In the case of officer-involved homicides, there might be more reason to be skeptical of the data because the reporting agency is also the offender's employer. But other factors actually make it easier for stats to get skewed when neither officers nor deaths are involved.
There's often talk about the need for better statistics on officer-involved homicides as a matter of public transparency. But that's not the primary reason police departments collect statistics. The ostensible purpose of collecting statistics, for most police departments, is to guide internal strategy and help them figure out where to allocate resources. At the same time, police use crime stats to look better in the eyes of the public, or on a federal grant application. That means the stats can be skewed, intentionally or not, toward whatever makes the cops look best. That's certainly true in the case of officer-involved shootings, but it's true of other types of crimes as well.
Patrick Ball of the Human Rights Data Analysis Group says that cops want to record crimes that they know they can solve. Unsolved homicides look bad for police departments — and might hurt their ability to get federal funding. And "for non-fatal violations," where there's no body for the cops to explain away, underreporting is "much, much worse."
Just in the past few months, reports have come out about statistical manipulation at two of the country's biggest police departments. Chicago Magazine reported that the sudden drop in the city's homicide rate in 2012 came, in part, because the Chicago Police Department "airbrushed" several cases out of the city's official homicide statistics, reclassifying cases after it became clear they wouldn't be solved. And the Los Angeles Times reported that from September 2012 to September 2013, the LAPD misclassified about 1,200 violent crimes as "minor offenses" to keep its crime rate down.
There are pressures that go the other way, too. When the New York Police Department was sued over its stop-and-frisk policy in 2013, several cops said that they'd been ordered to meet quotas to stop a certain number of people.
Even when there isn't pressure from the top of a police department to massage the numbers, the fact that cops are the ones collecting crime statistics leads to a certain amount of selection bias. Ball, of the Human Rights Data Analysis Group, points out that crime rates reported by police depend on which crimes the police even know about. Those will disproportionately be crimes against older, richer, whiter citizens, who are more likely to call the police.
The other way that police can find out about crimes is by deciding to look for them, which means they're driven by judgments of who's dangerous. Take the case of stop-and-frisk. Because cops tend to patrol low-income, heavily black and Latino areas that have a lot of crime, it becomes a vicious cycle. Young black and Latino men are more likely to get stopped and frisked, so they're more likely to get arrested if they're carrying drugs or guns, so the area's crime rate stays high, so cops continue to heavily police the area, so they stop and frisk more black and Latino men.
It is definitely a problem that we don't know the exact number of people killed by cops in America. But it's a symptom of bigger problems with how we collect crime statistics, and why. Right now, the American public's ability to understand trends in crime relies on statistics that weren't compiled to help the public understand them. For statistics to be trustworthy, they need to be collected with accuracy, not appearances, as the goal. In the meantime, the best the public can do is acknowledge the problems with the data we have, but use it as a reference anyway. Incomplete data is better than no data at all.