Portland Copwatch Analyzes Compliance Officer Report on Use of Force July 2018
Table of contents
Introduction/Summary
What's Left to Do: Insubstantial Compliance and Force Policy Violations
Failing to Canvass for Witnesses / Force without Custody
Good News: De-Escalation Leads to Lower Levels of Force
Numbers vs. Percentages: Why Both Matter
Other Interesting Information
Layout / Technical Issues
Conclusion
Footnotes
Portland Copwatch
a project of Peace and Justice Works
PO Box 42456
Portland, OR 97242
(503) 236-3065/ Incident Report Line (503) 321-5120
e-mail: copwatch@portlandcopwatch.org
COMPLIANCE OFFICER'S REPORT ON FORCE: PPB TARGETING PEOPLE OF COLOR,
HOUSELESS
Unfortunate Errors, Omission of Shootings, and Other Drawbacks Detract from Important
Analysis
by Portland Copwatch, July 27, 2018
In early July, the Compliance Officer/Community Liaison (COCL)*-1 released a report on the Portland Police Bureau's progress regarding changes to Use of Force training, policy and implementation (posted at https://www.portlandcocl.com/reports/2018/7/2/compliance-and-outcome-assessment-use-of-force-draft ). Remarkably, through a series of tables and charts, the COCL comes to the scientific
conclusion that community members have known by observation for years: African Americans,
Latinos, youth and houseless persons have more force used against them than other Portlanders.
This is the second COCL report to examine only one part of the US Department of Justice (DOJ)
Settlement Agreement since the reporting began in 2015. Like the April report on Mental Health
issues, the new report found the Portland Police Bureau (PPB) was in "Substantial Compliance"
with all 12 paragraphs relating to Use of Force.*-2 This means ten of
the 12 paragraphs moved up
from "Partial Compliance" in the November 2017 COCL report. That report led Portland Copwatch
(PCW) both then and in April to speculate there is a rush to finish with the Agreement leading to
many paragraphs being "greenlighted." While officer Use of Force is arguably the issue of most
concern to the community at large, with the new COCL reporting system Force is only being
examined once per year rather than two to four times as was done previously. The COCL notes
there is overlap among the various sections of the report, but then, for example, defers weighing in
on whether officers are being held accountable for out of policy Force incidents until their report on
the Accountability section in Q4 2018.
Worse, the most serious Use of Force, which drives much community consternation, is the Use of Deadly
Force (known in PPB parlance as "Level I"). There have been _sixteen_ officer involved shootings
since the COCL started reporting, and _twenty-two_ since the DOJ Agreement was signed in late
2012.*-3 The vast majority of these incidents involved people in mental
health crisis, the main focus
of the Agreement. Looking at less than deadly uses of force and controlling for reported resistance
and other factors, the new report states people in mental health crisis are not more likely to be
subjected to force. The COCL states there are not enough cases of "Level I" Use of Force to justify
analysis (p. 61). Considering the in-depth look they give to the intersection of force, race, mental
health, and other factors, and the fact John Elifritz was shot while in a mental health crisis by eight
armed officers during the review period (October 2017-July 2018), this is an inexcusable omission.
It also ignores their own data (on p. 63) which show people in mental health crisis are subjected to
level II force twice as much as people who are not (17% vs. 8%).
The COCL says force incidents related to felonies and misdemeanors account for the "majority" of
cases. However, adding up the numbers on p. 53, one can see such crimes only account for 58% of
force use, meaning people are ticketed or not charged with any crime 42% of the time-- so as many
as three out of every seven uses of force were not related to serious crimes.
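A quick arithmetic check of that figure, sketched in Python using only the 58% cited above:

```python
# Minimal sketch: checking PCW's "three out of every seven" figure
# from the 58% of force cases the COCL ties to felonies and misdemeanors (p. 53).
felony_misdemeanor_share = 0.58
other_share = 1 - felony_misdemeanor_share

print(f"Force not tied to felonies/misdemeanors: {other_share:.0%}")  # 42%
print(f"Three out of every seven: {3/7:.1%}")                         # 42.9%, roughly the same
```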
As with the April report, this report utilizes a new rating of "Substantial-Conditional," which is
given to eight of the 12 paragraphs*-4. PCW warned that using green
highlight color on all
paragraphs breaks from past analysis (where "Substantial-Conditional" would have been called
"Partial" and colored yellow). Portland Copwatch noted it creates a psychological mindframe for
the Bureau to think they are done, and anger in community members who are not seeing any
change. Yet the COCL used this new system for a second time.
One incredibly large drawback of the report is that it compares data from various timeframes, a
glitch which the social scientists could overcome but choose not to. For example, no data are used
from prior to 2015, even though in theory to see whether changes made under the Settlement
Agreement are having an effect one would need to go back at least to 2012. The COCL claims their
failure has to do with a database change made in 2015 (p. 39). Quite a few data sets are limited to
July or August 2017 through March 2018, when the police began tracking a number of kinds of
force which previously were not listed. For example, "Vehicle Ramming," "Control Against
Resistance," and "Baton (non-strike)" are now required to be reported by officers as low level uses
of force. The COCL cautions readers that the spike in Force Reports since mid-2017 is "likely" a
result of the changes in policy. One wonders, then, why the COCL team (and the Bureau, for that
matter) could not remove the data on those new categories of Force from their statistics to show
trends going back further than nine months. It makes sense to also include the new data moving
forward, but since the expanded categories added roughly 700 new incidents just in Q1 2018,
tracking both old and new data would be best.*-5
There are also numerous errors in the report, not the least of which is a table on page 51 wherein
every single row supposedly showing "subject characteristics by resistance" adds up to a number
greater than 100%, between 102 and 108%.*-6 (We believe it was Mel
Brooks who said you can't
have more than 100% of anything.) This does not bolster confidence in the Compliance Officer's
reporting. The office's half a million dollar a year contract used to include the responsibility to
manage and host meetings of the Community Oversight Advisory Board (COAB), but the COAB
was dissolved 17 months before this report was written. The revisions to the Agreement in April
divorced the COCL from responsibility for the new community panel, the Portland Committee for
Community Engaged Policing (PCCEP).*-7 Thus, one wonders why
the COCL could not be
bothered to dive into the above-mentioned matters or proofread their own report.
To be fair, PCW has stated all along that the COCL's reports include important information, which
still holds true here-- including the data on who gets subjected to force more often. They also reveal
the Bureau has begun tracking cases in which officers did _not_ use force, in order to better see
how well de-escalation and other alternatives to force are leading to non-violent outcomes (p. 31).
That said, the COCL continues to lecture the Bureau for conflating tactics meant to avoid force with
lowering the amount of force used when discussing de-escalation. PCW agrees, but has repeatedly
suggested that calling the former de-escalation and the latter "mitigation of force" would go a long
way to ending the confusion. To the Bureau's credit, they put out a memo in April with specific
examples of what is and is not considered de-escalation.*-8 The COCL
also calls out a few cases
wherein Bureau supervisors did not find officers out of policy for using Tasers because the weapon
failed to deliver a 50,000 volt shock when deployed (in one case, it only hit the person's clothing).
Often seeming unconcerned about the consequences of police violence, the Compliance Officer
here uncharacteristically suggests a Taser discharge should be considered force whether or not
there is an "application," and if need be the Bureau should change its policy ("Directive") to ensure
such incidents are reviewed as Use of Force.
But still, remnants of the very first COCL report, in which they raised questions about the validity
of community activists, resurface here. In asking the Bureau to stop printing raw numbers of Force
data in their quarterly and annual reports, the COCL brushes off the fact that these data were
requested through community input (p. 29). There is no effort to compromise and suggest both rates (percentages) and raw numbers be used. As part of its full analysis, PCW examines areas of the
COCL's report which illustrate the importance of including both kinds of numbers.
Other major concerns include the COCL assuming a factual basis when officers report a person as
being "resistant." There are at least eight areas where the "level of resistance" is presented as being
accurate based on officer reports, and only two where the COCL describes "perceived" compliance
(on pp. 56 and 64-65). Resistance is broken down to include people being "disobedient or
antagonistic" even though the report says explicitly the officers are being taught that so-called
"contempt of cop" is not a reason to use force (p. 9).
Also, while the analysis using multiple factors to show which community members are most likely to be subjected to police Use of Force is important, none of those factors is the relative representation of those people in the population. For example, the report notes African Americans are 41% more likely than whites to have higher levels of force used on them (p. 64), but there is no analysis noting that Portland's black population is just 6% compared to 71% white. Latinos (called "Hispanics" in
the report) are three times more likely than whites to be subjected to high level force-- but only
make up 10% of the population.*-9 In the Outcomes section, the report also shows that in terms of how often force is used during arrests, Latinos are subjected to force 2.2-4.7% of the time, African Americans 2.9-5.6% of the time, and whites just 1.8-3.2% of the time (p. 48). Latinos are subjected
to more than twice as much "Level II" force as African Americans or whites-- 21% vs. 9-10% (p.
63). Houseless persons are reportedly 3% of Portland's population but, according to the Bureau's
2017 Force Data Summary Report, received 44% of all force doled out by the cops (and African
Americans received 28% of force). The Oregonian recently reported "transients" make up 52% of
PPB arrests. The COCL shows houseless persons are 34% more likely to receive serious force
than those who are housed (p. 60). A table on p. 56 shows the percentage of officers' first use of
force on houseless people is higher than if the person is housed-- 21% vs. 15% of the time.
A final overall observation: The COCL examines how certain precincts use force and how people in
certain ethnic groups are subjected to force. However, they do not specifically break out the Gang
Enforcement Team to see whether that unit uses force against African Americans at the same disproportionate rate at which it stops them in traffic-- 59% of stops in a city which is 6% black.*-10
See PCW's full analysis at http://www.portlandcopwatch.org/COCL_quarterly0718_pcw.html .
WHAT'S LEFT TO DO: INSUBSTANTIAL COMPLIANCE AND FORCE POLICY
VIOLATIONS
As the COCL and the DOJ seem to be hustling the Agreement to a close, it is important for
Portlanders to remember that the Agreement requires the PPB to be in full compliance for at least a
year before the case will be cleared in Federal Court. Since the PCCEP is not expected to start
meeting until October, it is unlikely Judge Simon will be able to sign off in late 2019 as the City
likely is hoping.
That said, the reasons the COCL gives for not finding full compliance generally revolve around
systems which are too new to have been tested out or tweaks the Bureau needs to make. For
example, the Force Inspector ("Inspector") only recently began reviewing Force reports to see
whether officers violated policy. They give feedback to the chain of command, a process the COCL
wisely says needs more time to take hold. (Again, arguably this would have been deemed in "Partial
Compliance" in earlier reports.)
Another example which should have led to a "Partial" rating is that while supervisors are checking
to see whether officers' actions lead to the situation where they choose to use force, some are still
missing questions about de-escalation and not necessarily looking at the whole encounter
(Paragraph 66a, p. 10).
Regarding Paragraph 66b, in which officers are supposed to develop skills over time to resolve
confrontations without force, the report says senior officers are not being held to higher
expectations. Finding just one rare incident where a supervisor called someone out for a pursuit and
Use of Force which were in policy, but not in line with the spirit and goals of the Bureau, the COCL
alarmingly concludes there is some evidence of compliance and gives this subparagraph a
Substantial rating (p. 11).
For Paragraph 67a calling upon officers to de-escalate and call in specialty units, the COCL relates
one scene where an officer ordered someone assaulting another civilian to get off and called it de-
escalation. The often cop-friendly COCL opines that "force was appropriate" in this case. On the
other hand, the COCL noted while the order and a subsequent push may have avoided a higher level
of force, those actions were not de-escalation. In the same section, they say they did not see any
failures to call in specialty units-- despite John Elifritz being killed within 37 seconds of officers
rushing into a homeless shelter in April (p. 12). The Compliance Officer team says they want to see
more evidence before finding this part in Substantial compliance, again indicating a rating of
"Partial" would have been appropriate.
Similarly, the report notes that officers are taking known history of mental illness into account as
required by Paragraph 67b, but some officers have not had a refresher on symptoms since 2006
when 40 hours of Crisis Intervention Training was first given to all officers (p. 13).*-11
As noted above, the COCL reports two Taser uses were found out of policy, but for the wrong
reasons. One was a negligent discharge where an officer stumbled with his finger on the trigger
(good thing it wasn't a gun, right?). The COCL disagreed with the Bureau's assertion that the
suspect's failure to react to the Taser meant it wasn't an application of force. The other was the case
where the Taser probe stuck in the person's clothing. Both were handled as administrative violations
rather than excessive force. In the context of Paragraph 67d, the PPB is supposed to discipline
officers if they use unreasonable force. Such a requirement is also added for any out of policy force
in Paragraph 73e. This is where the COCL punted and said they will look at "corrective action" in
their Q4 Compliance report (pp. 13-15 and 23).
In discussing how often violations were found in their analysis of Paragraphs 74, 75 and 77 on
Force reporting and audits, the COCL warns readers not to assume the 10 violations flagged by
supervisors were due to excessive force. They note the Inspector found 12 additional possible
violations which should have been reviewed and debriefed with officers, for a total of 22 in the nine
month period under review. On the bright side, the report says that while the Inspector's
disagreements used to just go back to the officer's commander to decide whether or not to take
action, they now get reviewed by the Assistant Chief and Chief, who can override the commander's
choice (p. 26). This is the feedback loop discussed elsewhere in this document.
As a public service, PCW examined the Q1 2018 Force Audit report (Appendix 7) which lists eight
of the 22 force incidents found out of policy. (The quarterly Force Audit reports for Q3 and Q4
2017 are currently not available to the public.) Seven were found out of policy for administrative
reasons, and in one an officer was found out of policy for _not_ using force (something PCW has
been concerned about since looking at Directive 315.30 on Satisfactory Performance several years
ago). The other seven were: two officers failed to report pointing a firearm soon enough; two officers failed to provide warnings; one officer did not report events accurately; one negligently discharged a Taser; and one off-duty officer didn't immediately report shooting an "aggressive dog."
So at least for the most recent quarter, it does not seem officers are being disciplined at all for
inappropriate use of force. The Outcomes section notes three of the 22 violations flagged had to do
with "legal standing" but does not go into detail (p. 38).
Regarding the warnings, which officers are required to give before using Tasers (Paragraph 68b),
the COCL said some supervisors were being "too strict" about not letting officers waive the
requirement to give a warning (pp. 15-16). As noted above, the COCL often does not see these
issues through the lens of a community member.
As for the negligent discharge which was found out of policy, it seems the cause was a button on the side of the PPB's new Tasers which is easily triggered. The COCL had noted this problem previously (particularly in light of the case of Matt Klug, who was tased six times including at least once due to the new button), but says it is now "largely resolved." Thus they gave Paragraph 68e, asking officers to re-evaluate after each Taser cycle, a "Substantial" rating (pp. 16-17).
While the COCL earlier noted the police were not being trained how to put handcuffs on suspects
during or between Taser shocks (Paragraph 68f), they now report the Bureau bought a special
dummy to help train how to do so. However, it is only being used in Advanced Academy (which is
for new recruits after they finish basic State training). It is not being used for In-Service training
which goes to all officers. The other part of this subparagraph says officers can't cycle Tasers more
than two times (at five seconds each) unless "exigent circumstances" exist. In the one such incident they found, an officer used the Taser four times. Rather than express concern that this could be a form of torture, the COCL suggests maybe the officer just didn't articulate their sense of urgency properly (pp. 17-18). This is one of the areas they rate as being in Conditional compliance.
Paragraph 70b requires officers to notify Supervisors about serious uses of force, force against
people with mental illness, or suspected misconduct. The COCL says this happens "often" and the
Bureau is "mostly diligent," then finds them in Substantial compliance (p. 20). This is odd, since
previously the COCL said this was happening in policy but not in practice; "often" and "mostly"
indicate it is not universal across the Bureau.
Paragraph 69a, which requires officers to file reports about any use of force, was previously not in
compliance because the Bureau did not make officers file reports after shootings or deaths in
custody. Rather than fix that problem, which the DOJ promised the community would happen, the
City and DOJ had the Settlement Agreement amended to exclude deadly force cases from such
reporting. The COCL reports on this change (p. 19), which allows the administrative investigation
of deadly force to substitute for the officers' first-hand, contemporaneous reporting. PCW testified to
Council and to Judge Simon that this was an inappropriate change and continues to argue against it.
The language in the revised Agreement is particularly troubling as it includes shootings which do
not end in death-- cases which do not immediately fall under the purview of the District Attorney's
office under Oregon law. It is the DA's involvement, and concerns over forcing officers to testify
against themselves, which led to the change. Because the amendment lowered the river rather than
raising the bridge, the COCL finds the City in Substantial compliance.
The COCL also finds Substantial compliance regarding gathering involved and witness officer
statements in Force cases; similarly to 69a the change in the Agreement is cited as a reason the PPB
is now in compliance. The COCL says having the involved officer give a limited "public safety
statement" and being compelled by Internal Affairs later is in line with best practices (also p. 19).
That is not what the OIR Group and other experts (including the DOJ) said in the past; rather, they held that getting officers to give an interview immediately is a best practice.
Related to these two sections, Paragraph 70a requires After Action Reports to be done within 72
hours of force events, or the supervisor responsible can be disciplined. The COCL says because of
the revisions to the Agreement this is now in compliance, but doesn't make clear that Paragraph 70
was not amended (p. 20); this requirement is affected by the change to Paragraph 69.
After all these issues related to one change made to the Agreement, the COCL does not mention
Paragraph 73a was also amended in order to reduce what data have to be entered into the Employee Information System (EIS). The word "comments" was deleted as a required entry and
the word "material" was inserted to modify the "findings" and "corrections" which are required to
be put in the database. The COCL's analysis of this section notes how EIS entries are often
incomplete and do not always include debriefings given to officers or possible violations. The
amendment could lead to an argument over what is "material," so it is appropriate the COCL gives this
Paragraph a "Partial" rating. That rating is confusing since the overall section is supposedly in
Conditional Substantial compliance. While PCW does not disagree with the COCL that the Bureau
should be entering all the data, the suggestion to define what "findings, comments and corrections"
require EIS entries should acknowledge the amended language (pp. 21-22).
In one example where it's possible the Bureau is making progress but the COCL admits they need
more time to see if it is consistent, Paragraph 73b has led to a system where supervisors are now
subjected to progressive discipline if their After Action Reports continue to be inaccurate or
incomplete (p. 22). It is not addressed whether the Portland Police Association (which represents
Sergeants) or Portland Police Commanding Officers Association (which represents Lieutenants)
have objected to the new disciplinary system. Related Paragraph 73d says the supervisor can be
removed if they continue deficient reporting, with the same Conditional compliance rating (p. 23).
The November 2017 report stated the supervisors needed more training, and it is not clear whether
that occurred. In the Outcomes section, the COCL reports the Inspector now has supervisors with deficient reports entered into the EIS, where the entry persists until the problem is fixed (p. 36).
Paragraph 73f requires that concerns about policy, training or tactical issues which come up during
a Use of Force investigation be raised all the way up to the Chief's office. The COCL seems happy
with the Bureau's feedback loop between the Inspector and PPB supervisors, but doesn't mention
the Chief (p. 23), which seems pretty important.
Another questionable way the COCL gives Substantial compliance ratings is when the Bureau has
created a policy but has never used it. Among other places, this is true for Paragraph 73g about the
Chief reassigning Use of Force investigations to the Detectives Division (p. 24).
The COCL says simplifications made to Force reports led to difficulties for officers adding details
in some sections, yet they give Substantial compliance to the Force reporting/audit Paragraphs 74,
75 and 77 (p. 25). The Bureau apparently uses "Survey Monkey" to enter their 131 to 152 points
of data, which raises questions of whether a corporation has access to sensitive criminal justice
information. Also in this section, the COCL notes supervisors are reviewing force for compliance
with the "Graham standard" (the Supreme Court decision which requires that only a reasonable
amount of force be used), but doesn't address how the DOJ Agreement led Portland to adopt a
"Graham plus" standard requiring officers to use force as little as possible (p. 26). For some
reason, previous reviews of force which fell into three categories-- out of policy, in policy with
concerns, and no concerns-- have been reduced to just a pass/fail two-point system. Later in looking
at Paragraph 76, the COCL suggests that, since so few Force cases are found out of policy, the
Inspector should include cases which were in policy but led to concerns (p. 30); it is not clear how
that is possible if the "of concern" category no longer exists.
The COCL admits to auditing only 104 of the 152 data points entered by officers, saying they
approved of how data was entered except for "a few instances," and agreeing over 95% of the time
with the 10 points used to determine if officers were in or out of policy. They expressed concern
that many issues were caught by the Inspector's audit rather than by officers in the chain of
command (pp. 27-28). As has been a past deficiency in the COCL's analysis, there is no mention of
how a truly independent civilian oversight body might do better than officers in the same institution
reviewing their peers.
Regarding Paragraph 76's requirement to analyze data to look for trends, the COCL says the
Bureau "attempted to comply." They wisely ask for the Bureau to improve the way they go about
this analysis ("methodology" in consultant-speak) by adjusting figures based on factors like how
often officers make arrests. They use the example of a supervisor who only makes one arrest and
uses force, giving a 100% rate, while another officer might use force in one out of ten arrests
(10%). The average would be a 55% use of force rate, which would mean most officers would fall
far under that line (pp. 28-29). The COCL also suggests focusing on officers who use some types
of force more than others. Even with this rather glaring problem-- and the fact that they haven't yet
met the Agreement's requirement to compare force trends by precinct, shift and unit (p. 32)-- the
COCL praises the PPB for moving from a paper-based system to a computerized one and finds
them in Conditional Substantial compliance.
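To illustrate why that adjustment matters, here is a minimal sketch of the COCL's one-arrest-versus-ten-arrests example described above; the "pooled rate" line is our own hypothetical addition, not a figure from the report, showing how the 55% average overstates how often force is actually used per arrest:

```python
# Minimal sketch of the COCL's example (pp. 28-29): one supervisor with a single arrest
# involving force, another officer with force in one of ten arrests.
arrests = [1, 10]
force_uses = [1, 1]

per_officer_rates = [f / a for f, a in zip(force_uses, arrests)]   # [1.0, 0.1]
mean_of_rates = sum(per_officer_rates) / len(per_officer_rates)    # the "55%" average
pooled_rate = sum(force_uses) / sum(arrests)                       # force per arrest overall

print(f"Average of per-officer rates: {mean_of_rates:.0%}")  # 55%
print(f"Pooled rate across all arrests: {pooled_rate:.0%}")  # 18%
```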
PCW readily admits we sometimes send out documents with errors in them. However, it is odd that
the COCL looks at one officer who filed just four Use of Force reports with no deficiencies as
someone who should be held up as a role model (p. 31). For one thing, in any other area the COCL
would say four incidents do not make a meaningful trend (for example... there have been many
more than four Officer Involved Shootings, yet they still won't analyze those). For another,
shouldn't there be some concern that one officer filed four Use of Force reports in one reporting
period, if Use of Force is supposedly trending downward?
FAILING TO CANVASS FOR WITNESSES / FORCE WITHOUT CUSTODY
In a few places in the report, it is mentioned that officers sometimes fail to canvass for witnesses to force events. It is given as an example of an administrative policy violation on p. 26, and on p. 35 in a discussion of the most common deficiencies in officers' reports. It doesn't seem to cross the COCL's
mind that the officers may be failing to look for witnesses in order to escape being held
accountable.
This leads to a few questions, including:
--What if an officer fails to report a Use of Force altogether, and/ or under-reports the amount of
force they used? If there are no witnesses except the suspect, it's likely the officer will be believed
over the subject.
--How does the Bureau know if the officer is exaggerating the resistance of the suspect? Frequently
we have seen or heard about officers yelling "stop resisting" when the person is not actually doing
anything; this is a tactic which is used to scare off witnesses who might complain they witnessed
excessive force. (Note: in the Outcomes section, it shows force and resistance have the most
deficiencies in officer reporting, at 32% [p. 35].)
--Why is it assumed (as was stated at the COCL town hall on July 12) that the "Independent"
Police Review (IPR) might catch issues around force policy violations, when IPR relies on police
reports which might be incomplete (or falsified to cover up for misconduct)?
It should not be expected that the COCL would answer these questions, though, as they describe the
first use of force against a person as "setting the stage" for police-community encounters (p. 55).
They also inappropriately explained the force-to-custody ratio is important because when there is an
arrest, cops are "more likely to face a situation where the individual may be upset and disobey the
officer, thereby requiring the use of force" (p. 46). These do not seem very scientific, and the latter
comment ignores the COCL's own data on how often people have force used against them but are
not arrested.
As noted above, as many as three out of every seven uses of force were not related to serious crimes. The
report "explains" that the largest non-arrest category, "release to medical," doesn't mean there was
no crime, but rather, they say, "often" the person is cited rather than arrested (p. 53). Considering
these 301 cases made up 22% of the force incidents,*-12 it seems there
should be more analysis
than using the word "often."
The same table about "dispositions" shows that--somewhat to the Bureau's credit?-- only 11 people
who were cited and released were subjected to force, and all of it was Level IV force. A full 65% of
Level IV force was used in arrests and 25% of it was used when people were given over to EMTs or
the hospital. It should be explained why 230 people subjected to the lowest level of force ended up
being "released to medical."
GOOD NEWS: DE-ESCALATION LEADS TO LOWER LEVELS OF FORCE
The report and various data tables show when officers use verbal and other means to avoid using
force, it mostly leads to lower levels of force being used. (As noted above, they are not yet studying
the cases where no force at all was used.) Unmentioned, though, is that when the reason for police
contact is "Mere conversation, welfare check or flagdown," 17.5% of the first force applied is in the
second and third most serious categories (p. 57). PCW thinks this means 17 of 97 people in these
minor non-criminal stops were subjected to serious force during the time period under review. The
same category of stops is shown to result in force being used 1.4 times more often than for "person
offense" calls, putting it second highest to traffic stops.
Also in the table and text discussing the first force applied (pp. 55-56), the report shows when
verbal de-escalation is used, more serious force is only used initially about half as often as when
there is no de-escalation (11% vs. 21%). Having one-on-one talks with the subject makes serious
force three times less likely, with Level II only used 6% of the time vs. 19%. Safe communication
with the suspect similarly reduces the likelihood of high levels of force from 19% to 12%, roughly
equivalent to officers using less serious force when they know a person has a mental health or
medical condition. Conversely, if a person is believed to be "disobedient or antagonistic" higher
levels of force are used 20% of the time instead of 10%.
The COCL notes the Police Executive Research Forum (PERF) delivered a two day de-escalation
training for instructors and supervisors in May 2018, as well as PPB teaching de-escalation during
the spring In-Service training for officers. As part of the training on why "contempt of cop" does
not justify using force, they talked to officers about regulating their own emotions and calling in an
officer who wasn't wrapped up in the initial incident. The COCL claims they have not seen any
contempt of cop cases since that training, but it's unlikely the officers would report on such
incidents honestly, and the training was given only a few months before the report came out. The
Bureau is working to be sure the police know saying "get down or be tased" is a command and
control issue, not de-escalation. The PPB also included discussion about procedural justice--
making sure people feel respected when interacting with police-- but apparently framed it in part as
"positive customer service" (pp. 8-10). PCW has repeatedly expressed that most people do not call
the police on themselves, thus they are not "customers."
NUMBERS VS. PERCENTAGES: WHY BOTH MATTER
As noted above, the COCL is urging the Bureau to stop printing raw numbers alongside
percentages in their Use of Force reports. Their own analysis of the data showed why both are
important, in the example where a supervisor only made one arrest and used force (100%) vs.
someone who used force once making ten arrests (10%). Each used force only once.
In addition, in noting that looking for witnesses went from being 8% of deficiencies to 28% of
deficiencies in officers' reports, they state other categories improved, bumping the percentage
higher. This is an excellent reason to show the raw number to assess whether there has been any
actual increase in officers failing to find witnesses.
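As a minimal sketch with purely hypothetical counts (not the Bureau's actual figures), the same number of canvassing failures can jump from 8% to 28% of all deficiencies simply because the other categories shrink:

```python
# Hypothetical counts only: the same number of canvassing failures becomes a larger
# share of all deficiencies when the other deficiency categories decline.
canvass_failures = 8
other_deficiencies_before = 92   # hypothetical earlier period
other_deficiencies_after = 21    # hypothetical later period: fewer of the other deficiencies

share_before = canvass_failures / (canvass_failures + other_deficiencies_before)
share_after = canvass_failures / (canvass_failures + other_deficiencies_after)
print(f"{share_before:.0%} -> {share_after:.0%}")   # 8% -> 28%
```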
Similarly, on p. 44 the COCL notes how Taser use was between 5% and 11% of all force, but went
down to 4% in Q3 2017 after the new categories of force were added to Force Reports. Someone
with a real drive to illuminate the public could (a) adjust the percentage by removing the new
categories of force from the calculation (as suggested above) and (b) print the actual number of
Taser deployments, which would show if their use is going up, down or staying the same.
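A minimal sketch of suggestion (a), again with hypothetical counts rather than the Bureau's data, shows how removing the newly added force categories from the denominator keeps the Taser percentage comparable across the policy change:

```python
# Hypothetical quarterly counts: the Taser share looks smaller after Q3 2017 only because
# the denominator now includes the newly reportable categories of force.
taser_uses = 40
all_force_events = 1000        # total force events, including the new categories
new_category_events = 300      # e.g. "Control Against Resistance" and similar additions

share_of_all = taser_uses / all_force_events
share_of_comparable = taser_uses / (all_force_events - new_category_events)
print(f"Share of all force: {share_of_all:.0%}")                        # 4%
print(f"Share excluding the new categories: {share_of_comparable:.1%}")  # 5.7%
```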
In the Outcomes section, the COCL sometimes displays enough information that both the raw numbers and percentages can be seen. For example, on p. 38 the table showing where the Inspector
found policy violations by officers shows the number of violations (listed as "0") and the number
without (listed as "1"). Then it gives the "percent positive evaluations" which is between 94.3% and
100%. PCW praises the transparency of this table but suggests the "0" and "1" be given better
descriptors, and for consistency's sake-- since the rate of officer failure to complete reports is given
earlier-- to show the percentage of violations found, rather than those which were apparently in
policy. This would lead to a range of 0%-5.7%, with the highest frequency for failure in report
writing.
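The conversion PCW suggests is simple arithmetic; a minimal sketch using the range reported on p. 38:

```python
# Converting the table's "percent positive evaluations" into the violation rate PCW suggests.
percent_positive = [94.3, 100.0]                       # range reported on p. 38
violation_rates = [round(100 - p, 1) for p in percent_positive]
print(violation_rates)   # [5.7, 0.0] -- the 0%-5.7% range cited above
```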
This flip-flopped success/fail rate is also true in the following table, examining findings about
policy violations for trends.*-13 In this table the COCL also includes
numbers of cases both with
and without violations. The 22 policy violations are shown to be out of 123 cases reviewed, which is
an astonishing 17.9%. According to the "Independent" Police Review's 2017 annual report, only
13% of complaints lodged by community members led to Sustained findings. However, since we
do not know whether any of the reported violations led to discipline-- apparently eight of them led
to "debriefs"-- it is hard to say whether the nearly 18% figure has any relationship to misconduct
the community would have reported. In other words, most people would complain about having too
much force used on them, not that the officer didn't file their report accurately. (Flip-flopped high
numbers also appear on p. 54 regarding findings about Use of Force by category and force type.)
On page 42, reporting on how often force was used against people with mental illness, a graph
combines percentages (which show such force varied from 5% to 25% of all Force incidents) and
raw numbers of Taser applications against such persons (which, it is noted, varied from a high of 13 times in one three-month period to just five times in the last nine months). So it's a bit apples and
oranges, and without a corresponding table is a little hard to suss out.
The figure on page 43 showing uses of force over time is purely raw numbers, and surprisingly
shows a drop in "pointing of firearm" from about 35-50 uses per month to about 25 per month in
Q1 2018. With no table these are estimates based on the graph. The same graph shows Control
Against Resistance jumping from 150 in its first reported quarter (Q3 2017) to 250 (Q4 2017) to
over 400 (Q1 2018). The COCL has stated officers had the term clarified for them over that time
period.
In other places raw numbers are helpfully given in addition to percentages, but the analysis falls
short. For example, the table on how often certain kinds of officers use force shows Enhanced
Crisis Intervention Team officers use force at a rate similar to non-ECIT officers, 12% of level II
(second highest) force compared to 10% (p. 47). But the raw numbers show that means ECIT
officers only used Level II force in 38 arrests (11.8% of 323) while the non-ECIT cops used such
force in 107 arrests. On the other hand, a table on page 56 indicates ECIT officers use level II or III
as their first kind of force 22% of the time compared to 16.5% for non-ECIT cops, which is 1.35
times higher for the cops trained in de-escalation.
Also, the COCL explains "Specialty units" have fewer calls but tend to use more force, which is
why their percentages are higher-- 16% level II represents only seven out of 43 arrests. Needless to
say, none of the charts around level II-IV uses of force show who uses deadly force (level I) or how
often.
OTHER INTERESTING INFORMATION
The report reveals some information which is both counter-intuitive and counter to what people in
authority have said over the years. For example, the Outcomes section (p. 47) shows officers in
their first 10 years with the Bureau used serious force less often (6-10%) than ones with more experience
(11.5-16.3%). Usually people say officers learn de-escalation skills over time and rookie cops are
prone to use violence because they are new at the job.
The table on findings by force type shows pointing of firearms was found out of policy about 14%
of the time, and Taser use found out of policy about 3% of the time (p. 54). By PCW's standard of
also looking at raw numbers, applying those percentages means 32 firearm pointings were considered possible misconduct, along with two Taser incidents.*-14
Getting back to the specialty units, officers are supposed to consult with them as a means of de-
escalation. However, the table on p. 57 shows contact with those units makes cops jump to a higher level of force at first more often than others-- 19.8% of the time vs. 17.7% of the time. Again to be fair, that just
means 19 incidents had the higher force level at first, as opposed to 229 cases without the
consulting.
Also, if officers feel a person is disobeying them or being "antagonistic," their first choice of force
is in the higher category over twice as often as if the person is not so perceived-- 20.1% of the time
vs. 9.5% (p. 58).
The OIR Group has cautioned the Bureau against foot pursuits, noting they often lead to violent
confrontations. The table on p. 58 shows that if there was a pursuit prior to the use of force, a
person is 1.4 times more likely to be subjected to a high level of force (24.1% of the time) than if
there was no pursuit (17.6% of the time).
One interesting observation in the COCL's analysis of when more serious force is used comes
when they note pointing a firearm at someone is in the lowest category of force (level IV), leading to
a statistic showing decreased likelihood of higher levels of force if a suspect is perceived to be
armed (pp. 60-61). Since pointing a loaded weapon at someone is a very serious threat, it seems the
Bureau should consider moving pointing a firearm to a level II or III Use of Force. It does not seem
to appropriately fit in the same category as pushing someone's arms around in order to handcuff
them or other level IV force.
LAYOUT / TECHNICAL ISSUES
Overall, there is a slight improvement from the COCL's previous "one-topic-only" quarterly report,
in that each paragraph being examined is clearly listed, and chapter headings are included along
with a table of contents (p. 2).
A lot of the problems with the COCL team relaying information, though, come from their deep academic background, which makes scientific terminology obvious to them but confusing to casual
readers. A good portion of page 48 describing how they designed their model to predict when force
is used contains high-level jargon. One explanatory section about analyzing force subjects by race
says "the fluctuations of each racial/ethnic force-to-custody ratio was similar to the fluctuations of
White subjects (Black r=.858, p<.05; Hispanic r=.638 p<.1, Other r=.503 p>.1)." If that is
meaningful to you, please consider volunteering to help the PCCEP, as the folks who will be
appointed likely will not be able to easily translate such information.
In an effort to appeal to laypeople, the COCL tries to explain why they don't want to look at just one
factor when examining force ("bivariate") but rather a number of factors ("multivariate"). They use
a strange metaphor about how ice cream sales and murder rates both go up in the summertime but that doesn't mean one causes the other (p. 40). This is a silly simile, and ignores
the possibility that eating ice cream does lead to murder. In other words, they warn readers not to
draw conclusions, even though some data like those on race should not be dismissed just because
racial bias isn't proven by the report.
Also confusing are their references to how many years officers have served on the force as a factor
in predicting force use. The first table shows various ranges of years of service, from below 5 to
over 20 (p. 47). Subsequently, at least two other charts only include officer tenure as one line, with
no years of service indicated (pp. 59 and 66).
The COCL says these later comparative "multivariate" charts need a baseline to compare to, which
is often the most common factor for a category, such as white people when looking at race.
However, in addition to officer years of service, other factors including age, "transient," and whether
officers gave commands do not have a control factor listed.
Although the COCL tries to explain its inclusion, their tables on pages 60 and 66 confusingly show
"subject injury" as one predictor of use of force. The footnote (on page 64) says when a
community member is injured,
that automatically inflates the force category to level II. This does not mean the injury predicts
whether or not officers will use force. While it may mean the COCL is trying to control the
predictive table about how high a level of force is used by factoring in when a low level of force
leads to injury, it is confusing, and on its face absurd.
There are also several confusing or unfortunate phrases in the report.
--On p. 32, it refers to cases involving people with mental health issues or perceived to be in mental
health crisis as involving "perceived mental health illness."
--On p. 62, there is a reference to "the average age of force type."
--On p. 52, one call type is shown as "person contacts (86)," which at first appeared to be a
numerical reference to how many such contacts were made, but is not.
--On p. 20, there are repeated references to "first line supervisors" which generally means
Sergeants; it is not clear why the actual rank is not used.
In addition to the table with all categories over 100%, PCW found other errors in the report. Some
are listed in this analysis and its footnotes. In addition:
--Although the report says a person considered disobedient is three times more likely to receive
higher force, the number given on page 60 is 3.72, which would round up to _four_ times higher.
--On p. 71, Assistant Chief of Operations Ryan Lee is identified as "Eric Lee."
Lastly, here are two things the COCL could do that would make the report itself less confusing:
--When referring to the "last report," the COCL should state the date of the report, especially now
that reports on various sections are only issued once per year. For example, on page 22, PCW
believes the COCL was referring to the November 2017 report, not the April 2018 report on Mental
Health issues, when talking about how they noted "in our last report" that discipline of supervisors
who file inaccurate or incomplete reports had not happened yet.
--Tables which use color for multiple variables shown in one graphic do not carry well in greyscale.
Commonly, such graphs will use dotted and/or dashed lines so people without access to color
printers can read them on paper.
CONCLUSION
Once again, the COCL seems to be too deferential to the Police Bureau, giving them an "A" for effort
while ignoring the concerns swirling around in the realities of news headlines-- and their own data.
As noted in our previous analysis of the April report, the disconnect between the Chicago-based
COCL team and the people of Portland has become more pronounced since they divorced
themselves from the Community Liaison part of their originally envisioned and contracted job. The
City managed to combine their interim community-oriented forum with the COCL's report
presentation in April, but the COCL held a forum on July 12 to present this report, while the City originally planned a separate forum for July 25.*-15
At the July 12 forum, the COCL stated one way to avoid officers' reports not reflecting the actual
force or resistance used in an encounter would be to get body cameras on Portland Police. While
PCW remains neutral on this subject, we have repeatedly pointed out that body cameras do not
point at the officers, and only capture what is right in front of the lens. This is why it is crucial for
community copwatchers to continue to observe and record police behavior. In the same way, it is
urgent for members of the community to examine the work being done in our name which
simultaneously confirms some concerns about how often officers are using force, and minimizes
others.
Footnotes:
*1- We previously noted that the US DOJ Agreement has been modified to remove the COCL's
work with the community, so the position should technically be called the "CO."
*2- The COCL also includes a compliance rating for Settlement Agreement Section III's opening
paragraph, making 13 total.
*3- Another incident in which an officer hit a bicyclist with a police car was also considered deadly
force. An April 2018 incident in which an officer chased a person backward onto the freeway,
leading to a deadly head-on collision, raises serious questions as well.
*4- The un-numbered opening paragraph of Section III also is rated "Substantial-Conditional"
making nine of 13.
*5- In the Outcomes section, it is stated there were 2617 applications of force from Q2 2015 to Q2 2017, which PCW calculates is 53% of the force used over the full three years despite covering the first nine quarters, with 2216 applications after the change redefining force, from Q3 2017 to Q1 2018-- 47% in just three quarters (p. 41).
*6- See http://www.portlandcopwatch.org/cocl_data0718.pdf , which uses data from p. 51.
*7- Illustrative of their lack of interest in the community board, PCCEP is not even in the COCL's
glossary of abbreviations.
*8- Unfortunately, this was a follow-up to a January memo intending to explain de-escalation,
which was self-contradictory and a confusing mess. The two memos are included as Appendices 1
and 2 in the report.
*9- Population data taken from Independent Police Review's online dashboard, May 2018.
*10- Per the Auditor's report on the Gang Enforcement Team from March 2018.
*11- Actually, the decision to start training officers was made in 2006 after James Chasse, Jr. was
beaten to death by Portland officers; the training didn't begin until February, 2007 (People's Police
Report #41, May 2007).
*12- Also see http://www.portlandcopwatch.org/cocl_data0718.pdf , which uses data from p. 53.
*13- The first "positive evaluation" in this table is listed as 97.6.8%; it is supposed to be 97.6.
*14- Because there is so much variation in the time periods being examined, it is not clear these are
the two Taser cases discussed earlier.
*15- The extra forum was canceled when the Auditor apparently told the IPR not to participate.
Posted July 27, 2018