
United States v. Reynolds

United States District Court, W.D. Michigan, Southern Division
Aug 25, 2021
1:20-cr-24 (W.D. Mich. Aug. 25, 2021)

United States of America, Plaintiff, v. Mustafa Deville Reynolds, Defendant.


SUPPLEMENTAL OPINION

Paul L. Maloney United States District Judge

On August 19, 2021, the Court indicated that defendant Mustafa Reynolds's Daubert motion to exclude cell-site analysis testimony from Detective Heikkila would be denied (see ECF No. 113; Motion, ECF No. 89). Because that order was issued on the eve of trial, the Court noted that a detailed opinion would issue later; this is that supplemental opinion.

Daubert v. Merrell Dow Pharms. Inc., 509 U.S. 579 (1993).

I.

Reynolds is charged with two counts of distribution of heroin and fentanyl resulting in death, and one count of distribution of heroin and fentanyl. At the scheduled pretrial conference on May 3, 2021, defense counsel raised concerns about how the Government would present cell-site location data; the Court allowed Defendant to file the instant Daubert motion.

In the Government's trial brief (filed April 30, 2021), it seeks to offer “aerial maps that plot the location of historical cell-site location and GPS data for cell phones belonging to the defendant, [the victims], and Dan Errico on August 20, 2019, the day of the drug deals that led to the overdoses.” (ECF No. 75 at PageID.435.) These maps and the accompanying testimony from Grand Rapids Detective Thomas Heikkila are the evidence that Defendant seeks to exclude with his Daubert motion. The core of Defendant's argument is that these maps were produced with a software program (TRAX, developed by the company ZetX) that is not sufficiently reliable, so it may not form the basis of an expert opinion.

The Court held a hearing that began on July 23, 2021 and continued on August 16, 2021. Sy Ray, the developer of the TRAX software, testified on both dates. Detective Heikkila testified at the July hearing; Dr. Vladan Jovanovic testified at the August hearing. The Court allowed both parties to file supplemental briefs, but only Defendant did so (ECF No. 112).

II.

The Rules of Evidence require a trial judge to ensure that an expert's testimony is both reliable and relevant before it can be admitted. Daubert, 509 U.S. 579. Rule 702 grants the district court discretion to determine reliability and relevancy, given the particular facts and circumstances of the case. Kumho Tire Co. v. Carmichael, 526 U.S. 137, 142 (1999). The Government must prove by a preponderance of the evidence that the technical evidence is admissible. Daubert, 509 U.S. at 589.

To testify “in the form of opinion or otherwise,” an expert witness must be qualified based on his “knowledge, skill, experience, training, or education.” Fed.R.Evid. 702. Once qualified, for testimony to be admitted, Rule 702 requires:

(a) the expert's scientific, technical, or other specialized knowledge will help the trier of fact to understand the evidence or to determine a fact in issue;
(b) the testimony is based on sufficient facts or data;
(c) the testimony is the product of reliable principles and methods; and
(d) the expert has reliably applied the principles and methods to the facts of the case.

Expert opinion must be based on actual knowledge, not subjective belief or speculation. Daubert, 509 U.S. at 589.

Recently, the Sixth Circuit thoroughly analyzed Rule 702(c), which considers whether testimony is the product of reliable principles and methods:

Four inquiries guide the reliability analysis: Is the technique testable? Has it been subjected to peer review? What is the error rate and are there standards for lowering it? Is the technique generally accepted in the relevant scientific community? Daubert, 509 U.S. at 593-94. Multi-factor tests, especially non-exhaustive tests, run the risk of obscuring the core inquiry. The key handholds of Rule 702 thus bear repeating: To be admissible, any relevant scientific or technical evidence must be the “product of reliable principles and methods” and must have been “reliably applied” in the case. That is what matters most.
United States v. Gissantaner, 990 F.3d 457, 463 (6th Cir. 2021).

Additionally, the Court recognizes that cell-site location data is generally admissible so long as it is testified about at a broad level. The Middle District of Tennessee has summarized the Sixth Circuit's position:

[W]hile the science and methods upon which historical cell-site analysis is based are understood and well-documented, they are only reliable to show that a cell phone was in a general area . . . . Thus, assuming that the Government lays a proper foundation and accurately represents historical cell-site analysis's limits at trial, its expert testimony is reliable and a Daubert hearing is unnecessary.
United States v. Frazier, 442 F.Supp.3d 1012, 1024 (M.D. Tenn. 2020) (quoting United States v. Brown, No. 18-20075, 2019 WL 3543253, at *6 (E.D. Mich. Aug. 5, 2019)) (cleaned up). This is consistent with opinions from several other circuits. See, e.g., United States v. McNeal, 763 Fed.Appx. 307, 308 (4th Cir. 2019); United States v. Grissom, 760 Fed.Appx. 448, 452 (7th Cir. 2019); United States v. Schaffer, 439 Fed.Appx. 344, 347 (5th Cir. 2011).

III.

Defendant's motion raises two issues. First, Defendant questions whether Detective Heikkila reliably applied the TRAX software to the data available in this case, as required by Rule 702(d). After the first hearing on this motion, there is no question that he did. Ray testified about the best practices for using TRAX software, including using at least 30 days of data, verifying cell tower locations, and cross-checking TRAX results with real-world evidence. Detective Heikkila testified that he followed these best practices. Therefore, assuming the TRAX software itself is reliable (more on that in a moment), Detective Heikkila did reliably apply the TRAX software, so his testimony is acceptable under Rule 702(d).

The bulk of Defendant's motion focuses on whether the TRAX software is a reliable method, as required by Rule 702(c). The ultimate inquiry here is whether, in the Court's discretion, it finds that the software is a reliable principle or method. As noted above, four factors help guide this analysis.

A. Testability

“An untestable scientific theory is all theory and no science.” Gissantaner, 990 F.3d at 463. “The question on the table is whether a method can be ‘assessed for reliability,' not whether it always gets it right.” Id. at 464 (citing Fed.R.Evid. 702 advisory committee's note to 2000 amendment). “Disputes about the ‘adequacy of the [theory's] testing' or about the ‘accuracy of [a theory's] results,' generally speaking, provide grist for adversarial examination, not grounds for exclusion.” Id. (citing United States v. Bonds, 12 F.3d 540, 558-59 (6th Cir. 1993)).

TRAX can be tested. Ray testified that the software's results can be tested with drive testing, which verifies the “hand-off” range of each cell tower. Further, Ray and his team have gone out into the field and made phone calls to verify that a phone in a certain location will connect to a certain tower. Dr. Jovanovic testified that drive testing has limits which make it less accurate, but as the Circuit recently held, this testimony goes to the adequacy of the testing, which is fodder for cross-examination rather than a reason to exclude the evidence. TRAX is testable, so this factor tends to show reliability.

B. Peer Review

Peer review is a key measure of reliability. Id. at 464-65. Peer review is the anonymous review of an experimenter's methods, data, and conclusions on paper. United States v. Mitchell, 365 F.3d 215, 238 (3d Cir. 2004). It “conditions publication on a bona fide process” of review by members of the relevant field. Daubert v. Merrell Dow Pharms., Inc., 43 F.3d 1311, 1318 n. 6 (9th Cir. 1995). The Gissantaner Court noted as follows:

If experts “have other scientists review their work” and if the other scientists have the chance to identify any methodological flaws, that usually suffices. Mitchell v. Gencorp Inc., 165 F.3d 778, 784 (10th Cir. 1999). When scientific research is accepted by a reputable journal following the “usual rigors of peer review,” that represents “a significant indication that it is taken seriously by other scientists, i.e., that it meets at least the minimal criteria of good science.” Daubert, 43 F.3d at 1318.
990 F.3d at 465. Publication following peer review is a significant indication that a technology or methodology is good science, so peer review alone will typically satisfy this aspect of the Daubert inquiry. Id. at 464-65.

While Ray relies on the “peer review” of courtroom testimony, the Court does not find that argument compelling. When an expert uses the TRAX software and testifies about it in court, he or she does not anonymously review the TRAX algorithm or confirm its accuracy; this is not a peer review in the scientific sense. Nor are the documents proffered on the ZetX website: the two reviews Ray cited in his testimony (authored by Cherry Biometrics and Dr. Jovanovic) are not true anonymous, published reviews. Cherry Biometrics uses the same scientific methodology as TRAX, but Cherry Biometrics did not conduct a peer review of the TRAX software in any sense. And obviously Dr. Jovanovic did not approvingly peer review TRAX, as he testified that TRAX is unreliable and unsupported by science.

Ray also testified that TRAX has been peer reviewed by Dr. Filipo Sharevski in the book Mobile Network Forensics (see ECF No. 112-1). Having read the excerpt of Mobile Network Forensics provided by Defendant, the Court concludes that it is not a scientific peer review. The book itself appears to be a general overview of its subject matter. TRAX is discussed for only five pages in a chapter that covers several analytical methods; it is reviewed as an example processing tool. The book discusses the functions contained in the TRAX software, but it does not evaluate the software's algorithm or its proprietary method of placing and mapping cell sectors. It is not apparent to the Court that TRAX submitted its theories and supporting data for review; rather, TRAX appears to have simply given the author access to TRAX as a data processing tool. Indeed, the discussion of TRAX is so succinct that TRAX is not listed in the book's index. The overview of the TRAX software in Mobile Network Forensics is not an accounting of TRAX's “methods, data, and conclusions on paper.” Accordingly, Mobile Network Forensics is not a true peer review.

There is no record evidence that TRAX has been peer reviewed, so this factor does not show reliability.

C. Error Rate

“This consideration looks to the error rate of the technology and to whether the scientific community has established standards that forensic scientists can use to mitigate the risk of error.” Id.

The Court's analysis of this factor is limited by the record. There is no published data on TRAX's error rate: all the Court has is Ray's testimony that the software is 95% accurate. Defendant points out that Ray testified that prior to April 2020, TRAX's accuracy rate was “in the mid-80s,” yet in November 2018, Ray testified in an unrelated Massachusetts case that TRAX had an error rate of 3% (i.e., an accuracy rate of 97%). Thus, Defendant urges that the Court cannot trust Ray's proffered accuracy rate. But while Dr. Jovanovic testified that TRAX is not “anywhere near” this accurate, he did not offer a competing accuracy rate. Accordingly, Dr. Jovanovic's mere assertion does not undermine Ray's testimony on this subject.

See Commonwealth v. Carrasquillo, Case No. 16-973 (Mass. Nov. 27, 2018) (Mem.) (ECF No. 103-1 at PageID.761).

Defendant also argues that there are no standards for lowering the error rate. While the Court does not have a published document that lists what standards ZetX uses to lower the TRAX error rate, Ray did testify that his team uses the drive-test results to update the accuracy of the TRAX algorithm. It seems that there is at least one standard for lowering the error rate.

This is a close call. Ray was a compelling witness, but there is no supporting evidence that substantiates the 95% accuracy rate, nor any scientific basis for adjusting that figure up or down. On the other hand, the Court does not have even a ballpark estimate from Dr. Jovanovic of how low the accuracy rate might be. On balance, the Court accepts Ray's estimate of TRAX's accuracy. Accordingly, on this record, the Court concludes that this factor supports reliability. Disputes regarding accuracy are better left to adversarial examination rather than exclusion. See id.

D. General Acceptance

This factor looks to general acceptance, not uniform acceptance, within the scientific community. Id. at 466. “What matters is whether the relevant scientific community accepts the software.” Id. “After that, the long-tested cauldron of cross-examination, not exclusion, is the place to go for accuracy.” Id. In Gissantaner, the Circuit found that the relevant technology had garnered wide use in forensic laboratories across the country because over 45 laboratories use it, “including the FBI and many state law enforcement agencies.” Id.

It is not clear how widespread the use of TRAX is. TRAX's parent company, ZetX, was recently acquired by LexisNexis, which suggests some widespread use (or an attempt to make use more widespread), and the impression the Court drew from Ray's testimony was that many law enforcement agencies have used TRAX. However, he did not testify to a concrete number of users. And Dr. Jovanovic testified that TRAX is not widely accepted in the scientific community because, as is clear from the TRAX website, TRAX is the only user of its proprietary algorithm; no other organization uses the TRAX methodology or anything similar.

As with accuracy, Ray is a compelling witness, but without concrete supporting evidence it is hard to gauge how widespread the use of TRAX is. Thus, the Court comes to the same conclusion: this factor supports reliability, though not compellingly.

E. Balancing the Factors

The Gissantaner factors split on this record. The parties have each made substantial arguments, and the Court finds this to be a close call. TRAX is testable but not peer reviewed, and the final two factors rest solely on Ray's statements. Based on how knowledgeable and compelling Ray was as a witness, however, the latter two factors tip the scale in favor of reliability. The Court finds that the software is reliable within the meaning of Rule 702(c), and notes that the concerns Defendant has raised can be explored via cross-examination and Dr. Jovanovic's testimony if Defendant wishes.

IV.

For the reasons stated in this opinion, the Court concludes that the TRAX software is a reliable method within the meaning of Rule 702(c), and that Detective Heikkila applied the software reliably, within the meaning of Rule 702(d). Accordingly, the cell-site location evidence is admissible, and

IT IS HEREBY ORDERED that Defendant's motion (ECF No. 89) is DENIED.

IT IS SO ORDERED.

