Janette Wipper, Esq., Deepika Bains, Esq., Siham Nurhussein, Esq., Sanford Wittels & Heisler, LLP, San Francisco, CA, for Plaintiffs and Class.
Brett M. Anders, Esq., Victoria Woodin Chavey, Esq., Jeffrey W. Brecher, Esq., Jackson Lewis LLP, Melville, NY, for Defendant MSL Group.
OPINION AND ORDER
ANDREW J. PECK, United States Magistrate Judge:
In my article Search, Forward: Will manual document review and keyword searches be replaced by computer-assisted coding?, I wrote:
To my knowledge, no reported case (federal or state) has ruled on the use of computer-assisted coding. While anecdotally it appears that some lawyers are using predictive coding technology, it also appears that many lawyers (and their clients) are waiting for a judicial decision approving of computer-assisted review.
Perhaps they are looking for an opinion concluding that: " It is the opinion of this court that the use of predictive coding is a proper and acceptable means of conducting searches under the Federal Rules of Civil Procedure, and furthermore that the software provided for this purpose by [insert name of your favorite vendor] is the software of choice in this court." If so, it will be a long wait.
....
Until there is a judicial opinion approving (or even critiquing) the use of predictive coding, counsel will just have to rely on this article as a sign of judicial approval. In my opinion, computer-assisted coding should be used in those cases where it will help " secure the just, speedy, and inexpensive" (Fed.R.Civ.P. 1) determination of cases in our e-discovery world.
Andrew Peck, Search, Forward, L. Tech. News, Oct. 2011, at 25, 29. This judicial opinion now recognizes that computer-assisted review is an acceptable way to search for relevant ESI in appropriate cases.
To correct the many blogs about this case, initiated by a press release from plaintiffs' vendor: the Court did not order the parties to use predictive coding. The parties had agreed to defendants' use of it but disputed its scope and implementation; the Court ruled on those disputes, thus accepting the use of computer-assisted review in this lawsuit.
CASE BACKGROUND
In this action, five female named plaintiffs are suing defendant Publicis Groupe, "one of the world's 'big four' advertising conglomerates," and its United States public relations subsidiary, defendant MSL Group. (See Dkt. No. 4: Am. Compl. ¶¶ 1, 5, 26-32.) Plaintiffs allege that defendants have a "glass ceiling" that limits women to entry-level positions, and that there is "systemic, company-wide gender discrimination against female PR employees like Plaintiffs." (Am. Compl. ¶¶ 4-6, 8.) Plaintiffs allege that the gender discrimination includes
(a) paying Plaintiffs and other female PR employees less than similarly-situated male employees; (b) failing to promote or advance Plaintiffs and other female PR employees at the same rate as similarly-situated male employees; and (c) carrying out discriminatory terminations, demotions and/or job reassignments of female PR employees when the company reorganized its PR practice beginning in 2008 ....
(Am. Compl. ¶ 8.)
Plaintiffs assert claims for gender discrimination under Title VII (and under similar New York State and New York City laws) (Am. Compl. ¶¶ 204-25), pregnancy discrimination under Title VII and related violations of the Family and Medical Leave Act (Am. Compl. ¶¶ 239-71), as well as violations of the Equal Pay Act and Fair Labor Standards Act (and the similar New York Labor Law) (Am. Compl. ¶¶ 226-38).
The complaint seeks to bring the Equal Pay Act/FLSA claims as a "collective action" (i.e., opt-in) on behalf of all "current, former, and future female PR employees" employed by defendants in the United States "at any time during the applicable liability period" (Am. Compl. ¶¶ 179-80, 190-203), and as a class action on the gender and pregnancy discrimination claims and on the New York Labor Law pay claim (Am. Compl. ¶¶ 171-98). Plaintiffs, however, have not yet moved for collective action or class certification.
Defendant MSL denies the allegations in the complaint and has asserted various affirmative defenses. (See generally Dkt. No. 19: MSL Answer.) Defendant Publicis is challenging the Court's jurisdiction over it, and the parties have until March 12, 2012 to conduct jurisdictional discovery. (See Dkt. No. 44: 10/12/11 Order.)
COMPUTER-ASSISTED REVIEW EXPLAINED
My Search, Forward article explained my understanding of computer-assisted review, as follows:
By computer-assisted coding, I mean tools (different vendors use different names) that use sophisticated algorithms to enable the computer to determine relevance, based on interaction with (i.e., training by) a human reviewer.
Unlike manual review, where the review is done by the most junior staff, computer-assisted coding involves a senior partner (or [small] team) who reviews and codes a "seed set" of documents. The computer identifies properties of those documents that it uses to code other documents. As the senior reviewer continues to code more sample documents, the computer predicts the reviewer's coding. (Or, the computer codes some documents and asks the senior reviewer for feedback.)
When the system's predictions and the reviewer's coding sufficiently coincide, the system has learned enough to make confident predictions for the remaining documents. Typically, the senior lawyer (or team) needs to review only a few thousand documents to train the computer.
Some systems produce a simple yes/no as to relevance, while others give a relevance score (say, on a 0 to 100 basis) that counsel can use to prioritize review. For example, the documents scoring above 50 may contain 97% of the relevant documents while constituting only 20% of the entire document set.
Counsel may decide, after sampling and quality control tests, that documents with a score of below 15 are so highly likely to be irrelevant that no further human review is necessary. Counsel can also decide the cost-benefit of manual review of the documents with scores of 15-50.
Andrew Peck, Search, Forward, L. Tech. News, Oct. 2011, at 25, 29.
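To make the score-based workflow just described concrete, the following is a minimal illustrative sketch in Python. The document scores are hypothetical, and the 50 and 15 thresholds simply track the article's example; no actual vendor's tool works exactly this way.

```python
# Illustrative only: buckets documents by a hypothetical 0-100 relevance
# score, using the example thresholds (50 and 15) from the article above.
def triage(documents):
    review_first = [d for d in documents if d["score"] > 50]          # likely relevant
    cost_benefit = [d for d in documents if 15 <= d["score"] <= 50]   # review if proportionate
    presumed_irrelevant = [d for d in documents if d["score"] < 15]   # sample for quality control
    return review_first, cost_benefit, presumed_irrelevant

docs = [{"id": 1, "score": 87}, {"id": 2, "score": 33}, {"id": 3, "score": 4}]
high, middle, low = triage(docs)
print(len(high), len(middle), len(low))  # -> 1 1 1
```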
From a different perspective, every person who uses email uses predictive coding, even if they do not realize it. The " spam filter" is an example of predictive coding.
My article further explained my belief that Daubert would not apply to the results of using predictive coding, but that in any challenge to its use, this Judge would be interested in both the process used and the results:
[I]f the use of predictive coding is challenged in a case before me, I will want to know what was done and why that produced defensible results. I may be less interested in the science behind the " black box" of the vendor's software than in whether it produced responsive documents with reasonably high recall and high precision.
That may mean allowing the requesting party to see the documents that were used to train the computer-assisted coding system. (Counsel would not be required to explain why they coded documents as responsive or non-responsive, just what the coding was.) Proof of a valid " process," including quality control testing, also will be important.
....
Of course, the best approach to the use of computer-assisted coding is to follow the Sedona Cooperation Proclamation model. Advise opposing counsel that you plan to use computer-assisted coding and seek agreement; if you cannot, consider whether to abandon predictive coding for that case or go to the court for advance approval.
Id.
THE ESI DISPUTES IN THIS CASE AND THEIR RESOLUTION
After several discovery conferences and rulings, Judge Sullivan (the then-assigned District Judge) referred the case to me for general pretrial supervision. (Dkt. No. 48: 11/28/11 Referral Order.) At my first discovery conference with the parties, both parties' counsel mentioned that they had been discussing an "electronic discovery protocol," and MSL's counsel stated that an open issue was "plaintiffs' reluctance to utilize predictive coding to try to cull down the" approximately three million electronic documents from the agreed-upon custodians. (Dkt. No. 51: 12/2/11 Conf. Tr. at 7-8.) Plaintiffs' counsel clarified that MSL had "over simplified [plaintiffs'] stance on predictive coding," i.e., that it was not opposed but had "multiple concerns ... on the way in which [MSL] plan[s] to employ predictive coding" and plaintiffs wanted "clarification." (12/2/11 Conf. Tr. at 21.)
When defense counsel mentioned the disagreement about predictive coding, I stated that: " You must have thought you died and went to Heaven when this was referred to me," to which MSL's counsel responded: " Yes, your Honor. Well, I'm just thankful that, you know, we have a person familiar with the predictive coding concept." (12/2/11 Conf. Tr. at 8-9.)
The Court did not rule but offered the parties the following advice:
Now, if you want any more advice, for better or for worse on the ESI plan and whether predictive coding should be used, ... I will say right now, what should not be a surprise, I wrote an article in the October Law Technology News called Search Forward, which says predictive coding should be used in the appropriate case.
Is this the appropriate case for it? You all talk about it some more. And if you can't figure it out, you are going to get back in front of me. Key words, certainly unless they are well done and tested, are not overly useful. Key words along with predictive coding and other methodology, can be very instructive.
I'm also saying to the defendants who may, from the comment before, have read my article. If you do predictive coding, you are going to have to give your seed set, including the seed documents marked as nonresponsive to the plaintiff's counsel so they can say, well, of course you are not getting any [relevant] documents, you're not appropriately training the computer.
(12/2/11 Conf. Tr. at 20-21.) The December 2, 2011 conference adjourned with the parties agreeing to further discuss the ESI protocol. (12/2/11 Conf. Tr. at 34-35.)
The ESI issue was next discussed at a conference on January 4, 2012. (Dkt. No. 71: 1/4/12 Conf. Tr.) Plaintiffs' ESI consultant conceded that plaintiffs "have not taken issue with the use of predictive coding or, frankly, with the confidence levels that they [MSL] have proposed...." (1/4/12 Conf. Tr. at 51.) Rather, plaintiffs took issue with MSL's proposal that, after the computer was fully trained and the results generated, MSL would review and produce only the top 40,000 documents, which it estimated would cost $200,000 (at $5 per document). (1/4/12 Conf. Tr. at 47-48, 51.) The Court rejected MSL's 40,000-document proposal as a "pig in a poke." (1/4/12 Conf. Tr. at 51-52.) The Court explained that "where [the] line will be drawn [as to review and production] is going to depend on what the statistics show for the results," since "[p]roportionality requires consideration of results as well as costs. And if stopping at 40,000 is going to leave a tremendous number of likely highly responsive documents unproduced, [MSL's proposed cutoff] doesn't work." (1/4/12 Conf. Tr. at 51-52; see also id. at 57-58; Dkt. No. 88: 2/8/12 Conf. Tr. at 84.) The parties agreed to further discuss and finalize the ESI protocol by late January 2012, with a conference held on February 8, 2012. (1/4/12 Conf. Tr. at 60-66; see 2/8/12 Conf. Tr.)
Custodians
The first issue regarding the ESI protocol involved the selection of which custodians' emails would be searched. MSL agreed to thirty custodians for a " first phase." (Dkt. No. 88: 2/8/12 Conf. Tr. at 23-24.) MSL's custodian list included the president and other members of MSL's " executive team," most of its HR staff and a number of managing directors. (2/8/12 Conf. Tr. at 24.)
Plaintiffs sought to include as additional custodians seven male " comparators," explaining that the comparators' emails were needed in order to find information about their job duties and how their duties compared to plaintiffs' job duties. (2/8/12 Conf. Tr. at 25-27.) Plaintiffs gave an example of the men being given greater " client contact" or having better job assignments. (2/8/12 Conf. Tr. at 28-30.) The Court held that the search of the comparators' emails would be so different from that of the other custodians that the comparators should not be included in the emails subjected to predictive coding review. (2/8/12 Conf. Tr. at 28, 30.) As a fallback position, plaintiffs proposed to " treat the comparators as a separate search," but the Court found that plaintiffs could not describe in any meaningful way how they would search the comparators' emails, even as a separate search. (2/8/12 Conf. Tr. at 30-31.) Since the plaintiffs likely could develop the information needed through depositions of the comparators, the Court ruled that the comparators' emails would not be included in phase one. (2/8/12 Conf. Tr. at 31.)
Plaintiffs also sought to include MSL's CEO, Olivier Fleuriot, located in France and whose emails were mostly written in French. (2/8/12 Conf. Tr. at 32-34.) The Court concluded that because his emails with the New York based executive staff would be gathered from those custodians, and Fleuriot's emails stored in France likely would be covered by the French privacy and blocking laws, Fleuriot should not be included as a first-phase custodian. (2/8/12 Conf. Tr. at 35.)
See, e.g., Societe Nationale Industrielle Aerospatiale v. U.S. Dist. Ct. for the S.D. of Iowa, 482 U.S. 522, 107 S.Ct. 2542, 96 L.Ed.2d 461 (1987); see also The Sedona Conference, International Principles on Discovery, Disclosure & Data Protection (2011), available at http://www.thesedonaconference.org/dltForm?did=IntlPrinciples2011.pdf.
Plaintiffs sought to include certain managing directors from MSL offices at which no named plaintiff worked. (2/8/12 Conf. Tr. at 36-37.) The Court ruled that since plaintiffs had not yet moved for collective action status or class certification, until the motions were made and granted, discovery would be limited to offices (and managing directors) where the named plaintiffs had worked. (2/8/12 Conf. Tr. at 37-39.)
The final issue raised by plaintiffs related to the phasing of custodians and the discovery cutoff dates. MSL proposed finishing phase-one discovery completely before considering what to do about a second phase. ( See 2/8/12 Conf. Tr. at 36.) Plaintiffs expressed concern that there would not be time for two separate phases, essentially seeking to move the phase-two custodians back into phase one. (2/8/12 Conf. Tr. at 35-36.) The Court found MSL's separate phase approach to be more sensible and noted that if necessary, the Court would extend the discovery cutoff to allow the parties to pursue discovery in phases. (2/8/12 Conf. Tr. at 36, 50.)
Sources of ESI
The parties agreed on certain ESI sources, including the " EMC SourceOne [Email] Archive," the " PeopleSoft" human resources information management system and certain other sources including certain HR " shared" folders. ( See Dkt. No. 88: 2/8/12 Conf. Tr. at 44-45, 50-51.) As to other " shared" folders, neither side was able to explain whether the folders merely contained forms and templates or collaborative working documents; the Court therefore left those shared folders for phase two unless the parties promptly provided information about likely contents. (2/8/12 Conf. Tr. at 47-48.)
The Court noted that because the named plaintiffs worked for MSL, plaintiffs should have some idea what additional ESI sources, if any, likely had relevant information; since the Court needed to consider proportionality pursuant to Rule 26(b)(2)(C), plaintiffs needed to provide more information to the Court than they were doing if they wanted to add additional data sources into phase one. (2/8/12 Conf. Tr. at 49-50.) The Court also noted that where plaintiffs were getting factual information from one source ( e.g., pay information, promotions, etc.), " there has to be a limit to redundancy" to comply with Rule 26(b)(2)(C). (2/8/12 Conf. Tr. at 54.)
The Court also suggested that the best way to resolve issues about what information might be found in a certain source is for MSL to show plaintiffs a sample printout from that source. (2/8/12 Conf. Tr. at 55-56.)
The Predictive Coding Protocol
The parties agreed to use a 95% confidence level (plus or minus two percent) to create a random sample of the entire email collection; that sample of 2,399 documents will be reviewed to determine relevant (and not relevant) documents for a "seed set" used to train the predictive coding software. (Dkt. No. 88: 2/8/12 Conf. Tr. at 59-61.) An area of disagreement was that MSL reviewed the 2,399 documents before the parties agreed to add two additional concept groups (i.e., issue tags). (2/8/12 Conf. Tr. at 62.) MSL suggested that since it had agreed to provide all 2,399 documents (and MSL's coding of them) to plaintiffs for their review, plaintiffs can code them for the new issue tags, and MSL will incorporate that coding into the system. (2/8/12 Conf. Tr. at 64.) Plaintiffs' vendor agreed to that approach. (2/8/12 Conf. Tr. at 64.)
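For readers checking the arithmetic, the 2,399-document figure is consistent with the standard normal-approximation sample-size formula with a finite population correction. The sketch below is an illustration under those assumptions, not part of the parties' protocol:

```python
import math

# Sample size for estimating a proportion at a given confidence level and
# margin of error, with a finite population correction. The three-million
# population is the approximate email volume in this case.
def sample_size(population, z=1.959964, margin=0.02, p=0.5):
    n0 = z * z * p * (1 - p) / (margin * margin)        # infinite-population size (~2401)
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # correct for the finite collection

print(sample_size(3_000_000))  # -> 2399 at a 95% confidence level, +/- 2%
```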
To further create the seed set to train the predictive coding software, MSL coded certain documents through " judgmental sampling." (2/8/12 Conf. Tr. at 64.) The remainder of the seed set was created by MSL reviewing " keyword" searches with Boolean connectors (such as " training and Da Silva Moore," or " promotion and Da Silva Moore" ) and coding the top fifty hits from those searches. (2/8/12 Conf. Tr. at 64-66, 72.) MSL agreed to provide all those documents (except privileged ones) to plaintiffs for plaintiffs to review MSL's relevance coding. (2/8/12 Conf. Tr. at 66.) In addition, plaintiffs provided MSL with certain other keywords, and MSL used the same process with plaintiffs' keywords as with the MSL keywords, reviewing and coding an additional 4,000 documents. (2/8/12 Conf. Tr. at 68-69, 71.) All of this review to create the seed set was done by senior attorneys (not paralegals, staff attorneys or junior associates). (2/8/12 Conf. Tr. at 92-93.) MSL reconfirmed that " [a]ll of the documents that are reviewed as a function of the seed set, whether [they] are ultimately coded relevant or irrelevant, aside from privilege, will be turned over to" plaintiffs. (2/8/12 Conf. Tr. at 73.)
The next area of discussion was the iterative rounds used to stabilize the training of the software. MSL's vendor's predictive coding software ranks documents on a score from 100 to zero, i.e., from most likely relevant to least likely relevant. (2/8/12 Conf. Tr. at 70.) MSL proposed using seven iterative rounds; in each round it would review at least 500 documents from different concept clusters to see if the computer is returning new relevant documents. (2/8/12 Conf. Tr. at 73-74.) After the seventh round, to determine whether the computer is well trained and stable, MSL would review a random sample (of 2,399 documents) from the discards (i.e., documents coded as non-relevant) to make sure the documents the software determined to be non-relevant do not, in fact, include highly relevant documents. (2/8/12 Conf. Tr. at 74-75.) For each of the seven rounds and the final quality-check random sample, MSL agreed that it would show plaintiffs all the documents it looked at, including those deemed not relevant (except for privileged documents). (2/8/12 Conf. Tr. at 76.)
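That final quality-check step is sometimes called an "elusion" sample in e-discovery practice (a term not used in this Opinion). The following is a minimal sketch of the idea; the function and the human-review callback are hypothetical, not MSL's vendor's actual software:

```python
import random

# Hypothetical sketch of the discard-pile quality check described above:
# draw a random sample from the documents the software coded non-relevant,
# have a human review each one, and estimate how much relevant material
# the software left behind.
def elusion_rate(discards, human_review, sample_size=2399):
    sample = random.sample(discards, min(sample_size, len(discards)))
    missed = sum(1 for doc in sample if human_review(doc))  # human codes it relevant
    return missed / len(sample)  # a low rate suggests the training has stabilized
```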
Plaintiffs' vendor noted that " we don't at this point agree that this is going to work. This is new technology and it has to be proven out." (2/8/12 Conf. Tr. at 75.) Plaintiffs' vendor agreed, in general, that computer-assisted review works, and works better than most alternatives. (2/8/12 Conf. Tr. at 76.) Indeed, plaintiffs' vendor noted that " it is fair to say [that] we are big proponents of it." (2/8/12 Conf. Tr. at 76.) The Court reminded the parties that computer-assisted review " works better than most of the alternatives, if not all of the [present] alternatives. So the idea is not to make this perfect, it's not going to be perfect. The idea is to make it significantly better than the alternatives without nearly as much cost." (2/8/12 Conf. Tr. at 76.)
The Court accepted MSL's proposal for the seven iterative reviews, but with the following caveat:
But if you get to the seventh round and [plaintiffs] are saying that the computer is still doing weird things, it's not stabilized, etc., we need to do another round or two, either you will agree to that or you will both come in with the appropriate QC information and everything else and [may be ordered to] do another round or two or five or 500 or whatever it takes to stabilize the system.
(2/8/12 Conf. Tr. at 76-77; see also id. at 83-84, 88.)
On February 17, 2012, the parties submitted their " final" ESI Protocol which the Court " so ordered." (Dkt. No. 92: 2/17/12 ESI Protocol & Order.) Because this is the first Opinion dealing with predictive coding, the Court annexes hereto as an Exhibit the provisions of the ESI Protocol dealing with the predictive coding search methodology.
Plaintiffs included a paragraph noting their objection to the ESI Protocol, as follows:
As noted in Paragraphs A(1) and J of this Protocol, Plaintiffs object to the predictive coding methodology proposed by MSL.
OBSERVATIONS ON PLAINTIFFS' OBJECTIONS TO THE COURT'S RULINGS
On February 22, 2012, plaintiffs filed objections to the Court's February 8, 2012 rulings. (Dkt. No. 93: Pls. Rule 72(a) Objections; see also Dkt. No. 94: Nurhussein Aff.; Dkt. No. 95: Neale Aff.) While those objections are before District Judge Carter, a few comments are in order.
Plaintiffs' Reliance on Rule 26(g)(1)(A) is Erroneous
Plaintiffs' objections to my February 8, 2012 rulings assert that my acceptance of MSL's predictive coding approach "provides unlawful 'cover' for MSL's counsel, who has a duty under FRCP 26(g) to 'certify' that their client's document production is 'complete' and 'correct' as of the time it was made. FRCP 26(g)(1)(A)." (Dkt. No. 93: Pls. Rule 72(a) Objections at 8 n. 7; accord, id. at 2.) In large-data cases like this, involving over three million emails, no lawyer using any search method could honestly certify that its production is "complete" — but more importantly, Rule 26(g)(1) does not require that. Plaintiffs simply misread Rule 26(g)(1). The certification required by Rule 26(g)(1) applies "with respect to a disclosure." Fed.R.Civ.P. 26(g)(1)(A) (emphasis added). That is a term of art, referring to the mandatory initial disclosures required by Rule 26(a)(1). Since the Rule 26(a)(1) disclosure is information (witnesses, exhibits) that "the disclosing party may use to support its claims or defenses," and failure to provide such information leads to virtually automatic preclusion, see Fed.R.Civ.P. 37(c)(1), it is appropriate for the Rule 26(g)(1)(A) certification to require disclosures be "complete and correct."
Rule 26(g)(1)(B) is the provision that applies to discovery responses. It does not call for certification that the discovery response is " complete," but rather incorporates the Rule 26(b)(2)(C) proportionality principle. Thus, Rule 26(g)(1)(A) has absolutely nothing to do with MSL's obligations to respond to plaintiffs' discovery requests. Plaintiffs' argument is based on a misunderstanding of Rule 26(g)(1).
Rule 702 and Daubert Are Not Applicable to Discovery Search Methods
Plaintiffs' objections also argue that my acceptance of MSL's predictive coding protocol " is contrary to Federal Rule of Evidence 702" and " violates the gatekeeping function underlying Rule 702." (Dkt. No. 93: Pls. Rule 72(a) Objections at 2-3; accord, id. at 10-12.)
As part of this argument, plaintiffs complain that although both parties' experts ( i.e., vendors) spoke at the discovery conferences, they were not sworn in. (Pls. Rule 72(a) Objections at 12: " To his credit, the Magistrate [Judge] did ask the parties to bring [to the conference] the ESI experts they had hired to advise them regarding the creation of an ESI protocol. These experts, however, were never sworn in, and thus the statements they made in court at the hearings were not sworn testimony made under penalty of perjury." ) Plaintiffs never asked the Court to have the experts testify to their qualifications or be sworn in.
Federal Rule of Evidence 702 and the Supreme Court's decision in Daubert v. Merrell Dow Pharms., Inc., 509 U.S. 579, 113 S.Ct. 2786, 125 L.Ed.2d 469 (1993), deal with the trial court's role as gatekeeper, excluding unreliable expert testimony from being submitted to the jury at trial. See also Advisory Comm. Notes to Fed.R.Evid. 702. It is a rule for admissibility of evidence at trial.
If MSL sought to have its expert testify at trial and introduce the results of its ESI protocol into evidence, Daubert and Rule 702 would apply. Here, in contrast, the tens of thousands of emails that will be produced in discovery are not being offered into evidence at trial as the result of a scientific process or otherwise. The admissibility of specific emails at trial will depend upon each email itself (for example, whether it is hearsay, or a business record or party admission), not how it was found during discovery.
Rule 702 and Daubert simply are not applicable to how documents are searched for and found in discovery.
Plaintiffs' Reliability Concerns Are, At Best, Premature
Finally, plaintiffs' objections assert that "MSL's method lacks the necessary standards for assessing whether its results are accurate; in other words, there is no way to be certain if MSL's method is reliable." (Dkt. No. 93: Pls. Rule 72(a) Objections at 13-18.) Plaintiffs' concerns may be appropriate for resolution during or after the process (which the Court will be closely supervising), but are premature now. For example, plaintiffs complain that "MSL's method fails to include an agreed-upon standard of relevance that is transparent and accessible to all parties .... Without this standard, there is a high-likelihood of delay as the parties resolve disputes with regard to individual documents on a case-by-case basis." (Id. at 14.) Relevance is determined by plaintiffs' document demands. As the reviewer-disagreement studies discussed below show, perhaps only 5% of the disagreement among reviewers comes from close questions of relevance, as opposed to reviewer error. The issue regarding relevance standards might be significant if MSL's proposal were not totally transparent. Here, however, plaintiffs will see how MSL has coded every email used in the seed set (both relevant and not relevant), and the Court is available to quickly resolve any issues.
Plaintiffs complain they cannot determine whether "MSL's method actually works" because MSL does not specify how many relevant documents are permitted to remain in the final random sample of documents the software deemed irrelevant. (Pls. Rule 72(a) Objections at 15-16.) Plaintiffs argue that "without any decision about this made in advance, the Court is simply kicking the can down the road." (Id. at 16.) In order to determine proportionality, it is necessary to have more information than the parties (or the Court) now have, including how many relevant documents will be produced and at what cost to MSL. Will the case remain limited to the named plaintiffs, or will plaintiffs seek and obtain collective action and/or class action certification? In the final sample of documents deemed irrelevant, are any relevant documents found that are "hot," "smoking gun" documents (i.e., highly relevant)? Or are the only relevant documents more of the same thing? One hot document may require the software to be re-trained (or some other search method employed), while several documents that really do not add anything to the case might not matter. These types of questions are better decided "down the road," when real information is available to the parties and the Court.
FURTHER ANALYSIS AND LESSONS FOR THE FUTURE
The decision to allow computer-assisted review in this case was relatively easy: the parties agreed to its use (although they disagreed about how best to implement such review). The Court recognizes that computer-assisted review is not a magic, Staples-Easy-Button solution appropriate for all cases. The technology exists and should be used where appropriate, but it is not a case of machine replacing humans: it is the process used and the interaction of man and machine that the courts need to examine.
The objective of review in ediscovery is to identify as many relevant documents as possible, while reviewing as few non-relevant documents as possible. Recall is the fraction of relevant documents identified during a review; precision is the fraction of identified documents that are relevant. Thus, recall is a measure of completeness, while precision is a measure of accuracy or correctness. The goal is for the review method to result in higher recall and higher precision than another review method, at a cost proportionate to the "value" of the case. See, e.g., Maura R. Grossman & Gordon V. Cormack, Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review, Rich. J.L. & Tech., Spring 2011, at 8-9, available at http://jolt.richmond.edu/v17i3/article11.pdf.
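Stated formally, in standard information-retrieval notation (a restatement of the definitions above, not language from the cited article):

```latex
\mathrm{recall} = \frac{|\,\mathrm{relevant} \cap \mathrm{retrieved}\,|}{|\,\mathrm{relevant}\,|}
\qquad
\mathrm{precision} = \frac{|\,\mathrm{relevant} \cap \mathrm{retrieved}\,|}{|\,\mathrm{retrieved}\,|}
```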
The slightly more difficult case would be where the producing party wants to use computer-assisted review and the requesting party objects. The question to ask in that situation is what methodology the requesting party would suggest instead. Linear manual review is simply too expensive where, as here, there are over three million emails to review. Moreover, while some lawyers still consider manual review to be the "gold standard," that is a myth: statistics clearly show that computerized searches are at least as accurate as, if not more accurate than, manual review. Herb Roitblat, Anne Kershaw, and Patrick Oot of the Electronic Discovery Institute conducted an empirical assessment to "answer the question of whether there was a benefit to engaging in a traditional human review or whether computer systems could be relied on to produce comparable results," and concluded that "[o]n every measure, the performance of the two computer systems was at least as accurate (measured against the original review) as that of human re-review." Herbert L. Roitblat, Anne Kershaw & Patrick Oot, Document Categorization in Legal Electronic Discovery: Computer Classification v. Manual Review, 61 J. Am. Soc'y for Info. Sci. & Tech. 70, 79 (2010).
The tougher question, raised in Kleen Prods. LLC v. Packaging Corp. of Am. before Magistrate Judge Nan Nolan in Chicago, is whether the Court, at plaintiffs' request, should order the defendant to use computer-assisted review to respond to plaintiffs' document requests.
The Roitblat, Kershaw and Oot article noted that "[t]he level of agreement among human reviewers is not strikingly high," around 70-75%. They identify two sources for this variability: fatigue ("A document that they [the reviewers] might have categorized as responsive when they were more attentive might then be categorized [when the reviewer is distracted or fatigued] as non-responsive or vice versa."), and differences in "strategic judgment." Id. at 77-78. Another study found that responsiveness "is fairly well defined, and that disagreements among assessors are largely attributable to human error," with only 5% of reviewer disagreement attributable to borderline or questionable issues as to relevance. Maura R. Grossman & Gordon V. Cormack, Inconsistent Assessment of Responsiveness in E-Discovery: Difference of Opinion or Human Error? 9 (DESI IV: 2011 ICAIL Workshop on Setting Standards for Searching Elec. Stored Info. in Discovery, Research Paper), available at http://www.umiacs.umd.edu/~oard/desi4/papers/grossman3.pdf.
Likewise, Wachtell, Lipton, Rosen & Katz litigation counsel Maura Grossman and University of Waterloo professor Gordon Cormack studied data from the Text REtrieval Conference (TREC) Legal Track and concluded that: "[T]he myth that exhaustive manual review is the most effective — and therefore the most defensible — approach to document review is strongly refuted. Technology-assisted review can (and does) yield more accurate results than exhaustive manual review, with much lower effort." Maura R. Grossman & Gordon V. Cormack, Technology-Assisted Review in E-Discovery Can Be More Effective and More Efficient Than Exhaustive Manual Review, Rich. J.L. & Tech., Spring 2011, at 48. The technology-assisted reviews in the Grossman-Cormack article also demonstrated significant cost savings over manual review: "The technology-assisted reviews require, on average, human review of only 1.9% of the documents, a fifty-fold savings over exhaustive manual review." Id. at 43.
Grossman and Cormack also note that " not all technology-assisted reviews ... are created equal" and that future studies will be needed to " address which technology-assisted review process(es) will improve most on manual review." Id.
Because of the volume of ESI, lawyers frequently have turned to keyword searches to cull email (or other ESI) down to a more manageable volume for further manual review. Keywords have a place in production of ESI— indeed, the parties here used keyword searches (with Boolean connectors) to find documents for the expanded seed set to train the predictive coding software. In too many cases, however, the way lawyers choose keywords is the equivalent of the child's game of " Go Fish." The requesting party guesses which keywords might produce evidence to support its case without having much, if any, knowledge of the responding party's " cards" ( i.e., the terminology used by the responding party's custodians). Indeed, the responding party's counsel often does not know what is in its own client's " cards."
See Ralph C. Losey, " Child's Game of ‘ Go Fish’ is a Poor Model for e-Discovery Search," in Adventures in Electronic Discovery 209-10 (2011).
Another problem with keywords is that they often are over-inclusive, that is, they find responsive documents but also large numbers of irrelevant documents. In this case, for example, a keyword search for "training" resulted in 165,208 hits; Da Silva Moore's name resulted in 201,179 hits; "bonus" resulted in 40,756 hits; "compensation" resulted in 55,602 hits; and "diversity" resulted in 38,315 hits. (Dkt. No. 92: 2/17/12 ESI Protocol Ex. A.) If MSL had to manually review all of the keyword hits, many of which would not be relevant (i.e., would be false positives), it would be quite costly.
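To put those hit counts in dollars, one can apply MSL's own $5-per-document review estimate (discussed at the January 4, 2012 conference above) to the reported counts. This is a back-of-the-envelope illustration only; the hit sets likely overlap, so the per-term figures are not additive:

```python
# Rough cost of manually reviewing every hit for each keyword, using the
# $5-per-document estimate MSL gave at the January 4, 2012 conference.
hits = {"training": 165_208, "Da Silva Moore": 201_179, "bonus": 40_756,
        "compensation": 55_602, "diversity": 38_315}
COST_PER_DOC = 5  # dollars
for term, count in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"{term}: {count:,} hits -> ${count * COST_PER_DOC:,}")
# "training" alone implies roughly $826,040 of review cost.
```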
Moreover, keyword searches usually are not very effective. In 1985, scholars David Blair and M.E. Maron collected 40,000 documents from a Bay Area Rapid Transit accident, and instructed experienced attorney and paralegal searchers to use keywords and other review techniques to retrieve at least 75% of the documents relevant to 51 document requests. David C. Blair & M.E. Maron, An Evaluation of Retrieval Effectiveness for a Full-Text Document-Retrieval System, 28 Comm. ACM 289 (1985). The searchers believed they had met that goal, but their average recall was just 20%. Id. This result has been replicated in the TREC Legal Track studies over the past few years.
Judicial decisions have criticized specific keyword searches. Important early decisions in this area came from two of the leading judicial scholars in ediscovery, Magistrate Judges John Facciola (District of Columbia) and Paul Grimm (Maryland). See United States v. O'Keefe, 537 F.Supp.2d 14, 24 (D.D.C.2008) (Facciola, M.J.); Equity Analytics, LLC v. Lundin, 248 F.R.D. 331, 333 (D.D.C.2008) (Facciola, M.J.); Victor Stanley, Inc. v. Creative Pipe, Inc., 250 F.R.D. 251, 260, 262 (D.Md.2008) (Grimm, M.J.). I followed their lead in William A. Gross Construction Associates, Inc., where I wrote:
This Opinion should serve as a wake-up call to the Bar in this District about the need for careful thought, quality control, testing, and cooperation with opposing counsel in designing search terms or " keywords" to be used to produce emails or other electronically stored information (" ESI" ).
....
Electronic discovery requires cooperation between opposing counsel and transparency in all aspects of preservation and production of ESI. Moreover, where counsel are using keyword searches for retrieval of ESI, they at a minimum must carefully craft the appropriate keywords, with input from the ESI's custodians as to the words and abbreviations they use, and the proposed methodology must be quality control tested to assure accuracy in retrieval and elimination of " false positives." It is time that the Bar— even those lawyers who did not come of age in the computer era— understand this.
William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 256 F.R.D. 134, 134, 136 (S.D.N.Y.2009) (Peck, M.J.).
Computer-assisted review appears to be better than the available alternatives, and thus should be used in appropriate cases. While this Court recognizes that computer-assisted review is not perfect, the Federal Rules of Civil Procedure do not require perfection. See, e.g., Pension Comm. of Univ. of Montreal Pension Plan v. Banc of Am. Sec., 685 F.Supp.2d 456, 461 (S.D.N.Y.2010). Courts and litigants must be cognizant of the aim of Rule 1, to " secure the just, speedy, and inexpensive determination" of lawsuits. Fed.R.Civ.P. 1. That goal is further reinforced by the proportionality doctrine set forth in Rule 26(b)(2)(C), which provides that:
On motion or on its own, the court must limit the frequency or extent of discovery otherwise allowed by these rules or by local rule if it determines that:
(i) the discovery sought is unreasonably cumulative or duplicative, or can be obtained from some other source that is more convenient, less burdensome, or less expensive;
(ii) the party seeking discovery has had ample opportunity to obtain the information by discovery in the action; or
(iii) the burden or expense of the proposed discovery outweighs its likely benefit, considering the needs of the case, the amount in controversy, the parties' resources, the importance of the issues at stake in the action, and the importance of the discovery in resolving the issues.
In this case, the Court determined that the use of predictive coding was appropriate considering: (1) the parties' agreement, (2) the vast amount of ESI to be reviewed (over three million documents), (3) the superiority of computer-assisted review to the available alternatives ( i.e., linear manual review or keyword searches), (4) the need for cost effectiveness and proportionality under Rule 26(b)(2)(C), and (5) the transparent process proposed by MSL.
This Court was one of the early signatories to The Sedona Conference Cooperation Proclamation, and has stated that "the best solution in the entire area of electronic discovery is cooperation among counsel. This Court strongly endorses The Sedona Conference Proclamation (available at www.thesedonaconference.org)." William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 256 F.R.D. at 136. An important aspect of cooperation is transparency in the discovery process. MSL's transparency in its proposed ESI search protocol made it easier for the Court to approve the use of predictive coding. As discussed above, MSL confirmed that "[a]ll of the documents that are reviewed as a function of the seed set, whether [they] are ultimately coded relevant or irrelevant, aside from privilege, will be turned over to" plaintiffs. (Dkt. No. 88: 2/8/12 Conf. Tr. at 73; see also 2/17/12 ESI Protocol at 14: "MSL will provide Plaintiffs' counsel with all of the non-privileged documents and will provide, to the extent applicable, the issue tag(s) coded for each document .... If necessary, counsel will meet and confer to attempt to resolve any disagreements regarding the coding applied to the documents in the seed set.") While not all experienced ESI counsel believe it necessary to be as transparent as MSL was willing to be, such transparency allows the opposing counsel (and the Court) to be more comfortable with computer-assisted review, reducing fears about the so-called "black box" of the technology. This Court highly recommends that counsel in future cases be willing to at least discuss, if not agree to, such transparency in the computer-assisted review process.
It also avoids the GIGO problem, i.e., garbage in, garbage out.
Several other lessons for the future can be derived from the Court's resolution of the ESI discovery disputes in this case.
First, it is unlikely that courts will be able to determine or approve a party's proposal as to when review and production can stop until the computer-assisted review software has been trained and the results are quality control verified. Only at that point can the parties and the Court see where there is a clear drop off from highly relevant to marginally relevant to not likely to be relevant documents. While cost is a factor under Rule 26(b)(2)(C), it cannot be considered in isolation from the results of the predictive coding process and the amount at issue in the litigation.
Second, staging of discovery by starting with the sources (including custodians) most likely to contain relevant information, without prejudice to the requesting party seeking more after conclusion of that first-stage review, is a way to control discovery costs. If staging requires a longer discovery period, most judges should be willing to grant such an extension. (This Judge runs a self-proclaimed "rocket docket," but informed the parties here of the Court's willingness to extend the discovery cutoff if necessary to allow the staging of custodians and other ESI sources.)

Third, in many cases the requesting counsel's client has knowledge of the producing party's records, either because of an employment relationship, as here, or because of other dealings between the parties (e.g., contractual or other business relationships). It is surprising that in many cases counsel do not appear to have sought and utilized their client's knowledge about the opposing party's custodians and document sources. Similarly, counsel for the producing party often are not sufficiently knowledgeable about their own client's custodians and business terminology. Another way to phrase cooperation is "strategic proactive disclosure of information": if you are knowledgeable about, and tell the other side, who your key custodians are and how you propose to search for the requested documents, opposing counsel and the Court are more apt to agree to your approach (at least as phase one, without prejudice).
Fourth, the Court found it very helpful that the parties' ediscovery vendors were present and spoke at the court hearings where the ESI Protocol was discussed. (At ediscovery programs, this is sometimes jokingly referred to as "bring your geek to court day.") Even where, as here, counsel are very familiar with ESI issues, it is very helpful to have the parties' ediscovery vendors (or in-house IT personnel or in-house ediscovery counsel) present at court conferences where ESI issues are being discussed. It also is important for the vendors and/or knowledgeable counsel to be able to explain complicated ediscovery concepts in ways that make them easily understandable to judges who may not be tech-savvy.
CONCLUSION
This Opinion appears to be the first in which a Court has approved the use of computer-assisted review. That does not mean computer-assisted review must be used in all cases, or that the exact ESI protocol approved here will be appropriate in all future cases that utilize computer-assisted review. Nor does this Opinion endorse any vendor (the Court was very careful not to mention the names of the parties' vendors in the body of this Opinion, although they are revealed in the attached ESI Protocol), nor any particular computer-assisted review tool. What the Bar should take away from this Opinion is that computer-assisted review is an available tool and should be seriously considered for use in large-data-volume cases where it may save the producing party (or both parties) significant amounts of legal fees in document review. Counsel no longer have to worry about being the "first" or the "guinea pig" for judicial acceptance of computer-assisted review. As with keywords or any other technological solution to ediscovery, counsel must design an appropriate process, including use of available technology, with appropriate quality-control testing, to review and produce relevant ESI while adhering to Rule 1 and Rule 26(b)(2)(C) proportionality. Computer-assisted review now can be considered judicially approved for use in appropriate cases.
SO ORDERED.
EXHIBIT
MONIQUE DA SILVA MOORE, MARYELLEN O'DONOHUE, LAURIE MAYERS, HEATHER PIERCE, and KATHERINE WILKINSON, on behalf of themselves and all others similarly situated,
Plaintiffs,
vs.
PUBLICIS GROUPE SA and MSLGROUP,
Defendants.
PARTIES' PROPOSED PROTOCOL RELATING TO THE PRODUCTION OF ELECTRONICALLY STORED INFORMATION (" ESI" ) & ORDER
A. Scope
1. General.
The procedures and protocols outlined herein govern the production of electronically stored information ("ESI") by MSLGROUP Americas, Inc. ("MSL") during the pendency of this litigation. The parties to this protocol will take reasonable steps to comply with this agreed-upon protocol for the production of documents and information existing in electronic format. Nothing in this protocol will be interpreted to require disclosure of documents or information protected from disclosure by the attorney-client privilege, the work-product doctrine or any other applicable privilege or immunity. It is Plaintiffs' position that nothing in this protocol will be interpreted to waive Plaintiffs' right to object to this protocol, as portions of it were mandated by the Court over Plaintiffs' objections, including Plaintiffs' objections to the predictive coding methodology proposed by MSL.
2. Limitations and No-Waiver.
This protocol provides a general framework for the production of ESI on a going-forward basis. The Parties and their attorneys do not intend by this protocol to waive their rights to the attorney work-product privilege, except as specifically required herein, and any such waiver shall be strictly and narrowly construed and shall not extend to other matters or information not specifically described herein. All Parties preserve their attorney-client privileges and other privileges, and there is no intent by the protocol, or the production of documents pursuant to the protocol, to in any way waive or weaken these privileges. All documents produced hereunder are fully protected and covered by the Parties' confidentiality and clawback agreements and orders of the Court effectuating same.
3. Relevant Time Period.
January 1, 2008 through February 24, 2011 for all e-mails and for all non-e-mail ESI relating to topics other than pay discrimination. January 1, 2005 through February 24, 2011 for all non-e-mail ESI relating to pay discrimination for the New York Plaintiffs.
B. ESI Preservation
1. MSL issued litigation notices to designated employees on February 10, 2010, March 14, 2011 and June 9, 2011.
C. Sources
1. The Parties have identified the following sources of potentially discoverable ESI at MSL. Phase I sources will be addressed first, and Phase II sources will be addressed after Phase I source searches are complete. Sources marked as "N/A" will not be searched by the Parties.

a. EMC SourceOne Archive (Phase I): Archiving system used to capture and store all incoming and outbound e-mails and selected instant message conversations saved through IBM Sametime (see below).
b. Lotus Notes E-mail (N/A): Active corporate system that provides e-mail communication and calendaring functions.
c. GroupWise E-mail (N/A): Legacy corporate system that provided e-mail communication and calendaring functions.
d. IBM Sametime (N/A): Lotus Notes instant messaging and collaboration application.
e. Home Directories (Phase II): Personal network storage locations on the file server(s) dedicated to individual users. (With the exception of 2 home directories for which MSL will collect and analyze the data to determine the level of duplication as compared to the EMC SourceOne Archive. The parties will meet and confer regarding the selection of the two custodians.)
f. Shared Folders (Phase II): Shared network storage locations on the file server(s) that are accessible by individual users, groups of users or entire departments. (With the exception of the following Human Resources shared folders, which will be in Phase I: Corporate HR, North America HR and New York HR.)
g. Database Servers (N/A): Backend databases (e.g., Oracle, SQL, MySQL) used to store information for front-end applications or other purposes.
h. Halogen Software (Phase I): Performance management program provided by Halogen to conduct performance evaluations.
i. Noovoo (Phase II): Corporate Intranet site.
j. Corporate Feedback (Phase I): E-mail addresses that employees may utilize to provide the company with comments, suggestions and overall feedback.
k. Hyperion Financial Management ("HFM") (N/A): Oracle application that offers global financial consolidation, reporting and analysis.
l. Vurv/Taleo (Phase II): Talent recruitment software.
m. ServiceNow (N/A): Help Desk application used to track employee computer-related requests.
n. PeopleSoft (Phase I): Human resources information management system.
o. PRISM (Phase I): PeopleSoft component used for time and billing management.
p. Portal (Phase II): A project-based portal provided through Oracle/BEA Systems.
q. Desktops/Laptops (Phase II): Fixed and portable computers provided to employees to perform work-related activities. (With the exception of 2 desktop/laptop hard drives for which MSL will collect and analyze the data to determine the level of duplication as compared to the EMC SourceOne Archive. The parties will meet and confer regarding the selection of the two custodians.)
r. Publicis Benefits Connection (Phase II): Web-based site that maintains information about employee benefits and related information.
s. GEARS (Phase II): Employee expense reporting system.
t. MS & L City (N/A): Former corporate Intranet.
u. Adium (N/A): Application which aggregates instant messages.
v. Pidgin (N/A): Application which aggregates instant messages.
w. IBM Lotus Traveler and MobileIron (N/A): Mobile device synchronization and security system.
x. Mobile Communication Devices (N/A): Portable PDAs, smart phones and tablets used for communication.
y. Yammer (N/A): Social media and collaboration portal.
z. SalesForce.com (N/A): Web-based customer relationship management application.
aa. Removable Storage Devices (N/A): Portable storage media, external hard drives, thumb drives, etc. used to store copies of work-related ESI.

a. EMC SourceOne — MSL uses SourceOne, an EMC e-mail archiving system that captures and stores all e-mail messages that pass through the corporate e-mail system. In addition, if a user chooses to save an instant messaging chat conversation from IBM Sametime (referenced below), that too would be archived in SourceOne. Defendant MSL also acknowledges that calendar items are regularly ingested into the SourceOne system. The parties have agreed that this data source will be handled as outlined in section E below.
b. Lotus Notes E-mail — MSL currently maintains multiple Lotus Notes Domino servers in various data centers around the world. All e-mail communication and calendar items are journaled in real time to the EMC SourceOne archive. The parties have agreed to not collect any information from this data source at this time.
c. GroupWise E-mail — Prior to the implementation of the Lotus Notes environment, GroupWise was used for all e-mail and calendar functionality. Before the decommissioning of the GroupWise servers, MSL created backup tapes of all servers that housed the GroupWise e-mail databases. The parties have agreed to not collect any information from this data source at this time.
d. IBM Sametime — MSL provides custodians with the ability to have real time chat conversations via the IBM Sametime application that is part of the Lotus Notes suite of products.
e. Home Directories — Custodians with corporate network access at MSL also have a dedicated and secured network storage location where they are able to save files. MSL will collect the home directory data for 2 custodians and analyze the data to determine the level of duplication of documents in this data source against the data contained in the EMC SourceOne archive for the same custodians. (The parties will meet and confer regarding the selection of the two custodians.) The results of the analysis will be provided to Plaintiffs so that a determination can be made by the parties as to whether MSL will include this data source in its production of ESI to Plaintiffs. If so, the parties will attempt to reach an agreement as to the approach used to collect, review and produce responsive and non-privileged documents.
f. Shared Folders — Individual employees, groups of employees and entire departments at MSL are given access to shared network storage locations to save and share files. As it relates to the Human Resources related shared folders (i.e., North America HR Drive (10.2 GB), Corporate HR Drive (440 MB), N.Y. HR Drive (1.9 GB), Chicago HR Drive (1.16 GB), Boston HR Drive (43.3 MB), and Atlanta HR Drive (6.64 GB)), MSL will judgmentally review and produce responsive and non-privileged documents from the North America HR Drive, Corporate HR Drive, and N.Y. HR Drive. MSL will produce to Plaintiffs general information regarding the content of other Shared Folders. The parties will meet and confer regarding the information gathered concerning the other Shared Folders and discuss whether any additional Shared Folders should be moved to Phase I.
g. Database Servers — MSL has indicated that it does not utilize any database servers, other than those that pertain to the sources outlined above in C, which are likely to contain information relevant to Plaintiffs' claims.
h. Halogen Software — MSL utilizes a third party product, Halogen, for performance management and employee evaluations. The parties will meet and confer in order to exchange additional information and attempt to reach an agreement as to the scope of data and the approach used to collect, review and produce responsive and non-privileged documents.
i. Noovoo — MSL maintains a corporate Intranet site called " Noovoo" where employees are able to access Company-related information. MSL will provide Plaintiffs with any employment-related policies maintained within Noovoo.
j. Corporate Feedback — MSL has maintained various e-mail addresses that employees may utilize to provide the company with comments, suggestions and overall feedback. These e-mail addresses include "powerofone@mslworldwide.com", "poweroftheindividual@mslworldwide.com", "townhall@mslworldwide.com" and "whatsonyourmind@mslworldwide.com". The parties have agreed that all responsive and non-privileged ESI will be produced from these e-mail accounts and any other e-mail accounts that fall under this category of information. At present, MSL intends to manually review the contents of each of these e-mail accounts. However, if after collecting the contents of each of the e-mail accounts MSL determines that a manual review would be impractical, the parties will meet and confer as to the approach used to collect, review and produce responsive and non-privileged documents.
k. Hyperion Financial Management (" HFM" ) — MSL uses an Oracle application called HFM that offers global financial consolidation, reporting and analysis capabilities.
l. Vurv/Taleo — Since approximately 2006, MSL used an application known as Vurv as its talent recruitment software. As of August 31, 2011, as a result of Vurv being purchased by Taleo, MSL has been using a similar application by Taleo as its talent recruitment software. The application, which is accessed through MSL's public website, allows users to search for open positions as well as input information about themselves. To the extent Plaintiffs contend they were denied any specific positions, they will identify same and the Parties will meet and confer to discuss what, if any, information exists within Vurv/Taleo regarding the identified position. If information exists in Vurv/Taleo or another source regarding these positions, MSL will produce this information, to the extent such information is discoverable.
m. ServiceNow — MSL utilizes ServiceNow as its Help Desk application. This system covers a wide variety of requests by employees for computer-related assistance (e.g., troubleshoot incidents, install software, etc.).
n. PeopleSoft — MSL utilizes PeopleSoft, an Oracle-based software product, to record employee data such as date of hire, date of termination, promotions, salary increases, transfers, etc. MSL has produced data from this source and will consider producing additional data in response to a specific inquiry from Plaintiffs.
o. PRISM — MSL utilizes PRISM for tracking time and billing. It is used primarily to track an employee's billable time. MSL will consider producing additional data in response to a specific inquiry from Plaintiffs.
p. Portal — MSL maintains a portal provided through Oracle/BEA Systems. The portal is web-based and is used for light workflow activities (such as reviewing draft documents).
q. Desktops/Laptops — MSL provided employees with desktop and/or laptop computers to assist in work related activities. MSL will collect the desktop/laptop hard drive data for two custodians and analyze the data to determine the level of duplication of documents in this data source against the data contained in the EMC SourceOne archive for the same custodians. (The parties will meet and confer regarding the selection of the two custodians.) The results of the analysis will be provided to Plaintiffs so that a determination can be made by the parties as to whether MSL will include this data source in its production of ESI to Plaintiffs. If so, the Parties will attempt to reach an agreement as to the approach used to collect, review and produce responsive and non-privileged documents.
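By way of illustration only, a duplication analysis of this kind is commonly performed by computing a hash of each collected file and checking those digests against the archived documents. The following minimal sketch assumes both collections are available as directories of loose files; the paths and the choice of MD5 are illustrative assumptions, not terms of this protocol.

    import hashlib
    from pathlib import Path

    def digest(path: Path) -> str:
        """Return the MD5 digest of a file's contents."""
        h = hashlib.md5()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def duplication_rate(custodian_dir: str, archive_dir: str) -> float:
        """Fraction of custodian files whose contents already exist in the archive."""
        archive_hashes = {digest(p) for p in Path(archive_dir).rglob("*") if p.is_file()}
        custodian_files = [p for p in Path(custodian_dir).rglob("*") if p.is_file()]
        if not custodian_files:
            return 0.0
        dupes = sum(1 for p in custodian_files if digest(p) in archive_hashes)
        return dupes / len(custodian_files)

    # Hypothetical paths for one of the two test custodians:
    # print(duplication_rate("/data/custodian1_laptop", "/data/sourceone_export"))

Content hashing detects exact duplicates only; near-duplicate detection of the kind review platforms perform would require additional techniques.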
r. Publicis Benefits Connection — Plaintiffs understand that MSL provides employees with access to a centralized web based site that provides access to corporate benefits information and other related content.
s. GEARS — MSL maintains a centralized web-based expense tracking and reporting system called " GEARS" where users are able to enter expenses and generate reports.
t. MS & L City — MSL maintained a corporate web-based Intranet prior to migrating to Noovoo.
u. Adium — This is a free and open source instant messaging client for Mac OS X users.
v. Pidgin — Pidgin is a chat program which lets users log into accounts on multiple chat networks simultaneously. However, the data resides with a third party messaging provider (e.g. AIM, Yahoo!, Google Talk, MSN Messenger, etc.).
w. IBM Lotus Traveler and MobileIron — MSL maintains these systems for e-mail device sync and security features for employees' mobile devices, including Blackberry devices, iPhones, iPads, Android phones, and Android tablets.
x. Mobile Communication Devices — MSL provides mobile devices and/or connectivity including Blackberry devices, iPhones, iPads, Android phones, and Android tablets to designated employees.
y. Yammer — This is an instant messaging application hosted externally, used for approximately one year in or around 2008 through 2009.
z. SalesForce.com — This is a web-based customer relationship management application but it was not widely used.
aa. Removable Storage Devices — MSL does not restrict authorized employees from using removable storage devices.
D. Custodians
1. The Parties agree that MSL will search the e-mail accounts of the following individuals as they exist on MSL's EMC SourceOne archive. (Except where a date range is noted, the custodian's entire e-mail account was collected from the archive.)
Custodian Name — Title
1. Lund, Wendy — Executive VP of Global Client and Business Development
2. Fite, Vicki — Managing Director, MSL Los Angeles
3. Wilson, Renee — President, NE Region, Managing Director NY
4. Brennan, Nancy (1/1/08 to 5/31/08) — SVP/Director Corporate Branding
5. Lilien (Lillien, Kashanian), Tara — SVP, North America Human Resources
6. Miller, Peter — Executive Vice President, CFO
7. Masini, Rita — Chief Talent Officer
8. Tsokanos, Jim — President of the Americas
9. Da Silva Moore, Monique — Director Healthcare Practice, Global
10. O'Kane, Jeanine (2/8/10 to 2/24/11) — Director of Healthcare North America
11. Perlman, Carol — Senior VP
12. Mayers, Laurie — SVP MS & L Digital
13. Wilkinson, Kate — Account Executive
14. Curran, Joel (5/1/08 to 5/31/10) — Managing Director MSL Chicago
15. Shapiro, Maury — North American CFO
16. Baskin, Rob (1/1/08 to 12/31/08) — Managing Director
17. Pierce, Heather — VP
18. Branam, Jud (1/1/08 to 1/31/10) — Managing Director, MS & L Digital
19. McDonough, Jenni (1/1/08 to 12/31/08) — VP, Director of Human Resources
20. Hannaford, Donald (1/1/08 to 3/1/08) — Managing Director
21. Orr, Bill (1/1/08 to 2/24/11) — Managing Director
22. Dhillon, Neil (9/8/08 to 5/31/10) — Managing Director MSL Washington DC
23. Hubbard, Zaneta — Account Supervisor
24. Morgan, Valerie (1/1/08 to 2/24/11) — HR Director
25. Daversa, Kristin (1/1/08 to 2/24/11) — HR Director
26. Vosk, Lindsey (1/1/08 to 2/24/11) — HR Manager
27. Carberry, Joe (1/1/08 to 2/24/11) — President, Western Region
28. Sheffield, Julie (1/1/08 to 2/24/11) — HR/Recruiting Associate
29. O'Donohue, MaryEllen — SVP (2010)
30. Hass, Mark — CEO (former)
31. Morsman, Michael — Managing Director, Ann Arbor (former)
E. Search Methodology
1. General.
The Parties have discussed the methodologies or protocols for the search and review of ESI collected from the EMC SourceOne archive, and the following is a summary of the Parties' agreement on the use of Predictive Coding. This section relates solely to the EMC SourceOne data source (hereinafter referred to as the " e-mail collection" ).
2. General Overview of Predictive Coding Process.
MSL will utilize the Axcelerate software by Recommind to search and review the e-mail collection for production in this case.
The process begins with Jackson Lewis attorneys developing an understanding of the entire e-mail collection while identifying a small number of documents, the initial seed set, that is representative of the categories to be reviewed and coded (relevance, privilege, issue-relation). At this step the first seed sets are generated through the use of search and analytical tools, including keyword, Boolean and concept searches, concept grouping, and, as needed, up to 40 other automatically populated filters available within the Axcelerate system. These tools assist the attorneys in identifying probative documents for each category to be reviewed and coded.
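By way of illustration only, the keyword hit counts exchanged under this protocol (see the next paragraph) are simple tallies of how many documents contain each proposed term. A minimal sketch, assuming the documents are available as plain text (the sample terms are hypothetical):

    def keyword_hit_counts(documents: list[str], keywords: list[str]) -> dict[str, int]:
        """Count, for each keyword, how many documents contain it."""
        return {kw: sum(1 for doc in documents if kw.lower() in doc.lower())
                for kw in keywords}

    # Hypothetical example:
    docs = ["The reorganization affected the PR practice.",
            "Her maternity leave was approved."]
    print(keyword_hit_counts(docs, ["reorganization", "maternity", "promotion"]))
    # {'reorganization': 1, 'maternity': 1, 'promotion': 0}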
Plaintiffs' counsel will be provided with preliminary results of MSL's hit counts using keyword searches to create a high priority relevant seed set, and will be invited to contribute their own proposed keywords. Thereafter, Plaintiffs' counsel will be provided with the non-privileged keyword hits— both from MSL's keyword list and Plaintiffs' keyword list— which were reviewed and coded by MSL. Plaintiffs' counsel will review the documents produced and promptly provide defense counsel with their own evaluation of the initial coding applied to the documents, including identification of any documents they believe were incorrectly coded. To the extent the parties disagree regarding the coding of a particular document, they will meet and confer in an effort to resolve the dispute prior to contacting the Court for resolution. The irrelevant documents so produced shall be promptly returned after review and analysis by Plaintiffs' counsel and/or resolution of any disputes by the Court.
The seed sets are then used to begin the Predictive Coding process. Each seed set of documents is applied to its relevant category and starts the software " training" process. The software uses each seed set to identify and prioritize all substantively similar documents over the complete corpus of the e-mail collection. The attorneys then review and code a judgmental sample of at least 500 of the " computer suggested" documents to ensure their proper categorization and to further calibrate the system by recoding documents into their proper categories. Axcelerate learns from the new corrected coding and the Predictive Coding process is repeated.
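Axcelerate's internal algorithms are proprietary and are not described in this protocol. Purely as a generic illustration of a single " train and suggest" round, the sketch below fits a simple text classifier to a coded seed set and returns the highest-ranked documents for attorney review; the TF-IDF features and logistic regression model are assumptions made for illustration, not a description of Recommind's technology.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    def suggest_documents(seed_texts, seed_labels, corpus_texts, n_suggest=500):
        """Train on the coded seed set (1 = relevant, 0 = not relevant) and
        return indices of the n_suggest corpus documents ranked most likely
        relevant, for attorney review and recoding."""
        vectorizer = TfidfVectorizer()
        X_seed = vectorizer.fit_transform(seed_texts)
        model = LogisticRegression(max_iter=1000)
        model.fit(X_seed, seed_labels)
        scores = model.predict_proba(vectorizer.transform(corpus_texts))[:, 1]
        ranked = sorted(range(len(corpus_texts)), key=lambda i: scores[i], reverse=True)
        return ranked[:n_suggest]

Each round of attorney corrections enlarges the training set, which is what the protocol refers to as the software " learning" from the recoded documents.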
Attorneys representing MSL will have access to the entire e-mail collection to be searched and will lead the computer training, but they will obtain input from Plaintiffs' counsel during the iterative seed selection and quality control processes and will share the information used to craft the search protocol as further described herein. All non-privileged documents reviewed by MSL during each round of the iterative process (i.e., both documents coded as relevant and irrelevant) will be produced to Plaintiffs' counsel during the iterative seed set selection process. Plaintiffs' counsel will review the documents produced and promptly provide defense counsel with their own evaluation of the initial coding applied to the documents, including identification of any documents they believe were incorrectly coded. To the extent the Parties disagree regarding the coding of a particular document, they will meet and confer in an effort to resolve the dispute prior to contacting the Court for resolution. Again, the irrelevant documents so produced shall be promptly returned after review and analysis by Plaintiffs' counsel and/or resolution of any disputes by the Court.
At the conclusion of the iterative review process, all documents predicted by Axcelerate to be relevant will be manually reviewed for production. However, depending on the number of documents returned, the relevancy rating of those documents, and the costs incurred during the development of the seed set and iterative reviews, MSL reserves the right to seek appropriate relief from the Court prior to commencing the final manual review.
The accuracy of the search processes, both the systems' functions and the attorney judgments to train the computer, will be tested and quality controlled by both judgmental and statistical sampling. In statistical sampling, a small set of documents is randomly selected from the total corpus of the documents to be tested. The small set is then reviewed and an error rate calculated from it. The error rate can then be reliably projected onto the total corpus, with a margin of error directly related to the sample size.
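For a simple proportion, that projection works as follows: if r miscoded documents are observed in a random sample of n, the estimated error rate is r/n, and its margin of error at the 95% confidence level is approximately 1.96 standard errors. A minimal sketch, with hypothetical figures:

    import math

    def error_rate_estimate(errors: int, sample_size: int, z: float = 1.96):
        """Estimate an error rate from a random sample, together with its
        margin of error at the confidence level implied by z (1.96 ~ 95%)."""
        p = errors / sample_size
        moe = z * math.sqrt(p * (1 - p) / sample_size)
        return p, moe

    # E.g., 48 miscoded documents found in a 2,399-document sample:
    p, moe = error_rate_estimate(48, 2399)
    print(f"{p:.3f} +/- {moe:.3f}")  # 0.020 +/- 0.006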
3. Issue Tags.
The parties agree that, as part of the seed set training described above, as well as during the iterative review process, all documents categorized as relevant and not privileged also shall be coded, to the extent applicable, with one or more of the following agreed-upon issue tags:
a. Reorganization.
b. Promotion/Assignments.
c. Work/Life Balance.
d. Termination.
e. Compensation.
f. Maternity/Pregnancy.
g. Complaints/HR.
h. Publicis Groupe/Jurisdiction.
This issue coding will take place during the initial random sample, creation of the seed set and initial and iterative training ( see paragraphs 4, 5 and 6 below). This input shall be provided to Plaintiffs' counsel along with the initial document productions. Plaintiffs' counsel shall promptly report any disagreements on classification, and the parties shall discuss these issues in good faith, so that the seed set training may be improved accordingly. This issue-tagging and disclosure shall take place during the described collaborative seed set training process. The disclosures made here by MSL regarding its issue coding are not required for the final production set.
4. Initial Random Sample.
Using the Axcelerate software to generate a random sample of the entire corpus of documents uploaded to the Axcelerate search and review platform, MSL's attorneys will conduct a review of the random sample for relevance and to develop a baseline for calculating recall and precision. To the extent applicable, any relevant documents also will be coded with one or more of the issue tags referenced in paragraph E.3 above. The random sample consists of 2,399 documents, which represents a 95% confidence level with a margin of error of plus or minus 2%. The Parties agree to utilize the random sample generated prior to the finalization of this protocol. However, during Plaintiffs' counsel's review of the random sample, they may advise as to whether they believe any of the documents should be coded with one or more of the subsequently added issue codes (i.e., Complaints/HR and Publicis Groupe/Jurisdiction) and will, as discussed above, indicate any disagreement with MSL's classifications.
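The sample size follows from the standard formula for estimating a proportion, n = z^2 p(1-p)/e^2, with z = 1.96 for 95% confidence, the conservative p = 0.5, and e = 0.02. That yields 2,401 for an unbounded population; a finite population correction for a corpus of the size at issue brings it to approximately the protocol's 2,399. A minimal sketch (the population figure in the example is hypothetical):

    import math

    def sample_size(margin: float, z: float = 1.96, p: float = 0.5,
                    population: int | None = None) -> int:
        """Sample size to estimate a proportion within +/- margin at the
        confidence level implied by z; applies a finite population
        correction when a population size is supplied."""
        n = (z ** 2) * p * (1 - p) / margin ** 2
        if population is not None:
            n = n / (1 + (n - 1) / population)
        return math.ceil(n)

    print(sample_size(0.02))                        # 2401, unbounded population
    print(sample_size(0.02, population=2_500_000))  # 2399 with the correction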
5. Seed Set.
a. Defendant MSL.
To create the initial seed set of documents that will be used to " train" the Axcelerate software as described generally above, MSL primarily utilized keywords listed on Exhibits A and B to this protocol, but also utilized other judgmental analysis and search techniques designed to locate highly relevant documents, including the Boolean, concept search and other features of Axcelerate. Given the volume of hits for each keyword (Exhibit A), MSL reviewed a sampling of the hits and coded them for relevance as well as for the following six preliminary issues: (i) Reorganization; (ii) Promotion; (iii) Work/Life Balance; (iv) Termination; (v) Compensation; and (vi) Maternity. Specifically, except for keywords that were proper names, MSL performed several searches within each set of keyword hits and reviewed a sample of the hits. The Axcelerate software ranked the hits in order of relevance based on the software's analytical capabilities and the documents were reviewed in decreasing order of relevance (i.e., each review of the sample of supplemental searches started with the highest ranked documents). Exhibit B identifies the supplemental searches conducted, the number of hits, the number of documents reviewed, the number of documents coded as potentially responsive and general comments regarding the results. In addition, to the extent applicable, documents coded as responsive also were coded with one or more issue tags. MSL will repeat the process outlined above and will include the newly defined issues and newly added custodians. MSL will provide Plaintiffs' counsel with all of the non-privileged documents and will provide, to the extent applicable, the issue tag(s) coded for each document, as described above. Plaintiffs' counsel shall promptly review and provide notice as to any documents with which they disagree or where they do not understand the coding. If necessary, counsel will meet and confer to attempt to resolve any disagreements regarding the coding applied to the documents in this seed set.
b. Plaintiffs.
To help create the initial seed set of documents that will be used to " train" the Axcelerate software, Plaintiffs provided a list of potential keywords to MSL. MSL provided Plaintiffs with a hit list for their proposed keywords. This process was repeated twice, with the hit list for Plaintiffs' most recent set of keywords attached as Exhibit C. MSL will review 4,000 randomly sampled documents from Plaintiffs' supplemental list of keywords to be coded for relevance and issue tags. MSL will provide Plaintiffs' counsel with all non-privileged documents and will provide, to the extent applicable, the issue tag(s) coded for each document. Plaintiffs' counsel shall promptly review and provide notice as to any documents with which they disagree or where they do not understand the coding. If necessary, the Parties' counsel will meet and confer to attempt to resolve any disagreements regarding the coding applied to the documents in this seed set.
c. Judgmental Sampling.
In addition to the above, a number of targeted searches were conducted by MSL in an effort to locate documents responsive to several of Plaintiffs' specific discovery requests. Approximately 578 documents have already been coded as responsive and produced to Plaintiffs. In addition, several judgmental searches were conducted which resulted in approximately 300 documents initially being coded as responsive and several thousand additional documents coded as irrelevant. The documents coded as relevant and non-privileged also will be reviewed by Plaintiffs' counsel and, subject to their feedback, included in the seed set. An explanation shall be provided by MSL's attorneys for the basis of the bulk tagging of irrelevant documents (primarily electronic periodicals and newsletters that were excluded in the same manner that spam or junk mail is excluded). The explanation shall include the types of documents bulk tagged as irrelevant as well as the process used to identify those types of documents and other similar documents that were bulk tagged as irrelevant.
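By way of illustration only, bulk tagging of this kind is typically driven by sender- or domain-level rules, much as a spam filter operates. A minimal sketch, assuming each message carries a sender address (the domains shown are hypothetical, not the ones MSL actually excluded):

    # Hypothetical periodical/newsletter domains; the protocol requires MSL
    # to disclose the actual document types and the process used.
    NEWSLETTER_DOMAINS = {"prnewswire.example.com", "dailybrief.example.com"}

    def bulk_tag_irrelevant(messages: list[dict]) -> list[dict]:
        """Tag messages from known periodical senders as irrelevant in bulk."""
        for msg in messages:
            domain = msg["sender"].rsplit("@", 1)[-1].lower()
            if domain in NEWSLETTER_DOMAINS:
                msg["coding"] = "irrelevant (bulk: electronic periodical)"
        return messages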
6. Initial And Iterative Training.
Following the creation of the first seed set, the Axcelerate software will review the entire data set to identify other potentially relevant documents. MSL will then review and tag a judgmental sample, consisting of a minimum of 500 documents, including all documents ranked as highly relevant or hot, of the new " Computer Suggested" documents suggested by the Axcelerate software. MSL's attorneys shall act in consultation with the Axcelerate software experts to make a reasonable, good faith effort to select documents in the judgmental sample that will serve to enhance and increase the accuracy of the predictive coding functions. The results of this first iteration, both the documents newly coded as relevant and those coded as not relevant, together with any particular issue code or codes applied, will be provided to Plaintiffs' counsel for review and comment. (All documents produced by the parties herein to each other, including, without limitation, these small seed set development productions, shall be made under the Confidentiality Stipulation in this matter as well as any clawback agreement that shall be reduced to an order acceptable to the Court. Any documents marked as irrelevant shall be returned to counsel for MSL at the conclusion of the iterative training phase, unless the relevancy of any document is disputed, in which case it may be submitted to the Court for review.)
Upon completion of the initial review, and any related meet and confer sessions and agreed upon coding corrections, the Axcelerate software will be run again over the entire data set for suggestions on other potentially relevant documents following the same procedures as the first iteration. The purpose of this second and any subsequent iterations of the Predictive Coding process will be to further refine and improve the accuracy of the predictions on relevance and various other codes. The results of the second iteration shall be reviewed and new coding shared with Plaintiffs' counsel as described for the first iteration. This process shall be repeated five more times, for a total of seven iterations, unless the change in the total number of relevant documents predicted by the system as a result of a new iteration, as compared to the last iteration, is less than five percent (5%), and no new documents are found that are predicted to be hot (aka highly relevant), at which point MSL shall have the discretion to stop the iterative process and begin the final review as next described. If more than 40,000 documents are returned in the final iteration, then MSL reserves the right to apply to the Court for relief and limitations in its review obligations hereunder. Plaintiffs reserve the right, at all times, to challenge the accuracy and reliability of the predictive coding process and the right to apply to the Court for a review of the process.
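The stopping rule described above reduces to a simple end-of-iteration check. A minimal sketch (the counts in the example are hypothetical):

    def may_stop(prev_relevant: int, curr_relevant: int,
                 new_hot_docs: int, iteration: int, max_iterations: int = 7) -> bool:
        """Per the protocol: the process ends after seven iterations, or
        earlier if the change in predicted-relevant documents is under 5%
        and the iteration surfaced no new hot (highly relevant) documents."""
        if iteration >= max_iterations:
            return True
        change = abs(curr_relevant - prev_relevant) / max(prev_relevant, 1)
        return change < 0.05 and new_hot_docs == 0

    # E.g., iteration 3 predicts 21,000 relevant documents, up from 20,400:
    print(may_stop(20400, 21000, 0, 3))  # True: 2.9% change, no new hot documents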
7. Final Search and Production.
All of the documents predicted to be relevant in the final iteration described in paragraph six above will be reviewed by MSL, unless it applies to the Court for relief hereunder. All documents found by MSL's review to be relevant and non-privileged will be promptly produced to Plaintiffs. If more than 40,000 documents are included in the final iteration, then MSL reserves its right to seek payment from Plaintiffs for all reasonable costs and fees MSL incurred related to the attorney review and production of more than 40,000 documents.
8. Quality Control by Random Sample of Irrelevant Documents.
In addition, at the conclusion of this search protocol development process described above, and before the final search and production described in Paragraph 7 above, MSL will review a random sample of 2,399 documents contained in the remainder of the database that were excluded as irrelevant. The results of this review, both the documents coded as relevant and not relevant, but not privileged, will be provided to Plaintiffs' counsel for review. (Any documents initially coded as " not relevant" will be provided subject to the Confidentiality Stipulation and any clawback agreements entered in this matter, and will be returned to counsel for MSL within 60 days of their production.) The purpose of this review is to allow calculation of the approximate degree of recall and precision of the search and review process used. If Plaintiffs object to the proposed review based on the random sample quality control results, or any other valid objection, they shall provide MSL with written notice thereof within five days of the receipt of the random sample. The parties shall then meet and confer in good faith to resolve any difficulties, and failing that shall apply to the Court for relief. MSL shall not be required to proceed with the final search and review described in Paragraph 7 above unless and until objections raised by Plaintiffs have been adjudicated by the Court or resolved by written agreement of the Parties.
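Recall can be estimated from this quality-control sample by projecting the rate of relevant documents found among the excluded documents (sometimes called the elusion rate) onto the full discard pile. A minimal sketch, with hypothetical figures:

    def estimated_recall(produced_relevant: int, discard_pile_size: int,
                         sample_size: int, relevant_in_sample: int) -> float:
        """Project the relevant documents missed from the random sample of
        the discard pile, then estimate recall = found / (found + missed)."""
        elusion_rate = relevant_in_sample / sample_size
        estimated_missed = elusion_rate * discard_pile_size
        return produced_relevant / (produced_relevant + estimated_missed)

    # E.g., 40,000 relevant documents produced; 24 relevant documents found
    # in a 2,399-document sample of a 500,000-document discard pile:
    print(round(estimated_recall(40000, 500000, 2399, 24), 3))  # ~0.889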
F. Costs
1. MSL proposes to limit the costs of its final review and production of responsive ESI from the MSL email collection to an additional $200,000, above and beyond the approximately $350,000 it has already paid or is anticipated to pay in e-discovery related activities as previously described and disclosed to Plaintiffs.
2. Plaintiffs agree to bear all of the costs associated with their compliance with the terms of this protocol and with the receipt and review of ESI produced hereunder, including the costs associated with their ESI experts at DOAR Litigation Consulting, who will be involved with Plaintiffs in all aspects of this ESI protocol. Plaintiffs propose that MSL bear all of the costs associated with its obligations under the terms of this protocol and do not agree to limit the amount of information subject to the review and production of ESI by MSL.
G. Format of Production For Documents Produced From Axcelerate
1. TIFF/Native File Format Production.
Documents will be produced as single-page TIFF images with corresponding multi-page text and necessary load files. The load files will include an image load file as well as a metadata (.DAT) file with the metadata fields identified on Exhibit D. Defendant MSL will produce spreadsheets (.xls files) and PowerPoint presentations (.ppt files) in native form as well as any documents that cannot be converted to TIFF format (e.g., audio or video files, such as mp3s, wavs, mpegs, etc.). In addition, for any redacted documents that are produced, the documents' metadata fields will be redacted where required. For the production of ESI from non-email sources, the parties will meet and confer to attempt to reach an agreement on the format of production.
2. Appearance.
Subject to appropriate redaction, each document's electronic image will convey the same information and image as the original document. Documents that present imaging or formatting problems will be promptly identified and the parties will meet and confer in an attempt to resolve the problems.
3. Document Numbering.
Each page of a produced document will have a legible, unique page identifier " Bates Number" electronically " burned" onto the image at a location that does not obliterate, conceal or interfere with any information from the source document. The Bates Number for each page of each document will be created so as to identify the producing party and the document number. In the case of materials redacted in accordance with applicable law or confidential materials contemplated in any Confidentiality Stipulation entered into by the parties, a designation may be " burned" onto the document's image at a location that does not obliterate or obscure any information from the source document.
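By way of illustration only, a Bates number of the kind described is typically a fixed producing-party prefix followed by a zero-padded sequence number; the prefix and padding width below are hypothetical assumptions, not terms of this protocol:

    def bates_number(prefix: str, page_id: int, width: int = 8) -> str:
        """Compose a Bates number identifying the producing party (prefix)
        and the page's sequence number."""
        return f"{prefix}{page_id:0{width}d}"

    print(bates_number("MSL", 123))  # MSL00000123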
4. Production Media.
The producing party will produce documents on readily accessible, computer or electronic media as the parties may hereafter agree upon, including CD-ROM, DVD, or external hard drive (with standard PC-compatible interface) (the " Production Media" ). Each piece of Production Media will be assigned a production number or other unique identifying label corresponding to the date of the production of documents on the Production Media (e.g., " Defendant MSL Production April 1, 2012" ) as well as the sequence of the material in that production (e.g., " -001", " -002" ). For example, if the production comprises document images on three DVDs, the producing party may label each DVD in the following manner: " Defendant MSL Production April 1, 2012-001", " Defendant MSL Production April 1, 2012-002", " Defendant MSL Production April 1, 2012-003." Additional information that will be identified on the physical Production Media includes: (1) text referencing that it was produced in da Silva Moore v. Publicis Groupe SA, et al.; and (2) the Bates Number range of the materials contained on the Production Media. Further, any replacement Production Media will cross-reference the original Production Media, clearly identify that it is a replacement, and cross-reference the Bates Number range that is being replaced.
5. Write Protection and Preservation.
All computer media that is capable of write-protection should be write-protected before production.
6. Inadvertent Disclosures.
The terms of the Parties' Clawback Agreement and Court Order shall apply to this protocol.
7. Duplicate Production Not Required.
A party producing data in electronic form need not produce the same document in paper format.
H. Timing
1. To the extent a timeframe is not specifically outlined herein, the parties will use their reasonable efforts to produce ESI in a timely manner consistent with the Court's discovery schedule.
2. The parties will produce ESI on a rolling basis.
I. General Provisions
1. Any practice or procedure set forth herein may be varied by agreement of the parties, confirmed first in writing, where such variance is deemed appropriate to facilitate the timely and economical exchange of electronic data.
2. Should any party subsequently determine it cannot in good faith proceed as required by this protocol, the parties will meet and confer to resolve any dispute before seeking Court intervention.
3. The Parties agree that e-discovery will be conducted in phases and, at the conclusion of the search process described in Section E above, the Parties will meet and confer regarding whether further searches of additional custodians and/or the Phase II sources is warranted and/or reasonable. If agreement cannot be reached, either party may seek relief from the Court.
J. Plaintiffs' Objection
1. Plaintiffs object to this ESI Protocol in its entirety. Plaintiffs submitted their own proposed ESI Protocol to the Court, but it was largely rejected. The Court then ordered the parties to submit a joint ESI Protocol reflecting the Court's rulings. Accordingly, Plaintiffs jointly submit this ESI Protocol with MSL, but reserve the right to object to its use in this case.
This protocol may be executed in counterparts. Each counterpart, when so executed, will be deemed an original, and all counterparts together will constitute the same instrument.
SO ORDERED:
Plaintiffs object to this ESI Protocol in its entirety. Plaintiffs submitted their own proposed ESI Protocol to the Court, but it was largely rejected. The Court then ordered the parties to submit a joint ESI Protocol reflecting the Court's rulings. Accordingly, Plaintiffs jointly submit this ESI Protocol with MSL, but reserve the right to object to its use in this case. (ESI Protocol ¶ J.1 at p. 22.)
(g) Signing Disclosures and Discovery Requests, Responses, and Objections.
(1) Signature Required; Effect of Signature. Every disclosure under Rule 26(a)(1) or (a)(3) and every discovery request, response, or objection must be signed by at least one attorney of record in the attorney's own name .... By signing, an attorney or party certifies that to the best of the person's knowledge, information, and belief formed after a reasonable inquiry:
(A) with respect to a disclosure, it is complete and correct as of the time it is made; and
(B) with respect to a discovery request, response, or objection, it is:
(i) consistent with these rules and warranted by existing law or by a nonfrivolous argument for extending, modifying, or reversing existing law, or for establishing new law;
(ii) not interposed for any improper purpose, such as to harass, cause unnecessary delay, or needlessly increase the cost of litigation; and
(iii) neither unreasonable nor unduly burdensome or expensive, considering the needs of the case, prior discovery in the case, the amount in controversy, and the importance of the issues at stake in the action.
Fed.R.Civ.P. 26(g)(1) (emphasis added).