Review Quality Controls. Here is a list of some of the main quality control methods in use for large-scale document reviews:
- Multiple expert review of key seed-set training documents using both subject matter experts (attorneys) and AI experts (technologists).
- Direct supervision and feedback by the responsible lawyer(s) (merits counsel) signing under Rule 26(g).
- Extensive quality control methods, including ongoing training, sampling, positive feedback loops, clever batching, and, when necessary, quick reassignment or removal of reviewers who are not working well on the project.
- Experienced, well-motivated human reviewers who know and like the AI agents (software tools) they work with.
- New tools and psychological techniques (e.g., game theory, storytelling) to facilitate prolonged concentration (beyond just coffee, $, and fear) and to keep attorney reviewers engaged and motivated to perform the complex legal judgment tasks required to correctly review thousands of usually boring documents for days on end (voyeurism will only take you so far). See Robots With A Story To Tell and Robot Games: the Gamification of Legal Review.
- Highly skilled project managers who know and understand their team, both human and computer, and the new tools and techniques under development to help coach the team.
- Strategic cooperation between opposing counsel with adequate disclosures to build trust and mutually acceptable relevancy standards. See Cooperation page of EDBP.
- Final, last-chance review of a production set before it goes out the door, using spot checking, judgmental sampling (e.g., searching for those attorney domains one more time), and random sampling.
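The random-sampling step of a final pre-production check can be illustrated with a short sketch. This is only an illustration, not any particular vendor's tool: the document IDs, sample size, and collection size below are all hypothetical.

```python
import random

def final_qc_sample(production_ids, sample_size=400, seed=42):
    """Draw a simple random sample from a production set for last-chance review.

    production_ids: list of document identifiers (hypothetical)
    sample_size: number of documents to pull for manual spot checking
    """
    rng = random.Random(seed)  # fixed seed so the sample is reproducible and documentable
    k = min(sample_size, len(production_ids))
    return rng.sample(production_ids, k)

# Hypothetical usage: sample 400 documents out of a 50,000-document production.
production = [f"DOC-{i:06d}" for i in range(50_000)]
qc_batch = final_qc_sample(production)
print(len(qc_batch))  # 400
```

A fixed random seed is used so that the sample drawn can be re-created later if the quality control process is ever challenged.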
See generally: The Sedona Conference® Commentary on Achieving Quality in the E-Discovery Process, May 2009.
For more on sampling and its use in both quality control and quality assurance, see the Review Quality Control page of EDBP; Introducing “ei-Recall” – A New Gold Standard for Recall Calculations in Legal Search – Part One, Part Two and Part Three; In Legal Search Exact Recall Can Never Be Known; and the KPMG publications: Gabriel, Manfred, Quality Control For Predictive Coding In e-Discovery (2013); and, The Case For Statistical Sampling In e-Discovery (2012).
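The basic idea behind recall estimation by sampling, which underlies methods such as ei-Recall, can be sketched as follows. This is a simplified illustration using a plain normal-approximation (Wald) interval, not the ei-Recall formula itself, and every count in the example is hypothetical: sample the discarded (null) set, estimate how many relevant documents it still contains, and derive a recall range from the resulting confidence interval.

```python
import math

def estimate_recall(relevant_found, null_set_size, sample_size, sample_relevant, z=1.96):
    """Estimate recall by random-sampling the null (discarded) set.

    relevant_found:  relevant documents identified for production
    null_set_size:   documents set aside as not relevant
    sample_size:     random sample drawn from the null set
    sample_relevant: relevant documents found in that sample (false negatives)
    z: 1.96 gives an approximately 95% confidence interval
    """
    p = sample_relevant / sample_size                  # prevalence of relevant docs in the sample
    margin = z * math.sqrt(p * (1 - p) / sample_size)  # normal-approximation margin of error
    lo, hi = max(0.0, p - margin), min(1.0, p + margin)
    fn_point = null_set_size * p                       # estimated false negatives left behind
    recall_point = relevant_found / (relevant_found + fn_point)
    # Recall falls as the estimated false-negative count rises, so the interval inverts.
    recall_low = relevant_found / (relevant_found + null_set_size * hi)
    recall_high = relevant_found / (relevant_found + null_set_size * lo) if lo > 0 else 1.0
    return recall_low, recall_point, recall_high

# Hypothetical numbers: 9,000 relevant documents found; 100,000 documents in the
# null set; a 1,500-document random sample of the null set turns up 15 relevant.
low, point, high = estimate_recall(9000, 100_000, 1500, 15)
print(f"recall ~ {point:.2%} (range {low:.2%} to {high:.2%})")
```

The interval matters more than the point estimate: as the sources cited above explain, exact recall can never be known in legal search, only bounded.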
Also see: The U.S. District Court for the Northern District of California Guidelines for the Discovery of Electronically Stored Information at 2.02 on Rule 26(f) conferences. The guideline urges parties to discuss methods to reduce costs and improve efficiencies, including “sampling methods to validate the search for relevant information.” The Northern California ESI checklist for use during the Rule 26(f) meet and confer process also includes a suggestion for attorneys to discuss:
The quality control method(s) the producing party will use to evaluate whether a production is missing relevant ESI or contains substantial amounts of irrelevant ESI.
For a detailed discussion of Ralph Losey’s particular approach to quality control, with an emphasis on metrics and flow-state concentration, see ZEN Document Review. ZEN stands for Zero Error Numerics. Using Zero Error Numerics, skilled reviewers can attain very high levels of efficiency and quality. ZEN methods include:
- predictive coding analytics, a type of artificial intelligence, actively managed by skilled human analysts in a hybrid approach;
- data visualizations with metrics to monitor progress;
- flow-state of human reviewer concentration and interaction with AI processes;
- quiet, uninterrupted, single-minded focus (dual tasking during review is prohibited);
- disciplined adherence to a scientifically proven set of search and review methods including linear, keyword, similarity, concept, and predictive coding;
- repeated tests for errors, especially retrieval omissions;
- objective measurements of recall, precision and accuracy ranges;
- judgmental and random sampling and analysis such as ei-Recall;
- active project management and review-lawyer supervision;
- small team approach with AI leverage, instead of large numbers of reviewers;
- recognition that mere relevance is irrelevant;
- recognition of the importance of simplicity under the 7±2 rule;
- multiple fail-safe systems for error detection of all kinds, including reviewer inconsistencies;
- use of only the highest quality, tested e-discovery software and vendor teams under close supervision and teamwork;
- use of only experienced, knowledgeable Subject Matter Experts for relevancy guidance, either directly or by close consultation;
- extreme care taken to protect client confidentiality; and,
- high ethics – our goal is to find and disclose the truth in compliance with local laws, not win a particular case.
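The “objective measurements of recall, precision and accuracy ranges” item above reduces to standard confusion-matrix arithmetic over sampled coding results. A minimal sketch, with hypothetical sample counts:

```python
def review_metrics(tp, fp, fn, tn):
    """Compute standard review-quality metrics from sampled coding results.

    tp: documents coded relevant that are truly relevant (true positives)
    fp: documents coded relevant that are not (false positives)
    fn: relevant documents the review missed (false negatives)
    tn: irrelevant documents correctly set aside (true negatives)
    """
    precision = tp / (tp + fp)              # how clean the production is
    recall = tp / (tp + fn)                 # how complete the production is
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical sample: 450 true positives, 50 false positives,
# 100 false negatives, 9,400 true negatives.
p, r, a = review_metrics(450, 50, 100, 9400)
print(f"precision={p:.0%} recall={r:.0%} accuracy={a:.1%}")
```

Note that in low-prevalence collections, accuracy alone is misleading (coding everything irrelevant scores high), which is why recall and precision are the measures emphasized above.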
Significant cost savings can also be attained by following Zero Error Numerics methods, especially when compared to more traditional review methods.
Practicing attorneys and paralegals are invited to submit proposed additional best practices for review and quality control not included in the above, and to submit more detailed descriptions of the methods already listed, or links to helpful articles.