With Millions of Documents Collected, When Has a Producing Party Completed Document Review?

by Joshua Gilliland


Document review can be a lengthy and involved process, with complex searches and multiple attorneys assigned to review potentially responsive data. Attorneys can rightfully ask, after diligently reviewing their search term reports and predictive coding hits, just when are we done with document review?

The answer is not as simple as when every email is read.

We can look at a recent case to see what the courts have to say on the matter. In Davine v. Golub Corp. (2017), the Defendants were given specific guidance on when their document review was complete. The order granting a motion to compel the production of email communications from 20 opt-in Plaintiffs stated the following:

Defendants are entitled to rely on their predictive coding model for purposes of identifying relevant responsive documents, and may cease their review of the documents identified as possibly relevant when they make a good faith determination that the burden of continuing the review outweighs the benefit in terms of identifying relevant documents.

Here is the $64,000 question: When does the burden of continuing document review outweigh the benefit of identifying relevant documents?

Case law states that the standard for producing electronically stored information is one of reasonableness, not perfection. Moreover, parties must conduct a reasonable search when responding to discovery requests.

Discovery depends on attorneys acting in good faith and meeting their professional obligations to “reasonably and diligently search for and produce responsive documents.”

How Much ESI is There to Review? Analyze the Case

One method for lawyers to determine whether they have conducted a reasonable search is using Case Analytics. The attorneys handling the case can know exactly how many records are in the case, how many have been viewed, and how many have been coded, along with an estimate of the number of months it will take to complete document review.
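The completion estimate described above is, at bottom, simple arithmetic over the case metrics. As a rough illustration only (the function name and numbers are invented for this sketch, not drawn from any particular review platform):

```python
# Hypothetical sketch: estimating months of review remaining from
# case metrics like those described above. All names and figures
# are invented for illustration.

def months_to_complete(total_records, records_reviewed, records_per_month):
    """Estimate months of document review remaining at the current pace."""
    remaining = total_records - records_reviewed
    return remaining / records_per_month

# e.g., 2,000,000 records collected, 500,000 already reviewed,
# and a team pace of 100,000 records per month:
estimate = months_to_complete(2_000_000, 500_000, 100_000)
print(f"Estimated months to complete review: {estimate:.1f}")  # 15.0
```

An estimate like this is only as good as the pace assumption behind it; review speed typically changes as predictive coding narrows the population.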

How Effective Are the Lawyers at Document Review?

Attorneys using predictive coding can focus their efforts on the ESI that is likely relevant to the case. Nevertheless, that ESI still takes time to review. Case managers can gauge document review effectiveness with metrics such as review time by attorney, review progress, rating trends, and reviewer pace.

The time spent on document review is billed directly to the client (unless the law firm charges under an alternative fee arrangement rather than the billable hour). The number of records reviewed per hour shows how effectively lawyers are using their time to identify what is responsive in a case.

If ongoing document review is yielding a large number of responsive records, then there is a strong argument that it is time well spent. However, if substantial time is being spent identifying non-responsive records, then relevant records have turned into a needle in a haystack. This is the moment when lawyers can ask in all seriousness: does continuing document review make economic sense? If a large number of responsive records have been identified, and hours are now being spent finding stray relevant records of marginal value to the case, the answer is no. Alternatively, if the remaining relevant ESI can make or break the case, then continuing review is worthwhile.
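The break-even judgment described above can be sketched as a diminishing-returns check on the responsiveness rate. This is a hypothetical illustration with an invented threshold, not a legal standard; the actual call is a fact-specific, good-faith determination:

```python
# Hypothetical sketch: has the recent responsiveness rate fallen so low
# that relevant records have become needles in a haystack? The threshold
# value is invented for illustration.

def burden_outweighs_benefit(responsive_found, records_reviewed,
                             min_responsive_rate=0.01):
    """Return True when the responsiveness rate of a recent batch falls
    below an (assumed) minimum rate, suggesting diminishing returns."""
    if records_reviewed == 0:
        return False  # nothing reviewed yet; no basis to stop
    rate = responsive_found / records_reviewed
    return rate < min_responsive_rate

# Early in review: 400 responsive in the last 1,000 reviewed -> keep going
print(burden_outweighs_benefit(400, 1_000))  # False
# Late in review: 3 responsive in the last 1,000 reviewed -> likely stop
print(burden_outweighs_benefit(3, 1_000))    # True
```

Even in this toy form, the check captures the caveat in the text: a low rate argues for stopping only if the few remaining records are of marginal value, not if they could make or break the case.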

In Search of Reasonableness

It is unreasonable for litigants to review every email, spreadsheet, and document collected for a lawsuit to determine relevance. The issue of when ESI review is completed is a simple one with complex analysis: Has the party conducted a reasonable search for relevant electronically stored information?

Predictive coding helps focus on the ESI that is relevant to a case, so attorneys can spend their energy on what is proportional to the needs of the case rather than mindlessly clicking "irrelevant" and "non-responsive" on email messages. A party can then determine that the burden of document review outweighs the benefit of finding any stray relevant information when the time spent conducting review is yielding only the occasional relevant record in an ocean of irrelevant information.