Ricoh eDiscovery

Conducting an Analytics-Based Document Review? Consider These Three Factors First.

Posted by Jessica Lockett | 4 minute read

Mar 30, 2021 11:05:18 AM


Gone are the days of relying solely on time-consuming, linear, eyes-on review. Leveraging advanced analytics tools not only results in impressive time savings, it can also significantly reduce review spend. A win-win for both you and your clients.

Falling under the Technology-Assisted Review (TAR) umbrella, analytics-based reviews and their associated tools, such as Active Learning by Relativity, have revolutionized the eDiscovery industry. While these tools can significantly streamline the review process, they are by no means an ‘easy button’. Before you run full steam ahead, consider the three factors below, which — if not addressed at the outset — can substantially eat into the cost and time savings that TAR can provide.

1. The Review Platform

First, ask yourself, “Do I have access to the appropriate review platform?” Not all review platforms are created equal. Some are very effective for review workflows that only require certain structural analytics (such as email threading and de-duplication), while others offer much greater conceptual analytics capabilities. Whether your firm or company has acquired an on-premises review platform or you use a platform serviced by an outside vendor, it is important to understand what advanced capabilities are available to you and whether they are appropriate for your particular review goals.

2. The Review Set

Once you have secured access to an appropriate and effective review tool for your particular matter, you need to identify the review set that will be fed to the advanced analytics. This is an important step that should not be overlooked.

Just as the name suggests, conceptual analytics require concepts to learn from. It is important to note that not all documents are compatible with this type of tool. For instance, documents that contain little text (e.g., images or number-heavy documents such as Excel files) will generally not have any concepts from which the system can learn or to which it can apply its learnings. To increase the effectiveness of the tool itself, effort should be spent upfront analyzing your review set to ensure that the right documents are being presented to the system for true conceptual learning.

You will also want to limit the number of duplicative documents in your review set. With Active Learning, for example, the more duplicative or near-duplicative documents in your review set, the greater the likelihood of inconsistent coding. That inconsistency, usually simple human error, can create confusion in the tool. Low-text and duplicative documents can (and should) be addressed in separate review workflows.
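To make this culling step concrete, here is a minimal, illustrative Python sketch of how a review set might be triaged before it is presented to a conceptual analytics tool. This is not Relativity functionality; the word-count threshold, field names and exact-hash de-duplication are assumptions chosen for the example — in practice you would rely on the platform's own structural analytics, which handle near-duplicates far more robustly.

    import hashlib
    import re

    # Hypothetical review set: each document has an ID and its extracted text.
    # (Documents, field names and thresholds are illustrative assumptions.)
    documents = [
        {"id": "DOC-001", "text": "Weekly progress report: the concrete pour at Location A was delayed three days due to weather."},
        {"id": "DOC-002", "text": "Weekly  progress report: the concrete pour at Location A was delayed three days due to weather."},  # near-identical copy
        {"id": "DOC-003", "text": "44 102 7.5 9931 0.03"},  # number-heavy, little conceptual text
    ]

    MIN_WORDS = 10  # assumed cut-off for "enough text to learn from"; tune per matter

    def normalized(text: str) -> str:
        """Lower-case and collapse whitespace so trivial formatting differences don't defeat de-duplication."""
        return re.sub(r"\s+", " ", text.lower()).strip()

    analytics_set, low_text_set, duplicate_set = [], [], []
    seen_hashes = set()

    for doc in documents:
        text = normalized(doc["text"])
        if len(text.split()) < MIN_WORDS:
            low_text_set.append(doc)      # route to a separate, non-analytics workflow
            continue
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        if digest in seen_hashes:
            duplicate_set.append(doc)     # review one copy, propagate coding to the rest
            continue
        seen_hashes.add(digest)
        analytics_set.append(doc)         # eligible for conceptual / Active Learning training

    print(f"{len(analytics_set)} for analytics, {len(low_text_set)} low-text, {len(duplicate_set)} duplicates")

However it is implemented, the principle is the same: decide up front which documents will train the tool and which belong in a separate workflow.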

3. The Review Team

Once your review set has been identified and analyzed, you are ready to assemble your review team. If you are using conceptual analytics, you will need dedicated human resources available to train the tool. Review teams are tasked with distinguishing which documents are ‘relevant’ and which are ‘responsive’. A simple example: a construction litigation is concerned with delays that occurred only at Location A, and you are presented with a detailed report outlining delays that occurred at Location B. While this document is not ‘relevant’ in the legal sense, it is responsive in a conceptual sense and will be a valuable document from which the tool can learn as it begins to sort the most likely from the least likely relevant documents in that particular review set.

Your review team does not need to include the most senior lawyer on the file. In fact, I recommend against that at this stage. The more senior lawyers, partners or counsel can (and should!) provide feedback on specific questions of relevance from your review team throughout an analytics-based review, but they should not be hands-on until after the set of ‘most likely relevant’ documents has been identified. That is when the high-value legal work and analysis truly begins. Who, then, should review documents for use with conceptual analytics? Any trusted member of your legal team can assist at this stage (including an outside review services team). The important takeaway is that the team must have time to dedicate to the review and must work together to ensure consistent coding decisions, which is vital to an effective analytics-based review.


Intelligent Review by Ricoh eDiscovery

Working with us on your document review can free up internal resources and get the job done faster and with greater accuracy, all while reducing overall cost. Have confidence that when you’re facing a judge or opposing counsel, we’re behind you with our ISO-certified processes and statistically validated review. We’re your partner, and we're here to help you get the job done right.

Our unique combination of people, process and technology provides clients with the support they need through each of the three factors discussed above. Check out our latest case study to learn how we achieved the following results for a firm struggling with its internal analytics-based review:

  • A 78 per cent reduction in required eyes-on review
  • 65 per cent fewer documents reviewed to achieve Active Learning (AL) success
  • Review of only four per cent of the overall document set was required for AL completion


Topics: Intelligent Review, Jessica Lockett

   
