2011 Winston & Strawn LLP





Transcription:

Today's eLunch Presenters: John Rosenthal, Litigation, Washington, D.C., JRosenthal@winston.com; Scott Cohen, Director of E-Discovery Support Services, New York, SCohen@winston.com

What Was Advertised: Effective strategies for reducing the cost of an electronic document review. Use of second-generation search technologies to both filter document sets and more efficiently organize the materials for review, including concepting, clustering, threading and near-duping technologies. Implementing best practices in the review process to ensure a high degree of precision and recall. Effective strategies for handling privileged documents. Use of lower-cost reviewers.

Why Should You Listen to Us? Spent the last two years studying the e-discovery marketplace and designing a new platform. Benchmarked the entire industry, including other law firms, vendors and software providers. Built an e-discovery consulting shop behind Winston & Strawn's firewall, including: collection, processing, analytics, review platform, review center.

Electronic Document Review. Excessive and unpredictable costs: 58% to 70% of total litigation costs; document review costs are rising due to the increasing amount of electronic information. Traditional document review is not accurate: evidence suggests there are high error rates in linear manual review, and error rates increase the likelihood of inadvertent production of privileged or sensitive information. Inability to defend the review process: judges are increasingly focusing on the need for validation of review processes.

Goals of ESI Review. Recall: identification and prioritization of relevant material. Precision: elimination of irrelevant/non-responsive material. Identification of privileged material. [Venn diagram of relevant data vs. retrieved data: relevant and not retrieved; non-relevant and retrieved; relevant and retrieved.]

Accuracy of Human Review. Recall = (number of responsive documents retrieved) / (total number of responsive documents in the collection). Precision = (number of responsive documents identified) / (total number of documents retrieved).
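
The two formulas above can be sketched as a quick calculation; the document counts below are made up purely for illustration:

```python
def recall(responsive_retrieved: int, responsive_total: int) -> float:
    """Share of all responsive documents in the collection that the review found."""
    return responsive_retrieved / responsive_total

def precision(responsive_retrieved: int, retrieved_total: int) -> float:
    """Share of the retrieved documents that are actually responsive."""
    return responsive_retrieved / retrieved_total

# Hypothetical review: 10,000 truly responsive documents in the collection;
# the review retrieved 12,000 documents, of which 7,500 were responsive.
print(recall(7_500, 10_000))     # 0.75  -> 75% of responsive docs were found
print(precision(7_500, 12_000))  # 0.625 -> 62.5% of retrieved docs were responsive
```

High recall with low precision means reviewers wade through junk; high precision with low recall means responsive documents are missed, which is the riskier failure in litigation.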

Accuracy of Human Review. [Precision vs. recall chart comparing Blair & Maron (1985), the typical result in a manual responsiveness review, against the TREC best benchmark (best performance on precision at a given recall). Source: M. Grossman presentation, Technology-Assisted Review Can Be More Effective and More Efficient Than Exhaustive Manual Review.]

Traditional Electronic Document Review = Linear Review: over-collection; little or no culling; ad hoc use of Boolean searches; linear review of the data set; use of a traditional associate work force to perform the review. [Traditional approach workflow: manually acquire broad amounts of data → process data → first-level review → second-level review → produced documents.]

Linear Review

Traditional Approach = $$$. Each custodian = 4 to 8 GB. 1 GB of data = 60,000 pages. Average review rate = 350 to 500 pages per hour, so 1 GB = 120 hours of review time. Law firm dynamic: 1 GB = $36,000, or $1.60 per page. Outsourced review: 1 GB = $6,500 to $11,000 (60 to 95 cents per page).
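
The per-gigabyte arithmetic above can be reproduced in a few lines. The $300/hour reviewer rate is our assumption, back-solved so the output matches the slide's $36,000-per-GB law-firm figure; the pages-per-GB and review-rate inputs come from the slide:

```python
PAGES_PER_GB = 60_000     # from the slide: 1 GB of data ≈ 60,000 pages
PAGES_PER_HOUR = 500      # high end of the 350-500 pages/hour review rate
HOURLY_RATE = 300         # assumed law-firm reviewer billing rate, $/hour

hours_per_gb = PAGES_PER_GB / PAGES_PER_HOUR
cost_per_gb = hours_per_gb * HOURLY_RATE

print(hours_per_gb)  # 120.0   -> the slide's 120 hours of review per GB
print(cost_per_gb)   # 36000.0 -> the slide's $36,000-per-GB law-firm figure
```

Multiply by 4 to 8 GB per custodian and the cost of reviewing even a modest custodian list becomes clear.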

Why Has the Legal Market Been So Slow to Adopt Changes? Failure to understand that the traditional model is broken. Does not understand how to use the new technologies. Mistakenly interprets the move toward outsourced review as a result of the economic downturn. Continues to permit individual litigators to select vendors and review platforms and implement their own processes.

The Good News! The cost associated with reviewing large volumes of ESI is forcing a change. Vendor entry into the review space has accelerated change. Clients are demanding greater use of: low-cost domestic review attorneys; off-shore review facilities; adoption of new technologies.

Document Review Models. Outsourced manual review: most prominent model used today; limited culling and analysis; heavy reliance on attorney review; use of sampling to ensure quality control. Predictive coding: great deal of confusion regarding what it means; uses attorneys to develop a seed set of data that can be fed into a black box to find similar documents; emphasizes sampling of the inclusion set and exclusion set; never tested or accepted in any court. Process-oriented review: development of broad and defensible relevance criteria through an integrative ECA process; a process approach to review to increase efficiency, recall and precision, using legally accepted tool sets: threading, near-duping, advanced search, clustering.

A Word On Predictive Coding

[Chart: technology-assisted review vs. manual review results from TREC 2009. Source: M. Grossman presentation, Technology-Assisted Review Can Be More Effective and More Efficient Than Exhaustive Manual Review.]

Developing a Seed Set: Precision. [Diagram: document set for review. Source: Servient Inc., http://www.servient.com/]

Developing a Seed Set: Recall. [Diagram: document set excluded from review. Source: Servient Inc., http://www.servient.com/]

Our Take on Predictive Coding. Not ready for prime time as a final determinant of relevance and privilege: it has never been accepted by a court (do you want to be the first?); predictive coding technologies and processes vary wildly; as with any statistical model, caution should be exercised ("torture numbers, and they'll confess to anything"). Seed set methodology: garbage in = garbage out; who is picking your seed set? Cases change rapidly within the first few months. Is this scalable to all types of cases? Not able to address certain content (images, graphics, Excel files, video, voice, etc.).

Our Take on Predictive Coding (cont'd). Over time, predictive coding and other technology-assisted methodologies will become both prevalent and accepted. Under the appropriate circumstances (e.g., agreement by the parties), it can be a valuable and cost-effective tool. Until then, predictive coding can be used to help organize a review and make it more efficient.


A Process-Oriented Electronic Document Review

Old vs. New Models. [Traditional approach: manually acquire broad amounts of data → process data → first-level review → second-level review → produced documents. Process-oriented approach: analytics → search, acquire, and process narrower amounts of data → filter and cull → review → produced documents.]

Phases of a Process-Oriented Document Review. Analytical: working with the client and data to develop a set of defensible relevance criteria to select data subject to review. Collection: use of search and retrieval at the front end can dramatically reduce the volume and cost; risk consideration. Processing (filtering and culling): employ more sophisticated processing tools to further reduce the volume set; unilateral vs. negotiated. Non-linear review: employ lower-cost reviewers; use technology and process to increase precision and recall across the data set.

Selecting the Review Set. Recent focus by courts: recognition that parties have failed to adequately construct and test search terms to withstand judicial scrutiny (Victor Stanley). Absent agreement, you will have to defend your relevance criteria in terms of reasonableness and responsiveness: selecting test data; quality-controlling the results; revising queries; keeping transparent records from which you can defend your decisions.

Selecting the Review Set (cont'd). [Workflow diagram: ESI sources (archiver, network shares, PCs/laptops, SharePoint, loose media) → process (dedupe, De-NIST, extract) → select test data set → filter test data set based upon queries (date range, custodians, file types, location, key words) → QC via sampling, responsive Y/N? → revise queries as appropriate → filter entire data set based upon relevance queries → responsive data set.]
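
The "dedupe, De-NIST" processing step in the diagram is hash-based: compute a digest of each file, drop anything whose digest appears on the NIST list of known system and application files, and keep one copy per remaining digest. A minimal sketch, with a stand-in hash set (real De-NISTing loads the published NSRL reference set) and invented file contents:

```python
import hashlib

# Stand-in for the NSRL (NIST) hash set of known system/application files.
NIST_HASHES = {hashlib.md5(b"").hexdigest()}  # example entry: the empty file

def dedupe_and_denist(files):
    """files: dict of name -> raw bytes. Keep one copy per unique hash,
    dropping exact duplicates and known-system (NIST-listed) files."""
    seen, kept = set(), []
    for name, data in files.items():
        digest = hashlib.md5(data).hexdigest()
        if digest in NIST_HASHES or digest in seen:
            continue  # NIST-listed or exact duplicate: cull it
        seen.add(digest)
        kept.append(name)
    return kept

print(dedupe_and_denist({
    "memo.txt": b"Q3 forecast",
    "memo_copy.txt": b"Q3 forecast",  # exact duplicate: culled
    "empty.tmp": b"",                 # NIST-listed: culled
}))  # ['memo.txt']
```

Because hashing is mechanical and repeatable, this stage of culling is rarely controversial and is easy to defend.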

Rapid Filtering/Culling

Multi-Faceted Display of Results

Instant File Type Analysis

Visualization of Email Traffic

Advanced Search

Review Phase. 70% of your production is e-mail; 15% to 40% are duplicates or near-duplicates. Reviewers can review data at a higher rate, with a greater degree of precision and recall, when looking at like information.

Review Phase. [Workflow diagram: initial data set → cull and batch documents via metadata and smart filters (date range, custodians, file types, location, key words, e-mail threads, near-dupes, clusters) → metadata review → responsive? → privilege filter (custodian, to/from, key words, separate attorneys) → privileged? → send to privilege review. Non-responsive documents are QC'd via sampling and either stored or removed from the database.]

Organizing the Data for Review. Batching will be the most significant decision in terms of expediting the review. Our approaches: e-mail threading; near-duplicates; for the remainder, clustering algorithms, Boolean queries, key custodian review where appropriate, types of documents, and metadata (e.g., date range, author, recipient, etc.).

E-Mail Threads. 70% of production is e-mail, and of that nearly 65% or more are part of e-mail threads. The problem: no clear method to identify e-mail threads; e-mails are reviewed multiple times and inconsistently; it is extremely difficult to identify where missing e-mails exist. Threading step 1: group into e-mail sets. Step 2: build the tree structure, identify missing links, suppress duplicates, focus on inclusives. Result: less time, fewer errors, less cost. (Source: Equivio)
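
Commercial threading tools such as Equivio's work from message headers and content; purely to illustrate the grouping idea, here is a toy sketch that threads on normalized subject lines (the subjects are invented, and real threading is considerably more robust than this):

```python
import re
from collections import defaultdict

def normalize_subject(subject: str) -> str:
    """Strip reply/forward prefixes so 'RE: RE: Budget' and 'Budget' match."""
    return re.sub(r"^\s*((re|fw|fwd)\s*:\s*)+", "", subject, flags=re.I).lower().strip()

def group_threads(emails):
    """emails: list of subject strings -> dict of thread key -> member subjects."""
    threads = defaultdict(list)
    for subj in emails:
        threads[normalize_subject(subj)].append(subj)
    return dict(threads)

print(group_threads(["Budget", "RE: Budget", "Fwd: RE: Budget", "Site visit"]))
# {'budget': ['Budget', 'RE: Budget', 'Fwd: RE: Budget'], 'site visit': ['Site visit']}
```

Once messages are grouped, a reviewer reads only the "inclusive" message at the end of each branch instead of re-reading every reply in the chain.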

Duplication and Near-Duplication. 15% to 40% of the document population are duplicates or near-duplicates. The problem: no clear method to organize and allocate documents across reviewers; documents are reviewed multiple times by different reviewers; high risk of different coding among similar documents. Near-duping step 1: group the near-duplicates and identify the differences among them. Step 2: assign near-dupe sets to reviewers for coherent review; reviewers prioritize and review only the differences; apply coding to entire near-dupe sets where appropriate. Result: less time, fewer errors, less cost. (Source: Equivio)
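
One common way to detect near-duplicates is shingling with Jaccard similarity: break each document into overlapping word runs and measure how much the sets overlap. This is a generic technique, not Equivio's actual algorithm, and the 0.5 grouping threshold below is an arbitrary illustrative choice:

```python
def shingles(text: str, k: int = 3):
    """Set of k-word shingles; near-duplicate documents share most shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: str, b: str) -> float:
    """Overlap of shingle sets: 1.0 = identical text, ~0 = unrelated."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

base = "the quarterly report shows revenue grew five percent in the third quarter"
edit = "the quarterly report shows revenue grew six percent in the third quarter"
print(jaccard(base, edit) > 0.5)  # True: near-duplicates, route to one reviewer
print(jaccard(base, "lunch menu for friday") > 0.5)  # False: unrelated
```

Grouping documents whose similarity exceeds the threshold and assigning the whole group to one reviewer is what drives the consistency gains the slide describes.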

Second-Generation Search Technology. Concept search places a document, or part of a document, in a conceptual space; results are returned in order of relevance, where a higher score means a closer document (e.g., Document 1: 98; Document 3: 92; Document 4: 91). (Source: kCura Corp.)

Advanced Features of Relativity Analytics. Clustering: automatic grouping (sorting) of a collection of documents into subsets based on their conceptual content. Categorization: categorize and organize documents in the entire database based on a small set of user example documents. Keyword expansion: select keywords within the document viewer and Relativity will produce a list of conceptually related terms. Inline concept search: utilize concept searching within Relativity's innovative document viewer by right-clicking on portions of a displayed document.
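
Features like these rest on vector-space similarity: each document becomes a vector, and "conceptually close" means a high similarity score. Commercial engines use richer models (such as latent semantic indexing); the bag-of-words cosine below is a deliberately simplified stand-in, and the query and documents are invented:

```python
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Bag-of-words cosine similarity: 1.0 ≈ same term profile, 0.0 = disjoint."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

query = "drilling rig safety inspection"
docs = {
    "doc1": "safety inspection schedule for the offshore drilling rig",
    "doc3": "rig maintenance and safety report",
    "doc4": "quarterly marketing budget",
}
# Rank documents by similarity score, highest (conceptually closest) first.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked)  # ['doc1', 'doc3', 'doc4']
```

The ordered result mirrors the "higher score = closer document" ranking on the previous slide.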

Keyword Expansion

Best Practices: Planning. Plan the work and work the plan: outside counsel, ediscovery team, review team leader. Planning meetings: developing relevance criteria; developing a review plan. Deadlines must be established, staffing needs must be defined, and review speeds must be estimated to provide budget predictability to the client and avoid cost and time overruns.

Best Practices: Planning (cont'd). Before commencing review, decide on the format of production and the privilege log (Rule 26(f) conference): native vs. TIFF productions; privilege log: metadata vs. manual; discuss potential exclusions, such as for post-litigation outside counsel documents; logging of e-mail chains; timing of privilege log production (with or after document productions).

Best Practices: Roles. ediscovery team: administration of the data set (e.g., loading of data, managing the index, running clusters); assisting the review team leader in developing relevance criteria; creation of the load file. Review team leader: coordinate with attorneys regarding development of relevance criteria; identify the test data set; validate the test data set; manage reviewers.

Best Practices: Training the Review Team. Substantial attention should be given to training. Briefing materials should be prepared: list of counsel involved in the legal matter; list of key witnesses involved in the legal matter; summary of the claims and issues; summary of the requests for production of documents and the responses; instructions regarding how to use the review tool; instructions regarding coding of the individual documents; procedure for resolving questions or ambiguities; training on how to recognize and code privileged documents.

Best Practices: Conducting the Review. Uniformity in review team analysis should be the goal. Variations in review team analysis should be addressed each day, with additional training provided as required. Daily feedback heightens reviewers' quality and attention to detail, aids knowledge transfer to reviewers, and reduces the possibility of an accumulation of poorly reviewed documents in the population. The review team should be broken down into sub-groups (e.g., a separate privilege review group). Computer equipment should foster speed and efficiency: dual screens or wide screens, fast processing time; the review tool should be configured to reduce the need for scrolling and key strokes. Have procedures to address the potential departure of review team members.

Best Practices: Quality Control. The review team leader is responsible for overall quality control of the review and should meet with the review team extensively during the review: review questions; identify categories of documents that can be removed as non-responsive; prepare a list of additional questions for the litigation team. QC of reviewers' work: examine individual reviewer productivity and error rates; designate QC reviewers to re-review an established percentage of documents to verify the accuracy of coding. Considerations: type of sampling (random, systematic, stratified); sample sizes; review rate; accuracy rate; hours worked.
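
The re-review step can be sketched as drawing a random sample from a reviewer's batch and measuring the disagreement rate. The sample size, fixed seed, and the `recheck` callback are our own illustrative choices; a real program would set sample sizes statistically (e.g., for a target confidence level):

```python
import random

def qc_sample(coded_docs, sample_size, seed=0):
    """Draw a random sample of coded documents for second-pass QC review."""
    rng = random.Random(seed)  # fixed seed so the QC draw is reproducible
    return rng.sample(coded_docs, min(sample_size, len(coded_docs)))

def error_rate(sample, recheck):
    """recheck(doc) -> True if the QC reviewer disagrees with the original coding."""
    disagreements = sum(1 for doc in sample if recheck(doc))
    return disagreements / len(sample)

# Hypothetical batch of 1,000 coded documents; QC pulls 50 for re-review.
sample = qc_sample(list(range(1000)), 50)
print(len(sample))  # 50
print(error_rate(sample, lambda d: False))  # 0.0 when QC finds no disagreements
```

Tracking this rate per reviewer per day is what makes the daily-feedback loop on the previous slide concrete.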

Other Best Practices. Annotating and coding: a determination is required as to the extent of annotation and coding that would be beneficial; coding may include responsiveness, privileged, confidential, sensitive and relevant-to-issue codes. Review in native format. Review metadata if necessary. Review hidden data if necessary.

Best Practices: Privilege Review. Create a separate privilege review/privilege log preparation team. Segregate attorney ESI from the production. Screen out potentially privileged documents: run key word searches for privilege indicators (e.g., attorney, solicitor, privilege, work product); URL analysis; hashing comparisons; clustering. Educate the review team on privilege coding and issues. Clarify handling of e-mail chains and e-mail attachments. Include a protocol for compliance with FRCP 26(b)(5)(B) and FRE 502(b) and (d). Perform additional pre-production validation of privileged documents (on a sampling basis).
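
The key-word screen can be sketched as a regex pass that routes hits to the privilege team. The pattern uses the example terms from the slide; a real screen would also add matter-specific counsel names and law-firm e-mail domains, and the documents below are invented:

```python
import re

# Example privilege-indicator terms from the slide; extend per matter.
PRIV_PATTERN = re.compile(r"\b(attorney|solicitor|privileg\w*|work product)\b", re.I)

def privilege_screen(docs):
    """docs: dict of doc id -> text. Split into privilege-review hits and the rest."""
    hits = {doc_id for doc_id, text in docs.items() if PRIV_PATTERN.search(text)}
    return hits, set(docs) - hits

flagged, clear = privilege_screen({
    "d1": "Per our attorney's advice on the merger...",
    "d2": "This memo is attorney work product and privileged.",
    "d3": "Cafeteria hours change next week.",
})
print(sorted(flagged))  # ['d1', 'd2']
```

Note that a screen like this is deliberately over-inclusive: false positives cost reviewer time, but false negatives risk inadvertent production, which is why the slide also calls for sampling the non-hits.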

Review for Privilege: Review Strategies. [Workflow diagram: privilege filter (URL, search terms, clustering) → review all docs / review positive hits, and sample the non-privilege hits → privileged (Y/N)? → if yes, log (redaction log as needed); if no, QC via sampling and return to the production set.]

Best Practices: Privilege Log. Automate the log as much as possible: increases speed and efficiency (saves time and money); decreases typos and inconsistencies; the goal is to avoid manual data entry wherever possible. Leverage metadata as much as possible, but be careful of relying on metadata alone for e-mail chains and PDFs unless the parties have agreed. Use drop-down boxes where possible, having reviewers choose the one or more descriptions that best fit the document.

Washington, D.C. Litigation Review Center

Staffing a Review. Lower-cost staff that specializes in review: properly recruited and screened; properly supervised; properly trained. Considerations: insource vs. outsource; domestic vs. international; JD or barred; use of non-lawyers; use of technical experts.

Ethical Issues: ABA Formal Opinion 08-451 (August 2008). U.S. lawyers are ethically permitted to outsource legal work to lawyers or non-lawyers (both domestically and internationally) if they adhere to ethics rules requiring: competence; supervision; protection of confidential information; reasonable fees; not assisting the unauthorized practice of law.

Ethical Issues: ABA Formal Opinion 08-451 (August 2008). Minimum obligations: conduct reference checks and background investigations of lawyer or non-lawyer service providers and any intermediaries; interview the principal lawyers on a project, assess their educational background, and evaluate the quality and character of any employees likely to access client information; review security systems; visit the premises of the service provider.

Ethical Issues. If the provider is in a foreign country: determine whether the legal education system in that country is similar to that of the U.S., and whether professional regulatory systems incorporate equivalent core ethics principles and effective disciplinary enforcement systems; consider additional training, especially regarding privilege; determine whether the foreign legal system protects client confidentiality and provides effective remedies to the lawyer's client in case disputes arise. Some circumstances may require more rigorous supervision than others, and it may be necessary to obtain a client's consent before engaging outside assistance.

Proper Supervision. Bray & Gillespie Mgmt. LLC v. Lexington Ins. Co., No. 6:07-cv-222-Orl-35KRS, 2009 WL 546429, at *21 (M.D. Fla. Mar. 4, 2009) (form of production; Rule 37 sanctions; outside counsel changing law firms; individual attorney sanctioned): "While B & G, as the client, has the obligation to supervise its lawyers, the evidence establishes that B & G's outside counsel made the decision how to produce ESI. Additionally, B & G has already spent considerable time and effort to reproduce some ESI in native format, although problems remain with the form of that production. Under these circumstances, I find that it is not appropriate to require B & G to pay the attorney's fees, costs, and expenses.... Should B & G fail to monitor its counsel's actions going forward, however, it will subject itself to all available sanctions should additional problems occur.... [B]lindly relying on outside counsel falls short of the duty he has as an officer of the court, as counsel of record, and as an advocate for his client."

Budgeting. It is now possible to model the entire review process to establish a reliable budget. Considerations: e-discovery costs; size of review; potential cull/filter rate; percent of e-mail/threads; percent of dupes and near-dupes; complexity of the documents.
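
A budget model of this kind can be sketched as a function of those considerations. Every default below (documents per GB, cull rate, dupe rate, review speed, reviewer cost) is an assumption to be replaced with matter-specific benchmarks, not a firm figure from the presentation:

```python
def review_budget(gb_collected, cull_rate, dupe_rate, docs_per_gb=5_000,
                  docs_per_hour=50, reviewer_rate=60):
    """Rough review-cost model: volume -> cull -> suppress dupes -> hours -> $.
    All defaults are illustrative assumptions, not benchmarks."""
    docs = gb_collected * docs_per_gb
    docs_after_cull = docs * (1 - cull_rate)            # filtering/culling
    docs_to_review = docs_after_cull * (1 - dupe_rate)  # dupes/near-dupes suppressed
    hours = docs_to_review / docs_per_hour
    return docs_to_review, hours, hours * reviewer_rate

# Hypothetical matter: 100 GB collected, 60% culled, 25% dupes suppressed.
docs, hours, cost = review_budget(100, cull_rate=0.60, dupe_rate=0.25)
print(f"{docs:,.0f} docs, {hours:,.0f} hours, ${cost:,.0f}")
# 150,000 docs, 3,000 hours, $180,000
```

Varying the cull and dupe rates in a model like this shows why the filtering and near-duping stages earlier in the deck dominate the final budget.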


Questions?

Contact Information. John Rosenthal, Chair, ediscovery & Information Management Practice Group, Washington, D.C., +1 (202) 282-5785, JRosenthal@winston.com. Scott Cohen, Director of ediscovery Support Services, New York, +1 (212) 294-3558, SCohen@winston.com.