
Posts Tagged ‘Ralph Losey’

7th Circuit eDiscovery Pilot Program Tackles Technology Assisted Review With Mock Arguments

Tuesday, May 22nd, 2012

The 7th Circuit eDiscovery Pilot Program’s Mock Argument is the first of its kind and is slated for June 14, 2012.  It is not surprising that the Seventh Circuit’s eDiscovery Pilot Program would be the first to host an event like this on predictive coding, as the program has been a progressive model across the country for eDiscovery protocols since 2009.  The predictive coding event is open to the public (registration required) and showcases the expertise of leading litigators, technologists and experts from all over the United States.  Speakers include: Jason R. Baron, Director of Litigation at the National Archives and Records Administration; Maura R. Grossman, Counsel at Wachtell, Lipton, Rosen & Katz; Dr. David Lewis, Technology Expert and co-founder of the TREC Legal Track; Ralph Losey, Partner at Jackson Lewis; Matt Nelson, eDiscovery Counsel at Symantec; Lisa Rosen, President of Rosen Technology Resources; Jeff Sharer, Partner at Sidley Austin; and Tomas Thompson, Senior Associate at DLA Piper.

The eDiscovery 2.0 blog has extensively covered the three recent predictive coding cases currently being litigated, and while real court cases are paramount to the direction of predictive coding, the 7th Circuit program will proactively address a scenario that has not yet been considered by a court.  In Da Silva Moore, the parties agreed to the use of predictive coding, but couldn’t subsequently agree on the protocol.  In Kleen, plaintiffs want defendants to redo their review process using predictive coding even though the production is 99% complete.  And, in Global Aerospace, the defendant proactively petitioned to use predictive coding over plaintiff’s objections.  By contrast, the 7th Circuit’s hypothetical anticipates another likely predictive coding scenario: a defendant with an in-house review solution already in place argues against the use of predictive coding before discovery has begun.

Traditionally, courts have been reluctant to bless or condemn particular technologies, preferring instead to rule on the reasonableness of an organization’s process and to rely on expert testimony for issues beyond that scope.  Predictive coding is expected to follow suit; however, because so little is understood about how the technology works, it has generated interest in a way the legal technology industry has not seen before, as programs like this one demonstrate.

* * *

The hypothetical dispute is a complex litigation matter pending in a U.S. District Court involving a large public corporation that has been sued by a smaller high-tech competitor for alleged anticompetitive conduct, unfair competition and various business torts.  The plaintiff has filed discovery requests that include documents and communications maintained by the defendant corporation’s vast international sales force.  To expedite discovery and level the playing field in terms of resources and costs, the plaintiff has requested the use of predictive coding to identify and produce responsive documents.  The defendant, wary of the latest (and untested) eDiscovery technology trends, argues that the organization already has a comprehensive eDiscovery program in place.  The defendant will further argue that its in-house technological investment and processes are more than sufficient for comprehensive discovery and, in fact, were designed to implement a repeatable and defensible discovery program.  The defendant’s methodology is estimated to take months and to result in the typical massive production set, whereas predictive coding would allegedly make for a shorter discovery period.  Because of the burden, the defendant plans to shift some of these costs to the plaintiff.

Ralph Losey will play the Magistrate Judge.  Defense counsel will be Martin T. Tully (partner, Katten Muchin Rosenman LLP), with Karl Schieneman (of Review Less/ESI Bytes) as the corporation’s litigation support manager; plaintiff’s counsel will be Sean Byrne (eDiscovery solutions director at Axiom), with Herb Roitblat (of OrcaTec) as plaintiff’s eDiscovery consultant.

Predictive coding is the hottest topic in the eDiscovery world, and its promises include increased search accuracy for relevant documents, decreased cost and time spent on manual review, and possibly greater insight into an organization’s corpus of data, allowing for more strategic decision making in early case assessment.  The practical implications of predictive coding use are still to be determined, and programs like this one will flesh out some of those issues before they reach the courts, which is good for practitioners and judges alike.  Stay tuned for an analysis of the arguments, as well as a link to the video.

Plaintiffs Ask Judge Nan R. Nolan to Go Out On a Limb in Kleen Products Predictive Coding Case

Friday, April 13th, 2012

While the gaze of the eDiscovery community has been firmly transfixed on the unfolding drama in the Da Silva Moore, et al. v. Publicis Groupe, et al. predictive coding case, an equally important case in the Northern District of Illinois has been quietly flying under the radar. I recently traveled to Chicago to attend the second day of a two-day hearing in the 7th Circuit Kleen Products, LLC, et al. v. Packaging Corporation of America, et al. case, where plaintiff and defense experts duked it out over whether or not defendants should be required to “redo” their document production. On its face, plaintiffs’ request may not seem particularly unusual. However, a deeper dive into the facts reveals that plaintiffs are essentially asking Magistrate Judge Nan R. Nolan to issue an order that could change the way parties are expected to handle eDiscovery in the future.

Can One Party Dictate Which Technology Tool Their Opponent Must Use?

The reason plaintiffs’ position is shocking to many observers is as much about the stage of the case as it is about their argument. Plaintiffs basically ask Judge Nolan to order defendants to redo their production even though defendants have spent thousands of hours reviewing documents, have already produced over a million documents, and at least one defendant claims their review is over 99 percent complete. Given that plaintiffs don’t appear to point to any glaring deficiencies in defendants’ production, an order by Judge Nolan requiring defendants to redo their production using a different technology tool would likely sound alarm bells for 7th Circuit litigants. Judges normally care more about results than methodology when it comes to eDiscovery and they typically do not allow one party to dictate which technology tools their opponents must use.

Plaintiffs’ main contention appears to be that defendants should redo the production because keyword search and other tools were used instead of predictive coding technology. There is no question that keyword search tools have obvious limitations. In fact, Ralph Losey and I addressed this precise issue in a recent webinar titled “Is Keyword Search in eDiscovery Dead?”  The problem with keyword searches, says Losey, is that they are much like the card game “go fish.”  Parties applying keyword searches typically make blind guesses about which keywords might reveal relevant documents. Since guessing every relevant keyword contained in a large collection of documents is virtually impossible, keyword search tools normally overlook some relevant documents (those that do not contain a guessed keyword) and retrieve some irrelevant documents (those that contain a keyword but are not relevant). Although imperfect, keyword search tools still add value when used properly because they can help identify important documents quickly and expedite document review.
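
To make that limitation concrete, here is a minimal sketch of the “go fish” problem using made-up documents and keywords; the recall and precision figures it prints are purely illustrative, not drawn from any real collection.

    # Hypothetical example of keyword search limits: guessed keywords miss relevant
    # documents that use different wording, and sweep in documents that merely
    # mention a keyword without being relevant.
    documents = {
        "doc1": ("We should raise what we charge next quarter.", True),          # relevant, no keyword hit
        "doc2": ("Pricing strategy memo: coordinate with the sales team.", True),  # relevant, keyword hit
        "doc3": ("The price of lunch at the offsite was reasonable.", False),      # keyword hit, irrelevant
        "doc4": ("Quarterly earnings call transcript.", False),                    # no hit, irrelevant
    }
    keywords = ["pricing", "price"]

    hits = {d for d, (text, _) in documents.items()
            if any(k in text.lower() for k in keywords)}
    relevant = {d for d, (_, is_rel) in documents.items() if is_rel}

    recall = len(hits & relevant) / len(relevant)      # share of relevant docs actually retrieved
    precision = len(hits & relevant) / len(hits)       # share of retrieved docs actually relevant
    print(f"recall={recall:.0%}, precision={precision:.0%}")  # recall=50%, precision=50%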

Regardless, plaintiffs take the position that defendants should have used predictive coding to avoid the limitations of keyword search tools. The arguments are not well framed, but ostensibly plaintiffs rely on the common belief that predictive coding tools can minimize the inherent limitations of keyword search tools. The rationale is based in part on the notion that predictive coding tools are better because they don’t require users to know all the relevant keywords in order to identify all the relevant documents. Instead, predictive coding tools rely on human input to construct complex search algorithms. Provided the human input is accurate, computers can use these algorithms to automate the identification of potentially relevant documents during discovery faster and more accurately than humans using traditional linear document review methodologies. Plaintiffs contend defendants should redo their document production using predictive coding technology instead of relying on keywords and traditional linear review because it would provide added assurances that defendants’ productions were thorough.
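
For readers wondering what “human input constructing search algorithms” looks like in practice, below is a minimal, vendor-neutral sketch of the general idea: attorneys code a small seed set, a text classifier learns from those judgments, and the model then ranks the unreviewed collection. The library choice, seed documents and labels are illustrative assumptions; this is not the protocol of any tool or party in Kleen or Da Silva Moore.

    # General predictive coding idea (illustrative only, not any vendor's actual tool).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    seed_docs = [
        "Please hold all pricing discussions until after the merger closes.",   # responsive
        "Competitor bid details attached; do not forward outside the team.",    # responsive
        "Reminder: the parking garage will be closed for repairs on Friday.",   # not responsive
        "Cafeteria menu for the week of March 5th.",                             # not responsive
    ]
    seed_labels = [1, 1, 0, 0]  # human reviewers' responsiveness calls

    unreviewed = [
        "Updated pricing model for the competitor response plan.",
        "Office holiday party photos are now posted.",
    ]

    vectorizer = TfidfVectorizer()
    X_seed = vectorizer.fit_transform(seed_docs)
    model = LogisticRegression().fit(X_seed, seed_labels)

    # Rank unreviewed documents by predicted likelihood of responsiveness.
    scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
    for doc, score in sorted(zip(unreviewed, scores), key=lambda p: -p[1]):
        print(f"{score:.2f}  {doc}")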

Aside from the fact that defendants have essentially completed their document production, the problem with plaintiffs’ initial argument is that too much emphasis is placed on the tool and almost no value is attributed to how the tool is used.  Today there is a wide range of technology tools in the litigator’s tool belt, including keyword search, transparent concept search, topic grouping, discussion threading, and predictive coding, to name a few. Knowing which of these tools to use for a particular case, and in what combination, is important. However, even more important is the realization that none of these tools will yield the desired results unless they are used properly. Simply swapping a predictive coding tool for a keyword tool will not solve the problem if the tool is not used properly.

The Artist or The Brush?

Plaintiffs’ blanket assertion that defendants’ document production would be more thorough if a predictive coding tool were used in place of keyword searching is naïve. First, using keyword searches and other tools to filter data before applying a predictive coding tool is a logical first step for weeding out clearly irrelevant documents. Second, ignoring the importance of the process by focusing only on the tool is like assuming the brush, rather than the artist, is responsible for the Mona Lisa. The success of a project depends on the artist as much as the tool. Placing a brush in the hands of a novice painter isn’t likely to result in a masterpiece, and neither is placing a predictive coding tool in the hands of an untrained end user. To the contrary, placing sophisticated tools in unskilled hands is likely to end poorly.

Hearing Testimony and Da Silva Moore Lessons

Perhaps recognizing their early arguments placed too much emphasis on predictive coding technology, plaintiffs spent most of their time attacking defendants’ process during the hearing. Plaintiffs relied heavily on testimony from their expert, Dr. David Lewis, in an attempt to poke holes in defendants’ search, review, and sampling protocol. For example, Dr. Lewis criticized the breadth of defendants’ collection, their selection of custodians for sampling purposes, and their methodology for validating document review accuracy on direct examination. During a spirited cross examination of Dr. Lewis by Stephen Neuwirth, counsel for defendant Georgia Pacific, many of Dr. Lewis’ criticisms seemed somewhat trivial when measured against today’s eDiscovery status quo – basically the “go fish” method of eDiscovery. If anything, defendants appear to have followed a rigorous search and sampling protocol that goes far beyond what is customary in most document productions today. Since courts require “reasonableness” when it comes to eDiscovery rather than “perfection,” plaintiffs are likely facing an uphill battle in terms of challenging the tools defendants used or their process for using those tools.

The important relationship between technology and process is the lesson in Da Silva Moore and Kleen Products that is buried in thousands of pages of transcripts and pleadings. Although both cases deal squarely with predictive coding technology, the central issue stirring debate is confusion and disagreement about the process for using technology tools.  The issue is most glaring in Da Silva Moore where the parties actually agreed to the use of predictive coding technology, but continue to fight like cats and dogs about establishing a mutually agreeable protocol.

The fact that the parties have spent several weeks arguing about proper predictive coding protocols highlights the complexity surrounding the use of predictive coding tools in eDiscovery and the need for a new generation of predictive coding tools that simplify the current process. Until predictive coding tools become easier to use and more transparent, litigants are likely to shy away from new tools in favor of more traditional eDiscovery tools that are more intuitive and less risky. The good news is that predictive coding technology has the potential to save millions of dollars in document review if done correctly. This fact is fostering a competitive environment that will soon drive development of better predictive coding tools that are easier to use.

Conclusion

Given the amount of time and money defendants have already spent reviewing documents, it is unlikely that Judge Nolan would go out on a limb and order defendants to redo their production unless plaintiffs point to some glaring defect. I did not attend the entire hearing and have not read every court submission. However, based on my limited observations, plaintiffs have not provided much if any evidence that defendants failed to produce a particular document or documents.  Similarly, plaintiffs’ attacks on defendants’ keyword search and sampling protocol are not convincing to the average observer. Even if plaintiffs could poke holes in defendants’ process, a complete redo is unlikely because courts typically require reasonable efforts during document production, not perfection. A third day of hearings has been scheduled in this case, so it may be several more weeks before we find out if Judge Nolan agrees.

LTNY Wrap-Up – What Did We Learn About eDiscovery?

Friday, February 10th, 2012

Now that the dust has settled, the folks who attended LegalTech New York 2012 can try to get to the mountain of emails that accumulated during the event that was LegalTech. Fortunately, there was no ice storm this year, and for the most part, people seemed to heed my “what not to do at LTNY” list. I even found the Starbucks across the street more crowded than the one in the hotel. There was some alcohol-induced hooliganism at a vendor’s party, but most of the other social mixers seemed uniformly tame.

Part of Dan Patrick’s syndicated radio show features a “What Did We Learn Today?” segment, and that inquiry seems fitting for this year’s LegalTech.

  • First of all, the prognostications about buzzwords were spot on, with no shortage of cycles spent on predictive coding (aka Technology Assisted Review). The general session on Monday, hosted by Symantec, had close to a thousand attendees on the edge of their seats to hear Judge Peck, Maura Grossman and Ralph Losey wax eloquent about the ongoing man-versus-machine debate. Judge Peck uttered a number of quotable sound bites, including the quote of the day: “Keyword searching is absolutely terrible, in terms of statistical responsiveness.” Stay tuned for a longer post with more comments from the general session.
  • Ralph Losey went one step further when commenting on keyword search, stating: “It doesn’t work… I hope it’s been discredited.” A few have commented that this lambasting may have gone too far, and I’d tend to agree.  It’s not that keyword search is horrific per se. It’s just that its efficacy is limited, and the hubris of the average user, who thinks eDiscovery search is like Google search, is where the real trouble lies. It’s important to keep in mind that all these eDiscovery applications are just tools in the practitioners’ toolbox, and they need to be deployed for the right task. Otherwise, the old saw (pun intended) that “when all you have is a hammer, everything looks like a nail” will inevitably come true.
  • This year’s show also finally put a nail in the coffin of the human review process as the eDiscovery gold standard. That doesn’t mean that attorneys everywhere will abandon the linear review process any time soon, but hopefully it’s becoming increasingly clear that the “evil we know” isn’t very accurate (on top of being very expensive). If that deadly combination doesn’t get folks experimenting with technology assisted review, I don’t know what will.
  • Information governance was also a hot topic, only paling in comparison to Predictive Coding. A survey Symantec conducted at the show indicated that this topic is gaining momentum, but still has a ways to go in terms of action. While 73% of respondents believe an integrated information governance strategy is critical to reducing information risk, only 19% have implemented a system to help them with the problem. This gap presumably indicates a ton of upside for vendors who have a good, attainable information governance solution set.
  • The Hilton still leaves much to be desired as a host location. As they say, familiarity breeds contempt, and for those who’ve notched more than a handful of LegalTech shows, the venue can feel a bit like the movie Groundhog Day, but without Bill Murray. Speculation continues to run rampant about a move to the Javits Center, but the show would likely need to expand pretty significantly before ALM would make the move. And, if there ever was a change, people would assuredly think back with nostalgia on the good old days at the Hilton.
  • Despite the bright lights and elevator advertisement trauma, the mood seemed pretty ebullient, with tons of partnerships, product announcements and consolidation. This positive vibe was a nice change after the last two years when there was still a dark cloud looming over the industry and economy in general.
  • Finally, this year’s show also seemed to embrace social media in a way that it hadn’t in years past. Yes, all the social media vehicles were around before, but this year many of the vendors’ campaigns seemed much more integrated. It was funny to see even the most technically resistant lawyers log in to Twitter (for the first time) to post comments about the show as a way to win premium vendor swag. Next year, I’m sure we’ll see an even more pervasive social media influence, which is a bit ironic given the eDiscovery challenges associated with collecting and reviewing social media content.

Email Isn’t eDiscovery Top Dog Any Longer, Recent Survey Finds

Sunday, September 18th, 2011

Symantec today issued the findings of its second annual Information Retention and eDiscovery Survey, which examined how enterprises are coping with the tsunami of electronically stored information (ESI) that expands by the minute.  Perhaps counterintuitively, the survey of legal and IT personnel at 2,000 enterprises found that email is no longer the primary source of ESI companies produced in response to eDiscovery requests.  In fact, email came in third place (58%) behind files/documents (67%) and database/application data (61%).  Marking a departure from the landscape as recently as a few years ago, the survey reveals that email no longer axiomatically equals eDiscovery.

Some may react incredulously to these results. For instance, noted eDiscovery expert Ralph Losey continues to stress the paramount importance of email: “In the world of employment litigation it is all about email and attachments and other informal communications. That is not to say databases aren’t also sometimes important. They can be, especially in class actions. But, the focus of eDiscovery remains squarely on email.”   While it’s hard to argue with Ralph, the real takeaway should be less about the relative descent of email’s importance, and more about the ascendency of other data types (including social media), which now have an unquestioned seat at the table.

The primary ramification is that organizations need to prepare for eDiscovery and governmental inquiries by casting a wider ESI net, including social media, cloud data, instant messaging and structured data systems.  Forward-thinking companies should map out where all ESI resides company-wide so that these important sources do not go unrecognized.  Once these sources of potentially responsive ESI are accounted for, the right eDiscovery tools need to be deployed so that these disparate types of ESI can be defensibly collected and processed for review in a singular, efficient and auditable environment.
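
One lightweight way to start such a company-wide data map is a simple inventory keyed by source. The sketch below is purely hypothetical (the source names, locations and retention periods are invented) and is meant only to show the kind of record worth keeping before a request arrives.

    # Hypothetical ESI "data map": one record per source so non-email systems
    # (social media, cloud, IM, databases) are accounted for ahead of time.
    esi_data_map = [
        {"source": "Exchange email",    "custodians": "all employees", "location": "on-premise",  "retention": "3 years"},
        {"source": "Salesforce CRM",    "custodians": "sales",         "location": "cloud",       "retention": "7 years"},
        {"source": "Corporate Twitter", "custodians": "marketing",     "location": "cloud",       "retention": "indefinite"},
        {"source": "Instant messaging", "custodians": "all employees", "location": "on-premise",  "retention": "90 days"},
        {"source": "ERP database",      "custodians": "finance",       "location": "data center", "retention": "10 years"},
    ]

    def sources_matching(term):
        """Return sources whose name mentions the term -- a stand-in for scoping a hold or collection."""
        return [s["source"] for s in esi_data_map if term.lower() in s["source"].lower()]

    print(sources_matching("database"))  # ['ERP database']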

The survey also found that companies that employ best practices, such as implementing information retention plans, automating the enforcement of legal holds and leveraging archiving tools instead of relying on backups, fare dramatically better when it comes to responding to eDiscovery requests. Companies in the survey with good information governance hygiene were:

  • 81% more likely to have a formal retention plan in place
  • 63% more likely to automate legal holds
  • 50% more likely to use a formal archiving tool

These top-tier companies in the survey were able to respond much faster and more successfully to an eDiscovery request, often suffering fewer negative consequences:

  • 78% less likely to be sanctioned
  • 47% less likely to end up in a compromised legal position
  • 45% less likely to disclose too much information

This last bullet (disclosing too much information) has a number of negative ramifications beyond just giving the opposition more ammo than is strictly necessary.  Since much of the eDiscovery process is volume-based, particularly the eyes-on review component, every extra gigabyte of produced information costs the organization in both seen and unseen ways.  Some have estimated that manual attorney review costs between $3 and $5 per document – and at 50,000 pages to a gigabyte, these data-related expenses add up quickly.
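
A quick back-of-the-envelope calculation shows how fast that compounds. The per-document cost and pages-per-gigabyte figures come from the estimates above; the pages-per-document figure is an assumption added purely for illustration.

    # Rough review cost per gigabyte using the figures cited above.
    cost_per_doc = (3, 5)      # dollars, low and high estimates
    pages_per_gb = 50_000
    pages_per_doc = 5          # assumed average, for illustration only

    docs_per_gb = pages_per_gb / pages_per_doc       # 10,000 documents per gigabyte
    low = docs_per_gb * cost_per_doc[0]              # $30,000 per GB
    high = docs_per_gb * cost_per_doc[1]             # $50,000 per GB
    print(f"~{docs_per_gb:,.0f} docs/GB -> ${low:,.0f} to ${high:,.0f} per gigabyte reviewed")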

On the other side of the coin, there were those companies with bad information governance hygiene.  While this isn’t terribly surprising, it is shocking to see how many entities fail to connect the dots between information governance and risk reduction.  Despite the numerous risks, the survey found nearly half of the respondents did not have an information retention plan in place, and of this group, only 30% were discussing how to do so.  Most shockingly, 14% appear to be ostriches with their heads in the sand and have no plans to implement any retention plan whatsoever.  When asked why folks weren’t taking action, respondents indicated lack of need (41%), too costly (38%), nobody has been chartered with that responsibility (27%), don’t have time (26%) and lack of expertise (21%) as top reasons.  While I get the cost issue, particularly in these tough economic times, it’s bewildering to think that so many companies feel immune from the requirements of having even a basic retention plan.

As the saying goes, “You don’t need to be a weatherman to tell which way the wind blows.”  And, the winds of change are upon us.  Treating eDiscovery as a repeatable business process isn’t a Herculean task, but it is one that cannot be accomplished without good information governance hygiene and the profound recognition that email isn’t the only game in town.

For more information regarding good records management hygiene, check out this informative video blog and Contoural article.

Electronic Discovery Experts On Stage at LegalTech New York 2010

Thursday, January 28th, 2010

Next week, as most of you know, is the Super Bowl of legal technology events.  And if this is a newsflash, you’ve probably found this blog by searching for the European Cockpit Association (“ECA”).  If, on the other hand, you have an unnatural affinity for the other ECA – early case assessment – then you’ve probably been planning to head to this year’s LegalTech show ever since the last one ended.

At the risk of gratuitous self-promotion, I will be moderating several panels with e-discovery pundits on the first day. Akin to the upcoming Super Bowl, these “Supersessions” will be chockablock with EDD luminaries, and it’ll be all I can do to get a word in edgewise.  Below is the schedule. Feel free to pre-register, since we expect a packed house.

1:00 – 2:00 pm: The E-Discovery Expert Panel.  This session will discuss best practices in e-discovery. Panelists include:

  • Jay Brudz, senior counsel, legal technology at GE;
  • Ron Best, director of legal information systems at Munger, Tolles and Olson, LLP, and
  • Brian Hill, senior analyst at Forrester Research, Inc.

2:15 – 3:15 pm: Strategies for Transparency and Cooperation in E-Discovery. This session will discuss how to move toward a more cooperative resolution of legal disputes.  Speakers include:

  • Sean Gallagher, partner at Hogan & Hartson, LLP and
  • Lauren Schwartzreich, associate at Outten and Golden, LLP

3:30 – 4:30 pm: Ask the E-Discovery Doctors. The “doctors” will take questions from the audience and provide their prescriptions for a wide range of e-discovery topics.

  • Craig Ball, attorney and president, Craig D. Ball, P.C.
  • Ralph Losey, attorney and co-chair of E-Discovery Practice Group, Akerman Senterfitt,
  • George Socha, attorney and president, Socha Consulting, LLC

While it’s probably not fair to pick a favorite session, my sense is that the last one will be the most anarchical, chaotic, and stimulating, assuming that the speakers don’t take the faux Doctor thing too far (yes, they will be in scrubs).

Please come by to get your recommended daily dose of e-discovery insights.


Five Electronic Discovery Questions with Ralph Losey

Tuesday, July 28th, 2009

In continuing my Five e-Discovery Questions series, I had the pleasure of sitting down with and interviewing (ok, e-mailing five questions to) Ralph Losey, electronic discovery expert extraordinaire.

Ralph is the writer, lawyer, and educator behind the e-Discovery Team blog. He has been practicing law since 1980 and playing with computers and cyber-communications since 1978. He holds the highest AV peer rating from Martindale-Hubbell and is recognized as a Super Lawyer in the field of IT.

The questions I posed to Ralph were:

1. We have always loved the name of your blog, “e-Discovery Team.” It succinctly sums up your overall approach and philosophy of e-discovery. What’s the current state of the “e-discovery team” in most organizations? How has it progressed over the last few years? Where does it need to go next?

2. Should there be an adverse inference distinction between cases where e-discovery may have been conducted in a sloppy, incomplete fashion, but without malice, versus one in which the party actively sought to hide or suppress documents in the case?

3. Are judges equipped with enough information to be able to make this distinction (between intentional and accidental destruction)?

4. What is the biggest gap today between e-discovery vendor offerings and what legal end-users need?

5. How much time does it really take you to crank out one of your blog posts? Does the hot Florida sun keep you indoors typing away at your computer? Or do you have some sort of waterproof laptop that allows you to write while floating in your screened in pool?

To read Ralph Losey’s answers and more, read the full version (complete with all cinematic references in video) at his e-Discovery Team blog article, “Five Easy Pieces – An Interview Without Toast.”

“Aggressive Culling”: The E-Discovery Buzz Cut

Tuesday, September 30th, 2008

Ralph Losey, never one to mince words, recently analyzed a litigation survey from the elite Fellows of the American College of Trial Lawyers. The survey highlights the fact that one of the main problems facing the U.S. legal system today is (surprise!) e-discovery. Also (not) a surprise is that the study “places the blame squarely on poor rules, bad law, and judges,” while overlooking the role that lawyers play in the problem.

In his analysis, Ralph makes a number of insightful observations that should help lawyers move from being e-discovery troublemakers to being part of the solution. However, one of his key critiques is targeted not at lawyers but rather at the vendor community: “[E-discovery] is too expensive because lawyers and judges do not know what they are doing, and do not know how to properly cull and review email, and because clients are disorganized pack-rats. Many of the e-discovery vendors are also misinformed, but often they do know better; they just have no pecuniary interest in aggressive culling. Some may even seek to line their own pockets in inflated discoveries.”

As Ralph bluntly points out, pecuniary interests (translation: money) play a big role here, but so does risk reduction. Imagine you’re given the opportunity to process a 2 terabyte case all the way through to review. With the “funnel” of e-discovery costs placing the highest dollar-per-gigabyte value at the end of the process (i.e., review), what’s your incentive to cull aggressively at the beginning? Not much from a revenue perspective, certainly, but also not much from a risk perspective: particularly when you have sanctions and lawsuits on your mind and are thinking about the potential liability you incur by excluding potentially relevant documents with too broad a brush (or pair of garden clippers) in your pruning.
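
To put rough numbers on that incentive problem, here is an illustrative calculation with assumed per-gigabyte rates (the rates are invented for the example, not industry benchmarks); the point is simply that vendor revenue tracks the volume that survives to review.

    # Illustrative only: assumed rates showing why downstream review, not
    # upstream culling, drives revenue on a hypothetical 2 TB matter.
    case_size_gb = 2_000          # 2 terabytes collected
    processing_rate = 100         # assumed $/GB for processing
    review_rate = 1_500           # assumed $/GB for hosted/attorney review

    for cull_pct in (0.0, 0.50, 0.90):     # fraction removed before review
        reviewed_gb = case_size_gb * (1 - cull_pct)
        total = case_size_gb * processing_rate + reviewed_gb * review_rate
        print(f"cull {cull_pct:>4.0%}: review {reviewed_gb:>6,.0f} GB, total ${total:>11,.0f}")
    # cull 0%  -> $3,200,000; cull 50% -> $1,700,000; cull 90% -> $500,000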

How do we move forward? As document volumes continue to grow, it’s clear that aggressive culling (with a few caveats which we’ll get to in a minute) is a critical tool for managing costs and improving case outcomes (let’s go out on a limb and define “improving” as producing fairer and more equitable rulings). However, in order to adopt more aggressive culling as a standard part of the electronic discovery process, the community has to come to terms with three things:

  • The Myth of Perfection: There may be perfect abs, but there is no perfect e-discovery. Organizations like the E-Discovery Institute are doing fantastic work to measure and improve the accuracy of electronic discovery efforts, but in the end it’s tough to argue that 100 contract attorneys manually reviewing 10 million documents will necessarily produce a better overall e-discovery outcome than 10 specialized attorneys reviewing 200,000 documents that were aggressively (but thoughtfully) culled from the initial 10 million document set. There simply is no black-and-white set of rules that will lead to a perfect process.
  • The Benefit of Cost Control: Given that, it is in the best interest of everyone involved (yes, even vendors) to choose the most cost-effective process that provides a high likelihood of producing the information relevant to the case.  This means “saving your bullets” by not spending all of your e-discovery dollars up front in a case pursuing the perfection myth, but instead approaching discovery in an incremental fashion that can adapt to changing facts and circumstances as the matter unfolds. How, you may ask, do vendors benefit? They can become more strategic e-discovery advisors by working with counsel over the full lifecycle of a case, providing higher-value (and, by the way, more interesting and intellectually challenging) consulting services to help incrementally adjust and adapt the course of e-discovery. As Ralph puts it: “…Trial lawyers should accept that specialists in the field of e-discovery are a necessary evil. If an e-discovery specialist knows the field, they can save you money and take you out of the e-discovery morass faster and more reliably than a dozen new rules. The world today is too complex for one man or woman to do it all.”
  • The Value of Defensibility: Many of you likely winced at the term “high likelihood” in the previous point. “Sacrilege!” you cried. “I demand certainty!” First, go back and re-read the first point about the Myth of Perfection. Then, consider that a better way forward may be an approach to e-discovery that involves more aggressive culling early in the process to focus on the most important documents first, more iterations to adapt to changing facts and circumstances, and, all along the way, a complete audit trail that provides defensibility in the event that any aspect of the process is ever questioned. Such defensibility would include specific documentation about the culling decisions that were made, down to the keyword and “sub-keyword” (i.e. wildcard expansion) level, so all the cards are on the table for everyone to see.  The value of defensibility when performing aggressive culling is enormous, in that it adds an additional measure of safety and trust to the process, minimizing the amount of doubt and second-guessing that so often plagues e-discovery negotiations.
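
As a rough illustration of the kind of keyword-level audit trail described in the last point, below is a minimal sketch that records every retain-or-cull decision, including the wildcard pattern and the expanded term that triggered it. The corpus, patterns and file name are hypothetical; a real workflow would capture far more context (custodian, date range, reviewer, tool version).

    # Hypothetical culling audit log: one row per decision, down to the keyword
    # and wildcard-expansion ("sub-keyword") level, so the process can be
    # reconstructed if it is ever questioned.
    import csv, fnmatch, datetime

    corpus = {"MSG-001": "acme pricing sheet", "MSG-002": "lunch order", "MSG-003": "price list update"}
    keyword_patterns = ["pric*"]   # wildcarded search terms agreed with counsel

    with open("culling_audit_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "doc_id", "pattern", "expanded_term", "action"])
        for doc_id, text in corpus.items():
            matches = [(pat, word) for pat in keyword_patterns
                       for word in text.split() if fnmatch.fnmatch(word, pat)]
            action = "retain" if matches else "cull"
            for pat, word in matches or [("", "")]:
                writer.writerow([datetime.datetime.now().isoformat(), doc_id, pat, word, action])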

By coming to terms with the fundamental imperfections of the e-discovery process and embracing the promise of lower costs and the agility and responsiveness that can be gained with a more iterative approach, everyone stands to gain from the safe and controlled adoption of aggressive culling – yes, even the vendors (at least the smart ones) and their ever-present pecuniary interests.