
Archive for April, 2012

District Court Upholds Judge Peck’s Predictive Coding Order Over Plaintiff’s Objection

Monday, April 30th, 2012

In a decision that advances the predictive coding ball one step further, United States District Judge Andrew L. Carter, Jr. upheld Magistrate Judge Andrew Peck’s order in Da Silva Moore, et al. v. Publicis Groupe, et al. despite Plaintiffs’ multiple objections. Although Judge Carter rejected all of Plaintiffs’ arguments in favor of overturning Judge Peck’s predictive coding order, he did not rule on Plaintiffs’ motion to recuse Judge Peck from the current proceedings – a matter that is expected to be addressed separately at a later time. Whether or not a successful recusal motion will alter this or any other rulings in the case remains to be seen.

Finding that it was within Judge Peck’s discretion to conclude that the use of predictive coding technology was appropriate “under the circumstances of this particular case,” Judge Carter summarized Plaintiffs’ key arguments, listed below, and rejected each of them in his five-page Opinion and Order issued on April 26, 2012.

  • the predictive coding method contemplated in the ESI protocol lacks generally accepted reliability standards;
  • Judge Peck improperly relied on outside documentary evidence;
  • Defendant MSLGroup’s (“MSL’s”) expert is biased because the use of predictive coding will reap financial benefits for his company; and
  • Judge Peck failed to hold an evidentiary hearing and adopted MSL’s version of the ESI protocol on an insufficient record and without proper Rule 702 consideration.

Since Judge Peck’s earlier order is “non-dispositive,” Judge Carter identified and applied the “clearly erroneous or contrary to law” standard of review in rejecting Plaintiffs’ request to overturn the order. Central to Judge Carter’s reasoning is his assertion that any confusion regarding the ESI protocol is immaterial because the protocol “contains standards for measuring the reliability of the process and the protocol builds in levels of participation by Plaintiffs.” In other words, Judge Carter essentially dismisses Plaintiffs’ concerns as premature on the grounds that the current protocol provides a system of checks and balances that protects both parties. To be clear, that doesn’t necessarily mean Plaintiffs won’t get a second bite at the apple if problems with MSL’s productions surface.

For now, however, Judge Carter seems to be saying that although Plaintiffs must live with the current order, they are by no means relinquishing their rights to a fair and just discovery process. In fact, the existing protocol allows Plaintiffs to actively participate in and monitor the entire process closely. For example, Judge Carter writes that, “if the predictive coding software is flawed or if Plaintiffs are not receiving the types of documents that should be produced, the parties are allowed to reconsider their methods and raise their concerns with the Magistrate Judge.”

Judge Carter also specifically addresses Plaintiffs’ concerns related to statistical sampling techniques, which could ultimately prove to be their meatiest argument. A key area of disagreement between the parties is whether MSL is reviewing enough documents to ensure relevant documents are not completely overlooked even if this complex process is executed flawlessly. Addressing this point, Judge Carter states that, “If the method provided in the protocol does not work or if the sample size is indeed too small to properly apply the technology, the Court will not preclude Plaintiffs from receiving relevant information, but to call the method unreliable at this stage is speculative.”
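For readers curious about the math behind sample size disputes like this one, the following is a minimal sketch, not drawn from the court record, of the standard calculation used to size a validation sample. It assumes a normal approximation and a worst-case prevalence of 50 percent; the confidence level, margin of error, and collection sizes shown are illustrative only.

```python
import math
from typing import Optional

def sample_size(z: float = 1.96, margin: float = 0.02,
                p: float = 0.5, population: Optional[int] = None) -> int:
    """Documents to sample to estimate the proportion of relevant material
    within +/- `margin` at the given confidence level (z = 1.96 for 95%),
    using a normal approximation and worst-case prevalence p = 0.5."""
    n = (z ** 2) * p * (1 - p) / margin ** 2
    if population is not None:
        # Finite population correction; barely matters for large collections.
        n = n / (1 + (n - 1) / population)
    return math.ceil(round(n, 6))  # round first to dodge floating point noise

print(sample_size())                      # about 2,401 documents
print(sample_size(population=3_000_000))  # about 2,400 documents
```

Under these standard assumptions, roughly 2,400 sampled documents suffice whether the collection holds one hundred thousand items or three million, which is one reason sample size fights tend to turn on what is being estimated and how, rather than on raw collection size.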

Although most practitioners are focused on watching how these novel predictive coding issues play out, it is important not to overlook two key nuggets of information in Judge Carter’s Opinion and Order. First, Judge Carter’s statement that “[t]here simply is no review tool that guarantees perfection” serves as an acknowledgement that “reasonableness,” not “perfection,” is the standard by which discovery should be measured. Second, Judge Carter’s acknowledgement that manual review with keyword searches may be appropriate in certain situations should serve as a wake-up call for those who think predictive coding technology will replace all predecessor technologies. To the contrary, predictive coding is a promising new tool to add to the litigator’s tool belt, but it is not necessarily a replacement for all other technology tools.

Plaintiffs in Da Silva Moore may not have received the ruling they were hoping for, but Judge Carter’s Opinion and Order makes it clear that the courthouse door has not been closed. Given the controversy surrounding this case, Plaintiffs are likely to voice many of their concerns at a later date as discovery proceeds. In other words, don’t expect all of these issues to fade away without a fight.

First State Court Issues Order Approving the Use of Predictive Coding

Thursday, April 26th, 2012

On Monday, Virginia Circuit Court Judge James H. Chamblin issued what appears to be the first state court Order approving the use of predictive coding technology for eDiscovery. On Tuesday, Law Technology News reported that Judge Chamblin issued the two-page Order in Global Aerospace Inc., et al. v. Landow Aviation, L.P. dba Dulles Jet Center, et al., over Plaintiffs’ objection that traditional manual review would yield more accurate results. The case stems from the collapse of three hangars at the Dulles Jet Center (“DJC”) during a major snow storm on February 6, 2010. The Order was issued at Defendants’ request after opposing counsel objected to their proposed use of predictive coding technology to “retrieve potentially relevant documents from a massive collection of electronically stored information.”

In Defendants’ Memorandum in Support of their motion, they argue that a first-pass manual review of approximately two million documents would cost two million dollars and locate only about sixty percent of all potentially responsive documents. They go on to state that keyword searching might be more cost-effective “but likely would retrieve only twenty percent of the potentially relevant documents.” On the other hand, they claim predictive coding “is capable of locating upwards of seventy-five percent of the potentially relevant documents and can be effectively implemented at a fraction of the cost and in a fraction of the time of linear review and keyword searching.”
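To put those competing recall figures in concrete terms, here is a back-of-the-envelope illustration. The count of truly responsive documents below is invented for the example; the briefing supplies only the percentages.

```python
# Hypothetical: assume 100,000 truly responsive documents exist within
# the two-million-document collection (a figure NOT given in the briefing).
responsive = 100_000
recall_claims = {              # recall rates asserted in Defendants' Memorandum
    "manual review": 0.60,
    "keyword search": 0.20,
    "predictive coding": 0.75,
}

for method, recall in recall_claims.items():
    found = int(responsive * recall)
    missed = responsive - found
    print(f"{method:>17}: ~{found:,} found, ~{missed:,} missed")
```

On these assumptions, even the best-performing method would leave roughly a quarter of the responsive documents unfound, which is precisely the point Plaintiffs seize on in their Opposition.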

In their Opposition Brief, Plaintiffs argue that Defendants should produce “all responsive documents located upon a reasonable inquiry,” and “not just the 75%, or less, that the ‘predictive coding’ computer program might select.” They also characterize Defendants’ request to use predictive coding technology instead of manual review as a “radical departure from the standard practice of human review” and point out that Defendants cite no case in which a court compelled a party to accept a document production selected by a “’predictive coding’ computer program.”

Considering that predictive coding technology is new to eDiscovery and that first-generation tools can be difficult to use, it is not surprising that both parties appear to frame some of their arguments curiously. For example, Plaintiffs either mischaracterize or misunderstand Defendants’ proposed workflow, given their statement that Defendants want a “computer program to make the selections for them” instead of having “human beings look at and select documents.” Importantly, predictive coding tools require human input for a computer program to “predict” document relevance. Additionally, the proposed approach includes an additional human review step prior to production that involves evaluating the computer’s predictions.

On the other hand, some of Defendants’ arguments also seem to stray a bit off course. For example, Defendants seem to unduly minimize the value of using other tools in the litigator’s tool belt, like keyword search or topic grouping, to cull data prior to using potentially more expensive predictive coding technology. To broadly state that keyword searching “likely would retrieve only twenty percent of the potentially relevant documents” seems to ignore two facts. First, keyword search for eDiscovery is not dead. To the contrary, keyword searches can be an effective tool for broadly culling data prior to manual review and for conducting early case assessments. Second, the success of keyword searches and other litigation tools depends as much on the end user as the technology. In other words, the carpenter is just as important as the hammer.

The Order issued by Judge Chamblin, the current Chief Judge for the 20th Judicial Circuit of Virginia, states that “Defendants shall be allowed to proceed with the use of predictive coding for purposes of the processing and production of electronically stored information.” In a handwritten notation, the Order further provides that the processing and production is to be completed within 120 days, with “processing” to be completed within 60 days and “production to follow as soon as practicable and in no more than 60 days.” The Order does not mention whether the parties are required to agree upon a mutually agreeable protocol, an issue that has plagued the court and the parties in the ongoing Da Silva Moore, et al. v. Publicis Groupe, et al. for months.

Global Aerospace is the third known predictive coding case on record, but appears to present yet another set of unique legal and factual issues. In Da Silva Moore, Judge Andrew Peck of the Southern District of New York rang in the New Year by issuing the first known court order endorsing the use of predictive coding technology.  In that case, the parties agreed to the use of predictive coding technology, but continue to fight like cats and dogs to establish a mutually agreeable protocol.

Similarly, in the Northern District of Illinois, Magistrate Judge Nan Nolan is tackling the issue of predictive coding technology in Kleen Products, LLC, et al. v. Packaging Corporation of America, et al. In Kleen, Plaintiffs essentially ask Judge Nolan to order Defendants to redo their production even though Defendants have spent thousands of hours reviewing documents, have already produced over a million documents, and their review is over 99 percent complete. The parties have already presented witness testimony in support of their respective positions over the course of two full days, and more testimony may be required before Judge Nolan issues a ruling.

What is interesting about Global Aerospace is that Defendants proactively sought court approval to use predictive coding technology over Plaintiffs’ objections. This scenario is different from Da Silva Moore because the parties in Global Aerospace have not agreed to the use of predictive coding technology. Similarly, it appears that Defendants had not already substantially completed document review and production, as they had in Kleen Products. Instead, the Global Aerospace Defendants appear to have sought protection from the court before moving full steam ahead with predictive coding technology, and they have received the court’s blessing over Plaintiffs’ objection.

A key issue that the Order does not address is whether the parties will be required to agree on a mutually acceptable protocol before proceeding with the use of predictive coding technology. As stated earlier, the inability to define such a protocol is a key issue that has plagued the court and the parties for months in Da Silva Moore, et al. v. Publicis Groupe, et al. Similarly, in Kleen, the court was faced with issues related to the protocol for using technology tools. Both cases highlight the fact that regardless of which eDiscovery technology tools are selected from the litigator’s tool belt, the tools must be used properly in order for discovery to be fair.

Judge Chamblin left the barn door wide open for Plaintiffs to lodge future objections, perhaps setting the stage for yet another heated predictive coding battle. Importantly, the Judge issued the Order “without prejudice to a receiving party” and notes that parties can object to the “completeness or the contents of the production or the ongoing use of predictive coding technology.” Given the ongoing challenges in Da Silva Moore and Kleen, don’t be surprised if the parties in Global Aerospace face some of the same process-based challenges as their predecessors. Hopefully some of the early challenges related to the use of first-generation predictive coding tools can be overcome as case law continues to develop and as next-generation predictive coding tools become easier to use. Stay tuned as the facts, testimony, and arguments in the Da Silva Moore, Kleen Products, and Global Aerospace cases continue to evolve.

The 2012 EDGE Summit (21st Century Technology for Information Governance) Debuts In Nation’s Capital

Monday, April 23rd, 2012

The EDGE Summit this week is one of the most prestigious eDiscovery events of the year, as well as arguably the largest for the government sector. This year’s topics and speakers are top-notch. The opening keynote speaker will be the Director of Litigation for the National Archives and Records Administration (NARA), Mr. Jason Baron. The EDGE Summit will be Mr. Baron’s first appearance since the deadline passed for the 480 agencies to submit the reports NARA will use to construct the Directive required by the Presidential Mandate. Attendees will be eager to hear what steps NARA is taking to issue the Directive to the government later this year, and the potential impact it will have on how the government approaches its eDiscovery obligations. The Directive will be a significant step toward bringing order to the government’s Big Data challenges and unifying agencies behind a similar approach to an information governance plan.

Also speaking at EDGE is the renowned Judge Facciola, who will discuss the anticipated updates the American Bar Association (ABA) is expected to make to the Model Rules of Professional Conduct. He plans to speak on the challenges that lawyers are facing in the digital age, and what those challenges mean for competency as a practicing lawyer. He will also focus on government lawyers and how they can better meet their legal obligations through education, training, or knowing when and how to find the right expert. Whether the government appears as the investigating party in law enforcement, the producing party under the Freedom of Information Act (FOIA), or the defendant in civil litigation, Judge Facciola will discuss what he sees in his courtroom every day and where the true knowledge gaps lie in the technological understanding of many lawyers today.

While the EDGE Summit offers CLE credit, it also has a unique practical aspect. There will be a FOIA-specific lab, a lab on investigations, one on civil litigation and early case assessment (ECA), and one on streamlining the eDiscovery workflow process. Those who attend the labs will get hands-on experience with technology that few educational events offer. It is rare to get into the driver’s seat of the car on the showroom floor and actually drive, which is what EDGE is providing for end users and interested attendees. When talking about the complex problems government agencies face today with Big Data, records management, information governance, eDiscovery, compliance, security, and more, it is necessary to give users a way to truly visualize how these technologies work.

Another key draw at the Summit will be the panel discussions, which will feature experienced government lawyers who have been on the front lines of litigation and have unique perspectives. The legal hold panel will cover some exciting aspects of the evolution of manual versus automated processes for legal hold. Mr. David Shonka, the Deputy General Counsel of the Federal Trade Commission, is on the panel, and he will discuss the defensibility of the process the FTC used and the experience his department had with two 30(b)(6) witnesses in Federal Trade Commission v. Lights of America, Inc. (C.D. Cal. Mar. 2011). The session will also cover how issuing a legal hold is imperative once the duty to preserve has been triggered. There is a whole new generation of lawyers managing the litigation hold process in an automated way, and it will be great to discuss both the manual and automated approaches and talk about best practices for government agencies. There will also be a session on predictive coding and a discussion of the recent cases that have involved the use of technology-assisted review. While we are not at the point of mainstream adoption for predictive coding, it is quite exciting to think about the government going from a paper world straight into solutions that would help it manage its unique challenges while saving time and money.

Finally, the EDGE Summit will conclude with closing remarks from The Hon. Michael Chertoff, Secretary of the U.S. Department of Homeland Security from 2005 to 2009. Mr. Chertoff now provides high-level strategic counsel to corporate and government leaders on a broad range of security issues, from risk identification and prevention to preparedness, response and recovery. All of these issues now involve data and how to search, collect, analyze, protect and store it. Security is one of the most important aspects of information governance. The government has unique challenges, including its size and many geographical locations, records management requirements, massive data volume and case load, investigations, and heightened security and defense intelligence risks. This year, in particular, will be a defining year, not only because of the Presidential Mandate, but because of the information explosion and the strains on the global economy. This is why the sector needs to come together to share best practices and hear success stories. Otherwise, it won’t be able to keep up with the data explosion that’s threatening private and public sectors alike.

Breaking News: Court Clarifies Duty to Preserve Evidence, Denies eDiscovery Sanctions Motion Against Pfizer

Wednesday, April 18th, 2012

It is fortunately becoming clearer that organizations do not need to preserve information until litigation is “reasonably anticipated.” In Brigham Young University v. Pfizer (D. Utah Apr. 16, 2012), the court denied the plaintiff university’s fourth motion for discovery sanctions against Pfizer, likely ending its chance to obtain a “game-ending” eDiscovery sanction. The case, which involves disputed claims over the discovery and development of prominent anti-inflammatory drugs, is set for trial on May 29, 2012.

In Brigham Young, the university pressed its case for sanctions against Pfizer based on a vastly expanded concept of a litigant’s preservation duty. Relying principally on the controversial Phillip M. Adams & Associates v. Dell case, the university argued that Pfizer’s “duty to preserve runs to the legal system generally.” The university reasoned that just as the defendant in the Adams case was “sensitized” by earlier industry lawsuits to the real possibility of plaintiff’s lawsuit, Pfizer was likewise put on notice of the university’s claims due to related industry litigation.

The court rejected such a sweeping characterization of the duty to preserve, opining that it was “simply too broad.” Echoing the concerns articulated by the Advisory Committee when it framed the 2006 amendments to the Federal Rules of Civil Procedure (FRCP), the court took pains to emphasize the unreasonable burdens that parties such as Pfizer would face if such a duty were imposed:

“It is difficult for the Court to imagine how a party could ever dispose of information under such a broad duty because of the potential for some distantly related litigation that may arise years into the future.”

The court also rejected the university’s argument because such a position failed to appreciate the basic workings of corporate records retention policies. As the court reasoned, “[e]vidence may simply be discarded as a result of good faith business procedures.” When those procedures operate to inadvertently destroy evidence before the duty to preserve is triggered, the court held that sanctions should not issue: “The Federal Rules protect from sanctions those who lack control over the requested materials or who have discarded them as a result of good faith business procedures.”

The Brigham Young case is significant for a number of reasons. First, it reiterates that organizations need not keep electronically stored information (ESI) for legal or regulatory purposes until the duty to preserve is triggered by reasonably anticipated litigation. As American courts have almost uniformly held since the 1997 case of Concord Boat Corp. v. Brunswick Corp., organizations are not required to keep every piece of paper, every email, every electronic document and every backup tape.

Second, Brigham Young emphasizes that organizations can and should use document retention protocols to rid themselves of data stockpiles. Absent a preservation duty or other exceptional circumstances, paring back ESI pursuant to “good faith business procedures” (such as a neutral retention policy) will be protected under the law.
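For the technically inclined, the “neutral retention policy” the court credits can be pictured as a simple, content-blind rule with a legal hold override. The sketch below is illustrative only; the field names and the three-year period are assumptions, not anything drawn from the case.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

RETENTION_PERIOD = timedelta(days=3 * 365)  # assumed neutral three-year rule

@dataclass
class Record:
    record_id: str
    created: datetime
    on_legal_hold: bool = False  # set once litigation is reasonably anticipated

def eligible_for_deletion(rec: Record, now: datetime) -> bool:
    """A record may be routinely purged only when it has aged past the
    retention period AND no preservation duty has attached."""
    return (now - rec.created) > RETENTION_PERIOD and not rec.on_legal_hold

records = [
    Record("DOC-001", datetime(2008, 1, 15)),                      # stale, no hold
    Record("DOC-002", datetime(2008, 1, 15), on_legal_hold=True),  # stale, but held
]
now = datetime(2012, 4, 18)
print([r.record_id for r in records if eligible_for_deletion(r, now)])
# ['DOC-001'] -- the held record survives even though it is past retention
```

The design choice is the point: because the age-based rule never looks at content, deletion under it is “neutral,” and the hold flag is the single switch that suspends it once a preservation duty arises.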

Finally, Brigham Young narrows the holding of the Adams case to its particular facts. The Adams case has been particularly troublesome to organizations as it arguably expanded their preservation duty in certain circumstances. However, Brigham Young clarified that this expansion was unwarranted in the instant case, particularly given that Pfizer documents were destroyed pursuant to “good faith business procedures.”

In summary, Brigham Young teaches that organizations will be protected from eDiscovery sanctions to the extent they destroy ESI in good faith pursuant to a reasonable records retention policy. This will likely bring a sigh of relief to enterprises struggling with the information explosion since it encourages confident deletion of data when the coast is clear of a discrete litigation event.

Proportionality Demystified: How Organizations Can Get eDiscovery Right by Following Four Key Principles

Tuesday, April 17th, 2012

Talk to almost any organization about legal issues and invariably the subject of eDiscovery will be raised. The skyrocketing costs and lengthy delays associated with data preservation and document review provide ample justification for organizations to be on the alert about eDiscovery. While these costs and delays tend to make the eDiscovery landscape appear bleak, a positive development on this front is emerging for organizations. That development is the emphasis that many courts are now placing on “proportionality” for addressing eDiscovery disputes.

Though initially embraced by only a few cognoscenti after the 1983 and 2000 amendments to the Federal Rules of Civil Procedure (FRCP), proportionality standards are now being championed by various district and circuit courts. As more opinions are issued which analyze proportionality, several key principles are becoming apparent in this developing body of jurisprudence. To better understand these principles, it is instructive to review some of the top proportionality cases issued this year and last. These cases provide a roadmap of best practices that, if followed, will help courts, clients and counsel reduce the costs and burdens connected with eDiscovery.

1. Discourage Unnecessary Discovery

Case: Bottoms v. Liberty Life Assur. Co. of Boston (D. Colo. Dec. 13, 2011)

Summary: The court dramatically curtailed the written discovery that plaintiff sought to propound on the defendant. Plaintiff had requested leave in this ERISA action to serve “sweeping” interrogatories and document requests to resolve the limited issue of whether the defendant had improperly denied her long term disability benefits. Drawing on the proportionality standards under Federal Rule 26(b)(2)(C), the court characterized the proposed discovery as “patently overbroad” and as seeking materials that were “largely irrelevant.” The court ultimately ordered the defendant to respond to some aspects of plaintiff’s interrogatories and document demands, but not before limiting their nature and scope.

Proportionality Principle No. 1: The Bottoms case emphasizes what courts have been advocating for years: that organizations should do away with unnecessary discovery. That does not mean “robotically recycling discovery requests propounded in earlier actions.” Instead, counsel must “stop and think” to ensure that its discovery is narrowly tailored in accordance with Rule 26(b)(2)(C). As Bottoms teaches, “the responsibility for conducting discovery in a reasonable, proportionate manner rests in the first instance with the parties and their attorneys.”

2. Encourage Reasonable Discovery Efforts

Case: Larsen v. Coldwell Banker Real Estate Corp. (C.D. Cal. Feb. 2, 2012)

Summary: In Larsen, the court rejected the plaintiffs’ assertion that the defendants should be made to redo their production of 9,000 pages of documents. The plaintiffs had argued that re-production of the documents was necessary to address certain discrepancies – including missing emails – in the production. The court disagreed, holding instead that plaintiffs had failed to establish that such discrepancies had “prevented them in any way from obtaining information relevant to a claim or defense under Fed.R.Civ.P. 26(b)(1).”

The court also reasoned that a “do over” would violate the principles of proportionality codified in Rule 26(b)(2)(C). After reciting the proportionality language from Rule 26 and referencing The Sedona Principles, the court determined that “the burden and expense to Defendants in completely reproducing its entire ESI production far outweighs any possible benefit to Plaintiffs.” There were too few discrepancies identified to justify the cost of redoing the production.

Proportionality Principle No. 2: The Larsen decision provides a simple reminder that organizations’ discovery efforts must be reasonable, not perfect. This reminder bears repeating as litigants frequently use eDiscovery sideshows to leverage lucrative settlements without having to address the merits of their claims or defenses. Such a practice, likened to a “cancerous growth” given its destructive nature, emphasizes that discovery devices should be used to “facilitate litigation rather than as weapons to wage litigation.” Calcor Space Facility, Inc. v. Superior Court, 53 Cal.App.4th 216, 221 (1997). Similar to the theme raised in our post regarding the predictive coding dispute in the Kleen Products case, principles of proportionality rightly emphasize the reasonable nature of parties’ obligations in discovery.

3. Discourage Dilatory Discovery Tactics

Case: Escamilla v. SMS Holdings Corporation (D. Minn. Oct. 21, 2011)

Summary: The court rejected an argument that proportionality standards should excuse the individual defendant from paying for additional discovery ordered by the court. The defendant essentially argued that Rule 26(b)(2)(C)(iii) foreclosed the ordered discovery given his limited financial resources. This position was unavailing, however, given that “the burden and expense of this discovery was self-inflicted by [the defendant].” As it turns out, the ordered discovery was necessary to address issues created in the litigation by the defendant’s failure to preserve relevant evidence. Moreover, there were no alternative means available for obtaining the sought-after materials. Given the unique nature of the evidence and the defendant’s misconduct, the court held that the “burden of the additional discovery [did] not outweigh its likely benefit.”

Proportionality Principle No. 3: The Escamilla decision reinforces a common refrain among proportionality cases: that proportionality is foreclosed to those parties who create their own burdens. Like the defense of unclean hands, proportionality essentially requires a litigant to approach the court with a clean slate of conduct in discovery. This is confirmed by The Sedona Conference Comment on Proportionality in Electronic Discovery, which declares that “[c]ourts should disregard any undue burden or expense that results from a responding party’s own conduct or delay.”

4. Encourage Better Information Governance Practices

Case: Salamone v. Carter’s Retail, Inc. (D.N.J. Jan. 28, 2011)

Summary: The court denied a motion for protective order that the defendant clothing retailer filed to stave off the collection and analysis of over 13,000 personnel files. The retailer had argued that proportionality precluded the search and review of the personnel files. In support of its argument, the retailer asserted that the nature, format, location and organization of the records made their review and production too burdensome: “the burden of production . . . outweigh[s] any benefit to plaintiffs” considering the “disorganization of the information, the lack of accessible format, the significant amount of labor and costs involved, and defendant’s management structure.”

In rejecting the retailer’s position, the court criticized its information retention system as the culprit for its burdens. That the retailer “maintains personnel files in several locations without any uniform organizational method,” the court reasoned, “does not exempt Defendant from reasonable discovery obligations.” After weighing the various factors that comprise the proportionality analysis under Rule 26(b)(2)(C), the court concluded that the probative value of production outweighed the resulting burden and expense on the retailer.

Proportionality Principle No. 4: Having an intelligent information governance process in place could have addressed the cost and logistics headaches that the retailer faced. Had the records at issue been digitized and maintained in a central archive, the retailer’s collection burdens would have been significantly minimized. Furthermore, integrating these “upstream” data retention protocols with “downstream” eDiscovery processes could have expedited the review process. The Salamone case teaches that an integrated information governance process, supported by effective, enabling technologies, will likely help organizations reach the objectives of proportionality by reducing the extent of discovery burdens and making them more commensurate with the demands of litigation.

Conclusion

The foregoing cases exemplify how proportionality principles can help lawyers and litigants conduct eDiscovery in an efficient and cost effective manner. And by faithfully observing these standards, courts, clients and counsel can better follow the mandate from Federal Rule 1 “to secure the just, speedy, and inexpensive determination of every action and proceeding.”

Plaintiffs Ask Judge Nan R. Nolan to Go Out On a Limb in Kleen Products Predictive Coding Case

Friday, April 13th, 2012

While the gaze of the eDiscovery community has been firmly transfixed on the unfolding drama in the Da Silva Moore, et al. v. Publicis Groupe, et al. predictive coding case, an equally important case in the Northern District of Illinois has been quietly flying under the radar. I recently traveled to Chicago to attend the second day of a two-day hearing in Kleen Products, LLC, et al. v. Packaging Corporation of America, et al., where plaintiff and defense experts duked it out over whether or not defendants should be required to “redo” their document production. On its face, plaintiffs’ request may not seem particularly unusual. However, a deeper dive into the facts reveals that plaintiffs are essentially asking Magistrate Judge Nan R. Nolan to issue an order that could potentially change the way parties are expected to handle eDiscovery in the future.

Can One Party Dictate Which Technology Tool Their Opponent Must Use?

The reason plaintiffs’ position is shocking to many observers is as much about the stage of the case as it is about their argument. Plaintiffs basically ask Judge Nolan to order defendants to redo their production even though defendants have spent thousands of hours reviewing documents, have already produced over a million documents, and at least one defendant claims their review is over 99 percent complete. Given that plaintiffs don’t appear to point to any glaring deficiencies in defendants’ production, an order by Judge Nolan requiring defendants to redo their production using a different technology tool would likely sound alarm bells for 7th Circuit litigants. Judges normally care more about results than methodology when it comes to eDiscovery and they typically do not allow one party to dictate which technology tools their opponents must use.

Plaintiffs’ main contention appears to be that defendants should redo the production because keyword search and other tools were used instead of predictive coding technology. There is no question that keyword search tools have obvious limitations. In fact, Ralph Losey and I addressed this precise issue in a recent webinar titled “Is Keyword Search in eDiscovery Dead?” The problem with keyword searches, says Losey, is that they are much like the card game “go fish.” Parties applying keyword searches typically make blind guesses about which keywords might reveal relevant documents. Since guessing every relevant keyword contained in a large collection of documents is virtually impossible, using keyword search tools normally results in some relevant documents being overlooked (those that do not contain the keyword) and some irrelevant documents being retrieved (irrelevant documents may nonetheless contain the keyword). Although imperfect, keyword search tools still add value when used properly because they can help identify important documents quickly and expedite document review.
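A toy example makes the “go fish” problem concrete. In the sketch below, built on an invented corpus rather than anything from the case, a single guessed keyword misses a relevant document that uses different wording and pulls in an irrelevant one that happens to share the term:

```python
# Toy corpus: document id -> text. Relevance is known here only because
# we invented the documents; in real discovery it is exactly what's unknown.
docs = {
    1: "the shipment price was fixed at last week's meeting",   # relevant
    2: "let's keep our pricing arrangement off email",          # relevant
    3: "new price list attached for the spring catalog",        # irrelevant
    4: "competitors matched us; coordinate the next increase",  # relevant
}
truly_relevant = {1, 2, 4}

# The blind "go fish" guess: search for the keyword "price".
hits = {doc_id for doc_id, text in docs.items() if "price" in text.split()}

recall = len(hits & truly_relevant) / len(truly_relevant)
precision = len(hits & truly_relevant) / len(hits)
print(hits, f"recall={recall:.0%}", f"precision={precision:.0%}")
# {1, 3} recall=33% precision=50%: doc 2 says "pricing", doc 4 never
# uses the keyword at all, and doc 3 is a false hit.
```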

Regardless, plaintiffs take the position that defendants should have used predictive coding to avoid the limitations of keyword search tools. The arguments are not well framed, but ostensibly plaintiffs rely on the common belief that predictive coding tools can minimize the inherent limitations of keyword search tools. The rationale is based in part on the notion that predictive coding tools are better because they don’t require users to know all the relevant keywords in order to identify all the relevant documents. Instead, predictive coding tools rely on human input to construct complex search algorithms. Provided the human input is accurate, computers can use these algorithms to automate the identification of potentially relevant documents during discovery faster and more accurately than humans using traditional linear document review methodologies. Plaintiffs contend defendants should redo their document production using predictive coding technology instead of relying on keywords and traditional linear review because it would provide added assurances that defendants’ productions were thorough.
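For readers who want to see what “human input constructing the algorithm” looks like in practice, here is a minimal sketch of that supervised learning loop using the open-source scikit-learn library. It is purely illustrative: no party in Kleen has disclosed its tooling, and real predictive coding platforms layer iterative training rounds and statistical validation on top of this core idea.

```python
# A minimal sketch of the predictive coding training loop described above,
# using scikit-learn (illustrative only; no specific product works this way).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# 1. Attorneys review a small "seed set" and code each document.
seed_docs = [
    "price fixing discussed at the meeting",           # coded relevant
    "coordinate the price increase with competitors",  # coded relevant
    "spring catalog price list attached",              # coded not relevant
    "lunch order for the team offsite",                # coded not relevant
]
seed_labels = [1, 1, 0, 0]

# 2. The system "learns" a model from those human judgments.
vectorizer = TfidfVectorizer()
model = LogisticRegression()
model.fit(vectorizer.fit_transform(seed_docs), seed_labels)

# 3. The model scores the unreviewed collection; high scorers are routed
#    to human reviewers rather than produced automatically.
unreviewed = [
    "price fixing on the next shipment",
    "catering invoice for the holiday party",
]
scores = model.predict_proba(vectorizer.transform(unreviewed))[:, 1]
for doc, score in sorted(zip(unreviewed, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {doc}")
```

Even in this toy form, the lesson of the post holds: the model is only as good as the human coding decisions it learns from, which is why the process matters at least as much as the tool.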

Aside from the fact that defendants have essentially completed their document production, the problem with plaintiffs’ initial argument is that too much emphasis is placed on the tool and almost no value is attributed to how the tool is used. Today there is a wide range of technology tools available in the litigator’s tool belt, including keyword search, transparent concept search, topic grouping, discussion threading, and predictive coding, to name a few. Knowing which of these tools to use for a particular case, and in what combination, is important. However, even more important is the realization that none of these tools will yield the desired results unless they are used properly. Simply swapping a predictive coding tool for a keyword tool will not solve the problem if the tool is not used properly.

The Artist or The Brush?

Plaintiffs’ blanket assertion that defendants’ document production would be more thorough if a predictive coding tool were used as a replacement for keyword searching is naïve. First, using keyword searches and other tools to filter data before applying a predictive coding tool is a logical first step for weeding out clearly irrelevant documents. Second, ignoring the importance of the process by focusing only on the tool is like assuming the brush rather than the artist is responsible for the Mona Lisa. The success of a project depends on the artist as much as the tool. Placing a brush in the hands of a novice painter isn’t likely to result in a masterpiece, and neither is placing a predictive coding tool in the hands of an untrained end user. To the contrary, placing sophisticated tools in unskilled hands is likely to end poorly.

Hearing Testimony and Da Silva Moore Lessons

Perhaps recognizing their early arguments placed too much emphasis on predictive coding technology, plaintiffs spent most of their time attacking defendants’ process during the hearing. Plaintiffs relied heavily on testimony from their expert, Dr. David Lewis, in an attempt to poke holes in defendants’ search, review, and sampling protocol. For example, on direct examination Dr. Lewis criticized the breadth of defendants’ collection, their selection of custodians for sampling purposes, and their methodology for validating document review accuracy. During a spirited cross-examination of Dr. Lewis by Stephen Neuwirth, counsel for defendant Georgia Pacific, many of Dr. Lewis’ criticisms seemed somewhat trivial when measured against today’s eDiscovery status quo – basically the “go fish” method of eDiscovery. If anything, defendants appear to have followed a rigorous search and sampling protocol that goes far beyond what is customary in most document productions today. Since courts require “reasonableness” when it comes to eDiscovery rather than “perfection,” plaintiffs are likely facing an uphill battle in terms of challenging the tools defendants used or their process for using those tools.

The important relationship between technology and process is the lesson in Da Silva Moore and Kleen Products that is buried in thousands of pages of transcripts and pleadings. Although both cases deal squarely with predictive coding technology, the central issue stirring debate is confusion and disagreement about the process for using technology tools. The issue is most glaring in Da Silva Moore, where the parties actually agreed to the use of predictive coding technology, but continue to fight like cats and dogs about establishing a mutually agreeable protocol.

The fact that the parties have spent several weeks arguing about proper predictive coding protocols highlights the complexity surrounding the use of predictive coding tools in eDiscovery and the need for a new generation of predictive coding tools that simplify the current process. Until predictive coding tools become easier to use and more transparent, litigants are likely to shy away from new tools in favor of more traditional eDiscovery tools that are more intuitive and less risky. The good news is that predictive coding technology has the potential to save millions of dollars in document review if done correctly. This fact is fostering a competitive environment that will soon drive development of better predictive coding tools that are easier to use.

Conclusion

Given the amount of time and money defendants have already spent reviewing documents, it is unlikely that Judge Nolan would go out on a limb and order defendants to redo their production unless plaintiffs point to some glaring defect. I did not attend the entire hearing and have not read every court submission. However, based on my limited observations, plaintiffs have not provided much, if any, evidence that defendants failed to produce a particular document or documents. Similarly, plaintiffs’ attacks on defendants’ keyword search and sampling protocol are not convincing to the average observer. Even if plaintiffs could poke holes in defendants’ process, a complete redo is unlikely because courts typically require reasonable efforts during document production, not perfection. A third day of hearings has been scheduled in this case, so it may be several more weeks before we find out if Judge Nolan agrees.

Take Two and Call me in the Morning: U.S. Hospitals Need an Information Governance Remedy

Wednesday, April 11th, 2012

Given the vast amount of sensitive information and legal exposure faced by hospitals today, it’s a mystery why these organizations aren’t taking advantage of enabling technologies to minimize risk. Compliance with both HIPAA and the HITECH Act is often achieved by manual, ad hoc methods, which are hazardous at best. In the past, state and federal auditing environments have not been very aggressive in ensuring compliance, but that is changing. While many hospitals have invested in high tech records management systems (EMR/EHR), those systems do not encompass the entire information and data environment within a hospital. Sensitive information often finds its way into and onto systems outside the reach of EMR/EHR systems, bringing with it increased exposure to security breach and legal liability.

This information overload often metastasizes into email (both hospital and personal), attachments, portable storage devices, file, web and development servers, desktops and laptops, home or affiliated practice’s computers, and mobile devices such as iPads and smartphones. These avenues for the dissemination and receipt of information expand the information governance challenge and data security risks. Surprisingly, the feedback from the healthcare sector suggests that hospitals rarely get sued in federal court.

One place hospitals do not want to be is the “Wall of Shame,” otherwise known as the HHS website that, as of June 9, 2011, had detailed 281 Health Insurance Portability and Accountability Act (HIPAA) breaches each affecting more than 500 individuals. Overall, physical theft and loss accounted for about 63% of the reported breaches. Unauthorized access/disclosure accounted for another 16%, while hacking was only 6%. While Software Advice reasons that these statistics indicate physical theft has been the cause of the majority of breaches, it should also be considered that, due to the lack of data loss prevention technology, many hospitals are unaware of breaches that have occurred and therefore cannot report them.

There are myriad reasons hospitals aren’t landing on the front page of the newspaper with the same frequency as other businesses and government agencies when it comes to security breaches, document retention and eDiscovery blunders. But the underlying contagion is not contained, and it certainly is not benign. Feedback from the field reveals some alarming symptoms of the unhealthy state of healthcare information governance, including:

  • uncontrolled .pst files
  • exploding storage growth
  • missing or incomplete data retention rules
  • doctors/nurses storing and sending sensitive data via their personal email, iPads and smartphones
  • encryption rules that rely on individuals to determine what to encrypt
  • data backup policies that differ from data retention and information governance rules
  • little to no compliance training
  • often non-existent data loss prevention efforts.

This results in the need for more storage while creating greater legal liability, an indefensible eDiscovery posture, and an elevated risk of breach.

The reason this problem remains latent in most hospitals is that they are not yet feeling the pain of massive and multiple lawsuits, large invoices from outside law firms, or the operational challenges and costs incurred from searching through mountains of dispersed data. The symptoms are observable, the pathology is present, the problem is real, and the pain is about to acutely present itself as more states begin to deeply embrace eDiscovery requirements and government regulators increase audit frequency and fine amounts. Another less talked about reason hospitals have not had the same pressure to search and produce their data pursuant to litigation is that cases are often settled before they even reach the discovery stage. The lack of well-developed information governance practices leads to cases being settled too soon and for too much money, when they otherwise may not have needed to settle at all.

The Patient’s Symptoms Were Treated, but the Patient’s Data Still Needs Medicine

What is still unclear is why hospitals, given their compliance requirements and tightening IT budgets, aren’t archiving, classifying, and protecting their data with the same type of innovation they are demonstrating in their cutting-edge patient care technology. In this realm, two opposite ends of the IT innovation spectrum seem to co-exist in the hospital’s data environment. This dichotomy leaves much of a hospital’s data unprotected, unorganized and uncontrolled. Hospitals are experiencing increasing data security breaches and often are not aware that a breach or data loss has occurred. As more patient data is created and copied in electronic format, used in and exposed by an increasing number of systems and delivered on emerging mobile platforms, the legal and audit risks are compounding on top of a faulty or missing information governance foundation.

Many hospitals have no retention schedules or data classification rules applied to existing information, which often results in a checkbox compliance mentality and a keep-everything-forever practice. Additionally, many hospitals have no ability to apply a comprehensive legal hold across different data sources and lack technology to stop a breach or alert them when one has occurred.

Information Governance and Data Health in Hospitals

With the mandated push for paper to be converted to digital records, many hospitals are now evaluating the interplay of their various information management and distribution systems. They must consider the newly scanned legacy data (or soon to be scanned) and, if they have been operating without an archive, they must now look to implement a searchable repository where they can collectively apply document retention and records management while decreasing the amount of storage needed to retain the data. We are beginning to see internal counsel leading the way to make this initiative happen across business units. Different departments are coming together to pool resources in tight economic and high-regulation times that require collaboration. We are at the beginning of a widespread movement in the healthcare industry toward archiving, data classification and data loss prevention as hospitals link their increasing compliance and data loss requirements with the need to optimize and minimize storage costs. Finally, it comes as no surprise that the amount of data hospitals are generating is crippling their infrastructures, breaking budgets and serving as the primary motivator for change absent lawsuits and audits.

These factors are bringing together various stakeholders into the information governance conversation, helping to paint a very clear picture that putting in place a comprehensive information governance solution is in the entire hospital’s best interest. The symptoms are clear, the problem is treatable, the prescription for information governance is well proven. Hospitals can begin this process by calling an information governance meeting with key stakeholders and pursuing an agenda set around examining their data map and assessing areas of security vulnerability, as well as auditing the present state of compliance with regulations for the healthcare industry.

Editor’s note: This post was co-authored with Eric Heck, Healthcare Account Manager at Symantec.  Eric has over 25 years of experience in applying technology to emerging business challenges, and currently works with healthcare providers and hospitals to manage the evolving threat landscape of compliance, security, data loss and information governance within operational, regulatory and budgetary constraints.

Email Archive Saves the Day, Prevents eDiscovery Sanctions

Thursday, April 5th, 2012

The recent case of Danny Lynn Electrical v. Veolia Es Solid Waste (2012 WL 786843, March 9, 2012) showcases the value of an information archive from a compliance and eDiscovery perspective. In Danny Lynn Electrical the plaintiff sought sanctions against the defendant for the spoliation of electronic evidence, including the usual blend of monetary sanctions, adverse evidentiary inferences and the striking of affirmative defenses. Plaintiff argued that the defendant “blatantly disregarded their duty to preserve electronic information” by failing to implement an effective legal hold policy and deleting email after litigation began. In rejecting plaintiff’s claims, the court concluded that sanctions on the basis of spoliation of evidence were not warranted.

The court, in a harbinger of good things to come for the defendant, questioned “whether any spoliation of electronic evidence has actually occurred.” In finding that there wasn’t any spoliation, the court relied heavily on the fact that the defendant had recently deployed an email archive:

“[T]here is no evidence that any of the alleged emails, with the exception of the few that were accidentally deleted due to a computer virus or other unforseen [sic] circumstance, were permanently deleted from the defendants’ computer system. … VESNA began using a new software system which archives all emails on the VESNA network. Therefore, it is clear to the court that the defendant preserved email from its custodians in a backup or archive system.”

In combination with the deployed archive, the court also noted that plaintiff’s arguments were devoid of substantive evidence to support their spoliation claims:

“In order to impose sanctions against the defendants, this court ‘would have to substitute Plaintiffs’ speculation for actual proof that critical evidence was in fact lost or destroyed.’”

The rejection of plaintiff’s spoliation claims in Danny Lynn Electrical reinforces the long-held notion that information archives[i] have tremendous utility beyond the data management/minimization benefits that were the early drivers of archive adoption. This prophylactic, information governance benefit is particularly useful when the archive goes beyond email to additionally capture loose files, social media and other unstructured content.

As we said in 2011, organizations are already finding that other sources of electronically stored information (ESI) like documents/files and unstructured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This increasingly heterogeneous mix of ESI certainly results in challenges for many organizations, with some unlucky ones getting sanctioned (unlike the defendant in Danny Lynn Electrical) because they ignored these emerging data types.

The good news is that modern day archives have the ability to manage (preserve, categorize, defensibly delete, etc.) ESI from a wide range of sources beyond just email. Given cases like Danny Lynn Electrical, it’s increasingly a layup to build the business case for an archive project (assuming your organization doesn’t have one deployed already). Further pushing the archiving play to the top of the stack is the ability to deploy in the cloud, in addition to traditional on-premise deployments.

The Danny Lynn Electrical case also shows how an upstream, proactive information governance program can have an impact in the downstream, reactive eDiscovery context. It is the linking of these proactive and reactive concepts, yin and yang, where an end-to-end paradigm starts to fulfill the long-anticipated destiny of true information governance. As the explosion of data continues to mushroom unabated, it’s only this type of holistic information management regime that will keep eDiscovery chaos at bay.



[i] In the interests of full disclosure, Symantec offers both on-premise archiving and cloud archiving solutions. They are not the solutions referenced in the Danny Lynn Electrical case.

The eDiscovery “Passport”: The First Step to Succeeding in International Legal Disputes

Monday, April 2nd, 2012

The increase in globalization continues to erase borders throughout the world economy. Organizations now routinely conduct business in countries that were previously unknown to their industry vertical.  The trend of global integration is certain to increase, with reports such as the Ernst & Young 2011 Global Economic Survey confirming that 74% of companies believe that globalization, particularly in emerging markets, is essential to their continued vitality.

Not surprisingly, this trend of global integration has also led to a corresponding increase in cross-border litigation. For example, parties to U.S. litigation are increasingly seeking discovery of electronically stored information (ESI) from other litigants and third parties located in Continental Europe and the United Kingdom. Since traditional methods under the Federal Rules of Civil Procedure (FRCP) may be unacceptable for discovering ESI in those forums, the question then becomes how such information can be obtained.

At this point, many clients and their counsel are unaware of how to safely navigate these international waters. The short answer for how to address these issues for much of Europe would be to resort to the Hague Convention of March 18, 1970 on the Taking of Evidence Abroad in Civil or Commercial Matters (Hague Convention). Simply referring to the Hague Convention, however, would ignore the complexities of electronic discovery in Europe. Worse, it would sidestep the glaring knowledge gap that exists in the United States regarding the cultural differences distinguishing European litigation from American proceedings.

The ability to bridge this gap with an awareness of the discovery processes in Europe is essential. Understanding that process is similar to holding a valid passport for international travel. Just as a passport is required for travelers to successfully cross into foreign lands, an “eDiscovery Passport™” is likewise necessary for organizations to effectively conduct cross-border discovery.

The Playing Field for eDiscovery in Continental Europe

Litigation in Continental Europe is culturally distinct from American court proceedings. “Discovery,” as it is known in the United States, does not exist in Europe. Interrogatories, categorical document requests and requests for admissions are simply unavailable as European discovery devices. Instead, European countries generally allow only a limited exchange of documents, with parties typically disclosing only that information that supports their claims.

The U.S. Court of Appeals for the Seventh Circuit recently commented on this key distinction between European and American discovery when it observed that “the German legal system . . . does not authorize discovery in the sense of Rule 26 of the Federal Rules of Civil Procedure.” The court went on to explain that “[a] party to a German lawsuit cannot demand categories of documents from his opponent. All he can demand are documents that he is able to identify specifically—individually, not by category.” Heraeus Kulzer GmbH v. Biomet, Inc., 633 F.3d 591, 596 (7th Cir. 2011).

Another key distinction of discovery in Continental Europe is the lack of rules or case law requiring the preservation of ESI or paper documents. This stands in sharp contrast to American jurisprudence, which typically requires organizations to preserve information as soon as they reasonably anticipate litigation. See, e.g., Micron Technology, Inc. v. Rambus Inc., 645 F.3d 1311, 1320 (Fed.Cir. 2011). In Europe, while an implied preservation duty could arise if a court ordered the disclosure of certain materials, the penalties for European non-compliance are typically not as severe as those issued by American courts.

Only the nations of the United Kingdom, from which American notions of litigation are derived, have discovery obligations that are more similar to those in the United States. For example, in the combined legal system of England and Wales, a party must disclose to the other side information adverse to its claims. Moreover, the courts of England and Wales also suggest that parties should take affirmative steps to prepare for disclosure. According to the High Court in Earles v Barclays Bank Plc [2009] EWHC 2500 (Mercantile) (08 October 2009), this includes having “an efficient and effective information management system in place to provide identification, preservation, collection, processing, review analysis and production of its ESI in the disclosure process in litigation and regulation.” For organizations looking to better address these issues, a strategic and intelligent information governance plan offers perhaps the best chance to do so.

Hostility to International Discovery Requests

Despite some similarities between the U.S. and the U.K., Europe as a whole retains a certain amount of cultural hostility to pre-trial discovery. Given this fact, it should come as no surprise that international eDiscovery requests made pursuant to the Hague Convention are frequently denied. Requests are often rejected because they are overly broad.  In addition, some countries such as Italy simply refuse to honor requests for pre-trial discovery from common law countries like the United States. Moreover, other countries like Austria are not signatories to the Hague Convention and will not accept requests made pursuant to that treaty. To obtain ESI from those countries, litigants must take their chances with the cumbersome and time-consuming process of submitting letters rogatory through the U.S. State Department. Finally, requests for information that seek email or other “personal information” (i.e., information that could be used to identify a person) must additionally satisfy a patchwork of strict European data protection rules.

Obtaining an eDiscovery Passport

This backdrop of complexity underscores the need for both lawyers and laymen to understand the basic principles governing eDisclosure in Europe. Such a task should not be seen as daunting. There are resources that provide straightforward answers to these issues at no cost to the end-user. For example, Symantec has just released a series of eDiscovery Passports™ that touch on the basic issues underlying disclosure and data privacy in the United Kingdom, France, Germany, Holland, Belgium, Austria, Switzerland, Italy and Spain. Organizations such as The Sedona Conference have also made available materials that provide significant detail on these issues, including its recently released International Principles on Discovery, Disclosure and Data Protection.

These resources can provide valuable information to clients and counsel alike and better prepare litigants for the challenges of pursuing legal rights across international boundaries. By so doing, organizations can moderate the effects of legal risk and more confidently pursue their globalization objectives.