Archive for the ‘e-discovery trends’ Category

Kleen Products Update: Is Technology Usage Becoming the New “Proportionality” Factor for Judges?

Wednesday, October 30th, 2013

Readers may recall last year’s expensive battle over the use of predictive coding technology in the 7th Circuit’s Kleen Products case. Although the battle was temporarily resolved in Defendants’ favor (they were not required to redo their production using predictive coding or other “Content Based Advanced Analytics” software), a new eDiscovery battle has surfaced this year between Plaintiffs and a non-party, The Levin Group (“TLG”).

In Kleen, Plaintiffs allege anticompetitive and collusive conduct by a number of companies in the containerboard industry. The Plaintiffs served TLG with a subpoena requesting “every document relating to the containerboard industry.” TLG, a non-party retained as a financial and strategic consultant by two of the Defendants, complied by reviewing 21,000 documents comprising 82,000 pages of material.

Extraordinary Billing Rates for Manual Review?

The wheels began to fall off the bus when Plaintiffs received a $55,000 bill from TLG for the review and production of documents in response to the subpoena. TLG billed $500/hour for 110 hours of document review performed by TLG’s founder (a lawyer) and a non-lawyer employee. Although FRCP 45(c)(3)(C) authorizes “reasonable compensation” of a subpoenaed nonparty and the Court previously ordered the Plaintiffs to “bear the costs of their discovery request,” TLG and the Plaintiffs disagreed over the definition of “reasonable compensation” once the production was complete. Plaintiffs argue that the bill is excessive in light of market rates of $35-$45/hour charged by contract attorneys for document review, and they also claim that they never agreed to a billing rate.

Following a great deal of back and forth about the costs, the court decided to defer its decision until December 16, 2013 because discovery in the underlying antitrust action is still ongoing. Regardless of the outcome in Kleen, the current dispute feels a bit like déjà vu all over again. Both disputes highlight the importance of cooperation and the role of technology in reducing eDiscovery costs. For example, better cooperation among the parties during earlier stages of discovery might have helped prevent, or at least minimize, some of the downstream post-production arguments that occurred last year and this year. Although the “cooperation” drum has been beaten loudly for several years by judges and think tanks like the Sedona Conference, cooperation is an issue that will never fully disappear in an adversarial system.

Judges May Increasingly Consider Technology as Part of Proportionality Analysis

A more novel and interesting eDiscovery issue in Kleen relates to the fact that judges are increasingly being asked to consider the use (or non-use) of technology when resolving discovery disputes. Last year in Kleen the issue was whether a producing party should be required to use advanced technology to ensure a more thorough production. This year the Kleen court may be asked to consider the role of technology in the context of the disputed document review fees. For example, the court may consider whether TLG could have leveraged de-duplication, domain filtering, document threading or other tools in the Litigator’s Toolbelt™ to reduce the number of documents requiring costly manual review.
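As a rough illustration of how such culling tools shrink a review set, the sketch below removes exact duplicates by content hash and filters out messages from irrelevant sender domains before any document reaches a human reviewer. The `body` and `sender` field names are hypothetical, and real platforms use far more sophisticated near-duplicate detection; this is a minimal sketch only.

```python
import hashlib

def cull_documents(docs, excluded_domains):
    """Cull a document set with two simple passes: hash-based
    de-duplication, then sender-domain filtering."""
    seen = set()
    kept = []
    for doc in docs:
        # De-duplication: identical bodies produce identical digests.
        digest = hashlib.sha256(doc["body"].encode("utf-8")).hexdigest()
        if digest in seen:
            continue
        seen.add(digest)
        # Domain filtering: skip senders from clearly irrelevant domains.
        domain = doc["sender"].rsplit("@", 1)[-1].lower()
        if domain in excluded_domains:
            continue
        kept.append(doc)
    return kept
```

Every document removed by a pass like this is a document nobody has to review at $500 (or even $35) an hour.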

Recent trends indicate that the federal bench is increasingly under pressure to consider whether, and how, parties utilize technology when resolving eDiscovery disputes. For example, a 2011 Forbes article titled “Will New Electronic Discovery Rules Save Organizations Millions or Deny Justice?” framed early discussions about amending the Federal Rules of Civil Procedure (Rules) as follows:

“A key question that many feel has been overlooked is whether or not organizations claiming significant eDiscovery costs could have reduced those costs had they invested in better technology solutions. Most agree that technology alone cannot solve the problem or completely eliminate costs. However, many also believe that understanding the extent to which the inefficient or non-use of modern eDiscovery technology solutions impacts overall costs is critical to evaluating whether better practices might be needed instead of new Rules.”

Significant interest in the topic was further sparked in Da Silva Moore v. Publicis Groupe in 2012 when Judge Andrew Peck put parties on notice that technology is increasingly important in evaluating eDiscovery disputes. In Da Silva Moore, Judge Peck famously declared that “computer-assisted review is acceptable in appropriate cases.” Judge Peck’s decision was the first to squarely address the use of predictive coding technology, and a number of cases, articles, and blogs on the topic quickly ensued in what seemed to be the opening of Pandora’s Box with respect to the technology discussion.

More recently, The Duke Law Center for Judicial Studies proposed that the Advisory Committee on Civil Rules add language to the newly proposed amendments to the Federal Rules of Civil Procedure addressing the use of technology-assisted review (TAR). The group advocates adding the following sentence at the end of the first paragraph of the Committee Note to proposed Rule 26(b)(1) dealing with “proportionality” in eDiscovery:

“As part of the proportionality considerations, parties are encouraged, in appropriate cases, to consider the use of advanced analytical software applications and other technologies that can screen for relevant and privileged documents in ways that are at least as accurate as manual review, at far less cost.”

Conclusion

The significant role technology plays in managing eDiscovery risks and costs continues to draw more and more attention from lawyers and judges alike. Although early disputes in Kleen highlight the fact that litigators do not always agree on what technology should be used in eDiscovery, most in the legal community recognize that many technology tools in the Litigator’s Toolbelt™ are available to help reduce the costs of eDiscovery. Regardless of how the court in Kleen resolves the current issue, the use or non-use of technology tools is likely to become a central issue in the Rules debate and a prominent factor in most judges’ proportionality analysis in the future.

*Blog post co-authored by Matt Nelson and Adam Kuhn

Moving Data to the Cloud? Top 5 Tips for Corporate Legal Departments

Monday, September 30th, 2013

One of the hottest information technology (IT) trends is to move data once stored within the corporate firewall into a hosted cloud environment managed by third-party providers. In 2013 alone, the public cloud services market is forecast to grow an astonishing 18.5 percent to $131 billion worldwide, up from $111 billion in 2012. The trend is driven largely by the fact that labor, infrastructure, and software costs can be reduced by sending email and other data to third-party providers for off-site hosting. Although the benefits of cloud computing are real, many organizations make the decision to move to the cloud without thoroughly weighing all the risks and benefits first.

A common problem is that many corporate IT departments fail to consult with their legal department before making the decision to move company data into the cloud even though the decision may have legal consequences. For example, retrieving information from the cloud in response to eDiscovery requests presents unique challenges that could cause delay and increase costs. Similarly, problems related to data access and even ownership could also arise if the cloud provider merges with another company, goes bankrupt, or decides to change its terms of service.

The bad news is that the list of possible challenges is long when data is stored in the cloud. The good news is that many of the risks of moving important company data to the cloud are foreseeable and can be mitigated by negotiating terms with prospective cloud providers in advance. Although not comprehensive, the following highlights some of the key areas attorneys should consider before agreeing to store company data with third-party cloud providers.

1.      Who Owns the Data?

The cloud market is hot right now with no immediate end in sight. That means competition is likely for years to come and counsel must be prepared for some market volatility. At a high level, delineating the “customer” as the “data owner” in agreements with cloud service providers is a must. More specifically, terms that address what happens in the event of a merger, bankruptcy, divestiture or any event that leads to the termination or alteration of the relationship should be clearly outlined. Identifying the customer as the data owner and preserving the right to retrieve data within a reasonable time and for a reasonable price will help prevent company data from being used as a bargaining chip if there is a disagreement or change in ownership.

2.      eDiscovery Response Times and Capabilities

Many businesses know first-hand that the costs and burdens of complying with eDiscovery requests are significant. In fact, a recent RAND study estimates that every gigabyte of data reviewed costs approximately $18,000. Surprisingly, many businesses do not realize that storing data in the cloud with the wrong provider could increase the risks and costs of eDiscovery significantly. Risks include the possibility of sanctions for overlooking information that should have been produced or failing to produce information in a timely fashion. Costs could be exacerbated by storing data with a cloud provider that lacks the resources or technology to respond to eDiscovery requests efficiently.
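To see why provider capabilities matter at that price point, here is a back-of-the-envelope calculation using the $18,000-per-gigabyte review figure cited above. The culling rate is an assumed, illustrative input, not a number from the study.

```python
REVIEW_COST_PER_GB = 18_000  # approximate per-GB review cost cited from the RAND study

def estimated_review_cost(gigabytes, culling_rate=0.0):
    """Rough review-cost estimate after culling away a fraction of
    the data before review (culling_rate is an assumed input)."""
    remaining_gb = gigabytes * (1.0 - culling_rate)
    return remaining_gb * REVIEW_COST_PER_GB

# A hypothetical 100 GB collection reviewed as-is versus after 60% culling:
full_cost = estimated_review_cost(100)                    # $1.8 million
culled_cost = estimated_review_cost(100, culling_rate=0.6)  # $720,000
```

Even at modest data volumes, a provider that cannot cull or search efficiently can add six or seven figures to a matter.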

That means counsel must understand whether or not the provider has the technology to preserve, collect, and produce copies of data stored in the cloud. If so, is the search technology accurate and thorough? What are the time frames for responding to requests and are there surcharges for expedited versus non-expedited requests for data? Also, is data collected in a forensically sound manner and is the chain of custody recorded to validate the integrity and reasonableness of the process in the event of legal challenges? More than one cloud customer has encountered excessive fees, unacceptable timelines, and mass confusion when relying on cloud providers to help respond to eDiscovery requests. Avoid surprises by negotiating acceptable terms before, not after, moving company data to the cloud.

3.      Where is Data Physically Located?

Knowing where your data is physically located is important because there could be legal consequences. For example, several jurisdictions have their own unique privacy laws that may impact where employee data can be physically located, how it must be stored, and how it can be used. This can result in conflicts where data stored in the cloud is subject to discovery in one jurisdiction, but disclosure is prohibited by the laws of the jurisdiction where the data is stored. Failure to comply with these foreign laws, sometimes known as blocking statutes, could result in penalties and challenges that might not be circumvented by choice of law provisions. That means knowing where your data will be stored and understanding the applicable laws governing data privacy in that jurisdiction is critical.

4.      Retention, Backup, & Security

Part of any good data retention program requires systematically deleting information that the business does not have a legal or business need to retain. Keeping information longer than necessary increases long-term storage costs and increases the amount of information the organization must search in the event of future litigation. The cloud provider should have the ability to automate the archiving, retention, and disposition of information in accordance with the customer’s preferred policies as well as the ability to suspend any automated deletion policies during litigation.
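The suspend-deletion-during-litigation requirement can be sketched as a simple rule: messages past the retention period are eligible for disposition unless their custodian is on legal hold. The field names below are hypothetical; real archiving platforms implement this policy internally.

```python
from datetime import datetime, timedelta

def messages_to_expire(messages, retention_days, legal_hold_custodians, now=None):
    """Return messages eligible for automated disposition: older than
    the retention period AND not belonging to a custodian on legal hold."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    return [
        m for m in messages
        if m["received"] < cutoff and m["custodian"] not in legal_hold_custodians
    ]
```

The key design point is that the legal-hold check overrides the retention clock; a provider whose deletion jobs cannot be suspended this way exposes the customer to spoliation risk.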

Similarly, where and how information is backed up and secured is critical. For many organizations, merely losing access to email for a few hours brings productivity to a screeching halt. Actually losing the company email due to technical problems and/or as the result of insufficient backup technology could cripple some companies indefinitely. Likewise, losing confidential customer data, research and development plans, or other sensitive information due to the lack of adequate data security and encryption technology could result in legal penalties and/or the loss of important intellectual property. Understanding how your data is backed up and secured is critical to choosing a cloud provider. Equally important is determining the consequences of and the process for handling data breaches, losses, and downtime if something goes wrong.

5.      Responding to Subpoenas and Third-party Data Requests

Sometimes it’s not just criminals who try to take your data out of the cloud; litigants and investigators might also request information directly from cloud providers without your knowledge or consent. Obviously companies have a vested interest in vetting the reasonableness and legality of any data requests coming from third parties. That interest does not change when data is stored in the cloud. Knowing how your cloud provider will respond to third-party requests for data and obtaining written assurances that you will be notified of requests as appropriate is critical to protecting intellectual property and defending against bogus claims. Furthermore, making sure data in the cloud is encrypted may provide an added safeguard if data somehow slips through the cracks without your permission.

Conclusion

Today’s era of technological innovation moves at lightning speed and cloud computing technology is no exception. This fast-paced environment sometimes results in organizations making the important decision to move data to the cloud without properly assessing all the potential risks. In order to minimize these risks, organizations should consider the five tips above and consult with legal counsel before moving data to the cloud.

Breaking News: Court Orders Google to Produce eDiscovery Search Terms in Apple v. Samsung

Friday, May 10th, 2013

Apple obtained a narrow discovery victory yesterday in its long running legal battle against fellow technology titan Samsung. In Apple Inc. v. Samsung Electronics Co. Ltd, the court ordered non-party Google to turn over the search terms and custodians that it used to produce documents in response to an Apple subpoena.

According to the court’s order, Apple argued for the production of Google’s search terms and custodians in order “to know how Google created the universe from which it produced documents.” The court noted that Apple sought such information “to evaluate the adequacy of Google’s search, and if it finds that search wanting, it then will pursue other courses of action to obtain responsive discovery.”

Google countered that argument by defending the extent of its production and the burdens that Apple’s request would place on Google as a non-party to Apple’s dispute with Samsung. Google complained that Apple’s demands were essentially a gateway to additional discovery from Google, which would arguably be excessive given Google’s non-party status.

Sensitive to the concerns of both parties, the court struck a middle ground in its order. On the one hand, the court ordered Google to produce the search terms and custodians since that “will aid in uncovering the sufficiency of Google’s production and serves greater purposes of transparency in discovery.” But on the other hand, the court preserved Google’s right to object to any further discovery efforts by Apple: “The court notes that its order does not speak to the sufficiency of Google’s production nor to any arguments Google may make regarding undue burden in producing any further discovery.”

This latest opinion from the Apple v. Samsung series of lawsuits is noteworthy for two reasons. First, the decision is instructive regarding the eDiscovery burdens that non-parties must shoulder in litigation. While the disclosure of a non-party’s underlying search methodology (in this instance, search terms and custodians) may not be unduly burdensome, further efforts to obtain non-party documents could exceed the boundaries of reasonableness that courts have designed to protect non-parties from the vicissitudes of discovery. For as the court in this case observed, a non-party “should not be required to ‘subsidize’ litigation to which it is not a party.”

Second, the decision illustrates that the use of search terms remains a viable method for searching and producing responsive ESI. Despite the increasing popularity of predictive coding technology, it is noteworthy that neither the court nor Apple took issue with Google’s use of search terms in connection with its production process. Indeed, the intelligent use of keyword searches is still an acceptable eDiscovery approach for most courts, particularly where the parties agree on the terms. That other forms of technology assisted review, such as predictive coding, could arguably be more efficient and cost effective in identifying responsive documents does not impugn the use of keyword searches in eDiscovery. Only time will tell whether the use of keyword searches as the primary means for responding to document requests will give way to more flexible approaches that include the use of multiple technology tools.
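A keyword screen of the kind at issue in the Google production can be sketched in a few lines. This is a toy illustration only; production platforms use indexed, stemmed Boolean search rather than naive substring matching.

```python
def responsive_docs(docs, keywords):
    """Flag a document as responsive if any agreed-upon search term
    appears in it (case-insensitive substring match)."""
    terms = [k.lower() for k in keywords]
    return [d for d in docs if any(t in d.lower() for t in terms)]
```

Part of what made Google's terms worth fighting over is visible even in this sketch: the output universe is entirely determined by the term list, so a narrow or poorly chosen list silently excludes responsive material.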

New Gartner Report Spotlights Significance of Email Archiving for Defensible Deletion

Thursday, November 1st, 2012

Gartner recently released a report that spotlights the importance of using email archiving as part of an organization’s defensible deletion strategy. The report – Best Practices for Using Email Archiving to Eliminate PST and Mailbox Quota Headaches (Alan Dayley, September 21, 2012) – specifically focuses on the information retention and eDiscovery challenges associated with email storage on Microsoft Exchange and how email archiving software can help address these issues. As Gartner makes clear in its report, an archiving solution can provide genuine opportunities to reduce the costs and risks of email hoarding.

The Problem: PST Files

The primary challenge that many organizations are experiencing with Microsoft Exchange email is the unchecked growth of messages stored in personal storage table (PST) files. Used to bypass storage quotas on Exchange, PST files are problematic because they increase the costs and risks of eDiscovery while circumventing information retention policies.

That the unrestrained growth of PST files could create problems downstream for organizations should come as no surprise. Various court decisions have addressed this issue, with the DuPont v. Kolon Industries litigation foremost among them. In the DuPont case, a $919 million verdict and 20-year product injunction largely stemmed from the defendant’s inability to prevent the destruction of thousands of pages of email formerly stored in PST files. That spoliation resulted in an adverse inference instruction to the jury and the ensuing verdict against the defendant.

The Solution: Eradicate PSTs with the Help of Archiving Software and Retention Policies

To address the PST problem, Gartner suggests following a three-step process to help manage and then eradicate PSTs from the organization. This includes educating end users regarding both the perils of PSTs and the ease of access to email through archiving software. It also involves disabling the creation of new PSTs, a process that should ultimately culminate with the elimination of existing PSTs.

In connection with this process, Gartner suggests deployment of archiving software with a “PST management tool” to facilitate the eradication process. With the assistance of the archiving tool, existing PSTs can be discovered and migrated into the archive’s central data repository. Once there, email retention policies can begin to expire stale, useless and even harmful messages that were formerly outside the company’s information retention framework.
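The discovery step of that migration can be sketched as a simple file-share inventory. A real PST management tool also opens and ingests the files into the archive; this sketch only locates them and reports their size.

```python
import os

def find_pst_files(root):
    """Walk a directory tree and inventory stray PST files so they
    can be migrated into the archive's central repository."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".pst"):
                path = os.path.join(dirpath, name)
                found.append((path, os.path.getsize(path)))
    return found
```

In practice the inventory step alone is often eye-opening, since it surfaces years of quota-dodging email that retention policies never touched.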

With respect to the development of retention policies, organizations should consider engaging in a cooperative internal process involving IT, compliance, legal and business units. These key stakeholders must be engaged and collaborate if workable policies are to be created. The actual retention periods should take into account the types of email generated and received by an organization, along with the enterprise’s business, industry and litigation profile.

To ensure successful implementation of such retention policies and also address the problem of PSTs, an organization should explore whether an on-premises or cloud archiving solution is a better fit for its environment. While each method has its advantages, Gartner advises organizations to consider whether certain key features are included with a particular offering:

Email classification. The archiving tool should allow your organization to classify and tag the emails in accordance with your retention policy definitions, including user-selected, user/group, or keyword tagging.

User access to archived email. The tool must also give end users appropriate and user-friendly access to their archived email, thus eliminating concerns over their inability to manage their email storage with PSTs.

Legal and information discovery capabilities. The search, indexing, and e-discovery capabilities of the archiving tool should also match your needs or enable integration into corporate e-discovery systems.
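The keyword-tagging variant of email classification mentioned above might look like this in miniature. The rule table mapping retention categories to trigger terms is an invented example, not a vendor feature.

```python
def tag_email(subject, body, tag_rules):
    """Tag an email with every retention category whose trigger
    terms appear in the subject or body (case-insensitive)."""
    text = f"{subject}\n{body}".lower()
    return sorted(tag for tag, terms in tag_rules.items()
                  if any(t.lower() in text for t in terms))
```

Once each message carries tags like these, retention periods can be applied per category rather than as a single blanket policy.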

While perhaps not a panacea for the storage and eDiscovery problems associated with email, on-premises or cloud archiving software should provide various benefits to organizations. Indeed, such technologies have the potential to help organizations store, manage and discover their email efficiently, cost-effectively and in a defensible manner. Where properly deployed and fully implemented, such solutions should enable organizations to reduce the nettlesome costs and risks connected with email.

FOIA Matters! — 2012 Information Governance Survey Results for the Government Sector

Thursday, July 12th, 2012

At this year’s EDGE Summit in April, Symantec polled attendees about a range of government-specific information governance questions. The attendees of the event were comprised primarily of IT and legal personnel, as well as Freedom of Information Act (FOIA) agents, government investigators and records managers. The main purpose of the EDGE survey was to gather attendees’ thoughts on what information governance means for their agencies, discern what actions were being taken to address Big Data challenges, and assess how far along agencies were in their information governance implementations pursuant to the recent Presidential Mandate.

As my colleague Matt Nelson’s blog recounts from the LegalTech conference earlier this year, information governance and predictive coding were among the hottest topics at the LTNY 2012 show and in the industry generally. The EDGE Summit correspondingly held sessions on those two topics, and delved deeper into questions that are unique to the government. For example, when asked what the top driver for implementation of an information governance plan in an agency was, three out of four respondents answered “FOIA.”

The fact that FOIA was listed as the top driver for government agencies planning to implement an information governance solution is in line with data reported by the Department of Justice (DOJ) from 2008-2011 on the number of requests received. In 2008, 605,491 FOIA requests were received. This figure grew to 644,165 in 2011. While the increase in FOIA requests is not enormous percentage-wise, what is significant is the reduction in backlogs for FOIA requests. In 2008, there was a backlog of 130,419 requests, which decreased to 83,490 by 2011. This is likely due to the implementation of newer and better technology, coupled with the fact that the current administration has made FOIA request processing a priority.
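The percentages behind those figures are easy to verify from the DOJ numbers cited above:

```python
def pct_change(old, new):
    """Percent change between two counts."""
    return (new - old) / old * 100.0

# FOIA requests received, 2008 -> 2011: a roughly 6% increase.
requests_growth = pct_change(605_491, 644_165)

# Request backlog, 2008 -> 2011: a roughly 36% decrease.
backlog_change = pct_change(130_419, 83_490)
```

A 6% rise in demand alongside a 36% drop in backlog is the clearest sign in the data that agency processing capacity, not request volume, is what changed.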

In 2009, President Obama directed agencies to adopt “a presumption in favor” of FOIA requests for greater transparency in the government. Agencies have faced pressure from the President to improve the response time to (and completeness of) FOIA requests. Washington Post reporter Ed O’Keefe wrote,

“a study by the National Security Archive at George Washington University and the Knight Foundation, found approximately 90 federal agencies are equipped to process FOIA requests, and of those 90, only slightly more than half have taken at least some steps to fulfill Obama’s goal to improve government transparency.”

Agencies are increasingly more focused on complying with FOIA and will continue to improve their IT environments with archiving, eDiscovery and other proactive records management solutions in order to increase access to data.

Not far behind FOIA requests on the list of reasons to implement an information governance plan were “lawsuits” and “internal investigations.” Fortunately, any comprehensive information governance plan will necessarily address FOIA requests, since the technology implemented to accomplish information governance inherently allows for the storage, identification, collection, review and production of data regardless of the specific purpose. A FOIA request will not follow the same workflow or process that an internal investigation would require, for example, but the tools required are the same.

The survey also found that the top three most important activities surrounding information governance were: email/records retention (73%), data security/privacy (73%) and data storage (72%). These concerns are being addressed modularly by agencies with technology like data classification services, archiving, and data loss prevention technologies. In-house eDiscovery tools are also important as they facilitate the redaction of personally identifiable information that must be removed in many FOIA requests.

It is clear that agencies recognize the importance of managing email/records for FOIA purposes, an area of concern in light of not only the data explosion but also the fact that 53% of respondents reported they are responsible for classifying their own data. Respondents have connected the concept of information governance with records management and the ability to execute more effectively on FOIA requests. Manual classification is rapidly becoming obsolete as data volumes grow, and is being replaced by automated solutions in successfully deployed information governance plans.

Perhaps the most interesting piece of data from the survey was the disclosures about what was preventing governmental agencies from implementing information governance plans. The top inhibitors for the government were “budget,” “internal consensus” and “lack of internal skill sets.” Contrasted with the LegalTech Survey findings from 2012 on information governance, with respondents predominantly from the private sector, the government’s concerns and implementation timelines are slightly different. In the EDGE survey, only 16% of the government respondents reported that they have implemented an information governance solution, contrasted with the 19% of the LegalTech audience. This disparity is partly because the government lacks the budget and the proper internal committee of stakeholders to sponsor and deploy a plan, but the relatively low numbers in both sectors indicate the nascent state of information governance.

In order for a successful information governance plan to be deployed, “it takes a village,” to quote Secretary Clinton. Without prioritizing coordination between IT, legal, records managers, security, and the other necessary departments on data management, merely having the budget only purchases the technology and does not ensure true governance. In this year’s survey, 95% of EDGE respondents were actively discussing information governance solutions. Over the next two years the percentage of agencies that will deploy a solution is expected to more than triple, from 16% to 52%. With the directive on records management due this month from the National Archives and Records Administration (NARA), government agencies will have clear guidance on best practices for records management, and this will aid the adoption of automated archiving and records classification workflows.

The future is bright with the initiative by the President and NARA’s anticipated directive to examine the state of technology in the government. The EDGE survey results support the forecast, provided budget can be obtained, that agencies will be in an improved state of information governance within the next two years. This should improve FOIA request compliance, make litigation with the government more efficient, and increase agencies’ ability to conduct internal investigations effectively.

Many would have projected that the results of the survey question on what drives information governance in the government would be litigation, internal investigations, and FOIA requests respectively. And yet, FOIA has recently taken on a more important role given the Obama administration’s focus on transparency and the increased number of requests by citizens. While any one of the drivers could have facilitated updates in process and technology the government clearly needs, FOIA has positive momentum behind it and seems to be the impetus primarily driving information governance. Fortunately, archiving and eDiscovery technology, only two parts of the information governance continuum, can help with all three of the aforementioned drivers with different workflows.

Later this month we will examine NARA’s directive and what the impact will be on the government’s technology environment – stay tuned.

Gartner’s 2012 Magic Quadrant for E-Discovery Software Looks to Information Governance as the Future

Monday, June 18th, 2012

Gartner recently released its 2012 Magic Quadrant for E-Discovery Software, which is its annual report analyzing the state of the electronic discovery industry. Many vendors in the Magic Quadrant (MQ) may initially focus on their position and the juxtaposition of their competitive neighbors along the Visionary – Execution axis. While a very useful exercise, there are also a number of additional nuggets in the MQ, particularly regarding Gartner’s overview of the market, anticipated rates of consolidation and future market direction.

Context

For those of us who’ve been around the eDiscovery industry since its infancy, it’s gratifying to see the electronic discovery industry mature.  As Gartner concludes, the promise of this industry isn’t off in the future, it’s now:

“E-discovery is now a well-established fact in the legal and judicial worlds. … The growth of the e-discovery market is thus inevitable, as is the acceptance of technological assistance, even in professions with long-standing paper traditions.”

The past wasn’t always so rosy, particularly when the market was dominated by hundreds of service providers that seemed to hold on by maintaining a few key relationships, combined with relatively high margins.

“The market was once characterized by many small providers and some large ones, mostly employed indirectly by law firms, rather than directly by corporations. …  Purchasing decisions frequently reflected long-standing trusted relationships, which meant that even a small book of business was profitable to providers and the effects of customary market forces were muted. Providers were able to subsist on one or two large law firms or corporate clients.”

Consolidation

The Magic Quadrant correctly notes that these “salad days” just weren’t feasible long term. Gartner sees the pace of consolidation heating up even further, with some players striking it rich and some going home empty handed.

“We expect that 2012 and 2013 will see many of these providers cease to exist as independent entities for one reason or another — by means of merger or acquisition, or business failure. This is a market in which differentiation is difficult and technology competence, business model rejuvenation or size are now required for survival. … The e-discovery software market is in a phase of high growth, increasing maturity and inevitable consolidation.”

Navigating these treacherous waters isn't easy for eDiscovery providers, nor is it simple for customers to make purchasing decisions when they're rightly concerned that the solution they buy today won't be around tomorrow. Yet, despite forecasting an inevitable shakeout (Gartner predicts the market will shrink 25% in the raw number of firms claiming eDiscovery products/services), the firm remains very bullish about the sector.

“Gartner estimates that the enterprise e-discovery software market came to $1 billion in total software vendor revenue in 2010. The five-year CAGR to 2015 is approximately 16%.”
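As a quick check on those figures, a 16% compound annual growth rate applied to a $1 billion base for five years implies a market of roughly $2.1 billion by 2015:

```python
# Project market size under compound annual growth.
# Figures from the post: $1B base (2010), 16% CAGR through 2015.
def project(base: float, cagr: float, years: int) -> float:
    return base * (1 + cagr) ** years

size_2015 = project(1.0, 0.16, 5)  # in billions of dollars
print(f"Implied 2015 market size: ${size_2015:.2f}B")  # ≈ $2.10B
```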

This certainly means there’s a window of opportunity for certain players – particularly those who help larger players fill out their EDRM suite of offerings, since the best of breed era is quickly going by the wayside.  Gartner notes that end-to-end functionality is now table stakes in the eDiscovery space.

“We have seen a large upsurge in user requests for full-spectrum EDRM functionality. Whether that functionality will be used initially, or at all, remains an open question. Corporate buyers do seem minded to future-proof their investments in this way, by anticipating what they may wish to do with the software and the vendor in the future.”

Information Governance

Not surprisingly, it’s this “full-spectrum” functionality that most closely aligns with marrying the reactive, right side of the EDRM with the proactive, left side.  In concert, this yin and yang is referred to as information governance, and it’s this notion that’s increasingly driving buying behaviors.

“It is clear from our inquiry service that the desire to bring e-discovery under control by bringing data under control with retention management is a strategy that both legal and IT departments pursue in order to control cost and reduce risks. Sometimes the archiving solution precedes the e-discovery solution, and sometimes it follows it, but Gartner clients that feel the most comfortable with their e-discovery processes and most in control of their data are those that have put archiving systems in place …”

As Gartner looks out five years, the analyst firm anticipates more progress on the information governance front, because the “entire e-discovery industry is founded on a pile of largely redundant, outdated and trivial data.”  At some point this digital landfill is going to burst and organizations are finally realizing that if they don’t act now, it may be too late.

“During the past 10 to 15 years, corporations and individuals have allowed this data to accumulate for the simple reason that it was easy — if not necessarily inexpensive — to do so. … E-discovery has proved to be a huge motivation for companies to rethink their information management policies. The problem of determining what is relevant from a mass of information will not be solved quickly, but with a clear business driver (e-discovery) and an undeniable return on investment (deleting data that is no longer required for legal or business purposes can save millions of dollars in storage costs) there is hope for the future.”

 

The Gartner Magic Quadrant for E-Discovery Software is insightful for a number of reasons, not the least of which is how it portrays the developing maturity of the electronic discovery space. In just a few short years, the niche has sprouted wings, raced to $1B and is seeing massive consolidation. As we enter the next phase of maturation, we’ll likely see the sector morph into a larger, information governance play, given customers’ “full-spectrum” functionality requirements and the presence of larger, mainstream software companies.  Next on the horizon is the subsuming of eDiscovery into both the bigger information governance umbrella, as well as other larger adjacent plays like “enterprise information archiving, enterprise content management, enterprise search and content analytics.” The rapid maturation of the eDiscovery industry will inevitably result in growing pains for vendors and practitioners alike, but in the end we’ll all benefit.

 

About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Kleen Products Predictive Coding Update – Judge Nolan: “I am a believer of principle 6 of Sedona”

Tuesday, June 5th, 2012

Recent transcripts reveal that 7th Circuit Magistrate Judge Nan Nolan has urged the parties in Kleen Products, LLC, et. al. v. Packaging Corporation of America, et. al. to focus on developing a mutually agreeable keyword search strategy for eDiscovery instead of debating whether other search and review methodologies would yield better results. This is big news for litigators and others in the electronic discovery space because many perceived Kleen Products as potentially putting keyword search technology on trial, compared to newer technology like predictive coding. Considering keyword search technology is still widely used in eDiscovery, a ruling by Judge Nolan requiring defendants to redo part of their production using technology other than keyword searches would sound alarm bells for many litigators.

The controversy surrounding Kleen Products relates both to Plaintiffs’ position, as well as the status of discovery in the case. Plaintiffs initially asked Judge Nolan to order Defendants to redo their previous productions and all future productions using alternative technology.  The request was surprising to many observers because some Defendants had already spent thousands of hours reviewing and producing in excess of one million documents. That number has since surpassed three million documents.  Among other things, Plaintiffs claim that if Defendants had used “Content Based Advanced Analytics” tools (a term they did not define) such as predictive coding technology, then their production would have been more thorough. Notably, Plaintiffs do not appear to point to any instances of specific documents missing from Defendants’ productions.

In response, Defendants countered that their use of keyword search technology, and their eDiscovery methodology in general, was extremely rigorous and thorough. More specifically, they highlighted their use of advanced culling and analysis tools (such as domain filtering and email threading) in addition to keyword search tools. Defendants also claim they cooperated with Plaintiffs by allowing them to participate in the selection of the keywords used to search for relevant documents. Perhaps going above and beyond the eDiscovery norm, Defendants even instituted a detailed document sampling approach designed to measure the quality of their document productions.
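The post doesn't describe the Defendants' sampling protocol, but the general idea behind production quality sampling can be sketched as follows: draw a random sample from the documents coded non-responsive and estimate the rate at which responsive documents were missed. The data and numbers below are purely illustrative, not drawn from the case:

```python
import random

def estimate_miss_rate(coded_nonresponsive, is_responsive, sample_size, seed=42):
    """Estimate the fraction of responsive documents incorrectly coded
    non-responsive by re-reviewing a random sample. `is_responsive`
    stands in for a human reviewer's judgment on re-review."""
    rng = random.Random(seed)
    sample = rng.sample(coded_nonresponsive, sample_size)
    misses = sum(1 for doc in sample if is_responsive(doc))
    return misses / sample_size

# Illustrative data: 10,000 discarded docs, 2% of which were actually responsive.
docs = [{"id": i, "responsive": (i % 50 == 0)} for i in range(10_000)]
rate = estimate_miss_rate(docs, lambda d: d["responsive"], sample_size=500)
print(f"Estimated miss rate: {rate:.1%}")
```

With a large enough sample, an estimate like this lets a producing party defend the quality of its review with numbers rather than assertions.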

Following two full days of expert witness testimony regarding the adequacy of Defendants’ initial productions, Judge Nolan finally asked the parties to try to reach a compromise on the “Boolean” keyword approach.  She apparently reasoned that having the parties work out a mutually agreeable approach based on what Defendants had already implemented was preferable to scheduling yet another full day of expert testimony — even though additional expert testimony is still an option.
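For readers unfamiliar with the terminology, a “Boolean” keyword approach simply combines search terms with AND/OR/NOT operators. A toy illustration (the query terms are hypothetical, not drawn from the case):

```python
def matches(text: str, all_of=(), any_of=(), none_of=()) -> bool:
    """Toy Boolean keyword matcher: (AND-terms) AND (OR-terms) AND NOT (excluded)."""
    t = text.lower()
    return (all(term in t for term in all_of)
            and (not any_of or any(term in t for term in any_of))
            and not any(term in t for term in none_of))

# Hypothetical query: containerboard AND (price OR pricing) AND NOT newsletter
doc = "Re: containerboard price increase effective Q3"
print(matches(doc, all_of=("containerboard",), any_of=("price", "pricing"),
              none_of=("newsletter",)))  # True
```

Negotiating a keyword protocol amounts to agreeing on lists like `all_of`, `any_of` and `none_of` above, which is why courts can push parties to compromise on terms rather than on technology.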

In a nod to the Sedona Principles, she further explained her rationale on March 28, 2012, at the conclusion of the second day of testimony:

“the defendants had done a lot of work, the defendant under Sedona 6 has the right to pick the [eDiscovery] method. Now, we all know, every court in the country has used Boolean search, I mean, this is not like some freak thing that they [Defendants] picked out…”

Judge Nolan’s reliance on the Sedona Best Practices Recommendations & Principles for Addressing Electronic Document Production reveals how she would likely rule if Plaintiffs renew their position that Defendants should have used predictive coding or some other kind of technology in lieu of keyword searches. Sedona Principle 6 states that:

“[r]esponding parties are best situated to evaluate the procedures, methodologies, and technologies appropriate for preserving and producing their own electronically stored information.”

In other words, Judge Nolan confirmed that in her court, opposing parties typically may not dictate what technology solutions their opponents must use without some indication that the technology or process used failed to yield accurate results. Judge Nolan also observed that quality and accuracy are key guideposts regardless of the technology utilized during the eDiscovery process:

“what I was learning from the two days, and this is something no other court in the country has really done too, is how important it is to have quality search. I mean, if we want to use the term ‘quality’ or ‘accurate,’ but we all want this… how do you verify the work that you have done already, is the way I put it.”

Although Plaintiffs have reserved their right to reintroduce their technology arguments, recent transcripts suggest that Defendants will not be required to use different technology. Plaintiffs continue to meet and confer with individual Defendants to agree on keyword searches, as well as the types of data sources that must be included in the collection. The parties and Judge also appear to agree that they would like to continue making progress with 30(b)(6) depositions and other eDiscovery issues before Judge Nolan retires in a few months, rather than begin a third day of expert hearings regarding technology related issues. This appears to be good news for the Judge and the parties since the eDiscovery issues now seem to be headed in the right direction as a result of mutual cooperation between the parties and some nudging by Judge Nolan.

There is also good news for outside observers in that Judge Nolan has provided some sage guidance to help future litigants before she steps down from the bench. First, it is clear that Judge Nolan and other judges continue to emphasize the importance of cooperation in today’s complex new world of technology. Parties should be prepared to cooperate and be more transparent during discovery given the judiciary’s increased reliance on the Sedona Cooperation Proclamation. Second, Kleen Products illustrates that keyword search is not dead. Instead, keyword search should be viewed as one of many tools in the Litigator’s Toolbelt™ that can be used alongside other tools such as email threading, advanced filtering technology, and even predictive coding. Finally, litigators should take note that regardless of the tools they select, they must be prepared to defend their process and use of those tools or risk the scrutiny of judges and opposing parties.

Gartner’s “2012 Magic Quadrant for E-Discovery Software” Provides a Useful Roadmap for Legal Technologists

Tuesday, May 29th, 2012

Gartner has just released its 2012 Magic Quadrant for E-Discovery Software, an annual report that analyzes the state of the electronic discovery industry and provides a detailed vendor-by-vendor evaluation. For many, particularly those in IT circles, Gartner is an unwavering north star used to divine software market leaders in topics ranging from business intelligence platforms to wireless LAN infrastructure. When IT professionals are on the cusp of procuring complex software, they look to analysts like Gartner for quantifiable and objective recommendations as a way to inform and buttress their own internal decision-making processes.

But for some in the legal technology field (particularly attorneys), looking to Gartner for software analysis can seem a bit foreign. Legal practitioners are often more comfortable with the “good ole days” when the only navigation aid in the eDiscovery world was provided by the dynamic duo of George Socha and Tom Gelbmann, who (beyond creating the EDRM) were pioneers of the first eDiscovery rankings survey. Albeit somewhat short-lived, their Annual Electronic Discovery[i] Survey ranked the hundreds of eDiscovery providers and bucketed the top-tier players in both software and litigation support categories. The scope of their mission was grand, and they were perhaps ultimately undone by the breadth of their task (they stopped the Survey in 2010), particularly as the eDiscovery landscape continued to mature, fragment and evolve.

Gartner, which has perfected the analysis of emerging software markets, appears to have taken on this challenge with an admittedly more narrow (and likely more achievable) focus. Gartner published its first Magic Quadrant (MQ) for the eDiscovery industry last year, and in the 2012 Magic Quadrant for E-Discovery Software report they’ve evaluated the top 21 electronic discovery software vendors. As with all Gartner MQs, their methodology is rigorous; in order to be included, vendors must meet quantitative requirements in market penetration and customer base and are then evaluated upon criteria for completeness of vision and ability to execute.

By eliminating the legion of service providers and law firms, Gartner has made its mission both more achievable and perhaps (to some) less relevant. When talking to certain law firms and litigation support providers, some seem to treat the Gartner initiative (and subsequent Magic Quadrant) like a map from a land they never plan to visit. But even if they’re not directly procuring eDiscovery software, legal technologists should still see the Gartner MQ as an invaluable tool for navigating the perils of the often confusing and shifting eDiscovery landscape – particularly given the rash of recent M&A activity.

Beyond the quadrant positions[ii], comprehensive analysis and secular market trends, one of the key underpinnings of the Magic Quadrant is that the ultimate position of a given provider is in many ways an aggregate measurement of overall customer satisfaction. Similar to the net promoter concept (a tool that gauges the loyalty of a firm’s customer relationships simply by asking how likely a customer is to recommend a product/service to a colleague), the Gartner MQ can be viewed as the sum total of all customer experiences.[iii] As such, this usage/satisfaction feedback is relevant even for parties that aren’t purchasing or deploying electronic discovery software per se. Outside counsel, partners, litigation support vendors and other interested parties may all end up interacting with a deployed eDiscovery solution (particularly as such solutions expand their reach as end-to-end information governance platforms), and they should want their chosen solution to be used happily and seamlessly in a given enterprise. There’s no shortage of stories about unhappy outside counsel, for example, who complain about being hamstrung by a slow, first-generation eDiscovery solution that ultimately makes their job harder (and riskier).
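The net promoter concept mentioned above reduces to a simple formula: the percentage of promoters (those rating 9–10 on a 0–10 likelihood-to-recommend scale) minus the percentage of detractors (those rating 0–6). A quick sketch with made-up ratings:

```python
def net_promoter_score(ratings):
    """NPS = %promoters (ratings 9-10) minus %detractors (ratings 0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Two promoters (10, 9) and two detractors (6, 3) cancel out across six raters.
print(net_promoter_score([10, 9, 8, 7, 6, 3]))  # 0.0
```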

Next, the Gartner MQ is also a good shorthand way to understand more nuanced topics like time to value and total cost of ownership. While related to overall satisfaction, the Magic Quadrant indirectly addresses whether the software does what it says it will (delivering on the promise) within the time frame claimed (delivering it reasonably quickly), since these elements are typically subsumed in the satisfaction metric. This kind of detail surfaces in the numerous behind-the-scenes interviews Gartner conducts to query usage and overall satisfaction.

While no navigation aid ensures that a traveler won’t get lost, the Gartner Magic Quadrant for E-Discovery Software is a useful map of the electronic discovery software world. And, particularly looking at year-over-year trends, the MQ provides a useful way for legal practitioners (beyond the typical IT users) to get a sense of the electronic discovery market landscape as it evolves and matures. After all, staying on top of the eDiscovery industry has a range of benefits beyond just software procurement.

Please register here to access the Gartner Magic Quadrant for E-Discovery Software.

About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.



[i] Note, in the good ole days folks still used two words to describe eDiscovery.

[ii] Gartner has a proprietary matrix that it uses to place the entities into four quadrants: Leaders, Challengers, Visionaries and Niche Players.

[iii] Under the Ability to Execute axis Gartner weighs a number of factors including “Customer Experience: Relationships, products and services or programs that enable clients to succeed with the products evaluated. Specifically, this criterion includes implementation experience, and the ways customers receive technical support or account support. It can also include ancillary tools, the existence and quality of customer support programs, availability of user groups, service-level agreements and so on.”

7th Circuit eDiscovery Pilot Program Tackles Technology Assisted Review With Mock Arguments

Tuesday, May 22nd, 2012

The 7th Circuit eDiscovery Pilot Program’s Mock Argument is the first of its kind and is slated for June 14, 2012.  It is not surprising that the Seventh Circuit’s eDiscovery Pilot Program would be the first to host an event like this on predictive coding, as the program has been a progressive model across the country for eDiscovery protocols since 2009.  The predictive coding event is open to the public (registration required) and showcases the expertise of leading litigators, technologists and experts from all over the United States.  Speakers include: Jason R. Baron, Director of Litigation at the National Archives and Records Administration; Maura R. Grossman, Counsel at Wachtell, Lipton, Rosen & Katz; Dr. David Lewis, Technology Expert and co-founder of the TREC Legal Track; Ralph Losey, Partner at Jackson Lewis; Matt Nelson, eDiscovery Counsel at Symantec; Lisa Rosen, President of Rosen Technology Resources; Jeff Sharer, Partner at Sidley Austin; and Tomas Thompson, Senior Associate at DLA Piper.

The eDiscovery 2.0 blog has extensively covered the three recent predictive coding cases currently being litigated, and while real court cases are paramount to the direction of predictive coding, the 7th Circuit program will proactively address a scenario that has not yet been considered by a court.  In Da Silva Moore, the parties agreed to the use of predictive coding, but could not subsequently agree on the protocol.  In Kleen, plaintiffs want defendants to redo their review process using predictive coding even though the production is 99% complete.  And in Global Aerospace, the defendant proactively petitioned to use predictive coding over plaintiff’s objections.  By contrast, the 7th Circuit’s hypothetical presents another likely predictive coding scenario: the instance where a defendant already has an in-house solution deployed and argues against the use of predictive coding before discovery has begun.

Traditionally, courts have been reluctant to bless or condemn particular technologies, preferring instead to rule on the reasonableness of an organization’s process and to rely on expert testimony for issues beyond that scope.  Predictive coding is expected to follow suit; however, because so little is understood about how the technology works, it has generated interest in a way the legal technology industry has not seen before, as evidenced by this tactical program.

* * *

The hypothetical dispute is a complex litigation matter pending in a U.S. District Court involving a large public corporation that has been sued by a smaller high-tech competitor for alleged anticompetitive conduct, unfair competition and various business torts.  The plaintiff has filed discovery requests that include documents and communications maintained by the defendant corporation’s vast international sales force.  To expedite discovery and level the playing field in terms of resources and costs, the Plaintiff has requested the use of predictive coding to identify and produce responsive documents.  The defendant, wary of the latest (and untested) eDiscovery technology trends, argues that the organization already has a comprehensive eDiscovery program in place.  The defendant will further argue that the technological investment and defensible processes in-house are more than sufficient for comprehensive discovery, and in fact, were designed in order to implement a repeatable and defensible discovery program.  The methodology of the defendant is estimated to take months and result in the typical massive production set, whereas predictive coding would allegedly make for a shorter discovery period.  Because of the burden, the defendant plans to shift some of these costs to the plaintiff.

Ralph Losey’s role will be as the Magistrate Judge, defense counsel will be Martin T. Tully (partner Katten Muchin Rosenman LLP), with Karl Schieneman (of Review Less/ESI Bytes) as the litigation support manager for the corporation and plaintiff’s counsel will be Sean Byrne (eDiscovery solutions director at Axiom) with Herb Roitblat (of OrcaTec) as plaintiff’s eDiscovery consultant.

As the hottest topic in the eDiscovery world, the promises of predictive coding include: increased search accuracy for relevant documents, decreased cost and time spent for manual review, and possibly greater insight into an organization’s corpus of data allowing for more strategic decision making with regard to early case assessment.  The practical implications of predictive coding use are still to be determined and programs like this one will flesh out some of those issues before they get to the courts, which is good for practitioners and judges alike.  Stay tuned for an analysis of the arguments, as well as a link to the video.

Courts Increasingly Cognizant of eDiscovery Burdens, Reject “Gotcha” Sanctions Demands

Friday, May 18th, 2012

Courts are becoming increasingly cognizant of the eDiscovery burdens that the information explosion has placed on organizations. Indeed, the cases from 2012 are piling up in which courts have rejected demands that sanctions be imposed for seemingly reasonable information retention practices. The recent case of Grabenstein v. Arrow Electronics (D. Colo. April 23, 2012) is another notable instance of this trend.

In Grabenstein, the court refused to sanction a company for eliminating emails pursuant to a good faith document retention policy. The plaintiff had argued that drastic sanctions (evidence preclusion, an adverse inference instruction and monetary penalties) should be imposed on the company since relevant emails regarding her alleged disability were not retained, in violation of both its eDiscovery duties and an EEOC regulatory retention obligation. The court disagreed, finding that sanctions were inappropriate because the emails were not deleted before the duty to preserve was triggered: “Plaintiff has not provided any evidence that Defendant deleted e-mails after the litigation hold was imposed.”

Furthermore, the court declined to issue sanctions of any kind even though it found that the company deleted emails in violation of its EEOC regulatory retention duty. The court adopted this seemingly incongruous position because the emails were overwritten pursuant to a reasonable document retention policy:

“there is no evidence to show that the e-mails were destroyed in other than the normal course of business pursuant to Defendant’s e-mail retention policy or that Defendant intended to withhold unfavorable information from Plaintiff.”

The Grabenstein case reinforces the principle that reasonable information retention and eDiscovery processes can and often do trump sanctions requests. Just like the defendant in Grabenstein, organizations should develop and follow a retention policy that eliminates data stockpiles before litigation is reasonably anticipated. Grabenstein also demonstrates the value of deploying a timely and comprehensive litigation hold process to ensure that relevant electronically stored information (ESI) is retained once a preservation duty is triggered. These principles are consistent with various other recent cases, including a decision last month in which pharmaceutical giant Pfizer defeated a sanctions motion by relying on its “good faith business procedures” to eliminate legacy materials before a duty to preserve arose.

The Grabenstein holding also spotlights the role that proportionality can play in determining the extent of a party’s preservation duties. The Grabenstein court reasoned that sanctions would be inappropriate since plaintiff managed to obtain the destroyed emails from an alternative source. Without expressly mentioning “proportionality,” the court implicitly drew on Federal Rule of Civil Procedure 26(b)(2)(C) to reach its “no harm, no foul” approach to plaintiff’s sanctions request. Rule 26(b)(2)(C)(i) empowers a court to limit discovery when it is “unreasonably cumulative or duplicative, or can be obtained from some other source that is more convenient, less burdensome, or less expensive.” Given that plaintiff actually had the emails in question and there was no evidence suggesting other ESI had been destroyed, proportionality standards tipped the scales against the sanctions request.

The Grabenstein holding is good news for organizations looking to reduce their eDiscovery costs and burdens. By refusing to accede to a tenuous sanctions motion and by following principles of proportionality, the court sustained reasonableness over “gotcha” eDiscovery tactics. If courts adhere to the Grabenstein mantra that preservation and production should be reasonable and proportional, organizations truly stand a better chance of seeing their litigation costs and burdens reduced accordingly.