
Archive for the ‘early case analysis’ Category

The Gartner 2013 Magic Quadrant for eDiscovery Software is Out!

Wednesday, June 12th, 2013

This week marks the release of the 3rd annual Gartner Magic Quadrant for e-Discovery Software report. In the early days of eDiscovery, most companies outsourced almost every sizeable project to vendors and law firms, so eDiscovery software was barely a blip on the radar screen for technology analysts. Fast forward a few years to an era of explosive information growth and rising eDiscovery costs, and the landscape has changed significantly. Today, much of the outsourced eDiscovery “services” business has been replaced by eDiscovery software solutions that organizations bring in house to reduce risk and cost. As a result, the enterprise eDiscovery software market is forecast to grow from $1.4 billion in total software revenue worldwide in 2012 to $2.9 billion by 2017. (See “Forecast: Enterprise E-Discovery Software, Worldwide, 2012 – 2017,” Tom Eid, December 2012.)

Not surprisingly, today’s rapidly growing eDiscovery software market has become significant enough to catch the attention of mainstream analysts like Gartner. This is good news for company lawyers who are used to delegating enterprise software decisions to IT departments and outside law firms, because today those same lawyers are involved in eDiscovery and other information management software purchasing decisions for their organizations. While these lawyers understand the company’s legal requirements, they do not necessarily understand how to choose the best technology to address those requirements. Conversely, IT representatives understand enterprise software, but they do not necessarily understand the law. Gartner bridges this information gap by providing in-depth and independent analysis of the top eDiscovery software solutions in the form of the Gartner Magic Quadrant for e-Discovery Software.

Gartner’s methodology for preparing the annual Magic Quadrant report is rigorous. Providers must meet quantitative requirements such as revenue and significant market penetration to be included in the report. If these threshold requirements are met, Gartner probes deeper by meeting with company representatives, interviewing customers, and soliciting feedback to written questions. Providers that make the cut are evaluated across the four Magic Quadrant categories as “leaders,” “challengers,” “niche players,” or “visionaries.” Where each provider ends up on the quadrant is guided by an independent evaluation of each provider’s “ability to execute” and “completeness of vision.” Landing in the “leaders” quadrant is considered a top recognition.

The nine Leaders in this year’s Magic Quadrant share four primary characteristics.

The first is functionality that spans both sides of the electronic discovery reference model (EDRM): the left side (identification, preservation, litigation hold, collection, early case assessment (ECA) and processing) and the right side (processing, review, analysis and production). “While Gartner recognizes that not all enterprises — or even the majority — will want to perform legal-review work in-house, more and more are dictating what review tools will be used by their outside counsel or legal-service providers. As practitioners become more sophisticated, they are demanding that data change hands as little as possible, to reduce cost and risk. This is a continuation of a trend we saw developing last year, and it has grown again in importance, as evidenced both by inquiries from Gartner clients and reports from vendors about the priorities of current and prospective customers.”

We see this as consistent with the theme that providers with archiving solutions designed to automate data retention and destruction policies generally fared better than those without archiving technology. The rationale is that part of a good end-to-end eDiscovery strategy includes proactively deleting data organizations do not have a legal or business need to keep. This approach decreases the amount of downstream electronically stored information (ESI) organizations must review on a case-by-case basis so the cost savings can be significant.

Not surprisingly, whether or not a provider offers technology assisted review or predictive coding capabilities was another factor in evaluating each provider’s end-to-end functionality. The industry has witnessed a surge in predictive coding case law since 2012 and judicial interest has helped drive this momentum. However, a key driver for implementing predictive coding technology is the ability to reduce the amount of ESI attorneys need to review on a case-by-case basis. Given the fact that attorney review is the most expensive phase of the eDiscovery process, many organizations are complementing their proactive information reduction (archiving) strategy with a case-by-case information reduction plan that also includes predictive coding.

The second characteristic Gartner considered was that Leaders’ business models clearly demonstrate that their focus is software development and sales, as opposed to the provision of services. Gartner acknowledged that the eDiscovery services market is strong, but explains that the purpose of the Magic Quadrant is to evaluate software, not services. The justification is that “[c]orporate buyers and even law firms are trending towards taking as much e-Discovery process in house as they can, for risk management and cost control reasons. In addition, the vendor landscape for services in this area is consolidating. A strong software offering, which can be exploited for growth and especially profitability, is what Gartner looked for and evaluated.”

Third, Gartner believes the solution provider market is shrinking and that corporations are becoming more involved in buying decisions instead of deferring technology decisions to their outside law firms. Therefore, those in the Leaders category were expected to illustrate a good mix of corporate and law firm buying centers. The rationale behind this category is that law firms often help influence corporate buying decisions so both are important players in the buying cycle. However, Gartner also highlighted that vendors who get the majority of their revenues from the “legal solution provider channel” or directly from “law firms” may soon face problems.

The final characteristic Gartner considered for the Leaders quadrant is related to financial performance and growth. In measuring this component, Gartner explained that a number of factors were considered. Primary among them is whether the Leaders are keeping pace with or even exceeding overall market growth. (See “Forecast:  Enterprise E-Discovery Software, Worldwide, 2012 – 2017,” Tom Eid, December, 2012).

Companies landing in Gartner’s Magic Quadrant for eDiscovery Software have reason to celebrate their position in an increasingly competitive market. To review Gartner’s full report yourself, click here. In the meantime, please feel free to share your own comments below as the industry anxiously awaits next year’s Magic Quadrant Report.

Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

New Gartner Report Spotlights Significance of Email Archiving for Defensible Deletion

Thursday, November 1st, 2012

Gartner recently released a report that spotlights the importance of using email archiving as part of an organization’s defensible deletion strategy. The report – Best Practices for Using Email Archiving to Eliminate PST and Mailbox Quota Headaches (Alan Dayley, September 21, 2012) – specifically focuses on the information retention and eDiscovery challenges associated with email storage on Microsoft Exchange and how email archiving software can help address these issues. As Gartner makes clear in its report, an archiving solution can provide genuine opportunities to reduce the costs and risks of email hoarding.

The Problem: PST Files

The primary challenge that many organizations are experiencing with Microsoft Exchange email is the unchecked growth of messages stored in personal storage table (PST) files. Used to bypass storage quotas on Exchange, PST files are problematic because they increase the costs and risks of eDiscovery while circumventing information retention policies.

That the unrestrained growth of PST files could create problems downstream for organizations should come as no surprise. Various court decisions have addressed this issue, with the DuPont v. Kolon Industries litigation foremost among them. In the DuPont case, a $919 million verdict and 20-year product injunction largely stemmed from the defendant’s inability to prevent the destruction of thousands of pages of email formerly stored in PST files. That spoliation resulted in an adverse inference instruction to the jury and the ensuing verdict against the defendant.

The Solution: Eradicate PSTs with the Help of Archiving Software and Retention Policies

To address the PST problem, Gartner suggests following a three-step process to help manage and then eradicate PSTs from the organization. This includes educating end users regarding both the perils of PSTs and the ease of access to email through archiving software. It also involves disabling the creation of new PSTs, a process that should ultimately culminate with the elimination of existing PSTs.

In connection with this process, Gartner suggests deployment of archiving software with a “PST management tool” to facilitate the eradication process. With the assistance of the archiving tool, existing PSTs can be discovered and migrated into the archive’s central data repository. Once there, email retention policies can begin to expire stale, useless and even harmful messages that were formerly outside the company’s information retention framework.
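
To make the discovery step concrete, here is a minimal sketch (purely illustrative) of what scanning a file system for orphaned PST files might look like. Real PST management tools bundled with archiving products go much further, handling locked files, network shares and the actual migration into the archive’s central repository.

```python
# Illustrative only: a minimal sketch of the "discover existing PSTs" step.
# Real PST management tools also handle locked files, network shares and
# the migration of each PST into the archive's central data repository.
import os

def find_pst_files(root_dir):
    """Walk a directory tree and report every .pst file found with its size."""
    results = []
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            if name.lower().endswith(".pst"):
                path = os.path.join(dirpath, name)
                try:
                    size_mb = os.path.getsize(path) / (1024 * 1024)
                except OSError:
                    continue  # file may be locked or inaccessible
                results.append((path, size_mb))
    return results

if __name__ == "__main__":
    for path, size_mb in find_pst_files(r"C:\Users"):
        print(f"{path}  ({size_mb:.1f} MB)")
```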

With respect to the development of retention policies, organizations should consider engaging in a cooperative internal process involving IT, compliance, legal and business units. These key stakeholders must be engaged and collaborate if workable policies are to be created. The actual retention periods should take into account the types of email generated and received by an organization, along with the enterprise’s business, industry and litigation profile.

To ensure successful implementation of such retention policies and also address the problem of PSTs, an organization should explore whether an on-premises or cloud archiving solution is a better fit for its environment. While each method has its advantages, Gartner advises organizations to consider whether certain key features are included with a particular offering:

Email classification. The archiving tool should allow your organization to classify and tag emails in accordance with your retention policy definitions, including user-selected, user/group, or keyword tagging (a simple illustration follows this list).

User access to archived email. The tool must also give end users appropriate and user-friendly access to their archived email, thus eliminating concerns over their inability to manage their email storage with PSTs.

Legal and information discovery capabilities. The search, indexing, and e-discovery capabilities of the archiving tool should also match your needs or enable integration into corporate e-discovery systems.
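
As referenced in the classification bullet above, a minimal, purely illustrative sketch of rule-based tagging follows. The categories, keywords and retention periods are hypothetical; real archiving tools layer user, group and policy-driven classification on top of simple rules like these.

```python
# Illustrative only: keyword-based tagging against hypothetical retention
# categories. Real archiving tools add user, group and policy-driven
# classification on top of simple rules like these.
RETENTION_RULES = [
    # (category, retention_years, keywords that trigger the category)
    ("legal-hold", None, ["litigation hold", "subpoena", "preservation notice"]),
    ("finance", 7, ["invoice", "purchase order", "expense report"]),
    ("hr", 5, ["offer letter", "performance review"]),
]
DEFAULT_CATEGORY = ("general", 2)

def classify_email(subject, body):
    """Return (category, retention_years) for the first matching rule."""
    text = f"{subject} {body}".lower()
    for category, years, keywords in RETENTION_RULES:
        if any(keyword in text for keyword in keywords):
            return category, years
    return DEFAULT_CATEGORY

print(classify_email("Q3 invoice attached", "Please process the attached invoice."))
# -> ('finance', 7)
```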

While perhaps not a panacea for the storage and eDiscovery problems associated with email, on-premises or cloud archiving software should provide various benefits to organizations. Indeed, such technologies have the potential to help organizations store, manage and discover their email efficiently, cost effectively and in a defensible manner. Where archiving is properly deployed and fully implemented, organizations should be able to reduce the nettlesome costs and risks connected with email.

eDiscovery Down Under: New Zealand and Australia Are Not as Different as They Sound, Mate!

Thursday, March 29th, 2012

Shortly after arriving in Wellington, New Zealand, I picked up the Dominion Post newspaper and read its lead article: a story involving U.S. jurisdiction being exercised over billionaire NZ resident Mr. Kim Dotcom. The article reinforced the blurred legal and data governance challenges presented by the globalization of the economy and the expansive reach of the internet. Originally from Germany, and having changed his surname to reflect the origin of his fortune, Mr. Dotcom has become all too familiar in NZ of late. He recently purchased two opulent homes in NZ and has become an internationally controversial figure for internet piracy. Mr. Dotcom’s legal troubles arise out of his internet business, which enables illegal downloads of pirated material between users and allegedly powers the largest copyright infringement in global history. His website is estimated to account for 4% of the internet traffic in the world, which means there could be tons of discovery in this case (or cases).

The most recent legal problems Mr. Dotcom faces are with U.S. authorities, who want to extradite him to face charges over an estimated $500 million in copyright infringement attributed to his Megaupload file-sharing website. From a criminal and record-keeping standpoint, Mr. Dotcom’s issues highlight the need for and use of appropriate technologies. In order to establish a case against him, it’s likely that search technologies were deployed by U.S. intelligence agencies to piece together Mr. Dotcom’s activities, banking information, emails and the data transfers on his site. In a case like this, where intelligence agencies would need to collect, search and cull email from so many different geographies and data sources down to just the relevant information, technologies that link email conversation threads and provide transparent insight into a data collection set would provide immense value. Additionally, the Immigration bureau in New Zealand has been required to release hundreds of documents about Mr. Dotcom’s residency application that were requested under the Official Information Act (OIA). The records that Immigration had to produce were likely pulled from their archive or records management system in NZ, and then redacted for private information before production to the public.

The same tools we use here in the U.S. for investigations, compliance and litigation are needed in Australia and New Zealand to build a criminal case or to comply with the OIA. Information governance technology adoption in APAC is being led by government agencies, which are purchasing archiving and eDiscovery technologies more rapidly than private companies. Why is this? One reason could be that because governments in APAC have a larger responsibility for healthcare, education and the protection of privacy, they are more invested in compliance requirements and in staying off the front page of the news for shortcomings. APAC private enterprises that are small or mid-sized and are not yet doing international business do not have the same archiving and eDiscovery needs large government agencies do, nor do they face litigation in the same way their American counterparts do. Large global companies, no matter where they are based, should assume they may be subject to litigation wherever they do business.

An interesting NZ use case on the enterprise level is that of Transpower (the quasi-governmental energy agency), where compliance with both “private and public” requirements is mandatory. Transpower is an organisation that is government-owned, yet operates for a profit. Sally Myles, an experienced records manager who recently came to Transpower to head up information governance initiatives, says,

“We have to comply with the Public Records Act of 2005, and public requests for information are frequent, as we are under constant scrutiny about where we will develop our plants. We also must comply with the Privacy Act of 1993. My challenge is to get the attention of our leadership to demonstrate why we need to make these changes and show them a plan for implementation as well as cost savings.”

Myles’ comments indicate NZ is facing many of the same information challenges we are here in the US with storage, records management and searching for meaningful information within the organisation.

Australia, New Zealand and U.S. Commonalities

In Australia and NZ, litigation is not seen as a compelling business driver the way it is in the U.S. This is because many of the information governance needs of organisations are driven by regulatory, statutory and compliance requirements, and the environment is not as litigious as it is in the U.S. The Official Information Act in NZ and the Freedom of Information Act in Australia are analogous to the Freedom of Information Act (FOIA) here in the U.S. The requirements to produce public records alone justify the use of technology to manage large volumes of data and produce appropriately redacted information to the public. This is true regardless of litigation. Additionally, cases like DuPont and Mr. Dotcom’s legitimize the risk of litigation involving the U.S. The fact that implementing an information governance product suite will also enable a company to be prepared for litigation is a beneficial by-product for many entities, as they need technology for record keeping and privacy reasons anyway. In essence, the same capabilities are achieved at the end of the day, regardless of the impetus for implementing a solution.

The Royal Commission – The Ultimate eDiscovery Vehicle

One way to think about an Australian Royal Commission (RC) is as a version of a U.S. government investigation. A key difference, however, is that a U.S. government investigation is typically into private companies, whereas a Royal Commission is typically an investigation into a government body after a major tragedy and is initiated by the Head of State. An RC is an ad-hoc, formal, public inquiry into a defined issue with considerable discovery powers. These powers can be greater than those of a judge, but are restricted to the scope and terms of reference of the Commission. RCs are called to look into matters of great importance and usually have very large budgets. The RC is charged with researching the issue, consulting experts both within and outside of government and developing findings to recommend changes to the law or other courses of action. RCs have immense investigatory powers, including summoning witnesses under oath, offering indemnities, seizing documents and other evidence (sometimes including those normally protected, such as classified information), holding hearings in camera if necessary and—in a few cases—compelling government officials to aid in the execution of the Commission.

These expansive powers give the RC the opportunity to employ state-of-the-art technology and to skip the slow bureaucratic decision-making processes found within the government when it comes to implementing technological change. For this reason, eDiscovery will initially continue to increase in the government sector at a more rapid pace than in the private sector in the Asia Pacific region. This is because litigation is less prevalent in the Asia Pacific, and because the RC is a unique investigatory vehicle with the most far-reaching authority for discovering information. Moreover, the timeframes for RCs are tight and their scopes are broad, making them hair-on-fire situations that move quickly.

While the APAC information management environment does not have the exact same drivers the U.S. market does, it definitely has the same archiving, eDiscovery and technology needs for different reasons. Another key point is that the APAC archiving and eDiscovery market will likely be driven by the government as records, search and production requirements are the main compliance needs in Australia and NZ. APAC organisations would be well served by beginning to modularly implement key elements of an information governance plan, as globalization is driving us all to a more common and automated approach to data management. 

Big Data Decisions Ahead: Government-Sponsored Town Hall Meeting for eDiscovery Industry Coincides With Federal Agency Deadline

Wednesday, February 29th, 2012

Update For Report Submission By Agencies

We are fast approaching the March 27, 2012 deadline for federal agencies to submit their reports to the Office of Management and Budget and the National Archives and Records Administration (NARA) to comply with the Presidential Mandate on records management. And we are only at the inception: a public town hall meeting in Washington, D.C. is also scheduled for March 27, 2012. This meeting is primarily focused on gathering input from the public sector community, the vendor/IT community, and members of the public at large. Ultimately, NARA will issue a directive that will outline a centralized approach for the federal government for managing records and eDiscovery.

Agencies have been tight-lipped about how far along they are in the process of evaluating their workflows and tools for managing their information (both electronic and paper). There is, however, some empirical data from an InformationWeek survey conducted last year that takes the temperature of where top IT professionals within the government have their sights set, and the Presidential Mandate should bring some of these concerns to the forefront of the reports. For example, the #1 business driver for migrating to the cloud – cited by 62% of respondents – was cost, while 77% of respondents said their biggest concern was security. Nonetheless, 46% were still highly likely to migrate to a private cloud.

Additionally, as part of the Federal Data Center Consolidation Initiative, agencies are looking to eliminate 800 data centers. While the cost savings are clear, from an information governance viewpoint it’s hard not to ask what the government plans to do with all of those records. Clearly, this shift, should it happen, will force the government into a more service-based management approach, as opposed to the traditional asset-based management approach. Some agencies have already migrated to the cloud, which is squarely in line with the Opex-over-Capex approach emerging for efficiency and cost savings.

Political Climate Unknown

Another major concern that will affect any decisions or policy implementation within the government is, not surprisingly, politics. Luckily, regardless of political party affiliation, it seems to be broadly agreed that the combination of IT spend in Washington, D.C. and the government’s slow move to properly manage electronic records is a problem. Two of the many examples of the problem are the inability to issue effective litigation holds and the inability to respond to Freedom of Information Act (FOIA) requests in a timely and complete manner. Even so, the political agenda of the Republican party may affect the prioritization of the Democratic President’s mandate, and efforts could be derailed by a potential change in administration.

Given the election year and the heavy analysis required to produce the report, there is a sentiment in Washington that all of this work may be for naught if the appropriate resources cannot be secured and then allocated to effectuate the recommendations. The reality is that data is growing at an unprecedented rate, and the need for the intelligent management of information is no longer deniable. The long-term effects of putting this overhaul on the back burner could be disastrous. The government needs a modular plan and a solid budget to address the problem now, as it is already behind.

VanRoekel’s Information Governance

One issue Democrats and Republicans are unlikely to agree on in accomplishing the mandate is the almighty budget, and the technology the government must purchase in order to make the necessary technological changes is going to cost a pretty penny. Steven VanRoekel, the Federal CIO, stated upon the release of the $78.8 billion FY 2013 IT budget:

“We are also making cyber security a cross-agency, cross-government priority goal this year. We have done a good job in ramping up on cyber capabilities agency-by-agency, and as we come together around this goal, we will hold the whole of government accountable for cyber capabilities and examine threats in a holistic way.”

His quote indicates the priority from the top down of evaluating IT holistically, which dovetails nicely with the presidential mandate since security and records management are only two parts of the entire information governance picture. Each agency still has its own work cut out for it across the EDRM. One of the most pressing issues in the upcoming reports will be what each agency decides to bring in-house or to continue outsourcing. This decision will in part depend on whether the inefficiencies identified lead agencies to conclude that they can perform those functions for less money and more efficiently than their contractors. In evaluating their present capabilities, each agency will need to look at what workflows and technologies they currently have deployed across divisions, what they presently outsource, and what the marketplace potentially offers them today to address their challenges.

This question is central because it raises an all-important question about information governance itself. Information governance inherently implies that an organization or government controls most or all aspects of the EDRM model in order to derive the benefits of security, storage, records management and eDiscovery capabilities. Presently, the government is outsourcing many of its litigation services to third-party companies that have essentially become de facto government agencies. This is partly due to scalability issues, and partly because the resources and technologies deployed in-house within these agencies are inadequate to properly execute a robust information governance plan.

Conclusion

The ideal scenario for each government agency to comply with the mandate would be to deploy automated classification for records management, archiving with expiration appropriately implemented for more than just email, and finally, some level of eDiscovery capability in order to conduct early case assessment and easily produce data for FOIA. The level of early case assessment needed by each agency will vary, but the general idea is that the scope of an investigation or matter could be determined in-house before contacting a third party to conduct data collection. All things considered, the question remains whether the Obama administration will foot this bill or whether we will have to wait for a bigger price tag later down the road. Either way, the government will have to come up to speed and make these changes eventually, and the town hall meeting should be an accurate thermometer of where the government stands.

Judge Peck Issues Order Addressing “Joint Predictive Coding Protocol” in Da Silva Moore eDiscovery Case

Thursday, February 23rd, 2012

Litigation attorneys were abuzz last week when a few breaking news stories erroneously reported that The Honorable Andrew J. Peck, United States Magistrate Judge for the Southern District of New York, ordered the parties in a gender discrimination case to use predictive coding technology during discovery. Despite early reports, the parties in the case (Da Silva Moore v. Publicis Groupe, et al.) actually agreed to use predictive coding technology during discovery – apparently of their own accord. The case is still significant because predictive coding technology in eDiscovery is relatively new to the legal field, and many have been reluctant to embrace a new technological approach to document review due to, among other things, a lack of judicial guidance.

Unfortunately, despite this atmosphere of cooperation, the discussion stalled when the parties realized they were miles apart in terms of defining a mutually agreeable predictive coding protocol.  A February status conference transcript reveals significant confusion and complexity related to issues such as random sampling, quality control testing, and the overall process integrity.  In response, Judge Peck ordered the parties to submit a Joint Protocol for eDiscovery to address eDiscovery generally and the use of predictive coding technology specifically.

The parties submitted their proposed protocol on February 22, 2012 and Judge Peck quickly reduced that submission to a stipulation and order.  The stipulation and order certainly provides more clarity and insight into the process than the status conference transcript.  However, reading the stipulation and order leaves little doubt that the devil is in the details – and there are a lot of details.  Equally clear is the fact that the parties are still in disagreement and the plaintiffs do not support the “joint” protocol laid out in the stipulation and order.  Plaintiffs actually go so far as to incorporate a paragraph into the stipulation and order stating that they “object to this ESI Protocol in its entirety” and they “reserve the right to object to its use in the case.”

These problems underscore some of the points made in a Forbes article published earlier this week titled, “Federal Judges Consider Important Issues That Could Shape the Future of Predictive Coding Technology.” The Forbes article relies in part on a recent predictive coding survey to make the point that, while predictive coding technology has tremendous potential, the solutions need to become more transparent and the workflows must be simplified before they go mainstream.

2012: Year of the Dragon – and Predictive Coding. Will the eDiscovery Landscape Be Forever Changed?

Monday, January 23rd, 2012

2012 is the Year of the Dragon – which is fitting, since no other Chinese Zodiac sign represents the promise, challenge, and evolution of predictive coding technology more than the Dragon.  The few who have embraced predictive coding technology exemplify symbolic traits of the Dragon, including being unafraid of challenges and willing to take risks.  Taking risks typically isn’t in a lawyer’s DNA, which might explain why predictive coding technology has seen lackluster adoption among lawyers despite the hype.  This blog explores the promise of predictive coding technology, examines why it has not been widely adopted in eDiscovery, and explains why 2012 is likely to be remembered as the year of predictive coding.

What is predictive coding?

Predictive coding refers to machine learning technology that can be used to automatically predict how documents should be classified based on limited human input.  In litigation, predictive coding technology can be used to rank and then “code” or “tag” electronic documents based on criteria such as “relevance” and “privilege” so organizations can reduce the amount of time and money spent on traditional page-by-page attorney document review during discovery.

Generally, the technology works by ranking documents and prioritizing the most important ones for review.  In addition to helping attorneys find important documents faster, this prioritization can even eliminate the need to review the lowest-ranked documents in certain situations. Additionally, since computers don’t get tired or daydream, many believe computers can predict document relevance even better than their human counterparts.
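
For readers who like to see the moving parts, the sketch below shows the bare-bones idea in Python using the open-source scikit-learn library: a model is trained on a handful of attorney-coded examples and then ranks unreviewed documents by predicted relevance. It is a simplified illustration of the concept, not a depiction of any commercial predictive coding product.

```python
# Illustrative only: a bare-bones version of the idea behind predictive
# coding. Commercial tools wrap this kind of model in sampling, quality
# control and reporting workflows.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# A small set of documents an attorney has already coded (1 = relevant).
seed_docs = [
    "Pricing discussion for the Acme contract renewal",
    "Lunch plans for Friday",
    "Draft term sheet and negotiation notes for the Acme deal",
    "Fantasy football league reminder",
]
seed_labels = [1, 0, 1, 0]

# The much larger set of documents nobody has reviewed yet.
unreviewed_docs = [
    "Acme renewal pricing counterproposal",
    "Office holiday party logistics",
]

vectorizer = TfidfVectorizer()
X_seed = vectorizer.fit_transform(seed_docs)
model = LogisticRegression().fit(X_seed, seed_labels)

# Rank unreviewed documents by predicted probability of relevance.
scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
for score, doc in sorted(zip(scores, unreviewed_docs), reverse=True):
    print(f"{score:.2f}  {doc}")
```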

Why hasn’t predictive coding gone mainstream yet?

Given the promise of faster and less expensive document review, combined with higher accuracy rates, many are perplexed as to why predictive coding technology hasn’t been widely adopted in eDiscovery.  The answer really boils down to one simple concept – a lack of transparency.

Difficult to Use

First, early predictive coding tools attempt to apply a complicated new technological approach to a document review process that has traditionally been very simple.  Instead of relying on attorneys to read each and every document to determine relevance, the success of today’s predictive coding technology typically depends on review decisions input into a computer by one or more experienced senior attorneys.  The process commonly involves a complex series of steps that include sampling, testing, reviewing, and measuring results in order to fine tune an algorithm that will eventually be used to predict the relevancy of the remaining documents.

The problem with early predictive coding technologies is that the majority of these complex steps are done in a ‘black box’.  In other words, the methodology and results are not always clear, which increases the risk of human error and makes the integrity of the electronic discovery process difficult to defend.  For example, the methodology for selecting a statistically relevant sample is not always intuitive to the end user.  This fundamental problem could result in improper sampling techniques that could taint the accuracy of the entire process.  Similarly, the process must often be repeated several times in order to improve accuracy rates.  Even if accuracy is improved, it may be difficult or impossible to explain how accuracy thresholds were determined or to explain why coding decisions were applied to some documents and not others.
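
For concreteness, the sketch below makes the sampling, reviewing, training and measuring loop described above explicit rather than leaving it in a black box. The attorney_review callback stands in for the senior reviewer’s coding decisions, and the stopping criteria are deliberately simplified; no vendor’s actual workflow is implied.

```python
# Illustrative only: an explicit version of the sample -> review -> train ->
# measure loop. attorney_review(doc) stands in for the senior reviewer's
# judgment (returns 1 for relevant, 0 for not relevant).
import random
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def iterative_training(documents, attorney_review, target_accuracy=0.90,
                       batch_size=50, control_size=100, max_rounds=10):
    docs = list(documents)
    random.shuffle(docs)
    control_set = docs[:control_size]                  # held out to measure accuracy
    control_labels = [attorney_review(d) for d in control_set]
    pool, labeled, labels = docs[control_size:], [], []

    vectorizer, model = TfidfVectorizer(), None
    for round_num in range(1, max_rounds + 1):
        batch, pool = pool[:batch_size], pool[batch_size:]
        labeled += batch
        labels += [attorney_review(d) for d in batch]  # the human review step
        if len(set(labels)) < 2:
            continue                                   # need examples of both classes

        model = LogisticRegression().fit(vectorizer.fit_transform(labeled), labels)
        predicted = model.predict(vectorizer.transform(control_set))
        accuracy = sum(p == t for p, t in zip(predicted, control_labels)) / len(control_set)
        print(f"Round {round_num}: accuracy on control set = {accuracy:.1%}")
        if accuracy >= target_accuracy or not pool:
            break
    return model, vectorizer
```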

Accuracy Concerns

Early predictive coding tools also tend to lack transparency in the way the technology evaluates the language contained in each document.  Instead of evaluating both the text and metadata fields within a document, some technologies actually ignore document metadata.  This omission means a privileged email sent by a client to her attorney, Larry Lawyer, might be overlooked by the computer if the name “Larry Lawyer” appears only in the “recipient” metadata field of the document and not in the document text.  The obvious risk is that this situation could lead to privilege waiver if the email is inadvertently produced to the opposing party.
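
The metadata point can be illustrated with a short, hypothetical sketch: when the searchable text is built from the body alone, an attorney’s name that appears only in the recipient field is invisible to the tool. The field names and privilege terms below are made up for illustration.

```python
# Illustrative only: why metadata matters. The field names and the privilege
# term list are hypothetical; real tools index many more fields.
PRIVILEGE_TERMS = ["larry lawyer", "attorney client privilege"]

def searchable_text(email, include_metadata=True):
    """Build the text a classifier or search engine actually 'sees'."""
    parts = [email.get("subject", ""), email.get("body", "")]
    if include_metadata:
        parts += [email.get("from", ""), email.get("to", ""), email.get("cc", "")]
    return " ".join(parts).lower()

def looks_privileged(email, include_metadata=True):
    text = searchable_text(email, include_metadata)
    return any(term in text for term in PRIVILEGE_TERMS)

msg = {"subject": "Quick question", "body": "Can we discuss the merger terms?",
       "from": "client@example.com", "to": "Larry Lawyer <llawyer@firm.example>"}

print(looks_privileged(msg, include_metadata=False))  # False - attorney only in metadata
print(looks_privileged(msg, include_metadata=True))   # True  - recipient field is searched
```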

Another practical concern is that some technologies do not allow reviewers to make a distinction between relevant and non-relevant language contained within individual documents.  For example, early predictive coding technologies are not intelligent enough to know that only the second paragraph on page 95 of a 100-page document contains relevant language.  The inability to discern which language led to the determination that the document is relevant could skew results when the computer tries to identify other documents with the same characteristics.  This lack of precision increases the likelihood that the computer will retrieve an over-inclusive number of irrelevant documents.  This problem is often referred to as ‘excessive recall,’ and it is important because it increases the number of documents requiring manual review, which directly impacts eDiscovery cost.

Waiver & Defensibility

Perhaps the biggest concern with early predictive coding technology is the risk of waiver and concerns about defensibility.  Notably, there have been no known judicial decisions that specifically address the defensibility of these new technology tools even though some in the judiciary, including U.S. Magistrate Judge Andrew Peck, have opined that this kind of technology should be used in certain cases.

The problem is that today’s predictive coding tools are difficult to use, complicated for the average attorney, and the way they work simply isn’t transparent.  All these limitations increase the risk of human error.  Introducing human error increases the risk of overlooking important documents or unwittingly producing privileged documents.  Similarly, it is difficult to defend a technological process that isn’t always clear in an era where many lawyers are still uncomfortable with keyword searches.  In short, using black box technology that is difficult to use and understand is perceived as risky, and many attorneys have taken a wait-and-see approach because they are unwilling to be the guinea pig.

Why is 2012 likely to be the year of predictive coding?

The word transparency may seem like a vague term, but it is the critical element missing from today’s predictive coding technology offerings.  2012 is likely to be the year of predictive coding because improvements in transparency will shine a light into the black box of predictive coding technology that hasn’t existed until now.  In simple terms, increasing transparency will simplify the user experience and improve accuracy which will reduce longstanding concerns about defensibility and privilege waiver.

Ease of Use

First, transparent predictive coding technology will help minimize the risk of human error by incorporating an intuitive user interface into a complicated solution.  New interfaces will include easy-to-use workflow management consoles to guide the reviewer through a step-by-step process for selecting, reviewing, and testing data samples in a way that minimizes guesswork and confusion.  By automating the sampling and testing process, the risk of human error can be minimized which decreases the risk of waiver or discovery sanctions that could result if documents are improperly coded.  Similarly, automated reporting capabilities make it easier for producing parties to evaluate and understand how key decisions were made throughout the process, thereby making it easier for them to defend the reasonableness of their approach.

Intuitive reports also help the producing party measure and evaluate confidence levels throughout the testing process until appropriate confidence levels are achieved.  Since confidence levels can actually be measured as a percentage, attorneys and judges are in a position to negotiate and debate the desired level of confidence for a production set rather than relying exclusively on the representations or decisions of a single party.  This added transparency allows the type of cooperation between parties called for in the Sedona Cooperation Proclamation and gives judges an objective tool for evaluating each party’s behavior.
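
The statistics behind those confidence levels are not vendor magic; they follow from the standard sample-size formula. The sketch below shows the arithmetic for estimating how many documents must be sampled to support a statement like “95% confidence with a ±2% margin of error,” assuming simple random sampling. Actual tools automate this and may use more sophisticated methods.

```python
# Illustrative only: the standard sample-size formula behind statements like
# "95% confidence with a +/-2% margin of error," with a finite population
# correction for a fixed review population.
import math

Z_SCORES = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def required_sample_size(population, confidence=0.95, margin_of_error=0.02, p=0.5):
    """How many documents must be sampled to estimate a proportion."""
    z = Z_SCORES[confidence]
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                  # finite population correction
    return math.ceil(n)

# e.g., a 1,000,000-document collection at 95% confidence, +/-2% margin:
print(required_sample_size(1_000_000))        # roughly 2,396 documents
print(required_sample_size(1_000_000, 0.99))  # a higher confidence level costs more review
```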

Accuracy & Efficiency

2012 is also likely to be the year of transparent predictive coding technology because technical limitations that have impacted the accuracy and efficiency of earlier tools will be addressed.  For example, new technology will analyze both document text and metadata to avoid the risk that responsive or privileged documents are overlooked.  Similarly, smart tagging features will enable reviewers to highlight specific language in documents to determine a document’s relevance or non-relevance so that coding predictions will be more accurate and fewer non-relevant documents will be recalled for review.

Conclusion - Transparency Provides Defensibility

The bottom line is that predictive coding technology has not enjoyed widespread adoption in the eDiscovery process due to concerns about simplicity and accuracy that breed larger concerns about defensibility.  Defending the use of black box technology that is difficult to use and understand is a risk that many attorneys simply are not willing to take, and these concerns have deterred widespread adoption of early predictive coding technology tools.  In 2012, next generation transparent predictive coding technology will usher in a new era of computer-assisted document review that is easy to use, more accurate, and easier to defend. Given these exciting technological advancements, I predict that 2012 will not only be the year of the dragon, it will also be the year of predictive coding.

Information Governance Gets Presidential Attention: Banking Bailout Cost $4.76 Trillion, Technology Revamp Approaches $240 Billion

Tuesday, January 10th, 2012

On November 28, 2011, The White House issued a Presidential Memorandum that outlines what is expected of the 480 federal agencies of the government’s three branches in the next 240 days.  Up until now, Washington, D.C. has been the Wild West with regard to information governance, as each agency has often unilaterally adopted its own arbitrary policies and systems.  Moreover, some agencies have recently purchased differing technologies.  Unfortunately, given the President’s ultimate goal of uniformity, centralization will be difficult to accomplish with such a range of disparate technological approaches.

Particular pain points for the government traditionally include retention, search, collection, review and production of vast amounts of data and records.  Examples include FOIA requests gone awry, legal holds issued across different agencies leading to spoliation, and the ever-present problem of decentralization.

Why is the government different?

Old Practices. First, in some instances the government is technologically behind its corporate counterparts and is failing to meet the judiciary’s expectation that organizations effectively store, manage and discover their information.  This failing is self-evident from the President’s directive mandating that these agencies develop a plan to attack the problem.  Though different from corporate entities, the government is nevertheless held to the same eDiscovery standards under the Federal Rules of Civil Procedure (FRCP).  In practice, the government has been given more leniency until recently, and while equal expectations have not always been the case, the gap between the private and public sectors is no longer possible to ignore.

FOIA.  The government’s arduous obligation to produce information under the Freedom of Information Act (FOIA) has no corresponding analog for private organizations, which respond to more traditional civil discovery requests.  Because the government is so large, with many disparate IT systems, it is cumbersome to work efficiently through the information governance process across agencies, and many times still difficult inside one individual agency with multiple divisions.  Executing this production process is even more difficult, if not impossible, to do manually without properly deployed technology.  Additionally, many of the investigatory agencies that issue requests to the private sector need more efficient ways to manage and review the data they are requesting.  To compound the problem, within the U.S. government two opposing interests are at play, both screaming for a resolution, and that solution needs to be centralized.  On the one hand, the government needs to retain more than a corporation may need to in order to satisfy a FOIA request.

Titan Pulled at Both Ends. On the other hand, without classification of the records that are to be kept, technology to organize this vast amount of data and some amount of expiry, every agency will essentially become its own massive repository.  The “retain everything” mentality, coupled with inefficient search and retrieval of data and records, is where agencies stand today.  Corporations are experiencing this on a smaller scale, and many are collectively further along than the government in this process, without the FOIA complications.

What are agencies doing to address these mandates?

In their plans, agencies must describe how they will improve or maintain their records management programs, particularly with regard to email, social media and other electronic communications.  They must also move away from such a paper-centric existence.  eDiscovery consultants and software companies are helping agencies through this process, essentially writing their plans to match the President’s directive.  The cloud conversation has been revisited, and agencies also have to explain how they will use cloud-based services and storage solutions, as well as identify gaps in existing laws or regulations that presently prevent improved management.  Small innovations are taking place.  In fact, just recently the DOJ added a new search feature on their website to make it easier for the public to find documents that have been posted by agencies on their websites.

The Office of Management and Budget (OMB), National Archives and Records Administration (NARA), and Justice Department will use those reports to come up with a government-wide records management framework that is more efficient, maintains accountability by documenting agency actions and promotes “appropriate” public access to records.  Hopefully, the framework they come up with will be centralized and workable on a realistic timeframe with resources sufficiently allocated to the initiative.

How much will this cost?

The President’s mandate is a great initiative and very necessary, but one cannot help but think about the costs in terms of money, time and resources when considering these crucial changes.  The most recent version of a financial services and general government appropriations bill in the Senate extends $378.8 million to NARA for this initiative.  President Obama appointed Steven VanRoekel as the United States CIO in August 2011 to succeed Vivek Kundra.  After VanRoekel’s speech at the Churchill Club in October of 2011, an audience member asked him what the most surprising aspect of his new job was.  VanRoekel said that it was managing the huge and sometimes unwieldy resources of his $80 billion budget.  It is going to take even more than this to do the job right, however.

Using conservative estimates, assume that the initial investment for an agency to implement archiving and eDiscovery capabilities would be $100 million.  That approximates $480 billion for all 480 agencies.  Assuming a uniform information governance platform gets adopted by all agencies at a 50% discount due to the large contracts, and also factoring in smaller sums for agencies with lesser needs, the total now comes to $240 billion.  For context, that figure is 5% of what was spent by the federal government ($4.76 trillion) on the biggest bailout in history in 2008. That leaves a need for $160 billion more to get the job done. VanRoekel also commented at the same meeting that he wants to break down massive multi-year information technology projects into smaller, more modular projects in the hopes of saving the government from getting mired in multi-million dollar failures.  His solution, he says, is modular and incremental deployment.

While Rome was not built in a day, this initiative is long overdue yet feasible, as technology exists to address these challenges rather quickly.  After these 240 days are complete and a plan is drawn, the real question is: how are we going to pay now for technology the government needed yesterday?  In a perfect world, the government would select a platform for archiving and eDiscovery, break the project into incremental milestones and roll out a uniform combination of solutions that are best of breed in their areas of expertise.

Lessons Learned for 2012: Spotlighting the Top eDiscovery Cases from 2011

Tuesday, January 3rd, 2012

The New Year has now dawned and with it, the certainty that 2012 will bring new developments to the world of eDiscovery.  Last month, we spotlighted some eDiscovery trends for 2012 that we feel certain will occur in the near term.  To understand how these trends will play out, it is instructive to review some of the top eDiscovery cases from 2011.  These decisions provide a roadmap of best practices that the courts promulgated last year.  They also spotlight the expectations that courts will likely have for organizations in 2012 and beyond.

Issuing a Timely and Comprehensive Litigation Hold

Case: E.I. du Pont de Nemours v. Kolon Industries (E.D. Va. July 21, 2011)

Summary: The court issued a stiff rebuke against defendant Kolon Industries for failing to issue a timely and proper litigation hold.  That rebuke came in the form of an instruction to the jury that Kolon executives and employees destroyed key evidence after the company’s preservation duty was triggered.  The jury responded by returning a stunning $919 million verdict for DuPont.

The spoliation at issue occurred when several Kolon executives and employees deleted thousands of emails and other records relevant to DuPont’s trade secret claims.  The court laid the blame for this destruction on the company’s attorneys and executives, reasoning they could have prevented the spoliation through an effective litigation hold process.  At issue were three hold notices circulated to the key players and data sources.  The notices were all deficient in some manner: they were either too limited in their distribution, ineffective because they were prepared in English for Korean-speaking employees, or too late to prevent or otherwise ameliorate the spoliation.

The Lessons for 2012: The DuPont case underscores the importance of issuing a timely and comprehensive litigation hold notice.  As DuPont teaches, organizations should identify what key players and data sources may have relevant information.  A comprehensive notice should then be prepared to communicate the precise hold instructions in an intelligible fashion.  Finally, the hold should be circulated immediately to prevent data loss.

Organizations should also consider deploying the latest technologies to help effectuate this process.  This includes an eDiscovery platform that enables automated legal hold acknowledgements.  Such technology will allow custodians to be promptly and properly apprised of litigation and thereby retain information that might otherwise have been discarded.

Another Must-Read Case: Haraburda v. Arcelor Mittal U.S.A., Inc. (D. Ind. June 28, 2011)

Suspending Document Retention Policies

Case: Viramontes v. U.S. Bancorp (N.D. Ill. Jan. 27, 2011)

Summary: The defendant bank defeated a sanctions motion because it modified aspects of its email retention policy once litigation was reasonably foreseeable.  The bank implemented a retention policy that kept emails for 90 days, after which the emails were overwritten and destroyed.  The bank also promulgated a course of action whereby the retention policy would be promptly suspended on the occurrence of litigation or another triggering event.  This way, the bank could establish the reasonableness of its policy in litigation.  Because the bank followed that procedure in good faith, it was protected from court sanctions under the Federal Rule of Civil Procedure 37(e) “safe harbor.”

The Lesson for 2012: As Viramontes shows, an organization can be prepared for eDiscovery disputes by timely suspending aspects of its document retention policies.  By modifying retention policies when so required, an organization can develop a defensible retention procedure and be protected from court sanctions under Rule 37(e).

Coupling those procedures with archiving software will only enhance an organization’s eDiscovery preparations.  Effective archiving software will have a litigation hold mechanism, which enables an organization to suspend automated retention rules.  This will better ensure that data subject to a preservation duty is actually retained.
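
The core logic is simple enough to sketch: automated expiry runs only for items that are not subject to a hold. The sketch below is illustrative only; retention periods and hold tracking in real archiving products are policy-driven rather than hard-coded.

```python
# Illustrative only: the core logic of a defensible expiry rule - automated
# retention runs until a litigation hold suspends it for the affected items.
from datetime import datetime, timedelta

def should_expire(message_date, on_legal_hold, retention_days=90, now=None):
    """Return True only if the message is past retention AND not on hold."""
    now = now or datetime.utcnow()
    if on_legal_hold:
        return False  # hold suspends automated retention for this item
    return now - message_date > timedelta(days=retention_days)

# A 100-day-old email is expired under the 90-day policy...
old_msg = datetime.utcnow() - timedelta(days=100)
print(should_expire(old_msg, on_legal_hold=False))  # True
# ...unless its custodian or matter is subject to a litigation hold.
print(should_expire(old_msg, on_legal_hold=True))   # False
```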

Another Must-Read Case: Micron Technology, Inc. v. Rambus Inc., 645 F.3d 1311 (Fed. Cir. 2011)

Managing the Document Collection Process

Case: Northington v. H & M International (N.D.Ill. Jan. 12, 2011)

Summary: The court issued an adverse inference jury instruction against a company that destroyed relevant emails and other data.  The spoliation occurred in large part because legal and IT were not involved in the collection process.  For example, counsel was not actively engaged in the critical steps of preservation, identification or collection of electronically stored information (ESI).  Nor was IT brought into the picture until 15 months after the preservation duty was triggered. By that time, rank-and-file employees – some of whom were accused by the plaintiff of harassment – stepped into this vacuum and conducted the collection process without meaningful oversight.  Predictably, key documents were never found, and the court had little choice but to promise to inform the jury that the company destroyed evidence.

The Lesson for 2012: An organization does not have to suffer the same fate as the company in the Northington case.  It can take charge of its data during litigation through cooperative governance between legal and IT.  After issuing a timely and effective litigation hold, legal should typically involve IT in the collection process.  Legal should rely on IT to help identify all data sources – servers, systems and custodians – that likely contain relevant information.  IT will also be instrumental in preserving and collecting that data for subsequent review and analysis by legal.  By working together in a top-down fashion, organizations can better ensure that their eDiscovery process is defensible and not fatally flawed.

Another Must-Read Case: Green v. Blitz U.S.A., Inc. (E.D. Tex. Mar. 1, 2011)

Using Proportionality to Dictate the Scope of Permissible Discovery

Case: DCG Systems v. Checkpoint Technologies (N.D. Ca. Nov. 2, 2011)

The court adopted the new Model Order on E-Discovery in Patent Cases recently promulgated by the U.S. Court of Appeals for the Federal Circuit.  The model order incorporates principles of proportionality to reduce the production of email in patent litigation.  In adopting the order, the court explained that email productions should be scaled back since email is infrequently introduced as evidence at trial.  As a result, email production requests will be restricted to five search terms and may only span a defined set of five custodians.  Furthermore, email discovery in DCG Systems will wait until after the parties complete discovery on the “core documentation” concerning the patent, the accused product and prior art.

The Lesson for 2012: Courts seem to be slowly moving toward a system that incorporates proportionality as the touchstone for eDiscovery.  This is occurring beyond the field of patent litigation, as evidenced by other recent cases.  Even the State of Utah has gotten in on the act, revising its version of Rule 26 to require that all discovery meet the standards of proportionality.  While there are undoubtedly deviations from this trend (e.g., Pippins v. KPMG (S.D.N.Y. Oct. 7, 2011)), the clear lesson is that discovery should comply with the cost cutting mandate of Federal Rule 1.

Another Must-Read Case: Omni Laboratories Inc. v. Eden Energy Ltd [2011] EWHC 2169 (TCC) (29 July 2011)

Leveraging eDiscovery Technologies for Search and Review

Case: Oracle America v. Google (N.D. Ca. Oct. 20, 2011)

The court ordered Google to produce an email that it previously withheld on attorney client privilege grounds.  While the email’s focus on business negotiations vitiated Google’s claim of privilege, that claim was also undermined by Google’s production of eight earlier drafts of the email.  The drafts were produced because they did not contain addressees or the heading “attorney client privilege,” which the sender later inserted into the final email draft.  Because those details were absent from the earlier drafts, Google’s “electronic scanning mechanisms did not catch those drafts before production.”

The Lesson for 2012: Organizations need to leverage next-generation, robust technology to support the document production process in discovery.  Tools such as email analytical software, which can isolate drafts and offer to remove them from production, are needed to address complex production issues.  Other technological capabilities, such as near-duplicate identification, can also help identify draft materials and marry them up with finals that have been marked as privileged.  Last but not least, technology assisted review has the potential to enable one lawyer to efficiently complete the work that previously took thousands of hours.  Finding the budget and doing the research to obtain the right tools for the enterprise should be a priority for organizations in 2012.
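
To illustrate what near-duplicate identification does under the hood, the toy sketch below compares documents using word shingles and Jaccard similarity; a draft scores much closer to its final version than an unrelated email does. Production tools use more robust techniques (minhashing, metadata comparison) and tie the results back to privilege designations, but the intuition is the same.

```python
# Illustrative only: a toy near-duplicate check using word shingles and
# Jaccard similarity to flag drafts of the same underlying document.
def shingles(text, k=3):
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard_similarity(text_a, text_b):
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b) if a | b else 0.0

final = "Per our attorney client privilege discussion, the license fee is $10M."
draft = "Per our discussion, the license fee is $10M."
other = "Lunch is at noon on Friday in the main conference room."

print(f"draft vs final: {jaccard_similarity(draft, final):.2f}")  # clearly higher
print(f"other vs final: {jaccard_similarity(other, final):.2f}")  # near zero
```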

Another Must-Read Case: J-M Manufacturing v. McDermott, Will & Emery (CA Super. Jun. 2, 2011)

Conclusion

There were any number of other significant cases from 2011 that could have made this list.  We invite you to share your favorites in the comments section or contact us directly with your feedback.


Top Ten eDiscovery Predictions for 2012

Thursday, December 8th, 2011

As 2011 comes quickly to a close we’ve attempted, as in years past, to do our best Carnac impersonation and divine the future of eDiscovery.  Some of these predictions may happen more quickly than others, but it’s our sense that all will come to pass in the near future – it’s just a matter of timing.

  1. Technology Assisted Review (TAR) Gains Speed.  The area of Technology Assisted Review is exciting because a host of emerging technologies can make the review process more efficient, ranging from email threading and concept search to clustering and predictive coding.  There are two fundamental challenges, however.  First, the technology doesn’t work in a vacuum: workflows need to be properly designed and users need to make accurate decisions, because those judgment calls are often magnified by the application.  Second, the defensibility of the given approach needs to be well vetted.  While it’s likely not necessary (or practical) to expect a judge to mandate the use of a specific technological approach, it is important for the applied technologies to be reasonable, transparent and auditable, since the worst possible outcome would be to have a technology challenged and then find the producing party unable to adequately explain its methodology.
  2. The Custodian-Based Collection Model Comes Under Stress. Ever since the days of Zubulake, litigants have focused on “key players” as a proxy for finding relevant information during the eDiscovery process.  Early on, this model worked particularly well in an email-centric environment.  But, as discovery from cloud sources, collaborative worksites (like SharePoint) and other unstructured data repositories continues to become increasingly mainstream, the custodian-oriented collection model will become rapidly outmoded because it will fail to take into account topically-oriented searches.  This trend will be further amplified by the bench’s increasing distrust of manual, custodian-based data collection practices and the presence of better automated search methods, which are particularly valuable for certain types of litigation (e.g., patent disputes, product liability cases).
  3. The FRCP Amendment Debate Will Rage On – Unfortunately Without Much Near-Term Progress. While it is clear that the eDiscovery preservation duty has become a more complex and risk-laden process, it’s not clear that this “pain” is causally related to the FRCP.  The notes from the Dallas mini-conference cited a pending Sedona survey indicating that preservation challenges are increasing dramatically.  Yet there isn’t a consensus view on which rule changes, if any, would improve this murky problem.  In the near term, this means that organizations with significant preservation pains will need to make better use of the rules already on the books and deploy enabling technologies where possible.
  4. Data Hoarding Increasingly Goes Out of Fashion. The war cry of many IT professionals that “storage is cheap” is starting to fall on deaf ears.  Organizations are realizing that the cost of storing information is just the tip of the iceberg when it comes to the litigation risk of having terabytes (and conceivably petabytes) of unstructured, uncategorized and unmanaged electronically stored information (ESI).  This tsunami of information will increasingly become an information liability for organizations that have never deleted a byte of information.  In 2012, more corporations will see the need to clean out their digital houses and will realize that such cleansing (where permitted) is a best practice moving forward.  This applies with equal force to the US government, which has recently mandated such an effort at President Obama’s behest.
  5. Information Governance Becomes a Viable Reality.  For several years there’s been an effort to combine the reactive (far right) side of the EDRM with the logically connected proactive (far left) side of the EDRM.  But now, a number of surveys have linked good information governance hygiene with better response times to eDiscovery requests and governmental inquiries, as well as a corresponding lower chance of being sanctioned and the ability to turn over less responsive information.  In 2012, enterprises will realize that the litigation use case is just one way to leverage archival and eDiscovery tools, further accelerating adoption.
  6. Backup Tapes Will Be Increasingly Seen as a Liability.  Using backup tapes for disaster recovery/business continuity purposes remains a viable business strategy, although backing up to tape will become less prevalent as cloud backup increases.  However, if tapes are kept around longer than necessary (days versus months) then they become a ticking time bomb when a litigation or inquiry event crops up.
  7. International eDiscovery/eDisclosure Processes Will Continue to Mature. It’s easy to think of the US as dominating the eDiscovery landscape. While this is gospel for us here in the States, international markets are developing quickly and in many ways are ahead of the US, particularly with regulatory compliance-driven use cases, like the UK Bribery Act 2010.  This fact, coupled with the menagerie of international privacy laws, means we’ll be less Balkanized in our eDiscovery efforts moving forward, since we really do need to be thinking and practicing globally.
  8. Email Becomes “So 2009” As Social Media Gains Traction. While email has been the eDiscovery darling for the past decade, it’s getting a little long in the tooth.  In the next year, new types of ESI (social media, structured data, loose files, cloud content, mobile device messages, etc.) will cause headaches for a number of enterprises that have been overly email-centric.  Already in 2011, organizations are finding that other sources of ESI like documents/files and structured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This heterogeneous mix of ESI will certainly result in challenges for many companies, with some unlucky ones getting sanctioned because they ignored these emerging data types.
  9. Cost Shifting Will Become More Prevalent – Impacting the “American Rule.” For ages, the American Rule held that producing parties had to pay for their production costs, with a few narrow exceptions.  Next year we’ll see even more courts award winning parties their eDiscovery costs under 28 U.S.C. §1920(4) and Rule 54(d)(1) FRCP. Courts are now beginning to consider the services of an eDiscovery vendor as “the 21st Century equivalent of making copies.”
  10. Risk Assessment Becomes a Critical Component of eDiscovery. Managing risk is a foundational underpinning for litigators generally, but its role in eDiscovery has been a bit obscure.  Now, with the statistical insights made possible by enabling software technologies, it will become increasingly important for counsel to manage risk by deciding what error and precision rates are acceptable.  This risk analysis is particularly critical for any variety of technology assisted review, since precision, recall and F-measure statistics all require a delicate balance of risk and reward (see the sketch following this list).
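
For readers less familiar with the statistics referenced in prediction 10, here is a short, purely illustrative sketch of how precision, recall and F-measure are computed from basic review counts.  The document counts are hypothetical and are not drawn from any case or product.

```python
# Illustrative only: precision, recall and F-measure from hypothetical review counts.

def precision_recall_f1(true_positives, false_positives, false_negatives):
    """Compute precision, recall and F1 from basic responsiveness counts."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical example: of 1,000 documents a TAR tool coded responsive, 800
# actually were, and 200 responsive documents elsewhere in the collection were missed.
p, r, f1 = precision_recall_f1(true_positives=800, false_positives=200, false_negatives=200)
print(f"precision={p:.2f} recall={r:.2f} f1={f1:.2f}")  # 0.80 / 0.80 / 0.80
```

Raising recall (finding more responsive documents) typically lowers precision (more false positives slip through), which is exactly the risk-reward balance counsel must weigh.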

Accurately divining the future is difficult (some might say impossible), but in the electronic discovery arena many of these predictions can happen if enough practitioners decide they want them to happen.  So, the future is fortunately within reach.

Enterprise Strategy Group (ESG)’s Legal Trends Survey Reveals Alarming Inattention to eDiscovery Spending

Monday, December 5th, 2011

In their latest survey, entitled “E-Discovery Market Trends: A View from the Legal Department,” Enterprise Strategy Group (ESG) analysts Brian Babineau and Katey Wood analyze a number of interesting statistics and offer a range of insightful conclusions.  By surveying general counsel from large, mid-market (500-999 employees) and enterprise-class organizations in North America, they were able to dive into a range of eDiscovery topics, including pain points, operational expenses and go-forward prioritizations.  Some findings are more intuitive than others, but in any case the results serve as good calibration metrics for those who endeavor to understand the corporate eDiscovery state of the nation.

“Most corporations are not tracking e-discovery spending…” In what may be the most notable finding of this ESG report, 60% of survey respondents claim that they did not track annual eDiscovery spending in 2010.  The authors correctly note that the eDiscovery process, “which can be highly unpredictable due to its project-by-project nature to begin with, has historically been outsourced to service providers charging at variable rates and often billed back to companies via their law firms.”  Despite the significant challenges of tracking eDiscovery spending, it’s nevertheless irresponsible for organizations to keep their heads in the sand regarding such a significant operational expense.

As the old saw goes, “you can’t manage what you can’t measure,” so it’s almost inconceivable that so many organizations aren’t tracking such a significant expense category.  For organizations that want to create a repeatable business process, as opposed to the fire-drill chaos typically associated with eDiscovery, it’s vitally important to accurately capture core eDiscovery metrics.  For starters, it’s useful to understand basic collection parameters, such as the typical number of key custodians, average data volume per custodian, data expansion rates and de-duplication statistics.  Once these metrics are in place, it becomes possible to manage the process and reduce costs.
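
As a purely illustrative sketch of what capturing those collection metrics might look like, the structure below computes a de-duplication rate, volume per custodian and cost per reviewed gigabyte.  Every field name and number is a hypothetical assumption rather than a prescribed model or benchmark.

```python
# Hypothetical sketch of per-matter eDiscovery metrics; field names and figures
# are assumptions for illustration, not benchmarks.

from dataclasses import dataclass

@dataclass
class MatterMetrics:
    matter_id: str
    custodian_count: int
    collected_gb: float        # raw volume collected
    post_dedupe_gb: float      # volume remaining after de-duplication
    review_cost_usd: float     # total review spend for the matter

    @property
    def dedupe_rate(self) -> float:
        """Fraction of collected data removed by de-duplication."""
        return 1 - (self.post_dedupe_gb / self.collected_gb)

    @property
    def gb_per_custodian(self) -> float:
        return self.collected_gb / self.custodian_count

    @property
    def cost_per_reviewed_gb(self) -> float:
        return self.review_cost_usd / self.post_dedupe_gb

# Hypothetical example: 12 custodians, 240 GB collected, 150 GB after de-duplication.
m = MatterMetrics("ACME-2012-001", 12, 240.0, 150.0, 375000.0)
print(f"dedupe rate: {m.dedupe_rate:.0%}, GB/custodian: {m.gb_per_custodian:.1f}, "
      f"cost per reviewed GB: ${m.cost_per_reviewed_gb:,.0f}")
```

Even a simple record like this, kept consistently across matters, gives a legal department a baseline against which vendor and law firm bills can finally be compared.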

Katey Wood went on to expound in an exclusive quote for EDD 2.0:

“E-discovery can be managed as a strategic business process with an understanding of costs, performance and outcomes. When there’s no basis for reporting or comparison, it’s pin the tail on the donkey.  Corporate litigants won’t ever know they’re getting their money’s worth if they don’t even know what they’re spending.”

“E-Discovery accuracy/efficiency isn’t being measured, in large part.” Similar to the failure to measure eDiscovery costs, a full two-thirds of GCs (67%) aren’t tracking the “efficiency and/or accuracy of e-discovery document review.” Until corporate counsel link expectations of competency and efficiency to oversight and performance metrics, outside law firms will likely avoid having their feet held to the fire.  This passive stance makes transparency and process improvement difficult at best.  Additionally, a model that expects efficiency but imposes little or no accountability doesn’t bode well for the quick adoption of enabling technologies like predictive coding, since the driver has to be the need for increased efficiency (which axiomatically means lower law firm review bills).

“Corporate information governance and litigation readiness (especially defensible deletion) are a priority, but not yet a reality.” From an internal prioritization perspective, more than two-thirds (69%) of respondents identified a desire to expire/delete data more consistently, “thereby limiting unnecessary data retention for future litigation requests.”  Savvy enterprises correctly recognized the “multi-prong threat of unregulated data retention: the large amounts of irrelevant data ultimately produced for legal review, the greater difficulty of hanging onto potentially litigious documents past their required retention periods.”

This finding is very encouraging, and it ties into the upward momentum the industry is seeing regarding information governance generally – particularly linking the reactive (right) side of the EDRM with the logically connected and proactive (left) side of the EDRM.  As a good first step, it’s critical to see organizations now associating good information governance hygiene with lower costs and better eDiscovery response times.  The ESG finding also triangulates with results from the recent Information Retention and eDiscovery Survey, which found that companies with good information governance hygiene were often able to respond much faster and more successfully to eDiscovery and investigation requests, and often suffered fewer negative consequences.

The only downside to the positive information governance trend, as reported by the survey, was that,

“while there are great benefits to defensible deletion, internal initiatives for implementing it too often are stymied by difficulty in obtaining cross functional consensus and authorization, particularly as it touches so many other critical processes like regulatory compliance and legal hold.”

“Legal hold processes are still very manual.” Another question revealed that many companies are attempting to get their information governance house in order, but are still in the very early stages.  When asked about their current legal hold notification and tracking process, a whopping 69% of organizations said they are using a “manual process performed by internal staff using e-mail and spreadsheets, etc.”  Another 6% said they either had no formal process or no tracking mechanism.

Given the risks attendant to flaws in the preservation process, this area is ripe for improvement.  The good news is that 54% of survey respondents intend to improve their legal hold process, with 25% planning improvements within the next 12 months.  This is a healthy acknowledgment that risk exists and that, with a modicum of investment (time, personnel, procedures, and technology), the legal hold area can be brought up to current best practices.
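
By way of illustration only, the sketch below shows the kind of auditable hold-tracking structure that could replace e-mail and spreadsheets: who was notified, when, and who has yet to acknowledge.  The class names, fields and dates are hypothetical assumptions, not any specific product’s design.

```python
# Hypothetical sketch of an auditable legal hold tracker; names and fields are
# illustrative assumptions rather than a reference to any real product.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class HoldNotice:
    custodian: str
    matter_id: str
    issued: date
    acknowledged: Optional[date] = None

    def acknowledge(self, when: date) -> None:
        self.acknowledged = when

@dataclass
class LegalHold:
    matter_id: str
    notices: list = field(default_factory=list)

    def issue(self, custodian: str, when: date) -> HoldNotice:
        notice = HoldNotice(custodian, self.matter_id, when)
        self.notices.append(notice)
        return notice

    def outstanding(self) -> list:
        """Custodians who have not yet acknowledged the hold."""
        return [n.custodian for n in self.notices if n.acknowledged is None]

# Hypothetical usage
hold = LegalHold("ACME-2012-001")
n = hold.issue("jsmith", date(2012, 1, 9))
n.acknowledge(date(2012, 1, 11))
hold.issue("mjones", date(2012, 1, 9))
print(hold.outstanding())  # ['mjones'] still needs a reminder
```

Even this basic level of structure gives counsel a defensible record of when notices went out and which custodians require follow-up, something a shared spreadsheet rarely preserves reliably.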

The ESG survey is a welcome temperature gauge on the state of corporate legal departments.  It notes, in conclusion, “with the staggering growth, diversity and dispersion of data, the pain e-discovery is currently causing large and serial litigants are only a symptom of the larger problem of unwieldy and under-developed information management affecting all businesses.”  With data insights from the ESG survey, it’s becoming clear that foundational information governance elements (like deploying auditable legal hold procedures, tracking eDiscovery spending and updating data maps) are desperately needed by the many organizations that want to turn eDiscovery into a repeatable business process.  The good news is that many of these organizations have improvements in mind for the next 12 months; the challenge will be to make sure these proactive projects maintain the same level of organizational urgency that is often present for more reactive tasks.