
Posts Tagged ‘processing’

Federal Directive Hits Two Birds (RIM and eDiscovery) with One Stone

Thursday, October 18th, 2012

The eagerly awaited Directive from the Office of Management and Budget (OMB) and the National Archives and Records Administration (NARA) was released at the end of August. To go behind the scenes, we’ve asked the Project Management Office (PMO) and the Chief Records Officer for NARA to respond to a few key questions.

We know that the Presidential Mandate was the impetus for the agency self-assessments that were submitted to NARA. Now that NARA and the OMB have distilled those reports, what are the biggest challenges going forward for the government regarding recordkeeping, information governance and eDiscovery?

“In each of those areas, the biggest challenge that can be identified is the rapid emergence and deployment of technology. Technology has changed the way Federal agencies carry out their missions and create the records required to document that activity. It has also changed the dynamics in records management. In the past, agencies would maintain central file rooms where records were stored and managed. Now, with distributed computing networks, records are likely to be in a multitude of electronic formats, on a variety of servers, and exist as multiple copies. Records management practices need to move forward to solve that challenge. If done right, good records management (especially of electronic records) can also be of great help in providing a solid foundation for applying best practices in other areas, including in eDiscovery, FOIA, as well as in all aspects of information governance.”    

What is the biggest action item from the Directive for agencies to take away?

“The Directive creates a framework for records management in the 21st century that emphasizes the primacy of electronic information and directs agencies to begin transforming their current processes to identify and capture electronic records. One milestone is that by 2016, agencies must be managing their email in an electronically accessible format (with tools that make this possible, not printing out emails to paper). Agencies should begin planning for the transition, where appropriate, from paper-based records management processes to those that preserve records in an electronic format.

The Directive also calls on agencies to designate a Senior Agency Official (SAO) for Records Management by November 15, 2012. The SAO is intended to raise the profile of records management in an agency to ensure that each agency commits the resources necessary to carry out the rest of the goals in the Directive. A meeting of SAOs is to be held at the National Archives with the Archivist of the United States convening the meeting by the end of this year. Details about that meeting will be distributed by NARA soon.”

Does the Directive holistically address information governance for the agencies, or is it likely that agencies will continue to deploy different technology even within their own departments?

“In general, as long as agencies are properly managing their records, it does not matter what technologies they are using. However, one of the drivers behind the issuance of the Memorandum and the Directive was identifying ways in which agencies can reduce costs while still meeting all of their records management requirements. The Directive specifies actions (see A3, A4, A5, and B2) in which NARA and agencies can work together to identify effective solutions that can be shared.”

Finally, although FOIA requests have increased and the backlog has decreased, how will litigation and FOIA intersect over, say, the next five years? We know from the retracted decision in NDLON that metadata still remains an issue for the government. Are we getting to a point where records created electronically will be produced electronically as a matter of course for FOIA litigation and requests?

“In general, an important feature of the Directive is that the Federal government’s record information – most of which is in electronic format – stays in electronic format. Therefore, all of the inherent benefits will remain as well – i.e., metadata being retained, easier and speedier searches to locate records, and efficiencies in compilation, reproduction, transmission, and reduction in the cost of producing the requested information. This all would be expected to have an impact in improving the ability of federal agencies to respond to FOIA requests by producing records in electronic formats.”

Fun Fact: Is NARA really saving every tweet produced?

“Actually, the Library of Congress is the agency that is preserving Twitter. NARA is interested in only preserving those tweets that a) were made or received in the course of government business and b) appraised to have permanent value. We talked about this on our Records Express blog.”

“We think President Barack Obama said it best when he made the following comment on November 28, 2011: ‘The current federal records management system is based on an outdated approach involving paper and filing cabinets. Today’s action will move the process into the digital age so the American public can have access to clear and accurate information about the decisions and actions of the Federal Government.’”

Paul Wester, Chief Records Officer at the National Archives, has stated that the Directive is very exciting for the Federal Records Management community: “In our lifetime none of us has experienced the attention to the challenges that we encounter every day in managing our records management programs like we are now. These are very exciting times to be a records manager in the Federal government. Full implementation of the Directive by the end of this decade will take a lot of hard work, but the government will be better off for doing this and we will be better able to serve the public.”

Special thanks to NARA for the ongoing dialogue that is key to transparent government and the effective practice of eDiscovery, Freedom of Information Act requests, records management and thought leadership in the government sector. Stay tuned as we continue to cover these crucial issues for the government as it wrestles with important information governance challenges.

 

Responsible Data Citizens Embrace Old World Archiving With New Data Sources

Monday, October 8th, 2012

The times are changing rapidly as the data explosion mushrooms, but the more things change, the more they stay the same. In the archiving and eDiscovery world, organizations are increasingly pushing content from multiple data sources into information archives. Email was the first data source to take the plunge into the archive, but other data sources are following quickly as we increase the amount of data we create (volume) along with the types of data sources (variety). While email is still a paramount data source for litigation, internal/external investigations and compliance, other data sources, namely social media and SharePoint, are quickly catching up.

This transformation is happening for multiple reasons. The main reason for this expansive push of different data varieties into the archive is that centralizing an organization’s data is paramount to healthy information governance. For organizations that have deployed archiving and eDiscovery technologies, the ability to archive multiple data sources is the Shangri-La they have been looking for to increase efficiency and create a more holistic and defensible workflow.

Organizations can now deploy document retention policies across multiple content types within one archive and can identify, preserve and collect from the same, singular repository. No longer do separate retention policies need to apply to data that originated in different repositories. The increased ability to archive more data sources into a centralized archive provides for unparalleled storage, deduplication, document retention, defensible deletion and discovery benefits in an increasingly complex data environment.

Prior to this capability, SharePoint was another data source in the wild that needed disparate treatment. This meant that in-place legal hold, as well as insight into the corpus of data, was not as clear as it was for email. This lack of transparency within the organization’s data environment for early case assessment led to unnecessary outsourcing, over-collection and disparate, time-consuming workflows. All of these detractors cost organizations money, resources and time that could be better utilized elsewhere.

Bringing data sources like SharePoint into an information archive increases the ability for an organization to comply with necessary document retention schedules, legal hold requirements, and the ability to reap the benefits of a comprehensive information governance program. If SharePoint is where an organization’s employees are storing documents that are valuable to the business, order needs to be brought to the repository.

Additionally, many projects are abandoned and left to die on the vine in SharePoint. These projects need to be expired and that capacity recycled for a higher business purpose. Archives can now ingest document libraries, wikis, discussion boards, custom lists, “My Sites” and SharePoint social content for increased storage optimization, retention/expiration of content and eDiscovery. As a result, organizations can better manage complex projects such as migrations, versioning, site consolidations and expiration with SharePoint archiving.

Data can be analogized to currency, with the archive as the bank. In treating data as currency, organizations must ask themselves why companies are valued the way they are on Wall Street. Companies that provide services, alone or in combination with products, are often valued on customer lists, consumer data that can be repurposed (Facebook), and various other databases. A recent Forbes article discusses people, value and brand as the predominant indicators of value.

While these valuation metrics are sound, they stop short of measuring the quality of the actual data within an organization, or examining whether it is organized and protected. The valuation also does not consider the risks and benefits of how the data is stored and protected, or whether it is searchable. Yet the value of the data inside a company is what supports all three of the aforementioned indicators. Without managing the data in an organization, not only do eDiscovery and storage costs become a legal and financial risk, but all three indicators are compromised.

If employee data is not managed/monitored appropriately, if the brand is compromised due to lack of social media monitoring/response, or if litigation ensues without the proper information governance plan, then value is lost because value has not been assessed and managed. Ultimately, an organization is only as good as its data, and this means there’s a new asset on Wall Street – data.

It’s not a new concept to archive email, and in turn it isn’t novel that data is an asset. It has simply been a less understood asset because, even though massive amounts of data are created each day in organizations, storage has become cheap. SharePoint is becoming more archivable because more critical data is being stored there, including business records, contracts and social media content. Organizations cannot fear what they cannot see until they are forced by an event to go back and collect, analyze and review that data. Costs associated with this reactive eDiscovery process can range from $3,000 to $30,000 a gigabyte, compared to roughly 20 cents per gigabyte for storage. The downstream eDiscovery costs are substantial, especially as organizations begin to deal in terabytes and petabytes.

Hence, plus ça change, plus c’est la même chose: we will see this trend continue as organizations push more valuable data into the archive and expire data that has no value. Multiple data sources have been collection sources for some time, but the ease of pulling everything into an archive is allowing for economies of scale and increased defensibility regarding data management. This will decrease the risks associated with litigation and compliance, as well as boost the value of companies.

The Malkovich-ization of Predictive Coding in eDiscovery

Tuesday, August 14th, 2012

In the 1999 Academy Award-nominated movie Being John Malkovich, there’s a scene where the eponymous character is transported into his own body via a portal and everyone around him looks exactly like him. All the characters can say is “Malkovich,” as if this single word conveys everything to everyone.

In the eDiscovery world it seems lately like predictive coding has been Malkovich-ized, in the sense that it’s the start and end of every discussion. We here at eDiscovery 2.0 are similarly unable to break free of predictive coding’s gravitational pull – but we’ve attempted to give the use of this emerging technology some context, in the form of a top ten list.

So, without further ado, here are the top ten important items to consider with predictive coding and eDiscovery generally…

1. Perfection Is Not Required in eDiscovery

While not addressing predictive coding per se, it’s important to understand the litmus test for eDiscovery efforts. Regardless of the tools or techniques utilized to respond to document requests in electronic discovery, perfection is not required. The goal should be to create a reasonable and repeatable process to establish defensibility in the event you face challenges by the court or an opposing party. Make sure the predictive coding application (and broader eDiscovery platform you choose) functions correctly, is used properly and can generate reports illustrating that a reasonable process was followed. Remember, making smart decisions to establish a repeatable and defensible process early will inevitably reduce the risk of downstream problems.

2. Predictive Coding Is Just One Tool in the Litigator’s Tool-belt

Although the right predictive coding tools can reduce the time and cost of document review and improve accuracy rates, they are not a substitute for other important technology tools. Keyword search, concept search, domain filtering, and discussion threading are only a few of the other important tools in the litigator’s tool-belt that can and should be used together with predictive coding. Invest in an eDiscovery platform that contains a wide range of seamlessly integrated eDiscovery tools that work together to ensure the simplest, most flexible, and most efficient eDiscovery process.

3. Using Predictive Coding Tools Properly Makes All the Difference

Electronic discovery applications, like most technology solutions, are only effective if deployed properly. Since many early-generation tools are not intuitive, learning how to use a given predictive coding tool properly is critical to eDiscovery success. To maximize chances for success and minimize the risk of problems, select trustworthy predictive coding applications supported by reputable providers and make sure to learn how to use the solutions properly.

4. Predictive Coding Isn’t Just for Big Cases

Sometimes predictive coding applications must be purchased separately from other eDiscovery tools; other times additional fees may be required to use predictive coding. As a result, many practitioners only consider predictive coding for the largest cases, to ensure the cost of eDiscovery doesn’t exceed the value of the case. If possible, invest in an electronic discovery solution that includes predictive coding as part of an integrated eDiscovery platform containing legal hold, collection, processing, culling, analysis, and review capabilities at no additional charge. Since the cost of using different predictive coding tools varies dramatically, make sure to select a tool at the right price point to maximize economic efficiencies across multiple cases, regardless of size.

5. Investigate the Solution Providers

All predictive coding applications are not created equal. The tools vary significantly in price, usability, performance and overall reputation. Although the availability of trustworthy and independent information comparing different predictive coding solutions is limited, information about the companies creating these different applications is available. Make sure to review independent research from analysts such as Gartner, Inc., as part of the vetting process instead of starting from scratch.

6. Test Drive Before You Buy

Savvy eDiscovery technologists take steps to ensure that the predictive coding application they are considering works within their organization’s environment and on their organization’s data. Product demonstrations are important, but testing products internally through a proof of concept evaluation is even more important if you are contemplating bringing an eDiscovery platform in house. Additionally, check company references before investing in a solution to find out how others feel about the software they purchased and the level of product support they receive.

7. Defensibility Is Paramount

Although predictive coding tools can save organizations money through increased efficiency, the relative newness and complexity of the technology can create risk. To mitigate this risk, choose a predictive coding tool that is easy to use, developed by an industry-leading company and fully supported.

8. Statistical Methodology and Product Training Are Critical

The underlying statistical methodology behind any predictive coding application is critical to the defensibility of the entire eDiscovery process. Many providers fail to incorporate a product workflow for selecting a properly sized control set in certain situations. Unfortunately, this oversight could unwittingly result in misrepresentations to the court and opposing parties about the system’s performance. Select providers capable of illustrating the statistical methodology behind their approach and that are capable of providing proper training on the use of their system.
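To make the control set point concrete, the basic sizing math is standard statistical sampling. The following Python sketch is a minimal illustration under simplifying assumptions (simple random sampling, the normal approximation, and a conservative 50% prevalence); it is not any particular vendor’s methodology, and real workflows must also account for how prevalence estimates feed recall and precision claims.

    import math

    def control_set_size(confidence=0.95, margin=0.02, prevalence=0.5):
        """Documents needed to estimate prevalence within +/- margin
        at the given confidence, via the normal approximation.
        prevalence=0.5 is the conservative worst case."""
        z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[confidence]
        n = (z ** 2) * prevalence * (1 - prevalence) / margin ** 2
        return math.ceil(n)

    print(control_set_size())  # 2401 documents for 95% confidence, +/-2%

Note that the required sample size depends on the margin of error and confidence level, not on the overall collection size, which is why even very large matters can be validated with a few thousand sampled documents.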

9. Transparency Is Key

Many practitioners are legitimately concerned that early-generation predictive coding solutions operate as a “black box,” meaning the way they work is difficult to understand and/or explain. Since it is hard to defend technology that is difficult to understand, selecting a solution and process that can be explained in court is critical. Make sure to choose a predictive coding solution that is transparent to avoid allegations by opponents that your tool is “black box” technology that cannot be trusted.

10. Align with Attorneys You Trust

The fact that predictive coding is relatively new to the legal field and can be more complex than traditional approaches to eDiscovery highlights the importance of aligning with trusted legal counsel. Most attorneys defer legal technology decisions to others on their legal team and have little practical experience using these solutions themselves. Conversational knowledge about these tools isn’t enough given the confusion, complexity, and risk related to selecting the wrong tool or using the applications improperly. Make sure to align with an attorney who possesses hands-on experience and who is able to articulate specific reasons for preferring a particular solution or approach.

Hopefully this top ten list can ensure that your use of “predictive coding” isn’t Malkovich-ized, meaning you understand when, how and why you’re deploying this particular eDiscovery technology. Without the right context, the eDiscovery industry risks overusing the term and in turn over-hyping this exciting next chapter in process improvement.

Gartner’s 2012 Magic Quadrant for E-Discovery Software Looks to Information Governance as the Future

Monday, June 18th, 2012

Gartner recently released its 2012 Magic Quadrant for E-Discovery Software, its annual report analyzing the state of the electronic discovery industry. Many vendors in the Magic Quadrant (MQ) may initially focus on their position and the juxtaposition of their competitive neighbors along the vision and execution axes. While that is a useful exercise, there are also a number of additional nuggets in the MQ, particularly regarding Gartner’s overview of the market, anticipated rates of consolidation and future market direction.

Context

For those of us who’ve been around the eDiscovery industry since its infancy, it’s gratifying to see the electronic discovery industry mature.  As Gartner concludes, the promise of this industry isn’t off in the future, it’s now:

“E-discovery is now a well-established fact in the legal and judicial worlds. … The growth of the e-discovery market is thus inevitable, as is the acceptance of technological assistance, even in professions with long-standing paper traditions.”

The past wasn’t always so rosy, particularly when the market was dominated by hundreds of service providers that seemed to hold on by maintaining a few key relationships, combined with relatively high margins.

“The market was once characterized by many small providers and some large ones, mostly employed indirectly by law firms, rather than directly by corporations. …  Purchasing decisions frequently reflected long-standing trusted relationships, which meant that even a small book of business was profitable to providers and the effects of customary market forces were muted. Providers were able to subsist on one or two large law firms or corporate clients.”

Consolidation

The Magic Quadrant correctly notes that these “salad days” just weren’t feasible long term. Gartner sees the pace of consolidation heating up even further, with some players striking it rich and some going home empty-handed.

“We expect that 2012 and 2013 will see many of these providers cease to exist as independent entities for one reason or another — by means of merger or acquisition, or business failure. This is a market in which differentiation is difficult and technology competence, business model rejuvenation or size are now required for survival. … The e-discovery software market is in a phase of high growth, increasing maturity and inevitable consolidation.”

Navigating these treacherous waters isn’t easy for eDiscovery providers, nor is it simple for customers to make purchasing decisions if they’re rightly concerned that the solution they buy today won’t be around tomorrow. Yet, despite the prognostication of an inevitable shakeout (Gartner forecasts that the market will shrink 25% in the raw number of firms claiming eDiscovery products/services), the firm is still very bullish about the sector.

“Gartner estimates that the enterprise e-discovery software market came to $1 billion in total software vendor revenue in 2010. The five-year CAGR to 2015 is approximately 16%.”
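As a quick sanity check on those figures, compounding $1 billion at a 16% annual rate for the five years to 2015 is straightforward arithmetic; the snippet below (Python, purely illustrative) shows the implied market size.

    base_2010 = 1.0e9                 # total software vendor revenue in 2010
    cagr = 0.16                       # Gartner's five-year CAGR to 2015
    projected_2015 = base_2010 * (1 + cagr) ** 5
    print(f"${projected_2015 / 1e9:.2f}B")  # ~$2.10B implied by 2015

In other words, Gartner’s estimates imply the market roughly doubling in five years, which helps explain the appetite for acquisitions.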

This certainly means there’s a window of opportunity for certain players, particularly those who help larger players fill out their EDRM suite of offerings, since the best-of-breed era is quickly going by the wayside. Gartner notes that end-to-end functionality is now table stakes in the eDiscovery space.

“We have seen a large upsurge in user requests for full-spectrum EDRM functionality. Whether that functionality will be used initially, or at all, remains an open question. Corporate buyers do seem minded to future-proof their investments in this way, by anticipating what they may wish to do with the software and the vendor in the future.”

Information Governance

Not surprisingly, it’s this “full-spectrum” functionality that most closely aligns with marrying the reactive, right side of the EDRM with the proactive, left side.  In concert, this yin and yang is referred to as information governance, and it’s this notion that’s increasingly driving buying behaviors.

“It is clear from our inquiry service that the desire to bring e-discovery under control by bringing data under control with retention management is a strategy that both legal and IT departments pursue in order to control cost and reduce risks. Sometimes the archiving solution precedes the e-discovery solution, and sometimes it follows it, but Gartner clients that feel the most comfortable with their e-discovery processes and most in control of their data are those that have put archiving systems in place …”

As Gartner looks out five years, the analyst firm anticipates more progress on the information governance front, because the “entire e-discovery industry is founded on a pile of largely redundant, outdated and trivial data.”  At some point this digital landfill is going to burst and organizations are finally realizing that if they don’t act now, it may be too late.

“During the past 10 to 15 years, corporations and individuals have allowed this data to accumulate for the simple reason that it was easy — if not necessarily inexpensive — to do so. … E-discovery has proved to be a huge motivation for companies to rethink their information management policies. The problem of determining what is relevant from a mass of information will not be solved quickly, but with a clear business driver (e-discovery) and an undeniable return on investment (deleting data that is no longer required for legal or business purposes can save millions of dollars in storage costs) there is hope for the future.”

 

The Gartner Magic Quadrant for E-Discovery Software is insightful for a number of reasons, not the least of which is how it portrays the developing maturity of the electronic discovery space. In just a few short years, the niche has sprouted wings, raced to $1B and is seeing massive consolidation. As we enter the next phase of maturation, we’ll likely see the sector morph into a larger, information governance play, given customers’ “full-spectrum” functionality requirements and the presence of larger, mainstream software companies.  Next on the horizon is the subsuming of eDiscovery into both the bigger information governance umbrella, as well as other larger adjacent plays like “enterprise information archiving, enterprise content management, enterprise search and content analytics.” The rapid maturation of the eDiscovery industry will inevitably result in growing pains for vendors and practitioners alike, but in the end we’ll all benefit.

 

About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Gartner’s “2012 Magic Quadrant for E-Discovery Software” Provides a Useful Roadmap for Legal Technologists

Tuesday, May 29th, 2012

Gartner has just released its 2012 Magic Quadrant for E-Discovery Software, an annual report that analyzes the state of the electronic discovery industry and provides a detailed vendor-by-vendor evaluation. For many, particularly those in IT circles, Gartner is an unwavering north star used to divine software market leaders, in areas ranging from business intelligence platforms to wireless LAN infrastructure. When IT professionals are on the cusp of procuring complex software, they look to analysts like Gartner for quantifiable and objective recommendations as a way to inform and buttress their own internal decision-making processes.

But for some in the legal technology field (particularly attorneys), looking to Gartner for software analysis can seem a bit foreign. Legal practitioners are often more comfortable with the “good ole days” when the only navigation aid in the eDiscovery world was provided by the dynamic duo of George Socha and Tom Gelbmann, who (beyond creating the EDRM) were pioneers of the first eDiscovery rankings survey. Albeit somewhat short-lived, their Annual Electronic Discovery[i] Survey ranked the hundreds of eDiscovery providers and bucketed the top-tier players in both software and litigation support categories. The scope of their mission was grand, and they were perhaps ultimately undone by the breadth of their task (stopping the Survey in 2010), particularly as the eDiscovery landscape continued to mature, fragment and evolve.

Gartner, which has perfected the analysis of emerging software markets, appears to have taken on this challenge with an admittedly more narrow (and likely more achievable) focus. Gartner published its first Magic Quadrant (MQ) for the eDiscovery industry last year, and in the 2012 Magic Quadrant for E-Discovery Software report they’ve evaluated the top 21 electronic discovery software vendors. As with all Gartner MQs, their methodology is rigorous; in order to be included, vendors must meet quantitative requirements in market penetration and customer base and are then evaluated upon criteria for completeness of vision and ability to execute.

By eliminating the legion of service providers and law firms, Gartner has made their mission both more achievable and perhaps (to some) less relevant. When talking to certain law firms and litigation support providers, some seem to treat the Gartner initiative (and subsequent Magic Quadrant) like a map from a land they never plan to visit. But, even if they’re not directly procuring eDiscovery software, the Gartner MQ should still be seen by legal technologists as an invaluable tool to navigate the perils of the often confusing and shifting eDiscovery landscape – particularly with the rash of recent M&A activity.

Beyond the quadrant positions[ii], comprehensive analysis and secular market trends, one of the key underpinnings of the Magic Quadrant is that the ultimate position of a given provider is in many ways an aggregate measurement of overall customer satisfaction. Similar in ways to the net promoter concept (a tool that gauges the loyalty of a firm’s customer relationships simply by asking how likely a customer is to recommend a product/service to a colleague), the Gartner MQ can be looked at as the sum total of all customer experiences.[iii] As such, this usage/satisfaction feedback is relevant even for parties that aren’t purchasing or deploying electronic discovery software per se. Outside counsel, partners, litigation support vendors and other interested parties may all end up interacting with a deployed eDiscovery solution (particularly when such solutions have expanded their reach as end-to-end information governance platforms), and they should want their chosen solution to be used happily and seamlessly in a given enterprise. There’s no shortage of stories about unhappy outside counsel, for example, complaining about being hamstrung by a slow, first-generation eDiscovery solution that ultimately makes their job harder (and riskier).

Next, the Gartner MQ is also a good shorthand way to understand more nuanced topics like time to value and total cost of ownership. While of course related to overall satisfaction, the Magic Quadrant does indirectly address whether the software does what it says it will (delivering on the promise) in the time frame claimed (delivering that promise reasonably quickly), since these elements are typically subsumed in the satisfaction metric. This kind of detail is disclosed in the numerous interviews that Gartner conducts to go behind the scenes, querying usage and overall satisfaction.

While no navigation aid ensures that a traveler won’t get lost, the Gartner Magic Quadrant for E-Discovery Software is a useful map of the electronic discovery software world. And, particularly looking at year-over-year trends, the MQ provides a useful way for legal practitioners (beyond the typical IT users) to get a sense of the electronic discovery market landscape as it evolves and matures. After all, staying on top of the eDiscovery industry has a range of benefits beyond just software procurement.

Please register here to access the Gartner Magic Quadrant for E-Discovery Software.

About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.



[i] Note, in the good ole days folks still used two words to describe eDiscovery.

[ii] Gartner has a proprietary matrix that it uses to place the entities into four quadrants: Leaders, Challengers, Visionaries and Niche Players.

[iii] Under the Ability to Execute axis Gartner weighs a number of factors including “Customer Experience: Relationships, products and services or programs that enable clients to succeed with the products evaluated. Specifically, this criterion includes implementation experience, and the ways customers receive technical support or account support. It can also include ancillary tools, the existence and quality of customer support programs, availability of user groups, service-level agreements and so on.”

Morton’s Fork, Oil Filters and the Nexus with Information Governance

Thursday, May 10th, 2012

Those old enough to have watched TV in the early eighties will undoubtedly remember the FRAM oil filter slogan, where the mechanic utters his iconic catchphrase: “You can pay me now, or pay me later.”  The gist of the vintage ad was that the customer could either pay a small sum now for the replacement of an oil filter, or a far greater sum later for the replacement of the car’s entire engine.

This choice between two unpleasant alternatives is sometimes called a Morton’s Fork (but typically only when both choices are equal in difficulty).  The saying (not to be confused with the equally colorful Hobson’s Choice) apparently originated with the collection of taxes by John Morton (the Archbishop of Canterbury) in the late 15th century.  Morton was apparently fond of saying that a man living modestly must be saving money and could therefore afford to pay taxes, whereas if he was living extravagantly then he was obviously rich and could still afford them.[i]

This “pay me now/pay me later” scenario perplexes many of today’s organizations as they try to effectively govern (i.e., understand, discover and retain) electronically stored information (ESI).  The challenge is similar to the oil filter conundrum, in that companies can often make rather modest up-front retention/deletion decisions that help prevent monumental, downstream eDiscovery charges.

This exponential gap has been illustrated recently by a number of surveys contrasting the cost of storage with the cost of conducting basic eDiscovery tasks (such as preservation, collection, processing, review and production).  In a recent AIIM webcast it was noted that “it costs about 20¢/day to buy 1GB of storage, but it costs around $3,500 to review that same gigabyte of storage.” And it turns out that the $3,500 review estimate (which sounds prohibitively expensive, particularly at scale) may actually be on the conservative side.  Since the review phase accounts for roughly 70 percent of total eDiscovery costs, the remaining 30 percent of upstream costs for preservation, collection and processing pushes the total toward $5,000 per gigabyte.

Similarly, in a recent Enterprise Strategy Group (ESG) paper the authors noted that eDiscovery costs range anywhere from $5,000 to $30,000 per gigabyte, citing the Minnesota Journal of Law, Science & Technology.  This $30,000 figure is also roughly in line with other per-gigabyte eDiscovery costs, according to a recent survey by the RAND Corporation.  In an article entitled “Where the Money Goes — Understanding Litigant Expenditures for Producing Electronic Discovery” authors Nicholas M. Pace and Laura Zakaras conducted an extensive analysis and concluded that “… the total costs per gigabyte reviewed were generally around $18,000, with the first and third quartiles in the 35 cases with complete information at $12,000 and $30,000, respectively.”

Given this range of estimates, the $18,000-per-gigabyte metric is probably a good midpoint figure that advocates of information governance can use to contrast with the exponentially lower baseline costs of buying and maintaining storage.  It is this stark (and startling) gap between pure information costs and the expenses of eDiscovery that shows how important it is to calculate latent “information risk.”  If you also add in the risk of sanctions due to spoliation, the true (albeit still murky) information risk portrait comes into focus.  It is this calculation that is missing when legal goes to bat to argue for the necessity of information governance solutions, particularly when faced with the host of typical objections (“storage is cheap” … “keep everything” … “there’s no ROI for proactive information governance programs”).
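To make the gap concrete, a back-of-the-envelope calculation along the following lines can be useful. The Python sketch below is purely illustrative and uses the figures cited above; the assumption that only some fraction of a data store ever becomes subject to review (here 10%) is hypothetical and will vary by organization.

    STORAGE_COST_PER_GB = 0.20       # nominal storage purchase cost cited above
    REVIEW_COST_PER_GB = 18_000.00   # RAND midpoint for eDiscovery review

    def latent_information_risk(gigabytes, pct_reviewable=0.10):
        """Contrast carrying cost with potential eDiscovery exposure,
        assuming only a fraction of the store is ever reviewed."""
        storage = gigabytes * STORAGE_COST_PER_GB
        exposure = gigabytes * pct_reviewable * REVIEW_COST_PER_GB
        return storage, exposure

    storage, exposure = latent_information_risk(1_000)   # a 1TB file share
    print(f"storage: ${storage:,.0f} vs. potential review: ${exposure:,.0f}")
    # storage: $200 vs. potential review: $1,800,000

Even under conservative assumptions, the latent review exposure dwarfs the carrying cost of the storage itself, which is precisely the gap information governance advocates need to quantify.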

The good news is that as the eDiscovery market continues to evolve, practitioners (legal and IT alike) will come to a better and more holistic understanding of the latent information risk costs that the unchecked proliferation of data causes.  It will be this increased level of transparency that permits the budding information governance trend to become a dominant umbrella concept that unites Legal and IT.



[i] Insert your own current political joke here…

Look Before You Leap! Avoiding Pitfalls When Moving eDiscovery to the Cloud

Monday, May 7th, 2012

It’s no surprise that the eDiscovery frenzy gripping the American legal system over the past decade has become increasingly expensive.  Particularly costly to organizations is the process of preserving and collecting documents, a fact repeatedly emphasized by the Advisory Committee in its report regarding the 2006 amendments to the Federal Rules of Civil Procedure (FRCP).  These aspects of discovery are often lengthy and can be disruptive to business operations.  Just as troubling, they increase the duration and expense of litigation.

Because these costs and delays affect the courts as well as clients, it comes as no surprise that judges have now heightened their expectations for how organizations store, manage and discover their electronically stored information (ESI).  Gone are the days when enterprises could plead ignorance for not preserving or producing their data in an efficient, cost-effective and defensible manner.  Organizations must now follow best practices – both during and before litigation – if they are to safely navigate the stormy seas of eDiscovery.

The importance of deploying such practices applies acutely to those organizations that are exploring “cloud”-based alternatives to traditional methods for preserving and producing electronic information.  Under the right circumstances, the cloud may represent a fantastic opportunity to streamline the eDiscovery process for an organization.  Yet it could also turn into a dangerous liaison if the cloud offering is not properly scrutinized for basic eDiscovery functionality.  Indeed, the City of Los Angeles’s recent decision to partially disengage from its cloud service provider exemplifies this admonition to “look before you leap” to the cloud.  Thus, before selecting a cloud provider for eDiscovery, organizations should be particularly careful to ensure that a provider has the ability both to efficiently retrieve data from the cloud and to issue litigation hold notices.

Effective Data Retrieval Requires Efficient Data Storage

The hype surrounding the cloud has generally focused on the opportunity for cheap and unlimited storage of information.  Storage, however, is only one of many factors to consider in selecting a cloud-based eDiscovery solution.  To be able to meet the heightened expectations of courts and regulatory bodies, organizations must have the actual – not theoretical – ability to retrieve their data in real time.  Otherwise, they may not be able to satisfy eDiscovery requests from courts or regulatory bodies, let alone the day-to-day demands of their operations.

A key step to retrieving company data in a timely manner is to first confirm whether the cloud offering can intelligently organize that information such that organizations can quickly respond to discovery requests and other legal demands.  This includes the capacity to implement and observe company retention protocols.  Just like traditional data archiving software, the cloud must enable automated retention rules and thus limit the retention of information to a designated time period.  This will enable data to be expired once it reaches the end of that period.

The pool of data can be further decreased through single instance storage.  This deduplication technology eliminates redundant data by preserving only a master copy of each document placed into the cloud.  This will reduce the amount of data that needs to be identified, preserved, collected and reviewed as part of any discovery process.  For while unlimited data storage may seem ideal now, reviewing unlimited amounts of data will quickly become a logistical and costly nightmare.
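Single-instance storage is typically built on content hashing: compute a digest of each item, keep one master copy per unique digest, and store later copies as references. The Python sketch below is a minimal illustration of that idea, not any particular archive’s implementation.

    import hashlib

    class SingleInstanceStore:
        """Keep one master copy per unique content; duplicate
        submissions become references to the existing master."""
        def __init__(self):
            self.masters = {}   # content digest -> stored content
            self.refs = {}      # document id -> content digest

        def add(self, doc_id, content: bytes):
            digest = hashlib.sha256(content).hexdigest()
            if digest not in self.masters:
                self.masters[digest] = content   # first copy becomes the master
            self.refs[doc_id] = digest           # later copies are deduplicated

    store = SingleInstanceStore()
    store.add("msg-1", b"Q3 contract draft")
    store.add("msg-2", b"Q3 contract draft")     # duplicate adds no new master
    print(len(store.masters), "master copy for", len(store.refs), "documents")

A production archive layers chunking, compression and integrity checks on top, but the deduplication principle, and the resulting reduction in reviewable volume, is the same.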

Any viable cloud offering should also have the ability to suspend automated document retention/deletion rules to ensure the adequate preservation of relevant information.  This goes beyond placing a hold on archival data in the cloud.  It requires that an organization have the ability to identify the data sources in the cloud that may contain relevant information and then modify aspects of its retention policies to ensure that cloud-stored data is retained for eDiscovery.  Taking this step will enable an organization to create a defensible document retention strategy and be protected from court sanctions under the Federal Rule of Civil Procedure 37(e) “safe harbor.”  The decision from Viramontes v. U.S. Bancorp (N.D. Ill. Jan. 27, 2011) is particularly instructive on this issue.

In Viramontes, the defendant bank defeated a sanctions motion because it timely modified aspects of its email retention policy.  The bank implemented a policy that kept emails for 90 days, after which the emails were deleted.  That policy was promptly suspended, however, once litigation was reasonably foreseeable.  Because the bank followed that procedure in good faith, it was protected from sanctions under Rule 37(e).

As the Viramontes case shows, an organization can be prepared for eDiscovery disputes by appropriately suspending aspects of its document retention policies.  By creating and then faithfully observing a policy that requires retention policies be suspended on the occurrence of litigation or other triggering event, an organization can develop a defensible retention procedure. Having such eDiscovery functionality in a cloud provider will likely facilitate an organization’s eDiscovery process and better insulate it from litigation disasters.
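The retention-suspension logic described above reduces to a simple rule: expiry runs on schedule unless a hold applies, and holds always win. The sketch below is a hypothetical illustration in Python (the 90-day period echoes the Viramontes email policy; the field names are invented for the example).

    from datetime import datetime, timedelta

    RETENTION = timedelta(days=90)    # e.g., a 90-day email policy as in Viramontes

    def expire(documents, held_custodians, now=None):
        """Delete documents past retention unless a litigation hold
        covers their custodian -- holds always trump expiry."""
        now = now or datetime.utcnow()
        kept = []
        for doc in documents:
            expired = now - doc["created"] > RETENTION
            on_hold = doc["custodian"] in held_custodians
            if expired and not on_hold:
                continue              # eligible for defensible deletion
            kept.append(doc)
        return kept

The point is less the code than the policy it encodes: deletion happens through a routine, good-faith process that is demonstrably suspended once litigation is reasonably foreseeable.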

The Ability to Issue Litigation Hold Notices

To be effective for eDiscovery purposes, a cloud service provider must also enable an organization to deploy a litigation hold to prevent users from destroying data. Unless the cloud has litigation hold technology, the entire discovery process may very well collapse.  For electronic data to be produced in litigation, it must first be preserved.  And it cannot be preserved if the key players or data source custodians are unaware that such information must be retained.  Indeed, employees and data sources may discard and overwrite electronically stored information if they are oblivious to a preservation duty.

A cloud service provider should therefore enable automated legal hold acknowledgements.  Such technology will allow custodians to be promptly and properly notified of litigation and thereby retain information that might otherwise have been discarded.  Inadequate litigation hold technology leaves organizations vulnerable to data loss and court punishment.
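Mechanically, automated legal hold acknowledgements amount to tracking who has been notified, who has confirmed, and who needs escalation. A minimal illustrative sketch follows (not any provider’s actual API).

    from datetime import datetime

    class LegalHold:
        """Track hold notices and custodian acknowledgements."""
        def __init__(self, matter):
            self.matter = matter
            self.notified = {}       # custodian -> time notice was sent
            self.acknowledged = {}   # custodian -> time custodian confirmed

        def notify(self, custodian):
            self.notified[custodian] = datetime.utcnow()
            # a real system would send the notice and schedule reminders here

        def acknowledge(self, custodian):
            self.acknowledged[custodian] = datetime.utcnow()

        def outstanding(self):
            """Custodians who have not yet confirmed the hold."""
            return [c for c in self.notified if c not in self.acknowledged]

An audit trail of these notices and acknowledgements is often what stands between an organization and sanctions when preservation is later challenged.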

Conclusion

Confirming that a cloud offering can quickly retrieve and efficiently store enterprise data while effectively deploying litigation hold notices will likely address the basic concerns regarding its eDiscovery functionality. Yet these features alone will not make a solution the model eDiscovery cloud offering. Advanced search capabilities should also be included to reduce the amount of data that must be analyzed and reviewed downstream. In addition, the cloud ought to support load files in compatible formats for export to third-party review software. The cloud should additionally provide an organization with a clear audit trail establishing that neither its documents nor their metadata were modified when transmitted to the cloud. Without this assurance, an organization may not be able to comply with key regulations or establish the authenticity of its data in court. Finally, ensure that these provisions are memorialized in the service level agreement governing the relationship between the organization and the cloud provider.

First State Court Issues Order Approving the Use of Predictive Coding

Thursday, April 26th, 2012

On Monday, Virginia Circuit Court Judge James H. Chamblin issued what appears to be the first state court Order approving the use of predictive coding technology for eDiscovery. On Tuesday, Law Technology News reported that Judge Chamblin issued the two-page Order in Global Aerospace Inc., et al. v. Landow Aviation, L.P. dba Dulles Jet Center, et al., over Plaintiffs’ objection that traditional manual review would yield more accurate results. The case stems from the collapse of three hangars at the Dulles Jet Center (“DJC”) during a major snow storm on February 6, 2010. The Order was issued at Defendants’ request after opposing counsel objected to their proposed use of predictive coding technology to “retrieve potentially relevant documents from a massive collection of electronically stored information.”

In Defendants’ Memorandum in Support of their motion, they argue that a first pass manual review of approximately two million documents would cost two million dollars and only locate about sixty percent of all potentially responsive documents. They go on to state that keyword searching might be more cost-effective “but likely would retrieve only twenty percent of the potentially relevant documents.” On the other hand, they claim predictive coding “is capable of locating upwards of seventy-five percent of the potentially relevant documents and can be effectively implemented at a fraction of the cost and in a fraction of the time of linear review and keyword searching.”

In their Opposition Brief, Plaintiffs argue that Defendants should produce “all responsive documents located upon a reasonable inquiry,” and “not just the 75%, or less, that the ‘predictive coding’ computer program might select.” They also characterize Defendants’ request to use predictive coding technology instead of manual review as a “radical departure from the standard practice of human review” and point out that Defendants cite no case in which a court compelled a party to accept a document production selected by a “’predictive coding’ computer program.”

Considering predictive coding technology is new to eDiscovery and first generation tools can be difficult to use, it is not surprising that both parties appear to frame some of their arguments curiously. For example, Plaintiffs either mischaracterize or misunderstand Defendants’ proposed workflow given their statement that Defendants want a “computer program to make the selections for them” instead of having “human beings look at and select documents.” Importantly, predictive coding tools require human input for a computer program to “predict” document relevance. Additionally, the proposed approach includes an additional human review step prior to production that involves evaluating the computer’s predictions.

On the other hand, some of Defendants’ arguments also seem to stray a bit off course. For example, Defendants seem to unduly minimize the value of using other tools in the litigator’s tool belt, like keyword search or topic grouping, to cull data prior to using potentially more expensive predictive coding technology. To broadly state that keyword searching “likely would retrieve only twenty percent of the potentially relevant documents” seems to ignore two facts. First, keyword search for eDiscovery is not dead. To the contrary, keyword searches can be an effective tool for broadly culling data prior to manual review and for conducting early case assessments. Second, the success of keyword searches and other litigation tools depends as much on the end user as on the technology. In other words, the carpenter is just as important as the hammer.

The Order issued by Judge Chamblin, the current Chief Judge for the 20th Judicial Circuit of Virginia, states that “Defendants shall be allowed to proceed with the use of predictive coding for purposes of the processing and production of electronically stored information.”  In a handwritten notation, the Order further provides that the processing and production is to be completed within 120 days, with “processing” to be completed within 60 days and “production to follow as soon as practicable and in no more than 60 days.” The Order does not mention whether the parties are required to agree upon a protocol, an issue that has plagued the court and the parties in the ongoing Da Silva Moore, et al. v. Publicis Groupe, et al. for months.

Global Aerospace is the third known predictive coding case on record, but appears to present yet another set of unique legal and factual issues. In Da Silva Moore, Judge Andrew Peck of the Southern District of New York rang in the New Year by issuing the first known court order endorsing the use of predictive coding technology.  In that case, the parties agreed to the use of predictive coding technology, but continue to fight like cats and dogs to establish a mutually agreeable protocol.

Similarly, in the 7th Federal Circuit, Judge Nan Nolan is tackling the issue of predictive coding technology in Kleen Products, LLC, et al. v. Packaging Corporation of America, et al. In Kleen, Plaintiffs essentially ask that Judge Nolan order Defendants to redo their production even though Defendants have spent thousands of hours reviewing documents, have already produced over a million documents, and their review is over 99 percent complete. The parties have already presented witness testimony in support of their respective positions over the course of two full days, and more testimony may be required before Judge Nolan issues a ruling.

What is interesting about Global Aerospace is that Defendants proactively sought court approval to use predictive coding technology over Plaintiffs’ objections. This scenario is different than Da Silva Moore because the parties in Global Aerospace have not agreed to the use of predictive coding technology. Similarly, it appears that Defendants have not already significantly completed document review and production as they had in Kleen Products. Instead, the Global Aerospace Defendants appear to have sought protection from the court before moving full steam ahead with predictive coding technology and they have received the court’s blessing over Plaintiffs’ objection.

A key issue that the Order does not address is whether the parties will be required to decide on a mutually agreeable protocol before proceeding with the use of predictive coding technology. As stated earlier, the inability to define such a protocol is a key issue that has plagued the court and the parties for months in Da Silva Moore, et al. v. Publicis Groupe, et al. Similarly, in Kleen, the court was faced with issues related to the protocol for using technology tools. Both cases highlight the fact that regardless of which eDiscovery technology tools are selected from the litigator’s tool belt, the tools must be used properly in order for discovery to be fair.

Judge Chamblin left the barn door wide open for Plaintiffs to lodge future objections, perhaps setting the stage for yet another heated predictive coding battle. Importantly, the Judge issued the Order “without prejudice to a receiving party” and notes that parties can object to the “completeness or the contents of the production or the ongoing use of predictive coding technology.”  Given the ongoing challenges in Da Silva Moore and Kleen, don’t be surprised if the parties in Global Aerospace face some of the same process-based challenges as their predecessors. Hopefully some of the early challenges related to the use of first-generation predictive coding tools can be overcome as case law continues to develop and as next-generation predictive coding tools become easier to use. Stay tuned as the facts, testimony, and arguments in the Da Silva Moore, Kleen Products, and Global Aerospace cases continue to evolve.

Proportionality Demystified: How Organizations Can Get eDiscovery Right by Following Four Key Principles

Tuesday, April 17th, 2012

Talk to most any organization about legal issues and invariably the subject of eDiscovery will be raised. The skyrocketing costs and lengthy delays associated with data preservation and document review provide ample justification for organizations to be on the alert about eDiscovery. While these costs and delays tend to make the eDiscovery landscape appear bleak, a positive development on this front is emerging for organizations. That development is the emphasis that many courts are now placing on “proportionality” for addressing eDiscovery disputes.

Though initially embraced by only a few cognoscenti after the 1983 and 2000 amendments to the Federal Rules of Civil Procedure (FRCP), proportionality standards are now being championed by various district and circuit courts. As more opinions are issued analyzing proportionality, several key principles are becoming apparent in this developing body of jurisprudence. To better understand these principles, it is instructive to review some of the top proportionality cases issued this year and last. These cases provide a roadmap of best practices that, if followed, will help courts, clients and counsel reduce the costs and burdens connected with eDiscovery.

1. Discourage Unnecessary Discovery

Case: Bottoms v. Liberty Life Assur. Co. of Boston (D. Colo. Dec. 13, 2011)

Summary: The court dramatically curtailed the written discovery that plaintiff sought to propound on the defendant. Plaintiff had requested leave in this ERISA action to serve “sweeping” interrogatories and document requests to resolve the limited issue of whether the defendant had improperly denied her long term disability benefits. Drawing on the proportionality standards under Federal Rule 26(b)(2)(C), the court characterized the proposed discovery as “patently overbroad” and as seeking materials that were “largely irrelevant.” The court ultimately ordered the defendant to respond to some aspects of plaintiff’s interrogatories and document demands, but not before limiting their nature and scope.

Proportionality Principle No. 1: The Bottoms case emphasizes what courts have been advocating for years: that organizations should do away with unnecessary discovery. That does not mean “robotically recycling discovery requests propounded in earlier actions.” Instead, counsel must “stop and think” to ensure that its discovery is narrowly tailored in accordance with Rule 26(b)(2)(C). As Bottoms teaches, “the responsibility for conducting discovery in a reasonable, proportionate manner rests in the first instance with the parties and their attorneys.”

2. Encourage Reasonable Discovery Efforts

Case: Larsen v. Coldwell Banker Real Estate Corp. (C.D. Cal. Feb. 2, 2012)

Summary: In Larsen, the court rejected the plaintiffs’ assertion that the defendants should be made to redo their production of 9,000 pages of documents. The plaintiffs had argued that re-production of the documents was necessary to address certain discrepancies – including missing emails – in the production. The court disagreed, holding instead that plaintiffs had failed to establish that such discrepancies had “prevented them in any way from obtaining information relevant to a claim or defense under Fed.R.Civ.P. 26(b)(1).”

The court also reasoned that a “do over” would violate the principles of proportionality codified in Rule 26(b)(2)(C). After reciting the proportionality language from Rule 26 and referencing The Sedona Principles, the court determined that “the burden and expense to Defendants in completely reproducing its entire ESI production far outweighs any possible benefit to Plaintiffs.” There were too few discrepancies identified to justify the cost of redoing the production.

Proportionality Principle No. 2: The Larsen decision provides a simple reminder that organizations’ discovery efforts must be reasonable, not perfect. This reminder bears repeating, as litigants frequently use eDiscovery sideshows to leverage lucrative settlements without having to address the merits of their claims or defenses. Such a practice, which courts have likened to a “cancerous growth” given its destructive nature, underscores that discovery devices should be used to “facilitate litigation rather than as weapons to wage litigation.” Calcor Space Facility, Inc. v. Superior Court, 53 Cal.App.4th 216, 221 (1997). Similar to the theme raised in our post regarding the predictive coding dispute in the Kleen Products case, principles of proportionality rightly emphasize the reasonable nature of parties’ obligations in discovery.

3. Discourage Dilatory Discovery Tactics

Case: Escamilla v. SMS Holdings Corporation (D. Minn. Oct. 21, 2011)

Summary: The court rejected an argument that proportionality standards should excuse the individual defendant from paying for additional discovery ordered by the court. The defendant essentially argued that Rule 26(b)(2)(C)(iii) foreclosed the ordered discovery given his limited financial resources. This position was unavailing, however, given that “the burden and expense of this discovery was self-inflicted by [the defendant].” As it turns out, the ordered discovery was necessary to address issues created in the litigation by the defendant’s failure to preserve relevant evidence. Moreover, there were no alternative means available for obtaining the sought-after materials. Given the unique nature of the evidence and the defendant’s misconduct, the court held that the “burden of the additional discovery [did] not outweigh its likely benefit.”

Proportionality Principle No. 3: The Escamilla decision reinforces a common refrain among proportionality cases: that proportionality is foreclosed to those parties who create their own burdens. Like the defense of unclean hands, proportionality essentially requires a litigant to approach the court with a clean slate of conduct in discovery. This is confirmed by The Sedona Conference Comment on Proportionality in Electronic Discovery, which declares that “[c]ourts should disregard any undue burden or expense that results from a responding party’s own conduct or delay.”

4. Encourage Better Information Governance Practices

Case: Salamone v. Carter’s Retail, Inc. (D.N.J. Jan. 28, 2011)

Summary: The court denied a motion for protective order that the defendant clothing retailer filed to stave off the collection and analysis of over 13,000 personnel files. The retailer had argued that proportionality precluded the search and review of the personnel files. In support of its argument, the retailer asserted that the nature, format, location and organization of the records made their review and production too burdensome: “‘the burden of production . . . outweigh[s] any benefit to plaintiffs’ considering the ‘disorganization of the information, the lack of accessible format, the significant amount of labor and costs involved, and defendant’s management structure.’”

In rejecting the retailer’s position, the court criticized its information retention system as the culprit for its burdens. That the retailer “maintains personnel files in several locations without any uniform organizational method,” the court reasoned, “does not exempt Defendant from reasonable discovery obligations.” After weighing the various factors that comprise the proportionality analysis under Rule 26(b)(2)(C), the court concluded that the probative value of production outweighed the resulting burden and expense on the retailer.

Proportionality Principle No. 4: An intelligent information governance process could have prevented the cost and logistics headaches that the retailer faced. Had the records at issue been digitized and maintained in a central archive, the retailer’s collection burdens would have been substantially reduced. Furthermore, integrating these “upstream” data retention protocols with “downstream” eDiscovery processes could have expedited the review process. The Salamone case teaches that an integrated information governance process, supported by effective, enabling technologies, will likely help organizations reach the objectives of proportionality by reducing discovery burdens and making them more commensurate with the demands of litigation.

Conclusion

The foregoing cases exemplify how proportionality principles can help lawyers and litigants conduct eDiscovery in an efficient and cost-effective manner. And by faithfully observing these standards, courts, clients and counsel can better fulfill the mandate of Federal Rule 1 “to secure the just, speedy, and inexpensive determination of every action and proceeding.”

The eDiscovery “Passport”: The First Step to Succeeding in International Legal Disputes

Monday, April 2nd, 2012

The increase in globalization continues to erase borders throughout the world economy. Organizations now routinely conduct business in countries that were previously unknown to their industry vertical.  The trend of global integration is certain to increase, with reports such as the Ernst & Young 2011 Global Economic Survey confirming that 74% of companies believe that globalization, particularly in emerging markets, is essential to their continued vitality.

Not surprisingly, this trend of global integration has also led to a corresponding increase in cross-border litigation. For example, parties to U.S. litigation are increasingly seeking discovery of electronically stored information (ESI) from other litigants and third parties located in Continental Europe and the United Kingdom. Since traditional methods under the Federal Rules of Civil Procedure (FRCP) may be unacceptable for discovering ESI in those forums, the question then becomes how such information can be obtained.

At this point, many clients and their counsel are unaware of how to safely navigate these international waters. For much of Europe, the short answer would be to resort to the Hague Convention of March 18, 1970 on the Taking of Evidence Abroad in Civil or Commercial Matters (Hague Convention). Simply referring to the Hague Convention, however, would ignore the complexities of electronic discovery in Europe. Worse, it would sidestep the glaring knowledge gap that exists in the United States regarding the cultural differences distinguishing European litigation from American proceedings.

The ability to bridge this gap with an awareness of the discovery processes in Europe is essential. Understanding that process is akin to holding a valid passport for international travel. Just as a passport is required for travelers to cross into foreign lands, an “eDiscovery Passport™” is necessary for organizations to effectively conduct cross-border discovery.

The Playing Field for eDiscovery in Continental Europe

Litigation in Continental Europe is culturally distinct from American court proceedings. “Discovery,” as it is known in the United States, does not exist in Europe. Interrogatories, categorical document requests and requests for admissions are simply unavailable as European discovery devices. Instead, European countries generally allow only a limited exchange of documents, with parties typically disclosing only the information that supports their claims.

The U.S. Court of Appeals for the Seventh Circuit recently commented on this key distinction between European and American discovery when it observed that “the German legal system . . . does not authorize discovery in the sense of Rule 26 of the Federal Rules of Civil Procedure.” The court went on to explain that “[a] party to a German lawsuit cannot demand categories of documents from his opponent. All he can demand are documents that he is able to identify specifically—individually, not by category.” Heraeus Kulzer GmbH v. Biomet, Inc., 633 F.3d 591, 596 (7th Cir. 2011).

Another key distinction of discovery in Continental Europe is the lack of rules or case law requiring the preservation of ESI or paper documents. This stands in sharp contrast to American jurisprudence, which typically requires organizations to preserve information as soon as they reasonably anticipate litigation. See, e.g., Micron Technology, Inc. v. Rambus Inc., 645 F.3d 1311, 1320 (Fed. Cir. 2011). In Europe, while an implied preservation duty could arise if a court ordered the disclosure of certain materials, the penalties for non-compliance are typically not as severe as those issued by American courts.

Only the nations of the United Kingdom, from which American notions of litigation are derived, impose discovery obligations similar to those in the United States. For example, in the combined legal system of England and Wales, a party must disclose to the other side information adverse to its claims. The courts of England and Wales have also suggested that parties take affirmative steps to prepare for disclosure. According to the High Court in Earles v Barclays Bank Plc [2009] EWHC 2500 (Mercantile) (08 October 2009), this includes having “an efficient and effective information management system in place to provide identification, preservation, collection, processing, review analysis and production of its ESI in the disclosure process in litigation and regulation.” For organizations looking to better address these issues, a strategic and intelligent information governance plan offers perhaps the best chance of doing so.

Hostility to International Discovery Requests

Despite some similarities between the U.S. and the U.K., Europe as a whole retains a certain amount of cultural hostility to pre-trial discovery. Given this fact, it should come as no surprise that international eDiscovery requests made pursuant to the Hague Convention are frequently denied. Requests are often rejected because they are overly broad.  In addition, some countries such as Italy simply refuse to honor requests for pre-trial discovery from common law countries like the United States. Moreover, other countries like Austria are not signatories to the Hague Convention and will not accept requests made pursuant to that treaty. To obtain ESI from those countries, litigants must take their chances with the cumbersome and time-consuming process of submitting letters rogatory through the U.S. State Department. Finally, requests for information that seek email or other “personal information” (i.e., information that could be used to identify a person) must additionally satisfy a patchwork of strict European data protection rules.

Obtaining an eDiscovery Passport

This backdrop of complexity underscores the need for both lawyers and laymen to understand the basic principles governing eDisclosure in Europe. Such a task should not be seen as daunting. There are resources that provide straightforward answers to these issues at no cost to the end-user. For example, Symantec has just released a series of eDiscovery Passports™ that touch on the basic issues underlying disclosure and data privacy in the United Kingdom, France, Germany, Holland, Belgium, Austria, Switzerland, Italy and Spain. Organizations such as The Sedona Conference have also made available materials that provide significant detail on these issues, including its recently released International Principles on Discovery, Disclosure and Data Protection.

These resources can provide valuable information to clients and counsel alike and better prepare litigants for the challenges of pursuing legal rights across international boundaries. By so doing, organizations can moderate the effects of legal risk and more confidently pursue their globalization objectives.