
Archive for the ‘regulatory inquiries’ Category

Take Two and Call me in the Morning: U.S. Hospitals Need an Information Governance Remedy

Wednesday, April 11th, 2012

Given the vast amount of sensitive information hospitals handle and the legal exposure they face today, it’s a mystery why these organizations aren’t taking advantage of enabling technologies to minimize risk. Compliance with both HIPAA and the HITECH Act is often achieved by manual, ad hoc methods, which are hazardous at best. In the past, state and federal auditing environments have not been very aggressive in ensuring compliance, but that is changing. While many hospitals have invested in high tech records management systems (EMR/EHR), those systems do not encompass the entire information and data environment within a hospital. Sensitive information often finds its way into and onto systems outside the reach of EMR/EHR systems, bringing with it increased exposure to security breaches and legal liability.

This information overload often metastasizes into email (both hospital and personal), attachments, portable storage devices, file, web and development servers, desktops and laptops, home or affiliated practice’s computers and mobile devices such as iPads and smart phones. These avenues for the dissemination and receipt of information expand the information governance challenge and data security risks. Surprisingly, the feedback from the healthcare sector suggests that hospitals rarely get sued in federal court.

One place hospitals do not want to be is the “Wall of Shame,” otherwise known as the HHS website that, as of June 9, 2011, has detailed 281 Health Insurance Portability and Accountability Act (HIPAA) breaches, each affecting more than 500 individuals. Overall, physical theft and loss accounted for about 63% of the reported breaches. Unauthorized access / disclosure accounted for another 16%, while hacking was only 6%. While Software Advice reasons that these statistics indicate physical theft has been the cause of the majority of breaches, it should also be considered that, lacking data loss prevention technology, many hospitals are unaware of breaches that have occurred and therefore cannot report them.

There are myriad reasons hospitals aren’t landing on the front page of the newspaper with the same frequency as other businesses and government agencies when it comes to security breaches, document retention lapses and eDiscovery blunders. But the underlying contagion is not contained, and it certainly is not benign. Feedback from the field reveals some alarming symptoms of the unhealthy state of healthcare information governance, including:

  • uncontrolled .pst files
  • exploding storage growth
  • missing or incomplete data retention rules
  • doctors/nurses storing and sending sensitive data via their personal email, iPads and smartphones
  • encryption rules that rely on individuals to determine what to encrypt
  • data backup policies that differ from data retention and information governance rules
  • little to no compliance training
  • many times non-existent data loss prevention efforts.

This results in the need for more storage, while creating larger legal liability, an indefensible eDiscovery posture, and the risk of breach.
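
For IT and compliance teams wondering where to begin, the first symptom on the list above, uncontrolled .pst files, is also one of the easiest to measure. The following is a minimal sketch (not a production tool) that walks a file share and inventories .pst files by size and last-modified date; the share path and output format are hypothetical placeholders.

```python
import os
from datetime import datetime

# Hypothetical root of a departmental file share; replace with a real UNC path or mount point.
SHARE_ROOT = r"/mnt/hospital_share"

def inventory_pst_files(root):
    """Walk a file tree and report every .pst file with its size and last-modified date."""
    findings = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".pst"):
                path = os.path.join(dirpath, name)
                try:
                    stat = os.stat(path)
                except OSError:
                    continue  # skip files we cannot read
                findings.append({
                    "path": path,
                    "size_mb": round(stat.st_size / (1024 * 1024), 1),
                    "modified": datetime.fromtimestamp(stat.st_mtime).date().isoformat(),
                })
    return findings

if __name__ == "__main__":
    # Largest files first, since they tend to carry the most risk and storage cost.
    for item in sorted(inventory_pst_files(SHARE_ROOT), key=lambda f: -f["size_mb"]):
        print(f'{item["size_mb"]:>10.1f} MB  {item["modified"]}  {item["path"]}')
```

Even a rough inventory like this gives legal, compliance and IT a shared, concrete starting point for retention, encryption and data loss prevention conversations.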

The reason this problem remains latent in most hospitals is that they are not yet feeling the pain of massive and multiple lawsuits, large invoices from outside law firms, or the operational challenges and costs incurred from searching through mountains of dispersed data.  The symptoms are observable, the pathology is present, the problem is real and the pain is about to acutely present itself as more states begin to deeply embrace eDiscovery requirements and government regulators increase audit frequency and fine amounts. Another less talked about reason hospitals have not had the same pressure to search and produce their data pursuant to litigation is that cases are often settled before they even reach the discovery stage. The lack of well-developed information governance practices leads to cases being settled too soon and for too much money, when they otherwise may not have needed to settle at all.

The Patient’s Symptoms Were Treated, but the Patient’s Data Still Needs Medicine

What is still unclear is why hospitals, given their compliance requirements and tightening IT budgets, aren’t archiving, classifying, and protecting their data with the same type of innovation they are demonstrating in their cutting edge patient care technology. In this realm, two opposite ends of the IT innovation spectrum seem to co-exist in the hospital’s data environment. This dichotomy leaves much of a hospital’s data unprotected, unorganized and uncontrolled. Hospitals are experiencing increasing data security breaches and often are not aware that a breach or data loss has occurred. As more patient data is created and copied in electronic format, used in and exposed by an increasing number of systems and delivered on emerging mobile platforms, the legal and audit risks are compounding on top of a faulty or missing information governance foundation.

Many hospitals have no retention schedules or data classification rules applied to existing information, which often results in a checkbox compliance mentality and a keep-everything-forever practice. Additionally, many hospitals have no ability to apply a comprehensive legal hold across different data sources and lack technology to stop or alert them when there has been a breach.

Information Governance and Data Health in Hospitals

With the mandated push for paper to be converted to digital records, many hospitals are now evaluating the interplay of their various information management and distribution systems. They must consider the newly scanned legacy data (or soon to be scanned), and if they have been operating without an archive, they must now look to implement a searchable repository where they can collectively apply document retention and records management while decreasing the amount of storage needed to retain the data.  We are beginning to see internal counsel leading the way to make this initiative happen across business units. Different departments are coming together to pool resources in tight economic and high regulation times that require collaboration.  We are at the beginning of a widespread movement in the healthcare industry for archiving, data classification and data loss prevention as hospitals link their increasing compliance and data loss requirements with the need to optimize and minimize storage costs. Finally, it comes as no surprise that the amount of data hospitals are generating is crippling their infrastructures, breaking budgets and serving as the primary motivator for change absent lawsuits and audits.

These factors are bringing together various stakeholders into the information governance conversation, helping to paint a very clear picture that putting in place a comprehensive information governance solution is in the entire hospital’s best interest. The symptoms are clear, the problem is treatable, the prescription for information governance is well proven. Hospitals can begin this process by calling an information governance meeting with key stakeholders and pursuing an agenda set around examining their data map and assessing areas of security vulnerability, as well as auditing the present state of compliance with regulations for the healthcare industry.

Editor’s note: This post was co-authored with Eric Heck, Healthcare Account Manager at Symantec.  Eric has over 25 years of experience in applying technology to emerging business challenges, and currently works with healthcare providers and hospitals to manage the evolving threat landscape of compliance, security, data loss and information governance within operational, regulatory and budgetary constraints.

The eDiscovery “Passport”: The First Step to Succeeding in International Legal Disputes

Monday, April 2nd, 2012

The increase in globalization continues to erase borders throughout the world economy. Organizations now routinely conduct business in countries that were previously unknown to their industry vertical.  The trend of global integration is certain to increase, with reports such as the Ernst & Young 2011 Global Economic Survey confirming that 74% of companies believe that globalization, particularly in emerging markets, is essential to their continued vitality.

Not surprisingly, this trend of global integration has also led to a corresponding increase in cross-border litigation. For example, parties to U.S. litigation are increasingly seeking discovery of electronically stored information (ESI) from other litigants and third parties located in Continental Europe and the United Kingdom. Since traditional methods under the Federal Rules of Civil Procedure (FRCP) may be unacceptable for discovering ESI in those forums, the question then becomes how such information can be obtained.

At this point, many clients and their counsel are unaware how to safely navigate these international waters. The short answer for how to address these issues for much of Europe would be to resort to the Hague Convention of March 18, 1970 on the Taking of Evidence Abroad in Civil or Commercial Matters (Hague Convention). Simply referring to the Hague Convention, however, would ignore the complexities of electronic discovery in Europe. Worse, it would sidestep the glaring knowledge gap that exists in the United States regarding the cultural differences distinguishing European litigation from American proceedings.

The ability to bridge this gap with an awareness of the discovery processes in Europe is essential. Understanding that process is similar to holding a valid passport for international travel. Just as a passport is required for travelers to successfully cross into foreign lands, an “eDiscovery Passport™” is likewise necessary for organizations to effectively conduct cross-border discovery.

The Playing Field for eDiscovery in Continental Europe

Litigation in Continental Europe is culturally distinct from American court proceedings. “Discovery,” as it is known in the United States, does not exist in Europe. Interrogatories, categorical document requests and requests for admissions are simply unavailable as European discovery devices. Instead, European countries generally allow only a limited exchange of documents, with parties typically disclosing only that information that supports their claims.

The U.S. Court of Appeals for the Seventh Circuit recently commented on this key distinction between European and American discovery when it observed that “the German legal system . . . does not authorize discovery in the sense of Rule 26 of the Federal Rules of Civil Procedure.” The court went on to explain that “[a] party to a German lawsuit cannot demand categories of documents from his opponent. All he can demand are documents that he is able to identify specifically—individually, not by category.” Heraeus Kulzer GmbH v. Biomet, Inc., 633 F.3d 591, 596 (7th Cir. 2011).

Another key distinction to discovery in Continental Europe is the lack of rules or case law requiring the preservation of ESI or paper documents. This stands in sharp contrast to American jurisprudence, which typically requires organizations to preserve information as soon as they reasonably anticipate litigation. See, e.g., Micron Technology, Inc. v. Rambus Inc., 645 F.3d 1311, 1320 (Fed.Cir. 2011). In Europe, while an implied preservation duty could arise if a court ordered the disclosure of certain materials, the penalties for European non-compliance are typically not as severe as those issued by American courts.

Only the nations of the United Kingdom, from which American notions of litigation are derived, have discovery obligations that are more similar to those in the United States. For example, in the combined legal system of England and Wales, a party must disclose to the other side information adverse to its claims. Moreover, England and Wales also suggest that parties should take affirmative steps to prepare for disclosure. According to the High Court in Earles v Barclays Bank Plc [2009] EWHC 2500 (Mercantile) (08 October 2009), this includes having “an efficient and effective information management system in place to provide identification, preservation, collection, processing, review analysis and production of its ESI in the disclosure process in litigation and regulation.” For organizations looking to better address these issues, a strategic and intelligent information governance plan offers perhaps the best chance to do so.

Hostility to International Discovery Requests

Despite some similarities between the U.S. and the U.K., Europe as a whole retains a certain amount of cultural hostility to pre-trial discovery. Given this fact, it should come as no surprise that international eDiscovery requests made pursuant to the Hague Convention are frequently denied. Requests are often rejected because they are overly broad.  In addition, some countries such as Italy simply refuse to honor requests for pre-trial discovery from common law countries like the United States. Moreover, other countries like Austria are not signatories to the Hague Convention and will not accept requests made pursuant to that treaty. To obtain ESI from those countries, litigants must take their chances with the cumbersome and time-consuming process of submitting letters rogatory through the U.S. State Department. Finally, requests for information that seek email or other “personal information” (i.e., information that could be used to identify a person) must additionally satisfy a patchwork of strict European data protection rules.

Obtaining an eDiscovery Passport

This backdrop of complexity underscores the need for both lawyers and laymen to understand the basic principles governing eDisclosure in Europe. Such a task should not be seen as daunting. There are resources that provide straightforward answers to these issues at no cost to the end-user. For example, Symantec has just released a series of eDiscovery Passports™ that touch on the basic issues underlying disclosure and data privacy in the United Kingdom, France, Germany, Holland, Belgium, Austria, Switzerland, Italy and Spain. Organizations such as The Sedona Conference have also made available materials that provide significant detail on these issues, including its recently released International Principles on Discovery, Disclosure and Data Protection.

These resources can provide valuable information to clients and counsel alike and better prepare litigants for the challenges of pursuing legal rights across international boundaries. By so doing, organizations can moderate the effects of legal risk and more confidently pursue their globalization objectives.

Big Data Decisions Ahead: Government-Sponsored Town Hall Meeting for eDiscovery Industry Coincides With Federal Agency Deadline

Wednesday, February 29th, 2012

Update For Report Submission By Agencies

We are fast approaching the March 27, 2012 deadline for federal agencies to submit their reports to the Office of Management and Budget and the National Archives and Records Administration (NARA) to comply with the Presidential Mandate on records management. The deadline is only the beginning, as we look ahead to a public town hall meeting in Washington, D.C. – also scheduled for March 27, 2012. This meeting is primarily focused on gathering input from the public sector community, the vendor/IT community, and members of the public at large. Ultimately, NARA will issue a directive that will outline a centralized approach for the federal government for managing records and eDiscovery.

Agencies have been tight-lipped about how far along they are in the process of evaluating their workflows and tools for managing their information (both electronic and paper). There is, however, some empirical data from an InformationWeek survey conducted last year that takes the temperature of where the top IT professionals within the government have their sights set, and the Presidential Mandate should bring some of these concerns to the forefront of the reports. For example, the #1 business driver for migrating to the cloud – cited by 62% of respondents – was cost, while 77% of respondents said their biggest concern was security. Nonetheless, 46% were still highly likely to migrate to a private cloud.

Additionally, as part of the Federal Data Center Consolidation Initiative, agencies are looking to eliminate 800 data centers. While the cost savings are clear, from an information governance viewpoint it’s hard not to ask what the government plans to do with all of those records.  Clearly, this shift, should it happen, will force the government into a more service-based management approach, as opposed to the traditional asset-based management approach. Some agencies have already migrated to the cloud. This is squarely in line with the Opex-over-Capex approach emerging for efficiency and cost savings.

Political Climate Unknown

Another major concern that will affect any decisions or policy implementation within the government is, not surprisingly, politics. Luckily, regardless of political party affiliation, it seems to be broadly agreed that the combination of IT spend in Washington, D.C. and the government’s slow move to properly manage electronic records is a problem. Two of the many examples of the problem are the inability to issue effective litigation holds and the inability to respond to Freedom of Information Act (FOIA) requests in a timely and complete manner. Even so, the political agenda of the Republican party may affect the prioritization of the Democratic President’s mandate, and efforts could be derailed by a potential change in administration.

Given the election year and the heavy analysis required to produce the report, there is a sentiment in Washington that all of this work may be for naught if the appropriate resources cannot be secured then allocated to effectuate the recommendations. The reality is that data is growing at an unprecedented rate, and the need for the intelligent management of information is no longer deniable. The long term effects of putting this overhaul on the back burner could be disastrous. The government needs a modular plan and a solid budget to address the problem now, as they are already behind.

VanRoekel’s Information Governance

One issue on which Democrats and Republicans will likely not agree in accomplishing the mandate is the almighty budget, and the technology the government must purchase in order to make the necessary changes is going to cost a pretty penny.  Steven VanRoekel, the Federal CIO, stated upon the release of the $78.8 billion FY 2013 IT budget:

“We are also making cyber security a cross-agency, cross-government priority goal this year. We have done a good job in ramping up on cyber capabilities agency-by-agency, and as we come together around this goal, we will hold the whole of government accountable for cyber capabilities and examine threats in a holistic way.”

His quote indicates a top-down priority of evaluating IT holistically, which dovetails nicely with the presidential mandate, since security and records management are only two parts of the entire information governance picture. Each agency still has its own work cut out for it across the EDRM. One of the most pressing issues in the upcoming reports will be what each agency decides to bring in-house or to continue outsourcing. This decision will in part depend on whether the inefficiencies identified lead agencies to conclude that they can perform those functions for less money and more efficiently than their contractors.  In evaluating their present capabilities, each agency will need to look at what workflows and technologies it currently has deployed across divisions, what it presently outsources, and what the marketplace potentially offers today to address its challenges.

The reason this question is central is that it raises an all-important question about information governance itself.  Information governance inherently implies that an organization or government controls most or all aspects of the EDRM model in order to derive the benefits of security, storage, records management and eDiscovery capabilities. Presently, the government is outsourcing many of its litigation services to third party companies that have essentially become de facto government agencies.  This is partly due to scalability issues, and partly because the resources and technologies deployed in-house within these agencies are inadequate to properly execute a robust information governance plan.

Conclusion

The ideal scenario for each government agency to comply with the mandate would be to deploy automated classification for records management, archiving with expiration appropriately implemented for more than just email, and finally, some level of eDiscovery capability in order to conduct early case assessment and easily produce data for FOIA.  The level of early case assessment needed by each agency will vary, but the general idea is that the scope of an investigation or matter could be determined in-house before contacting a third party to conduct data collection.  All things considered, the question remains whether the Obama administration will foot this bill or whether we will have to wait for a bigger price tag later down the road.  Either way, the government will have to come up to speed and make these changes eventually, and the town hall meeting should be an accurate thermometer of where the government stands.

Information Governance Gets Presidential Attention: Banking Bailout Cost $4.76 Trillion, Technology Revamp Approaches $240 Billion

Tuesday, January 10th, 2012

On November 28, 2011, The White House issued a Presidential Memorandum that outlines what is expected of the 480 federal agencies of the government’s three branches in the next 240 days.  Up until now, Washington, D.C. has been the Wild West with regard to information governance as each agency has often unilaterally adopted its own arbitrary policies and systems.  Moreover, some agencies have recently purchased differing technologies.  Unfortunately,  with the President’s ultimate goal of uniformity, this centralization will be difficult to accomplish with a range of disparate technological approaches.

Particular pain points for the government traditionally include retention, search, collection, review and production of vast amounts of data and records.  Specifically, these pain points show up as FOIA requests gone awry, the difficulty of issuing legal holds across different agencies (leading to spoliation), and the ever-present problem of decentralization.

Why is the government different?

Old Practices. First, in some instances the government is technologically behind its corporate counterparts and is failing to meet the judiciary’s expectation that organizations effectively store, manage and discover their information.  This failing is self-evident in the directive coming from the President mandating that these agencies develop a plan to attack the problem.  Though different from other corporate entities, the government is nevertheless held to the same standards of eDiscovery under the Federal Rules of Civil Procedure (FRCP).  In practice, the government has been given more leniency until recently, and while equal expectations have not always been the case, the gap between the private and public sectors is no longer possible to ignore.

FOIA.  The government’s arduous obligation to produce information under the Freedom of Information Act (FOIA) has no corresponding analog for private organizations, which respond to more traditional civil discovery requests.  Because the government is so large, with many disparate IT systems, it is cumbersome to work efficiently through the information governance process across agencies, and many times still difficult inside one individual agency with multiple divisions.  Executing this production process is even more difficult, if not impossible, to do manually without properly deployed technology.  Additionally, many of the investigatory agencies that issue requests to the private sector need more efficient ways to manage and review the data they are requesting.  To compound problems, within the US government two opposing interests are at play, both screaming for a resolution, and that resolution needs to be centralized.  On the one hand, the government needs to retain more than a corporation may need to in order to satisfy a FOIA request.

Titan Pulled at Both Ends. On the other hand, without classification of the records that are to be kept, technology to organize this vast amount of data and some amount of expiry, every agency will essentially become its own massive repository.  The “retain everything” mentality, coupled with inefficient search and retrieval of data and records, is where they stand today.  Corporations are experiencing this on a smaller scale and many are collectively further along than the government in this process, without the FOIA complications.

What are agencies doing to address these mandates?

In their plans, agencies must describe how they will improve or maintain their records management programs, particularly with regard to email, social media and other electronic communications.  They must also move away from such a paper-centric existence.  eDiscovery consultants and software companies are helping agencies through this process, essentially writing their plans to match the President’s directive.  The cloud conversation has been revisited, and agencies also have to explain how they will use cloud-based services and storage solutions, as well as identify gaps in existing laws or regulations that presently prevent improved management.  Small innovations are taking place.  In fact, just recently the DOJ added a new search feature on their website to make it easier for the public to find documents that have been posted by agencies on their websites.

The Office of Management and Budget (OMB), National Archives and Records Administration (NARA), and Justice Department will use those reports to come up with a government-wide records management framework that is more efficient, maintains accountability by documenting agency actions and promotes “appropriate” public access to records.  Hopefully, the framework they come up with will be centralized and workable on a realistic timeframe with resources sufficiently allocated to the initiative.

How much will this cost?

The President’s mandate is a great initiative and very necessary, but one cannot help but think about the costs in terms of money, time and resources when considering these crucial changes.  The most recent version of a financial services and general government appropriations bill in the Senate extends $378.8 million to NARA for this initiative.  President Obama appointed Steven VanRoekel as the United States CIO in August 2011 to succeed Vivek Kundra.  After VanRoekel’s speech at the Churchill Club in October of 2011, an audience member asked him what the most surprising aspect of his new job was.  VanRoekel said that it was managing the huge and sometimes unwieldy resources of his $80 billion budget.  It is going to take even more than this to do the job right, however.

Using conservative estimates, assume the initial investment for an agency to implement archiving and eDiscovery capabilities would be $100 million.  That approximates $480 billion for all 480 agencies.  Assume a uniform information governance platform gets adopted by all agencies at a 50% discount due to the large contracts, also factoring in smaller sums for agencies with lesser needs.  The total now comes to $240 billion.  For context, that figure is 5% of what was spent by the Federal Government ($4.76 trillion) on the biggest bailout in history in 2008. That leaves a need for $160 billion more to get the job done. VanRoekel also commented at the same meeting that he wants to break down massive multi-year information technology projects into smaller, more modular projects in the hopes of saving the government from getting mired in multi-million dollar failures.  His solution, he says, is modular and incremental deployment.

While Rome was not built in a day, this initiative is long overdue, yet feasible, as technology exists to address these challenges rather quickly.  After these 240 days are complete and a plan is drawn, the real question is: how are we going to pay now for technology the government needed yesterday?  In a perfect world, the government would select a platform for archiving and eDiscovery, break the project into incremental milestones and roll out a uniform combination of best-of-breed solutions.

Lessons Learned for 2012: Spotlighting the Top eDiscovery Cases from 2011

Tuesday, January 3rd, 2012

The New Year has now dawned and with it, the certainty that 2012 will bring new developments to the world of eDiscovery.  Last month, we spotlighted some eDiscovery trends for 2012 that we feel certain will occur in the near term.  To understand how these trends will play out, it is instructive to review some of the top eDiscovery cases from 2011.  These decisions provide a roadmap of best practices that the courts promulgated last year.  They also spotlight the expectations that courts will likely have for organizations in 2012 and beyond.

Issuing a Timely and Comprehensive Litigation Hold

Case: E.I. du Pont de Nemours v. Kolon Industries (E.D. Va. July 21, 2011)

Summary: The court issued a stiff rebuke against defendant Kolon Industries for failing to issue a timely and proper litigation hold.  That rebuke came in the form of an instruction to the jury that Kolon executives and employees destroyed key evidence after the company’s preservation duty was triggered.  The jury responded by returning a stunning $919 million verdict for DuPont.

The spoliation at issue occurred when several Kolon executives and employees deleted thousands of emails and other records relevant to DuPont’s trade secret claims.  The court laid the blame for this destruction on the company’s attorneys and executives, reasoning that they could have prevented the spoliation through an effective litigation hold process.  At issue were three hold notices circulated to the key players and data sources.  The notices were all deficient in some manner: they were either too limited in their distribution, ineffective because they were prepared in English for Korean-speaking employees, or too late to prevent or otherwise ameliorate the spoliation.

The Lessons for 2012: The DuPont case underscores the importance of issuing a timely and comprehensive litigation hold notice.  As DuPont teaches, organizations should identify what key players and data sources may have relevant information.  A comprehensive notice should then be prepared to communicate the precise hold instructions in an intelligible fashion.  Finally, the hold should be circulated immediately to prevent data loss.

Organizations should also consider deploying the latest technologies to help effectuate this process.  This includes an eDiscovery platform that enables automated legal hold acknowledgements.  Such technology will allow custodians to be promptly and properly apprised of litigation and thereby retain information that might otherwise have been discarded.
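
To make the acknowledgement step concrete, here is a minimal, hypothetical sketch of how a hold notice and its custodian acknowledgements might be tracked. Commercial eDiscovery platforms provide this workflow (plus notifications and escalation) out of the box; the class and field names below are purely illustrative.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LegalHold:
    """Illustrative tracker for a litigation hold notice and custodian acknowledgements."""
    matter: str
    issued: date
    custodians: set[str]
    acknowledged: set[str] = field(default_factory=set)

    def acknowledge(self, custodian: str) -> None:
        """Record that a custodian has confirmed receipt of the hold notice."""
        if custodian not in self.custodians:
            raise ValueError(f"{custodian} is not on this hold")
        self.acknowledged.add(custodian)

    def outstanding(self) -> set[str]:
        """Custodians who still need a reminder or escalation."""
        return self.custodians - self.acknowledged

# Example: issue a hold to three custodians and check who has not yet acknowledged it.
hold = LegalHold(matter="Trade secret dispute",
                 issued=date(2012, 1, 15),
                 custodians={"kim", "lee", "park"})
hold.acknowledge("kim")
print(hold.outstanding())   # {'lee', 'park'}
```

The point of tracking acknowledgements, rather than merely sending the notice, is that it creates an auditable record showing each custodian was promptly and properly apprised of the hold.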

Another Must-Read Case: Haraburda v. Arcelor Mittal U.S.A., Inc. (D. Ind. June 28, 2011)

Suspending Document Retention Policies

Case: Viramontes v. U.S. Bancorp (N.D. Ill. Jan. 27, 2011)

Summary: The defendant bank defeated a sanctions motion because it modified aspects of its email retention policy once it was aware litigation was reasonably foreseeable.  The bank implemented a retention policy that kept emails for 90 days, after which the emails were overwritten and destroyed.  The bank also promulgated a course of action whereby the retention policy would be promptly suspended on the occurrence of litigation or another triggering event.  This way, the bank could establish the reasonableness of its policy in litigation.  Because the bank followed that procedure in good faith, it was protected from court sanctions under the Federal Rule of Civil Procedure 37(e) “safe harbor.”

The Lesson for 2012: As Viramontes shows, an organization can be prepared for eDiscovery disputes by timely suspending aspects of its document retention policies.  By modifying retention policies when so required, an organization can develop a defensible retention procedure and be protected from court sanctions under Rule 37(e).

Coupling those procedures with archiving software will only enhance an organization’s eDiscovery preparations.  Effective archiving software will have a litigation hold mechanism, which enables an organization to suspend automated retention rules.  This will better ensure that data subject to a preservation duty is actually retained.
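
As a rough illustration of the pattern described above, the sketch below applies a 90-day expiry rule, mirroring the policy at issue in Viramontes, but skips any custodian who is on a litigation hold before deleting anything. The function and data layout are assumptions for the example, not any vendor’s actual API.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # emails older than this are normally purged, per the policy described in Viramontes

def purge_expired(messages, custodians_on_hold, now=None):
    """Return the messages eligible for deletion: older than the retention window and not subject to a hold."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=RETENTION_DAYS)
    to_delete = []
    for msg in messages:
        if msg["custodian"] in custodians_on_hold:
            continue  # litigation hold suspends automated retention for this custodian
        if msg["received"] < cutoff:
            to_delete.append(msg)
    return to_delete

# Example: one custodian is on hold, so her aged email survives the purge.
mailbox = [
    {"custodian": "alice", "received": datetime(2011, 1, 1), "subject": "Q4 forecast"},
    {"custodian": "bob",   "received": datetime(2011, 1, 1), "subject": "Lunch?"},
]
print(purge_expired(mailbox, custodians_on_hold={"alice"}, now=datetime(2011, 6, 1)))
```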

Another Must-Read Case: Micron Technology, Inc. v. Rambus Inc., 645 F.3d 1311 (Fed. Cir. 2011)

Managing the Document Collection Process

Case: Northington v. H & M International (N.D.Ill. Jan. 12, 2011)

Summary: The court issued an adverse inference jury instruction against a company that destroyed relevant emails and other data.  The spoliation occurred in large part because legal and IT were not involved in the collection process.  For example, counsel was not actively engaged in the critical steps of preservation, identification or collection of electronically stored information (ESI).  Nor was IT brought into the picture until 15 months after the preservation duty was triggered. By that time, rank and file employees – some of whom were accused by the plaintiff of harassment – stepped into this vacuum and conducted the collection process without meaningful oversight.  Predictably, key documents were never found and the court had little choice but to promise to inform the jury that the company destroyed evidence.

The Lesson for 2012: An organization does not have to suffer the same fate as the company in the Northington case.  It can take charge of its data during litigation through cooperative governance between legal and IT.  After issuing a timely and effective litigation hold, legal should typically involve IT in the collection process.  Legal should rely on IT to help identify all data sources – servers, systems and custodians – that likely contain relevant information.  IT will also be instrumental in preserving and collecting that data for subsequent review and analysis by legal.  By working together in a top-down fashion, organizations can better ensure that their eDiscovery process is defensible and not fatally flawed.

Another Must-Read Case: Green v. Blitz U.S.A., Inc. (E.D. Tex. Mar. 1, 2011)

Using Proportionality to Dictate the Scope of Permissible Discovery

Case: DCG Systems v. Checkpoint Technologies (N.D. Ca. Nov. 2, 2011)

The court adopted the new Model Order on E-Discovery in Patent Cases recently promulgated by the U.S. Court of Appeals for the Federal Circuit.  The model order incorporates principles of proportionality to reduce the production of email in patent litigation.  In adopting the order, the court explained that email productions should be scaled back since email is infrequently introduced as evidence at trial.  As a result, email production requests will be restricted to five search terms and may only span a defined set of five custodians.  Furthermore, email discovery in DCG Systems will wait until after the parties complete discovery on the “core documentation” concerning the patent, the accused product and prior art.

The Lesson for 2012: Courts seem to be slowly moving toward a system that incorporates proportionality as the touchstone for eDiscovery.  This is occurring beyond the field of patent litigation, as evidenced by other recent cases.  Even the State of Utah has gotten in on the act, revising its version of Rule 26 to require that all discovery meet the standards of proportionality.  While there are undoubtedly deviations from this trend (e.g., Pippins v. KPMG (S.D.N.Y. Oct. 7, 2011)), the clear lesson is that discovery should comply with the cost cutting mandate of Federal Rule 1.

Another Must-Read Case: Omni Laboratories Inc. v. Eden Energy Ltd [2011] EWHC 2169 (TCC) (29 July 2011)

Leveraging eDiscovery Technologies for Search and Review

Case: Oracle America v. Google (N.D. Ca. Oct. 20, 2011)

The court ordered Google to produce an email that it previously withheld on attorney client privilege grounds.  While the email’s focus on business negotiations vitiated Google’s claim of privilege, that claim was also undermined by Google’s production of eight earlier drafts of the email.  The drafts were produced because they did not contain addressees or the heading “attorney client privilege,” which the sender later inserted into the final email draft.  Because those details were absent from the earlier drafts, Google’s “electronic scanning mechanisms did not catch those drafts before production.”

The Lesson for 2012: Organizations need to leverage next generation, robust technology to support the document production process in discovery.  Tools such as email analytical software, which can isolate drafts and offer to remove them from production, are needed to address complex production issues.  Other technological capabilities, such as Near Duplicate Identification, can also help identify draft materials and marry them up with finals that have been marked as privileged.  Last but not least, technology assisted review has the potential to enable one lawyer to efficiently complete work that previously took thousands of hours.  Finding the budget and doing the research to obtain the right tools for the enterprise should be a priority for organizations in 2012.
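
For readers curious what Near Duplicate Identification looks like under the hood, the sketch below compares documents by the Jaccard similarity of their overlapping word “shingles,” a common textbook approach. Production tools are far more sophisticated; the similarity threshold and sample documents here are arbitrary assumptions.

```python
def shingles(text, k=3):
    """Break a document into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 = disjoint, 1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(docs, threshold=0.6):
    """Return pairs of document ids whose shingle overlap meets the threshold."""
    sets = {doc_id: shingles(text) for doc_id, text in docs.items()}
    ids = sorted(sets)
    return [(x, y) for i, x in enumerate(ids) for y in ids[i + 1:]
            if jaccard(sets[x], sets[y]) >= threshold]

# Example: two drafts of the same message group together; the unrelated document does not.
drafts = {
    "draft_1": "we should propose a license for java to google at the meeting",
    "draft_2": "we should propose a license for java to google at tomorrow's meeting",
    "unrelated": "quarterly budget review for the storage team",
}
print(near_duplicates(drafts))   # [('draft_1', 'draft_2')]
```

Grouping drafts with their near-identical finals is precisely what would have flagged the unmarked drafts at issue in Oracle America v. Google before they went out the door.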

Another Must-Read Case: J-M Manufacturing v. McDermott, Will & Emery (CA Super. Jun. 2, 2011)

Conclusion

There were any number of other significant cases from 2011 that could have made this list.  We invite you to share your favorites in the comments section or contact us directly with your feedback.

New Utah Rule 26: A Blueprint for Proportionality in eDiscovery

Tuesday, December 20th, 2011

The eDiscovery frenzy that has gripped the American legal system over the past decade has become increasingly expensive.  Particularly costly to both clients and courts is the process of preserving, collecting and producing documents.  This was supposed to change after the Federal Rules of Civil Procedure (FRCP) were amended in 2006.  After all, weren’t the amended rules designed to streamline discovery, allowing parties to focus on the merits while making discovery costs more reasonable?  Instead, it seems the rules have spawned more collateral discovery disputes than ever before about preservation, collection and production issues.

As a solution to these costs, the eDiscovery cognoscenti are emphasizing the concept of “proportionality.”  Proportionality typically requires that the benefits of discovery be commensurate with its corresponding burdens.  Under the Federal Rules of Civil Procedure, the directive that discovery be proportional is found in Rules 26(c), 26(b)(2)(C) and 26(b)(2)(B).  Under Rule 26(c), courts may generally issue protective orders that limit or even proscribe discovery that causes “annoyance, embarrassment, oppression, or undue burden or expense.”  More specifics are set forth in Rule 26(b)(2)(C), which enables courts to restrict discovery if the requests are unreasonably cumulative or duplicative, the discovery can be obtained from an alternative source that is less expensive or burdensome, or the burden or expense of the discovery outweighs its benefit.  In the specific context of electronic discovery, Rule 26(b)(2)(B) restricts the discovery of backup tapes and other electronically stored information that is “not reasonably accessible” due to “undue burden or cost.”

Despite the existence of these provisions, they are often bypassed.  The most recent and notable example of this trend is found in Pippins v. KPMG (S.D.N.Y. Oct. 7, 2011).  In Pippins, the court ordered the defendant accounting firm to continue preserving thousands of employee hard drives.  In so doing, the court sidestepped the firm’s proportionality argument, citing Orbit One v. Numerex (S.D.N.Y. 2010) for the premise that such a standard is “too amorphous” and therefore unworkable.  Regardless of cost or burden, the court reasoned that “prudence” required preservation of all relevant materials “until a more precise definition [of proportionality] is created by rule.”

The Pippins order and its associated costs for the firm – potentially into the millions of dollars – have given new fuel to the argument that an amended federal rule should be implemented to include a more express mandate regarding proportionality.  Surprisingly enough, a blueprint for such an amended rule is already in place in the State of Utah.  Effective November 1, 2011, Utah implemented sweeping changes to civil discovery practice through amended Civil Procedure Rule 26.  The new rule makes proportionality the standard now governing eDiscovery in Utah.

Proportionality Dictates the Scope of Permissible Discovery

Utah Rule 26 has changed the permissible scope of discovery to expressly require that all discovery meet the standards of proportionality.  That means parties may seek discovery of relevant, non-privileged materials “if the discovery satisfies the standards of proportionality.”  This effectively shifts the burden of proof on proportionality from the responding party to the requesting party.  Indeed, Utah Rule 26(b)(3) specifically codifies this stunning change:  “The party seeking discovery always has the burden of showing proportionality and relevance.”  This stands in sharp contrast to Federal Rules 26(b)(2) and 26(c), which require the responding party to show that the discovery is not proportional.

The “standards of proportionality” that have been read into Utah Rule 26 incorporate those found in Federal Rule 26(b)(2)(C).  In addition, Utah Rule 26 requires that discovery be “reasonable.”  Reasonableness is to be determined on the needs of a given case such as the amount in controversy, the parties’ resources, the complexity and importance of the issues, and the role of the discovery in addressing such issues.  Last but not least, discovery must expressly comply with the cost cutting mandate of Rule 1 and thereby “further the just, speedy and inexpensive determination of the case.”

Proportionality Limits the Amount of Discovery

To further address the burdens and costs of disproportionate discovery, Utah Rule 26(c) limits the amount of discovery that parties may conduct as a matter of right based on the specific amounts in controversy.  For those matters involving damages of $300,000 or more, parties may propound 20 interrogatories, document requests and requests for admissions.  Total fact deposition time is restricted to a mere 30 hours.  For matters between $50,000 and $300,000, those figures are halved.  And for matters under $50,000, only five document requests and requests for admissions are allotted to the parties.  Fact depositions are curtailed to three hours total per side, while interrogatories are eliminated.

If these limits are too restrictive, parties may request “extraordinary discovery” under Rule 26(c)(6).  However, any such request must demonstrate that the sought after discovery is “necessary and proportional” under the rules.  The parties must also certify that a budget for the discovery has been “reviewed and approved.”
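
Because the tiers are driven entirely by the amount in controversy, they lend themselves to a simple lookup. The sketch below encodes the limits as summarized in this post; it is a reading aid only, not a restatement of the rule’s full text, and the treatment of boundary amounts follows the summary above rather than the rule itself.

```python
def standard_discovery_limits(amount_in_controversy):
    """Discovery allowed as a matter of right under Utah Rule 26(c), per the tiers summarized above."""
    if amount_in_controversy >= 300_000:
        return {"interrogatories": 20, "document_requests": 20,
                "requests_for_admission": 20, "fact_deposition_hours": 30}
    if amount_in_controversy >= 50_000:
        # Middle tier: the top-tier figures are halved.
        return {"interrogatories": 10, "document_requests": 10,
                "requests_for_admission": 10, "fact_deposition_hours": 15}
    # Lowest tier: interrogatories are eliminated entirely.
    return {"interrogatories": 0, "document_requests": 5,
            "requests_for_admission": 5, "fact_deposition_hours": 3}

print(standard_discovery_limits(75_000))
# {'interrogatories': 10, 'document_requests': 10, 'requests_for_admission': 10, 'fact_deposition_hours': 15}
```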

A Potential Model for Federal Discovery Rule Amendments

Utah Rule 26 could perhaps serve as a model for amending the scope of permissible discovery under the Federal Rules.  Like Utah Rule 26, Federal Rule 26 could be amended to expressly condition discovery on meeting the principles of proportionality.  The Federal Rules could also be modified to ensure the propounding party always has the burden of demonstrating the fact specific good cause for its discovery.  Doing so would undoubtedly force counsel and client to be more precise with their requests and do away with the current regime of “promiscuous discovery.”  Calcor Space Facility, Inc. v. Superior Court, 53 Cal.App.4th 216, 223 (1997) (urging courts to “aggressively” curb discovery abuses which, “like a cancerous growth, can destroy a meritorious cause or defense”).

Tiering the amounts of permitted discovery based on alleged damages could also reduce the costs of discovery.  With limited deposition time and fewer document requests, discovery of necessity would likely focus on the merits instead of eDiscovery sideshows.  Coupling this with an “extraordinary discovery” provision would enable courts to exercise greater control over the process and ensure that genuinely complex matters are litigated efficiently.

If all of this seems like a radical departure from established discovery practice, consider that the new Model Order on E-Discovery in Patent Cases has also incorporated tiered and extraordinary discovery provisions.  See DCG Systems v. Checkpoint Technologies (N.D. Ca. Nov. 2, 2011) (adopting the model order and explaining the benefits of limiting eDiscovery in patent cases).

For those who are seeking a vision of how proportionality might be incorporated into the Federal Rules, new Utah Rule 26 could be a blueprint for doing so.

Top Ten eDiscovery Predictions for 2012

Thursday, December 8th, 2011

As 2011 comes quickly to a close, we’ve attempted, as in years past, to do our best Carnac impersonation and divine the future of eDiscovery.  Some of these predictions may happen more quickly than others, but it’s our sense that all will come to pass in the near future – it’s just a matter of timing.

  1. Technology Assisted Review (TAR) Gains Speed.  The area of Technology Assisted Review is very exciting since there are a host of emerging technologies that can help make the review process more efficient, ranging from email threading and concept search to clustering, predictive coding and the like.  There are two fundamental challenges, however.  First, the technology doesn’t work in a vacuum, meaning that the workflows need to be properly designed and the users need to make accurate decisions, because those judgment calls often are then magnified by the application.  Next, the defensibility of the given approach needs to be well vetted.  While it’s likely not necessary (or practical) to expect a judge to mandate the use of a specific technological approach, it is important for the applied technologies to be reasonable, transparent and auditable, since the worst possible outcome would be to have a technology challenged and then find the producing party unable to adequately explain its methodology.
  2. The Custodian-Based Collection Model Comes Under Stress. Ever since the days of Zubulake, litigants have focused on “key players” as a proxy for finding relevant information during the eDiscovery process.  Early on, this model worked particularly well in an email-centric environment.  But, as discovery from cloud sources, collaborative worksites (like SharePoint) and other unstructured data repositories continues to become increasingly mainstream, the custodian-oriented collection model will become rapidly outmoded because it will fail to take into account topically-oriented searches.  This trend will be further amplified by the bench’s increasing distrust of manual, custodian-based data collection practices and the presence of better automated search methods, which are particularly valuable for certain types of litigation (e.g., patent disputes, product liability cases).
  3. The FRCP Amendment Debate Will Rage On – Unfortunately Without Much Near Term Progress. While it is clear that the eDiscovery preservation duty has become a more complex and risk laden process, it’s not clear that this “pain” is causally related to the FRCP.  In the notes from the Dallas mini-conference, a pending Sedona survey was quoted referencing the fact that preservation challenges were increasing dramatically.  Yet, there isn’t a consensus viewpoint regarding which changes, if any, would help improve the murky problem.  In the near term this means that organizations with significant preservation pains will need to better utilize the rules that are on the books and deploy enabling technologies where possible.
  4. Data Hoarding Increasingly Goes Out of Fashion. The war cry of many IT professionals that “storage is cheap” is starting to fall on deaf ears.  Organizations are realizing that the cost of storing information is just the tip of the iceberg when it comes to the litigation risk of having terabytes (and conceivably petabytes) of unstructured, uncategorized and unmanaged electronically stored information (ESI).  This tsunami of information will increasingly become an information liability for organizations that have never deleted a byte of information.  In 2012, more corporations will see the need to clean out their digital houses and will realize that such cleansing (where permitted) is a best practice moving forward.  This applies with equal force to the US government, which has recently mandated such an effort at President Obama’s behest.
  5. Information Governance Becomes a Viable Reality.  For several years there’s been an effort to combine the reactive (far right) side of the EDRM with the logically connected proactive (far left) side of the EDRM.  But now, a number of surveys have linked good information governance hygiene with better response times to eDiscovery requests and governmental inquiries, as well as a corresponding lower chance of being sanctioned and the ability to turn over less responsive information.  In 2012, enterprises will realize that the litigation use case is just one way to leverage archival and eDiscovery tools, further accelerating adoption.
  6. Backup Tapes Will Be Increasingly Seen as a Liability.  Using backup tapes for disaster recovery/business continuity purposes remains a viable business strategy, although backing up to tape will become less prevalent as cloud backup increases.  However, if tapes are kept around longer than necessary (days versus months) then they become a ticking time bomb when a litigation or inquiry event crops up.
  7. International eDiscovery/eDisclosure Processes Will Continue to Mature. It’s easy to think of the US as dominating the eDiscovery landscape. While this is gospel for us here in the States, international markets are developing quickly and in many ways are ahead of the US, particularly with regulatory compliance-driven use cases, like the UK Bribery Act 2010.  This fact, coupled with the menagerie of international privacy laws, means we’ll be less Balkanized in our eDiscovery efforts moving forward since we do really need to be thinking and practicing globally.
  8. Email Becomes “So 2009” As Social Media Gains Traction. While email has been the eDiscovery darling for the past decade, it’s getting a little long in the tooth.  In the next year, new types of ESI (social media, structured data, loose files, cloud context, mobile device messages, etc.) will cause headaches for a number of enterprises that have been overly email-centric.  Already in 2011, organizations are finding that other sources of ESI like documents/files and structured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This heterogeneous mix of ESI will certainly result in challenges for many companies, with some unlucky ones getting sanctioned because they ignored these emerging data types.
  9. Cost Shifting Will Become More Prevalent – Impacting the “American Rule.” For ages, the American Rule held that producing parties had to pay for their production costs, with a few narrow exceptions.  Next year we’ll see even more courts award winning parties their eDiscovery costs under 28 U.S.C. §1920(4) and Rule 54(d)(1) FRCP. Courts are now beginning to consider the services of an eDiscovery vendor as “the 21st Century equivalent of making copies.”
  10. Risk Assessment Becomes a Critical Component of eDiscovery. Managing risk is a foundational underpinning for litigators generally, but its role in eDiscovery has been a bit obscure.  Now, with the tremendous statistical insights that are made possible by enabling software technologies, it will become increasingly important for counsel to manage risk by deciding what types of error/precision rates are possible.  This risk analysis is particularly critical for conducting any variety of technology assisted review process since precision, recall and f-measure statistics all require a delicate balance of risk and reward.
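
To ground prediction #10, the statistics it mentions are straightforward to compute once a sample of machine-coded documents has been checked by human reviewers. Here is a minimal sketch using the standard definitions; the counts in the example are made up.

```python
def review_metrics(true_positives, false_positives, false_negatives):
    """Precision, recall and F1 (f-measure) for a technology assisted review sample."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": round(precision, 3), "recall": round(recall, 3), "f_measure": round(f1, 3)}

# Example: of the documents the system flagged as responsive, 900 were truly responsive
# and 100 were not; it also missed 300 responsive documents.
print(review_metrics(true_positives=900, false_positives=100, false_negatives=300))
# {'precision': 0.9, 'recall': 0.75, 'f_measure': 0.818}
```

The balance of risk and reward shows up directly in these numbers: tightening precision (producing less junk) usually costs recall (missing more responsive documents), and counsel must decide which error rate the matter can tolerate.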

Accurately divining the future is difficult (some might say impossible), but in the electronic discovery arena many of these predictions can happen if enough practitioners decide they want them to happen.  So, the future is fortunately within reach.

Watchdog (SEC) v. Watchdog (FINRA): Destruction, Doctoring and Deflection

Monday, November 14th, 2011

In the first settlement of its kind, FINRA settled with the SEC on October 27, 2011 over allegations that, in a 2008 incident, a regional Kansas City office of FINRA doctored documents.  The allegedly doctored documents were from three internal staff meetings, where information was either edited or deleted and then provided to the SEC with the “inaccurate and incomplete” changes. Mary Schapiro, currently the Chairman of the SEC, is in an interesting spot, as she was Chief Executive of FINRA at the time of the alleged wrongdoing.  She apparently had no direct involvement with the decision to take action against FINRA.

The motives for doctoring the documents are unclear, as is whether the alterations led to any material damage other than FINRA’s diminished credibility.  Ironically, the SEC has had its own struggles in recent months, with a slew of articles published in various newspapers highlighting its own challenges with document retention and the improper destruction of documents. Both of these scenarios were brought to light by whistleblowers within the respective agencies.

These antics certainly pose the question: Is it a good use of taxpayer money to have regulatory agencies fighting each other over document retention and record keeping practices? The answer is probably no. But the first question begs the second: If they don’t do it, who will?  While information management is not the sexiest part of the SEC and FINRA’s responsibilities, it certainly is an important one and the foundation of their information intelligence.  Without proper document retention and information governance, the probability of connecting the dots to discover insider trading or other malfeasance is low.  Moreover, in order for agencies to retain credibility they need to be able to locate documents with ease and speed and those documents must be truthful and accurate.

Because FINRA is a self-regulatory organization for the securities industry and is overseen by the SEC, it seems appropriate that the SEC investigate matters like the one at hand.  According to the SEC, the 2008 incident is the third instance in the past eight years where an employee of FINRA, or its predecessor, the National Association of Securities Dealers, has provided altered or misleading documents to the SEC.  It remains to be seen whether this is intentional on the part of FINRA, to conceal undesirable facts or to promote an item on its agenda, or whether it is simply negligent with regard to its record keeping policies.  Either way, it is a problem for the SEC and the government in general, as it undermines agency credibility and compromises the ability to intelligently leverage information.  This settlement also does no favors for FINRA at a time when it aims to expand its supervisory authority from its base of 4,600 firms to include 10,000 more investment advisory firms.

So, what can be done about this behavior and the risks it poses? Corporations and governments alike are facing the information governance issues posed by the data explosion and the growing complexity of data sources today.  At a minimum, there needs to be a policy in place that governs how data, regardless of form, is handled and disposed of in the information lifecycle.  It also makes sense to form an audit committee within the government that can inspect and assess the information management practices of each agency, as well as serve as a third-party mediator between agencies when these challenges arise.  This is a good idea for two reasons.  One, agencies can focus on their responsibilities instead of getting sidetracked with issues they are not expert in, like document retention or records management.  Two, this problem has reached a point where it is necessary to appoint an independent group to audit the government, given the data explosion and pace of technology today.  We have the SEC and FINRA to watch the financial industry and provide us with assurance that business is being conducted in a lawful manner.  We don’t need the SEC or FINRA to take up document retention as another responsibility, as there are other professionals that can do that more effectively and independently.

While expansion of government is not the goal of forming yet another committee, such a committee could free agencies to do more of the work they are charged with.  It would also promote standardization across agencies and regulatory bodies, which would be a giant step in the right direction as data volumes grow.  The actions that followed this settlement were remedial in nature.  FINRA took decisive action: it aired a podcast about document integrity and scheduled an agency-wide town hall meeting on the same subject for all current and new employees.  It also hired an independent outside consultant to provide additional staff training on document retention and integrity.  This will be a continual educational process for both the private and public sectors, and employee training and auditing of the process will be the linchpins of success.  An element of deflection is also at work here, as the SEC is not a model of document retention best practices at the moment.

The SEC is working through allegations of document destruction, and FINRA stands accused of document doctoring, but all of these assertions circle back to the central theme: having a document retention policy and complying with it.  That naturally leads to the need for education and training, and ultimately to auditing the process for compliance.  In this rare case of watchdog bites watchdog, three points become clear: 1) the SEC has a higher and better use than policing these issues; 2) information management has reached the point where it requires a separate and independent body to monitor and regulate allegations of misconduct; and 3) sometimes it takes a dog biting a dog to truly illustrate the magnitude of a problem.

Fulbright’s 2011 Litigation Trends Report Predicts a Constant Litigation Pace and a Swell of Regulatory Investigations

Monday, November 7th, 2011

Fulbright & Jaworski has conducted its Litigation Trends survey for nearly a decade, and the results are always interesting since they tend to capture the mindset of inside counsel and litigators as they anticipate the upcoming year.  In its 8th Annual Litigation Trends Survey, Fulbright noted that 92% of U.S. respondents predict that litigation will either increase or stay the same in the upcoming year.  This trend bodes well for players in the litigation services and eDiscovery sectors, and confirms the counter-cyclical nature of the industry.  Breaking down the perceived increases across industry verticals, the Survey noted that the biggest anticipated jumps were in the technology, financial services, healthcare and insurance sectors, while energy (the leading sector from the prior year) was one of the few predicting a decrease.

Going behind the scenes, a number of factors led respondents to predict litigation increases.  First and foremost, respondents indicated that “stricter regulation was the number one reason” for the increases, particularly in the insurance, financial services, health care and retail sectors.  These concerns around regulatory compliance are increasingly keeping GCs and corporate boards awake as the governance climate continues to heat up.  The regulation driver showed a demonstrable increase, with 46% of all respondents having retained outside counsel to assist with regulatory proceedings, up from 37% in the prior year.  The Survey noted that U.S. companies facing a regulatory investigation were most likely to be under pressure from the DOJ (27%), a State Attorney General (24%), OSHA (18%), the EPA (16%) or a U.S. Attorney (13%).  Also on the regulatory front, U.S. respondents have increasingly begun to recognize the potential jurisdictional reach of the U.K. Bribery Act, with 25% of U.S. companies stating that they have already reviewed existing procedures in preparation for the Act's implementation.

In addition to managing risk, most in-house counsel are keenly concerned with controlling litigation costs.  The good news here is that associated costs are predicted to remain generally flat.  Yet eDiscovery remained the largest category targeted for increased spending, with 18% of respondents making it their top priority.  Interestingly, though, the largest enterprises seem to be getting eDiscovery expenses under control (likely by taking expensive elements of the EDRM in-house), with the share of the largest companies projecting increased eDiscovery spending falling from 42% last year to 24% this year.

The Survey noted that the use of cloud computing has gained momentum, with 34% of all public companies using the cloud.  And yet only 40% of those companies using cloud computing have had “to preserve and/or collect data from the cloud in connection with actual or threatened litigation, disputes or investigations.”  This number appears curiously light, and it will likely rise during the upcoming year as the plaintiff's bar gets more savvy about this relatively new source of responsive electronically stored information (ESI).

On the narrower eDiscovery front, the Survey homed in on newer issues like cooperation.  Here, the Survey noted that this Sedona-sponsored concept still hasn't completely taken hold, with nearly 40% of all respondents claiming that “their company has not made the effort to be more transparent or cooperative” due to a litigation strategy of “defending on all fronts.”  This area appears particularly muddled, with one third saying their previous attempts haven't been reciprocated and another quarter feeling that their company was already transparent.

All in all, the 2011 Fulbright Litigation Trends Survey identifies trends largely in line with the two primary drivers of (1) managing risk and (2) lowering litigation costs.  On the risk side, compliance with an increasingly complex regulatory environment is offsetting any potential lull in the litigation environment.  And on the cost side, eDiscovery continues to be a hot-button issue, particularly given the relatively new challenges associated with ESI distributed across social media, cloud computing and mobile sources.

ECPA, 4th Amendment, and FOIA: A Trident of Laws Collide on the 25th Birthday of the Electronic Communications Privacy Act

Wednesday, November 2nd, 2011

Google has publicly released the number of U.S. Government requests it received for email productions in the six months preceding December 31, 2009.  It complied with 94% of these 4,601 requests.  Granted, many of these requests were search warrants or subpoenas, but many were not.  Now take 4,601 and multiply it by at least three to account for other social media sources such as Facebook, LinkedIn, and Twitter.  The number is big, and so is the concern over how this information is being obtained.
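As a rough back-of-envelope calculation, assuming purely for illustration that each of those three services sees a request volume on the same order as Google's:

$$4{,}601 \times 0.94 \approx 4{,}325 \ \text{productions by Google alone}, \qquad 4{,}601 \times 4 \approx 18{,}400 \ \text{requests across all four services.}$$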

What has become increasingly common (and alarming at the same time) is the way this electronically stored information (ESI) is being obtained from third-party service providers by the U.S. Government. Some of these requests were actually secret court orders; it is unclear how many of the matters were criminal or civil.  Many of these service providers (Sonic, Google, Microsoft, etc.) are challenging the requests and most often losing. They are losing on two fronts: 1) they are not allowed to inform the data owner about the requests or the subsequent production of the emails, and 2) they are forced to actually produce the information.  For example, the U.S. Government obtained one of these secret orders to get WikiLeaks volunteer Jacob Applebaum's list of the email contacts he has corresponded with over the past two years.  Both Google and Sonic.net were ordered to turn over information; Sonic challenged the order and lost.  This has prompted technology companies to band together to lobby Congress to require search warrants in digital investigations.

There are three primary laws operating at this pivotal intersection that affect the discovery of ESI residing with third-party service providers, and these laws are in a car wreck with no ambulance in sight.  First, there is the antiquated federal law, the Electronic Communications Privacy Act of 1986 (ECPA), over which there is much debate at present.  To put the ECPA's datedness in perspective, it was written before the World Wide Web existed.  This law is the basis that allows the government to secretly obtain information from email and cell phones without a search warrant. Obtaining that information without a search warrant is in direct conflict with the U.S. Constitution's Fourth Amendment protection against unreasonable searches and seizures.  In the secret-order scenario, the creator of the data is denied the right to know about the search and seizure as it transpires at the third party (as they would know if, for example, their home were being searched).

Where a secret order has been issued and emails have been obtained from a third-party service provider, the courts treat email quite differently from traditional mail and telephone lines.  Yet the intent of the law was to give electronic communications the same protections that mail and phone calls have long enjoyed. Understandably, the law did not anticipate the technology we have today.  This is the first collision, and the reason the wheels have come off the car: the standard under the ECPA sets a lower bar for email than for the other two modes of communication.  The government need only show “reasonable grounds” that the records would be “relevant and material” to an investigation, criminal or civil, rather than meeting the higher probable-cause standard required for a search warrant.

The third law in this collision is the Freedom of Information Act (FOIA).  While certain exceptions and allowances are made for national security and criminal investigations, these secret orders cannot be seen by the person whose information has been requested.  Additionally, the public wants to see these requests and orders, especially if there is no chance of fighting them.  What remains to be seen is what rights exist under FOIA to view these orders, whether as a party to the investigation or as an unrelated individual seeking them as a matter of public record.  U.S. Senator Patrick Leahy (D-VT), the author of the ECPA, acknowledged in no uncertain terms that the law is “significantly outdated and outpaced by rapid changes in technology.”  He has since introduced a bill with many of the changes that third-party service providers have lobbied for to bring the ECPA up to date. The irony of this situation is that the law was intended to provide the same protections for all modes of communication, but in fact makes it easier for the government to request information without the author ever knowing.

This is one of the most important issues now facing individuals and the government in the discovery of ESI during investigations and litigation.  A third-party provider of cloud offerings is really no different from a utility company, and the same paradigm that applies to the U.S. Postal Service and the telephone companies, where a warrant is required under the Fourth Amendment, can apply when seeking to discover this information.  The law looks to be changing to reflect this, and FOIA should allow the public to access these orders.  Senator Leahy has introduced amendments to the Act, and we can look forward to the common-sense changes he proposes.  The American people don't like secrets. Lawyers, get ready to embrace the revisions by reading up on the changes, as they will significantly impact your practice in the near future.