
Posts Tagged ‘compliance’

What Ocean’s Eleven and Judge Kozinski Can Teach Organizations About Confidentiality

Friday, April 26th, 2013

Confidentiality in the digital age is an elusive concept. As more organizations turn to social networking sites, cloud computing, and bring your own device (BYOD) policies to facilitate commercial enterprise, they are finding that such innovations can provide unwanted visibility into their business operations. Indeed, technology has seemingly placed confidential corporate information at the fingertips of third parties. This phenomenon, in which some third party could be examining your trade secrets, revenue streams, and attorney-client communications, brings to mind an iconic colloquy from the movie Ocean’s Eleven between Tess (Julia Roberts) and Terry Benedict (Andy Garcia). Tess caustically reminds the guarded casino magnate: “You of all people should know, Terry: in your hotel, there’s always someone watching.”

That someone could always be “watching” proprietary company information was recently alluded to by Chief Judge Alex Kozinski of the Ninth Circuit Court of Appeals. Speaking on the related topic of data privacy at a panel sponsored by The Recorder, Judge Kozinski explained that technological advances enabling third-party access to much of the data transmitted through or stored in cyberspace seemingly remove the veneer of confidentiality from that information. Excerpts from Judge Kozinski’s videotaped remarks can be viewed here.

That technological innovations could provide third parties with access to proprietary information is certainly problematic as companies incorporate social networking sites, cloud computing and BYOD into more aspects of their operations. Without appropriate safeguards, the use of such innovations could jeopardize the confidentiality of proprietary company information.

For example, content that corporate employees exchange on social networking sites could be accessed and monitored by site representatives under the governing terms of service. While those terms typically provide privacy settings that would allow corporate employees to limit the extent to which information may be disseminated, they also notify those same users that site representatives may access their communications. Though the justification for such access varies from site to site, the terms generally delineate the lack of confidentiality associated with user communications. This includes ostensibly private communications sent through the direct messaging features available on social networks like LinkedIn, Twitter, MySpace, Facebook and Reddit.

In like manner, providers of cloud computing services often have access and monitoring rights vis-à-vis a company’s cloud hosted data. Memorialized in service level agreements, those rights may allow provider representatives to access, review or even block transmissions of company data to and from the cloud. Provider access may, in turn, destroy the confidentiality required to preserve the character of company trade secrets or maintain the privileged status of communications with counsel.

BYOD also presents a difficult challenge for preserving the confidentiality of company data, due to the lack of corporate control that BYOD introduces into the company’s information ecosystem. Unless appropriate safeguards are deployed, employees may unwittingly disclose proprietary information to third parties by using personal cloud services to store or transmit company data. In addition, family, friends, or even strangers with access to an employee’s device could retrieve such information.

Given the confluence of the above-referenced factors, the question becomes what steps an organization can take to preserve the confidentiality of its information. On the social networking front, a company could deploy an on-site social network that provides a secure environment for employees to communicate about internal corporate matters. Conceptually similar to private clouds that house data behind the company firewall, such a network could be developed jointly with a third-party provider to ensure specific levels of confidentiality.

An enterprise considering cloud computing for its electronically stored information (ESI) should require that the cloud service provider offer measures to preserve the confidentiality of trade secrets and privileged messages. That may include specific confidentiality terms or a separate confidentiality agreement. In addition, the provider should offer encryption functionality to better preserve confidentiality. By taking these steps, the enterprise can better satisfy itself that it has adopted appropriate measures to ensure the confidentiality of its data.
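
For illustration only, here is a minimal sketch of what client-side encryption might look like before ESI ever reaches a provider, using the third-party Python cryptography package; key management, key rotation, and the actual upload step are assumptions left out of scope and would need to be addressed separately.

```python
# Illustrative sketch: encrypt data locally so the cloud provider only ever
# stores ciphertext. Assumes the third-party `cryptography` package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this key outside the provider's reach
cipher = Fernet(key)

plaintext = b"attorney-client memo re: trade secret licensing"
ciphertext = cipher.encrypt(plaintext)   # only the ciphertext leaves the company

# Later, after retrieving the ciphertext from the provider:
assert cipher.decrypt(ciphertext) == plaintext
```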

To address the confidentiality problems associated with BYOD, a company should prepare a cogent policy and deploy technologies that facilitate employee compliance. Such a policy would discourage workers from using personal cloud storage providers to transfer or store ESI. It would also delineate the parameters of access to employee devices by the employee’s family, friends, or others. To make such a policy more effective, employers will need to develop a technological architecture that reasonably supports conformity with the policy.
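
As one hypothetical example of an enabling control behind such a policy, an egress check could flag transfers to personal cloud-storage services; the domain list and function below are illustrative assumptions, not a vetted blocklist or any particular product’s feature.

```python
# Illustrative sketch: flag uploads bound for personal cloud-storage domains.
from urllib.parse import urlparse

PERSONAL_STORAGE_DOMAINS = {"dropbox.com", "drive.google.com", "box.com"}  # example entries only

def violates_byod_policy(destination_url: str) -> bool:
    """Return True if the destination host matches a listed personal storage domain."""
    host = urlparse(destination_url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in PERSONAL_STORAGE_DOMAINS)

# e.g. violates_byod_policy("https://www.dropbox.com/upload") -> True
```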

By developing cogent and reasonable policies, training employees, and deploying effective enabling technologies, organizations can better prevent unauthorized disclosures of confidential information. Only by adopting such professionally recognized best practices can companies hope to shield their proprietary data from the prying eyes of third parties in the digital age.

Twitter Contempt Sanctions Increase Need for Social Media Governance Plan

Thursday, September 13th, 2012

The headline-grabbing news this week regarding Twitter facing possible contempt sanctions is an important reminder that organizations should consider developing a strategy for addressing social media governance. In criminal proceedings against protesters involved in the Occupy Wall Street movement, a New York state court ordered Twitter several weeks ago to turn over various tweets, relating to the movement’s blocking of the Brooklyn Bridge last year, that a protester had deleted from his Twitter feed. Twitter has delayed compliance with that order, which has invited the court’s wrath: “I can’t put Twitter or the little blue bird in jail, so the only way to punish is monetarily.” The court is now threatening Twitter with a monetary contempt sanction based on “the company’s earnings statements for the past two quarters.”

At first blush, the proceeding involving Twitter may not seem paradigmatic for organizations. While most organizations do not engage in civil disobedience and typically stay clear of potential criminal actions, the conduct of the protester in unilaterally deleting his tweets raises the question of whether organizations have developed an effective policy to retain and properly supervise communications made through social networking sites.

Organizations in various industry verticals need to ensure that certain messages communicated through social media sites are maintained for legal or regulatory purposes. For example, financial services firms such as broker-dealers must retain communications with investors and other records that relate to their “business as such” – including those made through social networking sites – for at least three years under SEC Rule 17a-4(b), promulgated under the Securities Exchange Act of 1934. Though this provision is fairly straightforward, it has troubled regulated companies for years. Indeed, almost two-thirds of surveyed asset managers reported that “regulatory recordkeeping” remains their greatest challenge with respect to social media.

Supervision is another troubling issue. With the proliferation of smartphones, burgeoning “bring your own device” (BYOD) policies and the demands of a 24-hour workday, supervision cannot be boiled down to a simple protocol of “I’ll review your messages before you hit send.” Yet supervision is necessary, particularly given the consequences for rogue communications including litigation costs, lost revenues, reduced stock price and damage to the company brand.

Though there are no silver bullets to ensure perfection regarding these governance challenges, organizations can follow some best practices to develop an effective social media governance policy. The first is that companies should prepare a global plan for how they will engage in social media marketing. This initial step is particularly important for groups that are just now exploring the use of social media to communicate with third parties. Having a plan in place that maps out a contact and communication strategy, provides for supervision of company representatives and accounts for compliance with regulatory requirements is essential.

The next step involves educating and training employees regarding the company’s social media policy. This should include instructions regarding what content may be posted to social networking sites and the internal process for doing so. Policies that describe the consequences for deviating from the social media plan should also be clearly delineated. Those policies should detail the legal repercussions – civil and criminal – for both the employee and the organization for social media missteps.

Third, organizations can employ technology to ensure compliance with their social media plan. This may include archiving software and other technology that both retains and enables a cost-effective supervisory review of content. Electronic discovery tools that enable legal holds and efficiently retrieve archived social media content are also useful in developing an efficient and cost-effective response to legal and regulatory requests.
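
As a rough sketch of the retention and legal-hold bookkeeping such tools perform, the following illustrative Python model shows how a hold can override an expired retention period; the field names and the three-year default are assumptions for the example, not any vendor’s schema.

```python
# Illustrative sketch of archiving/legal-hold record-keeping, not a product API.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ArchivedPost:
    post_id: str
    author: str
    network: str
    captured_on: date
    on_legal_hold: bool = False
    retention: timedelta = timedelta(days=3 * 365)  # illustrative three-year default

    def eligible_for_disposal(self, today: date) -> bool:
        """A record may be disposed of only after its retention period has
        expired and only if no legal hold applies."""
        return (not self.on_legal_hold) and today >= self.captured_on + self.retention

# e.g. ArchivedPost("42", "jdoe", "Twitter", date(2012, 9, 13)).eligible_for_disposal(date.today())
```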

By following these steps and other best practices, organizations will likely be on the way to establishing the foundation of an effective social media governance plan.

Falcon Discovery Ushers in Savings with Transparent Predictive Coding

Tuesday, September 4th, 2012

The introduction of Transparent Predictive Coding to Symantec’s Clearwell eDiscovery Platform helps organizations defensibly reduce the time and cost of document review. Predictive coding refers to machine learning technology that can be used to automatically predict how documents should be classified based on limited human input. As expert reviewers tag documents in a training set, the software identifies common criteria across those documents, which it uses to “predict” the responsiveness of the remaining case documents. The result is that fewer irrelevant and non-responsive documents need to be reviewed manually – thereby accelerating the review process, increasing accuracy and allowing organizations to reduce the time and money spent on traditional page-by-page attorney document review.
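
To make the workflow concrete, here is a minimal, hypothetical sketch of the general technique using open-source scikit-learn: a model is trained on reviewer-tagged examples and then ranks unreviewed documents by predicted responsiveness. This is an illustration of the general approach, not a description of Clearwell’s actual implementation, and the documents shown are invented.

```python
# Generic predictive-coding sketch with scikit-learn (illustrative only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training set: documents an expert reviewer has already tagged.
train_docs = ["board approval of the merger terms", "office picnic signup sheet"]
train_labels = [1, 0]  # 1 = responsive, 0 = non-responsive

# Remaining case documents whose responsiveness we want to predict.
unreviewed_docs = ["draft merger agreement circulated to the board",
                   "cafeteria menu for next week"]

vectorizer = TfidfVectorizer()
X_train = vectorizer.fit_transform(train_docs)
model = LogisticRegression().fit(X_train, train_labels)

# Rank unreviewed documents by predicted probability of responsiveness,
# so reviewers can prioritize (or cut off) manual review.
scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
for doc, score in sorted(zip(unreviewed_docs, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {doc}")
```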

Given the cost, speed and accuracy improvements that predictive coding promises, its adoption may seem to be a no-brainer. Yet predictive coding technology hasn’t been widely adopted in eDiscovery – largely because the technology and process itself still seems opaque and complex. Symantec’s Transparent Predictive Coding was developed to address these concerns and provide the level of defensibility necessary to enable legal teams to adopt predictive coding as a mainstream technology for eDiscovery review. Transparent Predictive Coding provides reviewers with complete visibility into the training and prediction process and delivers context for more informed, defensible decision-making.

Early adopters like Falcon Discovery have already witnessed the benefits of Transparent Predictive Coding. Falcon is a managed services provider that leverages a mix of top legal talent and cutting-edge technologies to help corporate legal departments, and the law firms that serve them, manage discovery and compliance challenges across matters. Recently, we spoke with Don McLaughlin, founder and CEO of Falcon Discovery, on the firm’s experiences with and lessons learned from using Transparent Predictive Coding.

1. Why did Falcon Discovery decide to evaluate Transparent Predictive Coding?

Predictive coding is obviously an exciting development for the eDiscovery industry, and we want to be able to offer Falcon’s clients the time and cost savings that it can deliver. At the same time there is an element of risk. For example, not all solutions provide the same level of visibility into the prediction process, and helping our clients manage eDiscovery in a defensible manner is of paramount importance. Over the past several years we have tested and/or used a number of different software solutions that include some assisted review or prediction technology. We were impressed that Symantec has taken the time and put in the research to integrate best practices into its predictive coding technology. This includes elements like integrated, dynamic statistical sampling, which takes the guesswork out of measuring review accuracy. This ability to look at accuracy across the entire review set provides a more complete picture, and helps address key issues that have come to light in some of the recent predictive coding court cases like Da Silva Moore.

2. What’s something you found unique or different from other solutions you evaluated?

I would say one of the biggest differentiators is that Transparent Predictive Coding uses both content and metadata in its algorithms to capture the full context of an e-mail or document, which we found to be appealing for two reasons. First, you often have to consider metadata during review for sensitive issues like privilege and to focus on important communications between specific individuals during specific time periods. Second, this can yield more accurate results with less work because the software has a more complete picture of the important elements in an e-mail or document. This faster time to evaluate the documents is critical for our clients’ bottom line, and enables more effective litigation risk analysis, while minimizing the chance of overlooking privileged or responsive documents.

3. So what were some of the success metrics that you logged?

Using Transparent Predictive Coding, Falcon was able to achieve extremely high levels of review accuracy with only a fraction of the time and review effort. If you look at academic studies on linear search and review, even under ideal conditions you often get somewhere between 40% and 60% accuracy. With Transparent Predictive Coding we are seeing accuracy measures closer to 90%, which means we are often achieving 90% recall and 80% precision by reviewing only a small fraction – under 10% – of the data population that you might otherwise review document-by-document. For the appropriate case and population of documents, this enables us to cut review time and costs by 90% compared to pure linear review. Of course, this is on top of the significant savings derived from leveraging other technologies to intelligently cull the data to a more relevant review set, prior to even using Transparent Predictive Coding. This means that our clients can understand the key issues, and identify potentially ‘smoking gun’ material, much earlier in a case.
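
For readers unfamiliar with these metrics, the arithmetic behind recall and precision is straightforward; the counts below are hypothetical and chosen only to reproduce the 90%/80% figures mentioned above.

```python
# Recall/precision arithmetic with hypothetical review counts.
true_positives = 900   # responsive documents the model flagged
false_negatives = 100  # responsive documents the model missed
false_positives = 225  # non-responsive documents the model flagged

recall = true_positives / (true_positives + false_negatives)      # 0.90
precision = true_positives / (true_positives + false_positives)   # 0.80
print(f"recall={recall:.0%} precision={precision:.0%}")
```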

4. How do you anticipate using this technology for Falcon’s clients?

I think it’s easy for people to get swept up by the “latest and greatest” technology or gadget and assume this is the silver bullet for everything we’ve been toiling over before. Take, for example, the smartphone camera – great for a lot of (maybe even most) situations, but sometimes you’re going to want that super zoom lens or even (gasp!) regular film. By the same token, it’s important to recognize that predictive coding is not an across-the-board substitute for other important eDiscovery review technologies and targeted manual review. That said, we’ve leveraged Clearwell to help our clients lower the time and costs of the eDiscovery process on hundreds of cases now, and one of the main benefits is that the solution offers the flexibility of using any number of advanced analytics tools to meet the specific requirements of the case at hand. We’re obviously excited to be able to introduce our clients to this predictive coding technology – and the time and cost benefits it can deliver – but this is in addition to other Clearwell tools, like advanced keyword search, concept or topic clustering, domain filtering, discussion threading and so on, that can and should be used together with predictive coding.

5. Based on your experience, do you have advice for others who may be looking to defensibly reduce the time and cost of document review with predictive coding technology?

The goal of the eDiscovery process is not perfection. At the end of the day, whether you employ a linear review approach and/or leverage predictive coding technology, you need to be able to show that what you did was reasonable and achieved an acceptable level of recall and precision. One of the things you notice with predictive coding is that as you review more documents, the recall and precision scores go up, but at a decreasing rate. A key element of a reasonable approach to predictive coding is measuring your review accuracy using a proven statistical sampling methodology. This includes measuring recall and precision accurately to ensure the predictive coding technology is performing as expected. We’re excited to be able to deliver this capability to our clients out of the box with Clearwell, so they can make more informed decisions about their cases early on and, when necessary, address concerns of proportionality with opposing parties and the court.
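
As a generic illustration of sampling-based validation (plain statistics, not any vendor’s built-in methodology), a simple random sample with a normal-approximation confidence interval might look like the sketch below; the sample size and predicate are assumptions for the example.

```python
# Illustrative sampling sketch for estimating a review-quality rate.
import math
import random

def estimate_rate(population, predicate, sample_size=400, z=1.96, seed=42):
    """Estimate the share of documents satisfying `predicate`
    (e.g. 'was a responsive document correctly flagged?')
    with a rough 95% confidence interval."""
    random.seed(seed)
    sample = random.sample(population, min(sample_size, len(population)))
    p = sum(1 for doc in sample if predicate(doc)) / len(sample)
    margin = z * math.sqrt(p * (1 - p) / len(sample))
    return p, margin

# Usage: docs = [...]; rate, moe = estimate_rate(docs, lambda d: d["flagged"])
```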

To find out more about Transparent Predictive Coding, visit http://go.symantec.com/predictive-coding

#InfoGov Twitter Chat Homes in on Starting Places and Best Practices

Tuesday, July 3rd, 2012

Unless you’re an octogenarian living in rural Uzbekistan,[i] you’ve likely seen the meteoric rise of social media over the last decade. Even beyond hyper-texting teens, businesses are taking advantage of this relatively new medium to engage with their more technically savvy customers. Recently, Symantec held its first “Twitter Chat” on the topic of information governance (fondly referred to on Twitter as #InfoGov). For those not familiar with the concept, a Twitter Chat is a virtual discussion held on Twitter using a specific hashtag – in this case #IGChat. At a set date and time, parties interested in the topic log into Twitter and start participating in the fireworks on the designated hashtag.

“Fireworks” may be a bit overstated, but given that the moderators (eDiscovery Counsel at Symantec) and participants were limited to 140 characters, the “conversation” was certainly frenetic. Despite the fast pace, one benefit of a Twitter Chat is that you can communicate with shortened web links, as a way to share and discuss content beyond the severely limited character count. During this somewhat staccato discussion, we found the conversation to take some interesting twists and turns, which I thought I’d excerpt (and expound upon[ii]) in this blog.

Whether in a Twitter Chat or otherwise, once the discussion of information governance begins everyone wants to know where to start. The #IGChat was no different.

  • Where to begin?  While there wasn’t consensus per se on a good starting place, one cogent remark out of the blocks was: “The best way to start is to come up with an agreed upon definition — Gartner’s is here t.co/HtGTWN2g.” While the Gartner definition is a good starting place, there are others out there that are more concise. The eDiscovery Journal Group has a good one as well:  “Information Governance is a comprehensive program of controls, processes, and technologies designed to help organizations maximize the value of information assets while minimizing associated risks and costs.”  Regardless of the precise definition, it’s definitely worth the cycles to rally around a set construct that works for your organization.
  • Who’s on board?  The next topic centered on finding the right people in the organization to participate in the information governance initiative. InfoGovlawyer chimed in: “Seems to me like key #infogov players should include IT, Compliance, Legal, Security reps.” Then, PhilipFavro suggested that the “[r]ight team would likely include IT, legal, records managers, pertinent business units and compliance.” Similar to the previous question, at this stage in the information governance maturation process, there isn’t a single, right answer. More importantly, the team needs to have stakeholders from at least Legal and IT, while bringing in participants from other affected constituencies (Infosec, Records, Risk, Compliance, etc.) – basically, anyone interested in maximizing the value of information while reducing the associated risks.
  • Where’s the ROI?  McManusNYLJ queried: “Do you think #eDiscovery, #archiving and compliance-related technology provide ample ROI? Why or why not?”  Here, the comments came in fast and furious. One participant pointed out that case law can be helpful in showing the risk reduction:  “Great case showing the value of an upstream archive – Danny Lynn t.co/dcReu4Qg.” AlliWalt chimed in: “Yes, one event can set your company back millions…just look at the Dupont v. Kolon case… ROI is very real.” Another noted that “Orgs that take a proactive approach to #eDiscovery requests report a 64% faster response time, 2.3x higher success rate.” And, “these same orgs were 78% less likely to be sanctioned and 47% less likely to be legally compromised t.co/5dLRUyq6.” ROI for information governance seemed to be a nut that can be cracked any number of ways, ranging from risk reduction (avoiding sanctions and adverse legal decisions) to better preparedness. Here too, an organization’s particular sensitivities should come into play since not all entities will have the same concerns about risk reduction, for example.
  • Getting Granular. Pegduncan, an active subject matter expert on the topic, noted that showing ROI was the right idea, but not always easy to demonstrate: “But you have to get their attention. Hard to do when IT is facing funding challenges.” This is when granular eDiscovery costs were mentioned: “EDD costs $3-18k per gig (RAND survey) and should wake up most – adds up w/ large orgs having 147 matters at once.” Peg wasn’t that easily convinced: “Agreed that EDD costs are part of biz case, but .. it’s the problem of discretionary vs non-discretionary spending.”
  • Tools Play a Role. One participant asked: “what about tools for e-mail thread analysis, de-duplication, near de-duplication – are these applicable to #infogov?” A participant noted that “in the future we will see tools like #DLP and #predictivecoding used for #infogov auto-classification – more on DLP here: t.co/ktDl5ULe.” Pegduncan chimed in that “DLP=Data Loss Prevention. Link to Clearwell’s post on Auto-Classification & DLP t.co/ITMByhbj.” A rough sketch of the de-duplication idea follows this list.
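
Because the chat touched on de-duplication and near-de-duplication, here is an illustrative sketch of both ideas using plain Python hashing and word shingles; the shingle size and similarity threshold are arbitrary choices for the example, not product settings.

```python
# Illustrative de-duplication sketch: exact duplicates via hashing,
# near-duplicates via word-shingle (Jaccard) overlap.
import hashlib

def exact_fingerprint(text: str) -> str:
    """Identical normalized text yields identical fingerprints (exact dedupe)."""
    return hashlib.sha256(text.lower().encode("utf-8")).hexdigest()

def shingles(text: str, k: int = 3) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def near_duplicate(a: str, b: str, threshold: float = 0.8) -> bool:
    sa, sb = shingles(a), shingles(b)
    jaccard = len(sa & sb) / len(sa | sb) if (sa | sb) else 1.0
    return jaccard >= threshold

# e.g. near_duplicate("Please review the attached draft agreement today.",
#                     "Please review the attached draft agreement tomorrow.")
```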

With a concept as broad and complex as information governance, it’s truly amazing that a cogent “conversation” can take place in a series of 140-character tweets. As the Twitter Chat demonstrates, the information governance concept continues to evolve and is doing so through discussions like this one via a social media platform. As with many of the key information governance themes (Ownership, ROI, Definition, etc.), there isn’t a right answer at this stage, but that isn’t an excuse for not asking the critical questions. “Sooner started, sooner finished” is a motto that will serve many organizations well in these exciting times. And for folks who say they can’t spare the time, they’d be amazed at what they can learn in 140 characters.

Mark your calendars and track your Twitter hashtags now: The next #IGChat will be held on July 26 @ 10am PT.



[i] I’ve never been to rural Uzbekistan, but it just sounded remote. So, my apologies if there’s a world-class internet infrastructure there where the denizens tweet prolifically. Given that it’s one of only two doubly landlocked countries in the world, it seemed like an easy target. Uzbeks, please feel free to use the comment field and set me straight.

[ii] Minor edits were made to select tweets, but generally the shortened Twitter grammar wasn’t changed.

Survey Says… Information Governance and Predictive Coding Adoption Slow, But Likely to Gain Steam as Technology Improves

Wednesday, February 15th, 2012

The biggest legal technology event of the year, otherwise known as LegalTech New York, always seems to have a few common rallying cries, and this year was no different. In addition to cloud computing and social media, predictive coding and information governance were hot topics that dominated banter among vendors, speakers, and customers. Symantec conducted a survey on the exhibit show floor to find out what attendees really thought about these two burgeoning areas and to explore what the future might hold.

Information Governance is critical, understood, and necessary – but it is not yet being adequately addressed.

Although 84% of respondents are familiar with the term information governance and 73% believe that an integrated information governance strategy is critical to reducing information risk and cost, only 19% have implemented an information governance solution. These results raise the question: if information governance is critical, why aren’t more organizations adopting information governance practices?

Perhaps the answer lies in the cross-functional nature of information governance and confusion about who is responsible for the organization’s information governance strategy. For example, the survey also revealed that information governance is a concept that incorporates multiple functions across the organization, including email/records retention, data storage, data security and privacy, compliance, and eDiscovery. Given the broad impact of information governance across the organization, it is no surprise that respondents also indicated that multiple departments within the organization – including Legal, IT, Compliance, and Records Management – have an ownership stake.

These results tend to suggest at least two things.  First, information governance is a concept that touches multiple parts of the organization.  Defining and implementing appropriate information governance policies across the organization should include an integrated strategy that involves key stakeholders within the organization.  Second, recognition that information governance is a common goal across the entire organization highlights the fact that technology must evolve to help address information governance challenges.

The days of relying too heavily on disconnected point solutions to address eDiscovery, storage, data security, and record retention concerns are numbered as organizations continue to mandate internal cost cutting and data security measures. Decreasing the number of point solutions an organization supports and improving integration between the remaining solutions is a key component of a good information governance strategy because it drives down technology and labor costs. Similarly, an integrated solution strategy helps streamline the backup, retrieval, and overall management of critical data, which simultaneously increases worker productivity and reduces organizational risk in areas such as eDiscovery and data loss prevention.

The trail that leads from point solutions to an integrated solution strategy is already being blazed in the eDiscovery space and this trend serves as a good information governance roadmap.  More and more enterprises faced with investigations and litigation avoid the cost and time of deploying point solutions to address legal hold, data collection, data processing, and document review in favor of a single, integrated, enterprise eDiscovery platform.  The resulting reduction in cost and risk is significant and is fueling support for even broader information governance initiatives in other areas.  These broader initiatives will still include integrated eDiscovery solutions, but the initiatives will continue to expand the integrated solution approach into other areas such as storage management, record retention, and data security technologies to name a few.

Despite widespread familiarity, predictive coding technology has not yet seen mainstream adoption, but the future looks promising.

Much like the term information governance, most respondents were familiar with predictive coding technology for electronic discovery, but the survey results indicated that adoption of the technology to date has been weak.  Specifically, the survey revealed that while 97% of respondents are familiar with the term predictive coding, only 12% have adopted predictive coding technology.  Another 19% are “currently adopting” or plan to adopt predictive coding technology, but the timeline for adoption is unclear.

When asked what challenges “held back” respondents from adopting predictive coding technology, most cited accuracy, cost, and defensibility as their primary concerns. Concerns about “privilege/confidentiality” and difficulty understanding the technology were also cited as reasons impeding adoption. Significantly, 70% of respondents believe that predictive coding technology would “go mainstream” if it were easier to use, more transparent, and less expensive. These findings are consistent with the observations articulated in my recent blog post (2012: Year of the Dragon and Predictive Coding – Will the eDiscovery Landscape Be Forever Changed?).

The survey results, combined with the potential cost savings associated with predictive coding technology, suggest that the movement toward predictive coding is gaining steam. Lawyers are typically reluctant to embrace new technology that is not intuitive because it is difficult to defend a process that is difficult to understand. The complexity and confusion surrounding today’s predictive coding technology were highlighted during a recent status conference in Da Silva Moore v. Publicis Groupe, et al. The case is venued in the Southern District of New York before Magistrate Judge Andrew Peck and serves as further evidence of the technology’s growing momentum. Expect future proceedings in the Da Silva Moore case to further validate these survey results by revealing both the promise and complexity of current predictive coding technologies. Similarly, expect next-generation predictive coding technology to address current complexities by becoming easier to use, more transparent, and less expensive.

Foreign Corrupt Practices Act (FCPA) Drives Increased Electronic Discovery Overseas

Tuesday, May 5th, 2009

Ask a European about e-discovery, or e-disclosure as it is called in the UK, and you will often be met with a look of distaste. Much like SUVs or obesity, electronic discovery is viewed as an unpleasant, uniquely American phenomenon. But, in reality, there are fat people in Paris, Range Rovers all over London, and a lot of electronic discovery happening all across Continental Europe – whether people like to admit it or not.

One reason for that is the Foreign Corrupt Practices Act (FCPA). This US law, which has inspired similar legislation in other countries, prohibits companies from engaging in corruption, such as bribing government officials to win large contracts. That sounds simple enough, but it’s not always easy to do. For example, an American friend of mine runs a travel website in China. To advertise, he hired people to hand out flyers at all the major train stations. But after a few weeks, his employees began to get hassled by station officials who said they needed an official “permit”. So he did what anyone would do and paid the “permit fees” even though no paperwork for this “permit” was ever produced. When his US auditors looked at that, they immediately cried foul. He was then compelled to end the practice and bring in a law firm to conduct a full FCPA investigation. The result: lots of legal bills, no more advertising in train stations, and a more powerful Chinese-run competitor who has no such qualms about paying “permit fees”.

In speaking to Daniel Dorsky, Tyco’s Compliance Counsel and an expert in FCPA issues, I discovered that my friend’s experience is no longer the exception. From what Daniel described, enforcement of the FCPA has been stepped up dramatically in the past couple of years. Apparently, 2007 was the watershed. Prior to that, no one really worried about the FCPA too much. But two years ago, the Department of Justice (DoJ), under Mark Mendelsohn, began to take a different approach. First, the fines became much stiffer as, for example, Baker Hughes got hit with a $44 million penalty, by far the largest ever at the time. Second, the DoJ started to prosecute executives personally, bringing 15 criminal cases against individuals. Nothing focuses the mind like the threat of jail time, and FCPA compliance suddenly took on greater urgency.

The number of FCPA enforcement actions continued to increase in 2008, most notably with the infamous Siemens case. By the time the dust settled, the CEO of Siemens had been fired and the company was reeling from a $1.4 billion fine. Nor do things look like they are slowing down in 2009. In the first few months of this year, ABB took an $800 million accounting reserve for FCPA issues, Halliburton got fined $177 million, KBR $502 million, and the KBR CEO, Albert Stanley, got 7 years in jail to go along with his $11 million personal fine. These companies are also now vulnerable to civil suits. While there’s no private right of action under the FCPA, that does not stop securities fraud class actions or shareholder lawsuits, which charge that defendants either understated the risks or overstated the controls in their disclosures.

There are a number of reasons why FCPA enforcement actions will likely increase further in the coming months and years. The FBI recently created an FCPA task force of 8-12 agents, bringing all the standard law enforcement tools to FCPA compliance (e.g., wire-taps, subpoenas, informants, warrants, etc.). Many other countries are starting to enforce similar laws, with much encouragement from the US, which does not want to see American businesses disadvantaged by doing the right thing. And international law enforcement agencies are cooperating more than ever before. For example, last summer in Paris, international agencies held their first FCPA conference to share information.

All of this is driving a boom in e-discovery as General Counsels and Compliance Officers regularly conduct investigations of their overseas subsidiaries to ensure FCPA compliance. These investigations often center on “red flag” countries like China, Brazil, or Russia, where compliance is most difficult. They almost always involve outside counsel, and require the processing, analysis and review of large volumes of electronic information. This applies to European companies as much as it does to American ones. Non-US nationals can be prosecuted if either communications or money goes via the US, and many European countries are following the DoJ’s lead (e.g., $600 million of Siemens’ $1.4 billion fine came from German authorities).

So no matter how Europeans feel about e-discovery, or e-disclosure, they will be doing more of it in the coming years, much like their American counterparts. It’s fair to say that, in this domain, as perhaps in others, Europeans and Americans have much more in common than they might think.