
Archive for the ‘user-generated content’ Category

Breaking News: Court Clarifies Duty to Preserve Evidence, Denies eDiscovery Sanctions Motion Against Pfizer

Wednesday, April 18th, 2012

It is fortunately becoming clearer that organizations do not need to preserve information until litigation is “reasonably anticipated.” In Brigham Young University v. Pfizer (D. Utah Apr. 16, 2012), the court denied the plaintiff university’s fourth motion for discovery sanctions against Pfizer, likely ending its chance to obtain a “game-ending” eDiscovery sanction. The case, which involves disputed claims over the discovery and development of prominent anti-inflammatory drugs, is set for trial on May 29, 2012.

In Brigham Young, the university pressed its case for sanctions against Pfizer based on a vastly expanded concept of a litigant’s preservation duty. Relying principally on the controversial Phillip M. Adams & Associates v. Dell case, the university argued that Pfizer’s “duty to preserve runs to the legal system generally.” The university reasoned that just as the defendant in the Adams case was “sensitized” by earlier industry lawsuits to the real possibility of plaintiff’s lawsuit, Pfizer was likewise put on notice of the university’s claims due to related industry litigation.

The court rejected such a sweeping characterization of the duty to preserve, opining that it was “simply too broad.” Echoing the concerns articulated by the Advisory Committee when it framed the 2006 amendments to the Federal Rules of Civil Procedure (FRCP), the court took pains to emphasize the unreasonable burdens that parties such as Pfizer would face if such a duty were imposed:

“It is difficult for the Court to imagine how a party could ever dispose of information under such a broad duty because of the potential for some distantly related litigation that may arise years into the future.”

The court also rejected the university’s argument because such a position failed to appreciate the basic workings of corporate records retention policies. As the court reasoned, “[e]vidence may simply be discarded as a result of good faith business procedures.” When those procedures operate to inadvertently destroy evidence before the duty to preserve is triggered, the court held that sanctions should not issue: “The Federal Rules protect from sanctions those who lack control over the requested materials or who have discarded them as a result of good faith business procedures.”

The Brigham Young case is significant for a number of reasons. First, it reiterates that organizations need not keep electronically stored information (ESI) for legal or regulatory purposes until the duty to preserve is triggered by reasonably anticipated litigation. As American courts have almost uniformly held since the 1997 case of Concord Boat Corp. v. Brunswick Corp., organizations are not required to keep every piece of paper, every email, every electronic document and every backup tape.

Second, Brigham Young emphasizes that organizations can and should use document retention protocols to rid themselves of data stockpiles. Absent a preservation duty or other exceptional circumstances, paring back ESI pursuant to “good faith business procedures” (such as a neutral retention policy) will be protected under the law.

Finally, Brigham Young narrows the holding of the Adams case to its particular facts. The Adams case has been particularly troublesome to organizations as it arguably expanded their preservation duty in certain circumstances. However, Brigham Young clarified that this expansion was unwarranted in the instant case, particularly given that Pfizer documents were destroyed pursuant to “good faith business procedures.”

In summary, Brigham Young teaches that organizations will be protected from eDiscovery sanctions to the extent they destroy ESI in good faith pursuant to a reasonable records retention policy. This will likely bring a sigh of relief to enterprises struggling with the information explosion, since it encourages confident deletion of data when no discrete litigation event is on the horizon.

LTNY Wrap-Up – What Did We Learn About eDiscovery?

Friday, February 10th, 2012

Now that the dust has settled, the folks who attended LegalTech New York 2012 can try to get to the mountain of emails that accumulated during the event. Fortunately, there was no ice storm this year, and for the most part, people seemed to heed my “what not to do at LTNY” list. I even found the Starbucks across the street more crowded than the one in the hotel. There was some alcohol-induced hooliganism at a vendor’s party, but most of the other social mixers seemed uniformly tame.

Part of Dan Patrick’s syndicated radio show features a “What Did We Learn Today?” segment, and that inquiry seems fitting for this year’s LegalTech.

  • First of all, the prognostications about buzzwords were spot on, with no shortage of cycles spent on predictive coding (aka Technology Assisted Review). The general session on Monday, hosted by Symantec, had close to a thousand attendees on the edge of their seats to hear Judge Peck, Maura Grossman and Ralph Losey wax eloquent about the ongoing man versus machine debate. Judge Peck uttered a number of quotable sound bites, including the quote of the day: “Keyword searching is absolutely terrible, in terms of statistical responsiveness.” Stay tuned for a longer post with more comments from the general session.
  • Ralph Losey went one step further when commenting on keyword search, stating: “It doesn’t work … I hope it’s been discredited.” A few have commented that this lambasting may have gone too far, and I’d tend to agree. It’s not that keyword search is horrific per se. It’s just that its efficacy is limited, and the real trouble lies in the hubris of the average user, who thinks eDiscovery search is like Google search. It’s important to keep in mind that all these eDiscovery applications are just tools in the practitioner’s toolbox, and they need to be deployed for the right task. Otherwise, the old saw (pun intended) that “when all you have is a hammer, everything looks like a nail” will inevitably come true.
  • This year’s show also finally put a nail in the coffin of the human review process as the eDiscovery gold standard. That doesn’t mean that attorneys everywhere will abandon the linear review process any time soon, but hopefully it’s becoming increasingly clear that the “evil we know” isn’t very accurate (on top of being very expensive). If that deadly combination doesn’t get folks experimenting with technology assisted review, I don’t know what will.
  • Information governance was also a hot topic, only paling in comparison to Predictive Coding. A survey Symantec conducted at the show indicated that this topic is gaining momentum, but still has a ways to go in terms of action. While 73% of respondents believe an integrated information governance strategy is critical to reducing information risk, only 19% have implemented a system to help them with the problem. This gap presumably indicates a ton of upside for vendors who have a good, attainable information governance solution set.
  • The Hilton still leaves much to be desired as a host location. As they say, familiarity breeds contempt, and for those who’ve notched more than a handful of LegalTech shows, the venue can feel a bit like the movie Groundhog Day, but without Bill Murray. Speculation continues to run rampant about a move to the Javits Center, but the show would likely need to expand pretty significantly before ALM would make the move. And, if there ever was a change, people would assuredly think back with nostalgia on the good old days at the Hilton.
  • Despite the bright lights and elevator advertisement trauma, the mood seemed pretty ebullient, with tons of partnerships, product announcements and consolidation. This positive vibe was a nice change after the last two years when there was still a dark cloud looming over the industry and economy in general.
  • Finally, this year’s show also seemed to embrace social media in a way that it hadn’t in years past. Yes, all the social media vehicles were around in years past, but this year many of the vendors’ campaigns seemed to be much more integrated. It was funny to see even the most technically resistant lawyers log in to Twitter (for the first time) to post comments about the show as a way to win premium vendor swag. Next year, I’m sure we’ll see an even more pervasive social media influence, which is a bit ironic given the eDiscovery challenges associated with collecting and reviewing social media content.

Backup Tapes and Archives Bursting at the Seams? The Seven Year Itch Has Technology to Answer the Scratch

Monday, December 12th, 2011

Just like Marilyn Monroe stopped traffic in her white dress in The Seven Year Itch, enterprises are being stopped dead in their tracks by the data explosion, the lack of information governance policies and overstuffed IT infrastructures.  During the 2004-05 timeframe, a large number of enterprises began migrating to an archive, and this trend has continued at a steady pace since.  Archiving historically began with email, but has recently been extended to many other forms of information, including social media, unstructured data and cloud content.  This adoption was somewhat related to the historic Zubulake ruling, which required preservation to attach upon “reasonable anticipation of litigation.”  Another significant driver behind the need for an archive is the ability to comply with a range of statutes and regulations.  The reality is that it is difficult to preserve efficiently and defensibly without an archive and other automatic classification technologies.  Some companies still complete the information management and eDiscovery processes manually, but not without peril.

There is now an upsurge of corporations finally starting to shrink the archives they implemented to manage email, legal preservation requirements and regulatory compliance.  After roughly seven years, over which time there have been many advances in technology, a shift in thinking is taking place with regard to information governance and data retention.  The change has been born of necessity, as infrastructures are straining under the amount of data they retain and the pain of searching it.  This shift will enable companies to delete with confidence, clean up their backup tapes, shrink their archives, and manage and expire data effectively on a go-forward basis.  Collectively, this kind of good information governance hygiene allows organizations to minimize the litigation risk attendant to bloated information stores.

One reason many archives have become so bloated is that enterprises purchased archiving software but never properly enabled expiry procedures tied to a defensible document retention policy.  The result has been saving everything for the past seven or so years.  Another reason for retaining all data in the archive is that enterprises were afraid to delete anything, fearing accusations of spoliation or an inability to retrieve data that should have been on legal hold.  Together, these two factors have forced companies to confront the burden of searching this massive amount of archived data each time a matter arises.  The resulting collection workflow is time consuming and expensive, especially for companies that still employ third party vendors for data collection.  For many organizations, the situation has become unsustainable from both a legal and an IT perspective.

In recent years, backup has received less attention as archives have become more common, storage has become more affordable, and most lawyers argue that tapes are “inaccessible,” making restoration less common.  However, over-retention of backup remains an area of concern, especially when organizations do not have an archive.  They may be required to produce backup tapes because much of the information relevant to a matter could be contained on them.  This has led to saving large numbers of backup tapes with no real knowledge of what data is on them and no one willing to pull the trigger on deletion.  When an organization is forced to restore backup tapes, the process can be expensive and an eDiscovery nightmare.

For example, in Moore v. Gilead Sciences (N.D. Cal. Nov. 16, 2011), the plaintiff sought production of “all archived emails” that he sent or received during his five-year tenure with the defendant pharmaceutical company.  The company objected to the request as unduly burdensome.  The company argued that:

  1. The emails were exclusively stored on its disaster recovery backup tapes;
  2. It would cost $360,000 to index those tapes, exclusive of processing and review costs;
  3. Many of the requested emails would not be retrieved since the company conducted its backups on monthly (not daily) intervals; and
  4. Over 25,000 pages of the plaintiff’s emails had already been produced in the litigation.

Inaccessibility and undue burden arguments are commonly raised to fight the indexing and restoration of backup tapes.  However, where a discovery dispute has merit, courts routinely reject projected cost estimates (such as the company’s $360,000 figure) as unfounded or speculative and order production anyway.  [See Pippins v. KPMG and Escamilla v. SMS Holdings Corp.]  Had the judge gone the other way on restoration in Moore, the outcome could easily have been different, expensive and detrimental to the company.

What does this mean for organizations keeping seven years or more of legacy content?  First, take inventory of where backup tapes reside and determine whether they need to be saved or can be deleted.  Most corporations have amassed many tapes that are now only a legal liability.  Technology exists today that can index and search what is on the tapes, enabling educated decisions about whether to delete them or transfer their contents to the archive for legal hold.  Essentially, new technology can give sight to the blind.  Those decisions must be made according to a plan and documented.  Backup should be used only for disaster recovery.

Second, purchase an archive if the company does not yet have one, and configure it to expire data according to the document retention policy; doing so can help protect the company’s deletion decisions under applicable safe harbor provisions.

Is the company, like many others right now, dealing with an archive that is bursting at the seams?  If the company does have an archive, check whether expiry has been properly deployed according to the company’s policy.  If not, initiate a project to purge the archive of information that is retained unnecessarily and should not be subject to discovery.  Again, this must be documented.  Archives exist to serve discovery, and they need to be lean, efficient, and actively executing the information management lifecycle.
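
To make the expiry step concrete, here is a minimal sketch in Python of the basic logic an archive expiry job applies. The record categories, retention periods and item fields are hypothetical placeholders, not any particular archiving product’s API; the point is the order of the checks (legal hold first, then age) and the documented audit trail.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per record category; real values come
# from the organization's documented retention schedule.
RETENTION_PERIODS = {
    "email": timedelta(days=3 * 365),
    "finance": timedelta(days=7 * 365),
    "general": timedelta(days=365),
}

def expire_archive(items, legal_hold_ids, now=None):
    """Return (to_delete, audit_log) for a batch of archived items.

    Each item is a dict with 'id', 'category', and 'created' (an aware
    datetime). Items under legal hold are never expired, and every
    decision is logged so the process is documented and defensible.
    """
    now = now or datetime.now(timezone.utc)
    to_delete, audit_log = [], []
    for item in items:
        age = now - item["created"]
        limit = RETENTION_PERIODS.get(item["category"], RETENTION_PERIODS["general"])
        if item["id"] in legal_hold_ids:
            decision = "retained: legal hold"
        elif age > limit:
            decision = "expired: past retention period"
            to_delete.append(item["id"])
        else:
            decision = "retained: within retention period"
        audit_log.append((item["id"], item["category"], decision))
    return to_delete, audit_log
```

The key design choice is that the legal hold test comes before the age test, and the audit log records why each item was kept or deleted, which is what makes the deletion defensible.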

Avoid requests for backup tapes in litigation by having a sufficient archive and by clearly stating that backup tapes are solely for disaster recovery. Delete tapes when possible by analyzing what is on them with appropriate technology, through a documented process, thereby minimizing the chance that they become discoverable in litigation.

In sum, it is very helpful to examine the EDRM model and map out which technologies and policies apply to each stage of the continuum.  For the challenges addressed in this post, both backup tapes and the archive fall under information management, at the far left of the model.  Backup tapes need search and expiry so that only what is necessary for legal hold is retained and ingested into an archive; then the company’s disaster recovery policies should be enforced on a go-forward basis.  Similarly, the archive needs search and expiration according to document retention policies so it does not become overgrown. From left to right, the model logically walks through the lifecycle of data, and many of the responsibilities associated with that data can be automated.  This project will require commitment, resources and time, but given that data is only growing, there aren’t really any other options.

Top Ten eDiscovery Predictions for 2012

Thursday, December 8th, 2011

As 2011 comes quickly to a close we’ve attempted, as in years past, to do our best Carnac impersonation and divine the future of eDiscovery.  Some of these predictions may happen more quickly than others, but it’s our sense that all will come to pass in the near future – it’s just a matter of timing.

  1. Technology Assisted Review (TAR) Gains Speed.  The area of Technology Assisted Review is very exciting because a host of emerging technologies can help make the review process more efficient, ranging from email threading and concept search to clustering and predictive coding.  There are, however, two fundamental challenges.  First, the technology doesn’t work in a vacuum: the workflows need to be properly designed and the users need to make accurate decisions, because those judgment calls are often magnified by the application.  Second, the defensibility of the given approach needs to be well vetted.  While it’s likely not necessary (or practical) to expect a judge to mandate the use of a specific technological approach, it is important for the applied technologies to be reasonable, transparent and auditable, since the worst possible outcome would be to have a technology challenged and then find the producing party unable to adequately explain its methodology.
  2. The Custodian-Based Collection Model Comes Under Stress. Ever since the days of Zubulake, litigants have focused on “key players” as a proxy for finding relevant information during the eDiscovery process.  Early on, this model worked particularly well in an email-centric environment.  But, as discovery from cloud sources, collaborative worksites (like SharePoint) and other unstructured data repositories continues to become increasingly mainstream, the custodian-oriented collection model will become rapidly outmoded because it will fail to take into account topically-oriented searches.  This trend will be further amplified by the bench’s increasing distrust of manual, custodian-based data collection practices and the presence of better automated search methods, which are particularly valuable for certain types of litigation (e.g., patent disputes, product liability cases).
  3. The FRCP Amendment Debate Will Rage On – Unfortunately Without Much Near Term Progress. While it is clear that the eDiscovery preservation duty has become a more complex and risk-laden process, it’s not clear that this “pain” is causally related to the FRCP.  In the notes from the Dallas mini-conference, a pending Sedona survey was cited as showing that preservation challenges were increasing dramatically.  Yet there isn’t a consensus viewpoint on which changes, if any, would improve this murky problem.  In the near term this means that organizations with significant preservation pains will need to make better use of the rules already on the books and deploy enabling technologies where possible.
  4. Data Hoarding Increasingly Goes Out of Fashion. The war cry of many IT professionals that “storage is cheap” is starting to fall on deaf ears.  Organizations are realizing that the cost of storing information is just the tip of the iceberg when it comes to the litigation risk of having terabytes (and conceivably petabytes) of unstructured, uncategorized and unmanaged electronically stored information (ESI).  This tsunami of information will increasingly become a liability for organizations that have never deleted a byte of information.  In 2012, more corporations will see the need to clean out their digital houses and will realize that such cleansing (where permitted) is a best practice moving forward.  This applies with equal force to the US government, which has recently mandated such an effort at President Obama’s behest.
  5. Information Governance Becomes a Viable Reality.  For several years there’s been an effort to combine the reactive (far right) side of the EDRM with the logically connected proactive (far left) side of the EDRM.  But now, a number of surveys have linked good information governance hygiene with better response times to eDiscovery requests and governmental inquiries, as well as a corresponding lower chance of being sanctioned and the ability to turn over less responsive information.  In 2012, enterprises will realize that the litigation use case is just one way to leverage archival and eDiscovery tools, further accelerating adoption.
  6. Backup Tapes Will Be Increasingly Seen as a Liability.  Using backup tapes for disaster recovery/business continuity purposes remains a viable business strategy, although backing up to tape will become less prevalent as cloud backup increases.  However, if tapes are kept around longer than necessary (days versus months) then they become a ticking time bomb when a litigation or inquiry event crops up.
  7. International eDiscovery/eDisclosure Processes Will Continue to Mature. It’s easy to think of the US as dominating the eDiscovery landscape. While this is gospel for us here in the States, international markets are developing quickly and in many ways are ahead of the US, particularly with regulatory compliance-driven use cases, like the UK Bribery Act 2010.  This fact, coupled with the menagerie of international privacy laws, means we’ll be less Balkanized in our eDiscovery efforts moving forward, since we really do need to be thinking and practicing globally.
  8. Email Becomes “So 2009” As Social Media Gains Traction. While email has been the eDiscovery darling for the past decade, it’s getting a little long in the tooth.  In the next year, new types of ESI (social media, structured data, loose files, cloud content, mobile device messages, etc.) will cause headaches for a number of enterprises that have been overly email-centric.  Already in 2011, organizations are finding that other sources of ESI like documents/files and structured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This heterogeneous mix of ESI will certainly result in challenges for many companies, with some unlucky ones getting sanctioned because they ignored these emerging data types.
  9. Cost Shifting Will Become More Prevalent – Impacting the “American Rule.” For ages, the American Rule held that producing parties had to pay for their production costs, with a few narrow exceptions.  Next year we’ll see even more courts award winning parties their eDiscovery costs under 28 U.S.C. §1920(4) and Rule 54(d)(1) FRCP. Courts are now beginning to consider the services of an eDiscovery vendor as “the 21st Century equivalent of making copies.”
  10. Risk Assessment Becomes a Critical Component of eDiscovery. Managing risk is a foundational underpinning for litigators generally, but its role in eDiscovery has been a bit obscure.  Now, with the tremendous statistical insights made possible by enabling software technologies, it will become increasingly important for counsel to manage risk by deciding what error and precision rates are acceptable.  This risk analysis is particularly critical for any variety of technology assisted review, since precision, recall and F-measure statistics all require a delicate balance of risk and reward, as the sketch below illustrates.
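
To make that trade-off concrete, here is a minimal Python sketch of how precision, recall and the F-measure are computed from a sample of reviewed documents. The counts are made up for illustration, not figures from any real review, but they show how a broader review protocol raises recall at the expense of precision and vice versa.

```python
def review_metrics(true_positives, false_positives, false_negatives):
    """Precision, recall and F1 for a technology assisted review sample.

    true_positives:  responsive documents the system correctly tagged
    false_positives: non-responsive documents it tagged as responsive
    false_negatives: responsive documents it missed
    """
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# Hypothetical sample: an inclusive protocol catches more responsive
# documents (higher recall) at the cost of more false positives (lower
# precision); a narrow protocol does the opposite.
broad = review_metrics(true_positives=900, false_positives=600, false_negatives=100)
narrow = review_metrics(true_positives=700, false_positives=100, false_negatives=300)
print("broad  -> precision %.2f, recall %.2f, F1 %.2f" % broad)
print("narrow -> precision %.2f, recall %.2f, F1 %.2f" % narrow)
```

Counsel’s risk decision is essentially choosing where on that curve to sit: accepting lower precision (more documents to review) in exchange for higher recall (fewer responsive documents missed), or the reverse.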

Accurately divining the future is difficult (some might say impossible), but in the electronic discovery arena many of these predictions can happen if enough practitioners decide they want them to happen.  So, the future is fortunately within reach.

ECPA, 4th Amendment, and FOIA: A Trident of Laws Collide on the 25th Birthday of the Electronic Communications Privacy Act

Wednesday, November 2nd, 2011

Google has publicly released the number of U.S. Government requests it received for email productions in the six months preceding December 31, 2009.  It complied with 94% of these 4,601 requests.  Granted, many of these requests were search warrants or subpoenas, but many were not.  Now take 4,601 and multiply it by at least 3 to account for other social media sources such as Facebook, LinkedIn, and Twitter.  The number is big – and so is the concern over how this information is being obtained.

What has become increasingly common (and alarming at the same time) is the way this electronically stored information (ESI) is being obtained from third party service providers by the U.S. Government. Some of these requests were actually secret court orders; it is unclear how many of the matters were criminal or civil.  Many of these service providers (Sonic, Google, Microsoft, etc.) are challenging these requests and most often losing. They are losing on two fronts: 1) they are not allowed to inform the data owner about the requests or the subsequent production of the emails, and 2) they are forced to actually produce the information.  For example, the U.S. Government obtained one of these secret orders to get WikiLeaks volunteer Jacob Appelbaum’s email contact list of the people he had corresponded with over the prior two years.  Both Google and Sonic.net were ordered to turn over information; Sonic challenged the order and lost.  This has forced technology companies to band together to lobby Congress to require search warrants in digital investigations.

There are three primary laws operating at this pivotal intersection that affect the discovery of ESI residing with third party service providers, and these laws are in a car wreck with no ambulance in sight.  First, there is the antiquated federal law, the Electronic Communications Privacy Act of 1986 (ECPA), over which there is much debate at present.  To put the datedness of the ECPA in perspective, it was written before the World Wide Web existed.  This law is the basis on which the government can secretly obtain information from email and cell phones without a search warrant. Obtaining that information without a search warrant is in direct conflict with the U.S. Constitution’s 4th Amendment protection against unreasonable searches and seizures.  In the secret order scenario, the creator of the data is denied the right to know about the search and seizure (as they would be if their home were being searched, for example) as it transpires with the third party.

Where a secret order has been issued and emails have been obtained from a third party service provider, we see the courts treating email much differently than traditional mail and telephone lines.  However, the intent of the law was to give electronic communications the same protections that mail and phone calls have enjoyed for some time. Understandably, the law did not anticipate the advent of the technology we have today.  This is the first collision, and the reason the wheels have come off the car: the ECPA sets a lower bar for email than for those two older modes of communication.  The government must only show “reasonable grounds” that the records would be “relevant and material” to an investigation, criminal or civil, rather than meet the higher standard that applies to the older modes.

The third law in this collision is the Freedom of Information Act (FOIA).  While certain exceptions and allowances are made for national security and criminal investigations, these secret orders cannot be seen by the person whose information has been requested.  Additionally, the public wants to see these requests and orders, especially if there is no chance of fighting them.  What remains to be seen is what rights we have under FOIA to view these orders, whether as a party to the investigation or as an unrelated individual requesting them as a matter of public record.  U.S. Senator Patrick Leahy (D-VT), the author of the ECPA, acknowledged in no uncertain terms that the law is “significantly outdated and outpaced by rapid changes in technology.”  He has since introduced a bill with many of the changes that third party service providers have lobbied for to bring the ECPA up to date. The irony of the situation is that a law intended to provide the same protections for all modes of communication in fact makes it easier for the government to request information without the author even knowing.

This is one of the most important issues now facing individuals and the government in the discovery of ESI during investigations and litigation.  A third party provider of cloud offerings is really no different from a utility company, and the same Fourth Amendment paradigm that requires a warrant for the U.S. Postal Service and the telephone companies should apply when seeking to discover this information. The law looks to be changing to reflect this, and FOIA should allow the public to access these orders.  Amendments to the Act have been introduced by Senator Leahy, and we can look forward to the common-sense changes he proposes.  The American people don’t like secrets. Lawyers, get ready to embrace the revisions by reading up on the changes, as they will impact your practice significantly in the near future.

Addressing the Regulatory and eDiscovery Challenges of Social Media

Thursday, August 18th, 2011

Is your organization among those that have jumped with both feet into the world of social media?

Recent survey results confirm that social media use is on the rise for almost all organizations across the globe.  This is particularly the case in the financial services industry.  A recent industry survey confirms that nearly two-thirds of all asset managers are actively using social media for marketing purposes.

Despite social media’s increasing popularity and ubiquity, the securities industry is experiencing growing pains with it.  Just like other industries, financial services providers are struggling to apply notions of information governance to these non-traditional forms of communication.  Indeed, with social media becoming an increasingly important data source for both business and legal purposes, it behooves enterprises to develop an information governance strategy for this data.  The best practices being followed in this regard by financial services companies can serve as a model for organizations across the board.

Social Media Challenges for Financial Services Companies

Many financial services companies are experiencing difficulty supervising or retaining social media communications as required by FINRA Regulatory Notice 10-06.  A landmark regulation, FINRA 10-06 was promulgated last year to protect investors from false or misleading claims made on social networking sites.  To comply with this regulation, securities firms must develop protocols that enable them to supervise and retain social media content and ensure conformity by their representatives.

It is no secret that social media communications continue to bedevil securities firms.  Indeed, 63% of surveyed asset managers reported that “regulatory recordkeeping” remains their greatest challenge with respect to social media.  And as more firms move toward social media marketing, the number of financial services companies experiencing difficulty with retention is also likely to increase.

The challenges firms are experiencing with social media are not limited to retention.  They also include the need to properly supervise social media communications.  This was acknowledged by FINRA chairman and chief executive Richard Ketchum at an industry event this past June.  Among other social media issues, Ketchum explained that firms have questioned how they can most effectively supervise their employees’ use of smart phones and tablet computers that can access company sites.  In response to these matters, FINRA just issued Regulatory Notice 11-39 to help clarify several lingering questions regarding retention and supervision.

Best Practices for Addressing the Challenges of Social Media

Given the complexity of these issues, regulated enterprises need to know what best practices can be followed to ensure compliance with pertinent FINRA and SEC regulations.  While there are perhaps many steps that could be implemented, three stand out as indispensable for firms.

The first is that firms should develop a global plan for how they will engage in social media marketing.  This initial step is particularly important for groups that are just now exploring the use of social media to communicate with investors.  Having a plan in place that maps out investor contact and communication strategy, provides for required supervision of firm representatives, and accounts for compliance with regulatory requirements is essential for securities firms.  Failing to take these steps could result in fines, suspensions or worse.

The next step involves educating and training employees regarding the firm’s social media plan.  This should include instruction regarding what content may be posted to social networking sites and the internal process for doing so.  Policies that describe the consequences for deviating from the firm’s social media plan should also be clearly delineated.  Those policies should detail the legal repercussions – civil and criminal – for both the employee and the firm for social media missteps.

Third, firms can employ technology to ensure compliance with their social media plan.  Indeed, FINRA 10-06 specifically emphasizes the importance of deploying technological “systems” to facilitate conformity with the regulation’s “Recordkeeping Responsibilities” requirement.  Those “systems” include archiving software and other technology tools.  With the right tools in place, firms can perform a cost-effective supervisory review of content to help ensure compliance with corporate policy and regulatory bodies.  Moreover, an effective “system” will implement legal holds and efficiently retrieve archived social media content in response to legal and regulatory requests.  All of this enables a company to establish the reasonableness of its retention and eDiscovery processes and demonstrate compliance with relevant SEC and FINRA regulations.

By following these steps and other best practices, financial services companies can begin to reasonably address the challenges of social media.  Knowing that those challenges are being dealt with in an effective manner will enable firms to confidently engage in social media marketing – and reap the financial benefits of doing so.

Social Media: Electronic Discovery’s New New Thing?

Monday, June 1st, 2009

Lately, the electronic discovery blogosphere has been, well, a-twitter about Twitter and other social media as they relate to electronic discovery. While Twitter struggles to find a business model, enterprises and law firms are racing to understand the implications of this latest boomtown of user-generated content being built out on the frontier of the World Wide Web (or is that Wild Wild West?).

There’s talk of intellectual property being cast out, irrevocably, onto the Internet for all to see. Or slanderous things being uttered for which your company may be held liable. But, hold on a second: is there really anything new here? Anyone heard of e-mail? Web pages? Peer-to-peer? Google? Instant messaging? As Debra Logan astutely points out in her recent post on the topic, “everything that exists is discoverable (at least pretty much).” If you haven’t already, take a look at the FRCP’s definition of ESI and you’ll get her point. So, yes, it’s obviously important to have a common sense corporate policy around what’s appropriate and what’s not for the public Internet, but it shouldn’t be any different from the policy you should already have in place regarding blogs, web pages, and email.

What about the other side of the electronic discovery coin: finding information that’s responsive to a request? If anything, social media are more easily discoverable than just about any other form of user-generated content (though admittedly in some cases they can be more transient, which can pose unique challenges). And, while it’s not universally true, the argument can be made that the more easily something can be discovered, the lower the cost and risk of that content to you. Worried that someone on Twitter is stealing your new idea for a router architecture? How about the top-secret approach to making coffee you were thinking about patenting? Well, if anyone twittered about it, tracking it down is a snap. Just keep in mind that because of the public nature of social media, it’s likely that the more important the information is to your company in the context of electronic discovery, the less likely it is to live out on the public Internet. Obviously, there will be exceptions. But when there are those exceptions, tracking down the relevant information will likely be a fairly straightforward and relatively inexpensive process.

However, before we dismiss social media as nothing new and something that can largely be addressed through already-existing policies and discovery techniques, let’s consider one aspect of social media that is on the upswing, but often out of the blogging limelight: enterprise applications.

Increasingly, companies are moving to advanced enterprise social media platforms such as Jive or SocialText as a way of improving internal collaboration and making projects run more smoothly and effectively. Because such enterprise platforms are often used on a company’s most important and strategic projects, having robust e-discovery capabilities to allow internal blog, wiki, and discussion content to be captured and placed into a format that can be seamlessly searched along with other more traditional documents is becoming critical to forward-thinking enterprises.

For example, I recently came across a large financial institution that uses Jive SBS as its wiki and Clearwell as its e-discovery solution. What surprised me is that this company has created its own Jive/Clearwell “adapter” that feeds Jive discussions directly into Clearwell as a conversation thread. This is just one example, but I’m sure more will follow. Over time, it will become a requirement for e-discovery platforms to integrate with enterprise social media products. And, rest assured, as that happens, we’ll be sure to tweet about it!
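
Purely as an illustration of the general shape of such an adapter, here is a short Python sketch. The endpoint URLs, field names and functions are entirely hypothetical placeholders, not the actual Jive or Clearwell APIs: it fetches a wiki discussion, flattens the replies into an email-style conversation thread, and posts the result to a review platform’s ingestion endpoint.

```python
import json
import urllib.request

# Hypothetical endpoints; a real integration would use the vendors'
# documented APIs and authentication.
WIKI_DISCUSSION_URL = "https://wiki.example.com/api/discussions/{id}"
REVIEW_INGEST_URL = "https://ediscovery.example.com/api/ingest"

def fetch_discussion(discussion_id):
    """Retrieve a wiki discussion (root post plus replies) as JSON."""
    with urllib.request.urlopen(WIKI_DISCUSSION_URL.format(id=discussion_id)) as resp:
        return json.load(resp)

def to_conversation_thread(discussion):
    """Flatten a discussion into an email-style thread for review.

    Each message keeps its author, timestamp and body so the review
    platform can search and display it alongside traditional documents.
    """
    messages = [discussion["root"]] + sorted(
        discussion.get("replies", []), key=lambda m: m["posted_at"]
    )
    return {
        "thread_id": discussion["id"],
        "subject": discussion["title"],
        "messages": [
            {"from": m["author"], "date": m["posted_at"], "body": m["body"]}
            for m in messages
        ],
    }

def ingest(thread):
    """Post the normalized thread to the review platform's intake endpoint."""
    req = urllib.request.Request(
        REVIEW_INGEST_URL,
        data=json.dumps(thread).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    ingest(to_conversation_thread(fetch_discussion("12345")))
```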

UPDATE: Whit Andrews of Gartner was kind enough to point out his (prescient) research note on the subject of e-discovery and social networking from November 2007. He points out that there is in fact a very important “new new thing” about social networks: they may be leveraged in an e-discovery context to find out more about the people relevant to an investigation. By tapping these publicly available sources of information, investigators may be able to gain better insight into private (i.e., enterprise) information stores to guide the e-discovery process. More detail on this and other insights can be found at http://www.gartner.com/DisplayDocument?id=543110&ref=g_forward&call=email.
