
Posts Tagged ‘social media’

What Ocean’s Eleven and Judge Kozinski Can Teach Organizations About Confidentiality

Friday, April 26th, 2013

Confidentiality in the digital age is certainly an elusive concept. As more organizations turn to social networking sites, cloud computing, and bring your own device (BYOD) policies to facilitate commercial enterprise, they are finding that such innovations could provide unwanted visibility into their business operations. Indeed, technology has seemingly placed confidential corporate information at the fingertips of third parties. This phenomenon, in which some third party could be examining your trade secrets, revenue streams and attorney-client communications, brings to mind an iconic colloquy from the movie Ocean’s Eleven involving Tess (Julia Roberts) and Terry Benedict (Andy Garcia). Tess caustically reminds the guarded casino magnate: “You of all people should know, Terry, in your hotel, there’s always someone watching.”

That someone could always be “watching” proprietary company information was recently alluded to by Chief Judge Alex Kozinski of the Ninth Circuit Court of Appeals. Speaking on the related topic of data privacy at a panel sponsored by The Recorder, Judge Kozinski explained that technological advances that enable third party access to much of the data transmitted through or stored in cyberspace seemingly remove the veneer of confidentiality from that information. Excerpts from Judge Kozinski’s videotaped remarks can be viewed here.

That technological innovations could provide third parties with access to proprietary information is certainly problematic as companies incorporate social networking sites, cloud computing and BYOD into more aspects of their operations. Without appropriate safeguards, the use of such innovations could jeopardize the confidentiality of proprietary company information.

For example, content that corporate employees exchange on social networking sites could be accessed and monitored by site representatives under the governing terms of service. While those terms typically provide privacy settings that would allow corporate employees to limit the extent to which information may be disseminated, they also notify those same users that site representatives may access their communications. Though the justification for such access varies from site to site, the terms generally delineate the lack of confidentiality associated with user communications. This includes ostensibly private communications sent through the direct messaging features available on social networks like LinkedIn, Twitter, MySpace, Facebook and Reddit.

In like manner, providers of cloud computing services often have access and monitoring rights vis-à-vis a company’s cloud hosted data. Memorialized in service level agreements, those rights may allow provider representatives to access, review or even block transmissions of company data to and from the cloud. Provider access may, in turn, destroy the confidentiality required to preserve the character of company trade secrets or maintain the privileged status of communications with counsel.

BYOD also presents a difficult challenge for preserving the confidentiality of company data. This is due to the lack of corporate control that BYOD has introduced into a company’s information ecosystem. Unless appropriate safeguards are deployed, employees may unwittingly disclose proprietary information to third parties by using personal cloud services to store or transmit company data. In addition, family, friends or even strangers who have access to an employee’s device could retrieve such information.

Given the confluence of the above referenced factors, the question becomes what steps an organization can take to preserve the confidentiality of its information. On the social network front, a company could deploy an on-site social network that provides a secure environment for its employees to communicate about internal corporate matters. Conceptually similar to private clouds that house data behind the company firewall, an on-site network could be jointly developed with a third party provider to ensure specific levels of confidentiality.

An enterprise considering cloud computing for its ESI storage needs should require that the cloud service provider offer measures to preserve the confidentiality of trade secrets and privileged messages. That may include specific confidentiality terms or a separate confidentiality agreement. In addition, the provider should offer encryption functionality to better preserve confidentiality. By taking these steps, the enterprise can better satisfy itself that it has adopted appropriate measures to ensure the confidentiality of its data.
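
One concrete way to preserve that confidentiality, beyond contract terms, is to encrypt content before it ever reaches the provider, so that provider-side access rights never expose plaintext. Below is a minimal Python sketch of that idea using the open-source cryptography library; the function names and simplistic key handling are illustrative assumptions, not a recommended architecture or any particular provider’s offering.

```python
# Illustrative only: encrypt ESI client-side before handing it to a cloud
# provider, so provider-side access never exposes plaintext.
# Assumes the "cryptography" package is installed; names are hypothetical.
from cryptography.fernet import Fernet

def encrypt_for_cloud(plaintext: bytes, key: bytes) -> bytes:
    """Return ciphertext that can be stored with any cloud provider."""
    return Fernet(key).encrypt(plaintext)

def decrypt_from_cloud(ciphertext: bytes, key: bytes) -> bytes:
    """Recover the original content; the key never leaves the enterprise."""
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, kept in an internal key store
    blob = encrypt_for_cloud(b"privileged memo to outside counsel", key)
    assert decrypt_from_cloud(blob, key) == b"privileged memo to outside counsel"
```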

To address the confidentiality problems associated with BYOD, a company should prepare a cogent policy and deploy technologies that facilitate employee compliance. Such a policy would discourage workers from using personal cloud storage providers to transfer or store company data. It would also delineate the parameters of access to employee devices by the employee’s family, friends, or others. To make such a policy more effective, employers will need to develop a technological architecture that reasonably supports conformity with the policy.
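
As one illustration of the kind of enabling technology such a policy might lean on, the short Python sketch below flags transfers of company data to personal cloud storage domains. The domain lists and dispositions are hypothetical, and a real deployment would rely on mobile device management or data loss prevention tooling rather than a standalone script.

```python
# A minimal sketch of the "enabling technology" idea: flag transfers of
# company data to personal cloud storage domains that a BYOD policy
# discourages. The domain lists and outcomes are hypothetical.
from urllib.parse import urlparse

PERSONAL_STORAGE_DOMAINS = {"dropbox.com", "drive.google.com", "box.com"}
APPROVED_DESTINATIONS = {"archive.corp.example.com"}  # hypothetical archive host

def classify_transfer(destination_url: str) -> str:
    host = urlparse(destination_url).netloc.lower()
    if any(host.endswith(d) for d in APPROVED_DESTINATIONS):
        return "allowed"
    if any(host.endswith(d) for d in PERSONAL_STORAGE_DOMAINS):
        return "blocked: personal cloud storage (see BYOD policy)"
    return "review"  # unknown destination; route to IT for review

print(classify_transfer("https://drive.google.com/upload"))   # blocked
print(classify_transfer("https://archive.corp.example.com"))  # allowed
```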

By developing cogent and reasonable policies, training employees and deploying effective, enabling technologies, organizations can better prevent unauthorized disclosures of confidential information. Only by adopting such professionally recognized best practices can companies hope to shield their proprietary data from the prying eyes of third parties in the digital age.

Twitter Contempt Sanctions Increase Need for Social Media Governance Plan

Thursday, September 13th, 2012

The headline-grabbing news this week regarding Twitter facing possible contempt sanctions is an important reminder that organizations should consider developing a strategy for addressing social media governance. In criminal proceedings against protesters involved in the Occupy Wall Street movement, a New York state court several weeks ago ordered Twitter to turn over various tweets, relating to the movement’s blocking of the Brooklyn Bridge last year, that a protester had deleted from his Twitter feed. Twitter has delayed compliance with that order, which has invited the court’s wrath: “I can’t put Twitter or the little blue bird in jail, so the only way to punish is monetarily.” The court is now threatening Twitter with a monetary contempt sanction based on “the company’s earnings statements for the past two quarters.”

At first blush, the proceeding involving Twitter may not seem paradigmatic for organizations. While most organizations do not engage in civil disobedience and typically steer clear of potential criminal actions, the conduct of the protester in unilaterally deleting his tweets raises the question of whether organizations have developed an effective policy to retain and properly supervise communications made through social networking sites.

Organizations in various industry verticals need to ensure that certain messages communicated through social media sites are maintained for legal or regulatory purposes. For example, financial services companies must retain communications with investors and other records that relate to their “business as such” – including those made through social networking sites – for at least three years under Rule 17a-4(b) promulgated under the Securities Exchange Act of 1934. Though this provision is fairly straightforward, it has troubled regulated companies for years. Indeed, almost two-thirds of surveyed asset managers reported that “regulatory recordkeeping” remains their greatest challenge with respect to social media.
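
To make the recordkeeping obligation concrete, the minimal Python sketch below computes the earliest date an archived social media communication could be expired under a three-year retention requirement. The three-year figure comes from the discussion above; the date handling is a simplification, not a statement of how any regulator counts the period.

```python
# Rough sketch: earliest date a regulated firm could expire an archived
# social media record under a three-year retention requirement.
# The three-year figure is from the post; everything else is illustrative.
from datetime import date, timedelta

RETENTION_YEARS = 3

def earliest_expiry(sent_on: date) -> date:
    # Approximate "three years" as 3 * 365 days; a production system would
    # follow the regulator's exact counting rules.
    return sent_on + timedelta(days=RETENTION_YEARS * 365)

tweet_sent = date(2012, 9, 13)
print("Retain at least until:", earliest_expiry(tweet_sent))  # 2015-09-13
```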

Supervision is another troubling issue. With the proliferation of smartphones, burgeoning “bring your own device” (BYOD) policies and the demands of a 24-hour workday, supervision cannot be boiled down to a simple protocol of “I’ll review your messages before you hit send.” Yet supervision is necessary, particularly given the consequences for rogue communications including litigation costs, lost revenues, reduced stock price and damage to the company brand.

Though there are no silver bullets to ensure perfection regarding these governance challenges, organizations can follow some best practices to develop an effective social media governance policy. The first is that companies should prepare a global plan for how they will engage in social media marketing. This initial step is particularly important for groups that are just now exploring the use of social media to communicate with third parties. Having a plan in place that maps out a contact and communication strategy, provides for supervision of company representatives and accounts for compliance with regulatory requirements is essential.

The next step involves educating and training employees regarding the company’s social media policy. This should include instructions regarding what content may be posted to social networking sites and the internal process for doing so. Policies that describe the consequences for deviating from the social media plan should also be clearly delineated. Those policies should detail the legal repercussions – civil and criminal – for both the employee and the organization for social media missteps.

Third, organizations can employ technology to ensure compliance with their social media plan. This may include archiving software and other technology that both retains and enables a cost-effective supervisory review of content. Electronic discovery tools that enable legal holds and efficiently retrieve archived social media content are also useful in developing an efficient and cost-effective response to legal and regulatory requests.
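
The interplay between retention and legal holds mentioned above can be illustrated with a short sketch. The Python below is a hypothetical model, not any archiving product’s API: an item may be purged only when its retention period has lapsed and no hold references it.

```python
# Minimal sketch of the interplay between retention expiry and legal holds:
# expired content may be purged only if no hold applies. All names are
# illustrative, not a particular archiving product's API.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ArchivedItem:
    item_id: str
    expires_on: date
    holds: set = field(default_factory=set)  # matter IDs with active holds

def can_purge(item: ArchivedItem, today: date) -> bool:
    """Expiry alone is not enough; any active legal hold blocks deletion."""
    return today >= item.expires_on and not item.holds

post = ArchivedItem("tweet-42", expires_on=date(2015, 9, 13))
post.holds.add("occupy-subpoena-2012")
print(can_purge(post, date(2016, 1, 1)))   # False: hold still in place
post.holds.clear()
print(can_purge(post, date(2016, 1, 1)))   # True: expired and unheld
```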

By following these steps and other best practices, organizations will likely be on the way to establishing the foundation of an effective social media governance plan.

#InfoGov Twitter Chat Homes in on Starting Places and Best Practices

Tuesday, July 3rd, 2012

Unless you’re an octogenarian living in rural Uzbekistan,[i] you’ve likely seen the meteoric rise of social media over the last decade. Even beyond hyper-texting teens, businesses too are taking advantage of this relatively new medium to engage with their more technically savvy customers. Recently, Symantec held its first “Twitter Chat” on the topic of information governance (fondly referred to on Twitter as #InfoGov). For those not familiar with the concept, a Twitter Chat is a virtual discussion held on Twitter using a specific hashtag – in this case #IGChat. At a set date and time, parties interested in the topic log into Twitter and start participating in the fireworks on the designated hashtag.

“Fireworks” may be a bit overstated, but given that the moderators (eDiscovery Counsel at Symantec) and participants were limited to 140 characters, the “conversation” was certainly frenetic. Despite the fast pace, one benefit of a Twitter Chat is that you can communicate with shortened web links as a way to share and discuss content beyond the severely limited character count. During this somewhat staccato discussion, the conversation took some interesting twists and turns, which I thought I’d excerpt (and expound upon[ii]) in this blog.

Whether in a Twitter Chat or otherwise, once the discussion of information governance begins everyone wants to know where to start. The #IGChat was no different.

  • Where to begin?  While there wasn’t consensus per se on a good starting place, one cogent remark out of the blocks was: “The best way to start is to come up with an agreed upon definition — Gartner’s is here t.co/HtGTWN2g.” While the Gartner definition is a good starting place, there are others out there that are more concise. The eDiscovery Journal Group has a good one as well:  “Information Governance is a comprehensive program of controls, processes, and technologies designed to help organizations maximize the value of information assets while minimizing associated risks and costs.”  Regardless of the precise definition, it’s definitely worth the cycles to rally around a set construct that works for your organization.
  • Who’s on board?  The next topic centered around trying to find the right folks organizationally to participate in the information governance initiative. InfoGovlawyer chimed in: “Seems to me like key #infogov players should include IT, Compliance, Legal, Security reps.” Then, PhilipFavro suggested that the “[r]ight team would likely include IT, legal, records managers, pertinent business units and compliance.” Similar to the previous question, at this stage in the information governance maturation process, there isn’t a single, right answer. More importantly, the team needs to have stakeholders from at least Legal and IT, while bringing in participants from other affected constituencies (Infosec, Records, Risk, Compliance, etc.) – basically, anyone interested in maximizing the value of information while reducing the associated risks.
  • Where’s the ROI?  McManusNYLJ queried: “Do you think #eDiscovery, #archiving and compliance-related technology provide ample ROI? Why or why not?”  Here, the comments came in fast and furious. One participant pointed out that case law can be helpful in showing the risk reduction:  “Great case showing the value of an upstream archive – Danny Lynn t.co/dcReu4Qg.” AlliWalt chimed in: “Yes, one event can set your company back millions…just look at the Dupont v. Kolon case… ROI is very real.” Another noted that “Orgs that take a proactive approach to #eDiscovery requests report a 64% faster response time, 2.3x higher success rate.” And, “these same orgs were 78% less likely to be sanctioned and 47% less likely to be legally compromised t.co/5dLRUyq6.” ROI for information governance seemed to be a nut that can be cracked any number of ways, ranging from risk reduction (via sanctions and adverse legal decisions) to better preparation. Here too, an organization’s particular sensitivities should come into play since all entities won’t have the same concerns about risk reduction, for example.
  • Getting Granular. Pegduncan, an active subject matter expert on the topic, noted that showing ROI was the right idea, but not always easy to demonstrate: “But you have to get their attention. Hard to do when IT is facing funding challenges.” This is when granular eDiscovery costs were mentioned: “EDD costs $3-18k per gig (Rand survey) and should wake up most – adds up w/ large orgs having 147 matters at once.” Peg wasn’t that easily convinced: “Agreed that EDD costs are part of biz case, but .. it’s the problem of discretionary vs non-discretionary spending.” (A rough back-of-the-envelope version of this per-gig math appears just after this list.)
  • Tools Play a Role. One participant asked: “what about tools for e-mail thread analysis, de-duplication, near de-duplication – are these applicable to #infogov?” A participant noted that “in the future we will see tools like #DLP and #predictivecoding used for #infogov auto-classification – more on DLP here: t.co/ktDl5ULe.” Pegduncan chimed in that “DLP=Data Loss Prevention. Link to Clearwell’s post on Auto-Classification & DLP t.co/ITMByhbj.”
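
Since the per-gigabyte cost figures quoted in the chat invite some quick math, here is the back-of-the-envelope version referenced in the cost bullet above. Only the $3k-$18k range and the 147-matter statistic come from the chat; the gigabytes-per-matter figure is an invented assumption.

```python
# Back-of-the-envelope version of the cost figures quoted in the chat:
# roughly $3k-$18k of processing/review cost per gigabyte (the Rand range
# cited above) across ~147 concurrent matters. Gigabytes per matter is a
# made-up illustration, not a survey figure.
COST_PER_GB_LOW, COST_PER_GB_HIGH = 3_000, 18_000
CONCURRENT_MATTERS = 147
GB_PER_MATTER = 10            # hypothetical average collection size

low = COST_PER_GB_LOW * GB_PER_MATTER * CONCURRENT_MATTERS
high = COST_PER_GB_HIGH * GB_PER_MATTER * CONCURRENT_MATTERS
print(f"Exposure: ${low:,} to ${high:,}")   # $4,410,000 to $26,460,000
```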

With a concept as broad and complex as information governance, it’s truly amazing that a cogent “conversation” can take place in a series of 140 character tweets. As the Twitter Chat demonstrates, the information governance concept continues to evolve and is doing so through discussions like this one via a social media platform. As with many of the key information governance themes (Ownership, ROI, Definition, etc.) there isn’t a right answer at this stage, but that isn’t an excuse for not asking the critical questions. “Sooner started, sooner finished” is a motto that will serve many organizations well in these exciting times. And, for folks who say they can’t spare the time, they’d be amazed what they can learn in 140 characters.

Mark your calendars and track your Twitter hashtags now: The next #IGChat will be held on July 26 @ 10am PT.



[i] I’ve never been to rural Uzbekistan, but it just sounded remote.  So, my apologies if there’s a world-class internet infrastructure there where the denizens tweet prolifically. Given that it’s one of only two double-landlocked countries in the world, it seemed like an easy target. Uzbeks, please feel free to use the comment field and set me straight.

[ii] Minor edits were made to select tweets, but generally the shortened Twitter grammar wasn’t changed.

Email Archive Saves the Day, Prevents eDiscovery Sanctions

Thursday, April 5th, 2012

The recent case of Danny Lynn Electrical v. Veolia Es Solid Waste (2012 WL 786843, March 9, 2012) showcases the value of an information archive from a compliance and eDiscovery perspective. In Danny Lynn Electrical the plaintiff sought sanctions against the defendant for the spoliation of electronic evidence, including the usual blend of monetary sanctions, adverse evidentiary inferences and the striking of affirmative defenses. Plaintiff argued that the defendant “blatantly disregarded their duty to preserve electronic information” by failing to implement an effective legal hold policy and deleting email after litigation began. In rejecting plaintiff’s claims, the court concluded that sanctions on the basis of spoliation of evidence were not warranted.

The court, in a harbinger of good things to come for the defendant, questioned “whether any spoliation of electronic evidence has actually occurred.” In finding that there wasn’t any spoliation, the court relied heavily on the fact that the defendant had recently deployed an email archive:

“[T]here is no evidence that any of the alleged emails, with the exception of the few that were accidentally deleted due to a computer virus or other unforseen [sic] circumstance, were permanently deleted from the defendants’ computer system. … VESNA began using a new software system which archives all emails on the VESNA network. Therefore, it is clear to the court that the defendant preserved email from its custodians in a backup or archive system.”

In combination with the deployed archive, the court also noted that plaintiff’s arguments were devoid of substantive evidence to support their spoliation claims:

“In order to impose sanctions against the defendants, this court ‘would have to substitute Plaintiffs’ speculation for actual proof that critical evidence was in fact lost or destroyed.”

The rejection of plaintiff’s spoliation claims in Danny Lynn Electrical reinforces the long held notion that information archives[i] have tremendous utility beyond the data management/minimization benefits that were the early drivers of archive adoption. This prophylactic, information governance benefit is particularly useful when the archive goes beyond email to additionally capture loose files, social media and other unstructured content.

As we said in 2011, organizations are already finding that other sources of electronically stored information (ESI) like documents/files and unstructured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This increasingly heterogeneous mix of ESI certainly results in challenges for many organizations, with some unlucky ones getting sanctioned (unlike the defendant in Danny Lynn Electrical) because they ignored these emerging data types.

The good news is that modern-day archives have the ability to manage (preserve, categorize, defensibly delete, etc.) ESI from a wide range of sources beyond just email. Given cases like Danny Lynn Electrical, it’s increasingly a layup to build the business case for an archive project (assuming your organization doesn’t have one deployed already). Further pushing the archiving play to the top of the stack is the ability to deploy in the cloud, in addition to traditional on-premises deployments.

The Danny Lynn Electrical case also shows how an upstream, proactive information governance program can have an impact in the downstream, reactive eDiscovery context. It is the linking of the yin and yang of the proactive and reactive concepts where an end to end paradigm starts to fulfill the long anticipated destiny of true information governance. As the explosion of data continues to mushroom unabated, it’s only this type of holistic information management regime that will keep eDiscovery chaos at bay.



[i] In the interests of full disclosure, Symantec offers both on-premise archiving and cloud archiving solutions. They are not the solutions referenced in the Danny Lynn Electrical case.

Policy vs. Privacy: Striking the Right Balance Between Organization Interests and Employee Privacy

Friday, March 9th, 2012

The lines between professional and personal lives are being further blurred every day. With the proliferation of smart phones, the growth of the virtual workplace and the demands of business extending into all hours of the day, employees now routinely mix business with pleasure by commingling such matters on their work and personal devices. This trend is sure to increase, particularly with “bring your own device” policies now finding their way into companies.

This sometimes awkward marriage of personal and professional issues raises the critical question of how organizations can respect the privacy rights of their employees while also protecting their trade secrets and other confidential/proprietary information. The ability to properly navigate these murky waters under the broader umbrella of information governance may be the difference between a successful business and a litigation-riddled enterprise.

Take, for instance, a recent lawsuit that claimed the Food and Drug Administration (FDA) unlawfully spied on the personal email accounts of nine of its employee scientists and doctors. In that litigation, the FDA is alleged to have monitored email messages those employees sent to Congress and the Office of Inspector General for the Department of Health & Human Services. In the emails at issue, the scientists and doctors scrutinized the effectiveness of certain medical devices the FDA was about to approve for use on patients.

While the FDA’s email policy clearly delineates that employee communications made from government devices may be monitored or recorded, the FDA may have intercepted employees’ user IDs and passwords and accessed messages they sent from their home computers and personal smart phones. Not only would such conduct potentially violate the Electronic Communications Privacy Act (ECPA), it might also conceivably run afoul of the Whistleblower Protection Act.

The FDA spying allegations have also resulted in a congressional inquiry into the email monitoring policies of all federal agencies throughout the executive branch. Congress is now requesting that the Office of Management and Budget (OMB) produce the following information about agency email monitoring policies:

  • Whether a policy distinguishes between work and personal email
  • Whether user IDs and passwords can be obtained for personal email accounts and, if so, whether safeguards are deployed to prevent misappropriation
  • Whether a policy defines what constitutes protected whistleblower communications

The congressional inquiry surrounding agency email practices provides a valuable measuring stick for how private sector organizations are addressing related issues. For example, does an organization have an acceptable use policy that addresses employee privacy rights? Having such a policy in place is particularly critical given that employees use company-issued smart phones to send work emails, take photographs and post content to personal social networking pages. If such a policy exists, query whether it is enforced, what mechanisms exist for doing so and whether such enforcement is transparent to employees.  Compliance is just as important as issuing the policy in the first place.

Another critical inquiry is whether an organization has an audit/oversight process to prevent the type of abuses that allegedly occurred at the FDA. Such a process is essential for organizations on multiple levels. First, as Congress made clear in its letter to the OMB, monitoring communications that employees make from their personal devices may violate the ECPA. It could also interfere with internal company whistleblower processes. And to the extent adverse employment action is taken against an employee-turned-whistleblower, the organization could be liable for violations of the False Claims Act or the Dodd-Frank Wall Street Reform and Consumer Protection Act.

A related aspect to these issues concerns whether an organization can obtain work communications sent from employee personal devices. For example, financial services companies must typically retain communications with investors for at least three years. Has the organization addressed this document retention issue while respecting employee privacy rights in their own smart phones and tablet computers?

If an organization does not have such policies or protections in place, it should not panic and rush off to get policies drafted without thinking ahead. Instead, it should address these issues through an intelligent information governance plan. Such a plan will typically address issues surrounding information security, employee privacy, data retention and eDiscovery within the larger context of industry regulations, business demands and employee productivity. That plan will also include budget allocations to support the acquisition and deployment of technology tools to support written policies on these and other issues.  Addressed in this context, organizations will more likely strike the right balance between their interests and their employees’ privacy and thereby avoid a host of unpleasant outcomes.

LTNY Wrap-Up – What Did We Learn About eDiscovery?

Friday, February 10th, 2012

Now that the dust has settled, the folks who attended LegalTech New York 2012 can try to get to the mountain of emails that accumulated during the event. Fortunately, there was no ice storm this year, and for the most part, people seemed to heed my “what not to do at LTNY” list. I even found the Starbucks across the street more crowded than the one in the hotel. There was some alcohol-induced hooliganism at a vendor’s party, but most of the other social mixers seemed uniformly tame.

Part of Dan Patrick’s syndicated radio show features a “What Did We Learn Today?” segment, and that inquiry seems fitting for this year’s LegalTech.

  • First of all, the prognostications about buzzwords were spot on, with no shortage of cycles spent on predictive coding (aka Technology Assisted Review). The general session on Monday, hosted by Symantec, had close to a thousand attendees on the edge of their seats to hear Judge Peck, Maura Grossman and Ralph Losey wax eloquent about the ongoing man versus machine debate. Judge Peck uttered a number of quotable sound bites, including the quote of the day: “Keyword searching is absolutely terrible, in terms of statistical responsiveness.” Stay tuned for a longer post with more comments from the general session.
  • Ralph Losey went one step further when commenting on keyword search, stating: “It doesn’t work,… I hope it’s been discredited.” A few have commented that this lambasting may have gone too far, and I’d tend to agree.  It’s not that keyword search is horrific per se. It’s just that its efficacy is limited and the hubris of the average user, who thinks eDiscovery search is like Google search, is where the real trouble lies. It’s important to keep in mind that all these eDiscovery applications are just tools in the practitioner’s toolbox, and they need to be deployed for the right task. Otherwise, the old saw (pun intended) that “when all you have is a hammer, everything looks like a nail” will inevitably come true.
  • This year’s show also finally put a nail in the coffin of the human review process as the eDiscovery gold standard. That doesn’t mean that attorneys everywhere will abandon the linear review process any time soon, but hopefully it’s becoming increasingly clear that the “evil we know” isn’t very accurate (on top of being very expensive). If that deadly combination doesn’t get folks experimenting with technology assisted review, I don’t know what will.
  • Information governance was also a hot topic, only paling in comparison to Predictive Coding. A survey Symantec conducted at the show indicated that this topic is gaining momentum, but still has a ways to go in terms of action. While 73% of respondents believe an integrated information governance strategy is critical to reducing information risk, only 19% have implemented a system to help them with the problem. This gap presumably indicates a ton of upside for vendors who have a good, attainable information governance solution set.
  • The Hilton still leaves much to be desired as a host location. As they say, familiarity breeds contempt, and for those who’ve notched more than a handful of LegalTech shows, the venue can feel a bit like the movie Groundhog Day, but without Bill Murray. Speculation continues to run rampant about a move to the Javits Center, but the show would likely need to expand pretty significantly before ALM would make the move. And, if there ever was a change, people would assuredly think back with nostalgia on the good old days at the Hilton.
  • Despite the bright lights and elevator advertisement trauma, the mood seemed pretty ebullient, with tons of partnerships, product announcements and consolidation. This positive vibe was a nice change after the last two years when there was still a dark cloud looming over the industry and economy in general.
  • Finally, this year’s show also seemed to embrace social media in a way that it hadn’t in years past. Yes, all the social media vehicles were around before, but this year many of the vendors’ campaigns seemed much more integrated. It was funny to see even the most technically resistant lawyers log in to Twitter (for the first time) to post comments about the show as a way to win premium vendor swag. Next year, I’m sure we’ll see an even more pervasive social media influence, which is a bit ironic given the eDiscovery challenges associated with collecting and reviewing social media content.

The Social Media Rubik’s Cube: FINRA Solved it First, Are Non-Regulated Industries Next?

Wednesday, January 25th, 2012

It’s no surprise that the first industry to be heavily regulated regarding social media use was the financial services industry. The predominant factor that drove regulators to address the viral qualities of social media was the fiduciary nature of investing that accompanies securities, coupled with the potential detrimental financial impact these offerings could have on investors.

Although there is no explicit language in FINRA’s Regulatory Notices 10-06 (January 2010) or 11-30 (August 2011) requiring archival, the recordkeeping component of the notices necessitates social media archiving in most cases due to the sheer volume of data produced on social media sites. Melanie Kalemba, Vice President of Business Development at SocialWare in Austin, Texas, states:

“Our clients in the financial industry have led the way, they have paved the road for other industries, making social media usage less daunting. Best practices for monitoring third-party content, record keeping responsibilities, and compliance programs are available and developed for other industries to learn from. The template is made.”

eDiscovery and Privacy Implications. Privacy laws are an important aspect of social media use that impact discoverability. Discovery and privacy represent layers of the Rubik’s cube in the ever-changing and complex social media environment. No longer are social media cases only personal injury suits or HR incidents, although those are plentiful. For example, in Largent v. Reed the court ruled that information posted by a party on their personal Facebook page was discoverable and ordered the plaintiff to provide a user name and password to enable the production of the information. In granting the defendant’s motion to compel the login credentials, Judge Walsh acknowledged that Facebook has privacy settings, and that users must take “affirmative steps” to keep their information private. However, his ruling determined that no social media privacy privilege exists: “No court has recognized such a privilege, and neither will we.” He further reiterated his ruling by adding, “[o]nly the uninitiated or foolish could believe that Facebook is an online lockbox of secrets.”

Then there are the new cases emerging over social media account ownership which affect privacy and discoverability. In the recently filed Phonedog v. Kravitz, 11-03474 (N.D. Cal.; Nov. 8, 2011), the lines between the “professional” versus the “private” user are becoming increasingly blurred. This case also raises questions about proprietary client lists, valuations on followers, and trade secrets  – all of which are further complicated when there is no social media policy in place. The financial services industry has been successful in implementing effective social media policies along with technology to comply with agency mandates – not only because they were forced to by regulation, but because they have developed best practices that essentially incorporate social media into their document retention policies and information governance infrastructures.

Regulatory Framework. Adding another Rubik’s layer are the multitude of regulatory and compliance issues that many industries face. The most active and vocal regulators for guidance in the US on social media have been FINRA, the SEC and the FTC. FINRA initiated guidance to the financial services industry, and earlier this month the SEC issued their alert. The SEC’s exam alert to registered investment advisers issued on January 4, 2012 was not meant to be a comprehensive summary for compliance related to the use of social media. Instead, it lays out staff observations of three major categories: third party content, record keeping and compliance – expounding on FINRA’s notice.

Last year the FTC issued an extremely well done Preliminary FTC Staff Report on Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.  Three main components are central to the report. The first is a call for all companies to build privacy and security mechanisms into new products – considering the possible negative ramifications at the outset, avoiding social media and privacy issues as an afterthought. The FTC has cleverly coined the notion, “Privacy by Design.” Second, “Just-In-Time” is a concept about notice and encourages companies to communicate with the public in a simple way that prompts them to make informed decisions about their data in terms that are clear and that require an affirmative action (i.e., checking a box). Finally, the FTC calls for greater transparency around data collection, use and retention. The FTC asserts that consumers have a right to know what kind of data companies collect, and should have access to the sensitivity and intended use of that data. The FTC’s report is intended to inform policymakers, including Congress, as they legislate on privacy – and to motivate companies to self-regulate and develop best practices. 

David Shonka, Principal Deputy General Counsel at the FTC in Washington, D.C., warns, “There is a real tension between the situations where a company needs to collect data about a transaction versus the liabilities associated with keeping unneeded data due to privacy concerns. Generally, archiving everything is a mistake.” Shonka arguably reinforces the case for instituting an intelligent archive, whether a company is regulated or not: an archive that is selective about what it ingests based on content, and that applies an appropriate deletion cycle to defined data types according to policy. This ensures the timely expiry of private consumer information while retaining the benefits of retrieval for a defined period if necessary.
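
A sketch of what such an intelligent archive might look like in miniature appears below: content is classified on ingestion and each category carries its own deletion cycle, rather than everything being archived forever. The categories and retention periods are invented purely for illustration.

```python
# A sketch of the "intelligent archive" idea: classify content on ingestion
# and apply a deletion cycle per content type, rather than archiving
# everything forever. Categories and periods are invented for illustration.
from datetime import date, timedelta

RETENTION_DAYS = {
    "regulated_communication": 3 * 365,   # e.g., recordkeeping-driven content
    "marketing_post": 2 * 365,
    "consumer_pii": 90,                   # expire private data promptly
}

def ingest(record_type: str, created: date):
    days = RETENTION_DAYS.get(record_type)
    if days is None:
        return None                        # not archived at all
    return created + timedelta(days=days)  # scheduled expiry date

print(ingest("consumer_pii", date(2012, 1, 25)))     # 2012-04-24
print(ingest("casual_chatter", date(2012, 1, 25)))   # None
```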

The Non-Regulated Use Case. When will comprehensive social media policies, retention and monitoring become more prevalent in the non-regulated sectors? In the case of FINRA and the SEC, regulations were issued to the financial industry. In the case of the FTC, guidance has been given to companies regarding how to avoid false advertising and protect consumer privacy. The two are not dissimilar in effect. Both require a social media policy, monitoring, auditing, technology, and training. While there is no clear mandate to archive social media if you are in a non-regulated industry, this can’t be too far away. This is evidenced by companies that have already implemented social media monitoring systems for reasons like brand promotion/protection, or healthcare companies that deal with highly sensitive information. If social media is replacing email, and social media is essentially another form of electronic evidence, why would social media not be part of the integral document retention/expiry procedures within an organization?

Content-based monitoring and archiving is possible with technology available today, as the financial sector has demonstrated. Debbi Corej, who is a compliance expert for the financial sector and has successfully implemented an intensive social media program, says it perfectly: “How do you get to yes? Yes you can use social media, but in a compliant way.” The answer can be found at LegalTech New York, January 30 @ 2:00pm.

Losing Weight, Developing an Information Governance Plan, and Other New Year’s Resolutions

Tuesday, January 17th, 2012

It’s already a few weeks into the new year and it’s easy to spot the big lines at the gym, folks working on fad diets and many swearing off any number of vices.  Sadly perhaps, most popular resolutions don’t even really change year after year.  In the corporate world, though, it’s not good enough to simply recycle resolutions every year since there’s a lot more at stake, often with employees’ bonuses and jobs hanging in the balance.

It’s not too late to make information governance part of the corporate 2012 resolution list.  The reason is pretty simple – most companies need to get out of the reactive firefighting of eDiscovery given the risks of sloppy work, inadvertent productions and looming sanctions.  Yet, so many are caught up in the fog of eDiscovery war that they’ve failed to see the nexus between the upstream, proactive good data management hygiene and the downstream eDiscovery chaos.

In many cases the root cause is the disconnect between differing functional groups (Legal, IT, Information Security, Records Management, etc.).  This is where the emerging umbrella concept of Information Governance comes into play, serving as a way to tackle these information risks along a unified front. Gartner defines information governance as the:

“specification of decision rights, and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archiving and deletion of information, … [including] the processes, roles, standards, and metrics that ensure the effective and efficient use of information to enable an organization to achieve its goals.”

Perhaps more simply put, what were once a number of distinct disciplines—records management, data privacy, information security and eDiscovery—are rapidly coming together in ways that are important to those concerned with mitigating and managing information risk. This new information governance landscape is comprised of a number of formerly discrete categories:

  • Regulatory Risks – Whether an organization is in a heavily regulated vertical or not, there are a host of regulations that an organization must navigate to successfully stay in compliance.  In the United States these include a range of disparate regimes, including the Sarbanes-Oxley Act, HIPAA, the Securities Exchange Act, the Foreign Corrupt Practices Act (FCPA) and other specialized regulations – any number of which require information to be kept in a prescribed fashion, for specified periods of time.  Failure to turn over information when requested by regulators can have dramatic financial consequences, as well as negative impacts to an organization’s reputation.
  • Discovery Risks – Under the discovery realm there are any number of potential risks as a company moves along the EDRM spectrum (i.e., Identification, Preservation, Collection, Processing, Analysis, Review and Production), but the most lethal risk is typically associated with spoliation sanctions that arise from the failure to adequately preserve electronically stored information (ESI).  There have been literally hundreds of cases where both plaintiffs and defendants have been caught in the judicial crosshairs, resulting in penalties ranging from outright case dismissal to monetary sanctions in the millions of dollars, simply for failing to preserve data properly.  It is in this discovery arena that the failure to dispose of corporate information, where possible, rears its ugly head since the eDiscovery burden is commensurate with the amount of data that needs to be preserved, processed and reviewed.  Some statistics show that it can cost as much as $5 per document just to have an attorney privilege review performed.  And, with every gigabyte containing upwards of 75,000 pages, it is easy to see massive discovery liability when an organization has terabytes and even petabytes of extraneous data lying around.  (A worked version of this per-gigabyte math appears just after this list.)
  • Privacy Risks – Even though the US has a relatively lax information privacy climate, there are any number of laws that require companies to notify customers if their personally identifiable information (PII), such as credit card or Social Security numbers, has been compromised.  For example, California’s data breach notification law (SB 1386) mandates that all subject companies provide notification if there is a security breach of an electronic database containing the PII of any California resident.  It is easy to see how unmanaged PII can increase corporate risk, especially as data moves beyond US borders to the international stage, where privacy regimes are far more stringent.
  • Information Security Risks – Data breaches have become so commonplace that the loss/theft of intellectual property has become an issue for every company, small and large, both domestically and internationally.  The cost to businesses of unintentionally exposing corporate information climbed 7 percent last year to over $7 million per incident.  Recently senators asked the SEC to “issue guidance regarding disclosure of information security risk, including material network breaches” since “securities law obligates the disclosure of any material network breach, including breaches involving sensitive corporate information that could be used by an adversary to gain competitive advantage in the marketplace, affect corporate earnings, and potentially reduce market share.”  The senators cited a 2009 survey that concluded that 38% of Fortune 500 companies made a “significant oversight” by not mentioning data security exposures in their public filings.
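
As flagged in the Discovery Risks bullet above, the review-cost arithmetic is worth spelling out. The Python sketch below uses the figures of roughly 75,000 pages per gigabyte and $5 per document for privilege review cited in that bullet; the pages-per-document ratio is an added assumption needed to bridge the two.

```python
# Worked version of the review-cost math in the Discovery Risks bullet:
# ~75,000 pages per gigabyte and roughly $5 per document for privilege
# review (both figures from the post). Pages per document is an assumption
# added here to bridge the two figures.
PAGES_PER_GB = 75_000
COST_PER_DOC = 5
PAGES_PER_DOC = 4               # assumption, not from the post

docs_per_gb = PAGES_PER_GB // PAGES_PER_DOC          # 18,750 documents
review_cost_per_gb = docs_per_gb * COST_PER_DOC      # $93,750
print(f"~${review_cost_per_gb:,} per GB; ~${review_cost_per_gb * 1_000:,} per TB")
```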

Information governance as an umbrella concept helps organizations to create better alignment between functional groups as they attempt to solve these complex and interrelated data risk challenges.  This coordination is even more critical given the way that corporate data is proliferating and migrating beyond the firewall.  With even more data located in the cloud and on mobile devices a key mandate is managing data in all types of form factors. A great first step is to determine ownership of a consolidated information governance approach where the owner can:

  • Get C-Level buy-in
  • Have the organizational savvy to obtain budget
  • Be able to define “reasonable” information governance efforts, which requires both legal and IT input
  • Have strong leadership and consensus building skills, because all stakeholders need to be on the same page
  • Understand the nuances of their business, since an overly rigid process will cause employees to work around the policies and procedures

Next, tap into and then leverage IT or information security budgets for archiving, compliance and storage.  In most progressive organizations there are likely ongoing projects that can be successfully massaged into a larger information governance play.  A great place to focus on initially is information archiving, since this is one of the simplest steps an organization can take to improve its information governance hygiene.  With an archive, organizations can systematically index, classify and retain information and thus establish a proactive approach to data management.  It’s this ability to apply retention and (most importantly) expiration policies that allows organizations to start reducing the upstream data deluge that will inevitably impact downstream eDiscovery processes.

Once an archive is in place, the next logical step is to couple a scalable, reactive eDiscovery process with the upstream data sources, which will axiomatically include email, but increasingly should encompass cloud content, social media, unstructured data, etc.  It is important to make sure  that a given  archive has been tested to ensure compatibility with the chosen eDiscovery application to guarantee that it can collect content at scale in the same manner used to collect from other data sources.  Overlaying both of these foundational pieces should be the ability to place content on legal hold, whether that content exists in the archive or not.

As we enter 2012, there is no doubt that information governance should be an element in building an enterprise’s information architecture.  And, different from fleeting weight loss resolutions, savvy organizations should vow to get ahead of the burgeoning categories of information risk by fully embracing their commitment to integrated information governance.  And yet, this resolution doesn’t need to encompass every possible element of information governance.  Instead, it’s best to put foundational pieces into place and then build the rest of the infrastructure in methodical and modular fashion.

Information Governance Gets Presidential Attention: Banking Bailout Cost $4.76 Trillion, Technology Revamp Approaches $240 Billion

Tuesday, January 10th, 2012

On November 28, 2011, the White House issued a Presidential Memorandum that outlines what is expected of some 480 federal agencies in the next 240 days.  Until now, Washington, D.C. has been the Wild West with regard to information governance, as each agency has often unilaterally adopted its own arbitrary policies and systems.  Moreover, some agencies have recently purchased differing technologies.  Unfortunately, given the President’s ultimate goal of uniformity, this centralization will be difficult to accomplish with a range of disparate technological approaches.

Particular pain points for the government traditionally include retention, search, collection, review and production of vast amounts of data and records.  Specific examples include FOIA requests gone awry, legal holds issued across different agencies leading to spoliation, and the ever-present problem of decentralization.

Why is the government different?

Old Practices. First, in some instances the government is technologically behind its corporate counterparts and is failing to meet the judiciary’s expectation that organizations effectively store, manage and discover their information.  This failing is self-evident from the President’s directive mandating that these agencies develop a plan to attack the problem.  Though different from corporate entities, the government is nevertheless held to the same eDiscovery standards under the Federal Rules of Civil Procedure (FRCP).  In practice, the government has been given more leniency until recently, and while expectations have not always been equal, the gap between the private and public sectors is no longer possible to ignore.

FOIA.  The government’s arduous obligation to produce information under the Freedom of Information Act (FOIA) has no corresponding analog for private organizations, which respond to more traditional civil discovery requests.  Because the government is so large, with many disparate IT systems, it is cumbersome to work efficiently through the information governance process across agencies and many times still difficult inside one individual agency with multiple divisions.  Executing this production process is even more difficult, if not impossible, to do manually without properly deployed technology.  Additionally, many of the investigatory agencies that issue requests to the private sector need more efficient ways to manage and review the data they are requesting.  To compound problems, within the US government two opposing interests are at play, both screaming for a resolution, and that resolution needs to be centralized.  On the one hand, the government needs to retain more than a corporation may need to in order to satisfy a FOIA request.

Titan Pulled at Both Ends. On the other hand, without classification of the records that are to be kept, technology to organize this vast amount of data and some amount of expiry, every agency will essentially become its own massive repository.  The “retain everything” mentality coupled with the inefficient search and retrieval of data and records is where agencies stand today.  Corporations are experiencing this on a smaller scale, and many are collectively further along than the government in this process, without the FOIA complications.

What are agencies doing to address these mandates?

In their plans, agencies must describe how they will improve or maintain their records management programs, particularly with regard to email, social media and other electronic communications.  They must also move away from such a paper-centric existence.  eDiscovery consultants and software companies are helping agencies through this process, essentially writing their plans to match the President’s directive.  The cloud conversation has been revisited, and agencies also have to explain how they will use cloud-based services and storage solutions, as well as identify gaps in existing laws or regulations that presently prevent improved management.  Small innovations are taking place.  In fact, just recently the DOJ added a new search feature on their website to make it easier for the public to find documents that have been posted by agencies on their websites.

The Office of Management and Budget (OMB), National Archives and Records Administration (NARA), and Justice Department will use those reports to come up with a government-wide records management framework that is more efficient, maintains accountability by documenting agency actions and promotes “appropriate” public access to records.  Hopefully, the framework they come up with will be centralized and workable on a realistic timeframe with resources sufficiently allocated to the initiative.

How much will this cost?

The President’s mandate is a great initiative and very necessary, but one cannot help but think about the costs in terms of money, time and resources when considering these crucial changes.  The most recent version of a financial services and general government appropriations bill in the Senate extends $378.8 million to NARA for this initiative.  President Obama appointed Steven VanRoekel as the United States CIO in August 2011 to succeed Vivek Kundra.  After VanRoekel’s speech at the Churchill Club in October of 2011, an audience member asked him what the most surprising aspect of his new job was.  VanRoekel said that it was managing the huge and sometimes unwieldy resources of his $80 billion budget.  It is going to take even more than this to do the job right, however.

Using rough estimates, assume that implementing archiving and eDiscovery capabilities would require an initial investment on the order of $1 billion per agency.  That approximates $480 billion for all 480 agencies.  Assume a uniform information governance platform gets adopted by all agencies at a 50% discount, reflecting the size of the contracts and the smaller sums needed by agencies with lesser requirements.  The total now comes to $240 billion.  For context, that figure is 5% of what the Federal Government spent ($4.76 trillion) on the biggest bailout in history in 2008.  Measured against the $80 billion federal IT budget, that leaves a need for $160 billion more to get the job done.  VanRoekel also commented at the same meeting that he wants to break down massive multi-year information technology projects into smaller, more modular projects in the hopes of saving the government from getting mired in multi-million dollar failures.  His solution, he says, is modular and incremental deployment.
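
To make the arithmetic above explicit, here is the same rough calculation in a few lines of Python. The per-agency figure, the 50% discount and the $80 billion budget are this post’s back-of-the-envelope assumptions, not procurement data.

```python
# The post's rough rollout math, made explicit. The per-agency figure and
# discount are the post's own assumptions, not procurement data.
AGENCIES = 480
COST_PER_AGENCY = 1_000_000_000      # ~$1B initial archiving + eDiscovery
VOLUME_DISCOUNT = 0.5                # large contracts, smaller agency needs
FEDERAL_IT_BUDGET = 80_000_000_000   # VanRoekel's ~$80B budget

gross = AGENCIES * COST_PER_AGENCY               # $480B
net = gross * VOLUME_DISCOUNT                    # $240B
shortfall = net - FEDERAL_IT_BUDGET              # $160B
print(f"gross ${gross/1e9:.0f}B, net ${net/1e9:.0f}B, gap ${shortfall/1e9:.0f}B")
```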

While Rome was not built in a day, this initiative is long overdue, yet feasible, as technology exists to address these challenges rather quickly.  After these 240 days are complete and a plan is drawn the real question is, how are we going to pay now for technology the government needed yesterday?  In a perfect world, the government would select a platform for archiving and eDiscovery, break the project into incremental milestones and roll out a uniform combination of solutions that are best of breed in their expertise.

Top Ten eDiscovery Predictions for 2012

Thursday, December 8th, 2011

As 2011 comes quickly to a close we’ve attempted, as in years past, to do our best Carnac impersonation and divine the future of eDiscovery.  Some of these predictions may happen more quickly than others, but it’s our sense that all will come to pass in the near future – it’s just a matter of timing.

  1. Technology Assisted Review (TAR) Gains Speed.  The area of Technology Assisted Review is very exciting since there are a host of emerging technologies that can help make the review process more efficient, ranging from email threading, concept search, clustering, predictive coding and the like.  There are two fundamental challenges however.  First, the technology doesn’t work in a vacuum, meaning that the workflows need to be properly designed and the users need to make accurate decisions because those judgment calls often are then magnified by the application.  Next, the defensibility of the given approach needs to be well vetted.  While it’s likely not necessary (or practical) to expect a judge to mandate the use of a specific technological approach, it is important for the applied technologies to be reasonable, transparent and auditable since the worst possible outcome would be to have a technology challenged and then find the producing party unable to adequately explain their methodology.
  2. The Custodian-Based Collection Model Comes Under Stress. Ever since the days of Zubulake, litigants have focused on “key players” as a proxy for finding relevant information during the eDiscovery process.  Early on, this model worked particularly well in an email-centric environment.  But, as discovery from cloud sources, collaborative worksites (like SharePoint) and other unstructured data repositories continues to become increasingly mainstream, the custodian-oriented collection model will become rapidly outmoded because it will fail to take into account topically-oriented searches.  This trend will be further amplified by the bench’s increasing distrust of manual, custodian-based data collection practices and the presence of better automated search methods, which are particularly valuable for certain types of litigation (e.g., patent disputes, product liability cases).
  3. The FRCP Amendment Debate Will Rage On – Unfortunately Without Much Near Term Progress. While it is clear that the eDiscovery preservation duty has become a more complex and risk-laden process, it’s not clear that this “pain” is causally related to the FRCP.  In the notes from the Dallas mini-conference, a pending Sedona survey was quoted referencing the fact that preservation challenges were increasing dramatically.  Yet, there isn’t a consensus viewpoint regarding which changes, if any, would help improve the murky problem.  In the near term this means that organizations with significant preservation pains will need to better utilize the rules that are on the books and deploy enabling technologies where possible.
  4. Data Hoarding Increasingly Goes Out of Fashion. The war cry of many IT professionals that “storage is cheap” is starting to fall on deaf ears.  Organizations are realizing that the cost of storing information is just the tip of the iceberg when it comes to the litigation risk of having terabytes (and conceivably petabytes) of unstructured, uncategorized and unmanaged electronically stored information (ESI).  This tsunami of information will increasingly become an information liability for organizations that have never deleted a byte of information.  In 2012, more corporations will see the need to clean out their digital houses and will realize that such cleansing (where permitted) is a best practice moving forward.  This applies with equal force to the US government, which has recently mandated such an effort at President Obama’s behest.
  5. Information Governance Becomes a Viable Reality.  For several years there’s been an effort to combine the reactive (far right) side of the EDRM with the logically connected proactive (far left) side of the EDRM.  But now, a number of surveys have linked good information governance hygiene with better response times to eDiscovery requests and governmental inquiries, as well as a corresponding lower chance of being sanctioned and the ability to turn over less responsive information.  In 2012, enterprises will realize that the litigation use case is just one way to leverage archival and eDiscovery tools, further accelerating adoption.
  6. Backup Tapes Will Be Increasingly Seen as a Liability.  Using backup tapes for disaster recovery/business continuity purposes remains a viable business strategy, although backing up to tape will become less prevalent as cloud backup increases.  However, if tapes are kept around longer than necessary (days versus months) then they become a ticking time bomb when a litigation or inquiry event crops up.
  7. International eDiscovery/eDisclosure Processes Will Continue to Mature. It’s easy to think of the US as dominating the eDiscovery landscape. While this is gospel for us here in the States, international markets are developing quickly and in many ways are ahead of the US, particularly with regulatory compliance-driven use cases, like the UK Bribery Act 2010.  This fact, coupled with the menagerie of international privacy laws, means we’ll be less Balkanized in our eDiscovery efforts moving forward since we do really need to be thinking and practicing globally.
  8. Email Becomes “So 2009” As Social Media Gains Traction. While email has been the eDiscovery darling for the past decade, it’s getting a little long in the tooth.  In the next year, new types of ESI (social media, structured data, loose files, cloud content, mobile device messages, etc.) will cause headaches for a number of enterprises that have been overly email-centric.  Already in 2011, organizations are finding that other sources of ESI like documents/files and structured data are rivaling email in importance for eDiscovery requests, and this trend shows no signs of abating, particularly for regulated industries. This heterogeneous mix of ESI will certainly result in challenges for many companies, with some unlucky ones getting sanctioned because they ignored these emerging data types.
  9. Cost Shifting Will Become More Prevalent – Impacting the “American Rule.” For ages, the American Rule held that producing parties had to pay for their production costs, with a few narrow exceptions.  Next year we’ll see even more courts award winning parties their eDiscovery costs under 28 U.S.C. §1920(4) and Rule 54(d)(1) FRCP. Courts are now beginning to consider the services of an eDiscovery vendor as “the 21st Century equivalent of making copies.”
  10. Risk Assessment Becomes a Critical Component of eDiscovery. Managing risk is a foundational underpinning for litigators generally, but its role in eDiscovery has been a bit obscure.  Now, with the tremendous statistical insights that are made possible by enabling software technologies, it will become increasingly important for counsel to manage risk by deciding what types of error/precision rates are possible.  This risk analysis is particularly critical for conducting any variety of technology assisted review process since precision, recall and f-measure statistics all require a delicate balance of risk and reward.
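
Since prediction 10 turns on precision, recall and f-measure, here is a minimal sketch of how those statistics fall out of a validation sample; the document counts below are invented purely for illustration.

```python
# Minimal sketch of the precision / recall / f-measure balance mentioned in
# prediction 10, computed from a hypothetical technology assisted review
# validation sample. The counts are invented for illustration.
def precision_recall_f1(true_pos: int, false_pos: int, false_neg: int):
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# e.g., the tool flagged 1,000 documents, 800 of them actually responsive,
# while missing 200 responsive documents.
p, r, f = precision_recall_f1(true_pos=800, false_pos=200, false_neg=200)
print(f"precision {p:.2f}, recall {r:.2f}, f-measure {f:.2f}")  # 0.80 each
```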

Accurately divining the future is difficult (some might say impossible), but in the electronic discovery arena many of these predictions can happen if enough practitioners decide they want them to happen.  So, the future is fortunately within reach.