
Symantec Positioned Highest in Execution and Vision in Gartner Archiving MQ

Tuesday, December 18th, 2012

Once again, Gartner has named Symantec a Leader in the Enterprise Information Archiving Magic Quadrant. We have continued to invest significantly in this market, and it is gratifying to see recognition for the continued effort we put into archiving, both in the cloud and on-premises, with our Enterprise Vault.cloud and Enterprise Vault products. Symantec has now been rated a Leader nine years in a row.

 

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Symantec.

Gartner does not endorse any vendor, product or service depicted in the Magic Quadrant, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

This year marks a transition in a couple of respects. We are seeing an acceleration of customers looking for the convenience and simplicity of SaaS-based archiving solutions, with the caveat that they want the security and trust that only a vendor like Symantec can deliver.

Similarly, the market has continued to ask for integrated solutions that deliver information archiving and eDiscovery to quickly address the often complex and time-sensitive process of responding to litigation and regulatory requests. The deep integration we offer between our archiving solutions – Enterprise Vault and Enterprise Vault.cloud – and the Clearwell eDiscovery Platform has led many customers to deploy these together to streamline their eDiscovery workflow.

An archive is inherently deployed with the long term in mind.  Over the history of Gartner’s Enterprise Information Archiving MQ, only Symantec has provided a consistent solution to customers by investing and innovating with Enterprise Vault to lead the industry in performance, functionality, and support without painful migrations or changes. 

We’re excited about what we have planned next for Enterprise Vault and Enterprise Vault.cloud and intend to maintain our leadership in the years to come. Our customers will continue to be able to manage their critical information assets and meet their needs for eDiscovery and Information Governance as we improve our products year after year.

Spotlighting the Top Electronic Discovery Cases from 2012

Friday, December 14th, 2012

With the New Year quickly approaching, it is worth reflecting on some of the key eDiscovery developments that have occurred during 2012. While legislative, regulatory and rulemaking bodies have undoubtedly impacted eDiscovery, the judiciary has once again played the most dramatic role.  There are several lessons from the top 2012 court cases that, if followed, will likely help organizations reduce the costs and risks associated with eDiscovery. These cases also spotlight the expectations that courts will likely have for organizations in 2013 and beyond.

Implementing a Defensible Deletion Strategy

Case: Brigham Young University v. Pfizer, 282 F.R.D. 566 (D. Utah 2012)

In Brigham Young, the plaintiff university had pressed for sanctions as a result of Pfizer’s destruction of key documents pursuant to its information retention policies. The court rejected that argument because such a position failed to appreciate the basic workings of a valid corporate retention schedule. As the court reasoned, “[e]vidence may simply be discarded as a result of good faith business procedures.” When those procedures operate to inadvertently destroy evidence before the duty to preserve is triggered, the court held that sanctions should not issue: “The Federal Rules protect from sanctions those who lack control over the requested materials or who have discarded them as a result of good faith business procedures.”

Summary: The Brigham Young case is significant since it emphasizes that organizations should implement a defensible deletion strategy to rid themselves of data stockpiles. Absent a preservation duty or other exceptional circumstances, organizations that pare back ESI pursuant to “good faith business procedures” (such as a neutral retention policy) will be protected from sanctions.

Another Must-Read Case: Danny Lynn Elec. v. Veolia Es Solid Waste (M.D. Ala. Mar. 9, 2012)

Issuing a Timely and Comprehensive Litigation Hold

Case: Apple, Inc. v. Samsung Electronics Co., Ltd, — F. Supp. 2d. — (N.D. Cal. 2012)

Summary: The court first issued an adverse inference instruction against Samsung to address spoliation charges brought by Apple. In particular, the court faulted Samsung for failing to circulate a comprehensive litigation hold instruction when it first anticipated litigation. This eventually culminated in the loss of emails from several key Samsung custodians, inviting the court’s adverse inference sanction.

Ironically, however, Apple was subsequently sanctioned for failing to issue a proper hold notice. Just like Samsung, Apple failed to distribute a hold until several months after litigation was reasonably foreseeable. The tardy hold instruction, coupled with evidence suggesting that Apple employees were “encouraged to keep the size of their email accounts below certain limits,” ultimately led the court to conclude that Apple destroyed documents after its preservation duty ripened.

The Lesson for 2013: The Apple case underscores the importance of issuing a timely and comprehensive litigation hold notice. For organizations, this likely means identifying the key players and data sources that may have relevant information and then distributing an intelligible hold instruction. It may also require suspending aspects of information retention policies to preserve relevant ESI. By following these best practices, organizations can better avoid the sanctions bogeyman that haunts so many litigants in eDiscovery.

Another Must-Read Case: Chin v. Port Authority of New York, 685 F.3d 135 (2nd Cir. 2012)

Judicial Approval of Predictive Coding

Case: Da Silva Moore v. Publicis Groupe, — F.R.D. — (S.D.N.Y. Feb. 24, 2012)

Summary: The court entered an order that turned out to be the first of its kind: approving the use of predictive coding technology in the discovery phase of litigation. That order was entered pursuant to the parties’ stipulation, which provided that defendant MSL Group could use predictive coding in connection with its obligation to produce relevant documents. Pursuant to that order, the parties methodically (yet at times acrimoniously) worked over several months to fine tune the originally developed protocol to better ensure the production of relevant documents by defendant MSL.

The Lesson for 2013: The court declared in its order that predictive coding “is an acceptable way to search for relevant ESI in appropriate cases.” Nevertheless, the court also made clear that this technology is not the exclusive method now for conducting document review. Instead, predictive coding should be viewed as one of many different types of tools that often can and should be used together.

Another Must-Read Case: In Re: Actos (Pioglitazone) Prods. Liab. Litig. (W.D. La. July 10, 2012)

Proportionality and Cooperation are Inextricably Intertwined

Case: Pippins v. KPMG LLP, 279 F.R.D. 245 (S.D.N.Y. 2012)

Summary: The court ordered the defendant accounting firm (KPMG) to preserve thousands of employee hard drives. The firm had argued that the high cost of preserving the drives was disproportionate to the value of the ESI stored on the drives. Instead of preserving all of the drives, the firm hoped to maintain a reduced sample, asserting that the ESI on the sample drives would satisfy the evidentiary demands of the plaintiffs’ class action claims.

The court rejected the proportionality argument primarily because the firm refused to permit plaintiffs or the court to analyze the ESI found on the drives. Without any transparency into the contents of the drives, the court could not weigh the benefits of the discovery against the alleged burdens of preservation. The court was thus left to speculate about the nature of the ESI on the drives, reasoning that it went to the heart of plaintiffs’ class action claims. As the district court observed, the firm may very well have obtained the relief it requested had it engaged in “good faith negotiations” with the plaintiffs over the preservation of the drives.

The Lesson for 2013: The Pippins decision reinforces a common refrain that parties seeking the protection of proportionality principles must engage in reasonable, cooperative discovery conduct. Staking out uncooperative positions in the name of zealous advocacy stands in sharp contrast to proportionality standards and the cost cutting mandate of Rule 1. Moreover, such a tactic may very well foreclose proportionality considerations, just as it did in Pippins.

Another Must-Read Case: Kleen Products LLC v. Packaging Corp. of America (N.D. Ill. Sept. 28, 2012)

Conclusion

There were any number of other significant cases from 2012 that could have made this list.  We invite you to share your favorites in the comments section or contact us directly with your feedback.

December Symantec SharePoint Governance Twitter Chat

Thursday, December 13th, 2012

Follow hashtag #IGChat to learn about SharePoint governance and creating effective governance plans

Over the years, SharePoint has become a favorite among organizations as a place to share and manage content. As SharePoint adoption increases, storage, performance and ongoing maintenance become major challenges, and SharePoint governance becomes essential. Archiving and eDiscovery solutions play a key part in any effective and lasting governance strategy for SharePoint.

In a 2012 survey conducted by Osterman Research, 39 percent of all SharePoint implementations still did not have a governance plan, largely because implementing governance plans can be difficult.

During this Twitter Chat we will discuss the reasons why organizations need SharePoint governance and the role of archiving and eDiscovery in governance plans. Please join Symantec’s archiving/eDiscovery and SharePoint experts, Dave Scott (@DScottyt) and Rob Mossi (@RMossi24) next Tuesday, December 18 at 10 am PT to chat.

Dave Scott: Dave Scott is a Group Product Manager at Symantec specializing in social media and SharePoint archiving and eDiscovery. He has contributed articles to a number of leading industry publications and is a frequent contributor to Connect.symantec.com. 

Rob Mossi: Rob Mossi is a Sr. Product Marketing Manager with Symantec’s Enterprise Vault product team. With a focus on SharePoint, Rob actively participates in SharePoint archiving and information governance thought leadership activities, including research, conferences and social media. 

 Twitter Chat: SharePoint Governance #IGChat

 Date: Tuesday, December 18, 2012

 Time: 10 am PT

 Length: 1 hour

 Where: Twitter – follow the hashtag #IGChat

 Moderator: Symantec’s Dave Scott (@DScottyt)

What Abraham Lincoln Teaches about Defensible Deletion of ESI

Monday, November 19th, 2012

The reviews are in and movie critics are universally acclaiming Lincoln, the most recent Hollywood rendition regarding the sixteenth president of the United States. While viewers may or may not enjoy the movie, the focus on Abraham Lincoln brings to mind a rather key insight for organizations seeking to strengthen their defensible deletion process.

Lincoln has long been admired for his astute handling of the U.S. Civil War and for his inventive genius (he remains the only U.S. President who patented an invention). Nevertheless, it is Lincoln’s magnanimous, yet shrewd treatment of his rivals that provides the key lesson for organizations today. With a strategy that inexplicably escapes many organizations, Lincoln intelligently organized his documents and other materials so that he could timely retrieve them to help keep his political enemies in check.

This strategy was particularly successful with his Secretary of the Treasury, Salmon Chase, who constantly undermined Lincoln in an effort to bolster his own presidential aspirations. To blunt the effect of Chase’s treachery, Lincoln successfully wielded the weapon of information: Chase’s letters to Lincoln that were filled with problematic admissions. Doris Kearns Goodwin chronicled in her Pulitzer Prize winning book, Team of Rivals, how Lincoln always seemed to access that information at a moment’s notice to save him from Chase’s duplicity.

Lincoln’s tactics reinforce the value of retaining and retrieving important information in a time of need. Lacking the organizational and technological capacity to do so may prevent companies from pulling up information at a crucial moment, be it for business, legal or regulatory purposes. For this and many other reasons, industry experts are recommending that organizations implement a defensible deletion strategy.

Defensible Deletion Requires Deletion

Such a strategy is most likely to succeed when it is powered by the latest in effective retention technologies, such as data classification and automated legal hold. Such innovations better enable organizations to segregate and preserve business-critical ESI.

And yet, it is not enough to adopt only the preservation side of this strategy, for the heart of defensible deletion requires just that – deleting large classes of superfluous, duplicative and harmful data – if its benefits are ever to be realized. Companies that fail to delete such ESI will likely never come off conqueror in the “battle of the data bulge.” Indeed, such a growing waistline of data is problematic for three reasons. First, it can place undue pressure on an organization’s storage infrastructure and needlessly increase the cost of data retention. Second, it can result in higher eDiscovery costs as the organization is forced to review and analyze all of that ESI largesse. Finally, the risk of producing harmful materials in eDiscovery – materials kept beyond the time required by law – will unnecessarily increase. All of this could have been obviated had the enterprise observed the rule of “good corporate housekeeping” by eliminating ESI in a manner approved by courts and rulemakers.

For organizations willing to get rid of their digital clutter, defensible deletion offers just what they need to reduce the costs and risks of bloated ESI retention. Doing so will help companies make better use of that information so, like Honest Abe, they can stave off troublesome challenges threatening the enterprise.

New Gartner Report Spotlights Significance of Email Archiving for Defensible Deletion

Thursday, November 1st, 2012

Gartner recently released a report that spotlights the importance of using email archiving as part of an organization’s defensible deletion strategy. The report – Best Practices for Using Email Archiving to Eliminate PST and Mailbox Quota Headaches (Alan Dayley, September 21, 2012) – specifically focuses on the information retention and eDiscovery challenges associated with email storage on Microsoft Exchange and how email archiving software can help address these issues. As Gartner makes clear in its report, an archiving solution can provide genuine opportunities to reduce the costs and risks of email hoarding.

The Problem: PST Files

The primary challenge that many organizations are experiencing with Microsoft Exchange email is the unchecked growth of messages stored in personal storage table (PST) files. Used to bypass storage quotas on Exchange, PST files are problematic because they increase the costs and risks of eDiscovery while circumventing information retention policies.

That the unrestrained growth of PST files could create problems downstream for organizations should come as no surprise. Various court decisions have addressed this issue, with the DuPont v. Kolon Industries litigation foremost among them. In the DuPont case, a $919 million verdict and 20-year product injunction largely stemmed from the defendant’s inability to prevent the destruction of thousands of pages of email formerly stored in PST files. That spoliation resulted in an adverse inference instruction to the jury and the ensuing verdict against the defendant.

The Solution: Eradicate PSTs with the Help of Archiving Software and Retention Policies

To address the PST problem, Gartner suggests following a three-step process to help manage and then eradicate PSTs from the organization. This includes educating end users regarding both the perils of PSTs and the ease of access to email through archiving software. It also involves disabling the creation of new PSTs, a process that should ultimately culminate with the elimination of existing PSTs.

In connection with this process, Gartner suggests deployment of archiving software with a “PST management tool” to facilitate the eradication process. With the assistance of the archiving tool, existing PSTs can be discovered and migrated into the archive’s central data repository. Once there, email retention policies can begin to expire stale, useless and even harmful messages that were formerly outside the company’s information retention framework.
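The discovery step of that process can be sketched as a simple filesystem scan. This is an illustrative sketch only; real archiving products ship their own PST management tools, and the function and paths here are hypothetical:

```python
import os

def find_pst_files(root):
    """Walk a directory tree and record every PST file found,
    along with its size, so migration into the archive can be
    planned and tracked."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".pst"):
                path = os.path.join(dirpath, name)
                found.append((path, os.path.getsize(path)))
    return found

# Example: inventory PSTs on a shared drive before migration.
# for path, size in find_pst_files(r"\\fileserver\users"):
#     print(f"{path}: {size / 1_000_000:.1f} MB")
```

An inventory like this gives IT a migration worklist and a rough sizing of the data that will land in the archive’s central repository.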

With respect to the development of retention policies, organizations should consider engaging in a cooperative internal process involving IT, compliance, legal and business units. These key stakeholders must be engaged and collaborate if workable policies are to be created. The actual retention periods should take into account the types of email generated and received by an organization, along with the enterprise’s business, industry and litigation profile.

To ensure successful implementation of such retention policies and also address the problem of PSTs, an organization should explore whether an on-premises or cloud archiving solution is a better fit for its environment. While each method has its advantages, Gartner advises organizations to consider whether certain key features are included with a particular offering:

Email classification. The archiving tool should allow your organization to classify and tag the emails in accordance with your retention policy definitions, including user-selected, user/group, or key-word tagging.

User access to archived email. The tool must also give end users appropriate and user-friendly access to their archived email, thus eliminating concerns over their inability to manage their email storage with PSTs.

Legal and information discovery capabilities. The search, indexing, and e-discovery capabilities of the archiving tool should also match your needs or enable integration into corporate e-discovery systems.
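The email-classification feature above can be illustrated with a minimal keyword-tagging sketch. The category names, keywords and retention periods below are invented for illustration and do not reflect any particular product’s policy schema:

```python
# Keyword-based tagging of email against retention policy
# definitions (all categories and periods are hypothetical).
RETENTION_TAGS = {
    "finance": (["invoice", "payment", "audit"], "7y"),
    "legal":   (["contract", "litigation", "subpoena"], "10y"),
    "routine": ([], "1y"),  # default bucket for unmatched mail
}

def tag_email(subject, body):
    """Return (tag, retention_period) for a message based on
    keyword matches; unmatched mail falls into the default bucket."""
    text = f"{subject} {body}".lower()
    for tag, (keywords, period) in RETENTION_TAGS.items():
        if any(k in text for k in keywords):
            return tag, period
    return "routine", RETENTION_TAGS["routine"][1]
```

Commercial archiving tools layer user-selected, user/group and keyword tagging on top of this basic idea, but the mapping from message content to a retention class is the core of the feature.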

While perhaps not a panacea for the storage and eDiscovery problems associated with email, on-premises or cloud archiving software should provide various benefits to organizations. Indeed, such technologies have the potential to help organizations store, manage and discover their email efficiently, cost-effectively and in a defensible manner. Where properly deployed and fully implemented, organizations should be able to reduce the nettlesome costs and risks connected with email.

Federal Directive Hits Two Birds (RIM and eDiscovery) with One Stone

Thursday, October 18th, 2012

The eagerly awaited Directive from the Office of Management and Budget (OMB) and the National Archives and Records Administration (NARA) was released at the end of August. In an attempt to go behind the scenes, we’ve asked the Project Management Office (PMO) and the Chief Records Officer for NARA to respond to a few key questions.

We know that the Presidential Mandate was the impetus for the agency self-assessments that were submitted to NARA. Now that NARA and the OMB have distilled those reports, what are the biggest challenges going forward for the government regarding recordkeeping, information governance and eDiscovery?

“In each of those areas, the biggest challenge that can be identified is the rapid emergence and deployment of technology. Technology has changed the way Federal agencies carry out their missions and create the records required to document that activity. It has also changed the dynamics in records management. In the past, agencies would maintain central file rooms where records were stored and managed. Now, with distributed computing networks, records are likely to be in a multitude of electronic formats, on a variety of servers, and exist as multiple copies. Records management practices need to move forward to solve that challenge. If done right, good records management (especially of electronic records) can also be of great help in providing a solid foundation for applying best practices in other areas, including in eDiscovery, FOIA, as well as in all aspects of information governance.”    

What is the biggest action item from the Directive for agencies to take away?

“The Directive creates a framework for records management in the 21st century that emphasizes the primacy of electronic information and directs agencies to begin transforming their current processes to identify and capture electronic records. One milestone is that by 2016, agencies must be managing their email in an electronically accessible format (with tools that make this possible, not printing out emails to paper). Agencies should begin planning for the transition, where appropriate, from paper-based records management processes to those that preserve records in an electronic format.

The Directive also calls on agencies to designate a Senior Agency Official (SAO) for Records Management by November 15, 2012. The SAO is intended to raise the profile of records management in an agency to ensure that each agency commits the resources necessary to carry out the rest of the goals in the Directive. A meeting of SAOs is to be held at the National Archives with the Archivist of the United States convening the meeting by the end of this year. Details about that meeting will be distributed by NARA soon.”

Does the Directive holistically address information governance for the agencies, or is it likely that agencies will continue to deploy different technology even within their own departments?

“In general, as long as agencies are properly managing their records, it does not matter what technologies they are using. However, one of the drivers behind the issuance of the Memorandum and the Directive was identifying ways in which agencies can reduce costs while still meeting all of their records management requirements. The Directive specifies actions (see A3, A4, A5, and B2) in which NARA and agencies can work together to identify effective solutions that can be shared.”

Finally, although FOIA requests have increased and the backlog has decreased, how will litigation and FOIA intersect over, say, the next five years? We know from the retracted decision in NDLON that metadata still remains an issue for the government…are we getting to a point where records created electronically will be able to be produced electronically as a matter of course for FOIA litigation/requests?

“In general, an important feature of the Directive is that the Federal government’s record information – most of which is in electronic format – stays in electronic format. Therefore, all of the inherent benefits will remain as well – i.e., metadata being retained, easier and speedier searches to locate records, and efficiencies in compilation, reproduction, transmission, and reduction in the cost of producing the requested information. This all would be expected to have an impact in improving the ability of federal agencies to respond to FOIA requests by producing records in electronic formats.”

Fun Fact- Is NARA really saving every tweet produced?

“Actually, the Library of Congress is the agency that is preserving Twitter. NARA is interested in only preserving those tweets that a) were made or received in the course of government business and b) appraised to have permanent value. We talked about this on our Records Express blog.”

“We think President Barack Obama said it best when he made the following comment on November 28, 2011:

“The current federal records management system is based on an outdated approach involving paper and filing cabinets. Today’s action will move the process into the digital age so the American public can have access to clear and accurate information about the decisions and actions of the Federal Government.”

Paul Wester, Chief Records Officer at the National Archives, has said that this Directive is very exciting for the Federal records management community: “In our lifetime none of us has experienced the attention to the challenges that we encounter every day in managing our records management programs like we are now. These are very exciting times to be a records manager in the Federal government. Full implementation of the Directive by the end of this decade will take a lot of hard work, but the government will be better off for doing this and we will be better able to serve the public.”

Special thanks to NARA for the ongoing dialogue that is key to transparent government and the effective practice of eDiscovery, Freedom Of Information Act requests, records management and thought leadership in the government sector. Stay tuned as we continue to cover these crucial issues for the government as they wrestle with important information governance challenges. 

 

Defensible Deletion: The Cornerstone of Intelligent Information Governance

Tuesday, October 16th, 2012

The struggle to stay above the rising tide of information is a constant battle for organizations. Not only are the costs and logistics associated with data storage more troubling than ever, but so are the potential legal consequences. Indeed, the news headlines are constantly filled with horror stories of jury verdicts, court judgments and unreasonable settlements involving organizations that failed to effectively address their data stockpiles.

While there are no quick or easy solutions to these problems, an increasingly popular method for dealing with them is an organizational strategy referred to as defensible deletion. Defensible deletion can refer to many things, but at its core it is a comprehensive approach that companies implement to reduce the storage costs and legal risks associated with the retention of electronically stored information (ESI). Organizations that have done so have been successful in avoiding court sanctions while at the same time eliminating ESI that has little or no business value.

The first step to implementing a defensible deletion strategy is for organizations to ensure that they have a top-down plan for addressing data retention. This typically requires that their information governance principals – legal and IT – are cooperating with each other. These departments must also work jointly with records managers and business units to decide what data must be kept and for what length of time. All such stakeholders in information retention must be engaged and collaborate if the organization is to create a workable defensible deletion strategy.

Cooperation between legal and IT naturally leads the organization to establish records retention policies, which carry out the key players’ decisions on data preservation. Such policies should address the particular needs of an organization while balancing them against litigation requirements. Not only will that enable a company to reduce its costs by decreasing data proliferation, it will minimize a company’s litigation risks by allowing it to limit the amount of potentially relevant information available for current and follow-on litigation.

In like manner, legal should work with IT to develop a process for how the organization will address document preservation during litigation. This will likely involve the designation of officials who are responsible for issuing a timely and comprehensive litigation hold to custodians and data sources. This will ultimately help an organization avoid the mistakes that often plague document management during litigation.

The Role of Technology in Defensible Deletion

In the digital age, an essential aspect of a defensible deletion strategy is technology. Indeed, without innovations such as archiving software and automated legal hold acknowledgements, it will be difficult for an organization to achieve its defensible deletion objectives.

On the information management side of defensible deletion, archiving software can help enforce organization retention policies and thereby reduce data volume and related storage costs. This can be accomplished with classification tools, which intelligently analyze and tag data content as it is ingested into the archive. By so doing, organizations may retain information that is significant or that otherwise must be kept for business, legal or regulatory purposes – and nothing else.

An archiving solution can also reduce costs through efficient data storage. By expiring data in accordance with organization retention policies and by using single instance storage to eliminate ESI duplicates, archiving software frees up space on company servers for the retention of other materials and ultimately leads to decreased storage costs. Moreover, it also lessens litigation risks as it removes data available for future litigation.
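Single-instance storage is worth a brief illustration. The toy class below is a sketch of the general technique, not the internals of any particular archiving product: each unique blob is stored once, keyed by its content hash, and later copies only add a reference.

```python
import hashlib

class SingleInstanceStore:
    """Toy single-instance store: identical content is kept once
    and subsequent copies merely increment a reference count."""
    def __init__(self):
        self.blobs = {}  # content hash -> stored bytes
        self.refs = {}   # content hash -> reference count

    def ingest(self, content: bytes) -> str:
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self.blobs:
            self.blobs[digest] = content  # first copy pays the storage cost
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest

store = SingleInstanceStore()
# The same attachment sent to three custodians occupies one slot.
for _ in range(3):
    store.ingest(b"quarterly-report attachment bytes")
```

Three ingested copies, one stored blob: that ratio, multiplied across an enterprise mail system, is where the storage savings come from.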

On the eDiscovery side of defensible deletion, an eDiscovery platform with the latest in legal hold technology is often essential for enabling a workable litigation hold process. Effective platforms enable automated legal hold acknowledgements on various custodians across multiple cases. This allows organizations to confidently place data on hold through a single user action and eliminates concerns that ESI may slip through the proverbial cracks of manual hold practices.
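The hold-tracking bookkeeping behind automated acknowledgements can be sketched with a simple data structure. This is an illustrative model with invented field and matter names, not the schema of any eDiscovery platform:

```python
from datetime import date

class LegalHoldTracker:
    """Sketch of tracking hold notices and acknowledgements
    across custodians and matters."""
    def __init__(self):
        # (matter, custodian) -> {"issued": date, "acknowledged": date or None}
        self.holds = {}

    def issue(self, matter, custodians, issued=None):
        """Record a hold notice sent to each custodian on a matter."""
        issued = issued or date.today()
        for c in custodians:
            self.holds[(matter, c)] = {"issued": issued, "acknowledged": None}

    def acknowledge(self, matter, custodian, when=None):
        """Record a custodian's acknowledgement of the hold."""
        self.holds[(matter, custodian)]["acknowledged"] = when or date.today()

    def outstanding(self, matter):
        """Custodians who have not yet acknowledged the hold."""
        return [c for (m, c), rec in self.holds.items()
                if m == matter and rec["acknowledged"] is None]
```

A real platform adds reminders, escalations and audit trails on top, but the single-action "place data on hold, then chase outstanding acknowledgements" workflow rests on exactly this kind of ledger.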

Organizations are experiencing every day the costly mistakes of delaying implementation of a defensible deletion program. This trend can be reversed through a common sense defensible deletion strategy which, when powered by effective, enabling technologies, can help organizations decrease the costs and risks associated with the information explosion.

Responsible Data Citizens Embrace Old World Archiving With New Data Sources

Monday, October 8th, 2012

The times are changing rapidly as data explosion mushrooms, but the more things change the more they stay the same. In the archiving and eDiscovery world, organizations are increasingly pushing content from multiple data sources into information archives. Email was the first data source to take the plunge into the archive, but other data sources are following quickly as we increase the amount of data we create (volume) along with the types of data sources (variety). While email is still a paramount data source for litigation, internal/external investigations and compliance – other data sources, namely social media and SharePoint, are quickly catching up.  

This transformation is happening for multiple reasons. The main reason for this expansive push of different data varieties into the archive is because centralizing an organization’s data is paramount to healthy information governance. For organizations that have deployed archiving and eDiscovery technologies, the ability to archive multiple data sources is the Shangri-La they have been looking for to increase efficiency, as well as create a more holistic and defensible workflow.

Organizations can now deploy document retention policies across multiple content types within one archive and can identify, preserve and collect from the same, singular repository. No longer do separate retention policies need to apply to data that originated in different repositories. The increased ability to archive more data sources into a centralized archive provides for unparalleled storage, deduplication, document retention, defensible deletion and discovery benefits in an increasingly complex data environment.

Prior to this capability, SharePoint was another data source in the wild that required disparate treatment. This meant that in-place legal hold, as well as insight into the corpus of data, was less clear than it was for email. This lack of transparency into the organization’s data environment for early case assessment led to unnecessary outsourcing, over-collection and disparate, time-consuming workflows. All of these detractors cost organizations money, resources and time that could be better utilized elsewhere.

Bringing data sources like SharePoint into an information archive improves an organization’s ability to comply with document retention schedules and legal hold requirements, and to reap the benefits of a comprehensive information governance program. If SharePoint is where an organization’s employees store documents that are valuable to the business, order needs to be brought to that repository.

Additionally, many projects are abandoned and left to die on the vine in SharePoint. These projects need to be expired and their capacity recycled for a higher business purpose. Archives can now ingest document libraries, wikis, discussion boards, custom lists, “My Sites” and SharePoint social content for increased storage optimization, retention/expiration of content and eDiscovery. As a result, organizations can better manage complex SharePoint initiatives such as migrations, versioning, site consolidations and expiration.

Data can be analogized to a currency, with the archive as the bank. In treating data as currency, organizations must ask themselves: why are companies valued the way they are on Wall Street? Companies that provide services, or services in combination with products, are often valued on customer lists, consumer data that can be repurposed (Facebook), and various other databases. A recent Forbes article discusses people, value and brand as predominant indicators of value.

While these valuation metrics are sound, they stop short of measuring the quality of the actual data within an organization, or examining whether it is organized and protected. Nor do they consider the risks and benefits of how the data is stored and protected, and whether it is searchable. The value of the data inside a company is what underpins all three of the aforementioned indicators, without exception. Without managing its data, an organization not only faces legal and financial risk from eDiscovery and storage costs; all three indicators are compromised.

If employee data is not managed and monitored appropriately, if the brand is compromised due to a lack of social media monitoring and response, or if litigation ensues without a proper information governance plan, then value is lost because value has never been assessed and managed. Ultimately, an organization is only as good as its data, and this means there is a new asset on Wall Street – data.

Archiving email is not a new concept, and in turn it is not novel that data is an asset. It has simply been a less understood asset: even though massive amounts of data are created each day in organizations, storage has become cheap. SharePoint is increasingly being archived because more critical data is being stored there, including business records, contracts and social media content. Organizations cannot act on what they cannot see, until an event forces them to go back and collect, analyze and review that data. Costs associated with this reactive eDiscovery process can range from $3,000 to $30,000 per gigabyte, compared to roughly 20 cents per gigabyte for storage. The downstream eDiscovery costs are obviously significant, especially as organizations begin to deal in terabytes and zettabytes.
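The arithmetic behind that gap is stark. A rough back-of-the-envelope sketch, using only the per-gigabyte figures cited in the paragraph above (worked in cents to keep the division exact):

```python
# Reactive eDiscovery vs. storage cost per gigabyte, from the figures above.
ediscovery_low_cents = 3_000 * 100    # $3,000/GB at the low end of review cost
ediscovery_high_cents = 30_000 * 100  # $30,000/GB at the high end
storage_cents = 20                    # $0.20/GB to simply store the data

# Even at the low end, reactive review costs 15,000x what storage does.
print(ediscovery_low_cents // storage_cents)   # 15000
print(ediscovery_high_cents // storage_cents)  # 150000

# For a modest 100 GB collection swept into litigation, in dollars:
print(100 * 3_000)                             # 300000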

Hence, plus ça change, plus c’est la même chose: we will see this trend continue as organizations push more valuable data into the archive and expire data that has no value. Multiple data sources have been collection sources for some time, but the ease of pulling everything into an archive is allowing for economies of scale and increased defensibility in data management. This will decrease the risks associated with litigation and compliance, as well as boost the value of companies.

Kleen Products Predictive Coding Update – Judge Nolan: “I am a believer of principle 6 of Sedona”

Tuesday, June 5th, 2012

Recent transcripts reveal that 7th Circuit Magistrate Judge Nan Nolan has urged the parties in Kleen Products, LLC, et al. v. Packaging Corporation of America, et al. to focus on developing a mutually agreeable keyword search strategy for eDiscovery instead of debating whether other search and review methodologies would yield better results. This is big news for litigators and others in the electronic discovery space because many perceived Kleen Products as potentially putting keyword search technology on trial, compared to newer technology like predictive coding. Considering keyword search technology is still widely used in eDiscovery, a ruling by Judge Nolan requiring defendants to redo part of their production using technology other than keyword searches would sound alarm bells for many litigators.

The controversy surrounding Kleen Products relates both to Plaintiffs’ position and to the status of discovery in the case. Plaintiffs initially asked Judge Nolan to order Defendants to redo their previous productions, and all future productions, using alternative technology. The request was surprising to many observers because some Defendants had already spent thousands of hours reviewing and producing in excess of one million documents. That number has since surpassed three million documents. Among other things, Plaintiffs claim that if Defendants had used “Content Based Advanced Analytics” tools (a term they did not define) such as predictive coding technology, then their production would have been more thorough. Notably, Plaintiffs do not appear to point to any instances of specific documents missing from Defendants’ productions.

In response, Defendants countered that their use of keyword search technology, and their eDiscovery methodology in general, was extremely rigorous and thorough. More specifically, they highlight their use of advanced culling and analysis tools (such as domain filtering and email threading) in addition to keyword search tools. Defendants also claim they cooperated with Plaintiffs by allowing them to participate in the selection of keywords used to search for relevant documents. Perhaps going above and beyond the eDiscovery norm, the Defendants even instituted a detailed document sampling approach designed to measure the quality of their document productions.

Following two full days of expert witness testimony regarding the adequacy of Defendants’ initial productions, Judge Nolan finally asked the parties to try to reach a compromise on the “Boolean” keyword approach. She apparently reasoned that having the parties work out a mutually agreeable approach based on what Defendants had already implemented was preferable to scheduling yet another full day of expert testimony, even though additional expert testimony remains an option.

In a nod to the Sedona Principles, she further explained her rationale on March 28, 2012, at the conclusion of the second day of testimony:

“the defendants had done a lot of work, the defendant under Sedona 6 has the right to pick the [eDiscovery] method. Now, we all know, every court in the country has used Boolean search, I mean, this is not like some freak thing that they [Defendants] picked out…”

Judge Nolan’s reliance on the Sedona Best Practices Recommendations & Principles for Addressing Electronic Document Production reveals how she would likely rule if Plaintiffs renew their position that Defendants should have used predictive coding or some other kind of technology in lieu of keyword searches. Sedona Principle 6 states that:

“[r]esponding parties are best situated to evaluate the procedures, methodologies, and technologies appropriate for preserving and producing their own electronically stored information.”

In other words, Judge Nolan confirmed that in her court, opposing parties typically may not dictate what technology solutions their opponents must use without some indication that the technology or process used failed to yield accurate results. Judge Nolan also observed that quality and accuracy are key guideposts regardless of the technology utilized during the eDiscovery process:

“what I was learning from the two days, and this is something no other court in the country has really done too, is how important it is to have quality search. I mean, if we want to use the term “quality” or “accurate,” but we all want this…– how do you verify the work that you have done already, is the way I put it.”

Although Plaintiffs have reserved their right to reintroduce their technology arguments, recent transcripts suggest that Defendants will not be required to use different technology. Plaintiffs continue to meet and confer with individual Defendants to agree on keyword searches, as well as the types of data sources that must be included in the collection. The parties and Judge also appear to agree that they would like to continue making progress with 30(b)(6) depositions and other eDiscovery issues before Judge Nolan retires in a few months, rather than begin a third day of expert hearings regarding technology related issues. This appears to be good news for the Judge and the parties since the eDiscovery issues now seem to be headed in the right direction as a result of mutual cooperation between the parties and some nudging by Judge Nolan.

There is also good news for outside observers in that Judge Nolan has provided some sage guidance to help future litigants before she steps down from the bench. First, it is clear that Judge Nolan and other judges continue to emphasize the importance of cooperation in today’s complex new world of technology. Parties should be prepared to cooperate and be more transparent during discovery given the judiciary’s increased reliance on the Sedona Cooperation Proclamation. Second, Kleen Products illustrates that keyword search is not dead. Instead, keyword search should be viewed as one of many tools in the Litigator’s Toolbelt™ that can be used alongside other tools such as email threading, advanced filtering technology, and even predictive coding. Finally, litigators should take note that regardless of the tools they select, they must be prepared to defend their process and use of those tools or risk the scrutiny of judges and opposing parties.

The Demise of The News of the World: An Analysis of “Hackgate” Through an eDiscovery Lens

Friday, June 1st, 2012

The events surrounding the troubled News Corporation media empire, under investigation for the illegal seizure of electronically stored information (ESI), are seemingly never-ending. The Australian-born billionaire Rupert Murdoch is chairman of the New York-based parent company, News Corporation, and as a U.S.-based company with subsidiaries abroad, the litigation exposure for the company is vast. News International, a U.K. subsidiary of News Corporation, shut down one of its oldest-running publications, The News of the World, in July 2011 amid the monumental phone hacking scandal known as Hackgate. Although the paper was dissolved, allegations dating back as early as 2002 detail unethical media practices, email/voicemail/text hacking, and police bribery, culminating in the recent Leveson inquiry. This firestorm continues to plague the company and has created one of the most complex legal debacles of the modern era.

Myriad factors account for the legal complexities that continue to unfold, including: active civil and criminal actions in both U.S. and U.K. jurisdictions; questions about how evidence has been obtained and its admissibility in differing jurisdictions; public inquiries in the U.K.; and investigations by the Federal Bureau of Investigation (FBI) and the U.S. Department of Justice under the Foreign Corrupt Practices Act (FCPA). Under the FCPA, American companies are prohibited from compensating representatives of a foreign government for a commercial advantage. This is particularly pertinent given the recently released text messages uncovered in the Leveson inquiry, which expose alleged illegal communications between Frederic Michel, a lobbyist for News Corporation, and Jeremy Hunt, the Secretary of State for Culture, Olympics, Media and Sport, during News Corporation’s bid to acquire BSkyB in 2010-11. The bid has since been abandoned, and so have Murdoch’s attempts to create the largest media empire in the world.

eDiscovery and Hackgate

To date, there have been more than 60 civil claims brought in the U.K. derived from Hackgate (many have been privately settled), not including any U.S. litigation, Operation Weeting, the Leveson inquiry, and various other concurrent investigations. Several key disclosure orders from the High Court in these civil cases have resulted in extensive discovery that points not only to a conspiracy, but also to the willful destruction of evidence. The High Court judge presiding over the civil lawsuits, Geoffrey Vos, was shocked by the company’s “startling approach” to e-mail, particularly because even after receiving formal requests for documents, the company still failed to preserve relevant emails. In fact, the company inquired with its email provider about how to delete those emails. Vos is quoted as saying that News International should be “treated as deliberate destroyers of evidence.”

A hard copy of an email from 2008 addressed to Mr. Murdoch’s son, James Murdoch, who at the time was a top executive of News International, is of particular interest regarding his level of knowledge about Hackgate. The email is from a thread between News Corporation’s in-house counsel and the then-editor, Colin Myler, informing James that legal fallout from phone hacking was imminent. James and his father later testified that they had no knowledge of the emails and that they failed to appreciate any illegal activity regarding phone hacking at the newspaper. Apparently, the electronic copy of the email was deleted on Jan. 15, 2011 during an “e-mail stabilization and modernization program.”

As frequently discussed in the U.S., having a document retention policy is crucial to the defensible deletion of data in a corporation. That deletion must be suspended and relevant data must be placed on legal hold once litigation is reasonably anticipated. Moreover, a deletion program should not be instituted in the midst of a company-wide international crisis. What is troublesome in this scenario is that no such policy seems to have existed for document retention or legal hold. If a properly deployed retention schedule had existed, the emails would have been deleted prior to 2011 as part of the normal course of business. Conversely, if litigation was reasonably anticipated and a legal hold properly issued, the emails surely would not have been deleted. In the U.K., case law does exist to support the need for preservation and for an ESI management system that allows full disclosure of relevant information.
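The missing control described above is, in essence, a simple gate. The sketch below is purely illustrative (the function, record layout, and matter names are invented for this example): routine expiration must check for an active legal hold before anything is purged.

```python
# Hypothetical defensible-deletion gate: records tied to a matter on legal
# hold are never expired, even if their retention period has lapsed.
def purge_candidates(records, legal_holds):
    """Yield record ids eligible for deletion in the normal course of business."""
    held = set(legal_holds)  # matters with reasonably anticipated litigation
    for rec_id, retention_expired, matter in records:
        if retention_expired and matter not in held:
            yield rec_id

records = [
    ("email-1", True,  "hackgate"),  # past retention but on hold -> keep
    ("email-2", True,  None),        # past retention, no hold    -> purge
    ("email-3", False, None),        # still within retention     -> keep
]
print(list(purge_candidates(records, {"hackgate"})))  # ['email-2']
```

Run before the crisis, with no hold in place, such a gate deletes expired records routinely and defensibly; once a hold attaches, the same gate preserves them automatically.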

News Corporation must contend with both the U.S. and the U.K. regarding the defensibility of its information management systems and potential sanctions. In either jurisdiction, the intentional deletion of relevant evidence is an obstruction of justice in the criminal sense. News Corporation is a prime example of a multinational corporation that is not only suffering the repercussions of bad behavior, but that could not mitigate those risks at the highest level due to poor information management. A comprehensive information governance plan and in-house technology would have been key to any internal investigation into the alleged illegal activities of employees, as well as to responding to litigation and regulatory inquiries. A proper information management system might have obviated much of News of the World’s troubles, provided more transparency, and potentially prevented this never-ending downward spiral.