
Archive for the ‘e-mail’ Category

The eDiscovery Trinity: Spoliation Sanctions, Keywords and Predictive Coding

Monday, May 20th, 2013

The world of eDiscovery appears to be revolving around a trifecta of issues that are important to both clients and counsel. A discovery-focused conversation with litigants and lawyers in 2013 will almost invariably turn to some combination of this eDiscovery trinity: Spoliation sanctions, keyword searches and predictive coding. This should not come as a surprise since all three of these issues can have a strong impact on the cost, duration and disposition of a lawsuit. Indeed, the near universal desire among parties to minimize discovery costs and thereby further the resolution of cases on the merits has driven the Civil Rules Advisory Committee to explore ways to address the eDiscovery trinity in draft amendments to the Federal Rules.

While the proposed amendments may or may not succeed in reducing discovery expenses, the examples of how the eDiscovery trinity is playing out in litigation are instructive. These cases – decided without the benefit of the additional guidance being developed by the Advisory Committee – provide valuable insight into how courts, counsel and clients are handling the convergence of these issues. One such example is a recent decision from the DuPont v. Kolon Industries case.

Spoliation, Keywords and a $4.5 Million Sanction

In DuPont, the court awarded the plaintiff manufacturer $4.5 million in fees and costs that it incurred as part of its effort to address Kolon’s spoliation of ESI. In an attempt to stave off the award, Kolon argued that DuPont’s fees were not justified due to “inefficiencies” associated with DuPont’s review of Kolon’s document productions. In particular, Kolon complained about the extensive list of search terms that DuPont developed to comb through the ESI Kolon produced. According to Kolon, DuPont’s search methodology was “recklessly inefficient”:

DuPont’s forensic experts ran a list of almost 350 “keywords,” which yielded thousands of “false positives” that nevertheless had to be translated, analyzed, and briefed. Of the nearly 18,000 “hits,” only 1,955 (roughly 10 percent) were determined to be even “potentially relevant.” Thus, to state the obvious, 90 percent of the results were wholly irrelevant to the issue, but DuPont still seeks to tax Kolon for having the bulk of those documents translated and analyzed.

Kolon then asserted that the “reckless inefficiency” of the search methodology was “fairly attributable to the fact that DuPont ran insipid keywords like ‘other,’ ‘news,’ and ‘mail.’” Had DuPont been more precise with its keyword searches, argued Kolon, it “would have saved vast amounts of time and money.”
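
The arithmetic underlying Kolon’s complaint is a simple precision calculation. As a back-of-the-envelope sketch only, using the approximate figures quoted in the briefing above, the hit-rate math looks like this:

```python
# Precision of the keyword-search hit population, using the approximate
# figures quoted in Kolon's briefing above.
total_hits = 18_000            # documents returned by the ~350 keywords
potentially_relevant = 1_955   # hits later judged "potentially relevant"

precision = potentially_relevant / total_hits
print(f"Precision: {precision:.1%}")            # roughly 10.9%
print(f"False positives: {1 - precision:.1%}")  # roughly 89.1%
```

Whether a roughly ten percent hit rate is “recklessly inefficient” or reasonable under the circumstances was, of course, precisely the question the court had to answer.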

Before addressing the merits of Kolon’s arguments, the court observed how important search terms had become in discovery:

Of course, in the current world of litigation, where so many documents are stored and, hence, produced, electronically, the selection of search terms is an important decision because it, in turn, drives the subsequent document discovery, production and review.

After doing so, the court rejected Kolon’s arguments, finding instead that DuPont’s search methodology was reasonable under the circumstances. The court based its decision on the source of those search terms (derived from Kolon documents suggesting that ESI had been deleted), the “considerable volume” of Kolon’s productions and the nature of DuPont’s search (an investigation for deleted evidence).

The Impact of Predictive Coding on DuPont’s Search Efficiency

While the DuPont decision addressed spoliation and keywords in connection with the imposition of attorney fees and costs, it was silent on the impact that predictive coding might have had on the fee award. Indeed, neither the court’s order nor the parties’ briefing considered whether the proper application of machine learning technology could have raised the success rate of DuPont’s searches for documents relevant to Kolon’s spoliation above the ten percent (10%) figure cited by Kolon.

On the one hand, many eDiscovery cognoscenti would likely assert that a properly applied predictive coding solution could have produced the same corpus of relevant documents at a fraction of the cost and effort. Others, however, might argue that predictive coding would not necessarily yield the results that DuPont obtained through keyword searches given that DuPont was looking for evidence of deleted ESI. Still others would contend that the issue is moot since DuPont was fully within its rights to determine how it should conduct the search of Kolon’s document productions.

Whether predictive coding could have made a difference in DuPont is entirely speculative. Regardless, the debate over keyword searches versus machine learning technology will likely continue unabated. As it stands, the DuPont case, together with the recent decision from Apple v. Samsung, confirms that keywords may be an acceptable method for conducting searches for relevant ESI. The issue, as the DuPont court observed, turns on “the selection of the search terms.”

Nevertheless, the promise of predictive coding cannot be ignored, particularly if the technology that is used could ultimately reduce the costs and duration of discovery. Given that this debate is far from settled, these issues, along with spoliation sanctions, will likely continue to dominate the eDiscovery airwaves for the foreseeable future.

The “Sedona Bubble” and the Top 3 TAR Trends of 2013

Tuesday, April 23rd, 2013

 

References to the “Sedona Bubble” are overheard more and more commonly at conferences dealing with cutting-edge topics like the use of predictive coding technology in eDiscovery. The “Sedona Bubble” refers to a small number of lawyers and judges (most of whom are members of The Sedona Conference) who are fully engaged in discussions about issues that influence the evolution of modern discovery practice. Let’s face it: the fact that only a small percentage of judges and lawyers drive important eDiscovery policy decisions is more than just a belief; it is reality.

This reality stems largely from the fact that litigators are a busy lot. So busy, in fact, that they are often forced to operate reactively instead of proactively because putting out unexpected fires comes with the territory in litigation practice. As a result, the Sedona Bubble has a tremendous impact on cutting-edge eDiscovery issues spanning everything from cross-border litigation and cloud computing to social media and bring your own device (BYOD) issues. Recognizing the heavy time demands facing most litigators is what compelled me to provide more insight into the Sedona Bubble. That is why I am sharing my top three observations about the current state of predictive coding – one of the hottest eDiscovery topics on the planet.

#1 – Plenty of confusion about TAR still exists

Technology-assisted review (TAR) is a term that often means different things to different people. Adding further confusion to the discussion is the fact that the acronym TAR is commonly used interchangeably with other terms like computer-assisted review (CAR) and predictive coding. Many believe confusion about TAR is largely the result of misinformation spread by eDiscovery providers eager to capitalize on current marketplace momentum. Regardless of the reason, many in the industry remain confused about the key differences between predictive coding and other kinds of TAR tools.

What is important to remember is that most people are referring to predictive coding technology when they use any of the aforementioned terms. Predictive coding is a type of supervised machine learning technology that relies on human input to “train” a computer to classify documents. That does not mean attorneys are abdicating their responsibility to review and classify documents during discovery. It means that attorneys can review a fraction of the documents at a fraction of the cost by training the computer system.
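
As an illustration only (not a description of any particular vendor’s product), the supervised workflow described above can be sketched in a few lines of Python using the open-source scikit-learn library: attorneys code a small seed set, a model learns from that input, and the model then scores the remaining documents by predicted relevance. The documents and labels below are hypothetical placeholders.

```python
# Minimal sketch of supervised document classification, the core idea
# behind predictive coding. Documents and labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Seed set: documents an attorney has already reviewed and coded.
seed_docs = [
    "Q3 licensing agreement draft attached for review",
    "lunch menu for the holiday party",
    "termination clause in the supplier contract",
    "fantasy football picks for this week",
]
seed_labels = [1, 0, 1, 0]  # 1 = relevant, 0 = not relevant

# Unreviewed documents that the trained model will rank.
unreviewed_docs = [
    "please see the revised contract terms",
    "parking garage closed for maintenance this weekend",
]

vectorizer = TfidfVectorizer()
model = LogisticRegression()
model.fit(vectorizer.fit_transform(seed_docs), seed_labels)

scores = model.predict_proba(vectorizer.transform(unreviewed_docs))[:, 1]
for doc, score in zip(unreviewed_docs, scores):
    print(f"{score:.2f}  {doc}")  # higher score = more likely relevant
```

In a real matter the training is typically iterative: attorneys review documents the model is least certain about, correct the predictions, and retrain until the results stabilize.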

In recent months, more litigators and judges have begun to understand that there are many kinds of TAR tools to choose from in the litigator’s toolbelt™. Predictive coding is one of the tools that falls under the broader TAR umbrella and is arguably the most important tool in the toolbelt™ if used properly. All of the tools can be helpful; however, TAR tools such as keyword searching, concept searching, clustering, email threading, and de-duplication are not supervised machine learning tools and therefore are not predictive coding tools. The rule of thumb for those being courted by predictive coding providers is caveat emptor. Make sure the providers clarify what they mean when they use terms like TAR, CAR, or predictive coding.

#2 – Momentum is building

In 2013, more and more attorneys and judges are dipping their toes into predictive coding waters – waters that were largely perceived as too frigid to enter only last year. One explanation for the increased usage of predictive coding technologies is the corresponding increase in judicial guidance. At the beginning of 2012, there were no known cases addressing the use of predictive coding technology. Since then, at least six different judges have addressed the use of predictive coding technology (Da Silva Moore v. Publicis Groupe; Kleen Products v. Packaging Corporation of America; Global Aerospace v. Landow Aviation; In re Actos Product Liability Litigation; EORHB v. HOA Holdings; Gabriel Technologies v. Qualcomm). Taken as a whole, the court decisions are either supportive of the technology or remain neutral on the issue. In fact, an order in a new case, In re Biomet, was reported only a few days ago and continues the general trend toward judicial awareness and support of the technology.

In addition to the growing number of judicial opinions, conference attendees are sharing experiences related to the use of these technologies far more than was the case at conferences in 2012. This data point suggests that actual usage far exceeds the number of reported predictive coding cases. Further evidence of this momentum, and possibly of even greater momentum to come, is the discussion about adding comments to the proposed FRCP amendments that would encourage the use of predictive coding technology. Newly proposed amendments to the Federal Rules of Civil Procedure are expected to be published for comment in August, and predictive coding will almost certainly be part of the discussion.

#3 – Skeptics remain

Despite a significant uptick in predictive coding usage since early 2012, the technology is not without skeptics. Those less bullish cite concerns about the multitude of new predictive coding offerings that have recently come onto the market. Most realize that not all predictive coding technologies are created equal and that the vast majority of tools on the market lack transparency. A key concern on this front is the lack of visibility into the underlying statistical methodology that many tools and their providers apply. Since statistics are the backbone of a viable predictive coding process, the lack of transparency into most providers’ statistical methodologies has led some to perceive all predictive coding tools as “black boxes.” In reality, different tools provide different levels of transparency, but a general lack of transparency in the industry has perpetuated a “throw the baby out with the bathwater” mentality in some circles. Rumblings about the applicability of Daubert and/or Rule 702 in vetting these tools and the methodologies they rely upon are likely to gain steam.
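
To make the statistical point concrete, here is a generic sketch (not tied to any provider’s methodology) of one calculation that commonly sits behind a defensible process: how many randomly sampled documents must be reviewed to estimate a rate, such as the share of relevant documents left behind, at a given confidence level and margin of error.

```python
# Generic sample-size calculation for estimating a proportion
# (normal approximation). This illustrates the kind of statistics a
# defensible predictive coding workflow relies on; it is not any
# particular tool's or provider's methodology.
import math

def sample_size(z: float, margin_of_error: float, p: float = 0.5) -> int:
    """Documents to review to estimate a rate within +/- margin_of_error."""
    return math.ceil((z ** 2) * p * (1 - p) / (margin_of_error ** 2))

# 95% confidence corresponds to z ~= 1.96
print(sample_size(1.96, 0.05))  # +/- 5% margin -> 385 documents
print(sample_size(1.96, 0.02))  # +/- 2% margin -> 2,401 documents
```

Transparency, in this context, largely means being able to see and verify numbers like these rather than taking a provider’s word for them.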

The issue of transparency is also a common area of debate in the context of an issue known as the “discard pile.” The discard pile generally refers to documents classified as non-responsive that are used to train the predictive coding solution. The protocol established in Da Silva Moore and other cases requires the producing party to reveal the discard pile to the propounding party as part of the predictive coding training process. Proponents argue that this additional level of cooperation invites scrutiny by both parties that will help ensure that training documents are properly classified. The rationale in support of this approach is that predictive coding tools are garbage-in, garbage-out devices, so improperly classifying training documents will lead to erroneous downstream results. The pushback by producing parties varies, but one common theme predominates and can be summarized as follows: “I will share my non-responsive documents with the other side when they are pried from my cold, dead fingers.”

Conclusion

Although some barriers to widespread predictive coding adoption remain, it is clear that the future of predictive coding is now. Eventually, best practices for using these technologies will rise to the surface and the tools themselves will improve. For example, most tools today require complex statistical calculations to be made manually. That means hiring consultants and/or statisticians to crunch the numbers in order to ensure a defensible process, which increases costs. The tools themselves can also be costly because most providers charge a premium to use predictive coding solutions. However, price pressure is already afoot, and some providers offer their predictive coding technology at no additional cost. In short, despite some early challenges, most of those within the Sedona Bubble believe predictive coding is here to stay.

 

Falling Off The Cliff: Parties Are Still Failing The Proportionality Test

Thursday, March 28th, 2013

One of the great questions that the legal profession and the eDiscovery cognoscenti are grappling with is how best to address the unreasonable costs and burdens associated with the discovery process. This is not a new phenomenon. While accentuated by the information explosion, the courts and rulemakers have been struggling for years to find a solution to this perpetual dilemma.

Proportionality As The Solution

Over the past three decades, the answer to this persistent problem has generally focused on emphasizing proportionality standards. Proportionality – requiring that the benefits of discovery be commensurate with the corresponding burdens – has the potential to be a game-changing concept. If proportionality standards are followed by counsel, clients and the courts, there is a strong possibility that discovery costs and burdens could be made more reasonable. That is perhaps why various courts (at the circuit, district and state levels) throughout the U.S. have implemented rules to highlight proportionality as the touchstone of discovery practice.

These issues were recently spotlighted by United States Magistrate Judge Frank Maas, Lockheed Martin Associate General Counsel Shawn Cheadle and Milberg partner Ariana Tadler at the LegalTech conference in New York City. What is most evident and important from the various video excerpts of their discussion is the panelists’ general agreement that proportionality standards – if followed – can keep a lawsuit from veering off the eDiscovery cliff. These experts, who represent vastly different and conflicting constituencies, emphasized how proportionality and the related concepts of reasonableness and cooperation can lead to quicker and ostensibly cheaper results in litigation.  As Judge Maas makes clear, however, that will only happen with a “change in paradigm and a change in thinking on both sides” of a lawsuit.

Failing The Proportionality Test

Unfortunately, far too many litigants still neglect to follow basic proportionality standards. This troubling trend is confirmed by the court opinions issued seemingly every month in which discovery costs and burdens are increased due to litigants’ failures to engage in proportional discovery. The failure to engage in proportional discovery follows a familiar pattern: overly broad discovery requests are typically met with general objections and evasive responses that unreasonably limit the scope of responsive information. Such requests and responses generally run contrary to the spirit of proportionality.

The “bible” on proportionality law, Mancia v. Mayflower Textile Services Co., provides that discovery requests and their corresponding responses must be reasonable and proportional. To achieve such an objective, the Mancia court urged counsel and clients to “stop and think” about their discovery conduct as mandated by Federal Rule 26(g):

Rule 26(g) imposes an affirmative duty to engage in pretrial discovery in a responsible manner that is consistent with the spirit and purposes of Rules 26 through 37. In addition, Rule 26(g) is designed to curb discovery abuse by explicitly encouraging the imposition of sanctions. The subdivision provides a deterrent to both excessive discovery and evasion by imposing a certification requirement that obliges each attorney to stop and think about the legitimacy of a discovery request, a response thereto, or an objection.

The clear lesson from this example is the negative impact that discovery conduct can have on a case. Instead of engaging in a proportional approach in which the parties cooperatively hammer out (with court assistance, if necessary) the parameters and limitations of discovery, parties frequently adopt a unilateral, “take no prisoners” strategy. Such an approach generally affects the cost and pace of litigation. Instead of addressing the merits of a dispute through dispositive motion practice, the parties and the court are often thrown into distracting and costly collateral eDiscovery litigation. And as the LegalTech panelists made clear, the resulting situation benefits nobody.

Falling Off The Cliff?

The current discovery paradigm is particularly troubling given that many sophisticated litigants who are incentivized to engage in proportional discovery may not be doing so. If that is the case, how can courts realistically expect other less educated parties to do otherwise?

To deter such discovery conduct, courts may need to embark on a proportionality education campaign. However, any such efforts will likely need to include a promise to address noncompliance with sanctions under Federal Rule 26(g). As the courts have made clear, many counsel and clients will likely engage in proportional discovery only under the threat of some real consequence.

To better address these issues, the Civil Rules Advisory Committee is now considering multiple amendments to the Federal Rules of Civil Procedure that would better emphasize proportionality standards. In a recent post, we discussed one such proposal, which would change Rule 37(e) to ensure that courts consider the role of proportionality in connection with parties’ preservation efforts. Another would modify Federal Rule 26(b)(1) to spotlight the limitations that proportionality places on the permissible scope of discovery. Though still far from final, these proposed rule amendments could ultimately advance the objective of reducing the costs and burdens of discovery. Such efforts may very well be necessary if we are to keep the discovery process from falling off the cliff.

South Africa’s Motivation for Information Governance: Privacy, Fraud and the Cloud

Tuesday, March 19th, 2013

On a recent trip to South Africa, where Symantec sponsored an event with PricewaterhouseCoopers (PwC) entitled The Protection of Personal Information (POPI) Drives Information Governance, customers and partners shared important insights. One major concern among the attendees was how they would comply with the newly proposed privacy legislation, which is set to pass any day now.

POPI is the first comprehensive body of law addressing privacy in the country. Personal data is defined as a natural person’s name, date of birth, national identification number, passport number, health or credit information and other personally identifiable information. The bill has eight principles, each of which addresses how data must be collected, stored, processed, secured and expired, and how access to it may be granted. The bill will apply to both public and private organizations and is driving the need for archiving, classification, eDiscovery, and data loss prevention technology.
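
As a purely illustrative sketch of how the classification and data loss prevention tools mentioned above generally work, content can be matched against patterns for personal identifiers. The patterns below are simplified placeholders rather than the actual South African identifier formats.

```python
# Illustrative pattern-based detection of personal data, the basic
# technique behind classification and DLP tools. The patterns are
# simplified placeholders, not the real South African formats.
import re

PATTERNS = {
    "national_id_like": re.compile(r"\b\d{13}\b"),
    "passport_like": re.compile(r"\b[A-Z]{1,2}\d{6,8}\b"),
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan(text):
    """Return the matches found for each personal-data pattern."""
    hits = {name: rx.findall(text) for name, rx in PATTERNS.items()}
    return {name: found for name, found in hits.items() if found}

sample = "Applicant 8001015009087, passport A1234567, contact jane@example.co.za"
print(scan(sample))
```

Real tools add context, validation checksums and policy actions (block, quarantine, alert) on top of this basic matching step.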

Interestingly, the main motivator for purchasing eDiscovery technology will be the need for organizations in Africa to conduct internal investigations to detect fraud. South Africa’s recent POPI legislation was crafted to address the age of digital information and the risks associated with it, but also to instill a level of confidence from the global economy in South Africa as a safe place to do business. A recent survey by Compuscan found that South Africa and Nigeria have the highest number of reported fraud cases in Africa. In addition, fraud-related crimes cost African businesses and governments at least $10.9 billion in 2011-12, and among the 875 reported cases, 40% of the perpetrators were in upper management.

Archiving the email of top management is a recommended best practice to address this fraud because it ensures that there will be a record of electronic communications should an investigation or lawsuit be necessary. Similarly, leveraging in-house eDiscovery and data loss prevention (DLP) technology enables investigators within the organizations to collect and analyze these emails in conjunction with other pertinent information to detect and even prevent fraud. To date, the majority of organizations in South Africa lack this kind of capability because they have not invested in technology.

Because corruption and fraud have been impediments to doing business in South Africa in the past, businesses and the government are taking steps to address these issues. Having the ability to conduct internal investigations will be a huge advantage for organizations looking to gain control over their information and over those who commit fraud. PwC Partner Kris Budnik noted at the conference, “Many times when clients call me for an emergency forensic investigation, about 50% of the time in South Africa I cannot help them. The reason for this is that the clients are not keeping the appropriate information governance systems in place and not keeping log files. Many times when we go to collect evidence, none is there because it has truly been overwritten in the data environment due to poor information governance practices.”

Litigation does not appear to be the biggest driver for purchasing eDiscovery technologies and implementing workflows, as one might expect. The reason for this is unclear, but it may be related to a less aggressive litigation profile compared to that of the U.S. Much of the discovery in South Africa that involves electronically stored information is printed, reviewed and produced in paper format. The concern over retaining relevant metadata and reviewing and producing data in the format in which it was originally created does not seem to be top of mind for litigators.

Litigators in South Africa are not taking advantage of the rich information in metadata to supplement their cases or to challenge opposing counsel’s claims and productions. Also of concern is the inability to deduplicate and sort data once metadata is removed. The most likely reason is that there have not yet been enough cases in which the lack of metadata has been challenged. With time, and as cross-border litigation increases, there will be more demand for eDiscovery technology in the traditional legal context.
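
The deduplication point is worth unpacking. Email deduplication typically works by hashing a stable set of metadata fields together with the message body, so stripping metadata (for example, by printing to paper) removes exactly the inputs the hash needs. A minimal sketch of that approach, with a hypothetical choice of fields:

```python
# Sketch of metadata-based email deduplication. Which fields feed the
# hash varies by tool; the fields and messages below are hypothetical.
import hashlib

def dedupe_key(msg):
    """Hash of normalized metadata plus body; duplicates collide."""
    parts = [
        msg.get("from", "").strip().lower(),
        msg.get("to", "").strip().lower(),
        msg.get("date", "").strip(),
        msg.get("subject", "").strip().lower(),
        msg.get("body", "").strip(),
    ]
    return hashlib.sha256("\x1f".join(parts).encode("utf-8")).hexdigest()

messages = [
    {"from": "A@x.com", "to": "b@y.com", "date": "2013-03-01",
     "subject": "Re: contract", "body": "See attached."},
    {"from": "a@x.com", "to": "b@y.com", "date": "2013-03-01",
     "subject": "Re: Contract", "body": "See attached."},  # duplicate copy
]

unique = {dedupe_key(m): m for m in messages}
print(len(unique))  # -> 1; the second copy is suppressed
```

Without those metadata fields, the same two copies can no longer be recognized as one message, which is why paper-based productions defeat deduplication.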

The increase in privacy concerns and internal fraud investigations presents a compelling reason for businesses in South Africa to invest in archiving, eDiscovery, and DLP technologies. Many organizations are moving data to the cloud to meet POPI-related objectives faster, and because outsourcing infrastructure is very attractive to organizations that don’t want to own the responsibility of managing their information on premises. The main business drivers for cloud archiving in South Africa are email continuity, cost and compliance.

It is interesting to observe how different countries and economies respond to technology and what drives use cases. The legal frameworks in each jurisdiction around the world vary, but the great equalizer will be technology. This is because whether it is privacy, litigation or fraud driving the information governance plan, the technology is the same.

Check out this article for more information on privacy legislation in South Africa.

Available soon: please visit our eDiscovery passport page for more information on the legal system, eDiscovery, privacy and data protection in South Africa and other countries.

 

Would Rule Changes Alleviate eDiscovery Burdens?

Wednesday, February 6th, 2013

You have heard this one before. Changes to the Federal Rules are in the works that could alleviate the eDiscovery burdens of organizations. Greeting this news with skepticism would probably be justified. After all, many feel that the last set of amendments failed to live up to the hype of streamlining the discovery process and making litigation costs more reasonable. Others, while not declaring the revised Rules a failure, nonetheless believe that the amendments have been doomed by the lack of adherence among counsel and the courts. Regardless of the differing perspectives, there seems to be agreement on both sides that the Rules have spawned more collateral disputes than ever before about the preservation and collection of ESI.

What is different this time is that the latest set of proposed amendments could offer a genuine opportunity for organizations to slash the costs of document preservation and collection. Chief among these changes would be a revised Rule 37(e). The current iteration of the rule is designed to protect companies from court sanctions when the programmed operation of their computer systems automatically destroys ESI. Nevertheless, the rule has largely proved ineffective as a national standard because it does not apply to pre-litigation information destruction activities. As a result, courts have often bypassed the rule’s protections to punish companies that negligently, though not nefariously, destroyed documents before a lawsuit was filed.

The current proposal to amend Rule 37(e) (see page 127) would substantially broaden the existing protection against sanctions. The proposal would shield an organization’s pre-litigation destruction of information from sanctions except where that destruction was “willful or in bad faith and caused substantial prejudice in the litigation” or “irreparably deprived a party of any meaningful opportunity to present a claim or defense.”

In making a determination on this issue, courts would be forced to examine the enterprise’s information retention protocols through more than just the lens of litigation. Instead, they would have to consider the nature and motives behind a company’s decision-making process. Such factors include:

  • The extent to which the party was on notice that litigation was likely
  • The reasonableness and proportionality of the party’s efforts to preserve the information
  • The nature and scope of any request received to preserve information
  • Whether the party sought timely judicial guidance regarding any preservation disputes

By seeking to punish only nefarious conduct and by ensuring that the analysis includes a broad range of considerations, organizations could finally have a fighting chance to reduce the costs and risks of preservation.

Despite the promise this proposal holds, there is concern among some of the eDiscovery cognoscenti that provisions in the draft proposal to amend Rule 37(e) could water down its intended protections. Robert Owen, a partner at Sutherland Asbill & Brennan LLP and a prominent eDiscovery thought leader, has recently authored an insightful article that spotlights some of these issues. Among other things, Owen points out that the “irreparably deprived” provision could end up diluting the “bad faith” standard. This could ultimately provide activist jurists with an opportunity to re-introduce a negligence standard through the backdoor, which would be a troubling development for clients, counsel and the courts.

These issues and others confirm the difficulty of establishing national standards to address the factual complexities of many eDiscovery issues. They also point to the difficult path that the Civil Rules Advisory Committee still must travel before a draft of Rule 37(e) can be finalized for public comment. Even assuming that stage can be reached after the next rules committee meeting in April 2013, additional changes could still be forthcoming to address the concerns of other constituencies. Stay tuned; the debate over revisions to Rule 37(e) and its impact on organizations’ defensible deletion efforts is far from over.

LegalTech Plenary 2013: Symantec Mediates the eDiscovery Debate of the Year

Thursday, January 10th, 2013

The eDiscovery frenzy that has gripped the American legal system over the past decade has become increasingly expensive. Particularly costly to both clients and the courts is the process of preserving and reviewing ESI. As a solution to these costs, many are emphasizing the concept of “proportionality.” Proportionality typically requires that the benefits of discovery be commensurate with its corresponding burdens.

Despite nearly universal agreement that eDiscovery should be governed by proportionality standards, there remains a polarizing debate that threatens to curtail the impact of proportionality. That debate is centered on disagreements over the scope of ESI preservation, the standard for permissible discovery and the use of cutting edge review technologies like predictive coding.

To better understand these issues and to explore feasible solutions, Philip Favro, Discovery Counsel at Symantec, will lead a lively discussion at LegalTech New York among industry leaders such as U.S. Magistrate Judge Frank Maas, Ariana Tadler of Milberg LLP and Shawn Cheadle, General Counsel (Military Space) at Lockheed Martin Space Systems Co. The panelists will take stances on either side of difficult questions like:

  • Should proportionality standards apply to the preservation of ESI to help address the high costs of retaining so much data?
  • Will the proportionality rule ever be used to rein in lawyers and judges that have distorted the standard of discovery from reasonableness to perfection?
  • Can predictive coding facilitate proportional discovery when lawyers are unwilling to share their training set of documents?

While our expert panelists are well-versed in both sides of the proportionality debate, we had a little fun imagining what they might be going through before they take the stage on Tuesday, January 29th.  Watch this video to get an exclusive behind-the-scenes look into the LTNY Locker Room.

In addition, don’t miss our microsite for the complete plenary session description and a look at Symantec’s LTNY 2013 presence. We hope you stay tuned to eDiscovery 2.0 from now until the show to hear what Symantec has planned for the supersessions, our special event, contest giveaways and product announcements.

Breaking News: Bad Faith Retention Policy Leads to Terminating Sanctions

Friday, January 4th, 2013

The patent infringement litigation involving chipmaker Rambus took another twist this week as the court in Micron Technology v. Rambus declared several Rambus patents to be unenforceable as an eDiscovery sanction for its destruction of evidence. In a crushing blow to Rambus’ dynamic random access memory (DRAM) chips litigation strategy, the court reasoned that such a “dispositive sanction” was the only remedy that could address the chipmaker’s “bad faith” spoliation of email backup tapes, paper documents and other ESI.

At the heart of the Micron court’s sanctions order was its finding that Rambus implemented its information retention policy in bad faith. While acknowledging that retention policies may be “employed for legitimate business reasons such as general house-keeping,” the court found that the policies at issue were designed to deprive the chipmaker’s litigation adversaries of evidence that could impugn its patents. Furthermore, Rambus deviated from its policies to ensure that evidence favorable to its claims would be preserved. For example, after ordering the destruction of 1,270 email back-up tapes pursuant to its retention schedule, the chipmaker intervened to save a lone back-up tape after determining that “it contained data that could be used … to establish a conception date for an invention.” Such selective use of its retention policies belied Rambus’ contention that the policies were neutral and conclusively showed that the policies were tactically deployed to “seek an advantage in litigation.”

While the Micron court’s sanctions order will likely set up another round of appeals before the Federal Circuit, the lesson to be learned by organizations is the importance of developing a reasonable information retention policy. Indeed, had Rambus followed good faith business procedures within the parameters recently delineated by the Federal Circuit, it is unlikely that the destruction would have been seen as spoliation. The Micron ruling should not affect the current judicial trend that absent a preservation duty or other exceptional circumstances, organizations may use document retention protocols to destroy stockpiles of data that have no meaningful business value.
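
As a purely illustrative sketch of what a good-faith, neutral retention process can look like in operation, automated expiration typically checks every item against active legal holds before anything is deleted. The data model and field names below are hypothetical.

```python
# Simplified sketch of good-faith automated retention: items past their
# retention period are deleted only if no legal hold applies. The
# retention period, holds and items here are hypothetical examples.
from datetime import date, timedelta

RETENTION = timedelta(days=3 * 365)             # e.g., a three-year retention period
LEGAL_HOLDS = {"custodian-17", "custodian-42"}  # custodians under legal hold

def should_delete(item, today):
    """Delete only if the item is past retention and not under a hold."""
    expired = today - item["created"] > RETENTION
    on_hold = item["custodian"] in LEGAL_HOLDS
    return expired and not on_hold

items = [
    {"id": "tape-001", "created": date(2008, 5, 1), "custodian": "custodian-17"},
    {"id": "tape-002", "created": date(2008, 5, 1), "custodian": "custodian-99"},
]

today = date(2013, 1, 4)
for item in items:
    action = "delete" if should_delete(item, today) else "retain"
    print(item["id"], action)
# tape-001 is retained because its custodian is on hold; tape-002 expires.
```

What drew sanctions in Micron was, in effect, the opposite pattern: a policy applied selectively, with preservation triggered only for evidence favorable to Rambus.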

Legal Tech 2013 Sessions: Symantec explores eDiscovery beyond the EDRM

Wednesday, December 19th, 2012

Having previously predicted the ‘happenings-to-be’ as well as recommended the ‘what not to do’ at LegalTech New York, the veteran LTNY team here at Symantec has decided to build anticipation for the 2013 event via a video series starring the LTNY un-baptized associate.  Get introduced to our eDiscovery-challenged protagonist in the first of our videos (above).

As for this year’s show, we’re pleased to expand our presence and are very excited to introduce eDiscovery without limits, along with a LegalTech that promises sessions, social events and opportunities for attendees in the same vein. In regard to the first aspect – the sessions – Symantec’s team of eDiscovery counsel will moderate panel sessions on topics ranging across and beyond the EDRM. Joined by distinguished industry representatives, they’ll push the discussion deeper in five sessions, with up to six hours of CLE credit offered to attendees.

Matt Nelson, resident author of Predictive Coding for Dummies, will moderate “How good is your predictive coding poker face?”, where panelists tackle the recently controversial subjects of disclosing the use of predictive coding technology, statistical sampling and the production of training sets to the opposition.

Allison Walton will moderate, “eDiscovery in 3D: The New Generation of Early Case Assessment Techniques” where panelists will enlighten the crowd on taking ECA upstream into the information creation and retention stages and implementing an executable information governance workflow.  Allison will also moderate “You’re Doing it Wrong!!! How To Avoid Discovery Sanctions Due to a Flawed Legal Hold Process” where panelists recommend best practices towards a defensible legal hold process in light of potential changes in the FRCP and increased judicial scrutiny of preservation efforts.

Phil Favro will moderate “Protecting Your ESI Blindside: Why a “Defensible Deletion” Offense is the Best eDiscovery Defense” where panelists debate the viability of defensible deletion in the enterprise, the related court decisions to consider and quantifying the ROI to support a deletion strategy.

Chris Talbott will moderate a session on “Bringing eDiscovery back to Basics with the Clearwell eDiscovery Platform”, where engineer Anna Simpson will demonstrate Clearwell technology in the context of our panelists’ everyday use on cases ranging from FCPA inquiries to IP litigation.

Please browse our microsite for complete supersession descriptions and a look at Symantec’s LTNY 2013 presence.  We hope you stay tuned to eDiscovery 2.0 throughout January to hear what Symantec has planned for the plenary session, our special event, contest giveaways and product announcements.

Symantec Positioned Highest in Execution and Vision in Gartner Archiving MQ

Tuesday, December 18th, 2012

Once again Gartner has named Symantec as a leader in the Enterprise Information Archiving Magic Quadrant. We’ve continued to invest significantly in this market, and it is gratifying to see the recognition for the continued effort we put into archiving both in the cloud and on premises with our Enterprise Vault.cloud and Enterprise Vault products. Symantec has now been rated a leader nine years in a row.

[Gartner Magic Quadrant for Enterprise Information Archiving graphic]

This graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request from Symantec.

Gartner does not endorse any vendor, product or service depicted in the Magic Quadrant, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

This year marks a transition in a couple of regards. We are seeing an acceleration of customers looking for the convenience and simplicity of SaaS-based archiving solutions. The caveat is that they want the security and trust that only a vendor like Symantec can deliver.

Similarly, the market has continued to ask for integrated solutions that deliver information archiving and eDiscovery to quickly address the often complex and time-sensitive process of litigation and regulatory requests. The deep integration we offer between our archiving solutions – Enterprise Vault and Enterprise Vault.cloud – and the Clearwell eDiscovery Platform has led many customers to deploy them together to streamline their eDiscovery workflow.

An archive is inherently deployed with the long term in mind.  Over the history of Gartner’s Enterprise Information Archiving MQ, only Symantec has provided a consistent solution to customers by investing and innovating with Enterprise Vault to lead the industry in performance, functionality, and support without painful migrations or changes. 

We’re excited about what we have planned next for Enterprise Vault and Enterprise Vault.cloud and intend to maintain our leadership in the years to come. Our customers will continue to be able to manage their critical information assets and meet their needs for eDiscovery and Information Governance as we improve our products year after year.

December Symantec SharePoint Governance Twitter Chat

Thursday, December 13th, 2012

Join hashtag #IGChat and learn about SharePoint governance and creating effective governance plans

Over the years, SharePoint has become a favorite among organizations as a place to share and manage content. As SharePoint adoption increases, storage, performance and ongoing maintenance become major challenges, and SharePoint governance becomes essential. Archiving and eDiscovery solutions play a key part in any effective and lasting governance strategy for SharePoint.

A 2012 survey conducted by Osterman Research found that 39 percent of all SharePoint implementations still don’t have a governance plan, largely because implementing governance plans can be difficult.

During this Twitter Chat we will discuss the reasons why organizations need SharePoint governance and the role of archiving and eDiscovery in governance plans. Please join Symantec’s archiving/eDiscovery and SharePoint experts, Dave Scott (@DScottyt) and Rob Mossi (@RMossi24) next Tuesday, December 18 at 10 am PT to chat.

Dave Scott: Dave Scott is a Group Product Manager at Symantec specializing in social media and SharePoint archiving and eDiscovery. He has contributed articles to a number of leading industry publications and is a frequent contributor to Connect.symantec.com. 

Rob Mossi: Rob Mossi is a Sr. Product Marketing Manager with Symantec’s Enterprise Vault product team. With a focus on SharePoint, Rob actively participates in SharePoint archiving and information governance thought leadership activities, including research, conferences and social media. 

 Twitter Chat: SharePoint Governance #IGChat

 Date: Tuesday, December 18, 2012

 Time: 10 am PT

 Length: 1 hour

 Where: Twitter – follow the hashtag #IGChat

 Moderator: Symantec’s Dave Scott (@DScottyt)