
Posts Tagged ‘cloud computing’

Moving Data to the Cloud? Top 5 Tips for Corporate Legal Departments

Monday, September 30th, 2013

One of the hottest information technology (IT) trends is to move data once stored within the corporate firewall into a hosted cloud environment managed by third-party providers. In 2013 alone, the public cloud services market is forecast to grow an astonishing 18.5 percent to $131 billion worldwide, up from $111 billion in 2012. The trend is driven largely by the fact that labor, infrastructure, and software costs can be reduced by sending email and other data to third-party providers for off-site hosting. Although the benefits of cloud computing are real, many organizations make the decision to move to the cloud without thoroughly weighing all the risks and benefits first.

A common problem is that many corporate IT departments fail to consult their legal department before deciding to move company data into the cloud, even though the decision may have legal consequences. For example, retrieving information from the cloud in response to eDiscovery requests presents unique challenges that could cause delay and increase costs. Similarly, problems related to data access and even ownership could arise if the cloud provider merges with another company, goes bankrupt, or decides to change its terms of service.

The bad news is that the list of possible challenges is long when data is stored in the cloud. The good news is that many of the risks of moving important company data to the cloud are foreseeable and can be mitigated by negotiating terms with prospective cloud providers in advance. Although not comprehensive, the following highlights some of the key areas attorneys should consider before agreeing to store company data with third-party cloud providers.

1. Who Owns the Data?

The cloud market is hot right now with no immediate end in sight. That means competition is likely for years to come and counsel must be prepared for some market volatility. At a high level, delineating the “customer” as the “data owner” in agreements with cloud service providers is a must. More specifically, terms that address what happens in the event of a merger, bankruptcy, divestiture or any event that leads to the termination or alteration of the relationship should be clearly outlined. Identifying the customer as the data owner and preserving the right to retrieve data within a reasonable time and for a reasonable price will help prevent company data from being used as a bargaining chip if there is a disagreement or change in ownership.

2. eDiscovery Response Times and Capabilities

Many businesses know first-hand that the costs and burdens of complying with eDiscovery requests are significant. In fact, a recent RAND study estimates that every gigabyte of data reviewed costs approximately $18,000. Surprisingly, many businesses do not realize that storing data in the cloud with the wrong provider could increase the risks and costs of eDiscovery significantly. Risks include the possibility of sanctions for overlooking information that should have been produced or failing to produce information in a timely fashion. Costs could be exacerbated by storing data with a cloud provider that lacks the resources or technology to respond to eDiscovery requests efficiently.

That means counsel must understand whether the provider has the technology to preserve, collect, and produce copies of data stored in the cloud. If so, is the search technology accurate and thorough? What are the time frames for responding to requests, and are there surcharges for expedited versus non-expedited requests for data? Also, is data collected in a forensically sound manner, and is the chain of custody recorded to validate the integrity and reasonableness of the process in the event of legal challenges? More than one cloud customer has encountered excessive fees, unacceptable timelines, and mass confusion when relying on cloud providers to help respond to eDiscovery requests. Avoid surprises by negotiating acceptable terms before, not after, moving company data to the cloud.

3. Where is Data Physically Located?

Knowing where your data is physically located is important because there could be legal consequences. For example, several jurisdictions have their own unique privacy laws that may impact where employee data can be physically located, how it must be stored, and how it can be used. This can result in conflicts where data stored in the cloud is subject to discovery in one jurisdiction, but disclosure is prohibited by the laws of the jurisdiction where the data is stored. Failure to comply with these foreign laws, sometimes known as blocking statutes, could result in penalties and challenges that might not be circumvented by choice of law provisions. That means knowing where your data will be stored and understanding the applicable laws governing data privacy in that jurisdiction is critical.

4. Retention, Backup, & Security

Any good data retention program requires systematically deleting information that the business has no legal or business need to retain. Keeping information longer than necessary increases long-term storage costs and increases the amount of information the organization must search in the event of future litigation. The cloud provider should have the ability to automate the archiving, retention, and disposition of information in accordance with the customer’s preferred policies, as well as the ability to suspend any automated deletion policies during litigation.
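To make the disposition mechanics concrete, here is a minimal sketch of how such an automated check might work. The categories, retention periods, and function names are hypothetical illustrations, not any provider’s actual API.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule; real periods come from the customer's policy.
RETENTION_PERIODS = {
    "email": timedelta(days=3 * 365),
    "contract": timedelta(days=7 * 365),
    "marketing": timedelta(days=365),
}

def eligible_for_disposition(category, created, on_legal_hold, now=None):
    """Return True only if the item has aged out AND is not on legal hold."""
    now = now or datetime.now(timezone.utc)
    if on_legal_hold:
        return False                 # an active hold always suspends deletion
    period = RETENTION_PERIODS.get(category)
    if period is None:
        return False                 # unknown categories retained by default
    return now - created > period

old_email = datetime(2010, 1, 1, tzinfo=timezone.utc)
print(eligible_for_disposition("email", old_email, on_legal_hold=False))  # True
print(eligible_for_disposition("email", old_email, on_legal_hold=True))   # False
```

The key design point is that the hold check comes first: no retention rule, however aggressive, should be able to delete data subject to litigation.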

Similarly, where and how information is backed up and secured is critical. For many organizations, merely losing access to email for a few hours brings productivity to a screeching halt. Actually losing the company email due to technical problems and/or as the result of insufficient backup technology could cripple some companies indefinitely. Likewise, losing confidential customer data, research and development plans, or other sensitive information due to the lack of adequate data security and encryption technology could result in legal penalties and/or the loss of important intellectual property. Understanding how your data is backed up and secured is critical to choosing a cloud provider. Equally important is determining the consequences of and the process for handling data breaches, losses, and downtime if something goes wrong.

5. Responding to Subpoenas and Third-party Data Requests

Sometimes it’s not just criminals who try to take your data out of the cloud; litigants and investigators might also request information directly from cloud providers without your knowledge or consent. Obviously companies have a vested interest in vetting the reasonableness and legality of any data requests coming from third parties. That interest does not change when data is stored in the cloud. Knowing how your cloud provider will respond to third-party requests for data and obtaining written assurances that you will be notified of requests as appropriate is critical to protecting intellectual property and defending against bogus claims. Furthermore, making sure data in the cloud is encrypted may provide an added safeguard if data somehow slips through the cracks without your permission.
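One way to implement that encryption safeguard is on the client side, so data is encrypted before it ever reaches the provider and a copy handed to a third party without the key is unreadable. Below is a minimal sketch using the Python cryptography package’s Fernet recipe; the upload call is a placeholder, and key management is assumed to stay in the data owner’s own vault.

```python
from cryptography.fernet import Fernet

# The key stays with the data owner -- never with the cloud provider.
key = Fernet.generate_key()        # in practice, store this in a key vault
cipher = Fernet(key)

document = b"Confidential: Q3 acquisition strategy"
ciphertext = cipher.encrypt(document)

# upload_to_cloud(ciphertext)     # placeholder for the provider's upload call

# A third party who obtains only the ciphertext learns nothing;
# the key holder alone can recover the plaintext.
assert cipher.decrypt(ciphertext) == document
```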

Conclusion

Today’s era of technological innovation moves at lightning speed and cloud computing technology is no exception. This fast-paced environment sometimes results in organizations making the important decision to move data to the cloud without properly assessing all the potential risks. To minimize these risks, organizations should consider the five tips above and consult with legal counsel before moving data to the cloud.

What Ocean’s Eleven and Judge Kozinski Can Teach Organizations About Confidentiality

Friday, April 26th, 2013

Confidentiality in the digital age is certainly an elusive concept. As more organizations turn to social networking sites, cloud computing, and bring your own device (BYOD) policies to facilitate commercial enterprise, they are finding that such innovations could provide unwanted visibility into their business operations. Indeed, technology has seemingly placed confidential corporate information at the fingertips of third parties. This phenomenon, in which some third party could be examining your trade secrets, revenue streams and attorney-client communications, brings to mind an iconic colloquy from the movie Ocean’s Eleven between Tess (Julia Roberts) and Terry Benedict (Andy Garcia). Tess caustically reminds the guarded casino magnate: “You of all people should know, Terry, in your hotel, there’s always someone watching.”

That someone could always be “watching” proprietary company information was recently alluded to by Chief Judge Alex Kozinski of the Ninth Circuit Court of Appeals. Speaking on the related topic of data privacy at a panel sponsored by The Recorder, Judge Kozinski explained that technological advances enabling third-party access to much of the data transmitted through or stored in cyberspace seemingly remove the veneer of confidentiality from that information. Excerpts from Judge Kozinski’s videotaped remarks can be viewed here.

That technological innovations could provide third parties with access to proprietary information is certainly problematic as companies incorporate social networking sites, cloud computing and BYOD into more aspects of their operations. Without appropriate safeguards, the use of such innovations could jeopardize the confidentiality of proprietary company information.

For example, content that corporate employees exchange on social networking sites could be accessed and monitored by site representatives under the governing terms of service. While those terms typically provide privacy settings that would allow corporate employees to limit the extent to which information may be disseminated, they also notify those same users that site representatives may access their communications. Though the justification for such access varies from site to site, the terms generally delineate the lack of confidentiality associated with user communications. This includes ostensibly private communications sent through the direct messaging features available on social networks like LinkedIn, Twitter, MySpace, Facebook and Reddit.

In like manner, providers of cloud computing services often have access and monitoring rights vis-à-vis a company’s cloud hosted data. Memorialized in service level agreements, those rights may allow provider representatives to access, review or even block transmissions of company data to and from the cloud. Provider access may, in turn, destroy the confidentiality required to preserve the character of company trade secrets or maintain the privileged status of communications with counsel.

BYOD also presents a difficult challenge for preserving the confidentiality of company data. This is due to the lack of corporate control that BYOD has introduced into a company’s information ecosystem. Unless appropriate safeguards are deployed, employees may unwittingly disclose proprietary information to third parties by using personal cloud services to store or transmit company data. In addition, family, friends or even strangers who have access to the employee’s device could retrieve such information.

Given the confluence of the above referenced factors, the question becomes what steps an organization can take to preserve the confidentiality of its information. On the social network front, a company could deploy an on-site social network that provides a secure environment for its employees to communicate about internal corporate matters. Conceptually similar to private clouds that house data behind the company firewall, an on-site network could be jointly developed with a third-party provider to ensure specific levels of confidentiality.

An enterprise considering cloud computing for its ESI storage needs should require that the cloud service provider offer measures to preserve the confidentiality of trade secrets and privileged messages. That may include specific confidentiality terms or a separate confidentiality agreement. In addition, the provider should probably offer certain encryption functionality to better preserve confidentiality. By so doing, the enterprise can better satisfy itself that it has taken appropriate measures to ensure the confidentiality of its data.

To address the confidentiality problems associated with BYOD, a company should prepare a cogent policy and deploy technologies that facilitate employee compliance. Such a policy would discourage workers from using personal cloud storage providers to facilitate data transfers or for ESI storage. It would also delineate the parameters of access to employee devices by the employee’s family, friends, or others. To make such a policy more effective, employers will need to develop a technological architecture that reasonably supports conformity with the policy.

By developing cogent and reasonable policies, training employees and deploying effective, enabling technologies, organizations can better prevent unauthorized disclosures of confidential information. Only by adopting such professionally recognized best practices can companies hope to shield their proprietary data from the prying eyes of third parties in the digital age.

Morton’s Fork, Oil Filters and the Nexus with Information Governance

Thursday, May 10th, 2012

Those old enough to have watched TV in the early eighties will undoubtedly remember the FRAM oil filter slogan where the mechanic utters his iconic catchphrase: “You can pay me now, or pay me later.”  The gist of the vintage ad was that the customer could either pay a small sum now for the replacement of an oil filter, or a far greater sum later for the replacement of the car’s entire engine.

This choice between two unpleasant alternatives is sometimes called a Morton’s Fork (but typically only when both choices are equal in difficulty).  The saying (not to be confused with the equally colorful Hobson’s Choice) apparently originated with the collection of taxes by John Morton (the Archbishop of Canterbury) in the late 15th century.  Morton was apparently fond of saying that a man living modestly must be saving money and could therefore afford to pay taxes, whereas if he was living extravagantly then he was obviously rich and could still afford them.[i]

This “pay me now/pay me later” scenario perplexes many of today’s organizations as they try to effectively govern (i.e., understand, discover and retain) electronically stored information (ESI).  The challenge is similar to the oil filter conundrum, in that companies can often make rather modest up-front retention/deletion decisions that help prevent monumental, downstream eDiscovery charges.

This exponential gap has been illustrated recently by a number of surveys contrasting the cost of storage with the cost of conducting basic eDiscovery tasks (such as preservation, collection, processing, review and production).  In a recent AIIM webcast it was noted that “it costs about 20¢/day to buy 1GB of storage, but it costs around $3,500 to review that same gigabyte of storage.” And, it turns out that the $3,500 review estimate (which sounds prohibitively expensive, particularly at scale) may actually be on the conservative side.  While the review phase accounts for roughly 70 percent of total eDiscovery costs, the remaining 30 percent comprises upstream costs for preservation, collection and processing.

Similarly, in a recent Enterprise Strategy Group (ESG) paper the authors noted that eDiscovery costs range anywhere from $5,000 to $30,000 per gigabyte, citing the Minnesota Journal of Law, Science & Technology.  This $30,000 figure is also roughly in line with other per-gigabyte eDiscovery costs, according to a recent survey by the RAND Corporation.  In an article entitled “Where the Money Goes — Understanding Litigant Expenditures for Producing Electronic Discovery” authors Nicholas M. Pace and Laura Zakaras conducted an extensive analysis and concluded that “… the total costs per gigabyte reviewed were generally around $18,000, with the first and third quartiles in the 35 cases with complete information at $12,000 and $30,000, respectively.”

Given this range of estimates, the $18,000 per gigabyte metric is probably a good midpoint figure that advocates of information governance can use to contrast with the exponentially lower baseline costs of buying and maintaining storage.  It is this stark (and startling) gap between pure information costs and the expenses of eDiscovery that shows how important it is to calculate latent “information risk.”  If you also add in the risks for sanctions due to spoliation, the true (albeit still murky) information risk portrait comes into focus.  It is this calculation that is missing when legal goes to bat to argue about the necessity of information governance solutions, particularly when faced with the host of typical objections (“storage is cheap” … “keep everything” … “there’s no ROI for proactive information governance programs”).
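A back-of-the-envelope calculation makes the gap vivid. The sketch below treats the AIIM figure as roughly 20 cents per gigabyte of storage and uses the RAND $18,000-per-gigabyte review midpoint; both are rough assumptions, and the 10 TB scenario is purely illustrative.

```python
storage_cost_per_gb = 0.20      # dollars to buy one gigabyte (AIIM estimate)
review_cost_per_gb = 18_000     # RAND midpoint to review that same gigabyte

ratio = review_cost_per_gb / storage_cost_per_gb
print(f"Reviewing a gigabyte costs ~{ratio:,.0f}x more than buying it.")
# Reviewing a gigabyte costs ~90,000x more than buying it.

# Latent exposure if 10 TB of never-expired data were swept into discovery:
exposure = 10 * 1_000 * review_cost_per_gb
print(f"Potential review exposure on 10 TB: ${exposure:,}")
# Potential review exposure on 10 TB: $180,000,000
```

Only a fraction of stored data typically ends up being reviewed, so the 10 TB line is an upper bound; even heavily discounted, though, the gap dwarfs any savings from cheap storage.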

The good news is that as the eDiscovery market continues to evolve, practitioners (legal and IT alike) will come to a better and more holistic understanding of the latent information risk costs that the unchecked proliferation of data causes.  It will be this increased level of transparency that permits the budding information governance trend to become a dominant umbrella concept that unites Legal and IT.



[i] Insert your own current political joke here…

Look Before You Leap! Avoiding Pitfalls When Moving eDiscovery to the Cloud

Monday, May 7th, 2012

It’s no surprise that the eDiscovery frenzy gripping the American legal system over the past decade has become increasingly expensive.  Particularly costly to organizations is the process of preserving and collecting documents, a fact repeatedly emphasized by the Advisory Committee in its report regarding the 2006 amendments to the Federal Rules of Civil Procedure (FRCP).  These aspects of discovery are often lengthy and can be disruptive to business operations.  Just as troubling, they increase the duration and expense of litigation.

Because these costs and delays affect the courts as well as clients, it comes as no surprise that judges have now heightened their expectation for how organizations store, manage and discover their electronically stored information (ESI).  Gone are the days when enterprises could plead ignorance for not preserving or producing their data in an efficient, cost effective and defensible manner.  Organizations must now follow best practices – both during and before litigation – if they are to safely navigate the stormy seas of eDiscovery.

The importance of deploying such practices applies acutely to those organizations that are exploring “cloud”-based alternatives to traditional methods for preserving and producing electronic information.  Under the right circumstances, the cloud may represent a fantastic opportunity to streamline the eDiscovery process for an organization.  Yet it could also turn into a dangerous liaison if the cloud offering is not properly scrutinized for basic eDiscovery functionality.  Indeed, the City of Los Angeles’s recent decision to partially disengage from its cloud service provider exemplifies this admonition to “look before you leap” to the cloud.  Thus, before selecting a cloud provider for eDiscovery, organizations should be particularly careful to ensure that a provider has the ability both to efficiently retrieve data from the cloud and to issue litigation hold notices.

Effective Data Retrieval Requires Efficient Data Storage

The hype surrounding the cloud has generally focused on the opportunity for cheap and unlimited storage of information.  Storage, however, is only one of many factors to consider in selecting a cloud-based eDiscovery solution.  To be able to meet the heightened expectations of courts and regulatory bodies, organizations must have the actual – not theoretical – ability to retrieve their data in real time.  Otherwise, they may not be able to satisfy eDiscovery requests from courts or regulatory bodies, let alone the day-to-day demands of their operations.

A key step to retrieving company data in a timely manner is to first confirm whether the cloud offering can intelligently organize that information such that organizations can quickly respond to discovery requests and other legal demands.  This includes the capacity to implement and observe company retention protocols.  Just like traditional data archiving software, the cloud must enable automated retention rules and thus limit the retention of information to a designated time period.  This will enable data to be expired once it reaches the end of that period.
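In practice, this means the archive runs a scheduled expiry sweep along the lines of the sketch below. The data model is invented for illustration, and a real archive would also check hold status before deleting anything, as discussed later in this post.

```python
from datetime import datetime, timezone

def expired_items(items, retention_days, now=None):
    """Yield IDs of archived items whose age exceeds the retention window.

    `items` is an iterable of (item_id, ingested_at) pairs with UTC timestamps.
    """
    now = now or datetime.now(timezone.utc)
    for item_id, ingested_at in items:
        if (now - ingested_at).days > retention_days:
            yield item_id

archive = [
    ("msg-001", datetime(2011, 1, 5, tzinfo=timezone.utc)),
    ("msg-002", datetime(2012, 4, 1, tzinfo=timezone.utc)),
]
print(list(expired_items(archive, retention_days=365)))  # output depends on run date
```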

The pool of data can be further decreased through single instance storage.  This deduplication technology eliminates redundant data by preserving only a master copy of each document placed into the cloud.  This will reduce the amount of data that needs to be identified, preserved, collected and reviewed as part of any discovery process.  For while unlimited data storage may seem ideal now, reviewing unlimited amounts of data will quickly become a logistical and costly nightmare.
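Single instance storage is typically implemented with content hashing: each document’s bytes are fingerprinted, and only unseen fingerprints are stored. A minimal sketch, with an in-memory dict standing in for the cloud repository:

```python
import hashlib

store = {}          # hash -> document bytes (stands in for cloud storage)
references = {}     # document name -> hash (each copy keeps a cheap pointer)

def ingest(name, data):
    """Store `data` once; later duplicates only add a reference."""
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:
        store[digest] = data       # first (master) copy
    references[name] = digest      # every duplicate is just a pointer

ingest("q3_report_v1.docx", b"quarterly results ...")
ingest("q3_report_copy.docx", b"quarterly results ...")  # duplicate content
assert len(store) == 1 and len(references) == 2
```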

Any viable cloud offering should also have the ability to suspend automated document retention/deletion rules to ensure the adequate preservation of relevant information.  This goes beyond placing a hold on archival data in the cloud.  It requires that an organization have the ability to identify the data sources in the cloud that may contain relevant information and then modify aspects of its retention policies to ensure that cloud-stored data is retained for eDiscovery.  Taking this step will enable an organization to create a defensible document retention strategy and be protected from court sanctions under the Federal Rule of Civil Procedure 37(e) “safe harbor.”  The decision from Viramontes v. U.S. Bancorp (N.D. Ill. Jan. 27, 2011) is particularly instructive on this issue.

In Viramontes, the defendant bank defeated a sanctions motion because it timely modified aspects of its email retention policy.  The bank implemented a policy that kept emails for 90 days, after which the emails were deleted.  That policy was promptly suspended, however, once litigation was reasonably foreseeable.  Because the bank followed that procedure in good faith, it was protected from sanctions under Rule 37(e).

As the Viramontes case shows, an organization can be prepared for eDiscovery disputes by appropriately suspending aspects of its document retention policies.  By creating and then faithfully observing a policy that requires retention policies be suspended on the occurrence of litigation or other triggering event, an organization can develop a defensible retention procedure. Having such eDiscovery functionality in a cloud provider will likely facilitate an organization’s eDiscovery process and better insulate it from litigation disasters.

The Ability to Issue Litigation Hold Notices

To be effective for eDiscovery purposes, a cloud service provider must also enable an organization to deploy a litigation hold to prevent users from destroying data. Unless the cloud has litigation hold technology, the entire discovery process may very well collapse.  For electronic data to be produced in litigation, it must first be preserved.  And it cannot be preserved if the key players or data source custodians are unaware that such information must be retained.  Indeed, employees and data sources may discard and overwrite electronically stored information if they are oblivious to a preservation duty.

A cloud service provider should therefore enable automated legal hold acknowledgements.  Such technology will allow custodians to be promptly and properly notified of litigation and thereby retain information that might otherwise have been discarded.  Inadequate litigation hold technology leaves organizations vulnerable to data loss and court punishment.
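The bookkeeping behind automated legal hold acknowledgements is straightforward to model: track who was notified and who confirmed, and escalate the difference. The sketch below is a toy model; the notification delivery itself is left as a placeholder.

```python
from datetime import datetime, timezone

class LegalHold:
    """Track hold notices and custodian acknowledgements for one matter."""

    def __init__(self, matter):
        self.matter = matter
        self.notified = {}       # custodian -> when the notice went out
        self.acknowledged = {}   # custodian -> when the custodian confirmed

    def notify(self, custodian):
        self.notified[custodian] = datetime.now(timezone.utc)
        # send_email(custodian, f"Litigation hold: {self.matter}")  # placeholder

    def acknowledge(self, custodian):
        self.acknowledged[custodian] = datetime.now(timezone.utc)

    def outstanding(self):
        """Custodians who never confirmed -- the escalation list."""
        return [c for c in self.notified if c not in self.acknowledged]

hold = LegalHold("email retention dispute")
hold.notify("jdoe@example.com")
print(hold.outstanding())  # ['jdoe@example.com'] until jdoe acknowledges
```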

Conclusion

Confirming that a cloud offering can quickly retrieve and efficiently store enterprise data while effectively deploying litigation hold notices will likely address the basic concerns regarding its eDiscovery functionality. Yet these features alone will not make that solution a model eDiscovery cloud offering. Advanced search capabilities should also be included to reduce the amount of data that must be analyzed and reviewed downstream. In addition, the cloud ought to support load files in compatible formats for export to third party review software. The cloud should additionally provide an organization with a clear audit trail establishing that neither its documents nor their metadata were modified when transmitted to the cloud.  Without this assurance, an organization may not be able to comply with key regulations or establish the authenticity of its data in court. Finally, ensure that these provisions are memorialized in the service level agreement governing the relationship between the organization and the cloud provider.
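On the audit-trail point, one common approach is to fingerprint content before transmission and re-verify it after retrieval. A minimal sketch, in which the upload and download calls are placeholders:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest used to show content was not altered in transit."""
    return hashlib.sha256(data).hexdigest()

document = b"board minutes, 2012-04-02"
digest_before = fingerprint(document)   # recorded before upload

# upload_to_cloud(document); retrieved = download_from_cloud()  # placeholders
retrieved = document                    # stand-in for the round trip

assert fingerprint(retrieved) == digest_before, "content changed in transit"
```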

Policy vs. Privacy: Striking the Right Balance Between Organization Interests and Employee Privacy

Friday, March 9th, 2012

The lines between professional and personal lives are being further blurred every day. With the proliferation of smart phones, the growth of the virtual workplace and the demands of business extending into all hours of the day, employees now routinely mix business with pleasure by commingling such matters on their work and personal devices. This trend is sure to increase, particularly with “bring your own device” policies now finding their way into companies.

This sometimes awkward marriage of personal and professional issues raises the critical question of how organizations can respect the privacy rights of their employees while also protecting their trade secrets and other confidential/proprietary information. The ability to properly navigate these murky waters under the broader umbrella of information governance may be the difference between a successful business and a litigation-riddled enterprise.

Take, for instance, a recent lawsuit that claimed the Food and Drug Administration (FDA) unlawfully spied on the personal email accounts of nine of its employee scientists and doctors. In that litigation, the FDA is alleged to have monitored email messages those employees sent to Congress and the Office of Inspector General for the Department of Health & Human Services. In the emails at issue, the scientists and doctors scrutinized the effectiveness of certain medical devices the FDA was about to approve for use on patients.

While the FDA’s email policy clearly delineates that employee communications made from government devices may be monitored or recorded, the FDA may have intercepted employees’ user IDs and passwords and accessed messages they sent from their home computers and personal smart phones. Not only would such conduct potentially violate the Electronic Communications Privacy Act (ECPA), it might also conceivably run afoul of the Whistleblower Protection Act.

The FDA spying allegations have also resulted in a congressional inquiry into the email monitoring policies of all federal agencies throughout the executive branch. Congress is now requesting that the Office of Management and Budget (OMB) produce the following information about agency email monitoring policies:

  • Whether a policy distinguishes between work and personal email
  • Whether user IDs and passwords can be obtained for personal email accounts and, if so, whether safeguards are deployed to prevent misappropriation
  • Whether a policy defines what constitutes protected whistleblower communications

The congressional inquiry surrounding agency email practices provides a valuable measuring stick for how private sector organizations are addressing related issues. For example, does an organization have an acceptable use policy that addresses employee privacy rights? Having such a policy in place is particularly critical given that employees use company-issued smart phones to send out work emails, take photographs and post content to personal social networking pages. If such a policy exists now, query whether it is enforced, what mechanisms exist for doing so and whether such enforcement is transparent to employees.  Compliance is just as important as issuing the policy in the first place.

Another critical inquiry is whether an organization has an audit/oversight process to prevent the type of abuses that allegedly occurred at the FDA. Such a process is essential for organizations on multiple levels. First, as Congress made clear in its letter to the OMB, monitoring communications that employees make from their personal devices may violate the ECPA. It could also interfere with internal company whistleblower processes. And to the extent adverse employment action is taken against an employee-turned-whistleblower, the organization could be liable for violations of the False Claims Act or the Dodd-Frank Wall Street Reform and Consumer Protection Act.

A related aspect to these issues concerns whether an organization can obtain work communications sent from employee personal devices. For example, financial services companies must typically retain communications with investors for at least three years. Has the organization addressed this document retention issue while respecting employee privacy rights in their own smart phones and tablet computers?

If an organization does not have such policies or protections in place, it should not panic and rush off to get policies drafted without thinking ahead. Instead, it should address these issues through an intelligent information governance plan. Such a plan will typically address issues surrounding information security, employee privacy, data retention and eDiscovery within the larger context of industry regulations, business demands and employee productivity. That plan will also include budget allocations to support the acquisition and deployment of technology tools to support written policies on these and other issues.  Addressed in this context, organizations will more likely strike the right balance between their interests and their employees’ privacy and thereby avoid a host of unpleasant outcomes.

Big Data Decisions Ahead: Government-Sponsored Town Hall Meeting for eDiscovery Industry Coincides With Federal Agency Deadline

Wednesday, February 29th, 2012

Update For Report Submission By Agencies

We are fast approaching the March 27, 2012 deadline for federal agencies to submit their reports to the Office of Management and Budget and the National Archives and Records Administration (NARA) to comply with the Presidential Mandate on records management. And this is only the beginning: a public town hall meeting in Washington, D.C. is also scheduled for March 27, 2012. This meeting is primarily focused on gathering input from the public sector community, the vendor/IT community, and members of the public at large. Ultimately, NARA will issue a directive outlining a centralized approach to managing records and eDiscovery across the federal government.

Agencies have been tight-lipped about how far along they are in the process of evaluating their workflows and tools for managing their information (both electronic and paper). There is, however, some empirical data from an InformationWeek survey conducted last year that takes the temperature on where the top IT professionals within the government have their sights set, and the Presidential Mandate should bring some of these concerns to the forefront of the reports. For example, the #1 business driver for migrating to the cloud – cited by 62% of respondents – was cost, while 77% of respondents said their biggest concern was security. Nonetheless, 46% were still highly likely to migrate to a private cloud.

Additionally, as part of the Federal Data Center Consolidation Initiative, agencies are looking to eliminate 800 data centers. While the cost savings are clear, from an information governance viewpoint it’s hard not to ask what the government plans to do with all of those records.  Clearly, this shift, should it happen, will force the government into a more service-based management approach, as opposed to the traditional asset-based management approach. Some agencies have already migrated to the cloud. This is squarely in line with the Opex over Capex approach emerging for efficiency and cost savings.

Political Climate Unknown

Another major concern that will affect any decisions or policy implementation within the government is, not surprisingly, politics. Luckily, regardless of political party affiliation, it seems to be broadly agreed that the combination of IT spend in Washington, D.C. and the government’s slow move to properly manage electronic records is a problem. Two of the many examples of the problem are manifested in the inability to issue effective litigation holds or respond to Freedom of Information Act (FOIA) requests in a timely and complete manner. Even still, the political agenda of the Republican party may affect the prioritization of the Democratic President’s mandate and efforts could be derailed with a potential change in administration.

Given the election year and the heavy analysis required to produce the report, there is a sentiment in Washington that all of this work may be for naught if the appropriate resources cannot be secured then allocated to effectuate the recommendations. The reality is that data is growing at an unprecedented rate, and the need for the intelligent management of information is no longer deniable. The long term effects of putting this overhaul on the back burner could be disastrous. The government needs a modular plan and a solid budget to address the problem now, as they are already behind.

VanRoekel’s Information Governance

One issue that Democrats and Republicans will likely not agree upon in accomplishing the mandate is the almighty budget, and the technology the government must purchase to accomplish the necessary changes is going to cost a pretty penny.  Steven VanRoekel, the Federal CIO, stated upon the release of the $78.8 billion FY 2013 IT budget:

“We are also making cyber security a cross-agency, cross-government priority goal this year. We have done a good job in ramping up on cyber capabilities agency-by-agency, and as we come together around this goal, we will hold the whole of government accountable for cyber capabilities and examine threats in a holistic way.”

His quote indicates the priority from the top down of evaluating IT holistically, which dovetails nicely with the presidential mandate since security and records management are only two parts of the entire information governance picture. Each agency still has its own work cut out for it across the EDRM. One of the most pressing issues in the upcoming reports will be what each agency decides to bring in-house or to continue outsourcing. This decision will in part depend on whether the inefficiencies identified lead agencies to conclude that they can perform those functions for less money and more efficiently than their contractors.  In evaluating its present capabilities, each agency will need to look at what workflows and technologies it currently has deployed across divisions, what it presently outsources, and what the marketplace potentially offers today to address its challenges.

This question is central because it raises an all-important point about information governance itself.  Information governance inherently implies that an organization or government controls most or all aspects of the EDRM model in order to derive the benefits of security, storage, records management and eDiscovery capabilities. Presently, the government is outsourcing many of its litigation services to third party companies that have essentially become de facto government agencies.  This is partly due to scalability issues, and partly because the resources and technologies that are deployed in-house within these agencies are inadequate to properly execute a robust information governance plan.

Conclusion

The ideal scenario for each government agency to comply with the mandate would be to deploy automated classification for records management, archiving with expiration appropriately implemented for more than just email, and finally, some level of eDiscovery capability in order to conduct early case assessment and easily produce data for FOIA.  The level of early case assessment needed by each agency will vary, but the general idea is that the scope of an investigation or matter could be determined in-house before contacting a third party to conduct data collection.  All things considered, the question remains whether the Obama administration will foot this bill or whether we will have to wait for a bigger price tag later down the road.  Either way, the government will have to come up to speed and make these changes eventually, and the town hall meeting should be an accurate thermometer of where the government stands.

LTNY Wrap-Up – What Did We Learn About eDiscovery?

Friday, February 10th, 2012

Now that the dust has settled, the folks who attended LegalTech New York 2012 can try to get to the mountain of emails that accumulated during the show. Fortunately, there was no ice storm this year, and for the most part, people seemed to heed my “what not to do at LTNY” list. I even found the Starbucks across the street more crowded than the one in the hotel. There was some alcohol-induced hooliganism at a vendor’s party, but most of the other social mixers seemed uniformly tame.

Part of Dan Patrick’s syndicated radio show features a “What Did We Learn Today?” segment, and that inquiry seems fitting for this year’s LegalTech.

  • First of all, the prognostications about buzzwords were spot on, with no shortage of cycles spent on predictive coding (aka Technology Assisted Review). The general session on Monday, hosted by Symantec, had close to a thousand attendees on the edge of their seats to hear Judge Peck, Maura Grossman and Ralph Losey wax eloquent about the ongoing man versus machine debate. Judge Peck uttered a number of quotable sound bites, including the quote of the day: “Keyword searching is absolutely terrible, in terms of statistical responsiveness.” Stay tuned for a longer post with more comments from the general session.
  • Ralph Losey went one step further when commenting on keyword search, stating: “It doesn’t work,… I hope it’s been discredited.” A few have commented that this lambasting may have gone too far, and I’d tend to agree.  It’s not that keyword search is horrific per se. It’s just that its efficacy is limited and the hubris of the average user, who thinks eDiscovery search is like Google search, is where the real trouble lies. It’s important to keep in mind that all these eDiscovery applications are just like tools in the practitioners’ toolbox and they need to be deployed for the right task. Otherwise, the old saw (pun intended) that “when you’re a hammer everything looks like a nail” will inevitably come true.
  • This year’s show also finally put a nail in the coffin of the human review process as the eDiscovery gold standard. That doesn’t mean that attorneys everywhere will abandon the linear review process any time soon, but hopefully it’s becoming increasingly clear that the “evil we know” isn’t very accurate (on top of being very expensive). If that deadly combination doesn’t get folks experimenting with technology assisted review, I don’t know what will.
  • Information governance was also a hot topic, only paling in comparison to Predictive Coding. A survey Symantec conducted at the show indicated that this topic is gaining momentum, but still has a ways to go in terms of action. While 73% of respondents believe an integrated information governance strategy is critical to reducing information risk, only 19% have implemented a system to help them with the problem. This gap presumably indicates a ton of upside for vendors who have a good, attainable information governance solution set.
  • The Hilton still leaves much to be desired as a host location. As they say, familiarity breeds contempt, and for those who’ve notched more than a handful of LegalTech shows, the venue can feel a bit like the movie Groundhog Day, but without Bill Murray. Speculation continues to run rampant about a move to the Javits Center, but the show would likely need to expand pretty significantly before ALM would make the move. And, if there ever was a change, people would assuredly think back with nostalgia on the good old days at the Hilton.
  • Despite the bright lights and elevator advertisement trauma, the mood seemed pretty ebullient, with tons of partnerships, product announcements and consolidation. This positive vibe was a nice change after the last two years when there was still a dark cloud looming over the industry and economy in general.
  • Finally, this year’s show also seemed to embrace social media in a way it hadn’t in years past. Yes, all the social media vehicles were around in years past, but this year many of the vendors’ campaigns seemed to be much more integrated. It was funny to see even the most technically resistant lawyers log in to Twitter (for the first time) to post comments about the show as a way to win premium vendor swag. Next year, I’m sure we’ll see an even more pervasive social media influence, which is a bit ironic given the eDiscovery challenges associated with collecting and reviewing social media content.

The Social Media Rubik’s Cube: FINRA Solved it First, Are Non-Regulated Industries Next?

Wednesday, January 25th, 2012

It’s no surprise that the first industry to be heavily regulated regarding social media use was the financial services industry. The predominant factor that drove regulators to address the viral qualities of social media was the fiduciary nature of investing that accompanies securities, coupled with the potential detrimental financial impact these offerings could have on investors.

Although there is no explicit language in FINRA’s Regulatory Notices 10-06 (January 2010) or 11-30 (August 2011) requiring archival, the record keeping component of the notices necessitates social media archiving in most cases due to the sheer volume of data produced on social media sites. Melanie Kalemba, Vice President of Business Development at SocialWare in Austin, Texas, states:

“Our clients in the financial industry have led the way, they have paved the road for other industries, making social media usage less daunting. Best practices for monitoring third-party content, record keeping responsibilities, and compliance programs are available and developed for other industries to learn from. The template is made.”

eDiscovery and Privacy Implications. Privacy laws are an important aspect of social media use that impact discoverability. Discovery and privacy represent layers of the Rubik’s cube in the ever-changing and complex social media environment. No longer are social media cases only personal injury suits or HR incidents, although those are plentiful. For example, in Largent v. Reed the court ruled that information posted by a party on their personal Facebook page was discoverable and ordered the plaintiff to provide user name and password to enable the production of the information. In granting the motion to compel the Defendant’s login credentials, Judge Walsh acknowledged that Facebook has privacy settings, and that users must take “affirmative steps” to keep their information private. However, his ruling determined that no social media privacy privilege exists: “No court has recognized such a privilege, and neither will we.” He further reiterated his ruling by adding, “[o]nly the uninitiated or foolish could believe that Facebook is an online lockbox of secrets.”

Then there are the new cases emerging over social media account ownership which affect privacy and discoverability. In the recently filed Phonedog v. Kravitz, 11-03474 (N.D. Cal.; Nov. 8, 2011), the lines between the “professional” versus the “private” user are becoming increasingly blurred. This case also raises questions about proprietary client lists, valuations on followers, and trade secrets  – all of which are further complicated when there is no social media policy in place. The financial services industry has been successful in implementing effective social media policies along with technology to comply with agency mandates – not only because they were forced to by regulation, but because they have developed best practices that essentially incorporate social media into their document retention policies and information governance infrastructures.

Regulatory Framework. Adding another Rubik’s layer is the multitude of regulatory and compliance issues that many industries face. The most active and vocal regulators for guidance in the US on social media have been FINRA, the SEC and the FTC. FINRA initiated guidance to the financial services industry, and earlier this month the SEC issued its alert. The SEC’s exam alert to registered investment advisers issued on January 4, 2012 was not meant to be a comprehensive summary for compliance related to the use of social media. Instead, it lays out staff observations of three major categories – third party content, record keeping and compliance – expounding on FINRA’s notice.

Last year the FTC issued an extremely well done Preliminary FTC Staff Report on Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers.  Three main components are central to the report. The first is a call for all companies to build privacy and security mechanisms into new products – considering the possible negative ramifications at the outset and avoiding social media and privacy issues as an afterthought. The FTC has cleverly coined this notion “Privacy by Design.” Second, “Just-In-Time” is a concept about notice: it encourages companies to communicate with the public in a simple way that prompts informed decisions about their data, in terms that are clear and that require an affirmative action (i.e., checking a box). Finally, the FTC calls for greater transparency around data collection, use and retention. The FTC asserts that consumers have a right to know what kind of data companies collect, and should have access to the sensitivity and intended use of that data. The FTC’s report is intended to inform policymakers, including Congress, as they legislate on privacy – and to motivate companies to self-regulate and develop best practices.

David Shonka, Principal Deputy General Counsel at the FTC in Washington, D.C., warns, “There is a real tension between the situations where a company needs to collect data about a transaction versus the liabilities associated with keeping unneeded data due to privacy concerns. Generally, archiving everything is a mistake.” Shonka arguably reinforces the case for instituting an intelligent archive, whether a company is regulated or not: an archive that is selective about what it ingests based on content, and that applies an appropriate deletion cycle to defined data types according to policy. This ensures expiry of private consumer information in a timely manner, while retaining the benefits of retrieval for a defined period if necessary.
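An intelligent archive of this kind implies two filters: classify content on the way in, and attach a deletion cycle per content type. The sketch below is hypothetical; the patterns and retention windows are invented purely for illustration.

```python
import re
from datetime import timedelta

# Invented classification rules: pattern -> (content type, deletion cycle)
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), ("pii", timedelta(days=90))),
    (re.compile(r"order|invoice", re.I), ("transaction", timedelta(days=3 * 365))),
]
DEFAULT = ("general", timedelta(days=365))

def classify(text):
    """Assign a content type and deletion cycle at ingestion time."""
    for pattern, assignment in RULES:
        if pattern.search(text):
            return assignment
    return DEFAULT

print(classify("Customer SSN 123-45-6789 on file"))    # pii, 90-day cycle
print(classify("Invoice #8841 for October services"))  # transaction, 3-year cycle
```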

The Non-Regulated Use Case. When will comprehensive social media policies, retention and monitoring become more prevalent in the non-regulated sectors? In the case of FINRA and the SEC, regulations were issued to the financial industry. In the case of the FTC, guidance has been given to companies regarding how to avoid false advertising and protect consumer privacy. The two are not dissimilar in effect. Both require a social media policy, monitoring, auditing, technology, and training. While there is no clear mandate to archive social media if you are in a non-regulated industry, this can’t be too far away. This is evidenced by companies that have already implemented social media monitoring systems for reasons like brand promotion/protection, or healthcare companies that deal with highly sensitive information. If social media is replacing email, and social media is essentially another form of electronic evidence, why would social media not be part of the integral document retention/expiry procedures within an organization?

Content-based monitoring and archiving are possible with technology available today, as the financial sector has demonstrated. Debbi Corej, a compliance expert for the financial sector who has successfully implemented an intensive social media program, says it perfectly: “How do you get to yes? Yes you can use social media, but in a compliant way.” The answer can be found at LegalTech New York, January 30 @ 2:00pm.

Losing Weight, Developing an Information Governance Plan, and Other New Year’s Resolutions

Tuesday, January 17th, 2012

It’s already a few weeks into the new year and it’s easy to spot the big lines at the gym, folks working on fad diets and many swearing off any number of vices.  Sadly perhaps, most popular resolutions don’t even really change year after year.  In the corporate world, though, it’s not good enough to simply recycle resolutions every year since there’s a lot more at stake, often with employees’ bonuses and jobs hanging in the balance.

It’s not too late to make information governance part of the corporate 2012 resolution list.  The reason is pretty simple – most companies need to get out of the reactive firefighting of eDiscovery given the risks of sloppy work, inadvertent productions and looming sanctions.  Yet, so many are caught up in the fog of eDiscovery war that they’ve failed to see the nexus between upstream, proactive data management hygiene and downstream eDiscovery chaos.

In many cases the root cause is the disconnect between differing functional groups (Legal, IT, Information Security, Records Management, etc.).  This is where the emerging umbrella concept of Information Governance comes into play, serving as a way to tackle these information risks along a unified front. Gartner defines information governance as the:

“specification of decision rights, and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archiving and deletion of information, … [including] the processes, roles, standards, and metrics that ensure the effective and efficient use of information to enable an organization to achieve its goals.”

Perhaps more simply put, what were once a number of distinct disciplines—records management, data privacy, information security and eDiscovery—are rapidly coming together in ways that are important to those concerned with mitigating and managing information risk. This new information governance landscape is comprised of a number of formerly discrete categories:

  • Regulatory Risks – Whether an organization is in a heavily regulated vertical or not, there are a host of regulations that an organization must navigate to successfully stay in compliance.  In the United States these include a range of disparate regimes, including the Sarbanes-Oxley Act, HIPAA, the Securities Exchange Act, the Foreign Corrupt Practices Act (FCPA) and other specialized regulations – any number of which require information to be kept in a prescribed fashion, for specified periods of time.  Failure to turn over information when requested by regulators can have dramatic financial consequences, as well as negative impacts to an organization’s reputation.
  • Discovery Risks – In the discovery realm there are any number of potential risks as a company moves along the EDRM spectrum (i.e., Identification, Preservation, Collection, Processing, Analysis, Review and Production), but the most lethal risk is typically associated with spoliation sanctions that arise from the failure to adequately preserve electronically stored information (ESI).  There have been literally hundreds of cases where both plaintiffs and defendants have been caught in the judicial crosshairs, resulting in penalties ranging from outright case dismissal to monetary sanctions in the millions of dollars, simply for failing to preserve data properly.  It is in this discovery arena that the failure to dispose of corporate information, where possible, rears its ugly head since the eDiscovery burden is commensurate with the amount of data that needs to be preserved, processed and reviewed.  Some statistics show that it can cost as much as $5 per document just to have an attorney privilege review performed.  And, with every gigabyte containing upwards of 75,000 pages, it is easy to see massive discovery liability when an organization has terabytes and even petabytes of extraneous data lying around (see the back-of-the-envelope sketch after this list).
  • Privacy Risks – Even though the US has a relatively lax information privacy climate, there are any number of laws that require companies to notify customers if their personally identifiable information (PII), such as credit card or social security numbers, has been compromised.  For example, California’s data breach notification law (SB1386) mandates that all subject companies must provide notification if there is a security breach to the electronic database containing PII of any California resident.  It is easy to see how unmanaged PII can increase corporate risk, especially as data moves beyond US borders to the international stage where privacy regimes are much more stringent.
  • Information Security Risks – Data breaches have become so commonplace that the loss/theft of intellectual property has become an issue for every company, small and large, both domestically and internationally.  The cost to businesses of unintentionally exposing corporate information climbed 7 percent last year to over $7 million per incident.  Recently senators asked the SEC to “issue guidance regarding disclosure of information security risk, including material network breaches” since “securities law obligates the disclosure of any material network breach, including breaches involving sensitive corporate information that could be used by an adversary to gain competitive advantage in the marketplace, affect corporate earnings, and potentially reduce market share.”  The senators cited a 2009 survey that concluded that 38% of Fortune 500 companies made a “significant oversight” by not mentioning data security exposures in their public filings.
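As flagged in the discovery bullet above, the per-gigabyte liability follows from simple multiplication. The sketch below uses the figures cited in that bullet; the pages-per-document average is an assumption added purely for illustration.

```python
pages_per_gb = 75_000        # figure cited above
cost_per_document = 5        # privilege review, dollars (cited above)
pages_per_document = 5       # assumption: average document length

documents_per_gb = pages_per_gb / pages_per_document        # 15,000 documents
review_cost_per_gb = documents_per_gb * cost_per_document   # dollars

print(f"Privilege review alone: ${review_cost_per_gb:,.0f} per GB")
print(f"...and ${review_cost_per_gb * 1_000:,.0f} per TB")
# Privilege review alone: $75,000 per GB
# ...and $75,000,000 per TB
```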

Information governance as an umbrella concept helps organizations to create better alignment between functional groups as they attempt to solve these complex and interrelated data risk challenges.  This coordination is even more critical given the way that corporate data is proliferating and migrating beyond the firewall.  With even more data located in the cloud and on mobile devices a key mandate is managing data in all types of form factors. A great first step is to determine ownership of a consolidated information governance approach where the owner can:

  • Get C-Level buy-in
  • Have the organizational savvy to obtain budget
  • Be able to define “reasonable” information governance efforts, which requires both legal and IT input
  • Have strong leadership and consensus building skills, because all stakeholders need to be on the same page
  • Understand the nuances of their business, since an overly rigid process will cause employees to work around the policies and procedures

Next, tap into and then leverage IT or information security budgets for archiving, compliance and storage.  In most progressive organizations there are likely ongoing projects that can be successfully massaged into a larger information governance play.  A great place to focus on initially is information archiving, since this is one of the simplest steps an organization can take to improve its information governance hygiene.  With an archive, organizations can systematically index, classify and retain information and thus establish a proactive approach to data management.  It’s this ability to apply retention and (most importantly) expiration policies that allows organizations to start reducing the upstream data deluge that will inevitably impact downstream eDiscovery processes.

Once an archive is in place, the next logical step is to couple a scalable, reactive eDiscovery process with the upstream data sources, which will axiomatically include email, but increasingly should encompass cloud content, social media, unstructured data, etc.  It is important to make sure  that a given  archive has been tested to ensure compatibility with the chosen eDiscovery application to guarantee that it can collect content at scale in the same manner used to collect from other data sources.  Overlaying both of these foundational pieces should be the ability to place content on legal hold, whether that content exists in the archive or not.

As we enter 2012, there is no doubt that information governance should be an element in building an enterprise’s information architecture.  And, unlike fleeting weight-loss resolutions, savvy organizations should vow to get ahead of the burgeoning categories of information risk by fully committing to integrated information governance.  Yet this resolution does not need to encompass every possible element of information governance.  Instead, it is best to put the foundational pieces into place and then build out the rest of the infrastructure in a methodical, modular fashion.

Information Governance Gets Presidential Attention: Banking Bailout Cost $4.76 Trillion, Technology Revamp Approaches $24 Billion

Tuesday, January 10th, 2012

On November 28, 2011, the White House issued a Presidential Memorandum outlining what is expected of the 480 federal agencies of the government’s three branches over the next 240 days.  Up until now, Washington, D.C. has been the Wild West with regard to information governance, as each agency has often unilaterally adopted its own policies and systems, and some agencies have recently purchased divergent technologies.  Given the President’s ultimate goal of uniformity, this centralization will be difficult to accomplish across such a range of disparate technological approaches.

Particular pain points for the government traditionally include the retention, search, collection, review, and production of vast amounts of data and records.  Examples include FOIA requests gone awry, legal holds issued across different agencies leading to spoliation, and the ever-present problem of decentralization.

Why is the government different?

Old Practices. First, in some instances the government lags behind its corporate counterparts technologically and is failing to meet the judiciary’s expectation that organizations effectively store, manage, and discover their information.  That failing is self-evident in the President’s directive mandating that these agencies formulate a plan to attack the problem.  Though different from other corporate entities, the government is nevertheless held to the same eDiscovery standards under the Federal Rules of Civil Procedure (FRCP).  In practice the government has been given more leniency until recently, and while expectations have not always been equal, the gap between the private and public sectors is no longer possible to ignore.

FOIA.  The government’s arduous obligation to produce information under the Freedom of Information Act (FOIA) has no corresponding analog for private organizations, which respond to more traditional civil discovery requests.  Because the government is so large, with many disparate IT systems, it is cumbersome to work efficiently through the information governance process across agencies, and many times still difficult inside one individual agency with multiple divisions.  Executing this production process is even more difficult, if not impossible, to do manually without properly deployed technology.  Additionally, many of the investigatory agencies that issue requests to the private sector need more efficient ways to manage and review the data they are requesting.  To compound the problem, within the US government two opposing interests are at play, both screaming for a resolution, and that resolution needs to be centralized.  On the one hand, the government needs to retain more than a corporation might in order to satisfy a FOIA request.

Titan Pulled at Both Ends. On the other hand, without classification of the records to be kept, technology to organize this vast amount of data, and some amount of expiry, every agency will essentially become its own massive repository.  The “retain everything” mentality, coupled with inefficient search and retrieval of data and records, is where agencies stand today.  Corporations are experiencing this on a smaller scale, and many are collectively further along than the government in this process, without the FOIA complications.

What are agencies doing to address these mandates?

In their plans, agencies must describe how they will improve or maintain their records management programs, particularly with regard to email, social media, and other electronic communications, and how they will move away from a paper-centric existence.  eDiscovery consultants and software companies are helping agencies through this process, essentially writing their plans to match the President’s directive.  The cloud conversation has been revisited, and agencies also have to explain how they will use cloud-based services and storage solutions, as well as identify gaps in existing laws or regulations that presently prevent improved management.  Small innovations are taking place: just recently the DOJ added a new search feature to its website to make it easier for the public to find documents that agencies have posted.

The Office of Management and Budget (OMB), National Archives and Records Administration (NARA), and Justice Department will use those reports to come up with a government-wide records management framework that is more efficient, maintains accountability by documenting agency actions, and promotes “appropriate” public access to records.  Hopefully, the resulting framework will be centralized and workable on a realistic timeframe, with resources sufficiently allocated to the initiative.

How much will this cost?

The President’s mandate is a great initiative and very necessary, but one cannot help but think about the costs in money, time, and resources when considering these crucial changes.  The most recent version of a financial services and general government appropriations bill in the Senate extends $378.8 million to NARA for this initiative.  President Obama appointed Steven VanRoekel as United States CIO in August 2011 to succeed Vivek Kundra.  After VanRoekel’s speech at the Churchill Club in October 2011, an audience member asked him what the most surprising aspect of his new job was; VanRoekel said it was managing the huge and sometimes unwieldy resources of his $80 billion budget.  It is going to take far more than the NARA appropriation, however, to do the job right.

Using conservative estimates, assume that implementing archiving and eDiscovery capabilities would require an initial investment of $100 million per agency.  That approximates $48 billion across all 480 agencies.  Now assume a uniform information governance platform is adopted by every agency at a 50 percent discount, reflecting the leverage of large contracts and the smaller sums needed by agencies with lesser needs.  The total comes to $24 billion.  For context, that figure is roughly 0.5 percent of the $4.76 trillion the federal government spent on the biggest bailout in history in 2008, yet it would still consume nearly a third of the government’s entire $80 billion annual IT budget.  VanRoekel commented at the same meeting that he wants to break massive, multi-year information technology projects into smaller, more modular ones in the hopes of saving the government from getting mired in multi-million dollar failures; his solution is modular and incremental deployment.
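For transparency, the back-of-the-envelope arithmetic behind that estimate is reproduced below; the inputs are the assumptions stated above, not actual procurement figures:

```python
# Worked arithmetic for the cost estimate; all inputs are the
# assumptions stated in the text, not real procurement data.
agencies = 480
per_agency = 100e6                    # assumed $100M initial investment
gross = agencies * per_agency         # $48B before any discount
discounted = 0.5 * gross              # assumed 50% platform discount -> $24B
bailout = 4.76e12                     # 2008 bailout figure cited above

print(f"gross:      ${gross / 1e9:.0f}B")
print(f"discounted: ${discounted / 1e9:.0f}B")
print(f"share of 2008 bailout: {discounted / bailout:.1%}")  # ~0.5%
```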

While Rome was not built in a day, this initiative is long overdue yet feasible, as technology exists to address these challenges rather quickly.  After these 240 days are complete and a plan is drawn, the real question is: how are we going to pay now for the technology the government needed yesterday?  In a perfect world, the government would select a platform for archiving and eDiscovery, break the project into incremental milestones, and roll out a uniform combination of best-of-breed solutions.