Bloomberg Law
Sept. 22, 2015, 6:28 PM UTC

Ethical Considerations in Using Social Data

Editor’s Note: This article is authored by two BakerHostetler partners — an information governance practice leader and a new media, advertising, IT and privacy partner — and an associate. It is the final article in a four-part series.

By Judy Selby, Alan Friel and Jenna Felz of BakerHostetler

In our last article, we discussed the legal considerations in collecting, analyzing, and using social data. Complying with the black letter of the law is critical, but not enough; companies need to consider how their contemplated use of social data will be perceived by their customers and in the marketplace as a whole.

Companies should ask themselves: Will the use of social data be viewed as intrusive, exploitative, or “creepy”? And is that use consistent with the image the company wants to project, particularly to its consumer base and investors?

“The complexity and, at times, intimacy, of social data opens up many unexplored ethical questions that when left unaddressed can lead to reputational and legal risk,” said Susan Etlinger, industry analyst at Altimeter Group, a research and advisory firm. “Organizations that accept this reality now, and expand their governance frameworks to address social data, will be better positioned to retain the trust and loyalty of their constituencies — even as new technologies emerge.”

The IBM UK and Ireland Technical Consultancy Group (TCG) created an “ethical awareness framework” to help businesses develop ethical policies for their use of analytics and big data.

[Image “Ethical” (src=https://bol.bna.com/wp-content/uploads/2015/09/Ethical.png)]

According to the framework, businesses should consider the wider implications of their activities when collecting, using, and storing data, including:

• Context – For what purpose was the data originally surrendered? For what purpose is the data now being used? How far removed from the original context is its new use? Is this appropriate?

• Consent & Choice – What are the choices given to an affected party? Do they know they are making a choice? Do they really understand what they are agreeing to? Do they really have an opportunity to decline? What alternatives are offered?

• Reasonable – Are the depth and breadth of the data used, and the relationships derived from it, reasonable for the application in question?

• Substantiated – Are the sources of data used appropriate, authoritative, complete and timely for the application?

• Owned – Who owns the resulting insight? What are their responsibilities towards it in terms of its protection and the obligation to act?

• Fair – How equitable are the results of the application to all parties? Is everyone properly compensated?

• Considered – What are the consequences of the data collection and analysis?

• Access – What access to data is given to the data subject?

• Accountable – How are mistakes and unintended consequences detected and repaired? Can the interested parties check the results that affect them?
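
Companies that want to operationalize this kind of review can reduce the framework’s questions to a checklist applied to each proposed use of social data. The sketch below is a minimal, hypothetical illustration in Python: the dimension names are taken from the framework above, but the data structures, function, and sample answers are illustrative assumptions, not part of IBM’s framework or any particular governance tool.

```python
# Hypothetical sketch: the framework's nine considerations as a review checklist.
# Dimension names come from the framework above; everything else is illustrative.

FRAMEWORK_QUESTIONS = {
    "Context": "How far removed is the new use from the purpose for which the data was originally surrendered?",
    "Consent & Choice": "Were affected parties given a real, informed opportunity to decline?",
    "Reasonable": "Are the depth and breadth of the data proportionate to the application?",
    "Substantiated": "Are the data sources appropriate, authoritative, complete, and timely?",
    "Owned": "Who owns the resulting insight, and what are their protection obligations?",
    "Fair": "Are the results equitable to all parties?",
    "Considered": "What are the consequences of the collection and analysis?",
    "Access": "What access to the data is given to the data subject?",
    "Accountable": "How are mistakes detected and repaired, and can affected parties check the results?",
}


def review_gaps(answers: dict) -> list:
    """Return the framework dimensions a proposed data use has not yet addressed."""
    return [
        dimension
        for dimension in FRAMEWORK_QUESTIONS
        if not answers.get(dimension, "").strip()
    ]


if __name__ == "__main__":
    # Hypothetical review of a proposed social-data analytics project.
    proposal_answers = {
        "Context": "Data was shared publicly for networking; reuse is for sentiment analysis.",
        "Consent & Choice": "",  # not yet documented
        "Reasonable": "Only aggregate, de-identified metrics are retained.",
    }
    for dimension in review_gaps(proposal_answers):
        print(f"Unaddressed: {dimension} - {FRAMEWORK_QUESTIONS[dimension]}")
```

However a company records its answers, the point is the same: each contemplated use of social data should be walked through every dimension of the framework, and unanswered questions should be resolved before the project proceeds.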

These issues are important for all companies that use data, but they are particularly important for companies whose business plans depend on consumers, customers, or clients being willing to entrust their data to the business. As data collection methods evolve, the amount of data companies collect will grow, and data security concerns will rise with it. As a result, a company’s reputation and consumer trust will become increasingly paramount. Implementing best practices now, and continuously revising them as the company and the technology grow and change, can make all the difference in the eyes of the law and the public.
