The whole future of social media intelligence turns on being able to make sense of unstructured Big Data – the ability to interpret millions of social posts, comments and tweets to create a deeper understanding of markets, audiences and consumer interests. But sensitivity and concerns about data privacy are on the rise. A survey conducted by the Pew Research Center found that 80% of adults who use social networking sites are concerned about third parties like advertisers or businesses accessing the data they share. This is a challenge, not for the social networks themselves, but for the businesses that harvest and process data from them.
People are concerned about losing their privacy and seeing their personally identifiable information (PII) used without their knowledge or consent. The 2015 TRUSTe US Consumer Confidence Index puts this into context for businesses with the revelation that 91% of people avoid companies that do not respect their privacy. This sentiment is percolating through to social networks, which are changing their terms of service, and to governments, which are drafting new regulations.
At DataSift we believe that respecting data privacy needs to be the starting point of any Human Data project. Technology needs to be built with privacy protection at its core (see Privacy by Design for a framework for this) and respecting personal data should be part of the culture of every business. So what does that mean in practice? What do organizations need to consider when they are devising a Big Data program? The following are seven founding principles for a privacy-first approach to Human Data.
Put ethics before “the art of the possible”
Data is in the public domain, so a business has the right to process and analyze it. Right or wrong? Many organizations would argue that social data is public data, so it’s permissible to do whatever they like with it. However, unethical use of personal data can quickly erode trust, undermine customer relationships and expose the company to brand damage.
Build consumer trust through transparency
Companies can manage this by acting transparently – being open about what the organization is doing with consumer data. Providing people with clarity about why, what and how data is collected and used puts them in control. To make personal data usage sustainable, organizations need to engage customers, educating them on how their data is applied as well as on how to manage their privacy.
Respect data provenance and maintain data integrity
Social network data cannot simply be treated as public data. Even if a company believes that, “the data is in the public domain, so I will use it how I choose,” it still needs to adhere to the terms of service set out by the providers. The advice here is to explicitly respect the terms of service of each and every social network, ensure individuals stay in control throughout the value chain and avoid propagating social data beyond its permitted use.
Adhere to data governance and retention requirements
Data retention is related to transparency. Once the stated purpose of the collection of data has been achieved, there is no legitimate reason why that data should be kept and it should be deleted. Storing personal data on individuals incurs a responsibility to protect that data. The more data and the wider its use, the more of a burden this becomes. Every organization involved in Human Data analytics needs to adopt an active policy that defines data retention limits and periods. This policy needs to be reviewed regularly to ensure that it is staying ahead of changes in the business, social network terms of service and the law.
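As a minimal sketch of what such a policy can look like in practice (the data categories, retention limits and record schema here are hypothetical, not part of any specific DataSift product), a scheduled purge can enforce per-category retention periods:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention limits per data category, in days.
# These should come from the organization's written retention policy
# and be reviewed as terms of service and regulations change.
RETENTION_DAYS = {
    "social_posts": 30,
    "campaign_metrics": 365,
}

def purge_expired(records, category, now=None):
    """Return only the records still inside the retention window.

    Each record is assumed to carry a timezone-aware `collected_at`
    timestamp; anything older than the category's limit is dropped.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS[category])
    return [r for r in records if r["collected_at"] >= cutoff]
```

Running a job like this on a schedule makes deletion the default behavior, rather than relying on ad-hoc cleanup.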
Aggregate and anonymize data for “big insights”
Fostering consumer trust does not mean that businesses must forgo the opportunities Human Data intelligence presents for marketing efficiency, new product development and market research. Many of these market-level “big insights” can be garnered without having to process PII. Anonymizing data by removing people’s names is a start, but truly anonymizing data about an individual is much more difficult than removing their name. In fact, the more data points collected for an individual, the more likely their record is to be unique. The solution to this problem is to aggregate data to truly anonymize it. Businesses can easily aggregate, filter and extract meaning from social data and other enterprise data sources and put it to work—gathering market insight, understanding customer segments more deeply, measuring results, optimizing campaigns and more. None of this requires processing of PII.
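The aggregation step above can be sketched in a few lines (the field names and the minimum group size are illustrative assumptions, not a prescribed standard): per-person records go in, only segment-level counts come out, and segments too small to be anonymous are suppressed.

```python
from collections import defaultdict

# Hypothetical threshold: suppress segments with too few records,
# since small groups make individuals easier to re-identify.
MIN_GROUP_SIZE = 5

def aggregate_sentiment(posts):
    """Aggregate per-person posts into segment-level sentiment counts.

    Each post is a dict like {"author": ..., "segment": ..., "sentiment": ...}.
    Only segment and sentiment survive the aggregation; author identity
    (the PII) is discarded entirely.
    """
    counts = defaultdict(lambda: {"positive": 0, "negative": 0})
    for p in posts:
        counts[p["segment"]][p["sentiment"]] += 1
    # Drop segments below the minimum group size.
    return {
        seg: c for seg, c in counts.items()
        if c["positive"] + c["negative"] >= MIN_GROUP_SIZE
    }
```

The design choice is that anonymization happens at ingestion into the analytics layer: downstream reporting only ever sees counts, so there is no PII to protect, retain or leak.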
Build “small insights” for consumers through opt-in
Companies increasingly see the potential in Big Data technology to help drive “small insights” into consumer interests, preferences and potential product needs. These insights can help drive personalization, recommendation, and a better customer experience. In these cases, where identifiable data is needed, a mutual benefit can be offered. To support this, social logins have become a standard, with everyone from Airbnb to American Express implementing the technology on their web properties. This brand level (as opposed to social network level) consent allows organizations to ask for permission to use consumer social data, without alienating the very audience they seek to serve. Both parties benefit: consumers gain a more convenient user experience, while businesses gain valuable permission-based data about their consumers.
Exclude minors from Human Data analytics
The message is simple and clear: Human Data analytics should never include those below the age of consent. The Children’s Online Privacy Protection Act and other legislation prohibit the unauthorized disclosure, use and dissemination of PII regarding minors. Any operator that wishes to collect PII from a child must first get consent from a parent or guardian.
Great businesses are built on the currency of trust. Consumers are becoming more concerned about the way their information is used and these concerns are being taken seriously by social networks and governments around the world. Businesses performing Human Data analytics need to keep the interests of consumers uppermost in their minds. By adopting a privacy-first approach to Human Data, businesses can get the insights they need to grow and maintain trusting, long-term and rewarding consumer relationships.
Download our white paper Balancing Human Data Intelligence and Consumer Trust to learn more about the seven principles you should consider for a privacy-first approach to Human Data.
[White paper: Balancing Human Data Intelligence and Consumer Trust]