My former colleagues at the Pew Research Center continue to publish the best research on the impact of the internet on American society, bar none. My fandom extends to creating a fact sheet summarizing their recent surveys about Americans’ data worries. The results indicate what people think and feel about the shifting technology landscape — are they gaining confidence? Losing it? How aware are they of the dangers and opportunities?
Here are two tantalizing examples from a 2019 report entitled “Americans and Privacy”:
- 45% of U.S. adults say it is unacceptable for social media companies to monitor users’ posts for signs of depression so they can identify people who are at risk of self-harm and connect them to counseling services.
  - By comparison, 27% say this would be acceptable (and 27% answered “don’t know”).
- 35% of U.S. adults say it is unacceptable for fitness tracking app makers to share user data with medical researchers to better understand the link between exercise and heart disease.
  - By comparison, 41% say it is acceptable (and 22% answered “don’t know”).
If I could, I’d ask the “not acceptable” and “don’t know” respondents a follow-up question: Under what circumstances would you say these data uses are acceptable?
How would you answer that?
Alternatively: What follow-up questions would you like to ask respondents?
My colleagues at FasterCures and I have been talking about how companies and organizations can build trust among their users and customers, particularly when it comes to health-related activities. What do you think? What tools could companies create if people trusted them to do good things (and nothing evil) with their data?
Please share your thoughts in the comments below.
Image: “Secret agent” by Jon on Flickr.
Kelly Close says
That’s so interesting … I think both uses of data would be really valuable: the first could at minimum help identify depression and make safety more likely, and the second could enhance wellness. As long as neither winds up serving ads that target vulnerable populations, and as long as the info is not used for malice, I’d think that would make this acceptable. I wonder to what degree the “not acceptable” responses relate to trust.
Susannah Fox says
Yes! If I could, I’d have an open text box attached to every online survey question so people could explain more about what they are thinking about.
Paul Wicks says
Excellent analysis! Thanks for the fact sheet and for highlighting some of these salient points.
I’d also love to know more about the use cases and how people might imagine these data are being used. Perhaps they might think differently about social media companies flagging possible mood issues in their content vs. their children’s content, for example. Maybe the question felt framed more like an ad, and people might just have blockers around the concept of speaking to counseling services – whether that’s because they’ve had a bad experience, it’s expensive, they don’t have time for it, or that it might open up a whole can of worms they’re not willing to deal with vs. venting a little bit on Facebook.
Equally, it’s worth considering how people think about the context of any engagement. If a stranger overheard my conversation on the phone and decided to tap me on the shoulder afterwards and hand me a counselling service’s card, I’d be offended (also, I’m British, so I don’t make eye contact, never mind talking to strangers!).
I can also imagine a subset of the population is always going to resist, on principle, explicit nudging towards different services or restriction of perceived “freedoms”. In theory my supermarket loyalty card could decline my 3rd purchase of Ben & Jerry’s this week, my car could easily refuse to exceed the speed limit, and Nespresso could send me a Holter monitor because they’re not sure I should be getting through *quite* this many coffee capsules every month.
But if a friend reached out to see if I was OK, if my personal trainer reminded me that a protein bar might fill me up without spiking my diet, if the community safety group in my village reminded me that speeding kills, I’d accept that same information because the wrapper has come from a different place. What’s missing from the tech companies is trust, connection, and empathy. But who knows, maybe Nespresso might send me some free decaf 😉
Susannah Fox says
Since this blog is my outboard memory, here’s an insight shared by Nick Jacobson on Twitter:
“Interesting, maybe it’s about distrust for social media companies?
Note: referrals aren’t always benign.
We found that referring persons to in-person care during online screens predicted a 73% increased risk of folks actively trying to end their lives:
Impact of online mental health screening tools on help-seeking, care receipt, and suicidal ideation and suicidal intent: Evidence from internet search behavior in a large U.S. cohort (Journal of Psychiatric Research; Available online 9 November 2020)”