Onboarding III: Conversations
In a previous post about onboarding, I explored how disappointing or slow the adoption of digital tools can be. In a second post, I discussed how the general appeal of digital minimalism can lead to meaningful discussions about privacy.
Today, I would like to talk about what it is like having these conversations with people about digital privacy, and what you need to watch out for.
"Alexa...thank you!"
One of the most effective privacy conversations I have had was thanks to Amazon's Alexa. It was Christmas time, and I was talking with a family member about Amazon and Alexa (as you do...). She was a big fan of the service and couldn't see any downsides. I tried to explain the problem of unlimited record keeping, but it had no effect.
I then spent a little bit of time looking up what's possible in the Amazon settings menu. I was determined to convince my family member, and had a hunch that Amazon might provide the ammunition I needed.
It turned out this was true. If you go into the settings of the app that manages your Alexa device, you can find a record of every request spoken to the microphone, stored as short audio files. I showed her how to do this, and we found a long list of recordings. She began to play the files, and we heard her adult son's song requests from that day (he was at her house, in a different country). She could hear, in the background, which of his friends were visiting. It was a strange and creepy experience, listening in on fragments of a Christmas party happening at the same time in a different country. The sense of eavesdropping was viscerally uncomfortable.
It's a good thing that her family only uses Alexa to request songs and not for internet searches; otherwise the audio record could have been an even greater breach of her children's privacy.
Needless to say, my family member deleted all the audio files and opted out of future recordings. See Documentation below for details on how to turn off these settings.
What the experts say (MoneroTopia 2023)
Though I couldn't fly out to Mexico, I was able to participate in MoneroTopia 2023 online. I heard several speakers talk about onboarding and how to have conversations about adoption.
The speakers below all focused on Monero, as this was the topic of the conference, but their arguments extend easily to any privacy tool and to discussions about digital privacy in general.
Guest speaker Rachel O'Leary presented a talk that addressed issues of onboarding people to digital privacy tools like Monero. She argued that as regulations become stricter and more controlling, people will naturally look for tools that help them maintain their freedom. Those of us who are already familiar with tools like Monero, custom ROMs, and private mail, storage and messengers will then be well-positioned to have conversations about privacy with a more receptive audience. In the EU, we are seeing an increase in regulation of cryptocurrencies right now (MiCA, the Transfer of Funds regulation). It will be interesting to see whether this is picked up in mainstream discourse and whether it will indeed bring more people to digital cash like Monero. When the messenger app WhatsApp received bad press over changes to its privacy policy last year, a number of people moved to Signal. When things looked shaky for Twitter after Musk's takeover, we saw an increase in Mastodon adoption. I hope Rachel is right, but I don't see evidence of lasting, mass adoption yet.
Speaker Luke Smith presented a talk titled "Is the Best Cryptocurrency Good Enough for Grandma Yet?" One thing I like about the conversation in the Monero community is that it is often (not always) a bit more open to critical views and harsh realities. Luke was pretty clear that, despite great tools like the Cake Wallet app, people are still generally cautious, and it is difficult to onboard people to a privacy tool like Monero through conversation. He argued that it would be much more effective (and save a lot of energy spent on discussion) if online stores would start to include Monero as a payment option. I tend to agree with this.
Finally, Seth for Privacy spoke online about the importance of guarding against tribalism within the community. Seth was talking about Monero in particular, but having listened to a lot of his interviews and podcasts, I know this is part of his general attitude to having conversations about privacy. Seth talks about onboarding a lot. What I find persuasive about his approach is that he speaks from a position of being well-informed and knowledgeable, but with an open mind to other points of view. I highly recommend listening to and reading his work for examples of healthy, confident and open conversations about digital privacy.
Positive conversation
When I first got into privacy in a serious way, I became quite evangelical. I would talk about privacy at length, especially after a beer or two. I've learned a few lessons about what does and doesn't work:
Don't bore people - Avoid becoming strident. This requires some empathy: you have to know when to stop talking about state and corporate surveillance and the fight for privacy. Endless conversations about the same topic make you less effective at actually getting people to think about their own privacy, so choose your moments and know when it's time to give it a rest and switch to another topic.
Lower expectations - Some of the people I am closest to are full-on Big Tech and social media users (all of Meta's platforms, Google, Apple, etc.), despite all those hours listening to me and my convictions. There is no point in feeling disappointed by this. What helps is to notice and celebrate small wins: no one in our house buys from Amazon anymore. My entire extended family is on Signal. Some of my friends and family downloaded Cake Wallet to try Monero. I don't know if anything will change in the long run, but these small steps are a start.
Avoid a superior attitude - A superior or sanctimonious attitude doesn't achieve anything positive. I sometimes despair at the tone of conversation on privacy platforms, where anyone who isn't 100% on board with the latest privacy tool is just an idiot, or part of a flock of sheep. The antidote is empathy. It helps me to remember that I was a full-on Google platforms user not all that long ago. The most poisonous result of superiority is in-fighting in the privacy communities, which must be a real turn-off for any potential newcomer.
Show & tell - I regularly show friends, family, and colleagues my digital privacy apps and hardware devices. I encourage people to take a look at CalyxOS on my smartphone. I talk about Monero. I teach a unit on Edward Snowden. I share books I have read and print posters of their covers and put them up in my classroom. I show students how to switch the search engine in the browser.
Share privacy tools that work - One of my children uses my Nextcloud server as a work and storage space for YouTube videos they create with a school friend. They are large files, and I have 3 TB of space on an old hard drive, at no cost. Some of my family members use the Whoogle instance I have set up on a Raspberry Pi as their default search engine. These are practical, functioning tools that also bring privacy, opening the door to good conversations.
Conclusions
One reader, who goes by the pseudonym u/Hong-Kwong, shared their thoughts with me about onboarding conversations in the Reddit comments (in response to my post about obsessing over privacy):
I always tell newcomers to privacy and security that it's a long process and it doesn't really stop. It's a process that needs to be followed, updated and learned about constantly.
This takes time so don't make drastic decisions which may make your life difficult and put you off from further changes.
I try subtle broadcasting of my open source choices but try to refrain from being too judgemental on others who don't.
Hong-Kwong, if you are reading this, I hope you don't mind me sharing these words. I think you summed it up perfectly!
Documentation
The New Oil recently posted this article on Mastodon: "Is Alexa Always Listening? Not If This Mic Jammer Has Anything To Say About It".
"Amazon to pay $25m over child privacy violations" (BBC, 1 June 2023)
"How to Delete Your Alexa History and Recordings"
"4 Amazon privacy settings you should change right now" (PCWorld, 22 March 2023)
From the last article:
Limit Alexa data collection
If you use an Amazon Echo speaker or other Alexa device, swing by Amazon’s Alexa privacy page and make the following adjustments:
- Under “Voice Recordings,” click the arrow next to “Choose how long to save,” then select “Don’t save recordings.” This stops Amazon from storing the audio of your voice commands.
- Under “Smart Home Device History,” click the arrow next to “Choose how long to save,” then select three months, the minimum timeframe available.
- Repeat these steps for “Detected Sounds History.”
- Under “Help improve Alexa,” turn off “Use of voice recordings” and “Use messages to improve transcriptions.”