Images of dramatically aged friends and family members have been flooding social media feeds over the last week, courtesy of FaceApp, an app that uses AI to digitally age a user’s photo. While many have been asking themselves “why would I make myself look older?” others have been discussing the risks of allowing an app to access and store personal data.
The app’s privacy policy allows FaceApp to retrieve information such as IP addresses and location data from users, in addition to the photo the user has selected for editing. When users agree to FaceApp’s terms of service, they grant FaceApp a perpetual and irrevocable licence to use this data, including their name and likeness, for any purpose, including commercial purposes.
It is common for apps to access personal data (including photos) and to reserve the right to use it for any purpose. FaceApp’s policy is similar in this respect to Facebook’s and those of most other social media platforms. However, because FaceApp was created by a Russian developer amid a growing distrust of Russia, eyebrows have been raised about the extent to which consumers are willing to allow FaceApp to access, store and use their data. Indeed, the US Democratic National Committee issued an alert warning that “this novelty is not without risk: FaceApp was developed by Russians”.
While there is no evidence that FaceApp is misusing user data, the discourse does highlight that users are often willing to sign up to use apps or websites at the cost of allowing developers to access and use their private data.
FaceApp has been widely covered in the media, more likely because of its Russian ties than because of the way it accesses and handles user data. While the media attention on FaceApp may be overblown, the underlying issue of data privacy is not. The FaceApp discussion would do well to hold up a mirror not only to the prospects of our own digitally rendered mortality, but to the uncomfortable reality of how we handle our own personal data online.
This post features contributions from Karla Hodgson.