The FaceApp craze has reached every corner of the global digital space at the moment, and news is flying around that the app is a ploy by Russians to steal users’ data. A big worry? Not really, but users should be concerned about much more than that.
In the last few days we have seen the #faceappchallenge spreading across social media like wildfire. The challenge was simple: download FaceApp, a face-editing and ageing tool that digitally transforms your face, and share the result on your social media profile. The craze is short-lived—a few “HaHas” and pleasant taunts from your circle, and the excitement fades away more quickly than Thanos’s snap.
After the challenge went viral this week, Joshua Nozzi, a software developer, warned users to be careful with FaceApp, claiming it instantly uploads their pictures without consent. Some sources affirmed this claim, and privacy concerns, unsurprisingly, began to arise.
The concerns gained further urgency when news began circulating that the app was Russian.
“The app that you’re willingly giving all your facial data to says the company’s location is in Saint-Petersburg, Russia,” the New York Times’s Charlie Warzel tweeted.
Evidently, many people carry the impression that Russians are always out to extract users’ personal data for wicked purposes.
By Wednesday last week, things seemed to calm down a little. A French security researcher who goes by the pseudonym Elliot Alderson stated that the app was not pulling your entire camera roll, only the picture being edited. He added that there was no credible evidence the app was stealing users’ data; beyond the photo, it collected only your device ID and model. So why all the fuss about the app’s privacy implications and data-mining purposes?
Seemingly, the fear of data invasion by Russian programmers and hackers is the answer!
Last Wednesday, FaceApp responded to 9to5Mac, stating that it “might store” some of users’ uploaded pictures in the cloud for performance and testing purposes. It further insisted that even though the app’s R&D team resides in Russia, this has nothing to do with the transfer or use of personal data.
As more information about FaceApp came to light, Nozzi, the developer who first started warning the world about the app, issued an extensive acknowledgement of his error and deleted his tweets.
Warzel admitted that his tweets about FaceApp’s Russian connection were misunderstood. He tweeted, “My frame of reference for them came from reporting I’m doing on diff apps accessing data/ sending it places we wouldn’t assume (3rd parties, not govts).”
So should we take this as proof that everything is normal? Should we excitedly take part in the #faceappchallenge just like everyone else in our circle? Not exactly, even if we accept the comments above vouching for FaceApp’s innocence.
According to the app’s own terms of service, when you start using the app, you grant it a “perpetual, irrevocable, nonexclusive, royalty-free, worldwide” license, allowing it to put your photos to whatever use it wishes.
Unsurprisingly, the prospect of allowing an unknown entity to access your private pictures and use them for an unknown agenda is scary and somewhat dangerous. Then again, almost every platform and tech service we use on our phones or home systems carries similar policies and terms of agreement.
If privacy concerns keep you from taking part in the challenge, fair enough. However, let’s not forget that our faces are already present in at least one database somewhere across the globe, feeding and training the AI that is taking over the world.
Google researchers have reportedly used at least 8 million users’ faces to train their face recognition system. Moreover, in May this year, researchers revealed that they had used around 2,000 YouTube videos of users doing the mannequin challenge to help their AI program predict the depth of a moving object. The same researchers admitted that they have kept the data sets for future research; in other words, we have no idea how, or to what purpose, that data will be put.
For example, you never know: a silly prank video of your friend or family member could end up training self-instructing robots or even assassin drones.
In another example, earlier this year it was reported that the University of Colorado had been caught secretly photographing students for facial recognition research. More than 1,700 students were snapped without their awareness or consent. Quite a dataset for facial recognition algorithms to start their work on!
Even worse, you no longer need to upload anything for artificial intelligence systems to train on your face.
Hence, we can safely say that whether or not FaceApp is a Russian tool in some wild data-based conspiracy, there are bigger things to worry about. Remember, we have only scratched the surface with the street surveillance camera next door, whose almost limitless potential we have, until now, only watched in movies. In short, our faces no longer belong to us; they have been privatized by the ruling corporations and governments.
Imran Abdul Rauf is a Digital Marketing Strategist at CMOLDS, specializing in content marketing, email marketing campaigns, lead generation, and other aspects of digital marketing. A content enthusiast by day and a hardcore gamer by night, Imran is also a regular guest contributor to some of the top tech and digital marketing platforms.