IBM used Flickr photos for facial recognition project | Up to one million photos used without consent

When facial recognition technology tracks shoppers in malls, people will naturally start asking why. As members of the public become more aware of their personal privacy rights, they are understandably concerned about what data is being collected on them and who it is being shared with. But when the faces of real people are innocently shared on social media, and then used without permission by an international tech giant, the people whose images have been used have every right to complain. The following BBC News report, revealing that IBM used Flickr photos for a facial recognition project, is another case of disrespecting people's privacy rights.

IBM has been accused of using Flickr photos for a facial-recognition project without the full consent of the people in the images. The company extracted nearly one million photos from a dataset of Flickr images originally compiled by Yahoo. Many people pictured were probably unaware of how their data had been used, according to an NBC News report. IBM said in a statement that it had taken great care to comply with privacy principles, but one digital rights group said IBM's actions represented a "huge threat" to people's privacy. A photographer told NBC News:

None of the people I photographed had any idea their images were being used in this way.

Photos selected by IBM were listed under a Creative Commons licence, which generally means the images can be widely used with only a small number of restrictions. In a paper published online about the work, IBM researchers describe in detail the steps taken to analyse people's faces, including taking measurements of the distance between individuals' facial features. "Many of these measures can be reliably estimated from photos of frontal faces, using 47 landmark points of the head and face," the researchers wrote. It was these measurements, rather than the photos themselves, that IBM compiled into its own dataset.
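The measurement step described above can be sketched in a few lines of code. This is a minimal illustration of the general technique, not IBM's actual method: the landmark coordinates and the helper function name are hypothetical, and a real pipeline would first detect the landmark points automatically from each photo.

```python
from itertools import combinations
import math

def landmark_distances(landmarks):
    """Compute pairwise Euclidean distances between facial landmark points.

    `landmarks` is a list of (x, y) coordinates, e.g. eye corners,
    nose tip and mouth corners located in a frontal face photo.
    The resulting distances form a geometric "signature" of the face,
    similar in spirit to the measurements described in the report.
    """
    return [math.dist(p, q) for p, q in combinations(landmarks, 2)]

# Hypothetical landmark positions for one face (pixel coordinates):
points = [(30, 40), (70, 40), (50, 60), (40, 80), (60, 80)]
features = landmark_distances(points)
print(len(features))  # 5 points yield 10 pairwise distances
```

Note that storing only these derived measurements, as IBM did, still enables faces to be matched across images even though the photos themselves are not kept.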

‘Don’t be creepy’

Such data can be used to help artificial neural networks better distinguish between faces, so that individuals can be recognised in different images. By using large datasets such as this, technology companies hope to make their facial-recognition systems more accurate. IBM said in a statement:

We take the privacy of individuals very seriously and have taken great care to comply with privacy principles, … Individuals can opt out of this dataset.

However, opting out would only be an option if an individual were aware that their data had been used in the first place. Digital rights group Privacy International said IBM had been wrong to use the photos without direct consent from those pictured. "Flickr's community guidelines explicitly say, 'Don't be creepy.' Unfortunately, IBM has gone far beyond this. Using these photos in this way is a flagrant breach of anti-creepiness – as well as a huge threat to people's privacy."

Sources: BBC News
