When tech companies built the face recognition systems now used widely for government surveillance and invasions of personal privacy, they had help from an unexpected source: your face.

Millions of images gathered from a mixture of online sources have been used by companies, universities and government laboratories to develop the technology.

Now, researchers have created an online tool called Exposing.AI, which lets people search many of these existing photo collections for their old pictures.

The tool, which matches photos from the online photo-sharing service Flickr, provides a window into the vast amounts of data needed to build a variety of artificial intelligence technologies, from face recognition to online "chatbots".

"People need to realize that some of their most intimate moments have been used as a weapon," said Liz O'Sullivan, technology director at Project Watch Monitor Technology, a privacy and civil rights advocacy group.

She helped create Exposing.AI together with Adam Harvey, a researcher and artist in Berlin.

Systems that use artificial intelligence do not magically become intelligent. They learn by identifying patterns in data generated by humans: photos, audio recordings, books, Wikipedia articles and all kinds of other material.
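To make that concrete, here is a minimal, purely illustrative sketch of how a face recognition model "learns", using Python with scikit-learn and the public Labeled Faces in the Wild dataset, one of the many research collections assembled from photos gathered off the web. It is not the software described in this article, only a toy version of the same principle.

```python
# Minimal sketch: a face recognition model "learns" by finding
# patterns in a pile of human photos. Illustrative only; real
# systems are vastly larger, but the principle is the same.
from sklearn.datasets import fetch_lfw_people
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Labeled Faces in the Wild: a research dataset of face photos
# collected from the web, much like the collections described above.
faces = fetch_lfw_people(min_faces_per_person=70)

# Hold out some images to check how well the learned patterns
# generalize to photos the model has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, random_state=0
)

# "Training" is simply fitting a model to patterns in the pixel data.
model = SVC(kernel="linear")
model.fit(X_train, y_train)

# The model can now put names to previously unseen faces.
print(f"identification accuracy: {model.score(X_test, y_test):.0%}")
```

The point of the sketch is the data dependence: the model is only as capable, and only as biased, as the photos it is fed.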

The technology is improving all the time, but it can also absorb human biases, including prejudice against women and minorities.

People may not know that their data is being used to train AI systems.

For some, that is merely curious; for others, it is deeply unsettling. It can also be against the law.

A 2008 Illinois law, the Biometric Information Privacy Act, imposes financial penalties when biometric identifiers such as fingerprints, iris scans and face scans are used on state residents without their consent.

In 2006, Brett Gaylor, a documentary filmmaker, uploaded his honeymoon photos to Flickr, a popular service at the time.

Nearly 15 years later, using an early version of Exposing.AI, Gaylor discovered that hundreds of those images had made their way into multiple datasets that may have been used to train face recognition systems around the world.

The most frightening discovery was that his honeymoon photos had helped build surveillance systems in China. In some ways, what happened to Gaylor's photos may have been neither intended nor expected by anyone involved.

Brett Gaylor's honeymoon photos helped build surveillance systems in China by training artificial intelligence (Reuters)

Feeding the monster was not against the law

Researchers at leading universities and technology companies began collecting digital images from a variety of sources, including photo-sharing services, social networks, dating sites like OkCupid and even cameras installed on college campuses, and they shared those images with other organizations.

They all needed data to feed into their new AI systems, so they shared what they had, and at the time it was legal.

One example is MegaFace, a dataset created in 2015 by professors at the University of Washington. They built it without the knowledge or consent of the people whose photos went into the massive collection, then posted it online so that others could download it.

MegaFace has been downloaded more than 6,000 times by companies and government agencies around the world, among them the US Defense Department contractor Northrop Grumman; In-Q-Tel, the investment arm of the CIA; ByteDance, the parent company of the Chinese social media app TikTok; and the Chinese surveillance company Megvii.

Researchers built MegaFace for use in an academic competition meant to spur the development of face recognition systems.

It was not intended for commercial use, yet only a small percentage of those who downloaded MegaFace actually participated in the competition.

"We are not in a position to discuss third-party projects," said Victor Palta, a spokesman for the University of Washington. "Mega Vice has been turned off, and Mega Vice data is no longer distributed to anyone."

The University of Washington took MegaFace offline in May, and other organizations have removed other datasets, but copies of these files could be anywhere, and they may still be fueling new research.