According to Chaslot, the metric the algorithm uses to judge whether a recommendation worked is watch time, which serves companies that wish to advertise but does not accurately reflect what the user wants, and that has consequences.
“It’s not intrinsically appalling that YouTube uses artificial intelligence to recommend videos for you, because if the AI is well tuned, it can help you get what you want. But the AI was not created to help you get what you want – it was created to make you addicted to YouTube. Recommendations were designed to squander your time.”
Guillaume Chaslot, to the TNW website.
The developer laid out his reasoning in a talk at the DisinfoLab conference last month, noting that sensational content is often recommended. Chaslot’s concern is that the recommendations push people toward extremes, since the stranger the content, the greater the chance of keeping people watching and, consequently, of being recommended by the algorithm.
His project, AlgoTransparency, is essentially a bot that monitors which videos YouTube recommends most on a given date. Users can search for specific content using keywords, and the platform displays the videos that contain the term in the title or channel name, along with the number of times each was recommended.
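As a rough illustration of the kind of lookup the article describes, the sketch below filters a list of observed recommendations by a keyword and counts how often each matching video appears. The data structure, function name, and sample entries are assumptions made for illustration; this is not AlgoTransparency's actual code.

```python
from collections import Counter

# Hypothetical log of recommendations observed by a monitoring bot on one date:
# each entry is (video title, channel name).
observed_recommendations = [
    ("Video A", "Channel X"),
    ("Video B", "Channel Y"),
    ("Video A", "Channel X"),
]

def search(term: str, recommendations) -> list[tuple[str, int]]:
    """Return videos whose title or channel name contains the term,
    together with how many times each was recommended."""
    term = term.lower()
    counts = Counter(
        title
        for title, channel in recommendations
        if term in title.lower() or term in channel.lower()
    )
    return counts.most_common()

print(search("channel x", observed_recommendations))
# [('Video A', 2)]
```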
Contacted by The Next Web portal, Google rejected AlgoTransparency’s methodology, saying that YouTube recommendations are based on each user’s searches, likes, and shares, and that it cannot replicate the results its ex-developer’s project showed.
Source: canaltech.com.br