

Elon has a habit of doing this
Finally found the solution to the alignment problem
A boring dystopia
It’s possible, sure. To train these image AIs you essentially feed them a massive amount of pictures as “training data.” These biases happen because, more often than not, the training data is mostly pictures of white people. That might be due to racial bias on the creators’ part, or a more CRT explanation where they only had the rights to pictures of mostly white people. Either way, the fix is to train the AI on more diverse faces.
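As a rough sketch of what “train on more diverse faces” can mean in practice, one common approach is to measure how skewed the dataset is and oversample underrepresented groups until the counts balance out. Everything here is hypothetical (the file names, the `group_a`/`group_b` labels, and the tiny dataset are made up for illustration):

```python
from collections import Counter
import random

# Hypothetical labeled training set: (image_path, demographic_label) pairs.
# In a real pipeline these labels would come from dataset annotations.
dataset = [
    ("img0.jpg", "group_a"), ("img1.jpg", "group_a"),
    ("img2.jpg", "group_a"), ("img3.jpg", "group_b"),
]

# Count how many examples each group has.
counts = Counter(label for _, label in dataset)
target = max(counts.values())

# Oversample each underrepresented group (with replacement) until it
# matches the size of the largest group.
rng = random.Random(0)
balanced = list(dataset)
for label, n in counts.items():
    pool = [item for item in dataset if item[1] == label]
    balanced.extend(rng.choice(pool) for _ in range(target - n))

print(Counter(label for _, label in balanced))
```

Oversampling is only one option; collecting genuinely new, more diverse images generally works better than duplicating the few examples you already have.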
Your fear of constant data collection and targeted advertising is valid and draining. Take back your privacy with this code for 30% off Nord VPN.