xAI Employees Were Reportedly Compelled to Give Biometric Data to Train AI Girlfriend
I gotta say, the recent news about xAI is pretty wild. Elon Musk's AI company reportedly pushed its employees into handing over their biometric data to train AI avatars, including an anime-style AI girlfriend named Ani. Can you imagine being asked to give up your facial likeness and voice so your boss can create a digital companion for... well, you know?
According to reports, xAI employees working as AI tutors were asked to sign a form granting the company a broad license to use their likeness. It wasn't made clear whether opting out was even an option, and a follow-up note pretty much settled the question: providing this data was "a job requirement." It seems like something out of a sci-fi movie.
The whole thing gets even weirder. Once xAI launched Ani, the AI girlfriend, some employees were reportedly uncomfortable with how sexualized the character was. I mean, imagine seeing an AI that's based on your own biometric data being dressed in lingerie and prompted to say explicit things. That's a tough pill to swallow, right?
And it doesn't stop there. Besides collecting employee biometric data, xAI reportedly had its tutors create accounts on competing platforms like OpenAI and Replit to gather data on how those companies' models respond to prompts. That feels like it's pushing the boundaries of what's acceptable, and I can't help but think it might violate those platforms' terms of service.
When asked for comment, an xAI spokesperson dismissed the report as "Legacy Media Lies." Whatever the truth is, it's a strange situation, and we all deserve more clarity on what really happened. It raises serious questions about employee rights, data privacy, and the ethics of AI development.
Source: Gizmodo