Major changes in the updated version include provisions on X’s collection of users’ biometric data and information about their employment and educational background.
The policy doesn’t specify what “biometric information” means, but the term can refer to a range of biological identifiers, such as facial, fingerprint, or voice recognition data.
Under the new terms, the company will also be able to store data on users’ personal background, including their school and work history.
“We may collect and use your personal information (such as your work history, education history, employment preferences, skills and abilities, job search activities and participation, etc.) to recommend potential jobs to you, to communicate with you when you apply for a job, to share with potential employers, to enable employers to find potential candidates, and to show you more relevant ads,” the updated policy says.
These updates will add to user information X already collects, such as location data, payment information and how people interact with ads.
A spokesman for X was not immediately available for comment when contacted by Fortune.
Using user data to train artificial intelligence
One notable change in the new policy is a plan to use user data to train artificial intelligence systems.
Musk, who completed his $44 billion acquisition of Twitter last year before renaming it X, has previously warned that AI will hit humanity “like an asteroid” and insisted it has the potential to “become the Terminator.”
However, he later started his own artificial intelligence company, xAI, which he said was aimed at “understanding the universe” and preventing human extinction.
The policy also states that users can access their data, delete it, or change their data settings at any time.
While X and its users may see some benefit from the additional data collection, experts warn that there are also significant privacy concerns associated with increased access to user data.
Brad Smith, founder of UK digital agency Success Digital, told Fortune on Thursday that storing a user’s biometric, professional, or educational data could have both positive and negative consequences.
For example, fingerprint or facial recognition can enable more secure and convenient user authentication, while education and work history data can facilitate networking and career opportunities.
“On the other hand, holding this kind of data raises a lot of privacy concerns,” Smith warned. “X has a responsibility to keep this data safe, but we can never be sure. There is also the question of whether private companies should be allowed to take such risks with user data.”
Companies and even governments have the potential to misuse this information for surveillance without users’ consent, and storing education and work history could inadvertently lead to discriminatory practices, he explained.
“Algorithms could use this data to make decisions, which could exacerbate bias in hiring and networking,” Smith said.
Jacob Pantaleoni, a former principal engineer and research scientist at Nvidia and author of The Fastest Revolution: An Insider’s Guide to Sweeping Technological Change and Its Biggest Threats, was more pessimistic about the looming privacy changes at X.
“X’s plan to collect biometric data as well as work and education histories sets a dangerous precedent,” he warned. “The danger is twofold. First, if the use of these markers becomes more widely adopted, it could further erode the notion of online privacy by creating a system in which it is nearly impossible to remain anonymous online.”
Pantaleoni drew an analogy to OpenAI CEO Sam Altman’s plan to scan the eyes of billions of people in exchange for cryptocurrency.
“While these initiatives are primarily sold as solutions to problems such as identity theft and the proliferation of bots, the use of these inescapable identity markers will necessarily lead to the development of more granular and precise methods of targeted advertising and customized news releases,” he told Fortune.
“This means that it will become more difficult for users to gain a neutral view of the network and the world. The consequences could be disastrous.”