AI Is Trained Using Your Data on LinkedIn. You Can Opt Out With These Steps


LinkedIn, owned by Microsoft, quietly introduced a feature last year that allows your data to be used to train generative AI models. The company drew widespread outrage when it emerged that users had been automatically opted in without clear, prior consent. This is not unique to LinkedIn, however; it is part of a broader trend among major tech giants, including but not limited to Meta and X, that have turned user data into fuel for artificial intelligence systems.

What Data Is Being Used?

The data in question includes posts, comments, and other information shared publicly on the site. LinkedIn says it uses this data to improve AI-based features such as content creation, thereby improving services for both individual and business users. But the fact that the company did not obtain explicit consent from users raises red flags, all the more so because much of this data is professional in nature.

LinkedIn says it is committed to limiting the exposure of personal data in its AI training models. It employs privacy-enhancing technologies such as data redaction, which strips personal identifiers from the dataset before it is fed to the AI systems. Critics argue that these measures are still not enough, since users were automatically opted in without clear, upfront communication.
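LinkedIn has not published the details of its redaction pipeline, so the snippet below is only a minimal sketch of what identifier redaction generally looks like: obvious personal identifiers such as email addresses, phone numbers, and links are replaced with placeholder tokens before the text goes anywhere near a training set. The patterns and placeholder tokens are illustrative assumptions, not LinkedIn's actual implementation.

```python
import re

# Illustrative sketch only; LinkedIn has not disclosed its actual pipeline.
# Replace common personal identifiers with placeholder tokens before the
# text is used for any downstream AI training.

URL_RE = re.compile(r"https?://\S+")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact(text: str) -> str:
    """Strip URLs, emails, and phone numbers from a piece of text."""
    text = URL_RE.sub("[URL]", text)
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

if __name__ == "__main__":
    sample = "Reach me at jane.doe@example.com or +1 (555) 123-4567."
    print(redact(sample))  # -> Reach me at [EMAIL] or [PHONE].
```

Real redaction systems go well beyond simple patterns like these (names, employers, and free-text identifiers are much harder to catch), which is exactly why critics argue that redaction alone is no substitute for consent.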

Opt Out of LinkedIn’s AI Training

If you are not comfortable with the idea of your data being used to train AI models, you can opt out through LinkedIn’s settings. This is how to opt out:

  1. Open LinkedIn and sign in.
  2. Click your profile picture in the top right-hand corner and select “Settings & Privacy.”
  3. Open the “Data Privacy” tab.
  4. Scroll down to “Data for Generative AI Improvement.”
  5. Flip the switch off to prevent your data from being used in AI training.
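
If you prefer to jump straight to the setting rather than click through the menus, a small script can open the relevant page in your browser so you only have to flip the toggle. The snippet below is a minimal Python sketch; the deep link is an assumption based on LinkedIn's current settings layout and may change, in which case the manual steps above remain the reliable path.

```python
import webbrowser

# Assumed deep link to the "Data for Generative AI Improvement" setting.
# LinkedIn's settings URLs are not documented and may change; if this path
# no longer resolves, follow the manual steps listed above instead.
AI_SETTING_URL = "https://www.linkedin.com/mypreferences/d/settings/data-for-ai-improvement"

def open_opt_out_page(url: str = AI_SETTING_URL) -> None:
    """Open the setting page in the default browser; the toggle itself
    still has to be switched off by hand (step 5 above)."""
    webbrowser.open(url)

if __name__ == "__main__":
    open_opt_out_page()
```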

Note that opting out does not delete any data already collected or used to train AI models; it only prevents future use of your data.

Opting Out Isn’t the Whole Solution

Even if you opt out, LinkedIn may still use some of your data for AI purposes, because the platform may process certain interactions with the AI tools it offers. If you want to be fully excluded, LinkedIn provides a “Data Processing Objection Form” you can fill out to request complete exclusion from AI-related data use.

Why is This an Issue?

Privacy advocates consider the situation far from transparent. Mariano delli Santi, legal officer at the Open Rights Group, says: “The mere fact that they are opting users into the system is not only not good enough, but it’s a problem from a point of view of privacy as well.” He argues that the process should operate on an opt-in basis; users should be clearly asked for permission before any of their data is used.

LinkedIn’s use of user data to train AI also raises broader questions about digital rights and corporate responsibility in an AI-dominated world. While LinkedIn is following most of the other tech giants in this regard, its automatic opt-in approach could set a worrying precedent for how companies use user data without sufficient oversight and control from the users themselves.

Regional Differences: What’s Happening in Europe?

The automatic opt-in policy does not apply to users in the European Union, the EEA, and Switzerland. These jurisdictions have more stringent data protection laws, such as the GDPR and the DMA, which hold tech companies to more demanding conditions, so users in those territories cannot be enrolled in the AI training program automatically.

This regulatory difference highlights the strength of such privacy laws and underlines what stricter global regulation could mean for how companies handle user data in the future.

Why You Should Act Now

While LinkedIn presents its generative AI tools as invaluable assets for professionals and businesses alike, users need to be conscious of the larger implications of how their data is put to work in training AI. Whether you are concerned about data privacy or transparency, or you simply want to maintain control over your personal data, opting out is a straightforward way to safeguard it.

The debate over how large tech companies use their users’ data to train AI is not going away. As AI becomes more deeply integrated into our digital lives, demands for stronger protections are likely to grow. Platforms such as LinkedIn face further scrutiny, and users must stay alert to how their data is handled, especially as more firms pursue the same approach.

Conclusion

LinkedIn’s use of user data for AI training without asking permission has sparked a much-needed conversation about data privacy in the age of AI. Opting out is easy, but the bigger issue is that auto-enrollment in such programs undermines user autonomy and, with it, user privacy. As AI continues to evolve, users must remain vigilant about how their data is utilized and push for stronger privacy protections.

For now, the most practical step for anyone uncomfortable with their data being used to train AI models is to opt out. Whether you are an occasional LinkedIn user or a regular contributor matters little; what matters is controlling your data and understanding the implications of AI-driven platforms.