OpenAI is asking third-party contractors to upload examples of real work they have completed in previous or current roles, according to a report by Wired. The request is part of a broader push by AI companies to source higher-quality training data that could help models take on more complex white-collar tasks.
The report says OpenAI, working alongside training data firm Handshake AI, has shared internal materials encouraging contractors to describe tasks they have performed elsewhere and submit “real, on-the-job work” they have personally produced. These submissions are expected to be concrete outputs, such as Word documents, PDFs, PowerPoint presentations, Excel files, images, or even code repositories, rather than summaries or synthetic examples.
To reduce risk, OpenAI reportedly instructs contractors to remove proprietary content and personally identifiable information before uploading any files. Contractors are also directed to use a dedicated ChatGPT-based “Superstar Scrubbing” tool designed to help sanitize sensitive data.
Even with these safeguards, the approach raises legal and ethical concerns. Intellectual property lawyer Evan Brown told Wired that asking contractors to make judgment calls about what is confidential exposes AI labs to significant risk, since it relies heavily on individual interpretation and trust.
OpenAI declined to comment on the report.