Elon Musk’s xAI has lost its bid for a preliminary injunction that would have temporarily blocked California from enforcing a law that requires AI firms to publicly share information about their training data.

xAI had tried to argue that California’s Assembly Bill 2013 (AB 2013) forced AI firms to disclose carefully guarded trade secrets.

The law requires AI developers whose models are accessible in the state to clearly explain which dataset sources were used to train their models, when the data was collected, whether collection is ongoing, and whether the datasets include any data protected by copyrights, trademarks, or patents. Disclosures would also clarify whether companies licensed or purchased their training data and whether it included any personal information. The requirement would additionally help consumers assess how much synthetic data was used to train a model, which could serve as a measure of quality.

  • ageedizzle@piefed.ca · 8 hours ago

    Elon Musk’s xAI has lost its bid for a preliminary injunction that would have temporarily blocked California from enforcing a law that requires AI firms to publicly share information about their training data.

    How do you actually enforce this? What’s stopping these companies from just lying about what training data they use?

    • Pup Biru@aussie.zone · 5 hours ago

      what’s stopping these companies from lying about their financial data to tax authorities?

      there are lots of self-report mechanisms that we use… it’s just not worth the blowback of non-disclosure to lie about it. some people do, and sometimes they get caught; not always, but overall it’s a net benefit to transparency

      • ageedizzle@piefed.ca · 4 hours ago

        I don’t know anything about accounting, but at first blush it seems like tax evasion and so forth would be easier to detect, because the government can look at a company’s bank activity, perform random audits, and so on. In contrast, I don’t really know what tools we’d use to catch companies lying about their training data.

        • Pup Biru@aussie.zone · 2 hours ago

          for large companies, i think you’re probably right… but there are plenty of transactions that happen in cash. i think it’s a case of not letting perfect be the enemy of better: some people might lie, and if they get caught that should carry some punishment… but we hope most people don’t lie, because the risk just isn’t worth it