That said, the Firefox policy actually aligns with what I'm describing in the post: even when AI tools generate the code, a human must ultimately take responsibility for what gets merged. AI can't own a change or be held accountable for it.
If a large portion of the implementation is generated, ownership can't really mean "I wrote this" anymore. It starts to look more like stewardship of a system: understanding how it behaves, watching how it fails, and designing it so mistakes are contained and recoverable.
Curious how others are thinking about this shift.