283 points by prefork 12 hours ago | 47 comments
  • stefankuehnel11 hours ago
    If you scroll down to "Allow GitHub to use my data for AI model training" in GitHub settings, you can enable or disable it. However, what really gets me is how they pitch it like it’s some kind of user-facing feature:

    Enabled = You will have access to the feature

    Disabled = You won't have access to the feature

    As if handing over your data for free is a perk. Kinda hilarious.

    • data-ottawa 8 hours ago
      It’s not so bad, there’s no double negative and it’s not a confusing “switch” that is always ambiguous as to whether it’s enabled or not.

      In contrast, when you create a GCS bucket it uses a checkmark for enabling “public access prevention”. Who designed that modal? It takes me a solid minute to figure out whether I’m publishing private data or not.

    • a1o 11 hours ago
      I went to check on this: I have everything Copilot-related disabled, yet in the two bars that measure usage my Copilot Chat usage was somehow at 2%. How is this possible?

      Before anyone tries to sell me on AI: this is my personal account. I have and use Copilot on my business account (a completely different user account); I just make it a point not to use it in my personal time, so I can keep my skills sharp.

      • hakunin11 hours ago
        Does GitHub count it as Copilot Chat usage when you use the AI search form on their website, I wonder?
        • a1o 8 hours ago
          I wonder if that’s it! I occasionally do some code search on GitHub, then remember it doesn’t work well and go back to searching in the IDE. I usually need to look at branches other than main, because a lot of my projects have a develop branch where things actually happen. But that would explain it, so I guess that’s it.
      • saratogacx10 hours ago
        If you're talking about the quota bar: that only measures your premium request usage (models with a #.#x multiplier next to the name). If you only use the free models and code completion, you won't actually consume any "usage". If you use AI code review, that consumes a single request (now). Same with the GitHub Copilot web chat: if you use a free model it doesn't count; if you use a premium model you get charged the usage cost.
    • mirekrusin10 hours ago
      The feature is that your coding style will be in next models!
      • rzmmm10 hours ago
        I wish my GPL license would transit along with my code.
        • mirekrusin 23 minutes ago
          I said a few years back that code licenses don't exist anymore; some people just haven't realized it yet.
        • UqWBcuFx6NV4r 7 hours ago
          If you are wholly confident that model training is a violation of the GPL then go sue.
          • tglman4 hours ago
            I guess the freedom to study and use may also include training AI, but it would be cool if all the derivative work, such as AI models and the code generated from them, had to be licensed as GPL too. Lawyers needed here.
    • petcat11 hours ago
      I guess the "perk" is that maybe their models get retrained on your data making them slightly more useful to you (and everyone else) in the future? idk
    • Rapzid10 hours ago
      Is that not some stock feature-flag verbiage?
      • bigiain9 hours ago
        Stock dark pattern verbiage...

        I'm a little surprised the options aren't "Enable" and "Ask me later".

      • NewJazz8 hours ago
        But it isn't a feature, so using a feature flag is a bit weird.
        • UqWBcuFx6NV4r 7 hours ago
          No, it’s not. Please think like a developer and not like someone playing amateur gotcha journalist on social media. Feature flags are (ab)used in this way all the time. What is a feature? What is a feature flag? It’s like asking what authorisation is vs all your other business rules. There’s grey area.
          • NewJazz7 hours ago
            "Please think like a developer" lmao if I said this to someone at my dayjob I'd be gone.
    • 7bit 10 hours ago
      It's worded that way to create FOMO in the hopes people keep it enabled.

      Dark pattern and dick move.

    • martin-t9 hours ago
      A few days ago, I unchecked it, only to see it checked again when I reloaded the page.

      It could be incompetence but it shouldn't matter. This level of incompetence should be punished equally to malice.

  • mentalgear11 hours ago
    > On April 24 we'll start using GitHub Copilot interaction data for AI model training unless you opt out. Review this update and manage your preferences in your GitHub account settings.

    Now "Allow GitHub to use my data for AI model training" is enabled by default.

    Turn it off here: https://github.com/settings/copilot/features

    Do they have this set on business accounts also by default? If so, this is really shady.

    • lenova11 hours ago
      Ugh, can't believe they made this opt-in by default, and didn't even post the direct URLs to disable in their blog post.

      To add on to your (already helpful!) instructions:

      - Go to https://github.com/settings/copilot/features
      - Go to the "Privacy" section
      - Find: "Allow GitHub to use my data for AI model training"
      - Set to disabled

      • thrdbndndn4 hours ago
        I always thought "opt-in" (not "opt in") meant something you have to actively choose to enable; otherwise, it stays off. So calling something "opt-in by default" sounds like a misnomer to me.

        But English is not my first language so please correct me if I'm wrong.

      • inetknght10 hours ago
        > can't believe they made this opt-in by default

        You can't believe Microslop is force-feeding people Copilot in yet another way?

        > and didn't even post the direct URLs to disable in their blog post

        You can't believe Microshaft didn't tell you how to not get shafted?

    • parkersweb10 hours ago
      Yes - not impressed at all that this is opt-in default for business users. We have a policy in place with clients that code we write for them won’t be used in AI training - so expecting us to opt out isn’t an acceptable approach for a business relationship where the expectation is security and privacy.
      • aksss9 hours ago
        It is not opt-in by default for business users. The feature flag doesn't show in org policies and github states that it's not scoped to business users.
        • parkersweb8 hours ago
          Gah - you’re right! But given that I don’t use personal Copilot, that I do manage an organisation which gives Copilot to some of our developers, AND that I was sent an email this evening making no mention at all of business Copilot being excluded, it could definitely have been communicated better…
    • g947o 11 hours ago
      https://github.com/orgs/community/discussions/188488

      > Why are you only using data from individuals while excluding businesses and enterprises?

      > Our agreements with Business and Enterprise customers prohibit using their Copilot interaction data for model training, and we honor those commitments. Individual users on Free, Pro, and Pro+ plans have control over their data and can opt out at any time.

      • dormento10 hours ago
        Aka "they have lawyers and you usually don't, so we think we can get away with it."
        • gentleman11 9 hours ago
          only big companies have access to the legal system. nobody else can afford it
      • themafia10 hours ago
        > and we honor those commitments.

        Ah, so when the inevitable "bug" appears, and we all learn that you've completely failed to honor anything, what will be your "commitment" then? An apology and a few free months?

        Time to start pushing for a self hosted git service again.

    • martinwoodward11 hours ago
      Just confirming, we do not use Copilot interaction data for model training of Copilot Business or Enterprise customers.
      • verdverm 37 minutes ago
        You shouldn't do it for public by opt-in, it should be opt-out. But that is the Microslop effect on GitHub, users are an afterthought.
    • whynotmaybe6 hours ago
      Per their blog post

      > Business and Copilot Enterprise users are not affected by this update.

    • archb11 hours ago
      Interestingly, it is disabled by default for me.
      • crashingintoyou11 hours ago
        Reading the github blog post "If you previously opted out of the setting allowing GitHub to collect this data for product improvements, your preference has been retained—your choice is preserved, and your data will not be used for training unless you opt in."
        • verdverm 27 minutes ago
          Is this the new name for the setting? I cannot find one that sounds like the previous one you mention

          Notable that they have no "privacy" section in account settings

      • gpm11 hours ago
        Me too, which is making me wonder if they're planning on silently flipping this setting on April 24th (making it impossible to opt out in advance).
        • martinwoodward3 hours ago
          We are not. The reason we wanted to announce early was so that folks had plenty of time to opt-out now. We've also added the opt-out setting even if you don't use Copilot so that you can opt-out now before you forget and then if you decide to use Copilot in the future it will remember your preference.
        • spiderfarmer11 hours ago
          Is it because I'm in the EU?
          • paularmstrong11 hours ago
            I'm in the US and it's off for me. I believe I've previously opted out of everything copilot related in the past if there was anything.
          • gpm10 hours ago
            I'm in Canada, so not only the EU at least.
      • xgdgsc6 hours ago
        I guess we have to check again on April 24?
    • gentleman11 9 hours ago
      What did everyone expect? I can't understand this community's trust of microsoft or startups. It's the typical land grab: start off decent, win people over, build a moat, then start shaking everybody down in the most egregious way possible.

      It's just unusual how quickly they're going for the shakedown this time

    • DavidSJ11 hours ago
      > Do they have this set on business accounts also by default? If so, this is really shady.

      Looks like not, but would it actually have been shadier, or are we just used to individual users being fucked over?

      • hrmtst93837 11 hours ago
        If they turned it on for business orgs, that would blow up fast. The line between "helpful telemetry" and "silent corporate data mining" gets blurry once your team's repo is feeding the next Copilot.

        People are weirdly willing to shrug when it's some solo coder getting fleeced instead of a company with lawyers and procurement people in the room. If an account tier is doing all the moral cleanup, the policy is bad.

  • QuadrupleA9 hours ago
    Fun fact: Copilot gives you no way to ignore sensitive files with API keys, passwords, DB credentials, etc.: https://github.com/orgs/community/discussions/11254#discussi...

    So by default you send all this to Microsoft by opening your IDE.

    • 0xbadcafebee7 hours ago
      Separate fun fact: Gemini CLI blocks env vars with strings like 'AUTH' in the name. They have two separate configuration options that both let you allow specific env vars. Neither work (bad vibe coding). Tried opening an issue and a PR, and two separate vibe-coding bots picked up my issue and wrote PRs, but nobody has looked at them. Bug's still there, so can't do git code signing via ssh agent socket. Only choice is to do the less-secure, not-signed git commits.

      On top of that, Gemini 3 refuses to refactor open source code, even if you fork it, if Gemini thinks your changes would violate the spirit of the intent of the original developers in a safety/security context. Even if you think you're actually making it more secure, but Gemini doesn't, it won't write your code.

    • nulld3v7 hours ago
      Sadly, this issue is systemic: https://github.com/openai/codex/issues/2847
      • stavros5 hours ago
        OpenCode has a plugin that lets you add an .ignore file (though I think .agentignore would be better). The problem is that, even though the plugin makes it so the agent can't directly read the file, there's no guarantee the agent won't try to be "helpful" and go "well, I can't read .envrc using my read tool, so let me cat .envrc and read it that way".
    • malnourish8 hours ago
      I swear I just set up enterprise and org level ignore paths.
      • veverkap7 hours ago
        Yeah, it's a Copilot Business/Enterprise feature
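        For reference, the Business/Enterprise "content exclusion" setting takes path patterns that Copilot is told not to read. Roughly like this (illustrative only; check GitHub's content exclusion docs for the exact keys and scoping):

```yaml
# Repository settings -> Copilot -> Content exclusion (illustrative paths)
- "/.env"
- "**/*.pem"
- "**/secrets/**"
```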
  • ncr100 34 minutes ago
    On my Android phone I was able to change the setting using Firefox by logging into GitHub and not allowing it to launch the GitHub app.

    I was unable to change the setting when I used the GitHub app to open the web page in a container: button clicks weren't working. Quite frustrating.

  • sph11 hours ago
    Thanks to Github and the AI apocalypse, all my software is now stored on a private git repository on my server.

    Why would I even spend time choosing a copyleft license if any bot will use my code as training data to be used in commercial applications? I'm not planning on creating any more opensource code, and what projects of mine still have users will be left on GH for posterity.

    If you're still serious about opensource, time to move to Codeberg.
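    For what it's worth, the move is mostly mechanical: a bare repository reachable over SSH is all a private "git server" is. A minimal sketch (hostnames and paths are placeholders):

```shell
# On your own server, a bare repo is the whole "service" (placeholder host/path):
#   ssh git@myserver.example 'git init --bare /srv/git/myproject.git'
# Then locally:
#   git remote set-url origin git@myserver.example:/srv/git/myproject.git
#   git push --mirror origin

# The same flow, demonstrated locally with a directory standing in for the server:
git init --bare /tmp/demo-server.git
git init /tmp/demo-work
cd /tmp/demo-work
echo '# myproject' > README.md
git add README.md
git -c user.name=demo -c user.email=demo@example.com commit -m 'initial commit'
git remote add origin /tmp/demo-server.git
git push --mirror origin    # mirrors every branch and tag
git ls-remote origin        # the "server" now holds your refs
```

`git push --mirror` carries all branches and tags in one go, which is what you want for a wholesale move rather than a single-branch push.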

    • heavyset_go5 hours ago
      Made the same choice, my open source projects with users are in maintenance mode or archived. New projects are released via SaaS, compiled artifacts or not at all.

      I scratch my open source itch by contributing to existing language and OS projects where incremental change means eventually having to retrain models to get accurate inference :)

    • thesmart10 hours ago
      Yeah, I'm guessing that's probably because in their TOS you grant them some license workaround for running the service, which can mean anything.
    • midasz10 hours ago
      I'm in my happy space selfhosting forgejo and having a runner on my own hardware
  • section_me11 hours ago
    If I'm paying, which I am, I want to have to opt-in, not opt-out, Mario Rodriguez / @mariorod needs to give his head a wobble.

    What on earth are they thinking...

    • sph11 hours ago
      > What on earth are they thinking...

      @mariorod's public README says one of his focuses is "shaping narratives and changing \"How we Work\"", so there you go.

      • fmjrey11 hours ago
        Translation: more alignment with Microsoft practices
      • section_me11 hours ago
        "shaping narratives", sounds like they follow the methodologies of a current president
        • okanat11 hours ago
          It looks like the literal translation of "manipulation" to Linkedin-speak.
  • diath11 hours ago
    > This approach aligns with established industry practices

    "others are doing it too so it's ok"

    • theshrike79 11 hours ago
      Ackshually Anthropic is opt-in AND they give you discounts if you enable it
      • stingraycharles 2 hours ago
        It’s opt-out, not opt-in, at least for Claude Desktop and Claude Code, unless you use the API.
      • cma10 hours ago
        Anthropic puts up random prompts defaulting to enabled to trick you into accidentally enabling.
      • nodar86 10 hours ago
        What kind of discounts? I have never heard of this
  • pred_10 hours ago
    What is the legal basis of this in the EU? Ignoring the fact they could end up stealing IP, it seems like the collected information could easily contain PII, and consent would have to be

    > freely given, specific, informed and unambiguous. In order to obtain freely given consent, it must be given on a voluntary basis.

    • LadyCailin8 hours ago
      I actually don’t seem to have this option on my GitHub settings page, which leads me to wonder if this only applies to Americans.
      • LauraMedia8 hours ago
        I actually did have to manually disable this from Germany, so it might be a different reason you don't have it?
      • spartanatreyu7 hours ago
        I have the setting in Australia.

        I'd be curious to see which countries are affected

  • Deukhoofd11 hours ago
    So basically they want to retain everyone's full codebases?

    > The data used in this program may be shared with GitHub affiliates, which are companies in our corporate family including Microsoft

    So every Microsoft owned company will have access to all data Copilot wants to store?

  • hmate9 11 hours ago
    For what it's worth they're not trying to hide this change at all and are very upfront about it and made it quite simple to opt out.
    • ncr100 32 minutes ago
      They do not make it very simple to opt out. That is false.

      On Android, for instance, I invite you to use the GitHub app and try to modify your opt-in/opt-out settings... You will find that nothing works on the settings page, once you actually find it after digging through a couple of layers and scrolling about 2 ft.

    • matltc11 hours ago
      They didn't even link the setting in their email. They didn't even name it specifically, just vaguely gestured toward it. Dark patterns, but that's Microslop for ya
      • hmate9 10 hours ago
        Going to GitHub, I was greeted with a banner and a link directly to the settings for changing it.
      • w4yai4 hours ago
        I've seen worse dark patterns, to be honest... I don't think they're being malicious here.
  • badthingfactory7 hours ago
    I appreciated the notification at the top of the screen because it prompted me to disable every single copilot feature I possibly could from my account. I also appreciated Microsoft for making Windows 11 horrible so I could fall back in love with Linux again.
  • stefanos82 7 hours ago
    Serious question: let's say I host my code on this platform which is proprietary and is for my various clients. Who can guarantee me that AI won't replicate it to competitors who decide to create something similar to my product?
    • halfcat5 hours ago
      If the code is ever visible to anyone else ever, you have no guarantee. If it’s actually valuable, you have to protect it the same way you’d protect a pile of gold bars.

      What does “my code...for my clients” mean (is it yours or theirs)? If it’s theirs let them house it and delegate access to you. If they want to risk it being, ahem...borrowed, that’s their business decision to make.

      If it’s yours, you can host it yourself and maintain privacy, but the long tail risk of maintaining it is not as trivial as it seems on the surface. You need to have backups, encrypted, at different locations, geographically distant, so either you need physical security, or you’re using the cloud and need monitoring and alerting, and then need something to monitor the monitor.

      It’s like life. Freedom means freedom from tyranny, not freedom from obligation. Choosing a community or living solo in the wilderness both come with different obligations. You can pay taxes (and hope you’re not getting screwed, too much), or you can fight off bears yourself, etc.
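      On the backup obligation specifically, git does the snapshot half for you: `git bundle` packs all refs and history into a single file you can verify, encrypt with whatever tool you trust, and copy anywhere (the encryption/upload lines below are illustrative placeholders, not a recommendation):

```shell
# From inside the repository you want to back up:
git bundle create /tmp/myproject.bundle --all    # one file, all refs + history
git bundle verify /tmp/myproject.bundle          # integrity check before shipping
git bundle list-heads /tmp/myproject.bundle      # see what the snapshot contains

# Encrypt and copy off-site with tools of your choice (illustrative):
#   age -r <recipient> -o /tmp/myproject.bundle.age /tmp/myproject.bundle
#   rclone copy /tmp/myproject.bundle.age remote:backups/

# Restoring later is just a clone:
#   git clone /tmp/myproject.bundle restored-myproject
```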

  • OtherShrezzing11 hours ago
    It’s not clear to me how GitHub would enforce the “we don’t use enterprise repos” stuff alongside “we will use free tier copilot for training”.

    A user can be a contributor to a private repository, but not have that repository owner organisation’s license to use copilot. They can still use their personal free tier copilot on that repository.

    How can enterprises be confident that their IP isn’t being absorbed into the GH models in that scenario?

    • danelski7 hours ago
      Quite simply, that's just a matter of the corporate internal policy and its (lack of) enforcement. This problem is just a subset of the wider IP breach with some people happily feeding their work documents into the free tier of ChatGPT.
    • martinwoodward9 hours ago
      We do not train on the contents from any paid organization’s repos, regardless of whether a user is working in that repo with a Copilot Free, Pro, or Pro+ subscription. If a user’s GitHub account is a member of or outside collaborator with a paid organization, we exclude their interaction data from model training.
      • lmc an hour ago
        Thank you for clarifying this.
      • 8cvor6j844qw_d6 3 hours ago
        For private repositories under a personal account, if the repo owner has opted out of model training but a collaborator has not, would the collaborator's Copilot interactions with that repo still be used for training?
  • _pdp_ 10 hours ago
    Microsoft doing dumb things once again.

    Who in their right mind will opt into sharing their code for training? Absolutely nobody. This is just a dark pattern.

    Btw, even if disabled, I have zero confidence they are not already training on our data.

    I would also recommend sprinkling copyright notices all over the place and changing the license of every file, just in case they have some sanity checks before your data gets consumed - just to be sure.
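    A throwaway version of that sprinkling, if anyone wants it (the header text and the *.py glob are placeholders, and whether any trainer's filters actually respect notices is anyone's guess):

```shell
HEADER='# Copyright (C) 2026 Your Name. Licensed under GPL-3.0-or-later.'

# Prepend the header to every .py file that doesn't already carry one.
find . -name '*.py' -print0 | while IFS= read -r -d '' f; do
  grep -qF 'Copyright (C)' "$f" && continue   # idempotent: skip already-marked files
  tmp=$(mktemp)
  { printf '%s\n' "$HEADER"; cat "$f"; } > "$tmp"
  mv "$tmp" "$f"
done
```

Run it from the repo root; since marked files are skipped, re-running it is safe.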

  • hoten11 hours ago
    Why is there no cancel-Copilot-subscription option here? Docs say there should be...

    Mobile

    https://github.com/settings/billing/licensing

    EDIT:

    https://docs.github.com/en/copilot/how-tos/manage-your-accou...

    > If you have been granted a free access to Copilot as a verified student, teacher, or maintainer of a popular open source project, you won’t be able to cancel your plan.

    Oh. jeez.

  • pizzafeelsright11 hours ago
    I am not certain this is that big of a deal outside of "making AI better".

    At this point, is there any magic in software development?

    If you have super-secret-content is a third party the best location?

    • danelski7 hours ago
      They've had ample access to the final output - our code, but they still hope with enough data on HOW we work they can close the agentic gap and finally get those stinky, lazy humans that demand salary out of the loop.
    • thesmart10 hours ago
      How about "no." You may be okay giving away your individual rights, including to copyright, but I am not.
  • explodes an hour ago
    We all knew Microsoft was going to destroy GitHub eventually when it was first bought.

    How much longer do you want to tolerate the enshittification? How much longer CAN you tolerate it?

  • rectang9 hours ago
    I just checked my Github settings, and found that sharing my data was "enabled".

    This setting does not represent my wishes and I definitely would not have set it that way on purpose. It was either defaulted that way, or when the option was presented to me I configured it the opposite of how I intended.

    Fortunately, none of the work I do these days with Copilot enabled is sensitive (if it was I would have been much more paranoid).

    I'm in the USA and pay for Copilot as an individual.

    Shit like this is why I pay for duck.ai where the main selling point is that the product is private by default.

  • liquid_thyme10 hours ago
    They use data from the poor student tier, but arguably, large corporates and businesses hiring talented devs are going to create higher quality training data. Just looking at it logically, not that I like any of this...
  • jmhammond6 hours ago
    Mine was defaulted to disabled. I’m on the Education pro plan (academic), so maybe that’s different than personal?
  • sbinnee6 hours ago
    Bold move. Who uses Copilot these days? Unless they have free credit I mean.
  • david_allison9 hours ago
    I have GitHub Copilot Pro. I don't believe I signed up for it. I neither use it nor want it.

    1. A lot of settings are 'Enabled' with no option to opt out. What can I do?

    2. How do I opt out of data collection? I see the message informing me to opt out, but 'Allow GitHub to use my data for AI model training' is already disabled for my account.

    • martinwoodward9 hours ago
      Hey David - if you want to send me (martinwoodward at github.com) details of your GitHub account I can take a look. At a guess I suspect you are one of the many folks who qualified for GitHub Copilot Pro for free as a maintainer of a popular open source project.

      Sounds like you are already opted out because you'd previously opted out of the setting allowing GitHub to collect this data for product improvements. But I can check that.

      Note, it's only _usage_ data when using Copilot that is being trained on. Therefore if you are not using Copilot there is no usage data. We do not train on private data at rest in your repos etc.

  • thesmart10 hours ago
    I'm ready to abandon GitHub. Enshittification of the world's source infrastructure is just a matter of time.
  • OtherShrezzing10 hours ago
    So, how does this work with source-available code, that’s still licensed as proprietary - or released under a license which requires attribution?

    If someone takes that code and pokes around on it with a free tier copilot account, GitHub will just absorb it into their model - even if it’s explicitly against that code’s license to do so?

    • danelski7 hours ago
      Most of the new culture and website contents is under full copyright. How much of an obstacle was that to these companies?
  • Heliodex7 hours ago
    Finally. The option for me to enable Copilot data sharing has been locked as disabled for some time, so until now I couldn't even enable it if I wanted to.
  • TZubiri11 hours ago
    Two issues with this:

    1- Vulnerabilities and secrets can be leaked to other users.
    2- Intellectual property can also be leaked to other users.

    Most smart clients won't opt out; they will just cut usage entirely.

    • matltc11 hours ago
      That's me. Frankly, I'm looking at just uninstalling VSCode, because Copilot straight-up gets in the way of so much, and they've stopped even bothering with features that aren't related to it (with one exception, the native browser in v112, which, admittedly, is great).
  • cebert9 hours ago
    I wish GitHub would focus on making their service reliable instead of Copilot and opting folks into their data being stolen for training.
  • indigodaddy11 hours ago
    Checked and mine was already on disabled. Don't remember if I previously toggled it or not..
    • martinwoodward11 hours ago
      If you previously opted out of the setting allowing GitHub to collect data for product improvements, your preference has been retained here. We figured if you didn't want that then you definitely wouldn't want this..
  • djmashko2 12 hours ago
    > Content from your issues, discussions, or private repositories at rest. We use the phrase “at rest” deliberately because Copilot does process code from private repositories when you are actively using Copilot. This interaction data is required to run the service and could be used for model training unless you opt out.

    Sounds like it's even likely to train on content from private repositories. This feels like a bit of an overstep to me.

  • mt42or11 hours ago
    Is it legal ? Surely not in any EU countries.
    • okanat11 hours ago
      Does it even matter? They trained AI on obviously copyrighted and even pirated content. If this change is legally significant and a legal breach, the existence of all models and all AI businesses also is illegal.
      • 0x3f 11 hours ago
        It might or might not be legal, but it seems materially worse to screw over your direct customers than to violate the social-contracty nature of copyright law. But hey ho if you're not paying then you're the product, as ever was.
    • mentalgear11 hours ago
      At least one instance where it was enabled in EU countries as well.
  • phendrenad2 5 hours ago
    So I do all the work of thinking about how to do something, and as soon as I tell Copilot about it, now it's in the training data and anyone can ask the LLM and it'll tell them the solution I came up with? Great. I'm going to cancel.
  • TZubiri11 hours ago
    If this doesn't sound bad enough, it's possible that Copilot is already enabled. As we know, these kinds of features are pushed to users instead of being asked for.

    Maybe it's already active in our accounts and we don't realize it, so our code will be used to train the AI.

    Now we can't be sure if this will happen or not, but a company like GitHub should be staying miles away from this kind of policy. I personally wouldn't use GitHub for private corporate repositories. Only as a public web interface for public repos.

  • semiinfinitely10 hours ago
    ill be moving off github now
  • rvz11 hours ago
    > From April 24 onward, interaction data—specifically inputs, outputs, code snippets, and associated context—from Copilot Free, Pro, and Pro+ users will be used to train and improve our AI models unless they opt out.

    Now is the time to run off of GitHub and consider Codeberg or self hosting like I said before. [0]

    [0] https://news.ycombinator.com/item?id=22867803

    • 0x3f 11 hours ago
      Codeberg doesn't support non-OSS work, and I'd rather just have one 'git' thing I have to know for both OSS and private work. So it's not a great option, IMO. Self-hosting is also out, for other reasons.

      I'm not sure there are any good GitHub alternatives. I don't trust Gitlab either. Their landing page title currently starts with "Finally, AI". Eek.

  • marak830 6 hours ago
    As it's enabled by default, does that mean everything has already been siphoned off, and I'm now just closing the gate after the animals have escaped?

    Shit like this shouldn't be allowed.

  • baobabKoodaa11 hours ago
    (oops)
    • tech234a 11 hours ago
      It’s currently March
  • latand6 9 hours ago
    Why wouldn't people want to make the models better? Aren't we all getting the benefit, after all?
    • danelski7 hours ago
      That's akin to being grateful to your local shop owner for letting you sweep the floor for the other customers.