I think it's fair to say that you were able to code in the first place. Yes, you took more time, but if you were able to meet your deadlines and produce decent code, that's what matters.
A more apt question might be: are you able to code right now? Challenge yourself with a small project, a weekend's worth at most, and do not use AI tools at all. If you can finish it in the language you're most familiar with, you can still code, even if you need 5x more time.
If the answer to both questions is no, then you probably never could code...
But if you generalize this by asking whether code is just a set of descriptions: have you written a tutorial that people actually used? Then I would say yes.
Or if you share some snippet and 1000 people see it and some upvote it, are you able to code? Also refer to my other replies in this thread for clarification.
Although... "coding" originally meant doing what an assembler does: translating mnemonics into binary (or octal or hex) instructions, literally encoding them. So by the original standard, if you're using so much as an assembler, then no, you're not coding.
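To make that concrete, here is a toy sketch of what that original sense of "coding" amounted to. The opcodes are real MOS 6502 values (LDA immediate is 0xA9, STA absolute is 0x8D), but the little encode helper is just my own illustration, not any real assembler:

```python
# Toy sketch of "coding" in the original sense: translating
# mnemonics into raw machine-code bytes by table lookup.
# Opcodes are genuine MOS 6502 values; the rest is illustrative.

OPCODES = {
    ("LDA", "imm"): 0xA9,  # load accumulator, immediate operand
    ("STA", "abs"): 0x8D,  # store accumulator, absolute address
}

def encode(mnemonic, mode, operand):
    """Encode one instruction as raw bytes."""
    out = [OPCODES[(mnemonic, mode)]]
    if mode == "imm":
        out.append(operand & 0xFF)             # one-byte operand
    else:  # "abs"
        out += [operand & 0xFF, operand >> 8]  # little-endian address
    return bytes(out)

# LDA #$42 ; STA $0200  ->  a9 42 8d 00 02
program = encode("LDA", "imm", 0x42) + encode("STA", "abs", 0x0200)
print(program.hex(" "))
```

A human doing that table lookup by hand, byte by byte, was the original "coder".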
But definitions change over time. By current standards, from what you said, you definitely are able to code.
Yet it is not so much about downplaying myself as about asking whether what I did was useful, even necessary. Is there inherent intellectual value in fixing dependency issues? Or is the real value in the actual idea, in the perfect description of the problem? Basically the antithesis of the age-old adage that "ideas are worth nothing, execution is what counts"?