I’m far from an AI defender, and for the longest time I resisted the idea of vibe coding.
I’ll give you that: without the right experience, vibe coding feels like gambling.
But I learned rather quickly that you must first work out a dev plan with the LLM, and until that plan covers every scenario, you don’t move on.
That yields much better results, and it leaves you with a blueprint for when you need the LLM to make changes later.
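To make that concrete, here’s a rough sketch of the loop I mean. It’s only an illustration under my own assumptions, not real tooling: `ask_llm` is a hypothetical stand-in for whatever chat API you actually use, and the scenario check is just a human review prompt.

```python
def ask_llm(prompt: str) -> str:
    """Hypothetical stand-in for your provider's chat API."""
    raise NotImplementedError("wire this up to whatever LLM you use")

def build_dev_plan(feature_request: str, scenarios: list[str]) -> str:
    """Iterate on a dev plan with the LLM; don't move on until it covers every scenario."""
    plan = ask_llm(
        f"Draft a step-by-step dev plan for: {feature_request}\n"
        f"It must cover these scenarios: {', '.join(scenarios)}"
    )
    # Human review loop: keep revising until nothing is missing.
    while input(f"\n{plan}\n\nDoes this cover every scenario? (y/n) ").strip() != "y":
        gaps = input("What's missing or wrong? ")
        plan = ask_llm(f"Revise the plan to address: {gaps}\n\nCurrent plan:\n{plan}")
    return plan  # Keep this; it's the blueprint for later changes.

def implement(plan: str) -> str:
    """Only after the plan is settled do we ask for code."""
    return ask_llm(f"Implement the following plan, step by step:\n\n{plan}")
```

The point is the control flow, not the prompts: the plan is the artifact you keep around for later changes.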
Do you know of any reports that AI companies did this intentionally? It would make business sense for them to make their products addictive.
No official report that I can link, though I recall seeing something at some point. I mostly speak from personal experience. I’m a bit sensitive to these hooks, so I noticed them right away. I barely use AI, mostly because of the “hooking effect”. Some friends of mine are downright addicted, and it’s sad to see.