Very keen to avoid this sort of life lock-in personally.
I'd much rather build a local version that is half as capable: use the cloud for inference via API, but keep the core data storage etc. local. I've already made some progress replacing pieces with my own MCP servers.
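A minimal sketch of that split, assuming a local SQLite store and a hypothetical cloud inference endpoint (the URL and payload shape here are illustrative, not any real service's API):

```python
import json
import sqlite3
import urllib.request

# Local-first: all persistent data lives in a local database.
DB = sqlite3.connect(":memory:")  # use a file path in practice
DB.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, text TEXT)")

def save_note(text: str) -> int:
    # Writes never leave the machine.
    cur = DB.execute("INSERT INTO notes (text) VALUES (?)", (text,))
    DB.commit()
    return cur.lastrowid

def cloud_summarize(text: str,
                    api_url: str = "https://api.example.com/v1/infer") -> str:
    # Only transient inference requests go to the cloud; nothing is stored there.
    # Swap in a real API client for whichever provider you actually use.
    payload = json.dumps({"input": text}).encode()
    req = urllib.request.Request(
        api_url, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["output"]

note_id = save_note("meeting notes: ship the local-first prototype")
# cloud_summarize(...) would be called only at the moment inference is needed
```

The point of the split is that losing (or dropping) the cloud provider costs you capability, not data.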
>A.I. companions
Another thing I'm absolutely not doing hosted. Certainly not anything with the emotional connotations of "companion".
I can see both strategies working commercially, though.