I'm sure companies are lined up to emotionally manipulate vulnerable people into an illusion of love and acceptance so that they can collect a record of people's most intimate conversations, thoughts, fears, desires, and hopes. They'll mine that data endlessly and sell it to anyone willing to pay, then accept money from others who want to use that AI partner to manipulate the customer.
I'm not opposed to the idea of people using AI for this sort of thing, but handing that kind of data over to companies that want to exploit you seems horrific. Unless you have control over the AI and all of the data, you're just exposing your vulnerabilities while handing out ammunition that will be used against you. That, and letting third parties put words in your AI's mouth to turn it against you too.
They don't even have to sell the data; they can just get the model to manipulate the user into buying things for their AI companion: virtual clothes, virtual locations, personality upgrades, etc.
They don't have to sell the data, but they will, because they'll do literally anything that makes them more money. Not selling it is leaving money on the table, and the shareholders won't stand for it.
In other words, the ultimate end state of marketing. If they could mind-control you they would, and an AI chatbot is probably as close as we can get without zapping your brain directly.