- cross-posted to:
- asklemmy@lemmy.ml
- hackernews@lemmy.smeargle.fans
DDG is now offering free/private AI chat using several models.
If they are using GPT-3.5 and Claude, that means that they are sending the chats to OpenAI and Anthropic, right? How can they assure that the chats are private and not being used in training if they don’t control what other companies do?
Edit: ok, they claim to have agreements with them to delete chats within 30 days and they hide the user IP
DDG’s classic “Trust me bro” privacy policy.
I don’t dislike DDG and I do use it, but goddamn I’d love to see a public audit of their privacy claims. DDG is closed source and they’ve only ever given Their Word™ about their claims. The privacy community puts a lot of faith in DDG despite not being able to test anything it says.
Every service that claims to be private should be obliged to have a recent public audit available as proof.
Kagi!!!
I started using it exclusively last month and I’m never going back.
Kagi isn’t open source either
It’s also a subscription based search engine. After Neeva imploded I’m not going to be investing money into a subscription search service.
And their CEO is a maniac
That is how DDG search works as well. They take your search query and send it to a regular, data harvesting search engine. The engine does not see your IP address and cannot track you with a cookie but they can monitor the search queries of DDG users in aggregate.
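The proxy model described above can be sketched roughly like this. This is a hedged illustration of the general pattern, not DDG’s actual code; the upstream URL and the list of identifying headers are assumptions made for the example:

```python
from urllib.parse import quote

# Hypothetical upstream engine the proxy forwards queries to.
UPSTREAM = "https://search.example.com"

# Headers that could link a query back to a specific user.
IDENTIFYING_HEADERS = {"cookie", "x-forwarded-for", "user-agent", "referer"}

def strip_identifiers(headers: dict) -> dict:
    """Drop headers that identify the client; keep the rest."""
    return {k: v for k, v in headers.items()
            if k.lower() not in IDENTIFYING_HEADERS}

def build_upstream_request(query: str, client_headers: dict) -> dict:
    """What the upstream sees: the query text, but none of the
    client's identifiers (the proxy's own IP makes the connection)."""
    return {
        "url": f"{UPSTREAM}/search?q={quote(query)}",
        "headers": strip_identifiers(client_headers),
    }
```

The point of the sketch is the trade-off the comment describes: the upstream never sees who asked, but it still sees every query text, so it can analyze DDG users’ searches in aggregate.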
True, but anonymized search queries are much less personal than chats
They delete the chats but the metadata is forever
They state that they obfuscate the metadata of chats sent to OpenAI
Doubt
Open source is the only way to know, and DDG still hides a lot of their code
They say they proxy the requests. Deleting the chats after 30 days is meaningless; the data has already been utilized.
Here’s the thing: they can’t.