- cross-posted to:
- opensource@programming.dev
Tidy - Offline semantic Text-to-Image and Image-to-Image search on Android, powered by a quantized state-of-the-art vision-language pretrained CLIP model and the ONNX Runtime inference engine
Features
- Text-to-Image search: Find photos using natural language descriptions.
- Image-to-Image search: Discover visually similar images.
- Automatic indexing: New photos are automatically added to the index.
- Fast and efficient: Get search results quickly.
- Privacy-focused: Your photos never leave your device.
- No internet required: Works perfectly offline.
- Powered by OpenAI’s CLIP model: Uses advanced AI for accurate results.
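The post doesn’t include any code, but the general recipe behind this kind of search is short: embed every photo once with CLIP’s image encoder, embed the query (a text description, or a reference photo for Image-to-Image) at search time, and rank the library by cosine similarity. Here is a rough Python sketch using ONNX Runtime; the model files, tensor names, and preprocessing shapes are placeholder assumptions, not the app’s actual internals.

```python
# Rough sketch of CLIP-style semantic search (not the app's actual code).
# Assumes CLIP image/text encoders exported to ONNX; file names, input names,
# and preprocessing shapes below are placeholders.
import numpy as np
import onnxruntime as ort

image_encoder = ort.InferenceSession("clip_image_encoder.onnx")
text_encoder = ort.InferenceSession("clip_text_encoder.onnx")

def embed_image(pixels: np.ndarray) -> np.ndarray:
    # pixels: preprocessed image tensor, e.g. float32 of shape (1, 3, 224, 224)
    features = image_encoder.run(None, {"pixel_values": pixels})[0]
    return features / np.linalg.norm(features, axis=-1, keepdims=True)

def embed_text(token_ids: np.ndarray) -> np.ndarray:
    # token_ids: tokenized query, e.g. int64 of shape (1, 77)
    features = text_encoder.run(None, {"input_ids": token_ids})[0]
    return features / np.linalg.norm(features, axis=-1, keepdims=True)

def search(query_embedding: np.ndarray, index: np.ndarray, top_k: int = 10):
    # index: (num_photos, dim) matrix of precomputed, L2-normalized image embeddings.
    # With normalized vectors, cosine similarity is just a dot product.
    scores = index @ query_embedding.ravel()
    best = np.argsort(-scores)[:top_k]
    return best, scores[best]
```

Image-to-Image search is the same loop with `embed_image` applied to the query photo instead of `embed_text`.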
Just tried it out, works great! I was able to do some basic searches and most results made sense.
Two cool features would be:
- ability to delete photos. I could see searching and then deleting being a good workflow
- if possible, a way to see what tags the images have, to get a sense of how the CLIP output works (rough sketch of one approach below)
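To be clear, CLIP doesn’t produce tags as such, only embeddings, but you could approximate tags by scoring a hand-picked label list against each image embedding (zero-shot classification). A hypothetical sketch, reusing the kind of embedding helpers shown in the post above; the label list is purely illustrative:

```python
# Sketch: approximate "tags" by scoring a fixed label list against an image
# embedding (zero-shot classification). Labels and helpers are assumptions,
# not something the app actually exposes.
import numpy as np

CANDIDATE_LABELS = ["a photo of a dog", "a photo of a cat", "a photo of food",
                    "a photo of a beach", "a screenshot"]

def tag_image(image_embedding: np.ndarray, label_embeddings: np.ndarray, top_k: int = 3):
    # label_embeddings: (num_labels, dim), built once by running each label
    # through the CLIP text encoder; all vectors L2-normalized.
    scores = label_embeddings @ image_embedding.ravel()
    best = np.argsort(-scores)[:top_k]
    return [(CANDIDATE_LABELS[i], float(scores[i])) for i in best]
```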
Something like this for desktop would be nice
Absolutely! Does anyone know of one?
Professional shitposting tool
This looks promising. Not sure if it’s fully baked yet, so I was going to circle back to it in a month or two. If you try it, let me know…
Last commit was a year ago - is this abandonware? 😯
It’s perfect code. It needs no updates. Ignore the open issues about it crashing on startup.
I mean it works on my phone without issues, but for how long… What with Android updates and all.
This looks great! I’d love to have this for all my Nextcloud pictures!
Is it bad my dyslexic ass read that as tiddy?
That way it would be instantly more popular.
Deity bless the people who make and maintain these apps
That sounds fun! Let’s see what it has to say about the 19,000 photos on my phone! 😸
I love the idea, but in reality I think it’s a bit clunky having to wait for the app to finish its scanning (Every. Time. It. Opens), which is… kinda slow, before you can get to the gallery. And yes, the whole purpose of the app is to use the index it builds, but it should let you use an outdated index while a new one is being rebuilt, instead of making you wait forever each time.
I do have a large gallery, so maybe this is not an issue with smaller photo albums.
This doesn’t sound like the normal behavior. From the app description: “This indexing process may take some time, but it’s a one-time event. Once this initial indexing process is complete, the app will store the index on your device, and any new photos you add to your photo library will be automatically added to the index on the subsequent app launches.”
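For what it’s worth, incremental indexing like that usually comes down to persisting what has already been embedded and skipping it on the next launch. A minimal sketch of the idea (not the app’s actual implementation; the index file location and the `embed_image_file` callable are placeholders):

```python
# Minimal sketch of incremental indexing: persist embeddings keyed by file
# path + mtime, and only embed photos that are new or have changed.
import json
import os
from pathlib import Path

INDEX_FILE = Path("clip_index.json")  # placeholder location/format

def load_index() -> dict:
    return json.loads(INDEX_FILE.read_text()) if INDEX_FILE.exists() else {}

def update_index(photo_dir: str, embed_image_file) -> dict:
    index = load_index()
    for path in Path(photo_dir).rglob("*.jpg"):
        key = str(path)
        mtime = os.path.getmtime(path)
        entry = index.get(key)
        if entry is None or entry["mtime"] != mtime:
            # Only new or modified photos get (re-)embedded.
            index[key] = {"mtime": mtime,
                          "embedding": embed_image_file(path).tolist()}
    INDEX_FILE.write_text(json.dumps(index))
    return index
```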
If your experience is different, you may want to let the dev know so they can fix it.
If you close the app (as in, task switcher swipe away close), it doesn’t start rebuilding the index again? Or, after a phone reboot?
Doing that, I can see the loading bar for half a second when reopening the app. As stated on the app webpage, this is just updating the database with any new images; it doesn’t rebuild the entire database from scratch.
Maybe your cache gets deleted automatically when you close the app on your device. Make sure you have no other app that may do so.
Does anybody know which CLIP model it uses?