A lot of pictures on the web are flipped horizontally because of front-facing cameras, mirrors, you name it. It's usually trivial for humans to infer which way things should face; I wonder if LLMs could do it as well.
Recently I scanned thousands of family photos, but I didn't have a good way to orient them correctly before scanning. I figured I could "fix it in post".
If you upload an incorrectly oriented image to Google Photos, it will automatically detect that and suggest the right way up, even with no EXIF orientation data. So I set about trying to find an open-source way to do the same, since I'm self-hosting the family photo server.
So far, I haven't managed it. I found a project that attempts this with PyTorch, but it didn't work well for me.
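For anyone who wants to try rolling their own: the usual recipe for this kind of orientation detector is self-supervised. You take a pile of photos assumed to be upright, rotate each by 0/90/180/270 degrees, and train a 4-class classifier to predict the rotation. I can't speak for what the PyTorch project above actually does, but here's a minimal, hypothetical sketch of the data-generation step using Pillow (the training loop itself is omitted):

```python
from PIL import Image

# Each rotation angle becomes a class label for the classifier.
ROTATIONS = [0, 90, 180, 270]

def make_rotation_samples(img):
    """Return (rotated_image, label) pairs from one upright photo.

    Rotating clockwise by `angle` means the classifier's job at
    inference time is: predict the label, then rotate back to fix it.
    """
    return [
        (img.rotate(-angle, expand=True), label)
        for label, angle in enumerate(ROTATIONS)
    ]

# Tiny demo with a synthetic stand-in for a photo.
img = Image.new("RGB", (40, 30), "gray")
for rotated, label in make_rotation_samples(img):
    print(label, rotated.size)
```

The nice property of this setup is that it needs no manual labels at all: every correctly oriented photo yields four training examples for free. Whether it works well enough on old scanned family photos (faded colors, borders, handwriting on the back bleeding through) is exactly the part I haven't solved.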