Here's another idea - every time the page picks an image and shows it, there are rating controls (as simple as thumbs up and down, or maybe 1-10 funniness rating). Users can just view each image if they're lazy, or they can rate it with a click if they want to.
The ratings contribute to weighting the randomness so the funniest images show up most often. It's easy to write SQL (or a bit of application code) that does a quality-rating-weighted, semi-random pick of a record.
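A minimal sketch of that weighted pick, assuming a hypothetical SQLite table `images(id, url, rating)` (all names made up for illustration). Here the weighting happens in Python via `random.choices`; you could also push it into the SQL itself with a weighted `ORDER BY`:

```python
import random
import sqlite3

# Hypothetical schema: images(id, url, rating); higher rating = shown more often.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE images (id INTEGER PRIMARY KEY, url TEXT, rating REAL)")
conn.executemany(
    "INSERT INTO images (url, rating) VALUES (?, ?)",
    [("a.jpg", 1.0), ("b.jpg", 3.0), ("c.jpg", 6.0)],
)

def pick_weighted(conn):
    """Pick one image id, with probability proportional to its rating."""
    rows = conn.execute("SELECT id, rating FROM images WHERE rating > 0").fetchall()
    ids = [row[0] for row in rows]
    weights = [row[1] for row in rows]
    return random.choices(ids, weights=weights, k=1)[0]

print(pick_weighted(conn))  # c.jpg (rating 6.0) wins most often, but not always
```

The `WHERE rating > 0` filter doubles as a kill switch: vote an image down far enough and it simply stops appearing.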
Yeah, you're already gonna have quality filters of the images GOING IN, but this would be ONGOING ratings once they're already in there!
Also, you can use some mechanism to keep track of the last X images each user has seen, so repetitions are minimized (at least, repetitions close together in time). Using client-side storage (cookies, Web Storage, IndexedDB, etc.) for this is the most decentralized way: it takes zero per-user storage on the server(s), since it's all kept in the client (browser). And if the client-side data isn't available, who cares? Just skip the anti-repetition feature on that request. Failure causes no problems.
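On the server side, that graceful degradation is a few lines. A hedged sketch (names hypothetical): the client sends its recently-seen ids along with the request, and the server just ignores the feature when the list is missing or would empty the pool:

```python
import random

def pick_image(candidates, recent_ids=None):
    """Pick a random image id, avoiding the client's recently seen ones.

    `recent_ids` comes from client-side storage (cookie / Web Storage) and may
    be missing entirely -- in that case we skip the anti-repetition filter,
    so failure causes no problems.
    """
    pool = list(candidates)
    if recent_ids:
        fresh = [c for c in pool if c not in set(recent_ids)]
        if fresh:  # if the user has somehow seen everything, fall back to all
            pool = fresh
    return random.choice(pool)

print(pick_image([1, 2, 3, 4], recent_ids=[1, 2]))  # 3 or 4
print(pick_image([1, 2, 3, 4]))                     # any of the four
```

Note the second fallback too: if the user's recent list covers the whole collection, the filter turns itself off rather than crashing with an empty pool.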
However, once your collection gets huge enough, close-in-time repetitions should be rare enough that the feature may not be worth the data transfer involved. It's all a balancing act, like most software design!