Pretty sure these tools are often seeded with system prompts that enforce diversity; Bing does the same or something similar. I'm more amused than anything, since the model itself isn't aware of these settings and can't enable or disable them on its own. A minimal sketch of how that kind of seeding might work is below (the wording, names, and flag are my own assumptions, not any vendor's actual code):
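```python
# Hypothetical sketch of server-side prompt seeding. The suffix text,
# function name, and flag are illustrative assumptions only.

DIVERSITY_SUFFIX = "Depict people of diverse ethnicities and genders."

def build_prompt(user_prompt: str, enforce_diversity: bool = True) -> str:
    """Append a diversity instruction before the prompt reaches the model."""
    if enforce_diversity:
        return f"{user_prompt} {DIVERSITY_SUFFIX}"
    return user_prompt

# The model only ever sees the final string; it has no access to the flag,
# which is why it can't decide on its own when the suffix is inappropriate.
print(build_prompt("a group of 18th-century scientists"))
```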
To handle a historical prompt properly, it would need not only to draw on images from the period, but also to synthesize the relevant historical context to go with the prompt.
That's essentially what I'd expect. My guess is that the training data is skewed, and a blanket injected prompt can't adjust for context.
Either the model will need to understand what the prompt actually calls for, or the company will need to address this directly and let users enable or disable the diversity setting. The first option may be unattainable at this stage; the second opens the door to inappropriate images.