Were there a lot of songs in the library you had it select from? Was it able to populate a list of 100 songs without duplicates, or did it actually build a list with fewer than 100 songs?
When I ran this test, Play Doctor only had a library of roughly 150 songs to select from, though there was plenty of diversity in the types of songs. I should have mentioned this before. That's why I started to think it was forced to use duplicates: once the search criteria narrowed the pool to fewer than 100 songs, it had no choice but to fill the rest of the playlist with repeats. I believe that with a library of thousands of songs this issue would be very unlikely, unless the search criteria pointed to a very small, distinct sampling.
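Just to illustrate what I suspect is happening (this is only a guess at the behavior, not how Play Doctor is actually written), here's a rough Python sketch of a playlist builder that has to return a fixed number of tracks. Once the pool of matching songs drops below the target length, the only way to fill the remaining slots is with repeats:

```python
import random

def build_playlist(library, matches_criteria, target_length=100):
    """Hypothetical playlist builder: fill a fixed-length list from
    whatever tracks match the criteria, repeating tracks when the
    matching pool is smaller than the target (my guess, not actual
    Play Doctor code)."""
    pool = [track for track in library if matches_criteria(track)]
    if not pool:
        return []

    if len(pool) >= target_length:
        # Enough unique matches: sample without repeats.
        playlist = random.sample(pool, target_length)
    else:
        # Too few matches: use every unique track once, then pad the
        # remaining slots with repeats drawn from the same pool.
        playlist = pool[:]
        while len(playlist) < target_length:
            playlist.append(random.choice(pool))

    random.shuffle(playlist)
    return playlist
```

With a 150-song library, any criteria that match fewer than 100 tracks would force that second branch, which matches what I saw.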
What still puzzles me is why dameonjamie only got duplicates from that one album. I can understand that songs more closely related to the search criteria might be weighted more heavily when duplicates are chosen, but even when I tried this, I still got duplicates of everything. More specifically, when searching for "Green Day," I would get about six to eight duplicate Green Day songs, plus a couple of duplicates of songs by other artists in the same genre. The number of duplicates seemed to depend on how I had rated a song and how closely it was related to the search criteria.
One other thing that puzzled me during this test was that Play Doctor seemed to ignore the result modification rules I was setting. At one point, to test it, I modified the results to exclude duplicate album names. With a smart playlist that rule would give me only one song per album, but with Play Doctor I was still getting multiple songs from the same album. This makes me think it may override rules when the resulting playlist would otherwise be too short. Does that sound reasonable?
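If that guess is right, the rule handling might look something like this (again, just a sketch of the behavior I'm describing, not anything I know about Play Doctor's internals): a rule only takes effect when enough tracks survive it.

```python
def one_per_album(tracks):
    """Example rule: keep only the first track seen from each album."""
    seen, kept = set(), []
    for track in tracks:
        if track["album"] not in seen:
            seen.add(track["album"])
            kept.append(track)
    return kept

def apply_rules(candidates, rules, target_length=100):
    """Hypothetical rule handling: apply each result-modification rule,
    but skip any rule that would leave fewer tracks than the target
    length (my guess at why the exclude-duplicate-albums rule seemed
    to be ignored)."""
    result = list(candidates)
    for rule in rules:
        filtered = rule(result)
        if len(filtered) >= target_length:
            result = filtered
        # Otherwise the rule is silently dropped so the playlist can
        # still be filled out to its target length.
    return result
```

That would explain why the same rule behaves as expected in a smart playlist (which has no fixed length to hit) but appears to do nothing in a short Play Doctor result.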