I'm sure you're right that the lower x is, the more generations we should expect to pass before the mutation becomes prevalent. With a population of about 1000 and a rate of 1/1000, I ran it ten times: it took 347, 975, 775, 353, 262, 659, 609, 241, 79, and 204 generations before mutants exceeded half of the population. With the same population size and a rate of 1/10000 it took 1564, 3551, 4261, 3979, 1047, 7947, 3936, 8336, 747, and 1632 generations, about 10x longer. With a population of 10000 and a rate of 1/10000 it looks like about another 10x. I'm guessing the time is linear in both parameters, i.e. roughly proportional to the population size and to 1/x.
I'd be happy to share the program with you; it's really simple.
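In the meantime, here's a minimal sketch of what such a program might look like. This is my own guess at the model, not necessarily what the original program does: a neutral Wright-Fisher step where each offspring copies a uniformly random parent, non-mutants mutate one-way at rate x, and we count generations until mutants exceed half the population.

```python
import random

def generations_until_majority(pop_size, mutation_rate, rng=random):
    """Count generations until mutants exceed half the population.

    Assumed model (a sketch, not the original program): neutral
    Wright-Fisher reproduction where each offspring inherits the
    status of a uniformly random parent, and a non-mutant offspring
    becomes a mutant with probability mutation_rate (one-way).
    """
    pop = [False] * pop_size  # False = wild type, True = mutant
    gen = 0
    while sum(pop) * 2 <= pop_size:
        # Each new individual copies a random parent; wild-type
        # offspring mutate with probability mutation_rate.
        pop = [rng.choice(pop) or (rng.random() < mutation_rate)
               for _ in range(pop_size)]
        gen += 1
    return gen

if __name__ == "__main__":
    # Ten runs at population 1000, rate 1/1000, as in the post.
    print([generations_until_majority(1000, 1/1000) for _ in range(10)])
```

Because drift and the timing of early mutations dominate, the run-to-run spread is large, which matches the wide range of generation counts quoted above.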