Alex is still bored. To amuse himself, he took N dice, numbered from 1 to N, shuffled them, picked K of them at random, recorded their numbers, and returned them to the pile. Then he repeated these steps: shuffle again, pick K dice again, and so on. Now he has a question: how many iterations does he need, on average, until every die has been picked at least once?
The single line contains two integers: N and K.
1 ≤ N ≤ 1000
1 ≤ K ≤ N
You should print a single number — the expected number of iterations until every die has been picked at least once. The answer is accepted with an absolute or relative error of no more than 10^-7.
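One way to compute this expectation (a sketch, not part of the original statement) is inclusion-exclusion over the dice that have never been picked. If q_i = C(N-i, K) / C(N, K) is the probability that a fixed set of i dice is entirely missed in one round (zero when i > N - K), then P(T > t) = sum over i >= 1 of (-1)^(i+1) * C(N, i) * q_i^t, and summing the tail probabilities over t >= 0 gives E[T] = sum over i >= 1 of (-1)^(i+1) * C(N, i) / (1 - q_i). A Python sketch using exact fractions to avoid cancellation in the alternating sum:

```python
from fractions import Fraction
from math import comb

def expected_rounds(n: int, k: int) -> float:
    """Expected number of rounds until all n dice have been picked,
    drawing k distinct dice uniformly at random each round."""
    cnk = comb(n, k)
    total = Fraction(0)
    for i in range(1, n + 1):
        # q: probability a fixed set of i dice is missed in a single round
        q = Fraction(comb(n - i, k), cnk) if i <= n - k else Fraction(0)
        # Inclusion-exclusion term: sum_{t>=0} q^t = 1 / (1 - q)
        term = Fraction(comb(n, i)) / (1 - q)
        total += term if i % 2 == 1 else -term
    return float(total)
```

For K = N every q_i is zero and the sum collapses to 1, as expected (a single round picks every die). For K = 1 the formula reduces to the classic coupon-collector expectation N * (1 + 1/2 + ... + 1/N); e.g. N = 3, K = 1 gives 5.5. Exact fractions keep the alternating sum numerically safe, though for N near 1000 a floating-point variant with log-space binomials would be faster.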