Minimum Difference Function
For any positive integer n let f(n) denote the minimum difference
|a-b| over all pairs of positive integers a,b such that ab = n. Then
define F(n) as the sum of f(k) for k = 1, 2, ..., n.
When is F(n) divisible by n? Here's a table of all the occurrences
for n < 175000:
     n         F(n)    F(n)/n
------   ----------   -------
     1            0         0
     3            3         1
     7           14         2
     8           16         2
    55          550        10
    75         1050        14
   146         3504        24
   204         6732        33
   224         7840        35
   679        63826        94
   831        93072       112
   860        98900       115
 63057    328085571      5203
113740   1009897460      8879
114507   1022891031      8933
Are there infinitely many such occurrences?
I've subsequently found a couple more, so all such n I know of are
1, 3, 7, 8, 55, 75, 146, 204, 224, 679, 831, 860, 63057,
113740, 114507, 660479, 2329170, ...
This raises a more general question. Suppose f(n) is instead an
integer selected uniformly at random from the range 0 to n, and the
cumulative form F(n) is the sum of f(k) for k = 1 to n. What is the
expected density of values n for which n divides F(n)?
Robert Israel notes that if the range for f(n) were 0 to n-1
(instead of 0 to n), then since F(n) = F(n-1) + f(n), for any value
of F(n-1) there is exactly one choice of f(n) that makes F(n)
divisible by n. So the probability that n divides F(n) is exactly
1/n. With the range 0 to n, he remarks, the exact answer is more
complicated, but presumably very close to 1/n.
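Under that heuristic the expected number of occurrences up to N grows
like the harmonic sum H_N, roughly ln(N), so one would expect
infinitely many occurrences but with density tending to zero. The
following Monte Carlo sketch (hypothetical, not part of the original
note) draws each f(k) uniformly from 0 to n as in the question above
and compares the average number of n <= N dividing F(n) with H_N.

    import random

    def average_hits(N, trials=2000, seed=1):
        # Draw each f(n) uniformly from 0..n (inclusive), form the
        # running sum F(n), and count how many n <= N divide F(n);
        # return the average count over all trials.
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            total = 0
            for n in range(1, N + 1):
                total += rng.randint(0, n)
                if total % n == 0:
                    hits += 1
        return hits / trials

    N = 500
    H_N = sum(1.0 / n for n in range(1, N + 1))   # harmonic number, about ln N
    print(f"average hits up to {N}: {average_hits(N):.3f}   H_N = {H_N:.3f}")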