OPEN
Let $n\geq 1$ and define $L_n$ to be the lowest common multiple of $\{1,\ldots,n\}$ and $a_n$ by \[\sum_{1\leq k\leq n}\frac{1}{k}=\frac{a_n}{L_n}.\] Is it true that $(a_n,L_n)=1$ and $(a_n,L_n)>1$ both occur for infinitely many $n$?
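A minimal numerical sketch (in Python, not part of the original problem) of how one checks $\gcd(a_n,L_n)$ for small $n$: writing the harmonic sum $H_n$ as an exact fraction and scaling by $L_n$ gives $a_n$, after which both behaviours already appear, e.g. $\gcd(a_5,L_5)=1$ while $\gcd(a_6,L_6)=3$.

```python
from math import gcd, lcm
from fractions import Fraction

def gcd_a_L(n):
    # L_n = lcm(1, ..., n)
    L = lcm(*range(1, n + 1))
    # Harmonic sum H_n as an exact rational
    H = sum(Fraction(1, k) for k in range(1, n + 1))
    # a_n is the numerator of H_n when written over the denominator L_n;
    # H * L is an integer since L_n is a common multiple of 1, ..., n
    a = int(H * L)
    return gcd(a, L)

for n in range(1, 21):
    print(n, gcd_a_L(n))
```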