OPEN
Let $f$ be a Rademacher multiplicative function: a random $\{-1,0,1\}$-valued multiplicative function, where for each prime $p$ we independently choose $f(p)\in \{-1,1\}$ uniformly at random, and for squarefree integers $n=p_1\cdots p_r$ we set $f(n)=f(p_1)\cdots f(p_r)$ (with $f(n)=0$ if $n$ is not squarefree). Does there exist some constant $c>0$ such that, almost surely, \[\limsup_{N\to \infty}\frac{\sum_{m\leq N}f(m)}{\sqrt{N\log\log N}}=c?\]
Note that if we drop the multiplicative assumption, and simply assign $f(m)=\pm 1$ independently at random for every $m$, then this statement is true (with $c=\sqrt{2}$) by the law of the iterated logarithm.

Wintner [Wi44] proved that, almost surely, \[\sum_{m\leq N}f(m)\ll N^{1/2+o(1)},\] and Erdős improved the right-hand side to $N^{1/2}(\log N)^{O(1)}$. Lau, Tenenbaum, and Wu [LTW13] have shown that, almost surely, \[\sum_{m\leq N}f(m)\ll N^{1/2}(\log\log N)^{2+o(1)}.\] Caich [Ca24b] has improved this to \[\sum_{m\leq N}f(m)\ll N^{1/2}(\log\log N)^{3/4+o(1)}.\] Harper [Ha13] has shown that the sum is almost surely not $O(N^{1/2}/(\log\log N)^{5/2+o(1)})$, and conjectured that in fact Erdős' conjecture (that such a constant $c>0$ exists) is false, and that almost surely \[\sum_{m\leq N}f(m) \ll N^{1/2}(\log\log N)^{1/4+o(1)}.\]
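
To make the object concrete, here is a minimal Python sketch (not part of the problem; the cutoff $N$ and the sieve-based factorisation are illustrative choices) which samples a Rademacher multiplicative function and prints the partial sum $\sum_{m\leq N}f(m)$ alongside the normalisation $\sqrt{N\log\log N}$ appearing in the conjecture.

```python
import math, random

def rademacher_partial_sums(N, seed=0):
    """Sample f(p) in {-1,+1} uniformly and independently for each prime p <= N,
    extend multiplicatively to squarefree m (with f(m) = 0 otherwise), and return
    the partial sums S(n) = sum_{m <= n} f(m) for 0 <= n <= N."""
    rng = random.Random(seed)
    spf = list(range(N + 1))                  # smallest prime factor sieve
    for p in range(2, int(N**0.5) + 1):
        if spf[p] == p:
            for q in range(p * p, N + 1, p):
                if spf[q] == q:
                    spf[q] = p
    sign = {}                                 # the random signs f(p)
    S, sums = 0, [0]
    for m in range(1, N + 1):
        n, fm = m, 1
        while n > 1 and fm != 0:
            p = spf[n]
            n //= p
            if n % p == 0:                    # p^2 divides m, so f(m) = 0
                fm = 0
            else:
                if p not in sign:
                    sign[p] = rng.choice((-1, 1))
                fm *= sign[p]
        S += fm
        sums.append(S)
    return sums

N = 10**5
S = rademacher_partial_sums(N)
print(S[N], math.sqrt(N * math.log(math.log(N))))
```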

Additional thanks to: Mehtaab Sawhney
SOLVED
Let $f(z)=\sum_{0\leq k\leq n} \epsilon_k z^k$ be a random polynomial, where the $\epsilon_k\in \{-1,1\}$ are chosen independently and uniformly at random for $0\leq k\leq n$.

Is it true that the number of roots of $f(z)$ in $\{ z\in \mathbb{C} : \lvert z\rvert \leq 1\}$ is, almost surely, \[\left(\frac{1}{2}+o(1)\right)n?\]

Random polynomials with independent and identically distributed coefficients are sometimes called Kac polynomials; this problem considers the case of Rademacher coefficients, i.e. independent uniform $\pm 1$ values. Erdős and Offord [EO56] showed that the number of real roots of a random degree $n$ polynomial with $\pm 1$ coefficients is $(\frac{2}{\pi}+o(1))\log n$.

Solved by Yakir [Ya21], who proved that almost all such polynomials have \[\frac{n}{2}+O(n^{9/10})\] many roots in $\{ z\in \mathbb{C} : \lvert z\rvert \leq 1\}$.
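
For illustration, a short Python sketch (assuming numpy; the degree $n$ is an arbitrary choice, and numerical root-finding becomes delicate for degrees much larger than a few hundred) which counts the roots of one random $\pm 1$ polynomial in the closed unit disc:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                                    # degree of the random polynomial
eps = rng.choice([-1, 1], size=n + 1)      # coefficients eps_0, ..., eps_n
roots = np.roots(eps[::-1])                # np.roots expects the highest degree first
inside = int(np.sum(np.abs(roots) <= 1))
print(inside, n / 2)                       # Yakir: n/2 + O(n^{9/10}) roots for almost all sign choices
```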

See also [521].

Additional thanks to: Michal Bassan and Zachary Chase
SOLVED
Let $f(z)=\sum_{0\leq k\leq n} \epsilon_k z^k$ be a random polynomial, where the $\epsilon_k\in \{-1,1\}$ are chosen independently and uniformly at random for $0\leq k\leq n$.

Does there exist some constant $C>0$ such that, almost surely, \[\max_{\lvert z\rvert=1}\lvert f(z)\rvert=(C+o(1))\sqrt{n\log n}?\]

Salem and Zygmund [SZ54] proved that $\sqrt{n\log n}$ is the right order of magnitude, but not an asymptotic.

This was settled by Halász [Ha73], who proved this is true with $C=1$.
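
A quick numerical check of this (a Python sketch assuming numpy; the degree and the oversampling factor are arbitrary choices) evaluates $f$ at many roots of unity via the FFT and compares the maximum modulus with $\sqrt{n\log n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2**12
eps = rng.choice([-1, 1], size=n + 1)
# Values of f at the M-th roots of unity, via an FFT of the zero-padded
# coefficient vector; M is taken much larger than n so that the discrete
# maximum is a good proxy for the maximum over all |z| = 1.
M = 8 * (n + 1)
vals = np.fft.fft(eps, M)
print(np.max(np.abs(vals)) / np.sqrt(n * np.log(n)))   # Halász: this ratio tends to 1
```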

Additional thanks to: Adrian Beker
OPEN
For any $t\in (0,1)$ let $t=\sum_{k=1}^\infty \delta_k(t)2^{-k}$ be its binary expansion (so $\delta_k(t)\in \{0,1\}$), and set $\epsilon_k(t)=2\delta_k(t)-1\in \{-1,1\}$. What is the correct order of magnitude (for almost all $t\in(0,1)$) for \[M_n(t)=\max_{x\in [0,1]}\left\lvert \sum_{k\leq n}\epsilon_k(t)x^k\right\rvert?\]
A problem of Salem and Zygmund [SZ54]. Chung showed that, for almost all $t$, there exist infinitely many $n$ such that \[M_n(t) \ll \left(\frac{n}{\log\log n}\right)^{1/2}.\] Erdős (unpublished) showed that for almost all $t$ and every $\epsilon>0$ we have $\lim_{n\to \infty}M_n(t)/n^{1/2-\epsilon}=\infty$.
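
For a numerical illustration (a Python sketch assuming numpy; the degree and the grid on $[0,1]$ are arbitrary choices), one can sample the signs directly and compare $M_n(t)$ with Chung's scale $\sqrt{n/\log\log n}$ and with $\sqrt{n\log\log n}$, the law of the iterated logarithm scale for the partial sums of the signs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**4
signs = rng.choice([-1, 1], size=n)           # plays the role of eps_1(t), ..., eps_n(t)
x = np.linspace(0.0, 1.0, 4001)               # grid approximation of [0, 1]
coeffs = np.append(signs[::-1], 0.0)          # np.polyval wants the highest degree first
M_n = np.abs(np.polyval(coeffs, x)).max()
print(M_n, np.sqrt(n / np.log(np.log(n))), np.sqrt(n * np.log(np.log(n))))
```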
SOLVED
Is it true that all except at most $o(2^n)$ many degree $n$ polynomials $f(z)$ with $\pm 1$-valued coefficients satisfy $\lvert f(z)\rvert <1$ for some $z$ with $\lvert z\rvert=1$? What is the behaviour of \[m(f)=\min_{\lvert z\rvert=1}\lvert f(z)\rvert?\]
Random polynomials with independent and identically distributed coefficients are sometimes called Kac polynomials; this problem considers the case of Rademacher coefficients, i.e. independent uniform $\pm 1$ values. The first question asks whether $m(f)<1$ almost surely. Littlewood [Li66] conjectured that the stronger bound $m(f)=o(1)$ holds almost surely.

The answer to both questions is yes: Littlewood's conjecture was solved by Kashin [Ka87], and Konyagin [Ko94] improved this to show that $m(f)\leq n^{-1/2+o(1)}$ almost surely. This is essentially best possible, since Konyagin and Schlag [KoSc99] proved that for any $\epsilon>0$ \[\limsup_{n\to \infty} \mathbb{P}(m(f) \leq \epsilon n^{-1/2})\ll \epsilon.\] Cook and Nguyen [CoNg21] have identified the limiting distribution, proving that for any $\epsilon>0$ \[\lim_{n\to \infty} \mathbb{P}(m(f) > \epsilon n^{-1/2}) = e^{-\epsilon \lambda}\] where $\lambda$ is an explicit constant.
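
A rough Monte Carlo sketch in Python (assuming numpy; note that the discrete minimum over roots of unity only approximates the true minimum over $\lvert z\rvert=1$, which is why the number of sample points $M$ is taken much larger than $n^2$) illustrating the scaling $m(f)\approx n^{-1/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, M, trials = 256, 2**21, 50
scaled = []
for _ in range(trials):
    eps = rng.choice([-1, 1], size=n + 1)
    vals = np.fft.fft(eps, M)                 # f evaluated at the M-th roots of unity
    scaled.append(np.abs(vals).min() * np.sqrt(n))
print(np.median(scaled))                      # m(f) * sqrt(n) should be of constant order
```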

Additional thanks to: Mehtaab Sawhney
SOLVED
Let $a_n\geq 0$ with $a_n\to 0$ and $\sum a_n=\infty$. Find a necessary and sufficient condition on the $a_n$ such that, if arcs of length $a_n$ are placed on the unit circle independently and uniformly at random, then the whole circle is covered with probability $1$.
A problem of Dvoretzky [Dv56]. It is easy to see that (under the given conditions alone) almost all the circle is covered with probability $1$.

Kahane [Ka59] showed that $a_n=\frac{1+c}{n}$ with $c>0$ has this property, which Erdős (unpublished) improved to $a_n=\frac{1}{n}$. Erdős also showed that $a_n=\frac{1-c}{n}$ with $c>0$ does not have this property.

Solved by Shepp [Sh72], who showed that a necessary and sufficient condition is that \[\sum_n \frac{e^{a_1+\cdots+a_n}}{n^2}=\infty.\]
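
A small Python simulation (the circle is normalised to have circumference $1$, matching Shepp's condition; the choice $a_n=c/(n+1)$ and the checkpoints below are illustrative only) contrasting a sequence for which Shepp's sum diverges with one for which it converges:

```python
import random

def covers_circle(arcs):
    """Return True if the given arcs (start, length), with positions taken mod 1,
    cover the whole circle of circumference 1."""
    ivals = []
    for s, l in arcs:
        s %= 1.0
        if s + l >= 1.0:                      # the arc wraps past the point 0
            ivals.append((s, 1.0))
            ivals.append((0.0, s + l - 1.0))
        else:
            ivals.append((s, s + l))
    ivals.sort()
    reach = 0.0
    for a, b in ivals:
        if a > reach:
            return False                      # the gap (reach, a) is uncovered
        reach = max(reach, b)
    return reach >= 1.0

random.seed(0)
N = 10**5
for c in (1.5, 0.5):                          # for a_n = c/(n+1), Shepp's sum diverges iff c >= 1
    arcs = [(random.random(), c / (n + 1)) for n in range(1, N + 1)]
    print(c, [(m, covers_circle(arcs[:m])) for m in (10**3, 10**4, 10**5)])
```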

OPEN
Let $a_n\in \mathbb{R}$ be such that $\sum_n \lvert a_n\rvert^2=\infty$ and $\lvert a_n\rvert=o(1/\sqrt{n})$. Is it true that, for almost all $\epsilon_n=\pm 1$, there exists some $z$ with $\lvert z\rvert=1$ (depending on the choice of signs) such that \[\sum_n \epsilon_n a_n z^n\] converges?
It is unclear to me whether Erdős also intended to assume that $\lvert a_{n+1}\rvert\leq \lvert a_n\rvert$.

It is 'well known' that, for almost all $\epsilon_n=\pm 1$, the series diverges for almost all $\lvert z\rvert=1$ (assuming only $\sum \lvert a_n\rvert^2=\infty$).

Dvoretzky and Erdős [DE59] showed that if $\lvert a_n\rvert >c/\sqrt{n}$ then, for almost all $\epsilon_n=\pm 1$, the series diverges for all $\lvert z\rvert=1$.

OPEN
Let $d_k(n)$ be the expected distance from the origin after taking $n$ random steps from the origin in $\mathbb{Z}^k$, conditioned on having no self-intersections - that is, a self-avoiding walk. Is it true that \[\lim_{n\to \infty}\frac{d_2(n)}{n^{1/2}}= \infty?\] Is it true that \[d_k(n)\ll n^{1/2}\] for $k\geq 3$?
Slade [Sl87] proved that, for $k$ sufficiently large, $d_k(n)\sim Dn^{1/2}$ for some constant $D>0$ (depending on $k$). Hara and Slade ([HaSl91] and [HaSl92]) proved this for all $k\geq 5$.

For $k=2$ Duminil-Copin and Hammond [DuHa13] have proved that $d_2(n)=o(n)$.

It is now conjectured that $d_k(n)\ll n^{1/2}$ is false for $k=3$ and $k=4$, and more precisely (see for example Section 1.4 of [MaSl93]) that $d_2(n)\sim Dn^{3/4}$, $d_3(n)\sim n^{\nu}$ where $\nu\approx 0.59$, and $d_4(n)\sim D(\log n)^{1/8}n^{1/2}$.

Madras and Slade [MaSl93] have a monograph on the topic of self-avoiding walks.
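
A naive Python sketch of the $k=2$ case (rejection sampling of non-reversing walks, which is only feasible for small $n$; the parameters below are illustrative) estimating $d_2(n)$ and comparing it with $n^{1/2}$ and the conjectured $n^{3/4}$:

```python
import math, random

def sample_saw(n, rng):
    """Uniformly sample an n-step self-avoiding walk on Z^2 by rejection:
    generate non-reversing nearest-neighbour walks and discard any that
    self-intersect. Returns the endpoint (x, y)."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    while True:
        x, y, prev = 0, 0, None
        visited = {(0, 0)}
        ok = True
        for _ in range(n):
            choices = [s for s in steps if prev is None or s != (-prev[0], -prev[1])]
            dx, dy = rng.choice(choices)
            x, y = x + dx, y + dy
            if (x, y) in visited:             # walk intersects itself: reject and restart
                ok = False
                break
            visited.add((x, y))
            prev = (dx, dy)
        if ok:
            return x, y

rng = random.Random(0)
for n in (10, 20, 30):
    dists = [math.hypot(*sample_saw(n, rng)) for _ in range(300)]
    mean = sum(dists) / len(dists)
    print(n, round(mean / n**0.5, 3), round(mean / n**0.75, 3))
```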

See also [528].

Additional thanks to: Terence Tao