I roll a fair die with $n>1$ sides until the most recent roll is smaller than the previous one. Let $E_n$ be the expected number of rolls. Do we have $\lim_{n\to\infty} E_n < \infty$? If not, what about $\lim_{n\to\infty} E_n/n$ and $\lim_{n\to\infty} E_n/\log(n)$?

Quick simulations suggest that this expectation may be bounded for all $n$. Estimated means: $n=5$: 2.05; $n=15$: 1.81; $n=45$: 1.75; $n=100$: 1.74. The number of dice rolls looks like a Poisson distribution. – Gilles Mordant, Nov 13 '20 at 11:09

I counted the number of rolls you need to perform once you have got the first roll to compare with, indeed. – Gilles Mordant, Nov 13 '20 at 11:17

Fun fact: based on simulations, it looks like we have $E_n \to e$ as $n\to \infty$. – Gilles Mordant, Nov 13 '20 at 11:27

The number of rolls required converges in distribution to $1+X$ for $X=\max\{n: U_1<U_2<\dots<U_n\}$, where the $U_i$ are i.i.d. uniform on $[0,1]$ (or indeed, the $U_i$ may be i.i.d. from any continuous distribution). The probability that $U_1<U_2<\dots<U_n$ is $1/n!$ (since all $n!$ orderings are equally likely). – James Martin, Nov 13 '20 at 13:17

@GillesMordant: You wrote: "I counted the number of rolls you need to perform once you have got the first roll to compare with, indeed." It sounds as if that would mean the smallest possible value is $1$, whereas with the Poisson distribution it is $0$. – Michael Hardy, Nov 15 '20 at 3:33
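The quick simulations mentioned in the comments can be reproduced with a short Monte Carlo sketch (the function names are mine, not any commenter's code). Note that it counts *all* rolls, including the first, so for $n=5$ it lands near $3.05$ rather than the $2.05$ quoted above, which excludes the first roll:

```python
import random

def rolls_until_descent(n, rng):
    """Roll a fair n-sided die until a roll is strictly smaller than the
    previous one; return the total number of rolls (descending roll included)."""
    prev = rng.randint(1, n)
    count = 1
    while True:
        cur = rng.randint(1, n)
        count += 1
        if cur < prev:
            return count
        prev = cur

def estimate_E(n, trials=200_000, seed=0):
    """Monte Carlo estimate of E_n."""
    rng = random.Random(seed)
    return sum(rolls_until_descent(n, rng) for _ in range(trials)) / trials
```

For large $n$, `estimate_E(n)` hovers close to $e \approx 2.718$, consistent with the "fun fact" above.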
By doing casework on the second-to-last roll $m$, one has $$E_n = \sum_{k=2}^\infty k\, p_k^{(n)},$$ where $$p_k^{(n)} = \sum_{m=1}^n n^{-(k-1)}\binom{m+k-3}{k-2}\frac{m-1}{n};$$ here $\binom{m+k-3}{k-2}$ counts the nondecreasing sequences of length $k-1$ ending at $m$, and $\frac{m-1}{n}$ is the probability that the next roll is smaller than $m$. Note, for any fixed $k$, $$\lim_{n \to \infty} p_k^{(n)} = \frac{k-1}{k!}.$$ Therefore, by an easy dominated convergence argument, one has $$\lim_{n \to \infty} E_n = \sum_{k=2}^\infty k\,\frac{k-1}{k!} = e.$$
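The casework formula can be sanity-checked with exact rational arithmetic, truncating the series at a large $K$ (the tail is geometrically small). A minimal sketch, with names of my own choosing:

```python
from fractions import Fraction
from math import comb

def p_k(n, k):
    """p_k^{(n)}: probability that the first descent occurs at roll k (k >= 2).
    comb(m + k - 3, k - 2) counts nondecreasing length-(k-1) sequences ending at m."""
    return sum(Fraction(comb(m + k - 3, k - 2), n ** (k - 1)) * Fraction(m - 1, n)
               for m in range(1, n + 1))

def E_truncated(n, K=200):
    """Truncation of E_n = sum_{k>=2} k * p_k^{(n)}."""
    return float(sum(k * p_k(n, k) for k in range(2, K + 1)))
```

For $n=6$, `E_truncated(6)` agrees with the closed form $(6/5)^6$ to floating-point precision, and the $p_k^{(6)}$ sum to $1$ as they should.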
The answer may be expressed more simply; in fact $E_n = \left( \frac{n}{n-1}\right)^n$.
Update 1: (The following was independently obtained by Pierre PC, I just found out after I finished typing.) The following is a simple way to see this. Let $E_{n,\ell}$ denote the expected number of remaining rolls if the last roll is $\ell$. We then have the formula \begin{equation*} E_{n,\ell} = \underbrace{1\cdot \frac{1}{n} + \dotsb + 1\cdot \frac{1}{n}}_{\ell-1} + \sum_{k=\ell}^n \left(1+E_{n,k}\right)\cdot \frac{1}{n}. \end{equation*} Solving these backwards (i.e. solving for $E_{n,n}, E_{n,n-1}, \dotsc, E_{n,1}$ in that order) gives $E_{n,\ell}=\left( \frac{n}{n-1}\right)^{n-\ell+1}$. It then follows that \begin{equation*} E_n = 1 + \sum_{\ell=1}^n E_{n,\ell}\cdot \frac{1}{n} = \left( \frac{n}{n-1}\right)^n. \end{equation*}
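The backward solve is mechanical enough to check in a few lines of exact rational arithmetic (a sketch of my own, assuming the recursion as stated):

```python
from fractions import Fraction

def E_recursive(n):
    """Solve E_{n,l} = 1 + (E_{n,l} + ... + E_{n,n})/n backwards for
    l = n, n-1, ..., 1, then return E_n = 1 + (1/n) * sum_l E_{n,l}."""
    E = {}
    for l in range(n, 0, -1):
        tail = sum((E[k] for k in range(l + 1, n + 1)), Fraction(0))
        # rearranged: E_l * (1 - 1/n) = 1 + tail/n
        E[l] = (1 + tail / n) / Fraction(n - 1, n)
    # each value matches the closed form E_{n,l} = (n/(n-1))^(n-l+1)
    assert all(E[l] == Fraction(n, n - 1) ** (n - l + 1) for l in E)
    return 1 + sum(E.values(), Fraction(0)) / n
```

The return value is exactly $\left(\frac{n}{n-1}\right)^n$ as a `Fraction`.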
Update 2: Here is a derivation starting from mathworker21's formula. We have \begin{align*} E_n & {}= \sum_{k=2}^{\infty} k \sum_{m=1}^n n^{-(k-1)} \binom{m+k-3}{k-2} \frac{m-1}{n}\\ & {}= \sum_{m=1}^n (m-1) \sum_{k=2}^{\infty} k \binom{m+k-3}{k-2} n^{-k}\\ & {}= \sum_{m=1}^n \frac{m-1}{n^2} \sum_{k=0}^{\infty} (k+2) \binom{m+k-1}{k} n^{-k}\\ & {}= \sum_{m=1}^n \frac{m-1}{n^2} \sum_{k=0}^{\infty} (k+2) \binom{-m}{k} \left( -\frac{1}{n} \right)^k, \end{align*} where the last step uses $\binom{m+k-1}{k} = (-1)^k \binom{-m}{k}$. Now massaging the sum and using the binomial series twice gives \begin{align*} \sum_{k=0}^{\infty} (k+2) \binom{-m}{k} x^k & {}= 2 \sum_{k=0}^{\infty} \binom{-m}{k} x^k + \sum_{k=1}^{\infty} k \binom{-m}{k} x^k\\ & {}= 2 (1+x)^{-m} + \sum_{k=1}^{\infty} (-m) \binom{-(m+1)}{k-1} x^k\\ & {}= 2 (1+x)^{-m} - m x (1+x)^{-(m+1)}\\ & {}= \frac{2-(m-2)x}{(1+x)^{m+1}}. \end{align*} Inserting $x=-\frac{1}{n}$ then gives \begin{align*} E_n & {}= \sum_{m=1}^n \frac{m-1}{n^2} \left( \frac{2+\frac{m-2}{n}}{\left(1-\frac{1}{n}\right)^{m+1}}\right)\\ & {}= \frac{1}{n^2 (n-1)} \sum_{m=1}^n (m-1)(2 n+m-2) \left( \frac{n}{n-1} \right)^m\\ & {}= -\frac{2}{n^2} \left(\sum_{m=1}^n \left( \frac{n}{n-1} \right)^m\right) + \frac{2 n-3}{n^2 (n-1)} \left(\sum_{m=1}^n m \left( \frac{n}{n-1} \right)^m\right) + \frac{1}{n^2 (n-1)} \left(\sum_{m=1}^n m^2 \left( \frac{n}{n-1} \right)^m\right). \end{align*} Finally using \begin{equation*} \sum_{m=1}^n x^m = \frac{x}{1-x} \left( 1-x^n \right), \qquad \sum_{m=1}^n m x^m = \frac{x}{(1-x)^2} \left( n x^{n+1} - (n+1) x^n + 1\right) \qquad\text{ and }\qquad \sum_{m=1}^n m^2 x^m = \frac{x}{(1-x)^3} \left( (1+x) - x^n \left( n^2 (1-x)^2 +2 n (1-x) + x + 1 \right) \right) \end{equation*} one then obtains $E_n=\left(\frac{n}{n-1}\right)^n$ after heavy simplification.
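The three finite-sum identities, and the intermediate single-sum expression for $E_n$, can all be verified exactly with rationals. A quick sketch (function names are mine):

```python
from fractions import Fraction

def check_identities(n, x):
    """Verify the three finite geometric-type sums used above, exactly."""
    x = Fraction(x)
    assert sum(x**m for m in range(1, n + 1)) == x / (1 - x) * (1 - x**n)
    assert sum(m * x**m for m in range(1, n + 1)) == \
        x / (1 - x)**2 * (n * x**(n + 1) - (n + 1) * x**n + 1)
    assert sum(m * m * x**m for m in range(1, n + 1)) == \
        x / (1 - x)**3 * ((1 + x) - x**n * (n**2 * (1 - x)**2
                                            + 2 * n * (1 - x) + x + 1))
    return True

def E_from_m_sum(n):
    """The intermediate expression
    E_n = (1/(n^2(n-1))) * sum_m (m-1)(2n+m-2) (n/(n-1))^m."""
    r = Fraction(n, n - 1)
    return sum((m - 1) * (2 * n + m - 2) * r**m
               for m in range(1, n + 1)) / (n * n * (n - 1))
```

`E_from_m_sum(n)` returns exactly $\left(\frac{n}{n-1}\right)^n$, confirming the "heavy simplification" without doing it by hand.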
Write $\alpha_i$ for the expected number of further rolls starting from “something then $i$”. Then $$ \alpha_i = \frac1n(\alpha_i+1) + \dotsb + \frac1n(\alpha_n+1) + \frac{i-1}n \cdot 1 = 1 + \frac{\alpha_i + \cdots + \alpha_n}n. $$ From this one can deduce $\alpha_i = \left(\frac{n}{n-1}\right)^{n-i+1}$, and, counting the first roll, $$ E_n = 1 + \frac{\alpha_1+\cdots+\alpha_n}n = \left(\frac n{n-1}\right)^n $$ as explained by Kasper Andersen. Of course the limit is $e$.
The way I got the expression for $\alpha_i$ is setting $\beta_i = \alpha_i/n$, so that the recursion rearranges to $(n-1)\beta_i = \beta_{i+1}+\dotsb+\beta_n+1$, and realising that $$ (n-1)(\beta_{i-1} - \beta_i) = \left(\beta_i+\dotsb+\beta_n+1\right) - \left(\beta_{i+1}+\cdots+\beta_n+1\right) = \beta_i, $$ from which we deduce $\beta_{i-1} = \frac n{n-1}\beta_i$; then from $\alpha_n = 1+\alpha_n/n$ we get $\beta_n = 1/(n-1)$ and $\beta_i=\frac1n\left(\frac n{n-1}\right)^{n-i+1}$.
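Rather than re-solving the system, one can plug the claimed $\beta_i$ back into the defining relation and confirm it holds exactly (a small check of my own, not part of the answer):

```python
from fractions import Fraction

def check_beta(n):
    r = Fraction(n, n - 1)
    # claimed solution: beta_i = (1/n) * r^(n-i+1), for i = 1..n
    beta = {i: r ** (n - i + 1) / n for i in range(1, n + 1)}
    # defining relation n*beta_i = 1 + beta_i + ... + beta_n,
    # i.e. alpha_i = 1 + (alpha_i + ... + alpha_n)/n
    for i in range(1, n + 1):
        assert n * beta[i] == 1 + sum(beta[j] for j in range(i, n + 1))
    assert beta[n] == Fraction(1, n - 1)
    # and 1 + sum_i beta_i recovers E_n = (n/(n-1))^n
    assert 1 + sum(beta.values()) == r ** n
    return True
```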
Yet another derivation of the formula $E_n=\left(\frac{n}{n-1}\right)^n$ given by Kasper Andersen.
Write $[m]=\{1,2,\dots, m\}$.
There are $\binom{n+k-1}{k}$ nondecreasing functions from $[k]$ to $[n]$.
(Why? The number of such functions $f$ is the same as the number of strictly increasing functions $g$ from $[k]$ to $[n+k-1]$: there is a bijection given by $g(i)=f(i)+i-1$. The number of those is $\binom{n+k-1}{k}$, since for every subset of $[n+k-1]$ of size $k$, there is precisely one strictly increasing function from $[k]$ to $[n+k-1]$ with that image set.)
Let $X$ be the number of rolls before the first descent is observed. Then $X\geq k$ if and only if the first $k$ rolls give a nondecreasing function from $[k]$ to $[n]$. The total number of possibilities for the first $k$ rolls is $n^k$, and they are all equally likely, so the probability that $X\geq k$ is $\frac{1}{n^k}\binom{n+k-1}{k}$.
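Both counting claims are easy to confirm by brute force for small $n$ and $k$ (a quick sketch, not part of the original answer):

```python
from itertools import product
from math import comb

def count_nondecreasing(n, k):
    """Brute-force count of nondecreasing sequences of length k over {1,...,n}."""
    return sum(1 for seq in product(range(1, n + 1), repeat=k)
               if all(a <= b for a, b in zip(seq, seq[1:])))

def p_tail(n, k):
    """P(X >= k): probability that the first k rolls are nondecreasing."""
    return comb(n + k - 1, k) / n ** k

# the bijection argument: the count equals C(n+k-1, k)
assert all(count_nondecreasing(n, k) == comb(n + k - 1, k)
           for n in range(1, 5) for k in range(1, 6))
```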
Then \begin{align*} E_n &= \mathbb{E}(1+X)\\ &=1+\mathbb{E}(X)\\ &=1+\sum_{k=1}^\infty \mathbb{P}(X\geq k)\\ &=1+\sum_{k=1}^\infty \frac{1}{n^k}\binom{n+k-1}{k}\\ &=1+\frac{1}{n}\binom{n}{1} + \frac{1}{n^2}\binom{n+1}{2} +\frac{1}{n^3}\binom{n+2}{3} +\dots\\ &=\left(1-\frac{1}{n}\right)^{-n} \end{align*} by the binomial series $(1-x)^{-n} = \sum_{k=0}^\infty \binom{n+k-1}{k} x^k$ with $x=\frac{1}{n}$.
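The tail-sum computation is equally easy to check numerically, truncating the series at a large $K$ (names are mine):

```python
from math import comb

def E_via_tail_sums(n, K=300):
    """E_n = 1 + sum_{k>=1} P(X >= k), truncated at K; the tail is negligible."""
    return 1 + sum(comb(n + k - 1, k) / n ** k for k in range(1, K + 1))
```

For instance, `E_via_tail_sums(6)` matches $\left(1-\frac{1}{6}\right)^{-6} = (6/5)^6$ to floating-point precision.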

Hi James, thanks for your great answer! I believe we played Bach's Sonatas (cello and piano) in the Merton Music room in the academic year 2004/2005. Best regards to Oxford. – Nov 14 '20 at 6:25