This website contains problems from math contests. Problems and corresponding tags were obtained from the Art of Problem Solving website.

The tags were heavily modified to better represent the problems.

Found problems: 5

2001 Miklós Schweitzer, 11

Let $\xi_{(k_1, k_2)}$, $k_1, k_2 \in\mathbb N$, be uniformly bounded random variables. Let $c_l$, $l\in\mathbb N$, be a strictly increasing sequence of positive reals such that $c_{l+1}/c_l$ is bounded. Let $d_l=\log \left(c_{l+1}/c_l\right)$, $l\in\mathbb N$, and suppose that $D_n=\sum_{l=1}^n d_l\uparrow \infty$ as $n\to\infty$. Suppose there exist $C>0$ and $\varepsilon>0$ such that $$\left| \mathbb E \left\{ \xi_{(k_1,k_2)}\xi_{(l_1,l_2)}\right\}\right| \leq C\prod_{i=1}^2 \left\{ \log_+\log_+\left( \frac{c_{\max\{ k_i, l_i\}}}{c_{\min\{ k_i, l_i\}}}\right)\right\}^{-(1+\varepsilon)}$$ for each $(k_1, k_2), (l_1,l_2)\in\mathbb N^2$, where $\log_+$ denotes the positive part of the natural logarithm. Show that $$\lim_{\substack{n_1\to\infty \\ n_2\to\infty}} \frac{1}{D_{n_1}D_{n_2}}\sum_{k_1=1}^{n_1} \sum_{k_2=1}^{n_2} d_{k_1}d_{k_2}\xi_{(k_1,k_2)}=0$$ almost surely. (translated by j___d)
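A quick numerical sanity check, not part of the problem: in the special case where the $\xi_{(k_1,k_2)}$ are i.i.d. Rademacher variables (so all cross-moments vanish and the correlation bound holds trivially) and $c_l=2^l$, we have $d_l=\log 2$ and $D_n=n\log 2$, and the weighted double average should visibly shrink toward $0$. A minimal sketch, with arbitrary sample sizes:

```python
import numpy as np

# Illustration only: xi i.i.d. Rademacher (uniformly bounded, mean zero),
# c_l = 2^l so d_l = log 2 and D_n = n*log 2.  The weighted double average
# should tend to 0 as n1, n2 grow.
rng = np.random.default_rng(0)

def weighted_average(n1, n2):
    xi = rng.choice([-1.0, 1.0], size=(n1, n2))
    d = np.full(max(n1, n2), np.log(2.0))
    D1, D2 = d[:n1].sum(), d[:n2].sum()
    return (np.outer(d[:n1], d[:n2]) * xi).sum() / (D1 * D2)

for n in [10, 100, 1000]:
    print(n, weighted_average(n, n))
```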

1986 Miklós Schweitzer, 10

Let $X_1, X_2, \ldots$ be independent, identically distributed random variables such that $X_i\geq 0$ for all $i$. Let $\mathrm E X_i=m$ and $\mathrm{Var} (X_i)=\sigma ^2<\infty$. Show that, for all $0<\alpha\leq 1$, $$\lim_{n\to\infty} n\,\mathrm{Var} \left( \left[ \frac{X_1+\ldots +X_n}{n}\right] ^\alpha\right)=\frac{\alpha ^ 2 \sigma ^ 2}{m^{2(1-\alpha)}}$$ [Gy. Michaletzki]
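The claimed limit matches the delta-method heuristic: with $g(x)=x^\alpha$, $\mathrm{Var}\, g(\bar X_n)\approx g'(m)^2\,\sigma^2/n=\alpha^2 m^{2(\alpha-1)}\sigma^2/n$; the point of the problem is to justify this rigorously. A minimal Monte Carlo sketch, not part of the problem, with an arbitrary choice of distribution ($X_i$ exponential with mean $2$) and $\alpha=1/2$:

```python
import numpy as np

# Illustration only: X_i ~ Exponential(mean=2), so m = 2 and sigma^2 = 4.
# With alpha = 0.5 the claimed limit is alpha^2 * sigma^2 / m^(2(1-alpha)) = 0.5.
rng = np.random.default_rng(0)
m, sigma2, alpha = 2.0, 4.0, 0.5
target = alpha**2 * sigma2 / m**(2 * (1 - alpha))

reps = 2000
for n in [100, 1000, 10000]:
    means = rng.exponential(scale=m, size=(reps, n)).mean(axis=1)
    print(n, n * np.var(means**alpha), "target:", target)
```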

1985 Miklós Schweitzer, 12

Let $(\Omega, \mathcal A, P)$ be a probability space, and let $(X_n, \mathcal F_n)$ be an adapted sequence in $(\Omega, \mathcal A, P)$ (that is, for the $\sigma$-algebras $\mathcal F_n$, we have $\mathcal F_1\subseteq \mathcal F_2\subseteq \dots \subseteq \mathcal A$, and for all $n$, $X_n$ is an $\mathcal F_n$-measurable and integrable random variable). Assume that $$\mathrm E (X_{n+1} \mid \mathcal F_n )=\frac12 X_n+\frac12 X_{n-1}\,\,\,\,\, (n=2, 3, \ldots ).$$ Prove that $\sup_n \mathrm E|X_n|<\infty$ implies that $X_n$ converges with probability one as $n\to\infty$. [I. Fazekas]
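One observation that a short simulation can illustrate (a hint-level remark, not part of the statement): $M_n=X_n+\tfrac12 X_{n-1}$ satisfies $\mathrm E(M_{n+1}\mid\mathcal F_n)=\mathrm E(X_{n+1}\mid\mathcal F_n)+\tfrac12 X_n=X_n+\tfrac12 X_{n-1}=M_n$, so $(M_n)$ is a martingale. A minimal sketch that builds one adapted sequence satisfying the recursion (the noise model is an arbitrary choice of mine) and shows the path settling down:

```python
import numpy as np

# Illustration only: build one adapted sequence with
# E(X_{n+1} | F_n) = (X_n + X_{n-1})/2 by adding conditionally mean-zero,
# geometrically shrinking noise.
rng = np.random.default_rng(0)
N = 200
X = np.empty(N)
X[0], X[1] = 1.0, 0.0
for n in range(1, N - 1):
    noise = rng.choice([-1.0, 1.0]) * 0.5**n   # mean zero given the past
    X[n + 1] = 0.5 * X[n] + 0.5 * X[n - 1] + noise

print(X[-5:])   # the tail of the path is essentially constant
```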

2010 Miklós Schweitzer, 11

The maximal correlation of two real-valued random variables $X$ and $Y$ is the supremum of the correlations of $f(X)$ and $g(Y)$ over all Borel measurable functions $f, g:\mathbb{R}\to\mathbb{R}$ for which $f(X)$ and $g(Y)$ have finite, nonzero variance. Let $U$ be a random variable uniformly distributed on $[0,2\pi]$, and let $n$ and $m$ be positive integers. Compute the maximal correlation of $\sin(nU)$ and $\sin(mU)$. (Translated from the Hungarian; translation thanks to @tintarn. Note that one may assume without loss of generality that $\mathrm E[f(X)]=\mathrm E[g(Y)]=0$ and $\mathrm E[f(X)^2]=\mathrm E[g(Y)^2]=1$.)
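A rough numerical experiment, not part of the problem: for finite discrete variables, the maximal correlation equals the second largest singular value of the matrix $Q_{ij}=P_{ij}/\sqrt{p_i q_j}$, where $P$ is the joint distribution and $p$, $q$ are its marginals, so discretizing $U$ and binning $\sin(nU)$ and $\sin(mU)$ gives an estimate. A minimal sketch, with arbitrary grid and bin sizes:

```python
import numpy as np

# Illustration only: rough estimate of the maximal correlation of sin(nU) and
# sin(mU) by discretizing U and binning the two values.  For finite discrete
# variables the maximal correlation is the second largest singular value of
# Q_ij = P_ij / sqrt(p_i * q_j), where P is the joint distribution.
def max_corr_estimate(n, m, grid=200000, bins=200):
    u = (np.arange(grid) + 0.5) * 2 * np.pi / grid
    x, y = np.sin(n * u), np.sin(m * u)
    P, _, _ = np.histogram2d(x, y, bins=bins, range=[[-1, 1], [-1, 1]])
    P /= P.sum()
    p, q = P.sum(axis=1), P.sum(axis=0)
    mask_i, mask_j = p > 0, q > 0
    Q = P[np.ix_(mask_i, mask_j)] / np.sqrt(np.outer(p[mask_i], q[mask_j]))
    s = np.linalg.svd(Q, compute_uv=False)
    return s[1]   # s[0] = 1 corresponds to the constant functions

for n, m in [(1, 1), (1, 2), (1, 3), (2, 3)]:
    print(n, m, round(max_corr_estimate(n, m), 3))
```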

2021 Alibaba Global Math Competition, 4

Let $(\Omega, \mathcal{A},\mathbb{P})$ be a standard probability space, and let $\mathcal{X}$ be the set of all bounded random variables. For $t>0$, define the mapping $R_t$ by \[R_t(X)=t\log \mathbb{E}[\exp(X/t)], \quad X \in \mathcal{X},\] and for $t \in (0,1)$ define the mapping $Q_t$ by \[Q_t(X)=\inf\{x \in \mathbb{R}: \mathbb{P}(X>x) \le t\}, \quad X \in \mathcal{X}.\] For two mappings $f,g:\mathcal{X} \to [-\infty,\infty)$, define the operator $\square$ by \[f\square g(X)=\inf\{f(Y)+g(X-Y): Y \in \mathcal{X}\}, \quad X \in \mathcal{X}.\] (a) Show that, for $t,s>0$, \[R_t \square R_s=R_{t+s}.\] (b) Show that, for $t,s>0$ with $t+s<1$, \[Q_t \square Q_s=Q_{t+s}.\]
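For part (a), a natural candidate splitting is $Y=\tfrac{t}{t+s}X$: since $R_t(aX)=t\log\mathbb E[\exp(aX/t)]$, it gives $R_t\!\left(\tfrac{t}{t+s}X\right)+R_s\!\left(\tfrac{s}{t+s}X\right)=(t+s)\log\mathbb E[\exp(X/(t+s))]=R_{t+s}(X)$, so the infimum is at most $R_{t+s}(X)$; the reverse inequality is the substance of the problem. A minimal numerical check on a finite sample space, not part of the problem (the six-point space and the specific $X$, $t$, $s$ are arbitrary choices):

```python
import numpy as np
from scipy.optimize import minimize

# Illustration only: check R_t □ R_s = R_{t+s} on a finite probability space,
# where Y ranges over all real vectors on the sample space.
rng = np.random.default_rng(0)
p = rng.dirichlet(np.ones(6))          # probabilities of a 6-point sample space
X = rng.uniform(-1, 1, size=6)         # a bounded random variable
t, s = 0.7, 1.3

def R(tau, Z):
    return tau * np.log(np.dot(p, np.exp(Z / tau)))

# Direct minimization of R_t(Y) + R_s(X - Y) over Y.
res = minimize(lambda Y: R(t, Y) + R(s, X - Y), x0=np.zeros(6))

print("inf_Y R_t(Y)+R_s(X-Y)  ~", res.fun)
print("R_{t+s}(X)              =", R(t + s, X))
print("candidate Y = tX/(t+s) :", R(t, t / (t + s) * X) + R(s, s / (t + s) * X))
```

All three printed values should agree up to optimizer tolerance.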