In Jun Shao's book there is the following example:
We usually try to find a Bayes rule or a minimax rule in a parametric problem where $P = P_\theta$ for a $\theta \in \Theta \subset \mathbb{R}^k$. Consider the special case of $\Theta \subset \mathbb{R}$ and $L(\theta,a) = (\theta-a)^2$, the squared error loss. Note that
$$r_T(\Pi) = \int_{\mathbb{R}} E[\theta - T(X)]^2 \, d\Pi(\theta),$$
which is equivalent to $E[\tilde{\theta} - T(X)]^2$, where $\tilde{\theta}$ is a random variable having the distribution $\Pi$ and, given $\tilde{\theta} = \theta$, the conditional distribution of $X$ is $P_\theta$. Then, the problem can be viewed as a prediction problem for $\tilde{\theta}$ using functions of $X$. Using previous results, the best predictor is $E(\tilde{\theta} \mid X)$, which is the $\Im$-Bayes rule w.r.t. $\Pi$ with $\Im$ being the class of rules $T(X)$ satisfying $E[T(X)]^2 < \infty$ for any $P_\theta$.
What I do not understand is the expression $E[\tilde{\theta} - T(X)]^2$: $\tilde{\theta}$ and $T(X)$ live on different spaces with different measures. Please help!
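My guess (only a sketch; I assume the expectation is taken with respect to the joint distribution of $(\tilde{\theta}, X)$ on $\Theta \times \mathcal{X}$, where $\mathcal{X}$ is the sample space, which is not named explicitly in the quote) is that
$$E[\tilde{\theta} - T(X)]^2 = \int_{\Theta}\int_{\mathcal{X}} [\theta - T(x)]^2 \, dP_\theta(x)\, d\Pi(\theta) = \int_{\mathbb{R}} E[\theta - T(X)]^2 \, d\Pi(\theta) = r_T(\Pi),$$
with the joint measure built from $\Pi$ and the conditional distributions $P_\theta$, and the middle equality by Fubini. Is this the intended reading, or is some other construction of the joint measure meant?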