Metodološki zvezki, Vol. 3, No. 1, 2006, 1-7

Ratio of Two Random Variables: A Note on the Existence of its Moments

Anton Cedilnik1, Katarina Košmelj2, and Andrej Blejec3

1 Biotechnical Faculty, University of Ljubljana, Jamnikarjeva 101, 1000 Ljubljana, Slovenia; anton.cedilnik@bf.uni-lj.si
2 Biotechnical Faculty, University of Ljubljana, Jamnikarjeva 101, 1000 Ljubljana, Slovenia; katarina.kosmelj@bf.uni-lj.si
3 National Institute of Biology, University of Ljubljana, Večna pot 111, 1000 Ljubljana, Slovenia; andrej.blejec@nib.si

Abstract

To enable correct statistical inference, knowledge about the existence of moments is crucial. The objective of this paper is to study the existence of the moments of the ratio Z = X/Y, where X and Y are arbitrary random variables with the additional assumption P(Y = 0) = 0. We present three existence theorems, which show that the behaviour of the distribution of Y in the neighbourhood of zero is essential. Simple consequences of these theorems give evidence of the existence of the moments for particular random variables; some of these results are well known from standard probability theory, but here they are obtained in a simple way.

1 Introduction

The ratio of two normally distributed random variables occurs frequently in statistical analysis. From standard probability literature, see for example Johnson et al. (1994), it is known that the ratio of two centred normal variables Z = X/Y is a non-centred Cauchy variable. Marsaglia (1965) and Hinkley (1969) discussed the general situation [X Y]^T ~ N(μ_X, μ_Y, σ_X, σ_Y, ρ ≠ ±1). Cedilnik et al. (2004) studied the general situation as well, following the same procedure as Hinkley. They showed that the density of the ratio of two arbitrary normal variables can be expressed very neatly as a product of two factors. The first factor, the Cauchy part, is the density of a non-centred Cauchy variable Ca(a, b) with

    a = \rho\,\frac{\sigma_X}{\sigma_Y}\,, \qquad b = \frac{\sigma_X}{\sigma_Y}\sqrt{1-\rho^2}\,,

which is independent of the expected values μ_X and μ_Y. The second factor, the deviant part, is a complicated function of z (see Cedilnik et al., 2004).

For illustration, let us consider N(μ_X, μ_Y, σ_X = σ_Y = 1, ρ = 0.5). We vary μ_X from -4 to +4 with a step of 2, and μ_Y from 0 to 2 with a step of 0.5. The Cauchy part is the same for all these cases; the deviant part, however, takes different shapes. Figure 1a displays the Cauchy part and the deviant part, and Figure 1b the density, which is their product.

It should be pointed out that the ratio of two normally distributed random variables has no moments, because the asymptotic behaviour of its density is the same as that of the Cauchy part. To enable correct statistical inference, knowledge about the existence of the moments of the ratio is crucial. The objective of this paper is to study the existence of the moments of the ratio in the general setting.

Figure 1a: The Cauchy part and the deviant part for the ratio X/Y, where [X Y]^T ~ N(μ_X, μ_Y, σ_X = σ_Y = 1, ρ = 0.5). μ_X varies from -4 to +4 with a step of 2 (horizontally) and μ_Y from 0 to 2 with a step of 0.5 (vertically). The Cauchy part is constant over all these cases. For μ_X = μ_Y = 0 the deviant part equals 1.

Figure 1b: The density of the ratio X/Y, where [X Y]^T ~ N(μ_X, μ_Y, σ_X = σ_Y = 1, ρ = 0.5). μ_X varies from -4 to +4 with a step of 2 (horizontally) and μ_Y from 0 to 2 with a step of 0.5 (vertically).
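The non-existence of the moments is also easy to observe in simulation. The sketch below (Python with numpy; the parameter values μ_X = 2, μ_Y = 1, σ_X = σ_Y = 1, ρ = 0.5 correspond to one cell of the grid in Figure 1 and are chosen only for illustration) draws a large sample of Z = X/Y and follows the running sample mean, which fails to settle down as the sample grows, in agreement with the non-existence of E(Z).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: one cell of the grid in Figure 1
mu = np.array([2.0, 1.0])            # (mu_X, mu_Y)
rho = 0.5
cov = np.array([[1.0, rho],
                [rho, 1.0]])         # sigma_X = sigma_Y = 1

# Draw [X Y]^T from the bivariate normal distribution and form Z = X/Y
n = 1_000_000
xy = rng.multivariate_normal(mu, cov, size=n)
z = xy[:, 0] / xy[:, 1]

# Because the density of Z has Cauchy-like tails, E(Z) does not exist and
# the running sample mean keeps jumping instead of converging.
running_mean = np.cumsum(z) / np.arange(1, n + 1)
for k in (10**3, 10**4, 10**5, 10**6):
    print(k, running_mean[k - 1])
```

Rerunning the sketch with different seeds typically produces visibly different running means even at the largest sample size, which is exactly the behaviour expected of a variable without a first moment.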
2 Existence of the moments of the ratio

In what follows, we consider the random vector [X Y]^T, where X and Y are arbitrary random variables with the additional assumption P(Y = 0) = 0. Consequently, the ratio Z = X/Y is a well-defined random variable. We present three theorems on the existence of the moments and some consequences. The proofs are based on the well-known Hölder inequality, which we present in the Appendix in order to fix the notation used in the text.

Theorem 1. Suppose that there exists an ε > 0 such that P(|Y| < ε) = 0. Then Z has the moments of every order for which X has them.

Proof. Since |Y| ≥ ε almost surely, |Z|^k = |X|^k / |Y|^k ≤ ε^{-k} |X|^k, so E(|Z|^k) is finite whenever E(|X|^k) is. QED

Theorem 2. Let m and n be two positive real numbers (not necessarily integers), m > n, and let X have the moments of order ≤ m. Further, assume that there exist two positive real numbers ε and C such that for any δ within the interval 0 < δ < ε:

    P(|Y| < \delta) \le C\,\delta^{(1+\varepsilon)\frac{mn}{m-n}}                    (2.2)

Then Z has the moments of order ≤ n.

Proof. For n > 0, use Hölder's inequality with U = |X|^n, V = |Y|^{-n}, p = m/n, and q = m/(m-n):

    E(|Z|^n) = \int_\Omega |X|^n\,|Y|^{-n}\,dP
      \le \Big(\int_\Omega |X|^m\,dP\Big)^{n/m}\Big(\int_\Omega |Y|^{-\frac{mn}{m-n}}\,dP\Big)^{\frac{m-n}{m}}.

Since X has the moments of order m, the first factor on the right is finite. Let us consider the second factor, denoting T = |Y|^{-mn/(m-n)}. Estimation based on (2.2) gives

    P(T > t) = P\big(|Y| < t^{-\frac{m-n}{mn}}\big) \le C\,t^{-(1+\varepsilon)}

for any t which is large enough. We show further that under (2.2) the second factor is finite:

    \int_\Omega |Y|^{-\frac{mn}{m-n}}\,dP
      = \int_{0\le T\le 2^N} T\,dP + \sum_{j=0}^{\infty}\int_{2^{N+j} < T \le 2^{N+j+1}} T\,dP
      \le 2^N + \sum_{j=0}^{\infty} 2^{N+j+1}\,P(T > 2^{N+j})
      \le 2^N + \sum_{j=0}^{\infty} 2^{N+j+1}\,C\,2^{-(N+j)(1+\varepsilon)} < \infty,

for N large enough. QED

The condition (2.2) is very general because it holds for an arbitrary random variable Y. Example 1 presents its form for a continuous random variable Y; Example 2 provides the discrete case.

Example 1. Assume that X has the moments of order ≤ m, and let Y be a continuous random variable whose density v satisfies v(y) ≤ B·|y|^a (B > 0, a > -1) for -ε < y < ε. Then for any 0 < δ < ε

    P(|Y| < \delta) = \int_{-\delta}^{\delta} v(y)\,dy \le \frac{2B}{a+1}\,\delta^{a+1},

so (2.2) is fulfilled whenever mn/(m-n) < a + 1. Hence Z has the moments of order ≤ n for every n < m(a+1)/(m+a+1).

As a consequence, consider the Student t variable T with r degrees of freedom. It can be written as T = X/Y, where X ~ N(0, 1) and Y = √(χ²_r / r) are independent. X has all the moments, and Y has the density

    v(y) = \frac{2}{\Gamma(r/2)}\Big(\frac{r}{2}\Big)^{r/2} y^{r-1} e^{-r y^2/2}   (for y > 0),

which can be estimated in light of Example 1: v(y) ≤ B·|y|^{r-1} (for any real y). Then T has all the moments of order less than m/(1 + m/r) (where m is arbitrary), which is as close to r as one wants. Hence T has all the moments of order less than r, as is well known from standard probability theory.

Example 2. Let Y have a discrete distribution with a strictly decreasing infinite sequence (a_j) ↘ 0 of values, and P(Y = a_j) = v_j. Assume that there exist two positive real numbers ε and C such that, whenever a_N < ε, the following holds:

    \sum_{j\ge N} v_j \le C\,(a_N)^{(1+\varepsilon)\frac{mn}{m-n}}                    (2.3)

Then (2.2) is valid.

The next theorem describes the reverse situation: we study the existence of the moments of X, given the moments of Z and Y.

Theorem 3. If Y has the moments of order m and Z has the moments of order n, where m and n are positive (possibly not integers), then X has the moments of order j < mn/(m+n). The following relationship holds:

    E\big(|X|^j\big) \le \Big(E\big(|Y|^{jp}\big)\Big)^{1/p}\Big(E\big(|Z|^{jq}\big)\Big)^{1/q},
    \qquad 1/p + 1/q = 1 \ \text{and} \ 0 < j \le \min\{m/p,\,n/q\}.

Proof. The case j = 0 is trivial. Let us use Hölder's inequality for U = |Y|^j and V = |Z|^j for some j > 0. The estimation makes sense if jp ≤ m and jq ≤ n. Hence

    j \le \min\{m/p,\,n/q\} \le \max_{p}\,\min\{m/p,\,n/q\} = \frac{mn}{m+n};

the maximum is reached at p = (m+n)/n.
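The bound in Theorem 3 can be checked numerically. The sketch below (Python with numpy) uses an illustrative choice that is not taken from the text: Y exponential with mean 1, Z a Student t variable with 3 degrees of freedom and independent of Y, X = YZ, and j = 1 with p = q = 2. These choices satisfy the hypotheses of the theorem: Y has moments of every order and Z has the moments of order n = 2, so jp and jq stay within the available orders, and the Monte Carlo estimate of E(|X|^j) should stay below the estimated Hölder bound.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Illustrative choice (not from the text): Y exponential with mean 1,
# Z Student t with 3 degrees of freedom, independent of Y, and X = Y*Z.
y = rng.exponential(scale=1.0, size=n_samples)
z = rng.standard_t(df=3, size=n_samples)
x = y * z

# Theorem 3 with j = 1 and p = q = 2:
#   E|X| <= (E[Y^2])^(1/2) * (E[Z^2])^(1/2)
j, p, q = 1, 2.0, 2.0
lhs = np.mean(np.abs(x) ** j)
rhs = np.mean(np.abs(y) ** (j * p)) ** (1 / p) * np.mean(np.abs(z) ** (j * q)) ** (1 / q)
print("estimate of E(|X|^j):  ", lhs)   # exact value: 2*sqrt(3)/pi ~ 1.10
print("estimated Hoelder bound:", rhs)  # exact value: sqrt(6) ~ 2.45
```

Because E(Z^4) is infinite for the t variable with 3 degrees of freedom, the estimate of E(Z^2) converges slowly; nevertheless, the estimated bound remains well above the estimated E(|X|), in line with the theorem.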
Appendix

Hölder's inequality: Let (Ω, F, P) be a probability space, where Ω is a set of outcomes γ, F a Borel σ-algebra of events, and P the probability measure. Further, let U and V be random variables on Ω. Then

    \int_\Omega |U(\gamma)V(\gamma)|\,dP
      \le \Big(\int_\Omega |U(\gamma)|^p\,dP\Big)^{1/p}\Big(\int_\Omega |V(\gamma)|^q\,dP\Big)^{1/q}

for any pair of positive p, q with 1/p + 1/q = 1, provided the integrals on the right converge. If p = 1 and q = ∞, then

    \int_\Omega |U(\gamma)V(\gamma)|\,dP \le \int_\Omega |U(\gamma)|\,dP \cdot \operatorname{ess\,sup}_{\gamma\in\Omega} |V(\gamma)|.

References

[1] Cedilnik, A., Košmelj, K., and Blejec, A. (2004): The distribution of the ratio of jointly normal variables. Metodološki zvezki, 1, 99-108.

[2] Hinkley, D.V. (1969): On the ratio of two correlated normal random variables. Biometrika, 56, 3, 635-639.

[3] Johnson, N.L., Kotz, S., and Balakrishnan, N. (1994): Continuous Univariate Distributions, Vol. 1. New York: John Wiley and Sons.

[4] Marsaglia, G. (1965): Ratios of normal variables and ratios of sums of uniform variables. Journal of the American Statistical Association, 60, 193-204.