
Questions tagged [marginal-probability]

Marginal probability arises from a joint probability measure on a product space. The marginal probability distributions are the push-forward measures induced by the coordinate projections. A marginal probability is the probability of a single cylinder-set event, in contrast to a joint or conditional probability, in which additional events are considered.
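For example, if a pair $(X, Y)$ has joint pmf $p_{X,Y}$ or joint density $f_{X,Y}$ (the notation used in most of the questions below), the marginal of $X$ is obtained by summing or integrating out the other coordinate, and symmetrically for $Y$:
$$p_X(x)=\sum_{y} p_{X,Y}(x,y), \qquad f_X(x)=\int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy.$$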

-2 votes
1 answer
22 views

Understanding random variables; marginal densities [closed]

I need help in solving this class example. I am quite lost on how to go about it. Any help is very much appreciated. (image of the example)
munarh
  • 1
0 votes
0 answers
24 views

Gaussian processes for Machine Learning, Rasmussen, Exercise 5.2

Problem: Consider the difference between the log marginal likelihood given by $\sum_i \log p(y_i|\{y_j, j<i\})$ and the LOO-CV using log probability, which is given by $\sum_i \log p(y_i| \{y_j, j\...
Mashe Burnedead
0 votes
2 answers
46 views

Joint PMF to marginal PMF

Consider $X$ and $Y$ as rvs with joint PMF $p_{X,Y}(x, y)$. As I understand it, we can always get $p_X(x)$ and $p_Y(y)$ by taking the marginal PMF; it doesn't matter whether they are dependent or not. As I also ...
Pyrettt Pyrettt
3 votes
1 answer
73 views

Is it possible to decompose the joint conditions?

Given three events $E$, $A$ and $B$, is it possible to decompose the joint-conditional probability $P(E \vert A, B)$ as an expression in terms of non-joint conditional probabilities and marginal ...
Phil
  • 183
0 votes
1 answer
49 views

The marginal distribution and marginal density of e^x

I have the probability density $$f(x,y)= \begin{cases} e^{-x}& \text{for $0\leq y\leq x$}\\ 0&\text{otherwise}\\ \end{cases}$$ Am I correct to assume that the marginal densities are $f_x(x)=xe^...
user609603
0 votes
2 answers
54 views

Joint PDF: Finding the marginal

$f(x,y)=\begin{cases} 0.5, & \text{$|x|+|y| < 1$} \\ 0, & \text{otherwise} \end{cases}$ Find the marginal of $X$. My attempt: here I assume that since it's an absolute value, it should start from $...
epickid
  • 15
1 vote
0 answers
32 views

Joint probability table not adding up to 1

I was asked this question in my exam today: Suppose there is a group of 9 students at a university. 4 of them study economics, 3 study business, and the rest study engineering. Suppose ...
gnzlama
  • 187
0 votes
0 answers
58 views

I can't seem to understand marginal probabilities.

In the book Pattern Recognition and Machine Learning, CM Bishop has elaborated on calculating marginal, conditional and joint probabilities by creating a table with rows and columns being outcomes of ...
DeadAsDuck
0 votes
1 answer
96 views

Prove $X$ and $Y$ with $X \sim \text{Unif}(0, 1)$ and $Y = X$ have no joint probability density function

Prove that the two random variables $X$ and $Y$ with $X \sim \text{Unif}(0, 1)$ and $Y = X$ have no joint probability density function (PDF), while each margin has a PDF. Here is my progress, please ...
stryx
  • 47
2 votes
1 answer
74 views

Can marginals determine all probabilities?

Suppose we are given $n$ binary random variables $X_1,\dots,X_n$. A probability distribution $P$ assigns every elementary event a probability: $$ P(X_1=x_1\&\ldots \& X_n=x_n)\in[0,1].$$ A ...
Juergen
  • 19
2 votes
1 answer
82 views

Probability density of two random variables using characteristic function

I've been trying to solve the following question: $X$ and $Y$ are two real random variables with a probability density of $$f(x,y) = e^{-y}\,\mathbb{1}_{0<x<y}(x,y)$$ where $\mathbb{1}$ is ...
LeA
  • 55
1 vote
1 answer
51 views

Marginal density functions of $X$ and $Y$

Let $R$ be the region bounded by $y=0$, $y=1$, $x=1$, $y=\sqrt{2x}$. Choose a point $(X, Y)$ uniformly at random from the bounded region. I know that $$f_{X,Y}(x,y) = \frac{1}{\text{area}(R)} = \frac{6}{5}, \...
Felix
  • 131
0 votes
2 answers
392 views

Marginalisation when one variable is discrete and the other is continuous

If we have two discrete random variables $X$ and $Y$, $$p_X(x) = \sum_y p_{XY}(x,y) = \sum_y p_{X|Y}(x|y)p_Y(y) = \sum_y \mathbb{P}(X = x|Y=y) \mathbb P(Y=y)$$ Similarly, if we have two continuous ...
Blahblahblacksheep
2 votes
2 answers
144 views

Compute marginal density given conditional density

Given a continuous random variable $X$ with conditional pdfs $$f_{X=x|Y=0}\qquad \text{and}\qquad f_{X=x|Y=1}$$ and the probability of a discrete random variable $Y$ as $P(Y=0)=0.1,P(Y=1)=0.9$, how ...
Alex99pd
1 vote
1 answer
227 views

Calculating marginal mean, variance, and pmf

Suppose that $Y|X \sim\operatorname{Poisson}(cX)$ where $c>0$ is a constant and $X\sim\operatorname{Exp}(1)$ (a) Find the marginal mean and variance of $Y$. (b) Find the marginal pmf of $Y$. For ...
Pierre
  • 85
0 votes
1 answer
33 views

If $\pi\in P(X^2\times X^2)$ has marginals $\delta_x\times\delta_y$ and $\delta_x\times\nu$, then $\pi=\delta_x\times\delta_y\times\delta_x\times\nu$?

Let $X = \mathbb{R}$ and suppose that the probability measure $\pi\in P(X^2\times X^2)$ has the marginals $\delta_x\times\delta_y$ and $\delta_x\times\nu$, where $x, y \in X$ and $\nu$ is a ...
ViktorStein
  • 4,878
-1 votes
1 answer
23 views

Construct a new probability measure without changing its marginals

Disclaimer: This thread is meant as a record. See: SE blog: Answer own Question and MSE meta: Answer own Question. Anyway, it is written as a problem. Have fun! :) Let $X, Y$ be second-countable ...
Akira
  • 17.9k
2 votes
2 answers
481 views

(Expectation) Consider the joint density $f(x,y)=c(x-y)e^{-x}$, $0 \le y \le x$.

Consider the joint density $f(x,y)=c(x-y)e^{-x}$, $0 \le y \le x$. a) Determine the value of $c$. b) Calculate the marginal of $Y$. c) Calculate the expectation $E(Y)$. a) $1=\int_0^x c(x-y)e^{-x} = \frac{...
Bruno
  • 37
0 votes
1 answer
19 views

Unable to prove (maybe under some condition) the identity: for random variables $X,Y,W$, we have $p(x|y) = \int p(x|w)p(w|y)\,dw$.

Let $X,Y,W$ be random variables, and let $p(x)$ denote the pdf of $X$. We try to prove the identity $p(x|y) = \int p(x|w)p(w|y)\,dw$. We start with $$p(x|y) = \int p(x|y,w)p(w)\,dw,$$ $$p(x|y) = \int \int p(...
Rakesh Balhara
0 votes
0 answers
182 views

What does it mean to divide by a pdf?

We know the formula $f(A|B) = f(A,B)/f(B)$. In my mind the left-hand side makes sense, since we are basically taking a 2D distribution to a 1D distribution (well, it does not necessarily integrate to one ...
Leon Von Moltke
0 votes
0 answers
145 views

Joint laws and marginal laws

We have 10 marbles numbered from 1 to 10, and two boxes $B_1,B_2$. Marbles are inserted randomly into the boxes. Let $X$ be a random variable counting the number of marbles in $B_1$ and $Y$ a random ...
Turquoise Tilt
0 votes
1 answer
64 views

Conditional expectation of a bivariate function?

Let $Y = f(X, W)$ for two discrete random variables $X$ and $W$. If I know $P(Y = y |X=x, W)$, how can I get $E[Y | X=x]$? Is the following correct: $$E[Y|X=x] = \sum_w P(X=x, W=w) \cdot f(x,w)$$ And ...
Jamal Mantburg
2 votes
1 answer
47 views

Understanding why we integrate the joint density function with opposite bounds to get the marginal density

I have a function $f_{X,Y}(x,y)$ which represents the joint density function. In order to get the marginal density function in terms of $x$, I need to integrate using the $y$ bounds. Why is that? I assumed ...
J4102
  • 125
0 votes
0 answers
87 views

Formula of joint pdf

Let $\Omega$ be a sample space and $\mathbb{Q}=(\Omega, \mathcal{F}, \mathbb{P})$ a standard probability space. Suppose that there exist three different functions, say $$\mu:\Omega\to\Delta(X)$$ $$\tau:...
Hunger Learn
0 votes
0 answers
121 views

Conditional Independence on Directed Graphs

In my lecture material, the following directed graph is given: a --> c --> b. The graph can be represented by the factorization $$p(a,b,c)=p(a)p(c|a)p(b|c)$$ The task is to check whether $a$ and ...
RolandSt
0 votes
0 answers
45 views

I seem to have the wrong math for a marginal independence proof (details in question); can anyone explain where my logic went wrong?

I'm not sure, but for proving marginal independence I came across this Stack Exchange post, and from there got to this link, where I found that the condition that must hold for two variables to be independent ...
Nikolai Savulkin
1 vote
1 answer
105 views

Fractal on a triangle with uniform marginals

Is there a fractal-like probability distribution inside an equilateral triangle such that $$F(x)+G(y)+H(z)=3/2$$ for every point inside the fractal and $$F(x)+G(y)+H(z)\le3/2$$ for every point outside ...
Siddharth Joshi
-1 votes
1 answer
31 views

Is this question correct? Because if I find the PDF in terms of $X$ and $Y$, wouldn't that be the marginal PDF $f_{XY}(x,y)$? [closed]

Let $S = \{ (x,y,z) \in \mathbb{R}^3 \mid x^2 + y^2 + z^2 \le 1,\ z \ge 0 \}$. For any Lebesgue measurable subset $A$ of $S$, let $P(A) = \frac{3}{2\pi}\times(\text{volume of } A)$. Let $X(x,y,z) = x$ and $Y(x,y,z) = y$ for all $(x,y,z) \in S$. ...
Shahzan Ahmad
3 votes
1 answer
44 views

Does positive joint probability imply positivity of a conditional event?

I have very little experience with probability so apologies if the title is confusing!! Let $\mu, \nu$ be probability measures on measure spaces $X,Y$ (if helpful we can assume $X = Y$ are compact ...
cloudman123
0 votes
1 answer
29 views

Marginalization of conditional distributions of the head-to-tail model

If we marginalize the joint probability of the head-to-tail graphical model over $c$, we get the following equation: $p(a,b) = p(a)\sum_{c}p(a|c)p(c|b) = p(a)p(b|a)$. The question is, does the ...
pie
  • 103
0 votes
2 answers
48 views

Find $\operatorname{Var}(X+Y)$, $f(x,y) = cx^{\alpha-1}(y-x)^{\beta-1}e^{-y}$

Let $X$ and $Y$ have the joint p.d.f. $$ f_{X,Y}(x,y) = cx^{\alpha-1}(y-x)^{\beta-1}e^{-y}, \quad 0<x<y<\infty $$ Find $\operatorname{Var}(X+Y)$. $V(X+Y) = V(X) + V(Y) + \operatorname{cov}(X,Y)$, $V(X) = E(X^2)-(EX)^2$. To find the ...
user913386
0 votes
2 answers
74 views

Find the density function of $X+Y$ when $X,Y \sim_{i.i.d} U[0,1]$

I'm using the transformation-of-random-variables approach and calculating the marginal density, and it confuses me how there are two cases. In my working I let $U=X+Y$, $V=Y$, so $f_{U,V} = f_{X,Y}|J|=f_{X,Y}=1$ from $0<x&...
smaillis
  • 580
0 votes
1 answer
106 views

Splitting of marginal PDF

If we have the function $$f_{X,Y}\left(x,y\right)=\frac{3}{7}x, \text{ when } 1\le x\le 2 \wedge 0\le y\le x$$ and I want to find the marginal function $f_Y\left(y\right)$, it should be: $$f_Y\...
2 votes
1 answer
971 views

Transforming uniform probability density function over unit disc from polar coordinates to cartesian coordinates

I need to solve the following exercise. I am not fully sure about my approach, as the results seem odd, and therefore would like some advice. Given are uniform random distributions of an angle $\...
Hinshman
1 vote
1 answer
79 views

Determining the Distribution of an uncorrelated Random Variable by using Dirac's Delta

Choosing random variables $X \sim U(-1,1)$ and $Y = X^2$, it is possible to show that X and Y are uncorrelated, yet not independent. I was wondering what the probability distribution of $Y$ now looks ...
dreieinigkeitsmoses
0 votes
1 answer
117 views

Checking if jointly continuous random variables are independent via the marginal density

I'm trying to determine whether the following continuous random variables are independent or not given their joint probability density. $f_{X,Y}(x,y) = 2(x+y)$ for $ 0\le y\le x\le 1$ I calculated the ...
Governor
  • 505
0 votes
0 answers
430 views

Marginalisation - Integrate out a variable

I calculated a joint density function of two non-uniformly distributed variables: $$ f_{X,Y}(x,y) = \frac{1}{2\pi}\cdot\frac{1}{\sqrt{x^2+y^2}}$$ I want to "integrate out" one of the ...
WiPU
  • 155
0 votes
1 answer
51 views

Marginalization of conditional probabilities

An exam question asked to select all expressions that are equal to 1, given no independence assumptions. The solution stated that the following expression was not equal to 1, but I don't understand ...
Senne Verhaegen
0 votes
1 answer
460 views

Interpretation of probability density greater than one

Good morning everyone. I'm trying to figure out how to interpret a probability density greater than one, but I haven't found any explanation that I consider satisfactory. Let me formalize my ...
Matteo Bulgarelli
2 votes
1 answer
221 views

Why do elliptical copula densities contain $x_1$ and $x_2$, but Archimedean copula densities contain $u_1$ and $u_2$?

$$c\left(u_{1}, u_{2}\right)=\frac{1}{\sqrt{1-\rho_{12}^{2}}} \exp \left\{-\frac{\rho_{12}^{2}\left(x_{1}^{2}+x_{2}^{2}\right)-2 \rho_{12} x_{1} x_{2}}{2\left(1-\rho_{12}^{2}\right)}\right\}$$ is the ...
develarist
  • 1,574
0 votes
1 answer
491 views

How to get the marginal density function of dependent random variables

I am unsure how to get the marginal probability density function if the two random variables are dependent. In general, for a given $f_{XY}(x,y)$ we get $$f_X(x)=\int_{-\infty}^{+\infty}f_{X,Y}(x,y)dy\...
youneedtoread1
0 votes
1 answer
66 views

Is the formula for the marginal density function from a joint distribution always true?

This is the formula to help you get the marginal pdf from a joint distribution: $$f(x)= \int f(x,y) \, dy$$ $$f(y)= \int f(x,y) \, dx$$ where the first integral is over all points in ...
Szu-Hua Huang
0 votes
0 answers
56 views

Integral of the square of a non-standard Gaussian pdf [duplicate]

What is the integral of the square of a non-standard univariate Gaussian distribution? The following is what I have so far; is it right? $\int_{-\infty}^{\infty} p^2(x)\,dx$, where $p(x)\sim N(\mu, \...
ItsKalvik
0 votes
1 answer
41 views

Implications of independence for two dichotomous random variables

I am trying to understand the relationship among the following: $P(A | B)$, $P(A | \lnot B)$, and $P(A)$. $A$ is independent of $B$ iff $P(A | B) = P(A)$. But if $A$ is independent of $B$, then does this ...
rsilg
  • 1
4 votes
3 answers
160 views

Given a coupling $\pi(\mu,\nu)$, show that $E_\mu f- E_\nu f= E_\pi [f(X) - f(Y)]$

In the lecture notes for High-Dimensional Probability by Handel, the following is stated: Let $\mu$ and $\nu$ be probability measures; then $$\mathcal C(\mu,\nu) = \{ \text{Law} (X,Y) : X\sim \mu,...
Davi Barreira
0 votes
1 answer
96 views

What happens if a probability function does not converge?

Find the marginal distribution for $y_2$ given the following PDF $$f(y_1,y_2)= \begin{cases} 3y_1, & \text{if } 0\leq y_2\leq y_1 \\ 0, & \text{elsewhere} \end{cases} $$ But when I try to ...
Juan Sebastian
0 votes
1 answer
37 views

Showing function is not independent

I need to show that this is not independent. I found that $f(x,y) = 1$ with the given restrictions on $x$ and $y$, but now, to find the marginal density functions, would the integral for finding $f_x(x)...
LearningMath899
0 votes
0 answers
29 views

What is the probability that a sample belongs to a particular random variable?

Edited for clarity. Let $X$ be a continuous random variable with a known probability distribution. Let $\mathbf{v}$ be a vector in $\mathbb{R}^n$. What is the probability that $\mathbf{v}$ is a sample ...
Asger Jon Vistisen
2 votes
1 answer
59 views

Independence between fractional parts of consecutive sums of independent uniforms

Let $X_1,X_2$ be independent $\text{Uniform}(0,1)$ random variables. Define $U_1 = X_1 - \lfloor X_1 \rfloor$ and $U_2 = X_1 + X_2 - \lfloor X_1 + X_2 \rfloor$ where $\lfloor a \rfloor$ is the largest ...
Joe Robert
2 votes
0 answers
42 views

Concept about the marginal probability $p(y)$ to conditional probability $p(y|x)$ transformation?

I have a function like the following: $p(y) = \int_x \int_z \left(Q(x^2 + y) + yz + z\right)\,dx\,dz$, where $Q(x) = \frac{1}{2\pi}\int_x^\infty e^{-\frac{t^2}{2}}...
Samantha
  • 303