Expectation of $1/\|X\|^2$ under $X \sim N(\theta, I)$ goes to $0$ as $\|\theta\| \to \infty$
Let $p \geq 3$ and $X \sim N(\theta, I_p)$. I want to show that
$$\mathbb{E}_\theta\left(\frac{1}{\|X\|^2}\right) \rightarrow 0 \quad \text{as } \|\theta\| \rightarrow \infty.$$
I have the following proof, but it is not very straightforward, so I am wondering if there is a simpler argument.
First note that $\mathbb{E}_\theta\big(\frac{1}{\|X\|^2}\big)$ depends only on $\|\theta\|$, by spherical symmetry. Let $g: [0, \infty) \rightarrow [0, \infty]$ be such that $g(\|\theta\|) = \mathbb{E}_\theta\big(\frac{1}{\|X\|^2}\big)$. Note also that it is clear that $g$ is non-increasing.
Define the James-Stein estimator $\delta^{JS}(X) = \big(1 - \frac{p-2}{\|X\|^2}\big)X$. It can be shown that
$$\mathbb{E}_\theta\Big(\|\delta^{JS}(X) - \theta\|^2\Big) = p - (p-2)^2\,\mathbb{E}_\theta\bigg(\frac{1}{\|X\|^2}\bigg).$$ Since this is $\geq 0$, we must have that $\mathbb{E}_\theta\big(\frac{1}{\|X\|^2}\big)$ is finite for all $\theta \in \mathbb{R}^p$, so $g$ is finite.
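(The risk identity above can be sanity-checked by simulation. This is only a Monte Carlo sketch; the choices $p = 5$, $\theta = (1,\dots,1)$, and the sample size are arbitrary, not part of the argument.)

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 400_000
theta = np.full(p, 1.0)  # arbitrary choice of theta for illustration

X = rng.normal(theta, 1.0, size=(n, p))
sq = np.sum(X ** 2, axis=1)                   # ||X||^2 for each draw
delta = (1.0 - (p - 2) / sq)[:, None] * X     # James-Stein estimator

lhs = np.mean(np.sum((delta - theta) ** 2, axis=1))   # E ||delta^JS(X) - theta||^2
rhs = p - (p - 2) ** 2 * np.mean(1.0 / sq)            # p - (p-2)^2 E[1/||X||^2]
# lhs and rhs should agree up to Monte Carlo error, and both are below p,
# the risk of the estimator X itself
```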
Let $S$ be the closed unit ball in $\mathbb{R}^p$, and $E = \mathbb{R}^p \setminus S$. Then $g(r) = g_S(r) + g_E(r)$, where $g_S(r) = \mathbb{E}_\theta\big(\frac{1}{\|X\|^2} 1_{X \in S}\big)$ and $g_E(r) = \mathbb{E}_\theta\big(\frac{1}{\|X\|^2} 1_{X \in E}\big)$ with $\|\theta\| = r$.
It is not hard to show that $g_E(r) \rightarrow 0$ as $r \rightarrow \infty$. Now, given arbitrary $\epsilon > 0$, we can pick $r \geq 1$ large enough that $f(x \mid r\mathbf{e}_1) \leq \epsilon f(x \mid \mathbf{e}_1)$ for all $x \in S$, where $f(x \mid \theta)$ is the p.d.f. of the $N(\theta, I_p)$ distribution and $\mathbf{e}_1 = (1, 0, \dots, 0)$. Then $g_S(r) \leq \epsilon\, g_S(1)$. Since $g_S(1)$ is finite, we have $\inf_{r \geq 0} g_S(r) = 0$.
Thus $\inf_{r \geq 0} g(r) = 0$. Since $g$ is non-increasing, its limit as $r \to \infty$ must be $0$.
Note: this result shows in particular that, although the James-Stein estimator dominates the estimator $X$ under quadratic loss, they have the same maximal risk.
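(For what it's worth, the claimed behaviour of $g$ is easy to see numerically. A hedged Monte Carlo sketch, with the arbitrary choice $p = 5$ — by spherical symmetry we may put $\theta = r\mathbf{e}_1$:)

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 5, 200_000   # p >= 3 so that E[1/||X||^2] is finite

def g(r):
    """Monte Carlo estimate of g(r) = E_theta[1/||X||^2] with ||theta|| = r."""
    theta = np.zeros(p)
    theta[0] = r                      # theta = r * e_1, by spherical symmetry
    X = rng.normal(theta, 1.0, size=(n, p))
    return np.mean(1.0 / np.sum(X ** 2, axis=1))

vals = [g(r) for r in (0.0, 2.0, 5.0, 10.0, 50.0)]
# the estimates are non-increasing, start near E[1/chi2_p] = 1/(p-2),
# and tend to 0 as r grows
```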
probability statistics probability-distributions normal-distribution
asked Apr 7 at 20:38 by Baran Karakus
1 Answer
I don't know if this is fundamentally simpler, but it looks superficially different.
The quantity $R=\|X\|^2$ has a non-central chi-squared distribution with $p$ degrees of freedom and non-centrality parameter $\lambda=\|\theta\|^2$. We can represent $R=S+T$, where $S\ge0$ and $T\ge0$ are independent, $S$ has an ordinary chi-squared distribution on $p$ degrees of freedom, and $T$ is a mixture of ordinary chi-squared r.v.s with $2k$ degrees of freedom, where $k$ is Poisson distributed with parameter $\lambda/2$. ($T$ is sometimes called a non-central chi-squared on zero degrees of freedom; see the Wikipedia page on the non-central chi-squared distribution for details.)
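(This Poisson-mixture representation is easy to check by simulation. A sketch — the values $p = 5$ and $\lambda = 9$ are arbitrary illustrations; both constructions should match the known mean $p+\lambda$ and variance $2p+4\lambda$:)

```python
import numpy as np

rng = np.random.default_rng(1)
p, lam, n = 5, 9.0, 500_000   # arbitrary p and lam = ||theta||^2

# Direct construction: R = ||X||^2 with X ~ N(theta, I_p), ||theta||^2 = lam
theta = np.zeros(p)
theta[0] = np.sqrt(lam)
R_direct = np.sum(rng.normal(theta, 1.0, size=(n, p)) ** 2, axis=1)

# Mixture construction: R = S + T, with S ~ chi2_p independent of
# T ~ chi2_{2K}, K ~ Poisson(lam/2), and T = 0 on the event K = 0
S = rng.chisquare(p, size=n)
K = rng.poisson(lam / 2, size=n)
T = np.where(K > 0, rng.chisquare(np.maximum(2 * K, 1)), 0.0)
R_mixture = S + T

# Both samples should show mean p + lam and variance 2p + 4*lam
```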
We will use the dominated convergence theorem, with dominating integrand $1/S$, for which $1/R\le 1/S$. Note that $E(1/S)<\infty$ because of the shape of the density function of $S$: up to a constant, you are integrating $\int_0^\infty s^{p/2-1} e^{-s/2}\,(1/s)\,ds$, which is finite if $p\ge3$.
Let $\lambda_n\to\infty$ and let $T_n$ be a random variable with non-centrality parameter $\lambda_n$ and zero degrees of freedom. We want to see if $E(1/(S+T_n))\to0$. Note that $1/T_n$ converges in distribution to $0$ as $n\to\infty$. By the Skorokhod representation theorem, we can pretend $1/T_n$ converges to $0$ with probability $1$. So now we use the DCT. We have $1/(S+T_n)\le 1/S$ with probability $1$, we have $\lim_{n\to\infty}1/(S+T_n)=0$ with probability $1$, and we have $E(1/S)<\infty$. So the DCT tells us $E(1/(S+T_n))\to0$, as desired.
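(A quick Monte Carlo sketch of this limit, with the arbitrary choice $p = 5$: every estimate stays below the integrable envelope $E(1/S) = 1/(p-2)$, and the estimates decrease toward $0$ as $\lambda$ grows:)

```python
import numpy as np

rng = np.random.default_rng(2)
p, n = 5, 400_000
S = rng.chisquare(p, size=n)   # one fixed sample of the dominating variable

def mean_inv(lam):
    """Estimate E[1/(S + T)], with T the zero-df part and lam = ||theta||^2."""
    K = rng.poisson(lam / 2, size=n)
    T = np.where(K > 0, rng.chisquare(np.maximum(2 * K, 1)), 0.0)
    return np.mean(1.0 / (S + T))

ms = [mean_inv(lam) for lam in (0.0, 10.0, 100.0, 1000.0)]
# ms decreases toward 0; each value is at most (about) E[1/S] = 1/(p-2)
```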
Another way to prove the result is to notice that this zero-degrees-of-freedom decomposition implies that the family of distributions of $1/\|X\|^2$ is uniformly integrable. Then passage to the limit under the integral sign is justified.
answered Apr 8 at 7:29 by kimchi lover, edited Apr 8 at 10:10