# Changeset 2753 for trunk/doc

Timestamp:
Jun 24, 2012, 11:18:39 AM
Message:

remove trailing WS

File:
1 edited

r2752

w_i} \f$. Hence, an estimator of the variance of \f$ X \f$ is \f$ s^2 = \langle x^2\rangle - m^2 = \langle (x-m)^2\rangle \f$, which relates to the deviation around the true mean \f$ \mu \f$ via \f$ \langle (x-\mu)^2\rangle = \langle (x-m)^2\rangle + (m-\mu)^2 \f$.

In the case when weights are included in the analysis due to varying noise levels, the likelihood is
\f$\f$ L(\sigma_0^2)=\prod\frac{1}{\sqrt{2\pi\sigma_0^2/w_i}}\exp\left(-\frac{w_i(x-m)^2}{2\sigma_0^2}\right), \f$\f$
and taking the derivative with respect to \f$\sigma_0^2\f$ gives
\f$\f$ \sum -\frac{1}{2\sigma_0^2}+\frac{w_i(x-m)^2}{2\sigma_0^4}=0, \f$\f$
which yields the estimator
\f$\f$ \sigma_0^2=\frac{1}{N}\sum w_i(x-m)^2. \f$\f$

\subsection Correlation

As the mean is estimated as \f$ m_x=\frac{\sum w_xw_yx}{\sum w_xw_y} \f$, the variance is estimated as \f$ \sigma_x^2=\frac{\sum w_xw_y(x-m_x)^2}{\sum w_xw_y} \f$. As in the non-weighted case, we define the correlation to be the ratio between the covariance and the geometrical average of the variances.

The pooled variance term becomes
\f$\f$ \frac{\sum w(x-m_x)^2+\sum w(y-m_y)^2}{\frac{\left(\sum w_x\right)^2}{\sum w_x^2}+\frac{\left(\sum w_y\right)^2}{\sum w_y^2}-2}\left(\frac{1}{\sum w_x}+\frac{1}{\sum w_y}\right), \f$\f$
which for unity weights (\f$ w_i=w \f$) reduces to
\f$\f$ \frac{w\sum (x-m_x)^2+w\sum (y-m_y)^2}{n_x+n_y-2}\left(\frac{1}{wn_x}+\frac{1}{wn_y}\right), \f$\f$
in other words the good old expression as for the non-weighted case.

\subsection FoldChange

and calculating the weighted median of the distances.

\section Distance

A \ref concept_distance measures how far apart two ranges are. A Distance should

\subsection polynomial_kernel Polynomial Kernel

The polynomial kernel of degree \f$ N \f$ is defined as \f$ (1+\langle x,y\rangle)^N \f$, where \f$ \langle x,y\rangle \f$ is the linear kernel (the usual scalar product).
For the weighted case we define the linear kernel to be \f$ \langle x,y\rangle=\frac{\sum w_xw_yxy}{\sum w_xw_y} \f$, and the polynomial kernel can be calculated as before, \f$ (1+\langle x,y\rangle)^N \f$.

\subsection gaussian_kernel Gaussian Kernel

We have the model
\f$\f$ y_i=\alpha+\beta (x_i-m_x)+\epsilon_i, \f$\f$
where \f$\epsilon_i\f$ is the noise. The variance of the noise is inversely proportional to the weight, \f$ \textrm{Var}(\epsilon_i)=\frac{\sigma^2}{w_i} \f$. Minimizing the weighted sum of squared residuals, \f$ Q_0=\sum w_i(y_i-\alpha-\beta(x_i-m_x))^2 \f$, by taking the derivatives with respect to \f$\alpha\f$ and \f$\beta\f$ yields two conditions
\f$\f$ \frac{\partial Q_0}{\partial \alpha} = -2 \sum w_i(y_i - \alpha - \beta (x_i-m_x))=0 \f$\f$
\f$\f$ \frac{\partial Q_0}{\partial \beta} = -2 \sum w_i(x_i-m_x)(y_i-\alpha-\beta(x_i-m_x))=0 \f$\f$
with the solutions
\f$\f$ \alpha=\frac{\sum w_iy_i}{\sum w_i}=m_y \f$\f$
\f$\f$ \beta=\frac{\sum w_i(x_i-m_x)(y_i-m_y)}{\sum w_i(x_i-m_x)^2}=\frac{Cov(x,y)}{Var(x)} \f$\f$
The variance of the estimator of \f$\alpha\f$ is
\f$\f$ \textrm{Var}(\alpha)=\frac{\sum w_i^2\frac{\sigma^2}{w_i}}{\left(\sum w_i\right)^2}= \frac{\sigma^2}{\sum w_i}