  • First trick I learnt in Russia

    Consider an $n \times n$ matrix $A$ with every entry equal to $0$, $1$ or $-1$. Let $\lambda$ be an eigenvalue of $A$ whose absolute value is maximal among all eigenvalues, and let $m$ be the geometric multiplicity (the dimension of the eigenspace) of $\lambda$. Then every $(n-m+1) \times (n-m+1)$ principal submatrix of $A$ has a row with at least $|\lambda|$ nonzero entries.

    The proof is as follows.

    Write $A=(A_{\alpha\beta})_{n\times n}$. WLOG the principal submatrix consists of the first $n-m+1$ rows and columns. Let $B$ be the $n\times m$ matrix whose columns form a basis of the eigenspace of $\lambda$, and let $B^*$ denote the last $n-(n-m+1)=m-1$ rows of $B$. Since $B^*$ has only $m-1$ rows, its rank is less than $m$, so there exists a nonzero vector $x$ with $B^*x=0$. Then $y=Bx$ is nonzero (the columns of $B$ are linearly independent), its last $m-1$ entries vanish, and hence its first $n-m+1$ entries are not all zero. Writing $y=(y_{\alpha})_{n\times 1}$ and taking $\alpha \leq n-m+1$ such that $|y_{\alpha}|$ is maximal,

    $$\begin{array}{rll}
    |\lambda|\cdot |y_{\alpha}| &= |(\lambda y)_{\alpha}| = |(\lambda Bx)_{\alpha}| = |(ABx)_{\alpha}| = |(Ay)_{\alpha}| \\
    &\leq \sum_{\beta} |A_{\alpha\beta}|\,|y_{\beta}| = \sum_{\beta \leq n-m+1} |A_{\alpha\beta}|\,|y_{\beta}| & \textrm{(since $y_{\beta}=0$ for $\beta > n-m+1$)} \\
    &\leq |y_{\alpha}| \sum_{\beta \leq n-m+1} |A_{\alpha\beta}| = |y_{\alpha}| \cdot \#\{\textrm{nonzero entries in row $\alpha$ of the submatrix}\}
    \end{array}$$

    Cancelling $|y_{\alpha}| > 0$ on both sides completes the proof.
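    The bound can be sanity-checked by brute force on a small example. The sketch below (my addition, plain Python, not from the original post) uses the $4\times 4$ matrix $A_2$ satisfying $A_2^2 = 2I$ and $\operatorname{tr} A_2 = 0$, so its eigenvalues are $\pm\sqrt{2}$, each of geometric multiplicity $2$; with $n=4$, $m=2$ the trick predicts that every $3\times 3$ principal submatrix has a row with at least $\sqrt{2}$, i.e. at least $2$, nonzero entries:

    ```python
    from itertools import combinations
    from math import sqrt

    # A_2 satisfies A_2^2 = 2I and tr(A_2) = 0, so its eigenvalues are
    # +/- sqrt(2), each with geometric multiplicity m = 2.
    A2 = [
        [0, 1, 1, 0],
        [1, 0, 0, 1],
        [1, 0, 0, -1],
        [0, 1, -1, 0],
    ]
    n, m, lam = 4, 2, sqrt(2)

    # Verify A_2^2 = 2I.
    for i in range(n):
        for j in range(n):
            assert sum(A2[i][k] * A2[k][j] for k in range(n)) == (2 if i == j else 0)

    # The trick: every (n-m+1) x (n-m+1) = 3x3 principal submatrix must
    # contain a row with at least |lambda| = sqrt(2) nonzero entries.
    for rows in combinations(range(n), n - m + 1):
        best = max(sum(1 for j in rows if A2[i][j] != 0) for i in rows)
        assert best >= lam, (rows, best)
    print("every 3x3 principal submatrix of A_2 has a row with >= 2 nonzero entries")
    ```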

    Application. Any subgraph of the $n$-dimensional hypercube $Q_n$ on at least $2^{n-1}+1$ vertices has a vertex of degree $\geq \sqrt{n}$.

    Explanation. Consider binary codes of length $n$, and say two codes are hard to distinguish if they differ in exactly one position. Then among any selection of more than half of all such codes, some code is hard to distinguish from at least $\sqrt{n}$ of the chosen codes.

    Proof. Consider $A_0=(0)$, $A_n=\left(\begin{matrix}A_{n-1} & I \\ I & -A_{n-1}\end{matrix}\right)$; by induction $A_n^2=nI$. Since $\operatorname{tr} A_n = 0$, the eigenvalues of $A_n$ are $\pm\sqrt{n}$, each with multiplicity $2^{n-1}$. The absolute values $|(A_n)_{\alpha\beta}|$ form the adjacency matrix of $Q_n$, so the trick with $m=2^{n-1}$ gives the claim.
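    Both the recursion and the application can be checked by brute force for small $n$. The sketch below (my addition; the helper name `block` is hypothetical) builds $A_n$ in plain Python, verifies $A_n^2 = nI$, and then confirms directly that every choice of $2^{3-1}+1 = 5$ vertices of $Q_3$ induces a vertex of degree at least $\sqrt{3}$:

    ```python
    from itertools import combinations
    from math import sqrt

    def block(n):
        """A_n from the recursion A_0 = (0), A_n = [[A_{n-1}, I], [I, -A_{n-1}]]."""
        A = [[0]]
        for _ in range(n):
            k = len(A)
            top = [A[i] + [1 if i == j else 0 for j in range(k)] for i in range(k)]
            bot = [[1 if i == j else 0 for j in range(k)]
                   + [-A[i][j] for j in range(k)] for i in range(k)]
            A = top + bot
        return A

    # Sanity check: A_n^2 = n I (proved by induction in the text).
    for n in range(1, 5):
        A = block(n)
        N = len(A)
        for i in range(N):
            for j in range(N):
                assert sum(A[i][k] * A[k][j] for k in range(N)) == (n if i == j else 0)

    # Application for n = 3: |(A_3)_{ij}| is the adjacency matrix of Q_3, and
    # any 5 = 2^{3-1}+1 vertices must induce a vertex of degree >= sqrt(3).
    A3 = block(3)
    for verts in combinations(range(8), 5):
        maxdeg = max(sum(1 for j in verts if A3[i][j] != 0) for i in verts)
        assert maxdeg >= sqrt(3), verts
    print("A_n^2 = nI holds for n <= 4, and the Q_3 degree bound checks out")
    ```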

  • Original post: https://www.cnblogs.com/XiongRuiMath/p/11469745.html