
Obtain $M\in\{-1,+1\}^{n\times n}$ by unbiased coin flipping.

What is known about the distribution of the permanent $\mathsf{Perm}(M)$? It seems to be bimodal.

Given a function $g(n)$, what (at least approximately) is the function $f(n)$ such that $$\mathsf{Prob}\big(|\mathsf{Perm}(M)|\in\big(\mathsf{E}[|\mathsf{Perm}(M)|]-g(n),\mathsf{E}[|\mathsf{Perm}(M)|]+g(n)\big)\big)\geq f(n)$$ holds, where $\mathsf{E}[|\mathsf{Perm}(M)|]$ is the expected absolute value of the permanent over $M\in\{-1,+1\}^{n\times n}$?
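For small $n$ the question can at least be probed empirically. Here is a minimal sketch: it computes permanents exactly via Ryser's formula (exponential time, so only small $n$ is feasible), estimates $\mathsf{E}[|\mathsf{Perm}(M)|]$ from samples, and reports the fraction of samples falling in a window around that estimate. The parameters $n=10$, $500$ samples and the window $g=\mathsf{E}/2$ are arbitrary illustrative choices, not the windows asked about below.

```python
import itertools
import numpy as np

def ryser_permanent(M):
    """Exact permanent via Ryser's inclusion-exclusion formula, O(2^n poly(n))."""
    n = M.shape[0]
    total = 0
    for r in range(1, n + 1):
        sign = (-1) ** r
        for cols in itertools.combinations(range(n), r):
            row_sums = M[:, list(cols)].sum(axis=1)   # row sums restricted to the column subset
            total += sign * np.prod(row_sums)
    return (-1) ** n * total

rng = np.random.default_rng(0)
n, samples = 10, 500          # arbitrary small choices; Ryser's formula is exponential in n

perms = np.array([ryser_permanent(rng.choice([-1, 1], size=(n, n)))
                  for _ in range(samples)], dtype=float)

mean_abs = np.abs(perms).mean()          # empirical estimate of E[|Perm(M)|]
g = mean_abs / 2                         # arbitrary illustrative window width
frac = np.mean(np.abs(np.abs(perms) - mean_abs) < g)

print("quantiles of Perm(M):", np.quantile(perms, [0.05, 0.25, 0.5, 0.75, 0.95]))
print("empirical E[|Perm(M)|]:", mean_abs)
print("fraction with |Perm(M)| in (E - g, E + g) for g = E/2:", frac)
```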

Can $g(n)=O\big(n^{-c\log n}\big)$ and $f(n)=\Omega\big(n^{-d\log n}\big)$ hold simultaneously for some $c,d>0$?

That is, if we ask for the fraction of matrices whose permanent lies close to the mean, that fraction should admit a reasonably generous lower bound, which would amount to a strong concentration statement.

I think this is false, and as the comments suggest it seems false for the $\{0,1\}^{n\times n}$ case as well.

The closest reference I found is https://arxiv.org/pdf/0804.2362v3.pdf, which looks insufficient for this purpose.

Is there a valid lower bound on the probability that $\mathsf{Perm}(M)\in(-n^{c\log n},n^{c\log n})$?

  • If you replace the permanent by the determinant, the limiting distribution is fairly well understood: $\log\left(|\det(A)|\right)$ is asymptotically normal with mean $\frac{1}{2} \log( (n-1)! )$ and standard deviation $\sqrt{0.5 \log n}$ (see, for example, https://arxiv.org/pdf/1112.0752.pdf ). The concentration is of the logarithm instead of the determinant itself. I find it hard to imagine any argument that could give the tight concentration you want for the permanent without giving a similarly (overly) tight concentration for the determinant. – Kevin P. Costello Oct 21 '16 at 00:21 (a quick numerical check of this comparison is sketched after the comments below)
  • Vaguely related -- in his recent talk at Avi Wigderson's birthday, at 38:10 (I suggest starting at 37:00), Scott Aaronson mentioned it as an open problem to understand the distribution of the permanent of matrices with i.i.d. Gaussian entries. (We might expect the answer to be similar to that for your distribution...) He said that experiments show it to be lognormal, as with the determinant. – usul Oct 21 '16 at 00:59
  • @KevinP.Costello "(overly) tight concentration for the determinant" makes it sound like what I seek is impossible. Is that your judgement? –  Oct 21 '16 at 06:51
  • Not "impossible" so much as "very hard to prove", even if it's true. For an $n \times n$ $\pm 1$ matrix $A$, with asymptotically positive probability we have $$|\det(A)|>\sqrt{(n-1)!}\, e^{\sqrt{\log n}},$$ and with asymptotically positive probability we have $$|\det(A)| < \sqrt{(n-1)!}\, e^{-\sqrt{\log n}}.$$ So for the determinant, there's no bound of the sort you're asking for. And the ways I know of for getting a handle on random permanents (e.g. expansion by minors, as in the Tao-Vu paper) would give identical bounds if you replaced "permanent" by "determinant" everywhere. – Kevin P. Costello Oct 21 '16 at 19:06
  • @KevinP.Costello I wrongly guessed the expected value (maybe now there is an analogy with determinants). –  Oct 21 '16 at 19:20
  • @usul -- Aaronson actually asked about this on MO here: http://mathoverflow.net/questions/45822/anti-concentration-bound-for-permanents-of-gaussian-matrices?rq=1 (which also shows up to the right of this page as the most related question, for me at least). – Nick Cook Oct 23 '16 at 00:26
  • @KevinP.Costello In fact I would be fairly confident that the permanent is log-normally distributed in the limit. For a lazy heuristic as to why this should be true (it is enough to show that better-than-log-normal concentration is definitely false, and it gets quite close to log-normal): observe that the permanent of a uniform random 0/1 $n\times n$ matrix is the number of perfect matchings in a uniform random bipartite graph, whose logarithm (by the solutions of the van der Waerden / Bregman conjectures) is basically controlled by the number of edges; and that is normally distributed in the limit by the CLT. – user36212 Oct 23 '16 at 22:47 (the two bounds are stated after the comments below)
  • I think Janson has some papers proving log-normality of matching counts, though as far as I know they focus on the sparse case (i.e. only a few 1s and mainly 0s). – user36212 Oct 23 '16 at 22:49
  • @user36212 so you think the anti-concentration I seek is false? –  Oct 23 '16 at 22:53
  • At least for 0/1 matrices, it is false. The expected permanent for a $p$-biased choice of 0/1 entries is $n!p^n$; if you increase $p$ by a factor $1+1/n$, you multiply this by roughly $e$. If you believe the permanent is close to its expectation for $p=1/2$, why not for all constant $p$..? But the standard deviation of the number of $1$s in your matrix is about $n$, so you expect that reasonably often your matrix will look like a $p(1+1/n)$-random matrix. – user36212 Oct 23 '16 at 23:00 (this computation is spelled out after the comments below)
  • @user36212 OK, great comment; I will remove the $\{0,1\}$ case. Please feel free to expand your comment into an answer. Could you also post your references for the fact that the concentration cannot be this tight? An expanded answer would help understanding greatly. –  Oct 23 '16 at 23:16
  • @KevinP.Costello Without the absolute value $|\cdot|$, with what probability does $\mathsf{Perm}(M)\in(-f(n),f(n))$ hold for a fixed function $f:\Bbb N\rightarrow\Bbb N$? –  Oct 24 '16 at 22:53
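To make the comparison with determinants from the comments concrete, here is a minimal numerical sketch; the parameters $n=200$ and $500$ samples are arbitrary choices, and this only probes (it does not prove) the stated normal approximation for $\log|\det(A)|$.

```python
import numpy as np
from math import lgamma, log, sqrt

# Quick numerical probe of the determinant comparison from the comments:
# log|det(A)| for a random +/-1 matrix should be roughly normal with
# mean 0.5*log((n-1)!) and standard deviation sqrt(0.5*log n).
rng = np.random.default_rng(0)
n, samples = 200, 500                        # arbitrary choices

logdets = []
for _ in range(samples):
    A = rng.choice([-1.0, 1.0], size=(n, n))
    sign, logabsdet = np.linalg.slogdet(A)   # log|det| without overflow
    if sign != 0:                            # singular matrices are very rare
        logdets.append(logabsdet)
logdets = np.array(logdets)

print("empirical mean of log|det|:", logdets.mean())
print("predicted mean 0.5*log((n-1)!):", 0.5 * lgamma(n))   # lgamma(n) = log((n-1)!)
print("empirical standard deviation:", logdets.std())
print("predicted standard deviation sqrt(0.5*log n):", sqrt(0.5 * log(n)))
```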
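For reference, the two bounds invoked in the perfect-matching heuristic above are the following, stated here for the clean case of a $d$-regular bipartite graph (this reduction of the heuristic is only a sketch). Bregman's theorem (formerly Minc's conjecture) gives, for a 0/1 matrix $A$ with row sums $r_1,\dots,r_n$, $$\mathsf{Perm}(A)\leq\prod_{i=1}^{n}(r_i!)^{1/r_i},$$ and the van der Waerden conjecture (proved by Falikman and Egorychev) gives $\mathsf{Perm}(S)\geq n!/n^n$ for every doubly stochastic $S$; scaling a $d$-regular 0/1 matrix by $1/d$ then yields $$\mathsf{Perm}(A)\geq d^{n}\,\frac{n!}{n^{n}}.$$ Taking logarithms, both bounds give $\log\mathsf{Perm}(A)=n\log d-n+o(n)$ as $d\to\infty$, so the logarithm is essentially determined by the degree, i.e. by the number of edges.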
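To spell out the $p$-biased computation in the last comment of user36212: for i.i.d. Bernoulli($p$) entries, linearity of expectation over the $n!$ permutation terms gives $$\mathsf{E}[\mathsf{Perm}(A)]=n!\,p^{n},$$ and replacing $p$ by $p(1+1/n)$ multiplies this by $(1+1/n)^{n}\approx e$. On the other hand, the number of $1$s in the matrix is $\mathrm{Binomial}(n^{2},p)$, with standard deviation $n\sqrt{p(1-p)}$ (about $n/2$ at $p=1/2$), so the empirical density of $1$s fluctuates by $\Theta(1/n)$ reasonably often. Heuristically, each such fluctuation already shifts the expected permanent by a constant factor, which is why concentration of $\mathsf{Perm}(A)$ within a sub-constant-factor window around its mean should fail.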

0 Answers