
Problem 1: Find a (not extremely artificial) set $A$ of integers so that for every $n$, $|A\cap [n]| \le n^{0.499}$ (where $[n]=\{1,2,\dots,n\}$), for which you can prove that $A$ contains infinitely many primes.

Problem 2: Find a (not extremely artificial) set $A$ of integers so that for every $n$, $|A\cap [n]| \le n^{0.499}$, for which you can prove that

$\sum \{\mu(k): k \le n,\ k \in A\} = o(|A \cap [n]|).$
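For concreteness, here is a minimal Python sketch of the quantity in Problem 2. The choice $A=\{k^3+2: k\ge 1\}$ (so $|A\cap[n]|\sim n^{1/3}$, comfortably below $n^{0.499}$) is purely illustrative; nothing is claimed about whether the $o(\cdot)$ bound is provable for this $A$.

```python
# Illustrative only: the Mobius sum of Problem 2 for the toy choice
# A = {k^3 + 2 : k >= 1}, which satisfies |A ∩ [n]| ~ n^(1/3).

def mobius(m: int) -> int:
    """Mobius function mu(m), by trial division (fine for small m)."""
    if m == 1:
        return 1
    mu, p = 1, 2
    while p * p <= m:
        if m % p == 0:
            m //= p
            if m % p == 0:      # p^2 divided the original m, so mu = 0
                return 0
            mu = -mu            # one more distinct prime factor
        p += 1
    if m > 1:                   # a single remaining prime factor
        mu = -mu
    return mu

def problem2_ratio(n: int) -> float:
    """(sum of mu(k) over k <= n, k in A) divided by |A ∩ [n]|."""
    members = [k**3 + 2 for k in range(1, int(n ** (1 / 3)) + 2)
               if k**3 + 2 <= n]
    return sum(mobius(a) for a in members) / len(members)

print(problem2_ratio(10**6))    # small if the sum is o(|A ∩ [n]|)
```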

Variation:

For Problem 1 it makes sense to ask not just about infinitely many primes but about a result on the density of primes which is of the full strength of the prime number theorem. (In fact, I thought of Problem 2 as a weak form of Problem 1, but the way I formulated it this is not the case.)

Problem 3: Find a (not extremely artificial) set $A$ of integers so that for every $n$, $|A\cap [n]| \le n^{0.499}$, for which you can prove that the density of primes in $A$ in the interval $[n]$ is $1/\log n+o(1)$.

Perhaps the best way to formulate and think about Problem 3 is in terms of orthogonality with the von Mangoldt function.
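One hedged guess at such a formulation (my wording, mirroring the Möbius sum in Problem 2; the exact normalization is a judgment call):

$\sum \{\Lambda(k): k \le n,\ k \in A\} = (1+o(1))\,|A \cap [n]|,$

which for $A=\mathbb{Z}_{>0}$ is exactly the prime number theorem in the form $\sum_{k\le n}\Lambda(k)\sim n$.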

Motivations:

This question is motivated by various recent results on Möbius randomness and on the infinitude of primes in various exotic sets, and also by this question: Why so difficult to prove infinitely many restricted primes?

(The set of the $p_{n^5}$ is extremely artificial, and I suppose that a set that can be ordered in a sequence (not necessarily increasing) such that $a_n$ can provably be computed in $\mathrm{poly}(\log |a_n|)$ steps is not extremely artificial.)

Examples:

  1. A very natural example is an interval of the form $[n,n+t]$, and here indeed the best known unconditional result is for $t=n^{0.535\ldots}$. RH allows $t=\sqrt n \log n$, and it looks like $n^{1/2}$ is a genuine barrier here. (Here I base the information on the paper "A Survey of Results on Primes in Short Intervals" by Cem Yalçın Yıldırım.)

  2. There is a result by Friedlander and Iwaniec that there are infinitely many primes of the form $a^2+b^4$. Here the density is above the square-root barrier, but I don't know if there are any insights regarding improvement to, say, $a^3+b^7$.

  3. There is a result by Elkies about the infinitude of supersingular primes. Those primes are conjectured to have counting function of order $n^{1/2}$ among the first $n$ numbers, but provably it is only known that their count is at most of order $n^{3/4}$. I don't know if there are Elkies-like results that can lead (provably or conjecturally) to sparser sets of primes.

  4. There is a "PNT for majority functions" result of Bourgain implying that there are infinitely many $m$-digit primes with more than a fraction $c$ of their binary digits equal to one, for some $c=1/2+m^{-\rho}$ with $\rho<1$. Having this for $c=0.9$ would cross the square-root barrier; see the entropy sketch after this list. (Update, based on Christian's answer:) Eric Naslund used the results about primes in intervals to prove it for $c=0.737\ldots$.
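To see why $c=0.9$ would cross the barrier while $c=0.737\ldots$ does not, here is a small entropy computation (my own illustration, not taken from the cited papers). The number of $m$-bit integers with at least a fraction $c$ of one bits is roughly $2^{H(c)m}$, where $H$ is the binary entropy function, so as a subset of $[N]$ with $N=2^m$ its counting function is about $N^{H(c)}$.

```python
# The set of m-bit integers with at least a fraction c of one bits has
# size roughly 2^(H(c) m), i.e. counting function ~ N^H(c) for N = 2^m.
# The square-root barrier is crossed exactly when H(c) < 1/2.

from math import log2

def H(c: float) -> float:
    """Binary entropy function, in bits."""
    return -c * log2(c) - (1 - c) * log2(1 - c)

for c in (0.737, 0.9):
    status = "crosses" if H(c) < 0.5 else "does not cross"
    print(f"H({c}) = {H(c):.3f}: the set {status} the square-root barrier")
# H(0.737) ~ 0.831 (still denser than sqrt(N)); H(0.9) ~ 0.469 (sparser)
```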

Gil Kalai
  • Can't you use Tao and Ziegler's result to produce sets with this property? –  Jul 06 '13 at 21:53
  • Could you please be more specific? When I hear Tao–Ziegler my first association is polynomial progressions in the primes, but it is not apparent to me that this has something to say about this problem. Possibly it does, or you mean a different result. –  Jul 06 '13 at 22:03
  • If you define 'not extremely artificial' to be 'the $n$-th term is constructible by a poly-log time algorithm (in the bit length of $n$)', then this is essentially equivalent to getting around the square-root barrier in the prior Polymath "deterministic methods to find primes" problem, which asks for an algorithm that constructs a $\log(n)$-bit prime in time $n^{1/2-\delta}$. See http://arxiv.org/abs/1009.3956. Notice that many natural candidates for 'not extremely artificial' are efficiently constructible in this sense. – Mark Lewko Jul 07 '13 at 00:20
  • Dear Mark, the (tentative) suggestion for 'not extremely artificial' was certainly inspired by polymath4, which you mentioned. But I don't see how a positive answer to Problem 1 would lead to crossing the $n^{1/2-\delta}$ barrier for finding primes. It would be very nice to prove that there are infinitely many primes of the form $x^6+y^{12}+z^{20}+v^{30}+u^{42}+w^{56}+r^{72}+s^{90}+t^{110}$ (and you can go further if needed), but supposing you can prove it, I don't see how this helps you cross the $n^{1/2-\delta}$ barrier toward the polymath4 goal. – Gil Kalai Jul 07 '13 at 05:51
  • @Gil, perhaps I'm misunderstanding something. If one wanted a prime near $N$, then one could just produce (via the algorithm) and test each of the $\ll N^{.49}$ values of the polynomial less than $N$ for primality with AKS (which would run in $O(N^{.49+\epsilon})$ time). Now there is a slight issue that if all one knows is that the set contains primes infinitely often, this may fail for some bit lengths. (cont.) – Mark Lewko Jul 07 '13 at 06:21
  • (cont.) This is why I said 'essentially equivalent'. In practice, most methods for proving that a set contains infinitely many primes actually produce an asymptotic formula showing that the primes have density $1/\log(N)$ within the set (modulo obvious obstructions). – Mark Lewko Jul 07 '13 at 06:21
  • Dear Mark, this looks correct, at least for the sequence I mentioned or similar examples. In general you may need to go over all $N$ numbers to tell which of them belong to $A$, so it is not clear why in general a positive answer to Problem 1 gives an $N^{1/2-\delta}$ algorithm for finding a prime with $\log N$ digits. Also it is not clear why from such an algorithm you would get a positive answer to Problem 1. – Gil Kalai Jul 07 '13 at 07:41
  • @Gil, I concede your second point, I don't see how to reverse the implication. (If you had an algorithm you could consider the set of primes that the algorithm produces, but this doesn't give quite what you would need, at least not in a blackbox way). On your first point, I was assuming that the polylog algorithm produced the k-th element in poly-log time (in k). You seem to be thinking that admissibility is what is testable in poly-log time (in k). The latter does make things easier. – Mark Lewko Jul 08 '13 at 01:04
  • Dear Mark, yes; in fact one way polymath4 could be settled affirmatively is by proving very strong (and probably also beyond reach) derandomization conjectures which (for all we know) have nothing to do with number theory. Actually, I meant for Problem 1 to be a stronger version of Problem 2, but this is not the case the way I stated it. (The problem as stated is also interesting.) Probably a statement based on the von Mangoldt function can serve as a common strengthening of both. – Gil Kalai Jul 08 '13 at 16:31
  • Is the prime set $\lfloor A^{3^{n}} \rfloor$ resulting from Mills' constant artificial? http://en.wikipedia.org/wiki/Mills%27_constant. Under RH it comes close to being computable in $\mathrm{poly}(\log |a_n|)$. – joro Mar 17 '14 at 10:41
  • Dear Joro, yes, I think we need to regard it as artificial. Probably Problems 2 and 3 are more "immune" against such examples. – Gil Kalai Mar 17 '14 at 18:55
  • Is poly computability enough for Problem 2? Take $a(n)=\prod_{k=1}^{2n} p_k$. By construction $\mu(a(n))=1$, and it grows much faster than $n^2$. Also $\log a(n)=\theta(p_{2n}) \sim 2n \log (2n)$, so you can compute $a(n)$ in $\mathrm{poly}(\log a(n))$ time by computing the first $2n$ primes. – joro Mar 19 '14 at 11:43

1 Answer


Let me first comment on your examples:

  1. The record is actually $t=n^{0.525}$, due to Baker, Harman and Pintz.

  2. A slightly thinner sequence of this type is $n=a^3+2b^3$, investigated by Heath-Brown, and likewise cubic polynomials in two variables, by Heath-Brown and Moroz: eprints.maths.ox.ac.uk/00000163/01/morozcub2.pdf

Other polynomials in two or more variables, like those that you suggest, are certainly out of reach of today's methods.

  4. Here is a sketch for obtaining primes with a proportion of $0.7375$ one bits (compare also this paper by Eric Naslund, https://arxiv.org/abs/1211.2455, and his answer on MathOverflow to a question by Gil Kalai, "Primes with more ones than zeroes in their binary expansion").

By Baker-Harman-Pintz one can fix the first $0.475n$ bits to be ones: with $x=2^n$, there are $\gg x^{0.525}/\log x$ primes in the interval $[x-x^{0.525}, x]$. Among the remaining $0.525n$ bits, some of these primes must have about $50\%$ ones: if all of them had a smaller proportion of one bits, then estimating the tail (a sum of binomial coefficients) would show that the interval contains too few integers of such low weight to account for that many primes.

So altogether the number of one bits can be as large as $(0.475+\tfrac{1}{2}\cdot 0.525)\,n=0.7375\,n$.
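To make the tail estimate concrete, here is a small numerical sketch of the exponent comparison (my own illustration; the bit length and constants are only indicative):

```python
# Sketch of the counting behind 0.7375.  Fix the top 0.475*n bits to 1;
# integers in [2^n - 2^(0.525 n), 2^n] whose remaining "free" 0.525*n
# bits contain fewer than a fraction c of ones number about
# 2^(H(c) * 0.525 n), while Baker-Harman-Pintz gives >> 2^(0.525 n)/n
# primes there.  For fixed c < 1/2 the low-weight integers are
# eventually outnumbered, so some primes have >= c * 0.525 n ones
# among the free bits.

from math import log2

def H(c: float) -> float:
    """Binary entropy function, in bits."""
    return -c * log2(c) - (1 - c) * log2(1 - c)

n = 10**6                       # illustrative bit length
free = 0.525 * n                # number of unpinned bits
log2_primes = free - log2(n)    # log2 of the BHP prime count, up to constants
for c in (0.45, 0.49):
    log2_candidates = H(c) * free
    assert log2_candidates < log2_primes   # too few low-weight integers
    print(f"c = {c}: 2^{log2_candidates:.0f} low-weight integers "
          f"vs 2^{log2_primes:.0f} primes")
```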

Here is a paper where a similar method was used to find quadratic non-residues or even primitive roots with only few one bits (fewer bits than the least non-residue or least primitive root might need):

http://arxiv.org/abs/1207.0842

Now, an example of a sequence with a quite small counting function (Problem 1).

Let us first look at large and sparse, but finite sets. Let $F_n=2^{2^n}+1$ be the $n$-th Fermat number.

Let $D(F_n)$ be the set of its divisors. Observe that, by Wigert's bound, the number of divisors is $d(F_n)=O\!\left(\exp\!\left(c \,\frac{\log F_n}{\log \log F_n}\right)\right)=O\!\left(\exp\!\left(c'\, 2^n/n\right)\right)$.

Let $S(k)=\cup_{n=1}^k D(F_n)\subset [1,2^{2^k}+1]$. Observe that $|S(k)|= O\!\left(k \exp\!\left(c'\, 2^k /k\right)\right)< (F_k)^{0.499}$ for all large $k$.
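As a hedged sanity check, here is a tiny Python computation of $|S(k)|$ for $k=5$, using sympy. It is purely illustrative of the counting, not of the asymptotics, since factoring $F_n$ quickly becomes infeasible.

```python
# Quick check of the counting for small k: build S(k) as the union of
# the divisor sets D(F_n) and compare with F_k^0.499.  Illustrative
# only; factoring F_n is feasible only for small n.

from sympy import divisors

def fermat(n: int) -> int:
    """The n-th Fermat number F_n = 2^(2^n) + 1."""
    return 2 ** (2 ** n) + 1

k = 5
S = set()
for n in range(1, k + 1):
    S.update(divisors(fermat(n)))        # S(k) = union of the D(F_n)

print(len(S), "<", fermat(k) ** 0.499)   # 8 < F_5^0.499 ~ 6.4e4
```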

The set of divisors of the Fermat numbers contains in particular their prime factors (for each $F_n$ at least one new prime, as the Fermat numbers are pairwise coprime).

Well, to extend this to an infinite set we need to take care of the fact that with $F_{n+1}$ one possibly also adds smaller elements, so the counting becomes a bit more tricky. But these added elements are not very small, as one can show that the prime factors of $F_n$ are of size at least $2^n$. So maybe one can take the union over a thin subset of the Fermat numbers, such as $\cup_{n=1}^k D(F_{2^n})$ or similar.

Or one takes only the second smallest element $s_n$ of $D(F_n)$ (which must be a prime!), and sets

$S= \{s_n : n \ge 1\}$.

Note again that the $s_n$ are distinct and grow (at least) exponentially, $s_n>2^n$, by properties of the Fermat numbers.
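For illustration, the first few $s_n$ can be computed directly (again only for small $n$, where $F_n$ factors quickly):

```python
# The first few terms of s_n, the smallest prime factor of F_n
# (equivalently, the second smallest divisor).  Purely illustrative.

from sympy import primefactors

for n in range(1, 6):
    F = 2 ** (2 ** n) + 1
    s = min(primefactors(F))   # second smallest element of D(F_n)
    assert s > 2 ** n          # prime factors of F_n exceed 2^n
    print(n, s)                # 5, 17, 257, 65537, 641
```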

In case anybody thinks this is an artificial sequence, equivalent to writing down some prime numbers: well, the prime divisors of sequences such as $s_n=n^2+1$ or $s_n=\lfloor 2^n/n\rfloor$ might not work, as these might be too dense.

  • There is a fundamental difference between FI and HB. The first establishes bilinear bounds via a Cauchy step, the main tool being the lacunarity of the squares. The operative range is $\sum_m|\sum_n \gamma(n)a_{mn}|$, where $\gamma$ is a lot like $\mu$ (a sum over $\mu$ up to some parameter), and $n\sim N\gg x^{1/4}$ so the sum has mass, but also $N\ll x^{1/2}/\log^B x$, so $m\sim M\gg x^{1/2}\log^B x$, so upon Cauchy + order-swap the $m$-sum is feasible lattice-point counting. So $3/4$ is a natural limit. Meanwhile, HB has a multi-dimensional large sieve, on a comparison basis, so this limit is not there. – v08ltu Jul 08 '13 at 02:14