This is actually a very strong showing for Grok. I checked, and it does seem that, yes, it improved a lower bound in a serious probability paper from 2025. Multi-agent with search and code execution, but why put yourself at a disadvantage when you can actually use the tools? DS (web only) fails/gives up.
Paata Ivanisvili · 18 Feb 2026
Grok 4.20 (Beta) improves the lower bound by 9.1% on the Gaussian perimeter of convex sets in two minutes. This was pointed out to me by Xinyuan Xie.

Back in 1993, Keith Ball showed that the Gaussian perimeter of a convex body in n-dimensional Euclidean space is bounded from above by 4n^{1/4}. As for the lower bound, Ball showed that for a cube (of appropriate size) the perimeter can grow as \sqrt{\log(n)}. So for a while there was a gap as to which bound is sharp, until 2003, when, in a beautiful paper, Fedor Nazarov showed, using the example of a random polyhedron (the intersection of many random half-spaces), that the lower bound can grow as C n^{1/4}, with C = \exp(-5/4) = 0.286…. Nazarov also improved the constant 4 in the upper bound (replacing it with 0.64) when n is large. These bounds stood unbeaten until 2019, when Martin Raič improved the upper-bound constant from 0.64 to 0.59.

Grok 4.20 (Beta), by more carefully optimizing Nazarov's construction, improved the lower-bound constant from 0.286 to 0.3126. I find this surprising even if it is just playing within the techniques of Nazarov's paper, because very recently Nadimpalli and Pascale (2025) posted a preprint in which, with a different approach, they recovered Nazarov's lower bound with the same constant factor 0.286….

Grok was very generous in its response: it said that the improvement it provided follows the same argument as Nazarov "line-by-line," whereas when I asked other models (other than Grok) to verify Grok's claim, they agreed on everything except this part; they said the improvement is not really "line-by-line" :D.

Finally, I would not say that Nazarov missed this improvement. Knowing him for a long time, I am pretty confident it is common for him to sacrifice optimal constants for algebraic elegance.

Why is all this interesting?
Having control of the Gaussian perimeter allows one to control the Fourier tails of the characteristic functions of these sets, which in turn gives control of the time complexity of PAC-learning and agnostic-learning algorithms for this family (see Klivans–O'Donnell–Servedio).

References:
- Chat link with Grok 4.20 (Beta).
- Keith Ball. The Reverse Isoperimetric Problem for Gaussian Measure. Discrete and Computational Geometry, 10:411–420, 1993.
- Adam Klivans, Ryan O'Donnell, and Rocco A. Servedio. Learning geometric concepts via Gaussian surface area. In Proc. 49th IEEE Symposium on Foundations of Computer Science (FOCS), pages 541–550, 2008.
- Shivam Nadimpalli and Caleb Pascale. On the Maximal Gaussian Perimeter of Convex Sets, Revisited. Preprint, 2025.
- Fedor Nazarov. On the maximal perimeter of a convex set in R^n with respect to a Gaussian measure. In Geometric Aspects of Functional Analysis (2001–2002), pages 169–187. Lecture Notes in Mathematics, Volume 1807, Springer, 2003.
- Martin Raič. A multivariate Berry–Esseen theorem with explicit constants. Bernoulli, 25(4A):2824–2853, 2019.
To be clear: if I tell DS NOT to give up, it thinks much harder, 12 minutes here, and produces an idea for how the constant could be improved. But the code it generates fails. On second thought, it gives up. Its approach actually looks qualitatively "correct," but it gets 0.3116, < Grok.
If DeepSeek's code is fixed (even by DeepSeek itself), it produces a result that converges to Grok's value. So I suppose that with a fairly simple REPL it would have "succeeded" after all. Either way, higher utility for Grok here.
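For reference, the constants quoted in the thread are easy to sanity-check numerically. This is a quick sketch, not a derivation: the 0.3126 value is Grok's reported optimum taken from the post, and the growth comparison just plugs the stated rates in for a few n.

```python
import math

# Constants from the discussion above:
# Nazarov (2003): lower bound C * n^{1/4} with C = exp(-5/4).
C_nazarov = math.exp(-5 / 4)  # ~0.2865
# Value reported in the thread for Grok's re-optimized construction
# (taken from the post, not derived here).
C_grok = 0.3126

print(f"Nazarov constant exp(-5/4) = {C_nazarov:.4f}")
print(f"Reported improvement       = {C_grok:.4f}")
print(f"Relative gain              = {(C_grok / C_nazarov - 1) * 100:.1f}%")

# Growth comparison: Ball's cube example gives ~sqrt(log n),
# while the random-polyhedron bound grows like C * n^{1/4}.
for n in (10**3, 10**6, 10**9):
    print(f"n={n:>10}: sqrt(log n)={math.sqrt(math.log(n)):6.3f}, "
          f"C*n^(1/4)={C_nazarov * n**0.25:8.3f}")
```

The relative gain works out to the 9.1% quoted at the top of the post, and the loop shows how quickly n^{1/4} overtakes \sqrt{\log n}.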