# Controlling $A+B/B_0$

Some numerical data on $|\frac{A+B}{B_0}|$ and on $\mathrm{Re} \frac{A+B}{B_0}$, using a step size of 1 for $x$, suggest that this ratio tends to oscillate roughly between 0.5 and 3 for medium values of $x$:

| range of $x$ | minimum value | max value | average value | standard deviation | min real part | max real part |
| --- | --- | --- | --- | --- | --- | --- |
| 0-1000 | 0.179 | 4.074 | 1.219 | 0.782 | -0.09 | 4.06 |
| 1000-2000 | 0.352 | 4.403 | 1.164 | 0.712 | 0.02 | 4.43 |
| 2000-3000 | 0.352 | 4.050 | 1.145 | 0.671 | 0.15 | 3.99 |
| 3000-4000 | 0.338 | 4.174 | 1.134 | 0.640 | 0.34 | 4.48 |
| 4000-5000 | 0.386 | 4.491 | 1.128 | 0.615 | 0.33 | 4.33 |
| 5000-6000 | 0.377 | 4.327 | 1.120 | 0.599 | 0.377 | 4.327 |
| $1-10^5$ | 0.179 | 4.491 | 1.077 | 0.455 | -0.09 | 4.48 |
| $10^5-2 \times 10^5$ | 0.488 | 3.339 | 1.053 | 0.361 | 0.48 | 3.32 |
| $2 \times 10^5-3 \times 10^5$ | 0.508 | 3.049 | 1.047 | 0.335 | 0.50 | 3.00 |
| $3 \times 10^5-4 \times 10^5$ | 0.517 | 2.989 | 1.043 | 0.321 | 0.52 | 2.97 |
| $4 \times 10^5-5 \times 10^5$ | 0.535 | 2.826 | 1.041 | 0.310 | 0.53 | 2.82 |
| $5 \times 10^5-6 \times 10^5$ | 0.529 | 2.757 | 1.039 | 0.303 | 0.53 | 2.75 |
| $6 \times 10^5-7 \times 10^5$ | 0.548 | 2.728 | 1.038 | 0.296 | 0.55 | 2.72 |

Here is a computation of the magnitude $|\frac{d}{dx}(B'/B'_0)|$ of the derivative of $B'/B'_0$, sampled at steps of 1 in $x$, together with a crude upper bound coming from the triangle inequality, to give some indication of the oscillation:

| range of $T=x/2$ | max value | average value | standard deviation | triangle inequality bound |
| --- | --- | --- | --- | --- |
| 0-1000 | 1.04 | 0.33 | 0.19 | |
| 1000-2000 | 1.25 | 0.39 | 0.24 | |
| 2000-3000 | 1.31 | 0.39 | 0.25 | |
| 3000-4000 | 1.39 | 0.38 | 0.27 | |
| 4000-5000 | 1.64 | 0.37 | 0.26 | |
| 5000-6000 | 1.60 | 0.36 | 0.27 | |
| 6000-7000 | 1.61 | 0.36 | 0.26 | |
| 7000-8000 | 1.55 | 0.36 | 0.27 | |
| 8000-9000 | 1.65 | 0.34 | 0.26 | |
| 9000-10000 | 1.47 | 0.34 | 0.26 | |
| $1-10^5$ | 1.78 | 0.28 | 0.23 | 2.341 |
| $10^5-2 \times 10^5$ | 1.66 | 0.22 | 0.18 | 2.299 |
| $2 \times 10^5-3 \times 10^5$ | 1.55 | 0.20 | 0.17 | 2.195 |
| $3 \times 10^5-4 \times 10^5$ | 1.53 | 0.19 | 0.16 | 2.109 |
| $4 \times 10^5-5 \times 10^5$ | 1.31 | 0.18 | 0.15 | 2.039 |
| $5 \times 10^5-6 \times 10^5$ | 1.34 | 0.18 | 0.14 | |
| $6 \times 10^5-7 \times 10^5$ | 1.33 | 0.17 | 0.14 | |

In the toy case, we have

$\frac{|A^{toy}+B^{toy}|}{|B^{toy}_0|} \geq |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}|$

where $b_n := \exp( \frac{t}{4} \log^2 n)$, $a_n := (n/N)^{y} b_n$, and $s := \frac{1+y+ix}{2} + \frac{t}{2} \log N + \frac{\pi i t}{8}$. For the effective approximation one can write

$\frac{A^{eff}+B^{eff}}{B^{eff}_0} = \sum_{n=1}^N \frac{b_n}{n^{s_B}} + \lambda \sum_{n=1}^N \frac{b_n}{n^{s_A}}$

where now $b_n := \exp( \frac{t}{4} \log^2 n)$, $s_B := \frac{1+y-ix}{2} + \frac{t}{2} \alpha_1(\frac{1+y-ix}{2})$, $s_A := \frac{1-y+ix}{2} + \frac{t}{2} \alpha_1(\frac{1-y+ix}{2})$, and

$\lambda := \frac{\exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}( \frac{1-y+ix}{2} )}{ \overline{\exp( \frac{t}{4} \alpha_1(\frac{1+y+ix}{2})^2 ) H_{0,1}( \frac{1+y+ix}{2} )} }.$

In particular one has

$\frac{|A^{eff}+B^{eff}|}{|B^{eff}_0|} \geq |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \quad (2.1)$

where $b_n := \exp( \frac{t}{4} \log^2 n)$, $s := \frac{1+y+ix}{2} + \frac{t}{2} \alpha_1(\frac{1+y+ix}{2})$, and

$a_n := |\lambda| n^{y - \frac{t}{2} \alpha_1(\frac{1-y+ix}{2}) + \frac{t}{2} \alpha_1(\frac{1+y+ix}{2})} b_n.$

It is thus of interest to obtain lower bounds for expressions of the form

$|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}|$

in situations where $b_1=1$ is expected to be a dominant term.

From the triangle inequality one obtains the lower bound

$|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 1 - |a_1| - \sum_{n=2}^N \frac{|a_n|+|b_n|}{n^\sigma}$

where $\sigma := \frac{1+y}{2} + \frac{t}{2} \log N$ is the real part of $s$. There is a refinement:

**Lemma 1** If $a_n,b_n$ are real coefficients with $b_1 = 1$ and $0 \leq a_1 \lt 1$, then

$|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 1 - a_1 - \sum_{n=2}^N \frac{\max( |b_n-a_n|, \frac{1-a_1}{1+a_1} |b_n+a_n|)}{n^\sigma}.$

**Proof** By a continuity argument we may assume without loss of generality that the left-hand side is positive; we may then write it as

$|\sum_{n=1}^N \frac{b_n - e^{i\theta} a_n}{n^s}|$

for some phase $\theta$. By the triangle inequality, this is at least

$|1 - e^{i\theta} a_1| - \sum_{n=2}^N \frac{|b_n - e^{i\theta} a_n|}{n^\sigma}.$

We factor out $|1 - e^{i\theta} a_1|$, which is at least $1-a_1$, to obtain the lower bound

$(1-a_1) (1 - \sum_{n=2}^N \frac{|b_n - e^{i\theta} a_n| / |1 - e^{i\theta} a_1|}{n^\sigma}).$

By the cosine rule, we have

$(|b_n - e^{i\theta} a_n| / |1 - e^{i\theta} a_1|)^2 = \frac{b_n^2 + a_n^2 - 2 a_n b_n \cos \theta}{1 + a_1^2 -2 a_1 \cos \theta}.$

This is a fractional linear function of $\cos \theta$ with no poles in the range $[-1,1]$ of $\cos \theta$. Thus this function is monotone on this range and attains its maximum at either $\cos \theta=+1$ or $\cos \theta = -1$. We conclude that

$\frac{|b_n - e^{i\theta} a_n|}{|1 - e^{i\theta} a_1|} \leq \max( \frac{|b_n-a_n|}{1-a_1}, \frac{|b_n+a_n|}{1+a_1} )$

and the claim follows. $\Box$
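As a numerical sanity check, both lower bounds can be evaluated directly for the toy coefficients $b_n = \exp(\frac{t}{4} \log^2 n)$, $a_n = (n/N)^y b_n$ with $\sigma = \frac{1+y}{2} + \frac{t}{2} \log N$. The sketch below (the function names and the defaults $t=y=0.4$ are ours) is one way to do this:

```python
import math

def toy_coeffs(N, t=0.4, y=0.4):
    """Toy coefficients b_n = exp((t/4) log^2 n), a_n = (n/N)^y b_n,
    and sigma = (1+y)/2 + (t/2) log N."""
    b = [math.exp((t / 4) * math.log(n) ** 2) for n in range(1, N + 1)]
    a = [(n / N) ** y * b[n - 1] for n in range(1, N + 1)]
    sigma = (1 + y) / 2 + (t / 2) * math.log(N)
    return a, b, sigma

def triangle_bound(N, t=0.4, y=0.4):
    """Plain triangle inequality: 1 - a_1 - sum_{n>=2} (|a_n| + |b_n|)/n^sigma."""
    a, b, sigma = toy_coeffs(N, t, y)
    return 1 - a[0] - sum((a[n - 1] + b[n - 1]) / n ** sigma
                          for n in range(2, N + 1))

def lemma1_bound(N, t=0.4, y=0.4):
    """Lemma 1 refinement:
    1 - a_1 - sum_{n>=2} max(|b_n - a_n|, (1-a_1)/(1+a_1)|b_n + a_n|)/n^sigma."""
    a, b, sigma = toy_coeffs(N, t, y)
    r = (1 - a[0]) / (1 + a[0])
    return 1 - a[0] - sum(
        max(abs(b[n - 1] - a[n - 1]), r * abs(b[n - 1] + a[n - 1])) / n ** sigma
        for n in range(2, N + 1))
```

For these defaults the Lemma 1 bound turns positive at noticeably smaller $N$ than the plain triangle inequality, consistent with the thresholds recorded in the toy model table.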

We can also mollify the $a_n,b_n$:

**Lemma 2** If $\lambda_1,\dots,\lambda_D$ are complex numbers, then

$|\sum_{d=1}^D \frac{\lambda_d}{d^s}| (|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}|) = ( |\sum_{n=1}^{DN} \frac{\tilde b_n}{n^s}| - |\sum_{n=1}^{DN} \frac{\tilde a_n}{n^s}| )$

where

$\tilde a_n := \sum_{d=1}^D 1_{n \leq dN} 1_{d|n} \lambda_d a_{n/d}$
$\tilde b_n := \sum_{d=1}^D 1_{n \leq dN} 1_{d|n} \lambda_d b_{n/d}$

**Proof** This is immediate from the Dirichlet convolution identities

$(\sum_{d=1}^D \frac{\lambda_d}{d^s}) \sum_{n=1}^N \frac{a_n}{n^s} = \sum_{n=1}^{DN} \frac{\tilde a_n}{n^s}$

and

$(\sum_{d=1}^D \frac{\lambda_d}{d^s}) \sum_{n=1}^N \frac{b_n}{n^s} = \sum_{n=1}^{DN} \frac{\tilde b_n}{n^s}.$

$\Box$
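Lemma 2 is easy to test numerically: the mollified coefficients are just a Dirichlet convolution, and the two sides of the identity can be compared at a sample complex $s$. A minimal sketch (helper names ours):

```python
def mollified_coeffs(lam, coeffs):
    """Dirichlet convolution: given lambda_1..lambda_D and a_1..a_N, return
    tilde a_1..tilde a_{DN} with tilde a_n = sum_{d | n, d<=D, n/d<=N} lambda_d a_{n/d}."""
    D, N = len(lam), len(coeffs)
    tilde = [0.0] * (D * N)
    for d in range(1, D + 1):
        for m in range(1, N + 1):
            tilde[d * m - 1] += lam[d - 1] * coeffs[m - 1]
    return tilde

def dirichlet_sum(coeffs, s):
    """Evaluate sum_n coeffs[n-1] / n^s at a complex s."""
    return sum(c / n ** s for n, c in enumerate(coeffs, start=1))
```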

Combining the two lemmas, we see for instance that we can show $|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \gt 0$ whenever one can find $\lambda_1,\dots,\lambda_D$ with $\lambda_1=1$ and

$\sum_{n=2}^N \frac{\max( \frac{|\tilde b_n-\tilde a_n|}{1-a_1}, \frac{|\tilde b_n+ \tilde a_n|}{1+a_1})}{n^\sigma} \lt 1.$

A usable choice of mollifier seems to be the Euler products

$\sum_{d=1}^D \frac{\lambda_d}{d^s} := \prod_{p \leq P} (1 - \frac{b_p}{p^s})$

which are designed to kill off the first few $\tilde b_n$ coefficients.
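For small $P$ this product expands into finitely many coefficients $\lambda_d$, supported on squarefree $P$-smooth $d$ with $\lambda_d = \prod_{p \mid d} (-b_p)$. A sketch of this expansion (assuming $t=0.4$; helper names ours):

```python
import math
from itertools import combinations

def bn(n, t=0.4):
    """Toy coefficient b_n = exp((t/4) log^2 n)."""
    return math.exp((t / 4) * math.log(n) ** 2)

def euler_mollifier(P, t=0.4):
    """Coefficients lambda_d of prod_{p <= P} (1 - b_p / p^s), returned as a
    dict {d: lambda_d}; d ranges over squarefree P-smooth numbers."""
    primes = [p for p in range(2, P + 1) if all(p % q for q in range(2, p))]
    lam = {1: 1.0}
    for r in range(1, len(primes) + 1):
        for subset in combinations(primes, r):
            lam[math.prod(subset)] = math.prod(-bn(p, t) for p in subset)
    return lam
```

One can then check that the convolved coefficients $\tilde b_p$ vanish for primes $p \leq P$, which is the point of this choice.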

### Analysing the toy model

With regard to the toy problem of showing that $A^{toy}+B^{toy}$ does not vanish, here are the least values of $N$ for which this method works:

| $P$ in Euler product | $N$ using triangle inequality | $N$ using Lemma 1 | $N$ using Lemma 1 and a custom mollifier |
| --- | --- | --- | --- |
| 1 | 1391 | 1080 | |
| 2 | 478 | 341 | 336 |
| 3 | 322 | 220 | 212 |
| 5 | 282 | 192 | 182 |
| 7 | | | 180 |
| 11 | | | 176 |

Dropping the $\lambda_6$ term from the $P=3$ Euler factor worsens the 220 threshold slightly to 235. However, other custom mollifiers do work (see the table above).

### Analysing the effective model

The differences between the toy model and the effective model are:

• The real part $\sigma$ of $s$ is now $\frac{1+y}{2} + \frac{t}{2} \mathrm{Re} \alpha_1(\frac{1+y+ix}{2})$ rather than $\frac{1+y}{2} + \frac{t}{2} \log N$. (The imaginary part of $s$ also changes somewhat.)
• The coefficient $a_n$ is now given by
$a_n = |\lambda| n^{y + \frac{t}{2} (\alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2}))} b_n$

rather than $a_n = N^{-y} n^y b_n$, where

$|\lambda| = |\frac{\exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}( \frac{1-y+ix}{2})}{\exp( \frac{t}{4} \alpha_1(\frac{1+y+ix}{2})^2 ) H_{0,1}( \frac{1+y+ix}{2})}|.$

Two complications arise here compared with the toy model: firstly, $\sigma$ and $a_n$ now depend on $x$ and not just on $N$; secondly, the $a_n$ are not quite real-valued, making it more difficult to apply Lemma 1.

However, we have good estimates for $\sigma$ and $a_n$ that depend only on $N$. Note that

$2\pi N^2 \leq T' \lt 2\pi (N+1)^2$

and hence

$x_N \leq x \lt x_{N+1}$

where

$x_N := 4\pi N^2 - \frac{\pi t}{4}.$

To control $\sigma$, it suffices to obtain lower bounds because our criteria (both the triangle inequality and Lemma 1) become harder to satisfy when $\sigma$ decreases. We compute

$\sigma = \frac{1+y}{2} + \frac{t}{2} \mathrm{Re}(\frac{1}{1+y+ix} + \frac{2}{-1+y+ix} + \frac{1}{2} \log \frac{1+y+ix}{4\pi})$
$= \frac{1+y}{2} + \frac{t}{2} (\frac{1+y}{(1+y)^2+x^2} + \frac{-2+2y}{(-1+y)^2+x^2} + \frac{1}{2} \log \frac{|1+y+ix|}{4\pi})$
$\geq \frac{1+y}{2} + \frac{t}{2} (\frac{4 y(1+y)}{((-1+y)^2+x^2)((1+y)^2+x^2)} - \frac{1+3y}{(-1+y)^2+x^2} + \frac{1}{2} \log \frac{x}{4\pi})$
$\geq \frac{1+y}{2} + \frac{t}{2} (-\frac{1-3y-\frac{4y(1+y)}{x^2}}{(-1+y)^2+x^2} + \log N)$
$\geq \frac{1+y}{2} + \frac{t}{2} \log N$

assuming that $y \geq \frac{1}{3} + \frac{4y(1+y)}{3x^2}$. Hence we can actually just use the same value of $\sigma$ as in the toy case.
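The bound above can be spot-checked numerically: the sketch below (assuming $t=y=0.4$, for which the stated condition on $y$ holds; $\alpha_1(s) = \frac{1}{2s} + \frac{1}{s-1} + \frac{1}{2}\log\frac{s}{2\pi}$ as used throughout; function names ours) evaluates $\sigma$ at a sample $x$ in the window $[x_N, x_{N+1})$ and compares it with the toy value $\frac{1+y}{2} + \frac{t}{2}\log N$:

```python
import cmath
import math

def alpha1(s):
    """alpha_1(s) = 1/(2s) + 1/(s-1) + (1/2) log(s / (2 pi))."""
    return 1 / (2 * s) + 1 / (s - 1) + 0.5 * cmath.log(s / (2 * math.pi))

def sigma_eff(x, t=0.4, y=0.4):
    """Real part sigma of s in the effective model:
    (1+y)/2 + (t/2) Re alpha_1((1+y+ix)/2)."""
    s = (1 + y + 1j * x) / 2
    return (1 + y) / 2 + (t / 2) * alpha1(s).real
```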

Next we control $|\lambda|$. Note that we can increase $|\lambda|$ (thus multiplying $\sum_{n=1}^N \frac{a_n}{n^s}$ by a quantity greater than 1) without affecting (2.1), so we just need upper bounds on $|\lambda|$. We may factor

$|\lambda| = \exp( \frac{t}{4} \mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) + \mathrm{Re}( f(\frac{1-y+ix}{2}) - f(\frac{1+y+ix}{2}) ) )$

where

$f(s) := -\frac{s}{2} \log \pi + (\frac{s}{2} - \frac{1}{2}) \log \frac{s}{2} - \frac{s}{2}.$

By the mean value theorem, we have

$\mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = -2 y \mathrm{Re}( \alpha_1(s_1) \alpha'_1(s_1) )$

for some $s_1$ on the segment between $\frac{1-y+ix}{2}$ and $\frac{1+y+ix}{2}$. We have

$\alpha_1(s_1) = \frac{1}{2s_1} + \frac{1}{s_1-1} + \frac{1}{2} \log \frac{s_1}{2\pi}$
$= O_{\leq}(\frac{1}{x}) + O_{\leq}(\frac{1}{x/2}) + \frac{1}{2} \log \frac{|s_1|}{2\pi} + O_{\leq}(\frac{\pi}{4})$
$= O_{\leq}( \frac{\pi}{4} + \frac{3}{x_N}) + \frac{1}{2} O_{\leq}^{\mathbf{R}}( \log \frac{|1+y+ix_{N+1}|}{4\pi} )$

and

$\alpha'_1(s_1) = -\frac{1}{2s_1^2} - \frac{1}{(s_1-1)^2} + \frac{1}{2s_1}$
$= O_{\leq}(\frac{1}{x^2/2}) + O_{\leq}(\frac{1}{x^2/4}) + \frac{1}{2s_1}$
$= O_{\leq}(\frac{6}{x_N^2}) + \frac{1}{2s_1}$
$= O_{\leq}(\frac{6}{x_N^2}) + O_{\leq}( \frac{1}{x_N} ).$

Thus one has

$\mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = 2y O_{\leq}( (\frac{\pi}{4} + \frac{3}{x_N}) (\frac{1}{x_N} + \frac{6}{x_N^2}) )$
$+ 2y O_{\leq}( \log \frac{|1+y+ix_{N+1}|}{4\pi} (\frac{6}{x_N^2} + |\mathrm{Re} \frac{1}{2s_1}|) ).$

Now we have

$\mathrm{Re} \frac{1}{2s_1} = \frac{\mathrm{Re}(s_1)}{2|s_1|^2}$
$\leq \frac{1+y}{x^2}$
$\leq \frac{1+y}{x_N^2};$

also

$(\frac{\pi}{4} + \frac{3}{x_N}) (\frac{1}{x_N} + \frac{6}{x_N^2}) \leq \frac{\pi}{4} (1 + \frac{12/\pi}{x_N}) \frac{1}{x_N-6}$
$\leq \frac{\pi}{4} ( \frac{1}{x_N-6} + \frac{12/\pi}{(x_N-6)^2} )$
$\leq \frac{\pi}{4} \frac{1}{x_N - 6 - 12/\pi}.$

We conclude that

$\mathrm{Re} (\alpha_1(\frac{1-y+ix}{2})^2 - \alpha_1(\frac{1+y+ix}{2})^2) = O_{\leq}(\frac{\pi y}{2 (x_N - 6 - 12/\pi)} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi}).$

In a similar vein, from the mean value theorem we have

$\mathrm{Re}( f(\frac{1-y+ix}{2}) - f(\frac{1+y+ix}{2}) ) = -y \mathrm{Re} f'(s_2)$

for some $s_2$ between $\frac{1-y+ix}{2}$ and $\frac{1+y+ix}{2}$. We have

$\mathrm{Re} f'(s_2) = -\frac{1}{2} \log \pi + \frac{1}{2} \log \frac{|s_2|}{2} - \mathrm{Re} \frac{1}{2s_2}$
$= \frac{1}{2} \log \frac{|s_2|}{2\pi} + O_{\leq}(\frac{\mathrm{Re}(s_2)}{2|s_2|^2})$
$\geq \log N + O_{\leq}(\frac{1+y}{x^2})$
$\geq \log N + O_{\leq}(\frac{1+y}{x_N^2})$

and thus

$|\lambda| \leq N^{-y} \exp( \frac{\pi y}{2 (x_N - 6 - 12/\pi)} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi} + \frac{y(1+y)}{x_N^2} )$
$\leq e^\delta N^{-y}$

where

$\delta := \frac{\pi y}{2 (x_N - 6 - \frac{14+2y}{\pi})} + \frac{2y(7+y)}{x_N^2} \log \frac{|1+y+ix_{N+1}|}{4\pi} \quad (1)$

Asymptotically we have

$\delta = \frac{\pi y}{2 x_N} + O( \frac{\log x_N}{x_N^2} ) = O( \frac{1}{x_N} ).$
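The quantity $\delta$ from (1) is cheap to evaluate; the following sketch (assuming $t=y=0.4$; function names ours) illustrates the smallness claimed by the asymptotic:

```python
import math

def x_N(N, t=0.4):
    """x_N = 4 pi N^2 - pi t / 4."""
    return 4 * math.pi * N ** 2 - math.pi * t / 4

def delta(N, t=0.4, y=0.4):
    """The quantity delta from (1), bounding |lambda| <= e^delta N^{-y}."""
    xN, xN1 = x_N(N, t), x_N(N + 1, t)
    term1 = math.pi * y / (2 * (xN - 6 - (14 + 2 * y) / math.pi))
    # |1 + y + i x_{N+1}| via hypot
    term2 = (2 * y * (7 + y) / xN ** 2) * math.log(
        math.hypot(1 + y, xN1) / (4 * math.pi))
    return term1 + term2
```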

Now we control $\alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2})$. By the mean-value theorem we have

$\alpha_1(\frac{1+y+ix}{2}) - \alpha_1(\frac{1-y+ix}{2}) = O_{\leq}( y |\alpha'_1(s_3)|)$

for some $s_3$ between $\frac{1+y+ix}{2}$ and $\frac{1-y+ix}{2}$. As before we have

$\alpha'_1(s_3) = -\frac{1}{2s_3^2} - \frac{1}{(s_3-1)^2} + \frac{1}{2s_3}$
$= O_{\leq}( \frac{1}{x^2/2} + \frac{1}{x^2/4} + \frac{1}{x} )$
$= O_{\leq}( \frac{1}{x_N} + \frac{6}{x_N^2} )$
$= O_{\leq}( \frac{1}{x_N-6} ).$

We conclude that (after replacing $|\lambda|$ with $e^\delta N^{-y}$)

$a_n = (n/N)^y \exp( \delta + O_{\leq}( \frac{t y \log n}{2(x_N-6)} ) ) b_n.$

The triangle inequality argument will thus give $A^{eff}+B^{eff}$ non-zero as long as

$\sum_{n=1}^N (1 + (n/N)^y \exp( \delta + \frac{t y \log n}{2(x_N-6)} ) ) \frac{b_n}{n^\sigma} \lt 2.$

The situation with using Lemma 1 is a bit more complicated because $a_n$ is not quite real. We can write $a_n = e^\delta a_n^{toy} + O_{\leq}( e_n )$ where

$a_n^{toy} := (n/N)^y b_n$

and

$e_n := e^\delta (n/N)^y (\exp( \frac{t y \log n}{2(x_N-6)} ) - 1) b_n$

and then by Lemma 1 and the triangle inequality we can make $A^{eff}+B^{eff}$ non-zero as long as

$a_1^{toy} + \sum_{n=2}^N \frac{\max( |b_n-a_n^{toy}|, \frac{1-a_1^{toy}}{1+a_1^{toy}} |b_n + a_n^{toy}| )}{n^\sigma} + \sum_{n=1}^N \frac{e_n}{n^\sigma} \lt 1.$
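The left-hand side of this criterion can be computed directly. Here is a sketch (defaults $t=y=0.4$; function names ours) implementing the displayed sum together with the error terms $e_n$:

```python
import math

def effective_lemma1_criterion(N, t=0.4, y=0.4):
    """LHS of the displayed criterion; the effective model is non-vanishing
    when this is < 1.  Uses a_n^toy = (n/N)^y b_n plus the error terms e_n."""
    sigma = (1 + y) / 2 + (t / 2) * math.log(N)
    xN = 4 * math.pi * N ** 2 - math.pi * t / 4
    xN1 = 4 * math.pi * (N + 1) ** 2 - math.pi * t / 4
    # delta as in (1)
    d = (math.pi * y / (2 * (xN - 6 - (14 + 2 * y) / math.pi))
         + (2 * y * (7 + y) / xN ** 2)
         * math.log(math.hypot(1 + y, xN1) / (4 * math.pi)))
    a1 = (1 / N) ** y
    r = (1 - a1) / (1 + a1)
    total = a1
    for n in range(1, N + 1):
        b = math.exp((t / 4) * math.log(n) ** 2)
        a_toy = (n / N) ** y * b
        e_n = (math.exp(d) * (n / N) ** y
               * (math.exp(t * y * math.log(n) / (2 * (xN - 6))) - 1) * b)
        if n >= 2:
            total += max(abs(b - a_toy), r * abs(b + a_toy)) / n ** sigma
        total += e_n / n ** sigma
    return total
```

Since the $e_n$ are tiny for large $x_N$, the criterion succeeds at essentially the same thresholds as the toy model.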

### Large $x$ case

When $N \geq 2000$ and $t=y=0.4$, we use the cruder lower bound

$|\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 2 - \sum_{n=1}^N \frac{|b_n|}{n^\sigma} - \sum_{n=1}^N \frac{|a_n|}{n^\sigma}$
$\geq 2 - \sum_{n=1}^N \frac{1}{n^{0.7 + 0.1 \log \frac{N^2}{n}}} - e^\delta \exp( \frac{t y \log N}{2(x_N-6)} ) N^{-0.4} \sum_{n=1}^N \frac{1}{n^{0.3 + 0.1 \log \frac{N^2}{n}}}$
$\geq 2 - 1.706 - e^\delta \exp( \frac{0.08 \log N}{4\pi N^2 - 6.315} ) N^{-0.4} \times 3.469$

using the estimates from *Estimating a sum*. We can bound

$\delta := \frac{0.2 \pi}{4\pi N^2 - 6.315 - \frac{14.8}{\pi}} + \frac{5.92}{(4\pi N^2 - 0.315)^2} \log |\frac{1.4}{4\pi}+i (N+1)^2|$

which for $N \geq 2000$ may be bounded by $1.26 \times 10^{-8}$. One may similarly bound $\frac{0.08 \log N}{4\pi N^2 - 6.315}$ by $1.21 \times 10^{-8}$, so we obtain the lower bound

$\frac{|A^{eff}+B^{eff}|}{|B^{eff}_0|} \geq |\sum_{n=1}^N \frac{b_n}{n^s}| - |\sum_{n=1}^N \frac{a_n}{n^s}| \geq 2 - 1.706 - 0.166 = 0.128.$
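The same crude bound can also be evaluated with the two sums computed exactly rather than through the cited uniform estimates; a sketch (function names ours) for general $N$ with the defaults $t=y=0.4$:

```python
import math

def crude_lower_bound(N, t=0.4, y=0.4):
    """2 - sum b_n/n^sigma - sum |a_n|/n^sigma, with the toy sigma and
    |a_n| <= e^delta exp(t y log N / (2(x_N - 6))) (n/N)^y b_n."""
    sigma = (1 + y) / 2 + (t / 2) * math.log(N)
    xN = 4 * math.pi * N ** 2 - math.pi * t / 4
    xN1 = 4 * math.pi * (N + 1) ** 2 - math.pi * t / 4
    # delta as in (1)
    d = (math.pi * y / (2 * (xN - 6 - (14 + 2 * y) / math.pi))
         + (2 * y * (7 + y) / xN ** 2)
         * math.log(math.hypot(1 + y, xN1) / (4 * math.pi)))
    bsum = asum = 0.0
    for n in range(1, N + 1):
        b = math.exp((t / 4) * math.log(n) ** 2)
        bsum += b / n ** sigma
        asum += (n / N) ** y * b / n ** sigma
    asum *= math.exp(d + t * y * math.log(N) / (2 * (xN - 6)))
    return 2 - bsum - asum
```

The exact sums are smaller than the uniform estimates 1.706 and 3.469, so the resulting lower bound at any particular $N \geq 2000$ is a bit better than the uniform 0.128.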

### Derivative bounds

We have

$\frac{A^{eff}+B^{eff}}{B^{eff}_0} = \sum_{n=1}^N \frac{b_n}{n^{s_B}} + \lambda \sum_{n=1}^N \frac{b_n}{n^{s_A}}$

where $b_n := \exp( \frac{t}{4} \log^2 n)$, $s_B := \frac{1+y-ix}{2} + \frac{t}{2} \alpha_1(\frac{1+y-ix}{2})$, $s_A := \frac{1-y+ix}{2} + \frac{t}{2} \alpha_1(\frac{1-y+ix}{2})$, and

$\lambda := \frac{\exp( \frac{t}{4} \alpha_1(\frac{1-y+ix}{2})^2 ) H_{0,1}( \frac{1-y+ix}{2} )}{ \exp( \frac{t}{4} \alpha_1(\frac{1+y-ix}{2})^2 ) H_{0,1}( \frac{1+y-ix}{2} ) }.$

Differentiating in $x$, we obtain

$\frac{d}{dx} \frac{A^{eff}+B^{eff}}{B^{eff}_0} = \sum_{n=1}^N \frac{-b_n \log n}{n^{s_B}} \frac{d}{dx}(s_B) + \lambda \sum_{n=1}^N \frac{b_n}{n^{s_A}} (-\log n \frac{d}{dx}(s_A) + \frac{d}{dx} \log \lambda)$

and hence

$|\frac{d}{dx} \frac{A^{eff}+B^{eff}}{B^{eff}_0}| \leq \sum_{n=1}^N \frac{b_n \log n}{n^{\mathrm{Re}(s_B)}} |\frac{d}{dx}(s_B)| + |\lambda| \sum_{n=1}^N \frac{b_n}{n^{\mathrm{Re}(s_A)}} |-\log n \frac{d}{dx}(s_A) + \frac{d}{dx} \log \lambda|.$

Recall from above that $\alpha'_1(s) = O_{\leq}( \frac{1}{x_N-6} )$. Thus we have

$\frac{d}{dx} s_B = \frac{-i}{2} - \frac{it}{4} \alpha'_1( \frac{1+y-ix}{2} )$
$= O_{\leq}( \frac{1}{2} + \frac{t}{4(x_N-6)} ).$

Similarly

$\frac{d}{dx} s_A = \frac{i}{2} + O_{\leq}( \frac{t}{4(x_N-6)} ).$

Finally, writing $s := \frac{1-y+ix}{2}$, we have

$\log \lambda = \frac{t}{4} (\alpha_1(s)^2 - \alpha_1(1-s)^2) + (-\frac{s}{2} \log \pi + (\frac{s}{2}-\frac{1}{2}) \log \frac{s}{2} - \frac{s}{2}) - (-\frac{1-s}{2} \log \pi + (\frac{1-s}{2}-\frac{1}{2}) \log \frac{1-s}{2} - \frac{1-s}{2})$

and hence

$-\frac{i}{2} \log n + \frac{d}{dx} \log \lambda = -\frac{i}{2} \log n +\frac{i}{2} \frac{d}{ds} \log \lambda$
$= \frac{it}{4} (\alpha_1(s) \alpha'_1(s) + \alpha_1(1-s) \alpha'_1(1-s)) + \frac{i}{4} \log \frac{s}{2\pi n} - \frac{i}{4s} + \frac{i}{4} \log \frac{1-s}{2\pi n} - \frac{i}{4(1-s)}$
$= O_{\leq}( \frac{t}{4(x_N-6)} (|\alpha_1(s)| + |\alpha_1(1-s)|) + \frac{1}{4} \log \frac{|1-y+ix| |1+y-ix|}{16\pi^2 n^2} + \frac{1}{4|s| |1-s|} );$

bounding

$|\alpha_1(s)| \leq \frac{1}{2} \log \frac{|1-y+ix|}{4\pi} + \frac{3}{2x_N}$

and

$|\alpha_1(1-s)| \leq \frac{1}{2} \log \frac{|1+y+ix|}{4\pi} + \frac{3}{2x_N}$

we thus have

$|-\frac{i}{2} \log n + \frac{d}{dx} \log \lambda| \leq \frac{1}{4} (1 + \frac{t}{2(x_N-6)}) \log \frac{|1-y+ix| |1+y-ix|}{16\pi^2 n^2} + \frac{3t}{4 x_N(x_N-6)} + \frac{1}{4 x_N^2}$
$\leq \frac{1}{4} (1 + \frac{t}{2(x_N-6)}) \log \frac{|1-y+ix_{N+1}| |1+y-ix_{N+1}|}{16\pi^2} + \frac{3t+1}{4 x_N(x_N-6)}.$

Also, as established above, we have

$\mathrm{Re}(s_B) \geq \frac{1+y}{2} + \frac{t}{2} (\frac{3y-1}{x_{N+1}^2} + \log N)$

(for $y \geq 1/3 + 4y(1+y)/3x^2$) and a similar argument gives

$\mathrm{Re}(s_A) \geq \frac{1-y}{2} + \frac{t}{2} (\frac{2-3y}{x_{N}^2} + \log N).$

We conclude that

$|\frac{d}{dx} \frac{A^{eff}+B^{eff}}{B^{eff}_0}| \leq (1 + \frac{t}{2(x_N-6)}) \sum_{n=1}^N \frac{b_n}{n^{\frac{1+y}{2} + \frac{t}{2} (\frac{3y-1}{x_{N+1}^2} + \log N)}} \frac{\log n}{2} + e^\delta N^{-y} \sum_{n=1}^N \frac{b_n}{n^{\frac{1-y}{2} + \frac{t}{2} (\frac{2-3y}{x_{N}^2} + \log N)}} (\frac{t}{4(x_N-6)} \log n + \frac{1}{4} \log \frac{|1-y+ix_{N+1}| |1+y-ix_{N+1}|}{16\pi^2 n^2} + \frac{3t+1}{4 x_N(x_N-6)})$

where $b_n = \exp( \frac{t}{4} \log^2 n)$, $x_N = 4\pi N^2 - \frac{\pi t}{4}$, and $\delta$ was defined in (1) above.
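The final bound is fully explicit and can be evaluated directly; here is a sketch (function names ours; $t=y=0.4$ by default) implementing the displayed right-hand side:

```python
import math

def derivative_bound(N, t=0.4, y=0.4):
    """Evaluate the displayed upper bound on |d/dx (A^eff+B^eff)/B^eff_0|."""
    xN = 4 * math.pi * N ** 2 - math.pi * t / 4
    xN1 = 4 * math.pi * (N + 1) ** 2 - math.pi * t / 4
    # delta as in (1)
    d = (math.pi * y / (2 * (xN - 6 - (14 + 2 * y) / math.pi))
         + (2 * y * (7 + y) / xN ** 2)
         * math.log(math.hypot(1 + y, xN1) / (4 * math.pi)))
    sigmaB = (1 + y) / 2 + (t / 2) * ((3 * y - 1) / xN1 ** 2 + math.log(N))
    sigmaA = (1 - y) / 2 + (t / 2) * ((2 - 3 * y) / xN ** 2 + math.log(N))
    # |1-y+ix||1+y-ix| at x = x_{N+1}
    mod2 = math.hypot(1 - y, xN1) * math.hypot(1 + y, xN1)
    total_B = total_A = 0.0
    for n in range(1, N + 1):
        b = math.exp((t / 4) * math.log(n) ** 2)
        total_B += b / n ** sigmaB * math.log(n) / 2
        total_A += b / n ** sigmaA * (
            t * math.log(n) / (4 * (xN - 6))
            + 0.25 * math.log(mod2 / (16 * math.pi ** 2 * n ** 2))
            + (3 * t + 1) / (4 * xN * (xN - 6)))
    return (1 + t / (2 * (xN - 6))) * total_B + math.exp(d) * N ** (-y) * total_A
```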