What does an algorithm having linear convergence imply about the number of iterations required to converge (to within a fixed tolerance)?
In particular, I have an algorithm (the preconditioned conjugate gradient (PCG) method, in case you're curious) whose linear convergence factor depends on the square root of the condition number κ: the error contracts by roughly a factor of (√κ − 1)/(√κ + 1) per iteration. If I could prove that κ = O(f(n)), would the number of iterations be O(√(f(n)))? (My advisor thinks so; I can't get my head around why.)
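For what it's worth, here's my attempt to check the claim numerically (a sketch only; the function name, tolerance, and test values of κ are my own, and I'm using the classical PCG error bound ‖e_k‖_A ≤ 2·ρ^k·‖e_0‖_A with ρ = (√κ − 1)/(√κ + 1)):

```python
import math

def pcg_iteration_bound(kappa, tol):
    """Smallest k guaranteeing 2 * rho^k <= tol under the classical
    PCG bound, where rho = (sqrt(kappa) - 1) / (sqrt(kappa) + 1)."""
    rho = (math.sqrt(kappa) - 1) / (math.sqrt(kappa) + 1)
    # Solve 2 * rho^k <= tol for k: k >= log(tol / 2) / log(rho).
    return math.ceil(math.log(tol / 2) / math.log(rho))

# If the claim is right, k / sqrt(kappa) should approach a constant
# (namely log(2 / tol) / 2, since log(1 / rho) ~ 2 / sqrt(kappa)).
for kappa in (1e2, 1e4, 1e6):
    k = pcg_iteration_bound(kappa, 1e-8)
    print(f"kappa = {kappa:.0e}: k = {k}, k / sqrt(kappa) = {k / math.sqrt(kappa):.3f}")
```

The ratio k/√κ does level off as κ grows, which seems to support the O(√(f(n))) count (times a log(1/tolerance) factor), but I'd like to understand the argument rather than just the numbers.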
Thanks in advance!