Call an array $C$-sorted if each element is at most $C$ places away from its position in the sorted array (this is the same as the “almost sorted” array in the question, but in plain English).
Assume we will use comparison sort.
Claim One: Given an array of $n$ elements that is $C$-sorted, we can sort it with at most $n\lceil\log_2 (C+1)\rceil$ comparisons.
Proof: Here is how we can sort the given array.

Create a buffer that is an array of size $C+1$.

One by one, move the first $\min(n, C+1)$ elements into the buffer, keeping the buffer sorted by using binary search to find the right place for each new element.

One by one, move the minimum element from the buffer to the result array, and, if the given array is not yet exhausted, add its next element to the buffer, keeping the buffer sorted by using binary search to find the right place for that next element.
For each element in the array, at most $\lceil\log_2 (C+1)\rceil$ comparisons are used to find its right place in the buffer, so the total number of comparisons is at most $n\lceil\log_2 (C+1)\rceil$. $\quad\checkmark$
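For concreteness, the procedure above can be sketched in Python. Here `bisect.insort` performs exactly the binary-search insertion described, spending at most $\lceil\log_2(C+1)\rceil$ comparisons per element (the list shifting it does costs element moves, but no comparisons). The function name is mine, for illustration:

```python
from bisect import insort

def sort_c_sorted(arr, C):
    """Sort a C-sorted array (each element at most C places away from
    its sorted position) using a sorted buffer of at most C + 1 elements."""
    buffer, result = [], []
    for x in arr:
        insort(buffer, x)                 # binary-search insertion into the buffer
        if len(buffer) > C + 1:           # buffer full: its minimum is final
            result.append(buffer.pop(0))  # pop(0) shifts elements, no comparisons
    result.extend(buffer)                 # remaining buffer is already sorted
    return result
```

The correctness rests on the $C$-sorted property: once $C+1$ elements past some position have been read, the buffer's minimum can no longer be displaced by unread input, so it is safe to emit.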
Claim Two: At least $\log_2 (n!) - n\lceil\log_2 (C+1)\rceil$ comparisons are needed to $C$-sort an array of size $n$ in the worst case.
Proof: Suppose, to the contrary, that there is an algorithm that can $C$-sort any given array of size $n$ with fewer than $\log_2 (n!) - n\lceil\log_2 (C+1)\rceil$ comparisons. Composing that algorithm with the algorithm from the proof of Claim One, we would obtain an algorithm that sorts any given array of size $n$ with fewer than $\log_2(n!)$ comparisons, which is well known to be impossible. $\quad\checkmark$
Claim Three: There is no algorithm that can $C$-sort an array of size $n$ with $O\!\left(\frac{n\log_2 n}{C}\right)$ comparisons.
Proof: If $n\ge C^{1+\epsilon}$ for some fixed $\epsilon\gt0$, we have
$$\begin{aligned}\lim_{C\to\infty} \frac{\log_2 (n!) - n\lceil\log_2 (C+1)\rceil}{\frac{n\log_2 n}{C}} &= \lim_{C\to\infty} C\left(\frac{\log_2 (n!)}{n\log_2 n}-\frac{\lceil\log_2 (C+1)\rceil}{\log_2 n}\right)\\&\ge \lim_{C\to\infty} C\left(1-\frac1{1+\epsilon}\right)=\infty.
\end{aligned}\quad\checkmark$$
Indeed, $\log_2 (n!) - n\lceil\log_2 (C+1)\rceil$ gives a pretty strong lower bound on the number of comparisons needed in the worst case, since
$\log_2 (n!)$ is the lower bound for ordinarily sorting an array, and, since $\log_2(n!)\approx n\log n$, for any fixed $C$,
$$n\lceil\log_2 (C+1)\rceil = o(\log_2 (n!)). \color{#d0d0d0}{\text{(small o, not big O!)}}$$
That is, for every fixed $\epsilon\gt0$ and $C$, at least $(1-\epsilon)\log_2(n!)$ comparisons are needed to $C$-sort an array when $n$ is sufficiently large.
We also note that, quoting Wikipedia's article on quicksort:

> On average, quicksort performs only about 39% worse than in its best case. In this sense, it is closer to the best case than the worst case. A comparison sort cannot use less than $\log_2(n!)$ comparisons on average to sort $n$ items (as explained in the article Comparison sort) and in case of large $n$, Stirling's approximation yields $\log_2(n!) \approx n(\log_2 n - \log_2 e)$, so quicksort is not much worse than an ideal comparison sort.
Now consider an array of $n=1000$ elements and $C=10$.
- Quicksort will perform about $2n\log_2 n\approx 19931$ comparisons on average to sort the array, as shown on Wikipedia.
- Claim Two says that $\log_2 (n!) - n\lceil\log_2 (C+1)\rceil\approx4529$ comparisons are needed to $10$-sort the array in the worst case, which is essentially the information-theoretic optimum.
- Note that $\frac{n\log_2 n}{C}\approx997$, which is much less than 4529.
So I believe (some further arguments are omitted here) that an algorithm that performs about $n\log_2(n/C)\approx6643$ comparisons to $10$-sort the array is pretty much the best we can get.
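The figures above can be reproduced with a few lines of Python; `math.lgamma` gives $\ln(n!)$ accurately enough for this purpose (the variable names are mine, for illustration):

```python
import math

n, C = 1000, 10

log2_factorial = math.lgamma(n + 1) / math.log(2)  # log2(n!), about 8529
# Claim Two's lower bound: log2(n!) - n*ceil(log2(C+1)), about 4529
claim_two_bound = log2_factorial - n * math.ceil(math.log2(C + 1))
quicksort_avg = 2 * n * math.log2(n)               # about 19931
ratio_bound = n * math.log2(n) / C                 # about 997, far below 4529
conjectured = n * math.log2(n / C)                 # about 6643

print(f"{claim_two_bound:.0f} {quicksort_avg:.0f} "
      f"{ratio_bound:.0f} {conjectured:.0f}")
```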