I've already asked a very similar question on Stack Overflow, worded differently, and got a satisfying answer under the assumption that going through all the possibilities is impossible (NP-hard).
After further research (because this is my bottleneck), I stumbled upon this similar question, but again it assumes brute force is infeasible, and it is not exactly what I'm looking for. Still, it seems the question is better suited to this site than to SO (correct me if I'm wrong).
I am pretty sure that, given my constraints, this can be solved optimally by "brute force" with the right algorithm in a short time. More on that later.
I have a small set of repeated values, for example

values = 10 x 18 (ten times the value 18), 5 x …, 6 x …
which I need to distribute as evenly as possible over a fixed number of bins, with a maximum allowed difference between bins (i.e. between the sums of the elements in each bin, see below).
The reason I think this can be brute-forced is my constraints. I will never have a large number of bins (at the moment the maximum is 3; assume it is at most 4 or 5 if that helps). The maximum difference between two bins is 2 at the moment. You can assume it stays that way if that helps, since I can tune this parameter arbitrarily if necessary.
For a small example, let's say
values = [[18,18,18,18,18],[30,30,30]],
number_of_bins = 2 and
max_difference_between_bins = 2.
The algorithm to distribute them would be something like:

    for rest in range(0, 8):   # try distributing all the packets first,
                               # then one fewer, then two fewer, etc.
        generate the possible distributions of the values over the bins
        sum the distributions and check the differences; if one is
        below the max_difference_between_bins threshold, break
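In Python, that naive loop might look like this (a rough sketch; `splits` and `distribute` are just names I made up):

```python
from itertools import combinations, product

def splits(count, value, bins):
    """All ways to put `count` identical items of size `value`
    into `bins` distinguishable bins, as tuples of bin sums."""
    result = []
    # stars and bars: choose bar positions among count + bins - 1 slots
    for bars in combinations(range(count + bins - 1), bins - 1):
        prev, sums = -1, []
        for b in bars:
            sums.append((b - prev - 1) * value)
            prev = b
        sums.append((count + bins - 1 - prev - 1) * value)
        result.append(tuple(sums))
    return result

def distribute(counts, sizes, bins, max_diff):
    """Try distributing all packets first, then drop one packet
    ("rest"), then two, etc., until a distribution fits."""
    for rest in range(sum(counts)):
        # all ways to drop `rest` packets across the value groups
        for drops in product(*(range(c + 1) for c in counts)):
            if sum(drops) != rest:
                continue
            pools = [splits(c - d, s, bins)
                     for c, d, s in zip(counts, drops, sizes)]
            for combo in product(*pools):
                totals = [sum(col) for col in zip(*combo)]
                if max(totals) - min(totals) <= max_diff:
                    return rest, totals, combo
    return None
```

For the small example, `distribute([5, 3], [18, 30], 2, 2)` finds a solution with `rest = 0` and bin sums `[90, 90]`. This is still the full brute force, of course; it is only meant to pin down what I am asking about.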
An attempt:

rest = 0

Distribute 5 x 18 over two bins:

(0,90), (18,72), (36,54), (54,36), (72,18), (90,0)

Distribute 3 x 30 over two bins:

(0,90), (30,60), (60,30), (90,0)

Note that this example is bad in that it already succeeds on the first pass; in any case, adding them up gives:

(0,90)  -> (0,180), (30,150), (60,120), (90,90)
(18,72) -> (18,162), (48,132), (78,102), (108,72)
(36,54) -> (36,144), (66,114), (96,84), (126,54)
(54,36) -> (54,126), (84,96), (114,66), (144,36)
(72,18) -> (72,108), (102,78), (132,48), (162,18)
(90,0)  -> (90,90), (120,60), (150,30), (180,0)
As you can see, half of them are redundant, because they are mirror duplicates of each other.
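To check the duplicate count, here is a small snippet that enumerates the splits of 5 x 18 and 3 x 30 over two bins, forms all pairwise sums, and deduplicates by treating mirrored bin orders as equal:

```python
from itertools import product

# splits of 5 x 18 and 3 x 30 over two bins
splits_18 = [(k * 18, (5 - k) * 18) for k in range(6)]
splits_30 = [(k * 30, (3 - k) * 30) for k in range(4)]

# all pairwise sums, then dedup: (a, b) and (b, a) count as one
sums = [tuple(a + b for a, b in zip(s, t))
        for s, t in product(splits_18, splits_30)]
unique = {tuple(sorted(s)) for s in sums}
print(len(sums), len(unique))  # 24 12 -> exactly half survive
```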
If there had been no solution (90,90), the next pass would try the same thing for

values = [[18,18,18,18],[30,30,30]] and
values = [[18,18,18,18,18],[30,30]] with
rest = [18] or
rest = [30], respectively.
So, for a very large example of 70 values with 5 different sizes and 3 bins, the number of possible distributions on the first pass would be roughly:
Say 50 values of one size are distributed over 3 bins; by stars and bars that gives C(50 + 3 - 1, 3 - 1) = C(52, 2) = 1326 possibilities, or 316251 for 5 bins (no?) (assuming the bins are distinguishable, which isn't actually required, but then the count would have to be scaled up later for the combination with the other values).
The remaining 20 values are distributed with counts of, say, 8, 5, 4, and 3, giving 45, 21, 15, and 10 possibilities respectively. Multiplied together, that's roughly 190 million possibilities. However, most of these combinations cannot be part of a good solution. Some depth-first search or divide and conquer should be able to cut them down.
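These counts can be checked with the stars-and-bars formula:

```python
from math import comb

def distributions(count, bins):
    # stars and bars: C(count + bins - 1, bins - 1) ways to put
    # `count` identical items into `bins` distinguishable bins
    return comb(count + bins - 1, bins - 1)

print(distributions(50, 3))                          # 1326
print(distributions(50, 5))                          # 316251
groups = [distributions(c, 3) for c in (8, 5, 4, 3)]
print(groups)                                        # [45, 21, 15, 10]
total = distributions(50, 3)
for g in groups:
    total *= g
print(total)                                         # 187960500
```

So the exact first-pass count for this example is 187,960,500, which is where the "roughly 190 million" comes from.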
Of course, these counts are then multiplied by 5 for the first pass with a remainder of 1, by 120 for a pass with a remainder of 2, and so on. But those solutions can also be generated from earlier ones, and again it should be possible to go through only a fraction of them.
Basically: how do I find a solution to this problem in a short time? Pseudocode or even Python would be great, but is not necessary; I can figure out the coding myself.
You may make any reasonable assumptions about the number of different values and the maximum number of values, as long as your approach cuts down those roughly 190 million possibilities, whatever approach you choose — for example: at most 3 bins, at most 3 different values, occurring at most, say, 20 times in total (not 20 times each, but e.g. 10 times, 5 times, and 5 times).