recursion – How to create a recursive method for Java classes with a catalog structure?

I need to make a catalog structure with classes.
It must look like the following:

   1] Subgroup1
       1.1 subgroup1
               1.1.1 product1
               1.1.2 product2  
       1.2 subgroup2   
  2]  Subgroup2
       2.1 subgroup1
               2.1.1 product1
               2.1.2 product2  
       2.2 subgroup2   

I have created three classes, Group, SubGroup, and LeafGroup, related as follows:

class Group {
    int id;
    String name;
}

class SubGroup extends Group {
    List<Group> subgroups;
}

class LeafGroup extends Group {
    List<Product> products;
}

So only LeafGroup has products.
I need to create a method on Group so that each group recursively collects all of its available products.
For example, for Subgroup2 the available products are product 2.1.1 and product 2.1.2.
As I understand it, the method must check whether the group has products directly and, if not, call itself to check the next level down.
What would this recursive method look like?
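The traversal described above (return the products at a leaf, otherwise recurse into each subgroup and merge the results) can be sketched as follows; it is shown in Python for brevity, the method name get_all_products is made up, and the classes mirror the Java ones above:

```python
class Group:
    """Base class: a plain group has no products of its own."""
    def get_all_products(self):
        return []

class SubGroup(Group):
    def __init__(self, subgroups):
        self.subgroups = subgroups  # children: SubGroup or LeafGroup instances

    def get_all_products(self):
        # Recursive case: ask every child for its products and merge them.
        products = []
        for child in self.subgroups:
            products.extend(child.get_all_products())
        return products

class LeafGroup(Group):
    def __init__(self, products):
        self.products = products

    def get_all_products(self):
        # Base case: a leaf simply returns its own product list.
        return list(self.products)
```

The Java version has the same shape: declare the method on Group, let SubGroup concatenate the results of its children's calls, and let LeafGroup return its own list.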

recurrence relation – Two-dimensional recursive function in $O(\log n)$ time complexity

It is well known that a constant-recursive sequence, or $1$-d sequence, can be solved in $O(\log n)$ time given that it has the form

$$a_n=\sum_{k=1}^{n} C_k a_{n-k}$$

where $C_k$ is a constant. Examples would include polynomials like $n^2$ or $n^3$, exponentials like $2^n$, and the Fibonacci numbers defined by $a_n=a_{n-1}+a_{n-2}$, $a_0=0$, $a_1=1$. Factorials would not be included, for example, because they are defined by $a_n=na_{n-1}$, and $C_k=n$ is not constant.
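As a concrete instance of the $O(\log n)$ claim, the Fibonacci numbers can be computed by fast doubling, which rests on the identities $F_{2m}=F_m(2F_{m+1}-F_m)$ and $F_{2m+1}=F_m^2+F_{m+1}^2$ (a minimal sketch, independent of any library):

```python
def fib(n):
    """Compute the n-th Fibonacci number in O(log n) arithmetic steps."""
    def pair(m):
        # Returns the pair (F(m), F(m+1)) via fast doubling.
        if m == 0:
            return (0, 1)
        a, b = pair(m // 2)      # a = F(k), b = F(k+1) for k = m // 2
        c = a * (2 * b - a)      # F(2k)   = F(k) * (2 F(k+1) - F(k))
        d = a * a + b * b        # F(2k+1) = F(k)^2 + F(k+1)^2
        return (d, c + d) if m % 2 else (c, d)
    return pair(n)[0]

print(fib(10))  # → 55
```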

Let $a_{n,k}$ be a two-dimensional sequence defined by

$$a_{n,k} = \sum_{i=0,\,j=0,\ (i\ne n\ \wedge\ j\ne k)}^{n,k} C_{i,j}\, a_{i,j}$$

where $C_{i,j}$ is constant.

Is it possible to compute $a_{n,k}$ in logarithmic time (i.e., $O(\log n)$) or better?

I know one case where this is possible, namely if the $a_{n,k}$ are coefficients of polynomials in a sequence (a $1$-d recursive sequence), such that the degrees of consecutive terms differ by $1$. In this case, the diagonals of the sequence, $a_{n,n+t}$ for some constant $t$, are $1$-d constant-recursive functions, whose terms can be computed in $O(\log n)$ time. This doesn’t use the original two-index relation, though, as expected.

However, in certain cases the diagonals do not form $1$-d constant-recursive functions, namely when the degree difference of consecutive polynomial terms is more than $1$.

python – Are user-callable recursive functions an anti-pattern?

I have a function in Python that calls itself recursively and passes some internal variables down the recursion via keyword arguments (which are not listed in the docstring).

Is it a problem to expose this to a user, instead of defining a separate private recursive function that is called by a public function, where the public function is just a wrapper around the recursive one?

I cannot think of any downsides of the first approach compared to the second.
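For concreteness, the two approaches might look like this (a hypothetical example; the flatten functions and the _acc keyword argument are invented for illustration):

```python
# Approach 1: one public function; internal state travels in a keyword argument.
def flatten(items, _acc=None):
    """Flatten nested lists. (_acc is internal and not documented.)"""
    if _acc is None:
        _acc = []
    for item in items:
        if isinstance(item, list):
            flatten(item, _acc=_acc)   # recursive call shares the accumulator
        else:
            _acc.append(item)
    return _acc

# Approach 2: a public wrapper around a private recursive helper.
def flatten2(items):
    """Flatten nested lists."""
    acc = []
    _flatten_into(items, acc)
    return acc

def _flatten_into(items, acc):
    # Private helper: the accumulator never appears in the public signature.
    for item in items:
        if isinstance(item, list):
            _flatten_into(item, acc)
        else:
            acc.append(item)
```

Both behave identically for ordinary calls; the difference is only in what the public signature advertises.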

sql server – Why can’t unused columns and tables be ignored in a recursive CTE?

Usually, SQL Server can optimize away any unused columns in the execution plan, and unused joined tables are not queried. But as soon as a recursive CTE comes into play, apparently all columns and joined tables are queried, even if not necessary. Why is that? Can I help SQL Server to ignore unused columns/tables here?


;WITH cte AS
(
    SELECT cn.CatalogNodeId,
           (SELECT Name FROM dbo.LocalizedCatalogNodes WHERE CatalogNodeId = cn.CatalogNodeId) AS Name
    FROM dbo.CatalogNodes AS cn
    WHERE cn.ParentId IS NULL

    UNION ALL

    SELECT cn.CatalogNodeId,
           (SELECT Name FROM dbo.LocalizedCatalogNodes WHERE CatalogNodeId = cn.CatalogNodeId) AS Name
    FROM dbo.CatalogNodes AS cn
    JOIN cte ON cte.CatalogNodeId = cn.ParentId
)
SELECT CatalogNodeId FROM cte

This is a simple recursive CTE resolving a hierarchical parent-child structure, adding a localized name to each node.

Even though it is not necessary, the execution plan shows that the column CatalogNodeType is retrieved and dbo.LocalizedCatalogNodes is joined on each iteration.

Now comment out the UNION ALL part, making it a non-recursive CTE, and the execution plan suddenly consists of just two steps, containing neither the CatalogNodeType column nor the dbo.LocalizedCatalogNodes table.

computability theory – Ordinal numbers reachable by primitive recursive ordinal functions in omega

For each $n\in\omega$ let $(\varphi_i^n)_{i\in\omega}$ be some “reasonable” enumeration of the $n$-ary $PR_\omega$ functions. It’s easy to check that the functions $$F_n: (a,b_1,\dots,b_n)\mapsto \varphi^n_a(b_1,\dots,b_n)$$ are uniformly-in-$n$ $\Delta_1$-definable over $L_{\omega_1^{CK}}$ (although we’re only interested in $F_0$, the right way to prove this is to define all of them simultaneously).

Now looking at $n=1$ specifically, consider the function $$G:\omega\rightarrow\omega_1^{CK}: a\mapsto F_1(a,0).$$ This is $\Delta_1$ over $L_{\omega_1^{CK}}$, so by $\Sigma_1$ Replacement we have $\sup(\operatorname{ran}(G))<\omega_1^{CK}$.

(We can recast the above in terms of ordinal notations and $\Sigma^1_1$ bounding, but personally I find that thinking in terms of definability over admissible sets is ultimately simpler.) Morally speaking, any “short” hierarchy of ordinals which only involves simply-defined total operations will fall short of $\omega_1^{CK}$.

Of course I’ve omitted basically all the details here, since they get rather tedious. The development of hyperarithmetic theory and $\omega_1^{CK}$-recursion theory is treated quite nicely in Sacks’ book. The key point is the “closedness” of $\omega_1^{CK}$, either in the sense of $\Sigma^1_1$ bounding or in the sense of admissibility; the appropriate definability of the $PR_\omega$ operations in either case is annoying but not hard (it follows the proof that classical primitive recursive functions are $\Delta_1$ definable).

OK, so what is the supremum in question? The following is a bit speculative:

The relevant thing to look at is the Veblen hierarchy. At a glance, each application of primitive recursion is only going to go “one level up,” and so $\phi_\omega(0)$ is a reasonable guess. (Note that $\epsilon_0=\phi_1(0)$, so $\phi_\omega(0)$ is going to be quite large by many standards.) But I haven’t had time to check the details on this.

I am more confident that the Feferman–Schütte ordinal $\Gamma_0$ is an upper bound. This is because the basic theory of $PR_\omega$-functions – specifically, their totality, appropriately phrased – should be developable in the theory $\mathsf{ATR}_0$. This gives the proof-theoretic ordinal of $\mathsf{ATR}_0$, which is $\Gamma_0$, as an upper bound. Again, this is a very coarse argument which should apply to any “simple” hierarchy of ordinals – but “simple” is more limited here than in the $\omega_1^{CK}$ analysis, of course.

asymptotics – Can I use Master Theorem to solve recurrence relations with a constant in the recursive term?

If you are just interested in an upper bound, you can notice that $T(n) \le S(n)$, where $S(n) = 3 S(n/3) + \frac{n}{2}$ has solution $S(n) = O(n \log n)$.

Alternatively, there is always induction. You can show that, for $n \ge 2$, $T(n) \le c n \log n$.

For $2 \le n < 7$, $T(n)$ is a constant and $n \log n \ge 1$. Therefore the claim is true for a sufficiently large (constant) value $c^*$ of $c$.

For $n \ge 7$ you have:

$$T(n) = 3T\left(\frac{n}{3} - 2\right) + \frac{n}{2} \le 3 c \frac{n}{3} \log \frac{n}{3} + \frac{n}{2} = cn \log n - cn \log 3 + \frac{n}{2},$$

which is at most $cn \log n$ when $c n \log 3 \ge \frac{n}{2}$ or, equivalently, $c \ge \frac{1}{2 \log 3}$.

Simply pick $c = \max\left\{c^*, \frac{1}{2 \log 3}\right\}$.
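As a quick numeric sanity check of this bound (not part of the proof; the base-case constant is arbitrarily set to $1$ here, and natural logarithms are used):

```python
import math

def T(n):
    # The recurrence T(n) = 3 T(n/3 - 2) + n/2, with T(n) = 1 on the base range.
    if n < 7:
        return 1.0
    return 3 * T(n / 3 - 2) + n / 2

# c as chosen in the argument above; c* = 1 suffices because T(n) = 1 on the base range.
c = max(1.0, 1 / (2 * math.log(3)))

for n in [10, 100, 1000, 10**5]:
    print(n, T(n) <= c * n * math.log(n))  # the bound holds at each sampled n
```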

Time complexity of a recursive function

Suppose we are given the recurrence relation

$$T(n)=T(\sqrt{n})+T(n-\sqrt{n})+n$$ $$T(1)=O(1)$$

How can we find the order of growth of the above recurrence?

My attempt:

I read the following post, but got stuck understanding that solution.

If language L is recursive, then how to prove L’ is recursively enumerable?

How can I prove that a language L’ is recursively enumerable if L is recursive?

Which type of query (recursive or iterative) does the ISP DNS server send to the Root, TLD, and Authoritative servers? Could I change to the other one?

  • Which type of query (recursive or iterative) does the ISP DNS server send to the Root, TLD, and Authoritative servers?

  • Could I change to the other type?

c++ – How can I raise the performance of this recursive function for a chess engine?


I am making a chess engine in C++. I have a recursive search function, and I time the search within main. Currently, I achieve 72,762,064 nodes per second with the code below. I am new to C++, so before I develop it further, I would like to know if I am doing anything wrong, or if there are any areas I could improve on.

Description of code

I have used typedef to “create” my own Bitboard type for easier reading.

Within the King class is a static function called Move that will return an array of length 8 containing the relevant positions.

The search function is given a position (Bitboard) and a depth (int), and first checks whether we have reached the maximum depth. If not, we continue the search by looping over all moves generated by King::Move.

Relevant code

#include <array>
#include <cstdint>
#include <iostream>

typedef uint_fast64_t Bitboard;
unsigned long long count = 0;
int depth = 10;

std::array<Bitboard, 8> Move(Bitboard square) {
    std::array<Bitboard, 8> toReturn;  // returned by value, so no dangling pointer
    toReturn[0] = (square << 7) & 0x7F7F7F7F7F7F7F7F;
    toReturn[1] = square << 8;
    toReturn[2] = (square << 9) & 0xFEFEFEFEFEFEFEFE;
    toReturn[3] = (square >> 1) & 0x7F7F7F7F7F7F7F7F;
    toReturn[4] = (square << 1) & 0xFEFEFEFEFEFEFEFE;
    toReturn[5] = (square >> 9) & 0x7F7F7F7F7F7F7F7F;
    toReturn[6] = square >> 8;
    toReturn[7] = (square >> 7) & 0xFEFEFEFEFEFEFEFE;
    return toReturn;
}

void search(Bitboard pos, int depth) {
    ++count;                // count every board we visit
    if (!depth) return;     // quicker to check if number is 0 rather than compare to another
    std::array<Bitboard, 8> moves = Move(pos);
    for (int i = 8; i--; )  // quicker to check if number is 0 rather than compare to another
        search(moves[i], depth - 1);  // recurse one level shallower
}

int main() {
    Bitboard king = 1ULL;
    search(king, depth);
    std::cout << "Boards checked: " << count << std::endl;
}

Please let me know if there is anything else you would like to know.