seo – Googlebot cannot index my Handlebars-based SPA

As part of a school project, I built a single-page application using Handlebars templates. However, when I inspect the site in Google Search Console, all Google can see is the bare index.html shell, before the actual content is injected via JavaScript.
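To illustrate the situation (the names and markup below are stand-ins, not my actual code): the HTML the server sends is just an empty shell, and the real content only exists after a client-side template renders it. A crawler that doesn't execute JavaScript therefore indexes an empty page:

```javascript
// What the server sends — and all a non-JS crawler sees.
// The #app container is empty until the bundle runs.
const shellHtml = `
<!DOCTYPE html>
<html>
  <head><title>My site</title></head>
  <body>
    <div id="app"></div>
    <script src="/bundle.js"></script>
  </body>
</html>`;

// Simplified stand-in for what the client-side script does
// (in reality, Handlebars.compile(templateSource)(data)):
function renderApp(data) {
  return `<h1>${data.title}</h1><p>${data.body}</p>`;
}

// The shell itself contains none of the rendered content:
console.log(shellHtml.includes('My actual content')); // false
console.log(renderApp({ title: 'Home', body: 'My actual content' }));
```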

The website replaces an old site (written in 1997). For some reason, Google still regularly tries to index old, now non-existent links from that site, while it has crawled the new, active links only once, two months ago. Unsurprisingly, every URL on the domain other than the landing page is flagged as a duplicate, because according to Google they all contain only the bare index.html page. The dead links still come up whenever I google the domain.

It should be noted that I have both a valid robots.txt file and a sitemap listing the valid URLs on my site. Search Console reports no errors and appears to fetch my script files correctly. Yet with the Googlebot simulator, all I see is the bare index.html page.
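For reference, my robots.txt and sitemap follow the usual shape (example.com and the paths below stand in for my actual domain and URLs):

```
# robots.txt — illustrative, not the exact file
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/some-page</loc>
  </url>
</urlset>
```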

According to every source I have found, Google should, as of 2019, be able to handle SPAs. Since no errors are reported anywhere, I don't even know where to start diagnosing the problem.

Even if you cannot solve my problem outright, a pointer in the right direction would be highly appreciated at this point. I feel like I have run out of things to try.