performance tuning – How to quickly compute the inverse of a large matrix

I want to compute the inverse of a Jacobian matrix ($100 \times 100$) as fast as possible for Newton's method.

But my implementation is slow. Is there a better way?

In this question, the function has the form

$$ \text{function} = \text{first} + \text{second}.\text{vector} + \text{third}.\text{vector}.\text{vector} $$

where first is a $100 \times 1$ vector, second is a $100 \times 100$ matrix, and third is a $100 \times 100 \times 100$ tensor.
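For concreteness, here is a minimal small-scale sketch of this functional form, with $n = 3$ instead of $100$; the names n, a, b, c, and func are mine, and the random arrays are just stand-ins:

(* Sketch only: tiny random stand-ins for first / second / third. *)
n = 3;
a = RandomReal[1, n];           (* plays the role of "first"  *)
b = RandomReal[1, {n, n}];      (* plays the role of "second" *)
c = RandomReal[1, {n, n, n}];   (* plays the role of "third"  *)
func[x_] := a + b.x + c.x.x;    (* same structure as the function above *)
func[RandomReal[1, n]]          (* returns a length-n vector *)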

Prepare the ingredients (not the main point of the question, but the vector / matrix / tensor have to be built first).


(* Continuous expression and two finite-difference discretizations *)
expr = -0.5*x^2 + 0.5*Derivative[0, 1][v][t, x]^2 + Derivative[1, 0][v][t, x];
discretize = {Derivative[1, 0][v][t, x] -> (v[i + 1, j] - v[i - 1, j])/(2*dt),
   Derivative[0, 1][v][t, x] -> (v[i, j + 1] - 2*v[i, j] + v[i, j - 1])/dx^2,
   x -> x[j]};
param = {dt -> 0.1, dx -> 0.1};
xmax = 1;
xmin = 0;
dx = 0.1;
nx = IntegerPart[(xmax - xmin)/dx];
Do[x[i] = xmin + i*dx, {i, 0, nx}];
discretize2 = {Derivative[1, 0][v][t, x] -> (v[i + 1, j] - v[i, j])/dt,
   Derivative[0, 1][v][t, x] -> (v[i, j + 1] - 2*v[i, j] + v[i, j - 1])/dx^2,
   x -> x[j]};
forward = expr /. discretize2 /. param;  (* used on the first time row *)
middle = expr /. discretize /. param;    (* used everywhere else *)
(* Boundary values *)
Do[v[i, -1] = 0, {i, 0, nx}];
Do[v[nx, i] = x[i] - x[i]^2, {i, 0, nx}];
(* Split the discretized equations into constant, linear, and quadratic parts *)
{first, second, third} = CoefficientArrays[
   Flatten[Table[If[i == 0, forward, middle], {i, 0, nx - 1}, {j, 0, nx - 1}]],
   Flatten[Table[v[i, j], {i, 0, nx - 1}, {j, 0, nx - 1}]]];
vector = Flatten@Table[v[i, j], {i, 0, nx - 1}, {j, 0, nx - 1}];
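As a quick sanity check (a sketch, assuming the block above evaluated without errors), the dimensions of the coefficient arrays should match the description at the top:

(* With nx = 10 there are 100 unknowns v[i, j], so this should give
   {100}, {100, 100}, {100, 100, 100}. *)
Dimensions /@ {first, second, third}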

First, I tried the symbolic calculation.

AbsoluteTiming @
 Inverse @ Grad[
   SparseArray[first + Dot[second, vector] + Dot[third, vector, vector]],
   vector]

=>

$Aborted (* after several minutes *)
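The size alone is not what makes this slow; inverting a purely numeric $100 \times 100$ matrix is quick, so the cost above comes from the symbolic entries. A sketch for comparison, with a random numeric matrix standing in for the Jacobian:

(* Numeric 100 x 100 inverse, for comparison with the symbolic attempt above. *)
AbsoluteTiming[Inverse[RandomReal[1, {100, 100}]];]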

Second, I substitute specific values and solve for $r$ in $J_F(x_n)\, r = -F(x_n)$, to avoid computing the inverse of the Jacobian explicitly (I found this method here).

$J_F(x_n)$ is the Jacobian matrix, $r$ is the Newton step vector, and $-F(x_n)$ is the vector of function values; the next iterate is then $x_{n+1} = x_n + r$.

testVector = Flatten @ Table[0.2, {i, 0, nx - 1}, {j, 0, nx - 1}];
(* Substitute the test point into the symbolic Jacobian and function values *)
jacobian = SparseArray[
   Normal@Grad[
      SparseArray[first + Dot[second, vector] + Dot[third, vector, vector]],
      vector] /. Thread[Rule[vector, testVector]]];
f = SparseArray[
   Normal[first + Dot[second, vector] + Dot[third, vector, vector]] /.
    Thread[Rule[vector, testVector]]];
AbsoluteTiming @ SparseArray @ LinearSolve[jacobian, f]

=>

9.237s

At this speed, running many Newton iterations is impractical.
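For reference, here is a sketch (under the same definitions as above; the names jacSym, rules, jacNum, and fNum are mine) that times the symbolic differentiation, the numeric substitution, and the linear solve separately, to see which part dominates:

(* Sketch: break the timing of the second attempt into its pieces. *)
AbsoluteTiming[jacSym = Grad[Normal[first + second.vector + third.vector.vector], vector];]
AbsoluteTiming[rules = Thread[vector -> testVector];]
AbsoluteTiming[jacNum = SparseArray[jacSym /. rules];]
AbsoluteTiming[fNum = Normal[first + second.vector + third.vector.vector] /. rules;]
AbsoluteTiming[LinearSolve[jacNum, fNum];]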

Any advice is appreciated!