FEM-BEM coupling code error using dolfinx

Hello everyone,
I am following the tutorial for the latest FEM-BEM coupling solver. When I run the code, I get the error below, which looks like a memory problem. I have also run the program under valgrind, and the report shows memory that is "definitely lost".

INFO:bempp:Created grid with id a8d634f8-2530-4fed-b243-3f537f0e5f94. Elements: 48. Edges: 72. Vertices: 26
TIMING:bempp:Grid.__init__ : 3.831e+00s
TIMING:bempp:_compute_p1_dof_map : 1.504e+00s
TIMING:bempp:p1_continuous_function_space : 1.505e+00s
TIMING:bempp:p0_discontinuous_function_space : 2.401e-04s
FEM dofs: 27
BEM dofs: 48
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: No error traceback is available, the problem could be in the main program. 
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 59.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
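
Since the PETSc handler says no traceback is available, one generic way to see which Python line triggers the SEGV (my own debugging sketch, not part of the tutorial) is to enable the standard faulthandler module at the very top of the script:

import faulthandler
faulthandler.enable()  # dump the Python traceback when a fatal signal such as SIGSEGV arrives

The same effect can be obtained by running the script with python -X faulthandler or with PYTHONFAULTHANDLER=1 set in the environment.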

Here is my code from the tutorial. I used breakpoints to track down the error, and it happens in A.weak_form(): when I move the A.weak_form() call to a different line, the error follows it, so the crash really occurs at A.weak_form(). (A stripped-down check is included after the full listing.)

import dolfinx
from dolfinx.fem import FunctionSpace, Function
from dolfinx.mesh import create_unit_cube
import dolfinx.geometry
import ufl
from mpi4py import MPI
import bempp.api
import numpy as np

k = 6.
d = np.array([1., 1., 1])
d /= np.linalg.norm(d)

mesh = create_unit_cube(MPI.COMM_WORLD, 2, 2, 2)
# mesh = create_unit_cube(nx=10,ny=10,nz=10)

from bempp.api.external import fenicsx

fenics_space = FunctionSpace(mesh, ("CG", 1))
trace_space, trace_matrix = \
    fenicsx.fenics_to_bempp_trace_data(fenics_space)
bempp_space = bempp.api.function_space(trace_space.grid, "DP", 0)

fem_size = fenics_space.dofmap.index_map.size_global
bem_size = bempp_space.global_dof_count

print("FEM dofs: {0}".format(fem_size))
print("BEM dofs: {0}".format(bem_size))

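# Boundary operators needed for the coupling: identity/mass operators between
# the trace space and the DP0 BEM space, plus the Helmholtz double- and
# single-layer operators.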
id_op = bempp.api.operators.boundary.sparse.identity(
    trace_space, bempp_space, bempp_space)
mass = bempp.api.operators.boundary.sparse.identity(
    bempp_space, bempp_space, trace_space)
dlp = bempp.api.operators.boundary.helmholtz.double_layer(
    trace_space, bempp_space, bempp_space, k)
slp = bempp.api.operators.boundary.helmholtz.single_layer(
    bempp_space, bempp_space, bempp_space, k)

u = ufl.TrialFunction(fenics_space)
v = ufl.TestFunction(fenics_space)
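# Coefficient n used in the k**2 * n**2 term of the FEM form below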
n = 0.5

@bempp.api.complex_callable
def u_inc(x, n, domain_index, result):
    result[0] = np.exp(1j * k * np.dot(x, d))
u_inc = bempp.api.GridFunction(bempp_space, fun=u_inc)

# The rhs from the FEM
rhs_fem = np.zeros(fem_size)
# The rhs from the BEM
rhs_bem = u_inc.projections(bempp_space)
# The combined rhs
rhs = np.concatenate([rhs_fem, rhs_bem])


# -------------------------- the block matrix ------------------------------------
from bempp.api.assembly.blocked_operator import BlockedDiscreteOperator
# from scipy.sparse.linalg.interface import LinearOperator # old version
from scipy.sparse.linalg import LinearOperator
blocks = [[None,None],[None,None]]

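# Wrap trace_matrix (FEM dofs -> trace-space dofs) as a SciPy LinearOperator
# so it can be composed with the BEM weak forms below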
trace_op = LinearOperator(trace_matrix.shape, lambda x:trace_matrix @ x)

A = fenicsx.FenicsOperator((ufl.inner(ufl.grad(u), ufl.grad(v)) - k**2 * n**2 * ufl.inner(u, v)) * ufl.dx)

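# Fill the 2x2 blocked system: FEM operator, FEM-BEM coupling blocks,
# and the single-layer block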
blocks[0][0] = A.weak_form()
blocks[0][1] = -trace_matrix.T * mass.weak_form().to_sparse()
blocks[1][0] = (.5 * id_op - dlp).weak_form() * trace_op
blocks[1][1] = slp.weak_form()

blocked = BlockedDiscreteOperator(np.array(blocks))

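For what it's worth, the failure seems isolated to the FenicsOperator assembly. Below is a stripped-down sketch (my own reduction, not from the tutorial) that keeps only the pieces needed to build A and call A.weak_form():

# Minimal check: build only the FenicsOperator and call weak_form().
import ufl
from mpi4py import MPI
from dolfinx.fem import FunctionSpace
from dolfinx.mesh import create_unit_cube
from bempp.api.external import fenicsx

k = 6.
n = 0.5
mesh = create_unit_cube(MPI.COMM_WORLD, 2, 2, 2)
fenics_space = FunctionSpace(mesh, ("CG", 1))
u = ufl.TrialFunction(fenics_space)
v = ufl.TestFunction(fenics_space)
A = fenicsx.FenicsOperator(
    (ufl.inner(ufl.grad(u), ufl.grad(v)) - k**2 * n**2 * ufl.inner(u, v)) * ufl.dx)
A.weak_form()  # the call where the segfault appears for me

If this snippet alone crashes in the same way, the problem is in the FenicsOperator assembly rather than in the blocked operator setup.
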
I am using PETSc 3.20.3 and dolfinx 0.7.2.
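
These versions were checked with a small snippet along the following lines (the distribution name "bempp-cl" passed to importlib.metadata is my assumption):

# Print the versions of the relevant packages.
from importlib.metadata import version
import dolfinx
from petsc4py import PETSc

print("dolfinx:", dolfinx.__version__)
print("PETSc:", PETSc.Sys.getVersion())
print("bempp-cl:", version("bempp-cl"))  # distribution name assumed to be "bempp-cl"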