Issue Description
Simply replacing torch.jit.script or torch.jit.trace with backend.jit still fails for tc functions.
Example scripts:

```python
import torch
import tensorcircuit as tc

@torch.jit.script
def f(param):
    c = tc.Circuit(6)
    for i in range(5):
        for j in range(5):
            c.rzz(i, i + 1, theta=param[i, j])
    return c.expectation_ps(z=[1])

f(torch.ones([5, 5]))
```
or
```python
from functools import partial

import torch
import tensorcircuit as tc

@partial(torch.jit.trace, example_inputs=torch.ones([5, 5]))
def f(param):
    c = tc.Circuit(6)
    for i in range(5):
        for j in range(5):
            c.rzz(i, i + 1, theta=param[i, j])
    return c.expectation_ps(z=[1])

f(torch.ones([5, 5]))
```
Actually, the latter somehow works, but it is very fragile: for example, if the jit transformation is nested with a grad or vmap operation, torch mostly fails.
Proposed Solution
- Wait for further development of torch.
- Use the tf/jax backend with the torch interface instead.
- A slight fix in the existing tc codebase might work, but there is currently no time to try it.
- Try torch.compile later.
Additional References