Runtime error during backward(): Trying to backward through the graph a second time (...). #604

Answered by luisenp
nosmokingsurfer asked this question in Q&A
Hi @nosmokingsurfer, the cause of the error that required you to add retain_graph=True is that you were not setting initial values for the optimization variables. This means that after the first loop, the values from the previous optimization were used as initial values (thus retaining graph info). You can replace your initialization `for` loop with the following:

    theseus_inputs = {}
    for i in range(N):
        if i < N - 1:
            tmp = torch.zeros(B, 4)
            tmp[:, 2] = 1.0
            tmp[:, 0] = 0.5 * predicted_acc[:, i] ** 2 + predicted_acc[:, i]
            theseus_inputs[f"predicted_odometry_{i}"] = tmp
        # Using SE2(...).tensor converts the (x, y, theta) inpu…
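The underlying PyTorch behavior can be illustrated in isolation. The sketch below (a hypothetical standalone example, not the Theseus code from this thread) first reproduces the "backward through the graph a second time" error by calling backward() twice on the same graph, then mirrors the fix from the answer: build fresh input tensors each iteration instead of feeding graph-carrying values back in.

```python
import torch

# Reproduce the error: a second backward() over an already-freed graph.
w = torch.ones(4, requires_grad=True)
x = (w * 2).sum()  # x belongs to w's autograd graph
x.backward()       # first backward frees the graph's buffers

try:
    x.backward()   # second backward over the freed graph fails
except RuntimeError as e:
    assert "backward through the graph a second time" in str(e)

# The fix mirrored from the answer: recreate fresh leaf tensors each
# iteration (as theseus_inputs does above), so no stale graph is reused.
for _ in range(3):
    tmp = torch.zeros(2, 4)   # fresh tensor, carries no graph history
    tmp[:, 2] = 1.0
    loss = (tmp + w).sum()
    loss.backward()           # each iteration owns its own graph
```

Detaching the previous iteration's outputs (`tensor.detach()`) before reusing them as inputs achieves the same effect when the values themselves must carry over.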

Replies: 2 comments, 6 replies

luisenp (Collaborator) replied Oct 18, 2023

luisenp (Collaborator) replied Oct 20, 2023
Answer selected by nosmokingsurfer