
Memory leak with different libraries (numpy, PyTorch, OpenAI Gym) #542

@Paethon

Description


I am currently working with OpenAI Gym using PyCall, and the following code (which randomly plays games of Atari 2600 Pong) uses up memory that it never frees, on the order of 100 MiB per second.

using PyCall

@pyimport gym
env = gym.make("Pong-v0")
env[:reset]()

for i in 1:100000
    obs, reward, done, info = env[:step](rand([3,4]))
    env[:render]()
    done && env[:reset]()
end

env[:close]()

To try it yourself, you first need to pip install gym[all].

The same example in Python does not have this problem. Any idea how to fix this? Or how can I find out what exactly is happening here?

This seems to occur with other libraries (numpy, PyTorch) as well. See later comments on this issue.
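One diagnostic I can think of (just a sketch, not a verified fix) is forcing Julia's garbage collector inside the loop, to check whether the growing memory is simply Python objects waiting to be collected on the Julia side. The call is gc() on Julia 0.6 and GC.gc() on 0.7 and later; everything else is the same code as above.

using PyCall

@pyimport gym
env = gym.make("Pong-v0")
env[:reset]()

for i in 1:100000
    obs, reward, done, info = env[:step](rand([3,4]))
    env[:render]()
    done && env[:reset]()
    # Force a full collection every 1000 steps; if memory stays flat here,
    # the objects were only pending collection rather than truly leaked.
    i % 1000 == 0 && gc()
end

env[:close]()

If memory still grows with this in place, the references are presumably being held somewhere else rather than just awaiting collection.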
