
Error caused by GPU memory #22

@Wingerllyyy

Description


I ran the TUM sequence with the command `./build/mono_tum Vocabulary/ORBvoc.txt configs/Monocular/TUM/freiburg3_office.yaml ./rgbd_dataset_freiburg3_long_office_household/` in Docker on a GTX 1660 GPU and got the following error:

[screenshot of the error]

Copilot gave the following advice:
"Reduce the n_neurons parameter in the FullyFusedMLP model configuration. This will decrease the memory requirement of the model.
Alternatively, use CutlassMLP, which might offer better compatibility but could be slower."
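For context, in projects that build their MLP decoder on tiny-cuda-nn, those two suggestions usually amount to editing the network block of a config file. Below is a hypothetical sketch of what each change might look like; the key names (`decoder`, `otype`, `n_neurons`, `n_hidden_layers`) are assumptions based on typical tiny-cuda-nn configs, not verified against this repository:

```yaml
# Suggestion 1 (hypothetical): shrink the FullyFusedMLP.
# FullyFusedMLP only supports n_neurons in {16, 32, 64, 128}.
decoder:
  otype: FullyFusedMLP
  n_neurons: 64        # e.g. halved from 128; reduces per-layer memory
  n_hidden_layers: 2

# Suggestion 2 (hypothetical): fall back to CutlassMLP,
# which is more broadly compatible but slower.
# decoder:
#   otype: CutlassMLP
#   n_neurons: 128
#   n_hidden_layers: 2
```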

I wonder whether these two suggestions will work. I also noticed my GPU memory usage is about 6 GB, while the usage reported in the paper and by other users is about 9 GB. Can I run this with my GPU, or do I need a different one?
