
Memory error while loading wiki-news model for incremental learning #4

@vdpappu

Hi Eric,
Thanks for the excellent enhancement. I am trying to use your repo for incremental learning, but I get a memory error while running the script. My machine has 32 GB of RAM, and I am able to load the pre-trained model on its own for inference tasks.

[screenshot of the memory error]

Pre-trained model size: 6.8 GB
Command executed:
./fasttext skipgram -input /home/aaa/Downloads/datasets/nlu/sed_sof_corpus.txt -inputModel /home/aaa/Downloads/datasets/wiki-news-300d-1M-subword.bin -output sed_sof_trlearn -incr
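For reference, a minimal sketch of the inference-only load that succeeds on this machine, using fastText's standard print-word-vectors subcommand (the probe word is arbitrary and only illustrates that the .bin loads):

# reads words from stdin and prints their vectors from the pre-trained model
echo "incremental" | ./fasttext print-word-vectors /home/aaa/Downloads/datasets/wiki-news-300d-1M-subword.bin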
