When I use spaCy's beam search, I run into a memory leak. How can I solve it? The code that causes the memory leak is as follows:
beams = nlp.entity.beam_parse(docs, beam_width=beam_width, beam_density=beam_density)
This leak should be fixed as of spaCy v2.2.4, so upgrading should resolve it.
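As a minimal sketch, you could confirm the installed version and then run the same call as in your question. The model name, example text, and the beam_width/beam_density values below are placeholders, and reading entity scores out of the beams via nlp.entity.moves.get_beam_parses follows the commonly shared spaCy v2 snippet, so treat that part as an assumption rather than official API guidance:

    from collections import defaultdict

    import spacy

    # The leak is reported fixed in spaCy v2.2.4, so first check the
    # installed version (e.g. pip install -U "spacy>=2.2.4,<3").
    print(spacy.__version__)  # should be 2.2.4 or later

    nlp = spacy.load("en_core_web_sm")  # placeholder model

    texts = ["Apple is looking at buying U.K. startup for $1 billion."]  # placeholder input
    docs = list(nlp.pipe(texts))

    # Same call as in the question; beam settings are illustrative.
    beams = nlp.entity.beam_parse(docs, beam_width=16, beam_density=0.0001)

    # Assumed extraction step, based on the widely shared spaCy v2
    # beam-NER snippet: accumulate a score per (start, end, label) span.
    for doc, beam in zip(docs, beams):
        entity_scores = defaultdict(float)
        for score, ents in nlp.entity.moves.get_beam_parses(beam):
            for start, end, label in ents:
                entity_scores[(start, end, label)] += score
        print(entity_scores)

If you are pinned to an older 2.x release for other reasons, the safest option is still to upgrade to at least 2.2.4, since the fix landed in the library itself rather than in user code.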