g2p is used on the ReadAlong Studio back-end server, and it causes a fairly high amount of RAM to get used:

- `networkx` (any subpart of it, even just the top-level import) costs about 30MB (see RAM usage: replace networkx elements used by the web API server #394).
- Importing the languages database costs about 60MB, 40 or 50 of which is likely the English lexicon.

This is more complex than replacing the networkx algorithms: if we could find a way to memory-map the English lexicon from disk, instead of loading it into RAM, we could realize a significant memory saving on the API server, for both the g2p back-end and the RAS web-api back-end.
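A minimal sketch of the memory-mapping idea, in Python. The file name, tab-separated format, and helper names below are hypothetical (g2p's actual English lexicon may be stored quite differently); the point is only that a sorted, line-oriented lexicon can be searched through an `mmap`, so that each lookup faults in just a few pages instead of holding the whole 40-50MB in RAM.

```python
import mmap

# Hypothetical path and format: a sorted, UTF-8, tab-separated
# "word<TAB>pronunciation" file, one entry per line. The real English
# lexicon shipped with g2p may be stored differently; this only
# illustrates the memory-mapping idea.
LEXICON_PATH = "eng_lexicon.tsv"


def open_lexicon(path=LEXICON_PATH):
    """Map the lexicon into the address space without reading it into RAM."""
    with open(path, "rb") as f:
        # The mapping remains valid after the file object is closed.
        return mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)


def lookup(mm, word):
    """Binary-search the sorted, line-oriented lexicon for `word`.

    Only the pages actually touched by the search are faulted in,
    so resident memory stays small even for a large lexicon.
    """
    key = word.lower().encode("utf-8")
    lo, hi = 0, mm.size()
    while lo < hi:
        mid = (lo + hi) // 2
        # Back up to the start of the line containing `mid`.
        nl = mm.rfind(b"\n", lo, mid)
        start = nl + 1 if nl != -1 else lo
        end = mm.find(b"\n", start, hi)
        if end == -1:
            end = hi
        entry, _, pron = mm[start:end].partition(b"\t")
        if entry == key:
            return pron.decode("utf-8")
        if entry < key:
            lo = end + 1  # search the lines after this one
        else:
            hi = start    # search the lines before this one
    return None
```

With something along these lines, opening the lexicon at import time costs almost nothing up front, and each lookup touches only a handful of pages, which the OS can reclaim under memory pressure and share between the g2p and web-api processes.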