| Documents | Loading | Memory usage |
|---|---|---|
| 1000 | Loadable (a few minutes) | 4.0 GB |
| 1500 | Loadable (a few minutes) | 3.7 GB |
| 2000 | Loadable (a few minutes) | 4.2 GB |
| 2500 | Loadable (a few minutes) | 3.9 GB (crashed when using the Contexts tool, which required 5.4 GB) |
With an instance with a 7 GB memory limit:
| Documents | Loading | Memory usage |
|---|---|---|
| 2500 | Loadable (a few minutes) | 4.4 GB |
| 3000 | Loadable (a few minutes) | 4.6 GB (6.4 GB when using the Contexts tool) |
| 3500 | Loadable (a few minutes) | 4.6 GB (crashed when using the Contexts tool; amount required unknown) |
The web interface becomes barely usable at 2500-3000 documents: using the various tools (e.g. Terms, Trends, Contexts) can crash the application or is particularly slow, and it becomes nearly impossible to consult the results in the web interface. The limitations seem to lie on both the frontend and the backend: in each case there is a memory limit, on the client side and on the server side.
On the frontend side, the problem seems to be displaying and holding in memory potentially several hundred million words. There may be an option to show only part of the results or documents, even if all the results have been computed in the backend.
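As a rough sketch of that idea (not Voyant's actual API; the helper name and parameters are hypothetical), the backend could keep the full precomputed result set and hand the client only one page at a time, so the browser never has to hold millions of items:

```python
def paginate(results, page, page_size=100):
    """Return one page of a (possibly huge) precomputed result list.

    Hypothetical helper: the full list stays on the server; the client
    only ever receives and renders `page_size` items per request.
    """
    start = page * page_size
    return results[start:start + page_size]

# The client asks for pages on demand instead of the whole result set.
all_results = [f"term-{i}" for i in range(1_000_000)]
first_page = paginate(all_results, page=0, page_size=3)
print(first_page)  # ['term-0', 'term-1', 'term-2']
```

The server-side list could itself be lazy or disk-backed; the point is only that the frontend's working set is bounded by the page size, not the corpus size.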
On the backend side, the limitation seems to be memory. We should check whether there is a way to use disk storage rather than memory while performing analyses.
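A minimal sketch of the disk-backed approach, using SQLite as the on-disk store (the table and column names are illustrative, not Voyant's internals): term counts accumulate in a database file, so memory use stays roughly constant regardless of corpus size.

```python
import sqlite3

def count_terms_on_disk(token_stream, db_path="counts.db"):
    """Accumulate term frequencies in an on-disk SQLite table instead
    of an in-memory dictionary; only one token is held in RAM at a time."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS counts (term TEXT PRIMARY KEY, n INTEGER)"
    )
    for term in token_stream:
        # Upsert: insert a new term with count 1, or increment an existing one.
        con.execute(
            "INSERT INTO counts (term, n) VALUES (?, 1) "
            "ON CONFLICT(term) DO UPDATE SET n = n + 1",
            (term,),
        )
    con.commit()
    return con

# Demo uses ':memory:' for convenience; pointing db_path at a file is
# what actually trades RAM for disk.
con = count_terms_on_disk(iter(["a", "b", "a", "a"]), db_path=":memory:")
print(con.execute("SELECT n FROM counts WHERE term = 'a'").fetchone()[0])  # 3
```

Queries the tools need (top terms, frequencies, etc.) then become SQL over the table rather than traversals of an in-memory structure, at the cost of slower per-token updates.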
gacou54 changed the title from *Déterminer les limites de Voyant-tools* to *Determine the limits of Voyant-tools* on Sep 1, 2021.