
SLURM support #3467

Open
1 of 8 tasks
germa89 opened this issue Oct 7, 2024 · 1 comment
germa89 commented Oct 7, 2024

Context

Check #2865 for a bit of historical context, which led to #3091.
In #2865 we proposed implementing PyHPS to interact with HPC clusters. While PyHPS is very powerful, it is not a scheduler, so it must be installed alongside a scheduler (such as SLURM) and depends on one.
In this work we are going to support SLURM HPC clusters only, and directly, without PyHPS.

Research

Check #3397 for the research done on launching MAPDL and PyMAPDL on SLURM clusters.

Introduction

For the moment we are going to focus on launching single MAPDL instances, leaving aside MapdlPool, since it creates issues regarding resource splitting. I think coming up with a good default resource-sharing scheme might be a bit tricky.
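Part of what makes resource splitting tricky is that the allocation has to be read back from the scheduler at runtime. As a minimal sketch, SLURM exposes the allocation through standard output environment variables, which a launcher could inspect before deciding how many cores to hand to MAPDL. The function name and return shape below are assumptions for illustration, not the PyMAPDL API:

```python
import os


def slurm_allocation():
    """Return (ntasks, cpus_per_task) read from the SLURM environment,
    or None when not running inside a SLURM job.

    SLURM_JOB_ID, SLURM_NTASKS and SLURM_CPUS_PER_TASK are standard
    SLURM output environment variables; how PyMAPDL will actually
    consume them is an assumption of this sketch.
    """
    if "SLURM_JOB_ID" not in os.environ:
        return None
    ntasks = int(os.environ.get("SLURM_NTASKS", "1"))
    cpus_per_task = int(os.environ.get("SLURM_CPUS_PER_TASK", "1"))
    return ntasks, cpus_per_task
```

A pool launcher would then have to decide how to partition `ntasks * cpus_per_task` cores across several instances, which is exactly the default-scheme question raised above.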

Also, we are going to focus on the most useful stuff:

  • [Case 1] Batch script submission (Scenario A in PyMAPDL and PyHPS #2865)
  • [Case 2] Interactive MAPDL instance on HPC, and PyMAPDL on entrypoint (Scenario B in PyMAPDL and PyHPS #2865)
  • [Case 3] Interactive MAPDL instance on HPC, and PyMAPDL on outside-cluster computer (Similar to scenario B in PyMAPDL and PyHPS #2865)
    We might need to SSH to the entrypoint PC.
  • [Case 4] Batch submission from an outside-cluster machine. This is tricky because attaching files is complicated. This issue is solved if we are running interactively, because PyMAPDL can take care of uploading the files to the instance. So we will leave this one to the very end.
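For Case 1, the submission boils down to composing and running an `sbatch` command from the entrypoint machine. The helper below is a hypothetical sketch of that step, not part of PyMAPDL; the flags are standard `sbatch` options, but the function name, defaults and return shape are assumptions:

```python
def build_sbatch_cmd(script, ntasks=4, time="01:00:00", job_name="pymapdl"):
    """Compose the sbatch command line for a batch submission (Case 1).

    ``script`` is the batch script to submit. The returned list is
    suitable for subprocess.run(); the job ID could then be parsed
    from the "Submitted batch job <id>" line that sbatch prints.
    """
    return [
        "sbatch",
        f"--job-name={job_name}",
        f"--ntasks={ntasks}",
        f"--time={time}",
        script,
    ]
```

The interactive cases (2 and 3) would instead start MAPDL inside the allocation and connect PyMAPDL to it over gRPC, from the entrypoint node or, via an SSH tunnel, from an outside-cluster machine.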

Roadmap

Start implementing this in the following PRs:


germa89 commented Oct 7, 2024

Pinging @koubaa for feedback/awareness.
