feat: Add pixi project configuration #227
base: main
Conversation
@alexander-held My guess is that the list that @eguiraud determined in #144 (comment) has changed since then. This PR currently just implements the requirements described in #199 (comment), but I assume there will be more that we will need to test with.
Force-pushed from 396beda to 09b06ed
@alexander-held Can the […]?
Okay, I'll want to rebase this to get it into a single commit before merge, but to run the […] and then you're good to go, as that will also properly install the environment you need (making sure that you select the […]).
@matthewfeickert yes, let's remove the `requirements.txt`.
Force-pushed from 94544e8 to 57df86b
@alexander-held @oshadura I've managed to get the environment to solve, but I need help debugging some issues while testing it. With the configuration

### GLOBAL CONFIGURATION
# input files per process, set to e.g. 10 (smaller number = faster)
N_FILES_MAX_PER_SAMPLE = 5
# enable Dask
USE_DASK = True
# enable ServiceX
USE_SERVICEX = False

### ML-INFERENCE SETTINGS
# enable ML inference
USE_INFERENCE = True
# enable inference using NVIDIA Triton server
USE_TRITON = False

during the "Execute the data delivery pipeline" cell of the notebook, things fail with the following traceback:

---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
Cell In[7], line 29
27 t0 = time.monotonic()
28 # processing
---> 29 all_histograms, metrics = run(
30 fileset,
31 treename,
32 processor_instance=TtbarAnalysis(USE_INFERENCE, USE_TRITON)
33 )
34 exec_time = time.monotonic() - t0
36 print(f"\nexecution took {exec_time:.2f} seconds")
File ~/analysis-grand-challenge-debug/.pixi/envs/cms-open-data-ttbar/lib/python3.9/site-packages/coffea/processor/executor.py:1700, in Runner.__call__(self, fileset, treename, processor_instance)
1679 def __call__(
1680 self,
1681 fileset: Dict,
1682 treename: str,
1683 processor_instance: ProcessorABC,
1684 ) -> Accumulatable:
1685 """Run the processor_instance on a given fileset
1686
1687 Parameters
(...)
1697 An instance of a class deriving from ProcessorABC
1698 """
-> 1700 wrapped_out = self.run(fileset, processor_instance, treename)
1701 if self.use_dataframes:
1702 return wrapped_out # not wrapped anymore
File ~/analysis-grand-challenge-debug/.pixi/envs/cms-open-data-ttbar/lib/python3.9/site-packages/coffea/processor/executor.py:1848, in Runner.run(self, fileset, processor_instance, treename)
1843 closure = partial(
1844 self.automatic_retries, self.retries, self.skipbadfiles, closure
1845 )
1847 executor = self.executor.copy(**exe_args)
-> 1848 wrapped_out, e = executor(chunks, closure, None)
1849 if wrapped_out is None:
1850 raise ValueError(
1851 "No chunks returned results, verify ``processor`` instance structure.\n\
1852 if you used skipbadfiles=True, it is possible all your files are bad."
1853 )
File ~/analysis-grand-challenge-debug/.pixi/envs/cms-open-data-ttbar/lib/python3.9/site-packages/coffea/processor/executor.py:974, in DaskExecutor.__call__(self, items, function, accumulator)
967 # FIXME: fancy widget doesn't appear, have to live with boring pbar
968 progress(work, multi=True, notebook=False)
969 return (
970 accumulate(
971 [
972 work.result()
973 if self.compression is None
--> 974 else _decompress(work.result())
975 ],
976 accumulator,
977 ),
978 0,
979 )
980 except KilledWorker as ex:
981 baditem = key_to_item[ex.task]
File ~/analysis-grand-challenge-debug/.pixi/envs/cms-open-data-ttbar/lib/python3.9/site-packages/distributed/client.py:322, in Future.result(self, timeout)
320 self._verify_initialized()
321 with shorten_traceback():
--> 322 return self.client.sync(self._result, callback_timeout=timeout)
File /opt/conda/lib/python3.9/site-packages/coffea/processor/executor.py:221, in __call__()
220 def __call__(self, *args, **kwargs):
--> 221 out = self.function(*args, **kwargs)
222 return _compress(out, self.level)
File /opt/conda/lib/python3.9/site-packages/coffea/processor/executor.py:1367, in automatic_retries()
1361 break
1362 if (
1363 not skipbadfiles
1364 or any("Auth failed" in str(c) for c in chain)
1365 or retries == retry_count
1366 ):
-> 1367 raise e
1368 warnings.warn("Attempt %d of %d." % (retry_count + 1, retries + 1))
1369 retry_count += 1
File /opt/conda/lib/python3.9/site-packages/coffea/processor/executor.py:1336, in automatic_retries()
1334 while retry_count <= retries:
1335 try:
-> 1336 return func(*args, **kwargs)
1337 # catch xrootd errors and optionally skip
1338 # or retry to read the file
1339 except Exception as e:
File /opt/conda/lib/python3.9/site-packages/coffea/processor/executor.py:1572, in _work_function()
1570 item, processor_instance = item
1571 if not isinstance(processor_instance, ProcessorABC):
-> 1572 processor_instance = cloudpickle.loads(lz4f.decompress(processor_instance))
1574 if format == "root":
1575 filecontext = uproot.open(
1576 {item.filename: None},
1577 timeout=xrootdtimeout,
(...)
1580 else uproot.MultithreadedFileSource,
1581 )
ModuleNotFoundError: No module named 'servicex'

which seems to indicate that the existence of the `servicex` package is being assumed even though `USE_SERVICEX = False`.
A follow-up question: is there an analysis facility where the CMS ttbar open data workflow has been run with ServiceX enabled? The list of files that reference it

$ git grep --name-only "USE_SERVICEX"
analyses/cms-open-data-ttbar/ttbar_analysis_pipeline.ipynb
analyses/cms-open-data-ttbar/ttbar_analysis_pipeline.py
analyses/cms-open-data-ttbar/utils/metrics.py
docs/facilityinstructions.rst

isn't particularly deep.
Now that #225 is merged, we can target the v3 API of the ServiceX frontend.
Should be https://opendataaf-servicex.servicex.coffea-opendata.casa/. As for the other question about importing, that's with your own environment? Not sure what causes this, but perhaps we can update to v3 and then debug that one.
@matthewfeickert The ServiceX instance was upgraded during the last couple of days, and now it works again.
22k lines of changes are coming from `pixi.lock`.
I am not sure why we need to remove `requirements.txt`.
I used `requirements.txt` to create a conda environment from scratch to run my I/O tests. I'm not familiar with `pixi`, but if it can be used for the exact same use case, it should be fine. Otherwise, keeping a `requirements.txt` might be handy.
@sciaba I agree with you :) and I was just telling Alex about your use case.
@matthewfeickert can we keep both environments in sync? prefix-dev/pixi#1410
Okay, let me refactor this to use v3. That will be easier.
@alexander-held Yes. I don't think that having a different version of the library will matter, but we'll see.
Thanks @oshadura! 🙏
@oshadura Yes, lock files are long to begin with, and this is a multi-platform and multi-environment lock file. I would suggest not trying to keep around the old `requirements.txt`.
@sciaba Yes, […]
The suggested idea in that issue is going the wrong direction ([…]).
When I rebase my PR I won't remove the `requirements.txt`.
Force-pushed from a08008f to eb4aa30
I am suggesting removing the jupyterlab environment or making it optional. It is very confusing for users, especially power users who want to test a notebook / Python script on a facility or in a particular environment where jupyterlab is not needed.
Force-pushed from 5f00329 to 021c741
Okay, I can refactor this into another feature + environment. Why is this confusing for users, though? I would think they should be unaware of its existence.
I tried to test, and […]
Oh yeah. You wouldn't use […]
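For illustration, one way the optional split discussed here could look in a pixi manifest (a hypothetical sketch, not this PR's actual configuration):

```toml
# Hypothetical sketch: JupyterLab lives only in an optional "local"
# feature, so facility-targeted environments never install it.
[feature.local.dependencies]
jupyterlab = "*"

[environments]
cms-open-data-ttbar = ["cms-open-data-ttbar"]  # no JupyterLab
local = ["cms-open-data-ttbar", "local"]       # adds JupyterLab on top
```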
* Add pixi manifest (pixi.toml) and pixi lockfile (pixi.lock) to fully specify the project dependencies. This provides a multi-environment, multi-platform (Linux, macOS) lockfile.
* In addition to the default feature, add 'latest', 'cms-open-data-ttbar', and 'local' features and corresponding environments composed from the features. The 'cms-open-data-ttbar' feature is designed to be compatible with the Coffea Base image, which uses SemVer coffea (Coffea-casa build with coffea 0.7.21/dask 2022.05.0/HTCondor and cheese).
  - The 'cms-open-data-ttbar' feature has an 'install-ipykernel' task that installs a kernel such that the pixi environment can be used on a coffea-casa instance from a notebook.
  - The 'local' feature has the canonical 'start' task that will launch a JupyterLab session inside of the environment.
Force-pushed from a551a54 to a246d73
@alexander-held @oshadura I've moved this out of draft and this is now ready for review. I've added notes for reviewers in the PR body, but all information should be clear from the additions to the README. If not, then I need to revise it. (sorry, last […])
Force-pushed from cab5854 to 3786611
Force-pushed from 3786611 to 3989c45
@@ -20,11 +20,41 @@ This directory is focused on running the CMS Open Data $t\bar{t}$ analysis through […]
| utils/config.py | This is a general config file to handle different options for running the analysis. |
| utils/hepdata.py | Function to create tables for submission to the [HEP_DATA website](https://www.hepdata.net) (use `HEP_DATA = True`) |
#### Setting up the environment
Later on in this README there are instructions to install `hepdata_lib` and `hepdata-cli`. At the moment these are not in the `pixi` manifest and lock file, but they can be added if it would be useful.
Some high-level guiding notes if you're new to how `pixi` manifest files work. Feel free to ignore.
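For orientation, a minimal hypothetical skeleton of a pixi manifest (illustrative names and pins, not this PR's actual `pixi.toml`):

```toml
# Hypothetical pixi.toml skeleton, for orientation only.
[project]
name = "my-analysis"                   # illustrative project name
channels = ["conda-forge"]
platforms = ["linux-64", "osx-arm64"]

[dependencies]                         # the implicit "default" feature
python = ">=3.9"

[feature.latest.dependencies]          # an optional, composable feature
coffea = "*"

[environments]                         # environments compose features
latest = ["latest"]                    # default feature + "latest"
```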
[tasks]

[dependencies]
These are the dependencies of the "default" pixi "feature" (basically a composable environment chunk) that all other defined features will extend.
pip = ">=24.3.1"
uv = ">=0.4.27"
[feature.latest.dependencies] |
The "latest" feature assumes no restrictions from the run facility and has no upper bounds. I think @alexander-held had suggested that having something like this would be useful.
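In concrete terms, "no upper bounds" means floor-only pins, for example (hypothetical values, not the PR's actual pins):

```toml
# Hypothetical floor-only pins: the solver is free to pick the newest
# compatible releases at install time.
[feature.latest.dependencies]
coffea = ">=2023.1"
dask = ">=2024.1"
```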
[feature.latest.target.osx-arm64.pypi-dependencies]
servicex = ">=3.0.0"
[feature.cms-open-data-ttbar.tasks] |
The "cms-open-data-ttbar" feature is designed to work with the SemVer coffea Coffea-casa deployment, and so has dependencies tuned to be compatible with the current deployment.
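As an illustration of what "tuned to be compatible" could look like for the image's stated stack (coffea 0.7.21 / dask 2022.05.0), a hypothetical sketch (the PR's actual pins are in the manifest):

```toml
# Hypothetical compatibility pins matching the coffea-casa image's
# stated coffea/dask versions; the PR's actual pins may differ.
[feature.cms-open-data-ttbar.dependencies]
coffea = "0.7.21.*"
dask = "2022.5.0.*"
```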
func-adl-servicex = ">=2.2, <3"
tcut-to-qastle = ">=0.7, <0.8"
[feature.local.dependencies] |
The "local" feature has the interactive Jupyter components factored out into it so that these are not installed as part of the environments that will use the features designed for coffea-casa.
[feature.local.tasks]
start = "jupyter lab"
[environments] |
Environments are the actual environments that are installed. They are composed from the "default" feature (unless excluded) and then all features given in the environment's list.
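For reference, pixi accepts both a shorthand list and a table form for an environment definition, with `no-default-feature` covering the "unless excluded" case (a hypothetical example, not this PR's actual `[environments]` table):

```toml
# Hypothetical environment definitions showing both accepted forms.
[environments]
latest = ["latest", "local"]  # default feature + the listed features
minimal = { features = ["cms-open-data-ttbar"], no-default-feature = true }
```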
Related issues:

* `requirements.txt` does not work with Python 3.11 (#144)
* `requirements.txt` (#140)

Summary of changes:

* Add `pixi` manifest (`pixi.toml`) and `pixi` lockfile (`pixi.lock`) to fully specify the project dependencies. This provides a multi-environment, multi-platform (Linux, macOS) lockfile.
* In addition to the default feature, add `latest`, `cms-open-data-ttbar`, and `local` pixi features and corresponding environments composed from the features. The `cms-open-data-ttbar` feature is designed to be compatible with the Coffea Base image, which uses SemVer `coffea` (Coffea-casa build with coffea 0.7.21/dask 2022.05.0/HTCondor and cheese).
  * The `cms-open-data-ttbar` feature has an `install-ipykernel` task that installs a kernel such that the pixi environment can be used on a coffea-casa instance from a notebook.
  * The `local` feature has the canonical `start` task that will launch a JupyterLab session inside of the environment.

This will also be able to support the results of PR #225 after that PR is merged, with just a few updates from `pixi`. 👍

Tip: instructions for reviewers testing the PR:

1. Install `pixi` if you haven't already.
2. Install the `ipykernel` for the `cms-open-data-ttbar` environment.
3. Run `analyses/cms-open-data-ttbar/ttbar_analysis_pipeline.ipynb` using the `cms-open-data-ttbar` kernel.
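For reviewers unfamiliar with pixi tasks: the real definitions are in this PR's `pixi.toml`, but a kernel-registration task could plausibly look like the following sketch (the exact command is an assumption, not the PR's actual task):

```toml
# Hypothetical sketch of a kernel-registration task; see the PR's
# pixi.toml for the actual definition.
[feature.cms-open-data-ttbar.tasks]
install-ipykernel = "python -m ipykernel install --user --name cms-open-data-ttbar"
```

A reviewer would then run it with something like `pixi run --environment cms-open-data-ttbar install-ipykernel`.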