🐛 Bug Report: Worker Exits with Signal 4 (SIGILL) When Running Locally on macOS M4 #1515

alireza-es opened this issue Dec 29, 2024 · 0 comments
📜 Description

I’m encountering an error when attempting to run the app locally on my MacBook Pro with an Apple M4 chip. The worker instance exits with signal 4 (SIGILL) when ingesting an uploaded file. The configuration uses OpenAI (LLM_NAME=openai) with an OpenAI API key.

👟 Reproduction steps

  1. Clone the repository.
  2. Set up the .env file with the necessary environment variables:
        API_KEY=<MY API Key>
        LLM_NAME=openai
        VITE_API_STREAMING=true
  3. Start the application using the provided script:

./run-with-docker-compose.sh

  4. Open the web app in the browser at http://localhost:5173/ and try to upload a file.
  5. Observe the error in the worker instance (see the log-tailing sketch after this list).
  6. The browser spins indefinitely and the file is never uploaded.
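To watch the worker crash in real time while uploading, the worker container's log stream can be tailed. This is a minimal sketch; the service name "worker" is an assumption and may differ in this repo's compose file:

    # Follow the worker's log stream; the SIGILL line appears on upload
    docker compose logs -f worker
    # List running services to confirm the actual service name
    docker compose ps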

👍 Expected behavior

The application should upload the file without errors, and the worker should function correctly.

👎 Actual Behavior with Screenshots

The worker instance crashes with the following error message:
Process 'ForkPoolWorker-8' pid:44 exited with 'signal 4 (SIGILL)'


The full log has been included.
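The "Batches:" progress bar in the log immediately before the crash is the kind printed by sentence-transformers during encoding, which suggests the illegal instruction occurs while computing embeddings rather than in Celery itself. A minimal isolation sketch, assuming the ingest path uses that library (the model name below is hypothetical, not necessarily what this project loads); it could be run inside the worker container with docker compose exec worker python repro.py:

    # repro.py - minimal embedding run to check if SIGILL reproduces outside Celery
    from sentence_transformers import SentenceTransformer

    # Hypothetical model choice; substitute whatever the ingest task actually loads
    model = SentenceTransformer("all-MiniLM-L6-v2")

    # If the process dies with signal 4 here, the illegal instruction is in the
    # embedding stack (PyTorch / native kernels), not in the Celery worker code
    print(model.encode(["hello world"]).shape)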

💻 Operating system

macOS

What browsers are you seeing the problem on?

Chrome

🤖 What development environment are you experiencing this bug on?

Docker

🔒 Did you set the correct environment variables in the right path? List the environment variable names (not values please!)

.env content:

        API_KEY=<MY API Key>
        LLM_NAME=openai
        VITE_API_STREAMING=true
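To confirm these variables actually reach the containers, the resolved compose configuration can be inspected (a sketch; note that this prints resolved values locally, so the output should not be pasted publicly):

    # Render the fully-resolved compose file and check the variables of interest
    docker compose config | grep -E 'API_KEY|LLM_NAME|VITE_API_STREAMING'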

📃 Provide any additional context for the Bug.

No response

📖 Relevant log output

2024-12-28 20:42:13 USER_AGENT environment variable not set, consider setting it to identify your requests.
2024-12-28 20:44:54 
Batches:   0%|          | 0/1 [00:00<?, ?it/s]Process 'ForkPoolWorker-9' pid:43 exited with 'signal 4 (SIGILL)'
2024-12-28 20:45:25 
2024-12-28 20:45:25 worker: Warm shutdown (MainProcess)
2024-12-28 20:45:50 USER_AGENT environment variable not set, consider setting it to identify your requests.
2024-12-28 20:42:13  
2024-12-28 20:42:13  -------------- celery@86bf6371723b v5.3.6 (emerald-rush)
2024-12-28 20:42:13 --- ***** ----- 
2024-12-28 20:42:13 -- ******* ---- Linux-6.10.14-linuxkit-aarch64-with-glibc2.39 2024-12-29 03:42:13
2024-12-28 20:42:13 - *** --- * --- 
2024-12-28 20:42:13 - ** ---------- [config]
2024-12-28 20:42:13 - ** ---------- .> app:         application.celery_init:0xfffef8cf1370
2024-12-28 20:42:13 - ** ---------- .> transport:   redis://redis:6379/0
2024-12-28 20:42:13 - ** ---------- .> results:     redis://redis:6379/1
2024-12-28 20:42:13 - *** --- * --- .> concurrency: 14 (prefork)
2024-12-28 20:42:13 -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
2024-12-28 20:42:13 --- ***** ----- 
2024-12-28 20:42:13  -------------- [queues]
2024-12-28 20:42:13                 .> celery           exchange=celery(direct) key=celery
2024-12-28 20:42:13                 
2024-12-28 20:42:13 
2024-12-28 20:42:13 [tasks]
2024-12-28 20:42:13   . application.api.user.tasks.ingest
2024-12-28 20:42:13   . application.api.user.tasks.ingest_remote
2024-12-28 20:42:13   . application.api.user.tasks.schedule_syncs
2024-12-28 20:42:13 
2024-12-28 20:42:15 [2024-12-29 03:42:15,222] WARNING in warnings: /venv/lib/python3.12/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
2024-12-28 20:42:15 whether broker connection retries are made during startup in Celery 6.0 and above.
2024-12-28 20:42:15 If you wish to retain the existing behavior for retrying connections on startup,
2024-12-28 20:42:15 you should set broker_connection_retry_on_startup to True.
2024-12-28 20:42:15   warnings.warn(
2024-12-28 20:42:15 
2024-12-28 20:42:15 [2024-12-29 03:42:15,226] INFO in connection: Connected to redis://redis:6379/0
2024-12-28 20:42:15 [2024-12-29 03:42:15,226] WARNING in warnings: /venv/lib/python3.12/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
2024-12-28 20:42:15 whether broker connection retries are made during startup in Celery 6.0 and above.
2024-12-28 20:42:15 If you wish to retain the existing behavior for retrying connections on startup,
2024-12-28 20:42:15 you should set broker_connection_retry_on_startup to True.
2024-12-28 20:42:15   warnings.warn(
2024-12-28 20:42:15 
2024-12-28 20:42:15 [2024-12-29 03:42:15,227] INFO in mingle: mingle: searching for neighbors
2024-12-28 20:42:15 [2024-12-29 03:42:15,822] INFO in beat: beat: Starting...
2024-12-28 20:42:16 [2024-12-29 03:42:16,237] INFO in mingle: mingle: all alone
2024-12-28 20:44:52 [2024-12-29 03:44:52,661] INFO in strategy: Task application.api.user.tasks.ingest[841cdff5-8c78-4187-bc3a-1245e7c5251a] received
2024-12-28 20:44:52 [2024-12-29 03:44:52,662] INFO in worker: Ingest file: inputs/local/Temporal_Solar_Photovoltaic_Generation_Capacity_Reduction_From_Wildfire_Smoke.pdf
2024-12-28 20:45:25 [2024-12-29 03:45:25,404] INFO in beat: beat: Shutting down...
2024-12-28 20:45:50  
2024-12-28 20:45:50  -------------- celery@86bf6371723b v5.3.6 (emerald-rush)
2024-12-28 20:45:50 --- ***** ----- 
2024-12-28 20:45:50 -- ******* ---- Linux-6.10.14-linuxkit-aarch64-with-glibc2.39 2024-12-29 03:45:50
2024-12-28 20:45:50 - *** --- * --- 
2024-12-28 20:45:50 - ** ---------- [config]
2024-12-28 20:45:50 - ** ---------- .> app:         application.celery_init:0xfffef1176780
2024-12-28 20:45:50 - ** ---------- .> transport:   redis://redis:6379/0
2024-12-28 20:45:50 - ** ---------- .> results:     redis://redis:6379/1
2024-12-28 20:45:50 - *** --- * --- .> concurrency: 14 (prefork)
2024-12-28 20:45:50 -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
2024-12-28 20:45:50 --- ***** ----- 
2024-12-28 20:45:50  -------------- [queues]
2024-12-28 20:45:50                 .> celery           exchange=celery(direct) key=celery
2024-12-28 20:45:50                 
2024-12-28 20:45:50 
2024-12-28 20:45:50 [tasks]
2024-12-28 20:45:50   . application.api.user.tasks.ingest
2024-12-28 20:45:50   . application.api.user.tasks.ingest_remote
2024-12-28 20:45:50   . application.api.user.tasks.schedule_syncs
2024-12-28 20:45:50 
2024-12-28 20:45:51 [2024-12-29 03:45:51,100] INFO in beat: beat: Starting...
2024-12-28 20:45:52 [2024-12-29 03:45:52,068] WARNING in warnings: /venv/lib/python3.12/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
2024-12-28 20:45:52 whether broker connection retries are made during startup in Celery 6.0 and above.
2024-12-28 20:45:52 If you wish to retain the existing behavior for retrying connections on startup,
2024-12-28 20:45:52 you should set broker_connection_retry_on_startup to True.
2024-12-28 20:45:52   warnings.warn(
2024-12-28 20:45:52 
2024-12-28 20:45:52 [2024-12-29 03:45:52,073] INFO in connection: Connected to redis://redis:6379/0
2024-12-28 20:45:52 [2024-12-29 03:45:52,073] WARNING in warnings: /venv/lib/python3.12/site-packages/celery/worker/consumer/consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
2024-12-28 20:45:52 whether broker connection retries are made during startup in Celery 6.0 and above.
2024-12-28 20:45:52 If you wish to retain the existing behavior for retrying connections on startup,
2024-12-28 20:45:52 you should set broker_connection_retry_on_startup to True.
2024-12-28 20:45:52   warnings.warn(
2024-12-28 20:45:52 
2024-12-28 20:45:52 [2024-12-29 03:45:52,075] INFO in mingle: mingle: searching for neighbors
2024-12-28 20:45:53 [2024-12-29 03:45:53,085] INFO in mingle: mingle: all alone
2024-12-28 20:46:13 [2024-12-29 03:46:13,219] INFO in strategy: Task application.api.user.tasks.ingest[9777a3aa-7d4b-4093-8346-dcc5be0e6a72] received
2024-12-28 20:46:13 [2024-12-29 03:46:13,220] INFO in worker: Ingest file: inputs/local/Temporal_Solar_Photovoltaic_Generation_Capacity_Reduction_From_Wildfire_Smoke.pdf
2024-12-28 20:46:13 
Batches:   0%|          | 0/1 [00:00<?, ?it/s]Process 'ForkPoolWorker-8' pid:44 exited with 'signal 4 (SIGILL)'

👀 Have you spent some time to check if this bug has been raised before?

  • I checked and didn't find a similar issue

🔗 Are you willing to submit PR?

No

🧑‍⚖️ Code of Conduct

  • I agree to follow this project's Code of Conduct