
Commit 119a77b
ErinWeisbart authored May 16, 2023
1 parent ad4dcc3 commit 119a77b
Showing 1 changed file with 1 addition and 1 deletion.
documentation/DCP-documentation/troubleshooting_runs.md (1 addition & 1 deletion)
@@ -7,7 +7,7 @@
| Jobs completing (total messages decreasing) much more quickly than expected. | “== OUT” without proceeding through CP pipeline | Batch_data.h5 files being created instead of expected output. | | Your pipeline has the CreateBatchFiles module included. | Uncheck the CreateBatchFiles module in your pipeline. |
| | "ValueError: dictionary update sequence element #1 has length 1; 2 is required" | | | The syntax in the groups section of your job file is incorrect. | If you are grouping based on multiple variables, make sure there are no spaces between them in your listing in your job file. e.g. "Metadata_Plate=Plate1,Metadata_Well=A01" is correct, "Metadata_Plate=Plate1, Metadata_Well=A01" is incorrect. |
| | Nothing happens for a long time after "cellprofiler -c -r "| | | 1) Your input directory is set to a folder with a large number of files and CP is trying to read the whole directory before running. 2) You are loading very large images. | 1) In your job file, change the input to a smaller folder. 2) Consider downscaling your images before running them in CP. Or just be more patient.|
-| | Within a single log there are multiple “cellprofiler -c -r” | Expected output seen. | | A single job is being processed multiple times. | SQS_MESSAGE_VISIBILITY is set too short. See https://github.com/CellProfiler/Distributed-CellProfiler/wiki/SQS-QUEUE-INFORMATION for more information. |
+| | Within a single log there are multiple “cellprofiler -c -r” | Expected output seen. | | A single job is being processed multiple times. | SQS_MESSAGE_VISIBILITY is set too short. See [SQS_Queue_information](SQS_QUEUE_information.md) for more information. |
| | “ValueError: no name (Invalid arguments to routine: Bad value)” or “Encountered unrecoverable error in LoadData during startup: No name (no name)” | | | There is a problem with your LoadData.csv. This is usually seen when CSVs are created with a script; accidentally having an extra comma somewhere (looks like ",,") will be invisible in Excel but generate the CP error. If you made your CSVs with pandas to_csv option, you must pass index=False or you will get this error. | Find the “,,” in your CSV and remove it. If you made your CSVs with pandas dataframe’s to_csv function, check to make sure you used the index=False parameter. |
| | IndexError: index 0 is out of bounds for axis 0 with size 0| | | 1) Metadata values of 0 OR that have leading zeros (ie Metadata_Site=04, rather than Metadata_Site=4) are not handled well by CP. 2) The submitted jobs don’t make sense to CP. 3) DCP is looking for your images in the wrong location. | 1) Change your LoadData.csv so that there are no Metadata values of 0 or with 0 padding. 2) Change your job file so that your jobs match your pipeline’s expected input. 3) If using LoadData, make sure the file paths are correct in your LoadData.csv and the "Base image location" is set correctly in the LoadData module. If using BatchFiles, make sure your BatchFile paths are correct. |
| | | Pipeline output is not where expected | | 1) There is a mistake in your ExportToSpreadsheet in your pipeline. 2) There is a mistake in your job file. | 1) Check that your Output File Location is as expected. Default Output Folder is typical. Default Output Folder sub-folder can cause outputs to be nested in an unusual manner. 2) Check the output path in your job file. |
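
For the grouping-syntax row above, a minimal sketch (a hypothetical helper, not part of DCP) that assembles the comma-separated, space-free grouping string the row calls for:

```python
# Hypothetical helper: build a DCP grouping string from metadata key/value pairs.
# Joining with "," and no surrounding spaces gives the accepted format, e.g.
# "Metadata_Plate=Plate1,Metadata_Well=A01" (a space after the comma breaks parsing).
def make_group_string(**metadata):
    return ",".join(f"{key}={value}" for key, value in metadata.items())

print(make_group_string(Metadata_Plate="Plate1", Metadata_Well="A01"))
# Metadata_Plate=Plate1,Metadata_Well=A01
```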
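For the repeated "cellprofiler -c -r" row, the linked SQS queue page has the details; as a rough sketch, the fix is to raise SQS_MESSAGE_VISIBILITY in your fleet config so the visibility timeout comfortably exceeds the runtime of a single job (the exact file, name, and unit conventions depend on your DCP setup):

```python
# Sketch of a fleet-config setting; check your own DCP config for the exact
# file and unit conventions. If the visibility timeout is shorter than the
# time one job takes, SQS re-delivers the message and the same job runs again.
SQS_MESSAGE_VISIBILITY = 3 * 60 * 60  # 3 hours, in seconds (SQS allows up to 12 hours)
```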
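For the LoadData.csv rows, a minimal pandas sketch (illustrative column names, not requirements) that writes the CSV with index=False:

```python
import pandas as pd

# Illustrative LoadData.csv contents; substitute your own metadata and image columns.
load_data = pd.DataFrame({
    "Metadata_Plate": ["Plate1", "Plate1"],
    "Metadata_Well": ["A01", "A02"],
    "FileName_OrigDNA": ["A01_DNA.tiff", "A02_DNA.tiff"],
    "PathName_OrigDNA": ["/path/to/images", "/path/to/images"],
})

# index=False stops pandas from writing its index as an extra column with a
# blank header, one source of the "no name" LoadData errors described above.
load_data.to_csv("load_data.csv", index=False)
```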
