INFO: Using directory: "2"
INFO: Using existing c-file: ./templateSANS_Mantid.c
INFO: Using existing binary: ./templateSANS_Mantid.out
INFO: ===
Simulation 'templateSANS_Mantid' (/Users/peterwillendrup/tmp/mcstas-test/20240617_1000_29/mcstas-3.4.49_Darwin/templateSANS_Mantid/templateSANS_Mantid.instr): running on 10 nodes (master is 'CIN-969631', MPI version 3.1).
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
[templateSANS_Mantid] Initialize
*** TRACE end ***
Save [templateSANS_Mantid]
Detector: LdetectorPRE_I=6.86077e+09 LdetectorPRE_ERR=4.37708e+06 LdetectorPRE_N=2.45685e+06 "Edet0.dat"
Detector: detector_I=7.72151e+06 detector_ERR=169809 detector_N=73939 "PSD.dat"
Detector: Ldetector_I=7.72131e+06 Ldetector_ERR=169809 Ldetector_N=73454 "Edet.dat"
Detector: PSDrad_I=7.71464e+06 PSDrad_ERR=169810 PSDrad_N=57459 "psd2.dat"
Detector: PSDrad_I=1.79297e+10 PSDrad_ERR=8.609e+08 PSDrad_N=574590 "psd2_av.dat"
Events: "bank01_events_dat_list.p.x.y.n.id.t"
** MPI master gathering slave node list data **
. MPI master writing data for slave node 1
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 2
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 3
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 4
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 5
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
.
MPI master writing data for slave node 6
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 7
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 8
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
. MPI master writing data for slave node 9
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
Append: "bank01_events_dat_list.p.x.y.n.id.t"
** Done **
Finally [templateSANS_Mantid: 2]. Time: 1 [s]
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 0 on node CIN-969631 exited on signal 5 (Trace/BPT trap: 5).
--------------------------------------------------------------------------
INFO: call to mpirun failed with Command 'mpirun -np 10 ./templateSANS_Mantid.out --trace=0 --seed=1000 --ncount=50000000.0 --dir=2 --format=NeXus lambda=6' returned non-zero exit status 133.
Traceback (most recent call last):
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mcrun.py", line 569, in <module>
    main()
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mcrun.py", line 547, in main
    mcstas.run()  # in mccode.py
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mccode.py", line 329, in run
    return self.runMPI(args, pipe, override_mpi)
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mccode.py", line 370, in runMPI
    return Process(binpath).run(args, pipe=pipe)
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mccode.py", line 77, in run
    raise err
  File "/Users/peterwillendrup/miniforge3/share/mcstas/tools/Python/mcrun/mccode.py", line 73, in run
    proc = run(command, shell=True, check=True, text=True, capture_output=pipe)
  File "/Users/peterwillendrup/miniforge3/lib/python3.10/subprocess.py", line 526, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command 'mpirun -np 10 ./templateSANS_Mantid.out --trace=0 --seed=1000 --ncount=50000000.0 --dir=2 --format=NeXus lambda=6' returned non-zero exit status 133.
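Note on the failure above: mcrun invokes the binary via Python's subprocess.run with check=True, which raises CalledProcessError on any non-zero exit status. The status 133 follows the common shell convention of 128 + signal number: 133 - 128 = 5, i.e. SIGTRAP, which matches mpirun's "exited on signal 5 (Trace/BPT trap: 5)" message. A minimal sketch of that pattern (using a hypothetical `exit 133` command in place of the real mpirun invocation, which is an assumption for illustration only):

```python
import subprocess

# check=True makes subprocess.run raise CalledProcessError on a
# non-zero exit status, as mccode.py does in the traceback above.
try:
    subprocess.run("exit 133", shell=True, check=True, text=True)
except subprocess.CalledProcessError as err:
    # By shell convention, a return code of 128 + N indicates the
    # child was killed by signal N: 133 - 128 = 5, i.e. SIGTRAP
    # ("Trace/BPT trap" on macOS), consistent with mpirun's report.
    print(err.returncode)        # 133
    print(err.returncode - 128)  # 5
```

This is only a reconstruction of the error-propagation path, not the actual mcrun code; the underlying SIGTRAP in rank 1 of the simulation itself is the real fault to investigate.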