Failed
gprs:oc2g+mod-bts0-egprs.iperf3m4.py (from gprs_oc2g+mod-bts0-egprs)
Stacktrace
osmo-stp_10.42.42.5(pid=31981): Process ended prematurely: osmo-stp_10.42.42.5(pid=31981) [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=31981)]
Standard Output
---------------------------------------------- trial-2370 gprs:oc2g+mod-bts0-egprs iperf3m4.py ----------------------------------------------
05:42:53.814984 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:53.826530 tst gprs:oc2g+mod-bts0-egprs: Using 1 x bts (candidates: 1)
05:42:53.951817 tst iperf3m4.py: using LAC 26243
05:42:54.070967 tst iperf3m4.py: using RAC 233
05:42:54.192472 tst iperf3m4.py: using CellId 26243
05:42:54.311275 tst iperf3m4.py: using BVCI 26244
05:42:54.334595 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.345418 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.355955 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.367314 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.378701 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.390060 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.401525 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.412891 tst gprs:oc2g+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
05:42:54.424492 tst gprs:oc2g+mod-bts0-egprs: Using 1 x modem (candidates: 4)
05:42:54.544716 tst gprs:oc2g+mod-bts0-egprs: Using 1 x modem (candidates: 4)
05:42:54.664049 tst gprs:oc2g+mod-bts0-egprs: Using 1 x modem (candidates: 4)
05:42:54.782702 tst gprs:oc2g+mod-bts0-egprs: Using 1 x modem (candidates: 4)
05:42:55.058084 tst iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
05:42:55.183028 run iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:55.497802 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31967): Launched
05:42:55.758404 run iperf3-srv_10.42.42.10(pid=31968): Launched
05:42:55.940938 tst iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
05:42:56.065941 run iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:56.381130 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31969): Launched
05:42:56.642275 run iperf3-srv_10.42.42.10(pid=31970): Launched
05:42:56.825458 tst iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
05:42:56.950278 run iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:57.264433 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31971): Launched
05:42:57.526090 run iperf3-srv_10.42.42.10(pid=31972): Launched
05:42:57.709583 tst iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
05:42:57.834603 run iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:58.149943 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31973): Launched
05:42:58.411033 run iperf3-srv_10.42.42.10(pid=31974): Launched
05:42:58.593331 tst iperf3m4.py:8: start network...
05:42:58.718159 run osmo-hlr_10.42.42.2: Starting osmo-hlr
05:42:59.195055 run create_hlr_db(pid=31975): Launched
05:42:59.368265 bus /gobi_6: Setting Powered False
05:43:00.390368 run create_hlr_db(pid=31975): Terminated: ok {rc=0}
05:43:00.820441 run pcap-recorder_any(filters='host 10.42.42.2')(pid=31977): Launched
05:43:01.209337 run osmo-hlr_10.42.42.2(pid=31978): Launched
05:43:01.402316 run osmo-stp_10.42.42.5: Starting osmo-stp
05:43:02.069565 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=31980): Launched
05:43:02.399981 run osmo-stp_10.42.42.5(pid=31981): Launched
05:43:02.517341 run osmo-ggsn_10.42.42.6: Starting osmo-ggsn
05:43:03.083876 run pcap-recorder_any(filters='host 10.42.42.6')(pid=31983): Launched
05:43:03.199218 run osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
05:43:03.485038 run patchelf(pid=31984): Launched
05:43:03.674411 run osmo-stp_10.42.42.5(pid=31981): ERR: Terminated: ERROR {rc=1} [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=31981)]
05:43:03.812862 run osmo-stp_10.42.42.5(pid=31981): stdout:
| (launched: 2022-12-10_05:43:02.236735)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
05:43:03.951028 run osmo-stp_10.42.42.5(pid=31981): stderr:
| 20221210054302451 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210054302451 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210054302452 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210054302452 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210054302453 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210054302453 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210054302453 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
05:43:04.084484 run osmo-stp_10.42.42.5(pid=31981): stdout:
| (launched: 2022-12-10_05:43:02.236735)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
05:43:04.217472 run osmo-stp_10.42.42.5(pid=31981): stderr:
| 20221210054302451 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210054302451 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210054302452 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210054302452 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210054302453 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210054302453 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210054302453 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
05:43:04.371498 run patchelf(pid=31984): Terminating (SIGINT)
05:43:04.551574 run patchelf(pid=31984): Terminated: ok {rc=0}
05:43:04.584909 tst iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=31981): Process ended prematurely: osmo-stp_10.42.42.5(pid=31981) [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=31981)] [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py:8]
05:43:04.594898 tst iperf3m4.py:8: Test FAILED (10.8 sec)
05:43:04.657954 run osmo-hlr_10.42.42.2(pid=31978): ERR: Terminated: ERROR {rc=237} [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=31978)]
05:43:04.709329 run osmo-hlr_10.42.42.2(pid=31978): stdout:
| (launched: 2022-12-10_05:43:01.014076)
05:43:04.761520 run osmo-hlr_10.42.42.2(pid=31978): stderr:
| 20221210054301371 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221210054301371 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221210054301371 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221210054301371 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221210054301371 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221210054301371 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221210054301395 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2370/run.2022-12-10_05-27-43/gprs:oc2g+mod-bts0-egprs/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221210054301411 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221210054301412 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221210054301412 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
05:43:04.805985 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31967): Terminating (SIGTERM)
05:43:04.847746 run iperf3-srv_10.42.42.10(pid=31968): Terminating (SIGTERM)
05:43:04.889590 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31969): Terminating (SIGTERM)
05:43:04.932155 run iperf3-srv_10.42.42.10(pid=31970): Terminating (SIGTERM)
05:43:04.974732 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31971): Terminating (SIGTERM)
05:43:05.016693 run iperf3-srv_10.42.42.10(pid=31972): Terminating (SIGTERM)
05:43:05.058899 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31973): Terminating (SIGTERM)
05:43:05.101041 run iperf3-srv_10.42.42.10(pid=31974): Terminating (SIGTERM)
05:43:05.143079 run pcap-recorder_any(filters='host 10.42.42.2')(pid=31977): Terminating (SIGTERM)
05:43:05.185197 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=31980): Terminating (SIGTERM)
05:43:05.227098 run pcap-recorder_any(filters='host 10.42.42.6')(pid=31983): Terminating (SIGTERM)
05:43:05.236756 --- ParallelTerminationStrategy: PID 31967 died...
05:43:05.293170 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31967): Terminated: ok {rc=0}
05:43:05.303353 --- ParallelTerminationStrategy: PID 31968 died...
05:43:05.359467 run iperf3-srv_10.42.42.10(pid=31968): Terminated {rc=256}
05:43:05.369901 --- ParallelTerminationStrategy: PID 31969 died...
05:43:05.425878 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31969): Terminated: ok {rc=0}
05:43:05.436050 --- ParallelTerminationStrategy: PID 31970 died...
05:43:05.491835 run iperf3-srv_10.42.42.10(pid=31970): Terminated {rc=256}
05:43:05.501977 --- ParallelTerminationStrategy: PID 31971 died...
05:43:05.557725 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31971): Terminated: ok {rc=0}
05:43:05.567832 --- ParallelTerminationStrategy: PID 31972 died...
05:43:05.623587 run iperf3-srv_10.42.42.10(pid=31972): Terminated {rc=256}
05:43:05.633646 --- ParallelTerminationStrategy: PID 31973 died...
05:43:05.689468 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=31973): Terminated: ok {rc=0}
05:43:05.699589 --- ParallelTerminationStrategy: PID 31974 died...
05:43:05.755785 run iperf3-srv_10.42.42.10(pid=31974): Terminated {rc=256}
05:43:05.765848 --- ParallelTerminationStrategy: PID 31977 died...
05:43:05.822022 run pcap-recorder_any(filters='host 10.42.42.2')(pid=31977): Terminated: ok {rc=0}
05:43:05.832170 --- ParallelTerminationStrategy: PID 31980 died...
05:43:05.887942 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=31980): Terminated: ok {rc=0}
05:43:05.898091 --- ParallelTerminationStrategy: PID 31983 died...
05:43:05.954199 run pcap-recorder_any(filters='host 10.42.42.6')(pid=31983): Terminated: ok {rc=0}
05:43:06.042494 bus /gobi_4: Setting Powered False
05:43:07.146380 bus /gobi_4: Setting Powered False
05:43:08.258232 bus /gobi_1: Setting Powered False
05:43:09.359194 bus /gobi_1: Setting Powered False
05:43:10.469823 bus /gobi_0: Setting Powered False
05:43:11.576811 bus /gobi_0: Setting Powered False
05:43:12.686218 bus /gobi_6: Setting Powered False
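
Note on the failure: both osmo-stp (10.42.42.5:2905 M3UA and :4239 VTY) and osmo-hlr (10.42.42.2:4258 VTY) abort at startup with "Address already in use", which usually points to a stale instance from an earlier run still holding those sockets on the tester host. The following is a minimal illustrative sketch, not part of osmo-gsm-tester: only the hosts and ports are taken from the log above, everything else is an assumption. It checks whether the TCP VTY ports are actually free before a re-run; on the host itself, ss -ltnp or fuser 4239/tcp would additionally name the owning PID.

    #!/usr/bin/env python3
    """Sketch: probe the TCP VTY ports reported as 'Address already in use'.
    Must run on the tester host, where 10.42.42.2/10.42.42.5 are configured."""
    import socket

    # VTY telnet ports that failed to bind in the log (TCP). The M3UA port
    # 10.42.42.5:2905 is SCTP and would need an SCTP-capable check instead.
    CANDIDATES = [
        ('10.42.42.5', 4239),   # osmo-stp VTY
        ('10.42.42.2', 4258),   # osmo-hlr VTY
    ]

    def port_is_free(host, port):
        """Return True if we can bind the TCP port ourselves, i.e. nothing holds it."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                s.bind((host, port))
                return True
            except OSError:
                return False

    if __name__ == '__main__':
        for host, port in CANDIDATES:
            state = 'free' if port_is_free(host, port) else 'IN USE (stale process?)'
            print(f'{host}:{port} {state}')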
Standard Error
Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=31981): Process ended prematurely: osmo-stp_10.42.42.5(pid=31981) [trial-2370↪gprs:oc2g+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=31981)]
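
Why the traceback ends inside the patchelf step for osmo-ggsn even though osmo-stp is the process that died: launch_sync() blocks on the tester's event loop, and testenv.poll() raises "Process ended prematurely" as soon as any previously launched process has terminated unexpectedly; here osmo-stp(pid=31981) had already exited with rc=1 while the GGSN was still being prepared. The sketch below only illustrates that fail-fast-while-waiting pattern in generic Python; the names and structure are assumptions and do not reproduce osmo-gsm-tester's actual Process/MainLoop API.

    """Sketch: block on one step, but poll every managed process and raise as
    soon as any of them exits early (the behaviour seen in the traceback)."""
    import subprocess
    import time

    class ProcessEndedPrematurely(Exception):
        pass

    def wait_sync(proc_of_interest, managed_procs, timeout=60.0, step=0.1):
        """Wait for proc_of_interest to finish while watching all other
        managed subprocess.Popen objects; fail fast if one of them dies."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            for name, p in managed_procs.items():
                if p is not proc_of_interest and p.poll() is not None:
                    raise ProcessEndedPrematurely(f'{name} (rc={p.returncode})')
            if proc_of_interest.poll() is not None:
                return proc_of_interest.returncode
            time.sleep(step)
        raise TimeoutError('process did not finish in time')

    # Hypothetical usage: wait for the patchelf step while osmo-stp and
    # osmo-hlr are supposed to keep running in the background.
    #   procs = {'osmo-stp': subprocess.Popen([...]),
    #            'osmo-hlr': subprocess.Popen([...]),
    #            'patchelf': subprocess.Popen([...])}
    #   wait_sync(procs['patchelf'], procs)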