Failed
gprs:trx-b200.iperf3.py (from gprs_trx-b200)
Stacktrace
osmo-stp_10.42.42.5(pid=28179): Process ended prematurely: osmo-stp_10.42.42.5(pid=28179) [trial-2394↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=28179)]
Standard Output
---------------------------------------------- trial-2394 gprs:trx-b200 iperf3.py ----------------------------------------------
20:28:49.987330 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:49.998511 tst                 gprs:trx-b200: Using 1 x bts (candidates: 1)
20:28:50.146232 tst                     iperf3.py: using LAC 36078
20:28:50.274739 tst                     iperf3.py: using RAC 123
20:28:50.402968 tst                     iperf3.py: using CellId 36078
20:28:50.530755 tst                     iperf3.py: using BVCI 36079
20:28:50.554125 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.564831 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.575286 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.585781 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.596787 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.607528 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.618418 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.629098 tst                 gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:28:50.639845 tst                 gprs:trx-b200: Using 1 x modem (candidates: 4)
20:28:50.938661 tst                  iperf3.py:8: start iperfv3 server 10.42.42.10:5003...
20:28:51.075835 run      iperf3-srv_10.42.42.10: Starting iperf3-srv
20:28:51.415956 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=28171): Launched
20:28:51.700195 run iperf3-srv_10.42.42.10(pid=28172): Launched
20:28:51.901519 tst                  iperf3.py:8: start network...
20:28:52.038833 run          osmo-hlr_10.42.42.2: Starting osmo-hlr
20:28:52.565340 run        create_hlr_db(pid=28173): Launched
20:28:52.754058 bus                      /gobi_6: Setting Powered False
20:28:53.774839 run        create_hlr_db(pid=28173): Terminated: ok {rc=0}
20:28:54.185327 run pcap-recorder_any(filters='host 10.42.42.2')(pid=28175): Launched
20:28:54.492502 run osmo-hlr_10.42.42.2(pid=28176): Launched
20:28:54.613320 run          osmo-stp_10.42.42.5: Starting osmo-stp
20:28:55.154017 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=28178): Launched
20:28:55.462253 run osmo-stp_10.42.42.5(pid=28179): Launched
20:28:55.575350 run         osmo-ggsn_10.42.42.6: Starting osmo-ggsn
20:28:56.109918 run pcap-recorder_any(filters='host 10.42.42.6')(pid=28181): Launched
20:28:56.219099 run         osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
20:28:56.486971 run             patchelf(pid=28182): Launched
20:28:56.667775 run osmo-stp_10.42.42.5(pid=28179): ERR: Terminated: ERROR {rc=1} [trial-2394↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=28179)]
20:28:56.799474 run osmo-stp_10.42.42.5(pid=28179): stdout:
| (launched: 2022-12-13_20:28:55.310240)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
20:28:56.931625 run osmo-stp_10.42.42.5(pid=28179): stderr:
| 20221213202855478 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213202855478 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213202855478 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213202855479 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213202855479 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213202855479 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213202855479 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
20:28:57.059710 run osmo-stp_10.42.42.5(pid=28179): stdout:
| (launched: 2022-12-13_20:28:55.310240)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
20:28:57.186467 run osmo-stp_10.42.42.5(pid=28179): stderr:
| 20221213202855478 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213202855478 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213202855478 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213202855479 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213202855479 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213202855479 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213202855479 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
20:28:57.334047 run             patchelf(pid=28182): Terminating (SIGINT)
20:28:57.508269 run             patchelf(pid=28182): Terminated: ok {rc=0}
20:28:57.540159 tst                  iperf3.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=28179): Process ended prematurely: osmo-stp_10.42.42.5(pid=28179) [trial-2394↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=28179)] [trial-2394↪gprs:trx-b200↪iperf3.py:8]
20:28:57.549601 tst                  iperf3.py:8: Test FAILED (7.6 sec)
20:28:57.610557 run osmo-hlr_10.42.42.2(pid=28176): ERR: Terminated: ERROR {rc=237} [trial-2394↪gprs:trx-b200↪iperf3.py↪iperf3.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=28176)]
20:28:57.660173 run osmo-hlr_10.42.42.2(pid=28176): stdout:
| (launched: 2022-12-13_20:28:54.341115)
20:28:57.710501 run osmo-hlr_10.42.42.2(pid=28176): stderr:
| 20221213202854515 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221213202854515 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221213202854515 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221213202854515 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221213202854515 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221213202854515 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221213202854527 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2394/run.2022-12-13_20-06-05/gprs:trx-b200/iperf3.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221213202854536 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221213202854536 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221213202854536 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
20:28:57.753544 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=28171): Terminating (SIGTERM)
20:28:57.794120 run iperf3-srv_10.42.42.10(pid=28172): Terminating (SIGTERM)
20:28:57.834685 run pcap-recorder_any(filters='host 10.42.42.2')(pid=28175): Terminating (SIGTERM)
20:28:57.875070 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=28178): Terminating (SIGTERM)
20:28:57.916210 run pcap-recorder_any(filters='host 10.42.42.6')(pid=28181): Terminating (SIGTERM)
20:28:57.925592 ---      ParallelTerminationStrategy: PID 28171 died...
20:28:57.980161 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=28171): Terminated: ok {rc=0}
20:28:57.989837 ---      ParallelTerminationStrategy: PID 28172 died...
20:28:58.044037 run iperf3-srv_10.42.42.10(pid=28172): Terminated {rc=256}
20:28:58.053674 ---      ParallelTerminationStrategy: PID 28175 died...
20:28:58.108301 run pcap-recorder_any(filters='host 10.42.42.2')(pid=28175): Terminated: ok {rc=0}
20:28:58.117963 ---      ParallelTerminationStrategy: PID 28178 died...
20:28:58.172002 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=28178): Terminated: ok {rc=0}
20:28:58.181650 ---      ParallelTerminationStrategy: PID 28181 died...
20:28:58.235454 run pcap-recorder_any(filters='host 10.42.42.6')(pid=28181): Terminated: ok {rc=0}
20:28:58.320068 bus                      /gobi_6: Setting Powered False
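
Note on the output above: the decisive failures are the "Address already in use" errors. osmo-stp cannot bind 10.42.42.5:2905 (M3UA) or 10.42.42.5:4239 (VTY telnet), and osmo-hlr cannot bind 10.42.42.2:4258, which points at stale processes from an earlier run still holding those ports. A pre-launch probe along the following lines could confirm that before the daemons are started. This is a minimal sketch, not part of osmo-gsm-tester: port_is_free() is a hypothetical helper, it must run on the host that owns those addresses, and 2905 is really an SCTP port, so a TCP bind probe only approximates that case.

    import socket

    def port_is_free(host, port):
        """Return True if host:port can be bound, i.e. nothing currently holds it."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
                return True
            except OSError:
                # EADDRINUSE and friends: something (e.g. a stale osmo-stp) holds it
                return False

    # The three binds that failed in this trial:
    for host, port in [('10.42.42.5', 2905), ('10.42.42.5', 4239), ('10.42.42.2', 4258)]:
        state = 'free' if port_is_free(host, port) else 'in use'
        print('%s:%d %s' % (host, port, state))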
Standard Error
Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3.py", line 8, in <module>
    setup_run_iperf3_test_parallel(1)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=28179): Process ended prematurely: osmo-stp_10.42.42.5(pid=28179) [trial-2394↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=28179)]
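
Reading the traceback bottom-up: the test is still inside ggsn.start(), synchronously running the patchelf step via launch_sync(), when the event loop's deferred handler notices that the already-launched osmo-stp has exited (rc=1 from the failed binds above) and testenv.poll() turns that into "Process ended prematurely". That is why the stack points at change_elf_rpath rather than at osmo-stp itself. The pattern is roughly a watchdog wrapped around every blocking wait; a rough illustrative sketch with hypothetical names, not the actual osmo-gsm-tester code:

    import subprocess
    import time

    class ProcessEndedPrematurely(Exception):
        pass

    def wait_with_watchdog(condition, watched, timeout=30.0, step=0.1):
        """Poll condition() until true; fail fast if any watched process dies first."""
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            for name, proc in watched.items():
                rc = proc.poll()  # non-None means the process has exited
                if rc is not None:
                    # A background daemon died while it was expected to keep running,
                    # so abort the current (possibly unrelated) wait immediately.
                    raise ProcessEndedPrematurely('%s(pid=%d): rc=%d' % (name, proc.pid, rc))
            if condition():
                return
            time.sleep(step)
        raise TimeoutError('condition not met within %.1fs' % timeout)

With that shape, any wait in the test (here: waiting for patchelf to finish) doubles as a health check on every daemon launched so far, which matches the behaviour seen in this trial.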