Failed
gprs:trx-b200.iperf3.py (from gprs_trx-b200)
Stacktrace
osmo-stp_10.42.42.5(pid=11763): Process ended prematurely: osmo-stp_10.42.42.5(pid=11763) [trial-2364↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=11763)]
Standard Output
----------------------------------------------
trial-2364 gprs:trx-b200 iperf3.py
----------------------------------------------
20:27:23.019641 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.030921 tst gprs:trx-b200: Using 1 x bts (candidates: 1)
20:27:23.172178 tst iperf3.py: using LAC 23517
20:27:23.299413 tst iperf3.py: using RAC 57
20:27:23.426637 tst iperf3.py: using CellId 23517
20:27:23.554559 tst iperf3.py: using BVCI 23518
20:27:23.577940 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.588485 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.598969 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.609801 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.620410 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.631244 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.641923 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.652391 tst gprs:trx-b200: Using 1 x ip_address (candidates: 9)
20:27:23.663519 tst gprs:trx-b200: Using 1 x modem (candidates: 4)
20:27:23.959893 tst iperf3.py:8: start iperfv3 server 10.42.42.10:5003...
20:27:24.096475 run iperf3-srv_10.42.42.10: Starting iperf3-srv
20:27:24.431938 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=11755): Launched
20:27:24.711137 run iperf3-srv_10.42.42.10(pid=11756): Launched
20:27:24.911029 tst iperf3.py:8: start network...
20:27:25.047398 run osmo-hlr_10.42.42.2: Starting osmo-hlr
20:27:25.567513 run create_hlr_db(pid=11757): Launched
20:27:25.754878 bus /gobi_6: Setting Powered False
20:27:26.775245 run create_hlr_db(pid=11757): Terminated: ok {rc=0}
20:27:27.180508 run pcap-recorder_any(filters='host 10.42.42.2')(pid=11759): Launched
20:27:27.483981 run osmo-hlr_10.42.42.2(pid=11760): Launched
20:27:27.605213 run osmo-stp_10.42.42.5: Starting osmo-stp
20:27:28.136527 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=11762): Launched
20:27:28.438788 run osmo-stp_10.42.42.5(pid=11763): Launched
20:27:28.549449 run osmo-ggsn_10.42.42.6: Starting osmo-ggsn
20:27:29.075314 run pcap-recorder_any(filters='host 10.42.42.6')(pid=11765): Launched
20:27:29.183831 run osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
20:27:29.447745 run patchelf(pid=11767): Launched
20:27:29.625173 run osmo-stp_10.42.42.5(pid=11763): ERR: Terminated: ERROR {rc=1} [trial-2364↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=11763)]
20:27:29.754145 run osmo-stp_10.42.42.5(pid=11763): stdout:
| (launched: 2022-12-09_20:27:28.290474)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
20:27:29.883893 run osmo-stp_10.42.42.5(pid=11763): stderr:
 | 20221209202728456 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
 | 20221209202728456 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
 | 20221209202728456 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
 | 20221209202728457 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
 | % Unable to bind xUA server to IP(s)
 | 20221209202728457 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
 | 20221209202728458 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
 | 20221209202728458 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
 | Error binding VTY port
 | : Address already in use
20:27:30.009504 run osmo-stp_10.42.42.5(pid=11763): stdout:
| (launched: 2022-12-09_20:27:28.290474)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
|
| Free Software lives by contribution. If you use this, please contribute!
|
20:27:30.134270 run osmo-stp_10.42.42.5(pid=11763): stderr:
 | 20221209202728456 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
 | 20221209202728456 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
 | 20221209202728456 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
 | 20221209202728457 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
 | % Unable to bind xUA server to IP(s)
 | 20221209202728457 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
 | 20221209202728458 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
 | 20221209202728458 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
 | Error binding VTY port
 | : Address already in use
20:27:30.278744 run patchelf(pid=11767): Terminating (SIGINT)
20:27:30.448506 run patchelf(pid=11767): Terminated: ok {rc=0}
20:27:30.479721 tst iperf3.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=11763): Process ended prematurely: osmo-stp_10.42.42.5(pid=11763) [trial-2364↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=11763)] [trial-2364↪gprs:trx-b200↪iperf3.py:8]
20:27:30.488985 tst iperf3.py:8: Test FAILED (7.5 sec)
20:27:30.549100 run osmo-hlr_10.42.42.2(pid=11760): ERR: Terminated: ERROR {rc=237} [trial-2364↪gprs:trx-b200↪iperf3.py↪iperf3.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=11760)]
20:27:30.597835 run osmo-hlr_10.42.42.2(pid=11760): stdout:
| (launched: 2022-12-09_20:27:27.334499)
20:27:30.647132 run osmo-hlr_10.42.42.2(pid=11760): stderr:
 | 20221209202727508 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
 | 20221209202727508 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
 | 20221209202727508 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
 | 20221209202727508 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
 | 20221209202727508 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
 | 20221209202727508 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
 | 20221209202727520 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2364/run.2022-12-09_20-04-44/gprs:trx-b200/iperf3.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
 | 20221209202727528 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
 | 20221209202727529 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
 | 20221209202727529 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
20:27:30.688751 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=11755): Terminating (SIGTERM)
20:27:30.727982 run iperf3-srv_10.42.42.10(pid=11756): Terminating (SIGTERM)
20:27:30.767454 run pcap-recorder_any(filters='host 10.42.42.2')(pid=11759): Terminating (SIGTERM)
20:27:30.806775 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=11762): Terminating (SIGTERM)
20:27:30.846679 run pcap-recorder_any(filters='host 10.42.42.6')(pid=11765): Terminating (SIGTERM)
20:27:30.855852 --- ParallelTerminationStrategy: PID 11755 died...
20:27:30.915612 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=11755): Terminated: ok {rc=0}
20:27:30.925573 --- ParallelTerminationStrategy: PID 11756 died...
20:27:30.978452 run iperf3-srv_10.42.42.10(pid=11756): Terminated {rc=256}
20:27:30.988627 --- ParallelTerminationStrategy: PID 11759 died...
20:27:31.042108 run pcap-recorder_any(filters='host 10.42.42.2')(pid=11759): Terminated: ok {rc=0}
20:27:31.052135 --- ParallelTerminationStrategy: PID 11762 died...
20:27:31.106057 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=11762): Terminated: ok {rc=0}
20:27:31.116117 --- ParallelTerminationStrategy: PID 11765 died...
20:27:31.170068 run pcap-recorder_any(filters='host 10.42.42.6')(pid=11765): Terminated: ok {rc=0}
20:27:31.259040 bus /gobi_6: Setting Powered False
Standard Error
Traceback (most recent call last):
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
self.path)
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
spec.loader.exec_module( importlib.util.module_from_spec(spec) )
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3.py", line 8, in <module>
setup_run_iperf3_test_parallel(1)
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
ggsn.start()
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
proc.launch_sync()
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
raise e
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
self.wait()
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
MainLoop.wait(self.terminated, timeout=timeout)
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
self.poll(may_block=True)
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
self.deferred_handling.handle_queue()
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
handler(*args, **kwargs)
File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=11763): Process ended prematurely: osmo-stp_10.42.42.5(pid=11763) [trial-2364↪gprs:trx-b200↪iperf3.py:8↪iperf3.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=11763)]