Failed

gprs:trx-umtrx.iperf3m4.py (from gprs_trx-umtrx)

Failing for the past 217 builds (Since #2175)
Took 12 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=4968): Process ended prematurely: osmo-stp_10.42.42.5(pid=4968) [trial-2391↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=4968)]
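
The stderr captured below shows the actual cause: osmo-stp exits immediately because 10.42.42.5:2905 (the M3UA listener) and 10.42.42.5:4239 (the telnet/VTY interface) are already bound, most likely by a stale osmo-stp instance left over from an earlier run on the same tester host. A minimal diagnostic sketch in Python, assuming plain bind semantics (on a live system the M3UA port is SCTP, so a TCP probe is only an approximation) and using the address and ports from the log:

    import socket

    ADDR = '10.42.42.5'
    PORTS = [2905, 4239]   # M3UA listener and telnet/VTY, per the stderr below

    def port_is_free(addr, port):
        """Return True if addr:port can be bound, i.e. no stale process holds it."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                s.bind((addr, port))
                return True
            except OSError:        # EADDRINUSE, EADDRNOTAVAIL, ...
                return False

    for port in PORTS:
        state = 'free' if port_is_free(ADDR, port) else 'ALREADY IN USE'
        print(f'{ADDR}:{port} is {state}')

Run on the tester host before a retry, this distinguishes a leftover process from a test-internal race.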

Standard Output

----------------------------------------------
trial-2391 gprs:trx-umtrx iperf3m4.py
----------------------------------------------
16:18:29.093719 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.105902 tst                   gprs:trx-umtrx: Using 1 x bts (candidates: 1)
16:18:29.254031 tst                      iperf3m4.py: using LAC 34787
16:18:29.384760 tst                      iperf3m4.py: using RAC 107
16:18:29.514017 tst                      iperf3m4.py: using CellId 34787
16:18:29.643593 tst                      iperf3m4.py: using BVCI 34788
16:18:29.669446 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.680687 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.691765 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.702829 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.714226 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.725533 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.737219 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.748643 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
16:18:29.760213 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
16:18:29.881040 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
16:18:30.011812 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
16:18:30.130853 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
16:18:30.434394 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
16:18:30.574424 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
16:18:30.945729 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4954): Launched
16:18:31.240270 run iperf3-srv_10.42.42.10(pid=4955): Launched
16:18:31.438644 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
16:18:31.575352 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
16:18:31.959543 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4956): Launched
16:18:32.285453 run iperf3-srv_10.42.42.10(pid=4957): Launched
16:18:32.492290 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
16:18:32.633140 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
16:18:32.984325 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4958): Launched
16:18:33.275970 run iperf3-srv_10.42.42.10(pid=4959): Launched
16:18:33.482019 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
16:18:33.622609 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
16:18:33.972521 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4960): Launched
16:18:34.262882 run iperf3-srv_10.42.42.10(pid=4961): Launched
16:18:34.468048 tst                    iperf3m4.py:8: start network...
16:18:34.608024 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
16:18:35.138481 run          create_hlr_db(pid=4962): Launched
16:18:35.333309 bus                          /gobi_6: Setting Powered False
16:18:36.353222 run          create_hlr_db(pid=4962): Terminated: ok {rc=0}
16:18:36.777730 run pcap-recorder_any(filters='host 10.42.42.2')(pid=4964): Launched
16:18:37.096228 run    osmo-hlr_10.42.42.2(pid=4965): Launched
16:18:37.221199 run              osmo-stp_10.42.42.5: Starting osmo-stp
16:18:37.775781 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=4967): Launched
16:18:38.095081 run    osmo-stp_10.42.42.5(pid=4968): Launched
16:18:38.209427 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
16:18:38.759007 run pcap-recorder_any(filters='host 10.42.42.6')(pid=4970): Launched
16:18:38.870765 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
16:18:39.146558 run               patchelf(pid=4971): Launched
16:18:39.329949 run    osmo-stp_10.42.42.5(pid=4968): ERR: Terminated: ERROR {rc=1}  [trial-2391↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=4968)]
16:18:39.464088 run    osmo-stp_10.42.42.5(pid=4968): stdout: 
| (launched: 2022-12-13_16:18:37.936193)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
16:18:39.598579 run    osmo-stp_10.42.42.5(pid=4968): stderr: 
| 20221213161838110 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213161838110 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213161838110 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213161838110 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213161838111 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213161838111 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213161838111 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
16:18:39.728051 run    osmo-stp_10.42.42.5(pid=4968): stdout: 
| (launched: 2022-12-13_16:18:37.936193)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
16:18:39.856139 run    osmo-stp_10.42.42.5(pid=4968): stderr: 
| 20221213161838110 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213161838110 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213161838110 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213161838110 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213161838111 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213161838111 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213161838111 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
16:18:40.007104 run               patchelf(pid=4971): Terminating (SIGINT)
16:18:40.183398 run               patchelf(pid=4971): Terminated: ok {rc=0}
16:18:40.215725 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=4968): Process ended prematurely: osmo-stp_10.42.42.5(pid=4968) [trial-2391↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=4968)]  [trial-2391↪gprs:trx-umtrx↪iperf3m4.py:8]
16:18:40.225172 tst                    iperf3m4.py:8: Test FAILED (11.2 sec)
16:18:40.285390 run    osmo-hlr_10.42.42.2(pid=4965): ERR: Terminated: ERROR {rc=237}  [trial-2391↪gprs:trx-umtrx↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=4965)]
16:18:40.334469 run    osmo-hlr_10.42.42.2(pid=4965): stdout: 
| (launched: 2022-12-13_16:18:36.937799) 
16:18:40.383996 run    osmo-hlr_10.42.42.2(pid=4965): stderr: 
| 20221213161837118 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221213161837118 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221213161837118 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221213161837118 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221213161837118 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221213161837118 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221213161837130 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2391/run.2022-12-13_15-37-59/gprs:trx-umtrx/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221213161837138 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221213161837138 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221213161837138 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
16:18:40.426031 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4954): Terminating (SIGTERM)
16:18:40.465682 run iperf3-srv_10.42.42.10(pid=4955): Terminating (SIGTERM)
16:18:40.505740 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4956): Terminating (SIGTERM)
16:18:40.545568 run iperf3-srv_10.42.42.10(pid=4957): Terminating (SIGTERM)
16:18:40.585615 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4958): Terminating (SIGTERM)
16:18:40.625435 run iperf3-srv_10.42.42.10(pid=4959): Terminating (SIGTERM)
16:18:40.665282 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4960): Terminating (SIGTERM)
16:18:40.705400 run iperf3-srv_10.42.42.10(pid=4961): Terminating (SIGTERM)
16:18:40.745599 run pcap-recorder_any(filters='host 10.42.42.2')(pid=4964): Terminating (SIGTERM)
16:18:40.785344 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=4967): Terminating (SIGTERM)
16:18:40.825694 run pcap-recorder_any(filters='host 10.42.42.6')(pid=4970): Terminating (SIGTERM)
16:18:40.834958 ---      ParallelTerminationStrategy: PID 4954 died...
16:18:40.888334 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4954): Terminated: ok {rc=0}
16:18:40.897884 ---      ParallelTerminationStrategy: PID 4955 died...
16:18:40.950753 run iperf3-srv_10.42.42.10(pid=4955): Terminated {rc=256}
16:18:40.960400 ---      ParallelTerminationStrategy: PID 4956 died...
16:18:41.013578 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4956): Terminated: ok {rc=0}
16:18:41.023222 ---      ParallelTerminationStrategy: PID 4957 died...
16:18:41.075873 run iperf3-srv_10.42.42.10(pid=4957): Terminated {rc=256}
16:18:41.085544 ---      ParallelTerminationStrategy: PID 4958 died...
16:18:41.138655 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4958): Terminated: ok {rc=0}
16:18:41.148259 ---      ParallelTerminationStrategy: PID 4959 died...
16:18:41.201251 run iperf3-srv_10.42.42.10(pid=4959): Terminated {rc=256}
16:18:41.211022 ---      ParallelTerminationStrategy: PID 4960 died...
16:18:41.264320 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=4960): Terminated: ok {rc=0}
16:18:41.274041 ---      ParallelTerminationStrategy: PID 4961 died...
16:18:41.326810 run iperf3-srv_10.42.42.10(pid=4961): Terminated {rc=256}
16:18:41.336331 ---      ParallelTerminationStrategy: PID 4964 died...
16:18:41.389438 run pcap-recorder_any(filters='host 10.42.42.2')(pid=4964): Terminated: ok {rc=0}
16:18:41.399150 ---      ParallelTerminationStrategy: PID 4967 died...
16:18:41.451954 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=4967): Terminated: ok {rc=0}
16:18:41.461729 ---      ParallelTerminationStrategy: PID 4970 died...
16:18:41.514888 run pcap-recorder_any(filters='host 10.42.42.6')(pid=4970): Terminated: ok {rc=0}
16:18:41.602692 bus                          /gobi_4: Setting Powered False
16:18:42.706278 bus                          /gobi_4: Setting Powered False
16:18:43.815103 bus                          /gobi_1: Setting Powered False
16:18:44.920915 bus                          /gobi_1: Setting Powered False
16:18:46.058162 bus                          /gobi_0: Setting Powered False
16:18:47.165096 bus                          /gobi_0: Setting Powered False
16:18:48.273301 bus                          /gobi_6: Setting Powered False
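
The 'ParallelTerminationStrategy' lines above show the cleanup phase: after the failure, the tester sends SIGTERM to every remaining child process at once and then reaps them one by one. A hedged sketch of that pattern (illustrative only, not the framework's actual class), assuming the children are subprocess.Popen objects:

    import signal
    import subprocess

    def terminate_all(procs, timeout=10.0):
        """SIGTERM every child in parallel, then wait for each in turn,
        mirroring the 'PID ... died... Terminated' lines above."""
        for p in procs:
            if p.poll() is None:              # still running
                p.send_signal(signal.SIGTERM)
        for p in procs:
            try:
                rc = p.wait(timeout=timeout)
                print(f'PID {p.pid} died... Terminated {{rc={rc}}}')
            except subprocess.TimeoutExpired:
                p.kill()                      # escalate if SIGTERM is ignored
                p.wait()
                print(f'PID {p.pid} killed after timeout')

Signalling first and waiting second is what makes the teardown parallel: the SIGTERMs land within milliseconds of each other, and the waits then complete in whatever order the children exit.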

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=4968): Process ended prematurely: osmo-stp_10.42.42.5(pid=4968) [trial-2391↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=4968)]
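
Note how the error surfaces: the test is blocked in launch_sync() waiting for patchelf, and it is the event loop's poll hook (testenv.py:142) that notices that a different registered process, osmo-stp, has already exited, and raises. A minimal sketch of that watchdog idea, with illustrative names rather than the actual osmo_gsm_tester API:

    import subprocess

    class ProcessWatchdog:
        """Check every registered child on each event-loop poll;
        an unexpected exit anywhere aborts the whole test."""

        def __init__(self):
            self.procs = {}                   # name -> subprocess.Popen

        def register(self, name, popen):
            self.procs[name] = popen

        def poll(self):
            for name, p in self.procs.items():
                if p.poll() is not None:      # exited without being asked to
                    raise RuntimeError(
                        f'Process ended prematurely: {name} '
                        f'(pid={p.pid}, rc={p.returncode})')

This is why the reported failure names osmo-stp even though the traceback itself runs through ggsn.start() and change_elf_rpath() for osmo-ggsn.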