Failed

gprs:trx-umtrx.iperf3m4.py (from gprs_trx-umtrx)

Failing for the past 195 builds (Since #2175)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=25103): Process ended prematurely: osmo-stp_10.42.42.5(pid=25103) [trial-2369↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25103)]
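
The stacktrace only reports the symptom: osmo-stp exited with rc=1 right after launch, so the tester flagged it as having ended prematurely. The stderr captured below shows the actual cause: the process could not bind its M3UA listener on 10.42.42.5:2905 nor its telnet/VTY port on 10.42.42.5:4239 because both addresses were already in use, which normally points at a leftover osmo-stp (or another stale process from an earlier trial) still holding those sockets on the tester host.

A minimal pre-flight check along these lines (not part of osmo-gsm-tester; host and ports are taken from the log below, and it has to run on the machine that owns 10.42.42.5) can confirm the diagnosis before a retry:

#!/usr/bin/env python3
"""Illustrative check: are the addresses osmo-stp needs actually free?"""
import socket

# M3UA listener and telnet/VTY port, as reported in the stderr dump below.
CHECKS = [("10.42.42.5", 2905), ("10.42.42.5", 4239)]

def is_free(host, port):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    try:
        s.bind((host, port))   # fails with EADDRINUSE if another listener holds it
        return True
    except OSError:
        return False
    finally:
        s.close()

for host, port in CHECKS:
    print(host, port, "free" if is_free(host, port) else "ALREADY IN USE")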

Standard Output

----------------------------------------------
trial-2369 gprs:trx-umtrx iperf3m4.py
----------------------------------------------
04:13:59.613682 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:13:59.625084 tst                   gprs:trx-umtrx: Using 1 x bts (candidates: 1)
04:13:59.766327 tst                      iperf3m4.py: using LAC 25914
04:13:59.890029 tst                      iperf3m4.py: using RAC 159
04:14:00.014987 tst                      iperf3m4.py: using CellId 25914
04:14:00.138410 tst                      iperf3m4.py: using BVCI 25915
04:14:00.162497 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.174040 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.184875 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.195749 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.206968 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.217782 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.228701 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.239574 tst                   gprs:trx-umtrx: Using 1 x ip_address (candidates: 9)
04:14:00.250620 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
04:14:00.365489 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
04:14:00.488644 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
04:14:00.606192 tst                   gprs:trx-umtrx: Using 1 x modem (candidates: 4)
04:14:00.920175 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
04:14:01.065745 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
04:14:01.407844 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25089): Launched
04:14:01.698329 run iperf3-srv_10.42.42.10(pid=25090): Launched
04:14:01.897146 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
04:14:02.030778 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
04:14:02.358938 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25091): Launched
04:14:02.629116 run iperf3-srv_10.42.42.10(pid=25092): Launched
04:14:02.823269 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
04:14:02.955181 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
04:14:03.280035 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25093): Launched
04:14:03.549253 run iperf3-srv_10.42.42.10(pid=25094): Launched
04:14:03.742746 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
04:14:03.875919 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
04:14:04.201789 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25095): Launched
04:14:04.471559 run iperf3-srv_10.42.42.10(pid=25096): Launched
04:14:04.664659 tst                    iperf3m4.py:8: start network...
04:14:04.796622 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
04:14:05.299349 run         create_hlr_db(pid=25097): Launched
04:14:05.482363 bus                          /gobi_6: Setting Powered False
04:14:06.502811 run         create_hlr_db(pid=25097): Terminated: ok {rc=0}
04:14:06.922064 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25099): Launched
04:14:07.235626 run   osmo-hlr_10.42.42.2(pid=25100): Launched
04:14:07.351128 run              osmo-stp_10.42.42.5: Starting osmo-stp
04:14:07.899161 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25102): Launched
04:14:08.213463 run   osmo-stp_10.42.42.5(pid=25103): Launched
04:14:08.327614 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
04:14:08.876813 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25105): Launched
04:14:08.989378 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
04:14:09.263117 run              patchelf(pid=25106): Launched
04:14:09.448245 run   osmo-stp_10.42.42.5(pid=25103): ERR: Terminated: ERROR {rc=1}  [trial-2369↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25103)]
04:14:09.583775 run   osmo-stp_10.42.42.5(pid=25103): stdout: 
| (launched: 2022-12-10_04:14:08.059571)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
04:14:09.719570 run   osmo-stp_10.42.42.5(pid=25103): stderr: 
| 20221210041408249 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210041408249 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210041408250 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210041408250 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210041408250 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210041408250 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210041408251 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
04:14:09.851268 run   osmo-stp_10.42.42.5(pid=25103): stdout: 
| (launched: 2022-12-10_04:14:08.059571)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
04:14:09.980762 run   osmo-stp_10.42.42.5(pid=25103): stderr: 
| 20221210041408249 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210041408249 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210041408250 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210041408250 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210041408250 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210041408250 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210041408251 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
04:14:10.131471 run              patchelf(pid=25106): Terminating (SIGINT)
04:14:10.308741 run              patchelf(pid=25106): Terminated: ok {rc=0}
04:14:10.341463 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=25103): Process ended prematurely: osmo-stp_10.42.42.5(pid=25103) [trial-2369↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25103)]  [trial-2369↪gprs:trx-umtrx↪iperf3m4.py:8]
04:14:10.351509 tst                    iperf3m4.py:8: Test FAILED (10.8 sec)
04:14:10.412860 run   osmo-hlr_10.42.42.2(pid=25100): ERR: Terminated: ERROR {rc=237}  [trial-2369↪gprs:trx-umtrx↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=25100)]
04:14:10.462856 run   osmo-hlr_10.42.42.2(pid=25100): stdout: 
| (launched: 2022-12-10_04:14:07.082385) 
04:14:10.513401 run   osmo-hlr_10.42.42.2(pid=25100): stderr: 
| 20221210041407279 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221210041407279 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221210041407279 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221210041407279 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221210041407279 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221210041407279 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221210041407291 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2369/run.2022-12-10_03-34-39/gprs:trx-umtrx/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221210041407300 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221210041407300 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221210041407300 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
04:14:10.556552 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25089): Terminating (SIGTERM)
04:14:10.597694 run iperf3-srv_10.42.42.10(pid=25090): Terminating (SIGTERM)
04:14:10.639020 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25091): Terminating (SIGTERM)
04:14:10.680127 run iperf3-srv_10.42.42.10(pid=25092): Terminating (SIGTERM)
04:14:10.721152 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25093): Terminating (SIGTERM)
04:14:10.761969 run iperf3-srv_10.42.42.10(pid=25094): Terminating (SIGTERM)
04:14:10.803233 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25095): Terminating (SIGTERM)
04:14:10.844198 run iperf3-srv_10.42.42.10(pid=25096): Terminating (SIGTERM)
04:14:10.885107 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25099): Terminating (SIGTERM)
04:14:10.926014 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25102): Terminating (SIGTERM)
04:14:10.967199 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25105): Terminating (SIGTERM)
04:14:10.976803 ---      ParallelTerminationStrategy: PID 25089 died...
04:14:11.032048 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25089): Terminated: ok {rc=0}
04:14:11.042211 ---      ParallelTerminationStrategy: PID 25090 died...
04:14:11.097348 run iperf3-srv_10.42.42.10(pid=25090): Terminated {rc=256}
04:14:11.107361 ---      ParallelTerminationStrategy: PID 25091 died...
04:14:11.162630 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25091): Terminated: ok {rc=0}
04:14:11.172756 ---      ParallelTerminationStrategy: PID 25092 died...
04:14:11.227667 run iperf3-srv_10.42.42.10(pid=25092): Terminated {rc=256}
04:14:11.237790 ---      ParallelTerminationStrategy: PID 25093 died...
04:14:11.292847 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25093): Terminated: ok {rc=0}
04:14:11.302774 ---      ParallelTerminationStrategy: PID 25094 died...
04:14:11.357999 run iperf3-srv_10.42.42.10(pid=25094): Terminated {rc=256}
04:14:11.368067 ---      ParallelTerminationStrategy: PID 25095 died...
04:14:11.423367 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25095): Terminated: ok {rc=0}
04:14:11.433520 ---      ParallelTerminationStrategy: PID 25096 died...
04:14:11.488441 run iperf3-srv_10.42.42.10(pid=25096): Terminated {rc=256}
04:14:11.498597 ---      ParallelTerminationStrategy: PID 25099 died...
04:14:11.553760 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25099): Terminated: ok {rc=0}
04:14:11.563789 ---      ParallelTerminationStrategy: PID 25102 died...
04:14:11.618987 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25102): Terminated: ok {rc=0}
04:14:11.629071 ---      ParallelTerminationStrategy: PID 25105 died...
04:14:11.684191 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25105): Terminated: ok {rc=0}
04:14:11.769476 bus                          /gobi_4: Setting Powered False
04:14:12.871812 bus                          /gobi_4: Setting Powered False
04:14:13.981872 bus                          /gobi_1: Setting Powered False
04:14:15.083530 bus                          /gobi_1: Setting Powered False
04:14:16.195381 bus                          /gobi_0: Setting Powered False
04:14:17.300003 bus                          /gobi_0: Setting Powered False
04:14:18.406859 bus                          /gobi_6: Setting Powered False
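
Note that osmo-hlr ran into the same "Address already in use" error on its own VTY port (10.42.42.2:4258) and was later reaped with rc=237, so the problem is not specific to osmo-stp: listening sockets from an earlier run on this host apparently had not been released when trial-2369 started. If that suspicion needs confirming, a small diagnostic along the following lines can show which process still owns the listeners (this assumes the third-party psutil package is available on the tester host; it is not part of the test suite, and seeing other users' PIDs may require root):

#!/usr/bin/env python3
"""Illustrative diagnostic: who is listening on the ports this trial failed to bind?"""
import psutil

SUSPECT_PORTS = {2905, 4239, 4258}   # M3UA, osmo-stp VTY, osmo-hlr VTY from the log above

for conn in psutil.net_connections(kind="inet"):
    if conn.status == psutil.CONN_LISTEN and conn.laddr.port in SUSPECT_PORTS:
        name = psutil.Process(conn.pid).name() if conn.pid else "unknown"
        print(f"{conn.laddr.ip}:{conn.laddr.port} held by pid={conn.pid} ({name})")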

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=25103): Process ended prematurely: osmo-stp_10.42.42.5(pid=25103) [trial-2369↪gprs:trx-umtrx↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25103)]
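
The traceback shows where the failure surfaced rather than where it originated: the test was still busy starting osmo-ggsn (waiting for the patchelf helper in util.change_elf_rpath) when the event loop's poll noticed that one of the supervised processes, osmo-stp, had already terminated, and aborted the run. Conceptually the watchdog works like the sketch below; this is an illustration of the pattern, not the actual osmo_gsm_tester code:

import subprocess
import time

class ProcWatcher:
    """Tracks processes that are expected to stay alive for the whole test."""
    def __init__(self):
        self.procs = []

    def launch(self, name, argv):
        p = subprocess.Popen(argv)
        self.procs.append((name, p))
        return p

    def poll(self):
        # Abort as soon as any supervised process has exited on its own,
        # mirroring the "Process ended prematurely" error raised above.
        for name, p in self.procs:
            if p.poll() is not None:
                raise RuntimeError(f"Process ended prematurely: {name} (rc={p.returncode})")

    def wait(self, condition, timeout=30.0, step=0.1):
        # Block until condition() is true, checking the supervised
        # processes on every iteration, much like MainLoop.wait() does.
        deadline = time.monotonic() + timeout
        while not condition():
            self.poll()
            if time.monotonic() >= deadline:
                raise TimeoutError("condition not met in time")
            time.sleep(step)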