Failed

gprs:oc2g.iperf3m4.py (from gprs_oc2g)

Failing for the past 252 builds (Since #2139)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=25191): Process ended prematurely: osmo-stp_10.42.42.5(pid=25191) [trial-2390↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25191)]
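
Analysis note: the stderr dump below shows why osmo-stp exited right after launch: it could not bind its M3UA port (10.42.42.5:2905) nor its VTY telnet port (10.42.42.5:4239), both failing with "Address already in use", so it terminated with rc=1 and the harness flagged it as "Process ended prematurely". osmo-hlr hit the same symptom on 10.42.42.2:4258, which points to stale processes from an earlier trial still holding these addresses. A minimal pre-flight probe, as a sketch using only the Python stdlib; the helper name and port list are illustrative, not part of osmo-gsm-tester, and this checks TCP only (the M3UA port is SCTP, which would need socket.IPPROTO_SCTP on Linux):

    import socket

    def port_is_free(host, port):
        # Try to bind the address ourselves; a stale process holding it makes bind() fail.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
                return True
            except OSError:
                return False

    # Ports osmo-stp tried to bind in this run: 2905 (M3UA) and 4239 (VTY telnet).
    for port in (2905, 4239):
        if not port_is_free('10.42.42.5', port):
            print('10.42.42.5:%d still in use - stale process from a previous run?' % port)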

Standard Output

----------------------------------------------
trial-2390 gprs:oc2g iperf3m4.py
----------------------------------------------
13:15:19.980347 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:19.991525 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
13:15:20.115399 tst                      iperf3m4.py: using LAC 34189
13:15:20.237683 tst                      iperf3m4.py: using RAC 19
13:15:20.359389 tst                      iperf3m4.py: using CellId 34189
13:15:20.481373 tst                      iperf3m4.py: using BVCI 34190
13:15:20.504567 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.515123 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.525655 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.536004 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.547255 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.560533 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.572088 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.583652 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
13:15:20.595198 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
13:15:20.712606 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
13:15:20.835446 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
13:15:20.952678 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
13:15:21.231807 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
13:15:21.357877 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
13:15:21.676124 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25177): Launched
13:15:21.940268 run iperf3-srv_10.42.42.10(pid=25178): Launched
13:15:22.125920 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
13:15:22.252224 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
13:15:22.566693 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25179): Launched
13:15:22.830028 run iperf3-srv_10.42.42.10(pid=25180): Launched
13:15:23.015491 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
13:15:23.141915 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
13:15:23.460057 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25181): Launched
13:15:23.723903 run iperf3-srv_10.42.42.10(pid=25182): Launched
13:15:23.909318 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
13:15:24.035385 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
13:15:24.350056 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25183): Launched
13:15:24.610808 run iperf3-srv_10.42.42.10(pid=25184): Launched
13:15:24.795483 tst                    iperf3m4.py:8: start network...
13:15:24.920640 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
13:15:25.407878 run         create_hlr_db(pid=25185): Launched
13:15:25.582141 bus                          /gobi_6: Setting Powered False
13:15:26.603350 run         create_hlr_db(pid=25185): Terminated: ok {rc=0}
13:15:27.037084 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25187): Launched
13:15:27.365128 run   osmo-hlr_10.42.42.2(pid=25188): Launched
13:15:27.486793 run              osmo-stp_10.42.42.5: Starting osmo-stp
13:15:28.053636 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25190): Launched
13:15:28.381598 run   osmo-stp_10.42.42.5(pid=25191): Launched
13:15:28.501270 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
13:15:29.077909 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25193): Launched
13:15:29.196256 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
13:15:29.486511 run              patchelf(pid=25194): Launched
13:15:29.679573 run   osmo-stp_10.42.42.5(pid=25191): ERR: Terminated: ERROR {rc=1}  [trial-2390↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25191)]
13:15:29.820621 run   osmo-stp_10.42.42.5(pid=25191): stdout: 
| (launched: 2022-12-13_13:15:28.219517)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
13:15:29.963057 run   osmo-stp_10.42.42.5(pid=25191): stderr: 
| 20221213131528406 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213131528406 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213131528407 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213131528407 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213131528408 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213131528408 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213131528408 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
13:15:30.100052 run   osmo-stp_10.42.42.5(pid=25191): stdout: 
| (launched: 2022-12-13_13:15:28.219517)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
13:15:30.236050 run   osmo-stp_10.42.42.5(pid=25191): stderr: 
| 20221213131528406 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213131528406 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213131528407 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213131528407 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213131528408 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213131528408 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213131528408 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
13:15:30.394434 run              patchelf(pid=25194): Terminating (SIGINT)
13:15:30.581247 run              patchelf(pid=25194): Terminated: ok {rc=0}
13:15:30.615301 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=25191): Process ended prematurely: osmo-stp_10.42.42.5(pid=25191) [trial-2390↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25191)]  [trial-2390↪gprs:oc2g↪iperf3m4.py:8]
13:15:30.625571 tst                    iperf3m4.py:8: Test FAILED (10.7 sec)
13:15:30.689144 run   osmo-hlr_10.42.42.2(pid=25188): ERR: Terminated: ERROR {rc=237}  [trial-2390↪gprs:oc2g↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=25188)]
13:15:30.740537 run   osmo-hlr_10.42.42.2(pid=25188): stdout: 
| (launched: 2022-12-13_13:15:27.203306) 
13:15:30.792652 run   osmo-hlr_10.42.42.2(pid=25188): stderr: 
| 20221213131527397 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221213131527397 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221213131527397 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221213131527397 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221213131527397 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221213131527397 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221213131527412 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2390/run.2022-12-13_13-01-08/gprs:oc2g/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221213131527422 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221213131527423 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221213131527423 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
13:15:30.841759 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25177): Terminating (SIGTERM)
13:15:30.916807 run iperf3-srv_10.42.42.10(pid=25178): Terminating (SIGTERM)
13:15:30.961153 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25179): Terminating (SIGTERM)
13:15:31.006089 run iperf3-srv_10.42.42.10(pid=25180): Terminating (SIGTERM)
13:15:31.071421 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25181): Terminating (SIGTERM)
13:15:31.121179 run iperf3-srv_10.42.42.10(pid=25182): Terminating (SIGTERM)
13:15:31.166095 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25183): Terminating (SIGTERM)
13:15:31.211032 run iperf3-srv_10.42.42.10(pid=25184): Terminating (SIGTERM)
13:15:31.255944 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25187): Terminating (SIGTERM)
13:15:31.299055 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25190): Terminating (SIGTERM)
13:15:31.344213 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25193): Terminating (SIGTERM)
13:15:31.354416 ---      ParallelTerminationStrategy: PID 25177 died...
13:15:31.412076 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25177): Terminated: ok {rc=0}
13:15:31.422736 ---      ParallelTerminationStrategy: PID 25178 died...
13:15:31.479991 run iperf3-srv_10.42.42.10(pid=25178): Terminated {rc=256}
13:15:31.490933 ---      ParallelTerminationStrategy: PID 25179 died...
13:15:31.548387 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25179): Terminated: ok {rc=0}
13:15:31.559103 ---      ParallelTerminationStrategy: PID 25180 died...
13:15:31.616463 run iperf3-srv_10.42.42.10(pid=25180): Terminated {rc=256}
13:15:31.627141 ---      ParallelTerminationStrategy: PID 25181 died...
13:15:31.684321 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25181): Terminated: ok {rc=0}
13:15:31.695042 ---      ParallelTerminationStrategy: PID 25182 died...
13:15:31.752220 run iperf3-srv_10.42.42.10(pid=25182): Terminated {rc=256}
13:15:31.763067 ---      ParallelTerminationStrategy: PID 25183 died...
13:15:31.820193 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25183): Terminated: ok {rc=0}
13:15:31.831231 ---      ParallelTerminationStrategy: PID 25184 died...
13:15:31.888254 run iperf3-srv_10.42.42.10(pid=25184): Terminated {rc=256}
13:15:31.899284 ---      ParallelTerminationStrategy: PID 25187 died...
13:15:31.956348 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25187): Terminated: ok {rc=0}
13:15:31.967125 ---      ParallelTerminationStrategy: PID 25190 died...
13:15:32.024359 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=25190): Terminated: ok {rc=0}
13:15:32.035044 ---      ParallelTerminationStrategy: PID 25193 died...
13:15:32.092515 run pcap-recorder_any(filters='host 10.42.42.6')(pid=25193): Terminated: ok {rc=0}
13:15:32.196764 bus                          /gobi_4: Setting Powered False
13:15:33.304970 bus                          /gobi_4: Setting Powered False
13:15:34.411714 bus                          /gobi_1: Setting Powered False
13:15:35.517212 bus                          /gobi_1: Setting Powered False
13:15:36.624767 bus                          /gobi_0: Setting Powered False
13:15:37.731698 bus                          /gobi_0: Setting Powered False
13:15:38.840722 bus                          /gobi_6: Setting Powered False
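
Analysis note: every service the harness launched was terminated cleanly by ParallelTerminationStrategy; the only abnormal exits are osmo-stp (rc=1) and osmo-hlr (rc=237), both on "Address already in use". To identify which PID is still holding such a port on the test host, a hedged sketch using the third-party psutil package (not part of osmo-gsm-tester; seeing other users' sockets may require elevated privileges, and kind='inet' lists TCP/UDP only, so the SCTP M3UA socket may not appear):

    import psutil

    def who_holds(port):
        # List processes with an inet socket bound to the given local port.
        for conn in psutil.net_connections(kind='inet'):
            if conn.laddr and conn.laddr.port == port and conn.pid is not None:
                proc = psutil.Process(conn.pid)
                print('port %d held by pid=%d (%s)' % (port, conn.pid, proc.name()))

    for port in (2905, 4239, 4258):   # M3UA, STP VTY, HLR VTY ports from the logs above
        who_holds(port)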

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=25191): Process ended prematurely: osmo-stp_10.42.42.5(pid=25191) [trial-2390↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=25191)]
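
Analysis note: the traceback shows the failure surfacing not in osmo-stp's own startup code but while ggsn.start() was synchronously waiting on patchelf; MainLoop.wait() polls all registered processes on each iteration, and testenv.poll() raised because the already-launched osmo-stp had died in the meantime. A minimal sketch of that detection pattern, assuming plain stdlib subprocess; the function name and loop parameters are illustrative, not the osmo-gsm-tester API:

    import subprocess
    import time

    def watch_premature_exit(procs, duration=10.0, step=0.1):
        # Poll every registered child; raise as soon as one exits before we stopped it.
        deadline = time.monotonic() + duration
        while time.monotonic() < deadline:
            for name, proc in procs.items():
                rc = proc.poll()            # None while the child is still running
                if rc is not None:
                    raise RuntimeError('Process ended prematurely: %s (rc=%d)' % (name, rc))
            time.sleep(step)

    procs = {'sleeper': subprocess.Popen(['sleep', '1'])}
    watch_premature_exit(procs)             # raises after ~1s, like testenv.poll() did here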