Failed

gprs:oc2g.iperf3m4.py (from gprs_oc2g)

Failing for the past 224 builds (Since #2139)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=26904): Process ended prematurely: osmo-stp_10.42.42.5(pid=26904) [trial-2362↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26904)]
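
The stderr captured under "Standard Output" below shows the actual cause: osmo-stp could not bind its M3UA port (10.42.42.5:2905) or its VTY telnet port (10.42.42.5:4239) because both were "Address already in use", so the process exited with rc=1 right after launch, most likely because a stale listener from an earlier trial still holds the ports. As a minimal pre-flight sketch (not part of osmo-gsm-tester; the helper name and port choice are assumptions drawn from the log), one could probe the TCP VTY port before launching. Note that 2905 carries M3UA over SCTP, so a plain TCP probe cannot detect that listener:

    import socket

    def tcp_port_is_free(addr: str, port: int) -> bool:
        """Try to bind; True means no TCP listener currently holds addr:port."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((addr, port))
                return True
            except OSError:
                return False

    # 4239 is the osmo-stp VTY port seen failing in the stderr below.
    if not tcp_port_is_free('10.42.42.5', 4239):
        print('10.42.42.5:4239 already in use - stale osmo-stp from a previous run?')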

Standard Output

----------------------------------------------
trial-2362 gprs:oc2g iperf3m4.py
----------------------------------------------
17:24:19.695807 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:19.707291 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
17:24:19.835786 tst                      iperf3m4.py: using LAC 22550
17:24:19.961742 tst                      iperf3m4.py: using RAC 110
17:24:20.083658 tst                      iperf3m4.py: using CellId 22550
17:24:20.201161 tst                      iperf3m4.py: using BVCI 22551
17:24:20.224552 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.235089 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.245544 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.256448 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.267305 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.278152 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.289031 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.299735 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
17:24:20.310706 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
17:24:20.429465 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
17:24:20.549720 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
17:24:20.667409 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
17:24:20.943598 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
17:24:21.069379 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:24:21.383975 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26890): Launched
17:24:21.645165 run iperf3-srv_10.42.42.10(pid=26891): Launched
17:24:21.829312 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
17:24:21.954942 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:24:22.267471 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26892): Launched
17:24:22.527890 run iperf3-srv_10.42.42.10(pid=26893): Launched
17:24:22.711679 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
17:24:22.836624 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:24:23.149840 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26894): Launched
17:24:23.408772 run iperf3-srv_10.42.42.10(pid=26895): Launched
17:24:23.591304 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
17:24:23.715895 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:24:24.028496 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26896): Launched
17:24:24.287548 run iperf3-srv_10.42.42.10(pid=26897): Launched
17:24:24.470793 tst                    iperf3m4.py:8: start network...
17:24:24.595507 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
17:24:25.070460 run         create_hlr_db(pid=26898): Launched
17:24:25.243919 bus                          /gobi_6: Setting Powered False
17:24:26.265166 run         create_hlr_db(pid=26898): Terminated: ok {rc=0}
17:24:26.698810 run pcap-recorder_any(filters='host 10.42.42.2')(pid=26900): Launched
17:24:27.022572 run   osmo-hlr_10.42.42.2(pid=26901): Launched
17:24:27.140159 run              osmo-stp_10.42.42.5: Starting osmo-stp
17:24:27.701891 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26903): Launched
17:24:28.025426 run   osmo-stp_10.42.42.5(pid=26904): Launched
17:24:28.141965 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
17:24:28.707675 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26906): Launched
17:24:28.823263 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
17:24:29.105060 run              patchelf(pid=26907): Launched
17:24:29.294397 run   osmo-stp_10.42.42.5(pid=26904): ERR: Terminated: ERROR {rc=1}  [trial-2362↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26904)]
17:24:29.433037 run   osmo-stp_10.42.42.5(pid=26904): stdout: 
| (launched: 2022-12-09_17:24:27.866700)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
17:24:29.572270 run   osmo-stp_10.42.42.5(pid=26904): stderr: 
| 20221209172428054 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209172428054 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209172428054 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209172428055 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209172428055 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209172428055 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209172428055 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
17:24:29.705906 run   osmo-stp_10.42.42.5(pid=26904): stdout: 
| (launched: 2022-12-09_17:24:27.866700)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
17:24:29.838586 run   osmo-stp_10.42.42.5(pid=26904): stderr: 
| 20221209172428054 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209172428054 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209172428054 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209172428055 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209172428055 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209172428055 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209172428055 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
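
Both stderr dumps above (the tester logs them twice, at 17:24:29.433 and 17:24:29.705) point at the same root cause: something already listens on the STP's ports. To identify the holder, a hedged sketch using psutil (assumed to be installed on the test host; it is not part of osmo-gsm-tester) could look like:

    import psutil

    def who_holds(port: int):
        """Return the PID of the process listening on the given TCP port, if any."""
        for conn in psutil.net_connections(kind='tcp'):
            if conn.laddr and conn.laddr.port == port and conn.status == psutil.CONN_LISTEN:
                return conn.pid
        return None

    pid = who_holds(4239)  # osmo-stp VTY port from the stderr above
    if pid is not None:
        print('port 4239 is held by', psutil.Process(pid).name(), '(pid %s)' % pid)

Resolving PIDs of other users' sockets typically requires root, so such a check would have to run with sufficient privileges on the Jenkins host.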
17:24:29.992763 run              patchelf(pid=26907): Terminating (SIGINT)
17:24:30.174423 run              patchelf(pid=26907): Terminated: ok {rc=0}
17:24:30.207525 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=26904): Process ended prematurely: osmo-stp_10.42.42.5(pid=26904) [trial-2362↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26904)]  [trial-2362↪gprs:oc2g↪iperf3m4.py:8]
17:24:30.217343 tst                    iperf3m4.py:8: Test FAILED (10.6 sec)
17:24:30.280306 run   osmo-hlr_10.42.42.2(pid=26901): ERR: Terminated: ERROR {rc=237}  [trial-2362↪gprs:oc2g↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=26901)]
17:24:30.331668 run   osmo-hlr_10.42.42.2(pid=26901): stdout: 
| (launched: 2022-12-09_17:24:26.863665) 
17:24:30.383591 run   osmo-hlr_10.42.42.2(pid=26901): stderr: 
| 20221209172427061 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221209172427061 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221209172427061 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221209172427061 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221209172427061 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221209172427061 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221209172427074 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2362/run.2022-12-09_17-10-07/gprs:oc2g/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221209172427082 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221209172427082 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221209172427082 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
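
osmo-hlr's stderr shows the same symptom on its own VTY port (10.42.42.2:4258), so the stale-listener problem is not limited to osmo-stp: the test host appears to have leftover Osmocom daemons holding their well-known ports. A cleanup sketch under that assumption (process names are taken from this log; blindly sending SIGTERM is only safe on a dedicated test host):

    import psutil, signal

    # Terminate leftover daemons from a previous trial before starting a new one.
    for proc in psutil.process_iter(['name']):
        if proc.info['name'] in ('osmo-stp', 'osmo-hlr'):
            proc.send_signal(signal.SIGTERM)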
17:24:30.427793 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26890): Terminating (SIGTERM)
17:24:30.469983 run iperf3-srv_10.42.42.10(pid=26891): Terminating (SIGTERM)
17:24:30.511830 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26892): Terminating (SIGTERM)
17:24:30.553507 run iperf3-srv_10.42.42.10(pid=26893): Terminating (SIGTERM)
17:24:30.595243 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26894): Terminating (SIGTERM)
17:24:30.636796 run iperf3-srv_10.42.42.10(pid=26895): Terminating (SIGTERM)
17:24:30.678474 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26896): Terminating (SIGTERM)
17:24:30.720156 run iperf3-srv_10.42.42.10(pid=26897): Terminating (SIGTERM)
17:24:30.761989 run pcap-recorder_any(filters='host 10.42.42.2')(pid=26900): Terminating (SIGTERM)
17:24:30.803719 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26903): Terminating (SIGTERM)
17:24:30.845699 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26906): Terminating (SIGTERM)
17:24:30.855401 ---      ParallelTerminationStrategy: PID 26890 died...
17:24:30.912786 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26890): Terminated: ok {rc=0}
17:24:30.923671 ---      ParallelTerminationStrategy: PID 26891 died...
17:24:30.984507 run iperf3-srv_10.42.42.10(pid=26891): Terminated {rc=256}
17:24:30.995046 ---      ParallelTerminationStrategy: PID 26892 died...
17:24:31.052478 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26892): Terminated: ok {rc=0}
17:24:31.062937 ---      ParallelTerminationStrategy: PID 26893 died...
17:24:31.120188 run iperf3-srv_10.42.42.10(pid=26893): Terminated {rc=256}
17:24:31.130705 ---      ParallelTerminationStrategy: PID 26894 died...
17:24:31.187621 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26894): Terminated: ok {rc=0}
17:24:31.198314 ---      ParallelTerminationStrategy: PID 26895 died...
17:24:31.255331 run iperf3-srv_10.42.42.10(pid=26895): Terminated {rc=256}
17:24:31.266104 ---      ParallelTerminationStrategy: PID 26896 died...
17:24:31.322957 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=26896): Terminated: ok {rc=0}
17:24:31.333666 ---      ParallelTerminationStrategy: PID 26897 died...
17:24:31.390610 run iperf3-srv_10.42.42.10(pid=26897): Terminated {rc=256}
17:24:31.401326 ---      ParallelTerminationStrategy: PID 26900 died...
17:24:31.458315 run pcap-recorder_any(filters='host 10.42.42.2')(pid=26900): Terminated: ok {rc=0}
17:24:31.469055 ---      ParallelTerminationStrategy: PID 26903 died...
17:24:31.525906 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26903): Terminated: ok {rc=0}
17:24:31.536562 ---      ParallelTerminationStrategy: PID 26906 died...
17:24:31.593157 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26906): Terminated: ok {rc=0}
17:24:31.683029 bus                          /gobi_4: Setting Powered False
17:24:32.784871 bus                          /gobi_4: Setting Powered False
17:24:33.896308 bus                          /gobi_1: Setting Powered False
17:24:34.998954 bus                          /gobi_1: Setting Powered False
17:24:36.109531 bus                          /gobi_0: Setting Powered False
17:24:37.212469 bus                          /gobi_0: Setting Powered False
17:24:38.324197 bus                          /gobi_6: Setting Powered False

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=26904): Process ended prematurely: osmo-stp_10.42.42.5(pid=26904) [trial-2362↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26904)]
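
This traceback is the framework surfacing the failure rather than the failure itself: the test was blocked in util.change_elf_rpath() waiting for the patchelf run on osmo-ggsn, and while the event loop polled, testenv.poll() noticed that the already-launched osmo-stp had exited and raised log_module.Error. The underlying pattern is a watchdog that checks every launched process on each poll; a minimal illustrative sketch (names are hypothetical, not the osmo-gsm-tester API):

    import subprocess

    class ProcessEndedPrematurely(Exception):
        pass

    def poll_all(procs: dict):
        """Raise if any launched process exited before the test finished."""
        for name, proc in procs.items():
            if proc.poll() is not None:  # returncode set => process has exited
                raise ProcessEndedPrematurely(
                    'Process ended prematurely: %s (rc=%s)' % (name, proc.returncode))

    # procs would map process names to subprocess.Popen handles, e.g.:
    # procs = {'osmo-stp_10.42.42.5': subprocess.Popen([...])}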