Failed

gprs:oc2g.iperf3m4.py (from gprs_oc2g)

Failing for the past 217 builds (since #2139)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=30101): Process ended prematurely: osmo-stp_10.42.42.5(pid=30101) [trial-2355↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=30101)]
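
The "Process ended prematurely" wording is the tester's generic wrapper for any registered process that exits before the test finishes. The actual cause is visible in the stderr capture below: osmo-stp exits with rc=1 because its M3UA address (10.42.42.5:2905) and its telnet VTY address (10.42.42.5:4239) are both already in use, which points to a stale process left over from an earlier run. A minimal pre-flight sketch, assuming the host and ports from this log (this is not part of osmo-gsm-tester; the M3UA listener is SCTP, so a plain TCP bind probe is only approximate for port 2905, but exact for the VTY port):

    import socket

    # Hypothetical pre-flight check: try to bind the addresses osmo-stp
    # needs and report any that are already taken (EADDRINUSE).
    ADDRS = [('10.42.42.5', 2905),   # M3UA (SCTP in osmo-stp; TCP probe is approximate)
             ('10.42.42.5', 4239)]   # telnet VTY (TCP)

    def port_is_free(host, port):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        try:
            s.bind((host, port))
            return True
        except OSError as e:
            print('%s:%d busy: %s' % (host, port, e))
            return False
        finally:
            s.close()

    for host, port in ADDRS:
        print(host, port, 'free' if port_is_free(host, port) else 'IN USE')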

Standard Output

----------------------------------------------
trial-2355 gprs:oc2g iperf3m4.py
----------------------------------------------
05:42:18.886050 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:18.897234 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
05:42:19.019824 tst                      iperf3m4.py: using LAC 19323
05:42:19.139621 tst                      iperf3m4.py: using RAC 198
05:42:19.259190 tst                      iperf3m4.py: using CellId 19323
05:42:19.379772 tst                      iperf3m4.py: using BVCI 19324
05:42:19.402777 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.413381 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.423720 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.434425 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.444516 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.455352 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.466051 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.476491 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
05:42:19.487073 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
05:42:19.608334 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
05:42:19.723844 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
05:42:19.837579 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
05:42:20.114010 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
05:42:20.238078 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:20.548188 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30087): Launched
05:42:20.805478 run iperf3-srv_10.42.42.10(pid=30088): Launched
05:42:20.987445 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
05:42:21.111474 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:21.420260 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30089): Launched
05:42:21.677990 run iperf3-srv_10.42.42.10(pid=30090): Launched
05:42:21.859207 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
05:42:21.983372 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:22.292715 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30091): Launched
05:42:22.549906 run iperf3-srv_10.42.42.10(pid=30092): Launched
05:42:22.731104 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
05:42:22.854706 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
05:42:23.162611 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30093): Launched
05:42:23.419621 run iperf3-srv_10.42.42.10(pid=30094): Launched
05:42:23.601818 tst                    iperf3m4.py:8: start network...
05:42:23.725880 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
05:42:24.196564 run         create_hlr_db(pid=30095): Launched
05:42:24.369224 bus                          /gobi_6: Setting Powered False
05:42:25.391342 run         create_hlr_db(pid=30095): Terminated: ok {rc=0}
05:42:25.817670 run pcap-recorder_any(filters='host 10.42.42.2')(pid=30097): Launched
05:42:26.139455 run   osmo-hlr_10.42.42.2(pid=30098): Launched
05:42:26.259855 run              osmo-stp_10.42.42.5: Starting osmo-stp
05:42:26.819154 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=30100): Launched
05:42:27.140240 run   osmo-stp_10.42.42.5(pid=30101): Launched
05:42:27.256567 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
05:42:27.819487 run pcap-recorder_any(filters='host 10.42.42.6')(pid=30103): Launched
05:42:27.935017 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
05:42:28.216383 run              patchelf(pid=30104): Launched
05:42:28.405781 run   osmo-stp_10.42.42.5(pid=30101): ERR: Terminated: ERROR {rc=1}  [trial-2355↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=30101)]
05:42:28.543816 run   osmo-stp_10.42.42.5(pid=30101): stdout: 
| (launched: 2022-12-09_05:42:26.982723)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
05:42:28.682349 run   osmo-stp_10.42.42.5(pid=30101): stderr: 
| 20221209054227173 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209054227173 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209054227174 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209054227174 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209054227174 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209054227174 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209054227175 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
05:42:28.816621 run   osmo-stp_10.42.42.5(pid=30101): stdout: 
| (launched: 2022-12-09_05:42:26.982723)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
05:42:28.949587 run   osmo-stp_10.42.42.5(pid=30101): stderr: 
| 20221209054227173 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209054227173 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209054227174 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209054227174 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209054227174 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209054227174 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209054227175 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
05:42:29.103617 run              patchelf(pid=30104): Terminating (SIGINT)
05:42:29.284125 run              patchelf(pid=30104): Terminated: ok {rc=0}
05:42:29.317012 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=30101): Process ended prematurely: osmo-stp_10.42.42.5(pid=30101) [trial-2355↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=30101)]  [trial-2355↪gprs:oc2g↪iperf3m4.py:8]
05:42:29.326720 tst                    iperf3m4.py:8: Test FAILED (10.5 sec)
05:42:29.389878 run   osmo-hlr_10.42.42.2(pid=30098): ERR: Terminated: ERROR {rc=237}  [trial-2355↪gprs:oc2g↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=30098)]
05:42:29.441724 run   osmo-hlr_10.42.42.2(pid=30098): stdout: 
| (launched: 2022-12-09_05:42:25.981247) 
05:42:29.494050 run   osmo-hlr_10.42.42.2(pid=30098): stderr: 
| 20221209054226181 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221209054226181 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221209054226181 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221209054226181 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221209054226181 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221209054226181 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221209054226194 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2355/run.2022-12-09_05-28-02/gprs:oc2g/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221209054226202 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221209054226202 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221209054226202 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
05:42:29.538719 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30087): Terminating (SIGTERM)
05:42:29.580578 run iperf3-srv_10.42.42.10(pid=30088): Terminating (SIGTERM)
05:42:29.622565 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30089): Terminating (SIGTERM)
05:42:29.664417 run iperf3-srv_10.42.42.10(pid=30090): Terminating (SIGTERM)
05:42:29.706499 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30091): Terminating (SIGTERM)
05:42:29.748227 run iperf3-srv_10.42.42.10(pid=30092): Terminating (SIGTERM)
05:42:29.790714 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30093): Terminating (SIGTERM)
05:42:29.832481 run iperf3-srv_10.42.42.10(pid=30094): Terminating (SIGTERM)
05:42:29.874378 run pcap-recorder_any(filters='host 10.42.42.2')(pid=30097): Terminating (SIGTERM)
05:42:29.916191 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=30100): Terminating (SIGTERM)
05:42:29.958534 run pcap-recorder_any(filters='host 10.42.42.6')(pid=30103): Terminating (SIGTERM)
05:42:29.968277 ---      ParallelTerminationStrategy: PID 30087 died...
05:42:30.024382 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30087): Terminated: ok {rc=0}
05:42:30.034582 ---      ParallelTerminationStrategy: PID 30088 died...
05:42:30.090596 run iperf3-srv_10.42.42.10(pid=30088): Terminated {rc=256}
05:42:30.100742 ---      ParallelTerminationStrategy: PID 30089 died...
05:42:30.157255 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30089): Terminated: ok {rc=0}
05:42:30.167483 ---      ParallelTerminationStrategy: PID 30090 died...
05:42:30.223443 run iperf3-srv_10.42.42.10(pid=30090): Terminated {rc=256}
05:42:30.233722 ---      ParallelTerminationStrategy: PID 30091 died...
05:42:30.289566 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30091): Terminated: ok {rc=0}
05:42:30.299726 ---      ParallelTerminationStrategy: PID 30092 died...
05:42:30.355770 run iperf3-srv_10.42.42.10(pid=30092): Terminated {rc=256}
05:42:30.365988 ---      ParallelTerminationStrategy: PID 30093 died...
05:42:30.422259 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=30093): Terminated: ok {rc=0}
05:42:30.432430 ---      ParallelTerminationStrategy: PID 30094 died...
05:42:30.488209 run iperf3-srv_10.42.42.10(pid=30094): Terminated {rc=256}
05:42:30.498416 ---      ParallelTerminationStrategy: PID 30097 died...
05:42:30.554214 run pcap-recorder_any(filters='host 10.42.42.2')(pid=30097): Terminated: ok {rc=0}
05:42:30.564360 ---      ParallelTerminationStrategy: PID 30100 died...
05:42:30.619972 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=30100): Terminated: ok {rc=0}
05:42:30.630162 ---      ParallelTerminationStrategy: PID 30103 died...
05:42:30.685805 run pcap-recorder_any(filters='host 10.42.42.6')(pid=30103): Terminated: ok {rc=0}
05:42:30.772629 bus                          /gobi_4: Setting Powered False
05:42:31.896241 bus                          /gobi_4: Setting Powered False
05:42:33.005271 bus                          /gobi_1: Setting Powered False
05:42:34.110888 bus                          /gobi_1: Setting Powered False
05:42:35.222748 bus                          /gobi_0: Setting Powered False
05:42:36.325714 bus                          /gobi_0: Setting Powered False
05:42:37.437783 bus                          /gobi_6: Setting Powered False
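
Note that osmo-hlr hits the same "Address already in use" on its VTY address (10.42.42.2:4258, rc=237) as osmo-stp does on 10.42.42.5, so the conflict is not specific to one binary: something from an earlier trial is still holding the test IPs. A sketch for identifying the holder, assuming psutil is available on the test host (it is not part of the tester, needs enough privileges to see other users' PIDs, and kind='inet' covers TCP/UDP only, so the SCTP M3UA listener on 2905 may not appear):

    import psutil  # assumption: installed on the test host

    # Hypothetical helper: report which processes hold the listening ports
    # that this trial failed to bind.
    SUSPECT_PORTS = {2905, 4239, 4258}

    def listeners(ports):
        for c in psutil.net_connections(kind='inet'):
            if c.laddr and c.laddr.port in ports and c.status == psutil.CONN_LISTEN:
                name = psutil.Process(c.pid).name() if c.pid else '?'
                yield c.laddr.ip, c.laddr.port, c.pid, name

    for ip, port, pid, name in listeners(SUSPECT_PORTS):
        print('%s:%d held by pid=%s (%s)' % (ip, port, pid, name))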

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=30101): Process ended prematurely: osmo-stp_10.42.42.5(pid=30101) [trial-2355↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=30101)]
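
The traceback shows how the failure surfaces: the test was synchronously waiting on a patchelf run for osmo-ggsn when testenv.poll(), called from the event loop, noticed that osmo-stp had already died and raised log.Error. Reduced to its core, the pattern is a watchdog that polls every launched process while waiting for anything else; a minimal sketch with illustrative names (not osmo-gsm-tester's API):

    import subprocess, time

    class ProcessEndedPrematurely(Exception):
        pass

    def wait_checking_procs(condition, procs, timeout=30.0, step=0.1):
        # Wait for condition() to become true, but fail fast if any of the
        # watched subprocess.Popen objects exits in the meantime.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            for p in procs:
                if p.poll() is not None:  # exited before we expected it to
                    raise ProcessEndedPrematurely('pid=%d rc=%d' % (p.pid, p.returncode))
            if condition():
                return
            time.sleep(step)
        raise TimeoutError('condition not met within %.1fs' % timeout)

    # usage (illustrative): stp = subprocess.Popen(['osmo-stp', '-c', 'osmo-stp.cfg'])
    #                       wait_checking_procs(patchelf_done, [stp])
    # where patchelf_done is whatever condition the caller is actually waiting on.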