Failed

gprs:nanobts+band-900+mod-bts0-egprs.iperf3m4.py (from gprs_nanobts+band-900+mod-bts0-egprs)

Failing for the past 213 builds (Since #2139)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=706): Process ended prematurely: osmo-stp_10.42.42.5(pid=706) [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=706)]
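
The stderr captured below shows the root cause: osmo-stp exits with rc=1 because it cannot bind its M3UA server (SCTP port 2905) or its VTY telnet interface (TCP port 4239) on 10.42.42.5 ("Address already in use"), most likely because a leftover process from an earlier trial still holds those ports. As an illustrative check (not part of osmo-gsm-tester), a small Python probe can tell whether the TCP VTY port is free; the M3UA port is SCTP, so something like `ss -ap | grep 2905` is the more reliable check there.

    import socket

    def tcp_port_free(host, port):
        """Return True if nothing is listening on host:port (TCP)."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            try:
                s.bind((host, port))
                return True
            except OSError:  # EADDRINUSE etc.: a stale process holds the port
                return False

    # 4239 is the osmo-stp VTY port that failed to bind in the log below.
    print('4239 free:', tcp_port_free('10.42.42.5', 4239))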

Standard Output

----------------------------------------------
trial-2351 gprs:nanobts+band-900+mod-bts0-egprs iperf3m4.py
----------------------------------------------
00:22:35.954442 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:35.966436 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x bts (candidates: 1)
00:22:36.116025 tst                      iperf3m4.py: using LAC 17781
00:22:36.237544 tst                      iperf3m4.py: using RAC 186
00:22:36.355434 tst                      iperf3m4.py: using CellId 17781
00:22:36.479068 tst                      iperf3m4.py: using BVCI 17782
00:22:36.502300 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.513137 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.523670 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.534271 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.544669 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.555238 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.566006 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.576403 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
00:22:36.587154 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x modem (candidates: 4)
00:22:36.705552 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x modem (candidates: 4)
00:22:36.822982 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x modem (candidates: 4)
00:22:36.943008 tst gprs:nanobts+band-900+mod-bts0-egprs: Using 1 x modem (candidates: 4)
00:22:37.224898 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
00:22:37.351409 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
00:22:37.666996 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=692): Launched
00:22:37.930207 run  iperf3-srv_10.42.42.10(pid=693): Launched
00:22:38.116250 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
00:22:38.243079 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
00:22:38.558599 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=694): Launched
00:22:38.821943 run  iperf3-srv_10.42.42.10(pid=695): Launched
00:22:39.008012 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
00:22:39.135001 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
00:22:39.451358 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=696): Launched
00:22:39.714472 run  iperf3-srv_10.42.42.10(pid=697): Launched
00:22:39.901376 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
00:22:40.028438 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
00:22:40.345262 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=698): Launched
00:22:40.608700 run  iperf3-srv_10.42.42.10(pid=699): Launched
00:22:40.795526 tst                    iperf3m4.py:8: start network...
00:22:40.922473 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
00:22:41.413112 run           create_hlr_db(pid=700): Launched
00:22:41.588892 bus                          /gobi_6: Setting Powered False
00:22:42.609976 run           create_hlr_db(pid=700): Terminated: ok {rc=0}
00:22:43.042099 run pcap-recorder_any(filters='host 10.42.42.2')(pid=702): Launched
00:22:43.367133 run     osmo-hlr_10.42.42.2(pid=703): Launched
00:22:43.486192 run              osmo-stp_10.42.42.5: Starting osmo-stp
00:22:44.055824 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=705): Launched
00:22:44.382948 run     osmo-stp_10.42.42.5(pid=706): Launched
00:22:44.502169 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
00:22:45.070366 run pcap-recorder_any(filters='host 10.42.42.6')(pid=708): Launched
00:22:45.186451 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
00:22:45.469903 run                patchelf(pid=709): Launched
00:22:45.660591 run     osmo-stp_10.42.42.5(pid=706): ERR: Terminated: ERROR {rc=1}  [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=706)]
00:22:45.799333 run     osmo-stp_10.42.42.5(pid=706): stdout: 
| (launched: 2022-12-09_00:22:44.221994)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
00:22:45.941784 run     osmo-stp_10.42.42.5(pid=706): stderr: 
| 20221209002244416 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209002244416 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209002244417 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209002244417 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209002244418 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209002244418 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209002244418 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
00:22:46.078396 run     osmo-stp_10.42.42.5(pid=706): stdout: 
| (launched: 2022-12-09_00:22:44.221994)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
00:22:46.213818 run     osmo-stp_10.42.42.5(pid=706): stderr: 
| 20221209002244416 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209002244416 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209002244417 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209002244417 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209002244418 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209002244418 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209002244418 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
00:22:46.371842 run                patchelf(pid=709): Terminating (SIGINT)
00:22:46.557726 run                patchelf(pid=709): Terminated: ok {rc=0}
00:22:46.592498 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=706): Process ended prematurely: osmo-stp_10.42.42.5(pid=706) [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=706)]  [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py:8]
00:22:46.603484 tst                    iperf3m4.py:8: Test FAILED (10.7 sec)
00:22:46.671288 run     osmo-hlr_10.42.42.2(pid=703): ERR: Terminated: ERROR {rc=237}  [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=703)]
00:22:46.724845 run     osmo-hlr_10.42.42.2(pid=703): stdout: 
| (launched: 2022-12-09_00:22:43.207558) 
00:22:46.779302 run     osmo-hlr_10.42.42.2(pid=703): stderr: 
| 20221209002243406 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221209002243406 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221209002243407 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221209002243407 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221209002243407 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221209002243407 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221209002243420 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2351/run.2022-12-08_23-11-41/gprs:nanobts+band-900+mod-bts0-egprs/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221209002243428 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221209002243428 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221209002243429 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
00:22:46.825949 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=692): Terminating (SIGTERM)
00:22:46.871853 run  iperf3-srv_10.42.42.10(pid=693): Terminating (SIGTERM)
00:22:46.917376 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=694): Terminating (SIGTERM)
00:22:46.963575 run  iperf3-srv_10.42.42.10(pid=695): Terminating (SIGTERM)
00:22:47.009164 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=696): Terminating (SIGTERM)
00:22:47.054116 run  iperf3-srv_10.42.42.10(pid=697): Terminating (SIGTERM)
00:22:47.101241 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=698): Terminating (SIGTERM)
00:22:47.145845 run  iperf3-srv_10.42.42.10(pid=699): Terminating (SIGTERM)
00:22:47.192609 run pcap-recorder_any(filters='host 10.42.42.2')(pid=702): Terminating (SIGTERM)
00:22:47.235473 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=705): Terminating (SIGTERM)
00:22:47.279383 run pcap-recorder_any(filters='host 10.42.42.6')(pid=708): Terminating (SIGTERM)
00:22:47.290073 ---      ParallelTerminationStrategy: PID 692 died...
00:22:47.345774 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=692): Terminated: ok {rc=0}
00:22:47.355730 ---      ParallelTerminationStrategy: PID 693 died...
00:22:47.408080 run  iperf3-srv_10.42.42.10(pid=693): Terminated {rc=256}
00:22:47.418767 ---      ParallelTerminationStrategy: PID 694 died...
00:22:47.470862 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=694): Terminated: ok {rc=0}
00:22:47.480349 ---      ParallelTerminationStrategy: PID 695 died...
00:22:47.532306 run  iperf3-srv_10.42.42.10(pid=695): Terminated {rc=256}
00:22:47.541808 ---      ParallelTerminationStrategy: PID 696 died...
00:22:47.594540 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=696): Terminated: ok {rc=0}
00:22:47.604221 ---      ParallelTerminationStrategy: PID 697 died...
00:22:47.656262 run  iperf3-srv_10.42.42.10(pid=697): Terminated {rc=256}
00:22:47.665898 ---      ParallelTerminationStrategy: PID 698 died...
00:22:47.718397 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=698): Terminated: ok {rc=0}
00:22:47.728143 ---      ParallelTerminationStrategy: PID 699 died...
00:22:47.780347 run  iperf3-srv_10.42.42.10(pid=699): Terminated {rc=256}
00:22:47.790035 ---      ParallelTerminationStrategy: PID 702 died...
00:22:47.842239 run pcap-recorder_any(filters='host 10.42.42.2')(pid=702): Terminated: ok {rc=0}
00:22:47.851831 ---      ParallelTerminationStrategy: PID 705 died...
00:22:47.903795 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=705): Terminated: ok {rc=0}
00:22:47.913675 ---      ParallelTerminationStrategy: PID 708 died...
00:22:47.966116 run pcap-recorder_any(filters='host 10.42.42.6')(pid=708): Terminated: ok {rc=0}
00:22:48.053719 bus                          /gobi_4: Setting Powered False
00:22:49.156908 bus                          /gobi_4: Setting Powered False
00:22:50.268148 bus                          /gobi_1: Setting Powered False
00:22:51.369631 bus                          /gobi_1: Setting Powered False
00:22:52.480563 bus                          /gobi_0: Setting Powered False
00:22:53.581705 bus                          /gobi_0: Setting Powered False
00:22:54.692465 bus                          /gobi_6: Setting Powered False
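
The teardown above follows a simple pattern: the ParallelTerminationStrategy sends SIGTERM to every remaining child process in parallel, then reaps each one as it dies (the {rc=256} statuses from the iperf3 servers presumably just reflect a non-clean exit after SIGTERM). A minimal sketch of that pattern, with assumed names and not the tester's actual API:

    import signal
    import subprocess

    def terminate_all(procs):
        """SIGTERM every child first, then wait for each to die."""
        for p in procs:                   # "Terminating (SIGTERM)"
            p.send_signal(signal.SIGTERM)
        for p in procs:                   # "PID ... died..."
            rc = p.wait()
            print('pid %d terminated rc=%d' % (p.pid, rc))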

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=706): Process ended prematurely: osmo-stp_10.42.42.5(pid=706) [trial-2351↪gprs:nanobts+band-900+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=706)]
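
Note that the traceback runs through ggsn.start() and its patchelf step: util.change_elf_rpath() launches patchelf synchronously, and while launch_sync() waits, the event loop polls every launched process. That poll finds osmo-stp already dead, so the "Process ended prematurely" error is raised from the ggsn frame even though osmo-stp is the process that failed. A minimal sketch of that watchdog idea (illustrative names, not the osmo-gsm-tester API):

    import subprocess

    class WatchedProc:
        """Launch a child and raise if it exits before we stop it ourselves."""
        def __init__(self, name, argv):
            self.name = name
            self.proc = subprocess.Popen(argv)

        def poll(self):
            rc = self.proc.poll()
            if rc is not None:
                # osmo-stp hit this path with rc=1: its sockets failed to bind.
                raise RuntimeError('Process ended prematurely: %s (rc=%r)'
                                   % (self.name, rc))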