Failed

gprs:nanobts+band-900.iperf3m4.py (from gprs_nanobts+band-900)

Failing for the past 199 builds (Since #2161)
Took 12 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=5443): Process ended prematurely: osmo-stp_10.42.42.5(pid=5443) [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5443)]

Standard Output

----------------------------------------------
trial-2359 gprs:nanobts+band-900 iperf3m4.py
----------------------------------------------
12:28:30.500571 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:30.512086 tst            gprs:nanobts+band-900: Using 1 x bts (candidates: 1)
12:28:30.649882 tst                      iperf3m4.py: using LAC 21464
12:28:30.774079 tst                      iperf3m4.py: using RAC 44
12:28:30.910551 tst                      iperf3m4.py: using CellId 21464
12:28:31.047073 tst                      iperf3m4.py: using BVCI 21465
12:28:31.074901 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.087376 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.100608 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.113510 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.126621 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.140578 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.152779 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.165645 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
12:28:31.179607 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
12:28:31.307669 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
12:28:31.446162 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
12:28:31.565502 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
12:28:31.908282 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
12:28:32.039464 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
12:28:32.383093 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5429): Launched
12:28:32.671006 run iperf3-srv_10.42.42.10(pid=5430): Launched
12:28:32.873752 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
12:28:33.011803 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
12:28:33.357887 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5431): Launched
12:28:33.646940 run iperf3-srv_10.42.42.10(pid=5432): Launched
12:28:33.850177 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
12:28:33.988280 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
12:28:34.334194 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5433): Launched
12:28:34.621729 run iperf3-srv_10.42.42.10(pid=5434): Launched
12:28:34.824277 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
12:28:34.962096 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
12:28:35.307137 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5435): Launched
12:28:35.595533 run iperf3-srv_10.42.42.10(pid=5436): Launched
12:28:35.798386 tst                    iperf3m4.py:8: start network...
12:28:35.937026 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
12:28:36.459930 run          create_hlr_db(pid=5437): Launched
12:28:36.652406 bus                          /gobi_6: Setting Powered False
12:28:37.673064 run          create_hlr_db(pid=5437): Terminated: ok {rc=0}
12:28:38.117489 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5439): Launched
12:28:38.453262 run    osmo-hlr_10.42.42.2(pid=5440): Launched
12:28:38.574626 run              osmo-stp_10.42.42.5: Starting osmo-stp
12:28:39.159161 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5442): Launched
12:28:39.493328 run    osmo-stp_10.42.42.5(pid=5443): Launched
12:28:39.614158 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
12:28:40.193936 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5445): Launched
12:28:40.312050 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
12:28:40.602826 run               patchelf(pid=5446): Launched
12:28:40.796800 run    osmo-stp_10.42.42.5(pid=5443): ERR: Terminated: ERROR {rc=1}  [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5443)]
12:28:40.938344 run    osmo-stp_10.42.42.5(pid=5443): stdout: 
| (launched: 2022-12-09_12:28:39.328019)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
12:28:41.079738 run    osmo-stp_10.42.42.5(pid=5443): stderr: 
| 20221209122839517 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209122839517 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209122839518 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209122839518 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209122839519 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209122839519 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209122839519 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
12:28:41.216895 run    osmo-stp_10.42.42.5(pid=5443): stdout: 
| (launched: 2022-12-09_12:28:39.328019)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
12:28:41.352640 run    osmo-stp_10.42.42.5(pid=5443): stderr: 
| 20221209122839517 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209122839517 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209122839518 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209122839518 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209122839519 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209122839519 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209122839519 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
12:28:41.512123 run               patchelf(pid=5446): Terminating (SIGINT)
12:28:41.700858 run               patchelf(pid=5446): Terminated: ok {rc=0}
12:28:41.735215 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=5443): Process ended prematurely: osmo-stp_10.42.42.5(pid=5443) [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5443)]  [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py:8]
12:28:41.745213 tst                    iperf3m4.py:8: Test FAILED (11.3 sec)
12:28:41.809920 run    osmo-hlr_10.42.42.2(pid=5440): ERR: Terminated: ERROR {rc=237}  [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=5440)]
12:28:41.861819 run    osmo-hlr_10.42.42.2(pid=5440): stdout: 
| (launched: 2022-12-09_12:28:38.287595) 
12:28:41.914984 run    osmo-hlr_10.42.42.2(pid=5440): stderr: 
| 20221209122838483 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221209122838483 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221209122838483 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221209122838483 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221209122838483 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221209122838483 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221209122838495 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2359/run.2022-12-09_11-17-44/gprs:nanobts+band-900/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221209122838504 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221209122838504 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221209122838504 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
12:28:41.959478 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5429): Terminating (SIGTERM)
12:28:42.002033 run iperf3-srv_10.42.42.10(pid=5430): Terminating (SIGTERM)
12:28:42.044437 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5431): Terminating (SIGTERM)
12:28:42.086566 run iperf3-srv_10.42.42.10(pid=5432): Terminating (SIGTERM)
12:28:42.128632 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5433): Terminating (SIGTERM)
12:28:42.170432 run iperf3-srv_10.42.42.10(pid=5434): Terminating (SIGTERM)
12:28:42.212383 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5435): Terminating (SIGTERM)
12:28:42.254372 run iperf3-srv_10.42.42.10(pid=5436): Terminating (SIGTERM)
12:28:42.296734 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5439): Terminating (SIGTERM)
12:28:42.338723 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5442): Terminating (SIGTERM)
12:28:42.381120 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5445): Terminating (SIGTERM)
12:28:42.390978 ---      ParallelTerminationStrategy: PID 5429 died...
12:28:42.447773 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5429): Terminated: ok {rc=0}
12:28:42.458047 ---      ParallelTerminationStrategy: PID 5430 died...
12:28:42.514578 run iperf3-srv_10.42.42.10(pid=5430): Terminated {rc=256}
12:28:42.524674 ---      ParallelTerminationStrategy: PID 5431 died...
12:28:42.581044 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5431): Terminated: ok {rc=0}
12:28:42.591210 ---      ParallelTerminationStrategy: PID 5432 died...
12:28:42.647068 run iperf3-srv_10.42.42.10(pid=5432): Terminated {rc=256}
12:28:42.657493 ---      ParallelTerminationStrategy: PID 5433 died...
12:28:42.713689 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5433): Terminated: ok {rc=0}
12:28:42.723827 ---      ParallelTerminationStrategy: PID 5434 died...
12:28:42.779619 run iperf3-srv_10.42.42.10(pid=5434): Terminated {rc=256}
12:28:42.790036 ---      ParallelTerminationStrategy: PID 5435 died...
12:28:42.846678 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5435): Terminated: ok {rc=0}
12:28:42.856832 ---      ParallelTerminationStrategy: PID 5436 died...
12:28:42.912918 run iperf3-srv_10.42.42.10(pid=5436): Terminated {rc=256}
12:28:42.923162 ---      ParallelTerminationStrategy: PID 5439 died...
12:28:42.978981 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5439): Terminated: ok {rc=0}
12:28:42.989434 ---      ParallelTerminationStrategy: PID 5442 died...
12:28:43.045644 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5442): Terminated: ok {rc=0}
12:28:43.055810 ---      ParallelTerminationStrategy: PID 5445 died...
12:28:43.112267 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5445): Terminated: ok {rc=0}
12:28:43.199822 bus                          /gobi_4: Setting Powered False
12:28:44.303699 bus                          /gobi_4: Setting Powered False
12:28:45.415876 bus                          /gobi_1: Setting Powered False
12:28:46.530869 bus                          /gobi_1: Setting Powered False
12:28:47.639837 bus                          /gobi_0: Setting Powered False
12:28:48.747389 bus                          /gobi_0: Setting Powered False
12:28:49.856825 bus                          /gobi_6: Setting Powered False
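
Both osmo-stp and osmo-hlr died at startup for the same reason: their well-known ports (M3UA 2905 and the VTY telnet ports 4239/4258) were still held by something left over from an earlier run, so every bind failed with EADDRINUSE. A minimal diagnostic sketch, assuming you are on the tester host and using the addresses from the log above; port_is_free is a hypothetical helper, not part of osmo-gsm-tester:

    import socket

    def port_is_free(host, port):
        """Return True if we can bind host:port, i.e. nothing stale holds it."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
                return True
            except OSError:
                # EADDRINUSE, or the address is not configured on this host
                return False

    # Ports that failed to bind in this trial (from the stderr dumps above).
    for host, port in [('10.42.42.5', 2905),    # osmo-stp M3UA
                       ('10.42.42.5', 4239),    # osmo-stp VTY/telnet
                       ('10.42.42.2', 4258)]:   # osmo-hlr VTY/telnet
        print(host, port, 'free' if port_is_free(host, port) else 'IN USE')

Running this between trials (or an equivalent `ss -tlnp` check) would distinguish a stale process from the preceding test run, which would explain 199 consecutive failures, from a transient race during teardown.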

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=5443): Process ended prematurely: osmo-stp_10.42.42.5(pid=5443) [trial-2359↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5443)]
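
The traceback shows the failure surfacing indirectly: the test was blocked in launch_sync() waiting for patchelf (the osmo-ggsn RPATH step), while the event loop's poll handler (testenv.py:142) noticed that osmo-stp had already exited and raised. A simplified sketch of that watchdog pattern, as an illustration only and not the actual osmo_gsm_tester implementation:

    import subprocess
    import time

    class ProcessWatchdog:
        def __init__(self):
            self.procs = []          # (name, Popen) pairs expected to keep running

        def launch(self, name, argv):
            proc = subprocess.Popen(argv)
            self.procs.append((name, proc))
            return proc

        def poll(self):
            # Called from the event loop while the test waits on something else.
            for name, proc in self.procs:
                if proc.poll() is not None:   # exited on its own
                    raise RuntimeError('Process ended prematurely: %s (rc=%d)'
                                       % (name, proc.returncode))

    watchdog = ProcessWatchdog()
    watchdog.launch('osmo-stp_10.42.42.5', ['false'])  # stand-in that exits rc=1
    time.sleep(0.1)
    watchdog.poll()  # raises: Process ended prematurely: osmo-stp_10.42.42.5 (rc=1)

This is why the error names osmo-stp even though the call stack is inside ggsn.start(): the raise happens in a shared poll step, not in the code path that launched the dead process.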