Failed

gprs:nanobts+band-900.iperf3m4.py (from gprs_nanobts+band-900)

Failing for the past 210 builds (Since #2161)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=5853): Process ended prematurely: osmo-stp_10.42.42.5(pid=5853) [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5853)]

Standard Output

----------------------------------------------
trial-2370 gprs:nanobts+band-900 iperf3m4.py
----------------------------------------------
06:37:11.020417 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.032027 tst            gprs:nanobts+band-900: Using 1 x bts (candidates: 1)
06:37:11.175068 tst                      iperf3m4.py: using LAC 26535
06:37:11.308216 tst                      iperf3m4.py: using RAC 15
06:37:11.444004 tst                      iperf3m4.py: using CellId 26535
06:37:11.568433 tst                      iperf3m4.py: using BVCI 26536
06:37:11.592691 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.603899 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.614868 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.626409 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.637861 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.649211 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.660727 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.672130 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
06:37:11.683755 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
06:37:11.802624 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
06:37:11.921766 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
06:37:12.039731 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
06:37:12.326626 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
06:37:12.459190 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
06:37:12.787100 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5839): Launched
06:37:13.059118 run iperf3-srv_10.42.42.10(pid=5840): Launched
06:37:13.252322 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
06:37:13.384320 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
06:37:13.712606 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5841): Launched
06:37:13.985772 run iperf3-srv_10.42.42.10(pid=5842): Launched
06:37:14.178901 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
06:37:14.310542 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
06:37:14.639599 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5843): Launched
06:37:14.912384 run iperf3-srv_10.42.42.10(pid=5844): Launched
06:37:15.106601 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
06:37:15.238587 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
06:37:15.567761 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5845): Launched
06:37:15.844517 run iperf3-srv_10.42.42.10(pid=5846): Launched
06:37:16.078756 tst                    iperf3m4.py:8: start network...
06:37:16.207655 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
06:37:16.767557 run          create_hlr_db(pid=5847): Launched
06:37:16.942875 bus                          /gobi_6: Setting Powered False
06:37:17.959960 run          create_hlr_db(pid=5847): Terminated: ok {rc=0}
06:37:18.356042 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5849): Launched
06:37:18.653119 run    osmo-hlr_10.42.42.2(pid=5850): Launched
06:37:18.766432 run              osmo-stp_10.42.42.5: Starting osmo-stp
06:37:19.289085 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5852): Launched
06:37:19.585346 run    osmo-stp_10.42.42.5(pid=5853): Launched
06:37:19.693253 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
06:37:20.208288 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5855): Launched
06:37:20.313562 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
06:37:20.572294 run               patchelf(pid=5856): Launched
06:37:20.745799 run    osmo-stp_10.42.42.5(pid=5853): ERR: Terminated: ERROR {rc=1}  [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5853)]
06:37:20.872313 run    osmo-stp_10.42.42.5(pid=5853): stdout: 
| (launched: 2022-12-10_06:37:19.438898)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
06:37:20.999280 run    osmo-stp_10.42.42.5(pid=5853): stderr: 
| 20221210063719608 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210063719609 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210063719609 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210063719609 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210063719610 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210063719610 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210063719610 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
06:37:21.121689 run    osmo-stp_10.42.42.5(pid=5853): stdout: 
| (launched: 2022-12-10_06:37:19.438898)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
06:37:21.243378 run    osmo-stp_10.42.42.5(pid=5853): stderr: 
| 20221210063719608 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210063719609 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210063719609 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210063719609 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210063719610 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210063719610 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210063719610 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
06:37:21.384662 run               patchelf(pid=5856): Terminating (SIGINT)
06:37:21.551013 run               patchelf(pid=5856): Terminated: ok {rc=0}
06:37:21.581880 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=5853): Process ended prematurely: osmo-stp_10.42.42.5(pid=5853) [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5853)]  [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py:8]
06:37:21.591368 tst                    iperf3m4.py:8: Test FAILED (10.6 sec)
06:37:21.651767 run    osmo-hlr_10.42.42.2(pid=5850): ERR: Terminated: ERROR {rc=237}  [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=5850)]
06:37:21.700489 run    osmo-hlr_10.42.42.2(pid=5850): stdout: 
| (launched: 2022-12-10_06:37:18.506093) 
06:37:21.749798 run    osmo-hlr_10.42.42.2(pid=5850): stderr: 
| 20221210063718679 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221210063718679 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221210063718679 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221210063718679 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221210063718679 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221210063718680 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221210063718692 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2370/run.2022-12-10_05-27-43/gprs:nanobts+band-900/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221210063718703 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221210063718703 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221210063718703 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
06:37:21.791308 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5839): Terminating (SIGTERM)
06:37:21.830831 run iperf3-srv_10.42.42.10(pid=5840): Terminating (SIGTERM)
06:37:21.870535 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5841): Terminating (SIGTERM)
06:37:21.909708 run iperf3-srv_10.42.42.10(pid=5842): Terminating (SIGTERM)
06:37:21.949341 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5843): Terminating (SIGTERM)
06:37:21.988668 run iperf3-srv_10.42.42.10(pid=5844): Terminating (SIGTERM)
06:37:22.028400 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5845): Terminating (SIGTERM)
06:37:22.067815 run iperf3-srv_10.42.42.10(pid=5846): Terminating (SIGTERM)
06:37:22.107684 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5849): Terminating (SIGTERM)
06:37:22.147442 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5852): Terminating (SIGTERM)
06:37:22.186858 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5855): Terminating (SIGTERM)
06:37:22.196424 ---      ParallelTerminationStrategy: PID 5839 died...
06:37:22.249759 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5839): Terminated: ok {rc=0}
06:37:22.259297 ---      ParallelTerminationStrategy: PID 5840 died...
06:37:22.311471 run iperf3-srv_10.42.42.10(pid=5840): Terminated {rc=256}
06:37:22.321282 ---      ParallelTerminationStrategy: PID 5841 died...
06:37:22.373865 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5841): Terminated: ok {rc=0}
06:37:22.383399 ---      ParallelTerminationStrategy: PID 5842 died...
06:37:22.435664 run iperf3-srv_10.42.42.10(pid=5842): Terminated {rc=256}
06:37:22.445457 ---      ParallelTerminationStrategy: PID 5843 died...
06:37:22.497743 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5843): Terminated: ok {rc=0}
06:37:22.507475 ---      ParallelTerminationStrategy: PID 5844 died...
06:37:22.559749 run iperf3-srv_10.42.42.10(pid=5844): Terminated {rc=256}
06:37:22.569591 ---      ParallelTerminationStrategy: PID 5845 died...
06:37:22.621964 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=5845): Terminated: ok {rc=0}
06:37:22.631750 ---      ParallelTerminationStrategy: PID 5846 died...
06:37:22.684177 run iperf3-srv_10.42.42.10(pid=5846): Terminated {rc=256}
06:37:22.693815 ---      ParallelTerminationStrategy: PID 5849 died...
06:37:22.746379 run pcap-recorder_any(filters='host 10.42.42.2')(pid=5849): Terminated: ok {rc=0}
06:37:22.756141 ---      ParallelTerminationStrategy: PID 5852 died...
06:37:22.808655 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=5852): Terminated: ok {rc=0}
06:37:22.818506 ---      ParallelTerminationStrategy: PID 5855 died...
06:37:22.871096 run pcap-recorder_any(filters='host 10.42.42.6')(pid=5855): Terminated: ok {rc=0}
06:37:22.953843 bus                          /gobi_4: Setting Powered False
06:37:24.061701 bus                          /gobi_4: Setting Powered False
06:37:25.168325 bus                          /gobi_1: Setting Powered False
06:37:26.273585 bus                          /gobi_1: Setting Powered False
06:37:27.379354 bus                          /gobi_0: Setting Powered False
06:37:28.484702 bus                          /gobi_0: Setting Powered False
06:37:29.592251 bus                          /gobi_6: Setting Powered False
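
Note: both osmo-stp and osmo-hlr died at startup for the same reason, visible in their stderr dumps above: "Address already in use" when binding 10.42.42.5:2905 (M3UA), 10.42.42.5:4239 (VTY) and 10.42.42.2:4258 (VTY). That is, a process left over from an earlier run still holds the ports. A minimal sketch of how to confirm this from the test host, using only the address and port taken from the log (illustrative, not part of osmo-gsm-tester):

    import socket

    # Try to bind the VTY port osmo-stp failed on. If a stale process
    # still owns it, bind() raises OSError with EADDRINUSE, matching
    # the 'unable to bind socket' lines in the stderr dump above.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(('10.42.42.5', 4239))
        print('port is free')
    except OSError as e:
        print('bind failed:', e)  # e.g. [Errno 98] Address already in use
    finally:
        s.close()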

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=5853): Process ended prematurely: osmo-stp_10.42.42.5(pid=5853) [trial-2370↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=5853)]
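
The traceback itself is a secondary effect: the test was still bringing up osmo-ggsn (change_elf_rpath launching patchelf) when the event loop noticed that osmo-stp had already exited, so testenv.poll() raised "Process ended prematurely". Given the bind failures above, one plausible mitigation for this class of flaky failure is to wait for the old sockets to be released before launching the next run; a hypothetical helper along those lines (not part of osmo-gsm-tester) might look like:

    import socket
    import time

    def wait_port_free(host, port, timeout=10.0):
        # Poll until a TCP bind on (host, port) succeeds, or give up
        # after 'timeout' seconds.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                try:
                    s.bind((host, port))
                    return True
                except OSError:
                    time.sleep(0.5)
        return False

    # e.g. wait_port_free('10.42.42.5', 4239) before starting osmo-stp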