Failed

gprs:nanobts+band-900.iperf3m4.py (from gprs_nanobts+band-900)

Failing for the past 219 builds (Since #2161)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=553): Process ended prematurely: osmo-stp_10.42.42.5(pid=553) [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=553)]

Standard Output

----------------------------------------------
trial-2379 gprs:nanobts+band-900 iperf3m4.py
----------------------------------------------
17:16:45.464304 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:45.475557 tst            gprs:nanobts+band-900: Using 1 x bts (candidates: 1)
17:16:45.611638 tst                      iperf3m4.py: using LAC 30337
17:16:45.730630 tst                      iperf3m4.py: using RAC 247
17:16:45.854529 tst                      iperf3m4.py: using CellId 30337
17:16:45.984746 tst                      iperf3m4.py: using BVCI 30338
17:16:46.008781 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.020604 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.032294 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.044507 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.057365 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.068878 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.081474 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.093985 tst            gprs:nanobts+band-900: Using 1 x ip_address (candidates: 9)
17:16:46.106058 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
17:16:46.223242 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
17:16:46.365659 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
17:16:46.482277 tst            gprs:nanobts+band-900: Using 1 x modem (candidates: 4)
17:16:46.779849 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
17:16:46.917963 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:16:47.269839 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=539): Launched
17:16:47.547056 run  iperf3-srv_10.42.42.10(pid=540): Launched
17:16:47.743253 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
17:16:47.877487 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:16:48.210821 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=541): Launched
17:16:48.489917 run  iperf3-srv_10.42.42.10(pid=542): Launched
17:16:48.685163 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
17:16:48.818826 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:16:49.151369 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=543): Launched
17:16:49.427121 run  iperf3-srv_10.42.42.10(pid=544): Launched
17:16:49.622857 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
17:16:49.756116 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
17:16:50.088544 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=545): Launched
17:16:50.364416 run  iperf3-srv_10.42.42.10(pid=546): Launched
17:16:50.560507 tst                    iperf3m4.py:8: start network...
17:16:50.694564 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
17:16:51.207577 run           create_hlr_db(pid=547): Launched
17:16:51.379582 bus                          /gobi_6: Setting Powered False
17:16:52.401235 run           create_hlr_db(pid=547): Terminated: ok {rc=0}
17:16:52.827802 run pcap-recorder_any(filters='host 10.42.42.2')(pid=549): Launched
17:16:53.150782 run     osmo-hlr_10.42.42.2(pid=550): Launched
17:16:53.268434 run              osmo-stp_10.42.42.5: Starting osmo-stp
17:16:53.827204 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=552): Launched
17:16:54.152277 run     osmo-stp_10.42.42.5(pid=553): Launched
17:16:54.268788 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
17:16:54.828784 run pcap-recorder_any(filters='host 10.42.42.6')(pid=555): Launched
17:16:54.943865 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
17:16:55.223506 run                patchelf(pid=556): Launched
17:16:55.410892 run     osmo-stp_10.42.42.5(pid=553): ERR: Terminated: ERROR {rc=1}  [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=553)]
17:16:55.547676 run     osmo-stp_10.42.42.5(pid=553): stdout: 
| (launched: 2022-12-12_17:16:53.991813)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
17:16:55.685686 run     osmo-stp_10.42.42.5(pid=553): stderr: 
| 20221212171654180 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221212171654180 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221212171654181 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221212171654181 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221212171654181 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221212171654182 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221212171654182 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
17:16:56.104205 run                patchelf(pid=556): Terminating (SIGINT)
17:16:56.283493 run                patchelf(pid=556): Terminated: ok {rc=0}
17:16:56.316365 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=553): Process ended prematurely: osmo-stp_10.42.42.5(pid=553) [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=553)]  [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py:8]
17:16:56.326308 tst                    iperf3m4.py:8: Test FAILED (10.9 sec)
17:16:56.388546 run     osmo-hlr_10.42.42.2(pid=550): ERR: Terminated: ERROR {rc=237}  [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=550)]
17:16:56.439093 run     osmo-hlr_10.42.42.2(pid=550): stdout: 
| (launched: 2022-12-12_17:16:52.991967) 
17:16:56.490471 run     osmo-hlr_10.42.42.2(pid=550): stderr: 
| 20221212171653186 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221212171653186 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221212171653186 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221212171653187 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221212171653187 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221212171653187 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221212171653199 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2379/run.2022-12-12_16-06-15/gprs:nanobts+band-900/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221212171653209 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221212171653209 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221212171653209 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
17:16:56.534119 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=539): Terminating (SIGTERM)
17:16:56.574977 run  iperf3-srv_10.42.42.10(pid=540): Terminating (SIGTERM)
17:16:56.615891 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=541): Terminating (SIGTERM)
17:16:56.656644 run  iperf3-srv_10.42.42.10(pid=542): Terminating (SIGTERM)
17:16:56.698024 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=543): Terminating (SIGTERM)
17:16:56.738907 run  iperf3-srv_10.42.42.10(pid=544): Terminating (SIGTERM)
17:16:56.779840 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=545): Terminating (SIGTERM)
17:16:56.820423 run  iperf3-srv_10.42.42.10(pid=546): Terminating (SIGTERM)
17:16:56.861739 run pcap-recorder_any(filters='host 10.42.42.2')(pid=549): Terminating (SIGTERM)
17:16:56.902806 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=552): Terminating (SIGTERM)
17:16:56.943762 run pcap-recorder_any(filters='host 10.42.42.6')(pid=555): Terminating (SIGTERM)
17:16:56.953479 ---      ParallelTerminationStrategy: PID 539 died...
17:16:57.008863 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=539): Terminated: ok {rc=0}
17:16:57.019003 ---      ParallelTerminationStrategy: PID 540 died...
17:16:57.073555 run  iperf3-srv_10.42.42.10(pid=540): Terminated {rc=256}
17:16:57.083359 ---      ParallelTerminationStrategy: PID 541 died...
17:16:57.137838 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=541): Terminated: ok {rc=0}
17:16:57.147686 ---      ParallelTerminationStrategy: PID 542 died...
17:16:57.202094 run  iperf3-srv_10.42.42.10(pid=542): Terminated {rc=256}
17:16:57.211925 ---      ParallelTerminationStrategy: PID 543 died...
17:16:57.266100 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=543): Terminated: ok {rc=0}
17:16:57.275928 ---      ParallelTerminationStrategy: PID 544 died...
17:16:57.330583 run  iperf3-srv_10.42.42.10(pid=544): Terminated {rc=256}
17:16:57.340435 ---      ParallelTerminationStrategy: PID 545 died...
17:16:57.394676 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=545): Terminated: ok {rc=0}
17:16:57.404511 ---      ParallelTerminationStrategy: PID 546 died...
17:16:57.458816 run  iperf3-srv_10.42.42.10(pid=546): Terminated {rc=256}
17:16:57.468743 ---      ParallelTerminationStrategy: PID 549 died...
17:16:57.523198 run pcap-recorder_any(filters='host 10.42.42.2')(pid=549): Terminated: ok {rc=0}
17:16:57.533166 ---      ParallelTerminationStrategy: PID 552 died...
17:16:57.587431 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=552): Terminated: ok {rc=0}
17:16:57.597431 ---      ParallelTerminationStrategy: PID 555 died...
17:16:57.651572 run pcap-recorder_any(filters='host 10.42.42.6')(pid=555): Terminated: ok {rc=0}
17:16:57.736828 bus                          /gobi_4: Setting Powered False
17:16:58.843592 bus                          /gobi_4: Setting Powered False
17:16:59.952384 bus                          /gobi_1: Setting Powered False
17:17:01.058126 bus                          /gobi_1: Setting Powered False
17:17:02.167905 bus                          /gobi_0: Setting Powered False
17:17:03.269251 bus                          /gobi_0: Setting Powered False
17:17:04.374768 bus                          /gobi_6: Setting Powered False
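
The failure above is not in iperf3 itself: osmo-stp_10.42.42.5(pid=553) exits with rc=1 because both its M3UA port (10.42.42.5:2905) and its VTY telnet port (10.42.42.5:4239) are already bound, and osmo-hlr_10.42.42.2(pid=550) fails the same way on 10.42.42.2:4258, which points to stale processes from an earlier trial still holding the sockets. A minimal pre-flight sketch that tries to bind the same endpoints (the host/port list is copied from this log; the script itself is illustrative and not part of osmo-gsm-tester):

    #!/usr/bin/env python3
    # Illustrative check: try to bind the endpoints that failed in this
    # trial and report which are still busy. Note: osmo-stp's M3UA
    # listener is SCTP, so a TCP bind probe is only an approximation
    # for that port.
    import socket

    ENDPOINTS = [
        ('10.42.42.5', 2905),   # osmo-stp M3UA (SCTP in the real server)
        ('10.42.42.5', 4239),   # osmo-stp VTY telnet
        ('10.42.42.2', 4258),   # osmo-hlr VTY telnet
    ]

    for host, port in ENDPOINTS:
        try:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.bind((host, port))
            print('free: %s:%d' % (host, port))
        except OSError as err:
            print('busy: %s:%d (%s)' % (host, port, err))

If any endpoint reports busy, `ss -tlnp` on the test host shows which leftover process still owns the listener.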

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=553): Process ended prematurely: osmo-stp_10.42.42.5(pid=553) [trial-2379↪gprs:nanobts+band-900↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=553)]
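
The traceback shows where the error surfaces rather than where it originates: the test is blocked in proc.launch_sync() waiting for patchelf (started from ggsn.start()), and while waiting, the event loop's poll handler notices that the earlier-launched osmo-stp process has already exited and raises. Reduced to a minimal sketch (ProcessEndedPrematurely, poll_watched and the watched dict are illustrative names, not the actual osmo_gsm_tester API):

    import subprocess
    import time

    class ProcessEndedPrematurely(Exception):
        pass

    def poll_watched(watched):
        # watched: dict mapping a display name such as
        # 'osmo-stp_10.42.42.5' to a subprocess.Popen handle. Any child
        # that has already exited aborts the run, mirroring the raise in
        # testenv.py seen in the traceback above.
        for name, proc in watched.items():
            if proc.poll() is not None:   # child exited; rc=1 in this trial
                raise ProcessEndedPrematurely('Process ended prematurely: %s' % name)

    # Example: a child that exits immediately is caught on the next poll,
    # even though the test itself is blocked waiting on a different child.
    watched = {'osmo-stp_10.42.42.5': subprocess.Popen(['false'])}
    time.sleep(0.1)
    poll_watched(watched)   # raises ProcessEndedPrematurely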