Failed

gprs:trx-b200+mod-bts0-egprs.iperf3m4.py (from gprs_trx-b200+mod-bts0-egprs)

Failing for the past 235 builds (Since #2139)
Took 12 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=27595): Process ended prematurely: osmo-stp_10.42.42.5(pid=27595) [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=27595)]
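
The stderr captured under Standard Output below shows why the process exited: osmo-stp could not bind its M3UA port 10.42.42.5:2905 nor its VTY telnet port 10.42.42.5:4239, both reported as "Address already in use", which points to a leftover process from an earlier run still holding those sockets. As a minimal, stand-alone sketch (not part of osmo-gsm-tester; only the address and ports are taken from the log), such a condition could be checked on the test host like this:

# Minimal sketch: try to bind the address/port pairs that failed in the log.
# EADDRINUSE means something else still holds the socket. Intended to be run
# on the test host, which owns 10.42.42.5; everything else is illustrative.
import errno
import socket

def port_is_free(addr: str, port: int) -> bool:
    """Bind addr:port once; EADDRINUSE means another process holds it."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((addr, port))
        return True
    except OSError as e:
        if e.errno == errno.EADDRINUSE:
            return False
        raise
    finally:
        s.close()

if __name__ == '__main__':
    for port in (2905, 4239):  # M3UA and VTY telnet ports from the log
        state = 'free' if port_is_free('10.42.42.5', port) else 'already in use'
        print('10.42.42.5:%d is %s' % (port, state))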

Standard Output

----------------------------------------------
trial-2373 gprs:trx-b200+mod-bts0-egprs iperf3m4.py
----------------------------------------------
10:11:51.773898 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:51.785185 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x bts (candidates: 1)
10:11:51.924178 tst                      iperf3m4.py: using LAC 27672
10:11:52.047283 tst                      iperf3m4.py: using RAC 132
10:11:52.164922 tst                      iperf3m4.py: using CellId 27672
10:11:52.283439 tst                      iperf3m4.py: using BVCI 27673
10:11:52.306718 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.317605 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.328164 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.338841 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.349469 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.360084 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.370904 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.381509 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x ip_address (candidates: 9)
10:11:52.392262 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x modem (candidates: 4)
10:11:52.509195 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x modem (candidates: 4)
10:11:52.628688 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x modem (candidates: 4)
10:11:52.747931 tst     gprs:trx-b200+mod-bts0-egprs: Using 1 x modem (candidates: 4)
10:11:53.044499 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
10:11:53.180460 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
10:11:53.517152 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27581): Launched
10:11:53.797713 run iperf3-srv_10.42.42.10(pid=27582): Launched
10:11:53.996500 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
10:11:54.132310 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
10:11:54.467717 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27583): Launched
10:11:54.746454 run iperf3-srv_10.42.42.10(pid=27584): Launched
10:11:54.946037 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
10:11:55.081599 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
10:11:55.416916 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27585): Launched
10:11:55.694815 run iperf3-srv_10.42.42.10(pid=27586): Launched
10:11:55.893483 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
10:11:56.028307 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
10:11:56.364255 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27587): Launched
10:11:56.642232 run iperf3-srv_10.42.42.10(pid=27588): Launched
10:11:56.841006 tst                    iperf3m4.py:8: start network...
10:11:56.975936 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
10:11:57.484879 run         create_hlr_db(pid=27589): Launched
10:11:57.673900 bus                          /gobi_6: Setting Powered False
10:11:58.693341 run         create_hlr_db(pid=27589): Terminated: ok {rc=0}
10:11:59.125772 run pcap-recorder_any(filters='host 10.42.42.2')(pid=27591): Launched
10:11:59.450629 run   osmo-hlr_10.42.42.2(pid=27592): Launched
10:11:59.570952 run              osmo-stp_10.42.42.5: Starting osmo-stp
10:12:00.139268 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=27594): Launched
10:12:00.464203 run   osmo-stp_10.42.42.5(pid=27595): Launched
10:12:00.582365 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
10:12:01.190899 run pcap-recorder_any(filters='host 10.42.42.6')(pid=27597): Launched
10:12:01.310908 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
10:12:01.674796 run              patchelf(pid=27598): Launched
10:12:01.888938 run   osmo-stp_10.42.42.5(pid=27595): ERR: Terminated: ERROR {rc=1}  [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=27595)]
10:12:02.031583 run   osmo-stp_10.42.42.5(pid=27595): stdout: 
| (launched: 2022-12-10_10:12:00.305046)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
10:12:02.174042 run   osmo-stp_10.42.42.5(pid=27595): stderr: 
 | 20221210101200499 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
 | 20221210101200499 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
 | 20221210101200499 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
 | 20221210101200500 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
 | % Unable to bind xUA server to IP(s)
 | 20221210101200500 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
 | 20221210101200500 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
 | 20221210101200500 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
10:12:02.311592 run   osmo-stp_10.42.42.5(pid=27595): stdout: 
| (launched: 2022-12-10_10:12:00.305046)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
10:12:02.444885 run   osmo-stp_10.42.42.5(pid=27595): stderr: 
 | 20221210101200499 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
 | 20221210101200499 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
 | 20221210101200499 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
 | 20221210101200500 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
 | % Unable to bind xUA server to IP(s)
 | 20221210101200500 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
 | 20221210101200500 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
 | 20221210101200500 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
10:12:02.600657 run              patchelf(pid=27598): Terminating (SIGINT)
10:12:02.784789 run              patchelf(pid=27598): Terminated: ok {rc=0}
10:12:02.818479 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=27595): Process ended prematurely: osmo-stp_10.42.42.5(pid=27595) [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=27595)]  [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py:8]
10:12:02.828586 tst                    iperf3m4.py:8: Test FAILED (11.1 sec)
10:12:02.893401 run   osmo-hlr_10.42.42.2(pid=27592): ERR: Terminated: ERROR {rc=237}  [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=27592)]
10:12:02.945613 run   osmo-hlr_10.42.42.2(pid=27592): stdout: 
| (launched: 2022-12-10_10:11:59.291372) 
10:12:02.998568 run   osmo-hlr_10.42.42.2(pid=27592): stderr: 
 | 20221210101159488 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
 | 20221210101159488 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
 | 20221210101159488 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
 | 20221210101159488 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
 | 20221210101159488 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
 | 20221210101159488 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
 | 20221210101159502 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2373/run.2022-12-10_09-48-07/gprs:trx-b200+mod-bts0-egprs/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
 | 20221210101159510 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
 | 20221210101159510 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
 | 20221210101159510 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
10:12:03.043401 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27581): Terminating (SIGTERM)
10:12:03.086275 run iperf3-srv_10.42.42.10(pid=27582): Terminating (SIGTERM)
10:12:03.129361 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27583): Terminating (SIGTERM)
10:12:03.171907 run iperf3-srv_10.42.42.10(pid=27584): Terminating (SIGTERM)
10:12:03.214420 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27585): Terminating (SIGTERM)
10:12:03.257190 run iperf3-srv_10.42.42.10(pid=27586): Terminating (SIGTERM)
10:12:03.299736 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27587): Terminating (SIGTERM)
10:12:03.342503 run iperf3-srv_10.42.42.10(pid=27588): Terminating (SIGTERM)
10:12:03.384932 run pcap-recorder_any(filters='host 10.42.42.2')(pid=27591): Terminating (SIGTERM)
10:12:03.427606 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=27594): Terminating (SIGTERM)
10:12:03.470156 run pcap-recorder_any(filters='host 10.42.42.6')(pid=27597): Terminating (SIGTERM)
10:12:03.479557 ---      ParallelTerminationStrategy: PID 27581 died...
10:12:03.536153 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27581): Terminated: ok {rc=0}
10:12:03.546350 ---      ParallelTerminationStrategy: PID 27582 died...
10:12:03.602895 run iperf3-srv_10.42.42.10(pid=27582): Terminated {rc=256}
10:12:03.613196 ---      ParallelTerminationStrategy: PID 27583 died...
10:12:03.669675 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27583): Terminated: ok {rc=0}
10:12:03.679840 ---      ParallelTerminationStrategy: PID 27584 died...
10:12:03.735892 run iperf3-srv_10.42.42.10(pid=27584): Terminated {rc=256}
10:12:03.746531 ---      ParallelTerminationStrategy: PID 27585 died...
10:12:03.803026 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27585): Terminated: ok {rc=0}
10:12:03.813275 ---      ParallelTerminationStrategy: PID 27586 died...
10:12:03.869687 run iperf3-srv_10.42.42.10(pid=27586): Terminated {rc=256}
10:12:03.879744 ---      ParallelTerminationStrategy: PID 27587 died...
10:12:03.936103 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=27587): Terminated: ok {rc=0}
10:12:03.946212 ---      ParallelTerminationStrategy: PID 27588 died...
10:12:04.002751 run iperf3-srv_10.42.42.10(pid=27588): Terminated {rc=256}
10:12:04.012891 ---      ParallelTerminationStrategy: PID 27591 died...
10:12:04.069333 run pcap-recorder_any(filters='host 10.42.42.2')(pid=27591): Terminated: ok {rc=0}
10:12:04.079750 ---      ParallelTerminationStrategy: PID 27594 died...
10:12:04.136598 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=27594): Terminated: ok {rc=0}
10:12:04.146836 ---      ParallelTerminationStrategy: PID 27597 died...
10:12:04.203527 run pcap-recorder_any(filters='host 10.42.42.6')(pid=27597): Terminated: ok {rc=0}
10:12:04.292326 bus                          /gobi_4: Setting Powered False
10:12:05.397400 bus                          /gobi_4: Setting Powered False
10:12:06.509640 bus                          /gobi_1: Setting Powered False
10:12:07.612700 bus                          /gobi_1: Setting Powered False
10:12:08.723774 bus                          /gobi_0: Setting Powered False
10:12:09.827041 bus                          /gobi_0: Setting Powered False
10:12:10.940033 bus                          /gobi_6: Setting Powered False
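
Note that osmo-hlr runs into the same failure mode in the log above: its VTY port 10.42.42.2:4258 is also already in use, so the likely root cause is stale processes from a previous trial still listening on the test addresses. A hedged sketch, assuming psutil is available on the test host (seeing other users' sockets may require root), of how the processes still holding these addresses could be identified:

# Hedged sketch, not part of the tester: list processes still listening on the
# address/port pairs that failed to bind in this trial (taken from the log).
import psutil

SUSPECT = {('10.42.42.5', 2905), ('10.42.42.5', 4239), ('10.42.42.2', 4258)}

def stale_listeners():
    hits = []
    for conn in psutil.net_connections(kind='tcp'):
        if conn.status != psutil.CONN_LISTEN or conn.pid is None or not conn.laddr:
            continue
        if (conn.laddr.ip, conn.laddr.port) in SUSPECT:
            hits.append((conn.pid, psutil.Process(conn.pid).name(), conn.laddr))
    return hits

if __name__ == '__main__':
    for pid, name, laddr in stale_listeners():
        print('%s:%d is held by %s (pid=%d)' % (laddr.ip, laddr.port, name, pid))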

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=27595): Process ended prematurely: osmo-stp_10.42.42.5(pid=27595) [trial-2373↪gprs:trx-b200+mod-bts0-egprs↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=27595)]
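
The traceback shows the path to the failure: the test is waiting synchronously on patchelf while preparing osmo-ggsn (util.change_elf_rpath -> proc.launch_sync), and during that wait the event loop's poll handler notices that osmo-stp has already terminated and raises. A simplified, stand-alone illustration of that watchdog pattern, not the actual osmo_gsm_tester implementation, looks roughly like this:

# Simplified illustration of the pattern visible in the traceback: while one
# synchronous launch is awaited, all background processes are polled and an
# exception is raised as soon as one of them has exited although it was still
# expected to be running.
import subprocess
import time

class ProcessEndedPrematurely(Exception):
    pass

def launch_sync_with_watchdog(target_cmd, background_procs, timeout=30.0, step=0.1):
    """Run target_cmd to completion, but abort if any background process dies first."""
    target = subprocess.Popen(target_cmd)
    deadline = time.monotonic() + timeout
    while True:
        for proc in background_procs:  # e.g. osmo-stp, osmo-hlr, pcap recorders
            if proc.poll() is not None:  # exited although it should keep running
                target.terminate()
                raise ProcessEndedPrematurely(
                    'pid=%d ended prematurely (rc=%s)' % (proc.pid, proc.returncode))
        if target.poll() is not None:
            return target.returncode
        if time.monotonic() > deadline:
            target.terminate()
            raise TimeoutError('target did not finish within %.1fs' % timeout)
        time.sleep(step)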