Failed

gprs:oc2g.iperf3m4.py (from gprs_oc2g)

Failing for the past 227 builds (since #2139)
Took 11 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=20990): Process ended prematurely: osmo-stp_10.42.42.5(pid=20990) [trial-2365↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=20990)]
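
The stderr excerpts under "Standard Output" below show why osmo-stp exits with rc=1 right after launch: it cannot bind its M3UA server to 10.42.42.5:2905 nor its telnet VTY to 10.42.42.5:4239, both failing with "Address already in use" (osmo-hlr hits the same problem on 10.42.42.2:4258). This points at a stale osmo-stp/osmo-hlr instance, or some other process, still holding those addresses from an earlier run. As a minimal, hypothetical pre-flight check (not part of osmo-gsm-tester), one could verify that the TCP VTY ports are free before launching; note that the M3UA port uses SCTP, so a plain TCP bind test does not cover it:

    import socket

    def tcp_port_is_free(host, port):
        # Try to bind a TCP socket; OSError (EADDRINUSE) means something already listens there.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
                return True
            except OSError:
                return False

    # Addresses and ports taken from the log below; the helper itself is illustrative only.
    for host, port in (('10.42.42.5', 4239), ('10.42.42.2', 4258)):
        if not tcp_port_is_free(host, port):
            print('%s:%d already in use (stale process from a previous run?)' % (host, port))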

Standard Output

----------------------------------------------
trial-2365 gprs:oc2g iperf3m4.py
----------------------------------------------
21:58:33.556235 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:33.567852 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
21:58:33.701026 tst                      iperf3m4.py: using LAC 23933
21:58:33.831240 tst                      iperf3m4.py: using RAC 218
21:58:33.959160 tst                      iperf3m4.py: using CellId 23933
21:58:34.087269 tst                      iperf3m4.py: using BVCI 23934
21:58:34.112488 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.123559 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.134539 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.145561 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.156222 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.167153 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.178126 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.188835 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
21:58:34.199931 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
21:58:34.316736 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
21:58:34.439041 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
21:58:34.557282 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
21:58:34.833870 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
21:58:34.961052 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
21:58:35.279002 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20976): Launched
21:58:35.543208 run iperf3-srv_10.42.42.10(pid=20977): Launched
21:58:35.729823 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
21:58:35.856853 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
21:58:36.173336 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20978): Launched
21:58:36.435832 run iperf3-srv_10.42.42.10(pid=20979): Launched
21:58:36.621724 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
21:58:36.748708 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
21:58:37.066155 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20980): Launched
21:58:37.330214 run iperf3-srv_10.42.42.10(pid=20981): Launched
21:58:37.515452 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
21:58:37.642012 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
21:58:37.957192 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20982): Launched
21:58:38.219654 run iperf3-srv_10.42.42.10(pid=20983): Launched
21:58:38.405352 tst                    iperf3m4.py:8: start network...
21:58:38.531841 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
21:58:39.014229 run         create_hlr_db(pid=20984): Launched
21:58:39.190060 bus                          /gobi_6: Setting Powered False
21:58:40.211968 run         create_hlr_db(pid=20984): Terminated: ok {rc=0}
21:58:40.619265 run pcap-recorder_any(filters='host 10.42.42.2')(pid=20986): Launched
21:58:40.925429 run   osmo-hlr_10.42.42.2(pid=20987): Launched
21:58:41.043562 run              osmo-stp_10.42.42.5: Starting osmo-stp
21:58:41.582090 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=20989): Launched
21:58:41.886848 run   osmo-stp_10.42.42.5(pid=20990): Launched
21:58:41.998126 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
21:58:42.531040 run pcap-recorder_any(filters='host 10.42.42.6')(pid=20992): Launched
21:58:42.639771 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
21:58:42.907070 run              patchelf(pid=20993): Launched
21:58:43.086664 run   osmo-stp_10.42.42.5(pid=20990): ERR: Terminated: ERROR {rc=1}  [trial-2365↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=20990)]
21:58:43.217309 run   osmo-stp_10.42.42.5(pid=20990): stdout: 
| (launched: 2022-12-09_21:58:41.736391)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
21:58:43.348292 run   osmo-stp_10.42.42.5(pid=20990): stderr: 
| 20221209215841909 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209215841909 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209215841909 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209215841910 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209215841910 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209215841910 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209215841910 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
21:58:43.473948 run   osmo-stp_10.42.42.5(pid=20990): stdout: 
| (launched: 2022-12-09_21:58:41.736391)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
21:58:43.598730 run   osmo-stp_10.42.42.5(pid=20990): stderr: 
| 20221209215841909 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221209215841909 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221209215841909 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221209215841910 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221209215841910 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221209215841910 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221209215841910 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use 
21:58:43.745511 run              patchelf(pid=20993): Terminating (SIGINT)
21:58:43.918480 run              patchelf(pid=20993): Terminated: ok {rc=0}
21:58:43.949879 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=20990): Process ended prematurely: osmo-stp_10.42.42.5(pid=20990) [trial-2365↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=20990)]  [trial-2365↪gprs:oc2g↪iperf3m4.py:8]
21:58:43.959008 tst                    iperf3m4.py:8: Test FAILED (10.4 sec)
21:58:44.019664 run   osmo-hlr_10.42.42.2(pid=20987): ERR: Terminated: ERROR {rc=237}  [trial-2365↪gprs:oc2g↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=20987)]
21:58:44.068747 run   osmo-hlr_10.42.42.2(pid=20987): stdout: 
| (launched: 2022-12-09_21:58:40.774596) 
21:58:44.118765 run   osmo-hlr_10.42.42.2(pid=20987): stderr: 
| 20221209215840953 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221209215840953 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221209215840953 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221209215840953 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221209215840953 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221209215840953 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221209215840965 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2365/run.2022-12-09_21-44-27/gprs:oc2g/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221209215840974 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221209215840974 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221209215840974 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
21:58:44.161023 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20976): Terminating (SIGTERM)
21:58:44.200914 run iperf3-srv_10.42.42.10(pid=20977): Terminating (SIGTERM)
21:58:44.240754 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20978): Terminating (SIGTERM)
21:58:44.280637 run iperf3-srv_10.42.42.10(pid=20979): Terminating (SIGTERM)
21:58:44.320440 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20980): Terminating (SIGTERM)
21:58:44.360106 run iperf3-srv_10.42.42.10(pid=20981): Terminating (SIGTERM)
21:58:44.399974 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20982): Terminating (SIGTERM)
21:58:44.440028 run iperf3-srv_10.42.42.10(pid=20983): Terminating (SIGTERM)
21:58:44.479558 run pcap-recorder_any(filters='host 10.42.42.2')(pid=20986): Terminating (SIGTERM)
21:58:44.519453 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=20989): Terminating (SIGTERM)
21:58:44.559230 run pcap-recorder_any(filters='host 10.42.42.6')(pid=20992): Terminating (SIGTERM)
21:58:44.568731 ---      ParallelTerminationStrategy: PID 20976 died...
21:58:44.622317 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20976): Terminated: ok {rc=0}
21:58:44.632308 ---      ParallelTerminationStrategy: PID 20977 died...
21:58:44.685681 run iperf3-srv_10.42.42.10(pid=20977): Terminated {rc=256}
21:58:44.695556 ---      ParallelTerminationStrategy: PID 20978 died...
21:58:44.748374 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20978): Terminated: ok {rc=0}
21:58:44.758172 ---      ParallelTerminationStrategy: PID 20979 died...
21:58:44.811149 run iperf3-srv_10.42.42.10(pid=20979): Terminated {rc=256}
21:58:44.820736 ---      ParallelTerminationStrategy: PID 20980 died...
21:58:44.873663 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20980): Terminated: ok {rc=0}
21:58:44.883321 ---      ParallelTerminationStrategy: PID 20981 died...
21:58:44.935981 run iperf3-srv_10.42.42.10(pid=20981): Terminated {rc=256}
21:58:44.945897 ---      ParallelTerminationStrategy: PID 20982 died...
21:58:44.998508 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=20982): Terminated: ok {rc=0}
21:58:45.008340 ---      ParallelTerminationStrategy: PID 20983 died...
21:58:45.061618 run iperf3-srv_10.42.42.10(pid=20983): Terminated {rc=256}
21:58:45.071489 ---      ParallelTerminationStrategy: PID 20986 died...
21:58:45.124420 run pcap-recorder_any(filters='host 10.42.42.2')(pid=20986): Terminated: ok {rc=0}
21:58:45.134022 ---      ParallelTerminationStrategy: PID 20989 died...
21:58:45.186840 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=20989): Terminated: ok {rc=0}
21:58:45.196642 ---      ParallelTerminationStrategy: PID 20992 died...
21:58:45.249368 run pcap-recorder_any(filters='host 10.42.42.6')(pid=20992): Terminated: ok {rc=0}
21:58:45.332288 bus                          /gobi_4: Setting Powered False
21:58:46.434408 bus                          /gobi_4: Setting Powered False
21:58:47.540558 bus                          /gobi_1: Setting Powered False
21:58:48.647953 bus                          /gobi_1: Setting Powered False
21:58:49.755019 bus                          /gobi_0: Setting Powered False
21:58:50.860629 bus                          /gobi_0: Setting Powered False
21:58:51.972415 bus                          /gobi_6: Setting Powered False

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=20990): Process ended prematurely: osmo-stp_10.42.42.5(pid=20990) [trial-2365↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=20990)]
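
The traceback shows the error is raised from testenv.poll() while the test is still in setup (ggsn.start() is rewriting the binary's RPATH via patchelf); osmo-stp, launched moments earlier, has already died because of the bind failures above, so the event loop reports it as ended prematurely. A hedged way to see what is still holding the conflicting TCP ports on the host is sketched below; it assumes the iproute2 "ss" tool is available and is purely illustrative, not part of the tester (the SCTP port 2905 would need a separate check):

    import subprocess

    # List listening TCP sockets with their owning processes and keep the lines
    # that mention the conflicting VTY ports from the log above.
    out = subprocess.run(['ss', '-ltnp'], capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if ':4239' in line or ':4258' in line:
            print(line)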