Failed

gprs:oc2g.iperf3m4.py (from gprs_oc2g)

Failing for the past 244 builds (Since #2139)
Took 12 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=26003): Process ended prematurely: osmo-stp_10.42.42.5(pid=26003) [trial-2382↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26003)]
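
The stderr captured under Standard Output below shows the actual cause: osmo-stp could not bind its M3UA server to 10.42.42.5:2905 nor its VTY telnet interface to 10.42.42.5:4239 ("Address already in use"), so the process exited with rc=1 right after launch and the tester flagged it as ended prematurely. The osmo-hlr stderr shows the same symptom on 10.42.42.2:4258, which points to stale processes from an earlier run still holding the tester's addresses. As a rough illustration (not osmo-gsm-tester code), a probe like the following can confirm whether the VTY port is still held before relaunching; note that the M3UA listener uses SCTP, so a plain TCP bind check only covers the telnet/VTY side:

    import socket

    # Address and port taken from the osmo-stp stderr in this report;
    # the probe itself is an illustrative sketch, not part of the tester.
    ADDR = '10.42.42.5'
    VTY_PORT = 4239  # "Cannot bind telnet at 10.42.42.5 4239" (telnet_interface.c:99)

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((ADDR, VTY_PORT))
        print('%s:%d is free' % (ADDR, VTY_PORT))
    except OSError as exc:
        # EADDRINUSE here reproduces the "Address already in use" from the log
        print('%s:%d is taken: %s' % (ADDR, VTY_PORT, exc))
    finally:
        s.close()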

Standard Output

----------------------------------------------
trial-2382 gprs:oc2g iperf3m4.py
----------------------------------------------
20:50:57.056080 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.067949 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
20:50:57.203175 tst                      iperf3m4.py: using LAC 31423
20:50:57.336640 tst                      iperf3m4.py: using RAC 58
20:50:57.471193 tst                      iperf3m4.py: using CellId 31423
20:50:57.603347 tst                      iperf3m4.py: using BVCI 31424
20:50:57.629307 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.640440 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.651549 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.663704 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.675280 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.686800 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.698604 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.710171 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
20:50:57.721900 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
20:50:57.844443 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
20:50:57.966929 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
20:50:58.085802 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
20:50:58.369336 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5003...
20:50:58.499791 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
20:50:58.823207 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25989): Launched
20:50:59.091416 run iperf3-srv_10.42.42.10(pid=25990): Launched
20:50:59.282702 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5004...
20:50:59.413263 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
20:50:59.734781 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25991): Launched
20:51:00.002883 run iperf3-srv_10.42.42.10(pid=25992): Launched
20:51:00.193220 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5005...
20:51:00.323402 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
20:51:00.645905 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25993): Launched
20:51:00.951750 run iperf3-srv_10.42.42.10(pid=25994): Launched
20:51:01.163495 tst                    iperf3m4.py:8: start iperfv3 server 10.42.42.10:5006...
20:51:01.308155 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
20:51:01.680354 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25995): Launched
20:51:01.989147 run iperf3-srv_10.42.42.10(pid=25996): Launched
20:51:02.199618 tst                    iperf3m4.py:8: start network...
20:51:02.340528 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
20:51:02.876213 run         create_hlr_db(pid=25997): Launched
20:51:03.070081 bus                          /gobi_6: Setting Powered False
20:51:04.092828 run         create_hlr_db(pid=25997): Terminated: ok {rc=0}
20:51:04.537804 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25999): Launched
20:51:04.875437 run   osmo-hlr_10.42.42.2(pid=26000): Launched
20:51:04.998824 run              osmo-stp_10.42.42.5: Starting osmo-stp
20:51:05.580245 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26002): Launched
20:51:05.915459 run   osmo-stp_10.42.42.5(pid=26003): Launched
20:51:06.037493 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
20:51:06.621440 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26005): Launched
20:51:06.741976 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
20:51:07.032501 run              patchelf(pid=26006): Launched
20:51:07.227932 run   osmo-stp_10.42.42.5(pid=26003): ERR: Terminated: ERROR {rc=1}  [trial-2382↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26003)]
20:51:07.370175 run   osmo-stp_10.42.42.5(pid=26003): stdout: 
| (launched: 2022-12-12_20:51:05.751413)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
20:51:07.513092 run   osmo-stp_10.42.42.5(pid=26003): stderr: 
| 20221212205105942 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221212205105943 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221212205105943 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221212205105943 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221212205105944 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221212205105944 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221212205105944 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
20:51:07.948294 run              patchelf(pid=26006): Terminating (SIGINT)
20:51:08.137607 run              patchelf(pid=26006): Terminated: ok {rc=0}
20:51:08.172038 tst                    iperf3m4.py:8: ERR: Error: osmo-stp_10.42.42.5(pid=26003): Process ended prematurely: osmo-stp_10.42.42.5(pid=26003) [trial-2382↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26003)]  [trial-2382↪gprs:oc2g↪iperf3m4.py:8]
20:51:08.181890 tst                    iperf3m4.py:8: Test FAILED (11.2 sec)
20:51:08.246314 run   osmo-hlr_10.42.42.2(pid=26000): ERR: Terminated: ERROR {rc=237}  [trial-2382↪gprs:oc2g↪iperf3m4.py↪iperf3m4.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=26000)]
20:51:08.298863 run   osmo-hlr_10.42.42.2(pid=26000): stdout: 
| (launched: 2022-12-12_20:51:04.709334) 
20:51:08.352059 run   osmo-hlr_10.42.42.2(pid=26000): stderr: 
| 20221212205104905 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221212205104905 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221212205104905 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221212205104905 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221212205104905 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221212205104905 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221212205104916 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2382/run.2022-12-12_20-36-34/gprs:oc2g/iperf3m4.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221212205104925 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221212205104925 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221212205104925 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99) 
20:51:08.397466 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25989): Terminating (SIGTERM)
20:51:08.440381 run iperf3-srv_10.42.42.10(pid=25990): Terminating (SIGTERM)
20:51:08.483271 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25991): Terminating (SIGTERM)
20:51:08.525994 run iperf3-srv_10.42.42.10(pid=25992): Terminating (SIGTERM)
20:51:08.568774 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25993): Terminating (SIGTERM)
20:51:08.611470 run iperf3-srv_10.42.42.10(pid=25994): Terminating (SIGTERM)
20:51:08.654224 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25995): Terminating (SIGTERM)
20:51:08.697524 run iperf3-srv_10.42.42.10(pid=25996): Terminating (SIGTERM)
20:51:08.740345 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25999): Terminating (SIGTERM)
20:51:08.783298 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26002): Terminating (SIGTERM)
20:51:08.826102 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26005): Terminating (SIGTERM)
20:51:08.836086 ---      ParallelTerminationStrategy: PID 25989 died...
20:51:08.893810 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25989): Terminated: ok {rc=0}
20:51:08.904249 ---      ParallelTerminationStrategy: PID 25990 died...
20:51:08.960848 run iperf3-srv_10.42.42.10(pid=25990): Terminated {rc=256}
20:51:08.971137 ---      ParallelTerminationStrategy: PID 25991 died...
20:51:09.027928 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25991): Terminated: ok {rc=0}
20:51:09.038215 ---      ParallelTerminationStrategy: PID 25992 died...
20:51:09.094722 run iperf3-srv_10.42.42.10(pid=25992): Terminated {rc=256}
20:51:09.105156 ---      ParallelTerminationStrategy: PID 25993 died...
20:51:09.162159 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25993): Terminated: ok {rc=0}
20:51:09.172519 ---      ParallelTerminationStrategy: PID 25994 died...
20:51:09.229473 run iperf3-srv_10.42.42.10(pid=25994): Terminated {rc=256}
20:51:09.239717 ---      ParallelTerminationStrategy: PID 25995 died...
20:51:09.296447 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=25995): Terminated: ok {rc=0}
20:51:09.306749 ---      ParallelTerminationStrategy: PID 25996 died...
20:51:09.363319 run iperf3-srv_10.42.42.10(pid=25996): Terminated {rc=256}
20:51:09.373767 ---      ParallelTerminationStrategy: PID 25999 died...
20:51:09.430443 run pcap-recorder_any(filters='host 10.42.42.2')(pid=25999): Terminated: ok {rc=0}
20:51:09.440739 ---      ParallelTerminationStrategy: PID 26002 died...
20:51:09.497316 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=26002): Terminated: ok {rc=0}
20:51:09.507635 ---      ParallelTerminationStrategy: PID 26005 died...
20:51:09.564397 run pcap-recorder_any(filters='host 10.42.42.6')(pid=26005): Terminated: ok {rc=0}
20:51:09.651567 bus                          /gobi_4: Setting Powered False
20:51:10.758766 bus                          /gobi_4: Setting Powered False
20:51:11.867635 bus                          /gobi_1: Setting Powered False
20:51:12.975037 bus                          /gobi_1: Setting Powered False
20:51:14.083997 bus                          /gobi_0: Setting Powered False
20:51:15.191253 bus                          /gobi_0: Setting Powered False
20:51:16.303949 bus                          /gobi_6: Setting Powered False

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/iperf3m4.py", line 8, in <module>
    setup_run_iperf3_test_parallel(4)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=26003): Process ended prematurely: osmo-stp_10.42.42.5(pid=26003) [trial-2382↪gprs:oc2g↪iperf3m4.py:8↪iperf3m4.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=26003)]
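
The traceback shows where the failure surfaced rather than what caused it: the test was starting the GGSN (util.change_elf_rpath launching patchelf), launch_sync blocked in MainLoop.wait, and the event loop's poll noticed that osmo-stp had already exited, so testenv.poll raised log.Error. A minimal sketch of that watchdog pattern, with simplified stand-in names rather than the actual osmo-gsm-tester API:

    import time

    class ProcessEndedPrematurely(Exception):
        pass

    def wait_sync(waiting_on, watched, timeout=30.0, step=0.1):
        """Block until 'waiting_on' finishes, polling 'watched' children.

        Both arguments hold subprocess.Popen-like objects (poll() returns
        None while running, the return code once exited); 'watched' maps a
        display name to each child that must stay alive meanwhile.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            if waiting_on.poll() is not None:
                return waiting_on.returncode
            for name, proc in watched.items():
                # A child exiting while we still expect it to run is fatal,
                # like osmo-stp dying here during the GGSN's patchelf step.
                if proc.poll() is not None:
                    raise ProcessEndedPrematurely(
                        'Process ended prematurely: %s (rc=%d)'
                        % (name, proc.returncode))
            time.sleep(step)
        raise TimeoutError('still running after %.1fs' % timeout)

This is why the error is attributed to osmo-stp_10.42.42.5(pid=26003) even though the frame that raised it sits inside ggsn.start().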