Failed

gprs:oc2g.cs_paging_gprs_active.py (from gprs_oc2g)

Failing for the past 249 builds (since #2143)
Took 10 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=2056): Process ended prematurely: osmo-stp_10.42.42.5(pid=2056) [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py:38↪cs_paging_gprs_active.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=2056)]

Standard Output

----------------------------------------------
trial-2391 gprs:oc2g cs_paging_gprs_active.py
----------------------------------------------
15:51:55.160248 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.171280 tst                        gprs:oc2g: Using 1 x bts (candidates: 1)
15:51:55.305175 tst         cs_paging_gprs_active.py: using LAC 34648
15:51:55.438017 tst         cs_paging_gprs_active.py: using RAC 223
15:51:55.573720 tst         cs_paging_gprs_active.py: using CellId 34648
15:51:55.703433 tst         cs_paging_gprs_active.py: using BVCI 34649
15:51:55.729121 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.740130 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.751069 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.761905 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.773100 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.784160 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.795558 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.806973 tst                        gprs:oc2g: Using 1 x ip_address (candidates: 9)
15:51:55.818455 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
15:51:55.937583 tst                        gprs:oc2g: Using 1 x modem (candidates: 4)
15:51:56.241042 tst      cs_paging_gprs_active.py:38: start iperfv3 server 10.42.42.10:5003...
15:51:56.380763 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
15:51:56.728817 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2046): Launched
15:51:57.018662 run iperf3-srv_10.42.42.10(pid=2047): Launched
15:51:57.225705 tst      cs_paging_gprs_active.py:38: start iperfv3 server 10.42.42.10:5004...
15:51:57.366646 run           iperf3-srv_10.42.42.10: Starting iperf3-srv
15:51:57.715908 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2048): Launched
15:51:58.005562 run iperf3-srv_10.42.42.10(pid=2049): Launched
15:51:58.212519 tst      cs_paging_gprs_active.py:38: start network...
15:51:58.353155 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
15:51:58.881384 run          create_hlr_db(pid=2050): Launched
15:51:59.075666 bus                        /sierra_2: Setting Powered False
15:52:00.096472 run          create_hlr_db(pid=2050): Terminated: ok {rc=0}
15:52:00.517841 run pcap-recorder_any(filters='host 10.42.42.2')(pid=2052): Launched
15:52:00.846424 run    osmo-hlr_10.42.42.2(pid=2053): Launched
15:52:01.059009 run              osmo-stp_10.42.42.5: Starting osmo-stp
15:52:01.673113 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=2055): Launched
15:52:02.171254 run    osmo-stp_10.42.42.5(pid=2056): Launched
15:52:02.293922 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
15:52:02.879094 run pcap-recorder_any(filters='host 10.42.42.6')(pid=2058): Launched
15:52:02.999086 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
15:52:03.295873 run               patchelf(pid=2059): Launched
15:52:03.492575 run    osmo-stp_10.42.42.5(pid=2056): ERR: Terminated: ERROR {rc=1}  [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py:38↪cs_paging_gprs_active.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=2056)]
15:52:03.635847 run    osmo-stp_10.42.42.5(pid=2056): stdout: 
| (launched: 2022-12-13_15:52:01.848754)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
15:52:03.779708 run    osmo-stp_10.42.42.5(pid=2056): stderr: 
| 20221213155202038 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213155202038 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213155202038 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213155202039 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213155202039 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213155202039 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213155202039 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
15:52:03.918222 run    osmo-stp_10.42.42.5(pid=2056): stdout: 
| (launched: 2022-12-13_15:52:01.848754)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
15:52:04.055391 run    osmo-stp_10.42.42.5(pid=2056): stderr: 
| 20221213155202038 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221213155202038 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221213155202038 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221213155202039 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221213155202039 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221213155202039 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221213155202039 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
15:52:04.216204 run               patchelf(pid=2059): Terminating (SIGINT)
15:52:04.405113 run               patchelf(pid=2059): Terminated: ok {rc=0}
15:52:04.439438 tst      cs_paging_gprs_active.py:38: ERR: Error: osmo-stp_10.42.42.5(pid=2056): Process ended prematurely: osmo-stp_10.42.42.5(pid=2056) [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py:38↪cs_paging_gprs_active.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=2056)]  [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py:38]
15:52:04.449528 tst      cs_paging_gprs_active.py:38: Test FAILED (9.3 sec)
15:52:04.513954 run    osmo-hlr_10.42.42.2(pid=2053): ERR: Terminated: ERROR {rc=237}  [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py↪cs_paging_gprs_active.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=2053)]
15:52:04.566479 run    osmo-hlr_10.42.42.2(pid=2053): stdout: 
| (launched: 2022-12-13_15:52:00.676889) 
15:52:04.619364 run    osmo-hlr_10.42.42.2(pid=2053): stderr: 
| 20221213155200862 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221213155200862 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221213155200862 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221213155200862 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221213155200862 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221213155200862 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221213155200883 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2391/run.2022-12-13_15-37-59/gprs:oc2g/cs_paging_gprs_active.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221213155200892 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221213155200892 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221213155200892 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
15:52:04.664833 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2046): Terminating (SIGTERM)
15:52:04.707514 run iperf3-srv_10.42.42.10(pid=2047): Terminating (SIGTERM)
15:52:04.750739 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2048): Terminating (SIGTERM)
15:52:04.793689 run iperf3-srv_10.42.42.10(pid=2049): Terminating (SIGTERM)
15:52:04.836461 run pcap-recorder_any(filters='host 10.42.42.2')(pid=2052): Terminating (SIGTERM)
15:52:04.879332 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=2055): Terminating (SIGTERM)
15:52:04.922646 run pcap-recorder_any(filters='host 10.42.42.6')(pid=2058): Terminating (SIGTERM)
15:52:04.932230 ---      ParallelTerminationStrategy: PID 2046 died...
15:52:04.989671 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2046): Terminated: ok {rc=0}
15:52:05.000146 ---      ParallelTerminationStrategy: PID 2047 died...
15:52:05.057383 run iperf3-srv_10.42.42.10(pid=2047): Terminated {rc=256}
15:52:05.067716 ---      ParallelTerminationStrategy: PID 2048 died...
15:52:05.124822 run pcap-recorder_any(filters='host 10.42.42.10 and port not 22')(pid=2048): Terminated: ok {rc=0}
15:52:05.135122 ---      ParallelTerminationStrategy: PID 2049 died...
15:52:05.191899 run iperf3-srv_10.42.42.10(pid=2049): Terminated {rc=256}
15:52:05.201986 ---      ParallelTerminationStrategy: PID 2052 died...
15:52:05.258884 run pcap-recorder_any(filters='host 10.42.42.2')(pid=2052): Terminated: ok {rc=0}
15:52:05.269145 ---      ParallelTerminationStrategy: PID 2055 died...
15:52:05.326032 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=2055): Terminated: ok {rc=0}
15:52:05.336272 ---      ParallelTerminationStrategy: PID 2058 died...
15:52:05.393168 run pcap-recorder_any(filters='host 10.42.42.6')(pid=2058): Terminated: ok {rc=0}
15:52:05.481601 bus                          /gobi_0: Setting Powered False
15:52:06.588924 bus                          /gobi_0: Setting Powered False
15:52:07.698481 bus                          /gobi_6: Setting Powered False
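
Analysis: both stderr dumps above show the root cause. osmo-stp_10.42.42.5(pid=2056) exits with rc=1 because it can bind neither 10.42.42.5:2905 (its M3UA server) nor 10.42.42.5:4239 (its VTY telnet interface): "Address already in use". osmo-hlr_10.42.42.2(pid=2053) then fails the same way on 10.42.42.2:4258. This pattern usually means a stale process from an earlier trial is still holding the sockets. Below is a minimal pre-flight sketch, not part of osmo-gsm-tester, that probes the three addresses from this log before a trial starts; note that M3UA runs over SCTP, so the probe for port 2905 assumes kernel SCTP support and reports the port as busy if the probe socket cannot be created.

    import socket

    # (address, port, protocol) triples taken from the failure log above.
    CHECKS = [
        ('10.42.42.5', 2905, socket.IPPROTO_SCTP),  # osmo-stp M3UA server (SCTP)
        ('10.42.42.5', 4239, socket.IPPROTO_TCP),   # osmo-stp VTY telnet
        ('10.42.42.2', 4258, socket.IPPROTO_TCP),   # osmo-hlr VTY telnet
    ]

    def port_free(addr, port, proto):
        """True if (addr, port) can be bound, i.e. no stale process holds it."""
        try:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM, proto) as s:
                s.bind((addr, port))
                return True
        except OSError:  # EADDRINUSE, or no SCTP support in the kernel
            return False

    stale = [(a, p) for (a, p, proto) in CHECKS if not port_free(a, p, proto)]
    if stale:
        raise SystemExit('refusing to start, addresses still in use: %r' % stale)

The probe has to run on the tester host itself, since binding the 10.42.42.x addresses requires them to be configured locally.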

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/cs_paging_gprs_active.py", line 38, in <module>
    setup_run_iperf3_test_parallel(2, ready_cb=ready_cb_place_voicecall)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/lib/testlib.py", line 67, in setup_run_iperf3_test_parallel
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=2056): Process ended prematurely: osmo-stp_10.42.42.5(pid=2056) [trial-2391↪gprs:oc2g↪cs_paging_gprs_active.py:38↪cs_paging_gprs_active.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=2056)]
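
The traceback explains why the failure surfaces in ggsn_osmo.py's patchelf step even though osmo-stp is the process that died: launch_sync() for patchelf blocks in MainLoop.wait(), and each poll() of the event loop also drains the deferred-handling queue through testenv.poll(), which raises "Process ended prematurely" as soon as any launched process has exited unexpectedly. A condensed, self-contained sketch of that watchdog pattern follows; the names Proc and poll_all and the timings are illustrative, not the tester's actual API.

    import time

    class Error(Exception):
        """Stands in for osmo_gsm_tester.core.log.Error."""

    class Proc:
        """Illustrative stand-in for a launched process."""
        def __init__(self, name, dies_after):
            self._name, self._t0, self._dies_after = name, time.monotonic(), dies_after
        def name(self):
            return self._name
        def terminated(self):
            return time.monotonic() - self._t0 >= self._dies_after

    procs = [Proc('osmo-stp_10.42.42.5', dies_after=0.2),  # dies early, like pid=2056
             Proc('patchelf', dies_after=5.0)]             # the one launch_sync waits on

    def poll_all(skip=None):
        # Watchdog: any unexpected exit aborts the whole test immediately.
        for p in procs:
            if p is not skip and p.terminated():
                raise Error('Process ended prematurely: %s' % p.name())

    def launch_sync(proc, timeout=10.0):
        # Waits for 'proc', but poll_all() may raise on behalf of a *different*
        # process, which is why this frame appears in the traceback for osmo-stp.
        deadline = time.monotonic() + timeout
        while not proc.terminated():
            poll_all(skip=proc)
            if time.monotonic() > deadline:
                raise Error('timeout waiting for %s' % proc.name())
            time.sleep(0.05)

    launch_sync(procs[1])  # raises: Process ended prematurely: osmo-stp_10.42.42.5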