Failed

gprs:oc2g+mod-bts0-dynts-osmo.ping.py (from gprs_oc2g+mod-bts0-dynts-osmo)

Failing for the past 230 builds (Since #2139)
Took 7 sec.

Stacktrace

osmo-stp_10.42.42.5(pid=13584): Process ended prematurely: osmo-stp_10.42.42.5(pid=13584) [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py:22↪ping.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=13584)]

Standard Output

----------------------------------------------
trial-2368 gprs:oc2g+mod-bts0-dynts-osmo ping.py
----------------------------------------------
02:24:46.697727 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:46.710070 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x bts (candidates: 1)
02:24:46.859067 tst                          ping.py: using LAC 25332
02:24:46.979898 tst                          ping.py: using RAC 87
02:24:47.128125 tst                          ping.py: using CellId 25332
02:24:47.252995 tst                          ping.py: using BVCI 25333
02:24:47.277661 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.289079 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.300284 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.311644 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.323088 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.334134 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.345413 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x ip_address (candidates: 9)
02:24:47.356383 tst    gprs:oc2g+mod-bts0-dynts-osmo: Using 1 x modem (candidates: 4)
02:24:47.643865 tst                       ping.py:19: start network...
02:24:47.776561 run              osmo-hlr_10.42.42.2: Starting osmo-hlr
02:24:48.282608 run         create_hlr_db(pid=13578): Launched
02:24:48.466732 bus                          /gobi_6: Setting Powered False
02:24:49.486429 run         create_hlr_db(pid=13578): Terminated: ok {rc=0}
02:24:49.911121 run pcap-recorder_any(filters='host 10.42.42.2')(pid=13580): Launched
02:24:50.227280 run   osmo-hlr_10.42.42.2(pid=13581): Launched
02:24:50.346174 run              osmo-stp_10.42.42.5: Starting osmo-stp
02:24:50.904223 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=13583): Launched
02:24:51.221393 run   osmo-stp_10.42.42.5(pid=13584): Launched
02:24:51.338322 run             osmo-ggsn_10.42.42.6: Starting osmo-ggsn
02:24:51.897173 run pcap-recorder_any(filters='host 10.42.42.6')(pid=13586): Launched
02:24:52.010383 run             osmo-ggsn_10.42.42.6: Setting RPATH for osmo-ggsn
02:24:52.288227 run              patchelf(pid=13587): Launched
02:24:52.474705 run   osmo-stp_10.42.42.5(pid=13584): ERR: Terminated: ERROR {rc=1}  [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py:22↪ping.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=13584)]
02:24:52.611339 run   osmo-stp_10.42.42.5(pid=13584): stdout: 
| (launched: 2022-12-10_02:24:51.066011)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
02:24:52.747955 run   osmo-stp_10.42.42.5(pid=13584): stderr: 
| 20221210022451257 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210022451257 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210022451257 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210022451258 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210022451258 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210022451258 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210022451258 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
02:24:52.880005 run   osmo-stp_10.42.42.5(pid=13584): stdout: 
| (launched: 2022-12-10_02:24:51.066011)
| Copyright (C) 2015-2020 by Harald Welte <laforge@gnumonks.org>
| Contributions by Holger Freyther, Neels Hofmeyr, Pau Espin, Vadim Yanitskiy
| License GPLv2+: GNU GPL Version 2 or later <http://gnu.org/licenses/gpl-2.0.html>
| This is free software: you are free to change and redistribute it.
| There is NO WARRANTY, to the extent permitted by law.
| 
| Free Software lives by contribution.  If you use this, please contribute!
|  
02:24:53.010477 run   osmo-stp_10.42.42.5(pid=13584): stderr: 
| 20221210022451257 DLSS7 INFO Creating m3ua Server (null):2905 (osmo_ss7.c:2189)
| 20221210022451257 DLSS7 INFO Created m3ua server on (null):2905 (osmo_ss7.c:2207)
| 20221210022451257 DLSS7 INFO (Re)binding m3ua Server to 10.42.42.5:2905 (osmo_ss7.c:2235)
| 20221210022451258 DLGLOBAL NOTICE unable to bind socket: 10.42.42.5:2905: Address already in use (socket.c:822)
| % Unable to bind xUA server to IP(s)
| 20221210022451258 DLGLOBAL ERROR unable to bind socket:10.42.42.5:4239: Address already in use (socket.c:935)
| 20221210022451258 DLGLOBAL ERROR no suitable addr found for: 10.42.42.5:4239 (socket.c:946)
| 20221210022451258 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.5 4239 (telnet_interface.c:99)
| Error binding VTY port
| : Address already in use
02:24:53.162299 run              patchelf(pid=13587): Terminating (SIGINT)
02:24:53.340276 run              patchelf(pid=13587): Terminated: ok {rc=0}
02:24:53.372614 tst                       ping.py:22: ERR: Error: osmo-stp_10.42.42.5(pid=13584): Process ended prematurely: osmo-stp_10.42.42.5(pid=13584) [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py:22↪ping.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=13584)]  [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py:22]
02:24:53.381743 tst                       ping.py:22: Test FAILED (6.7 sec)
02:24:53.444980 run   osmo-hlr_10.42.42.2(pid=13581): ERR: Terminated: ERROR {rc=237}  [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py↪ping.py↪osmo-hlr_10.42.42.2↪osmo-hlr_10.42.42.2(pid=13581)]
02:24:53.496186 run   osmo-hlr_10.42.42.2(pid=13581): stdout: 
| (launched: 2022-12-10_02:24:50.072422) 
02:24:53.548738 run   osmo-hlr_10.42.42.2(pid=13581): stderr: 
| 20221210022450269 DDB DEBUG SQLite3 compiled with 'SECURE_DELETE' (db.c:579)
| 20221210022450269 DDB DEBUG SQLite3 compiled with 'SOUNDEX' (db.c:579)
| 20221210022450269 DDB DEBUG SQLite3 compiled with 'TEMP_STORE=1' (db.c:579)
| 20221210022450269 DDB DEBUG SQLite3 compiled with 'THREADSAFE=1' (db.c:579)
| 20221210022450269 DDB DEBUG SQLite3 compiled with 'USE_URI' (db.c:579)
| 20221210022450270 DDB DEBUG Not setting SQL log callback: SQLite3 compiled without support for it (db.c:596)
| 20221210022450282 DDB NOTICE Database '/home/jenkins/workspace/osmo-gsm-tester_run-prod/trial-2368/run.2022-12-10_02-07-09/gprs:oc2g+mod-bts0-dynts-osmo/ping.py/osmo-hlr_10.42.42.2/hlr.db' has HLR DB schema version 6 (db.c:636)
| 20221210022450290 DLGLOBAL ERROR unable to bind socket:10.42.42.2:4258: Address already in use (socket.c:935)
| 20221210022450291 DLGLOBAL ERROR no suitable addr found for: 10.42.42.2:4258 (socket.c:946)
| 20221210022450291 DLGLOBAL ERROR Cannot bind telnet at 10.42.42.2 4258 (telnet_interface.c:99)
02:24:53.593225 run pcap-recorder_any(filters='host 10.42.42.2')(pid=13580): Terminating (SIGTERM)
02:24:53.635206 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=13583): Terminating (SIGTERM)
02:24:53.677526 run pcap-recorder_any(filters='host 10.42.42.6')(pid=13586): Terminating (SIGTERM)
02:24:53.687239 ---      ParallelTerminationStrategy: PID 13580 died...
02:24:53.743638 run pcap-recorder_any(filters='host 10.42.42.2')(pid=13580): Terminated: ok {rc=0}
02:24:53.753969 ---      ParallelTerminationStrategy: PID 13583 died...
02:24:53.810302 run pcap-recorder_any(filters='host 10.42.42.5 and port not 22')(pid=13583): Terminated: ok {rc=0}
02:24:53.820374 ---      ParallelTerminationStrategy: PID 13586 died...
02:24:53.876185 run pcap-recorder_any(filters='host 10.42.42.6')(pid=13586): Terminated: ok {rc=0}
02:24:53.964046 bus                          /gobi_6: Setting Powered False
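
The stderr dumps above show the root cause: osmo-stp_10.42.42.5(pid=13584) exits at startup because its M3UA port (10.42.42.5:2905) and its telnet VTY port (10.42.42.5:4239) are already bound, and osmo-hlr_10.42.42.2(pid=13581) fails the same way on its VTY port (10.42.42.2:4258). "Address already in use" on several independent ports at once usually means processes left over from an earlier trial are still holding the sockets. As a hypothetical pre-flight check (not part of osmo-gsm-tester), the TCP VTY ports could be probed before launch; note that the M3UA listener is SCTP, so a plain TCP probe only covers the VTY side:

    import socket

    def tcp_port_is_free(host, port):
        # Try to bind the address ourselves; an OSError (EADDRINUSE)
        # means another process already holds it. This must run on the
        # host that owns the address, i.e. the tester machine.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind((host, port))
                return True
            except OSError:
                return False

    # VTY ports taken from the log above; both reported "Address already in use".
    for host, port in (('10.42.42.5', 4239), ('10.42.42.2', 4258)):
        state = 'free' if tcp_port_is_free(host, port) else 'in use'
        print('%s:%d is %s' % (host, port, state))

On the tester host itself, ss -ltnp would additionally identify the PID that owns each listening socket.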

Standard Error

Traceback (most recent call last):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/test.py", line 76, in run
    self.path)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 414, in run_python_file
    spec.loader.exec_module( importlib.util.module_from_spec(spec) )
  File "<frozen importlib._bootstrap_external>", line 728, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/sysmocom/suites/gprs/ping.py", line 22, in <module>
    ggsn.start()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/obj/ggsn_osmo.py", line 57, in start
    util.change_elf_rpath(binary, util.prepend_library_path(lib), self.run_dir.new_dir('patchelf'))
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/util.py", line 57, in change_elf_rpath
    proc.launch_sync()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 214, in launch_sync
    raise e
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 211, in launch_sync
    self.wait()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/process.py", line 421, in wait
    MainLoop.wait(self.terminated, timeout=timeout)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 108, in wait
    if not self.wait_no_raise(condition, condition_args, condition_kwargs, timeout, timestep):
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 98, in wait_no_raise
    self.poll(may_block=True)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 86, in poll
    self.deferred_handling.handle_queue()
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/core/event_loop.py", line 33, in handle_queue
    handler(*args, **kwargs)
  File "/home/jenkins/workspace/osmo-gsm-tester_run-prod/osmo-gsm-tester/src/osmo_gsm_tester/testenv.py", line 142, in poll
    raise log_module.Error('Process ended prematurely: %s' % proc.name())
osmo_gsm_tester.core.log.Error: osmo-stp_10.42.42.5(pid=13584): Process ended prematurely: osmo-stp_10.42.42.5(pid=13584) [trial-2368↪gprs:oc2g+mod-bts0-dynts-osmo↪ping.py:22↪ping.py↪osmo-stp_10.42.42.5↪osmo-stp_10.42.42.5(pid=13584)]
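
The traceback explains why the failure surfaces from ggsn.start() rather than from osmo-stp directly: while launch_sync() waits for the patchelf helper, the event loop keeps polling every managed process, and testenv.py:142 raises as soon as any of them (here osmo-stp) has terminated. A minimal self-contained sketch of that premature-exit check, reconstructed from the traceback (names simplified, not the verbatim osmo-gsm-tester source):

    import subprocess

    class TestError(Exception):
        pass

    def poll(managed_procs):
        # Called on every event-loop iteration: if any process the test
        # still expects to be running has exited, abort the whole test.
        for proc in managed_procs:
            if proc.poll() is not None:   # Popen.poll() returns None while running
                raise TestError('Process ended prematurely: %s' % proc.args)

    # Even a synchronous wait on an unrelated helper (here: patchelf for
    # osmo-ggsn) pumps the event loop, which is why the osmo-stp death
    # aborts the test inside ggsn.start().
    procs = [subprocess.Popen(['sleep', '10'])]
    poll(procs)   # raises only if some managed process already exited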