Thank you very much, Brian. Updating python-kombu to 3.0.33-5 resolves the issue. After the update I ran the sync task several times without any issues. Please find the log below, as requested.

# rpm -qa | grep kombu
python-kombu-3.0.33-4.pulp.el7.noarch

# sudo qpid-stat -q | grep celeryev
  celeryev.136f85c7-b6fd-4e90-8426-fc73ff86864c   Y     0  2.27k  2.27k  0  1.92m  1.92m  1  2
# journalctl -f -l
-- Logs begin at Wed 2016-05-04 11:41:21 CEST. --
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=USER_ACCT msg=audit(1462368063.769:541): pid=3501 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:accounting grantors=pam_succeed_if acct="root" exe="/usr/bin/su" hostname=? addr=? terminal=pts/0 res=success'
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=CRED_ACQ msg=audit(1462368063.770:542): pid=3501 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_rootok acct="root" exe="/usr/bin/su" hostname=? addr=? terminal=pts/0 res=success'
May 04 15:21:03 mysrv su[3501]: pam_unix(su-l:session): session opened for user root by lsagy92iy(uid=0)
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=USER_START msg=audit(1462368063.775:543): pid=3501 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_keyinit,pam_keyinit,pam_limits,pam_systemd,pam_unix,pam_xauth acct="root" exe="/usr/bin/su" hostname=? addr=? terminal=pts/0 res=success'
May 04 15:22:21 mysrv sudo[3536]:     root : TTY=pts/0 ; PWD=/root ; USER=root ; COMMAND=/bin/qpid-stat -q
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_CMD msg=audit(1462368141.089:544): pid=3536 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='cwd="/root" cmd=717069642D73746174202D71 terminal=pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=CRED_ACQ msg=audit(1462368141.090:545): pid=3536 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_START msg=audit(1462368141.090:546): pid=3536 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_open grantors=pam_keyinit,pam_limits acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_END msg=audit(1462368141.240:547): pid=3536 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:session_close grantors=pam_keyinit,pam_limits acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=CRED_DISP msg=audit(1462368141.240:548): pid=3536 uid=0 auid=692982 ses=7 subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'

^C
# rpm -Uvh python-kombu-3.0.33-5.pulp.el7.noarch.rpm
Preparing...                          ################################# [100%]
Updating / installing...
   1:python-kombu-1:3.0.33-5.pulp.el7 ################################# [ 50%]
Cleaning up / removing...
   2:python-kombu-1:3.0.33-4.pulp.el7 ################################# [100%]
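
As an extra check it can be worth confirming which kombu the Celery workers will actually import; note that kombu.__version__ only carries the upstream version (3.0.33), not the RPM release, so the rpm -qa query below is still what distinguishes the -4 and -5 builds:

# python -c 'import kombu; print(kombu.__version__); print(kombu.__file__)'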
# rpm -qa | grep kombu
python-kombu-3.0.33-5.pulp.el7.noarch

# sudo qpid-stat -q | grep celeryev
  celeryev.a2ebe0b7-c01e-41f2-8fae-58498cee82e5   Y     0  109  109  0  91.6k  91.6k  1  2
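
To keep an eye on the event queue after the update, the same check can be scripted. This is only a sketch that shells out to qpid-stat; it assumes the last eight columns of a queue row are msg, msgIn, msgOut, bytes, bytesIn, bytesOut, cons and bind, as in the header shown further below, so adjust it if your output differs:

#!/usr/bin/env python
# Sketch: poll `qpid-stat -q` and report the celeryev queue counters so you
# can confirm msgOut keeps tracking msgIn (i.e. the event queue is draining).
import subprocess
import time

def celeryev_counters():
    out = subprocess.check_output(['qpid-stat', '-q'], universal_newlines=True)
    for line in out.splitlines():
        cols = line.split()
        if cols and cols[0].startswith('celeryev.'):
            # name, msg depth, msgIn, msgOut (assumed trailing column order)
            return cols[0], cols[-8], cols[-7], cols[-6]
    return None

if __name__ == '__main__':
    for _ in range(5):
        row = celeryev_counters()
        if row:
            print('%s  depth=%s  msgIn=%s  msgOut=%s' % row)
        else:
            print('no celeryev queue found')
        time.sleep(10)
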
# pulp-admin status
+----------------------------------------------------------------------+
                          Status of the server
+----------------------------------------------------------------------+

Api Version:           2
Database Connection:
  Connected: True
Known Workers:
  _id:            scheduler@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:47:15Z
  _id:            reserved_resource_worker-3@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:48:15Z
  _id:            reserved_resource_worker-0@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:48:15Z
  _id:            reserved_resource_worker-2@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:48:15Z
  _id:            reserved_resource_worker-1@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:48:18Z
  _id:            resource_manager@mysrv
  _ns:            workers
  Last Heartbeat: 2016-05-04T13:48:18Z
Messaging Connection:
  Connected: True
Versions:
  Platform Version: 2.8.2
# qpid-stat -q
Queues
  queue                                           dur  autoDel  excl  msg  msgIn  msgOut  bytes  bytesIn  bytesOut  cons  bind
  =============================================================================================================================
  05a8bc3d-cf7e-4fae-875b-13b214cf8de6:1.0        Y  Y  0  0    0    0  0      0      1  2
  21c58b6c-8bd4-4ede-86d9-fd65dfe8bc4c:1.0        Y  Y  0  8    8    0  4.90k  4.90k  1  2
  21c58b6c-8bd4-4ede-86d9-fd65dfe8bc4c:2.0        Y  Y  0  4    4    0  2.50k  2.50k  1  2
  372baf00-05a0-40e0-bd16-430433ce0980:1.0        Y  Y  0  2    2    0  486    486    1  2
  39ae3a61-e29e-4032-af1e-19bd822e34e3:1.0        Y  Y  0  8    8    0  4.88k  4.88k  1  2
  39ae3a61-e29e-4032-af1e-19bd822e34e3:2.0        Y  Y  0  4    4    0  2.52k  2.52k  1  2
  3a7d8a98-86df-471d-9e20-384dd8dacbb2:1.0        Y  Y  0  2    2    0  486    486    1  2
  4fa605d0-5346-4087-852d-266f98d708e4:0.0        Y  Y  0  0    0    0  0      0      1  2
  96497b3d-6fdc-46c7-b784-c3508f97500b:1.0        Y  Y  0  8    8    0  4.88k  4.88k  1  2
  96497b3d-6fdc-46c7-b784-c3508f97500b:2.0        Y  Y  0  4    4    0  2.52k  2.52k  1  2
  a2a2aaee-797c-4b47-853d-d60b80bfac69:1.0        Y  Y  0  5    5    0  2.73k  2.73k  1  2
  b1918f13-ff85-45ec-91a7-67dceea00fb2:1.0        Y  Y  0  2    2    0  486    486    1  2
  c209635c-dba5-43b5-bc7e-284c342e0ed0:1.0        Y  Y  0  8    8    0  4.88k  4.88k  1  2
  c209635c-dba5-43b5-bc7e-284c342e0ed0:2.0        Y  Y  0  4    4    0  2.46k  2.46k  1  2
  celery                                          Y     0  0    0    0  0      0      4  2
  celeryev.a2ebe0b7-c01e-41f2-8fae-58498cee82e5   Y     0  144  144  0  121k   121k   1  2
  d78d70a1-7ecd-4375-8a3a-ad7de2f60fa3:1.0        Y  Y  0  8    8    0  4.88k  4.88k  1  2
  d78d70a1-7ecd-4375-8a3a-ad7de2f60fa3:2.0        Y  Y  0  4    4    0  2.52k  2.52k  1  2
  f2fea2ac-4468-4e4a-9843-86cfab4157f2:1.0        Y  Y  0  2    2    0  486    486    1  2
  f9e48a42-8b2c-4378-b90a-1632ff8dbba8:1.0        Y  Y  0  2    2    0  486    486    1  2
  pulp.task                                       Y     0  0    0    0  0      0      3  1
  reserved_resource_worker-0@mysrv.celery.pidbox  Y     0  1    1    0  449    449    1  2
  reserved_resource_worker-0@mysrv.dq             Y  Y  0  0    0    0  0      0      1  2
  reserved_resource_worker-1@mysrv.celery.pidbox  Y     0  1    1    0  449    449    1  2
  reserved_resource_worker-1@mysrv.dq             Y  Y  0  0    0    0  0      0      1  2
  reserved_resource_worker-2@mysrv.celery.pidbox  Y     0  1    1    0  449    449    1  2
  reserved_resource_worker-2@mysrv.dq             Y  Y  0  0    0    0  0      0      1  2
  reserved_resource_worker-3@mysrv.celery.pidbox  Y     0  1    1    0  449    449    1  2
  reserved_resource_worker-3@mysrv.dq             Y  Y  0  0    0    0  0      0      1  2
  resource_manager                                Y     0  0    0    0  0      0      1  2
  resource_manager@mysrv.celery.pidbox            Y     0  0    0    0  0      0      1  2
  resource_manager@mysrv.dq                       Y  Y  0  0    0    0  0      0      1  2
# pulp-admin -vv tasks details --task-id bf077e15-09f0-45d2-98d7-17ce953248d1
+----------------------------------------------------------------------+
                              Task Details
+----------------------------------------------------------------------+

2016-05-04 15:59:19,827 - DEBUG - sending GET request to /pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/
2016-05-04 15:59:19,950 - INFO - GET request to /pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/ with parameters None
2016-05-04 15:59:19,950 - INFO - Response status : 200

2016-05-04 15:59:19,950 - INFO - Response body :
{
  "exception": null,
  "task_type": "pulp.server.managers.repo.sync.sync",
  "_href": "/pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/",
  "task_id": "bf077e15-09f0-45d2-98d7-17ce953248d1",
  "tags": [
    "pulp:repository:rhel-6-server-rpms",
    "pulp:action:sync"
  ],
  "finish_time": null,
  "_ns": "task_status",
  "start_time": "2016-05-04T13:51:26Z",
  "traceback": null,
  "spawned_tasks": [],
  "progress_report": {
    "yum_importer": {
      "content": {
        "size_total": 0,
        "items_left": 0,
        "items_total": 0,
        "state": "IN_PROGRESS",
        "size_left": 0,
        "details": {
          "rpm_total": 0,
          "rpm_done": 0,
          "drpm_total": 0,
          "drpm_done": 0
        },
        "error_details": []
      },
      "comps": {
        "state": "NOT_STARTED"
      },
      "purge_duplicates": {
        "state": "NOT_STARTED"
      },
      "distribution": {
        "items_total": 0,
        "state": "NOT_STARTED",
        "error_details": [],
        "items_left": 0
      },
      "errata": {
        "state": "NOT_STARTED"
      },
      "metadata": {
        "state": "FINISHED"
      }
    }
  },
  "queue": "reserved_resource_worker-2@mysrv.dq",
  "state": "running",
  "worker_name": "reserved_resource_worker-2@mysrv",
  "result": null,
  "error": null,
  "_id": {
    "$oid": "5729fe5e89348df3077e3e52"
  },
  "id": "5729fe5e89348df3077e3e52"
}

Operations:       sync
Resources:        rhel-6-server-rpms (repository)
State:            Running
Start Time:       2016-05-04T13:51:26Z
Finish Time:      Incomplete
Result:           Incomplete
Task Id:          bf077e15-09f0-45d2-98d7-17ce953248d1
Progress Report:
  Yum Importer:
    Comps:
      State: NOT_STARTED
    Content:
      Details:
        Drpm Done:  0
        Drpm Total: 0
        Rpm Done:   0
        Rpm Total:  0
      Error Details:
      Items Left:    0
      Items Total:   0
      Size Left:     0
      Size Total:    0
      State:         IN_PROGRESS
    Distribution:
      Error Details:
      Items Left:    0
      Items Total:   0
      State:         NOT_STARTED
    Errata:
      State: NOT_STARTED
    Metadata:
      State: FINISHED
    Purge Duplicates:
      State: NOT_STARTED

Thanks,
Samiron
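
For reference, the task document that the -vv output above logs can also be polled straight from the REST API (GET /pulp/api/v2/tasks/<task_id>/). A minimal sketch with python-requests; the username/password pair and the disabled certificate check are assumptions to adjust for your deployment:

#!/usr/bin/env python
# Sketch: poll a Pulp 2.x task until it leaves the waiting/running states.
import time
import requests

PULP_HOST = 'https://mysrv'                         # assumption: your Pulp server
TASK_ID = 'bf077e15-09f0-45d2-98d7-17ce953248d1'
AUTH = ('admin', 'admin')                           # assumption: adjust credentials

while True:
    resp = requests.get('%s/pulp/api/v2/tasks/%s/' % (PULP_HOST, TASK_ID),
                        auth=AUTH, verify=False)
    resp.raise_for_status()
    task = resp.json()
    print('%s  state=%s' % (task['task_id'], task['state']))
    if task['state'] not in ('waiting', 'running'):
        break
    time.sleep(30)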

On Wed, May 4, 2016 at 6:05 PM, Brian Bouterse <bbouters@redhat.com> wrote:

After trying the things from my other e-mail, when your system is in its
"bad state", please show the `qpid-stat -q` output too.

The `qpid-stat -q` output from this e-mail shows all of the queues are
empty, with msgIn and msgOut being the same.

-Brian

On 05/04/2016 05:07 AM, Mallick, Samiron wrote:
> *Seems the queue has something in it. I deleted the queues one by one and
> tried starting the stuck repo sync, and it finally works for that repo. But
> the bad news is that the queue automatically regenerates a lot of tasks again
> even though no sync tasks are running. So while trying to run another repo,
> it went to Waiting again. Is there any workaround yet?*
>
> # qpid-stat -q
>
> Queues
>   queue                                           dur  autoDel  excl  msg  msgIn  msgOut  bytes  bytesIn  bytesOut  cons  bind
>   =============================================================================================================================
>   1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0        Y  Y  0  2   2   0  486    486    1  2
>   36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0        Y  Y  0  8   8   0  4.91k  4.91k  1  2
>   36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0        Y  Y  0  4   4   0  2.50k  2.50k  1  2
>   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0        Y  Y  0  8   8   0  4.88k  4.88k  1  2
>   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0        Y  Y  0  4   4   0  2.52k  2.52k  1  2
>   43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0        Y  Y  0  2   2   0  486    486    1  2
>   4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0        Y  Y  0  0   0   0  0      0      1  2
>   680eb17a-8285-450c-b8b9-51d107b4ff2d:0.0        Y  Y  0  0   0   0  0      0      1  2
>   bcbc1fa3-8157-403d-8f33-252fe057587a:1.0        Y  Y  0  5   5   0  2.67k  2.67k  1  2
>   celery                                          Y     0  0   0   0  0      0      1  2
>   celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b   Y     0  12  12  0  10.0k  10.0k  1  2
>   pulp.task                                       Y     0  0   0   0  0      0      3  1
>   reserved_resource_worker-0@mysrv.celery.pidbox  Y     0  0   0   0  0      0      1  2
>   reserved_resource_worker-0@mysrv.dq             Y  Y  0  0   0   0  0      0      1  2
>   resource_manager                                Y     0  0   0   0  0      0      1  2
>   resource_manager@mysrv.celery.pidbox            Y     0  0   0   0  0      0      1  2
>   resource_manager@mysrv.dq                       Y  Y  0  0   0   0  0      0      1  2
>
> # pulp-admin tasks list
>
> +----------------------------------------------------------------------+
>                                  Tasks
> +----------------------------------------------------------------------+
>
> No tasks found
>
>
> # qpid-tool
>
> Management Tool for QPID
>
> qpid: list
>
> Summary of Objects by Type:
>
> qpid: help
>
> Management Tool for QPID
>
> Commands:
>     agents                          - Print a list of the known Agents
>     list                            - Print summary of existing objects by class
>     list <className>                - Print list of objects of the specified class
>     list <className> active         - Print list of non-deleted objects of the specified class
>     show <ID>                       - Print contents of an object (infer className)
>     call <ID> <methodName> [<args>] - Invoke a method on an object
>     schema                          - Print summary of object classes seen on the target
>     schema <className>              - Print details of an object class
>     set time-format short           - Select short timestamp format (default)
>     set time-format long            - Select long timestamp format
>     quit or ^D                      - Exit the program
>
> qpid: list
>
> Summary of Objects by Type:
>
>     Package                      Class         Active  Deleted
>     ============================================================
>     org.apache.qpid.broker       binding       43      12
>     org.apache.qpid.broker       broker        1       0
>     org.apache.qpid.broker       memory        1       0
>     org.apache.qpid.broker       system        1       0
>     org.apache.qpid.linearstore  store         1       0
>     org.apache.qpid.broker       subscription  23      5
>     org.apache.qpid.broker       connection    14      1
>     org.apache.qpid.broker       session       19      1
>     org.apache.qpid.linearstore  journal       5       0
>     org.apache.qpid.acl          acl           1       0
>     org.apache.qpid.broker       queue         21      5
>     org.apache.qpid.broker       exchange      13      0
>     org.apache.qpid.broker       vhost         1       0
>
> qpid: list queue
>
> Object Summary:
>
>     ID   Created   Destroyed  Index
>     ============================================================================================================================
>     114  06:24:09  06:24:41   org.apache.qpid.broker:queue:topic-mysrv.3108.1
>     115  06:24:09  06:24:41   org.apache.qpid.broker:queue:reply-mysrv.3108.1
>     116  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3108.1
>     117  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-mysrv.3108.1
>     118  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3108.1
>     198  06:16:36  -          org.apache.qpid.broker:queue:1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
>     199  06:16:36  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0
>     200  06:16:38  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0
>     201  06:16:36  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0
>     202  06:16:37  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0
>     203  06:16:36  -          org.apache.qpid.broker:queue:43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0
>     204  06:16:33  -          org.apache.qpid.broker:queue:4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0
>     205  06:16:33  -          org.apache.qpid.broker:queue:bcbc1fa3-8157-403d-8f33-252fe057587a:1.0
>     206  06:16:33  -          org.apache.qpid.broker:queue:celery
>     207  06:16:33  -          org.apache.qpid.broker:queue:celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b
>     208  06:16:33  -          org.apache.qpid.broker:queue:pulp.task
>     209  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3122.1
>     210  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.3122.1
>     211  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3122.1
>     212  06:24:43  -          org.apache.qpid.broker:queue:reply-mysrv.3122.1
>     213  06:16:37  -          org.apache.qpid.broker:queue:reserved_resource_worker-0@mysrv.celery.pidbox
>     214  06:16:36  -          org.apache.qpid.broker:queue:reserved_resource_worker-0@mysrv.dq
>     215  06:16:33  -          org.apache.qpid.broker:queue:resource_manager
>     216  06:16:38  -          org.apache.qpid.broker:queue:resource_manager@mysrv.celery.pidbox
>     217  06:16:37  -          org.apache.qpid.broker:queue:resource_manager@mysrv.dq
>     218  06:24:43  -          org.apache.qpid.broker:queue:topic-mysrv.3122.1
>
>
> # qpid-config del queue 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
>
> Failed: Exception: Exception from Agent: {u'error_code': 7,
> u'error_text': 'precondition-failed: Cannot delete queue
> 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0; queue in use
> (/builddir/build/BUILD/qpid-cpp-0.34/src/qpid/broker/Broker.cpp:1068)'}
>
>
> # qpid-config del queue resource_manager@mysrv.dq --force
>
> qpid: list
>
> Summary of Objects by Type:
>
>     Package                      Class         Active  Deleted
>     ============================================================
>     org.apache.qpid.broker       binding       16      0
>     org.apache.qpid.broker       broker        1       0
>     org.apache.qpid.broker       memory        1       0
>     org.apache.qpid.broker       system        1       0
>     org.apache.qpid.linearstore  store         1       0
>     org.apache.qpid.broker       subscription  7       0
>     org.apache.qpid.broker       connection    13      0
>     org.apache.qpid.broker       session       269     0
>     org.apache.qpid.acl          acl           1       0
>     org.apache.qpid.broker       queue         7       0
>     org.apache.qpid.broker       exchange      13      0
>     org.apache.qpid.broker       vhost         1       0
>
> qpid: list queue
>
> Object Summary:
>
>     ID   Created   Destroyed  Index
>     ======================================================================================================
>     146  08:47:30  -          org.apache.qpid.broker:queue:2d1a7c8f-bc3b-4d54-bbe6-b7b264530506:1.0
>     147  08:47:30  -          org.apache.qpid.broker:queue:celeryev.d45c6bc2-2449-4700-b3bb-bbbbf0b2990b
>     148  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.4080.1
>     149  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.4080.1
>     150  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.4080.1
>     151  08:52:24  -          org.apache.qpid.broker:queue:reply-mysrv.4080.1
>     152  08:52:24  -          org.apache.qpid.broker:queue:topic-mysrv.4080.1
>
>
> # pulp-admin tasks list
>
> +----------------------------------------------------------------------+
>                                  Tasks
> +----------------------------------------------------------------------+
>
> No tasks found
>
>
> # pulp-admin rpm repo sync run --repo-id=rhel-6-server-rpms
>
> +----------------------------------------------------------------------+
>              Synchronizing Repository [rhel-6-server-rpms]
> +----------------------------------------------------------------------+
>
> This command may be exited via ctrl+c without affecting the request.
>
> [\]
>
> *** STUCK ***
>
>
> On Wed, May 4, 2016 at 9:37 AM, Mallick, Samiron
> <samiron.mallick@gmail.com> wrote:
>
>     Hey Brian, thanks for the reply.
>
>     *From the output below I could see that "resource_worker-1" is
>     responsible for this task, and I have 4 workers displayed on the
>     server.*
>
>     # pulp-admin tasks list
>
>     +----------------------------------------------------------------------+
>                                      Tasks
>     +----------------------------------------------------------------------+
>
>     Operations:  sync
>     Resources:   rhel-6-server-supplementary-rpms (repository)
>     State:       Waiting
>     Start Time:  Unstarted
>     Finish Time: Incomplete
>     Task Id:     49b83f70-e6d6-4cdb-9c5a-93c20c31d697
>
>
>     # pulp-admin -vv tasks details --task-id 49b83f70-e6d6-4cdb-9c5a-93c20c31d697
>
>     +----------------------------------------------------------------------+
>                                   Task Details
>     +----------------------------------------------------------------------+
>
>     2016-05-04 04:55:33,231 - DEBUG - sending GET request to /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/
>     2016-05-04 04:55:33,362 - INFO - GET request to /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/ with parameters None
>     2016-05-04 04:55:33,362 - INFO - Response status : 200
>
>     2016-05-04 04:55:33,363 - INFO - Response body :
>     {
>       "exception": null,
>       "task_type": "pulp.server.managers.repo.sync.sync",
>       "_href": "/pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/",
>       "task_id": "49b83f70-e6d6-4cdb-9c5a-93c20c31d697",
>       "tags": [
>         "pulp:repository:rhel-6-server-supplementary-rpms",
>         "pulp:action:sync"
>       ],
>       "finish_time": null,
>       "_ns": "task_status",
>       "start_time": null,
>       "traceback": null,
>       "spawned_tasks": [],
>       "progress_report": {},
>       "queue": "reserved_resource_worker-1@mysrv.dq",
>       "state": "waiting",
>       "worker_name": "reserved_resource_worker-1@mysrv",
>       "result": null,
>       "error": null,
>       "_id": {
>         "$oid": "572964399b70a2ea1d2694aa"
>       },
>       "id": "572964399b70a2ea1d2694aa"
>     }
>
>     Operations:       sync
>     Resources:        rhel-6-server-supplementary-rpms (repository)
>     State:            Waiting
>     Start Time:       Unstarted
>     Finish Time:      Incomplete
>     Result:           Incomplete
>     Task Id:          49b83f70-e6d6-4cdb-9c5a-93c20c31d697
>     Progress Report:
>
>
>     # pulp-admin status
>
>     +----------------------------------------------------------------------+
>                               Status of the server
>     +----------------------------------------------------------------------+
>
>     Api Version:           2
>     Database Connection:
>       Connected: True
>     Known Workers:
>       _id:            scheduler@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:53:34Z
>       _id:            reserved_resource_worker-3@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            reserved_resource_worker-2@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            resource_manager@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            reserved_resource_worker-1@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:01Z
>       _id:            reserved_resource_worker-0@mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:03Z
>     Messaging Connection:
>       Connected: True
>     Versions:
>       Platform Version: 2.8.2
>
>     # ps -awfux | grep celery
>
>     root      4637  0.0  0.0 112644   960 pts/0    S+   04:56   0:00                          \_ grep --color=auto celery
>     apache    1592  0.0  1.4 667716 56368 ?        Ssl  May03   0:26 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
>     apache    2921  0.0  1.4 667664 54296 ?        Sl   May03   0:13  \_ /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
>     apache    1616  0.0  1.4 667996 56400 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
>     apache    2919  0.0  1.4 741536 54564 ?        Sl   May03   0:11  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
>     apache    1626  0.0  1.5 668560 59524 ?        Ssl  May03   0:29 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
>     apache    4561  0.0  1.4 668560 56260 ?        S    04:47   0:00  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
>     apache    1631  0.0  1.5 667748 58508 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
>     apache    2922  4.2  8.0 1042956 311476 ?      Sl   May03  48:25  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
>     apache    1637  0.0  1.4 667744 56368 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
>     apache    2920  0.0  1.4 815420 54760 ?        Sl   May03   0:13  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
>     apache    4620  6.5  0.8 663652 31432 ?        Ssl  04:56   0:00 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler
>
>     *As I saw errors in the output of pulp_worker-1, I restarted each worker
>     individually, and all the errors seem to be gone.*
><br>
>     __ __<br>
><br>
>     # systemctl status pulp_workers.service____<br>
><br>
>     ● pulp_workers.service - Pulp Celery Workers____<br>
><br>
</span>>        Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service;<br>
>     enabled; vendor preset: disabled)____<br>
<span class="">><br>
>        Active: active (exited) since Wed 2016-05-04 05:36:38 CEST; 3s<br>
</span>>     ago____<br>
<span class="">><br>
>       Process: 5717 ExecStop=/usr/bin/python -m<br>
>     pulp.server.async.manage_workers stop (code=exited,<br>
</span>>     status=0/SUCCESS)____<br>
<span class="">><br>
>       Process: 5731 ExecStart=/usr/bin/python -m<br>
>     pulp.server.async.manage_workers start (code=exited,<br>
</span><span class="">>     status=0/SUCCESS)____<br>
><br>
>     Main PID: 5731 (code=exited, status=0/SUCCESS)____<br>
><br>
>     __ __<br>
><br>
>     May 04 05:36:38 mysrv systemd[1]: Starting Pulp Celery Workers...____<br>
><br>
>     May 04 05:36:38 mysrv systemd[1]: Started Pulp Celery Workers.____<br>
><br>
>     __ __<br>
><br>
>     # systemctl status pulp_worker-0<br>
><br>
>     ● pulp_worker-0.service - Pulp Worker #0<br>
><br>
>        Loaded: loaded (/run/systemd/system/pulp_worker-0.service;<br>
>     static; vendor preset: disabled)<br>
><br>
>        Active: active (running) since Wed 2016-05-04 05:10:44 CEST; 1min<br>
>     26s ago<br>
><br>
>     Main PID: 4753 (celery)<br>
><br>
>        CGroup: /system.slice/pulp_worker-0.service<br>
><br>
>                ├─4753 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>                └─4766 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> transport:<br>
>     qpid://mysrv:5672//<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> results:<br>
>     disabled<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: - *** --- * --- .> concurrency:<br>
>     1 (prefork)<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: -- ******* ----<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: --- ***** ----- [queues]<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: -------------- .><br>
>     celery           exchange=celery(direct) key=celery<br>
><br>
>     May 04 05:10:46 mysrv celery[4753]: .><br>
>     reserved_resource_worker-0@mysrv.dq exchange=C.dq(direct)<br>
>     key=rese...srv<br>
><br>
>     May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     May 04 05:10:46 mysrv pulp[4753]: celery.worker.consumer:INFO:<br>
>     Connected to qpid://mysrv:5672//<br>
><br>
>     May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     Hint: Some lines were ellipsized, use -l to show in full.<br>
><br>
><br>
>     # systemctl status pulp_worker-1<br>
><br>
>     ● pulp_worker-1.service - Pulp Worker #1<br>
><br>
>        Loaded: loaded (/run/systemd/system/pulp_worker-1.service;<br>
>     static; vendor preset: disabled)<br>
><br>
>        Active: active (running) since Wed 2016-05-04 05:08:16 CEST; 3min<br>
>     57s ago<br>
><br>
>     Main PID: 4718 (celery)<br>
><br>
>        CGroup: /system.slice/pulp_worker-1.service<br>
><br>
>                ├─4718 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>                └─4733 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...3cc3c36]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...ce7430b]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.job:INFO: Task<br>
>     pulp.server.controllers.repository.download_deferred[aad88f32-...9s:<br>
>     None<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.db.reaper.reap_expired_documents[02...8322faa]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...ddadf87]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...d0cf8c6]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...72edf98]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...1e9e4bc]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.download_def...30f8627]<br>
><br>
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO:<br>
>     Received task:<br>
>     pulp.server.controllers.repository.queue_downlo...fd23e13]<br>
><br>
>     Hint: Some lines were ellipsized, use -l to show in full.<br>
><br>
><br>
>     # systemctl status pulp_worker-2<br>
><br>
>     ● pulp_worker-2.service - Pulp Worker #2<br>
><br>
>        Loaded: loaded (/run/systemd/system/pulp_worker-2.service;<br>
>     static; vendor preset: disabled)<br>
><br>
>        Active: active (running) since Wed 2016-05-04 05:11:06 CEST; 1min<br>
>     10s ago<br>
><br>
>     Main PID: 4776 (celery)<br>
><br>
>        CGroup: /system.slice/pulp_worker-2.service<br>
><br>
>                ├─4776 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>                └─4789 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> transport:<br>
>     qpid://mysrv:5672//<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> results:<br>
>     disabled<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: - *** --- * --- .> concurrency:<br>
>     1 (prefork)<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: -- ******* ----<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: --- ***** ----- [queues]<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: -------------- .><br>
>     celery           exchange=celery(direct) key=celery<br>
><br>
>     May 04 05:11:07 mysrv celery[4776]: .><br>
>     reserved_resource_worker-2@srpadinf0244.mgt.insim.biz.dq<br>
>     exchange=C.dq(direct) key=rese...srv<br>
><br>
>     May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     May 04 05:11:07 mysrv pulp[4776]: celery.worker.consumer:INFO:<br>
>     Connected to qpid://mysrv:5672//<br>
><br>
>     May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     Hint: Some lines were ellipsized, use -l to show in full.<br>
><br>
><br>
>     # systemctl status pulp_worker-3<br>
><br>
>     ● pulp_worker-3.service - Pulp Worker #3<br>
><br>
>        Loaded: loaded (/run/systemd/system/pulp_worker-3.service;<br>
>     static; vendor preset: disabled)<br>
><br>
>        Active: active (running) since Wed 2016-05-04 05:11:21 CEST; 59s<br>
>     ago<br>
><br>
>     Main PID: 4798 (celery)<br>
><br>
>        CGroup: /system.slice/pulp_worker-3.service<br>
><br>
>                ├─4798 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>                └─4811 /usr/bin/python /usr/bin/celery worker -n<br>
>     reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events<br>
>     --umask 18 --pidfile=/var...<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> transport:<br>
>     qpid://mysrv:5672//<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> results:<br>
>     disabled<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: - *** --- * --- .> concurrency:<br>
>     1 (prefork)<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: -- ******* ----<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: --- ***** ----- [queues]<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: -------------- .><br>
>     celery           exchange=celery(direct) key=celery<br>
><br>
>     May 04 05:11:22 mysrv celery[4798]: .><br>
>     reserved_resource_worker-3@srpadinf0244.mgt.insim.biz.dq<br>
>     exchange=C.dq(direct) key=rese...srv<br>
><br>
>     May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     May 04 05:11:22 mysrv pulp[4798]: celery.worker.consumer:INFO:<br>
>     Connected to qpid://mysrv:5672//<br>
><br>
>     May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO:<br>
>     Connected to qpid with SASL mechanism ANONYMOUS<br>
><br>
>     Hint: Some lines were ellipsized, use -l to show in full.<br>
><br>
><br>
>     *Now I have no tasks queued at all. I ran the repo sync again, and<br>
>     again it went straight to "Waiting to begin".*<br>
<span class="">><br>
>     __ __<br>
><br>
>     # pulp-admin tasks list____<br>
><br>
>     +----------------------------------------------------------------------+____<br>
><br>
>                                      Tasks____<br>
><br>
>     +----------------------------------------------------------------------+____<br>
><br>
>     __ __<br>
><br>
>     No tasks found____<br>
><br>
>     __ __<br>
><br>
</span><span class="">>     # pulp-admin rpm repo sync run<br>
</span><span class="">>     --repo-id=rhel-6-server-supplementary-rpms____<br>
><br>
>     +----------------------------------------------------------------------+____<br>
><br>
>           Synchronizing Repository [rhel-6-server-supplementary-rpms]____<br>
><br>
>     +----------------------------------------------------------------------+____<br>
><br>
>     __ __<br>
><br>
>     This command may be exited via ctrl+c without affecting the request.____<br>
><br>
>     __ __<br>
><br>
>     __ __<br>
><br>
>     [/]____<br>
><br>
>     Waiting to begin...<br>
><br>
><br>
><br>
>     *//*<br>
><br>
><br>
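>     Since it sits at "Waiting to begin" again, one more thing worth checking<br>
>     (only a rough sketch of the usual suspects, not output captured from this<br>
>     box) is whether the resource manager and scheduler are up and whether<br>
>     their queue has a consumer:<br>
><br>
>     # the dispatcher that assigns reserved tasks to workers, plus the beat scheduler<br>
>     sudo systemctl status pulp_resource_manager pulp_celerybeat<br>
>     # the resource manager's queue should show at least one consumer<br>
>     # (the grep filter assumes the queue is literally named "resource_manager")<br>
>     sudo qpid-stat -q | grep resource_manager<br>
><br>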
</span><span class="">>     On Wed, May 4, 2016 at 1:52 AM, Brian Bouterse <<a href="mailto:bbouters@redhat.com">bbouters@redhat.com</a><br>
</span><div><div class="h5">>     <mailto:<a href="mailto:bbouters@redhat.com">bbouters@redhat.com</a>>> wrote:<br>
><br>
>         Kodiak is right that the second task stuck at "Waiting to Begin" is<br>
>         likely waiting behind another operation on that same repo.<br>
>         Canceling the<br>
>         one prior will likely allow the later one to start.<br>
><br>
>         How many workers are running and how many do you expect? You can see<br>
>         what Pulp thinks with:  `pulp-admin status`<br>
><br>
>         You can compare that to your pulp processes on all of your Pulp<br>
>         servers<br>
>         with `sudo ps -awfux | grep celery`.<br>
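><br>
>         For example, something along these lines (a rough sketch; the grep<br>
>         filters are just assumptions, adjust as needed) puts the two views<br>
>         side by side:<br>
><br>
>         # what Pulp believes is alive<br>
>         pulp-admin status | grep -i -A 10 worker<br>
>         # count the celery worker processes actually running on this host<br>
>         sudo ps -awfux | grep "[c]elery worker" | wc -l<br>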
><br>
>         Also you can look at the task details with -vv to see the worker the<br>
>         halted task is assigned to. Something like `pulp-admin -vv tasks<br>
>         details<br>
>         --task-id 03842c9d-e053-4a6f-a4c4-2d7302be9c8c`.<br>
><br>
>         Unfortunately you'll have to see the worker in the raw response<br>
>         with -vv<br>
>         because of [0].<br>
><br>
>         [0]: <a href="https://pulp.plan.io/issues/1832" rel="noreferrer" target="_blank">https://pulp.plan.io/issues/1832</a><br>
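><br>
>         As a sketch (the field name here is assumed, so adjust if it differs),<br>
>         the assigned worker shows up in the raw task document that -vv prints,<br>
>         so you can fish it out with something like:<br>
><br>
>         pulp-admin -vv tasks details \<br>
>           --task-id 03842c9d-e053-4a6f-a4c4-2d7302be9c8c 2>&1 | grep -i worker_name<br>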
><br>
>         -Brian<br>
><br>
><br>
>         On 05/03/2016 11:53 AM, Kodiak Firesmith wrote:<br>
>         > I believe you may need to cancel the pending repo sync task before you<br>
>         > can delete the repo.  Maybe try:<br>
>         >  pulp-admin tasks cancel --task-id=2d776d63-fd8a-4e0a-8f32-d2276c85187c<br>
>         >  pulp-admin tasks cancel --task-id=03842c9d-e053-4a6f-a4c4-2d7302be9c8c<br>
>         ><br>
>         > Then:<br>
>         > pulp-admin rpm repo delete --repo-id=rhel-6-server-supplementary-rpms<br>
>         ><br>
>         ><br>
>         > On Tue, May 3, 2016 at 11:47 AM, Mallick, Samiron<br>
>         > <<a href="mailto:samiron.mallick@gmail.com">samiron.mallick@gmail.com</a> <mailto:<a href="mailto:samiron.mallick@gmail.com">samiron.mallick@gmail.com</a>><br>
</div></div>>         <mailto:<a href="mailto:samiron.mallick@gmail.com">samiron.mallick@gmail.com</a><br>
<div><div class="h5">>         <mailto:<a href="mailto:samiron.mallick@gmail.com">samiron.mallick@gmail.com</a>>>> wrote:<br>
>         ><br>
>         >     Could anyone please tell me what went wrong with the<br>
>         >     repository? One of my EL7 servers is registered and was<br>
>         >     able to fetch content from the CDN. Recently I found one of<br>
>         >     the repos stuck after downloading RPMs; it never finishes.<br>
>         >     I rebooted my server, cancelled tasks, deleted the repo and<br>
>         >     recreated it, but no luck. Now if I run a sync, it goes<br>
>         >     directly to the waiting stage. Earlier I observed that it<br>
>         >     started the task, but the start time was the same as when I<br>
>         >     first ran the job. Even now I am not able to delete the<br>
>         >     repo, as it also shows “Waiting to begin”. I am running<br>
>         >     Pulp v2.8. Any ideas would be greatly appreciated.<br>
>         ><br>
>         ><br>
>         ><br>
>         >     # rpm -qa pulp-server<br>
>         ><br>
>         >     pulp-server-2.8.2-1.el7.noarch<br>
>         ><br>
>         ><br>
>         ><br>
>         >     # pulp-admin rpm repo sync run<br>
>         >     --repo-id=rhel-6-server-supplementary-rpms<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         >           Synchronizing Repository<br>
>         [rhel-6-server-supplementary-rpms]<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         ><br>
>         ><br>
>         >     This command may be exited via ctrl+c without affecting<br>
>         the request.<br>
>         ><br>
>         ><br>
>         ><br>
>         ><br>
>         ><br>
>         >     Downloading metadata...<br>
>         ><br>
>         >     [\]<br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Downloading repository content...<br>
>         ><br>
>         >     [-]<br>
>         ><br>
>         >     [==================================================] 100%<br>
>         ><br>
>         >     RPMs:       0/0 items<br>
>         ><br>
>         >     Delta RPMs: 0/0 items<br>
>         ><br>
>         ><br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Downloading distribution files...<br>
>         ><br>
>         >     [==================================================] 100%<br>
>         ><br>
>         >     Distributions: 0/0 items<br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Importing errata...<br>
>         ><br>
>         >     [/]<br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Importing package groups/categories...<br>
>         ><br>
>         >     [-]<br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Cleaning duplicate packages...<br>
>         ><br>
>         >     [|]<br>
>         ><br>
>         >     ... completed<br>
>         ><br>
>         ><br>
>         ><br>
>         >     *** AND STUCK HERE ***<br>
>         ><br>
>         ><br>
>         ><br>
>         >     # pulp-admin tasks list<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         >                                      Tasks<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Operations:  sync<br>
>         ><br>
>         >     Resources:   rhel-6-server-supplementary-rpms (repository)<br>
>         ><br>
>         >     State:       Running<br>
>         ><br>
>         >     Start Time:  2016-05-03T07:06:36Z<br>
>         ><br>
>         >     Finish Time: Incomplete<br>
>         ><br>
>         >     Task Id:     2d776d63-fd8a-4e0a-8f32-d2276c85187c<br>
>         ><br>
>         ><br>
>         ><br>
>         >     Operations:  publish<br>
>         ><br>
>         >     Resources:   rhel-6-server-supplementary-rpms (repository)<br>
>         ><br>
>         >     State:       Waiting<br>
>         ><br>
>         >     Start Time:  Unstarted<br>
>         ><br>
>         >     Finish Time: Incomplete<br>
>         ><br>
>         >     Task Id:     03842c9d-e053-4a6f-a4c4-2d7302be9c8c<br>
>         ><br>
>         ><br>
>         ><br>
>         >     # date<br>
>         ><br>
>         >     Tue May  3 09:22:30 CEST 2016<br>
>         ><br>
>         >     # pulp-admin rpm repo sync schedules list<br>
>         >     --repo-id=rhel-6-server-supplementary-rpms<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         >                                    Schedules<br>
>         ><br>
>         ><br>
>          +----------------------------------------------------------------------+<br>
>         ><br>
>         ><br>
>         ><br>
>         >     There are no schedules defined for this operation.<br>
>         ><br>
>         ><br>
>         ><br>
>         >     # pulp-admin rpm repo delete<br>
>         --repo-id=rhel-6-server-supplementary-rpms<br>
>         ><br>
>         >     This command may be exited via ctrl+c without affecting<br>
>         the request.<br>
>         ><br>
>         ><br>
>         ><br>
>         ><br>
>         ><br>
>         >     [-]<br>
>         ><br>
>         >     Running...<br>
>         ><br>
>         >     [-]<br>
>         ><br>
>         >     Waiting to begin...<br>
>         ><br>
>         ><br>
>         ><br>
>         >     *** AND STUCK HERE ***<br>
>         ><br>
>         ><br>
>         >     _______________________________________________<br>
>         >     Pulp-list mailing list<br>
>         >     <a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a> <mailto:<a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a>><br>
</div></div>>         <mailto:<a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a> <mailto:<a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a>>><br>
<span class="">>         >     <a href="https://www.redhat.com/mailman/listinfo/pulp-list" rel="noreferrer" target="_blank">https://www.redhat.com/mailman/listinfo/pulp-list</a><br>
>         ><br>
>         ><br>
>         ><br>
>         ><br>
>         > _______________________________________________<br>
>         > Pulp-list mailing list<br>
>         > <a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a> <mailto:<a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a>><br>
>         > <a href="https://www.redhat.com/mailman/listinfo/pulp-list" rel="noreferrer" target="_blank">https://www.redhat.com/mailman/listinfo/pulp-list</a><br>
>         ><br>
><br>
>         _______________________________________________<br>
>         Pulp-list mailing list<br>
</span>>         <a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a> <mailto:<a href="mailto:Pulp-list@redhat.com">Pulp-list@redhat.com</a>><br>
>         <a href="https://www.redhat.com/mailman/listinfo/pulp-list" rel="noreferrer" target="_blank">https://www.redhat.com/mailman/listinfo/pulp-list</a><br>
><br>
><br>
><br>
</blockquote></div><br></div></div>