[Pulp-list] pulp 2.8 repos went into waiting state and never ends
Mallick, Samiron
samiron.mallick at gmail.com
Fri May 6 09:19:11 UTC 2016
*Thank you very much, Brian. Updating python-kombu to 3.0.33-5 resolved the
issue. After the update I ran the sync task several times without any
problems. Please find the logs as requested.*
# rpm -qa | grep kombu
python-kombu-3.0.33-4.pulp.el7.noarch
# sudo qpid-stat -q |grep celeryev
  celeryev.136f85c7-b6fd-4e90-8426-fc73ff86864c      Y   0   2.27k   2.27k   0   1.92m   1.92m   1   2
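Brian's diagnostic later in the thread is that a healthy broker shows the celeryev queue empty, with msgIn equal to msgOut. That check can be scripted against a row of `qpid-stat -q` output; the snippet below is a sketch only (the token positions are inferred from the listings in this thread, not from a stable qpid interface):

```python
def parse_count(tok):
    """Parse qpid-stat's abbreviated counters, e.g. '2.27k', '1.92m', '486'."""
    mult = {"k": 1000, "m": 1000000}
    if tok[-1] in mult:
        return int(float(tok[:-1]) * mult[tok[-1]])
    return int(tok)

def celeryev_is_draining(row):
    """Given one celeryev row of `qpid-stat -q` output (name, flags, then
    msg, msgIn, msgOut, bytes, bytesIn, bytesOut, cons, bind), return True
    when the queue depth is zero and every event taken in has gone out."""
    nums = [t for t in row.split() if t[0].isdigit()]
    msg, msg_in, msg_out = (parse_count(t) for t in nums[:3])
    return msg == 0 and msg_in == msg_out
```

Applied to the row above, depth 0 with msgIn == msgOut == 2.27k reads as healthy; a growing depth with msgIn ahead of msgOut would indicate the event backlog the broken kombu build produced.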
# journalctl -f -l
-- Logs begin at Wed 2016-05-04 11:41:21 CEST. --
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=USER_ACCT
msg=audit(1462368063.769:541): pid=3501 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:accounting grantors=pam_succeed_if acct="root"
exe="/usr/bin/su" hostname=? addr=? terminal=pts/0 res=success'
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=CRED_ACQ
msg=audit(1462368063.770:542): pid=3501 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:setcred grantors=pam_rootok acct="root" exe="/usr/bin/su"
hostname=? addr=? terminal=pts/0 res=success'
May 04 15:21:03 mysrv su[3501]: pam_unix(su-l:session): session opened for
user root by lsagy92iy(uid=0)
May 04 15:21:03 mysrv audispd[975]: node=mysrv type=USER_START
msg=audit(1462368063.775:543): pid=3501 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:session_open
grantors=pam_keyinit,pam_keyinit,pam_limits,pam_systemd,pam_unix,pam_xauth
acct="root" exe="/usr/bin/su" hostname=? addr=? terminal=pts/0 res=success'
May 04 15:22:21 mysrv sudo[3536]: root : TTY=pts/0 ; PWD=/root ;
USER=root ; COMMAND=/bin/qpid-stat -q
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_CMD
msg=audit(1462368141.089:544): pid=3536 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023 msg='cwd="/root"
cmd=717069642D73746174202D71 terminal=pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=CRED_ACQ
msg=audit(1462368141.090:545): pid=3536 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root"
exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_START
msg=audit(1462368141.090:546): pid=3536 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:session_open grantors=pam_keyinit,pam_limits acct="root"
exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=USER_END
msg=audit(1462368141.240:547): pid=3536 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:session_close grantors=pam_keyinit,pam_limits acct="root"
exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
May 04 15:22:21 mysrv audispd[975]: node=mysrv type=CRED_DISP
msg=audit(1462368141.240:548): pid=3536 uid=0 auid=692982 ses=7
subj=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023
msg='op=PAM:setcred grantors=pam_env,pam_localuser,pam_unix acct="root"
exe="/usr/bin/sudo" hostname=? addr=? terminal=/dev/pts/0 res=success'
^C
# rpm -Uvh python-kombu-3.0.33-5.pulp.el7.noarch.rpm
Preparing...                          ################################# [100%]
Updating / installing...
   1:python-kombu-1:3.0.33-5.pulp.el7 ################################# [ 50%]
Cleaning up / removing...
   2:python-kombu-1:3.0.33-4.pulp.el7 ################################# [100%]
# rpm -qa | grep kombu
python-kombu-3.0.33-5.pulp.el7.noarch
# sudo qpid-stat -q |grep celeryev
  celeryev.a2ebe0b7-c01e-41f2-8fae-58498cee82e5      Y   0   109   109   0   91.6k   91.6k   1   2
# pulp-admin status
+----------------------------------------------------------------------+
Status of the server
+----------------------------------------------------------------------+
Api Version: 2
Database Connection:
Connected: True
Known Workers:
_id: scheduler at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:47:15Z
_id: reserved_resource_worker-3 at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:48:15Z
_id: reserved_resource_worker-0 at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:48:15Z
_id: reserved_resource_worker-2 at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:48:15Z
_id: reserved_resource_worker-1 at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:48:18Z
_id: resource_manager at mysrv
_ns: workers
Last Heartbeat: 2016-05-04T13:48:18Z
Messaging Connection:
Connected: True
Versions:
Platform Version: 2.8.2
# qpid-stat -q
Queues
  queue                                             dur  autoDel  excl  msg  msgIn  msgOut  bytes  bytesIn  bytesOut  cons  bind
  ==============================================================================================================================
  05a8bc3d-cf7e-4fae-875b-13b214cf8de6:1.0               Y        Y     0    0      0       0      0        0         1     2
  21c58b6c-8bd4-4ede-86d9-fd65dfe8bc4c:1.0               Y        Y     0    8      8       0      4.90k    4.90k     1     2
  21c58b6c-8bd4-4ede-86d9-fd65dfe8bc4c:2.0               Y        Y     0    4      4       0      2.50k    2.50k     1     2
  372baf00-05a0-40e0-bd16-430433ce0980:1.0               Y        Y     0    2      2       0      486      486       1     2
  39ae3a61-e29e-4032-af1e-19bd822e34e3:1.0               Y        Y     0    8      8       0      4.88k    4.88k     1     2
  39ae3a61-e29e-4032-af1e-19bd822e34e3:2.0               Y        Y     0    4      4       0      2.52k    2.52k     1     2
  3a7d8a98-86df-471d-9e20-384dd8dacbb2:1.0               Y        Y     0    2      2       0      486      486       1     2
  4fa605d0-5346-4087-852d-266f98d708e4:0.0               Y        Y     0    0      0       0      0        0         1     2
  96497b3d-6fdc-46c7-b784-c3508f97500b:1.0               Y        Y     0    8      8       0      4.88k    4.88k     1     2
  96497b3d-6fdc-46c7-b784-c3508f97500b:2.0               Y        Y     0    4      4       0      2.52k    2.52k     1     2
  a2a2aaee-797c-4b47-853d-d60b80bfac69:1.0               Y        Y     0    5      5       0      2.73k    2.73k     1     2
  b1918f13-ff85-45ec-91a7-67dceea00fb2:1.0               Y        Y     0    2      2       0      486      486       1     2
  c209635c-dba5-43b5-bc7e-284c342e0ed0:1.0               Y        Y     0    8      8       0      4.88k    4.88k     1     2
  c209635c-dba5-43b5-bc7e-284c342e0ed0:2.0               Y        Y     0    4      4       0      2.46k    2.46k     1     2
  celery                                            Y                   0    0      0       0      0        0         4     2
  celeryev.a2ebe0b7-c01e-41f2-8fae-58498cee82e5          Y              0    144    144     0      121k     121k      1     2
  d78d70a1-7ecd-4375-8a3a-ad7de2f60fa3:1.0               Y        Y     0    8      8       0      4.88k    4.88k     1     2
  d78d70a1-7ecd-4375-8a3a-ad7de2f60fa3:2.0               Y        Y     0    4      4       0      2.52k    2.52k     1     2
  f2fea2ac-4468-4e4a-9843-86cfab4157f2:1.0               Y        Y     0    2      2       0      486      486       1     2
  f9e48a42-8b2c-4378-b90a-1632ff8dbba8:1.0               Y        Y     0    2      2       0      486      486       1     2
  pulp.task                                         Y                   0    0      0       0      0        0         3     1
  reserved_resource_worker-0 at mysrv.celery.pidbox      Y              0    1      1       0      449      449       1     2
  reserved_resource_worker-0 at mysrv.dq              Y    Y              0    0      0       0      0        0         1     2
  reserved_resource_worker-1 at mysrv.celery.pidbox      Y              0    1      1       0      449      449       1     2
  reserved_resource_worker-1 at mysrv.dq              Y    Y              0    0      0       0      0        0         1     2
  reserved_resource_worker-2 at mysrv.celery.pidbox      Y              0    1      1       0      449      449       1     2
  reserved_resource_worker-2 at mysrv.dq              Y    Y              0    0      0       0      0        0         1     2
  reserved_resource_worker-3 at mysrv.celery.pidbox      Y              0    1      1       0      449      449       1     2
  reserved_resource_worker-3 at mysrv.dq              Y    Y              0    0      0       0      0        0         1     2
  resource_manager                                  Y                   0    0      0       0      0        0         1     2
  resource_manager at mysrv.celery.pidbox                Y              0    0      0       0      0        0         1     2
  resource_manager at mysrv.dq                        Y    Y              0    0      0       0      0        0         1     2
# pulp-admin -vv tasks details --task-id bf077e15-09f0-45d2-98d7-17ce953248d1
+----------------------------------------------------------------------+
Task Details
+----------------------------------------------------------------------+
2016-05-04 15:59:19,827 - DEBUG - sending GET request to
/pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/
2016-05-04 15:59:19,950 - INFO - GET request to
/pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/ with parameters
None
2016-05-04 15:59:19,950 - INFO - Response status : 200
2016-05-04 15:59:19,950 - INFO - Response body :
{
"exception": null,
"task_type": "pulp.server.managers.repo.sync.sync",
"_href": "/pulp/api/v2/tasks/bf077e15-09f0-45d2-98d7-17ce953248d1/",
"task_id": "bf077e15-09f0-45d2-98d7-17ce953248d1",
"tags": [
"pulp:repository:rhel-6-server-rpms",
"pulp:action:sync"
],
"finish_time": null,
"_ns": "task_status",
"start_time": "2016-05-04T13:51:26Z",
"traceback": null,
"spawned_tasks": [],
"progress_report": {
"yum_importer": {
"content": {
"size_total": 0,
"items_left": 0,
"items_total": 0,
"state": "IN_PROGRESS",
"size_left": 0,
"details": {
"rpm_total": 0,
"rpm_done": 0,
"drpm_total": 0,
"drpm_done": 0
},
"error_details": []
},
"comps": {
"state": "NOT_STARTED"
},
"purge_duplicates": {
"state": "NOT_STARTED"
},
"distribution": {
"items_total": 0,
"state": "NOT_STARTED",
"error_details": [],
"items_left": 0
},
"errata": {
"state": "NOT_STARTED"
},
"metadata": {
"state": "FINISHED"
}
}
},
"queue": "reserved_resource_worker-2 at mysrv.dq",
"state": "running",
"worker_name": "reserved_resource_worker-2 at mysrv",
"result": null,
"error": null,
"_id": {
"$oid": "5729fe5e89348df3077e3e52"
},
"id": "5729fe5e89348df3077e3e52"
}
Operations: sync
Resources: rhel-6-server-rpms (repository)
State: Running
Start Time: 2016-05-04T13:51:26Z
Finish Time: Incomplete
Result: Incomplete
Task Id: bf077e15-09f0-45d2-98d7-17ce953248d1
Progress Report:
Yum Importer:
Comps:
State: NOT_STARTED
Content:
Details:
Drpm Done: 0
Drpm Total: 0
Rpm Done: 0
Rpm Total: 0
Error Details:
Items Left: 0
Items Total: 0
Size Left: 0
Size Total: 0
State: IN_PROGRESS
Distribution:
Error Details:
Items Left: 0
Items Total: 0
State: NOT_STARTED
Errata:
State: NOT_STARTED
Metadata:
State: FINISHED
Purge Duplicates:
State: NOT_STARTED
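This thread shows two distinct failure signatures: a sync stuck in "running" with an empty content report (above), and one stuck in "waiting" that never received a start_time (quoted further down). Telling them apart from the REST payload can be mechanized; the helper below is a sketch against the fields quoted in this thread, not an official Pulp API client:

```python
def classify_task(task):
    """Roughly classify a Pulp 2 task-status document, as returned by
    GET /pulp/api/v2/tasks/<task_id>/.  Illustrative only."""
    if task.get("finish_time"):
        return "finished"
    if task.get("state") == "waiting" and not task.get("start_time"):
        # Never dispatched to a worker -- the symptom in this thread.
        return "stuck-waiting"
    return task.get("state") or "unknown"

# The two payloads quoted in this thread (abridged):
running = {"state": "running", "start_time": "2016-05-04T13:51:26Z",
           "finish_time": None}
waiting = {"state": "waiting", "start_time": None, "finish_time": None}
```

A "stuck-waiting" result points at the message broker (the task never reached its worker queue), while a stalled "running" task points at the worker or the sync itself.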
Thanks,
Samiron
On Wed, May 4, 2016 at 6:05 PM, Brian Bouterse <bbouters at redhat.com> wrote:
> After trying the things from my other e-mail, please show the `qpid-stat -q`
> output as well once your system is in its "bad state".
>
> The `qpid-stat -q` output from this e-mail shows all of the queues are
> empty with msgIn and msgOut being the same.
>
> -Brian
>
> On 05/04/2016 05:07 AM, Mallick, Samiron wrote:
> > *It seems the queues had something in them. I deleted them one by one and
> > tried starting the stuck repo sync, and it finally worked for that repo.
> > The bad news is that lots of tasks are regenerated in the queues
> > automatically even when no sync tasks are running, so when I tried to run
> > another repo it went to Waiting again. Is there any workaround yet?*
> >
> >
> >
> > # qpid-stat -q
> >
> > Queues
> >   queue                                             dur  autoDel  excl  msg  msgIn  msgOut  bytes  bytesIn  bytesOut  cons  bind
> >   ==============================================================================================================================
> >   1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0               Y        Y     0    2      2       0      486      486       1     2
> >   36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0               Y        Y     0    8      8       0      4.91k    4.91k     1     2
> >   36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0               Y        Y     0    4      4       0      2.50k    2.50k     1     2
> >   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0               Y        Y     0    8      8       0      4.88k    4.88k     1     2
> >   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0               Y        Y     0    4      4       0      2.52k    2.52k     1     2
> >   43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0               Y        Y     0    2      2       0      486      486       1     2
> >   4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0               Y        Y     0    0      0       0      0        0         1     2
> >   680eb17a-8285-450c-b8b9-51d107b4ff2d:0.0               Y        Y     0    0      0       0      0        0         1     2
> >   bcbc1fa3-8157-403d-8f33-252fe057587a:1.0               Y        Y     0    5      5       0      2.67k    2.67k     1     2
> >   celery                                            Y                   0    0      0       0      0        0         1     2
> >   celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b          Y              0    12     12      0      10.0k    10.0k     1     2
> >   pulp.task                                         Y                   0    0      0       0      0        0         3     1
> >   reserved_resource_worker-0 at mysrv.celery.pidbox      Y              0    0      0       0      0        0         1     2
> >   reserved_resource_worker-0 at mysrv.dq              Y    Y              0    0      0       0      0        0         1     2
> >   resource_manager                                  Y                   0    0      0       0      0        0         1     2
> >   resource_manager at mysrv.celery.pidbox                Y              0    0      0       0      0        0         1     2
> >   resource_manager at mysrv.dq                        Y    Y              0    0      0       0      0        0         1     2
> >
> > # pulp-admin tasks list
> >
> > +----------------------------------------------------------------------+
> >
> > Tasks
> >
> > +----------------------------------------------------------------------+
> >
> >
> >
> > No tasks found
> >
> >
> >
> >
> >
> >
> >
> > # qpid-tool
> >
> > Management Tool for QPID
> >
> > qpid: list
> >
> > Summary of Objects by Type:
> >
> > qpid: help
> >
> > Management Tool for QPID
> >
> > Commands:
> >     agents                          - Print a list of the known Agents
> >     list                            - Print summary of existing objects by class
> >     list <className>                - Print list of objects of the specified class
> >     list <className> active         - Print list of non-deleted objects of the specified class
> >     show <ID>                       - Print contents of an object (infer className)
> >     call <ID> <methodName> [<args>] - Invoke a method on an object
> >     schema                          - Print summary of object classes seen on the target
> >     schema <className>              - Print details of an object class
> >     set time-format short           - Select short timestamp format (default)
> >     set time-format long            - Select long timestamp format
> >     quit or ^D                      - Exit the program
> >
> >
> >
> > qpid: list
> >
> > Summary of Objects by Type:
> >
> > Package Class Active Deleted
> >
> > ============================================================
> >
> > org.apache.qpid.broker binding 43 12
> >
> > org.apache.qpid.broker broker 1 0
> >
> > org.apache.qpid.broker memory 1 0
> >
> > org.apache.qpid.broker system 1 0
> >
> > org.apache.qpid.linearstore store 1 0
> >
> > org.apache.qpid.broker subscription 23 5
> >
> > org.apache.qpid.broker connection 14 1
> >
> > org.apache.qpid.broker session 19 1
> >
> > org.apache.qpid.linearstore journal 5 0
> >
> > org.apache.qpid.acl acl 1 0
> >
> > org.apache.qpid.broker queue 21 5
> >
> > org.apache.qpid.broker exchange 13 0
> >
> > org.apache.qpid.broker vhost 1 0
> >
> > qpid: list queue
> >
> > Object Summary:
> >     ID   Created   Destroyed  Index
> >     ==========================================================================================
> >     114  06:24:09  06:24:41   org.apache.qpid.broker:queue:topic-mysrv.3108.1
> >     115  06:24:09  06:24:41   org.apache.qpid.broker:queue:reply-mysrv.3108.1
> >     116  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3108.1
> >     117  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-mysrv.3108.1
> >     118  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3108.1
> >     198  06:16:36  -          org.apache.qpid.broker:queue:1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
> >     199  06:16:36  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0
> >     200  06:16:38  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0
> >     201  06:16:36  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0
> >     202  06:16:37  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0
> >     203  06:16:36  -          org.apache.qpid.broker:queue:43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0
> >     204  06:16:33  -          org.apache.qpid.broker:queue:4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0
> >     205  06:16:33  -          org.apache.qpid.broker:queue:bcbc1fa3-8157-403d-8f33-252fe057587a:1.0
> >     206  06:16:33  -          org.apache.qpid.broker:queue:celery
> >     207  06:16:33  -          org.apache.qpid.broker:queue:celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b
> >     208  06:16:33  -          org.apache.qpid.broker:queue:pulp.task
> >     209  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3122.1
> >     210  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.3122.1
> >     211  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3122.1
> >     212  06:24:43  -          org.apache.qpid.broker:queue:reply-mysrv.3122.1
> >     213  06:16:37  -          org.apache.qpid.broker:queue:reserved_resource_worker-0 at mysrv.celery.pidbox
> >     214  06:16:36  -          org.apache.qpid.broker:queue:reserved_resource_worker-0 at mysrv.dq
> >     215  06:16:33  -          org.apache.qpid.broker:queue:resource_manager
> >     216  06:16:38  -          org.apache.qpid.broker:queue:resource_manager at mysrv.celery.pidbox
> >     217  06:16:37  -          org.apache.qpid.broker:queue:resource_manager at mysrv.dq
> >     218  06:24:43  -          org.apache.qpid.broker:queue:topic-mysrv.3122.1
> >
> >
> >
> > # qpid-config del queue 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
> >
> > Failed: Exception: Exception from Agent: {u'error_code': 7,
> > u'error_text': 'precondition-failed: Cannot delete queue
> > 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0; queue in use
> > (/builddir/build/BUILD/qpid-cpp-0.34/src/qpid/broker/Broker.cpp:1068)'}
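The "queue in use" failure above is expected while a live consumer still holds the queue; cleanup of Pulp's own celery queues is normally done with the pulp services stopped first. Picking out which names from a `qpid-stat -q` listing belong to the celery machinery can be scripted; the patterns below are taken from the listings in this thread, and this is a sketch rather than an official cleanup tool:

```python
import re

# Queue-name patterns of Pulp 2's celery stack, as seen in the qpid-stat
# listings in this thread.  The anonymous UUID:N.0 queues are qpid session
# queues owned by live connections and should be left alone.
PULP_QUEUE_PATTERNS = [
    r"^celery$",
    r"^celeryev\.",
    r"^pulp\.task$",
    r"^resource_manager",
    r"^reserved_resource_worker-\d+",
]

def pulp_queues(names):
    """Return the queue names that belong to Pulp's celery machinery."""
    return [n for n in names
            if any(re.match(p, n) for p in PULP_QUEUE_PATTERNS)]
```

Each returned name could then be passed to `qpid-config del queue <name> --force`, ideally after stopping pulp_workers, pulp_celerybeat and pulp_resource_manager so no consumer is attached (service names assumed from a stock Pulp 2 install).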
> >
> >
> >
> > # qpid-config del queue resource_manager at mysrv.dq --force
> >
> >
> >
> > qpid: list
> >
> > Summary of Objects by Type:
> >
> > Package Class Active Deleted
> >
> > ============================================================
> >
> > org.apache.qpid.broker binding 16 0
> >
> > org.apache.qpid.broker broker 1 0
> >
> > org.apache.qpid.broker memory 1 0
> >
> > org.apache.qpid.broker system 1 0
> >
> > org.apache.qpid.linearstore store 1 0
> >
> > org.apache.qpid.broker subscription 7 0
> >
> > org.apache.qpid.broker connection 13 0
> >
> > org.apache.qpid.broker session 269 0
> >
> > org.apache.qpid.acl acl 1 0
> >
> > org.apache.qpid.broker queue 7 0
> >
> > org.apache.qpid.broker exchange 13 0
> >
> > org.apache.qpid.broker vhost 1 0
> >
> > qpid: list queue
> >
> > Object Summary:
> >     ID   Created   Destroyed  Index
> >     ==========================================================================================
> >     146  08:47:30  -          org.apache.qpid.broker:queue:2d1a7c8f-bc3b-4d54-bbe6-b7b264530506:1.0
> >     147  08:47:30  -          org.apache.qpid.broker:queue:celeryev.d45c6bc2-2449-4700-b3bb-bbbbf0b2990b
> >     148  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.4080.1
> >     149  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.4080.1
> >     150  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.4080.1
> >     151  08:52:24  -          org.apache.qpid.broker:queue:reply-mysrv.4080.1
> >     152  08:52:24  -          org.apache.qpid.broker:queue:topic-mysrv.4080.1
> >
> >
> >
> > # pulp-admin tasks list
> >
> > +----------------------------------------------------------------------+
> >
> > Tasks
> >
> > +----------------------------------------------------------------------+
> >
> >
> >
> > No tasks found
> >
> >
> >
> > # pulp-admin rpm repo sync run --repo-id=rhel-6-server-rpms
> >
> > +----------------------------------------------------------------------+
> >
> > Synchronizing Repository [rhel-6-server-rpms]
> >
> > +----------------------------------------------------------------------+
> >
> >
> >
> > This command may be exited via ctrl+c without affecting the request.
> >
> >
> >
> >
> >
> > [\]
> >
> > *** STUCK ***
> >
> >
> >
> > On Wed, May 4, 2016 at 9:37 AM, Mallick, Samiron
> > <samiron.mallick at gmail.com> wrote:
> >
> > Hey Brian, thanks for the reply.
> >
> > *From the output below I can see that "reserved_resource_worker-1" is
> > responsible for this task, and I have 4 workers displayed on the
> > server.*
> >
> > # pulp-admin tasks list
> > +----------------------------------------------------------------------+
> > Tasks
> > +----------------------------------------------------------------------+
> >
> > Operations: sync
> > Resources: rhel-6-server-supplementary-rpms (repository)
> > State: Waiting
> > Start Time: Unstarted
> > Finish Time: Incomplete
> > Task Id: 49b83f70-e6d6-4cdb-9c5a-93c20c31d697
> >
> > # pulp-admin -vv tasks details --task-id 49b83f70-e6d6-4cdb-9c5a-93c20c31d697
> > +----------------------------------------------------------------------+
> > Task Details
> > +----------------------------------------------------------------------+
> >
> > 2016-05-04 04:55:33,231 - DEBUG - sending GET request to
> > /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/
> > 2016-05-04 04:55:33,362 - INFO - GET request to
> > /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/ with parameters None
> > 2016-05-04 04:55:33,362 - INFO - Response status : 200
> > 2016-05-04 04:55:33,363 - INFO - Response body :
> > {
> > "exception": null,
> > "task_type": "pulp.server.managers.repo.sync.sync",
> > "_href": "/pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/",
> > "task_id": "49b83f70-e6d6-4cdb-9c5a-93c20c31d697",
> > "tags": [
> > "pulp:repository:rhel-6-server-supplementary-rpms",
> > "pulp:action:sync"
> > ],
> > "finish_time": null,
> > "_ns": "task_status",
> > "start_time": null,
> > "traceback": null,
> > "spawned_tasks": [],
> > "progress_report": {},
> > "queue": "reserved_resource_worker-1 at mysrv.dq",
> > "state": "waiting",
> > "worker_name": "reserved_resource_worker-1 at mysrv",
> > "result": null,
> > "error": null,
> > "_id": {
> > "$oid": "572964399b70a2ea1d2694aa"
> > },
> > "id": "572964399b70a2ea1d2694aa"
> > }
> >
> > Operations: sync
> > Resources: rhel-6-server-supplementary-rpms (repository)
> > State: Waiting
> > Start Time: Unstarted
> > Finish Time: Incomplete
> > Result: Incomplete
> > Task Id: 49b83f70-e6d6-4cdb-9c5a-93c20c31d697
> > Progress Report:
> >
> > # pulp-admin status
> > +----------------------------------------------------------------------+
> > Status of the server
> > +----------------------------------------------------------------------+
> >
> > Api Version: 2
> > Database Connection:
> > Connected: True
> > Known Workers:
> > _id: scheduler at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:53:34Z
> > _id: reserved_resource_worker-3 at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:54:00Z
> > _id: reserved_resource_worker-2 at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:54:00Z
> > _id: resource_manager at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:54:00Z
> > _id: reserved_resource_worker-1 at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:54:01Z
> > _id: reserved_resource_worker-0 at mysrv
> > _ns: workers
> > Last Heartbeat: 2016-05-04T02:54:03Z
> > Messaging Connection:
> > Connected: True
> > Versions:
> > Platform Version: 2.8.2
> >
> > # ps -awfux | grep celery
> > root      4637  0.0  0.0 112644    960 pts/0 S+  04:56  0:00 \_ grep --color=auto celery
> > apache    1592  0.0  1.4 667716  56368 ?     Ssl May03  0:26 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
> > apache    2921  0.0  1.4 667664  54296 ?     Sl  May03  0:13 \_ /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
> > apache    1616  0.0  1.4 667996  56400 ?     Ssl May03  0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
> > apache    2919  0.0  1.4 741536  54564 ?     Sl  May03  0:11 \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
> > apache    1626  0.0  1.5 668560  59524 ?     Ssl May03  0:29 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
> > apache    4561  0.0  1.4 668560  56260 ?     S   04:47  0:00 \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
> > apache    1631  0.0  1.5 667748  58508 ?     Ssl May03  0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
> > apache    2922  4.2  8.0 1042956 311476 ?    Sl  May03 48:25 \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
> > apache    1637  0.0  1.4 667744  56368 ?     Ssl May03  0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
> > apache    2920  0.0  1.4 815420  54760 ?     Sl  May03  0:13 \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
> > apache    4620  6.5  0.8 663652  31432 ?     Ssl 04:56  0:00 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler
> >
> > *As I saw errors in the output of pulp_worker-1, I restarted each worker
> > individually, and all the errors seem to be gone.*
> >
> > # systemctl status pulp_workers.service
> > ● pulp_workers.service - Pulp Celery Workers
> >    Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
> >    Active: active (exited) since Wed 2016-05-04 05:36:38 CEST; 3s ago
> >   Process: 5717 ExecStop=/usr/bin/python -m pulp.server.async.manage_workers stop (code=exited, status=0/SUCCESS)
> >   Process: 5731 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
> >  Main PID: 5731 (code=exited, status=0/SUCCESS)
> >
> > May 04 05:36:38 mysrv systemd[1]: Starting Pulp Celery Workers...
> > May 04 05:36:38 mysrv systemd[1]: Started Pulp Celery Workers.
> >
> > # systemctl status pulp_worker-0
> > ● pulp_worker-0.service - Pulp Worker #0
> >    Loaded: loaded (/run/systemd/system/pulp_worker-0.service; static; vendor preset: disabled)
> >    Active: active (running) since Wed 2016-05-04 05:10:44 CEST; 1min 26s ago
> >  Main PID: 4753 (celery)
> >    CGroup: /system.slice/pulp_worker-0.service
> >            ├─4753 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >            └─4766 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >
> > May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> transport:   qpid://mysrv:5672//
> > May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> results:     disabled
> > May 04 05:10:46 mysrv celery[4753]: - *** --- * --- .> concurrency: 1 (prefork)
> > May 04 05:10:46 mysrv celery[4753]: -- ******* ----
> > May 04 05:10:46 mysrv celery[4753]: --- ***** ----- [queues]
> > May 04 05:10:46 mysrv celery[4753]: -------------- .> celery           exchange=celery(direct) key=celery
> > May 04 05:10:46 mysrv celery[4753]: .> reserved_resource_worker-0 at mysrv.dq exchange=C.dq(direct) key=rese...srv
> > May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> > May 04 05:10:46 mysrv pulp[4753]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
> > May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> > Hint: Some lines were ellipsized, use -l to show in full.
> >
> > # systemctl status pulp_worker-1
> > ● pulp_worker-1.service - Pulp Worker #1
> >    Loaded: loaded (/run/systemd/system/pulp_worker-1.service; static; vendor preset: disabled)
> >    Active: active (running) since Wed 2016-05-04 05:08:16 CEST; 3min 57s ago
> >  Main PID: 4718 (celery)
> >    CGroup: /system.slice/pulp_worker-1.service
> >            ├─4718 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >            └─4733 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...3cc3c36]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...ce7430b]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.job:INFO: Task pulp.server.controllers.repository.download_deferred[aad88f32-...9s: None
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.db.reaper.reap_expired_documents[02...8322faa]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...ddadf87]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...d0cf8c6]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...72edf98]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...1e9e4bc]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...30f8627]
> > May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.queue_downlo...fd23e13]
> > Hint: Some lines were ellipsized, use -l to show in full.
> >
> > # systemctl status pulp_worker-2
> > ● pulp_worker-2.service - Pulp Worker #2
> >    Loaded: loaded (/run/systemd/system/pulp_worker-2.service; static; vendor preset: disabled)
> >    Active: active (running) since Wed 2016-05-04 05:11:06 CEST; 1min 10s ago
> >  Main PID: 4776 (celery)
> >    CGroup: /system.slice/pulp_worker-2.service
> >            ├─4776 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >            └─4789 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >
> > May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> transport:   qpid://mysrv:5672//
> > May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> results:     disabled
> > May 04 05:11:07 mysrv celery[4776]: - *** --- * --- .> concurrency: 1 (prefork)
> > May 04 05:11:07 mysrv celery[4776]: -- ******* ----
> > May 04 05:11:07 mysrv celery[4776]: --- ***** ----- [queues]
> > May 04 05:11:07 mysrv celery[4776]: -------------- .> celery           exchange=celery(direct) key=celery
> > May 04 05:11:07 mysrv celery[4776]: .> reserved_resource_worker-2@<redacted by list administrator> exchange=C.dq(direct) key=rese...srv
> > May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> > May 04 05:11:07 mysrv pulp[4776]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
> > May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> > Hint: Some lines were ellipsized, use -l to show in full.
> >
> > # systemctl status pulp_worker-3
> >
> > ● pulp_worker-3.service - Pulp Worker #3
> >    Loaded: loaded (/run/systemd/system/pulp_worker-3.service; static; vendor preset: disabled)
> >    Active: active (running) since Wed 2016-05-04 05:11:21 CEST; 59s ago
> >  Main PID: 4798 (celery)
> >    CGroup: /system.slice/pulp_worker-3.service
> >            ├─4798 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >            └─4811 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> >
> > May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> transport:   qpid://mysrv:5672//
> > May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> results:     disabled
> > May 04 05:11:22 mysrv celery[4798]: - *** --- * --- .> concurrency: 1 (prefork)
> > May 04 05:11:22 mysrv celery[4798]: -- ******* ----
> > May 04 05:11:22 mysrv celery[4798]: --- ***** ----- [queues]
> > May 04 05:11:22 mysrv celery[4798]: -------------- .> celery exchange=celery(direct) key=celery
> > May 04 05:11:22 mysrv celery[4798]: .> reserved_resource_worker-3@<redacted by list administrator> exchange=C.dq(direct) key=rese...s <http://sim.biz/>rv
> > May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> > May 04 05:11:22 mysrv pulp[4798]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
> > May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
> >
> > Hint: Some lines were ellipsized, use -l to show in full.
> >
> > *Now I have no tasks queued at all. I ran the repo sync again, and again it went to waiting...*
> >
> > # pulp-admin tasks list
> >
> > +----------------------------------------------------------------------+
> >                                 Tasks
> > +----------------------------------------------------------------------+
> >
> > No tasks found
> >
> > # pulp-admin rpm repo sync run --repo-id=rhel-6-server-supplementary-rpms
> >
> > +----------------------------------------------------------------------+
> >          Synchronizing Repository [rhel-6-server-supplementary-rpms]
> > +----------------------------------------------------------------------+
> >
> > This command may be exited via ctrl+c without affecting the request.
> >
> > [/]
> >
> > Waiting to begin...
> >
> >
> > On Wed, May 4, 2016 at 1:52 AM, Brian Bouterse <bbouters at redhat.com
> > <mailto:bbouters at redhat.com>> wrote:
> >
> >     Kodiak is right that the second task stuck at "Waiting to Begin" is
> >     likely waiting behind another operation on that same repo. Canceling
> >     the prior one will likely allow the later one to start.
> >
> >     How many workers are running, and how many do you expect? You can see
> >     what Pulp thinks with: `pulp-admin status`
> >
> >     You can compare that to the pulp processes on all of your Pulp servers
> >     with `sudo ps -awfux | grep celery`.
> >
> >     You can also look at the task details with -vv to see which worker the
> >     halted task is assigned to. Something like `pulp-admin -vv tasks
> >     details --task-id 03842c9d-e053-4a6f-a4c4-2d7302be9c8c`.
> >
> >     Unfortunately you'll have to see the worker in the raw response with
> >     -vv because of [0].
> >
> >     [0]: https://pulp.plan.io/issues/1832
> >
> >     -Brian
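
[Editor's note: the checks Brian describes can be collected into one sketch. `pulp-admin status` and the `ps` invocation are the commands named in his message; `count_workers` is a hypothetical helper, added here only to make comparing the two views easier.]

```shell
# Commands from the message above:
#
#   pulp-admin status            # the workers Pulp believes are alive
#   sudo ps -awfux | grep celery # the celery processes actually running
#
# Hypothetical helper: count distinct reserved-resource worker parents
# in ps output, to compare against what `pulp-admin status` reports.
count_workers() {
    grep -o 'reserved_resource_worker-[0-9]*@' | sort -u | wc -l
}
```

Usage: `sudo ps -awfux | grep celery | count_workers` should print the same number of workers that `pulp-admin status` lists; a mismatch suggests a dead or missing worker.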
> >
> >
> > On 05/03/2016 11:53 AM, Kodiak Firesmith wrote:
> > > I believe you may need to cancel the pending repo sync task before you
> > > can delete the repo. Maybe try:
> > > pulp-admin tasks cancel --task-id=2d776d63-fd8a-4e0a-8f32-d2276c85187c
> > > pulp-admin tasks cancel --task-id=03842c9d-e053-4a6f-a4c4-2d7302be9c8c
> > >
> > > Then:
> > > pulp-admin rpm repo delete --repo-id=rhel-6-server-supplementary-rpms
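
[Editor's note: Kodiak's cancel-then-delete sequence as a dry-run script. `cancel_and_delete` is a hypothetical helper: it only prints the pulp-admin commands so they can be reviewed first; pipe its output to `sh` to actually run them.]

```shell
# Print the cancel commands for each stuck task, then the repo delete.
cancel_and_delete() {
    repo=$1
    shift
    for task in "$@"; do
        echo "pulp-admin tasks cancel --task-id=$task"
    done
    echo "pulp-admin rpm repo delete --repo-id=$repo"
}

# The repo and task IDs from the original report:
cancel_and_delete rhel-6-server-supplementary-rpms \
    2d776d63-fd8a-4e0a-8f32-d2276c85187c \
    03842c9d-e053-4a6f-a4c4-2d7302be9c8c
```

Review the printed commands, then re-run with `| sh` appended to execute them.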
> > >
> > >
> > > On Tue, May 3, 2016 at 11:47 AM, Mallick, Samiron
> > > <samiron.mallick at gmail.com <mailto:samiron.mallick at gmail.com>> wrote:
> > >
> > >     Could anyone please tell me what went wrong with this repository?
> > >     One of my EL7 servers is registered and was able to fetch content
> > >     from the CDN. Recently I found that one of the repos gets stuck
> > >     after downloading RPMs; it never finishes. I rebooted my server,
> > >     cancelled tasks, deleted the repo and recreated it, but no luck.
> > >     Now if I run a sync, it goes directly to the waiting stage. Earlier
> > >     I observed that it would start the task, but the start time was the
> > >     same as when I first ran the job. Even now I am not able to delete
> > >     the repo either, as it is showing "Waiting to begin". I am running
> > >     Pulp v2.8. Any idea would be greatly appreciated.
> > >
> > >
> > >
> > > # rpm -qa pulp-server
> > >
> > > pulp-server-2.8.2-1.el7.noarch
> > >
> > >
> > >
> > > # pulp-admin rpm repo sync run --repo-id=rhel-6-server-supplementary-rpms
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > > Synchronizing Repository [rhel-6-server-supplementary-rpms]
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > >
> > >
> > > This command may be exited via ctrl+c without affecting the request.
> > >
> > >
> > >
> > >
> > >
> > > Downloading metadata...
> > >
> > > [\]
> > >
> > > ... completed
> > >
> > >
> > >
> > > Downloading repository content...
> > >
> > > [-]
> > >
> > > [==================================================] 100%
> > >
> > > RPMs: 0/0 items
> > >
> > > Delta RPMs: 0/0 items
> > >
> > >
> > >
> > > ... completed
> > >
> > >
> > >
> > > Downloading distribution files...
> > >
> > > [==================================================] 100%
> > >
> > > Distributions: 0/0 items
> > >
> > > ... completed
> > >
> > >
> > >
> > > Importing errata...
> > >
> > > [/]
> > >
> > > ... completed
> > >
> > >
> > >
> > > Importing package groups/categories...
> > >
> > > [-]
> > >
> > > ... completed
> > >
> > >
> > >
> > > Cleaning duplicate packages...
> > >
> > > [|]
> > >
> > > ... completed
> > >
> > >
> > >
> > > *** AND STUCK HERE ***
> > >
> > >
> > >
> > > # pulp-admin tasks list
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > > Tasks
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > >
> > >
> > > Operations: sync
> > >
> > > Resources: rhel-6-server-supplementary-rpms (repository)
> > >
> > > State: Running
> > >
> > > Start Time: 2016-05-03T07:06:36Z
> > >
> > > Finish Time: Incomplete
> > >
> > > Task Id: 2d776d63-fd8a-4e0a-8f32-d2276c85187c
> > >
> > >
> > >
> > > Operations: publish
> > >
> > > Resources: rhel-6-server-supplementary-rpms (repository)
> > >
> > > State: Waiting
> > >
> > > Start Time: Unstarted
> > >
> > > Finish Time: Incomplete
> > >
> > > Task Id: 03842c9d-e053-4a6f-a4c4-2d7302be9c8c
> > >
> > >
> > >
> > > # date
> > >
> > > Tue May 3 09:22:30 CEST 2016
> > >
> > > # pulp-admin rpm repo sync schedules list --repo-id=rhel-6-server-supplementary-rpms
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > > Schedules
> > >
> > >
> > > +----------------------------------------------------------------------+
> > >
> > >
> > >
> > > There are no schedules defined for this operation.
> > >
> > >
> > >
> > > # pulp-admin rpm repo delete --repo-id=rhel-6-server-supplementary-rpms
> > >
> > > This command may be exited via ctrl+c without affecting the request.
> > >
> > >
> > >
> > >
> > >
> > > [-]
> > >
> > > Running...
> > >
> > > [-]
> > >
> > > Waiting to begin...
> > >
> > >
> > >
> > > *** AND STUCK HERE ***
> > >
> > >
> > > _______________________________________________
> > > Pulp-list mailing list
> > > Pulp-list at redhat.com <mailto:Pulp-list at redhat.com>
> > > https://www.redhat.com/mailman/listinfo/pulp-list
> > >
> > >
> > >
> > >
> >
> >
> >
> >
>