[Pulp-list] pulp 2.8 repos went into waiting state and never ends

Brian Bouterse bbouters at redhat.com
Wed May 4 12:35:57 UTC 2016


After trying the things from my other e-mail, please also show the
`qpid-stat -q` output when your system is in its "bad state".

The `qpid-stat -q` output from this e-mail shows all of the queues are
empty with msgIn and msgOut being the same.

-Brian

On 05/04/2016 05:07 AM, Mallick, Samiron wrote:
> *It seems the queues had something in them. I deleted them one by one and
> tried starting the stuck repo sync again, and it finally worked for that
> repo. The bad news is that the queue again regenerates a lot of tasks
> automatically even though no sync tasks are running, so when I tried to run
> another repo it went into Waiting again. Is there any workaround yet?*
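> 
> In short, the sequence was roughly the following (the queue name and repo id
> are placeholders here; --force was only needed where the broker refused
> because the queue was still in use):
> 
> # qpid-stat -q
> # qpid-config del queue <queue-name> --force
> # pulp-admin rpm repo sync run --repo-id=<repo-id>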
> 
>  
> 
> # qpid-stat -q
> 
> Queues
> 
>   queue                                             dur  autoDel  excl  msg  msgIn  msgOut  bytes  bytesIn  bytesOut  cons  bind
>   ===============================================================================================================================
>   1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0                Y        Y     0    2      2       0      486      486       1     2
>   36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0                Y        Y     0    8      8       0      4.91k    4.91k     1     2
>   36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0                Y        Y     0    4      4       0      2.50k    2.50k     1     2
>   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0                Y        Y     0    8      8       0      4.88k    4.88k     1     2
>   3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0                Y        Y     0    4      4       0      2.52k    2.52k     1     2
>   43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0                Y        Y     0    2      2       0      486      486       1     2
>   4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0                Y        Y     0    0      0       0      0        0         1     2
>   680eb17a-8285-450c-b8b9-51d107b4ff2d:0.0                Y        Y     0    0      0       0      0        0         1     2
>   bcbc1fa3-8157-403d-8f33-252fe057587a:1.0                Y        Y     0    5      5       0      2.67k    2.67k     1     2
>   celery                                             Y                   0    0      0       0      0        0         1     2
>   celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b           Y              0    12     12      0      10.0k    10.0k     1     2
>   pulp.task                                          Y                   0    0      0       0      0        0         3     1
>   reserved_resource_worker-0 at mysrv.celery.pidbox          Y              0    0      0       0      0        0         1     2
>   reserved_resource_worker-0 at mysrv.dq                Y    Y              0    0      0       0      0        0         1     2
>   resource_manager                                   Y                   0    0      0       0      0        0         1     2
>   resource_manager at mysrv.celery.pidbox                    Y              0    0      0       0      0        0         1     2
>   resource_manager at mysrv.dq                          Y    Y              0    0      0       0      0        0         1     2
> 
> # pulp-admin tasks list
> 
> +----------------------------------------------------------------------+
> 
>                                  Tasks
> 
> +----------------------------------------------------------------------+
> 
>  
> 
> No tasks found
> 
>  
> 
>  
> 
>  
> 
> # qpid-tool
> 
> Management Tool for QPID
> 
> qpid: list
> 
> Summary of Objects by Type:
> 
> qpid: help
> 
> Management Tool for QPID
> 
>  
> 
> Commands:
> 
>     agents                          - Print a list of the known Agents
> 
>     list                            - Print summary of existing objects
> by class
> 
>     list <className>                - Print list of objects of the
> specified class
> 
>     list <className> active         - Print list of non-deleted objects
> of the specified class
> 
>     show <ID>                       - Print contents of an object (infer
> className)
> 
>     call <ID> <methodName> [<args>] - Invoke a method on an object
> 
>     schema                          - Print summary of object classes
> seen on the target
> 
>     schema <className>              - Print details of an object class
> 
>     set time-format short           - Select short timestamp format
> (default)
> 
>     set time-format long            - Select long timestamp format
> 
>     quit or ^D                      - Exit the program
> 
>  
> 
> qpid: list
> 
> Summary of Objects by Type:
> 
>     Package                      Class         Active  Deleted
> 
>     ============================================================
> 
>     org.apache.qpid.broker       binding       43      12
> 
>     org.apache.qpid.broker       broker        1       0
> 
>     org.apache.qpid.broker       memory        1       0
> 
>     org.apache.qpid.broker       system        1       0
> 
>     org.apache.qpid.linearstore  store         1       0
> 
>     org.apache.qpid.broker       subscription  23      5
> 
>     org.apache.qpid.broker       connection    14      1
> 
>     org.apache.qpid.broker       session       19      1
> 
>     org.apache.qpid.linearstore  journal       5       0
> 
>     org.apache.qpid.acl          acl           1       0
> 
>     org.apache.qpid.broker       queue         21      5
> 
>     org.apache.qpid.broker       exchange      13      0
> 
>     org.apache.qpid.broker       vhost         1       0
> 
> qpid: list queue
> 
> Object Summary:
> 
>     ID   Created   Destroyed  Index
>     ============================================================================================
>     114  06:24:09  06:24:41   org.apache.qpid.broker:queue:topic-mysrv.3108.1
>     115  06:24:09  06:24:41   org.apache.qpid.broker:queue:reply-mysrv.3108.1
>     116  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3108.1
>     117  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-mysrv.3108.1
>     118  06:24:09  06:24:41   org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3108.1
>     198  06:16:36  -          org.apache.qpid.broker:queue:1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
>     199  06:16:36  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:1.0
>     200  06:16:38  -          org.apache.qpid.broker:queue:36e7ca4b-5a0d-4f5c-9f94-a22016390562:2.0
>     201  06:16:36  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:1.0
>     202  06:16:37  -          org.apache.qpid.broker:queue:3de2643d-bb8d-4e98-94d1-d8ed4e1bdf11:2.0
>     203  06:16:36  -          org.apache.qpid.broker:queue:43099b2b-cc78-4b96-a1a9-50d94517c1e2:1.0
>     204  06:16:33  -          org.apache.qpid.broker:queue:4409c371-0d54-44c4-94b7-ec0bb7ecfd45:1.0
>     205  06:16:33  -          org.apache.qpid.broker:queue:bcbc1fa3-8157-403d-8f33-252fe057587a:1.0
>     206  06:16:33  -          org.apache.qpid.broker:queue:celery
>     207  06:16:33  -          org.apache.qpid.broker:queue:celeryev.4021d653-24bf-4f06-9aee-aa457c579c4b
>     208  06:16:33  -          org.apache.qpid.broker:queue:pulp.task
>     209  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.3122.1
>     210  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.3122.1
>     211  06:24:43  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.3122.1
>     212  06:24:43  -          org.apache.qpid.broker:queue:reply-mysrv.3122.1
>     213  06:16:37  -          org.apache.qpid.broker:queue:reserved_resource_worker-0 at mysrv.celery.pidbox
>     214  06:16:36  -          org.apache.qpid.broker:queue:reserved_resource_worker-0 at mysrv.dq
>     215  06:16:33  -          org.apache.qpid.broker:queue:resource_manager
>     216  06:16:38  -          org.apache.qpid.broker:queue:resource_manager at mysrv.celery.pidbox
>     217  06:16:37  -          org.apache.qpid.broker:queue:resource_manager at mysrv.dq
>     218  06:24:43  -          org.apache.qpid.broker:queue:topic-mysrv.3122.1
>  
> 
> # qpid-config del queue 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0
> 
> Failed: Exception: Exception from Agent: {u'error_code': 7,
> u'error_text': 'precondition-failed: Cannot delete queue
> 1a4b6e57-3ecc-406d-84cd-29b24a0a6610:1.0; queue in use
> (/builddir/build/BUILD/qpid-cpp-0.34/src/qpid/broker/Broker.cpp:1068)'}
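> 
> (A queue the broker reports as "in use" still has a consumer attached;
> `qpid-stat -u` lists the active subscriptions and should show which session
> is holding it. Otherwise --force can be used, as below.)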
> 
>  
> 
> # qpid-config del queue resource_manager at mysrv.dq --force
> 
>  
> 
> qpid: list
> 
> Summary of Objects by Type:
> 
>     Package                      Class         Active  Deleted
> 
>     ============================================================
> 
>     org.apache.qpid.broker       binding       16      0
> 
>     org.apache.qpid.broker       broker        1       0
> 
>     org.apache.qpid.broker       memory        1       0
> 
>     org.apache.qpid.broker       system        1       0
> 
>     org.apache.qpid.linearstore  store         1       0
> 
>     org.apache.qpid.broker       subscription  7       0
> 
>     org.apache.qpid.broker       connection    13      0
> 
>     org.apache.qpid.broker       session       269     0
> 
>     org.apache.qpid.acl          acl           1       0
> 
>     org.apache.qpid.broker       queue         7       0
> 
>     org.apache.qpid.broker       exchange      13      0
> 
>     org.apache.qpid.broker       vhost         1       0
> 
> qpid: list queue
> 
> Object Summary:
> 
>     ID   Created   Destroyed  Index
>     ============================================================================================
>     146  08:47:30  -          org.apache.qpid.broker:queue:2d1a7c8f-bc3b-4d54-bbe6-b7b264530506:1.0
>     147  08:47:30  -          org.apache.qpid.broker:queue:celeryev.d45c6bc2-2449-4700-b3bb-bbbbf0b2990b
>     148  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-hb-mysrv.4080.1
>     149  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-mysrv.4080.1
>     150  08:52:24  -          org.apache.qpid.broker:queue:qmfc-v2-ui-mysrv.4080.1
>     151  08:52:24  -          org.apache.qpid.broker:queue:reply-mysrv.4080.1
>     152  08:52:24  -          org.apache.qpid.broker:queue:topic-mysrv.4080.1
> 
>  
> 
> # pulp-admin tasks list
> 
> +----------------------------------------------------------------------+
> 
>                                  Tasks
> 
> +----------------------------------------------------------------------+
> 
>  
> 
> No tasks found
> 
>  
> 
> # pulp-admin rpm repo sync run --repo-id=rhel-6-server-rpms
> 
> +----------------------------------------------------------------------+
> 
>              Synchronizing Repository [rhel-6-server-rpms]
> 
> +----------------------------------------------------------------------+
> 
>  
> 
> This command may be exited via ctrl+c without affecting the request.
> 
>  
> 
>  
> 
> [\]
> 
> *** STUCK ***
> 
> 
> 
> On Wed, May 4, 2016 at 9:37 AM, Mallick, Samiron
> <samiron.mallick at gmail.com> wrote:
> 
>     Hey Brian, thanks for the reply.
> 
>     *From the output below I can see that "resource_worker-1" is
>     responsible for this task, and I have 4 workers displayed on the
>     server.*
> 
>     # pulp-admin tasks list
>     +----------------------------------------------------------------------+
>                                      Tasks
>     +----------------------------------------------------------------------+
> 
>     Operations:  sync
>     Resources:   rhel-6-server-supplementary-rpms (repository)
>     State:       Waiting
>     Start Time:  Unstarted
>     Finish Time: Incomplete
>     Task Id:     49b83f70-e6d6-4cdb-9c5a-93c20c31d697
> 
>     # pulp-admin -vv tasks details --task-id 49b83f70-e6d6-4cdb-9c5a-93c20c31d697
>     +----------------------------------------------------------------------+
>                                   Task Details
>     +----------------------------------------------------------------------+
> 
>     2016-05-04 04:55:33,231 - DEBUG - sending GET request to
>     /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/
>     2016-05-04 04:55:33,362 - INFO - GET request to
>     /pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/ with parameters None
>     2016-05-04 04:55:33,362 - INFO - Response status : 200
>     2016-05-04 04:55:33,363 - INFO - Response body :
>     {
>       "exception": null,
>       "task_type": "pulp.server.managers.repo.sync.sync",
>       "_href": "/pulp/api/v2/tasks/49b83f70-e6d6-4cdb-9c5a-93c20c31d697/",
>       "task_id": "49b83f70-e6d6-4cdb-9c5a-93c20c31d697",
>       "tags": [
>         "pulp:repository:rhel-6-server-supplementary-rpms",
>         "pulp:action:sync"
>       ],
>       "finish_time": null,
>       "_ns": "task_status",
>       "start_time": null,
>       "traceback": null,
>       "spawned_tasks": [],
>       "progress_report": {},
>       "queue": "reserved_resource_worker-1 at mysrv.dq",
>       "state": "waiting",
>       "worker_name": "reserved_resource_worker-1 at mysrv",
>       "result": null,
>       "error": null,
>       "_id": {
>         "$oid": "572964399b70a2ea1d2694aa"
>       },
>       "id": "572964399b70a2ea1d2694aa"
>     }
> 
>     Operations:       sync
>     Resources:        rhel-6-server-supplementary-rpms (repository)
>     State:            Waiting
>     Start Time:       Unstarted
>     Finish Time:      Incomplete
>     Result:           Incomplete
>     Task Id:          49b83f70-e6d6-4cdb-9c5a-93c20c31d697
>     Progress Report:
> 
>     # pulp-admin status
>     +----------------------------------------------------------------------+
>                               Status of the server
>     +----------------------------------------------------------------------+
> 
>     Api Version:           2
>     Database Connection:
>       Connected: True
>     Known Workers:
>       _id:            scheduler at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:53:34Z
>       _id:            reserved_resource_worker-3 at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            reserved_resource_worker-2 at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            resource_manager at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:00Z
>       _id:            reserved_resource_worker-1 at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:01Z
>       _id:            reserved_resource_worker-0 at mysrv
>       _ns:            workers
>       Last Heartbeat: 2016-05-04T02:54:03Z
>     Messaging Connection:
>       Connected: True
>     Versions:
>       Platform Version: 2.8.2
> 
>     # ps -awfux | grep celery
>     root      4637  0.0  0.0 112644   960 pts/0    S+   04:56   0:00  \_ grep --color=auto celery
>     apache    1592  0.0  1.4 667716 56368 ?        Ssl  May03   0:26 /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
>     apache    2921  0.0  1.4 667664 54296 ?        Sl   May03   0:13  \_ /usr/bin/python /usr/bin/celery worker -A pulp.server.async.app -n resource_manager@%h -Q resource_manager -c 1 --events --umask 18 --pidfile=/var/run/pulp/resource_manager.pid --heartbeat-interval=30
>     apache    1616  0.0  1.4 667996 56400 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
>     apache    2919  0.0  1.4 741536 54564 ?        Sl   May03   0:11  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-0.pid --heartbeat-interval=30
>     apache    1626  0.0  1.5 668560 59524 ?        Ssl  May03   0:29 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
>     apache    4561  0.0  1.4 668560 56260 ?        S    04:47   0:00  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-1.pid --heartbeat-interval=30
>     apache    1631  0.0  1.5 667748 58508 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
>     apache    2922  4.2  8.0 1042956 311476 ?      Sl   May03  48:25  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-2.pid --heartbeat-interval=30
>     apache    1637  0.0  1.4 667744 56368 ?        Ssl  May03   0:27 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
>     apache    2920  0.0  1.4 815420 54760 ?        Sl   May03   0:13  \_ /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var/run/pulp/reserved_resource_worker-3.pid --heartbeat-interval=30
>     apache    4620  6.5  0.8 663652 31432 ?        Ssl  04:56   0:00 /usr/bin/python /usr/bin/celery beat --app=pulp.server.async.celery_instance.celery --scheduler=pulp.server.async.scheduler.Scheduler
> 
>     *As I saw errors in the output of pulp_worker-1, I restarted each worker
>     individually, and all the errors seem to be gone.*
> 
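>     (Restarting each one individually was just the per-worker systemd units,
>     i.e. something like:
> 
>     # for i in 0 1 2 3; do systemctl restart pulp_worker-$i; done
> 
>     for the four workers shown above, plus pulp_resource_manager and
>     pulp_celerybeat if those need it too.)
> 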
>     # systemctl status pulp_workers.service
>     ● pulp_workers.service - Pulp Celery Workers
>        Loaded: loaded (/usr/lib/systemd/system/pulp_workers.service; enabled; vendor preset: disabled)
>        Active: active (exited) since Wed 2016-05-04 05:36:38 CEST; 3s ago
>       Process: 5717 ExecStop=/usr/bin/python -m pulp.server.async.manage_workers stop (code=exited, status=0/SUCCESS)
>       Process: 5731 ExecStart=/usr/bin/python -m pulp.server.async.manage_workers start (code=exited, status=0/SUCCESS)
>     Main PID: 5731 (code=exited, status=0/SUCCESS)
> 
>     May 04 05:36:38 mysrv systemd[1]: Starting Pulp Celery Workers...
>     May 04 05:36:38 mysrv systemd[1]: Started Pulp Celery Workers.
> 
>     # systemctl status pulp_worker-0
>     ? pulp_worker-0.service - Pulp Worker #0
>        Loaded: loaded (/run/systemd/system/pulp_worker-0.service; static; vendor preset: disabled)
>        Active: active (running) since Wed 2016-05-04 05:10:44 CEST; 1min 26s ago
>     Main PID: 4753 (celery)
>        CGroup: /system.slice/pulp_worker-0.service
>                +-4753 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
>                +-4766 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-0@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> 
>     May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> transport:   qpid://mysrv:5672//
>     May 04 05:10:46 mysrv celery[4753]: - ** ---------- .> results:     disabled
>     May 04 05:10:46 mysrv celery[4753]: - *** --- * --- .> concurrency: 1 (prefork)
>     May 04 05:10:46 mysrv celery[4753]: -- ******* ----
>     May 04 05:10:46 mysrv celery[4753]: --- ***** ----- [queues]
>     May 04 05:10:46 mysrv celery[4753]: -------------- .> celery           exchange=celery(direct) key=celery
>     May 04 05:10:46 mysrv celery[4753]: .> reserved_resource_worker-0 at mysrv.dq exchange=C.dq(direct) key=rese...srv
>     May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     May 04 05:10:46 mysrv pulp[4753]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
>     May 04 05:10:46 mysrv pulp[4753]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     Hint: Some lines were ellipsized, use -l to show in full.
> 
>     # systemctl status pulp_worker-1
>     ? pulp_worker-1.service - Pulp Worker #1
>        Loaded: loaded (/run/systemd/system/pulp_worker-1.service; static; vendor preset: disabled)
>        Active: active (running) since Wed 2016-05-04 05:08:16 CEST; 3min 57s ago
>     Main PID: 4718 (celery)
>        CGroup: /system.slice/pulp_worker-1.service
>                +-4718 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
>                +-4733 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-1@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> 
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...3cc3c36]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...ce7430b]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.job:INFO: Task pulp.server.controllers.repository.download_deferred[aad88f32-...9s: None
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.db.reaper.reap_expired_documents[02...8322faa]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...ddadf87]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...d0cf8c6]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...72edf98]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...1e9e4bc]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.download_def...30f8627]
>     May 04 05:08:23 mysrv pulp[4718]: celery.worker.strategy:INFO: Received task: pulp.server.controllers.repository.queue_downlo...fd23e13]
>     Hint: Some lines were ellipsized, use -l to show in full.
> 
>     # systemctl status pulp_worker-2
>     ? pulp_worker-2.service - Pulp Worker #2
>        Loaded: loaded (/run/systemd/system/pulp_worker-2.service; static; vendor preset: disabled)
>        Active: active (running) since Wed 2016-05-04 05:11:06 CEST; 1min 10s ago
>     Main PID: 4776 (celery)
>        CGroup: /system.slice/pulp_worker-2.service
>                +-4776 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
>                +-4789 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-2@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> 
>     May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> transport:   qpid://mysrv:5672//
>     May 04 05:11:07 mysrv celery[4776]: - ** ---------- .> results:     disabled
>     May 04 05:11:07 mysrv celery[4776]: - *** --- * --- .> concurrency: 1 (prefork)
>     May 04 05:11:07 mysrv celery[4776]: -- ******* ----
>     May 04 05:11:07 mysrv celery[4776]: --- ***** ----- [queues]
>     May 04 05:11:07 mysrv celery[4776]: -------------- .> celery           exchange=celery(direct) key=celery
>     May 04 05:11:07 mysrv celery[4776]: .> reserved_resource_worker-2 at <redacted by list administrator> exchange=C.dq(direct) key=rese...srv
>     May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     May 04 05:11:07 mysrv pulp[4776]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
>     May 04 05:11:07 mysrv pulp[4776]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     Hint: Some lines were ellipsized, use -l to show in full.
> 
>     # systemctl status pulp_worker-3
>     ? pulp_worker-3.service - Pulp Worker #3
>        Loaded: loaded (/run/systemd/system/pulp_worker-3.service; static; vendor preset: disabled)
>        Active: active (running) since Wed 2016-05-04 05:11:21 CEST; 59s ago
>     Main PID: 4798 (celery)
>        CGroup: /system.slice/pulp_worker-3.service
>                +-4798 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
>                +-4811 /usr/bin/python /usr/bin/celery worker -n reserved_resource_worker-3@%h -A pulp.server.async.app -c 1 --events --umask 18 --pidfile=/var...
> 
>     May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> transport:   qpid://mysrv:5672//
>     May 04 05:11:22 mysrv celery[4798]: - ** ---------- .> results:     disabled
>     May 04 05:11:22 mysrv celery[4798]: - *** --- * --- .> concurrency: 1 (prefork)
>     May 04 05:11:22 mysrv celery[4798]: -- ******* ----
>     May 04 05:11:22 mysrv celery[4798]: --- ***** ----- [queues]
>     May 04 05:11:22 mysrv celery[4798]: -------------- .> celery           exchange=celery(direct) key=celery
>     May 04 05:11:22 mysrv celery[4798]: .> reserved_resource_worker-3@<redacted by list administrator> exchange=C.dq(direct) key=rese...srv
>     May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     May 04 05:11:22 mysrv pulp[4798]: celery.worker.consumer:INFO: Connected to qpid://mysrv:5672//
>     May 04 05:11:22 mysrv pulp[4798]: kombu.transport.qpid:INFO: Connected to qpid with SASL mechanism ANONYMOUS
>     Hint: Some lines were ellipsized, use -l to show in full.
> 
>     *Now I have no tasks queued at all. I ran the repo sync again, and
>     again it went to Waiting.*
> 
>     # pulp-admin tasks list
>     +----------------------------------------------------------------------+
>                                      Tasks
>     +----------------------------------------------------------------------+
> 
>     No tasks found
> 
>     # pulp-admin rpm repo sync run --repo-id=rhel-6-server-supplementary-rpms
>     +----------------------------------------------------------------------+
>           Synchronizing Repository [rhel-6-server-supplementary-rpms]
>     +----------------------------------------------------------------------+
> 
>     This command may be exited via ctrl+c without affecting the request.
> 
>     [/]
>     Waiting to begin...
> 
> 
>     On Wed, May 4, 2016 at 1:52 AM, Brian Bouterse
>     <bbouters at redhat.com> wrote:
> 
>         Kodiak is right that the second task stuck at "Waiting to Begin" is
>         likely waiting behind another operation on that same repo.
>         Canceling the
>         one prior will likely allow the later one to start.
> 
>         How many workers are running and how many do you expect? You can see
>         what Pulp thinks with:  `pulp-admin status`
> 
>         You can compare that to your pulp processes on all of your Pulp
>         servers with `sudo ps -awfux | grep celery`.
> 
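>         A quick cross-check, assuming the default count of four workers, is
>         something like:
> 
>         for i in 0 1 2 3; do systemctl is-active pulp_worker-$i; done
> 
>         and then comparing that against the workers `pulp-admin status`
>         reports.
> 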
>         Also you can look at the task details with -vv to see the worker the
>         halted task is assigned to. Something like `pulp-admin -vv tasks
>         details --task-id 03842c9d-e053-4a6f-a4c4-2d7302be9c8c`.
> 
>         Unfortunately you'll have to see the worker in the raw response
>         with -vv because of [0].
> 
>         [0]: https://pulp.plan.io/issues/1832
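> 
>         If it helps, the raw task document can also be fetched directly from
>         the REST API (default admin credentials assumed here; <task-id> is a
>         placeholder) and the field to look at is "worker_name":
> 
>         curl -s -k -u admin:admin \
>           https://localhost/pulp/api/v2/tasks/<task-id>/ | python -m json.tool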
> 
>         -Brian
> 
> 
>         On 05/03/2016 11:53 AM, Kodiak Firesmith wrote:
>         > I believe you may need to cancel the pending repo sync task before you
>         > can delete the repo.  Maybe try:
>         >  pulp-admin tasks cancel --task-id=2d776d63-fd8a-4e0a-8f32-d2276c85187c
>         >  pulp-admin tasks cancel --task-id=03842c9d-e053-4a6f-a4c4-2d7302be9c8c
>         >
>         > Then:
>         > pulp-admin rpm repo delete --repo-id=rhel-6-server-supplementary-rpms
>         >
>         >
>         > On Tue, May 3, 2016 at 11:47 AM, Mallick, Samiron
>         > <samiron.mallick at gmail.com> wrote:
>         >
>         >     Could anyone please tell me what went wrong with the
>         >     repository? One of my EL7 servers is registered and was able
>         >     to fetch content from the CDN. Recently I found one of the
>         >     repos stuck after downloading RPMs; it never ends. I rebooted
>         >     my server, cancelled tasks, deleted the repo and recreated it,
>         >     but no luck. Now if I run a sync, it goes directly to the
>         >     waiting stage. Earlier I observed that it was starting the
>         >     task, but the start time was the same as when I first ran the
>         >     job. Even now I am not able to delete the repo either, as it
>         >     is showing “Waiting to begin”. I am running Pulp v2.8. Any
>         >     idea would be greatly appreciated.
>         >
>         >
>         >
>         >     # rpm -qa pulp-server
>         >
>         >     pulp-server-2.8.2-1.el7.noarch
>         >
>         >
>         >
>         >     # pulp-admin rpm repo sync run
>         >     --repo-id=rhel-6-server-supplementary-rpms
>         >
>         >     +----------------------------------------------------------------------+
>         >           Synchronizing Repository [rhel-6-server-supplementary-rpms]
>         >     +----------------------------------------------------------------------+
>         >
>         >     This command may be exited via ctrl+c without affecting
>         >     the request.
>         >
>         >
>         >
>         >
>         >
>         >     Downloading metadata...
>         >
>         >     [\]
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     Downloading repository content...
>         >
>         >     [-]
>         >
>         >     [==================================================] 100%
>         >
>         >     RPMs:       0/0 items
>         >
>         >     Delta RPMs: 0/0 items
>         >
>         >
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     Downloading distribution files...
>         >
>         >     [==================================================] 100%
>         >
>         >     Distributions: 0/0 items
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     Importing errata...
>         >
>         >     [/]
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     Importing package groups/categories...
>         >
>         >     [-]
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     Cleaning duplicate packages...
>         >
>         >     [|]
>         >
>         >     ... completed
>         >
>         >
>         >
>         >     *** AND STUCK HERE ***
>         >
>         >
>         >
>         >     # pulp-admin tasks list
>         >
>         >     +----------------------------------------------------------------------+
>         >                                      Tasks
>         >     +----------------------------------------------------------------------+
>         >
>         >
>         >
>         >     Operations:  sync
>         >
>         >     Resources:   rhel-6-server-supplementary-rpms (repository)
>         >
>         >     State:       Running
>         >
>         >     Start Time:  2016-05-03T07:06:36Z
>         >
>         >     Finish Time: Incomplete
>         >
>         >     Task Id:     2d776d63-fd8a-4e0a-8f32-d2276c85187c
>         >
>         >
>         >
>         >     Operations:  publish
>         >
>         >     Resources:   rhel-6-server-supplementary-rpms (repository)
>         >
>         >     State:       Waiting
>         >
>         >     Start Time:  Unstarted
>         >
>         >     Finish Time: Incomplete
>         >
>         >     Task Id:     03842c9d-e053-4a6f-a4c4-2d7302be9c8c
>         >
>         >
>         >
>         >     # date
>         >
>         >     Tue May  3 09:22:30 CEST 2016
>         >
>         >     # pulp-admin rpm repo sync schedules list
>         >     --repo-id=rhel-6-server-supplementary-rpms
>         >
>         >     +----------------------------------------------------------------------+
>         >                                    Schedules
>         >     +----------------------------------------------------------------------+
>         >
>         >
>         >
>         >     There are no schedules defined for this operation.
>         >
>         >
>         >
>         >     # pulp-admin rpm repo delete
>         >     --repo-id=rhel-6-server-supplementary-rpms
>         >
>         >     This command may be exited via ctrl+c without affecting
>         >     the request.
>         >
>         >
>         >
>         >
>         >
>         >     [-]
>         >
>         >     Running...
>         >
>         >     [-]
>         >
>         >     Waiting to begin...
>         >
>         >
>         >
>         >     *** AND STUCK HERE ***
>         >
>         >
>         >
>         >
>         >
>         >
>         >
> 
>         _______________________________________________
>         Pulp-list mailing list
>         Pulp-list at redhat.com
>         https://www.redhat.com/mailman/listinfo/pulp-list
> 
> 
> 



