[dm-devel] Duplicate multipathd process

Juliano da Costa juliano.dacosta at gmail.com
Thu Aug 27 15:53:00 UTC 2015


Hi all,

I'm having a problem that I could not solve with technical support.

My server has two multipathd processes running. I noticed that when there is
a communication failure with the SAN, the device looks like this:

volclu2 (36000144000000010700aa26e0e57b5b9) dm-36 EMC Invista
size=1.0G features='0' hwhandler='0' wp=rw
`-+- policy='queue-length 0' prio=0 status=enabled
  |- 0:0:0:20 sdu  65:64  failed faulty running
  |- 0:0:1:20 sdaq 66:160 failed faulty running
  |- 1:0:0:20 sddf 70:208 failed faulty running
  `- 1:0:1:20 sdeb 128:48 failed faulty running

And after communication is re-established, it looks like this:

volclu2 (36000144000000010700aa26e0e57b5b9) dm-36 EMC Invista
size=1.0G features='0' hwhandler='0' wp=rw
`-+- policy='queue-length 0' prio=1 status=enabled
  |- 0:0:0:20 sdu  65:64  failed ready running
  |- 0:0:1:20 sdaq 66:160 failed ready running
  |- 1:0:0:20 sddf 70:208 failed ready running
  `- 1:0:1:20 sdeb 128:48 failed ready running

Failback is not happening automatically. The path group status stays
"enabled" rather than "active".

The status only changes to "active" when I run "multipath -v3".
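
For what it's worth, one thing I have been checking on my side is the
failback setting in /etc/multipath.conf. This is only a sketch of the
device section I am experimenting with, not a confirmed fix; the
"failback immediate" line is the part that is supposed to make the path
group switch back without a manual "multipath" run:

defaults {
        user_friendly_names yes
}
devices {
        device {
                vendor   "EMC"
                product  "Invista"
                # switch back to the preferred path group as soon as
                # paths recover, instead of waiting for manual action
                failback immediate
        }
}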

- What can cause two multipathd daemons to be created?
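
To confirm the duplicate, I have been counting the daemons like this (a
simple sketch; the threshold of one process assumes a single-daemon setup):

```shell
# Count running multipathd processes; more than one suggests a duplicate,
# e.g. one started by the init script and one started manually.
count=$(ps -C multipathd --no-headers 2>/dev/null | wc -l)
echo "multipathd processes: $count"
if [ "$count" -gt 1 ]; then
    echo "duplicate multipathd detected"
fi
```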

Regards,
    Juliano.