* [bug report] blktests srp/002 hang
@ 2023-08-21  6:46 Shinichiro Kawasaki
  2023-08-22  1:46 ` Bob Pearson
  2023-09-22 11:06 ` Linux regression tracking #adding (Thorsten Leemhuis)
  0 siblings, 2 replies; 87+ messages in thread
From: Shinichiro Kawasaki @ 2023-08-21  6:46 UTC (permalink / raw)
  To: linux-rdma, linux-scsi; +Cc: Bob Pearson

I observed an occasional process hang at the blktests test case srp/002 with
kernel v6.5-rcX. The kernel reported stalls of many kworkers [1]. PID 2757 hung
in inode_sleep_on_writeback(), and the other kworkers hung in
__inode_wait_for_writeback().

The hang can be reproduced reliably by repeating the test case srp/002 (it
takes between 15 and 30 repetitions).
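For reference, the repetition can be scripted along these lines (a minimal
sketch; BLKTESTS_DIR, RUNS, and the ./check invocation are assumptions about
the local setup, not part of the report):

```shell
#!/bin/sh
# Repeat blktests srp/002 until a run fails/hangs or the run limit is hit.
# BLKTESTS_DIR and RUNS are illustrative; the hang showed up within
# roughly 15 to 30 iterations on the reporter's setup.
BLKTESTS_DIR="${BLKTESTS_DIR:-$HOME/blktests}"
RUNS="${RUNS:-30}"
i=1
while [ "$i" -le "$RUNS" ]; do
    echo "run $i of $RUNS"
    # Stop as soon as a run does not complete cleanly.
    ( cd "$BLKTESTS_DIR" && ./check srp/002 ) || break
    i=$((i + 1))
done
echo "stopped after run $i"
```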

I bisected and found that commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
for rxe tasks") appears to be the trigger. When I revert it from kernel
v6.5-rc7, the hang symptom disappears. I'm not sure how the commit relates to
the hang. Comments are welcome.

[1]

...
[ 1670.489181] scsi 4:0:0:1: alua: Detached
[ 1670.985461] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-38: queued zerolength write
[ 1670.985702] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-36: queued zerolength write
[ 1670.985716] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-38 wc->status 5
[ 1670.985821] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-38
[ 1670.985824] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-36 wc->status 5
[ 1670.985909] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-34: queued zerolength write
[ 1670.985924] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-36
[ 1670.986104] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-34 wc->status 5
[ 1670.986244] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-34
[ 1671.049223] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-40: queued zerolength write
[ 1671.049588] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-40 wc->status 5
[ 1671.049626] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-40
[ 1844.873748] INFO: task kworker/0:1:9 blocked for more than 122 seconds.
[ 1844.877893]       Not tainted 6.5.0-rc7 #106
[ 1844.878903] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1844.880255] task:kworker/0:1     state:D stack:0     pid:9     ppid:2      flags:0x00004000
[ 1844.881830] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1844.882999] Call Trace:
[ 1844.883900]  <TASK>
[ 1844.884703]  __schedule+0x10ac/0x5e80
[ 1844.885609]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1844.886569]  ? __pfx___schedule+0x10/0x10
[ 1844.887596]  ? lock_release+0x378/0x650
[ 1844.888431]  ? schedule+0x92/0x220
[ 1844.889232]  ? mark_held_locks+0x96/0xe0
[ 1844.890117]  schedule+0x133/0x220
[ 1844.890874]  bit_wait+0x17/0xe0
[ 1844.891619]  __wait_on_bit+0x66/0x180
[ 1844.892409]  ? __pfx_bit_wait+0x10/0x10
[ 1844.893192]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1844.894245]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1844.895225]  ? __pfx_wake_bit_function+0x10/0x10
[ 1844.896138]  ? find_held_lock+0x2d/0x110
[ 1844.897085]  writeback_single_inode+0xf9/0x3f0
[ 1844.898186]  sync_inode_metadata+0x91/0xd0
[ 1844.899036]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1844.900106]  ? lock_release+0x378/0x650
[ 1844.900988]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1844.901978]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1844.902964]  ext4_sync_file+0x469/0xb60
[ 1844.903859]  iomap_dio_complete+0x5d1/0x860
[ 1844.904828]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1844.905841]  iomap_dio_complete_work+0x52/0x80
[ 1844.906774]  process_one_work+0x898/0x14a0
[ 1844.907673]  ? __pfx_lock_acquire+0x10/0x10
[ 1844.908644]  ? __pfx_process_one_work+0x10/0x10
[ 1844.909693]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1844.910676]  worker_thread+0x100/0x12c0
[ 1844.911612]  ? __kthread_parkme+0xc1/0x1f0
[ 1844.912542]  ? __pfx_worker_thread+0x10/0x10
[ 1844.913584]  kthread+0x2ea/0x3c0
[ 1844.914465]  ? __pfx_kthread+0x10/0x10
[ 1844.915335]  ret_from_fork+0x30/0x70
[ 1844.916269]  ? __pfx_kthread+0x10/0x10
[ 1844.917308]  ret_from_fork_asm+0x1b/0x30
[ 1844.918243]  </TASK>
[ 1844.918998] INFO: task kworker/1:0:25 blocked for more than 122 seconds.
[ 1844.920107]       Not tainted 6.5.0-rc7 #106
[ 1844.921041] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1844.922262] task:kworker/1:0     state:D stack:0     pid:25    ppid:2      flags:0x00004000
[ 1844.923550] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1844.924598] Call Trace:
[ 1844.925407]  <TASK>
[ 1844.926194]  __schedule+0x10ac/0x5e80
[ 1844.927097]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1844.928032]  ? __pfx___schedule+0x10/0x10
[ 1844.928937]  ? lock_release+0x378/0x650
[ 1844.929823]  ? schedule+0x92/0x220
[ 1844.930682]  ? mark_held_locks+0x96/0xe0
[ 1844.931579]  schedule+0x133/0x220
[ 1844.932411]  bit_wait+0x17/0xe0
[ 1844.933238]  __wait_on_bit+0x66/0x180
[ 1844.934107]  ? __pfx_bit_wait+0x10/0x10
[ 1844.934996]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1844.935956]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1844.936969]  ? __pfx_wake_bit_function+0x10/0x10
[ 1844.937942]  ? find_held_lock+0x2d/0x110
[ 1844.938891]  writeback_single_inode+0xf9/0x3f0
[ 1844.939836]  sync_inode_metadata+0x91/0xd0
[ 1844.940758]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1844.941730]  ? lock_release+0x378/0x650
[ 1844.942640]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1844.943647]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1844.944652]  ext4_sync_file+0x469/0xb60
[ 1844.945561]  iomap_dio_complete+0x5d1/0x860
[ 1844.946469]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1844.947417]  iomap_dio_complete_work+0x52/0x80
[ 1844.948358]  process_one_work+0x898/0x14a0
[ 1844.949284]  ? __pfx_lock_acquire+0x10/0x10
[ 1844.950204]  ? __pfx_process_one_work+0x10/0x10
[ 1844.951152]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1844.952094]  worker_thread+0x100/0x12c0
[ 1844.952998]  ? __pfx_worker_thread+0x10/0x10
[ 1844.953919]  kthread+0x2ea/0x3c0
[ 1844.954760]  ? __pfx_kthread+0x10/0x10
[ 1844.955669]  ret_from_fork+0x30/0x70
[ 1844.956550]  ? __pfx_kthread+0x10/0x10
[ 1844.957418]  ret_from_fork_asm+0x1b/0x30
[ 1844.958321]  </TASK>
[ 1844.959085] INFO: task kworker/1:1:49 blocked for more than 122 seconds.
[ 1844.960193]       Not tainted 6.5.0-rc7 #106
[ 1844.961122] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1844.962340] task:kworker/1:1     state:D stack:0     pid:49    ppid:2      flags:0x00004000
[ 1844.963619] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1844.964667] Call Trace:
[ 1844.965503]  <TASK>
[ 1844.966289]  __schedule+0x10ac/0x5e80
[ 1844.967207]  ? lock_acquire+0x1a9/0x4e0
[ 1844.968122]  ? __pfx___schedule+0x10/0x10
[ 1844.969034]  ? lock_release+0x378/0x650
[ 1844.969922]  ? schedule+0x92/0x220
[ 1844.970778]  ? mark_held_locks+0x96/0xe0
[ 1844.971674]  schedule+0x133/0x220
[ 1844.972526]  bit_wait+0x17/0xe0
[ 1844.973336]  __wait_on_bit+0x66/0x180
[ 1844.974206]  ? __pfx_bit_wait+0x10/0x10
[ 1844.975086]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1844.976046]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1844.977056]  ? __pfx_wake_bit_function+0x10/0x10
[ 1844.978007]  ? find_held_lock+0x2d/0x110
[ 1844.978917]  writeback_single_inode+0xf9/0x3f0
[ 1844.979865]  sync_inode_metadata+0x91/0xd0
[ 1844.980786]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1844.981765]  ? lock_release+0x378/0x650
[ 1844.982677]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1844.983687]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1844.984696]  ext4_sync_file+0x469/0xb60
[ 1844.985608]  iomap_dio_complete+0x5d1/0x860
[ 1844.986548]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1844.987484]  iomap_dio_complete_work+0x52/0x80
[ 1844.988435]  process_one_work+0x898/0x14a0
[ 1844.989352]  ? __pfx_lock_acquire+0x10/0x10
[ 1844.990275]  ? __pfx_process_one_work+0x10/0x10
[ 1844.991220]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1844.992164]  worker_thread+0x100/0x12c0
[ 1844.993065]  ? __kthread_parkme+0xc1/0x1f0
[ 1844.993977]  ? __pfx_worker_thread+0x10/0x10
[ 1844.994934]  kthread+0x2ea/0x3c0
[ 1844.995783]  ? __pfx_kthread+0x10/0x10
[ 1844.996670]  ret_from_fork+0x30/0x70
[ 1844.997544]  ? __pfx_kthread+0x10/0x10
[ 1844.998409]  ret_from_fork_asm+0x1b/0x30
[ 1844.999308]  </TASK>
[ 1845.000094] INFO: task kworker/0:2:74 blocked for more than 123 seconds.
[ 1845.001315]       Not tainted 6.5.0-rc7 #106
[ 1845.002326] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.003630] task:kworker/0:2     state:D stack:0     pid:74    ppid:2      flags:0x00004000
[ 1845.004991] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.006108] Call Trace:
[ 1845.006975]  <TASK>
[ 1845.007805]  __schedule+0x10ac/0x5e80
[ 1845.008781]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1845.009780]  ? __pfx___schedule+0x10/0x10
[ 1845.010736]  ? lock_release+0x378/0x650
[ 1845.011666]  ? schedule+0x92/0x220
[ 1845.012579]  ? mark_held_locks+0x96/0xe0
[ 1845.013531]  schedule+0x133/0x220
[ 1845.014414]  bit_wait+0x17/0xe0
[ 1845.015287]  __wait_on_bit+0x66/0x180
[ 1845.016219]  ? __pfx_bit_wait+0x10/0x10
[ 1845.017164]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.018185]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.019269]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.020282]  ? find_held_lock+0x2d/0x110
[ 1845.021246]  writeback_single_inode+0xf9/0x3f0
[ 1845.022248]  sync_inode_metadata+0x91/0xd0
[ 1845.023222]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.024255]  ? lock_release+0x378/0x650
[ 1845.025207]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.026281]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.027347]  ext4_sync_file+0x469/0xb60
[ 1845.028302]  iomap_dio_complete+0x5d1/0x860
[ 1845.029275]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.030276]  iomap_dio_complete_work+0x52/0x80
[ 1845.031281]  process_one_work+0x898/0x14a0
[ 1845.032248]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.033199]  ? __pfx_process_one_work+0x10/0x10
[ 1845.034182]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.035188]  worker_thread+0x100/0x12c0
[ 1845.036138]  ? __pfx_worker_thread+0x10/0x10
[ 1845.037104]  kthread+0x2ea/0x3c0
[ 1845.037996]  ? __pfx_kthread+0x10/0x10
[ 1845.038923]  ret_from_fork+0x30/0x70
[ 1845.039840]  ? __pfx_kthread+0x10/0x10
[ 1845.040763]  ret_from_fork_asm+0x1b/0x30
[ 1845.041729]  </TASK>
[ 1845.042531] INFO: task kworker/3:2:169 blocked for more than 123 seconds.
[ 1845.043703]       Not tainted 6.5.0-rc7 #106
[ 1845.044780] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.046068] task:kworker/3:2     state:D stack:0     pid:169   ppid:2      flags:0x00004000
[ 1845.047400] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.048518] Call Trace:
[ 1845.049392]  <TASK>
[ 1845.050214]  __schedule+0x10ac/0x5e80
[ 1845.051172]  ? lock_acquire+0x1a9/0x4e0
[ 1845.052141]  ? __pfx___schedule+0x10/0x10
[ 1845.053086]  ? lock_release+0x378/0x650
[ 1845.054017]  ? schedule+0x92/0x220
[ 1845.054920]  ? mark_held_locks+0x96/0xe0
[ 1845.055866]  schedule+0x133/0x220
[ 1845.056761]  bit_wait+0x17/0xe0
[ 1845.057645]  __wait_on_bit+0x66/0x180
[ 1845.058573]  ? __pfx_bit_wait+0x10/0x10
[ 1845.059502]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.060528]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.061603]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.062604]  ? find_held_lock+0x2d/0x110
[ 1845.063548]  writeback_single_inode+0xf9/0x3f0
[ 1845.064564]  sync_inode_metadata+0x91/0xd0
[ 1845.065534]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.066552]  ? lock_release+0x378/0x650
[ 1845.067504]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.068557]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.069609]  ext4_sync_file+0x469/0xb60
[ 1845.070563]  iomap_dio_complete+0x5d1/0x860
[ 1845.071550]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.072543]  iomap_dio_complete_work+0x52/0x80
[ 1845.073547]  process_one_work+0x898/0x14a0
[ 1845.074518]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.075468]  ? __pfx_process_one_work+0x10/0x10
[ 1845.076456]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.077436]  worker_thread+0x100/0x12c0
[ 1845.078382]  ? __pfx_worker_thread+0x10/0x10
[ 1845.079354]  kthread+0x2ea/0x3c0
[ 1845.080230]  ? __pfx_kthread+0x10/0x10
[ 1845.081163]  ret_from_fork+0x30/0x70
[ 1845.082075]  ? __pfx_kthread+0x10/0x10
[ 1845.083014]  ret_from_fork_asm+0x1b/0x30
[ 1845.083957]  </TASK>
[ 1845.084756] INFO: task kworker/0:3:221 blocked for more than 123 seconds.
[ 1845.085927]       Not tainted 6.5.0-rc7 #106
[ 1845.086911] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.088205] task:kworker/0:3     state:D stack:0     pid:221   ppid:2      flags:0x00004000
[ 1845.089566] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.090635] Call Trace:
[ 1845.091503]  <TASK>
[ 1845.092318]  __schedule+0x10ac/0x5e80
[ 1845.093282]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1845.094265]  ? __pfx___schedule+0x10/0x10
[ 1845.095200]  ? lock_release+0x378/0x650
[ 1845.096132]  ? schedule+0x92/0x220
[ 1845.097018]  ? mark_held_locks+0x96/0xe0
[ 1845.097959]  schedule+0x133/0x220
[ 1845.098863]  bit_wait+0x17/0xe0
[ 1845.099736]  __wait_on_bit+0x66/0x180
[ 1845.100649]  ? __pfx_bit_wait+0x10/0x10
[ 1845.101600]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.102606]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.103673]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.104685]  ? find_held_lock+0x2d/0x110
[ 1845.105633]  writeback_single_inode+0xf9/0x3f0
[ 1845.106625]  sync_inode_metadata+0x91/0xd0
[ 1845.107612]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.108635]  ? lock_release+0x378/0x650
[ 1845.109591]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.110645]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.111698]  ext4_sync_file+0x469/0xb60
[ 1845.112657]  iomap_dio_complete+0x5d1/0x860
[ 1845.113639]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.114625]  iomap_dio_complete_work+0x52/0x80
[ 1845.115616]  process_one_work+0x898/0x14a0
[ 1845.116582]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.117575]  ? __pfx_process_one_work+0x10/0x10
[ 1845.118573]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.119557]  worker_thread+0x100/0x12c0
[ 1845.120480]  ? __pfx_worker_thread+0x10/0x10
[ 1845.121453]  kthread+0x2ea/0x3c0
[ 1845.122339]  ? __pfx_kthread+0x10/0x10
[ 1845.123277]  ret_from_fork+0x30/0x70
[ 1845.124192]  ? __pfx_kthread+0x10/0x10
[ 1845.125131]  ret_from_fork_asm+0x1b/0x30
[ 1845.126085]  </TASK>
[ 1845.127043] INFO: task kworker/1:2:230 blocked for more than 123 seconds.
[ 1845.128574]       Not tainted 6.5.0-rc7 #106
[ 1845.129789] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.131441] task:kworker/1:2     state:D stack:0     pid:230   ppid:2      flags:0x00004000
[ 1845.133125] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.134546] Call Trace:
[ 1845.135547]  <TASK>
[ 1845.136475]  __schedule+0x10ac/0x5e80
[ 1845.137599]  ? lock_acquire+0x1a9/0x4e0
[ 1845.138703]  ? __pfx___schedule+0x10/0x10
[ 1845.139859]  ? lock_release+0x378/0x650
[ 1845.140980]  ? schedule+0x92/0x220
[ 1845.142026]  ? mark_held_locks+0x96/0xe0
[ 1845.143161]  schedule+0x133/0x220
[ 1845.144196]  bit_wait+0x17/0xe0
[ 1845.145233]  __wait_on_bit+0x66/0x180
[ 1845.146262]  ? __pfx_bit_wait+0x10/0x10
[ 1845.147380]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.148650]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.149950]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.151181]  ? find_held_lock+0x2d/0x110
[ 1845.152288]  writeback_single_inode+0xf9/0x3f0
[ 1845.153474]  sync_inode_metadata+0x91/0xd0
[ 1845.154608]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.155857]  ? lock_release+0x378/0x650
[ 1845.156997]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.158309]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.159569]  ext4_sync_file+0x469/0xb60
[ 1845.160709]  iomap_dio_complete+0x5d1/0x860
[ 1845.161881]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.163086]  iomap_dio_complete_work+0x52/0x80
[ 1845.164269]  process_one_work+0x898/0x14a0
[ 1845.165367]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.166541]  ? __pfx_process_one_work+0x10/0x10
[ 1845.167706]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.168880]  worker_thread+0x100/0x12c0
[ 1845.170006]  ? __kthread_parkme+0xc1/0x1f0
[ 1845.171083]  ? __pfx_worker_thread+0x10/0x10
[ 1845.172302]  kthread+0x2ea/0x3c0
[ 1845.173350]  ? __pfx_kthread+0x10/0x10
[ 1845.174465]  ret_from_fork+0x30/0x70
[ 1845.175522]  ? __pfx_kthread+0x10/0x10
[ 1845.176616]  ret_from_fork_asm+0x1b/0x30
[ 1845.177754]  </TASK>
[ 1845.178624] INFO: task kworker/2:3:291 blocked for more than 123 seconds.
[ 1845.180123]       Not tainted 6.5.0-rc7 #106
[ 1845.181306] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.182914] task:kworker/2:3     state:D stack:0     pid:291   ppid:2      flags:0x00004000
[ 1845.184626] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.186012] Call Trace:
[ 1845.187004]  <TASK>
[ 1845.187939]  __schedule+0x10ac/0x5e80
[ 1845.189072]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1845.190177]  ? __pfx___schedule+0x10/0x10
[ 1845.191356]  ? lock_release+0x378/0x650
[ 1845.192421]  ? schedule+0x92/0x220
[ 1845.193501]  ? mark_held_locks+0x96/0xe0
[ 1845.194535]  schedule+0x133/0x220
[ 1845.195595]  bit_wait+0x17/0xe0
[ 1845.196603]  __wait_on_bit+0x66/0x180
[ 1845.197697]  ? __pfx_bit_wait+0x10/0x10
[ 1845.198820]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.200061]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.201315]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.202522]  ? find_held_lock+0x2d/0x110
[ 1845.203679]  writeback_single_inode+0xf9/0x3f0
[ 1845.204885]  sync_inode_metadata+0x91/0xd0
[ 1845.205943]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.207190]  ? lock_release+0x378/0x650
[ 1845.208325]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.209581]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.210883]  ext4_sync_file+0x469/0xb60
[ 1845.212022]  iomap_dio_complete+0x5d1/0x860
[ 1845.213177]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.214315]  iomap_dio_complete_work+0x52/0x80
[ 1845.215547]  process_one_work+0x898/0x14a0
[ 1845.216714]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.217887]  ? __pfx_process_one_work+0x10/0x10
[ 1845.219026]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.220280]  worker_thread+0x100/0x12c0
[ 1845.221386]  ? __kthread_parkme+0xc1/0x1f0
[ 1845.222569]  ? __pfx_worker_thread+0x10/0x10
[ 1845.223743]  kthread+0x2ea/0x3c0
[ 1845.224788]  ? __pfx_kthread+0x10/0x10
[ 1845.225908]  ret_from_fork+0x30/0x70
[ 1845.226996]  ? __pfx_kthread+0x10/0x10
[ 1845.228110]  ret_from_fork_asm+0x1b/0x30
[ 1845.229254]  </TASK>
[ 1845.230191] INFO: task kworker/1:3:322 blocked for more than 123 seconds.
[ 1845.231562]       Not tainted 6.5.0-rc7 #106
[ 1845.232622] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.233992] task:kworker/1:3     state:D stack:0     pid:322   ppid:2      flags:0x00004000
[ 1845.235439] Workqueue: dio/dm-1 iomap_dio_complete_work
[ 1845.236681] Call Trace:
[ 1845.237629]  <TASK>
[ 1845.238526]  __schedule+0x10ac/0x5e80
[ 1845.239559]  ? do_raw_spin_unlock+0x54/0x1f0
[ 1845.240622]  ? __pfx___schedule+0x10/0x10
[ 1845.241639]  ? lock_release+0x378/0x650
[ 1845.242650]  ? schedule+0x92/0x220
[ 1845.243654]  ? mark_held_locks+0x96/0xe0
[ 1845.244707]  schedule+0x133/0x220
[ 1845.245657]  bit_wait+0x17/0xe0
[ 1845.246631]  __wait_on_bit+0x66/0x180
[ 1845.247601]  ? __pfx_bit_wait+0x10/0x10
[ 1845.248630]  __inode_wait_for_writeback+0x12b/0x1b0
[ 1845.249743]  ? __pfx___inode_wait_for_writeback+0x10/0x10
[ 1845.250948]  ? __pfx_wake_bit_function+0x10/0x10
[ 1845.252021]  ? find_held_lock+0x2d/0x110
[ 1845.253043]  writeback_single_inode+0xf9/0x3f0
[ 1845.254123]  sync_inode_metadata+0x91/0xd0
[ 1845.255205]  ? __pfx_sync_inode_metadata+0x10/0x10
[ 1845.256294]  ? lock_release+0x378/0x650
[ 1845.257332]  ? file_check_and_advance_wb_err+0xb5/0x230
[ 1845.258542]  generic_buffers_fsync_noflush+0x1bf/0x270
[ 1845.259701]  ext4_sync_file+0x469/0xb60
[ 1845.260765]  iomap_dio_complete+0x5d1/0x860
[ 1845.261790]  ? __pfx_aio_complete_rw+0x10/0x10
[ 1845.262907]  iomap_dio_complete_work+0x52/0x80
[ 1845.263961]  process_one_work+0x898/0x14a0
[ 1845.265025]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.266074]  ? __pfx_process_one_work+0x10/0x10
[ 1845.267197]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.268305]  worker_thread+0x100/0x12c0
[ 1845.269328]  ? __kthread_parkme+0xc1/0x1f0
[ 1845.270368]  ? __pfx_worker_thread+0x10/0x10
[ 1845.271457]  kthread+0x2ea/0x3c0
[ 1845.272422]  ? __pfx_kthread+0x10/0x10
[ 1845.273443]  ret_from_fork+0x30/0x70
[ 1845.274438]  ? __pfx_kthread+0x10/0x10
[ 1845.275475]  ret_from_fork_asm+0x1b/0x30
[ 1845.276555]  </TASK>
[ 1845.277433] INFO: task kworker/u8:7:2757 blocked for more than 123 seconds.
[ 1845.278808]       Not tainted 6.5.0-rc7 #106
[ 1845.279897] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 1845.281313] task:kworker/u8:7    state:D stack:0     pid:2757  ppid:2      flags:0x00004000
[ 1845.282753] Workqueue: writeback wb_workfn (flush-253:1)
[ 1845.283993] Call Trace:
[ 1845.284945]  <TASK>
[ 1845.285853]  __schedule+0x10ac/0x5e80
[ 1845.286872]  ? lock_acquire+0x1b9/0x4e0
[ 1845.287917]  ? __pfx___schedule+0x10/0x10
[ 1845.288934]  ? __blk_flush_plug+0x27a/0x450
[ 1845.289979]  ? inode_sleep_on_writeback+0xf4/0x160
[ 1845.291131]  schedule+0x133/0x220
[ 1845.292052]  inode_sleep_on_writeback+0x14e/0x160
[ 1845.293130]  ? __pfx_inode_sleep_on_writeback+0x10/0x10
[ 1845.294289]  ? __pfx_lock_release+0x10/0x10
[ 1845.295362]  ? __pfx_autoremove_wake_function+0x10/0x10
[ 1845.296574]  ? __pfx___writeback_inodes_wb+0x10/0x10
[ 1845.297750]  wb_writeback+0x330/0x7a0
[ 1845.298800]  ? __pfx_wb_writeback+0x10/0x10
[ 1845.299876]  ? get_nr_dirty_inodes+0xc7/0x170
[ 1845.300988]  wb_workfn+0x7a1/0xcc0
[ 1845.302019]  ? __pfx_wb_workfn+0x10/0x10
[ 1845.303071]  ? lock_acquire+0x1b9/0x4e0
[ 1845.304127]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.305232]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.306341]  process_one_work+0x898/0x14a0
[ 1845.307377]  ? __pfx_lock_acquire+0x10/0x10
[ 1845.308410]  ? __pfx_process_one_work+0x10/0x10
[ 1845.309551]  ? __pfx_do_raw_spin_lock+0x10/0x10
[ 1845.310678]  worker_thread+0x100/0x12c0
[ 1845.311702]  ? __kthread_parkme+0xc1/0x1f0
[ 1845.312778]  ? __pfx_worker_thread+0x10/0x10
[ 1845.313864]  kthread+0x2ea/0x3c0
[ 1845.314848]  ? __pfx_kthread+0x10/0x10
[ 1845.315885]  ret_from_fork+0x30/0x70
[ 1845.316879]  ? __pfx_kthread+0x10/0x10
[ 1845.317885]  ret_from_fork_asm+0x1b/0x30
[ 1845.318896]  </TASK>
[ 1845.319767] Future hung task reports are suppressed, see sysctl kernel.hung_task_warnings
[ 1845.321587] 
               Showing all locks held in the system:
[ 1845.323498] 2 locks held by kworker/0:1/9:
[ 1845.324569]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.326209]  #1: ffff888100877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.327999] 1 lock held by rcu_tasks_kthre/13:
[ 1845.329153]  #0: ffffffffa8c7b010 (rcu_tasks.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
[ 1845.330838] 1 lock held by rcu_tasks_rude_/14:
[ 1845.332043]  #0: ffffffffa8c7ad70 (rcu_tasks_rude.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
[ 1845.333713] 1 lock held by rcu_tasks_trace/15:
[ 1845.334939]  #0: ffffffffa8c7aa70 (rcu_tasks_trace.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
[ 1845.336716] 2 locks held by kworker/1:0/25:
[ 1845.337890]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.339639]  #1: ffff888100977d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.341440] 1 lock held by khungtaskd/43:
[ 1845.342669]  #0: ffffffffa8c7bbe0 (rcu_read_lock){....}-{1:2}, at: debug_show_all_locks+0x51/0x340
[ 1845.344347] 2 locks held by kworker/1:1/49:
[ 1845.345577]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.347382]  #1: ffff88810164fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.349278] 2 locks held by kworker/0:2/74:
[ 1845.350547]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.352400]  #1: ffff88811c8ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.354301] 2 locks held by kworker/3:2/169:
[ 1845.355618]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.357472]  #1: ffff88811f0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.359445] 2 locks held by kworker/0:3/221:
[ 1845.360862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.362800]  #1: ffff888126567d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.364804] 2 locks held by kworker/1:2/230:
[ 1845.366259]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.368270]  #1: ffff8881285f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.370338] 2 locks held by kworker/2:3/291:
[ 1845.371807]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.373789]  #1: ffff88812a1f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.375949] 2 locks held by kworker/1:3/322:
[ 1845.377464]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.379533]  #1: ffff888105a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.381731] 1 lock held by in:imjournal/663:
[ 1845.383335] 2 locks held by kworker/u8:7/2757:
[ 1845.384953]  #0: ffff888101191938 ((wq_completion)writeback){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.387067]  #1: ffff88813542fd98 ((work_completion)(&(&wb->dwork)->work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.389320] 2 locks held by kworker/3:4/2759:
[ 1845.390985]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.393164]  #1: ffff888122ddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.395410] 2 locks held by kworker/0:4/2760:
[ 1845.397073]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.399329]  #1: ffff888107dbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.401670] 2 locks held by kworker/1:5/2762:
[ 1845.403414]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.405626]  #1: ffff888105fbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.407962] 2 locks held by kworker/1:6/2764:
[ 1845.409693]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.411996]  #1: ffff888134647d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.414335] 2 locks held by kworker/3:5/2765:
[ 1845.416107]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.418376]  #1: ffff888128effd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.420758] 2 locks held by kworker/1:7/2767:
[ 1845.422532]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.424711]  #1: ffff88810fcefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.427082] 2 locks held by kworker/1:8/2768:
[ 1845.428790]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.431080]  #1: ffff88812a42fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.433495] 2 locks held by kworker/1:9/2770:
[ 1845.435192]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.437507]  #1: ffff888135477d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.439982] 2 locks held by kworker/3:6/2771:
[ 1845.441737]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.444015]  #1: ffff888127c6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.446448] 2 locks held by kworker/3:10/2776:
[ 1845.448255]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.450561]  #1: ffff888129fafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.452971] 2 locks held by kworker/3:11/2777:
[ 1845.454703]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.457029]  #1: ffff8881056b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.459377] 2 locks held by kworker/2:8/2779:
[ 1845.461157]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.463483]  #1: ffff88812e997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.465906] 2 locks held by kworker/3:13/2780:
[ 1845.467678]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.469988]  #1: ffff888128d57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.472395] 2 locks held by kworker/3:14/2781:
[ 1845.474175]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.476468]  #1: ffff88812c9bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.478896] 2 locks held by kworker/3:15/2782:
[ 1845.480638]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.482919]  #1: ffff888104f27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.485299] 2 locks held by kworker/3:17/2784:
[ 1845.487097]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.489383]  #1: ffff88812224fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.491737] 2 locks held by kworker/3:18/2785:
[ 1845.493480]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.495790]  #1: ffff8881361afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.498159] 2 locks held by kworker/3:19/2786:
[ 1845.499941]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.502266]  #1: ffff888127e67d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.504618] 2 locks held by kworker/3:22/2790:
[ 1845.506418]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.508708]  #1: ffff888130d4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.511121] 2 locks held by kworker/2:10/2791:
[ 1845.512938]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.515179]  #1: ffff888113127d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.517588] 2 locks held by kworker/3:23/2793:
[ 1845.519372]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.521683]  #1: ffff88812a89fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.524075] 2 locks held by kworker/3:24/2794:
[ 1845.525876]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.528115]  #1: ffff888129a1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.530515] 2 locks held by kworker/3:25/2795:
[ 1845.532283]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.534610]  #1: ffff88812ebb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.537020] 2 locks held by kworker/3:26/2796:
[ 1845.538809]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.541117]  #1: ffff888119577d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.543506] 2 locks held by kworker/1:11/2797:
[ 1845.545286]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.547624]  #1: ffff88813716fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.550018] 2 locks held by kworker/3:27/2798:
[ 1845.551827]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.554139]  #1: ffff888136747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.556535] 2 locks held by kworker/1:13/2800:
[ 1845.558325]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.560657]  #1: ffff888131687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.563055] 2 locks held by kworker/1:15/2802:
[ 1845.564867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.567176]  #1: ffff8881342d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.569574] 2 locks held by kworker/1:17/2804:
[ 1845.571352]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.573643]  #1: ffff888132137d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.576005] 2 locks held by kworker/1:18/2805:
[ 1845.577768]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.580107]  #1: ffff888134a5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.582512] 2 locks held by kworker/1:19/2806:
[ 1845.584307]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.586598]  #1: ffff888135b87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.588971] 2 locks held by kworker/1:20/2807:
[ 1845.590771]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.593039]  #1: ffff88810513fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.595437] 2 locks held by kworker/1:22/2809:
[ 1845.597257]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.599584]  #1: ffff8881397bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.601975] 2 locks held by kworker/1:23/2810:
[ 1845.603756]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.606073]  #1: ffff888139807d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.608442] 2 locks held by kworker/3:30/2814:
[ 1845.610262]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.612547]  #1: ffff888101a27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.614937] 2 locks held by kworker/2:13/2815:
[ 1845.616711]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.618912]  #1: ffff888120087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.621317] 2 locks held by kworker/2:15/2817:
[ 1845.623090]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.625381]  #1: ffff88812258fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.627743] 2 locks held by kworker/2:16/2818:
[ 1845.629551]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.631844]  #1: ffff888133d47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.634251] 2 locks held by kworker/2:19/2821:
[ 1845.636011]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.638324]  #1: ffff88812ea37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.640711] 2 locks held by kworker/2:20/2822:
[ 1845.642514]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.644824]  #1: ffff88813abd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.647217] 2 locks held by kworker/2:21/2823:
[ 1845.649025]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.651351]  #1: ffff88813454fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.653690] 2 locks held by kworker/2:22/2824:
[ 1845.655501]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.657763]  #1: ffff888132e5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.660177] 2 locks held by kworker/3:31/2825:
[ 1845.661943]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.664289]  #1: ffff888138177d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.666651] 2 locks held by kworker/3:32/2826:
[ 1845.668418]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.670748]  #1: ffff88812a26fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.673018] 2 locks held by kworker/3:38/2832:
[ 1845.674821]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.677132]  #1: ffff8881319b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.679533] 2 locks held by kworker/2:24/2834:
[ 1845.681338]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.683668]  #1: ffff8881185efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.686081] 2 locks held by kworker/2:25/2835:
[ 1845.687877]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.690160]  #1: ffff8881299a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.692548] 2 locks held by kworker/2:27/2837:
[ 1845.694316]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.696589]  #1: ffff888105ae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.698995] 2 locks held by kworker/2:28/2838:
[ 1845.700799]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.703139]  #1: ffff888133fd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.705549] 2 locks held by kworker/2:30/2840:
[ 1845.707341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.709638]  #1: ffff888127627d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.712057] 2 locks held by kworker/2:31/2841:
[ 1845.713853]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.716160]  #1: ffff88810a8d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.718564] 2 locks held by kworker/2:34/2845:
[ 1845.720341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.722653]  #1: ffff888134107d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.725061] 2 locks held by kworker/3:40/2847:
[ 1845.726873]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.729184]  #1: ffff88812f5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.731588] 2 locks held by kworker/2:36/2848:
[ 1845.733384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.735681]  #1: ffff8881184efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.738077] 2 locks held by kworker/2:37/2851:
[ 1845.739855]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.742191]  #1: ffff88813b89fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.744532] 2 locks held by kworker/1:24/2852:
[ 1845.746338]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.748635]  #1: ffff8881275c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.751036] 2 locks held by kworker/1:26/2854:
[ 1845.752810]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.755139]  #1: ffff88812238fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.757498] 2 locks held by kworker/1:28/2856:
[ 1845.759286]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.761628]  #1: ffff888122f2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.763996] 2 locks held by kworker/1:29/2857:
[ 1845.765766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.768067]  #1: ffff88812215fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.770425] 2 locks held by kworker/1:30/2858:
[ 1845.772237]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.774564]  #1: ffff888137177d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.776959] 2 locks held by kworker/1:32/2860:
[ 1845.778767]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.781058]  #1: ffff88812a6bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.783435] 2 locks held by kworker/1:34/2862:
[ 1845.785261]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.787605]  #1: ffff888119487d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.790019] 2 locks held by kworker/1:35/2863:
[ 1845.791759]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.794093]  #1: ffff888135497d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.796540] 2 locks held by kworker/1:37/2865:
[ 1845.798278]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.800636]  #1: ffff8881053b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.803035] 2 locks held by kworker/2:38/2866:
[ 1845.804808]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.807150]  #1: ffff88810533fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.809571] 2 locks held by kworker/2:39/2867:
[ 1845.811371]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.813698]  #1: ffff888119d57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.816104] 2 locks held by kworker/2:41/2869:
[ 1845.817858]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.820217]  #1: ffff888119d7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.822579] 2 locks held by kworker/2:46/2874:
[ 1845.824384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.826691]  #1: ffff888106be7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.829051] 2 locks held by kworker/2:49/2878:
[ 1845.830865]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.833194]  #1: ffff88813af5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.835616] 2 locks held by kworker/2:51/2881:
[ 1845.837390]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.839737]  #1: ffff888122957d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.842116] 2 locks held by kworker/2:52/2882:
[ 1845.843933]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.846254]  #1: ffff888123fe7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.848710] 2 locks held by kworker/2:53/2883:
[ 1845.850464]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.852749]  #1: ffff88812282fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.855191] 2 locks held by kworker/2:54/2884:
[ 1845.856982]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.859288]  #1: ffff88813baffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.861684] 2 locks held by kworker/2:55/2885:
[ 1845.863494]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.865779]  #1: ffff888111c97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.868184] 2 locks held by kworker/2:56/2886:
[ 1845.869955]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.872223]  #1: ffff888111c8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.874666] 2 locks held by kworker/1:40/2888:
[ 1845.876443]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.878794]  #1: ffff88811b197d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.881130] 2 locks held by kworker/0:5/2889:
[ 1845.882854]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.885148]  #1: ffff888118247d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.887535] 2 locks held by kworker/2:58/2890:
[ 1845.889341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.891495]  #1: ffff88810cf57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.893905] 2 locks held by kworker/1:41/2897:
[ 1845.895655]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.897934]  #1: ffff888137987d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.900296] 2 locks held by kworker/2:61/2898:
[ 1845.902071]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.904422]  #1: ffff88811008fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.906816] 2 locks held by kworker/0:7/2899:
[ 1845.908574]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.910857]  #1: ffff88810530fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.913250] 2 locks held by kworker/2:62/2900:
[ 1845.915027]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.917326]  #1: ffff88812eccfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.919696] 2 locks held by kworker/0:8/2901:
[ 1845.921496]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.923773]  #1: ffff888139277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.926133] 2 locks held by kworker/0:9/2903:
[ 1845.927908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.930231]  #1: ffff888105f27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.932617] 2 locks held by kworker/1:43/2905:
[ 1845.934393]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.936659]  #1: ffff88810629fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.939044] 2 locks held by kworker/1:44/2907:
[ 1845.940855]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.943143]  #1: ffff88811d127d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.945543] 2 locks held by kworker/0:10/2908:
[ 1845.947309]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.949590]  #1: ffff8881361b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.952001] 2 locks held by kworker/1:45/2909:
[ 1845.953773]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.956004]  #1: ffff888121147d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.958426] 2 locks held by kworker/2:65/2910:
[ 1845.960240]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.962547]  #1: ffff88810c597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.964935] 2 locks held by kworker/1:46/2911:
[ 1845.966701]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.968990]  #1: ffff88812b2ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.971313] 2 locks held by kworker/1:47/2913:
[ 1845.973100]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.975451]  #1: ffff88813f79fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.977880] 2 locks held by kworker/0:11/2916:
[ 1845.979682]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.981949]  #1: ffff88811d7e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.984317] 2 locks held by kworker/2:68/2917:
[ 1845.986087]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.988369]  #1: ffff88812c017d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.990715] 2 locks held by kworker/1:50/2920:
[ 1845.992496]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1845.994769]  #1: ffff888123fc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1845.997095] 2 locks held by kworker/0:12/2921:
[ 1845.998885]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.001218]  #1: ffff8881202f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.003603] 2 locks held by kworker/1:51/2923:
[ 1846.005405]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.007715]  #1: ffff8881114ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.010124] 2 locks held by kworker/2:71/2924:
[ 1846.011907]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.014223]  #1: ffff88812ef5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.016615] 2 locks held by kworker/2:73/2928:
[ 1846.018367]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.020712]  #1: ffff888117667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.023000] 2 locks held by kworker/2:74/2931:
[ 1846.024774]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.027108]  #1: ffff88811322fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.029466] 2 locks held by kworker/0:14/2932:
[ 1846.031284]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.033576]  #1: ffff88810fd5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.035945] 2 locks held by kworker/2:75/2933:
[ 1846.037730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.040007]  #1: ffff8881367a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.042335] 2 locks held by kworker/0:16/2935:
[ 1846.044121]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.046392]  #1: ffff88810c55fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.048757] 2 locks held by kworker/0:17/2937:
[ 1846.050524]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.052871]  #1: ffff8881368a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.055241] 2 locks held by kworker/2:77/2938:
[ 1846.056990]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.059306]  #1: ffff888122217d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.061588] 2 locks held by kworker/2:78/2940:
[ 1846.063332]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.065636]  #1: ffff8881212a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.068005] 2 locks held by kworker/1:56/2941:
[ 1846.069793]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.072091]  #1: ffff8881192efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.074460] 2 locks held by kworker/2:79/2942:
[ 1846.076276]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.078593]  #1: ffff88811b187d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.080997] 2 locks held by kworker/1:57/2943:
[ 1846.082766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.085099]  #1: ffff888139457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.087514] 2 locks held by kworker/2:80/2944:
[ 1846.089313]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.091623]  #1: ffff888134697d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.094002] 2 locks held by kworker/1:59/2948:
[ 1846.095792]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.098122]  #1: ffff888107d27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.100558] 2 locks held by kworker/2:82/2949:
[ 1846.102361]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.104650]  #1: ffff88812810fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.107035] 2 locks held by kworker/0:19/2950:
[ 1846.108804]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.111121]  #1: ffff8881313f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.113499] 2 locks held by kworker/1:60/2951:
[ 1846.115278]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.117586]  #1: ffff88810d01fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.120000] 2 locks held by kworker/2:84/2954:
[ 1846.121772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.124105]  #1: ffff88812618fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.126532] 2 locks held by kworker/0:21/2955:
[ 1846.128332]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.130576]  #1: ffff888107c6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.132910] 2 locks held by kworker/0:24/2960:
[ 1846.134696]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.136967]  #1: ffff888100cafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.139353] 2 locks held by kworker/0:25/2962:
[ 1846.141106]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.143454]  #1: ffff888111267d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.145841] 2 locks held by kworker/2:88/2963:
[ 1846.147625]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.149903]  #1: ffff888134d0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.152280] 2 locks held by kworker/3:46/2964:
[ 1846.154068]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.156371]  #1: ffff88810f7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.158751] 2 locks held by kworker/3:47/2967:
[ 1846.160398]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.162653]  #1: ffff88813c7b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.165045] 2 locks held by kworker/0:28/2968:
[ 1846.166830]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.169156]  #1: ffff88812dc77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.171558] 2 locks held by kworker/0:29/2970:
[ 1846.173363]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.175655]  #1: ffff88812892fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.178081] 2 locks held by kworker/0:30/2971:
[ 1846.179861]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.182198]  #1: ffff88812dfd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.184562] 2 locks held by kworker/0:31/2973:
[ 1846.186364]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.188663]  #1: ffff8881304ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.191053] 2 locks held by kworker/3:50/2974:
[ 1846.192850]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.195153]  #1: ffff88811fa6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.197534] 2 locks held by kworker/3:51/2975:
[ 1846.199290]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.201640]  #1: ffff888130c0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.204059] 2 locks held by kworker/2:90/2978:
[ 1846.205833]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.208134]  #1: ffff888138457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.210548] 2 locks held by kworker/2:94/2983:
[ 1846.212355]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.214684]  #1: ffff88813c5b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.217078] 2 locks held by kworker/0:33/2984:
[ 1846.218870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.221180]  #1: ffff888118337d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.223616] 2 locks held by kworker/0:34/2987:
[ 1846.225402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.227712]  #1: ffff88812b827d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.230049] 2 locks held by kworker/0:35/2988:
[ 1846.231865]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.234180]  #1: ffff88811761fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.236603] 2 locks held by kworker/0:36/2990:
[ 1846.238405]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.240743]  #1: ffff88813a327d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.243154] 2 locks held by kworker/3:54/2991:
[ 1846.244944]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.247254]  #1: ffff88813a32fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.249661] 2 locks held by kworker/2:96/2992:
[ 1846.251415]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.253744]  #1: ffff88813a5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.256072] 2 locks held by kworker/1:62/2993:
[ 1846.257867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.260131]  #1: ffff88810f7c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.262502] 2 locks held by kworker/2:98/2996:
[ 1846.264306]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.266598]  #1: ffff88813544fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.268989] 2 locks held by kworker/1:64/2997:
[ 1846.270789]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.273098]  #1: ffff88810f497d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.275485] 2 locks held by kworker/2:102/3001:
[ 1846.277249]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.279558]  #1: ffff888107d37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.281985] 2 locks held by kworker/0:38/3004:
[ 1846.283756]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.286069]  #1: ffff88812db1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.288455] 2 locks held by kworker/0:39/3006:
[ 1846.290218]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.292529]  #1: ffff88812b847d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.294939] 2 locks held by kworker/2:105/3007:
[ 1846.296685]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.299010]  #1: ffff888135e37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.301429] 2 locks held by kworker/0:40/3008:
[ 1846.303243]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.305566]  #1: ffff888112cffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.307892] 2 locks held by kworker/2:107/3011:
[ 1846.309698]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.312023]  #1: ffff88812e577d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.314422] 2 locks held by kworker/2:108/3013:
[ 1846.316249]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.318523]  #1: ffff88812183fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.320928] 2 locks held by kworker/0:43/3014:
[ 1846.322729]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.325007]  #1: ffff88813b8f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.327362] 2 locks held by kworker/1:65/3015:
[ 1846.329133]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.331460]  #1: ffff8881230efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.333826] 2 locks held by kworker/0:44/3016:
[ 1846.335617]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.337933]  #1: ffff888134f77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.340294] 2 locks held by kworker/2:110/3019:
[ 1846.342063]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.344356]  #1: ffff888123877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.346708] 2 locks held by kworker/2:111/3021:
[ 1846.348463]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.350711]  #1: ffff88811b93fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.353066] 2 locks held by kworker/0:48/3024:
[ 1846.354871]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.357159]  #1: ffff88812500fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.359565] 2 locks held by kworker/0:49/3026:
[ 1846.361326]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.363655]  #1: ffff8881184a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.366021] 2 locks held by kworker/0:50/3027:
[ 1846.367802]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.370043]  #1: ffff8881184afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.372410] 2 locks held by kworker/1:66/3028:
[ 1846.374214]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.376522]  #1: ffff88813478fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.378915] 2 locks held by kworker/1:67/3029:
[ 1846.380682]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.383009]  #1: ffff8881216e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.385428] 2 locks held by kworker/1:72/3034:
[ 1846.387211]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.389542]  #1: ffff88812bdd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.391894] 2 locks held by kworker/1:73/3035:
[ 1846.393613]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.395947]  #1: ffff88812bddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.398383] 2 locks held by kworker/1:74/3036:
[ 1846.400150]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.402449]  #1: ffff88811c49fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.404851] 2 locks held by kworker/1:75/3037:
[ 1846.406632]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.408913]  #1: ffff888111587d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.411257] 2 locks held by kworker/1:77/3039:
[ 1846.413046]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.415323]  #1: ffff88811157fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.417715] 2 locks held by kworker/1:79/3042:
[ 1846.419479]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.421757]  #1: ffff888126f77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.424093] 2 locks held by kworker/1:80/3043:
[ 1846.425872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.428177]  #1: ffff888126f7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.430604] 2 locks held by kworker/1:82/3046:
[ 1846.432382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.434722]  #1: ffff88811b027d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.437127] 2 locks held by kworker/2:116/3052:
[ 1846.438947]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.441264]  #1: ffff888138e4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.443697] 2 locks held by kworker/2:118/3054:
[ 1846.445508]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.447719]  #1: ffff88813ecc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.450101] 2 locks held by kworker/2:120/3056:
[ 1846.451878]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.454210]  #1: ffff88813ecdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.456602] 2 locks held by kworker/2:122/3058:
[ 1846.458392]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.460678]  #1: ffff88811c597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.463034] 2 locks held by kworker/2:123/3059:
[ 1846.464820]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.467113]  #1: ffff88811c59fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.469512] 2 locks held by kworker/2:125/3061:
[ 1846.471288]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.473547]  #1: ffff88811c47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.475876] 2 locks held by kworker/2:127/3063:
[ 1846.477645]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.479943]  #1: ffff88812fbf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.482357] 2 locks held by kworker/2:128/3064:
[ 1846.484135]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.486426]  #1: ffff88810f5a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.488860] 2 locks held by kworker/2:131/3067:
[ 1846.490666]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.492903]  #1: ffff88811f307d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.495335] 2 locks held by kworker/2:133/3069:
[ 1846.497155]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.499482]  #1: ffff888130447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.501832] 2 locks held by kworker/2:134/3070:
[ 1846.503601]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.505908]  #1: ffff888130457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.508286] 2 locks held by kworker/2:141/3077:
[ 1846.510081]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.512419]  #1: ffff88813d78fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.514763] 2 locks held by kworker/0:55/3078:
[ 1846.516571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.518869]  #1: ffff88813d79fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.521270] 2 locks held by kworker/0:56/3080:
[ 1846.523060]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.525405]  #1: ffff8881252f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.527817] 2 locks held by kworker/0:58/3082:
[ 1846.529590]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.531794]  #1: ffff888110d6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.534196] 2 locks held by kworker/0:59/3083:
[ 1846.535999]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.538293]  #1: ffff888110d77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.540713] 2 locks held by kworker/0:60/3084:
[ 1846.542437]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.544711]  #1: ffff888119c07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.547111] 2 locks held by kworker/0:62/3086:
[ 1846.548917]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.551264]  #1: ffff88811464fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.553629] 2 locks held by kworker/0:64/3088:
[ 1846.555433]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.557729]  #1: ffff88813ee47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.560105] 2 locks held by kworker/0:65/3089:
[ 1846.561924]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.564227]  #1: ffff88813ee4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.566623] 2 locks held by kworker/0:66/3090:
[ 1846.568414]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.570750]  #1: ffff88813ee5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.573183] 2 locks held by kworker/0:68/3092:
[ 1846.574932]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.577277]  #1: ffff8881169b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.579664] 2 locks held by kworker/0:69/3093:
[ 1846.581445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.583780]  #1: ffff8881169bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.586161] 2 locks held by kworker/0:73/3097:
[ 1846.587954]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.590274]  #1: ffff88811632fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.592663] 2 locks held by kworker/0:74/3098:
[ 1846.594470]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.596751]  #1: ffff88811633fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.599107] 2 locks held by kworker/0:76/3100:
[ 1846.600881]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.603217]  #1: ffff8881169dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.605601] 2 locks held by kworker/0:77/3101:
[ 1846.607402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.609573]  #1: ffff8881169e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.611971] 2 locks held by kworker/0:78/3102:
[ 1846.613730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.616042]  #1: ffff8881169f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.618444] 2 locks held by kworker/0:79/3103:
[ 1846.620254]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.622558]  #1: ffff8881169ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.624952] 2 locks held by kworker/0:80/3104:
[ 1846.626680]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.628957]  #1: ffff888113257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.631318] 2 locks held by kworker/0:82/3106:
[ 1846.633132]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.635458]  #1: ffff88811326fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.637825] 2 locks held by kworker/2:143/3107:
[ 1846.639535]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.641623]  #1: ffff888113277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.643714] 2 locks held by kworker/0:83/3108:
[ 1846.645345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.647378]  #1: ffff888116747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.649476] 2 locks held by kworker/0:85/3110:
[ 1846.651108]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.653157]  #1: ffff88811675fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.655265] 2 locks held by kworker/2:145/3115:
[ 1846.656907]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.658943]  #1: ffff88811681fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.661043] 2 locks held by kworker/0:88/3116:
[ 1846.662672]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.664700]  #1: ffff88811682fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.666800] 2 locks held by kworker/0:89/3117:
[ 1846.668428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.670467]  #1: ffff888116837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.672564] 2 locks held by kworker/0:90/3118:
[ 1846.674191]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.676231]  #1: ffff888116847d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.678334] 2 locks held by kworker/0:91/3119:
[ 1846.679962]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.682000]  #1: ffff88811684fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.684357] 2 locks held by kworker/0:94/3122:
[ 1846.686158]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.688463]  #1: ffff88811687fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.690827] 2 locks held by kworker/0:96/3124:
[ 1846.692624]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.694930]  #1: ffff888116797d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.697350] 2 locks held by kworker/0:97/3125:
[ 1846.699168]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.701468]  #1: ffff8881167a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.703838] 2 locks held by kworker/3:55/3126:
[ 1846.705562]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.707872]  #1: ffff8881167efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.710221] 2 locks held by kworker/3:57/3129:
[ 1846.712016]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.714368]  #1: ffff88810f7efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.716772] 2 locks held by kworker/2:147/3130:
[ 1846.718550]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.720798]  #1: ffff888130f3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.723151] 2 locks held by kworker/3:58/3131:
[ 1846.724961]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.727251]  #1: ffff88813387fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.729629] 2 locks held by kworker/3:60/3136:
[ 1846.731377]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.733722]  #1: ffff88811cac7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.736070] 2 locks held by kworker/2:151/3137:
[ 1846.737871]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.740181]  #1: ffff888119a2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.742605] 2 locks held by kworker/3:61/3138:
[ 1846.744409]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.746708]  #1: ffff888132bbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.749059] 2 locks held by kworker/3:62/3141:
[ 1846.750851]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.753065]  #1: ffff8881378dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.755504] 2 locks held by kworker/2:155/3144:
[ 1846.757284]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.759575]  #1: ffff888118bffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.761933] 2 locks held by kworker/2:157/3147:
[ 1846.763742]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.766062]  #1: ffff88812f4c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.768433] 2 locks held by kworker/3:66/3150:
[ 1846.770245]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.772589]  #1: ffff88812c59fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.774922] 2 locks held by kworker/2:159/3151:
[ 1846.776705]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.778997]  #1: ffff888128447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.781418] 2 locks held by kworker/3:67/3152:
[ 1846.783229]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.785552]  #1: ffff8881010c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.787935] 2 locks held by kworker/2:160/3153:
[ 1846.789731]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.791997]  #1: ffff88811b8dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.794398] 2 locks held by kworker/3:68/3154:
[ 1846.796217]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.798461]  #1: ffff8881230c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.800827] 2 locks held by kworker/3:69/3156:
[ 1846.802626]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.804880]  #1: ffff88811a5afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.807294] 2 locks held by kworker/2:162/3157:
[ 1846.809032]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.811342]  #1: ffff888123e27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.813731] 2 locks held by kworker/3:70/3158:
[ 1846.815471]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.817704]  #1: ffff888119967d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.820099] 2 locks held by kworker/2:163/3159:
[ 1846.821912]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.824259]  #1: ffff88812eb17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.826658] 2 locks held by kworker/3:72/3162:
[ 1846.828445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.830737]  #1: ffff88812b71fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.833091] 2 locks held by kworker/3:73/3164:
[ 1846.834905]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.837180]  #1: ffff8881236cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.839547] 2 locks held by kworker/2:166/3165:
[ 1846.841359]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.843683]  #1: ffff888127ce7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.846077] 2 locks held by kworker/2:167/3166:
[ 1846.847874]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.850188]  #1: ffff888130f5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.852581] 2 locks held by kworker/3:74/3167:
[ 1846.854381]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.856638]  #1: ffff88812a03fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.858994] 2 locks held by kworker/2:168/3168:
[ 1846.860803]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.863125]  #1: ffff888118547d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.865502] 2 locks held by kworker/3:76/3170:
[ 1846.867274]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.869590]  #1: ffff8881290efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.871966] 2 locks held by kworker/2:169/3171:
[ 1846.873759]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.876067]  #1: ffff888113537d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.878500] 2 locks held by kworker/2:170/3172:
[ 1846.880241]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.882550]  #1: ffff88812800fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.884923] 2 locks held by kworker/2:171/3174:
[ 1846.886717]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.888964]  #1: ffff88810b7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.891342] 2 locks held by kworker/3:78/3175:
[ 1846.893152]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.895444]  #1: ffff88810b7cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.897839] 2 locks held by kworker/2:173/3178:
[ 1846.899645]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.901930]  #1: ffff88813824fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.904368] 2 locks held by kworker/2:174/3180:
[ 1846.906166]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.908487]  #1: ffff88811fbffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.910889] 2 locks held by kworker/2:175/3181:
[ 1846.912677]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.914993]  #1: ffff88810d657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.917381] 2 locks held by kworker/2:176/3183:
[ 1846.919126]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.921418]  #1: ffff88812cd0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.923802] 2 locks held by kworker/0:99/3184:
[ 1846.925561]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.927875]  #1: ffff888129a8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.930261] 2 locks held by kworker/0:101/3188:
[ 1846.932086]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.934427]  #1: ffff888122d0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.936855] 2 locks held by kworker/0:102/3189:
[ 1846.938637]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.940957]  #1: ffff888135087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.943360] 2 locks held by kworker/2:179/3190:
[ 1846.945154]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.947422]  #1: ffff88812db5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.949816] 2 locks held by kworker/2:180/3192:
[ 1846.951610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.953888]  #1: ffff888135c2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.956310] 2 locks held by kworker/2:181/3194:
[ 1846.958131]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.960429]  #1: ffff88811e607d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.962842] 2 locks held by kworker/0:105/3195:
[ 1846.964653]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.966944]  #1: ffff88810786fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.969357] 2 locks held by kworker/2:182/3196:
[ 1846.971031]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.973347]  #1: ffff88810b6dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.975722] 2 locks held by kworker/0:106/3197:
[ 1846.977477]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.979799]  #1: ffff888133eb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.982180] 2 locks held by kworker/2:183/3198:
[ 1846.983956]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.986282]  #1: ffff88810fd4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.988625] 2 locks held by kworker/2:184/3200:
[ 1846.990385]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.992687]  #1: ffff88811d3bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1846.995091] 2 locks held by kworker/0:108/3201:
[ 1846.996880]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1846.999212]  #1: ffff8881194f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.001610] 2 locks held by kworker/2:185/3202:
[ 1847.003419]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.005697]  #1: ffff88812201fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.008064] 2 locks held by kworker/0:109/3203:
[ 1847.009833]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.012147]  #1: ffff88812360fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.014580] 2 locks held by kworker/0:110/3205:
[ 1847.016323]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.018596]  #1: ffff88812dbffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.020968] 2 locks held by kworker/0:111/3206:
[ 1847.022748]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.025005]  #1: ffff888121917d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.027404] 2 locks held by kworker/0:113/3208:
[ 1847.029168]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.031487]  #1: ffff888125257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.033927] 2 locks held by kworker/0:114/3209:
[ 1847.035695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.037924]  #1: ffff888117cffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.040327] 2 locks held by kworker/0:115/3210:
[ 1847.042093]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.044392]  #1: ffff88813ee97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.046790] 2 locks held by kworker/3:84/3214:
[ 1847.048447]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.050752]  #1: ffff88811624fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.053143] 2 locks held by kworker/3:85/3215:
[ 1847.054922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.057177]  #1: ffff88811625fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.059600] 2 locks held by kworker/3:86/3216:
[ 1847.061342]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.063655]  #1: ffff888116267d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.066031] 2 locks held by kworker/3:87/3217:
[ 1847.067846]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.070173]  #1: ffff888116277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.072605] 2 locks held by kworker/3:88/3218:
[ 1847.074322]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.076675]  #1: ffff88811627fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.079050] 2 locks held by kworker/3:90/3220:
[ 1847.080866]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.083139]  #1: ffff88811629fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.085557] 2 locks held by kworker/0:116/3224:
[ 1847.087325]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.089540]  #1: ffff8881162cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.091896] 2 locks held by kworker/0:117/3225:
[ 1847.093708]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.095968]  #1: ffff8881162dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.098385] 2 locks held by kworker/0:120/3228:
[ 1847.100165]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.102497]  #1: ffff8881162ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.104912] 2 locks held by kworker/0:122/3230:
[ 1847.106730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.109025]  #1: ffff888116617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.111433] 2 locks held by kworker/0:124/3232:
[ 1847.113261]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.115545]  #1: ffff888116637d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.117891] 2 locks held by kworker/0:125/3233:
[ 1847.119688]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.121958]  #1: ffff88811664fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.124367] 2 locks held by kworker/0:126/3234:
[ 1847.126165]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.128485]  #1: ffff888116657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.130883] 2 locks held by kworker/0:127/3235:
[ 1847.132680]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.134962]  #1: ffff888116667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.137353] 2 locks held by kworker/0:128/3236:
[ 1847.139112]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.141422]  #1: ffff88811666fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.143839] 2 locks held by kworker/0:129/3237:
[ 1847.145625]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.147894]  #1: ffff88811667fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.150250] 2 locks held by kworker/0:130/3238:
[ 1847.152017]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.154368]  #1: ffff888116687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.156773] 2 locks held by kworker/0:135/3243:
[ 1847.158555]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.160730]  #1: ffff8881166c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.163142] 2 locks held by kworker/0:136/3244:
[ 1847.164910]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.167258]  #1: ffff8881166cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.169669] 2 locks held by kworker/3:95/3246:
[ 1847.171438]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.173713]  #1: ffff8881166efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.176070] 2 locks held by kworker/3:97/3248:
[ 1847.177888]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.180173]  #1: ffff88813ef07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.182623] 2 locks held by kworker/0:137/3249:
[ 1847.184437]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.186710]  #1: ffff88813ef1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.189078] 2 locks held by kworker/3:99/3251:
[ 1847.190888]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.193149]  #1: ffff88813ef37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.195552] 2 locks held by kworker/3:102/3254:
[ 1847.197351]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.199679]  #1: ffff88813ef57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.202095] 2 locks held by kworker/3:104/3256:
[ 1847.203830]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.206136]  #1: ffff88813ef6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.208511] 2 locks held by kworker/3:107/3259:
[ 1847.210327]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.212667]  #1: ffff88813ef9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.215030] 2 locks held by kworker/3:109/3261:
[ 1847.216850]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.219130]  #1: ffff88813efb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.221522] 2 locks held by kworker/3:110/3262:
[ 1847.223298]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.225646]  #1: ffff88813efbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.227959] 2 locks held by kworker/3:112/3264:
[ 1847.229749]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.232022]  #1: ffff88811600fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.234410] 2 locks held by kworker/1:85/3265:
[ 1847.236204]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.238532]  #1: ffff88811601fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.240856] 2 locks held by kworker/1:86/3266:
[ 1847.242656]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.244925]  #1: ffff888116027d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.247269] 2 locks held by kworker/1:87/3267:
[ 1847.249067]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.251384]  #1: ffff88811607fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.253760] 2 locks held by kworker/1:88/3268:
[ 1847.255546]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.257786]  #1: ffff888116087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.260211] 2 locks held by kworker/1:89/3269:
[ 1847.262017]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.264285]  #1: ffff888116097d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.266675] 2 locks held by kworker/0:138/3270:
[ 1847.268432]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.270712]  #1: ffff8881393cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.273097] 2 locks held by kworker/0:139/3272:
[ 1847.274906]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.277199]  #1: ffff8881160e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.279629] 2 locks held by kworker/1:91/3273:
[ 1847.281402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.283688]  #1: ffff8881160f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.286055] 2 locks held by kworker/1:92/3275:
[ 1847.287859]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.290142]  #1: ffff88811610fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.292499] 2 locks held by kworker/1:93/3277:
[ 1847.294276]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.296572]  #1: ffff888116167d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.298959] 2 locks held by kworker/0:143/3280:
[ 1847.300741]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.303053]  #1: ffff88811618fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.305458] 2 locks held by kworker/0:144/3282:
[ 1847.307260]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.309562]  #1: ffff8881161a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.311971] 2 locks held by kworker/1:99/3289:
[ 1847.313761]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.316077]  #1: ffff888116407d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.318447] 2 locks held by kworker/1:100/3291:
[ 1847.320274]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.322556]  #1: ffff88811641fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.324976] 2 locks held by kworker/0:149/3292:
[ 1847.326775]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.329064]  #1: ffff88811642fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.331461] 2 locks held by kworker/0:150/3294:
[ 1847.333272]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.335568]  #1: ffff888116447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.337986] 2 locks held by kworker/1:102/3295:
[ 1847.339772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.342014]  #1: ffff88811644fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.344402] 2 locks held by kworker/0:151/3296:
[ 1847.346193]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.348481]  #1: ffff88811645fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.350902] 2 locks held by kworker/1:103/3297:
[ 1847.352678]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.354987]  #1: ffff888116467d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.357395] 2 locks held by kworker/0:152/3298:
[ 1847.359178]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.361441]  #1: ffff8881164afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.363841] 2 locks held by kworker/1:104/3299:
[ 1847.365642]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.367965]  #1: ffff8881164bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.370355] 2 locks held by kworker/0:154/3301:
[ 1847.372178]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.374487]  #1: ffff8881164d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.376872] 2 locks held by kworker/0:155/3302:
[ 1847.378658]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.380974]  #1: ffff8881164e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.383376] 2 locks held by kworker/0:156/3303:
[ 1847.385153]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.387476]  #1: ffff8881164efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.389924] 2 locks held by kworker/0:157/3304:
[ 1847.391724]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.394012]  #1: ffff888116507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.396410] 2 locks held by kworker/0:158/3306:
[ 1847.398175]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.400498]  #1: ffff888124897d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.402864] 2 locks held by kworker/2:188/3307:
[ 1847.404675]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.406978]  #1: ffff88811f0afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.409328] 2 locks held by kworker/0:159/3310:
[ 1847.411083]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.413366]  #1: ffff888129117d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.415716] 2 locks held by kworker/0:160/3312:
[ 1847.417507]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.419764]  #1: ffff888105837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.422101] 2 locks held by kworker/0:161/3314:
[ 1847.423922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.426202]  #1: ffff88813d44fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.428598] 2 locks held by kworker/0:162/3316:
[ 1847.430401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.432698]  #1: ffff888121b37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.435047] 2 locks held by kworker/2:194/3317:
[ 1847.436834]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.439139]  #1: ffff88812ba5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.441506] 2 locks held by kworker/0:163/3318:
[ 1847.443320]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.445432]  #1: ffff88812923fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.447767] 2 locks held by kworker/2:197/3321:
[ 1847.449571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.451878]  #1: ffff88811ea3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.454252] 2 locks held by kworker/2:199/3323:
[ 1847.456041]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.458354]  #1: ffff888113057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.460746] 2 locks held by kworker/2:202/3326:
[ 1847.462556]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.464785]  #1: ffff8881330b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.467097] 2 locks held by kworker/3:113/3328:
[ 1847.468870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.471219]  #1: ffff888122eb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.473651] 2 locks held by kworker/1:105/3329:
[ 1847.475411]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.477759]  #1: ffff888127057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.480120] 2 locks held by kworker/2:204/3331:
[ 1847.481886]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.484226]  #1: ffff888117757d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.486641] 2 locks held by kworker/2:206/3333:
[ 1847.488454]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.490716]  #1: ffff88812be7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.493118] 2 locks held by kworker/2:209/3336:
[ 1847.494924]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.497212]  #1: ffff88811778fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.499623] 2 locks held by kworker/2:210/3337:
[ 1847.501424]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.503699]  #1: ffff8881304efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.506080] 2 locks held by kworker/2:213/3340:
[ 1847.507880]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.510224]  #1: ffff88811ffbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.512684] 2 locks held by kworker/2:220/3347:
[ 1847.514454]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.516795]  #1: ffff8881165c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.519190] 2 locks held by kworker/1:106/3348:
[ 1847.520945]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.523275]  #1: ffff8881165cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.525698] 2 locks held by kworker/1:108/3350:
[ 1847.527508]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.529801]  #1: ffff8881165e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.532214] 2 locks held by kworker/1:109/3351:
[ 1847.534035]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.536369]  #1: ffff8881165f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.538787] 2 locks held by kworker/1:110/3352:
[ 1847.540550]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.542874]  #1: ffff888116a37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.545280] 2 locks held by kworker/1:111/3353:
[ 1847.547082]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.549357]  #1: ffff888116a47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.551733] 2 locks held by kworker/1:112/3354:
[ 1847.553531]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.555790]  #1: ffff888116a4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.558151] 2 locks held by kworker/1:114/3356:
[ 1847.559954]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.562196]  #1: ffff888116a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.564639] 2 locks held by kworker/1:115/3357:
[ 1847.566434]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.568712]  #1: ffff888116a7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.571103] 2 locks held by kworker/1:116/3358:
[ 1847.572922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.575204]  #1: ffff888116a87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.577562] 2 locks held by kworker/1:117/3359:
[ 1847.579381]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.581713]  #1: ffff888116a9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.584083] 2 locks held by kworker/1:119/3361:
[ 1847.585908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.588165]  #1: ffff888116ab7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.590577] 2 locks held by kworker/1:120/3362:
[ 1847.592382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.594688]  #1: ffff888116abfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.597066] 2 locks held by kworker/1:121/3363:
[ 1847.598848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.601147]  #1: ffff888116acfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.603549] 2 locks held by kworker/1:123/3365:
[ 1847.605340]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.607647]  #1: ffff888116ae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.610047] 2 locks held by kworker/1:124/3366:
[ 1847.611864]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.614188]  #1: ffff888116aefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.616565] 2 locks held by kworker/1:125/3367:
[ 1847.618340]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.620638]  #1: ffff888116affd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.623046] 2 locks held by kworker/1:126/3368:
[ 1847.624844]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.627133]  #1: ffff888116b07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.629539] 2 locks held by kworker/1:127/3369:
[ 1847.631330]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.633633]  #1: ffff888116b17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.636059] 2 locks held by kworker/1:129/3371:
[ 1847.637882]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.640201]  #1: ffff888116b2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.642586] 2 locks held by kworker/1:130/3372:
[ 1847.644401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.646695]  #1: ffff888116b3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.649024] 2 locks held by kworker/1:132/3374:
[ 1847.650768]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.653070]  #1: ffff888116b5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.655470] 2 locks held by kworker/1:134/3376:
[ 1847.657301]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.659606]  #1: ffff888116b77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.661962] 2 locks held by kworker/1:135/3377:
[ 1847.663777]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.666019]  #1: ffff888116b87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.668437] 2 locks held by kworker/1:136/3378:
[ 1847.670250]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.672595]  #1: ffff888116b8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.674955] 2 locks held by kworker/1:137/3379:
[ 1847.676736]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.678987]  #1: ffff888116b9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.681417] 2 locks held by kworker/1:138/3380:
[ 1847.683182]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.685500]  #1: ffff888116ba7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.687873] 2 locks held by kworker/1:141/3383:
[ 1847.689653]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.691934]  #1: ffff888116bcfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.694359] 2 locks held by kworker/1:143/3385:
[ 1847.696172]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.698478]  #1: ffff888116befd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.700903] 2 locks held by kworker/1:144/3386:
[ 1847.702685]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.704954]  #1: ffff888116bf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.707320] 2 locks held by kworker/1:146/3388:
[ 1847.709107]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.711342]  #1: ffff88813e40fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.713773] 2 locks held by kworker/1:147/3389:
[ 1847.715521]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.717823]  #1: ffff88813e41fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.720135] 2 locks held by kworker/2:226/3395:
[ 1847.721952]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.724259]  #1: ffff88813e46fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.726683] 2 locks held by kworker/2:230/3399:
[ 1847.728488]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.730738]  #1: ffff88813e4a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.733151] 2 locks held by kworker/2:235/3404:
[ 1847.734971]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.737282]  #1: ffff88813e4dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.739700] 2 locks held by kworker/2:237/3406:
[ 1847.741471]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.743751]  #1: ffff88813e4f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.746141] 2 locks held by kworker/2:238/3407:
[ 1847.747934]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.750210]  #1: ffff88813e507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.752594] 2 locks held by kworker/2:240/3409:
[ 1847.754389]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.756657]  #1: ffff88813e51fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.759038] 2 locks held by kworker/0:165/3410:
[ 1847.760840]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.763088]  #1: ffff88813e52fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.765501] 2 locks held by kworker/0:166/3411:
[ 1847.767292]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.769539]  #1: ffff88813e587d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.771912] 2 locks held by kworker/0:167/3412:
[ 1847.773703]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.775930]  #1: ffff88813e58fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.778359] 2 locks held by kworker/0:170/3415:
[ 1847.780177]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.782469]  #1: ffff88813e5b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.784868] 2 locks held by kworker/0:171/3416:
[ 1847.786668]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.788962]  #1: ffff88813e5bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.791373] 2 locks held by kworker/0:172/3417:
[ 1847.793191]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.795535]  #1: ffff88813e5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.797917] 2 locks held by kworker/0:173/3418:
[ 1847.799712]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.802009]  #1: ffff88813e5d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.804327] 2 locks held by kworker/0:174/3419:
[ 1847.806122]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.808399]  #1: ffff88813e5e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.810832] 2 locks held by kworker/0:175/3420:
[ 1847.812621]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.814863]  #1: ffff88813e5efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.817249] 2 locks held by kworker/0:177/3422:
[ 1847.819043]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.821388]  #1: ffff88813e607d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.823768] 2 locks held by kworker/0:181/3426:
[ 1847.825577]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.827834]  #1: ffff88811e057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.830158] 2 locks held by kworker/0:184/3429:
[ 1847.831957]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.834282]  #1: ffff88811d1bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.836701] 2 locks held by kworker/2:241/3430:
[ 1847.838428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.840701]  #1: ffff88813b6efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.843032] 2 locks held by kworker/2:242/3431:
[ 1847.844838]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.847128]  #1: ffff888138427d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.849525] 2 locks held by kworker/2:245/3434:
[ 1847.851328]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.853618]  #1: ffff88813e617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.856041] 2 locks held by kworker/2:250/3439:
[ 1847.857842]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.860122]  #1: ffff88813e657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.862556] 2 locks held by kworker/2:251/3440:
[ 1847.864373]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.866672]  #1: ffff88813e667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.869094] 2 locks held by kworker/2:253/3442:
[ 1847.870890]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.873154]  #1: ffff88813e67fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.875581] 2 locks held by kworker/3:114/3447:
[ 1847.877394]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.879740]  #1: ffff88813e6b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.882098] 2 locks held by kworker/3:115/3448:
[ 1847.883918]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.886171]  #1: ffff88813e6c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.888583] 2 locks held by kworker/3:116/3449:
[ 1847.890398]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.892705]  #1: ffff88813e747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.895051] 2 locks held by kworker/3:118/3451:
[ 1847.896847]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.899132]  #1: ffff88813e767d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.901507] 2 locks held by kworker/3:120/3453:
[ 1847.903236]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.905527]  #1: ffff88813e77fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.907901] 2 locks held by kworker/3:122/3455:
[ 1847.909708]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.912026]  #1: ffff88813e797d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.914454] 2 locks held by kworker/3:124/3457:
[ 1847.916231]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.918510]  #1: ffff88813e7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.920927] 2 locks held by kworker/3:125/3458:
[ 1847.922695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.925008]  #1: ffff88813e7bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.927436] 2 locks held by kworker/3:128/3461:
[ 1847.929255]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.931546]  #1: ffff88813e7dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.933891] 2 locks held by kworker/3:131/3464:
[ 1847.935689]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.937985]  #1: ffff88813e047d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.940359] 2 locks held by kworker/0:186/3467:
[ 1847.942142]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.944467]  #1: ffff88813e06fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.946777] 2 locks held by kworker/0:188/3469:
[ 1847.948559]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.950874]  #1: ffff88813e087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.953271] 2 locks held by kworker/0:189/3470:
[ 1847.955060]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.957359]  #1: ffff88813e097d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.959762] 2 locks held by kworker/0:191/3472:
[ 1847.961559]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.963833]  #1: ffff88813e0afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.966201] 2 locks held by kworker/0:192/3473:
[ 1847.967990]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.970260]  #1: ffff88813e0b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.972661] 2 locks held by kworker/0:193/3474:
[ 1847.974439]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.976762]  #1: ffff88813e0c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.979010] 2 locks held by kworker/0:195/3476:
[ 1847.980748]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.983039]  #1: ffff88813e0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.985423] 2 locks held by kworker/0:197/3478:
[ 1847.987205]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.989489]  #1: ffff88813e0ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.991878] 2 locks held by kworker/0:198/3479:
[ 1847.993608]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1847.995881]  #1: ffff88813e13fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1847.998236] 2 locks held by kworker/0:199/3480:
[ 1848.000010]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.002319]  #1: ffff88813e14fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.004719] 2 locks held by kworker/0:200/3481:
[ 1848.006525]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.008835]  #1: ffff88813e157d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.011188] 2 locks held by kworker/0:203/3484:
[ 1848.012969]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.015327]  #1: ffff88813e17fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.017729] 2 locks held by kworker/0:205/3486:
[ 1848.019539]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.021850]  #1: ffff88813e19fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.024275] 2 locks held by kworker/0:206/3487:
[ 1848.026073]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.028400]  #1: ffff88813e1a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.030781] 2 locks held by kworker/0:207/3488:
[ 1848.032591]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.034794]  #1: ffff88813e1b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.037098] 2 locks held by kworker/0:208/3489:
[ 1848.038870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.041174]  #1: ffff88813e1c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.043556] 2 locks held by kworker/0:209/3490:
[ 1848.045370]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.047648]  #1: ffff88813e1d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.050062] 2 locks held by kworker/0:211/3492:
[ 1848.051827]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.054123]  #1: ffff88813e227d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.056524] 2 locks held by kworker/0:215/3496:
[ 1848.058345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.060635]  #1: ffff88813e257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.063063] 2 locks held by kworker/0:219/3500:
[ 1848.064887]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.067150]  #1: ffff88813e287d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.069530] 2 locks held by kworker/0:220/3501:
[ 1848.071321]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.073575]  #1: ffff88813e28fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.075995] 2 locks held by kworker/0:221/3502:
[ 1848.077815]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.080025]  #1: ffff8881348afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.082419] 2 locks held by kworker/0:222/3503:
[ 1848.084240]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.086510]  #1: ffff88812e54fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.088878] 2 locks held by kworker/0:224/3505:
[ 1848.090652]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.092903]  #1: ffff888126f0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.095228] 2 locks held by kworker/3:133/3506:
[ 1848.097027]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.099344]  #1: ffff88813c507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.101705] 2 locks held by kworker/0:225/3507:
[ 1848.103476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.105768]  #1: ffff88811f2d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.108190] 2 locks held by kworker/0:228/3510:
[ 1848.109978]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.112277]  #1: ffff888130bc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.114622] 2 locks held by kworker/0:229/3511:
[ 1848.116439]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.118707]  #1: ffff88811cd5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.121118] 2 locks held by kworker/0:231/3513:
[ 1848.122938]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.125207]  #1: ffff888122837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.127584] 2 locks held by kworker/0:234/3516:
[ 1848.129347]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.131683]  #1: ffff8881277bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.134053] 2 locks held by kworker/0:235/3517:
[ 1848.135872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.138204]  #1: ffff88811a1bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.140577] 2 locks held by kworker/0:237/3519:
[ 1848.142368]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.144709]  #1: ffff8881182f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.147107] 2 locks held by kworker/0:238/3520:
[ 1848.148899]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.151163]  #1: ffff8881394ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.153550] 2 locks held by kworker/0:239/3521:
[ 1848.155363]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.157675]  #1: ffff888120a3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.160047] 2 locks held by kworker/0:240/3522:
[ 1848.161866]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.164186]  #1: ffff88812cf97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.166588] 2 locks held by kworker/0:241/3523:
[ 1848.168362]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.170636]  #1: ffff888132a37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.172998] 2 locks held by kworker/1:149/3528:
[ 1848.174801]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.177106]  #1: ffff88813b2b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.179521] 2 locks held by kworker/1:153/3532:
[ 1848.181318]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.183570]  #1: ffff888115c87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.185967] 2 locks held by kworker/1:154/3533:
[ 1848.187772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.190094]  #1: ffff888115c8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.192475] 2 locks held by kworker/1:156/3535:
[ 1848.194287]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.196585]  #1: ffff888115ca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.198982] 2 locks held by kworker/1:157/3536:
[ 1848.200771]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.203095]  #1: ffff88813e3bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.205467] 2 locks held by kworker/1:159/3538:
[ 1848.207299]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.209643]  #1: ffff88813e3d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.212044] 2 locks held by kworker/1:160/3539:
[ 1848.213848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.216173]  #1: ffff88813e3e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.218566] 2 locks held by kworker/1:162/3541:
[ 1848.220380]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.222663]  #1: ffff88813e3ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.225041] 2 locks held by kworker/1:163/3542:
[ 1848.226859]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.229127]  #1: ffff88813dc0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.231558] 2 locks held by kworker/1:164/3543:
[ 1848.233346]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.235588]  #1: ffff88813dc17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.237998] 2 locks held by kworker/1:165/3544:
[ 1848.239818]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.242102]  #1: ffff88813dc27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.244507] 2 locks held by kworker/0:245/3546:
[ 1848.246246]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.248542]  #1: ffff88813dc3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.250948] 2 locks held by kworker/0:248/3549:
[ 1848.252757]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.254964]  #1: ffff88813dc67d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.257370] 2 locks held by kworker/0:249/3550:
[ 1848.259162]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.261488]  #1: ffff88813dc77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.263876] 2 locks held by kworker/0:250/3551:
[ 1848.265634]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.267917]  #1: ffff88813dc7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.270327] 2 locks held by kworker/0:252/3553:
[ 1848.272112]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.274439]  #1: ffff88813dc97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.276827] 2 locks held by kworker/0:253/3554:
[ 1848.278595]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.280893]  #1: ffff88813dca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.283292] 2 locks held by kworker/0:255/3556:
[ 1848.285086]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.287396]  #1: ffff88813dcbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.289755] 2 locks held by kworker/3:134/3558:
[ 1848.291566]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.293838]  #1: ffff88813dcdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.296224] 2 locks held by kworker/3:135/3559:
[ 1848.297985]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.300275]  #1: ffff88813dce7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.302673] 2 locks held by kworker/3:136/3560:
[ 1848.304445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.306727]  #1: ffff88813dcf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.309148] 2 locks held by kworker/3:137/3561:
[ 1848.310950]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.313250]  #1: ffff88813dd37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.315666] 2 locks held by kworker/3:141/3565:
[ 1848.317476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.319782]  #1: ffff88813dd6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.322179] 2 locks held by kworker/3:143/3567:
[ 1848.323971]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.326288]  #1: ffff88813dd87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.328681] 2 locks held by kworker/3:144/3568:
[ 1848.330412]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.332695]  #1: ffff88813dd97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.335071] 2 locks held by kworker/3:146/3570:
[ 1848.336870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.339177]  #1: ffff88813ddafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.341567] 2 locks held by kworker/3:149/3573:
[ 1848.343358]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.345637]  #1: ffff88813ddcfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.348027] 2 locks held by kworker/3:150/3574:
[ 1848.349819]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.352086]  #1: ffff88813dddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.354507] 2 locks held by kworker/3:151/3575:
[ 1848.356320]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.358664]  #1: ffff88813ddf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.361065] 2 locks held by kworker/3:152/3576:
[ 1848.362867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.365196]  #1: ffff88813de07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.367604] 2 locks held by kworker/3:153/3577:
[ 1848.369354]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.371643]  #1: ffff88813de0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.374031] 2 locks held by kworker/0:257/3578:
[ 1848.375841]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.378138]  #1: ffff88813de1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.380570] 2 locks held by kworker/3:154/3579:
[ 1848.382385]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.384693]  #1: ffff88813de3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.387099] 2 locks held by kworker/3:156/3581:
[ 1848.388912]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.391253]  #1: ffff88813dedfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.393640] 2 locks held by kworker/1:167/3585:
[ 1848.395440]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.397769]  #1: ffff888134f9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.400195] 2 locks held by kworker/1:168/3586:
[ 1848.402010]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.404334]  #1: ffff8881304a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.406710] 2 locks held by kworker/1:169/3587:
[ 1848.408518]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.410797]  #1: ffff888128997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.413123] 2 locks held by kworker/1:170/3588:
[ 1848.414922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.417223]  #1: ffff888128c0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.419652] 2 locks held by kworker/1:173/3591:
[ 1848.421465]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.423770]  #1: ffff88812479fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.426139] 2 locks held by kworker/3:159/3592:
[ 1848.427922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.430183]  #1: ffff88813b37fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.432620] 2 locks held by kworker/3:161/3594:
[ 1848.434390]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.436736]  #1: ffff88812f527d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.439124] 2 locks held by kworker/1:174/3595:
[ 1848.440806]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.443037]  #1: ffff88812ddefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.445407] 2 locks held by kworker/1:175/3596:
[ 1848.447227]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.449537]  #1: ffff88813d93fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.451925] 2 locks held by kworker/1:176/3597:
[ 1848.453695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.456000]  #1: ffff88813d94fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.458405] 2 locks held by kworker/1:178/3599:
[ 1848.460161]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.462413]  #1: ffff88813d967d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.464819] 2 locks held by kworker/1:179/3600:
[ 1848.466623]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.468907]  #1: ffff88813dadfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.471296] 2 locks held by kworker/1:180/3601:
[ 1848.473040]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.475356]  #1: ffff88813dae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.477785] 2 locks held by kworker/1:181/3602:
[ 1848.479595]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.481902]  #1: ffff88813daf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.484286] 2 locks held by kworker/1:182/3603:
[ 1848.486088]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.488360]  #1: ffff88813daffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.490762] 2 locks held by kworker/1:184/3605:
[ 1848.492571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.494862]  #1: ffff88813db1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.497218] 2 locks held by kworker/1:185/3606:
[ 1848.498997]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.501268]  #1: ffff88813db2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.503383] 2 locks held by kworker/1:186/3607:
[ 1848.505022]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.507066]  #1: ffff88813db37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.509171] 2 locks held by kworker/1:189/3610:
[ 1848.510820]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.512859]  #1: ffff88813db5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.514967] 2 locks held by kworker/1:191/3612:
[ 1848.516610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.518642]  #1: ffff88813db77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.520738] 2 locks held by kworker/1:192/3613:
[ 1848.522384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.524420]  #1: ffff88813db7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.526519] 2 locks held by kworker/1:193/3614:
[ 1848.528154]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.530192]  #1: ffff88813db8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.532301] 2 locks held by kworker/1:194/3615:
[ 1848.533949]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.535994]  #1: ffff88813db9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.538100] 2 locks held by kworker/1:195/3616:
[ 1848.539738]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.541777]  #1: ffff88813dbafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.543878] 2 locks held by kworker/1:196/3617:
[ 1848.545523]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.547550]  #1: ffff88813dbb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.549645] 2 locks held by kworker/1:198/3619:
[ 1848.551279]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.553424]  #1: ffff88813dbd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.555635] 2 locks held by kworker/1:199/3620:
[ 1848.557345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.559466]  #1: ffff88813dbe7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.561573] 2 locks held by kworker/1:200/3621:
[ 1848.563208]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.565245]  #1: ffff88813dbefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.567358] 2 locks held by kworker/1:203/3624:
[ 1848.568987]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.571029]  #1: ffff888161817d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.573136] 2 locks held by kworker/1:206/3627:
[ 1848.574789]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.576827]  #1: ffff888161837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.578934] 2 locks held by kworker/1:209/3630:
[ 1848.580574]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.582608]  #1: ffff88816185fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.584704] 2 locks held by kworker/1:210/3631:
[ 1848.586343]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.588374]  #1: ffff88816186fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.590480] 2 locks held by kworker/1:211/3632:
[ 1848.592125]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.594163]  #1: ffff88816187fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.596268] 2 locks held by kworker/3:162/3633:
[ 1848.597914]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.599956]  #1: ffff88816189fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.602064] 2 locks held by kworker/3:163/3634:
[ 1848.603707]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.605740]  #1: ffff888127a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.607845] 2 locks held by kworker/3:164/3635:
[ 1848.609485]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.611528]  #1: ffff888128f3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.613627] 2 locks held by kworker/3:166/3637:
[ 1848.615263]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.617310]  #1: ffff88812b83fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.619419] 2 locks held by kworker/3:167/3638:
[ 1848.621064]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.623106]  #1: ffff88812aa57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.625217] 2 locks held by kworker/3:168/3639:
[ 1848.626872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.628921]  #1: ffff888127d3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.631031] 2 locks held by kworker/3:170/3641:
[ 1848.632673]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.634706]  #1: ffff88811ec6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.636811] 2 locks held by kworker/3:171/3642:
[ 1848.638453]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.640493]  #1: ffff88812f687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.642594] 2 locks held by kworker/3:172/3643:
[ 1848.644233]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.646272]  #1: ffff8881380a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.648388] 2 locks held by kworker/1:212/3644:
[ 1848.650034]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.652083]  #1: ffff888126e6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.654204] 2 locks held by kworker/1:213/3645:
[ 1848.655862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.657906]  #1: ffff8881276afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.660014] 2 locks held by kworker/1:214/3646:
[ 1848.661658]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.663690]  #1: ffff8881323dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.665799] 2 locks held by kworker/1:215/3647:
[ 1848.667443]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.669472]  #1: ffff888129ecfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.671566] 2 locks held by kworker/1:216/3648:
[ 1848.673206]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.675248]  #1: ffff88810f47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.677353] 2 locks held by kworker/1:218/3650:
[ 1848.679001]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.681046]  #1: ffff888126487d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.683158] 2 locks held by kworker/1:220/3652:
[ 1848.684810]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.686858]  #1: ffff88813d47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.688970] 2 locks held by kworker/1:222/3654:
[ 1848.690618]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.692656]  #1: ffff8881289d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.694758] 2 locks held by kworker/1:223/3655:
[ 1848.696401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.698445]  #1: ffff888126a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.700559] 2 locks held by kworker/1:224/3656:
[ 1848.702204]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.704245]  #1: ffff88812338fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.706363] 2 locks held by kworker/1:226/3658:
[ 1848.708009]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.710057]  #1: ffff888105697d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.712165] 2 locks held by kworker/1:227/3659:
[ 1848.713818]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.715864]  #1: ffff888130d6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.717969] 2 locks held by kworker/1:229/3661:
[ 1848.719616]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.721652]  #1: ffff88813c977d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.723760] 2 locks held by kworker/3:173/3663:
[ 1848.725406]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.727446]  #1: ffff88812b3a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.729563] 2 locks held by kworker/3:174/3664:
[ 1848.731188]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.733232]  #1: ffff88812b28fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.735350] 2 locks held by kworker/3:176/3666:
[ 1848.736998]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.739041]  #1: ffff888130617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.741151] 2 locks held by kworker/3:177/3667:
[ 1848.742800]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.744841]  #1: ffff88812fcbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.746948] 2 locks held by kworker/3:180/3670:
[ 1848.748596]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.750633]  #1: ffff88812f107d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.752736] 2 locks held by kworker/3:181/3671:
[ 1848.754382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.756410]  #1: ffff88812feffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.758503] 2 locks held by kworker/3:182/3672:
[ 1848.760134]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.762174]  #1: ffff88812bc8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.764287] 2 locks held by kworker/3:185/3675:
[ 1848.765941]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.767986]  #1: ffff8881348dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.770095] 2 locks held by kworker/3:187/3677:
[ 1848.771739]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.773785]  #1: ffff888132c87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.775892] 2 locks held by kworker/3:188/3678:
[ 1848.777537]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.779565]  #1: ffff888121e2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.781669] 2 locks held by kworker/3:195/3685:
[ 1848.783314]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.785354]  #1: ffff88812c187d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.787460] 2 locks held by kworker/3:197/3687:
[ 1848.789094]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.791137]  #1: ffff888131f5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.793245] 2 locks held by kworker/3:198/3688:
[ 1848.794897]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.796942]  #1: ffff88813516fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.799050] 2 locks held by kworker/3:202/3692:
[ 1848.800695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.802729]  #1: ffff8881350efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.804833] 2 locks held by kworker/3:204/3694:
[ 1848.806476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.808520]  #1: ffff88811e2a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.810620] 2 locks held by kworker/3:205/3695:
[ 1848.812256]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.814287]  #1: ffff88812f4cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.816397] 2 locks held by kworker/3:207/3697:
[ 1848.818041]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.820082]  #1: ffff8881247dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.822188] 2 locks held by kworker/3:208/3698:
[ 1848.823848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.825889]  #1: ffff88811934fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.827995] 2 locks held by kworker/3:209/3699:
[ 1848.829643]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.831677]  #1: ffff8881231d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.833786] 2 locks held by kworker/3:211/3701:
[ 1848.835431]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.837472]  #1: ffff888133b6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.839583] 2 locks held by kworker/3:212/3702:
[ 1848.841220]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.843267]  #1: ffff88813242fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.845377] 2 locks held by kworker/3:214/3704:
[ 1848.847024]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.849080]  #1: ffff8881316b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.851197] 2 locks held by kworker/3:217/3707:
[ 1848.852849]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.854894]  #1: ffff88811476fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.856998] 2 locks held by kworker/3:218/3708:
[ 1848.858649]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.860681]  #1: ffff888132bdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.862786] 2 locks held by kworker/3:220/3710:
[ 1848.864428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.866459]  #1: ffff888125137d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.868556] 2 locks held by kworker/3:221/3711:
[ 1848.870182]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.872225]  #1: ffff888132597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.874338] 2 locks held by kworker/3:223/3713:
[ 1848.875983]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.878025]  #1: ffff8881209cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.880131] 2 locks held by kworker/3:224/3714:
[ 1848.881784]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.883828]  #1: ffff88811f877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.885935] 2 locks held by kworker/3:225/3715:
[ 1848.887576]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.889601]  #1: ffff88811cf47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.891689] 2 locks held by kworker/3:226/3716:
[ 1848.893331]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.895372]  #1: ffff88811cd7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.897468] 2 locks held by kworker/3:227/3717:
[ 1848.899110]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.901150]  #1: ffff888111b9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.903256] 2 locks held by kworker/3:232/3722:
[ 1848.904908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.906947]  #1: ffff88812aed7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.909055] 2 locks held by kworker/3:233/3723:
[ 1848.910698]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.912732]  #1: ffff888130637d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.914839] 2 locks held by kworker/3:238/3728:
[ 1848.916481]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.918521]  #1: ffff8881399efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.920623] 2 locks held by kworker/1:231/3737:
[ 1848.922259]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.924298]  #1: ffff8881290a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.926411] 2 locks held by kworker/1:232/3738:
[ 1848.928047]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.930086]  #1: ffff888120b77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.932193] 2 locks held by kworker/1:237/3743:
[ 1848.933842]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.935906]  #1: ffff8881100f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.938012] 2 locks held by kworker/1:238/3744:
[ 1848.939659]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.941690]  #1: ffff88812e0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.943802] 2 locks held by kworker/1:239/3745:
[ 1848.945451]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.947488]  #1: ffff88810ad5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.949607] 2 locks held by kworker/1:241/3747:
[ 1848.951246]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.953289]  #1: ffff88811fb5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.955408] 2 locks held by kworker/1:242/3748:
[ 1848.957058]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.959102]  #1: ffff888119eefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.961208] 2 locks held by kworker/1:243/3749:
[ 1848.962862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.964903]  #1: ffff888130d87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.967016] 2 locks held by kworker/1:244/3750:
[ 1848.968662]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.970700]  #1: ffff8881289afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.972810] 2 locks held by kworker/1:245/3751:
[ 1848.974456]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.976486]  #1: ffff8881063cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.978601] 2 locks held by kworker/1:246/3752:
[ 1848.980230]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.982276]  #1: ffff88811fca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.984392] 2 locks held by kworker/1:248/3754:
[ 1848.986033]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.988077]  #1: ffff888106997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.990184] 2 locks held by kworker/1:249/3755:
[ 1848.991840]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.993885]  #1: ffff8881372afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1848.995994] 2 locks held by kworker/1:250/3756:
[ 1848.997641]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1848.999679]  #1: ffff8881209b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.001787] 2 locks held by kworker/1:251/3757:
[ 1849.003436]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.005482]  #1: ffff8881314ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.007582] 2 locks held by kworker/1:252/3758:
[ 1849.009219]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.011260]  #1: ffff888130d8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.013377] 2 locks held by kworker/1:253/3759:
[ 1849.015026]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.017066]  #1: ffff8881371b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.019201] 2 locks held by kworker/1:255/3761:
[ 1849.020862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.022902]  #1: ffff88811d897d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.025007] 2 locks held by kworker/1:256/3762:
[ 1849.026649]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.028680]  #1: ffff88813b99fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.030792] 2 locks held by kworker/1:257/3763:
[ 1849.032440]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.034483]  #1: ffff88813b0efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.036590] 2 locks held by kworker/3:247/3765:
[ 1849.038220]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.040260]  #1: ffff888134867d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.042382] 2 locks held by kworker/3:248/3766:
[ 1849.044023]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.046066]  #1: ffff888124b7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.048170] 2 locks held by kworker/3:249/3767:
[ 1849.049821]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.051860]  #1: ffff888131aafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.053964] 2 locks held by kworker/3:251/3769:
[ 1849.055610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.057647]  #1: ffff8881068ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.059746] 2 locks held by kworker/3:252/3770:
[ 1849.061399]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.063434]  #1: ffff88810b757d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.065537] 2 locks held by kworker/3:254/3772:
[ 1849.067174]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.069213]  #1: ffff888136d97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.071346] 2 locks held by kworker/3:255/3773:
[ 1849.072992]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.075027]  #1: ffff88811830fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.077139] 2 locks held by kworker/3:256/3774:
[ 1849.078795]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.080838]  #1: ffff888127547d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.082946] 2 locks held by kworker/1:258/4004:
[ 1849.084592]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.086626]  #1: ffff88812cbefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.088727] 2 locks held by kworker/0:18/13817:
[ 1849.090368]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.092406]  #1: ffff8881213cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.094515] 2 locks held by kworker/1:97/23521:
[ 1849.096151]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.098194]  #1: ffff88810c33fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.100300] 2 locks held by kworker/1:259/28552:
[ 1849.101959]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.103997]  #1: ffff888140777d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.106102] 2 locks held by kworker/3:258/38106:
[ 1849.107766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.109802]  #1: ffff888111b0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
[ 1849.111912] 2 locks held by kworker/1:172/39248:
[ 1849.113563]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
[ 1849.115594]  #1: ffff888110eefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0

[ 1849.119116] =============================================

[2]

$ ps axuw | grep " D "
root           9  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:1+dio/dm-1]
root          25  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:0+dio/dm-1]
root          49  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:1+dio/dm-1]
root          74  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:2+dio/dm-1]
root         169  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/3:2+dio/dm-1]
root         221  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:3+dio/dm-1]
root         230  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:2+dio/dm-1]
root         291  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/2:3+dio/dm-1]
root         322  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:3+dio/dm-1]
root        2757  2.1  0.0      0     0 ?        D    10:57   1:14 [kworker/u8:7+flush-253:1]
root        2759  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/3:4+dio/dm-1]
root        2760  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/0:4+dio/dm-1]
root        2762  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/1:5+dio/dm-1]
root        2764  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/1:6+dio/dm-1]
root        2765  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/3:5+dio/dm-1]
...
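
The reproduction amounts to repeating the test case until it hangs. A minimal sketch of that loop, with `run_until_failure` as a hypothetical helper and `true` standing in for `./check srp/002` from a blktests checkout:

```shell
#!/bin/sh
# Run a command repeatedly until it fails, up to a maximum number of
# iterations. The reported hang appeared within 15 to 30 runs of
# "./check srp/002"; "true" below is only a stand-in test command.
run_until_failure() {
    cmd="$1"; max="$2"; i=1
    while [ "$i" -le "$max" ]; do
        if ! $cmd; then
            echo "failed on iteration $i"
            return 1
        fi
        i=$((i + 1))
    done
    echo "passed all $max iterations"
    return 0
}

run_until_failure true 30   # prints "passed all 30 iterations"
```

With a real blktests checkout one would invoke it as `run_until_failure "./check srp/002" 30` and inspect dmesg once it stops.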


* Re: [bug report] blktests srp/002 hang
  2023-08-21  6:46 [bug report] blktests srp/002 hang Shinichiro Kawasaki
@ 2023-08-22  1:46 ` Bob Pearson
  2023-08-22 10:18   ` Shinichiro Kawasaki
  2023-09-22 11:06 ` Linux regression tracking #adding (Thorsten Leemhuis)
  1 sibling, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-08-22  1:46 UTC (permalink / raw)
  To: Shinichiro Kawasaki, linux-rdma, linux-scsi

On 8/21/23 01:46, Shinichiro Kawasaki wrote:
> I occasionally observed a process hang in the blktests test case srp/002 using
> kernel v6.5-rcX. The kernel reported stalls of many kworkers [1]. PID 2757 hung
> in inode_sleep_on_writeback(), and other kworkers hung in __inode_wait_for_writeback().
> 
> The hang is reliably recreated by repeating the test case srp/002 (between
> 15 and 30 times).
> 
> I bisected and found that commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
> for rxe tasks") looks like the trigger. When I revert it from kernel
> v6.5-rc7, the hang symptom disappears. I'm not sure how the commit relates to
> the hang. Comments are welcome.
> 
> [1]
> 
> ...
> [ 1670.489181] scsi 4:0:0:1: alua: Detached
> [ 1670.985461] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-38: queued zerolength write
> [ 1670.985702] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-36: queued zerolength write
> [ 1670.985716] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-38 wc->status 5
> [ 1670.985821] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-38
> [ 1670.985824] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-36 wc->status 5
> [ 1670.985909] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-34: queued zerolength write
> [ 1670.985924] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-36
> [ 1670.986104] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-34 wc->status 5
> [ 1670.986244] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-34
> [ 1671.049223] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-40: queued zerolength write
> [ 1671.049588] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-40 wc->status 5
> [ 1671.049626] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-40
> [ 1844.873748] INFO: task kworker/0:1:9 blocked for more than 122 seconds.
> [ 1844.877893]       Not tainted 6.5.0-rc7 #106
> [ 1844.878903] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1844.880255] task:kworker/0:1     state:D stack:0     pid:9     ppid:2      flags:0x00004000
> [ 1844.881830] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1844.882999] Call Trace:
> [ 1844.883900]  <TASK>
> [ 1844.884703]  __schedule+0x10ac/0x5e80
> [ 1844.885609]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1844.886569]  ? __pfx___schedule+0x10/0x10
> [ 1844.887596]  ? lock_release+0x378/0x650
> [ 1844.888431]  ? schedule+0x92/0x220
> [ 1844.889232]  ? mark_held_locks+0x96/0xe0
> [ 1844.890117]  schedule+0x133/0x220
> [ 1844.890874]  bit_wait+0x17/0xe0
> [ 1844.891619]  __wait_on_bit+0x66/0x180
> [ 1844.892409]  ? __pfx_bit_wait+0x10/0x10
> [ 1844.893192]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1844.894245]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1844.895225]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1844.896138]  ? find_held_lock+0x2d/0x110
> [ 1844.897085]  writeback_single_inode+0xf9/0x3f0
> [ 1844.898186]  sync_inode_metadata+0x91/0xd0
> [ 1844.899036]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1844.900106]  ? lock_release+0x378/0x650
> [ 1844.900988]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1844.901978]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1844.902964]  ext4_sync_file+0x469/0xb60
> [ 1844.903859]  iomap_dio_complete+0x5d1/0x860
> [ 1844.904828]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1844.905841]  iomap_dio_complete_work+0x52/0x80
> [ 1844.906774]  process_one_work+0x898/0x14a0
> [ 1844.907673]  ? __pfx_lock_acquire+0x10/0x10
> [ 1844.908644]  ? __pfx_process_one_work+0x10/0x10
> [ 1844.909693]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1844.910676]  worker_thread+0x100/0x12c0
> [ 1844.911612]  ? __kthread_parkme+0xc1/0x1f0
> [ 1844.912542]  ? __pfx_worker_thread+0x10/0x10
> [ 1844.913584]  kthread+0x2ea/0x3c0
> [ 1844.914465]  ? __pfx_kthread+0x10/0x10
> [ 1844.915335]  ret_from_fork+0x30/0x70
> [ 1844.916269]  ? __pfx_kthread+0x10/0x10
> [ 1844.917308]  ret_from_fork_asm+0x1b/0x30
> [ 1844.918243]  </TASK>
> [ 1844.918998] INFO: task kworker/1:0:25 blocked for more than 122 seconds.
> [ 1844.920107]       Not tainted 6.5.0-rc7 #106
> [ 1844.921041] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1844.922262] task:kworker/1:0     state:D stack:0     pid:25    ppid:2      flags:0x00004000
> [ 1844.923550] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1844.924598] Call Trace:
> [ 1844.925407]  <TASK>
> [ 1844.926194]  __schedule+0x10ac/0x5e80
> [ 1844.927097]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1844.928032]  ? __pfx___schedule+0x10/0x10
> [ 1844.928937]  ? lock_release+0x378/0x650
> [ 1844.929823]  ? schedule+0x92/0x220
> [ 1844.930682]  ? mark_held_locks+0x96/0xe0
> [ 1844.931579]  schedule+0x133/0x220
> [ 1844.932411]  bit_wait+0x17/0xe0
> [ 1844.933238]  __wait_on_bit+0x66/0x180
> [ 1844.934107]  ? __pfx_bit_wait+0x10/0x10
> [ 1844.934996]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1844.935956]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1844.936969]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1844.937942]  ? find_held_lock+0x2d/0x110
> [ 1844.938891]  writeback_single_inode+0xf9/0x3f0
> [ 1844.939836]  sync_inode_metadata+0x91/0xd0
> [ 1844.940758]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1844.941730]  ? lock_release+0x378/0x650
> [ 1844.942640]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1844.943647]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1844.944652]  ext4_sync_file+0x469/0xb60
> [ 1844.945561]  iomap_dio_complete+0x5d1/0x860
> [ 1844.946469]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1844.947417]  iomap_dio_complete_work+0x52/0x80
> [ 1844.948358]  process_one_work+0x898/0x14a0
> [ 1844.949284]  ? __pfx_lock_acquire+0x10/0x10
> [ 1844.950204]  ? __pfx_process_one_work+0x10/0x10
> [ 1844.951152]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1844.952094]  worker_thread+0x100/0x12c0
> [ 1844.952998]  ? __pfx_worker_thread+0x10/0x10
> [ 1844.953919]  kthread+0x2ea/0x3c0
> [ 1844.954760]  ? __pfx_kthread+0x10/0x10
> [ 1844.955669]  ret_from_fork+0x30/0x70
> [ 1844.956550]  ? __pfx_kthread+0x10/0x10
> [ 1844.957418]  ret_from_fork_asm+0x1b/0x30
> [ 1844.958321]  </TASK>
> [ 1844.959085] INFO: task kworker/1:1:49 blocked for more than 122 seconds.
> [ 1844.960193]       Not tainted 6.5.0-rc7 #106
> [ 1844.961122] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1844.962340] task:kworker/1:1     state:D stack:0     pid:49    ppid:2      flags:0x00004000
> [ 1844.963619] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1844.964667] Call Trace:
> [ 1844.965503]  <TASK>
> [ 1844.966289]  __schedule+0x10ac/0x5e80
> [ 1844.967207]  ? lock_acquire+0x1a9/0x4e0
> [ 1844.968122]  ? __pfx___schedule+0x10/0x10
> [ 1844.969034]  ? lock_release+0x378/0x650
> [ 1844.969922]  ? schedule+0x92/0x220
> [ 1844.970778]  ? mark_held_locks+0x96/0xe0
> [ 1844.971674]  schedule+0x133/0x220
> [ 1844.972526]  bit_wait+0x17/0xe0
> [ 1844.973336]  __wait_on_bit+0x66/0x180
> [ 1844.974206]  ? __pfx_bit_wait+0x10/0x10
> [ 1844.975086]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1844.976046]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1844.977056]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1844.978007]  ? find_held_lock+0x2d/0x110
> [ 1844.978917]  writeback_single_inode+0xf9/0x3f0
> [ 1844.979865]  sync_inode_metadata+0x91/0xd0
> [ 1844.980786]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1844.981765]  ? lock_release+0x378/0x650
> [ 1844.982677]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1844.983687]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1844.984696]  ext4_sync_file+0x469/0xb60
> [ 1844.985608]  iomap_dio_complete+0x5d1/0x860
> [ 1844.986548]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1844.987484]  iomap_dio_complete_work+0x52/0x80
> [ 1844.988435]  process_one_work+0x898/0x14a0
> [ 1844.989352]  ? __pfx_lock_acquire+0x10/0x10
> [ 1844.990275]  ? __pfx_process_one_work+0x10/0x10
> [ 1844.991220]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1844.992164]  worker_thread+0x100/0x12c0
> [ 1844.993065]  ? __kthread_parkme+0xc1/0x1f0
> [ 1844.993977]  ? __pfx_worker_thread+0x10/0x10
> [ 1844.994934]  kthread+0x2ea/0x3c0
> [ 1844.995783]  ? __pfx_kthread+0x10/0x10
> [ 1844.996670]  ret_from_fork+0x30/0x70
> [ 1844.997544]  ? __pfx_kthread+0x10/0x10
> [ 1844.998409]  ret_from_fork_asm+0x1b/0x30
> [ 1844.999308]  </TASK>
> [ 1845.000094] INFO: task kworker/0:2:74 blocked for more than 123 seconds.
> [ 1845.001315]       Not tainted 6.5.0-rc7 #106
> [ 1845.002326] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.003630] task:kworker/0:2     state:D stack:0     pid:74    ppid:2      flags:0x00004000
> [ 1845.004991] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.006108] Call Trace:
> [ 1845.006975]  <TASK>
> [ 1845.007805]  __schedule+0x10ac/0x5e80
> [ 1845.008781]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1845.009780]  ? __pfx___schedule+0x10/0x10
> [ 1845.010736]  ? lock_release+0x378/0x650
> [ 1845.011666]  ? schedule+0x92/0x220
> [ 1845.012579]  ? mark_held_locks+0x96/0xe0
> [ 1845.013531]  schedule+0x133/0x220
> [ 1845.014414]  bit_wait+0x17/0xe0
> [ 1845.015287]  __wait_on_bit+0x66/0x180
> [ 1845.016219]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.017164]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.018185]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.019269]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.020282]  ? find_held_lock+0x2d/0x110
> [ 1845.021246]  writeback_single_inode+0xf9/0x3f0
> [ 1845.022248]  sync_inode_metadata+0x91/0xd0
> [ 1845.023222]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.024255]  ? lock_release+0x378/0x650
> [ 1845.025207]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.026281]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.027347]  ext4_sync_file+0x469/0xb60
> [ 1845.028302]  iomap_dio_complete+0x5d1/0x860
> [ 1845.029275]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.030276]  iomap_dio_complete_work+0x52/0x80
> [ 1845.031281]  process_one_work+0x898/0x14a0
> [ 1845.032248]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.033199]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.034182]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.035188]  worker_thread+0x100/0x12c0
> [ 1845.036138]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.037104]  kthread+0x2ea/0x3c0
> [ 1845.037996]  ? __pfx_kthread+0x10/0x10
> [ 1845.038923]  ret_from_fork+0x30/0x70
> [ 1845.039840]  ? __pfx_kthread+0x10/0x10
> [ 1845.040763]  ret_from_fork_asm+0x1b/0x30
> [ 1845.041729]  </TASK>
> [ 1845.042531] INFO: task kworker/3:2:169 blocked for more than 123 seconds.
> [ 1845.043703]       Not tainted 6.5.0-rc7 #106
> [ 1845.044780] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.046068] task:kworker/3:2     state:D stack:0     pid:169   ppid:2      flags:0x00004000
> [ 1845.047400] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.048518] Call Trace:
> [ 1845.049392]  <TASK>
> [ 1845.050214]  __schedule+0x10ac/0x5e80
> [ 1845.051172]  ? lock_acquire+0x1a9/0x4e0
> [ 1845.052141]  ? __pfx___schedule+0x10/0x10
> [ 1845.053086]  ? lock_release+0x378/0x650
> [ 1845.054017]  ? schedule+0x92/0x220
> [ 1845.054920]  ? mark_held_locks+0x96/0xe0
> [ 1845.055866]  schedule+0x133/0x220
> [ 1845.056761]  bit_wait+0x17/0xe0
> [ 1845.057645]  __wait_on_bit+0x66/0x180
> [ 1845.058573]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.059502]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.060528]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.061603]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.062604]  ? find_held_lock+0x2d/0x110
> [ 1845.063548]  writeback_single_inode+0xf9/0x3f0
> [ 1845.064564]  sync_inode_metadata+0x91/0xd0
> [ 1845.065534]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.066552]  ? lock_release+0x378/0x650
> [ 1845.067504]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.068557]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.069609]  ext4_sync_file+0x469/0xb60
> [ 1845.070563]  iomap_dio_complete+0x5d1/0x860
> [ 1845.071550]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.072543]  iomap_dio_complete_work+0x52/0x80
> [ 1845.073547]  process_one_work+0x898/0x14a0
> [ 1845.074518]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.075468]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.076456]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.077436]  worker_thread+0x100/0x12c0
> [ 1845.078382]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.079354]  kthread+0x2ea/0x3c0
> [ 1845.080230]  ? __pfx_kthread+0x10/0x10
> [ 1845.081163]  ret_from_fork+0x30/0x70
> [ 1845.082075]  ? __pfx_kthread+0x10/0x10
> [ 1845.083014]  ret_from_fork_asm+0x1b/0x30
> [ 1845.083957]  </TASK>
> [ 1845.084756] INFO: task kworker/0:3:221 blocked for more than 123 seconds.
> [ 1845.085927]       Not tainted 6.5.0-rc7 #106
> [ 1845.086911] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.088205] task:kworker/0:3     state:D stack:0     pid:221   ppid:2      flags:0x00004000
> [ 1845.089566] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.090635] Call Trace:
> [ 1845.091503]  <TASK>
> [ 1845.092318]  __schedule+0x10ac/0x5e80
> [ 1845.093282]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1845.094265]  ? __pfx___schedule+0x10/0x10
> [ 1845.095200]  ? lock_release+0x378/0x650
> [ 1845.096132]  ? schedule+0x92/0x220
> [ 1845.097018]  ? mark_held_locks+0x96/0xe0
> [ 1845.097959]  schedule+0x133/0x220
> [ 1845.098863]  bit_wait+0x17/0xe0
> [ 1845.099736]  __wait_on_bit+0x66/0x180
> [ 1845.100649]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.101600]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.102606]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.103673]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.104685]  ? find_held_lock+0x2d/0x110
> [ 1845.105633]  writeback_single_inode+0xf9/0x3f0
> [ 1845.106625]  sync_inode_metadata+0x91/0xd0
> [ 1845.107612]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.108635]  ? lock_release+0x378/0x650
> [ 1845.109591]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.110645]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.111698]  ext4_sync_file+0x469/0xb60
> [ 1845.112657]  iomap_dio_complete+0x5d1/0x860
> [ 1845.113639]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.114625]  iomap_dio_complete_work+0x52/0x80
> [ 1845.115616]  process_one_work+0x898/0x14a0
> [ 1845.116582]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.117575]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.118573]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.119557]  worker_thread+0x100/0x12c0
> [ 1845.120480]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.121453]  kthread+0x2ea/0x3c0
> [ 1845.122339]  ? __pfx_kthread+0x10/0x10
> [ 1845.123277]  ret_from_fork+0x30/0x70
> [ 1845.124192]  ? __pfx_kthread+0x10/0x10
> [ 1845.125131]  ret_from_fork_asm+0x1b/0x30
> [ 1845.126085]  </TASK>
> [ 1845.127043] INFO: task kworker/1:2:230 blocked for more than 123 seconds.
> [ 1845.128574]       Not tainted 6.5.0-rc7 #106
> [ 1845.129789] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.131441] task:kworker/1:2     state:D stack:0     pid:230   ppid:2      flags:0x00004000
> [ 1845.133125] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.134546] Call Trace:
> [ 1845.135547]  <TASK>
> [ 1845.136475]  __schedule+0x10ac/0x5e80
> [ 1845.137599]  ? lock_acquire+0x1a9/0x4e0
> [ 1845.138703]  ? __pfx___schedule+0x10/0x10
> [ 1845.139859]  ? lock_release+0x378/0x650
> [ 1845.140980]  ? schedule+0x92/0x220
> [ 1845.142026]  ? mark_held_locks+0x96/0xe0
> [ 1845.143161]  schedule+0x133/0x220
> [ 1845.144196]  bit_wait+0x17/0xe0
> [ 1845.145233]  __wait_on_bit+0x66/0x180
> [ 1845.146262]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.147380]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.148650]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.149950]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.151181]  ? find_held_lock+0x2d/0x110
> [ 1845.152288]  writeback_single_inode+0xf9/0x3f0
> [ 1845.153474]  sync_inode_metadata+0x91/0xd0
> [ 1845.154608]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.155857]  ? lock_release+0x378/0x650
> [ 1845.156997]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.158309]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.159569]  ext4_sync_file+0x469/0xb60
> [ 1845.160709]  iomap_dio_complete+0x5d1/0x860
> [ 1845.161881]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.163086]  iomap_dio_complete_work+0x52/0x80
> [ 1845.164269]  process_one_work+0x898/0x14a0
> [ 1845.165367]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.166541]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.167706]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.168880]  worker_thread+0x100/0x12c0
> [ 1845.170006]  ? __kthread_parkme+0xc1/0x1f0
> [ 1845.171083]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.172302]  kthread+0x2ea/0x3c0
> [ 1845.173350]  ? __pfx_kthread+0x10/0x10
> [ 1845.174465]  ret_from_fork+0x30/0x70
> [ 1845.175522]  ? __pfx_kthread+0x10/0x10
> [ 1845.176616]  ret_from_fork_asm+0x1b/0x30
> [ 1845.177754]  </TASK>
> [ 1845.178624] INFO: task kworker/2:3:291 blocked for more than 123 seconds.
> [ 1845.180123]       Not tainted 6.5.0-rc7 #106
> [ 1845.181306] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.182914] task:kworker/2:3     state:D stack:0     pid:291   ppid:2      flags:0x00004000
> [ 1845.184626] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.186012] Call Trace:
> [ 1845.187004]  <TASK>
> [ 1845.187939]  __schedule+0x10ac/0x5e80
> [ 1845.189072]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1845.190177]  ? __pfx___schedule+0x10/0x10
> [ 1845.191356]  ? lock_release+0x378/0x650
> [ 1845.192421]  ? schedule+0x92/0x220
> [ 1845.193501]  ? mark_held_locks+0x96/0xe0
> [ 1845.194535]  schedule+0x133/0x220
> [ 1845.195595]  bit_wait+0x17/0xe0
> [ 1845.196603]  __wait_on_bit+0x66/0x180
> [ 1845.197697]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.198820]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.200061]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.201315]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.202522]  ? find_held_lock+0x2d/0x110
> [ 1845.203679]  writeback_single_inode+0xf9/0x3f0
> [ 1845.204885]  sync_inode_metadata+0x91/0xd0
> [ 1845.205943]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.207190]  ? lock_release+0x378/0x650
> [ 1845.208325]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.209581]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.210883]  ext4_sync_file+0x469/0xb60
> [ 1845.212022]  iomap_dio_complete+0x5d1/0x860
> [ 1845.213177]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.214315]  iomap_dio_complete_work+0x52/0x80
> [ 1845.215547]  process_one_work+0x898/0x14a0
> [ 1845.216714]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.217887]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.219026]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.220280]  worker_thread+0x100/0x12c0
> [ 1845.221386]  ? __kthread_parkme+0xc1/0x1f0
> [ 1845.222569]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.223743]  kthread+0x2ea/0x3c0
> [ 1845.224788]  ? __pfx_kthread+0x10/0x10
> [ 1845.225908]  ret_from_fork+0x30/0x70
> [ 1845.226996]  ? __pfx_kthread+0x10/0x10
> [ 1845.228110]  ret_from_fork_asm+0x1b/0x30
> [ 1845.229254]  </TASK>
> [ 1845.230191] INFO: task kworker/1:3:322 blocked for more than 123 seconds.
> [ 1845.231562]       Not tainted 6.5.0-rc7 #106
> [ 1845.232622] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.233992] task:kworker/1:3     state:D stack:0     pid:322   ppid:2      flags:0x00004000
> [ 1845.235439] Workqueue: dio/dm-1 iomap_dio_complete_work
> [ 1845.236681] Call Trace:
> [ 1845.237629]  <TASK>
> [ 1845.238526]  __schedule+0x10ac/0x5e80
> [ 1845.239559]  ? do_raw_spin_unlock+0x54/0x1f0
> [ 1845.240622]  ? __pfx___schedule+0x10/0x10
> [ 1845.241639]  ? lock_release+0x378/0x650
> [ 1845.242650]  ? schedule+0x92/0x220
> [ 1845.243654]  ? mark_held_locks+0x96/0xe0
> [ 1845.244707]  schedule+0x133/0x220
> [ 1845.245657]  bit_wait+0x17/0xe0
> [ 1845.246631]  __wait_on_bit+0x66/0x180
> [ 1845.247601]  ? __pfx_bit_wait+0x10/0x10
> [ 1845.248630]  __inode_wait_for_writeback+0x12b/0x1b0
> [ 1845.249743]  ? __pfx___inode_wait_for_writeback+0x10/0x10
> [ 1845.250948]  ? __pfx_wake_bit_function+0x10/0x10
> [ 1845.252021]  ? find_held_lock+0x2d/0x110
> [ 1845.253043]  writeback_single_inode+0xf9/0x3f0
> [ 1845.254123]  sync_inode_metadata+0x91/0xd0
> [ 1845.255205]  ? __pfx_sync_inode_metadata+0x10/0x10
> [ 1845.256294]  ? lock_release+0x378/0x650
> [ 1845.257332]  ? file_check_and_advance_wb_err+0xb5/0x230
> [ 1845.258542]  generic_buffers_fsync_noflush+0x1bf/0x270
> [ 1845.259701]  ext4_sync_file+0x469/0xb60
> [ 1845.260765]  iomap_dio_complete+0x5d1/0x860
> [ 1845.261790]  ? __pfx_aio_complete_rw+0x10/0x10
> [ 1845.262907]  iomap_dio_complete_work+0x52/0x80
> [ 1845.263961]  process_one_work+0x898/0x14a0
> [ 1845.265025]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.266074]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.267197]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.268305]  worker_thread+0x100/0x12c0
> [ 1845.269328]  ? __kthread_parkme+0xc1/0x1f0
> [ 1845.270368]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.271457]  kthread+0x2ea/0x3c0
> [ 1845.272422]  ? __pfx_kthread+0x10/0x10
> [ 1845.273443]  ret_from_fork+0x30/0x70
> [ 1845.274438]  ? __pfx_kthread+0x10/0x10
> [ 1845.275475]  ret_from_fork_asm+0x1b/0x30
> [ 1845.276555]  </TASK>
> [ 1845.277433] INFO: task kworker/u8:7:2757 blocked for more than 123 seconds.
> [ 1845.278808]       Not tainted 6.5.0-rc7 #106
> [ 1845.279897] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
> [ 1845.281313] task:kworker/u8:7    state:D stack:0     pid:2757  ppid:2      flags:0x00004000
> [ 1845.282753] Workqueue: writeback wb_workfn (flush-253:1)
> [ 1845.283993] Call Trace:
> [ 1845.284945]  <TASK>
> [ 1845.285853]  __schedule+0x10ac/0x5e80
> [ 1845.286872]  ? lock_acquire+0x1b9/0x4e0
> [ 1845.287917]  ? __pfx___schedule+0x10/0x10
> [ 1845.288934]  ? __blk_flush_plug+0x27a/0x450
> [ 1845.289979]  ? inode_sleep_on_writeback+0xf4/0x160
> [ 1845.291131]  schedule+0x133/0x220
> [ 1845.292052]  inode_sleep_on_writeback+0x14e/0x160
> [ 1845.293130]  ? __pfx_inode_sleep_on_writeback+0x10/0x10
> [ 1845.294289]  ? __pfx_lock_release+0x10/0x10
> [ 1845.295362]  ? __pfx_autoremove_wake_function+0x10/0x10
> [ 1845.296574]  ? __pfx___writeback_inodes_wb+0x10/0x10
> [ 1845.297750]  wb_writeback+0x330/0x7a0
> [ 1845.298800]  ? __pfx_wb_writeback+0x10/0x10
> [ 1845.299876]  ? get_nr_dirty_inodes+0xc7/0x170
> [ 1845.300988]  wb_workfn+0x7a1/0xcc0
> [ 1845.302019]  ? __pfx_wb_workfn+0x10/0x10
> [ 1845.303071]  ? lock_acquire+0x1b9/0x4e0
> [ 1845.304127]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.305232]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.306341]  process_one_work+0x898/0x14a0
> [ 1845.307377]  ? __pfx_lock_acquire+0x10/0x10
> [ 1845.308410]  ? __pfx_process_one_work+0x10/0x10
> [ 1845.309551]  ? __pfx_do_raw_spin_lock+0x10/0x10
> [ 1845.310678]  worker_thread+0x100/0x12c0
> [ 1845.311702]  ? __kthread_parkme+0xc1/0x1f0
> [ 1845.312778]  ? __pfx_worker_thread+0x10/0x10
> [ 1845.313864]  kthread+0x2ea/0x3c0
> [ 1845.314848]  ? __pfx_kthread+0x10/0x10
> [ 1845.315885]  ret_from_fork+0x30/0x70
> [ 1845.316879]  ? __pfx_kthread+0x10/0x10
> [ 1845.317885]  ret_from_fork_asm+0x1b/0x30
> [ 1845.318896]  </TASK>
> [ 1845.319767] Future hung task reports are suppressed, see sysctl kernel.hung_task_warnings
> [ 1845.321587] 
>                Showing all locks held in the system:
> [ 1845.323498] 2 locks held by kworker/0:1/9:
> [ 1845.324569]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.326209]  #1: ffff888100877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.327999] 1 lock held by rcu_tasks_kthre/13:
> [ 1845.329153]  #0: ffffffffa8c7b010 (rcu_tasks.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
> [ 1845.330838] 1 lock held by rcu_tasks_rude_/14:
> [ 1845.332043]  #0: ffffffffa8c7ad70 (rcu_tasks_rude.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
> [ 1845.333713] 1 lock held by rcu_tasks_trace/15:
> [ 1845.334939]  #0: ffffffffa8c7aa70 (rcu_tasks_trace.tasks_gp_mutex){+.+.}-{3:3}, at: rcu_tasks_one_gp+0x31/0xde0
> [ 1845.336716] 2 locks held by kworker/1:0/25:
> [ 1845.337890]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.339639]  #1: ffff888100977d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.341440] 1 lock held by khungtaskd/43:
> [ 1845.342669]  #0: ffffffffa8c7bbe0 (rcu_read_lock){....}-{1:2}, at: debug_show_all_locks+0x51/0x340
> [ 1845.344347] 2 locks held by kworker/1:1/49:
> [ 1845.345577]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.347382]  #1: ffff88810164fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.349278] 2 locks held by kworker/0:2/74:
> [ 1845.350547]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.352400]  #1: ffff88811c8ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.354301] 2 locks held by kworker/3:2/169:
> [ 1845.355618]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.357472]  #1: ffff88811f0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.359445] 2 locks held by kworker/0:3/221:
> [ 1845.360862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.362800]  #1: ffff888126567d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.364804] 2 locks held by kworker/1:2/230:
> [ 1845.366259]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.368270]  #1: ffff8881285f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.370338] 2 locks held by kworker/2:3/291:
> [ 1845.371807]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.373789]  #1: ffff88812a1f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.375949] 2 locks held by kworker/1:3/322:
> [ 1845.377464]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.379533]  #1: ffff888105a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.381731] 1 lock held by in:imjournal/663:
> [ 1845.383335] 2 locks held by kworker/u8:7/2757:
> [ 1845.384953]  #0: ffff888101191938 ((wq_completion)writeback){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.387067]  #1: ffff88813542fd98 ((work_completion)(&(&wb->dwork)->work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.389320] 2 locks held by kworker/3:4/2759:
> [ 1845.390985]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.393164]  #1: ffff888122ddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.395410] 2 locks held by kworker/0:4/2760:
> [ 1845.397073]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.399329]  #1: ffff888107dbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.401670] 2 locks held by kworker/1:5/2762:
> [ 1845.403414]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.405626]  #1: ffff888105fbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.407962] 2 locks held by kworker/1:6/2764:
> [ 1845.409693]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.411996]  #1: ffff888134647d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.414335] 2 locks held by kworker/3:5/2765:
> [ 1845.416107]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.418376]  #1: ffff888128effd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.420758] 2 locks held by kworker/1:7/2767:
> [ 1845.422532]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.424711]  #1: ffff88810fcefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.427082] 2 locks held by kworker/1:8/2768:
> [ 1845.428790]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.431080]  #1: ffff88812a42fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.433495] 2 locks held by kworker/1:9/2770:
> [ 1845.435192]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.437507]  #1: ffff888135477d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.439982] 2 locks held by kworker/3:6/2771:
> [ 1845.441737]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.444015]  #1: ffff888127c6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.446448] 2 locks held by kworker/3:10/2776:
> [ 1845.448255]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.450561]  #1: ffff888129fafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.452971] 2 locks held by kworker/3:11/2777:
> [ 1845.454703]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.457029]  #1: ffff8881056b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.459377] 2 locks held by kworker/2:8/2779:
> [ 1845.461157]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.463483]  #1: ffff88812e997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.465906] 2 locks held by kworker/3:13/2780:
> [ 1845.467678]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.469988]  #1: ffff888128d57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.472395] 2 locks held by kworker/3:14/2781:
> [ 1845.474175]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.476468]  #1: ffff88812c9bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.478896] 2 locks held by kworker/3:15/2782:
> [ 1845.480638]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.482919]  #1: ffff888104f27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.485299] 2 locks held by kworker/3:17/2784:
> [ 1845.487097]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.489383]  #1: ffff88812224fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.491737] 2 locks held by kworker/3:18/2785:
> [ 1845.493480]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.495790]  #1: ffff8881361afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.498159] 2 locks held by kworker/3:19/2786:
> [ 1845.499941]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.502266]  #1: ffff888127e67d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.504618] 2 locks held by kworker/3:22/2790:
> [ 1845.506418]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.508708]  #1: ffff888130d4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.511121] 2 locks held by kworker/2:10/2791:
> [ 1845.512938]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.515179]  #1: ffff888113127d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.517588] 2 locks held by kworker/3:23/2793:
> [ 1845.519372]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.521683]  #1: ffff88812a89fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.524075] 2 locks held by kworker/3:24/2794:
> [ 1845.525876]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.528115]  #1: ffff888129a1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.530515] 2 locks held by kworker/3:25/2795:
> [ 1845.532283]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.534610]  #1: ffff88812ebb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.537020] 2 locks held by kworker/3:26/2796:
> [ 1845.538809]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.541117]  #1: ffff888119577d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.543506] 2 locks held by kworker/1:11/2797:
> [ 1845.545286]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.547624]  #1: ffff88813716fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.550018] 2 locks held by kworker/3:27/2798:
> [ 1845.551827]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.554139]  #1: ffff888136747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.556535] 2 locks held by kworker/1:13/2800:
> [ 1845.558325]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.560657]  #1: ffff888131687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.563055] 2 locks held by kworker/1:15/2802:
> [ 1845.564867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.567176]  #1: ffff8881342d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.569574] 2 locks held by kworker/1:17/2804:
> [ 1845.571352]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.573643]  #1: ffff888132137d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.576005] 2 locks held by kworker/1:18/2805:
> [ 1845.577768]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.580107]  #1: ffff888134a5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.582512] 2 locks held by kworker/1:19/2806:
> [ 1845.584307]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.586598]  #1: ffff888135b87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.588971] 2 locks held by kworker/1:20/2807:
> [ 1845.590771]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.593039]  #1: ffff88810513fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.595437] 2 locks held by kworker/1:22/2809:
> [ 1845.597257]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.599584]  #1: ffff8881397bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.601975] 2 locks held by kworker/1:23/2810:
> [ 1845.603756]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.606073]  #1: ffff888139807d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.608442] 2 locks held by kworker/3:30/2814:
> [ 1845.610262]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.612547]  #1: ffff888101a27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.614937] 2 locks held by kworker/2:13/2815:
> [ 1845.616711]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.618912]  #1: ffff888120087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.621317] 2 locks held by kworker/2:15/2817:
> [ 1845.623090]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.625381]  #1: ffff88812258fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.627743] 2 locks held by kworker/2:16/2818:
> [ 1845.629551]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.631844]  #1: ffff888133d47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.634251] 2 locks held by kworker/2:19/2821:
> [ 1845.636011]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.638324]  #1: ffff88812ea37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.640711] 2 locks held by kworker/2:20/2822:
> [ 1845.642514]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.644824]  #1: ffff88813abd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.647217] 2 locks held by kworker/2:21/2823:
> [ 1845.649025]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.651351]  #1: ffff88813454fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.653690] 2 locks held by kworker/2:22/2824:
> [ 1845.655501]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.657763]  #1: ffff888132e5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.660177] 2 locks held by kworker/3:31/2825:
> [ 1845.661943]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.664289]  #1: ffff888138177d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.666651] 2 locks held by kworker/3:32/2826:
> [ 1845.668418]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.670748]  #1: ffff88812a26fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.673018] 2 locks held by kworker/3:38/2832:
> [ 1845.674821]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.677132]  #1: ffff8881319b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.679533] 2 locks held by kworker/2:24/2834:
> [ 1845.681338]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.683668]  #1: ffff8881185efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.686081] 2 locks held by kworker/2:25/2835:
> [ 1845.687877]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.690160]  #1: ffff8881299a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.692548] 2 locks held by kworker/2:27/2837:
> [ 1845.694316]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.696589]  #1: ffff888105ae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.698995] 2 locks held by kworker/2:28/2838:
> [ 1845.700799]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.703139]  #1: ffff888133fd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.705549] 2 locks held by kworker/2:30/2840:
> [ 1845.707341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.709638]  #1: ffff888127627d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.712057] 2 locks held by kworker/2:31/2841:
> [ 1845.713853]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.716160]  #1: ffff88810a8d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.718564] 2 locks held by kworker/2:34/2845:
> [ 1845.720341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.722653]  #1: ffff888134107d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.725061] 2 locks held by kworker/3:40/2847:
> [ 1845.726873]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.729184]  #1: ffff88812f5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.731588] 2 locks held by kworker/2:36/2848:
> [ 1845.733384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.735681]  #1: ffff8881184efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.738077] 2 locks held by kworker/2:37/2851:
> [ 1845.739855]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.742191]  #1: ffff88813b89fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.744532] 2 locks held by kworker/1:24/2852:
> [ 1845.746338]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.748635]  #1: ffff8881275c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.751036] 2 locks held by kworker/1:26/2854:
> [ 1845.752810]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.755139]  #1: ffff88812238fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.757498] 2 locks held by kworker/1:28/2856:
> [ 1845.759286]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.761628]  #1: ffff888122f2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.763996] 2 locks held by kworker/1:29/2857:
> [ 1845.765766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.768067]  #1: ffff88812215fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.770425] 2 locks held by kworker/1:30/2858:
> [ 1845.772237]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.774564]  #1: ffff888137177d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.776959] 2 locks held by kworker/1:32/2860:
> [ 1845.778767]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.781058]  #1: ffff88812a6bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.783435] 2 locks held by kworker/1:34/2862:
> [ 1845.785261]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.787605]  #1: ffff888119487d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.790019] 2 locks held by kworker/1:35/2863:
> [ 1845.791759]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.794093]  #1: ffff888135497d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.796540] 2 locks held by kworker/1:37/2865:
> [ 1845.798278]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.800636]  #1: ffff8881053b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.803035] 2 locks held by kworker/2:38/2866:
> [ 1845.804808]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.807150]  #1: ffff88810533fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.809571] 2 locks held by kworker/2:39/2867:
> [ 1845.811371]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.813698]  #1: ffff888119d57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.816104] 2 locks held by kworker/2:41/2869:
> [ 1845.817858]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.820217]  #1: ffff888119d7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.822579] 2 locks held by kworker/2:46/2874:
> [ 1845.824384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.826691]  #1: ffff888106be7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.829051] 2 locks held by kworker/2:49/2878:
> [ 1845.830865]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.833194]  #1: ffff88813af5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.835616] 2 locks held by kworker/2:51/2881:
> [ 1845.837390]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.839737]  #1: ffff888122957d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.842116] 2 locks held by kworker/2:52/2882:
> [ 1845.843933]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.846254]  #1: ffff888123fe7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.848710] 2 locks held by kworker/2:53/2883:
> [ 1845.850464]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.852749]  #1: ffff88812282fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.855191] 2 locks held by kworker/2:54/2884:
> [ 1845.856982]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.859288]  #1: ffff88813baffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.861684] 2 locks held by kworker/2:55/2885:
> [ 1845.863494]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.865779]  #1: ffff888111c97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.868184] 2 locks held by kworker/2:56/2886:
> [ 1845.869955]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.872223]  #1: ffff888111c8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.874666] 2 locks held by kworker/1:40/2888:
> [ 1845.876443]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.878794]  #1: ffff88811b197d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.881130] 2 locks held by kworker/0:5/2889:
> [ 1845.882854]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.885148]  #1: ffff888118247d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.887535] 2 locks held by kworker/2:58/2890:
> [ 1845.889341]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.891495]  #1: ffff88810cf57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.893905] 2 locks held by kworker/1:41/2897:
> [ 1845.895655]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.897934]  #1: ffff888137987d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.900296] 2 locks held by kworker/2:61/2898:
> [ 1845.902071]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.904422]  #1: ffff88811008fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.906816] 2 locks held by kworker/0:7/2899:
> [ 1845.908574]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.910857]  #1: ffff88810530fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.913250] 2 locks held by kworker/2:62/2900:
> [ 1845.915027]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.917326]  #1: ffff88812eccfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.919696] 2 locks held by kworker/0:8/2901:
> [ 1845.921496]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.923773]  #1: ffff888139277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.926133] 2 locks held by kworker/0:9/2903:
> [ 1845.927908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.930231]  #1: ffff888105f27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.932617] 2 locks held by kworker/1:43/2905:
> [ 1845.934393]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.936659]  #1: ffff88810629fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.939044] 2 locks held by kworker/1:44/2907:
> [ 1845.940855]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.943143]  #1: ffff88811d127d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.945543] 2 locks held by kworker/0:10/2908:
> [ 1845.947309]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.949590]  #1: ffff8881361b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.952001] 2 locks held by kworker/1:45/2909:
> [ 1845.953773]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.956004]  #1: ffff888121147d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.958426] 2 locks held by kworker/2:65/2910:
> [ 1845.960240]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.962547]  #1: ffff88810c597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.964935] 2 locks held by kworker/1:46/2911:
> [ 1845.966701]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.968990]  #1: ffff88812b2ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.971313] 2 locks held by kworker/1:47/2913:
> [ 1845.973100]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.975451]  #1: ffff88813f79fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.977880] 2 locks held by kworker/0:11/2916:
> [ 1845.979682]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.981949]  #1: ffff88811d7e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.984317] 2 locks held by kworker/2:68/2917:
> [ 1845.986087]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.988369]  #1: ffff88812c017d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.990715] 2 locks held by kworker/1:50/2920:
> [ 1845.992496]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1845.994769]  #1: ffff888123fc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1845.997095] 2 locks held by kworker/0:12/2921:
> [ 1845.998885]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.001218]  #1: ffff8881202f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.003603] 2 locks held by kworker/1:51/2923:
> [ 1846.005405]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.007715]  #1: ffff8881114ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.010124] 2 locks held by kworker/2:71/2924:
> [ 1846.011907]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.014223]  #1: ffff88812ef5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.016615] 2 locks held by kworker/2:73/2928:
> [ 1846.018367]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.020712]  #1: ffff888117667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.023000] 2 locks held by kworker/2:74/2931:
> [ 1846.024774]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.027108]  #1: ffff88811322fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.029466] 2 locks held by kworker/0:14/2932:
> [ 1846.031284]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.033576]  #1: ffff88810fd5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.035945] 2 locks held by kworker/2:75/2933:
> [ 1846.037730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.040007]  #1: ffff8881367a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.042335] 2 locks held by kworker/0:16/2935:
> [ 1846.044121]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.046392]  #1: ffff88810c55fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.048757] 2 locks held by kworker/0:17/2937:
> [ 1846.050524]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.052871]  #1: ffff8881368a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.055241] 2 locks held by kworker/2:77/2938:
> [ 1846.056990]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.059306]  #1: ffff888122217d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.061588] 2 locks held by kworker/2:78/2940:
> [ 1846.063332]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.065636]  #1: ffff8881212a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.068005] 2 locks held by kworker/1:56/2941:
> [ 1846.069793]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.072091]  #1: ffff8881192efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.074460] 2 locks held by kworker/2:79/2942:
> [ 1846.076276]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.078593]  #1: ffff88811b187d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.080997] 2 locks held by kworker/1:57/2943:
> [ 1846.082766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.085099]  #1: ffff888139457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.087514] 2 locks held by kworker/2:80/2944:
> [ 1846.089313]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.091623]  #1: ffff888134697d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.094002] 2 locks held by kworker/1:59/2948:
> [ 1846.095792]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.098122]  #1: ffff888107d27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.100558] 2 locks held by kworker/2:82/2949:
> [ 1846.102361]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.104650]  #1: ffff88812810fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.107035] 2 locks held by kworker/0:19/2950:
> [ 1846.108804]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.111121]  #1: ffff8881313f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.113499] 2 locks held by kworker/1:60/2951:
> [ 1846.115278]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.117586]  #1: ffff88810d01fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.120000] 2 locks held by kworker/2:84/2954:
> [ 1846.121772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.124105]  #1: ffff88812618fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.126532] 2 locks held by kworker/0:21/2955:
> [ 1846.128332]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.130576]  #1: ffff888107c6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.132910] 2 locks held by kworker/0:24/2960:
> [ 1846.134696]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.136967]  #1: ffff888100cafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.139353] 2 locks held by kworker/0:25/2962:
> [ 1846.141106]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.143454]  #1: ffff888111267d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.145841] 2 locks held by kworker/2:88/2963:
> [ 1846.147625]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.149903]  #1: ffff888134d0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.152280] 2 locks held by kworker/3:46/2964:
> [ 1846.154068]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.156371]  #1: ffff88810f7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.158751] 2 locks held by kworker/3:47/2967:
> [ 1846.160398]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.162653]  #1: ffff88813c7b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.165045] 2 locks held by kworker/0:28/2968:
> [ 1846.166830]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.169156]  #1: ffff88812dc77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.171558] 2 locks held by kworker/0:29/2970:
> [ 1846.173363]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.175655]  #1: ffff88812892fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.178081] 2 locks held by kworker/0:30/2971:
> [ 1846.179861]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.182198]  #1: ffff88812dfd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.184562] 2 locks held by kworker/0:31/2973:
> [ 1846.186364]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.188663]  #1: ffff8881304ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.191053] 2 locks held by kworker/3:50/2974:
> [ 1846.192850]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.195153]  #1: ffff88811fa6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.197534] 2 locks held by kworker/3:51/2975:
> [ 1846.199290]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.201640]  #1: ffff888130c0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.204059] 2 locks held by kworker/2:90/2978:
> [ 1846.205833]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.208134]  #1: ffff888138457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.210548] 2 locks held by kworker/2:94/2983:
> [ 1846.212355]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.214684]  #1: ffff88813c5b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.217078] 2 locks held by kworker/0:33/2984:
> [ 1846.218870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.221180]  #1: ffff888118337d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.223616] 2 locks held by kworker/0:34/2987:
> [ 1846.225402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.227712]  #1: ffff88812b827d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.230049] 2 locks held by kworker/0:35/2988:
> [ 1846.231865]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.234180]  #1: ffff88811761fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.236603] 2 locks held by kworker/0:36/2990:
> [ 1846.238405]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.240743]  #1: ffff88813a327d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.243154] 2 locks held by kworker/3:54/2991:
> [ 1846.244944]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.247254]  #1: ffff88813a32fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.249661] 2 locks held by kworker/2:96/2992:
> [ 1846.251415]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.253744]  #1: ffff88813a5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.256072] 2 locks held by kworker/1:62/2993:
> [ 1846.257867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.260131]  #1: ffff88810f7c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.262502] 2 locks held by kworker/2:98/2996:
> [ 1846.264306]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.266598]  #1: ffff88813544fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.268989] 2 locks held by kworker/1:64/2997:
> [ 1846.270789]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.273098]  #1: ffff88810f497d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.275485] 2 locks held by kworker/2:102/3001:
> [ 1846.277249]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.279558]  #1: ffff888107d37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.281985] 2 locks held by kworker/0:38/3004:
> [ 1846.283756]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.286069]  #1: ffff88812db1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.288455] 2 locks held by kworker/0:39/3006:
> [ 1846.290218]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.292529]  #1: ffff88812b847d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.294939] 2 locks held by kworker/2:105/3007:
> [ 1846.296685]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.299010]  #1: ffff888135e37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.301429] 2 locks held by kworker/0:40/3008:
> [ 1846.303243]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.305566]  #1: ffff888112cffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.307892] 2 locks held by kworker/2:107/3011:
> [ 1846.309698]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.312023]  #1: ffff88812e577d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.314422] 2 locks held by kworker/2:108/3013:
> [ 1846.316249]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.318523]  #1: ffff88812183fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.320928] 2 locks held by kworker/0:43/3014:
> [ 1846.322729]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.325007]  #1: ffff88813b8f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.327362] 2 locks held by kworker/1:65/3015:
> [ 1846.329133]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.331460]  #1: ffff8881230efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.333826] 2 locks held by kworker/0:44/3016:
> [ 1846.335617]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.337933]  #1: ffff888134f77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.340294] 2 locks held by kworker/2:110/3019:
> [ 1846.342063]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.344356]  #1: ffff888123877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.346708] 2 locks held by kworker/2:111/3021:
> [ 1846.348463]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.350711]  #1: ffff88811b93fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.353066] 2 locks held by kworker/0:48/3024:
> [ 1846.354871]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.357159]  #1: ffff88812500fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.359565] 2 locks held by kworker/0:49/3026:
> [ 1846.361326]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.363655]  #1: ffff8881184a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.366021] 2 locks held by kworker/0:50/3027:
> [ 1846.367802]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.370043]  #1: ffff8881184afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.372410] 2 locks held by kworker/1:66/3028:
> [ 1846.374214]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.376522]  #1: ffff88813478fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.378915] 2 locks held by kworker/1:67/3029:
> [ 1846.380682]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.383009]  #1: ffff8881216e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.385428] 2 locks held by kworker/1:72/3034:
> [ 1846.387211]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.389542]  #1: ffff88812bdd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.391894] 2 locks held by kworker/1:73/3035:
> [ 1846.393613]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.395947]  #1: ffff88812bddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.398383] 2 locks held by kworker/1:74/3036:
> [ 1846.400150]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.402449]  #1: ffff88811c49fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.404851] 2 locks held by kworker/1:75/3037:
> [ 1846.406632]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.408913]  #1: ffff888111587d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.411257] 2 locks held by kworker/1:77/3039:
> [ 1846.413046]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.415323]  #1: ffff88811157fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.417715] 2 locks held by kworker/1:79/3042:
> [ 1846.419479]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.421757]  #1: ffff888126f77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.424093] 2 locks held by kworker/1:80/3043:
> [ 1846.425872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.428177]  #1: ffff888126f7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.430604] 2 locks held by kworker/1:82/3046:
> [ 1846.432382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.434722]  #1: ffff88811b027d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.437127] 2 locks held by kworker/2:116/3052:
> [ 1846.438947]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.441264]  #1: ffff888138e4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.443697] 2 locks held by kworker/2:118/3054:
> [ 1846.445508]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.447719]  #1: ffff88813ecc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.450101] 2 locks held by kworker/2:120/3056:
> [ 1846.451878]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.454210]  #1: ffff88813ecdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.456602] 2 locks held by kworker/2:122/3058:
> [ 1846.458392]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.460678]  #1: ffff88811c597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.463034] 2 locks held by kworker/2:123/3059:
> [ 1846.464820]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.467113]  #1: ffff88811c59fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.469512] 2 locks held by kworker/2:125/3061:
> [ 1846.471288]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.473547]  #1: ffff88811c47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.475876] 2 locks held by kworker/2:127/3063:
> [ 1846.477645]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.479943]  #1: ffff88812fbf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.482357] 2 locks held by kworker/2:128/3064:
> [ 1846.484135]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.486426]  #1: ffff88810f5a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.488860] 2 locks held by kworker/2:131/3067:
> [ 1846.490666]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.492903]  #1: ffff88811f307d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.495335] 2 locks held by kworker/2:133/3069:
> [ 1846.497155]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.499482]  #1: ffff888130447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.501832] 2 locks held by kworker/2:134/3070:
> [ 1846.503601]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.505908]  #1: ffff888130457d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.508286] 2 locks held by kworker/2:141/3077:
> [ 1846.510081]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.512419]  #1: ffff88813d78fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.514763] 2 locks held by kworker/0:55/3078:
> [ 1846.516571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.518869]  #1: ffff88813d79fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.521270] 2 locks held by kworker/0:56/3080:
> [ 1846.523060]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.525405]  #1: ffff8881252f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.527817] 2 locks held by kworker/0:58/3082:
> [ 1846.529590]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.531794]  #1: ffff888110d6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.534196] 2 locks held by kworker/0:59/3083:
> [ 1846.535999]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.538293]  #1: ffff888110d77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.540713] 2 locks held by kworker/0:60/3084:
> [ 1846.542437]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.544711]  #1: ffff888119c07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.547111] 2 locks held by kworker/0:62/3086:
> [ 1846.548917]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.551264]  #1: ffff88811464fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.553629] 2 locks held by kworker/0:64/3088:
> [ 1846.555433]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.557729]  #1: ffff88813ee47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.560105] 2 locks held by kworker/0:65/3089:
> [ 1846.561924]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.564227]  #1: ffff88813ee4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.566623] 2 locks held by kworker/0:66/3090:
> [ 1846.568414]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.570750]  #1: ffff88813ee5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.573183] 2 locks held by kworker/0:68/3092:
> [ 1846.574932]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.577277]  #1: ffff8881169b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.579664] 2 locks held by kworker/0:69/3093:
> [ 1846.581445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.583780]  #1: ffff8881169bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.586161] 2 locks held by kworker/0:73/3097:
> [ 1846.587954]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.590274]  #1: ffff88811632fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.592663] 2 locks held by kworker/0:74/3098:
> [ 1846.594470]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.596751]  #1: ffff88811633fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.599107] 2 locks held by kworker/0:76/3100:
> [ 1846.600881]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.603217]  #1: ffff8881169dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.605601] 2 locks held by kworker/0:77/3101:
> [ 1846.607402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.609573]  #1: ffff8881169e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.611971] 2 locks held by kworker/0:78/3102:
> [ 1846.613730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.616042]  #1: ffff8881169f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.618444] 2 locks held by kworker/0:79/3103:
> [ 1846.620254]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.622558]  #1: ffff8881169ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.624952] 2 locks held by kworker/0:80/3104:
> [ 1846.626680]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.628957]  #1: ffff888113257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.631318] 2 locks held by kworker/0:82/3106:
> [ 1846.633132]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.635458]  #1: ffff88811326fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.637825] 2 locks held by kworker/2:143/3107:
> [ 1846.639535]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.641623]  #1: ffff888113277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.643714] 2 locks held by kworker/0:83/3108:
> [ 1846.645345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.647378]  #1: ffff888116747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.649476] 2 locks held by kworker/0:85/3110:
> [ 1846.651108]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.653157]  #1: ffff88811675fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.655265] 2 locks held by kworker/2:145/3115:
> [ 1846.656907]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.658943]  #1: ffff88811681fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.661043] 2 locks held by kworker/0:88/3116:
> [ 1846.662672]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.664700]  #1: ffff88811682fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.666800] 2 locks held by kworker/0:89/3117:
> [ 1846.668428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.670467]  #1: ffff888116837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.672564] 2 locks held by kworker/0:90/3118:
> [ 1846.674191]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.676231]  #1: ffff888116847d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.678334] 2 locks held by kworker/0:91/3119:
> [ 1846.679962]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.682000]  #1: ffff88811684fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.684357] 2 locks held by kworker/0:94/3122:
> [ 1846.686158]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.688463]  #1: ffff88811687fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.690827] 2 locks held by kworker/0:96/3124:
> [ 1846.692624]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.694930]  #1: ffff888116797d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.697350] 2 locks held by kworker/0:97/3125:
> [ 1846.699168]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.701468]  #1: ffff8881167a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.703838] 2 locks held by kworker/3:55/3126:
> [ 1846.705562]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.707872]  #1: ffff8881167efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.710221] 2 locks held by kworker/3:57/3129:
> [ 1846.712016]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.714368]  #1: ffff88810f7efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.716772] 2 locks held by kworker/2:147/3130:
> [ 1846.718550]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.720798]  #1: ffff888130f3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.723151] 2 locks held by kworker/3:58/3131:
> [ 1846.724961]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.727251]  #1: ffff88813387fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.729629] 2 locks held by kworker/3:60/3136:
> [ 1846.731377]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.733722]  #1: ffff88811cac7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.736070] 2 locks held by kworker/2:151/3137:
> [ 1846.737871]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.740181]  #1: ffff888119a2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.742605] 2 locks held by kworker/3:61/3138:
> [ 1846.744409]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.746708]  #1: ffff888132bbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.749059] 2 locks held by kworker/3:62/3141:
> [ 1846.750851]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.753065]  #1: ffff8881378dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.755504] 2 locks held by kworker/2:155/3144:
> [ 1846.757284]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.759575]  #1: ffff888118bffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.761933] 2 locks held by kworker/2:157/3147:
> [ 1846.763742]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.766062]  #1: ffff88812f4c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.768433] 2 locks held by kworker/3:66/3150:
> [ 1846.770245]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.772589]  #1: ffff88812c59fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.774922] 2 locks held by kworker/2:159/3151:
> [ 1846.776705]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.778997]  #1: ffff888128447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.781418] 2 locks held by kworker/3:67/3152:
> [ 1846.783229]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.785552]  #1: ffff8881010c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.787935] 2 locks held by kworker/2:160/3153:
> [ 1846.789731]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.791997]  #1: ffff88811b8dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.794398] 2 locks held by kworker/3:68/3154:
> [ 1846.796217]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.798461]  #1: ffff8881230c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.800827] 2 locks held by kworker/3:69/3156:
> [ 1846.802626]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.804880]  #1: ffff88811a5afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.807294] 2 locks held by kworker/2:162/3157:
> [ 1846.809032]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.811342]  #1: ffff888123e27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.813731] 2 locks held by kworker/3:70/3158:
> [ 1846.815471]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.817704]  #1: ffff888119967d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.820099] 2 locks held by kworker/2:163/3159:
> [ 1846.821912]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.824259]  #1: ffff88812eb17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.826658] 2 locks held by kworker/3:72/3162:
> [ 1846.828445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.830737]  #1: ffff88812b71fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.833091] 2 locks held by kworker/3:73/3164:
> [ 1846.834905]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.837180]  #1: ffff8881236cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.839547] 2 locks held by kworker/2:166/3165:
> [ 1846.841359]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.843683]  #1: ffff888127ce7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.846077] 2 locks held by kworker/2:167/3166:
> [ 1846.847874]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.850188]  #1: ffff888130f5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.852581] 2 locks held by kworker/3:74/3167:
> [ 1846.854381]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.856638]  #1: ffff88812a03fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.858994] 2 locks held by kworker/2:168/3168:
> [ 1846.860803]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.863125]  #1: ffff888118547d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.865502] 2 locks held by kworker/3:76/3170:
> [ 1846.867274]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.869590]  #1: ffff8881290efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.871966] 2 locks held by kworker/2:169/3171:
> [ 1846.873759]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.876067]  #1: ffff888113537d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.878500] 2 locks held by kworker/2:170/3172:
> [ 1846.880241]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.882550]  #1: ffff88812800fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.884923] 2 locks held by kworker/2:171/3174:
> [ 1846.886717]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.888964]  #1: ffff88810b7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.891342] 2 locks held by kworker/3:78/3175:
> [ 1846.893152]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.895444]  #1: ffff88810b7cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.897839] 2 locks held by kworker/2:173/3178:
> [ 1846.899645]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.901930]  #1: ffff88813824fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.904368] 2 locks held by kworker/2:174/3180:
> [ 1846.906166]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.908487]  #1: ffff88811fbffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.910889] 2 locks held by kworker/2:175/3181:
> [ 1846.912677]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.914993]  #1: ffff88810d657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.917381] 2 locks held by kworker/2:176/3183:
> [ 1846.919126]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.921418]  #1: ffff88812cd0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.923802] 2 locks held by kworker/0:99/3184:
> [ 1846.925561]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.927875]  #1: ffff888129a8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.930261] 2 locks held by kworker/0:101/3188:
> [ 1846.932086]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.934427]  #1: ffff888122d0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.936855] 2 locks held by kworker/0:102/3189:
> [ 1846.938637]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.940957]  #1: ffff888135087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.943360] 2 locks held by kworker/2:179/3190:
> [ 1846.945154]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.947422]  #1: ffff88812db5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.949816] 2 locks held by kworker/2:180/3192:
> [ 1846.951610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.953888]  #1: ffff888135c2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.956310] 2 locks held by kworker/2:181/3194:
> [ 1846.958131]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.960429]  #1: ffff88811e607d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.962842] 2 locks held by kworker/0:105/3195:
> [ 1846.964653]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.966944]  #1: ffff88810786fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.969357] 2 locks held by kworker/2:182/3196:
> [ 1846.971031]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.973347]  #1: ffff88810b6dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.975722] 2 locks held by kworker/0:106/3197:
> [ 1846.977477]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.979799]  #1: ffff888133eb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.982180] 2 locks held by kworker/2:183/3198:
> [ 1846.983956]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.986282]  #1: ffff88810fd4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.988625] 2 locks held by kworker/2:184/3200:
> [ 1846.990385]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.992687]  #1: ffff88811d3bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1846.995091] 2 locks held by kworker/0:108/3201:
> [ 1846.996880]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1846.999212]  #1: ffff8881194f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.001610] 2 locks held by kworker/2:185/3202:
> [ 1847.003419]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.005697]  #1: ffff88812201fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.008064] 2 locks held by kworker/0:109/3203:
> [ 1847.009833]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.012147]  #1: ffff88812360fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.014580] 2 locks held by kworker/0:110/3205:
> [ 1847.016323]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.018596]  #1: ffff88812dbffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.020968] 2 locks held by kworker/0:111/3206:
> [ 1847.022748]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.025005]  #1: ffff888121917d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.027404] 2 locks held by kworker/0:113/3208:
> [ 1847.029168]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.031487]  #1: ffff888125257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.033927] 2 locks held by kworker/0:114/3209:
> [ 1847.035695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.037924]  #1: ffff888117cffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.040327] 2 locks held by kworker/0:115/3210:
> [ 1847.042093]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.044392]  #1: ffff88813ee97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.046790] 2 locks held by kworker/3:84/3214:
> [ 1847.048447]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.050752]  #1: ffff88811624fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.053143] 2 locks held by kworker/3:85/3215:
> [ 1847.054922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.057177]  #1: ffff88811625fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.059600] 2 locks held by kworker/3:86/3216:
> [ 1847.061342]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.063655]  #1: ffff888116267d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.066031] 2 locks held by kworker/3:87/3217:
> [ 1847.067846]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.070173]  #1: ffff888116277d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.072605] 2 locks held by kworker/3:88/3218:
> [ 1847.074322]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.076675]  #1: ffff88811627fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.079050] 2 locks held by kworker/3:90/3220:
> [ 1847.080866]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.083139]  #1: ffff88811629fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.085557] 2 locks held by kworker/0:116/3224:
> [ 1847.087325]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.089540]  #1: ffff8881162cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.091896] 2 locks held by kworker/0:117/3225:
> [ 1847.093708]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.095968]  #1: ffff8881162dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.098385] 2 locks held by kworker/0:120/3228:
> [ 1847.100165]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.102497]  #1: ffff8881162ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.104912] 2 locks held by kworker/0:122/3230:
> [ 1847.106730]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.109025]  #1: ffff888116617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.111433] 2 locks held by kworker/0:124/3232:
> [ 1847.113261]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.115545]  #1: ffff888116637d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.117891] 2 locks held by kworker/0:125/3233:
> [ 1847.119688]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.121958]  #1: ffff88811664fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.124367] 2 locks held by kworker/0:126/3234:
> [ 1847.126165]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.128485]  #1: ffff888116657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.130883] 2 locks held by kworker/0:127/3235:
> [ 1847.132680]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.134962]  #1: ffff888116667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.137353] 2 locks held by kworker/0:128/3236:
> [ 1847.139112]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.141422]  #1: ffff88811666fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.143839] 2 locks held by kworker/0:129/3237:
> [ 1847.145625]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.147894]  #1: ffff88811667fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.150250] 2 locks held by kworker/0:130/3238:
> [ 1847.152017]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.154368]  #1: ffff888116687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.156773] 2 locks held by kworker/0:135/3243:
> [ 1847.158555]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.160730]  #1: ffff8881166c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.163142] 2 locks held by kworker/0:136/3244:
> [ 1847.164910]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.167258]  #1: ffff8881166cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.169669] 2 locks held by kworker/3:95/3246:
> [ 1847.171438]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.173713]  #1: ffff8881166efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.176070] 2 locks held by kworker/3:97/3248:
> [ 1847.177888]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.180173]  #1: ffff88813ef07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.182623] 2 locks held by kworker/0:137/3249:
> [ 1847.184437]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.186710]  #1: ffff88813ef1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.189078] 2 locks held by kworker/3:99/3251:
> [ 1847.190888]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.193149]  #1: ffff88813ef37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.195552] 2 locks held by kworker/3:102/3254:
> [ 1847.197351]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.199679]  #1: ffff88813ef57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.202095] 2 locks held by kworker/3:104/3256:
> [ 1847.203830]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.206136]  #1: ffff88813ef6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.208511] 2 locks held by kworker/3:107/3259:
> [ 1847.210327]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.212667]  #1: ffff88813ef9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.215030] 2 locks held by kworker/3:109/3261:
> [ 1847.216850]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.219130]  #1: ffff88813efb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.221522] 2 locks held by kworker/3:110/3262:
> [ 1847.223298]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.225646]  #1: ffff88813efbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.227959] 2 locks held by kworker/3:112/3264:
> [ 1847.229749]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.232022]  #1: ffff88811600fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.234410] 2 locks held by kworker/1:85/3265:
> [ 1847.236204]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.238532]  #1: ffff88811601fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.240856] 2 locks held by kworker/1:86/3266:
> [ 1847.242656]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.244925]  #1: ffff888116027d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.247269] 2 locks held by kworker/1:87/3267:
> [ 1847.249067]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.251384]  #1: ffff88811607fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.253760] 2 locks held by kworker/1:88/3268:
> [ 1847.255546]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.257786]  #1: ffff888116087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.260211] 2 locks held by kworker/1:89/3269:
> [ 1847.262017]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.264285]  #1: ffff888116097d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.266675] 2 locks held by kworker/0:138/3270:
> [ 1847.268432]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.270712]  #1: ffff8881393cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.273097] 2 locks held by kworker/0:139/3272:
> [ 1847.274906]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.277199]  #1: ffff8881160e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.279629] 2 locks held by kworker/1:91/3273:
> [ 1847.281402]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.283688]  #1: ffff8881160f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.286055] 2 locks held by kworker/1:92/3275:
> [ 1847.287859]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.290142]  #1: ffff88811610fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.292499] 2 locks held by kworker/1:93/3277:
> [ 1847.294276]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.296572]  #1: ffff888116167d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.298959] 2 locks held by kworker/0:143/3280:
> [ 1847.300741]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.303053]  #1: ffff88811618fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.305458] 2 locks held by kworker/0:144/3282:
> [ 1847.307260]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.309562]  #1: ffff8881161a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.311971] 2 locks held by kworker/1:99/3289:
> [ 1847.313761]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.316077]  #1: ffff888116407d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.318447] 2 locks held by kworker/1:100/3291:
> [ 1847.320274]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.322556]  #1: ffff88811641fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.324976] 2 locks held by kworker/0:149/3292:
> [ 1847.326775]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.329064]  #1: ffff88811642fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.331461] 2 locks held by kworker/0:150/3294:
> [ 1847.333272]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.335568]  #1: ffff888116447d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.337986] 2 locks held by kworker/1:102/3295:
> [ 1847.339772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.342014]  #1: ffff88811644fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.344402] 2 locks held by kworker/0:151/3296:
> [ 1847.346193]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.348481]  #1: ffff88811645fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.350902] 2 locks held by kworker/1:103/3297:
> [ 1847.352678]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.354987]  #1: ffff888116467d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.357395] 2 locks held by kworker/0:152/3298:
> [ 1847.359178]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.361441]  #1: ffff8881164afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.363841] 2 locks held by kworker/1:104/3299:
> [ 1847.365642]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.367965]  #1: ffff8881164bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.370355] 2 locks held by kworker/0:154/3301:
> [ 1847.372178]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.374487]  #1: ffff8881164d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.376872] 2 locks held by kworker/0:155/3302:
> [ 1847.378658]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.380974]  #1: ffff8881164e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.383376] 2 locks held by kworker/0:156/3303:
> [ 1847.385153]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.387476]  #1: ffff8881164efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.389924] 2 locks held by kworker/0:157/3304:
> [ 1847.391724]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.394012]  #1: ffff888116507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.396410] 2 locks held by kworker/0:158/3306:
> [ 1847.398175]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.400498]  #1: ffff888124897d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.402864] 2 locks held by kworker/2:188/3307:
> [ 1847.404675]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.406978]  #1: ffff88811f0afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.409328] 2 locks held by kworker/0:159/3310:
> [ 1847.411083]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.413366]  #1: ffff888129117d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.415716] 2 locks held by kworker/0:160/3312:
> [ 1847.417507]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.419764]  #1: ffff888105837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.422101] 2 locks held by kworker/0:161/3314:
> [ 1847.423922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.426202]  #1: ffff88813d44fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.428598] 2 locks held by kworker/0:162/3316:
> [ 1847.430401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.432698]  #1: ffff888121b37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.435047] 2 locks held by kworker/2:194/3317:
> [ 1847.436834]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.439139]  #1: ffff88812ba5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.441506] 2 locks held by kworker/0:163/3318:
> [ 1847.443320]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.445432]  #1: ffff88812923fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.447767] 2 locks held by kworker/2:197/3321:
> [ 1847.449571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.451878]  #1: ffff88811ea3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.454252] 2 locks held by kworker/2:199/3323:
> [ 1847.456041]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.458354]  #1: ffff888113057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.460746] 2 locks held by kworker/2:202/3326:
> [ 1847.462556]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.464785]  #1: ffff8881330b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.467097] 2 locks held by kworker/3:113/3328:
> [ 1847.468870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.471219]  #1: ffff888122eb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.473651] 2 locks held by kworker/1:105/3329:
> [ 1847.475411]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.477759]  #1: ffff888127057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.480120] 2 locks held by kworker/2:204/3331:
> [ 1847.481886]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.484226]  #1: ffff888117757d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.486641] 2 locks held by kworker/2:206/3333:
> [ 1847.488454]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.490716]  #1: ffff88812be7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.493118] 2 locks held by kworker/2:209/3336:
> [ 1847.494924]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.497212]  #1: ffff88811778fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.499623] 2 locks held by kworker/2:210/3337:
> [ 1847.501424]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.503699]  #1: ffff8881304efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.506080] 2 locks held by kworker/2:213/3340:
> [ 1847.507880]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.510224]  #1: ffff88811ffbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.512684] 2 locks held by kworker/2:220/3347:
> [ 1847.514454]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.516795]  #1: ffff8881165c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.519190] 2 locks held by kworker/1:106/3348:
> [ 1847.520945]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.523275]  #1: ffff8881165cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.525698] 2 locks held by kworker/1:108/3350:
> [ 1847.527508]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.529801]  #1: ffff8881165e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.532214] 2 locks held by kworker/1:109/3351:
> [ 1847.534035]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.536369]  #1: ffff8881165f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.538787] 2 locks held by kworker/1:110/3352:
> [ 1847.540550]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.542874]  #1: ffff888116a37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.545280] 2 locks held by kworker/1:111/3353:
> [ 1847.547082]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.549357]  #1: ffff888116a47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.551733] 2 locks held by kworker/1:112/3354:
> [ 1847.553531]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.555790]  #1: ffff888116a4fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.558151] 2 locks held by kworker/1:114/3356:
> [ 1847.559954]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.562196]  #1: ffff888116a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.564639] 2 locks held by kworker/1:115/3357:
> [ 1847.566434]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.568712]  #1: ffff888116a7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.571103] 2 locks held by kworker/1:116/3358:
> [ 1847.572922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.575204]  #1: ffff888116a87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.577562] 2 locks held by kworker/1:117/3359:
> [ 1847.579381]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.581713]  #1: ffff888116a9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.584083] 2 locks held by kworker/1:119/3361:
> [ 1847.585908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.588165]  #1: ffff888116ab7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.590577] 2 locks held by kworker/1:120/3362:
> [ 1847.592382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.594688]  #1: ffff888116abfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.597066] 2 locks held by kworker/1:121/3363:
> [ 1847.598848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.601147]  #1: ffff888116acfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.603549] 2 locks held by kworker/1:123/3365:
> [ 1847.605340]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.607647]  #1: ffff888116ae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.610047] 2 locks held by kworker/1:124/3366:
> [ 1847.611864]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.614188]  #1: ffff888116aefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.616565] 2 locks held by kworker/1:125/3367:
> [ 1847.618340]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.620638]  #1: ffff888116affd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.623046] 2 locks held by kworker/1:126/3368:
> [ 1847.624844]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.627133]  #1: ffff888116b07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.629539] 2 locks held by kworker/1:127/3369:
> [ 1847.631330]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.633633]  #1: ffff888116b17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.636059] 2 locks held by kworker/1:129/3371:
> [ 1847.637882]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.640201]  #1: ffff888116b2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.642586] 2 locks held by kworker/1:130/3372:
> [ 1847.644401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.646695]  #1: ffff888116b3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.649024] 2 locks held by kworker/1:132/3374:
> [ 1847.650768]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.653070]  #1: ffff888116b5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.655470] 2 locks held by kworker/1:134/3376:
> [ 1847.657301]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.659606]  #1: ffff888116b77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.661962] 2 locks held by kworker/1:135/3377:
> [ 1847.663777]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.666019]  #1: ffff888116b87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.668437] 2 locks held by kworker/1:136/3378:
> [ 1847.670250]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.672595]  #1: ffff888116b8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.674955] 2 locks held by kworker/1:137/3379:
> [ 1847.676736]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.678987]  #1: ffff888116b9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.681417] 2 locks held by kworker/1:138/3380:
> [ 1847.683182]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.685500]  #1: ffff888116ba7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.687873] 2 locks held by kworker/1:141/3383:
> [ 1847.689653]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.691934]  #1: ffff888116bcfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.694359] 2 locks held by kworker/1:143/3385:
> [ 1847.696172]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.698478]  #1: ffff888116befd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.700903] 2 locks held by kworker/1:144/3386:
> [ 1847.702685]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.704954]  #1: ffff888116bf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.707320] 2 locks held by kworker/1:146/3388:
> [ 1847.709107]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.711342]  #1: ffff88813e40fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.713773] 2 locks held by kworker/1:147/3389:
> [ 1847.715521]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.717823]  #1: ffff88813e41fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.720135] 2 locks held by kworker/2:226/3395:
> [ 1847.721952]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.724259]  #1: ffff88813e46fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.726683] 2 locks held by kworker/2:230/3399:
> [ 1847.728488]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.730738]  #1: ffff88813e4a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.733151] 2 locks held by kworker/2:235/3404:
> [ 1847.734971]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.737282]  #1: ffff88813e4dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.739700] 2 locks held by kworker/2:237/3406:
> [ 1847.741471]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.743751]  #1: ffff88813e4f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.746141] 2 locks held by kworker/2:238/3407:
> [ 1847.747934]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.750210]  #1: ffff88813e507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.752594] 2 locks held by kworker/2:240/3409:
> [ 1847.754389]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.756657]  #1: ffff88813e51fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.759038] 2 locks held by kworker/0:165/3410:
> [ 1847.760840]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.763088]  #1: ffff88813e52fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.765501] 2 locks held by kworker/0:166/3411:
> [ 1847.767292]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.769539]  #1: ffff88813e587d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.771912] 2 locks held by kworker/0:167/3412:
> [ 1847.773703]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.775930]  #1: ffff88813e58fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.778359] 2 locks held by kworker/0:170/3415:
> [ 1847.780177]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.782469]  #1: ffff88813e5b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.784868] 2 locks held by kworker/0:171/3416:
> [ 1847.786668]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.788962]  #1: ffff88813e5bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.791373] 2 locks held by kworker/0:172/3417:
> [ 1847.793191]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.795535]  #1: ffff88813e5cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.797917] 2 locks held by kworker/0:173/3418:
> [ 1847.799712]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.802009]  #1: ffff88813e5d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.804327] 2 locks held by kworker/0:174/3419:
> [ 1847.806122]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.808399]  #1: ffff88813e5e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.810832] 2 locks held by kworker/0:175/3420:
> [ 1847.812621]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.814863]  #1: ffff88813e5efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.817249] 2 locks held by kworker/0:177/3422:
> [ 1847.819043]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.821388]  #1: ffff88813e607d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.823768] 2 locks held by kworker/0:181/3426:
> [ 1847.825577]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.827834]  #1: ffff88811e057d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.830158] 2 locks held by kworker/0:184/3429:
> [ 1847.831957]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.834282]  #1: ffff88811d1bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.836701] 2 locks held by kworker/2:241/3430:
> [ 1847.838428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.840701]  #1: ffff88813b6efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.843032] 2 locks held by kworker/2:242/3431:
> [ 1847.844838]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.847128]  #1: ffff888138427d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.849525] 2 locks held by kworker/2:245/3434:
> [ 1847.851328]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.853618]  #1: ffff88813e617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.856041] 2 locks held by kworker/2:250/3439:
> [ 1847.857842]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.860122]  #1: ffff88813e657d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.862556] 2 locks held by kworker/2:251/3440:
> [ 1847.864373]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.866672]  #1: ffff88813e667d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.869094] 2 locks held by kworker/2:253/3442:
> [ 1847.870890]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.873154]  #1: ffff88813e67fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.875581] 2 locks held by kworker/3:114/3447:
> [ 1847.877394]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.879740]  #1: ffff88813e6b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.882098] 2 locks held by kworker/3:115/3448:
> [ 1847.883918]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.886171]  #1: ffff88813e6c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.888583] 2 locks held by kworker/3:116/3449:
> [ 1847.890398]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.892705]  #1: ffff88813e747d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.895051] 2 locks held by kworker/3:118/3451:
> [ 1847.896847]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.899132]  #1: ffff88813e767d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.901507] 2 locks held by kworker/3:120/3453:
> [ 1847.903236]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.905527]  #1: ffff88813e77fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.907901] 2 locks held by kworker/3:122/3455:
> [ 1847.909708]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.912026]  #1: ffff88813e797d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.914454] 2 locks held by kworker/3:124/3457:
> [ 1847.916231]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.918510]  #1: ffff88813e7afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.920927] 2 locks held by kworker/3:125/3458:
> [ 1847.922695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.925008]  #1: ffff88813e7bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.927436] 2 locks held by kworker/3:128/3461:
> [ 1847.929255]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.931546]  #1: ffff88813e7dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.933891] 2 locks held by kworker/3:131/3464:
> [ 1847.935689]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.937985]  #1: ffff88813e047d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.940359] 2 locks held by kworker/0:186/3467:
> [ 1847.942142]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.944467]  #1: ffff88813e06fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.946777] 2 locks held by kworker/0:188/3469:
> [ 1847.948559]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.950874]  #1: ffff88813e087d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.953271] 2 locks held by kworker/0:189/3470:
> [ 1847.955060]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.957359]  #1: ffff88813e097d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.959762] 2 locks held by kworker/0:191/3472:
> [ 1847.961559]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.963833]  #1: ffff88813e0afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.966201] 2 locks held by kworker/0:192/3473:
> [ 1847.967990]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.970260]  #1: ffff88813e0b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.972661] 2 locks held by kworker/0:193/3474:
> [ 1847.974439]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.976762]  #1: ffff88813e0c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.979010] 2 locks held by kworker/0:195/3476:
> [ 1847.980748]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.983039]  #1: ffff88813e0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.985423] 2 locks held by kworker/0:197/3478:
> [ 1847.987205]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.989489]  #1: ffff88813e0ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.991878] 2 locks held by kworker/0:198/3479:
> [ 1847.993608]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1847.995881]  #1: ffff88813e13fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1847.998236] 2 locks held by kworker/0:199/3480:
> [ 1848.000010]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.002319]  #1: ffff88813e14fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.004719] 2 locks held by kworker/0:200/3481:
> [ 1848.006525]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.008835]  #1: ffff88813e157d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.011188] 2 locks held by kworker/0:203/3484:
> [ 1848.012969]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.015327]  #1: ffff88813e17fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.017729] 2 locks held by kworker/0:205/3486:
> [ 1848.019539]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.021850]  #1: ffff88813e19fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.024275] 2 locks held by kworker/0:206/3487:
> [ 1848.026073]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.028400]  #1: ffff88813e1a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.030781] 2 locks held by kworker/0:207/3488:
> [ 1848.032591]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.034794]  #1: ffff88813e1b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.037098] 2 locks held by kworker/0:208/3489:
> [ 1848.038870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.041174]  #1: ffff88813e1c7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.043556] 2 locks held by kworker/0:209/3490:
> [ 1848.045370]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.047648]  #1: ffff88813e1d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.050062] 2 locks held by kworker/0:211/3492:
> [ 1848.051827]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.054123]  #1: ffff88813e227d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.056524] 2 locks held by kworker/0:215/3496:
> [ 1848.058345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.060635]  #1: ffff88813e257d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.063063] 2 locks held by kworker/0:219/3500:
> [ 1848.064887]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.067150]  #1: ffff88813e287d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.069530] 2 locks held by kworker/0:220/3501:
> [ 1848.071321]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.073575]  #1: ffff88813e28fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.075995] 2 locks held by kworker/0:221/3502:
> [ 1848.077815]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.080025]  #1: ffff8881348afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.082419] 2 locks held by kworker/0:222/3503:
> [ 1848.084240]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.086510]  #1: ffff88812e54fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.088878] 2 locks held by kworker/0:224/3505:
> [ 1848.090652]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.092903]  #1: ffff888126f0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.095228] 2 locks held by kworker/3:133/3506:
> [ 1848.097027]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.099344]  #1: ffff88813c507d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.101705] 2 locks held by kworker/0:225/3507:
> [ 1848.103476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.105768]  #1: ffff88811f2d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.108190] 2 locks held by kworker/0:228/3510:
> [ 1848.109978]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.112277]  #1: ffff888130bc7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.114622] 2 locks held by kworker/0:229/3511:
> [ 1848.116439]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.118707]  #1: ffff88811cd5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.121118] 2 locks held by kworker/0:231/3513:
> [ 1848.122938]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.125207]  #1: ffff888122837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.127584] 2 locks held by kworker/0:234/3516:
> [ 1848.129347]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.131683]  #1: ffff8881277bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.134053] 2 locks held by kworker/0:235/3517:
> [ 1848.135872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.138204]  #1: ffff88811a1bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.140577] 2 locks held by kworker/0:237/3519:
> [ 1848.142368]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.144709]  #1: ffff8881182f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.147107] 2 locks held by kworker/0:238/3520:
> [ 1848.148899]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.151163]  #1: ffff8881394ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.153550] 2 locks held by kworker/0:239/3521:
> [ 1848.155363]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.157675]  #1: ffff888120a3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.160047] 2 locks held by kworker/0:240/3522:
> [ 1848.161866]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.164186]  #1: ffff88812cf97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.166588] 2 locks held by kworker/0:241/3523:
> [ 1848.168362]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.170636]  #1: ffff888132a37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.172998] 2 locks held by kworker/1:149/3528:
> [ 1848.174801]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.177106]  #1: ffff88813b2b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.179521] 2 locks held by kworker/1:153/3532:
> [ 1848.181318]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.183570]  #1: ffff888115c87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.185967] 2 locks held by kworker/1:154/3533:
> [ 1848.187772]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.190094]  #1: ffff888115c8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.192475] 2 locks held by kworker/1:156/3535:
> [ 1848.194287]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.196585]  #1: ffff888115ca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.198982] 2 locks held by kworker/1:157/3536:
> [ 1848.200771]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.203095]  #1: ffff88813e3bfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.205467] 2 locks held by kworker/1:159/3538:
> [ 1848.207299]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.209643]  #1: ffff88813e3d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.212044] 2 locks held by kworker/1:160/3539:
> [ 1848.213848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.216173]  #1: ffff88813e3e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.218566] 2 locks held by kworker/1:162/3541:
> [ 1848.220380]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.222663]  #1: ffff88813e3ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.225041] 2 locks held by kworker/1:163/3542:
> [ 1848.226859]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.229127]  #1: ffff88813dc0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.231558] 2 locks held by kworker/1:164/3543:
> [ 1848.233346]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.235588]  #1: ffff88813dc17d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.237998] 2 locks held by kworker/1:165/3544:
> [ 1848.239818]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.242102]  #1: ffff88813dc27d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.244507] 2 locks held by kworker/0:245/3546:
> [ 1848.246246]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.248542]  #1: ffff88813dc3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.250948] 2 locks held by kworker/0:248/3549:
> [ 1848.252757]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.254964]  #1: ffff88813dc67d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.257370] 2 locks held by kworker/0:249/3550:
> [ 1848.259162]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.261488]  #1: ffff88813dc77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.263876] 2 locks held by kworker/0:250/3551:
> [ 1848.265634]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.267917]  #1: ffff88813dc7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.270327] 2 locks held by kworker/0:252/3553:
> [ 1848.272112]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.274439]  #1: ffff88813dc97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.276827] 2 locks held by kworker/0:253/3554:
> [ 1848.278595]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.280893]  #1: ffff88813dca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.283292] 2 locks held by kworker/0:255/3556:
> [ 1848.285086]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.287396]  #1: ffff88813dcbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.289755] 2 locks held by kworker/3:134/3558:
> [ 1848.291566]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.293838]  #1: ffff88813dcdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.296224] 2 locks held by kworker/3:135/3559:
> [ 1848.297985]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.300275]  #1: ffff88813dce7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.302673] 2 locks held by kworker/3:136/3560:
> [ 1848.304445]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.306727]  #1: ffff88813dcf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.309148] 2 locks held by kworker/3:137/3561:
> [ 1848.310950]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.313250]  #1: ffff88813dd37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.315666] 2 locks held by kworker/3:141/3565:
> [ 1848.317476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.319782]  #1: ffff88813dd6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.322179] 2 locks held by kworker/3:143/3567:
> [ 1848.323971]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.326288]  #1: ffff88813dd87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.328681] 2 locks held by kworker/3:144/3568:
> [ 1848.330412]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.332695]  #1: ffff88813dd97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.335071] 2 locks held by kworker/3:146/3570:
> [ 1848.336870]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.339177]  #1: ffff88813ddafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.341567] 2 locks held by kworker/3:149/3573:
> [ 1848.343358]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.345637]  #1: ffff88813ddcfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.348027] 2 locks held by kworker/3:150/3574:
> [ 1848.349819]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.352086]  #1: ffff88813dddfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.354507] 2 locks held by kworker/3:151/3575:
> [ 1848.356320]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.358664]  #1: ffff88813ddf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.361065] 2 locks held by kworker/3:152/3576:
> [ 1848.362867]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.365196]  #1: ffff88813de07d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.367604] 2 locks held by kworker/3:153/3577:
> [ 1848.369354]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.371643]  #1: ffff88813de0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.374031] 2 locks held by kworker/0:257/3578:
> [ 1848.375841]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.378138]  #1: ffff88813de1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.380570] 2 locks held by kworker/3:154/3579:
> [ 1848.382385]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.384693]  #1: ffff88813de3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.387099] 2 locks held by kworker/3:156/3581:
> [ 1848.388912]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.391253]  #1: ffff88813dedfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.393640] 2 locks held by kworker/1:167/3585:
> [ 1848.395440]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.397769]  #1: ffff888134f9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.400195] 2 locks held by kworker/1:168/3586:
> [ 1848.402010]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.404334]  #1: ffff8881304a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.406710] 2 locks held by kworker/1:169/3587:
> [ 1848.408518]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.410797]  #1: ffff888128997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.413123] 2 locks held by kworker/1:170/3588:
> [ 1848.414922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.417223]  #1: ffff888128c0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.419652] 2 locks held by kworker/1:173/3591:
> [ 1848.421465]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.423770]  #1: ffff88812479fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.426139] 2 locks held by kworker/3:159/3592:
> [ 1848.427922]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.430183]  #1: ffff88813b37fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.432620] 2 locks held by kworker/3:161/3594:
> [ 1848.434390]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.436736]  #1: ffff88812f527d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.439124] 2 locks held by kworker/1:174/3595:
> [ 1848.440806]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.443037]  #1: ffff88812ddefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.445407] 2 locks held by kworker/1:175/3596:
> [ 1848.447227]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.449537]  #1: ffff88813d93fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.451925] 2 locks held by kworker/1:176/3597:
> [ 1848.453695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.456000]  #1: ffff88813d94fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.458405] 2 locks held by kworker/1:178/3599:
> [ 1848.460161]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.462413]  #1: ffff88813d967d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.464819] 2 locks held by kworker/1:179/3600:
> [ 1848.466623]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.468907]  #1: ffff88813dadfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.471296] 2 locks held by kworker/1:180/3601:
> [ 1848.473040]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.475356]  #1: ffff88813dae7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.477785] 2 locks held by kworker/1:181/3602:
> [ 1848.479595]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.481902]  #1: ffff88813daf7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.484286] 2 locks held by kworker/1:182/3603:
> [ 1848.486088]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.488360]  #1: ffff88813daffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.490762] 2 locks held by kworker/1:184/3605:
> [ 1848.492571]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.494862]  #1: ffff88813db1fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.497218] 2 locks held by kworker/1:185/3606:
> [ 1848.498997]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.501268]  #1: ffff88813db2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.503383] 2 locks held by kworker/1:186/3607:
> [ 1848.505022]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.507066]  #1: ffff88813db37d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.509171] 2 locks held by kworker/1:189/3610:
> [ 1848.510820]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.512859]  #1: ffff88813db5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.514967] 2 locks held by kworker/1:191/3612:
> [ 1848.516610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.518642]  #1: ffff88813db77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.520738] 2 locks held by kworker/1:192/3613:
> [ 1848.522384]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.524420]  #1: ffff88813db7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.526519] 2 locks held by kworker/1:193/3614:
> [ 1848.528154]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.530192]  #1: ffff88813db8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.532301] 2 locks held by kworker/1:194/3615:
> [ 1848.533949]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.535994]  #1: ffff88813db9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.538100] 2 locks held by kworker/1:195/3616:
> [ 1848.539738]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.541777]  #1: ffff88813dbafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.543878] 2 locks held by kworker/1:196/3617:
> [ 1848.545523]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.547550]  #1: ffff88813dbb7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.549645] 2 locks held by kworker/1:198/3619:
> [ 1848.551279]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.553424]  #1: ffff88813dbd7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.555635] 2 locks held by kworker/1:199/3620:
> [ 1848.557345]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.559466]  #1: ffff88813dbe7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.561573] 2 locks held by kworker/1:200/3621:
> [ 1848.563208]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.565245]  #1: ffff88813dbefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.567358] 2 locks held by kworker/1:203/3624:
> [ 1848.568987]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.571029]  #1: ffff888161817d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.573136] 2 locks held by kworker/1:206/3627:
> [ 1848.574789]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.576827]  #1: ffff888161837d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.578934] 2 locks held by kworker/1:209/3630:
> [ 1848.580574]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.582608]  #1: ffff88816185fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.584704] 2 locks held by kworker/1:210/3631:
> [ 1848.586343]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.588374]  #1: ffff88816186fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.590480] 2 locks held by kworker/1:211/3632:
> [ 1848.592125]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.594163]  #1: ffff88816187fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.596268] 2 locks held by kworker/3:162/3633:
> [ 1848.597914]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.599956]  #1: ffff88816189fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.602064] 2 locks held by kworker/3:163/3634:
> [ 1848.603707]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.605740]  #1: ffff888127a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.607845] 2 locks held by kworker/3:164/3635:
> [ 1848.609485]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.611528]  #1: ffff888128f3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.613627] 2 locks held by kworker/3:166/3637:
> [ 1848.615263]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.617310]  #1: ffff88812b83fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.619419] 2 locks held by kworker/3:167/3638:
> [ 1848.621064]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.623106]  #1: ffff88812aa57d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.625217] 2 locks held by kworker/3:168/3639:
> [ 1848.626872]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.628921]  #1: ffff888127d3fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.631031] 2 locks held by kworker/3:170/3641:
> [ 1848.632673]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.634706]  #1: ffff88811ec6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.636811] 2 locks held by kworker/3:171/3642:
> [ 1848.638453]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.640493]  #1: ffff88812f687d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.642594] 2 locks held by kworker/3:172/3643:
> [ 1848.644233]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.646272]  #1: ffff8881380a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.648388] 2 locks held by kworker/1:212/3644:
> [ 1848.650034]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.652083]  #1: ffff888126e6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.654204] 2 locks held by kworker/1:213/3645:
> [ 1848.655862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.657906]  #1: ffff8881276afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.660014] 2 locks held by kworker/1:214/3646:
> [ 1848.661658]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.663690]  #1: ffff8881323dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.665799] 2 locks held by kworker/1:215/3647:
> [ 1848.667443]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.669472]  #1: ffff888129ecfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.671566] 2 locks held by kworker/1:216/3648:
> [ 1848.673206]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.675248]  #1: ffff88810f47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.677353] 2 locks held by kworker/1:218/3650:
> [ 1848.679001]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.681046]  #1: ffff888126487d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.683158] 2 locks held by kworker/1:220/3652:
> [ 1848.684810]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.686858]  #1: ffff88813d47fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.688970] 2 locks held by kworker/1:222/3654:
> [ 1848.690618]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.692656]  #1: ffff8881289d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.694758] 2 locks held by kworker/1:223/3655:
> [ 1848.696401]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.698445]  #1: ffff888126a6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.700559] 2 locks held by kworker/1:224/3656:
> [ 1848.702204]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.704245]  #1: ffff88812338fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.706363] 2 locks held by kworker/1:226/3658:
> [ 1848.708009]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.710057]  #1: ffff888105697d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.712165] 2 locks held by kworker/1:227/3659:
> [ 1848.713818]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.715864]  #1: ffff888130d6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.717969] 2 locks held by kworker/1:229/3661:
> [ 1848.719616]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.721652]  #1: ffff88813c977d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.723760] 2 locks held by kworker/3:173/3663:
> [ 1848.725406]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.727446]  #1: ffff88812b3a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.729563] 2 locks held by kworker/3:174/3664:
> [ 1848.731188]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.733232]  #1: ffff88812b28fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.735350] 2 locks held by kworker/3:176/3666:
> [ 1848.736998]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.739041]  #1: ffff888130617d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.741151] 2 locks held by kworker/3:177/3667:
> [ 1848.742800]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.744841]  #1: ffff88812fcbfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.746948] 2 locks held by kworker/3:180/3670:
> [ 1848.748596]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.750633]  #1: ffff88812f107d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.752736] 2 locks held by kworker/3:181/3671:
> [ 1848.754382]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.756410]  #1: ffff88812feffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.758503] 2 locks held by kworker/3:182/3672:
> [ 1848.760134]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.762174]  #1: ffff88812bc8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.764287] 2 locks held by kworker/3:185/3675:
> [ 1848.765941]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.767986]  #1: ffff8881348dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.770095] 2 locks held by kworker/3:187/3677:
> [ 1848.771739]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.773785]  #1: ffff888132c87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.775892] 2 locks held by kworker/3:188/3678:
> [ 1848.777537]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.779565]  #1: ffff888121e2fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.781669] 2 locks held by kworker/3:195/3685:
> [ 1848.783314]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.785354]  #1: ffff88812c187d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.787460] 2 locks held by kworker/3:197/3687:
> [ 1848.789094]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.791137]  #1: ffff888131f5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.793245] 2 locks held by kworker/3:198/3688:
> [ 1848.794897]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.796942]  #1: ffff88813516fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.799050] 2 locks held by kworker/3:202/3692:
> [ 1848.800695]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.802729]  #1: ffff8881350efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.804833] 2 locks held by kworker/3:204/3694:
> [ 1848.806476]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.808520]  #1: ffff88811e2a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.810620] 2 locks held by kworker/3:205/3695:
> [ 1848.812256]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.814287]  #1: ffff88812f4cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.816397] 2 locks held by kworker/3:207/3697:
> [ 1848.818041]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.820082]  #1: ffff8881247dfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.822188] 2 locks held by kworker/3:208/3698:
> [ 1848.823848]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.825889]  #1: ffff88811934fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.827995] 2 locks held by kworker/3:209/3699:
> [ 1848.829643]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.831677]  #1: ffff8881231d7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.833786] 2 locks held by kworker/3:211/3701:
> [ 1848.835431]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.837472]  #1: ffff888133b6fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.839583] 2 locks held by kworker/3:212/3702:
> [ 1848.841220]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.843267]  #1: ffff88813242fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.845377] 2 locks held by kworker/3:214/3704:
> [ 1848.847024]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.849080]  #1: ffff8881316b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.851197] 2 locks held by kworker/3:217/3707:
> [ 1848.852849]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.854894]  #1: ffff88811476fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.856998] 2 locks held by kworker/3:218/3708:
> [ 1848.858649]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.860681]  #1: ffff888132bdfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.862786] 2 locks held by kworker/3:220/3710:
> [ 1848.864428]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.866459]  #1: ffff888125137d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.868556] 2 locks held by kworker/3:221/3711:
> [ 1848.870182]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.872225]  #1: ffff888132597d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.874338] 2 locks held by kworker/3:223/3713:
> [ 1848.875983]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.878025]  #1: ffff8881209cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.880131] 2 locks held by kworker/3:224/3714:
> [ 1848.881784]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.883828]  #1: ffff88811f877d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.885935] 2 locks held by kworker/3:225/3715:
> [ 1848.887576]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.889601]  #1: ffff88811cf47d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.891689] 2 locks held by kworker/3:226/3716:
> [ 1848.893331]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.895372]  #1: ffff88811cd7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.897468] 2 locks held by kworker/3:227/3717:
> [ 1848.899110]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.901150]  #1: ffff888111b9fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.903256] 2 locks held by kworker/3:232/3722:
> [ 1848.904908]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.906947]  #1: ffff88812aed7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.909055] 2 locks held by kworker/3:233/3723:
> [ 1848.910698]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.912732]  #1: ffff888130637d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.914839] 2 locks held by kworker/3:238/3728:
> [ 1848.916481]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.918521]  #1: ffff8881399efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.920623] 2 locks held by kworker/1:231/3737:
> [ 1848.922259]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.924298]  #1: ffff8881290a7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.926411] 2 locks held by kworker/1:232/3738:
> [ 1848.928047]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.930086]  #1: ffff888120b77d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.932193] 2 locks held by kworker/1:237/3743:
> [ 1848.933842]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.935906]  #1: ffff8881100f7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.938012] 2 locks held by kworker/1:238/3744:
> [ 1848.939659]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.941690]  #1: ffff88812e0e7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.943802] 2 locks held by kworker/1:239/3745:
> [ 1848.945451]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.947488]  #1: ffff88810ad5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.949607] 2 locks held by kworker/1:241/3747:
> [ 1848.951246]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.953289]  #1: ffff88811fb5fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.955408] 2 locks held by kworker/1:242/3748:
> [ 1848.957058]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.959102]  #1: ffff888119eefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.961208] 2 locks held by kworker/1:243/3749:
> [ 1848.962862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.964903]  #1: ffff888130d87d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.967016] 2 locks held by kworker/1:244/3750:
> [ 1848.968662]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.970700]  #1: ffff8881289afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.972810] 2 locks held by kworker/1:245/3751:
> [ 1848.974456]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.976486]  #1: ffff8881063cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.978601] 2 locks held by kworker/1:246/3752:
> [ 1848.980230]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.982276]  #1: ffff88811fca7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.984392] 2 locks held by kworker/1:248/3754:
> [ 1848.986033]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.988077]  #1: ffff888106997d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.990184] 2 locks held by kworker/1:249/3755:
> [ 1848.991840]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.993885]  #1: ffff8881372afd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1848.995994] 2 locks held by kworker/1:250/3756:
> [ 1848.997641]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1848.999679]  #1: ffff8881209b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.001787] 2 locks held by kworker/1:251/3757:
> [ 1849.003436]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.005482]  #1: ffff8881314ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.007582] 2 locks held by kworker/1:252/3758:
> [ 1849.009219]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.011260]  #1: ffff888130d8fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.013377] 2 locks held by kworker/1:253/3759:
> [ 1849.015026]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.017066]  #1: ffff8881371b7d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.019201] 2 locks held by kworker/1:255/3761:
> [ 1849.020862]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.022902]  #1: ffff88811d897d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.025007] 2 locks held by kworker/1:256/3762:
> [ 1849.026649]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.028680]  #1: ffff88813b99fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.030792] 2 locks held by kworker/1:257/3763:
> [ 1849.032440]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.034483]  #1: ffff88813b0efd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.036590] 2 locks held by kworker/3:247/3765:
> [ 1849.038220]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.040260]  #1: ffff888134867d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.042382] 2 locks held by kworker/3:248/3766:
> [ 1849.044023]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.046066]  #1: ffff888124b7fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.048170] 2 locks held by kworker/3:249/3767:
> [ 1849.049821]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.051860]  #1: ffff888131aafd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.053964] 2 locks held by kworker/3:251/3769:
> [ 1849.055610]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.057647]  #1: ffff8881068ffd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.059746] 2 locks held by kworker/3:252/3770:
> [ 1849.061399]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.063434]  #1: ffff88810b757d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.065537] 2 locks held by kworker/3:254/3772:
> [ 1849.067174]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.069213]  #1: ffff888136d97d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.071346] 2 locks held by kworker/3:255/3773:
> [ 1849.072992]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.075027]  #1: ffff88811830fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.077139] 2 locks held by kworker/3:256/3774:
> [ 1849.078795]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.080838]  #1: ffff888127547d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.082946] 2 locks held by kworker/1:258/4004:
> [ 1849.084592]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.086626]  #1: ffff88812cbefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.088727] 2 locks held by kworker/0:18/13817:
> [ 1849.090368]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.092406]  #1: ffff8881213cfd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.094515] 2 locks held by kworker/1:97/23521:
> [ 1849.096151]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.098194]  #1: ffff88810c33fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.100300] 2 locks held by kworker/1:259/28552:
> [ 1849.101959]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.103997]  #1: ffff888140777d98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.106102] 2 locks held by kworker/3:258/38106:
> [ 1849.107766]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.109802]  #1: ffff888111b0fd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> [ 1849.111912] 2 locks held by kworker/1:172/39248:
> [ 1849.113563]  #0: ffff88813c00c938 ((wq_completion)dio/dm-1){+.+.}-{0:0}, at: process_one_work+0x790/0x14a0
> [ 1849.115594]  #1: ffff888110eefd98 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x7be/0x14a0
> 
> [ 1849.119116] =============================================
> 
> 
> [2]
> 
> $ ps axuw | grep " D "
> root           9  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:1+dio/dm-1]
> root          25  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:0+dio/dm-1]
> root          49  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:1+dio/dm-1]
> root          74  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:2+dio/dm-1]
> root         169  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/3:2+dio/dm-1]
> root         221  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/0:3+dio/dm-1]
> root         230  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:2+dio/dm-1]
> root         291  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/2:3+dio/dm-1]
> root         322  0.0  0.0      0     0 ?        D    10:55   0:00 [kworker/1:3+dio/dm-1]
> root        2757  2.1  0.0      0     0 ?        D    10:57   1:14 [kworker/u8:7+flush-253:1]
> root        2759  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/3:4+dio/dm-1]
> root        2760  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/0:4+dio/dm-1]
> root        2762  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/1:5+dio/dm-1]
> root        2764  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/1:6+dio/dm-1]
> root        2765  0.0  0.0      0     0 ?        D    10:57   0:00 [kworker/3:5+dio/dm-1]
> ...

Shinichiro,

I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
002 and 011 fairly often. I have not been able to figure out the root cause, but I suspect a
timing issue in the srp drivers, which cannot handle the slowness of the software RoCE
implementation. If you can give me any clues about what you are seeing, I am happy to help
try to figure this out.

Bob Pearson
rpearson@hpe.com (rpearsonhpe@gmail.com)

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-22  1:46 ` Bob Pearson
@ 2023-08-22 10:18   ` Shinichiro Kawasaki
  2023-08-22 15:20     ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Shinichiro Kawasaki @ 2023-08-22 10:18 UTC (permalink / raw)
  To: Bob Pearson; +Cc: linux-rdma, linux-scsi, Bart Van Assche

CC+: Bart,

On Aug 21, 2023 / 20:46, Bob Pearson wrote:
[...]
> Shinichiro,

Hello Bob, thanks for the response.

> 
> I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
> 002 and 011 fairly often.

I repeated the test case srp/011 and observed that it hangs as well. This srp/011
hang can also be reproduced reliably. When I reverted the commit 9b4b7c1f9f54, the
srp/011 hang disappeared, so I guess these two hangs have the same root cause.
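FWIW, the repeat-until-hang workflow I use is roughly the loop below. This is a
hedged sketch: in practice the command inside the loop is the blktests runner
(`./check srp/011` from a blktests checkout); here `run_test` is a hypothetical
stand-in so the loop structure itself is clear.

```shell
#!/bin/sh
# Re-run a flaky test until it fails, capped at 30 iterations.
# run_test is a stand-in for "./check srp/011"; in this sketch it
# succeeds four times and fails on the fifth call.
run_test() {
    COUNT=$((COUNT + 1))
    [ "$COUNT" -lt 5 ]
}

COUNT=0
i=1
while [ "$i" -le 30 ]; do
    if ! run_test; then
        echo "failed at iteration $i"
        break
    fi
    i=$((i + 1))
done
```

When the test hangs rather than fails, the loop simply stops making progress,
and the stuck kworkers show up in `ps axuw | grep " D "` as in [2].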

> I have not been able to figure out the root cause, but I suspect a
> timing issue in the srp drivers, which cannot handle the slowness of the software
> RoCE implementation. If you can give me any clues about what you are seeing, I am happy to help
> try to figure this out.

Thanks for sharing your thoughts. I myself do not have srp driver knowledge, so I'm
not sure what clues I can provide. If you have any ideas about actions I can take,
please let me know.

IMHO, the srp/011 hang looks easier to dig into than srp/002, because srp/011 does not
involve filesystems. Also, during the srp/011 hang, the kernel reported many "SRP abort"s [X],
similar to the srp/002 hang.

[X]

[  196.330820] run blktests srp/011 at 2023-08-22 17:22:42
[  196.819383] null_blk: module loaded
[  196.870572] null_blk: disk nullb0 created
[  196.886712] null_blk: disk nullb1 created
[  197.081369] rdma_rxe: loaded
[  197.103766] (null): rxe_set_mtu: Set mtu to 1024
[  197.139726] infiniband ens3_rxe: set active
[  197.142649] infiniband ens3_rxe: added ens3
[  197.196229] scsi_debug:sdebug_add_store: dif_storep 524288 bytes @ 000000005234c247
[  197.200354] scsi_debug:sdebug_driver_probe: scsi_debug: trim poll_queues to 0. poll_q/nr_hw = (0/1)
[  197.202780] scsi_debug:sdebug_driver_probe: host protection DIF3 DIX3
[  197.204566] scsi host3: scsi_debug: version 0191 [20210520]
                 dev_size_mb=32, opts=0x0, submit_queues=1, statistics=0
[  197.209853] scsi 3:0:0:0: Direct-Access     Linux    scsi_debug       0191 PQ: 0 ANSI: 7
[  197.213521] scsi 3:0:0:0: Power-on or device reset occurred
[  197.217732] sd 3:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  197.218797] sd 3:0:0:0: Attached scsi generic sg2 type 0
[  197.219951] sd 3:0:0:0: [sdc] Write Protect is off
[  197.223066] sd 3:0:0:0: [sdc] Mode Sense: 73 00 10 08
[  197.225611] sd 3:0:0:0: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA
[  197.229701] sd 3:0:0:0: [sdc] Enabling DIX T10-DIF-TYPE3-CRC, application tag size 6 bytes
[  197.232015] sd 3:0:0:0: [sdc] Enabling DIF Type 3 protection
[  197.233863] sd 3:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[  197.235412] sd 3:0:0:0: [sdc] Optimal transfer size 524288 bytes
[  197.241520] sd 3:0:0:0: [sdc] Attached SCSI disk
[  197.654951] Rounding down aligned max_sectors from 4294967295 to 4294967288
[  197.710283] ib_srpt:srpt_add_one: ib_srpt device = 00000000685934b8
[  197.710340] ib_srpt:srpt_use_srq: ib_srpt srpt_use_srq(ens3_rxe): use_srq = 0; ret = 0
[  197.710345] ib_srpt:srpt_add_one: ib_srpt Target login info: id_ext=505400fffe123456,ioc_guid=505400fffe123456,pkey=ffff,service_id=505400fffe123456
[  197.710657] ib_srpt:srpt_add_one: ib_srpt added ens3_rxe.
[  198.184239] Rounding down aligned max_sectors from 255 to 248
[  198.247444] Rounding down aligned max_sectors from 255 to 248
[  198.311742] Rounding down aligned max_sectors from 4294967295 to 4294967288
[  198.798620] ib_srp:srp_add_one: ib_srp: srp_add_one: 18446744073709551615 / 4096 = 4503599627370495 <> 512
[  198.798630] ib_srp:srp_add_one: ib_srp: ens3_rxe: mr_page_shift = 12, device->max_mr_size = 0xffffffffffffffff, device->max_fast_reg_page_list_len = 512, max_pages_per_mr = 512, mr_max_size = 0x200000
[  198.898881] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  198.898908] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  198.898942] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 512; mr_page_size = 4096; max_sectors_per_mr = 4096; mr_per_cmd = 2
[  198.898947] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  198.910816] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  198.916313] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  198.919848] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 00000000d71a59ab
[  198.920007] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  198.920308] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a5feaed8 name=10.0.2.15 ch=00000000d71a59ab
[  198.921661] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  198.921688] scsi host4: ib_srp: using immediate data
[  198.921951] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-18: queued zerolength write
[  198.922831] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-18 wc->status 0
[  198.931958] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  198.937206] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  198.939984] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 000000009f3b3382
[  198.940133] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  198.940173] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c70b88d5 name=10.0.2.15 ch=000000009f3b3382
[  198.940454] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  198.940460] scsi host4: ib_srp: using immediate data
[  198.940840] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-20: queued zerolength write
[  198.941071] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-20 wc->status 0
[  198.950276] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  198.955351] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  198.958102] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 000000002f3d11a8
[  198.958270] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  198.958312] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008dd11076 name=10.0.2.15 ch=000000002f3d11a8
[  198.958626] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  198.958632] scsi host4: ib_srp: using immediate data
[  198.959552] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-22: queued zerolength write
[  198.959815] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-22 wc->status 0
[  198.968720] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  198.973609] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  198.976219] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 00000000b6291aea
[  198.976369] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  198.976413] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000d8231f1e name=10.0.2.15 ch=00000000b6291aea
[  198.976694] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  198.976700] scsi host4: ib_srp: using immediate data
[  198.976810] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-24: queued zerolength write
[  198.976929] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-24 wc->status 0
[  198.977610] scsi host4: SRP.T10:505400FFFE123456
[  198.987781] scsi 4:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  198.996088] scsi 4:0:0:0: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[  199.000231] scsi 4:0:0:0: alua: supports implicit and explicit TPGS
[  199.002500] scsi 4:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[  199.007201] sd 4:0:0:0: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  199.007936] sd 4:0:0:0: Attached scsi generic sg3 type 0
[  199.010141] sd 4:0:0:0: [sdd] Write Protect is off
[  199.012388] sd 4:0:0:0: [sdd] Mode Sense: 43 00 00 08
[  199.014718] scsi 4:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  199.015810] sd 4:0:0:0: [sdd] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[  199.019705] sd 4:0:0:0: [sdd] Preferred minimum I/O size 512 bytes
[  199.021312] sd 4:0:0:0: [sdd] Optimal transfer size 126976 bytes
[  199.023796] scsi 4:0:0:2: alua: supports implicit and explicit TPGS
[  199.025378] scsi 4:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[  199.029763] sd 4:0:0:2: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  199.029822] sd 4:0:0:2: Attached scsi generic sg4 type 0
[  199.030670] sd 4:0:0:2: [sde] Write Protect is off
[  199.034314] sd 4:0:0:2: [sde] Mode Sense: 43 00 10 08
[  199.036643] sd 4:0:0:0: [sdd] Attached SCSI disk
[  199.038861] scsi 4:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  199.039148] sd 4:0:0:2: [sde] Write cache: enabled, read cache: enabled, supports DPO and FUA
[  199.046070] sd 4:0:0:2: [sde] Preferred minimum I/O size 512 bytes
[  199.047580] scsi 4:0:0:1: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[  199.047685] sd 4:0:0:2: [sde] Optimal transfer size 524288 bytes
[  199.049654] scsi 4:0:0:1: alua: supports implicit and explicit TPGS
[  199.053197] scsi 4:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[  199.056642] sd 4:0:0:1: Attached scsi generic sg5 type 0
[  199.056679] sd 4:0:0:1: [sdf] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  199.057539] ib_srp:srp_add_target: ib_srp: host4: SCSI scan succeeded - detected 3 LUNs
[  199.057979] sd 4:0:0:1: [sdf] Write Protect is off
[  199.058888] scsi host4: ib_srp: new target: id_ext 505400fffe123456 ioc_guid 505400fffe123456 sgid fe80:0000:0000:0000:5054:00ff:fe12:3456 dest 10.0.2.15
[  199.059238] sd 4:0:0:1: [sdf] Mode Sense: 43 00 00 08
[  199.064721] sd 4:0:0:2: [sde] Attached SCSI disk
[  199.066646] sd 4:0:0:1: [sdf] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[  199.069653] sd 4:0:0:1: [sdf] Preferred minimum I/O size 512 bytes
[  199.071330] sd 4:0:0:1: [sdf] Optimal transfer size 126976 bytes
[  199.072389] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  199.072952] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  199.072985] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  199.073001] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  199.073012] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  199.083799] sd 4:0:0:1: [sdf] Attached SCSI disk
[  199.095910] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  199.095929] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  199.095959] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  199.095975] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  199.096005] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  199.096020] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  199.096030] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  199.431496] sd 4:0:0:1: alua: transition timeout set to 60 seconds
[  199.433258] sd 4:0:0:1: alua: port group 00 state A non-preferred supports TOlUSNA
[  199.456782] sd 4:0:0:2: alua: transition timeout set to 60 seconds
[  199.458737] sd 4:0:0:2: alua: port group 00 state A non-preferred supports TOlUSNA
[  199.488105] sd 4:0:0:0: alua: transition timeout set to 60 seconds
[  199.489964] sd 4:0:0:0: alua: port group 00 state A non-preferred supports TOlUSNA
[  204.807887] device-mapper: multipath: 253:3: Failing path 8:48.
[  204.856553] scsi 4:0:0:0: alua: Detached
[  204.868122] sd 4:0:0:2: [sde] Synchronizing SCSI cache
[  204.886615] scsi 4:0:0:2: alua: Detached
[  204.919557] scsi 4:0:0:1: alua: Detached
[  204.925989] ib_srpt receiving failed for ioctx 00000000ddab6801 with status 5
[  204.926715] ib_srpt receiving failed for ioctx 000000000bc9beb4 with status 5
[  204.927759] ib_srpt receiving failed for ioctx 000000002ec13abb with status 5
[  204.927762] ib_srpt receiving failed for ioctx 00000000a73075da with status 5
[  204.927764] ib_srpt receiving failed for ioctx 00000000db73d7b8 with status 5
[  204.927766] ib_srpt receiving failed for ioctx 00000000b7c85b9d with status 5
[  204.927767] ib_srpt receiving failed for ioctx 00000000d70acd70 with status 5
[  204.927769] ib_srpt receiving failed for ioctx 0000000059193fad with status 5
[  204.927771] ib_srpt receiving failed for ioctx 0000000019e9ec9e with status 5
[  204.927773] ib_srpt receiving failed for ioctx 0000000033e124b9 with status 5
[  205.443422] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-20: queued zerolength write
[  205.444973] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-18: queued zerolength write
[  205.445056] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-20 wc->status 5
[  205.446047] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-18 wc->status 5
[  205.446190] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-20
[  205.448320] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-18
[  205.506138] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-24: queued zerolength write
[  205.506195] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-22: queued zerolength write
[  205.506263] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-24 wc->status 5
[  205.506329] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-22 wc->status 5
[  205.506354] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-24
[  205.506381] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-22
[  209.945988] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  209.946026] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  209.946046] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 512; mr_page_size = 4096; max_sectors_per_mr = 4096; mr_per_cmd = 2
[  209.946055] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  209.958117] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  209.962963] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  209.965421] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 00000000fec65d93
[  209.965591] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  209.965635] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009f6a881a name=10.0.2.15 ch=00000000fec65d93
[  209.966180] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  209.966187] scsi host4: ib_srp: using immediate data
[  209.967393] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-26: queued zerolength write
[  209.967518] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-26 wc->status 0
[  209.976127] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  209.984015] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  209.988890] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 0000000029b704ac
[  209.989221] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  209.989297] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000078cf4fe1 name=10.0.2.15 ch=0000000029b704ac
[  209.989641] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  209.989647] scsi host4: ib_srp: using immediate data
[  209.989814] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-28: queued zerolength write
[  209.989997] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-28 wc->status 0
[  209.999235] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  210.004184] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  210.006893] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 00000000492e551c
[  210.007050] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  210.007096] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008b9aa995 name=10.0.2.15 ch=00000000492e551c
[  210.007402] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  210.007410] scsi host4: ib_srp: using immediate data
[  210.007582] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-30: queued zerolength write
[  210.008212] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-30 wc->status 0
[  210.017177] ib_srpt Received SRP_LOGIN_REQ with i_port_id fe80:0000:0000:0000:5054:00ff:fe12:3456, t_port_id 5054:00ff:fe12:3456:5054:00ff:fe12:3456 and it_iu_len 8260 on port 1 (guid=fe80:0000:0000:0000:5054:00ff:fe12:3456); pkey 0xffff
[  210.022684] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[  210.025487] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8191 max_sge= 32 sq_size = 4096 ch= 0000000033dc05a8
[  210.025663] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 10.0.2.15 or i_port_id 0xfe80000000000000505400fffe123456
[  210.025707] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000066092653 name=10.0.2.15 ch=0000000033dc05a8
[  210.026031] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[  210.026038] scsi host4: ib_srp: using immediate data
[  210.026169] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-32: queued zerolength write
[  210.026743] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-32 wc->status 0
[  210.026940] scsi host4: SRP.T10:505400FFFE123456
[  210.032959] scsi 4:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  210.041064] scsi 4:0:0:0: alua: supports implicit and explicit TPGS
[  210.043448] scsi 4:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[  210.047197] sd 4:0:0:0: Attached scsi generic sg3 type 0
[  210.047772] sd 4:0:0:0: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  210.051913] sd 4:0:0:0: [sdd] Write Protect is off
[  210.053426] sd 4:0:0:0: [sdd] Mode Sense: 43 00 00 08
[  210.054089] scsi 4:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  210.054483] sd 4:0:0:0: [sdd] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[  210.060433] sd 4:0:0:0: [sdd] Preferred minimum I/O size 512 bytes
[  210.062061] sd 4:0:0:0: [sdd] Optimal transfer size 126976 bytes
[  210.063874] scsi 4:0:0:2: alua: supports implicit and explicit TPGS
[  210.065561] scsi 4:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[  210.069906] sd 4:0:0:2: Attached scsi generic sg4 type 0
[  210.071244] sd 4:0:0:0: [sdd] Attached SCSI disk
[  210.071438] sd 4:0:0:2: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  210.075126] sd 4:0:0:2: [sde] Write Protect is off
[  210.076561] sd 4:0:0:2: [sde] Mode Sense: 43 00 10 08
[  210.077417] sd 4:0:0:2: [sde] Write cache: enabled, read cache: enabled, supports DPO and FUA
[  210.080889] sd 4:0:0:2: [sde] Preferred minimum I/O size 512 bytes
[  210.081559] scsi 4:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[  210.082390] sd 4:0:0:2: [sde] Optimal transfer size 524288 bytes
[  210.090475] scsi 4:0:0:1: alua: supports implicit and explicit TPGS
[  210.092667] scsi 4:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[  210.096228] sd 4:0:0:1: [sdf] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[  210.098474] sd 4:0:0:1: [sdf] Write Protect is off
[  210.099943] sd 4:0:0:1: [sdf] Mode Sense: 43 00 00 08
[  210.100772] sd 4:0:0:1: [sdf] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[  210.104885] sd 4:0:0:1: [sdf] Preferred minimum I/O size 512 bytes
[  210.106065] sd 4:0:0:2: [sde] Attached SCSI disk
[  210.106425] sd 4:0:0:1: [sdf] Optimal transfer size 126976 bytes
[  210.109617] sd 4:0:0:1: Attached scsi generic sg5 type 0
[  210.112866] ib_srp:srp_add_target: ib_srp: host4: SCSI scan succeeded - detected 3 LUNs
[  210.112873] scsi host4: ib_srp: new target: id_ext 505400fffe123456 ioc_guid 505400fffe123456 sgid fe80:0000:0000:0000:5054:00ff:fe12:3456 dest 10.0.2.15
[  210.114809] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  210.114827] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  210.114857] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  210.114873] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  210.114883] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  210.121314] sd 4:0:0:1: [sdf] Attached SCSI disk
[  210.133745] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  210.133764] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  210.133796] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  210.133813] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  210.133846] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  210.133861] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  210.133871] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  210.325220] sd 4:0:0:0: alua: transition timeout set to 60 seconds
[  210.327176] sd 4:0:0:0: alua: port group 00 state A non-preferred supports TOlUSNA
[  210.512382] sd 4:0:0:1: alua: transition timeout set to 60 seconds
[  210.514366] sd 4:0:0:1: alua: port group 00 state A non-preferred supports TOlUSNA
[  210.537067] sd 4:0:0:2: alua: transition timeout set to 60 seconds
[  210.538788] sd 4:0:0:2: alua: port group 00 state A non-preferred supports TOlUSNA
[  217.322048] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.322067] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.322078] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  217.336141] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.336160] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.336190] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.336206] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.336216] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  217.351059] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.351079] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.351109] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.351125] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.351155] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  217.351171] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  217.351180] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  217.583935] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.583961] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.583974] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  217.599109] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.599128] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.599158] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.599174] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.599184] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  217.617214] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.617234] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.617270] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.617285] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.617316] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  217.617331] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  217.617341] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  217.839147] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.839168] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.839187] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  217.853795] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.853815] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.853846] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.853861] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.853872] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  217.875042] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  217.875061] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  217.875092] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  217.875107] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  217.875138] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  217.875152] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  217.875162] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  218.110548] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.110585] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.110603] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  218.127935] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.127959] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.128003] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.128023] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.128036] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  218.145223] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.145243] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.145274] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.145295] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.145326] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  218.145340] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  218.145351] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  218.379877] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.379897] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.379908] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  218.399268] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.399298] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.399330] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.399346] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.399356] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  218.414922] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.414942] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.414973] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.414989] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.415019] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  218.415034] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  218.415044] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  218.657313] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.657347] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.657365] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  218.672418] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.672437] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.672468] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.672483] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.672494] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  218.687795] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.687815] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.687846] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.687861] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.687892] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  218.687907] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  218.687917] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  218.932504] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.932549] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.932561] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  218.948134] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.948155] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.948185] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.948200] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.948210] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  218.961885] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  218.961904] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  218.961935] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  218.961951] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  218.961981] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  218.961996] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  218.962006] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  219.196670] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.196691] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.196701] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  219.213009] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.213029] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.213059] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.213075] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.213085] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  219.231373] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.231392] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.231424] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.231439] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.231470] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  219.231485] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  219.231495] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  219.483889] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.483910] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.483920] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  219.498333] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.498365] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.498405] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.498425] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.498439] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  219.515059] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.515080] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.515110] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.515128] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.515158] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  219.515173] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  219.515183] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  219.760020] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.760040] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.760051] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  219.777862] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.777896] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.777958] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.777987] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.778005] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  219.799030] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  219.799051] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  219.799087] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  219.799102] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  219.799133] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  219.799147] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  219.799158] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  227.086363] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.086383] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.086393] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  227.107902] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.107922] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.107952] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.107968] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.107978] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  227.125254] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.125274] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.125304] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.125320] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.125350] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  227.125365] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  227.125375] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  227.355831] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.355852] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.355862] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  227.369483] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.369502] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.369533] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.369572] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.369584] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  227.386900] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.386920] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.386951] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.386966] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.386997] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  227.387012] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  227.387022] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  227.610540] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.610603] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.610620] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  227.628902] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.628922] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.628955] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.628971] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.628981] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  227.645234] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.645255] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.645285] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.645301] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.645334] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  227.645349] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  227.645359] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  227.873626] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.873646] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.873656] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  227.890838] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.890857] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.890887] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.890903] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.890913] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  227.905153] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  227.905173] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  227.905203] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  227.905218] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  227.905249] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  227.905264] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  227.905274] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  228.130167] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.130187] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.130197] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  228.151246] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.151271] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.151312] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.151332] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.151345] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  228.177952] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.177972] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.178003] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.178020] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.178051] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  228.178066] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  228.178076] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  228.408180] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.408204] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.408217] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  228.429107] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.429133] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.429173] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.429193] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.429206] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  228.446183] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.446202] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.446233] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.446248] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.446279] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  228.446295] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  228.446305] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  228.681133] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.681166] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.681180] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  228.699467] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.699490] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.699521] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.699536] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.699547] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  228.715076] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.715096] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.715127] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.715142] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.715173] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  228.715188] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  228.715197] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  228.942150] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.942176] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.942190] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  228.957037] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.957068] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.957125] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.957153] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.957172] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  228.973879] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  228.973900] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  228.973931] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  228.973947] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  228.973978] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  228.973993] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  228.974003] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  229.204277] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.204298] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.204308] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  229.224969] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.224989] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.225021] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.225037] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.225047] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  229.241092] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.241111] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.241142] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.241157] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.241187] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  229.241202] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  229.241212] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  229.467761] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.467786] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.467797] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  229.483759] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.483790] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.483839] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.483866] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.483883] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  229.501487] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.501512] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.501581] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.501615] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.501660] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  229.501680] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  229.501693] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  229.752662] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.752682] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.752692] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  229.764884] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.764903] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.764934] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.764951] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.764961] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  229.781797] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  229.781817] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  229.781848] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  229.781863] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  229.781893] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  229.781908] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  229.781918] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  230.010623] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.010643] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.010654] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  230.026757] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.026777] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.026816] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.026832] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.026842] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  230.047533] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.047671] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.047714] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.047735] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.047777] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  230.047796] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  230.047815] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  230.276899] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.276919] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.276929] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  230.294911] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.294932] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.294963] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.294979] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.294989] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  230.316396] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.316417] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.316447] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.316463] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.316493] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  230.316508] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  230.316518] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  230.543058] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.543081] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.543091] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  230.560038] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.560058] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.560089] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.560105] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.560115] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  230.580843] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.580864] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.580895] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.580911] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.580941] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  230.580956] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  230.580966] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  230.821544] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.821589] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.821600] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  230.833829] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.833861] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.833892] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.833908] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.833918] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  230.854790] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  230.854809] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  230.854845] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  230.854861] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  230.854891] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  230.854906] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  230.854916] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  231.101599] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.101620] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.101630] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  231.122996] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.123015] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.123046] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.123061] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.123072] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  231.136982] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.137001] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.137032] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.137047] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.137078] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  231.137093] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  231.137103] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  231.380246] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.380267] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.380278] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  231.393413] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.393433] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.393464] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.393480] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.393490] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  231.414392] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.414425] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.414487] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.414516] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.414601] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  231.414631] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  231.414649] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  231.645614] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.645634] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.645645] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  231.664063] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.664082] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.664113] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.664129] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.664139] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  231.681247] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.681267] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.681298] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.681313] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.681344] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  231.681358] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  231.681368] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  231.924774] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.924793] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.924803] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  231.944478] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.944499] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.944531] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.944547] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.944557] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  231.962526] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  231.962545] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  231.962606] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  231.962623] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  231.962653] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  231.962668] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  231.962678] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  232.185860] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  232.185880] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  232.185890] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=10.0.2.15
[  232.202741] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  232.202763] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  232.202794] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  232.202809] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  232.202820] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fec0:0000:0000:0000:5054:00ff:fe12:3456
[  232.223744] ib_srp:srp_parse_in: ib_srp: 10.0.2.15 -> 10.0.2.15:0
[  232.223763] ib_srp:srp_parse_in: ib_srp: 10.0.2.15:5555 -> 10.0.2.15:5555
[  232.223793] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456] -> [fec0::5054:ff:fe12:3456]:0/167772687%0
[  232.223809] ib_srp:srp_parse_in: ib_srp: [fec0::5054:ff:fe12:3456]:5555 -> [fec0::5054:ff:fe12:3456]:5555/167772687%0
[  232.223839] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2] -> [fe80::5054:ff:fe12:3456]:0/167772687%2
[  232.223854] ib_srp:srp_parse_in: ib_srp: [fe80::5054:ff:fe12:3456%2]:5555 -> [fe80::5054:ff:fe12:3456]:5555/167772687%2
[  232.223864] scsi host5: ib_srp: Already connected to target port with id_ext=505400fffe123456;ioc_guid=505400fffe123456;dest=fe80:0000:0000:0000:5054:00ff:fe12:3456
[  242.836918] scsi host4: SRP abort called
[  242.840917] scsi host4: Sending SRP abort for tag 0x3d
[  242.846319] ib_srpt:srpt_handle_tsk_mgmt: ib_srpt recv tsk_mgmt fn 1 for task_tag 61 and cmd tag 2147483649 ch 00000000fec65d93 sess 000000009f6a881a
[  242.846638] ABORT_TASK: Sending TMR_TASK_DOES_NOT_EXIST for ref_tag: 61
[  242.847642] scsi host4: Null scmnd for RSP w/tag 0x0000000000003d received on ch 0 / QP 0x19
[  242.854933] scsi host4: SRP abort called
[  242.857399] scsi host4: SRP abort called
[  242.859682] scsi host4: SRP abort called
[  242.861854] scsi host4: SRP abort called
[  242.863946] scsi host4: SRP abort called
[  242.866014] scsi host4: SRP abort called
[  242.868081] scsi host4: SRP abort called
[  242.870154] scsi host4: SRP abort called
[  242.871974] scsi host4: SRP abort called
[  242.874470] scsi host4: SRP abort called
[  242.876505] scsi host4: SRP abort called
[  242.878640] scsi host4: SRP abort called
[  242.880548] scsi host4: SRP abort called
[  242.882525] scsi host4: SRP abort called
[  242.884463] scsi host4: SRP abort called
[  242.886145] scsi host4: SRP abort called
[  242.887812] scsi host4: SRP abort called
[  242.889573] scsi host4: SRP abort called
[  242.891210] scsi host4: SRP abort called
[  242.892901] scsi host4: SRP abort called
[  242.903730] device-mapper: multipath: 253:3: Failing path 8:48.
[  242.928724] scsi 4:0:0:0: alua: Detached
[  242.948278] sd 4:0:0:2: [sde] Synchronizing SCSI cache
[  242.969091] scsi 4:0:0:2: alua: Detached
[  242.996739] srpt_recv_done: 502 callbacks suppressed
[  242.996743] ib_srpt receiving failed for ioctx 000000002b03f6bc with status 5
[  242.996898] ib_srpt receiving failed for ioctx 0000000067119178 with status 5
[  242.997255] ib_srpt receiving failed for ioctx 00000000451fc813 with status 5
[  242.997850] ib_srpt receiving failed for ioctx 0000000006e2d4c1 with status 5
[  242.997853] ib_srpt receiving failed for ioctx 000000007db43a18 with status 5
[  242.997855] ib_srpt receiving failed for ioctx 00000000976247d6 with status 5
[  242.997856] ib_srpt receiving failed for ioctx 000000008e5c98aa with status 5
[  242.997858] ib_srpt receiving failed for ioctx 00000000f17ceb65 with status 5
[  242.997860] ib_srpt receiving failed for ioctx 00000000e0ba06d1 with status 5
[  242.997861] ib_srpt receiving failed for ioctx 000000007ac01832 with status 5
[  243.020721] scsi 4:0:0:1: alua: Detached
[  243.522706] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-32: queued zerolength write
[  243.522742] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-30: queued zerolength write
[  243.522769] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-28: queued zerolength write
[  243.522785] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-32 wc->status 5
[  243.522795] ib_srpt:srpt_zerolength_write: ib_srpt 10.0.2.15-26: queued zerolength write
[  243.522806] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-32
[  243.522848] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-30 wc->status 5
[  243.522879] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-26 wc->status 5
[  243.522905] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-30
[  243.522910] ib_srpt:srpt_zerolength_write_done: ib_srpt 10.0.2.15-28 wc->status 5
[  243.522921] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-26
[  243.522934] ib_srpt:srpt_release_channel_work: ib_srpt 10.0.2.15-28

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-22 10:18   ` Shinichiro Kawasaki
@ 2023-08-22 15:20     ` Bart Van Assche
  2023-08-23 16:19       ` Bob Pearson
  2023-08-25  1:11       ` Shinichiro Kawasaki
  0 siblings, 2 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-08-22 15:20 UTC (permalink / raw)
  To: Shinichiro Kawasaki, Bob Pearson; +Cc: linux-rdma, linux-scsi

On 8/22/23 03:18, Shinichiro Kawasaki wrote:
> CC+: Bart,
> 
> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
> [...]
>> Shinichiro,
> 
> Hello Bob, thanks for the response.
> 
>>
>> I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
>> 002 and 011 fairly often.
> 
> I repeated the test case srp/011, and observed it hangs. This hang at srp/011
> also can be recreated in stable manner. I reverted the commit 9b4b7c1f9f54
> then observed the srp/011 hang disappeared. So, I guess these two hangs have
> same root cause.
> 
>> I have not been able to figure out the root cause but suspect that
>> there is a timing issue in the srp drivers which cannot handle the slowness of the software
>> RoCE implemtation. If you can give me any clues about what you are seeing I am happy to help
>> try to figure this out.
> 
> Thanks for sharing your thoughts. I myself do not have srp driver knowledge, and
> not sure what clue I should provide. If you have any idea of the action I can
> take, please let me know.

Hi Shinichiro and Bob,

When I initially developed the SRP tests, they worked reliably in combination
with the rdma_rxe driver. Since 2017 I have frequently seen issues when running
the SRP tests on top of the rdma_rxe driver, issues that I do not see when I run
the SRP tests on top of the soft-iWARP driver (siw). How about changing the
default for the SRP tests from rdma_rxe to siw and letting the RDMA community
resolve the rdma_rxe issues?

Thanks,

Bart.



* Re: [bug report] blktests srp/002 hang
  2023-08-22 15:20     ` Bart Van Assche
@ 2023-08-23 16:19       ` Bob Pearson
  2023-08-23 19:46         ` Bart Van Assche
                           ` (2 more replies)
  2023-08-25  1:11       ` Shinichiro Kawasaki
  1 sibling, 3 replies; 87+ messages in thread
From: Bob Pearson @ 2023-08-23 16:19 UTC (permalink / raw)
  To: Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/22/23 10:20, Bart Van Assche wrote:
> On 8/22/23 03:18, Shinichiro Kawasaki wrote:
>> CC+: Bart,
>>
>> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
>> [...]
>>> Shinichiro,
>>
>> Hello Bob, thanks for the response.
>>
>>>
>>> I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
>>> 002 and 011 fairly often.
>>
>> I repeated the test case srp/011, and observed it hangs. This hang at srp/011
>> also can be recreated in stable manner. I reverted the commit 9b4b7c1f9f54
>> then observed the srp/011 hang disappeared. So, I guess these two hangs have
>> same root cause.
>>
>>> I have not been able to figure out the root cause but suspect that
>>> there is a timing issue in the srp drivers which cannot handle the slowness of the software
>>> RoCE implemtation. If you can give me any clues about what you are seeing I am happy to help
>>> try to figure this out.
>>
>> Thanks for sharing your thoughts. I myself do not have srp driver knowledge, and
>> not sure what clue I should provide. If you have any idea of the action I can
>> take, please let me know.
> 
> Hi Shinichiro and Bob,
> 
> When I initially developed the SRP tests these were working reliably in
> combination with the rdma_rxe driver. Since 2017 I frequently see issues when
> running the SRP tests on top of the rdma_rxe driver, issues that I do not see
> if I run the SRP tests on top of the soft-iWARP driver (siw). How about
> changing the default for the SRP tests from rdma_rxe to siw and to let the
> RDMA community resolve the rdma_rxe issues?
> 
> Thanks,
> 
> Bart.
> 

Bart,

I have also seen the same hangs in siw. Not as frequently, but with the same symptoms.
About every month or so I take another run at trying to find and fix this bug, but
I have not succeeded yet. I haven't seen anything that looks like bad behavior from
the rxe side, but that doesn't prove anything. I also saw these hangs on my system
before the WQ patch went in, if my memory serves. Our main application for this
driver at HPE is Lustre, which is a little different from SRP but uses the same
general approach with fast MRs. Currently we are finding the driver to be quite stable
even under very heavy stress.

I would be happy to collaborate with someone (you?) who knows the SRP side well to resolve
this hang. I think that is the quickest way to fix this. I have no idea what SRP is waiting for.

Best regards,

Bob 


* Re: [bug report] blktests srp/002 hang
  2023-08-23 16:19       ` Bob Pearson
@ 2023-08-23 19:46         ` Bart Van Assche
  2023-08-24 16:24           ` Bob Pearson
  2023-08-24  8:55         ` Bernard Metzler
  2023-08-24 15:35         ` Bernard Metzler
  2 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-08-23 19:46 UTC (permalink / raw)
  To: Bob Pearson, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/23/23 09:19, Bob Pearson wrote:
> I have also seen the same hangs in siw. Not as frequently but the same symptoms.
> About every month or so I take another run at trying to find and fix this bug but
> I have not succeeded yet. I haven't seen anything that looks like bad behavior from
> the rxe side but that doesn't prove anything. I also saw these hangs on my system
> before the WQ patch went in if my memory serves. Out main application for this
> driver at HPE is Lustre which is a little different than SRP but uses the same
> general approach with fast MRs. Currently we are finding the driver to be quite stable
> even under very heavy stress.
> 
> I would be happy to collaborate with someone (you?) who knows the SRP side well to resolve
> this hang. I think that is the quickest way to fix this. I have no idea what SRP is waiting for.

Hi Bob,

I cannot reproduce these issues. All SRP tests work reliably on my test setup on
top of the v6.5-rc7 kernel, whether I use the siw driver or whether I use the
rdma_rxe driver. Additionally, I do not see any SRP abort messages.

# uname -a
Linux opensuse-vm 6.5.0-rc7 #28 SMP PREEMPT_DYNAMIC Wed Aug 23 10:42:35 PDT 2023 x86_64 x86_64 x86_64 GNU/Linux
# journalctl --since=today | grep 'SRP abort' | wc
       0       0       0

One note, since I installed openSUSE Tumbleweed in the VM in which I run kernel
tests: if you are using a Debian-based Linux distro, it may include a buggy
version of multipathd. The last time I ran the SRP tests in a Debian VM I had to
build multipathd from source; the SRP tests did not work with the Debian version
of multipathd. The shell script that I use to build and install multipathd is as
follows (it must be run in the multipath-tools source directory):

#!/bin/bash
# Build and install multipath-tools from source. Run this script from the
# multipath-tools source directory.

scriptdir="$(dirname "$0")"

# Install the build dependencies: zypper on openSUSE, apt-get on Debian-based
# distros.
if type -p zypper >/dev/null 2>&1; then
     rpms=(device-mapper-devel libaio-devel libjson-c-devel librados-devel
	  liburcu-devel readline-devel systemd-devel)
     for p in "${rpms[@]}"; do
	sudo zypper install -y "$p"
     done
elif type -p apt-get >/dev/null 2>&1; then
     export LIB=/lib
     sudo apt-get install -y libaio-dev libdevmapper-dev libjson-c-dev librados-dev \
	    libreadline-dev libsystemd-dev liburcu-dev
fi

# Remove untracked files, then build and install.
git clean -f
make -s "$@"
sudo make -s "$@" install

Bart.


* RE: Re: [bug report] blktests srp/002 hang
  2023-08-23 16:19       ` Bob Pearson
  2023-08-23 19:46         ` Bart Van Assche
@ 2023-08-24  8:55         ` Bernard Metzler
  2023-08-24 15:35         ` Bernard Metzler
  2 siblings, 0 replies; 87+ messages in thread
From: Bernard Metzler @ 2023-08-24  8:55 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi


> -----Original Message-----
> From: Bob Pearson <rpearsonhpe@gmail.com>
> Sent: Wednesday, 23 August 2023 18:19
> To: Bart Van Assche <bvanassche@acm.org>; Shinichiro Kawasaki
> <shinichiro.kawasaki@wdc.com>
> Cc: linux-rdma@vger.kernel.org; linux-scsi@vger.kernel.org
> Subject: [EXTERNAL] Re: [bug report] blktests srp/002 hang
> 
> On 8/22/23 10:20, Bart Van Assche wrote:
> > On 8/22/23 03:18, Shinichiro Kawasaki wrote:
> >> CC+: Bart,
> >>
> >> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
> >> [...]
> >>> Shinichiro,
> >>
> >> Hello Bob, thanks for the response.
> >>
> >>>
> >>> I have been aware for a long time that there is a problem with
> blktests/srp. I see hangs in
> >>> 002 and 011 fairly often.
> >>
> >> I repeated the test case srp/011, and observed it hangs. This hang at
> srp/011
> >> also can be recreated in stable manner. I reverted the commit
> 9b4b7c1f9f54
> >> then observed the srp/011 hang disappeared. So, I guess these two hangs
> have
> >> same root cause.
> >>
> >>> I have not been able to figure out the root cause but suspect that
> >>> there is a timing issue in the srp drivers which cannot handle the
> slowness of the software
> >>> RoCE implemtation. If you can give me any clues about what you are
> seeing I am happy to help
> >>> try to figure this out.
> >>
> >> Thanks for sharing your thoughts. I myself do not have srp driver
> knowledge, and
> >> not sure what clue I should provide. If you have any idea of the action
> I can
> >> take, please let me know.
> >
> > Hi Shinichiro and Bob,
> >
> > When I initially developed the SRP tests these were working reliably in
> > combination with the rdma_rxe driver. Since 2017 I frequently see issues
> when
> > running the SRP tests on top of the rdma_rxe driver, issues that I do not
> see
> > if I run the SRP tests on top of the soft-iWARP driver (siw). How about
> > changing the default for the SRP tests from rdma_rxe to siw and to let
> the
> > RDMA community resolve the rdma_rxe issues?
> >
> > Thanks,
> >
> > Bart.
> >
> 
> Bart,
> 
> I have also seen the same hangs in siw. Not as frequently but the same
> symptoms.

I had not heard about that one from the siw side, but I will try to make some
time to reproduce it and fix siw if needed. I'll let you know if I find
something, Bob.

Bernard.

> About every month or so I take another run at trying to find and fix this
> bug but
> I have not succeeded yet. I haven't seen anything that looks like bad
> behavior from
> the rxe side but that doesn't prove anything. I also saw these hangs on my
> system
> before the WQ patch went in if my memory serves. Out main application for
> this
> driver at HPE is Lustre which is a little different than SRP but uses the
> same
> general approach with fast MRs. Currently we are finding the driver to be
> quite stable
> even under very heavy stress.
> 
> I would be happy to collaborate with someone (you?) who knows the SRP side
> well to resolve
> this hang. I think that is the quickest way to fix this. I have no idea
> what SRP is waiting for.
> 
> Best regards,
> 
> Bob


* RE: Re: [bug report] blktests srp/002 hang
  2023-08-23 16:19       ` Bob Pearson
  2023-08-23 19:46         ` Bart Van Assche
  2023-08-24  8:55         ` Bernard Metzler
@ 2023-08-24 15:35         ` Bernard Metzler
  2023-08-24 16:05           ` Bart Van Assche
  2 siblings, 1 reply; 87+ messages in thread
From: Bernard Metzler @ 2023-08-24 15:35 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi



> -----Original Message-----
> From: Bob Pearson <rpearsonhpe@gmail.com>
> Sent: Wednesday, 23 August 2023 18:19
> To: Bart Van Assche <bvanassche@acm.org>; Shinichiro Kawasaki
> <shinichiro.kawasaki@wdc.com>
> Cc: linux-rdma@vger.kernel.org; linux-scsi@vger.kernel.org
> Subject: [EXTERNAL] Re: [bug report] blktests srp/002 hang
> 
> On 8/22/23 10:20, Bart Van Assche wrote:
> > On 8/22/23 03:18, Shinichiro Kawasaki wrote:
> >> CC+: Bart,
> >>
> >> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
> >> [...]
> >>> Shinichiro,
> >>
> >> Hello Bob, thanks for the response.
> >>
> >>>
> >>> I have been aware for a long time that there is a problem with
> blktests/srp. I see hangs in
> >>> 002 and 011 fairly often.
> >>
> >> I repeated the test case srp/011, and observed it hangs. This hang at
> srp/011
> >> also can be recreated in stable manner. I reverted the commit
> 9b4b7c1f9f54
> >> then observed the srp/011 hang disappeared. So, I guess these two hangs
> have
> >> same root cause.
> >>
> >>> I have not been able to figure out the root cause but suspect that
> >>> there is a timing issue in the srp drivers which cannot handle the
> slowness of the software
> >>> RoCE implemtation. If you can give me any clues about what you are
> seeing I am happy to help
> >>> try to figure this out.
> >>
> >> Thanks for sharing your thoughts. I myself do not have srp driver
> knowledge, and
> >> not sure what clue I should provide. If you have any idea of the action
> I can
> >> take, please let me know.
> >
> > Hi Shinichiro and Bob,
> >
> > When I initially developed the SRP tests these were working reliably in
> > combination with the rdma_rxe driver. Since 2017 I frequently see issues
> when
> > running the SRP tests on top of the rdma_rxe driver, issues that I do not
> see
> > if I run the SRP tests on top of the soft-iWARP driver (siw). How about
> > changing the default for the SRP tests from rdma_rxe to siw and to let
> the
> > RDMA community resolve the rdma_rxe issues?
> >
> > Thanks,
> >
> > Bart.
> >
> 
> Bart,
> 
> I have also seen the same hangs in siw. Not as frequently but the same
> symptoms.
> About every month or so I take another run at trying to find and fix this
> bug but
> I have not succeeded yet. I haven't seen anything that looks like bad
> behavior from
> the rxe side but that doesn't prove anything. I also saw these hangs on my
> system
> before the WQ patch went in if my memory serves. Our main application for
> this
> driver at HPE is Lustre which is a little different than SRP but uses the
> same
> general approach with fast MRs. Currently we are finding the driver to be
> quite stable
> even under very heavy stress.
> 
> I would be happy to collaborate with someone (you?) who knows the SRP side
> well to resolve
> this hang. I think that is the quickest way to fix this. I have no idea
> what SRP is waiting for.
> 
> Best regards,
> 
> Bob

Hi Bart,
I spent some time testing the srp/002 blktest with siw, but have not
been able to make it hang so far.
Looking closer at the logs: while RDMA CM connection setup works most
of the time, I also see some connection rejects generated by the
passive ULP side during setup:

[16848.757937] scsi host11: ib_srp: REJ received
[16848.757939] scsi host11:   REJ reason 0xffffff98 

This does not affect the overall success of the current test run;
other connect attempts succeed. Is this connection rejection intended
behavior of the test?

Thanks!
Bernard.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-24 15:35         ` Bernard Metzler
@ 2023-08-24 16:05           ` Bart Van Assche
  2023-08-24 16:27             ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-08-24 16:05 UTC (permalink / raw)
  To: Bernard Metzler, Bob Pearson, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/24/23 08:35, Bernard Metzler wrote:
> I spent some time testing the srp/002 blktest with siw, still
> trying to get it hanging.
> Looking closer into the logs: While most of the time RDMA CM
> connection setup works, I also see some connection rejects being
> created by the passive ULP side during setup:
> 
> [16848.757937] scsi host11: ib_srp: REJ received
> [16848.757939] scsi host11:   REJ reason 0xffffff98
> 
> This does not affect the overall success of the current test
> run, other connect attempts succeed etc. Is that connection
> rejection intended behavior of the test?

Hi Bernard,

In the logs I see that the SRP initiator (ib_srp) may try to log in before
the SRP target driver (ib_srpt) has finished associating with the configured
RDMA ports. I think this is why REJ messages appear in the logs. The retry
loop in the test script should be sufficient to deal with this.
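
Schematically, such a retry loop has the following shape (a hypothetical
sketch, not the actual blktests code; the `retry` helper, its budgets, and
`login_to_srp_target` are illustrative names only):

```shell
# Hypothetical sketch: retry a command until it succeeds or the attempt
# budget is exhausted, so early REJs during target setup are tolerated.
retry() {
	attempts=$1 delay=$2; shift 2
	i=1
	while [ "$i" -le "$attempts" ]; do
		if "$@"; then
			return 0
		fi
		sleep "$delay"
		i=$((i + 1))
	done
	return 1
}

# e.g. retry 10 1 login_to_srp_target   # login_to_srp_target is hypothetical
```

With a loop like this, a handful of REJ-induced login failures are harmless
as long as one attempt within the budget eventually succeeds.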

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-23 19:46         ` Bart Van Assche
@ 2023-08-24 16:24           ` Bob Pearson
  0 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-08-24 16:24 UTC (permalink / raw)
  To: Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/23/23 14:46, Bart Van Assche wrote:
> On 8/23/23 09:19, Bob Pearson wrote:
>> I have also seen the same hangs in siw. Not as frequently but the same symptoms.
>> About every month or so I take another run at trying to find and fix this bug but
>> I have not succeeded yet. I haven't seen anything that looks like bad behavior from
>> the rxe side but that doesn't prove anything. I also saw these hangs on my system
>> before the WQ patch went in if my memory serves. Our main application for this
>> driver at HPE is Lustre which is a little different than SRP but uses the same
>> general approach with fast MRs. Currently we are finding the driver to be quite stable
>> even under very heavy stress.
>>
>> I would be happy to collaborate with someone (you?) who knows the SRP side well to resolve
>> this hang. I think that is the quickest way to fix this. I have no idea what SRP is waiting for.
> 
> Hi Bob,
> 
> I cannot reproduce these issues. All SRP tests work reliably on my test setup on
> top of the v6.5-rc7 kernel, whether I use the siw driver or whether I use the
> rdma_rxe driver. Additionally, I do not see any SRP abort messages.

Thank you for this. This is good news.
> 
> # uname -a
> Linux opensuse-vm 6.5.0-rc7 #28 SMP PREEMPT_DYNAMIC Wed Aug 23 10:42:35 PDT 2023 x86_64 x86_64 x86_64 GNU/Linux
> # journalctl --since=today | grep 'SRP abort' | wc
>       0       0       0
> 
> I installed openSUSE Tumbleweed in the VM in which I run kernel tests, so the
> following does not affect me: if you are using a Linux distro that is based on
> Debian, it may include a buggy version of multipathd. The last time I ran the
> SRP tests in a Debian VM, I had to build multipathd from source - the SRP tests
> did not work with the Debian version of multipathd. The shell script that I use
> to build and install multipathd is as follows (it must be run in the
> multipath-tools source directory):

I run on Ubuntu, which is Debian-based, so perhaps that is the root of the
problems I have been seeing.

I'll try to follow your lead here.

Bob
> 
> #!/bin/bash
> 
> scriptdir="$(dirname "$0")"
> 
> if type -p zypper >/dev/null 2>&1; then
>     rpms=(device-mapper-devel libaio-devel libjson-c-devel librados-devel
>       liburcu-devel readline-devel systemd-devel)
>     for p in "${rpms[@]}"; do
>     sudo zypper install -y "$p"
>     done
> elif type -p apt-get >/dev/null 2>&1; then
>     export LIB=/lib
>     sudo apt-get install -y libaio-dev libdevmapper-dev libjson-c-dev librados-dev \
>         libreadline-dev libsystemd-dev liburcu-dev
> fi
> 
> git clean -f
> make -s "$@"
> sudo make -s "$@" install
> 
> Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-24 16:05           ` Bart Van Assche
@ 2023-08-24 16:27             ` Bob Pearson
  0 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-08-24 16:27 UTC (permalink / raw)
  To: Bart Van Assche, Bernard Metzler, Shinichiro Kawasaki
  Cc: linux-rdma, linux-scsi

On 8/24/23 11:05, Bart Van Assche wrote:
> On 8/24/23 08:35, Bernard Metzler wrote:
>> I spent some time testing the srp/002 blktest with siw, still
>> trying to get it hanging.
>> Looking closer into the logs: While most of the time RDMA CM
>> connection setup works, I also see some connection rejects being
>> created by the passive ULP side during setup:
>>
>> [16848.757937] scsi host11: ib_srp: REJ received
>> [16848.757939] scsi host11:   REJ reason 0xffffff98
>>
>> This does not affect the overall success of the current test
>> run, other connect attempts succeed etc. Is that connection
>> rejection intended behavior of the test?
> 
> Hi Bernard,
> 
> In the logs I see that the SRP initiator (ib_srp) may try to log in before
> the SRP target driver (ib_srpt) has finished associating with the configured
> RDMA ports. I think this is why REJ messages appear in the logs. The retry
> loop in the test script should be sufficient to deal with this.
> 
> Bart.

Thanks to both of you for taking the time to look at this.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-22 15:20     ` Bart Van Assche
  2023-08-23 16:19       ` Bob Pearson
@ 2023-08-25  1:11       ` Shinichiro Kawasaki
  2023-08-25  1:36         ` Bob Pearson
  2023-08-25 13:52         ` Bart Van Assche
  1 sibling, 2 replies; 87+ messages in thread
From: Shinichiro Kawasaki @ 2023-08-25  1:11 UTC (permalink / raw)
  To: Bart Van Assche; +Cc: Bob Pearson, linux-rdma, linux-scsi

On Aug 22, 2023 / 08:20, Bart Van Assche wrote:
> On 8/22/23 03:18, Shinichiro Kawasaki wrote:
> > CC+: Bart,
> > 
> > On Aug 21, 2023 / 20:46, Bob Pearson wrote:
> > [...]
> > > Shinichiro,
> > 
> > Hello Bob, thanks for the response.
> > 
> > > 
> > > I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
> > > 002 and 011 fairly often.
> > 
> > I repeated the test case srp/011, and observed it hangs. This hang at srp/011
> > also can be recreated in stable manner. I reverted the commit 9b4b7c1f9f54
> > then observed the srp/011 hang disappeared. So, I guess these two hangs have
> > same root cause.
> > 
> > > I have not been able to figure out the root cause but suspect that
> > > there is a timing issue in the srp drivers which cannot handle the slowness of the software
> > > RoCE implementation. If you can give me any clues about what you are seeing I am happy to help
> > > try to figure this out.
> > 
> > Thanks for sharing your thoughts. I myself do not have srp driver knowledge, and
> > not sure what clue I should provide. If you have any idea of the action I can
> > take, please let me know.
> 
> Hi Shinichiro and Bob,
> 
> When I initially developed the SRP tests these were working reliably in
> combination with the rdma_rxe driver. Since 2017 I frequently see issues when
> running the SRP tests on top of the rdma_rxe driver, issues that I do not see
> if I run the SRP tests on top of the soft-iWARP driver (siw). How about
> changing the default for the SRP tests from rdma_rxe to siw and to let the
> RDMA community resolve the rdma_rxe issues?

If it takes time to resolve the issues, making the siw driver the default sounds
like a good idea, since it will make the hangs less painful for blktests users.
Another way to reduce the pain would be to improve srp/002 and srp/011 to detect
the hangs and report them as failures.

Having said that, some discussion toward a resolution has started on this thread
(thanks!). I would wait for a while to see how long a solution takes, and whether
those actions on the blktests side are worthwhile.
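
One hypothetical way to report a hang as a failure would be to run the test
under a deadline with coreutils timeout; the `run_with_deadline` wrapper and
the budget below are illustrative, not blktests API:

```shell
# Hypothetical sketch: run a potentially hanging command under a deadline so
# a hang surfaces as a failure instead of blocking the whole test run.
run_with_deadline() {
	deadline=$1; shift
	if timeout "$deadline" "$@"; then
		echo "PASS"
	else
		# non-zero exit covers both a real failure and timeout status 124
		echo "FAIL"
		return 1
	fi
}

# e.g. run_with_deadline 1800 ./check srp/002   # budget is illustrative
```

Note that killing the test process does not necessarily unwedge stuck kernel
workers, so a wrapper like this only improves reporting, not recovery.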

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-25  1:11       ` Shinichiro Kawasaki
@ 2023-08-25  1:36         ` Bob Pearson
  2023-08-25 10:16           ` Shinichiro Kawasaki
  2023-08-25 13:49           ` Bart Van Assche
  2023-08-25 13:52         ` Bart Van Assche
  1 sibling, 2 replies; 87+ messages in thread
From: Bob Pearson @ 2023-08-25  1:36 UTC (permalink / raw)
  To: Shinichiro Kawasaki, Bart Van Assche; +Cc: linux-rdma, linux-scsi

On 8/24/23 20:11, Shinichiro Kawasaki wrote:
> On Aug 22, 2023 / 08:20, Bart Van Assche wrote:
>> On 8/22/23 03:18, Shinichiro Kawasaki wrote:
>>> CC+: Bart,
>>>
>>> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
>>> [...]
>>>> Shinichiro,
>>>
>>> Hello Bob, thanks for the response.
>>>
>>>>
>>>> I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
>>>> 002 and 011 fairly often.
>>>
>>> I repeated the test case srp/011, and observed it hangs. This hang at srp/011
>>> also can be recreated in stable manner. I reverted the commit 9b4b7c1f9f54
>>> then observed the srp/011 hang disappeared. So, I guess these two hangs have
>>> same root cause.
>>>
>>>> I have not been able to figure out the root cause but suspect that
>>>> there is a timing issue in the srp drivers which cannot handle the slowness of the software
>>>> RoCE implementation. If you can give me any clues about what you are seeing I am happy to help
>>>> try to figure this out.
>>>
>>> Thanks for sharing your thoughts. I myself do not have srp driver knowledge, and
>>> not sure what clue I should provide. If you have any idea of the action I can
>>> take, please let me know.
>>
>> Hi Shinichiro and Bob,
>>
>> When I initially developed the SRP tests these were working reliably in
>> combination with the rdma_rxe driver. Since 2017 I frequently see issues when
>> running the SRP tests on top of the rdma_rxe driver, issues that I do not see
>> if I run the SRP tests on top of the soft-iWARP driver (siw). How about
>> changing the default for the SRP tests from rdma_rxe to siw and to let the
>> RDMA community resolve the rdma_rxe issues?
> 
> If it takes time to resolve the issues, it sounds a good idea to make siw driver
> default, since it will make the hangs less painful for blktests users. Another
> idea to reduce the pain is to improve srp/002 and srp/011 to detect the hangs
> and report them as failures.
> 
> Having said that, some discussion started on this thread for resolution
> (thanks!) I would wait for a while and see how long it will take for solution,
> and if the actions on blktests side are valuable or not.

Did you see Bart's comment about srp not working with older versions of multipathd?
He is currently not seeing any hangs at all.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-25  1:36         ` Bob Pearson
@ 2023-08-25 10:16           ` Shinichiro Kawasaki
  2023-08-25 13:49           ` Bart Van Assche
  1 sibling, 0 replies; 87+ messages in thread
From: Shinichiro Kawasaki @ 2023-08-25 10:16 UTC (permalink / raw)
  To: Bob Pearson; +Cc: Bart Van Assche, linux-rdma, linux-scsi

On Aug 24, 2023 / 20:36, Bob Pearson wrote:
> On 8/24/23 20:11, Shinichiro Kawasaki wrote:
> > On Aug 22, 2023 / 08:20, Bart Van Assche wrote:
> >> On 8/22/23 03:18, Shinichiro Kawasaki wrote:
> >>> CC+: Bart,
> >>>
> >>> On Aug 21, 2023 / 20:46, Bob Pearson wrote:
> >>> [...]
> >>>> Shinichiro,
> >>>
> >>> Hello Bob, thanks for the response.
> >>>
> >>>>
> >>>> I have been aware for a long time that there is a problem with blktests/srp. I see hangs in
> >>>> 002 and 011 fairly often.
> >>>
> >>> I repeated the test case srp/011, and observed it hangs. This hang at srp/011
> >>> also can be recreated in stable manner. I reverted the commit 9b4b7c1f9f54
> >>> then observed the srp/011 hang disappeared. So, I guess these two hangs have
> >>> same root cause.
> >>>
> >>>> I have not been able to figure out the root cause but suspect that
> >>>> there is a timing issue in the srp drivers which cannot handle the slowness of the software
> >>>> RoCE implementation. If you can give me any clues about what you are seeing I am happy to help
> >>>> try to figure this out.
> >>>
> >>> Thanks for sharing your thoughts. I myself do not have srp driver knowledge, and
> >>> not sure what clue I should provide. If you have any idea of the action I can
> >>> take, please let me know.
> >>
> >> Hi Shinichiro and Bob,
> >>
> >> When I initially developed the SRP tests these were working reliably in
> >> combination with the rdma_rxe driver. Since 2017 I frequently see issues when
> >> running the SRP tests on top of the rdma_rxe driver, issues that I do not see
> >> if I run the SRP tests on top of the soft-iWARP driver (siw). How about
> >> changing the default for the SRP tests from rdma_rxe to siw and to let the
> >> RDMA community resolve the rdma_rxe issues?
> > 
> > If it takes time to resolve the issues, it sounds a good idea to make siw driver
> > default, since it will make the hangs less painful for blktests users. Another
> > idea to reduce the pain is to improve srp/002 and srp/011 to detect the hangs
> > and report them as failures.
> > 
> > Having said that, some discussion started on this thread for resolution
> > (thanks!) I would wait for a while and see how long it will take for solution,
> > and if the actions on blktests side are valuable or not.
> 
> Did you see Bart's comment about srp not working with older versions of multipathd?
> He is currently not seeing any hangs at all.

Yes, I saw it. My test system is Fedora 38 with the device-mapper-multipath
package, version 0.9.4. I compiled and installed the latest multipath-tools but
still see the hangs. I am not sure why the hang is observed on my test system
but not on Bart's.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-25  1:36         ` Bob Pearson
  2023-08-25 10:16           ` Shinichiro Kawasaki
@ 2023-08-25 13:49           ` Bart Van Assche
  1 sibling, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-08-25 13:49 UTC (permalink / raw)
  To: Bob Pearson, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/24/23 18:36, Bob Pearson wrote:
> Did you see Bart's comment about srp not working with older versions of multipathd?
> He is currently not seeing any hangs at all.

Hi Bob,

It seems like my comment was not clear enough. The SRP tests are compatible
with all upstream versions of multipathd, including those from ten years ago.
While testing on Debian a year ago, I noticed that the only way to make
the SRP tests pass was to replace the Debian version of multipathd with an
upstream version. I am not sure of this, but my guess is that I encountered
a Debian version of multipathd with a bug introduced by the Debian maintainers.

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-25  1:11       ` Shinichiro Kawasaki
  2023-08-25  1:36         ` Bob Pearson
@ 2023-08-25 13:52         ` Bart Van Assche
  2023-09-13 17:36           ` Bob Pearson
  1 sibling, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-08-25 13:52 UTC (permalink / raw)
  To: Shinichiro Kawasaki; +Cc: Bob Pearson, linux-rdma, linux-scsi

On 8/24/23 18:11, Shinichiro Kawasaki wrote:
> If it takes time to resolve the issues, it sounds a good idea to make siw driver
> default, since it will make the hangs less painful for blktests users. Another
> idea to reduce the pain is to improve srp/002 and srp/011 to detect the hangs
> and report them as failures.

At this moment we don't know whether the hangs can be converted into failures.
Answering this question is only possible after we have found the root cause of
the hang. If the hang is caused by commands getting stuck in multipathd, then
it can be solved by changing the path configuration (see also the dmsetup
message commands in blktests). If the hang is caused by a kernel bug, then it
is quite possible that there is no way to recover other than by rebooting the
system on which the tests are run.

Thanks,

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-25 13:52         ` Bart Van Assche
@ 2023-09-13 17:36           ` Bob Pearson
  2023-09-13 23:38             ` Zhu Yanjun
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-09-13 17:36 UTC (permalink / raw)
  To: Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 8/25/23 08:52, Bart Van Assche wrote:
> On 8/24/23 18:11, Shinichiro Kawasaki wrote:
>> If it takes time to resolve the issues, it sounds a good idea to make siw driver
>> default, since it will make the hangs less painful for blktests users. Another
>> idea to reduce the pain is to improve srp/002 and srp/011 to detect the hangs
>> and report them as failures.
> 
> At this moment we don't know whether the hangs can be converted into failures.
> Answering this question is only possible after we have found the root cause of
> the hang. If the hang would be caused by commands getting stuck in multipathd
> then it can be solved by changing the path configuration (see also the dmsetup
> message commands in blktests). If the hang is caused by a kernel bug then it's
> very well possible that there is no way to recover other than by rebooting the
> system on which the tests are run.
> 
> Thanks,
> 
> Bart.

Since 6.6.0-rc1 came out, I decided to give blktests srp another try with the current
rdma for-next branch on my Ubuntu (Debian-based) system. For the first time in a very
long time, all the srp test cases ran correctly multiple times; I ran each one three times.

I had tried to build multipath-tools from source but ran into problems, so I reinstalled
the current Ubuntu packages. I have no idea what root cause finally went away, but I
don't think it was in rxe, as there aren't any recent patches related to the blktests
failures. I did notice that the dmesg traces now pick up a couple of lines past the
place where the tests used to hang, something about setting an ALUA timeout to 60 seconds.

Thanks to all who worked on this.

Bob Pearson

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-13 17:36           ` Bob Pearson
@ 2023-09-13 23:38             ` Zhu Yanjun
  2023-09-16  5:59               ` Zhu Yanjun
  0 siblings, 1 reply; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-13 23:38 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 2023/9/14 1:36, Bob Pearson wrote:
> On 8/25/23 08:52, Bart Van Assche wrote:
>> On 8/24/23 18:11, Shinichiro Kawasaki wrote:
>>> If it takes time to resolve the issues, it sounds a good idea to make siw driver
>>> default, since it will make the hangs less painful for blktests users. Another
>>> idea to reduce the pain is to improve srp/002 and srp/011 to detect the hangs
>>> and report them as failures.
>>
>> At this moment we don't know whether the hangs can be converted into failures.
>> Answering this question is only possible after we have found the root cause of
>> the hang. If the hang would be caused by commands getting stuck in multipathd
>> then it can be solved by changing the path configuration (see also the dmsetup
>> message commands in blktests). If the hang is caused by a kernel bug then it's
>> very well possible that there is no way to recover other than by rebooting the
>> system on which the tests are run.
>>
>> Thanks,
>>
>> Bart.
> 
> Since 6.6.0-rc1 came out I decided to give blktests srp another try with the current
> rdma for-next branch on my Ubuntu (debian) system. For the first time in a very long
> time all the srp test cases run correctly multiple times. I ran each one 3X.
> 
> I had tried to build multipath-tools from source but ran into problems so I reinstalled
> the current Ubuntu packages. I have no idea what was the root cause that finally went
> away but I don't think it was in rxe as there aren't any recent patches related to the
> blktests failures. I did notice that the dmesg traces picked up a couple of lines after
> the place where it used to hang. Something about setting an ALUA timeout to 60 seconds.
> 
> Thanks to all who worked on this.

Hi, Bob

About this problem: IIRC, it occurred easily on Debian and Fedora 38 with
the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks")
applied.

On Debian, with the latest multipathd, the problem seems to disappear.

On Fedora 38, even with the latest multipathd, the problem can still be
observed.

On Ubuntu, it is difficult to reproduce.

Perhaps this is why you cannot reproduce this problem on Ubuntu.

It seems that this problem is related to the Linux distribution and the
version of multipathd.

If I am missing something, please feel free to let me know.

Zhu Yanjun

> 
> Bob Pearson


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-13 23:38             ` Zhu Yanjun
@ 2023-09-16  5:59               ` Zhu Yanjun
  2023-09-19  4:14                 ` Shinichiro Kawasaki
  0 siblings, 1 reply; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-16  5:59 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

[-- Attachment #1: Type: text/plain, Size: 3005 bytes --]



On 2023/9/14 7:38, Zhu Yanjun wrote:
> On 2023/9/14 1:36, Bob Pearson wrote:
>> On 8/25/23 08:52, Bart Van Assche wrote:
>>> On 8/24/23 18:11, Shinichiro Kawasaki wrote:
>>>> If it takes time to resolve the issues, it sounds a good idea to 
>>>> make siw driver
>>>> default, since it will make the hangs less painful for blktests 
>>>> users. Another
>>>> idea to reduce the pain is to improve srp/002 and srp/011 to detect 
>>>> the hangs
>>>> and report them as failures.
>>>
>>> At this moment we don't know whether the hangs can be converted into 
>>> failures.
>>> Answering this question is only possible after we have found the root 
>>> cause of
>>> the hang. If the hang would be caused by commands getting stuck in 
>>> multipathd
>>> then it can be solved by changing the path configuration (see also 
>>> the dmsetup
>>> message commands in blktests). If the hang is caused by a kernel bug 
>>> then it's
>>> very well possible that there is no way to recover other than by 
>>> rebooting the
>>> system on which the tests are run.
>>>
>>> Thanks,
>>>
>>> Bart.
>>
>> Since 6.6.0-rc1 came out I decided to give blktests srp another try 
>> with the current
>> rdma for-next branch on my Ubuntu (debian) system. For the first time 
>> in a very long
>> time all the srp test cases run correctly multiple times. I ran each 
>> one 3X.
>>
>> I had tried to build multipath-tools from source but ran into problems 
>> so I reinstalled
>> the current Ubuntu packages. I have no idea what was the root cause 
>> that finally went
>> away but I don't think it was in rxe as there aren't any recent 
>> patches related to the
>> blktests failures. I did notice that the dmesg traces picked up a 
>> couple of lines after
>> the place where it used to hang. Something about setting an ALUA 
>> timeout to 60 seconds.
>>
>> Thanks to all who worked on this.
> 
> Hi, Bob
> 
> About this problem, IIRC, this problem easily occurred on Debian and 
> Fedora 38 and with the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue 
> support for rxe tasks").
> 
> And on Debian, with the latest multipathd, this problem seems to disappear.
> 
> On Fedora 38, even with the latest multipathd, this problem still can be 
> observed.

On Debian, either with the latest multipathd or with the commit 9b4b7c1f9f54
("RDMA/rxe: Add workqueue support for rxe tasks") reverted, this problem
disappears.

On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
for rxe tasks") is reverted, does the problem still appear? I do not have
such a test environment. The revert is in the attachment; could anyone run
a test? Please let us know the result. Thanks.
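
For anyone running that test, a minimal, hypothetical helper for repeating the
test case until it fails (the `repeat_until_fail` name and the iteration budget
are illustrative; blktests itself does not provide this):

```shell
# Hypothetical helper: repeat a command until it fails or a budget is
# exhausted, mirroring the 15-30 srp/002 repetitions needed to hit the hang.
repeat_until_fail() {
	max=$1; shift
	i=1
	while [ "$i" -le "$max" ]; do
		if ! "$@"; then
			echo "failed on iteration $i"
			return 1
		fi
		i=$((i + 1))
	done
	echo "all $max iterations passed"
}
```

With the revert applied and the kernel rebuilt, something like
`repeat_until_fail 30 ./check srp/002` from the blktests directory should
exercise the test enough times to tell whether the hang still reproduces.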

Zhu Yanjun

> 
> On Ubuntu, it is difficult to reproduce this problem.
> 
> Perhaps this is why you can not reproduce this problem on Ubuntu.
> 
> It seems that this problem is related with linux distribution and the 
> version of multipathd.
> 
> If I am missing something, please feel free to let me know.
> 
> Zhu Yanjun
> 
>>
>> Bob Pearson
> 

[-- Attachment #2: 0001-Revert-RDMA-rxe-Add-workqueue-support-for-rxe-tasks.patch --]
[-- Type: text/plain, Size: 9149 bytes --]

From fd2360edbc9171298d2e91fd9b74b4c3022db9d4 Mon Sep 17 00:00:00 2001
From: Zhu Yanjun <yanjun.zhu@linux.dev>
Date: Fri, 15 Sep 2023 23:07:17 -0400
Subject: [PATCH 1/1] Revert "RDMA/rxe: Add workqueue support for rxe tasks"

This reverts commit 9b4b7c1f9f54120940e243251e2b1407767b3381.

Signed-off-by: Zhu Yanjun <yanjun.zhu@linux.dev>
---
 drivers/infiniband/sw/rxe/rxe.c      |   9 +--
 drivers/infiniband/sw/rxe/rxe_task.c | 110 ++++++++++++---------------
 drivers/infiniband/sw/rxe/rxe_task.h |   6 +-
 3 files changed, 49 insertions(+), 76 deletions(-)

diff --git a/drivers/infiniband/sw/rxe/rxe.c b/drivers/infiniband/sw/rxe/rxe.c
index 54c723a6edda..7a7e713de52d 100644
--- a/drivers/infiniband/sw/rxe/rxe.c
+++ b/drivers/infiniband/sw/rxe/rxe.c
@@ -212,15 +212,9 @@ static int __init rxe_module_init(void)
 {
 	int err;
 
-	err = rxe_alloc_wq();
-	if (err)
-		return err;
-
 	err = rxe_net_init();
-	if (err) {
-		rxe_destroy_wq();
+	if (err)
 		return err;
-	}
 
 	rdma_link_register(&rxe_link_ops);
 	pr_info("loaded\n");
@@ -232,7 +226,6 @@ static void __exit rxe_module_exit(void)
 	rdma_link_unregister(&rxe_link_ops);
 	ib_unregister_driver(RDMA_DRIVER_RXE);
 	rxe_net_exit();
-	rxe_destroy_wq();
 
 	pr_info("unloaded\n");
 }
diff --git a/drivers/infiniband/sw/rxe/rxe_task.c b/drivers/infiniband/sw/rxe/rxe_task.c
index 1501120d4f52..fb9a6bc8e620 100644
--- a/drivers/infiniband/sw/rxe/rxe_task.c
+++ b/drivers/infiniband/sw/rxe/rxe_task.c
@@ -6,24 +6,8 @@
 
 #include "rxe.h"
 
-static struct workqueue_struct *rxe_wq;
-
-int rxe_alloc_wq(void)
-{
-	rxe_wq = alloc_workqueue("rxe_wq", WQ_UNBOUND, WQ_MAX_ACTIVE);
-	if (!rxe_wq)
-		return -ENOMEM;
-
-	return 0;
-}
-
-void rxe_destroy_wq(void)
-{
-	destroy_workqueue(rxe_wq);
-}
-
 /* Check if task is idle i.e. not running, not scheduled in
- * work queue and not draining. If so move to busy to
+ * tasklet queue and not draining. If so move to busy to
  * reserve a slot in do_task() by setting to busy and taking
  * a qp reference to cover the gap from now until the task finishes.
  * state will move out of busy if task returns a non zero value
@@ -37,6 +21,9 @@ static bool __reserve_if_idle(struct rxe_task *task)
 {
 	WARN_ON(rxe_read(task->qp) <= 0);
 
+	if (task->tasklet.state & BIT(TASKLET_STATE_SCHED))
+		return false;
+
 	if (task->state == TASK_STATE_IDLE) {
 		rxe_get(task->qp);
 		task->state = TASK_STATE_BUSY;
@@ -51,7 +38,7 @@ static bool __reserve_if_idle(struct rxe_task *task)
 }
 
 /* check if task is idle or drained and not currently
- * scheduled in the work queue. This routine is
+ * scheduled in the tasklet queue. This routine is
  * called by rxe_cleanup_task or rxe_disable_task to
  * see if the queue is empty.
  * Context: caller should hold task->lock.
@@ -59,7 +46,7 @@ static bool __reserve_if_idle(struct rxe_task *task)
  */
 static bool __is_done(struct rxe_task *task)
 {
-	if (work_pending(&task->work))
+	if (task->tasklet.state & BIT(TASKLET_STATE_SCHED))
 		return false;
 
 	if (task->state == TASK_STATE_IDLE ||
@@ -90,23 +77,23 @@ static bool is_done(struct rxe_task *task)
  * schedules the task. They must call __reserve_if_idle to
  * move the task to busy before calling or scheduling.
  * The task can also be moved to drained or invalid
- * by calls to rxe_cleanup_task or rxe_disable_task.
+ * by calls to rxe-cleanup_task or rxe_disable_task.
  * In that case tasks which get here are not executed but
  * just flushed. The tasks are designed to look to see if
- * there is work to do and then do part of it before returning
+ * there is work to do and do part of it before returning
  * here with a return value of zero until all the work
- * has been consumed then it returns a non-zero value.
+ * has been consumed then it retuens a non-zero value.
  * The number of times the task can be run is limited by
  * max iterations so one task cannot hold the cpu forever.
- * If the limit is hit and work remains the task is rescheduled.
  */
-static void do_task(struct rxe_task *task)
+static void do_task(struct tasklet_struct *t)
 {
+	int cont;
+	int ret;
+	struct rxe_task *task = from_tasklet(task, t, tasklet);
 	unsigned int iterations;
 	unsigned long flags;
 	int resched = 0;
-	int cont;
-	int ret;
 
 	WARN_ON(rxe_read(task->qp) <= 0);
 
@@ -128,22 +115,25 @@ static void do_task(struct rxe_task *task)
 		} while (ret == 0 && iterations-- > 0);
 
 		spin_lock_irqsave(&task->lock, flags);
-		/* we're not done yet but we ran out of iterations.
-		 * yield the cpu and reschedule the task
-		 */
-		if (!ret) {
-			task->state = TASK_STATE_IDLE;
-			resched = 1;
-			goto exit;
-		}
-
 		switch (task->state) {
 		case TASK_STATE_BUSY:
-			task->state = TASK_STATE_IDLE;
+			if (ret) {
+				task->state = TASK_STATE_IDLE;
+			} else {
+				/* This can happen if the client
+				 * can add work faster than the
+				 * tasklet can finish it.
+				 * Reschedule the tasklet and exit
+				 * the loop to give up the cpu
+				 */
+				task->state = TASK_STATE_IDLE;
+				resched = 1;
+			}
 			break;
 
-		/* someone tried to schedule the task while we
-		 * were running, keep going
+		/* someone tried to run the task since the last time we called
+		 * func, so we will call one more time regardless of the
+		 * return value
 		 */
 		case TASK_STATE_ARMED:
 			task->state = TASK_STATE_BUSY;
@@ -151,24 +141,22 @@ static void do_task(struct rxe_task *task)
 			break;
 
 		case TASK_STATE_DRAINING:
-			task->state = TASK_STATE_DRAINED;
+			if (ret)
+				task->state = TASK_STATE_DRAINED;
+			else
+				cont = 1;
 			break;
 
 		default:
 			WARN_ON(1);
-			rxe_dbg_qp(task->qp, "unexpected task state = %d",
-				   task->state);
-			task->state = TASK_STATE_IDLE;
+			rxe_info_qp(task->qp, "unexpected task state = %d", task->state);
 		}
 
-exit:
 		if (!cont) {
 			task->num_done++;
 			if (WARN_ON(task->num_done != task->num_sched))
-				rxe_dbg_qp(
-					task->qp,
-					"%ld tasks scheduled, %ld tasks done",
-					task->num_sched, task->num_done);
+				rxe_err_qp(task->qp, "%ld tasks scheduled, %ld tasks done",
+					   task->num_sched, task->num_done);
 		}
 		spin_unlock_irqrestore(&task->lock, flags);
 	} while (cont);
@@ -181,12 +169,6 @@ static void do_task(struct rxe_task *task)
 	rxe_put(task->qp);
 }
 
-/* wrapper around do_task to fix argument for work queue */
-static void do_work(struct work_struct *work)
-{
-	do_task(container_of(work, struct rxe_task, work));
-}
-
 int rxe_init_task(struct rxe_task *task, struct rxe_qp *qp,
 		  int (*func)(struct rxe_qp *))
 {
@@ -194,9 +176,11 @@ int rxe_init_task(struct rxe_task *task, struct rxe_qp *qp,
 
 	task->qp = qp;
 	task->func = func;
+
+	tasklet_setup(&task->tasklet, do_task);
+
 	task->state = TASK_STATE_IDLE;
 	spin_lock_init(&task->lock);
-	INIT_WORK(&task->work, do_work);
 
 	return 0;
 }
@@ -229,6 +213,8 @@ void rxe_cleanup_task(struct rxe_task *task)
 	while (!is_done(task))
 		cond_resched();
 
+	tasklet_kill(&task->tasklet);
+
 	spin_lock_irqsave(&task->lock, flags);
 	task->state = TASK_STATE_INVALID;
 	spin_unlock_irqrestore(&task->lock, flags);
@@ -240,7 +226,7 @@ void rxe_cleanup_task(struct rxe_task *task)
 void rxe_run_task(struct rxe_task *task)
 {
 	unsigned long flags;
-	bool run;
+	int run;
 
 	WARN_ON(rxe_read(task->qp) <= 0);
 
@@ -249,11 +235,11 @@ void rxe_run_task(struct rxe_task *task)
 	spin_unlock_irqrestore(&task->lock, flags);
 
 	if (run)
-		do_task(task);
+		do_task(&task->tasklet);
 }
 
-/* schedule the task to run later as a work queue entry.
- * the queue_work call can be called holding
+/* schedule the task to run later as a tasklet.
+ * the tasklet)schedule call can be called holding
  * the lock.
  */
 void rxe_sched_task(struct rxe_task *task)
@@ -264,7 +250,7 @@ void rxe_sched_task(struct rxe_task *task)
 
 	spin_lock_irqsave(&task->lock, flags);
 	if (__reserve_if_idle(task))
-		queue_work(rxe_wq, &task->work);
+		tasklet_schedule(&task->tasklet);
 	spin_unlock_irqrestore(&task->lock, flags);
 }
 
@@ -291,9 +277,7 @@ void rxe_disable_task(struct rxe_task *task)
 	while (!is_done(task))
 		cond_resched();
 
-	spin_lock_irqsave(&task->lock, flags);
-	task->state = TASK_STATE_DRAINED;
-	spin_unlock_irqrestore(&task->lock, flags);
+	tasklet_disable(&task->tasklet);
 }
 
 void rxe_enable_task(struct rxe_task *task)
@@ -307,7 +291,7 @@ void rxe_enable_task(struct rxe_task *task)
 		spin_unlock_irqrestore(&task->lock, flags);
 		return;
 	}
-
 	task->state = TASK_STATE_IDLE;
+	tasklet_enable(&task->tasklet);
 	spin_unlock_irqrestore(&task->lock, flags);
 }
diff --git a/drivers/infiniband/sw/rxe/rxe_task.h b/drivers/infiniband/sw/rxe/rxe_task.h
index a63e258b3d66..facb7c8e3729 100644
--- a/drivers/infiniband/sw/rxe/rxe_task.h
+++ b/drivers/infiniband/sw/rxe/rxe_task.h
@@ -22,7 +22,7 @@ enum {
  * called again.
  */
 struct rxe_task {
-	struct work_struct	work;
+	struct tasklet_struct	tasklet;
 	int			state;
 	spinlock_t		lock;
 	struct rxe_qp		*qp;
@@ -32,10 +32,6 @@ struct rxe_task {
 	long			num_done;
 };
 
-int rxe_alloc_wq(void);
-
-void rxe_destroy_wq(void);
-
 /*
  * init rxe_task structure
  *	qp  => parameter to pass to func
-- 
2.40.1


^ permalink raw reply related	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-16  5:59               ` Zhu Yanjun
@ 2023-09-19  4:14                 ` Shinichiro Kawasaki
  2023-09-19  8:07                   ` Zhu Yanjun
  0 siblings, 1 reply; 87+ messages in thread
From: Shinichiro Kawasaki @ 2023-09-19  4:14 UTC (permalink / raw)
  To: Zhu Yanjun; +Cc: Bob Pearson, Bart Van Assche, linux-rdma, linux-scsi

On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
[...]
> On Debian, with the latest multipathd or revert the commit 9b4b7c1f9f54
> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will
> disappear.

Zhu, thank you for the actions.

> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
> for rxe tasks") is reverted, will this problem still appear?
> I do not have such test environment. The commit is in the attachment,
> can anyone have a test? Please let us know the test result. Thanks.

I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. With the
v6.6-rc2 kernel, I still see the hang. I repeated the blktests test case srp/002
30 times or so, then the hang was recreated. Then I reverted the commit
9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the blktests
test case 100 times, and did not see the hang.

I confirmed these results under two multipathd conditions: 1) with Fedora latest
device-mapper-multipath package v0.9.4, and 2) the latest multipath-tools v0.9.6
that I built from source code.

So, when the commit gets reverted, the hang disappears as I reported for
v6.5-rcX kernels.
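For reference, the repetition described above can be scripted. This is a hypothetical sketch, not a blktests feature: it assumes it is run from a blktests checkout, and it falls back to a no-op stub when ./check is absent so the loop itself still runs.

```shell
# Repeat blktests srp/002 until it fails/hangs or the run count is reached.
if [ -x ./check ]; then
	check() { ./check "$@"; }	# real blktests runner
else
	check() { :; }			# stub so the sketch runs outside blktests
fi

runs=30
i=1
while [ "$i" -le "$runs" ]; do
	check srp/002 || { echo "srp/002 failed at iteration $i" >&2; break; }
	i=$((i + 1))
done
echo "finished $((i - 1)) runs of srp/002"
```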


* Re: [bug report] blktests srp/002 hang
  2023-09-19  4:14                 ` Shinichiro Kawasaki
@ 2023-09-19  8:07                   ` Zhu Yanjun
  2023-09-19 16:30                     ` Pearson, Robert B
  2023-09-19 18:11                     ` Bob Pearson
  0 siblings, 2 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-19  8:07 UTC (permalink / raw)
  To: Shinichiro Kawasaki; +Cc: Bob Pearson, Bart Van Assche, linux-rdma, linux-scsi

在 2023/9/19 12:14, Shinichiro Kawasaki 写道:
> On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
> [...]
>> On Debian, with the latest multipathd or revert the commit 9b4b7c1f9f54
>> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will
>> disappear.
> 
> Zhu, thank you for the actions.
> 
>> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
>> for rxe tasks") is reverted, will this problem still appear?
>> I do not have such test environment. The commit is in the attachment,
>> can anyone have a test? Please let us know the test result. Thanks.
> 
> I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. With the
> v6.6-rc2 kernel, I still see the hang. I repeated the blktests test case srp/002
> 30 time or so, then the hang was recreated. Then I reverted the commit
> 9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the blktests
> test case 100 times, and did not see the hang.
> 
> I confirmed these results under two multipathd conditions: 1) with Fedora latest
> device-mapper-multipath package v0.9.4, and 2) the latest multipath-tools v0.9.6
> that I built from source code.
> 
> So, when the commit gets reverted, the hang disappears as I reported for
> v6.5-rcX kernels.
Thanks, Shinichiro Kawasaki. Your help is appreciated.

This problem is related to the following:

1). Linux distributions: Ubuntu, Debian and Fedora;

2). multipathd;

3). the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe 
tasks")

On Ubuntu, with or without the commit, this problem does not occur.

On Debian, without this commit, this problem does not occur. With this 
commit, this problem will occur.

On Fedora, without this commit, this problem does not occur. With this 
commit, this problem will occur.

The commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe 
tasks") is from Bob Pearson.

Hi, Bob, do you have any comments about this problem? It seems that this 
commit is not compatible with blktests.

Hi, Jason and Leon, please comment on this problem.

Thanks a lot.

Zhu Yanjun


* RE: [bug report] blktests srp/002 hang
  2023-09-19  8:07                   ` Zhu Yanjun
@ 2023-09-19 16:30                     ` Pearson, Robert B
  2023-09-19 18:11                     ` Bob Pearson
  1 sibling, 0 replies; 87+ messages in thread
From: Pearson, Robert B @ 2023-09-19 16:30 UTC (permalink / raw)
  To: rpearsonhpe; +Cc: Bob Pearson, Bart Van Assche, linux-rdma, linux-scsi

My belief is that the issue is related to timing, not the logical operation of the code.
Work queues are just kernel processes and can be scheduled out (if not holding spinlocks),
while soft IRQs lock up the CPU until they exit. This can cause longer delays in responding
to ULPs. The work queue tasks for each QP are strictly single threaded, which is enforced by
the work queue framework, the same as for tasklets. The other evidence of this

-----Original Message-----
From: Zhu Yanjun <yanjun.zhu@linux.dev> 
Sent: Tuesday, September 19, 2023 3:07 AM
To: Shinichiro Kawasaki <shinichiro.kawasaki@wdc.com>
Cc: Bob Pearson <rpearsonhpe@gmail.com>; Bart Van Assche <bvanassche@acm.org>; linux-rdma@vger.kernel.org; linux-scsi@vger.kernel.org
Subject: Re: [bug report] blktests srp/002 hang

在 2023/9/19 12:14, Shinichiro Kawasaki 写道:
> On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
> [...]
>> On Debian, with the latest multipathd or revert the commit 
>> 9b4b7c1f9f54
>> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will 
>> disappear.
> 
> Zhu, thank you for the actions.
> 
>> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue 
>> support for rxe tasks") is reverted, will this problem still appear?
>> I do not have such test environment. The commit is in the attachment, 
>> can anyone have a test? Please let us know the test result. Thanks.
> 
> I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. 
> With the
> v6.6-rc2 kernel, I still see the hang. I repeated the blktests test 
> case srp/002
> 30 time or so, then the hang was recreated. Then I reverted the commit
> 9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the 
> blktests test case 100 times, and did not see the hang.
> 
> I confirmed these results under two multipathd conditions: 1) with 
> Fedora latest device-mapper-multipath package v0.9.4, and 2) the 
> latest multipath-tools v0.9.6 that I built from source code.
> 
> So, when the commit gets reverted, the hang disappears as I reported 
> for v6.5-rcX kernels.
Thanks, Shinichiro Kawasaki. Your helps are appreciated.

This problem is related with the followings:

1). Linux distributions: Ubuntu, Debian and Fedora;

2). multipathd;

3). the commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe
tasks")

On Ubuntu, with or without the commit, this problem does not occur.

On Debian, without this commit, this problem does not occur. With this commit, this problem will occur.

On Fedora, without this commit, this problem does not occur. With this commit, this problem will occur.

The commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe
tasks") is from Bob Pearson.

Hi, Bob, do you have any comments about this problem? It seems that this commit is not compatible with blktests.

Hi, Jason and Leon, please comment on this problem.

Thanks a lot.

Zhu Yanjun


* Re: [bug report] blktests srp/002 hang
  2023-09-19  8:07                   ` Zhu Yanjun
  2023-09-19 16:30                     ` Pearson, Robert B
@ 2023-09-19 18:11                     ` Bob Pearson
  2023-09-20  4:22                       ` Zhu Yanjun
  1 sibling, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-09-19 18:11 UTC (permalink / raw)
  To: Zhu Yanjun, Shinichiro Kawasaki; +Cc: Bart Van Assche, linux-rdma, linux-scsi

On 9/19/23 03:07, Zhu Yanjun wrote:
> 在 2023/9/19 12:14, Shinichiro Kawasaki 写道:
>> On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
>> [...]
>>> On Debian, with the latest multipathd or revert the commit 9b4b7c1f9f54
>>> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will
>>> disappear.
>>
>> Zhu, thank you for the actions.
>>
>>> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
>>> for rxe tasks") is reverted, will this problem still appear?
>>> I do not have such test environment. The commit is in the attachment,
>>> can anyone have a test? Please let us know the test result. Thanks.
>>
>> I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. With the
>> v6.6-rc2 kernel, I still see the hang. I repeated the blktests test case srp/002
>> 30 time or so, then the hang was recreated. Then I reverted the commit
>> 9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the blktests
>> test case 100 times, and did not see the hang.
>>
>> I confirmed these results under two multipathd conditions: 1) with Fedora latest
>> device-mapper-multipath package v0.9.4, and 2) the latest multipath-tools v0.9.6
>> that I built from source code.
>>
>> So, when the commit gets reverted, the hang disappears as I reported for
>> v6.5-rcX kernels.
> Thanks, Shinichiro Kawasaki. Your helps are appreciated.
> 
> This problem is related with the followings:
> 
> 1). Linux distributions: Ubuntu, Debian and Fedora;
> 
> 2). multipathd;
> 
> 3). the commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks")
> 
> On Ubuntu, with or without the commit, this problem does not occur.
> 
> On Debian, without this commit, this problem does not occur. With this commit, this problem will occur.
> 
> On Fedora, without this commit, this problem does not occur. With this commit, this problem will occur.
> 
> The commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks") is from Bob Pearson.
> 
> Hi, Bob, do you have any comments about this problem? It seems that this commit is not compatible with blktests.
> 
> Hi, Jason and Leon, please comment on this problem.
> 
> Thanks a lot.
> 
> Zhu Yanjun

My belief is that the issue is related to timing, not the logical operation of the code.
Work queues are just kernel processes and can be scheduled out (if not holding spinlocks),
while soft IRQs lock up the CPU until they exit. This can cause longer delays in responding
to ULPs. The work queue tasks for each QP are strictly single threaded, which is enforced by
the work queue framework, the same as for tasklets.

In the past I have also seen the exact same hang behavior with the siw driver, but not
recently. Also I have seen sensitivity to logging changes in the hang behavior. These are
indications that timing may be the cause of the issue.

Bob


* Re: [bug report] blktests srp/002 hang
  2023-09-19 18:11                     ` Bob Pearson
@ 2023-09-20  4:22                       ` Zhu Yanjun
  2023-09-20 16:24                         ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-20  4:22 UTC (permalink / raw)
  To: Bob Pearson, Shinichiro Kawasaki; +Cc: Bart Van Assche, linux-rdma, linux-scsi

[-- Attachment #1: Type: text/plain, Size: 4134 bytes --]


在 2023/9/20 2:11, Bob Pearson 写道:
> On 9/19/23 03:07, Zhu Yanjun wrote:
>> 在 2023/9/19 12:14, Shinichiro Kawasaki 写道:
>>> On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
>>> [...]
>>>> On Debian, with the latest multipathd or revert the commit 9b4b7c1f9f54
>>>> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will
>>>> disappear.
>>> Zhu, thank you for the actions.
>>>
>>>> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
>>>> for rxe tasks") is reverted, will this problem still appear?
>>>> I do not have such test environment. The commit is in the attachment,
>>>> can anyone have a test? Please let us know the test result. Thanks.
>>> I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. With the
>>> v6.6-rc2 kernel, I still see the hang. I repeated the blktests test case srp/002
>>> 30 time or so, then the hang was recreated. Then I reverted the commit
>>> 9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the blktests
>>> test case 100 times, and did not see the hang.
>>>
>>> I confirmed these results under two multipathd conditions: 1) with Fedora latest
>>> device-mapper-multipath package v0.9.4, and 2) the latest multipath-tools v0.9.6
>>> that I built from source code.
>>>
>>> So, when the commit gets reverted, the hang disappears as I reported for
>>> v6.5-rcX kernels.
>> Thanks, Shinichiro Kawasaki. Your helps are appreciated.
>>
>> This problem is related with the followings:
>>
>> 1). Linux distributions: Ubuntu, Debian and Fedora;
>>
>> 2). multipathd;
>>
>> 3). the commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks")
>>
>> On Ubuntu, with or without the commit, this problem does not occur.
>>
>> On Debian, without this commit, this problem does not occur. With this commit, this problem will occur.
>>
>> On Fedora, without this commit, this problem does not occur. With this commit, this problem will occur.
>>
>> The commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks") is from Bob Pearson.
>>
>> Hi, Bob, do you have any comments about this problem? It seems that this commit is not compatible with blktests.
>>
>> Hi, Jason and Leon, please comment on this problem.
>>
>> Thanks a lot.
>>
>> Zhu Yanjun
> My belief is that the issue is related to timing not the logical operation of the code.
> Work queues are just kernel processes and can be scheduled (if not holding spinlocks)
> while soft IRQs lock up the CPU until they exit. This can cause longer delays in responding
> to ULPs. The work queue tasks for each QP are strictly single threaded which is managed by
> the work queue framework the same as tasklets.

Thanks, Bob. As you explained, the workqueue can be scheduled out, which
can cause longer delays in responding to ULPs. This can cause ULPs to
hang. The tasklet, by contrast, locks up the CPU until it exits, so it
responds to ULPs in time.

There are 3 possible solutions:

1). Make the workqueue respond to ULPs in time so that the hang is
avoided. But workqueue items are subject to scheduling by the kernel, so
it is difficult to avoid the longer delay.

2). Support both tasklet and workqueue in RXE, with one of them as the
default. The user can choose between tasklet and workqueue via a kernel
module parameter or sysctl variable. This will cost a lot of time and
effort to implement.

3). Revert the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for
rxe tasks"). Shinichiro Kawasaki confirmed that this fixes the
regression. The patch is in the attachment.


Hi, Bob, Please comment.

Hi, Jason && Leon, please also comment on this.

Thanks a lot.

>
> Earlier in time I have also seen the exact same hang behavior with the siw driver but not
> recently. Also I have seen sensitivity to logging changes in the hang behavior. These are

This is a regression in RXE caused by the commit 9b4b7c1f9f54
("RDMA/rxe: Add workqueue support for rxe tasks").

We should fix it.

Zhu Yanjun

> indications that timing may be the cause of the issue.
>
> Bob

[-- Attachment #2: 0001-Revert-RDMA-rxe-Add-workqueue-support-for-rxe-tasks.patch --]
[-- Type: text/plain, Size: 9149 bytes --]

From fd2360edbc9171298d2e91fd9b74b4c3022db9d4 Mon Sep 17 00:00:00 2001
From: Zhu Yanjun <yanjun.zhu@linux.dev>
Date: Fri, 15 Sep 2023 23:07:17 -0400
Subject: [PATCH 1/1] Revert "RDMA/rxe: Add workqueue support for rxe tasks"

This reverts commit 9b4b7c1f9f54120940e243251e2b1407767b3381.

Signed-off-by: Zhu Yanjun <yanjun.zhu@linux.dev>
---
 drivers/infiniband/sw/rxe/rxe.c      |   9 +--
 drivers/infiniband/sw/rxe/rxe_task.c | 110 ++++++++++++---------------
 drivers/infiniband/sw/rxe/rxe_task.h |   6 +-
 3 files changed, 49 insertions(+), 76 deletions(-)

diff --git a/drivers/infiniband/sw/rxe/rxe.c b/drivers/infiniband/sw/rxe/rxe.c
index 54c723a6edda..7a7e713de52d 100644
--- a/drivers/infiniband/sw/rxe/rxe.c
+++ b/drivers/infiniband/sw/rxe/rxe.c
@@ -212,15 +212,9 @@ static int __init rxe_module_init(void)
 {
 	int err;
 
-	err = rxe_alloc_wq();
-	if (err)
-		return err;
-
 	err = rxe_net_init();
-	if (err) {
-		rxe_destroy_wq();
+	if (err)
 		return err;
-	}
 
 	rdma_link_register(&rxe_link_ops);
 	pr_info("loaded\n");
@@ -232,7 +226,6 @@ static void __exit rxe_module_exit(void)
 	rdma_link_unregister(&rxe_link_ops);
 	ib_unregister_driver(RDMA_DRIVER_RXE);
 	rxe_net_exit();
-	rxe_destroy_wq();
 
 	pr_info("unloaded\n");
 }
diff --git a/drivers/infiniband/sw/rxe/rxe_task.c b/drivers/infiniband/sw/rxe/rxe_task.c
index 1501120d4f52..fb9a6bc8e620 100644
--- a/drivers/infiniband/sw/rxe/rxe_task.c
+++ b/drivers/infiniband/sw/rxe/rxe_task.c
@@ -6,24 +6,8 @@
 
 #include "rxe.h"
 
-static struct workqueue_struct *rxe_wq;
-
-int rxe_alloc_wq(void)
-{
-	rxe_wq = alloc_workqueue("rxe_wq", WQ_UNBOUND, WQ_MAX_ACTIVE);
-	if (!rxe_wq)
-		return -ENOMEM;
-
-	return 0;
-}
-
-void rxe_destroy_wq(void)
-{
-	destroy_workqueue(rxe_wq);
-}
-
 /* Check if task is idle i.e. not running, not scheduled in
- * work queue and not draining. If so move to busy to
+ * tasklet queue and not draining. If so move to busy to
  * reserve a slot in do_task() by setting to busy and taking
  * a qp reference to cover the gap from now until the task finishes.
  * state will move out of busy if task returns a non zero value
@@ -37,6 +21,9 @@ static bool __reserve_if_idle(struct rxe_task *task)
 {
 	WARN_ON(rxe_read(task->qp) <= 0);
 
+	if (task->tasklet.state & BIT(TASKLET_STATE_SCHED))
+		return false;
+
 	if (task->state == TASK_STATE_IDLE) {
 		rxe_get(task->qp);
 		task->state = TASK_STATE_BUSY;
@@ -51,7 +38,7 @@ static bool __reserve_if_idle(struct rxe_task *task)
 }
 
 /* check if task is idle or drained and not currently
- * scheduled in the work queue. This routine is
+ * scheduled in the tasklet queue. This routine is
  * called by rxe_cleanup_task or rxe_disable_task to
  * see if the queue is empty.
  * Context: caller should hold task->lock.
@@ -59,7 +46,7 @@ static bool __reserve_if_idle(struct rxe_task *task)
  */
 static bool __is_done(struct rxe_task *task)
 {
-	if (work_pending(&task->work))
+	if (task->tasklet.state & BIT(TASKLET_STATE_SCHED))
 		return false;
 
 	if (task->state == TASK_STATE_IDLE ||
@@ -90,23 +77,23 @@ static bool is_done(struct rxe_task *task)
  * schedules the task. They must call __reserve_if_idle to
  * move the task to busy before calling or scheduling.
  * The task can also be moved to drained or invalid
- * by calls to rxe_cleanup_task or rxe_disable_task.
+ * by calls to rxe-cleanup_task or rxe_disable_task.
  * In that case tasks which get here are not executed but
  * just flushed. The tasks are designed to look to see if
- * there is work to do and then do part of it before returning
+ * there is work to do and do part of it before returning
  * here with a return value of zero until all the work
- * has been consumed then it returns a non-zero value.
+ * has been consumed then it retuens a non-zero value.
  * The number of times the task can be run is limited by
  * max iterations so one task cannot hold the cpu forever.
- * If the limit is hit and work remains the task is rescheduled.
  */
-static void do_task(struct rxe_task *task)
+static void do_task(struct tasklet_struct *t)
 {
+	int cont;
+	int ret;
+	struct rxe_task *task = from_tasklet(task, t, tasklet);
 	unsigned int iterations;
 	unsigned long flags;
 	int resched = 0;
-	int cont;
-	int ret;
 
 	WARN_ON(rxe_read(task->qp) <= 0);
 
@@ -128,22 +115,25 @@ static void do_task(struct rxe_task *task)
 		} while (ret == 0 && iterations-- > 0);
 
 		spin_lock_irqsave(&task->lock, flags);
-		/* we're not done yet but we ran out of iterations.
-		 * yield the cpu and reschedule the task
-		 */
-		if (!ret) {
-			task->state = TASK_STATE_IDLE;
-			resched = 1;
-			goto exit;
-		}
-
 		switch (task->state) {
 		case TASK_STATE_BUSY:
-			task->state = TASK_STATE_IDLE;
+			if (ret) {
+				task->state = TASK_STATE_IDLE;
+			} else {
+				/* This can happen if the client
+				 * can add work faster than the
+				 * tasklet can finish it.
+				 * Reschedule the tasklet and exit
+				 * the loop to give up the cpu
+				 */
+				task->state = TASK_STATE_IDLE;
+				resched = 1;
+			}
 			break;
 
-		/* someone tried to schedule the task while we
-		 * were running, keep going
+		/* someone tried to run the task since the last time we called
+		 * func, so we will call one more time regardless of the
+		 * return value
 		 */
 		case TASK_STATE_ARMED:
 			task->state = TASK_STATE_BUSY;
@@ -151,24 +141,22 @@ static void do_task(struct rxe_task *task)
 			break;
 
 		case TASK_STATE_DRAINING:
-			task->state = TASK_STATE_DRAINED;
+			if (ret)
+				task->state = TASK_STATE_DRAINED;
+			else
+				cont = 1;
 			break;
 
 		default:
 			WARN_ON(1);
-			rxe_dbg_qp(task->qp, "unexpected task state = %d",
-				   task->state);
-			task->state = TASK_STATE_IDLE;
+			rxe_info_qp(task->qp, "unexpected task state = %d", task->state);
 		}
 
-exit:
 		if (!cont) {
 			task->num_done++;
 			if (WARN_ON(task->num_done != task->num_sched))
-				rxe_dbg_qp(
-					task->qp,
-					"%ld tasks scheduled, %ld tasks done",
-					task->num_sched, task->num_done);
+				rxe_err_qp(task->qp, "%ld tasks scheduled, %ld tasks done",
+					   task->num_sched, task->num_done);
 		}
 		spin_unlock_irqrestore(&task->lock, flags);
 	} while (cont);
@@ -181,12 +169,6 @@ static void do_task(struct rxe_task *task)
 	rxe_put(task->qp);
 }
 
-/* wrapper around do_task to fix argument for work queue */
-static void do_work(struct work_struct *work)
-{
-	do_task(container_of(work, struct rxe_task, work));
-}
-
 int rxe_init_task(struct rxe_task *task, struct rxe_qp *qp,
 		  int (*func)(struct rxe_qp *))
 {
@@ -194,9 +176,11 @@ int rxe_init_task(struct rxe_task *task, struct rxe_qp *qp,
 
 	task->qp = qp;
 	task->func = func;
+
+	tasklet_setup(&task->tasklet, do_task);
+
 	task->state = TASK_STATE_IDLE;
 	spin_lock_init(&task->lock);
-	INIT_WORK(&task->work, do_work);
 
 	return 0;
 }
@@ -229,6 +213,8 @@ void rxe_cleanup_task(struct rxe_task *task)
 	while (!is_done(task))
 		cond_resched();
 
+	tasklet_kill(&task->tasklet);
+
 	spin_lock_irqsave(&task->lock, flags);
 	task->state = TASK_STATE_INVALID;
 	spin_unlock_irqrestore(&task->lock, flags);
@@ -240,7 +226,7 @@ void rxe_cleanup_task(struct rxe_task *task)
 void rxe_run_task(struct rxe_task *task)
 {
 	unsigned long flags;
-	bool run;
+	int run;
 
 	WARN_ON(rxe_read(task->qp) <= 0);
 
@@ -249,11 +235,11 @@ void rxe_run_task(struct rxe_task *task)
 	spin_unlock_irqrestore(&task->lock, flags);
 
 	if (run)
-		do_task(task);
+		do_task(&task->tasklet);
 }
 
-/* schedule the task to run later as a work queue entry.
- * the queue_work call can be called holding
+/* schedule the task to run later as a tasklet.
+ * the tasklet)schedule call can be called holding
  * the lock.
  */
 void rxe_sched_task(struct rxe_task *task)
@@ -264,7 +250,7 @@ void rxe_sched_task(struct rxe_task *task)
 
 	spin_lock_irqsave(&task->lock, flags);
 	if (__reserve_if_idle(task))
-		queue_work(rxe_wq, &task->work);
+		tasklet_schedule(&task->tasklet);
 	spin_unlock_irqrestore(&task->lock, flags);
 }
 
@@ -291,9 +277,7 @@ void rxe_disable_task(struct rxe_task *task)
 	while (!is_done(task))
 		cond_resched();
 
-	spin_lock_irqsave(&task->lock, flags);
-	task->state = TASK_STATE_DRAINED;
-	spin_unlock_irqrestore(&task->lock, flags);
+	tasklet_disable(&task->tasklet);
 }
 
 void rxe_enable_task(struct rxe_task *task)
@@ -307,7 +291,7 @@ void rxe_enable_task(struct rxe_task *task)
 		spin_unlock_irqrestore(&task->lock, flags);
 		return;
 	}
-
 	task->state = TASK_STATE_IDLE;
+	tasklet_enable(&task->tasklet);
 	spin_unlock_irqrestore(&task->lock, flags);
 }
diff --git a/drivers/infiniband/sw/rxe/rxe_task.h b/drivers/infiniband/sw/rxe/rxe_task.h
index a63e258b3d66..facb7c8e3729 100644
--- a/drivers/infiniband/sw/rxe/rxe_task.h
+++ b/drivers/infiniband/sw/rxe/rxe_task.h
@@ -22,7 +22,7 @@ enum {
  * called again.
  */
 struct rxe_task {
-	struct work_struct	work;
+	struct tasklet_struct	tasklet;
 	int			state;
 	spinlock_t		lock;
 	struct rxe_qp		*qp;
@@ -32,10 +32,6 @@ struct rxe_task {
 	long			num_done;
 };
 
-int rxe_alloc_wq(void);
-
-void rxe_destroy_wq(void);
-
 /*
  * init rxe_task structure
  *	qp  => parameter to pass to func
-- 
2.40.1



* Re: [bug report] blktests srp/002 hang
  2023-09-20  4:22                       ` Zhu Yanjun
@ 2023-09-20 16:24                         ` Bob Pearson
  2023-09-20 16:36                           ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-09-20 16:24 UTC (permalink / raw)
  To: Zhu Yanjun, Shinichiro Kawasaki; +Cc: Bart Van Assche, linux-rdma, linux-scsi

On 9/19/23 23:22, Zhu Yanjun wrote:
> 
> 在 2023/9/20 2:11, Bob Pearson 写道:
>> On 9/19/23 03:07, Zhu Yanjun wrote:
>>> 在 2023/9/19 12:14, Shinichiro Kawasaki 写道:
>>>> On Sep 16, 2023 / 13:59, Zhu Yanjun wrote:
>>>> [...]
>>>>> On Debian, with the latest multipathd or revert the commit 9b4b7c1f9f54
>>>>> ("RDMA/rxe: Add workqueue support for rxe tasks"), this problem will
>>>>> disappear.
>>>> Zhu, thank you for the actions.
>>>>
>>>>> On Fedora 38, if the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
>>>>> for rxe tasks") is reverted, will this problem still appear?
>>>>> I do not have such test environment. The commit is in the attachment,
>>>>> can anyone have a test? Please let us know the test result. Thanks.
>>>> I tried the latest kernel tag v6.6-rc2 with my Fedora 38 test systems. With the
>>>> v6.6-rc2 kernel, I still see the hang. I repeated the blktests test case srp/002
>>>> 30 time or so, then the hang was recreated. Then I reverted the commit
>>>> 9b4b7c1f9f54 from v6.6-rc2, and the hang disappeared. I repeated the blktests
>>>> test case 100 times, and did not see the hang.
>>>>
>>>> I confirmed these results under two multipathd conditions: 1) with Fedora latest
>>>> device-mapper-multipath package v0.9.4, and 2) the latest multipath-tools v0.9.6
>>>> that I built from source code.
>>>>
>>>> So, when the commit gets reverted, the hang disappears as I reported for
>>>> v6.5-rcX kernels.
>>> Thanks, Shinichiro Kawasaki. Your helps are appreciated.
>>>
>>> This problem is related with the followings:
>>>
>>> 1). Linux distributions: Ubuntu, Debian and Fedora;
>>>
>>> 2). multipathd;
>>>
>>> 3). the commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks")
>>>
>>> On Ubuntu, with or without the commit, this problem does not occur.
>>>
>>> On Debian, without this commit, this problem does not occur. With this commit, this problem will occur.
>>>
>>> On Fedora, without this commit, this problem does not occur. With this commit, this problem will occur.
>>>
>>> The commits 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks") is from Bob Pearson.
>>>
>>> Hi, Bob, do you have any comments about this problem? It seems that this commit is not compatible with blktests.
>>>
>>> Hi, Jason and Leon, please comment on this problem.
>>>
>>> Thanks a lot.
>>>
>>> Zhu Yanjun
>> My belief is that the issue is related to timing not the logical operation of the code.
>> Work queues are just kernel processes and can be scheduled (if not holding spinlocks)
>> while soft IRQs lock up the CPU until they exit. This can cause longer delays in responding
>> to ULPs. The work queue tasks for each QP are strictly single threaded which is managed by
>> the work queue framework the same as tasklets.
> 
> Thanks, Bob. From what you said, the workqueue can be scheduled out, and this can cause longer
> delays in responding to ULPs.
> 
> These delays can cause ULPs to hang. The tasklet, by contrast, occupies the CPU until it exits,
> so it responds to ULPs in time.
> 
> For this, there are three possible solutions:
> 
> 1). Make the workqueue respond to ULPs in time so that this hang problem is avoided. But the
> kernel is free to schedule the workqueue worker out, so it is difficult to avoid the longer delay.
> 
> 
> 2). Support both the tasklet and the workqueue in RXE, with one of them as the default. The user
> can choose between them via a kernel module parameter or a sysctl variable. This would take a lot
> of time and effort to implement.
> 
> 
> 3). Revert the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks"). Shinichiro Kawasaki
> confirmed that this fixes the regression. The patch is in the attachment.
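Option 2) could be exposed through a module parameter; the following is a hypothetical kernel-style sketch (the parameter, function, and field names are invented for illustration, not proposed code):

```c
/* Hypothetical knob selecting the rxe task execution backend. */
static bool use_wq = true;
module_param(use_wq, bool, 0444);
MODULE_PARM_DESC(use_wq, "run rxe tasks on a workqueue (true) or tasklets (false)");

static void rxe_sched_task(struct rxe_task *task)
{
	if (use_wq)
		queue_work(rxe_wq, &task->work);   /* process context */
	else
		tasklet_schedule(&task->tasklet);  /* softirq context */
}
```

The cost Zhu alludes to is that both backends must then be initialized, cleaned up, and kept in sync across the requester, responder, and completer tasks.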
> 
> 
> Hi, Bob, Please comment.
> 
> Hi, Jason && Leon, please also comment on this.
> 
> Thanks a lot.
> 
>>
>> Earlier I also saw the exact same hang behavior with the siw driver, but not
>> recently. I have also seen the hang behavior be sensitive to logging changes. These are
> 
> This is a regression in RXE caused by the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> 
> We should fix it.
> 
> Zhu Yanjun
> 
>> indications that timing may be the cause of the issue.
>>
>> Bob

The verbs APIs do not make real-time commitments. If a ULP fails because of response times, it is a
problem in the ULP, not in the verbs provider.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 16:24                         ` Bob Pearson
@ 2023-09-20 16:36                           ` Bart Van Assche
  2023-09-20 17:18                             ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-09-20 16:36 UTC (permalink / raw)
  To: Bob Pearson, Zhu Yanjun, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 9/20/23 09:24, Bob Pearson wrote:
> The verbs APIs do not make real time commitments. If a ULP fails 
> because of response times it is the problem in the ULP not in the 
> verbs provider.

I think there is evidence that the root cause is in the RXE driver. I
haven't seen any evidence that there would be any issues in any of the
involved ULP drivers. Am I perhaps missing something?

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 16:36                           ` Bart Van Assche
@ 2023-09-20 17:18                             ` Bob Pearson
  2023-09-20 17:22                               ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-09-20 17:18 UTC (permalink / raw)
  To: Bart Van Assche, Zhu Yanjun, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 9/20/23 11:36, Bart Van Assche wrote:
> On 9/20/23 09:24, Bob Pearson wrote:
>> The verbs APIs do not make real time commitments. If a ULP fails because of response times it is the problem in the ULP not in the verbs provider.
> 
> I think there is evidence that the root cause is in the RXE driver. I
> haven't seen any evidence that there would be any issues in any of the
> involved ULP drivers. Am I perhaps missing something?
> 
> Bart.

I agree it is definitely possible. But I have also seen the same behavior in the siw driver, which is completely
independent. I have tried but have not been able to figure out what the ULPs are waiting for when the hangs
occur. If someone who has a good understanding of the ULPs could catch a hang and figure out what is missing, it
would give a clue as to what is going on.

As mentioned above, at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of
the time and srp/011 about 99% of the time). There haven't been any changes to rxe to explain this.

Bob


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 17:18                             ` Bob Pearson
@ 2023-09-20 17:22                               ` Bart Van Assche
  2023-09-20 17:29                                 ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-09-20 17:22 UTC (permalink / raw)
  To: Bob Pearson, Zhu Yanjun, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 9/20/23 10:18, Bob Pearson wrote:
> But I have also seen the same behavior in the siw driver which is
> completely independent.

Hmm ... I haven't seen any hangs yet with the siw driver.

> As mentioned above at the moment Ubuntu is failing rarely. But it 
> used to fail reliably (srp/002 about 75% of the time and srp/011 
> about 99% of the time.) There haven't been any changes to rxe to 
> explain this.

I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
support for rxe tasks")?

Thanks,

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 17:22                               ` Bart Van Assche
@ 2023-09-20 17:29                                 ` Bob Pearson
  2023-09-21  5:46                                   ` Zhu Yanjun
                                                     ` (2 more replies)
  0 siblings, 3 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-20 17:29 UTC (permalink / raw)
  To: Bart Van Assche, Zhu Yanjun, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi

On 9/20/23 12:22, Bart Van Assche wrote:
> On 9/20/23 10:18, Bob Pearson wrote:
>> But I have also seen the same behavior in the siw driver which is
>> completely independent.
> 
> Hmm ... I haven't seen any hangs yet with the siw driver.

I was on Ubuntu 6-9 months ago. Currently I don't see hangs with either driver.
> 
>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
> 
> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
> support for rxe tasks")?

That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
and wqs. But after updating Ubuntu and the kernel at some point they all went away.

> 
> Thanks,
> 
> Bart.



^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 17:29                                 ` Bob Pearson
@ 2023-09-21  5:46                                   ` Zhu Yanjun
  2023-09-21 10:06                                   ` Zhu Yanjun
  2023-09-21 14:23                                   ` Rain River
  2 siblings, 0 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-21  5:46 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi


On 2023/9/21 1:29, Bob Pearson wrote:
> On 9/20/23 12:22, Bart Van Assche wrote:
>> On 9/20/23 10:18, Bob Pearson wrote:
>>> But I have also seen the same behavior in the siw driver which is
>>> completely independent.
>> Hmm ... I haven't seen any hangs yet with the siw driver.
> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>> support for rxe tasks")?
> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> and wqs. But after updating Ubuntu and the kernel at some point they all went away.

Thanks, Bob. From what you said, this problem does not occur on Ubuntu
now.

So far:

On Debian, without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
support for rxe tasks"), this hang does not occur.

On Fedora, the behavior is similar to Debian.

On Ubuntu, this problem does not occur now, but it is not clear whether
this commit is present in the kernel tested.

Hi, Bob, can you run tests without the above commit to verify whether
the same problem occurs on Ubuntu?

Can anyone who has a test environment verify whether this problem still
occurs on Ubuntu without this commit?

Jason && Leon, please comment on this.

Thanks a lot.

Zhu Yanjun

>
>> Thanks,
>>
>> Bart.
>

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 17:29                                 ` Bob Pearson
  2023-09-21  5:46                                   ` Zhu Yanjun
@ 2023-09-21 10:06                                   ` Zhu Yanjun
  2023-09-21 14:23                                   ` Rain River
  2 siblings, 0 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-21 10:06 UTC (permalink / raw)
  To: Bob Pearson, Bart Van Assche, Shinichiro Kawasaki; +Cc: linux-rdma, linux-scsi


On 2023/9/21 1:29, Bob Pearson wrote:
> On 9/20/23 12:22, Bart Van Assche wrote:
>> On 9/20/23 10:18, Bob Pearson wrote:
>>> But I have also seen the same behavior in the siw driver which is
>>> completely independent.
>> Hmm ... I haven't seen any hangs yet with the siw driver.
> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>> support for rxe tasks")?
> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
Thanks, Bob. From what you said, this problem does not occur on Ubuntu
now.

So far:

On Debian, without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
support for rxe tasks"), this hang does not occur.

On Fedora, the behavior is similar to Debian.

On Ubuntu, this problem does not occur now, but it is not clear whether
this commit is present in the kernel tested.

Hi, Bob, can you run tests without the above commit to verify whether
the same problem occurs on Ubuntu?

Can anyone who has a test environment verify whether this problem still
occurs on Ubuntu without this commit?

Jason && Leon, please comment on this.

Thanks a lot.

Zhu Yanjun
>
>> Thanks,
>>
>> Bart.
>

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-20 17:29                                 ` Bob Pearson
  2023-09-21  5:46                                   ` Zhu Yanjun
  2023-09-21 10:06                                   ` Zhu Yanjun
@ 2023-09-21 14:23                                   ` Rain River
  2023-09-21 14:39                                     ` Bob Pearson
  2 siblings, 1 reply; 87+ messages in thread
From: Rain River @ 2023-09-21 14:23 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Bart Van Assche, Zhu Yanjun, Shinichiro Kawasaki, linux-rdma, linux-scsi

On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>
> On 9/20/23 12:22, Bart Van Assche wrote:
> > On 9/20/23 10:18, Bob Pearson wrote:
> >> But I have also seen the same behavior in the siw driver which is
> >> completely independent.
> >
> > Hmm ... I haven't seen any hangs yet with the siw driver.
>
> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
> >
> >> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
> >
> > I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
> > support for rxe tasks")?
>
> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> and wqs. But after updating Ubuntu and the kernel at some point they all went away.

I ran tests on the latest Ubuntu with the latest kernel, v6.6-rc2, with the
commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks")
reverted. I ran the blktests test about 30 times, and this problem did
not occur.

So I confirm that without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
workqueue support for rxe tasks"), this hang problem does not occur
on Ubuntu.

Nanthan

>
> >
> > Thanks,
> >
> > Bart.
>
>

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-21 14:23                                   ` Rain River
@ 2023-09-21 14:39                                     ` Bob Pearson
  2023-09-21 15:08                                       ` Zhu Yanjun
  2023-09-21 15:10                                       ` Zhu Yanjun
  0 siblings, 2 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-21 14:39 UTC (permalink / raw)
  To: Rain River, Daisuke Matsuda
  Cc: Bart Van Assche, Zhu Yanjun, Shinichiro Kawasaki, linux-rdma, linux-scsi

On 9/21/23 09:23, Rain River wrote:
> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>
>> On 9/20/23 12:22, Bart Van Assche wrote:
>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>> But I have also seen the same behavior in the siw driver which is
>>>> completely independent.
>>>
>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>
>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>
>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>
>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>> support for rxe tasks")?
>>
>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
> 
> I made tests on the latest Ubuntu with the latest kernel without the
> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> workqueue support for rxe tasks") is reverted.
> I made blktest tests for about 30 times, this problem does not occur.
> 
> So I confirm that without this commit, this hang problem does not
> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> workqueue support for rxe tasks").
> 
> Nanthan
> 
>>
>>>
>>> Thanks,
>>>
>>> Bart.
>>
>>

This commit is very important for several reasons. It is needed for the ODP implementation
that is in the works from Daisuke Matsuda, and also for scaling performance with QP count. The work
queue implementation scales well with increasing QP count while the tasklet implementation
does not. This is critical for the driver's use in large scale storage applications. So, if
there is a bug in the work queue implementation it needs to be fixed, not reverted.

I am still hoping that someone will diagnose what is causing the ULPs to hang, i.e. what
they are waiting for that never arrives.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-21 14:39                                     ` Bob Pearson
@ 2023-09-21 15:08                                       ` Zhu Yanjun
  2023-09-21 15:10                                       ` Zhu Yanjun
  1 sibling, 0 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-21 15:08 UTC (permalink / raw)
  To: Bob Pearson, Rain River, Daisuke Matsuda
  Cc: Bart Van Assche, Shinichiro Kawasaki, linux-rdma, linux-scsi


On 2023/9/21 22:39, Bob Pearson wrote:
> On 9/21/23 09:23, Rain River wrote:
>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>> On 9/20/23 12:22, Bart Van Assche wrote:
>>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>>> But I have also seen the same behavior in the siw driver which is
>>>>> completely independent.
>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>>> support for rxe tasks")?
>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
>> I made tests on the latest Ubuntu with the latest kernel without the
>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>> workqueue support for rxe tasks") is reverted.
>> I made blktest tests for about 30 times, this problem does not occur.
>>
>> So I confirm that without this commit, this hang problem does not
>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>> workqueue support for rxe tasks").
>>
>> Nanthan
>>
>>>> Thanks,
>>>>
>>>> Bart.
>>>
> This commit is very important for several reasons. It is needed for the ODP implementation
> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
> queue implementation scales well with increasing qp number while the tasklet implementation
> does not. This is critical for the drivers use in large scale storage applications. So, if
> there is a bug in the work queue implementation it needs to be fixed not reverted.
>
> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
> something missing causing it to wait.

Hi, Bob


You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support 
for rxe tasks").

You should be very familiar with this commit.

And this commit causes a regression.

So you should delve into the source code to find the root cause, then 
fix it.


Jason && Leon, please comment on this.


Best Regards,

Zhu Yanjun

>
> Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-21 14:39                                     ` Bob Pearson
  2023-09-21 15:08                                       ` Zhu Yanjun
@ 2023-09-21 15:10                                       ` Zhu Yanjun
  2023-09-22 18:14                                         ` Bob Pearson
  1 sibling, 1 reply; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-21 15:10 UTC (permalink / raw)
  To: Bob Pearson, Rain River, Daisuke Matsuda, Jason Gunthorpe, leon
  Cc: Bart Van Assche, Shinichiro Kawasaki, RDMA mailing list, linux-scsi


On 2023/9/21 22:39, Bob Pearson wrote:
> On 9/21/23 09:23, Rain River wrote:
>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>> On 9/20/23 12:22, Bart Van Assche wrote:
>>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>>> But I have also seen the same behavior in the siw driver which is
>>>>> completely independent.
>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>>> support for rxe tasks")?
>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
>> I made tests on the latest Ubuntu with the latest kernel without the
>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>> workqueue support for rxe tasks") is reverted.
>> I made blktest tests for about 30 times, this problem does not occur.
>>
>> So I confirm that without this commit, this hang problem does not
>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>> workqueue support for rxe tasks").
>>
>> Nanthan
>>
>>>> Thanks,
>>>>
>>>> Bart.
>>>
> This commit is very important for several reasons. It is needed for the ODP implementation
> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
> queue implementation scales well with increasing qp number while the tasklet implementation
> does not. This is critical for the drivers use in large scale storage applications. So, if
> there is a bug in the work queue implementation it needs to be fixed not reverted.
>
> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
> something missing causing it to wait.

Hi, Bob


You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support 
for rxe tasks").

You should be very familiar with this commit.

And this commit causes a regression.

So you should delve into the source code to find the root cause, then 
fix it.


Jason && Leon, please comment on this.


Best Regards,

Zhu Yanjun

>
> Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-08-21  6:46 [bug report] blktests srp/002 hang Shinichiro Kawasaki
  2023-08-22  1:46 ` Bob Pearson
@ 2023-09-22 11:06 ` Linux regression tracking #adding (Thorsten Leemhuis)
  2023-10-13 12:51   ` Linux regression tracking #update (Thorsten Leemhuis)
  1 sibling, 1 reply; 87+ messages in thread
From: Linux regression tracking #adding (Thorsten Leemhuis) @ 2023-09-22 11:06 UTC (permalink / raw)
  To: linux-rdma, linux-scsi; +Cc: Linux kernel regressions list

[TLDR: I'm adding this report to the list of tracked Linux kernel
regressions; the text you find below is based on a few template
paragraphs you might have encountered already in similar form.
See link in footer if these mails annoy you.]

On 21.08.23 08:46, Shinichiro Kawasaki wrote:
> I observed a process hang at the blktests test case srp/002 occasionally, using
> kernel v6.5-rcX. The kernel reported stalls of many kworkers [1]. PID 2757 hung at
> inode_sleep_on_writeback(). Other kworkers hung at __inode_wait_for_writeback().
> 
> The hang is recreated in a stable manner by repeating the test case srp/002 (from
> 15 to 30 times).
> 
> I bisected and found the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
> for rxe tasks") looks like the trigger commit. When I revert it from the kernel
> v6.5-rc7, the hang symptom disappears. I'm not sure how the commit relates to
> the hang. Comments will be welcomed.
> […]

Thanks for the report. To be sure the issue doesn't fall through the
cracks unnoticed, I'm adding it to regzbot, the Linux kernel regression
tracking bot:

#regzbot ^introduced 9b4b7c1f9f54
#regzbot title RDMA/rxe: occasional process hang at the blktests test
case srp/002
#regzbot ignore-activity

This isn't a regression? This issue or a fix for it are already
discussed somewhere else? It was fixed already? You want to clarify when
the regression started to happen? Or point out I got the title or
something else totally wrong? Then just reply and tell me -- ideally
while also telling regzbot about it, as explained by the page listed in
the footer of this mail.

Developers: When fixing the issue, remember to add 'Link:' tags pointing
to the report (the parent of this mail). See page linked in footer for
details.

Ciao, Thorsten (wearing his 'the Linux kernel's regression tracker' hat)
--
Everything you wanna know about Linux kernel regression tracking:
https://linux-regtracking.leemhuis.info/about/#tldr
That page also explains what to do if mails like this annoy you.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-21 15:10                                       ` Zhu Yanjun
@ 2023-09-22 18:14                                         ` Bob Pearson
  2023-09-22 22:06                                           ` Bart Van Assche
  2023-09-24  1:17                                           ` Rain River
  0 siblings, 2 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-22 18:14 UTC (permalink / raw)
  To: Zhu Yanjun, Rain River, Daisuke Matsuda, Jason Gunthorpe, leon
  Cc: Bart Van Assche, Shinichiro Kawasaki, RDMA mailing list, linux-scsi

On 9/21/23 10:10, Zhu Yanjun wrote:
> 
> On 2023/9/21 22:39, Bob Pearson wrote:
>> On 9/21/23 09:23, Rain River wrote:
>>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>>> On 9/20/23 12:22, Bart Van Assche wrote:
>>>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>>>> But I have also seen the same behavior in the siw driver which is
>>>>>> completely independent.
>>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>>>> support for rxe tasks")?
>>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
>>> I made tests on the latest Ubuntu with the latest kernel without the
>>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>> workqueue support for rxe tasks") is reverted.
>>> I made blktest tests for about 30 times, this problem does not occur.
>>>
>>> So I confirm that without this commit, this hang problem does not
>>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>> workqueue support for rxe tasks").
>>>
>>> Nanthan
>>>
>>>>> Thanks,
>>>>>
>>>>> Bart.
>>>>
>> This commit is very important for several reasons. It is needed for the ODP implementation
>> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
>> queue implementation scales well with increasing qp number while the tasklet implementation
>> does not. This is critical for the drivers use in large scale storage applications. So, if
>> there is a bug in the work queue implementation it needs to be fixed not reverted.
>>
>> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
>> something missing causing it to wait.
> 
> Hi, Bob
> 
> 
> You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> 
> You should be very familiar with this commit.
> 
> And this commit causes regression.
> 
> So you should delved into the source code to find the root cause, then fix it.

Zhu,

I have spent tons of time over the months trying to figure out what is happening with blktests.
As I have mentioned several times, I have seen the same exact failure in siw in the past, although
currently that doesn't seem to happen, so I had been suspecting that the problem may be in the ULP.
The challenge is that blktests exercises a huge stack of software, much of which I am not
familiar with. The bug is a hang in layers above the rxe driver, and so far no one has been able to
say with any specificity that the rxe driver failed to do something needed to make progress or violated
expected behavior. Without any clue as to where to look it has been hard to make progress.

My main motivation is making Lustre run on rxe, and it does, and it's fast enough to meet our needs.
Lustre is similar to srp as a ULP, and in all of our testing we have never seen a similar hang. Other
hangs to be sure, but not this one. I believe that this bug will never get resolved until someone with
a good understanding of the ULP drivers makes an effort to find out where and why the hang is occurring.
From there it should be straightforward to fix the problem. I am continuing to investigate and am learning
the device-mapper/multipath/srp/scsi stack, but I have a long way to go.

Bob


> 
> 
> Jason && Leon, please comment on this.
> 
> 
> Best Regards,
> 
> Zhu Yanjun
> 
>>
>> Bob


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-22 18:14                                         ` Bob Pearson
@ 2023-09-22 22:06                                           ` Bart Van Assche
  2023-09-24  1:17                                           ` Rain River
  1 sibling, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-09-22 22:06 UTC (permalink / raw)
  To: Bob Pearson, Zhu Yanjun, Rain River, Daisuke Matsuda,
	Jason Gunthorpe, leon
  Cc: Shinichiro Kawasaki, RDMA mailing list, linux-scsi

On 9/22/23 11:14, Bob Pearson wrote:
> I have spent tons of time over the months trying to figure out what 
> is happening with blktests. As I have mentioned several times I have 
> seen the same exact failure in siw in the past although currently 
> that doesn't seem to happen so I had been suspecting that the
> problem may be in the ULP. The challenge is that the blktests
> represents a huge stack of software much of which I am not familiar
> with. The bug is a hang in layers above the rxe driver and so far no
> one has been able to say with any specificity the rxe driver failed
> to do something needed to make progress or violated expected
> behavior. Without any clue as to where to look it has been hard to
> make progress.
> 
> My main motivation is making Lustre run on rxe and it does and it's 
> fast enough to meet our needs. Lustre is similar to srp as a ULP and 
> in all of our testing we have never seen a similar hang. Other hangs 
> to be sure but not this one. I believe that this bug will never get 
> resolved until someone with a good understanding of the ulp drivers 
> makes an effort to find out where and why the hang is occurring.
> From there it should be straight forward to fix the problem. I am 
> continuing to investigate and am learning the 
> device-manager/multipath/srp/scsi stack but I have a long ways to 
> go.

Why would knowledge of device-manager/multipath/srp/scsi be required to
make progress?

Please start with fixing the KASAN complaint shown below. I think the
root cause of this complaint is in the RDMA/rxe driver. This issue can
be reproduced as follows:
* Build and install Linus' master branch with KASAN enabled (commit
   8018e02a8703 ("Merge tag 'thermal-6.6-rc3' of
   git://git.kernel.org/pub/scm/linux/kernel/git/rafael/linux-pm")).
* Install the latest version of blktests and run the following shell
   command:

     export use_rxe=1; while (cd ~bart/software/blktests && ./check -q srp/002); do :; done

   The KASAN complaint should appear during the first run of test
   srp/002.

Thanks,

Bart.

BUG: KASAN: slab-use-after-free in rxe_comp_queue_pkt+0x3d/0x80 [rdma_rxe]
Read of size 8 at addr ffff888111865928 by task kworker/u18:5/3502

CPU: 1 PID: 3502 Comm: kworker/u18:5 Tainted: G        W          6.6.0-rc2-dbg #3
Hardware name: QEMU Standard PC (Q35 + ICH9, 2009), BIOS rel-1.16.2-3-gd478f380-rebuilt.opensuse.org 04/01/2014
Workqueue: rxe_wq do_work [rdma_rxe]
Call Trace:
  <TASK>
  dump_stack_lvl+0x5c/0xc0
  print_address_description.constprop.0+0x33/0x400
  ? preempt_count_sub+0x18/0xc0
  print_report+0xb6/0x260
  ? kasan_complete_mode_report_info+0x5c/0x190
  kasan_report+0xc6/0x100
  ? rxe_comp_queue_pkt+0x3d/0x80 [rdma_rxe]
  ? rxe_comp_queue_pkt+0x3d/0x80 [rdma_rxe]
  __asan_load8+0x69/0x90
  rxe_comp_queue_pkt+0x3d/0x80 [rdma_rxe]
  rxe_rcv+0x3db/0x400 [rdma_rxe]
  ? rxe_rcv_mcast_pkt+0x500/0x500 [rdma_rxe]
  rxe_xmit_packet+0x224/0x3f0 [rdma_rxe]
  ? rxe_prepare+0x110/0x110 [rdma_rxe]
  ? prepare_ack_packet+0x1cd/0x340 [rdma_rxe]
  send_common_ack.isra.0+0xac/0x140 [rdma_rxe]
  ? prepare_ack_packet+0x340/0x340 [rdma_rxe]
  ? __this_cpu_preempt_check+0x13/0x20
  ? rxe_resp_check_length+0x148/0x2d0 [rdma_rxe]
  rxe_responder+0xe0b/0x1610 [rdma_rxe]
  ? __this_cpu_preempt_check+0x13/0x20
  ? rxe_resp_queue_pkt+0x70/0x70 [rdma_rxe]
  do_task+0xd2/0x350 [rdma_rxe]
  ? lockdep_hardirqs_on+0x7e/0x100
  rxe_run_task+0x8a/0xa0 [rdma_rxe]
  rxe_resp_queue_pkt+0x62/0x70 [rdma_rxe]
  rxe_rcv+0x327/0x400 [rdma_rxe]
  ? rxe_rcv_mcast_pkt+0x500/0x500 [rdma_rxe]
  rxe_xmit_packet+0x224/0x3f0 [rdma_rxe]
  ? rxe_prepare+0x110/0x110 [rdma_rxe]
  rxe_requester+0x6bb/0x13a0 [rdma_rxe]
  ? check_prev_add+0x12c0/0x12c0
  ? rnr_nak_timer+0xd0/0xd0 [rdma_rxe]
  ? __lock_acquire+0x88c/0xf30
  ? __kasan_check_read+0x11/0x20
  ? mark_lock+0xeb/0xa80
  ? mark_lock_irq+0xcd0/0xcd0
  ? __lock_release.isra.0+0x14c/0x280
  ? do_task+0x9f/0x350 [rdma_rxe]
  ? reacquire_held_locks+0x270/0x270
  ? _raw_spin_unlock_irqrestore+0x56/0x80
  ? __this_cpu_preempt_check+0x13/0x20
  ? lockdep_hardirqs_on+0x7e/0x100
  ? rnr_nak_timer+0xd0/0xd0 [rdma_rxe]
  do_task+0xd2/0x350 [rdma_rxe]
  ? __this_cpu_preempt_check+0x13/0x20
  do_work+0xe/0x10 [rdma_rxe]
  process_one_work+0x4af/0x9a0
  ? init_worker_pool+0x350/0x350
  ? assign_work+0xe2/0x120
  worker_thread+0x385/0x680
  ? preempt_count_sub+0x18/0xc0
  ? process_one_work+0x9a0/0x9a0
  kthread+0x1b9/0x200
  ? kthread+0xfd/0x200
  ? kthread_complete_and_exit+0x30/0x30
  ret_from_fork+0x36/0x60
  ? kthread_complete_and_exit+0x30/0x30
  ret_from_fork_asm+0x11/0x20
  </TASK>

Allocated by task 3502:
  kasan_save_stack+0x26/0x50
  kasan_set_track+0x25/0x30
  kasan_save_alloc_info+0x1e/0x30
  __kasan_slab_alloc+0x6a/0x70
  kmem_cache_alloc_node+0x16a/0x3d0
  __alloc_skb+0x1d8/0x250
  rxe_init_packet+0x11a/0x3b0 [rdma_rxe]
  prepare_ack_packet+0x9c/0x340 [rdma_rxe]
  send_common_ack.isra.0+0x95/0x140 [rdma_rxe]
  rxe_responder+0xe0b/0x1610 [rdma_rxe]
  do_task+0xd2/0x350 [rdma_rxe]
  rxe_run_task+0x8a/0xa0 [rdma_rxe]
  rxe_resp_queue_pkt+0x62/0x70 [rdma_rxe]
  rxe_rcv+0x327/0x400 [rdma_rxe]
  rxe_xmit_packet+0x224/0x3f0 [rdma_rxe]
  rxe_requester+0x6bb/0x13a0 [rdma_rxe]
  do_task+0xd2/0x350 [rdma_rxe]
  do_work+0xe/0x10 [rdma_rxe]
  process_one_work+0x4af/0x9a0
  worker_thread+0x385/0x680
  kthread+0x1b9/0x200
  ret_from_fork+0x36/0x60
  ret_from_fork_asm+0x11/0x20

Freed by task 56:
  kasan_save_stack+0x26/0x50
  kasan_set_track+0x25/0x30
  kasan_save_free_info+0x2b/0x40
  ____kasan_slab_free+0x14c/0x1b0
  __kasan_slab_free+0x12/0x20
  kmem_cache_free+0x20a/0x4b0
  kfree_skbmem+0xaa/0xc0
  kfree_skb_reason+0x8e/0xe0
  rxe_completer+0x205/0xfe0 [rdma_rxe]
  do_task+0xd2/0x350 [rdma_rxe]
  do_work+0xe/0x10 [rdma_rxe]
  process_one_work+0x4af/0x9a0
  worker_thread+0x385/0x680
  kthread+0x1b9/0x200
  ret_from_fork+0x36/0x60
  ret_from_fork_asm+0x11/0x20

The buggy address belongs to the object at ffff888111865900
  which belongs to the cache skbuff_head_cache of size 224
The buggy address is located 40 bytes inside of
  freed 224-byte region [ffff888111865900, ffff8881118659e0)

The buggy address belongs to the physical page:
page:00000000c6a967c7 refcount:1 mapcount:0 mapping:0000000000000000 index:0x0 pfn:0x111864
head:00000000c6a967c7 order:1 entire_mapcount:0 nr_pages_mapped:0 pincount:0
flags: 0x2000000000000840(slab|head|node=0|zone=2)
page_type: 0xffffffff()
raw: 2000000000000840 ffff888100274c80 dead000000000122 0000000000000000
raw: 0000000000000000 0000000080190019 00000001ffffffff 0000000000000000
page dumped because: kasan: bad access detected

Memory state around the buggy address:
  ffff888111865800: 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
  ffff888111865880: 00 00 00 00 fc fc fc fc fc fc fc fc fc fc fc fc
 >ffff888111865900: fa fb fb fb fb fb fb fb fb fb fb fb fb fb fb fb
                                   ^
  ffff888111865980: fb fb fb fb fb fb fb fb fb fb fb fb fc fc fc fc
  ffff888111865a00: fc fc fc fc fc fc fc fc fc fc fc fc fc fc fc fc
==================================================================

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-22 18:14                                         ` Bob Pearson
  2023-09-22 22:06                                           ` Bart Van Assche
@ 2023-09-24  1:17                                           ` Rain River
  2023-09-25  4:47                                             ` Daisuke Matsuda (Fujitsu)
  1 sibling, 1 reply; 87+ messages in thread
From: Rain River @ 2023-09-24  1:17 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Zhu Yanjun, Daisuke Matsuda, Jason Gunthorpe, leon,
	Bart Van Assche, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On Sat, Sep 23, 2023 at 2:14 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>
> On 9/21/23 10:10, Zhu Yanjun wrote:
> >
> > 在 2023/9/21 22:39, Bob Pearson 写道:
> >> On 9/21/23 09:23, Rain River wrote:
> >>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
> >>>> On 9/20/23 12:22, Bart Van Assche wrote:
> >>>>> On 9/20/23 10:18, Bob Pearson wrote:
> >>>>>> But I have also seen the same behavior in the siw driver which is
> >>>>>> completely independent.
> >>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
> >>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
> >>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
> >>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
> >>>>> support for rxe tasks")?
> >>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> >>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
> >>> I made tests on the latest Ubuntu with the latest kernel without the
> >>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> >>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> >>> workqueue support for rxe tasks") is reverted.
> >>> I made blktest tests for about 30 times, this problem does not occur.
> >>>
> >>> So I confirm that without this commit, this hang problem does not
> >>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> >>> workqueue support for rxe tasks").
> >>>
> >>> Nanthan
> >>>
> >>>>> Thanks,
> >>>>>
> >>>>> Bart.
> >>>>
> >> This commit is very important for several reasons. It is needed for the ODP implementation
> >> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
> >> queue implementation scales well with increasing qp number while the tasklet implementation
> >> does not. This is critical for the drivers use in large scale storage applications. So, if
> >> there is a bug in the work queue implementation it needs to be fixed not reverted.
> >>
> >> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
> >> something missing causing it to wait.
> >
> > Hi, Bob
> >
> >
> > You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> >
> > You should be very familiar with this commit.
> >
> > And this commit causes regression.
> >
> > So you should delved into the source code to find the root cause, then fix it.
>
> Zhu,
>
> I have spent tons of time over the months trying to figure out what is happening with blktests.
> As I have mentioned several times I have seen the same exact failure in siw in the past although
> currently that doesn't seem to happen so I had been suspecting that the problem may be in the ULP.
> The challenge is that the blktests represents a huge stack of software much of which I am not
> familiar with. The bug is a hang in layers above the rxe driver and so far no one has been able to
> say with any specificity the rxe driver failed to do something needed to make progress or violated
> expected behavior. Without any clue as to where to look it has been hard to make progress.

Bob

A work queue can sleep. If the work queue sleeps for a long time, the packets
will not be delivered to the ULP. This is why this hang occurs.
It is difficult to handle this sleeping in the work queue. It would be better
to revert this commit in RXE.
Because the work queue sleeps, the ULP cannot wait that long for the
packets. If packets cannot reach the ULPs for a long time, many problems
will occur in the ULPs.

>
> My main motivation is making Lustre run on rxe and it does and it's fast enough to meet our needs.
> Lustre is similar to srp as a ULP and in all of our testing we have never seen a similar hang. Other
> hangs to be sure but not this one. I believe that this bug will never get resolved until someone with
> a good understanding of the ulp drivers makes an effort to find out where and why the hang is occurring.
> From there it should be straight forward to fix the problem. I am continuing to investigate and am learning
> the device-manager/multipath/srp/scsi stack but I have a long ways to go.
>
> Bob
>
>
> >
> >
> > Jason && Leon, please comment on this.
> >
> >
> > Best Regards,
> >
> > Zhu Yanjun
> >
> >>
> >> Bob
>

^ permalink raw reply	[flat|nested] 87+ messages in thread

* RE: [bug report] blktests srp/002 hang
  2023-09-24  1:17                                           ` Rain River
@ 2023-09-25  4:47                                             ` Daisuke Matsuda (Fujitsu)
  2023-09-25 14:31                                               ` Zhu Yanjun
  2023-09-25 15:00                                               ` Bart Van Assche
  0 siblings, 2 replies; 87+ messages in thread
From: Daisuke Matsuda (Fujitsu) @ 2023-09-25  4:47 UTC (permalink / raw)
  To: 'Rain River', Bob Pearson
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Bart Van Assche,
	Shinichiro Kawasaki, RDMA mailing list, linux-scsi

On Sun, Sep 24, 2023 10:18 AM Rain River wrote:
> On Sat, Sep 23, 2023 at 2:14 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
> >
> > On 9/21/23 10:10, Zhu Yanjun wrote:
> > >
> > > 在 2023/9/21 22:39, Bob Pearson 写道:
> > >> On 9/21/23 09:23, Rain River wrote:
> > >>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
> > >>>> On 9/20/23 12:22, Bart Van Assche wrote:
> > >>>>> On 9/20/23 10:18, Bob Pearson wrote:
> > >>>>>> But I have also seen the same behavior in the siw driver which is
> > >>>>>> completely independent.
> > >>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
> > >>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
> > >>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of
> the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
> > >>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
> > >>>>> support for rxe tasks")?
> > >>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> > >>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
> > >>> I made tests on the latest Ubuntu with the latest kernel without the
> > >>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> > >>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> > >>> workqueue support for rxe tasks") is reverted.
> > >>> I made blktest tests for about 30 times, this problem does not occur.
> > >>>
> > >>> So I confirm that without this commit, this hang problem does not
> > >>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> > >>> workqueue support for rxe tasks").
> > >>>
> > >>> Nanthan
> > >>>
> > >>>>> Thanks,
> > >>>>>
> > >>>>> Bart.
> > >>>>
> > >> This commit is very important for several reasons. It is needed for the ODP implementation
> > >> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
> > >> queue implementation scales well with increasing qp number while the tasklet implementation
> > >> does not. This is critical for the drivers use in large scale storage applications. So, if
> > >> there is a bug in the work queue implementation it needs to be fixed not reverted.
> > >>
> > >> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
> > >> something missing causing it to wait.
> > >
> > > Hi, Bob
> > >
> > >
> > > You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> > >
> > > You should be very familiar with this commit.
> > >
> > > And this commit causes regression.
> > >
> > > So you should delved into the source code to find the root cause, then fix it.
> >
> > Zhu,
> >
> > I have spent tons of time over the months trying to figure out what is happening with blktests.
> > As I have mentioned several times I have seen the same exact failure in siw in the past although
> > currently that doesn't seem to happen so I had been suspecting that the problem may be in the ULP.
> > The challenge is that the blktests represents a huge stack of software much of which I am not
> > familiar with. The bug is a hang in layers above the rxe driver and so far no one has been able to
> > say with any specificity the rxe driver failed to do something needed to make progress or violated
> > expected behavior. Without any clue as to where to look it has been hard to make progress.
> 
> Bob
> 
> Work queue will sleep. If work queue sleep for long time, the packets
> will not be sent to ULP. This is why this hang occurs.

In general a work queue can sleep, but the workload running in the rxe driver
should not sleep, because it was originally running in a tasklet and was
converted to use a work queue. A task can sometimes take longer because of
IRQs, but the same thing can also happen with a tasklet. If there is a
difference between the two, I think it would be the overhead of scheduling
the work queue.

> Difficult to handle this sleep in work queue. It had better revert
> this commit in RXE.

I object to reverting the commit at this stage. As Bob wrote above,
nobody has found any logical failure in the rxe driver. It is quite possible
that the patch is just revealing a latent bug in the higher layers.

> Because work queue sleeps,  ULP can not wait for long time for the
> packets. If packets can not reach ULPs for long time, many problems
> will occur to ULPs.

I wonder where in the rxe driver it sleeps. BTW, most packets are
processed in NET_RX_IRQ context, and the work queue is scheduled only
when there is already a running context. If your speculation is on point,
the hang should occur more frequently if we change the driver to use the
work queue exclusively. My ODP patches include a change to do this.
Cf. https://lore.kernel.org/lkml/7699a90bc4af10c33c0a46ef6330ed4bb7e7ace6.1694153251.git.matsuda-daisuke@fujitsu.com/

Thanks,
Daisuke

> 
> >
> > My main motivation is making Lustre run on rxe and it does and it's fast enough to meet our needs.
> > Lustre is similar to srp as a ULP and in all of our testing we have never seen a similar hang. Other
> > hangs to be sure but not this one. I believe that this bug will never get resolved until someone with
> > a good understanding of the ulp drivers makes an effort to find out where and why the hang is occurring.
> > From there it should be straight forward to fix the problem. I am continuing to investigate and am learning
> > the device-manager/multipath/srp/scsi stack but I have a long ways to go.
> >
> > Bob
> >
> >
> > >
> > >
> > > Jason && Leon, please comment on this.
> > >
> > >
> > > Best Regards,
> > >
> > > Zhu Yanjun
> > >
> > >>
> > >> Bob
> >

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25  4:47                                             ` Daisuke Matsuda (Fujitsu)
@ 2023-09-25 14:31                                               ` Zhu Yanjun
  2023-09-26  1:09                                                 ` Daisuke Matsuda (Fujitsu)
  2023-09-25 15:00                                               ` Bart Van Assche
  1 sibling, 1 reply; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-25 14:31 UTC (permalink / raw)
  To: Daisuke Matsuda (Fujitsu), 'Rain River', Bob Pearson
  Cc: Jason Gunthorpe, leon, Bart Van Assche, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi


在 2023/9/25 12:47, Daisuke Matsuda (Fujitsu) 写道:
> On Sun, Sep 24, 2023 10:18 AM Rain River wrote:
>> On Sat, Sep 23, 2023 at 2:14 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>> On 9/21/23 10:10, Zhu Yanjun wrote:
>>>> 在 2023/9/21 22:39, Bob Pearson 写道:
>>>>> On 9/21/23 09:23, Rain River wrote:
>>>>>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>>>>>> On 9/20/23 12:22, Bart Van Assche wrote:
>>>>>>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>>>>>>> But I have also seen the same behavior in the siw driver which is
>>>>>>>>> completely independent.
>>>>>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>>>>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>>>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of
>> the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>>>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>>>>>>> support for rxe tasks")?
>>>>>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>>>>>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
>>>>>> I made tests on the latest Ubuntu with the latest kernel without the
>>>>>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>>>>>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>>>>> workqueue support for rxe tasks") is reverted.
>>>>>> I made blktest tests for about 30 times, this problem does not occur.
>>>>>>
>>>>>> So I confirm that without this commit, this hang problem does not
>>>>>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>>>>> workqueue support for rxe tasks").
>>>>>>
>>>>>> Nanthan
>>>>>>
>>>>>>>> Thanks,
>>>>>>>>
>>>>>>>> Bart.
>>>>> This commit is very important for several reasons. It is needed for the ODP implementation
>>>>> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
>>>>> queue implementation scales well with increasing qp number while the tasklet implementation
>>>>> does not. This is critical for the drivers use in large scale storage applications. So, if
>>>>> there is a bug in the work queue implementation it needs to be fixed not reverted.
>>>>>
>>>>> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
>>>>> something missing causing it to wait.
>>>> Hi, Bob
>>>>
>>>>
>>>> You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>>>>
>>>> You should be very familiar with this commit.
>>>>
>>>> And this commit causes regression.
>>>>
>>>> So you should delved into the source code to find the root cause, then fix it.
>>> Zhu,
>>>
>>> I have spent tons of time over the months trying to figure out what is happening with blktests.
>>> As I have mentioned several times I have seen the same exact failure in siw in the past although
>>> currently that doesn't seem to happen so I had been suspecting that the problem may be in the ULP.
>>> The challenge is that the blktests represents a huge stack of software much of which I am not
>>> familiar with. The bug is a hang in layers above the rxe driver and so far no one has been able to
>>> say with any specificity the rxe driver failed to do something needed to make progress or violated
>>> expected behavior. Without any clue as to where to look it has been hard to make progress.
>> Bob
>>
>> Work queue will sleep. If work queue sleep for long time, the packets
>> will not be sent to ULP. This is why this hang occurs.
> In general work queue can sleep, but the workload running in rxe driver
> should not sleep because it was originally running on tasklet and converted
> to use work queue. A task can sometime take longer because of IRQs, but
> the same thing can also happen with tasklet. If there is a difference between
> the two, I think it would be the overhead of scheduring the work queue.
>
>> Difficult to handle this sleep in work queue. It had better revert
>> this commit in RXE.
> I am objected to reverting the commit at this stage. As Bob wrote above,
> nobody has found any logical failure in rxe driver. It is quite possible
> that the patch is just revealing a latent bug in the higher layers.

So far, on Debian and Fedora, all the tests with the work queue hang,
and after reverting this commit, no hang occurs.

Until there are new test results, it is reasonable to suspect that this
commit causes the hang.

>
>> Because work queue sleeps,  ULP can not wait for long time for the
>> packets. If packets can not reach ULPs for long time, many problems
>> will occur to ULPs.
> I wonder where in the rxe driver does it sleep. BTW, most packets are
> processed in NET_RX_IRQ context, and work queue is scheduled only

Do you mean NET_RX_SOFTIRQ?

Zhu Yanjun

> when there is already a running context. If your speculation is to the point,
> the hang will occur more frequently if we change it to use work queue exclusively.
> My ODP patches include a change to do this.
> Cf. https://lore.kernel.org/lkml/7699a90bc4af10c33c0a46ef6330ed4bb7e7ace6.1694153251.git.matsuda-daisuke@fujitsu.com/
>
> Thanks,
> Daisuke
>
>>> My main motivation is making Lustre run on rxe and it does and it's fast enough to meet our needs.
>>> Lustre is similar to srp as a ULP and in all of our testing we have never seen a similar hang. Other
>>> hangs to be sure but not this one. I believe that this bug will never get resolved until someone with
>>> a good understanding of the ulp drivers makes an effort to find out where and why the hang is occurring.
>>>  From there it should be straight forward to fix the problem. I am continuing to investigate and am learning
>>> the device-manager/multipath/srp/scsi stack but I have a long ways to go.
>>>
>>> Bob
>>>
>>>
>>>>
>>>> Jason && Leon, please comment on this.
>>>>
>>>>
>>>> Best Regards,
>>>>
>>>> Zhu Yanjun
>>>>
>>>>> Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25  4:47                                             ` Daisuke Matsuda (Fujitsu)
  2023-09-25 14:31                                               ` Zhu Yanjun
@ 2023-09-25 15:00                                               ` Bart Van Assche
  2023-09-25 15:25                                                 ` Bob Pearson
                                                                   ` (3 more replies)
  1 sibling, 4 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-09-25 15:00 UTC (permalink / raw)
  To: Daisuke Matsuda (Fujitsu), 'Rain River', Bob Pearson
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
> As Bob wrote above, nobody has found any logical failure in rxe
> driver.

That's wrong. In case you would not yet have noticed my latest email in
this thread, please take a look at
https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73. 
I think the report in that email is a 100% proof that there is a 
use-after-free issue in the rdma_rxe driver. Use-after-free issues have 
security implications and also can cause data corruption. I propose to 
revert the commit that introduced the rdma_rxe use-after-free unless 
someone comes up with a fix for the rdma_rxe driver.

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 15:00                                               ` Bart Van Assche
@ 2023-09-25 15:25                                                 ` Bob Pearson
  2023-09-25 15:52                                                 ` Jason Gunthorpe
                                                                   ` (2 subsequent siblings)
  3 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-25 15:25 UTC (permalink / raw)
  To: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 9/25/23 10:00, Bart Van Assche wrote:
> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
>> As Bob wrote above, nobody has found any logical failure in rxe
>> driver.
> 
> That's wrong. In case you would not yet have noticed my latest email in
> this thread, please take a look at
> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73. I think the report in that email is a 100% proof that there is a use-after-free issue in the rdma_rxe driver. Use-after-free issues have security implications and also can cause data corruption. I propose to revert the commit that introduced the rdma_rxe use-after-free unless someone comes up with a fix for the rdma_rxe driver.
> 
> Bart.

Thanks Bart, I missed that. This will give me a better target to try to track this down.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 15:00                                               ` Bart Van Assche
  2023-09-25 15:25                                                 ` Bob Pearson
@ 2023-09-25 15:52                                                 ` Jason Gunthorpe
  2023-09-25 15:54                                                   ` Bob Pearson
  2023-09-25 19:57                                                 ` Bob Pearson
  2023-09-26  1:17                                                 ` Daisuke Matsuda (Fujitsu)
  3 siblings, 1 reply; 87+ messages in thread
From: Jason Gunthorpe @ 2023-09-25 15:52 UTC (permalink / raw)
  To: Bart Van Assche
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Bob Pearson, Zhu Yanjun, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On Mon, Sep 25, 2023 at 08:00:39AM -0700, Bart Van Assche wrote:
> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
> > As Bob wrote above, nobody has found any logical failure in rxe
> > driver.
> 
> That's wrong. In case you would not yet have noticed my latest email in
> this thread, please take a look at
> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73.
> I think the report in that email is a 100% proof that there is a
> use-after-free issue in the rdma_rxe driver. Use-after-free issues have
> security implications and also can cause data corruption. I propose to
> revert the commit that introduced the rdma_rxe use-after-free unless someone
> comes up with a fix for the rdma_rxe driver.

I should say I'm not keen on reverting improvements to rxe. This stuff
needs to happen eventually. Let's please try hard to fix it.

Jason

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 15:52                                                 ` Jason Gunthorpe
@ 2023-09-25 15:54                                                   ` Bob Pearson
  0 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-25 15:54 UTC (permalink / raw)
  To: Jason Gunthorpe, Bart Van Assche
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 9/25/23 10:52, Jason Gunthorpe wrote:
> On Mon, Sep 25, 2023 at 08:00:39AM -0700, Bart Van Assche wrote:
>> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
>>> As Bob wrote above, nobody has found any logical failure in rxe
>>> driver.
>>
>> That's wrong. In case you would not yet have noticed my latest email in
>> this thread, please take a look at
>> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73.
>> I think the report in that email is a 100% proof that there is a
>> use-after-free issue in the rdma_rxe driver. Use-after-free issues have
>> security implications and also can cause data corruption. I propose to
>> revert the commit that introduced the rdma_rxe use-after-free unless someone
>> comes up with a fix for the rdma_rxe driver.
> 
> I should say I'm not keen on reverting improvements to rxe. This stuff
> needs to happen eventually. Let's please try hard to fix it.
> 
> Jason
I'm digging into Bart's kasan bug. Hope to find something.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 15:00                                               ` Bart Van Assche
  2023-09-25 15:25                                                 ` Bob Pearson
  2023-09-25 15:52                                                 ` Jason Gunthorpe
@ 2023-09-25 19:57                                                 ` Bob Pearson
  2023-09-25 20:33                                                   ` Bart Van Assche
  2023-09-26 15:36                                                   ` Rain River
  2023-09-26  1:17                                                 ` Daisuke Matsuda (Fujitsu)
  3 siblings, 2 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-25 19:57 UTC (permalink / raw)
  To: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 9/25/23 10:00, Bart Van Assche wrote:
> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
>> As Bob wrote above, nobody has found any logical failure in rxe
>> driver.
> 
> That's wrong. In case you would not yet have noticed my latest email in
> this thread, please take a look at
> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73. I think the report in that email is a 100% proof that there is a use-after-free issue in the rdma_rxe driver. Use-after-free issues have security implications and also can cause data corruption. I propose to revert the commit that introduced the rdma_rxe use-after-free unless someone comes up with a fix for the rdma_rxe driver.
> 
> Bart.

Bart,

Having trouble following your recipe. The git repo you mention does not seem to be available. E.g.

rpearson:src$ git clone git://git.kernel.org/pub/scm/linux/git/rafael/linux-pm
Cloning into 'linux-pm'...
fatal: remote error: access denied or repository not exported: /pub/scm/linux/git/rafael/linux-pm

I am not sure how to obtain the tag if I cannot see the repo.

If I just try to enable KASAN by setting CONFIG_KASAN=y in .config for the current linux-rdma repo
and compile the kernel, the kernel won't boot and is caught in some kind of SRSO hell. If I check out
Linus' v6.4 tag and add CONFIG_KASAN=y to a fresh .config file, the kernel builds OK, but when I
try to boot it, it is unable to chroot to the root file system during boot.

Any hints would be appreciated.

Bob


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 19:57                                                 ` Bob Pearson
@ 2023-09-25 20:33                                                   ` Bart Van Assche
  2023-09-25 20:40                                                     ` Bob Pearson
  2023-09-26 15:36                                                   ` Rain River
  1 sibling, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-09-25 20:33 UTC (permalink / raw)
  To: Bob Pearson, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

[-- Attachment #1: Type: text/plain, Size: 1390 bytes --]

On 9/25/23 12:57, Bob Pearson wrote:
> Having trouble following your recipe. The git repo you mention does not seem to be available. E.g.
> 
> rpearson:src$ git clone git://git.kernel.org/pub/scm/linux/git/rafael/linux-pm
> Cloning into 'linux-pm'...
> fatal: remote error: access denied or repository not exported: /pub/scm/linux/git/rafael/linux-pm
> 
> I am not sure how to obtain the tag if I cannot see the repo.

As one can see on
https://git.kernel.org/pub/scm/linux/kernel/git/rafael/linux-pm.git/,
".git" is missing from the end of the URL in your git clone command.

I think that you misread my email. In my email I clearly referred to
Linus' master branch. Please try this:
$ git clone git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git linux-kernel
$ cd linux-kernel
$ git checkout 8018e02a8703 -b linus-master

> If I just try to enable KASAN by setting CONFIG_KASAN=y in .config for the current linux-rdma repo
> and compile the kernel the kernel won't boot and is caught in some kind of SRSO hell. If I checkout
> Linus' v6.4 tag and add CONFIG_KASAN=y to a fresh .config file the kernel builds OK but when I
> try to boot it, it is unable to chroot to the root file system in boot.

Please try to run the blktests suite in a VM. I have attached the kernel
configuration to this email with which I observed the KASAN complaint on
my test setup.
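
As an aside, setting CONFIG_KASAN=y by hand is generally not enough on
its own, since `make olddefconfig` can silently drop it when supporting
options are missing. As a rough sketch (exact option availability
depends on the architecture and kernel version), the fragment that needs
to survive into the final .config for generic software KASAN looks like:

```
CONFIG_KASAN=y
CONFIG_KASAN_GENERIC=y
CONFIG_KASAN_INLINE=y
CONFIG_KASAN_VMALLOC=y
CONFIG_STACKTRACE=y
```

After running `make olddefconfig`, a quick `grep KASAN .config` verifies
that the options were not dropped.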

Thanks,

Bart.

[-- Attachment #2: vm-kernel-config.txt.gz --]
[-- Type: application/gzip, Size: 29541 bytes --]

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 20:33                                                   ` Bart Van Assche
@ 2023-09-25 20:40                                                     ` Bob Pearson
  0 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-09-25 20:40 UTC (permalink / raw)
  To: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 9/25/23 15:33, Bart Van Assche wrote:
> On 9/25/23 12:57, Bob Pearson wrote:
>> Having trouble following your recipe. The git repo you mention does not seem to be available. E.g.
>>
>> rpearson:src$ git clone git://git.kernel.org/pub/scm/linux/git/rafael/linux-pm
>> Cloning into 'linux-pm'...
>> fatal: remote error: access denied or repository not exported: /pub/scm/linux/git/rafael/linux-pm
>>
>> I am not sure how to obtain the tag if I cannot see the repo.
> 
> As one can see on
> https://git.kernel.org/pub/scm/linux/kernel/git/rafael/linux-pm.git/,
> ".git" is missing from the end of the URL in your git clone command.
> 
> I think that you misread my email. In my email I clearly referred to
> Linus' master branch. Please try this:
> $ git clone git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git linux-kernel
> $ cd linux-kernel
> $ git checkout 8018e02a8703 -b linus-master

What the email said was:

Please start with fixing the KASAN complaint shown below. I think the
root cause of this complaint is in the RDMA/rxe driver. This issue can
be reproduced as follows:
* Build and install Linus' master branch with KASAN enabled (commit
   8018e02a8703 ("Merge tag 'thermal-6.6-rc3' of
   git://git.kernel.org/pub/scm/linux/kernel/git/rafael/linux-pm")).

I found the reference to rafael/linux-pm confusing. I had also tried with .git appended; that still didn't work.
Thanks for the clarification.

Bob
> 
>> If I just try to enable KASAN by setting CONFIG_KASAN=y in .config for the current linux-rdma repo
>> and compile the kernel the kernel won't boot and is caught in some kind of SRSO hell. If I checkout
>> Linus' v6.4 tag and add CONFIG_KASAN=y to a fresh .config file the kernel builds OK but when I
>> try to boot it, it is unable to chroot to the root file system in boot.
> 
> Please try to run the blktests suite in a VM. I have attached the kernel
> configuration to this email with which I observed the KASAN complaint on
> my test setup.
> 
> Thanks,
> 
> Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 14:31                                               ` Zhu Yanjun
@ 2023-09-26  1:09                                                 ` Daisuke Matsuda (Fujitsu)
  2023-09-26  6:09                                                   ` Zhu Yanjun
  0 siblings, 1 reply; 87+ messages in thread
From: Daisuke Matsuda (Fujitsu) @ 2023-09-26  1:09 UTC (permalink / raw)
  To: 'Zhu Yanjun', 'Rain River', Bob Pearson
  Cc: Jason Gunthorpe, leon, Bart Van Assche, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On Mon, Sep 25, 2023 11:31 PM Zhu Yanjun <yanjun.zhu@linux.dev> wrote:
> On 2023/9/25 12:47, Daisuke Matsuda (Fujitsu) wrote:
> > On Sun, Sep 24, 2023 10:18 AM Rain River wrote:
> >> On Sat, Sep 23, 2023 at 2:14 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
> >>> On 9/21/23 10:10, Zhu Yanjun wrote:
> >>>>> On 2023/9/21 22:39, Bob Pearson wrote:
> >>>>> On 9/21/23 09:23, Rain River wrote:
> >>>>>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
> >>>>>>> On 9/20/23 12:22, Bart Van Assche wrote:
> >>>>>>>> On 9/20/23 10:18, Bob Pearson wrote:
> >>>>>>>>> But I have also seen the same behavior in the siw driver which is
> >>>>>>>>> completely independent.
> >>>>>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
> >>>>>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
> >>>>>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
> >>>>>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
> >>>>>>>> support for rxe tasks")?
> >>>>>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
> >>>>>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
> >>>>>> I made tests on the latest Ubuntu with the latest kernel without the
> >>>>>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> >>>>>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> >>>>>> workqueue support for rxe tasks") is reverted.
> >>>>>> I made blktest tests for about 30 times, this problem does not occur.
> >>>>>>
> >>>>>> So I confirm that without this commit, this hang problem does not
> >>>>>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
> >>>>>> workqueue support for rxe tasks").
> >>>>>>
> >>>>>> Nanthan
> >>>>>>
> >>>>>>>> Thanks,
> >>>>>>>>
> >>>>>>>> Bart.
> >>>>> This commit is very important for several reasons. It is needed for the ODP implementation
> >>>>> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
> >>>>> queue implementation scales well with increasing qp number while the tasklet implementation
> >>>>> does not. This is critical for the drivers use in large scale storage applications. So, if
> >>>>> there is a bug in the work queue implementation it needs to be fixed not reverted.
> >>>>>
> >>>>> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
> >>>>> something missing causing it to wait.
> >>>> Hi, Bob
> >>>>
> >>>>
> >>>> You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
> >>>>
> >>>> You should be very familiar with this commit.
> >>>>
> >>>> And this commit causes regression.
> >>>>
> >>>> So you should delve into the source code to find the root cause, and then fix it.
> >>> Zhu,
> >>>
> >>> I have spent tons of time over the months trying to figure out what is happening with blktests.
> >>> As I have mentioned several times I have seen the same exact failure in siw in the past although
> >>> currently that doesn't seem to happen so I had been suspecting that the problem may be in the ULP.
> >>> The challenge is that the blktests represents a huge stack of software much of which I am not
> >>> familiar with. The bug is a hang in layers above the rxe driver and so far no one has been able to
> >>> say with any specificity the rxe driver failed to do something needed to make progress or violated
> >>> expected behavior. Without any clue as to where to look it has been hard to make progress.
> >> Bob
> >>
> >> A work queue can sleep. If the work queue sleeps for a long time, the packets
> >> will not be sent to the ULP. This is why this hang occurs.
> > In general a work queue can sleep, but the workload running in the rxe driver
> > should not sleep, because it was originally running on a tasklet and was
> > converted to use a work queue. A task can sometimes take longer because of
> > IRQs, but the same thing can also happen with a tasklet. If there is a difference
> > between the two, I think it would be the overhead of scheduling the work queue.
> >
> >> It is difficult to handle this sleep in a work queue. It would be better to
> >> revert this commit in RXE.
> > I object to reverting the commit at this stage. As Bob wrote above,
> > nobody has found any logical failure in the rxe driver. It is quite possible
> > that the patch is just revealing a latent bug in the higher layers.
> 
> So far, on Debian and Fedora, all the tests with the work queue hang,
> and after reverting this commit, no hang occurs.
> 
> Until there are new test results, it is reasonable to suspect that this
> commit causes the hang.

If the hang *always* occurred, then I would agree that you are correct,
but this one happens only occasionally. It is also natural to think that
the commit makes it easier to hit the condition of an existing bug.

> 
> >
> >> Because the work queue sleeps, the ULP cannot wait a long time for the
> >> packets. If packets cannot reach the ULPs for a long time, many problems
> >> will occur in the ULPs.
> > I wonder where in the rxe driver it sleeps. BTW, most packets are
> > processed in NET_RX_IRQ context, and the work queue is scheduled only
> 
> Do you mean NET_RX_SOFTIRQ?

Yes. I am sorry for confusing you.

Thanks,
Daisuke

> 
> Zhu Yanjun
> 
> > when there is already a running context. If your speculation is correct,
> > the hang will occur more frequently if we change it to use the work queue exclusively.
> > My ODP patches include a change to do this.
> > Cf.
> > https://lore.kernel.org/lkml/7699a90bc4af10c33c0a46ef6330ed4bb7e7ace6.1694153251.git.matsuda-daisuke@fujitsu.com/
> >
> > Thanks,
> > Daisuke
> >
> >>> My main motivation is making Lustre run on rxe and it does and it's fast enough to meet our needs.
> >>> Lustre is similar to srp as a ULP and in all of our testing we have never seen a similar hang. Other
> >>> hangs to be sure but not this one. I believe that this bug will never get resolved until someone with
> >>> a good understanding of the ulp drivers makes an effort to find out where and why the hang is occurring.
> >>>  From there it should be straight forward to fix the problem. I am continuing to investigate and am learning
> >>> the device-manager/multipath/srp/scsi stack but I have a long ways to go.
> >>>
> >>> Bob
> >>>
> >>>
> >>>>
> >>>> Jason && Leon, please comment on this.
> >>>>
> >>>>
> >>>> Best Regards,
> >>>>
> >>>> Zhu Yanjun
> >>>>
> >>>>> Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 15:00                                               ` Bart Van Assche
                                                                   ` (2 preceding siblings ...)
  2023-09-25 19:57                                                 ` Bob Pearson
@ 2023-09-26  1:17                                                 ` Daisuke Matsuda (Fujitsu)
  2023-10-17 17:09                                                   ` Bob Pearson
  3 siblings, 1 reply; 87+ messages in thread
From: Daisuke Matsuda (Fujitsu) @ 2023-09-26  1:17 UTC (permalink / raw)
  To: 'Bart Van Assche', 'Rain River', Bob Pearson
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On Tue, Sep 26, 2023 12:01 AM Bart Van Assche:
> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
> > As Bob wrote above, nobody has found any logical failure in rxe
> > driver.
> 
> That's wrong. In case you would not yet have noticed my latest email in
> this thread, please take a look at
> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73.
> I think the report in that email is a 100% proof that there is a
> use-after-free issue in the rdma_rxe driver. Use-after-free issues have
> security implications and also can cause data corruption. I propose to
> revert the commit that introduced the rdma_rxe use-after-free unless
> someone comes up with a fix for the rdma_rxe driver.
> 
> Bart.

Thank you for the clarification. I see your intention.
I hope the hang issue will be resolved by addressing this.

Thanks,
Daisuke


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-26  1:09                                                 ` Daisuke Matsuda (Fujitsu)
@ 2023-09-26  6:09                                                   ` Zhu Yanjun
  0 siblings, 0 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-09-26  6:09 UTC (permalink / raw)
  To: Daisuke Matsuda (Fujitsu), 'Rain River', Bob Pearson
  Cc: Jason Gunthorpe, leon, Bart Van Assche, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 2023/9/26 9:09, Daisuke Matsuda (Fujitsu) wrote:
> On Mon, Sep 25, 2023 11:31 PM Zhu Yanjun <yanjun.zhu@linux.dev> wrote:
>> On 2023/9/25 12:47, Daisuke Matsuda (Fujitsu) wrote:
>>> On Sun, Sep 24, 2023 10:18 AM Rain River wrote:
>>>> On Sat, Sep 23, 2023 at 2:14 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>>>> On 9/21/23 10:10, Zhu Yanjun wrote:
>>>>>> On 2023/9/21 22:39, Bob Pearson wrote:
>>>>>>> On 9/21/23 09:23, Rain River wrote:
>>>>>>>> On Thu, Sep 21, 2023 at 2:53 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>>>>>>>>> On 9/20/23 12:22, Bart Van Assche wrote:
>>>>>>>>>> On 9/20/23 10:18, Bob Pearson wrote:
>>>>>>>>>>> But I have also seen the same behavior in the siw driver which is
>>>>>>>>>>> completely independent.
>>>>>>>>>> Hmm ... I haven't seen any hangs yet with the siw driver.
>>>>>>>>> I was on Ubuntu 6-9 months ago. Currently I don't see hangs on either.
>>>>>>>>>>> As mentioned above at the moment Ubuntu is failing rarely. But it used to fail reliably (srp/002 about 75% of the time and srp/011 about 99% of the time.) There haven't been any changes to rxe to explain this.
>>>>>>>>>> I think that Zhu mentioned commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue
>>>>>>>>>> support for rxe tasks")?
>>>>>>>>> That change happened well before the failures went away. I was seeing failures at the same rate with tasklets
>>>>>>>>> and wqs. But after updating Ubuntu and the kernel at some point they all went away.
>>>>>>>> I made tests on the latest Ubuntu with the latest kernel without the
>>>>>>>> commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>>>>>>>> The latest kernel is v6.6-rc2, the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>>>>>>> workqueue support for rxe tasks") is reverted.
>>>>>>>> I made blktest tests for about 30 times, this problem does not occur.
>>>>>>>>
>>>>>>>> So I confirm that without this commit, this hang problem does not
>>>>>>>> occur on Ubuntu without the commit 9b4b7c1f9f54 ("RDMA/rxe: Add
>>>>>>>> workqueue support for rxe tasks").
>>>>>>>>
>>>>>>>> Nanthan
>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>>
>>>>>>>>>> Bart.
>>>>>>> This commit is very important for several reasons. It is needed for the ODP implementation
>>>>>>> that is in the works from Daisuke Matsuda and also for QP scaling of performance. The work
>>>>>>> queue implementation scales well with increasing qp number while the tasklet implementation
>>>>>>> does not. This is critical for the drivers use in large scale storage applications. So, if
>>>>>>> there is a bug in the work queue implementation it needs to be fixed not reverted.
>>>>>>>
>>>>>>> I am still hoping that someone will diagnose what is causing the ULPs to hang in terms of
>>>>>>> something missing causing it to wait.
>>>>>> Hi, Bob
>>>>>>
>>>>>>
>>>>>> You submitted this commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support for rxe tasks").
>>>>>>
>>>>>> You should be very familiar with this commit.
>>>>>>
>>>>>> And this commit causes regression.
>>>>>>
>>>>>> So you should delve into the source code to find the root cause, and then fix it.
>>>>> Zhu,
>>>>>
>>>>> I have spent tons of time over the months trying to figure out what is happening with blktests.
>>>>> As I have mentioned several times I have seen the same exact failure in siw in the past although
>>>>> currently that doesn't seem to happen so I had been suspecting that the problem may be in the ULP.
>>>>> The challenge is that the blktests represents a huge stack of software much of which I am not
>>>>> familiar with. The bug is a hang in layers above the rxe driver and so far no one has been able to
>>>>> say with any specificity the rxe driver failed to do something needed to make progress or violated
>>>>> expected behavior. Without any clue as to where to look it has been hard to make progress.
>>>> Bob
>>>>
>>>> A work queue can sleep. If the work queue sleeps for a long time, the packets
>>>> will not be sent to the ULP. This is why this hang occurs.
>>> In general a work queue can sleep, but the workload running in the rxe driver
>>> should not sleep, because it was originally running on a tasklet and was
>>> converted to use a work queue. A task can sometimes take longer because of
>>> IRQs, but the same thing can also happen with a tasklet. If there is a difference
>>> between the two, I think it would be the overhead of scheduling the work queue.
>>>
>>>> It is difficult to handle this sleep in a work queue. It would be better to
>>>> revert this commit in RXE.
>>> I object to reverting the commit at this stage. As Bob wrote above,
>>> nobody has found any logical failure in the rxe driver. It is quite possible
>>> that the patch is just revealing a latent bug in the higher layers.
>>
>> So far, on Debian and Fedora, all the tests with the work queue hang,
>> and after reverting this commit, no hang occurs.
>>
>> Until there are new test results, it is reasonable to suspect that this
>> commit causes the hang.
> 
> If the hang *always* occurred, then I would agree that you are correct,

Regarding the hang tests, please read through the whole discussion. Several 
engineers ran tests on Debian, Fedora and Ubuntu to confirm these test 
results.

Zhu Yanjun

> but this one happens only occasionally. It is also natural to think that
> the commit makes it easier to hit the condition of an existing bug.
> 
>>
>>>
>>>> Because the work queue sleeps, the ULP cannot wait a long time for the
>>>> packets. If packets cannot reach the ULPs for a long time, many problems
>>>> will occur in the ULPs.
>>> I wonder where in the rxe driver it sleeps. BTW, most packets are
>>> processed in NET_RX_IRQ context, and the work queue is scheduled only
>>
>> Do you mean NET_RX_SOFTIRQ?
> 
> Yes. I am sorry for confusing you.
> 
> Thanks,
> Daisuke
> 
>>
>> Zhu Yanjun
>>
>>> when there is already a running context. If your speculation is correct,
>>> the hang will occur more frequently if we change it to use the work queue exclusively.
>>> My ODP patches include a change to do this.
>>> Cf.
>>> https://lore.kernel.org/lkml/7699a90bc4af10c33c0a46ef6330ed4bb7e7ace6.1694153251.git.matsuda-daisuke@fujitsu.com/
>>>
>>> Thanks,
>>> Daisuke
>>>
>>>>> My main motivation is making Lustre run on rxe and it does and it's fast enough to meet our needs.
>>>>> Lustre is similar to srp as a ULP and in all of our testing we have never seen a similar hang. Other
>>>>> hangs to be sure but not this one. I believe that this bug will never get resolved until someone with
>>>>> a good understanding of the ulp drivers makes an effort to find out where and why the hang is occurring.
>>>>>   From there it should be straight forward to fix the problem. I am continuing to investigate and am learning
>>>>> the device-manager/multipath/srp/scsi stack but I have a long ways to go.
>>>>>
>>>>> Bob
>>>>>
>>>>>
>>>>>>
>>>>>> Jason && Leon, please comment on this.
>>>>>>
>>>>>>
>>>>>> Best Regards,
>>>>>>
>>>>>> Zhu Yanjun
>>>>>>
>>>>>>> Bob


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-25 19:57                                                 ` Bob Pearson
  2023-09-25 20:33                                                   ` Bart Van Assche
@ 2023-09-26 15:36                                                   ` Rain River
  1 sibling, 0 replies; 87+ messages in thread
From: Rain River @ 2023-09-26 15:36 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Bart Van Assche, Daisuke Matsuda (Fujitsu),
	Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On Tue, Sep 26, 2023 at 3:57 AM Bob Pearson <rpearsonhpe@gmail.com> wrote:
>
> On 9/25/23 10:00, Bart Van Assche wrote:
> > On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
> >> As Bob wrote above, nobody has found any logical failure in rxe
> >> driver.
> >
> > That's wrong. In case you would not yet have noticed my latest email in
> > this thread, please take a look at
> > https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37b042ae4f8e9b024f1871a73. I think the report in that email is a 100% proof that there is a use-after-free issue in the rdma_rxe driver. Use-after-free issues have security implications and also can cause data corruption. I propose to revert the commit that introduced the rdma_rxe use-after-free unless someone comes up with a fix for the rdma_rxe driver.
> >
> > Bart.
>
> Bart,
>
> Having trouble following your recipe. The git repo you mention does not seem to be available. E.g.
>
> rpearson:src$ git clone git://git.kernel.org/pub/scm/linux/git/rafael/linux-pm
> Cloning into 'linux-pm'...
> fatal: remote error: access denied or repository not exported: /pub/scm/linux/git/rafael/linux-pm
>
> I am not sure how to obtain the tag if I cannot see the repo.
>
> If I just try to enable KASAN by setting CONFIG_KASAN=y in .config for the current linux-rdma repo
> and compile the kernel the kernel won't boot and is caught in some kind of SRSO hell. If I checkout
> Linus' v6.4 tag and add CONFIG_KASAN=y to a fresh .config file the kernel builds OK but when I
> try to boot it, it is unable to chroot to the root file system in boot.

Bob,

At the suggestion of a friend who is an expert in process scheduling and
workqueues, I made the following test: on each CPU, a CPU-intensive
process runs with high priority. When rxe runs with the commit in that
scenario, rping barely works at all. Without this commit, rping works
with rxe in the same scenario. Please take this into account when you
fix the problem.

>
> Any hints would be appreciated.
>
> Bob
>

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-22 11:06 ` Linux regression tracking #adding (Thorsten Leemhuis)
@ 2023-10-13 12:51   ` Linux regression tracking #update (Thorsten Leemhuis)
  0 siblings, 0 replies; 87+ messages in thread
From: Linux regression tracking #update (Thorsten Leemhuis) @ 2023-10-13 12:51 UTC (permalink / raw)
  To: linux-rdma, linux-scsi; +Cc: Linux kernel regressions list

[TLDR: This mail is primarily relevant for Linux kernel regression
tracking. See the link in the footer if these mails annoy you.]

On 22.09.23 13:06, Linux regression tracking #adding (Thorsten Leemhuis)
wrote:
> On 21.08.23 08:46, Shinichiro Kawasaki wrote:
>> I observed a process hang at the blktests test case srp/002 occasionally, using
>> kernel v6.5-rcX. The kernel reported stalls of many kworkers [1]. PID 2757 hung at
>> inode_sleep_on_writeback(). Other kworkers hung at __inode_wait_for_writeback().
>>
>> The hang is reproduced reliably by repeating the test case srp/002 (from
>> 15 to 30 times).
>>
>> I bisected and found the commit 9b4b7c1f9f54 ("RDMA/rxe: Add workqueue support
>> for rxe tasks") looks like the trigger commit. When I revert it from the kernel
>> v6.5-rc7, the hang symptom disappears. I'm not sure how the commit relates to
>> the hang. Comments will be welcomed.
>> […]
> 
> Thanks for the report. To be sure the issue doesn't fall through the
> cracks unnoticed, I'm adding it to regzbot, the Linux kernel regression
> tracking bot:
> 
> #regzbot ^introduced 9b4b7c1f9f54
> #regzbot title RDMA/rxe: occasional process hang at the blktests test
> case srp/002
> #regzbot ignore-activity

#regzbot monitor:
https://lore.kernel.org/all/20230922163231.2237811-1-yanjun.zhu@intel.com/
#regzbot ignore-activity

Ciao, Thorsten (wearing his 'the Linux kernel's regression tracker' hat)
--
Everything you wanna know about Linux kernel regression tracking:
https://linux-regtracking.leemhuis.info/about/#tldr
That page also explains what to do if mails like this annoy you.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-09-26  1:17                                                 ` Daisuke Matsuda (Fujitsu)
@ 2023-10-17 17:09                                                   ` Bob Pearson
  2023-10-17 17:13                                                     ` Bart Van Assche
                                                                       ` (2 more replies)
  0 siblings, 3 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 17:09 UTC (permalink / raw)
  To: Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

[-- Attachment #1: Type: text/plain, Size: 2793 bytes --]

On 9/25/23 20:17, Daisuke Matsuda (Fujitsu) wrote:
> On Tue, Sep 26, 2023 12:01 AM Bart Van Assche:
>> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
>>> As Bob wrote above, nobody has found any logical failure in rxe
>>> driver.
>>
>> That's wrong. In case you would not yet have noticed my latest email in
>> this thread, please take a look at
>> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37
>> b042ae4f8e9b024f1871a73.
>> I think the report in that email is a 100% proof that there is a
>> use-after-free issue in the rdma_rxe driver. Use-after-free issues have
>> security implications and also can cause data corruption. I propose to
>> revert the commit that introduced the rdma_rxe use-after-free unless
>> someone comes up with a fix for the rdma_rxe driver.
>>
>> Bart.
> 
> Thank you for the clarification. I see your intention.
> I hope the hang issue will be resolved by addressing this.
> 
> Thanks,
> Daisuke
> 

I have made some progress in understanding the cause of the srp/002 etc. hang.

The two attached files are traces of activity for two qp's, qp#151 and qp#167. In my runs of srp/002,
all the qp's before qp#167 pass and all the qp's from qp#167 onward fail; qp#167 is the first to fail.

It turns out that all the passing qp's call srp_post_send() some number of times and also call
srp_send_done() the same number of times. Starting at qp#167, the last call to srp_send_done() does
not take place, leaving the srp driver waiting for the final completion and, I believe, causing the hang.

There are four cq's involved in each pair of qp's in the srp test: two in ib_srp and two in ib_srpt
for the two qp's. Three of them execute completion processing in a soft-irq context, so the code in
core/cq.c gathers the completions and calls back into the srp drivers. The send-side cq in srp uses
direct polling, which requires srp to call ib_process_cq_direct() in order to collect the completions.
This happens in __srp_get_tx_iu(), which is called in several places in the srp driver, but only as a
side effect, since the purpose of that routine is to get an iu to start a new command.

In the attached file for qp#151, the final call to srp_post_send() is followed by the rxe requester and
completer work queues processing the send packet and the ack, before a final call to __srp_get_tx_iu()
which gathers the final send-side completion: success.

For qp#167, the call to srp_post_send() is followed by the rxe driver processing the send operation and
generating a work completion, which is posted to the send cq, but there is never a following call to
__srp_get_tx_iu(), so the cqe is never received by srp: failure.

I don't yet understand the logic of the srp driver well enough to fix this, but as far as I can tell
the problem is not in the rxe driver.

Bob

[-- Attachment #2: out151 --]
[-- Type: text/plain, Size: 16249 bytes --]

[  184.877132] qp#151: create_qp
[  184.892362] qp#151: modify_qp: INIT
[  184.892385] qp#151: modify_qp: RTR
[  184.892390] qp#151: modify_qp: RTS
[  184.892722] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  184.893208] ib_srp: qp#151: post-recv:
[  184.893212] ib_srp: qp#151: post-recv:
[  184.893215] ib_srp: qp#151: post-recv:
[  184.893218] ib_srp: qp#151: post-recv:
[  184.893220] ib_srp: qp#151: post-recv:
[  184.893223] ib_srp: qp#151: post-recv:
[  184.893226] ib_srp: qp#151: post-recv:
[  184.893228] ib_srp: qp#151: post-recv:
[  184.893231] ib_srp: qp#151: post-recv:
[  184.893234] ib_srp: qp#151: post-recv:
[  184.893236] ib_srp: qp#151: post-recv:
[  184.893239] ib_srp: qp#151: post-recv:
[  184.893242] ib_srp: qp#151: post-recv:
[  184.893244] ib_srp: qp#151: post-recv:
[  184.893247] ib_srp: qp#151: post-recv:
[  184.893249] ib_srp: qp#151: post-recv:
[  184.893252] ib_srp: qp#151: post-recv:
[  184.893255] ib_srp: qp#151: post-recv:
[  184.893257] ib_srp: qp#151: post-recv:
[  184.893260] ib_srp: qp#151: post-recv:
[  184.893263] ib_srp: qp#151: post-recv:
[  184.893265] ib_srp: qp#151: post-recv:
[  184.893268] ib_srp: qp#151: post-recv:
[  184.893270] ib_srp: qp#151: post-recv:
[  184.893273] ib_srp: qp#151: post-recv:
[  184.893276] ib_srp: qp#151: post-recv:
[  184.893278] ib_srp: qp#151: post-recv:
[  184.893281] ib_srp: qp#151: post-recv:
[  184.893284] ib_srp: qp#151: post-recv:
[  184.893286] ib_srp: qp#151: post-recv:
[  184.893289] ib_srp: qp#151: post-recv:
[  184.893291] ib_srp: qp#151: post-recv:
[  184.893294] ib_srp: qp#151: post-recv:
[  184.893297] ib_srp: qp#151: post-recv:
[  184.893299] ib_srp: qp#151: post-recv:
[  184.893302] ib_srp: qp#151: post-recv:
[  184.893304] ib_srp: qp#151: post-recv:
[  184.893307] ib_srp: qp#151: post-recv:
[  184.893310] ib_srp: qp#151: post-recv:
[  184.893312] ib_srp: qp#151: post-recv:
[  184.893315] ib_srp: qp#151: post-recv:
[  184.893318] ib_srp: qp#151: post-recv:
[  184.893320] ib_srp: qp#151: post-recv:
[  184.893323] ib_srp: qp#151: post-recv:
[  184.893325] ib_srp: qp#151: post-recv:
[  184.893328] ib_srp: qp#151: post-recv:
[  184.893331] ib_srp: qp#151: post-recv:
[  184.893333] ib_srp: qp#151: post-recv:
[  184.893336] ib_srp: qp#151: post-recv:
[  184.893339] ib_srp: qp#151: post-recv:
[  184.893341] ib_srp: qp#151: post-recv:
[  184.893344] ib_srp: qp#151: post-recv:
[  184.893346] ib_srp: qp#151: post-recv:
[  184.893349] ib_srp: qp#151: post-recv:
[  184.893352] ib_srp: qp#151: post-recv:
[  184.893354] ib_srp: qp#151: post-recv:
[  184.893357] ib_srp: qp#151: post-recv:
[  184.893360] ib_srp: qp#151: post-recv:
[  184.893362] ib_srp: qp#151: post-recv:
[  184.893365] ib_srp: qp#151: post-recv:
[  184.893367] ib_srp: qp#151: post-recv:
[  184.893370] ib_srp: qp#151: post-recv:
[  184.893373] ib_srp: qp#151: post-recv:
[  184.893375] ib_srp: qp#151: post-recv:
[  185.127720] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.127760] ib_srp: qp#151: post-reg_mr: 0x207820
[  185.127767] ib_srp: qp#151: post-send:
[  185.127792] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.127805] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.127984] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.127996] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.128182] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.128232] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.128241] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.128254] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.128302] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.128317] ib_srp: qp#151: post-inv_rkey: 0x207820
[  185.128323] ib_srp: qp#151: post-recv:
[  185.128336] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.128388] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.128409] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.128439] ib_srp: qp#151: post-reg_mr: 0x207821
[  185.128446] ib_srp: qp#151: post-send:
[  185.128446] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.128459] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.128548] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.128556] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.128692] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.128736] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.128745] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.128756] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.128769] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.128788] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.128810] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.128821] ib_srp: qp#151: post-reg_mr: 0x20798f
[  185.128821] ib_srp: qp#151: post-inv_rkey: 0x207821
[  185.128828] ib_srp: qp#151: post-send:
[  185.128844] ib_srp: qp#151: post-recv:
[  185.128868] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.128883] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.128891] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.128908] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.128917] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.128921] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.128932] ib_srp: qp#151: post-reg_mr: 0x207822
[  185.128939] ib_srp: qp#151: post-send:
[  185.128946] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.128960] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.129019] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.129026] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.129063] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.129070] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.129285] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.129332] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.129341] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.129352] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.129380] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.129399] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.129404] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.129438] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.129448] ib_srp: qp#151: post-inv_rkey: 0x20798f
[  185.129455] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.129461] ib_srp: qp#151: post-recv:
[  185.129465] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.129471] ib_srp: qp#151: post-inv_rkey: 0x207822
[  185.129475] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.129491] ib_srp: qp#151: post-recv:
[  185.129505] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.129530] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.129535] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.129560] ib_srp: qp#151: post-reg_mr: 0x207823
[  185.129568] ib_srp: qp#151: post-send:
[  185.129570] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.129587] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.129664] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.129672] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.129841] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.129908] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.129917] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.129929] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.129995] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.130010] ib_srp: qp#151: post-inv_rkey: 0x207823
[  185.130034] ib_srp: qp#151: post-recv:
[  185.130035] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.134544] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.134566] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.134593] ib_srp: qp#151: post-reg_mr: 0x207824
[  185.134598] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.134599] ib_srp: qp#151: post-send:
[  185.134618] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.134701] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.134712] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.134845] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.134882] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.134890] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.134898] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.134936] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.134947] ib_srp: qp#151: post-inv_rkey: 0x207824
[  185.134956] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.134961] ib_srp: qp#151: post-recv:
[  185.135496] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.135518] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.135545] ib_srp: qp#151: post-reg_mr: 0x207825
[  185.135552] ib_srp: qp#151: post-send:
[  185.135554] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.135573] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.135678] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.135688] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.135865] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.135913] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.135921] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.135933] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.135994] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.136010] ib_srp: qp#151: post-inv_rkey: 0x207825
[  185.136014] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.136028] ib_srp: qp#151: post-recv:
[  185.141233] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.141260] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.141292] ib_srp: qp#151: post-reg_mr: 0x207826
[  185.141301] ib_srp: qp#151: post-send:
[  185.141302] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.141319] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.141424] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.141428] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.141600] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.141648] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.141663] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.141675] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.141738] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.141754] ib_srp: qp#151: post-inv_rkey: 0x207826
[  185.141762] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.141772] ib_srp: qp#151: post-recv:
[  185.141820] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.141842] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.141882] ib_srp: qp#151: post-reg_mr: 0x207827
[  185.141889] ib_srp: qp#151: post-send:
[  185.141893] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.141909] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.142020] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.142027] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.142199] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.142246] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.142256] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.142271] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.142337] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.142354] ib_srp: qp#151: post-inv_rkey: 0x207827
[  185.142362] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.142370] ib_srp: qp#151: post-recv:
[  185.142463] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.142486] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.142514] ib_srp: qp#151: post-reg_mr: 0x207828
[  185.142523] ib_srp: qp#151: post-send:
[  185.142523] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.142541] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.142657] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.142664] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.142832] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.142880] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.142891] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.142902] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.142962] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.142978] ib_srp: qp#151: post-inv_rkey: 0x207828
[  185.142985] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.142992] ib_srp: qp#151: post-recv:
[  185.143041] ib_srp: qp#151: __srp_get_tx_iu to ib_process_cq_direct
[  185.143062] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  185.143087] ib_srp: qp#151: post-reg_mr: 0x207829
[  185.143093] ib_srp: qp#151: post-send:
[  185.143095] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  185.143111] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  185.143215] enp6s0_rxe: qp#151 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  185.143220] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 0, len: 64
[  185.143393] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  185.143441] enp6s0_rxe: qp#151 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  185.143450] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 128, status: 0, len: 36
[  185.143462] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified
[  185.143530] ib_srp: qp#151: recv-done: opcode: 128 status: 0: len: 36
[  185.143547] ib_srp: qp#151: post-inv_rkey: 0x207829
[  185.143556] enp6s0_rxe: qp#151 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  185.143566] ib_srp: qp#151: post-recv:
[  192.873115] ib_srp: qp#151: srp_destroy_qp to ib_process_cq_direct
[  192.873142] ib_srp: qp#151: send-done: opcode: 0 status: 0: len: 64
[  192.873226] enp6s0_rxe: qp#151 rxe_cq_post: cq#161 opcode: 0, status: 5, len: 0
[  192.973687] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 opcode: 0, status: 5, len: 0
[  192.973701] enp6s0_rxe: qp#151 rxe_cq_post: cq#160 notified

[-- Attachment #3: out167 --]
[-- Type: text/plain, Size: 13466 bytes --]

[  195.843870] qp#167: create_qp
[  195.858393] qp#167: modify_qp: INIT
[  195.858402] qp#167: modify_qp: RTR
[  195.858406] qp#167: modify_qp: RTS
[  195.858656] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  195.859199] ib_srp: qp#167: post-recv:
[  195.859203] ib_srp: qp#167: post-recv:
[  195.859205] ib_srp: qp#167: post-recv:
[  195.859208] ib_srp: qp#167: post-recv:
[  195.859210] ib_srp: qp#167: post-recv:
[  195.859213] ib_srp: qp#167: post-recv:
[  195.859216] ib_srp: qp#167: post-recv:
[  195.859218] ib_srp: qp#167: post-recv:
[  195.859221] ib_srp: qp#167: post-recv:
[  195.859224] ib_srp: qp#167: post-recv:
[  195.859226] ib_srp: qp#167: post-recv:
[  195.859229] ib_srp: qp#167: post-recv:
[  195.859231] ib_srp: qp#167: post-recv:
[  195.859234] ib_srp: qp#167: post-recv:
[  195.859237] ib_srp: qp#167: post-recv:
[  195.859239] ib_srp: qp#167: post-recv:
[  195.859242] ib_srp: qp#167: post-recv:
[  195.859244] ib_srp: qp#167: post-recv:
[  195.859247] ib_srp: qp#167: post-recv:
[  195.859250] ib_srp: qp#167: post-recv:
[  195.859253] ib_srp: qp#167: post-recv:
[  195.859255] ib_srp: qp#167: post-recv:
[  195.859258] ib_srp: qp#167: post-recv:
[  195.859260] ib_srp: qp#167: post-recv:
[  195.859263] ib_srp: qp#167: post-recv:
[  195.859266] ib_srp: qp#167: post-recv:
[  195.859268] ib_srp: qp#167: post-recv:
[  195.859271] ib_srp: qp#167: post-recv:
[  195.859274] ib_srp: qp#167: post-recv:
[  195.859276] ib_srp: qp#167: post-recv:
[  195.859279] ib_srp: qp#167: post-recv:
[  195.859281] ib_srp: qp#167: post-recv:
[  195.859284] ib_srp: qp#167: post-recv:
[  195.859287] ib_srp: qp#167: post-recv:
[  195.859289] ib_srp: qp#167: post-recv:
[  195.859292] ib_srp: qp#167: post-recv:
[  195.859294] ib_srp: qp#167: post-recv:
[  195.859297] ib_srp: qp#167: post-recv:
[  195.859300] ib_srp: qp#167: post-recv:
[  195.859303] ib_srp: qp#167: post-recv:
[  195.859306] ib_srp: qp#167: post-recv:
[  195.859308] ib_srp: qp#167: post-recv:
[  195.859311] ib_srp: qp#167: post-recv:
[  195.859313] ib_srp: qp#167: post-recv:
[  195.859316] ib_srp: qp#167: post-recv:
[  195.859319] ib_srp: qp#167: post-recv:
[  195.859321] ib_srp: qp#167: post-recv:
[  195.859324] ib_srp: qp#167: post-recv:
[  195.859326] ib_srp: qp#167: post-recv:
[  195.859329] ib_srp: qp#167: post-recv:
[  195.859332] ib_srp: qp#167: post-recv:
[  195.859334] ib_srp: qp#167: post-recv:
[  195.859337] ib_srp: qp#167: post-recv:
[  195.859339] ib_srp: qp#167: post-recv:
[  195.859342] ib_srp: qp#167: post-recv:
[  195.859345] ib_srp: qp#167: post-recv:
[  195.859348] ib_srp: qp#167: post-recv:
[  195.859350] ib_srp: qp#167: post-recv:
[  195.859353] ib_srp: qp#167: post-recv:
[  195.859356] ib_srp: qp#167: post-recv:
[  195.859358] ib_srp: qp#167: post-recv:
[  195.859361] ib_srp: qp#167: post-recv:
[  195.859364] ib_srp: qp#167: post-recv:
[  195.859366] ib_srp: qp#167: post-recv:
[  196.396284] ib_srp: qp#167: __srp_get_tx_iu to ib_process_cq_direct
[  196.396316] ib_srp: qp#167: post-reg_mr: 0x2458e7
[  196.396325] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  196.396327] ib_srp: qp#167: post-send:
[  196.396338] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  196.396360] ib_srp: qp#167: __srp_get_tx_iu to ib_process_cq_direct
[  196.396383] ib_srp: qp#167: post-reg_mr: 0x245923
[  196.396391] ib_srp: qp#167: post-send:
[  196.396455] ib_srp: qp#167: __srp_get_tx_iu to ib_process_cq_direct
[  196.396478] ib_srp: qp#167: post-reg_mr: 0x245a7d
[  196.396484] ib_srp: qp#167: post-send:
[  196.396590] ib_srp: qp#167: __srp_get_tx_iu to ib_process_cq_direct
[  196.396615] ib_srp: qp#167: post-reg_mr: 0x245b46
[  196.396621] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  196.396624] ib_srp: qp#167: post-send:
[  196.396629] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  196.396645] enp6s0_rxe: qp#167 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  196.396661] enp6s0_rxe: qp#167 rxe_cq_post: cq#177 opcode: 0, status: 0, len: 64
[  196.396662] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  196.396670] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  196.396694] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  196.396702] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_ONLY
[  196.396709] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  196.396733] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  196.396738] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 opcode: 128, status: 0, len: 36
[  196.396746] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 notified
[  196.396760] enp6s0_rxe: qp#167 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  196.396770] enp6s0_rxe: qp#167 rxe_cq_post: cq#177 opcode: 0, status: 0, len: 64
[  196.396783] ib_srp: qp#167: recv-done: opcode: 128 status: 0: len: 36
[  196.396796] ib_srp: qp#167: post-inv_rkey: 0x2458e7
[  196.396798] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.396798] enp6s0_rxe: qp#167 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  196.396804] ib_srp: qp#167: post-recv:
[  196.396813] ib_srp: qp#167: __srp_get_tx_iu to ib_process_cq_direct
[  196.396815] enp6s0_rxe: qp#167 rxe_cq_post: cq#177 opcode: 0, status: 0, len: 64
[  196.396845] ib_srp: qp#167: send-done: opcode: 0 status: 0: len: 64
[  196.396845] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.396850] ib_srp: qp#167: send-done: opcode: 0 status: 0: len: 64
[  196.396855] ib_srp: qp#167: send-done: opcode: 0 status: 0: len: 64
[  196.396860] enp6s0_rxe: qp#167 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  196.396867] enp6s0_rxe: qp#167 rxe_cq_post: cq#177 opcode: 0, status: 0, len: 64
[  196.396886] ib_srp: qp#167: post-reg_mr: 0x2458e8
[  196.396892] ib_srp: qp#167: post-send:
[  196.396898] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.396910] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_REG_MR, length: 0, resid: 0
[  196.396917] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_SEND, length: 64, resid: 64
[  196.397354] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_FIRST
[  196.397373] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_LAST
[  196.397418] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  196.397429] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 opcode: 128, status: 0, len: 36
[  196.397440] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 notified
[  196.397490] ib_srp: qp#167: recv-done: opcode: 128 status: 0: len: 36
[  196.397513] ib_srp: qp#167: post-inv_rkey: 0x245923
[  196.397534] ib_srp: qp#167: post-recv:
[  196.397576] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_FIRST
[  196.397597] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397615] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397633] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397653] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397668] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_LAST
[  196.397677] enp6s0_rxe: qp#167 rxe_completer: pkt: opcode = IB_OPCODE_RC_ACKNOWLEDGE
[  196.397685] enp6s0_rxe: qp#167 rxe_cq_post: cq#177 opcode: 0, status: 0, len: 64
[  196.397705] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  196.397714] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 opcode: 128, status: 0, len: 36
[  196.397720] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 notified
[  196.397754] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_FIRST
[  196.397771] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397789] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397806] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397827] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397845] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397864] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397883] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397903] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397921] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397925] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.397938] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397957] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397974] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.397991] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398029] ib_srp: qp#167: recv-done: opcode: 128 status: 0: len: 36
[  196.398031] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_LAST
[  196.398052] ib_srp: qp#167: post-inv_rkey: 0x245a7d
[  196.398077] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  196.398079] ib_srp: qp#167: post-recv:
[  196.398089] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 opcode: 128, status: 0, len: 36
[  196.398106] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.398113] ib_srp: qp#167: recv-done: opcode: 128 status: 0: len: 36
[  196.398121] ib_srp: qp#167: post-inv_rkey: 0x245b46
[  196.398122] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.398127] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_FIRST
[  196.398145] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398155] ib_srp: qp#167: post-recv:
[  196.398164] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398185] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398205] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398228] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398245] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398261] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398296] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398334] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398355] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398390] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398411] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398452] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398470] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398505] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398520] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398577] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398594] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398612] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398630] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398676] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398696] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398737] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398756] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398798] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398816] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398853] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398889] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398909] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_MIDDLE
[  196.398947] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_RDMA_WRITE_LAST
[  196.398994] enp6s0_rxe: qp#167 rxe_responder: pkt: opcode = IB_OPCODE_RC_SEND_ONLY
[  196.399001] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 opcode: 128, status: 0, len: 36
[  196.399014] enp6s0_rxe: qp#167 rxe_cq_post: cq#176 notified
[  196.399087] ib_srp: qp#167: recv-done: opcode: 128 status: 0: len: 36
[  196.399107] ib_srp: qp#167: post-inv_rkey: 0x2458e8
[  196.399114] enp6s0_rxe: qp#167 rxe_requester: wqe: IB_WR_LOCAL_INV, length: 0, resid: 0
[  196.399139] ib_srp: qp#167: post-recv:

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:09                                                   ` Bob Pearson
@ 2023-10-17 17:13                                                     ` Bart Van Assche
  2023-10-17 17:15                                                       ` Bob Pearson
  2023-10-17 17:19                                                       ` Bob Pearson
  2023-10-17 17:58                                                     ` Jason Gunthorpe
  2023-10-18  8:16                                                     ` Zhu Yanjun
  2 siblings, 2 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 17:13 UTC (permalink / raw)
  To: Bob Pearson, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi


On 10/17/23 10:09, Bob Pearson wrote:
> I don't yet understand the logic of the srp driver to fix this but
> the problem is not in the rxe driver as far as I can tell.
Is there any information available that supports this conclusion? I
think the KASAN output that I shared shows that there is an issue in
the RXE driver.

Thanks,

Bart.



* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:13                                                     ` Bart Van Assche
@ 2023-10-17 17:15                                                       ` Bob Pearson
  2023-10-17 17:19                                                       ` Bob Pearson
  1 sibling, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 17:15 UTC (permalink / raw)
  To: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 10/17/23 12:13, Bart Van Assche wrote:
> 
> On 10/17/23 10:09, Bob Pearson wrote:
>> I don't yet understand the logic of the srp driver to fix this but
>> the problem is not in the rxe driver as far as I can tell.
> Is there any information available that supports this conclusion? I
> think the KASAN output that I shared shows that there is an issue in
> the RXE driver.
> 
> Thanks,
> 
> Bart.
> 

Bart,

I have seen hundreds of hangs. I have never seen a KASAN warning, and KASAN is configured in my kernel.

Bob


* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:13                                                     ` Bart Van Assche
  2023-10-17 17:15                                                       ` Bob Pearson
@ 2023-10-17 17:19                                                       ` Bob Pearson
  2023-10-17 17:34                                                         ` Bart Van Assche
  1 sibling, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 17:19 UTC (permalink / raw)
  To: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 10/17/23 12:13, Bart Van Assche wrote:
> 
> On 10/17/23 10:09, Bob Pearson wrote:
>> I don't yet understand the logic of the srp driver to fix this but
>> the problem is not in the rxe driver as far as I can tell.
> Is there any information available that supports this conclusion? I
> think the KASAN output that I shared shows that there is an issue in
> the RXE driver.
> 
> Thanks,
> 
> Bart.
> 

I should have mentioned that the last set of tests in srp/002 issues much
longer writes than the earlier ones, and these writes require a lot more
processing and thus more time. My belief is that the completion logic in
srp is faulty: it works when the underlying transport is fast, but not when
it is slow.

Bob


* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:19                                                       ` Bob Pearson
@ 2023-10-17 17:34                                                         ` Bart Van Assche
  0 siblings, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 17:34 UTC (permalink / raw)
  To: Bob Pearson, Daisuke Matsuda (Fujitsu), 'Rain River'
  Cc: Zhu Yanjun, Jason Gunthorpe, leon, Shinichiro Kawasaki,
	RDMA mailing list, linux-scsi

On 10/17/23 10:19, Bob Pearson wrote:
> I should have mentioned that the last set of tests in srp/002 issues much
> longer writes than the earlier ones, and these writes require a lot more
> processing and thus more time. My belief is that the completion logic in
> srp is faulty: it works when the underlying transport is fast, but not
> when it is slow.

There are no known issues in the SRP driver. If there would be any
issues in that driver, I think these would also show up in tests with
the siw (Soft-iWARP) driver.

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:09                                                   ` Bob Pearson
  2023-10-17 17:13                                                     ` Bart Van Assche
@ 2023-10-17 17:58                                                     ` Jason Gunthorpe
  2023-10-17 18:44                                                       ` Bob Pearson
  2023-10-17 19:18                                                       ` Bart Van Assche
  2023-10-18  8:16                                                     ` Zhu Yanjun
  2 siblings, 2 replies; 87+ messages in thread
From: Jason Gunthorpe @ 2023-10-17 17:58 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On Tue, Oct 17, 2023 at 12:09:31PM -0500, Bob Pearson wrote:

 
> For qp#167 the call to srp_post_send() is followed by the rxe driver
> processing the send operation and generating a work completion which
> is posted to the send cq but there is never a following call to
> __srp_get_rx_iu() so the cqe is not received by srp and failure.

? I don't see this function in the kernel?  __srp_get_tx_iu ?
 
> I don't yet understand the logic of the srp driver to fix this but
> the problem is not in the rxe driver as far as I can tell.

It looks to me like __srp_get_tx_iu() is following the design pattern
where the send queue is only polled when it needs to allocate a new
send buffer - ie the send buffers are pre-allocated and cycle through
the queue.

So, it is not surprising this isn't being called if it is hung - the
hang is probably something that is preventing it from even wanting to
send, which is probably a receive side issue.

Following back up from that point to isolate the missing resource needed
to trigger the send may bring some more clarity.

Alternatively if __srp_get_tx_iu() is failing then perhaps you've run
into an issue where it hit something rare and recovery does not work.

eg this kind of design pattern carries a subtle assumption that the rx
and send CQ are ordered together. Getting a rx CQ before a matching tx
CQ can trigger the unusual scenario where the send side runs out of
resources.
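
The design pattern described here — preallocated send buffers whose
completions are reaped only when the next buffer is allocated — can be
sketched in userspace Python. The class and names below are illustrative
stand-ins for the ib_srp structures, not the real code:

```python
# Sketch (assumptions, not actual ib_srp code) of the poll-on-allocate
# pattern: send buffers ("iu"s) are preallocated, and the send CQ is
# reaped only when a new iu is needed.
from collections import deque

class SendPath:
    def __init__(self, num_iu=4):
        self.free_ius = list(range(num_iu))  # preallocated buffers
        self.send_cq = deque()               # unreaped send completions

    def post_send(self, iu):
        # The transport completes asynchronously; modeled as immediate here.
        self.send_cq.append(iu)

    def poll_send_cq(self):
        # Analogue of ib_process_cq_direct(): recycle completed iu's.
        while self.send_cq:
            self.free_ius.append(self.send_cq.popleft())

    def get_tx_iu(self):
        # Analogue of __srp_get_tx_iu(): poll only when the free list
        # is empty — completions sit unreaped until someone wants to send.
        if not self.free_ius:
            self.poll_send_cq()
        return self.free_ius.pop() if self.free_ius else None
```

The consequence visible in this thread follows directly from the sketch: if
nothing ever wants to send again, `poll_send_cq()` is never called and the
final completions sit in the CQ unreaped.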

Jason

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:58                                                     ` Jason Gunthorpe
@ 2023-10-17 18:44                                                       ` Bob Pearson
  2023-10-17 18:51                                                         ` Jason Gunthorpe
  2023-10-17 19:18                                                       ` Bart Van Assche
  1 sibling, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 18:44 UTC (permalink / raw)
  To: Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 12:58, Jason Gunthorpe wrote:
> On Tue, Oct 17, 2023 at 12:09:31PM -0500, Bob Pearson wrote:
> 
>  
>> For qp#167 the call to srp_post_send() is followed by the rxe driver
>> processing the send operation and generating a work completion which
>> is posted to the send cq but there is never a following call to
>> __srp_get_rx_iu() so the cqe is not received by srp and failure.
> 
> ? I don't see this function in the kernel?  __srp_get_tx_iu ?
>  
>> I don't yet understand the logic of the srp driver to fix this but
>> the problem is not in the rxe driver as far as I can tell.
> 
> It looks to me like __srp_get_tx_iu() is following the design pattern
> where the send queue is only polled when it needs to allocate a new
> send buffer - ie the send buffers are pre-allocated and cycle through
> the queue.
> 
> So, it is not surprising this isn't being called if it is hung - the
> hang is probably something that is preventing it from even wanting to
> send, which is probably a receive side issue.
> 
> Following back up from that point to isolate the missing resource needed
> to trigger the send may bring some more clarity.
> 
> Alternatively if __srp_get_tx_iu() is failing then perhaps you've run
> into an issue where it hit something rare and recovery does not work.
> 
> eg this kind of design pattern carries a subtle assumption that the rx
> and send CQ are ordered together. Getting a rx CQ before a matching tx
> CQ can trigger the unusual scenario where the send side runs out of
> resources.
> 
> Jason

In all the traces I have looked at the hang only occurs once the final
send side completions are not received. This happens when the srp
driver doesn't poll (i.e. call ib_process_cq_direct). The rest is
my conjecture. Since there are several (e.g. qp#167 through qp#211 (odd))
qp's with missing completions there are 23 iu's tied up when srp hangs.
Your suggestion makes sense as why the hang occurs. When the test
finishes the qp's are destroyed and the driver calls ib_process_cq_direct
again which cleans up the resources.

The problem is that there isn't any obvious way to find a thread related
to the missing cqe to poll for them. I think the best way to fix this is
to convert the send side cq handling to interrupt driven (as is the case
with the srpt driver.) The provider drivers have to run in any case to
convert cqe's to wc's so there isn't much penalty to call the cq
completion handler since there is already software running and then you
will get reliable delivery of completions.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 18:44                                                       ` Bob Pearson
@ 2023-10-17 18:51                                                         ` Jason Gunthorpe
  2023-10-17 19:55                                                           ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Jason Gunthorpe @ 2023-10-17 18:51 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On Tue, Oct 17, 2023 at 01:44:58PM -0500, Bob Pearson wrote:
> On 10/17/23 12:58, Jason Gunthorpe wrote:
> > On Tue, Oct 17, 2023 at 12:09:31PM -0500, Bob Pearson wrote:
> > 
> >  
> >> For qp#167 the call to srp_post_send() is followed by the rxe driver
> >> processing the send operation and generating a work completion which
> >> is posted to the send cq but there is never a following call to
> >> __srp_get_rx_iu() so the cqe is not received by srp and failure.
> > 
> > ? I don't see this function in the kernel?  __srp_get_tx_iu ?
> >  
> >> I don't yet understand the logic of the srp driver to fix this but
> >> the problem is not in the rxe driver as far as I can tell.
> > 
> > It looks to me like __srp_get_tx_iu() is following the design pattern
> > where the send queue is only polled when it needs to allocate a new
> > send buffer - ie the send buffers are pre-allocated and cycle through
> > the queue.
> > 
> > So, it is not surprising this isn't being called if it is hung - the
> > hang is probably something that is preventing it from even wanting to
> > send, which is probably a receive side issue.
> > 
> > Following back up from that point to isolate the missing resource needed
> > to trigger the send may bring some more clarity.
> > 
> > Alternatively if __srp_get_tx_iu() is failing then perhaps you've run
> > into an issue where it hit something rare and recovery does not work.
> > 
> > eg this kind of design pattern carries a subtle assumption that the rx
> > and send CQ are ordered together. Getting a rx CQ before a matching tx
> > CQ can trigger the unusual scenario where the send side runs out of
> > resources.
> > 
> > Jason
> 
> In all the traces I have looked at the hang only occurs once the final
> send side completions are not received. This happens when the srp
> driver doesn't poll (i.e. call ib_process_cq_direct). The rest is
> my conjecture. Since there are several (e.g. qp#167 through qp#211 (odd))
> qp's with missing completions there are 23 iu's tied up when srp hangs.
> Your suggestion makes sense as why the hang occurs. When the test
> finishes the qp's are destroyed and the driver calls ib_process_cq_direct
> again which cleans up the resources.
> 
> The problem is that there isn't any obvious way to find a thread related
> to the missing cqe to poll for them. I think the best way to fix this is
> to convert the send side cq handling to interrupt driven (as is the case
> with the srpt driver.) The provider drivers have to run in any case to
> convert cqe's to wc's so there isn't much penalty to call the cq
> completion handler since there is already software running and then you
> will get reliable delivery of completions.

Can you add tracing to show that SRP is running out of SQ resources,
ie __srp_get_tx_iu() fails and that is a precondition for the hang?

I am fully willing to believe that is never tested.

Otherwise if srp thinks it has SQ resources then the SQ is probably
not the cause of the hang.

Jason

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:58                                                     ` Jason Gunthorpe
  2023-10-17 18:44                                                       ` Bob Pearson
@ 2023-10-17 19:18                                                       ` Bart Van Assche
  1 sibling, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 19:18 UTC (permalink / raw)
  To: Jason Gunthorpe, Bob Pearson
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 10:58, Jason Gunthorpe wrote:
> eg this kind of design pattern carries a subtle assumption that the rx
> and send CQ are ordered together. Getting a rx CQ before a matching tx
> CQ can trigger the unusual scenario where the send side runs out of
> resources.

If an rx CQ is received before the matching tx CQ by srp_queuecommand(),
then srp_queuecommand() will return SCSI_MLQUEUE_HOST_BUSY and the SCSI
core will retry the srp_queuecommand() after a small delay. This is a
common approach in Linux kernel SCSI drivers.
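
The busy/retry interplay described above can be sketched as follows;
everything here is a hedged stand-in (a stub channel, simplified return
values), not the real SCSI core API:

```python
# Sketch of the retry path: if queuecommand cannot get a tx iu it returns
# a host-busy status and the SCSI core retries the command later. Names
# and classes are illustrative, not the real kernel interfaces.
SCSI_MLQUEUE_HOST_BUSY = "host_busy"
DISPATCHED = "dispatched"

class StubChannel:
    """Stand-in for an SRP channel with a fixed pool of tx iu's."""
    def __init__(self, num_iu):
        self.free_tx = list(range(num_iu))
        self.in_flight = []

    def get_tx_iu(self):
        return self.free_tx.pop() if self.free_tx else None

    def complete_one(self):
        # Models a send completion recycling its iu.
        if self.in_flight:
            self.free_tx.append(self.in_flight.pop())

def queuecommand(ch):
    iu = ch.get_tx_iu()
    if iu is None:
        return SCSI_MLQUEUE_HOST_BUSY   # midlayer will retry later
    ch.in_flight.append(iu)
    return DISPATCHED

def dispatch_with_retry(ch, max_tries):
    # Models the SCSI core retrying a busy host; the real core also
    # inserts a small delay between attempts.
    for _ in range(max_tries):
        if queuecommand(ch) == DISPATCHED:
            return True
        ch.complete_one()   # progress elsewhere frees an iu
    return False
```

The point of the sketch is that a momentary iu shortage is recoverable by
design: the command is not lost, it is simply requeued until a buffer frees
up.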

Thanks,

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 18:51                                                         ` Jason Gunthorpe
@ 2023-10-17 19:55                                                           ` Bob Pearson
  2023-10-17 20:06                                                             ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 19:55 UTC (permalink / raw)
  To: Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 13:51, Jason Gunthorpe wrote:
> On Tue, Oct 17, 2023 at 01:44:58PM -0500, Bob Pearson wrote:
>> On 10/17/23 12:58, Jason Gunthorpe wrote:
>>> On Tue, Oct 17, 2023 at 12:09:31PM -0500, Bob Pearson wrote:
>>>
>>>  
>>>> For qp#167 the call to srp_post_send() is followed by the rxe driver
>>>> processing the send operation and generating a work completion which
>>>> is posted to the send cq but there is never a following call to
>>>> __srp_get_rx_iu() so the cqe is not received by srp and failure.
>>>
>>> ? I don't see this function in the kernel?  __srp_get_tx_iu ?
>>>  
>>>> I don't yet understand the logic of the srp driver to fix this but
>>>> the problem is not in the rxe driver as far as I can tell.
>>>
>>> It looks to me like __srp_get_tx_iu() is following the design pattern
>>> where the send queue is only polled when it needs to allocate a new
>>> send buffer - ie the send buffers are pre-allocated and cycle through
>>> the queue.
>>>
>>> So, it is not surprising this isn't being called if it is hung - the
>>> hang is probably something that is preventing it from even wanting to
>>> send, which is probably a receive side issue.
>>>
>>> Following back up from that point to isolate the missing resource needed
>>> to trigger the send may bring some more clarity.
>>>
>>> Alternatively if __srp_get_tx_iu() is failing then perhaps you've run
>>> into an issue where it hit something rare and recovery does not work.
>>>
>>> eg this kind of design pattern carries a subtle assumption that the rx
>>> and send CQ are ordered together. Getting a rx CQ before a matching tx
>>> CQ can trigger the unusual scenario where the send side runs out of
>>> resources.
>>>
>>> Jason
>>
>> In all the traces I have looked at the hang only occurs once the final
>> send side completions are not received. This happens when the srp
>> driver doesn't poll (i.e. call ib_process_cq_direct). The rest is
>> my conjecture. Since there are several (e.g. qp#167 through qp#211 (odd))
>> qp's with missing completions there are 23 iu's tied up when srp hangs.
>> Your suggestion makes sense as why the hang occurs. When the test
>> finishes the qp's are destroyed and the driver calls ib_process_cq_direct
>> again which cleans up the resources.
>>
>> The problem is that there isn't any obvious way to find a thread related
>> to the missing cqe to poll for them. I think the best way to fix this is
>> to convert the send side cq handling to interrupt driven (as is the case
>> with the srpt driver.) The provider drivers have to run in any case to
>> convert cqe's to wc's so there isn't much penalty to call the cq
>> completion handler since there is already software running and then you
>> will get reliable delivery of completions.
> 
> Can you add tracing to show that SRP is running out of SQ resources,
> ie __srp_get_tx_iu() fails and that is a precondition for the hang?
> 
> I am fully willing to believe that is never tested.
> 
> Otherwise if srp thinks it has SQ resources then the SQ is probably
> not the cause of the hang.
> 
> Jason

Well.... the extra tracing did *not* show srp running out of iu's.
So I converted cq handling to IB_POLL_SOFTIRQ from IB_POLL_DIRECT.
This required adding a spinlock around list_add(&iu->list, ...) in 
srp_send_done(). The test now runs with all the completions handled
correctly. But, it still hangs. So a red herring.

The hunt continues.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 19:55                                                           ` Bob Pearson
@ 2023-10-17 20:06                                                             ` Bart Van Assche
  2023-10-17 20:13                                                               ` Bob Pearson
  2023-10-17 21:14                                                               ` Bob Pearson
  0 siblings, 2 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 20:06 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 12:55, Bob Pearson wrote:
> Well.... the extra tracing did *not* show srp running out of iu's.
> So I converted cq handling to IB_POLL_SOFTIRQ from IB_POLL_DIRECT.
> This required adding a spinlock around list_add(&iu->list, ...) in
> srp_send_done(). The test now runs with all the completions handled
> correctly. But, it still hangs. So a red herring.

iu->list manipulations are protected by ch->lock. See also the
lockdep_assert_held(&ch->lock) statements in the code that does
manipulate this list and that does not grab ch->lock directly.
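
A rough Python analogue of that locking convention, assuming a simplified
channel object (the real code uses lockdep_assert_held(), which checks that
the *current* task holds the lock, a stronger check than the one below):

```python
# Sketch of the "caller must hold the lock" convention: helpers that
# touch the iu free list do not take ch->lock themselves, they assert
# that the caller already holds it.
import threading

class Channel:
    def __init__(self):
        self.lock = threading.Lock()   # analogue of ch->lock
        self.free_tx = []              # analogue of the iu free list

    def put_tx_iu(self, iu):
        # Analogue of lockdep_assert_held(&ch->lock). Note that
        # threading.Lock.locked() only reports that *some* thread holds
        # the lock, so this is weaker than real lockdep.
        assert self.lock.locked(), "caller must hold ch->lock"
        self.free_tx.append(iu)
```

A caller is expected to wrap the helper in `with ch.lock:`; calling the
helper without the lock trips the assertion immediately instead of silently
corrupting the list.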

Thanks,

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 20:06                                                             ` Bart Van Assche
@ 2023-10-17 20:13                                                               ` Bob Pearson
  2023-10-17 21:14                                                               ` Bob Pearson
  1 sibling, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 20:13 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 15:06, Bart Van Assche wrote:
> On 10/17/23 12:55, Bob Pearson wrote:
>> Well.... the extra tracing did *not* show srp running out of iu's.
>> So I converted cq handling to IB_POLL_SOFTIRQ from IB_POLL_DIRECT.
>> This required adding a spinlock around list_add(&iu->list, ...) in
>> srp_send_done(). The test now runs with all the completions handled
>> correctly. But, it still hangs. So a red herring.
> 
> iu->list manipulations are protected by ch->lock. See also the
> lockdep_assert_held(&ch->lock) statements in the code that does
> manipulate this list and that does not grab ch->lock directly.
> 
> Thanks,
> 
> Bart.

Thanks, I saw that. I just added ch->lock locking around the list_add(). It
works as long as you don't call ib_process_cq_direct(), which was inside
the lock, and use IB_POLL_SOFTIRQ instead, which runs on its own thread.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 20:06                                                             ` Bart Van Assche
  2023-10-17 20:13                                                               ` Bob Pearson
@ 2023-10-17 21:14                                                               ` Bob Pearson
  2023-10-17 21:18                                                                 ` Bart Van Assche
  1 sibling, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 21:14 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 15:06, Bart Van Assche wrote:
> On 10/17/23 12:55, Bob Pearson wrote:
>> Well.... the extra tracing did *not* show srp running out of iu's.
>> So I converted cq handling to IB_POLL_SOFTIRQ from IB_POLL_DIRECT.
>> This required adding a spinlock around list_add(&iu->list, ...) in
>> srp_send_done(). The test now runs with all the completions handled
>> correctly. But, it still hangs. So a red herring.
> 
> iu->list manipulations are protected by ch->lock. See also the
> lockdep_assert_held(&ch->lock) statements in the code that does
> manipulate this list and that does not grab ch->lock directly.
> 
> Thanks,
> 
> Bart.

One more clue. When the test hangs, after 120 seconds there is a set
of hung task messages in the logs like:

[  408.844422] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[  408.844439] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[  408.844474] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[  408.844491] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[  408.844502] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[  605.106839] INFO: task kworker/1:0:25 blocked for more than 120 seconds.
[  605.106857]       Tainted: G    B      OE      6.6.0-rc3+ #10
[  605.106866] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[  605.106872] task:kworker/1:0     state:D stack:0     pid:25    ppid:2      flags:0x00004000
[  605.106887] Workqueue: dio/dm-5 iomap_dio_complete_work
[  605.106904] Call Trace:
[  605.106909]  <TASK>
[  605.106917]  ? __schedule+0x996/0x2c80
[  605.106929]  __schedule+0x9f6/0x2c80
[  605.106945]  ? lock_release+0xc1/0x6f0
[  605.106955]  ? rcu_is_watching+0x23/0x50
[  605.106970]  ? io_schedule_timeout+0xc0/0xc0
[  605.106981]  ? lock_contended+0x740/0x740
[  605.106989]  ? do_raw_spin_lock+0x1c0/0x1c0
[  605.106999]  ? lock_contended+0x740/0x740
[  605.107011]  ? _raw_spin_unlock_irq+0x27/0x60
[  605.107023]  ? trace_hardirqs_on+0x22/0x100
[  605.107037]  ? _raw_spin_unlock_irq+0x27/0x60
[  605.107052]  schedule+0x96/0x150
[  605.107063]  bit_wait+0x1c/0xa0
[  605.107074]  __wait_on_bit+0x42/0x110
[  605.107084]  ? bit_wait_io+0xa0/0xa0
[  605.107099]  __inode_wait_for_writeback+0x11b/0x190
[  605.107112]  ? inode_prepare_wbs_switch+0x160/0x160
[  605.107127]  ? swake_up_one+0xb0/0xb0
[  605.107147]  writeback_single_inode+0xb8/0x250
[  605.107159]  sync_inode_metadata+0xa2/0xe0
[  605.107168]  ? write_inode_now+0x160/0x160
[  605.107186]  ? file_write_and_wait_range+0x54/0xe0
[  605.107199]  generic_buffers_fsync_noflush+0x135/0x160
[  605.107213]  ext4_sync_file+0x3b3/0x620
[  605.107227]  vfs_fsync_range+0x69/0x110
[  605.107237]  ? ext4_getfsmap+0x520/0x520
[  605.107249]  iomap_dio_complete+0x35c/0x3a0
[  605.107259]  ? __schedule+0x9fe/0x2c80
[  605.107272]  ? aio_fsync_work+0x190/0x190
[  605.107284]  iomap_dio_complete_work+0x36/0x50
[  605.107297]  process_one_work+0x46c/0x950


All the active threads are just the same and are all waiting for
an io to complete from scsi. No threads are active in rxe, srp(t)
or scsi. All activity appears to be dead.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 21:14                                                               ` Bob Pearson
@ 2023-10-17 21:18                                                                 ` Bart Van Assche
  2023-10-17 21:23                                                                   ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 21:18 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi


On 10/17/23 14:14, Bob Pearson wrote:
> All the active threads are just the same and are all waiting for
> an io to complete from scsi. No threads are active in rxe, srp(t)
> or scsi. All activity appears to be dead.

Is this really a clue? I have seen such backtraces many times. All
such a backtrace tells us is that something got stuck in a layer
under the filesystem. It does not tell us which layer caused
command processing to get stuck.

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 21:18                                                                 ` Bart Van Assche
@ 2023-10-17 21:23                                                                   ` Bob Pearson
  2023-10-17 21:30                                                                     ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 21:23 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 16:18, Bart Van Assche wrote:
> 
> On 10/17/23 14:14, Bob Pearson wrote:
>> All the active threads are just the same and are all waiting for
>> an io to complete from scsi. No threads are active in rxe, srp(t)
>> or scsi. All activity appears to be dead.
> 
> Is this really a clue? I have seen such backtraces many times. All
> such a backtrace tells us is that something got stuck in a layer
> under the filesystem. It does not tell us which layer caused
> command processing to get stuck.
> 
> Bart.
> 

Not really, but stuck could mean it died (no threads active) or it is
in a loop or waiting to be scheduled. It looks dead. The lower layers are
waiting to get kicked into action by some event but it hasn't happened.
This is conjecture on my part though.

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 21:23                                                                   ` Bob Pearson
@ 2023-10-17 21:30                                                                     ` Bart Van Assche
  2023-10-17 21:39                                                                       ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 21:30 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi


On 10/17/23 14:23, Bob Pearson wrote:
> Not really, but stuck could mean it died (no threads active) or it is
> in a loop or waiting to be scheduled. It looks dead. The lower layers are
> waiting to get kicked into action by some event but it hasn't happened.
> This is conjecture on my part though.

This call stack means that I/O has been submitted by the block layer and
that it did not get completed. Which I/O request got stuck can be
verified by e.g. running the list-pending-block-requests script that I
posted some time ago. See also
https://lore.kernel.org/all/55c0fe61-a091-b351-11b4-fa7f668e49d7@acm.org/.
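
A hedged sketch of the kind of scan such a script performs — walking the
blk-mq debugfs tree for non-empty hctx "busy" entries. The exact debugfs
layout shown is an assumption, and the root is a parameter so the walk can
be exercised anywhere:

```python
# Sketch: list block requests that the block layer considers in flight
# by reading hctx "busy" files under the blk-mq debugfs tree.
import glob
import os

def list_pending_requests(root="/sys/kernel/debug/block"):
    pending = []
    pattern = os.path.join(root, "*", "hctx*", "busy")
    for path in sorted(glob.glob(pattern)):
        try:
            with open(path) as f:
                for line in f:
                    if line.strip():
                        pending.append((path, line.strip()))
        except OSError:
            pass  # debugfs entries can disappear mid-scan
    return pending

if __name__ == "__main__":
    for path, req in list_pending_requests():
        print(path, req)
```

Run as root while the test hangs; any line printed names a queue with a
request the driver below it never completed.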

Thanks,

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 21:30                                                                     ` Bart Van Assche
@ 2023-10-17 21:39                                                                       ` Bob Pearson
  2023-10-17 22:42                                                                         ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-17 21:39 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 16:30, Bart Van Assche wrote:
> 
> On 10/17/23 14:23, Bob Pearson wrote:
>> Not really, but stuck could mean it died (no threads active) or it is
>> in a loop or waiting to be scheduled. It looks dead. The lower layers are
>> waiting to get kicked into action by some event but it hasn't happened.
>> This is conjecture on my part though.
> 
> This call stack means that I/O has been submitted by the block layer and
> that it did not get completed. Which I/O request got stuck can be
> verified by e.g. running the list-pending-block-requests script that I
> posted some time ago. See also
> https://lore.kernel.org/all/55c0fe61-a091-b351-11b4-fa7f668e49d7@acm.org/.
> 
> Thanks,
> 
> Bart.

Thanks. Would this run on the side of a hung blktests or would I need to
set up an srp-srpt file system?

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 21:39                                                                       ` Bob Pearson
@ 2023-10-17 22:42                                                                         ` Bart Van Assche
  2023-10-18 18:29                                                                           ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-17 22:42 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 14:39, Bob Pearson wrote:
> On 10/17/23 16:30, Bart Van Assche wrote:
>>
>> On 10/17/23 14:23, Bob Pearson wrote:
>>> Not really, but stuck could mean it died (no threads active) or it is
>>> in a loop or waiting to be scheduled. It looks dead. The lower layers are
>>> waiting to get kicked into action by some event but it hasn't happened.
>>> This is conjecture on my part though.
>>
>> This call stack means that I/O has been submitted by the block layer and
>> that it did not get completed. Which I/O request got stuck can be
>> verified by e.g. running the list-pending-block-requests script that I
>> posted some time ago. See also
>> https://lore.kernel.org/all/55c0fe61-a091-b351-11b4-fa7f668e49d7@acm.org/.
> 
> Thanks. Would this run on the side of a hung blktests or would I need to
> set up an srp-srpt file system?

I propose to analyze the source code of the component(s) that you
suspect of causing the hang. The output of the list-pending-block-
requests script is not sufficient to reveal which of the following
drivers is causing the hang: ib_srp, rdma_rxe, ib_srpt, ...

Thanks,

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 17:09                                                   ` Bob Pearson
  2023-10-17 17:13                                                     ` Bart Van Assche
  2023-10-17 17:58                                                     ` Jason Gunthorpe
@ 2023-10-18  8:16                                                     ` Zhu Yanjun
  2 siblings, 0 replies; 87+ messages in thread
From: Zhu Yanjun @ 2023-10-18  8:16 UTC (permalink / raw)
  To: Bob Pearson, Daisuke Matsuda (Fujitsu), 'Bart Van Assche',
	'Rain River'
  Cc: Jason Gunthorpe, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi


On 2023/10/18 1:09, Bob Pearson wrote:
> On 9/25/23 20:17, Daisuke Matsuda (Fujitsu) wrote:
>> On Tue, Sep 26, 2023 12:01 AM Bart Van Assche:
>>> On 9/24/23 21:47, Daisuke Matsuda (Fujitsu) wrote:
>>>> As Bob wrote above, nobody has found any logical failure in rxe
>>>> driver.
>>> That's wrong. In case you would not yet have noticed my latest email in
>>> this thread, please take a look at
>>> https://lore.kernel.org/linux-rdma/e8b76fae-780a-470e-8ec4-c6b650793d10@leemhuis.info/T/#m0fd8ea8a4cbc27b37
>>> b042ae4f8e9b024f1871a73.
>>> I think the report in that email is a 100% proof that there is a
>>> use-after-free issue in the rdma_rxe driver. Use-after-free issues have
>>> security implications and also can cause data corruption. I propose to
>>> revert the commit that introduced the rdma_rxe use-after-free unless
>>> someone comes up with a fix for the rdma_rxe driver.
>>>
>>> Bart.
>> Thank you for the clarification. I see your intention.
>> I hope the hang issue will be resolved by addressing this.
>>
>> Thanks,
>> Daisuke
>>
> I have made some progress in understanding the cause of the srp/002 etc. hang.
>
> The two attached files are traces of activity for two qp's qp#151 and qp#167. In my runs of srp/002
> All the qp's pass before 167 and all fail after 167 which is the first to fail.
>
> It turns out that all the passing qp's call srp_post_send() some number of times and also call
> srp_send_done() the same number of times. Starting at qp#167 the last call to srp_send_done() does
> not take place leaving the srp driver waiting for the final completion and causing the hang I believe.

Thanks, Bob

I will delve into your findings and the source code to find the root cause.

BTW, what Linux distribution are you using to find this? Ubuntu, Fedora,
or Debian?

From the above, sometimes this problem is difficult to reproduce. But it
can be reproduced on Ubuntu and Debian.

So can you let me know which Linux distribution you are using?

Thanks

Zhu Yanjun

>
> There are four cq's involved in each pair of qp's in the srp test. Two in ib_srp and two in ib_srpt
> for the two qp's. Three of them execute completion processing in a soft irq context so the code in
> core/cq.c gathers the completions and calls back to the srp drivers. The send side cq in srp uses
> cq_direct which requires srp to call ib_process_cq_direct() in order to collect the completions. This
> happens in __srp_get_tx_iu() which is called in several places in the srp driver. But only as a side effect
> since the purpose of this routine is to get an iu to start a new command.
>
> In the attached files, for qp#151 the final call to srp_post_send() is followed by the rxe requester and
> completer work queues processing the send packet and the ack, before a final call to __srp_get_tx_iu()
> which gathers the final send-side completion: success.
>
> For qp#167 the call to srp_post_send() is followed by the rxe driver processing the send operation and
> generating a work completion which is posted to the send cq, but there is never a following call to
> __srp_get_tx_iu(), so the cqe is never reaped by srp: failure.
>
> I don't yet understand the logic of the srp driver to fix this but the problem is not in the rxe driver
> as far as I can tell.
>
> Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-17 22:42                                                                         ` Bart Van Assche
@ 2023-10-18 18:29                                                                           ` Bob Pearson
  2023-10-18 19:17                                                                             ` Jason Gunthorpe
  2023-10-18 19:38                                                                             ` Bart Van Assche
  0 siblings, 2 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 18:29 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/17/23 17:42, Bart Van Assche wrote:
> On 10/17/23 14:39, Bob Pearson wrote:
>> On 10/17/23 16:30, Bart Van Assche wrote:
>>>
>>> On 10/17/23 14:23, Bob Pearson wrote:
>>>> Not really, but stuck could mean it died (no threads active) or it is
>>>> in a loop or waiting to be scheduled. It looks dead. The lower layers are
>>>> waiting to get kicked into action by some event but it hasn't happened.
>>>> This is conjecture on my part though.
>>>
>>> This call stack means that I/O has been submitted by the block layer and
>>> that it did not get completed. Which I/O request got stuck can be
>>> verified by e.g. running the list-pending-block-requests script that I
>>> posted some time ago. See also
>>> https://lore.kernel.org/all/55c0fe61-a091-b351-11b4-fa7f668e49d7@acm.org/.
>>
Thanks. Would this run on the side of a hung blktests run, or would I need to
set up an srp-srpt file system?
> 
> I propose to analyze the source code of the component(s) that you
> suspect of causing the hang. The output of the list-pending-block-
> requests script is not sufficient to reveal which of the following
> drivers is causing the hang: ib_srp, rdma_rxe, ib_srpt, ...
> 
> Thanks,
> 
> Bart.
> 

Bart,

Another data point. I had seen (months ago) that both the rxe and siw drivers could cause blktests srp
hangs. More recently, when I configure my kernel to run lots of debug checks (lockdep, memory leak detection,
kasan, ubsan, etc.), which definitely slows performance and adds delays, the percentage of srp/002 runs that
hang on the rxe driver has gone from roughly 10% to a solid 100%. This suggested retrying the siw driver on
the debug kernel, since it has the reputation of always running successfully. I now find that siw also hangs
solidly on srp/002. This is another hint that we are seeing a timing issue.

Bob 

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 18:29                                                                           ` Bob Pearson
@ 2023-10-18 19:17                                                                             ` Jason Gunthorpe
  2023-10-18 19:48                                                                               ` Bart Van Assche
  2023-10-18 19:38                                                                             ` Bart Van Assche
  1 sibling, 1 reply; 87+ messages in thread
From: Jason Gunthorpe @ 2023-10-18 19:17 UTC (permalink / raw)
  To: Bob Pearson
  Cc: Bart Van Assche, Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On Wed, Oct 18, 2023 at 01:29:16PM -0500, Bob Pearson wrote:
> On 10/17/23 17:42, Bart Van Assche wrote:
> > On 10/17/23 14:39, Bob Pearson wrote:
> >> On 10/17/23 16:30, Bart Van Assche wrote:
> >>>
> >>> On 10/17/23 14:23, Bob Pearson wrote:
> >>>> Not really, but stuck could mean it died (no threads active) or it is
> >>>> in a loop or waiting to be scheduled. It looks dead. The lower layers are
> >>>> waiting to get kicked into action by some event but it hasn't happened.
> >>>> This is conjecture on my part though.
> >>>
> >>> This call stack means that I/O has been submitted by the block layer and
> >>> that it did not get completed. Which I/O request got stuck can be
> >>> verified by e.g. running the list-pending-block-requests script that I
> >>> posted some time ago. See also
> >>> https://lore.kernel.org/all/55c0fe61-a091-b351-11b4-fa7f668e49d7@acm.org/.
> >>
> >> Thanks. Would this run on the side of a hung blktests or would I need to
> >> setup an srp-srpt file system?
> > 
> > I propose to analyze the source code of the component(s) that you
> > suspect of causing the hang. The output of the list-pending-block-
> > requests script is not sufficient to reveal which of the following
> > drivers is causing the hang: ib_srp, rdma_rxe, ib_srpt, ...
> > 
> > Thanks,
> > 
> > Bart.
> > 
> 
> Bart,
> 
> Another data point. I had seen (months ago) that both the rxe and
> siw drivers could cause blktests srp hangs. More recently when I
> configure my kernel to run lots of tests (lockdep, memory leaks,
> kasan, ubsan, etc.), which definitely slows performance and adds
> delays, the percentage of srp/002 runs which hang on the rxe driver
> has gone from roughly 10% to a solid 100%. This suggested retrying the siw driver
> on the debug kernel since it has the reputation of always running
> successfully. I now find that siw also hangs solidly on srp/002.
> This is another hint that we are seeing a timing issue.

If siw hangs as well, I'm definitely comfortable continuing to debug and
leaving the work queues in-tree for now.

Jason

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 18:29                                                                           ` Bob Pearson
  2023-10-18 19:17                                                                             ` Jason Gunthorpe
@ 2023-10-18 19:38                                                                             ` Bart Van Assche
  1 sibling, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-18 19:38 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/18/23 11:29, Bob Pearson wrote:
> I now find that siw also hangs solidly on srp/002. This is another
> hint that we are seeing a timing issue.
I can't reproduce the srp/002 hang with the siw driver - neither with a 
production kernel nor with a debug kernel. Is anyone else able to 
reproduce the srp/002 hang with the siw driver?

Thanks,

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 19:17                                                                             ` Jason Gunthorpe
@ 2023-10-18 19:48                                                                               ` Bart Van Assche
  2023-10-18 20:03                                                                                 ` Bob Pearson
                                                                                                   ` (3 more replies)
  0 siblings, 4 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-18 19:48 UTC (permalink / raw)
  To: Jason Gunthorpe, Bob Pearson
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi


On 10/18/23 12:17, Jason Gunthorpe wrote:
> If siw hangs as well, I'm definitely comfortable continuing to debug and
> leaving the work queues in-tree for now.

Regarding the KASAN complaint that I shared about one month ago, can 
that complaint have any other root cause than the patch "RDMA/rxe: Add
workqueue support for rxe tasks"? That report shows a use-after-free by
rxe code with a pointer to memory that was owned by the rxe driver and
that was freed by the rxe driver. That memory is an skbuff. The rxe
driver manages skbuffs. The SRP driver doesn't even know about these
skbuff objects. See also 
https://lore.kernel.org/linux-rdma/8ee2869b-3f51-4195-9883-015cd30b4241@acm.org/

Thanks,

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 19:48                                                                               ` Bart Van Assche
@ 2023-10-18 20:03                                                                                 ` Bob Pearson
  2023-10-18 20:04                                                                                 ` Bob Pearson
                                                                                                   ` (2 subsequent siblings)
  3 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 20:03 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/18/23 14:48, Bart Van Assche wrote:
> 
> On 10/18/23 12:17, Jason Gunthorpe wrote:
>> If siw hangs as well, I'm definitely comfortable continuing to debug and
>> leaving the work queues in-tree for now.
> 
> Regarding the KASAN complaint that I shared about one month ago, can that complaint have any other root cause than the patch "RDMA/rxe: Add
> workqueue support for rxe tasks"? That report shows a use-after-free by
> rxe code with a pointer to memory that was owned by the rxe driver and
> that was freed by the rxe driver. That memory is an skbuff. The rxe
> driver manages skbuffs. The SRP driver doesn't even know about these
> skbuff objects. See also https://lore.kernel.org/linux-rdma/8ee2869b-3f51-4195-9883-015cd30b4241@acm.org/
> 
> Thanks,
> 
> Bart.
> 
Bart,

I agree with you that that is an rxe issue, but I haven't been able to reproduce it. However, I am able
to generate hangs without the KASAN bug, so it seems to me that the two are unrelated. In addition to the
kernel debugging I have added tracing to ib_srp and ib_srpt, which may help by adding delays.

Bob


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 19:48                                                                               ` Bart Van Assche
  2023-10-18 20:03                                                                                 ` Bob Pearson
@ 2023-10-18 20:04                                                                                 ` Bob Pearson
  2023-10-18 20:14                                                                                 ` Bob Pearson
  2023-10-18 20:29                                                                                 ` Bob Pearson
  3 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 20:04 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

[-- Attachment #1: Type: text/plain, Size: 859 bytes --]

On 10/18/23 14:48, Bart Van Assche wrote:
> 
> On 10/18/23 12:17, Jason Gunthorpe wrote:
>> If siw hangs as well, I'm definitely comfortable continuing to debug and
>> leaving the work queues in-tree for now.
> 
> Regarding the KASAN complaint that I shared about one month ago, can that complaint have any other root cause than the patch "RDMA/rxe: Add
> workqueue support for rxe tasks"? That report shows a use-after-free by
> rxe code with a pointer to memory that was owned by the rxe driver and
> that was freed by the rxe driver. That memory is an skbuff. The rxe
> driver manages skbuffs. The SRP driver doesn't even know about these
> skbuff objects. See also https://lore.kernel.org/linux-rdma/8ee2869b-3f51-4195-9883-015cd30b4241@acm.org/
> 
> Thanks,
> 
> Bart.
> 

Here is the .config I am using, based on the stock Ubuntu 23.04 config plus make olddefconfig.

[-- Attachment #2: .config --]
[-- Type: text/plain, Size: 281233 bytes --]

#
# Automatically generated file; DO NOT EDIT.
# Linux/x86 6.6.0-rc3 Kernel Configuration
#
CONFIG_CC_VERSION_TEXT="gcc (Ubuntu 12.3.0-1ubuntu1~23.04) 12.3.0"
CONFIG_CC_IS_GCC=y
CONFIG_GCC_VERSION=120300
CONFIG_CLANG_VERSION=0
CONFIG_AS_IS_GNU=y
CONFIG_AS_VERSION=24000
CONFIG_LD_IS_BFD=y
CONFIG_LD_VERSION=24000
CONFIG_LLD_VERSION=0
CONFIG_CC_CAN_LINK=y
CONFIG_CC_CAN_LINK_STATIC=y
CONFIG_CC_HAS_ASM_GOTO_OUTPUT=y
CONFIG_CC_HAS_ASM_GOTO_TIED_OUTPUT=y
CONFIG_TOOLS_SUPPORT_RELR=y
CONFIG_CC_HAS_ASM_INLINE=y
CONFIG_CC_HAS_NO_PROFILE_FN_ATTR=y
CONFIG_PAHOLE_VERSION=125
CONFIG_CONSTRUCTORS=y
CONFIG_IRQ_WORK=y
CONFIG_BUILDTIME_TABLE_SORT=y
CONFIG_THREAD_INFO_IN_TASK=y

#
# General setup
#
CONFIG_INIT_ENV_ARG_LIMIT=32
# CONFIG_COMPILE_TEST is not set
# CONFIG_WERROR is not set
CONFIG_LOCALVERSION=""
# CONFIG_LOCALVERSION_AUTO is not set
CONFIG_BUILD_SALT=""
CONFIG_HAVE_KERNEL_GZIP=y
CONFIG_HAVE_KERNEL_BZIP2=y
CONFIG_HAVE_KERNEL_LZMA=y
CONFIG_HAVE_KERNEL_XZ=y
CONFIG_HAVE_KERNEL_LZO=y
CONFIG_HAVE_KERNEL_LZ4=y
CONFIG_HAVE_KERNEL_ZSTD=y
# CONFIG_KERNEL_GZIP is not set
# CONFIG_KERNEL_BZIP2 is not set
# CONFIG_KERNEL_LZMA is not set
# CONFIG_KERNEL_XZ is not set
# CONFIG_KERNEL_LZO is not set
# CONFIG_KERNEL_LZ4 is not set
CONFIG_KERNEL_ZSTD=y
CONFIG_DEFAULT_INIT=""
CONFIG_DEFAULT_HOSTNAME="(none)"
CONFIG_SYSVIPC=y
CONFIG_SYSVIPC_SYSCTL=y
CONFIG_SYSVIPC_COMPAT=y
CONFIG_POSIX_MQUEUE=y
CONFIG_POSIX_MQUEUE_SYSCTL=y
CONFIG_WATCH_QUEUE=y
CONFIG_CROSS_MEMORY_ATTACH=y
CONFIG_USELIB=y
CONFIG_AUDIT=y
CONFIG_HAVE_ARCH_AUDITSYSCALL=y
CONFIG_AUDITSYSCALL=y

#
# IRQ subsystem
#
CONFIG_GENERIC_IRQ_PROBE=y
CONFIG_GENERIC_IRQ_SHOW=y
CONFIG_GENERIC_IRQ_EFFECTIVE_AFF_MASK=y
CONFIG_GENERIC_PENDING_IRQ=y
CONFIG_GENERIC_IRQ_MIGRATION=y
CONFIG_HARDIRQS_SW_RESEND=y
CONFIG_GENERIC_IRQ_CHIP=y
CONFIG_IRQ_DOMAIN=y
CONFIG_IRQ_SIM=y
CONFIG_IRQ_DOMAIN_HIERARCHY=y
CONFIG_GENERIC_MSI_IRQ=y
CONFIG_IRQ_MSI_IOMMU=y
CONFIG_GENERIC_IRQ_MATRIX_ALLOCATOR=y
CONFIG_GENERIC_IRQ_RESERVATION_MODE=y
CONFIG_IRQ_FORCED_THREADING=y
CONFIG_SPARSE_IRQ=y
# CONFIG_GENERIC_IRQ_DEBUGFS is not set
# end of IRQ subsystem

CONFIG_CLOCKSOURCE_WATCHDOG=y
CONFIG_ARCH_CLOCKSOURCE_INIT=y
CONFIG_CLOCKSOURCE_VALIDATE_LAST_CYCLE=y
CONFIG_GENERIC_TIME_VSYSCALL=y
CONFIG_GENERIC_CLOCKEVENTS=y
CONFIG_GENERIC_CLOCKEVENTS_BROADCAST=y
CONFIG_GENERIC_CLOCKEVENTS_MIN_ADJUST=y
CONFIG_GENERIC_CMOS_UPDATE=y
CONFIG_HAVE_POSIX_CPU_TIMERS_TASK_WORK=y
CONFIG_POSIX_CPU_TIMERS_TASK_WORK=y
CONFIG_CONTEXT_TRACKING=y
CONFIG_CONTEXT_TRACKING_IDLE=y

#
# Timers subsystem
#
CONFIG_TICK_ONESHOT=y
CONFIG_NO_HZ_COMMON=y
# CONFIG_HZ_PERIODIC is not set
CONFIG_NO_HZ_IDLE=y
# CONFIG_NO_HZ_FULL is not set
CONFIG_NO_HZ=y
CONFIG_HIGH_RES_TIMERS=y
CONFIG_CLOCKSOURCE_WATCHDOG_MAX_SKEW_US=100
# end of Timers subsystem

CONFIG_BPF=y
CONFIG_HAVE_EBPF_JIT=y
CONFIG_ARCH_WANT_DEFAULT_BPF_JIT=y

#
# BPF subsystem
#
CONFIG_BPF_SYSCALL=y
CONFIG_BPF_JIT=y
CONFIG_BPF_JIT_ALWAYS_ON=y
CONFIG_BPF_JIT_DEFAULT_ON=y
CONFIG_BPF_UNPRIV_DEFAULT_OFF=y
CONFIG_USERMODE_DRIVER=y
# CONFIG_BPF_PRELOAD is not set
CONFIG_BPF_LSM=y
# end of BPF subsystem

CONFIG_PREEMPT_BUILD=y
# CONFIG_PREEMPT_NONE is not set
CONFIG_PREEMPT_VOLUNTARY=y
# CONFIG_PREEMPT is not set
CONFIG_PREEMPT_COUNT=y
CONFIG_PREEMPTION=y
CONFIG_PREEMPT_DYNAMIC=y
CONFIG_SCHED_CORE=y

#
# CPU/Task time and stats accounting
#
CONFIG_TICK_CPU_ACCOUNTING=y
# CONFIG_VIRT_CPU_ACCOUNTING_GEN is not set
# CONFIG_IRQ_TIME_ACCOUNTING is not set
CONFIG_BSD_PROCESS_ACCT=y
CONFIG_BSD_PROCESS_ACCT_V3=y
CONFIG_TASKSTATS=y
CONFIG_TASK_DELAY_ACCT=y
CONFIG_TASK_XACCT=y
CONFIG_TASK_IO_ACCOUNTING=y
CONFIG_PSI=y
# CONFIG_PSI_DEFAULT_DISABLED is not set
# end of CPU/Task time and stats accounting

CONFIG_CPU_ISOLATION=y

#
# RCU Subsystem
#
CONFIG_TREE_RCU=y
CONFIG_PREEMPT_RCU=y
# CONFIG_RCU_EXPERT is not set
CONFIG_TREE_SRCU=y
CONFIG_TASKS_RCU_GENERIC=y
CONFIG_TASKS_RCU=y
CONFIG_TASKS_RUDE_RCU=y
CONFIG_TASKS_TRACE_RCU=y
CONFIG_RCU_STALL_COMMON=y
CONFIG_RCU_NEED_SEGCBLIST=y
# end of RCU Subsystem

# CONFIG_IKCONFIG is not set
CONFIG_IKHEADERS=m
CONFIG_LOG_BUF_SHIFT=20
CONFIG_LOG_CPU_MAX_BUF_SHIFT=12
# CONFIG_PRINTK_INDEX is not set
CONFIG_HAVE_UNSTABLE_SCHED_CLOCK=y

#
# Scheduler features
#
CONFIG_UCLAMP_TASK=y
CONFIG_UCLAMP_BUCKETS_COUNT=5
# end of Scheduler features

CONFIG_ARCH_SUPPORTS_NUMA_BALANCING=y
CONFIG_ARCH_WANT_BATCHED_UNMAP_TLB_FLUSH=y
CONFIG_CC_HAS_INT128=y
CONFIG_CC_IMPLICIT_FALLTHROUGH="-Wimplicit-fallthrough=5"
CONFIG_GCC11_NO_ARRAY_BOUNDS=y
CONFIG_CC_NO_ARRAY_BOUNDS=y
CONFIG_ARCH_SUPPORTS_INT128=y
CONFIG_NUMA_BALANCING=y
CONFIG_NUMA_BALANCING_DEFAULT_ENABLED=y
CONFIG_CGROUPS=y
CONFIG_PAGE_COUNTER=y
# CONFIG_CGROUP_FAVOR_DYNMODS is not set
CONFIG_MEMCG=y
CONFIG_MEMCG_KMEM=y
CONFIG_BLK_CGROUP=y
CONFIG_CGROUP_WRITEBACK=y
CONFIG_CGROUP_SCHED=y
CONFIG_FAIR_GROUP_SCHED=y
CONFIG_CFS_BANDWIDTH=y
# CONFIG_RT_GROUP_SCHED is not set
CONFIG_SCHED_MM_CID=y
CONFIG_UCLAMP_TASK_GROUP=y
CONFIG_CGROUP_PIDS=y
CONFIG_CGROUP_RDMA=y
CONFIG_CGROUP_FREEZER=y
CONFIG_CGROUP_HUGETLB=y
CONFIG_CPUSETS=y
CONFIG_PROC_PID_CPUSET=y
CONFIG_CGROUP_DEVICE=y
CONFIG_CGROUP_CPUACCT=y
CONFIG_CGROUP_PERF=y
CONFIG_CGROUP_BPF=y
CONFIG_CGROUP_MISC=y
# CONFIG_CGROUP_DEBUG is not set
CONFIG_SOCK_CGROUP_DATA=y
CONFIG_NAMESPACES=y
CONFIG_UTS_NS=y
CONFIG_TIME_NS=y
CONFIG_IPC_NS=y
CONFIG_USER_NS=y
CONFIG_PID_NS=y
CONFIG_NET_NS=y
CONFIG_CHECKPOINT_RESTORE=y
CONFIG_SCHED_AUTOGROUP=y
CONFIG_RELAY=y
CONFIG_BLK_DEV_INITRD=y
CONFIG_INITRAMFS_SOURCE=""
CONFIG_RD_GZIP=y
CONFIG_RD_BZIP2=y
CONFIG_RD_LZMA=y
CONFIG_RD_XZ=y
CONFIG_RD_LZO=y
CONFIG_RD_LZ4=y
CONFIG_RD_ZSTD=y
CONFIG_BOOT_CONFIG=y
# CONFIG_BOOT_CONFIG_FORCE is not set
# CONFIG_BOOT_CONFIG_EMBED is not set
CONFIG_INITRAMFS_PRESERVE_MTIME=y
CONFIG_CC_OPTIMIZE_FOR_PERFORMANCE=y
# CONFIG_CC_OPTIMIZE_FOR_SIZE is not set
CONFIG_LD_ORPHAN_WARN=y
CONFIG_LD_ORPHAN_WARN_LEVEL="warn"
CONFIG_SYSCTL=y
CONFIG_HAVE_UID16=y
CONFIG_SYSCTL_EXCEPTION_TRACE=y
CONFIG_HAVE_PCSPKR_PLATFORM=y
CONFIG_EXPERT=y
CONFIG_UID16=y
CONFIG_MULTIUSER=y
CONFIG_SGETMASK_SYSCALL=y
CONFIG_SYSFS_SYSCALL=y
CONFIG_FHANDLE=y
CONFIG_POSIX_TIMERS=y
CONFIG_PRINTK=y
CONFIG_BUG=y
CONFIG_ELF_CORE=y
CONFIG_PCSPKR_PLATFORM=y
CONFIG_BASE_FULL=y
CONFIG_FUTEX=y
CONFIG_FUTEX_PI=y
CONFIG_EPOLL=y
CONFIG_SIGNALFD=y
CONFIG_TIMERFD=y
CONFIG_EVENTFD=y
CONFIG_SHMEM=y
CONFIG_AIO=y
CONFIG_IO_URING=y
CONFIG_ADVISE_SYSCALLS=y
CONFIG_MEMBARRIER=y
CONFIG_KALLSYMS=y
# CONFIG_KALLSYMS_SELFTEST is not set
CONFIG_KALLSYMS_ALL=y
CONFIG_KALLSYMS_ABSOLUTE_PERCPU=y
CONFIG_KALLSYMS_BASE_RELATIVE=y
CONFIG_ARCH_HAS_MEMBARRIER_SYNC_CORE=y
CONFIG_KCMP=y
CONFIG_RSEQ=y
CONFIG_CACHESTAT_SYSCALL=y
# CONFIG_DEBUG_RSEQ is not set
CONFIG_HAVE_PERF_EVENTS=y
CONFIG_GUEST_PERF_EVENTS=y
CONFIG_PC104=y

#
# Kernel Performance Events And Counters
#
CONFIG_PERF_EVENTS=y
# CONFIG_DEBUG_PERF_USE_VMALLOC is not set
# end of Kernel Performance Events And Counters

CONFIG_SYSTEM_DATA_VERIFICATION=y
CONFIG_PROFILING=y
CONFIG_TRACEPOINTS=y

#
# Kexec and crash features
#
CONFIG_CRASH_CORE=y
CONFIG_KEXEC_CORE=y
CONFIG_HAVE_IMA_KEXEC=y
CONFIG_KEXEC=y
CONFIG_KEXEC_FILE=y
CONFIG_KEXEC_SIG=y
# CONFIG_KEXEC_SIG_FORCE is not set
CONFIG_KEXEC_BZIMAGE_VERIFY_SIG=y
CONFIG_KEXEC_JUMP=y
CONFIG_CRASH_DUMP=y
CONFIG_CRASH_HOTPLUG=y
CONFIG_CRASH_MAX_MEMORY_RANGES=8192
# end of Kexec and crash features
# end of General setup

CONFIG_64BIT=y
CONFIG_X86_64=y
CONFIG_X86=y
CONFIG_INSTRUCTION_DECODER=y
CONFIG_OUTPUT_FORMAT="elf64-x86-64"
CONFIG_LOCKDEP_SUPPORT=y
CONFIG_STACKTRACE_SUPPORT=y
CONFIG_MMU=y
CONFIG_ARCH_MMAP_RND_BITS_MIN=28
CONFIG_ARCH_MMAP_RND_BITS_MAX=32
CONFIG_ARCH_MMAP_RND_COMPAT_BITS_MIN=8
CONFIG_ARCH_MMAP_RND_COMPAT_BITS_MAX=16
CONFIG_GENERIC_ISA_DMA=y
CONFIG_GENERIC_CSUM=y
CONFIG_GENERIC_BUG=y
CONFIG_GENERIC_BUG_RELATIVE_POINTERS=y
CONFIG_ARCH_MAY_HAVE_PC_FDC=y
CONFIG_GENERIC_CALIBRATE_DELAY=y
CONFIG_ARCH_HAS_CPU_RELAX=y
CONFIG_ARCH_HIBERNATION_POSSIBLE=y
CONFIG_ARCH_SUSPEND_POSSIBLE=y
CONFIG_AUDIT_ARCH=y
CONFIG_KASAN_SHADOW_OFFSET=0xdffffc0000000000
CONFIG_HAVE_INTEL_TXT=y
CONFIG_X86_64_SMP=y
CONFIG_ARCH_SUPPORTS_UPROBES=y
CONFIG_FIX_EARLYCON_MEM=y
CONFIG_DYNAMIC_PHYSICAL_MASK=y
CONFIG_PGTABLE_LEVELS=5
CONFIG_CC_HAS_SANE_STACKPROTECTOR=y

#
# Processor type and features
#
CONFIG_SMP=y
CONFIG_X86_X2APIC=y
CONFIG_X86_MPPARSE=y
# CONFIG_GOLDFISH is not set
CONFIG_X86_CPU_RESCTRL=y
CONFIG_X86_EXTENDED_PLATFORM=y
CONFIG_X86_NUMACHIP=y
# CONFIG_X86_VSMP is not set
CONFIG_X86_UV=y
# CONFIG_X86_GOLDFISH is not set
# CONFIG_X86_INTEL_MID is not set
CONFIG_X86_INTEL_LPSS=y
CONFIG_X86_AMD_PLATFORM_DEVICE=y
CONFIG_IOSF_MBI=y
CONFIG_IOSF_MBI_DEBUG=y
CONFIG_X86_SUPPORTS_MEMORY_FAILURE=y
CONFIG_SCHED_OMIT_FRAME_POINTER=y
CONFIG_HYPERVISOR_GUEST=y
CONFIG_PARAVIRT=y
CONFIG_PARAVIRT_XXL=y
# CONFIG_PARAVIRT_DEBUG is not set
CONFIG_PARAVIRT_SPINLOCKS=y
CONFIG_X86_HV_CALLBACK_VECTOR=y
CONFIG_XEN=y
CONFIG_XEN_PV=y
CONFIG_XEN_512GB=y
CONFIG_XEN_PV_SMP=y
CONFIG_XEN_PV_DOM0=y
CONFIG_XEN_PVHVM=y
CONFIG_XEN_PVHVM_SMP=y
CONFIG_XEN_PVHVM_GUEST=y
CONFIG_XEN_SAVE_RESTORE=y
# CONFIG_XEN_DEBUG_FS is not set
CONFIG_XEN_PVH=y
CONFIG_XEN_DOM0=y
CONFIG_XEN_PV_MSR_SAFE=y
CONFIG_KVM_GUEST=y
CONFIG_ARCH_CPUIDLE_HALTPOLL=y
CONFIG_PVH=y
# CONFIG_PARAVIRT_TIME_ACCOUNTING is not set
CONFIG_PARAVIRT_CLOCK=y
CONFIG_JAILHOUSE_GUEST=y
CONFIG_ACRN_GUEST=y
CONFIG_INTEL_TDX_GUEST=y
# CONFIG_MK8 is not set
# CONFIG_MPSC is not set
# CONFIG_MCORE2 is not set
# CONFIG_MATOM is not set
CONFIG_GENERIC_CPU=y
CONFIG_X86_INTERNODE_CACHE_SHIFT=6
CONFIG_X86_L1_CACHE_SHIFT=6
CONFIG_X86_TSC=y
CONFIG_X86_CMPXCHG64=y
CONFIG_X86_CMOV=y
CONFIG_X86_MINIMUM_CPU_FAMILY=64
CONFIG_X86_DEBUGCTLMSR=y
CONFIG_IA32_FEAT_CTL=y
CONFIG_X86_VMX_FEATURE_NAMES=y
CONFIG_PROCESSOR_SELECT=y
CONFIG_CPU_SUP_INTEL=y
CONFIG_CPU_SUP_AMD=y
CONFIG_CPU_SUP_HYGON=y
CONFIG_CPU_SUP_CENTAUR=y
CONFIG_CPU_SUP_ZHAOXIN=y
CONFIG_HPET_TIMER=y
CONFIG_HPET_EMULATE_RTC=y
CONFIG_DMI=y
CONFIG_GART_IOMMU=y
CONFIG_BOOT_VESA_SUPPORT=y
CONFIG_MAXSMP=y
CONFIG_NR_CPUS_RANGE_BEGIN=8192
CONFIG_NR_CPUS_RANGE_END=8192
CONFIG_NR_CPUS_DEFAULT=8192
CONFIG_NR_CPUS=8192
CONFIG_SCHED_CLUSTER=y
CONFIG_SCHED_SMT=y
CONFIG_SCHED_MC=y
CONFIG_SCHED_MC_PRIO=y
CONFIG_X86_LOCAL_APIC=y
CONFIG_X86_IO_APIC=y
CONFIG_X86_REROUTE_FOR_BROKEN_BOOT_IRQS=y
CONFIG_X86_MCE=y
CONFIG_X86_MCELOG_LEGACY=y
CONFIG_X86_MCE_INTEL=y
CONFIG_X86_MCE_AMD=y
CONFIG_X86_MCE_THRESHOLD=y
CONFIG_X86_MCE_INJECT=m

#
# Performance monitoring
#
CONFIG_PERF_EVENTS_INTEL_UNCORE=y
CONFIG_PERF_EVENTS_INTEL_RAPL=m
CONFIG_PERF_EVENTS_INTEL_CSTATE=m
# CONFIG_PERF_EVENTS_AMD_POWER is not set
CONFIG_PERF_EVENTS_AMD_UNCORE=m
CONFIG_PERF_EVENTS_AMD_BRS=y
# end of Performance monitoring

CONFIG_X86_16BIT=y
CONFIG_X86_ESPFIX64=y
CONFIG_X86_VSYSCALL_EMULATION=y
CONFIG_X86_IOPL_IOPERM=y
CONFIG_MICROCODE=y
# CONFIG_MICROCODE_LATE_LOADING is not set
CONFIG_X86_MSR=m
CONFIG_X86_CPUID=m
CONFIG_X86_5LEVEL=y
CONFIG_X86_DIRECT_GBPAGES=y
# CONFIG_X86_CPA_STATISTICS is not set
CONFIG_X86_MEM_ENCRYPT=y
CONFIG_AMD_MEM_ENCRYPT=y
# CONFIG_AMD_MEM_ENCRYPT_ACTIVE_BY_DEFAULT is not set
CONFIG_NUMA=y
CONFIG_AMD_NUMA=y
CONFIG_X86_64_ACPI_NUMA=y
CONFIG_NUMA_EMU=y
CONFIG_NODES_SHIFT=10
CONFIG_ARCH_SPARSEMEM_ENABLE=y
CONFIG_ARCH_SPARSEMEM_DEFAULT=y
CONFIG_ARCH_MEMORY_PROBE=y
CONFIG_ARCH_PROC_KCORE_TEXT=y
CONFIG_ILLEGAL_POINTER_VALUE=0xdead000000000000
CONFIG_X86_PMEM_LEGACY_DEVICE=y
CONFIG_X86_PMEM_LEGACY=y
CONFIG_X86_CHECK_BIOS_CORRUPTION=y
CONFIG_X86_BOOTPARAM_MEMORY_CORRUPTION_CHECK=y
CONFIG_MTRR=y
CONFIG_MTRR_SANITIZER=y
CONFIG_MTRR_SANITIZER_ENABLE_DEFAULT=1
CONFIG_MTRR_SANITIZER_SPARE_REG_NR_DEFAULT=1
CONFIG_X86_PAT=y
CONFIG_ARCH_USES_PG_UNCACHED=y
CONFIG_X86_UMIP=y
CONFIG_CC_HAS_IBT=y
# CONFIG_X86_KERNEL_IBT is not set
CONFIG_X86_INTEL_MEMORY_PROTECTION_KEYS=y
CONFIG_X86_INTEL_TSX_MODE_OFF=y
# CONFIG_X86_INTEL_TSX_MODE_ON is not set
# CONFIG_X86_INTEL_TSX_MODE_AUTO is not set
CONFIG_X86_SGX=y
# CONFIG_X86_USER_SHADOW_STACK is not set
CONFIG_EFI=y
CONFIG_EFI_STUB=y
CONFIG_EFI_HANDOVER_PROTOCOL=y
CONFIG_EFI_MIXED=y
# CONFIG_EFI_FAKE_MEMMAP is not set
CONFIG_EFI_RUNTIME_MAP=y
# CONFIG_HZ_100 is not set
CONFIG_HZ_250=y
# CONFIG_HZ_300 is not set
# CONFIG_HZ_1000 is not set
CONFIG_HZ=250
CONFIG_SCHED_HRTICK=y
CONFIG_ARCH_SUPPORTS_KEXEC=y
CONFIG_ARCH_SUPPORTS_KEXEC_FILE=y
CONFIG_ARCH_SELECTS_KEXEC_FILE=y
CONFIG_ARCH_SUPPORTS_KEXEC_PURGATORY=y
CONFIG_ARCH_SUPPORTS_KEXEC_SIG=y
CONFIG_ARCH_SUPPORTS_KEXEC_SIG_FORCE=y
CONFIG_ARCH_SUPPORTS_KEXEC_BZIMAGE_VERIFY_SIG=y
CONFIG_ARCH_SUPPORTS_KEXEC_JUMP=y
CONFIG_ARCH_SUPPORTS_CRASH_DUMP=y
CONFIG_ARCH_SUPPORTS_CRASH_HOTPLUG=y
CONFIG_PHYSICAL_START=0x1000000
CONFIG_RELOCATABLE=y
CONFIG_RANDOMIZE_BASE=y
CONFIG_X86_NEED_RELOCS=y
CONFIG_PHYSICAL_ALIGN=0x200000
CONFIG_DYNAMIC_MEMORY_LAYOUT=y
CONFIG_RANDOMIZE_MEMORY=y
CONFIG_RANDOMIZE_MEMORY_PHYSICAL_PADDING=0xa
# CONFIG_ADDRESS_MASKING is not set
CONFIG_HOTPLUG_CPU=y
# CONFIG_COMPAT_VDSO is not set
CONFIG_LEGACY_VSYSCALL_XONLY=y
# CONFIG_LEGACY_VSYSCALL_NONE is not set
# CONFIG_CMDLINE_BOOL is not set
CONFIG_MODIFY_LDT_SYSCALL=y
# CONFIG_STRICT_SIGALTSTACK_SIZE is not set
CONFIG_HAVE_LIVEPATCH=y
CONFIG_LIVEPATCH=y
# end of Processor type and features

CONFIG_CC_HAS_SLS=y
CONFIG_CC_HAS_RETURN_THUNK=y
CONFIG_CC_HAS_ENTRY_PADDING=y
CONFIG_FUNCTION_PADDING_CFI=11
CONFIG_FUNCTION_PADDING_BYTES=16
# CONFIG_SPECULATION_MITIGATIONS is not set
CONFIG_ARCH_HAS_ADD_PAGES=y

#
# Power management and ACPI options
#
CONFIG_ARCH_HIBERNATION_HEADER=y
CONFIG_SUSPEND=y
CONFIG_SUSPEND_FREEZER=y
# CONFIG_SUSPEND_SKIP_SYNC is not set
CONFIG_HIBERNATE_CALLBACKS=y
CONFIG_HIBERNATION=y
CONFIG_HIBERNATION_SNAPSHOT_DEV=y
CONFIG_PM_STD_PARTITION=""
CONFIG_PM_SLEEP=y
CONFIG_PM_SLEEP_SMP=y
# CONFIG_PM_AUTOSLEEP is not set
# CONFIG_PM_USERSPACE_AUTOSLEEP is not set
CONFIG_PM_WAKELOCKS=y
CONFIG_PM_WAKELOCKS_LIMIT=100
CONFIG_PM_WAKELOCKS_GC=y
CONFIG_PM=y
CONFIG_PM_DEBUG=y
CONFIG_PM_ADVANCED_DEBUG=y
# CONFIG_PM_TEST_SUSPEND is not set
CONFIG_PM_SLEEP_DEBUG=y
# CONFIG_DPM_WATCHDOG is not set
CONFIG_PM_TRACE=y
CONFIG_PM_TRACE_RTC=y
CONFIG_PM_CLK=y
CONFIG_PM_GENERIC_DOMAINS=y
CONFIG_WQ_POWER_EFFICIENT_DEFAULT=y
CONFIG_PM_GENERIC_DOMAINS_SLEEP=y
CONFIG_ENERGY_MODEL=y
CONFIG_ARCH_SUPPORTS_ACPI=y
CONFIG_ACPI=y
CONFIG_ACPI_LEGACY_TABLES_LOOKUP=y
CONFIG_ARCH_MIGHT_HAVE_ACPI_PDC=y
CONFIG_ACPI_SYSTEM_POWER_STATES_SUPPORT=y
CONFIG_ACPI_TABLE_LIB=y
CONFIG_ACPI_DEBUGGER=y
CONFIG_ACPI_DEBUGGER_USER=y
CONFIG_ACPI_SPCR_TABLE=y
CONFIG_ACPI_FPDT=y
CONFIG_ACPI_LPIT=y
CONFIG_ACPI_SLEEP=y
CONFIG_ACPI_REV_OVERRIDE_POSSIBLE=y
CONFIG_ACPI_EC_DEBUGFS=m
CONFIG_ACPI_AC=y
CONFIG_ACPI_BATTERY=y
CONFIG_ACPI_BUTTON=y
CONFIG_ACPI_VIDEO=m
CONFIG_ACPI_FAN=y
CONFIG_ACPI_TAD=m
CONFIG_ACPI_DOCK=y
CONFIG_ACPI_CPU_FREQ_PSS=y
CONFIG_ACPI_PROCESSOR_CSTATE=y
CONFIG_ACPI_PROCESSOR_IDLE=y
CONFIG_ACPI_CPPC_LIB=y
CONFIG_ACPI_PROCESSOR=y
CONFIG_ACPI_IPMI=m
CONFIG_ACPI_HOTPLUG_CPU=y
CONFIG_ACPI_PROCESSOR_AGGREGATOR=m
CONFIG_ACPI_THERMAL=y
CONFIG_ACPI_PLATFORM_PROFILE=m
CONFIG_ACPI_CUSTOM_DSDT_FILE=""
CONFIG_ARCH_HAS_ACPI_TABLE_UPGRADE=y
CONFIG_ACPI_TABLE_UPGRADE=y
CONFIG_ACPI_DEBUG=y
CONFIG_ACPI_PCI_SLOT=y
CONFIG_ACPI_CONTAINER=y
CONFIG_ACPI_HOTPLUG_MEMORY=y
CONFIG_ACPI_HOTPLUG_IOAPIC=y
CONFIG_ACPI_SBS=m
CONFIG_ACPI_HED=y
# CONFIG_ACPI_CUSTOM_METHOD is not set
CONFIG_ACPI_BGRT=y
# CONFIG_ACPI_REDUCED_HARDWARE_ONLY is not set
CONFIG_ACPI_NFIT=m
# CONFIG_NFIT_SECURITY_DEBUG is not set
CONFIG_ACPI_NUMA=y
CONFIG_ACPI_HMAT=y
CONFIG_HAVE_ACPI_APEI=y
CONFIG_HAVE_ACPI_APEI_NMI=y
CONFIG_ACPI_APEI=y
CONFIG_ACPI_APEI_GHES=y
CONFIG_ACPI_APEI_PCIEAER=y
CONFIG_ACPI_APEI_MEMORY_FAILURE=y
CONFIG_ACPI_APEI_EINJ=m
# CONFIG_ACPI_APEI_ERST_DEBUG is not set
CONFIG_ACPI_DPTF=y
CONFIG_DPTF_POWER=m
CONFIG_DPTF_PCH_FIVR=m
CONFIG_ACPI_WATCHDOG=y
CONFIG_ACPI_EXTLOG=m
CONFIG_ACPI_ADXL=y
CONFIG_ACPI_CONFIGFS=m
CONFIG_ACPI_PFRUT=m
CONFIG_ACPI_PCC=y
CONFIG_ACPI_FFH=y
CONFIG_PMIC_OPREGION=y
CONFIG_BYTCRC_PMIC_OPREGION=y
CONFIG_CHTCRC_PMIC_OPREGION=y
CONFIG_XPOWER_PMIC_OPREGION=y
CONFIG_BXT_WC_PMIC_OPREGION=y
CONFIG_CHT_WC_PMIC_OPREGION=y
CONFIG_CHT_DC_TI_PMIC_OPREGION=y
CONFIG_TPS68470_PMIC_OPREGION=y
CONFIG_ACPI_VIOT=y
CONFIG_ACPI_PRMT=y
CONFIG_X86_PM_TIMER=y

#
# CPU Frequency scaling
#
CONFIG_CPU_FREQ=y
CONFIG_CPU_FREQ_GOV_ATTR_SET=y
CONFIG_CPU_FREQ_GOV_COMMON=y
CONFIG_CPU_FREQ_STAT=y
# CONFIG_CPU_FREQ_DEFAULT_GOV_PERFORMANCE is not set
# CONFIG_CPU_FREQ_DEFAULT_GOV_POWERSAVE is not set
# CONFIG_CPU_FREQ_DEFAULT_GOV_USERSPACE is not set
CONFIG_CPU_FREQ_DEFAULT_GOV_SCHEDUTIL=y
CONFIG_CPU_FREQ_GOV_PERFORMANCE=y
CONFIG_CPU_FREQ_GOV_POWERSAVE=y
CONFIG_CPU_FREQ_GOV_USERSPACE=y
CONFIG_CPU_FREQ_GOV_ONDEMAND=y
CONFIG_CPU_FREQ_GOV_CONSERVATIVE=y
CONFIG_CPU_FREQ_GOV_SCHEDUTIL=y

#
# CPU frequency scaling drivers
#
CONFIG_X86_INTEL_PSTATE=y
CONFIG_X86_PCC_CPUFREQ=y
CONFIG_X86_AMD_PSTATE=y
CONFIG_X86_AMD_PSTATE_DEFAULT_MODE=3
# CONFIG_X86_AMD_PSTATE_UT is not set
CONFIG_X86_ACPI_CPUFREQ=y
CONFIG_X86_ACPI_CPUFREQ_CPB=y
CONFIG_X86_POWERNOW_K8=y
CONFIG_X86_AMD_FREQ_SENSITIVITY=m
CONFIG_X86_SPEEDSTEP_CENTRINO=y
CONFIG_X86_P4_CLOCKMOD=m

#
# shared options
#
CONFIG_X86_SPEEDSTEP_LIB=m
# end of CPU Frequency scaling

#
# CPU Idle
#
CONFIG_CPU_IDLE=y
CONFIG_CPU_IDLE_GOV_LADDER=y
CONFIG_CPU_IDLE_GOV_MENU=y
CONFIG_CPU_IDLE_GOV_TEO=y
CONFIG_CPU_IDLE_GOV_HALTPOLL=y
CONFIG_HALTPOLL_CPUIDLE=m
# end of CPU Idle

CONFIG_INTEL_IDLE=y
# end of Power management and ACPI options

#
# Bus options (PCI etc.)
#
CONFIG_PCI_DIRECT=y
CONFIG_PCI_MMCONFIG=y
CONFIG_PCI_XEN=y
CONFIG_MMCONF_FAM10H=y
# CONFIG_PCI_CNB20LE_QUIRK is not set
CONFIG_ISA_BUS=y
CONFIG_ISA_DMA_API=y
CONFIG_AMD_NB=y
# end of Bus options (PCI etc.)

#
# Binary Emulations
#
CONFIG_IA32_EMULATION=y
# CONFIG_X86_X32_ABI is not set
CONFIG_COMPAT_32=y
CONFIG_COMPAT=y
CONFIG_COMPAT_FOR_U64_ALIGNMENT=y
# end of Binary Emulations

CONFIG_HAVE_KVM=y
CONFIG_HAVE_KVM_PFNCACHE=y
CONFIG_HAVE_KVM_IRQCHIP=y
CONFIG_HAVE_KVM_IRQFD=y
CONFIG_HAVE_KVM_IRQ_ROUTING=y
CONFIG_HAVE_KVM_DIRTY_RING=y
CONFIG_HAVE_KVM_DIRTY_RING_TSO=y
CONFIG_HAVE_KVM_DIRTY_RING_ACQ_REL=y
CONFIG_HAVE_KVM_EVENTFD=y
CONFIG_KVM_MMIO=y
CONFIG_KVM_ASYNC_PF=y
CONFIG_HAVE_KVM_MSI=y
CONFIG_HAVE_KVM_CPU_RELAX_INTERCEPT=y
CONFIG_KVM_VFIO=y
CONFIG_KVM_GENERIC_DIRTYLOG_READ_PROTECT=y
CONFIG_KVM_COMPAT=y
CONFIG_HAVE_KVM_IRQ_BYPASS=y
CONFIG_HAVE_KVM_NO_POLL=y
CONFIG_KVM_XFER_TO_GUEST_WORK=y
CONFIG_HAVE_KVM_PM_NOTIFIER=y
CONFIG_KVM_GENERIC_HARDWARE_ENABLING=y
CONFIG_VIRTUALIZATION=y
CONFIG_KVM=m
CONFIG_KVM_WERROR=y
CONFIG_KVM_INTEL=m
CONFIG_X86_SGX_KVM=y
CONFIG_KVM_AMD=m
CONFIG_KVM_AMD_SEV=y
CONFIG_KVM_SMM=y
CONFIG_KVM_XEN=y
# CONFIG_KVM_PROVE_MMU is not set
CONFIG_KVM_EXTERNAL_WRITE_TRACKING=y
CONFIG_AS_AVX512=y
CONFIG_AS_SHA1_NI=y
CONFIG_AS_SHA256_NI=y
CONFIG_AS_TPAUSE=y
CONFIG_AS_GFNI=y
CONFIG_AS_WRUSS=y

#
# General architecture-dependent options
#
CONFIG_HOTPLUG_SMT=y
CONFIG_HOTPLUG_CORE_SYNC=y
CONFIG_HOTPLUG_CORE_SYNC_DEAD=y
CONFIG_HOTPLUG_CORE_SYNC_FULL=y
CONFIG_HOTPLUG_SPLIT_STARTUP=y
CONFIG_HOTPLUG_PARALLEL=y
CONFIG_GENERIC_ENTRY=y
CONFIG_KPROBES=y
CONFIG_JUMP_LABEL=y
# CONFIG_STATIC_KEYS_SELFTEST is not set
# CONFIG_STATIC_CALL_SELFTEST is not set
CONFIG_OPTPROBES=y
CONFIG_KPROBES_ON_FTRACE=y
CONFIG_UPROBES=y
CONFIG_HAVE_EFFICIENT_UNALIGNED_ACCESS=y
CONFIG_ARCH_USE_BUILTIN_BSWAP=y
CONFIG_KRETPROBES=y
CONFIG_KRETPROBE_ON_RETHOOK=y
CONFIG_USER_RETURN_NOTIFIER=y
CONFIG_HAVE_IOREMAP_PROT=y
CONFIG_HAVE_KPROBES=y
CONFIG_HAVE_KRETPROBES=y
CONFIG_HAVE_OPTPROBES=y
CONFIG_HAVE_KPROBES_ON_FTRACE=y
CONFIG_ARCH_CORRECT_STACKTRACE_ON_KRETPROBE=y
CONFIG_HAVE_FUNCTION_ERROR_INJECTION=y
CONFIG_HAVE_NMI=y
CONFIG_TRACE_IRQFLAGS_SUPPORT=y
CONFIG_TRACE_IRQFLAGS_NMI_SUPPORT=y
CONFIG_HAVE_ARCH_TRACEHOOK=y
CONFIG_HAVE_DMA_CONTIGUOUS=y
CONFIG_GENERIC_SMP_IDLE_THREAD=y
CONFIG_ARCH_HAS_FORTIFY_SOURCE=y
CONFIG_ARCH_HAS_SET_MEMORY=y
CONFIG_ARCH_HAS_SET_DIRECT_MAP=y
CONFIG_ARCH_HAS_CPU_FINALIZE_INIT=y
CONFIG_HAVE_ARCH_THREAD_STRUCT_WHITELIST=y
CONFIG_ARCH_WANTS_DYNAMIC_TASK_STRUCT=y
CONFIG_ARCH_WANTS_NO_INSTR=y
CONFIG_HAVE_ASM_MODVERSIONS=y
CONFIG_HAVE_REGS_AND_STACK_ACCESS_API=y
CONFIG_HAVE_RSEQ=y
CONFIG_HAVE_RUST=y
CONFIG_HAVE_FUNCTION_ARG_ACCESS_API=y
CONFIG_HAVE_HW_BREAKPOINT=y
CONFIG_HAVE_MIXED_BREAKPOINTS_REGS=y
CONFIG_HAVE_USER_RETURN_NOTIFIER=y
CONFIG_HAVE_PERF_EVENTS_NMI=y
CONFIG_HAVE_HARDLOCKUP_DETECTOR_PERF=y
CONFIG_HAVE_PERF_REGS=y
CONFIG_HAVE_PERF_USER_STACK_DUMP=y
CONFIG_HAVE_ARCH_JUMP_LABEL=y
CONFIG_HAVE_ARCH_JUMP_LABEL_RELATIVE=y
CONFIG_MMU_GATHER_TABLE_FREE=y
CONFIG_MMU_GATHER_RCU_TABLE_FREE=y
CONFIG_MMU_GATHER_MERGE_VMAS=y
CONFIG_MMU_LAZY_TLB_REFCOUNT=y
CONFIG_ARCH_HAVE_NMI_SAFE_CMPXCHG=y
CONFIG_ARCH_HAS_NMI_SAFE_THIS_CPU_OPS=y
CONFIG_HAVE_ALIGNED_STRUCT_PAGE=y
CONFIG_HAVE_CMPXCHG_LOCAL=y
CONFIG_HAVE_CMPXCHG_DOUBLE=y
CONFIG_ARCH_WANT_COMPAT_IPC_PARSE_VERSION=y
CONFIG_ARCH_WANT_OLD_COMPAT_IPC=y
CONFIG_HAVE_ARCH_SECCOMP=y
CONFIG_HAVE_ARCH_SECCOMP_FILTER=y
CONFIG_SECCOMP=y
CONFIG_SECCOMP_FILTER=y
# CONFIG_SECCOMP_CACHE_DEBUG is not set
CONFIG_HAVE_ARCH_STACKLEAK=y
CONFIG_HAVE_STACKPROTECTOR=y
CONFIG_STACKPROTECTOR=y
CONFIG_STACKPROTECTOR_STRONG=y
CONFIG_ARCH_SUPPORTS_LTO_CLANG=y
CONFIG_ARCH_SUPPORTS_LTO_CLANG_THIN=y
CONFIG_LTO_NONE=y
CONFIG_ARCH_SUPPORTS_CFI_CLANG=y
CONFIG_HAVE_ARCH_WITHIN_STACK_FRAMES=y
CONFIG_HAVE_CONTEXT_TRACKING_USER=y
CONFIG_HAVE_CONTEXT_TRACKING_USER_OFFSTACK=y
CONFIG_HAVE_VIRT_CPU_ACCOUNTING_GEN=y
CONFIG_HAVE_IRQ_TIME_ACCOUNTING=y
CONFIG_HAVE_MOVE_PUD=y
CONFIG_HAVE_MOVE_PMD=y
CONFIG_HAVE_ARCH_TRANSPARENT_HUGEPAGE=y
CONFIG_HAVE_ARCH_TRANSPARENT_HUGEPAGE_PUD=y
CONFIG_HAVE_ARCH_HUGE_VMAP=y
CONFIG_HAVE_ARCH_HUGE_VMALLOC=y
CONFIG_ARCH_WANT_HUGE_PMD_SHARE=y
CONFIG_ARCH_WANT_PMD_MKWRITE=y
CONFIG_HAVE_ARCH_SOFT_DIRTY=y
CONFIG_HAVE_MOD_ARCH_SPECIFIC=y
CONFIG_MODULES_USE_ELF_RELA=y
CONFIG_HAVE_IRQ_EXIT_ON_IRQ_STACK=y
CONFIG_HAVE_SOFTIRQ_ON_OWN_STACK=y
CONFIG_SOFTIRQ_ON_OWN_STACK=y
CONFIG_ARCH_HAS_ELF_RANDOMIZE=y
CONFIG_HAVE_ARCH_MMAP_RND_BITS=y
CONFIG_HAVE_EXIT_THREAD=y
CONFIG_ARCH_MMAP_RND_BITS=28
CONFIG_HAVE_ARCH_MMAP_RND_COMPAT_BITS=y
CONFIG_ARCH_MMAP_RND_COMPAT_BITS=8
CONFIG_HAVE_ARCH_COMPAT_MMAP_BASES=y
CONFIG_PAGE_SIZE_LESS_THAN_64KB=y
CONFIG_PAGE_SIZE_LESS_THAN_256KB=y
CONFIG_HAVE_OBJTOOL=y
CONFIG_HAVE_JUMP_LABEL_HACK=y
CONFIG_HAVE_NOINSTR_HACK=y
CONFIG_HAVE_NOINSTR_VALIDATION=y
CONFIG_HAVE_UACCESS_VALIDATION=y
CONFIG_HAVE_STACK_VALIDATION=y
CONFIG_HAVE_RELIABLE_STACKTRACE=y
CONFIG_ISA_BUS_API=y
CONFIG_OLD_SIGSUSPEND3=y
CONFIG_COMPAT_OLD_SIGACTION=y
CONFIG_COMPAT_32BIT_TIME=y
CONFIG_HAVE_ARCH_VMAP_STACK=y
CONFIG_HAVE_ARCH_RANDOMIZE_KSTACK_OFFSET=y
CONFIG_RANDOMIZE_KSTACK_OFFSET=y
CONFIG_RANDOMIZE_KSTACK_OFFSET_DEFAULT=y
CONFIG_ARCH_HAS_STRICT_KERNEL_RWX=y
CONFIG_STRICT_KERNEL_RWX=y
CONFIG_ARCH_HAS_STRICT_MODULE_RWX=y
CONFIG_STRICT_MODULE_RWX=y
CONFIG_HAVE_ARCH_PREL32_RELOCATIONS=y
CONFIG_ARCH_USE_MEMREMAP_PROT=y
# CONFIG_LOCK_EVENT_COUNTS is not set
CONFIG_ARCH_HAS_MEM_ENCRYPT=y
CONFIG_ARCH_HAS_CC_PLATFORM=y
CONFIG_HAVE_STATIC_CALL=y
CONFIG_HAVE_STATIC_CALL_INLINE=y
CONFIG_HAVE_PREEMPT_DYNAMIC=y
CONFIG_HAVE_PREEMPT_DYNAMIC_CALL=y
CONFIG_ARCH_WANT_LD_ORPHAN_WARN=y
CONFIG_ARCH_SUPPORTS_DEBUG_PAGEALLOC=y
CONFIG_ARCH_SUPPORTS_PAGE_TABLE_CHECK=y
CONFIG_ARCH_HAS_ELFCORE_COMPAT=y
CONFIG_ARCH_HAS_PARANOID_L1D_FLUSH=y
CONFIG_DYNAMIC_SIGFRAME=y
CONFIG_HAVE_ARCH_NODE_DEV_GROUP=y
CONFIG_ARCH_HAS_NONLEAF_PMD_YOUNG=y

#
# GCOV-based kernel profiling
#
# CONFIG_GCOV_KERNEL is not set
CONFIG_ARCH_HAS_GCOV_PROFILE_ALL=y
# end of GCOV-based kernel profiling

CONFIG_HAVE_GCC_PLUGINS=y
CONFIG_FUNCTION_ALIGNMENT_4B=y
CONFIG_FUNCTION_ALIGNMENT_16B=y
CONFIG_FUNCTION_ALIGNMENT=16
# end of General architecture-dependent options

CONFIG_RT_MUTEXES=y
CONFIG_BASE_SMALL=0
CONFIG_MODULE_SIG_FORMAT=y
CONFIG_MODULES=y
# CONFIG_MODULE_DEBUG is not set
# CONFIG_MODULE_FORCE_LOAD is not set
CONFIG_MODULE_UNLOAD=y
# CONFIG_MODULE_FORCE_UNLOAD is not set
# CONFIG_MODULE_UNLOAD_TAINT_TRACKING is not set
CONFIG_MODVERSIONS=y
CONFIG_ASM_MODVERSIONS=y
CONFIG_MODULE_SRCVERSION_ALL=y
CONFIG_MODULE_SIG=y
# CONFIG_MODULE_SIG_FORCE is not set
CONFIG_MODULE_SIG_ALL=y
# CONFIG_MODULE_SIG_SHA1 is not set
# CONFIG_MODULE_SIG_SHA224 is not set
# CONFIG_MODULE_SIG_SHA256 is not set
# CONFIG_MODULE_SIG_SHA384 is not set
CONFIG_MODULE_SIG_SHA512=y
CONFIG_MODULE_SIG_HASH="sha512"
CONFIG_MODULE_COMPRESS_NONE=y
# CONFIG_MODULE_COMPRESS_GZIP is not set
# CONFIG_MODULE_COMPRESS_XZ is not set
# CONFIG_MODULE_COMPRESS_ZSTD is not set
# CONFIG_MODULE_ALLOW_MISSING_NAMESPACE_IMPORTS is not set
CONFIG_MODPROBE_PATH="/sbin/modprobe"
# CONFIG_TRIM_UNUSED_KSYMS is not set
CONFIG_MODULES_TREE_LOOKUP=y
CONFIG_BLOCK=y
CONFIG_BLOCK_LEGACY_AUTOLOAD=y
CONFIG_BLK_RQ_ALLOC_TIME=y
CONFIG_BLK_CGROUP_RWSTAT=y
CONFIG_BLK_CGROUP_PUNT_BIO=y
CONFIG_BLK_DEV_BSG_COMMON=y
CONFIG_BLK_ICQ=y
CONFIG_BLK_DEV_BSGLIB=y
CONFIG_BLK_DEV_INTEGRITY=y
CONFIG_BLK_DEV_INTEGRITY_T10=y
CONFIG_BLK_DEV_ZONED=y
CONFIG_BLK_DEV_THROTTLING=y
# CONFIG_BLK_DEV_THROTTLING_LOW is not set
CONFIG_BLK_WBT=y
CONFIG_BLK_WBT_MQ=y
# CONFIG_BLK_CGROUP_IOLATENCY is not set
CONFIG_BLK_CGROUP_FC_APPID=y
CONFIG_BLK_CGROUP_IOCOST=y
CONFIG_BLK_CGROUP_IOPRIO=y
CONFIG_BLK_DEBUG_FS=y
CONFIG_BLK_DEBUG_FS_ZONED=y
CONFIG_BLK_SED_OPAL=y
CONFIG_BLK_INLINE_ENCRYPTION=y
CONFIG_BLK_INLINE_ENCRYPTION_FALLBACK=y

#
# Partition Types
#
CONFIG_PARTITION_ADVANCED=y
# CONFIG_ACORN_PARTITION is not set
CONFIG_AIX_PARTITION=y
CONFIG_OSF_PARTITION=y
CONFIG_AMIGA_PARTITION=y
CONFIG_ATARI_PARTITION=y
CONFIG_MAC_PARTITION=y
CONFIG_MSDOS_PARTITION=y
CONFIG_BSD_DISKLABEL=y
CONFIG_MINIX_SUBPARTITION=y
CONFIG_SOLARIS_X86_PARTITION=y
CONFIG_UNIXWARE_DISKLABEL=y
CONFIG_LDM_PARTITION=y
# CONFIG_LDM_DEBUG is not set
CONFIG_SGI_PARTITION=y
CONFIG_ULTRIX_PARTITION=y
CONFIG_SUN_PARTITION=y
CONFIG_KARMA_PARTITION=y
CONFIG_EFI_PARTITION=y
CONFIG_SYSV68_PARTITION=y
CONFIG_CMDLINE_PARTITION=y
# end of Partition Types

CONFIG_BLK_MQ_PCI=y
CONFIG_BLK_MQ_VIRTIO=y
CONFIG_BLK_PM=y
CONFIG_BLOCK_HOLDER_DEPRECATED=y
CONFIG_BLK_MQ_STACKING=y

#
# IO Schedulers
#
CONFIG_MQ_IOSCHED_DEADLINE=y
CONFIG_MQ_IOSCHED_KYBER=m
CONFIG_IOSCHED_BFQ=m
CONFIG_BFQ_GROUP_IOSCHED=y
# CONFIG_BFQ_CGROUP_DEBUG is not set
# end of IO Schedulers

CONFIG_PREEMPT_NOTIFIERS=y
CONFIG_PADATA=y
CONFIG_ASN1=y
CONFIG_UNINLINE_SPIN_UNLOCK=y
CONFIG_ARCH_SUPPORTS_ATOMIC_RMW=y
CONFIG_MUTEX_SPIN_ON_OWNER=y
CONFIG_RWSEM_SPIN_ON_OWNER=y
CONFIG_LOCK_SPIN_ON_OWNER=y
CONFIG_ARCH_USE_QUEUED_SPINLOCKS=y
CONFIG_QUEUED_SPINLOCKS=y
CONFIG_ARCH_USE_QUEUED_RWLOCKS=y
CONFIG_QUEUED_RWLOCKS=y
CONFIG_ARCH_HAS_NON_OVERLAPPING_ADDRESS_SPACE=y
CONFIG_ARCH_HAS_SYNC_CORE_BEFORE_USERMODE=y
CONFIG_ARCH_HAS_SYSCALL_WRAPPER=y
CONFIG_FREEZER=y

#
# Executable file formats
#
CONFIG_BINFMT_ELF=y
CONFIG_COMPAT_BINFMT_ELF=y
CONFIG_ELFCORE=y
CONFIG_CORE_DUMP_DEFAULT_ELF_HEADERS=y
CONFIG_BINFMT_SCRIPT=y
CONFIG_BINFMT_MISC=m
CONFIG_COREDUMP=y
# end of Executable file formats

#
# Memory Management options
#
CONFIG_ZPOOL=y
CONFIG_SWAP=y
CONFIG_ZSWAP=y
# CONFIG_ZSWAP_DEFAULT_ON is not set
# CONFIG_ZSWAP_EXCLUSIVE_LOADS_DEFAULT_ON is not set
# CONFIG_ZSWAP_COMPRESSOR_DEFAULT_DEFLATE is not set
CONFIG_ZSWAP_COMPRESSOR_DEFAULT_LZO=y
# CONFIG_ZSWAP_COMPRESSOR_DEFAULT_842 is not set
# CONFIG_ZSWAP_COMPRESSOR_DEFAULT_LZ4 is not set
# CONFIG_ZSWAP_COMPRESSOR_DEFAULT_LZ4HC is not set
# CONFIG_ZSWAP_COMPRESSOR_DEFAULT_ZSTD is not set
CONFIG_ZSWAP_COMPRESSOR_DEFAULT="lzo"
CONFIG_ZSWAP_ZPOOL_DEFAULT_ZBUD=y
# CONFIG_ZSWAP_ZPOOL_DEFAULT_Z3FOLD is not set
# CONFIG_ZSWAP_ZPOOL_DEFAULT_ZSMALLOC is not set
CONFIG_ZSWAP_ZPOOL_DEFAULT="zbud"
CONFIG_ZBUD=y
CONFIG_Z3FOLD=m
CONFIG_ZSMALLOC=y
# CONFIG_ZSMALLOC_STAT is not set
CONFIG_ZSMALLOC_CHAIN_SIZE=8

#
# SLAB allocator options
#
# CONFIG_SLAB_DEPRECATED is not set
CONFIG_SLUB=y
# CONFIG_SLUB_TINY is not set
CONFIG_SLAB_MERGE_DEFAULT=y
CONFIG_SLAB_FREELIST_RANDOM=y
CONFIG_SLAB_FREELIST_HARDENED=y
# CONFIG_SLUB_STATS is not set
CONFIG_SLUB_CPU_PARTIAL=y
# CONFIG_RANDOM_KMALLOC_CACHES is not set
# end of SLAB allocator options

CONFIG_SHUFFLE_PAGE_ALLOCATOR=y
# CONFIG_COMPAT_BRK is not set
CONFIG_SPARSEMEM=y
CONFIG_SPARSEMEM_EXTREME=y
CONFIG_SPARSEMEM_VMEMMAP_ENABLE=y
CONFIG_SPARSEMEM_VMEMMAP=y
CONFIG_ARCH_WANT_OPTIMIZE_DAX_VMEMMAP=y
CONFIG_ARCH_WANT_OPTIMIZE_HUGETLB_VMEMMAP=y
CONFIG_HAVE_FAST_GUP=y
CONFIG_NUMA_KEEP_MEMINFO=y
CONFIG_MEMORY_ISOLATION=y
CONFIG_EXCLUSIVE_SYSTEM_RAM=y
CONFIG_HAVE_BOOTMEM_INFO_NODE=y
CONFIG_ARCH_ENABLE_MEMORY_HOTPLUG=y
CONFIG_ARCH_ENABLE_MEMORY_HOTREMOVE=y
CONFIG_MEMORY_HOTPLUG=y
CONFIG_MEMORY_HOTPLUG_DEFAULT_ONLINE=y
CONFIG_MEMORY_HOTREMOVE=y
CONFIG_MHP_MEMMAP_ON_MEMORY=y
CONFIG_ARCH_MHP_MEMMAP_ON_MEMORY_ENABLE=y
CONFIG_SPLIT_PTLOCK_CPUS=4
CONFIG_ARCH_ENABLE_SPLIT_PMD_PTLOCK=y
CONFIG_MEMORY_BALLOON=y
CONFIG_BALLOON_COMPACTION=y
CONFIG_COMPACTION=y
CONFIG_COMPACT_UNEVICTABLE_DEFAULT=1
CONFIG_PAGE_REPORTING=y
CONFIG_MIGRATION=y
CONFIG_DEVICE_MIGRATION=y
CONFIG_ARCH_ENABLE_HUGEPAGE_MIGRATION=y
CONFIG_ARCH_ENABLE_THP_MIGRATION=y
CONFIG_CONTIG_ALLOC=y
CONFIG_PHYS_ADDR_T_64BIT=y
CONFIG_MMU_NOTIFIER=y
CONFIG_KSM=y
CONFIG_DEFAULT_MMAP_MIN_ADDR=65536
CONFIG_ARCH_SUPPORTS_MEMORY_FAILURE=y
CONFIG_MEMORY_FAILURE=y
CONFIG_HWPOISON_INJECT=m
CONFIG_ARCH_WANT_GENERAL_HUGETLB=y
CONFIG_ARCH_WANTS_THP_SWAP=y
CONFIG_TRANSPARENT_HUGEPAGE=y
# CONFIG_TRANSPARENT_HUGEPAGE_ALWAYS is not set
CONFIG_TRANSPARENT_HUGEPAGE_MADVISE=y
CONFIG_THP_SWAP=y
# CONFIG_READ_ONLY_THP_FOR_FS is not set
CONFIG_NEED_PER_CPU_EMBED_FIRST_CHUNK=y
CONFIG_NEED_PER_CPU_PAGE_FIRST_CHUNK=y
CONFIG_USE_PERCPU_NUMA_NODE_ID=y
CONFIG_HAVE_SETUP_PER_CPU_AREA=y
# CONFIG_CMA is not set
CONFIG_MEM_SOFT_DIRTY=y
CONFIG_GENERIC_EARLY_IOREMAP=y
# CONFIG_DEFERRED_STRUCT_PAGE_INIT is not set
CONFIG_PAGE_IDLE_FLAG=y
CONFIG_IDLE_PAGE_TRACKING=y
CONFIG_ARCH_HAS_CACHE_LINE_SIZE=y
CONFIG_ARCH_HAS_CURRENT_STACK_POINTER=y
CONFIG_ARCH_HAS_PTE_DEVMAP=y
CONFIG_ARCH_HAS_ZONE_DMA_SET=y
CONFIG_ZONE_DMA=y
CONFIG_ZONE_DMA32=y
CONFIG_ZONE_DEVICE=y
CONFIG_HMM_MIRROR=y
CONFIG_GET_FREE_REGION=y
CONFIG_DEVICE_PRIVATE=y
CONFIG_VMAP_PFN=y
CONFIG_ARCH_USES_HIGH_VMA_FLAGS=y
CONFIG_ARCH_HAS_PKEYS=y
CONFIG_VM_EVENT_COUNTERS=y
# CONFIG_PERCPU_STATS is not set
# CONFIG_GUP_TEST is not set
# CONFIG_DMAPOOL_TEST is not set
CONFIG_ARCH_HAS_PTE_SPECIAL=y
CONFIG_MAPPING_DIRTY_HELPERS=y
CONFIG_MEMFD_CREATE=y
CONFIG_SECRETMEM=y
CONFIG_ANON_VMA_NAME=y
CONFIG_USERFAULTFD=y
CONFIG_HAVE_ARCH_USERFAULTFD_WP=y
CONFIG_HAVE_ARCH_USERFAULTFD_MINOR=y
CONFIG_PTE_MARKER_UFFD_WP=y
CONFIG_LRU_GEN=y
# CONFIG_LRU_GEN_ENABLED is not set
# CONFIG_LRU_GEN_STATS is not set
CONFIG_ARCH_SUPPORTS_PER_VMA_LOCK=y
CONFIG_PER_VMA_LOCK=y
CONFIG_LOCK_MM_AND_FIND_VMA=y

#
# Data Access Monitoring
#
# CONFIG_DAMON is not set
# end of Data Access Monitoring
# end of Memory Management options

CONFIG_NET=y
CONFIG_WANT_COMPAT_NETLINK_MESSAGES=y
CONFIG_COMPAT_NETLINK_MESSAGES=y
CONFIG_NET_INGRESS=y
CONFIG_NET_EGRESS=y
CONFIG_NET_XGRESS=y
CONFIG_NET_REDIRECT=y
CONFIG_SKB_EXTENSIONS=y

#
# Networking options
#
CONFIG_PACKET=y
CONFIG_PACKET_DIAG=m
CONFIG_UNIX=y
CONFIG_UNIX_SCM=y
CONFIG_AF_UNIX_OOB=y
CONFIG_UNIX_DIAG=m
CONFIG_TLS=m
CONFIG_TLS_DEVICE=y
# CONFIG_TLS_TOE is not set
CONFIG_XFRM=y
CONFIG_XFRM_OFFLOAD=y
CONFIG_XFRM_ALGO=m
CONFIG_XFRM_USER=m
CONFIG_XFRM_USER_COMPAT=m
CONFIG_XFRM_INTERFACE=m
# CONFIG_XFRM_SUB_POLICY is not set
# CONFIG_XFRM_MIGRATE is not set
CONFIG_XFRM_STATISTICS=y
CONFIG_XFRM_AH=m
CONFIG_XFRM_ESP=m
CONFIG_XFRM_IPCOMP=m
CONFIG_NET_KEY=m
# CONFIG_NET_KEY_MIGRATE is not set
CONFIG_XFRM_ESPINTCP=y
CONFIG_SMC=m
CONFIG_SMC_DIAG=m
CONFIG_XDP_SOCKETS=y
CONFIG_XDP_SOCKETS_DIAG=m
CONFIG_NET_HANDSHAKE=y
CONFIG_INET=y
CONFIG_IP_MULTICAST=y
CONFIG_IP_ADVANCED_ROUTER=y
CONFIG_IP_FIB_TRIE_STATS=y
CONFIG_IP_MULTIPLE_TABLES=y
CONFIG_IP_ROUTE_MULTIPATH=y
CONFIG_IP_ROUTE_VERBOSE=y
CONFIG_IP_ROUTE_CLASSID=y
# CONFIG_IP_PNP is not set
CONFIG_NET_IPIP=m
CONFIG_NET_IPGRE_DEMUX=m
CONFIG_NET_IP_TUNNEL=m
CONFIG_NET_IPGRE=m
CONFIG_NET_IPGRE_BROADCAST=y
CONFIG_IP_MROUTE_COMMON=y
CONFIG_IP_MROUTE=y
CONFIG_IP_MROUTE_MULTIPLE_TABLES=y
CONFIG_IP_PIMSM_V1=y
CONFIG_IP_PIMSM_V2=y
CONFIG_SYN_COOKIES=y
CONFIG_NET_IPVTI=m
CONFIG_NET_UDP_TUNNEL=m
CONFIG_NET_FOU=m
CONFIG_NET_FOU_IP_TUNNELS=y
CONFIG_INET_AH=m
CONFIG_INET_ESP=m
CONFIG_INET_ESP_OFFLOAD=m
CONFIG_INET_ESPINTCP=y
CONFIG_INET_IPCOMP=m
CONFIG_INET_TABLE_PERTURB_ORDER=16
CONFIG_INET_XFRM_TUNNEL=m
CONFIG_INET_TUNNEL=m
CONFIG_INET_DIAG=m
CONFIG_INET_TCP_DIAG=m
CONFIG_INET_UDP_DIAG=m
CONFIG_INET_RAW_DIAG=m
CONFIG_INET_DIAG_DESTROY=y
CONFIG_TCP_CONG_ADVANCED=y
CONFIG_TCP_CONG_BIC=m
CONFIG_TCP_CONG_CUBIC=y
CONFIG_TCP_CONG_WESTWOOD=m
CONFIG_TCP_CONG_HTCP=m
CONFIG_TCP_CONG_HSTCP=m
CONFIG_TCP_CONG_HYBLA=m
CONFIG_TCP_CONG_VEGAS=m
CONFIG_TCP_CONG_NV=m
CONFIG_TCP_CONG_SCALABLE=m
CONFIG_TCP_CONG_LP=m
CONFIG_TCP_CONG_VENO=m
CONFIG_TCP_CONG_YEAH=m
CONFIG_TCP_CONG_ILLINOIS=m
CONFIG_TCP_CONG_DCTCP=m
CONFIG_TCP_CONG_CDG=m
CONFIG_TCP_CONG_BBR=m
CONFIG_DEFAULT_CUBIC=y
# CONFIG_DEFAULT_RENO is not set
CONFIG_DEFAULT_TCP_CONG="cubic"
CONFIG_TCP_MD5SIG=y
CONFIG_IPV6=y
CONFIG_IPV6_ROUTER_PREF=y
CONFIG_IPV6_ROUTE_INFO=y
# CONFIG_IPV6_OPTIMISTIC_DAD is not set
CONFIG_INET6_AH=m
CONFIG_INET6_ESP=m
CONFIG_INET6_ESP_OFFLOAD=m
CONFIG_INET6_ESPINTCP=y
CONFIG_INET6_IPCOMP=m
CONFIG_IPV6_MIP6=m
CONFIG_IPV6_ILA=m
CONFIG_INET6_XFRM_TUNNEL=m
CONFIG_INET6_TUNNEL=m
CONFIG_IPV6_VTI=m
CONFIG_IPV6_SIT=m
CONFIG_IPV6_SIT_6RD=y
CONFIG_IPV6_NDISC_NODETYPE=y
CONFIG_IPV6_TUNNEL=m
CONFIG_IPV6_GRE=m
CONFIG_IPV6_FOU=m
CONFIG_IPV6_FOU_TUNNEL=m
CONFIG_IPV6_MULTIPLE_TABLES=y
CONFIG_IPV6_SUBTREES=y
CONFIG_IPV6_MROUTE=y
CONFIG_IPV6_MROUTE_MULTIPLE_TABLES=y
CONFIG_IPV6_PIMSM_V2=y
CONFIG_IPV6_SEG6_LWTUNNEL=y
CONFIG_IPV6_SEG6_HMAC=y
CONFIG_IPV6_SEG6_BPF=y
# CONFIG_IPV6_RPL_LWTUNNEL is not set
CONFIG_IPV6_IOAM6_LWTUNNEL=y
CONFIG_NETLABEL=y
CONFIG_MPTCP=y
CONFIG_INET_MPTCP_DIAG=m
CONFIG_MPTCP_IPV6=y
CONFIG_NETWORK_SECMARK=y
CONFIG_NET_PTP_CLASSIFY=y
CONFIG_NETWORK_PHY_TIMESTAMPING=y
CONFIG_NETFILTER=y
CONFIG_NETFILTER_ADVANCED=y
CONFIG_BRIDGE_NETFILTER=m

#
# Core Netfilter Configuration
#
CONFIG_NETFILTER_INGRESS=y
CONFIG_NETFILTER_EGRESS=y
CONFIG_NETFILTER_SKIP_EGRESS=y
CONFIG_NETFILTER_NETLINK=m
CONFIG_NETFILTER_FAMILY_BRIDGE=y
CONFIG_NETFILTER_FAMILY_ARP=y
CONFIG_NETFILTER_BPF_LINK=y
CONFIG_NETFILTER_NETLINK_HOOK=m
CONFIG_NETFILTER_NETLINK_ACCT=m
CONFIG_NETFILTER_NETLINK_QUEUE=m
CONFIG_NETFILTER_NETLINK_LOG=m
CONFIG_NETFILTER_NETLINK_OSF=m
CONFIG_NF_CONNTRACK=m
CONFIG_NF_LOG_SYSLOG=m
CONFIG_NETFILTER_CONNCOUNT=m
CONFIG_NF_CONNTRACK_MARK=y
CONFIG_NF_CONNTRACK_SECMARK=y
CONFIG_NF_CONNTRACK_ZONES=y
# CONFIG_NF_CONNTRACK_PROCFS is not set
CONFIG_NF_CONNTRACK_EVENTS=y
CONFIG_NF_CONNTRACK_TIMEOUT=y
CONFIG_NF_CONNTRACK_TIMESTAMP=y
CONFIG_NF_CONNTRACK_LABELS=y
CONFIG_NF_CONNTRACK_OVS=y
CONFIG_NF_CT_PROTO_DCCP=y
CONFIG_NF_CT_PROTO_GRE=y
CONFIG_NF_CT_PROTO_SCTP=y
CONFIG_NF_CT_PROTO_UDPLITE=y
CONFIG_NF_CONNTRACK_AMANDA=m
CONFIG_NF_CONNTRACK_FTP=m
CONFIG_NF_CONNTRACK_H323=m
CONFIG_NF_CONNTRACK_IRC=m
CONFIG_NF_CONNTRACK_BROADCAST=m
CONFIG_NF_CONNTRACK_NETBIOS_NS=m
CONFIG_NF_CONNTRACK_SNMP=m
CONFIG_NF_CONNTRACK_PPTP=m
CONFIG_NF_CONNTRACK_SANE=m
CONFIG_NF_CONNTRACK_SIP=m
CONFIG_NF_CONNTRACK_TFTP=m
CONFIG_NF_CT_NETLINK=m
CONFIG_NF_CT_NETLINK_TIMEOUT=m
CONFIG_NF_CT_NETLINK_HELPER=m
CONFIG_NETFILTER_NETLINK_GLUE_CT=y
CONFIG_NF_NAT=m
CONFIG_NF_NAT_AMANDA=m
CONFIG_NF_NAT_FTP=m
CONFIG_NF_NAT_IRC=m
CONFIG_NF_NAT_SIP=m
CONFIG_NF_NAT_TFTP=m
CONFIG_NF_NAT_REDIRECT=y
CONFIG_NF_NAT_MASQUERADE=y
CONFIG_NF_NAT_OVS=y
CONFIG_NETFILTER_SYNPROXY=m
CONFIG_NF_TABLES=m
CONFIG_NF_TABLES_INET=y
CONFIG_NF_TABLES_NETDEV=y
CONFIG_NFT_NUMGEN=m
CONFIG_NFT_CT=m
CONFIG_NFT_FLOW_OFFLOAD=m
CONFIG_NFT_CONNLIMIT=m
CONFIG_NFT_LOG=m
CONFIG_NFT_LIMIT=m
CONFIG_NFT_MASQ=m
CONFIG_NFT_REDIR=m
CONFIG_NFT_NAT=m
CONFIG_NFT_TUNNEL=m
CONFIG_NFT_QUEUE=m
CONFIG_NFT_QUOTA=m
CONFIG_NFT_REJECT=m
CONFIG_NFT_REJECT_INET=m
CONFIG_NFT_COMPAT=m
CONFIG_NFT_HASH=m
CONFIG_NFT_FIB=m
CONFIG_NFT_FIB_INET=m
CONFIG_NFT_XFRM=m
CONFIG_NFT_SOCKET=m
CONFIG_NFT_OSF=m
CONFIG_NFT_TPROXY=m
CONFIG_NFT_SYNPROXY=m
CONFIG_NF_DUP_NETDEV=m
CONFIG_NFT_DUP_NETDEV=m
CONFIG_NFT_FWD_NETDEV=m
CONFIG_NFT_FIB_NETDEV=m
CONFIG_NFT_REJECT_NETDEV=m
CONFIG_NF_FLOW_TABLE_INET=m
CONFIG_NF_FLOW_TABLE=m
# CONFIG_NF_FLOW_TABLE_PROCFS is not set
CONFIG_NETFILTER_XTABLES=m
CONFIG_NETFILTER_XTABLES_COMPAT=y

#
# Xtables combined modules
#
CONFIG_NETFILTER_XT_MARK=m
CONFIG_NETFILTER_XT_CONNMARK=m
CONFIG_NETFILTER_XT_SET=m

#
# Xtables targets
#
CONFIG_NETFILTER_XT_TARGET_AUDIT=m
CONFIG_NETFILTER_XT_TARGET_CHECKSUM=m
CONFIG_NETFILTER_XT_TARGET_CLASSIFY=m
CONFIG_NETFILTER_XT_TARGET_CONNMARK=m
CONFIG_NETFILTER_XT_TARGET_CONNSECMARK=m
CONFIG_NETFILTER_XT_TARGET_CT=m
CONFIG_NETFILTER_XT_TARGET_DSCP=m
CONFIG_NETFILTER_XT_TARGET_HL=m
CONFIG_NETFILTER_XT_TARGET_HMARK=m
CONFIG_NETFILTER_XT_TARGET_IDLETIMER=m
CONFIG_NETFILTER_XT_TARGET_LED=m
CONFIG_NETFILTER_XT_TARGET_LOG=m
CONFIG_NETFILTER_XT_TARGET_MARK=m
CONFIG_NETFILTER_XT_NAT=m
CONFIG_NETFILTER_XT_TARGET_NETMAP=m
CONFIG_NETFILTER_XT_TARGET_NFLOG=m
CONFIG_NETFILTER_XT_TARGET_NFQUEUE=m
# CONFIG_NETFILTER_XT_TARGET_NOTRACK is not set
CONFIG_NETFILTER_XT_TARGET_RATEEST=m
CONFIG_NETFILTER_XT_TARGET_REDIRECT=m
CONFIG_NETFILTER_XT_TARGET_MASQUERADE=m
CONFIG_NETFILTER_XT_TARGET_TEE=m
CONFIG_NETFILTER_XT_TARGET_TPROXY=m
CONFIG_NETFILTER_XT_TARGET_TRACE=m
CONFIG_NETFILTER_XT_TARGET_SECMARK=m
CONFIG_NETFILTER_XT_TARGET_TCPMSS=m
CONFIG_NETFILTER_XT_TARGET_TCPOPTSTRIP=m

#
# Xtables matches
#
CONFIG_NETFILTER_XT_MATCH_ADDRTYPE=m
CONFIG_NETFILTER_XT_MATCH_BPF=m
CONFIG_NETFILTER_XT_MATCH_CGROUP=m
CONFIG_NETFILTER_XT_MATCH_CLUSTER=m
CONFIG_NETFILTER_XT_MATCH_COMMENT=m
CONFIG_NETFILTER_XT_MATCH_CONNBYTES=m
CONFIG_NETFILTER_XT_MATCH_CONNLABEL=m
CONFIG_NETFILTER_XT_MATCH_CONNLIMIT=m
CONFIG_NETFILTER_XT_MATCH_CONNMARK=m
CONFIG_NETFILTER_XT_MATCH_CONNTRACK=m
CONFIG_NETFILTER_XT_MATCH_CPU=m
CONFIG_NETFILTER_XT_MATCH_DCCP=m
CONFIG_NETFILTER_XT_MATCH_DEVGROUP=m
CONFIG_NETFILTER_XT_MATCH_DSCP=m
CONFIG_NETFILTER_XT_MATCH_ECN=m
CONFIG_NETFILTER_XT_MATCH_ESP=m
CONFIG_NETFILTER_XT_MATCH_HASHLIMIT=m
CONFIG_NETFILTER_XT_MATCH_HELPER=m
CONFIG_NETFILTER_XT_MATCH_HL=m
CONFIG_NETFILTER_XT_MATCH_IPCOMP=m
CONFIG_NETFILTER_XT_MATCH_IPRANGE=m
CONFIG_NETFILTER_XT_MATCH_IPVS=m
CONFIG_NETFILTER_XT_MATCH_L2TP=m
CONFIG_NETFILTER_XT_MATCH_LENGTH=m
CONFIG_NETFILTER_XT_MATCH_LIMIT=m
CONFIG_NETFILTER_XT_MATCH_MAC=m
CONFIG_NETFILTER_XT_MATCH_MARK=m
CONFIG_NETFILTER_XT_MATCH_MULTIPORT=m
CONFIG_NETFILTER_XT_MATCH_NFACCT=m
CONFIG_NETFILTER_XT_MATCH_OSF=m
CONFIG_NETFILTER_XT_MATCH_OWNER=m
CONFIG_NETFILTER_XT_MATCH_POLICY=m
CONFIG_NETFILTER_XT_MATCH_PHYSDEV=m
CONFIG_NETFILTER_XT_MATCH_PKTTYPE=m
CONFIG_NETFILTER_XT_MATCH_QUOTA=m
CONFIG_NETFILTER_XT_MATCH_RATEEST=m
CONFIG_NETFILTER_XT_MATCH_REALM=m
CONFIG_NETFILTER_XT_MATCH_RECENT=m
CONFIG_NETFILTER_XT_MATCH_SCTP=m
CONFIG_NETFILTER_XT_MATCH_SOCKET=m
CONFIG_NETFILTER_XT_MATCH_STATE=m
CONFIG_NETFILTER_XT_MATCH_STATISTIC=m
CONFIG_NETFILTER_XT_MATCH_STRING=m
CONFIG_NETFILTER_XT_MATCH_TCPMSS=m
CONFIG_NETFILTER_XT_MATCH_TIME=m
CONFIG_NETFILTER_XT_MATCH_U32=m
# end of Core Netfilter Configuration

CONFIG_IP_SET=m
CONFIG_IP_SET_MAX=256
CONFIG_IP_SET_BITMAP_IP=m
CONFIG_IP_SET_BITMAP_IPMAC=m
CONFIG_IP_SET_BITMAP_PORT=m
CONFIG_IP_SET_HASH_IP=m
CONFIG_IP_SET_HASH_IPMARK=m
CONFIG_IP_SET_HASH_IPPORT=m
CONFIG_IP_SET_HASH_IPPORTIP=m
CONFIG_IP_SET_HASH_IPPORTNET=m
CONFIG_IP_SET_HASH_IPMAC=m
CONFIG_IP_SET_HASH_MAC=m
CONFIG_IP_SET_HASH_NETPORTNET=m
CONFIG_IP_SET_HASH_NET=m
CONFIG_IP_SET_HASH_NETNET=m
CONFIG_IP_SET_HASH_NETPORT=m
CONFIG_IP_SET_HASH_NETIFACE=m
CONFIG_IP_SET_LIST_SET=m
CONFIG_IP_VS=m
CONFIG_IP_VS_IPV6=y
# CONFIG_IP_VS_DEBUG is not set
CONFIG_IP_VS_TAB_BITS=12

#
# IPVS transport protocol load balancing support
#
CONFIG_IP_VS_PROTO_TCP=y
CONFIG_IP_VS_PROTO_UDP=y
CONFIG_IP_VS_PROTO_AH_ESP=y
CONFIG_IP_VS_PROTO_ESP=y
CONFIG_IP_VS_PROTO_AH=y
CONFIG_IP_VS_PROTO_SCTP=y

#
# IPVS scheduler
#
CONFIG_IP_VS_RR=m
CONFIG_IP_VS_WRR=m
CONFIG_IP_VS_LC=m
CONFIG_IP_VS_WLC=m
CONFIG_IP_VS_FO=m
CONFIG_IP_VS_OVF=m
CONFIG_IP_VS_LBLC=m
CONFIG_IP_VS_LBLCR=m
CONFIG_IP_VS_DH=m
CONFIG_IP_VS_SH=m
CONFIG_IP_VS_MH=m
CONFIG_IP_VS_SED=m
CONFIG_IP_VS_NQ=m
CONFIG_IP_VS_TWOS=m

#
# IPVS SH scheduler
#
CONFIG_IP_VS_SH_TAB_BITS=8

#
# IPVS MH scheduler
#
CONFIG_IP_VS_MH_TAB_INDEX=12

#
# IPVS application helper
#
CONFIG_IP_VS_FTP=m
CONFIG_IP_VS_NFCT=y
CONFIG_IP_VS_PE_SIP=m

#
# IP: Netfilter Configuration
#
CONFIG_NF_DEFRAG_IPV4=m
CONFIG_NF_SOCKET_IPV4=m
CONFIG_NF_TPROXY_IPV4=m
CONFIG_NF_TABLES_IPV4=y
CONFIG_NFT_REJECT_IPV4=m
CONFIG_NFT_DUP_IPV4=m
CONFIG_NFT_FIB_IPV4=m
CONFIG_NF_TABLES_ARP=y
CONFIG_NF_DUP_IPV4=m
CONFIG_NF_LOG_ARP=m
CONFIG_NF_LOG_IPV4=m
CONFIG_NF_REJECT_IPV4=m
CONFIG_NF_NAT_SNMP_BASIC=m
CONFIG_NF_NAT_PPTP=m
CONFIG_NF_NAT_H323=m
CONFIG_IP_NF_IPTABLES=m
CONFIG_IP_NF_MATCH_AH=m
CONFIG_IP_NF_MATCH_ECN=m
CONFIG_IP_NF_MATCH_RPFILTER=m
CONFIG_IP_NF_MATCH_TTL=m
CONFIG_IP_NF_FILTER=m
CONFIG_IP_NF_TARGET_REJECT=m
CONFIG_IP_NF_TARGET_SYNPROXY=m
CONFIG_IP_NF_NAT=m
CONFIG_IP_NF_TARGET_MASQUERADE=m
CONFIG_IP_NF_TARGET_NETMAP=m
CONFIG_IP_NF_TARGET_REDIRECT=m
CONFIG_IP_NF_MANGLE=m
CONFIG_IP_NF_TARGET_ECN=m
CONFIG_IP_NF_TARGET_TTL=m
CONFIG_IP_NF_RAW=m
CONFIG_IP_NF_SECURITY=m
CONFIG_IP_NF_ARPTABLES=m
CONFIG_IP_NF_ARPFILTER=m
CONFIG_IP_NF_ARP_MANGLE=m
# end of IP: Netfilter Configuration

#
# IPv6: Netfilter Configuration
#
CONFIG_NF_SOCKET_IPV6=m
CONFIG_NF_TPROXY_IPV6=m
CONFIG_NF_TABLES_IPV6=y
CONFIG_NFT_REJECT_IPV6=m
CONFIG_NFT_DUP_IPV6=m
CONFIG_NFT_FIB_IPV6=m
CONFIG_NF_DUP_IPV6=m
CONFIG_NF_REJECT_IPV6=m
CONFIG_NF_LOG_IPV6=m
CONFIG_IP6_NF_IPTABLES=m
CONFIG_IP6_NF_MATCH_AH=m
CONFIG_IP6_NF_MATCH_EUI64=m
CONFIG_IP6_NF_MATCH_FRAG=m
CONFIG_IP6_NF_MATCH_OPTS=m
CONFIG_IP6_NF_MATCH_HL=m
CONFIG_IP6_NF_MATCH_IPV6HEADER=m
CONFIG_IP6_NF_MATCH_MH=m
CONFIG_IP6_NF_MATCH_RPFILTER=m
CONFIG_IP6_NF_MATCH_RT=m
CONFIG_IP6_NF_MATCH_SRH=m
CONFIG_IP6_NF_TARGET_HL=m
CONFIG_IP6_NF_FILTER=m
CONFIG_IP6_NF_TARGET_REJECT=m
CONFIG_IP6_NF_TARGET_SYNPROXY=m
CONFIG_IP6_NF_MANGLE=m
CONFIG_IP6_NF_RAW=m
CONFIG_IP6_NF_SECURITY=m
CONFIG_IP6_NF_NAT=m
CONFIG_IP6_NF_TARGET_MASQUERADE=m
CONFIG_IP6_NF_TARGET_NPT=m
# end of IPv6: Netfilter Configuration

CONFIG_NF_DEFRAG_IPV6=m
CONFIG_NF_TABLES_BRIDGE=m
CONFIG_NFT_BRIDGE_META=m
CONFIG_NFT_BRIDGE_REJECT=m
CONFIG_NF_CONNTRACK_BRIDGE=m
CONFIG_BRIDGE_NF_EBTABLES=m
CONFIG_BRIDGE_EBT_BROUTE=m
CONFIG_BRIDGE_EBT_T_FILTER=m
CONFIG_BRIDGE_EBT_T_NAT=m
CONFIG_BRIDGE_EBT_802_3=m
CONFIG_BRIDGE_EBT_AMONG=m
CONFIG_BRIDGE_EBT_ARP=m
CONFIG_BRIDGE_EBT_IP=m
CONFIG_BRIDGE_EBT_IP6=m
CONFIG_BRIDGE_EBT_LIMIT=m
CONFIG_BRIDGE_EBT_MARK=m
CONFIG_BRIDGE_EBT_PKTTYPE=m
CONFIG_BRIDGE_EBT_STP=m
CONFIG_BRIDGE_EBT_VLAN=m
CONFIG_BRIDGE_EBT_ARPREPLY=m
CONFIG_BRIDGE_EBT_DNAT=m
CONFIG_BRIDGE_EBT_MARK_T=m
CONFIG_BRIDGE_EBT_REDIRECT=m
CONFIG_BRIDGE_EBT_SNAT=m
CONFIG_BRIDGE_EBT_LOG=m
CONFIG_BRIDGE_EBT_NFLOG=m
CONFIG_BPFILTER=y
CONFIG_BPFILTER_UMH=m
CONFIG_IP_DCCP=m
CONFIG_INET_DCCP_DIAG=m

#
# DCCP CCIDs Configuration
#
# CONFIG_IP_DCCP_CCID2_DEBUG is not set
# CONFIG_IP_DCCP_CCID3 is not set
# end of DCCP CCIDs Configuration

#
# DCCP Kernel Hacking
#
# CONFIG_IP_DCCP_DEBUG is not set
# end of DCCP Kernel Hacking

CONFIG_IP_SCTP=m
# CONFIG_SCTP_DBG_OBJCNT is not set
# CONFIG_SCTP_DEFAULT_COOKIE_HMAC_MD5 is not set
CONFIG_SCTP_DEFAULT_COOKIE_HMAC_SHA1=y
# CONFIG_SCTP_DEFAULT_COOKIE_HMAC_NONE is not set
CONFIG_SCTP_COOKIE_HMAC_MD5=y
CONFIG_SCTP_COOKIE_HMAC_SHA1=y
CONFIG_INET_SCTP_DIAG=m
CONFIG_RDS=m
CONFIG_RDS_RDMA=m
CONFIG_RDS_TCP=m
# CONFIG_RDS_DEBUG is not set
CONFIG_TIPC=m
CONFIG_TIPC_MEDIA_IB=y
CONFIG_TIPC_MEDIA_UDP=y
CONFIG_TIPC_CRYPTO=y
CONFIG_TIPC_DIAG=m
CONFIG_ATM=m
CONFIG_ATM_CLIP=m
# CONFIG_ATM_CLIP_NO_ICMP is not set
CONFIG_ATM_LANE=m
CONFIG_ATM_MPOA=m
CONFIG_ATM_BR2684=m
# CONFIG_ATM_BR2684_IPFILTER is not set
CONFIG_L2TP=m
CONFIG_L2TP_DEBUGFS=m
CONFIG_L2TP_V3=y
CONFIG_L2TP_IP=m
CONFIG_L2TP_ETH=m
CONFIG_STP=m
CONFIG_GARP=m
CONFIG_MRP=m
CONFIG_BRIDGE=m
CONFIG_BRIDGE_IGMP_SNOOPING=y
CONFIG_BRIDGE_VLAN_FILTERING=y
CONFIG_BRIDGE_MRP=y
CONFIG_BRIDGE_CFM=y
CONFIG_NET_DSA=m
CONFIG_NET_DSA_TAG_NONE=m
CONFIG_NET_DSA_TAG_AR9331=m
CONFIG_NET_DSA_TAG_BRCM_COMMON=m
CONFIG_NET_DSA_TAG_BRCM=m
CONFIG_NET_DSA_TAG_BRCM_LEGACY=m
CONFIG_NET_DSA_TAG_BRCM_PREPEND=m
CONFIG_NET_DSA_TAG_HELLCREEK=m
CONFIG_NET_DSA_TAG_GSWIP=m
CONFIG_NET_DSA_TAG_DSA_COMMON=m
CONFIG_NET_DSA_TAG_DSA=m
CONFIG_NET_DSA_TAG_EDSA=m
CONFIG_NET_DSA_TAG_MTK=m
CONFIG_NET_DSA_TAG_KSZ=m
CONFIG_NET_DSA_TAG_OCELOT=m
CONFIG_NET_DSA_TAG_OCELOT_8021Q=m
CONFIG_NET_DSA_TAG_QCA=m
CONFIG_NET_DSA_TAG_RTL4_A=m
CONFIG_NET_DSA_TAG_RTL8_4=m
CONFIG_NET_DSA_TAG_RZN1_A5PSW=m
CONFIG_NET_DSA_TAG_LAN9303=m
CONFIG_NET_DSA_TAG_SJA1105=m
CONFIG_NET_DSA_TAG_TRAILER=m
CONFIG_NET_DSA_TAG_XRS700X=m
CONFIG_VLAN_8021Q=m
CONFIG_VLAN_8021Q_GVRP=y
CONFIG_VLAN_8021Q_MVRP=y
CONFIG_LLC=m
CONFIG_LLC2=m
CONFIG_ATALK=m
CONFIG_DEV_APPLETALK=m
# CONFIG_IPDDP is not set
CONFIG_X25=m
CONFIG_LAPB=m
CONFIG_PHONET=m
CONFIG_6LOWPAN=m
# CONFIG_6LOWPAN_DEBUGFS is not set
CONFIG_6LOWPAN_NHC=m
CONFIG_6LOWPAN_NHC_DEST=m
CONFIG_6LOWPAN_NHC_FRAGMENT=m
CONFIG_6LOWPAN_NHC_HOP=m
CONFIG_6LOWPAN_NHC_IPV6=m
CONFIG_6LOWPAN_NHC_MOBILITY=m
CONFIG_6LOWPAN_NHC_ROUTING=m
CONFIG_6LOWPAN_NHC_UDP=m
# CONFIG_6LOWPAN_GHC_EXT_HDR_HOP is not set
# CONFIG_6LOWPAN_GHC_UDP is not set
# CONFIG_6LOWPAN_GHC_ICMPV6 is not set
# CONFIG_6LOWPAN_GHC_EXT_HDR_DEST is not set
# CONFIG_6LOWPAN_GHC_EXT_HDR_FRAG is not set
# CONFIG_6LOWPAN_GHC_EXT_HDR_ROUTE is not set
CONFIG_IEEE802154=m
# CONFIG_IEEE802154_NL802154_EXPERIMENTAL is not set
CONFIG_IEEE802154_SOCKET=m
CONFIG_IEEE802154_6LOWPAN=m
CONFIG_MAC802154=m
CONFIG_NET_SCHED=y

#
# Queueing/Scheduling
#
CONFIG_NET_SCH_HTB=m
CONFIG_NET_SCH_HFSC=m
CONFIG_NET_SCH_PRIO=m
CONFIG_NET_SCH_MULTIQ=m
CONFIG_NET_SCH_RED=m
CONFIG_NET_SCH_SFB=m
CONFIG_NET_SCH_SFQ=m
CONFIG_NET_SCH_TEQL=m
CONFIG_NET_SCH_TBF=m
CONFIG_NET_SCH_CBS=m
CONFIG_NET_SCH_ETF=m
CONFIG_NET_SCH_MQPRIO_LIB=m
CONFIG_NET_SCH_TAPRIO=m
CONFIG_NET_SCH_GRED=m
CONFIG_NET_SCH_NETEM=m
CONFIG_NET_SCH_DRR=m
CONFIG_NET_SCH_MQPRIO=m
CONFIG_NET_SCH_SKBPRIO=m
CONFIG_NET_SCH_CHOKE=m
CONFIG_NET_SCH_QFQ=m
CONFIG_NET_SCH_CODEL=m
CONFIG_NET_SCH_FQ_CODEL=m
CONFIG_NET_SCH_CAKE=m
CONFIG_NET_SCH_FQ=m
CONFIG_NET_SCH_HHF=m
CONFIG_NET_SCH_PIE=m
CONFIG_NET_SCH_FQ_PIE=m
CONFIG_NET_SCH_INGRESS=m
CONFIG_NET_SCH_PLUG=m
CONFIG_NET_SCH_ETS=m
# CONFIG_NET_SCH_DEFAULT is not set

#
# Classification
#
CONFIG_NET_CLS=y
CONFIG_NET_CLS_BASIC=m
CONFIG_NET_CLS_ROUTE4=m
CONFIG_NET_CLS_FW=m
CONFIG_NET_CLS_U32=m
# CONFIG_CLS_U32_PERF is not set
CONFIG_CLS_U32_MARK=y
CONFIG_NET_CLS_FLOW=m
CONFIG_NET_CLS_CGROUP=m
CONFIG_NET_CLS_BPF=m
CONFIG_NET_CLS_FLOWER=m
CONFIG_NET_CLS_MATCHALL=m
CONFIG_NET_EMATCH=y
CONFIG_NET_EMATCH_STACK=32
CONFIG_NET_EMATCH_CMP=m
CONFIG_NET_EMATCH_NBYTE=m
CONFIG_NET_EMATCH_U32=m
CONFIG_NET_EMATCH_META=m
CONFIG_NET_EMATCH_TEXT=m
CONFIG_NET_EMATCH_CANID=m
CONFIG_NET_EMATCH_IPSET=m
CONFIG_NET_EMATCH_IPT=m
CONFIG_NET_CLS_ACT=y
CONFIG_NET_ACT_POLICE=m
CONFIG_NET_ACT_GACT=m
CONFIG_GACT_PROB=y
CONFIG_NET_ACT_MIRRED=m
CONFIG_NET_ACT_SAMPLE=m
CONFIG_NET_ACT_IPT=m
CONFIG_NET_ACT_NAT=m
CONFIG_NET_ACT_PEDIT=m
CONFIG_NET_ACT_SIMP=m
CONFIG_NET_ACT_SKBEDIT=m
CONFIG_NET_ACT_CSUM=m
CONFIG_NET_ACT_MPLS=m
CONFIG_NET_ACT_VLAN=m
CONFIG_NET_ACT_BPF=m
CONFIG_NET_ACT_CONNMARK=m
CONFIG_NET_ACT_CTINFO=m
CONFIG_NET_ACT_SKBMOD=m
# CONFIG_NET_ACT_IFE is not set
CONFIG_NET_ACT_TUNNEL_KEY=m
CONFIG_NET_ACT_CT=m
CONFIG_NET_ACT_GATE=m
CONFIG_NET_TC_SKB_EXT=y
CONFIG_NET_SCH_FIFO=y
CONFIG_DCB=y
CONFIG_DNS_RESOLVER=y
CONFIG_BATMAN_ADV=m
# CONFIG_BATMAN_ADV_BATMAN_V is not set
CONFIG_BATMAN_ADV_BLA=y
CONFIG_BATMAN_ADV_DAT=y
CONFIG_BATMAN_ADV_NC=y
CONFIG_BATMAN_ADV_MCAST=y
# CONFIG_BATMAN_ADV_DEBUG is not set
# CONFIG_BATMAN_ADV_TRACING is not set
CONFIG_OPENVSWITCH=m
CONFIG_OPENVSWITCH_GRE=m
CONFIG_OPENVSWITCH_VXLAN=m
CONFIG_OPENVSWITCH_GENEVE=m
CONFIG_VSOCKETS=m
CONFIG_VSOCKETS_DIAG=m
CONFIG_VSOCKETS_LOOPBACK=m
CONFIG_VMWARE_VMCI_VSOCKETS=m
CONFIG_VIRTIO_VSOCKETS=m
CONFIG_VIRTIO_VSOCKETS_COMMON=m
CONFIG_HYPERV_VSOCKETS=m
CONFIG_NETLINK_DIAG=m
CONFIG_MPLS=y
CONFIG_NET_MPLS_GSO=m
CONFIG_MPLS_ROUTING=m
CONFIG_MPLS_IPTUNNEL=m
CONFIG_NET_NSH=m
CONFIG_HSR=m
CONFIG_NET_SWITCHDEV=y
CONFIG_NET_L3_MASTER_DEV=y
CONFIG_QRTR=m
CONFIG_QRTR_SMD=m
CONFIG_QRTR_TUN=m
CONFIG_QRTR_MHI=m
CONFIG_NET_NCSI=y
CONFIG_NCSI_OEM_CMD_GET_MAC=y
# CONFIG_NCSI_OEM_CMD_KEEP_PHY is not set
CONFIG_PCPU_DEV_REFCNT=y
CONFIG_MAX_SKB_FRAGS=17
CONFIG_RPS=y
CONFIG_RFS_ACCEL=y
CONFIG_SOCK_RX_QUEUE_MAPPING=y
CONFIG_XPS=y
CONFIG_CGROUP_NET_PRIO=y
CONFIG_CGROUP_NET_CLASSID=y
CONFIG_NET_RX_BUSY_POLL=y
CONFIG_BQL=y
CONFIG_BPF_STREAM_PARSER=y
CONFIG_NET_FLOW_LIMIT=y

#
# Network testing
#
CONFIG_NET_PKTGEN=m
CONFIG_NET_DROP_MONITOR=y
# end of Network testing
# end of Networking options

CONFIG_HAMRADIO=y

#
# Packet Radio protocols
#
CONFIG_AX25=m
CONFIG_AX25_DAMA_SLAVE=y
CONFIG_NETROM=m
CONFIG_ROSE=m

#
# AX.25 network device drivers
#
CONFIG_MKISS=m
CONFIG_6PACK=m
CONFIG_BPQETHER=m
CONFIG_BAYCOM_SER_FDX=m
CONFIG_BAYCOM_SER_HDX=m
CONFIG_BAYCOM_PAR=m
CONFIG_YAM=m
# end of AX.25 network device drivers

CONFIG_CAN=m
CONFIG_CAN_RAW=m
CONFIG_CAN_BCM=m
CONFIG_CAN_GW=m
CONFIG_CAN_J1939=m
CONFIG_CAN_ISOTP=m
CONFIG_BT=m
CONFIG_BT_BREDR=y
CONFIG_BT_RFCOMM=m
CONFIG_BT_RFCOMM_TTY=y
CONFIG_BT_BNEP=m
CONFIG_BT_BNEP_MC_FILTER=y
CONFIG_BT_BNEP_PROTO_FILTER=y
CONFIG_BT_CMTP=m
CONFIG_BT_HIDP=m
CONFIG_BT_HS=y
CONFIG_BT_LE=y
CONFIG_BT_LE_L2CAP_ECRED=y
CONFIG_BT_6LOWPAN=m
CONFIG_BT_LEDS=y
CONFIG_BT_MSFTEXT=y
CONFIG_BT_AOSPEXT=y
CONFIG_BT_DEBUGFS=y
# CONFIG_BT_SELFTEST is not set

#
# Bluetooth device drivers
#
CONFIG_BT_INTEL=m
CONFIG_BT_BCM=m
CONFIG_BT_RTL=m
CONFIG_BT_QCA=m
CONFIG_BT_MTK=m
CONFIG_BT_HCIBTUSB=m
CONFIG_BT_HCIBTUSB_AUTOSUSPEND=y
CONFIG_BT_HCIBTUSB_POLL_SYNC=y
CONFIG_BT_HCIBTUSB_BCM=y
CONFIG_BT_HCIBTUSB_MTK=y
CONFIG_BT_HCIBTUSB_RTL=y
CONFIG_BT_HCIBTSDIO=m
CONFIG_BT_HCIUART=m
CONFIG_BT_HCIUART_SERDEV=y
CONFIG_BT_HCIUART_H4=y
CONFIG_BT_HCIUART_NOKIA=m
CONFIG_BT_HCIUART_BCSP=y
CONFIG_BT_HCIUART_ATH3K=y
CONFIG_BT_HCIUART_LL=y
CONFIG_BT_HCIUART_3WIRE=y
CONFIG_BT_HCIUART_INTEL=y
CONFIG_BT_HCIUART_BCM=y
CONFIG_BT_HCIUART_RTL=y
CONFIG_BT_HCIUART_QCA=y
CONFIG_BT_HCIUART_AG6XX=y
CONFIG_BT_HCIUART_MRVL=y
CONFIG_BT_HCIBCM203X=m
CONFIG_BT_HCIBCM4377=m
CONFIG_BT_HCIBPA10X=m
CONFIG_BT_HCIBFUSB=m
CONFIG_BT_HCIDTL1=m
CONFIG_BT_HCIBT3C=m
CONFIG_BT_HCIBLUECARD=m
CONFIG_BT_HCIVHCI=m
CONFIG_BT_MRVL=m
CONFIG_BT_MRVL_SDIO=m
CONFIG_BT_ATH3K=m
CONFIG_BT_MTKSDIO=m
CONFIG_BT_MTKUART=m
CONFIG_BT_HCIRSI=m
CONFIG_BT_VIRTIO=m
# CONFIG_BT_NXPUART is not set
# end of Bluetooth device drivers

CONFIG_AF_RXRPC=m
CONFIG_AF_RXRPC_IPV6=y
# CONFIG_AF_RXRPC_INJECT_LOSS is not set
# CONFIG_AF_RXRPC_INJECT_RX_DELAY is not set
# CONFIG_AF_RXRPC_DEBUG is not set
CONFIG_RXKAD=y
CONFIG_RXPERF=m
CONFIG_AF_KCM=m
CONFIG_STREAM_PARSER=y
CONFIG_MCTP=y
CONFIG_FIB_RULES=y
CONFIG_WIRELESS=y
CONFIG_WIRELESS_EXT=y
CONFIG_WEXT_CORE=y
CONFIG_WEXT_PROC=y
CONFIG_WEXT_SPY=y
CONFIG_WEXT_PRIV=y
CONFIG_CFG80211=m
# CONFIG_NL80211_TESTMODE is not set
# CONFIG_CFG80211_DEVELOPER_WARNINGS is not set
# CONFIG_CFG80211_CERTIFICATION_ONUS is not set
CONFIG_CFG80211_REQUIRE_SIGNED_REGDB=y
CONFIG_CFG80211_USE_KERNEL_REGDB_KEYS=y
CONFIG_CFG80211_DEFAULT_PS=y
CONFIG_CFG80211_DEBUGFS=y
CONFIG_CFG80211_CRDA_SUPPORT=y
CONFIG_CFG80211_WEXT=y
CONFIG_CFG80211_WEXT_EXPORT=y
CONFIG_LIB80211=m
CONFIG_LIB80211_CRYPT_WEP=m
CONFIG_LIB80211_CRYPT_CCMP=m
CONFIG_LIB80211_CRYPT_TKIP=m
# CONFIG_LIB80211_DEBUG is not set
CONFIG_MAC80211=m
CONFIG_MAC80211_HAS_RC=y
CONFIG_MAC80211_RC_MINSTREL=y
CONFIG_MAC80211_RC_DEFAULT_MINSTREL=y
CONFIG_MAC80211_RC_DEFAULT="minstrel_ht"
CONFIG_MAC80211_MESH=y
CONFIG_MAC80211_LEDS=y
CONFIG_MAC80211_DEBUGFS=y
CONFIG_MAC80211_MESSAGE_TRACING=y
# CONFIG_MAC80211_DEBUG_MENU is not set
CONFIG_MAC80211_STA_HASH_MAX_SIZE=0
CONFIG_RFKILL=y
CONFIG_RFKILL_LEDS=y
CONFIG_RFKILL_INPUT=y
CONFIG_RFKILL_GPIO=m
CONFIG_NET_9P=m
CONFIG_NET_9P_FD=m
CONFIG_NET_9P_VIRTIO=m
CONFIG_NET_9P_XEN=m
CONFIG_NET_9P_RDMA=m
# CONFIG_NET_9P_DEBUG is not set
CONFIG_CAIF=m
# CONFIG_CAIF_DEBUG is not set
CONFIG_CAIF_NETDEV=m
CONFIG_CAIF_USB=m
CONFIG_CEPH_LIB=m
# CONFIG_CEPH_LIB_PRETTYDEBUG is not set
CONFIG_CEPH_LIB_USE_DNS_RESOLVER=y
CONFIG_NFC=m
CONFIG_NFC_DIGITAL=m
CONFIG_NFC_NCI=m
CONFIG_NFC_NCI_SPI=m
CONFIG_NFC_NCI_UART=m
CONFIG_NFC_HCI=m
CONFIG_NFC_SHDLC=y

#
# Near Field Communication (NFC) devices
#
CONFIG_NFC_TRF7970A=m
CONFIG_NFC_MEI_PHY=m
CONFIG_NFC_SIM=m
CONFIG_NFC_PORT100=m
CONFIG_NFC_VIRTUAL_NCI=m
CONFIG_NFC_FDP=m
CONFIG_NFC_FDP_I2C=m
CONFIG_NFC_PN544=m
CONFIG_NFC_PN544_I2C=m
CONFIG_NFC_PN544_MEI=m
CONFIG_NFC_PN533=m
CONFIG_NFC_PN533_USB=m
CONFIG_NFC_PN533_I2C=m
CONFIG_NFC_PN532_UART=m
CONFIG_NFC_MICROREAD=m
CONFIG_NFC_MICROREAD_I2C=m
CONFIG_NFC_MICROREAD_MEI=m
CONFIG_NFC_MRVL=m
CONFIG_NFC_MRVL_USB=m
CONFIG_NFC_MRVL_UART=m
CONFIG_NFC_MRVL_I2C=m
CONFIG_NFC_MRVL_SPI=m
CONFIG_NFC_ST21NFCA=m
CONFIG_NFC_ST21NFCA_I2C=m
CONFIG_NFC_ST_NCI=m
CONFIG_NFC_ST_NCI_I2C=m
CONFIG_NFC_ST_NCI_SPI=m
CONFIG_NFC_NXP_NCI=m
CONFIG_NFC_NXP_NCI_I2C=m
CONFIG_NFC_S3FWRN5=m
CONFIG_NFC_S3FWRN5_I2C=m
CONFIG_NFC_S3FWRN82_UART=m
CONFIG_NFC_ST95HF=m
# end of Near Field Communication (NFC) devices

CONFIG_PSAMPLE=m
CONFIG_NET_IFE=m
CONFIG_LWTUNNEL=y
CONFIG_LWTUNNEL_BPF=y
CONFIG_DST_CACHE=y
CONFIG_GRO_CELLS=y
CONFIG_SOCK_VALIDATE_XMIT=y
CONFIG_NET_SELFTESTS=y
CONFIG_NET_SOCK_MSG=y
CONFIG_NET_DEVLINK=y
CONFIG_PAGE_POOL=y
# CONFIG_PAGE_POOL_STATS is not set
CONFIG_FAILOVER=m
CONFIG_ETHTOOL_NETLINK=y

#
# Device Drivers
#
CONFIG_HAVE_EISA=y
CONFIG_EISA=y
CONFIG_EISA_VLB_PRIMING=y
CONFIG_EISA_PCI_EISA=y
CONFIG_EISA_VIRTUAL_ROOT=y
CONFIG_EISA_NAMES=y
CONFIG_HAVE_PCI=y
CONFIG_PCI=y
CONFIG_PCI_DOMAINS=y
CONFIG_PCIEPORTBUS=y
CONFIG_HOTPLUG_PCI_PCIE=y
CONFIG_PCIEAER=y
# CONFIG_PCIEAER_INJECT is not set
# CONFIG_PCIE_ECRC is not set
CONFIG_PCIEASPM=y
CONFIG_PCIEASPM_DEFAULT=y
# CONFIG_PCIEASPM_POWERSAVE is not set
# CONFIG_PCIEASPM_POWER_SUPERSAVE is not set
# CONFIG_PCIEASPM_PERFORMANCE is not set
CONFIG_PCIE_PME=y
CONFIG_PCIE_DPC=y
CONFIG_PCIE_PTM=y
CONFIG_PCIE_EDR=y
CONFIG_PCI_MSI=y
CONFIG_PCI_QUIRKS=y
# CONFIG_PCI_DEBUG is not set
CONFIG_PCI_REALLOC_ENABLE_AUTO=y
CONFIG_PCI_STUB=m
CONFIG_PCI_PF_STUB=m
CONFIG_XEN_PCIDEV_FRONTEND=m
CONFIG_PCI_ATS=y
CONFIG_PCI_DOE=y
CONFIG_PCI_LOCKLESS_CONFIG=y
CONFIG_PCI_IOV=y
CONFIG_PCI_PRI=y
CONFIG_PCI_PASID=y
CONFIG_PCI_P2PDMA=y
CONFIG_PCI_LABEL=y
CONFIG_PCI_HYPERV=m
# CONFIG_PCIE_BUS_TUNE_OFF is not set
CONFIG_PCIE_BUS_DEFAULT=y
# CONFIG_PCIE_BUS_SAFE is not set
# CONFIG_PCIE_BUS_PERFORMANCE is not set
# CONFIG_PCIE_BUS_PEER2PEER is not set
CONFIG_VGA_ARB=y
CONFIG_VGA_ARB_MAX_GPUS=16
CONFIG_HOTPLUG_PCI=y
CONFIG_HOTPLUG_PCI_ACPI=y
CONFIG_HOTPLUG_PCI_ACPI_IBM=m
CONFIG_HOTPLUG_PCI_CPCI=y
CONFIG_HOTPLUG_PCI_CPCI_ZT5550=m
CONFIG_HOTPLUG_PCI_CPCI_GENERIC=m
CONFIG_HOTPLUG_PCI_SHPC=y

#
# PCI controller drivers
#
CONFIG_VMD=m
CONFIG_PCI_HYPERV_INTERFACE=m

#
# Cadence-based PCIe controllers
#
# end of Cadence-based PCIe controllers

#
# DesignWare-based PCIe controllers
#
CONFIG_PCIE_DW=y
CONFIG_PCIE_DW_HOST=y
CONFIG_PCIE_DW_EP=y
# CONFIG_PCI_MESON is not set
CONFIG_PCIE_DW_PLAT=y
CONFIG_PCIE_DW_PLAT_HOST=y
CONFIG_PCIE_DW_PLAT_EP=y
# end of DesignWare-based PCIe controllers

#
# Mobiveil-based PCIe controllers
#
# end of Mobiveil-based PCIe controllers
# end of PCI controller drivers

#
# PCI Endpoint
#
CONFIG_PCI_ENDPOINT=y
CONFIG_PCI_ENDPOINT_CONFIGFS=y
# CONFIG_PCI_EPF_TEST is not set
CONFIG_PCI_EPF_NTB=m
CONFIG_PCI_EPF_VNTB=m
# CONFIG_PCI_EPF_MHI is not set
# end of PCI Endpoint

#
# PCI switch controller drivers
#
CONFIG_PCI_SW_SWITCHTEC=m
# end of PCI switch controller drivers

CONFIG_CXL_BUS=m
CONFIG_CXL_PCI=m
# CONFIG_CXL_MEM_RAW_COMMANDS is not set
CONFIG_CXL_ACPI=m
CONFIG_CXL_PMEM=m
CONFIG_CXL_MEM=m
CONFIG_CXL_PORT=m
CONFIG_CXL_SUSPEND=y
CONFIG_CXL_REGION=y
# CONFIG_CXL_REGION_INVALIDATION_TEST is not set
CONFIG_CXL_PMU=m
CONFIG_PCCARD=m
CONFIG_PCMCIA=m
CONFIG_PCMCIA_LOAD_CIS=y
CONFIG_CARDBUS=y

#
# PC-card bridges
#
CONFIG_YENTA=m
CONFIG_YENTA_O2=y
CONFIG_YENTA_RICOH=y
CONFIG_YENTA_TI=y
CONFIG_YENTA_ENE_TUNE=y
CONFIG_YENTA_TOSHIBA=y
CONFIG_PD6729=m
CONFIG_I82092=m
CONFIG_PCCARD_NONSTATIC=y
CONFIG_RAPIDIO=y
CONFIG_RAPIDIO_TSI721=m
CONFIG_RAPIDIO_DISC_TIMEOUT=30
# CONFIG_RAPIDIO_ENABLE_RX_TX_PORTS is not set
CONFIG_RAPIDIO_DMA_ENGINE=y
# CONFIG_RAPIDIO_DEBUG is not set
CONFIG_RAPIDIO_ENUM_BASIC=m
CONFIG_RAPIDIO_CHMAN=m
CONFIG_RAPIDIO_MPORT_CDEV=m

#
# RapidIO Switch drivers
#
CONFIG_RAPIDIO_CPS_XX=m
CONFIG_RAPIDIO_CPS_GEN2=m
CONFIG_RAPIDIO_RXS_GEN3=m
# end of RapidIO Switch drivers

#
# Generic Driver Options
#
CONFIG_AUXILIARY_BUS=y
CONFIG_UEVENT_HELPER=y
CONFIG_UEVENT_HELPER_PATH=""
CONFIG_DEVTMPFS=y
CONFIG_DEVTMPFS_MOUNT=y
CONFIG_DEVTMPFS_SAFE=y
# CONFIG_STANDALONE is not set
CONFIG_PREVENT_FIRMWARE_BUILD=y

#
# Firmware loader
#
CONFIG_FW_LOADER=y
CONFIG_FW_LOADER_DEBUG=y
CONFIG_FW_LOADER_PAGED_BUF=y
CONFIG_FW_LOADER_SYSFS=y
CONFIG_EXTRA_FIRMWARE=""
CONFIG_FW_LOADER_USER_HELPER=y
# CONFIG_FW_LOADER_USER_HELPER_FALLBACK is not set
CONFIG_FW_LOADER_COMPRESS=y
CONFIG_FW_LOADER_COMPRESS_XZ=y
CONFIG_FW_LOADER_COMPRESS_ZSTD=y
CONFIG_FW_CACHE=y
CONFIG_FW_UPLOAD=y
# end of Firmware loader

CONFIG_WANT_DEV_COREDUMP=y
CONFIG_ALLOW_DEV_COREDUMP=y
CONFIG_DEV_COREDUMP=y
# CONFIG_DEBUG_DRIVER is not set
# CONFIG_DEBUG_DEVRES is not set
# CONFIG_DEBUG_TEST_DRIVER_REMOVE is not set
CONFIG_HMEM_REPORTING=y
# CONFIG_TEST_ASYNC_DRIVER_PROBE is not set
CONFIG_SYS_HYPERVISOR=y
CONFIG_GENERIC_CPU_AUTOPROBE=y
CONFIG_GENERIC_CPU_VULNERABILITIES=y
CONFIG_REGMAP=y
CONFIG_REGMAP_I2C=y
CONFIG_REGMAP_SLIMBUS=m
CONFIG_REGMAP_SPI=y
CONFIG_REGMAP_SPMI=m
CONFIG_REGMAP_W1=m
CONFIG_REGMAP_MMIO=y
CONFIG_REGMAP_IRQ=y
CONFIG_REGMAP_SOUNDWIRE=m
CONFIG_REGMAP_SOUNDWIRE_MBQ=m
CONFIG_REGMAP_SCCB=m
CONFIG_REGMAP_I3C=m
CONFIG_DMA_SHARED_BUFFER=y
# CONFIG_DMA_FENCE_TRACE is not set
# CONFIG_FW_DEVLINK_SYNC_STATE_TIMEOUT is not set
# end of Generic Driver Options

#
# Bus devices
#
CONFIG_MHI_BUS=m
# CONFIG_MHI_BUS_DEBUG is not set
CONFIG_MHI_BUS_PCI_GENERIC=m
CONFIG_MHI_BUS_EP=m
# end of Bus devices

#
# Cache Drivers
#
# end of Cache Drivers

CONFIG_CONNECTOR=y
CONFIG_PROC_EVENTS=y

#
# Firmware Drivers
#

#
# ARM System Control and Management Interface Protocol
#
# end of ARM System Control and Management Interface Protocol

CONFIG_EDD=y
CONFIG_EDD_OFF=y
CONFIG_FIRMWARE_MEMMAP=y
CONFIG_DMIID=y
CONFIG_DMI_SYSFS=m
CONFIG_DMI_SCAN_MACHINE_NON_EFI_FALLBACK=y
CONFIG_ISCSI_IBFT_FIND=y
CONFIG_ISCSI_IBFT=m
CONFIG_FW_CFG_SYSFS=m
# CONFIG_FW_CFG_SYSFS_CMDLINE is not set
CONFIG_SYSFB=y
# CONFIG_SYSFB_SIMPLEFB is not set
CONFIG_FW_CS_DSP=m
# CONFIG_GOOGLE_FIRMWARE is not set

#
# EFI (Extensible Firmware Interface) Support
#
CONFIG_EFI_ESRT=y
CONFIG_EFI_VARS_PSTORE=m
# CONFIG_EFI_VARS_PSTORE_DEFAULT_DISABLE is not set
CONFIG_EFI_SOFT_RESERVE=y
CONFIG_EFI_DXE_MEM_ATTRIBUTES=y
CONFIG_EFI_RUNTIME_WRAPPERS=y
CONFIG_EFI_BOOTLOADER_CONTROL=m
CONFIG_EFI_CAPSULE_LOADER=m
CONFIG_EFI_TEST=m
CONFIG_EFI_DEV_PATH_PARSER=y
CONFIG_APPLE_PROPERTIES=y
CONFIG_RESET_ATTACK_MITIGATION=y
CONFIG_EFI_RCI2_TABLE=y
# CONFIG_EFI_DISABLE_PCI_DMA is not set
CONFIG_EFI_EARLYCON=y
CONFIG_EFI_CUSTOM_SSDT_OVERLAYS=y
# CONFIG_EFI_DISABLE_RUNTIME is not set
CONFIG_EFI_COCO_SECRET=y
CONFIG_UNACCEPTED_MEMORY=y
CONFIG_EFI_EMBEDDED_FIRMWARE=y
# end of EFI (Extensible Firmware Interface) Support

CONFIG_UEFI_CPER=y
CONFIG_UEFI_CPER_X86=y

#
# Tegra firmware driver
#
# end of Tegra firmware driver
# end of Firmware Drivers

CONFIG_GNSS=m
CONFIG_GNSS_SERIAL=m
CONFIG_GNSS_MTK_SERIAL=m
CONFIG_GNSS_SIRF_SERIAL=m
CONFIG_GNSS_UBX_SERIAL=m
CONFIG_GNSS_USB=m
CONFIG_MTD=m
# CONFIG_MTD_TESTS is not set

#
# Partition parsers
#
CONFIG_MTD_AR7_PARTS=m
CONFIG_MTD_CMDLINE_PARTS=m
CONFIG_MTD_REDBOOT_PARTS=m
CONFIG_MTD_REDBOOT_DIRECTORY_BLOCK=-1
# CONFIG_MTD_REDBOOT_PARTS_UNALLOCATED is not set
# CONFIG_MTD_REDBOOT_PARTS_READONLY is not set
# end of Partition parsers

#
# User Modules And Translation Layers
#
CONFIG_MTD_BLKDEVS=m
CONFIG_MTD_BLOCK=m
CONFIG_MTD_BLOCK_RO=m

#
# Note that in some cases UBI block is preferred. See MTD_UBI_BLOCK.
#
CONFIG_FTL=m
CONFIG_NFTL=m
CONFIG_NFTL_RW=y
CONFIG_INFTL=m
CONFIG_RFD_FTL=m
CONFIG_SSFDC=m
CONFIG_SM_FTL=m
CONFIG_MTD_OOPS=m
CONFIG_MTD_PSTORE=m
CONFIG_MTD_SWAP=m
# CONFIG_MTD_PARTITIONED_MASTER is not set

#
# RAM/ROM/Flash chip drivers
#
CONFIG_MTD_CFI=m
CONFIG_MTD_JEDECPROBE=m
CONFIG_MTD_GEN_PROBE=m
# CONFIG_MTD_CFI_ADV_OPTIONS is not set
CONFIG_MTD_MAP_BANK_WIDTH_1=y
CONFIG_MTD_MAP_BANK_WIDTH_2=y
CONFIG_MTD_MAP_BANK_WIDTH_4=y
CONFIG_MTD_CFI_I1=y
CONFIG_MTD_CFI_I2=y
CONFIG_MTD_CFI_INTELEXT=m
CONFIG_MTD_CFI_AMDSTD=m
CONFIG_MTD_CFI_STAA=m
CONFIG_MTD_CFI_UTIL=m
CONFIG_MTD_RAM=m
CONFIG_MTD_ROM=m
CONFIG_MTD_ABSENT=m
# end of RAM/ROM/Flash chip drivers

#
# Mapping drivers for chip access
#
CONFIG_MTD_COMPLEX_MAPPINGS=y
CONFIG_MTD_PHYSMAP=m
# CONFIG_MTD_PHYSMAP_COMPAT is not set
CONFIG_MTD_PHYSMAP_GPIO_ADDR=y
CONFIG_MTD_SBC_GXX=m
CONFIG_MTD_AMD76XROM=m
CONFIG_MTD_ICHXROM=m
CONFIG_MTD_ESB2ROM=m
CONFIG_MTD_CK804XROM=m
CONFIG_MTD_SCB2_FLASH=m
CONFIG_MTD_NETtel=m
CONFIG_MTD_L440GX=m
CONFIG_MTD_PCI=m
CONFIG_MTD_PCMCIA=m
# CONFIG_MTD_PCMCIA_ANONYMOUS is not set
CONFIG_MTD_INTEL_VR_NOR=m
CONFIG_MTD_PLATRAM=m
# end of Mapping drivers for chip access

#
# Self-contained MTD device drivers
#
CONFIG_MTD_PMC551=m
# CONFIG_MTD_PMC551_BUGFIX is not set
# CONFIG_MTD_PMC551_DEBUG is not set
CONFIG_MTD_DATAFLASH=m
# CONFIG_MTD_DATAFLASH_WRITE_VERIFY is not set
CONFIG_MTD_DATAFLASH_OTP=y
CONFIG_MTD_MCHP23K256=m
CONFIG_MTD_MCHP48L640=m
CONFIG_MTD_SST25L=m
CONFIG_MTD_SLRAM=m
CONFIG_MTD_PHRAM=m
CONFIG_MTD_MTDRAM=m
CONFIG_MTDRAM_TOTAL_SIZE=4096
CONFIG_MTDRAM_ERASE_SIZE=128
CONFIG_MTD_BLOCK2MTD=m

#
# Disk-On-Chip Device Drivers
#
# CONFIG_MTD_DOCG3 is not set
# end of Self-contained MTD device drivers

#
# NAND
#
CONFIG_MTD_NAND_CORE=m
CONFIG_MTD_ONENAND=m
CONFIG_MTD_ONENAND_VERIFY_WRITE=y
CONFIG_MTD_ONENAND_GENERIC=m
# CONFIG_MTD_ONENAND_OTP is not set
CONFIG_MTD_ONENAND_2X_PROGRAM=y
CONFIG_MTD_RAW_NAND=m

#
# Raw/parallel NAND flash controllers
#
CONFIG_MTD_NAND_DENALI=m
CONFIG_MTD_NAND_DENALI_PCI=m
CONFIG_MTD_NAND_CAFE=m
CONFIG_MTD_NAND_MXIC=m
CONFIG_MTD_NAND_GPIO=m
CONFIG_MTD_NAND_PLATFORM=m
CONFIG_MTD_NAND_ARASAN=m

#
# Misc
#
CONFIG_MTD_SM_COMMON=m
CONFIG_MTD_NAND_NANDSIM=m
CONFIG_MTD_NAND_RICOH=m
CONFIG_MTD_NAND_DISKONCHIP=m
# CONFIG_MTD_NAND_DISKONCHIP_PROBE_ADVANCED is not set
CONFIG_MTD_NAND_DISKONCHIP_PROBE_ADDRESS=0
# CONFIG_MTD_NAND_DISKONCHIP_BBTWRITE is not set
CONFIG_MTD_SPI_NAND=m

#
# ECC engine support
#
CONFIG_MTD_NAND_ECC=y
CONFIG_MTD_NAND_ECC_SW_HAMMING=y
# CONFIG_MTD_NAND_ECC_SW_HAMMING_SMC is not set
CONFIG_MTD_NAND_ECC_SW_BCH=y
CONFIG_MTD_NAND_ECC_MXIC=y
# end of ECC engine support
# end of NAND

#
# LPDDR & LPDDR2 PCM memory drivers
#
CONFIG_MTD_LPDDR=m
CONFIG_MTD_QINFO_PROBE=m
# end of LPDDR & LPDDR2 PCM memory drivers

CONFIG_MTD_SPI_NOR=m
CONFIG_MTD_SPI_NOR_USE_4K_SECTORS=y
# CONFIG_MTD_SPI_NOR_SWP_DISABLE is not set
CONFIG_MTD_SPI_NOR_SWP_DISABLE_ON_VOLATILE=y
# CONFIG_MTD_SPI_NOR_SWP_KEEP is not set
CONFIG_MTD_UBI=m
CONFIG_MTD_UBI_WL_THRESHOLD=4096
CONFIG_MTD_UBI_BEB_LIMIT=20
CONFIG_MTD_UBI_FASTMAP=y
CONFIG_MTD_UBI_GLUEBI=m
CONFIG_MTD_UBI_BLOCK=y
CONFIG_MTD_HYPERBUS=m
# CONFIG_OF is not set
CONFIG_ARCH_MIGHT_HAVE_PC_PARPORT=y
CONFIG_PARPORT=m
CONFIG_PARPORT_PC=m
CONFIG_PARPORT_SERIAL=m
CONFIG_PARPORT_PC_FIFO=y
# CONFIG_PARPORT_PC_SUPERIO is not set
CONFIG_PARPORT_PC_PCMCIA=m
CONFIG_PARPORT_1284=y
CONFIG_PARPORT_NOT_PC=y
CONFIG_PNP=y
# CONFIG_PNP_DEBUG_MESSAGES is not set

#
# Protocols
#
CONFIG_PNPACPI=y
CONFIG_BLK_DEV=y
CONFIG_BLK_DEV_NULL_BLK=m
CONFIG_BLK_DEV_FD=m
# CONFIG_BLK_DEV_FD_RAWCMD is not set
CONFIG_CDROM=y
CONFIG_BLK_DEV_PCIESSD_MTIP32XX=m
CONFIG_ZRAM=m
CONFIG_ZRAM_DEF_COMP_LZORLE=y
# CONFIG_ZRAM_DEF_COMP_ZSTD is not set
# CONFIG_ZRAM_DEF_COMP_LZ4 is not set
# CONFIG_ZRAM_DEF_COMP_LZO is not set
# CONFIG_ZRAM_DEF_COMP_LZ4HC is not set
# CONFIG_ZRAM_DEF_COMP_842 is not set
CONFIG_ZRAM_DEF_COMP="lzo-rle"
CONFIG_ZRAM_WRITEBACK=y
CONFIG_ZRAM_MEMORY_TRACKING=y
# CONFIG_ZRAM_MULTI_COMP is not set
CONFIG_BLK_DEV_LOOP=y
CONFIG_BLK_DEV_LOOP_MIN_COUNT=8
CONFIG_BLK_DEV_DRBD=m
# CONFIG_DRBD_FAULT_INJECTION is not set
CONFIG_BLK_DEV_NBD=m
CONFIG_BLK_DEV_RAM=m
CONFIG_BLK_DEV_RAM_COUNT=16
CONFIG_BLK_DEV_RAM_SIZE=65536
# CONFIG_CDROM_PKTCDVD is not set
CONFIG_ATA_OVER_ETH=m
CONFIG_XEN_BLKDEV_FRONTEND=y
CONFIG_XEN_BLKDEV_BACKEND=m
CONFIG_VIRTIO_BLK=m
CONFIG_BLK_DEV_RBD=m
# CONFIG_BLK_DEV_UBLK is not set
CONFIG_BLK_DEV_RNBD=y
CONFIG_BLK_DEV_RNBD_CLIENT=m
CONFIG_BLK_DEV_RNBD_SERVER=m

#
# NVME Support
#
CONFIG_NVME_COMMON=m
CONFIG_NVME_CORE=m
CONFIG_BLK_DEV_NVME=m
CONFIG_NVME_MULTIPATH=y
# CONFIG_NVME_VERBOSE_ERRORS is not set
CONFIG_NVME_HWMON=y
CONFIG_NVME_FABRICS=m
CONFIG_NVME_RDMA=m
CONFIG_NVME_FC=m
CONFIG_NVME_TCP=m
CONFIG_NVME_AUTH=y
CONFIG_NVME_TARGET=m
CONFIG_NVME_TARGET_PASSTHRU=y
CONFIG_NVME_TARGET_LOOP=m
CONFIG_NVME_TARGET_RDMA=m
CONFIG_NVME_TARGET_FC=m
# CONFIG_NVME_TARGET_FCLOOP is not set
CONFIG_NVME_TARGET_TCP=m
CONFIG_NVME_TARGET_AUTH=y
# end of NVME Support

#
# Misc devices
#
CONFIG_SENSORS_LIS3LV02D=m
CONFIG_AD525X_DPOT=m
CONFIG_AD525X_DPOT_I2C=m
CONFIG_AD525X_DPOT_SPI=m
CONFIG_DUMMY_IRQ=m
CONFIG_IBM_ASM=m
CONFIG_PHANTOM=m
CONFIG_TIFM_CORE=m
CONFIG_TIFM_7XX1=m
CONFIG_ICS932S401=m
CONFIG_ENCLOSURE_SERVICES=m
CONFIG_SGI_XP=m
CONFIG_SMPRO_ERRMON=m
CONFIG_SMPRO_MISC=m
CONFIG_HP_ILO=m
CONFIG_SGI_GRU=m
# CONFIG_SGI_GRU_DEBUG is not set
CONFIG_APDS9802ALS=m
CONFIG_ISL29003=m
CONFIG_ISL29020=m
CONFIG_SENSORS_TSL2550=m
CONFIG_SENSORS_BH1770=m
CONFIG_SENSORS_APDS990X=m
CONFIG_HMC6352=m
CONFIG_DS1682=m
CONFIG_VMWARE_BALLOON=m
CONFIG_LATTICE_ECP3_CONFIG=m
CONFIG_SRAM=y
CONFIG_DW_XDATA_PCIE=m
# CONFIG_PCI_ENDPOINT_TEST is not set
CONFIG_XILINX_SDFEC=m
CONFIG_MISC_RTSX=m
CONFIG_C2PORT=m
CONFIG_C2PORT_DURAMAR_2150=m

#
# EEPROM support
#
CONFIG_EEPROM_AT24=m
CONFIG_EEPROM_AT25=m
CONFIG_EEPROM_LEGACY=m
CONFIG_EEPROM_MAX6875=m
CONFIG_EEPROM_93CX6=m
CONFIG_EEPROM_93XX46=m
CONFIG_EEPROM_IDT_89HPESX=m
CONFIG_EEPROM_EE1004=m
# end of EEPROM support

CONFIG_CB710_CORE=m
# CONFIG_CB710_DEBUG is not set
CONFIG_CB710_DEBUG_ASSUMPTIONS=y

#
# Texas Instruments shared transport line discipline
#
CONFIG_TI_ST=m
# end of Texas Instruments shared transport line discipline

CONFIG_SENSORS_LIS3_I2C=m
CONFIG_ALTERA_STAPL=m
CONFIG_INTEL_MEI=m
CONFIG_INTEL_MEI_ME=m
CONFIG_INTEL_MEI_TXE=m
CONFIG_INTEL_MEI_GSC=m
CONFIG_INTEL_MEI_HDCP=m
CONFIG_INTEL_MEI_PXP=m
# CONFIG_INTEL_MEI_GSC_PROXY is not set
CONFIG_VMWARE_VMCI=m
CONFIG_GENWQE=m
CONFIG_GENWQE_PLATFORM_ERROR_RECOVERY=0
CONFIG_ECHO=m
CONFIG_BCM_VK=m
CONFIG_BCM_VK_TTY=y
CONFIG_MISC_ALCOR_PCI=m
CONFIG_MISC_RTSX_PCI=m
CONFIG_MISC_RTSX_USB=m
CONFIG_UACCE=m
CONFIG_PVPANIC=y
CONFIG_PVPANIC_MMIO=m
CONFIG_PVPANIC_PCI=m
CONFIG_GP_PCI1XXXX=m
# end of Misc devices

#
# SCSI device support
#
CONFIG_SCSI_MOD=y
CONFIG_RAID_ATTRS=m
CONFIG_SCSI_COMMON=y
CONFIG_SCSI=y
CONFIG_SCSI_DMA=y
CONFIG_SCSI_NETLINK=y
CONFIG_SCSI_PROC_FS=y

#
# SCSI support type (disk, tape, CD-ROM)
#
CONFIG_BLK_DEV_SD=y
CONFIG_CHR_DEV_ST=m
CONFIG_BLK_DEV_SR=y
CONFIG_CHR_DEV_SG=y
CONFIG_BLK_DEV_BSG=y
CONFIG_CHR_DEV_SCH=m
CONFIG_SCSI_ENCLOSURE=m
CONFIG_SCSI_CONSTANTS=y
CONFIG_SCSI_LOGGING=y
CONFIG_SCSI_SCAN_ASYNC=y

#
# SCSI Transports
#
CONFIG_SCSI_SPI_ATTRS=m
CONFIG_SCSI_FC_ATTRS=m
CONFIG_SCSI_ISCSI_ATTRS=m
CONFIG_SCSI_SAS_ATTRS=m
CONFIG_SCSI_SAS_LIBSAS=m
CONFIG_SCSI_SAS_ATA=y
CONFIG_SCSI_SAS_HOST_SMP=y
CONFIG_SCSI_SRP_ATTRS=m
# end of SCSI Transports

CONFIG_SCSI_LOWLEVEL=y
CONFIG_ISCSI_TCP=m
CONFIG_ISCSI_BOOT_SYSFS=m
CONFIG_SCSI_CXGB3_ISCSI=m
CONFIG_SCSI_CXGB4_ISCSI=m
CONFIG_SCSI_BNX2_ISCSI=m
CONFIG_SCSI_BNX2X_FCOE=m
CONFIG_BE2ISCSI=m
CONFIG_BLK_DEV_3W_XXXX_RAID=m
CONFIG_SCSI_HPSA=m
CONFIG_SCSI_3W_9XXX=m
CONFIG_SCSI_3W_SAS=m
CONFIG_SCSI_ACARD=m
CONFIG_SCSI_AHA1740=m
CONFIG_SCSI_AACRAID=m
CONFIG_SCSI_AIC7XXX=m
CONFIG_AIC7XXX_CMDS_PER_DEVICE=8
CONFIG_AIC7XXX_RESET_DELAY_MS=5000
# CONFIG_AIC7XXX_DEBUG_ENABLE is not set
CONFIG_AIC7XXX_DEBUG_MASK=0
CONFIG_AIC7XXX_REG_PRETTY_PRINT=y
CONFIG_SCSI_AIC79XX=m
CONFIG_AIC79XX_CMDS_PER_DEVICE=32
CONFIG_AIC79XX_RESET_DELAY_MS=5000
# CONFIG_AIC79XX_DEBUG_ENABLE is not set
CONFIG_AIC79XX_DEBUG_MASK=0
CONFIG_AIC79XX_REG_PRETTY_PRINT=y
CONFIG_SCSI_AIC94XX=m
# CONFIG_AIC94XX_DEBUG is not set
CONFIG_SCSI_MVSAS=m
# CONFIG_SCSI_MVSAS_DEBUG is not set
# CONFIG_SCSI_MVSAS_TASKLET is not set
CONFIG_SCSI_MVUMI=m
CONFIG_SCSI_ADVANSYS=m
CONFIG_SCSI_ARCMSR=m
CONFIG_SCSI_ESAS2R=m
CONFIG_MEGARAID_NEWGEN=y
CONFIG_MEGARAID_MM=m
CONFIG_MEGARAID_MAILBOX=m
CONFIG_MEGARAID_LEGACY=m
CONFIG_MEGARAID_SAS=m
CONFIG_SCSI_MPT3SAS=m
CONFIG_SCSI_MPT2SAS_MAX_SGE=128
CONFIG_SCSI_MPT3SAS_MAX_SGE=128
CONFIG_SCSI_MPT2SAS=m
CONFIG_SCSI_MPI3MR=m
CONFIG_SCSI_SMARTPQI=m
CONFIG_SCSI_HPTIOP=m
CONFIG_SCSI_BUSLOGIC=m
CONFIG_SCSI_FLASHPOINT=y
CONFIG_SCSI_MYRB=m
CONFIG_SCSI_MYRS=m
CONFIG_VMWARE_PVSCSI=m
CONFIG_XEN_SCSI_FRONTEND=m
CONFIG_HYPERV_STORAGE=m
CONFIG_LIBFC=m
CONFIG_LIBFCOE=m
CONFIG_FCOE=m
CONFIG_FCOE_FNIC=m
CONFIG_SCSI_SNIC=m
# CONFIG_SCSI_SNIC_DEBUG_FS is not set
CONFIG_SCSI_DMX3191D=m
CONFIG_SCSI_FDOMAIN=m
CONFIG_SCSI_FDOMAIN_PCI=m
CONFIG_SCSI_ISCI=m
CONFIG_SCSI_IPS=m
CONFIG_SCSI_INITIO=m
CONFIG_SCSI_INIA100=m
CONFIG_SCSI_PPA=m
CONFIG_SCSI_IMM=m
# CONFIG_SCSI_IZIP_EPP16 is not set
# CONFIG_SCSI_IZIP_SLOW_CTR is not set
CONFIG_SCSI_STEX=m
CONFIG_SCSI_SYM53C8XX_2=m
CONFIG_SCSI_SYM53C8XX_DMA_ADDRESSING_MODE=1
CONFIG_SCSI_SYM53C8XX_DEFAULT_TAGS=16
CONFIG_SCSI_SYM53C8XX_MAX_TAGS=64
CONFIG_SCSI_SYM53C8XX_MMIO=y
CONFIG_SCSI_IPR=m
CONFIG_SCSI_IPR_TRACE=y
CONFIG_SCSI_IPR_DUMP=y
CONFIG_SCSI_QLOGIC_1280=m
CONFIG_SCSI_QLA_FC=m
CONFIG_TCM_QLA2XXX=m
# CONFIG_TCM_QLA2XXX_DEBUG is not set
CONFIG_SCSI_QLA_ISCSI=m
CONFIG_QEDI=m
CONFIG_QEDF=m
CONFIG_SCSI_LPFC=m
# CONFIG_SCSI_LPFC_DEBUG_FS is not set
CONFIG_SCSI_EFCT=m
CONFIG_SCSI_SIM710=m
CONFIG_SCSI_DC395x=m
CONFIG_SCSI_AM53C974=m
CONFIG_SCSI_WD719X=m
CONFIG_SCSI_DEBUG=m
CONFIG_SCSI_PMCRAID=m
CONFIG_SCSI_PM8001=m
CONFIG_SCSI_BFA_FC=m
CONFIG_SCSI_VIRTIO=y
CONFIG_SCSI_CHELSIO_FCOE=m
CONFIG_SCSI_LOWLEVEL_PCMCIA=y
CONFIG_PCMCIA_AHA152X=m
CONFIG_PCMCIA_FDOMAIN=m
CONFIG_PCMCIA_QLOGIC=m
CONFIG_PCMCIA_SYM53C500=m
CONFIG_SCSI_DH=y
CONFIG_SCSI_DH_RDAC=m
CONFIG_SCSI_DH_HP_SW=m
CONFIG_SCSI_DH_EMC=m
CONFIG_SCSI_DH_ALUA=m
# end of SCSI device support

CONFIG_ATA=y
CONFIG_SATA_HOST=y
CONFIG_PATA_TIMINGS=y
CONFIG_ATA_VERBOSE_ERROR=y
CONFIG_ATA_FORCE=y
CONFIG_ATA_ACPI=y
CONFIG_SATA_ZPODD=y
CONFIG_SATA_PMP=y

#
# Controllers with non-SFF native interface
#
CONFIG_SATA_AHCI=m
CONFIG_SATA_MOBILE_LPM_POLICY=3
CONFIG_SATA_AHCI_PLATFORM=m
CONFIG_AHCI_DWC=m
CONFIG_SATA_INIC162X=m
CONFIG_SATA_ACARD_AHCI=m
CONFIG_SATA_SIL24=m
CONFIG_ATA_SFF=y

#
# SFF controllers with custom DMA interface
#
CONFIG_PDC_ADMA=m
CONFIG_SATA_QSTOR=m
CONFIG_SATA_SX4=m
CONFIG_ATA_BMDMA=y

#
# SATA SFF controllers with BMDMA
#
CONFIG_ATA_PIIX=y
CONFIG_SATA_DWC=m
CONFIG_SATA_DWC_OLD_DMA=y
CONFIG_SATA_MV=m
CONFIG_SATA_NV=m
CONFIG_SATA_PROMISE=m
CONFIG_SATA_SIL=m
CONFIG_SATA_SIS=m
CONFIG_SATA_SVW=m
CONFIG_SATA_ULI=m
CONFIG_SATA_VIA=m
CONFIG_SATA_VITESSE=m

#
# PATA SFF controllers with BMDMA
#
CONFIG_PATA_ALI=m
CONFIG_PATA_AMD=m
CONFIG_PATA_ARTOP=m
CONFIG_PATA_ATIIXP=m
CONFIG_PATA_ATP867X=m
CONFIG_PATA_CMD64X=m
CONFIG_PATA_CYPRESS=m
CONFIG_PATA_EFAR=m
CONFIG_PATA_HPT366=m
CONFIG_PATA_HPT37X=m
CONFIG_PATA_HPT3X2N=m
CONFIG_PATA_HPT3X3=m
# CONFIG_PATA_HPT3X3_DMA is not set
CONFIG_PATA_IT8213=m
CONFIG_PATA_IT821X=m
CONFIG_PATA_JMICRON=m
CONFIG_PATA_MARVELL=m
CONFIG_PATA_NETCELL=m
CONFIG_PATA_NINJA32=m
CONFIG_PATA_NS87415=m
CONFIG_PATA_OLDPIIX=m
CONFIG_PATA_OPTIDMA=m
CONFIG_PATA_PDC2027X=m
CONFIG_PATA_PDC_OLD=m
CONFIG_PATA_RADISYS=m
CONFIG_PATA_RDC=m
CONFIG_PATA_SCH=m
CONFIG_PATA_SERVERWORKS=m
CONFIG_PATA_SIL680=m
CONFIG_PATA_SIS=y
CONFIG_PATA_TOSHIBA=m
CONFIG_PATA_TRIFLEX=m
CONFIG_PATA_VIA=m
CONFIG_PATA_WINBOND=m

#
# PIO-only SFF controllers
#
CONFIG_PATA_CMD640_PCI=m
CONFIG_PATA_MPIIX=m
CONFIG_PATA_NS87410=m
CONFIG_PATA_OPTI=m
CONFIG_PATA_PCMCIA=m
CONFIG_PATA_RZ1000=m
# CONFIG_PATA_PARPORT is not set

#
# Generic fallback / legacy drivers
#
CONFIG_PATA_ACPI=m
CONFIG_ATA_GENERIC=y
CONFIG_PATA_LEGACY=m
CONFIG_MD=y
CONFIG_BLK_DEV_MD=y
CONFIG_MD_AUTODETECT=y
CONFIG_MD_BITMAP_FILE=y
CONFIG_MD_LINEAR=m
CONFIG_MD_RAID0=m
CONFIG_MD_RAID1=m
CONFIG_MD_RAID10=m
CONFIG_MD_RAID456=m
CONFIG_MD_MULTIPATH=m
CONFIG_MD_FAULTY=m
CONFIG_MD_CLUSTER=m
CONFIG_BCACHE=m
# CONFIG_BCACHE_DEBUG is not set
# CONFIG_BCACHE_CLOSURES_DEBUG is not set
CONFIG_BCACHE_ASYNC_REGISTRATION=y
CONFIG_BLK_DEV_DM_BUILTIN=y
CONFIG_BLK_DEV_DM=y
# CONFIG_DM_DEBUG is not set
CONFIG_DM_BUFIO=m
# CONFIG_DM_DEBUG_BLOCK_MANAGER_LOCKING is not set
CONFIG_DM_BIO_PRISON=m
CONFIG_DM_PERSISTENT_DATA=m
CONFIG_DM_UNSTRIPED=m
CONFIG_DM_CRYPT=m
CONFIG_DM_SNAPSHOT=m
CONFIG_DM_THIN_PROVISIONING=m
CONFIG_DM_CACHE=m
CONFIG_DM_CACHE_SMQ=m
CONFIG_DM_WRITECACHE=m
CONFIG_DM_EBS=m
CONFIG_DM_ERA=m
CONFIG_DM_CLONE=m
CONFIG_DM_MIRROR=m
CONFIG_DM_LOG_USERSPACE=m
CONFIG_DM_RAID=m
CONFIG_DM_ZERO=m
CONFIG_DM_MULTIPATH=m
CONFIG_DM_MULTIPATH_QL=m
CONFIG_DM_MULTIPATH_ST=m
CONFIG_DM_MULTIPATH_HST=m
CONFIG_DM_MULTIPATH_IOA=m
CONFIG_DM_DELAY=m
# CONFIG_DM_DUST is not set
CONFIG_DM_INIT=y
CONFIG_DM_UEVENT=y
CONFIG_DM_FLAKEY=m
CONFIG_DM_VERITY=m
CONFIG_DM_VERITY_VERIFY_ROOTHASH_SIG=y
CONFIG_DM_VERITY_VERIFY_ROOTHASH_SIG_SECONDARY_KEYRING=y
# CONFIG_DM_VERITY_FEC is not set
CONFIG_DM_SWITCH=m
CONFIG_DM_LOG_WRITES=m
CONFIG_DM_INTEGRITY=m
CONFIG_DM_ZONED=m
CONFIG_DM_AUDIT=y
CONFIG_TARGET_CORE=m
CONFIG_TCM_IBLOCK=m
CONFIG_TCM_FILEIO=m
CONFIG_TCM_PSCSI=m
CONFIG_TCM_USER2=m
CONFIG_LOOPBACK_TARGET=m
CONFIG_TCM_FC=m
CONFIG_ISCSI_TARGET=m
CONFIG_ISCSI_TARGET_CXGB4=m
CONFIG_SBP_TARGET=m
# CONFIG_REMOTE_TARGET is not set
CONFIG_FUSION=y
CONFIG_FUSION_SPI=m
CONFIG_FUSION_FC=m
CONFIG_FUSION_SAS=m
CONFIG_FUSION_MAX_SGE=128
CONFIG_FUSION_CTL=m
CONFIG_FUSION_LAN=m
CONFIG_FUSION_LOGGING=y

#
# IEEE 1394 (FireWire) support
#
CONFIG_FIREWIRE=m
CONFIG_FIREWIRE_OHCI=m
CONFIG_FIREWIRE_SBP2=m
CONFIG_FIREWIRE_NET=m
CONFIG_FIREWIRE_NOSY=m
# end of IEEE 1394 (FireWire) support

CONFIG_MACINTOSH_DRIVERS=y
CONFIG_MAC_EMUMOUSEBTN=m
CONFIG_NETDEVICES=y
CONFIG_MII=m
CONFIG_NET_CORE=y
CONFIG_BONDING=m
CONFIG_DUMMY=m
CONFIG_WIREGUARD=m
# CONFIG_WIREGUARD_DEBUG is not set
CONFIG_EQUALIZER=m
CONFIG_NET_FC=y
CONFIG_IFB=m
CONFIG_NET_TEAM=m
CONFIG_NET_TEAM_MODE_BROADCAST=m
CONFIG_NET_TEAM_MODE_ROUNDROBIN=m
CONFIG_NET_TEAM_MODE_RANDOM=m
CONFIG_NET_TEAM_MODE_ACTIVEBACKUP=m
CONFIG_NET_TEAM_MODE_LOADBALANCE=m
CONFIG_MACVLAN=m
CONFIG_MACVTAP=m
CONFIG_IPVLAN_L3S=y
CONFIG_IPVLAN=m
CONFIG_IPVTAP=m
CONFIG_VXLAN=m
CONFIG_GENEVE=m
CONFIG_BAREUDP=m
CONFIG_GTP=m
CONFIG_AMT=m
CONFIG_MACSEC=m
CONFIG_NETCONSOLE=m
CONFIG_NETCONSOLE_DYNAMIC=y
# CONFIG_NETCONSOLE_EXTENDED_LOG is not set
CONFIG_NETPOLL=y
CONFIG_NET_POLL_CONTROLLER=y
CONFIG_NTB_NETDEV=m
CONFIG_RIONET=m
CONFIG_RIONET_TX_SIZE=128
CONFIG_RIONET_RX_SIZE=128
CONFIG_TUN=y
CONFIG_TAP=m
# CONFIG_TUN_VNET_CROSS_LE is not set
CONFIG_VETH=m
CONFIG_VIRTIO_NET=m
CONFIG_NLMON=m
CONFIG_NET_VRF=m
CONFIG_VSOCKMON=m
CONFIG_MHI_NET=m
CONFIG_SUNGEM_PHY=m
CONFIG_ARCNET=m
CONFIG_ARCNET_1201=m
CONFIG_ARCNET_1051=m
CONFIG_ARCNET_RAW=m
CONFIG_ARCNET_CAP=m
CONFIG_ARCNET_COM90xx=m
CONFIG_ARCNET_COM90xxIO=m
CONFIG_ARCNET_RIM_I=m
CONFIG_ARCNET_COM20020=m
CONFIG_ARCNET_COM20020_PCI=m
CONFIG_ARCNET_COM20020_CS=m
CONFIG_ATM_DRIVERS=y
CONFIG_ATM_DUMMY=m
CONFIG_ATM_TCP=m
CONFIG_ATM_LANAI=m
CONFIG_ATM_ENI=m
# CONFIG_ATM_ENI_DEBUG is not set
# CONFIG_ATM_ENI_TUNE_BURST is not set
CONFIG_ATM_NICSTAR=m
# CONFIG_ATM_NICSTAR_USE_SUNI is not set
# CONFIG_ATM_NICSTAR_USE_IDT77105 is not set
CONFIG_ATM_IDT77252=m
# CONFIG_ATM_IDT77252_DEBUG is not set
# CONFIG_ATM_IDT77252_RCV_ALL is not set
CONFIG_ATM_IDT77252_USE_SUNI=y
CONFIG_ATM_IA=m
# CONFIG_ATM_IA_DEBUG is not set
CONFIG_ATM_FORE200E=m
# CONFIG_ATM_FORE200E_USE_TASKLET is not set
CONFIG_ATM_FORE200E_TX_RETRY=16
CONFIG_ATM_FORE200E_DEBUG=0
CONFIG_ATM_HE=m
CONFIG_ATM_HE_USE_SUNI=y
CONFIG_ATM_SOLOS=m
CONFIG_CAIF_DRIVERS=y
CONFIG_CAIF_TTY=m
CONFIG_CAIF_VIRTIO=m

#
# Distributed Switch Architecture drivers
#
CONFIG_B53=m
CONFIG_B53_SPI_DRIVER=m
CONFIG_B53_MDIO_DRIVER=m
CONFIG_B53_MMAP_DRIVER=m
CONFIG_B53_SRAB_DRIVER=m
CONFIG_B53_SERDES=m
CONFIG_NET_DSA_BCM_SF2=m
# CONFIG_NET_DSA_LOOP is not set
CONFIG_NET_DSA_HIRSCHMANN_HELLCREEK=m
CONFIG_NET_DSA_LANTIQ_GSWIP=m
CONFIG_NET_DSA_MT7530=m
CONFIG_NET_DSA_MT7530_MDIO=m
CONFIG_NET_DSA_MT7530_MMIO=m
CONFIG_NET_DSA_MV88E6060=m
CONFIG_NET_DSA_MICROCHIP_KSZ_COMMON=m
CONFIG_NET_DSA_MICROCHIP_KSZ9477_I2C=m
CONFIG_NET_DSA_MICROCHIP_KSZ_SPI=m
# CONFIG_NET_DSA_MICROCHIP_KSZ_PTP is not set
CONFIG_NET_DSA_MICROCHIP_KSZ8863_SMI=m
CONFIG_NET_DSA_MV88E6XXX=m
CONFIG_NET_DSA_MV88E6XXX_PTP=y
CONFIG_NET_DSA_MSCC_FELIX_DSA_LIB=m
# CONFIG_NET_DSA_MSCC_OCELOT_EXT is not set
CONFIG_NET_DSA_MSCC_SEVILLE=m
CONFIG_NET_DSA_AR9331=m
CONFIG_NET_DSA_QCA8K=m
# CONFIG_NET_DSA_QCA8K_LEDS_SUPPORT is not set
CONFIG_NET_DSA_SJA1105=m
CONFIG_NET_DSA_SJA1105_PTP=y
CONFIG_NET_DSA_SJA1105_TAS=y
CONFIG_NET_DSA_SJA1105_VL=y
CONFIG_NET_DSA_XRS700X=m
CONFIG_NET_DSA_XRS700X_I2C=m
CONFIG_NET_DSA_XRS700X_MDIO=m
CONFIG_NET_DSA_REALTEK=m
# CONFIG_NET_DSA_REALTEK_MDIO is not set
# CONFIG_NET_DSA_REALTEK_SMI is not set
CONFIG_NET_DSA_REALTEK_RTL8365MB=m
CONFIG_NET_DSA_REALTEK_RTL8366RB=m
CONFIG_NET_DSA_SMSC_LAN9303=m
CONFIG_NET_DSA_SMSC_LAN9303_I2C=m
CONFIG_NET_DSA_SMSC_LAN9303_MDIO=m
CONFIG_NET_DSA_VITESSE_VSC73XX=m
CONFIG_NET_DSA_VITESSE_VSC73XX_SPI=m
CONFIG_NET_DSA_VITESSE_VSC73XX_PLATFORM=m
# end of Distributed Switch Architecture drivers

CONFIG_ETHERNET=y
CONFIG_MDIO=m
CONFIG_NET_VENDOR_3COM=y
CONFIG_EL3=m
CONFIG_PCMCIA_3C574=m
CONFIG_PCMCIA_3C589=m
CONFIG_VORTEX=m
CONFIG_TYPHOON=m
CONFIG_NET_VENDOR_ADAPTEC=y
CONFIG_ADAPTEC_STARFIRE=m
CONFIG_NET_VENDOR_AGERE=y
CONFIG_ET131X=m
CONFIG_NET_VENDOR_ALACRITECH=y
CONFIG_SLICOSS=m
CONFIG_NET_VENDOR_ALTEON=y
CONFIG_ACENIC=m
# CONFIG_ACENIC_OMIT_TIGON_I is not set
CONFIG_ALTERA_TSE=m
CONFIG_NET_VENDOR_AMAZON=y
CONFIG_ENA_ETHERNET=m
CONFIG_NET_VENDOR_AMD=y
CONFIG_AMD8111_ETH=m
CONFIG_PCNET32=m
CONFIG_PCMCIA_NMCLAN=m
CONFIG_AMD_XGBE=m
CONFIG_AMD_XGBE_DCB=y
CONFIG_AMD_XGBE_HAVE_ECC=y
# CONFIG_PDS_CORE is not set
CONFIG_NET_VENDOR_AQUANTIA=y
CONFIG_AQTION=m
CONFIG_NET_VENDOR_ARC=y
CONFIG_NET_VENDOR_ASIX=y
CONFIG_SPI_AX88796C=m
# CONFIG_SPI_AX88796C_COMPRESSION is not set
CONFIG_NET_VENDOR_ATHEROS=y
CONFIG_ATL2=m
CONFIG_ATL1=m
CONFIG_ATL1E=m
CONFIG_ATL1C=m
CONFIG_ALX=m
CONFIG_CX_ECAT=m
CONFIG_NET_VENDOR_BROADCOM=y
CONFIG_B44=m
CONFIG_B44_PCI_AUTOSELECT=y
CONFIG_B44_PCICORE_AUTOSELECT=y
CONFIG_B44_PCI=y
CONFIG_BCMGENET=m
CONFIG_BNX2=m
CONFIG_CNIC=m
CONFIG_TIGON3=m
CONFIG_TIGON3_HWMON=y
CONFIG_BNX2X=m
CONFIG_BNX2X_SRIOV=y
CONFIG_SYSTEMPORT=m
CONFIG_BNXT=m
CONFIG_BNXT_SRIOV=y
CONFIG_BNXT_FLOWER_OFFLOAD=y
CONFIG_BNXT_DCB=y
CONFIG_BNXT_HWMON=y
CONFIG_NET_VENDOR_CADENCE=y
CONFIG_MACB=m
CONFIG_MACB_USE_HWSTAMP=y
CONFIG_MACB_PCI=m
CONFIG_NET_VENDOR_CAVIUM=y
CONFIG_THUNDER_NIC_PF=m
CONFIG_THUNDER_NIC_VF=m
CONFIG_THUNDER_NIC_BGX=m
CONFIG_THUNDER_NIC_RGX=m
CONFIG_CAVIUM_PTP=m
CONFIG_LIQUIDIO_CORE=m
CONFIG_LIQUIDIO=m
CONFIG_LIQUIDIO_VF=m
CONFIG_NET_VENDOR_CHELSIO=y
CONFIG_CHELSIO_T1=m
CONFIG_CHELSIO_T1_1G=y
CONFIG_CHELSIO_T3=m
CONFIG_CHELSIO_T4=m
CONFIG_CHELSIO_T4_DCB=y
CONFIG_CHELSIO_T4_FCOE=y
CONFIG_CHELSIO_T4VF=m
CONFIG_CHELSIO_LIB=m
CONFIG_CHELSIO_INLINE_CRYPTO=y
CONFIG_CHELSIO_IPSEC_INLINE=m
CONFIG_CHELSIO_TLS_DEVICE=m
CONFIG_NET_VENDOR_CIRRUS=y
CONFIG_NET_VENDOR_CISCO=y
CONFIG_ENIC=m
CONFIG_NET_VENDOR_CORTINA=y
CONFIG_NET_VENDOR_DAVICOM=y
CONFIG_DM9051=m
CONFIG_DNET=m
CONFIG_NET_VENDOR_DEC=y
CONFIG_NET_TULIP=y
CONFIG_DE2104X=m
CONFIG_DE2104X_DSL=0
CONFIG_TULIP=m
# CONFIG_TULIP_MWI is not set
# CONFIG_TULIP_MMIO is not set
# CONFIG_TULIP_NAPI is not set
CONFIG_WINBOND_840=m
CONFIG_DM9102=m
CONFIG_ULI526X=m
CONFIG_PCMCIA_XIRCOM=m
CONFIG_NET_VENDOR_DLINK=y
CONFIG_DL2K=m
CONFIG_SUNDANCE=m
# CONFIG_SUNDANCE_MMIO is not set
CONFIG_NET_VENDOR_EMULEX=y
CONFIG_BE2NET=m
CONFIG_BE2NET_HWMON=y
CONFIG_BE2NET_BE2=y
CONFIG_BE2NET_BE3=y
CONFIG_BE2NET_LANCER=y
CONFIG_BE2NET_SKYHAWK=y
CONFIG_NET_VENDOR_ENGLEDER=y
CONFIG_TSNEP=m
# CONFIG_TSNEP_SELFTESTS is not set
CONFIG_NET_VENDOR_EZCHIP=y
CONFIG_NET_VENDOR_FUJITSU=y
CONFIG_PCMCIA_FMVJ18X=m
CONFIG_NET_VENDOR_FUNGIBLE=y
CONFIG_FUN_CORE=m
CONFIG_FUN_ETH=m
CONFIG_NET_VENDOR_GOOGLE=y
CONFIG_GVE=m
CONFIG_NET_VENDOR_HUAWEI=y
CONFIG_HINIC=m
CONFIG_NET_VENDOR_I825XX=y
CONFIG_NET_VENDOR_INTEL=y
CONFIG_E100=m
CONFIG_E1000=m
CONFIG_E1000E=m
CONFIG_E1000E_HWTS=y
CONFIG_IGB=m
CONFIG_IGB_HWMON=y
CONFIG_IGB_DCA=y
CONFIG_IGBVF=m
CONFIG_IXGBE=m
CONFIG_IXGBE_HWMON=y
CONFIG_IXGBE_DCA=y
CONFIG_IXGBE_DCB=y
CONFIG_IXGBE_IPSEC=y
CONFIG_IXGBEVF=m
CONFIG_IXGBEVF_IPSEC=y
CONFIG_I40E=m
CONFIG_I40E_DCB=y
CONFIG_IAVF=m
CONFIG_I40EVF=m
CONFIG_ICE=m
CONFIG_ICE_SWITCHDEV=y
CONFIG_ICE_HWTS=y
CONFIG_FM10K=m
CONFIG_IGC=m
CONFIG_JME=m
CONFIG_NET_VENDOR_ADI=y
CONFIG_ADIN1110=m
CONFIG_NET_VENDOR_LITEX=y
CONFIG_NET_VENDOR_MARVELL=y
CONFIG_MVMDIO=m
CONFIG_SKGE=m
# CONFIG_SKGE_DEBUG is not set
CONFIG_SKGE_GENESIS=y
CONFIG_SKY2=m
# CONFIG_SKY2_DEBUG is not set
CONFIG_OCTEON_EP=m
CONFIG_PRESTERA=m
CONFIG_PRESTERA_PCI=m
CONFIG_NET_VENDOR_MELLANOX=y
CONFIG_MLX4_EN=m
CONFIG_MLX4_EN_DCB=y
CONFIG_MLX4_CORE=m
CONFIG_MLX4_DEBUG=y
CONFIG_MLX4_CORE_GEN2=y
CONFIG_MLX5_CORE=m
CONFIG_MLX5_FPGA=y
CONFIG_MLX5_CORE_EN=y
CONFIG_MLX5_EN_ARFS=y
CONFIG_MLX5_EN_RXNFC=y
CONFIG_MLX5_MPFS=y
CONFIG_MLX5_ESWITCH=y
CONFIG_MLX5_BRIDGE=y
CONFIG_MLX5_CLS_ACT=y
CONFIG_MLX5_TC_CT=y
CONFIG_MLX5_TC_SAMPLE=y
CONFIG_MLX5_CORE_EN_DCB=y
CONFIG_MLX5_CORE_IPOIB=y
# CONFIG_MLX5_MACSEC is not set
CONFIG_MLX5_EN_IPSEC=y
CONFIG_MLX5_EN_TLS=y
CONFIG_MLX5_SW_STEERING=y
CONFIG_MLX5_SF=y
CONFIG_MLX5_SF_MANAGER=y
CONFIG_MLXSW_CORE=m
CONFIG_MLXSW_CORE_HWMON=y
CONFIG_MLXSW_CORE_THERMAL=y
CONFIG_MLXSW_PCI=m
CONFIG_MLXSW_I2C=m
CONFIG_MLXSW_SPECTRUM=m
CONFIG_MLXSW_SPECTRUM_DCB=y
CONFIG_MLXSW_MINIMAL=m
CONFIG_MLXFW=m
CONFIG_NET_VENDOR_MICREL=y
CONFIG_KS8842=m
CONFIG_KS8851=m
CONFIG_KS8851_MLL=m
CONFIG_KSZ884X_PCI=m
CONFIG_NET_VENDOR_MICROCHIP=y
CONFIG_ENC28J60=m
# CONFIG_ENC28J60_WRITEVERIFY is not set
CONFIG_ENCX24J600=m
CONFIG_LAN743X=m
CONFIG_VCAP=y
CONFIG_NET_VENDOR_MICROSEMI=y
CONFIG_MSCC_OCELOT_SWITCH_LIB=m
CONFIG_NET_VENDOR_MICROSOFT=y
CONFIG_MICROSOFT_MANA=m
CONFIG_NET_VENDOR_MYRI=y
CONFIG_MYRI10GE=m
CONFIG_MYRI10GE_DCA=y
CONFIG_FEALNX=m
CONFIG_NET_VENDOR_NI=y
CONFIG_NI_XGE_MANAGEMENT_ENET=m
CONFIG_NET_VENDOR_NATSEMI=y
CONFIG_NATSEMI=m
CONFIG_NS83820=m
CONFIG_NET_VENDOR_NETERION=y
CONFIG_S2IO=m
CONFIG_NET_VENDOR_NETRONOME=y
CONFIG_NFP=m
CONFIG_NFP_APP_FLOWER=y
CONFIG_NFP_APP_ABM_NIC=y
CONFIG_NFP_NET_IPSEC=y
# CONFIG_NFP_DEBUG is not set
CONFIG_NET_VENDOR_8390=y
CONFIG_PCMCIA_AXNET=m
CONFIG_NE2K_PCI=m
CONFIG_PCMCIA_PCNET=m
CONFIG_NET_VENDOR_NVIDIA=y
CONFIG_FORCEDETH=m
CONFIG_NET_VENDOR_OKI=y
CONFIG_ETHOC=m
CONFIG_NET_VENDOR_PACKET_ENGINES=y
CONFIG_HAMACHI=m
CONFIG_YELLOWFIN=m
CONFIG_NET_VENDOR_PENSANDO=y
CONFIG_IONIC=m
CONFIG_NET_VENDOR_QLOGIC=y
CONFIG_QLA3XXX=m
CONFIG_QLCNIC=m
CONFIG_QLCNIC_SRIOV=y
CONFIG_QLCNIC_DCB=y
CONFIG_QLCNIC_HWMON=y
CONFIG_NETXEN_NIC=m
CONFIG_QED=m
CONFIG_QED_LL2=y
CONFIG_QED_SRIOV=y
CONFIG_QEDE=m
CONFIG_QED_RDMA=y
CONFIG_QED_ISCSI=y
CONFIG_QED_FCOE=y
CONFIG_QED_OOO=y
CONFIG_NET_VENDOR_BROCADE=y
CONFIG_BNA=m
CONFIG_NET_VENDOR_QUALCOMM=y
CONFIG_QCOM_EMAC=m
CONFIG_RMNET=m
CONFIG_NET_VENDOR_RDC=y
CONFIG_R6040=m
CONFIG_NET_VENDOR_REALTEK=y
CONFIG_ATP=m
CONFIG_8139CP=m
CONFIG_8139TOO=m
CONFIG_8139TOO_PIO=y
# CONFIG_8139TOO_TUNE_TWISTER is not set
CONFIG_8139TOO_8129=y
# CONFIG_8139_OLD_RX_RESET is not set
CONFIG_R8169=m
CONFIG_NET_VENDOR_RENESAS=y
CONFIG_NET_VENDOR_ROCKER=y
CONFIG_ROCKER=m
CONFIG_NET_VENDOR_SAMSUNG=y
CONFIG_SXGBE_ETH=m
CONFIG_NET_VENDOR_SEEQ=y
CONFIG_NET_VENDOR_SILAN=y
CONFIG_SC92031=m
CONFIG_NET_VENDOR_SIS=y
CONFIG_SIS900=m
CONFIG_SIS190=m
CONFIG_NET_VENDOR_SOLARFLARE=y
CONFIG_SFC=m
CONFIG_SFC_MTD=y
CONFIG_SFC_MCDI_MON=y
CONFIG_SFC_SRIOV=y
CONFIG_SFC_MCDI_LOGGING=y
CONFIG_SFC_FALCON=m
CONFIG_SFC_FALCON_MTD=y
CONFIG_SFC_SIENA=m
CONFIG_SFC_SIENA_MTD=y
CONFIG_SFC_SIENA_MCDI_MON=y
CONFIG_SFC_SIENA_SRIOV=y
CONFIG_SFC_SIENA_MCDI_LOGGING=y
CONFIG_NET_VENDOR_SMSC=y
CONFIG_PCMCIA_SMC91C92=m
CONFIG_EPIC100=m
CONFIG_SMSC911X=m
CONFIG_SMSC9420=m
CONFIG_NET_VENDOR_SOCIONEXT=y
CONFIG_NET_VENDOR_STMICRO=y
CONFIG_STMMAC_ETH=m
# CONFIG_STMMAC_SELFTESTS is not set
CONFIG_STMMAC_PLATFORM=m
CONFIG_DWMAC_GENERIC=m
CONFIG_DWMAC_INTEL=m
CONFIG_DWMAC_LOONGSON=m
CONFIG_STMMAC_PCI=m
CONFIG_NET_VENDOR_SUN=y
CONFIG_HAPPYMEAL=m
CONFIG_SUNGEM=m
CONFIG_CASSINI=m
CONFIG_NIU=m
CONFIG_NET_VENDOR_SYNOPSYS=y
CONFIG_DWC_XLGMAC=m
CONFIG_DWC_XLGMAC_PCI=m
CONFIG_NET_VENDOR_TEHUTI=y
CONFIG_TEHUTI=m
CONFIG_NET_VENDOR_TI=y
# CONFIG_TI_CPSW_PHY_SEL is not set
CONFIG_TLAN=m
CONFIG_NET_VENDOR_VERTEXCOM=y
CONFIG_MSE102X=m
CONFIG_NET_VENDOR_VIA=y
CONFIG_VIA_RHINE=m
CONFIG_VIA_RHINE_MMIO=y
CONFIG_VIA_VELOCITY=m
CONFIG_NET_VENDOR_WANGXUN=y
CONFIG_LIBWX=m
CONFIG_NGBE=m
CONFIG_TXGBE=m
CONFIG_NET_VENDOR_WIZNET=y
CONFIG_WIZNET_W5100=m
CONFIG_WIZNET_W5300=m
# CONFIG_WIZNET_BUS_DIRECT is not set
# CONFIG_WIZNET_BUS_INDIRECT is not set
CONFIG_WIZNET_BUS_ANY=y
CONFIG_WIZNET_W5100_SPI=m
CONFIG_NET_VENDOR_XILINX=y
CONFIG_XILINX_EMACLITE=m
CONFIG_XILINX_AXI_EMAC=m
CONFIG_XILINX_LL_TEMAC=m
CONFIG_NET_VENDOR_XIRCOM=y
CONFIG_PCMCIA_XIRC2PS=m
CONFIG_FDDI=y
CONFIG_DEFXX=m
CONFIG_SKFP=m
# CONFIG_HIPPI is not set
CONFIG_NET_SB1000=m
CONFIG_PHYLINK=m
CONFIG_PHYLIB=y
CONFIG_SWPHY=y
CONFIG_LED_TRIGGER_PHY=y
CONFIG_FIXED_PHY=y
CONFIG_SFP=m

#
# MII PHY device drivers
#
CONFIG_AMD_PHY=m
CONFIG_ADIN_PHY=m
CONFIG_ADIN1100_PHY=m
CONFIG_AQUANTIA_PHY=m
CONFIG_AX88796B_PHY=m
CONFIG_BROADCOM_PHY=m
CONFIG_BCM54140_PHY=m
CONFIG_BCM7XXX_PHY=m
CONFIG_BCM84881_PHY=y
CONFIG_BCM87XX_PHY=m
CONFIG_BCM_NET_PHYLIB=m
CONFIG_BCM_NET_PHYPTP=m
CONFIG_CICADA_PHY=m
CONFIG_CORTINA_PHY=m
CONFIG_DAVICOM_PHY=m
CONFIG_ICPLUS_PHY=m
CONFIG_LXT_PHY=m
CONFIG_INTEL_XWAY_PHY=m
CONFIG_LSI_ET1011C_PHY=m
CONFIG_MARVELL_PHY=m
CONFIG_MARVELL_10G_PHY=m
# CONFIG_MARVELL_88Q2XXX_PHY is not set
CONFIG_MARVELL_88X2222_PHY=m
CONFIG_MAXLINEAR_GPHY=m
CONFIG_MEDIATEK_GE_PHY=m
# CONFIG_MEDIATEK_GE_SOC_PHY is not set
CONFIG_MICREL_PHY=m
# CONFIG_MICROCHIP_T1S_PHY is not set
CONFIG_MICROCHIP_PHY=m
CONFIG_MICROCHIP_T1_PHY=m
CONFIG_MICROSEMI_PHY=m
CONFIG_MOTORCOMM_PHY=m
CONFIG_NATIONAL_PHY=m
# CONFIG_NXP_CBTX_PHY is not set
CONFIG_NXP_C45_TJA11XX_PHY=m
CONFIG_NXP_TJA11XX_PHY=m
# CONFIG_NCN26000_PHY is not set
CONFIG_AT803X_PHY=m
CONFIG_QSEMI_PHY=m
CONFIG_REALTEK_PHY=m
CONFIG_RENESAS_PHY=m
CONFIG_ROCKCHIP_PHY=m
CONFIG_SMSC_PHY=m
CONFIG_STE10XP=m
CONFIG_TERANETICS_PHY=m
CONFIG_DP83822_PHY=m
CONFIG_DP83TC811_PHY=m
CONFIG_DP83848_PHY=m
CONFIG_DP83867_PHY=m
CONFIG_DP83869_PHY=m
CONFIG_DP83TD510_PHY=m
CONFIG_VITESSE_PHY=m
CONFIG_XILINX_GMII2RGMII=m
CONFIG_MICREL_KS8995MA=m
CONFIG_PSE_CONTROLLER=y
CONFIG_PSE_REGULATOR=m
CONFIG_CAN_DEV=m
CONFIG_CAN_VCAN=m
CONFIG_CAN_VXCAN=m
CONFIG_CAN_NETLINK=y
CONFIG_CAN_CALC_BITTIMING=y
CONFIG_CAN_RX_OFFLOAD=y
CONFIG_CAN_CAN327=m
CONFIG_CAN_JANZ_ICAN3=m
CONFIG_CAN_KVASER_PCIEFD=m
CONFIG_CAN_SLCAN=m
CONFIG_CAN_C_CAN=m
CONFIG_CAN_C_CAN_PLATFORM=m
CONFIG_CAN_C_CAN_PCI=m
CONFIG_CAN_CC770=m
CONFIG_CAN_CC770_ISA=m
CONFIG_CAN_CC770_PLATFORM=m
CONFIG_CAN_CTUCANFD=m
CONFIG_CAN_CTUCANFD_PCI=m
CONFIG_CAN_IFI_CANFD=m
CONFIG_CAN_M_CAN=m
CONFIG_CAN_M_CAN_PCI=m
CONFIG_CAN_M_CAN_PLATFORM=m
CONFIG_CAN_M_CAN_TCAN4X5X=m
CONFIG_CAN_PEAK_PCIEFD=m
CONFIG_CAN_SJA1000=m
CONFIG_CAN_EMS_PCI=m
CONFIG_CAN_EMS_PCMCIA=m
CONFIG_CAN_F81601=m
CONFIG_CAN_KVASER_PCI=m
CONFIG_CAN_PEAK_PCI=m
CONFIG_CAN_PEAK_PCIEC=y
CONFIG_CAN_PEAK_PCMCIA=m
CONFIG_CAN_PLX_PCI=m
CONFIG_CAN_SJA1000_ISA=m
CONFIG_CAN_SJA1000_PLATFORM=m
CONFIG_CAN_SOFTING=m
CONFIG_CAN_SOFTING_CS=m

#
# CAN SPI interfaces
#
CONFIG_CAN_HI311X=m
CONFIG_CAN_MCP251X=m
CONFIG_CAN_MCP251XFD=m
# CONFIG_CAN_MCP251XFD_SANITY is not set
# end of CAN SPI interfaces

#
# CAN USB interfaces
#
CONFIG_CAN_8DEV_USB=m
CONFIG_CAN_EMS_USB=m
CONFIG_CAN_ESD_USB=m
CONFIG_CAN_ETAS_ES58X=m
# CONFIG_CAN_F81604 is not set
CONFIG_CAN_GS_USB=m
CONFIG_CAN_KVASER_USB=m
CONFIG_CAN_MCBA_USB=m
CONFIG_CAN_PEAK_USB=m
CONFIG_CAN_UCAN=m
# end of CAN USB interfaces

# CONFIG_CAN_DEBUG_DEVICES is not set

#
# MCTP Device Drivers
#
CONFIG_MCTP_SERIAL=m
# end of MCTP Device Drivers

CONFIG_MDIO_DEVICE=y
CONFIG_MDIO_BUS=y
CONFIG_FWNODE_MDIO=y
CONFIG_ACPI_MDIO=y
CONFIG_MDIO_DEVRES=y
CONFIG_MDIO_BITBANG=m
CONFIG_MDIO_BCM_UNIMAC=m
CONFIG_MDIO_CAVIUM=m
CONFIG_MDIO_GPIO=m
CONFIG_MDIO_I2C=m
CONFIG_MDIO_MVUSB=m
CONFIG_MDIO_MSCC_MIIM=m
CONFIG_MDIO_REGMAP=m
CONFIG_MDIO_THUNDER=m

#
# MDIO Multiplexers
#

#
# PCS device drivers
#
CONFIG_PCS_XPCS=m
CONFIG_PCS_LYNX=m
CONFIG_PCS_MTK_LYNXI=m
# end of PCS device drivers

CONFIG_PLIP=m
CONFIG_PPP=y
CONFIG_PPP_BSDCOMP=m
CONFIG_PPP_DEFLATE=m
CONFIG_PPP_FILTER=y
CONFIG_PPP_MPPE=m
CONFIG_PPP_MULTILINK=y
CONFIG_PPPOATM=m
CONFIG_PPPOE=m
# CONFIG_PPPOE_HASH_BITS_1 is not set
# CONFIG_PPPOE_HASH_BITS_2 is not set
CONFIG_PPPOE_HASH_BITS_4=y
# CONFIG_PPPOE_HASH_BITS_8 is not set
CONFIG_PPPOE_HASH_BITS=4
CONFIG_PPTP=m
CONFIG_PPPOL2TP=m
CONFIG_PPP_ASYNC=m
CONFIG_PPP_SYNC_TTY=m
CONFIG_SLIP=m
CONFIG_SLHC=y
CONFIG_SLIP_COMPRESSED=y
CONFIG_SLIP_SMART=y
CONFIG_SLIP_MODE_SLIP6=y
CONFIG_USB_NET_DRIVERS=m
CONFIG_USB_CATC=m
CONFIG_USB_KAWETH=m
CONFIG_USB_PEGASUS=m
CONFIG_USB_RTL8150=m
CONFIG_USB_RTL8152=m
CONFIG_USB_LAN78XX=m
CONFIG_USB_USBNET=m
CONFIG_USB_NET_AX8817X=m
CONFIG_USB_NET_AX88179_178A=m
CONFIG_USB_NET_CDCETHER=m
CONFIG_USB_NET_CDC_EEM=m
CONFIG_USB_NET_CDC_NCM=m
CONFIG_USB_NET_HUAWEI_CDC_NCM=m
CONFIG_USB_NET_CDC_MBIM=m
CONFIG_USB_NET_DM9601=m
CONFIG_USB_NET_SR9700=m
CONFIG_USB_NET_SR9800=m
CONFIG_USB_NET_SMSC75XX=m
CONFIG_USB_NET_SMSC95XX=m
CONFIG_USB_NET_GL620A=m
CONFIG_USB_NET_NET1080=m
CONFIG_USB_NET_PLUSB=m
CONFIG_USB_NET_MCS7830=m
CONFIG_USB_NET_RNDIS_HOST=m
CONFIG_USB_NET_CDC_SUBSET_ENABLE=m
CONFIG_USB_NET_CDC_SUBSET=m
CONFIG_USB_ALI_M5632=y
CONFIG_USB_AN2720=y
CONFIG_USB_BELKIN=y
CONFIG_USB_ARMLINUX=y
CONFIG_USB_EPSON2888=y
CONFIG_USB_KC2190=y
CONFIG_USB_NET_ZAURUS=m
CONFIG_USB_NET_CX82310_ETH=m
CONFIG_USB_NET_KALMIA=m
CONFIG_USB_NET_QMI_WWAN=m
CONFIG_USB_HSO=m
CONFIG_USB_NET_INT51X1=m
CONFIG_USB_CDC_PHONET=m
CONFIG_USB_IPHETH=m
CONFIG_USB_SIERRA_NET=m
CONFIG_USB_VL600=m
CONFIG_USB_NET_CH9200=m
CONFIG_USB_NET_AQC111=m
CONFIG_USB_RTL8153_ECM=m
CONFIG_WLAN=y
CONFIG_WLAN_VENDOR_ADMTEK=y
CONFIG_ADM8211=m
CONFIG_ATH_COMMON=m
CONFIG_WLAN_VENDOR_ATH=y
# CONFIG_ATH_DEBUG is not set
CONFIG_ATH5K=m
# CONFIG_ATH5K_DEBUG is not set
# CONFIG_ATH5K_TRACER is not set
CONFIG_ATH5K_PCI=y
CONFIG_ATH9K_HW=m
CONFIG_ATH9K_COMMON=m
CONFIG_ATH9K_COMMON_DEBUG=y
CONFIG_ATH9K_BTCOEX_SUPPORT=y
CONFIG_ATH9K=m
CONFIG_ATH9K_PCI=y
CONFIG_ATH9K_AHB=y
CONFIG_ATH9K_DEBUGFS=y
CONFIG_ATH9K_STATION_STATISTICS=y
# CONFIG_ATH9K_DYNACK is not set
CONFIG_ATH9K_WOW=y
CONFIG_ATH9K_RFKILL=y
CONFIG_ATH9K_CHANNEL_CONTEXT=y
CONFIG_ATH9K_PCOEM=y
CONFIG_ATH9K_PCI_NO_EEPROM=m
CONFIG_ATH9K_HTC=m
CONFIG_ATH9K_HTC_DEBUGFS=y
CONFIG_ATH9K_HWRNG=y
CONFIG_ATH9K_COMMON_SPECTRAL=y
CONFIG_CARL9170=m
CONFIG_CARL9170_LEDS=y
# CONFIG_CARL9170_DEBUGFS is not set
CONFIG_CARL9170_WPC=y
CONFIG_CARL9170_HWRNG=y
CONFIG_ATH6KL=m
CONFIG_ATH6KL_SDIO=m
CONFIG_ATH6KL_USB=m
# CONFIG_ATH6KL_DEBUG is not set
# CONFIG_ATH6KL_TRACING is not set
CONFIG_AR5523=m
CONFIG_WIL6210=m
CONFIG_WIL6210_ISR_COR=y
CONFIG_WIL6210_TRACING=y
CONFIG_WIL6210_DEBUGFS=y
CONFIG_ATH10K=m
CONFIG_ATH10K_CE=y
CONFIG_ATH10K_PCI=m
CONFIG_ATH10K_SDIO=m
CONFIG_ATH10K_USB=m
# CONFIG_ATH10K_DEBUG is not set
CONFIG_ATH10K_DEBUGFS=y
CONFIG_ATH10K_SPECTRAL=y
CONFIG_ATH10K_TRACING=y
CONFIG_WCN36XX=m
# CONFIG_WCN36XX_DEBUGFS is not set
CONFIG_ATH11K=m
CONFIG_ATH11K_AHB=m
CONFIG_ATH11K_PCI=m
# CONFIG_ATH11K_DEBUG is not set
CONFIG_ATH11K_DEBUGFS=y
CONFIG_ATH11K_TRACING=y
CONFIG_ATH11K_SPECTRAL=y
# CONFIG_ATH12K is not set
CONFIG_WLAN_VENDOR_ATMEL=y
CONFIG_ATMEL=m
CONFIG_PCI_ATMEL=m
CONFIG_PCMCIA_ATMEL=m
CONFIG_AT76C50X_USB=m
CONFIG_WLAN_VENDOR_BROADCOM=y
CONFIG_B43=m
CONFIG_B43_BCMA=y
CONFIG_B43_SSB=y
CONFIG_B43_BUSES_BCMA_AND_SSB=y
# CONFIG_B43_BUSES_BCMA is not set
# CONFIG_B43_BUSES_SSB is not set
CONFIG_B43_PCI_AUTOSELECT=y
CONFIG_B43_PCICORE_AUTOSELECT=y
# CONFIG_B43_SDIO is not set
CONFIG_B43_BCMA_PIO=y
CONFIG_B43_PIO=y
CONFIG_B43_PHY_G=y
CONFIG_B43_PHY_N=y
CONFIG_B43_PHY_LP=y
CONFIG_B43_PHY_HT=y
CONFIG_B43_LEDS=y
CONFIG_B43_HWRNG=y
# CONFIG_B43_DEBUG is not set
CONFIG_B43LEGACY=m
CONFIG_B43LEGACY_PCI_AUTOSELECT=y
CONFIG_B43LEGACY_PCICORE_AUTOSELECT=y
CONFIG_B43LEGACY_LEDS=y
CONFIG_B43LEGACY_HWRNG=y
# CONFIG_B43LEGACY_DEBUG is not set
CONFIG_B43LEGACY_DMA=y
CONFIG_B43LEGACY_PIO=y
CONFIG_B43LEGACY_DMA_AND_PIO_MODE=y
# CONFIG_B43LEGACY_DMA_MODE is not set
# CONFIG_B43LEGACY_PIO_MODE is not set
CONFIG_BRCMUTIL=m
CONFIG_BRCMSMAC=m
CONFIG_BRCMSMAC_LEDS=y
CONFIG_BRCMFMAC=m
CONFIG_BRCMFMAC_PROTO_BCDC=y
CONFIG_BRCMFMAC_PROTO_MSGBUF=y
CONFIG_BRCMFMAC_SDIO=y
CONFIG_BRCMFMAC_USB=y
CONFIG_BRCMFMAC_PCIE=y
CONFIG_BRCM_TRACING=y
# CONFIG_BRCMDBG is not set
CONFIG_WLAN_VENDOR_CISCO=y
CONFIG_AIRO=m
CONFIG_AIRO_CS=m
CONFIG_WLAN_VENDOR_INTEL=y
CONFIG_IPW2100=m
CONFIG_IPW2100_MONITOR=y
# CONFIG_IPW2100_DEBUG is not set
CONFIG_IPW2200=m
CONFIG_IPW2200_MONITOR=y
CONFIG_IPW2200_RADIOTAP=y
CONFIG_IPW2200_PROMISCUOUS=y
CONFIG_IPW2200_QOS=y
# CONFIG_IPW2200_DEBUG is not set
CONFIG_LIBIPW=m
# CONFIG_LIBIPW_DEBUG is not set
CONFIG_IWLEGACY=m
CONFIG_IWL4965=m
CONFIG_IWL3945=m

#
# iwl3945 / iwl4965 Debugging Options
#
# CONFIG_IWLEGACY_DEBUG is not set
CONFIG_IWLEGACY_DEBUGFS=y
# end of iwl3945 / iwl4965 Debugging Options

CONFIG_IWLWIFI=m
CONFIG_IWLWIFI_LEDS=y
CONFIG_IWLDVM=m
CONFIG_IWLMVM=m
CONFIG_IWLWIFI_OPMODE_MODULAR=y

#
# Debugging Options
#
# CONFIG_IWLWIFI_DEBUG is not set
CONFIG_IWLWIFI_DEBUGFS=y
CONFIG_IWLWIFI_DEVICE_TRACING=y
# end of Debugging Options

CONFIG_WLAN_VENDOR_INTERSIL=y
CONFIG_HOSTAP=m
CONFIG_HOSTAP_FIRMWARE=y
CONFIG_HOSTAP_FIRMWARE_NVRAM=y
CONFIG_HOSTAP_PLX=m
CONFIG_HOSTAP_PCI=m
CONFIG_HOSTAP_CS=m
CONFIG_HERMES=m
# CONFIG_HERMES_PRISM is not set
CONFIG_HERMES_CACHE_FW_ON_INIT=y
CONFIG_PLX_HERMES=m
CONFIG_TMD_HERMES=m
CONFIG_NORTEL_HERMES=m
CONFIG_PCMCIA_HERMES=m
CONFIG_PCMCIA_SPECTRUM=m
CONFIG_ORINOCO_USB=m
CONFIG_P54_COMMON=m
CONFIG_P54_USB=m
CONFIG_P54_PCI=m
CONFIG_P54_SPI=m
# CONFIG_P54_SPI_DEFAULT_EEPROM is not set
CONFIG_P54_LEDS=y
CONFIG_WLAN_VENDOR_MARVELL=y
CONFIG_LIBERTAS=m
CONFIG_LIBERTAS_USB=m
CONFIG_LIBERTAS_CS=m
CONFIG_LIBERTAS_SDIO=m
CONFIG_LIBERTAS_SPI=m
# CONFIG_LIBERTAS_DEBUG is not set
CONFIG_LIBERTAS_MESH=y
CONFIG_LIBERTAS_THINFIRM=m
# CONFIG_LIBERTAS_THINFIRM_DEBUG is not set
CONFIG_LIBERTAS_THINFIRM_USB=m
CONFIG_MWIFIEX=m
CONFIG_MWIFIEX_SDIO=m
CONFIG_MWIFIEX_PCIE=m
CONFIG_MWIFIEX_USB=m
CONFIG_MWL8K=m
CONFIG_WLAN_VENDOR_MEDIATEK=y
CONFIG_MT7601U=m
CONFIG_MT76_CORE=m
CONFIG_MT76_LEDS=y
CONFIG_MT76_USB=m
CONFIG_MT76_SDIO=m
CONFIG_MT76x02_LIB=m
CONFIG_MT76x02_USB=m
CONFIG_MT76_CONNAC_LIB=m
CONFIG_MT792x_LIB=m
CONFIG_MT792x_USB=m
CONFIG_MT76x0_COMMON=m
CONFIG_MT76x0U=m
CONFIG_MT76x0E=m
CONFIG_MT76x2_COMMON=m
CONFIG_MT76x2E=m
CONFIG_MT76x2U=m
CONFIG_MT7603E=m
CONFIG_MT7615_COMMON=m
CONFIG_MT7615E=m
CONFIG_MT7663_USB_SDIO_COMMON=m
CONFIG_MT7663U=m
CONFIG_MT7663S=m
CONFIG_MT7915E=m
CONFIG_MT7921_COMMON=m
CONFIG_MT7921E=m
CONFIG_MT7921S=m
CONFIG_MT7921U=m
CONFIG_MT7996E=m
CONFIG_WLAN_VENDOR_MICROCHIP=y
CONFIG_WILC1000=m
CONFIG_WILC1000_SDIO=m
CONFIG_WILC1000_SPI=m
CONFIG_WILC1000_HW_OOB_INTR=y
CONFIG_WLAN_VENDOR_PURELIFI=y
CONFIG_PLFXLC=m
CONFIG_WLAN_VENDOR_RALINK=y
CONFIG_RT2X00=m
CONFIG_RT2400PCI=m
CONFIG_RT2500PCI=m
CONFIG_RT61PCI=m
CONFIG_RT2800PCI=m
CONFIG_RT2800PCI_RT33XX=y
CONFIG_RT2800PCI_RT35XX=y
CONFIG_RT2800PCI_RT53XX=y
CONFIG_RT2800PCI_RT3290=y
CONFIG_RT2500USB=m
CONFIG_RT73USB=m
CONFIG_RT2800USB=m
CONFIG_RT2800USB_RT33XX=y
CONFIG_RT2800USB_RT35XX=y
CONFIG_RT2800USB_RT3573=y
CONFIG_RT2800USB_RT53XX=y
CONFIG_RT2800USB_RT55XX=y
CONFIG_RT2800USB_UNKNOWN=y
CONFIG_RT2800_LIB=m
CONFIG_RT2800_LIB_MMIO=m
CONFIG_RT2X00_LIB_MMIO=m
CONFIG_RT2X00_LIB_PCI=m
CONFIG_RT2X00_LIB_USB=m
CONFIG_RT2X00_LIB=m
CONFIG_RT2X00_LIB_FIRMWARE=y
CONFIG_RT2X00_LIB_CRYPTO=y
CONFIG_RT2X00_LIB_LEDS=y
# CONFIG_RT2X00_LIB_DEBUGFS is not set
# CONFIG_RT2X00_DEBUG is not set
CONFIG_WLAN_VENDOR_REALTEK=y
CONFIG_RTL8180=m
CONFIG_RTL8187=m
CONFIG_RTL8187_LEDS=y
CONFIG_RTL_CARDS=m
CONFIG_RTL8192CE=m
CONFIG_RTL8192SE=m
CONFIG_RTL8192DE=m
CONFIG_RTL8723AE=m
CONFIG_RTL8723BE=m
CONFIG_RTL8188EE=m
CONFIG_RTL8192EE=m
CONFIG_RTL8821AE=m
CONFIG_RTL8192CU=m
CONFIG_RTLWIFI=m
CONFIG_RTLWIFI_PCI=m
CONFIG_RTLWIFI_USB=m
# CONFIG_RTLWIFI_DEBUG is not set
CONFIG_RTL8192C_COMMON=m
CONFIG_RTL8723_COMMON=m
CONFIG_RTLBTCOEXIST=m
CONFIG_RTL8XXXU=m
CONFIG_RTL8XXXU_UNTESTED=y
CONFIG_RTW88=m
CONFIG_RTW88_CORE=m
CONFIG_RTW88_PCI=m
CONFIG_RTW88_USB=m
CONFIG_RTW88_8822B=m
CONFIG_RTW88_8822C=m
CONFIG_RTW88_8723D=m
CONFIG_RTW88_8821C=m
CONFIG_RTW88_8822BE=m
# CONFIG_RTW88_8822BS is not set
CONFIG_RTW88_8822BU=m
CONFIG_RTW88_8822CE=m
# CONFIG_RTW88_8822CS is not set
CONFIG_RTW88_8822CU=m
CONFIG_RTW88_8723DE=m
# CONFIG_RTW88_8723DS is not set
CONFIG_RTW88_8723DU=m
CONFIG_RTW88_8821CE=m
# CONFIG_RTW88_8821CS is not set
CONFIG_RTW88_8821CU=m
CONFIG_RTW88_DEBUG=y
CONFIG_RTW88_DEBUGFS=y
CONFIG_RTW89=m
CONFIG_RTW89_CORE=m
CONFIG_RTW89_PCI=m
CONFIG_RTW89_8852A=m
CONFIG_RTW89_8852B=m
CONFIG_RTW89_8852C=m
# CONFIG_RTW89_8851BE is not set
CONFIG_RTW89_8852AE=m
CONFIG_RTW89_8852BE=m
CONFIG_RTW89_8852CE=m
CONFIG_RTW89_DEBUG=y
CONFIG_RTW89_DEBUGMSG=y
CONFIG_RTW89_DEBUGFS=y
CONFIG_WLAN_VENDOR_RSI=y
CONFIG_RSI_91X=m
# CONFIG_RSI_DEBUGFS is not set
CONFIG_RSI_SDIO=m
CONFIG_RSI_USB=m
CONFIG_RSI_COEX=y
CONFIG_WLAN_VENDOR_SILABS=y
CONFIG_WFX=m
CONFIG_WLAN_VENDOR_ST=y
CONFIG_CW1200=m
CONFIG_CW1200_WLAN_SDIO=m
CONFIG_CW1200_WLAN_SPI=m
CONFIG_WLAN_VENDOR_TI=y
CONFIG_WL1251=m
CONFIG_WL1251_SPI=m
CONFIG_WL1251_SDIO=m
CONFIG_WL12XX=m
CONFIG_WL18XX=m
CONFIG_WLCORE=m
CONFIG_WLCORE_SDIO=m
CONFIG_WLAN_VENDOR_ZYDAS=y
CONFIG_USB_ZD1201=m
CONFIG_ZD1211RW=m
# CONFIG_ZD1211RW_DEBUG is not set
CONFIG_WLAN_VENDOR_QUANTENNA=y
CONFIG_QTNFMAC=m
CONFIG_QTNFMAC_PCIE=m
CONFIG_PCMCIA_RAYCS=m
CONFIG_PCMCIA_WL3501=m
CONFIG_USB_NET_RNDIS_WLAN=m
CONFIG_MAC80211_HWSIM=m
CONFIG_VIRT_WIFI=m
CONFIG_WAN=y
CONFIG_HDLC=m
CONFIG_HDLC_RAW=m
CONFIG_HDLC_RAW_ETH=m
CONFIG_HDLC_CISCO=m
CONFIG_HDLC_FR=m
CONFIG_HDLC_PPP=m
CONFIG_HDLC_X25=m
CONFIG_PCI200SYN=m
CONFIG_WANXL=m
CONFIG_PC300TOO=m
CONFIG_FARSYNC=m
CONFIG_LAPBETHER=m
CONFIG_IEEE802154_DRIVERS=m
CONFIG_IEEE802154_FAKELB=m
CONFIG_IEEE802154_AT86RF230=m
CONFIG_IEEE802154_MRF24J40=m
CONFIG_IEEE802154_CC2520=m
CONFIG_IEEE802154_ATUSB=m
CONFIG_IEEE802154_ADF7242=m
CONFIG_IEEE802154_CA8210=m
CONFIG_IEEE802154_CA8210_DEBUGFS=y
CONFIG_IEEE802154_MCR20A=m
CONFIG_IEEE802154_HWSIM=m

#
# Wireless WAN
#
CONFIG_WWAN=y
CONFIG_WWAN_DEBUGFS=y
CONFIG_WWAN_HWSIM=m
CONFIG_MHI_WWAN_CTRL=m
CONFIG_MHI_WWAN_MBIM=m
CONFIG_RPMSG_WWAN_CTRL=m
CONFIG_IOSM=m
CONFIG_MTK_T7XX=m
# end of Wireless WAN

CONFIG_XEN_NETDEV_FRONTEND=y
CONFIG_XEN_NETDEV_BACKEND=m
CONFIG_VMXNET3=m
CONFIG_FUJITSU_ES=m
CONFIG_USB4_NET=m
CONFIG_HYPERV_NET=m
CONFIG_NETDEVSIM=m
CONFIG_NET_FAILOVER=m
CONFIG_ISDN=y
CONFIG_ISDN_CAPI=y
CONFIG_CAPI_TRACE=y
CONFIG_ISDN_CAPI_MIDDLEWARE=y
CONFIG_MISDN=m
CONFIG_MISDN_DSP=m
CONFIG_MISDN_L1OIP=m

#
# mISDN hardware drivers
#
CONFIG_MISDN_HFCPCI=m
CONFIG_MISDN_HFCMULTI=m
CONFIG_MISDN_HFCUSB=m
CONFIG_MISDN_AVMFRITZ=m
CONFIG_MISDN_SPEEDFAX=m
CONFIG_MISDN_INFINEON=m
CONFIG_MISDN_W6692=m
CONFIG_MISDN_NETJET=m
CONFIG_MISDN_HDLC=m
CONFIG_MISDN_IPAC=m
CONFIG_MISDN_ISAR=m

#
# Input device support
#
CONFIG_INPUT=y
CONFIG_INPUT_LEDS=m
CONFIG_INPUT_FF_MEMLESS=m
CONFIG_INPUT_SPARSEKMAP=m
CONFIG_INPUT_MATRIXKMAP=m
CONFIG_INPUT_VIVALDIFMAP=y

#
# Userland interfaces
#
CONFIG_INPUT_MOUSEDEV=y
CONFIG_INPUT_MOUSEDEV_PSAUX=y
CONFIG_INPUT_MOUSEDEV_SCREEN_X=1024
CONFIG_INPUT_MOUSEDEV_SCREEN_Y=768
CONFIG_INPUT_JOYDEV=m
CONFIG_INPUT_EVDEV=y
CONFIG_INPUT_EVBUG=m

#
# Input Device Drivers
#
CONFIG_INPUT_KEYBOARD=y
CONFIG_KEYBOARD_ADC=m
CONFIG_KEYBOARD_ADP5520=m
CONFIG_KEYBOARD_ADP5588=m
CONFIG_KEYBOARD_ADP5589=m
CONFIG_KEYBOARD_APPLESPI=m
CONFIG_KEYBOARD_ATKBD=y
CONFIG_KEYBOARD_QT1050=m
CONFIG_KEYBOARD_QT1070=m
CONFIG_KEYBOARD_QT2160=m
CONFIG_KEYBOARD_DLINK_DIR685=m
CONFIG_KEYBOARD_LKKBD=m
CONFIG_KEYBOARD_GPIO=m
CONFIG_KEYBOARD_GPIO_POLLED=m
CONFIG_KEYBOARD_TCA6416=m
CONFIG_KEYBOARD_TCA8418=m
CONFIG_KEYBOARD_MATRIX=m
CONFIG_KEYBOARD_LM8323=m
CONFIG_KEYBOARD_LM8333=m
CONFIG_KEYBOARD_MAX7359=m
CONFIG_KEYBOARD_MCS=m
CONFIG_KEYBOARD_MPR121=m
CONFIG_KEYBOARD_NEWTON=m
CONFIG_KEYBOARD_OPENCORES=m
CONFIG_KEYBOARD_PINEPHONE=m
CONFIG_KEYBOARD_SAMSUNG=m
CONFIG_KEYBOARD_STOWAWAY=m
CONFIG_KEYBOARD_SUNKBD=m
CONFIG_KEYBOARD_IQS62X=m
CONFIG_KEYBOARD_TM2_TOUCHKEY=m
CONFIG_KEYBOARD_TWL4030=m
CONFIG_KEYBOARD_XTKBD=m
CONFIG_KEYBOARD_CROS_EC=m
CONFIG_KEYBOARD_MTK_PMIC=m
CONFIG_KEYBOARD_CYPRESS_SF=m
CONFIG_INPUT_MOUSE=y
CONFIG_MOUSE_PS2=m
CONFIG_MOUSE_PS2_ALPS=y
CONFIG_MOUSE_PS2_BYD=y
CONFIG_MOUSE_PS2_LOGIPS2PP=y
CONFIG_MOUSE_PS2_SYNAPTICS=y
CONFIG_MOUSE_PS2_SYNAPTICS_SMBUS=y
CONFIG_MOUSE_PS2_CYPRESS=y
CONFIG_MOUSE_PS2_LIFEBOOK=y
CONFIG_MOUSE_PS2_TRACKPOINT=y
CONFIG_MOUSE_PS2_ELANTECH=y
CONFIG_MOUSE_PS2_ELANTECH_SMBUS=y
CONFIG_MOUSE_PS2_SENTELIC=y
CONFIG_MOUSE_PS2_TOUCHKIT=y
CONFIG_MOUSE_PS2_FOCALTECH=y
CONFIG_MOUSE_PS2_VMMOUSE=y
CONFIG_MOUSE_PS2_SMBUS=y
CONFIG_MOUSE_SERIAL=m
CONFIG_MOUSE_APPLETOUCH=m
CONFIG_MOUSE_BCM5974=m
CONFIG_MOUSE_CYAPA=m
CONFIG_MOUSE_ELAN_I2C=m
CONFIG_MOUSE_ELAN_I2C_I2C=y
CONFIG_MOUSE_ELAN_I2C_SMBUS=y
CONFIG_MOUSE_VSXXXAA=m
CONFIG_MOUSE_GPIO=m
CONFIG_MOUSE_SYNAPTICS_I2C=m
CONFIG_MOUSE_SYNAPTICS_USB=m
CONFIG_INPUT_JOYSTICK=y
CONFIG_JOYSTICK_ANALOG=m
CONFIG_JOYSTICK_A3D=m
CONFIG_JOYSTICK_ADC=m
CONFIG_JOYSTICK_ADI=m
CONFIG_JOYSTICK_COBRA=m
CONFIG_JOYSTICK_GF2K=m
CONFIG_JOYSTICK_GRIP=m
CONFIG_JOYSTICK_GRIP_MP=m
CONFIG_JOYSTICK_GUILLEMOT=m
CONFIG_JOYSTICK_INTERACT=m
CONFIG_JOYSTICK_SIDEWINDER=m
CONFIG_JOYSTICK_TMDC=m
CONFIG_JOYSTICK_IFORCE=m
CONFIG_JOYSTICK_IFORCE_USB=m
CONFIG_JOYSTICK_IFORCE_232=m
CONFIG_JOYSTICK_WARRIOR=m
CONFIG_JOYSTICK_MAGELLAN=m
CONFIG_JOYSTICK_SPACEORB=m
CONFIG_JOYSTICK_SPACEBALL=m
CONFIG_JOYSTICK_STINGER=m
CONFIG_JOYSTICK_TWIDJOY=m
CONFIG_JOYSTICK_ZHENHUA=m
CONFIG_JOYSTICK_DB9=m
CONFIG_JOYSTICK_GAMECON=m
CONFIG_JOYSTICK_TURBOGRAFX=m
CONFIG_JOYSTICK_AS5011=m
CONFIG_JOYSTICK_JOYDUMP=m
CONFIG_JOYSTICK_XPAD=m
CONFIG_JOYSTICK_XPAD_FF=y
CONFIG_JOYSTICK_XPAD_LEDS=y
CONFIG_JOYSTICK_WALKERA0701=m
CONFIG_JOYSTICK_PSXPAD_SPI=m
CONFIG_JOYSTICK_PSXPAD_SPI_FF=y
CONFIG_JOYSTICK_PXRC=m
CONFIG_JOYSTICK_QWIIC=m
CONFIG_JOYSTICK_FSIA6B=m
CONFIG_JOYSTICK_SENSEHAT=m
CONFIG_INPUT_TABLET=y
CONFIG_TABLET_USB_ACECAD=m
CONFIG_TABLET_USB_AIPTEK=m
CONFIG_TABLET_USB_HANWANG=m
CONFIG_TABLET_USB_KBTAB=m
CONFIG_TABLET_USB_PEGASUS=m
CONFIG_TABLET_SERIAL_WACOM4=m
CONFIG_INPUT_TOUCHSCREEN=y
CONFIG_TOUCHSCREEN_88PM860X=m
CONFIG_TOUCHSCREEN_ADS7846=m
CONFIG_TOUCHSCREEN_AD7877=m
CONFIG_TOUCHSCREEN_AD7879=m
CONFIG_TOUCHSCREEN_AD7879_I2C=m
CONFIG_TOUCHSCREEN_AD7879_SPI=m
CONFIG_TOUCHSCREEN_ADC=m
CONFIG_TOUCHSCREEN_ATMEL_MXT=m
CONFIG_TOUCHSCREEN_ATMEL_MXT_T37=y
CONFIG_TOUCHSCREEN_AUO_PIXCIR=m
CONFIG_TOUCHSCREEN_BU21013=m
CONFIG_TOUCHSCREEN_BU21029=m
CONFIG_TOUCHSCREEN_CHIPONE_ICN8505=m
CONFIG_TOUCHSCREEN_CY8CTMA140=m
CONFIG_TOUCHSCREEN_CY8CTMG110=m
CONFIG_TOUCHSCREEN_CYTTSP_CORE=m
CONFIG_TOUCHSCREEN_CYTTSP_I2C=m
CONFIG_TOUCHSCREEN_CYTTSP_SPI=m
CONFIG_TOUCHSCREEN_CYTTSP4_CORE=m
CONFIG_TOUCHSCREEN_CYTTSP4_I2C=m
CONFIG_TOUCHSCREEN_CYTTSP4_SPI=m
CONFIG_TOUCHSCREEN_CYTTSP5=m
CONFIG_TOUCHSCREEN_DA9034=m
CONFIG_TOUCHSCREEN_DA9052=m
CONFIG_TOUCHSCREEN_DYNAPRO=m
CONFIG_TOUCHSCREEN_HAMPSHIRE=m
CONFIG_TOUCHSCREEN_EETI=m
CONFIG_TOUCHSCREEN_EGALAX_SERIAL=m
CONFIG_TOUCHSCREEN_EXC3000=m
CONFIG_TOUCHSCREEN_FUJITSU=m
CONFIG_TOUCHSCREEN_GOODIX=m
CONFIG_TOUCHSCREEN_HIDEEP=m
CONFIG_TOUCHSCREEN_HYCON_HY46XX=m
CONFIG_TOUCHSCREEN_HYNITRON_CSTXXX=m
CONFIG_TOUCHSCREEN_ILI210X=m
CONFIG_TOUCHSCREEN_ILITEK=m
CONFIG_TOUCHSCREEN_S6SY761=m
CONFIG_TOUCHSCREEN_GUNZE=m
CONFIG_TOUCHSCREEN_EKTF2127=m
CONFIG_TOUCHSCREEN_ELAN=y
CONFIG_TOUCHSCREEN_ELO=m
CONFIG_TOUCHSCREEN_WACOM_W8001=m
CONFIG_TOUCHSCREEN_WACOM_I2C=m
CONFIG_TOUCHSCREEN_MAX11801=m
CONFIG_TOUCHSCREEN_MCS5000=m
CONFIG_TOUCHSCREEN_MMS114=m
CONFIG_TOUCHSCREEN_MELFAS_MIP4=m
CONFIG_TOUCHSCREEN_MSG2638=m
CONFIG_TOUCHSCREEN_MTOUCH=m
# CONFIG_TOUCHSCREEN_NOVATEK_NVT_TS is not set
CONFIG_TOUCHSCREEN_IMAGIS=m
CONFIG_TOUCHSCREEN_INEXIO=m
CONFIG_TOUCHSCREEN_PENMOUNT=m
CONFIG_TOUCHSCREEN_EDT_FT5X06=m
CONFIG_TOUCHSCREEN_TOUCHRIGHT=m
CONFIG_TOUCHSCREEN_TOUCHWIN=m
CONFIG_TOUCHSCREEN_TI_AM335X_TSC=m
CONFIG_TOUCHSCREEN_PIXCIR=m
CONFIG_TOUCHSCREEN_WDT87XX_I2C=m
CONFIG_TOUCHSCREEN_WM831X=m
CONFIG_TOUCHSCREEN_WM97XX=m
CONFIG_TOUCHSCREEN_WM9705=y
CONFIG_TOUCHSCREEN_WM9712=y
CONFIG_TOUCHSCREEN_WM9713=y
CONFIG_TOUCHSCREEN_USB_COMPOSITE=m
CONFIG_TOUCHSCREEN_MC13783=m
CONFIG_TOUCHSCREEN_USB_EGALAX=y
CONFIG_TOUCHSCREEN_USB_PANJIT=y
CONFIG_TOUCHSCREEN_USB_3M=y
CONFIG_TOUCHSCREEN_USB_ITM=y
CONFIG_TOUCHSCREEN_USB_ETURBO=y
CONFIG_TOUCHSCREEN_USB_GUNZE=y
CONFIG_TOUCHSCREEN_USB_DMC_TSC10=y
CONFIG_TOUCHSCREEN_USB_IRTOUCH=y
CONFIG_TOUCHSCREEN_USB_IDEALTEK=y
CONFIG_TOUCHSCREEN_USB_GENERAL_TOUCH=y
CONFIG_TOUCHSCREEN_USB_GOTOP=y
CONFIG_TOUCHSCREEN_USB_JASTEC=y
CONFIG_TOUCHSCREEN_USB_ELO=y
CONFIG_TOUCHSCREEN_USB_E2I=y
CONFIG_TOUCHSCREEN_USB_ZYTRONIC=y
CONFIG_TOUCHSCREEN_USB_ETT_TC45USB=y
CONFIG_TOUCHSCREEN_USB_NEXIO=y
CONFIG_TOUCHSCREEN_USB_EASYTOUCH=y
CONFIG_TOUCHSCREEN_TOUCHIT213=m
CONFIG_TOUCHSCREEN_TSC_SERIO=m
CONFIG_TOUCHSCREEN_TSC200X_CORE=m
CONFIG_TOUCHSCREEN_TSC2004=m
CONFIG_TOUCHSCREEN_TSC2005=m
CONFIG_TOUCHSCREEN_TSC2007=m
CONFIG_TOUCHSCREEN_TSC2007_IIO=y
CONFIG_TOUCHSCREEN_PCAP=m
CONFIG_TOUCHSCREEN_RM_TS=m
CONFIG_TOUCHSCREEN_SILEAD=m
CONFIG_TOUCHSCREEN_SIS_I2C=m
CONFIG_TOUCHSCREEN_ST1232=m
CONFIG_TOUCHSCREEN_STMFTS=m
CONFIG_TOUCHSCREEN_SUR40=m
CONFIG_TOUCHSCREEN_SURFACE3_SPI=m
CONFIG_TOUCHSCREEN_SX8654=m
CONFIG_TOUCHSCREEN_TPS6507X=m
CONFIG_TOUCHSCREEN_ZET6223=m
CONFIG_TOUCHSCREEN_ZFORCE=m
CONFIG_TOUCHSCREEN_COLIBRI_VF50=m
CONFIG_TOUCHSCREEN_ROHM_BU21023=m
CONFIG_TOUCHSCREEN_IQS5XX=m
# CONFIG_TOUCHSCREEN_IQS7211 is not set
CONFIG_TOUCHSCREEN_ZINITIX=m
CONFIG_TOUCHSCREEN_HIMAX_HX83112B=m
CONFIG_INPUT_MISC=y
CONFIG_INPUT_88PM860X_ONKEY=m
CONFIG_INPUT_88PM80X_ONKEY=m
CONFIG_INPUT_AD714X=m
CONFIG_INPUT_AD714X_I2C=m
CONFIG_INPUT_AD714X_SPI=m
CONFIG_INPUT_ARIZONA_HAPTICS=m
CONFIG_INPUT_ATC260X_ONKEY=m
CONFIG_INPUT_BMA150=m
CONFIG_INPUT_E3X0_BUTTON=m
CONFIG_INPUT_PCSPKR=m
CONFIG_INPUT_MAX77693_HAPTIC=m
CONFIG_INPUT_MAX8925_ONKEY=m
CONFIG_INPUT_MAX8997_HAPTIC=m
CONFIG_INPUT_MC13783_PWRBUTTON=m
CONFIG_INPUT_MMA8450=m
CONFIG_INPUT_APANEL=m
CONFIG_INPUT_GPIO_BEEPER=m
CONFIG_INPUT_GPIO_DECODER=m
CONFIG_INPUT_GPIO_VIBRA=m
CONFIG_INPUT_ATLAS_BTNS=m
CONFIG_INPUT_ATI_REMOTE2=m
CONFIG_INPUT_KEYSPAN_REMOTE=m
CONFIG_INPUT_KXTJ9=m
CONFIG_INPUT_POWERMATE=m
CONFIG_INPUT_YEALINK=m
CONFIG_INPUT_CM109=m
CONFIG_INPUT_REGULATOR_HAPTIC=m
CONFIG_INPUT_RETU_PWRBUTTON=m
CONFIG_INPUT_AXP20X_PEK=m
CONFIG_INPUT_TWL4030_PWRBUTTON=m
CONFIG_INPUT_TWL4030_VIBRA=m
CONFIG_INPUT_TWL6040_VIBRA=m
CONFIG_INPUT_UINPUT=y
CONFIG_INPUT_PALMAS_PWRBUTTON=m
CONFIG_INPUT_PCF50633_PMU=m
CONFIG_INPUT_PCF8574=m
CONFIG_INPUT_PWM_BEEPER=m
CONFIG_INPUT_PWM_VIBRA=m
CONFIG_INPUT_GPIO_ROTARY_ENCODER=m
CONFIG_INPUT_DA7280_HAPTICS=m
CONFIG_INPUT_DA9052_ONKEY=m
CONFIG_INPUT_DA9055_ONKEY=m
CONFIG_INPUT_DA9063_ONKEY=m
CONFIG_INPUT_WM831X_ON=m
CONFIG_INPUT_PCAP=m
CONFIG_INPUT_ADXL34X=m
CONFIG_INPUT_ADXL34X_I2C=m
CONFIG_INPUT_ADXL34X_SPI=m
CONFIG_INPUT_IMS_PCU=m
CONFIG_INPUT_IQS269A=m
CONFIG_INPUT_IQS626A=m
CONFIG_INPUT_IQS7222=m
CONFIG_INPUT_CMA3000=m
CONFIG_INPUT_CMA3000_I2C=m
CONFIG_INPUT_XEN_KBDDEV_FRONTEND=m
CONFIG_INPUT_IDEAPAD_SLIDEBAR=m
CONFIG_INPUT_SOC_BUTTON_ARRAY=m
CONFIG_INPUT_DRV260X_HAPTICS=m
CONFIG_INPUT_DRV2665_HAPTICS=m
CONFIG_INPUT_DRV2667_HAPTICS=m
CONFIG_INPUT_RAVE_SP_PWRBUTTON=m
CONFIG_INPUT_RT5120_PWRKEY=m
CONFIG_RMI4_CORE=m
CONFIG_RMI4_I2C=m
CONFIG_RMI4_SPI=m
CONFIG_RMI4_SMB=m
CONFIG_RMI4_F03=y
CONFIG_RMI4_F03_SERIO=m
CONFIG_RMI4_2D_SENSOR=y
CONFIG_RMI4_F11=y
CONFIG_RMI4_F12=y
CONFIG_RMI4_F30=y
CONFIG_RMI4_F34=y
CONFIG_RMI4_F3A=y
CONFIG_RMI4_F54=y
CONFIG_RMI4_F55=y

#
# Hardware I/O ports
#
CONFIG_SERIO=y
CONFIG_ARCH_MIGHT_HAVE_PC_SERIO=y
CONFIG_SERIO_I8042=y
CONFIG_SERIO_SERPORT=m
CONFIG_SERIO_CT82C710=m
CONFIG_SERIO_PARKBD=m
CONFIG_SERIO_PCIPS2=m
CONFIG_SERIO_LIBPS2=y
CONFIG_SERIO_RAW=m
CONFIG_SERIO_ALTERA_PS2=m
CONFIG_SERIO_PS2MULT=m
CONFIG_SERIO_ARC_PS2=m
CONFIG_HYPERV_KEYBOARD=m
CONFIG_SERIO_GPIO_PS2=m
CONFIG_USERIO=m
CONFIG_GAMEPORT=m
CONFIG_GAMEPORT_EMU10K1=m
CONFIG_GAMEPORT_FM801=m
# end of Hardware I/O ports
# end of Input device support

#
# Character devices
#
CONFIG_TTY=y
CONFIG_VT=y
CONFIG_CONSOLE_TRANSLATIONS=y
CONFIG_VT_CONSOLE=y
CONFIG_VT_CONSOLE_SLEEP=y
CONFIG_HW_CONSOLE=y
CONFIG_VT_HW_CONSOLE_BINDING=y
CONFIG_UNIX98_PTYS=y
CONFIG_LEGACY_PTYS=y
CONFIG_LEGACY_PTY_COUNT=0
CONFIG_LEGACY_TIOCSTI=y
CONFIG_LDISC_AUTOLOAD=y

#
# Serial drivers
#
CONFIG_SERIAL_EARLYCON=y
CONFIG_SERIAL_8250=y
# CONFIG_SERIAL_8250_DEPRECATED_OPTIONS is not set
CONFIG_SERIAL_8250_PNP=y
CONFIG_SERIAL_8250_16550A_VARIANTS=y
CONFIG_SERIAL_8250_FINTEK=y
CONFIG_SERIAL_8250_CONSOLE=y
CONFIG_SERIAL_8250_DMA=y
CONFIG_SERIAL_8250_PCILIB=y
CONFIG_SERIAL_8250_PCI=y
CONFIG_SERIAL_8250_EXAR=m
CONFIG_SERIAL_8250_CS=m
CONFIG_SERIAL_8250_MEN_MCB=m
CONFIG_SERIAL_8250_NR_UARTS=48
CONFIG_SERIAL_8250_RUNTIME_UARTS=32
CONFIG_SERIAL_8250_EXTENDED=y
CONFIG_SERIAL_8250_MANY_PORTS=y
# CONFIG_SERIAL_8250_PCI1XXXX is not set
CONFIG_SERIAL_8250_SHARE_IRQ=y
# CONFIG_SERIAL_8250_DETECT_IRQ is not set
CONFIG_SERIAL_8250_RSA=y
CONFIG_SERIAL_8250_DWLIB=y
# CONFIG_SERIAL_8250_DFL is not set
CONFIG_SERIAL_8250_DW=m
CONFIG_SERIAL_8250_RT288X=y
CONFIG_SERIAL_8250_LPSS=m
CONFIG_SERIAL_8250_MID=y
CONFIG_SERIAL_8250_PERICOM=m

#
# Non-8250 serial port support
#
CONFIG_SERIAL_KGDB_NMI=y
CONFIG_SERIAL_MAX3100=m
CONFIG_SERIAL_MAX310X=y
CONFIG_SERIAL_UARTLITE=m
CONFIG_SERIAL_UARTLITE_NR_UARTS=1
CONFIG_SERIAL_CORE=y
CONFIG_SERIAL_CORE_CONSOLE=y
CONFIG_CONSOLE_POLL=y
CONFIG_SERIAL_JSM=m
CONFIG_SERIAL_LANTIQ=m
CONFIG_SERIAL_SCCNXP=y
CONFIG_SERIAL_SCCNXP_CONSOLE=y
CONFIG_SERIAL_SC16IS7XX_CORE=m
CONFIG_SERIAL_SC16IS7XX=m
CONFIG_SERIAL_SC16IS7XX_I2C=y
CONFIG_SERIAL_SC16IS7XX_SPI=y
CONFIG_SERIAL_ALTERA_JTAGUART=m
CONFIG_SERIAL_ALTERA_UART=m
CONFIG_SERIAL_ALTERA_UART_MAXPORTS=4
CONFIG_SERIAL_ALTERA_UART_BAUDRATE=115200
CONFIG_SERIAL_ARC=m
CONFIG_SERIAL_ARC_NR_PORTS=1
CONFIG_SERIAL_RP2=m
CONFIG_SERIAL_RP2_NR_UARTS=32
CONFIG_SERIAL_FSL_LPUART=m
CONFIG_SERIAL_FSL_LINFLEXUART=m
CONFIG_SERIAL_MEN_Z135=m
CONFIG_SERIAL_SPRD=m
# end of Serial drivers

CONFIG_SERIAL_MCTRL_GPIO=y
CONFIG_SERIAL_NONSTANDARD=y
CONFIG_MOXA_INTELLIO=m
CONFIG_MOXA_SMARTIO=m
CONFIG_N_HDLC=m
CONFIG_IPWIRELESS=m
CONFIG_N_GSM=m
CONFIG_NOZOMI=m
CONFIG_NULL_TTY=m
CONFIG_HVC_DRIVER=y
CONFIG_HVC_IRQ=y
CONFIG_HVC_XEN=y
CONFIG_HVC_XEN_FRONTEND=y
CONFIG_RPMSG_TTY=m
CONFIG_SERIAL_DEV_BUS=y
CONFIG_SERIAL_DEV_CTRL_TTYPORT=y
CONFIG_TTY_PRINTK=y
CONFIG_TTY_PRINTK_LEVEL=6
CONFIG_PRINTER=m
# CONFIG_LP_CONSOLE is not set
CONFIG_PPDEV=m
CONFIG_VIRTIO_CONSOLE=y
CONFIG_IPMI_HANDLER=m
CONFIG_IPMI_DMI_DECODE=y
CONFIG_IPMI_PLAT_DATA=y
# CONFIG_IPMI_PANIC_EVENT is not set
CONFIG_IPMI_DEVICE_INTERFACE=m
CONFIG_IPMI_SI=m
CONFIG_IPMI_SSIF=m
CONFIG_IPMI_WATCHDOG=m
CONFIG_IPMI_POWEROFF=m
CONFIG_HW_RANDOM=y
CONFIG_HW_RANDOM_TIMERIOMEM=m
CONFIG_HW_RANDOM_INTEL=m
CONFIG_HW_RANDOM_AMD=m
CONFIG_HW_RANDOM_BA431=m
CONFIG_HW_RANDOM_VIA=m
CONFIG_HW_RANDOM_VIRTIO=m
CONFIG_HW_RANDOM_XIPHERA=m
CONFIG_APPLICOM=m
CONFIG_MWAVE=m
CONFIG_DEVMEM=y
CONFIG_NVRAM=m
CONFIG_DEVPORT=y
CONFIG_HPET=y
CONFIG_HPET_MMAP=y
CONFIG_HPET_MMAP_DEFAULT=y
CONFIG_HANGCHECK_TIMER=m
CONFIG_UV_MMTIMER=m
CONFIG_TCG_TPM=y
CONFIG_HW_RANDOM_TPM=y
CONFIG_TCG_TIS_CORE=y
CONFIG_TCG_TIS=y
CONFIG_TCG_TIS_SPI=m
CONFIG_TCG_TIS_SPI_CR50=y
CONFIG_TCG_TIS_I2C=m
CONFIG_TCG_TIS_I2C_CR50=m
CONFIG_TCG_TIS_I2C_ATMEL=m
CONFIG_TCG_TIS_I2C_INFINEON=m
CONFIG_TCG_TIS_I2C_NUVOTON=m
CONFIG_TCG_NSC=m
CONFIG_TCG_ATMEL=m
CONFIG_TCG_INFINEON=m
CONFIG_TCG_XEN=m
CONFIG_TCG_CRB=y
CONFIG_TCG_VTPM_PROXY=m
CONFIG_TCG_TIS_ST33ZP24=m
CONFIG_TCG_TIS_ST33ZP24_I2C=m
CONFIG_TCG_TIS_ST33ZP24_SPI=m
CONFIG_TELCLOCK=m
CONFIG_XILLYBUS_CLASS=m
CONFIG_XILLYBUS=m
CONFIG_XILLYBUS_PCIE=m
CONFIG_XILLYUSB=m
# end of Character devices

#
# I2C support
#
CONFIG_I2C=y
CONFIG_ACPI_I2C_OPREGION=y
CONFIG_I2C_BOARDINFO=y
CONFIG_I2C_COMPAT=y
CONFIG_I2C_CHARDEV=y
CONFIG_I2C_MUX=m

#
# Multiplexer I2C Chip support
#
CONFIG_I2C_MUX_GPIO=m
CONFIG_I2C_MUX_LTC4306=m
CONFIG_I2C_MUX_PCA9541=m
CONFIG_I2C_MUX_PCA954x=m
CONFIG_I2C_MUX_REG=m
CONFIG_I2C_MUX_MLXCPLD=m
# end of Multiplexer I2C Chip support

CONFIG_I2C_HELPER_AUTO=y
CONFIG_I2C_SMBUS=m
CONFIG_I2C_ALGOBIT=m
CONFIG_I2C_ALGOPCA=m

#
# I2C Hardware Bus support
#

#
# PC SMBus host controller drivers
#
CONFIG_I2C_CCGX_UCSI=m
CONFIG_I2C_ALI1535=m
CONFIG_I2C_ALI1563=m
CONFIG_I2C_ALI15X3=m
CONFIG_I2C_AMD756=m
CONFIG_I2C_AMD756_S4882=m
CONFIG_I2C_AMD8111=m
CONFIG_I2C_AMD_MP2=m
CONFIG_I2C_I801=m
CONFIG_I2C_ISCH=m
CONFIG_I2C_ISMT=m
CONFIG_I2C_PIIX4=m
CONFIG_I2C_CHT_WC=m
CONFIG_I2C_NFORCE2=m
CONFIG_I2C_NFORCE2_S4985=m
CONFIG_I2C_NVIDIA_GPU=m
CONFIG_I2C_SIS5595=m
CONFIG_I2C_SIS630=m
CONFIG_I2C_SIS96X=m
CONFIG_I2C_VIA=m
CONFIG_I2C_VIAPRO=m

#
# ACPI drivers
#
CONFIG_I2C_SCMI=m

#
# I2C system bus drivers (mostly embedded / system-on-chip)
#
CONFIG_I2C_CBUS_GPIO=m
CONFIG_I2C_DESIGNWARE_CORE=y
# CONFIG_I2C_DESIGNWARE_SLAVE is not set
CONFIG_I2C_DESIGNWARE_PLATFORM=y
CONFIG_I2C_DESIGNWARE_BAYTRAIL=y
CONFIG_I2C_DESIGNWARE_PCI=m
# CONFIG_I2C_EMEV2 is not set
CONFIG_I2C_GPIO=m
# CONFIG_I2C_GPIO_FAULT_INJECTOR is not set
CONFIG_I2C_KEMPLD=m
CONFIG_I2C_OCORES=m
CONFIG_I2C_PCA_PLATFORM=m
CONFIG_I2C_SIMTEC=m
CONFIG_I2C_XILINX=m

#
# External I2C/SMBus adapter drivers
#
CONFIG_I2C_DIOLAN_U2C=m
CONFIG_I2C_DLN2=m
CONFIG_I2C_CP2615=m
CONFIG_I2C_PARPORT=m
CONFIG_I2C_PCI1XXXX=m
CONFIG_I2C_ROBOTFUZZ_OSIF=m
CONFIG_I2C_TAOS_EVM=m
CONFIG_I2C_TINY_USB=m
CONFIG_I2C_VIPERBOARD=m

#
# Other I2C/SMBus bus drivers
#
CONFIG_I2C_MLXCPLD=m
CONFIG_I2C_CROS_EC_TUNNEL=m
CONFIG_I2C_VIRTIO=m
# end of I2C Hardware Bus support

CONFIG_I2C_STUB=m
# CONFIG_I2C_SLAVE is not set
# CONFIG_I2C_DEBUG_CORE is not set
# CONFIG_I2C_DEBUG_ALGO is not set
# CONFIG_I2C_DEBUG_BUS is not set
# end of I2C support

CONFIG_I3C=m
CONFIG_CDNS_I3C_MASTER=m
CONFIG_DW_I3C_MASTER=m
CONFIG_SVC_I3C_MASTER=m
CONFIG_MIPI_I3C_HCI=m
CONFIG_SPI=y
# CONFIG_SPI_DEBUG is not set
CONFIG_SPI_MASTER=y
CONFIG_SPI_MEM=y

#
# SPI Master Controller Drivers
#
CONFIG_SPI_ALTERA=m
CONFIG_SPI_ALTERA_CORE=m
CONFIG_SPI_ALTERA_DFL=m
CONFIG_SPI_AXI_SPI_ENGINE=m
CONFIG_SPI_BITBANG=m
CONFIG_SPI_BUTTERFLY=m
CONFIG_SPI_CADENCE=m
CONFIG_SPI_DESIGNWARE=m
CONFIG_SPI_DW_DMA=y
CONFIG_SPI_DW_PCI=m
CONFIG_SPI_DW_MMIO=m
CONFIG_SPI_DLN2=m
CONFIG_SPI_GPIO=m
CONFIG_SPI_INTEL=m
CONFIG_SPI_INTEL_PCI=m
CONFIG_SPI_INTEL_PLATFORM=m
CONFIG_SPI_LM70_LLP=m
CONFIG_SPI_MICROCHIP_CORE=m
CONFIG_SPI_MICROCHIP_CORE_QSPI=m
CONFIG_SPI_LANTIQ_SSC=m
CONFIG_SPI_OC_TINY=m
CONFIG_SPI_PCI1XXXX=m
CONFIG_SPI_PXA2XX=m
CONFIG_SPI_PXA2XX_PCI=m
CONFIG_SPI_SC18IS602=m
CONFIG_SPI_SIFIVE=m
CONFIG_SPI_MXIC=m
CONFIG_SPI_XCOMM=m
# CONFIG_SPI_XILINX is not set
CONFIG_SPI_ZYNQMP_GQSPI=m
CONFIG_SPI_AMD=m

#
# SPI Multiplexer support
#
CONFIG_SPI_MUX=m

#
# SPI Protocol Masters
#
CONFIG_SPI_SPIDEV=m
CONFIG_SPI_LOOPBACK_TEST=m
CONFIG_SPI_TLE62X0=m
CONFIG_SPI_SLAVE=y
CONFIG_SPI_SLAVE_TIME=m
CONFIG_SPI_SLAVE_SYSTEM_CONTROL=m
CONFIG_SPI_DYNAMIC=y
CONFIG_SPMI=m
CONFIG_SPMI_HISI3670=m
CONFIG_HSI=m
CONFIG_HSI_BOARDINFO=y

#
# HSI controllers
#

#
# HSI clients
#
CONFIG_HSI_CHAR=m
CONFIG_PPS=y
# CONFIG_PPS_DEBUG is not set

#
# PPS clients support
#
# CONFIG_PPS_CLIENT_KTIMER is not set
CONFIG_PPS_CLIENT_LDISC=m
CONFIG_PPS_CLIENT_PARPORT=m
CONFIG_PPS_CLIENT_GPIO=m

#
# PPS generators support
#

#
# PTP clock support
#
CONFIG_PTP_1588_CLOCK=y
CONFIG_PTP_1588_CLOCK_OPTIONAL=y
CONFIG_DP83640_PHY=m
CONFIG_PTP_1588_CLOCK_INES=m
CONFIG_PTP_1588_CLOCK_KVM=m
CONFIG_PTP_1588_CLOCK_IDT82P33=m
CONFIG_PTP_1588_CLOCK_IDTCM=m
# CONFIG_PTP_1588_CLOCK_MOCK is not set
CONFIG_PTP_1588_CLOCK_VMW=m
CONFIG_PTP_1588_CLOCK_OCP=m
# CONFIG_PTP_DFL_TOD is not set
# end of PTP clock support

CONFIG_PINCTRL=y
CONFIG_PINMUX=y
CONFIG_PINCONF=y
CONFIG_GENERIC_PINCONF=y
# CONFIG_DEBUG_PINCTRL is not set
CONFIG_PINCTRL_AMD=y
CONFIG_PINCTRL_CY8C95X0=m
CONFIG_PINCTRL_DA9062=m
CONFIG_PINCTRL_MCP23S08_I2C=m
CONFIG_PINCTRL_MCP23S08_SPI=m
CONFIG_PINCTRL_MCP23S08=m
CONFIG_PINCTRL_SX150X=y
CONFIG_PINCTRL_MADERA=m
CONFIG_PINCTRL_CS47L15=y
CONFIG_PINCTRL_CS47L35=y
CONFIG_PINCTRL_CS47L85=y
CONFIG_PINCTRL_CS47L90=y
CONFIG_PINCTRL_CS47L92=y

#
# Intel pinctrl drivers
#
CONFIG_PINCTRL_BAYTRAIL=y
CONFIG_PINCTRL_CHERRYVIEW=y
CONFIG_PINCTRL_LYNXPOINT=m
CONFIG_PINCTRL_INTEL=y
CONFIG_PINCTRL_ALDERLAKE=m
CONFIG_PINCTRL_BROXTON=m
CONFIG_PINCTRL_CANNONLAKE=m
CONFIG_PINCTRL_CEDARFORK=m
CONFIG_PINCTRL_DENVERTON=m
CONFIG_PINCTRL_ELKHARTLAKE=m
CONFIG_PINCTRL_EMMITSBURG=m
CONFIG_PINCTRL_GEMINILAKE=m
CONFIG_PINCTRL_ICELAKE=m
CONFIG_PINCTRL_JASPERLAKE=m
CONFIG_PINCTRL_LAKEFIELD=m
CONFIG_PINCTRL_LEWISBURG=m
CONFIG_PINCTRL_METEORLAKE=m
CONFIG_PINCTRL_SUNRISEPOINT=m
CONFIG_PINCTRL_TIGERLAKE=m
# end of Intel pinctrl drivers

#
# Renesas pinctrl drivers
#
# end of Renesas pinctrl drivers

CONFIG_GPIOLIB=y
CONFIG_GPIOLIB_FASTPATH_LIMIT=512
CONFIG_GPIO_ACPI=y
CONFIG_GPIOLIB_IRQCHIP=y
# CONFIG_DEBUG_GPIO is not set
CONFIG_GPIO_SYSFS=y
CONFIG_GPIO_CDEV=y
CONFIG_GPIO_CDEV_V1=y
CONFIG_GPIO_GENERIC=y
CONFIG_GPIO_REGMAP=m
CONFIG_GPIO_MAX730X=m
CONFIG_GPIO_IDIO_16=m

#
# Memory mapped GPIO drivers
#
CONFIG_GPIO_AMDPT=m
CONFIG_GPIO_DWAPB=m
CONFIG_GPIO_EXAR=m
CONFIG_GPIO_GENERIC_PLATFORM=y
CONFIG_GPIO_ICH=m
CONFIG_GPIO_MB86S7X=m
CONFIG_GPIO_MENZ127=m
CONFIG_GPIO_SIOX=m
CONFIG_GPIO_AMD_FCH=m
# end of Memory mapped GPIO drivers

#
# Port-mapped I/O GPIO drivers
#
CONFIG_GPIO_VX855=m
CONFIG_GPIO_I8255=m
CONFIG_GPIO_104_DIO_48E=m
CONFIG_GPIO_104_IDIO_16=m
CONFIG_GPIO_104_IDI_48=m
CONFIG_GPIO_F7188X=m
CONFIG_GPIO_GPIO_MM=m
CONFIG_GPIO_IT87=m
CONFIG_GPIO_SCH=m
CONFIG_GPIO_SCH311X=m
CONFIG_GPIO_WINBOND=m
CONFIG_GPIO_WS16C48=m
# end of Port-mapped I/O GPIO drivers

#
# I2C GPIO expanders
#
# CONFIG_GPIO_FXL6408 is not set
# CONFIG_GPIO_DS4520 is not set
CONFIG_GPIO_MAX7300=m
CONFIG_GPIO_MAX732X=m
CONFIG_GPIO_PCA953X=m
CONFIG_GPIO_PCA953X_IRQ=y
CONFIG_GPIO_PCA9570=m
CONFIG_GPIO_PCF857X=m
CONFIG_GPIO_TPIC2810=m
# end of I2C GPIO expanders

#
# MFD GPIO expanders
#
CONFIG_GPIO_ADP5520=m
CONFIG_GPIO_ARIZONA=m
CONFIG_GPIO_BD9571MWV=m
CONFIG_GPIO_CRYSTAL_COVE=y
CONFIG_GPIO_DA9052=m
CONFIG_GPIO_DA9055=m
CONFIG_GPIO_DLN2=m
# CONFIG_GPIO_ELKHARTLAKE is not set
CONFIG_GPIO_JANZ_TTL=m
CONFIG_GPIO_KEMPLD=m
CONFIG_GPIO_LP3943=m
CONFIG_GPIO_LP873X=m
CONFIG_GPIO_MADERA=m
CONFIG_GPIO_PALMAS=y
CONFIG_GPIO_RC5T583=y
CONFIG_GPIO_TPS65086=m
CONFIG_GPIO_TPS6586X=y
CONFIG_GPIO_TPS65910=y
CONFIG_GPIO_TPS65912=m
CONFIG_GPIO_TPS68470=m
CONFIG_GPIO_TQMX86=m
CONFIG_GPIO_TWL4030=m
CONFIG_GPIO_TWL6040=m
CONFIG_GPIO_WHISKEY_COVE=m
CONFIG_GPIO_WM831X=m
CONFIG_GPIO_WM8350=m
CONFIG_GPIO_WM8994=m
# end of MFD GPIO expanders

#
# PCI GPIO expanders
#
CONFIG_GPIO_AMD8111=m
CONFIG_GPIO_ML_IOH=m
CONFIG_GPIO_PCI_IDIO_16=m
CONFIG_GPIO_PCIE_IDIO_24=m
CONFIG_GPIO_RDC321X=m
# end of PCI GPIO expanders

#
# SPI GPIO expanders
#
CONFIG_GPIO_MAX3191X=m
CONFIG_GPIO_MAX7301=m
CONFIG_GPIO_MC33880=m
CONFIG_GPIO_PISOSR=m
CONFIG_GPIO_XRA1403=m
# end of SPI GPIO expanders

#
# USB GPIO expanders
#
CONFIG_GPIO_VIPERBOARD=m
# end of USB GPIO expanders

#
# Virtual GPIO drivers
#
CONFIG_GPIO_AGGREGATOR=m
CONFIG_GPIO_LATCH=m
# CONFIG_GPIO_MOCKUP is not set
CONFIG_GPIO_VIRTIO=m
CONFIG_GPIO_SIM=m
# end of Virtual GPIO drivers

CONFIG_W1=m
CONFIG_W1_CON=y

#
# 1-wire Bus Masters
#
CONFIG_W1_MASTER_MATROX=m
CONFIG_W1_MASTER_DS2490=m
CONFIG_W1_MASTER_DS2482=m
CONFIG_W1_MASTER_GPIO=m
CONFIG_W1_MASTER_SGI=m
# end of 1-wire Bus Masters

#
# 1-wire Slaves
#
CONFIG_W1_SLAVE_THERM=m
CONFIG_W1_SLAVE_SMEM=m
CONFIG_W1_SLAVE_DS2405=m
CONFIG_W1_SLAVE_DS2408=m
CONFIG_W1_SLAVE_DS2408_READBACK=y
CONFIG_W1_SLAVE_DS2413=m
CONFIG_W1_SLAVE_DS2406=m
CONFIG_W1_SLAVE_DS2423=m
CONFIG_W1_SLAVE_DS2805=m
CONFIG_W1_SLAVE_DS2430=m
CONFIG_W1_SLAVE_DS2431=m
CONFIG_W1_SLAVE_DS2433=m
# CONFIG_W1_SLAVE_DS2433_CRC is not set
CONFIG_W1_SLAVE_DS2438=m
CONFIG_W1_SLAVE_DS250X=m
CONFIG_W1_SLAVE_DS2780=m
CONFIG_W1_SLAVE_DS2781=m
CONFIG_W1_SLAVE_DS28E04=m
CONFIG_W1_SLAVE_DS28E17=m
# end of 1-wire Slaves

CONFIG_POWER_RESET=y
CONFIG_POWER_RESET_ATC260X=m
CONFIG_POWER_RESET_MT6323=y
CONFIG_POWER_RESET_RESTART=y
CONFIG_POWER_RESET_TPS65086=y
CONFIG_POWER_SUPPLY=y
# CONFIG_POWER_SUPPLY_DEBUG is not set
CONFIG_POWER_SUPPLY_HWMON=y
CONFIG_GENERIC_ADC_BATTERY=m
CONFIG_IP5XXX_POWER=m
CONFIG_MAX8925_POWER=m
CONFIG_WM831X_BACKUP=m
CONFIG_WM831X_POWER=m
CONFIG_WM8350_POWER=m
CONFIG_TEST_POWER=m
CONFIG_BATTERY_88PM860X=m
CONFIG_CHARGER_ADP5061=m
CONFIG_BATTERY_CW2015=m
CONFIG_BATTERY_DS2760=m
CONFIG_BATTERY_DS2780=m
CONFIG_BATTERY_DS2781=m
CONFIG_BATTERY_DS2782=m
CONFIG_BATTERY_SAMSUNG_SDI=y
CONFIG_BATTERY_SBS=m
CONFIG_CHARGER_SBS=m
CONFIG_MANAGER_SBS=m
CONFIG_BATTERY_BQ27XXX=m
CONFIG_BATTERY_BQ27XXX_I2C=m
CONFIG_BATTERY_BQ27XXX_HDQ=m
# CONFIG_BATTERY_BQ27XXX_DT_UPDATES_NVM is not set
CONFIG_BATTERY_DA9030=m
CONFIG_BATTERY_DA9052=m
CONFIG_CHARGER_DA9150=m
CONFIG_BATTERY_DA9150=m
CONFIG_CHARGER_AXP20X=m
CONFIG_BATTERY_AXP20X=m
CONFIG_AXP20X_POWER=m
CONFIG_AXP288_CHARGER=m
CONFIG_AXP288_FUEL_GAUGE=m
CONFIG_BATTERY_MAX17040=m
CONFIG_BATTERY_MAX17042=m
CONFIG_BATTERY_MAX1721X=m
CONFIG_BATTERY_TWL4030_MADC=m
CONFIG_CHARGER_88PM860X=m
CONFIG_CHARGER_PCF50633=m
CONFIG_BATTERY_RX51=m
CONFIG_CHARGER_ISP1704=m
CONFIG_CHARGER_MAX8903=m
CONFIG_CHARGER_TWL4030=m
CONFIG_CHARGER_LP8727=m
CONFIG_CHARGER_LP8788=m
CONFIG_CHARGER_GPIO=m
CONFIG_CHARGER_MANAGER=y
CONFIG_CHARGER_LT3651=m
CONFIG_CHARGER_LTC4162L=m
CONFIG_CHARGER_MAX14577=m
CONFIG_CHARGER_MAX77693=m
CONFIG_CHARGER_MAX77976=m
CONFIG_CHARGER_MAX8997=m
CONFIG_CHARGER_MAX8998=m
CONFIG_CHARGER_MP2629=m
CONFIG_CHARGER_MT6360=m
CONFIG_CHARGER_MT6370=m
CONFIG_CHARGER_BQ2415X=m
CONFIG_CHARGER_BQ24190=m
CONFIG_CHARGER_BQ24257=m
CONFIG_CHARGER_BQ24735=m
CONFIG_CHARGER_BQ2515X=m
CONFIG_CHARGER_BQ25890=m
CONFIG_CHARGER_BQ25980=m
CONFIG_CHARGER_BQ256XX=m
CONFIG_CHARGER_SMB347=m
CONFIG_CHARGER_TPS65090=m
CONFIG_BATTERY_GAUGE_LTC2941=m
CONFIG_BATTERY_GOLDFISH=m
CONFIG_BATTERY_RT5033=m
# CONFIG_CHARGER_RT5033 is not set
CONFIG_CHARGER_RT9455=m
# CONFIG_CHARGER_RT9467 is not set
# CONFIG_CHARGER_RT9471 is not set
CONFIG_CHARGER_CROS_USBPD=m
CONFIG_CHARGER_CROS_PCHG=m
CONFIG_CHARGER_BD99954=m
CONFIG_CHARGER_WILCO=m
CONFIG_BATTERY_SURFACE=m
CONFIG_CHARGER_SURFACE=m
CONFIG_BATTERY_UG3105=m
CONFIG_HWMON=y
CONFIG_HWMON_VID=m
# CONFIG_HWMON_DEBUG_CHIP is not set

#
# Native drivers
#
CONFIG_SENSORS_ABITUGURU=m
CONFIG_SENSORS_ABITUGURU3=m
CONFIG_SENSORS_SMPRO=m
CONFIG_SENSORS_AD7314=m
CONFIG_SENSORS_AD7414=m
CONFIG_SENSORS_AD7418=m
CONFIG_SENSORS_ADM1025=m
CONFIG_SENSORS_ADM1026=m
CONFIG_SENSORS_ADM1029=m
CONFIG_SENSORS_ADM1031=m
CONFIG_SENSORS_ADM1177=m
CONFIG_SENSORS_ADM9240=m
CONFIG_SENSORS_ADT7X10=m
CONFIG_SENSORS_ADT7310=m
CONFIG_SENSORS_ADT7410=m
CONFIG_SENSORS_ADT7411=m
CONFIG_SENSORS_ADT7462=m
CONFIG_SENSORS_ADT7470=m
CONFIG_SENSORS_ADT7475=m
CONFIG_SENSORS_AHT10=m
CONFIG_SENSORS_AQUACOMPUTER_D5NEXT=m
CONFIG_SENSORS_AS370=m
CONFIG_SENSORS_ASC7621=m
CONFIG_SENSORS_AXI_FAN_CONTROL=m
CONFIG_SENSORS_K8TEMP=m
CONFIG_SENSORS_K10TEMP=m
CONFIG_SENSORS_FAM15H_POWER=m
CONFIG_SENSORS_APPLESMC=m
CONFIG_SENSORS_ASB100=m
CONFIG_SENSORS_ATXP1=m
CONFIG_SENSORS_CORSAIR_CPRO=m
CONFIG_SENSORS_CORSAIR_PSU=m
CONFIG_SENSORS_DRIVETEMP=m
CONFIG_SENSORS_DS620=m
CONFIG_SENSORS_DS1621=m
CONFIG_SENSORS_DELL_SMM=m
CONFIG_I8K=y
CONFIG_SENSORS_DA9052_ADC=m
CONFIG_SENSORS_DA9055=m
CONFIG_SENSORS_I5K_AMB=m
CONFIG_SENSORS_F71805F=m
CONFIG_SENSORS_F71882FG=m
CONFIG_SENSORS_F75375S=m
CONFIG_SENSORS_MC13783_ADC=m
CONFIG_SENSORS_FSCHMD=m
CONFIG_SENSORS_FTSTEUTATES=m
CONFIG_SENSORS_GL518SM=m
CONFIG_SENSORS_GL520SM=m
CONFIG_SENSORS_G760A=m
CONFIG_SENSORS_G762=m
CONFIG_SENSORS_HIH6130=m
# CONFIG_SENSORS_HS3001 is not set
CONFIG_SENSORS_IBMAEM=m
CONFIG_SENSORS_IBMPEX=m
CONFIG_SENSORS_IIO_HWMON=m
CONFIG_SENSORS_I5500=m
CONFIG_SENSORS_CORETEMP=m
CONFIG_SENSORS_IT87=m
CONFIG_SENSORS_JC42=m
CONFIG_SENSORS_POWR1220=m
CONFIG_SENSORS_LINEAGE=m
CONFIG_SENSORS_LTC2945=m
CONFIG_SENSORS_LTC2947=m
CONFIG_SENSORS_LTC2947_I2C=m
CONFIG_SENSORS_LTC2947_SPI=m
CONFIG_SENSORS_LTC2990=m
CONFIG_SENSORS_LTC2992=m
CONFIG_SENSORS_LTC4151=m
CONFIG_SENSORS_LTC4215=m
CONFIG_SENSORS_LTC4222=m
CONFIG_SENSORS_LTC4245=m
CONFIG_SENSORS_LTC4260=m
CONFIG_SENSORS_LTC4261=m
CONFIG_SENSORS_MAX1111=m
CONFIG_SENSORS_MAX127=m
CONFIG_SENSORS_MAX16065=m
CONFIG_SENSORS_MAX1619=m
CONFIG_SENSORS_MAX1668=m
CONFIG_SENSORS_MAX197=m
CONFIG_SENSORS_MAX31722=m
CONFIG_SENSORS_MAX31730=m
CONFIG_SENSORS_MAX31760=m
# CONFIG_MAX31827 is not set
CONFIG_SENSORS_MAX6620=m
CONFIG_SENSORS_MAX6621=m
CONFIG_SENSORS_MAX6639=m
CONFIG_SENSORS_MAX6650=m
CONFIG_SENSORS_MAX6697=m
CONFIG_SENSORS_MAX31790=m
# CONFIG_SENSORS_MC34VR500 is not set
CONFIG_SENSORS_MCP3021=m
CONFIG_SENSORS_MLXREG_FAN=m
CONFIG_SENSORS_TC654=m
CONFIG_SENSORS_TPS23861=m
CONFIG_SENSORS_MENF21BMC_HWMON=m
CONFIG_SENSORS_MR75203=m
CONFIG_SENSORS_ADCXX=m
CONFIG_SENSORS_LM63=m
CONFIG_SENSORS_LM70=m
CONFIG_SENSORS_LM73=m
CONFIG_SENSORS_LM75=m
CONFIG_SENSORS_LM77=m
CONFIG_SENSORS_LM78=m
CONFIG_SENSORS_LM80=m
CONFIG_SENSORS_LM83=m
CONFIG_SENSORS_LM85=m
CONFIG_SENSORS_LM87=m
CONFIG_SENSORS_LM90=m
CONFIG_SENSORS_LM92=m
CONFIG_SENSORS_LM93=m
CONFIG_SENSORS_LM95234=m
CONFIG_SENSORS_LM95241=m
CONFIG_SENSORS_LM95245=m
CONFIG_SENSORS_PC87360=m
CONFIG_SENSORS_PC87427=m
CONFIG_SENSORS_NTC_THERMISTOR=m
CONFIG_SENSORS_NCT6683=m
CONFIG_SENSORS_NCT6775_CORE=m
CONFIG_SENSORS_NCT6775=m
CONFIG_SENSORS_NCT6775_I2C=m
CONFIG_SENSORS_NCT7802=m
CONFIG_SENSORS_NCT7904=m
CONFIG_SENSORS_NPCM7XX=m
CONFIG_SENSORS_NZXT_KRAKEN2=m
CONFIG_SENSORS_NZXT_SMART2=m
CONFIG_SENSORS_OCC_P8_I2C=m
CONFIG_SENSORS_OCC=m
CONFIG_SENSORS_OXP=m
CONFIG_SENSORS_PCF8591=m
CONFIG_SENSORS_PECI_CPUTEMP=m
CONFIG_SENSORS_PECI_DIMMTEMP=m
CONFIG_SENSORS_PECI=m
CONFIG_PMBUS=m
CONFIG_SENSORS_PMBUS=m
# CONFIG_SENSORS_ACBEL_FSG032 is not set
CONFIG_SENSORS_ADM1266=m
CONFIG_SENSORS_ADM1275=m
CONFIG_SENSORS_BEL_PFE=m
CONFIG_SENSORS_BPA_RS600=m
CONFIG_SENSORS_DELTA_AHE50DC_FAN=m
CONFIG_SENSORS_FSP_3Y=m
CONFIG_SENSORS_IBM_CFFPS=m
CONFIG_SENSORS_DPS920AB=m
CONFIG_SENSORS_INSPUR_IPSPS=m
CONFIG_SENSORS_IR35221=m
CONFIG_SENSORS_IR36021=m
CONFIG_SENSORS_IR38064=m
CONFIG_SENSORS_IR38064_REGULATOR=y
CONFIG_SENSORS_IRPS5401=m
CONFIG_SENSORS_ISL68137=m
CONFIG_SENSORS_LM25066=m
CONFIG_SENSORS_LM25066_REGULATOR=y
CONFIG_SENSORS_LT7182S=m
CONFIG_SENSORS_LTC2978=m
CONFIG_SENSORS_LTC2978_REGULATOR=y
CONFIG_SENSORS_LTC3815=m
CONFIG_SENSORS_MAX15301=m
CONFIG_SENSORS_MAX16064=m
CONFIG_SENSORS_MAX16601=m
CONFIG_SENSORS_MAX20730=m
CONFIG_SENSORS_MAX20751=m
CONFIG_SENSORS_MAX31785=m
CONFIG_SENSORS_MAX34440=m
CONFIG_SENSORS_MAX8688=m
CONFIG_SENSORS_MP2888=m
CONFIG_SENSORS_MP2975=m
# CONFIG_SENSORS_MP2975_REGULATOR is not set
CONFIG_SENSORS_MP5023=m
# CONFIG_SENSORS_MPQ7932 is not set
CONFIG_SENSORS_PIM4328=m
CONFIG_SENSORS_PLI1209BC=m
CONFIG_SENSORS_PLI1209BC_REGULATOR=y
CONFIG_SENSORS_PM6764TR=m
CONFIG_SENSORS_PXE1610=m
CONFIG_SENSORS_Q54SJ108A2=m
CONFIG_SENSORS_STPDDC60=m
# CONFIG_SENSORS_TDA38640 is not set
CONFIG_SENSORS_TPS40422=m
CONFIG_SENSORS_TPS53679=m
CONFIG_SENSORS_TPS546D24=m
CONFIG_SENSORS_UCD9000=m
CONFIG_SENSORS_UCD9200=m
CONFIG_SENSORS_XDPE152=m
CONFIG_SENSORS_XDPE122=m
CONFIG_SENSORS_XDPE122_REGULATOR=y
CONFIG_SENSORS_ZL6100=m
CONFIG_SENSORS_SBTSI=m
CONFIG_SENSORS_SBRMI=m
CONFIG_SENSORS_SHT15=m
CONFIG_SENSORS_SHT21=m
CONFIG_SENSORS_SHT3x=m
CONFIG_SENSORS_SHT4x=m
CONFIG_SENSORS_SHTC1=m
CONFIG_SENSORS_SIS5595=m
CONFIG_SENSORS_SY7636A=m
CONFIG_SENSORS_DME1737=m
CONFIG_SENSORS_EMC1403=m
CONFIG_SENSORS_EMC2103=m
CONFIG_SENSORS_EMC2305=m
CONFIG_SENSORS_EMC6W201=m
CONFIG_SENSORS_SMSC47M1=m
CONFIG_SENSORS_SMSC47M192=m
CONFIG_SENSORS_SMSC47B397=m
CONFIG_SENSORS_SCH56XX_COMMON=m
CONFIG_SENSORS_SCH5627=m
CONFIG_SENSORS_SCH5636=m
CONFIG_SENSORS_STTS751=m
CONFIG_SENSORS_ADC128D818=m
CONFIG_SENSORS_ADS7828=m
CONFIG_SENSORS_ADS7871=m
CONFIG_SENSORS_AMC6821=m
CONFIG_SENSORS_INA209=m
CONFIG_SENSORS_INA2XX=m
CONFIG_SENSORS_INA238=m
CONFIG_SENSORS_INA3221=m
CONFIG_SENSORS_TC74=m
CONFIG_SENSORS_THMC50=m
CONFIG_SENSORS_TMP102=m
CONFIG_SENSORS_TMP103=m
CONFIG_SENSORS_TMP108=m
CONFIG_SENSORS_TMP401=m
CONFIG_SENSORS_TMP421=m
CONFIG_SENSORS_TMP464=m
CONFIG_SENSORS_TMP513=m
CONFIG_SENSORS_VIA_CPUTEMP=m
CONFIG_SENSORS_VIA686A=m
CONFIG_SENSORS_VT1211=m
CONFIG_SENSORS_VT8231=m
CONFIG_SENSORS_W83773G=m
CONFIG_SENSORS_W83781D=m
CONFIG_SENSORS_W83791D=m
CONFIG_SENSORS_W83792D=m
CONFIG_SENSORS_W83793=m
CONFIG_SENSORS_W83795=m
# CONFIG_SENSORS_W83795_FANCTRL is not set
CONFIG_SENSORS_W83L785TS=m
CONFIG_SENSORS_W83L786NG=m
CONFIG_SENSORS_W83627HF=m
CONFIG_SENSORS_W83627EHF=m
CONFIG_SENSORS_WM831X=m
CONFIG_SENSORS_WM8350=m
CONFIG_SENSORS_XGENE=m

#
# ACPI drivers
#
CONFIG_SENSORS_ACPI_POWER=m
CONFIG_SENSORS_ATK0110=m
CONFIG_SENSORS_ASUS_WMI=m
CONFIG_SENSORS_ASUS_EC=m
# CONFIG_SENSORS_HP_WMI is not set
CONFIG_THERMAL=y
CONFIG_THERMAL_NETLINK=y
CONFIG_THERMAL_STATISTICS=y
CONFIG_THERMAL_EMERGENCY_POWEROFF_DELAY_MS=0
CONFIG_THERMAL_HWMON=y
CONFIG_THERMAL_ACPI=y
CONFIG_THERMAL_WRITABLE_TRIPS=y
CONFIG_THERMAL_DEFAULT_GOV_STEP_WISE=y
# CONFIG_THERMAL_DEFAULT_GOV_FAIR_SHARE is not set
# CONFIG_THERMAL_DEFAULT_GOV_USER_SPACE is not set
# CONFIG_THERMAL_DEFAULT_GOV_POWER_ALLOCATOR is not set
# CONFIG_THERMAL_DEFAULT_GOV_BANG_BANG is not set
CONFIG_THERMAL_GOV_FAIR_SHARE=y
CONFIG_THERMAL_GOV_STEP_WISE=y
CONFIG_THERMAL_GOV_BANG_BANG=y
CONFIG_THERMAL_GOV_USER_SPACE=y
CONFIG_THERMAL_GOV_POWER_ALLOCATOR=y
CONFIG_DEVFREQ_THERMAL=y
CONFIG_THERMAL_EMULATION=y

#
# Intel thermal drivers
#
CONFIG_INTEL_POWERCLAMP=m
CONFIG_X86_THERMAL_VECTOR=y
CONFIG_INTEL_TCC=y
CONFIG_X86_PKG_TEMP_THERMAL=m
CONFIG_INTEL_SOC_DTS_IOSF_CORE=m
CONFIG_INTEL_SOC_DTS_THERMAL=m

#
# ACPI INT340X thermal drivers
#
CONFIG_INT340X_THERMAL=m
CONFIG_ACPI_THERMAL_REL=m
CONFIG_INT3406_THERMAL=m
CONFIG_PROC_THERMAL_MMIO_RAPL=m
# end of ACPI INT340X thermal drivers

CONFIG_INTEL_BXT_PMIC_THERMAL=m
CONFIG_INTEL_PCH_THERMAL=m
CONFIG_INTEL_TCC_COOLING=m
CONFIG_INTEL_HFI_THERMAL=y
# end of Intel thermal drivers

CONFIG_GENERIC_ADC_THERMAL=m
CONFIG_WATCHDOG=y
CONFIG_WATCHDOG_CORE=y
# CONFIG_WATCHDOG_NOWAYOUT is not set
CONFIG_WATCHDOG_HANDLE_BOOT_ENABLED=y
CONFIG_WATCHDOG_OPEN_TIMEOUT=0
CONFIG_WATCHDOG_SYSFS=y
# CONFIG_WATCHDOG_HRTIMER_PRETIMEOUT is not set

#
# Watchdog Pretimeout Governors
#
CONFIG_WATCHDOG_PRETIMEOUT_GOV=y
CONFIG_WATCHDOG_PRETIMEOUT_GOV_SEL=m
CONFIG_WATCHDOG_PRETIMEOUT_GOV_NOOP=y
CONFIG_WATCHDOG_PRETIMEOUT_GOV_PANIC=m
CONFIG_WATCHDOG_PRETIMEOUT_DEFAULT_GOV_NOOP=y
# CONFIG_WATCHDOG_PRETIMEOUT_DEFAULT_GOV_PANIC is not set

#
# Watchdog Device Drivers
#
CONFIG_SOFT_WATCHDOG=m
CONFIG_SOFT_WATCHDOG_PRETIMEOUT=y
CONFIG_DA9052_WATCHDOG=m
CONFIG_DA9055_WATCHDOG=m
CONFIG_DA9063_WATCHDOG=m
CONFIG_DA9062_WATCHDOG=m
CONFIG_MENF21BMC_WATCHDOG=m
CONFIG_MENZ069_WATCHDOG=m
CONFIG_WDAT_WDT=m
CONFIG_WM831X_WATCHDOG=m
CONFIG_WM8350_WATCHDOG=m
CONFIG_XILINX_WATCHDOG=m
CONFIG_ZIIRAVE_WATCHDOG=m
CONFIG_RAVE_SP_WATCHDOG=m
CONFIG_MLX_WDT=m
CONFIG_CADENCE_WATCHDOG=m
CONFIG_DW_WATCHDOG=m
CONFIG_TWL4030_WATCHDOG=m
CONFIG_MAX63XX_WATCHDOG=m
CONFIG_RETU_WATCHDOG=m
CONFIG_ACQUIRE_WDT=m
CONFIG_ADVANTECH_WDT=m
CONFIG_ADVANTECH_EC_WDT=m
CONFIG_ALIM1535_WDT=m
CONFIG_ALIM7101_WDT=m
CONFIG_EBC_C384_WDT=m
CONFIG_EXAR_WDT=m
CONFIG_F71808E_WDT=m
CONFIG_SP5100_TCO=m
CONFIG_SBC_FITPC2_WATCHDOG=m
CONFIG_EUROTECH_WDT=m
CONFIG_IB700_WDT=m
CONFIG_IBMASR=m
CONFIG_WAFER_WDT=m
CONFIG_I6300ESB_WDT=m
CONFIG_IE6XX_WDT=m
CONFIG_ITCO_WDT=m
CONFIG_ITCO_VENDOR_SUPPORT=y
CONFIG_IT8712F_WDT=m
CONFIG_IT87_WDT=m
CONFIG_HP_WATCHDOG=m
CONFIG_HPWDT_NMI_DECODING=y
CONFIG_KEMPLD_WDT=m
CONFIG_SC1200_WDT=m
CONFIG_PC87413_WDT=m
CONFIG_NV_TCO=m
CONFIG_60XX_WDT=m
CONFIG_CPU5_WDT=m
CONFIG_SMSC_SCH311X_WDT=m
CONFIG_SMSC37B787_WDT=m
CONFIG_TQMX86_WDT=m
CONFIG_VIA_WDT=m
CONFIG_W83627HF_WDT=m
CONFIG_W83877F_WDT=m
CONFIG_W83977F_WDT=m
CONFIG_MACHZ_WDT=m
CONFIG_SBC_EPX_C3_WATCHDOG=m
CONFIG_INTEL_MEI_WDT=m
CONFIG_NI903X_WDT=m
CONFIG_NIC7018_WDT=m
CONFIG_SIEMENS_SIMATIC_IPC_WDT=m
CONFIG_MEN_A21_WDT=m
CONFIG_XEN_WDT=m

#
# PCI-based Watchdog Cards
#
CONFIG_PCIPCWATCHDOG=m
CONFIG_WDTPCI=m

#
# USB-based Watchdog Cards
#
CONFIG_USBPCWATCHDOG=m
CONFIG_SSB_POSSIBLE=y
CONFIG_SSB=m
CONFIG_SSB_SPROM=y
CONFIG_SSB_BLOCKIO=y
CONFIG_SSB_PCIHOST_POSSIBLE=y
CONFIG_SSB_PCIHOST=y
CONFIG_SSB_B43_PCI_BRIDGE=y
CONFIG_SSB_PCMCIAHOST_POSSIBLE=y
# CONFIG_SSB_PCMCIAHOST is not set
CONFIG_SSB_SDIOHOST_POSSIBLE=y
CONFIG_SSB_SDIOHOST=y
CONFIG_SSB_DRIVER_PCICORE_POSSIBLE=y
CONFIG_SSB_DRIVER_PCICORE=y
CONFIG_SSB_DRIVER_GPIO=y
CONFIG_BCMA_POSSIBLE=y
CONFIG_BCMA=m
CONFIG_BCMA_BLOCKIO=y
CONFIG_BCMA_HOST_PCI_POSSIBLE=y
CONFIG_BCMA_HOST_PCI=y
CONFIG_BCMA_HOST_SOC=y
CONFIG_BCMA_DRIVER_PCI=y
CONFIG_BCMA_SFLASH=y
CONFIG_BCMA_DRIVER_GMAC_CMN=y
CONFIG_BCMA_DRIVER_GPIO=y
# CONFIG_BCMA_DEBUG is not set

#
# Multifunction device drivers
#
CONFIG_MFD_CORE=y
CONFIG_MFD_AS3711=y
CONFIG_MFD_SMPRO=m
CONFIG_PMIC_ADP5520=y
CONFIG_MFD_AAT2870_CORE=y
CONFIG_MFD_BCM590XX=m
CONFIG_MFD_BD9571MWV=m
CONFIG_MFD_AXP20X=m
CONFIG_MFD_AXP20X_I2C=m
CONFIG_MFD_CROS_EC_DEV=m
# CONFIG_MFD_CS42L43_I2C is not set
# CONFIG_MFD_CS42L43_SDW is not set
CONFIG_MFD_MADERA=m
CONFIG_MFD_MADERA_I2C=m
CONFIG_MFD_MADERA_SPI=m
CONFIG_MFD_CS47L15=y
CONFIG_MFD_CS47L35=y
CONFIG_MFD_CS47L85=y
CONFIG_MFD_CS47L90=y
CONFIG_MFD_CS47L92=y
CONFIG_PMIC_DA903X=y
CONFIG_PMIC_DA9052=y
CONFIG_MFD_DA9052_SPI=y
CONFIG_MFD_DA9052_I2C=y
CONFIG_MFD_DA9055=y
CONFIG_MFD_DA9062=m
CONFIG_MFD_DA9063=y
CONFIG_MFD_DA9150=m
CONFIG_MFD_DLN2=m
CONFIG_MFD_MC13XXX=m
CONFIG_MFD_MC13XXX_SPI=m
CONFIG_MFD_MC13XXX_I2C=m
CONFIG_MFD_MP2629=m
CONFIG_MFD_INTEL_QUARK_I2C_GPIO=m
CONFIG_LPC_ICH=m
CONFIG_LPC_SCH=m
CONFIG_INTEL_SOC_PMIC=y
CONFIG_INTEL_SOC_PMIC_BXTWC=m
CONFIG_INTEL_SOC_PMIC_CHTWC=y
CONFIG_INTEL_SOC_PMIC_CHTDC_TI=m
CONFIG_INTEL_SOC_PMIC_MRFLD=m
CONFIG_MFD_INTEL_LPSS=m
CONFIG_MFD_INTEL_LPSS_ACPI=m
CONFIG_MFD_INTEL_LPSS_PCI=m
CONFIG_MFD_INTEL_PMC_BXT=m
CONFIG_MFD_IQS62X=m
CONFIG_MFD_JANZ_CMODIO=m
CONFIG_MFD_KEMPLD=m
CONFIG_MFD_88PM800=m
CONFIG_MFD_88PM805=m
CONFIG_MFD_88PM860X=y
CONFIG_MFD_MAX14577=y
# CONFIG_MFD_MAX77541 is not set
CONFIG_MFD_MAX77693=y
CONFIG_MFD_MAX77843=y
CONFIG_MFD_MAX8907=m
CONFIG_MFD_MAX8925=y
CONFIG_MFD_MAX8997=y
CONFIG_MFD_MAX8998=y
CONFIG_MFD_MT6360=m
CONFIG_MFD_MT6370=m
CONFIG_MFD_MT6397=m
CONFIG_MFD_MENF21BMC=m
CONFIG_MFD_OCELOT=m
CONFIG_EZX_PCAP=y
CONFIG_MFD_VIPERBOARD=m
CONFIG_MFD_RETU=m
CONFIG_MFD_PCF50633=m
CONFIG_PCF50633_ADC=m
CONFIG_PCF50633_GPIO=m
CONFIG_MFD_SY7636A=m
CONFIG_MFD_RDC321X=m
CONFIG_MFD_RT4831=m
CONFIG_MFD_RT5033=m
CONFIG_MFD_RT5120=m
CONFIG_MFD_RC5T583=y
CONFIG_MFD_SI476X_CORE=m
CONFIG_MFD_SIMPLE_MFD_I2C=m
CONFIG_MFD_SM501=m
CONFIG_MFD_SM501_GPIO=y
CONFIG_MFD_SKY81452=m
CONFIG_MFD_SYSCON=y
CONFIG_MFD_TI_AM335X_TSCADC=m
CONFIG_MFD_LP3943=m
CONFIG_MFD_LP8788=y
CONFIG_MFD_TI_LMU=m
CONFIG_MFD_PALMAS=y
CONFIG_TPS6105X=m
CONFIG_TPS65010=m
CONFIG_TPS6507X=m
CONFIG_MFD_TPS65086=m
CONFIG_MFD_TPS65090=y
CONFIG_MFD_TI_LP873X=m
CONFIG_MFD_TPS6586X=y
CONFIG_MFD_TPS65910=y
CONFIG_MFD_TPS65912=y
CONFIG_MFD_TPS65912_I2C=y
CONFIG_MFD_TPS65912_SPI=y
# CONFIG_MFD_TPS6594_I2C is not set
# CONFIG_MFD_TPS6594_SPI is not set
CONFIG_TWL4030_CORE=y
CONFIG_MFD_TWL4030_AUDIO=y
CONFIG_TWL6040_CORE=y
CONFIG_MFD_WL1273_CORE=m
CONFIG_MFD_LM3533=m
CONFIG_MFD_TQMX86=m
CONFIG_MFD_VX855=m
CONFIG_MFD_ARIZONA=m
CONFIG_MFD_ARIZONA_I2C=m
CONFIG_MFD_ARIZONA_SPI=m
CONFIG_MFD_CS47L24=y
CONFIG_MFD_WM5102=y
CONFIG_MFD_WM5110=y
CONFIG_MFD_WM8997=y
CONFIG_MFD_WM8998=y
CONFIG_MFD_WM8400=y
CONFIG_MFD_WM831X=y
CONFIG_MFD_WM831X_I2C=y
CONFIG_MFD_WM831X_SPI=y
CONFIG_MFD_WM8350=y
CONFIG_MFD_WM8350_I2C=y
CONFIG_MFD_WM8994=m
CONFIG_MFD_WCD934X=m
CONFIG_MFD_ATC260X=m
CONFIG_MFD_ATC260X_I2C=m
CONFIG_RAVE_SP_CORE=m
# CONFIG_MFD_INTEL_M10_BMC_SPI is not set
# CONFIG_MFD_INTEL_M10_BMC_PMCI is not set
# end of Multifunction device drivers

CONFIG_REGULATOR=y
# CONFIG_REGULATOR_DEBUG is not set
CONFIG_REGULATOR_FIXED_VOLTAGE=m
CONFIG_REGULATOR_VIRTUAL_CONSUMER=m
CONFIG_REGULATOR_USERSPACE_CONSUMER=m
CONFIG_REGULATOR_88PG86X=m
CONFIG_REGULATOR_88PM800=m
CONFIG_REGULATOR_88PM8607=m
CONFIG_REGULATOR_ACT8865=m
CONFIG_REGULATOR_AD5398=m
CONFIG_REGULATOR_AAT2870=m
CONFIG_REGULATOR_ARIZONA_LDO1=m
CONFIG_REGULATOR_ARIZONA_MICSUPP=m
CONFIG_REGULATOR_AS3711=m
CONFIG_REGULATOR_ATC260X=m
# CONFIG_REGULATOR_AW37503 is not set
CONFIG_REGULATOR_AXP20X=m
CONFIG_REGULATOR_BCM590XX=m
CONFIG_REGULATOR_BD9571MWV=m
CONFIG_REGULATOR_DA903X=m
CONFIG_REGULATOR_DA9052=m
CONFIG_REGULATOR_DA9055=m
CONFIG_REGULATOR_DA9062=m
CONFIG_REGULATOR_DA9210=m
CONFIG_REGULATOR_DA9211=m
CONFIG_REGULATOR_FAN53555=m
CONFIG_REGULATOR_GPIO=m
CONFIG_REGULATOR_ISL9305=m
CONFIG_REGULATOR_ISL6271A=m
CONFIG_REGULATOR_LM363X=m
CONFIG_REGULATOR_LP3971=m
CONFIG_REGULATOR_LP3972=m
CONFIG_REGULATOR_LP872X=m
CONFIG_REGULATOR_LP8755=m
CONFIG_REGULATOR_LP8788=m
CONFIG_REGULATOR_LTC3589=m
CONFIG_REGULATOR_LTC3676=m
CONFIG_REGULATOR_MAX14577=m
CONFIG_REGULATOR_MAX1586=m
# CONFIG_REGULATOR_MAX77857 is not set
CONFIG_REGULATOR_MAX8649=m
CONFIG_REGULATOR_MAX8660=m
CONFIG_REGULATOR_MAX8893=m
CONFIG_REGULATOR_MAX8907=m
CONFIG_REGULATOR_MAX8925=m
CONFIG_REGULATOR_MAX8952=m
CONFIG_REGULATOR_MAX8997=m
CONFIG_REGULATOR_MAX8998=m
CONFIG_REGULATOR_MAX20086=m
# CONFIG_REGULATOR_MAX20411 is not set
CONFIG_REGULATOR_MAX77693=m
CONFIG_REGULATOR_MAX77826=m
CONFIG_REGULATOR_MC13XXX_CORE=m
CONFIG_REGULATOR_MC13783=m
CONFIG_REGULATOR_MC13892=m
CONFIG_REGULATOR_MP8859=m
CONFIG_REGULATOR_MT6311=m
CONFIG_REGULATOR_MT6315=m
CONFIG_REGULATOR_MT6323=m
CONFIG_REGULATOR_MT6331=m
CONFIG_REGULATOR_MT6332=m
CONFIG_REGULATOR_MT6357=m
CONFIG_REGULATOR_MT6358=m
CONFIG_REGULATOR_MT6359=m
CONFIG_REGULATOR_MT6360=m
CONFIG_REGULATOR_MT6370=m
CONFIG_REGULATOR_MT6397=m
CONFIG_REGULATOR_PALMAS=m
CONFIG_REGULATOR_PCA9450=m
CONFIG_REGULATOR_PCAP=m
CONFIG_REGULATOR_PCF50633=m
CONFIG_REGULATOR_PV88060=m
CONFIG_REGULATOR_PV88080=m
CONFIG_REGULATOR_PV88090=m
CONFIG_REGULATOR_PWM=m
CONFIG_REGULATOR_QCOM_SPMI=m
CONFIG_REGULATOR_QCOM_USB_VBUS=m
# CONFIG_REGULATOR_RAA215300 is not set
CONFIG_REGULATOR_RC5T583=m
CONFIG_REGULATOR_RT4801=m
# CONFIG_REGULATOR_RT4803 is not set
CONFIG_REGULATOR_RT4831=m
CONFIG_REGULATOR_RT5033=m
CONFIG_REGULATOR_RT5120=m
CONFIG_REGULATOR_RT5190A=m
# CONFIG_REGULATOR_RT5739 is not set
CONFIG_REGULATOR_RT5759=m
CONFIG_REGULATOR_RT6160=m
CONFIG_REGULATOR_RT6190=m
CONFIG_REGULATOR_RT6245=m
CONFIG_REGULATOR_RTQ2134=m
CONFIG_REGULATOR_RTMV20=m
CONFIG_REGULATOR_RTQ6752=m
# CONFIG_REGULATOR_RTQ2208 is not set
CONFIG_REGULATOR_SKY81452=m
CONFIG_REGULATOR_SLG51000=m
CONFIG_REGULATOR_SY7636A=m
CONFIG_REGULATOR_TPS51632=m
CONFIG_REGULATOR_TPS6105X=m
CONFIG_REGULATOR_TPS62360=m
CONFIG_REGULATOR_TPS65023=m
CONFIG_REGULATOR_TPS6507X=m
CONFIG_REGULATOR_TPS65086=m
CONFIG_REGULATOR_TPS65090=m
CONFIG_REGULATOR_TPS65132=m
CONFIG_REGULATOR_TPS6524X=m
CONFIG_REGULATOR_TPS6586X=m
CONFIG_REGULATOR_TPS65910=m
CONFIG_REGULATOR_TPS65912=m
CONFIG_REGULATOR_TPS68470=m
CONFIG_REGULATOR_TWL4030=m
CONFIG_REGULATOR_WM831X=m
CONFIG_REGULATOR_WM8350=m
CONFIG_REGULATOR_WM8400=m
CONFIG_REGULATOR_WM8994=m
CONFIG_REGULATOR_QCOM_LABIBB=m
CONFIG_RC_CORE=m
CONFIG_LIRC=y
CONFIG_RC_MAP=m
CONFIG_RC_DECODERS=y
CONFIG_IR_IMON_DECODER=m
CONFIG_IR_JVC_DECODER=m
CONFIG_IR_MCE_KBD_DECODER=m
CONFIG_IR_NEC_DECODER=m
CONFIG_IR_RC5_DECODER=m
CONFIG_IR_RC6_DECODER=m
CONFIG_IR_RCMM_DECODER=m
CONFIG_IR_SANYO_DECODER=m
CONFIG_IR_SHARP_DECODER=m
CONFIG_IR_SONY_DECODER=m
CONFIG_IR_XMP_DECODER=m
CONFIG_RC_DEVICES=y
CONFIG_IR_ENE=m
CONFIG_IR_FINTEK=m
CONFIG_IR_IGORPLUGUSB=m
CONFIG_IR_IGUANA=m
CONFIG_IR_IMON=m
CONFIG_IR_IMON_RAW=m
CONFIG_IR_ITE_CIR=m
CONFIG_IR_MCEUSB=m
CONFIG_IR_NUVOTON=m
CONFIG_IR_REDRAT3=m
CONFIG_IR_SERIAL=m
CONFIG_IR_SERIAL_TRANSMITTER=y
CONFIG_IR_STREAMZAP=m
CONFIG_IR_TOY=m
CONFIG_IR_TTUSBIR=m
CONFIG_IR_WINBOND_CIR=m
CONFIG_RC_ATI_REMOTE=m
CONFIG_RC_LOOPBACK=m
CONFIG_RC_XBOX_DVD=m
CONFIG_CEC_CORE=m
CONFIG_CEC_NOTIFIER=y
CONFIG_CEC_PIN=y

#
# CEC support
#
CONFIG_MEDIA_CEC_RC=y
# CONFIG_CEC_PIN_ERROR_INJ is not set
CONFIG_MEDIA_CEC_SUPPORT=y
CONFIG_CEC_CH7322=m
CONFIG_CEC_CROS_EC=m
CONFIG_CEC_GPIO=m
CONFIG_CEC_SECO=m
CONFIG_CEC_SECO_RC=y
CONFIG_USB_PULSE8_CEC=m
CONFIG_USB_RAINSHADOW_CEC=m
# end of CEC support

CONFIG_MEDIA_SUPPORT=m
CONFIG_MEDIA_SUPPORT_FILTER=y
CONFIG_MEDIA_SUBDRV_AUTOSELECT=y

#
# Media device types
#
CONFIG_MEDIA_CAMERA_SUPPORT=y
CONFIG_MEDIA_ANALOG_TV_SUPPORT=y
CONFIG_MEDIA_DIGITAL_TV_SUPPORT=y
CONFIG_MEDIA_RADIO_SUPPORT=y
CONFIG_MEDIA_SDR_SUPPORT=y
CONFIG_MEDIA_PLATFORM_SUPPORT=y
CONFIG_MEDIA_TEST_SUPPORT=y
# end of Media device types

CONFIG_VIDEO_DEV=m
CONFIG_MEDIA_CONTROLLER=y
CONFIG_DVB_CORE=m

#
# Video4Linux options
#
CONFIG_VIDEO_V4L2_I2C=y
CONFIG_VIDEO_V4L2_SUBDEV_API=y
# CONFIG_VIDEO_ADV_DEBUG is not set
# CONFIG_VIDEO_FIXED_MINOR_RANGES is not set
CONFIG_VIDEO_TUNER=m
CONFIG_V4L2_MEM2MEM_DEV=m
CONFIG_V4L2_FLASH_LED_CLASS=m
CONFIG_V4L2_FWNODE=m
CONFIG_V4L2_ASYNC=m
CONFIG_V4L2_CCI=m
CONFIG_V4L2_CCI_I2C=m
# end of Video4Linux options

#
# Media controller options
#
CONFIG_MEDIA_CONTROLLER_DVB=y
CONFIG_MEDIA_CONTROLLER_REQUEST_API=y
# end of Media controller options

#
# Digital TV options
#
# CONFIG_DVB_MMAP is not set
CONFIG_DVB_NET=y
CONFIG_DVB_MAX_ADAPTERS=8
CONFIG_DVB_DYNAMIC_MINORS=y
# CONFIG_DVB_DEMUX_SECTION_LOSS_LOG is not set
# CONFIG_DVB_ULE_DEBUG is not set
# end of Digital TV options

#
# Media drivers
#

#
# Drivers filtered as selected at 'Filter media drivers'
#

#
# Media drivers
#
CONFIG_MEDIA_USB_SUPPORT=y

#
# Webcam devices
#
CONFIG_USB_GSPCA=m
CONFIG_USB_GSPCA_BENQ=m
CONFIG_USB_GSPCA_CONEX=m
CONFIG_USB_GSPCA_CPIA1=m
CONFIG_USB_GSPCA_DTCS033=m
CONFIG_USB_GSPCA_ETOMS=m
CONFIG_USB_GSPCA_FINEPIX=m
CONFIG_USB_GSPCA_JEILINJ=m
CONFIG_USB_GSPCA_JL2005BCD=m
CONFIG_USB_GSPCA_KINECT=m
CONFIG_USB_GSPCA_KONICA=m
CONFIG_USB_GSPCA_MARS=m
CONFIG_USB_GSPCA_MR97310A=m
CONFIG_USB_GSPCA_NW80X=m
CONFIG_USB_GSPCA_OV519=m
CONFIG_USB_GSPCA_OV534=m
CONFIG_USB_GSPCA_OV534_9=m
CONFIG_USB_GSPCA_PAC207=m
CONFIG_USB_GSPCA_PAC7302=m
CONFIG_USB_GSPCA_PAC7311=m
CONFIG_USB_GSPCA_SE401=m
CONFIG_USB_GSPCA_SN9C2028=m
CONFIG_USB_GSPCA_SN9C20X=m
CONFIG_USB_GSPCA_SONIXB=m
CONFIG_USB_GSPCA_SONIXJ=m
CONFIG_USB_GSPCA_SPCA1528=m
CONFIG_USB_GSPCA_SPCA500=m
CONFIG_USB_GSPCA_SPCA501=m
CONFIG_USB_GSPCA_SPCA505=m
CONFIG_USB_GSPCA_SPCA506=m
CONFIG_USB_GSPCA_SPCA508=m
CONFIG_USB_GSPCA_SPCA561=m
CONFIG_USB_GSPCA_SQ905=m
CONFIG_USB_GSPCA_SQ905C=m
CONFIG_USB_GSPCA_SQ930X=m
CONFIG_USB_GSPCA_STK014=m
CONFIG_USB_GSPCA_STK1135=m
CONFIG_USB_GSPCA_STV0680=m
CONFIG_USB_GSPCA_SUNPLUS=m
CONFIG_USB_GSPCA_T613=m
CONFIG_USB_GSPCA_TOPRO=m
CONFIG_USB_GSPCA_TOUPTEK=m
CONFIG_USB_GSPCA_TV8532=m
CONFIG_USB_GSPCA_VC032X=m
CONFIG_USB_GSPCA_VICAM=m
CONFIG_USB_GSPCA_XIRLINK_CIT=m
CONFIG_USB_GSPCA_ZC3XX=m
CONFIG_USB_GL860=m
CONFIG_USB_M5602=m
CONFIG_USB_STV06XX=m
CONFIG_USB_PWC=m
# CONFIG_USB_PWC_DEBUG is not set
CONFIG_USB_PWC_INPUT_EVDEV=y
CONFIG_USB_S2255=m
CONFIG_VIDEO_USBTV=m
CONFIG_USB_VIDEO_CLASS=m
CONFIG_USB_VIDEO_CLASS_INPUT_EVDEV=y

#
# Analog TV USB devices
#
CONFIG_VIDEO_GO7007=m
CONFIG_VIDEO_GO7007_USB=m
CONFIG_VIDEO_GO7007_LOADER=m
CONFIG_VIDEO_GO7007_USB_S2250_BOARD=m
CONFIG_VIDEO_HDPVR=m
CONFIG_VIDEO_PVRUSB2=m
CONFIG_VIDEO_PVRUSB2_SYSFS=y
CONFIG_VIDEO_PVRUSB2_DVB=y
# CONFIG_VIDEO_PVRUSB2_DEBUGIFC is not set
CONFIG_VIDEO_STK1160=m

#
# Analog/digital TV USB devices
#
CONFIG_VIDEO_AU0828=m
CONFIG_VIDEO_AU0828_V4L2=y
CONFIG_VIDEO_AU0828_RC=y
CONFIG_VIDEO_CX231XX=m
CONFIG_VIDEO_CX231XX_RC=y
CONFIG_VIDEO_CX231XX_ALSA=m
CONFIG_VIDEO_CX231XX_DVB=m

#
# Digital TV USB devices
#
CONFIG_DVB_AS102=m
CONFIG_DVB_B2C2_FLEXCOP_USB=m
# CONFIG_DVB_B2C2_FLEXCOP_USB_DEBUG is not set
CONFIG_DVB_USB_V2=m
CONFIG_DVB_USB_AF9015=m
CONFIG_DVB_USB_AF9035=m
CONFIG_DVB_USB_ANYSEE=m
CONFIG_DVB_USB_AU6610=m
CONFIG_DVB_USB_AZ6007=m
CONFIG_DVB_USB_CE6230=m
CONFIG_DVB_USB_DVBSKY=m
CONFIG_DVB_USB_EC168=m
CONFIG_DVB_USB_GL861=m
CONFIG_DVB_USB_LME2510=m
CONFIG_DVB_USB_MXL111SF=m
CONFIG_DVB_USB_RTL28XXU=m
CONFIG_DVB_USB_ZD1301=m
CONFIG_DVB_USB=m
# CONFIG_DVB_USB_DEBUG is not set
CONFIG_DVB_USB_A800=m
CONFIG_DVB_USB_AF9005=m
CONFIG_DVB_USB_AF9005_REMOTE=m
CONFIG_DVB_USB_AZ6027=m
CONFIG_DVB_USB_CINERGY_T2=m
CONFIG_DVB_USB_CXUSB=m
CONFIG_DVB_USB_CXUSB_ANALOG=y
CONFIG_DVB_USB_DIB0700=m
CONFIG_DVB_USB_DIB3000MC=m
CONFIG_DVB_USB_DIBUSB_MB=m
# CONFIG_DVB_USB_DIBUSB_MB_FAULTY is not set
CONFIG_DVB_USB_DIBUSB_MC=m
CONFIG_DVB_USB_DIGITV=m
CONFIG_DVB_USB_DTT200U=m
CONFIG_DVB_USB_DTV5100=m
CONFIG_DVB_USB_DW2102=m
CONFIG_DVB_USB_GP8PSK=m
CONFIG_DVB_USB_M920X=m
CONFIG_DVB_USB_NOVA_T_USB2=m
CONFIG_DVB_USB_OPERA1=m
CONFIG_DVB_USB_PCTV452E=m
CONFIG_DVB_USB_TECHNISAT_USB2=m
CONFIG_DVB_USB_TTUSB2=m
CONFIG_DVB_USB_UMT_010=m
CONFIG_DVB_USB_VP702X=m
CONFIG_DVB_USB_VP7045=m
CONFIG_SMS_USB_DRV=m
CONFIG_DVB_TTUSB_BUDGET=m
CONFIG_DVB_TTUSB_DEC=m

#
# Webcam, TV (analog/digital) USB devices
#
CONFIG_VIDEO_EM28XX=m
CONFIG_VIDEO_EM28XX_V4L2=m
CONFIG_VIDEO_EM28XX_ALSA=m
CONFIG_VIDEO_EM28XX_DVB=m
CONFIG_VIDEO_EM28XX_RC=m

#
# Software defined radio USB devices
#
CONFIG_USB_AIRSPY=m
CONFIG_USB_HACKRF=m
CONFIG_USB_MSI2500=m
CONFIG_MEDIA_PCI_SUPPORT=y

#
# Media capture support
#
CONFIG_VIDEO_SOLO6X10=m
CONFIG_VIDEO_TW5864=m
CONFIG_VIDEO_TW68=m
CONFIG_VIDEO_TW686X=m
# CONFIG_VIDEO_ZORAN is not set

#
# Media capture/analog TV support
#
CONFIG_VIDEO_DT3155=m
CONFIG_VIDEO_IVTV=m
CONFIG_VIDEO_IVTV_ALSA=m
CONFIG_VIDEO_FB_IVTV=m
CONFIG_VIDEO_FB_IVTV_FORCE_PAT=y
# CONFIG_VIDEO_HEXIUM_GEMINI is not set
# CONFIG_VIDEO_HEXIUM_ORION is not set
# CONFIG_VIDEO_MXB is not set

#
# Media capture/analog/hybrid TV support
#
CONFIG_VIDEO_BT848=m
CONFIG_DVB_BT8XX=m
CONFIG_VIDEO_COBALT=m
CONFIG_VIDEO_CX18=m
CONFIG_VIDEO_CX18_ALSA=m
CONFIG_VIDEO_CX23885=m
CONFIG_MEDIA_ALTERA_CI=m
CONFIG_VIDEO_CX25821=m
CONFIG_VIDEO_CX25821_ALSA=m
CONFIG_VIDEO_CX88=m
CONFIG_VIDEO_CX88_ALSA=m
CONFIG_VIDEO_CX88_BLACKBIRD=m
CONFIG_VIDEO_CX88_DVB=m
CONFIG_VIDEO_CX88_ENABLE_VP3054=y
CONFIG_VIDEO_CX88_VP3054=m
CONFIG_VIDEO_CX88_MPEG=m
CONFIG_VIDEO_SAA7134=m
CONFIG_VIDEO_SAA7134_ALSA=m
CONFIG_VIDEO_SAA7134_RC=y
CONFIG_VIDEO_SAA7134_DVB=m
CONFIG_VIDEO_SAA7134_GO7007=m
CONFIG_VIDEO_SAA7164=m

#
# Media digital TV PCI Adapters
#
CONFIG_DVB_B2C2_FLEXCOP_PCI=m
# CONFIG_DVB_B2C2_FLEXCOP_PCI_DEBUG is not set
CONFIG_DVB_DDBRIDGE=m
# CONFIG_DVB_DDBRIDGE_MSIENABLE is not set
CONFIG_DVB_DM1105=m
CONFIG_MANTIS_CORE=m
CONFIG_DVB_MANTIS=m
CONFIG_DVB_HOPPER=m
CONFIG_DVB_NETUP_UNIDVB=m
CONFIG_DVB_NGENE=m
CONFIG_DVB_PLUTO2=m
CONFIG_DVB_PT1=m
CONFIG_DVB_PT3=m
CONFIG_DVB_SMIPCIE=m
# CONFIG_DVB_BUDGET_CORE is not set
# CONFIG_VIDEO_PCI_SKELETON is not set
CONFIG_IPU_BRIDGE=m
CONFIG_VIDEO_IPU3_CIO2=m
CONFIG_CIO2_BRIDGE=y
# CONFIG_INTEL_VSC is not set
CONFIG_RADIO_ADAPTERS=m
CONFIG_RADIO_MAXIRADIO=m
CONFIG_RADIO_SAA7706H=m
CONFIG_RADIO_SHARK=m
CONFIG_RADIO_SHARK2=m
CONFIG_RADIO_SI4713=m
CONFIG_RADIO_SI476X=m
CONFIG_RADIO_TEA575X=m
CONFIG_RADIO_TEA5764=m
CONFIG_RADIO_TEF6862=m
CONFIG_RADIO_WL1273=m
CONFIG_USB_DSBR=m
CONFIG_USB_KEENE=m
CONFIG_USB_MA901=m
CONFIG_USB_MR800=m
CONFIG_USB_RAREMONO=m
CONFIG_RADIO_SI470X=m
CONFIG_USB_SI470X=m
CONFIG_I2C_SI470X=m
CONFIG_USB_SI4713=m
CONFIG_PLATFORM_SI4713=m
CONFIG_I2C_SI4713=m
CONFIG_RADIO_WL128X=m
CONFIG_MEDIA_PLATFORM_DRIVERS=y
CONFIG_V4L_PLATFORM_DRIVERS=y
CONFIG_SDR_PLATFORM_DRIVERS=y
CONFIG_DVB_PLATFORM_DRIVERS=y
CONFIG_V4L_MEM2MEM_DRIVERS=y
CONFIG_VIDEO_MEM2MEM_DEINTERLACE=m

#
# Allegro DVT media platform drivers
#

#
# Amlogic media platform drivers
#

#
# Amphion drivers
#

#
# Aspeed media platform drivers
#

#
# Atmel media platform drivers
#

#
# Cadence media platform drivers
#
CONFIG_VIDEO_CADENCE_CSI2RX=m
CONFIG_VIDEO_CADENCE_CSI2TX=m

#
# Chips&Media media platform drivers
#

#
# Intel media platform drivers
#

#
# Marvell media platform drivers
#
CONFIG_VIDEO_CAFE_CCIC=m

#
# Mediatek media platform drivers
#

#
# Microchip Technology, Inc. media platform drivers
#

#
# NVidia media platform drivers
#

#
# NXP media platform drivers
#

#
# Qualcomm media platform drivers
#

#
# Renesas media platform drivers
#

#
# Rockchip media platform drivers
#

#
# Samsung media platform drivers
#

#
# STMicroelectronics media platform drivers
#

#
# Sunxi media platform drivers
#

#
# Texas Instruments drivers
#

#
# Verisilicon media platform drivers
#

#
# VIA media platform drivers
#
CONFIG_VIDEO_VIA_CAMERA=m

#
# Xilinx media platform drivers
#

#
# MMC/SDIO DVB adapters
#
CONFIG_SMS_SDIO_DRV=m
CONFIG_V4L_TEST_DRIVERS=y
CONFIG_VIDEO_VIM2M=m
CONFIG_VIDEO_VICODEC=m
CONFIG_VIDEO_VIMC=m
CONFIG_VIDEO_VIVID=m
CONFIG_VIDEO_VIVID_CEC=y
CONFIG_VIDEO_VIVID_MAX_DEVS=64
CONFIG_VIDEO_VISL=m
# CONFIG_VISL_DEBUGFS is not set
# CONFIG_DVB_TEST_DRIVERS is not set

#
# FireWire (IEEE 1394) Adapters
#
CONFIG_DVB_FIREDTV=m
CONFIG_DVB_FIREDTV_INPUT=y
CONFIG_MEDIA_COMMON_OPTIONS=y

#
# common driver options
#
CONFIG_CYPRESS_FIRMWARE=m
CONFIG_TTPCI_EEPROM=m
CONFIG_UVC_COMMON=m
CONFIG_VIDEO_CX2341X=m
CONFIG_VIDEO_TVEEPROM=m
CONFIG_DVB_B2C2_FLEXCOP=m
CONFIG_SMS_SIANO_MDTV=m
CONFIG_SMS_SIANO_RC=y
CONFIG_SMS_SIANO_DEBUGFS=y
CONFIG_VIDEO_V4L2_TPG=m
CONFIG_VIDEOBUF2_CORE=m
CONFIG_VIDEOBUF2_V4L2=m
CONFIG_VIDEOBUF2_MEMOPS=m
CONFIG_VIDEOBUF2_DMA_CONTIG=m
CONFIG_VIDEOBUF2_VMALLOC=m
CONFIG_VIDEOBUF2_DMA_SG=m
CONFIG_VIDEOBUF2_DVB=m
# end of Media drivers

#
# Media ancillary drivers
#
CONFIG_MEDIA_ATTACH=y

#
# IR I2C driver auto-selected by 'Autoselect ancillary drivers'
#
CONFIG_VIDEO_IR_I2C=m
CONFIG_VIDEO_CAMERA_SENSOR=y
CONFIG_VIDEO_APTINA_PLL=m
CONFIG_VIDEO_CCS_PLL=m
CONFIG_VIDEO_AR0521=m
CONFIG_VIDEO_HI556=m
CONFIG_VIDEO_HI846=m
CONFIG_VIDEO_HI847=m
CONFIG_VIDEO_IMX208=m
CONFIG_VIDEO_IMX214=m
CONFIG_VIDEO_IMX219=m
CONFIG_VIDEO_IMX258=m
CONFIG_VIDEO_IMX274=m
CONFIG_VIDEO_IMX290=m
# CONFIG_VIDEO_IMX296 is not set
CONFIG_VIDEO_IMX319=m
CONFIG_VIDEO_IMX355=m
CONFIG_VIDEO_MAX9271_LIB=m
CONFIG_VIDEO_MT9M001=m
CONFIG_VIDEO_MT9M111=m
CONFIG_VIDEO_MT9P031=m
CONFIG_VIDEO_MT9T112=m
CONFIG_VIDEO_MT9V011=m
CONFIG_VIDEO_MT9V032=m
CONFIG_VIDEO_MT9V111=m
CONFIG_VIDEO_OG01A1B=m
# CONFIG_VIDEO_OV01A10 is not set
CONFIG_VIDEO_OV02A10=m
CONFIG_VIDEO_OV08D10=m
CONFIG_VIDEO_OV08X40=m
CONFIG_VIDEO_OV13858=m
CONFIG_VIDEO_OV13B10=m
CONFIG_VIDEO_OV2640=m
CONFIG_VIDEO_OV2659=m
CONFIG_VIDEO_OV2680=m
CONFIG_VIDEO_OV2685=m
CONFIG_VIDEO_OV2740=m
CONFIG_VIDEO_OV4689=m
CONFIG_VIDEO_OV5647=m
CONFIG_VIDEO_OV5648=m
CONFIG_VIDEO_OV5670=m
CONFIG_VIDEO_OV5675=m
CONFIG_VIDEO_OV5693=m
CONFIG_VIDEO_OV5695=m
CONFIG_VIDEO_OV6650=m
CONFIG_VIDEO_OV7251=m
CONFIG_VIDEO_OV7640=m
CONFIG_VIDEO_OV7670=m
CONFIG_VIDEO_OV772X=m
CONFIG_VIDEO_OV7740=m
CONFIG_VIDEO_OV8856=m
# CONFIG_VIDEO_OV8858 is not set
CONFIG_VIDEO_OV8865=m
CONFIG_VIDEO_OV9640=m
CONFIG_VIDEO_OV9650=m
CONFIG_VIDEO_OV9734=m
CONFIG_VIDEO_RDACM20=m
CONFIG_VIDEO_RDACM21=m
CONFIG_VIDEO_RJ54N1=m
CONFIG_VIDEO_S5C73M3=m
CONFIG_VIDEO_S5K5BAF=m
CONFIG_VIDEO_S5K6A3=m
CONFIG_VIDEO_CCS=m
CONFIG_VIDEO_ET8EK8=m

#
# Lens drivers
#
CONFIG_VIDEO_AD5820=m
CONFIG_VIDEO_AK7375=m
CONFIG_VIDEO_DW9714=m
# CONFIG_VIDEO_DW9719 is not set
CONFIG_VIDEO_DW9768=m
CONFIG_VIDEO_DW9807_VCM=m
# end of Lens drivers

#
# Flash devices
#
CONFIG_VIDEO_ADP1653=m
CONFIG_VIDEO_LM3560=m
CONFIG_VIDEO_LM3646=m
# end of Flash devices

#
# Audio decoders, processors and mixers
#
CONFIG_VIDEO_CS3308=m
CONFIG_VIDEO_CS5345=m
CONFIG_VIDEO_CS53L32A=m
CONFIG_VIDEO_MSP3400=m
CONFIG_VIDEO_SONY_BTF_MPX=m
CONFIG_VIDEO_TDA1997X=m
CONFIG_VIDEO_TDA7432=m
CONFIG_VIDEO_TDA9840=m
CONFIG_VIDEO_TEA6415C=m
CONFIG_VIDEO_TEA6420=m
CONFIG_VIDEO_TLV320AIC23B=m
CONFIG_VIDEO_TVAUDIO=m
CONFIG_VIDEO_UDA1342=m
CONFIG_VIDEO_VP27SMPX=m
CONFIG_VIDEO_WM8739=m
CONFIG_VIDEO_WM8775=m
# end of Audio decoders, processors and mixers

#
# RDS decoders
#
CONFIG_VIDEO_SAA6588=m
# end of RDS decoders

#
# Video decoders
#
CONFIG_VIDEO_ADV7180=m
CONFIG_VIDEO_ADV7183=m
CONFIG_VIDEO_ADV7604=m
CONFIG_VIDEO_ADV7604_CEC=y
CONFIG_VIDEO_ADV7842=m
CONFIG_VIDEO_ADV7842_CEC=y
CONFIG_VIDEO_BT819=m
CONFIG_VIDEO_BT856=m
CONFIG_VIDEO_BT866=m
CONFIG_VIDEO_KS0127=m
CONFIG_VIDEO_ML86V7667=m
CONFIG_VIDEO_SAA7110=m
CONFIG_VIDEO_SAA711X=m
CONFIG_VIDEO_TC358743=m
CONFIG_VIDEO_TC358743_CEC=y
CONFIG_VIDEO_TC358746=m
CONFIG_VIDEO_TVP514X=m
CONFIG_VIDEO_TVP5150=m
CONFIG_VIDEO_TVP7002=m
CONFIG_VIDEO_TW2804=m
CONFIG_VIDEO_TW9903=m
CONFIG_VIDEO_TW9906=m
CONFIG_VIDEO_TW9910=m
CONFIG_VIDEO_VPX3220=m

#
# Video and audio decoders
#
CONFIG_VIDEO_SAA717X=m
CONFIG_VIDEO_CX25840=m
# end of Video decoders

#
# Video encoders
#
CONFIG_VIDEO_ADV7170=m
CONFIG_VIDEO_ADV7175=m
CONFIG_VIDEO_ADV7343=m
CONFIG_VIDEO_ADV7393=m
CONFIG_VIDEO_ADV7511=m
# CONFIG_VIDEO_ADV7511_CEC is not set
CONFIG_VIDEO_AK881X=m
CONFIG_VIDEO_SAA7127=m
CONFIG_VIDEO_SAA7185=m
CONFIG_VIDEO_THS8200=m
# end of Video encoders

#
# Video improvement chips
#
CONFIG_VIDEO_UPD64031A=m
CONFIG_VIDEO_UPD64083=m
# end of Video improvement chips

#
# Audio/Video compression chips
#
CONFIG_VIDEO_SAA6752HS=m
# end of Audio/Video compression chips

#
# SDR tuner chips
#
CONFIG_SDR_MAX2175=m
# end of SDR tuner chips

#
# Miscellaneous helper chips
#
CONFIG_VIDEO_I2C=m
CONFIG_VIDEO_M52790=m
CONFIG_VIDEO_ST_MIPID02=m
CONFIG_VIDEO_THS7303=m
# end of Miscellaneous helper chips

#
# Video serializers and deserializers
#
# end of Video serializers and deserializers

#
# Media SPI Adapters
#
CONFIG_CXD2880_SPI_DRV=m
CONFIG_VIDEO_GS1662=m
# end of Media SPI Adapters

CONFIG_MEDIA_TUNER=m

#
# Customize TV tuners
#
CONFIG_MEDIA_TUNER_E4000=m
CONFIG_MEDIA_TUNER_FC0011=m
CONFIG_MEDIA_TUNER_FC0012=m
CONFIG_MEDIA_TUNER_FC0013=m
CONFIG_MEDIA_TUNER_FC2580=m
CONFIG_MEDIA_TUNER_IT913X=m
CONFIG_MEDIA_TUNER_M88RS6000T=m
CONFIG_MEDIA_TUNER_MAX2165=m
CONFIG_MEDIA_TUNER_MC44S803=m
CONFIG_MEDIA_TUNER_MSI001=m
CONFIG_MEDIA_TUNER_MT2060=m
CONFIG_MEDIA_TUNER_MT2063=m
CONFIG_MEDIA_TUNER_MT20XX=m
CONFIG_MEDIA_TUNER_MT2131=m
CONFIG_MEDIA_TUNER_MT2266=m
CONFIG_MEDIA_TUNER_MXL301RF=m
CONFIG_MEDIA_TUNER_MXL5005S=m
CONFIG_MEDIA_TUNER_MXL5007T=m
CONFIG_MEDIA_TUNER_QM1D1B0004=m
CONFIG_MEDIA_TUNER_QM1D1C0042=m
CONFIG_MEDIA_TUNER_QT1010=m
CONFIG_MEDIA_TUNER_R820T=m
CONFIG_MEDIA_TUNER_SI2157=m
CONFIG_MEDIA_TUNER_SIMPLE=m
CONFIG_MEDIA_TUNER_TDA18212=m
CONFIG_MEDIA_TUNER_TDA18218=m
CONFIG_MEDIA_TUNER_TDA18250=m
CONFIG_MEDIA_TUNER_TDA18271=m
CONFIG_MEDIA_TUNER_TDA827X=m
CONFIG_MEDIA_TUNER_TDA8290=m
CONFIG_MEDIA_TUNER_TDA9887=m
CONFIG_MEDIA_TUNER_TEA5761=m
CONFIG_MEDIA_TUNER_TEA5767=m
CONFIG_MEDIA_TUNER_TUA9001=m
CONFIG_MEDIA_TUNER_XC2028=m
CONFIG_MEDIA_TUNER_XC4000=m
CONFIG_MEDIA_TUNER_XC5000=m
# end of Customize TV tuners

#
# Customise DVB Frontends
#

#
# Multistandard (satellite) frontends
#
CONFIG_DVB_M88DS3103=m
CONFIG_DVB_MXL5XX=m
CONFIG_DVB_STB0899=m
CONFIG_DVB_STB6100=m
CONFIG_DVB_STV090x=m
CONFIG_DVB_STV0910=m
CONFIG_DVB_STV6110x=m
CONFIG_DVB_STV6111=m

#
# Multistandard (cable + terrestrial) frontends
#
CONFIG_DVB_DRXK=m
CONFIG_DVB_MN88472=m
CONFIG_DVB_MN88473=m
CONFIG_DVB_SI2165=m
CONFIG_DVB_TDA18271C2DD=m

#
# DVB-S (satellite) frontends
#
CONFIG_DVB_CX24110=m
CONFIG_DVB_CX24116=m
CONFIG_DVB_CX24117=m
CONFIG_DVB_CX24120=m
CONFIG_DVB_CX24123=m
CONFIG_DVB_DS3000=m
CONFIG_DVB_MB86A16=m
CONFIG_DVB_MT312=m
CONFIG_DVB_S5H1420=m
CONFIG_DVB_SI21XX=m
CONFIG_DVB_STB6000=m
CONFIG_DVB_STV0288=m
CONFIG_DVB_STV0299=m
CONFIG_DVB_STV0900=m
CONFIG_DVB_STV6110=m
CONFIG_DVB_TDA10071=m
CONFIG_DVB_TDA10086=m
CONFIG_DVB_TDA8083=m
CONFIG_DVB_TDA8261=m
CONFIG_DVB_TDA826X=m
CONFIG_DVB_TS2020=m
CONFIG_DVB_TUA6100=m
CONFIG_DVB_TUNER_CX24113=m
CONFIG_DVB_TUNER_ITD1000=m
CONFIG_DVB_VES1X93=m
CONFIG_DVB_ZL10036=m
CONFIG_DVB_ZL10039=m

#
# DVB-T (terrestrial) frontends
#
CONFIG_DVB_AF9013=m
CONFIG_DVB_AS102_FE=m
CONFIG_DVB_CX22700=m
CONFIG_DVB_CX22702=m
CONFIG_DVB_CXD2820R=m
CONFIG_DVB_CXD2841ER=m
CONFIG_DVB_DIB3000MB=m
CONFIG_DVB_DIB3000MC=m
CONFIG_DVB_DIB7000M=m
CONFIG_DVB_DIB7000P=m
CONFIG_DVB_DIB9000=m
CONFIG_DVB_DRXD=m
CONFIG_DVB_EC100=m
CONFIG_DVB_GP8PSK_FE=m
CONFIG_DVB_L64781=m
CONFIG_DVB_MT352=m
CONFIG_DVB_NXT6000=m
CONFIG_DVB_RTL2830=m
CONFIG_DVB_RTL2832=m
CONFIG_DVB_RTL2832_SDR=m
CONFIG_DVB_S5H1432=m
CONFIG_DVB_SI2168=m
CONFIG_DVB_SP887X=m
CONFIG_DVB_STV0367=m
CONFIG_DVB_TDA10048=m
CONFIG_DVB_TDA1004X=m
CONFIG_DVB_ZD1301_DEMOD=m
CONFIG_DVB_ZL10353=m
CONFIG_DVB_CXD2880=m

#
# DVB-C (cable) frontends
#
CONFIG_DVB_STV0297=m
CONFIG_DVB_TDA10021=m
CONFIG_DVB_TDA10023=m
CONFIG_DVB_VES1820=m

#
# ATSC (North American/Korean Terrestrial/Cable DTV) frontends
#
CONFIG_DVB_AU8522=m
CONFIG_DVB_AU8522_DTV=m
CONFIG_DVB_AU8522_V4L=m
CONFIG_DVB_BCM3510=m
CONFIG_DVB_LG2160=m
CONFIG_DVB_LGDT3305=m
CONFIG_DVB_LGDT3306A=m
CONFIG_DVB_LGDT330X=m
CONFIG_DVB_MXL692=m
CONFIG_DVB_NXT200X=m
CONFIG_DVB_OR51132=m
CONFIG_DVB_OR51211=m
CONFIG_DVB_S5H1409=m
CONFIG_DVB_S5H1411=m

#
# ISDB-T (terrestrial) frontends
#
CONFIG_DVB_DIB8000=m
CONFIG_DVB_MB86A20S=m
CONFIG_DVB_S921=m

#
# ISDB-S (satellite) & ISDB-T (terrestrial) frontends
#
CONFIG_DVB_MN88443X=m
CONFIG_DVB_TC90522=m

#
# Digital terrestrial only tuners/PLL
#
CONFIG_DVB_PLL=m
CONFIG_DVB_TUNER_DIB0070=m
CONFIG_DVB_TUNER_DIB0090=m

#
# SEC control devices for DVB-S
#
CONFIG_DVB_A8293=m
CONFIG_DVB_AF9033=m
CONFIG_DVB_ASCOT2E=m
CONFIG_DVB_ATBM8830=m
CONFIG_DVB_HELENE=m
CONFIG_DVB_HORUS3A=m
CONFIG_DVB_ISL6405=m
CONFIG_DVB_ISL6421=m
CONFIG_DVB_ISL6423=m
CONFIG_DVB_IX2505V=m
CONFIG_DVB_LGS8GL5=m
CONFIG_DVB_LGS8GXX=m
CONFIG_DVB_LNBH25=m
CONFIG_DVB_LNBH29=m
CONFIG_DVB_LNBP21=m
CONFIG_DVB_LNBP22=m
CONFIG_DVB_M88RS2000=m
CONFIG_DVB_TDA665x=m
CONFIG_DVB_DRX39XYJ=m

#
# Common Interface (EN50221) controller drivers
#
CONFIG_DVB_CXD2099=m
CONFIG_DVB_SP2=m
# end of Customise DVB Frontends

#
# Tools to develop new frontends
#
CONFIG_DVB_DUMMY_FE=m
# end of Media ancillary drivers

#
# Graphics support
#
CONFIG_APERTURE_HELPERS=y
CONFIG_VIDEO_CMDLINE=y
CONFIG_VIDEO_NOMODESET=y
CONFIG_AUXDISPLAY=y
CONFIG_CHARLCD=m
CONFIG_LINEDISP=m
CONFIG_HD44780_COMMON=m
CONFIG_HD44780=m
CONFIG_KS0108=m
CONFIG_KS0108_PORT=0x378
CONFIG_KS0108_DELAY=2
CONFIG_CFAG12864B=m
CONFIG_CFAG12864B_RATE=20
CONFIG_IMG_ASCII_LCD=m
CONFIG_HT16K33=m
CONFIG_LCD2S=m
CONFIG_PARPORT_PANEL=m
CONFIG_PANEL_PARPORT=0
CONFIG_PANEL_PROFILE=5
# CONFIG_PANEL_CHANGE_MESSAGE is not set
# CONFIG_CHARLCD_BL_OFF is not set
# CONFIG_CHARLCD_BL_ON is not set
CONFIG_CHARLCD_BL_FLASH=y
CONFIG_PANEL=m
CONFIG_AGP=y
CONFIG_AGP_AMD64=y
CONFIG_AGP_INTEL=y
CONFIG_AGP_SIS=m
CONFIG_AGP_VIA=y
CONFIG_INTEL_GTT=y
CONFIG_VGA_SWITCHEROO=y
CONFIG_DRM=m
CONFIG_DRM_MIPI_DBI=m
CONFIG_DRM_MIPI_DSI=y
CONFIG_DRM_KMS_HELPER=m
# CONFIG_DRM_DEBUG_DP_MST_TOPOLOGY_REFS is not set
# CONFIG_DRM_DEBUG_MODESET_LOCK is not set
CONFIG_DRM_FBDEV_EMULATION=y
CONFIG_DRM_FBDEV_OVERALLOC=100
# CONFIG_DRM_FBDEV_LEAK_PHYS_SMEM is not set
CONFIG_DRM_LOAD_EDID_FIRMWARE=y
CONFIG_DRM_DISPLAY_HELPER=m
CONFIG_DRM_DISPLAY_DP_HELPER=y
CONFIG_DRM_DISPLAY_HDCP_HELPER=y
CONFIG_DRM_DISPLAY_HDMI_HELPER=y
CONFIG_DRM_DP_AUX_CHARDEV=y
CONFIG_DRM_DP_CEC=y
CONFIG_DRM_TTM=m
CONFIG_DRM_EXEC=m
CONFIG_DRM_BUDDY=m
CONFIG_DRM_VRAM_HELPER=m
CONFIG_DRM_TTM_HELPER=m
CONFIG_DRM_GEM_DMA_HELPER=m
CONFIG_DRM_GEM_SHMEM_HELPER=m
CONFIG_DRM_SUBALLOC_HELPER=m
CONFIG_DRM_SCHED=m

#
# I2C encoder or helper chips
#
CONFIG_DRM_I2C_CH7006=m
CONFIG_DRM_I2C_SIL164=m
CONFIG_DRM_I2C_NXP_TDA998X=m
CONFIG_DRM_I2C_NXP_TDA9950=m
# end of I2C encoder or helper chips

#
# ARM devices
#
# end of ARM devices

CONFIG_DRM_RADEON=m
# CONFIG_DRM_RADEON_USERPTR is not set
CONFIG_DRM_AMDGPU=m
CONFIG_DRM_AMDGPU_SI=y
CONFIG_DRM_AMDGPU_CIK=y
CONFIG_DRM_AMDGPU_USERPTR=y
# CONFIG_DRM_AMDGPU_WERROR is not set

#
# ACP (Audio CoProcessor) Configuration
#
CONFIG_DRM_AMD_ACP=y
# end of ACP (Audio CoProcessor) Configuration

#
# Display Engine Configuration
#
CONFIG_DRM_AMD_DC=y
CONFIG_DRM_AMD_DC_FP=y
CONFIG_DRM_AMD_DC_SI=y
# CONFIG_DEBUG_KERNEL_DC is not set
CONFIG_DRM_AMD_SECURE_DISPLAY=y
# end of Display Engine Configuration

CONFIG_HSA_AMD=y
CONFIG_HSA_AMD_SVM=y
CONFIG_HSA_AMD_P2P=y
CONFIG_DRM_NOUVEAU=m
CONFIG_NOUVEAU_DEBUG=5
CONFIG_NOUVEAU_DEBUG_DEFAULT=3
# CONFIG_NOUVEAU_DEBUG_MMU is not set
# CONFIG_NOUVEAU_DEBUG_PUSH is not set
CONFIG_DRM_NOUVEAU_BACKLIGHT=y
# CONFIG_DRM_NOUVEAU_SVM is not set
CONFIG_DRM_I915=m
CONFIG_DRM_I915_FORCE_PROBE=""
CONFIG_DRM_I915_CAPTURE_ERROR=y
CONFIG_DRM_I915_COMPRESS_ERROR=y
CONFIG_DRM_I915_USERPTR=y
CONFIG_DRM_I915_GVT_KVMGT=m
CONFIG_DRM_I915_PXP=y

#
# drm/i915 Debugging
#
# CONFIG_DRM_I915_WERROR is not set
# CONFIG_DRM_I915_DEBUG is not set
# CONFIG_DRM_I915_DEBUG_MMIO is not set
# CONFIG_DRM_I915_SW_FENCE_DEBUG_OBJECTS is not set
# CONFIG_DRM_I915_SW_FENCE_CHECK_DAG is not set
# CONFIG_DRM_I915_DEBUG_GUC is not set
# CONFIG_DRM_I915_SELFTEST is not set
# CONFIG_DRM_I915_LOW_LEVEL_TRACEPOINTS is not set
# CONFIG_DRM_I915_DEBUG_VBLANK_EVADE is not set
# CONFIG_DRM_I915_DEBUG_RUNTIME_PM is not set
# end of drm/i915 Debugging

#
# drm/i915 Profile Guided Optimisation
#
CONFIG_DRM_I915_REQUEST_TIMEOUT=20000
CONFIG_DRM_I915_FENCE_TIMEOUT=10000
CONFIG_DRM_I915_USERFAULT_AUTOSUSPEND=250
CONFIG_DRM_I915_HEARTBEAT_INTERVAL=2500
CONFIG_DRM_I915_PREEMPT_TIMEOUT=640
CONFIG_DRM_I915_PREEMPT_TIMEOUT_COMPUTE=7500
CONFIG_DRM_I915_MAX_REQUEST_BUSYWAIT=8000
CONFIG_DRM_I915_STOP_TIMEOUT=100
CONFIG_DRM_I915_TIMESLICE_DURATION=1
# end of drm/i915 Profile Guided Optimisation

CONFIG_DRM_I915_GVT=y
CONFIG_DRM_VGEM=m
CONFIG_DRM_VKMS=m
CONFIG_DRM_VMWGFX=m
# CONFIG_DRM_VMWGFX_MKSSTATS is not set
CONFIG_DRM_GMA500=m
CONFIG_DRM_UDL=m
CONFIG_DRM_AST=m
CONFIG_DRM_MGAG200=m
CONFIG_DRM_QXL=m
CONFIG_DRM_VIRTIO_GPU=m
CONFIG_DRM_VIRTIO_GPU_KMS=y
CONFIG_DRM_PANEL=y

#
# Display Panels
#
# CONFIG_DRM_PANEL_AUO_A030JTN01 is not set
# CONFIG_DRM_PANEL_ORISETECH_OTA5601A is not set
CONFIG_DRM_PANEL_RASPBERRYPI_TOUCHSCREEN=m
CONFIG_DRM_PANEL_WIDECHIPS_WS2401=m
# end of Display Panels

CONFIG_DRM_BRIDGE=y
CONFIG_DRM_PANEL_BRIDGE=y

#
# Display Interface Bridges
#
CONFIG_DRM_ANALOGIX_ANX78XX=m
CONFIG_DRM_ANALOGIX_DP=m
# end of Display Interface Bridges

# CONFIG_DRM_LOONGSON is not set
# CONFIG_DRM_ETNAVIV is not set
CONFIG_DRM_BOCHS=m
CONFIG_DRM_CIRRUS_QEMU=m
CONFIG_DRM_GM12U320=m
CONFIG_DRM_PANEL_MIPI_DBI=m
CONFIG_DRM_SIMPLEDRM=m
CONFIG_TINYDRM_HX8357D=m
CONFIG_TINYDRM_ILI9163=m
CONFIG_TINYDRM_ILI9225=m
CONFIG_TINYDRM_ILI9341=m
CONFIG_TINYDRM_ILI9486=m
CONFIG_TINYDRM_MI0283QT=m
CONFIG_TINYDRM_REPAPER=m
CONFIG_TINYDRM_ST7586=m
CONFIG_TINYDRM_ST7735R=m
CONFIG_DRM_XEN=y
CONFIG_DRM_XEN_FRONTEND=m
CONFIG_DRM_VBOXVIDEO=m
CONFIG_DRM_GUD=m
CONFIG_DRM_SSD130X=m
CONFIG_DRM_SSD130X_I2C=m
CONFIG_DRM_SSD130X_SPI=m
CONFIG_DRM_HYPERV=m
# CONFIG_DRM_LEGACY is not set
CONFIG_DRM_PANEL_ORIENTATION_QUIRKS=y
CONFIG_DRM_PRIVACY_SCREEN=y

#
# Frame buffer Devices
#
CONFIG_FB=y
CONFIG_FB_HECUBA=m
CONFIG_FB_SVGALIB=m
CONFIG_FB_CIRRUS=m
CONFIG_FB_PM2=m
CONFIG_FB_PM2_FIFO_DISCONNECT=y
CONFIG_FB_CYBER2000=m
CONFIG_FB_CYBER2000_DDC=y
CONFIG_FB_ARC=m
CONFIG_FB_ASILIANT=y
CONFIG_FB_IMSTT=y
CONFIG_FB_VGA16=m
CONFIG_FB_UVESA=m
CONFIG_FB_VESA=y
CONFIG_FB_EFI=y
CONFIG_FB_N411=m
CONFIG_FB_HGA=m
CONFIG_FB_OPENCORES=m
CONFIG_FB_S1D13XXX=m
CONFIG_FB_NVIDIA=m
CONFIG_FB_NVIDIA_I2C=y
# CONFIG_FB_NVIDIA_DEBUG is not set
CONFIG_FB_NVIDIA_BACKLIGHT=y
CONFIG_FB_RIVA=m
CONFIG_FB_RIVA_I2C=y
# CONFIG_FB_RIVA_DEBUG is not set
CONFIG_FB_RIVA_BACKLIGHT=y
CONFIG_FB_I740=m
CONFIG_FB_LE80578=m
CONFIG_FB_CARILLO_RANCH=m
CONFIG_FB_INTEL=m
# CONFIG_FB_INTEL_DEBUG is not set
CONFIG_FB_INTEL_I2C=y
CONFIG_FB_MATROX=m
CONFIG_FB_MATROX_MILLENIUM=y
CONFIG_FB_MATROX_MYSTIQUE=y
CONFIG_FB_MATROX_G=y
CONFIG_FB_MATROX_I2C=m
CONFIG_FB_MATROX_MAVEN=m
CONFIG_FB_RADEON=m
CONFIG_FB_RADEON_I2C=y
CONFIG_FB_RADEON_BACKLIGHT=y
# CONFIG_FB_RADEON_DEBUG is not set
CONFIG_FB_ATY128=m
CONFIG_FB_ATY128_BACKLIGHT=y
CONFIG_FB_ATY=m
CONFIG_FB_ATY_CT=y
# CONFIG_FB_ATY_GENERIC_LCD is not set
CONFIG_FB_ATY_GX=y
CONFIG_FB_ATY_BACKLIGHT=y
CONFIG_FB_S3=m
CONFIG_FB_S3_DDC=y
CONFIG_FB_SAVAGE=m
CONFIG_FB_SAVAGE_I2C=y
# CONFIG_FB_SAVAGE_ACCEL is not set
CONFIG_FB_SIS=m
CONFIG_FB_SIS_300=y
CONFIG_FB_SIS_315=y
CONFIG_FB_VIA=m
# CONFIG_FB_VIA_DIRECT_PROCFS is not set
CONFIG_FB_VIA_X_COMPATIBILITY=y
CONFIG_FB_NEOMAGIC=m
CONFIG_FB_KYRO=m
CONFIG_FB_3DFX=m
# CONFIG_FB_3DFX_ACCEL is not set
# CONFIG_FB_3DFX_I2C is not set
CONFIG_FB_VOODOO1=m
CONFIG_FB_VT8623=m
CONFIG_FB_TRIDENT=m
CONFIG_FB_ARK=m
CONFIG_FB_PM3=m
CONFIG_FB_CARMINE=m
CONFIG_FB_CARMINE_DRAM_EVAL=y
# CONFIG_CARMINE_DRAM_CUSTOM is not set
CONFIG_FB_SM501=m
CONFIG_FB_SMSCUFX=m
CONFIG_FB_UDL=m
# CONFIG_FB_IBM_GXT4500 is not set
# CONFIG_FB_VIRTUAL is not set
CONFIG_XEN_FBDEV_FRONTEND=m
CONFIG_FB_METRONOME=m
CONFIG_FB_MB862XX=m
CONFIG_FB_MB862XX_PCI_GDC=y
CONFIG_FB_MB862XX_I2C=y
CONFIG_FB_HYPERV=m
CONFIG_FB_SIMPLE=m
CONFIG_FB_SSD1307=m
CONFIG_FB_SM712=m
CONFIG_FB_CORE=y
CONFIG_FB_NOTIFY=y
CONFIG_FIRMWARE_EDID=y
CONFIG_FB_DEVICE=y
CONFIG_FB_DDC=m
CONFIG_FB_CFB_FILLRECT=y
CONFIG_FB_CFB_COPYAREA=y
CONFIG_FB_CFB_IMAGEBLIT=y
CONFIG_FB_SYS_FILLRECT=y
CONFIG_FB_SYS_COPYAREA=y
CONFIG_FB_SYS_IMAGEBLIT=y
# CONFIG_FB_FOREIGN_ENDIAN is not set
CONFIG_FB_SYS_FOPS=y
CONFIG_FB_DEFERRED_IO=y
CONFIG_FB_DMAMEM_HELPERS=y
CONFIG_FB_IOMEM_HELPERS=y
CONFIG_FB_SYSMEM_HELPERS=y
CONFIG_FB_SYSMEM_HELPERS_DEFERRED=y
CONFIG_FB_BACKLIGHT=m
CONFIG_FB_MODE_HELPERS=y
CONFIG_FB_TILEBLITTING=y
# end of Frame buffer Devices

#
# Backlight & LCD device support
#
CONFIG_LCD_CLASS_DEVICE=m
CONFIG_LCD_L4F00242T03=m
CONFIG_LCD_LMS283GF05=m
CONFIG_LCD_LTV350QV=m
CONFIG_LCD_ILI922X=m
CONFIG_LCD_ILI9320=m
CONFIG_LCD_TDO24M=m
CONFIG_LCD_VGG2432A4=m
CONFIG_LCD_PLATFORM=m
CONFIG_LCD_AMS369FG06=m
CONFIG_LCD_LMS501KF03=m
CONFIG_LCD_HX8357=m
CONFIG_LCD_OTM3225A=m
CONFIG_BACKLIGHT_CLASS_DEVICE=y
CONFIG_BACKLIGHT_KTD253=m
# CONFIG_BACKLIGHT_KTZ8866 is not set
CONFIG_BACKLIGHT_LM3533=m
CONFIG_BACKLIGHT_CARILLO_RANCH=m
CONFIG_BACKLIGHT_PWM=m
CONFIG_BACKLIGHT_DA903X=m
CONFIG_BACKLIGHT_DA9052=m
CONFIG_BACKLIGHT_MAX8925=m
CONFIG_BACKLIGHT_MT6370=m
CONFIG_BACKLIGHT_APPLE=m
CONFIG_BACKLIGHT_QCOM_WLED=m
CONFIG_BACKLIGHT_RT4831=m
CONFIG_BACKLIGHT_SAHARA=m
CONFIG_BACKLIGHT_WM831X=m
CONFIG_BACKLIGHT_ADP5520=m
CONFIG_BACKLIGHT_ADP8860=m
CONFIG_BACKLIGHT_ADP8870=m
CONFIG_BACKLIGHT_88PM860X=m
CONFIG_BACKLIGHT_PCF50633=m
CONFIG_BACKLIGHT_AAT2870=m
CONFIG_BACKLIGHT_LM3630A=m
CONFIG_BACKLIGHT_LM3639=m
CONFIG_BACKLIGHT_LP855X=m
CONFIG_BACKLIGHT_LP8788=m
CONFIG_BACKLIGHT_PANDORA=m
CONFIG_BACKLIGHT_SKY81452=m
CONFIG_BACKLIGHT_AS3711=m
CONFIG_BACKLIGHT_GPIO=m
CONFIG_BACKLIGHT_LV5207LP=m
CONFIG_BACKLIGHT_BD6107=m
CONFIG_BACKLIGHT_ARCXCNN=m
CONFIG_BACKLIGHT_RAVE_SP=m
# end of Backlight & LCD device support

CONFIG_VGASTATE=m
CONFIG_VIDEOMODE_HELPERS=y
CONFIG_HDMI=y

#
# Console display driver support
#
CONFIG_VGA_CONSOLE=y
CONFIG_DUMMY_CONSOLE=y
CONFIG_DUMMY_CONSOLE_COLUMNS=80
CONFIG_DUMMY_CONSOLE_ROWS=25
CONFIG_FRAMEBUFFER_CONSOLE=y
# CONFIG_FRAMEBUFFER_CONSOLE_LEGACY_ACCELERATION is not set
CONFIG_FRAMEBUFFER_CONSOLE_DETECT_PRIMARY=y
CONFIG_FRAMEBUFFER_CONSOLE_ROTATION=y
CONFIG_FRAMEBUFFER_CONSOLE_DEFERRED_TAKEOVER=y
# end of Console display driver support

# CONFIG_LOGO is not set
# end of Graphics support

CONFIG_DRM_ACCEL=y
# CONFIG_DRM_ACCEL_HABANALABS is not set
# CONFIG_DRM_ACCEL_IVPU is not set
# CONFIG_DRM_ACCEL_QAIC is not set
CONFIG_SOUND=m
CONFIG_SOUND_OSS_CORE=y
# CONFIG_SOUND_OSS_CORE_PRECLAIM is not set
CONFIG_SND=m
CONFIG_SND_TIMER=m
CONFIG_SND_PCM=m
CONFIG_SND_PCM_ELD=y
CONFIG_SND_PCM_IEC958=y
CONFIG_SND_DMAENGINE_PCM=m
CONFIG_SND_HWDEP=m
CONFIG_SND_SEQ_DEVICE=m
CONFIG_SND_RAWMIDI=m
CONFIG_SND_COMPRESS_OFFLOAD=m
CONFIG_SND_JACK=y
CONFIG_SND_JACK_INPUT_DEV=y
CONFIG_SND_OSSEMUL=y
CONFIG_SND_MIXER_OSS=m
# CONFIG_SND_PCM_OSS is not set
CONFIG_SND_PCM_TIMER=y
CONFIG_SND_HRTIMER=m
CONFIG_SND_DYNAMIC_MINORS=y
CONFIG_SND_MAX_CARDS=32
CONFIG_SND_SUPPORT_OLD_API=y
CONFIG_SND_PROC_FS=y
CONFIG_SND_VERBOSE_PROCFS=y
# CONFIG_SND_VERBOSE_PRINTK is not set
# CONFIG_SND_CTL_FAST_LOOKUP is not set
# CONFIG_SND_DEBUG is not set
# CONFIG_SND_CTL_INPUT_VALIDATION is not set
CONFIG_SND_VMASTER=y
CONFIG_SND_DMA_SGBUF=y
CONFIG_SND_CTL_LED=m
CONFIG_SND_SEQUENCER=m
CONFIG_SND_SEQ_DUMMY=m
# CONFIG_SND_SEQUENCER_OSS is not set
CONFIG_SND_SEQ_HRTIMER_DEFAULT=y
CONFIG_SND_SEQ_MIDI_EVENT=m
CONFIG_SND_SEQ_MIDI=m
CONFIG_SND_SEQ_MIDI_EMUL=m
CONFIG_SND_SEQ_VIRMIDI=m
# CONFIG_SND_SEQ_UMP is not set
CONFIG_SND_MPU401_UART=m
CONFIG_SND_OPL3_LIB=m
CONFIG_SND_OPL3_LIB_SEQ=m
CONFIG_SND_VX_LIB=m
CONFIG_SND_AC97_CODEC=m
CONFIG_SND_DRIVERS=y
CONFIG_SND_PCSP=m
CONFIG_SND_DUMMY=m
CONFIG_SND_ALOOP=m
# CONFIG_SND_PCMTEST is not set
CONFIG_SND_VIRMIDI=m
CONFIG_SND_MTPAV=m
CONFIG_SND_MTS64=m
CONFIG_SND_SERIAL_U16550=m
CONFIG_SND_MPU401=m
CONFIG_SND_PORTMAN2X4=m
CONFIG_SND_AC97_POWER_SAVE=y
CONFIG_SND_AC97_POWER_SAVE_DEFAULT=0
CONFIG_SND_SB_COMMON=m
CONFIG_SND_PCI=y
CONFIG_SND_AD1889=m
CONFIG_SND_ALS300=m
CONFIG_SND_ALS4000=m
CONFIG_SND_ALI5451=m
CONFIG_SND_ASIHPI=m
CONFIG_SND_ATIIXP=m
CONFIG_SND_ATIIXP_MODEM=m
CONFIG_SND_AU8810=m
CONFIG_SND_AU8820=m
CONFIG_SND_AU8830=m
CONFIG_SND_AW2=m
CONFIG_SND_AZT3328=m
CONFIG_SND_BT87X=m
# CONFIG_SND_BT87X_OVERCLOCK is not set
CONFIG_SND_CA0106=m
CONFIG_SND_CMIPCI=m
CONFIG_SND_OXYGEN_LIB=m
CONFIG_SND_OXYGEN=m
CONFIG_SND_CS4281=m
CONFIG_SND_CS46XX=m
CONFIG_SND_CS46XX_NEW_DSP=y
CONFIG_SND_CTXFI=m
CONFIG_SND_DARLA20=m
CONFIG_SND_GINA20=m
CONFIG_SND_LAYLA20=m
CONFIG_SND_DARLA24=m
CONFIG_SND_GINA24=m
CONFIG_SND_LAYLA24=m
CONFIG_SND_MONA=m
CONFIG_SND_MIA=m
CONFIG_SND_ECHO3G=m
CONFIG_SND_INDIGO=m
CONFIG_SND_INDIGOIO=m
CONFIG_SND_INDIGODJ=m
CONFIG_SND_INDIGOIOX=m
CONFIG_SND_INDIGODJX=m
CONFIG_SND_EMU10K1=m
CONFIG_SND_EMU10K1_SEQ=m
CONFIG_SND_EMU10K1X=m
CONFIG_SND_ENS1370=m
CONFIG_SND_ENS1371=m
CONFIG_SND_ES1938=m
CONFIG_SND_ES1968=m
CONFIG_SND_ES1968_INPUT=y
CONFIG_SND_ES1968_RADIO=y
CONFIG_SND_FM801=m
CONFIG_SND_FM801_TEA575X_BOOL=y
CONFIG_SND_HDSP=m
CONFIG_SND_HDSPM=m
CONFIG_SND_ICE1712=m
CONFIG_SND_ICE1724=m
CONFIG_SND_INTEL8X0=m
CONFIG_SND_INTEL8X0M=m
CONFIG_SND_KORG1212=m
CONFIG_SND_LOLA=m
CONFIG_SND_LX6464ES=m
CONFIG_SND_MAESTRO3=m
CONFIG_SND_MAESTRO3_INPUT=y
CONFIG_SND_MIXART=m
CONFIG_SND_NM256=m
CONFIG_SND_PCXHR=m
CONFIG_SND_RIPTIDE=m
CONFIG_SND_RME32=m
CONFIG_SND_RME96=m
CONFIG_SND_RME9652=m
CONFIG_SND_SONICVIBES=m
CONFIG_SND_TRIDENT=m
CONFIG_SND_VIA82XX=m
CONFIG_SND_VIA82XX_MODEM=m
CONFIG_SND_VIRTUOSO=m
CONFIG_SND_VX222=m
CONFIG_SND_YMFPCI=m

#
# HD-Audio
#
CONFIG_SND_HDA=m
CONFIG_SND_HDA_GENERIC_LEDS=y
CONFIG_SND_HDA_INTEL=m
CONFIG_SND_HDA_HWDEP=y
CONFIG_SND_HDA_RECONFIG=y
CONFIG_SND_HDA_INPUT_BEEP=y
CONFIG_SND_HDA_INPUT_BEEP_MODE=0
CONFIG_SND_HDA_PATCH_LOADER=y
CONFIG_SND_HDA_SCODEC_CS35L41=m
CONFIG_SND_HDA_CS_DSP_CONTROLS=m
CONFIG_SND_HDA_SCODEC_CS35L41_I2C=m
CONFIG_SND_HDA_SCODEC_CS35L41_SPI=m
# CONFIG_SND_HDA_SCODEC_CS35L56_I2C is not set
# CONFIG_SND_HDA_SCODEC_CS35L56_SPI is not set
# CONFIG_SND_HDA_SCODEC_TAS2781_I2C is not set
CONFIG_SND_HDA_CODEC_REALTEK=m
CONFIG_SND_HDA_CODEC_ANALOG=m
CONFIG_SND_HDA_CODEC_SIGMATEL=m
CONFIG_SND_HDA_CODEC_VIA=m
CONFIG_SND_HDA_CODEC_HDMI=m
CONFIG_SND_HDA_CODEC_CIRRUS=m
CONFIG_SND_HDA_CODEC_CS8409=m
CONFIG_SND_HDA_CODEC_CONEXANT=m
CONFIG_SND_HDA_CODEC_CA0110=m
CONFIG_SND_HDA_CODEC_CA0132=m
CONFIG_SND_HDA_CODEC_CA0132_DSP=y
CONFIG_SND_HDA_CODEC_CMEDIA=m
CONFIG_SND_HDA_CODEC_SI3054=m
CONFIG_SND_HDA_GENERIC=m
CONFIG_SND_HDA_POWER_SAVE_DEFAULT=1
CONFIG_SND_HDA_INTEL_HDMI_SILENT_STREAM=y
# CONFIG_SND_HDA_CTL_DEV_ID is not set
# end of HD-Audio

CONFIG_SND_HDA_CORE=m
CONFIG_SND_HDA_DSP_LOADER=y
CONFIG_SND_HDA_COMPONENT=y
CONFIG_SND_HDA_I915=y
CONFIG_SND_HDA_EXT_CORE=m
CONFIG_SND_HDA_PREALLOC_SIZE=0
CONFIG_SND_INTEL_NHLT=y
CONFIG_SND_INTEL_DSP_CONFIG=m
CONFIG_SND_INTEL_SOUNDWIRE_ACPI=m
CONFIG_SND_INTEL_BYT_PREFER_SOF=y
CONFIG_SND_SPI=y
CONFIG_SND_USB=y
CONFIG_SND_USB_AUDIO=m
# CONFIG_SND_USB_AUDIO_MIDI_V2 is not set
CONFIG_SND_USB_AUDIO_USE_MEDIA_CONTROLLER=y
CONFIG_SND_USB_UA101=m
CONFIG_SND_USB_USX2Y=m
CONFIG_SND_USB_CAIAQ=m
CONFIG_SND_USB_CAIAQ_INPUT=y
CONFIG_SND_USB_US122L=m
CONFIG_SND_USB_6FIRE=m
CONFIG_SND_USB_HIFACE=m
CONFIG_SND_BCD2000=m
CONFIG_SND_USB_LINE6=m
CONFIG_SND_USB_POD=m
CONFIG_SND_USB_PODHD=m
CONFIG_SND_USB_TONEPORT=m
CONFIG_SND_USB_VARIAX=m
CONFIG_SND_FIREWIRE=y
CONFIG_SND_FIREWIRE_LIB=m
CONFIG_SND_DICE=m
CONFIG_SND_OXFW=m
CONFIG_SND_ISIGHT=m
CONFIG_SND_FIREWORKS=m
CONFIG_SND_BEBOB=m
CONFIG_SND_FIREWIRE_DIGI00X=m
CONFIG_SND_FIREWIRE_TASCAM=m
CONFIG_SND_FIREWIRE_MOTU=m
CONFIG_SND_FIREFACE=m
CONFIG_SND_PCMCIA=y
CONFIG_SND_VXPOCKET=m
CONFIG_SND_PDAUDIOCF=m
CONFIG_SND_SOC=m
CONFIG_SND_SOC_AC97_BUS=y
CONFIG_SND_SOC_GENERIC_DMAENGINE_PCM=y
CONFIG_SND_SOC_COMPRESS=y
CONFIG_SND_SOC_TOPOLOGY=y
CONFIG_SND_SOC_ACPI=m
CONFIG_SND_SOC_ADI=m
CONFIG_SND_SOC_ADI_AXI_I2S=m
CONFIG_SND_SOC_ADI_AXI_SPDIF=m
CONFIG_SND_SOC_AMD_ACP=m
CONFIG_SND_SOC_AMD_CZ_DA7219MX98357_MACH=m
CONFIG_SND_SOC_AMD_CZ_RT5645_MACH=m
CONFIG_SND_SOC_AMD_ST_ES8336_MACH=m
CONFIG_SND_SOC_AMD_ACP3x=m
CONFIG_SND_SOC_AMD_RV_RT5682_MACH=m
CONFIG_SND_SOC_AMD_RENOIR=m
CONFIG_SND_SOC_AMD_RENOIR_MACH=m
CONFIG_SND_SOC_AMD_ACP5x=m
CONFIG_SND_SOC_AMD_VANGOGH_MACH=m
CONFIG_SND_SOC_AMD_ACP6x=m
CONFIG_SND_SOC_AMD_YC_MACH=m
CONFIG_SND_AMD_ACP_CONFIG=m
CONFIG_SND_SOC_AMD_ACP_COMMON=m
CONFIG_SND_SOC_AMD_ACP_PDM=m
CONFIG_SND_SOC_AMD_ACP_LEGACY_COMMON=m
CONFIG_SND_SOC_AMD_ACP_I2S=m
CONFIG_SND_SOC_AMD_ACP_PCM=m
CONFIG_SND_SOC_AMD_ACP_PCI=m
CONFIG_SND_AMD_ASOC_RENOIR=m
CONFIG_SND_AMD_ASOC_REMBRANDT=m
CONFIG_SND_SOC_AMD_MACH_COMMON=m
CONFIG_SND_SOC_AMD_LEGACY_MACH=m
CONFIG_SND_SOC_AMD_SOF_MACH=m
CONFIG_SND_SOC_AMD_RPL_ACP6x=m
CONFIG_SND_SOC_AMD_PS=m
CONFIG_SND_SOC_AMD_PS_MACH=m
CONFIG_SND_ATMEL_SOC=m
CONFIG_SND_BCM63XX_I2S_WHISTLER=m
CONFIG_SND_DESIGNWARE_I2S=m
CONFIG_SND_DESIGNWARE_PCM=y

#
# SoC Audio for Freescale CPUs
#

#
# Common SoC Audio options for Freescale CPUs:
#
CONFIG_SND_SOC_FSL_ASRC=m
CONFIG_SND_SOC_FSL_SAI=m
CONFIG_SND_SOC_FSL_MQS=m
CONFIG_SND_SOC_FSL_AUDMIX=m
CONFIG_SND_SOC_FSL_SSI=m
CONFIG_SND_SOC_FSL_SPDIF=m
CONFIG_SND_SOC_FSL_ESAI=m
CONFIG_SND_SOC_FSL_MICFIL=m
CONFIG_SND_SOC_FSL_EASRC=m
CONFIG_SND_SOC_FSL_XCVR=m
CONFIG_SND_SOC_FSL_UTILS=m
CONFIG_SND_SOC_FSL_RPMSG=m
CONFIG_SND_SOC_IMX_AUDMUX=m
# end of SoC Audio for Freescale CPUs

# CONFIG_SND_SOC_CHV3_I2S is not set
CONFIG_SND_I2S_HI6210_I2S=m
CONFIG_SND_SOC_IMG=y
CONFIG_SND_SOC_IMG_I2S_IN=m
CONFIG_SND_SOC_IMG_I2S_OUT=m
CONFIG_SND_SOC_IMG_PARALLEL_OUT=m
CONFIG_SND_SOC_IMG_SPDIF_IN=m
CONFIG_SND_SOC_IMG_SPDIF_OUT=m
CONFIG_SND_SOC_IMG_PISTACHIO_INTERNAL_DAC=m
CONFIG_SND_SOC_INTEL_SST_TOPLEVEL=y
CONFIG_SND_SOC_INTEL_SST=m
CONFIG_SND_SOC_INTEL_CATPT=m
CONFIG_SND_SST_ATOM_HIFI2_PLATFORM=m
CONFIG_SND_SST_ATOM_HIFI2_PLATFORM_PCI=m
CONFIG_SND_SST_ATOM_HIFI2_PLATFORM_ACPI=m
# CONFIG_SND_SOC_INTEL_SKYLAKE is not set
CONFIG_SND_SOC_INTEL_SKL=m
CONFIG_SND_SOC_INTEL_APL=m
CONFIG_SND_SOC_INTEL_KBL=m
CONFIG_SND_SOC_INTEL_GLK=m
# CONFIG_SND_SOC_INTEL_CNL is not set
# CONFIG_SND_SOC_INTEL_CFL is not set
# CONFIG_SND_SOC_INTEL_CML_H is not set
# CONFIG_SND_SOC_INTEL_CML_LP is not set
CONFIG_SND_SOC_INTEL_SKYLAKE_FAMILY=m
CONFIG_SND_SOC_INTEL_SKYLAKE_SSP_CLK=m
CONFIG_SND_SOC_INTEL_SKYLAKE_HDAUDIO_CODEC=y
CONFIG_SND_SOC_INTEL_SKYLAKE_COMMON=m
CONFIG_SND_SOC_ACPI_INTEL_MATCH=m
CONFIG_SND_SOC_INTEL_AVS=m

#
# Intel AVS Machine drivers
#

#
# Available DSP configurations
#
CONFIG_SND_SOC_INTEL_AVS_MACH_DA7219=m
CONFIG_SND_SOC_INTEL_AVS_MACH_DMIC=m
# CONFIG_SND_SOC_INTEL_AVS_MACH_ES8336 is not set
CONFIG_SND_SOC_INTEL_AVS_MACH_HDAUDIO=m
CONFIG_SND_SOC_INTEL_AVS_MACH_I2S_TEST=m
CONFIG_SND_SOC_INTEL_AVS_MACH_MAX98927=m
CONFIG_SND_SOC_INTEL_AVS_MACH_MAX98357A=m
CONFIG_SND_SOC_INTEL_AVS_MACH_MAX98373=m
CONFIG_SND_SOC_INTEL_AVS_MACH_NAU8825=m
CONFIG_SND_SOC_INTEL_AVS_MACH_PROBE=m
CONFIG_SND_SOC_INTEL_AVS_MACH_RT274=m
CONFIG_SND_SOC_INTEL_AVS_MACH_RT286=m
CONFIG_SND_SOC_INTEL_AVS_MACH_RT298=m
# CONFIG_SND_SOC_INTEL_AVS_MACH_RT5663 is not set
CONFIG_SND_SOC_INTEL_AVS_MACH_RT5682=m
CONFIG_SND_SOC_INTEL_AVS_MACH_SSM4567=m
# end of Intel AVS Machine drivers

CONFIG_SND_SOC_INTEL_MACH=y
CONFIG_SND_SOC_INTEL_USER_FRIENDLY_LONG_NAMES=y
CONFIG_SND_SOC_INTEL_HDA_DSP_COMMON=m
CONFIG_SND_SOC_INTEL_SOF_MAXIM_COMMON=m
CONFIG_SND_SOC_INTEL_SOF_REALTEK_COMMON=m
CONFIG_SND_SOC_INTEL_SOF_CIRRUS_COMMON=m
CONFIG_SND_SOC_INTEL_HASWELL_MACH=m
CONFIG_SND_SOC_INTEL_BDW_RT5650_MACH=m
CONFIG_SND_SOC_INTEL_BDW_RT5677_MACH=m
CONFIG_SND_SOC_INTEL_BROADWELL_MACH=m
CONFIG_SND_SOC_INTEL_BYTCR_RT5640_MACH=m
CONFIG_SND_SOC_INTEL_BYTCR_RT5651_MACH=m
CONFIG_SND_SOC_INTEL_BYTCR_WM5102_MACH=m
CONFIG_SND_SOC_INTEL_CHT_BSW_RT5672_MACH=m
CONFIG_SND_SOC_INTEL_CHT_BSW_RT5645_MACH=m
CONFIG_SND_SOC_INTEL_CHT_BSW_MAX98090_TI_MACH=m
CONFIG_SND_SOC_INTEL_CHT_BSW_NAU8824_MACH=m
CONFIG_SND_SOC_INTEL_BYT_CHT_CX2072X_MACH=m
CONFIG_SND_SOC_INTEL_BYT_CHT_DA7213_MACH=m
CONFIG_SND_SOC_INTEL_BYT_CHT_ES8316_MACH=m
# CONFIG_SND_SOC_INTEL_BYT_CHT_NOCODEC_MACH is not set
CONFIG_SND_SOC_INTEL_SKL_RT286_MACH=m
CONFIG_SND_SOC_INTEL_SKL_NAU88L25_SSM4567_MACH=m
CONFIG_SND_SOC_INTEL_SKL_NAU88L25_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_DA7219_MAX98357A_GENERIC=m
CONFIG_SND_SOC_INTEL_BXT_DA7219_MAX98357A_COMMON=m
CONFIG_SND_SOC_INTEL_BXT_DA7219_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_BXT_RT298_MACH=m
CONFIG_SND_SOC_INTEL_SOF_WM8804_MACH=m
CONFIG_SND_SOC_INTEL_KBL_RT5663_MAX98927_MACH=m
CONFIG_SND_SOC_INTEL_KBL_RT5663_RT5514_MAX98927_MACH=m
CONFIG_SND_SOC_INTEL_KBL_DA7219_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_KBL_DA7219_MAX98927_MACH=m
CONFIG_SND_SOC_INTEL_KBL_RT5660_MACH=m
CONFIG_SND_SOC_INTEL_GLK_DA7219_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_GLK_RT5682_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_SKL_HDA_DSP_GENERIC_MACH=m
CONFIG_SND_SOC_INTEL_SOF_RT5682_MACH=m
CONFIG_SND_SOC_INTEL_SOF_CS42L42_MACH=m
CONFIG_SND_SOC_INTEL_SOF_PCM512x_MACH=m
CONFIG_SND_SOC_INTEL_SOF_ES8336_MACH=m
CONFIG_SND_SOC_INTEL_SOF_NAU8825_MACH=m
CONFIG_SND_SOC_INTEL_CML_LP_DA7219_MAX98357A_MACH=m
CONFIG_SND_SOC_INTEL_SOF_CML_RT1011_RT5682_MACH=m
CONFIG_SND_SOC_INTEL_SOF_DA7219_MAX98373_MACH=m
CONFIG_SND_SOC_INTEL_SOF_SSP_AMP_MACH=m
CONFIG_SND_SOC_INTEL_EHL_RT5660_MACH=m
CONFIG_SND_SOC_INTEL_SOUNDWIRE_SOF_MACH=m
CONFIG_SND_SOC_MTK_BTCVSD=m
CONFIG_SND_SOC_SOF_TOPLEVEL=y
CONFIG_SND_SOC_SOF_PCI_DEV=m
CONFIG_SND_SOC_SOF_PCI=m
CONFIG_SND_SOC_SOF_ACPI=m
CONFIG_SND_SOC_SOF_ACPI_DEV=m
CONFIG_SND_SOC_SOF_DEBUG_PROBES=m
CONFIG_SND_SOC_SOF_CLIENT=m
# CONFIG_SND_SOC_SOF_DEVELOPER_SUPPORT is not set
CONFIG_SND_SOC_SOF=m
CONFIG_SND_SOC_SOF_PROBE_WORK_QUEUE=y
CONFIG_SND_SOC_SOF_IPC3=y
CONFIG_SND_SOC_SOF_INTEL_IPC4=y
CONFIG_SND_SOC_SOF_AMD_TOPLEVEL=m
CONFIG_SND_SOC_SOF_AMD_COMMON=m
CONFIG_SND_SOC_SOF_AMD_RENOIR=m
# CONFIG_SND_SOC_SOF_AMD_VANGOGH is not set
CONFIG_SND_SOC_SOF_AMD_REMBRANDT=m
CONFIG_SND_SOC_SOF_ACP_PROBES=m
CONFIG_SND_SOC_SOF_INTEL_TOPLEVEL=y
CONFIG_SND_SOC_SOF_INTEL_HIFI_EP_IPC=m
CONFIG_SND_SOC_SOF_INTEL_ATOM_HIFI_EP=m
CONFIG_SND_SOC_SOF_INTEL_COMMON=m
CONFIG_SND_SOC_SOF_BAYTRAIL=m
CONFIG_SND_SOC_SOF_BROADWELL=m
CONFIG_SND_SOC_SOF_MERRIFIELD=m
CONFIG_SND_SOC_SOF_INTEL_SKL=m
CONFIG_SND_SOC_SOF_SKYLAKE=m
CONFIG_SND_SOC_SOF_KABYLAKE=m
CONFIG_SND_SOC_SOF_INTEL_APL=m
CONFIG_SND_SOC_SOF_APOLLOLAKE=m
CONFIG_SND_SOC_SOF_GEMINILAKE=m
CONFIG_SND_SOC_SOF_INTEL_CNL=m
CONFIG_SND_SOC_SOF_CANNONLAKE=m
CONFIG_SND_SOC_SOF_COFFEELAKE=m
CONFIG_SND_SOC_SOF_COMETLAKE=m
CONFIG_SND_SOC_SOF_INTEL_ICL=m
CONFIG_SND_SOC_SOF_ICELAKE=m
CONFIG_SND_SOC_SOF_JASPERLAKE=m
CONFIG_SND_SOC_SOF_INTEL_TGL=m
CONFIG_SND_SOC_SOF_TIGERLAKE=m
CONFIG_SND_SOC_SOF_ELKHARTLAKE=m
CONFIG_SND_SOC_SOF_ALDERLAKE=m
CONFIG_SND_SOC_SOF_INTEL_MTL=m
CONFIG_SND_SOC_SOF_METEORLAKE=m
CONFIG_SND_SOC_SOF_INTEL_LNL=m
CONFIG_SND_SOC_SOF_LUNARLAKE=m
CONFIG_SND_SOC_SOF_HDA_COMMON=m
CONFIG_SND_SOC_SOF_HDA_MLINK=m
CONFIG_SND_SOC_SOF_HDA_LINK=y
CONFIG_SND_SOC_SOF_HDA_AUDIO_CODEC=y
CONFIG_SND_SOC_SOF_HDA_LINK_BASELINE=m
CONFIG_SND_SOC_SOF_HDA=m
CONFIG_SND_SOC_SOF_HDA_PROBES=m
CONFIG_SND_SOC_SOF_INTEL_SOUNDWIRE_LINK_BASELINE=m
CONFIG_SND_SOC_SOF_INTEL_SOUNDWIRE=m
CONFIG_SND_SOC_SOF_XTENSA=m

#
# STMicroelectronics STM32 SOC audio support
#
# end of STMicroelectronics STM32 SOC audio support

CONFIG_SND_SOC_XILINX_I2S=m
CONFIG_SND_SOC_XILINX_AUDIO_FORMATTER=m
CONFIG_SND_SOC_XILINX_SPDIF=m
CONFIG_SND_SOC_XTFPGA_I2S=m
CONFIG_SND_SOC_I2C_AND_SPI=m

#
# CODEC drivers
#
CONFIG_SND_SOC_ARIZONA=m
CONFIG_SND_SOC_WM_ADSP=m
CONFIG_SND_SOC_AC97_CODEC=m
CONFIG_SND_SOC_ADAU_UTILS=m
CONFIG_SND_SOC_ADAU1372=m
CONFIG_SND_SOC_ADAU1372_I2C=m
CONFIG_SND_SOC_ADAU1372_SPI=m
CONFIG_SND_SOC_ADAU1701=m
CONFIG_SND_SOC_ADAU17X1=m
CONFIG_SND_SOC_ADAU1761=m
CONFIG_SND_SOC_ADAU1761_I2C=m
CONFIG_SND_SOC_ADAU1761_SPI=m
CONFIG_SND_SOC_ADAU7002=m
CONFIG_SND_SOC_ADAU7118=m
CONFIG_SND_SOC_ADAU7118_HW=m
CONFIG_SND_SOC_ADAU7118_I2C=m
CONFIG_SND_SOC_AK4104=m
CONFIG_SND_SOC_AK4118=m
CONFIG_SND_SOC_AK4375=m
CONFIG_SND_SOC_AK4458=m
CONFIG_SND_SOC_AK4554=m
CONFIG_SND_SOC_AK4613=m
CONFIG_SND_SOC_AK4642=m
CONFIG_SND_SOC_AK5386=m
CONFIG_SND_SOC_AK5558=m
CONFIG_SND_SOC_ALC5623=m
# CONFIG_SND_SOC_AUDIO_IIO_AUX is not set
CONFIG_SND_SOC_AW8738=m
# CONFIG_SND_SOC_AW88395 is not set
# CONFIG_SND_SOC_AW88261 is not set
CONFIG_SND_SOC_BD28623=m
CONFIG_SND_SOC_BT_SCO=m
# CONFIG_SND_SOC_CHV3_CODEC is not set
CONFIG_SND_SOC_CROS_EC_CODEC=m
CONFIG_SND_SOC_CS35L32=m
CONFIG_SND_SOC_CS35L33=m
CONFIG_SND_SOC_CS35L34=m
CONFIG_SND_SOC_CS35L35=m
CONFIG_SND_SOC_CS35L36=m
CONFIG_SND_SOC_CS35L41_LIB=m
CONFIG_SND_SOC_CS35L41=m
CONFIG_SND_SOC_CS35L41_SPI=m
CONFIG_SND_SOC_CS35L41_I2C=m
CONFIG_SND_SOC_CS35L45=m
CONFIG_SND_SOC_CS35L45_SPI=m
CONFIG_SND_SOC_CS35L45_I2C=m
CONFIG_SND_SOC_CS35L56=m
CONFIG_SND_SOC_CS35L56_SHARED=m
# CONFIG_SND_SOC_CS35L56_I2C is not set
# CONFIG_SND_SOC_CS35L56_SPI is not set
CONFIG_SND_SOC_CS35L56_SDW=m
CONFIG_SND_SOC_CS42L42_CORE=m
CONFIG_SND_SOC_CS42L42=m
CONFIG_SND_SOC_CS42L42_SDW=m
CONFIG_SND_SOC_CS42L51=m
CONFIG_SND_SOC_CS42L51_I2C=m
CONFIG_SND_SOC_CS42L52=m
CONFIG_SND_SOC_CS42L56=m
CONFIG_SND_SOC_CS42L73=m
CONFIG_SND_SOC_CS42L83=m
CONFIG_SND_SOC_CS4234=m
CONFIG_SND_SOC_CS4265=m
CONFIG_SND_SOC_CS4270=m
CONFIG_SND_SOC_CS4271=m
CONFIG_SND_SOC_CS4271_I2C=m
CONFIG_SND_SOC_CS4271_SPI=m
CONFIG_SND_SOC_CS42XX8=m
CONFIG_SND_SOC_CS42XX8_I2C=m
CONFIG_SND_SOC_CS43130=m
CONFIG_SND_SOC_CS4341=m
CONFIG_SND_SOC_CS4349=m
CONFIG_SND_SOC_CS53L30=m
CONFIG_SND_SOC_CX2072X=m
CONFIG_SND_SOC_DA7213=m
CONFIG_SND_SOC_DA7219=m
CONFIG_SND_SOC_DMIC=m
CONFIG_SND_SOC_HDMI_CODEC=m
CONFIG_SND_SOC_ES7134=m
CONFIG_SND_SOC_ES7241=m
CONFIG_SND_SOC_ES8316=m
CONFIG_SND_SOC_ES8326=m
CONFIG_SND_SOC_ES8328=m
CONFIG_SND_SOC_ES8328_I2C=m
CONFIG_SND_SOC_ES8328_SPI=m
CONFIG_SND_SOC_GTM601=m
CONFIG_SND_SOC_HDAC_HDMI=m
CONFIG_SND_SOC_HDAC_HDA=m
CONFIG_SND_SOC_HDA=m
CONFIG_SND_SOC_ICS43432=m
# CONFIG_SND_SOC_IDT821034 is not set
CONFIG_SND_SOC_INNO_RK3036=m
CONFIG_SND_SOC_MAX98088=m
CONFIG_SND_SOC_MAX98090=m
CONFIG_SND_SOC_MAX98357A=m
CONFIG_SND_SOC_MAX98504=m
CONFIG_SND_SOC_MAX9867=m
CONFIG_SND_SOC_MAX98927=m
CONFIG_SND_SOC_MAX98520=m
CONFIG_SND_SOC_MAX98363=m
CONFIG_SND_SOC_MAX98373=m
CONFIG_SND_SOC_MAX98373_I2C=m
CONFIG_SND_SOC_MAX98373_SDW=m
CONFIG_SND_SOC_MAX98388=m
CONFIG_SND_SOC_MAX98390=m
CONFIG_SND_SOC_MAX98396=m
CONFIG_SND_SOC_MAX9860=m
CONFIG_SND_SOC_MSM8916_WCD_ANALOG=m
CONFIG_SND_SOC_MSM8916_WCD_DIGITAL=m
CONFIG_SND_SOC_PCM1681=m
CONFIG_SND_SOC_PCM1789=m
CONFIG_SND_SOC_PCM1789_I2C=m
CONFIG_SND_SOC_PCM179X=m
CONFIG_SND_SOC_PCM179X_I2C=m
CONFIG_SND_SOC_PCM179X_SPI=m
CONFIG_SND_SOC_PCM186X=m
CONFIG_SND_SOC_PCM186X_I2C=m
CONFIG_SND_SOC_PCM186X_SPI=m
CONFIG_SND_SOC_PCM3060=m
CONFIG_SND_SOC_PCM3060_I2C=m
CONFIG_SND_SOC_PCM3060_SPI=m
CONFIG_SND_SOC_PCM3168A=m
CONFIG_SND_SOC_PCM3168A_I2C=m
CONFIG_SND_SOC_PCM3168A_SPI=m
CONFIG_SND_SOC_PCM5102A=m
CONFIG_SND_SOC_PCM512x=m
CONFIG_SND_SOC_PCM512x_I2C=m
CONFIG_SND_SOC_PCM512x_SPI=m
# CONFIG_SND_SOC_PEB2466 is not set
CONFIG_SND_SOC_RK3328=m
CONFIG_SND_SOC_RL6231=m
CONFIG_SND_SOC_RL6347A=m
CONFIG_SND_SOC_RT274=m
CONFIG_SND_SOC_RT286=m
CONFIG_SND_SOC_RT298=m
CONFIG_SND_SOC_RT1011=m
CONFIG_SND_SOC_RT1015=m
CONFIG_SND_SOC_RT1015P=m
# CONFIG_SND_SOC_RT1017_SDCA_SDW is not set
CONFIG_SND_SOC_RT1019=m
CONFIG_SND_SOC_RT1308=m
CONFIG_SND_SOC_RT1308_SDW=m
CONFIG_SND_SOC_RT1316_SDW=m
CONFIG_SND_SOC_RT1318_SDW=m
CONFIG_SND_SOC_RT5514=m
CONFIG_SND_SOC_RT5514_SPI=m
CONFIG_SND_SOC_RT5616=m
CONFIG_SND_SOC_RT5631=m
CONFIG_SND_SOC_RT5640=m
CONFIG_SND_SOC_RT5645=m
CONFIG_SND_SOC_RT5651=m
CONFIG_SND_SOC_RT5659=m
CONFIG_SND_SOC_RT5660=m
CONFIG_SND_SOC_RT5663=m
CONFIG_SND_SOC_RT5670=m
CONFIG_SND_SOC_RT5677=m
CONFIG_SND_SOC_RT5677_SPI=m
CONFIG_SND_SOC_RT5682=m
CONFIG_SND_SOC_RT5682_I2C=m
CONFIG_SND_SOC_RT5682_SDW=m
CONFIG_SND_SOC_RT5682S=m
CONFIG_SND_SOC_RT700=m
CONFIG_SND_SOC_RT700_SDW=m
CONFIG_SND_SOC_RT711=m
CONFIG_SND_SOC_RT711_SDW=m
CONFIG_SND_SOC_RT711_SDCA_SDW=m
CONFIG_SND_SOC_RT712_SDCA_SDW=m
CONFIG_SND_SOC_RT712_SDCA_DMIC_SDW=m
# CONFIG_SND_SOC_RT722_SDCA_SDW is not set
CONFIG_SND_SOC_RT715=m
CONFIG_SND_SOC_RT715_SDW=m
CONFIG_SND_SOC_RT715_SDCA_SDW=m
CONFIG_SND_SOC_RT9120=m
CONFIG_SND_SOC_SDW_MOCKUP=m
CONFIG_SND_SOC_SGTL5000=m
CONFIG_SND_SOC_SI476X=m
CONFIG_SND_SOC_SIGMADSP=m
CONFIG_SND_SOC_SIGMADSP_I2C=m
CONFIG_SND_SOC_SIGMADSP_REGMAP=m
CONFIG_SND_SOC_SIMPLE_AMPLIFIER=m
CONFIG_SND_SOC_SIMPLE_MUX=m
# CONFIG_SND_SOC_SMA1303 is not set
CONFIG_SND_SOC_SPDIF=m
CONFIG_SND_SOC_SRC4XXX_I2C=m
CONFIG_SND_SOC_SRC4XXX=m
CONFIG_SND_SOC_SSM2305=m
CONFIG_SND_SOC_SSM2518=m
CONFIG_SND_SOC_SSM2602=m
CONFIG_SND_SOC_SSM2602_SPI=m
CONFIG_SND_SOC_SSM2602_I2C=m
CONFIG_SND_SOC_SSM4567=m
CONFIG_SND_SOC_STA32X=m
CONFIG_SND_SOC_STA350=m
CONFIG_SND_SOC_STI_SAS=m
CONFIG_SND_SOC_TAS2552=m
CONFIG_SND_SOC_TAS2562=m
CONFIG_SND_SOC_TAS2764=m
CONFIG_SND_SOC_TAS2770=m
CONFIG_SND_SOC_TAS2780=m
# CONFIG_SND_SOC_TAS2781_I2C is not set
CONFIG_SND_SOC_TAS5086=m
CONFIG_SND_SOC_TAS571X=m
CONFIG_SND_SOC_TAS5720=m
CONFIG_SND_SOC_TAS5805M=m
CONFIG_SND_SOC_TAS6424=m
CONFIG_SND_SOC_TDA7419=m
CONFIG_SND_SOC_TFA9879=m
CONFIG_SND_SOC_TFA989X=m
CONFIG_SND_SOC_TLV320ADC3XXX=m
CONFIG_SND_SOC_TLV320AIC23=m
CONFIG_SND_SOC_TLV320AIC23_I2C=m
CONFIG_SND_SOC_TLV320AIC23_SPI=m
CONFIG_SND_SOC_TLV320AIC31XX=m
CONFIG_SND_SOC_TLV320AIC32X4=m
CONFIG_SND_SOC_TLV320AIC32X4_I2C=m
CONFIG_SND_SOC_TLV320AIC32X4_SPI=m
CONFIG_SND_SOC_TLV320AIC3X=m
CONFIG_SND_SOC_TLV320AIC3X_I2C=m
CONFIG_SND_SOC_TLV320AIC3X_SPI=m
CONFIG_SND_SOC_TLV320ADCX140=m
CONFIG_SND_SOC_TS3A227E=m
CONFIG_SND_SOC_TSCS42XX=m
CONFIG_SND_SOC_TSCS454=m
CONFIG_SND_SOC_UDA1334=m
CONFIG_SND_SOC_WCD_CLASSH=m
CONFIG_SND_SOC_WCD9335=m
CONFIG_SND_SOC_WCD_MBHC=m
CONFIG_SND_SOC_WCD934X=m
CONFIG_SND_SOC_WCD938X=m
CONFIG_SND_SOC_WCD938X_SDW=m
CONFIG_SND_SOC_WM5102=m
CONFIG_SND_SOC_WM8510=m
CONFIG_SND_SOC_WM8523=m
CONFIG_SND_SOC_WM8524=m
CONFIG_SND_SOC_WM8580=m
CONFIG_SND_SOC_WM8711=m
CONFIG_SND_SOC_WM8728=m
CONFIG_SND_SOC_WM8731=m
CONFIG_SND_SOC_WM8731_I2C=m
CONFIG_SND_SOC_WM8731_SPI=m
CONFIG_SND_SOC_WM8737=m
CONFIG_SND_SOC_WM8741=m
CONFIG_SND_SOC_WM8750=m
CONFIG_SND_SOC_WM8753=m
CONFIG_SND_SOC_WM8770=m
CONFIG_SND_SOC_WM8776=m
CONFIG_SND_SOC_WM8782=m
CONFIG_SND_SOC_WM8804=m
CONFIG_SND_SOC_WM8804_I2C=m
CONFIG_SND_SOC_WM8804_SPI=m
CONFIG_SND_SOC_WM8903=m
CONFIG_SND_SOC_WM8904=m
CONFIG_SND_SOC_WM8940=m
CONFIG_SND_SOC_WM8960=m
CONFIG_SND_SOC_WM8961=m
CONFIG_SND_SOC_WM8962=m
CONFIG_SND_SOC_WM8974=m
CONFIG_SND_SOC_WM8978=m
CONFIG_SND_SOC_WM8985=m
CONFIG_SND_SOC_WSA881X=m
CONFIG_SND_SOC_WSA883X=m
# CONFIG_SND_SOC_WSA884X is not set
CONFIG_SND_SOC_ZL38060=m
CONFIG_SND_SOC_MAX9759=m
CONFIG_SND_SOC_MT6351=m
CONFIG_SND_SOC_MT6358=m
CONFIG_SND_SOC_MT6660=m
CONFIG_SND_SOC_NAU8315=m
CONFIG_SND_SOC_NAU8540=m
CONFIG_SND_SOC_NAU8810=m
CONFIG_SND_SOC_NAU8821=m
CONFIG_SND_SOC_NAU8822=m
CONFIG_SND_SOC_NAU8824=m
CONFIG_SND_SOC_NAU8825=m
CONFIG_SND_SOC_TPA6130A2=m
CONFIG_SND_SOC_LPASS_MACRO_COMMON=m
CONFIG_SND_SOC_LPASS_WSA_MACRO=m
CONFIG_SND_SOC_LPASS_VA_MACRO=m
CONFIG_SND_SOC_LPASS_RX_MACRO=m
CONFIG_SND_SOC_LPASS_TX_MACRO=m
# end of CODEC drivers

CONFIG_SND_SIMPLE_CARD_UTILS=m
CONFIG_SND_SIMPLE_CARD=m
CONFIG_SND_X86=y
CONFIG_HDMI_LPE_AUDIO=m
CONFIG_SND_SYNTH_EMUX=m
CONFIG_SND_XEN_FRONTEND=m
CONFIG_SND_VIRTIO=m
CONFIG_AC97_BUS=m
CONFIG_HID_SUPPORT=y
CONFIG_HID=m
CONFIG_HID_BATTERY_STRENGTH=y
CONFIG_HIDRAW=y
CONFIG_UHID=m
CONFIG_HID_GENERIC=m

#
# Special HID drivers
#
CONFIG_HID_A4TECH=m
CONFIG_HID_ACCUTOUCH=m
CONFIG_HID_ACRUX=m
CONFIG_HID_ACRUX_FF=y
CONFIG_HID_APPLE=m
CONFIG_HID_APPLEIR=m
CONFIG_HID_ASUS=m
CONFIG_HID_AUREAL=m
CONFIG_HID_BELKIN=m
CONFIG_HID_BETOP_FF=m
CONFIG_HID_BIGBEN_FF=m
CONFIG_HID_CHERRY=m
CONFIG_HID_CHICONY=m
CONFIG_HID_CORSAIR=m
CONFIG_HID_COUGAR=m
CONFIG_HID_MACALLY=m
CONFIG_HID_PRODIKEYS=m
CONFIG_HID_CMEDIA=m
CONFIG_HID_CP2112=m
CONFIG_HID_CREATIVE_SB0540=m
CONFIG_HID_CYPRESS=m
CONFIG_HID_DRAGONRISE=m
CONFIG_DRAGONRISE_FF=y
CONFIG_HID_EMS_FF=m
CONFIG_HID_ELAN=m
CONFIG_HID_ELECOM=m
CONFIG_HID_ELO=m
# CONFIG_HID_EVISION is not set
CONFIG_HID_EZKEY=m
CONFIG_HID_FT260=m
CONFIG_HID_GEMBIRD=m
CONFIG_HID_GFRM=m
CONFIG_HID_GLORIOUS=m
CONFIG_HID_HOLTEK=m
CONFIG_HOLTEK_FF=y
CONFIG_HID_VIVALDI_COMMON=m
CONFIG_HID_GOOGLE_HAMMER=m
# CONFIG_HID_GOOGLE_STADIA_FF is not set
CONFIG_HID_VIVALDI=m
CONFIG_HID_GT683R=m
CONFIG_HID_KEYTOUCH=m
CONFIG_HID_KYE=m
CONFIG_HID_UCLOGIC=m
CONFIG_HID_WALTOP=m
CONFIG_HID_VIEWSONIC=m
CONFIG_HID_VRC2=m
CONFIG_HID_XIAOMI=m
CONFIG_HID_GYRATION=m
CONFIG_HID_ICADE=m
CONFIG_HID_ITE=m
CONFIG_HID_JABRA=m
CONFIG_HID_TWINHAN=m
CONFIG_HID_KENSINGTON=m
CONFIG_HID_LCPOWER=m
CONFIG_HID_LED=m
CONFIG_HID_LENOVO=m
CONFIG_HID_LETSKETCH=m
CONFIG_HID_LOGITECH=m
CONFIG_HID_LOGITECH_DJ=m
CONFIG_HID_LOGITECH_HIDPP=m
CONFIG_LOGITECH_FF=y
CONFIG_LOGIRUMBLEPAD2_FF=y
CONFIG_LOGIG940_FF=y
CONFIG_LOGIWHEELS_FF=y
CONFIG_HID_MAGICMOUSE=m
CONFIG_HID_MALTRON=m
CONFIG_HID_MAYFLASH=m
CONFIG_HID_MEGAWORLD_FF=m
CONFIG_HID_REDRAGON=m
CONFIG_HID_MICROSOFT=m
CONFIG_HID_MONTEREY=m
CONFIG_HID_MULTITOUCH=m
CONFIG_HID_NINTENDO=m
CONFIG_NINTENDO_FF=y
CONFIG_HID_NTI=m
CONFIG_HID_NTRIG=m
# CONFIG_HID_NVIDIA_SHIELD is not set
CONFIG_HID_ORTEK=m
CONFIG_HID_PANTHERLORD=m
CONFIG_PANTHERLORD_FF=y
CONFIG_HID_PENMOUNT=m
CONFIG_HID_PETALYNX=m
CONFIG_HID_PICOLCD=m
CONFIG_HID_PICOLCD_FB=y
CONFIG_HID_PICOLCD_BACKLIGHT=y
CONFIG_HID_PICOLCD_LCD=y
CONFIG_HID_PICOLCD_LEDS=y
CONFIG_HID_PICOLCD_CIR=y
CONFIG_HID_PLANTRONICS=m
CONFIG_HID_PLAYSTATION=m
CONFIG_PLAYSTATION_FF=y
CONFIG_HID_PXRC=m
CONFIG_HID_RAZER=m
CONFIG_HID_PRIMAX=m
CONFIG_HID_RETRODE=m
CONFIG_HID_ROCCAT=m
CONFIG_HID_SAITEK=m
CONFIG_HID_SAMSUNG=m
CONFIG_HID_SEMITEK=m
CONFIG_HID_SIGMAMICRO=m
CONFIG_HID_SONY=m
CONFIG_SONY_FF=y
CONFIG_HID_SPEEDLINK=m
CONFIG_HID_STEAM=m
# CONFIG_STEAM_FF is not set
CONFIG_HID_STEELSERIES=m
CONFIG_HID_SUNPLUS=m
CONFIG_HID_RMI=m
CONFIG_HID_GREENASIA=m
CONFIG_GREENASIA_FF=y
CONFIG_HID_HYPERV_MOUSE=m
CONFIG_HID_SMARTJOYPLUS=m
CONFIG_SMARTJOYPLUS_FF=y
CONFIG_HID_TIVO=m
CONFIG_HID_TOPSEED=m
CONFIG_HID_TOPRE=m
CONFIG_HID_THINGM=m
CONFIG_HID_THRUSTMASTER=m
CONFIG_THRUSTMASTER_FF=y
CONFIG_HID_UDRAW_PS3=m
CONFIG_HID_U2FZERO=m
CONFIG_HID_WACOM=m
CONFIG_HID_WIIMOTE=m
CONFIG_HID_XINMO=m
CONFIG_HID_ZEROPLUS=m
CONFIG_ZEROPLUS_FF=y
CONFIG_HID_ZYDACRON=m
CONFIG_HID_SENSOR_HUB=m
CONFIG_HID_SENSOR_CUSTOM_SENSOR=m
CONFIG_HID_ALPS=m
CONFIG_HID_MCP2221=m
# end of Special HID drivers

#
# HID-BPF support
#
# CONFIG_HID_BPF is not set
# end of HID-BPF support

#
# USB HID support
#
CONFIG_USB_HID=m
CONFIG_HID_PID=y
CONFIG_USB_HIDDEV=y

#
# USB HID Boot Protocol drivers
#
CONFIG_USB_KBD=m
CONFIG_USB_MOUSE=m
# end of USB HID Boot Protocol drivers
# end of USB HID support

CONFIG_I2C_HID=m
CONFIG_I2C_HID_ACPI=m
# CONFIG_I2C_HID_OF is not set
CONFIG_I2C_HID_CORE=m

#
# Intel ISH HID support
#
CONFIG_INTEL_ISH_HID=m
CONFIG_INTEL_ISH_FIRMWARE_DOWNLOADER=m
# end of Intel ISH HID support

#
# AMD SFH HID Support
#
CONFIG_AMD_SFH_HID=m
# end of AMD SFH HID Support

#
# Surface System Aggregator Module HID support
#
CONFIG_SURFACE_HID=m
CONFIG_SURFACE_KBD=m
# end of Surface System Aggregator Module HID support

CONFIG_SURFACE_HID_CORE=m
CONFIG_USB_OHCI_LITTLE_ENDIAN=y
CONFIG_USB_SUPPORT=y
CONFIG_USB_COMMON=y
CONFIG_USB_LED_TRIG=y
CONFIG_USB_ULPI_BUS=m
CONFIG_USB_CONN_GPIO=m
CONFIG_USB_ARCH_HAS_HCD=y
CONFIG_USB=y
CONFIG_USB_PCI=y
CONFIG_USB_ANNOUNCE_NEW_DEVICES=y

#
# Miscellaneous USB options
#
CONFIG_USB_DEFAULT_PERSIST=y
# CONFIG_USB_FEW_INIT_RETRIES is not set
CONFIG_USB_DYNAMIC_MINORS=y
# CONFIG_USB_OTG is not set
# CONFIG_USB_OTG_PRODUCTLIST is not set
# CONFIG_USB_OTG_DISABLE_EXTERNAL_HUB is not set
CONFIG_USB_LEDS_TRIGGER_USBPORT=m
CONFIG_USB_AUTOSUSPEND_DELAY=2
CONFIG_USB_MON=m

#
# USB Host Controller Drivers
#
CONFIG_USB_C67X00_HCD=m
CONFIG_USB_XHCI_HCD=y
CONFIG_USB_XHCI_DBGCAP=y
CONFIG_USB_XHCI_PCI=m
CONFIG_USB_XHCI_PCI_RENESAS=m
CONFIG_USB_XHCI_PLATFORM=m
CONFIG_USB_EHCI_HCD=y
CONFIG_USB_EHCI_ROOT_HUB_TT=y
CONFIG_USB_EHCI_TT_NEWSCHED=y
CONFIG_USB_EHCI_PCI=y
CONFIG_USB_EHCI_FSL=m
CONFIG_USB_EHCI_HCD_PLATFORM=y
CONFIG_USB_OXU210HP_HCD=m
CONFIG_USB_ISP116X_HCD=m
CONFIG_USB_MAX3421_HCD=m
CONFIG_USB_OHCI_HCD=y
CONFIG_USB_OHCI_HCD_PCI=y
CONFIG_USB_OHCI_HCD_PLATFORM=y
CONFIG_USB_UHCI_HCD=y
CONFIG_USB_SL811_HCD=m
CONFIG_USB_SL811_HCD_ISO=y
CONFIG_USB_SL811_CS=m
CONFIG_USB_R8A66597_HCD=m
CONFIG_USB_HCD_BCMA=m
CONFIG_USB_HCD_SSB=m
# CONFIG_USB_HCD_TEST_MODE is not set
CONFIG_USB_XEN_HCD=m

#
# USB Device Class drivers
#
CONFIG_USB_ACM=m
CONFIG_USB_PRINTER=m
CONFIG_USB_WDM=m
CONFIG_USB_TMC=m

#
# NOTE: USB_STORAGE depends on SCSI but BLK_DEV_SD may
#

#
# also be needed; see USB_STORAGE Help for more info
#
CONFIG_USB_STORAGE=m
# CONFIG_USB_STORAGE_DEBUG is not set
CONFIG_USB_STORAGE_REALTEK=m
CONFIG_REALTEK_AUTOPM=y
CONFIG_USB_STORAGE_DATAFAB=m
CONFIG_USB_STORAGE_FREECOM=m
CONFIG_USB_STORAGE_ISD200=m
CONFIG_USB_STORAGE_USBAT=m
CONFIG_USB_STORAGE_SDDR09=m
CONFIG_USB_STORAGE_SDDR55=m
CONFIG_USB_STORAGE_JUMPSHOT=m
CONFIG_USB_STORAGE_ALAUDA=m
CONFIG_USB_STORAGE_ONETOUCH=m
CONFIG_USB_STORAGE_KARMA=m
CONFIG_USB_STORAGE_CYPRESS_ATACB=m
CONFIG_USB_STORAGE_ENE_UB6250=m
CONFIG_USB_UAS=m

#
# USB Imaging devices
#
CONFIG_USB_MDC800=m
CONFIG_USB_MICROTEK=m
CONFIG_USBIP_CORE=m
CONFIG_USBIP_VHCI_HCD=m
CONFIG_USBIP_VHCI_HC_PORTS=8
CONFIG_USBIP_VHCI_NR_HCS=1
CONFIG_USBIP_HOST=m
CONFIG_USBIP_VUDC=m
# CONFIG_USBIP_DEBUG is not set

#
# USB dual-mode controller drivers
#
CONFIG_USB_CDNS_SUPPORT=m
CONFIG_USB_CDNS_HOST=y
CONFIG_USB_CDNS3=m
CONFIG_USB_CDNS3_GADGET=y
CONFIG_USB_CDNS3_HOST=y
CONFIG_USB_CDNS3_PCI_WRAP=m
CONFIG_USB_CDNSP_PCI=m
CONFIG_USB_CDNSP_GADGET=y
CONFIG_USB_CDNSP_HOST=y
CONFIG_USB_MUSB_HDRC=m
# CONFIG_USB_MUSB_HOST is not set
# CONFIG_USB_MUSB_GADGET is not set
CONFIG_USB_MUSB_DUAL_ROLE=y

#
# Platform Glue Layer
#

#
# MUSB DMA mode
#
CONFIG_MUSB_PIO_ONLY=y
CONFIG_USB_DWC3=m
CONFIG_USB_DWC3_ULPI=y
# CONFIG_USB_DWC3_HOST is not set
# CONFIG_USB_DWC3_GADGET is not set
CONFIG_USB_DWC3_DUAL_ROLE=y

#
# Platform Glue Driver Support
#
CONFIG_USB_DWC3_PCI=m
CONFIG_USB_DWC3_HAPS=m
CONFIG_USB_DWC2=y
CONFIG_USB_DWC2_HOST=y

#
# Gadget/Dual-role mode requires USB Gadget support to be enabled
#
CONFIG_USB_DWC2_PCI=m
# CONFIG_USB_DWC2_DEBUG is not set
# CONFIG_USB_DWC2_TRACK_MISSED_SOFS is not set
CONFIG_USB_CHIPIDEA=m
CONFIG_USB_CHIPIDEA_UDC=y
CONFIG_USB_CHIPIDEA_HOST=y
CONFIG_USB_CHIPIDEA_PCI=m
CONFIG_USB_CHIPIDEA_MSM=m
CONFIG_USB_CHIPIDEA_GENERIC=m
CONFIG_USB_ISP1760=m
CONFIG_USB_ISP1760_HCD=y
CONFIG_USB_ISP1761_UDC=y
# CONFIG_USB_ISP1760_HOST_ROLE is not set
# CONFIG_USB_ISP1760_GADGET_ROLE is not set
CONFIG_USB_ISP1760_DUAL_ROLE=y

#
# USB port drivers
#
CONFIG_USB_SERIAL=m
CONFIG_USB_SERIAL_GENERIC=y
CONFIG_USB_SERIAL_SIMPLE=m
CONFIG_USB_SERIAL_AIRCABLE=m
CONFIG_USB_SERIAL_ARK3116=m
CONFIG_USB_SERIAL_BELKIN=m
CONFIG_USB_SERIAL_CH341=m
CONFIG_USB_SERIAL_WHITEHEAT=m
CONFIG_USB_SERIAL_DIGI_ACCELEPORT=m
CONFIG_USB_SERIAL_CP210X=m
CONFIG_USB_SERIAL_CYPRESS_M8=m
CONFIG_USB_SERIAL_EMPEG=m
CONFIG_USB_SERIAL_FTDI_SIO=m
CONFIG_USB_SERIAL_VISOR=m
CONFIG_USB_SERIAL_IPAQ=m
CONFIG_USB_SERIAL_IR=m
CONFIG_USB_SERIAL_EDGEPORT=m
CONFIG_USB_SERIAL_EDGEPORT_TI=m
CONFIG_USB_SERIAL_F81232=m
CONFIG_USB_SERIAL_F8153X=m
CONFIG_USB_SERIAL_GARMIN=m
CONFIG_USB_SERIAL_IPW=m
CONFIG_USB_SERIAL_IUU=m
CONFIG_USB_SERIAL_KEYSPAN_PDA=m
CONFIG_USB_SERIAL_KEYSPAN=m
CONFIG_USB_SERIAL_KLSI=m
CONFIG_USB_SERIAL_KOBIL_SCT=m
CONFIG_USB_SERIAL_MCT_U232=m
CONFIG_USB_SERIAL_METRO=m
CONFIG_USB_SERIAL_MOS7720=m
CONFIG_USB_SERIAL_MOS7715_PARPORT=y
CONFIG_USB_SERIAL_MOS7840=m
CONFIG_USB_SERIAL_MXUPORT=m
CONFIG_USB_SERIAL_NAVMAN=m
CONFIG_USB_SERIAL_PL2303=m
CONFIG_USB_SERIAL_OTI6858=m
CONFIG_USB_SERIAL_QCAUX=m
CONFIG_USB_SERIAL_QUALCOMM=m
CONFIG_USB_SERIAL_SPCP8X5=m
CONFIG_USB_SERIAL_SAFE=m
# CONFIG_USB_SERIAL_SAFE_PADDED is not set
CONFIG_USB_SERIAL_SIERRAWIRELESS=m
CONFIG_USB_SERIAL_SYMBOL=m
CONFIG_USB_SERIAL_TI=m
CONFIG_USB_SERIAL_CYBERJACK=m
CONFIG_USB_SERIAL_WWAN=m
CONFIG_USB_SERIAL_OPTION=m
CONFIG_USB_SERIAL_OMNINET=m
CONFIG_USB_SERIAL_OPTICON=m
CONFIG_USB_SERIAL_XSENS_MT=m
CONFIG_USB_SERIAL_WISHBONE=m
CONFIG_USB_SERIAL_SSU100=m
CONFIG_USB_SERIAL_QT2=m
CONFIG_USB_SERIAL_UPD78F0730=m
CONFIG_USB_SERIAL_XR=m
CONFIG_USB_SERIAL_DEBUG=m

#
# USB Miscellaneous drivers
#
CONFIG_USB_USS720=m
CONFIG_USB_EMI62=m
CONFIG_USB_EMI26=m
CONFIG_USB_ADUTUX=m
CONFIG_USB_SEVSEG=m
CONFIG_USB_LEGOTOWER=m
CONFIG_USB_LCD=m
CONFIG_USB_CYPRESS_CY7C63=m
CONFIG_USB_CYTHERM=m
CONFIG_USB_IDMOUSE=m
CONFIG_USB_APPLEDISPLAY=m
CONFIG_APPLE_MFI_FASTCHARGE=m
CONFIG_USB_SISUSBVGA=m
CONFIG_USB_LD=m
CONFIG_USB_TRANCEVIBRATOR=m
CONFIG_USB_IOWARRIOR=m
CONFIG_USB_TEST=m
CONFIG_USB_EHSET_TEST_FIXTURE=m
CONFIG_USB_ISIGHTFW=m
CONFIG_USB_YUREX=m
CONFIG_USB_EZUSB_FX2=m
CONFIG_USB_HUB_USB251XB=m
CONFIG_USB_HSIC_USB3503=m
CONFIG_USB_HSIC_USB4604=m
CONFIG_USB_LINK_LAYER_TEST=m
CONFIG_USB_CHAOSKEY=m
CONFIG_USB_ATM=m
CONFIG_USB_SPEEDTOUCH=m
CONFIG_USB_CXACRU=m
CONFIG_USB_UEAGLEATM=m
CONFIG_USB_XUSBATM=m

#
# USB Physical Layer drivers
#
CONFIG_USB_PHY=y
CONFIG_NOP_USB_XCEIV=m
CONFIG_USB_GPIO_VBUS=m
CONFIG_TAHVO_USB=m
CONFIG_TAHVO_USB_HOST_BY_DEFAULT=y
CONFIG_USB_ISP1301=m
# end of USB Physical Layer drivers

CONFIG_USB_GADGET=m
# CONFIG_USB_GADGET_DEBUG is not set
# CONFIG_USB_GADGET_DEBUG_FILES is not set
# CONFIG_USB_GADGET_DEBUG_FS is not set
CONFIG_USB_GADGET_VBUS_DRAW=2
CONFIG_USB_GADGET_STORAGE_NUM_BUFFERS=2
CONFIG_U_SERIAL_CONSOLE=y

#
# USB Peripheral Controller
#
CONFIG_USB_GR_UDC=m
CONFIG_USB_R8A66597=m
CONFIG_USB_PXA27X=m
CONFIG_USB_MV_UDC=m
CONFIG_USB_MV_U3D=m
CONFIG_USB_SNP_CORE=m
# CONFIG_USB_M66592 is not set
CONFIG_USB_BDC_UDC=m
CONFIG_USB_AMD5536UDC=m
CONFIG_USB_NET2272=m
CONFIG_USB_NET2272_DMA=y
CONFIG_USB_NET2280=m
CONFIG_USB_GOKU=m
CONFIG_USB_EG20T=m
CONFIG_USB_MAX3420_UDC=m
# CONFIG_USB_CDNS2_UDC is not set
# CONFIG_USB_DUMMY_HCD is not set
# end of USB Peripheral Controller

CONFIG_USB_LIBCOMPOSITE=m
CONFIG_USB_F_ACM=m
CONFIG_USB_F_SS_LB=m
CONFIG_USB_U_SERIAL=m
CONFIG_USB_U_ETHER=m
CONFIG_USB_U_AUDIO=m
CONFIG_USB_F_SERIAL=m
CONFIG_USB_F_OBEX=m
CONFIG_USB_F_NCM=m
CONFIG_USB_F_ECM=m
CONFIG_USB_F_PHONET=m
CONFIG_USB_F_EEM=m
CONFIG_USB_F_SUBSET=m
CONFIG_USB_F_RNDIS=m
CONFIG_USB_F_MASS_STORAGE=m
CONFIG_USB_F_FS=m
CONFIG_USB_F_UAC1=m
CONFIG_USB_F_UAC1_LEGACY=m
CONFIG_USB_F_UAC2=m
CONFIG_USB_F_UVC=m
CONFIG_USB_F_MIDI=m
CONFIG_USB_F_HID=m
CONFIG_USB_F_PRINTER=m
CONFIG_USB_F_TCM=m
CONFIG_USB_CONFIGFS=m
CONFIG_USB_CONFIGFS_SERIAL=y
CONFIG_USB_CONFIGFS_ACM=y
CONFIG_USB_CONFIGFS_OBEX=y
CONFIG_USB_CONFIGFS_NCM=y
CONFIG_USB_CONFIGFS_ECM=y
CONFIG_USB_CONFIGFS_ECM_SUBSET=y
CONFIG_USB_CONFIGFS_RNDIS=y
CONFIG_USB_CONFIGFS_EEM=y
CONFIG_USB_CONFIGFS_PHONET=y
CONFIG_USB_CONFIGFS_MASS_STORAGE=y
CONFIG_USB_CONFIGFS_F_LB_SS=y
CONFIG_USB_CONFIGFS_F_FS=y
CONFIG_USB_CONFIGFS_F_UAC1=y
CONFIG_USB_CONFIGFS_F_UAC1_LEGACY=y
CONFIG_USB_CONFIGFS_F_UAC2=y
CONFIG_USB_CONFIGFS_F_MIDI=y
# CONFIG_USB_CONFIGFS_F_MIDI2 is not set
CONFIG_USB_CONFIGFS_F_HID=y
CONFIG_USB_CONFIGFS_F_UVC=y
CONFIG_USB_CONFIGFS_F_PRINTER=y
CONFIG_USB_CONFIGFS_F_TCM=y

#
# USB Gadget precomposed configurations
#
CONFIG_USB_ZERO=m
CONFIG_USB_AUDIO=m
CONFIG_GADGET_UAC1=y
# CONFIG_GADGET_UAC1_LEGACY is not set
CONFIG_USB_ETH=m
CONFIG_USB_ETH_RNDIS=y
CONFIG_USB_ETH_EEM=y
CONFIG_USB_G_NCM=m
CONFIG_USB_GADGETFS=m
CONFIG_USB_FUNCTIONFS=m
CONFIG_USB_FUNCTIONFS_ETH=y
CONFIG_USB_FUNCTIONFS_RNDIS=y
CONFIG_USB_FUNCTIONFS_GENERIC=y
CONFIG_USB_MASS_STORAGE=m
CONFIG_USB_GADGET_TARGET=m
CONFIG_USB_G_SERIAL=m
CONFIG_USB_MIDI_GADGET=m
CONFIG_USB_G_PRINTER=m
CONFIG_USB_CDC_COMPOSITE=m
CONFIG_USB_G_NOKIA=m
CONFIG_USB_G_ACM_MS=m
# CONFIG_USB_G_MULTI is not set
CONFIG_USB_G_HID=m
CONFIG_USB_G_DBGP=m
# CONFIG_USB_G_DBGP_PRINTK is not set
CONFIG_USB_G_DBGP_SERIAL=y
CONFIG_USB_G_WEBCAM=m
CONFIG_USB_RAW_GADGET=m
# end of USB Gadget precomposed configurations

CONFIG_TYPEC=m
CONFIG_TYPEC_TCPM=m
CONFIG_TYPEC_TCPCI=m
CONFIG_TYPEC_RT1711H=m
CONFIG_TYPEC_MT6360=m
CONFIG_TYPEC_TCPCI_MT6370=m
CONFIG_TYPEC_TCPCI_MAXIM=m
CONFIG_TYPEC_FUSB302=m
CONFIG_TYPEC_WCOVE=m
CONFIG_TYPEC_UCSI=m
CONFIG_UCSI_CCG=m
CONFIG_UCSI_ACPI=m
CONFIG_UCSI_STM32G0=m
CONFIG_TYPEC_TPS6598X=m
CONFIG_TYPEC_ANX7411=m
CONFIG_TYPEC_RT1719=m
CONFIG_TYPEC_HD3SS3220=m
CONFIG_TYPEC_STUSB160X=m
CONFIG_TYPEC_WUSB3801=m

#
# USB Type-C Multiplexer/DeMultiplexer Switch support
#
CONFIG_TYPEC_MUX_FSA4480=m
# CONFIG_TYPEC_MUX_GPIO_SBU is not set
CONFIG_TYPEC_MUX_PI3USB30532=m
CONFIG_TYPEC_MUX_INTEL_PMC=m
# CONFIG_TYPEC_MUX_NB7VPQ904M is not set
# end of USB Type-C Multiplexer/DeMultiplexer Switch support

#
# USB Type-C Alternate Mode drivers
#
CONFIG_TYPEC_DP_ALTMODE=m
CONFIG_TYPEC_NVIDIA_ALTMODE=m
# end of USB Type-C Alternate Mode drivers

CONFIG_USB_ROLE_SWITCH=y
CONFIG_USB_ROLES_INTEL_XHCI=m
CONFIG_MMC=y
CONFIG_MMC_BLOCK=m
CONFIG_MMC_BLOCK_MINORS=8
CONFIG_SDIO_UART=m
# CONFIG_MMC_TEST is not set
CONFIG_MMC_CRYPTO=y

#
# MMC/SD/SDIO Host Controller Drivers
#
# CONFIG_MMC_DEBUG is not set
CONFIG_MMC_SDHCI=m
CONFIG_MMC_SDHCI_IO_ACCESSORS=y
CONFIG_MMC_SDHCI_PCI=m
CONFIG_MMC_RICOH_MMC=y
CONFIG_MMC_SDHCI_ACPI=m
CONFIG_MMC_SDHCI_PLTFM=m
CONFIG_MMC_SDHCI_F_SDH30=m
CONFIG_MMC_WBSD=m
CONFIG_MMC_ALCOR=m
CONFIG_MMC_TIFM_SD=m
CONFIG_MMC_SPI=m
CONFIG_MMC_SDRICOH_CS=m
CONFIG_MMC_CB710=m
CONFIG_MMC_VIA_SDMMC=m
CONFIG_MMC_VUB300=m
CONFIG_MMC_USHC=m
CONFIG_MMC_USDHI6ROL0=m
CONFIG_MMC_REALTEK_PCI=m
CONFIG_MMC_REALTEK_USB=m
CONFIG_MMC_CQHCI=m
# CONFIG_MMC_HSQ is not set
CONFIG_MMC_TOSHIBA_PCI=m
CONFIG_MMC_MTK=m
CONFIG_MMC_SDHCI_XENON=m
CONFIG_SCSI_UFSHCD=m
CONFIG_SCSI_UFS_BSG=y
CONFIG_SCSI_UFS_CRYPTO=y
# CONFIG_SCSI_UFS_HWMON is not set
CONFIG_SCSI_UFSHCD_PCI=m
CONFIG_SCSI_UFS_DWC_TC_PCI=m
CONFIG_SCSI_UFSHCD_PLATFORM=m
CONFIG_SCSI_UFS_CDNS_PLATFORM=m
CONFIG_MEMSTICK=m
# CONFIG_MEMSTICK_DEBUG is not set

#
# MemoryStick drivers
#
# CONFIG_MEMSTICK_UNSAFE_RESUME is not set
CONFIG_MSPRO_BLOCK=m
CONFIG_MS_BLOCK=m

#
# MemoryStick Host Controller Drivers
#
CONFIG_MEMSTICK_TIFM_MS=m
CONFIG_MEMSTICK_JMICRON_38X=m
CONFIG_MEMSTICK_R592=m
CONFIG_MEMSTICK_REALTEK_PCI=m
CONFIG_MEMSTICK_REALTEK_USB=m
CONFIG_NEW_LEDS=y
CONFIG_LEDS_CLASS=y
CONFIG_LEDS_CLASS_FLASH=m
CONFIG_LEDS_CLASS_MULTICOLOR=m
CONFIG_LEDS_BRIGHTNESS_HW_CHANGED=y

#
# LED drivers
#
CONFIG_LEDS_88PM860X=m
CONFIG_LEDS_APU=m
# CONFIG_LEDS_AW200XX is not set
# CONFIG_LEDS_CHT_WCOVE is not set
CONFIG_LEDS_LM3530=m
CONFIG_LEDS_LM3532=m
CONFIG_LEDS_LM3533=m
CONFIG_LEDS_LM3642=m
CONFIG_LEDS_MT6323=m
CONFIG_LEDS_PCA9532=m
CONFIG_LEDS_PCA9532_GPIO=y
CONFIG_LEDS_GPIO=m
CONFIG_LEDS_LP3944=m
CONFIG_LEDS_LP3952=m
CONFIG_LEDS_LP50XX=m
CONFIG_LEDS_LP8788=m
CONFIG_LEDS_PCA955X=m
CONFIG_LEDS_PCA955X_GPIO=y
CONFIG_LEDS_PCA963X=m
# CONFIG_LEDS_PCA995X is not set
CONFIG_LEDS_WM831X_STATUS=m
CONFIG_LEDS_WM8350=m
CONFIG_LEDS_DA903X=m
CONFIG_LEDS_DA9052=m
CONFIG_LEDS_DAC124S085=m
CONFIG_LEDS_PWM=m
CONFIG_LEDS_REGULATOR=m
# CONFIG_LEDS_BD2606MVV is not set
CONFIG_LEDS_BD2802=m
CONFIG_LEDS_INTEL_SS4200=m
CONFIG_LEDS_LT3593=m
CONFIG_LEDS_ADP5520=m
CONFIG_LEDS_MC13783=m
CONFIG_LEDS_TCA6507=m
CONFIG_LEDS_TLC591XX=m
CONFIG_LEDS_MAX8997=m
CONFIG_LEDS_LM355x=m
CONFIG_LEDS_MENF21BMC=m
CONFIG_LEDS_IS31FL319X=m

#
# LED driver for blink(1) USB RGB LED is under Special HID drivers (HID_THINGM)
#
CONFIG_LEDS_BLINKM=m
CONFIG_LEDS_MLXCPLD=m
CONFIG_LEDS_MLXREG=m
CONFIG_LEDS_USER=m
CONFIG_LEDS_NIC78BX=m
CONFIG_LEDS_TI_LMU_COMMON=m
CONFIG_LEDS_LM36274=m
CONFIG_LEDS_TPS6105X=m

#
# Flash and Torch LED drivers
#
CONFIG_LEDS_AS3645A=m
CONFIG_LEDS_LM3601X=m
# CONFIG_LEDS_MT6370_FLASH is not set
CONFIG_LEDS_RT8515=m
CONFIG_LEDS_SGM3140=m

#
# RGB LED drivers
#
CONFIG_LEDS_PWM_MULTICOLOR=m
# CONFIG_LEDS_MT6370_RGB is not set

#
# LED Triggers
#
CONFIG_LEDS_TRIGGERS=y
CONFIG_LEDS_TRIGGER_TIMER=m
CONFIG_LEDS_TRIGGER_ONESHOT=m
CONFIG_LEDS_TRIGGER_DISK=y
CONFIG_LEDS_TRIGGER_MTD=y
CONFIG_LEDS_TRIGGER_HEARTBEAT=m
CONFIG_LEDS_TRIGGER_BACKLIGHT=m
CONFIG_LEDS_TRIGGER_CPU=y
CONFIG_LEDS_TRIGGER_ACTIVITY=m
CONFIG_LEDS_TRIGGER_DEFAULT_ON=m

#
# iptables trigger is under Netfilter config (LED target)
#
CONFIG_LEDS_TRIGGER_TRANSIENT=m
CONFIG_LEDS_TRIGGER_CAMERA=m
CONFIG_LEDS_TRIGGER_PANIC=y
CONFIG_LEDS_TRIGGER_NETDEV=m
CONFIG_LEDS_TRIGGER_PATTERN=m
CONFIG_LEDS_TRIGGER_AUDIO=m
CONFIG_LEDS_TRIGGER_TTY=m

#
# Simple LED drivers
#
CONFIG_LEDS_SIEMENS_SIMATIC_IPC=m
CONFIG_LEDS_SIEMENS_SIMATIC_IPC_APOLLOLAKE=m
CONFIG_LEDS_SIEMENS_SIMATIC_IPC_F7188X=m
CONFIG_LEDS_SIEMENS_SIMATIC_IPC_ELKHARTLAKE=m
CONFIG_ACCESSIBILITY=y
# CONFIG_A11Y_BRAILLE_CONSOLE is not set

#
# Speakup console speech
#
CONFIG_SPEAKUP=m
CONFIG_SPEAKUP_SYNTH_ACNTSA=m
CONFIG_SPEAKUP_SYNTH_APOLLO=m
CONFIG_SPEAKUP_SYNTH_AUDPTR=m
CONFIG_SPEAKUP_SYNTH_BNS=m
CONFIG_SPEAKUP_SYNTH_DECTLK=m
CONFIG_SPEAKUP_SYNTH_DECEXT=m
CONFIG_SPEAKUP_SYNTH_LTLK=m
CONFIG_SPEAKUP_SYNTH_SOFT=m
CONFIG_SPEAKUP_SYNTH_SPKOUT=m
CONFIG_SPEAKUP_SYNTH_TXPRT=m
CONFIG_SPEAKUP_SYNTH_DUMMY=m
# end of Speakup console speech

CONFIG_INFINIBAND=m
CONFIG_INFINIBAND_USER_MAD=m
CONFIG_INFINIBAND_USER_ACCESS=m
CONFIG_INFINIBAND_USER_MEM=y
CONFIG_INFINIBAND_ON_DEMAND_PAGING=y
CONFIG_INFINIBAND_ADDR_TRANS=y
CONFIG_INFINIBAND_ADDR_TRANS_CONFIGFS=y
CONFIG_INFINIBAND_VIRT_DMA=y
CONFIG_INFINIBAND_BNXT_RE=m
CONFIG_INFINIBAND_CXGB4=m
CONFIG_INFINIBAND_EFA=m
CONFIG_INFINIBAND_ERDMA=m
CONFIG_INFINIBAND_HFI1=m
# CONFIG_HFI1_DEBUG_SDMA_ORDER is not set
# CONFIG_SDMA_VERBOSITY is not set
CONFIG_INFINIBAND_IRDMA=m
CONFIG_MANA_INFINIBAND=m
CONFIG_MLX4_INFINIBAND=m
CONFIG_MLX5_INFINIBAND=m
CONFIG_INFINIBAND_MTHCA=m
# CONFIG_INFINIBAND_MTHCA_DEBUG is not set
CONFIG_INFINIBAND_OCRDMA=m
CONFIG_INFINIBAND_QEDR=m
CONFIG_INFINIBAND_QIB=m
CONFIG_INFINIBAND_QIB_DCA=y
CONFIG_INFINIBAND_USNIC=m
CONFIG_INFINIBAND_VMWARE_PVRDMA=m
CONFIG_INFINIBAND_RDMAVT=m
CONFIG_RDMA_RXE=m
CONFIG_RDMA_SIW=m
CONFIG_INFINIBAND_IPOIB=m
CONFIG_INFINIBAND_IPOIB_CM=y
# CONFIG_INFINIBAND_IPOIB_DEBUG is not set
CONFIG_INFINIBAND_SRP=m
CONFIG_INFINIBAND_SRPT=m
CONFIG_INFINIBAND_ISER=m
CONFIG_INFINIBAND_ISERT=m
CONFIG_INFINIBAND_RTRS=m
CONFIG_INFINIBAND_RTRS_CLIENT=m
CONFIG_INFINIBAND_RTRS_SERVER=m
CONFIG_INFINIBAND_OPA_VNIC=m
CONFIG_EDAC_ATOMIC_SCRUB=y
CONFIG_EDAC_SUPPORT=y
CONFIG_EDAC=y
# CONFIG_EDAC_LEGACY_SYSFS is not set
# CONFIG_EDAC_DEBUG is not set
CONFIG_EDAC_DECODE_MCE=m
CONFIG_EDAC_GHES=y
CONFIG_EDAC_AMD64=m
CONFIG_EDAC_E752X=m
CONFIG_EDAC_I82975X=m
CONFIG_EDAC_I3000=m
CONFIG_EDAC_I3200=m
CONFIG_EDAC_IE31200=m
CONFIG_EDAC_X38=m
CONFIG_EDAC_I5400=m
CONFIG_EDAC_I7CORE=m
CONFIG_EDAC_I5100=m
CONFIG_EDAC_I7300=m
CONFIG_EDAC_SBRIDGE=m
CONFIG_EDAC_SKX=m
CONFIG_EDAC_I10NM=m
CONFIG_EDAC_PND2=m
CONFIG_EDAC_IGEN6=m
CONFIG_RTC_LIB=y
CONFIG_RTC_MC146818_LIB=y
CONFIG_RTC_CLASS=y
CONFIG_RTC_HCTOSYS=y
CONFIG_RTC_HCTOSYS_DEVICE="rtc0"
CONFIG_RTC_SYSTOHC=y
CONFIG_RTC_SYSTOHC_DEVICE="rtc0"
# CONFIG_RTC_DEBUG is not set
CONFIG_RTC_NVMEM=y

#
# RTC interfaces
#
CONFIG_RTC_INTF_SYSFS=y
CONFIG_RTC_INTF_PROC=y
CONFIG_RTC_INTF_DEV=y
# CONFIG_RTC_INTF_DEV_UIE_EMUL is not set
# CONFIG_RTC_DRV_TEST is not set

#
# I2C RTC drivers
#
CONFIG_RTC_DRV_88PM860X=m
CONFIG_RTC_DRV_88PM80X=m
CONFIG_RTC_DRV_ABB5ZES3=m
CONFIG_RTC_DRV_ABEOZ9=m
CONFIG_RTC_DRV_ABX80X=m
CONFIG_RTC_DRV_DS1307=m
CONFIG_RTC_DRV_DS1307_CENTURY=y
CONFIG_RTC_DRV_DS1374=m
CONFIG_RTC_DRV_DS1374_WDT=y
CONFIG_RTC_DRV_DS1672=m
CONFIG_RTC_DRV_LP8788=m
CONFIG_RTC_DRV_MAX6900=m
CONFIG_RTC_DRV_MAX8907=m
CONFIG_RTC_DRV_MAX8925=m
CONFIG_RTC_DRV_MAX8998=m
CONFIG_RTC_DRV_MAX8997=m
CONFIG_RTC_DRV_RS5C372=m
CONFIG_RTC_DRV_ISL1208=m
CONFIG_RTC_DRV_ISL12022=m
CONFIG_RTC_DRV_X1205=m
CONFIG_RTC_DRV_PCF8523=m
CONFIG_RTC_DRV_PCF85063=m
CONFIG_RTC_DRV_PCF85363=m
CONFIG_RTC_DRV_PCF8563=m
CONFIG_RTC_DRV_PCF8583=m
CONFIG_RTC_DRV_M41T80=m
CONFIG_RTC_DRV_M41T80_WDT=y
CONFIG_RTC_DRV_BQ32K=m
CONFIG_RTC_DRV_PALMAS=m
CONFIG_RTC_DRV_TPS6586X=m
CONFIG_RTC_DRV_TPS65910=m
CONFIG_RTC_DRV_RC5T583=m
CONFIG_RTC_DRV_S35390A=m
CONFIG_RTC_DRV_FM3130=m
CONFIG_RTC_DRV_RX8010=m
CONFIG_RTC_DRV_RX8581=m
CONFIG_RTC_DRV_RX8025=m
CONFIG_RTC_DRV_EM3027=m
CONFIG_RTC_DRV_RV3028=m
CONFIG_RTC_DRV_RV3032=m
CONFIG_RTC_DRV_RV8803=m
CONFIG_RTC_DRV_SD3078=m

#
# SPI RTC drivers
#
CONFIG_RTC_DRV_M41T93=m
CONFIG_RTC_DRV_M41T94=m
CONFIG_RTC_DRV_DS1302=m
CONFIG_RTC_DRV_DS1305=m
CONFIG_RTC_DRV_DS1343=m
CONFIG_RTC_DRV_DS1347=m
CONFIG_RTC_DRV_DS1390=m
CONFIG_RTC_DRV_MAX6916=m
CONFIG_RTC_DRV_R9701=m
CONFIG_RTC_DRV_RX4581=m
CONFIG_RTC_DRV_RS5C348=m
CONFIG_RTC_DRV_MAX6902=m
CONFIG_RTC_DRV_PCF2123=m
CONFIG_RTC_DRV_MCP795=m
CONFIG_RTC_I2C_AND_SPI=y

#
# SPI and I2C RTC drivers
#
CONFIG_RTC_DRV_DS3232=m
CONFIG_RTC_DRV_DS3232_HWMON=y
CONFIG_RTC_DRV_PCF2127=m
CONFIG_RTC_DRV_RV3029C2=m
CONFIG_RTC_DRV_RV3029_HWMON=y
CONFIG_RTC_DRV_RX6110=m

#
# Platform RTC drivers
#
CONFIG_RTC_DRV_CMOS=y
CONFIG_RTC_DRV_DS1286=m
CONFIG_RTC_DRV_DS1511=m
CONFIG_RTC_DRV_DS1553=m
CONFIG_RTC_DRV_DS1685_FAMILY=m
CONFIG_RTC_DRV_DS1685=y
# CONFIG_RTC_DRV_DS1689 is not set
# CONFIG_RTC_DRV_DS17285 is not set
# CONFIG_RTC_DRV_DS17485 is not set
# CONFIG_RTC_DRV_DS17885 is not set
CONFIG_RTC_DRV_DS1742=m
CONFIG_RTC_DRV_DS2404=m
CONFIG_RTC_DRV_DA9052=m
CONFIG_RTC_DRV_DA9055=m
CONFIG_RTC_DRV_DA9063=m
CONFIG_RTC_DRV_STK17TA8=m
CONFIG_RTC_DRV_M48T86=m
CONFIG_RTC_DRV_M48T35=m
CONFIG_RTC_DRV_M48T59=m
CONFIG_RTC_DRV_MSM6242=m
CONFIG_RTC_DRV_RP5C01=m
CONFIG_RTC_DRV_WM831X=m
CONFIG_RTC_DRV_WM8350=m
CONFIG_RTC_DRV_PCF50633=m
CONFIG_RTC_DRV_CROS_EC=m

#
# on-CPU RTC drivers
#
CONFIG_RTC_DRV_FTRTC010=m
CONFIG_RTC_DRV_PCAP=m
CONFIG_RTC_DRV_MC13XXX=m
CONFIG_RTC_DRV_MT6397=m

#
# HID Sensor RTC drivers
#
CONFIG_RTC_DRV_HID_SENSOR_TIME=m
CONFIG_RTC_DRV_GOLDFISH=m
CONFIG_RTC_DRV_WILCO_EC=m
CONFIG_DMADEVICES=y
# CONFIG_DMADEVICES_DEBUG is not set

#
# DMA Devices
#
CONFIG_DMA_ENGINE=y
CONFIG_DMA_VIRTUAL_CHANNELS=y
CONFIG_DMA_ACPI=y
CONFIG_ALTERA_MSGDMA=m
CONFIG_INTEL_IDMA64=m
CONFIG_INTEL_IDXD_BUS=m
CONFIG_INTEL_IDXD=m
# CONFIG_INTEL_IDXD_COMPAT is not set
CONFIG_INTEL_IDXD_SVM=y
CONFIG_INTEL_IDXD_PERFMON=y
CONFIG_INTEL_IOATDMA=m
CONFIG_PLX_DMA=m
# CONFIG_XILINX_DMA is not set
# CONFIG_XILINX_XDMA is not set
CONFIG_AMD_PTDMA=m
CONFIG_QCOM_HIDMA_MGMT=m
CONFIG_QCOM_HIDMA=m
CONFIG_DW_DMAC_CORE=m
CONFIG_DW_DMAC=m
CONFIG_DW_DMAC_PCI=m
CONFIG_DW_EDMA=m
CONFIG_DW_EDMA_PCIE=m
CONFIG_HSU_DMA=y
CONFIG_SF_PDMA=m
CONFIG_INTEL_LDMA=y

#
# DMA Clients
#
CONFIG_ASYNC_TX_DMA=y
# CONFIG_DMATEST is not set
CONFIG_DMA_ENGINE_RAID=y

#
# DMABUF options
#
CONFIG_SYNC_FILE=y
CONFIG_SW_SYNC=y
CONFIG_UDMABUF=y
CONFIG_DMABUF_MOVE_NOTIFY=y
# CONFIG_DMABUF_DEBUG is not set
# CONFIG_DMABUF_SELFTESTS is not set
CONFIG_DMABUF_HEAPS=y
# CONFIG_DMABUF_SYSFS_STATS is not set
CONFIG_DMABUF_HEAPS_SYSTEM=y
# end of DMABUF options

CONFIG_DCA=m
CONFIG_UIO=m
CONFIG_UIO_CIF=m
CONFIG_UIO_PDRV_GENIRQ=m
CONFIG_UIO_DMEM_GENIRQ=m
CONFIG_UIO_AEC=m
CONFIG_UIO_SERCOS3=m
CONFIG_UIO_PCI_GENERIC=m
CONFIG_UIO_NETX=m
CONFIG_UIO_PRUSS=m
CONFIG_UIO_MF624=m
CONFIG_UIO_HV_GENERIC=m
CONFIG_UIO_DFL=m
CONFIG_VFIO=m
# CONFIG_VFIO_DEVICE_CDEV is not set
CONFIG_VFIO_GROUP=y
CONFIG_VFIO_CONTAINER=y
CONFIG_VFIO_IOMMU_TYPE1=m
CONFIG_VFIO_NOIOMMU=y
CONFIG_VFIO_VIRQFD=y

#
# VFIO support for PCI devices
#
CONFIG_VFIO_PCI_CORE=m
CONFIG_VFIO_PCI_MMAP=y
CONFIG_VFIO_PCI_INTX=y
CONFIG_VFIO_PCI=m
CONFIG_VFIO_PCI_VGA=y
CONFIG_VFIO_PCI_IGD=y
CONFIG_MLX5_VFIO_PCI=m
# end of VFIO support for PCI devices

CONFIG_VFIO_MDEV=m
CONFIG_IRQ_BYPASS_MANAGER=m
CONFIG_VIRT_DRIVERS=y
CONFIG_VMGENID=m
CONFIG_VBOXGUEST=m
CONFIG_NITRO_ENCLAVES=m
CONFIG_ACRN_HSM=m
CONFIG_EFI_SECRET=m
CONFIG_SEV_GUEST=m
CONFIG_TDX_GUEST_DRIVER=m
CONFIG_VIRTIO_ANCHOR=y
CONFIG_VIRTIO=y
CONFIG_VIRTIO_PCI_LIB=y
CONFIG_VIRTIO_PCI_LIB_LEGACY=y
CONFIG_VIRTIO_MENU=y
CONFIG_VIRTIO_PCI=y
CONFIG_VIRTIO_PCI_LEGACY=y
CONFIG_VIRTIO_VDPA=m
CONFIG_VIRTIO_PMEM=m
CONFIG_VIRTIO_BALLOON=y
CONFIG_VIRTIO_MEM=m
CONFIG_VIRTIO_INPUT=m
CONFIG_VIRTIO_MMIO=y
CONFIG_VIRTIO_MMIO_CMDLINE_DEVICES=y
CONFIG_VIRTIO_DMA_SHARED_BUFFER=m
CONFIG_VDPA=m
CONFIG_VDPA_SIM=m
CONFIG_VDPA_SIM_NET=m
CONFIG_VDPA_SIM_BLOCK=m
CONFIG_VDPA_USER=m
CONFIG_IFCVF=m
CONFIG_MLX5_VDPA=y
CONFIG_MLX5_VDPA_NET=m
# CONFIG_MLX5_VDPA_STEERING_DEBUG is not set
CONFIG_VP_VDPA=m
CONFIG_ALIBABA_ENI_VDPA=m
# CONFIG_SNET_VDPA is not set
CONFIG_VHOST_IOTLB=m
CONFIG_VHOST_RING=m
CONFIG_VHOST_TASK=y
CONFIG_VHOST=m
CONFIG_VHOST_MENU=y
CONFIG_VHOST_NET=m
CONFIG_VHOST_SCSI=m
CONFIG_VHOST_VSOCK=m
CONFIG_VHOST_VDPA=m
# CONFIG_VHOST_CROSS_ENDIAN_LEGACY is not set

#
# Microsoft Hyper-V guest support
#
CONFIG_HYPERV=m
# CONFIG_HYPERV_VTL_MODE is not set
CONFIG_HYPERV_TIMER=y
CONFIG_HYPERV_UTILS=m
CONFIG_HYPERV_BALLOON=m
# end of Microsoft Hyper-V guest support

#
# Xen driver support
#
CONFIG_XEN_BALLOON=y
CONFIG_XEN_BALLOON_MEMORY_HOTPLUG=y
CONFIG_XEN_MEMORY_HOTPLUG_LIMIT=512
CONFIG_XEN_SCRUB_PAGES_DEFAULT=y
CONFIG_XEN_DEV_EVTCHN=m
CONFIG_XEN_BACKEND=y
CONFIG_XENFS=m
CONFIG_XEN_COMPAT_XENFS=y
CONFIG_XEN_SYS_HYPERVISOR=y
CONFIG_XEN_XENBUS_FRONTEND=y
CONFIG_XEN_GNTDEV=m
CONFIG_XEN_GNTDEV_DMABUF=y
CONFIG_XEN_GRANT_DEV_ALLOC=m
CONFIG_XEN_GRANT_DMA_ALLOC=y
CONFIG_SWIOTLB_XEN=y
CONFIG_XEN_PCI_STUB=y
CONFIG_XEN_PCIDEV_BACKEND=m
CONFIG_XEN_PVCALLS_FRONTEND=m
# CONFIG_XEN_PVCALLS_BACKEND is not set
CONFIG_XEN_SCSI_BACKEND=m
CONFIG_XEN_PRIVCMD=m
# CONFIG_XEN_PRIVCMD_IRQFD is not set
CONFIG_XEN_ACPI_PROCESSOR=y
CONFIG_XEN_MCE_LOG=y
CONFIG_XEN_HAVE_PVMMU=y
CONFIG_XEN_EFI=y
CONFIG_XEN_AUTO_XLATE=y
CONFIG_XEN_ACPI=y
CONFIG_XEN_SYMS=y
CONFIG_XEN_HAVE_VPMU=y
CONFIG_XEN_FRONT_PGDIR_SHBUF=m
CONFIG_XEN_UNPOPULATED_ALLOC=y
CONFIG_XEN_GRANT_DMA_OPS=y
CONFIG_XEN_VIRTIO=y
# CONFIG_XEN_VIRTIO_FORCE_GRANT is not set
# end of Xen driver support

CONFIG_GREYBUS=m
CONFIG_GREYBUS_ES2=m
CONFIG_COMEDI=m
# CONFIG_COMEDI_DEBUG is not set
CONFIG_COMEDI_DEFAULT_BUF_SIZE_KB=2048
CONFIG_COMEDI_DEFAULT_BUF_MAXSIZE_KB=20480
CONFIG_COMEDI_MISC_DRIVERS=y
CONFIG_COMEDI_BOND=m
CONFIG_COMEDI_TEST=m
CONFIG_COMEDI_PARPORT=m
# CONFIG_COMEDI_ISA_DRIVERS is not set
CONFIG_COMEDI_PCI_DRIVERS=m
CONFIG_COMEDI_8255_PCI=m
CONFIG_COMEDI_ADDI_WATCHDOG=m
CONFIG_COMEDI_ADDI_APCI_1032=m
CONFIG_COMEDI_ADDI_APCI_1500=m
CONFIG_COMEDI_ADDI_APCI_1516=m
CONFIG_COMEDI_ADDI_APCI_1564=m
CONFIG_COMEDI_ADDI_APCI_16XX=m
CONFIG_COMEDI_ADDI_APCI_2032=m
CONFIG_COMEDI_ADDI_APCI_2200=m
CONFIG_COMEDI_ADDI_APCI_3120=m
CONFIG_COMEDI_ADDI_APCI_3501=m
CONFIG_COMEDI_ADDI_APCI_3XXX=m
CONFIG_COMEDI_ADL_PCI6208=m
CONFIG_COMEDI_ADL_PCI7X3X=m
CONFIG_COMEDI_ADL_PCI8164=m
# CONFIG_COMEDI_ADL_PCI9111 is not set
# CONFIG_COMEDI_ADL_PCI9118 is not set
# CONFIG_COMEDI_ADV_PCI1710 is not set
CONFIG_COMEDI_ADV_PCI1720=m
CONFIG_COMEDI_ADV_PCI1723=m
CONFIG_COMEDI_ADV_PCI1724=m
CONFIG_COMEDI_ADV_PCI1760=m
# CONFIG_COMEDI_ADV_PCI_DIO is not set
# CONFIG_COMEDI_AMPLC_DIO200_PCI is not set
CONFIG_COMEDI_AMPLC_PC236_PCI=m
CONFIG_COMEDI_AMPLC_PC263_PCI=m
# CONFIG_COMEDI_AMPLC_PCI224 is not set
# CONFIG_COMEDI_AMPLC_PCI230 is not set
CONFIG_COMEDI_CONTEC_PCI_DIO=m
# CONFIG_COMEDI_DAS08_PCI is not set
CONFIG_COMEDI_DT3000=m
CONFIG_COMEDI_DYNA_PCI10XX=m
CONFIG_COMEDI_GSC_HPDI=m
CONFIG_COMEDI_MF6X4=m
CONFIG_COMEDI_ICP_MULTI=m
CONFIG_COMEDI_DAQBOARD2000=m
CONFIG_COMEDI_JR3_PCI=m
CONFIG_COMEDI_KE_COUNTER=m
CONFIG_COMEDI_CB_PCIDAS64=m
# CONFIG_COMEDI_CB_PCIDAS is not set
CONFIG_COMEDI_CB_PCIDDA=m
# CONFIG_COMEDI_CB_PCIMDAS is not set
CONFIG_COMEDI_CB_PCIMDDA=m
# CONFIG_COMEDI_ME4000 is not set
CONFIG_COMEDI_ME_DAQ=m
CONFIG_COMEDI_NI_6527=m
CONFIG_COMEDI_NI_65XX=m
CONFIG_COMEDI_NI_660X=m
CONFIG_COMEDI_NI_670X=m
# CONFIG_COMEDI_NI_LABPC_PCI is not set
CONFIG_COMEDI_NI_PCIDIO=m
CONFIG_COMEDI_NI_PCIMIO=m
# CONFIG_COMEDI_RTD520 is not set
CONFIG_COMEDI_S626=m
CONFIG_COMEDI_MITE=m
CONFIG_COMEDI_NI_TIOCMD=m
CONFIG_COMEDI_PCMCIA_DRIVERS=m
# CONFIG_COMEDI_CB_DAS16_CS is not set
# CONFIG_COMEDI_DAS08_CS is not set
CONFIG_COMEDI_NI_DAQ_700_CS=m
CONFIG_COMEDI_NI_DAQ_DIO24_CS=m
# CONFIG_COMEDI_NI_LABPC_CS is not set
CONFIG_COMEDI_NI_MIO_CS=m
CONFIG_COMEDI_QUATECH_DAQP_CS=m
CONFIG_COMEDI_USB_DRIVERS=m
CONFIG_COMEDI_DT9812=m
CONFIG_COMEDI_NI_USB6501=m
CONFIG_COMEDI_USBDUX=m
CONFIG_COMEDI_USBDUXFAST=m
CONFIG_COMEDI_USBDUXSIGMA=m
CONFIG_COMEDI_VMK80XX=m
CONFIG_COMEDI_8255=m
CONFIG_COMEDI_8255_SA=m
CONFIG_COMEDI_KCOMEDILIB=m
CONFIG_COMEDI_AMPLC_PC236=m
CONFIG_COMEDI_NI_TIO=m
CONFIG_COMEDI_NI_ROUTING=m
CONFIG_COMEDI_TESTS=m
CONFIG_COMEDI_TESTS_EXAMPLE=m
CONFIG_COMEDI_TESTS_NI_ROUTES=m
CONFIG_STAGING=y
CONFIG_PRISM2_USB=m
CONFIG_RTL8192U=m
CONFIG_RTLLIB=m
CONFIG_RTLLIB_CRYPTO_CCMP=m
CONFIG_RTLLIB_CRYPTO_TKIP=m
CONFIG_RTLLIB_CRYPTO_WEP=m
CONFIG_RTL8192E=m
CONFIG_RTL8723BS=m
CONFIG_R8712U=m
CONFIG_RTS5208=m
CONFIG_VT6655=m
CONFIG_VT6656=m

#
# IIO staging drivers
#

#
# Accelerometers
#
CONFIG_ADIS16203=m
CONFIG_ADIS16240=m
# end of Accelerometers

#
# Analog to digital converters
#
CONFIG_AD7816=m
# end of Analog to digital converters

#
# Analog digital bi-direction converters
#
CONFIG_ADT7316=m
CONFIG_ADT7316_SPI=m
CONFIG_ADT7316_I2C=m
# end of Analog digital bi-direction converters

#
# Direct Digital Synthesis
#
CONFIG_AD9832=m
CONFIG_AD9834=m
# end of Direct Digital Synthesis

#
# Network Analyzer, Impedance Converters
#
CONFIG_AD5933=m
# end of Network Analyzer, Impedance Converters

#
# Resolver to digital converters
#
CONFIG_AD2S1210=m
# end of Resolver to digital converters
# end of IIO staging drivers

CONFIG_FB_SM750=m
CONFIG_STAGING_MEDIA=y
# CONFIG_INTEL_ATOMISP is not set
# CONFIG_DVB_AV7110 is not set
CONFIG_VIDEO_IPU3_IMGU=m
# CONFIG_STAGING_MEDIA_DEPRECATED is not set
CONFIG_LTE_GDM724X=m
CONFIG_FB_TFT=m
CONFIG_FB_TFT_AGM1264K_FL=m
CONFIG_FB_TFT_BD663474=m
CONFIG_FB_TFT_HX8340BN=m
CONFIG_FB_TFT_HX8347D=m
CONFIG_FB_TFT_HX8353D=m
CONFIG_FB_TFT_HX8357D=m
CONFIG_FB_TFT_ILI9163=m
CONFIG_FB_TFT_ILI9320=m
CONFIG_FB_TFT_ILI9325=m
CONFIG_FB_TFT_ILI9340=m
CONFIG_FB_TFT_ILI9341=m
CONFIG_FB_TFT_ILI9481=m
CONFIG_FB_TFT_ILI9486=m
CONFIG_FB_TFT_PCD8544=m
CONFIG_FB_TFT_RA8875=m
CONFIG_FB_TFT_S6D02A1=m
CONFIG_FB_TFT_S6D1121=m
CONFIG_FB_TFT_SEPS525=m
CONFIG_FB_TFT_SH1106=m
CONFIG_FB_TFT_SSD1289=m
CONFIG_FB_TFT_SSD1305=m
CONFIG_FB_TFT_SSD1306=m
CONFIG_FB_TFT_SSD1331=m
CONFIG_FB_TFT_SSD1351=m
CONFIG_FB_TFT_ST7735R=m
CONFIG_FB_TFT_ST7789V=m
CONFIG_FB_TFT_TINYLCD=m
CONFIG_FB_TFT_TLS8204=m
CONFIG_FB_TFT_UC1611=m
CONFIG_FB_TFT_UC1701=m
CONFIG_FB_TFT_UPD161704=m
CONFIG_MOST_COMPONENTS=m
CONFIG_MOST_NET=m
CONFIG_MOST_VIDEO=m
CONFIG_MOST_I2C=m
CONFIG_KS7010=m
CONFIG_GREYBUS_AUDIO=m
CONFIG_GREYBUS_AUDIO_APB_CODEC=m
CONFIG_GREYBUS_BOOTROM=m
CONFIG_GREYBUS_FIRMWARE=m
CONFIG_GREYBUS_HID=m
CONFIG_GREYBUS_LIGHT=m
CONFIG_GREYBUS_LOG=m
CONFIG_GREYBUS_LOOPBACK=m
CONFIG_GREYBUS_POWER=m
CONFIG_GREYBUS_RAW=m
CONFIG_GREYBUS_VIBRATOR=m
CONFIG_GREYBUS_BRIDGED_PHY=m
CONFIG_GREYBUS_GPIO=m
CONFIG_GREYBUS_I2C=m
CONFIG_GREYBUS_PWM=m
CONFIG_GREYBUS_SDIO=m
CONFIG_GREYBUS_SPI=m
CONFIG_GREYBUS_UART=m
CONFIG_GREYBUS_USB=m
CONFIG_PI433=m
CONFIG_FIELDBUS_DEV=m
CONFIG_QLGE=m
CONFIG_VME_BUS=y

#
# VME Bridge Drivers
#
CONFIG_VME_TSI148=m
CONFIG_VME_FAKE=m

#
# VME Device Drivers
#
CONFIG_VME_USER=m
CONFIG_CHROME_PLATFORMS=y
CONFIG_CHROMEOS_ACPI=m
CONFIG_CHROMEOS_LAPTOP=m
CONFIG_CHROMEOS_PSTORE=m
CONFIG_CHROMEOS_TBMC=m
CONFIG_CROS_EC=m
CONFIG_CROS_EC_I2C=m
CONFIG_CROS_EC_ISHTP=m
CONFIG_CROS_EC_SPI=m
# CONFIG_CROS_EC_UART is not set
CONFIG_CROS_EC_LPC=m
CONFIG_CROS_EC_PROTO=y
CONFIG_CROS_KBD_LED_BACKLIGHT=m
CONFIG_CROS_EC_CHARDEV=m
CONFIG_CROS_EC_LIGHTBAR=m
CONFIG_CROS_EC_DEBUGFS=m
CONFIG_CROS_EC_SENSORHUB=m
CONFIG_CROS_EC_SYSFS=m
CONFIG_CROS_EC_TYPEC=m
CONFIG_CROS_HPS_I2C=m
CONFIG_CROS_USBPD_LOGGER=m
CONFIG_CROS_USBPD_NOTIFY=m
CONFIG_CHROMEOS_PRIVACY_SCREEN=m
CONFIG_CROS_TYPEC_SWITCH=m
CONFIG_WILCO_EC=m
CONFIG_WILCO_EC_DEBUGFS=m
CONFIG_WILCO_EC_EVENTS=m
CONFIG_WILCO_EC_TELEMETRY=m
CONFIG_MELLANOX_PLATFORM=y
CONFIG_MLXREG_HOTPLUG=m
CONFIG_MLXREG_IO=m
CONFIG_MLXREG_LC=m
CONFIG_NVSW_SN2201=m
CONFIG_SURFACE_PLATFORMS=y
CONFIG_SURFACE3_WMI=m
CONFIG_SURFACE_3_POWER_OPREGION=m
CONFIG_SURFACE_ACPI_NOTIFY=m
CONFIG_SURFACE_AGGREGATOR_CDEV=m
CONFIG_SURFACE_AGGREGATOR_HUB=m
CONFIG_SURFACE_AGGREGATOR_REGISTRY=m
CONFIG_SURFACE_AGGREGATOR_TABLET_SWITCH=m
CONFIG_SURFACE_DTX=m
CONFIG_SURFACE_GPE=m
CONFIG_SURFACE_HOTPLUG=m
CONFIG_SURFACE_PLATFORM_PROFILE=m
CONFIG_SURFACE_PRO3_BUTTON=m
CONFIG_SURFACE_AGGREGATOR=m
CONFIG_SURFACE_AGGREGATOR_BUS=y
# CONFIG_SURFACE_AGGREGATOR_ERROR_INJECTION is not set
CONFIG_X86_PLATFORM_DEVICES=y
CONFIG_ACPI_WMI=m
CONFIG_WMI_BMOF=m
CONFIG_HUAWEI_WMI=m
CONFIG_UV_SYSFS=m
CONFIG_MXM_WMI=m
CONFIG_NVIDIA_WMI_EC_BACKLIGHT=m
CONFIG_XIAOMI_WMI=m
CONFIG_GIGABYTE_WMI=m
# CONFIG_YOGABOOK is not set
CONFIG_ACERHDF=m
CONFIG_ACER_WIRELESS=m
CONFIG_ACER_WMI=m
CONFIG_AMD_PMF=m
# CONFIG_AMD_PMF_DEBUG is not set
CONFIG_AMD_PMC=m
CONFIG_AMD_HSMP=m
CONFIG_ADV_SWBUTTON=m
CONFIG_APPLE_GMUX=m
CONFIG_ASUS_LAPTOP=m
CONFIG_ASUS_WIRELESS=m
CONFIG_ASUS_WMI=m
CONFIG_ASUS_NB_WMI=m
CONFIG_ASUS_TF103C_DOCK=m
CONFIG_MERAKI_MX100=m
CONFIG_EEEPC_LAPTOP=m
CONFIG_EEEPC_WMI=m
CONFIG_X86_PLATFORM_DRIVERS_DELL=y
CONFIG_ALIENWARE_WMI=m
CONFIG_DCDBAS=m
CONFIG_DELL_LAPTOP=m
CONFIG_DELL_RBU=m
CONFIG_DELL_RBTN=m
CONFIG_DELL_SMBIOS=m
CONFIG_DELL_SMBIOS_WMI=y
CONFIG_DELL_SMBIOS_SMM=y
CONFIG_DELL_SMO8800=m
CONFIG_DELL_WMI=m
CONFIG_DELL_WMI_PRIVACY=y
CONFIG_DELL_WMI_AIO=m
CONFIG_DELL_WMI_DESCRIPTOR=m
CONFIG_DELL_WMI_DDV=m
CONFIG_DELL_WMI_LED=m
CONFIG_DELL_WMI_SYSMAN=m
CONFIG_AMILO_RFKILL=m
CONFIG_FUJITSU_LAPTOP=m
CONFIG_FUJITSU_TABLET=m
CONFIG_GPD_POCKET_FAN=m
CONFIG_X86_PLATFORM_DRIVERS_HP=y
CONFIG_HP_ACCEL=m
CONFIG_HP_WMI=m
CONFIG_HP_BIOSCFG=m
CONFIG_WIRELESS_HOTKEY=m
CONFIG_IBM_RTL=m
CONFIG_IDEAPAD_LAPTOP=m
# CONFIG_LENOVO_YMC is not set
CONFIG_SENSORS_HDAPS=m
CONFIG_THINKPAD_ACPI=m
CONFIG_THINKPAD_ACPI_ALSA_SUPPORT=y
CONFIG_THINKPAD_ACPI_DEBUGFACILITIES=y
# CONFIG_THINKPAD_ACPI_DEBUG is not set
# CONFIG_THINKPAD_ACPI_UNSAFE_LEDS is not set
CONFIG_THINKPAD_ACPI_VIDEO=y
CONFIG_THINKPAD_ACPI_HOTKEY_POLL=y
CONFIG_THINKPAD_LMI=m
CONFIG_INTEL_ATOMISP2_PDX86=y
CONFIG_INTEL_ATOMISP2_LED=m
CONFIG_INTEL_ATOMISP2_PM=m
CONFIG_INTEL_IFS=m
CONFIG_INTEL_SAR_INT1092=m
CONFIG_INTEL_SKL_INT3472=m
CONFIG_INTEL_PMC_CORE=y
CONFIG_INTEL_PMT_CLASS=m
CONFIG_INTEL_PMT_TELEMETRY=m
CONFIG_INTEL_PMT_CRASHLOG=m

#
# Intel Speed Select Technology interface support
#
CONFIG_INTEL_SPEED_SELECT_INTERFACE=m
# end of Intel Speed Select Technology interface support

CONFIG_INTEL_TELEMETRY=m
CONFIG_INTEL_WMI=y
CONFIG_INTEL_WMI_SBL_FW_UPDATE=m
CONFIG_INTEL_WMI_THUNDERBOLT=m

#
# Intel Uncore Frequency Control
#
CONFIG_INTEL_UNCORE_FREQ_CONTROL=m
# end of Intel Uncore Frequency Control

CONFIG_INTEL_HID_EVENT=m
CONFIG_INTEL_VBTN=m
CONFIG_INTEL_INT0002_VGPIO=m
CONFIG_INTEL_OAKTRAIL=m
CONFIG_INTEL_BXTWC_PMIC_TMU=m
# CONFIG_INTEL_BYTCRC_PWRSRC is not set
CONFIG_INTEL_CHTDC_TI_PWRBTN=m
CONFIG_INTEL_CHTWC_INT33FE=m
CONFIG_INTEL_ISHTP_ECLITE=m
CONFIG_INTEL_MRFLD_PWRBTN=m
CONFIG_INTEL_PUNIT_IPC=m
CONFIG_INTEL_RST=m
CONFIG_INTEL_SDSI=m
CONFIG_INTEL_SMARTCONNECT=m
# CONFIG_INTEL_TPMI is not set
CONFIG_INTEL_TURBO_MAX_3=y
CONFIG_INTEL_VSEC=m
# CONFIG_MSI_EC is not set
CONFIG_MSI_LAPTOP=m
CONFIG_MSI_WMI=m
CONFIG_PCENGINES_APU2=m
CONFIG_BARCO_P50_GPIO=m
CONFIG_SAMSUNG_LAPTOP=m
CONFIG_SAMSUNG_Q10=m
CONFIG_ACPI_TOSHIBA=m
CONFIG_TOSHIBA_BT_RFKILL=m
CONFIG_TOSHIBA_HAPS=m
# CONFIG_TOSHIBA_WMI is not set
CONFIG_ACPI_CMPC=m
CONFIG_COMPAL_LAPTOP=m
CONFIG_LG_LAPTOP=m
CONFIG_PANASONIC_LAPTOP=m
CONFIG_SONY_LAPTOP=m
CONFIG_SONYPI_COMPAT=y
CONFIG_SYSTEM76_ACPI=m
CONFIG_TOPSTAR_LAPTOP=m
CONFIG_SERIAL_MULTI_INSTANTIATE=m
CONFIG_MLX_PLATFORM=m
CONFIG_TOUCHSCREEN_DMI=y
CONFIG_X86_ANDROID_TABLETS=m
CONFIG_FW_ATTR_CLASS=m
CONFIG_INTEL_IPS=m
CONFIG_INTEL_SCU_IPC=y
CONFIG_INTEL_SCU=y
CONFIG_INTEL_SCU_PCI=y
CONFIG_INTEL_SCU_PLATFORM=m
CONFIG_INTEL_SCU_IPC_UTIL=m
CONFIG_SIEMENS_SIMATIC_IPC=m
CONFIG_SIEMENS_SIMATIC_IPC_BATT=m
CONFIG_SIEMENS_SIMATIC_IPC_BATT_APOLLOLAKE=m
CONFIG_SIEMENS_SIMATIC_IPC_BATT_ELKHARTLAKE=m
CONFIG_SIEMENS_SIMATIC_IPC_BATT_F7188X=m
CONFIG_WINMATE_FM07_KEYS=m
# CONFIG_SEL3350_PLATFORM is not set
CONFIG_P2SB=y
CONFIG_HAVE_CLK=y
CONFIG_HAVE_CLK_PREPARE=y
CONFIG_COMMON_CLK=y
CONFIG_COMMON_CLK_WM831X=m
CONFIG_LMK04832=m
CONFIG_COMMON_CLK_MAX9485=m
CONFIG_COMMON_CLK_SI5341=m
CONFIG_COMMON_CLK_SI5351=m
CONFIG_COMMON_CLK_SI544=m
CONFIG_COMMON_CLK_CDCE706=m
CONFIG_COMMON_CLK_TPS68470=m
CONFIG_COMMON_CLK_CS2000_CP=m
CONFIG_CLK_TWL6040=m
CONFIG_COMMON_CLK_PALMAS=m
CONFIG_COMMON_CLK_PWM=m
CONFIG_XILINX_VCU=m
CONFIG_HWSPINLOCK=y

#
# Clock Source drivers
#
CONFIG_CLKEVT_I8253=y
CONFIG_I8253_LOCK=y
CONFIG_CLKBLD_I8253=y
# end of Clock Source drivers

CONFIG_MAILBOX=y
CONFIG_PCC=y
CONFIG_ALTERA_MBOX=m
CONFIG_IOMMU_IOVA=y
CONFIG_IOMMU_API=y
CONFIG_IOMMU_SUPPORT=y

#
# Generic IOMMU Pagetable Support
#
CONFIG_IOMMU_IO_PGTABLE=y
# end of Generic IOMMU Pagetable Support

# CONFIG_IOMMU_DEBUGFS is not set
# CONFIG_IOMMU_DEFAULT_DMA_STRICT is not set
CONFIG_IOMMU_DEFAULT_DMA_LAZY=y
# CONFIG_IOMMU_DEFAULT_PASSTHROUGH is not set
CONFIG_IOMMU_DMA=y
CONFIG_IOMMU_SVA=y
CONFIG_AMD_IOMMU=y
CONFIG_AMD_IOMMU_V2=m
CONFIG_DMAR_TABLE=y
CONFIG_INTEL_IOMMU=y
CONFIG_INTEL_IOMMU_SVM=y
# CONFIG_INTEL_IOMMU_DEFAULT_ON is not set
CONFIG_INTEL_IOMMU_FLOPPY_WA=y
# CONFIG_INTEL_IOMMU_SCALABLE_MODE_DEFAULT_ON is not set
CONFIG_INTEL_IOMMU_PERF_EVENTS=y
CONFIG_IOMMUFD=m
CONFIG_IRQ_REMAP=y
CONFIG_HYPERV_IOMMU=y
CONFIG_VIRTIO_IOMMU=y

#
# Remoteproc drivers
#
CONFIG_REMOTEPROC=y
CONFIG_REMOTEPROC_CDEV=y
# end of Remoteproc drivers

#
# Rpmsg drivers
#
CONFIG_RPMSG=m
CONFIG_RPMSG_CHAR=m
CONFIG_RPMSG_CTRL=m
CONFIG_RPMSG_NS=m
CONFIG_RPMSG_QCOM_GLINK=m
CONFIG_RPMSG_QCOM_GLINK_RPM=m
CONFIG_RPMSG_VIRTIO=m
# end of Rpmsg drivers

CONFIG_SOUNDWIRE=m

#
# SoundWire Devices
#
# CONFIG_SOUNDWIRE_AMD is not set
CONFIG_SOUNDWIRE_CADENCE=m
CONFIG_SOUNDWIRE_INTEL=m
CONFIG_SOUNDWIRE_QCOM=m
CONFIG_SOUNDWIRE_GENERIC_ALLOCATION=m

#
# SOC (System On Chip) specific Drivers
#

#
# Amlogic SoC drivers
#
# end of Amlogic SoC drivers

#
# Broadcom SoC drivers
#
# end of Broadcom SoC drivers

#
# NXP/Freescale QorIQ SoC drivers
#
# end of NXP/Freescale QorIQ SoC drivers

#
# fujitsu SoC drivers
#
# end of fujitsu SoC drivers

#
# i.MX SoC drivers
#
# end of i.MX SoC drivers

#
# Enable LiteX SoC Builder specific drivers
#
# end of Enable LiteX SoC Builder specific drivers

# CONFIG_WPCM450_SOC is not set

#
# Qualcomm SoC drivers
#
CONFIG_QCOM_QMI_HELPERS=m
# end of Qualcomm SoC drivers

CONFIG_SOC_TI=y

#
# Xilinx SoC drivers
#
# end of Xilinx SoC drivers
# end of SOC (System On Chip) specific Drivers

CONFIG_PM_DEVFREQ=y

#
# DEVFREQ Governors
#
CONFIG_DEVFREQ_GOV_SIMPLE_ONDEMAND=y
CONFIG_DEVFREQ_GOV_PERFORMANCE=y
CONFIG_DEVFREQ_GOV_POWERSAVE=y
CONFIG_DEVFREQ_GOV_USERSPACE=y
CONFIG_DEVFREQ_GOV_PASSIVE=y

#
# DEVFREQ Drivers
#
CONFIG_PM_DEVFREQ_EVENT=y
CONFIG_EXTCON=y

#
# Extcon Device Drivers
#
CONFIG_EXTCON_ADC_JACK=m
CONFIG_EXTCON_AXP288=m
CONFIG_EXTCON_FSA9480=m
CONFIG_EXTCON_GPIO=m
CONFIG_EXTCON_INTEL_INT3496=m
CONFIG_EXTCON_INTEL_CHT_WC=m
CONFIG_EXTCON_INTEL_MRFLD=m
CONFIG_EXTCON_MAX14577=m
CONFIG_EXTCON_MAX3355=m
CONFIG_EXTCON_MAX77693=m
CONFIG_EXTCON_MAX77843=m
CONFIG_EXTCON_MAX8997=m
CONFIG_EXTCON_PALMAS=m
CONFIG_EXTCON_PTN5150=m
CONFIG_EXTCON_RT8973A=m
CONFIG_EXTCON_SM5502=m
CONFIG_EXTCON_USB_GPIO=m
CONFIG_EXTCON_USBC_CROS_EC=m
CONFIG_EXTCON_USBC_TUSB320=m
CONFIG_MEMORY=y
CONFIG_FPGA_DFL_EMIF=m
CONFIG_IIO=m
CONFIG_IIO_BUFFER=y
CONFIG_IIO_BUFFER_CB=m
CONFIG_IIO_BUFFER_DMA=m
CONFIG_IIO_BUFFER_DMAENGINE=m
CONFIG_IIO_BUFFER_HW_CONSUMER=m
CONFIG_IIO_KFIFO_BUF=m
CONFIG_IIO_TRIGGERED_BUFFER=m
CONFIG_IIO_CONFIGFS=m
CONFIG_IIO_TRIGGER=y
CONFIG_IIO_CONSUMERS_PER_TRIGGER=2
CONFIG_IIO_SW_DEVICE=m
CONFIG_IIO_SW_TRIGGER=m
CONFIG_IIO_TRIGGERED_EVENT=m

#
# Accelerometers
#
CONFIG_ADIS16201=m
CONFIG_ADIS16209=m
CONFIG_ADXL313=m
CONFIG_ADXL313_I2C=m
CONFIG_ADXL313_SPI=m
CONFIG_ADXL355=m
CONFIG_ADXL355_I2C=m
CONFIG_ADXL355_SPI=m
CONFIG_ADXL367=m
CONFIG_ADXL367_SPI=m
CONFIG_ADXL367_I2C=m
CONFIG_ADXL372=m
CONFIG_ADXL372_SPI=m
CONFIG_ADXL372_I2C=m
CONFIG_BMA220=m
CONFIG_BMA400=m
CONFIG_BMA400_I2C=m
CONFIG_BMA400_SPI=m
CONFIG_BMC150_ACCEL=m
CONFIG_BMC150_ACCEL_I2C=m
CONFIG_BMC150_ACCEL_SPI=m
CONFIG_BMI088_ACCEL=m
CONFIG_BMI088_ACCEL_SPI=m
CONFIG_DA280=m
CONFIG_DA311=m
CONFIG_DMARD06=m
CONFIG_DMARD09=m
CONFIG_DMARD10=m
CONFIG_FXLS8962AF=m
CONFIG_FXLS8962AF_I2C=m
CONFIG_FXLS8962AF_SPI=m
CONFIG_HID_SENSOR_ACCEL_3D=m
CONFIG_IIO_CROS_EC_ACCEL_LEGACY=m
CONFIG_IIO_ST_ACCEL_3AXIS=m
CONFIG_IIO_ST_ACCEL_I2C_3AXIS=m
CONFIG_IIO_ST_ACCEL_SPI_3AXIS=m
CONFIG_IIO_KX022A=m
CONFIG_IIO_KX022A_SPI=m
CONFIG_IIO_KX022A_I2C=m
CONFIG_KXSD9=m
CONFIG_KXSD9_SPI=m
CONFIG_KXSD9_I2C=m
CONFIG_KXCJK1013=m
CONFIG_MC3230=m
CONFIG_MMA7455=m
CONFIG_MMA7455_I2C=m
CONFIG_MMA7455_SPI=m
CONFIG_MMA7660=m
CONFIG_MMA8452=m
CONFIG_MMA9551_CORE=m
CONFIG_MMA9551=m
CONFIG_MMA9553=m
CONFIG_MSA311=m
CONFIG_MXC4005=m
CONFIG_MXC6255=m
CONFIG_SCA3000=m
CONFIG_SCA3300=m
CONFIG_STK8312=m
CONFIG_STK8BA50=m
# end of Accelerometers

#
# Analog to digital converters
#
CONFIG_AD_SIGMA_DELTA=m
CONFIG_AD4130=m
CONFIG_AD7091R5=m
CONFIG_AD7124=m
CONFIG_AD7192=m
CONFIG_AD7266=m
CONFIG_AD7280=m
CONFIG_AD7291=m
CONFIG_AD7292=m
CONFIG_AD7298=m
CONFIG_AD7476=m
CONFIG_AD7606=m
CONFIG_AD7606_IFACE_PARALLEL=m
CONFIG_AD7606_IFACE_SPI=m
CONFIG_AD7766=m
CONFIG_AD7768_1=m
CONFIG_AD7780=m
CONFIG_AD7791=m
CONFIG_AD7793=m
CONFIG_AD7887=m
CONFIG_AD7923=m
CONFIG_AD7949=m
CONFIG_AD799X=m
CONFIG_AXP20X_ADC=m
CONFIG_AXP288_ADC=m
CONFIG_CC10001_ADC=m
CONFIG_DA9150_GPADC=m
CONFIG_DLN2_ADC=m
CONFIG_ENVELOPE_DETECTOR=m
CONFIG_HI8435=m
CONFIG_HX711=m
CONFIG_INA2XX_ADC=m
CONFIG_INTEL_MRFLD_ADC=m
CONFIG_LP8788_ADC=m
CONFIG_LTC2471=m
CONFIG_LTC2485=m
CONFIG_LTC2496=m
CONFIG_LTC2497=m
CONFIG_MAX1027=m
CONFIG_MAX11100=m
CONFIG_MAX1118=m
CONFIG_MAX11205=m
CONFIG_MAX11410=m
CONFIG_MAX1241=m
CONFIG_MAX1363=m
CONFIG_MAX9611=m
CONFIG_MCP320X=m
CONFIG_MCP3422=m
CONFIG_MCP3911=m
CONFIG_MEDIATEK_MT6360_ADC=m
CONFIG_MEDIATEK_MT6370_ADC=m
CONFIG_MEN_Z188_ADC=m
CONFIG_MP2629_ADC=m
CONFIG_NAU7802=m
CONFIG_PALMAS_GPADC=m
CONFIG_QCOM_VADC_COMMON=m
CONFIG_QCOM_SPMI_IADC=m
CONFIG_QCOM_SPMI_VADC=m
CONFIG_QCOM_SPMI_ADC5=m
CONFIG_RICHTEK_RTQ6056=m
CONFIG_SD_ADC_MODULATOR=m
CONFIG_TI_ADC081C=m
CONFIG_TI_ADC0832=m
CONFIG_TI_ADC084S021=m
CONFIG_TI_ADC12138=m
CONFIG_TI_ADC108S102=m
CONFIG_TI_ADC128S052=m
CONFIG_TI_ADC161S626=m
CONFIG_TI_ADS1015=m
# CONFIG_TI_ADS7924 is not set
# CONFIG_TI_ADS1100 is not set
CONFIG_TI_ADS7950=m
CONFIG_TI_ADS8344=m
CONFIG_TI_ADS8688=m
CONFIG_TI_ADS124S08=m
CONFIG_TI_ADS131E08=m
CONFIG_TI_AM335X_ADC=m
# CONFIG_TI_LMP92064 is not set
CONFIG_TI_TLC4541=m
CONFIG_TI_TSC2046=m
CONFIG_TWL4030_MADC=m
CONFIG_TWL6030_GPADC=m
CONFIG_VF610_ADC=m
CONFIG_VIPERBOARD_ADC=m
CONFIG_XILINX_XADC=m
# end of Analog to digital converters

#
# Analog to digital and digital to analog converters
#
CONFIG_AD74115=m
CONFIG_AD74413R=m
CONFIG_STX104=m
# end of Analog to digital and digital to analog converters

#
# Analog Front Ends
#
CONFIG_IIO_RESCALE=m
# end of Analog Front Ends

#
# Amplifiers
#
CONFIG_AD8366=m
CONFIG_ADA4250=m
CONFIG_HMC425=m
# end of Amplifiers

#
# Capacitance to digital converters
#
CONFIG_AD7150=m
CONFIG_AD7746=m
# end of Capacitance to digital converters

#
# Chemical Sensors
#
CONFIG_ATLAS_PH_SENSOR=m
CONFIG_ATLAS_EZO_SENSOR=m
CONFIG_BME680=m
CONFIG_BME680_I2C=m
CONFIG_BME680_SPI=m
CONFIG_CCS811=m
CONFIG_IAQCORE=m
CONFIG_PMS7003=m
CONFIG_SCD30_CORE=m
CONFIG_SCD30_I2C=m
CONFIG_SCD30_SERIAL=m
CONFIG_SCD4X=m
CONFIG_SENSIRION_SGP30=m
CONFIG_SENSIRION_SGP40=m
CONFIG_SPS30=m
CONFIG_SPS30_I2C=m
CONFIG_SPS30_SERIAL=m
CONFIG_SENSEAIR_SUNRISE_CO2=m
CONFIG_VZ89X=m
# end of Chemical Sensors

CONFIG_IIO_CROS_EC_SENSORS_CORE=m
CONFIG_IIO_CROS_EC_SENSORS=m
CONFIG_IIO_CROS_EC_SENSORS_LID_ANGLE=m

#
# Hid Sensor IIO Common
#
CONFIG_HID_SENSOR_IIO_COMMON=m
CONFIG_HID_SENSOR_IIO_TRIGGER=m
# end of Hid Sensor IIO Common

CONFIG_IIO_INV_SENSORS_TIMESTAMP=m
CONFIG_IIO_MS_SENSORS_I2C=m

#
# IIO SCMI Sensors
#
# end of IIO SCMI Sensors

#
# SSP Sensor Common
#
CONFIG_IIO_SSP_SENSORS_COMMONS=m
CONFIG_IIO_SSP_SENSORHUB=m
# end of SSP Sensor Common

CONFIG_IIO_ST_SENSORS_I2C=m
CONFIG_IIO_ST_SENSORS_SPI=m
CONFIG_IIO_ST_SENSORS_CORE=m

#
# Digital to analog converters
#
CONFIG_AD3552R=m
CONFIG_AD5064=m
CONFIG_AD5360=m
CONFIG_AD5380=m
CONFIG_AD5421=m
CONFIG_AD5446=m
CONFIG_AD5449=m
CONFIG_AD5592R_BASE=m
CONFIG_AD5592R=m
CONFIG_AD5593R=m
CONFIG_AD5504=m
CONFIG_AD5624R_SPI=m
CONFIG_LTC2688=m
CONFIG_AD5686=m
CONFIG_AD5686_SPI=m
CONFIG_AD5696_I2C=m
CONFIG_AD5755=m
CONFIG_AD5758=m
CONFIG_AD5761=m
CONFIG_AD5764=m
CONFIG_AD5766=m
CONFIG_AD5770R=m
CONFIG_AD5791=m
CONFIG_AD7293=m
CONFIG_AD7303=m
CONFIG_AD8801=m
CONFIG_CIO_DAC=m
CONFIG_DPOT_DAC=m
CONFIG_DS4424=m
CONFIG_LTC1660=m
CONFIG_LTC2632=m
CONFIG_M62332=m
CONFIG_MAX517=m
# CONFIG_MAX5522 is not set
CONFIG_MAX5821=m
CONFIG_MCP4725=m
# CONFIG_MCP4728 is not set
CONFIG_MCP4922=m
CONFIG_TI_DAC082S085=m
CONFIG_TI_DAC5571=m
CONFIG_TI_DAC7311=m
CONFIG_TI_DAC7612=m
CONFIG_VF610_DAC=m
# end of Digital to analog converters

#
# IIO dummy driver
#
CONFIG_IIO_SIMPLE_DUMMY=m
# CONFIG_IIO_SIMPLE_DUMMY_EVENTS is not set
# CONFIG_IIO_SIMPLE_DUMMY_BUFFER is not set
# end of IIO dummy driver

#
# Filters
#
CONFIG_ADMV8818=m
# end of Filters

#
# Frequency Synthesizers DDS/PLL
#

#
# Clock Generator/Distribution
#
CONFIG_AD9523=m
# end of Clock Generator/Distribution

#
# Phase-Locked Loop (PLL) frequency synthesizers
#
CONFIG_ADF4350=m
CONFIG_ADF4371=m
CONFIG_ADF4377=m
CONFIG_ADMV1013=m
CONFIG_ADMV1014=m
CONFIG_ADMV4420=m
CONFIG_ADRF6780=m
# end of Phase-Locked Loop (PLL) frequency synthesizers
# end of Frequency Synthesizers DDS/PLL

#
# Digital gyroscope sensors
#
CONFIG_ADIS16080=m
CONFIG_ADIS16130=m
CONFIG_ADIS16136=m
CONFIG_ADIS16260=m
CONFIG_ADXRS290=m
CONFIG_ADXRS450=m
CONFIG_BMG160=m
CONFIG_BMG160_I2C=m
CONFIG_BMG160_SPI=m
CONFIG_FXAS21002C=m
CONFIG_FXAS21002C_I2C=m
CONFIG_FXAS21002C_SPI=m
CONFIG_HID_SENSOR_GYRO_3D=m
CONFIG_MPU3050=m
CONFIG_MPU3050_I2C=m
CONFIG_IIO_ST_GYRO_3AXIS=m
CONFIG_IIO_ST_GYRO_I2C_3AXIS=m
CONFIG_IIO_ST_GYRO_SPI_3AXIS=m
CONFIG_ITG3200=m
# end of Digital gyroscope sensors

#
# Health Sensors
#

#
# Heart Rate Monitors
#
CONFIG_AFE4403=m
CONFIG_AFE4404=m
CONFIG_MAX30100=m
CONFIG_MAX30102=m
# end of Heart Rate Monitors
# end of Health Sensors

#
# Humidity sensors
#
CONFIG_AM2315=m
CONFIG_DHT11=m
CONFIG_HDC100X=m
CONFIG_HDC2010=m
CONFIG_HID_SENSOR_HUMIDITY=m
CONFIG_HTS221=m
CONFIG_HTS221_I2C=m
CONFIG_HTS221_SPI=m
CONFIG_HTU21=m
CONFIG_SI7005=m
CONFIG_SI7020=m
# end of Humidity sensors

#
# Inertial measurement units
#
CONFIG_ADIS16400=m
CONFIG_ADIS16460=m
CONFIG_ADIS16475=m
CONFIG_ADIS16480=m
CONFIG_BMI160=m
CONFIG_BMI160_I2C=m
CONFIG_BMI160_SPI=m
CONFIG_BOSCH_BNO055=m
CONFIG_BOSCH_BNO055_SERIAL=m
CONFIG_BOSCH_BNO055_I2C=m
CONFIG_FXOS8700=m
CONFIG_FXOS8700_I2C=m
CONFIG_FXOS8700_SPI=m
CONFIG_KMX61=m
CONFIG_INV_ICM42600=m
CONFIG_INV_ICM42600_I2C=m
CONFIG_INV_ICM42600_SPI=m
CONFIG_INV_MPU6050_IIO=m
CONFIG_INV_MPU6050_I2C=m
CONFIG_INV_MPU6050_SPI=m
CONFIG_IIO_ST_LSM6DSX=m
CONFIG_IIO_ST_LSM6DSX_I2C=m
CONFIG_IIO_ST_LSM6DSX_SPI=m
CONFIG_IIO_ST_LSM6DSX_I3C=m
CONFIG_IIO_ST_LSM9DS0=m
CONFIG_IIO_ST_LSM9DS0_I2C=m
CONFIG_IIO_ST_LSM9DS0_SPI=m
# end of Inertial measurement units

CONFIG_IIO_ADIS_LIB=m
CONFIG_IIO_ADIS_LIB_BUFFER=y

#
# Light sensors
#
CONFIG_ACPI_ALS=m
CONFIG_ADJD_S311=m
CONFIG_ADUX1020=m
CONFIG_AL3010=m
CONFIG_AL3320A=m
CONFIG_APDS9300=m
CONFIG_APDS9960=m
CONFIG_AS73211=m
CONFIG_BH1750=m
CONFIG_BH1780=m
CONFIG_CM32181=m
CONFIG_CM3232=m
CONFIG_CM3323=m
CONFIG_CM3605=m
CONFIG_CM36651=m
CONFIG_IIO_CROS_EC_LIGHT_PROX=m
CONFIG_GP2AP002=m
CONFIG_GP2AP020A00F=m
CONFIG_IQS621_ALS=m
CONFIG_SENSORS_ISL29018=m
CONFIG_SENSORS_ISL29028=m
CONFIG_ISL29125=m
CONFIG_HID_SENSOR_ALS=m
CONFIG_HID_SENSOR_PROX=m
CONFIG_JSA1212=m
# CONFIG_ROHM_BU27008 is not set
# CONFIG_ROHM_BU27034 is not set
CONFIG_RPR0521=m
CONFIG_SENSORS_LM3533=m
CONFIG_LTR501=m
CONFIG_LTRF216A=m
CONFIG_LV0104CS=m
CONFIG_MAX44000=m
CONFIG_MAX44009=m
CONFIG_NOA1305=m
CONFIG_OPT3001=m
# CONFIG_OPT4001 is not set
CONFIG_PA12203001=m
CONFIG_SI1133=m
CONFIG_SI1145=m
CONFIG_STK3310=m
CONFIG_ST_UVIS25=m
CONFIG_ST_UVIS25_I2C=m
CONFIG_ST_UVIS25_SPI=m
CONFIG_TCS3414=m
CONFIG_TCS3472=m
CONFIG_SENSORS_TSL2563=m
CONFIG_TSL2583=m
CONFIG_TSL2591=m
CONFIG_TSL2772=m
CONFIG_TSL4531=m
CONFIG_US5182D=m
CONFIG_VCNL4000=m
CONFIG_VCNL4035=m
CONFIG_VEML6030=m
CONFIG_VEML6070=m
CONFIG_VL6180=m
CONFIG_ZOPT2201=m
# end of Light sensors

#
# Magnetometer sensors
#
CONFIG_AK8974=m
CONFIG_AK8975=m
CONFIG_AK09911=m
CONFIG_BMC150_MAGN=m
CONFIG_BMC150_MAGN_I2C=m
CONFIG_BMC150_MAGN_SPI=m
CONFIG_MAG3110=m
CONFIG_HID_SENSOR_MAGNETOMETER_3D=m
CONFIG_MMC35240=m
CONFIG_IIO_ST_MAGN_3AXIS=m
CONFIG_IIO_ST_MAGN_I2C_3AXIS=m
CONFIG_IIO_ST_MAGN_SPI_3AXIS=m
CONFIG_SENSORS_HMC5843=m
CONFIG_SENSORS_HMC5843_I2C=m
CONFIG_SENSORS_HMC5843_SPI=m
CONFIG_SENSORS_RM3100=m
CONFIG_SENSORS_RM3100_I2C=m
CONFIG_SENSORS_RM3100_SPI=m
# CONFIG_TI_TMAG5273 is not set
CONFIG_YAMAHA_YAS530=m
# end of Magnetometer sensors

#
# Multiplexers
#
CONFIG_IIO_MUX=m
# end of Multiplexers

#
# Inclinometer sensors
#
CONFIG_HID_SENSOR_INCLINOMETER_3D=m
CONFIG_HID_SENSOR_DEVICE_ROTATION=m
# end of Inclinometer sensors

#
# Triggers - standalone
#
CONFIG_IIO_HRTIMER_TRIGGER=m
CONFIG_IIO_INTERRUPT_TRIGGER=m
CONFIG_IIO_TIGHTLOOP_TRIGGER=m
CONFIG_IIO_SYSFS_TRIGGER=m
# end of Triggers - standalone

#
# Linear and angular position sensors
#
CONFIG_IQS624_POS=m
CONFIG_HID_SENSOR_CUSTOM_INTEL_HINGE=m
# end of Linear and angular position sensors

#
# Digital potentiometers
#
CONFIG_AD5110=m
CONFIG_AD5272=m
CONFIG_DS1803=m
CONFIG_MAX5432=m
CONFIG_MAX5481=m
CONFIG_MAX5487=m
CONFIG_MCP4018=m
CONFIG_MCP4131=m
CONFIG_MCP4531=m
CONFIG_MCP41010=m
CONFIG_TPL0102=m
# CONFIG_X9250 is not set
# end of Digital potentiometers

#
# Digital potentiostats
#
CONFIG_LMP91000=m
# end of Digital potentiostats

#
# Pressure sensors
#
CONFIG_ABP060MG=m
CONFIG_BMP280=m
CONFIG_BMP280_I2C=m
CONFIG_BMP280_SPI=m
CONFIG_IIO_CROS_EC_BARO=m
CONFIG_DLHL60D=m
CONFIG_DPS310=m
CONFIG_HID_SENSOR_PRESS=m
CONFIG_HP03=m
CONFIG_ICP10100=m
CONFIG_MPL115=m
CONFIG_MPL115_I2C=m
CONFIG_MPL115_SPI=m
CONFIG_MPL3115=m
# CONFIG_MPRLS0025PA is not set
CONFIG_MS5611=m
CONFIG_MS5611_I2C=m
CONFIG_MS5611_SPI=m
CONFIG_MS5637=m
CONFIG_IIO_ST_PRESS=m
CONFIG_IIO_ST_PRESS_I2C=m
CONFIG_IIO_ST_PRESS_SPI=m
CONFIG_T5403=m
CONFIG_HP206C=m
CONFIG_ZPA2326=m
CONFIG_ZPA2326_I2C=m
CONFIG_ZPA2326_SPI=m
# end of Pressure sensors

#
# Lightning sensors
#
CONFIG_AS3935=m
# end of Lightning sensors

#
# Proximity and distance sensors
#
CONFIG_CROS_EC_MKBP_PROXIMITY=m
# CONFIG_IRSD200 is not set
CONFIG_ISL29501=m
CONFIG_LIDAR_LITE_V2=m
CONFIG_MB1232=m
CONFIG_PING=m
CONFIG_RFD77402=m
CONFIG_SRF04=m
CONFIG_SX_COMMON=m
CONFIG_SX9310=m
CONFIG_SX9324=m
CONFIG_SX9360=m
CONFIG_SX9500=m
CONFIG_SRF08=m
CONFIG_VCNL3020=m
CONFIG_VL53L0X_I2C=m
# end of Proximity and distance sensors

#
# Resolver to digital converters
#
CONFIG_AD2S90=m
CONFIG_AD2S1200=m
# end of Resolver to digital converters

#
# Temperature sensors
#
CONFIG_IQS620AT_TEMP=m
CONFIG_LTC2983=m
CONFIG_MAXIM_THERMOCOUPLE=m
CONFIG_HID_SENSOR_TEMP=m
CONFIG_MLX90614=m
CONFIG_MLX90632=m
CONFIG_TMP006=m
CONFIG_TMP007=m
CONFIG_TMP117=m
CONFIG_TSYS01=m
CONFIG_TSYS02D=m
CONFIG_MAX30208=m
CONFIG_MAX31856=m
CONFIG_MAX31865=m
# end of Temperature sensors

CONFIG_NTB=m
CONFIG_NTB_MSI=y
# CONFIG_NTB_AMD is not set
CONFIG_NTB_IDT=m
CONFIG_NTB_INTEL=m
CONFIG_NTB_EPF=m
CONFIG_NTB_SWITCHTEC=m
CONFIG_NTB_PINGPONG=m
CONFIG_NTB_TOOL=m
CONFIG_NTB_PERF=m
# CONFIG_NTB_MSI_TEST is not set
CONFIG_NTB_TRANSPORT=m
CONFIG_PWM=y
CONFIG_PWM_SYSFS=y
# CONFIG_PWM_DEBUG is not set
CONFIG_PWM_CLK=m
CONFIG_PWM_CRC=y
CONFIG_PWM_CROS_EC=m
CONFIG_PWM_DWC=m
CONFIG_PWM_IQS620A=m
CONFIG_PWM_LP3943=m
CONFIG_PWM_LPSS=y
CONFIG_PWM_LPSS_PCI=y
CONFIG_PWM_LPSS_PLATFORM=y
CONFIG_PWM_PCA9685=m
CONFIG_PWM_TWL=m
CONFIG_PWM_TWL_LED=m

#
# IRQ chip support
#
CONFIG_MADERA_IRQ=m
# end of IRQ chip support

CONFIG_IPACK_BUS=m
CONFIG_BOARD_TPCI200=m
CONFIG_SERIAL_IPOCTAL=m
CONFIG_RESET_CONTROLLER=y
CONFIG_RESET_SIMPLE=y
CONFIG_RESET_TI_SYSCON=m
CONFIG_RESET_TI_TPS380X=m

#
# PHY Subsystem
#
CONFIG_GENERIC_PHY=y
CONFIG_GENERIC_PHY_MIPI_DPHY=y
CONFIG_USB_LGM_PHY=m
CONFIG_PHY_CAN_TRANSCEIVER=m

#
# PHY drivers for Broadcom platforms
#
CONFIG_BCM_KONA_USB2_PHY=m
# end of PHY drivers for Broadcom platforms

CONFIG_PHY_PXA_28NM_HSIC=m
CONFIG_PHY_PXA_28NM_USB2=m
CONFIG_PHY_CPCAP_USB=m
CONFIG_PHY_QCOM_USB_HS=m
CONFIG_PHY_QCOM_USB_HSIC=m
# CONFIG_PHY_RTK_RTD_USB2PHY is not set
# CONFIG_PHY_RTK_RTD_USB3PHY is not set
CONFIG_PHY_SAMSUNG_USB2=m
CONFIG_PHY_TUSB1210=m
CONFIG_PHY_INTEL_LGM_EMMC=m
# end of PHY Subsystem

CONFIG_POWERCAP=y
CONFIG_INTEL_RAPL_CORE=m
CONFIG_INTEL_RAPL=m
CONFIG_IDLE_INJECT=y
CONFIG_MCB=m
CONFIG_MCB_PCI=m
CONFIG_MCB_LPC=m

#
# Performance monitor support
#
# end of Performance monitor support

CONFIG_RAS=y
CONFIG_RAS_CEC=y
# CONFIG_RAS_CEC_DEBUG is not set
CONFIG_USB4=m
# CONFIG_USB4_DEBUGFS_WRITE is not set
# CONFIG_USB4_DMA_TEST is not set

#
# Android
#
# CONFIG_ANDROID_BINDER_IPC is not set
# end of Android

CONFIG_LIBNVDIMM=y
CONFIG_BLK_DEV_PMEM=m
CONFIG_ND_CLAIM=y
CONFIG_ND_BTT=m
CONFIG_BTT=y
CONFIG_ND_PFN=m
CONFIG_NVDIMM_PFN=y
CONFIG_NVDIMM_DAX=y
CONFIG_NVDIMM_KEYS=y
# CONFIG_NVDIMM_SECURITY_TEST is not set
CONFIG_DAX=y
CONFIG_DEV_DAX=m
CONFIG_DEV_DAX_PMEM=m
CONFIG_DEV_DAX_HMEM=m
CONFIG_DEV_DAX_CXL=m
CONFIG_DEV_DAX_HMEM_DEVICES=y
CONFIG_DEV_DAX_KMEM=m
CONFIG_NVMEM=y
CONFIG_NVMEM_SYSFS=y

#
# Layout Types
#
# CONFIG_NVMEM_LAYOUT_SL28_VPD is not set
# CONFIG_NVMEM_LAYOUT_ONIE_TLV is not set
# end of Layout Types

CONFIG_NVMEM_RAVE_SP_EEPROM=m
CONFIG_NVMEM_RMEM=m
CONFIG_NVMEM_SPMI_SDAM=m

#
# HW tracing support
#
CONFIG_STM=m
CONFIG_STM_PROTO_BASIC=m
CONFIG_STM_PROTO_SYS_T=m
CONFIG_STM_DUMMY=m
CONFIG_STM_SOURCE_CONSOLE=m
CONFIG_STM_SOURCE_HEARTBEAT=m
CONFIG_STM_SOURCE_FTRACE=m
CONFIG_INTEL_TH=m
CONFIG_INTEL_TH_PCI=m
CONFIG_INTEL_TH_ACPI=m
CONFIG_INTEL_TH_GTH=m
CONFIG_INTEL_TH_STH=m
CONFIG_INTEL_TH_MSU=m
CONFIG_INTEL_TH_PTI=m
# CONFIG_INTEL_TH_DEBUG is not set
# end of HW tracing support

CONFIG_FPGA=m
CONFIG_ALTERA_PR_IP_CORE=m
CONFIG_FPGA_MGR_ALTERA_PS_SPI=m
CONFIG_FPGA_MGR_ALTERA_CVP=m
CONFIG_FPGA_MGR_XILINX_SPI=m
CONFIG_FPGA_MGR_MACHXO2_SPI=m
CONFIG_FPGA_BRIDGE=m
CONFIG_ALTERA_FREEZE_BRIDGE=m
CONFIG_XILINX_PR_DECOUPLER=m
CONFIG_FPGA_REGION=m
CONFIG_FPGA_DFL=m
CONFIG_FPGA_DFL_FME=m
CONFIG_FPGA_DFL_FME_MGR=m
CONFIG_FPGA_DFL_FME_BRIDGE=m
CONFIG_FPGA_DFL_FME_REGION=m
CONFIG_FPGA_DFL_AFU=m
CONFIG_FPGA_DFL_NIOS_INTEL_PAC_N3000=m
CONFIG_FPGA_DFL_PCI=m
CONFIG_FPGA_MGR_MICROCHIP_SPI=m
CONFIG_FPGA_MGR_LATTICE_SYSCONFIG=m
CONFIG_FPGA_MGR_LATTICE_SYSCONFIG_SPI=m
CONFIG_TEE=m
CONFIG_AMDTEE=m
CONFIG_MULTIPLEXER=m

#
# Multiplexer drivers
#
CONFIG_MUX_ADG792A=m
CONFIG_MUX_ADGS1408=m
CONFIG_MUX_GPIO=m
# end of Multiplexer drivers

CONFIG_PM_OPP=y
CONFIG_SIOX=m
CONFIG_SIOX_BUS_GPIO=m
CONFIG_SLIMBUS=m
CONFIG_SLIM_QCOM_CTRL=m
CONFIG_INTERCONNECT=y
CONFIG_I8254=m
CONFIG_COUNTER=m
CONFIG_104_QUAD_8=m
CONFIG_INTEL_QEP=m
CONFIG_INTERRUPT_CNT=m
CONFIG_MOST=m
CONFIG_MOST_USB_HDM=m
CONFIG_MOST_CDEV=m
CONFIG_MOST_SND=m
CONFIG_PECI=m
CONFIG_PECI_CPU=m
CONFIG_HTE=y
# end of Device Drivers

#
# File systems
#
CONFIG_DCACHE_WORD_ACCESS=y
CONFIG_VALIDATE_FS_PARSER=y
CONFIG_FS_IOMAP=y
CONFIG_BUFFER_HEAD=y
CONFIG_LEGACY_DIRECT_IO=y
# CONFIG_EXT2_FS is not set
# CONFIG_EXT3_FS is not set
CONFIG_EXT4_FS=y
CONFIG_EXT4_USE_FOR_EXT2=y
CONFIG_EXT4_FS_POSIX_ACL=y
CONFIG_EXT4_FS_SECURITY=y
# CONFIG_EXT4_DEBUG is not set
CONFIG_JBD2=y
# CONFIG_JBD2_DEBUG is not set
CONFIG_FS_MBCACHE=y
CONFIG_REISERFS_FS=m
# CONFIG_REISERFS_CHECK is not set
# CONFIG_REISERFS_PROC_INFO is not set
CONFIG_REISERFS_FS_XATTR=y
CONFIG_REISERFS_FS_POSIX_ACL=y
CONFIG_REISERFS_FS_SECURITY=y
CONFIG_JFS_FS=m
CONFIG_JFS_POSIX_ACL=y
CONFIG_JFS_SECURITY=y
# CONFIG_JFS_DEBUG is not set
CONFIG_JFS_STATISTICS=y
CONFIG_XFS_FS=m
CONFIG_XFS_SUPPORT_V4=y
CONFIG_XFS_SUPPORT_ASCII_CI=y
CONFIG_XFS_QUOTA=y
CONFIG_XFS_POSIX_ACL=y
CONFIG_XFS_RT=y
# CONFIG_XFS_ONLINE_SCRUB is not set
# CONFIG_XFS_WARN is not set
# CONFIG_XFS_DEBUG is not set
CONFIG_GFS2_FS=m
CONFIG_GFS2_FS_LOCKING_DLM=y
CONFIG_OCFS2_FS=m
CONFIG_OCFS2_FS_O2CB=m
CONFIG_OCFS2_FS_USERSPACE_CLUSTER=m
CONFIG_OCFS2_FS_STATS=y
CONFIG_OCFS2_DEBUG_MASKLOG=y
# CONFIG_OCFS2_DEBUG_FS is not set
CONFIG_BTRFS_FS=m
CONFIG_BTRFS_FS_POSIX_ACL=y
# CONFIG_BTRFS_FS_CHECK_INTEGRITY is not set
# CONFIG_BTRFS_FS_RUN_SANITY_TESTS is not set
# CONFIG_BTRFS_DEBUG is not set
# CONFIG_BTRFS_ASSERT is not set
# CONFIG_BTRFS_FS_REF_VERIFY is not set
CONFIG_NILFS2_FS=m
CONFIG_F2FS_FS=m
CONFIG_F2FS_STAT_FS=y
CONFIG_F2FS_FS_XATTR=y
CONFIG_F2FS_FS_POSIX_ACL=y
CONFIG_F2FS_FS_SECURITY=y
# CONFIG_F2FS_CHECK_FS is not set
# CONFIG_F2FS_FAULT_INJECTION is not set
CONFIG_F2FS_FS_COMPRESSION=y
CONFIG_F2FS_FS_LZO=y
CONFIG_F2FS_FS_LZORLE=y
CONFIG_F2FS_FS_LZ4=y
CONFIG_F2FS_FS_LZ4HC=y
CONFIG_F2FS_FS_ZSTD=y
# CONFIG_F2FS_IOSTAT is not set
CONFIG_F2FS_UNFAIR_RWSEM=y
CONFIG_ZONEFS_FS=m
CONFIG_FS_DAX=y
CONFIG_FS_DAX_PMD=y
CONFIG_FS_POSIX_ACL=y
CONFIG_EXPORTFS=y
CONFIG_EXPORTFS_BLOCK_OPS=y
CONFIG_FILE_LOCKING=y
CONFIG_FS_ENCRYPTION=y
CONFIG_FS_ENCRYPTION_ALGS=y
CONFIG_FS_ENCRYPTION_INLINE_CRYPT=y
CONFIG_FS_VERITY=y
CONFIG_FS_VERITY_BUILTIN_SIGNATURES=y
CONFIG_FSNOTIFY=y
CONFIG_DNOTIFY=y
CONFIG_INOTIFY_USER=y
CONFIG_FANOTIFY=y
CONFIG_FANOTIFY_ACCESS_PERMISSIONS=y
CONFIG_QUOTA=y
CONFIG_QUOTA_NETLINK_INTERFACE=y
# CONFIG_QUOTA_DEBUG is not set
CONFIG_QUOTA_TREE=m
CONFIG_QFMT_V1=m
CONFIG_QFMT_V2=m
CONFIG_QUOTACTL=y
CONFIG_AUTOFS_FS=m
CONFIG_FUSE_FS=y
CONFIG_CUSE=m
CONFIG_VIRTIO_FS=m
CONFIG_FUSE_DAX=y
CONFIG_OVERLAY_FS=m
# CONFIG_OVERLAY_FS_REDIRECT_DIR is not set
CONFIG_OVERLAY_FS_REDIRECT_ALWAYS_FOLLOW=y
# CONFIG_OVERLAY_FS_INDEX is not set
CONFIG_OVERLAY_FS_XINO_AUTO=y
# CONFIG_OVERLAY_FS_METACOPY is not set
# CONFIG_OVERLAY_FS_DEBUG is not set

#
# Caches
#
CONFIG_NETFS_SUPPORT=m
CONFIG_NETFS_STATS=y
CONFIG_FSCACHE=m
CONFIG_FSCACHE_STATS=y
# CONFIG_FSCACHE_DEBUG is not set
CONFIG_CACHEFILES=m
# CONFIG_CACHEFILES_DEBUG is not set
CONFIG_CACHEFILES_ERROR_INJECTION=y
# CONFIG_CACHEFILES_ONDEMAND is not set
# end of Caches

#
# CD-ROM/DVD Filesystems
#
CONFIG_ISO9660_FS=m
CONFIG_JOLIET=y
CONFIG_ZISOFS=y
CONFIG_UDF_FS=m
# end of CD-ROM/DVD Filesystems

#
# DOS/FAT/EXFAT/NT Filesystems
#
CONFIG_FAT_FS=y
CONFIG_MSDOS_FS=m
CONFIG_VFAT_FS=y
CONFIG_FAT_DEFAULT_CODEPAGE=437
CONFIG_FAT_DEFAULT_IOCHARSET="iso8859-1"
# CONFIG_FAT_DEFAULT_UTF8 is not set
CONFIG_EXFAT_FS=m
CONFIG_EXFAT_DEFAULT_IOCHARSET="utf8"
CONFIG_NTFS_FS=m
# CONFIG_NTFS_DEBUG is not set
# CONFIG_NTFS_RW is not set
CONFIG_NTFS3_FS=m
# CONFIG_NTFS3_64BIT_CLUSTER is not set
CONFIG_NTFS3_LZX_XPRESS=y
CONFIG_NTFS3_FS_POSIX_ACL=y
# end of DOS/FAT/EXFAT/NT Filesystems

#
# Pseudo filesystems
#
CONFIG_PROC_FS=y
CONFIG_PROC_KCORE=y
CONFIG_PROC_VMCORE=y
CONFIG_PROC_VMCORE_DEVICE_DUMP=y
CONFIG_PROC_SYSCTL=y
CONFIG_PROC_PAGE_MONITOR=y
CONFIG_PROC_CHILDREN=y
CONFIG_PROC_PID_ARCH_STATUS=y
CONFIG_PROC_CPU_RESCTRL=y
CONFIG_KERNFS=y
CONFIG_SYSFS=y
CONFIG_TMPFS=y
CONFIG_TMPFS_POSIX_ACL=y
CONFIG_TMPFS_XATTR=y
CONFIG_TMPFS_INODE64=y
# CONFIG_TMPFS_QUOTA is not set
CONFIG_HUGETLBFS=y
CONFIG_HUGETLB_PAGE=y
CONFIG_HUGETLB_PAGE_OPTIMIZE_VMEMMAP=y
# CONFIG_HUGETLB_PAGE_OPTIMIZE_VMEMMAP_DEFAULT_ON is not set
CONFIG_ARCH_HAS_GIGANTIC_PAGE=y
CONFIG_CONFIGFS_FS=y
CONFIG_EFIVAR_FS=y
# end of Pseudo filesystems

CONFIG_MISC_FILESYSTEMS=y
CONFIG_ORANGEFS_FS=m
CONFIG_ADFS_FS=m
# CONFIG_ADFS_FS_RW is not set
CONFIG_AFFS_FS=m
CONFIG_ECRYPT_FS=y
CONFIG_ECRYPT_FS_MESSAGING=y
CONFIG_HFS_FS=m
CONFIG_HFSPLUS_FS=m
CONFIG_BEFS_FS=m
# CONFIG_BEFS_DEBUG is not set
CONFIG_BFS_FS=m
CONFIG_EFS_FS=m
CONFIG_JFFS2_FS=m
CONFIG_JFFS2_FS_DEBUG=0
CONFIG_JFFS2_FS_WRITEBUFFER=y
# CONFIG_JFFS2_FS_WBUF_VERIFY is not set
# CONFIG_JFFS2_SUMMARY is not set
CONFIG_JFFS2_FS_XATTR=y
CONFIG_JFFS2_FS_POSIX_ACL=y
CONFIG_JFFS2_FS_SECURITY=y
CONFIG_JFFS2_COMPRESSION_OPTIONS=y
CONFIG_JFFS2_ZLIB=y
CONFIG_JFFS2_LZO=y
CONFIG_JFFS2_RTIME=y
# CONFIG_JFFS2_RUBIN is not set
# CONFIG_JFFS2_CMODE_NONE is not set
# CONFIG_JFFS2_CMODE_PRIORITY is not set
# CONFIG_JFFS2_CMODE_SIZE is not set
CONFIG_JFFS2_CMODE_FAVOURLZO=y
CONFIG_UBIFS_FS=m
# CONFIG_UBIFS_FS_ADVANCED_COMPR is not set
CONFIG_UBIFS_FS_LZO=y
CONFIG_UBIFS_FS_ZLIB=y
CONFIG_UBIFS_FS_ZSTD=y
# CONFIG_UBIFS_ATIME_SUPPORT is not set
CONFIG_UBIFS_FS_XATTR=y
CONFIG_UBIFS_FS_SECURITY=y
CONFIG_UBIFS_FS_AUTHENTICATION=y
CONFIG_CRAMFS=m
CONFIG_CRAMFS_BLOCKDEV=y
CONFIG_CRAMFS_MTD=y
CONFIG_SQUASHFS=y
# CONFIG_SQUASHFS_FILE_CACHE is not set
CONFIG_SQUASHFS_FILE_DIRECT=y
CONFIG_SQUASHFS_DECOMP_SINGLE=y
CONFIG_SQUASHFS_DECOMP_MULTI=y
CONFIG_SQUASHFS_DECOMP_MULTI_PERCPU=y
CONFIG_SQUASHFS_CHOICE_DECOMP_BY_MOUNT=y
CONFIG_SQUASHFS_MOUNT_DECOMP_THREADS=y
CONFIG_SQUASHFS_XATTR=y
CONFIG_SQUASHFS_ZLIB=y
CONFIG_SQUASHFS_LZ4=y
CONFIG_SQUASHFS_LZO=y
CONFIG_SQUASHFS_XZ=y
CONFIG_SQUASHFS_ZSTD=y
# CONFIG_SQUASHFS_4K_DEVBLK_SIZE is not set
# CONFIG_SQUASHFS_EMBEDDED is not set
CONFIG_SQUASHFS_FRAGMENT_CACHE_SIZE=3
CONFIG_VXFS_FS=m
CONFIG_MINIX_FS=m
CONFIG_OMFS_FS=m
CONFIG_HPFS_FS=m
CONFIG_QNX4FS_FS=m
CONFIG_QNX6FS_FS=m
# CONFIG_QNX6FS_DEBUG is not set
CONFIG_ROMFS_FS=m
CONFIG_ROMFS_BACKED_BY_BLOCK=y
# CONFIG_ROMFS_BACKED_BY_MTD is not set
# CONFIG_ROMFS_BACKED_BY_BOTH is not set
CONFIG_ROMFS_ON_BLOCK=y
CONFIG_PSTORE=y
CONFIG_PSTORE_DEFAULT_KMSG_BYTES=10240
CONFIG_PSTORE_COMPRESS=y
# CONFIG_PSTORE_CONSOLE is not set
# CONFIG_PSTORE_PMSG is not set
# CONFIG_PSTORE_FTRACE is not set
CONFIG_PSTORE_RAM=m
CONFIG_PSTORE_ZONE=m
CONFIG_PSTORE_BLK=m
CONFIG_PSTORE_BLK_BLKDEV=""
CONFIG_PSTORE_BLK_KMSG_SIZE=64
CONFIG_PSTORE_BLK_MAX_REASON=2
CONFIG_SYSV_FS=m
CONFIG_UFS_FS=m
# CONFIG_UFS_FS_WRITE is not set
# CONFIG_UFS_DEBUG is not set
CONFIG_EROFS_FS=m
# CONFIG_EROFS_FS_DEBUG is not set
CONFIG_EROFS_FS_XATTR=y
CONFIG_EROFS_FS_POSIX_ACL=y
CONFIG_EROFS_FS_SECURITY=y
CONFIG_EROFS_FS_ZIP=y
# CONFIG_EROFS_FS_ZIP_LZMA is not set
# CONFIG_EROFS_FS_ZIP_DEFLATE is not set
# CONFIG_EROFS_FS_PCPU_KTHREAD is not set
CONFIG_VBOXSF_FS=m
CONFIG_NETWORK_FILESYSTEMS=y
CONFIG_NFS_FS=m
CONFIG_NFS_V2=m
CONFIG_NFS_V3=m
CONFIG_NFS_V3_ACL=y
CONFIG_NFS_V4=m
CONFIG_NFS_SWAP=y
CONFIG_NFS_V4_1=y
CONFIG_NFS_V4_2=y
CONFIG_PNFS_FILE_LAYOUT=m
CONFIG_PNFS_BLOCK=m
CONFIG_PNFS_FLEXFILE_LAYOUT=m
CONFIG_NFS_V4_1_IMPLEMENTATION_ID_DOMAIN="kernel.org"
CONFIG_NFS_V4_1_MIGRATION=y
CONFIG_NFS_V4_SECURITY_LABEL=y
CONFIG_NFS_FSCACHE=y
# CONFIG_NFS_USE_LEGACY_DNS is not set
CONFIG_NFS_USE_KERNEL_DNS=y
CONFIG_NFS_DEBUG=y
CONFIG_NFS_DISABLE_UDP_SUPPORT=y
# CONFIG_NFS_V4_2_READ_PLUS is not set
CONFIG_NFSD=m
# CONFIG_NFSD_V2 is not set
CONFIG_NFSD_V3_ACL=y
CONFIG_NFSD_V4=y
CONFIG_NFSD_PNFS=y
CONFIG_NFSD_BLOCKLAYOUT=y
CONFIG_NFSD_SCSILAYOUT=y
CONFIG_NFSD_FLEXFILELAYOUT=y
CONFIG_NFSD_V4_2_INTER_SSC=y
CONFIG_NFSD_V4_SECURITY_LABEL=y
CONFIG_GRACE_PERIOD=m
CONFIG_LOCKD=m
CONFIG_LOCKD_V4=y
CONFIG_NFS_ACL_SUPPORT=m
CONFIG_NFS_COMMON=y
CONFIG_NFS_V4_2_SSC_HELPER=y
CONFIG_SUNRPC=m
CONFIG_SUNRPC_GSS=m
CONFIG_SUNRPC_BACKCHANNEL=y
CONFIG_SUNRPC_SWAP=y
CONFIG_RPCSEC_GSS_KRB5=m
CONFIG_RPCSEC_GSS_KRB5_ENCTYPES_AES_SHA1=y
# CONFIG_RPCSEC_GSS_KRB5_ENCTYPES_CAMELLIA is not set
# CONFIG_RPCSEC_GSS_KRB5_ENCTYPES_AES_SHA2 is not set
CONFIG_SUNRPC_DEBUG=y
CONFIG_SUNRPC_XPRT_RDMA=m
CONFIG_CEPH_FS=m
CONFIG_CEPH_FSCACHE=y
CONFIG_CEPH_FS_POSIX_ACL=y
CONFIG_CEPH_FS_SECURITY_LABEL=y
CONFIG_CIFS=m
# CONFIG_CIFS_STATS2 is not set
CONFIG_CIFS_ALLOW_INSECURE_LEGACY=y
CONFIG_CIFS_UPCALL=y
CONFIG_CIFS_XATTR=y
CONFIG_CIFS_POSIX=y
CONFIG_CIFS_DEBUG=y
# CONFIG_CIFS_DEBUG2 is not set
# CONFIG_CIFS_DEBUG_DUMP_KEYS is not set
CONFIG_CIFS_DFS_UPCALL=y
CONFIG_CIFS_SWN_UPCALL=y
# CONFIG_CIFS_SMB_DIRECT is not set
CONFIG_CIFS_FSCACHE=y
CONFIG_SMB_SERVER=m
CONFIG_SMB_SERVER_SMBDIRECT=y
CONFIG_SMB_SERVER_CHECK_CAP_NET_ADMIN=y
CONFIG_SMB_SERVER_KERBEROS5=y
CONFIG_SMBFS=m
CONFIG_CODA_FS=m
CONFIG_AFS_FS=m
# CONFIG_AFS_DEBUG is not set
CONFIG_AFS_FSCACHE=y
# CONFIG_AFS_DEBUG_CURSOR is not set
CONFIG_9P_FS=m
CONFIG_9P_FSCACHE=y
CONFIG_9P_FS_POSIX_ACL=y
CONFIG_9P_FS_SECURITY=y
CONFIG_NLS=y
CONFIG_NLS_DEFAULT="utf8"
CONFIG_NLS_CODEPAGE_437=y
CONFIG_NLS_CODEPAGE_737=m
CONFIG_NLS_CODEPAGE_775=m
CONFIG_NLS_CODEPAGE_850=m
CONFIG_NLS_CODEPAGE_852=m
CONFIG_NLS_CODEPAGE_855=m
CONFIG_NLS_CODEPAGE_857=m
CONFIG_NLS_CODEPAGE_860=m
CONFIG_NLS_CODEPAGE_861=m
CONFIG_NLS_CODEPAGE_862=m
CONFIG_NLS_CODEPAGE_863=m
CONFIG_NLS_CODEPAGE_864=m
CONFIG_NLS_CODEPAGE_865=m
CONFIG_NLS_CODEPAGE_866=m
CONFIG_NLS_CODEPAGE_869=m
CONFIG_NLS_CODEPAGE_936=m
CONFIG_NLS_CODEPAGE_950=m
CONFIG_NLS_CODEPAGE_932=m
CONFIG_NLS_CODEPAGE_949=m
CONFIG_NLS_CODEPAGE_874=m
CONFIG_NLS_ISO8859_8=m
CONFIG_NLS_CODEPAGE_1250=m
CONFIG_NLS_CODEPAGE_1251=m
CONFIG_NLS_ASCII=m
CONFIG_NLS_ISO8859_1=m
CONFIG_NLS_ISO8859_2=m
CONFIG_NLS_ISO8859_3=m
CONFIG_NLS_ISO8859_4=m
CONFIG_NLS_ISO8859_5=m
CONFIG_NLS_ISO8859_6=m
CONFIG_NLS_ISO8859_7=m
CONFIG_NLS_ISO8859_9=m
CONFIG_NLS_ISO8859_13=m
CONFIG_NLS_ISO8859_14=m
CONFIG_NLS_ISO8859_15=m
CONFIG_NLS_KOI8_R=m
CONFIG_NLS_KOI8_U=m
CONFIG_NLS_MAC_ROMAN=m
CONFIG_NLS_MAC_CELTIC=m
CONFIG_NLS_MAC_CENTEURO=m
CONFIG_NLS_MAC_CROATIAN=m
CONFIG_NLS_MAC_CYRILLIC=m
CONFIG_NLS_MAC_GAELIC=m
CONFIG_NLS_MAC_GREEK=m
CONFIG_NLS_MAC_ICELAND=m
CONFIG_NLS_MAC_INUIT=m
CONFIG_NLS_MAC_ROMANIAN=m
CONFIG_NLS_MAC_TURKISH=m
CONFIG_NLS_UTF8=m
CONFIG_NLS_UCS2_UTILS=m
CONFIG_DLM=m
# CONFIG_DLM_DEBUG is not set
CONFIG_UNICODE=y
# CONFIG_UNICODE_NORMALIZATION_SELFTEST is not set
CONFIG_IO_WQ=y
# end of File systems

#
# Security options
#
CONFIG_KEYS=y
CONFIG_KEYS_REQUEST_CACHE=y
CONFIG_PERSISTENT_KEYRINGS=y
CONFIG_TRUSTED_KEYS=y
CONFIG_TRUSTED_KEYS_TPM=y
CONFIG_ENCRYPTED_KEYS=y
CONFIG_USER_DECRYPTED_DATA=y
CONFIG_KEY_DH_OPERATIONS=y
CONFIG_KEY_NOTIFICATIONS=y
CONFIG_SECURITY_DMESG_RESTRICT=y
CONFIG_SECURITY=y
CONFIG_SECURITYFS=y
CONFIG_SECURITY_NETWORK=y
CONFIG_SECURITY_INFINIBAND=y
CONFIG_SECURITY_NETWORK_XFRM=y
CONFIG_SECURITY_PATH=y
CONFIG_INTEL_TXT=y
CONFIG_LSM_MMAP_MIN_ADDR=0
CONFIG_HARDENED_USERCOPY=y
CONFIG_FORTIFY_SOURCE=y
# CONFIG_STATIC_USERMODEHELPER is not set
CONFIG_SECURITY_SELINUX=y
CONFIG_SECURITY_SELINUX_BOOTPARAM=y
CONFIG_SECURITY_SELINUX_DEVELOP=y
CONFIG_SECURITY_SELINUX_AVC_STATS=y
CONFIG_SECURITY_SELINUX_SIDTAB_HASH_BITS=9
CONFIG_SECURITY_SELINUX_SID2STR_CACHE_SIZE=256
# CONFIG_SECURITY_SELINUX_DEBUG is not set
CONFIG_SECURITY_SMACK=y
# CONFIG_SECURITY_SMACK_BRINGUP is not set
CONFIG_SECURITY_SMACK_NETFILTER=y
CONFIG_SECURITY_SMACK_APPEND_SIGNALS=y
CONFIG_SECURITY_TOMOYO=y
CONFIG_SECURITY_TOMOYO_MAX_ACCEPT_ENTRY=2048
CONFIG_SECURITY_TOMOYO_MAX_AUDIT_LOG=1024
# CONFIG_SECURITY_TOMOYO_OMIT_USERSPACE_LOADER is not set
CONFIG_SECURITY_TOMOYO_POLICY_LOADER="/sbin/tomoyo-init"
CONFIG_SECURITY_TOMOYO_ACTIVATION_TRIGGER="/sbin/init"
# CONFIG_SECURITY_TOMOYO_INSECURE_BUILTIN_SETTING is not set
CONFIG_SECURITY_APPARMOR=y
# CONFIG_SECURITY_APPARMOR_DEBUG is not set
CONFIG_SECURITY_APPARMOR_INTROSPECT_POLICY=y
CONFIG_SECURITY_APPARMOR_HASH=y
CONFIG_SECURITY_APPARMOR_HASH_DEFAULT=y
CONFIG_SECURITY_APPARMOR_EXPORT_BINARY=y
CONFIG_SECURITY_APPARMOR_PARANOID_LOAD=y
# CONFIG_SECURITY_LOADPIN is not set
CONFIG_SECURITY_YAMA=y
CONFIG_SECURITY_SAFESETID=y
CONFIG_SECURITY_LOCKDOWN_LSM=y
CONFIG_SECURITY_LOCKDOWN_LSM_EARLY=y
CONFIG_LOCK_DOWN_KERNEL_FORCE_NONE=y
# CONFIG_LOCK_DOWN_KERNEL_FORCE_INTEGRITY is not set
# CONFIG_LOCK_DOWN_KERNEL_FORCE_CONFIDENTIALITY is not set
CONFIG_SECURITY_LANDLOCK=y
CONFIG_INTEGRITY=y
CONFIG_INTEGRITY_SIGNATURE=y
CONFIG_INTEGRITY_ASYMMETRIC_KEYS=y
CONFIG_INTEGRITY_TRUSTED_KEYRING=y
CONFIG_INTEGRITY_PLATFORM_KEYRING=y
CONFIG_INTEGRITY_MACHINE_KEYRING=y
# CONFIG_INTEGRITY_CA_MACHINE_KEYRING is not set
CONFIG_LOAD_UEFI_KEYS=y
CONFIG_INTEGRITY_AUDIT=y
CONFIG_IMA=y
CONFIG_IMA_KEXEC=y
CONFIG_IMA_MEASURE_PCR_IDX=10
CONFIG_IMA_LSM_RULES=y
CONFIG_IMA_NG_TEMPLATE=y
# CONFIG_IMA_SIG_TEMPLATE is not set
CONFIG_IMA_DEFAULT_TEMPLATE="ima-ng"
CONFIG_IMA_DEFAULT_HASH_SHA1=y
# CONFIG_IMA_DEFAULT_HASH_SHA256 is not set
# CONFIG_IMA_DEFAULT_HASH_SHA512 is not set
CONFIG_IMA_DEFAULT_HASH="sha1"
# CONFIG_IMA_WRITE_POLICY is not set
# CONFIG_IMA_READ_POLICY is not set
CONFIG_IMA_APPRAISE=y
CONFIG_IMA_ARCH_POLICY=y
# CONFIG_IMA_APPRAISE_BUILD_POLICY is not set
CONFIG_IMA_APPRAISE_BOOTPARAM=y
CONFIG_IMA_APPRAISE_MODSIG=y
# CONFIG_IMA_KEYRINGS_PERMIT_SIGNED_BY_BUILTIN_OR_SECONDARY is not set
CONFIG_IMA_MEASURE_ASYMMETRIC_KEYS=y
CONFIG_IMA_QUEUE_EARLY_BOOT_KEYS=y
CONFIG_IMA_SECURE_AND_OR_TRUSTED_BOOT=y
# CONFIG_IMA_DISABLE_HTABLE is not set
CONFIG_EVM=y
CONFIG_EVM_ATTR_FSUUID=y
CONFIG_EVM_EXTRA_SMACK_XATTRS=y
CONFIG_EVM_ADD_XATTRS=y
# CONFIG_EVM_LOAD_X509 is not set
# CONFIG_DEFAULT_SECURITY_SELINUX is not set
# CONFIG_DEFAULT_SECURITY_SMACK is not set
# CONFIG_DEFAULT_SECURITY_TOMOYO is not set
CONFIG_DEFAULT_SECURITY_APPARMOR=y
# CONFIG_DEFAULT_SECURITY_DAC is not set
CONFIG_LSM="landlock,lockdown,yama,integrity,apparmor"

#
# Kernel hardening options
#

#
# Memory initialization
#
CONFIG_CC_HAS_AUTO_VAR_INIT_PATTERN=y
CONFIG_CC_HAS_AUTO_VAR_INIT_ZERO_BARE=y
CONFIG_CC_HAS_AUTO_VAR_INIT_ZERO=y
# CONFIG_INIT_STACK_NONE is not set
# CONFIG_INIT_STACK_ALL_PATTERN is not set
CONFIG_INIT_STACK_ALL_ZERO=y
CONFIG_INIT_ON_ALLOC_DEFAULT_ON=y
# CONFIG_INIT_ON_FREE_DEFAULT_ON is not set
CONFIG_CC_HAS_ZERO_CALL_USED_REGS=y
CONFIG_ZERO_CALL_USED_REGS=y
# end of Memory initialization

#
# Hardening of kernel data structures
#
# CONFIG_LIST_HARDENED is not set
# CONFIG_BUG_ON_DATA_CORRUPTION is not set
# end of Hardening of kernel data structures

CONFIG_RANDSTRUCT_NONE=y
# end of Kernel hardening options
# end of Security options

CONFIG_XOR_BLOCKS=m
CONFIG_ASYNC_CORE=m
CONFIG_ASYNC_MEMCPY=m
CONFIG_ASYNC_XOR=m
CONFIG_ASYNC_PQ=m
CONFIG_ASYNC_RAID6_RECOV=m
CONFIG_CRYPTO=y

#
# Crypto core or helper
#
CONFIG_CRYPTO_ALGAPI=y
CONFIG_CRYPTO_ALGAPI2=y
CONFIG_CRYPTO_AEAD=y
CONFIG_CRYPTO_AEAD2=y
CONFIG_CRYPTO_SIG2=y
CONFIG_CRYPTO_SKCIPHER=y
CONFIG_CRYPTO_SKCIPHER2=y
CONFIG_CRYPTO_HASH=y
CONFIG_CRYPTO_HASH2=y
CONFIG_CRYPTO_RNG=y
CONFIG_CRYPTO_RNG2=y
CONFIG_CRYPTO_RNG_DEFAULT=y
CONFIG_CRYPTO_AKCIPHER2=y
CONFIG_CRYPTO_AKCIPHER=y
CONFIG_CRYPTO_KPP2=y
CONFIG_CRYPTO_KPP=y
CONFIG_CRYPTO_ACOMP2=y
CONFIG_CRYPTO_MANAGER=y
CONFIG_CRYPTO_MANAGER2=y
CONFIG_CRYPTO_USER=m
CONFIG_CRYPTO_MANAGER_DISABLE_TESTS=y
CONFIG_CRYPTO_NULL=y
CONFIG_CRYPTO_NULL2=y
CONFIG_CRYPTO_PCRYPT=m
CONFIG_CRYPTO_CRYPTD=m
CONFIG_CRYPTO_AUTHENC=m
CONFIG_CRYPTO_TEST=m
CONFIG_CRYPTO_SIMD=m
CONFIG_CRYPTO_ENGINE=m
# end of Crypto core or helper

#
# Public-key cryptography
#
CONFIG_CRYPTO_RSA=y
CONFIG_CRYPTO_DH=y
CONFIG_CRYPTO_DH_RFC7919_GROUPS=y
CONFIG_CRYPTO_ECC=m
CONFIG_CRYPTO_ECDH=m
CONFIG_CRYPTO_ECDSA=m
CONFIG_CRYPTO_ECRDSA=m
CONFIG_CRYPTO_SM2=m
CONFIG_CRYPTO_CURVE25519=m
# end of Public-key cryptography

#
# Block ciphers
#
CONFIG_CRYPTO_AES=y
CONFIG_CRYPTO_AES_TI=m
CONFIG_CRYPTO_ARIA=m
CONFIG_CRYPTO_BLOWFISH=m
CONFIG_CRYPTO_BLOWFISH_COMMON=m
CONFIG_CRYPTO_CAMELLIA=m
CONFIG_CRYPTO_CAST_COMMON=m
CONFIG_CRYPTO_CAST5=m
CONFIG_CRYPTO_CAST6=m
CONFIG_CRYPTO_DES=m
CONFIG_CRYPTO_FCRYPT=m
CONFIG_CRYPTO_SERPENT=m
CONFIG_CRYPTO_SM4=m
CONFIG_CRYPTO_SM4_GENERIC=m
CONFIG_CRYPTO_TWOFISH=m
CONFIG_CRYPTO_TWOFISH_COMMON=m
# end of Block ciphers

#
# Length-preserving ciphers and modes
#
CONFIG_CRYPTO_ADIANTUM=m
CONFIG_CRYPTO_CHACHA20=m
CONFIG_CRYPTO_CBC=y
CONFIG_CRYPTO_CFB=m
CONFIG_CRYPTO_CTR=y
CONFIG_CRYPTO_CTS=y
CONFIG_CRYPTO_ECB=y
CONFIG_CRYPTO_HCTR2=m
CONFIG_CRYPTO_KEYWRAP=m
CONFIG_CRYPTO_LRW=m
CONFIG_CRYPTO_OFB=m
CONFIG_CRYPTO_PCBC=m
CONFIG_CRYPTO_XCTR=m
CONFIG_CRYPTO_XTS=y
CONFIG_CRYPTO_NHPOLY1305=m
# end of Length-preserving ciphers and modes

#
# AEAD (authenticated encryption with associated data) ciphers
#
CONFIG_CRYPTO_AEGIS128=m
CONFIG_CRYPTO_CHACHA20POLY1305=m
CONFIG_CRYPTO_CCM=m
CONFIG_CRYPTO_GCM=y
CONFIG_CRYPTO_GENIV=y
CONFIG_CRYPTO_SEQIV=y
CONFIG_CRYPTO_ECHAINIV=m
CONFIG_CRYPTO_ESSIV=m
# end of AEAD (authenticated encryption with associated data) ciphers

#
# Hashes, digests, and MACs
#
CONFIG_CRYPTO_BLAKE2B=m
CONFIG_CRYPTO_CMAC=m
CONFIG_CRYPTO_GHASH=y
CONFIG_CRYPTO_HMAC=y
CONFIG_CRYPTO_MD4=m
CONFIG_CRYPTO_MD5=y
CONFIG_CRYPTO_MICHAEL_MIC=m
CONFIG_CRYPTO_POLYVAL=m
CONFIG_CRYPTO_POLY1305=m
CONFIG_CRYPTO_RMD160=m
CONFIG_CRYPTO_SHA1=y
CONFIG_CRYPTO_SHA256=y
CONFIG_CRYPTO_SHA512=y
CONFIG_CRYPTO_SHA3=y
CONFIG_CRYPTO_SM3=m
CONFIG_CRYPTO_SM3_GENERIC=m
CONFIG_CRYPTO_STREEBOG=m
CONFIG_CRYPTO_VMAC=m
CONFIG_CRYPTO_WP512=m
CONFIG_CRYPTO_XCBC=m
CONFIG_CRYPTO_XXHASH=m
# end of Hashes, digests, and MACs

#
# CRCs (cyclic redundancy checks)
#
CONFIG_CRYPTO_CRC32C=y
CONFIG_CRYPTO_CRC32=m
CONFIG_CRYPTO_CRCT10DIF=y
CONFIG_CRYPTO_CRC64_ROCKSOFT=y
# end of CRCs (cyclic redundancy checks)

#
# Compression
#
CONFIG_CRYPTO_DEFLATE=y
CONFIG_CRYPTO_LZO=y
CONFIG_CRYPTO_842=m
CONFIG_CRYPTO_LZ4=m
CONFIG_CRYPTO_LZ4HC=m
CONFIG_CRYPTO_ZSTD=m
# end of Compression

#
# Random number generation
#
CONFIG_CRYPTO_ANSI_CPRNG=m
CONFIG_CRYPTO_DRBG_MENU=y
CONFIG_CRYPTO_DRBG_HMAC=y
CONFIG_CRYPTO_DRBG_HASH=y
CONFIG_CRYPTO_DRBG_CTR=y
CONFIG_CRYPTO_DRBG=y
CONFIG_CRYPTO_JITTERENTROPY=y
# CONFIG_CRYPTO_JITTERENTROPY_TESTINTERFACE is not set
CONFIG_CRYPTO_KDF800108_CTR=y
# end of Random number generation

#
# Userspace interface
#
CONFIG_CRYPTO_USER_API=m
CONFIG_CRYPTO_USER_API_HASH=m
CONFIG_CRYPTO_USER_API_SKCIPHER=m
CONFIG_CRYPTO_USER_API_RNG=m
# CONFIG_CRYPTO_USER_API_RNG_CAVP is not set
CONFIG_CRYPTO_USER_API_AEAD=m
# CONFIG_CRYPTO_USER_API_ENABLE_OBSOLETE is not set
CONFIG_CRYPTO_STATS=y
# end of Userspace interface

CONFIG_CRYPTO_HASH_INFO=y

#
# Accelerated Cryptographic Algorithms for CPU (x86)
#
CONFIG_CRYPTO_CURVE25519_X86=m
CONFIG_CRYPTO_AES_NI_INTEL=m
CONFIG_CRYPTO_BLOWFISH_X86_64=m
CONFIG_CRYPTO_CAMELLIA_X86_64=m
CONFIG_CRYPTO_CAMELLIA_AESNI_AVX_X86_64=m
CONFIG_CRYPTO_CAMELLIA_AESNI_AVX2_X86_64=m
CONFIG_CRYPTO_CAST5_AVX_X86_64=m
CONFIG_CRYPTO_CAST6_AVX_X86_64=m
CONFIG_CRYPTO_DES3_EDE_X86_64=m
CONFIG_CRYPTO_SERPENT_SSE2_X86_64=m
CONFIG_CRYPTO_SERPENT_AVX_X86_64=m
CONFIG_CRYPTO_SERPENT_AVX2_X86_64=m
CONFIG_CRYPTO_SM4_AESNI_AVX_X86_64=m
CONFIG_CRYPTO_SM4_AESNI_AVX2_X86_64=m
CONFIG_CRYPTO_TWOFISH_X86_64=m
CONFIG_CRYPTO_TWOFISH_X86_64_3WAY=m
CONFIG_CRYPTO_TWOFISH_AVX_X86_64=m
CONFIG_CRYPTO_ARIA_AESNI_AVX_X86_64=m
# CONFIG_CRYPTO_ARIA_AESNI_AVX2_X86_64 is not set
# CONFIG_CRYPTO_ARIA_GFNI_AVX512_X86_64 is not set
CONFIG_CRYPTO_CHACHA20_X86_64=m
CONFIG_CRYPTO_AEGIS128_AESNI_SSE2=m
CONFIG_CRYPTO_NHPOLY1305_SSE2=m
CONFIG_CRYPTO_NHPOLY1305_AVX2=m
CONFIG_CRYPTO_BLAKE2S_X86=y
CONFIG_CRYPTO_POLYVAL_CLMUL_NI=m
CONFIG_CRYPTO_POLY1305_X86_64=m
CONFIG_CRYPTO_SHA1_SSSE3=m
CONFIG_CRYPTO_SHA256_SSSE3=m
CONFIG_CRYPTO_SHA512_SSSE3=m
CONFIG_CRYPTO_SM3_AVX_X86_64=m
CONFIG_CRYPTO_GHASH_CLMUL_NI_INTEL=m
CONFIG_CRYPTO_CRC32C_INTEL=y
CONFIG_CRYPTO_CRC32_PCLMUL=m
CONFIG_CRYPTO_CRCT10DIF_PCLMUL=m
# end of Accelerated Cryptographic Algorithms for CPU (x86)

CONFIG_CRYPTO_HW=y
CONFIG_CRYPTO_DEV_PADLOCK=y
CONFIG_CRYPTO_DEV_PADLOCK_AES=m
CONFIG_CRYPTO_DEV_PADLOCK_SHA=m
CONFIG_CRYPTO_DEV_ATMEL_I2C=m
CONFIG_CRYPTO_DEV_ATMEL_ECC=m
CONFIG_CRYPTO_DEV_ATMEL_SHA204A=m
CONFIG_CRYPTO_DEV_CCP=y
CONFIG_CRYPTO_DEV_CCP_DD=m
CONFIG_CRYPTO_DEV_SP_CCP=y
CONFIG_CRYPTO_DEV_CCP_CRYPTO=m
CONFIG_CRYPTO_DEV_SP_PSP=y
# CONFIG_CRYPTO_DEV_CCP_DEBUGFS is not set
CONFIG_CRYPTO_DEV_NITROX=m
CONFIG_CRYPTO_DEV_NITROX_CNN55XX=m
CONFIG_CRYPTO_DEV_QAT=m
CONFIG_CRYPTO_DEV_QAT_DH895xCC=m
CONFIG_CRYPTO_DEV_QAT_C3XXX=m
CONFIG_CRYPTO_DEV_QAT_C62X=m
CONFIG_CRYPTO_DEV_QAT_4XXX=m
CONFIG_CRYPTO_DEV_QAT_DH895xCCVF=m
CONFIG_CRYPTO_DEV_QAT_C3XXXVF=m
CONFIG_CRYPTO_DEV_QAT_C62XVF=m
CONFIG_CRYPTO_DEV_CHELSIO=m
CONFIG_CRYPTO_DEV_VIRTIO=m
CONFIG_CRYPTO_DEV_SAFEXCEL=m
CONFIG_CRYPTO_DEV_AMLOGIC_GXL=m
# CONFIG_CRYPTO_DEV_AMLOGIC_GXL_DEBUG is not set
CONFIG_ASYMMETRIC_KEY_TYPE=y
CONFIG_ASYMMETRIC_PUBLIC_KEY_SUBTYPE=y
CONFIG_X509_CERTIFICATE_PARSER=y
CONFIG_PKCS8_PRIVATE_KEY_PARSER=m
CONFIG_PKCS7_MESSAGE_PARSER=y
CONFIG_PKCS7_TEST_KEY=m
CONFIG_SIGNED_PE_FILE_VERIFICATION=y
# CONFIG_FIPS_SIGNATURE_SELFTEST is not set

#
# Certificates for signature checking
#
CONFIG_MODULE_SIG_KEY="certs/signing_key.pem"
CONFIG_MODULE_SIG_KEY_TYPE_RSA=y
# CONFIG_MODULE_SIG_KEY_TYPE_ECDSA is not set
CONFIG_SYSTEM_TRUSTED_KEYRING=y
CONFIG_SYSTEM_TRUSTED_KEYS=""
CONFIG_SYSTEM_EXTRA_CERTIFICATE=y
CONFIG_SYSTEM_EXTRA_CERTIFICATE_SIZE=4096
CONFIG_SECONDARY_TRUSTED_KEYRING=y
CONFIG_SYSTEM_BLACKLIST_KEYRING=y
CONFIG_SYSTEM_BLACKLIST_HASH_LIST=""
CONFIG_SYSTEM_REVOCATION_LIST=y
CONFIG_SYSTEM_REVOCATION_KEYS=""
# CONFIG_SYSTEM_BLACKLIST_AUTH_UPDATE is not set
# end of Certificates for signature checking

CONFIG_BINARY_PRINTF=y

#
# Library routines
#
CONFIG_RAID6_PQ=m
CONFIG_RAID6_PQ_BENCHMARK=y
CONFIG_LINEAR_RANGES=y
CONFIG_PACKING=y
CONFIG_BITREVERSE=y
CONFIG_GENERIC_STRNCPY_FROM_USER=y
CONFIG_GENERIC_STRNLEN_USER=y
CONFIG_GENERIC_NET_UTILS=y
CONFIG_CORDIC=m
# CONFIG_PRIME_NUMBERS is not set
CONFIG_RATIONAL=y
CONFIG_GENERIC_PCI_IOMAP=y
CONFIG_GENERIC_IOMAP=y
CONFIG_ARCH_USE_CMPXCHG_LOCKREF=y
CONFIG_ARCH_HAS_FAST_MULTIPLIER=y
CONFIG_ARCH_USE_SYM_ANNOTATIONS=y

#
# Crypto library routines
#
CONFIG_CRYPTO_LIB_UTILS=y
CONFIG_CRYPTO_LIB_AES=y
CONFIG_CRYPTO_LIB_ARC4=m
CONFIG_CRYPTO_LIB_GF128MUL=y
CONFIG_CRYPTO_ARCH_HAVE_LIB_BLAKE2S=y
CONFIG_CRYPTO_LIB_BLAKE2S_GENERIC=y
CONFIG_CRYPTO_ARCH_HAVE_LIB_CHACHA=m
CONFIG_CRYPTO_LIB_CHACHA_GENERIC=m
CONFIG_CRYPTO_LIB_CHACHA=m
CONFIG_CRYPTO_ARCH_HAVE_LIB_CURVE25519=m
CONFIG_CRYPTO_LIB_CURVE25519_GENERIC=m
CONFIG_CRYPTO_LIB_CURVE25519=m
CONFIG_CRYPTO_LIB_DES=m
CONFIG_CRYPTO_LIB_POLY1305_RSIZE=11
CONFIG_CRYPTO_ARCH_HAVE_LIB_POLY1305=m
CONFIG_CRYPTO_LIB_POLY1305_GENERIC=m
CONFIG_CRYPTO_LIB_POLY1305=m
CONFIG_CRYPTO_LIB_CHACHA20POLY1305=m
CONFIG_CRYPTO_LIB_SHA1=y
CONFIG_CRYPTO_LIB_SHA256=y
# end of Crypto library routines

CONFIG_CRC_CCITT=y
CONFIG_CRC16=y
CONFIG_CRC_T10DIF=y
CONFIG_CRC64_ROCKSOFT=y
CONFIG_CRC_ITU_T=m
CONFIG_CRC32=y
# CONFIG_CRC32_SELFTEST is not set
CONFIG_CRC32_SLICEBY8=y
# CONFIG_CRC32_SLICEBY4 is not set
# CONFIG_CRC32_SARWATE is not set
# CONFIG_CRC32_BIT is not set
CONFIG_CRC64=y
CONFIG_CRC4=m
CONFIG_CRC7=m
CONFIG_LIBCRC32C=m
CONFIG_CRC8=m
CONFIG_XXHASH=y
# CONFIG_RANDOM32_SELFTEST is not set
CONFIG_842_COMPRESS=m
CONFIG_842_DECOMPRESS=m
CONFIG_ZLIB_INFLATE=y
CONFIG_ZLIB_DEFLATE=y
CONFIG_LZO_COMPRESS=y
CONFIG_LZO_DECOMPRESS=y
CONFIG_LZ4_COMPRESS=m
CONFIG_LZ4HC_COMPRESS=m
CONFIG_LZ4_DECOMPRESS=y
CONFIG_ZSTD_COMMON=y
CONFIG_ZSTD_COMPRESS=y
CONFIG_ZSTD_DECOMPRESS=y
CONFIG_XZ_DEC=y
CONFIG_XZ_DEC_X86=y
CONFIG_XZ_DEC_POWERPC=y
CONFIG_XZ_DEC_IA64=y
CONFIG_XZ_DEC_ARM=y
CONFIG_XZ_DEC_ARMTHUMB=y
CONFIG_XZ_DEC_SPARC=y
CONFIG_XZ_DEC_MICROLZMA=y
CONFIG_XZ_DEC_BCJ=y
CONFIG_XZ_DEC_TEST=m
CONFIG_DECOMPRESS_GZIP=y
CONFIG_DECOMPRESS_BZIP2=y
CONFIG_DECOMPRESS_LZMA=y
CONFIG_DECOMPRESS_XZ=y
CONFIG_DECOMPRESS_LZO=y
CONFIG_DECOMPRESS_LZ4=y
CONFIG_DECOMPRESS_ZSTD=y
CONFIG_GENERIC_ALLOCATOR=y
CONFIG_REED_SOLOMON=m
CONFIG_REED_SOLOMON_ENC8=y
CONFIG_REED_SOLOMON_DEC8=y
CONFIG_REED_SOLOMON_DEC16=y
CONFIG_BCH=m
CONFIG_TEXTSEARCH=y
CONFIG_TEXTSEARCH_KMP=m
CONFIG_TEXTSEARCH_BM=m
CONFIG_TEXTSEARCH_FSM=m
CONFIG_BTREE=y
CONFIG_INTERVAL_TREE=y
CONFIG_INTERVAL_TREE_SPAN_ITER=y
CONFIG_XARRAY_MULTI=y
CONFIG_ASSOCIATIVE_ARRAY=y
CONFIG_HAS_IOMEM=y
CONFIG_HAS_IOPORT=y
CONFIG_HAS_IOPORT_MAP=y
CONFIG_HAS_DMA=y
CONFIG_DMA_OPS=y
CONFIG_NEED_SG_DMA_FLAGS=y
CONFIG_NEED_SG_DMA_LENGTH=y
CONFIG_NEED_DMA_MAP_STATE=y
CONFIG_ARCH_DMA_ADDR_T_64BIT=y
CONFIG_ARCH_HAS_FORCE_DMA_UNENCRYPTED=y
CONFIG_SWIOTLB=y
# CONFIG_SWIOTLB_DYNAMIC is not set
CONFIG_DMA_COHERENT_POOL=y
# CONFIG_DMA_API_DEBUG is not set
# CONFIG_DMA_MAP_BENCHMARK is not set
CONFIG_SGL_ALLOC=y
CONFIG_IOMMU_HELPER=y
CONFIG_CHECK_SIGNATURE=y
CONFIG_CPUMASK_OFFSTACK=y
# CONFIG_FORCE_NR_CPUS is not set
CONFIG_CPU_RMAP=y
CONFIG_DQL=y
CONFIG_GLOB=y
# CONFIG_GLOB_SELFTEST is not set
CONFIG_NLATTR=y
CONFIG_LRU_CACHE=m
CONFIG_CLZ_TAB=y
CONFIG_IRQ_POLL=y
CONFIG_MPILIB=y
CONFIG_SIGNATURE=y
CONFIG_DIMLIB=y
CONFIG_OID_REGISTRY=y
CONFIG_UCS2_STRING=y
CONFIG_HAVE_GENERIC_VDSO=y
CONFIG_GENERIC_GETTIMEOFDAY=y
CONFIG_GENERIC_VDSO_TIME_NS=y
CONFIG_FONT_SUPPORT=y
CONFIG_FONTS=y
CONFIG_FONT_8x8=y
CONFIG_FONT_8x16=y
# CONFIG_FONT_6x11 is not set
# CONFIG_FONT_7x14 is not set
# CONFIG_FONT_PEARL_8x8 is not set
CONFIG_FONT_ACORN_8x8=y
# CONFIG_FONT_MINI_4x6 is not set
CONFIG_FONT_6x10=y
# CONFIG_FONT_10x18 is not set
# CONFIG_FONT_SUN8x16 is not set
# CONFIG_FONT_SUN12x22 is not set
CONFIG_FONT_TER16x32=y
# CONFIG_FONT_6x8 is not set
CONFIG_SG_POOL=y
CONFIG_ARCH_HAS_PMEM_API=y
CONFIG_MEMREGION=y
CONFIG_ARCH_HAS_CPU_CACHE_INVALIDATE_MEMREGION=y
CONFIG_ARCH_HAS_UACCESS_FLUSHCACHE=y
CONFIG_ARCH_HAS_COPY_MC=y
CONFIG_ARCH_STACKWALK=y
CONFIG_STACKDEPOT=y
CONFIG_STACKDEPOT_ALWAYS_INIT=y
CONFIG_SBITMAP=y
CONFIG_PARMAN=m
CONFIG_OBJAGG=m
# end of Library routines

CONFIG_PLDMFW=y
CONFIG_ASN1_ENCODER=y
CONFIG_POLYNOMIAL=m

#
# Kernel hacking
#

#
# printk and dmesg options
#
CONFIG_PRINTK_TIME=y
# CONFIG_PRINTK_CALLER is not set
# CONFIG_STACKTRACE_BUILD_ID is not set
CONFIG_CONSOLE_LOGLEVEL_DEFAULT=7
CONFIG_CONSOLE_LOGLEVEL_QUIET=4
CONFIG_MESSAGE_LOGLEVEL_DEFAULT=4
CONFIG_BOOT_PRINTK_DELAY=y
CONFIG_DYNAMIC_DEBUG=y
CONFIG_DYNAMIC_DEBUG_CORE=y
CONFIG_SYMBOLIC_ERRNAME=y
CONFIG_DEBUG_BUGVERBOSE=y
# end of printk and dmesg options

CONFIG_DEBUG_KERNEL=y
CONFIG_DEBUG_MISC=y

#
# Compile-time checks and compiler options
#
CONFIG_DEBUG_INFO=y
CONFIG_AS_HAS_NON_CONST_LEB128=y
# CONFIG_DEBUG_INFO_NONE is not set
# CONFIG_DEBUG_INFO_DWARF_TOOLCHAIN_DEFAULT is not set
# CONFIG_DEBUG_INFO_DWARF4 is not set
CONFIG_DEBUG_INFO_DWARF5=y
# CONFIG_DEBUG_INFO_REDUCED is not set
CONFIG_DEBUG_INFO_COMPRESSED_NONE=y
# CONFIG_DEBUG_INFO_COMPRESSED_ZLIB is not set
# CONFIG_DEBUG_INFO_SPLIT is not set
CONFIG_DEBUG_INFO_BTF=y
CONFIG_PAHOLE_HAS_SPLIT_BTF=y
CONFIG_PAHOLE_HAS_LANG_EXCLUDE=y
CONFIG_DEBUG_INFO_BTF_MODULES=y
# CONFIG_MODULE_ALLOW_BTF_MISMATCH is not set
CONFIG_GDB_SCRIPTS=y
CONFIG_FRAME_WARN=2048
# CONFIG_STRIP_ASM_SYMS is not set
# CONFIG_READABLE_ASM is not set
# CONFIG_HEADERS_INSTALL is not set
# CONFIG_DEBUG_SECTION_MISMATCH is not set
CONFIG_SECTION_MISMATCH_WARN_ONLY=y
# CONFIG_DEBUG_FORCE_FUNCTION_ALIGN_64B is not set
CONFIG_FRAME_POINTER=y
CONFIG_OBJTOOL=y
CONFIG_STACK_VALIDATION=y
CONFIG_VMLINUX_MAP=y
# CONFIG_DEBUG_FORCE_WEAK_PER_CPU is not set
# end of Compile-time checks and compiler options

#
# Generic Kernel Debugging Instruments
#
CONFIG_MAGIC_SYSRQ=y
CONFIG_MAGIC_SYSRQ_DEFAULT_ENABLE=0x01b6
CONFIG_MAGIC_SYSRQ_SERIAL=y
CONFIG_MAGIC_SYSRQ_SERIAL_SEQUENCE=""
CONFIG_DEBUG_FS=y
CONFIG_DEBUG_FS_ALLOW_ALL=y
# CONFIG_DEBUG_FS_DISALLOW_MOUNT is not set
# CONFIG_DEBUG_FS_ALLOW_NONE is not set
CONFIG_HAVE_ARCH_KGDB=y
CONFIG_KGDB=y
CONFIG_KGDB_HONOUR_BLOCKLIST=y
CONFIG_KGDB_SERIAL_CONSOLE=y
# CONFIG_KGDB_TESTS is not set
CONFIG_KGDB_LOW_LEVEL_TRAP=y
CONFIG_KGDB_KDB=y
CONFIG_KDB_DEFAULT_ENABLE=0x1
CONFIG_KDB_KEYBOARD=y
CONFIG_KDB_CONTINUE_CATASTROPHIC=0
CONFIG_ARCH_HAS_EARLY_DEBUG=y
CONFIG_ARCH_HAS_UBSAN_SANITIZE_ALL=y
CONFIG_UBSAN=y
# CONFIG_UBSAN_TRAP is not set
CONFIG_CC_HAS_UBSAN_BOUNDS_STRICT=y
CONFIG_UBSAN_BOUNDS=y
CONFIG_UBSAN_BOUNDS_STRICT=y
CONFIG_UBSAN_SHIFT=y
# CONFIG_UBSAN_DIV_ZERO is not set
CONFIG_UBSAN_BOOL=y
CONFIG_UBSAN_ENUM=y
# CONFIG_UBSAN_ALIGNMENT is not set
CONFIG_UBSAN_SANITIZE_ALL=y
# CONFIG_TEST_UBSAN is not set
CONFIG_HAVE_ARCH_KCSAN=y
CONFIG_HAVE_KCSAN_COMPILER=y
# end of Generic Kernel Debugging Instruments

#
# Networking Debugging
#
# CONFIG_NET_DEV_REFCNT_TRACKER is not set
# CONFIG_NET_NS_REFCNT_TRACKER is not set
# CONFIG_DEBUG_NET is not set
# end of Networking Debugging

#
# Memory Debugging
#
# CONFIG_PAGE_EXTENSION is not set
# CONFIG_DEBUG_PAGEALLOC is not set
CONFIG_SLUB_DEBUG=y
# CONFIG_SLUB_DEBUG_ON is not set
# CONFIG_PAGE_OWNER is not set
# CONFIG_PAGE_TABLE_CHECK is not set
CONFIG_PAGE_POISONING=y
# CONFIG_DEBUG_PAGE_REF is not set
# CONFIG_DEBUG_RODATA_TEST is not set
CONFIG_ARCH_HAS_DEBUG_WX=y
CONFIG_DEBUG_WX=y
CONFIG_GENERIC_PTDUMP=y
CONFIG_PTDUMP_CORE=y
# CONFIG_PTDUMP_DEBUGFS is not set
CONFIG_HAVE_DEBUG_KMEMLEAK=y
# CONFIG_DEBUG_KMEMLEAK is not set
# CONFIG_PER_VMA_LOCK_STATS is not set
# CONFIG_DEBUG_OBJECTS is not set
# CONFIG_SHRINKER_DEBUG is not set
# CONFIG_DEBUG_STACK_USAGE is not set
CONFIG_SCHED_STACK_END_CHECK=y
CONFIG_ARCH_HAS_DEBUG_VM_PGTABLE=y
# CONFIG_DEBUG_VM is not set
# CONFIG_DEBUG_VM_PGTABLE is not set
CONFIG_ARCH_HAS_DEBUG_VIRTUAL=y
# CONFIG_DEBUG_VIRTUAL is not set
# CONFIG_DEBUG_MEMORY_INIT is not set
CONFIG_MEMORY_NOTIFIER_ERROR_INJECT=m
# CONFIG_DEBUG_PER_CPU_MAPS is not set
CONFIG_HAVE_ARCH_KASAN=y
CONFIG_HAVE_ARCH_KASAN_VMALLOC=y
CONFIG_CC_HAS_KASAN_GENERIC=y
CONFIG_CC_HAS_WORKING_NOSANITIZE_ADDRESS=y
CONFIG_KASAN=y
CONFIG_KASAN_GENERIC=y
CONFIG_KASAN_OUTLINE=y
# CONFIG_KASAN_INLINE is not set
CONFIG_KASAN_STACK=y
# CONFIG_KASAN_VMALLOC is not set
# CONFIG_KASAN_MODULE_TEST is not set
CONFIG_HAVE_ARCH_KFENCE=y
CONFIG_KFENCE=y
CONFIG_KFENCE_SAMPLE_INTERVAL=0
CONFIG_KFENCE_NUM_OBJECTS=255
# CONFIG_KFENCE_DEFERRABLE is not set
# CONFIG_KFENCE_STATIC_KEYS is not set
CONFIG_KFENCE_STRESS_TEST_FAULTS=0
CONFIG_HAVE_ARCH_KMSAN=y
# end of Memory Debugging

# CONFIG_DEBUG_SHIRQ is not set

#
# Debug Oops, Lockups and Hangs
#
# CONFIG_PANIC_ON_OOPS is not set
CONFIG_PANIC_ON_OOPS_VALUE=0
CONFIG_PANIC_TIMEOUT=0
CONFIG_LOCKUP_DETECTOR=y
CONFIG_SOFTLOCKUP_DETECTOR=y
# CONFIG_BOOTPARAM_SOFTLOCKUP_PANIC is not set
CONFIG_HAVE_HARDLOCKUP_DETECTOR_BUDDY=y
CONFIG_HARDLOCKUP_DETECTOR=y
# CONFIG_HARDLOCKUP_DETECTOR_PREFER_BUDDY is not set
CONFIG_HARDLOCKUP_DETECTOR_PERF=y
# CONFIG_HARDLOCKUP_DETECTOR_BUDDY is not set
# CONFIG_HARDLOCKUP_DETECTOR_ARCH is not set
CONFIG_HARDLOCKUP_DETECTOR_COUNTS_HRTIMER=y
CONFIG_HARDLOCKUP_CHECK_TIMESTAMP=y
# CONFIG_BOOTPARAM_HARDLOCKUP_PANIC is not set
CONFIG_DETECT_HUNG_TASK=y
CONFIG_DEFAULT_HUNG_TASK_TIMEOUT=120
# CONFIG_BOOTPARAM_HUNG_TASK_PANIC is not set
# CONFIG_WQ_WATCHDOG is not set
# CONFIG_WQ_CPU_INTENSIVE_REPORT is not set
# CONFIG_TEST_LOCKUP is not set
# end of Debug Oops, Lockups and Hangs

#
# Scheduler Debugging
#
CONFIG_SCHED_DEBUG=y
CONFIG_SCHED_INFO=y
CONFIG_SCHEDSTATS=y
# end of Scheduler Debugging

# CONFIG_DEBUG_TIMEKEEPING is not set
# CONFIG_DEBUG_PREEMPT is not set

#
# Lock Debugging (spinlocks, mutexes, etc...)
#
CONFIG_LOCK_DEBUGGING_SUPPORT=y
CONFIG_PROVE_LOCKING=y
CONFIG_PROVE_RAW_LOCK_NESTING=y
CONFIG_LOCK_STAT=y
CONFIG_DEBUG_RT_MUTEXES=y
CONFIG_DEBUG_SPINLOCK=y
CONFIG_DEBUG_MUTEXES=y
CONFIG_DEBUG_WW_MUTEX_SLOWPATH=y
CONFIG_DEBUG_RWSEMS=y
CONFIG_DEBUG_LOCK_ALLOC=y
CONFIG_LOCKDEP=y
CONFIG_LOCKDEP_BITS=15
CONFIG_LOCKDEP_CHAINS_BITS=16
CONFIG_LOCKDEP_STACK_TRACE_BITS=19
CONFIG_LOCKDEP_STACK_TRACE_HASH_BITS=14
CONFIG_LOCKDEP_CIRCULAR_QUEUE_BITS=12
CONFIG_DEBUG_LOCKDEP=y
CONFIG_DEBUG_ATOMIC_SLEEP=y
CONFIG_DEBUG_LOCKING_API_SELFTESTS=y
# CONFIG_LOCK_TORTURE_TEST is not set
# CONFIG_WW_MUTEX_SELFTEST is not set
# CONFIG_SCF_TORTURE_TEST is not set
# CONFIG_CSD_LOCK_WAIT_DEBUG is not set
# end of Lock Debugging (spinlocks, mutexes, etc...)

CONFIG_TRACE_IRQFLAGS=y
CONFIG_TRACE_IRQFLAGS_NMI=y
# CONFIG_NMI_CHECK_CPU is not set
CONFIG_DEBUG_IRQFLAGS=y
CONFIG_STACKTRACE=y
# CONFIG_WARN_ALL_UNSEEDED_RANDOM is not set
# CONFIG_DEBUG_KOBJECT is not set

#
# Debug kernel data structures
#
# CONFIG_DEBUG_LIST is not set
# CONFIG_DEBUG_PLIST is not set
# CONFIG_DEBUG_SG is not set
# CONFIG_DEBUG_NOTIFIERS is not set
# CONFIG_DEBUG_MAPLE_TREE is not set
# end of Debug kernel data structures

# CONFIG_DEBUG_CREDENTIALS is not set

#
# RCU Debugging
#
CONFIG_PROVE_RCU=y
# CONFIG_RCU_SCALE_TEST is not set
# CONFIG_RCU_TORTURE_TEST is not set
# CONFIG_RCU_REF_SCALE_TEST is not set
CONFIG_RCU_CPU_STALL_TIMEOUT=60
CONFIG_RCU_EXP_CPU_STALL_TIMEOUT=0
# CONFIG_RCU_CPU_STALL_CPUTIME is not set
# CONFIG_RCU_TRACE is not set
# CONFIG_RCU_EQS_DEBUG is not set
# end of RCU Debugging

# CONFIG_DEBUG_WQ_FORCE_RR_CPU is not set
# CONFIG_CPU_HOTPLUG_STATE_CONTROL is not set
# CONFIG_LATENCYTOP is not set
# CONFIG_DEBUG_CGROUP_REF is not set
CONFIG_USER_STACKTRACE_SUPPORT=y
CONFIG_NOP_TRACER=y
CONFIG_HAVE_RETHOOK=y
CONFIG_RETHOOK=y
CONFIG_HAVE_FUNCTION_TRACER=y
CONFIG_HAVE_FUNCTION_GRAPH_TRACER=y
CONFIG_HAVE_FUNCTION_GRAPH_RETVAL=y
CONFIG_HAVE_DYNAMIC_FTRACE=y
CONFIG_HAVE_DYNAMIC_FTRACE_WITH_REGS=y
CONFIG_HAVE_DYNAMIC_FTRACE_WITH_DIRECT_CALLS=y
CONFIG_HAVE_DYNAMIC_FTRACE_WITH_ARGS=y
CONFIG_HAVE_DYNAMIC_FTRACE_NO_PATCHABLE=y
CONFIG_HAVE_FTRACE_MCOUNT_RECORD=y
CONFIG_HAVE_SYSCALL_TRACEPOINTS=y
CONFIG_HAVE_FENTRY=y
CONFIG_HAVE_OBJTOOL_MCOUNT=y
CONFIG_HAVE_OBJTOOL_NOP_MCOUNT=y
CONFIG_HAVE_C_RECORDMCOUNT=y
CONFIG_HAVE_BUILDTIME_MCOUNT_SORT=y
CONFIG_BUILDTIME_MCOUNT_SORT=y
CONFIG_TRACER_MAX_TRACE=y
CONFIG_TRACE_CLOCK=y
CONFIG_RING_BUFFER=y
CONFIG_EVENT_TRACING=y
CONFIG_CONTEXT_SWITCH_TRACER=y
CONFIG_PREEMPTIRQ_TRACEPOINTS=y
CONFIG_TRACING=y
CONFIG_GENERIC_TRACER=y
CONFIG_TRACING_SUPPORT=y
CONFIG_FTRACE=y
CONFIG_BOOTTIME_TRACING=y
CONFIG_FUNCTION_TRACER=y
CONFIG_FUNCTION_GRAPH_TRACER=y
# CONFIG_FUNCTION_GRAPH_RETVAL is not set
CONFIG_DYNAMIC_FTRACE=y
CONFIG_DYNAMIC_FTRACE_WITH_REGS=y
CONFIG_DYNAMIC_FTRACE_WITH_DIRECT_CALLS=y
CONFIG_DYNAMIC_FTRACE_WITH_ARGS=y
CONFIG_FPROBE=y
CONFIG_FUNCTION_PROFILER=y
CONFIG_STACK_TRACER=y
# CONFIG_IRQSOFF_TRACER is not set
# CONFIG_PREEMPT_TRACER is not set
CONFIG_SCHED_TRACER=y
CONFIG_HWLAT_TRACER=y
CONFIG_OSNOISE_TRACER=y
CONFIG_TIMERLAT_TRACER=y
CONFIG_MMIOTRACE=y
CONFIG_FTRACE_SYSCALLS=y
CONFIG_TRACER_SNAPSHOT=y
# CONFIG_TRACER_SNAPSHOT_PER_CPU_SWAP is not set
CONFIG_BRANCH_PROFILE_NONE=y
# CONFIG_PROFILE_ANNOTATED_BRANCHES is not set
CONFIG_BLK_DEV_IO_TRACE=y
CONFIG_FPROBE_EVENTS=y
CONFIG_PROBE_EVENTS_BTF_ARGS=y
CONFIG_KPROBE_EVENTS=y
# CONFIG_KPROBE_EVENTS_ON_NOTRACE is not set
CONFIG_UPROBE_EVENTS=y
CONFIG_BPF_EVENTS=y
CONFIG_DYNAMIC_EVENTS=y
CONFIG_PROBE_EVENTS=y
CONFIG_BPF_KPROBE_OVERRIDE=y
CONFIG_FTRACE_MCOUNT_RECORD=y
CONFIG_FTRACE_MCOUNT_USE_CC=y
CONFIG_TRACING_MAP=y
CONFIG_SYNTH_EVENTS=y
# CONFIG_USER_EVENTS is not set
CONFIG_HIST_TRIGGERS=y
CONFIG_TRACE_EVENT_INJECT=y
# CONFIG_TRACEPOINT_BENCHMARK is not set
# CONFIG_RING_BUFFER_BENCHMARK is not set
# CONFIG_TRACE_EVAL_MAP_FILE is not set
# CONFIG_FTRACE_RECORD_RECURSION is not set
# CONFIG_FTRACE_STARTUP_TEST is not set
# CONFIG_FTRACE_SORT_STARTUP_TEST is not set
# CONFIG_RING_BUFFER_STARTUP_TEST is not set
# CONFIG_RING_BUFFER_VALIDATE_TIME_DELTAS is not set
# CONFIG_MMIOTRACE_TEST is not set
# CONFIG_PREEMPTIRQ_DELAY_TEST is not set
# CONFIG_SYNTH_EVENT_GEN_TEST is not set
# CONFIG_KPROBE_EVENT_GEN_TEST is not set
# CONFIG_HIST_TRIGGERS_DEBUG is not set
CONFIG_DA_MON_EVENTS=y
CONFIG_DA_MON_EVENTS_ID=y
CONFIG_RV=y
CONFIG_RV_MON_WWNR=y
CONFIG_RV_REACTORS=y
CONFIG_RV_REACT_PRINTK=y
CONFIG_RV_REACT_PANIC=y
# CONFIG_PROVIDE_OHCI1394_DMA_INIT is not set
CONFIG_SAMPLES=y
# CONFIG_SAMPLE_AUXDISPLAY is not set
# CONFIG_SAMPLE_TRACE_EVENTS is not set
# CONFIG_SAMPLE_TRACE_CUSTOM_EVENTS is not set
CONFIG_SAMPLE_TRACE_PRINTK=m
CONFIG_SAMPLE_FTRACE_DIRECT=m
# CONFIG_SAMPLE_FTRACE_DIRECT_MULTI is not set
# CONFIG_SAMPLE_FTRACE_OPS is not set
CONFIG_SAMPLE_TRACE_ARRAY=m
# CONFIG_SAMPLE_KOBJECT is not set
# CONFIG_SAMPLE_KPROBES is not set
# CONFIG_SAMPLE_HW_BREAKPOINT is not set
# CONFIG_SAMPLE_FPROBE is not set
# CONFIG_SAMPLE_KFIFO is not set
# CONFIG_SAMPLE_KDB is not set
# CONFIG_SAMPLE_RPMSG_CLIENT is not set
# CONFIG_SAMPLE_LIVEPATCH is not set
# CONFIG_SAMPLE_CONFIGFS is not set
# CONFIG_SAMPLE_VFIO_MDEV_MTTY is not set
# CONFIG_SAMPLE_VFIO_MDEV_MDPY is not set
# CONFIG_SAMPLE_VFIO_MDEV_MDPY_FB is not set
# CONFIG_SAMPLE_VFIO_MDEV_MBOCHS is not set
# CONFIG_SAMPLE_WATCHDOG is not set
CONFIG_HAVE_SAMPLE_FTRACE_DIRECT=y
CONFIG_HAVE_SAMPLE_FTRACE_DIRECT_MULTI=y
CONFIG_ARCH_HAS_DEVMEM_IS_ALLOWED=y
CONFIG_STRICT_DEVMEM=y
# CONFIG_IO_STRICT_DEVMEM is not set

#
# x86 Debugging
#
CONFIG_EARLY_PRINTK_USB=y
# CONFIG_X86_VERBOSE_BOOTUP is not set
CONFIG_EARLY_PRINTK=y
CONFIG_EARLY_PRINTK_DBGP=y
CONFIG_EARLY_PRINTK_USB_XDBC=y
# CONFIG_EFI_PGT_DUMP is not set
# CONFIG_DEBUG_TLBFLUSH is not set
# CONFIG_IOMMU_DEBUG is not set
CONFIG_HAVE_MMIOTRACE_SUPPORT=y
# CONFIG_X86_DECODER_SELFTEST is not set
# CONFIG_IO_DELAY_0X80 is not set
CONFIG_IO_DELAY_0XED=y
# CONFIG_IO_DELAY_UDELAY is not set
# CONFIG_IO_DELAY_NONE is not set
# CONFIG_DEBUG_BOOT_PARAMS is not set
# CONFIG_CPA_DEBUG is not set
# CONFIG_DEBUG_ENTRY is not set
# CONFIG_DEBUG_NMI_SELFTEST is not set
CONFIG_X86_DEBUG_FPU=y
CONFIG_PUNIT_ATOM_DEBUG=m
# CONFIG_UNWINDER_ORC is not set
CONFIG_UNWINDER_FRAME_POINTER=y
# end of x86 Debugging

#
# Kernel Testing and Coverage
#
# CONFIG_KUNIT is not set
CONFIG_NOTIFIER_ERROR_INJECTION=m
CONFIG_PM_NOTIFIER_ERROR_INJECT=m
# CONFIG_NETDEV_NOTIFIER_ERROR_INJECT is not set
CONFIG_FUNCTION_ERROR_INJECTION=y
# CONFIG_FAULT_INJECTION is not set
CONFIG_ARCH_HAS_KCOV=y
CONFIG_CC_HAS_SANCOV_TRACE_PC=y
# CONFIG_KCOV is not set
CONFIG_RUNTIME_TESTING_MENU=y
# CONFIG_TEST_DHRY is not set
# CONFIG_LKDTM is not set
# CONFIG_TEST_MIN_HEAP is not set
# CONFIG_TEST_DIV64 is not set
# CONFIG_BACKTRACE_SELF_TEST is not set
# CONFIG_TEST_REF_TRACKER is not set
# CONFIG_RBTREE_TEST is not set
# CONFIG_REED_SOLOMON_TEST is not set
# CONFIG_INTERVAL_TREE_TEST is not set
# CONFIG_PERCPU_TEST is not set
# CONFIG_ATOMIC64_SELFTEST is not set
# CONFIG_ASYNC_RAID6_TEST is not set
# CONFIG_TEST_HEXDUMP is not set
# CONFIG_STRING_SELFTEST is not set
# CONFIG_TEST_STRING_HELPERS is not set
# CONFIG_TEST_KSTRTOX is not set
# CONFIG_TEST_PRINTF is not set
# CONFIG_TEST_SCANF is not set
# CONFIG_TEST_BITMAP is not set
# CONFIG_TEST_UUID is not set
# CONFIG_TEST_XARRAY is not set
# CONFIG_TEST_MAPLE_TREE is not set
# CONFIG_TEST_RHASHTABLE is not set
# CONFIG_TEST_IDA is not set
# CONFIG_TEST_PARMAN is not set
# CONFIG_TEST_LKM is not set
# CONFIG_TEST_BITOPS is not set
# CONFIG_TEST_VMALLOC is not set
# CONFIG_TEST_USER_COPY is not set
CONFIG_TEST_BPF=m
CONFIG_TEST_BLACKHOLE_DEV=m
# CONFIG_FIND_BIT_BENCHMARK is not set
# CONFIG_TEST_FIRMWARE is not set
# CONFIG_TEST_SYSCTL is not set
# CONFIG_TEST_UDELAY is not set
# CONFIG_TEST_STATIC_KEYS is not set
# CONFIG_TEST_DYNAMIC_DEBUG is not set
# CONFIG_TEST_KMOD is not set
# CONFIG_TEST_MEMCAT_P is not set
# CONFIG_TEST_LIVEPATCH is not set
# CONFIG_TEST_OBJAGG is not set
# CONFIG_TEST_MEMINIT is not set
# CONFIG_TEST_HMM is not set
# CONFIG_TEST_FREE_PAGES is not set
# CONFIG_TEST_FPU is not set
# CONFIG_TEST_CLOCKSOURCE_WATCHDOG is not set
CONFIG_ARCH_USE_MEMTEST=y
CONFIG_MEMTEST=y
# CONFIG_HYPERV_TESTING is not set
# end of Kernel Testing and Coverage

#
# Rust hacking
#
# end of Rust hacking
# end of Kernel hacking

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 19:48                                                                               ` Bart Van Assche
  2023-10-18 20:03                                                                                 ` Bob Pearson
  2023-10-18 20:04                                                                                 ` Bob Pearson
@ 2023-10-18 20:14                                                                                 ` Bob Pearson
  2023-10-18 20:29                                                                                 ` Bob Pearson
  3 siblings, 0 replies; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 20:14 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

[-- Attachment #1: Type: text/plain, Size: 878 bytes --]

On 10/18/23 14:48, Bart Van Assche wrote:
> 
> On 10/18/23 12:17, Jason Gunthorpe wrote:
>> If siw hangs as well, I am definitely comfortable continuing to debug
>> and leaving the work queues in-tree for now.
> 
> Regarding the KASAN complaint that I shared about one month ago, can that complaint have any other root cause than the patch "RDMA/rxe: Add
> workqueue support for rxe tasks"? That report shows a use-after-free by
> rxe code with a pointer to memory that was owned by the rxe driver and
> that was freed by the rxe driver. That memory is an skbuff. The rxe
> driver manages skbuffs. The SRP driver doesn't even know about these
> skbuff objects. See also https://lore.kernel.org/linux-rdma/8ee2869b-3f51-4195-9883-015cd30b4241@acm.org/
> 
> Thanks,
> 
> Bart.
> 

And here are the patches I currently have applied to ib_srp. I will retry with a clean (unpatched) version to see what happens.

Bob

[-- Attachment #2: ib_srp.diff --]
[-- Type: text/x-patch, Size: 7705 bytes --]

diff --git a/drivers/infiniband/ulp/srp/ib_srp.c b/drivers/infiniband/ulp/srp/ib_srp.c
index 1574218764e0..1b9919e08176 100644
--- a/drivers/infiniband/ulp/srp/ib_srp.c
+++ b/drivers/infiniband/ulp/srp/ib_srp.c
@@ -512,9 +512,11 @@ static struct srp_fr_pool *srp_alloc_fr_pool(struct srp_target_port *target)
  */
 static void srp_destroy_qp(struct srp_rdma_ch *ch)
 {
-	spin_lock_irq(&ch->lock);
-	ib_process_cq_direct(ch->send_cq, -1);
-	spin_unlock_irq(&ch->lock);
+	//spin_lock_irq(&ch->lock);
+	//pr_info("qp#%03d: %s to ib_process_cq_direct\n",
+			//ch->qp->qp_num, __func__);
+	//ib_process_cq_direct(ch->send_cq, -1);
+	//spin_unlock_irq(&ch->lock);
 
 	ib_drain_qp(ch->qp);
 	ib_destroy_qp(ch->qp);
@@ -544,8 +546,10 @@ static int srp_create_ch_ib(struct srp_rdma_ch *ch)
 		goto err;
 	}
 
+	//send_cq = ib_alloc_cq(dev->dev, ch, m * target->queue_size,
+				//ch->comp_vector, IB_POLL_DIRECT);
 	send_cq = ib_alloc_cq(dev->dev, ch, m * target->queue_size,
-				ch->comp_vector, IB_POLL_DIRECT);
+				ch->comp_vector, IB_POLL_SOFTIRQ);
 	if (IS_ERR(send_cq)) {
 		ret = PTR_ERR(send_cq);
 		goto err_recv_cq;
@@ -1154,12 +1158,19 @@ static int srp_connect_ch(struct srp_rdma_ch *ch, uint32_t max_iu_len,
 
 static void srp_inv_rkey_err_done(struct ib_cq *cq, struct ib_wc *wc)
 {
+	struct srp_rdma_ch *ch = cq->cq_context;
+
+	pr_err("qp#%03d: inv_rkey-done: opcode: %d status: %d: len: %d\n",
+			ch->qp->qp_num, wc->opcode, wc->status, wc->byte_len);
+
 	srp_handle_qp_err(cq, wc, "INV RKEY");
 }
 
 static int srp_inv_rkey(struct srp_request *req, struct srp_rdma_ch *ch,
 		u32 rkey)
 {
+	int ret;
+
 	struct ib_send_wr wr = {
 		.opcode		    = IB_WR_LOCAL_INV,
 		.next		    = NULL,
@@ -1170,7 +1181,12 @@ static int srp_inv_rkey(struct srp_request *req, struct srp_rdma_ch *ch,
 
 	wr.wr_cqe = &req->reg_cqe;
 	req->reg_cqe.done = srp_inv_rkey_err_done;
-	return ib_post_send(ch->qp, &wr, NULL);
+	ret = ib_post_send(ch->qp, &wr, NULL);
+	if (ret)
+		pr_err("qp#%03d: %s: ret = %d\n", ch->qp->qp_num, __func__, ret);
+	else
+		pr_info("qp#%03d: post-inv_rkey: %#x", ch->qp->qp_num, rkey);
+	return ret;
 }
 
 static void srp_unmap_data(struct scsi_cmnd *scmnd,
@@ -1408,6 +1424,11 @@ static void srp_map_desc(struct srp_map_state *state, dma_addr_t dma_addr,
 
 static void srp_reg_mr_err_done(struct ib_cq *cq, struct ib_wc *wc)
 {
+	struct srp_rdma_ch *ch = cq->cq_context;
+
+	pr_err("qp#%03d: reg_mr-done: opcode: %d status: %d: len: %d\n",
+			ch->qp->qp_num, wc->opcode, wc->status, wc->byte_len);
+
 	srp_handle_qp_err(cq, wc, "FAST REG");
 }
 
@@ -1488,6 +1509,11 @@ static int srp_map_finish_fr(struct srp_map_state *state,
 		     desc->mr->length, desc->mr->rkey);
 
 	err = ib_post_send(ch->qp, &wr.wr, NULL);
+	if (err)
+		pr_err("qp#%03d: %s: err = %d\n", ch->qp->qp_num, __func__, err);
+	else
+		pr_info("qp#%03d: post-reg_mr: %#x\n", ch->qp->qp_num,
+				desc->mr->rkey);
 	if (unlikely(err)) {
 		WARN_ON_ONCE(err == -ENOMEM);
 		return err;
@@ -1840,10 +1866,14 @@ static struct srp_iu *__srp_get_tx_iu(struct srp_rdma_ch *ch,
 
 	lockdep_assert_held(&ch->lock);
 
-	ib_process_cq_direct(ch->send_cq, -1);
+	//pr_info("qp#%03d: %s to ib_process_cq_direct\n",
+			//ch->qp->qp_num, __func__);
+	//ib_process_cq_direct(ch->send_cq, -1);
 
-	if (list_empty(&ch->free_tx))
+	if (list_empty(&ch->free_tx)) {
+		pr_err("%s: ran out of iu\n", __func__);
 		return NULL;
+	}
 
 	/* Initiator responses to target requests do not consume credits */
 	if (iu_type != SRP_IU_RSP) {
@@ -1869,15 +1899,22 @@ static void srp_send_done(struct ib_cq *cq, struct ib_wc *wc)
 {
 	struct srp_iu *iu = container_of(wc->wr_cqe, struct srp_iu, cqe);
 	struct srp_rdma_ch *ch = cq->cq_context;
+	/**/unsigned long flags;
+
+	pr_info("qp#%03d: send-done: opcode: %d status: %d: len: %d\n",
+			ch->qp->qp_num, wc->opcode, wc->status, wc->byte_len);
 
 	if (unlikely(wc->status != IB_WC_SUCCESS)) {
 		srp_handle_qp_err(cq, wc, "SEND");
 		return;
 	}
 
+
+	/**/spin_lock_irqsave(&ch->lock, flags);
 	lockdep_assert_held(&ch->lock);
 
 	list_add(&iu->list, &ch->free_tx);
+	/**/spin_unlock_irqrestore(&ch->lock, flags);
 }
 
 /**
@@ -1890,6 +1927,7 @@ static int srp_post_send(struct srp_rdma_ch *ch, struct srp_iu *iu, int len)
 {
 	struct srp_target_port *target = ch->target;
 	struct ib_send_wr wr;
+	int ret;
 
 	if (WARN_ON_ONCE(iu->num_sge > SRP_MAX_SGE))
 		return -EINVAL;
@@ -1907,7 +1945,12 @@ static int srp_post_send(struct srp_rdma_ch *ch, struct srp_iu *iu, int len)
 	wr.opcode     = IB_WR_SEND;
 	wr.send_flags = IB_SEND_SIGNALED;
 
-	return ib_post_send(ch->qp, &wr, NULL);
+	ret = ib_post_send(ch->qp, &wr, NULL);
+	if (ret)
+		pr_err("qp#%03d: %s: ret = %d\n", ch->qp->qp_num, __func__, ret);
+	else
+		pr_info("qp#%03d: post-send:\n", ch->qp->qp_num);
+	return ret;
 }
 
 static int srp_post_recv(struct srp_rdma_ch *ch, struct srp_iu *iu)
@@ -1915,6 +1958,7 @@ static int srp_post_recv(struct srp_rdma_ch *ch, struct srp_iu *iu)
 	struct srp_target_port *target = ch->target;
 	struct ib_recv_wr wr;
 	struct ib_sge list;
+	int ret;
 
 	list.addr   = iu->dma;
 	list.length = iu->size;
@@ -1927,7 +1971,12 @@ static int srp_post_recv(struct srp_rdma_ch *ch, struct srp_iu *iu)
 	wr.sg_list  = &list;
 	wr.num_sge  = 1;
 
-	return ib_post_recv(ch->qp, &wr, NULL);
+	ret = ib_post_recv(ch->qp, &wr, NULL);
+	if (ret)
+		pr_err("qp#%03d: %s: ret = %d\n", ch->qp->qp_num, __func__, ret);
+	else
+		pr_info("qp#%03d: post-recv:\n", ch->qp->qp_num);
+	return ret;
 }
 
 static void srp_process_rsp(struct srp_rdma_ch *ch, struct srp_rsp *rsp)
@@ -2004,8 +2053,9 @@ static int srp_response_common(struct srp_rdma_ch *ch, s32 req_delta,
 	spin_unlock_irqrestore(&ch->lock, flags);
 
 	if (!iu) {
-		shost_printk(KERN_ERR, target->scsi_host, PFX
-			     "no IU available to send response\n");
+		pr_err("%s: no iu to send response\n", __func__);
+		//shost_printk(KERN_ERR, target->scsi_host, PFX
+			     //"no IU available to send response\n");
 		return 1;
 	}
 
@@ -2034,8 +2084,9 @@ static void srp_process_cred_req(struct srp_rdma_ch *ch,
 	s32 delta = be32_to_cpu(req->req_lim_delta);
 
 	if (srp_response_common(ch, delta, &rsp, sizeof(rsp)))
-		shost_printk(KERN_ERR, ch->target->scsi_host, PFX
-			     "problems processing SRP_CRED_REQ\n");
+		pr_err("%s: problems with cred req\n", __func__);
+		//shost_printk(KERN_ERR, ch->target->scsi_host, PFX
+			     //"problems processing SRP_CRED_REQ\n");
 }
 
 static void srp_process_aer_req(struct srp_rdma_ch *ch,
@@ -2065,6 +2116,9 @@ static void srp_recv_done(struct ib_cq *cq, struct ib_wc *wc)
 	int res;
 	u8 opcode;
 
+	pr_info("qp#%03d: recv-done: opcode: %d status: %d: len: %d\n",
+			ch->qp->qp_num, wc->opcode, wc->status, wc->byte_len);
+
 	if (unlikely(wc->status != IB_WC_SUCCESS)) {
 		srp_handle_qp_err(cq, wc, "RECV");
 		return;
@@ -2173,8 +2227,10 @@ static int srp_queuecommand(struct Scsi_Host *shost, struct scsi_cmnd *scmnd)
 	iu = __srp_get_tx_iu(ch, SRP_IU_CMD);
 	spin_unlock_irqrestore(&ch->lock, flags);
 
-	if (!iu)
+	if (!iu) {
+		pr_err("%s: no iu to queue command\n", __func__);
 		goto err;
+	}
 
 	dev = target->srp_host->srp_dev->dev;
 	ib_dma_sync_single_for_cpu(dev, iu->dma, ch->max_it_iu_len,
@@ -2240,6 +2296,7 @@ static int srp_queuecommand(struct Scsi_Host *shost, struct scsi_cmnd *scmnd)
 		scsi_done(scmnd);
 		ret = 0;
 	} else {
+		pr_err("%s: returned SCSI_MLQUEUE_HOST_BUSY\n", __func__);
 		ret = SCSI_MLQUEUE_HOST_BUSY;
 	}
 
@@ -2734,6 +2791,7 @@ static int srp_send_tsk_mgmt(struct srp_rdma_ch *ch, u64 req_tag, u64 lun,
 	spin_unlock_irq(&ch->lock);
 
 	if (!iu) {
+		pr_err("%s: no iu for task management\n", __func__);
 		mutex_unlock(&rport->mutex);
 
 		return -1;


* Re: [bug report] blktests srp/002 hang
  2023-10-18 19:48                                                                               ` Bart Van Assche
                                                                                                   ` (2 preceding siblings ...)
  2023-10-18 20:14                                                                                 ` Bob Pearson
@ 2023-10-18 20:29                                                                                 ` Bob Pearson
  2023-10-18 20:49                                                                                   ` Bart Van Assche
  3 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 20:29 UTC (permalink / raw)
  To: Bart Van Assche, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

[-- Attachment #1: Type: text/plain, Size: 1077 bytes --]

On 10/18/23 14:48, Bart Van Assche wrote:
> 
> On 10/18/23 12:17, Jason Gunthorpe wrote:
>> If siw hangs as well, I am definitely comfortable continuing to debug
>> and leaving the work queues in-tree for now.
> 
> Regarding the KASAN complaint that I shared about one month ago, can that complaint have any other root cause than the patch "RDMA/rxe: Add
> workqueue support for rxe tasks"? That report shows a use-after-free by
> rxe code with a pointer to memory that was owned by the rxe driver and
> that was freed by the rxe driver. That memory is an skbuff. The rxe
> driver manages skbuffs. The SRP driver doesn't even know about these
> skbuff objects. See also https://lore.kernel.org/linux-rdma/8ee2869b-3f51-4195-9883-015cd30b4241@acm.org/
> 
> Thanks,
> 
> Bart.
> 

OK, with clean code from the current rdma for-next branch and the .config I sent before, if I run:

rpearson:blktests$ sudo use_siw=1 ./check srp/002
[sudo] password for rpearson: 
srp/002 (File I/O on top of multipath concurrently with logout and login (mq))

It hangs. The dmesg trace is attached.

Bob

[-- Attachment #2: out --]
[-- Type: text/plain, Size: 479920 bytes --]

[ 6571.509854] run blktests srp/002 at 2023-10-18 15:19:37
[ 6572.354842] rxe0: cq#1 rxe_destroy_cq: called
[ 6572.972739] rdma_rxe: unloaded
[ 6573.550708] null_blk: module loaded
[ 6573.644227] null_blk: disk nullb0 created
[ 6573.680173] null_blk: disk nullb1 created
[ 6573.795329] io scheduler bfq registered
[ 6573.821391] io scheduler kyber registered
[ 6573.878994] SoftiWARP attached
[ 6574.004092] scsi_debug:sdebug_add_store: dif_storep 524288 bytes @ ffffc90003573000
[ 6574.010351] scsi_debug:sdebug_driver_probe: scsi_debug: trim poll_queues to 0. poll_q/nr_hw = (0/1)
[ 6574.010364] scsi_debug:sdebug_driver_probe: host protection DIF3 DIX3
[ 6574.010380] scsi host10: scsi_debug: version 0191 [20210520]
                 dev_size_mb=32, opts=0x0, submit_queues=1, statistics=0
[ 6574.017431] scsi 10:0:0:0: Direct-Access     Linux    scsi_debug       0191 PQ: 0 ANSI: 7
[ 6574.018514] scsi 10:0:0:0: Power-on or device reset occurred
[ 6574.028626] sd 10:0:0:0: [sda] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6574.028717] sd 10:0:0:0: [sda] Write Protect is off
[ 6574.028742] sd 10:0:0:0: [sda] Mode Sense: 73 00 10 08
[ 6574.028883] sd 10:0:0:0: [sda] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 6574.029213] sd 10:0:0:0: [sda] Enabling DIX T10-DIF-TYPE3-CRC, application tag size 6 bytes
[ 6574.029248] sd 10:0:0:0: [sda] Enabling DIF Type 3 protection
[ 6574.029273] sd 10:0:0:0: [sda] Preferred minimum I/O size 512 bytes
[ 6574.029299] sd 10:0:0:0: [sda] Optimal transfer size 524288 bytes
[ 6574.029327] sd 10:0:0:0: Attached scsi generic sg0 type 0
[ 6574.050983] sd 10:0:0:0: [sda] Attached SCSI disk
[ 6574.818728] Rounding down aligned max_sectors from 4294967295 to 4294967288
[ 6574.882826] ib_srpt:srpt_add_one: ib_srpt device = 0000000051e8584b
[ 6574.883013] ib_srpt:srpt_use_srq: ib_srpt srpt_use_srq(enp6s0_siw): use_srq = 0; ret = 0
[ 6574.883021] ib_srpt:srpt_add_one: ib_srpt Target login info: id_ext=b62e99fffef9fa2e,ioc_guid=b62e99fffef9fa2e,pkey=ffff,service_id=b62e99fffef9fa2e
[ 6574.883292] ib_srpt:srpt_add_one: ib_srpt added enp6s0_siw.
[ 6575.486869] Rounding down aligned max_sectors from 255 to 248
[ 6575.578667] Rounding down aligned max_sectors from 255 to 248
[ 6575.666780] Rounding down aligned max_sectors from 4294967295 to 4294967288
[ 6575.686666] iwpm_register_pid: Unable to send a nlmsg (client = 2)
[ 6576.646833] ib_srp:srp_add_one: ib_srp: srp_add_one: 18446744073709551615 / 4096 = 4503599627370495 <> 512
[ 6576.646845] ib_srp:srp_add_one: ib_srp: enp6s0_siw: mr_page_shift = 12, device->max_mr_size = 0xffffffffffffffff, device->max_fast_reg_page_list_len = 256, max_pages_per_mr = 256, mr_max_size = 0x100000
[ 6576.798846] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6576.798942] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6576.799082] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6576.799099] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6576.823444] scsi host11: ib_srp: REJ received
[ 6576.823501] scsi host11:   REJ reason 0xffffff98
[ 6576.823818] scsi host11: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 6576.985864] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6576.985897] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6576.985963] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6576.985995] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6576.986014] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6576.986023] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6576.994531] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6576.997182] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.087691] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000020cea838
[ 6577.088106] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.088947] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a35740f0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000020cea838
[ 6577.090580] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2: queued zerolength write
[ 6577.090727] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.090734] scsi host11: ib_srp: using immediate data
[ 6577.093353] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2 wc->status 0
[ 6577.105149] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.107387] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.190868] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000139262ee
[ 6577.191125] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.191232] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000aaf21ae2 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000139262ee
[ 6577.191600] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4: queued zerolength write
[ 6577.192487] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.192501] scsi host11: ib_srp: using immediate data
[ 6577.192704] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4 wc->status 0
[ 6577.206802] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.208951] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.291104] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000001e02bd8b
[ 6577.291363] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.291472] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e33689af name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000001e02bd8b
[ 6577.291827] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6: queued zerolength write
[ 6577.291852] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.291859] scsi host11: ib_srp: using immediate data
[ 6577.292075] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6 wc->status 0
[ 6577.305955] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.308060] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.387756] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009f218f1d
[ 6577.388012] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.388114] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b56b70fc name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000009f218f1d
[ 6577.388520] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8: queued zerolength write
[ 6577.388632] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.388651] scsi host11: ib_srp: using immediate data
[ 6577.388978] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8 wc->status 0
[ 6577.403527] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.408028] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.512419] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000081430bf1
[ 6577.512674] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.512782] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000001174e49c name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000081430bf1
[ 6577.513203] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10: queued zerolength write
[ 6577.513349] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.513366] scsi host11: ib_srp: using immediate data
[ 6577.514946] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10 wc->status 0
[ 6577.527958] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.530916] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.622924] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f1e26e95
[ 6577.623188] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.623290] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000d56ca942 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000f1e26e95
[ 6577.623634] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12: queued zerolength write
[ 6577.623732] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.623749] scsi host11: ib_srp: using immediate data
[ 6577.625320] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12 wc->status 0
[ 6577.637309] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.640501] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.740137] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000039ed2029
[ 6577.740421] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.740528] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002a00fdc2 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000039ed2029
[ 6577.740875] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14: queued zerolength write
[ 6577.740899] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.740907] scsi host11: ib_srp: using immediate data
[ 6577.741273] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14 wc->status 0
[ 6577.754403] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.756509] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.833517] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000601a6cc7
[ 6577.833772] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.833874] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000cb839a1b name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000601a6cc7
[ 6577.834233] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16: queued zerolength write
[ 6577.834257] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.834264] scsi host11: ib_srp: using immediate data
[ 6577.834449] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16 wc->status 0
[ 6577.847725] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.849829] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6577.926901] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000077e8b508
[ 6577.927157] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6577.927260] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007062b29e name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000077e8b508
[ 6577.927696] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18: queued zerolength write
[ 6577.927790] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6577.927808] scsi host11: ib_srp: using immediate data
[ 6577.928088] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18 wc->status 0
[ 6577.943130] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6577.947448] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.048908] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000876e4fc8
[ 6578.049163] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.049265] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000077f903cf name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000876e4fc8
[ 6578.049660] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20: queued zerolength write
[ 6578.049816] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.049832] scsi host11: ib_srp: using immediate data
[ 6578.050054] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20 wc->status 0
[ 6578.062787] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.065578] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.156509] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ecf2a564
[ 6578.156763] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.156865] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006449e27c name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ecf2a564
[ 6578.157218] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22: queued zerolength write
[ 6578.157241] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.157249] scsi host11: ib_srp: using immediate data
[ 6578.157447] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22 wc->status 0
[ 6578.170901] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.173000] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.250256] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d20735b7
[ 6578.250512] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.250614] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000084d7544d name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000d20735b7
[ 6578.250972] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24: queued zerolength write
[ 6578.251063] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.251081] scsi host11: ib_srp: using immediate data
[ 6578.251379] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24 wc->status 0
[ 6578.267973] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.272761] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.377290] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c6dd1322
[ 6578.377555] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.377656] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fb030611 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c6dd1322
[ 6578.377997] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26: queued zerolength write
[ 6578.378021] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.378028] scsi host11: ib_srp: using immediate data
[ 6578.378375] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26 wc->status 0
[ 6578.393042] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.395216] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.475512] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000001428dc05
[ 6578.475767] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.475871] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000ae2e6dc0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000001428dc05
[ 6578.476224] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28: queued zerolength write
[ 6578.476253] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.476263] scsi host11: ib_srp: using immediate data
[ 6578.476562] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28 wc->status 0
[ 6578.489706] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.491771] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.571667] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000614f8a65
[ 6578.571936] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.572038] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008b13a185 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000614f8a65
[ 6578.572449] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30: queued zerolength write
[ 6578.572526] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.572544] scsi host11: ib_srp: using immediate data
[ 6578.572841] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30 wc->status 0
[ 6578.586025] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.590252] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.690044] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b46f4155
[ 6578.690299] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.690401] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009d973c95 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b46f4155
[ 6578.690752] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32: queued zerolength write
[ 6578.690776] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.690783] scsi host11: ib_srp: using immediate data
[ 6578.691241] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32 wc->status 0
[ 6578.705715] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.707774] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.783544] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000099673ac1
[ 6578.783800] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.783902] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003d029187 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000099673ac1
[ 6578.784265] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34: queued zerolength write
[ 6578.784363] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.784418] scsi host11: ib_srp: using immediate data
[ 6578.786144] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34 wc->status 0
[ 6578.798666] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.800768] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.876991] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ca85e525
[ 6578.877246] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.877353] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000162e7ca0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ca85e525
[ 6578.877709] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36: queued zerolength write
[ 6578.877796] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.877813] scsi host11: ib_srp: using immediate data
[ 6578.878105] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36 wc->status 0
[ 6578.894195] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.896243] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6578.972966] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000072955f3a
[ 6578.973220] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6578.973322] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000bedc9940 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000072955f3a
[ 6578.973688] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38: queued zerolength write
[ 6578.973711] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6578.973718] scsi host11: ib_srp: using immediate data
[ 6578.974062] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38 wc->status 0
[ 6578.988788] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6578.990881] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6579.067742] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009889db15
[ 6579.067997] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6579.068108] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e0abd9be name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000009889db15
[ 6579.068517] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40: queued zerolength write
[ 6579.068671] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6579.068689] scsi host11: ib_srp: using immediate data
[ 6579.068948] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40 wc->status 0
[ 6579.083151] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6579.086912] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6579.173353] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000745dfb31
[ 6579.173605] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6579.173708] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005052d2cf name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000745dfb31
[ 6579.174026] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42: queued zerolength write
[ 6579.174133] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6579.174152] scsi host11: ib_srp: using immediate data
[ 6579.174233] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42 wc->status 0
[ 6579.189654] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6579.191703] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6579.284841] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000003795ab50
[ 6579.285116] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6579.285233] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000059d9a6ff name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000003795ab50
[ 6579.285605] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44: queued zerolength write
[ 6579.285698] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6579.285715] scsi host11: ib_srp: using immediate data
[ 6579.285988] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44 wc->status 0
[ 6579.300271] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6579.304559] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6579.402003] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000113ca1fa
[ 6579.402258] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6579.402358] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000028559935 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000113ca1fa
[ 6579.402713] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46: queued zerolength write
[ 6579.402737] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6579.402744] scsi host11: ib_srp: using immediate data
[ 6579.402969] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46 wc->status 0
[ 6579.417239] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6579.419375] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6579.494770] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000086e3ad2d
[ 6579.495027] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6579.495135] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fd609857 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000086e3ad2d
[ 6579.495478] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48: queued zerolength write
[ 6579.495566] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6579.495583] scsi host11: ib_srp: using immediate data
[ 6579.495704] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48 wc->status 0
[ 6579.497766] scsi host11: SRP.T10:B62E99FFFEF9FA2E
[ 6579.570917] scsi 11:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6579.581300] scsi 11:0:0:0: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[ 6579.584796] scsi 11:0:0:0: alua: supports implicit and explicit TPGS
[ 6579.585015] scsi 11:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 6579.591387] sd 11:0:0:0: Attached scsi generic sg1 type 0
[ 6579.591765] sd 11:0:0:0: [sdb] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6579.592737] sd 11:0:0:0: [sdb] Write Protect is off
[ 6579.592765] sd 11:0:0:0: [sdb] Mode Sense: 43 00 00 08
[ 6579.594607] sd 11:0:0:0: [sdb] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6579.596608] scsi 11:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6579.597795] sd 11:0:0:0: [sdb] Preferred minimum I/O size 512 bytes
[ 6579.597822] sd 11:0:0:0: [sdb] Optimal transfer size 126976 bytes
[ 6579.608397] scsi 11:0:0:2: alua: supports implicit and explicit TPGS
[ 6579.608497] scsi 11:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 6579.615731] sd 11:0:0:2: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6579.615809] sd 11:0:0:2: Attached scsi generic sg2 type 0
[ 6579.616203] sd 11:0:0:2: [sdc] Write Protect is off
[ 6579.616226] sd 11:0:0:2: [sdc] Mode Sense: 43 00 10 08
[ 6579.617242] sd 11:0:0:2: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 6579.620953] sd 11:0:0:2: [sdc] Preferred minimum I/O size 512 bytes
[ 6579.620966] sd 11:0:0:2: [sdc] Optimal transfer size 524288 bytes
[ 6579.621830] scsi 11:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6579.630636] scsi 11:0:0:1: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[ 6579.633679] scsi 11:0:0:1: alua: supports implicit and explicit TPGS
[ 6579.633743] scsi 11:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 6579.643580] sd 11:0:0:1: Attached scsi generic sg3 type 0
[ 6579.645749] sd 11:0:0:1: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6579.645978] ib_srp:srp_add_target: ib_srp: host11: SCSI scan succeeded - detected 3 LUNs
[ 6579.646144] scsi host11: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 6579.650139] sd 11:0:0:1: [sdd] Write Protect is off
[ 6579.650153] sd 11:0:0:1: [sdd] Mode Sense: 43 00 00 08
[ 6579.651327] sd 11:0:0:1: [sdd] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6579.652146] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6579.652206] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6579.652324] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6579.652383] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6579.652525] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6579.652558] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6579.652591] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6579.653293] sd 11:0:0:1: [sdd] Preferred minimum I/O size 512 bytes
[ 6579.653318] sd 11:0:0:1: [sdd] Optimal transfer size 126976 bytes
[ 6579.661924] sd 11:0:0:0: [sdb] Attached SCSI disk
[ 6579.664286] sd 11:0:0:2: [sdc] Attached SCSI disk
[ 6579.690028] sd 11:0:0:1: [sdd] Attached SCSI disk
[ 6579.730750] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6579.730786] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6579.730857] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6579.730892] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6579.730956] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6579.730987] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6579.731051] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6579.731082] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6579.731103] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6579.802144] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6579.802180] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6579.802244] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6579.802276] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6579.802340] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6579.802375] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6579.802446] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6579.802478] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6579.802542] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6579.802573] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6579.802593] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6579.887798] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6579.887834] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6579.887900] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6579.887951] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6579.888066] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6579.888124] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6579.888239] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6579.888297] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6579.888382] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6579.888470] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6579.888579] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6579.888615] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6579.888635] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6580.219148] sd 11:0:0:0: alua: transition timeout set to 60 seconds
[ 6580.219383] sd 11:0:0:0: alua: port group 00 state A non-preferred supports TOlUSNA
[ 6582.389396] EXT4-fs (dm-5): mounted filesystem 56808e44-29bd-41ca-8262-92dc933ddf8f r/w without journal. Quota mode: none.
[ 6587.495430] device-mapper: multipath: 253:5: Failing path 8:16.
[ 6587.676853] sd 11:0:0:2: [sdc] Synchronizing SCSI cache
[ 6587.768987] scsi 11:0:0:2: alua: Detached
[ 6587.876857] scsi 11:0:0:1: alua: Detached
[ 6587.896223] ib_srpt receiving failed for ioctx 00000000a2049389 with status 5
[ 6587.896240] ib_srpt receiving failed for ioctx 000000002407e55a with status 5
[ 6587.896251] ib_srpt receiving failed for ioctx 000000004f38f070 with status 5
[ 6587.896262] ib_srpt receiving failed for ioctx 00000000d26a6cad with status 5
[ 6587.896273] ib_srpt receiving failed for ioctx 000000006e9aaf02 with status 5
[ 6587.896284] ib_srpt receiving failed for ioctx 000000008471a700 with status 5
[ 6587.896295] ib_srpt receiving failed for ioctx 000000006ef93c72 with status 5
[ 6587.896306] ib_srpt receiving failed for ioctx 0000000097dd5c35 with status 5
[ 6587.896317] ib_srpt receiving failed for ioctx 0000000024edf1fb with status 5
[ 6587.896327] ib_srpt receiving failed for ioctx 00000000e438043d with status 5
[ 6592.666138] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6592.666175] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6592.666194] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6592.666202] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6592.672767] scsi host12: ib_srp: REJ received
[ 6592.672776] scsi host12:   REJ reason 0xffffff98
[ 6592.672807] scsi host12: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 6592.807537] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6592.807604] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6592.807729] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6592.807792] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6592.807831] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6592.807848] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6592.820010] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6592.824428] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6592.931124] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000de16ccb7
[ 6592.931391] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6592.931552] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000aa822b07 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000de16ccb7
[ 6592.931919] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3: queued zerolength write
[ 6592.932015] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6592.932033] scsi host12: ib_srp: using immediate data
[ 6592.932164] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3 wc->status 0
[ 6592.947930] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6592.950174] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.031755] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000004faa6ca6
[ 6593.032023] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.032135] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000de8497ae name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000004faa6ca6
[ 6593.032491] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7: queued zerolength write
[ 6593.032586] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.032604] scsi host12: ib_srp: using immediate data
[ 6593.032879] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7 wc->status 0
[ 6593.048337] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.050544] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.130805] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006ee69e26
[ 6593.131073] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.131179] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000da14b63f name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006ee69e26
[ 6593.131552] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11: queued zerolength write
[ 6593.131749] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11 wc->status 0
[ 6593.132415] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.132443] scsi host12: ib_srp: using immediate data
[ 6593.140601] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.145333] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.252278] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000d113df7
[ 6593.252544] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.252653] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000bb95905b name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000000d113df7
[ 6593.253114] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15: queued zerolength write
[ 6593.253249] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.253257] scsi host12: ib_srp: using immediate data
[ 6593.253357] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15 wc->status 0
[ 6593.261143] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.263368] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.344836] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006c27080a
[ 6593.345229] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.345337] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007b31c0b3 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006c27080a
[ 6593.345705] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19: queued zerolength write
[ 6593.345801] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.345809] scsi host12: ib_srp: using immediate data
[ 6593.346273] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19 wc->status 0
[ 6593.353233] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.355417] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.437505] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000fb15925f
[ 6593.437774] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.437883] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000050ad99ae name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000fb15925f
[ 6593.438259] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23: queued zerolength write
[ 6593.438355] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.438373] scsi host12: ib_srp: using immediate data
[ 6593.441994] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23 wc->status 0
[ 6593.448164] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.452017] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.553429] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002e1317c6
[ 6593.553697] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.553809] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003c75bd31 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000002e1317c6
[ 6593.554190] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27: queued zerolength write
[ 6593.554285] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.554303] scsi host12: ib_srp: using immediate data
[ 6593.554417] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27 wc->status 0
[ 6593.562400] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.564574] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.645521] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000077eec89a
[ 6593.645787] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.645896] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009bbed859 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000077eec89a
[ 6593.646273] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31: queued zerolength write
[ 6593.646444] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.646462] scsi host12: ib_srp: using immediate data
[ 6593.646618] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31 wc->status 0
[ 6593.661897] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.664064] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.743624] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000005f5e174
[ 6593.743891] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.743998] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000068364167 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000005f5e174
[ 6593.744373] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35: queued zerolength write
[ 6593.744569] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35 wc->status 0
[ 6593.744589] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.744596] scsi host12: ib_srp: using immediate data
[ 6593.757729] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.759897] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.841991] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000031d299aa
[ 6593.842256] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.842362] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000070981c53 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000031d299aa
[ 6593.842719] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39: queued zerolength write
[ 6593.842812] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.842830] scsi host12: ib_srp: using immediate data
[ 6593.842950] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39 wc->status 0
[ 6593.858347] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.861081] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6593.940575] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000802401db
[ 6593.940842] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6593.941000] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000042828f9d name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000802401db
[ 6593.942051] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43: queued zerolength write
[ 6593.942149] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6593.942167] scsi host12: ib_srp: using immediate data
[ 6593.942389] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43 wc->status 0
[ 6593.956957] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6593.959177] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.038742] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b4c55d03
[ 6594.039008] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.039113] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000b802ab1 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b4c55d03
[ 6594.039482] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47: queued zerolength write
[ 6594.039702] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.039716] scsi host12: ib_srp: using immediate data
[ 6594.039746] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47 wc->status 0
[ 6594.053392] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.055581] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.135687] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009810be7c
[ 6594.135953] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.136062] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000bddd322 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000009810be7c
[ 6594.136428] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50: queued zerolength write
[ 6594.136523] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.136541] scsi host12: ib_srp: using immediate data
[ 6594.136660] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50 wc->status 0
[ 6594.152178] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.154430] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.233154] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d74c6a06
[ 6594.233419] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.233524] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003025b2d2 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000d74c6a06
[ 6594.233889] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52: queued zerolength write
[ 6594.233981] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.233998] scsi host12: ib_srp: using immediate data
[ 6594.234186] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52 wc->status 0
[ 6594.249637] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.252335] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.331291] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ef9ff1b9
[ 6594.331557] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.331661] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002f6cfdce name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ef9ff1b9
[ 6594.332029] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54: queued zerolength write
[ 6594.332203] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.332220] scsi host12: ib_srp: using immediate data
[ 6594.332394] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54 wc->status 0
[ 6594.346363] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.348549] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.428074] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b6ae2f83
[ 6594.428342] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.428448] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003eded7d4 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b6ae2f83
[ 6594.428828] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56: queued zerolength write
[ 6594.428919] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.428937] scsi host12: ib_srp: using immediate data
[ 6594.429100] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56 wc->status 0
[ 6594.443265] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.445787] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.525695] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002ce5cd56
[ 6594.525963] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.526068] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000091d46b86 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000002ce5cd56
[ 6594.526435] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58: queued zerolength write
[ 6594.526461] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.526468] scsi host12: ib_srp: using immediate data
[ 6594.526733] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58 wc->status 0
[ 6594.539235] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.541544] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.621405] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000043020b30
[ 6594.621671] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.621778] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007de83b54 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000043020b30
[ 6594.622152] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60: queued zerolength write
[ 6594.622247] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.622265] scsi host12: ib_srp: using immediate data
[ 6594.622538] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60 wc->status 0
[ 6594.636561] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.638831] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.717575] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000056595f0c
[ 6594.717839] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.717945] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e2c61128 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000056595f0c
[ 6594.718373] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62: queued zerolength write
[ 6594.718591] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62 wc->status 0
[ 6594.719033] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.719051] scsi host12: ib_srp: using immediate data
[ 6594.733959] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.736337] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.815441] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000b0fbfd9
[ 6594.815709] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.815815] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c8478a89 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000000b0fbfd9
[ 6594.816180] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64: queued zerolength write
[ 6594.816273] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.816291] scsi host12: ib_srp: using immediate data
[ 6594.816467] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64 wc->status 0
[ 6594.831294] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.833571] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6594.913573] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d05691ca
[ 6594.913841] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6594.913946] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000630fae64 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000d05691ca
[ 6594.914300] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66: queued zerolength write
[ 6594.914403] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6594.914420] scsi host12: ib_srp: using immediate data
[ 6594.914659] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66 wc->status 0
[ 6594.929303] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6594.931479] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6595.010803] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000103b9f75
[ 6595.011069] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6595.011176] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000d7bf18db name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000103b9f75
[ 6595.011988] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68: queued zerolength write
[ 6595.012081] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6595.012099] scsi host12: ib_srp: using immediate data
[ 6595.012256] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68 wc->status 0
[ 6595.026601] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6595.028752] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6595.107278] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000084adba6f
[ 6595.107544] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6595.107658] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b72a50f5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000084adba6f
[ 6595.108025] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70: queued zerolength write
[ 6595.108119] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6595.108136] scsi host12: ib_srp: using immediate data
[ 6595.108311] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70 wc->status 0
[ 6595.123008] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6595.125376] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6595.203762] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009d82e473
[ 6595.204030] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6595.204135] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b427c1de name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000009d82e473
[ 6595.204501] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72: queued zerolength write
[ 6595.204603] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6595.204620] scsi host12: ib_srp: using immediate data
[ 6595.204863] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72 wc->status 0
[ 6595.207321] scsi host12: SRP.T10:B62E99FFFEF9FA2E
[ 6595.278706] scsi 12:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6595.295793] scsi 12:0:0:0: alua: supports implicit and explicit TPGS
[ 6595.295880] scsi 12:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 6595.302159] sd 12:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6595.302704] sd 12:0:0:0: [sdc] Write Protect is off
[ 6595.302730] sd 12:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 6595.303250] sd 12:0:0:0: Attached scsi generic sg1 type 0
[ 6595.303786] sd 12:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6595.305324] sd 12:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 6595.305339] sd 12:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 6595.312392] scsi 12:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6595.326894] scsi 12:0:0:2: alua: supports implicit and explicit TPGS
[ 6595.326930] scsi 12:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 6595.330309] sd 12:0:0:2: Attached scsi generic sg2 type 0
[ 6595.336547] scsi 12:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6595.338778] sd 12:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6595.339372] sd 12:0:0:2: [sdd] Write Protect is off
[ 6595.339385] sd 12:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 6595.340686] sd 12:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 6595.343145] sd 12:0:0:0: [sdc] Attached SCSI disk
[ 6595.346742] sd 12:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 6595.346760] sd 12:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 6595.354630] scsi 12:0:0:1: alua: supports implicit and explicit TPGS
[ 6595.354668] scsi 12:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 6595.358835] sd 12:0:0:1: Attached scsi generic sg3 type 0
[ 6595.360543] ib_srp:srp_add_target: ib_srp: host12: SCSI scan succeeded - detected 3 LUNs
[ 6595.360555] scsi host12: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 6595.363066] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6595.363100] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6595.363171] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6595.363203] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6595.363267] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6595.363299] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6595.363319] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6595.366084] sd 12:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6595.366665] sd 12:0:0:1: [sde] Write Protect is off
[ 6595.366678] sd 12:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 6595.367836] sd 12:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6595.370666] sd 12:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 6595.370681] sd 12:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 6595.392064] sd 12:0:0:2: [sdd] Attached SCSI disk
[ 6595.428388] sd 12:0:0:1: [sde] Attached SCSI disk
[ 6595.430873] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6595.430907] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6595.430971] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6595.431006] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6595.431070] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6595.431102] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6595.431166] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6595.431198] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6595.431218] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6595.493876] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6595.493911] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6595.493975] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6595.494008] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6595.494071] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6595.494103] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6595.494168] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6595.494200] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6595.494267] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6595.494298] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6595.494319] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6595.575374] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6595.575410] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6595.575474] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6595.575506] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6595.575571] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6595.575603] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6595.575892] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6595.575924] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6595.575992] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6595.576026] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6595.576091] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6595.576123] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6595.576143] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6600.777273] scsi 12:0:0:0: alua: Detached
[ 6600.871329] sd 12:0:0:2: [sdd] Synchronizing SCSI cache
[ 6600.953209] scsi 12:0:0:2: alua: Detached
[ 6601.097323] scsi 12:0:0:1: alua: Detached
[ 6601.129325] srpt_recv_done: 3062 callbacks suppressed
[ 6601.129335] ib_srpt receiving failed for ioctx 00000000e6e1dcf4 with status 5
[ 6601.129364] ib_srpt receiving failed for ioctx 000000005a0e34a9 with status 5
[ 6601.129386] ib_srpt receiving failed for ioctx 00000000e22e4597 with status 5
[ 6601.129407] ib_srpt receiving failed for ioctx 00000000bfb111b0 with status 5
[ 6601.129427] ib_srpt receiving failed for ioctx 000000002689997b with status 5
[ 6601.129448] ib_srpt receiving failed for ioctx 0000000085ddbb3e with status 5
[ 6601.129469] ib_srpt receiving failed for ioctx 000000001c37a747 with status 5
[ 6601.129489] ib_srpt receiving failed for ioctx 00000000bcff77b3 with status 5
[ 6601.129510] ib_srpt receiving failed for ioctx 00000000fc663c96 with status 5
[ 6601.129531] ib_srpt receiving failed for ioctx 00000000626e70e1 with status 5
[ 6602.853030] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6602.853066] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6602.853086] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6602.900308] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6602.900373] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6602.900437] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6602.900469] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6602.900489] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6602.943058] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6602.943093] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6602.943156] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6602.943188] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6602.943251] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6602.943283] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6602.943303] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6602.996094] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6602.996127] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6602.996193] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6602.996242] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6602.996305] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6602.996337] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6602.996400] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6602.996431] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6602.996451] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6603.049320] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.049353] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.049415] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.049447] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.049510] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.049542] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.049609] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6603.049640] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6603.049703] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6603.049734] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6603.049753] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6603.095776] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.095809] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.095871] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.095906] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.095974] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.096008] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.096071] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6603.096103] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6603.096166] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6603.096197] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6603.096261] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6603.096293] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6603.096312] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6603.393960] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.393996] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.394016] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6603.444901] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.444942] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.445021] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.445061] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.445085] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6603.491221] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.491253] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.491317] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.491348] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.491412] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.491444] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.491464] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6603.539083] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.539116] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.539179] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.539210] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.539274] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.539309] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.539376] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6603.539411] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6603.539430] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6603.587186] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.587219] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.587283] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.587318] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.587392] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.587424] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.587488] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6603.587519] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6603.587583] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6603.587614] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6603.587634] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6603.632601] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.632634] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.632697] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6603.632728] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6603.632792] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6603.632824] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6603.632891] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6603.632922] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6603.632986] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6603.633018] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6603.633082] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6603.633114] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6603.633133] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6603.924165] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6603.924224] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6603.924258] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6603.924273] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6603.931338] scsi host12: ib_srp: REJ received
[ 6603.931348] scsi host12:   REJ reason 0xffffff98
[ 6603.931381] scsi host12: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 6604.079188] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6604.079221] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6604.079284] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6604.079316] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6604.079336] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6604.079344] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.090422] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.095069] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.201933] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000057bd8c02
[ 6604.202204] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.202381] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002d9eacb9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000057bd8c02
[ 6604.202734] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5: queued zerolength write
[ 6604.202832] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.202851] scsi host12: ib_srp: using immediate data
[ 6604.203140] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5 wc->status 0
[ 6604.211258] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.215795] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.321012] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ef6b178d
[ 6604.321280] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.321441] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000f234f11b name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ef6b178d
[ 6604.321833] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13: queued zerolength write
[ 6604.321991] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.322009] scsi host12: ib_srp: using immediate data
[ 6604.322199] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13 wc->status 0
[ 6604.331463] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.334823] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.429687] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ac79adbf
[ 6604.429955] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.430063] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000046cf8285 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ac79adbf
[ 6604.430424] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21: queued zerolength write
[ 6604.430449] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.430457] scsi host12: ib_srp: using immediate data
[ 6604.430649] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21 wc->status 0
[ 6604.443372] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.445782] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.525043] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000cb8ddf10
[ 6604.525343] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.525452] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000eb9b7cd4 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000cb8ddf10
[ 6604.525818] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29: queued zerolength write
[ 6604.525843] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.525851] scsi host12: ib_srp: using immediate data
[ 6604.526211] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29 wc->status 0
[ 6604.538708] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.540904] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.620632] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c26edb60
[ 6604.620903] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.621009] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006ff76eac name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c26edb60
[ 6604.621413] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37: queued zerolength write
[ 6604.621535] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.621543] scsi host12: ib_srp: using immediate data
[ 6604.621773] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37 wc->status 0
[ 6604.634461] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.636641] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.716432] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000003b4a938e
[ 6604.716702] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.716809] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e09d80f8 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000003b4a938e
[ 6604.717191] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45: queued zerolength write
[ 6604.717288] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.717343] scsi host12: ib_srp: using immediate data
[ 6604.717456] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45 wc->status 0
[ 6604.731981] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.736676] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.843154] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000008083dab0
[ 6604.843418] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.843523] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003180d238 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000008083dab0
[ 6604.843898] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51: queued zerolength write
[ 6604.843994] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.844012] scsi host12: ib_srp: using immediate data
[ 6604.844123] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51 wc->status 0
[ 6604.857889] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.861013] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6604.960292] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000013f9ed96
[ 6604.960568] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6604.960675] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c61d2dad name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000013f9ed96
[ 6604.961041] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55: queued zerolength write
[ 6604.961066] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6604.961073] scsi host12: ib_srp: using immediate data
[ 6604.961544] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55 wc->status 0
[ 6604.973983] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6604.976167] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.055535] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000fe14829f
[ 6605.055804] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.055910] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000347e2e6f name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000fe14829f
[ 6605.056276] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59: queued zerolength write
[ 6605.056369] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.056387] scsi host12: ib_srp: using immediate data
[ 6605.056675] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59 wc->status 0
[ 6605.071386] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.075773] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.181378] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a67236ab
[ 6605.181645] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.181751] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e199acbf name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a67236ab
[ 6605.182118] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63: queued zerolength write
[ 6605.182142] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.182149] scsi host12: ib_srp: using immediate data
[ 6605.182348] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63 wc->status 0
[ 6605.194936] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.197146] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.275399] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c23078ca
[ 6605.275663] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.275769] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002ed2ba8e name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c23078ca
[ 6605.276132] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67: queued zerolength write
[ 6605.276157] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.276164] scsi host12: ib_srp: using immediate data
[ 6605.276498] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67 wc->status 0
[ 6605.289028] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.291261] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.368770] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c9ed7d2c
[ 6605.369039] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.369144] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000036c1e036 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c9ed7d2c
[ 6605.369565] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71: queued zerolength write
[ 6605.369664] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.369682] scsi host12: ib_srp: using immediate data
[ 6605.370035] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71 wc->status 0
[ 6605.383706] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.387939] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.487137] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000014ce1fe6
[ 6605.487406] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.487513] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002d8060ff name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000014ce1fe6
[ 6605.487884] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74: queued zerolength write
[ 6605.487981] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.487998] scsi host12: ib_srp: using immediate data
[ 6605.488162] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74 wc->status 0
[ 6605.501682] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.504994] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.606340] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000001ded296f
[ 6605.606604] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.606709] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000047f0c0bf name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000001ded296f
[ 6605.607075] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76: queued zerolength write
[ 6605.607167] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.607184] scsi host12: ib_srp: using immediate data
[ 6605.607332] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76 wc->status 0
[ 6605.620319] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.624389] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.724157] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000081a7a6df
[ 6605.724425] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.724533] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000caf2180a name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000081a7a6df
[ 6605.724903] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78: queued zerolength write
[ 6605.725001] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.725018] scsi host12: ib_srp: using immediate data
[ 6605.725151] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78 wc->status 0
[ 6605.739320] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.743040] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.847382] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006f9e68c0
[ 6605.847648] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.847804] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000062cd9251 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006f9e68c0
[ 6605.848169] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80: queued zerolength write
[ 6605.848265] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.848283] scsi host12: ib_srp: using immediate data
[ 6605.848405] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80 wc->status 0
[ 6605.863742] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.866149] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6605.944266] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000034cefa1f
[ 6605.944533] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6605.944638] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b820769a name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000034cefa1f
[ 6605.945004] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82: queued zerolength write
[ 6605.945092] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6605.945109] scsi host12: ib_srp: using immediate data
[ 6605.945261] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82 wc->status 0
[ 6605.958567] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6605.960915] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.038123] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000004449abeb
[ 6606.038387] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.038492] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a0256905 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000004449abeb
[ 6606.038866] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84: queued zerolength write
[ 6606.038959] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.038976] scsi host12: ib_srp: using immediate data
[ 6606.039228] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84 wc->status 0
[ 6606.052685] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.055254] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.132407] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ac60d27e
[ 6606.132674] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.132786] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fe3790e5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ac60d27e
[ 6606.133153] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86: queued zerolength write
[ 6606.133236] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.133253] scsi host12: ib_srp: using immediate data
[ 6606.133403] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86 wc->status 0
[ 6606.147214] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.149429] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.226520] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000047c809c5
[ 6606.226786] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.226891] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e749d651 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000047c809c5
[ 6606.227253] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88: queued zerolength write
[ 6606.227334] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.227351] scsi host12: ib_srp: using immediate data
[ 6606.227510] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88 wc->status 0
[ 6606.241345] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.243623] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.321053] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000006c56c76
[ 6606.321317] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.321459] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000004f31265e name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000006c56c76
[ 6606.321859] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90: queued zerolength write
[ 6606.321942] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.321959] scsi host12: ib_srp: using immediate data
[ 6606.322394] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90 wc->status 0
[ 6606.335411] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.337652] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.414447] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000e502071c
[ 6606.414714] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.414825] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000ffdf32f9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000e502071c
[ 6606.415221] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92: queued zerolength write
[ 6606.415303] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.415319] scsi host12: ib_srp: using immediate data
[ 6606.415448] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92 wc->status 0
[ 6606.428992] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.431421] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.509756] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a66c5835
[ 6606.510023] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.510134] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000093871c7b name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a66c5835
[ 6606.510515] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94: queued zerolength write
[ 6606.510590] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.510607] scsi host12: ib_srp: using immediate data
[ 6606.510722] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94 wc->status 0
[ 6606.523684] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6606.528171] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6606.630957] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000015009791
[ 6606.631227] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6606.631337] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000cdcb99de name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000015009791
[ 6606.631695] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96: queued zerolength write
[ 6606.631724] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6606.631732] scsi host12: ib_srp: using immediate data
[ 6606.633236] scsi host12: SRP.T10:B62E99FFFEF9FA2E
[ 6606.633640] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96 wc->status 0
[ 6606.700605] scsi 12:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6606.715761] scsi 12:0:0:0: alua: supports implicit and explicit TPGS
[ 6606.715873] scsi 12:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 6606.722425] sd 12:0:0:0: Attached scsi generic sg1 type 0
[ 6606.724345] sd 12:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6606.726389] sd 12:0:0:0: [sdc] Write Protect is off
[ 6606.726431] sd 12:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 6606.729789] sd 12:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6606.734561] sd 12:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 6606.734619] sd 12:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 6606.737784] scsi 12:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6606.755205] scsi 12:0:0:2: alua: supports implicit and explicit TPGS
[ 6606.755279] scsi 12:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 6606.760529] sd 12:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6606.761066] sd 12:0:0:2: [sdd] Write Protect is off
[ 6606.761079] sd 12:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 6606.761322] sd 12:0:0:2: Attached scsi generic sg2 type 0
[ 6606.762386] sd 12:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 6606.764448] sd 12:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 6606.764462] sd 12:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 6606.769267] scsi 12:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6606.785831] scsi 12:0:0:1: alua: supports implicit and explicit TPGS
[ 6606.785929] scsi 12:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 6606.793735] sd 12:0:0:1: Attached scsi generic sg3 type 0
[ 6606.795167] ib_srp:srp_add_target: ib_srp: host12: SCSI scan succeeded - detected 3 LUNs
[ 6606.795185] scsi host12: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 6606.795314] sd 12:0:0:0: [sdc] Attached SCSI disk
[ 6606.796576] sd 12:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6606.797912] sd 12:0:0:1: [sde] Write Protect is off
[ 6606.797936] sd 12:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 6606.798115] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6606.798148] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6606.798215] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6606.798247] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6606.798312] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6606.798344] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6606.798364] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6606.799850] sd 12:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6606.801287] sd 12:0:0:2: [sdd] Attached SCSI disk
[ 6606.802973] sd 12:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 6606.802987] sd 12:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 6606.835823] sd 12:0:0:1: [sde] Attached SCSI disk
[ 6606.871252] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6606.871287] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6606.871351] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6606.871384] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6606.871448] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6606.871480] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6606.871561] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6606.871599] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6606.871619] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6606.971806] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6606.971843] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6606.971907] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6606.971939] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6606.972007] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6606.972039] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6606.972103] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6606.972135] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6606.972200] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6606.972231] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6606.972253] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6607.042856] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6607.042891] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6607.042955] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6607.042987] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6607.043051] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6607.043083] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6607.043148] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6607.043179] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6607.043243] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6607.043275] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6607.043340] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6607.043372] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6607.043392] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6612.281642] scsi 12:0:0:0: alua: Detached
[ 6612.311324] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.311385] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.311420] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6612.374512] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.374577] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.374703] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.374762] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.374798] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6612.379044] sd 12:0:0:2: [sdd] Synchronizing SCSI cache
[ 6612.410338] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.410387] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.410493] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.410552] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.410676] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.410734] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.410768] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6612.446646] scsi 12:0:0:2: alua: Detached
[ 6612.460316] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.460375] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.460786] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.460845] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.460960] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.461018] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.461140] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6612.461196] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6612.461226] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6612.516019] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.516079] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.516194] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.516251] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.516363] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.516419] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.516531] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6612.516586] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6612.516710] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6612.516765] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6612.516800] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6612.556785] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.556855] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.556982] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.557047] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.557191] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.557255] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.557383] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6612.557446] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6612.557653] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6612.557735] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6612.557965] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6612.558090] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6612.558159] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6612.577656] scsi 12:0:0:1: alua: Detached
[ 6612.594747] srpt_recv_done: 2978 callbacks suppressed
[ 6612.594755] ib_srpt receiving failed for ioctx 00000000e5878cbe with status 5
[ 6612.594773] ib_srpt receiving failed for ioctx 00000000a73848bf with status 5
[ 6612.594785] ib_srpt receiving failed for ioctx 000000008b53446e with status 5
[ 6612.594797] ib_srpt receiving failed for ioctx 0000000027636827 with status 5
[ 6612.594809] ib_srpt receiving failed for ioctx 000000001a15e74d with status 5
[ 6612.594821] ib_srpt receiving failed for ioctx 00000000a31139e9 with status 5
[ 6612.594833] ib_srpt receiving failed for ioctx 00000000d6bd0134 with status 5
[ 6612.594845] ib_srpt receiving failed for ioctx 00000000795be3d3 with status 5
[ 6612.594857] ib_srpt receiving failed for ioctx 000000004485c0ff with status 5
[ 6612.594869] ib_srpt receiving failed for ioctx 000000006bcee816 with status 5
[ 6612.867439] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.867500] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.867535] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6612.902895] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.902955] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.903072] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.903130] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.903166] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6612.950454] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.950513] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.950639] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.950697] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.950815] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.950874] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.950910] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6612.998922] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6612.998982] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6612.999109] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6612.999167] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6612.999285] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6612.999343] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6612.999460] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6612.999519] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6612.999555] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6613.050462] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.050522] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.050625] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.050686] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.050798] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.050856] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.050978] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6613.051036] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6613.051153] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6613.051210] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6613.051246] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6613.098471] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.098506] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.098571] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.098603] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.098667] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.098702] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.098769] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6613.098805] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6613.098868] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6613.098900] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6613.098964] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6613.098996] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6613.099016] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6613.408259] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.408295] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.408315] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6613.458473] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.458508] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.458572] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.458604] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.458623] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6613.511992] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.512050] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.512161] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.512217] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.512328] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.512384] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.512418] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6613.580039] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.580072] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.580142] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.580174] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.580247] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.580279] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.580352] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6613.580383] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6613.580403] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6613.633142] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.633175] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.633238] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.633270] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.633334] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.633365] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.633436] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6613.633470] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6613.633537] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6613.633568] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6613.633588] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6613.692919] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6613.692994] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6613.693121] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6613.693185] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6613.693312] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6613.693378] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6613.694342] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6613.694462] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6613.694630] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6613.694746] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6613.694984] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6613.695079] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6613.695121] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6614.000795] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.000831] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.000851] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6614.050021] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.050057] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.050121] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.050153] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.050174] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6614.097412] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.097446] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.097509] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.097541] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.097605] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.097685] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.097707] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6614.139255] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.139327] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.139454] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.139517] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.139651] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.139714] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.139797] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.139828] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.139848] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6614.185868] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.185902] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.185965] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.185997] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.186068] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.186110] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.186183] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.186214] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.186278] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6614.186309] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6614.186329] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6614.245918] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.245954] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.246020] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.246052] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.246116] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.246148] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.246218] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.246257] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.246321] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6614.246352] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6614.246417] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6614.246452] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6614.246472] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6614.540676] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.540711] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.540731] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6614.586220] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.586255] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.586326] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.586358] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.586378] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6614.646718] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.646754] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.646818] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.646858] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.646921] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.646959] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.646979] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6614.693468] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.693501] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.693563] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.693599] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.693711] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.693747] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.693820] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.693876] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.693911] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6614.746183] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.746216] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.746279] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.746317] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.746384] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.746416] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.746479] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.746511] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.746574] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6614.746605] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6614.746624] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6614.793911] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6614.793955] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6614.794021] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6614.794065] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6614.794134] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6614.794166] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6614.794229] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6614.794261] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6614.794324] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6614.794358] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6614.794422] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6614.794454] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6614.794473] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6615.099462] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6615.099498] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6615.099518] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6615.146895] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6615.146930] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6615.146993] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6615.147028] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6615.147048] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6615.194734] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6615.194776] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6615.194839] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6615.194871] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6615.194946] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6615.194981] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6615.195001] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 6615.195009] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.201284] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.207089] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.314085] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009b87179f
[ 6615.314353] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.314563] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a05e52aa name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000009b87179f
[ 6615.314920] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-9: queued zerolength write
[ 6615.314948] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.314956] scsi host13: ib_srp: using immediate data
[ 6615.315160] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-9 wc->status 0
[ 6615.322454] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.324697] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.403933] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000008e41946e
[ 6615.404206] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.404321] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007674327d name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000008e41946e
[ 6615.404676] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-25: queued zerolength write
[ 6615.404796] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.404815] scsi host13: ib_srp: using immediate data
[ 6615.405107] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-25 wc->status 0
[ 6615.413389] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.418020] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.498350] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000079221d52
[ 6615.498622] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.498729] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000087b37dc2 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=0000000079221d52
[ 6615.499077] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-41: queued zerolength write
[ 6615.499146] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.499162] scsi host13: ib_srp: using immediate data
[ 6615.499301] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-41 wc->status 0
[ 6615.506338] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.508627] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.587375] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000005954260d
[ 6615.587644] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.587756] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000d1bcf2c name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000005954260d
[ 6615.588125] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-53: queued zerolength write
[ 6615.588149] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.588157] scsi host13: ib_srp: using immediate data
[ 6615.588522] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-53 wc->status 0
[ 6615.595165] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.597609] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.674714] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a1a77905
[ 6615.674984] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.675096] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005b7078ce name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000a1a77905
[ 6615.675452] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-61: queued zerolength write
[ 6615.675477] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.675484] scsi host13: ib_srp: using immediate data
[ 6615.675791] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-61 wc->status 0
[ 6615.682496] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.684817] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.763443] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000028cacec6
[ 6615.763713] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.763819] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000312d2b92 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=0000000028cacec6
[ 6615.764187] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-69: queued zerolength write
[ 6615.764211] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.764219] scsi host13: ib_srp: using immediate data
[ 6615.764391] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-69 wc->status 0
[ 6615.771533] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.773858] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.851839] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000019610fe0
[ 6615.852107] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.852214] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009216654c name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=0000000019610fe0
[ 6615.852578] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-75: queued zerolength write
[ 6615.852602] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.852609] scsi host13: ib_srp: using immediate data
[ 6615.852821] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-75 wc->status 0
[ 6615.859651] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.861929] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6615.939220] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002fa2e19d
[ 6615.939489] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6615.939603] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000019c12a8d name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000002fa2e19d
[ 6615.939971] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-79: queued zerolength write
[ 6615.939995] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6615.940003] scsi host13: ib_srp: using immediate data
[ 6615.940342] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-79 wc->status 0
[ 6615.947019] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6615.949344] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.027696] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f171a1c1
[ 6616.027966] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.028075] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000073d29af1 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000f171a1c1
[ 6616.028448] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-83: queued zerolength write
[ 6616.028540] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.028558] scsi host13: ib_srp: using immediate data
[ 6616.028837] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-83 wc->status 0
[ 6616.037031] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.041626] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.126446] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000e90a698
[ 6616.126713] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.126817] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002609b38b name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000000e90a698
[ 6616.127182] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-87: queued zerolength write
[ 6616.127207] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.127215] scsi host13: ib_srp: using immediate data
[ 6616.127385] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-87 wc->status 0
[ 6616.134154] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.136442] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.213333] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009b226137
[ 6616.213600] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.213741] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000ba4126da name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000009b226137
[ 6616.214132] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-91: queued zerolength write
[ 6616.214332] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.214344] scsi host13: ib_srp: using immediate data
[ 6616.214579] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-91 wc->status 0
[ 6616.221477] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.223954] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.300561] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000084e936d9
[ 6616.300830] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.300942] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000336f982e name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=0000000084e936d9
[ 6616.301310] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-95: queued zerolength write
[ 6616.301399] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.301417] scsi host13: ib_srp: using immediate data
[ 6616.301690] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-95 wc->status 0
[ 6616.309801] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.314262] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.419019] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000bb1d8f3f
[ 6616.419286] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.419394] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000047370e6f name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000bb1d8f3f
[ 6616.419762] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-98: queued zerolength write
[ 6616.419860] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.419877] scsi host13: ib_srp: using immediate data
[ 6616.419974] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-98 wc->status 0
[ 6616.427042] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.429393] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.506044] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002b29551f
[ 6616.506311] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.506419] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009d7ad980 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000002b29551f
[ 6616.506787] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-100: queued zerolength write
[ 6616.506815] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.506826] scsi host13: ib_srp: using immediate data
[ 6616.507114] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-100 wc->status 0
[ 6616.514028] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.516244] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.592813] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b58d1b4d
[ 6616.593078] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.593183] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000098216f6c name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000b58d1b4d
[ 6616.593558] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-102: queued zerolength write
[ 6616.593583] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.593590] scsi host13: ib_srp: using immediate data
[ 6616.593953] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-102 wc->status 0
[ 6616.600711] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.602913] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.679214] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000633bd8b
[ 6616.679478] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.679586] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006e27f2f4 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000000633bd8b
[ 6616.679950] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-104: queued zerolength write
[ 6616.680051] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.680069] scsi host13: ib_srp: using immediate data
[ 6616.680147] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-104 wc->status 0
[ 6616.688304] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.692739] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.798675] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000007005bcea
[ 6616.798943] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.799051] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000cd3dd32 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000007005bcea
[ 6616.799419] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-106: queued zerolength write
[ 6616.799449] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.799457] scsi host13: ib_srp: using immediate data
[ 6616.799658] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-106 wc->status 0
[ 6616.806400] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.808660] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.886301] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000009d950d0
[ 6616.886572] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.886677] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000d5e3a042 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=0000000009d950d0
[ 6616.887043] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-108: queued zerolength write
[ 6616.887137] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.887154] scsi host13: ib_srp: using immediate data
[ 6616.889986] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-108 wc->status 0
[ 6616.894674] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.896897] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6616.978456] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000004820b826
[ 6616.978723] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6616.978834] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000f71b7bb1 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000004820b826
[ 6616.979189] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-110: queued zerolength write
[ 6616.979213] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6616.979220] scsi host13: ib_srp: using immediate data
[ 6616.979402] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-110 wc->status 0
[ 6616.986196] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6616.988375] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6617.064632] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000cd2a4c14
[ 6617.064899] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6617.065005] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000aa62b97d name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000cd2a4c14
[ 6617.065376] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-112: queued zerolength write
[ 6617.065463] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6617.065480] scsi host13: ib_srp: using immediate data
[ 6617.065597] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-112 wc->status 0
[ 6617.072825] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6617.077448] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6617.182620] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000af558e31
[ 6617.182890] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6617.183006] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000e77d0c9 name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000af558e31
[ 6617.183373] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-114: queued zerolength write
[ 6617.183398] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6617.183405] scsi host13: ib_srp: using immediate data
[ 6617.186577] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-114 wc->status 0
[ 6617.190367] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6617.192584] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6617.269383] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f1aee9df
[ 6617.269652] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6617.269803] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000901e8e9b name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000f1aee9df
[ 6617.270168] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-116: queued zerolength write
[ 6617.270271] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6617.270288] scsi host13: ib_srp: using immediate data
[ 6617.270380] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-116 wc->status 0
[ 6617.277529] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6617.281694] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6617.364268] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000005ce8810a
[ 6617.364532] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6617.364639] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000073ba864e name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=000000005ce8810a
[ 6617.365010] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-118: queued zerolength write
[ 6617.365101] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6617.365126] scsi host13: ib_srp: using immediate data
[ 6617.365212] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-118 wc->status 0
[ 6617.372658] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 6617.374917] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 6617.450905] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ad617f1f
[ 6617.451177] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:65c5:47f5:fc42:655d or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 6617.451283] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000012fad3a name=2603:8081:1405:679b:65c5:47f5:fc42:655d ch=00000000ad617f1f
[ 6617.451719] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-120: queued zerolength write
[ 6617.451756] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 6617.451765] scsi host13: ib_srp: using immediate data
[ 6617.452104] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:65c5:47f5:fc42:655d-120 wc->status 0
[ 6617.453231] scsi host13: SRP.T10:B62E99FFFEF9FA2E
[ 6617.490530] scsi 13:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6617.506432] scsi 13:0:0:0: alua: supports implicit and explicit TPGS
[ 6617.506507] scsi 13:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 6617.513794] sd 13:0:0:0: Attached scsi generic sg1 type 0
[ 6617.514280] sd 13:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6617.515157] sd 13:0:0:0: [sdc] Write Protect is off
[ 6617.515186] sd 13:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 6617.516978] sd 13:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6617.519137] sd 13:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 6617.519176] sd 13:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 6617.520897] scsi 13:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6617.533602] scsi 13:0:0:2: alua: supports implicit and explicit TPGS
[ 6617.533670] scsi 13:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 6617.543853] sd 13:0:0:2: Attached scsi generic sg2 type 0
[ 6617.544727] sd 13:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6617.545310] sd 13:0:0:2: [sdd] Write Protect is off
[ 6617.545331] sd 13:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 6617.546739] sd 13:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 6617.550333] sd 13:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 6617.550384] sd 13:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 6617.553913] scsi 13:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 6617.564361] scsi 13:0:0:1: alua: supports implicit and explicit TPGS
[ 6617.564399] scsi 13:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 6617.569320] sd 13:0:0:1: Attached scsi generic sg3 type 0
[ 6617.569876] sd 13:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 6617.570099] ib_srp:srp_add_target: ib_srp: host13: SCSI scan succeeded - detected 3 LUNs
[ 6617.570110] scsi host13: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6617.570561] sd 13:0:0:1: [sde] Write Protect is off
[ 6617.570574] sd 13:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 6617.571915] sd 13:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 6617.572448] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6617.572481] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6617.572548] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6617.572763] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6617.572827] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6617.572859] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6617.572923] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6617.572955] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6617.572975] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6617.574385] sd 13:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 6617.574399] sd 13:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 6617.589694] sd 13:0:0:0: [sdc] Attached SCSI disk
[ 6617.610703] sd 13:0:0:2: [sdd] Attached SCSI disk
[ 6617.618888] sd 13:0:0:1: [sde] Attached SCSI disk
[ 6617.628021] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6617.628081] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6617.628195] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6617.628248] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6617.628364] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6617.628423] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6617.628542] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6617.628601] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6617.628720] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6617.628777] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6617.628812] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6617.708905] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6617.708965] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6617.709093] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6617.709152] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6617.709269] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6617.709326] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6617.709446] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6617.709506] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6617.709623] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6617.709682] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6617.709877] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6617.709936] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6617.709973] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6617.911742] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6617.911778] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6617.911799] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 6617.980338] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6617.980400] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6617.980537] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6617.980597] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6617.980634] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 6618.042551] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6618.042585] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6618.042648] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6618.042681] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6618.042745] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6618.042777] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6618.042797] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:65c5:47f5:fc42:655d
[ 6618.091639] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6618.091672] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6618.091736] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6618.091768] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6618.091831] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6618.091863] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6618.091931] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6618.091970] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6618.091989] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 6618.136288] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6618.136321] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6618.136384] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6618.136416] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6618.136488] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6618.136519] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6618.136583] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6618.136615] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6618.136679] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6618.136710] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6618.136730] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 6618.191573] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 6618.191606] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 6618.191670] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 6618.191702] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 6618.191767] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d] -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:0/11010381%0
[ 6618.191798] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555 -> [2603:8081:1405:679b:65c5:47f5:fc42:655d]:5555/11010381%0
[ 6618.191863] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 6618.191895] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 6618.191959] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 6618.191998] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 6618.192066] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 6618.192099] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 6618.192118] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 6692.571317] sched: RT throttling activated
[ 6771.189527] INFO: task kworker/11:0:85 blocked for more than 120 seconds.
[ 6771.189678]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.189695] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.189720] task:kworker/11:0    state:D stack:0     pid:85    ppid:2      flags:0x00004000
[ 6771.189740] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.189763] Call Trace:
[ 6771.189771]  <TASK>
[ 6771.189786]  ? __schedule+0x996/0x2c80
[ 6771.189804]  __schedule+0x9f6/0x2c80
[ 6771.189868]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.189889]  ? mark_held_locks+0x71/0xa0
[ 6771.189921]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.189938]  ? trace_hardirqs_on+0x22/0x100
[ 6771.189962]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.189993]  schedule+0x96/0x150
[ 6771.190015]  bit_wait+0x1c/0xa0
[ 6771.190035]  __wait_on_bit+0x42/0x110
[ 6771.190047]  ? bit_wait_io+0xa0/0xa0
[ 6771.190084]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.190104]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.190133]  ? swake_up_one+0xb0/0xb0
[ 6771.190152]  ? __lock_acquire+0x1139/0x3d30
[ 6771.190195]  writeback_single_inode+0xb8/0x250
[ 6771.190224]  sync_inode_metadata+0xa2/0xe0
[ 6771.190240]  ? write_inode_now+0x160/0x160
[ 6771.190251]  ? __lock_acquire+0x1139/0x3d30
[ 6771.190309]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.190335]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.190366]  ext4_sync_file+0x3b3/0x620
[ 6771.190403]  vfs_fsync_range+0x69/0x110
[ 6771.190415]  ? ext4_getfsmap+0x520/0x520
[ 6771.190446]  iomap_dio_complete+0x35c/0x3a0
[ 6771.190459]  ? process_one_work+0x3d8/0x950
[ 6771.190473]  ? process_one_work+0x3d3/0x950
[ 6771.190499]  ? aio_fsync_work+0x190/0x190
[ 6771.190519]  iomap_dio_complete_work+0x36/0x50
[ 6771.190542]  process_one_work+0x46c/0x950
[ 6771.190555]  ? worker_thread+0xd6/0x680
[ 6771.190600]  ? kick_pool+0x200/0x200
[ 6771.190646]  ? assign_work+0xe8/0x120
[ 6771.190675]  worker_thread+0x37e/0x680
[ 6771.190726]  ? create_worker+0x400/0x400
[ 6771.190743]  kthread+0x1b0/0x1f0
[ 6771.190758]  ? kthread+0x103/0x1f0
[ 6771.190773]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.190795]  ret_from_fork+0x40/0x70
[ 6771.190807]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.190828]  ret_from_fork_asm+0x11/0x20
[ 6771.190892]  </TASK>
[ 6771.190919] INFO: task kworker/4:1:166 blocked for more than 120 seconds.
[ 6771.190935]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.190951] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.190965] task:kworker/4:1     state:D stack:0     pid:166   ppid:2      flags:0x00004000
[ 6771.190983] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.191001] Call Trace:
[ 6771.191009]  <TASK>
[ 6771.191024]  ? __schedule+0x996/0x2c80
[ 6771.191041]  __schedule+0x9f6/0x2c80
[ 6771.191101]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.191118]  ? mark_held_locks+0x71/0xa0
[ 6771.191146]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.191160]  ? trace_hardirqs_on+0x22/0x100
[ 6771.191180]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.191207]  schedule+0x96/0x150
[ 6771.191227]  bit_wait+0x1c/0xa0
[ 6771.191246]  __wait_on_bit+0x42/0x110
[ 6771.191257]  ? bit_wait_io+0xa0/0xa0
[ 6771.191291]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.191310]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.191338]  ? swake_up_one+0xb0/0xb0
[ 6771.191356]  ? __lock_acquire+0x1139/0x3d30
[ 6771.191399]  writeback_single_inode+0xb8/0x250
[ 6771.191428]  sync_inode_metadata+0xa2/0xe0
[ 6771.191444]  ? write_inode_now+0x160/0x160
[ 6771.191455]  ? __lock_acquire+0x1139/0x3d30
[ 6771.191513]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.191539]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.191569]  ext4_sync_file+0x3b3/0x620
[ 6771.191605]  vfs_fsync_range+0x69/0x110
[ 6771.191617]  ? ext4_getfsmap+0x520/0x520
[ 6771.191648]  iomap_dio_complete+0x35c/0x3a0
[ 6771.191661]  ? process_one_work+0x3d8/0x950
[ 6771.191674]  ? process_one_work+0x3d3/0x950
[ 6771.191699]  ? aio_fsync_work+0x190/0x190
[ 6771.191719]  iomap_dio_complete_work+0x36/0x50
[ 6771.191742]  process_one_work+0x46c/0x950
[ 6771.191755]  ? worker_thread+0xd6/0x680
[ 6771.191800]  ? kick_pool+0x200/0x200
[ 6771.191850]  ? assign_work+0xe8/0x120
[ 6771.191880]  worker_thread+0x37e/0x680
[ 6771.191929]  ? create_worker+0x400/0x400
[ 6771.191947]  kthread+0x1b0/0x1f0
[ 6771.191962]  ? kthread+0x103/0x1f0
[ 6771.191976]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.191999]  ret_from_fork+0x40/0x70
[ 6771.192010]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.192030]  ret_from_fork_asm+0x11/0x20
[ 6771.192095]  </TASK>
[ 6771.192104] INFO: task kworker/6:1:168 blocked for more than 120 seconds.
[ 6771.192119]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.192134] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.192148] task:kworker/6:1     state:D stack:0     pid:168   ppid:2      flags:0x00004000
[ 6771.192167] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.192185] Call Trace:
[ 6771.192192]  <TASK>
[ 6771.192207]  ? __schedule+0x996/0x2c80
[ 6771.192225]  __schedule+0x9f6/0x2c80
[ 6771.192289]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.192309]  ? mark_held_locks+0x71/0xa0
[ 6771.192340]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.192356]  ? trace_hardirqs_on+0x22/0x100
[ 6771.192379]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.192411]  schedule+0x96/0x150
[ 6771.192433]  bit_wait+0x1c/0xa0
[ 6771.192452]  __wait_on_bit+0x42/0x110
[ 6771.192465]  ? bit_wait_io+0xa0/0xa0
[ 6771.192502]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.192521]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.192550]  ? swake_up_one+0xb0/0xb0
[ 6771.192569]  ? __lock_acquire+0x1139/0x3d30
[ 6771.192612]  writeback_single_inode+0xb8/0x250
[ 6771.192641]  sync_inode_metadata+0xa2/0xe0
[ 6771.192657]  ? write_inode_now+0x160/0x160
[ 6771.192668]  ? __lock_acquire+0x1139/0x3d30
[ 6771.192726]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.192753]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.192783]  ext4_sync_file+0x3b3/0x620
[ 6771.192820]  vfs_fsync_range+0x69/0x110
[ 6771.192832]  ? ext4_getfsmap+0x520/0x520
[ 6771.192862]  iomap_dio_complete+0x35c/0x3a0
[ 6771.192875]  ? process_one_work+0x3d8/0x950
[ 6771.192888]  ? process_one_work+0x3d3/0x950
[ 6771.192913]  ? aio_fsync_work+0x190/0x190
[ 6771.192933]  iomap_dio_complete_work+0x36/0x50
[ 6771.192955]  process_one_work+0x46c/0x950
[ 6771.192968]  ? worker_thread+0xd6/0x680
[ 6771.193013]  ? kick_pool+0x200/0x200
[ 6771.193063]  ? assign_work+0xe8/0x120
[ 6771.193093]  worker_thread+0x37e/0x680
[ 6771.193106]  ? trace_hardirqs_on+0x22/0x100
[ 6771.193154]  ? create_worker+0x400/0x400
[ 6771.193171]  kthread+0x1b0/0x1f0
[ 6771.193186]  ? kthread+0x103/0x1f0
[ 6771.193200]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.193223]  ret_from_fork+0x40/0x70
[ 6771.193234]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.193255]  ret_from_fork_asm+0x11/0x20
[ 6771.193320]  </TASK>
[ 6771.193330] INFO: task kworker/17:1:179 blocked for more than 120 seconds.
[ 6771.193346]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.193360] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.193375] task:kworker/17:1    state:D stack:0     pid:179   ppid:2      flags:0x00004000
[ 6771.193392] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.193410] Call Trace:
[ 6771.193418]  <TASK>
[ 6771.193433]  ? __schedule+0x996/0x2c80
[ 6771.193451]  __schedule+0x9f6/0x2c80
[ 6771.193565]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.193587]  ? mark_held_locks+0x71/0xa0
[ 6771.193618]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.193635]  ? trace_hardirqs_on+0x22/0x100
[ 6771.193659]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.193690]  schedule+0x96/0x150
[ 6771.193712]  bit_wait+0x1c/0xa0
[ 6771.193732]  __wait_on_bit+0x42/0x110
[ 6771.193745]  ? bit_wait_io+0xa0/0xa0
[ 6771.193782]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.193802]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.193831]  ? swake_up_one+0xb0/0xb0
[ 6771.193850]  ? __lock_acquire+0x1139/0x3d30
[ 6771.193894]  writeback_single_inode+0xb8/0x250
[ 6771.193923]  sync_inode_metadata+0xa2/0xe0
[ 6771.193939]  ? write_inode_now+0x160/0x160
[ 6771.193950]  ? __lock_acquire+0x1139/0x3d30
[ 6771.194008]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.194035]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.194065]  ext4_sync_file+0x3b3/0x620
[ 6771.194102]  vfs_fsync_range+0x69/0x110
[ 6771.194114]  ? ext4_getfsmap+0x520/0x520
[ 6771.194145]  iomap_dio_complete+0x35c/0x3a0
[ 6771.194159]  ? process_one_work+0x3d8/0x950
[ 6771.194172]  ? process_one_work+0x3d3/0x950
[ 6771.194198]  ? aio_fsync_work+0x190/0x190
[ 6771.194218]  iomap_dio_complete_work+0x36/0x50
[ 6771.194241]  process_one_work+0x46c/0x950
[ 6771.194254]  ? worker_thread+0xd6/0x680
[ 6771.194299]  ? kick_pool+0x200/0x200
[ 6771.194349]  ? assign_work+0xe8/0x120
[ 6771.194379]  worker_thread+0x37e/0x680
[ 6771.194392]  ? trace_hardirqs_on+0x22/0x100
[ 6771.194441]  ? create_worker+0x400/0x400
[ 6771.194459]  kthread+0x1b0/0x1f0
[ 6771.194474]  ? kthread+0x103/0x1f0
[ 6771.194489]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.194512]  ret_from_fork+0x40/0x70
[ 6771.194524]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.194545]  ret_from_fork_asm+0x11/0x20
[ 6771.194611]  </TASK>
[ 6771.194656] INFO: task kworker/7:2:615 blocked for more than 120 seconds.
[ 6771.194673]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.194687] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.194717] task:kworker/7:2     state:D stack:0     pid:615   ppid:2      flags:0x00004000
[ 6771.194735] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.194753] Call Trace:
[ 6771.194761]  <TASK>
[ 6771.194777]  ? __schedule+0x996/0x2c80
[ 6771.194794]  __schedule+0x9f6/0x2c80
[ 6771.194859]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.194879]  ? mark_held_locks+0x71/0xa0
[ 6771.194911]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.194927]  ? trace_hardirqs_on+0x22/0x100
[ 6771.194951]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.194982]  schedule+0x96/0x150
[ 6771.195004]  bit_wait+0x1c/0xa0
[ 6771.195024]  __wait_on_bit+0x42/0x110
[ 6771.195035]  ? bit_wait_io+0xa0/0xa0
[ 6771.195070]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.195088]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.195116]  ? swake_up_one+0xb0/0xb0
[ 6771.195135]  ? __lock_acquire+0x1139/0x3d30
[ 6771.195178]  writeback_single_inode+0xb8/0x250
[ 6771.195207]  sync_inode_metadata+0xa2/0xe0
[ 6771.195223]  ? write_inode_now+0x160/0x160
[ 6771.195234]  ? __lock_acquire+0x1139/0x3d30
[ 6771.195292]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.195318]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.195349]  ext4_sync_file+0x3b3/0x620
[ 6771.195385]  vfs_fsync_range+0x69/0x110
[ 6771.195397]  ? ext4_getfsmap+0x520/0x520
[ 6771.195428]  iomap_dio_complete+0x35c/0x3a0
[ 6771.195441]  ? process_one_work+0x3d8/0x950
[ 6771.195454]  ? process_one_work+0x3d3/0x950
[ 6771.195479]  ? aio_fsync_work+0x190/0x190
[ 6771.195499]  iomap_dio_complete_work+0x36/0x50
[ 6771.195522]  process_one_work+0x46c/0x950
[ 6771.195535]  ? worker_thread+0xd6/0x680
[ 6771.195580]  ? kick_pool+0x200/0x200
[ 6771.195630]  ? assign_work+0xe8/0x120
[ 6771.195660]  worker_thread+0x37e/0x680
[ 6771.195673]  ? trace_hardirqs_on+0x22/0x100
[ 6771.195721]  ? create_worker+0x400/0x400
[ 6771.195739]  kthread+0x1b0/0x1f0
[ 6771.195753]  ? kthread+0x103/0x1f0
[ 6771.195768]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.195790]  ret_from_fork+0x40/0x70
[ 6771.195802]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.195822]  ret_from_fork_asm+0x11/0x20
[ 6771.195887]  </TASK>
[ 6771.195897] INFO: task kworker/8:2:627 blocked for more than 120 seconds.
[ 6771.195913]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.195928] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.195942] task:kworker/8:2     state:D stack:0     pid:627   ppid:2      flags:0x00004000
[ 6771.195960] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.195978] Call Trace:
[ 6771.195986]  <TASK>
[ 6771.196001]  ? __schedule+0x996/0x2c80
[ 6771.196019]  __schedule+0x9f6/0x2c80
[ 6771.196083]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.196103]  ? mark_held_locks+0x71/0xa0
[ 6771.196134]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.196150]  ? trace_hardirqs_on+0x22/0x100
[ 6771.196174]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.196205]  schedule+0x96/0x150
[ 6771.196228]  bit_wait+0x1c/0xa0
[ 6771.196247]  __wait_on_bit+0x42/0x110
[ 6771.196260]  ? bit_wait_io+0xa0/0xa0
[ 6771.196297]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.196316]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.196345]  ? swake_up_one+0xb0/0xb0
[ 6771.196363]  ? __lock_acquire+0x1139/0x3d30
[ 6771.196407]  writeback_single_inode+0xb8/0x250
[ 6771.196436]  sync_inode_metadata+0xa2/0xe0
[ 6771.196453]  ? write_inode_now+0x160/0x160
[ 6771.196464]  ? __lock_acquire+0x1139/0x3d30
[ 6771.196522]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.196549]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.196579]  ext4_sync_file+0x3b3/0x620
[ 6771.196614]  vfs_fsync_range+0x69/0x110
[ 6771.196625]  ? ext4_getfsmap+0x520/0x520
[ 6771.196656]  iomap_dio_complete+0x35c/0x3a0
[ 6771.196669]  ? process_one_work+0x3d8/0x950
[ 6771.196682]  ? process_one_work+0x3d3/0x950
[ 6771.196705]  ? aio_fsync_work+0x190/0x190
[ 6771.196726]  iomap_dio_complete_work+0x36/0x50
[ 6771.196749]  process_one_work+0x46c/0x950
[ 6771.196763]  ? worker_thread+0xd6/0x680
[ 6771.196807]  ? kick_pool+0x200/0x200
[ 6771.196857]  ? assign_work+0xe8/0x120
[ 6771.196887]  worker_thread+0x37e/0x680
[ 6771.196900]  ? trace_hardirqs_on+0x22/0x100
[ 6771.196949]  ? create_worker+0x400/0x400
[ 6771.196967]  kthread+0x1b0/0x1f0
[ 6771.196982]  ? kthread+0x103/0x1f0
[ 6771.196996]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.197019]  ret_from_fork+0x40/0x70
[ 6771.197030]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.197051]  ret_from_fork_asm+0x11/0x20
[ 6771.197116]  </TASK>
[ 6771.197146] INFO: task kworker/22:2:1674 blocked for more than 120 seconds.
[ 6771.197163]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.197177] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.197191] task:kworker/22:2    state:D stack:0     pid:1674  ppid:2      flags:0x00004000
[ 6771.197208] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.197227] Call Trace:
[ 6771.197235]  <TASK>
[ 6771.197250]  ? __schedule+0x996/0x2c80
[ 6771.197267]  __schedule+0x9f6/0x2c80
[ 6771.197332]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.197352]  ? mark_held_locks+0x71/0xa0
[ 6771.197382]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.197398]  ? trace_hardirqs_on+0x22/0x100
[ 6771.197422]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.197453]  schedule+0x96/0x150
[ 6771.197512]  bit_wait+0x1c/0xa0
[ 6771.197533]  __wait_on_bit+0x42/0x110
[ 6771.197545]  ? bit_wait_io+0xa0/0xa0
[ 6771.197582]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.197600]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.197628]  ? swake_up_one+0xb0/0xb0
[ 6771.197646]  ? __lock_acquire+0x1139/0x3d30
[ 6771.197690]  writeback_single_inode+0xb8/0x250
[ 6771.197719]  sync_inode_metadata+0xa2/0xe0
[ 6771.197735]  ? write_inode_now+0x160/0x160
[ 6771.197746]  ? __lock_acquire+0x1139/0x3d30
[ 6771.197805]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.197832]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.197863]  ext4_sync_file+0x3b3/0x620
[ 6771.197899]  vfs_fsync_range+0x69/0x110
[ 6771.197911]  ? ext4_getfsmap+0x520/0x520
[ 6771.197943]  iomap_dio_complete+0x35c/0x3a0
[ 6771.197956]  ? process_one_work+0x3d8/0x950
[ 6771.197970]  ? process_one_work+0x3d3/0x950
[ 6771.197994]  ? aio_fsync_work+0x190/0x190
[ 6771.198015]  iomap_dio_complete_work+0x36/0x50
[ 6771.198038]  process_one_work+0x46c/0x950
[ 6771.198052]  ? worker_thread+0xd6/0x680
[ 6771.198098]  ? kick_pool+0x200/0x200
[ 6771.198148]  ? assign_work+0xe8/0x120
[ 6771.198178]  worker_thread+0x37e/0x680
[ 6771.198192]  ? trace_hardirqs_on+0x22/0x100
[ 6771.198240]  ? create_worker+0x400/0x400
[ 6771.198259]  kthread+0x1b0/0x1f0
[ 6771.198274]  ? kthread+0x103/0x1f0
[ 6771.198288]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.198311]  ret_from_fork+0x40/0x70
[ 6771.198323]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.198343]  ret_from_fork_asm+0x11/0x20
[ 6771.198410]  </TASK>
[ 6771.198545] INFO: task kworker/20:0:5149 blocked for more than 120 seconds.
[ 6771.198563]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.198578] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.198592] task:kworker/20:0    state:D stack:0     pid:5149  ppid:2      flags:0x00004000
[ 6771.198610] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.198629] Call Trace:
[ 6771.198637]  <TASK>
[ 6771.198652]  ? __schedule+0x996/0x2c80
[ 6771.198670]  __schedule+0x9f6/0x2c80
[ 6771.198734]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.198754]  ? mark_held_locks+0x71/0xa0
[ 6771.198785]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.198802]  ? trace_hardirqs_on+0x22/0x100
[ 6771.198826]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.198857]  schedule+0x96/0x150
[ 6771.198880]  bit_wait+0x1c/0xa0
[ 6771.198899]  __wait_on_bit+0x42/0x110
[ 6771.198912]  ? bit_wait_io+0xa0/0xa0
[ 6771.198949]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.198969]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.198998]  ? swake_up_one+0xb0/0xb0
[ 6771.199016]  ? __lock_acquire+0x1139/0x3d30
[ 6771.199060]  writeback_single_inode+0xb8/0x250
[ 6771.199089]  sync_inode_metadata+0xa2/0xe0
[ 6771.199105]  ? write_inode_now+0x160/0x160
[ 6771.199116]  ? __lock_acquire+0x1139/0x3d30
[ 6771.199173]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.199199]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.199229]  ext4_sync_file+0x3b3/0x620
[ 6771.199265]  vfs_fsync_range+0x69/0x110
[ 6771.199277]  ? ext4_getfsmap+0x520/0x520
[ 6771.199309]  iomap_dio_complete+0x35c/0x3a0
[ 6771.199322]  ? process_one_work+0x3d8/0x950
[ 6771.199335]  ? process_one_work+0x3d3/0x950
[ 6771.199360]  ? aio_fsync_work+0x190/0x190
[ 6771.199380]  iomap_dio_complete_work+0x36/0x50
[ 6771.199403]  process_one_work+0x46c/0x950
[ 6771.199416]  ? worker_thread+0xd6/0x680
[ 6771.199461]  ? kick_pool+0x200/0x200
[ 6771.199512]  ? assign_work+0xe8/0x120
[ 6771.199542]  worker_thread+0x37e/0x680
[ 6771.199592]  ? create_worker+0x400/0x400
[ 6771.199610]  kthread+0x1b0/0x1f0
[ 6771.199625]  ? kthread+0x103/0x1f0
[ 6771.199639]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.199662]  ret_from_fork+0x40/0x70
[ 6771.199674]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.199693]  ret_from_fork_asm+0x11/0x20
[ 6771.199759]  </TASK>
[ 6771.199768] INFO: task kworker/13:0:6226 blocked for more than 120 seconds.
[ 6771.199784]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.199799] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.199813] task:kworker/13:0    state:D stack:0     pid:6226  ppid:2      flags:0x00004000
[ 6771.199832] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.199850] Call Trace:
[ 6771.199858]  <TASK>
[ 6771.199873]  ? __schedule+0x996/0x2c80
[ 6771.199890]  __schedule+0x9f6/0x2c80
[ 6771.199954]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.199974]  ? mark_held_locks+0x71/0xa0
[ 6771.200005]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.200022]  ? trace_hardirqs_on+0x22/0x100
[ 6771.200046]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.200077]  schedule+0x96/0x150
[ 6771.200099]  io_schedule+0x70/0xb0
[ 6771.200122]  bit_wait_io+0x1c/0xa0
[ 6771.200142]  __wait_on_bit_lock+0xd7/0x130
[ 6771.200154]  ? wait_for_completion_io_timeout+0x30/0x30
[ 6771.200191]  out_of_line_wait_on_bit_lock+0xe3/0x120
[ 6771.200209]  ? __wait_on_bit_lock+0x130/0x130
[ 6771.200236]  ? swake_up_one+0xb0/0xb0
[ 6771.200260]  ? __might_sleep+0x72/0xe0
[ 6771.200289]  __sync_dirty_buffer+0x1ce/0x210
[ 6771.200311]  sync_dirty_buffer+0x13/0x20
[ 6771.200325]  ext4_write_inode+0x2e1/0x300
[ 6771.200348]  ? __ext4_iget+0x1db0/0x1db0
[ 6771.200384]  ? __kasan_check_read+0x11/0x20
[ 6771.200400]  ? do_raw_spin_unlock+0x91/0x110
[ 6771.200429]  __writeback_single_inode+0x59a/0x710
[ 6771.200455]  ? __mark_inode_dirty+0x610/0x610
[ 6771.200470]  ? wbc_attach_and_unlock_inode+0x204/0x3e0
[ 6771.200496]  writeback_single_inode+0x13b/0x250
[ 6771.200521]  sync_inode_metadata+0xa2/0xe0
[ 6771.200534]  ? write_inode_now+0x160/0x160
[ 6771.200543]  ? __lock_acquire+0x1139/0x3d30
[ 6771.200597]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.200620]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.200648]  ext4_sync_file+0x3b3/0x620
[ 6771.200677]  vfs_fsync_range+0x69/0x110
[ 6771.200688]  ? ext4_getfsmap+0x520/0x520
[ 6771.200717]  iomap_dio_complete+0x35c/0x3a0
[ 6771.200731]  ? process_one_work+0x3d8/0x950
[ 6771.200743]  ? process_one_work+0x3d3/0x950
[ 6771.200768]  ? aio_fsync_work+0x190/0x190
[ 6771.200788]  iomap_dio_complete_work+0x36/0x50
[ 6771.200811]  process_one_work+0x46c/0x950
[ 6771.200824]  ? worker_thread+0xd6/0x680
[ 6771.200869]  ? kick_pool+0x200/0x200
[ 6771.200919]  ? assign_work+0xe8/0x120
[ 6771.200949]  worker_thread+0x37e/0x680
[ 6771.200999]  ? create_worker+0x400/0x400
[ 6771.201017]  kthread+0x1b0/0x1f0
[ 6771.201032]  ? kthread+0x103/0x1f0
[ 6771.201046]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.201069]  ret_from_fork+0x40/0x70
[ 6771.201081]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.201101]  ret_from_fork_asm+0x11/0x20
[ 6771.201167]  </TASK>
[ 6771.201176] INFO: task kworker/12:1:6639 blocked for more than 120 seconds.
[ 6771.201192]       Tainted: G           OE      6.6.0-rc3+ #11
[ 6771.201207] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[ 6771.201221] task:kworker/12:1    state:D stack:0     pid:6639  ppid:2      flags:0x00004000
[ 6771.201239] Workqueue: dio/dm-5 iomap_dio_complete_work
[ 6771.201257] Call Trace:
[ 6771.201265]  <TASK>
[ 6771.201281]  ? __schedule+0x996/0x2c80
[ 6771.201298]  __schedule+0x9f6/0x2c80
[ 6771.201362]  ? io_schedule_timeout+0xc0/0xc0
[ 6771.201383]  ? mark_held_locks+0x71/0xa0
[ 6771.201414]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.201431]  ? trace_hardirqs_on+0x22/0x100
[ 6771.201454]  ? _raw_spin_unlock_irq+0x27/0x60
[ 6771.201511]  schedule+0x96/0x150
[ 6771.201534]  bit_wait+0x1c/0xa0
[ 6771.201554]  __wait_on_bit+0x42/0x110
[ 6771.201567]  ? bit_wait_io+0xa0/0xa0
[ 6771.201606]  __inode_wait_for_writeback+0x11b/0x190
[ 6771.201625]  ? inode_prepare_wbs_switch+0x160/0x160
[ 6771.201655]  ? swake_up_one+0xb0/0xb0
[ 6771.201674]  ? __lock_acquire+0x1139/0x3d30
[ 6771.201719]  writeback_single_inode+0xb8/0x250
[ 6771.201749]  sync_inode_metadata+0xa2/0xe0
[ 6771.201766]  ? write_inode_now+0x160/0x160
[ 6771.201777]  ? __lock_acquire+0x1139/0x3d30
[ 6771.201837]  ? file_write_and_wait_range+0x54/0xe0
[ 6771.201864]  generic_buffers_fsync_noflush+0x135/0x160
[ 6771.201895]  ext4_sync_file+0x3b3/0x620
[ 6771.201932]  vfs_fsync_range+0x69/0x110
[ 6771.201945]  ? ext4_getfsmap+0x520/0x520
[ 6771.201982]  iomap_dio_complete+0x35c/0x3a0
[ 6771.201998]  ? process_one_work+0x3d8/0x950
[ 6771.202012]  ? process_one_work+0x3d3/0x950
[ 6771.202039]  ? aio_fsync_work+0x190/0x190
[ 6771.202060]  iomap_dio_complete_work+0x36/0x50
[ 6771.202084]  process_one_work+0x46c/0x950
[ 6771.202098]  ? worker_thread+0xd6/0x680
[ 6771.202145]  ? kick_pool+0x200/0x200
[ 6771.202196]  ? assign_work+0xe8/0x120
[ 6771.202227]  worker_thread+0x37e/0x680
[ 6771.202279]  ? create_worker+0x400/0x400
[ 6771.202298]  kthread+0x1b0/0x1f0
[ 6771.202313]  ? kthread+0x103/0x1f0
[ 6771.202328]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.202352]  ret_from_fork+0x40/0x70
[ 6771.202364]  ? kthread_complete_and_exit+0x30/0x30
[ 6771.202386]  ret_from_fork_asm+0x11/0x20
[ 6771.202453]  </TASK>
[ 6771.202462] Future hung task reports are suppressed, see sysctl kernel.hung_task_warnings
[ 6771.203286] 
               Showing all locks held in the system:
[ 6771.203313] 2 locks held by kworker/11:0/85:
[ 6771.203323]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203377]  #1: ffff88810115fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203439] 2 locks held by kworker/4:1/166:
[ 6771.203449]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203500]  #1: ffff88810265fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203547] 2 locks held by kworker/6:1/168:
[ 6771.203555]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203606]  #1: ffff888102677da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203653] 2 locks held by kworker/17:1/179:
[ 6771.203663]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203713]  #1: ffff8881026f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203761] 1 lock held by khungtaskd/192:
[ 6771.203770]  #0: ffffffff8e3c8040 (rcu_read_lock){....}-{1:3}, at: debug_show_all_locks+0x47/0x290
[ 6771.203837] 2 locks held by systemd-journal/574:
[ 6771.203848] 2 locks held by kworker/7:2/615:
[ 6771.203856]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203905]  #1: ffff88810e71fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.203952] 2 locks held by kworker/8:2/627:
[ 6771.203962]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204019]  #1: ffff8881381cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204077] 1 lock held by gmain/1695:
[ 6771.204087]  #0: ffff888dfc615e58 (&rq->__lock){-.-.}-{2:2}, at: newidle_balance+0x383/0xda0
[ 6771.204137] 1 lock held by in:imklog/1681:
[ 6771.204146]  #0: ffff88817c61f7d8 (&f->f_pos_lock){+.+.}-{4:4}, at: __fdget_pos+0x7d/0xc0
[ 6771.204191] 5 locks held by rs:main Q:Reg/1682:
[ 6771.204202] 2 locks held by kworker/22:2/1674:
[ 6771.204211]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204262]  #1: ffff88817bdbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204397] 2 locks held by kworker/20:0/5149:
[ 6771.204407]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204458]  #1: ffff88813f47fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204505] 2 locks held by kworker/13:0/6226:
[ 6771.204514]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204565]  #1: ffff88819e7b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204613] 2 locks held by kworker/12:1/6639:
[ 6771.204621]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204673]  #1: ffff888165377da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204720] 2 locks held by kworker/3:2/6994:
[ 6771.204729]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204780]  #1: ffff88819518fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204827] 2 locks held by kworker/4:0/6995:
[ 6771.204836]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204887]  #1: ffff888115e17da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204934] 2 locks held by kworker/11:1/7047:
[ 6771.204943]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.204996]  #1: ffff888192697da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205043] 2 locks held by kworker/22:0/7314:
[ 6771.205052]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205102]  #1: ffff8881be08fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205149] 2 locks held by kworker/14:0/7901:
[ 6771.205158]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205209]  #1: ffff8881749a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205256] 2 locks held by kworker/3:0/7905:
[ 6771.205264]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205315]  #1: ffff88815a83fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205363] 4 locks held by gnome-terminal-/7931:
[ 6771.205394] 2 locks held by kworker/6:0/8590:
[ 6771.205402]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205451]  #1: ffff88812a637da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205523] 2 locks held by kworker/7:1/8733:
[ 6771.205532]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205581]  #1: ffff888131497da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205629] 2 locks held by kworker/19:1/8772:
[ 6771.205638]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205688]  #1: ffff8883559d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205734] 2 locks held by kworker/8:0/8773:
[ 6771.205743]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205792]  #1: ffff8881a408fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205839] 2 locks held by kworker/14:1/8806:
[ 6771.205848]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205898]  #1: ffff8881b72f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.205950] 2 locks held by kworker/u66:1/9058:
[ 6771.205959]  #0: ffff888102e07148 ((wq_completion)writeback){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206007]  #1: ffff888354dbfda8 ((work_completion)(&(&wb->dwork)->work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206055] 2 locks held by kworker/14:2/9127:
[ 6771.206064]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206114]  #1: ffff88814c737da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206160] 2 locks held by kworker/17:0/11071:
[ 6771.206170]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206221]  #1: ffff88811226fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206268] 2 locks held by kworker/12:2/11108:
[ 6771.206277]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206326]  #1: ffff88811c38fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206373] 2 locks held by kworker/u68:2/11166:
[ 6771.206383]  #0: ffff888100064548 ((wq_completion)events_unbound){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206428]  #1: ffff88816df8fda8 ((work_completion)(&state->commit_work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206476] 2 locks held by kworker/15:0/11346:
[ 6771.206485]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206535]  #1: ffff88837edf7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206582] 2 locks held by kworker/11:2/11347:
[ 6771.206591]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206642]  #1: ffff88837edffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206687] 2 locks held by kworker/13:1/11374:
[ 6771.206697]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206747]  #1: ffff88811e7a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206797] 2 locks held by kworker/4:2/11426:
[ 6771.206807]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206857]  #1: ffff888383467da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206904] 2 locks held by multipathd/11428:
[ 6771.206917] 2 locks held by kworker/6:2/11634:
[ 6771.206926]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.206976]  #1: ffff88815700fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207026] 2 locks held by kworker/R-dio/d/11885:
[ 6771.207035]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207086]  #1: ffff888374bafd38 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207133] 2 locks held by kworker/11:3/11898:
[ 6771.207142]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207192]  #1: ffff888370867da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207240] 2 locks held by kworker/22:1/11903:
[ 6771.207249]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207299]  #1: ffff88836ee2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207345] 2 locks held by kworker/11:4/11905:
[ 6771.207354]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207405]  #1: ffff88816e00fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207451] 2 locks held by kworker/8:1/11906:
[ 6771.207460]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207511]  #1: ffff88819fbffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207557] 2 locks held by kworker/22:3/11907:
[ 6771.207567]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207618]  #1: ffff88810a4cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207665] 2 locks held by kworker/10:3/11908:
[ 6771.207673]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207725]  #1: ffff88819d1cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207772] 2 locks held by kworker/5:4/11911:
[ 6771.207780]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207830]  #1: ffff8883df137da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207876] 2 locks held by kworker/11:5/11912:
[ 6771.207885]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207934]  #1: ffff8883df13fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.207983] 2 locks held by kworker/19:0/11913:
[ 6771.207994]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208047]  #1: ffff8883df147da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208094] 2 locks held by kworker/21:2/11915:
[ 6771.208102]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208152]  #1: ffff8883df157da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208200] 2 locks held by kworker/21:3/11920:
[ 6771.208208]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208258]  #1: ffff8883df17fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208303] 2 locks held by kworker/21:4/11921:
[ 6771.208311]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208361]  #1: ffff88818e74fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208408] 2 locks held by kworker/11:6/11927:
[ 6771.208417]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208467]  #1: ffff8881b9687da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208513] 2 locks held by kworker/10:4/11930:
[ 6771.208522]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208572]  #1: ffff88811c747da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208618] 2 locks held by kworker/5:7/11932:
[ 6771.208627]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208677]  #1: ffff88837718fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208723] 2 locks held by kworker/5:8/11934:
[ 6771.208732]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208782]  #1: ffff88813a14fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208827] 2 locks held by kworker/22:4/11936:
[ 6771.208836]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208886]  #1: ffff888379527da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208933] 2 locks held by kworker/19:4/11941:
[ 6771.208942]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.208992]  #1: ffff8881090c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209039] 2 locks held by kworker/10:5/11943:
[ 6771.209047]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209097]  #1: ffff888121c37da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209144] 2 locks held by kworker/3:1/11945:
[ 6771.209152]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209202]  #1: ffff88811edbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209249] 2 locks held by kworker/22:6/11951:
[ 6771.209258]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209308]  #1: ffff888108527da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209355] 2 locks held by kworker/3:4/11956:
[ 6771.209364]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209413]  #1: ffff88813270fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209460] 2 locks held by kworker/13:6/11959:
[ 6771.209493]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209543]  #1: ffff888191867da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209590] 2 locks held by kworker/8:4/11961:
[ 6771.209599]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.209649]  #1: ffff8883df20fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[... 126 further "2 locks held by kworker/*" entries snipped; every one holds the same pair of locks, (wq_completion)dio/dm-5#14 and (work_completion)(&dio->aio.work), at process_one_work+0x3d3/0x950 ...]
[ 6771.223026] 2 locks held by kworker/5:32/12265:
[ 6771.223035]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223084]  #1: ffff8883dfe9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223130] 2 locks held by kworker/5:33/12268:
[ 6771.223139]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223189]  #1: ffff8883dfeb7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223236] 2 locks held by kworker/3:39/12271:
[ 6771.223244]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223294]  #1: ffff8883dfecfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223339] 2 locks held by kworker/13:46/12274:
[ 6771.223348]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223399]  #1: ffff8883dfeefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223444] 2 locks held by kworker/8:12/12276:
[ 6771.223453]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223503]  #1: ffff8883dfeffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223551] 2 locks held by kworker/13:48/12283:
[ 6771.223560]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223605]  #1: ffff8883dff3fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223652] 2 locks held by kworker/8:13/12285:
[ 6771.223661]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223710]  #1: ffff8883dff4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223757] 2 locks held by kworker/3:42/12288:
[ 6771.223766]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223815]  #1: ffff8883dff6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223861] 2 locks held by kworker/13:50/12290:
[ 6771.223870]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223921]  #1: ffff8883dff7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.223970] 2 locks held by kworker/3:43/12292:
[ 6771.223979]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224028]  #1: ffff8883dff8fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224074] 2 locks held by kworker/3:44/12295:
[ 6771.224082]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224132]  #1: ffff8883e0017da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224179] 2 locks held by kworker/11:22/12304:
[ 6771.224188]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224237]  #1: ffff8883e006fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224283] 2 locks held by kworker/13:54/12306:
[ 6771.224292]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224341]  #1: ffff88812175fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224386] 2 locks held by kworker/13:55/12307:
[ 6771.224395]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224445]  #1: ffff88812eeb7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224491] 2 locks held by kworker/13:56/12309:
[ 6771.224499]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224549]  #1: ffff88815b887da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224594] 2 locks held by kworker/13:57/12310:
[ 6771.224603]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224653]  #1: ffff888138117da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224699] 2 locks held by kworker/19:13/12312:
[ 6771.224707]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224757]  #1: ffff88811ee97da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224804] 2 locks held by kworker/3:45/12317:
[ 6771.224812]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224861]  #1: ffff88837c247da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224906] 2 locks held by kworker/17:20/12318:
[ 6771.224915]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.224969]  #1: ffff88837c23fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225016] 2 locks held by kworker/3:46/12320:
[ 6771.225025]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225075]  #1: ffff88837c22fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225121] 2 locks held by kworker/13:61/12323:
[ 6771.225130]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225179]  #1: ffff8881966e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225224] 2 locks held by kworker/3:47/12326:
[ 6771.225233]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225283]  #1: ffff8883e008fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225330] 2 locks held by kworker/17:25/12332:
[ 6771.225338]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225388]  #1: ffff8883e00d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225433] 2 locks held by kworker/17:26/12334:
[ 6771.225442]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225513]  #1: ffff8883e00e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225559] 2 locks held by kworker/17:27/12337:
[ 6771.225568]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225617]  #1: ffff8883e013fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225663] 2 locks held by kworker/0:25/12338:
[ 6771.225672]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225722]  #1: ffff8883e014fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225768] 2 locks held by kworker/3:50/12339:
[ 6771.225777]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225827]  #1: ffff8883e0157da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225874] 2 locks held by kworker/17:28/12340:
[ 6771.225882]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225931]  #1: ffff8883e015fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.225975] 2 locks held by kworker/3:52/12344:
[ 6771.225984]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226034]  #1: ffff8883e01d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226081] 2 locks held by kworker/3:53/12346:
[ 6771.226090]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226141]  #1: ffff8883e01e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226189] 2 locks held by kworker/13:64/12348:
[ 6771.226198]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226248]  #1: ffff88817569fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226294] 2 locks held by kworker/3:55/12350:
[ 6771.226302]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226353]  #1: ffff888377e3fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226398] 2 locks held by kworker/3:56/12351:
[ 6771.226408]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226457]  #1: ffff888189107da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226503] 2 locks held by kworker/20:34/12354:
[ 6771.226513]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226563]  #1: ffff888377e2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226609] 2 locks held by kworker/10:57/12356:
[ 6771.226618]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226668]  #1: ffff888377fdfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226712] 2 locks held by kworker/3:59/12357:
[ 6771.226721]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226770]  #1: ffff888377e0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226817] 2 locks held by kworker/3:60/12358:
[ 6771.226825]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226875]  #1: ffff888377df7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226922] 2 locks held by kworker/19:19/12361:
[ 6771.226930]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.226979]  #1: ffff8883e0207da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227031] 2 locks held by kworker/3:63/12367:
[ 6771.227041]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227090]  #1: ffff8883e0287da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227135] 2 locks held by kworker/20:35/12368:
[ 6771.227144]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227194]  #1: ffff8883e0297da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227240] 2 locks held by kworker/10:58/12369:
[ 6771.227249]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227299]  #1: ffff8883e029fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227345] 2 locks held by kworker/5:40/12372:
[ 6771.227354]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227404]  #1: ffff8883e02c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227450] 2 locks held by kworker/0:27/12374:
[ 6771.227459]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227509]  #1: ffff8883e02dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227557] 2 locks held by kworker/17:33/12382:
[ 6771.227565]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227616]  #1: ffff8883e036fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227662] 2 locks held by kworker/3:66/12384:
[ 6771.227671]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227721]  #1: ffff8883e0387da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227769] 2 locks held by kworker/3:67/12389:
[ 6771.227777]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227827]  #1: ffff88812919fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227874] 2 locks held by kworker/20:37/12393:
[ 6771.227883]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227932]  #1: ffff888394177da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.227979] 2 locks held by kworker/3:68/12396:
[ 6771.227988]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228037]  #1: ffff888120427da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228085] 2 locks held by kworker/11:25/12405:
[ 6771.228094]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228144]  #1: ffff8883e03dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228190] 2 locks held by kworker/8:19/12406:
[ 6771.228199]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228249]  #1: ffff8883e03efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228296] 2 locks held by kworker/21:21/12410:
[ 6771.228304]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228354]  #1: ffff8883e0427da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228400] 2 locks held by kworker/3:72/12415:
[ 6771.228409]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228459]  #1: ffff8883e0457da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228506] 2 locks held by kworker/21:24/12420:
[ 6771.228514]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228563]  #1: ffff888196fffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228609] 2 locks held by kworker/4:17/12421:
[ 6771.228617]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228668]  #1: ffff888137a9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228714] 2 locks held by kworker/3:74/12423:
[ 6771.228723]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228772]  #1: ffff888354857da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228820] 2 locks held by kworker/3:75/12429:
[ 6771.228828]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228878]  #1: ffff8881b265fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228925] 2 locks held by kworker/21:26/12432:
[ 6771.228934]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.228983]  #1: ffff888395b07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229029] 2 locks held by kworker/8:20/12433:
[ 6771.229038]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229088]  #1: ffff888395b0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229134] 2 locks held by kworker/0:32/12434:
[ 6771.229143]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229193]  #1: ffff888395bbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229239] 2 locks held by kworker/4:19/12435:
[ 6771.229248]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229297]  #1: ffff888395bc7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229344] 2 locks held by kworker/3:76/12437:
[ 6771.229353]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229403]  #1: ffff888167417da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229449] 2 locks held by kworker/21:27/12439:
[ 6771.229458]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229528]  #1: ffff888166b7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229575] 2 locks held by kworker/4:20/12440:
[ 6771.229584]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229634]  #1: ffff888130bbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229679] 2 locks held by kworker/0:33/12441:
[ 6771.229688]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229738]  #1: ffff88811e51fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229785] 2 locks held by kworker/3:77/12442:
[ 6771.229794]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229844]  #1: ffff888129747da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229891] 2 locks held by kworker/10:59/12445:
[ 6771.229900]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229950]  #1: ffff88818706fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.229997] 2 locks held by kworker/21:28/12446:
[ 6771.230006]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230061]  #1: ffff8883ddf67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230108] 2 locks held by kworker/3:78/12447:
[ 6771.230118]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230168]  #1: ffff8883e04b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230216] 2 locks held by kworker/10:60/12450:
[ 6771.230225]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230275]  #1: ffff8883e04d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230321] 2 locks held by kworker/21:29/12451:
[ 6771.230331]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230381]  #1: ffff8883e04dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230427] 2 locks held by kworker/4:21/12455:
[ 6771.230435]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230486]  #1: ffff8883e050fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230532] 2 locks held by kworker/21:30/12456:
[ 6771.230541]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230592]  #1: ffff8883e0517da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230638] 2 locks held by kworker/10:61/12457:
[ 6771.230647]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230696]  #1: ffff8883e051fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230743] 2 locks held by kworker/3:80/12460:
[ 6771.230752]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230803]  #1: ffff8883e054fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230849] 2 locks held by kworker/10:62/12463:
[ 6771.230858]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230908]  #1: ffff8883e056fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.230954] 2 locks held by kworker/3:81/12466:
[ 6771.230963]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231013]  #1: ffff8883e058fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231060] 2 locks held by kworker/11:26/12469:
[ 6771.231068]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231118]  #1: ffff8883e05afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231165] 2 locks held by kworker/21:32/12470:
[ 6771.231174]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231223]  #1: ffff8883e05bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231270] 2 locks held by kworker/10:63/12471:
[ 6771.231279]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231328]  #1: ffff8883e05c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231375] 2 locks held by kworker/3:82/12473:
[ 6771.231384]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231434]  #1: ffff8883e05e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231480] 2 locks held by kworker/20:40/12475:
[ 6771.231489]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231539]  #1: ffff8883e05f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231586] 2 locks held by kworker/21:33/12477:
[ 6771.231595]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231644]  #1: ffff88813ac3fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231691] 2 locks held by kworker/3:83/12478:
[ 6771.231700]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231750]  #1: ffff888370937da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231796] 2 locks held by kworker/20:41/12479:
[ 6771.231805]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231855]  #1: ffff88818112fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231902] 2 locks held by kworker/3:84/12480:
[ 6771.231911]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.231961]  #1: ffff88819d4bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232009] 2 locks held by kworker/20:42/12482:
[ 6771.232018]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232068]  #1: ffff88837092fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232114] 2 locks held by kworker/3:85/12483:
[ 6771.232122]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232172]  #1: ffff88813920fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232217] 2 locks held by kworker/20:43/12485:
[ 6771.232226]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232275]  #1: ffff8883e0647da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232322] 2 locks held by kworker/3:86/12486:
[ 6771.232330]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232380]  #1: ffff8883e0657da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232426] 2 locks held by kworker/3:87/12490:
[ 6771.232435]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232485]  #1: ffff8883e067fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232530] 2 locks held by kworker/23:21/12491:
[ 6771.232539]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232588]  #1: ffff8883e068fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232634] 2 locks held by kworker/10:67/12494:
[ 6771.232643]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232692]  #1: ffff8883e06afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232739] 2 locks held by kworker/0:39/12499:
[ 6771.232748]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232797]  #1: ffff8881b10f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232842] 2 locks held by kworker/10:68/12500:
[ 6771.232851]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232902]  #1: ffff8881acbefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.232948] 2 locks held by kworker/20:47/12504:
[ 6771.232957]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233006]  #1: ffff88814aa2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233059] 2 locks held by kworker/3:91/12507:
[ 6771.233069]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233119]  #1: ffff888133867da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233166] 2 locks held by kworker/3:92/12510:
[ 6771.233175]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233225]  #1: ffff8883e06ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233270] 2 locks held by kworker/17:38/12511:
[ 6771.233279]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233329]  #1: ffff8883e0707da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233376] 2 locks held by kworker/3:93/12515:
[ 6771.233385]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233435]  #1: ffff8883e07b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233504] 2 locks held by kworker/20:50/12516:
[ 6771.233513]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233562]  #1: ffff8883e07bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233608] 2 locks held by kworker/10:72/12517:
[ 6771.233617]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233666]  #1: ffff8883e07c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233713] 2 locks held by kworker/0:42/12518:
[ 6771.233722]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233772]  #1: ffff8883e07cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233818] 2 locks held by kworker/10:73/12520:
[ 6771.233827]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233877]  #1: ffff8883e07dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233924] 2 locks held by kworker/3:95/12521:
[ 6771.233933]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.233983]  #1: ffff8883e07e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234029] 2 locks held by kworker/0:43/12524:
[ 6771.234038]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234088]  #1: ffff8883e07ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234136] 2 locks held by kworker/3:96/12525:
[ 6771.234145]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234196]  #1: ffff8883e0817da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234243] 2 locks held by kworker/0:44/12527:
[ 6771.234252]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234301]  #1: ffff8883e0827da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234348] 2 locks held by kworker/3:98/12531:
[ 6771.234357]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234407]  #1: ffff8883e0857da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234454] 2 locks held by kworker/10:77/12532:
[ 6771.234463]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234513]  #1: ffff8883e0867da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234560] 2 locks held by kworker/3:99/12534:
[ 6771.234569]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234618]  #1: ffff8883e087fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234661] 2 locks held by kworker/10:78/12535:
[ 6771.234670]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234719]  #1: ffff8883e0887da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234765] 2 locks held by kworker/0:47/12536:
[ 6771.234774]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234824]  #1: ffff8883e088fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234869] 2 locks held by kworker/3:100/12537:
[ 6771.234878]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234928]  #1: ffff8883e089fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.234974] 2 locks held by kworker/10:79/12538:
[ 6771.234983]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235032]  #1: ffff8883e08afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235077] 2 locks held by kworker/10:80/12540:
[ 6771.235086]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235134]  #1: ffff8883e08c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235181] 2 locks held by kworker/19:22/12543:
[ 6771.235188]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235237]  #1: ffff8883e08e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235284] 2 locks held by kworker/10:81/12545:
[ 6771.235293]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235342]  #1: ffff8883e08f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235386] 2 locks held by kworker/19:23/12546:
[ 6771.235395]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235445]  #1: ffff8883e0907da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235491] 2 locks held by kworker/0:50/12547:
[ 6771.235499]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235550]  #1: ffff8883e090fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235596] 2 locks held by kworker/19:24/12549:
[ 6771.235605]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235655]  #1: ffff8883e0927da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235701] 2 locks held by kworker/10:83/12550:
[ 6771.235710]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235760]  #1: ffff8883e0937da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235807] 2 locks held by kworker/10:84/12553:
[ 6771.235815]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235865]  #1: ffff8883e095fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235912] 2 locks held by kworker/10:85/12556:
[ 6771.235921]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.235970]  #1: ffff8883e097fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236016] 2 locks held by kworker/17:40/12558:
[ 6771.236025]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236075]  #1: ffff8883e0997da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236121] 2 locks held by kworker/10:86/12560:
[ 6771.236130]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236180]  #1: ffff8883e09a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236227] 2 locks held by kworker/5:45/12562:
[ 6771.236235]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236285]  #1: ffff8883e09bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236332] 2 locks held by kworker/19:29/12565:
[ 6771.236341]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236390]  #1: ffff8883e09e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236437] 2 locks held by kworker/10:88/12568:
[ 6771.236445]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236496]  #1: ffff8883e0a0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236541] 2 locks held by kworker/19:30/12569:
[ 6771.236550]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236600]  #1: ffff8883e0a17da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236647] 2 locks held by kworker/20:54/12571:
[ 6771.236656]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236706]  #1: ffff888396f9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236752] 2 locks held by kworker/0:56/12574:
[ 6771.236761]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236811]  #1: ffff888396fb7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236857] 2 locks held by kworker/20:55/12575:
[ 6771.236866]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236915]  #1: ffff8883970a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.236962] 2 locks held by kworker/10:90/12579:
[ 6771.236971]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237020]  #1: ffff888175817da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237070] 2 locks held by kworker/11:28/12582:
[ 6771.237080]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237129]  #1: ffff888396cd7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237176] 2 locks held by kworker/23:23/12584:
[ 6771.237185]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237234]  #1: ffff888396e77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237281] 2 locks held by kworker/11:29/12588:
[ 6771.237289]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237339]  #1: ffff888396e9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237386] 2 locks held by kworker/5:47/12592:
[ 6771.237395]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237446]  #1: ffff888396f37da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237513] 2 locks held by kworker/10:92/12594:
[ 6771.237521]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237571]  #1: ffff88815bb9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237617] 2 locks held by kworker/0:59/12595:
[ 6771.237626]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237676]  #1: ffff888131cefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237722] 2 locks held by kworker/11:30/12597:
[ 6771.237731]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237781]  #1: ffff88814a387da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237827] 2 locks held by kworker/19:35/12598:
[ 6771.237836]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237887]  #1: ffff88818d25fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237932] 2 locks held by kworker/23:25/12599:
[ 6771.237941]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.237991]  #1: ffff888112f2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238037] 2 locks held by kworker/4:23/12601:
[ 6771.238047]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238097]  #1: ffff888119d97da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238144] 2 locks held by kworker/10:93/12602:
[ 6771.238153]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238204]  #1: ffff8881a5f9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238252] 2 locks held by kworker/6:3/12606:
[ 6771.238261]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238312]  #1: ffff88810288fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238359] 2 locks held by kworker/11:31/12607:
[ 6771.238368]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238418]  #1: ffff8881217f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238465] 2 locks held by kworker/19:36/12608:
[ 6771.238473]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238523]  #1: ffff8881bd047da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238571] 2 locks held by kworker/20:60/12614:
[ 6771.238580]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238630]  #1: ffff8883e0a67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238676] 2 locks held by kworker/19:37/12616:
[ 6771.238685]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238734]  #1: ffff8883e0a87da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238781] 2 locks held by kworker/10:95/12618:
[ 6771.238789]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238839]  #1: ffff8883e0adfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238885] 2 locks held by kworker/22:13/12622:
[ 6771.238894]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238943]  #1: ffff8883e0b07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.238990] 2 locks held by kworker/11:33/12625:
[ 6771.238999]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239049]  #1: ffff8883e0b27da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239096] 2 locks held by kworker/10:96/12629:
[ 6771.239105]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239154]  #1: ffff8883e0b4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239201] 2 locks held by kworker/22:14/12630:
[ 6771.239208]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239259]  #1: ffff8883e0b5fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239306] 2 locks held by kworker/22:15/12635:
[ 6771.239315]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239364]  #1: ffff8883e0b9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239411] 2 locks held by kworker/5:49/12638:
[ 6771.239419]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239469]  #1: ffff888396297da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239516] 2 locks held by kworker/22:16/12640:
[ 6771.239525]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239575]  #1: ffff888395be7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239621] 2 locks held by kworker/19:41/12641:
[ 6771.239630]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239680]  #1: ffff888396207da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239727] 2 locks held by kworker/20:62/12642:
[ 6771.239736]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239786]  #1: ffff88839645fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239832] 2 locks held by kworker/6:4/12645:
[ 6771.239841]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239891]  #1: ffff88817a46fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239937] 2 locks held by kworker/22:17/12647:
[ 6771.239946]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.239994]  #1: ffff8881765ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240039] 2 locks held by kworker/19:42/12648:
[ 6771.240048]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240098]  #1: ffff888195367da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240144] 2 locks held by kworker/10:100/12652:
[ 6771.240153]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240203]  #1: ffff8881b550fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240249] 2 locks held by kworker/22:18/12653:
[ 6771.240258]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240308]  #1: ffff888387a37da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240355] 2 locks held by kworker/10:101/12656:
[ 6771.240364]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240414]  #1: ffff88819c707da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240460] 2 locks held by kworker/22:19/12657:
[ 6771.240469]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240519]  #1: ffff888138e57da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240566] 2 locks held by kworker/19:44/12660:
[ 6771.240575]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240625]  #1: ffff888161597da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240672] 2 locks held by kworker/10:102/12661:
[ 6771.240680]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240730]  #1: ffff888387a1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240778] 2 locks held by kworker/17:42/12669:
[ 6771.240788]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240838]  #1: ffff88818c4e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240886] 2 locks held by kworker/17:44/12676:
[ 6771.240895]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240944]  #1: ffff88817fc6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.240991] 2 locks held by kworker/20:69/12678:
[ 6771.241000]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241049]  #1: ffff888151ecfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241097] 2 locks held by kworker/17:46/12684:
[ 6771.241106]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241156]  #1: ffff8881a4597da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241203] 2 locks held by kworker/11:34/12691:
[ 6771.241212]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241262]  #1: ffff888138747da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241308] 2 locks held by kworker/5:50/12692:
[ 6771.241317]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241368]  #1: ffff8881ba53fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241414] 2 locks held by kworker/4:36/12693:
[ 6771.241423]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241493]  #1: ffff88811cfb7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241541] 2 locks held by kworker/5:52/12699:
[ 6771.241551]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241601]  #1: ffff888150cf7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241648] 2 locks held by kworker/10:108/12700:
[ 6771.241657]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241707]  #1: ffff8881708bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241753] 2 locks held by kworker/0:75/12703:
[ 6771.241762]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241812]  #1: ffff8881ac037da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241860] 2 locks held by kworker/10:109/12705:
[ 6771.241869]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241919]  #1: ffff88838d287da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.241966] 2 locks held by kworker/6:5/12710:
[ 6771.241975]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242025]  #1: ffff888387c87da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242071] 2 locks held by kworker/1:26/12711:
[ 6771.242080]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242131]  #1: ffff888387c8fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242179] 2 locks held by kworker/5:54/12715:
[ 6771.242189]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242239]  #1: ffff8883d237fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242286] 2 locks held by kworker/0:79/12719:
[ 6771.242295]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242346]  #1: ffff8883e0befda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242393] 2 locks held by kworker/3:102/12721:
[ 6771.242402]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242452]  #1: ffff8883e0bffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242497] 2 locks held by kworker/4:41/12722:
[ 6771.242506]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242556]  #1: ffff8883e0c0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242603] 2 locks held by kworker/10:112/12724:
[ 6771.242612]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242661]  #1: ffff8883e0c27da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242708] 2 locks held by kworker/5:56/12726:
[ 6771.242718]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242767]  #1: ffff8883e0c47da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242814] 2 locks held by kworker/10:113/12730:
[ 6771.242823]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242873]  #1: ffff8883e0c6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242921] 2 locks held by kworker/0:83/12738:
[ 6771.242930]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.242980]  #1: ffff8883e0ccfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243027] 2 locks held by kworker/3:104/12740:
[ 6771.243037]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243088]  #1: ffff8883e0ce7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243133] 2 locks held by kworker/3:105/12744:
[ 6771.243142]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243191]  #1: ffff8883e0d17da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243238] 2 locks held by kworker/0:85/12745:
[ 6771.243246]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243296]  #1: ffff8883e0d1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243342] 2 locks held by kworker/3:106/12749:
[ 6771.243351]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243400]  #1: ffff8883e0d4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243447] 2 locks held by kworker/10:117/12751:
[ 6771.243456]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243505]  #1: ffff8883e0d67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243552] 2 locks held by kworker/3:107/12752:
[ 6771.243561]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243611]  #1: ffff8883e0d77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243656] 2 locks held by kworker/1:36/12753:
[ 6771.243665]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243715]  #1: ffff8883e0d7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243761] 2 locks held by kworker/10:118/12754:
[ 6771.243770]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243821]  #1: ffff8883e0d87da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243868] 2 locks held by kworker/3:108/12757:
[ 6771.243876]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243925]  #1: ffff8883e0da7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.243972] 2 locks held by kworker/10:119/12759:
[ 6771.243981]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244033]  #1: ffff8883e0dbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244079] 2 locks held by kworker/5:58/12760:
[ 6771.244088]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244137]  #1: ffff8883e0dcfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244184] 2 locks held by kworker/0:88/12761:
[ 6771.244193]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244241]  #1: ffff8883e0ddfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244287] 2 locks held by kworker/3:109/12762:
[ 6771.244296]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244346]  #1: ffff8883e0de7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244392] 2 locks held by kworker/21:34/12764:
[ 6771.244401]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244450]  #1: ffff8883e0e07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244496] 2 locks held by kworker/23:29/12765:
[ 6771.244505]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244555]  #1: ffff8883e0e0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244601] 2 locks held by kworker/10:120/12767:
[ 6771.244610]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244660]  #1: ffff8883e0e27da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244705] 2 locks held by kworker/3:110/12768:
[ 6771.244713]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244763]  #1: ffff8883e0e2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244809] 2 locks held by kworker/0:90/12771:
[ 6771.244818]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244868]  #1: ffff8883e0e4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244913] 2 locks held by kworker/3:111/12773:
[ 6771.244922]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.244972]  #1: ffff8883e0e67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245018] 2 locks held by kworker/0:91/12776:
[ 6771.245028]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245085]  #1: ffff8883e0ebfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245131] 2 locks held by kworker/3:112/12777:
[ 6771.245140]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245190]  #1: ffff8883e0ecfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245236] 2 locks held by kworker/10:122/12779:
[ 6771.245245]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245294]  #1: ffff8883e0eefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245341] 2 locks held by kworker/0:92/12780:
[ 6771.245350]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245400]  #1: ffff8883e0ef7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245446] 2 locks held by kworker/10:123/12782:
[ 6771.245455]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245525]  #1: ffff8883e0f0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245573] 2 locks held by kworker/0:95/12789:
[ 6771.245582]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245633]  #1: ffff8883e0f57da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245678] 2 locks held by kworker/10:125/12790:
[ 6771.245688]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245737]  #1: ffff8883e0f67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245784] 2 locks held by kworker/0:96/12793:
[ 6771.245793]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245843]  #1: ffff8883e0f87da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245890] 2 locks held by kworker/10:126/12794:
[ 6771.245898]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245948]  #1: ffff8883e0f97da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.245994] 2 locks held by kworker/21:36/12795:
[ 6771.246003]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246054]  #1: ffff8883e0f9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246101] 2 locks held by kworker/20:72/12796:
[ 6771.246110]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246161]  #1: ffff8883e0fafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246208] 2 locks held by kworker/4:47/12797:
[ 6771.246217]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246267]  #1: ffff8883e0fbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246314] 2 locks held by kworker/5:59/12798:
[ 6771.246322]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246372]  #1: ffff8883e0fc7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246419] 2 locks held by kworker/10:127/12800:
[ 6771.246428]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246478]  #1: ffff8883e0fdfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246525] 2 locks held by kworker/0:98/12809:
[ 6771.246534]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246584]  #1: ffff8883e1047da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246632] 2 locks held by kworker/12:4/12814:
[ 6771.246641]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246691]  #1: ffff8883e107fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246737] 2 locks held by kworker/0:99/12816:
[ 6771.246746]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246795]  #1: ffff8883e109fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246842] 2 locks held by kworker/21:37/12818:
[ 6771.246851]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246901]  #1: ffff8883e10b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.246947] 2 locks held by kworker/11:36/12820:
[ 6771.246956]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247006]  #1: ffff8883e10c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247051] 2 locks held by kworker/4:51/12821:
[ 6771.247060]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247110]  #1: ffff8883e10d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247156] 2 locks held by kworker/1:50/12824:
[ 6771.247165]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247215]  #1: ffff8883e112fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247261] 2 locks held by kworker/10:129/12827:
[ 6771.247270]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247319]  #1: ffff8883e114fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247366] 2 locks held by kworker/3:113/12828:
[ 6771.247375]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247424]  #1: ffff8883e1157da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247471] 2 locks held by kworker/5:63/12829:
[ 6771.247479]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247529]  #1: ffff8883e119fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247574] 2 locks held by kworker/21:39/12830:
[ 6771.247583]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247633]  #1: ffff8883e11afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247680] 2 locks held by kworker/3:114/12834:
[ 6771.247689]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247739]  #1: ffff8883e11dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247787] 2 locks held by kworker/1:52/12840:
[ 6771.247795]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247845]  #1: ffff8883e121fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247892] 2 locks held by kworker/0:102/12843:
[ 6771.247901]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247951]  #1: ffff8883e1247da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.247998] 2 locks held by kworker/11:39/12845:
[ 6771.248006]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248056]  #1: ffff8883e125fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248102] 2 locks held by kworker/4:54/12846:
[ 6771.248111]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248161]  #1: ffff8883e1267da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248208] 2 locks held by kworker/3:116/12848:
[ 6771.248217]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248266]  #1: ffff8883e127fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248314] 2 locks held by kworker/11:40/12856:
[ 6771.248323]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248373]  #1: ffff8883e12dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248421] 2 locks held by kworker/23:32/12864:
[ 6771.248429]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248480]  #1: ffff8883e1337da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248526] 2 locks held by kworker/5:65/12865:
[ 6771.248534]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248583]  #1: ffff8883e133fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248631] 2 locks held by kworker/5:66/12869:
[ 6771.248640]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248690]  #1: ffff8883e137fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248737] 2 locks held by kworker/1:58/12872:
[ 6771.248746]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248796]  #1: ffff8883e139fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248841] 2 locks held by kworker/3:117/12873:
[ 6771.248850]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248900]  #1: ffff8883e13a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.248947] 2 locks held by kworker/5:67/12874:
[ 6771.248956]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249006]  #1: ffff8883e13afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249055] 2 locks held by kworker/1:59/12875:
[ 6771.249065]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249116]  #1: ffff8883e13bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249163] 2 locks held by kworker/3:118/12876:
[ 6771.249172]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249222]  #1: ffff8883e13c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249269] 2 locks held by kworker/5:69/12880:
[ 6771.249278]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249327]  #1: ffff8883e13efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249374] 2 locks held by kworker/11:41/12887:
[ 6771.249383]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249433]  #1: ffff8883e1487da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249499] 2 locks held by kworker/11:43/12892:
[ 6771.249507]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249558]  #1: ffff8883e14b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249604] 2 locks held by kworker/1:67/12893:
[ 6771.249612]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249662]  #1: ffff8883e14c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249710] 2 locks held by kworker/17:48/12902:
[ 6771.249719]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249769]  #1: ffff88813b6c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249810] 2 locks held by kworker/17:49/12903:
[ 6771.249817]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249860]  #1: ffff888378abfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249901] 2 locks held by kworker/17:50/12904:
[ 6771.249909]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249953]  #1: ffff888378ab7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.249993] 2 locks held by kworker/12:10/12905:
[ 6771.250001]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250049]  #1: ffff888377387da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250098] 2 locks held by kworker/10:133/12914:
[ 6771.250107]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250158]  #1: ffff8881b53e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250206] 2 locks held by kworker/23:41/12920:
[ 6771.250215]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250266]  #1: ffff8883e157fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250314] 2 locks held by kworker/10:138/12926:
[ 6771.250323]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250373]  #1: ffff88815b36fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250422] 2 locks held by kworker/23:49/12935:
[ 6771.250431]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250481]  #1: ffff8883e0ab7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250527] 2 locks held by kworker/10:142/12936:
[ 6771.250536]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250586]  #1: ffff8883e0abfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250632] 2 locks held by kworker/10:143/12938:
[ 6771.250641]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250691]  #1: ffff8883e15f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250738] 2 locks held by kworker/23:52/12941:
[ 6771.250747]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250797]  #1: ffff8883e1617da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250843] 2 locks held by kworker/10:145/12942:
[ 6771.250852]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250901]  #1: ffff8883e161fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.250948] 2 locks held by kworker/10:146/12944:
[ 6771.250957]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251007]  #1: ffff8883e1637da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251051] 2 locks held by kworker/23:54/12945:
[ 6771.251060]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251107]  #1: ffff8883e163fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251153] 2 locks held by kworker/10:147/12946:
[ 6771.251162]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251211]  #1: ffff8883e1657da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251257] 2 locks held by kworker/10:148/12948:
[ 6771.251267]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251317]  #1: ffff8883e1667da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251362] 2 locks held by kworker/23:56/12949:
[ 6771.251371]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251421]  #1: ffff8883e1677da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251467] 2 locks held by kworker/10:149/12950:
[ 6771.251476]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251526]  #1: ffff8883e167fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251573] 2 locks held by kworker/10:150/12953:
[ 6771.251582]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251632]  #1: ffff8883e169fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251678] 2 locks held by kworker/21:46/12955:
[ 6771.251687]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251736]  #1: ffff8883e16b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251783] 2 locks held by kworker/23:60/12956:
[ 6771.251792]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251841]  #1: ffff8883e16bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251887] 2 locks held by kworker/10:151/12957:
[ 6771.251896]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251945]  #1: ffff8883e16c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.251992] 2 locks held by kworker/23:61/12959:
[ 6771.252000]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252053]  #1: ffff8883e171fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252098] 2 locks held by kworker/10:152/12960:
[ 6771.252107]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252157]  #1: ffff8883e1727da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252204] 2 locks held by kworker/10:154/12964:
[ 6771.252212]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252261]  #1: ffff8883e175fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252307] 2 locks held by kworker/23:64/12965:
[ 6771.252316]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252366]  #1: ffff8883e1767da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252412] 2 locks held by kworker/10:155/12966:
[ 6771.252421]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252470]  #1: ffff8883e176fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252517] 2 locks held by kworker/17:52/12973:
[ 6771.252526]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252576]  #1: ffff8883e17c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252624] 2 locks held by kworker/23:66/12980:
[ 6771.252632]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252682]  #1: ffff8883e180fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252728] 2 locks held by kworker/23:67/12981:
[ 6771.252737]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252786]  #1: ffff8883e1817da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252831] 2 locks held by kworker/10:160/12982:
[ 6771.252840]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252890]  #1: ffff8883e182fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252936] 2 locks held by kworker/10:161/12985:
[ 6771.252945]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.252995]  #1: ffff8883e1857da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253040] 2 locks held by kworker/10:162/12986:
[ 6771.253050]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253108]  #1: ffff8883e185fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253154] 2 locks held by kworker/10:163/12987:
[ 6771.253162]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253212]  #1: ffff8883e1867da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253258] 2 locks held by kworker/10:164/12988:
[ 6771.253266]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253316]  #1: ffff8883e1877da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253364] 2 locks held by kworker/23:73/12996:
[ 6771.253373]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253423]  #1: ffff8883e18ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253490] 2 locks held by kworker/10:167/12997:
[ 6771.253499]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253550]  #1: ffff8883e190fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253596] 2 locks held by kworker/10:169/13003:
[ 6771.253605]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253656]  #1: ffff8883e195fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253703] 2 locks held by kworker/10:170/13006:
[ 6771.253712]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253762]  #1: ffff8883e197fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253808] 2 locks held by kworker/23:77/13007:
[ 6771.253817]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253867]  #1: ffff8883e1987da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253914] 2 locks held by kworker/12:13/13013:
[ 6771.253923]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.253972]  #1: ffff8883773b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254019] 2 locks held by kworker/5:73/13015:
[ 6771.254028]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254079]  #1: ffff8883e19efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254126] 2 locks held by kworker/5:76/13018:
[ 6771.254135]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254186]  #1: ffff8883e1a0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254233] 2 locks held by kworker/5:77/13019:
[ 6771.254242]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254292]  #1: ffff8883e1a1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254338] 2 locks held by kworker/5:79/13021:
[ 6771.254347]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254397]  #1: ffff8883e1a2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254444] 2 locks held by kworker/5:81/13023:
[ 6771.254453]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254503]  #1: ffff8883e1a4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254549] 2 locks held by kworker/5:83/13025:
[ 6771.254558]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254608]  #1: ffff8883e1a67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254655] 2 locks held by kworker/5:89/13031:
[ 6771.254664]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254714]  #1: ffff8883e1aafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254761] 2 locks held by kworker/5:91/13033:
[ 6771.254769]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254820]  #1: ffff8883e1abfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254866] 2 locks held by kworker/5:92/13035:
[ 6771.254874]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254921]  #1: ffff8883e1ad7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.254967] 2 locks held by kworker/6:6/13038:
[ 6771.254976]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255025]  #1: ffff8883e1affda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255072] 2 locks held by kworker/12:20/13043:
[ 6771.255081]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255130]  #1: ffff8883e1bcfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255176] 2 locks held by kworker/12:21/13044:
[ 6771.255185]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255235]  #1: ffff8883e1bd7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255283] 2 locks held by kworker/12:23/13046:
[ 6771.255292]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255342]  #1: ffff8883e1befda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255389] 2 locks held by kworker/17:55/13050:
[ 6771.255398]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255447]  #1: ffff8883e1c17da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255494] 2 locks held by kworker/12:27/13055:
[ 6771.255503]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255553]  #1: ffff8883e1c57da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255600] 2 locks held by kworker/12:30/13058:
[ 6771.255609]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255659]  #1: ffff8883e1c7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255706] 2 locks held by kworker/13:77/13063:
[ 6771.255715]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255766]  #1: ffff8883e1cafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255811] 2 locks held by kworker/12:34/13064:
[ 6771.255820]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255870]  #1: ffff8883e1cf7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255917] 2 locks held by kworker/12:36/13068:
[ 6771.255926]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.255976]  #1: ffff8883e1d5fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256022] 2 locks held by kworker/13:80/13069:
[ 6771.256031]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256081]  #1: ffff8883e1d67da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256127] 2 locks held by kworker/12:37/13070:
[ 6771.256136]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256186]  #1: ffff8883e1d77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256233] 2 locks held by kworker/12:38/13073:
[ 6771.256242]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256292]  #1: ffff8883e1dcfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256339] 2 locks held by kworker/17:60/13076:
[ 6771.256348]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256398]  #1: ffff8883e1df7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256444] 2 locks held by kworker/13:83/13078:
[ 6771.256453]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256504]  #1: ffff8883e1e07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256550] 2 locks held by kworker/13:84/13081:
[ 6771.256559]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256609]  #1: ffff8883e1e2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256655] 2 locks held by kworker/17:62/13082:
[ 6771.256664]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256714]  #1: ffff8883e1e3fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256761] 2 locks held by kworker/12:41/13083:
[ 6771.256770]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256820]  #1: ffff8883e1e47da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256866] 2 locks held by kworker/13:85/13084:
[ 6771.256875]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256924]  #1: ffff8883e1e4fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.256970] 2 locks held by kworker/12:42/13085:
[ 6771.256979]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257025]  #1: ffff8883e1b0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257072] 2 locks held by kworker/12:49/13093:
[ 6771.257081]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257131]  #1: ffff8881afeefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257177] 2 locks held by kworker/12:51/13095:
[ 6771.257186]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257236]  #1: ffff8881881ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257283] 2 locks held by kworker/14:3/13097:
[ 6771.257292]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257342]  #1: ffff8883e0bcfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257388] 2 locks held by kworker/14:4/13098:
[ 6771.257396]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257446]  #1: ffff8883e0bd7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257515] 2 locks held by kworker/12:52/13100:
[ 6771.257524]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257574]  #1: ffff888199af7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257622] 2 locks held by kworker/15:4/13107:
[ 6771.257631]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257681]  #1: ffff8883e1f7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257728] 2 locks held by kworker/15:7/13112:
[ 6771.257738]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257787]  #1: ffff8883e1fcfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257835] 2 locks held by kworker/18:16/13116:
[ 6771.257844]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257894]  #1: ffff8883e1ff7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.257941] 2 locks held by kworker/15:9/13117:
[ 6771.257950]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258000]  #1: ffff8883e2007da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258046] 2 locks held by kworker/15:10/13118:
[ 6771.258055]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258105]  #1: ffff8883e200fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258153] 2 locks held by kworker/15:11/13120:
[ 6771.258162]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258213]  #1: ffff8883e2077da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258260] 2 locks held by kworker/18:18/13121:
[ 6771.258269]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258320]  #1: ffff8883e207fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258366] 2 locks held by kworker/15:12/13122:
[ 6771.258375]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258425]  #1: ffff8883e2087da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258472] 2 locks held by kworker/18:19/13123:
[ 6771.258481]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258531]  #1: ffff8883e2097da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258577] 2 locks held by kworker/15:13/13124:
[ 6771.258586]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258636]  #1: ffff8883e209fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258682] 2 locks held by kworker/18:20/13125:
[ 6771.258691]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258741]  #1: ffff8883e20afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258787] 2 locks held by kworker/15:14/13129:
[ 6771.258795]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258845]  #1: ffff8883e20dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258892] 2 locks held by kworker/20:78/13130:
[ 6771.258900]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258950]  #1: ffff8883e20efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.258997] 2 locks held by kworker/20:79/13131:
[ 6771.259005]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259053]  #1: ffff8883e20f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259104] 2 locks held by kworker/18:23/13132:
[ 6771.259114]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259164]  #1: ffff8883e2107da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259212] 2 locks held by kworker/15:19/13142:
[ 6771.259221]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259270]  #1: ffff88837717fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259317] 2 locks held by kworker/18:27/13143:
[ 6771.259326]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259376]  #1: ffff888377177da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259422] 2 locks held by kworker/15:20/13145:
[ 6771.259431]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259482]  #1: ffff88814a677da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259528] 2 locks held by kworker/15:21/13147:
[ 6771.259537]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259581]  #1: ffff88813a51fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259623] 2 locks held by kworker/18:29/13149:
[ 6771.259631]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259675]  #1: ffff88810bee7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259719] 2 locks held by kworker/20:85/13153:
[ 6771.259727]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259772]  #1: ffff88812443fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259817] 2 locks held by kworker/15:23/13154:
[ 6771.259826]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259876]  #1: ffff888198927da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259923] 2 locks held by kworker/15:25/13159:
[ 6771.259932]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.259982]  #1: ffff888172aafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260030] 2 locks held by kworker/18:35/13165:
[ 6771.260039]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260090]  #1: ffff888394cefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260137] 2 locks held by kworker/15:27/13167:
[ 6771.260146]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260195]  #1: ffff88835593fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260242] 2 locks held by kworker/15:28/13169:
[ 6771.260251]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260301]  #1: ffff88838836fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260348] 2 locks held by kworker/15:29/13172:
[ 6771.260357]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260407]  #1: ffff88817c76fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260454] 2 locks held by kworker/20:91/13174:
[ 6771.260463]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260512]  #1: ffff88818e81fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260559] 2 locks held by kworker/18:39/13176:
[ 6771.260568]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260618]  #1: ffff88813d67fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260665] 2 locks held by kworker/15:31/13177:
[ 6771.260674]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260723]  #1: ffff888188ebfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260770] 2 locks held by kworker/18:40/13179:
[ 6771.260779]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260829]  #1: ffff888143fafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260876] 2 locks held by kworker/20:93/13182:
[ 6771.260885]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260935]  #1: ffff8881b9117da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.260980] 2 locks held by kworker/18:41/13183:
[ 6771.260989]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261040]  #1: ffff8881afad7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261086] 2 locks held by kworker/12:59/13184:
[ 6771.261095]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261145]  #1: ffff888194617da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261192] 2 locks held by kworker/15:33/13185:
[ 6771.261201]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261251]  #1: ffff8881adf77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261298] 2 locks held by kworker/12:60/13188:
[ 6771.261307]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261358]  #1: ffff888160d07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261406] 2 locks held by kworker/12:61/13192:
[ 6771.261415]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261465]  #1: ffff8881ab027da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261533] 2 locks held by kworker/20:96/13194:
[ 6771.261542]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261591]  #1: ffff888387ef7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261638] 2 locks held by kworker/18:44/13195:
[ 6771.261647]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261697]  #1: ffff888387f07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261743] 2 locks held by kworker/20:97/13197:
[ 6771.261752]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261803]  #1: ffff888167727da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261849] 2 locks held by kworker/12:62/13198:
[ 6771.261859]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261909]  #1: ffff8881bc087da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.261956] 2 locks held by kworker/20:98/13201:
[ 6771.261965]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262015]  #1: ffff8883e2207da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262062] 2 locks held by kworker/12:63/13202:
[ 6771.262072]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262128]  #1: ffff8883e220fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262176] 2 locks held by kworker/15:38/13203:
[ 6771.262185]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262237]  #1: ffff8883e221fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262283] 2 locks held by kworker/15:39/13206:
[ 6771.262292]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262343]  #1: ffff8883e2247da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262390] 2 locks held by kworker/12:65/13207:
[ 6771.262399]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262449]  #1: ffff8883e2257da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262496] 2 locks held by kworker/15:40/13209:
[ 6771.262505]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262555]  #1: ffff8883e229fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262602] 2 locks held by kworker/15:41/13211:
[ 6771.262611]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262661]  #1: ffff8883e22b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262708] 2 locks held by kworker/18:49/13212:
[ 6771.262717]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262766]  #1: ffff8883e22bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262813] 2 locks held by kworker/15:45/13218:
[ 6771.262822]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262872]  #1: ffff8883e21a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262919] 2 locks held by kworker/15:46/13219:
[ 6771.262928]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.262978]  #1: ffff8883e22d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263025] 2 locks held by kworker/15:48/13221:
[ 6771.263034]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263087]  #1: ffff8883e22efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263136] 2 locks held by kworker/15:49/13223:
[ 6771.263146]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263196]  #1: ffff8883e2307da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263243] 2 locks held by kworker/15:50/13225:
[ 6771.263252]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263302]  #1: ffff8883e2317da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263348] 2 locks held by kworker/15:51/13227:
[ 6771.263357]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263408]  #1: ffff8883e232fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263456] 2 locks held by kworker/19:51/13230:
[ 6771.263465]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263515]  #1: ffff8883e234fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263561] 2 locks held by kworker/15:53/13231:
[ 6771.263570]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263620]  #1: ffff8883e2357da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263668] 2 locks held by kworker/19:53/13234:
[ 6771.263677]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263727]  #1: ffff8883e23b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263774] 2 locks held by kworker/19:54/13235:
[ 6771.263783]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263833]  #1: ffff8883e23c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263880] 2 locks held by kworker/19:56/13238:
[ 6771.263889]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263940]  #1: ffff8883e23efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.263986] 2 locks held by kworker/19:57/13239:
[ 6771.263995]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264046]  #1: ffff8883e23e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264092] 2 locks held by kworker/19:58/13240:
[ 6771.264101]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264152]  #1: ffff8881498e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264198] 2 locks held by kworker/14:5/13241:
[ 6771.264207]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264258]  #1: ffff8883e007fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264305] 2 locks held by kworker/14:6/13242:
[ 6771.264314]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264364]  #1: ffff8883df377da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264411] 2 locks held by kworker/12:66/13243:
[ 6771.264420]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264471]  #1: ffff8883df37fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264518] 2 locks held by kworker/11:46/13246:
[ 6771.264527]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264576]  #1: ffff8883e246fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264622] 2 locks held by kworker/11:47/13247:
[ 6771.264631]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264681]  #1: ffff8883e2477da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264727] 2 locks held by kworker/11:48/13248:
[ 6771.264737]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264787]  #1: ffff8883e247fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264834] 2 locks held by kworker/11:51/13251:
[ 6771.264843]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264893]  #1: ffff8883e24d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.264942] 2 locks held by kworker/11:55/13255:
[ 6771.264951]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265002]  #1: ffff8883e2507da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265050] 2 locks held by kworker/11:57/13257:
[ 6771.265059]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265109]  #1: ffff8883e251fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265157] 2 locks held by kworker/11:58/13258:
[ 6771.265166]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265216]  #1: ffff8883e252fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265264] 2 locks held by kworker/11:59/13259:
[ 6771.265273]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265323]  #1: ffff8881ac5cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265370] 2 locks held by kworker/11:60/13260:
[ 6771.265379]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265429]  #1: ffff8881976b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265498] 2 locks held by kworker/11:62/13262:
[ 6771.265507]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265558]  #1: ffff888378a77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265605] 2 locks held by kworker/11:64/13264:
[ 6771.265614]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265664]  #1: ffff8881b215fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265710] 2 locks held by kworker/11:65/13265:
[ 6771.265719]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265770]  #1: ffff8881b042fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265817] 2 locks held by kworker/11:67/13267:
[ 6771.265826]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265877]  #1: ffff888133027da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265923] 2 locks held by kworker/11:68/13268:
[ 6771.265932]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.265983]  #1: ffff888140e2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266030] 2 locks held by kworker/11:69/13269:
[ 6771.266038]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266088]  #1: ffff8881317f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266135] 2 locks held by kworker/11:70/13270:
[ 6771.266144]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266194]  #1: ffff888187837da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266240] 2 locks held by kworker/11:71/13271:
[ 6771.266249]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266300]  #1: ffff88811929fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266346] 2 locks held by kworker/11:72/13272:
[ 6771.266355]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266405]  #1: ffff8883874e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266451] 2 locks held by kworker/11:73/13273:
[ 6771.266460]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266511]  #1: ffff88838819fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266558] 2 locks held by kworker/11:74/13274:
[ 6771.266567]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266618]  #1: ffff8881acb07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266666] 2 locks held by kworker/11:75/13275:
[ 6771.266675]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266726]  #1: ffff8883881a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266772] 2 locks held by kworker/11:77/13277:
[ 6771.266782]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266832]  #1: ffff8881928dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266879] 2 locks held by kworker/11:79/13279:
[ 6771.266888]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266938]  #1: ffff88814e54fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.266984] 2 locks held by kworker/11:80/13280:
[ 6771.266993]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267043]  #1: ffff888385cb7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267090] 2 locks held by kworker/11:82/13282:
[ 6771.267100]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267150]  #1: ffff88810c967da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267198] 2 locks held by kworker/11:84/13284:
[ 6771.267207]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267257]  #1: ffff88836fa87da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267304] 2 locks held by kworker/11:85/13285:
[ 6771.267313]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267363]  #1: ffff88810514fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267410] 2 locks held by kworker/11:88/13288:
[ 6771.267419]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267469]  #1: ffff8883e253fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267516] 2 locks held by kworker/11:89/13289:
[ 6771.267525]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267575]  #1: ffff8883e254fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267623] 2 locks held by kworker/11:93/13293:
[ 6771.267632]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267682]  #1: ffff8883e2587da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267729] 2 locks held by kworker/11:95/13295:
[ 6771.267738]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267785]  #1: ffff8883e259fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267832] 2 locks held by kworker/12:73/13301:
[ 6771.267841]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267892]  #1: ffff8882d21d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267939] 2 locks held by kworker/12:74/13302:
[ 6771.267948]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.267998]  #1: ffff888388167da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268045] 2 locks held by kworker/12:75/13303:
[ 6771.268054]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268110]  #1: ffff88815b24fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268157] 2 locks held by kworker/12:77/13305:
[ 6771.268166]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268216]  #1: ffff888181e0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268262] 2 locks held by kworker/12:79/13307:
[ 6771.268271]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268321]  #1: ffff8881a44efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268368] 2 locks held by kworker/12:83/13311:
[ 6771.268377]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268428]  #1: ffff88815d617da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268474] 2 locks held by kworker/12:85/13313:
[ 6771.268483]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268533]  #1: ffff88818cdefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268580] 2 locks held by kworker/22:20/13314:
[ 6771.268588]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268639]  #1: ffff88819408fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268687] 2 locks held by kworker/17:67/13318:
[ 6771.268696]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268746]  #1: ffff8881470e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268794] 2 locks held by kworker/17:75/13326:
[ 6771.268803]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268852]  #1: ffff888371197da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268899] 2 locks held by kworker/17:76/13327:
[ 6771.268908]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.268958]  #1: ffff88837002fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269004] 2 locks held by kworker/17:77/13328:
[ 6771.269013]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269062]  #1: ffff8883e25afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269113] 2 locks held by kworker/17:92/13343:
[ 6771.269122]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269173]  #1: ffff8881a511fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269220] 2 locks held by kworker/17:94/13345:
[ 6771.269229]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269278]  #1: ffff888196fdfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269324] 2 locks held by kworker/17:97/13348:
[ 6771.269333]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269384]  #1: ffff8881288e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269431] 2 locks held by kworker/17:99/13350:
[ 6771.269439]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269517]  #1: ffff8883e264fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269563] 2 locks held by kworker/6:7/13352:
[ 6771.269572]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269622]  #1: ffff88816695fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269669] 2 locks held by kworker/3:120/13353:
[ 6771.269679]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269729]  #1: ffff8883889ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269776] 2 locks held by kworker/3:121/13354:
[ 6771.269785]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269835]  #1: ffff888388a07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269882] 2 locks held by kworker/3:122/13355:
[ 6771.269892]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269942]  #1: ffff888388e07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.269989] 2 locks held by kworker/3:123/13356:
[ 6771.269998]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.270048]  #1: ffff888388ddfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
...
[ ~120 similar "2 locks held by kworker/..." entries snipped: kworkers across
  many CPUs, each holding the same (wq_completion)dio/dm-5#14 lock plus its own
  (work_completion)(&dio->aio.work) lock, all acquired at
  process_one_work+0x3d3/0x950 ]
...
[ 6771.283031] 2 locks held by kworker/8:37/13605:
[ 6771.283040]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283091]  #1: ffff8881bd1bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283137] 2 locks held by kworker/8:39/13607:
[ 6771.283146]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283196]  #1: ffff88812c2d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283243] 2 locks held by kworker/8:41/13609:
[ 6771.283252]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283302]  #1: ffff88814bacfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283349] 2 locks held by kworker/8:45/13613:
[ 6771.283357]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283408]  #1: ffff88838bb07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283455] 2 locks held by kworker/4:76/13617:
[ 6771.283464]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283514]  #1: ffff88838b0bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283561] 2 locks held by kworker/4:77/13618:
[ 6771.283570]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283620]  #1: ffff88817c727da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283666] 2 locks held by kworker/4:78/13619:
[ 6771.283675]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283726]  #1: ffff88838a997da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283772] 2 locks held by kworker/20:121/13620:
[ 6771.283782]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283832]  #1: ffff88819f59fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283879] 2 locks held by kworker/20:123/13622:
[ 6771.283888]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283938]  #1: ffff88811b72fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.283985] 2 locks held by kworker/20:125/13624:
[ 6771.283993]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284043]  #1: ffff8881ab1bfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284090] 2 locks held by kworker/20:126/13625:
[ 6771.284099]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284149]  #1: ffff888115a0fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284195] 2 locks held by kworker/20:128/13627:
[ 6771.284205]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284255]  #1: ffff88819333fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284297] 2 locks held by kworker/20:129/13628:
[ 6771.284304]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284348]  #1: ffff88817a127da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284388] 2 locks held by kworker/20:130/13629:
[ 6771.284396]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284440]  #1: ffff8881223a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284487] 2 locks held by kworker/20:135/13634:
[ 6771.284496]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284546]  #1: ffff8881a576fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284593] 2 locks held by kworker/20:136/13635:
[ 6771.284602]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284652]  #1: ffff888197ae7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284698] 2 locks held by kworker/14:9/13636:
[ 6771.284707]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284757]  #1: ffff88815c83fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284804] 2 locks held by kworker/14:10/13637:
[ 6771.284813]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284863]  #1: ffff8881a31a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284909] 2 locks held by kworker/14:11/13638:
[ 6771.284918]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.284968]  #1: ffff88838755fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285015] 2 locks held by kworker/14:12/13639:
[ 6771.285024]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285074]  #1: ffff8881b9a47da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285122] 2 locks held by kworker/20:140/13644:
[ 6771.285131]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285181]  #1: ffff8881a7287da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285229] 2 locks held by kworker/20:143/13652:
[ 6771.285238]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285288]  #1: ffff8883897dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285336] 2 locks held by kworker/20:144/13656:
[ 6771.285345]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285395]  #1: ffff88816a81fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285443] 2 locks held by kworker/20:146/13667:
[ 6771.285452]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285523]  #1: ffff8881b010fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285570] 2 locks held by kworker/20:147/13668:
[ 6771.285579]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285629]  #1: ffff88838b8b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285675] 2 locks held by kworker/20:148/13669:
[ 6771.285684]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285732]  #1: ffff8881aa3c7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285779] 2 locks held by kworker/20:149/13670:
[ 6771.285788]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285838]  #1: ffff88836fe7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285885] 2 locks held by kworker/20:150/13671:
[ 6771.285894]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285944]  #1: ffff88838886fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.285992] 2 locks held by kworker/14:13/13673:
[ 6771.286001]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286052]  #1: ffff8881b0fdfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286098] 2 locks held by kworker/14:14/13674:
[ 6771.286108]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286158]  #1: ffff8881811dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286205] 2 locks held by kworker/14:15/13675:
[ 6771.286214]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286265]  #1: ffff888387c1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286313] 2 locks held by kworker/14:16/13676:
[ 6771.286322]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286373]  #1: ffff88838855fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286419] 2 locks held by kworker/14:17/13677:
[ 6771.286428]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286478]  #1: ffff888388557da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286527] 2 locks held by kworker/14:18/13684:
[ 6771.286536]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286586]  #1: ffff88813eaf7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286635] 2 locks held by kworker/18:66/13691:
[ 6771.286643]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286694]  #1: ffff8883e1f2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286744] 2 locks held by kworker/4:79/13706:
[ 6771.286753]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286804]  #1: ffff88815e7dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286851] 2 locks held by kworker/4:81/13708:
[ 6771.286860]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286910]  #1: ffff88838b97fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.286957] 2 locks held by kworker/4:82/13709:
[ 6771.286966]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287016]  #1: ffff888194da7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287063] 2 locks held by kworker/4:84/13711:
[ 6771.287072]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287122]  #1: ffff888197b8fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287169] 2 locks held by kworker/4:85/13712:
[ 6771.287178]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287228]  #1: ffff88811be37da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287275] 2 locks held by kworker/4:90/13717:
[ 6771.287284]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287334]  #1: ffff888136497da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287381] 2 locks held by kworker/4:91/13718:
[ 6771.287390]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287440]  #1: ffff888389db7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287486] 2 locks held by kworker/4:92/13719:
[ 6771.287495]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287546]  #1: ffff88819b547da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287593] 2 locks held by kworker/4:96/13724:
[ 6771.287602]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287653]  #1: ffff888389dd7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287702] 2 locks held by kworker/4:97/13733:
[ 6771.287711]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287760]  #1: ffff888199037da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287808] 2 locks held by kworker/4:99/13736:
[ 6771.287816]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287867]  #1: ffff88812380fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287914] 2 locks held by kworker/4:101/13738:
[ 6771.287924]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.287974]  #1: ffff88814c1afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288020] 2 locks held by kworker/4:102/13739:
[ 6771.288029]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288079]  #1: ffff8881b8e57da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288126] 2 locks held by kworker/4:103/13740:
[ 6771.288135]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288185]  #1: ffff888158477da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288232] 2 locks held by kworker/4:105/13743:
[ 6771.288241]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288290]  #1: ffff88838a54fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288337] 2 locks held by kworker/4:106/13744:
[ 6771.288345]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288396]  #1: ffff8882cc8dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288443] 2 locks held by kworker/4:108/13746:
[ 6771.288451]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288502]  #1: ffff8881633cfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288547] 2 locks held by kworker/4:110/13748:
[ 6771.288556]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288606]  #1: ffff88810bf47da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288655] 2 locks held by kworker/4:115/13753:
[ 6771.288663]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288714]  #1: ffff88838b25fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288761] 2 locks held by kworker/4:117/13755:
[ 6771.288770]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288819]  #1: ffff888123b37da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288866] 2 locks held by kworker/4:118/13756:
[ 6771.288875]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288925]  #1: ffff88838a357da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.288972] 2 locks held by kworker/4:120/13758:
[ 6771.288981]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289031]  #1: ffff88838a117da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289077] 2 locks held by kworker/4:121/13759:
[ 6771.289086]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289136]  #1: ffff88838a107da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289183] 2 locks held by kworker/4:122/13760:
[ 6771.289192]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289242]  #1: ffff88838a0ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289288] 2 locks held by kworker/4:124/13762:
[ 6771.289297]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289347]  #1: ffff8881578d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289393] 2 locks held by kworker/4:125/13763:
[ 6771.289402]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289452]  #1: ffff88833592fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289519] 2 locks held by kworker/8:50/13766:
[ 6771.289528]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289579]  #1: ffff888132547da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289625] 2 locks held by kworker/8:51/13767:
[ 6771.289634]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289684]  #1: ffff8883897ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289731] 2 locks held by kworker/8:53/13769:
[ 6771.289740]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289791]  #1: ffff888389587da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289839] 2 locks held by kworker/14:19/13774:
[ 6771.289848]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289899]  #1: ffff8883893f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.289945] 2 locks held by kworker/14:20/13775:
[ 6771.289954]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290005]  #1: ffff8883893efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290051] 2 locks held by kworker/12:115/13777:
[ 6771.290060]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290111]  #1: ffff8882dec1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290164] 2 locks held by kworker/14:21/13780:
[ 6771.290173]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290224]  #1: ffff88818b877da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290272] 2 locks held by kworker/14:22/13781:
[ 6771.290281]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290332]  #1: ffff888182cf7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290377] 2 locks held by kworker/14:23/13782:
[ 6771.290386]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290436]  #1: ffff88836ff6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290483] 2 locks held by kworker/14:24/13783:
[ 6771.290492]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290543]  #1: ffff8883d2cefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290589] 2 locks held by kworker/14:25/13784:
[ 6771.290598]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290648]  #1: ffff8883d2ce7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290695] 2 locks held by kworker/14:26/13785:
[ 6771.290704]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290754]  #1: ffff8883d2cdfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290801] 2 locks held by kworker/14:27/13787:
[ 6771.290810]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290861]  #1: ffff8883d2cc7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290907] 2 locks held by kworker/15:57/13788:
[ 6771.290916]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.290967]  #1: ffff8883d2cbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291014] 2 locks held by kworker/14:28/13789:
[ 6771.291023]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291073]  #1: ffff8883d2cafda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291120] 2 locks held by kworker/14:29/13791:
[ 6771.291129]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291179]  #1: ffff8883d2c1fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291226] 2 locks held by kworker/14:30/13794:
[ 6771.291236]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291286]  #1: ffff88817bd57da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291331] 2 locks held by kworker/14:31/13795:
[ 6771.291340]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291390]  #1: ffff8881ae83fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291436] 2 locks held by kworker/14:32/13796:
[ 6771.291445]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291495]  #1: ffff8881301afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291542] 2 locks held by kworker/14:33/13797:
[ 6771.291551]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291601]  #1: ffff88818b9dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291650] 2 locks held by kworker/12:121/13803:
[ 6771.291659]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291709]  #1: ffff88818e117da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291756] 2 locks held by kworker/15:63/13804:
[ 6771.291765]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291814]  #1: ffff8881487ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291856] 2 locks held by kworker/12:122/13805:
[ 6771.291863]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291906]  #1: ffff888138d7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.291951] 2 locks held by kworker/15:66/13808:
[ 6771.291958]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292005]  #1: ffff8883704efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292046] 2 locks held by kworker/15:67/13810:
[ 6771.292053]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292103]  #1: ffff888335817da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292150] 2 locks held by kworker/12:126/13815:
[ 6771.292159]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292209]  #1: ffff88818d207da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292256] 2 locks held by kworker/15:70/13816:
[ 6771.292265]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292315]  #1: ffff888134457da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292362] 2 locks held by kworker/15:72/13818:
[ 6771.292371]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292421]  #1: ffff88814b2a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292468] 2 locks held by kworker/15:73/13819:
[ 6771.292477]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292527]  #1: ffff888156947da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292574] 2 locks held by kworker/15:74/13820:
[ 6771.292583]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292633]  #1: ffff88812ae2fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292679] 2 locks held by kworker/15:75/13821:
[ 6771.292688]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292738]  #1: ffff888190947da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292784] 2 locks held by kworker/15:76/13822:
[ 6771.292793]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292843]  #1: ffff8883d284fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292890] 2 locks held by kworker/15:78/13824:
[ 6771.292899]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292949]  #1: ffff8881502b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.292995] 2 locks held by kworker/15:79/13825:
[ 6771.293004]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293054]  #1: ffff88837f81fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293101] 2 locks held by kworker/15:80/13826:
[ 6771.293110]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293160]  #1: ffff8881b06ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293207] 2 locks held by kworker/15:83/13829:
[ 6771.293216]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293266]  #1: ffff88838c7afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293313] 2 locks held by kworker/17:103/13832:
[ 6771.293322]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293373]  #1: ffff88811fd27da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293420] 2 locks held by kworker/17:105/13834:
[ 6771.293429]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293501]  #1: ffff8881910efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293547] 2 locks held by kworker/17:106/13835:
[ 6771.293556]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293607]  #1: ffff888131fbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293655] 2 locks held by kworker/17:110/13839:
[ 6771.293664]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293714]  #1: ffff88838cd07da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293764] 2 locks held by kworker/5:95/13849:
[ 6771.293772]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293823]  #1: ffff888371c6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293868] 2 locks held by kworker/5:96/13850:
[ 6771.293877]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293928]  #1: ffff888151e97da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.293977] 2 locks held by kworker/17:120/13856:
[ 6771.293986]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294036]  #1: ffff88815cd3fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294083] 2 locks held by kworker/14:34/13861:
[ 6771.294093]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294143]  #1: ffff8881601dfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294193] 2 locks held by kworker/13:103/13873:
[ 6771.294203]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294254]  #1: ffff8881602e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294302] 2 locks held by kworker/4:129/13878:
[ 6771.294311]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294363]  #1: ffff88838ce7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294410] 2 locks held by kworker/4:131/13880:
[ 6771.294419]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294470]  #1: ffff88838c87fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294519] 2 locks held by kworker/4:138/13887:
[ 6771.294528]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294578]  #1: ffff88838980fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294624] 2 locks held by kworker/4:139/13888:
[ 6771.294634]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294684]  #1: ffff888389817da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294732] 2 locks held by kworker/4:141/13890:
[ 6771.294741]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294790]  #1: ffff88838cda7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294836] 2 locks held by kworker/4:143/13892:
[ 6771.294845]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294895]  #1: ffff88838cdbfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.294941] 2 locks held by kworker/4:144/13893:
[ 6771.294950]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295000]  #1: ffff8883e38a7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295047] 2 locks held by kworker/4:146/13895:
[ 6771.295056]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295106]  #1: ffff8883e38b7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295153] 2 locks held by kworker/4:148/13897:
[ 6771.295162]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295212]  #1: ffff8883e38ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295259] 2 locks held by kworker/4:149/13898:
[ 6771.295268]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295318]  #1: ffff8883e390fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295364] 2 locks held by kworker/4:151/13900:
[ 6771.295373]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295424]  #1: ffff8883e391fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295471] 2 locks held by kworker/4:154/13903:
[ 6771.295481]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295531]  #1: ffff8881628ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295578] 2 locks held by kworker/4:157/13906:
[ 6771.295587]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295637]  #1: ffff88815069fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295684] 2 locks held by kworker/4:159/13908:
[ 6771.295693]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295743]  #1: ffff88818e0e7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295789] 2 locks held by kworker/21:50/13909:
[ 6771.295798]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295848]  #1: ffff8883d0e17da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295895] 2 locks held by kworker/21:51/13910:
[ 6771.295904]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.295953]  #1: ffff88810d9f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296000] 2 locks held by kworker/4:160/13912:
[ 6771.296009]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296059]  #1: ffff88819930fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296106] 2 locks held by kworker/4:162/13914:
[ 6771.296115]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296165]  #1: ffff88817cf9fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296211] 2 locks held by kworker/4:163/13915:
[ 6771.296220]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296270]  #1: ffff8881bc147da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296317] 2 locks held by kworker/4:165/13917:
[ 6771.296326]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296376]  #1: ffff88834dfa7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296423] 2 locks held by kworker/4:167/13919:
[ 6771.296432]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296482]  #1: ffff88817cd77da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296529] 2 locks held by kworker/4:168/13920:
[ 6771.296538]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296588]  #1: ffff888135f6fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296633] 2 locks held by kworker/4:169/13921:
[ 6771.296642]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296693]  #1: ffff888126ee7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296739] 2 locks held by kworker/4:171/13923:
[ 6771.296748]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296798]  #1: ffff88810f71fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296846] 2 locks held by kworker/6:11/13929:
[ 6771.296855]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296905]  #1: ffff88817d537da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.296953] 2 locks held by kworker/11:97/13934:
[ 6771.296961]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297011]  #1: ffff8881a6fe7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297057] 2 locks held by kworker/14:35/13935:
[ 6771.297066]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297117]  #1: ffff8883d493fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297163] 2 locks held by kworker/14:36/13936:
[ 6771.297172]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297223]  #1: ffff888193eefda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297269] 2 locks held by kworker/14:37/13937:
[ 6771.297279]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297329]  #1: ffff888162ebfda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297376] 2 locks held by kworker/14:38/13938:
[ 6771.297385]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297435]  #1: ffff88817aac7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297504] 2 locks held by kworker/14:39/13939:
[ 6771.297512]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297561]  #1: ffff88818577fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297608] 2 locks held by kworker/14:40/13940:
[ 6771.297617]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297667]  #1: ffff888127e27da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297714] 2 locks held by kworker/14:42/13942:
[ 6771.297723]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297774]  #1: ffff8883d58ffda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297822] 2 locks held by kworker/14:44/13944:
[ 6771.297831]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297882]  #1: ffff8883d4be7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297928] 2 locks held by kworker/11:98/13946:
[ 6771.297937]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.297988]  #1: ffff8883d6337da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298034] 2 locks held by kworker/11:99/13947:
[ 6771.298044]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298094]  #1: ffff8883d628fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298142] 2 locks held by kworker/11:100/13948:
[ 6771.298151]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298203]  #1: ffff8883d6207da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298249] 2 locks held by kworker/11:101/13949:
[ 6771.298258]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298309]  #1: ffff8883d61f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298356] 2 locks held by kworker/11:102/13950:
[ 6771.298365]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298415]  #1: ffff8883d618fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298463] 2 locks held by kworker/11:104/13952:
[ 6771.298472]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298520]  #1: ffff8881acb7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298567] 2 locks held by kworker/11:105/13953:
[ 6771.298576]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298626]  #1: ffff888144127da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298673] 2 locks held by kworker/11:106/13954:
[ 6771.298682]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298732]  #1: ffff888117627da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298779] 2 locks held by kworker/11:107/13955:
[ 6771.298788]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298838]  #1: ffff888157d97da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298885] 2 locks held by kworker/11:108/13956:
[ 6771.298894]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298945]  #1: ffff8883d67d7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.298991] 2 locks held by kworker/11:109/13957:
[ 6771.299000]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299050]  #1: ffff8883d66afda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299096] 2 locks held by kworker/11:110/13958:
[ 6771.299105]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299156]  #1: ffff8883d6557da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299202] 2 locks held by kworker/7:3/13959:
[ 6771.299211]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299262]  #1: ffff8883d654fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299307] 2 locks held by kworker/11:111/13960:
[ 6771.299316]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299367]  #1: ffff8883d653fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299414] 2 locks held by kworker/11:112/13961:
[ 6771.299422]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299473]  #1: ffff8883d6387da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299520] 2 locks held by kworker/7:4/13962:
[ 6771.299529]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299579]  #1: ffff8883d637fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299627] 2 locks held by kworker/11:114/13965:
[ 6771.299636]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299686]  #1: ffff8883d4fd7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299733] 2 locks held by kworker/11:115/13966:
[ 6771.299742]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299792]  #1: ffff8883847efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299839] 2 locks held by kworker/11:116/13967:
[ 6771.299848]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299899]  #1: ffff88815dc7fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.299946] 2 locks held by kworker/11:117/13968:
[ 6771.299955]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300005]  #1: ffff88838b787da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300052] 2 locks held by kworker/11:118/13969:
[ 6771.300061]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300112]  #1: ffff8881a17f7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300163] 2 locks held by kworker/11:120/13971:
[ 6771.300172]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300223]  #1: ffff8883d4dc7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300271] 2 locks held by kworker/11:124/13976:
[ 6771.300280]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300329]  #1: ffff8881275efda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300376] 2 locks held by kworker/11:125/13977:
[ 6771.300385]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300435]  #1: ffff88812b16fda8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300482] 2 locks held by kworker/11:126/13978:
[ 6771.300491]  #0: ffff88815f79f948 ((wq_completion)dio/dm-5#14){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300541]  #1: ffff888188af7da8 ((work_completion)(&dio->aio.work)){+.+.}-{0:0}, at: process_one_work+0x3d3/0x950
[ 6771.300588] 2 locks held by tee/13997:
[ 6771.300596]  #0: ffff8881af2a00a8 (&tty->ldisc_sem){++++}-{0:0}, at: ldsem_down_read+0x35/0x40
[ 6771.300642]  #1: ffff8881af2a0148 (&tty->atomic_write_lock){+.+.}-{4:4}, at: file_tty_write.constprop.0+0x145/0x4e0
[ 6771.300689] 2 locks held by dmesg/13999:

[ 6771.300708] =============================================


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 20:29                                                                                 ` Bob Pearson
@ 2023-10-18 20:49                                                                                   ` Bart Van Assche
  2023-10-18 21:17                                                                                     ` Pearson, Robert B
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-18 20:49 UTC (permalink / raw)
  To: Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/18/23 13:29, Bob Pearson wrote:
> OK, with clean code from current rdma for-next branch with the .config I sent before, if I run:
> 
> rpearson:blktests$ sudo use_siw=1 ./check srp/002
> [sudo] password for rpearson:
> srp/002 (File I/O on top of multipath concurrently with logout and login (mq))
> 
> It hangs. The dmesg trace is attached.

Thank you for having shared the dmesg output. If the test setup is still 
in the same state, please try to run the following command:

$ dmsetup ls |
   while read a b; do dmsetup message $a 0 fail_if_no_path; done

If this resolves the hang, this hang is not a kernel bug but a bug in 
multipathd or in the test scripts.
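In case some mappings reject the message, the loop can be restricted to device-mapper tables whose target type is actually "multipath" (other target types, such as the linear mappings backing partitions, do not accept fail_if_no_path). This is only a sketch; emit_fail_if_no_path_cmds is a made-up helper name, not part of blktests:

```shell
# Sketch: build the "dmsetup message" commands for multipath targets
# only. Reads `dmsetup ls`-style output on stdin; the device name is
# the first whitespace-separated field of each line.
emit_fail_if_no_path_cmds() {
    while read -r name rest; do
        [ -n "$name" ] && printf 'dmsetup message %s 0 fail_if_no_path\n' "$name"
    done
}

# Print the commands for review; pipe the output to `sh` to run them.
if command -v dmsetup >/dev/null 2>&1; then
    dmsetup ls --target multipath | emit_fail_if_no_path_cmds
fi
```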

Thanks,

Bart.


* RE: [bug report] blktests srp/002 hang
  2023-10-18 20:49                                                                                   ` Bart Van Assche
@ 2023-10-18 21:17                                                                                     ` Pearson, Robert B
  2023-10-18 21:27                                                                                       ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Pearson, Robert B @ 2023-10-18 21:17 UTC (permalink / raw)
  To: Bart Van Assche, Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi


Help. Do I run after the hang (from a different shell) or before I run srp/002?

Bob


* Re: [bug report] blktests srp/002 hang
  2023-10-18 21:17                                                                                     ` Pearson, Robert B
@ 2023-10-18 21:27                                                                                       ` Bart Van Assche
  2023-10-18 21:52                                                                                         ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-18 21:27 UTC (permalink / raw)
  To: Pearson, Robert B, Bob Pearson, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/18/23 14:17, Pearson, Robert B wrote:
> Help. Do I run after the hang (from a different shell) or before I
> run srp/002?

Hi Bob,

Please only run that shell command after a complaint like "INFO: task
kworker/11:0:85 blocked for more than 120 seconds." has appeared in the
kernel log.
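Waiting for that complaint can be scripted if that is more convenient. A sketch, assuming the hung-task watchdog is enabled with its default 120-second timeout; is_hung_task_msg and wait_for_hung_task are made-up names:

```shell
# Matches the hung-task watchdog complaint on stdin, e.g.
# "INFO: task kworker/11:0:85 blocked for more than 120 seconds."
is_hung_task_msg() {
    grep -q 'blocked for more than [0-9][0-9]* seconds'
}

# Poll the kernel log once per second until the watchdog fires,
# then return so the dmsetup recovery loop can be run.
wait_for_hung_task() {
    while ! dmesg | is_hung_task_msg; do
        sleep 1
    done
}
```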

Thanks,

Bart


* Re: [bug report] blktests srp/002 hang
  2023-10-18 21:27                                                                                       ` Bart Van Assche
@ 2023-10-18 21:52                                                                                         ` Bob Pearson
  2023-10-19 19:17                                                                                           ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-18 21:52 UTC (permalink / raw)
  To: Bart Van Assche, Pearson, Robert B, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

[-- Attachment #1: Type: text/plain, Size: 1543 bytes --]

On 10/18/23 16:27, Bart Van Assche wrote:
> On 10/18/23 14:17, Pearson, Robert B wrote:
>> Help. Do I run after the hang (from a different shell) or before I
>> run srp/002?
> 
> Hi Bob,
> 
> Please only run that shell command after a complaint like "INFO: task
> kworker/11:0:85 blocked for more than 120 seconds." has appeared in the
> kernel log.
> 
> Thanks,
> 
> Bart
> 

The results are slightly ambiguous, but I ran the command here:

rpearson:blktests$ sudo use_siw=1 ./check srp/002
srp/002 (File I/O on top of multipath concurrently with logout and login (mq)) [failed]
    runtime  245.018s  ...  128.110s
    --- tests/srp/002.out	2023-02-15 12:07:40.675530344 -0600
    +++ /home/rpearson/src/blktests/results/nodev/srp/002.out.bad	2023-10-18 16:36:14.723323257 -0500
    @@ -1,2 +1 @@
     Configured SRP target driver
    -Passed
rpearson:blktests$ 

And while it was hung, I ran the following:

root@rpearson-X570-AORUS-PRO-WIFI: dmsetup ls | while read a b; do dmsetup message $a 0 fail_if_no_path; done
device-mapper: message ioctl on mpatha-part1  failed: Invalid argument
Command failed.
device-mapper: message ioctl on mpatha-part2  failed: Invalid argument
Command failed.
device-mapper: message ioctl on mpathb-part1  failed: Invalid argument
Command failed.

mpath[ab]-part[12] are multipath devices (dm-1,2,3) holding the Ubuntu system images, not the devices
created by blktests. When this command finished, the srp/002 run came back to life but did not succeed (see above).

The dmesg log is attached.

Bob

[-- Attachment #2: out --]
[-- Type: text/plain, Size: 230202 bytes --]

[ 1091.037161] rxe0: cq#1 rxe_req_notify_cq: called, flags: 2
[ 1357.415664] audit: type=1400 audit(1697664842.082:63): apparmor="DENIED" operation="capable" class="cap" profile="/snap/snapd/20290/usr/lib/snapd/snap-confine" pid=9765 comm="snap-confine" capability=12  capname="net_admin"
[ 1357.415743] audit: type=1400 audit(1697664842.082:64): apparmor="DENIED" operation="capable" class="cap" profile="/snap/snapd/20290/usr/lib/snapd/snap-confine" pid=9765 comm="snap-confine" capability=38  capname="perfmon"
[ 1358.362843] audit: type=1400 audit(1697664843.030:65): apparmor="DENIED" operation="open" class="file" profile="snap-update-ns.firefox" name="/var/lib/" pid=9791 comm="5" requested_mask="r" denied_mask="r" fsuid=0 ouid=0
[ 1358.363015] audit: type=1400 audit(1697664843.030:66): apparmor="DENIED" operation="sendmsg" class="net" profile="snap-update-ns.firefox" pid=9791 comm="5" family="unix" sock_type="stream" protocol=0 requested_mask="send" denied_mask="send"
[ 1358.376866] audit: type=1400 audit(1697664843.042:67): apparmor="DENIED" operation="open" class="file" profile="snap-update-ns.firefox" name="/var/lib/" pid=9791 comm="5" requested_mask="r" denied_mask="r" fsuid=0 ouid=0
[ 1358.376954] audit: type=1400 audit(1697664843.042:68): apparmor="DENIED" operation="sendmsg" class="net" profile="snap-update-ns.firefox" pid=9791 comm="5" family="unix" sock_type="stream" protocol=0 requested_mask="send" denied_mask="send"
[ 1484.870921] run blktests srp/002 at 2023-10-18 16:36:09
[ 1485.686105] rxe0: cq#1 rxe_destroy_cq: called
[ 1486.673744] rdma_rxe: unloaded
[ 1487.272575] null_blk: module loaded
[ 1487.339652] null_blk: disk nullb0 created
[ 1487.377848] null_blk: disk nullb1 created
[ 1487.514865] SoftiWARP attached
[ 1487.637202] scsi_debug:sdebug_add_store: dif_storep 524288 bytes @ ffffc90003611000
[ 1487.640442] scsi_debug:sdebug_driver_probe: scsi_debug: trim poll_queues to 0. poll_q/nr_hw = (0/1)
[ 1487.640450] scsi_debug:sdebug_driver_probe: host protection DIF3 DIX3
[ 1487.640460] scsi host10: scsi_debug: version 0191 [20210520]
                 dev_size_mb=32, opts=0x0, submit_queues=1, statistics=0
[ 1487.645059] scsi 10:0:0:0: Direct-Access     Linux    scsi_debug       0191 PQ: 0 ANSI: 7
[ 1487.645805] scsi 10:0:0:0: Power-on or device reset occurred
[ 1487.648625] sd 10:0:0:0: [sda] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1487.648640] sd 10:0:0:0: Attached scsi generic sg0 type 0
[ 1487.648677] sd 10:0:0:0: [sda] Write Protect is off
[ 1487.648690] sd 10:0:0:0: [sda] Mode Sense: 73 00 10 08
[ 1487.648769] sd 10:0:0:0: [sda] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 1487.648940] sd 10:0:0:0: [sda] Enabling DIX T10-DIF-TYPE3-CRC, application tag size 6 bytes
[ 1487.648953] sd 10:0:0:0: [sda] Enabling DIF Type 3 protection
[ 1487.649017] sd 10:0:0:0: [sda] Preferred minimum I/O size 512 bytes
[ 1487.649030] sd 10:0:0:0: [sda] Optimal transfer size 524288 bytes
[ 1487.662148] sd 10:0:0:0: [sda] Attached SCSI disk
[ 1488.384017] Rounding down aligned max_sectors from 4294967295 to 4294967288
[ 1488.447113] ib_srpt:srpt_add_one: ib_srpt device = 000000009a232cd0
[ 1488.447663] ib_srpt:srpt_use_srq: ib_srpt srpt_use_srq(enp6s0_siw): use_srq = 0; ret = 0
[ 1488.447671] ib_srpt:srpt_add_one: ib_srpt Target login info: id_ext=b62e99fffef9fa2e,ioc_guid=b62e99fffef9fa2e,pkey=ffff,service_id=b62e99fffef9fa2e
[ 1488.447751] ib_srpt:srpt_add_one: ib_srpt added enp6s0_siw.
[ 1489.043936] Rounding down aligned max_sectors from 255 to 248
[ 1489.137044] Rounding down aligned max_sectors from 255 to 248
[ 1489.231195] Rounding down aligned max_sectors from 4294967295 to 4294967288
[ 1490.246530] ib_srp:srp_add_one: ib_srp: srp_add_one: 18446744073709551615 / 4096 = 4503599627370495 <> 512
[ 1490.246543] ib_srp:srp_add_one: ib_srp: enp6s0_siw: mr_page_shift = 12, device->max_mr_size = 0xffffffffffffffff, device->max_fast_reg_page_list_len = 256, max_pages_per_mr = 256, mr_max_size = 0x100000
[ 1490.398111] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1490.398160] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1490.398230] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1490.398239] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1490.409880] scsi host11: ib_srp: REJ received
[ 1490.409899] scsi host11:   REJ reason 0xffffff98
[ 1490.410238] scsi host11: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 1490.581914] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1490.581971] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1490.582080] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1490.582135] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1490.582168] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1490.582183] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1490.596143] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1490.598109] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1490.687872] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000046756f68
[ 1490.688296] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1490.688937] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000cfe175b5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000046756f68
[ 1490.690451] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2: queued zerolength write
[ 1490.690691] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1490.690699] scsi host11: ib_srp: using immediate data
[ 1490.693672] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2 wc->status 0
[ 1490.704912] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1490.707084] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1490.790962] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006f2281e2
[ 1490.791220] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1490.791790] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007eb855d5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006f2281e2
[ 1490.792137] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4: queued zerolength write
[ 1490.792975] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1490.792989] scsi host11: ib_srp: using immediate data
[ 1490.793033] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4 wc->status 0
[ 1490.806694] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1490.808892] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1490.889880] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000dd1e4d50
[ 1490.890138] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1490.890263] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e95b64d0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000dd1e4d50
[ 1490.890685] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6: queued zerolength write
[ 1490.890791] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1490.890811] scsi host11: ib_srp: using immediate data
[ 1490.894070] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6 wc->status 0
[ 1490.906496] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1490.910912] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.002632] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000377ef535
[ 1491.002955] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.003073] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000f63d6843 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000377ef535
[ 1491.003893] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8: queued zerolength write
[ 1491.004007] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.004025] scsi host11: ib_srp: using immediate data
[ 1491.004270] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8 wc->status 0
[ 1491.017161] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.020452] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.116440] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000051d48562
[ 1491.116714] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.116829] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000088c6bdf0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000051d48562
[ 1491.117344] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10: queued zerolength write
[ 1491.117429] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.117441] scsi host11: ib_srp: using immediate data
[ 1491.117685] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10 wc->status 0
[ 1491.129982] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.134270] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.219944] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000140b01fb
[ 1491.220201] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.220311] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005785857c name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000140b01fb
[ 1491.220675] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12: queued zerolength write
[ 1491.220698] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.220705] scsi host11: ib_srp: using immediate data
[ 1491.221177] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12 wc->status 0
[ 1491.233982] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.236141] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.315491] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ed436912
[ 1491.315755] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.315872] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000013a32f6a name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ed436912
[ 1491.316240] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14: queued zerolength write
[ 1491.316264] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.316271] scsi host11: ib_srp: using immediate data
[ 1491.316647] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14 wc->status 0
[ 1491.328102] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.330273] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.409253] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000016660ed7
[ 1491.409518] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.409626] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000000b5dc79f name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000016660ed7
[ 1491.409985] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16: queued zerolength write
[ 1491.410091] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.410111] scsi host11: ib_srp: using immediate data
[ 1491.410351] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16 wc->status 0
[ 1491.423523] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.427872] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.534919] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002e135536
[ 1491.535185] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.535298] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000089bf0ff5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000002e135536
[ 1491.535643] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18: queued zerolength write
[ 1491.535667] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.535675] scsi host11: ib_srp: using immediate data
[ 1491.536021] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18 wc->status 0
[ 1491.548699] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.550930] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.630737] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000058a936bb
[ 1491.631001] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.631107] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000003bd06319 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000058a936bb
[ 1491.631485] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20: queued zerolength write
[ 1491.631568] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.631585] scsi host11: ib_srp: using immediate data
[ 1491.631880] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20 wc->status 0
[ 1491.645666] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.649942] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.751517] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b7297a20
[ 1491.751788] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.751898] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002ae39b0a name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b7297a20
[ 1491.752258] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22: queued zerolength write
[ 1491.752636] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22 wc->status 0
[ 1491.752750] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.752759] scsi host11: ib_srp: using immediate data
[ 1491.765572] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.769012] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.866596] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a787ef95
[ 1491.866862] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.866977] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000004985559 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a787ef95
[ 1491.867346] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24: queued zerolength write
[ 1491.867436] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.867452] scsi host11: ib_srp: using immediate data
[ 1491.867720] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24 wc->status 0
[ 1491.880331] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.883893] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1491.982287] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000008e17ecc7
[ 1491.982555] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1491.982660] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006f85c88b name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000008e17ecc7
[ 1491.983058] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26: queued zerolength write
[ 1491.983083] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1491.983090] scsi host11: ib_srp: using immediate data
[ 1491.983449] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26 wc->status 0
[ 1491.996331] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1491.998666] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.076588] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000001dabdf1f
[ 1492.076886] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.076997] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000f9228c6b name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000001dabdf1f
[ 1492.077375] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28: queued zerolength write
[ 1492.077399] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.077407] scsi host11: ib_srp: using immediate data
[ 1492.077869] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28 wc->status 0
[ 1492.091622] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.093934] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.171977] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000bab943bc
[ 1492.172240] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.172348] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008963c6a3 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000bab943bc
[ 1492.172701] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30: queued zerolength write
[ 1492.172789] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.172808] scsi host11: ib_srp: using immediate data
[ 1492.173156] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30 wc->status 0
[ 1492.186353] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.190608] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.295080] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ec544544
[ 1492.295344] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.295451] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000615f4387 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000ec544544
[ 1492.295816] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32: queued zerolength write
[ 1492.295903] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.295913] scsi host11: ib_srp: using immediate data
[ 1492.296229] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32 wc->status 0
[ 1492.308889] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.312198] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.412487] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000078863349
[ 1492.412752] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.412897] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000510b12f9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000078863349
[ 1492.413257] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34: queued zerolength write
[ 1492.413283] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.413290] scsi host11: ib_srp: using immediate data
[ 1492.413828] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34 wc->status 0
[ 1492.426281] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.428447] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.506505] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000046477401
[ 1492.506770] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.506877] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000048f8c634 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000046477401
[ 1492.507268] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36: queued zerolength write
[ 1492.507293] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.507300] scsi host11: ib_srp: using immediate data
[ 1492.507632] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36 wc->status 0
[ 1492.519449] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.521674] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.599729] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000868cedbb
[ 1492.599994] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.600098] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000dd9373c9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000868cedbb
[ 1492.600447] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38: queued zerolength write
[ 1492.600625] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.600643] scsi host11: ib_srp: using immediate data
[ 1492.600874] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38 wc->status 0
[ 1492.614207] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.618465] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.722031] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000eeef14ad
[ 1492.722296] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.722401] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000059640bd5 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000eeef14ad
[ 1492.722770] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40: queued zerolength write
[ 1492.722794] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.722801] scsi host11: ib_srp: using immediate data
[ 1492.723143] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40 wc->status 0
[ 1492.735282] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.737516] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.815292] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000681aa292
[ 1492.815556] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.815664] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c0354932 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000681aa292
[ 1492.816047] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42: queued zerolength write
[ 1492.816072] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.816079] scsi host11: ib_srp: using immediate data
[ 1492.816413] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42 wc->status 0
[ 1492.829095] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.831244] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1492.911368] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000007c55bafa
[ 1492.911631] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1492.911740] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000093a021c name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000007c55bafa
[ 1492.912096] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44: queued zerolength write
[ 1492.912178] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1492.912196] scsi host11: ib_srp: using immediate data
[ 1492.912491] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44 wc->status 0
[ 1492.927161] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1492.931594] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1493.037637] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000002800777
[ 1493.037900] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1493.038009] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e2dfbb27 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000002800777
[ 1493.038357] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46: queued zerolength write
[ 1493.038382] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1493.038390] scsi host11: ib_srp: using immediate data
[ 1493.038730] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46 wc->status 0
[ 1493.050584] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1493.052738] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1493.129989] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000017b35355
[ 1493.130252] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1493.130360] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000420ba792 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000017b35355
[ 1493.130718] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48: queued zerolength write
[ 1493.130844] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1493.130852] scsi host11: ib_srp: using immediate data
[ 1493.131078] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48 wc->status 0
[ 1493.132167] scsi host11: SRP.T10:B62E99FFFEF9FA2E
[ 1493.201908] scsi 11:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1493.216268] scsi 11:0:0:0: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[ 1493.222672] scsi 11:0:0:0: alua: supports implicit and explicit TPGS
[ 1493.223012] scsi 11:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 1493.234195] sd 11:0:0:0: [sdb] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1493.235142] sd 11:0:0:0: [sdb] Write Protect is off
[ 1493.235187] sd 11:0:0:0: [sdb] Mode Sense: 43 00 00 08
[ 1493.236508] sd 11:0:0:0: Attached scsi generic sg1 type 0
[ 1493.249751] sd 11:0:0:0: [sdb] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1493.252685] scsi 11:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1493.254749] sd 11:0:0:0: [sdb] Preferred minimum I/O size 512 bytes
[ 1493.254764] sd 11:0:0:0: [sdb] Optimal transfer size 126976 bytes
[ 1493.266641] scsi 11:0:0:2: alua: supports implicit and explicit TPGS
[ 1493.266678] scsi 11:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 1493.270705] sd 11:0:0:2: Attached scsi generic sg2 type 0
[ 1493.271549] sd 11:0:0:2: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1493.272355] sd 11:0:0:2: [sdc] Write Protect is off
[ 1493.272378] sd 11:0:0:2: [sdc] Mode Sense: 43 00 10 08
[ 1493.274786] sd 11:0:0:2: [sdc] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 1493.276486] scsi 11:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1493.277737] sd 11:0:0:2: [sdc] Preferred minimum I/O size 512 bytes
[ 1493.277760] sd 11:0:0:2: [sdc] Optimal transfer size 524288 bytes
[ 1493.288537] scsi 11:0:0:1: LUN assignments on this target have changed. The Linux SCSI layer does not automatically remap LUN assignments.
[ 1493.291188] sd 11:0:0:0: [sdb] Attached SCSI disk
[ 1493.292038] scsi 11:0:0:1: alua: supports implicit and explicit TPGS
[ 1493.292077] scsi 11:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 1493.295376] sd 11:0:0:1: Attached scsi generic sg3 type 0
[ 1493.296864] ib_srp:srp_add_target: ib_srp: host11: SCSI scan succeeded - detected 3 LUNs
[ 1493.297012] scsi host11: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 1493.300234] sd 11:0:0:1: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1493.300545] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1493.300586] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1493.300649] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1493.300681] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1493.300744] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1493.300819] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1493.300840] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1493.305615] sd 11:0:0:1: [sdd] Write Protect is off
[ 1493.305642] sd 11:0:0:1: [sdd] Mode Sense: 43 00 00 08
[ 1493.310270] sd 11:0:0:1: [sdd] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1493.312647] sd 11:0:0:1: [sdd] Preferred minimum I/O size 512 bytes
[ 1493.312670] sd 11:0:0:1: [sdd] Optimal transfer size 126976 bytes
[ 1493.330507] sd 11:0:0:2: [sdc] Attached SCSI disk
[ 1493.345685] sd 11:0:0:1: [sdd] Attached SCSI disk
[ 1493.370963] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1493.370997] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1493.371061] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1493.371093] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1493.371156] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1493.371188] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1493.371255] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1493.371286] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1493.371307] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1493.442637] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1493.442673] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1493.442736] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1493.442772] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1493.442841] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1493.442876] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1493.442940] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1493.442983] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1493.443047] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1493.443078] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1493.443098] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1493.518736] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1493.518771] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1493.518838] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1493.518870] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1493.518937] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1493.518969] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1493.519033] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1493.519065] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1493.519129] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1493.519160] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1493.519226] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1493.519258] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1493.519278] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1493.834590] sd 11:0:0:0: alua: transition timeout set to 60 seconds
[ 1493.834612] sd 11:0:0:0: alua: port group 00 state A non-preferred supports TOlUSNA
[ 1494.335255] EXT4-fs (dm-5): mounted filesystem cdecb2f6-0925-4e9b-8883-dfd9241e0702 r/w without journal. Quota mode: none.
[ 1499.443147] device-mapper: multipath: 253:5: Failing path 8:16.
[ 1499.768914] sd 11:0:0:2: [sdc] Synchronizing SCSI cache
[ 1499.852649] scsi 11:0:0:2: alua: Detached
[ 1499.976695] scsi 11:0:0:1: alua: Detached
[ 1499.996232] ib_srpt receiving failed for ioctx 00000000437922c2 with status 5
[ 1499.996249] ib_srpt receiving failed for ioctx 00000000d4754a17 with status 5
[ 1499.996259] ib_srpt receiving failed for ioctx 000000009091c19b with status 5
[ 1499.996269] ib_srpt receiving failed for ioctx 00000000faab32b8 with status 5
[ 1499.996279] ib_srpt receiving failed for ioctx 000000003c9a9d18 with status 5
[ 1499.996290] ib_srpt receiving failed for ioctx 000000008a631e7b with status 5
[ 1499.996301] ib_srpt receiving failed for ioctx 00000000c2aea613 with status 5
[ 1499.996312] ib_srpt receiving failed for ioctx 00000000ce7640a0 with status 5
[ 1499.996323] ib_srpt receiving failed for ioctx 0000000000804db4 with status 5
[ 1499.996334] ib_srpt receiving failed for ioctx 000000002115f9aa with status 5
[ 1504.613456] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1504.613492] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1504.613512] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1504.613520] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1504.620112] scsi host12: ib_srp: REJ received
[ 1504.620122] scsi host12:   REJ reason 0xffffff98
[ 1504.620158] scsi host12: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 1504.747152] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1504.747211] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1504.747326] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1504.747383] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1504.747418] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1504.747434] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1504.756452] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1504.760221] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1504.853504] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000f3f3c47
[ 1504.853770] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1504.853935] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000911fb20b name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000000f3f3c47
[ 1504.854287] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3: queued zerolength write
[ 1504.854323] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1504.854331] scsi host12: ib_srp: using immediate data
[ 1504.856175] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3 wc->status 0
[ 1504.861527] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1504.863715] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1504.946238] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006490069e
[ 1504.946508] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1504.946625] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007024e5bc name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006490069e
[ 1504.946995] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7: queued zerolength write
[ 1504.947020] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1504.947028] scsi host12: ib_srp: using immediate data
[ 1504.947423] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7 wc->status 0
[ 1504.962580] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1504.964813] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.044762] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f7a980dd
[ 1505.045029] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.045138] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006741c5b0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000f7a980dd
[ 1505.045511] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11: queued zerolength write
[ 1505.045536] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.045543] scsi host12: ib_srp: using immediate data
[ 1505.045734] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11 wc->status 0
[ 1505.059910] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.062134] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.141801] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006e8832bf
[ 1505.142069] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.142179] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c3e6dae3 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006e8832bf
[ 1505.142553] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15: queued zerolength write
[ 1505.142648] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.142659] scsi host12: ib_srp: using immediate data
[ 1505.142917] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15 wc->status 0
[ 1505.158037] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.160255] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.243618] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000feb9df7a
[ 1505.243900] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.244027] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008c1ba974 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000feb9df7a
[ 1505.244386] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19: queued zerolength write
[ 1505.244454] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.244465] scsi host12: ib_srp: using immediate data
[ 1505.244796] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19 wc->status 0
[ 1505.257257] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.259715] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.340800] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c6d5c35b
[ 1505.341068] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.341180] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000cd696454 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c6d5c35b
[ 1505.341552] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23: queued zerolength write
[ 1505.341584] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.341591] scsi host12: ib_srp: using immediate data
[ 1505.343088] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23 wc->status 0
[ 1505.354329] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.356680] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.438780] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000002f85891
[ 1505.439047] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.439157] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000048070d51 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000002f85891
[ 1505.439518] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27: queued zerolength write
[ 1505.439614] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.439634] scsi host12: ib_srp: using immediate data
[ 1505.439900] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27 wc->status 0
[ 1505.453298] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.455499] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.538137] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000004285927f
[ 1505.538406] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.538516] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000312e58fb name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000004285927f
[ 1505.538883] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31: queued zerolength write
[ 1505.538973] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.538991] scsi host12: ib_srp: using immediate data
[ 1505.539236] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31 wc->status 0
[ 1505.552433] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.554805] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.636037] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000148be3df
[ 1505.636339] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.636486] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000045e693d name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000148be3df
[ 1505.637078] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35: queued zerolength write
[ 1505.637124] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.637137] scsi host12: ib_srp: using immediate data
[ 1505.638662] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35 wc->status 0
[ 1505.650044] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.652277] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.736789] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f34b5de1
[ 1505.737057] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.737169] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000de767304 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000f34b5de1
[ 1505.737536] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39: queued zerolength write
[ 1505.737623] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.737642] scsi host12: ib_srp: using immediate data
[ 1505.737946] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39 wc->status 0
[ 1505.751823] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.754071] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.836095] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000fa50edd7
[ 1505.836363] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.836506] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000724a51e3 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000fa50edd7
[ 1505.836890] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43: queued zerolength write
[ 1505.836976] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.836995] scsi host12: ib_srp: using immediate data
[ 1505.837232] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43 wc->status 0
[ 1505.850216] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.854604] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1505.958625] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b3e8dace
[ 1505.958893] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1505.959006] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000464dc3a9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b3e8dace
[ 1505.959379] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47: queued zerolength write
[ 1505.959450] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1505.959465] scsi host12: ib_srp: using immediate data
[ 1505.959614] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47 wc->status 0
[ 1505.972001] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1505.975314] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.075605] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a39000e8
[ 1506.075875] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.075983] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e6b6a073 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a39000e8
[ 1506.076345] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50: queued zerolength write
[ 1506.076412] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.076424] scsi host12: ib_srp: using immediate data
[ 1506.076575] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50 wc->status 0
[ 1506.090620] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.092992] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.179880] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000089ffc33c
[ 1506.180152] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.180261] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007fa78446 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000089ffc33c
[ 1506.180799] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52: queued zerolength write
[ 1506.180845] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.180857] scsi host12: ib_srp: using immediate data
[ 1506.182549] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52 wc->status 0
[ 1506.194458] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.196829] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.276735] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f71baa70
[ 1506.277003] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.277130] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000036c740b7 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000f71baa70
[ 1506.277492] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54: queued zerolength write
[ 1506.277584] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.277604] scsi host12: ib_srp: using immediate data
[ 1506.277748] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54 wc->status 0
[ 1506.291585] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.296181] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.399679] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000e6e4eb83
[ 1506.399948] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.400054] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000044d2117b name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000e6e4eb83
[ 1506.400458] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56: queued zerolength write
[ 1506.400486] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.400497] scsi host12: ib_srp: using immediate data
[ 1506.400815] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56 wc->status 0
[ 1506.413409] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.415777] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.494558] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000020cdf409
[ 1506.494825] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.494938] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007179650c name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000020cdf409
[ 1506.495314] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58: queued zerolength write
[ 1506.495409] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.495428] scsi host12: ib_srp: using immediate data
[ 1506.495710] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58 wc->status 0
[ 1506.508248] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.510479] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.590161] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000436ff907
[ 1506.590430] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.590536] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000004a1786cb name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000436ff907
[ 1506.590913] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60: queued zerolength write
[ 1506.591014] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.591035] scsi host12: ib_srp: using immediate data
[ 1506.591146] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60 wc->status 0
[ 1506.605319] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.610225] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.715469] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000083683622
[ 1506.715730] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.715834] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fc5a432f name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000083683622
[ 1506.716184] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62: queued zerolength write
[ 1506.716314] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.716322] scsi host12: ib_srp: using immediate data
[ 1506.716727] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62 wc->status 0
[ 1506.729554] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.732568] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.812761] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000085e5c684
[ 1506.813025] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.813140] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000342ded5a name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000085e5c684
[ 1506.813493] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64: queued zerolength write
[ 1506.813585] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.813602] scsi host12: ib_srp: using immediate data
[ 1506.814171] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64 wc->status 0
[ 1506.826306] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.830816] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1506.937974] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000041f9f657
[ 1506.938239] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1506.938345] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000ac29459f name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000041f9f657
[ 1506.938697] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66: queued zerolength write
[ 1506.938786] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1506.938804] scsi host12: ib_srp: using immediate data
[ 1506.938952] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66 wc->status 0
[ 1506.951058] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1506.955427] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1507.060418] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000faf4f409
[ 1507.060686] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1507.060793] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000d64de830 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000faf4f409
[ 1507.061141] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68: queued zerolength write
[ 1507.061165] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1507.061173] scsi host12: ib_srp: using immediate data
[ 1507.061403] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68 wc->status 0
[ 1507.073834] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1507.076078] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1507.153925] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000002fa129da
[ 1507.154189] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1507.154300] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007bbe0275 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000002fa129da
[ 1507.154666] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70: queued zerolength write
[ 1507.154757] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1507.154774] scsi host12: ib_srp: using immediate data
[ 1507.155071] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70 wc->status 0
[ 1507.168821] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1507.171031] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1507.249599] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c70217b1
[ 1507.249862] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1507.249968] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a6809ba1 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c70217b1
[ 1507.250316] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72: queued zerolength write
[ 1507.250402] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1507.250418] scsi host12: ib_srp: using immediate data
[ 1507.250734] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72 wc->status 0
[ 1507.252894] scsi host12: SRP.T10:B62E99FFFEF9FA2E
[ 1507.317310] scsi 12:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1507.339780] scsi 12:0:0:0: alua: supports implicit and explicit TPGS
[ 1507.339923] scsi 12:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 1507.346121] sd 12:0:0:0: Attached scsi generic sg1 type 0
[ 1507.347728] sd 12:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1507.348621] sd 12:0:0:0: [sdc] Write Protect is off
[ 1507.348667] sd 12:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 1507.350143] sd 12:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1507.352319] sd 12:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 1507.352447] sd 12:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 1507.356111] scsi 12:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1507.366395] scsi 12:0:0:2: alua: supports implicit and explicit TPGS
[ 1507.366461] scsi 12:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 1507.371969] sd 12:0:0:2: Attached scsi generic sg2 type 0
[ 1507.377696] sd 12:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1507.378475] sd 12:0:0:2: [sdd] Write Protect is off
[ 1507.378517] sd 12:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 1507.381002] scsi 12:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1507.382155] sd 12:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 1507.390989] sd 12:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 1507.391030] sd 12:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 1507.399734] sd 12:0:0:0: [sdc] Attached SCSI disk
[ 1507.401853] scsi 12:0:0:1: alua: supports implicit and explicit TPGS
[ 1507.401895] scsi 12:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 1507.406252] sd 12:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1507.406680] sd 12:0:0:1: Attached scsi generic sg3 type 0
[ 1507.407205] sd 12:0:0:1: [sde] Write Protect is off
[ 1507.407219] sd 12:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 1507.408039] ib_srp:srp_add_target: ib_srp: host12: SCSI scan succeeded - detected 3 LUNs
[ 1507.408050] scsi host12: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 1507.408548] sd 12:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1507.410541] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1507.410575] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1507.410639] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1507.410671] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1507.410738] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1507.410770] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1507.410790] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1507.410982] sd 12:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 1507.410995] sd 12:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 1507.438292] sd 12:0:0:2: [sdd] Attached SCSI disk
[ 1507.450287] sd 12:0:0:1: [sde] Attached SCSI disk
[ 1507.490056] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1507.490092] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1507.490156] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1507.490189] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1507.490256] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1507.490288] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1507.490352] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1507.490384] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1507.490404] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1507.562996] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1507.563032] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1507.563096] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1507.563129] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1507.563196] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1507.563228] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1507.563292] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1507.563324] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1507.563391] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1507.563423] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1507.563443] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1507.633861] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1507.633896] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1507.633960] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1507.633992] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1507.634056] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1507.634088] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1507.634152] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1507.634183] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1507.634251] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1507.634282] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1507.634347] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1507.634379] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1507.634399] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1512.864254] scsi 12:0:0:0: alua: Detached
[ 1512.944272] sd 12:0:0:2: [sdd] Synchronizing SCSI cache
[ 1513.016258] scsi 12:0:0:2: alua: Detached
[ 1513.124223] scsi 12:0:0:1: alua: Detached
[ 1513.139533] srpt_recv_done: 2945 callbacks suppressed
[ 1513.139537] ib_srpt receiving failed for ioctx 00000000f74a692d with status 5
[ 1513.139547] ib_srpt receiving failed for ioctx 00000000a86039be with status 5
[ 1513.139553] ib_srpt receiving failed for ioctx 0000000063015017 with status 5
[ 1513.139559] ib_srpt receiving failed for ioctx 00000000f834b252 with status 5
[ 1513.139565] ib_srpt receiving failed for ioctx 000000006cb36b06 with status 5
[ 1513.139571] ib_srpt receiving failed for ioctx 00000000291a9b5d with status 5
[ 1513.139577] ib_srpt receiving failed for ioctx 00000000749b839c with status 5
[ 1513.139583] ib_srpt receiving failed for ioctx 0000000024cf8bd3 with status 5
[ 1513.139589] ib_srpt receiving failed for ioctx 000000005d2195a2 with status 5
[ 1513.139595] ib_srpt receiving failed for ioctx 0000000077e23ff0 with status 5
[ 1514.900749] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1514.900807] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1514.900841] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1514.953023] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1514.953081] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1514.953214] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1514.953272] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1514.953308] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1515.004667] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.004700] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.004763] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.004807] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.004872] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.004904] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.004923] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1515.065493] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.065526] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.065590] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.065622] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.065686] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.065718] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.065782] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.065813] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.065833] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1515.116795] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.116860] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.116986] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.117050] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.117177] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.117247] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.117387] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.117449] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.117577] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1515.117639] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1515.117678] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1515.163583] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.163647] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.163773] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.163836] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.163969] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.164032] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.164333] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.164445] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.164574] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1515.164637] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1515.164765] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1515.164829] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1515.164867] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1515.473021] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.473057] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.473077] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1515.518394] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.518452] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.518559] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.518618] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.518654] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1515.565539] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.565597] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.565714] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.565773] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.565891] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.565949] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.565986] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1515.613858] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.613923] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.614049] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.614113] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.614240] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.614304] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.614432] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.614494] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.614534] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1515.665520] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.665584] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.665710] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.665774] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.665901] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.665964] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.666098] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.666161] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.666288] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1515.666350] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1515.666389] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1515.712586] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1515.712647] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1515.712764] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1515.712823] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1515.712939] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1515.713004] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1515.713125] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1515.713184] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1515.713304] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1515.713362] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1515.713474] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1515.713533] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1515.713569] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1516.023798] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1516.023835] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1516.023854] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1516.023863] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.030682] scsi host12: ib_srp: REJ received
[ 1516.030691] scsi host12:   REJ reason 0xffffff98
[ 1516.030724] scsi host12: ib_srp: Connection 0/24 to 192.168.1.77 failed
[ 1516.179768] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1516.179831] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1516.179947] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1516.180006] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1516.180041] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1516.180103] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.186447] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.188730] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.273121] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006459201a
[ 1516.273392] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.273566] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000090c82f77 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006459201a
[ 1516.274602] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.274612] scsi host12: ib_srp: using immediate data
[ 1516.276213] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5: queued zerolength write
[ 1516.276444] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5 wc->status 0
[ 1516.287614] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.289864] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.369937] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000596683bb
[ 1516.370205] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.370317] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b1dd41bd name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000596683bb
[ 1516.370678] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13: queued zerolength write
[ 1516.370703] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.370710] scsi host12: ib_srp: using immediate data
[ 1516.371109] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13 wc->status 0
[ 1516.383682] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.385934] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.465235] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c7836461
[ 1516.465501] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.465608] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000a9a9c86b name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c7836461
[ 1516.465975] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21: queued zerolength write
[ 1516.466088] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.466108] scsi host12: ib_srp: using immediate data
[ 1516.468337] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21 wc->status 0
[ 1516.479131] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.481448] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.565355] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000851548b8
[ 1516.565619] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.565729] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009432e4bf name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000851548b8
[ 1516.566077] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29: queued zerolength write
[ 1516.566102] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.566110] scsi host12: ib_srp: using immediate data
[ 1516.566279] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29 wc->status 0
[ 1516.578821] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.581168] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.661103] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c7b6b802
[ 1516.661367] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.661481] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000af469b86 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000c7b6b802
[ 1516.661844] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37: queued zerolength write
[ 1516.661868] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.661876] scsi host12: ib_srp: using immediate data
[ 1516.662216] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37 wc->status 0
[ 1516.674938] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.677627] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.757445] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000012d317ed
[ 1516.757713] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.757825] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000057643ea name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000012d317ed
[ 1516.758196] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45: queued zerolength write
[ 1516.758249] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.758265] scsi host12: ib_srp: using immediate data
[ 1516.758539] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45 wc->status 0
[ 1516.771995] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.776276] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.877878] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000550b8d40
[ 1516.878142] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.878248] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000010bad4d9 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000550b8d40
[ 1516.878615] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51: queued zerolength write
[ 1516.878641] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.878649] scsi host12: ib_srp: using immediate data
[ 1516.880317] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51 wc->status 0
[ 1516.890781] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.892982] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1516.970351] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000004658cc72
[ 1516.970616] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1516.970723] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c40d1635 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000004658cc72
[ 1516.971104] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55: queued zerolength write
[ 1516.971128] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1516.971136] scsi host12: ib_srp: using immediate data
[ 1516.971451] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55 wc->status 0
[ 1516.983237] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1516.985737] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.062822] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000092c96e66
[ 1517.063086] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.063196] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fa9f8e2c name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000092c96e66
[ 1517.063564] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59: queued zerolength write
[ 1517.063588] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.063596] scsi host12: ib_srp: using immediate data
[ 1517.063787] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59 wc->status 0
[ 1517.075969] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.078171] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.155655] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000038b25cb1
[ 1517.155919] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.156056] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000544a8022 name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000038b25cb1
[ 1517.156417] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63: queued zerolength write
[ 1517.156442] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.156450] scsi host12: ib_srp: using immediate data
[ 1517.156651] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63 wc->status 0
[ 1517.168707] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.170890] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.247929] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a35c911e
[ 1517.248226] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.248335] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000c7aae329 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a35c911e
[ 1517.248690] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67: queued zerolength write
[ 1517.248715] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.248723] scsi host12: ib_srp: using immediate data
[ 1517.249025] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67 wc->status 0
[ 1517.260979] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.263161] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.339436] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000eb2673b
[ 1517.339702] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.339808] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000e08efc66 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000000eb2673b
[ 1517.340214] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71: queued zerolength write
[ 1517.340259] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.340274] scsi host12: ib_srp: using immediate data
[ 1517.340439] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71 wc->status 0
[ 1517.353783] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.356646] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.442862] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b1ca6771
[ 1517.443127] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.443239] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b5522fad name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000b1ca6771
[ 1517.443610] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74: queued zerolength write
[ 1517.443634] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.443642] scsi host12: ib_srp: using immediate data
[ 1517.444058] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74 wc->status 0
[ 1517.455756] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.458258] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.536923] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000e19ef7f0
[ 1517.537212] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.537328] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000056351df0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000e19ef7f0
[ 1517.538150] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76: queued zerolength write
[ 1517.538174] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.538182] scsi host12: ib_srp: using immediate data
[ 1517.538535] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76 wc->status 0
[ 1517.551389] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.553681] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.632808] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f62a9a0e
[ 1517.633101] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.633210] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000ba9e4cec name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000f62a9a0e
[ 1517.633573] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78: queued zerolength write
[ 1517.633604] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.633612] scsi host12: ib_srp: using immediate data
[ 1517.635119] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78 wc->status 0
[ 1517.645798] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.648093] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.728609] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000dabfe1b1
[ 1517.728874] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.728982] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007690040e name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000dabfe1b1
[ 1517.729351] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80: queued zerolength write
[ 1517.729452] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.729470] scsi host12: ib_srp: using immediate data
[ 1517.729733] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80 wc->status 0
[ 1517.743337] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.745539] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.822834] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000036485cf3
[ 1517.823099] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.823205] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b8a1c04c name=2603:8081:1405:679b:0000:0000:0000:132c ch=0000000036485cf3
[ 1517.823584] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82: queued zerolength write
[ 1517.823679] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.823687] scsi host12: ib_srp: using immediate data
[ 1517.824050] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82 wc->status 0
[ 1517.836128] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.841323] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1517.943567] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000003bf2affa
[ 1517.943834] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1517.943942] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000009060390e name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000003bf2affa
[ 1517.944341] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84: queued zerolength write
[ 1517.944367] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1517.944375] scsi host12: ib_srp: using immediate data
[ 1517.944586] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84 wc->status 0
[ 1517.957906] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1517.960102] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.037947] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000000b61eb5e
[ 1518.038212] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.038318] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005c9c96c0 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000000b61eb5e
[ 1518.038677] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86: queued zerolength write
[ 1518.038701] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.038708] scsi host12: ib_srp: using immediate data
[ 1518.039059] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86 wc->status 0
[ 1518.053536] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1518.055712] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.135120] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000a9dbae08
[ 1518.135386] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.135499] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fd4a7332 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000a9dbae08
[ 1518.135866] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88: queued zerolength write
[ 1518.135957] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.135975] scsi host12: ib_srp: using immediate data
[ 1518.136275] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88 wc->status 0
[ 1518.153044] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1518.157409] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.262703] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000008455468c
[ 1518.262972] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.263081] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000eb9841db name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000008455468c
[ 1518.263443] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90: queued zerolength write
[ 1518.263543] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.263561] scsi host12: ib_srp: using immediate data
[ 1518.263676] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90 wc->status 0
[ 1518.270559] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1518.273263] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.371020] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000146fd491
[ 1518.371284] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.371398] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000008c133789 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000146fd491
[ 1518.371762] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92: queued zerolength write
[ 1518.371786] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.371794] scsi host12: ib_srp: using immediate data
[ 1518.372318] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92 wc->status 0
[ 1518.385279] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1518.387552] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.464373] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000de2af17f
[ 1518.464639] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.464755] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000cd0ea778 name=2603:8081:1405:679b:0000:0000:0000:132c ch=00000000de2af17f
[ 1518.465137] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94: queued zerolength write
[ 1518.465162] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.465170] scsi host12: ib_srp: using immediate data
[ 1518.465510] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94 wc->status 0
[ 1518.478897] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1518.481281] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1518.558930] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006a408398
[ 1518.559196] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0000:0000:0000:132c or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1518.559315] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000054ae0b92 name=2603:8081:1405:679b:0000:0000:0000:132c ch=000000006a408398
[ 1518.559688] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96: queued zerolength write
[ 1518.559810] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1518.559828] scsi host12: ib_srp: using immediate data
[ 1518.559920] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96 wc->status 0
[ 1518.562333] scsi host12: SRP.T10:B62E99FFFEF9FA2E
[ 1518.633106] scsi 12:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1518.649695] scsi 12:0:0:0: alua: supports implicit and explicit TPGS
[ 1518.649845] scsi 12:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 1518.657995] sd 12:0:0:0: Attached scsi generic sg1 type 0
[ 1518.666089] sd 12:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1518.666959] sd 12:0:0:0: [sdc] Write Protect is off
[ 1518.666984] sd 12:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 1518.668269] scsi 12:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1518.668446] sd 12:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1518.670552] sd 12:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 1518.670583] sd 12:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 1518.689873] scsi 12:0:0:2: alua: supports implicit and explicit TPGS
[ 1518.689938] scsi 12:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 1518.700101] sd 12:0:0:2: Attached scsi generic sg2 type 0
[ 1518.700997] sd 12:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1518.701591] sd 12:0:0:2: [sdd] Write Protect is off
[ 1518.701604] sd 12:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 1518.706431] scsi 12:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1518.709188] sd 12:0:0:0: [sdc] Attached SCSI disk
[ 1518.709403] sd 12:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 1518.712009] sd 12:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 1518.712033] sd 12:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 1518.719944] scsi 12:0:0:1: alua: supports implicit and explicit TPGS
[ 1518.720029] scsi 12:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 1518.722972] sd 12:0:0:1: Attached scsi generic sg3 type 0
[ 1518.723377] sd 12:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1518.723718] ib_srp:srp_add_target: ib_srp: host12: SCSI scan succeeded - detected 3 LUNs
[ 1518.723729] scsi host12: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0000:0000:0000:132c
[ 1518.723939] sd 12:0:0:1: [sde] Write Protect is off
[ 1518.723953] sd 12:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 1518.725632] sd 12:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1518.727290] sd 12:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 1518.727304] sd 12:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 1518.727485] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1518.727519] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1518.727582] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1518.727614] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1518.727678] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1518.727719] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1518.727739] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1518.750311] sd 12:0:0:2: [sdd] Attached SCSI disk
[ 1518.759636] sd 12:0:0:1: [sde] Attached SCSI disk
[ 1518.833994] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1518.834056] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1518.834172] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1518.834231] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1518.834360] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1518.834426] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1518.834543] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1518.834601] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1518.834638] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1518.913786] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1518.913823] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1518.913888] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1518.913920] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1518.913984] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1518.914016] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1518.914081] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1518.914113] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1518.914177] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1518.914208] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1518.914229] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1518.986790] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1518.986832] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1518.986896] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1518.986928] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1518.986992] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1518.987023] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1518.987096] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1518.987127] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1518.987195] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1518.987227] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1518.987292] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1518.987324] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1518.987345] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1524.199888] scsi 12:0:0:0: alua: Detached
[ 1524.270503] sd 12:0:0:2: [sdd] Synchronizing SCSI cache
[ 1524.351867] scsi 12:0:0:2: alua: Detached
[ 1524.459863] scsi 12:0:0:1: alua: Detached
[ 1524.476266] srpt_recv_done: 3021 callbacks suppressed
[ 1524.476271] ib_srpt receiving failed for ioctx 00000000b92b4eb9 with status 5
[ 1524.476281] ib_srpt receiving failed for ioctx 000000002f1b1b16 with status 5
[ 1524.476288] ib_srpt receiving failed for ioctx 00000000b442a80f with status 5
[ 1524.476294] ib_srpt receiving failed for ioctx 000000007692f599 with status 5
[ 1524.476300] ib_srpt receiving failed for ioctx 00000000593f0ddd with status 5
[ 1524.476306] ib_srpt receiving failed for ioctx 00000000e8573a58 with status 5
[ 1524.476312] ib_srpt receiving failed for ioctx 00000000bc767c91 with status 5
[ 1524.476318] ib_srpt receiving failed for ioctx 000000006a963119 with status 5
[ 1524.476323] ib_srpt receiving failed for ioctx 000000008574247b with status 5
[ 1524.476329] ib_srpt receiving failed for ioctx 00000000e5b21eb5 with status 5
[ 1525.262199] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.262236] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.262256] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1525.327459] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.327495] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.327559] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.327592] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.327613] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1525.373698] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.373733] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.373797] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.373829] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.373893] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.373929] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.373949] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1525.420651] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.420710] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.420821] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.420877] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.420994] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.421054] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.421174] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1525.421229] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1525.421264] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1525.465988] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.466022] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.466085] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.466117] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.466181] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.466213] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.466276] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1525.466310] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1525.466380] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1525.466411] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1525.466431] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1525.517830] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.517864] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.517927] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.517959] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.518023] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.518054] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.518118] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1525.518149] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1525.518213] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1525.518244] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1525.518308] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1525.518349] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1525.518369] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1525.822944] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.822981] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.823001] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1525.877263] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.877297] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.877360] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.877393] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.877413] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1525.921155] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.921188] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.921255] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.921290] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.921354] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.921386] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.921406] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1525.963930] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1525.963963] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1525.964026] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1525.964058] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1525.964122] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1525.964154] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1525.964223] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1525.964257] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1525.964277] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1526.008237] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.008271] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.008338] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.008369] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.008440] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.008471] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.008541] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1526.008573] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1526.008637] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1526.008668] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1526.008688] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1526.060132] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.060165] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.060228] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.060260] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.060323] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.060355] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.060418] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1526.060450] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1526.060513] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1526.060545] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1526.060609] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1526.060641] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1526.060660] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1526.350377] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.350439] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.350475] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1526.417635] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.417694] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.417811] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.417869] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.417905] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1526.472757] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.472816] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.472938] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.473001] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.473119] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.473177] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.473213] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1526.516516] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.516575] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.516691] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.516750] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.516867] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.516925] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.517042] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1526.517100] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1526.517137] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1526.565058] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.565118] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.565235] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.565294] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.565411] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.565476] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.565593] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1526.565651] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1526.565769] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1526.565826] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1526.565862] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1526.629907] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.629969] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.630086] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.630145] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.630268] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1526.630328] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1526.630445] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1526.630504] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1526.630634] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1526.630691] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1526.630811] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1526.630870] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1526.630906] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1526.929747] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.929783] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.929803] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1526.973367] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1526.973402] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1526.973465] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1526.973498] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1526.973518] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1527.020837] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1527.020873] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1527.020941] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1527.020973] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1527.021040] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1527.021072] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1527.021092] scsi host13: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1527.061512] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1527.061547] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1527.061610] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1527.061642] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1527.061709] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1527.061741] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1527.061805] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1527.061836] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1527.061856] ib_srp:add_target_store: ib_srp: max_sectors = 1024; max_pages_per_mr = 256; mr_page_size = 4096; max_sectors_per_mr = 2048; mr_per_cmd = 2
[ 1527.061864] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.067866] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.070929] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.156136] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000020e676b7
[ 1527.156450] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.156647] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000001cd22ad8 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=0000000020e676b7
[ 1527.156974] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-9: queued zerolength write
[ 1527.157103] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.157124] scsi host13: ib_srp: using immediate data
[ 1527.157221] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-9 wc->status 0
[ 1527.165845] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.168283] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.247178] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000b976eb6b
[ 1527.247447] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.247559] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000919d1727 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000b976eb6b
[ 1527.248084] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.248098] scsi host13: ib_srp: using immediate data
[ 1527.249493] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-25: queued zerolength write
[ 1527.249730] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-25 wc->status 0
[ 1527.255643] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.258080] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.335853] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000bf077c56
[ 1527.336122] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.336230] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005c32a88f name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000bf077c56
[ 1527.336590] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-41: queued zerolength write
[ 1527.336615] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.336623] scsi host13: ib_srp: using immediate data
[ 1527.336980] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-41 wc->status 0
[ 1527.349388] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.351583] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.428948] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000c54d0768
[ 1527.429223] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.429329] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000294b5252 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000c54d0768
[ 1527.429716] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-53: queued zerolength write
[ 1527.429741] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.429748] scsi host13: ib_srp: using immediate data
[ 1527.429942] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-53 wc->status 0
[ 1527.442089] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.444495] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.520960] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d446c00e
[ 1527.521230] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.521336] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000004f81c298 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000d446c00e
[ 1527.521690] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-61: queued zerolength write
[ 1527.521716] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.521723] scsi host13: ib_srp: using immediate data
[ 1527.522080] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-61 wc->status 0
[ 1527.534465] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.536856] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.614237] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000467e8222
[ 1527.614513] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.614627] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000fed84d0b name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000467e8222
[ 1527.615000] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-69: queued zerolength write
[ 1527.615093] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.615114] scsi host13: ib_srp: using immediate data
[ 1527.615365] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-69 wc->status 0
[ 1527.629229] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.633489] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.734797] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000090f5339b
[ 1527.735067] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.735175] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000b16d0be8 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=0000000090f5339b
[ 1527.735552] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-75: queued zerolength write
[ 1527.735576] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.735584] scsi host13: ib_srp: using immediate data
[ 1527.736129] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-75 wc->status 0
[ 1527.748208] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.750413] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.827688] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000bb003506
[ 1527.827958] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.828067] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000006acca446 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000bb003506
[ 1527.828438] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-79: queued zerolength write
[ 1527.828521] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.828539] scsi host13: ib_srp: using immediate data
[ 1527.828842] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-79 wc->status 0
[ 1527.841541] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.846057] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1527.947232] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000001b8f29d9
[ 1527.947499] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1527.947606] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000339d4141 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000001b8f29d9
[ 1527.948180] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-83: queued zerolength write
[ 1527.948221] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1527.948234] scsi host13: ib_srp: using immediate data
[ 1527.948427] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-83 wc->status 0
[ 1527.961002] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1527.963209] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.040804] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000007d8c9132
[ 1528.041072] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.041179] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000079de8f59 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000007d8c9132
[ 1528.041548] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-87: queued zerolength write
[ 1528.041638] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.041658] scsi host13: ib_srp: using immediate data
[ 1528.041916] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-87 wc->status 0
[ 1528.054256] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.059199] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.161871] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f913ab3f
[ 1528.162140] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.162252] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000614d2833 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000f913ab3f
[ 1528.162622] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-91: queued zerolength write
[ 1528.162648] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.162655] scsi host13: ib_srp: using immediate data
[ 1528.163005] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-91 wc->status 0
[ 1528.175035] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.177714] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.255405] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000bfa19829
[ 1528.255704] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.255813] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000dcde5c01 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000bfa19829
[ 1528.256180] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-95: queued zerolength write
[ 1528.256275] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.256293] scsi host13: ib_srp: using immediate data
[ 1528.256411] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-95 wc->status 0
[ 1528.269612] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.274130] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.373047] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000003d78203a
[ 1528.373316] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.373423] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000045b53899 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000003d78203a
[ 1528.373792] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-98: queued zerolength write
[ 1528.373816] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.373824] scsi host13: ib_srp: using immediate data
[ 1528.374173] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-98 wc->status 0
[ 1528.386206] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.389021] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.466419] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000ddbafbde
[ 1528.466687] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.466825] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000531d80d6 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000ddbafbde
[ 1528.467196] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-100: queued zerolength write
[ 1528.467288] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.467308] scsi host13: ib_srp: using immediate data
[ 1528.467562] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-100 wc->status 0
[ 1528.479547] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.484251] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.585103] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d07f2ad1
[ 1528.585373] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.585482] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000977fdd38 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000d07f2ad1
[ 1528.585856] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-102: queued zerolength write
[ 1528.585881] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.585889] scsi host13: ib_srp: using immediate data
[ 1528.586147] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-102 wc->status 0
[ 1528.598194] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.600596] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.678236] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000006e044b9b
[ 1528.678505] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.678615] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000062f2e813 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000006e044b9b
[ 1528.678988] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-104: queued zerolength write
[ 1528.679075] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.679092] scsi host13: ib_srp: using immediate data
[ 1528.679360] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-104 wc->status 0
[ 1528.692634] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.697438] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.797113] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000954bc6d9
[ 1528.797382] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.797490] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=0000000011349e5f name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000954bc6d9
[ 1528.797866] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-106: queued zerolength write
[ 1528.797891] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.797898] scsi host13: ib_srp: using immediate data
[ 1528.798277] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-106 wc->status 0
[ 1528.810967] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.813488] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.890628] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000d085cfb5
[ 1528.890904] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.891011] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000f92b0550 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000d085cfb5
[ 1528.891386] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-108: queued zerolength write
[ 1528.891411] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.891419] scsi host13: ib_srp: using immediate data
[ 1528.891622] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-108 wc->status 0
[ 1528.905871] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1528.908259] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1528.985582] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000f51cdad5
[ 1528.985851] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1528.985960] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000002027c5a0 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000f51cdad5
[ 1528.986331] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-110: queued zerolength write
[ 1528.986420] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1528.986440] scsi host13: ib_srp: using immediate data
[ 1528.986708] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-110 wc->status 0
[ 1528.999998] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1529.004643] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1529.103948] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000655e6a9e
[ 1529.104220] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1529.104334] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000005f2103dc name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000655e6a9e
[ 1529.104697] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-112: queued zerolength write
[ 1529.104875] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1529.104892] scsi host13: ib_srp: using immediate data
[ 1529.105074] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-112 wc->status 0
[ 1529.117146] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1529.119349] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1529.209682] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000009d2426b1
[ 1529.209952] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1529.210058] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000efe8ffce name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000009d2426b1
[ 1529.210428] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-114: queued zerolength write
[ 1529.210453] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1529.210460] scsi host13: ib_srp: using immediate data
[ 1529.210669] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-114 wc->status 0
[ 1529.223182] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1529.225540] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1529.303144] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 00000000517b082a
[ 1529.303411] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1529.303519] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000882e2fc0 name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=00000000517b082a
[ 1529.304069] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-116: queued zerolength write
[ 1529.304118] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1529.304130] scsi host13: ib_srp: using immediate data
[ 1529.306068] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-116 wc->status 0
[ 1529.317418] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1529.320265] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1529.397048] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 0000000024420947
[ 1529.397314] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1529.397422] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=00000000bf3cc71c name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=0000000024420947
[ 1529.397800] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-118: queued zerolength write
[ 1529.397886] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1529.397907] scsi host13: ib_srp: using immediate data
[ 1529.398171] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-118 wc->status 0
[ 1529.410993] ib_srpt Received SRP_LOGIN_REQ with i_port_id b42e:99f9:fa2e:0000:0000:0000:0000:0000, t_port_id b62e:99ff:fef9:fa2e:b62e:99ff:fef9:fa2e and it_iu_len 8260 on port 1 (guid=b42e:99f9:fa2e:0000:0000:0000:0000:0000); pkey 0x00
[ 1529.415271] ib_srpt:srpt_cm_req_recv: ib_srpt imm_data_offset = 68
[ 1529.515508] ib_srpt:srpt_create_ch_ib: ib_srpt srpt_create_ch_ib: max_cqe= 8192 max_sge= 6 sq_size = 8192 ch= 000000005953c88f
[ 1529.515807] ib_srpt:srpt_cm_req_recv: ib_srpt registering src addr 2603:8081:1405:679b:0e9a:f295:ecfc:dc93 or i_port_id 0xb42e99f9fa2e00000000000000000000
[ 1529.515916] ib_srpt:srpt_cm_req_recv: ib_srpt Establish connection sess=000000007b098faa name=2603:8081:1405:679b:0e9a:f295:ecfc:dc93 ch=000000005953c88f
[ 1529.516282] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-120: queued zerolength write
[ 1529.516308] ib_srp:srp_max_it_iu_len: ib_srp: max_iu_len = 8260
[ 1529.516316] scsi host13: ib_srp: using immediate data
[ 1529.516526] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-120 wc->status 0
[ 1529.517656] scsi host13: SRP.T10:B62E99FFFEF9FA2E
[ 1529.586608] scsi 13:0:0:0: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1529.606287] scsi 13:0:0:0: alua: supports implicit and explicit TPGS
[ 1529.606385] scsi 13:0:0:0: alua: device naa.60014056e756c6c62300000000000000 port group 0 rel port 1
[ 1529.612598] sd 13:0:0:0: Attached scsi generic sg1 type 0
[ 1529.613110] sd 13:0:0:0: [sdc] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1529.614209] sd 13:0:0:0: [sdc] Write Protect is off
[ 1529.614235] sd 13:0:0:0: [sdc] Mode Sense: 43 00 00 08
[ 1529.615933] sd 13:0:0:0: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1529.619778] sd 13:0:0:0: [sdc] Preferred minimum I/O size 512 bytes
[ 1529.619824] sd 13:0:0:0: [sdc] Optimal transfer size 126976 bytes
[ 1529.623054] scsi 13:0:0:2: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1529.641201] scsi 13:0:0:2: alua: supports implicit and explicit TPGS
[ 1529.641327] scsi 13:0:0:2: alua: device naa.60014057363736964626700000000000 port group 0 rel port 1
[ 1529.652119] sd 13:0:0:2: Attached scsi generic sg2 type 0
[ 1529.652423] sd 13:0:0:2: [sdd] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1529.656765] sd 13:0:0:2: [sdd] Write Protect is off
[ 1529.656789] sd 13:0:0:2: [sdd] Mode Sense: 43 00 10 08
[ 1529.658009] sd 13:0:0:2: [sdd] Write cache: enabled, read cache: enabled, supports DPO and FUA
[ 1529.659894] sd 13:0:0:2: [sdd] Preferred minimum I/O size 512 bytes
[ 1529.659919] sd 13:0:0:2: [sdd] Optimal transfer size 524288 bytes
[ 1529.664465] scsi 13:0:0:1: Direct-Access     LIO-ORG  IBLOCK           4.0  PQ: 0 ANSI: 6
[ 1529.681972] sd 13:0:0:0: [sdc] Attached SCSI disk
[ 1529.683737] scsi 13:0:0:1: alua: supports implicit and explicit TPGS
[ 1529.683786] scsi 13:0:0:1: alua: device naa.60014056e756c6c62310000000000000 port group 0 rel port 1
[ 1529.689243] sd 13:0:0:1: Attached scsi generic sg3 type 0
[ 1529.690253] ib_srp:srp_add_target: ib_srp: host13: SCSI scan succeeded - detected 3 LUNs
[ 1529.690264] scsi host13: ib_srp: new target: id_ext b62e99fffef9fa2e ioc_guid b62e99fffef9fa2e sgid b42e:99f9:fa2e:0000:0000:0000:0000:0000 dest 2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1529.691577] sd 13:0:0:1: [sde] 65536 512-byte logical blocks: (33.6 MB/32.0 MiB)
[ 1529.692419] sd 13:0:0:1: [sde] Write Protect is off
[ 1529.692443] sd 13:0:0:1: [sde] Mode Sense: 43 00 00 08
[ 1529.693366] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1529.693399] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1529.693467] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1529.693499] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1529.693563] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1529.693595] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1529.693659] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1529.693694] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1529.693758] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1529.693759] sd 13:0:0:1: [sde] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA
[ 1529.693790] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1529.693810] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1529.695829] sd 13:0:0:1: [sde] Preferred minimum I/O size 512 bytes
[ 1529.695853] sd 13:0:0:1: [sde] Optimal transfer size 126976 bytes
[ 1529.701615] sd 13:0:0:2: [sdd] Attached SCSI disk
[ 1529.749736] sd 13:0:0:1: [sde] Attached SCSI disk
[ 1529.750508] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1529.750543] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1529.750607] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1529.750641] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1529.750705] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1529.750736] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1529.750800] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1529.750832] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1529.750902] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1529.750934] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1529.750998] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1529.751030] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1529.751050] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1530.000245] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.000285] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.000306] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1530.065372] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.065408] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.065472] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1530.065504] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1530.065525] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1530.117068] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.117103] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.117171] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1530.117203] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1530.117268] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1530.117300] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1530.117320] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1530.167763] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.167797] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.167860] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1530.167892] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1530.167957] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1530.167989] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1530.168053] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1530.168085] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1530.168105] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1530.223857] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.223890] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.223953] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1530.223985] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1530.224049] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1530.224081] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1530.224145] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1530.224181] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1530.224247] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1530.224278] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1530.224297] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1530.292560] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1530.292594] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1530.292660] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1530.292692] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1530.292942] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1530.292974] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1530.293041] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1530.293073] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1530.293137] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1530.293168] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1530.293233] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1530.293265] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1530.293285] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1608.572605] device-mapper: ioctl: Target type does not support messages
[ 1608.586863] device-mapper: ioctl: Target type does not support messages
[ 1608.615042] device-mapper: ioctl: Target type does not support messages
[ 1608.628371] blk_print_req_error: 344 callbacks suppressed
[ 1608.628383] I/O error, dev dm-5, sector 15584 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628402] I/O error, dev dm-5, sector 30560 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628453] I/O error, dev dm-5, sector 26848 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628498] I/O error, dev dm-5, sector 112 op 0x1:(WRITE) flags 0x3800 phys_seg 1 prio class 2
[ 1608.628540] buffer_io_error: 139 callbacks suppressed
[ 1608.628546] Buffer I/O error on dev dm-5, logical block 14, lost sync page write
[ 1608.628586] I/O error, dev dm-5, sector 11520 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628603] I/O error, dev dm-5, sector 39416 op 0x0:(READ) flags 0x0 phys_seg 1 prio class 2
[ 1608.628757] I/O error, dev dm-5, sector 40088 op 0x0:(READ) flags 0x0 phys_seg 1 prio class 2
[ 1608.628795] I/O error, dev dm-5, sector 23520 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628868] I/O error, dev dm-5, sector 22096 op 0x1:(WRITE) flags 0x8800 phys_seg 1 prio class 2
[ 1608.628884] I/O error, dev dm-5, sector 40880 op 0x0:(READ) flags 0x0 phys_seg 1 prio class 2
[ 1608.628967] Buffer I/O error on dev dm-5, logical block 37, lost sync page write
[ 1608.629002] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #15: block 37: comm kworker/17:98: IO error syncing inode
[ 1608.629339] Buffer I/O error on dev dm-5, logical block 14, lost sync page write
[ 1608.629932] Buffer I/O error on dev dm-5, logical block 37, lost sync page write
[ 1608.629954] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #16: block 37: comm kworker/4:99: IO error syncing inode
[ 1608.629978] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #14: block 37: comm kworker/18:82: IO error syncing inode
[ 1608.629988] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #13: block 37: comm kworker/4:31: IO error syncing inode
[ 1608.630092] Buffer I/O error on dev dm-5, logical block 0, lost sync page write
[ 1608.630131] EXT4-fs (dm-5): previous I/O error to superblock detected
[ 1608.630158] Buffer I/O error on dev dm-5, logical block 38, lost sync page write
[ 1608.630177] Buffer I/O error on dev dm-5, logical block 0, lost sync page write
[ 1608.630206] EXT4-fs (dm-5): I/O error while writing superblock
[ 1608.630448] Buffer I/O error on dev dm-5, logical block 0, lost sync page write
[ 1608.630654] EXT4-fs (dm-5): previous I/O error to superblock detected
[ 1608.630692] Buffer I/O error on dev dm-5, logical block 0, lost sync page write
[ 1608.630708] EXT4-fs (dm-5): I/O error while writing superblock
[ 1608.630898] Buffer I/O error on dev dm-5, logical block 38, lost sync page write
[ 1608.630979] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #23: block 38: comm kworker/22:39: IO error syncing inode
[ 1608.630983] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #17: block 38: comm kworker/19:20: IO error syncing inode
[ 1608.631025] EXT4-fs error (device dm-5): ext4_check_bdev_write_error:224: comm fio: Error while async write back metadata
[ 1608.631039] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #19: block 38: comm kworker/20:105: IO error syncing inode
[ 1608.631062] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #22: block 38: comm kworker/4:208: IO error syncing inode
[ 1608.631091] EXT4-fs error (device dm-5): ext4_write_inode:5207: inode #27: block 38: comm kworker/12:85: IO error syncing inode
[ 1608.631120] EXT4-fs (dm-5): previous I/O error to superblock detected
[ 1608.631225] EXT4-fs (dm-5): previous I/O error to superblock detected
[ 1608.631433] EXT4-fs (dm-5): I/O error while writing superblock
[ 1608.631440] EXT4-fs (dm-5): I/O error while writing superblock
[ 1608.631443] EXT4-fs (dm-5): I/O error while writing superblock
[ 1608.632285] EXT4-fs (dm-5): previous I/O error to superblock detected
[ 1608.972319] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1608.972355] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1608.972375] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=192.168.1.77
[ 1609.029720] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1609.029756] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1609.029819] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1609.029857] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1609.029878] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0000:0000:0000:132c
[ 1609.090464] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1609.090531] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1609.090657] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1609.090720] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1609.090847] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1609.090910] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1609.090949] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:1569:5535:218a:c00c
[ 1609.156228] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1609.156294] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1609.156426] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1609.156489] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1609.156621] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1609.156684] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1609.156816] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1609.156885] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1609.156925] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=2603:8081:1405:679b:0e9a:f295:ecfc:dc93
[ 1609.202360] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1609.202432] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1609.202557] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1609.202620] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1609.202746] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1609.202809] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1609.202935] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1609.202998] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1609.203124] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1609.203186] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1609.203225] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:b62e:99ff:fef9:fa2e
[ 1609.250556] ib_srp:srp_parse_in: ib_srp: 192.168.1.77 -> 192.168.1.77:0
[ 1609.250623] ib_srp:srp_parse_in: ib_srp: 192.168.1.77:5555 -> 192.168.1.77:5555
[ 1609.250749] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c] -> [2603:8081:1405:679b::132c]:0/11010381%0
[ 1609.250813] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b::132c]:5555 -> [2603:8081:1405:679b::132c]:5555/11010381%0
[ 1609.250939] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c] -> [2603:8081:1405:679b:1569:5535:218a:c00c]:0/11010381%0
[ 1609.251002] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:1569:5535:218a:c00c]:5555 -> [2603:8081:1405:679b:1569:5535:218a:c00c]:5555/11010381%0
[ 1609.251135] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93] -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:0/11010381%0
[ 1609.251204] ib_srp:srp_parse_in: ib_srp: [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555 -> [2603:8081:1405:679b:e9a:f295:ecfc:dc93]:5555/11010381%0
[ 1609.251330] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e] -> [fe80::b62e:99ff:fef9:fa2e]:0/11010381%0
[ 1609.251392] ib_srp:srp_parse_in: ib_srp: [fe80::b62e:99ff:fef9:fa2e]:5555 -> [fe80::b62e:99ff:fef9:fa2e]:5555/11010381%0
[ 1609.251520] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2] -> [fe80::21bb:9ba3:7562:5fb8]:0/11010381%2
[ 1609.251583] ib_srp:srp_parse_in: ib_srp: [fe80::21bb:9ba3:7562:5fb8%2]:5555 -> [fe80::21bb:9ba3:7562:5fb8]:5555/11010381%2
[ 1609.251622] scsi host12: ib_srp: Already connected to target port with id_ext=b62e99fffef9fa2e;ioc_guid=b62e99fffef9fa2e;dest=fe80:0000:0000:0000:21bb:9ba3:7562:5fb8
[ 1613.593078] scsi 11:0:0:0: alua: Detached
[ 1613.889256] device-mapper: ioctl: unable to set up device queue for new table.
[ 1613.937209] scsi 13:0:0:0: alua: Detached
[ 1613.953968] sd 13:0:0:2: [sdd] Synchronizing SCSI cache
[ 1614.125093] scsi 13:0:0:2: alua: Detached
[ 1614.199741] srpt_recv_done: 3062 callbacks suppressed
[ 1614.199745] ib_srpt receiving failed for ioctx 00000000325a0ecb with status 5
[ 1614.199755] ib_srpt receiving failed for ioctx 0000000086ac6964 with status 5
[ 1614.199762] ib_srpt receiving failed for ioctx 00000000c0bba452 with status 5
[ 1614.199768] ib_srpt receiving failed for ioctx 00000000b8cd660f with status 5
[ 1614.199773] ib_srpt receiving failed for ioctx 000000004078515e with status 5
[ 1614.199780] ib_srpt receiving failed for ioctx 000000006055b38d with status 5
[ 1614.199786] ib_srpt receiving failed for ioctx 00000000d8eee54c with status 5
[ 1614.199792] ib_srpt receiving failed for ioctx 000000009afdaffc with status 5
[ 1614.199798] ib_srpt receiving failed for ioctx 000000000b5ef94e with status 5
[ 1614.199803] ib_srpt receiving failed for ioctx 00000000793da65b with status 5
[ 1614.255779] device-mapper: multipath: 253:5: Failing path 8:64.
[ 1614.321592] scsi 13:0:0:1: alua: Detached
[ 1617.887394] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2: queued zerolength write
[ 1617.889252] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4: queued zerolength write
[ 1617.889303] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6: queued zerolength write
[ 1617.889351] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8: queued zerolength write
[ 1617.889397] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10: queued zerolength write
[ 1617.889444] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12: queued zerolength write
[ 1617.889492] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14: queued zerolength write
[ 1617.889539] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16: queued zerolength write
[ 1617.889585] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18: queued zerolength write
[ 1617.889632] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20: queued zerolength write
[ 1617.889679] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22: queued zerolength write
[ 1617.889726] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24: queued zerolength write
[ 1617.889772] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26: queued zerolength write
[ 1617.889819] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28: queued zerolength write
[ 1617.889865] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30: queued zerolength write
[ 1617.889911] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32: queued zerolength write
[ 1617.889959] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34: queued zerolength write
[ 1617.890005] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36: queued zerolength write
[ 1617.890052] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38: queued zerolength write
[ 1617.890099] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40: queued zerolength write
[ 1617.890145] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42: queued zerolength write
[ 1617.890192] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44: queued zerolength write
[ 1617.890238] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46: queued zerolength write
[ 1617.890285] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48: queued zerolength write
[ 1617.890332] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3: queued zerolength write
[ 1617.890378] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7: queued zerolength write
[ 1617.890425] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11: queued zerolength write
[ 1617.890472] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15: queued zerolength write
[ 1617.890518] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19: queued zerolength write
[ 1617.890565] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23: queued zerolength write
[ 1617.890611] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27: queued zerolength write
[ 1617.890658] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31: queued zerolength write
[ 1617.890704] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35: queued zerolength write
[ 1617.890751] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39: queued zerolength write
[ 1617.890797] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43: queued zerolength write
[ 1617.890844] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47: queued zerolength write
[ 1617.890890] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50: queued zerolength write
[ 1617.890936] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52: queued zerolength write
[ 1617.890983] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54: queued zerolength write
[ 1617.891029] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56: queued zerolength write
[ 1617.891076] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58: queued zerolength write
[ 1617.891122] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60: queued zerolength write
[ 1617.891168] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62: queued zerolength write
[ 1617.891215] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64: queued zerolength write
[ 1617.891261] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66: queued zerolength write
[ 1617.891308] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68: queued zerolength write
[ 1617.891354] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70: queued zerolength write
[ 1617.891400] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72: queued zerolength write
[ 1617.891447] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5: queued zerolength write
[ 1617.891494] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13: queued zerolength write
[ 1617.891540] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21: queued zerolength write
[ 1617.891587] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29: queued zerolength write
[ 1617.891633] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37: queued zerolength write
[ 1617.891680] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45: queued zerolength write
[ 1617.891727] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51: queued zerolength write
[ 1617.891773] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55: queued zerolength write
[ 1617.891819] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59: queued zerolength write
[ 1617.891866] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63: queued zerolength write
[ 1617.891912] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67: queued zerolength write
[ 1617.891958] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71: queued zerolength write
[ 1617.892005] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74: queued zerolength write
[ 1617.892051] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76: queued zerolength write
[ 1617.892098] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78: queued zerolength write
[ 1617.892144] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80: queued zerolength write
[ 1617.892191] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82: queued zerolength write
[ 1617.892237] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84: queued zerolength write
[ 1617.892284] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86: queued zerolength write
[ 1617.892330] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88: queued zerolength write
[ 1617.892376] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90: queued zerolength write
[ 1617.892423] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92: queued zerolength write
[ 1617.892469] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94: queued zerolength write
[ 1617.892516] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96: queued zerolength write
[ 1617.892563] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-9: queued zerolength write
[ 1617.892609] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-25: queued zerolength write
[ 1617.892656] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-41: queued zerolength write
[ 1617.892702] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-53: queued zerolength write
[ 1617.892749] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-61: queued zerolength write
[ 1617.892795] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-69: queued zerolength write
[ 1617.892842] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-75: queued zerolength write
[ 1617.893418] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2 wc->status 5
[ 1617.893498] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4 wc->status 5
[ 1617.893504] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-79: queued zerolength write
[ 1617.893558] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6 wc->status 5
[ 1617.893614] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8 wc->status 5
[ 1617.893638] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-83: queued zerolength write
[ 1617.893670] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10 wc->status 5
[ 1617.893725] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-87: queued zerolength write
[ 1617.893726] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12 wc->status 5
[ 1617.893784] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14 wc->status 5
[ 1617.893804] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-91: queued zerolength write
[ 1617.893840] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16 wc->status 5
[ 1617.893891] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-95: queued zerolength write
[ 1617.893896] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18 wc->status 5
[ 1617.893944] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20 wc->status 5
[ 1617.893981] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-98: queued zerolength write
[ 1617.894000] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22 wc->status 5
[ 1617.894056] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24 wc->status 5
[ 1617.894067] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-100: queued zerolength write
[ 1617.894114] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26 wc->status 5
[ 1617.894153] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-102: queued zerolength write
[ 1617.894170] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28 wc->status 5
[ 1617.894226] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30 wc->status 5
[ 1617.894245] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-104: queued zerolength write
[ 1617.894282] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32 wc->status 5
[ 1617.894332] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-106: queued zerolength write
[ 1617.894337] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34 wc->status 5
[ 1617.894385] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36 wc->status 5
[ 1617.894422] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-108: queued zerolength write
[ 1617.894441] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38 wc->status 5
[ 1617.894497] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40 wc->status 5
[ 1617.894508] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-110: queued zerolength write
[ 1617.894556] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42 wc->status 5
[ 1617.894595] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-112: queued zerolength write
[ 1617.894611] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44 wc->status 5
[ 1617.894668] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46 wc->status 5
[ 1617.894686] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-114: queued zerolength write
[ 1617.894724] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48 wc->status 5
[ 1617.894772] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-116: queued zerolength write
[ 1617.894780] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3 wc->status 5
[ 1617.894828] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7 wc->status 5
[ 1617.894861] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-118: queued zerolength write
[ 1617.894884] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11 wc->status 5
[ 1617.894940] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15 wc->status 5
[ 1617.894947] ib_srpt:srpt_zerolength_write: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-120: queued zerolength write
[ 1617.894998] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19 wc->status 5
[ 1617.895051] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23 wc->status 5
[ 1617.895106] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27 wc->status 5
[ 1617.895161] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31 wc->status 5
[ 1617.895212] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35 wc->status 5
[ 1617.895263] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39 wc->status 5
[ 1617.895317] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43 wc->status 5
[ 1617.895372] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47 wc->status 5
[ 1617.895426] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50 wc->status 5
[ 1617.895480] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52 wc->status 5
[ 1617.895535] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54 wc->status 5
[ 1617.895590] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56 wc->status 5
[ 1617.895645] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58 wc->status 5
[ 1617.895701] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60 wc->status 5
[ 1617.895757] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62 wc->status 5
[ 1617.895812] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64 wc->status 5
[ 1617.895868] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66 wc->status 5
[ 1617.895924] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68 wc->status 5
[ 1617.895980] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70 wc->status 5
[ 1617.896035] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72 wc->status 5
[ 1617.896091] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5 wc->status 5
[ 1617.896147] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13 wc->status 5
[ 1617.896203] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21 wc->status 5
[ 1617.896259] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29 wc->status 5
[ 1617.896316] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37 wc->status 5
[ 1617.896372] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45 wc->status 5
[ 1617.896427] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51 wc->status 5
[ 1617.896483] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55 wc->status 5
[ 1617.896538] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59 wc->status 5
[ 1617.896592] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63 wc->status 5
[ 1617.896646] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67 wc->status 5
[ 1617.896702] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71 wc->status 5
[ 1617.896757] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74 wc->status 5
[ 1617.896813] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76 wc->status 5
[ 1617.896869] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78 wc->status 5
[ 1617.896999] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-79 wc->status 5
[ 1617.897063] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-83 wc->status 5
[ 1617.897110] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-87 wc->status 5
[ 1617.897156] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-91 wc->status 5
[ 1617.897209] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-2
[ 1617.897209] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-95 wc->status 5
[ 1617.897266] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-98 wc->status 5
[ 1617.897321] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-100 wc->status 5
[ 1617.897378] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-102 wc->status 5
[ 1617.897435] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-104 wc->status 5
[ 1617.897491] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-106 wc->status 5
[ 1617.897547] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-108 wc->status 5
[ 1617.897603] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-110 wc->status 5
[ 1617.897659] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-112 wc->status 5
[ 1617.897714] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-114 wc->status 5
[ 1617.897771] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-116 wc->status 5
[ 1617.897827] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-118 wc->status 5
[ 1617.897883] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-120 wc->status 5
[ 1617.897951] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-79
[ 1617.899479] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-83
[ 1617.899485] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80 wc->status 5
[ 1617.899534] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82 wc->status 5
[ 1617.899575] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-87
[ 1617.899588] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84 wc->status 5
[ 1617.899643] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86 wc->status 5
[ 1617.899670] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-91
[ 1617.899698] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88 wc->status 5
[ 1617.899751] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90 wc->status 5
[ 1617.899762] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-95
[ 1617.899810] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92 wc->status 5
[ 1617.899853] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-98
[ 1617.899865] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94 wc->status 5
[ 1617.899919] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96 wc->status 5
[ 1617.899946] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-100
[ 1617.899974] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-9 wc->status 5
[ 1617.900029] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-25 wc->status 5
[ 1617.900035] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-102
[ 1617.900087] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-41 wc->status 5
[ 1617.900141] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-53 wc->status 5
[ 1617.900146] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-104
[ 1617.900199] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-61 wc->status 5
[ 1617.900231] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-106
[ 1617.900254] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-69 wc->status 5
[ 1617.900309] ib_srpt:srpt_zerolength_write_done: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-75 wc->status 5
[ 1617.900320] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-108
[ 1617.900378] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-4
[ 1617.900408] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-110
[ 1617.900466] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-6
[ 1617.900494] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-112
[ 1617.900554] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-8
[ 1617.900580] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-114
[ 1617.900642] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-10
[ 1617.900665] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-116
[ 1617.900740] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-12
[ 1617.900745] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-118
[ 1617.900831] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-120
[ 1617.900831] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-14
[ 1617.900966] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-16
[ 1617.901037] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-18
[ 1617.901088] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-20
[ 1617.901139] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-22
[ 1617.901189] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-24
[ 1617.901239] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-26
[ 1617.901289] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-28
[ 1617.901339] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-30
[ 1617.901389] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-32
[ 1617.901439] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-34
[ 1617.901489] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-36
[ 1617.901538] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-38
[ 1617.901589] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-40
[ 1617.901638] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-42
[ 1617.901689] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-44
[ 1617.901739] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-46
[ 1617.901789] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-48
[ 1617.901839] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-3
[ 1617.901888] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-7
[ 1617.901938] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-11
[ 1617.901989] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-15
[ 1617.902038] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-19
[ 1617.902088] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-23
[ 1617.902138] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-27
[ 1617.902188] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-31
[ 1617.902238] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-35
[ 1617.902288] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-39
[ 1617.902338] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-43
[ 1617.902388] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-47
[ 1617.902439] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-50
[ 1617.902489] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-52
[ 1617.902540] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-54
[ 1617.902590] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-56
[ 1617.902640] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-58
[ 1617.902689] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-60
[ 1617.902739] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-62
[ 1617.902788] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-64
[ 1617.902838] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-66
[ 1617.902887] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-68
[ 1617.902938] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-70
[ 1617.902988] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-72
[ 1617.903038] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-5
[ 1617.903088] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-13
[ 1617.903138] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-21
[ 1617.903188] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-29
[ 1617.903238] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-37
[ 1617.903288] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-45
[ 1617.903337] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-51
[ 1617.903386] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-55
[ 1617.903436] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-59
[ 1617.903487] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-63
[ 1617.903536] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-67
[ 1617.903586] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-71
[ 1617.903636] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-74
[ 1617.903686] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-76
[ 1617.903736] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-78
[ 1617.903786] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-80
[ 1617.903835] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-82
[ 1617.903885] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-84
[ 1617.903935] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-86
[ 1617.903985] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-88
[ 1617.904035] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-90
[ 1617.904085] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-92
[ 1617.904135] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-94
[ 1617.904185] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0000:0000:0000:132c-96
[ 1617.904235] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-9
[ 1617.904284] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-25
[ 1617.904334] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-41
[ 1617.904384] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-53
[ 1617.904433] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-61
[ 1617.904483] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-69
[ 1617.904533] ib_srpt:srpt_release_channel_work: ib_srpt 2603:8081:1405:679b:0e9a:f295:ecfc:dc93-75
[ 1623.980806] ib_srpt enp6s0_siw_1: waiting for unregistration of 70 sessions ...
[ 1629.096833] ib_srpt enp6s0_siw_1: waiting for unregistration of 35 sessions ...
[ 1633.514271] sd 10:0:0:0: [sda] Synchronizing SCSI cache
[ 1633.969159] SoftiWARP detached

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-18 21:52                                                                                         ` Bob Pearson
@ 2023-10-19 19:17                                                                                           ` Bart Van Assche
  2023-10-20 17:12                                                                                             ` Bob Pearson
  0 siblings, 1 reply; 87+ messages in thread
From: Bart Van Assche @ 2023-10-19 19:17 UTC (permalink / raw)
  To: Bob Pearson, Pearson, Robert B, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/18/23 14:52, Bob Pearson wrote:
> The results are slightly ambiguous, but I ran the command here:
> 
> rpearson:blktests$ sudo use_siw=1 ./check srp/002
> srp/002 (File I/O on top of multipath concurrently with logout and login (mq)) [failed]
>      runtime  245.018s  ...  128.110s
>      --- tests/srp/002.out	2023-02-15 12:07:40.675530344 -0600
>      +++ /home/rpearson/src/blktests/results/nodev/srp/002.out.bad	2023-10-18 16:36:14.723323257 -0500
>      @@ -1,2 +1 @@
>       Configured SRP target driver
>      -Passed
> rpearson:blktests$
> 
> And while it was hung, I ran the following:
> 
> root@rpearson-X570-AORUS-PRO-WIFI: dmsetup ls | while read a b; do dmsetup message $a 0 fail_if_no_path; done
> device-mapper: message ioctl on mpatha-part1  failed: Invalid argument
> Command failed.
> device-mapper: message ioctl on mpatha-part2  failed: Invalid argument
> Command failed.
> device-mapper: message ioctl on mpathb-part1  failed: Invalid argument
> Command failed.
> 
> mpath[ab]-part[12] are multipath devices (dm-1,2,3) holding the Ubuntu system images and not the devices created
> by blktests. When this command finished, the srp/002 run came back to life but did not succeed (see above).
> 
> The dmesg log is attached

Hi Bob,

Thank you for sharing these results. The 'dmsetup message' command
should only be applied to the multipath devices created by the srp/002
script, not to the multipath devices created by Ubuntu, but I'm not
sure how to do that.
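One way to restrict the loop to the blktests-created maps is to filter
the `dmsetup ls` output by name before sending the message. The sketch
below is a hypothetical approach, assuming (as the quoted output
suggests) that the Ubuntu system maps are the only ones named
`mpatha*`/`mpathb*`; the name patterns are assumptions and would need to
be adjusted per setup:

```shell
#!/bin/sh
# Hypothetical sketch: send fail_if_no_path only to multipath maps that
# are NOT the Ubuntu system maps. The mpatha*/mpathb* patterns are
# assumptions taken from the quoted output; run as root.
is_system_map() {
	case "$1" in
	mpatha*|mpathb*) return 0 ;;	# Ubuntu system maps: skip these
	*) return 1 ;;
	esac
}

dmsetup ls 2>/dev/null | while read -r name _; do
	if is_system_map "$name"; then
		continue
	fi
	dmsetup message "$name" 0 fail_if_no_path
done
```

A more robust variant could select on the WWID-style names blktests
assigns instead of excluding known system map names.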

If the above 'dmsetup message' command resolved the hang, that indicates
that the root cause of the hang is probably in user space and not in the
kernel. Did you use the Ubuntu version of multipathd or a self-built
version of multipathd? I remember that the last time I ran the SRP tests
on an Ubuntu system, I had to replace Ubuntu's multipathd with a
self-built binary to make the tests pass.

Thanks for sharing the dmesg output. Unfortunately, I don't think it is
possible to derive the root cause from that output.

Thanks,

Bart.


^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-19 19:17                                                                                           ` Bart Van Assche
@ 2023-10-20 17:12                                                                                             ` Bob Pearson
  2023-10-20 17:41                                                                                               ` Bart Van Assche
  0 siblings, 1 reply; 87+ messages in thread
From: Bob Pearson @ 2023-10-20 17:12 UTC (permalink / raw)
  To: Bart Van Assche, Pearson, Robert B, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/19/23 14:17, Bart Van Assche wrote:
> On 10/18/23 14:52, Bob Pearson wrote:
>> results are slightly ambiguous but I ran the command here:
>>
>> rpearson:blktests$ sudo use_siw=1 ./check srp/002
>> srp/002 (File I/O on top of multipath concurrently with logout and login (mq)) [failed]time  245.018s  ...
>>      runtime  245.018s  ...  128.110s
>>      --- tests/srp/002.out    2023-02-15 12:07:40.675530344 -0600
>>      +++ /home/rpearson/src/blktests/results/nodev/srp/002.out.bad    2023-10-18 16:36:14.723323257 -0500
>>      @@ -1,2 +1 @@
>>       Configured SRP target driver
>>      -Passed
>> rpearson:blktests$
>>
>> And while it was hung I ran the following:
>>
>> root@rpearson-X570-AORUS-PRO-WIFI: dmsetup ls | while read a b; do dmsetup message $a 0 fail_if_no_path; done
>> device-mapper: message ioctl on mpatha-part1  failed: Invalid argument
>> Command failed.
>> device-mapper: message ioctl on mpatha-part2  failed: Invalid argument
>> Command failed.
>> device-mapper: message ioctl on mpathb-part1  failed: Invalid argument
>> Command failed.
>>
>> mpath[ab]-part[12] are multipath devices (dm-1,2,3) holding the Ubuntu system images and not the devices created
>> by blktests. When this command finished, the srp/002 run came back to life but did not succeed (see above).
>>
>> The dmesg log is attached
> 
> Hi Bob,
> 
> Thank you for having shared these results. The 'dmsetup message' command
> should only be applied to multipath devices created by the srp/002
> script and not to the multipath devices created by Ubuntu but I'm not
> sure how to do that.
> 
> If the above 'dmsetup message' command resolved the hang, that indicates
> that the root cause of the hang is probably in user space and not in the
> kernel. Did you use the Ubuntu version of multipathd or a self-built
> version of multipathd? I remember that last time I ran the SRP tests on
> an Ubuntu system that I had to replace Ubuntu's multipathd with a self-built binary to make the tests pass.
> 
> Thanks for having shared the dmesg output. I don't think it is possible to derive the root cause from that output, which is unfortunate.
> 
> Thanks,
> 
> Bart.
> 

Bart,

The results from yesterday are from Ubuntu 23.04, which has multipath-tools at version 0.8.8.
Ubuntu 23.10 has version 0.9.4, and GitHub has the head of tree at 0.9.6. Do you recall which
version made it work when you were looking at this? It may be less risky for me to upgrade to
Ubuntu 23.10 than to try to build multipath-tools from source.
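For what it's worth, one scriptable way to compare those package versions is
sort -V. A hedged sketch follows: ver_ge is a helper name made up here, not
part of multipath-tools or dpkg, and dpkg version strings may carry extra
revision suffixes (e.g. "0.8.8-1ubuntu1").

```shell
# Hedged sketch: ver_ge VER1 VER2 succeeds when VER1 >= VER2, using
# sort -V (natural version sort, GNU coreutils) to pick the smaller one.
ver_ge() {
    [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

# Example: check whether the installed Debian/Ubuntu package is at least
# the Ubuntu 23.10 version (0.9.4):
#   installed=$(dpkg-query -W -f='${Version}' multipath-tools)
#   ver_ge "$installed" 0.9.4 && echo "multipath-tools >= 0.9.4"
```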

Bob

^ permalink raw reply	[flat|nested] 87+ messages in thread

* Re: [bug report] blktests srp/002 hang
  2023-10-20 17:12                                                                                             ` Bob Pearson
@ 2023-10-20 17:41                                                                                               ` Bart Van Assche
  0 siblings, 0 replies; 87+ messages in thread
From: Bart Van Assche @ 2023-10-20 17:41 UTC (permalink / raw)
  To: Bob Pearson, Pearson, Robert B, Jason Gunthorpe
  Cc: Daisuke Matsuda (Fujitsu), 'Rain River',
	Zhu Yanjun, leon, Shinichiro Kawasaki, RDMA mailing list,
	linux-scsi

On 10/20/23 10:12, Bob Pearson wrote:
> The results from yesterday are from Ubuntu 23.04 which has 
> multipath-tools at version 0.8.8. Ubuntu 23.10 has version 0.9.4. 
> Github has head of tree at 0.9.6. Do you recall which version made
> it work when you were looking at this? It may be less risky for me
> to upgrade to Ubuntu 23.10 than try and build multipath-tools from 
> source.

Hi Bob,

I haven't tested Ubuntu 23.10 yet, but I think upgrading to it is worth
trying. I switched to openSUSE Tumbleweed for running blktests. Sorry,
but I do not remember the version number of the broken Ubuntu
multipath-tools package.

Bart.

^ permalink raw reply	[flat|nested] 87+ messages in thread

end of thread, other threads:[~2023-10-20 17:42 UTC | newest]

Thread overview: 87+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2023-08-21  6:46 [bug report] blktests srp/002 hang Shinichiro Kawasaki
2023-08-22  1:46 ` Bob Pearson
2023-08-22 10:18   ` Shinichiro Kawasaki
2023-08-22 15:20     ` Bart Van Assche
2023-08-23 16:19       ` Bob Pearson
2023-08-23 19:46         ` Bart Van Assche
2023-08-24 16:24           ` Bob Pearson
2023-08-24  8:55         ` Bernard Metzler
2023-08-24 15:35         ` Bernard Metzler
2023-08-24 16:05           ` Bart Van Assche
2023-08-24 16:27             ` Bob Pearson
2023-08-25  1:11       ` Shinichiro Kawasaki
2023-08-25  1:36         ` Bob Pearson
2023-08-25 10:16           ` Shinichiro Kawasaki
2023-08-25 13:49           ` Bart Van Assche
2023-08-25 13:52         ` Bart Van Assche
2023-09-13 17:36           ` Bob Pearson
2023-09-13 23:38             ` Zhu Yanjun
2023-09-16  5:59               ` Zhu Yanjun
2023-09-19  4:14                 ` Shinichiro Kawasaki
2023-09-19  8:07                   ` Zhu Yanjun
2023-09-19 16:30                     ` Pearson, Robert B
2023-09-19 18:11                     ` Bob Pearson
2023-09-20  4:22                       ` Zhu Yanjun
2023-09-20 16:24                         ` Bob Pearson
2023-09-20 16:36                           ` Bart Van Assche
2023-09-20 17:18                             ` Bob Pearson
2023-09-20 17:22                               ` Bart Van Assche
2023-09-20 17:29                                 ` Bob Pearson
2023-09-21  5:46                                   ` Zhu Yanjun
2023-09-21 10:06                                   ` Zhu Yanjun
2023-09-21 14:23                                   ` Rain River
2023-09-21 14:39                                     ` Bob Pearson
2023-09-21 15:08                                       ` Zhu Yanjun
2023-09-21 15:10                                       ` Zhu Yanjun
2023-09-22 18:14                                         ` Bob Pearson
2023-09-22 22:06                                           ` Bart Van Assche
2023-09-24  1:17                                           ` Rain River
2023-09-25  4:47                                             ` Daisuke Matsuda (Fujitsu)
2023-09-25 14:31                                               ` Zhu Yanjun
2023-09-26  1:09                                                 ` Daisuke Matsuda (Fujitsu)
2023-09-26  6:09                                                   ` Zhu Yanjun
2023-09-25 15:00                                               ` Bart Van Assche
2023-09-25 15:25                                                 ` Bob Pearson
2023-09-25 15:52                                                 ` Jason Gunthorpe
2023-09-25 15:54                                                   ` Bob Pearson
2023-09-25 19:57                                                 ` Bob Pearson
2023-09-25 20:33                                                   ` Bart Van Assche
2023-09-25 20:40                                                     ` Bob Pearson
2023-09-26 15:36                                                   ` Rain River
2023-09-26  1:17                                                 ` Daisuke Matsuda (Fujitsu)
2023-10-17 17:09                                                   ` Bob Pearson
2023-10-17 17:13                                                     ` Bart Van Assche
2023-10-17 17:15                                                       ` Bob Pearson
2023-10-17 17:19                                                       ` Bob Pearson
2023-10-17 17:34                                                         ` Bart Van Assche
2023-10-17 17:58                                                     ` Jason Gunthorpe
2023-10-17 18:44                                                       ` Bob Pearson
2023-10-17 18:51                                                         ` Jason Gunthorpe
2023-10-17 19:55                                                           ` Bob Pearson
2023-10-17 20:06                                                             ` Bart Van Assche
2023-10-17 20:13                                                               ` Bob Pearson
2023-10-17 21:14                                                               ` Bob Pearson
2023-10-17 21:18                                                                 ` Bart Van Assche
2023-10-17 21:23                                                                   ` Bob Pearson
2023-10-17 21:30                                                                     ` Bart Van Assche
2023-10-17 21:39                                                                       ` Bob Pearson
2023-10-17 22:42                                                                         ` Bart Van Assche
2023-10-18 18:29                                                                           ` Bob Pearson
2023-10-18 19:17                                                                             ` Jason Gunthorpe
2023-10-18 19:48                                                                               ` Bart Van Assche
2023-10-18 20:03                                                                                 ` Bob Pearson
2023-10-18 20:04                                                                                 ` Bob Pearson
2023-10-18 20:14                                                                                 ` Bob Pearson
2023-10-18 20:29                                                                                 ` Bob Pearson
2023-10-18 20:49                                                                                   ` Bart Van Assche
2023-10-18 21:17                                                                                     ` Pearson, Robert B
2023-10-18 21:27                                                                                       ` Bart Van Assche
2023-10-18 21:52                                                                                         ` Bob Pearson
2023-10-19 19:17                                                                                           ` Bart Van Assche
2023-10-20 17:12                                                                                             ` Bob Pearson
2023-10-20 17:41                                                                                               ` Bart Van Assche
2023-10-18 19:38                                                                             ` Bart Van Assche
2023-10-17 19:18                                                       ` Bart Van Assche
2023-10-18  8:16                                                     ` Zhu Yanjun
2023-09-22 11:06 ` Linux regression tracking #adding (Thorsten Leemhuis)
2023-10-13 12:51   ` Linux regression tracking #update (Thorsten Leemhuis)
